Hadoop Developer @ NYC or Jersey City

Hadoop Developer
Location: NYC or Jersey City (candidate should be in the Tri-state area)
Duration: 6 months
Rate: DOE (W2)


US Citizens and Green Card holders ONLY on C2C
Unable to use H-1B or OPT


Looking for passionate technologists with experience in CIB application development who are comfortable performing under pressure to deliver production-quality systems in short time frames. The team works with the wholesale (CIB) business lines to produce data analytics and big data solutions. Initially we are looking for strong Java developers, but this group poses exciting opportunities in the future for individuals interested in greenfield projects using the latest technologies for data analytics and big data, with the ability to architect and build systems by selecting the right technologies.

Key Responsibilities
  • Experience in implementing distributed and scalable algorithms (Hadoop, Spark)
  • Analyze, design, and code business-related solutions, as well as core architectural changes, using an Agile programming approach, resulting in software delivered on time and on budget
  • Understand and implement quantitative models in production ready systems
  • Experience working in development teams, using Agile techniques, object-oriented development, and scripting languages, is preferred
  • Comfortable learning cutting-edge technologies and applying them to greenfield projects
The candidate will ideally have 2+ years of hands-on experience in:
  • Strong domain expertise in the institutional securities business
  • Data-modeling and implementation
  • Experience in working with market / streaming data and time-series analytics
  • Experience working with different caching strategies
  • Experience working with multiple solutions for data movement, such as file copy, pub-sub, FTP, etc.
  • Development of web-based and digital framework for content delivery
  • Experience with Hadoop, HDFS, and the surrounding ecosystem
  • Experience with Core Java, Scala, Python, R
  • Experience with relational database systems (SQL) and hierarchical data management
  • Experience with batch processing
  • Experience working with Hortonworks or Cloudera (preferred)
  • Experience with MapReduce
  • Experience with ETL tools such as Sqoop, Pig, DataTorrent/Apex, and Pentaho is a plus
  • Experience with Navigator is a plus
  • Experience with REST API is a plus
  • Experience with stream processing is a plus
  • Exposure to encryption tools (HP Voltage) is a plus
  • Exposure to NoSQL stores is a plus
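
For context on the MapReduce item above, here is a minimal, framework-free sketch of the pattern (word count) in plain Python. The helper names (`map_phase`, `shuffle`, `reduce_phase`) are illustrative only and not part of Hadoop or any listed tool; in a real cluster the map and reduce steps run distributed across nodes.

```python
from collections import defaultdict

def map_phase(lines):
    # Emit (word, 1) pairs, as a Hadoop mapper would.
    for line in lines:
        for word in line.split():
            yield word.lower(), 1

def shuffle(pairs):
    # Group values by key, as the framework's shuffle/sort stage would.
    grouped = defaultdict(list)
    for key, value in pairs:
        grouped[key].append(value)
    return grouped

def reduce_phase(grouped):
    # Sum the counts for each word, as a Hadoop reducer would.
    return {word: sum(counts) for word, counts in grouped.items()}

lines = ["big data big systems", "data analytics"]
counts = reduce_phase(shuffle(map_phase(lines)))
print(counts)  # {'big': 2, 'data': 2, 'systems': 1, 'analytics': 1}
```

The same map/shuffle/reduce structure carries over directly to Hadoop MapReduce jobs and Spark transformations.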
