Job ID: 11560
Company: Internal Postings
Location: Woodcliff, NJ
Type: Contract
Duration: 6 Months
Salary: Open
Status: Active
Openings: 1
Posted: 27 Feb 2018
Job Seekers, please send resumes to resumes@hireitpeople.com
Description:
  • The Market Intelligence team's goal is to complete data ingestion and data preparation to enable a consistent, innovative, services-based model supporting the Data Lake, smart analytics, visualizations, and the translation of predictive and prescriptive concepts into reality.
  • Import and export data between an external RDBMS and Hadoop clusters.
  • Proven understanding of Hadoop, Hive, Spark, HDFS, and Scala.
  • Knowledge of the Hadoop ecosystem.
  • Good knowledge of database structures, theories, principles, and practices.
  • Hands-on experience with data loading tools such as Flume, Sqoop, and NiFi.
  • Hands-on experience with HiveQL.
  • Knowledge of workflow schedulers such as Oozie.
  • Knowledge of Talend for data extraction, data transformation, and data loading.
  • Ability to write high-performance, reliable, and maintainable code.
  • Analytical and problem-solving skills applied to the Big Data domain.
  • Sharp analytical abilities and proven design skills.
  • Knowledge of or experience in building data integrations and an understanding of enterprise integration patterns and concepts.
Day-to-Day Job Duties: (what this person will do on a daily/weekly basis)
  • Import and export data between an external RDBMS and Hadoop clusters.
  • Develop ingestion and data preparation scripts (a minimal sketch of one possible approach follows this list).
  • Design a data ingestion and loading framework that allows smooth, error-free data migration for initial and incremental loads.
  • Define source-to-target data mappings.
  • Define workflows and schedules using schedulers such as Oozie.
  • Develop scripts using Talend for data extraction, data transformation, and data loading.
  • Write high-performance, reliable, and maintainable code.
  • Understand and design the target data structure to build data integrations, utilizing enterprise integration patterns and concepts.
  • Collaborate with data scientists to design a target data model that supports predictive analytics and dashboards.
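The ingestion and preparation duties above could be approached in several ways; the following is a minimal, illustrative sketch in Scala (not part of the posting), assuming Spark with Hive support on the cluster, a hypothetical MySQL source reached over JDBC, and an assumed target table market_intelligence.transactions_curated partitioned by load_date. All connection details, table names, and column names are assumptions for illustration only.

    import org.apache.spark.sql.{SparkSession, SaveMode}
    import org.apache.spark.sql.functions._

    // Minimal sketch: import from an external RDBMS via JDBC, prepare the data,
    // and append an incremental batch into a partitioned Hive table.
    // Connection details, table names, and the load_date column are assumptions.
    object MarketIntelligenceIngest {
      def main(args: Array[String]): Unit = {
        val spark = SparkSession.builder()
          .appName("market-intelligence-ingest")
          .enableHiveSupport()
          .getOrCreate()

        // Import: read the source table from the external RDBMS over JDBC.
        val source = spark.read
          .format("jdbc")
          .option("url", "jdbc:mysql://source-db:3306/sales") // hypothetical source
          .option("dbtable", "transactions")
          .option("user", "etl_user")
          .option("password", sys.env.getOrElse("ETL_DB_PASSWORD", ""))
          .load()

        // Prepare: apply a simple source-to-target mapping and basic cleansing.
        val prepared = source
          .withColumnRenamed("txn_amt", "transaction_amount")
          .withColumn("load_date", current_date())
          .filter(col("transaction_amount").isNotNull)

        // Load: append the incremental batch into a date-partitioned Hive table.
        prepared.write
          .mode(SaveMode.Append)
          .partitionBy("load_date")
          .format("parquet")
          .saveAsTable("market_intelligence.transactions_curated")

        spark.stop()
      }
    }

An initial full load would typically use SaveMode.Overwrite, while scheduled incremental runs (for example, triggered from an Oozie workflow) append only the new partition; the appropriate JDBC driver must be available on the cluster classpath.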
Basic Qualifications: (skills required for this job, with the minimum years of experience for each)
  • Minimum 4 years of experience in Big Data.
  • Proven understanding of Hadoop, Hive, Spark, HDFS, and Scala.
  • Knowledge of the Hadoop ecosystem.
  • Good knowledge of database structures, theories, principles, and practices.
  • Hands-on experience with data loading tools such as Flume, Sqoop, and NiFi.
  • Hands-on experience with HiveQL.
  • Knowledge of workflow schedulers such as Oozie.
  • Knowledge of Talend for data extraction, data transformation, and data loading.
  • Analytical and problem-solving skills applied to the Big Data domain.
  • Sharp analytical abilities and proven design skills.
  • Minimum 5 years of experience in a global team environment working with offshore teams.
  • Minimum 3 years of experience conceptualizing and architecting target environments for Big Data.
Travel: No travel. The person is required to work 5 days a week in the NJ office.
Degree: Bachelor's in Computer Science or equivalent work experience
Nice to Have: (but not a must)
  • Project lead and architecture experience.