Job Seekers, please send resumes to resumes@hireitpeople.com

Description:
- The Market Intelligence team's goal is to complete data ingestion and preparation, enabling a consistent and innovative services-based model that brings the Data Lake, smart analytics, visualizations, and predictive and prescriptive concepts to reality.
- Import and export data between an external RDBMS and Hadoop clusters (see the sketch after this list).
- Develop ingestion and data preparation scripts.
- Design a data ingestion and loading framework that allows smooth, error-free data migration for both initial and incremental loads.
- Define source-to-target data mappings.
- Define workflows and schedules using schedulers such as Oozie.
- Develop scripts using Talend for data extraction, transformation, and loading.
- Write high-performance, reliable, and maintainable code.
- Understand and design the target data structures, building data integrations using enterprise integration patterns and concepts.
- Collaborate with data scientists to design a target data model that supports predictive analytics and dashboards.
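As a concrete illustration of the ingestion responsibilities above, here is a minimal sketch of one common pattern: a Sqoop incremental import from an RDBMS into HDFS, with the recurring run submitted through Oozie. The database URL, credentials, table, column, and paths are hypothetical placeholders, not details from this posting.

    # Initial and incremental import from a hypothetical MySQL source into HDFS.
    # With --incremental append, each run imports only rows whose order_id is
    # greater than --last-value; starting at 0 makes the first run a full load.
    sqoop import \
      --connect jdbc:mysql://db.example.com/sales \
      --username etl_user \
      --password-file /user/etl/.password \
      --table orders \
      --target-dir /data/raw/orders \
      --incremental append \
      --check-column order_id \
      --last-value 0

    # Submit the recurring load to Oozie; job.properties is assumed to point at
    # a coordinator definition that wraps the Sqoop action above.
    oozie job -oozie http://oozie.example.com:11000/oozie -config job.properties -run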
- Basic Qualifications (skills required, with minimum years of experience for each):
- Minimum 4 years of experience in Big Data.
- Proven understanding of Hadoop, Hive, Spark, HDFS, and Scala.
- Knowledge of the Hadoop ecosystem.
- Good knowledge of database structures, theories, principles, and practices.
- Hands-on experience with data loading tools such as Flume, Sqoop, and NiFi.
- Hands-on experience with HiveQL (see the sketch after this list).
- Knowledge of workflow schedulers such as Oozie.
- Knowledge of Talend for data extraction, transformation, and loading.
- Analytical and problem-solving skills applied to the Big Data domain.
- Sharp analytical abilities and proven design skills.
- Minimum 5 years of experience in a global team environment, working with offshore teams.
- Minimum 3 years of experience conceptualizing and architecting target environments for Big Data.
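To ground the HiveQL qualification above, a minimal sketch of the kind of query work implied: exposing the ingested files as a partitioned external Hive table and registering a daily partition. The table, columns, and paths are hypothetical examples, not details from this posting.

    # Create an external Hive table over the ingested data and add one daily
    # partition; names and locations are illustrative only.
    hive -e "
      CREATE EXTERNAL TABLE IF NOT EXISTS raw_orders (
        order_id    BIGINT,
        customer_id BIGINT,
        amount      DECIMAL(10,2)
      )
      PARTITIONED BY (load_date STRING)
      ROW FORMAT DELIMITED FIELDS TERMINATED BY ','
      LOCATION '/data/raw/orders';

      ALTER TABLE raw_orders ADD IF NOT EXISTS
        PARTITION (load_date='2020-01-01')
        LOCATION '/data/raw/orders/load_date=2020-01-01';
    "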
Degree: Bachelor's in Computer Science or equivalent work experience.
Nice to Have (but not a must):
- Project lead and architecture experience.