Job Seekers, please send resumes to resumes@hireitpeople.com
Must Have Skills:
- AWS
- Spark
Detailed Job Description:
- Good knowledge of Snowflake, Airflow, Python, AWS, EC2, Cognos, and Kinesis.
- Design and implement distributed data processing pipelines using Spark, Hive, Sqoop, Python, and other tools and languages prevalent in the Hadoop ecosystem (a brief sketch follows this list).
- Ability to design and implement end-to-end solutions.
- Build utilities, user-defined functions, and frameworks to better enable data flow patterns.
- Research, evaluate, and utilize new technologies, tools, and frameworks centered around Hadoop and other elements in the Big Data ecosystem.
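For illustration only, the following is a minimal PySpark sketch of the kind of distributed processing pipeline described above; the bucket paths, column names, and aggregation logic are hypothetical and not taken from this role's actual codebase.

from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("daily-order-totals").getOrCreate()

# Read raw events from object storage (hypothetical bucket and layout).
orders = spark.read.parquet("s3a://example-bucket/raw/orders/")

# Cleanse and aggregate: completed orders rolled up per customer per day.
daily_totals = (
    orders
    .filter(F.col("status") == "COMPLETED")
    .withColumn("order_date", F.to_date("order_ts"))
    .groupBy("customer_id", "order_date")
    .agg(
        F.sum("amount").alias("total_amount"),
        F.count("*").alias("order_count"),
    )
)

# Write partitioned output for downstream consumers (e.g. a Snowflake load).
daily_totals.write.mode("overwrite").partitionBy("order_date").parquet(
    "s3a://example-bucket/curated/daily_order_totals/"
)

spark.stop()

In practice a job like this would typically be parameterized by run date and orchestrated with Airflow, per the skills listed above.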
Minimum years of experience*: 5+
Certifications Needed: No
Responsibilities you would expect the Subcon to shoulder and execute*:
- Good knowledge of Snowflake, Airflow, Python, AWS, EC2, Cognos, and Kinesis; design and implement distributed data processing pipelines using Spark, Hive, Sqoop, Python, and other tools and languages prevalent in the Hadoop ecosystem.
- Ability to design and implement end-to-end solutions.
- Build utilities, user-defined functions, and frameworks to better enable data flow patterns (a brief UDF sketch follows this list).
- Research, evaluate, and utilize new technologies, tools, and frameworks centered around Hadoop and other elements in the Big Data ecosystem.
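As a small illustration of the user-defined-function responsibility above, here is a minimal Spark SQL UDF sketch; the function, column names, and sample data are hypothetical and not taken from this engagement.

from pyspark.sql import SparkSession
from pyspark.sql import functions as F
from pyspark.sql.types import StringType

spark = SparkSession.builder.appName("udf-example").getOrCreate()

@F.udf(returnType=StringType())
def normalize_phone(raw):
    # Strip non-digit characters so phone numbers compare consistently downstream.
    return "".join(ch for ch in raw if ch.isdigit()) if raw else None

# Tiny in-memory sample to show the UDF applied as a column expression.
df = spark.createDataFrame(
    [("a1", "(555) 123-4567"), ("a2", "555.987.6543")],
    ["account_id", "phone"],
)
df.withColumn("phone_clean", normalize_phone("phone")).show()

spark.stop()

Packaging UDFs like this into shared utility modules is one common way such "frameworks to better enable data flow patterns" are built and reused across pipelines.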
Interview Process (Is face-to-face required?): No
Does this position require Visa independent candidates only? No