Job Seekers, Please send resumes to resumes@hireitpeople.com
Must Have Skills:
- Hadoop
- Spark
Detailed Job Description:
- Experience developing in the Hadoop ecosystem, leveraging tools such as Spark, Pig, Hive, Sqoop, and others. Spark is a must-have skill
- Experience developing with scripting languages such as Python
- Experience developing with the AWS EMR managed service
- Experience developing solutions in Snowflake SQL
- Knowledge of workload automation tools such as Airflow, Autosys, ETL tools, etc.
Minimum years of experience*: 5+
Certifications Needed: No
Top 3 responsibilities you would expect the Subcon to shoulder and execute*:
- Experience developing in the Hadoop ecosystem, leveraging tools such as Spark, Pig, Hive, Sqoop, and others.
- Spark is a must-have skill
- Experience developing with scripting languages such as Python
Interview Process (Is face-to-face required?) No
Does this position require Visa independent candidates only? No