Job seekers, please send resumes to resumes@hireitpeople.com
Must Have Skills:
- Hadoop
- Scala
- Hive/Pyspark
- Experience participating in the design and implementation of analytics architectures
- Experience working with a Hadoop distribution, with a good understanding of core concepts and best practices
- Strong experience building and tuning Spark pipelines in Python/Java/Scala
- Strong experience writing complex Hive queries to drive business-critical insights
- Solid programming experience with Java/Python
- Understanding of Data Lake vs. Data Warehouse concepts
- Experience with AWS Cloud; exposure to Lambda/EMR/Kinesis is a plus
Minimum years of experience: 5-8
Certifications Needed: No
Top 3 responsibilities you would expect the Subcon to shoulder and execute:
- Design and development using Python/Scala
- Deliver as per Scrum/Agile methodology
- Onsite/offshore coordination
Interview Process (Is face to face required?): No
Does this position require Visa independent candidates only? No