Job Seekers, please send resumes to resumes@hireitpeople.com
Must Have Skills:
- Big data
- Hadoop
- HDFS
- Scala
- Azure
- Hive/Pyspark
- Required: 14+ years of experience in data warehouse, big data, and Hadoop implementations in an Azure environment, primarily in a lead role
- Participate in design and implementation of the analytics architecture.
- Experience working with a Hadoop distribution, with a good understanding of core concepts and best practices
- Good experience building and tuning Spark pipelines in Python/Java/Scala
- Good experience writing complex Hive queries to drive business-critical insights
- Understanding of Data Lake vs. Data Warehouse concepts
- Experience with AWS Cloud; exposure to Lambda/EMR/Kinesis is a plus
- Work with multiple teams to arrive at common decisions; understand the organization's architecture principles and provide solutions accordingly
Minimum years of experience: 5 - 8 years
Certifications Needed: No
Top 3 responsibilities you would expect the Subcon to shoulder and execute:
- As an Architect, provide various solution options, guide the team, resolve complex problems, and review deliverables
- Design and development using Python and Scala
- Deliver as per Scrum/Agile methodology
Interview Process (Is face-to-face required?): No
Does this position require Visa independent candidates only? No