Job Seekers, please send resumes to resumes@hireitpeople.com
Must Have Skills:
- Hadoop
- Scala
- Hive
- PySpark
- Azure
- HDFS
Detailed Job Description:
- 8+ years of experience required as a lead developer with data warehouse, big data, and Hadoop implementations in an Azure environment.
- Participate in the design and implementation of the analytics architecture.
- Experience working with Hadoop distributions and a good understanding of core concepts and best practices.
- Good experience building and tuning Spark pipelines in Python/Java/Scala.
- Good experience writing complex Hive queries to drive business-critical insights.
- Understanding of data lake vs. data warehouse concepts.
- Experience with AWS Cloud; exposure to Lambda/EMR/Kinesis is good to have.
- Work with multiple teams to arrive at common decisions, understand the organization's architecture principles, and provide solutions accordingly.
Experience required: 5-8 years
Top responsibilities you would expect the subcontractor to shoulder and execute:
- As lead developer, resolve technical queries, develop queries, and work with QA and other teams on the final implementation.
- Design and development using Python/Scala.
- Deliver per Scrum/Agile methodology.
- Onsite/offshore coordination.
Interview Process (Is face-to-face required?): No