Job ID: 36311
Company: Internal Postings
Location: Chicago, IL
Type: Contract
Duration: 6 Months
Salary: DOE
Status: Active
Openings: 1
Posted: 23 Mar 2022
Job seekers, please send resumes to resumes@hireitpeople.com

Must Have Skills:

  • Hadoop
  • Scala
  • Hive/PySpark

Detailed Job Description:
  • Participate in the design and implementation of the analytics architecture.
  • Experience working with Hadoop distributions, with a good understanding of core concepts and best practices.
  • Good experience building and tuning Spark pipelines in Python/Java/Scala.
  • Good experience writing complex Hive queries to drive business-critical insights.
  • Good programming experience with Java/Python.
  • Understanding of Data Lake vs. Data Warehouse concepts.
  • Experience with AWS Cloud; exposure to Lambda/EMR/Kinesis is a plus.

Minimum years of experience: 5-8 years

Certifications Needed: No

Top 3 responsibilities you would expect the Subcon to shoulder and execute:

  1. Design and development using Python/Scala
  2. Deliver as per Scrum/Agile methodology
  3. Onsite/offshore coordination

Interview Process (Is face to face required?): No

Does this position require Visa independent candidates only? No