Job Seekers, please send resumes to resumes@hireitpeople.com
Must Have Skills:
- Hive
- Sqoop
- Spark
- Big Data Cloud
- 5 years of experience as a Hive and Spark developer.
- Detailed understanding of Big Data and Hadoop architecture.
- Work experience in data ingestion and processing using Sqoop and Hive.
- Hands-on experience with HDFS, Hive, Sqoop, Spark, and Scala.
- Experience with the Spark framework in Scala and Python.
- Experience using Spark to ingest data and process it into a data lake (a minimal sketch follows this list).
- Good understanding of common Hadoop file formats such as Avro, Parquet, and ORC.
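As a point of reference for the ingestion items above, here is a minimal sketch in Spark with Scala, assuming a Spark session with Hive support, a hypothetical raw landing path on HDFS (e.g. populated by a Sqoop import), and a hypothetical data lake path; it is an illustration only, not this team's actual pipeline:

```scala
import org.apache.spark.sql.SparkSession
import org.apache.spark.sql.functions._

object IngestToDataLake {
  def main(args: Array[String]): Unit = {
    // Hypothetical session; Hive support lets Spark work alongside Hive tables
    val spark = SparkSession.builder()
      .appName("IngestToDataLake")
      .enableHiveSupport()
      .getOrCreate()

    // Hypothetical source: raw CSV files landed on HDFS (e.g. by a Sqoop import)
    val raw = spark.read
      .option("header", "true")
      .option("inferSchema", "true")
      .csv("hdfs:///landing/orders/")

    // Light cleanup before writing to the lake
    val cleaned = raw
      .filter(col("order_id").isNotNull)
      .withColumn("ingest_date", current_date())

    // Write to the (hypothetical) data lake path as Parquet, partitioned by ingest date
    cleaned.write
      .mode("overwrite")
      .partitionBy("ingest_date")
      .parquet("hdfs:///datalake/orders/")

    spark.stop()
  }
}
```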
Minimum years of experience: 8 - 10 years
Certifications Needed: No
Top 3 responsibilities you would expect the Subcon to shoulder and execute:
- Design and create DDL for Hive tables.
- Prepare design documents and S2T (source-to-target) mappings; manage data coming from different sources and in different formats.
- Design queries that join multiple tables and apply functions such as summation and aggregation (see the sketch after this list).
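To illustrate the DDL and aggregation responsibilities above, the following is a minimal sketch, assuming Spark with Hive support and hypothetical sales.orders and sales.customers tables, schemas, and HDFS locations; the actual tables and mappings would come from the project's design documents:

```scala
import org.apache.spark.sql.SparkSession

object HiveDdlAndAggregation {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder()
      .appName("HiveDdlAndAggregation")
      .enableHiveSupport()
      .getOrCreate()

    // Hypothetical DDL for an external Hive table stored as ORC
    spark.sql(
      """CREATE EXTERNAL TABLE IF NOT EXISTS sales.orders (
        |  order_id    BIGINT,
        |  customer_id BIGINT,
        |  amount      DECIMAL(10,2)
        |)
        |STORED AS ORC
        |LOCATION 'hdfs:///warehouse/sales/orders/'""".stripMargin)

    // Join with a (hypothetical) customers table and aggregate per customer
    val totals = spark.sql(
      """SELECT c.customer_name, SUM(o.amount) AS total_amount
        |FROM sales.orders o
        |JOIN sales.customers c ON o.customer_id = c.customer_id
        |GROUP BY c.customer_name""".stripMargin)

    totals.show()
    spark.stop()
  }
}
```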
Interview Process (is face-to-face required?): No
Does this position require visa-independent candidates only? No