Job Seekers, please send resumes to resumes@hireitpeople.com
Must Have Skills:
- Big Data: Hadoop/Hive/Pig/Sqoop
- Apache Spark
- Python
- Scala
- Sqoop
- Oozie
- Strong experience in Big Data and Java/J2EE technologies such as Apache Spark, Hadoop, Hive, Pig, Sqoop, Oozie, HBase, ZooKeeper, Python, and Scala.
- Experience implementing Spark Core, Spark SQL, Java MapReduce, and Spark-on-Java applications.
- Experience in real-time data processing using Spark Streaming and Kafka.
- Experience with analytical functions in Apache Spark, Hive, and Pig.
- Experience analyzing large data sets by writing PySpark scripts and Hive queries.
- Experience performing ETL using Spark and Spark SQL (see the brief sketch after this list).
- Experience in installation, backup, recovery, configuration, and development on multiple Hadoop distributions (Cloudera and Hortonworks), including the Amazon AWS and Google Cloud platforms.
- Experience in Agile/Scrum methodologies
- Strong UNIX/Linux systems administration skills, including configuration, troubleshooting and automation
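For illustration only: a minimal PySpark sketch of the kind of Spark SQL ETL work referenced above. The paths, table layout, and column names (event_ts, event_type) are hypothetical placeholders, not part of this posting.

```python
# Illustrative ETL sketch with PySpark / Spark SQL; all names and paths are placeholders.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("etl-sketch").getOrCreate()

# Extract: read raw JSON events from a hypothetical HDFS location.
raw = spark.read.json("hdfs:///data/raw/events")

# Transform: basic cleansing plus a daily aggregate using Spark SQL functions.
daily = (
    raw.filter(F.col("event_type").isNotNull())
       .withColumn("event_date", F.to_date("event_ts"))
       .groupBy("event_date", "event_type")
       .agg(F.count("*").alias("event_count"))
)

# Load: write partitioned Parquet for downstream Hive queries.
daily.write.mode("overwrite").partitionBy("event_date").parquet("hdfs:///data/curated/daily_events")

spark.stop()
```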
Minimum years of experience: >10 years
Certifications Needed: No
Top 3 responsibilities you would expect the Subcon to shoulder and execute:
- Requirement Gathering and design
- Development and unit testing
- Deployment and support
Interview Process (Is face-to-face required?): No
Does this position require Visa independent candidates only? No