Job ID: 38833
Company: Internal Postings
Location: Philadelphia, PA
Type: Contract
Duration: 12 Months
Salary: DOE
Status: Active
Openings: 1
Posted: 16 Nov 2022
Job Seekers, please send resumes to resumes@hireitpeople.com
Must Have Skills:
  • Big Data: Hadoop/Hive/Pig/Sqoop
  • Apache Spark
  • Python
  • Scala
Nice to have skills:
  • Sqoop
  • Oozie
Detailed Job Description:
  • Strong experience in Big Data and Java/J2EE technologies such as Apache Spark, Hadoop, Hive, Pig, Sqoop, Oozie, HBase, ZooKeeper, Python, and Scala.
  • Experience implementing Spark Core, Spark SQL, Java MapReduce, and Spark-on-Java applications.
  • Experience in real-time data processing using Spark Streaming and Kafka.
  • Experience with Apache Spark, Hive, and Pig analytical functions.
  • Experience analyzing large data sets by writing PySpark scripts and Hive queries.
  • Experience performing ETL using Spark and Spark SQL (a minimal sketch of this kind of pipeline follows this list).
  • Experience with installation, backup, recovery, configuration, and development on multiple Hadoop distribution platforms (Cloudera and Hortonworks), including cloud platforms (Amazon AWS and Google Cloud).
  • Experience with Agile/Scrum methodologies.
  • Strong UNIX/Linux systems administration skills, including configuration, troubleshooting, and automation.
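
A minimal PySpark sketch of the kind of Spark SQL ETL described above; the input path, column names, and output location are hypothetical and used for illustration only.

# Illustrative PySpark ETL: extract raw JSON events, apply a simple
# transform/aggregate with Spark SQL functions, and load Parquet output.
# All paths and column names below are assumptions, not from the posting.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("example-etl").getOrCreate()

# Extract: read raw events from a hypothetical landing zone.
raw = spark.read.json("/data/landing/events/")

# Transform: basic cleansing plus a daily count per event type.
daily_counts = (
    raw.filter(F.col("event_type").isNotNull())
       .withColumn("event_date", F.to_date("event_ts"))
       .groupBy("event_date", "event_type")
       .count()
)

# Load: write the result partitioned by date (Hive-style directory layout).
daily_counts.write.mode("overwrite").partitionBy("event_date").parquet(
    "/data/curated/daily_counts/"
)

spark.stop()

A job of this shape would typically be packaged and submitted with spark-submit against a Cloudera or Hortonworks cluster, or an equivalent managed Spark service on AWS or Google Cloud.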

Minimum years of experience: >10 years

Certifications Needed: No

Top 3 responsibilities you would expect the Subcon to shoulder and execute:

  1. Requirement Gathering and design
  2. Development and unit testing
  3. Deployment and support

Interview Process (Is face-to-face required?): No

Does this position require Visa independent candidates only? No