Job Seekers, please send resumes to resumes@hireitpeople.com
Technical Hiring Criteria (Must Haves):
- Programming Languages: Spark, Scala
- Platform/Environment: Azure Databricks
- Database Management System: ADL
- Application Packages, etc.:
- Years of experience on each of the technical must-have skills: Spark, Scala - 3 years
Detailed Job Description:
- Minimum 6 years of hands-on experience in Big Data and/or Analytics, and all phases of the Big Data SDLC
- Work experience in ingestion, storage, querying, processing, and analysis of Big Data, with hands-on experience in Big Data ecosystem technologies such as MapReduce, Spark, and HDFS (see the sketch after this list)
- Working experience in Scala
- Hands-on experience with Object-Oriented Programming (OOP)
- In-depth understanding of Hadoop architecture and its various components, such as the Hadoop High Availability architecture
- The resource will work with Microsoft's Azure Intelligence Group on Spark and Scala development.
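The bullets above center on Spark development in Scala on Azure Databricks against data held in Azure Data Lake. As a rough, non-authoritative illustration of that kind of work, the sketch below reads raw files from the lake, applies a simple aggregation, and writes a curated output; the storage account, container, paths, and column names are hypothetical placeholders, not details taken from this posting.

```scala
// Minimal sketch of the Spark/Scala work described above, assuming a
// Databricks job or a standalone Spark application with ADLS access configured.
// Storage account, containers, and columns are hypothetical examples.
import org.apache.spark.sql.SparkSession
import org.apache.spark.sql.functions._

object EventIngestion {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder()
      .appName("EventIngestion")
      .getOrCreate()

    // Ingest raw CSV events from Azure Data Lake Storage (path is illustrative)
    val raw = spark.read
      .option("header", "true")
      .option("inferSchema", "true")
      .csv("abfss://raw@examplestorage.dfs.core.windows.net/events/")

    // Basic processing: drop rows without an event type, then count per type
    val summary = raw
      .filter(col("event_type").isNotNull)
      .groupBy(col("event_type"))
      .agg(count(lit(1)).as("event_count"))

    // Persist the curated result back to the lake as Parquet
    summary.write
      .mode("overwrite")
      .parquet("abfss://curated@examplestorage.dfs.core.windows.net/event_summary/")

    spark.stop()
  }
}
```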
Minimum years of experience*: 5+
Certifications Needed: No
Top 3 responsibilities you would expect the Subcon to shoulder and execute*:
- Solid experience creating PL/SQL packages, procedures, functions, triggers, views, and exception handling for retrieving, manipulating, validating, and migrating complex data sets
- Experience in addressing performance issues (see the tuning sketch after this list)
- Team player, motivated to learn new technologies, able to grasp new concepts and tools quickly, with strong analytical and problem-solving skills
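Given the Spark/Scala focus of the role, the following is a hedged sketch of routine performance tuning of the sort the second responsibility alludes to. It assumes a large "transactions" DataFrame joined to a small "dimCountry" lookup; the names, join key, and partition count are hypothetical.

```scala
// Illustrative Spark performance-tuning sketch (not a prescribed solution):
// repartition the large side on the join key, broadcast the small dimension
// table to avoid a shuffle join, and cache a result reused downstream.
import org.apache.spark.sql.DataFrame
import org.apache.spark.sql.functions.broadcast

object PerformanceTuning {
  def tune(transactions: DataFrame, dimCountry: DataFrame): DataFrame = {
    // Repartition by the join key to reduce shuffle skew on the large side
    val repartitioned = transactions.repartition(200, transactions("country_code"))

    // Broadcast the small dimension table so the join avoids a full shuffle
    val joined = repartitioned.join(broadcast(dimCountry), Seq("country_code"))

    // Cache the joined result because several downstream stages reuse it
    joined.cache()
  }
}
```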
Interview Process (Is face-to-face required?): No