Job Seekers, please send resumes to resumes@hireitpeople.com
Must Have Skills (Top 3 technical skills only)*:
- Informatica BDM
- Hadoop / PySpark
- Hive
Nice to have skills (Top 2 only):
- Scala, Sqoop
- Oozie
Detailed Job Description:
- 6+ years of experience in the software development life cycle, with primary experience in DW/BI, big data, and related tools.
- Strong knowledge of DW architecture, ETL frameworks, solution design using Informatica Big Data Management, best practices, and troubleshooting.
- Proficiency in modern programming languages and tools: Python, XML, Informatica Big Data Management, Hortonworks, Hadoop, Spark, Scala, Hive, Oozie, and Sqoop.
- Responsible for the design and development of integration solutions with Hadoop/HDFS, data warehouses, and analytics solutions.
- Familiarity with HDInsight and Spark clusters.
- Work in a global delivery environment, engaging with teams in an onsite/offshore model.
Minimum years of experience*: 5+
Certifications Needed: Yes
Responsibilities you would expect the Subcon to shoulder and execute*:
- Design and implement high-performance, scalable, and optimized data solutions on top of terabytes of highly valuable data sets.
- Manage and improve the existing code base and add new features to support business needs.
- Support and troubleshoot data and/or system issues as needed.
- Work collaboratively with smart, diverse, and fast-growing teams to rapidly deliver high-quality solutions.
- Provide support as needed during weekend migrations.
- Mentor the technical development team on optimal utilization of big data solutions and Apache open-source software.
- Lead a high-performance team of core big data framework developers in areas such as data ingestion, data distribution, and data enrichment.
- Work in a global delivery environment, engaging with teams in an onsite/offshore model.
Does this position require Visa independent candidates only? Yes