Job Seekers: Please send resumes to resumes@hireitpeople.com
Job Duties:
- Responsible for coding and testing Big Data/DW and ETL-related programs and processes.
- Work closely with a Big Data/DW/ETL architect, DW and BI developers, source application developers, and database administrators.
- Support, maintain and enhance these environments to achieve a cross-functional, integrated reporting and analysis environment for University users.
- Work with the Cloudera platform.
- Experience with Python programming.
- Experience with the Apache NiFi tool.
- Contribute to evolving design, architecture, and standards for building and delivering unique services and solutions.
- Implement innovative, best-in-industry technologies that expand Inovalon's infrastructure through robust, scalable solutions.
- Participate in the enterprise infrastructure vision and strategy.
- Focus on service reliability and sustainability.
- Develop and execute plans for complex systems backed by excellence, confidence, and thorough engineering analysis.
- Leverage metrics to manage the server fleet and complex computing systems to drive automation, improvement, and performance.
- Responsible for operational support for production systems and optimization.
- Upgrade and maintain HDP/CDH cluster components, including Ambari, in Dev, UAT, and Production environments.
- Provide support to development teams for Hadoop related issues.
- Create and manage data workflows in Apache NiFi based on requirements.
- Monitor Hadoop cluster capacity and manage queue capacities and user limit factors in YARN Queue Manager.
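The YARN queue-management duty above corresponds to tuning CapacityScheduler settings, whether through the Queue Manager UI or directly in `capacity-scheduler.xml`. The fragment below is a minimal, hypothetical sketch: the queue names (`dev`, `prod`) and the capacity and user-limit-factor values are illustrative assumptions, not values from the posting.

```xml
<!-- Hypothetical capacity-scheduler.xml fragment: two queues under root -->
<configuration>
  <property>
    <name>yarn.scheduler.capacity.root.queues</name>
    <value>dev,prod</value>
  </property>
  <property>
    <name>yarn.scheduler.capacity.root.dev.capacity</name>
    <value>30</value> <!-- guaranteed share, percent of cluster -->
  </property>
  <property>
    <name>yarn.scheduler.capacity.root.prod.capacity</name>
    <value>70</value>
  </property>
  <property>
    <name>yarn.scheduler.capacity.root.dev.maximum-capacity</name>
    <value>50</value> <!-- cap on elastic growth beyond the guarantee -->
  </property>
  <property>
    <name>yarn.scheduler.capacity.root.dev.user-limit-factor</name>
    <value>1</value> <!-- a single user may use up to 1x the queue capacity -->
  </property>
</configuration>
```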
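The NiFi workflow-management duty above can be sketched with a small example: composing a call to NiFi's REST API (`PUT /nifi-api/processors/{id}/run-status`) to start or stop a processor. This is a minimal illustration, not a description of any specific employer workflow; the base URL, processor id, and revision version are hypothetical.

```python
# Hedged sketch: building a NiFi REST API request to change a processor's
# run state. No request is sent; this only composes the URL and JSON body.
import json


def build_run_status_request(base_url, processor_id, version, state):
    """Return (url, json_body) for NiFi's PUT /processors/{id}/run-status endpoint."""
    url = f"{base_url}/nifi-api/processors/{processor_id}/run-status"
    body = {
        "revision": {"version": version},  # NiFi's optimistic-locking revision
        "state": state,                    # e.g. "RUNNING" or "STOPPED"
    }
    return url, json.dumps(body)


# Example usage with hypothetical values (an HTTP client such as requests
# would PUT this body with a Content-Type of application/json):
url, body = build_run_status_request("http://localhost:8080", "abc-123", 0, "RUNNING")
```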
Education: The minimum qualification required for the performance of the above specialty occupation duties is a bachelor's degree (or foreign equivalent) in Computer Science or a closely related field, with relevant experience.