Job Seekers: Please send resumes to resumes@hireitpeople.com
Job Responsibilities:
- Set up DR (Disaster Recovery)
- Operationalize and maintain the current Enterprise Data Platform.
- Expand the current Cloudera platform to scale and support new business.
- Disaster recovery planning and implementation for a healthcare product
- Experience with Cloudera CDP Private Cloud and Public Cloud
- Repurpose clusters and add their nodes to the operational clusters.
- Strategize and test 24x7 availability of HBase and Solr PDVs
- Implement a hybrid Big Data solution using Cloudera and AWS in a secure environment
- Transition the EDP (Enterprise Data Platform) and its data from on-prem to the AWS Cloud
- Support cross-team and cross-project deployments and upgrades, including incident troubleshooting, to maintain the required service level.
- Work on complex, major, or highly visible tasks in support of multiple projects that require multiple areas of expertise.
- Automation and DevOps
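Migration duties like the EDP transition to AWS above typically involve scripted, automated transfers; large objects go to S3 via multipart upload, whose parts must be planned up front. A minimal stdlib-only sketch of that planning step, assuming the real S3 constraint that every part except the last must be at least 5 MiB (the helper name `plan_multipart_parts` is hypothetical, not an AWS API):

```python
def plan_multipart_parts(total_size, part_size=64 * 1024 * 1024):
    """Split a file of total_size bytes into (offset, length) parts.

    S3 multipart uploads require each part except the last to be
    at least 5 MiB; the part count feeds UploadPart calls (via boto3
    or the AWS CLI) in a real migration script.
    """
    MIN_PART = 5 * 1024 * 1024
    if part_size < MIN_PART:
        raise ValueError("part_size must be at least 5 MiB for S3 multipart uploads")
    parts = []
    offset = 0
    while offset < total_size:
        length = min(part_size, total_size - offset)
        parts.append((offset, length))
        offset += length
    return parts

# Example: a 130 MiB file with 64 MiB parts yields three parts
# (64 MiB, 64 MiB, 2 MiB); the short final part is allowed.
parts = plan_multipart_parts(130 * 1024 * 1024)
```

In practice each `(offset, length)` pair would drive one `upload_part` call, with the returned ETags collected for `complete_multipart_upload`.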
ESSENTIAL FUNCTIONS:
- 20% Develops and maintains infrastructure systems (e.g., data warehouses, data lakes) including data access APIs. Prepares and manipulates data using multiple technologies.
- 15% Interprets data, analyzes results using statistical techniques, and provides ongoing reports. Executes quantitative analyses that translate data into actionable insights. Provides analytical and data-driven decision-making support for key projects. Designs, manages, and conducts quality control procedures for data sets using data from multiple systems.
- 15% Develops data models by studying existing data warehouse architecture; evaluating alternative logical data models including planning and execution tables; applying metadata and modeling standards, guidelines, conventions, and procedures; planning data classes and sub-classes, indexes, directories, repositories, messages, sharing, replication, back-up, retention, and recovery.
- 15% Creates data collection frameworks for structured and unstructured data.
- 15% Improves data delivery engineering job knowledge by attending educational workshops; reviewing professional publications; establishing personal networks; benchmarking state-of-the-art practices; participating in professional societies.
- 10% Applies data extraction, transformation, and loading (ETL) techniques to connect large data sets from a variety of sources.
- 10% Applies and implements best practices for data auditing, scalability, reliability and application performance.
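The ETL function above (connecting data sets from multiple sources) can be sketched end to end with only the standard library. The schemas and names here (`members`, `claims`, the sample rows) are illustrative assumptions, not part of the posting:

```python
import csv
import io
import sqlite3

# Hypothetical source 1: membership records arriving as CSV text.
members_csv = "member_id,name\n1,Ada\n2,Grace\n"
# Hypothetical source 2: claim records from another system, as tuples.
claims = [(101, 1, 250.0), (102, 2, 75.5), (103, 1, 10.0)]

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE members (member_id INTEGER PRIMARY KEY, name TEXT)")
conn.execute(
    "CREATE TABLE claims (claim_id INTEGER PRIMARY KEY, member_id INTEGER, amount REAL)"
)

# Extract source 1 from CSV and load it.
reader = csv.DictReader(io.StringIO(members_csv))
conn.executemany(
    "INSERT INTO members VALUES (?, ?)",
    [(int(r["member_id"]), r["name"]) for r in reader],
)
# Load source 2 directly.
conn.executemany("INSERT INTO claims VALUES (?, ?, ?)", claims)

# Transform: join both sources and aggregate claim totals per member.
rows = conn.execute(
    """
    SELECT m.name, SUM(c.amount)
    FROM members m JOIN claims c ON c.member_id = m.member_id
    GROUP BY m.member_id ORDER BY m.member_id
    """
).fetchall()
# rows -> [('Ada', 260.0), ('Grace', 75.5)]
```

A production pipeline would swap the in-memory sources and SQLite for the platform's actual stores (e.g., HDFS, HBase, S3), but the extract/transform/load shape is the same.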
Qualifications:
- To perform this job successfully, an individual must be able to perform each essential duty satisfactorily.
- The requirements listed below are representative of the knowledge, skill, and/or ability required.
- Reasonable accommodations may be made to enable individuals with disabilities to perform the essential functions.
Preferred Qualifications:
Knowledge, Skills and Abilities (KSAs):
- Knowledge and understanding of at least one programming language (e.g., SQL, NoSQL, Python) (Expert)
- Knowledge and understanding of database design and implementation concepts (Expert)
- Knowledge and understanding of data exchange formats (Expert)
- Knowledge and understanding of data movement concepts (Expert)
- Strong technical, analytical, and problem-solving skills to troubleshoot and solve a variety of problems (Expert)
- Strong organizational and communication skills, written and verbal, with the ability to handle multiple priorities (Advanced)