Job seekers, please send resumes to resumes@hireitpeople.com
Detailed Job Description:
- Absolute Must Haves
- Big Data (Cloudera) Admin
- Solr
- Core Administrator (this is NOT a Big Data Developer or Big Data Engineer role)
Job Responsibilities:
- Set up DR (Disaster Recovery)
- Operationalize and maintain the current Enterprise Data Platform.
- Expand the current Cloudera platform to scale and support new business.
- Disaster recovery planning and implementation for a healthcare product
- Experience with Cloudera CDP private and public cloud
- Repurpose clusters and add their nodes to the operational clusters.
- Plan and test 24x7 availability of HBase and Solr PDVs
- Implement a hybrid Big Data solution using Cloudera and AWS in a secure environment
- Transition EDP (Enterprise data platform) and data from on-prem to AWS Cloud
- Support cross-team and cross-project deployments and upgrades, including troubleshooting incidents to maintain the required service level.
- Work on complex, major, or highly visible tasks in support of multiple projects that require multiple areas of expertise
- Automation and DevOps
ESSENTIAL FUNCTIONS:
- 20% Develops and maintains infrastructure systems (e.g., data warehouses, data lakes) including data access APIs. Prepares and manipulates data using multiple technologies.
- 15% Interprets data, analyzes results using statistical techniques, and provides ongoing reports. Executes quantitative analyses that translate data into actionable insights. Provides analytical and data-driven decision-making support for key projects. Designs, manages, and conducts quality control procedures for data sets using data from multiple systems.
- 15% Develops data models by studying existing data warehouse architecture; evaluating alternative logical data models including planning and execution tables; applying metadata and modeling standards, guidelines, conventions, and procedures; planning data classes and sub-classes, indexes, directories, repositories, messages, sharing, replication, back-up, retention, and recovery.
- 15% Creates data collection frameworks for structured and unstructured data.
- 15% Improves data delivery engineering job knowledge by attending educational workshops; reviewing professional publications; establishing personal networks; benchmarking state-of-the-art practices; participating in professional societies.
- 10% Applies extract, transform, and load (ETL) techniques to connect large data sets from a variety of sources.
- 10% Applies and implements best practices for data auditing, scalability, reliability and application performance.
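The ETL duty above (connecting data sets from multiple sources) can be sketched in miniature as follows. This is an illustrative toy only; the source names, columns, and values are hypothetical and not taken from the posting or any specific platform.

```python
# Minimal ETL sketch: extract rows from two sources, transform (aggregate),
# and load the joined result into SQLite. All names/data are illustrative.
import csv
import io
import sqlite3

# Extract: two small in-memory "sources" standing in for real feeds.
members_csv = "member_id,name\n1,Ana\n2,Ben\n"
claims_csv = "member_id,amount\n1,120.50\n1,30.00\n2,75.25\n"

members = list(csv.DictReader(io.StringIO(members_csv)))
claims = list(csv.DictReader(io.StringIO(claims_csv)))

# Transform: total claim amount per member.
totals = {}
for row in claims:
    key = row["member_id"]
    totals[key] = totals.get(key, 0.0) + float(row["amount"])

# Load: write the joined result into a SQLite table.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE member_totals (member_id TEXT, name TEXT, total REAL)")
conn.executemany(
    "INSERT INTO member_totals VALUES (?, ?, ?)",
    [(m["member_id"], m["name"], totals.get(m["member_id"], 0.0)) for m in members],
)
result = dict(conn.execute("SELECT name, total FROM member_totals").fetchall())
print(result)  # {'Ana': 150.5, 'Ben': 75.25}
```

In production this pattern would be carried out by the platform's own tooling (e.g., Spark or Hive jobs on the Cloudera cluster) rather than single-process Python, but the extract/transform/load shape is the same.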
Qualifications:
- To perform this job successfully, an individual must be able to perform each essential duty satisfactorily.
- The requirements listed below are representative of the knowledge, skill, and/or ability required.
- Reasonable accommodations may be made to enable individuals with disabilities to perform the essential functions.