Job ID :
11447
Company :
Internal Postings
Location :
IRVINE, CA
Type :
Contract
Duration :
6 months
Salary :
open
Status :
Active
Openings :
1
Posted :
23 Feb 2018
Job Seekers, please send resumes to resumes@hireitpeople.com
Job Details:

Must Have Skills (Top 3 technical skills only)

  • Solid administrative knowledge of Apache Hadoop (Cloudera distribution)
  • BI tool integrations with Hadoop; DBA experience; HBase experience with database replication and scaling
  • Design, install, and maintain highly available systems (including monitoring, security, backup, and performance tuning); Linux (RHEL) proficiency a must; scripting experience; automation experience (Chef/Ansible); good analytical and problem-solving skills
Detailed Job Description:

Job Responsibilities:


  • Analyze, design, create, and implement Big Data infrastructures, including access methods, device allocations, validation checks, organization, and security.
  • Design data models, logical and physical infrastructure designs, etc.
  • Assist in system planning, scheduling, and implementation
  • Perform cluster maintenance, including adding and removing nodes, using tools such as Ganglia, Nagios, Cloudera Manager Enterprise, and Dell OpenManage
  • Tune performance of Hadoop clusters and Hadoop MapReduce routines
  • Screen Hadoop cluster job performance and perform capacity planning
  • Monitor Hadoop cluster connectivity and security
  • Manage and review Hadoop log files
  • Manage and monitor file systems
  • Support and maintain HDFS
  • Team diligently with the infrastructure, network, database, application, and business intelligence teams to guarantee high data quality and availability
  • Collaborate with application teams to install operating system and Hadoop updates, patches, and version upgrades when required
  • Perform the DBA responsibilities of a Hadoop administrator: software installation and configuration, database backup and recovery, database connectivity and security, performance monitoring and tuning, disk space management, software patches and upgrades, and automation of manual tasks (see the illustrative sketch after this list).
  • Perform DWH administrator responsibilities, including developing, testing, and monitoring batch jobs for tasks such as ensuring referential integrity, performing primary key execution, accomplishing data restatements, and loading large data volumes in a timely manner.
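
The automation responsibilities above are typically scripted. As a minimal, illustrative sketch (not part of the posting), the Python below automates a routine HDFS capacity check of the kind a Hadoop administrator might schedule; it assumes the hdfs CLI is on the PATH, and the 80% alert threshold is a made-up value.

    #!/usr/bin/env python3
    # Illustrative sketch only: automate a routine HDFS capacity check of the
    # kind implied by "automation of manual tasks" above.
    # Assumes the `hdfs` CLI is on PATH; the 80% threshold is hypothetical.
    import re
    import subprocess
    import sys

    THRESHOLD_PCT = 80.0  # hypothetical alert threshold

    def dfs_used_percent():
        """Parse the cluster-wide 'DFS Used%' from `hdfs dfsadmin -report`."""
        report = subprocess.run(
            ["hdfs", "dfsadmin", "-report"],
            capture_output=True, text=True, check=True,
        ).stdout
        match = re.search(r"DFS Used%:\s*([\d.]+)%", report)
        if not match:
            raise RuntimeError("could not find 'DFS Used%' in the dfsadmin report")
        return float(match.group(1))

    if __name__ == "__main__":
        used = dfs_used_percent()
        print(f"HDFS DFS Used: {used:.1f}%")
        # A non-zero exit code lets a cron wrapper or Nagios check raise an alert.
        sys.exit(1 if used > THRESHOLD_PCT else 0)

A script like this would usually run from cron or be wrapped as a Nagios or Cloudera Manager alert, which is how the monitoring duties above are commonly operationalized.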

Required skills


  • Solid administrative knowledge of Apache Hadoop – Cloudera distribution
  • BI tool integrations with Hadoop
  • DBA experience
  • HBase experience with database replication and scaling
  • Design, install, and maintain highly available systems (including monitoring, security, backup, and performance tuning)
  • Linux (RHEL) proficiency a must
  • Scripting experience (an illustrative sketch follows this list)
  • Automation experience (Chef/Ansible)
  • Must possess good analytical and problem-solving skills
  • Participate in impact analysis and root cause analysis.
  • Strong sense of accountability, adaptability, flexibility and a sense of urgency.
  • Ability to work effectively with associates at all levels within the organization.
  • Demonstrated ability to establish priorities, organize and plan work to satisfy established timeframes.
  • Proven ability to handle multiple tasks and projects simultaneously.
  • Familiarity with Agile methodology
  • Excellent analytical skills, attention to detail, and problem-solving skills.
  • Passionate about continuously improving organizational practices; able to learn and apply new technologies quickly; self-directed.
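
As an illustration of the scripting experience referenced above, the sketch below polls service health through the Cloudera Manager REST API. It is a sketch under stated assumptions, not part of the posting: the host, port, API version, and credentials are placeholders.

    #!/usr/bin/env python3
    # Illustrative sketch only: report service health via the Cloudera Manager
    # REST API. Host, port, API version, and credentials are placeholders.
    import requests

    CM_URL = "http://cm-host.example.com:7180/api/v19"  # hypothetical endpoint
    AUTH = ("admin", "admin")                           # placeholder credentials

    def service_health(cluster_name):
        """Return {service name: health summary} for one cluster."""
        resp = requests.get(f"{CM_URL}/clusters/{cluster_name}/services", auth=AUTH)
        resp.raise_for_status()
        return {svc["name"]: svc.get("healthSummary", "UNKNOWN")
                for svc in resp.json().get("items", [])}

    if __name__ == "__main__":
        for name, health in service_health("cluster1").items():
            print(f"{name}: {health}")
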
Minimum years of experience: 5+

Certifications Needed: No.