
Hadoop Admin Resume


PROFESSIONAL SUMMARY:

  • Over 8 years of experience in Business Analytics, design, implementation, and support of Systems Administration/Engineering in diverse industries, including hands-on experience with Big Data ecosystem technologies. Pragmatic and results-oriented, with a focus on bottom-line results, strong problem-solving and decision-making skills, and sound business acumen.
  • Worked with Apache Hadoop (HDFS, MapReduce, Hive, Pig, Sqoop, Oozie, ZooKeeper) for Big Data processing.
  • Extensive experience with the complete software development life cycle, including requirement gathering and analysis, planning, designing, developing, testing, and implementing.
  • Expertise in Hadoop cluster capacity planning, performance tuning, cluster monitoring, and troubleshooting.
  • Experience in installing, configuring, supporting, and monitoring Hadoop clusters using the Apache and Cloudera distributions.
  • Expertise in designing and implementing HDFS access controls, directory and file permissions, and user authorization that facilitate stable, secure access for multiple users in a large multi-tenant cluster.
  • Effectively addressed and managed rack-aware configuration for quick availability and processing of data.
  • Experience in understanding Hadoop security requirements and integrating with a Kerberos authentication infrastructure: KDC server setup, creating realms/domains, managing principals, generating a keytab file for each service, and managing keytabs with keytab tools (a hedged example follows this list).
  • Strong knowledge in configuring NameNode High Availability and NameNode Federation.
  • Experience in commissioning, decommissioning, balancing, and managing nodes, and in tuning servers for optimal cluster performance.
  • Experience in importing and exporting data with Sqoop between HDFS and relational database systems/mainframes (see the Sqoop sketch after this list).
  • Experience in deploying Hadoop clusters on public and private cloud environments such as Amazon Web Services (AWS) EC2.
  • Upgraded the Hadoop cluster from CDH3 to CDH4.
  • Configured backups and performed recovery from NameNode failures.
  • Excellent command of backup, recovery, and disaster-recovery procedures, including implementing backup and recovery strategies for offline and online backups.
  • Experience in writing shell scripts in Bash and Perl to automate database, application, backup, and scheduling tasks.
  • Experienced in Linux administration on Ubuntu, Red Hat, and CentOS.
  • Excellent team player with a strong ability to grasp new concepts and apply them per business requirements. Well organized, with interpersonal and developmental skills, strong work ethics, and the willingness to work hard to achieve goals and targets.
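
To illustrate the Kerberos workflow noted above, here is a minimal sketch of creating a service principal and exporting its keytab. The realm EXAMPLE.COM, the host name, and the paths are placeholders, not values from an actual cluster.

    # Placeholder realm and host; run on the KDC as a Kerberos admin.
    kadmin.local -q "addprinc -randkey nn/node1.example.com@EXAMPLE.COM"
    kadmin.local -q "xst -k /etc/security/keytabs/nn.service.keytab nn/node1.example.com@EXAMPLE.COM"
    # Verify the exported keytab:
    klist -kt /etc/security/keytabs/nn.service.keytab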
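
And a hedged sketch of the Sqoop import/export pattern mentioned above; the JDBC URL, credentials, table names, and HDFS paths are illustrative assumptions.

    # Hypothetical MySQL source; adjust the JDBC URL, user, tables, and paths.
    sqoop import --connect jdbc:mysql://dbhost/sales --username etl -P \
        --table orders --target-dir /user/etl/orders --num-mappers 4
    # Push analyzed results back to the relational database:
    sqoop export --connect jdbc:mysql://dbhost/sales --username etl -P \
        --table order_summary --export-dir /user/etl/order_summary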

PROFESSIONAL EXPERIENCE:

Confidential

Hadoop Admin

Responsibilities:

  • Worked on analyzing the Hadoop cluster and different big data analytic tools, including Pig, the HBase database, and Sqoop.
  • Responsible for building scalable distributed data solutions using Hadoop.
  • Day-to-day responsibilities included resolving developer issues, deploying code between environments, granting access to new users, providing immediate fixes to reduce impact, and documenting issues to prevent recurrence.
  • Implemented a ten-node CDH3 Hadoop cluster on Ubuntu Linux.
  • Involved in loading data from the Linux file system into HDFS (a hedged example follows this list).
  • Worked on cluster installation, commissioning and decommissioning of DataNodes, NameNode recovery, capacity planning, and slot configuration (see the decommissioning sketch after this list).
  • Implemented test scripts to support test driven development and continuous integration.
  • Worked on tuning the performance of Pig queries.
  • Worked with application teams to install operating system and Hadoop updates, patches, and version upgrades as required.
  • Responsible for managing data coming from different sources.
  • Loaded and transformed large sets of structured, semi-structured, and unstructured data.
  • Experience in managing and reviewing Hadoop log files.
  • Exported the analyzed data to the relational databases using Sqoop for visualization and to generate reports for the BI team.
  • Responsible for cluster maintenance, adding and removing cluster nodes, cluster monitoring and troubleshooting, manage and review data backups, manage and review Hadoop log files.
  • Installed the Oozie workflow engine to run multiple Hive and Pig jobs.
  • Analyzed large data sets to determine the optimal way to aggregate and report on them.
  • Supported setting up the QA environment and updating configurations for implementing scripts with Pig and Sqoop.
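
As referenced above, loading data from the Linux file system into HDFS typically comes down to commands like the following; the paths are examples only.

    # Copy local log files into HDFS (example paths):
    hadoop fs -mkdir /user/etl/landing
    hadoop fs -put /data/incoming/*.log /user/etl/landing
    # Confirm the upload:
    hadoop fs -ls /user/etl/landing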
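
The DataNode decommissioning mentioned above can be sketched as follows, assuming hdfs-site.xml points dfs.hosts.exclude at the excludes file shown; the host name is a placeholder.

    # Add the node to the excludes file referenced by dfs.hosts.exclude:
    echo "datanode5.example.com" >> /etc/hadoop/conf/dfs.exclude
    # Tell the NameNode to re-read its host lists and begin decommissioning:
    hadoop dfsadmin -refreshNodes
    # Watch the report until the node moves from "Decommission in progress" to "Decommissioned":
    hadoop dfsadmin -report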

Environment: Apache Hadoop, HDFS, Cloudera Manager, MapReduce, Hive, Pig, Sqoop, Oozie, Cassandra, MongoDB, SQL and Java.

Confidential

Linux/Database Administrator

Responsibilities:

  • Installed, configured, and updated Red Hat 7.x, 8, and 9; created system disk partitions, mirrored the root disk drive, and configured device groups in UNIX and Linux environments.
  • Performed scheduled backups and necessary restorations.
  • Installed the Nagios monitoring tool and set different alert levels for different parameters (system, memory, and hard disk).
  • Installed MySQL (5.34/5.5/6) databases on Red Hat Linux.
  • Performed various configurations including networking and iptables, resolving hostnames, and SSH key-based (passwordless) login (see the example after this list).
  • Working with VERITAS Volume Manager 3.5 and Logical Volume Manager for file system management, data backup and recovery.
  • Apache Server Administration with virtual hosting.
  • Proxy server configuration.
  • User administration, which included creating accounts for new users and deleting accounts for retired or departed users.
  • Installed and configured SSH Gate for remote, secured connections.
  • Automated administration tasks through scripting and job scheduling using cron (a crontab example follows this list).
  • Installation and configuration of Linux for new build environments.
  • Created volume groups, logical volumes, and partitions on the Linux servers and mounted file systems (an LVM sketch follows this list).
  • Deep understanding of monitoring and troubleshooting mission-critical Linux machines.
  • Improved system performance by working with the development team to analyze, identify, and resolve issues quickly.
  • Ensured data recovery by implementing system and application level backups.
  • Managed disk file systems, server performance, user creation, file-access permissions, and RAID configurations.
  • Installed and maintained the Linux servers.
  • Monitored system metrics and logs for any problems.
  • Ran cron jobs to back up data.
  • Added, removed, or updated user account information, reset passwords, etc.
  • Created and managed logical volumes.
  • Installing and updating packages using YUM.
  • Supported pre-production and production support teams in analyzing critical services and assisted with maintenance operations.
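
The SSH key-based (passwordless) login mentioned above is commonly set up as follows; the user and host names are placeholders.

    # Generate a key pair on the admin host (accept the default file locations):
    ssh-keygen -t rsa -b 4096
    # Install the public key on the managed server:
    ssh-copy-id admin@server1.example.com
    # Subsequent logins should no longer prompt for a password:
    ssh admin@server1.example.com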
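
A minimal crontab sketch for the scheduled backups described above; the schedule, paths, and archive target are assumptions, not the actual job definitions.

    # Added via `crontab -e`: nightly 02:30 archive of /etc (note the escaped %):
    30 2 * * * tar -czf /backup/etc-$(date +\%F).tar.gz /etc >> /var/log/etc-backup.log 2>&1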
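
And a minimal LVM sketch for the volume-group and logical-volume work above; the device name, sizes, and mount point are placeholders.

    pvcreate /dev/sdb                       # initialize the disk as a physical volume
    vgcreate datavg /dev/sdb                # create a volume group on it
    lvcreate -n datalv -L 50G datavg        # carve out a 50 GB logical volume
    mkfs.ext4 /dev/datavg/datalv            # create a file system on the volume
    mkdir -p /data
    mount /dev/datavg/datalv /data          # mount it for use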

Confidential

Business Analyst

Responsibilities:

  • Acted as a bridge between functional and technical aspects of the project.
  • Acted as a communication channel throughout the project life cycle.
  • Documented business requirements, data design, and process re-engineering to fit the business need.
  • Fulfilled user needs with system capabilities and required customizations.
  • Ensured engagement plans would enable profitable delivery.
  • Managed complex and large tenders and bid teams.
  • Set targets and goals for sales and management teams.
  • Presented business strategies and common goals clearly to colleagues.
  • Closely involved in writing all marketing literature.
  • Identified client requirements and the scope of the project.
  • Designed the approach/methodology for the project.
  • Prepared project reports for management.
