Hadoop Admin Resume
PA
SUMMARY
- Over six years of IT experience in administration across platforms including cloud, Hadoop, and HP Service Manager.
- More than three years of extensive experience working on the Hadoop ecosystem.
- Experience in Configuring, Installing and Managing Apache Hadoop and Cloudera Hadoop.
- Extensive experience with Installing New Servers and rebuilding existing Servers.
- Experience in using Automation Tools like Puppet for Installing, Configuring and Maintaining Hadoop clusters.
- Experience in using Cloudera Manager 3.x, 4.x, 5.x and 5.5.x for Installation and Management of Hadoop Cluster.
- Experience in DRBD implementation for Name Node Metadata backup.
- Experience in HDFS High Availability.
- Experience in configuring, installing, benchmarking and managing Cloudera distribution of Hadoop on AWS, Virtual and Cloud servers.
- Expertise in writing Shell Scripts and Perl Scripts and debugging existing scripts.
- Experience in Performance Management of Hadoop Cluster.
- Experience in using Flume and Kafka to load log files into HDFS.
- Expertise in using Oozie to configure job flows.
- Experience in OS/Apache/RDBMS tuning.
- Managing cluster configuration to meet the needs of data analysis, whether I/O-bound or CPU-bound.
- Developed Hive queries and automated them to run on an hourly, daily, and weekly basis.
- Coordinating Cluster Services through Zookeeper.
- Importing and exporting data into HDFS and Hive using Sqoop.
- Experience in importing and exporting preprocessed data into commercial analytic databases such as Netezza and other RDBMS platforms.
- Over one year of experience in OpenView Service Desk, OpenView Service Center, and Hewlett-Packard Service Manager 9.30/7.11 administration for client PG.
- Experienced across various platforms and applications; coordinated with related support and implementation teams, assisted in customizing applications to the client's business requirements, and worked on incidents, problems, and change requests.
- Well experienced in data uploads, CI additions, form-level changes, top-level changes to DB rules and UI rules, daily checks, incident management, case-exchange issues, application server restarts, and performance issues.
- Provided second- to third-level support to several customers in the AMS region for CRM applications.
- Analyzed case-exchange issues with other CRM tools, different OVSD instances, Remedy, and other third-party applications.
- Scheduled proactive tasks like daily checks to avoid database incidents and application issues.
- Uploaded configuration items to the Production server as per client requests.
- Provided solutions for bugs in the HP SM tool.
- Generated Crystal Reports from HP Service Manager 9.x on customer request.
- Investigated application server downtimes.
- Imported data from Production to Test servers and vice versa.
- Customized Global List Values and Global Variables.
- Experience in Form Design customization.
- Experience on EGN settings, Format Controls, View Creation & Modification.
- Deep understanding of data warehouse approaches, industry standards and industry best practices.
- A self-starter, committed to achieving results in a team environment by working cohesively.
- Ability to learn quickly and apply new skills in addressing business and technical problems.
TECHNICAL SKILLS
Hadoop Ecosystem: HDFS, Hive, Pig, Flume, Oozie, ZooKeeper, Sqoop, Hue, Impala, Solr, Kafka, and Spark.
Automation Tools: Puppet, Cloudera Manager, MapR Control System (MCS).
Network Administration: TCP/IP fundamentals, wireless networks, LAN and WAN.
Languages: C, SQL, Pig Latin, UNIX Shell Scripting, UML.
Security: Kerberos, SSL/TLS encryption.
Monitoring and Alerting: Nagios, Ganglia, Cloudera Navigator.
PROFESSIONAL EXPERIENCE
Confidential, PA
Hadoop Admin
Responsibilities:
- Experience in configuring, installing, benchmarking, and managing Apache, Hortonworks, and Cloudera distributions of Hadoop on AWS cloud and virtual servers.
- Setting up Hadoop clusters on Cloudera CDH 5.3.1, 5.4, 5.5.1, and 5.5.2.
- Designed and implemented disaster recovery of the CLDB data.
- Importing and exporting data using Sqoop from Netezza, SQL and Oracle DBs.
- Designed the schema and implemented Hive tables for the most widely used database in the company.
- Performed data cleansing and exported the results to the data warehouse.
- Setting up security for the Hive databases.
- Provided 24/7 support for the team by maintaining the health of the cluster.
- Highly involved in operations and troubleshooting Hadoop clusters.
- Upgrading the cluster with the latest software releases for bug fixes.
- Developing ETL processes to pull data into the Hadoop cluster from different sources (FTP, data warehouse).
- Managing the day-to-day operations of the cluster, including backups.
- Involved in implementing security on Cloudera Hadoop Cluster using Kerberos by working along with operations team to move unsecured cluster to secured cluster.
- Implemented Hive Scripts according to the requirements.
- Jobs management using Fair scheduler.
- Cluster coordination services through Zookeeper.
- Installed multiple Hadoop and HBase clusters.
- Monitored cluster job performance and capacity planning.
- Implemented shell scripts for day-to-day log-rolling processes and automated them.
- Coordinating with other teams for data import and export.
- Implemented Oozie work-flow for ETL Process.
Environment: MapReduce, HDFS, Hive, Impala, Java, SQL, Cloudera Manager 5.5.1, Pig, Sqoop, Oozie, Flume, Kafka, and Sentry.
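The log-rolling automation mentioned above can be sketched as a small shell helper; the paths and retention windows here are illustrative assumptions, not the production values used on the cluster:

```shell
#!/bin/sh
# Hedged sketch of a log-rolling helper. Retention windows are
# illustrative defaults, not production settings.
roll_logs() {
    log_dir="$1"
    retain_days="${2:-1}"   # compress logs untouched for more than this many days
    purge_days="${3:-7}"    # delete compressed archives older than this
    # Compress plain logs that have not been modified recently
    find "$log_dir" -type f -name '*.log' -mtime +"$retain_days" -exec gzip -f {} \;
    # Purge old compressed archives
    find "$log_dir" -type f -name '*.log.gz' -mtime +"$purge_days" -delete
}
```

In practice a helper like this would be wired into cron so the rotation runs without manual intervention.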
Confidential, Weehawken, NJ
Hadoop Admin
Responsibilities:
- Installed multiple Hadoop and HBase clusters.
- Installed Cloudera Manager 4.x and 5 on CDH 4 and 5 versions.
- Installed Ganglia to monitor Hadoop daemons, implemented changes to configuration parameters, and monitored the effects of those changes in Ganglia.
- Collected web logs from different sources using Flume and loaded them into HDFS.
- Implemented Oozie work-flow for ETL Process.
- Transferred data from RDBMS to Hive and HDFS, and from Hive and HDFS back to RDBMS, using Sqoop.
- Implemented shell scripts for log-Rolling day to day processes and made it automated.
- Implemented DRBD for NameNode metadata backup.
- Coordinating Flume and HBase nodes and masters using ZooKeeper.
- Automated Hadoop cluster installation using Puppet.
- Was part of the CKP (Customer Knowledge Platform) project.
- Involved in ad hoc meetings to understand the client's requirements.
- Involved in Scrum meeting to provide day to day updates.
- Implemented Hive Scripts according to the requirements.
- Implemented designs to overcome the read/write complications of billions of records.
- Jobs management using Fair scheduler.
- Cluster coordination services through Zookeeper.
- Importing and exporting data into HDFS and Hive using Sqoop.
- Loading log data directly into HDFS using Flume.
Environment: Java 6 (JDK 1.6), Eclipse, Oracle 11g/10g, Subversion, Hadoop distributions (Hortonworks, Cloudera, DataStax), Hive, HBase, Linux, MapReduce, HDFS, IBM DataStage 8.1, PL/SQL, SQL*Plus, Toad 9.6, Windows NT, UNIX Shell Scripting.
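The Sqoop transfers between RDBMS and HDFS/Hive described above look roughly like the following. The JDBC URLs, table names, and database names are placeholders; building the command as a string allows a dry run without a live cluster:

```shell
#!/bin/sh
# Hedged sketch of Sqoop import/export invocations. Connection strings,
# tables, and paths below are placeholders, not real endpoints.
build_sqoop_import() {
    jdbc_url="$1"; table="$2"; hive_db="$3"
    printf 'sqoop import --connect %s --table %s --hive-import --hive-database %s --num-mappers 4\n' \
        "$jdbc_url" "$table" "$hive_db"
}

build_sqoop_export() {
    jdbc_url="$1"; table="$2"; export_dir="$3"
    printf 'sqoop export --connect %s --table %s --export-dir %s\n' \
        "$jdbc_url" "$table" "$export_dir"
}

# Dry run: print the commands instead of executing them
build_sqoop_import "jdbc:oracle:thin:@//db-host:1521/ORCL" "WEB_LOGS" "analytics"
build_sqoop_export "jdbc:oracle:thin:@//db-host:1521/ORCL" "DAILY_SUMMARY" "/user/hive/warehouse/daily_summary"
```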
Confidential, Wilmington, DE
Cloud admin
Responsibilities:
- Assisted in creation of ETL process for transformation of data sources from existing RDBMS Systems.
- Involved in various POC activities using technologies such as MapReduce, Hive, Pig, and Oozie.
- Involved in designing and implementation of service layer over HBase database.
- Imported data from various sources such as Oracle and Comptel servers into HDFS using tools such as Sqoop and MapReduce.
- Analyzed the data by performing Hive queries and running Pig scripts to know user behavior like frequency of calls, top calling customers.
- Continuously monitored and managed the Hadoop cluster through Cloudera Manager.
- Developed Hive queries to process the data and generate the data cubes for visualizing.
- Designed and developed scalable and custom Hadoop solutions as per dynamic data needs.
- Coordinated with technical team for production deployment of software applications for maintenance.
- Provided operational support services relating to Hadoop infrastructure and application installation.
- Supported technical team members in management and review of Hadoop log files and data backups.
- Participated in development and execution of system and disaster recovery processes.
- Formulated procedures for installation of Hadoop patches, updates and version upgrades.
- Automated processes for troubleshooting, resolution and tuning of Hadoop clusters.
Environment: Hadoop, Map Reduce, HDFS, Hive, Oozie, Java (JDK 1.6), Cloudera, NoSQL, Oracle 11g/10g, Toad 9.6, Windows NT, UNIX (Linux), Agile.
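One step of a disaster-recovery process like the one described above, archiving a metadata directory to a timestamped tarball, might be sketched as follows; the directory names are illustrative, not the actual cluster paths:

```shell
#!/bin/sh
# Hedged sketch of a metadata snapshot step in a disaster-recovery
# routine. Source and destination directories are illustrative.
backup_dir_snapshot() {
    src_dir="$1"
    dest_dir="$2"
    stamp=$(date +%Y%m%d%H%M%S)
    mkdir -p "$dest_dir"
    # Archive the contents of src_dir into a timestamped tarball
    tar -czf "$dest_dir/snapshot-$stamp.tar.gz" -C "$src_dir" .
    # Report the archive path so callers can verify or ship it off-host
    echo "$dest_dir/snapshot-$stamp.tar.gz"
}
```

A recovery drill would restore the tarball to a scratch directory and compare it against the source before trusting the procedure.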
Confidential
Senior Technical Consultant
Responsibilities:
- Implemented and configured clusters and application servers for HP Service Manager versions 7.11 and 9.30.
- Developed automation scripts for many manual scheduled jobs on application servers.
- Uploaded configuration items, new users, data imports, and UI rules.
- Involved in the migration of HP SM.
- Created a new database and migrated the old DB into the new system.
- Closely monitored scheduled tasks on the new system to ensure they ran efficiently.
- Provided solutions for many application and Database issues.
- Documented issues with solutions and shared them with the team.
- Implemented EGN settings, Format Controls, View Creation & Modification.
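A proactive daily check of the kind scheduled above might look like this minimal sketch; the mount point and threshold are illustrative defaults, not the values used on the client's servers:

```shell
#!/bin/sh
# Hedged sketch of a daily disk-usage check for application servers.
# Mount point and threshold are illustrative defaults.
check_disk() {
    mount_point="${1:-/}"
    threshold="${2:-90}"
    # Extract the use% column from POSIX df output for the mount point
    used=$(df -P "$mount_point" | awk 'NR==2 { gsub("%", "", $5); print $5 }')
    if [ "$used" -ge "$threshold" ]; then
        echo "WARN: $mount_point at ${used}% used (threshold ${threshold}%)"
        return 1
    fi
    echo "OK: $mount_point at ${used}% used"
}

check_disk / 90
```

Run from cron, the non-zero return on the WARN branch makes it easy to trigger an alert before the disk fills and causes a database incident.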