
Hadoop Administrator Resume


Los Angeles, CA

SUMMARY

  • 10+ years of extensive IT experience, including 3+ years as a Hadoop Administrator and 5 years as a UNIX/Linux Administrator and SQL Developer, designing and implementing relational database models to meet business needs across different domains.
  • Experience in installation, management and monitoring of Hadoop cluster using Cloudera Manager.
  • Optimized the configurations of MapReduce, Pig, and Hive jobs for better performance.
  • Experience configuring backups and recovering from NameNode failures.
  • Expert level skills in Managing and Scheduling Jobs on a Hadoop cluster.
  • Ability to think creatively to help design innovative solutions to complex analytical questions.
  • Extensive experience in installation, configuration, management and deployment of Big Data components and the underlying infrastructure of Hadoop Cluster.
  • Good working knowledge of importing and exporting data between HDFS/Hive and databases such as MySQL, PostgreSQL, and Oracle using Sqoop (see the import/export sketch after this list).
  • Extensive experience in NoSQL and real time analytics.
  • Strong knowledge of YARN and of High-Availability Hadoop clusters.
  • Hands-on experience analyzing log files for Hadoop and ecosystem services and identifying root causes.
  • Experience with Chef, Puppet, and related configuration management tools.
  • As an admin, involved in balancing server loads and tuning servers for optimal performance of the cluster.
  • Expertise in installing, configuring, and managing Red Hat Linux 5 and 6.
  • Good experience scheduling cron jobs on Linux (see the cron sketch after this list).
  • Worked with PL/SQL stored procedures to create reports that required modified data input from the source.
  • Experience with the encryption security layer in the Hadoop environment.
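
For illustration, a minimal sketch of the Sqoop import/export flow referenced above; the hostname, database, table names, and credentials are placeholders rather than details from any engagement:

    # Import a MySQL table into HDFS (all connection details are hypothetical)
    sqoop import \
      --connect jdbc:mysql://dbhost:3306/sales \
      --username etl_user -P \
      --table orders \
      --target-dir /user/etl/orders \
      -m 4

    # Export processed results from HDFS back into MySQL
    sqoop export \
      --connect jdbc:mysql://dbhost:3306/sales \
      --username etl_user -P \
      --table order_summary \
      --export-dir /user/etl/order_summary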
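
Likewise, a minimal cron scheduling sketch; the script path and schedule are hypothetical:

    # Install via crontab -e: run a log-cleanup script every day at 02:30,
    # appending its output to a log file
    30 2 * * * /home/hadoop/scripts/clean_logs.sh >> /var/log/clean_logs.log 2>&1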

TECHNICAL SKILLS

Hadoop Framework: HDFS, MapReduce, Hive, Pig, ZooKeeper, Sqoop, HBase, Spark, Solr, Impala, Sentry

OS: RedHat Linux, UNIX, Windows 2000/NT/XP

Languages: C, C++, SAS, PL/SQL

Scripting Languages: Unix Shell scripting

Database: Oracle 10g/11g, SQL Server, Teradata

Database Tools: Oracle SQL Developer, SQL*Plus

Version Control: CVS, SVN

PROFESSIONAL EXPERIENCE

Confidential, Los Angeles, CA

Hadoop Administrator

Responsibilities:

  • Migrated the production clusters from the Cloudera distribution to the Hortonworks Data Platform.
  • Implemented and deployed new big data core services, including HDFS, MapReduce, YARN, HBase, Impala, Search, Flume, Hive, and Spark, in production environments on AWS and physical clusters.
  • Deployed and administered widely used services throughout the Hadoop ecosystem, including scheduling, data integration, and monitoring services.
  • Handled architecture, capacity planning, monitoring, maintenance, tuning, and workload management of all key services to ensure the systems meet SLAs.
  • Upgraded Ambari from 2.2 to 2.5 and HDP from 2.3 to 2.5.
  • Experience managing Hortonworks DataFlow (HDF).
  • Installed and deployed workflows on the production NiFi cluster.
  • Installed the Spark service and deployed production Spark jobs (see the spark-submit sketch after this list).
  • Responsible for day-to-day management of application clusters.
  • Supported users through a ticket-based system.
  • Provided 24x7 on-call support for Hadoop production clusters.
  • Provided infrastructure recommendations and capacity planning, and developed utilities to monitor the clusters better.
  • Managed large clusters with huge volumes of data.
  • Performed cluster maintenance tasks such as creation and removal of nodes, cluster monitoring, and troubleshooting.
  • Installed and implemented security for Hadoop clusters.
  • Installed Hadoop updates, patches, and version upgrades.
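
A representative sketch of deploying a production Spark job on YARN, as mentioned above; the class name, jar path, queue, and resource settings are illustrative assumptions:

    # Submit a Spark job to YARN in cluster mode
    spark-submit \
      --master yarn \
      --deploy-mode cluster \
      --queue production \
      --num-executors 10 \
      --executor-memory 4g \
      --class com.example.etl.DailyAggregation \
      /opt/jobs/daily-aggregation.jar 2017-01-01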

Environment: HDFS, Hue, Oozie, ZooKeeper, Impala, Kerberos, Flume, Sqoop, HDP, Ambari, Jira, DBoss, Spark, Kafka, HDF, NiFi

Confidential, Tampa, FL

Hadoop Administrator

Responsibilities:

  • Responsible for implementation and ongoing administration of Hadoop infrastructure
  • Collaborated with application teams to install operating system and Hadoop updates, patches, and version upgrades when required.
  • Upgraded the UAT and PROD clusters from CDH 5.1 to 5.4.3.
  • Worked with data delivery teams to set up new Hadoop users; this includes setting up Linux users, setting up Kerberos principals, and testing HDFS, Hive, Pig, and MapReduce access for the new users (see the sketch after this list).
  • Installed and implemented monitoring tools such as Ganglia and Nagios on both clusters.
  • Performed cluster maintenance, including creation and removal of nodes, using tools like Ganglia, Nagios, Cloudera Manager Enterprise, Dell OpenManage, and others.
  • Installed configuration management tools such as Puppet on the cluster.
  • Implemented and installed user authentication and authorization for the cluster using Kerberos and Sentry.
  • Built a new sandbox cluster for testing and moved data from the secure cluster to the insecure sandbox cluster using DistCp (see the example after this list).
  • Performance tuning of Hadoop clusters and Hadoop MapReduce routines
  • Screened Hadoop cluster job performance and did capacity planning
  • Monitor Hadoop cluster connectivity and security
  • Experience in encryption set up in Hadoop cluster
  • HDFS support and maintenance
  • Hands-on experience with Kafka and Storm
  • Aligning with the systems engineering team to propose and deploy new hardware and software environments required for Hadoop and to expand existing environments.
  • Diligently teaming with the infrastructure, network, database, application and business intelligence teams to guarantee high data quality and availability
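
A minimal sketch of the new-user setup described above, assuming a Kerberized cluster; the username, group, and realm are placeholders:

    # Create the Linux account and a matching Kerberos principal
    useradd -m -G hadoop jdoe
    kadmin.local -q "addprinc jdoe@EXAMPLE.COM"

    # Create the user's HDFS home directory
    # (first kinit as the HDFS superuser on a Kerberized cluster)
    sudo -u hdfs hdfs dfs -mkdir -p /user/jdoe
    sudo -u hdfs hdfs dfs -chown jdoe:jdoe /user/jdoe

    # Verify HDFS access as the new user
    su - jdoe -c "kinit jdoe@EXAMPLE.COM && hdfs dfs -ls /user/jdoe"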
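
And a sketch of the secure-to-insecure copy; the cluster hostnames and paths are illustrative. Run from the secure cluster, the fallback flag lets its client talk to the unsecured sandbox:

    # Authenticate on the secure cluster, then copy to the sandbox
    kinit etl_user@EXAMPLE.COM
    hadoop distcp \
      -D ipc.client.fallback-to-simple-auth-allowed=true \
      hdfs://prod-nn:8020/data/projects \
      hdfs://sandbox-nn:8020/data/projects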

Environment: Cloudera Manager, CDH 5.4.3, HDFS, YARN, Hue, Sentry, Oozie, ZooKeeper, Impala, Solr, Kerberos, cluster health, Puppet, Ganglia, Nagios, Flume, Sqoop, Storm, Kafka, KMS

Confidential, Baltimore, MD

Hadoop Administrator

Responsibilities:

  • Involved in the end-to-end process of Hadoop cluster setup: installation, configuration, and monitoring of the Hadoop cluster.
  • Responsible for cluster maintenance, commissioning and decommissioning DataNodes (see the decommissioning sketch after this list), cluster monitoring, troubleshooting, and managing and reviewing data backups and Hadoop log files.
  • Monitoring systems and services, architecture design and implementation of Hadoop deployment, configuration management, backup, and disaster recovery systems and procedures.
  • Installed various Hadoop ecosystem components and Hadoop daemons.
  • Responsible for installation and configuration of Hive, Pig, HBase, and Sqoop on the Hadoop cluster.
  • Configured property files such as core-site.xml, hdfs-site.xml, and mapred-site.xml based on job requirements.
  • Involved in loading data from UNIX file system to HDFS.
  • Provisioning, installing, configuring, monitoring, and maintaining HDFS, Yarn, HBase, Flume, Sqoop, Oozie, Pig, Hive
  • Monitored multiple Hadoop cluster environments using Ganglia and Nagios; monitored workload, job performance, and capacity planning using Ambari.
  • Expertise in recommending hardware configuration for Hadoop cluster
  • Installing, Upgrading and Managing Hadoop Cluster on Cloudera distribution
  • Troubleshot many cloud-related issues such as DataNodes going down, network failures, and missing data blocks.
  • Managing and reviewing Hadoop and HBase log files.
  • Experience with Unix or Linux, including shell scripting
  • Strong problem-solving skills
  • Loaded data from different data sources (Teradata and DB2) into HDFS using Sqoop and into partitioned Hive tables (see the sketch after this list).
  • Developed Hive UDFs to bring all the customer information into a structured format.
  • Developed bash scripts to fetch Tlog files from the FTP server and process them into Hive tables (a sketch follows this list).
  • Built automated set up for cluster monitoring and issue escalation process.
  • Administered, installed, upgraded, and managed Hadoop distributions (CDH3, CDH4, Cloudera Manager), Hive, and HBase.
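
For illustration, a minimal DataNode decommissioning sketch, assuming the excludes file is the one referenced by dfs.hosts.exclude in hdfs-site.xml; the hostname and file path are placeholders:

    # Add the node to the excludes file and tell the NameNode to re-read it
    echo "worker42.example.com" >> /etc/hadoop/conf/dfs.exclude
    hdfs dfsadmin -refreshNodes

    # Wait for the node to report "Decommissioned" before powering it off
    hdfs dfsadmin -report | grep -A 2 worker42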
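
A sketch of the Sqoop load into a partitioned Hive table; the Teradata connection string, table, and partition value are illustrative, and the Teradata connector for Sqoop is assumed to be installed:

    # Load one day's data from Teradata into a date-partitioned Hive table
    sqoop import \
      --connect jdbc:teradata://tdhost/DATABASE=retail \
      --username etl_user -P \
      --table DAILY_SALES \
      --hive-import \
      --hive-table retail.daily_sales \
      --hive-partition-key load_date \
      --hive-partition-value 2015-06-01 \
      -m 8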
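
And a bash sketch of the Tlog ingestion flow; the FTP host, credentials, paths, and table name are hypothetical:

    #!/bin/bash
    # Pull the day's Tlog file from the FTP server, stage it in HDFS,
    # and load it into a date-partitioned Hive table
    set -e
    DAY=$(date +%Y%m%d)
    mkdir -p /tmp/tlogs
    wget -nv "ftp://ftpuser:secret@ftphost/tlogs/tlog_${DAY}.dat" -P /tmp/tlogs
    hdfs dfs -put -f "/tmp/tlogs/tlog_${DAY}.dat" /data/staging/tlogs/
    hive -e "LOAD DATA INPATH '/data/staging/tlogs/tlog_${DAY}.dat'
             INTO TABLE retail.tlog PARTITION (load_date='$(date +%Y-%m-%d)')"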

Environment: Hadoop, HDFS, MapReduce, Shell Scripting, Spark, Solr, Pig, Hive, HBase, Sqoop, Flume, Oozie, ZooKeeper, cluster health, monitoring, security, Red Hat Linux, Impala, Cloudera Manager, Hortonworks.

Confidential, Herndon VA

Hadoop Administrator

Responsibilities:

  • Installed and configured various components of Hadoop ecosystem and maintained their integrity
  • Planned production cluster hardware and software installation and communicated with multiple teams to get it done
  • Designed, configured and managed the backup and disaster recovery for HDFS data.
  • Experience with Unix or Linux, including shell scripting
  • Installing, Upgrading and Managing Hadoop Cluster on Cloudera distribution.
  • Commissioned DataNodes as data grew and decommissioned them when the hardware degraded
  • Worked with systems engineering team to plan and deploy new Hadoop environments and expand existing Hadoop clusters.
  • Worked with application teams to install Hadoop updates, patches, version upgrades as required
  • Installed and Configured Hive, Pig, Sqoop and Oozie on the HDP cluster.
  • Involved in implementing High Availability and automatic failover for the NameNode, utilizing ZooKeeper services to overcome the single point of failure
  • Implemented the HDFS snapshot feature (see the example after this list)
  • Involved in the end-to-end process of Hadoop cluster setup: installation, configuration, and monitoring of the Hadoop cluster.
  • Ran monthly security checks across the UNIX and Linux environments and installed the security patches required to maintain a high security level for our clients
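
A minimal sketch of the HDFS snapshot feature mentioned above; the directory and snapshot name are illustrative:

    # Allow snapshots on a directory (run as the HDFS superuser), then take one
    sudo -u hdfs hdfs dfsadmin -allowSnapshot /data/warehouse
    hdfs dfs -createSnapshot /data/warehouse before-upgrade

    # List snapshottable directories; snapshot contents sit under .snapshot
    hdfs lsSnapshottableDir
    hdfs dfs -ls /data/warehouse/.snapshot/before-upgrade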

Environment: Hadoop, HDFS, MapReduce, Impala, Sqoop, HBase, Hive, Flume, Oozie, ZooKeeper, Solr, performance tuning, cluster health, monitoring, security, Shell Scripting, NoSQL/HBase/Cassandra, Cloudera Manager.

Confidential, Chicago, IL

LINUX/UNIX administrator

Responsibilities:

  • Provided cross-platform data access from Linux to Windows users by deploying code on WebLogic servers
  • Managed users on UNIX/Linux systems (see the sketch after this list)
  • Provided application support for various applications deployed in Linux environment.
  • Gathered requirements from the Engineering team and built application installation documents
  • Validated the Engineering requirements with Process Improvement teams
  • Managed Nodes, jobs and configuration using HPC Cluster Manager Tool
  • Monitored Sun Solaris and Linux Servers running in a 24x7 data center supporting approximately 100 servers
  • Installation, Maintenance, Administration and troubleshooting of Sun Solaris, AIX, HP-UX, Linux
  • As a Linux administrator, primary responsibilities included building new servers: rack mounting, OS installation, configuration of various OS-native and third-party tools, and securing of the OS
  • Installed and configured software and scheduled jobs
  • Experience with Unix or Linux, including shell scripting
  • Extensively worked on hard disk mirroring and striping with parity using RAID controllers.
  • Involved in server sizing and in identifying and recommending optimal server hardware based on user requirements
  • Support of Applications running on Linux machines for multiple clients.
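
For illustration, a sketch of routine Linux user management of the kind described above; account names and groups are placeholders:

    useradd -m -s /bin/bash -G developers jdoe   # create an account
    passwd jdoe                                  # set the initial password
    chage -M 90 jdoe                             # expire the password every 90 days
    usermod -aG wheel jdoe                       # grant sudo via the wheel group
    userdel -r olduser                           # remove a departed user and home dir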

Environment: Windows Server 2008/2007, Unix Shell Scripting, SQL Server Management Studio, Red Hat Linux, Microsoft SQL Server 2000/2005/2008, MS Access, NoSQL, Linux/UNIX, PuTTY Connection Manager, PuTTY, SSH.

Confidential, Jacksonville, FL

LINUX/UNIX administrator

Responsibilities:

  • Day-to-day administration of Sun Solaris and RHEL 4/5, including installation, upgrades, patch management, and package loading (see the patching sketch after this list)
  • Assisted with overall technology strategy and operational standards for the UNIX domains.
  • Managed problem tickets and service request queues, responded to monitoring alerts, executed change controls, and performed routine and preventative maintenance, performance tuning, and emergency troubleshooting and incident support
  • Performed day-to-day administration tasks like user management, space monitoring, performance monitoring and tuning, alert log monitoring, and backup monitoring.
  • Provided accurate root cause analysis and comprehensive action plans.
  • Managed daily system administration cases using BMC Remedy Help Desk
  • Investigated, installed, and configured a software fail-over system for production Linux servers
  • Monitored and maintained disk space, backup systems, and tape libraries, and implemented change controls, capacity planning, and growth projections on the systems.
  • Experience with Unix or Linux, including shell scripting
  • Planning and coordinating activities related to upgrades and maintenance on the systems.
  • Created status reports and project plans, and attended team meetings to coordinate activities.
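
A sketch of a routine RHEL patch cycle of the kind described above; the security-only update assumes the yum security plugin is installed, and the Solaris patch ID is a placeholder:

    yum check-update                  # list available updates
    yum update -y --security          # apply security errata only
    needs-restarting                  # from yum-utils: processes using stale libraries

    # Solaris equivalent: apply an individual patch with patchadd
    patchadd /var/tmp/119254-92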

Environment: Linux/Unix, Sun Solaris, Red hat Linux, Unix Shell Scripting, Oracle10g, SQL Server 2005, XML, Windows 2000/NT/2003 Server, UNIX.

Confidential

SQL Server Developer

Responsibilities:

  • Created Functional specifications documents based on the requirements
  • Extensively worked on stored procedures to migrate the legacy data into the warehouse, accommodating various business transformations
  • Designed T-SQL scripts to identify long-running queries and blocking sessions
  • Migrated data (import and export) from text files into SQL Server using BCP (see the example after this list)
  • Error handling using Try-Catch Block
  • Developed backup and restore scripts for SQL Server 2000
  • Installed and configured SQL Server 2000 in the test environment with the latest service packs
  • Involved in Unit, Functional and Integration testing process
  • Created highly complex SSIS packages using various Data transformations
  • Published and migrated data using SSIS and the DTS Wizard through control flow tasks and script tasks
  • Created logging for the ETL load at the package level and the task level, recording the number of records processed by each package and each task in a package
  • Responsible for Deploying, Scheduling Jobs, Alerting
  • Developed optimal stored procedures and queries to create data sets for reports
  • Designed and implemented customized report layouts
  • Generated Reports using Global Variables and Expressions
  • Successfully deployed reports in various formats such as XML, web browser, and PDF
  • Assisted development team in deploying and testing the application, which uses SQL Server as a database
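
For illustration, a minimal BCP sketch of the text-to-SQL-Server migration described above; the server, database, table, and credentials are placeholders:

    # Bulk-load a comma-delimited text file into a table
    bcp Warehouse.dbo.Customers in customers.txt -c -t"," -S SQLPROD01 -U loader -P secret

    # Export the table back out as character data (trusted connection)
    bcp Warehouse.dbo.Customers out customers_backup.txt -c -S SQLPROD01 -T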

Environment: MS SQL Server 2000, T-SQL, Windows NT and 2003
