Big Data/Hadoop Admin Resume
SUMMARY
- 5+ years of IT experience in the administration, installation, modification, and maintenance of Hadoop on the Linux RHEL operating system
- Solid understanding of operating systems including Linux, Unix, and Windows
- Experience installing, administering, and supporting operating systems and hardware in an enterprise environment (CentOS/RHEL)
- Experience across the complete software development life cycle, including design, development, testing, and implementation of moderately to highly complex systems
- Experience in IT systems design, systems analysis, development, and management
- Hands-on experience installing, configuring, supporting, and managing Hadoop clusters using Apache, Hortonworks, and Cloudera (CDH3, CDH4, CDH5) distributions with YARN
- Hands-on experience in Hadoop cluster capacity planning, performance tuning, monitoring, and troubleshooting
- Demonstrated ability to design Big Data solutions for traditional enterprise businesses
- Experience in data integrity, recovery, and high availability; service and data migration; disaster recovery planning; contingency planning; capacity planning; research and development; risk assessment and planning; and cost-benefit analysis
- Experience setting up, configuring, and managing security for Hadoop clusters
- Implementation experience configuring and tuning components such as HDFS, MapReduce, ZooKeeper, YARN, HBase, Hive, Impala, Sqoop, Oozie, and Cloudera Manager
- Installed various Hadoop ecosystem components and Hadoop daemons
- Involved in benchmarking Hadoop/HBase cluster file systems with various batch jobs and workloads
- Experience monitoring and troubleshooting issues with Linux memory, CPU, OS, storage, and network
- Experience designing, configuring, and managing backup and disaster recovery for Hadoop data
- Hands-on experience analyzing log files for Hadoop and ecosystem services and finding root causes
- Experience commissioning, decommissioning, balancing, and managing nodes, and tuning servers for optimal cluster performance
- Involved in cluster maintenance, troubleshooting, monitoring, and following proper backup and recovery strategies
- Experience with HDFS data storage and support for running MapReduce jobs
- Experience installing and configuring Hadoop ecosystem tools such as Sqoop, Pig, and Hive, along with Ansible
- Experience importing and exporting data with Sqoop between HDFS and relational database systems/mainframes (see the Sqoop sketch after this list)
- Supported performance optimization of HBase, Hive, and Pig jobs
- Worked closely with developers and analysts to address project requirements; able to manage time effectively and prioritize multiple projects
- Experience monitoring metrics on EC2, EBS, Elastic Load Balancing, and RDS using CloudWatch
- Experience with backup services such as EBS snapshots, S3 backups, and Amazon Glacier
- Experience with security controls such as security groups, IAM roles, and multi-factor authentication
- Experience bootstrapping nodes, writing recipes, and uploading cookbooks to Chef Server
- Experience with version control software such as Git and Bitbucket
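
A minimal sketch of the kind of Sqoop transfers referenced above; the connection string, credentials, table names, and HDFS paths are illustrative assumptions, not actual project values.

```sh
# Import a relational table into HDFS (all names/paths are hypothetical)
sqoop import \
  --connect jdbc:mysql://db.example.com:3306/sales \
  --username etl_user -P \
  --table orders \
  --target-dir /data/raw/orders \
  --num-mappers 4

# Export curated results back to the RDBMS
sqoop export \
  --connect jdbc:mysql://db.example.com:3306/sales \
  --username etl_user -P \
  --table order_summary \
  --export-dir /data/curated/order_summary
```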
PROFESSIONAL EXPERIENCE
BIG DATA/HADOOP ADMIN
Confidential
Responsibilities:
- Involved in the end-to-end Hadoop cluster setup process, including installation, configuration, and monitoring of the Hadoop cluster
- Performed cluster maintenance, commissioning and decommissioning of DataNodes, cluster monitoring, and troubleshooting (see the decommissioning sketch after this list)
- Added and removed nodes in an existing Hadoop cluster
- Implemented backup configurations and recovery from NameNode failures
- Monitored systems and services; contributed to architecture design and implementation of Hadoop deployment, configuration management, backup, and disaster recovery systems and procedures
- Configured property files such as core-site.xml, hdfs-site.xml, and mapred-site.xml based on job requirements (see the configuration sketch after this list)
- Imported and exported data into HDFS using Sqoop
- Installed various Hadoop ecosystem components and Hadoop daemons
- Installed and configured HDFS, ZooKeeper, MapReduce, YARN, HBase, Hive, Sqoop, Ansible, and Oozie
- Integrated Hive with HBase to perform analysis on data
- Managed and reviewed Hadoop log files as part of administration for troubleshooting purposes; communicated and escalated issues appropriately
- Applied standard backup policies to ensure high availability of the cluster
- Analyzed system failures, identified root causes, and recommended courses of action; documented system processes and procedures for future reference
- Worked with the systems engineering team to plan and deploy new Hadoop environments and expand existing Hadoop clusters
- Installed and configured Kerberos for authentication of users and Hadoop daemons (see the Kerberos sketch after this list)
- Monitored clusters with Ganglia and Nagios
- Environment: Hadoop, HDFS, ZooKeeper, MapReduce, YARN, HBase, Hive, Sqoop, Ansible, Oozie, Linux (CentOS, Ubuntu, Red Hat), Cloudera CDH, Hortonworks, Apache Hadoop, SQL*Plus, shell scripting
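
A sketch of a typical DataNode decommissioning flow as described above; the hostname and exclude-file path are assumptions and vary by distribution.

```sh
# Add the host to the exclude file referenced by dfs.hosts.exclude
# (path is hypothetical; CDH/HDP manage this file in their own locations)
echo "datanode07.example.com" >> /etc/hadoop/conf/dfs.exclude

# Ask the NameNode to re-read the include/exclude lists
hdfs dfsadmin -refreshNodes

# Track progress until the node reports "Decommissioned"
hdfs dfsadmin -report
```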
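An illustrative hdfs-site.xml fragment of the kind tuned per job requirements; the values shown are common defaults, not the cluster's actual settings.

```xml
<!-- hdfs-site.xml (illustrative values only) -->
<property>
  <name>dfs.replication</name>
  <value>3</value>
</property>
<property>
  <name>dfs.blocksize</name>
  <value>134217728</value> <!-- 128 MB -->
</property>
```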
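A minimal sketch of provisioning a Kerberos service principal and keytab for a Hadoop daemon; the realm, hostname, and keytab path are assumptions.

```sh
# Create a principal for the HDFS service on a given host
kadmin -q "addprinc -randkey hdfs/namenode01.example.com@EXAMPLE.COM"

# Export its keys to a keytab the daemon can read
kadmin -q "ktadd -k /etc/security/keytabs/hdfs.keytab hdfs/namenode01.example.com@EXAMPLE.COM"

# Verify the keytab contents
klist -kt /etc/security/keytabs/hdfs.keytab
```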
CLOUD (AWS) ADMIN
Confidential
Responsibilities:
- Maintained and administered Git as the source code management tool
- Created branches and labels and performed merges in Stash and Git
- Developed processes, tooling, and automation for the Jenkins-based build system and software build delivery
- Managed build results in Jenkins and deployed using workflows
- Maintained and tracked inventory using Jenkins and set alerts for when servers were full and needed attention
- Modeled the structure of multi-tiered applications and orchestrated the processes to deploy each tier
- Installed applications on AWS EC2 AMI, Red Hat, and Ubuntu instances
- Provided 24x7 application support for both QA and production environments
- Configured storage on S3 buckets (see the S3 sketch after this list)
- Worked with IAM to create new accounts, roles, and groups
- Developed build and deployment scripts using Ant and Maven as build tools in Jenkins to promote builds from one environment to the next (see the build sketch after this list)
- Hands-on experience in web application development using client-side technologies such as AngularJS and jQuery, as well as HTML, CSS, XML, and JavaScript
- Familiar and experienced with Agile Scrum development
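
A sketch of the S3 bucket setup referenced above, using the standard AWS CLI; the bucket name, region, and backup path are hypothetical.

```sh
# Create a bucket and enable versioning (names are illustrative)
aws s3 mb s3://example-app-backups --region us-east-1
aws s3api put-bucket-versioning \
  --bucket example-app-backups \
  --versioning-configuration Status=Enabled

# Copy a nightly backup into the bucket
aws s3 cp /var/backups/app.tar.gz s3://example-app-backups/nightly/
```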
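A minimal sketch of a Jenkins build step promoting a Maven artifact between environments; deploy.sh and its flags are hypothetical stand-ins for the project's actual deployment scripts.

```sh
# Build the artifact with Maven (tests skipped in this illustrative step)
mvn clean package -DskipTests

# Promote the artifact to the target environment
# (deploy.sh is a hypothetical wrapper script)
./deploy.sh --env qa --artifact target/app.war
```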
ASSOCIATE HADOOP ADMIN
Confidential
Responsibilities:
- Installed and configured MapReduce, Hive, and HDFS; implemented a CDH3 Hadoop cluster on CentOS; assisted with performance tuning and monitoring
- Created HBase tables to load large sets of structured, semi-structured, and unstructured data coming from UNIX systems, NoSQL stores, and a variety of portfolios
- Configured and modified Hadoop XML parameters according to hardware/storage requirements
- Managed databases from 40 TB to 100 TB on a Hadoop cluster
- Tuned XML parameters and monitored performance using CDH
- Developed workflows using custom MapReduce, Pig, Hive, and Sqoop
- Supported code/design analysis, strategy development, and project planning
- Created reports for the BI team, using Sqoop to export data into HDFS and Hive
- Developed multiple MapReduce jobs in Java for data cleaning and preprocessing (see the job-submission sketch after this list)
- Assisted with data capacity planning and node forecasting.
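
A sketch of submitting one of the Java MapReduce cleaning jobs mentioned above; the jar name, driver class, and HDFS paths are illustrative assumptions.

```sh
# Run a custom MapReduce job against raw input, writing cleaned output
hadoop jar data-cleaning.jar com.example.CleanRecordsJob \
  /data/raw/events /data/clean/events
```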