Hadoop Administrator Resume
Dallas, TX
SUMMARY
- 7+ years of overall experience in systems administration and enterprise application development across diverse industries, including hands-on experience with Big Data ecosystem technologies.
- 5 years of comprehensive experience as a Big Data & Analytics Administrator.
- Experience working with MapReduce programs on Apache Hadoop for processing Big Data.
- Experience in installing, configuring, supporting, and monitoring Hadoop clusters and HDFS.
- Experience in using Pig, Hive and HBase.
- Experience in importing and exporting data with Sqoop between HDFS and relational database systems.
- Familiar with Java virtual machine (JVM) and multi-threaded processing.
- Knowledge of NoSQL databases such as HBase and relational databases such as MySQL.
- Knowledge of job workflow scheduling and coordination tools such as Oozie and ZooKeeper.
- Experience in designing, developing, and implementing connectivity products that allow efficient exchange of data between a core database engine and the Hadoop ecosystem.
- Excellent interpersonal and communication skills; creative, research-minded, technically competent, and results-oriented, with problem-solving and leadership skills.
TECHNICAL SKILLS
Hadoop/Big Data: HDFS, MapReduce, HBase, Pig, Hive, Sqoop
Java & J2EE Technologies: Core Java, Servlets, JSP, JDBC, JNDI, Java Beans
Programming languages: Java, Linux shell scripts
Databases: Oracle 11g/10g/9i, MS-SQL Server, HBase
Web Servers: WebLogic, WebSphere, Apache Tomcat
Web Technologies: HTML, XML, JavaScript
PROFESSIONAL EXPERIENCE
Confidential, Dallas, TX
Hadoop Administrator
Responsibilities:
- Experienced in Hadoop 1.x and 2.x installation and configuration, including Hadoop clients.
- Experienced in deploying Hadoop in single-node pseudo-distributed mode and fully distributed mode.
- Involved in installing Hadoop ecosystem components (Hadoop, MapReduce, YARN, Pig, Hive, Sqoop, Flume, ZooKeeper, and HBase).
- Involved in HDFS maintenance and administered it through the Hadoop Java API.
- Expert in importing and exporting data into HDFS using Sqoop and Flume.
- Configured the FIFO and Fair Schedulers to provide service-level agreements for multiple users of a cluster (a configuration sketch follows this list).
- Managed node connectivity and security on the Hadoop cluster.
- Managed user and group access to various Big Data environments.
- Experienced in service monitoring, service and log management, auditing and alerts, Hadoop platform security, and Kerberos configuration.
- Installed Hadoop components and patches.
- Used Sqoop to migrate data to and from HDFS and MySQL or Oracle, and deployed Hive-HBase integration to perform OLAP operations on HBase data (a representative sketch follows this role).
- Responsible for configuring the Hadoop cluster and troubleshooting common cluster problems.
- Handled issues related to cluster startup, node failures, and several Java-specific errors on the system.
- Performed cluster configuration and inter- and intra-cluster data transfer using DistCp and HFTP (a sample DistCp command also follows this role).
- Understanding of Hadoop 2.x architecture, the YARN framework, MapReduce, and HDFS Federation.
- Knowledge of Hadoop server roles and their usage, rack awareness, the anatomy of HDFS reads and writes, the replication pipeline, and data processing.
- Managed the production cluster through Cloudera Manager.
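A minimal sketch of the Fair Scheduler configuration referenced above, assuming Hadoop 2.x YARN; the queue names, weights, and file paths are illustrative rather than actual production values:

    # Point YARN at the Fair Scheduler (property added to yarn-site.xml):
    #   yarn.resourcemanager.scheduler.class =
    #     org.apache.hadoop.yarn.server.resourcemanager.scheduler.fair.FairScheduler

    # Illustrative allocations file defining two weighted queues.
    cat > /etc/hadoop/conf/fair-scheduler.xml <<'EOF'
    <?xml version="1.0"?>
    <allocations>
      <queue name="etl">
        <weight>2.0</weight>
        <minResources>10240 mb,4 vcores</minResources>
      </queue>
      <queue name="adhoc">
        <weight>1.0</weight>
      </queue>
    </allocations>
    EOF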
Environment: Hadoop, Linux, MapReduce, Pig, Sqoop, Java, Hive, HBase, UNIX Shell Scripting.
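A hedged sketch of the Sqoop and Hive-HBase work described in this role, assuming Sqoop 1.4.x and the standard Hive HBase storage handler; the connection details, table names, and column mappings are placeholders:

    # Import a MySQL table into HDFS (placeholder host, database,
    # table, and credentials).
    sqoop import \
      --connect jdbc:mysql://dbhost/sales \
      --username etl_user -P \
      --table orders \
      --target-dir /data/raw/orders \
      --num-mappers 4

    # Expose an existing HBase table to Hive for OLAP-style queries
    # (hypothetical table and column family names).
    hive -e "
    CREATE EXTERNAL TABLE orders_hbase (key string, amount double)
    STORED BY 'org.apache.hadoop.hive.hbase.HBaseStorageHandler'
    WITH SERDEPROPERTIES ('hbase.columns.mapping' = ':key,cf:amount')
    TBLPROPERTIES ('hbase.table.name' = 'orders');
    "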
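Representative DistCp invocations for the inter-cluster transfers mentioned above; cluster hostnames, ports, and paths are hypothetical:

    # Copy between two clusters of the same major version over HDFS RPC.
    hadoop distcp hdfs://nn-prod:8020/data/events hdfs://nn-dr:8020/data/events

    # Cross-version copy reading over HFTP (read-only, HTTP-based);
    # run from the destination cluster.
    hadoop distcp hftp://old-nn:50070/data/events hdfs://new-nn:8020/data/events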
Confidential, Phoenix, AZ
Hadoop Administrator
Responsibilities:
- Installed the NameNode, Secondary NameNode, YARN (ResourceManager, NodeManager, ApplicationMaster), and DataNodes.
- Installed and configured HDP 2.2.
- Responsible for implementation and ongoing administration of Hadoop infrastructure.
- Monitored an already configured cluster of 40 nodes.
- Installed and configured the Hadoop components HDFS, Hive, and HBase.
- Communicated with development teams and attended daily meetings.
- Addressed and troubleshot issues on a daily basis.
- Worked with data delivery teams to set up new Hadoop users, including setting up Linux users, setting up Kerberos principals, and testing HDFS and Hive access (an onboarding sketch follows this list).
- Performed cluster maintenance, including the addition and decommissioning of nodes (a decommissioning sketch follows this role).
- Monitored Hadoop cluster connectivity and security.
- Managed and reviewed Hadoop log files.
- File system management and monitoring.
- HDFS support and maintenance.
- Diligently teaming with the infrastructure, network, database, application and business intelligence teams to guarantee high data quality and availability.
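A sketch of the new-user onboarding flow described above, assuming MIT Kerberos with kadmin.local access on the KDC; the username, realm, and paths are placeholders:

    NEW_USER=jdoe   # hypothetical user

    # Create the OS account and a matching Kerberos principal.
    useradd "$NEW_USER"
    kadmin.local -q "addprinc ${NEW_USER}@EXAMPLE.COM"

    # Provision the user's HDFS home directory as the hdfs superuser.
    sudo -u hdfs hdfs dfs -mkdir -p /user/$NEW_USER
    sudo -u hdfs hdfs dfs -chown $NEW_USER:$NEW_USER /user/$NEW_USER

    # Smoke-test HDFS and Hive access as the new user.
    sudo -u $NEW_USER hdfs dfs -ls /user/$NEW_USER
    sudo -u $NEW_USER hive -e "SHOW DATABASES;"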
Environment: Hadoop, HDFS, Pig, Hive, MapReduce, Linux, Sqoop, SQL.
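A minimal decommissioning sketch for the node add/remove work in this role; the hostname and exclude-file path are assumptions:

    # Add the host to the exclude file referenced by dfs.hosts.exclude
    # in hdfs-site.xml (path is illustrative).
    echo "worker-17.example.com" >> /etc/hadoop/conf/dfs.exclude

    # Tell the NameNode to re-read its include/exclude lists; the node
    # then replicates its blocks away and reports "Decommissioned".
    sudo -u hdfs hdfs dfsadmin -refreshNodes

    # Watch progress.
    sudo -u hdfs hdfs dfsadmin -report | grep -A 1 "worker-17"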
Confidential, San Francisco, CA
Hadoop Support Analyst
Responsibilities:
- Participated in Hadoop Deployment and infrastructure scaling.
- Day-to-day Hadoop/HDFS maintenance and operations.
- Data cluster monitoring and troubleshooting.
- Participated in Hadoop capacity planning.
- Managed and reviewed data backups and restores.
- Big Data Environment tuning.
- Worked with data delivery teams to set up new Hadoop users.
- Designed and allocated HDFS quotas for multiple groups (a quota sketch follows this list).
- Participated in OS integration and application installation.
- Resolved cluster and application problems to restore service.
- Created runbooks for L1 Operations.
- Imported and exported data into HDFS and Hive using Sqoop.
- Involved in defining job flows and in managing and reviewing log files.
- Worked on developing Linux scripts for job automation (an example script follows this role).
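A sketch of the per-group quota allocation mentioned above; the directory and limits are illustrative:

    # Cap the name count (files + directories) and raw space (which
    # counts replication) on a group's working directory.
    sudo -u hdfs hdfs dfsadmin -setQuota 1000000 /data/analytics
    sudo -u hdfs hdfs dfsadmin -setSpaceQuota 10t /data/analytics

    # Verify the quotas in force.
    hdfs dfs -count -q /data/analytics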
Environment: Hadoop, MapReduce, HDFS, Hive, Oracle 11g, Java, Struts, Servlets, HTML, XML, SQL, J2EE, Tomcat 6.
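An example of the kind of Linux automation script this role involved; a hypothetical wrapper that runs a nightly Sqoop import with logging and failure alerting, where all names and paths are placeholders:

    #!/bin/bash
    # nightly_ingest.sh - hypothetical wrapper: import a table, log the
    # run, and alert on failure.
    LOG=/var/log/hadoop-jobs/nightly_ingest_$(date +%F).log

    {
      echo "=== ingest started $(date) ==="
      sqoop import \
        --connect jdbc:mysql://dbhost/sales \
        --username etl_user --password-file /user/etl/.db_pass \
        --table orders \
        --target-dir /data/raw/orders/$(date +%F) \
        --num-mappers 4
    } >> "$LOG" 2>&1

    if [ $? -ne 0 ]; then
      echo "ingest failed; see $LOG" | mail -s "nightly ingest FAILED" ops@example.com
      exit 1
    fi
    echo "=== ingest finished $(date) ===" >> "$LOG"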
Confidential
Jr System Administrator
Responsibilities:
- Worked on configuring and tuning system and network parameters for optimum performance.
- Wrote shell scripts to automate routine tasks.
- Developed tools to automate the deployment, administration, and monitoring of a large-scale Linux environment.
- Performed server tuning, operating system upgrades.
- Generated daily compliance reports for Service Releases, Emergency Releases, and Quarterly Releases using Transact-SQL (T-SQL) queries.
- Configured various security roles for providing access as per requirement.
- Installed and configured LDAP authentication (a configuration sketch follows this role).
- Installed and Configured Tomcat.
- Participated in the installation and configuration of BI applications (MicroStrategy and Informatica) on Linux servers.
- Debugged and corrected installed system software as required.
Environment: Red Hat Linux 6.x/7.x, Windows XP, Shell Scripting, Tomcat.
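A hedged sketch of the LDAP client setup on Red Hat 6.x using authconfig; the server URI and base DN are placeholders:

    # Enable LDAP for user lookup (NSS) and authentication (PAM);
    # --enablemkhomedir creates home directories at first login.
    authconfig \
      --enableldap \
      --enableldapauth \
      --ldapserver=ldap://ldap.example.com \
      --ldapbasedn="dc=example,dc=com" \
      --enablemkhomedir \
      --update

    # Verify that a directory user resolves through NSS.
    getent passwd jdoe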
Confidential
Jr Software Developer
Responsibilities:
- Responsible for and active in the analysis, design, implementation, and deployment phases of the project's full Software Development Life Cycle (SDLC).
- Designed and developed user interface using JSP, HTML and JavaScript.
- Developed Struts action classes and action forms, performed action mapping using the Struts framework, and performed data validation in form beans and action classes.
- Defined search criteria to pull customer records from the database, make the required changes, and save the updated records back to the database.
- Validated the fields of user registration screen and login screen by writing JavaScript validations.
- Used DAO and JDBC for database access.
- Developed stored procedures and triggers using PL/SQL to calculate values and update tables, implementing business logic.
- Designed and developed XML processing components for dynamic menus in the application.
- Involved in post-production support and maintenance of the application.
Environment: Oracle 11g, Java 1.5, Struts, Servlets, HTML, XML, SQL, J2EE, JUnit, Tomcat 6.