
Hadoop Admin Resume Profile


SUMMARY OF QUALIFICATIONS

  • Around 6 years of experience in the Information Technology industry.
  • Strong exposure to IT consulting, software project management, team leadership, design, development, implementation, maintenance/support and Integration of Enterprise Software Applications.
  • Experience in Big Data processing using Apache Hadoop. Extensive experience in HDFS, MapReduce, Pig and Hive.
  • Primary technical skills in Apache Hadoop, MapReduce, Pig, Hive, HBase, ZooKeeper, Sqoop, Flume, Oozie, Core Java, JavaScript, J2EE, Oracle 11G/10G, HP PPM 9.2.
  • Experience in Databases such as Oracle 11G/10G, DB2, Teradata, MySQL and Sybase.
  • Working knowledge of Oracle administration.
  • Worked on Visual SourceSafe (VSS).
  • Working knowledge of Windows and UNIX shell scripting.
  • Worked with IDEs such as PL/SQL Developer, TOAD, EditPlus 3, Eclipse 3.5, IntelliJ IDEA 7.2 and NetBeans 6.5.
  • Excellent working experience for large financial corporation clients such as AIG, Genworth and American Express.
  • Around 4.3 years of experience with HP PPM 9.2/9.1/8.0/7.5/7.1 (ITG) tools; implementations ranged from development and support to enhancements and upgrades.
  • Worked on different modules in PPM like Demand Management, Change Management, Project Management, Finance Management, Time Management, Resource Management, and Portfolio Management.
  • Knowledge of IT Service Management; implemented various processes per the ITIL V3 framework.
  • Knowledge of ServiceNow.
  • Knowledge of data warehousing concepts, the Cognos 8 BI suite and Business Objects.
  • Experience in preparing impact analysis documents for new enhancements and the upgrade process.
  • Proficient in ITG reports and portlets; developed complex JSP and SQL reports and portlets.
  • Experience in migration of objects across different environments using Deployment management module.
  • Working on development projects, which include design, development and unit testing of applications.
  • Working on production and support projects, which include reporting, prioritizing and resolving issues and defects.
  • Understand and implement the software development life cycle (SDLC) process.
  • Team player with good communication and interpersonal skills and a goal-oriented approach to problem solving.
  • Flexible in working with various technologies.
  • Oracle Database 11g Administrator Certified Associate.

TECHNICAL SKILLS

Big Data Technologies: Apache Hadoop, MapReduce, HDFS, Pig, Hive, HBase, ZooKeeper, Sqoop, Flume, Oozie.

Languages: Core Java, J2EE, SQL, PL/SQL, Unix Shell Scripting.

Web Technologies: JSP, EJB 2.0, JNDI, JMS, JDBC, HTML, JavaScript

Web/Application servers: Tomcat 6.0/5.0/4.0, JBoss 5.1.0

Databases: Oracle 11G/10G, SQL Server, DB2, Sybase, Teradata.

Operating Systems: MS-DOS, Windows XP, Windows 7, UNIX and Linux.

IDE: IntelliJ IDEA 7.2, EditPlus 3, Eclipse 3.5, NetBeans 6.5, TOAD, PL/SQL Developer, Teradata SQL Client.

Frameworks: Hadoop MapReduce, MVC, Struts 2.x/1.x.

Version Control: VSS (Visual SourceSafe), Subversion, CVS

Testing Technologies: JUnit 4/3.8

IT Governance Tool: HP PPM 9.2, 9.1, 8.0, 7.5, 7.1 and 6.0.

Project Management Tool: MS-Project.

Office Packages: MS-Office 2010, 2007, 2003 and Visio.

Business Intelligence: Business Objects XI 3.1, Cognos 8.4


PROFESSIONAL EXPERIENCE

Confidential

Role: Hadoop Admin

Responsibilities:

  • Worked on data analysis in HDFS using MapReduce, Hive and Pig jobs.
  • Worked on MapReduce programming and Hbase.
  • Involved in developing the Pig scripts.
  • Involved in developing the Hive Reports.
  • Developed Sqoop scripts to move data between Pig/HDFS and external databases.
  • Worked on setting up Pig, Hive and HBase on multiple nodes and developed solutions using Pig, Hive, HBase and MapReduce.
  • Monitoring and managing the Hadoop cluster through Cloudera Manager.
  • Involved in creating external tables and in partitioning and bucketing tables.
  • Coded Java/MapReduce Programs.
  • Scheduling and Monitoring jobs.
  • Extracted Hive query output to the local file system in text/CSV format.
  • Ensuring adherence to guidelines and standards in project process.
  • Responsible for creating and setting up the environment and for re-configuration activities.
  • Facilitated testing in different dimensions.
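The Hive-output extraction mentioned above (query results to local text/CSV) can be sketched in shell. The table name, query and output path below are hypothetical; the conversion step assumes Hive CLI's default tab-separated output:

```shell
# Sketch: export a Hive query result to a local CSV file.
# Hive CLI output is tab-separated; convert tabs to commas.
to_csv() {
  tr '\t' ','
}

if command -v hive >/dev/null 2>&1; then
  # Hypothetical table and output path.
  hive -e 'SELECT id, name FROM sales_summary LIMIT 100' | to_csv > /tmp/sales_summary.csv
else
  # No Hive client here: demonstrate the conversion on sample rows.
  printf '1\talice\n2\tbob\n' | to_csv
fi
```

The same pattern handles other delimiters by changing the `tr` arguments.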

Environment: Apache Hadoop, Pig, Hive, MapReduce, Sqoop, Oracle 11gR2, Java/J2EE, UNIX, Windows.

Confidential

Role: Hadoop Admin

Responsibilities:

  • Engineer in Big Data team, worked with Hadoop, and its Ecosystem.
  • Used Flume to collect, aggregate and store the web log data onto HDFS.
  • Wrote Pig scripts to run ETL jobs on the data in HDFS.
  • Used Hive to do analysis on the data and identify different correlations.
  • Knowledge of installation and configuration of Cloudera Hadoop in single-node and cluster environments.
  • Worked on setting up of environment and re-configuration activities.
  • Developed and maintained HiveQL and Pig scripts.
  • Facilitated testing in different dimensions.
  • The project involved file transmission and electronic data interchange: trade capture, verification, processing and routing operations, banking report generation, and operational management.
  • Developed and Modified Oracle Packages, Procedures, functions, Triggers as per the business requirements.
  • DBMS development included building data migration scripts using Oracle SQL*Loader.
  • Used crontab to automate scripts.
  • Wrote and modified stored procedures to load and modify data according to business rule changes.
  • Worked on production support environment.
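The crontab automation mentioned above typically amounts to an entry like the one below; the script path, schedule and log location are made-up examples, not the project's actual values:

```shell
# Sketch: schedule a nightly ETL wrapper script via cron.
# Path, time and log file are hypothetical.
CRON_ENTRY='30 1 * * * /opt/etl/load_trades.sh >> /var/log/etl/load_trades.log 2>&1'
echo "$CRON_ENTRY"

# To install it for the current user (left commented here):
# ( crontab -l 2>/dev/null; echo "$CRON_ENTRY" ) | crontab -
```

Redirecting both stdout and stderr into a log file, as above, keeps cron from mailing output and leaves a trail for production support.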

Environment: Apache Hadoop, Pig, Hive, Sqoop, Flume, Java/J2EE, Oracle 11G, JBoss 5.1.0 Application Server, Linux, Windows.

Confidential

Role: Hadoop Admin

Responsibilities:

  • Responsible for building scalable distributed data solutions using Hadoop. Worked hands on with ETL process using Pig.
  • Handled importing of data from various data sources, performed transformations using Hive, MapReduce, and loaded data into HDFS.
  • Extracted the data from Teradata into HDFS using Sqoop.
  • Analyzed the data by performing Hive queries and running Pig scripts to know user behavior like shopping enthusiasts, travelers, music lovers etc.
  • Exported the patterns analyzed back into Teradata using Sqoop.
  • Continuous monitoring and managing the Hadoop cluster through Cloudera Manager.
  • Developed Hive queries to process the data and generate data cubes for visualization.
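The Sqoop transfers described above might look roughly like the following; the host, credentials, table names and HDFS paths are assumptions for illustration only:

```shell
# Sketch: import a Teradata table into HDFS with Sqoop. The command
# is built as a string so it can be inspected; all connection
# details (host, user, tables, paths) are made up.
SQOOP_IMPORT="sqoop import \
  --connect jdbc:teradata://td-host/DATABASE=retail \
  --username etl_user -P \
  --table CUSTOMER_TXNS \
  --target-dir /data/raw/customer_txns \
  --num-mappers 4"
echo "$SQOOP_IMPORT"

# The reverse direction (exporting analyzed patterns back into
# Teradata) would use `sqoop export --export-dir ... --table ...`
# with the same connection string.
```

Splitting the import across mappers (`--num-mappers`) parallelizes the transfer; `-P` prompts for the password rather than embedding it in the command line.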

Environment: Apache Hadoop, MapReduce, HDFS, Sqoop, Hive, Pig, Teradata, Linux.

Confidential

Role: Linux Admin

Responsibilities

  • Requirements gathering, analysis and design.
  • Involved in Business analysis, Architecture Design, Data modeling, Security modeling, Functional Design, Specification review, Test case reviews and Programming.
  • Coordinating with offshore team members by providing the functional/technical inputs and get the work done in specified time limits.
  • Working on middle tier services using Java/J2EE, Spring and Hibernate frameworks.
  • Developing web pages using client-side technologies such as HTML, CSS, Adobe Flex, JavaScript and jQuery.
  • Involved in the full software development life cycle using methodologies such as Agile (Scrum) and Waterfall.
  • Designing database tables for new modules and writing SQL/PL SQL procedures for retrieval and insertion of data.
  • Configuring workflows in Mule and ActiveMQ to support newly added functionality.
  • Participate in weekly project meetings and technical discussions.
  • Testing developed modules before delivery to Quality Assurance.
  • Involved in configuring applications in various environments like Dev, QA and Production.
  • Heavily involved in migration projects from Microsoft applications to Java platform.
  • Involved in creating Business layer, Security layer, Database layer and Presentation layer.

Environment: Spring Framework 2.0, Quartz Scheduler, Hibernate 2.1, JSTL, JSP, JavaScript, AJAX, MS SQL Server 2008, JBoss, Agile (Scrum) and Waterfall, MyEclipse 6.0, Linux, LDAP, Subversion.

Confidential

Role: System Administrator (Red Hat Linux)

Responsibilities:

  • Worked as a System Administrator on Linux/UNIX platforms.
  • Administered Red Hat Linux servers (RHEL 3/4/5/6) for several functions, including managing Apache/Tomcat servers, mail servers, MySQL databases and firewalls in both development and production.
  • Responsibilities as a UNIX System Administrator included monitoring and tuning systems to ensure optimum performance.
  • Extensive experience in the concept of LVM, User System Resource Management and Job Scheduling.
  • Experience in Hardware and Software refreshes on the server.
  • Performed common system administration tasks including adding users, creating file systems, configuring volumes and running weekly mksysb backups.
  • Configured NIS, NFS, Samba, Sendmail and Apache services in Linux/UNIX environments.
  • Adding Oracle ASM disk to the server. Creating and managing LVM.
  • Experience in installation and configuration of HBAs and associated firmware/drivers and scanning and configuration of LUNs/Volumes.
  • Create users with limited and full root privileges. Create and manage sudoers.
  • Developed Linux shell scripts to automate repetitive tasks and simplify distributed administration.
  • Responsible for resolving network issues using network tools like ping, tcptraceroute, traceroute, tcpdump.
  • Install and configure various services like DHCP, NFS, DNS, Apache Web Server, NIS, LDAP, Samba, SSH, FTP/SFTP, Sendmail/SMTP, Telnet, RPM Package Management, File System Management.
  • Perform standard system administration tasks, such as patching, software and hardware upgrades, troubleshooting and problem resolution.
  • Monitored the servers and Linux scripts regularly and performed troubleshooting steps tested and installed the latest software on server for end-users.
  • Responsible for Patching Linux Servers.
  • Perform day to day Linux administration such as user accounts, logon scripts, directory services, file system shares, permissions. Support Oracle database.
  • Worked on SAN migration from EMC CLARiiON to DMX for Red Hat Linux.
  • Experience in binding, naming, resizing and unbinding LUNs through Navisphere.
  • Planning and implementing SAN/NAS equipment firmware and OS updates/upgrades.
  • Kept track of appropriate software and upgraded software packages.
  • Developed and maintained system documentation, libraries and procedural documents.
  • Provided 24x7 on call support on a rotation basis.
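Tasks such as the limited-privilege user creation and sudoers management above are often scripted along these lines; the user name and the single command granted are invented for illustration:

```shell
# Sketch: create a user with limited root privileges via a sudoers
# fragment. Requires root, so it is guarded to be a no-op otherwise.
USERNAME=deploy_ops
if [ "$(id -u)" -eq 0 ]; then
  useradd -m -s /bin/bash "$USERNAME"
  # Grant only an Apache restart, not full root:
  echo "$USERNAME ALL=(root) NOPASSWD: /sbin/service httpd restart" \
    > "/etc/sudoers.d/$USERNAME"
  chmod 0440 "/etc/sudoers.d/$USERNAME"
  # Always syntax-check a sudoers fragment before relying on it:
  visudo -cf "/etc/sudoers.d/$USERNAME"
else
  echo "not running as root; sudoers commands shown for illustration"
fi
```

Dropping fragments into /etc/sudoers.d (mode 0440) and validating them with `visudo -cf` avoids editing the main sudoers file directly.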
