Hadoop Admin & Developer Resume

El Segundo, CA

SUMMARY:

  • 7+ years of experience in software development.
  • 3+ years of experience with the Hadoop framework and its ecosystem.
  • Familiar with data architecture, including data ingestion pipeline design, Hadoop information architecture, data modeling, data mining, machine learning and advanced data processing.
  • Real-time experience with Hadoop/Big Data technologies for the storage, querying, processing and analysis of data.
  • Excellent knowledge of Hadoop architecture and ecosystem components such as HDFS, Job Tracker, Task Tracker, Name Node and Data Node.
  • Expertise in writing Hadoop jobs for analyzing data using MapReduce, Hive and Pig.
  • Knowledge of installing, configuring and using Hadoop ecosystem components such as Hadoop MapReduce, HDFS, HBase, Oozie, Hive, Sqoop, Pig, Zookeeper and Flume.
  • Experience in managing and reviewing Hadoop log files.
  • Experience in analyzing data using HiveQL, Pig Latin, HBase and custom Map Reduce programs in Java.
  • Experience in importing and exporting data between HDFS and RDBMS using Sqoop.
  • Experienced in extending Hive and Pig core functionality by writing custom UDFs using Java (a minimal sketch follows this list).
  • Experience in building and maintaining multiple Hadoop clusters of different sizes and configurations, and in setting up the rack topology for large clusters.
  • Experience in the installation, configuration, support and management of Cloudera's Hadoop platform, including CDH3 and CDH4 clusters.
  • Experience in NoSQL databases such as HBase, Cassandra and MongoDB.
  • Experienced in job workflow scheduling tool like Oozie.
  • Experienced in managing Hadoop clusters using Cloudera Manager and MapR management tools.
  • Experience in performance tuning by identifying bottlenecks in sources, mappings, targets and partitioning.
  • Experienced in back-end database programming using SQL, PL/SQL, stored procedures, functions, macros, indexes, joins, views, packages and database triggers.
  • Good experience using tools such as SQL Developer, Toad, WinSCP and PuTTY.
  • Hands on experience in application development using Java, RDBMS, and Linux shell scripting.
  • Ability to adapt to evolving technology and a strong sense of responsibility.
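
To illustrate the custom UDF work referenced above, here is a minimal sketch of a Hive UDF in Java. It is a hypothetical example rather than code from an actual engagement; the class name NormalizeUDF and its trim/lower-case behavior are assumptions for illustration.

```java
import org.apache.hadoop.hive.ql.exec.UDF;
import org.apache.hadoop.io.Text;

// Hypothetical example: a Hive UDF that normalizes free-text fields by
// trimming whitespace and lower-casing, in the style of the custom UDFs
// described above. A UDF like this would be registered in Hive with:
//   ADD JAR normalize-udf.jar;
//   CREATE TEMPORARY FUNCTION normalize AS 'NormalizeUDF';
public class NormalizeUDF extends UDF {
    public Text evaluate(Text input) {
        if (input == null) {
            return null; // pass NULLs through unchanged
        }
        return new Text(input.toString().trim().toLowerCase());
    }
}
```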

TECHNICAL SKILLS:

Hadoop/Big Data: HDFS, MapReduce, HBase, Pig, Hive, Sqoop, Flume, MongoDB, Cassandra, PowerPivot, Puppet, Oozie, Zookeeper, Solr.

Java & J2EE Technologies: Core Java, Servlets, JSP, JDBC, JNDI, JavaBeans

IDEs: Eclipse, NetBeans

Big data Analytics: Datameer 2.0.5

Frameworks: MVC, Struts, Hibernate, Spring

Programming Languages: C, C++, Java, Python, Ant scripts, Linux shell scripts, Perl

Databases: Oracle 11g/10g/9i, MySQL, DB2, MS SQL Server

Web Servers: WebLogic, WebSphere, Apache Tomcat

Web Technologies: HTML, XML, JavaScript, AJAX, SOAP, WSDL

Network Protocols: TCP/IP, UDP, HTTP, DNS, DHCP

Testing: WinRunner, LoadRunner, QTP

Tools: SQL Developer, SOAP UI, ANT, Maven, IBM MDM

PROFESSIONAL EXPERIENCE:

Confidential, El Segundo, CA

Hadoop Admin & Developer

Responsibilities:

  • Analyzed the functional specifications based on project requirements.
  • Worked with technology and business user groups for Hadoop migration strategy.
  • Installed & configured multi-node Hadoop Cluster and performed troubleshooting and monitoring of Hadoop Cluster.
  • Loaded data from various data sources into Hadoop HDFS/Hive Tables.
  • Used Datameer for integration with Hadoop and other sources such as RDBMS (Oracle), SAS, Teradata and flat files.
  • Used Sqoop to transfer data from Teradata, DB2 and Oracle to Hadoop and vice versa.
  • Wrote Hive and Pig scripts to analyze customer satisfaction indexes, sales patterns, etc.
  • Extended Hive and Pig core functionality by writing custom UDFs using Java.
  • Orchestrated sqoop scripts, pig scripts, hive queries using Oozie workflows.
  • Worked on performance tuning of Hadoop jobs by applying techniques such as map-side joins, partitioning and bucketing, and by using different file formats such as SequenceFile, Parquet, RCFile and MapFile.
  • Worked on Data Lake architecture to build a reliable, scalable, analytics platform to meet batch, interactive and on-line analytics requirements.
  • Integrated Tableau with Hadoop data source for building dashboard to provide various insights on sales of the organization.
  • Worked on Spark in building BI reports using Tableau; Tableau was integrated with Spark using Shark and Spark SQL.
  • Implemented the Apache Crunch library on top of MapReduce and Spark for data aggregation.
  • Evaluated NoSQL databases (MongoDB, Cassandra and HBase) for a real-time services platform and recommended the best fit for the application.
  • Developed MapReduce programs in Java to perform various transformation, cleaning and scrubbing tasks (a representative mapper sketch follows this list).
  • Participated in daily scrum meetings and iterative development.
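
As an illustration of the cleaning and scrubbing MapReduce work above, below is a minimal map-only sketch in Java. The pipe delimiter, the expected field count and the class name are hypothetical assumptions, not project specifics.

```java
import java.io.IOException;
import org.apache.hadoop.io.LongWritable;
import org.apache.hadoop.io.NullWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.Mapper;

// Hypothetical map-only cleaning step: drops malformed pipe-delimited
// records and trims each field before the data lands in Hive tables.
public class ScrubMapper
        extends Mapper<LongWritable, Text, NullWritable, Text> {

    private static final int EXPECTED_FIELDS = 8; // illustrative assumption

    @Override
    protected void map(LongWritable key, Text value, Context context)
            throws IOException, InterruptedException {
        // -1 keeps trailing empty fields so the count check is accurate
        String[] fields = value.toString().split("\\|", -1);
        if (fields.length != EXPECTED_FIELDS) {
            return; // discard malformed records
        }
        StringBuilder cleaned = new StringBuilder();
        for (int i = 0; i < fields.length; i++) {
            if (i > 0) {
                cleaned.append('|');
            }
            cleaned.append(fields[i].trim());
        }
        context.write(NullWritable.get(), new Text(cleaned.toString()));
    }
}
```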

Environment: Hadoop, MapReduce, Spark, Shark, Hive, Pig, Sqoop, Datameer, Oracle, Teradata, SAS, Tableau, Java 7.0, Log4j, JUnit, MRUnit, SVN, JIRA, Cassandra.

Confidential, Tampa, FL

Hadoop Developer

Responsibilities:

  • Installed and configured a multi-node, fully distributed Hadoop cluster.
  • Involved in installing Hadoop Ecosystem components.
  • Responsible to manage data coming from different sources.
  • Involved in Hadoop cluster environment administration, including adding and removing cluster nodes. Analyzed and clustered data using Mahout.
  • Developed interfaces and conversions to load data from legacy systems into Oracle base tables using PL/SQL procedures, and developed various packages and functions.
  • Supported MapReduce programs running on the cluster.
  • Implemented batch indexing using MapReduce with Apache Solr.
  • Developed custom UNIX shell scripts to perform pre- and post-validation of master and slave nodes, before and after configuring the name node and data nodes respectively.
  • Involved in HDFS maintenance and administered it through the Hadoop Java API (see the sketch after this list).
  • Hands-on experience in writing custom UDFs as well as custom input and output formats.
  • Configured the Fair Scheduler to provide service-level agreements for multiple users of a cluster.
  • Maintaining and monitoring clusters. Loaded data into the cluster from dynamically generated files using Flume and from relational database management systems using Sqoop.
  • Managing nodes on Hadoop cluster connectivity and security.
  • Resolved configuration issues with Apache add-on tools.
  • Used Pig as an ETL tool to perform transformations, event joins, traffic filtering and some pre-aggregations before storing the data in HDFS.
  • Involved in writing Flume and Hive scripts to extract, transform and load the data into the database.
  • Performed cluster capacity planning, performance tuning, cluster monitoring and troubleshooting.
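
The HDFS maintenance bullet above refers to the Hadoop Java API; the following is a minimal sketch of the kind of scripted maintenance task that API supports. The /data/landing path and the zero-byte-file cleanup rule are hypothetical assumptions for illustration.

```java
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileStatus;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;

// Hypothetical maintenance sketch: scan a landing directory and remove
// zero-byte files left behind by failed ingestion runs.
public class HdfsCleanup {
    public static void main(String[] args) throws Exception {
        Configuration conf = new Configuration(); // picks up core-site.xml
        FileSystem fs = FileSystem.get(conf);
        Path landing = new Path("/data/landing"); // illustrative path
        for (FileStatus status : fs.listStatus(landing)) {
            if (status.isFile() && status.getLen() == 0) {
                fs.delete(status.getPath(), false); // non-recursive delete
            }
        }
        fs.close();
    }
}
```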

Environment: Cloudera Hadoop, Linux, HDFS, Hive, Sqoop, Flume, Zookeeper, HBase, SQL, UNIX Shell Scripting.

Confidential, Cincinnati, OH

Java Developer

Responsibilities:

  • Followed agile methodologies to implement the project.
  • Involved in the Analysis and Designing of the Application.
  • Gathered and validated the requirements for the application.
  • Designed sequence diagrams and class diagrams using UML/ Rational Rose.
  • Prepared and presented a demo for each transaction in the business process.
  • Used JSTL for the development of JSP logic.
  • Interacted with Business Analyst for requirements gathering.
  • Designed Java classes as per OO Design.
  • Designed and developed JSP pages and Action classes for Struts (a minimal Action sketch follows this list).
  • Developed User Interface using Struts and Tiles framework.
  • Developed data-centric applications using Python and Java.
  • Created and configured the struts-config.xml and tiles-def.xml files to manage control flow.
  • Used Hibernate framework for enterprise component interaction with database.
  • Developed stored procedures on Oracle.
  • Maintained both development and production environments using Oracle under a UNIX environment.
  • Deployed and tested on WebSphere Application Server 6.0.
  • Used web services to get information about all other accounts of a client.
  • Used Subversion for version control and source code sharing.
  • Wrote ANT scripts to build JAR, WAR and EAR files.
  • Used JUnit for unit testing.
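
To show the shape of the Struts Action classes mentioned above, here is a minimal Struts 1.x sketch in Java. The action name, the request attribute and the "success" forward are hypothetical examples; the real mappings would live in struts-config.xml.

```java
import javax.servlet.http.HttpServletRequest;
import javax.servlet.http.HttpServletResponse;
import org.apache.struts.action.Action;
import org.apache.struts.action.ActionForm;
import org.apache.struts.action.ActionForward;
import org.apache.struts.action.ActionMapping;

// Hypothetical Struts 1.x Action in the style described above: prepares
// data for the view and hands control back via a named forward.
public class ViewAccountAction extends Action {
    @Override
    public ActionForward execute(ActionMapping mapping, ActionForm form,
                                 HttpServletRequest request,
                                 HttpServletResponse response)
            throws Exception {
        // In struts-config.xml this action is mapped to a path and the
        // "success" forward points at a JSP or Tiles definition.
        request.setAttribute("message", "Account summary loaded");
        return mapping.findForward("success");
    }
}
```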

Environment: JDK 1.5, Struts 1.2, Tiles, Hibernate, JSP, JSF, Servlets, Web Services, JMS, JSTL, XML, XMLSpy, JavaScript, WebSphere 6.0, Oracle 9i, Rational Rose, Subversion, Eclipse, JUnit, ANT.

Confidential

Java Developer

Responsibilities:

  • Responsible for and active in the analysis, design, implementation and deployment phases of the full Software Development Life Cycle (SDLC) of the project.
  • Designed and developed user interface using JSP, HTML and JavaScript.
  • Developed Struts action classes, action forms and performed action mapping using Struts framework and performed data validation in form beans and action classes.
  • Extensively used Struts framework as the controller to handle subsequent client requests and invoke the model based upon user requests.
  • Defined the search criteria to pull customer records from the database, made the required changes and saved the updated records back to the database.
  • Validated the fields of user registration screen and login screen by writing JavaScript validations.
  • Developed build and deployment scripts using Apache ANT to customize WAR and EAR files.
  • Used the DAO pattern and JDBC for database access (a minimal DAO sketch follows this list).
  • Developed stored procedures and triggers using PL/SQL to calculate and update the tables that implement the business logic.
  • Designed and developed XML processing components for dynamic menus in the application.
  • Involved in post-production support and maintenance of the application.
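
As a sketch of the DAO/JDBC access pattern above, the following hypothetical CustomerDao runs a PreparedStatement against an assumed customers table. It uses Java 7 try-with-resources for brevity, although the project itself ran on Java 1.5.

```java
import java.sql.Connection;
import java.sql.PreparedStatement;
import java.sql.ResultSet;
import java.sql.SQLException;
import javax.sql.DataSource;

// Hypothetical DAO in the JDBC style described above; the table and
// column names are illustrative assumptions.
public class CustomerDao {
    private final DataSource dataSource;

    public CustomerDao(DataSource dataSource) {
        this.dataSource = dataSource;
    }

    /** Returns the customer's name, or null if no row matches. */
    public String findCustomerName(long customerId) throws SQLException {
        String sql = "SELECT name FROM customers WHERE customer_id = ?";
        try (Connection conn = dataSource.getConnection();
             PreparedStatement ps = conn.prepareStatement(sql)) {
            ps.setLong(1, customerId); // bind parameter, avoids SQL injection
            try (ResultSet rs = ps.executeQuery()) {
                return rs.next() ? rs.getString("name") : null;
            }
        }
    }
}
```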

Environment: Oracle 11g, Java 1.5, Struts, Servlets, HTML, XML, SQL, J2EE, JUnit, Tomcat 6.

Confidential

Java Developer

Responsibilities:

  • Involved in the analysis, design, implementation, and testing of the project.
  • Implemented the presentation layer with HTML, XHTML and JavaScript.
  • Developed web components using JSP, Servlets and JDBC.
  • Implemented database using SQL Server.
  • Designed tables and indexes.
  • Wrote complex SQL queries and stored procedures.
  • Involved in fixing bugs and unit testing with test cases using JUnit (a minimal test sketch follows this list).
  • Developed user and technical documentation.
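
To illustrate the JUnit test cases mentioned above, here is a minimal JUnit 4 sketch. OrderCalculator and its discount rule are hypothetical stand-ins for the actual classes under test.

```java
import static org.junit.Assert.assertEquals;
import org.junit.Test;

// Hypothetical JUnit 4 test in the style of the unit tests described above.
public class OrderCalculatorTest {

    // Minimal stand-in for the class under test.
    static class OrderCalculator {
        double totalWithDiscount(double subtotal, double discountRate) {
            return subtotal * (1.0 - discountRate);
        }
    }

    @Test
    public void appliesDiscountToSubtotal() {
        OrderCalculator calc = new OrderCalculator();
        // 10% off a 100.00 subtotal should come to 90.00
        assertEquals(90.0, calc.totalWithDiscount(100.0, 0.10), 0.0001);
    }
}
```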

Environment: Java, JSP, Servlets, JDBC, JavaScript, MySQL, JUnit, Eclipse IDE.
