Sr. Hadoop Developer Resume

Jamesburg, NJ

SUMMARY:

  • 7+ years of professional experience in IT, including 3 years of experience with Hadoop, Spark, MapReduce and the Hadoop ecosystem (Pig, Hive, HBase, Flume, Sqoop and ZooKeeper).
  • Over 4 years of experience in designing and developing applications using Core Java and J2EE technologies, including Servlets, JSP, EJB, Web Services, XML, JDBC, Maven, HTML, Hibernate and the Spring Framework.
  • Well versed in installing, configuring, supporting and managing Big Data workloads and the underlying Hadoop cluster infrastructure.
  • Work experience with major components of the Hadoop ecosystem such as MapReduce, Hive, Pig, HBase, ZooKeeper, Oozie, Flume, Kafka and Sqoop.
  • Hands-on experience with major Big Data components: Apache Spark (using Scala), Spark SQL and Spark Streaming.
  • Extensive experience writing custom MapReduce programs for data processing and UDFs for both Hive and Pig in Java (an illustrative Hive UDF sketch follows this list).
  • Experienced in working with structured data using HiveQL, performing join operations, writing custom UDFs and optimizing Hive queries.
  • Strong experience in analyzing large amounts of data sets writing Pig scripts and Hive queries.
  • Experience in importing and exporting data between HDFS and relational databases using Sqoop.
  • Experienced in job workflow scheduling and monitoring tools like Oozie.
  • Experience in Apache Flume for collecting, aggregating and moving large volumes of data from various sources such as web servers and telnet sources.
  • Good understanding of NoSQL data models and hands-on experience with NoSQL databases such as HBase, Cassandra and MongoDB.
  • Worked on data replication and sharding using MongoDB.
  • Experienced in performing CRUD operations using the HBase Java client API and REST API.
  • Experienced in migrating MapReduce programs to Spark RDD transformations and actions to improve performance.
  • Experience in analyzing real time data using Spark Streaming.
  • Experience using various Hadoop Distributions (Cloudera, Hortonworks) to fully implement and leverage new Hadoop features.
  • Excellent Java development skills using Core Java, J2EE, J2SE, Servlets, JUnit, JSP, JDBC.
  • Experience in Web Services using XML, HTML and SOAP.
  • Familiarity working with popular frameworks like Hibernate, Spring and AJAX.
  • Experienced in performing CRUD operations using Hibernate, the Spring Framework and JDBC drivers.
  • Experience using integrated development environments such as Eclipse, NetBeans and MyEclipse.
  • Excellent understanding of relational databases as they pertain to application development, using several RDBMS including IBM DB2, Oracle 10g, MS SQL Server 2005/2008 and MySQL, with strong database skills including SQL, stored procedures and PL/SQL.
  • Ability to work on diverse application servers such as JBoss, Apache Tomcat and WebSphere.
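
For illustration, a minimal sketch of the kind of custom Hive UDF referenced above, written against the classic org.apache.hadoop.hive.ql.exec.UDF API; the class name and the normalization it performs are hypothetical, not drawn from a specific project:

    import org.apache.hadoop.hive.ql.exec.UDF;
    import org.apache.hadoop.io.Text;

    // Normalizes a string column: trims whitespace and lower-cases it.
    public final class NormalizeText extends UDF {
        public Text evaluate(Text input) {
            if (input == null) {
                return null;  // preserve SQL NULL semantics
            }
            return new Text(input.toString().trim().toLowerCase());
        }
    }

Once packaged in a JAR, such a UDF is registered in Hive with ADD JAR and CREATE TEMPORARY FUNCTION before use in a query.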

TECHNICAL SKILLS:

Big Data: HDFS, Spark, Spark SQL, Spark Streaming, MapReduce, HBase, Pig, Hive, Sqoop, Flume, Kafka, Oozie, ZooKeeper

Hadoop Distributions: Hortonworks, Cloudera

Java & J2EE Technologies: Core Java, Servlets, JSP, JDBC, JavaBeans

Frameworks: Hibernate, Spring, MRUnit

Build Tools: Maven

Languages: C, C++, Java, Scala, Linux shell scripting, SQL

IDEs: Eclipse, NetBeans, Oracle SQL Developer

Databases: Cassandra, MongoDB, HBase, Oracle, MySQL, DB2

Application Servers: JBoss, WebLogic, WebSphere, Apache Tomcat

Web Technologies: HTML, XML, JavaScript, CSS, AJAX, JSON, Servlets, JSP

Operating Systems: Ubuntu, Linux, UNIX, Mac OS X, Windows 8, Windows 7, Windows Server 2008/2003

PROFESSIONAL EXPERIENCE:

Confidential, Jamesburg, NJ

Sr. Hadoop Developer

Responsibilities:

  • Proactively monitored systems and services; worked on architecture design and implementation of Hadoop deployments, configuration management, backup, and disaster recovery systems and procedures.
  • Analyzed system failures, identified root causes and recommended courses of action.
  • Documented system processes and procedures for future reference.
  • Worked with systems engineering team to plan and deploy new Hadoop environments and expand existing Hadoop clusters.
  • Monitored multiple Hadoop clusters environments using Ganglia.
  • Monitored workload, job performance and capacity planning using Cloudera Manager.
  • Worked with application teams to install operating system, Hadoop updates, patches, version upgrades as required.
  • Installed and configured Flume, Hive, Pig, Sqoop and Oozie on the Hadoop cluster.
  • Used Flume to collect, aggregate and store web log data from different sources such as web servers, mobile and network devices, and pushed it to HDFS.
  • Ran MapReduce programs on the log data to transform it into a structured form for deriving user location, age group and time spent.
  • Analyzed the web log data using HiveQL to extract the number of unique visitors per day, page views, visit duration and the most purchased product on the website.
  • Developed Spark SQL scripts for handling different data sets and verified their performance against the equivalent MapReduce jobs.
  • Used Kafka to ingest data into Spark.
  • Converted MapReduce programs into Spark transformations using Spark RDDs and Scala (an illustrative sketch of the pattern follows this list).
  • Developed Spark scripts by using Scala Shell commands as per the requirement.
  • Implemented machine learning techniques in Spark using Spark MLlib.
  • Exported the analyzed data to dashboards and generated reports for the BI team.
  • Integrated Oozie with the rest of the Hadoop stack, supporting several types of Hadoop jobs out of the box (such as MapReduce, Pig, Hive and Sqoop) as well as system-specific jobs (such as Java programs and shell scripts).
  • Involved in installing and configuring Kerberos for the authentication of users and Hadoop daemons.

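As a sketch of the MapReduce-to-Spark conversion pattern mentioned above: the production work used Scala, while this equivalent uses the Spark Java API, and the HDFS paths and tab-delimited log layout are assumptions for illustration. A classic map/reduce aggregation collapses into a few chained RDD transformations:

    import org.apache.spark.SparkConf;
    import org.apache.spark.api.java.JavaPairRDD;
    import org.apache.spark.api.java.JavaRDD;
    import org.apache.spark.api.java.JavaSparkContext;
    import scala.Tuple2;

    public class PageHitCounts {
        public static void main(String[] args) {
            SparkConf conf = new SparkConf().setAppName("PageHitCounts");
            try (JavaSparkContext sc = new JavaSparkContext(conf)) {
                // replaces a full MapReduce job: map -> filter -> reduceByKey
                JavaRDD<String> logs = sc.textFile("hdfs:///data/weblogs/");
                JavaPairRDD<String, Integer> hits = logs
                        .map(line -> line.split("\t"))
                        .filter(f -> f.length > 6)              // drop malformed rows
                        .mapToPair(f -> new Tuple2<>(f[6], 1))  // key by requested page
                        .reduceByKey(Integer::sum);
                hits.saveAsTextFile("hdfs:///output/page-hits");
            }
        }
    }
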
Environment: Hadoop, HDFS, MapReduce, Hive, Pig, Spark, Scala, Sqoop, Kafka, Ganglia, HBase, Phoenix, Java, Shell Scripting, Ubuntu 13.04

Confidential, Bartlesville, OK

Responsibilities:

  • Involved in creating Hive tables, and loading and analyzing data using Hive queries.
  • Developed simple to complex MapReduce jobs using Hive and Pig.
  • Developed Oozie workflows to automate the tasks of loading data into HDFS and pre-processing it with Pig.
  • Mentored analysts and the test team in writing Hive queries.
  • Worked with application teams to install operating system and Hadoop updates, patches and Hortonworks version upgrades as required.
  • Developed multiple MapReduce jobs in Java for data cleaning and preprocessing (an illustrative mapper follows this list).
  • Worked with Linux systems and RDBMS database on a regular basis in order to ingest data using Sqoop.
  • Developed and maintained complex outbound notification applications running on custom architectures, using diverse technologies including Core Java, J2EE, SOAP, XML, JMS, JBoss and Web Services.
  • Ran Hadoop streaming jobs to process terabytes of XML-format data.
  • Loaded and transformed large sets of structured, semi-structured and unstructured data.
  • Generated data sets and loaded them into the Hadoop ecosystem.
  • Assisted in exporting analyzed data to relational databases using Sqoop.
  • Created and maintained technical documentation for launching Hadoop clusters and for executing Hive queries, Pig scripts and Sqoop jobs.

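A minimal sketch of the kind of data-cleaning MapReduce job described above; the comma-delimited schema and field count are assumptions for illustration, and the driver would set the number of reduce tasks to zero:

    import java.io.IOException;
    import org.apache.hadoop.io.LongWritable;
    import org.apache.hadoop.io.NullWritable;
    import org.apache.hadoop.io.Text;
    import org.apache.hadoop.mapreduce.Mapper;

    // Map-only cleaning step: rejects malformed records and trims each field.
    public class CleanRecordsMapper extends Mapper<LongWritable, Text, NullWritable, Text> {
        private static final int EXPECTED_FIELDS = 5;  // assumed schema width
        private final Text out = new Text();

        @Override
        protected void map(LongWritable key, Text value, Context context)
                throws IOException, InterruptedException {
            String[] fields = value.toString().split(",", -1);
            if (fields.length != EXPECTED_FIELDS) {
                context.getCounter("clean", "malformed").increment(1);  // track rejects
                return;
            }
            StringBuilder sb = new StringBuilder();
            for (int i = 0; i < fields.length; i++) {
                if (i > 0) sb.append(',');
                sb.append(fields[i].trim());
            }
            out.set(sb.toString());
            context.write(NullWritable.get(), out);
        }
    }
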
Environment: Hortonworks, Hadoop, HDFS, Oozie, Pig, Hive, MapReduce, Sqoop and Linux.

Confidential, Houston, TX

Java/Hadoop Developer

Responsibilities:

  • Used Rational tools for analyzing use cases and prepared class models, sequence models and flow diagrams.
  • Implemented Service Oriented Architecture using JMS for sending and receiving messages.
  • Implemented Spring MVC, dependency injection and aspect-oriented programming features along with Hibernate.
  • Validated all forms using Struts validation framework and implemented Tiles framework in the presentation layer.
  • Wrote parsers for parsing and building XML documents using the DOM parser.
  • Used the JavaMail API to notify agents about free quotes and to send emails to customers.
  • Worked with Hibernate core interfaces such as SessionFactory, Configuration, Transaction and Criteria.
  • Developed the application in Eclipse and deployed it on WebSphere Application Server.
  • Shared technical knowledge with other associates to ensure efficiency in the functional process.
  • Responsible for building scalable distributed data solutions using Hadoop.
  • Exported the analyzed data to relational databases using Sqoop for generating reports to BI team.
  • Worked on loading log data into HDFS through Flume.
  • Created and maintained technical documentation for executing Hive queries and Pig Scripts.
  • Worked on debugging and performance tuning of Hive & Pig jobs.
  • Used Oozie to schedule various jobs on Hadoop cluster.
  • Used Hive to analyze the partitioned and bucketed data.
  • Implemented MapReduce programs to handle semi-structured and unstructured data such as XML, JSON and Avro data files, and sequence files for log files.
  • Preprocessed the data for analysis using Pig UDFs (an illustrative UDF follows this list).
  • Wrapped the Hive jobs in shell scripts for execution.

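An illustrative Pig UDF of the kind used for preprocessing above; the class name and the cleanup it performs are hypothetical:

    import java.io.IOException;
    import org.apache.pig.EvalFunc;
    import org.apache.pig.data.Tuple;

    // EvalFunc that trims and lower-cases a chararray field.
    public class CleanField extends EvalFunc<String> {
        @Override
        public String exec(Tuple input) throws IOException {
            if (input == null || input.size() == 0 || input.get(0) == null) {
                return null;  // pass nulls through untouched
            }
            return ((String) input.get(0)).trim().toLowerCase();
        }
    }

In a Pig script the UDF is REGISTERed from its JAR and applied inside a FOREACH ... GENERATE statement.
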
Environment: Java, Servlets, JSPs, JavaScript, HTML, Eclipse, Java Web Server 2.0, Spring, Hibernate, Struts, SQL, JMS, XML and DOM parser, Hadoop, HDFS, Hive, Pig, Flume, Sqoop, Spark, MapReduce.

Confidential, Raleigh, NC

J2EE Developer

Responsibilities:

  • Involved in almost all phases of the SDLC.
  • Fully involved in requirement analysis and in documenting the requirement specification.
  • Developed a prototype based on the requirements using the Struts 2 framework as part of a Proof of Concept (POC).
  • Prepared use-case diagrams, class diagrams and sequence diagrams as part of requirement specification documentation.
  • Involved in design of the core implementation logic using MVC architecture.
  • Used Apache Maven to build and configure the application.
  • Used MongoDB to store semi structured data.
  • Configured the struts.xml file with the required action mappings for all the required services.
  • Developed implementation logic using the Struts 2 framework.
  • Developed JAX-WS web services to provide services to other systems (an illustrative endpoint sketch follows this list).
  • Developed JAX-WS clients to consume a few of the services provided by other systems.
  • Developed EJB 3.0 stateless session beans for the business tier to expose business services to the service components and the web tier.
  • Implemented Hibernate at the DAO layer by configuring the Hibernate configuration file for different databases.
  • Developed business services to utilize Hibernate service classes that connect to the database and perform the required action.
  • Developed JSP pages using Struts JSP tags and in-house tags to meet business requirements.
  • Developed JavaScript validations to validate form fields.
  • Performed unit testing for the developed code using JUnit.
  • Developed design documents for the code developed.
  • Used SVN repository for version control of the developed code.
  • Mentored junior developers in their development activities.

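A minimal sketch of a JAX-WS endpoint like those described above, using the JAX-WS support bundled with Java SE; the service name, operation and URL are hypothetical:

    import javax.jws.WebMethod;
    import javax.jws.WebService;
    import javax.xml.ws.Endpoint;

    // Annotation-driven SOAP service; the WSDL is generated automatically.
    @WebService
    public class QuoteService {
        @WebMethod
        public double getQuote(String productCode) {
            return 42.0;  // placeholder; real logic would call the pricing back end
        }

        public static void main(String[] args) {
            // publishes the service (and WSDL) at http://localhost:8080/quote?wsdl
            Endpoint.publish("http://localhost:8080/quote", new QuoteService());
        }
    }
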
Environment: Java, JavaScript, J2EE, MVC, Struts 2, JSP, JUnit, SOAP, Jenkins, Hibernate, MongoDB, Apache Tomcat, and XML.

Confidential

Java/J2EE Developer

Responsibilities:

  • Involved in complete requirement analysis, design, coding and testing phases of project.
  • Designed and developed the front end using JSP, Struts (Tiles), XML, JavaScript and HTML.
  • Generated XML schemas and used XMLBeans to parse XML files.
  • Designed Class and Sequence diagrams with UML and Data flow diagrams.
  • Extensively used Hibernate in data access layer to access and update information in the database.
  • Developed, implemented and maintained an asynchronous AJAX based rich client for improved customer experience using XML data and XSLT templates.
  • Extensive use of Struts framework for controller components and view components.
  • Implemented EJB in the service layer.
  • Developed business logic/back-end code using Java and ADF Business Components.
  • Implemented the MVC architecture using Struts framework to get the free quote.
  • Experienced with implementing navigation using Spring MVC.
  • Implemented message-driven beans to consume messages from queues and forward them to the support team using MSend commands (an illustrative bean follows this list).
  • Used Hibernate for object-relational mapping and persistence.
  • Worked with ETL tools for sourcing data.
  • Worked on Autosys for batch scheduling.
  • Performed Unit testing and rigorous integration testing of the whole application.
  • Performed query optimization to achieve faster indexing and making the system more scalable.
  • Used SVN for version control.

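An illustrative EJB 3-style message-driven bean of the kind described above; the queue name is an assumption (activation-config property names also vary by application server), and the MSend forwarding is left as a comment:

    import javax.ejb.ActivationConfigProperty;
    import javax.ejb.MessageDriven;
    import javax.jms.JMSException;
    import javax.jms.Message;
    import javax.jms.MessageListener;
    import javax.jms.TextMessage;

    // Consumes notifications from a queue and hands them to the support workflow.
    @MessageDriven(activationConfig = {
        @ActivationConfigProperty(propertyName = "destinationType",
                                  propertyValue = "javax.jms.Queue"),
        @ActivationConfigProperty(propertyName = "destination",
                                  propertyValue = "queue/supportNotifications")  // hypothetical
    })
    public class NotificationBean implements MessageListener {
        @Override
        public void onMessage(Message message) {
            try {
                if (message instanceof TextMessage) {
                    String body = ((TextMessage) message).getText();
                    // forward to the support team here (e.g. via an MSend command)
                    System.out.println("Received notification: " + body);
                }
            } catch (JMSException e) {
                throw new RuntimeException(e);  // triggers container redelivery
            }
        }
    }
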
Environment: Java, Servlets, JSPs, JavaScript, HTML, MySQL 2.1, Java Web Server 2.0, Spring, Hibernate, Struts, SQL, PL/SQL, CSS, JMS, Oracle, Autosys, XML and DOM parser.

Confidential

Java Developer

Responsibilities:

  • Actively participated in requirements gathering, analysis, design, and testing phases.
  • Designed use case diagrams, class diagrams, and sequence diagrams as a part of Design Phase.
  • Developed the entire application implementing MVC architecture, integrating JSF with the Hibernate and Spring frameworks.
  • Developed Enterprise JavaBeans (stateless session beans) to handle transactions such as online funds transfers and bill payments to service providers.
  • Implemented Service-Oriented Architecture (SOA) using JMS for sending and receiving messages while creating web services (an illustrative producer sketch follows this list).
  • Developed XML documents and generated XSL files for Payment Transaction and Reserve Transaction systems.
  • Developed SQL queries and stored procedures.
  • Developed Web Services for data transfer from client to server and vice versa using Apache Axis, SOAP and WSDL.
  • Used JUnit Framework for the unit testing of all the java classes.
  • Implemented various J2EE Design patterns like Singleton, Service Locator, DAO, and SOA.
  • Worked on AJAX to develop an interactive Web Application and JavaScript for Data Validations.

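A minimal JMS 1.1-style producer sketch for the SOA messaging described above; the JNDI names are assumptions that depend on the application server configuration:

    import javax.jms.Connection;
    import javax.jms.ConnectionFactory;
    import javax.jms.MessageProducer;
    import javax.jms.Queue;
    import javax.jms.Session;
    import javax.jms.TextMessage;
    import javax.naming.InitialContext;

    // Sends an XML payment-transaction payload to a queue.
    public class PaymentNotifier {
        public void send(String xmlPayload) throws Exception {
            InitialContext ctx = new InitialContext();
            ConnectionFactory cf = (ConnectionFactory) ctx.lookup("jms/ConnectionFactory");
            Queue queue = (Queue) ctx.lookup("jms/PaymentQueue");  // hypothetical names
            Connection conn = cf.createConnection();
            try {
                Session session = conn.createSession(false, Session.AUTO_ACKNOWLEDGE);
                MessageProducer producer = session.createProducer(queue);
                TextMessage msg = session.createTextMessage(xmlPayload);
                producer.send(msg);
            } finally {
                conn.close();  // closes the session and producer too
            }
        }
    }
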
Environment: J2EE, JDBC, SQL, Java, Servlets, JSP, Struts, Hibernate, Web Services, SOAP, WSDL, Design Patterns, MVC, HTML, JavaScript, WebLogic, XML, JUnit, WebSphere, MyEclipse.
