Hadoop Developer Resume

Pleasanton, CA

SUMMARY:

  • 7 years of overall IT experience across a variety of industries, including 2+ years of comprehensive experience as a Hadoop Developer in the Finance, Health, and Telecommunications sectors.
  • In-depth understanding and knowledge of Hadoop architecture and its components, such as HDFS, JobTracker, TaskTracker, NameNode, DataNode, and the MapReduce programming model.
  • Hands-on experience with major components of the Hadoop ecosystem, including Hive, HBase, HBase-Hive integration, Pig, Sqoop, Flume, Avro, Oozie, and ZooKeeper, and knowledge of the MapReduce/HDFS framework.
  • Experience in analyzing data using HiveQL, Pig Latin, and custom MapReduce programs in Java; extended Hive and Pig core functionality with custom UDFs.
  • Experience in importing and exporting data using Sqoop between HDFS and relational database systems/mainframes.
  • Experience with Oozie Workflow Engine in running workflow jobs with actions that run Hadoop Map/Reduce and Pig jobs.
  • Knowledge of NoSQL databases such as HBase, Cassandra and MongoDB
  • Work experience with cloud infrastructure like Amazon Web Services (AWS).
  • Involved in business requirements gathering for successful implementation and proof-of-concept (POC) work on Hadoop and its ecosystem.
  • Experience in managing and reviewing Hadoop Log files.
  • Experience in setting up automated monitoring and escalation infrastructure for Hadoop Cluster using Ganglia and Nagios.
  • Experience in installation, configuration, management and deployment of Big Data solutions and the underlying infrastructure of Hadoop Cluster.
  • Good understanding of Data Mining and Machine Learning techniques
  • Experience in developing solutions to analyze large data sets efficiently
  • Experience in integrating various data sources such as RDBMS tables, spreadsheets, and text files using Java and shell scripting.
  • Solid background in Object-Oriented Analysis and Design (OOAD); very good with Design Patterns, UML, and Enterprise Application Integration (EAI).
  • Experience in Web Services using XML, HTML, and SOAP.
  • Extensive experience with SQL, PL/SQL, and database concepts.
  • Diverse experience utilizing Java tools in business, web, and client-server environments, including the Java Platform, J2EE, EJB, JSP, Java Servlets, JUnit, and JDBC technologies, and application servers such as WebSphere and WebLogic.
  • Familiarity with popular frameworks such as Struts, Hibernate, Spring MVC, and AJAX.
  • Ability to blend technical expertise with strong conceptual, business, and analytical skills to provide quality solutions, with result-oriented problem-solving and leadership skills.
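The custom Hive UDFs mentioned above tend to follow one pattern: a small evaluate() method applied per row. Below is a minimal plain-Java sketch of that core logic, using a hypothetical account-masking transformation as the example (a deployable UDF would extend org.apache.hadoop.hive.ql.exec.UDF and wrap the result in a Hadoop Text; the class and method names here are assumptions):

```java
// Minimal sketch of the core logic of a custom Hive UDF (hypothetical
// example). A deployable UDF would extend
// org.apache.hadoop.hive.ql.exec.UDF and wrap the result in a Text.
class MaskUdf {
    // Masks all but the last four characters of an account number,
    // the kind of column-level transformation a custom UDF provides.
    public static String evaluate(String accountNumber) {
        if (accountNumber == null || accountNumber.length() <= 4) {
            return accountNumber; // too short to mask
        }
        StringBuilder masked = new StringBuilder();
        for (int i = 0; i < accountNumber.length() - 4; i++) {
            masked.append('*');
        }
        masked.append(accountNumber.substring(accountNumber.length() - 4));
        return masked.toString();
    }
}
```

In Hive, such a class would be registered with CREATE TEMPORARY FUNCTION and applied inside a SELECT.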

TECHNICAL SKILLS:

Big Data/Hadoop Ecosystem: HDFS, MapReduce, Hive, Pig, HBase, Sqoop, Flume, Oozie and Avro

Programming Languages: C, C++, Java, SQL, PL/SQL, Linux shell scripts.

NoSQL Databases: MongoDB, Cassandra, HBase

Databases: Oracle 11g/10g, DB2, MS SQL Server, MySQL, Teradata.

Web Technologies: HTML, XML, JDBC, JSP, JavaScript, AJAX, SOAP

Frameworks: MVC, Struts 2/1, Hibernate 3, Spring 3/2.5/2.

Tools Used: Eclipse, PuTTY, Cygwin

Operating Systems: Ubuntu (Linux), Windows 95/98/2000/XP, Mac OS, Red Hat

ETL Tools: Informatica, Pentaho.

Testing: Hadoop Testing, Hive Testing, Quality Center (QC)

Monitoring and Reporting Tools: Ganglia, Nagios, custom shell scripts.

PROFESSIONAL EXPERIENCE:

Confidential - Pleasanton, CA

Hadoop Developer

Responsibilities:

  • Involved in the complete implementation lifecycle; specialized in writing MapReduce, Pig, and Hive programs.
  • Exported the analyzed data to the relational databases using Sqoop for visualization and to generate reports for the BI team.
  • Extensively used HiveQL queries to search for particular strings in Hive tables stored on HDFS.
  • Developed customized UDFs in Java to extend Hive and Pig Latin functionality.
  • Created HBase tables to store various data formats of data coming from different portfolios.
  • Managed and scheduled jobs (e.g., cleanup jobs) to remove duplicate log data files in HDFS using Oozie.
  • Used Flume extensively in gathering and moving log data files from Application Servers to a central location in Hadoop Distributed File System (HDFS).
  • Used Apache Solr for indexing and search.
  • Implemented test scripts to support test driven development and continuous integration.
  • Responsible for managing data coming from different sources.
  • Analyzed the Cassandra database and compared it with other open-source NoSQL databases to determine which best suited the current requirements.
  • Used File System check (FSCK) to check the health of files in HDFS.
  • Loaded and transformed large sets of structured, semi-structured, and unstructured data.
  • Analyzed large data sets to determine the optimal way to aggregate and report on them.
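The MapReduce and Hive work above shares a map-then-aggregate shape. The sketch below simulates that shape in plain Java as an in-memory stand-in for a real org.apache.hadoop.mapreduce job; the class name and log format are illustrative assumptions:

```java
import java.util.HashMap;
import java.util.List;
import java.util.Map;

// In-memory sketch of the map/reduce pattern used for log aggregation.
// A real job would implement Mapper and Reducer from
// org.apache.hadoop.mapreduce; names and log format are illustrative.
class LogLevelCount {
    // Map phase: emit (level, 1) for each log line, taking the first
    // token (e.g. "ERROR") as the key.
    // Reduce phase: sum the emitted counts per level.
    public static Map<String, Integer> countLevels(List<String> logLines) {
        Map<String, Integer> counts = new HashMap<>();
        for (String line : logLines) {
            String level = line.split(" ", 2)[0];
            counts.merge(level, 1, Integer::sum);
        }
        return counts;
    }
}
```

On a cluster, the same per-key summation would be distributed across mappers and reducers rather than run in a single loop.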

Environment: Hadoop 1.x, HDFS, MapReduce, Hive 0.10, Pig 0.11, Sqoop, HBase, Shell Scripting, Apache Solr, Java.

Confidential - Dublin, OH

Hadoop Developer

Responsibilities:

  • Developed solutions to ingest data into HDFS (Hadoop Distributed File System), process it within Hadoop, and emit summary results to downstream analytical systems.
  • Imported the source data into HDFS using Sqoop.
  • Used Hive to quickly produce results for the requested reports.
  • Integrated Hive with web server for auto generation of Hive queries for non-technical business user.
  • Integrated data from multiple sources (SQL Server, DB2, Teradata) into the Hadoop cluster and analyzed the data via Hive-HBase integration.
  • Developed Pig UDFs for the needed functionality, including a custom Pig loader known as the timestamp loader.
  • Used Flume to collect logs containing error messages across the cluster.
  • Used Oozie and ZooKeeper to manage job flow and cluster coordination, respectively.
  • Exported processed data from Hadoop to relational databases and external file systems using Sqoop.
  • Developed several shell scripts that act as wrappers to start Hadoop jobs and set their configuration parameters.
  • Kerberos security was implemented to safeguard the cluster.
  • Worked on a stand-alone as well as a distributed Hadoop application.
  • Tested the performance of the data sets on various NoSQL databases.
  • Worked with complex data structures of different types (structured, semi-structured), de-normalizing them for storage in Hadoop.
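The custom timestamp loader mentioned above comes down to parsing each record's timestamp field. A minimal plain-Java sketch of that parsing step, assuming a yyyy/MM/dd HH:mm:ss input format (a real loader would extend org.apache.pig.LoadFunc and emit Pig tuples; the names and format here are hypothetical):

```java
import java.time.LocalDateTime;
import java.time.format.DateTimeFormatter;

// Sketch of the timestamp-parsing step inside a custom Pig loader.
// A deployable loader would extend org.apache.pig.LoadFunc and emit
// Pig tuples; the input format below is an assumed example.
class TimestampParser {
    private static final DateTimeFormatter IN =
            DateTimeFormatter.ofPattern("yyyy/MM/dd HH:mm:ss");
    private static final DateTimeFormatter OUT =
            DateTimeFormatter.ofPattern("yyyy-MM-dd'T'HH:mm:ss");

    // Normalize a raw log timestamp into ISO-8601 form.
    public static String normalize(String raw) {
        return LocalDateTime.parse(raw, IN).format(OUT);
    }
}
```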

Environment: Hadoop, HDFS, Pig 0.10, Hive, MapReduce, Sqoop, Java, Eclipse, SQL Server, Shell Scripting.

Confidential - Plano, TX

Java/J2EE/Hadoop Developer

Responsibilities:

  • Participated in requirement gathering and converting the requirements into technical specifications.
  • Created UML diagrams like use cases, class diagrams, interaction diagrams, and activity diagrams.
  • Developed the application using Struts Framework that leverages classical Model View Controller (MVC) architecture.
  • Extensively worked on User Interface for few modules using JSPs, JavaScript and Ajax.
  • Created business logic using Servlets and POJOs and deployed it on the WebLogic server.
  • Wrote complex SQL queries and stored procedures.
  • Developed the XML Schema and Web services for the data maintenance and structures.
  • Implemented the Web Service client for the login authentication, credit reports and applicant information using Apache Axis 2 Web Service.
  • Responsible to manage data coming from different sources.
  • Developed MapReduce algorithms.
  • Gained good experience with NoSQL databases.
  • Involved in loading data from UNIX file system to HDFS.
  • Installed and configured Hive and wrote Hive UDFs.
  • Integrated Hadoop with Solr and implemented search algorithms.
  • Worked with cloud services such as Amazon Web Services (AWS).
  • Designed the logical and physical data model, generated DDL scripts, and wrote DML scripts for Oracle 10g database.
  • Used the Hibernate ORM framework with the Spring framework for data persistence and transaction management.
  • Used the Struts validation framework for form-level validation.
  • Wrote test cases in JUnit for unit testing of classes.
  • Involved in creating templates and screens in HTML and JavaScript.
  • Involved in integrating Web Services using SOAP.

Environment: Hive 0.7.1, Apache Solr 3.5, HBase 0.90.x/0.20.x, JDK 1.5, Struts 1.3, WebSphere 6.1, HTML, XML, JavaScript, JUnit 3.8, Oracle 10g, Amazon Web Services.

Confidential - McLean, VA

Java/J2EE Developer

Responsibilities:

  • Responsible for gathering business and functional requirements for the development and support of in-house and vendor developed applications
  • Gathered and analyzed information for developing, supporting, and modifying existing web applications based on prioritized business needs
  • Played a key role in the design and development of a new application using J2EE, Servlets, and Spring technologies/frameworks within a Service-Oriented Architecture (SOA)
  • Wrote Action classes, Request Processor, Business Delegate, Business Objects, Service classes and JSP pages
  • Played a key role in designing the presentation tier components by customizing the Spring framework components, which includes configuring web modules, request processors, error handling components, etc.
  • Implemented the Web Services functionality in the application to allow external applications to access data
  • Used Apache Axis as the Web Service framework for creating and deploying Web Service Clients using SOAP and WSDL
  • Worked on Spring to develop different modules to assist the product in handling different requirements
  • Developed validation using Spring's validation interface and used Spring Core and MVC to develop the applications and access data
  • Implemented Spring beans using IoC and transaction management features to handle transactions and business logic
  • Designed and developed various PL/SQL blocks and stored procedures in the DB2 database
  • Involved in writing DAO layer using Hibernate to access the database
  • Involved in deploying and testing the application using Websphere Application Server
  • Developed and implemented several test cases using JUnit framework
  • Troubleshot technical issues, conducted code reviews, and enforced best practices

Environment: Java SE 6, J2EE 6, JSP 2.1, Servlets 2.5, JavaScript, IBM WebSphere 7, DB2, HTML, XML, Spring 3, Hibernate 3, JUnit, Windows 7, Eclipse 3.5

Confidential - Seattle, WA

Java/J2EE Developer

Responsibilities:

  • Involved in various phases of the Software Development Life Cycle (SDLC), such as design, development, and unit testing.
  • Developed and deployed UI layer logics of sites using JSP, XML, JavaScript, HTML/DHTML, and Ajax.
  • CSS and JavaScript were used to build rich internet pages.
  • Followed the Agile Scrum methodology for the development process.
  • Created design specifications for front-end and back-end application development using design patterns.
  • Developed proto-type test screens in HTML and JavaScript.
  • Involved in developing JSPs for client data presentation and client-side data validation within the forms.
  • Developed the application by using the Spring MVC framework.
  • Used the Java Collections framework to transfer objects between the different layers of the application.
  • Developed data mapping to create a communication bridge between various application interfaces using XML, and XSL.
  • Used Spring IoC to inject values for the dynamic parameters.
  • Developed JUnit tests for unit-level testing.
  • Actively involved in code review and bug fixing for improving the performance.
  • Documented application for its functionality and its enhanced features.
  • Created connection through JDBC and used JDBC statements to call stored procedures.

Environment: Spring MVC, Oracle 11g, J2EE, Java, JDBC, Servlets, JSP, XML, Design Patterns, CSS, HTML, JavaScript 1.2, JUnit, Apache Tomcat, MS SQL Server 2008.

Confidential

Application Developer

Responsibilities:

  • Developed the application under the J2EE architecture; designed dynamic, browser-compatible user interfaces using JSP, Custom Tags, HTML, CSS, and JavaScript.
  • Deployed and maintained the JSP and Servlet components on WebLogic 8.0.
  • Developed the application server's persistence layer using JDBC and SQL.
  • Used JDBC to connect the web applications to databases.
  • Implemented a test-first unit testing approach using JUnit.
  • Developed and utilized J2EE Services and JMS components for messaging communication in Web Logic.
  • Configured the development environment using the WebLogic application server for developers' integration testing.

Environment: Java/J2EE, SQL, Oracle 10g, JSP 2.0, EJB, AJAX, JavaScript, WebLogic 8.0, HTML, JDBC 3.0, XML, JMS, log4j, JUnit, Servlets, MVC
