
Hadoop Developer Resume


Boston, MA

PROFESSIONAL SUMMARY:

  • 8+ years of professional IT experience, including hands-on experience with Big Data ecosystem technologies.
  • Excellent understanding of Hadoop architecture and its various components, such as HDFS, JobTracker, TaskTracker, NameNode, DataNode, and the MapReduce programming paradigm.
  • Hands on experience in installing, configuring, and using Hadoop ecosystem components like Hadoop Map Reduce, HDFS, Oozie, Falcon, Hive, Sqoop, Pig, and Flume.
  • Experience in managing and reviewing Hadoop log files.
  • Experience in analyzing data using HiveQL, Pig Latin, Accumulo, and custom MapReduce programs in Java; extended Hive and Pig core functionality by writing custom UDFs (a sample Hive UDF sketch follows this summary).
  • Experience in installation, configuration, support, and management of Cloudera's Hadoop platform, along with CDH3 and CDH4 clusters.
  • Hands on experience in application development using Java, RDBMS, and Linux shell scripting.
  • Extensively worked on database applications using DB2 UDB, Oracle, SQL*Plus, PL/SQL, SQL*Loader.
  • Solid understanding of high-volume, high-performance systems.
  • Strong experience as a senior Java Developer in Web/intranet, client/server technologies using Java, J2EE, Servlets, JSP, JSF, EJB, JDBC and SQL.
  • Thorough knowledge of the software development life cycle (SDLC), database design, RDBMS, and data warehousing; experience in writing complex SQL queries involving multiple tables with inner and outer joins.
  • Experience in optimizing queries by creating various clustered and non-clustered indexes and indexed views, and by applying data modeling concepts.
  • Experience with Oracle 9i PL/SQL programming and SQL*Plus.
  • Experience in Object-Oriented Analysis and Design (OOAD) and software development using UML methodology; good knowledge of J2EE and Core Java design patterns.
  • Excellent working knowledge of popular frameworks like Struts, Hibernate, and Spring MVC.
  • Experience in Agile Engineering practices.
  • Demonstrated leadership abilities and teamwork skills, as well as the ability to accomplish tasks under minimal direction and supervision.
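
As an illustration of the custom UDF work mentioned above, the following is a minimal sketch of a Hive UDF in Java; the class name and the masking logic are hypothetical and only show the general shape of such a function.

    import org.apache.hadoop.hive.ql.exec.UDF;
    import org.apache.hadoop.io.Text;

    // Hypothetical Hive UDF: masks all but the last four characters of a
    // string column (for example, an account number).
    public final class MaskValueUDF extends UDF {

        public Text evaluate(Text input) {
            if (input == null) {
                return null; // pass NULLs through unchanged
            }
            String value = input.toString();
            if (value.length() <= 4) {
                return new Text(value);
            }
            StringBuilder masked = new StringBuilder();
            for (int i = 0; i < value.length() - 4; i++) {
                masked.append('*');
            }
            masked.append(value.substring(value.length() - 4));
            return new Text(masked.toString());
        }
    }

Such a function would typically be packaged into a JAR and registered in Hive with ADD JAR and CREATE TEMPORARY FUNCTION before being used in HiveQL queries.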

TECHNICAL SKILLS:

Operating Systems: Windows Vista/XP/NT/2000/98/95, Windows Server 2003, Unix, Linux

Big Data Technologies: Hadoop, HDFS, Hive, MapReduce, Pig, HBase, Accumulo, Sqoop, Flume, Zookeeper, Avro, Oozie, Falcon, Maven, Ant.

Databases: SQL, PL/SQL

Java/J2EE Technologies: Core Java, Java Beans, J2EE (JSP, Servlets, EJB), Struts, Spring, JDBC, XML.

NoSQL Databases: Accumulo, HBase

Office Tools: MS Office - Excel, Word, PowerPoint, MS Visio Professional.

PROFESSIONAL EXPERIENCE:

Confidential

Boston, MA

Hadoop Developer

Responsibilities:

  • Launched and set up a Hadoop cluster on AWS, which included configuring the different Hadoop components.
  • Used Sqoop to connect to DB2 and move the pivoted data into Hive tables.
  • Managed the Hive database, including data ingestion and indexing.
  • Exported data from Avro files and indexed the documents in SequenceFile or SerDe-based formats.
  • Wrote custom UDFs as well as custom input and output formats (a custom InputFormat sketch follows this list).
  • Configured and maintained different topologies in the Storm cluster and deployed them on a regular basis.
  • Gained an understanding of the Ruby scripts used to generate YAML files.
  • Monitored the clusters using Nagios, which sends timely email alerts.
  • Involved in GUI development using JavaScript, AngularJS, and Guice.
  • Developed unit test cases using the JMockit framework and automated the scripts.
  • Worked on Oozie workflows scheduled through Falcon.
  • Maintained the cluster security settings and was involved in the creation and termination of multiple cluster environments.
  • Participated in brainstorming JAD sessions to design the GUI.
  • Worked in an Agile environment, using Jira to track story points under a Kanban model.
  • Maintained builds in Bamboo and resolved build failures.
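
A minimal sketch of the custom input format work mentioned above, using the standard Hadoop mapreduce API; the class name is hypothetical and simply shows a non-splittable, line-oriented InputFormat.

    import org.apache.hadoop.fs.Path;
    import org.apache.hadoop.io.LongWritable;
    import org.apache.hadoop.io.Text;
    import org.apache.hadoop.mapreduce.InputSplit;
    import org.apache.hadoop.mapreduce.JobContext;
    import org.apache.hadoop.mapreduce.RecordReader;
    import org.apache.hadoop.mapreduce.TaskAttemptContext;
    import org.apache.hadoop.mapreduce.lib.input.FileInputFormat;
    import org.apache.hadoop.mapreduce.lib.input.LineRecordReader;

    // Hypothetical custom InputFormat: reuses the standard LineRecordReader
    // but disables splitting so each input file is read by a single mapper.
    public class WholeFileTextInputFormat extends FileInputFormat<LongWritable, Text> {

        @Override
        protected boolean isSplitable(JobContext context, Path file) {
            return false; // keep each file in one split
        }

        @Override
        public RecordReader<LongWritable, Text> createRecordReader(
                InputSplit split, TaskAttemptContext context) {
            return new LineRecordReader();
        }
    }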

Environment: Hadoop, Hive, Accumulo, HBase, Sqoop, Oozie, Falcon, HDFS, MapReduce, Jira, Bitbucket, Maven, J2EE, Guice, AngularJS, JMockit, Lucene, Storm, Ruby, Unix, SQL, AWS (Amazon Web Services).

Confidential, CA

Hadoop Consultant

Roles and Responsibilities:

  • Developed a data pipeline using Flume, Sqoop, Pig, and Java MapReduce to ingest customer behavioral data and purchase histories into HDFS for analysis (a MapReduce sketch follows this list).
  • Developed job flows in Oozie to automate the workflow for extraction of data from warehouses and weblogs.
  • Used Pig as an ETL tool to perform transformations, event joins, bot-traffic filtering, and some pre-aggregations before storing the data onto HDFS.
  • Used Hive to analyze the partitioned and bucketed data and compute various metrics for reporting on the dashboard.
  • Loaded the aggregated data onto DB2 for reporting on the dashboard.
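
A simplified sketch of the kind of Java MapReduce job used in this pipeline is shown below; the record layout (tab-separated customer ID and purchase amount) and the class names are assumptions made only for illustration.

    import java.io.IOException;
    import org.apache.hadoop.io.DoubleWritable;
    import org.apache.hadoop.io.LongWritable;
    import org.apache.hadoop.io.Text;
    import org.apache.hadoop.mapreduce.Mapper;
    import org.apache.hadoop.mapreduce.Reducer;

    // Hypothetical pre-aggregation job: sums purchase amounts per customer
    // from tab-separated input lines of the form "customerId<TAB>amount".
    public class PurchaseAggregation {

        public static class PurchaseMapper
                extends Mapper<LongWritable, Text, Text, DoubleWritable> {
            @Override
            protected void map(LongWritable key, Text value, Context context)
                    throws IOException, InterruptedException {
                String[] fields = value.toString().split("\t");
                if (fields.length < 2) {
                    return; // skip malformed records
                }
                try {
                    double amount = Double.parseDouble(fields[1]);
                    context.write(new Text(fields[0]), new DoubleWritable(amount));
                } catch (NumberFormatException e) {
                    // ignore records with a non-numeric amount
                }
            }
        }

        public static class PurchaseReducer
                extends Reducer<Text, DoubleWritable, Text, DoubleWritable> {
            @Override
            protected void reduce(Text key, Iterable<DoubleWritable> values, Context context)
                    throws IOException, InterruptedException {
                double total = 0;
                for (DoubleWritable v : values) {
                    total += v.get();
                }
                context.write(key, new DoubleWritable(total));
            }
        }
    }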

Environment: JDK1.6, RedHat Linux, HDFS, Map-Reduce, Hive, Pig, Sqoop, Flume, Zookeeper, Oozie, DB2, HBase.

Confidential, CA

Hadoop Developer

Roles and Responsibilities:

  • Created Hive tables to handle the JSON data
  • Defined a custom SerDe
  • Designed and developed various MapReduce programs for the project
  • Designed MapReduce jobs to bring the required fields into the desired format (a sketch follows this list)
  • Exported data back to the database
  • Created design documents for the above-mentioned modules.
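
A rough sketch of the field-reformatting step described above; it assumes the Jackson JSON library is available on the classpath, and the field names (user_id, event_type) and class name are illustrative only.

    import java.io.IOException;
    import com.fasterxml.jackson.databind.JsonNode;
    import com.fasterxml.jackson.databind.ObjectMapper;
    import org.apache.hadoop.io.LongWritable;
    import org.apache.hadoop.io.NullWritable;
    import org.apache.hadoop.io.Text;
    import org.apache.hadoop.mapreduce.Mapper;

    // Hypothetical map-only step: pulls the required fields out of one JSON
    // record per line and emits them tab-separated, ready for export.
    public class JsonFieldExtractMapper
            extends Mapper<LongWritable, Text, Text, NullWritable> {

        private final ObjectMapper jsonMapper = new ObjectMapper();

        @Override
        protected void map(LongWritable key, Text value, Context context)
                throws IOException, InterruptedException {
            JsonNode record;
            try {
                record = jsonMapper.readTree(value.toString());
            } catch (IOException e) {
                return; // skip lines that are not valid JSON
            }
            String userId = record.path("user_id").asText();
            String eventType = record.path("event_type").asText();
            context.write(new Text(userId + "\t" + eventType), NullWritable.get());
        }
    }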

Environment: JDK1.6, RedHat Linux, HDFS, Map-Reduce, Hive, Sqoop, Oozie.

Confidential, Westbrook, ME

Senior Java/J2EE Developer

Roles and Responsibilities:

  • Analyzed the requirements and communicated them to both the development and testing teams.
  • Involved in the designing of the project using UML.
  • Followed J2EE Specifications in the project.
  • Designed the user interface pages in JSP.
  • Used XML and XSL for mapping the fields in database.
  • Used JavaScript for client side validations.
  • Created the stored procedures and triggers required for the project.
  • Created functions and views in Oracle.
  • Responsible for updating database tables and designing SQL queries using PL/SQL.
  • Created bean classes for communicating with the database (a DAO sketch follows this list).
  • Involved in documentation of the module and project.
  • Prepared test cases and test scenarios as per business requirements.
  • Prepared the application code for unit testing using JUnit.
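
A simplified sketch of one such database-facing bean/DAO class; the procedure name, parameters, and connection handling are hypothetical and only illustrate the JDBC CallableStatement pattern used with the Oracle stored procedures.

    import java.sql.CallableStatement;
    import java.sql.Connection;
    import java.sql.SQLException;
    import java.sql.Types;

    // Hypothetical DAO bean: calls an Oracle stored procedure that returns
    // a customer's status for a given customer id.
    public class CustomerDao {

        private final Connection connection; // obtained elsewhere, e.g. via a JNDI data source

        public CustomerDao(Connection connection) {
            this.connection = connection;
        }

        public String getCustomerStatus(long customerId) throws SQLException {
            CallableStatement stmt =
                    connection.prepareCall("{ call get_customer_status(?, ?) }");
            try {
                stmt.setLong(1, customerId);                 // IN parameter
                stmt.registerOutParameter(2, Types.VARCHAR); // OUT parameter
                stmt.execute();
                return stmt.getString(2);
            } finally {
                stmt.close();
            }
        }
    }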

Environment: Struts, Hibernate, Spring, EJB, JSP, Servlets, JMS, XML, JavaScript, UML, HTML, JNDI, CVS, Log4J, JUnit, Windows 2000, WebSphere Application Server, RAD, Rational Rose, Oracle 9i.

Confidential

Minneapolis, MN

Java/J2EE Developer

Roles and Responsibilities:

  • Responsible for gathering business requirements, writing technical specifications.
  • Created UML diagrams to capture architecture and application design.
  • Developed UI and backend applications using Struts, Hibernate, JSP, HTML, DHTML, JavaScript, and AJAX.
  • Developed the application architecture and a customized framework using Struts and Hibernate.
  • Used Hibernate to develop an application that manages the reference data, storing and retrieving data from the database tables (a sketch follows this list).
  • Used Weblogic Server as the application server.
  • Used flavors of Agile methodologies (Scrum) to improve control over the project.
  • Involved in writing ANT scripts to build and deploy the application.
  • Used PL/SQL to retrieve data using multiple SQL statements.
  • Used DB2 8.x and Oracle 8.x as the database.
  • Managed Source Control and Version Control using CVS.
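
A minimal sketch of the Hibernate-based reference data access described above; the class name and session handling are assumptions, and the mapped entity classes are assumed to be defined elsewhere. It only indicates the classic Hibernate save/load pattern used.

    import java.io.Serializable;
    import org.hibernate.Session;
    import org.hibernate.SessionFactory;
    import org.hibernate.Transaction;

    // Hypothetical reference-data helper using classic Hibernate 3-style APIs.
    public class ReferenceDataManager {

        private final SessionFactory sessionFactory;

        public ReferenceDataManager(SessionFactory sessionFactory) {
            this.sessionFactory = sessionFactory;
        }

        // Persists any mapped entity inside its own transaction.
        public void save(Object entity) {
            Session session = sessionFactory.openSession();
            Transaction tx = session.beginTransaction();
            try {
                session.save(entity);
                tx.commit();
            } catch (RuntimeException e) {
                tx.rollback();
                throw e;
            } finally {
                session.close();
            }
        }

        // Loads a mapped entity by primary key, or returns null if it does not exist.
        public Object findById(Class<?> entityClass, Serializable id) {
            Session session = sessionFactory.openSession();
            try {
                return session.get(entityClass, id);
            } finally {
                session.close();
            }
        }
    }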

Environment: HTML, JavaScript, JSP, Struts framework, EJB3, Java Beans, XML, Web services, Hibernate, Tomcat, Eclipse, WebLogic 7.0/8.1, Oracle 9.0, PL/SQL, CVS, Log4j, JUnit, Ant.

Confidential

Java Developer

Roles and Responsibilities:

  • Participated in various phases of the Software Development Life Cycle (SDLC)
  • Developed user interfaces using the JSP framework with AJAX, JavaScript, HTML, XHTML, and CSS
  • Performed the design and development of various modules using the CBD Navigator Framework
  • Deployed J2EE applications on the WebSphere Application Server by building and deploying the EAR file using an ANT script.
  • Created tables, stored procedures in SQL for data manipulation and retrieval.
  • Used technologies like JSP, JavaScript and Tiles for Presentation tier.
  • Used the CVS tool for version control of code and project documents.

Environment: JSP, Servlets, JDK, JDBC, XML, JavaScript, HTML, Spring MVC, JSF, Oracle 8i, Sun Application Server, UML, JUnit, JTest, NetBeans, Windows 2000.

Confidential

Java/J2EE Developer

Roles and Responsibilities:

  • Developed the web tier using the Struts framework, JSP, and HTML (a minimal Action class sketch follows this list)
  • Used SQL queries to interact with database.
  • Used the WebLogic application server for deployment
  • Worked in a Unix environment using command-line tools
  • Used CVS for Source Control and Version Management
  • Tested the applications using the prepared test cases
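
A minimal sketch of a Struts 1-style Action class as used in this web tier; the class name, request parameter, and forward name are hypothetical and only show the general pattern.

    import javax.servlet.http.HttpServletRequest;
    import javax.servlet.http.HttpServletResponse;
    import org.apache.struts.action.Action;
    import org.apache.struts.action.ActionForm;
    import org.apache.struts.action.ActionForward;
    import org.apache.struts.action.ActionMapping;

    // Hypothetical Struts 1 action: reads a request parameter, stores a result
    // in request scope, and forwards to a JSP defined in struts-config.xml.
    public class CustomerLookupAction extends Action {

        @Override
        public ActionForward execute(ActionMapping mapping, ActionForm form,
                                     HttpServletRequest request,
                                     HttpServletResponse response) throws Exception {
            String customerId = request.getParameter("customerId");
            // In the real application this would call a service/DAO layer.
            request.setAttribute("customerId", customerId);
            return mapping.findForward("success"); // forward name from struts-config.xml
        }
    }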

Environment: Java 1.5, J2EE (Servlets, JSP), Eclipse, SQL, WebLogic, JDBC, CVS, Windows XP.
