Hadoop Developer Resume

Raleigh, NC

SUMMARY:

  • 9 years of experience in enterprise/distributed application development across diverse industries, including hands-on experience with Big Data and J2EE technologies.
  • 3 years of comprehensive experience as a Big Data Developer.
  • Experience in using HDFS, MapReduce, Pig, Hive, Sqoop, Oozie, Spark, Cassandra and Cloudera (CDH).
  • Experience in importing and exporting data using Sqoop from Relational Database Systems to HDFS and vice versa.
  • Extending Hive and Pig core functionality by writing custom UDFs.
  • Experience in analyzing data using HiveQL, Pig Latin, and custom MapReduce programs in Java.
  • Familiar with Java virtual machine (JVM) and multi-threaded processing.
  • Worked on the NoSQL database Cassandra, with good knowledge of HBase and MongoDB.
  • Worked on job workflow scheduling with Oozie and cluster coordination with ZooKeeper.
  • Knowledge of Kafka and Spark Streaming.
  • Familiar with Data Warehousing and ETL tools such as Informatica and Pentaho.
  • Experience in designing, developing and implementing connectivity products that allow efficient exchange of data between our core database engine and the Hadoop ecosystem.
  • Experience as a Technical Lead/Senior Java Developer in developing Web/intranet, client/server applications using Java, J2EE, Servlets, JSP, JSF, Struts, Spring, JDBC, Hibernate and SQL.
  • Experience with Agile, SDLC, Atlassian stack (JIRA, Confluence, Stash, GIT) and Jenkins/Nexus.
  • Worked on SOAP and RESTful Web Services.
  • Worked on IBM MQ/JMS for messaging between multiple apps and EMC's Documentum upgrade on client applications.
  • Experience in developing web applications using JSP, jQuery and DOJO components.
  • Excellent knowledge and experience in working with Tomcat 8 Web Server, JBoss, WebLogic 8.1 and WebSphere 7.0 application servers.
  • Expertise in code quality & coverage by using PMD, SONAR, Fortify and Cobertura plugin integration with Eclipse & RAD.
  • Techno-functional responsibilities include interfacing with users, identifying functional and technical gaps, estimates, designing custom solutions, development, leading developers, producing documentation, and production support.
  • Excellent interpersonal and communication skills, creative, research-minded, technically competent and result-oriented with problem solving and leadership skills.
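The custom MapReduce work mentioned above follows the classic map/reduce pattern. Below is a minimal, hypothetical sketch of that pattern in plain Java: a word count where the map phase emits (word, 1) pairs and the reduce phase sums counts per key. The Hadoop `Mapper`/`Reducer` API and job configuration are deliberately omitted so the sketch stays self-contained; the class and method names here are illustrative, not from an actual project.

```java
import java.util.*;
import java.util.stream.*;

// Plain-Java simulation of the MapReduce word-count pattern (no Hadoop runtime):
// map() emits (word, 1) pairs; reduce() groups by key and sums the values.
public class WordCountSketch {

    // Map phase: split one input line into (word, 1) pairs.
    static List<Map.Entry<String, Integer>> map(String line) {
        return Arrays.stream(line.toLowerCase().split("\\s+"))
                .filter(w -> !w.isEmpty())
                .map(w -> Map.entry(w, 1))
                .collect(Collectors.toList());
    }

    // Reduce phase: group the emitted pairs by word and sum their counts.
    static Map<String, Integer> reduce(List<Map.Entry<String, Integer>> pairs) {
        return pairs.stream().collect(Collectors.groupingBy(
                Map.Entry::getKey, Collectors.summingInt(Map.Entry::getValue)));
    }

    public static void main(String[] args) {
        List<String> lines = List.of("big data big wins", "data pipelines");
        List<Map.Entry<String, Integer>> mapped = new ArrayList<>();
        lines.forEach(line -> mapped.addAll(map(line)));
        Map<String, Integer> counts = reduce(mapped);
        System.out.println(counts.get("big"));  // 2
        System.out.println(counts.get("data")); // 2
    }
}
```

In a real Hadoop job the same two functions would live in `Mapper` and `Reducer` subclasses, with the framework handling the shuffle between them.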

TECHNICAL SKILLS:

Hadoop/Big Data Tools: HDFS, MapReduce, Pig, Hive, HBase, Spark, Sqoop, Flume, Cassandra, Oozie, ZooKeeper.

Java & J2EE Technologies: Core Java, Servlets, JSP, JDBC, JNDI, Java Beans, SOAP Web Services.

IDEs: Spring Tool Suite, IntelliJ IDEA, Eclipse, NetBeans and RAD.

Java/J2EE Frameworks: Struts, Hibernate, Spring.

Programming Languages: Java, Scala, Ant scripts, Linux shell scripts.

Databases: Cassandra, Oracle 11g/10g/9i, DB2, MS-SQL Server.

Web Servers: Apache Tomcat, Weblogic, WebSphere and JBoss.

Web Technologies: HTML, XML, JavaScript, AJAX, jQuery, DOJO.

Atlassian Tools: JIRA, Confluence, Stash, GIT and Jenkins/Nexus.

ETL Tools: Informatica, Pentaho

Testing: SOA Test, SOAPUI and JUNIT.

Build Tools: Maven and ANT.

CI/CD Tools: Hudson, Jenkins, IBM uDeploy

PROFESSIONAL EXPERIENCE:

Confidential, Raleigh NC

Hadoop Developer

Environment: Hadoop, MapReduce, HDFS, Hive, Pig, Java, MySQL, Cassandra, Cloudera Manager, Sqoop, Oozie, ZooKeeper and Control-M.

Responsibilities:

  • Responsible for building scalable distributed data solutions using Hadoop.
  • Worked on Apache Spark with Java (DataStax API) to create RDDs from Cassandra for data aggregation.
  • Handled importing of data from various data sources, performed transformations using Pig, Hive and MapReduce, loaded data into HDFS, and extracted data from Teradata into HDFS using Sqoop.
  • Integrated Hive with existing applications; involved in preparing data mappings and exposing data to clients with Hue.
  • Analyzed the data for business use cases by performing Hive queries and running Pig scripts.
  • Created custom UDFs using Pig.
  • Worked on Oozie workflow engine to run multiple Hive and Pig jobs.
  • Developed Hive queries to process the data and generate the reports for end users.
  • Worked with Data Engineering Team, Analysts and Business during application life-cycle.
  • Worked on Cloudera CDH upgrade from 5.4.9 to 5.7.2.
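The Oozie work described above chains Hive and Pig jobs into a single workflow. The fragment below is a minimal, hypothetical `workflow.xml` sketch of that setup: a Hive action followed by a Pig action, with a kill node for failures. The workflow name, script names and the `${jobTracker}`/`${nameNode}` properties are placeholders, not from an actual deployment.

```
<!-- Hypothetical workflow.xml: a Hive action chained into a Pig action. -->
<workflow-app name="daily-report-wf" xmlns="uri:oozie:workflow:0.5">
    <start to="hive-aggregate"/>

    <action name="hive-aggregate">
        <hive xmlns="uri:oozie:hive-action:0.2">
            <job-tracker>${jobTracker}</job-tracker>
            <name-node>${nameNode}</name-node>
            <script>aggregate.hql</script>
        </hive>
        <ok to="pig-cleanse"/>
        <error to="fail"/>
    </action>

    <action name="pig-cleanse">
        <pig>
            <job-tracker>${jobTracker}</job-tracker>
            <name-node>${nameNode}</name-node>
            <script>cleanse.pig</script>
        </pig>
        <ok to="end"/>
        <error to="fail"/>
    </action>

    <kill name="fail">
        <message>Workflow failed: [${wf:errorMessage(wf:lastErrorNode())}]</message>
    </kill>
    <end name="end"/>
</workflow-app>
```

Each action's `ok`/`error` transitions give Oozie the dependency graph; a coordinator (or, as above, Control-M) can then trigger the workflow on a schedule.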

Confidential

Senior Java Developer

Environment: JDK 1.7, JSP, Struts, Spring, SOAP/RESTful Web Services, AngularJS, WebSphere, Tomcat, DOJO.

Responsibilities:

  • Analysis of requirements for new projects for all products. This includes working with Analysts and doing Impact Analysis.
  • Involved in technical document preparation that includes preparing Data Flow/Class Diagrams using Microsoft Visio.
  • Development of code as per the business requirements using relevant design patterns.
  • Developed web pages using DOJO components for alert, confirm, popup and detailed tabular data with sorting and pagination.
  • Involved in changing the Electra framework to integrate new changes for next generation applications using latest technologies like Spring AOP for Authorization and logging.
  • Involved in generating SOAP web service client code, preparing unit tests, and integrating the web service calls into the web application code; tested web services with SOAtest and SoapUI to troubleshoot issues.
  • Performing Unit testing and system integration setup for enterprise releases.
  • Involved in code reviews, coordinating with the Secure Code Review team and providing fixes for SCR items.
  • Providing fixes to the issues identified in testing phase.
  • Coordinated with the Web Services team on code/XML changes for web services, the Testing team on fixing bugs identified during testing, the Application Monitoring team on configuring new web service methods in the APM site, and the Release Engineering team on packaging developed components for deployment to the production environment.
  • Providing the support for post-production deployment for the warranty period.

Confidential, Houston, TX

Senior Java Developer

Environment: JDK 1.5, JSP, Struts, iBATIS, Oracle, SOAP Web Services, BEA WebLogic.

Responsibilities:

  • Analysis of requirements for new enterprise releases and impact analysis.
  • Involved in technical document preparation.
  • Development of code as per the business requirements.
  • Involved in generating SOAP web service client code, preparing unit tests, and integrating the web service calls into the web application code; tested web services with SOAtest and SoapUI to troubleshoot issues.
  • Unit testing and system integration setup for enterprise releases.
  • Involved in code reviews, coordinating with the Secure Code Review team and providing fixes for SCR items.
  • Providing fixes to the issues identified in testing phase.
  • Packaging of the developed component to be deployed in the production environment.
  • Providing the support for post-production deployment.

Confidential, Raleigh, NC

Java Developer

Environment: JDK 1.5, JSP, Struts, Hibernate, Oracle, IBM WebSphere.

Responsibilities:

  • Responsible and active in the analysis, design, implementation and deployment of full Software Development Lifecycle (SDLC) of the project.
  • Designed and developed user interface using JSP, HTML and JavaScript/jQuery.
  • Developed Struts action classes, action forms and performed action mapping using Struts framework and performed data validation in form beans and action classes.
  • Extensively used Struts framework as the controller to handle subsequent client requests and invoke the model based upon user requests.
  • Defined the search criteria and pulled out the record of the Banner Ads Reports from Google’s DoubleClick Ad Manager.
  • Used DAO and JDBC for database access.
  • Developed stored procedures and triggers using PL/SQL in order to calculate and update the tables to implement business logic.
  • Involved in post-production support and maintenance of the application.
