
Hadoop Developer Resume


NYC

SUMMARY:

  • Over 7 years of IT experience, including close to 3 years of work experience in Big Data and Hadoop ecosystem technologies.
  • Hands-on experience with Hadoop ecosystem components such as MapReduce, Hive, Pig, Sqoop, Flume, and HBase, as well as DB2.
  • Experience in writing MapReduce programs using Apache Hadoop for working with Big Data (a minimal sketch appears after this summary).
  • Expertise in all stages of the Software Development Life Cycle (SDLC): requirements analysis, design specifications, coding, debugging, testing (test scenarios, test plans, test cases, test-case execution, unit test plans and unit testing, integration and system testing), documentation, and maintenance of application programs.
  • Experience working with Hadoop clusters using the Cloudera (CDH3) distribution.
  • Experience importing and exporting data using Sqoop between HDFS and relational database systems (Teradata) and mainframes.
  • Experience streaming data into HDFS using Flume.
  • Hands-on experience using the MapReduce programming model for batch processing of data stored in HDFS.
  • Expertise in writing ETL jobs for analyzing data using Pig.
  • Experience in NoSQL column-oriented databases like HBase and their integration with Hadoop clusters.
  • Participated in all phases of the SDLC and assured zero-defect delivery.
  • Responsible for creating all documentation, including estimates, test plans, test results, issue resolutions, and approach documents.
  • Good domain knowledge in Health Care Insurance.
  • Experience in J2EE web technologies like Struts 1.2/1.3/2.0, Spring, Hibernate, Servlets, JSP, JSTL, JavaScript, CSS, AJAX, HTML5, Web 2.0, GWT, Flex, ActionScript, PHP, and the LAMP stack.
  • Experience in data analysis; proficient in writing SQL queries, stored procedures, functions, triggers, and PL/SQL using Oracle, MySQL, and SQL Server.
  • Experience in developing applications using SOAP & REST Web Services.
  • Expertise in developing Desktop Applications using Java, Swing, Applets, VC++, Windows MFC, wxWidgets.
  • Working as an onsite coordinator in an onsite/offshore model, managing a team of 8 offshore members.
  • IT Point of contact/Subject Matter Expert for both Offshore and Onsite.
  • Profound understanding of the Health Insurance domain with good working knowledge of Enrollment, Benefits, and Production Support.
  • Coordinated the successful simultaneous development of several projects with defect free delivery.
  • Excellent communication, teamwork, and client interaction skills complementing technical abilities.
  • Continuous interaction with the business, with hands-on experience translating business requirements into technical requirements and test cases, and supporting UAT.
  • Coordinated with offshore developers to discuss technical requirements and delivery of the projects to meet client’s expectations.
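The MapReduce work summarized above refers to standard Hadoop Java jobs. As an illustration only, a minimal word-count-style job, with hypothetical class names and paths not drawn from any project listed here, might look like this:

import java.io.IOException;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.io.IntWritable;
import org.apache.hadoop.io.LongWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.Job;
import org.apache.hadoop.mapreduce.Mapper;
import org.apache.hadoop.mapreduce.Reducer;
import org.apache.hadoop.mapreduce.lib.input.FileInputFormat;
import org.apache.hadoop.mapreduce.lib.output.FileOutputFormat;

public class WordCount {

    // Mapper: emits (word, 1) for every token in the input split.
    public static class TokenMapper extends Mapper<LongWritable, Text, Text, IntWritable> {
        private static final IntWritable ONE = new IntWritable(1);
        private final Text word = new Text();

        @Override
        protected void map(LongWritable key, Text value, Context context)
                throws IOException, InterruptedException {
            for (String token : value.toString().split("\\s+")) {
                if (!token.isEmpty()) {
                    word.set(token);
                    context.write(word, ONE);
                }
            }
        }
    }

    // Reducer: sums the counts emitted for each word.
    public static class SumReducer extends Reducer<Text, IntWritable, Text, IntWritable> {
        @Override
        protected void reduce(Text key, Iterable<IntWritable> values, Context context)
                throws IOException, InterruptedException {
            int sum = 0;
            for (IntWritable v : values) {
                sum += v.get();
            }
            context.write(key, new IntWritable(sum));
        }
    }

    public static void main(String[] args) throws Exception {
        Job job = Job.getInstance(new Configuration(), "word count");
        job.setJarByClass(WordCount.class);
        job.setMapperClass(TokenMapper.class);
        job.setCombinerClass(SumReducer.class);
        job.setReducerClass(SumReducer.class);
        job.setOutputKeyClass(Text.class);
        job.setOutputValueClass(IntWritable.class);
        FileInputFormat.addInputPath(job, new Path(args[0]));   // HDFS input directory
        FileOutputFormat.setOutputPath(job, new Path(args[1])); // HDFS output directory
        System.exit(job.waitForCompletion(true) ? 0 : 1);
    }
}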

TECHNICAL SKILLS:

Big data: Hadoop, MapReduce, Hive, HBase, Pig, Flume, Avro, HDFS, Cloudera, Solr, Spark, Spark Streaming, Sqoop, Cassandra, NoSQL, AWS, EC2, S3, SNS, SQS, CloudWatch.

Java/J2EE Technologies: Java, JSP & Servlets, EJBs, JMS, JAXP, XMLBeans, JDBC, JavaMail, SAX, DOM, Java Security, XML-HTTP

Methodologies: UML, Design Patterns like Application Controller, Singleton, Chain of Responsibility, GRASP (Creator, Expert, Controller, High Cohesion, Low Coupling), Command, Builder, Context

Application Server: WebSphere, WebLogic Server, Apache Tomcat.

Databases: Oracle, SQL Server

Operating System: Windows Vista/NT/2000/XP, Windows 7/8

PROFESSIONAL EXPERIENCE:

Confidential, NYC

Hadoop Developer

Responsibilities:

  • Involved in the design, development, testing, and deployment of the application in a Hadoop environment.
  • Created Pig and Hive scripts for performing validations and applying rules (a sample validation query run over Hive JDBC appears after this list).
  • Used Apache Sqoop to import and export data between HDFS and Oracle.
  • Designed and scheduled workflows using Oozie.
  • Created shell scripts to load the collections data into Oracle.
  • Created reports in Tableau against Hive data.
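The validation scripts themselves were written in Pig and HiveQL; purely as an illustration, a minimal Java sketch that runs a hypothetical validation query against HiveServer2 through the standard Hive JDBC driver (the host, table, and rule are invented for the example) might look like:

import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.ResultSet;
import java.sql.Statement;

public class HiveValidation {
    public static void main(String[] args) throws Exception {
        // Load the HiveServer2 JDBC driver; connection details are hypothetical.
        Class.forName("org.apache.hive.jdbc.HiveDriver");
        try (Connection conn = DriverManager.getConnection(
                "jdbc:hive2://hive-host:10000/default", "user", "");
             Statement stmt = conn.createStatement()) {

            // Example validation rule: count rows missing an account id.
            ResultSet rs = stmt.executeQuery(
                "SELECT COUNT(*) FROM collections WHERE account_id IS NULL");
            if (rs.next()) {
                System.out.println("Rows failing validation: " + rs.getLong(1));
            }
        }
    }
}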

Technology & Tools: Hadoop, HDFS, MapReduce, Pig, Hive, Sqoop, Oozie, Java, Oracle, Linux, Eclipse, JDBC, Tableau.

Confidential, Las Vegas, Nevada

Hadoop Architect

Responsibilities:

  • Designed and developed MapReduce/YARN programs.
  • Implemented Kerberos security to safeguard the cluster.
  • Performed data analysis through Pig, MapReduce, and Hive.
  • Performed cluster coordination services through ZooKeeper.
  • Developed the POS analytics component.
  • Imported and exported data using Sqoop between HDFS and relational databases (Teradata, DB2).
  • Implemented the Flume, Spark, and Spark Streaming frameworks for real-time data processing.
  • Hands-on experience installing, configuring, and using ecosystem components like Hadoop MapReduce, HDFS, HBase, Pig, Flume, Hive, and Sqoop.
  • Developed a social media component that stores data in Cassandra.
  • Developed analytical components using Scala, Spark, and Spark Streaming (a minimal streaming sketch follows this list).
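The streaming components in this role were written in Scala; for consistency with the other examples here, a minimal Spark Streaming sketch is shown in Java instead, with an invented socket source and filter rule:

import org.apache.spark.SparkConf;
import org.apache.spark.streaming.Durations;
import org.apache.spark.streaming.api.java.JavaDStream;
import org.apache.spark.streaming.api.java.JavaReceiverInputDStream;
import org.apache.spark.streaming.api.java.JavaStreamingContext;

public class StreamCounter {
    public static void main(String[] args) throws InterruptedException {
        SparkConf conf = new SparkConf().setAppName("StreamCounter").setMaster("local[2]");
        // Micro-batches every 5 seconds.
        JavaStreamingContext jssc = new JavaStreamingContext(conf, Durations.seconds(5));

        // Hypothetical source: a socket feed of events, one per line.
        JavaReceiverInputDStream<String> lines = jssc.socketTextStream("localhost", 9999);

        // Keep only point-of-sale events and count them per batch.
        JavaDStream<String> posEvents = lines.filter(line -> line.contains("POS"));
        posEvents.count().print();

        jssc.start();
        jssc.awaitTermination();
    }
}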

Technology & Tools: Java, Scala, Python, J2EE, Hadoop, Spark, Cassandra, HBase, Hive, Pig, Sqoop, MySQL, Teradata, DB2, Pentaho.

Confidential, Madison, WI

Hadoop Developer

Responsibilities:

  • Developed solutions to process data into HDFS, analyzed the data using MapReduce, Pig, and Hive, and produced summary results from Hadoop for downstream systems.
  • Used Sqoop extensively to import data from various systems/sources (such as MySQL) into HDFS.
  • Wrote Hive queries to perform data analysis on HBase through its storage handler in order to meet the business requirements.
  • Created components like Hive UDFs to supply functionality missing from Hive for analytics (a minimal UDF sketch follows this list).
  • Hands-on experience with NoSQL databases like HBase and Cassandra through a POC (proof of concept) for storing URLs and images.
  • Developed scripts and batch jobs to schedule a bundle (a group of coordinators) of Hadoop programs using Oozie.
  • Exported the analyzed data to relational databases using Sqoop for visualization and to generate reports for the BI team.
  • Worked with cloud services like Amazon Web Services (AWS).
  • Involved in ETL, data integration, and migration.
  • Used different file formats, including text files, sequence files, and Avro.
  • Provided cluster coordination services through ZooKeeper.
  • Assisted in creating and maintaining technical documentation for launching Hadoop clusters and for executing Hive queries and Pig scripts.
  • Assisted in cluster maintenance, cluster monitoring, adding and removing cluster nodes, and troubleshooting.
  • Installed and configured Hadoop, MapReduce, and HDFS; developed multiple MapReduce jobs in Java for data cleaning and preprocessing.
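Hive UDFs of this kind are plain Java classes. A minimal sketch, assuming a hypothetical cleanup rule not taken from the actual project, using the classic org.apache.hadoop.hive.ql.exec.UDF base class:

import org.apache.hadoop.hive.ql.exec.UDF;
import org.apache.hadoop.io.Text;

// Hypothetical UDF: normalizes a string column to trimmed lower case,
// standing in for cleanup logic not covered by Hive's built-in functions.
public final class NormalizeText extends UDF {
    public Text evaluate(Text input) {
        if (input == null) {
            return null; // Hive passes NULLs through.
        }
        return new Text(input.toString().trim().toLowerCase());
    }
}

Once packaged into a JAR, such a UDF would be registered in a Hive session with ADD JAR and exposed through CREATE TEMPORARY FUNCTION before being used in queries.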

Technology & Tools: MapReduce, HDFS, Sqoop, Flume, Linux, Oozie, Hadoop, Pig, Hive, HBase, Cassandra, Hadoop Cluster, Amazon Web Services.

Confidential

Java Developer

Responsibilities:

  • Used a developer IDE and was involved in writing stored procedures and SQL queries.
  • Involved in implementation, code reviews, and development of modules using Java, JSP, and JavaScript.
  • Wrote stateless session EJBs as part of the EJB implementation to interact with the database, effectively updating, retrieving, inserting, and deleting values, and building and optimizing J2EE applications.
  • Accessed stored procedures and functions using JDBC CallableStatements (a minimal sketch follows this list).
  • Worked in the Supply Chain Management domain.
  • Worked on a web-based reporting system with HTML, JavaScript, and JSP.
  • Deployed the application on Oracle Application Server (OAS).
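As an illustration of the CallableStatement usage, a minimal sketch against an Oracle database (the connection details and the get_order_status procedure are invented for the example):

import java.sql.CallableStatement;
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.Types;

public class StoredProcCall {
    public static void main(String[] args) throws Exception {
        // Hypothetical Oracle connection; real credentials would come from config.
        try (Connection conn = DriverManager.getConnection(
                "jdbc:oracle:thin:@db-host:1521:ORCL", "user", "password")) {

            // Call a procedure that takes an order id and returns its status.
            try (CallableStatement cs = conn.prepareCall("{call get_order_status(?, ?)}")) {
                cs.setLong(1, 12345L);                     // IN parameter
                cs.registerOutParameter(2, Types.VARCHAR); // OUT parameter
                cs.execute();
                System.out.println("Status: " + cs.getString(2));
            }
        }
    }
}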

Technology & Tools: J2EE, JSP, JDBC, EJB, Weblogic, Oracle, SQL, CVS, Oracle Application Server

Confidential, Englewood, CO

Sr.J2EE Developer/Lead

Responsibilities:

  • Used Scrum (a rapid application development methodology) to emphasize face-to-face communication and to make sure each iteration passed through the full SDLC.
  • Used WebLogic as the application server for deploying the application.
  • Used JAX-WS web services (the EAI layer) for interacting with mainframe-based systems like CSG.
  • Worked on consuming/invoking SOA web services.
  • Responsible for maintaining Partners and Retailers to create customer orders and provide support.
  • The user interface was developed using JSPs, HTML, DHTML, and CSS.
  • Front-end validations were done using Struts and JavaScript.
  • Used UI frameworks like Struts for implementing the MVC2 architecture.
  • Worked on a framework designed specifically for Dish (EJBs), which is called by front-end layers to communicate with the EAI (webMethods) layer and interact with the database.
  • Used SQL queries to fetch and write the required information to database tables; used Oracle 9i as the database to persist and retrieve information.
  • Used Maven 2 for building the application and completed testing by deploying on the application server.
  • Implemented Java components following OOAD principles.
  • Used SVN for version control and Mercury Quality Center as the defect tracking system.
  • Implemented Ajax for a couple of navigation screen flows requiring asynchronous calls.
  • Wrote unit test cases for unit-level testing using JUnit (a sample test follows this list).
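A representative JUnit 4 test in the style used here; OrderValidator is a hypothetical stand-in for the actual components under test, included inline so the sketch compiles on its own:

import static org.junit.Assert.assertFalse;
import static org.junit.Assert.assertTrue;

import org.junit.Test;

public class OrderValidatorTest {

    // Hypothetical component under test, defined inline for self-containment.
    static class OrderValidator {
        boolean isValid(String retailerId) {
            return retailerId != null && retailerId.startsWith("RET-");
        }
    }

    private final OrderValidator validator = new OrderValidator();

    @Test
    public void acceptsWellFormedRetailerId() {
        assertTrue(validator.isValid("RET-1001"));
    }

    @Test
    public void rejectsEmptyRetailerId() {
        assertFalse(validator.isValid(""));
    }
}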

Technology & Tools: Java, SOA, Web Services, JUnit 4.5, Struts, SQL Developer, XSD, WSDL, XML, XMLSpy, Web Services (JAX-WS), EJB, JSP, Servlets, Ajax, Eclipse, SVN, Mercury Quality Center, Oracle 9i, SQL, UNIX, Log4j and WebLogic 10.1

Confidential

Responsibilities:

  • Interacted with the client to identify requirements and document them.
  • Created database tables and JSP pages to load client-provided Excel sheet data into the database tables.
  • Created JSP pages to process data from the database tables, create fixed-format reports, and export them in MS Word format.
  • Uploaded data from MS Excel sheets to database tables using jxl.jar and the Java API (a minimal upload sketch follows this list).
  • Wrote and executed unit test cases and completed integration.
  • Prepared SQL scripts to generate ad hoc reports and exported them to Excel sheets using TOAD.
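A minimal sketch of the Excel upload using the jxl (JExcelApi) library; the file name, table, and column layout are invented for the example:

import java.io.File;
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.PreparedStatement;

import jxl.Sheet;
import jxl.Workbook;

public class ExcelUploader {
    public static void main(String[] args) throws Exception {
        // Hypothetical spreadsheet and connection details.
        Workbook workbook = Workbook.getWorkbook(new File("clients.xls"));
        try (Connection conn = DriverManager.getConnection(
                "jdbc:oracle:thin:@db-host:1521:ORCL", "user", "password");
             PreparedStatement ps = conn.prepareStatement(
                 "INSERT INTO client_data (name, city) VALUES (?, ?)")) {

            Sheet sheet = workbook.getSheet(0);
            // Row 0 is assumed to be the header; jxl's getCell is (column, row).
            for (int row = 1; row < sheet.getRows(); row++) {
                ps.setString(1, sheet.getCell(0, row).getContents());
                ps.setString(2, sheet.getCell(1, row).getContents());
                ps.addBatch();
            }
            ps.executeBatch();
        } finally {
            workbook.close();
        }
    }
}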

Technology & Tools: JSP, Java, JDBC, SQL, Oracle 10g, Apache Tomcat
