
Hadoop Developer Resume


Charlotte, NC

SUMMARY

  • Overall 8+ years of experience in the IT industry, including 3+ years of Big Data experience implementing complete Hadoop solutions.
  • Expertise in Hadoop ecosystem components such as MapReduce, HDFS, Hive, Sqoop, Pig and Kafka for scalable, distributed and high-performance computing.
  • In-depth knowledge of Hadoop architecture and its components, such as HDFS, Job Tracker, Task Tracker, Name Node, Data Node, and MRv1 and MRv2 (YARN).
  • Experienced in creating MapReduce jobs in Java per business requirements.
  • Migrated an ETL project to Hadoop with no defects; it now runs in production.
  • Experience in importing and exporting data between HDFS and relational database systems using Sqoop.
  • Hands-on experience in configuring and working with Kafka to load data from multiple sources directly into HDFS.
  • Strong experience in data analytics using Hive and Pig, including writing custom UDFs (see the sketch after this summary).
  • Knowledge of job workflow scheduling and monitoring tools like Oozie, UC4 and Zookeeper.
  • Exposure to Cloudera and Hortonworks development environments.
  • Experience in Core Java and multi-threaded processing.
  • Extensive knowledge in using SQL Queries for backend database analysis.
  • Good experience in using Linux/Unix shell scripting.
  • Experience in UML (Class Diagrams, Sequence Diagrams, Use case Diagrams).
  • Hands-on experience in application development using Core Java, RDBMS and Linux shell scripting.
  • Experienced in creating and analyzing Software Requirement Specifications (SRS) and Functional Specification Documents (FSD).
  • Excellent working experience in Scrum / Agile framework, Iterative and Waterfall project execution methodologies.
  • Experienced in preparing and executing Unit Test Plan and Unit Test Cases after software development.
  • Worked extensively in Finance, Airline and Automotive Insurance domains.
  • Experienced in working in multi-cultural environments, both in a team and individually, as project requirements demand.
  • Excellent communication and interpersonal skills; self-motivated, organized and detail-oriented; able to work well under deadlines in a changing environment and perform multiple tasks effectively and concurrently.
  • Strong analytical skills with the ability to quickly understand clients' business needs; involved in meetings to gather information and requirements from clients.
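
Below is a minimal Java sketch of the kind of custom Hive UDF referenced above, written against the classic org.apache.hadoop.hive.ql.exec.UDF API; the package, class name and normalization logic are illustrative assumptions rather than code from the projects listed here.

    package com.example.hive; // hypothetical package

    import org.apache.hadoop.hive.ql.exec.UDF;
    import org.apache.hadoop.io.Text;

    // Trims and lower-cases a string column. Registered in Hive with:
    //   ADD JAR normalize-udf.jar;
    //   CREATE TEMPORARY FUNCTION normalize AS 'com.example.hive.NormalizeUDF';
    public class NormalizeUDF extends UDF {
        public Text evaluate(Text input) {
            if (input == null) {
                return null; // preserve SQL NULL semantics
            }
            return new Text(input.toString().trim().toLowerCase());
        }
    }

Once registered, the function can be called like any built-in, for example SELECT normalize(customer_name) FROM customers.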

TECHNICAL SKILLS

Hadoop/Big Data: HDFS, MapReduce, Pig, Hive, Sqoop, Oozie, Zookeeper, Apache Spark, Kafka, Apache Storm.

IDE Tools: Eclipse, IBM WebSphere, NetBeans

Programming languages: Java, Linux shell scripts, Scala

Databases: Oracle 11g/10g/9i, MySQL, DB2, MS-SQL Server

Web Technologies: HTML, XML, JavaScript

Version control: Git, GitHub

Design Technologies: UML

Middleware: TIBCO

Development Approach: Agile, Waterfall, Iterative, Spiral

Operating Systems: All versions of Microsoft Windows, UNIX and Linux

Protocols: TCP/IP, HTTP, HTTPS, TELNET, FTP

PROFESSIONAL EXPERIENCE

Hadoop Developer

Confidential, Charlotte, NC

Responsibilities:

  • Worked with Data Ingestion techniques to move data from various sources to HDFS.
  • Designed applications for storing data to HDFS using Kafka for better performance.
  • Integrated Spark with Kafka to feed data from Kafka events into Spark (see the sketch after this list).
  • Designed RDDs with Spark Streaming and Spark SQL.
  • Presently implementing Kafka.
  • Analyzed data in different formats.
  • Wrote MapReduce programs in Java.
  • Worked extensively with partitioned and bucketed tables in Hive, and designed both managed and external tables.
  • Worked on optimization of Hive Queries.
  • Created and worked with Sqoop jobs with full refresh and incremental load to populate Hive External tables.
  • Used Pig for data transformations.
  • Developed UDFs for MapReduce, Hive and Pig.
  • Worked on HBase and its integration with Storm.
  • Designed and created Oozie workflows to schedule and manage Hadoop, Hive, Pig and Sqoop jobs.
  • Worked with RDBMS import and export to HDFS.
  • Involved in requirement analysis.
  • Involved in giving knowledge transfer (KT) sessions to other team members.
  • Involved in preparing Project documentation.
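
A minimal Java sketch of the Spark and Kafka integration described above, using the spark-streaming-kafka-0-10 connector's direct-stream API; the broker address, topic name, consumer group and HDFS path are placeholder assumptions, not values from the actual project.

    import java.util.Arrays;
    import java.util.Collection;
    import java.util.HashMap;
    import java.util.Map;

    import org.apache.kafka.clients.consumer.ConsumerRecord;
    import org.apache.kafka.common.serialization.StringDeserializer;
    import org.apache.spark.SparkConf;
    import org.apache.spark.streaming.Durations;
    import org.apache.spark.streaming.api.java.JavaInputDStream;
    import org.apache.spark.streaming.api.java.JavaStreamingContext;
    import org.apache.spark.streaming.kafka010.ConsumerStrategies;
    import org.apache.spark.streaming.kafka010.KafkaUtils;
    import org.apache.spark.streaming.kafka010.LocationStrategies;

    public class KafkaToHdfs {
        public static void main(String[] args) throws InterruptedException {
            SparkConf conf = new SparkConf().setAppName("KafkaToHdfs");
            JavaStreamingContext jssc =
                new JavaStreamingContext(conf, Durations.seconds(10));

            Map<String, Object> kafkaParams = new HashMap<>();
            kafkaParams.put("bootstrap.servers", "broker1:9092"); // placeholder broker
            kafkaParams.put("key.deserializer", StringDeserializer.class);
            kafkaParams.put("value.deserializer", StringDeserializer.class);
            kafkaParams.put("group.id", "hdfs-loader");           // placeholder group id
            kafkaParams.put("auto.offset.reset", "latest");

            Collection<String> topics = Arrays.asList("events");  // placeholder topic

            // Direct stream: each Kafka partition maps to one Spark partition.
            JavaInputDStream<ConsumerRecord<String, String>> stream =
                KafkaUtils.createDirectStream(
                    jssc,
                    LocationStrategies.PreferConsistent(),
                    ConsumerStrategies.<String, String>Subscribe(topics, kafkaParams));

            // Keep only the message payloads and write each micro-batch to HDFS.
            stream.map(ConsumerRecord::value)
                  .foreachRDD(rdd -> rdd.saveAsTextFile(
                      "hdfs:///data/events/" + System.currentTimeMillis())); // placeholder path

            jssc.start();
            jssc.awaitTermination();
        }
    }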

Environment: Apache Spark (Spark SQL, Spark Streaming, Spark Graphs), Storm, Kafka, Sqoop, Hive.

Hadoop Developer

Confidential, Chicago, IL

Responsibilities:

  • Involved in loading data from RDBMS into HDFS using Sqoop.
  • Collecting and aggregating large amounts of log data using Apache Flume and staging data in HDFS for further analysis.
  • Responsible for preparing the File level and Column level metadata.
  • Responsible for building scalable distributed data solutions using Hadoop.
  • Developed multiple MapReduce jobs in Java for data cleaning and preprocessing (see the sketch after this list).
  • Responsible for managing data coming from different sources.
  • Involved in creating Pig tables, loading them with data, and writing Pig Latin queries that run internally as MapReduce jobs.
  • Extensively used Sqoop for importing the data from RDBMS to HDFS.
  • Handled the imported data to perform transformations, cleaning and filtering using Hive and MapReduce.
  • Participated in technical design walkthroughs and test summary walkthroughs.
  • Created and updated the peer review log (PRL) to track review comments and defects for the documents reviewed.
  • Attended project kick-off meetings and ST meetings.
  • Prepared documentation and participated in preparing the user manual for the application.
  • Guided the team technically in resolving issues faced during coding and testing, and whenever production issues occurred.
  • Created Hive tables and partitions and loaded the data for analysis using HiveQL queries.
  • Wrote Pig scripts for sorting, joining and grouping the data.
  • Used MapReduce to process large sets of unstructured data into formats compatible with RDBMSs.
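
A minimal Java sketch of the kind of map-only cleaning job described above; the comma-delimited layout, the five-field record assumption and the class names are illustrative, not taken from the real specification.

    import java.io.IOException;
    import org.apache.hadoop.conf.Configuration;
    import org.apache.hadoop.fs.Path;
    import org.apache.hadoop.io.LongWritable;
    import org.apache.hadoop.io.NullWritable;
    import org.apache.hadoop.io.Text;
    import org.apache.hadoop.mapreduce.Job;
    import org.apache.hadoop.mapreduce.Mapper;
    import org.apache.hadoop.mapreduce.lib.input.FileInputFormat;
    import org.apache.hadoop.mapreduce.lib.output.FileOutputFormat;

    public class CleaningJob {

        public static class CleaningMapper
                extends Mapper<LongWritable, Text, NullWritable, Text> {
            @Override
            protected void map(LongWritable key, Text value, Context context)
                    throws IOException, InterruptedException {
                String line = value.toString().trim();
                // Drop blank lines and records with the wrong column count
                // (assumes five comma-separated fields).
                if (line.isEmpty() || line.split(",", -1).length != 5) {
                    return;
                }
                context.write(NullWritable.get(), new Text(line));
            }
        }

        public static void main(String[] args) throws Exception {
            Job job = Job.getInstance(new Configuration(), "data-cleaning");
            job.setJarByClass(CleaningJob.class);
            job.setMapperClass(CleaningMapper.class);
            job.setNumReduceTasks(0); // map-only: cleaned records go straight to HDFS
            job.setOutputKeyClass(NullWritable.class);
            job.setOutputValueClass(Text.class);
            FileInputFormat.addInputPath(job, new Path(args[0]));
            FileOutputFormat.setOutputPath(job, new Path(args[1]));
            System.exit(job.waitForCompletion(true) ? 0 : 1);
        }
    }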

Environment: Hadoop Ecosystem CDH4, HDFS, MapReduce, Pig, Hive, HBase, Sqoop, Eclipse, SVN, Java, Unix, RDBMS Oracle 11g and IBM DB2.

Java Developer

Confidential

Responsibilities:

  • Responsible for reviewing business user requirements; participated in user meetings with Business Analysts.
  • Developed an application that helps crew members check in, check out, file complaints, and generate reports on the number of complaints per week, month and year.
  • Developed stored procedures, cursors and database Triggers.
  • Consumed Web Services (WSDL, SOAP and UDDI) from third parties.
  • Used XSL/XSLT to transform a common XML format into a displayable format.
  • Used Log4J for logging and tracing messages (see the sketch after this list).
  • Deployed the application in IBM WebSphere.
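
A minimal Java sketch of the Log4J logging and tracing pattern mentioned above; the class name, method and messages are illustrative assumptions.

    import org.apache.log4j.Logger;

    public class CheckInService {
        // One logger per class, configured via log4j.properties.
        private static final Logger LOG = Logger.getLogger(CheckInService.class);

        public void checkIn(String crewId) {
            LOG.info("Check-in started for crew " + crewId);
            try {
                // ... check-in business logic ...
            } catch (RuntimeException e) {
                // Log the failure with its stack trace, then rethrow.
                LOG.error("Check-in failed for crew " + crewId, e);
                throw e;
            }
        }
    }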

Environment: Core Java, Servlets, Oracle 11i, JSP 2.2, HTML, XML, XSL, SOAP, Log4J, ANT.

Java Developer

Confidential

Responsibilities:

  • Developed an application that helps flight planners schedule, cancel and reroute flights and check flight status.
  • Developed the application using Java Applets.
  • Developed stored procedures, cursors and database Triggers.
  • Used Log4J for logging and tracing the messages
  • Developed Crystal Reports for the application that calculate taxi time, flight waiting time, etc.

Environment: Java Applets, Oracle 11i, Log4j, Crystal Reports.

Java Developer

Confidential

Responsibilities:

  • Utilized Agile Methodologies to manage full life-cycle development of the project.
  • Developed front-end validations using JavaScript, and designed the layouts and custom tag libraries for all JSPs.
  • Used JDBC for database connectivity (see the sketch after this list).
  • Developed the web application using JSP custom tag libraries and Action classes. Designed Java Servlets and objects using J2EE standards.
  • Used JSP for the presentation layer; developed a high-performance object/relational persistence and query service for the entire application.
  • Developed the XML Schema and Web services for the data maintenance and structures.
  • Worked with various Style Sheets like Cascading Style Sheets (CSS).
  • Designed the database and created tables; wrote complex SQL queries and stored procedures as per the requirements.
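
A minimal Java sketch of the JDBC connectivity pattern mentioned above, written with try-with-resources for brevity; the connection URL, credentials, table and column names are placeholder assumptions, not values from the actual project.

    import java.sql.Connection;
    import java.sql.DriverManager;
    import java.sql.PreparedStatement;
    import java.sql.ResultSet;
    import java.sql.SQLException;

    public class CustomerDao {
        // Placeholder connection details; real values would come from app server config.
        private static final String URL = "jdbc:oracle:thin:@dbhost:1521:orcl";

        public String findCustomerName(long customerId) throws SQLException {
            try (Connection con = DriverManager.getConnection(URL, "app_user", "secret");
                 PreparedStatement ps = con.prepareStatement(
                         "SELECT name FROM customers WHERE id = ?")) {
                ps.setLong(1, customerId); // bind parameter, avoiding SQL injection
                try (ResultSet rs = ps.executeQuery()) {
                    return rs.next() ? rs.getString("name") : null;
                }
            }
        }
    }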

Environment: Java/J2EE, Oracle 10g, SQL, PL/SQL, JSP, EJB, WebLogic 8.0, HTML, JavaScript, JDBC, XML, JMS, Log4j, MyEclipse.
