Hadoop Developer/Team Lead Resume

El Segundo, CA

SUMMARY

  • 8+ years of experience in software development, including 3+ years in all phases of Hadoop and HDFS development.
  • Experience in developing MapReduce programs using Apache Hadoop to analyze big data as per requirements.
  • Experience with DataTorrent.
  • Experienced with major Hadoop ecosystem projects such as Pig, Hive, Tez, and HBase.
  • Good working experience using Sqoop to import data into HDFS from RDBMS and vice versa.
  • Good knowledge of job scheduling and cluster coordination tools such as Oozie and ZooKeeper.
  • Experience in Hadoop administration activities such as installation and configuration of clusters using Apache and Cloudera distributions.
  • Experience in developing solutions to analyze large data sets efficiently.
  • Knowledge of administrative tasks such as installing Hadoop and its ecosystem components, such as Hive and Pig.
  • Knowledge of NoSQL databases such as HBase and MongoDB.
  • In-depth understanding of Hadoop architecture and its components, such as HDFS, JobTracker, TaskTracker, NameNode, DataNode, and MapReduce concepts.
  • Extended Hive and Pig core functionality by writing custom UDFs (a minimal sketch follows this summary).
  • Good understanding of data mining and machine learning techniques.
  • Experience in analyzing data using HiveQL, Pig Latin, and custom MapReduce programs in Java.
  • Strong work ethic with a desire to succeed and make significant contributions to the organization.
  • Strong problem-solving, communication, and interpersonal skills; a good team player.
  • Motivated to take on independent responsibility as well as to contribute as a productive team member.
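
As an illustration of the custom UDF work above, a minimal sketch of a Hive UDF in Java; the class, package, and function names are hypothetical:

```java
import org.apache.hadoop.hive.ql.exec.UDF;
import org.apache.hadoop.io.Text;

// Minimal Hive UDF sketch: trims and lower-cases a string column.
// Hypothetical registration in Hive:
//   ADD JAR normalize-udf.jar;
//   CREATE TEMPORARY FUNCTION normalize_str AS 'com.example.udf.NormalizeString';
public class NormalizeString extends UDF {
    public Text evaluate(Text input) {
        if (input == null) {
            return null; // preserve SQL NULL semantics
        }
        return new Text(input.toString().trim().toLowerCase());
    }
}
```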

TECHNICAL SKILLS

Hadoop/Big Data Technologies: HDFS, MapReduce, Hive, Pig, Sqoop, Flume, Tez, HBase, Oozie, ZooKeeper, Kerberos

Programming Languages: Java (JDK 1.4/1.5/1.6), C/C++, HTML, SQL, PL/SQL, AVS & JVS

Frameworks: Hibernate 2.x/3.x, Spring 2.x/3.x, Struts 1.x/2.x

Web Services: WSDL, SOAP, Apache CXF/XFire, Apache Axis, REST, Jersey

Client Technologies: jQuery, JavaScript, AJAX, CSS, HTML5, XHTML

Operating Systems: UNIX, Linux, Windows

Application Servers: IBM WebSphere, Apache Tomcat, WebLogic

Web Technologies: JSP, Servlets, JNDI, JDBC, JavaBeans, JavaScript, Web Services (JAX-WS)

Databases: Oracle 8i/9i/10g & MySQL 4.x/5.x

Java IDEs: Eclipse 3.x, IBM WebSphere Application Developer, IBM RAD 7.0

Tools: TOAD, SQL Developer, SoapUI, ANT, Maven, Visio, Rational Rose, Endur 8.x/10.x/11.x

PROFESSIONAL EXPERIENCE

Confidential, El Segundo, CA

Hadoop Developer/Team Lead

Environment: Hadoop 2.0, Hive, Sqoop, Hue, Tez, DataTorrent, RapidMiner, Core Java, Cloudera Manager, Oracle, MySQL, UNIX, Oozie, Cloudera Distribution

Responsibilities:

  • Involved in the architecture of the proof of concept and in initial meetings with the Cloudera architect and BI teams for requirements gathering.
  • Worked on exporting data files and control files from Oracle to HDFS.
  • Created Hive external tables for append-only tables and managed tables for reload tables.
  • Worked on Hive windowing and analytical functions (see the sketch after this list).
  • Developed data-merging scripts as part of the incremental loads.
  • Worked in an Agile development process and attended biweekly scrum meetings.
  • Worked on append, insert, and update tables using Hive windowing functions.
  • Created shell scripts to load raw, intermediate, and reporting tables, and triggered the scripts using Oozie workflows.
  • Worked on a POC project to improve performance using DataTorrent and RapidMiner.
  • Monitored and maintained the cluster using Cloudera Manager.
  • Made changes to memory settings based on Cloudera standards.
  • Coded operators in core Java, pushed the code to a GitHub repository, and performed Git operations for DataTorrent.
  • Worked in an onsite-offshore model.
  • Worked with TEXT, AVRO, PARQUET, and SEQUENCE file formats.
  • Wrote MapReduce programs within Hive for loading data from temp tables to external tables.
  • Extensively worked on the Java operators of DataTorrent that live-streamed data to HDFS.
  • Worked on Sqoop imports from JDBC sources to HDFS. Worked in an Agile environment and used Pivotal Tracker for tracking stories.
  • Documented the low-level design document and migration document, and updated the existing interface document with the changes made.
  • Participated in weekly status meetings, conducted internal and external reviews and formal walkthroughs among various teams, and documented the proceedings.
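
By way of example, a minimal sketch of the kind of Hive windowing query used for incremental append/insert/update loads, run from Java over the Hive JDBC driver; the host, table, and column names are hypothetical:

```java
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.Statement;

// Sketch: deduplicate an incremental load by keeping only the latest
// version of each key, using Hive's ROW_NUMBER() windowing function.
public class IncrementalMerge {
    public static void main(String[] args) throws Exception {
        Class.forName("org.apache.hive.jdbc.HiveDriver");
        try (Connection conn = DriverManager.getConnection(
                "jdbc:hive2://hive-host:10000/default", "hive", ""); // hypothetical host
             Statement stmt = conn.createStatement()) {
            stmt.execute(
                "INSERT OVERWRITE TABLE customer_current " +   // managed "reload" table
                "SELECT id, name, updated_at FROM (" +
                "  SELECT id, name, updated_at, " +
                "         ROW_NUMBER() OVER (PARTITION BY id ORDER BY updated_at DESC) AS rn " +
                "  FROM customer_staging" +                    // external append-only table
                ") t WHERE t.rn = 1");
        }
    }
}
```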

Confidential, San Francisco, CA

Hadoop Admin/Developer

Environment: Hadoop, MapReduce, HDFS, Hive, Pig, Tez, Hue, Oozie, Core Java, Eclipse, HBase, Flume, Cloudera Manager, Oracle 10g, DB2, IDMS, VSAM, SQL*Plus, Toad, PuTTY, Windows NT, UNIX Shell Scripting, Pentaho Kettle, Pentaho Big Data, YARN

Responsibilities:

  • Involved in writing MapReduce programs, tested using MRUnit (see the sketch after this list).
  • Managed and scheduled jobs on the Hadoop cluster using Oozie.
  • Involved in moving all log files generated from various sources to HDFS for further processing through Flume.
  • Involved in loading data from the UNIX file system to HDFS.
  • Worked on the Hue interface for querying the data.
  • Created Hive tables to store the processed results in a tabular format.
  • Created HBase tables to store variable data formats of data coming from different portfolios.
  • Involved in transforming data from mainframe tables to HDFS and HBase tables using Sqoop and Pentaho Kettle.
  • Implemented best-income logic using Pig scripts.
  • Implemented test scripts to support test-driven development and continuous integration.
  • Responsible for managing data coming from different sources.
  • Experienced in loading and transforming large sets of structured, semi-structured, and unstructured data.
  • Exported the analyzed data to relational databases using Sqoop for visualization and to generate reports for the BI team.
  • Analyzed large data sets to determine the optimal way to aggregate and report on them.
  • Participated in the requirements gathering and analysis phase of the project, documenting the business requirements by conducting workshops and meetings with various business users.
  • Deep and thorough understanding of ETL tools and how they can be applied in a big data environment.
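
For illustration, a minimal MRUnit sketch of the kind of mapper test referenced above; the tokenizing mapper and its input are hypothetical:

```java
import java.io.IOException;

import org.apache.hadoop.io.IntWritable;
import org.apache.hadoop.io.LongWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.Mapper;
import org.apache.hadoop.mrunit.mapreduce.MapDriver;
import org.junit.Test;

// Sketch: unit-testing a simple tokenizing mapper with MRUnit,
// without spinning up a cluster.
public class TokenMapperTest {

    // Hypothetical mapper: emits (word, 1) for each token in a line.
    public static class TokenMapper
            extends Mapper<LongWritable, Text, Text, IntWritable> {
        private static final IntWritable ONE = new IntWritable(1);
        @Override
        protected void map(LongWritable key, Text value, Context context)
                throws IOException, InterruptedException {
            for (String token : value.toString().split("\\s+")) {
                if (!token.isEmpty()) {
                    context.write(new Text(token), ONE);
                }
            }
        }
    }

    @Test
    public void emitsOnePerToken() throws IOException {
        MapDriver<LongWritable, Text, Text, IntWritable> driver =
                MapDriver.newMapDriver(new TokenMapper());
        driver.withInput(new LongWritable(0), new Text("hadoop hive"))
              .withOutput(new Text("hadoop"), new IntWritable(1))
              .withOutput(new Text("hive"), new IntWritable(1))
              .runTest();
    }
}
```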

Confidential, Boston, MA

Big Data/Hadoop Developer

Environment: Java 6, Eclipse, Linux, Hadoop, HBase, Sqoop, Pig, Hive, Flume.

Responsibilities:

  • Installed and configured Hadoop MapReduce and HDFS; developed multiple MapReduce jobs in Java for data cleaning and preprocessing (see the sketch after this list).
  • Imported and exported data into HDFS and Hive using Sqoop.
  • Experienced in defining job flows.
  • Experienced in managing and reviewing Hadoop log files.
  • Extracted files from CouchDB through Sqoop, placed them in HDFS, and processed them.
  • Experienced in running Hadoop streaming jobs to process terabytes of XML-format data.
  • Loaded and transformed large sets of structured, semi-structured, and unstructured data.
  • Responsible for managing data coming from different sources.
  • Gained good experience with NoSQL databases.
  • Supported MapReduce programs running on the cluster.
  • Involved in loading data from the UNIX file system to HDFS.
  • Installed and configured Hive and wrote Hive UDFs.
  • Involved in creating Hive tables, loading them with data, and writing Hive queries that run internally as MapReduce jobs.
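
As an example, a minimal sketch of a map-only data-cleaning job of the kind referenced above; the delimiter, field count, and validity rule are hypothetical:

```java
import java.io.IOException;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.io.LongWritable;
import org.apache.hadoop.io.NullWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.Job;
import org.apache.hadoop.mapreduce.Mapper;
import org.apache.hadoop.mapreduce.lib.input.FileInputFormat;
import org.apache.hadoop.mapreduce.lib.output.FileOutputFormat;

// Sketch: map-only job that drops malformed pipe-delimited records.
public class CleanRecordsJob {

    public static class CleanMapper
            extends Mapper<LongWritable, Text, Text, NullWritable> {
        @Override
        protected void map(LongWritable key, Text value, Context ctx)
                throws IOException, InterruptedException {
            String[] fields = value.toString().split("\\|", -1);
            // Hypothetical rule: keep records with exactly 5 fields and a non-empty id.
            if (fields.length == 5 && !fields[0].trim().isEmpty()) {
                ctx.write(value, NullWritable.get());
            } // malformed records are silently dropped
        }
    }

    public static void main(String[] args) throws Exception {
        Job job = Job.getInstance(new Configuration(), "clean-records");
        job.setJarByClass(CleanRecordsJob.class);
        job.setMapperClass(CleanMapper.class);
        job.setNumReduceTasks(0); // map-only: output goes straight to HDFS
        job.setOutputKeyClass(Text.class);
        job.setOutputValueClass(NullWritable.class);
        FileInputFormat.addInputPath(job, new Path(args[0]));
        FileOutputFormat.setOutputPath(job, new Path(args[1]));
        System.exit(job.waitForCompletion(true) ? 0 : 1);
    }
}
```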

Confidential, MD

Java/J2EE Developer

Environment: Java, JDK 1.5, Servlets, Hibernate, Ajax, Oracle 10g, Eclipse, Apache Ant, Web Services (SOAP), Apache Axis, WebLogic Server, JavaScript, HTML, CSS, XML

Responsibilities:

  • Responsible for gathering and analyzing requirements and converting them into technical specifications
  • Used Rational Rose for creating sequence and class diagrams
  • Developed presentation layer using JSP, Java, HTML and JavaScript
  • Used Spring Core Annotations for Dependency Injection
  • Designed and developed a convention-based coding approach utilizing Hibernate’s persistence framework and O-R mapping capability to enable dynamic fetching and display of various table data with JSF tag libraries
  • Designed and developed the Hibernate configuration and a session-per-request design pattern for database connectivity and transaction handling; used HQL and SQL for fetching and storing data in databases (see the sketch after this list)
  • Participated in the design and development of database schema and Entity-Relationship diagrams of the backend Oracle database tables for the application
  • Implemented web services with Apache Axis
  • Designed and developed stored procedures and triggers in Oracle to cater to the needs of the entire application. Developed complex SQL queries for extracting data from the database
  • Designed and built SOAP web service interfaces implemented in Java
  • Used Apache Ant for the build process
  • Used ClearCase for version control and ClearQuest for bug tracking
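
A minimal sketch of the session-per-request pattern with an HQL fetch, in the Hibernate style of the stack above; the Account entity is hypothetical and its Hibernate mapping file is omitted:

```java
import java.util.List;

import org.hibernate.Session;
import org.hibernate.SessionFactory;
import org.hibernate.Transaction;
import org.hibernate.cfg.Configuration;

// Sketch: one Session and one Transaction per request, HQL for the fetch.
public class AccountDao {
    // Typically built once at startup from hibernate.cfg.xml.
    private static final SessionFactory FACTORY =
            new Configuration().configure().buildSessionFactory();

    @SuppressWarnings("unchecked")
    public List<Account> findByStatus(String status) {
        Session session = FACTORY.openSession();
        Transaction tx = null;
        try {
            tx = session.beginTransaction();
            List<Account> result = session
                    .createQuery("from Account a where a.status = :status")
                    .setParameter("status", status)
                    .list();
            tx.commit();
            return result;
        } catch (RuntimeException e) {
            if (tx != null) tx.rollback();
            throw e;
        } finally {
            session.close(); // session lives only for this request
        }
    }
}

// Hypothetical mapped entity (mapping configuration omitted).
class Account {
    private Long id;
    private String status;
    public Long getId() { return id; }
    public void setId(Long id) { this.id = id; }
    public String getStatus() { return status; }
    public void setStatus(String status) { this.status = status; }
}
```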

Confidential, IL

Java/J2EE Developer

Environment: Java, J2EE, XML, XML Schemas, JSP, HTML, CSS, PL/SQL, JUnit, Log4j, IBM WebSphere Application Server.

Responsibilities:

  • Involved in the creation of UML diagrams such as class, activity, and sequence diagrams using IBM Rational Rose modeling tools
  • Involved in the development of JSPs and Servlets for different User Interfaces
  • Used Struts action forms and developed Action classes, which act as the navigation controller in the Struts framework (see the sketch after this list)
  • Implemented template-based categorization of presentation content using Struts Tiles; MVC implementation using the Struts framework
  • Involved in Unit Testing of Various Modules based on the Test Cases
  • Involved in Bug fixing of various modules that were raised by the Testing teams in the application during the Integration testing phase
  • Involved and participated in Code reviews
  • Used Log4J logging framework for logging messages
  • Used Rational ClearCase for version control
  • Used Rational ClearQuest for bug tracking
  • Involved in deployment of the application on IBM WebSphere Application Server
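
For illustration, a minimal sketch of a Struts Action class with its action form, of the kind referenced above; the form fields, forward names, and authentication check are hypothetical:

```java
import javax.servlet.http.HttpServletRequest;
import javax.servlet.http.HttpServletResponse;

import org.apache.struts.action.Action;
import org.apache.struts.action.ActionForm;
import org.apache.struts.action.ActionForward;
import org.apache.struts.action.ActionMapping;

// Sketch: a Struts 1 Action acting as the navigation controller.
// The "success"/"failure" forwards would be declared in struts-config.xml.
public class LoginAction extends Action {

    // Hypothetical action form backing the login page.
    public static class LoginForm extends ActionForm {
        private String username;
        private String password;
        public String getUsername() { return username; }
        public void setUsername(String username) { this.username = username; }
        public String getPassword() { return password; }
        public void setPassword(String password) { this.password = password; }
    }

    @Override
    public ActionForward execute(ActionMapping mapping, ActionForm form,
                                 HttpServletRequest request,
                                 HttpServletResponse response) {
        LoginForm loginForm = (LoginForm) form;
        if (authenticate(loginForm.getUsername(), loginForm.getPassword())) {
            request.getSession().setAttribute("user", loginForm.getUsername());
            return mapping.findForward("success");
        }
        return mapping.findForward("failure");
    }

    private boolean authenticate(String user, String password) {
        return user != null && !user.isEmpty(); // placeholder check
    }
}
```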

Confidential, NC

Java/J2EE Application Developer

Environment: Java 1.6, Spring, Hibernate, Maven, Apache MQ, JUnit, JAXB, Oracle 10g, Oracle Coherence, Log4j, Shell Script, SoapUI, TOAD, SQL Developer, Quality Center, Linux, Windows

Responsibilities:

  • Responsible for designing, documenting, implementing, and unit-testing a robust web services framework to support templated payments and group payments in Java.
  • Worked in an onsite-offshore model. Led offshore developers, assisted them in understanding requirements, and provided code reviews.
  • Extensively used the Spring Framework for the business layer - accessing enterprise services like JNDI, JMS, and job scheduling.
  • Also used Spring for transaction management and dependency injection.
  • Created the database model, domain objects, and DAOs that interact with the database and store the template-related data and events.
  • Used JAX-WS and Spring Web Services to create and consume SOAP-based web services (see the sketch after this list).
  • Used Hibernate as an ORM tool for database operations.
  • Created MDBs to consume messages on various user events.
  • Worked in Agile software methodology with Scrum type development.
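
By way of example, a minimal JAX-WS sketch of the kind of SOAP endpoint referenced above; the service name, operation, and publish address are hypothetical:

```java
import javax.jws.WebMethod;
import javax.jws.WebService;
import javax.xml.ws.Endpoint;

// Sketch: a JAX-WS SOAP endpoint; the WSDL is generated from the annotations.
@WebService
public class TemplatePaymentService {

    @WebMethod
    public String submitPayment(String templateId, double amount) {
        // Hypothetical operation: validate and enqueue the payment,
        // then return a confirmation id.
        return "CONF-" + templateId + "-" + System.currentTimeMillis();
    }

    public static void main(String[] args) {
        // Standalone publish for local testing; in production this would be
        // deployed to an application server instead.
        Endpoint.publish("http://localhost:8080/payments", new TemplatePaymentService());
    }
}
```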

Confidential, OH

Junior Java Developer

Environment: Java/J2EE, Eclipse, WebLogic Application Server, Oracle, JSP, HTML, JavaScript, JMS, Servlets, UML, XML, Struts, Web Services, WSDL, SOAP, UDDI

Responsibilities:

  • Responsible for understanding the business requirements.
  • Worked with the Business Analyst and helped represent the business domain details in technical specifications.
  • Was also actively involved in setting coding standards and writing related documentation.
  • Developed the Java code using Eclipse as the IDE.
  • Developed JSPs and Servlets to dynamically generate HTML and display the data on the client side.
  • Developed the application on the Struts MVC architecture utilizing Action classes, action forms, and validations.
  • Used Tiles as an implementation of the Composite View pattern.
  • Was responsible for implementing various J2EE design patterns such as Service Locator, Business Delegate, Session Facade, and Factory.
  • Performed code review and debugging using the Eclipse debugger.
  • Was responsible for developing and deploying the EJBs (session beans and MDBs).
  • Configured queues in WebLogic Server to which messages were published using the JMS API (see the sketch after this list).
  • Consumed web services (WSDL, SOAP, and UDDI) from a third party for authorizing payments to/from customers.
  • Wrote and maintained database queries.
  • Built the web application using Maven as the build tool.
  • Used CVS for version control.
  • Performed unit testing using the JUnit testing framework and used Log4j to monitor error logs.
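
As an illustration of the queue configuration above, a minimal JMS sketch of consuming from a WebLogic queue looked up through JNDI; the provider URL and JNDI names are hypothetical:

```java
import java.util.Hashtable;

import javax.jms.Connection;
import javax.jms.ConnectionFactory;
import javax.jms.Message;
import javax.jms.MessageListener;
import javax.jms.Queue;
import javax.jms.Session;
import javax.jms.TextMessage;
import javax.naming.Context;
import javax.naming.InitialContext;

// Sketch: JMS queue consumer; the factory and queue are looked up via JNDI.
public class PaymentQueueListener {
    public static void main(String[] args) throws Exception {
        Hashtable<String, String> env = new Hashtable<String, String>();
        env.put(Context.INITIAL_CONTEXT_FACTORY,
                "weblogic.jndi.WLInitialContextFactory");      // WebLogic JNDI provider
        env.put(Context.PROVIDER_URL, "t3://localhost:7001");  // hypothetical server URL
        Context ctx = new InitialContext(env);

        ConnectionFactory factory =
                (ConnectionFactory) ctx.lookup("jms/ConnectionFactory"); // hypothetical JNDI name
        Queue queue = (Queue) ctx.lookup("jms/PaymentQueue");            // hypothetical JNDI name

        Connection connection = factory.createConnection();
        Session session = connection.createSession(false, Session.AUTO_ACKNOWLEDGE);
        session.createConsumer(queue).setMessageListener(new MessageListener() {
            public void onMessage(Message message) {
                try {
                    if (message instanceof TextMessage) {
                        System.out.println("Received: " + ((TextMessage) message).getText());
                    }
                } catch (Exception e) {
                    e.printStackTrace();
                }
            }
        });
        connection.start(); // begin message delivery
    }
}
```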
