
Sr. Hadoop Developer Resume


Reston, VA

PROFESSIONAL SUMMARY:

  • 7+ years of IT experience spanning the analysis, design, development, and testing of complex applications.
  • 3+ years of strong working experience with Big Data and the Hadoop ecosystem.
  • Strong experience with Hadoop and its ecosystem components, including MapReduce, HBase, Hive, Pig, Oozie, Sqoop, Flume, and Spark.
  • Excellent understanding of Hadoop architecture and its core components: HDFS, MapReduce, and YARN.
  • Strong knowledge of the MapReduce framework; worked extensively on customizing it to write complex business-transformation jobs over structured and semi-structured data.
  • Excellent knowledge of HiveQL; worked extensively on partitioning, bucketing, joins, and query tuning for performance.
  • Created custom UDFs for both Hive and Pig (see the sketch following this list).
  • Experience with Hadoop distributions including Cloudera CDH5, Hortonworks 2.0, AWS cloud, and open-source Apache Hadoop.
  • Good understanding of Spark architecture and newer Spark concepts such as DataFrames and Spark SQL.
  • Extensive exposure to RDBMS architecture, modeling, design, development, and load/migration work with Oracle, SQL Server, and DB2, plus experience with NoSQL data stores such as HBase and MongoDB.
  • Experience importing and exporting data with Sqoop between HDFS and relational database systems.
  • Experience working with Flume to load log data from multiple sources directly into HDFS.
  • Experience designing both time-driven and data-driven automated workflows using Oozie, and configuring ZooKeeper for distributed synchronization and group services.
  • Experience processing data serialization formats such as XML, JSON, SequenceFiles, and Avro.
  • Developed unit tests for MapReduce code using the MRUnit test framework.
  • Good understanding of LZOP and Snappy compression codecs.
  • Good experience understanding and visualizing data by integrating Hadoop with Tableau.
  • Good experience diagnosing, tuning, and profiling map and reduce tasks.
  • Extensive experience in middle-tier development using J2EE technologies such as JDBC, JNDI, JSP, Servlets, JSF, Spring, Hibernate, Struts, and EJB.
  • Experience with web-based UI development using jQuery, CSS, HTML, XHTML, and JavaScript.
  • Quick to learn and master new technologies, with strong programming skills and the ability to deliver on short deadlines.
  • Experience writing queries to process data and generate data cubes for visualization.
  • Excellent at coordinating, mentoring, and exchanging information with colleagues.
  • Capable of assuming responsibility, exercising initiative, and working independently as required.
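
A minimal sketch of the kind of custom Hive UDF referenced above, written against the classic org.apache.hadoop.hive.ql.exec.UDF API; the class name and the lower-casing logic are illustrative assumptions, not the original project code:

```java
import org.apache.hadoop.hive.ql.exec.UDF;
import org.apache.hadoop.io.Text;

// Minimal Hive UDF sketch: trims and lower-cases a string column.
// Class name and normalization logic are hypothetical examples.
public class NormalizeText extends UDF {
    public Text evaluate(final Text input) {
        if (input == null) {
            return null; // propagate SQL NULLs unchanged
        }
        return new Text(input.toString().trim().toLowerCase());
    }
}
```

Once packaged in a JAR, a function like this would typically be registered in a Hive session with ADD JAR and CREATE TEMPORARY FUNCTION before being called from HiveQL.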

TECHNICAL SKILLS:

Big Data & Ecosystem: HDFS, MapReduce, HBase, Pig, Hive, Sqoop, Flume, Oozie, Solr, YARN, Spark

Big data Analytics: Impala, Tableau

Java & J2EE Technologies: Core Java, Servlets, JSP, JDBC, JNDI, Java Beans

Frameworks: MVC, Struts, Hibernate, Spring

Programming languages: Java, Python, R, Linux Shell Script, SQL, HiveQL, Pig Latin

IDEs: Eclipse, NetBeans, RStudio, Visual Studio 2012

Databases: Oracle 11g/10g/9i, MySQL, DB2

Application Servers: Apache Tomcat 5/6/7, JBoss 6/7

Web Technologies: HTML, XML, JavaScript, SOAP, DOM

ETL Tools: Talend

Testing: MRUnit, LoadRunner, JUnit

Operating Systems: Windows, Ubuntu 12.04, CentOS 5.x, RHEL

Virtualization Environment: VMware, Oracle VirtualBox

Source Code Management: Git

Others: Weka 3, Revolution R, WinSCP, PuTTY, MS Office

PROFESSIONAL EXPERIENCE:

Confidential, Reston, VA

Sr. Hadoop Developer

Responsibilities:

  • Involved in the design, development, and implementation of predictive analytics models.
  • Wrote MapReduce jobs to launch and monitor processing-intensive computations on the cluster (see the sketch following this list).
  • Responsible for data ingress and egress between HDFS and relational databases using Sqoop.
  • Worked on both bulk and incremental loads of data from enterprise data warehouses into Hadoop.
  • Wrote Pig Latin scripts to perform transformations on data in HDFS.
  • Created Oozie workflows to automate the entire data pipeline and scheduled them with the Oozie coordinator.
  • Processed HDFS data and created external Hive tables, using partitioning and bucketing for granularity, to analyze spikes and faults and optimize the customer experience.
  • Involved in identifying job dependencies to design Oozie workflows and in resource management for YARN.
  • Worked with common serialization formats such as JSON and XML, and big-data serialization formats such as Avro and SequenceFiles.
  • Developed Hive queries to process the data and generate data cubes for visualization.
  • Provided support for deploying code to multiple environments.
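
A minimal sketch of the kind of MapReduce job described above; the tab-delimited record layout and the customer-id field position are hypothetical assumptions:

```java
import java.io.IOException;
import org.apache.hadoop.io.IntWritable;
import org.apache.hadoop.io.LongWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.Mapper;

// Sketch of a transformation mapper over delimited HDFS records.
// The field layout (customer id in column 0) is a hypothetical example.
public class EventCountMapper
        extends Mapper<LongWritable, Text, Text, IntWritable> {

    private static final IntWritable ONE = new IntWritable(1);
    private final Text customerId = new Text();

    @Override
    protected void map(LongWritable key, Text value, Context context)
            throws IOException, InterruptedException {
        String[] fields = value.toString().split("\t");
        if (fields.length > 0 && !fields[0].isEmpty()) {
            customerId.set(fields[0]);
            context.write(customerId, ONE); // one event per record
        }
    }
}
```

A matching reducer would sum the counts per key, and a standard Job driver would wire the mapper, reducer, and input/output paths together.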

Environment: Hadoop, MapReduce, HDFS, Hive, HBase, Cloudera, Java, Data cubes, Eclipse, Linux, Pig, Storm, Python, R, Cassandra

Confidential

Sr. Hadoop Developer

Responsibilities:

  • Analyzed business requirements and existing software for the high-level design.
  • Designed detailed software structure and architecture documents using use cases, sequence diagrams, and UML.
  • Used clickstream logs from Adobe SiteCatalyst to perform customer and user behavior analysis with Hadoop ecosystem tools.
  • Worked with Flume to load log data from multiple sources directly into HDFS.
  • Developed unit tests for MapReduce code using the MRUnit test framework (see the test sketch following this list).
  • Applied MapReduce patterns such as repartition joins, semi-joins, and sampling to big data.
  • Used stack dumps to discover unoptimized user code.
  • Used Crunch for log parsing to find URL patterns and compute basic analytics.
  • Wrote Sqoop incremental import jobs to move new and updated records from the database into HDFS and HBase.
  • Applied the Tableau analytics platform to analyze and visualize data and to create reports for analysis.
  • Created an Oozie coordinator workflow to execute the Sqoop incremental job daily.
  • Examined task logs and determined the JVM startup arguments for a task.
  • Worked with splittable LZOP, Snappy, and LZO compression codecs.
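
A minimal MRUnit sketch of the kind of mapper unit test mentioned above, assuming the hypothetical EventCountMapper from the earlier sketch is on the test classpath:

```java
import org.apache.hadoop.io.IntWritable;
import org.apache.hadoop.io.LongWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mrunit.mapreduce.MapDriver;
import org.junit.Before;
import org.junit.Test;

// MRUnit sketch: verifies that the (hypothetical) EventCountMapper
// emits one (customerId, 1) pair per tab-delimited input record.
public class EventCountMapperTest {

    private MapDriver<LongWritable, Text, Text, IntWritable> mapDriver;

    @Before
    public void setUp() {
        mapDriver = MapDriver.newMapDriver(new EventCountMapper());
    }

    @Test
    public void emitsOneCountPerRecord() throws Exception {
        mapDriver.withInput(new LongWritable(0),
                            new Text("cust42\tclick\t2013-05-01"))
                 .withOutput(new Text("cust42"), new IntWritable(1))
                 .runTest();
    }
}
```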

Environment: Hadoop, MapReduce, HDFS, Hive, Pig, Java (JDK 1.6), SQL Server, Cloudera Manager, Sqoop, Flume, Oozie, Eclipse, Linux, Tableau, MongoDB

Confidential, Irvine CA

Sr. Java Developer

Responsibilities:

  • Developed back-end code in Core Java to implement technical enhancements following Java standards.
  • Implemented front-end changes based on Spring integrated with JSF, with RichFaces views and Java code as required.
  • Developed Servlets and JSPs based on the MVC pattern using the Struts Action framework.
  • Used Tiles for the header, footer, and navigation, and the Apache Validation Framework for form validation.
  • Involved in writing Hibernate queries and Hibernate-specific configuration and mapping files (see the query sketch following this list).
  • Involved in fixing bugs and making minor enhancements to the front-end modules.
  • Used the JUnit framework for writing test classes.
  • Used the Ant build tool to start the application server in various modes.
  • Proactively worked with managers and development teams to meet project goals within expected timelines.
  • Involved in completing post-production documentation and auditing activities with the QA team.
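
A minimal sketch of the kind of Hibernate query code referenced above; the "Account" entity and its "status" property are hypothetical stand-ins for the project's actual mappings:

```java
import java.util.List;
import org.hibernate.Session;
import org.hibernate.SessionFactory;

// Hibernate query sketch; the Account entity and its status property
// are hypothetical, assumed to be defined in the mapping files.
public class AccountDao {

    private final SessionFactory sessionFactory;

    public AccountDao(SessionFactory sessionFactory) {
        this.sessionFactory = sessionFactory;
    }

    public List<?> findByStatus(String status) {
        Session session = sessionFactory.openSession();
        try {
            // HQL query against the mapped Account entity
            return session.createQuery("from Account a where a.status = :status")
                          .setParameter("status", status)
                          .list();
        } finally {
            session.close();
        }
    }
}
```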

Environment: JSF, JSP, JMS, JPA, Spring, Oracle, Web Services, Eclipse, Maven, SOAP UI, AJAX, Hibernate, JUnit.

Confidential

Java Developer

Responsibilities:

  • Served as the point of contact for client requests for application development.
  • Played a key role in developing the business layer and data management components of a web-based J2EE architecture.
  • Created Ant and Maven scripts to build and deploy the applications.
  • Developed the graphical user interface using JSP, HTML5, JSTL, CSS, JavaScript, Backbone, and custom tags, which were also used to manipulate, validate, and customize error messages in the user interface.
  • Developed applications using the Struts framework based on the MVC design pattern.
  • Developed EJBs in WebLogic to handle business processes, database access, and asynchronous messaging; used JMS to communicate updates to other applications and MDBs to route priority requests (see the MDB sketch following this list).
  • Also used the JBoss application server for application deployment.
  • Created UNIX shell and Perl utilities for data parsing and manipulation.
  • Used XML and XSDs to define data formats.
  • Used ICEfaces tag libraries to develop user interface components.
  • Involved in low-level design using UML tools.
  • Involved in writing the Spring configuration XML file containing bean declarations; business classes were wired to the front-end managed beans using the Spring IoC pattern.
  • Involved in JUnit testing of various modules by generating test cases.
  • Involved in debugging issues raised by the testing team during the integration testing phase.
  • Prepared technical reports and documentation manuals during program development.
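
A minimal sketch of the kind of message-driven bean described above for routing priority requests; the queue name and the placeholder routing logic are illustrative assumptions:

```java
import javax.ejb.ActivationConfigProperty;
import javax.ejb.MessageDriven;
import javax.jms.JMSException;
import javax.jms.Message;
import javax.jms.MessageListener;
import javax.jms.TextMessage;

// MDB sketch: consumes priority requests from a JMS queue.
// The queue name and routing logic are hypothetical examples.
@MessageDriven(activationConfig = {
    @ActivationConfigProperty(propertyName = "destinationType",
                              propertyValue = "javax.jms.Queue"),
    @ActivationConfigProperty(propertyName = "destination",
                              propertyValue = "queue/priorityRequests")
})
public class PriorityRequestMdb implements MessageListener {

    @Override
    public void onMessage(Message message) {
        try {
            if (message instanceof TextMessage) {
                String body = ((TextMessage) message).getText();
                // Placeholder: hand the request to the routing logic.
                System.out.println("Received priority request: " + body);
            }
        } catch (JMSException e) {
            throw new RuntimeException("Failed to read JMS message", e);
        }
    }
}
```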

Environment: Java, J2EE, Spring, JavaScript, JPA, jQuery, Struts, Design Patterns, Ant, Maven, GUI, JSP, HTML5, JSTL, CSS, EJB, PostgreSQL, SQL, JMS, WebLogic, MDB, JBoss, Eclipse, Unix, XML, XSD, JSF, ICEfaces, UML, Spring IoC, JUnit.
