Hadoop Developer Resume
Chattanooga, TN
SUMMARY
- Over 7 years of experience in designing, developing, and implementing various projects using Hadoop and Java/J2EE technologies.
- Over 3 years' hands-on experience with Hadoop (Hortonworks, Cloudera) and its ecosystem technologies (Pig, HIVE, SQOOP, Oozie, ZooKeeper, Kafka, Storm, Flume, and Spark).
- Hands-on experience with major components of the Hadoop ecosystem, including HIVE, HBase, HBase-HIVE integration, Pig, SQOOP, and Flume, plus knowledge of the MapReduce/HDFS framework.
- In depth understanding of Classic MapReduce and YARN architectures.
- Experienced with Apache Spark, implementing advanced procedures such as text analytics and processing in Scala using Spark's in-memory computing capabilities.
- Experience in developing customized UDF's in java to extend HIVE and Pig Latin functionality.
- Developed MapReduce jobs for various use cases using the Java MapReduce API, Pig, and HIVE.
- Designed an ingestion framework using Flume for streaming logs and aggregating data into HDFS. Built a data transformation framework using MapReduce and Pig.
- Hands-on experience with productionizing Hadoop applications (e.g. administration, configuration management, monitoring, debugging, and performance tuning).
- Worked on NoSQL databases including HBase, Cassandra and MongoDB.
- Expertise in enterprise business applications built with Java and J2EE technologies such as Servlets, JSP, JDBC, MVC, EJB, jQuery, Hibernate, the Spring Framework, JavaScript, and Ajax.
- Experienced in web design using HTML, CSS, JavaScript, and AngularJS.
- Proficient in developing MVC based web applications using Spring and Struts frameworks.
- Experienced in implementing various J2EE design patterns such as Singleton, Session Façade, Data Access Object (DAO), Factory, Data Transfer Object (DTO), and Business Delegate in the development of multi-tier distributed applications.
- Excellent communication and interpersonal skills; a strong team player with the ability to work independently.
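As an illustration of the custom HIVE UDF work mentioned above, here is a minimal sketch of such a UDF's core logic. The class and method names are illustrative; in a real deployment the class would extend org.apache.hadoop.hive.ql.exec.UDF and operate on Hadoop Text values, but the per-row evaluate() contract is plain Java:

```java
// Sketch of a custom HIVE-style UDF that normalizes free-text fields.
// In a real HIVE deployment this class would extend
// org.apache.hadoop.hive.ql.exec.UDF and use Hadoop Text objects;
// here the evaluate() contract is shown with plain Java strings.
public class NormalizeUdf {

    // HIVE calls evaluate() once per row; null in, null out.
    public String evaluate(String input) {
        if (input == null) {
            return null;
        }
        // Trim, collapse internal whitespace, and lower-case the value.
        return input.trim().replaceAll("\\s+", " ").toLowerCase();
    }

    public static void main(String[] args) {
        NormalizeUdf udf = new NormalizeUdf();
        System.out.println(udf.evaluate("  Hadoop   DEVELOPER  "));
        // prints: hadoop developer
    }
}
```

In HIVE, the compiled class would be registered with `CREATE TEMPORARY FUNCTION normalize AS 'NormalizeUdf';` and then applied per column in a SELECT.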
TECHNICAL SKILLS
Hadoop Ecosystem: HDFS, MapReduce, YARN, HIVE, Pig, SQOOP, HBase, Flume, Kafka, Spark, Storm, Impala, Oozie, and ZooKeeper.
Languages and APIs: Java, JSP, Servlet, HTML, DHTML, JavaScript, JDBC, JNDI, SOAP, XML, C, C++, SQL, PL/SQL, WSDL (Web Services Description Language), EJB.
Databases: Oracle 12c/11g/10g/9i, SQL Server 2008/2008 R2/2012, MongoDB, Cassandra.
Languages: UNIX shell scripting, Perl scripting.
Environment: Windows XP/Vista/7/8, UNIX, Linux, Sun Solaris.
Developer Tools: Toad, SQL*Plus, AQT.
Schedulers: Crontab, Control-M, Autosys.
Reporting Services: SQL Server Reporting Services, Business Objects.
Frameworks: Hibernate, Spring, Struts, JSF.
Application Servers: WebSphere, JBoss, Tomcat.
Development tools: Eclipse, NetBeans, Visual Studio.
PROFESSIONAL EXPERIENCE
Hadoop Developer
Confidential - Chattanooga, TN
Responsibilities:
- Worked on Hadoop environment with MapReduce, Kafka, SQOOP, Oozie, Flume, HBase, Pig, HIVE and Impala on a multi node environment.
- Gathered business requirements from the business partners and subject matter experts.
- Created HBase tables to load large sets of structured, semi-structured and unstructured data coming from UNIX, NoSQL and a variety of portfolios.
- Supported MapReduce programs running on the cluster and wrote MapReduce jobs using the Java API.
- Installed Hadoop (MapReduce, HDFS) and developed multiple MapReduce jobs in Pig and HIVE for data cleaning and pre-processing.
- Installed Oozie workflow engine to run multiple HIVE and Pig jobs.
- Gained experience in running Hadoop streaming jobs to process terabytes of XML format data.
- Imported and exported data between HDFS/HIVE and relational databases using SQOOP.
- Developed Pig Latin scripts to extract the data from the web server output files to load into HDFS.
- Worked on Cluster coordination services through ZooKeeper.
- Monitored workload, job performance and capacity planning using Cloudera Manager.
- Used Flume to collect and aggregate web log data from sources such as web servers, mobile, and network devices, and pushed it to HDFS.
- Performed data importing from various sources to HBase and Cassandra cluster using Sqoop.
- Created HIVE tables, loaded data into them, and wrote HIVE UDFs.
- Gained knowledge on building Apache Spark applications using Java and Scala.
- Implemented custom Kafka encoders for custom input formats to load data into Kafka partitions.
- Streamed data in real time using Spark with Kafka for faster processing.
- Configured Spark Streaming to receive real-time data from Kafka and stored the stream data to HDFS using Scala.
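The data-cleaning MapReduce jobs listed above generally follow the map/emit-pairs, reduce/aggregate pattern. Below is a minimal plain-Java sketch of that core logic; a production job would extend Hadoop's Mapper and Reducer classes, and the tab-separated input format and all names here are assumptions for illustration:

```java
import java.util.ArrayList;
import java.util.List;
import java.util.Map;
import java.util.TreeMap;

// Sketch of the core map/reduce logic behind a data-cleaning job.
// A real job would extend org.apache.hadoop.mapreduce.Mapper/Reducer;
// here the two phases are shown as plain static methods.
public class EventCountSketch {

    // Map phase: emit one (eventType, 1) pair per well-formed log line.
    // Assumed input format: "timestamp<TAB>eventType<TAB>payload".
    static List<Map.Entry<String, Integer>> map(String line) {
        List<Map.Entry<String, Integer>> out = new ArrayList<>();
        String[] fields = line.split("\t");
        if (fields.length >= 2 && !fields[1].isEmpty()) {
            out.add(Map.entry(fields[1], 1));
        }
        return out; // malformed lines are silently dropped (the cleaning step)
    }

    // Reduce phase: sum the counts for each event type.
    static Map<String, Integer> reduce(List<Map.Entry<String, Integer>> pairs) {
        Map<String, Integer> totals = new TreeMap<>();
        for (Map.Entry<String, Integer> p : pairs) {
            totals.merge(p.getKey(), p.getValue(), Integer::sum);
        }
        return totals;
    }
}
```

In a real cluster the shuffle between the two phases is handled by the framework; this sketch only shows what each phase computes.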
Environment: Hadoop, YARN, Java, MapReduce, HDFS, HBase, HIVE, Pig, Linux, Eclipse, Kafka, Storm, Flume, Cassandra, Scala, Spark, Oozie, ZooKeeper, Cloudera CDH4/5 distribution, SQL Server, and Oracle 12c.
Hadoop Developer/ Java
Confidential, Pittsburg, PA
Responsibilities:
- Experience in defining, designing, and developing Java applications, especially using Hadoop MapReduce by leveraging frameworks such as Cascading and HIVE.
- Developed the Java client API for node provisioning, load balancing and artifact deployment.
- Implemented JMS for asynchronous auditing purposes.
- Wrote MapReduce jobs using Pig Latin.
- Developed workflow using Oozie for running MapReduce jobs and HIVE Queries.
- Worked on Cluster coordination services through ZooKeeper.
- Worked on loading log data directly into HDFS using Flume.
- Involved in loading data from LINUX file system to HDFS.
- Experienced in running Hadoop streaming jobs to process terabytes of XML format data.
- Wrote Pig and HIVE jobs to parse logs and structure them in tabular format to facilitate effective querying of the log data. Also gained hands-on experience with Pig and HIVE user-defined functions (UDFs).
- Extracted and loaded data from HIVE into an RDBMS using SQOOP.
- Integrated the monitoring and production systems with Kafka.
- Optimized existing reports for performance and generated new reports.
- Worked on resolving service requests submitted by the management on a daily basis.
- Used Oracle JDeveloper and SQL Navigator as tools for Java and PL/SQL development.
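The log-structuring work above (parsing raw logs into tabular form so HIVE can query them) can be sketched as follows. This is a hypothetical parser for Apache common-log lines; the field layout is an assumption for illustration, not the project's actual schema:

```java
import java.util.regex.Matcher;
import java.util.regex.Pattern;

// Sketch of the log-structuring step: turn a raw Apache common-log
// line into a tab-separated record suitable for a HIVE external table.
// Field selection and layout are illustrative.
public class LogToTabular {

    private static final Pattern COMMON_LOG = Pattern.compile(
        "^(\\S+) \\S+ \\S+ \\[([^\\]]+)\\] \"(\\S+) (\\S+) [^\"]*\" (\\d{3}) (\\S+)");

    // Returns host<TAB>timestamp<TAB>method<TAB>path<TAB>status, or null
    // for lines that do not match (which a Pig/HIVE job would filter out).
    static String toTsv(String line) {
        Matcher m = COMMON_LOG.matcher(line);
        if (!m.find()) {
            return null;
        }
        return String.join("\t", m.group(1), m.group(2), m.group(3),
                           m.group(4), m.group(5));
    }
}
```

A HIVE external table over the resulting files would then declare the same five tab-delimited columns.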
Environment: Hadoop, MapReduce, HDFS, Java SE 6, Hadoop distribution of Cloudera Manager v4.7.1, HIVE, Pig, HBase, Flume, RabbitMQ, Oozie, PostgreSQL, RHEL 6.3/6.4, CentOS 6.3/6.4, Java Web Services, UNIX/Linux shell scripting, Servlets, XML, HTML, JavaScript, JSP, Hibernate, Oracle 11g, Oracle JDeveloper, SQL Navigator.
Java Developer
Confidential
Responsibilities:
- Developed the application following Agile scrum methodology and project tracking was done using Rally.
- Developed the project using Spring MVC, jQuery, JSP, HTML, CSS, and JavaScript.
- Involved in designing and coding of Controllers for all modules using Spring MVC framework.
- Used the Spring framework for Inversion of Control (IoC)/dependency injection.
- Responsible for developing the UI pages using HTML, DHTML, CSS, JavaScript, JSP, Ajax, JSTL, JSP tag libraries, and custom tags.
- Used Hibernate ORM framework in the Data Access layer to interact with the Oracle database.
- Created Hibernate mapping files to map domain objects to the tables in the database.
- Coded Data Objects, Data Access Objects, and Business Objects in the application.
- Involved in creating JNDI lookup to locate the services/resources running in the middleware server.
- Used Ant build scripts to build and deploy the application on the WebLogic server.
- Wrote unit tests and integration tests using JUnit to thoroughly test the various modules of the application.
- Resolved the bugs in the application by reading the log files.
- Configured and used Log4j for logging.
- Used SVN for version control of source code.
- Developed the application using the Eclipse IDE.
- Provided support for system testing and user acceptance testing.
- Participated in code reviews and design discussions.
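The DAO/DTO layering used in this project can be sketched as below. In the project the concrete DAO used Hibernate sessions against Oracle; here an in-memory stand-in replaces it so the example is self-contained, and all names are illustrative:

```java
import java.util.HashMap;
import java.util.Map;
import java.util.Optional;

// Sketch of the DAO pattern: a transfer object, a DAO interface the
// business layer depends on, and a swappable implementation. In the
// project the implementation was Hibernate-backed; this one is in-memory.
public class DaoSketch {

    // Data Transfer Object carried between the web and service layers.
    record Account(long id, String owner) {}

    interface AccountDao {
        void save(Account account);
        Optional<Account> findById(long id);
    }

    // In-memory stand-in for the Hibernate-backed DAO.
    static class InMemoryAccountDao implements AccountDao {
        private final Map<Long, Account> store = new HashMap<>();
        public void save(Account account) { store.put(account.id(), account); }
        public Optional<Account> findById(long id) {
            return Optional.ofNullable(store.get(id));
        }
    }
}
```

Because callers depend only on the AccountDao interface, the Hibernate implementation can be swapped for this in-memory one in unit tests, which is the point of the pattern.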
Environment: Eclipse IDE, Spring 3.1, Hibernate 4.0.0, J2SE 5.0, Servlets, JSP, JSTL, JNDI, JavaScript, DHTML, HTML, Ajax, XML, LINUX, Windows XP, SVN, JUnit, Log4j, Ant, Unix, WebLogic.
Java Developer
Confidential
Responsibilities:
- Responsible for Functional Specification and System Design based on Business Requirement Document provided by business analyst.
- Actively participated in design and technical discussions.
- Designed and developed client side GUI using JSP, HTML and JavaScript.
- Used core Java concepts and Collections Framework interfaces such as List, Set, Queue, and Map.
- Used JavaScript for validations and for integrating server-side business components on the client side within the browser.
- Used servlets as the interface between the front end and back end, handling HTTP requests and sending responses back to the client.
- Worked on JDBC to connect to MySQL for implementing CRUD operations.
- Responsible for configuring the Apache Tomcat application server to access the database by setting up a data source and a MySQL connection pool.
- Developed the business objects using EJB for calling data access objects.
- Used NetBeans IDE to develop the Application.
- Used SVN for version control of common source code used by developers.
- Built software modules using Apache Ant.
- Used Log4J to capture the log that includes runtime exceptions.
- Performed unit testing and Integration testing.
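The Collections Framework usage mentioned above (List, Set, Queue, and Map working together) can be sketched with a small self-contained example; the request-ID scenario is illustrative, not taken from the project:

```java
import java.util.ArrayDeque;
import java.util.HashMap;
import java.util.LinkedHashSet;
import java.util.List;
import java.util.Map;
import java.util.Queue;
import java.util.Set;

// Sketch of Collections Framework interfaces working together:
// List for ordered input, Set for de-duplication, Queue for FIFO
// processing, Map for counting.
public class CollectionsSketch {

    static Map<String, Integer> countUnique(List<String> requestIds) {
        // LinkedHashSet drops duplicates while preserving arrival order.
        Set<String> unique = new LinkedHashSet<>(requestIds);

        // Queue models FIFO hand-off to a worker.
        Queue<String> toProcess = new ArrayDeque<>(unique);

        // Map records how often each id appeared in the raw list.
        Map<String, Integer> counts = new HashMap<>();
        while (!toProcess.isEmpty()) {
            String id = toProcess.poll();
            int seen = 0;
            for (String raw : requestIds) {
                if (raw.equals(id)) seen++;
            }
            counts.put(id, seen);
        }
        return counts;
    }
}
```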
Environment: J2SE 1.4, EJB, Servlets, JSP, JavaScript, CSS, HTML, XML, NetBeans IDE, SVN, Ant, Apache Tomcat application server, and MySQL.