Hadoop Developer Resume
Tampa, FL
OBJECTIVE
- Seeking a challenging position in an organization where I can apply my skills and efficiency toward both organizational and professional growth.
SUMMARY
- Software professional with 11+ years of experience across the Software Development Life Cycle, including hands-on experience in Java/J2EE technologies and Big Data analytics.
- 2+ years of work experience in the storage, querying, processing, and analysis of Big Data, with hands-on experience in the Hadoop ecosystem and its related technologies.
- Hands-on experience writing MapReduce jobs and working with Hive, Sqoop, Oozie, Spark, and J2EE technologies.
- Analyzed large data sets by writing Hive queries.
- Strong understanding of Hadoop architecture and the MapReduce framework.
- Experience in job workflow scheduling and monitoring tools such as Oozie.
- Handled large volumes of data using Big Data technologies such as Hadoop, MapReduce, Hive, Oozie, and Sqoop.
- Expertise in managing, monitoring, and administering multi-node Hadoop clusters with distributions such as Hortonworks HDP.
- Experience Kerberizing Hadoop clusters.
- Experience in Extraction, Transformation, and Loading (ETL) of data in different file formats such as CSV, text, JSON, and ORC.
- Experience with the NoSQL graph database Neo4j.
- Good Knowledge of Data Structures and Algorithms.
- Experience in developing applications using Java/J2EE technologies like Servlets, JSP, EJB, and JDBC.
- Experienced in developing REST services using JAX-RS and SOAP-based web services using JAX-WS.
- Experienced in developing applications using the Spring and Hibernate frameworks.
- Expertise in Maven and Ant as build tools.
- Expertise in Performance analysis with Memory Analyzer Tool and HP Diagnostics.
- Experience with version control tools such as SVN and Git, and hosting platforms such as GitHub.
- Ability to plan, manage, motivate, and work efficiently both independently and in a team.
- Good exposure in overall SDLC including requirement gathering, development, testing, debugging, deployment, documentation and production support.
- Experience in working with software methodologies like Agile and Waterfall.
- Excellent communication and presentation skills, self-motivated, highly committed to responsibilities.
- Ability to quickly grasp any new technologies and concepts.
TECHNICAL SKILLS
Languages: Core Java 1.8, XML, Multi-threading, JDBC, UML
Big Data Technologies: MapReduce, HDFS, Hive, Sqoop, OOZIE, YARN, SPARK
Web Technologies/ APIs: J2EE, JNDI, Servlets, JSP, Spring MVC, XSD, XPATH, JAXB, Web services, REST
Frameworks: Spring 4.0.1 (Core, AOP, JDBC, ORM), Hibernate 4.x, Kafka
RDBMS: Oracle 10g, MySQL
IDEs: Eclipse Galileo/Indigo, IntelliJ IDEA
Servers: JBoss WildFly, Apache Tomcat 6
Caching Framework: JBOSS-CACHE
Version Control: SVN (TortoiseSVN), Git (GitHub)
Continuous Integration: Jenkins
Build Tools: Ant, Maven 2.x
Issue Tracker System: HP ALM
Testing Framework: JUNIT, MR Unit
Performance Tools: HP Diagnostic Tools, Memory Analyzer Tool, SOAP UI
Operating System: Windows 7, UNIX, Linux
NoSQL Technologies: Neo4j
PROFESSIONAL EXPERIENCE
Confidential, Tampa, FL
Hadoop Developer
Responsibilities:
- Responsible for building scalable distributed data solutions using Hadoop.
- Developed MapReduce jobs and Hive jobs to summarize and transform data.
- Handled importing data from different data sources into HDFS using Sqoop, performing transformations using Hive and MapReduce, and then loading the data into HDFS.
- Scheduled the Oozie workflow engine to run multiple Hive and MapReduce jobs, which run independently based on time and data availability.
- Wrote custom action executors in Oozie.
- Experience customizing the MapReduce framework at different levels, such as input formats, data types, combiners, and partitioners.
- Configured, maintained, and monitored a Hadoop cluster using the Hortonworks distribution.
- Developed Spark scripts using Spark SQL to access Hive tables in Spark for faster data processing.
- Fixed defects as needed during the QA phase, supported QA testing, troubleshot defects, and identified the source of defects.
Environment: Hadoop, Hortonworks, HDFS, Hive, Spark, Sqoop, Java, Oozie, Eclipse, MySQL.
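The MapReduce/Hive summarization work above follows a map-group-reduce pattern. A minimal sketch in plain Java (stdlib collections only, not the Hadoop API; class and data are illustrative, not from the actual project):

```java
import java.util.Arrays;
import java.util.List;
import java.util.Map;
import java.util.TreeMap;
import java.util.stream.Collectors;

// Illustrative sketch of the map/shuffle/reduce pattern behind a
// MapReduce or Hive GROUP BY summarization, using plain collections.
public class EventCountSketch {

    // "Map" phase: treat each record as a (key, 1) pair keyed by event type.
    // "Reduce" phase: sum the counts per key (TreeMap gives a stable order).
    public static Map<String, Long> countByType(List<String> events) {
        return events.stream()
                .collect(Collectors.groupingBy(e -> e, TreeMap::new,
                        Collectors.counting()));
    }

    public static void main(String[] args) {
        List<String> events = Arrays.asList("click", "view", "click", "click");
        System.out.println(countByType(events)); // prints {click=3, view=1}
    }
}
```

In a real Hadoop job the grouping and summing would be split across Mapper and Reducer classes; the per-key aggregation logic is the same.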
Confidential
Hadoop Developer
Responsibilities:
- Managed and scheduled batch jobs on a Hadoop cluster using Oozie.
- Developed custom MapReduce programs for data analysis.
- Installed and configured various components of the Hadoop ecosystem.
- Wrote Hive queries and enhanced Hive performance using compression techniques.
- Implemented techniques for efficient execution of Hive queries, such as map joins.
- Dumped database data into HDFS using Sqoop.
- Fixed defects as needed during the QA phase, supported QA testing, troubleshot defects, and identified the source of defects.
- Prepared a High-Level Design document to give an overall picture of the system integration.
- Wrote acceptance criteria and unit test cases.
- Mentored junior team members and performed code reviews.
- Followed Agile methodology, including daily scrum meetings, sprint planning, and story creation.
Environment: Hadoop, Hortonworks, HDFS, Hive, Spark, Sqoop, Java, Oozie, IntelliJ IDEA, MySQL, XML, XSD.
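Hive compression typically relies on codecs such as Snappy or the ORC format's built-in ZLIB; the underlying compress/decompress round trip can be sketched with the JDK's GZIP streams (a stdlib stand-in, not the actual Hive codec):

```java
import java.io.ByteArrayInputStream;
import java.io.ByteArrayOutputStream;
import java.io.IOException;
import java.io.UncheckedIOException;
import java.nio.charset.StandardCharsets;
import java.util.zip.GZIPInputStream;
import java.util.zip.GZIPOutputStream;

// Illustrative sketch of the compression round trip behind compressed
// Hive storage (GZIP here; Hive would use e.g. Snappy or ZLIB in ORC).
public class CompressionSketch {

    public static byte[] compress(String text) {
        ByteArrayOutputStream out = new ByteArrayOutputStream();
        try (GZIPOutputStream gz = new GZIPOutputStream(out)) {
            gz.write(text.getBytes(StandardCharsets.UTF_8));
        } catch (IOException e) {
            throw new UncheckedIOException(e);
        }
        return out.toByteArray();
    }

    public static String decompress(byte[] data) {
        try (GZIPInputStream gz =
                new GZIPInputStream(new ByteArrayInputStream(data))) {
            return new String(gz.readAllBytes(), StandardCharsets.UTF_8);
        } catch (IOException e) {
            throw new UncheckedIOException(e);
        }
    }

    public static void main(String[] args) {
        String row = "2017-01-01,click,user42"; // hypothetical record
        System.out.println(decompress(compress(row)).equals(row)); // prints true
    }
}
```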
Confidential
Senior Java Developer
Responsibilities:
- Involved in the high-level design, writing UML diagrams and the HLD document.
- Implemented the business logic using Core Java and J2EE concepts.
- Wrote RESTful services using JAX-RS.
- Applied object-oriented concepts (inheritance, composition, interfaces, etc.) and design patterns (singleton, strategy, etc.).
- Applied the Spring IoC container to facilitate dependency injection.
- Used Neo4j APIs to interact with Neo4j databases.
- Ensured that the developed code met quality standards using Sonar.
- Performed JUnit testing and used SOAP UI to test REST APIs.
- Debugged the code, analyzed defects, and applied fixes.
- Mentored junior team members and reviewed code.
- Wrote Cypher queries.
- Developed consumer code to receive SNMP alarms from Kafka.
Environment: Java/J2EE, Neo4j, RESTful, XML, Kafka, XSD, Eclipse, SOAP UI, JBoss, UNIX/Windows.
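The strategy pattern named above can be sketched in plain Java; here the interchangeable behavior is how an incoming alarm is formatted for downstream consumers (all names are illustrative, not from the actual project):

```java
// Illustrative sketch of the strategy pattern: the alarm-handling
// behavior is selected at runtime rather than hard-coded.
public class AlarmStrategySketch {

    // Strategy interface: one interchangeable behavior per implementation.
    interface AlarmFormatter {
        String format(String alarm);
    }

    // Two concrete strategies, expressed as lambdas.
    static final AlarmFormatter PLAIN = alarm -> alarm;
    static final AlarmFormatter TAGGED = alarm -> "[SNMP] " + alarm;

    // The caller picks the strategy; handle() stays unchanged.
    static String handle(String alarm, AlarmFormatter formatter) {
        return formatter.format(alarm);
    }

    public static void main(String[] args) {
        System.out.println(handle("linkDown", TAGGED)); // prints [SNMP] linkDown
    }
}
```

Swapping `TAGGED` for `PLAIN` (or any new formatter) changes behavior without touching `handle()`, which is the point of the pattern.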
Confidential
Senior Java Developer
Responsibilities:
- Actively involved in the entire application life cycle including design, development, debugging and testing of the system.
- Used various Core Java concepts such as multithreading, exception handling, and Collection APIs to implement various features and enhancements.
- Created HLD and LLD documents for projects.
- Developed REST-based and SOAP-based web services.
- Designed various modules of the application using Spring Core.
- Developed a persistence framework using Hibernate.
- Wrote JUnit test cases.
- Used Maven as the build-automation tool and Jenkins for continuous integration of the project.
- Involved in various internal releases of the application and supported the application modules during testing and production phases.
Environment: Java, Spring Core, Tomcat, TortoiseSVN, Hibernate, Oracle, JUnit, SOAP, XML, SQL Developer
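The Core Java multithreading work above can be sketched with an `ExecutorService`: independent tasks are submitted to a thread pool and their results combined (a minimal stdlib example; the task itself is illustrative):

```java
import java.util.List;
import java.util.concurrent.ExecutionException;
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import java.util.concurrent.Future;
import java.util.stream.Collectors;

// Illustrative sketch of fan-out/fan-in multithreading with a thread pool.
public class ParallelSumSketch {

    public static int parallelSum(List<Integer> values) {
        ExecutorService pool = Executors.newFixedThreadPool(4);
        try {
            // Fan out: submit one task per value (here, doubling it).
            List<Future<Integer>> futures = values.stream()
                    .map(v -> pool.submit(() -> v * 2))
                    .collect(Collectors.toList());
            // Fan in: block on each Future and combine the results.
            int total = 0;
            for (Future<Integer> f : futures) {
                total += f.get();
            }
            return total;
        } catch (InterruptedException | ExecutionException e) {
            throw new RuntimeException(e);
        } finally {
            pool.shutdown();
        }
    }

    public static void main(String[] args) {
        System.out.println(parallelSum(List.of(1, 2, 3))); // prints 12
    }
}
```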
Confidential
Java/J2EE Developer
Responsibilities:
- Used various Core Java concepts such as multithreading, exception handling, and Collection APIs to implement various features and enhancements.
- Used the dependency injection feature of the Spring MVC framework and the O/R mapping tool Hibernate for rapid development and ease of maintenance.
- Prepared the functional specification and acceptance criteria.
- Wrote JUnit test cases.
- Used Maven as the build-automation tool.
- Met the requirements as per the road map of the project.
Environment: Java, Spring Core, Spring MVC, TortoiseSVN, Hibernate, Oracle, JUnit, SOAP, XML, SQL Developer
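The idea behind the Spring dependency injection mentioned above can be sketched without the framework: collaborators are passed in through the constructor instead of being constructed internally, and the "wiring" that Spring's IoC container automates is done by hand (all names are illustrative):

```java
// Illustrative sketch of constructor-based dependency injection,
// the pattern Spring's IoC container automates.
public class GreetingServiceSketch {

    // The dependency, defined as an interface so it can be swapped or mocked.
    interface Repository {
        String find(int id);
    }

    static class GreetingService {
        private final Repository repo; // injected, never constructed here

        GreetingService(Repository repo) {
            this.repo = repo;
        }

        String greet(int id) {
            return "Hello, " + repo.find(id);
        }
    }

    public static void main(String[] args) {
        // Manual wiring; Spring would do this from configuration instead.
        GreetingService service = new GreetingService(id -> "user" + id);
        System.out.println(service.greet(7)); // prints Hello, user7
    }
}
```

Because `GreetingService` depends only on the interface, unit tests can inject a stub `Repository`, which is what makes DI-style code easy to test.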
Confidential
Java/ J2EE Developer
Responsibilities:
- Involved in the development of the product.
- Implemented J2EE standards and MVC architecture using the JSF framework and SEAM.
- Wrote the DAO layer using Hibernate.
- Involved in debugging the application for any existing issues.
- Used Core Java concepts to implement the business logic.
- Wrote test cases and documented the work done.
Environment: Java, Eclipse, Oracle, JUnit, Hibernate, JSF, SEAM, Tomcat.
Confidential
Java/ J2EE Developer
Responsibilities:
- Studied and analyzed the business process flow and the existing system.
- Developed applications using WLP.
- Performed Java and Quartz programming.
- Wrote test cases and documentation.
Environment: Java, Eclipse, Quartz, Oracle, WebLogic