Hadoop Developer Resume Profile
Summary of Experience
- 8 years of professional experience in IT, including 2 years of hands-on experience with Big Data and Hadoop ecosystem components
- Experience in writing MapReduce programs using Apache Hadoop for analyzing Big Data.
- Hands-on experience in writing ad-hoc queries for moving data from HDFS to Hive and analyzing the data using HiveQL.
- Experience in importing and exporting data using Sqoop between relational database systems and HDFS.
- Experience in writing Hadoop jobs for analyzing data using Pig Latin scripts.
- Knowledge of technologies such as Apache Flume, Apache Spark, Mahout, and YARN.
- Good knowledge of analyzing data in HBase using Hive and Pig.
- Proficient knowledge of BI tools such as OBIEE and SSRS.
- Experience in administrative tasks such as installing Hadoop and its ecosystem components, including Hive and Pig, in distributed mode.
- Passionate about working in Big Data and analytics environments.
- Extensive experience with SQL, PL/SQL, shell scripting, and database concepts.
- Experience in developing applications using Java/J2EE technologies.
- Extensive knowledge of Java, J2EE, Servlets, JSP, JDBC, Struts, and the Spring Framework.
- Familiar with writing SQL queries and stored procedures and implementing them on Oracle 10g.
- Proficient in using ZooKeeper and Oozie operational services for coordinating the cluster and scheduling workflows.
- Exposed to all stages of the Software Development Life Cycle (SDLC), with a thorough understanding of the software testing process. Proficient in the use of tools such as IBM Rational Tools, Mercury Quality Center, and QuickTest Professional.
- Strong knowledge of standard industry methodologies such as the Software Development Life Cycle (SDLC), the iterative software development life cycle process, and the Rational Unified Process (RUP), along with the Rational tools used during the various phases of RUP.
- Actively participated in organizing and training for User Acceptance Testing (UAT) for end-to-end product software releases.
- Expertise in white-box, gray-box, and black-box testing.
- Practice Agile development frameworks and standards, including Test-Driven Development.
- Highly motivated self-starter with a positive attitude, a willingness to learn new concepts, and an acceptance of challenges.
- Major strengths include familiarity with multiple software systems, the ability to learn new technologies quickly and adapt to new environments, and excellent interpersonal, technical, and communication skills as a self-motivated, focused, and quick learner.
Skill Matrix
Languages | Java, JavaScript, HTML, XML, XSD, XSL, Web Services, MapReduce, Pig, Sqoop, Hive, Oozie, YARN, HBase, Flume, Mahout, Spark. |
J2EE Technologies | JSP, Servlets, JDBC and EJB |
Servers | IBM WebSphere Application Server 7.0, WebLogic and Tomcat |
Frameworks | IBM EAD4J Framework, Struts, Spring, Hibernate, Hadoop. |
Java IDEs | RSA 8.0.x, RTC, RAD, Eclipse. |
Version Control / Tracking Tools | RTC, Rational ClearCase, Rational ClearQuest, Rational Portfolio Manager, Rational RequisitePro, Rational Quality Manager, SVN, CVS, Visual SourceSafe (VSS) |
Databases | DB2 9.x, Oracle, SQL (DDL, DML, DCL) and PL/SQL. |
Design Skills | J2EE design patterns, Object-Oriented Analysis and Design (OOAD), UML. |
Operating Systems | Windows 7, Windows XP/2000/2003, UNIX and Linux |
Experience
Hadoop Developer
Confidential
Responsibilities:
- Installed and configured Hadoop MapReduce and HDFS.
- Developed multiple MapReduce jobs in Java for data cleaning and preprocessing.
- Imported and exported data into HDFS and Hive using Sqoop.
- Experienced in defining job flows and in managing and reviewing Hadoop log files.
- Extracted files from MongoDB through Sqoop, placed them in HDFS, and processed them.
- Experienced in running Hadoop streaming jobs to process terabytes of XML-format data.
- Loaded and transformed large sets of structured, semi-structured, and unstructured data.
- Responsible for managing data coming from different sources.
- Supported MapReduce programs running on the cluster.
- Involved in loading data from the UNIX file system to HDFS.
- Installed and configured Hive and wrote Hive UDFs.
- Involved in creating Hive tables, loading them with data, and writing Hive queries that run internally as MapReduce jobs.
- Working knowledge of modifying and executing UNIX shell scripts. Involved in web testing using SoapUI for different member and provider portals.
- Involved in building and maintaining test plans, test suites, test cases, defects and test scripts using RQM.
- Conducted functional, system, data and regression testing.
- Involved in Bug Review meetings and participated in weekly meetings with management team.
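One bullet above mentions writing Hive UDFs for data cleaning. The sketch below shows, in plain Java, the kind of cleaning logic such a UDF's `evaluate()` method might carry; the class name and the specific cleaning rule are illustrative, and in a real deployment the class would extend `org.apache.hadoop.hive.ql.exec.UDF` and operate on Hadoop `Text` objects rather than plain `String`s.

```java
// Illustrative sketch of a data-cleaning Hive UDF's core logic.
// A real UDF would extend org.apache.hadoop.hive.ql.exec.UDF and
// accept/return Hadoop Text; plain String keeps this self-contained.
public class TrimAndNormalize {
    // Hypothetical cleaning rule: trim whitespace, collapse internal
    // runs of spaces, and upper-case the value; return null for empty
    // input so Hive can treat it as missing data.
    public static String evaluate(String raw) {
        if (raw == null) return null;
        String cleaned = raw.trim().replaceAll("\\s+", " ");
        return cleaned.isEmpty() ? null : cleaned.toUpperCase();
    }
}
```

Once compiled into a jar and registered with `ADD JAR` / `CREATE TEMPORARY FUNCTION`, such a function can be called directly from HiveQL like any built-in.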
Environment:
Java 6, Eclipse, Linux, Hadoop, HBase, Sqoop, Pig, Hive, Informatica, Cognos, MapReduce.
HADOOP DEVELOPER
Confidential
Responsibilities:
- Involved in architecture design, development and implementation of Hadoop deployment, backup and recovery systems.
- Worked on the proof of concept (POC) for initiating the Apache Hadoop framework.
- Worked on numerous POCs to prove whether Big Data was the right fit for a business case.
- Developed MapReduce jobs for Log Analysis, Recommendation and Analytics.
- Wrote MapReduce jobs to generate reports on the number of activities created on a particular day; input data was dumped from multiple sources, and the output was written back to HDFS.
- Reviewed the HDFS usage and system design for future scalability and fault tolerance. Installed and configured Hadoop HDFS, MapReduce, Pig, Hive, and Sqoop.
- Wrote Pig Scripts to generate MapReduce jobs and performed ETL procedures on the data in HDFS.
- Processed HDFS data and created external tables using Hive, in order to analyze visitors per day, page views and most purchased products.
- Exported analyzed data from HDFS using Sqoop for generating reports.
- Used MapReduce and Sqoop to load, aggregate, store and analyze web log data from different web servers.
- Developed Hive queries for the analysts.
- Experienced in optimizing MapReduce algorithms using combiners and partitioners to deliver the best results, and worked on application performance optimization for an HDFS/Cassandra cluster.
- Performed black-box testing for the web-based application, an interface to the mainframe.
- Created and Executed Automation Test Scripts for Functional and Regression Testing.
- Prepared status reports and defect reports for daily status meetings and sent them to all project stakeholders.
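The log-analysis jobs above boil down to per-record extract-and-count logic. A minimal plain-Java sketch of that core is below; in the real job, `map()` would emit `(date, 1)` pairs and a combiner would pre-aggregate, with a `HashMap` standing in for that here. The log format and field positions are assumptions.

```java
import java.util.HashMap;
import java.util.Map;

// Sketch of the per-record logic behind a log-analysis MapReduce job:
// extract a date key from each log line and count "activity created"
// events per day. A HashMap stands in for the shuffle + combiner.
public class ActivityCounter {
    // Assumes a hypothetical space-delimited log format whose first
    // field is an ISO date, e.g. "2013-04-02 user42 CREATE ...".
    public static Map<String, Integer> countPerDay(Iterable<String> lines) {
        Map<String, Integer> counts = new HashMap<>();
        for (String line : lines) {
            String[] fields = line.split("\\s+");
            // Skip malformed lines and non-CREATE events.
            if (fields.length < 3 || !"CREATE".equals(fields[2])) continue;
            counts.merge(fields[0], 1, Integer::sum);
        }
        return counts;
    }
}
```

In the distributed version, the same extraction runs in the mapper and the `merge` step becomes the combiner/reducer, which is what makes combiners effective for this shape of aggregation.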
Environment:
MapReduce, Hive, Pig, Sqoop, Oracle, MapR, Informatica, MicroStrategy, Cloudera Manager, Oozie, ZooKeeper.
JAVA DEVELOPER
Confidential
Responsibilities:
- Responsible for Business Analysis and Requirements Collection.
- Translated requirements into business rules and made recommendations for innovative IT solutions.
- Documented data conversion, integration, load and verification specifications.
- Involved in many phases of SDLC like requirement gathering, designing, and development of the application.
- Understood the Solution Architecture (SA) document and prepared the high-level design (HLD) and detail-level design (DLD) documents.
- Used Hibernate to create an object-relational model with Java objects that are mapped to the database objects in Oracle using XML mapping files.
- Worked on Hibernate configuration setup and defined .hbm files. Developed Java classes as the business logic and data access layers to communicate with the Hibernate API.
- Coded and developed front end user interfaces using the Struts framework, JSP and XSLT.
- Created and maintained the configuration of the Spring MVC Framework IoC container for module services in order to access the unified API's of these modules.
- Developed Message-Driven Beans (MDBs) for receiving and processing data from IBM MQSeries.
- Provided ANT build script for building and deploying the application and deployed the application in Websphere Application Server.
- Exposed APIs through web services using SOAP, UDDI, and WSDL, and wrote web services to expose the business methods to external services.
- Prepared unit Test case using JUNIT framework. Created Unit Test cases for developed functionality and performed Unit testing.
- Designed and developed SQL queries, views, and stored procedures, with SQL Server 2005 as the relational database.
- Worked with the team lead to coordinate the test case turnover process preceding each cycle and selectively executed the test cases to verify that they were kept current.
- Performed integration and regression testing on Intranet Application
Environment: Java 1.5, WebSphere 6.1, Eclipse, DB2, EJB 2, JSP, XSLT, Struts 1.2, Spring 1.x, Hibernate, Web Services, XML Beans, XML, XML Schemas, SQL Server 2005, LDAP, UML, Windows, Linux.
JAVA DEVELOPER
Confidential
Responsibilities
- Developed UI using JSP.
- Developed controller using Struts Action Class.
- Developed Model using Java Beans as POJO.
- Developed DAO, DTO, and helper classes.
- Worked on multi-threading utilities.
- Developed the data source and connection pool.
- Was responsible for developing, deploying, and testing components on Oracle 9i Application Server.
- Involved in requirement analysis and design of the data model.
- Wrote SQL queries in Oracle.
- Was responsible for developing testing components.
- Performed Unit testing, System Testing and Integration Testing.
- Prepared Use case, Class and Sequence diagrams using Rational Rose tool.
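The DAO/DTO split mentioned above can be sketched as follows. Class names are illustrative, and an in-memory map stands in for the Oracle-backed JDBC implementation to keep the example self-contained; the point is the separation of the data carrier (DTO) from the single access point to storage (DAO).

```java
import java.util.HashMap;
import java.util.Map;
import java.util.Optional;

// Illustrative DAO/DTO pattern. The real DAO issued SQL against
// Oracle via JDBC; this in-memory version is a self-contained sketch.
public class UserDaoSketch {
    // DTO: a plain, immutable carrier of row data, no behavior.
    public static final class UserDto {
        public final int id;
        public final String name;
        public UserDto(int id, String name) { this.id = id; this.name = name; }
    }

    // DAO: the single place that knows how rows are fetched/stored.
    public interface UserDao {
        Optional<UserDto> findById(int id);
        void save(UserDto user);
    }

    // Test double / sketch implementation backed by a map.
    public static class InMemoryUserDao implements UserDao {
        private final Map<Integer, UserDto> rows = new HashMap<>();
        public Optional<UserDto> findById(int id) {
            return Optional.ofNullable(rows.get(id));
        }
        public void save(UserDto user) { rows.put(user.id, user); }
    }
}
```

Because callers depend only on the `UserDao` interface, the JDBC implementation and this in-memory one are interchangeable, which is also what makes the business layer unit-testable without a database.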
Environment:
Java, JSP, Servlets, Struts, Hibernate 3.0, XML, Eclipse IDE, Oracle
JAVA DEVELOPER
Confidential
Responsibilities:
- Redesigned the existing web application with new technologies.
- Designed and developed the new technical flow.
- Interacted daily with the client's on-site team.
- Involved in creating UI screens using JSP and controllers using annotations at the method and class level.
- Developed DAOs for communicating with the database.
- Used Hibernate DAO support for performing queries and handled transactions using Spring annotations.
- Involved in writing HQL queries, criteria queries, and named queries in DAOs.
- Involved in developing Spring IoC (Inversion of Control), DAO, and MVC components.
- Defined business facades to handle multiple calls to the DAOs.
- Defined custom exceptions at the business layer and DAO layer.
- Used XML to configure mapping files and defined DTOs for named queries.
- Used Spring support for RESTful web services to communicate with the host machine for agreement forms.
- Used the @Autowired annotation for handling dependent objects.
- Used Hibernate as the ORM tool and defined the mapping and relationship of each database table to a Java object.
- Created test cases for DAOs and web services.
- Used annotations to load the config file to test the components.
- Used maven to acquire the dependencies and build the application.
- Used perforce as a version control system.
- Worked in an agile environment.
- Actively participated in scrum meetings and updated Rally with the tasks and the time spent on each task.
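The layered custom exceptions mentioned above typically follow one pattern: the DAO layer wraps the low-level cause, and the business layer rethrows it under its own type so callers never see persistence details. A minimal sketch with illustrative class names:

```java
// Illustrative layered exceptions: DAO-layer failures are wrapped,
// then translated at the business layer into a business-level type.
public class LayeredExceptions {
    public static class DaoException extends RuntimeException {
        public DaoException(String msg, Throwable cause) { super(msg, cause); }
    }
    public static class BusinessException extends RuntimeException {
        public BusinessException(String msg, Throwable cause) { super(msg, cause); }
    }

    // Hypothetical business facade method: the DAO failure is simulated
    // here so the translation step is visible without a real database.
    public static String loadAgreement(int id) {
        try {
            throw new DaoException("row " + id + " unreadable",
                                   new IllegalStateException("connection lost"));
        } catch (DaoException e) {
            // Translate; keep the cause chain intact for diagnostics.
            throw new BusinessException("could not load agreement " + id, e);
        }
    }
}
```

Preserving the cause chain (`getCause()`) is the key design choice: callers handle one business-level type, while logs still show the full persistence-layer failure.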
Environment:
Oracle 11g, Java 1.6, J2EE, JSP, Spring MVC, Spring ORM, Spring IoC, Spring Web Services, Hibernate, RESTful Web Services, JBoss EAP 6, JavaScript, HTML, CSS, jQuery, AJAX, Informatica PowerCenter 8.6, SSRS
JAVA DEVELOPER
Confidential
Responsibilities
- Involved in high-level and low-level design of application.
- Developed presentation components for the admin console using JSP, HTML 5, CSS 3, jQuery, JavaScript, and AJAX.
- Created a few database views.
- Performed backend testing to validate data entered into the database.
- Created weekly status reports describing the status of the testing process.
- Managed the functional specifications, development, and user acceptance testing.
- Wrote automated UNIX shell scripts to automate the running of regular jobs.
- Used Log4j to implement logging and generate audit, debug, and error logs.
- Involved in developing JDBC DAOs and DTOs and accessing advanced SQL and PL/SQL stored procedures on database systems using Spring templates.
- Developed test cases using JUnit for unit testing and created test cases for unit, integration, and UAT testing.
- Worked with the web design team to apply CSS styles and JavaScript.
- Used Eclipse IDE for the development of presentation, service and DAO layers.
- Involved in requirement gathering/analysis with product owners.
- Developed application in agile environment and participated in daily scrum meetings.
- Participated in sprint planning, story-point estimation, show-and-tell sessions, and sprint retrospective meetings.
Environment: JDK 1.6, Spring 3, Hibernate 3, MySQL, jQuery, XML, XSLT, XPath, CSS 3, Apache Ant 1.7, Eclipse STS, Subversion, Informatica, Web Services.
INTERN/JR.JAVA DEVELOPER
Confidential
Responsibilities:
- Worked with AJAX to develop an interactive web application and with JavaScript for data validations.
- Developed the application under the JEE architecture; designed and developed dynamic, browser-compatible user interfaces using JSP, Custom Tags, HTML, CSS, and JavaScript. Deployed and maintained the JSP and Servlet components on WebLogic 8.0.
- Involved in the analysis, design, development, and testing phases of the Software Development Life Cycle (SDLC).
- Performed unit testing using JUnit framework.
- Designed Use Case Diagrams, Class Diagrams and Sequence Diagrams.
- Designed and implemented Business Delegate, Session Facade and DTO Design Patterns.
- Involved in implementing the DAO pattern for database connectivity and used the JDBC API extensively.
- Involved in the Client Interaction during various stages of the project.
- Developed the application server's persistence layer using JDBC and SQL.
- Used JDBC to connect the web applications to databases.
- Performed requirement analysis and estimation for tasks.
- Actively handled all client interaction for the functionality.
- Handled the technical issues of the team.
- Analyzed and fixed defects and deployed new patch releases on the production servers on a monthly basis.
- Participated in code reviews.
- Performed unit testing and executed system test cases.
- Implemented test-first development using the JUnit unit-testing framework.
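The test-first flow above starts from a failing check and then writes the smallest code that passes it. JUnit was used on the project; the sketch below uses plain Java so it stays dependency-free, and the validation rule itself is a hypothetical example, not code from the project.

```java
// Test-first sketch: the checks (written first) pin down the rule,
// then the implementation is the smallest code that satisfies them.
// The ZIP-code rule here is purely illustrative.
public class ZipValidator {
    // Accepts 5-digit US ZIP codes, optionally followed by "-NNNN".
    public static boolean isValidZip(String zip) {
        return zip != null && zip.matches("\\d{5}(-\\d{4})?");
    }
}
```

In JUnit 4 the assertions below would live in `@Test` methods using `assertTrue`/`assertFalse`; the red-green rhythm is the same either way.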
Environment:
Struts 1.3, Spring Framework, Hibernate 3.5, JSP, Oracle 9i, AJAX, Ant 1.7, JDBC, JavaScript, HTML, CSS, RAD, WebLogic, Web Services (JAX-WS), Apache Axis, JUnit 4.2.