Hadoop Developer Resume
SUMMARY
- 7 years of overall experience with a strong emphasis on design, development, implementation, testing, and deployment of software applications; hands-on experience with Hadoop, HDFS, MapReduce, the Hadoop ecosystem, ETL, and RDBMS, plus development experience using Java, JSP, HTML, CSS, and JavaScript.
- In-depth understanding of Hadoop architecture and its various components, such as HDFS, JobTracker, TaskTracker, NameNode, DataNode, and MapReduce concepts.
- Highly experienced as a Big Data Engineer with a deep understanding of the Hadoop ecosystem (MapReduce, Pig, Hive, Sqoop, Spark, Storm), HBase, NoSQL, ETL, and RDBMS.
- Prior experience working as a Software Developer in Java/J2EE and related technologies.
- Experience with Hadoop clusters using major Hadoop distributions such as Cloudera, along with the Hue interface.
- Experience in different layers of Hadoop Framework - Storage (HDFS), Analysis (Pig and Hive), Engineering (Jobs and Workflows).
- Good understanding of MapReduce programming and experience in analyzing data using Pig Latin and Hive QL.
- Created Hive tables to store data in HDFS and processed the data using HiveQL.
- Job/workflow scheduling and monitoring with Oozie.
- Expertise in importing/exporting data between HDFS and existing relational databases using Sqoop.
- Managed data extraction for the ETL data warehouse and applied transformation rules as necessary for data consistency.
- Background with traditional databases such as Oracle, SQL Server.
- Experience in designing and coding web applications using Core Java and J2EE technologies (JSP, JDBC), along with Jenkins and GitHub.
- Experience with MRUnit and JUnit.
- Good Knowledge of analyzing data in HBase.
- Experience in object oriented analysis and design (OOAD), unified modeling language (UML) and design patterns.
- Excellent written and verbal communication skills, interpersonal skills, and a self-learning attitude.
- Extensive experience in all phases of Software Development Life Cycle (SDLC) including identification of business needs and constraints, collection of requirements, detailed design, implementation, testing, deployment and maintenance.
- Strong work ethic with desire to succeed and make significant contributions to the organization.
TECHNICAL SKILLS
Big Data (Hadoop Framework): HDFS, MapReduce, Pig, Hive, Flume, Oozie, ZooKeeper, HBase, Sqoop
Databases: Oracle, MySQL
Languages: SQL, Java/JSP, Pig Latin, Hive
OLAP concepts: Data warehousing, Data mining concepts
Development Tools: Eclipse, STS
Web Technologies: JSP, JDBC, HTML, CSS
Operating Systems: Windows 8, Windows 7, macOS, Ubuntu
Web Server: Tomcat
PROFESSIONAL EXPERIENCE
Hadoop Developer
Confidential
Responsibilities:
- Worked on analyzing the Hadoop cluster and different big data analytics tools, including Pig, Hive, and Sqoop; responsible for building scalable distributed data solutions using Cloudera Hadoop.
- Involved in importing and exporting data (SQL Server, Oracle, CSV, and text files) between local/external file systems, RDBMS, and HDFS; loaded log data into HDFS using Flume.
- ETL data cleansing, integration, and transformation using Pig; responsible for managing data from disparate sources.
- Exported the analyzed data to the relational databases using Sqoop for visualization and to generate reports for the BI team.
- Data Warehousing: Designed a data warehouse using Hive, created and managed Hive tables in Hadoop.
- Created various UDFs in Pig and Hive to manipulate the data for various computations; a minimal Java sketch of a Hive UDF follows this list.
- Created MapReduce functions for certain computations.
- Workflow Management: Developed workflow in Oozie to automate the tasks of loading the data into HDFS and pre-processing with Pig.
- Created and maintained Technical documentation for launching Hadoop Clusters and for executing Hive queries and Pig Scripts.
- Used Spark to enhance the performance of the project
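As an illustration of the Pig and Hive UDF work listed above, below is a minimal sketch of a Hive UDF written against the classic org.apache.hadoop.hive.ql.exec.UDF API; the class name and the trim/lower-case normalization logic are hypothetical stand-ins, not code from the actual project.

    import org.apache.hadoop.hive.ql.exec.UDF;
    import org.apache.hadoop.io.Text;

    // Illustrative Hive UDF: normalizes a string column by trimming and lower-casing it
    public class NormalizeText extends UDF {
        public Text evaluate(Text input) {
            if (input == null) {
                return null; // pass NULLs through unchanged
            }
            return new Text(input.toString().trim().toLowerCase());
        }
    }

Packaged into a JAR, a function like this would typically be registered in Hive with ADD JAR and CREATE TEMPORARY FUNCTION before being called from HiveQL.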
Hadoop Developer
Confidential
Responsibilities:
- Involved in moving all log files generated from various sources into Hadoop HDFS using Flume for further processing.
- Created Hive tables to store the processed results in a tabular format.
- Involved in developing Pig scripts.
- Experience in loading and transforming large sets of structured, semi-structured, and unstructured data.
- Exported the analyzed data to the relational databases using Sqoop for visualization and to generate reports for the BI team.
- Analyzed large amounts of data sets to determine optimal way to aggregate and report on it.
- Good knowledge of analyzing data in HBase using Hive and Pig; experienced in defining job flows using Oozie.
- Experienced in managing and reviewing Hadoop log files.
- Used Pig as an ETL tool to perform transformations, event joins, and some pre-aggregations before storing the data in HDFS.
- Responsible for managing data coming from different sources and applications.
- Developed multiple MapReduce jobs in Java for data cleaning and pre-processing; a sketch of such a job follows this list.
- Developed simple to complex MapReduce jobs using Hive and Pig.
- Extended Hive and Pig core functionality by writing custom UDFs.
- Involved in creating Hive tables, loading them with data, and writing Hive queries that run internally as MapReduce jobs.
- Creation of test cases using MRUnit as part of enhancement rollouts.
- Involved in Unit level and Integration level testing.
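The data-cleaning MapReduce jobs mentioned above can be illustrated with the map-only sketch below; the pipe-delimited record format, expected field count, and class names are assumptions made for the example, not details of the original jobs.

    import java.io.IOException;
    import org.apache.hadoop.conf.Configuration;
    import org.apache.hadoop.fs.Path;
    import org.apache.hadoop.io.LongWritable;
    import org.apache.hadoop.io.NullWritable;
    import org.apache.hadoop.io.Text;
    import org.apache.hadoop.mapreduce.Job;
    import org.apache.hadoop.mapreduce.Mapper;
    import org.apache.hadoop.mapreduce.lib.input.FileInputFormat;
    import org.apache.hadoop.mapreduce.lib.output.FileOutputFormat;

    // Map-only job that drops malformed, pipe-delimited log records before further processing
    public class LogCleanJob {

        public static class CleanMapper extends Mapper<LongWritable, Text, NullWritable, Text> {

            private static final int EXPECTED_FIELDS = 5; // assumed record width

            @Override
            protected void map(LongWritable key, Text value, Context context)
                    throws IOException, InterruptedException {
                String line = value.toString().trim();
                // keep only non-empty records with the expected number of fields
                if (!line.isEmpty() && line.split("\\|", -1).length == EXPECTED_FIELDS) {
                    context.write(NullWritable.get(), new Text(line));
                }
            }
        }

        public static void main(String[] args) throws Exception {
            Job job = Job.getInstance(new Configuration(), "log cleaning");
            job.setJarByClass(LogCleanJob.class);
            job.setMapperClass(CleanMapper.class);
            job.setNumReduceTasks(0); // map-only: cleaned records go straight to HDFS
            job.setOutputKeyClass(NullWritable.class);
            job.setOutputValueClass(Text.class);
            FileInputFormat.addInputPath(job, new Path(args[0]));
            FileOutputFormat.setOutputPath(job, new Path(args[1]));
            System.exit(job.waitForCompletion(true) ? 0 : 1);
        }
    }

A mapper like this is also the kind of unit that MRUnit's MapDriver exercises in the enhancement-rollout test cases mentioned above.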
Java/J2EE Web Developer
Confidential
Responsibilities:
- Involved in Design, Development, Testing and Integration of the application.
- Designed pages using JSP.
- Implemented business logic and database connectivity.
- Used simple Struts validation to validate user input as per the business logic and for initial data loading.
- Wrote database queries on Oracle.
- Wrote views, cursors, functions, and triggers in SQL on the back end.
- Ability to quickly adjust priorities and take on projects with limited specifications.
- Worked closely with the testing team to make sure that all features were working properly.
- Co-coordinated Application testing with the help of testing team.
- Used JUnit for unit testing of the application and followed a Test-Driven Development (TDD) methodology; a brief illustrative test follows this list.
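To illustrate the JUnit/TDD item above, here is a small self-contained JUnit 4 test; the AccountValidator class and its ten-digit rule are invented purely for the example.

    import static org.junit.Assert.assertFalse;
    import static org.junit.Assert.assertTrue;
    import org.junit.Test;

    // Hypothetical TDD-style unit test; the validator stands in for real business-logic validation
    public class AccountValidatorTest {

        /** Minimal validator written to make the tests below pass (TDD style). */
        static class AccountValidator {
            boolean isValid(String accountNumber) {
                return accountNumber != null && accountNumber.matches("\\d{10}");
            }
        }

        @Test
        public void rejectsEmptyAccountNumber() {
            assertFalse(new AccountValidator().isValid(""));
        }

        @Test
        public void acceptsTenDigitAccountNumber() {
            assertTrue(new AccountValidator().isValid("1234567890"));
        }
    }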
Jr. Java Developer
Confidential
Responsibilities:
- Involved right from requirement gathering till the deployment phase.
- Developed screens based on jQuery to dynamically generate HTML and display the data on the client side; extensively used JSP tag libraries.
- Developed the application using the Spring Tool Suite (STS).
- Designed and developed the application based on the Spring MVC Framework using the MVC design pattern.
- Used Spring Core for dependency injection/Inversion of Control (IoC); a brief controller sketch follows this list.
- Configured Jenkins for successful deployment to test and production environments.
- Used XML to transfer the application data between client and server.
- Used JUnit to write repeatable tests mainly for unit testing.
- Participated in designing WebService framework in support of the product.
- Deployed the application in various environments: DEV, QA, and Production.
- Used the JDBC for data retrieval from the database for various inquiries.
- Performed cleanup of the application database entries using Oracle 10g.
- Used GIT as source control.
- Generated entity classes and the database schema from them.
- Created Application Property Files and implemented internationalization.
- Followed the Agile development methodology throughout and tested the application in each iteration.
- Wrote complex SQL queries to retrieve data from the Oracle database.
- Was an effective team player and good contributor to the project.
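A brief sketch of the Spring MVC and dependency-injection items above is shown below; the OrderController, OrderService, and the "orders" JSP view are hypothetical names chosen for the example, not parts of the actual application.

    import java.util.List;
    import org.springframework.beans.factory.annotation.Autowired;
    import org.springframework.stereotype.Controller;
    import org.springframework.ui.Model;
    import org.springframework.web.bind.annotation.RequestMapping;
    import org.springframework.web.bind.annotation.RequestMethod;

    // Controller wired through Spring's IoC container via constructor injection
    @Controller
    public class OrderController {

        private final OrderService orderService;

        @Autowired
        public OrderController(OrderService orderService) {
            this.orderService = orderService;
        }

        @RequestMapping(value = "/orders", method = RequestMethod.GET)
        public String listOrders(Model model) {
            model.addAttribute("orders", orderService.findAll());
            return "orders"; // resolved to a JSP view such as /WEB-INF/views/orders.jsp
        }
    }

    // Minimal collaborator types so the sketch is self-contained
    interface OrderService {
        List<Order> findAll();
    }

    class Order { }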
Java Web Developer
Confidential
Responsibilities:
- Involved in the full SDLC (requirements gathering, analysis, design, development, and testing) of an application developed using the Agile methodology.
- Involved in daily Scrum meetings, sprint planning, and estimation of tasks for user stories; participated in retrospectives and presented the demo at the end of each sprint.
- Designed and developed the entire application implementing the MVC architecture.
- Designed the front end using JSP, jQuery, CSS, and HTML as per the provided requirements.
- Used Maven to build and deploy the application on the WebLogic server.
- Implemented object-relational mapping in the persistence layer using the Hibernate framework in conjunction with Spring; a brief entity-mapping sketch follows this list.
- Was actively involved in the designing of the front-end of the web application.
- Used JavaScript for validation of page data in the JSP pages.
- Used Eclipse as the IDE for developing the application
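The Hibernate/Spring object-relational mapping mentioned above can be sketched with a single annotated entity; the Customer entity, table, and column names are illustrative assumptions rather than the project's real schema.

    import javax.persistence.Column;
    import javax.persistence.Entity;
    import javax.persistence.GeneratedValue;
    import javax.persistence.Id;
    import javax.persistence.Table;

    // JPA/Hibernate-mapped entity managed by the persistence layer
    @Entity
    @Table(name = "CUSTOMER")
    public class Customer {

        @Id
        @GeneratedValue // identifier generated by Hibernate
        private Long id;

        @Column(name = "FULL_NAME", nullable = false, length = 100)
        private String fullName;

        public Long getId() { return id; }
        public String getFullName() { return fullName; }
        public void setFullName(String fullName) { this.fullName = fullName; }
    }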