Hadoop Developer Resume
East Providence, RI
SUMMARY
- Seven years of experience in IT, including two years with Big Data ecosystem technologies.
- Experience in installation, configuration, management and deployment of Big Data solutions and the underlying infrastructure of Hadoop Cluster.
- Good understanding/knowledge of Hadoop Architecture.
- Hands-on experience with major components of the Hadoop ecosystem, including Hive, HBase, HBase-Hive integration, Pig, Sqoop, and Flume, plus knowledge of the MapReduce/HDFS framework.
- Set up standards and processes for Hadoop based application design and implementation.
- Experience in analyzing data using HiveQL, Pig Latin, and custom MapReduce programs in Java; extended Hive and Pig core functionality with custom UDFs.
- Good experience in analysis using Pig and Hive, and a working understanding of Sqoop.
- Experienced in developing MapReduce programs using Apache Hadoop for working with Big Data.
- Experience in designing, developing, and implementing connectivity products that allow efficient exchange of data between the core database engine and the Hadoop ecosystem.
- Worked on NoSQL databases including HBase.
- Experience in importing and exporting data using Sqoop between HDFS and relational database systems/mainframes.
- Hands on experience in application development using Java, RDBMS, and Linux shell scripting.
- Diverse experience utilizing Java tools in business, web, and client-server environments, including the Java Platform, J2EE, EJB, JSP, Java Servlets, Struts, and Java Database Connectivity (JDBC) technologies.
- Good knowledge of ETL concepts and hands-on ETL experience.
- Solid background in object-oriented analysis and design (OOAD); well versed in design patterns, UML, and Enterprise Application Integration (EAI).
- Major strengths include familiarity with multiple software systems and the ability to learn new technologies quickly and adapt to new environments; a self-motivated, focused team player and quick learner with excellent interpersonal, technical, and communication skills.
- Strong work ethic and the ability to work efficiently in a team, with good leadership skills.
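The custom UDF work above generally wraps a small pure function. A stdlib-only sketch of the kind of evaluate() logic a Hive string UDF might contain (the class name and normalization rule are illustrative; a real UDF would extend Hive's UDF base class from the hive-exec jar):

```java
// Sketch of the logic a custom Hive UDF typically wraps.
// A real UDF extends org.apache.hadoop.hive.ql.exec.UDF (hive-exec jar)
// and exposes this as evaluate(); the normalization rule is illustrative.
class NormalizeUdfSketch {
    static String evaluate(String input) {
        if (input == null) return null;  // Hive UDFs must be null-safe
        // lowercase and collapse runs of whitespace to a single space
        return input.trim().toLowerCase().replaceAll("\\s+", " ");
    }

    public static void main(String[] args) {
        System.out.println(evaluate("  Big   DATA ")); // big data
    }
}
```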
TECHNICAL SKILLS
Big Data/Hadoop: HDFS, MapReduce, Hive, Pig, HBase, Sqoop
Java Technologies: Core Java, I18N, JFC/Swing, JavaBeans
Methodologies: Agile, UML, Design Patterns (Core Java and J2EE).
Programming Languages: C, C++, Java, Linux shell scripts.
Database: Oracle 11g/10g/9i, MySQL, MS-SQL Server, Teradata.
Web Servers: WebLogic, WebSphere, Apache Tomcat.
Web Technologies: HTML, XML, JavaScript, AJAX, SOAP, WSDL.
PROFESSIONAL EXPERIENCE
Confidential, Phoenix, AZ
Hadoop Developer
Responsibilities:
- Analyzed data on the Hadoop cluster using big data analytic tools including Pig, Hive, and MapReduce
- Collected and aggregated large volumes of log data using Apache Flume and staged the data in HDFS for further analysis
- Worked on debugging and performance tuning of Hive and Pig jobs
- Created HBase tables to store various formats of PII data coming from different portfolios
- Implemented test scripts to support test driven development and continuous integration
- Worked on performance tuning of Pig queries
- Involved in loading data from the Linux file system to HDFS
- Imported and exported data into HDFS and Hive using Sqoop
- Processed unstructured data using Pig and Hive
- Ran Hadoop streaming jobs to process terabytes of XML-format data
- Supported MapReduce programs running on the cluster
- Gained experience in managing and reviewing Hadoop log files
- Scheduled multiple Hive and Pig jobs using the Oozie workflow engine
- Developed Pig Latin scripts to extract data from the web server output files to load into HDFS
- Extensively used Pig for data cleansing.
- Created and maintained technical documentation for launching Hadoop clusters and for executing Hive queries and Pig scripts
- Implemented SQL, PL/SQL Stored Procedures
- Tested the various ETL processes that were developed.
- Actively involved in code review and bug fixing for improving the performance.
- Developed screens using JSP, DHTML, CSS, AJAX, JavaScript, Struts, Spring, Java, and XML
Environment: Hadoop, HDFS, Pig, Hive, MapReduce, Sqoop, Linux, Cloudera, Big Data, Java APIs, Java Collections, SQL, AJAX.
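The MapReduce responsibilities above follow the usual map/shuffle/reduce contract. A minimal stdlib-only sketch of that contract for a log-level count (plain Java collections stand in for the Hadoop Mapper/Reducer API; names and data are illustrative):

```java
import java.util.*;

// Stdlib-only sketch of the map/reduce contract behind a word-count style job.
// Real jobs extend Mapper/Reducer from the Hadoop API; this shows the logic only.
class WordCountSketch {
    // "map" phase: emit a (token, 1) pair for every token in every input line
    static List<Map.Entry<String, Integer>> map(List<String> lines) {
        List<Map.Entry<String, Integer>> pairs = new ArrayList<>();
        for (String line : lines)
            for (String token : line.trim().split("\\s+"))
                if (!token.isEmpty())
                    pairs.add(Map.entry(token, 1));
        return pairs;
    }

    // "shuffle + reduce" phase: group pairs by key and sum the counts
    static Map<String, Integer> reduce(List<Map.Entry<String, Integer>> pairs) {
        Map<String, Integer> counts = new TreeMap<>();
        for (Map.Entry<String, Integer> pair : pairs)
            counts.merge(pair.getKey(), pair.getValue(), Integer::sum);
        return counts;
    }

    public static void main(String[] args) {
        List<String> lines = List.of("error warn error", "warn info");
        System.out.println(reduce(map(lines))); // {error=2, info=1, warn=2}
    }
}
```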
Confidential, East Providence, RI
Hadoop Developer
Responsibilities:
- Responsible for coding Java Batch jobs, RESTful services, MapReduce programs, and Hive queries; testing, debugging, peer code review, troubleshooting, and maintaining status reports.
- Involved in identifying possible ways to improve the efficiency of the system
- Requirements study, software development specification, development, and unit testing using MRUnit and JUnit
- Prepare daily and weekly project status report and share it with the client.
- Interact with the client periodically to discuss project status
- Generated code coverage and static analysis reports using Emma and PMD
- Ensured quality (metrics, cohesion, coupling, etc.) was maintained in the project deliverables
- Responsible for troubleshooting and resolving all issues
- Ensured timesheet compliance
- Active co-ordination and communication with onsite counterparts
Environment: Java, Java Batch, RESTful services, JAXB, Hadoop, MapReduce, JUnit, MRUnit, Oracle/Informix
Confidential, MI
Java Developer
Responsibilities:
- Responsible for requirement gathering and analysis through interaction with end users.
- Involved in designing use-case diagrams, class diagram, interaction using UML model.
- Designed and developed the application using design patterns such as Session Facade, Business Delegate, and Service Locator. Worked on the Maven build tool.
- Involved in developing JSP pages using Struts custom tags, jQuery, and the Tiles framework.
- Used JavaScript for client-side validation and the Struts Validator framework for server-side validation. Good experience in Mule development.
- Developed Web applications with Rich Internet applications using Java applets, Silverlight, JavaFX.
- Involved in creating Database SQL and PL/SQL queries and stored Procedures.
- Implemented Singleton classes for property loading and static data from DB.
- Debugged and developed applications using Rational Application Developer (RAD).
- Developed a Web service to communicate with the database using SOAP.
- Developed DAO (data access objects) using Spring Framework 3.
- Deployed the components in to WebSphere Application server 7.
- Actively involved in backend tuning SQL queries/DB script.
- Wrote commands and shell scripts on UNIX.
- Involved in developing other subsystems’ server-side components.
- Provided production support, using IBM ClearQuest to track and fix bugs.
Environment: Java EE 6, IBM WebSphere Application Server 7, Apache Struts 2.0, EJB 3, Spring 3.2, JSP 2.0, Web Services, jQuery 1.7, Servlet 3.0, Struts Validator, Struts Tiles, Tag Libraries, ANT 1.5, JDBC, Oracle 11g/SQL, JUnit 3.8, CVS 1.2, Rational ClearCase, Eclipse 4.2, JSTL, DHTML
Confidential, MD
Java/J2EE Interface Developer
Responsibilities:
- Created Use case, Sequence diagrams, functional specifications and User Interface diagrams using Star UML.
- Involved in complete requirement analysis, design, coding and testing phases of the project.
- Participated in JAD meetings to gather the requirements and understand the End Users System.
- Developed user interfaces using JSP, HTML, XML and JavaScript.
- Generated XML Schemas and used XML Beans to parse XML files.
- Created Stored Procedures & Functions. Used JDBC to process database calls for DB2/AS400 and SQL Server databases.
- Developed code to create XML files and flat files from data retrieved from databases and XML sources.
- Created data sources and helper classes utilized by all the interfaces to access and manipulate data.
- Developed web application called iHUB (integration hub) to initiate all the interface processes using Struts Framework, JSP and HTML.
- Developed the interfaces using Eclipse 3.1.1 and JBoss 4.1. Involved in integration testing, bug fixing, and production support.
Environment: Java 1.3, Servlets, JSPs, JavaMail API, JavaScript, HTML, Spring Batch XML processing, MySQL 2.1, Swing, Java Web Server 2.0, JBoss 2.0, RMI, Rational Rose, Red Hat Linux 7.1.
Confidential, Atlanta, GA
J2EE Developer
Responsibilities:
- Coded end to end (i.e., from the GUI on the client side, through the middleware, to the database and back-end systems) on a subset of sub-modules belonging to the modules above.
- Worked extensively on Swing.
- Provided most of the business logic in session beans, with database transactions performed using container-managed entity beans. Worked on parsing XML using DOM and SAX.
- Implemented EJB Transactions.
- Used JMS for messaging with IBM MQSeries. Wrote stored procedures.
- Developed the presentation layer using Servlets, JSP, and an MVC architecture on WebSphere Studio Application Developer (WSAD).
- Mentored other programmers. Studied the implementation of Struts.
- Implemented security access control on both the client and server side, including applet signing and JAR signing.
Environment: Java, Swing, JSP, Servlets, JDBC, Applets, JCE 1.2, RMI, EJB, XML/XSL, VisualAge for Java (VAJ), Visual C++.
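The DOM parsing mentioned above can be shown with the JDK's built-in javax.xml.parsers API. A minimal sketch (the order payload and method name are illustrative):

```java
import javax.xml.parsers.DocumentBuilder;
import javax.xml.parsers.DocumentBuilderFactory;
import org.w3c.dom.Document;
import org.w3c.dom.NodeList;
import java.io.ByteArrayInputStream;
import java.nio.charset.StandardCharsets;

// Minimal DOM parsing sketch: load an XML string into a Document tree
// and count the elements with a given tag name. Returns -1 on parse failure.
class DomSketch {
    static int countElements(String xml, String tag) {
        try {
            DocumentBuilder builder =
                DocumentBuilderFactory.newInstance().newDocumentBuilder();
            Document doc = builder.parse(
                new ByteArrayInputStream(xml.getBytes(StandardCharsets.UTF_8)));
            NodeList nodes = doc.getElementsByTagName(tag);
            return nodes.getLength();
        } catch (Exception e) {
            return -1; // malformed XML or parser configuration problem
        }
    }

    public static void main(String[] args) {
        String xml = "<orders><order id=\"1\"/><order id=\"2\"/></orders>";
        System.out.println(countElements(xml, "order")); // 2
    }
}
```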