Hadoop Developer Resume
Redwood City, CA
PROFESSIONAL SUMMARY:
- 8+ years of professional software development experience, including 6 years in Java/J2EE development and 3 years in the Big Data ecosystem.
- Strong experience building high-performing, distributed Hadoop Big Data systems and data warehousing applications that manage large volumes of varied data.
- Strong experience with the Hadoop framework and its ecosystem components: HDFS, MapReduce, Pig, Hive, Impala, Sqoop, Flume, and YARN.
- In-depth understanding of Hadoop architecture and its components, including HDFS, JobTracker, TaskTracker, NameNode, DataNode, and MapReduce.
- Expertise in writing Hadoop Jobs for analyzing data using MapReduce, Hive and Pig.
- Strong understanding of OOP concepts and excellent programming skills in Java and MapReduce.
- Experience importing and exporting data between HDFS and relational database systems using Sqoop.
- Experienced in extending Hive and Pig core functionality by writing custom UDFs in Java.
- Experience working with large data sets using the NoSQL database HBase; good knowledge of Cassandra.
- Experienced with the job workflow scheduling and monitoring tools Oozie and ZooKeeper.
- Knowledge of administrative tasks such as installing Hadoop and its ecosystem components such as Hive and Pig.
- Good understanding of Spark architecture; completed a proof of concept to process real-time streaming data using Spark and Scala.
- Experience in developing solutions to analyze large data sets efficiently.
- Experience in data warehousing and ETL processes; strong experience with the ETL tool Informatica PowerCenter and good exposure to the BI tools OBIEE and Business Objects.
- Good understanding of data warehouse concepts: star schema and snowflake modeling, fact and dimension tables, summary tables, and slowly changing dimensions.
- Experience with Informatica PowerCenter client tools (Designer, Repository Manager, Workflow Manager/Monitor) and server tools (Informatica Server, Repository Server Manager).
- Expertise in implementing complex business rules by creating mappings with various Informatica transformations, sessions, and workflows.
- Good experience with UNIX commands and shell scripting.
- Good experience developing applications using Java/J2EE technologies such as Core Java, JDBC, JSP, Servlets, EJB, JavaScript, and HTML.
- Good experience with Hibernate, Struts, JSF, and web services; good knowledge of MVC architecture and various J2EE design patterns.
- Strong database skills: good experience with Oracle (SQL, PL/SQL), SQL Server, and DB2. Expertise in writing database objects such as stored procedures, triggers, PL/SQL packages, functions, and cursors.
- Proficient in document management, able to prioritize and complete multiple tasks.
- Strong technical skills with good problem research, troubleshooting, and resolution abilities.
- Strong communication skills, adaptability, and quick learning.
TECHNICAL SKILLS:
Operating Systems: MS-DOS, WINDOWS, UNIX, HP-UX, AIX, LINUX, MVS, Z/OS
Hadoop Ecosystem: HDFS, MAP REDUCE, PIG, HIVE, HIVE QL, IMPALA, SQOOP, FLUME, YARN, SPARK, OOZIE, ZOOKEEPER
NOSQL Databases: HBASE, Neo4j, MongoDB, Couchbase, Cassandra
RDBMS Databases/Tools: Oracle (SQL, PL/SQL), SQL Server, DB2, Toad, SQL Navigator, SQL Server Management studio
DWH ETL Tools: Informatica Power Center 7.1/8.6/9.1, Oracle Warehouse Builder, Erwin (data modeling tool)
DWH Reporting Tools: Business Objects XI, Cognos ReportNet & Cognos 8 BI, OBIEE, DAC
JAVA/J2EE/Web Technologies: Core Java, JDBC, JSP, Servlets, JavaScript, AJAX, HTML, XML, XSLT, Struts, Hibernate, EJB, Spring, Web services, SOAP, WSDL.
Other Tools/Software: UNIX shell scripting, PuTTY, SecureFX, SecureCRT, WinSCP, HP Reflection, Appworx, Autosys, FTP, MQ Series, JMS, Maven, Ant, JUnit, WebSphere, UML, OOAD, Eclipse IDE, NetBeans, MS Visio, MS Office, Lotus Notes, Visual SourceSafe (VSS), IBM ClearCase, Teamprise Foundation Server (TFS), HP Quality Center, AdminiTrack, IBM ClearQuest, Remedy, HP OpenView Service Desk, IBM Mainframes, COBOL, JCL, IMS-DB, VSAM, SPUFI, and TSO/ISPF.
PROFESSIONAL EXPERIENCE:
Confidential, Redwood City, CA
Hadoop Developer
Responsibilities:
- Involved in collecting the business requirements for the project.
- Attended business meetings to gather the requirements and wrote functional requirements document based on it.
- Participated in technical discussions and overall architecture reviews, and communicated with other integration teams.
- Designed advanced systems by researching and evaluating complex business systems to develop a detailed understanding of user needs.
- Worked with users to define system requirements and create resolutions.
- Worked as a team member on large, highly complex technical/programming projects under minimal direction from senior staff and management.
- Regularly led small to medium-sized projects, performing project management activities such as planning, sizing, configuration, and scheduling.
- Developed solutions and made decisions that impacted projects and staff members.
- Performed analysis, wrote program specifications, and developed designs for medium to large projects.
- Wrote MapReduce and Hive jobs to process data per business requirements for further analysis by data analysts.
- Loaded and transferred large, complex sets of structured, semi-structured, and unstructured data using Sqoop.
- Used Hive partitioning and bucketing to improve performance and to maintain historical data in tables in a modular fashion.
- Wrote Oozie workflows to schedule batch jobs.
- Performed analysis, resolved problems, and monitored systems to proactively prevent issues.
- Evaluated options for meeting user needs and ensured that system requirements were identified, prioritized, and incorporated effectively and efficiently.
- Troubleshot complex development and production application problems and provided technical and production support on an on-call basis.
- Worked with project stakeholders to define system requirements for various projects.
- Used Apache Spark for large-scale data processing, handling real-time analytics and streaming data.
- Integrated business intelligence reporting solutions such as Tableau with various databases.
Environment: Hortonworks Data Platform 2.1, Linux, HDFS, MapReduce, YARN, Tableau, Spark, Hive, Sqoop, Oozie, Tez, Java, Teradata.
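The MapReduce and Hive jobs described above follow the classic map/shuffle/reduce pattern. As an illustrative sketch only (plain Java streams standing in for the actual Hadoop API, with a hypothetical word-count input), the core aggregation logic looks like:

```java
import java.util.Arrays;
import java.util.Map;
import java.util.stream.Collectors;

public class MapReduceSketch {
    // Map phase: split each record into individual words;
    // shuffle + reduce phase: group identical words and sum their counts.
    static Map<String, Long> wordCount(String... records) {
        return Arrays.stream(records)
                .flatMap(line -> Arrays.stream(line.toLowerCase().split("\\s+")))
                .filter(w -> !w.isEmpty())
                .collect(Collectors.groupingBy(w -> w, Collectors.counting()));
    }

    public static void main(String[] args) {
        System.out.println(wordCount("big data", "big jobs"));
    }
}
```

In a real Hadoop job the map and reduce steps would be separate `Mapper` and `Reducer` classes running across the cluster; this sketch only shows the shape of the computation.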
Confidential, San Rafael, CA
Hadoop Developer
Responsibilities:
- Worked with business analysts and converted business requirements into technical requirements.
- Ingested large volumes of data from various sources into HDFS.
- Wrote MapReduce code to preprocess the data.
- Created Hive external tables on top of the valid data sets.
- Developed complex business rules using Hive and Pig to transform and store the data in an efficient manner for Business Intelligence.
- Wrote Hive user-defined functions (UDFs) to implement critical logic.
- Used Informatica for ETL processing based on business needs, and extensively used the Oozie workflow engine to run multiple Hive and Pig jobs.
- Provided NoSQL solutions in MongoDB and Cassandra for extracting and storing huge amounts of data.
- Used Sqoop to import and export data between RDBMS systems and HDFS.
- Worked with both MapReduce 1 (JobTracker/TaskTracker) and MapReduce 2 (YARN).
- Wrote complex queries in SQL for performance tuning
- Worked closely with Business Stakeholders, UX Designers, Solution Architects and other team members to achieve results together
- Participated in business requirement analysis, solution design, detailed design, solution development, testing, and deployment of various products.
- Delivered robust, flexible, and scalable solutions with a dedication to high quality that meet or exceed customer requirements and expectations.
Environment: Java, Hadoop, Hive, Pig, Oozie, Sqoop, YARN, MongoDB, Cassandra, SQL, XML, Eclipse, Maven, JUnit, Linux, Windows, Subversion.
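A Hive UDF of the kind mentioned above wraps a plain Java `evaluate()` method. This sketch shows only the shape of such a function, without the `hive-exec` dependency and with a hypothetical null-safe trim-and-uppercase cleansing rule (not the actual logic from the project):

```java
public class CleanseUdfSketch {
    // In a real Hive UDF this would be evaluate(Text input) on a class
    // extending org.apache.hadoop.hive.ql.exec.UDF; it is plain Java here
    // so the logic can run and be tested standalone.
    public static String evaluate(String input) {
        if (input == null) {
            return null; // Hive passes SQL NULLs through unchanged
        }
        return input.trim().toUpperCase(); // hypothetical cleansing rule
    }

    public static void main(String[] args) {
        System.out.println(evaluate("  acme corp "));
    }
}
```

Once the packaged jar is added to the Hive session, such a function would be registered with `CREATE TEMPORARY FUNCTION` and used directly in HiveQL queries.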
Confidential, Memphis, TN
Java Developer
Responsibilities:
- Involved in the designing of the project using UML.
- Followed J2EE Specifications in the project.
- Designed the user interface pages in JSP.
- Used XML and XSL for mapping the fields in database.
- Used JavaScript for client side validations.
- Created the stored procedures and triggers required for the project.
- Created functions and views in Oracle.
- Enhanced the performance of the whole application using the stored procedures and prepared statements.
- Responsible for updating database tables and designing SQL queries using PL/SQL.
- Created bean classes for communicating with database.
- Involved in documentation of the module and project.
- Prepared test cases and test scenarios as per business requirements.
- Involved in bug fixing.
- Prepared application code for unit testing using JUnit.
Environment: Java, JSP, Servlets, J2EE, EJB 3, Java Beans, Oracle, HTML, DHTML, XML, XSL, JavaScript, BEA WebLogic.
Confidential
Java Developer
Responsibilities:
- Understood system requirements, both functional and technical.
- Involved in design and development of presentation layer using HTML, JSP.
- Developed beans, action classes, various business delegates, session facades, and DAOs.
- Involved in designing screens and client/server-side validations using JavaScript and the validation framework.
- Participated in UAT and incorporated feedback and changes received from system users.
- Responsible for deploying the application to the test server.
- Testing and debugging of the code.
Environment: Java 1.4, JSP 1.2, Servlets, Struts 1.3.1, WebLogic, MySQL, JavaScript, Eclipse, Windows 2000.
Confidential
Java developer
Responsibilities:
- Involved in Full Life Cycle Development in Distributed Environment Using Java and J2EE framework.
- Responsible for developing and modifying the existing service layer based on the business requirements.
- Involved in designing & developing web-services using SOAP and WSDL.
- Involved in database design.
- Created tables and stored procedures in SQL for data manipulation and retrieval; performed database modifications using SQL, PL/SQL, stored procedures, triggers, and views in Oracle 9i.
- Created User Interface using JSF.
- Involved in integration testing the Business Logic layer and Data Access layer.
- Integrated JSF with JSP and used JSF Custom Tag Libraries to display the value of variables defined in configuration files.
- Used technologies like JSP, JSTL, JavaScript and Tiles for Presentation tier
- Involved in JUnit testing of the application using JUnit framework.
- Wrote stored procedures, functions, and views to retrieve data.
- Used Maven build to wrap around Ant build scripts.
- Used CVS for version control of code and project documents.
- Mentored and worked with team members to ensure standards and guidelines were followed and tasks were delivered on time.
Environment: JQuery, JSP, Servlets, JSF, JDBC, HTML, JUnit, JavaScript, XML, SQL, Maven, Web Services, UML, WebLogic Workshop and CVS.