
Java Developer Resume Profile


Summary:

  • Around 7 years of IT experience as a Java/Hadoop Developer in Finance, Health, Telecom and Insurance domains.
  • Good experience with the Hadoop Distributed File System (HDFS) and its ecosystem: MapReduce, Pig, Hive, Sqoop, and HBase.
  • Experience in Java, JSP, Servlets, Struts, JBoss, JDBC, and HTML.
  • Expertise in writing MapReduce jobs in native Java, Pig, and Hive for data processing (see the MapReduce sketch after this list).
  • Imported and exported data between MySQL and HDFS/Hive using the ETL tool Sqoop.
  • Developed Pig Latin scripts using operators such as LOAD, STORE, DUMP, FILTER, DISTINCT, FOREACH, GENERATE, GROUP, COGROUP, ORDER, LIMIT, UNION, and SPLIT to extract data from data files and load it into HDFS.
  • Wrote Hive queries for data analysis to meet the requirements.
  • Hands-on experience setting up 50-node clusters.
  • Created Hive tables over data stored in HDFS and processed the data using HiveQL.
  • Experience in Core Java.
  • Loaded data into databases using ETL tools such as SQL*Loader and external tables, pulling data from the data warehouse and other databases such as SQL Server and DB2.
  • Background with traditional databases such as Oracle, SQL Server, and ETL tools / processes.
  • Loaded 20 TB of data from a UNIX file system to HDFS in a record 2 hours and reduced downtime by 96%.
  • Ability to adapt to evolving technology, strong sense of responsibility and accomplishment.
  • Exceptional organizational, multi-tasking, problem solving and leadership skills with result-oriented attitude.
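
As an illustration of the native-Java MapReduce work listed above, the following is a minimal sketch of a record-counting job. It is not code from any project below; the class names and the comma-delimited input layout are illustrative assumptions.

    import java.io.IOException;

    import org.apache.hadoop.conf.Configuration;
    import org.apache.hadoop.fs.Path;
    import org.apache.hadoop.io.IntWritable;
    import org.apache.hadoop.io.LongWritable;
    import org.apache.hadoop.io.Text;
    import org.apache.hadoop.mapreduce.Job;
    import org.apache.hadoop.mapreduce.Mapper;
    import org.apache.hadoop.mapreduce.Reducer;
    import org.apache.hadoop.mapreduce.lib.input.FileInputFormat;
    import org.apache.hadoop.mapreduce.lib.output.FileOutputFormat;

    /** Counts records per key; names and input format are assumptions. */
    public class RecordCountJob {

        public static class KeyMapper
                extends Mapper<LongWritable, Text, Text, IntWritable> {
            private static final IntWritable ONE = new IntWritable(1);
            private final Text outKey = new Text();

            @Override
            protected void map(LongWritable offset, Text line, Context context)
                    throws IOException, InterruptedException {
                // Assumption: comma-delimited rows whose first field is the grouping key.
                String[] fields = line.toString().split(",");
                outKey.set(fields[0]);
                context.write(outKey, ONE);
            }
        }

        public static class SumReducer
                extends Reducer<Text, IntWritable, Text, IntWritable> {
            @Override
            protected void reduce(Text key, Iterable<IntWritable> counts, Context context)
                    throws IOException, InterruptedException {
                int sum = 0;
                for (IntWritable c : counts) {
                    sum += c.get();
                }
                context.write(key, new IntWritable(sum));
            }
        }

        public static void main(String[] args) throws Exception {
            Job job = new Job(new Configuration(), "record-count");
            job.setJarByClass(RecordCountJob.class);
            job.setMapperClass(KeyMapper.class);
            job.setCombinerClass(SumReducer.class);   // same types in and out, so reusable
            job.setReducerClass(SumReducer.class);
            job.setOutputKeyClass(Text.class);
            job.setOutputValueClass(IntWritable.class);
            FileInputFormat.addInputPath(job, new Path(args[0]));
            FileOutputFormat.setOutputPath(job, new Path(args[1]));
            System.exit(job.waitForCompletion(true) ? 0 : 1);
        }
    }

The compiled jar would be launched with the standard hadoop jar command, taking the input and output HDFS paths as arguments.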

Technical Skills:

Big Data / Hadoop Framework: HDFS, MapReduce, Pig, Hive, ZooKeeper, HBase, and Sqoop

Databases: Oracle 9i, MySQL, HBase
Languages: SQL, Java, J2EE, Pig Latin
Development Tools: Eclipse, Toad, MySQL
Web Technologies: JSP, Servlets, JavaBeans, JDBC
Project Management: Microsoft Project
Operating Systems: Windows 8, Windows 7, UNIX, Linux

Professional Experience:

Confidential

Java/J2EE/Hadoop Developer

Project: UI Modernization Onsite

Description: The vision of the modern Unemployment Insurance (UI) Information Technology (IT) system is to modernize the business and administrative functioning of Tax and Benefits for the unemployed at the Department of Labor, Sunnyvale, CA. This is achieved by designing, developing, and implementing the system with current software languages, database systems, and tools in order to improve UI program services to California's employers and claimants, improve the integrity of the UI program, reduce manual effort and inefficiencies, increase federal and state compliance, and upgrade antiquated technologies by modernizing the state's UI automated applications and related processes and business functions.

Responsibilities:

  • Participated in requirement gathering and converting the requirements into technical specifications.
  • Created UML diagrams like use cases, class diagrams, interaction diagrams, and activity diagrams.
  • Developed the application using the Struts framework, which leverages the classic Model-View-Controller (MVC) architecture.
  • Extensively worked on the user interface for a few modules using JSPs.
  • Wrote complex SQL queries and stored procedures.
  • Developed the XML Schema and Web services for the data maintenance and structures.
  • Implemented the Web Service client for login authentication and credit reports.
  • Responsible for managing data coming from different sources.
  • Developed MapReduce algorithms.
  • Gained good experience with NoSQL databases.
  • Involved in loading data from UNIX file system to HDFS.
  • Installed and configured Hive and wrote Hive UDFs (see the UDF sketch after this list).
  • Designed the logical and physical data model, generated DDL scripts, and wrote DML scripts for Oracle 10g database.
  • Used the Struts validation framework for form-level validation.
  • Wrote test cases in JUnit for unit testing of classes.
  • Involved in creating templates and screens in HTML and JavaScript.
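
As a sketch of the Hive UDF work mentioned above (suited to the Hive 0.7.x API in use here), the class below trims and upper-cases a string column. The class name and logic are illustrative assumptions, not the UDFs actually written on the project.

    import org.apache.hadoop.hive.ql.exec.UDF;
    import org.apache.hadoop.io.Text;

    /** Hive UDF sketch: trims and upper-cases a string column. */
    public final class UpperTrim extends UDF {
        public Text evaluate(Text input) {
            if (input == null) {
                return null;               // Hive passes NULLs through
            }
            return new Text(input.toString().trim().toUpperCase());
        }
    }

Such a UDF is registered per session with ADD JAR and CREATE TEMPORARY FUNCTION before being called from HiveQL.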

Environment: Hive 0.7.1, HBase 0.90.x/0.20.x, JDK 1.5, Struts 1.3, WebSphere 6.1, HTML, JUnit 3.8, Oracle 10g.

Confidential

Hadoop Developer

Project: FIPNet Service Onsite

Description: The objective of this project is to improve patient care through an optimized supply chain, using analytic innovations with Hortonworks Hadoop. Employing a unified data strategy has allowed Cardinal Health to see significant business value, including a 50% time saving for end users working with raw data.

Responsibilities:

  • Developed solutions to ingest data into HDFS (Hadoop Distributed File System), process it within Hadoop, and emit the summary results to downstream analytical systems.
  • Data was exported into HDFS using Sqoop.
  • Hive was used to produce results quickly based on the report that was requested.
  • Integrated Hive with a web server to auto-generate Hive queries for non-technical business users.
  • Integrated data from multiple sources (SQL Server, DB2, TD) into the Hadoop cluster and analyzed it through Hive-HBase integration.
  • Developed Pig UDFs and a custom Pig loader known as the timestamp loader (see the Pig UDF sketch after this list).
  • Flume was used to collect the logs with error messages across the cluster.
  • Oozie and Zookeeper were used to manage the flow of jobs and coordination in the clusters.
  • Processed data from Hadoop to relational databases or external file systems using SQOOP.
  • Developed several shell scripts that act as wrappers to start these Hadoop jobs and set the configuration parameters.
  • Kerberos security was implemented to safeguard the cluster.
  • Worked on a stand-alone as well as a distributed Hadoop application.
  • Tested the performance of the data sets on various NoSQL databases.
  • Understood complex data structures of different types (structured and semi-structured) and de-normalized them for storage in Hadoop.
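
As a sketch of the Pig UDF work mentioned above, the EvalFunc below renders an epoch-milliseconds field as a readable timestamp. It is a simplified stand-in for the custom timestamp loader (a real loader extends LoadFunc rather than EvalFunc); the class name and field layout are illustrative assumptions.

    import java.io.IOException;
    import java.text.SimpleDateFormat;
    import java.util.Date;

    import org.apache.pig.EvalFunc;
    import org.apache.pig.data.Tuple;

    /** Pig UDF sketch: renders an epoch-milliseconds field as a readable timestamp. */
    public class ToTimestamp extends EvalFunc<String> {
        private final SimpleDateFormat format =
                new SimpleDateFormat("yyyy-MM-dd HH:mm:ss");

        @Override
        public String exec(Tuple input) throws IOException {
            if (input == null || input.size() == 0 || input.get(0) == null) {
                return null;               // propagate nulls rather than fail the task
            }
            long millis = Long.parseLong(input.get(0).toString());
            return format.format(new Date(millis));
        }
    }

In Pig Latin, the jar is pulled in with REGISTER and the function is called inside a FOREACH ... GENERATE.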

Environment: Hadoop, HDFS, Pig 0.10, Hive, MapReduce, Sqoop, Java, Eclipse, SQL Server, Shell Scripting.

Confidential

Hadoop Developer

Project: PoC - Ab Initio to Hadoop Migration

Description: The objective of PoC Bundle-1 is replacement of the ETL tool. This PoC proves that the work being done with the ETL tool can also be done using cost-effective open-source technologies such as Pig Latin and Java, which run on the Hadoop framework. The objective of PoC Bundle-2 is replacement of Ab Initio; this PoC overcomes the complexities of Bundle-1 in terms of performance. Staging-to-Core processing maps the data from the stage table to the core table. The Reconciliation Process includes validation of CDGL data, summarization and variance calculation, and inserting/updating records into the Recon summary table using Delta processing (i.e., comparing the CDGL data with the previous day's Recon summary data).

Responsibilities:

  • Responsible for architecting Hadoop clusters with CDH3.
  • Extensively involved in installation and configuration of the Cloudera distribution of Hadoop: NameNode, JobTracker, TaskTrackers, and DataNodes.
  • Installed and configured Hadoop ecosystem components such as HBase, Flume, Pig, and Sqoop.
  • Involved in Hadoop cluster tasks such as adding and removing nodes without any effect on running jobs and data.
  • Managed and reviewed Hadoop log files and loaded log data into HDFS using Flume.
  • Worked extensively in creating MapReduce jobs to power data for search and aggregation.
  • Worked extensively with Sqoop for importing metadata from Oracle.
  • Responsible for smooth, error-free configuration of the DWH-ETL solution and its integration with Hadoop; designed a data warehouse using Hive.
  • Designed and implemented a semi-structured data analytics platform leveraging Hadoop with Solr; created partitioned tables in Hive.
  • Developed Pig Latin scripts to extract the data from the web server output files to load into HDFS.
  • Developed a workflow in Oozie to automate the tasks of loading data into HDFS and pre-processing it with Pig (see the Oozie client sketch below).
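
A minimal sketch of driving such an Oozie workflow from Java with the standard OozieClient API; the server URL, HDFS paths, and property values here are illustrative assumptions, not values from the project.

    import java.util.Properties;

    import org.apache.oozie.client.OozieClient;
    import org.apache.oozie.client.WorkflowJob;

    /** Submits a load-and-preprocess workflow through the Oozie Java client. */
    public class SubmitWorkflow {
        public static void main(String[] args) throws Exception {
            // Assumption: server URL and HDFS paths are placeholders.
            OozieClient client = new OozieClient("http://oozie-host:11000/oozie");

            Properties conf = client.createConfiguration();
            conf.setProperty(OozieClient.APP_PATH, "hdfs://namenode/apps/load-preprocess");
            conf.setProperty("nameNode", "hdfs://namenode:8020");
            conf.setProperty("jobTracker", "jobtracker:8021");

            String jobId = client.run(conf);        // submit and start the workflow
            Thread.sleep(10 * 1000L);               // single illustrative poll
            WorkflowJob job = client.getJobInfo(jobId);
            System.out.println("Workflow " + jobId + " status: " + job.getStatus());
        }
    }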

Environment: Hadoop, MapReduce, HDFS, Pig, Hive, HBase, Java, Oracle 10g, MySQL

Confidential

Java/J2EE Developer

Project: Hiring Simplified

Description: Hiring Simplified application for GE Infra (GE Oil and Gas) for the customer GE, Singapore. The objective of this application is to provide the list of candidates selected after interviews; only Admin and HR can add and edit candidate details.

Responsibilities:

  • Worked on Servlets, JSP, Struts, JDBC, and MVC.
  • Worked with JavaScript to perform client side form validations.
  • Used Struts tag libraries as well as Struts tile framework.
  • Used the Data Access Object pattern to make the application more adaptable to future and legacy databases.
  • Actively involved in tuning SQL queries for better performance.
  • Wrote generic functions to call Oracle stored procedures, triggers, and functions (see the JDBC sketch after this list).
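
A minimal sketch of the kind of generic stored-procedure wrapper described in the last bullet, using plain JDBC CallableStatement in the pre-try-with-resources style of the era; the class name, signature, and one-IN-parameter procedure shape are illustrative assumptions.

    import java.sql.CallableStatement;
    import java.sql.Connection;
    import java.sql.DriverManager;
    import java.sql.SQLException;

    /** Generic helper for invoking an Oracle stored procedure over JDBC. */
    public final class OracleProcCaller {

        /** Calls a procedure that takes a single IN parameter. */
        public static void callProcedure(String jdbcUrl, String user, String password,
                                         String procName, String arg) throws SQLException {
            // procName must come from trusted configuration, never from user input.
            Connection conn = DriverManager.getConnection(jdbcUrl, user, password);
            try {
                CallableStatement stmt = conn.prepareCall("{call " + procName + "(?)}");
                try {
                    stmt.setString(1, arg);
                    stmt.execute();
                } finally {
                    stmt.close();
                }
            } finally {
                conn.close();              // explicit cleanup, pre-try-with-resources style
            }
        }
    }

A caller would pass the Oracle thin-driver URL, for example a hypothetical OracleProcCaller.callProcedure("jdbc:oracle:thin:@host:1521:orcl", user, pass, "update_candidate_status", "42").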

Environment: JDK, J2EE, UML, Servlet, JSP, JDBC, Struts, JavaScript, MVC, Tomcat, Eclipse.

Confidential

Java/J2EE Developer

Project: Mass Communication Portal

Description: Mass communication application for GE PEG (TCS internal) that assists associates in sending batch mail to multiple associates within the GE ISU business in very little time. There is no limit on the number of recipients. Associates can send mail to different groups at a particular time, compose a new draft, and view the status of saved drafts, sent mail history, and scheduled mail history. This application is accessible only to BGLs, PMO, and AMs.

Responsibilities:

  • Involved in the complete SDLC, from design through development of the application.
  • Followed the Agile methodology and participated in Scrum meetings.
  • Applied Struts Tiles and resource bundles to create internationalized web sites from one set of source code, which greatly simplified code maintenance.
  • Developed the application using J2EE and Struts (see the Action sketch after this list).
  • Worked on Servlets, JSP, Struts, JDBC, and the MVC architecture.
  • Involved in coding of JSP pages for the presentation of data on the View layer in MVC architecture.
  • Actively involved in tuning SQL queries for better performance.
  • Wrote generic functions to call Oracle stored procedures, triggers, and functions.
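
As a sketch of the Struts 1.x request handling used throughout this application, the Action below routes a compose-mail request to the view layer; the class name, request attribute, and forward name are illustrative assumptions.

    import javax.servlet.http.HttpServletRequest;
    import javax.servlet.http.HttpServletResponse;

    import org.apache.struts.action.Action;
    import org.apache.struts.action.ActionForm;
    import org.apache.struts.action.ActionForward;
    import org.apache.struts.action.ActionMapping;

    /** Struts 1.x Action sketch: routes a compose-mail request to the view layer. */
    public class ComposeMailAction extends Action {
        @Override
        public ActionForward execute(ActionMapping mapping, ActionForm form,
                                     HttpServletRequest request,
                                     HttpServletResponse response) throws Exception {
            // A real action would copy the draft out of the ActionForm here.
            request.setAttribute("draftStatus", "saved");
            return mapping.findForward("success"); // forward name defined in struts-config.xml
        }
    }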

Environment: J2EE, Servlet, JSP, JDBC, Struts, MVC, JBoss, Eclipse.

Confidential

Java/J2EE Developer

Project: ARK (Access to Risk Knowledge)

Description: ARK (Access to Risk Knowledge) is an application for GE Corporate GRM (GE USA) and an integrated suite of risk management tools that assists in making judgments related to the financial risk associated with the company. EDF calculation in ARK is done by a DLL provided by MKMV. This DLL resides on an IIS server and is called by the ARK web application whenever the EDF needs to be calculated.

ARK User Management module (Autosys): the Autosys jobs trigger shell scripts, which in turn call the Java programs.

Responsibilities:

  • Involved in coding of JSP pages for the presentation of data on the View layer in MVC architecture.
  • Used J2EE design patterns such as Factory Method, MVC, and Singleton, which made modules and code more organized, flexible, and readable for future upgrades (see the Singleton sketch after this list).
  • Worked with JavaScript to perform client side form validations.
  • Used Struts tag libraries as well as Struts tile framework.
  • Used JDBC to access the database with the Oracle thin driver (a Type 4 driver) for application optimization and efficiency.
  • Used Data Access Object to make application more flexible to future and legacy databases.
  • Actively involved in tuning SQL queries for better performance.
  • Wrote generic functions to call Oracle stored procedures, triggers, functions.
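
A minimal sketch of the Singleton pattern mentioned above, in the lazily initialized, synchronized form typical of J2EE code of this period; the class name is an illustrative assumption.

    /** Singleton sketch: one lazily created, shared instance per JVM. */
    public final class ServiceRegistry {
        private static ServiceRegistry instance;

        private ServiceRegistry() {
            // private constructor prevents outside instantiation
        }

        /** synchronized keeps lazy initialization safe under concurrent access. */
        public static synchronized ServiceRegistry getInstance() {
            if (instance == null) {
                instance = new ServiceRegistry();
            }
            return instance;
        }
    }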

Environment: JDK, J2EE, Servlet, JSP, JDBC, Struts, MVC, JavaScript, Tomcat, Eclipse.
