Java/J2EE Developer Resume Profile

Dublin, OH

PROFESSIONAL SUMMARY:

  • 7+ years of professional IT experience, including 2.8 years in Big Data, Hadoop development, and ecosystem analytics.
  • Passionate about working in Big Data and analytics environments.
  • Well versed in installing, configuring, supporting, and managing Hadoop clusters and their underlying Big Data infrastructure.
  • In-depth knowledge of Hadoop architecture and daemons such as NameNode, Secondary NameNode, DataNode, JobTracker, and TaskTracker.
  • Experience in writing MapReduce programs using Apache Hadoop for analyzing Big Data (an illustrative job skeleton follows this summary).
  • Hands-on experience writing ad-hoc queries for moving data from HDFS into Hive and analyzing it with HiveQL.
  • Experience in importing and exporting data with Sqoop between HDFS and relational database systems.
  • Experience in writing Hadoop jobs for analyzing data using Pig Latin scripts.
  • Experience in integrating Hive and Sqoop with HBase and analyzing data in HBase.
  • Good knowledge of NoSQL databases like HBase and MongoDB.
  • Knowledge of extending Hive and Pig core functionality by writing custom UDFs, UDAFs, and UDTFs.
  • Knowledge of administrative tasks such as installing Hadoop and its ecosystem components, such as Hive and Pig, in pseudo-distributed mode.
  • Experience in understanding Hadoop security requirements and integrating with Kerberos authentication and authorization infrastructure.
  • Experience in using Apache Flume for collecting, aggregating, and moving large amounts of data from application servers.
  • Experience in using ZooKeeper and Oozie operational services for coordinating the cluster and scheduling workflows.
  • Good knowledge of configuring and monitoring tools like Ganglia and Nagios.
  • Good knowledge of Amazon AWS concepts such as EMR and EC2 web services, which provide fast and efficient processing of Big Data.
  • Experience in launching EC2 instances for Amazon EMR using the AWS Console.
  • Knowledge of reporting tools like Tableau, used for analytics on data in the cloud.
  • Extensive experience with SQL, PL/SQL and database concepts.
  • Experience in developing applications using Java/J2EE technologies.
  • Extensive knowledge of Java, J2EE, Servlets, JSP, JDBC, Struts, and the Spring Framework.
  • Experience working with popular frameworks like Struts 2.0, Hibernate 3.0, Spring IoC, and Spring MVC.
  • Experience in web services using XML, HTML, and SOAP.
  • Experience in using version control management tools like CVS, SVN, and Rational ClearCase.
  • Highly motivated self-starter with a positive attitude, a willingness to learn new concepts, and an acceptance of challenges.
  • Major strengths include familiarity with multiple software systems, the ability to quickly learn new technologies and adapt to new environments, and excellent interpersonal, technical, and communication skills.
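
For illustration, here is a minimal sketch of the kind of MapReduce program referenced above: the classic word count, written in Java against the org.apache.hadoop.mapreduce API. The class and job names are illustrative and not taken from any project described below.

    import java.io.IOException;
    import org.apache.hadoop.conf.Configuration;
    import org.apache.hadoop.fs.Path;
    import org.apache.hadoop.io.IntWritable;
    import org.apache.hadoop.io.LongWritable;
    import org.apache.hadoop.io.Text;
    import org.apache.hadoop.mapreduce.Job;
    import org.apache.hadoop.mapreduce.Mapper;
    import org.apache.hadoop.mapreduce.Reducer;
    import org.apache.hadoop.mapreduce.lib.input.FileInputFormat;
    import org.apache.hadoop.mapreduce.lib.output.FileOutputFormat;

    public class WordCount {
        // Emits (token, 1) for every whitespace-separated token in the input line.
        public static class TokenMapper extends Mapper<LongWritable, Text, Text, IntWritable> {
            private static final IntWritable ONE = new IntWritable(1);
            private final Text word = new Text();
            protected void map(LongWritable key, Text value, Context ctx)
                    throws IOException, InterruptedException {
                for (String token : value.toString().split("\\s+")) {
                    if (!token.isEmpty()) { word.set(token); ctx.write(word, ONE); }
                }
            }
        }
        // Sums the counts for each token; also usable as a combiner.
        public static class SumReducer extends Reducer<Text, IntWritable, Text, IntWritable> {
            protected void reduce(Text key, Iterable<IntWritable> values, Context ctx)
                    throws IOException, InterruptedException {
                int sum = 0;
                for (IntWritable v : values) sum += v.get();
                ctx.write(key, new IntWritable(sum));
            }
        }
        public static void main(String[] args) throws Exception {
            Job job = Job.getInstance(new Configuration(), "word-count");
            job.setJarByClass(WordCount.class);
            job.setMapperClass(TokenMapper.class);
            job.setCombinerClass(SumReducer.class);
            job.setReducerClass(SumReducer.class);
            job.setOutputKeyClass(Text.class);
            job.setOutputValueClass(IntWritable.class);
            FileInputFormat.addInputPath(job, new Path(args[0]));
            FileOutputFormat.setOutputPath(job, new Path(args[1]));
            System.exit(job.waitForCompletion(true) ? 0 : 1);
        }
    }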

TECHNICAL SKILLS:

Hadoop Core Services: HDFS, MapReduce, YARN
Hadoop Data Services: Apache Hive, Pig, Sqoop, Flume
Cloud Computing Services: AWS (Amazon Web Services)
Reporting Tools: Tableau
Java/J2EE Technologies: Core Java 1.5, Servlets 3.2, JSP 2.0, JDBC, JavaBeans
IDE Tools: Eclipse, NetBeans
Programming Languages: C, Java, Unix shell scripting
Databases: Oracle 11g/10g/9i, DB2, MS SQL Server, MySQL, MS Access
Web/Application Servers: WebLogic 10.3, WebSphere 6.1, Apache Tomcat 5.5/6.0
Environment Tools: SQL Developer, WinSCP, PuTTY
Frameworks: Struts 2.0, UML, Hibernate 3.0, Spring 2.5
Version Control Systems: CVS, TortoiseSVN
Operating Systems: Windows 95/98/2000/XP/Vista/7/8, Unix

PROFESSIONAL EXPERIENCE:

Confidential

Java/J2EE Developer

Description: Confidential is a Fortune 500 health care services company specializing in the distribution of pharmaceuticals and medical products, serving more than 60,000 locations. It consists of segments such as Pharmacy, Medical, and Enterprise. The company implemented Hadoop as part of its Data Analytics Platform Services for end business users.

Responsibilities:

  • Worked on Sqooping tables from various databases, such as Teradata (Sales Data Warehouse), AS/400, DB2, and SQL Server, into the Hadoop file system.
  • Developed shell scripts to automate ingestion and deploy the tables for both snapshots and deltas.
  • Implemented a custom delta-processing strategy for ingesting deltas where Sqoop does not support incremental imports on composite keys.
  • Worked on ingestion of deltas for the company's biggest table, INVOICE LINE (700 GB), with daily partitioning.
  • Developed Pig Latin scripts for data cleansing and transformation.
  • Developed a shell script to perform data profiling on the ingested data with the help of Hive bucketing.
  • Worked with different file formats, such as TextFile, CSV, ORC, and Avro, for Hive querying and processing.
  • Developed Pig and Hive UDFs for manipulating the data according to business requirements.
  • Worked on ingestion of files into HDFS from remote systems using MFT (Managed File Transfer).
  • Developed a Java API for exporting data from HDFS to RDBMS databases to work around Sqoop auto-commit issues (a minimal sketch follows this list).
  • Developed wrapper scripts for Sqoop ingestion and Hadoop copyMerge.
  • Developed workbooks in Datameer (an application on top of Hive that acts as a BI tool) for transformations that run every day.
  • Involved in preparing benchmark metrics for comparing MRv1 to MRv2 (YARN).
  • Worked on Hive joins (MEGAJOIN) to produce the input data set for the QlikView model.
  • Involved in upgrading and testing applications from HDP 1.2 to HDP 2.1.
  • Worked on the Tez execution engine to gather performance characteristics compared to MapReduce.
  • Involved in creating a dashboard that shows daily counts of ingested files and failures, which is very helpful for the Run team.
  • Worked on integrating Tableau with HiveServer2 using the Hortonworks ODBC driver with LDAP security.
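
A minimal sketch of the Java-based HDFS-to-RDBMS export mentioned above: it reads a delimited text file from HDFS and batches JDBC inserts with auto-commit disabled, the behavior that motivated bypassing Sqoop's export. The HDFS path, JDBC URL, credentials, table, and column layout are all hypothetical.

    import java.io.BufferedReader;
    import java.io.InputStreamReader;
    import java.sql.Connection;
    import java.sql.DriverManager;
    import java.sql.PreparedStatement;
    import org.apache.hadoop.conf.Configuration;
    import org.apache.hadoop.fs.FileSystem;
    import org.apache.hadoop.fs.Path;

    public class HdfsToRdbmsExport {
        public static void main(String[] args) throws Exception {
            FileSystem fs = FileSystem.get(new Configuration());
            Path input = new Path("/data/export/invoice_line/part-00000");  // hypothetical path

            try (Connection con = DriverManager.getConnection(
                     "jdbc:oracle:thin:@//dbhost:1521/ORCL", "user", "pass");
                 BufferedReader reader = new BufferedReader(
                     new InputStreamReader(fs.open(input)))) {
                // Disable auto-commit so transaction boundaries are controlled explicitly.
                con.setAutoCommit(false);
                PreparedStatement ps = con.prepareStatement(
                    "INSERT INTO invoice_line (id, amount) VALUES (?, ?)");
                String line;
                int batched = 0;
                while ((line = reader.readLine()) != null) {
                    String[] fields = line.split("\\t");   // assume tab-delimited HDFS text
                    ps.setLong(1, Long.parseLong(fields[0]));
                    ps.setDouble(2, Double.parseDouble(fields[1]));
                    ps.addBatch();
                    if (++batched % 1000 == 0) {
                        ps.executeBatch();
                        con.commit();   // commit each batch explicitly
                    }
                }
                ps.executeBatch();
                con.commit();
            }
        }
    }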

Environment: Java (JDK 1.7), HDP 1.3, HDP 2.1, Datameer, Tableau.

Confidential

Hadoop Developer

Description: Confidential is a multinational healthcare information technology solutions company. It is currently migrating its client/server application, backed by an Oracle database, to a cloud-based application that uses HBase for storing and retrieving patient data. The extracted initial data is pushed through a series of MapReduce jobs to create HFiles that can be loaded into the HBase cluster.

Responsibilities:

  • Involved in loading data from the Oracle database into HDFS using Sqoop queries.
  • Responsible for building scalable distributed data solutions using Hadoop.
  • Developed MapReduce pipeline jobs to process the data and create the necessary HFiles (a driver sketch follows this list).
  • Involved in loading the created HFiles into HBase for fast access across a large customer base without a performance hit.
  • Documented the system's processes and procedures for future reference.
  • Provided a batch-processing solution for large volumes of unstructured data using the Hadoop MapReduce framework.
  • Assisted in performing unit testing of MapReduce jobs using MRUnit.
  • Used the Oozie scheduler to automate the pipeline workflow and orchestrate the MapReduce jobs that extract the data in a timely manner.
  • Actively participated in the software development lifecycle (scope, design, implementation, deployment, test), including design and code reviews, test development, and test automation.
  • Used ZooKeeper to provide coordination services to the cluster.
  • Involved in a story-driven agile development methodology and actively participated in daily scrum meetings.
  • Assisted in monitoring the Hadoop cluster using Ganglia.
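
A hedged sketch of the kind of MapReduce driver described above, configuring HFileOutputFormat2 for an HBase bulk load. It assumes a recent HBase client API (Connection/RegionLocator, HBase 1.0+); the table name, column family, mapper, and CSV layout are hypothetical.

    import java.io.IOException;
    import org.apache.hadoop.conf.Configuration;
    import org.apache.hadoop.fs.Path;
    import org.apache.hadoop.hbase.HBaseConfiguration;
    import org.apache.hadoop.hbase.TableName;
    import org.apache.hadoop.hbase.client.Connection;
    import org.apache.hadoop.hbase.client.ConnectionFactory;
    import org.apache.hadoop.hbase.client.Put;
    import org.apache.hadoop.hbase.io.ImmutableBytesWritable;
    import org.apache.hadoop.hbase.mapreduce.HFileOutputFormat2;
    import org.apache.hadoop.hbase.util.Bytes;
    import org.apache.hadoop.io.LongWritable;
    import org.apache.hadoop.io.Text;
    import org.apache.hadoop.mapreduce.Job;
    import org.apache.hadoop.mapreduce.Mapper;
    import org.apache.hadoop.mapreduce.lib.input.FileInputFormat;
    import org.apache.hadoop.mapreduce.lib.output.FileOutputFormat;

    public class HFileLoadDriver {
        // Hypothetical mapper: parses "rowkey,name" CSV lines into HBase Puts.
        public static class PatientMapper
                extends Mapper<LongWritable, Text, ImmutableBytesWritable, Put> {
            protected void map(LongWritable key, Text value, Context ctx)
                    throws IOException, InterruptedException {
                String[] f = value.toString().split(",");
                byte[] row = Bytes.toBytes(f[0]);
                Put put = new Put(row);
                put.addColumn(Bytes.toBytes("d"), Bytes.toBytes("name"), Bytes.toBytes(f[1]));
                ctx.write(new ImmutableBytesWritable(row), put);
            }
        }

        public static void main(String[] args) throws Exception {
            Configuration conf = HBaseConfiguration.create();
            Job job = Job.getInstance(conf, "patient-hfile-load");
            job.setJarByClass(HFileLoadDriver.class);
            job.setMapperClass(PatientMapper.class);
            job.setMapOutputKeyClass(ImmutableBytesWritable.class);
            job.setMapOutputValueClass(Put.class);
            FileInputFormat.addInputPath(job, new Path(args[0]));
            FileOutputFormat.setOutputPath(job, new Path(args[1]));

            try (Connection conn = ConnectionFactory.createConnection(conf)) {
                TableName table = TableName.valueOf("patients");   // hypothetical table
                // Wires in the total-order partitioner and reducer so the produced
                // HFiles line up with the table's region boundaries.
                HFileOutputFormat2.configureIncrementalLoad(
                    job, conn.getTable(table), conn.getRegionLocator(table));
            }
            System.exit(job.waitForCompletion(true) ? 0 : 1);
        }
    }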

Environment: Hadoop, MapReduce, HDFS, Hive, Oracle 11g/10g, HBase, Oozie, Java (JDK 1.6), UNIX, SVN, ZooKeeper, Maven.

Confidential

Hadoop Developer

Description: Confidential is a multinational communications company providing communications and data services to residential, business, and government customers. It has implemented Hadoop to analyze business data from various databases for decision-making.

Responsibilities:

  • Worked on ingestion of data from various RDBMS systems into HDFS using Sqoop, and vice versa.
  • Performed various performance optimizations, such as using the distributed cache for small datasets, partitioning, and bucketing.
  • Performed the various data transformations requested by the client.
  • Implemented various requirements using Pig scripts.
  • Developed a Java API for exporting data from HDFS to RDBMS systems in cases where legacy systems don't support Sqoop exports because of auto-commit issues.
  • Responsible for building scalable distributed data solutions using Hadoop.
  • Worked extensively on shell scripting for automating Sqoop job flows.
  • Created Hive external tables for the data in HDFS and moved data from the archive layer to the business layer with Hive transformations.
  • Used the Oozie scheduler to automate the pipeline workflow and orchestrate the MapReduce jobs that extract the data in a timely manner.
  • Experienced in writing Pig UDFs for data cleansing (a minimal UDF sketch follows this list).
  • Worked on SQL queries for creating dashboards in Tableau.
  • Used crontab and Oozie for job automation.
  • Worked on debugging and supported the applications in case of failures.
  • Gained working knowledge of Hadoop Streaming for data mining using Python.
  • Supported setting up the QA environment and updating configurations for implementing Pig and Sqoop scripts.
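
A minimal sketch of a Pig cleansing UDF of the kind mentioned above, written as a Java EvalFunc; the class name and the specific cleansing rule are illustrative.

    import java.io.IOException;
    import org.apache.pig.EvalFunc;
    import org.apache.pig.data.Tuple;

    // Hypothetical cleansing rule: trim whitespace and lowercase the field.
    public class CleanField extends EvalFunc<String> {
        @Override
        public String exec(Tuple input) throws IOException {
            if (input == null || input.size() == 0 || input.get(0) == null) {
                return null;   // propagate nulls so bad records can be filtered downstream
            }
            return input.get(0).toString().trim().toLowerCase();
        }
    }

In a Pig script it would be registered and invoked along the lines of: REGISTER cleanse.jar; B = FOREACH A GENERATE CleanField(name); (jar and field names hypothetical).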

Environment: Hadoop, HDFS, Hive, Oozie, Sqoop, Pig, Java (JDK 1.6), Eclipse, MySQL, UNIX, SVN, ZooKeeper.

Confidential

Java/J2EE Developer

Description: Confidential is one of the largest banking institutions in the world, offering various financial and banking services to its customers. Worked on an application named Access Portal, part of online banking, which allows a customer to view a quick summary of transactions and account details. It also shows mutual funds associated with the account.

Responsibilities:

  • Responsible for understanding the scope of the project and gathering requirements.
  • Developed the web tier using JSP and Struts MVC to show account details and summaries.
  • Created and maintained the configuration of the Spring Application Framework (IoC).
  • Implemented various design patterns: Singleton, Business Delegate, Value Object, and Spring DAO.
  • Used Spring JDBC to write DAO classes that interact with the database to access account information (a minimal DAO sketch follows this list).
  • Mapped business objects to the database using Hibernate.
  • Involved in writing Spring configuration XML files containing bean declarations and other dependent object declarations.
  • Used the Tomcat web server for development purposes.
  • Involved in creating test cases for JUnit testing.
  • Used Oracle as the database and Toad for query execution; also wrote SQL scripts and PL/SQL code for procedures and functions.
  • Used CVS and Perforce as configuration management tools for code versioning and release.
  • Developed the application using Eclipse and used Maven as the build and deploy tool.
  • Used Log4j to print logging, debugging, warning, and info messages on the server console.
  • Worked on JavaScript for the application front end.
  • Developed two web UI pages using a JavaScript template.
  • Supported user acceptance testing of the application.
  • Developed new Java Server Pages with custom tags and JavaBeans.
  • Developed JSP Standard Tag Library (JSTL) and custom tag libraries for efficient, flexible, and easily maintainable JSP pages.
  • Used JavaScript for client-side data validation.
  • Developed different modules using standard J2EE architecture, including Servlets, JSP, EJB, and JDBC.
  • Communicated with the clients and the business on a daily basis.
  • Developed JSP pages for the new modules, applying business logic using Struts 2 and Tiles, built on design paradigms like Business Delegate and MVC.
  • Successfully delivered the project on time without any failures.
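
A minimal sketch of a Spring JDBC DAO of the kind described above, using the era-appropriate JdbcTemplate.queryForObject(String, Object[], RowMapper) signature (present in Spring 2.5); the table, columns, and class names are illustrative.

    import java.sql.ResultSet;
    import java.sql.SQLException;
    import javax.sql.DataSource;
    import org.springframework.jdbc.core.JdbcTemplate;
    import org.springframework.jdbc.core.RowMapper;

    // Hypothetical DAO for account summaries; table and column names are illustrative.
    public class AccountDao {
        private final JdbcTemplate jdbc;

        public AccountDao(DataSource dataSource) {
            this.jdbc = new JdbcTemplate(dataSource);
        }

        public Account findById(long accountId) {
            return (Account) jdbc.queryForObject(
                "SELECT account_id, balance FROM accounts WHERE account_id = ?",
                new Object[] { accountId },
                new RowMapper() {
                    public Object mapRow(ResultSet rs, int rowNum) throws SQLException {
                        return new Account(rs.getLong("account_id"), rs.getDouble("balance"));
                    }
                });
        }

        // Simple value object returned to the web tier.
        public static class Account {
            public final long id;
            public final double balance;
            public Account(long id, double balance) { this.id = id; this.balance = balance; }
        }
    }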

Confidential

Java/J2EE Developer

Description: Confidential is a credit card management system designed to support multiple groups in credit card management. It is a database maintenance tool used by business people to check application status, by QA people to launch applications, and by developers to create, delete, and modify applications. Dashboard is a Java-based system running in a UNIX environment with WebLogic, Enterprise JavaBeans, and Java Server Pages.

Responsibilities:

  • Involved in the complete requirement analysis, design, coding, and testing phases of the project.
  • Created use case diagrams, sequence diagrams, functional specifications, and user interface diagrams using StarUML.
  • Developed the presentation layer using the Struts MVC framework and used Tiles layouts for the View Transactions and User Activity Monitoring modules.
  • Wrote Action classes, Request Processors, Business Delegates, Business Objects, Service classes, and JSP pages.
  • Implemented web services functionality in the application to allow external applications to access data.
  • Developed user interfaces using JSP, HTML, XML, and JavaScript.
  • Generated XML Schemas and used XMLBeans to parse XML files.
  • Involved in developing the business tier with stateless and stateful session beans per EJB 3.0 standards, and developed business components.
  • Developed code that creates XML files and flat files from data retrieved from databases and XML files.
  • Created data sources and helper classes utilized by all interfaces to access and manipulate data.
  • Used JMS to communicate with the mainframe (a sender sketch follows this list).
  • Followed TDD (Test-Driven Development) and developed JUnit test cases for unit testing of every module.
  • WebLogic was used as the production application server.
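
A minimal sketch of the kind of JMS sender described above, using the standard JMS 1.1 API; the JNDI names are hypothetical and would be configured in WebLogic alongside the mainframe bridge.

    import javax.jms.Connection;
    import javax.jms.ConnectionFactory;
    import javax.jms.MessageProducer;
    import javax.jms.Queue;
    import javax.jms.Session;
    import javax.jms.TextMessage;
    import javax.naming.InitialContext;

    // Hypothetical JNDI names; the queue is assumed to be bridged to the mainframe.
    public class MainframeSender {
        public void send(String payload) throws Exception {
            InitialContext ctx = new InitialContext();
            ConnectionFactory cf = (ConnectionFactory) ctx.lookup("jms/MainframeCF");
            Queue queue = (Queue) ctx.lookup("jms/MainframeQueue");

            Connection con = cf.createConnection();
            try {
                Session session = con.createSession(false, Session.AUTO_ACKNOWLEDGE);
                MessageProducer producer = session.createProducer(queue);
                TextMessage msg = session.createTextMessage(payload);
                producer.send(msg);   // relayed to the mainframe by the configured bridge
            } finally {
                con.close();
            }
        }
    }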

Environment: Java 5.0, Servlets, JSPs, JavaScript, web services, Log4j, WebLogic Server, EJB 3.0.

Confidential

Java Developer

Description: This system is designed to handle inter-office communication and track employee activities. It provides overall infrastructure information, employees' personal information, training sessions, leave applications, a news box, a discussion forum, loan details, weekly scheduling, tax information, and personal chat between employees. The modules in this project are the Communication, Employee Details, and Services modules.

Responsibilities:

  • Developed the application under J2EE architecture; designed dynamic, browser-compatible user interfaces using JSP, custom tags, HTML, CSS, and JavaScript.
  • Deployed and maintained the JSP and servlet components on Tomcat.
  • Prepared the developer's guide and the deployment guide.
  • Involved in storing the details of all employees in the Oracle database and retrieving them when required by the administrator for the Employee Details module (a servlet sketch follows this list).
  • Developed the weekly schedule feature, which lets employees plan their weekly activities interactively.
  • Developed and utilized J2EE services and JMS components for messaging communication in WebLogic.
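
A minimal sketch of a servlet for the Employee Details module described above, in the JDK 1.4 style of the project; the JDBC URL, credentials, and table are illustrative.

    import java.io.IOException;
    import java.io.PrintWriter;
    import java.sql.Connection;
    import java.sql.DriverManager;
    import java.sql.ResultSet;
    import java.sql.Statement;
    import javax.servlet.ServletException;
    import javax.servlet.http.HttpServlet;
    import javax.servlet.http.HttpServletRequest;
    import javax.servlet.http.HttpServletResponse;

    // Hypothetical employee-details servlet; connection details are illustrative.
    public class EmployeeDetailServlet extends HttpServlet {
        protected void doGet(HttpServletRequest req, HttpServletResponse resp)
                throws ServletException, IOException {
            resp.setContentType("text/html");
            PrintWriter out = resp.getWriter();
            try {
                Class.forName("oracle.jdbc.OracleDriver");
                Connection con = DriverManager.getConnection(
                    "jdbc:oracle:thin:@dbhost:1521:ORCL", "intranet", "secret");
                Statement st = con.createStatement();
                ResultSet rs = st.executeQuery("SELECT emp_id, name FROM employees");
                while (rs.next()) {
                    out.println(rs.getString("emp_id") + " - "
                        + rs.getString("name") + "<br/>");
                }
                con.close();
            } catch (Exception e) {
                throw new ServletException(e);   // surface DB errors as servlet errors
            }
        }
    }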

Environment: Java (JDK 1.4), JavaScript, HTML, Servlets, Eclipse, JSP, Apache Tomcat, and Oracle.
