Java Developer Resume Profile
Experience Summary
- 10 years of experience in the IT industry
- 7 years of experience in Java/J2EE technologies
- 3 years of experience in Big Data/Hadoop
- Cloudera Certified Professional in Hadoop Big Data processing
- Specially trained in HDFS, MapReduce, Pig and Hive
- Analyzed various Big Data use cases using MapReduce, Pig and Hive
- Strong hands-on experience in MapReduce, Pig and Hive
- Strong working knowledge of the Bedrock Hadoop integration tool
- Strong hands-on experience in J2EE, Java, JSP, Servlets, JSF, Web Services, EJB, MQ, JMS, MDB, JavaScript, PL/SQL, Oracle and UNIX technologies
- Strong programming skills in the design and implementation of multi-tier applications using Java, J2EE, JDBC, JSP, Servlets, JSTL, HTML, JSF, Struts, JavaScript, JavaBeans, CSS, EJB, MQ, JMS, MDB and XSLT
- Strong working experience with the Struts, Spring and Hibernate frameworks
- Strong experience in all the phases of software development life cycle including requirements gathering, analysis, design, implementation, deployment and support
- Handled the tasks of providing technical direction for developing, designing and integrating systems for customers
- Assigned the tasks of monitoring and reviewing the work of the development team
- Responsible for updating project manager regarding status of development efforts
- Served as a liaison between project manager and development staff
- Handled the tasks of identifying system deficiencies and implementing effective solutions
- Assigned the tasks of enforcing coding standards
- Handled the responsibilities of managing technical risks throughout the project
- Provided design and architecture solutions
- Provided production support and resolved problems
- Responsible for creating and executing development plans
- Performed code reviews and supervised junior developers
- 6 years of onsite experience in the US, with the ability to handle all client demands
- Experience in software testing, JUnit testing, regression testing, and defect tracking and management using ClearQuest
- Excellent written and verbal communication, presentation and problem-solving skills
- Strong communication and relationship management skills; enthusiastic and self-driven, with a high degree of commitment
Technical Skills
- Hadoop/Big Data technologies: Hadoop, MapReduce, HDFS, Pig, Hive, HBase, Oozie, ZooKeeper, Sqoop, Flume
- Languages and APIs: Java, JSP, Servlets, JSF, Web Services, MQ, JMS, XML, JavaScript, Shell Scripts, AJAX, SAX, PL/SQL
- Tools: Bedrock Hadoop Integration Tool, WebSphere 6.1, WebSphere 8, SQL Navigator, PuTTY, WinSCP, FileZilla, Actuate Designer, CVS, SVN, WSAD 5.1, RAD 7.5, RAD 8, RTC, ClearCase
- Databases: Oracle
- Architectures: Three-tier architecture and MVC
- Frameworks: Struts, Spring, Hadoop
- Estimation techniques: Use Case Points
- Process know-how: Aware of the complete SDLC process and applied it in all previous projects
- Operating systems: Windows and UNIX
Relevant Project Experience
Confidential
Description : A data fabric being built within Optum Health for the storage, transformation, integration and retrieval of source data. Hadoop is an open-source software framework that enables distributed parallel processing of huge volumes of data across inexpensive commodity servers and storage systems. The proposed design of the data fabric has two data repositories: one that stores minimally processed inbound data (the Raw Zone), and a second that stores standardized, cleansed inbound and outbound data (the Hub Zone). These repositories provide a centralized location where multiple information-delivery projects can obtain the data they need quickly, without impacting other projects or negatively affecting the usability of the data.
In the Data Fabric, data from the source systems is extracted and loaded into an intermediate SFTP server drop zone using DataStage ETL jobs. The data is then run through data lineage algorithms that perform watermark tagging, stamping a watermark id onto the data at row level. A series of data quality checks is then run on the data, and the quality metrics are captured in HDFS. The watermark and data quality processes are collectively called the Intake Batch of the Data Fabric, and the Intake Batch data is made available in the Raw Zone.
Once the Intake Batch completes, the Integration Batch is triggered. Its first process is Data Standardization, which transforms data elements of disparate data types from multiple sources into a common notation. The next step is integration of member data with the Identity Management Service (IMS), a Master Person Index solution; the interface between the Data Fabric and IMS is a web service hosted in the IMS application over secure HTTP. The Integration Batch data is made available in the Hub Zone.
The final process is provisioning, which involves the systematic and scheduled delivery of data dumps from both the Raw and Hub Zones, conforming to target layouts specified by the customer requirements.
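The row-level watermark stamping described above could be sketched roughly as follows. This is a minimal illustration, not the actual Data Fabric implementation; the class, method, and row layout are hypothetical.

```java
import java.util.List;
import java.util.UUID;
import java.util.stream.Collectors;

// Hypothetical sketch of row-level watermark stamping: one watermark id is
// generated per intake batch and prefixed onto every inbound row, so lineage
// can be traced from the Raw Zone back to the originating batch.
public class WatermarkStamper {

    private final String watermarkId = UUID.randomUUID().toString();

    public String watermarkId() {
        return watermarkId;
    }

    // Prefix each pipe-delimited row with the batch's watermark id.
    public List<String> stampRows(List<String> rows) {
        return rows.stream()
                   .map(row -> watermarkId + "|" + row)
                   .collect(Collectors.toList());
    }
}
```

In a real pipeline the stamped rows would then be written to HDFS; here the prefix convention simply stands in for whatever tagging scheme the lineage algorithms actually use.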
Confidential
Responsibility:
- Elicited and documented functional, technical and infrastructure requirements using use cases, user stories, rule specifications and other requirements artifacts
- Gathered business requirements and documented use cases
- Generated high-level and detailed software design documents using class diagrams and sequence diagrams
- Designed and coded various components using Hadoop frameworks such as MapReduce, Pig, Hive and HBase
- Performed unit and integration testing of project modules
- Served as onsite team lead, managing project delivery and client interaction
- Coordinated with offshore teams for work assignments and delivery management
- Conducted technical reviews for peers and other resources in the team
- Performed performance evaluations for team members
- Coordinated and conducted trainings for new joiners and other team members
- Maintained a good rapport with the team and interacted effectively and professionally with clients and team members
- Performed code reviews, provided technical consultation and recommended implementation changes for other internal projects
- Provided round-the-clock customer support and monitored the production application
Software : SVN, Eclipse
Technology : MapReduce, Hive, Java, UNIX
Confidential
Role : Senior Developer and SME
Responsibility:
- Elicited and documented functional, technical and infrastructure requirements using use cases, user stories, rule specifications and other requirements artifacts
- Gathered business requirements and documented use cases
- Generated various design documents using class diagrams and sequence diagrams
- Developed many enhancement requests pertaining to new vendor integration with the existing Personal Quote application for Liberty Mutual
- Made extensive use of core Java features such as multi-threading, collections and exception handling in the application
- Developed Java programs complying with coding standards defined by technical management
- Designed and developed web service endpoints using Apache Axis
- Designed and developed web components using JSP and the Struts framework
- Developed data access components using Hibernate
- Analyzed and fixed various production defects reported by business users, which required end-to-end application knowledge and technical expertise in various Java/J2EE components
- Developed and deployed many interfacing connectivity components using JMS and MQ
- Designed an in-house batch process to analyze agent activities in the Personal Quote application using Hadoop frameworks such as MapReduce, Pig and Hive
- Served as onsite team lead, managing project delivery and client interaction
- Coordinated with offshore teams for work assignments and delivery management
- Conducted technical reviews for peers and other resources in the team
- Performed performance evaluations for team members
- Coordinated and conducted trainings for new joiners and other team members
- Maintained a good rapport with the team and interacted effectively and professionally with clients and team members
- Performed code reviews, provided technical consultation and recommended implementation changes for other internal projects
Duration : Feb 2012 to Jan 2013
Software : WebSphere 8, UNIX, SQL Explorer, SVN, RAD 8
Technology : Java, JSP, Web Services, MQ, JMS, MDB, XML, JavaScript, Shell Scripts, MapReduce, Pig and Hive
Frameworks: Struts, Hibernate, Hadoop
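The core Java features called out above (multi-threading, collections, exception handling) could be illustrated with a sketch like the following. The class name, the batch-rating task, and the doubling "rating rule" are hypothetical stand-ins, not the actual Personal Quote logic.

```java
import java.util.ArrayList;
import java.util.List;
import java.util.concurrent.Callable;
import java.util.concurrent.ExecutionException;
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import java.util.concurrent.Future;

// Hypothetical sketch: rate a batch of quotes concurrently with an
// ExecutorService (multi-threading), collect results into a List
// (collections), and contain per-task failures (exception handling).
public class QuoteRater {

    public static List<Integer> ratePremiums(List<Integer> basePremiums) {
        ExecutorService pool = Executors.newFixedThreadPool(4);
        try {
            List<Callable<Integer>> tasks = new ArrayList<>();
            for (int base : basePremiums) {
                tasks.add(() -> base * 2);  // stand-in for a real rating rule
            }
            List<Integer> rated = new ArrayList<>();
            for (Future<Integer> f : pool.invokeAll(tasks)) {
                try {
                    rated.add(f.get());
                } catch (ExecutionException e) {
                    rated.add(-1);  // mark the failed quote, keep the batch going
                }
            }
            return rated;
        } catch (InterruptedException e) {
            Thread.currentThread().interrupt();
            throw new IllegalStateException("rating interrupted", e);
        } finally {
            pool.shutdown();
        }
    }
}
```

The design point is that one bad quote fails onto a sentinel value instead of aborting the whole batch, which mirrors how production batch code typically isolates task-level exceptions.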
Confidential
Role : Project Lead
Responsibility:
- Performed requirement analysis, design, coding, testing and implementation of application software
- Led the project, taking ownership of all deliverables
- Interacted with all stakeholders and provided optimal solutions
- Designed and developed web service clients to send SOAP requests and process SOAP responses
- Elicited requirements from the business team
- Designed and developed the web application using JSP and Servlets
- Designed and developed complex business-layer components in Java
- Designed and developed XML parsers using SAX APIs
- Designed and developed MDBs to receive and send XML messages
- Designed and developed Oracle procedures, tables and functions
- Designed and developed batch jobs using shell scripts
- Led a team of 7 members and reviewed all deliverables before submitting them to the client
- Interacted with business end users to gather all requirements and mentored the team accordingly
- Performed project estimation and project planning
- Managed project risk
- Managed technical resources within project schedules and budgets
- Took 100% responsibility for the successful execution of the project
- Acted as business analyst and supported UAT and QA
- Provided 24x7 production support of the TRE application
- Effectively reviewed all deliverables in all phases of the SDLC
Duration : June 2008 to Jan 2012
Software : WebSphere 6.1, Oracle 10g, Actuate Designer, UNIX, FileZilla, SQL Navigator, SVN, RAD 7.5
Technology : Java, JSP, Servlets, JSF, Web Services, MQ, JMS, MDB, XML, JavaScript, Shell Scripts, PL/SQL, Oracle, CSS, JAXM, JAXB, SAX
Frameworks: Struts, Spring
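The SAX-based XML parsing mentioned above can be sketched with the JDK's built-in SAX API. The message shape and the `account`/`id` names here are illustrative assumptions, not the actual TRE message format.

```java
import java.io.ByteArrayInputStream;
import java.nio.charset.StandardCharsets;
import java.util.ArrayList;
import java.util.List;
import javax.xml.parsers.SAXParserFactory;
import org.xml.sax.Attributes;
import org.xml.sax.helpers.DefaultHandler;

// Hypothetical sketch: pull <account id="..."> values out of an inbound XML
// message with the JDK's event-driven SAX API; no DOM tree is built, so
// memory stays flat even for large messages.
public class AccountIdHandler extends DefaultHandler {

    private final List<String> ids = new ArrayList<>();

    @Override
    public void startElement(String uri, String localName, String qName, Attributes attrs) {
        if ("account".equals(qName)) {
            ids.add(attrs.getValue("id"));  // element and attribute names are illustrative
        }
    }

    public static List<String> parse(String xml) {
        try {
            AccountIdHandler handler = new AccountIdHandler();
            SAXParserFactory.newInstance().newSAXParser()
                    .parse(new ByteArrayInputStream(xml.getBytes(StandardCharsets.UTF_8)), handler);
            return handler.ids;
        } catch (Exception e) {
            throw new IllegalArgumentException("bad XML message", e);
        }
    }
}
```

In an MDB-driven flow of the kind listed above, a handler like this would typically be invoked from `onMessage` on the text payload of each incoming JMS message.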
Confidential
Role : Web Developer, Business Analyst and Production support
Responsibility:
- Performed analysis, review, coding, unit testing and offshore coordination
- Designed and developed the Einstein web application
- Designed and developed business logic using Java
- Designed and developed web service clients to access customer information
Duration : Feb 2007 to May 2008
Software : WSAD 5.1, Oracle 10g, SQL Navigator
Technology : Java, JSP, Servlets, Oracle, Web Services, JavaScript, Shell Scripts
Confidential
Role : Web Developer
Responsibility:
- Performed requirement analysis, design, coding, testing and implementation
- Designed and developed Oracle stored procedures and functions
- Designed and developed the web application presentation layer using JSP, Servlets and Struts
Duration : Jan 2005 to Jan 2007
Software : WSAD 5.1, Oracle 10g, TOAD
Technology : Java, JSP, Servlets, Struts, EJB, Oracle, JavaScript, Shell Scripts, PL/SQL
Confidential
Role : Web Developer
Responsibility :
- Designed and developed the library system maintenance web application
- Designed and developed various reports based on Oracle stored procedures
Duration : July 2004 to Dec 2004
Software : Eclipse, Oracle 10g, TOAD
Technology : Java, JSP, Servlets, Struts, EJB, Oracle