Java Developer Resume Profile
Baltimore, MD
Summary
- 8 years of overall IT experience, including 4 years of comprehensive experience as an Apache Hadoop Developer. Expertise in writing Hadoop jobs for analyzing data using Hive, Pig and Oozie.
- Sun Certified Java Programmer with over 4 years of extensive programming experience developing web-based and client-server applications using Java and J2EE.
- Good knowledge of Hadoop architecture and its various components, such as HDFS, JobTracker, TaskTracker, NameNode and DataNode, and of MapReduce concepts.
- Experience in developing MapReduce programs on Hadoop for working with Big Data.
- Experience in analyzing data using Hive QL, Pig Latin and custom MapReduce programs in Java.
- Experience in importing and exporting data using Sqoop from HDFS to Relational Database Systems and vice-versa.
- Working experience in designing and implementing complete end-to-end Hadoop infrastructure including Pig, Hive, Sqoop, Oozie, Flume and ZooKeeper.
- Experience in providing support to data analysts in running Pig and Hive queries.
- Developed MapReduce programs to perform analysis (a minimal Java sketch appears after this summary).
- Imported and exported data into HDFS and Hive using Sqoop.
- Experience in writing shell scripts to dump sharded data from MySQL servers to HDFS.
- Experience in designing both time driven and data driven automated workflows using Oozie.
- Experience in setting up an InfiniBand network and building a Hadoop cluster to improve MapReduce performance.
- Experience in performance tuning the Hadoop cluster by gathering and analyzing data on the existing infrastructure.
- Experience in automating Hadoop installation and configuration and maintaining the cluster using tools like Puppet.
- Experience in setting up monitoring infrastructure for Hadoop cluster using Nagios and Ganglia.
- Experience in working with Flume to load log data from multiple sources directly into HDFS.
- Strong debugging and problem solving skills with excellent understanding of system development methodologies, techniques and tools.
- Worked in the complete Software Development Life Cycle (analysis, design, development, testing, implementation and support) across different application domains, with technologies ranging from object-oriented programming to Internet programming, on Windows NT, Linux and UNIX/Solaris platforms, following RUP methodologies.
- Familiar with RDBMS concepts; worked on Oracle 10g/9i, SQL Server 7.0 and DB2 8.x/7.x.
- Involved in writing shell scripts and Ant scripts on UNIX for application deployments to the production region.
- Exceptional ability to quickly master new concepts; capable of working in a group as well as independently, with excellent communication skills.
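For illustration, a minimal sketch of the kind of MapReduce job described above, assuming the Hadoop 2.x org.apache.hadoop.mapreduce API: a Java job that drops malformed records (the cleaning step) and counts the remaining ones per status field. The class names, tab-delimited layout and field positions are hypothetical.

```java
import java.io.IOException;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.io.IntWritable;
import org.apache.hadoop.io.LongWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.Job;
import org.apache.hadoop.mapreduce.Mapper;
import org.apache.hadoop.mapreduce.Reducer;
import org.apache.hadoop.mapreduce.lib.input.FileInputFormat;
import org.apache.hadoop.mapreduce.lib.output.FileOutputFormat;

public class StatusCount {

    // Mapper: skips malformed lines, emits (status, 1) for the rest.
    public static class CleanMapper
            extends Mapper<LongWritable, Text, Text, IntWritable> {
        private static final IntWritable ONE = new IntWritable(1);
        private final Text status = new Text();

        @Override
        protected void map(LongWritable key, Text value, Context context)
                throws IOException, InterruptedException {
            String[] fields = value.toString().split("\t");
            if (fields.length < 3) {
                return; // data cleaning: drop records that are too short
            }
            status.set(fields[2]); // hypothetical position of the status field
            context.write(status, ONE);
        }
    }

    // Reducer (also used as combiner): sums the counts per status.
    public static class SumReducer
            extends Reducer<Text, IntWritable, Text, IntWritable> {
        @Override
        protected void reduce(Text key, Iterable<IntWritable> values, Context context)
                throws IOException, InterruptedException {
            int sum = 0;
            for (IntWritable v : values) {
                sum += v.get();
            }
            context.write(key, new IntWritable(sum));
        }
    }

    public static void main(String[] args) throws Exception {
        Job job = Job.getInstance(new Configuration(), "status count");
        job.setJarByClass(StatusCount.class);
        job.setMapperClass(CleanMapper.class);
        job.setCombinerClass(SumReducer.class);
        job.setReducerClass(SumReducer.class);
        job.setOutputKeyClass(Text.class);
        job.setOutputValueClass(IntWritable.class);
        FileInputFormat.addInputPath(job, new Path(args[0]));   // input dir on HDFS
        FileOutputFormat.setOutputPath(job, new Path(args[1])); // must not yet exist
        System.exit(job.waitForCompletion(true) ? 0 : 1);
    }
}
```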
Technical Skills:
- Languages/Tools : Java, C, C++, VB, XML, HTML/XHTML, HDML, DHTML.
- Big Data : Hadoop, MapReduce, Hive, Pig, Sqoop and MRUnit
- J2EE Standards : JDBC, JNDI, JMS, JavaMail, XML Deployment Descriptors.
- Web/Distributed Technologies : J2EE, Servlets 2.1/2.2, JSP 2.0, Struts 1.1, Hibernate 3.0, JSF, JSTL 1.1, EJB 1.1/2.0, RMI, JNI, XML, JAXP, XSL, XSLT, UML, MVC, Spring 2.0, CORBA, Java Threads.
- Operating Systems : Windows 95/98/NT/2000/XP, MS-DOS, UNIX, Linux 6.2
- Databases : Oracle 11g/10g/9i, MS SQL Server 2000, DB2, MySQL
- Browser Languages : HTML, XHTML, CSS, XML, XSL, XSD, XSLT.
- Browser Scripting : JavaScript, HTML, DHTML, AJAX.
- App/Web Servers : IBM Websphere 5.1.2/5.0/4.0/3.5, Apache Tomcat, JBoss.
- GUI Environment : Swing, AWT.
- Messaging/Web Services : SOAP, WSDL, UDDI, XML, SOA, IBM WebSphere MQ v5.3, JMS.
- Networking Protocols : HTTP, HTTPS, FTP, UDP, TCP/IP, SNMP, SMTP, POP3.
- Testing Case Tools : JUnit, Log4j, CVS, ANT, JBuilder.
Professional Experience
Confidential
Java Developer
Description:
Confidential is the marketplace for individuals, families and small businesses to compare and enroll in health insurance and determine eligibility for Medicaid and other assistance programs, federal tax credits and cost-sharing reductions. Confidential Health Connection begins in October 2013, with insurance coverage beginning January 1, 2014 for individuals and families. Enrollment for small businesses begins when the Small Confidential. Confidential is a public corporation and independent unit of Confidential government established in Confidential. The Confidential has a nine-member Board of Trustees that includes the Secretary of the Maryland Department of Health and Mental Hygiene, the Maryland Insurance Commissioner and the Executive Director of the Maryland Health Care Commission. The Maryland Health Benefit Exchange is responsible for the administration of Maryland Health Connection.
Responsibilities:
- Installed and configured Hadoop MapReduce and HDFS; developed multiple MapReduce jobs in Java for data cleaning and preprocessing.
- Installed and configured Pig and wrote Pig Latin scripts.
- Wrote MapReduce jobs using Pig Latin.
- Solid understanding of the REST architectural style and its application to high-performing web sites for global usage.
- Involved in ETL, data integration and migration; imported data using Sqoop to load data from Oracle to HDFS on a regular basis.
- Developed scripts and batch jobs to schedule various Hadoop programs.
- Wrote Hive queries for data analysis to meet the business requirements.
- Created Hive tables and worked on them using HiveQL; imported and exported data between HDFS and the Oracle database using Sqoop.
- Implemented test scripts to support test-driven development and continuous integration (see the MRUnit sketch after this list).
- Responsible for managing data coming from different sources.
- Loaded and transformed large sets of structured, semi-structured and unstructured data.
- Experience in managing and reviewing Hadoop log files.
- Worked on Hive to expose data for further analysis and to transform files from different analytical formats into text files.
- Managed and scheduled jobs on a Hadoop cluster.
- Involved in creating Hive tables, loading them with data and writing Hive queries, which run internally as MapReduce jobs.
- Used Pig as an ETL tool to perform transformations, event joins, bot-traffic filtering and some pre-aggregations before storing the data in HDFS.
- Implemented J2EE standards and MVC2 architecture using the JSF framework.
- Implemented Servlets, JSP and Ajax to design the user interface.
- All business logic in all modules is written in core Java.
- Involved in writing Hive scripts to extract, transform and load the data into the database.
- Used JIRA for bug tracking.
- Used CVS for version control.
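As an illustration of the test-driven development mentioned above (MRUnit appears under Technical Skills), a minimal MRUnit sketch that exercises a cleaning mapper like the one in the summary sketch; the class names and record layout are hypothetical.

```java
import java.io.IOException;

import org.apache.hadoop.io.IntWritable;
import org.apache.hadoop.io.LongWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mrunit.mapreduce.MapDriver;
import org.junit.Before;
import org.junit.Test;

public class CleanMapperTest {
    private MapDriver<LongWritable, Text, Text, IntWritable> mapDriver;

    @Before
    public void setUp() {
        // reuses the hypothetical mapper from the earlier summary sketch
        mapDriver = MapDriver.newMapDriver(new StatusCount.CleanMapper());
    }

    @Test
    public void emitsStatusForWellFormedRecord() throws IOException {
        mapDriver.withInput(new LongWritable(0), new Text("id1\t2013-01-01\tOK"))
                 .withOutput(new Text("OK"), new IntWritable(1))
                 .runTest();
    }

    @Test
    public void dropsMalformedRecord() throws IOException {
        // no expected output: malformed lines are filtered by the mapper
        mapDriver.withInput(new LongWritable(0), new Text("too-short"))
                 .runTest();
    }
}
```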
Environment: Hadoop, MapReduce, HDFS, Hive, Pig, Sqoop, NoSQL, shell scripting, Java (JDK 1.6), Eclipse, Oracle 10g, PL/SQL, SQL*Plus, Toad 9.6, Linux, JIRA 5.1/5.2, CVS.
Confidential
Hadoop developer
Description:
Confidential is committed to developing a globally integrated Tax Organization that focuses on end-to-end standardized processes and technology executed at an optimal cost, in addition to providing robust reporting and analytics to facilitate realization of significant Tax assets. The Global Tax Organization initiated a limited Discovery Project with Accenture PwC to assess the current state and determine the high-level tax data, process and technology capabilities required to meet accelerated financial-reporting close requirements.
Responsibilities:
- Developed Oozie workflows for daily incremental loads, which pull data from Teradata and import it into Hive tables.
- Developed Pig scripts to transform the data into a structured format; these are automated through Oozie coordinators.
- Developed Hive queries for analysis across different banners.
- Loaded data from different data sources such as Teradata and DB2 into HDFS using Sqoop and loaded it into partitioned Hive tables.
- Developed Hive UDFs to bring all customer email IDs into a structured format (see the UDF sketch after this list).
- Developed bash scripts to pull the Tlog files from the FTP server and process them for loading into Hive tables.
- All the bash scripts are scheduled using Resource Manager Scheduler.
- Moved data from HDFS to Cassandra using MapReduce and the BulkOutputFormat class.
- Developed MapReduce programs for applying business rules to the data.
- Developed and executed Hive queries for denormalizing the data.
- Supported data analysts in running MapReduce programs.
- Worked on importing and exporting data into HDFS and Hive using Sqoop.
- Worked on analyzing data with Hive and Pig.
- Developed and implemented the MVC Architectural Pattern using Struts Framework including JSP, Servlets and EJB.
- Implemented server side tasks using Servlets and XML.
- Created and deployed web pages using HTML, JSP, JavaScript, jQuery and CSS.
- Experience in implementing rack topology scripts for the Hadoop cluster.
- Managed the day-to-day operations of the cluster for backup and support.
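As a sketch of the Hive UDF work mentioned above, a small Java function (using the classic org.apache.hadoop.hive.ql.exec.UDF API) that normalizes email IDs to a single trimmed, lower-case form; the class name and normalization rule are hypothetical.

```java
import org.apache.hadoop.hive.ql.exec.UDF;
import org.apache.hadoop.io.Text;

// Hypothetical UDF: brings customer email IDs into one structured form.
public final class NormalizeEmail extends UDF {
    public Text evaluate(Text email) {
        if (email == null) {
            return null; // propagate SQL NULLs unchanged
        }
        return new Text(email.toString().trim().toLowerCase());
    }
}
```

Once the jar is added to the Hive session, the function would be registered with CREATE TEMPORARY FUNCTION and called in queries like any built-in.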
Environment: Hadoop, MapReduce, HDFS, Hive, Pig, Sqoop, NoSQL, Java 1.5, J2EE, Struts 2.0, WebSphere 7.0, IBM RAD 8.0, Rational ClearCase 7.0, XML, JAXP, Castor, XSL, XSLT, XML Schema (XSD), WSDL 2.0, SOAP, JSP 2.2, CSS, Servlets, JavaScript, HTML, JMS, Axis 2, open-source technologies (Ant, Log4j and JUnit), Oracle 10g, UNIX.
Confidential
Hadoop J2EE Developer
Description:
Confidential, a leading professional services and application development provider for communications companies, was chosen to partner with Confidential Enterprise IT on this effort, leveraging its background in IT project management as well as general skills in Hadoop design and development. This project, coupled with strong analysis requirements and strategy, will operationalize an enterprise-class project management solution to manage construction and engineering projects related to critical, capital-intensive network projects.
Responsibilities:
- Involved in design, development and maintenance of all Communications web applications.
- Importing and exporting data into HDFS and Hive using Sqoop.
- Installed and configured Hadoop MapReduce and HDFS; developed multiple MapReduce jobs in Java for data cleaning and preprocessing.
- Used JMS for point-to-point asynchronous messaging for high-transaction operations (see the sketch after this list).
- Fixed defects as needed during the QA phase, supported QA testing, troubleshot defects and identified their sources.
- Coded and unit tested according to client standards; provided production support and quickly resolved issues until the integration test passed.
- Used JavaScript for client-side validations.
- Set up the deployment environment on WebSphere 6.1. Developed system-preferences UI screens using JSP 2.0 and HTML.
- Implemented the business logic in the middle layer, written in Java/J2EE; the business rules are defined in XML and parsed by a Java rules engine for risk calculation.
- Experienced in defining job flows.
- Experienced in managing and reviewing Hadoop log files.
- Gained good experience with NoSQL databases.
- Experienced in running Hadoop streaming jobs to process terabytes of XML-format data.
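A minimal sketch of the point-to-point JMS messaging mentioned above, using the standard javax.jms API; the JNDI names and payload are hypothetical, and a transacted session stands in for the high-transaction path.

```java
import javax.jms.Connection;
import javax.jms.ConnectionFactory;
import javax.jms.MessageProducer;
import javax.jms.Queue;
import javax.jms.Session;
import javax.jms.TextMessage;
import javax.naming.InitialContext;

public class OrderSender {
    public static void main(String[] args) throws Exception {
        InitialContext ctx = new InitialContext(); // provider settings from jndi.properties
        ConnectionFactory factory =
                (ConnectionFactory) ctx.lookup("jms/ConnectionFactory"); // hypothetical name
        Queue queue = (Queue) ctx.lookup("jms/OrderQueue");              // hypothetical name

        Connection connection = factory.createConnection();
        try {
            // transacted session so the send is committed atomically
            Session session = connection.createSession(true, Session.SESSION_TRANSACTED);
            MessageProducer producer = session.createProducer(queue);
            TextMessage message = session.createTextMessage("<order id=\"42\"/>");
            producer.send(message);
            session.commit();
        } finally {
            connection.close();
        }
    }
}
```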
Environment: Java 1.6, J2EE, Spring MVC 2.5, WebSphere 6.1, IBM RAD 7.5, XML, XML Spy, XSD, XSL, XSLT, JSP, JavaScript, JTest, HTML, JSON, Agile, Scrum, CMS, open-source technologies (Ant, Log4j and JUnit), Oracle 10g, Toad, UNIX, Hadoop distributions from Hortonworks and Cloudera, MapReduce, Hadoop, Hive, HBase.
Confidential
Java Developer
Confidential offers consumer and business loans. This project provides an online mortgage loan calculation system. On this platform, loan officers can collect and review pre-application information to qualify potential borrowers and generate related reports. Borrowers can review monthly payments, make changes, and add, modify or clear conditions for any loan.
Responsibilities:
- Developed UI using HTML, JavaScript, and JSP, and developed Business Logic and Interfacing Components using Business Objects, XML, and JDBC.
- Designed the user interface and implemented validation checks using JavaScript.
- Implemented Struts 2 MVC as the web tier and was involved in writing custom interceptors.
- Managed connectivity using JDBC for querying/inserting data, including triggers and stored procedures.
- Used Struts as the mid-tier and Hibernate ORM framework to map between Database tables and Java Beans.
- Involved in design of JSP's and Servlets for navigation among the modules.
- Established database connections to Oracle using the Hibernate framework (see the sketch after this list).
- Used Hibernate cache to avoid unnecessary database access.
- Configured Maven for automated testing and deployment.
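A minimal sketch of the Hibernate access described above; the Loan class, its mapping and the DAO are hypothetical, with hibernate.cfg.xml assumed to hold the Oracle connection settings and the Loan mapping.

```java
import org.hibernate.Session;
import org.hibernate.SessionFactory;
import org.hibernate.cfg.Configuration;

// Hypothetical persistent class; a Loan.hbm.xml mapping is assumed.
class Loan {
    private Long id;
    private double monthlyPayment;
    // getters/setters omitted for brevity
}

public class LoanDao {
    private final SessionFactory sessionFactory =
            new Configuration().configure().buildSessionFactory();

    public Loan findLoan(Long id) {
        Session session = sessionFactory.openSession();
        try {
            // get() consults the session (and the second-level cache, if
            // enabled) before issuing a SELECT, which is how unnecessary
            // database access is avoided
            return (Loan) session.get(Loan.class, id);
        } finally {
            session.close();
        }
    }
}
```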
Environment:
JDK 1.5, J2EE, HTML, CSS, JavaScript, Ajax, jQuery, Oracle 10g, Struts 2, Hibernate 3, Maven, Eclipse 3.0
Confidential
Java Developer
The tool is designed to work with a database to help integrate data systematically and to analyze and interpret important information. It allows users to easily create simple reports, and it also has the comprehensive tools required to produce specialized reports for advanced analytics.
Responsibilities:
- Handled a major piece of project functionality that lets users view report templates in different patterns by changing the selection criteria dynamically.
- Used J2EE design patterns such as MVC, Session Facade, Business Delegate and DAO (see the DAO sketch after this list).
- This component required dynamic behavior and inter-dependencies across the various filter selections; the complexity was handled using the Ajax tool SmartClient and JavaScript.
- Implemented O/R mapping with Hibernate to access data from the database.
- Developed using the Eclipse IDE and the Tomcat application server, and deployed on JBoss.
- Used an SVN repository for version control.
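A small sketch of the DAO pattern mentioned above: the web tier depends only on an interface, while a Hibernate-backed class supplies the implementation. ReportTemplate, the method names and the query are hypothetical.

```java
import java.util.List;

import org.hibernate.Session;
import org.hibernate.SessionFactory;

// Hypothetical domain class for the report tool; its mapping is assumed.
class ReportTemplate {
    private Long id;
    private String name;
    // getters/setters omitted for brevity
}

// The interface the web tier codes against.
interface ReportTemplateDao {
    ReportTemplate findById(Long id);
    List<ReportTemplate> findByName(String name);
}

// Hibernate-backed implementation, handed to callers as a ReportTemplateDao.
public class HibernateReportTemplateDao implements ReportTemplateDao {
    private final SessionFactory sessionFactory;

    public HibernateReportTemplateDao(SessionFactory sessionFactory) {
        this.sessionFactory = sessionFactory;
    }

    public ReportTemplate findById(Long id) {
        Session session = sessionFactory.openSession();
        try {
            return (ReportTemplate) session.get(ReportTemplate.class, id);
        } finally {
            session.close();
        }
    }

    @SuppressWarnings("unchecked")
    public List<ReportTemplate> findByName(String name) {
        Session session = sessionFactory.openSession();
        try {
            return session.createQuery("from ReportTemplate where name = :name")
                          .setParameter("name", name)
                          .list();
        } finally {
            session.close();
        }
    }
}
```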
Environment: Java 1.5, J2EE, Struts 2.0, Spring 2.5, Web Services, Hibernate 3.2, JSP 2.0, HTML, SmartClient, JavaScript, CSS, JMS, Tomcat 5.0, SOAP, XML, XSLT, PL/SQL, Oracle 10g, Log4j, Eclipse SDK 3.3, SVN, JBoss, JPA, Maven 2.0, JUnit, Linux.