Java Hadoop Developer Resume Profile
Jacksonville, FL
PROFESSIONAL SUMMARY:
- Hadoop/Java/Mainframes Developer with over 8 years of experience as a software developer in designing, developing, deploying and supporting large-scale distributed systems.
- 2 years of extensive experience as a Hadoop Developer and Big Data Analyst
- Cloudera Certified Developer for Apache Hadoop (CCDH)
- Excellent understanding of Hadoop architecture and the underlying framework, including storage management
- Experience in installing, configuring and administering Hadoop clusters for major Hadoop distributions such as CDH3, CDH4 and CDH5
- Expertise in using various Hadoop ecosystem components such as MapReduce, Pig, Hive, ZooKeeper, HBase, Sqoop, Oozie, Flume and Spark for data storage and analysis
- Experience in writing custom UDFs for Hive to incorporate Java methods and functionality into HiveQL (HQL); see the sketch after this list
- Good experience with the Oozie framework and in automating daily import jobs
- Experienced in managing Hadoop clusters and services using Cloudera Manager
- Experienced in troubleshooting errors in Hive and MapReduce
- Highly experienced in importing and exporting data between HDFS and relational database management systems using Sqoop
- Collected log data from various sources and integrated it into HDFS using Flume
- Assisted Deployment team in setting up Hadoop cluster and services
- Good experience in generating statistics, extracts and reports from Hadoop
- Good understanding of NoSQL databases like HBase and Cassandra
- Strong experience in core Java, J2EE, SQL
- Strong experience in legacy Mainframe applications and the Mainframe technology suite
- Good knowledge of cluster benchmarking and performance tuning
- Experienced in identifying improvement areas for system stability and providing end-to-end high-availability architectural solutions
- Extensive experience in developing applications using Core Java and multi-threading
- Determined, committed and hardworking individual with strong communication, interpersonal and organizational skills
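The following is a minimal, hypothetical sketch of the kind of custom Hive UDF referred to above; the class name, input semantics and normalization logic are illustrative assumptions rather than code from any of the projects below.

```java
import org.apache.hadoop.hive.ql.exec.UDF;
import org.apache.hadoop.io.Text;

// Illustrative Hive UDF (hypothetical): normalizes a free-text status column so that
// records can be grouped consistently from HiveQL.
public final class NormalizeStatus extends UDF {
    public Text evaluate(Text input) {
        if (input == null) {
            return null;                      // Hive passes NULL columns as null
        }
        String cleaned = input.toString().trim().toUpperCase();
        return new Text(cleaned.isEmpty() ? "UNKNOWN" : cleaned);
    }
}
```

Such a class is typically packaged into a JAR, registered in the Hive session with ADD JAR, and exposed to queries with CREATE TEMPORARY FUNCTION.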
TECHNICAL SKILLS:
- Hadoop Ecosystem: HDFS, MapReduce, YARN, Hive, Pig, HBase, Impala, ZooKeeper, Sqoop, Oozie, Apache Cassandra, Flume, Spark, Avro, AWS (Amazon EC2, S3)
- Web Technologies: HTML, XML, JDBC, JSP, JavaScript, AJAX
- RDBMS: Oracle 10g/11g, MySQL, SQL Server, Teradata
- NoSQL: HBase, Cassandra
- Web/Application servers: Tomcat
- Java frameworks: Struts, Spring, Hibernate
- Methodologies: Agile, UML, Design Patterns (Core Java and J2EE)
- Data Bases: Oracle 11g/10g, Teradata, DB2, MS-SQL Server, MySQL, MS-Access
- Programming Languages: C, Java, SQL, PL/SQL, Linux shell scripts
- Tools Used: Eclipse, MS Office, Crystal Reports
- BI Tools: Tableau
PROFESSIONAL EXPERIENCE:
Confidential
Java Hadoop Developer
Loans extended to customers are accessed through accounts in the Advanced Loan System (ALS). ALS makes it possible to offer better services by improving the associate's ability to manage the customer relationship and to personalize service to the customer. ALS provides customer and account information for lines of credit and installment loans, and is the core application that hosts each account from loan booking through loan closure; all payment and maintenance processing is handled here. The project was to set up the Hadoop cluster, install the various packages in the distribution, and move data for various applications from staging databases to HDFS. Legacy data stored on the Mainframe was also moved to HDFS. Data analysis in HDFS is performed using Hive and Pig jobs.
Responsibilities:
- Set up and benchmarked Hadoop/HBase clusters for internal use.
- Developed MapReduce programs for data analysis and data cleaning (a brief sketch follows this list).
- Used Hive, created Hive tables, and was involved in data loading and writing Hive UDFs.
- Modernized Mainframe data processing using Hadoop.
- Used Sqoop to import data into HDFS and Hive from other data systems.
- Developed Pig Latin scripts for the analysis of semi-structured data.
- Continuously monitored and managed the Hadoop cluster through Cloudera Manager.
- Migrated ETL processes from RDBMS to Hive to validate easier data manipulation.
- Developed Hive queries to process the data for visualization.
- Loaded and transformed large sets of structured, semi-structured and unstructured data.
- Sanitized data before loading it into the test environment.
- Participated in the development and implementation of the Cloudera Hadoop environment.
- Designed the architecture and developed the application that ingests and processes high-volume Mainframe (COBOL) data into the Hadoop infrastructure using Hadoop MapReduce.
- Designed and developed a customized business-rule framework to implement the business logic of the existing Mainframe process using Hive and Pig UDF functions.
- Worked with various kinds of data sources such as Teradata and Oracle; successfully loaded files from Teradata into HDFS and from HDFS into Hive.
- Used ZooKeeper and Oozie operational services for coordinating the cluster and scheduling workflows.
- Analyzed XML and log files.
- Supported MapReduce programs running on the cluster and was involved in loading data from the UNIX file system to HDFS.
- Exported the analyzed data to relational databases using Sqoop for visualization and to generate reports for the BI team.
- Created an e-mail notification service that runs upon job completion for the team that requested the data.
- Evaluated the suitability of Hadoop and its ecosystem for the project, implementing and validating various proof-of-concept (POC) applications to eventually adopt them and benefit from the Big Data Hadoop initiative.
- Maintained system integrity of all sub-components related to Hadoop.
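As referenced in the data-cleaning bullet above, a map-only MapReduce job is one common shape for this kind of work. The sketch below is a hypothetical example: the pipe delimiter, expected field count and counter names are assumptions, not details from the ALS project.

```java
import java.io.IOException;

import org.apache.hadoop.io.LongWritable;
import org.apache.hadoop.io.NullWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.Mapper;

// Hypothetical map-only cleaning step: drops malformed delimited records and trims
// whitespace from each field before the data is written back to HDFS.
public class RecordCleaningMapper
        extends Mapper<LongWritable, Text, NullWritable, Text> {

    private static final int EXPECTED_FIELDS = 12;   // assumed record width

    @Override
    protected void map(LongWritable key, Text value, Context context)
            throws IOException, InterruptedException {
        String[] fields = value.toString().split("\\|", -1);
        if (fields.length != EXPECTED_FIELDS) {
            context.getCounter("cleaning", "malformed").increment(1);
            return;                                   // skip malformed records
        }
        StringBuilder cleaned = new StringBuilder();
        for (int i = 0; i < fields.length; i++) {
            if (i > 0) {
                cleaned.append('|');
            }
            cleaned.append(fields[i].trim());
        }
        context.write(NullWritable.get(), new Text(cleaned.toString()));
    }
}
```

A driver would configure this mapper with zero reducers so that cleaned records stream straight back to HDFS.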
Environment: Apache Hadoop, HDFS, Cloudera Manager, Java, MapReduce, Eclipse, Hive, Pig, Sqoop, Oozie and SQL.
Confidential
Java Hadoop Developer
- Confidential Magellan: Oscar2000 is a thin-client, Java-based application embedded within Magellan, making it seamless to the user.
- The main Functionalities of DFS Oscar2000 are:
- 1. Disbursing the funds for approved credit applications.
- 2. Approval of checks (system/manual) for disbursement and voiding of checks for further approvals.
- 3. Printing checks for Dealer to disburse funds to the customers.
- 4. Report generation for Disbursements, system checks, manual checks and void checks
Responsibilities
- Developed solutions based on client requirements
- Coordinated and communicated with client partners and business partners on new requirements
- Supported the UAT/SIT environments and resolved issue tickets within the SLA
- Coordinated weekly client meetings and provided status updates on effort spent
- Involved in deployment of the applications into different environments
- Involved in the preparation of Unit test cases and performed Unit Testing
- Ensured the overall quality of the project
- Participated in requirements and development review meetings (PMR)
- Established best practices and lessons learned and planned for continuous improvement of processes
- Project monitoring and metrics reporting
- Project to leverage the power of Hadoop to deliver insights to the business on consumer lending. The solution is built using standard ETL (Sqoop) and the Hadoop ecosystem: an HDFS cluster, Hive, JobTracker and TaskTracker
- Hands-on experience in using Hadoop ecosystem components like Hadoop MapReduce, HDFS and Hive
- Imported and exported data to and from HDFS via Sqoop
- Loaded and transformed large sets of structured, semi-structured and unstructured data
- Involved in loading data from UNIX file system to HDFS
- Developed MapReduce programs to convert data from storage systems and flat-file systems into HDFS
- Analyzed XML files; used JAXB for parsing the XML and loaded the data into the Oracle DB (sketched after this list)
- Created an automated process for loading data from the Oracle database into Hive and performed analysis and reporting
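A minimal sketch of the JAXB-plus-JDBC pattern described in the XML analysis bullet above; the element names, target table and connection details are placeholders, not values from the project.

```java
import java.io.File;
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.PreparedStatement;

import javax.xml.bind.JAXBContext;
import javax.xml.bind.Unmarshaller;
import javax.xml.bind.annotation.XmlElement;
import javax.xml.bind.annotation.XmlRootElement;

// Hypothetical JAXB-annotated payload; the element names are illustrative only.
@XmlRootElement(name = "disbursement")
class Disbursement {
    @XmlElement public String checkNumber;
    @XmlElement public double amount;
}

public class XmlToOracleLoader {
    public static void main(String[] args) throws Exception {
        // Unmarshal the XML file named on the command line into the annotated class.
        Unmarshaller unmarshaller =
                JAXBContext.newInstance(Disbursement.class).createUnmarshaller();
        Disbursement d = (Disbursement) unmarshaller.unmarshal(new File(args[0]));

        // Insert the parsed record into Oracle; URL, credentials and table are placeholders.
        try (Connection con = DriverManager.getConnection(
                     "jdbc:oracle:thin:@//dbhost:1521/ORCL", "user", "password");
             PreparedStatement ps = con.prepareStatement(
                     "INSERT INTO DISBURSEMENTS (CHECK_NO, AMOUNT) VALUES (?, ?)")) {
            ps.setString(1, d.checkNumber);
            ps.setDouble(2, d.amount);
            ps.executeUpdate();
        }
    }
}
```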
Environment: Java, JSP, Servlets, JDBC, JavaBeans, Oracle, TFS/Rational ClearCase (for source control), Windows 2003/2008 Server, Windows XP, Nexus Request (for handling UAT/SIT tickets).
Hadoop Environment: Apache Hadoop, HDFS, MapReduce, Eclipse, Hive, Pig, Sqoop, Oozie, crontab, shell scripting and Linux.
Confidential
Java Developer
Confidential Online migrates customers from the legacy system into the new system for each legacy application. It also enables and disables the new and legacy applications for clients. This project involved many applications, such as CPW ACH, Direct ACH, Payments Rewrite, Transaction Investigation, Escrow Online, ACH Positive Pay, Reverse Positive Pay, Positive Pay, Reconciliation, Stop Payment, Check Inquiry and Image Access.
Responsibilities:
- Participated in requirement discussions with all the interfacing systems.
- Attended daily client meetings to provide status updates
- Task tracking, source control and agile planning.
- Continually interacted with the product team of each application during the design phase of the application.
- Responsible for distributing, tracking and communicating issues to developers and reporting status to the manager on a daily basis.
- Responsible for converting existing legacy customers to new system on each application.
- Developed complex SQL queries, stored procedures, functions, triggers and packages, and created indexes wherever applicable in the Oracle database (a brief sketch of calling such a procedure from Java follows this list).
- Participated in the build, deployment and release process for patches
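As referenced in the stored-procedure bullet above, the sketch below shows one plain-JDBC way to invoke such an Oracle procedure from Java; the procedure name, parameters and connection details are hypothetical.

```java
import java.sql.CallableStatement;
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.Types;

// Hypothetical call to an Oracle stored procedure that converts a legacy customer
// record to the new system; all identifiers and values are illustrative.
public class LegacyCustomerConversion {
    public static void main(String[] args) throws Exception {
        try (Connection con = DriverManager.getConnection(
                     "jdbc:oracle:thin:@//dbhost:1521/ORCL", "user", "password");
             CallableStatement cs = con.prepareCall("{call CONVERT_CUSTOMER(?, ?)}")) {
            cs.setLong(1, 1001L);                       // example legacy customer id
            cs.registerOutParameter(2, Types.VARCHAR);  // conversion status out-parameter
            cs.execute();
            System.out.println("Conversion status: " + cs.getString(2));
        }
    }
}
```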
Environment: JDK 1.6/1.5, J2EE, JDBC, Hibernate, XML, JUnit, Oracle 10g, PL/SQL, SQL Server, Eclipse, Visual Studio, Microsoft Visio, Windows XP/7, DCT, Quality Center, Jira, SharePoint, Discovery, and UNIX
Confidential
Java Developer
Confidential Online will allow Confidential corporate clients to access global treasury, debt, cash management, investments, trade finance, foreign exchange capabilities and a variety of other financial capabilities at any time and from anywhere. The channel will also provide an online payments hub, which aims to greatly speed up transactions to improve efficiency and profitability. CASHPRO Online is designed to give businesses access to their financial products anytime and from anywhere. As a business's needs change, it can update its profile in real time, allowing CASHPRO to adjust to the client's needs as they grow and change.
Responsibilities:
- Participated in requirement discussions with all the stake holders.
- Responsible for distributing, tracking and communicating issues to developers and reporting status to the manager on a daily basis.
- Involved in High Level Design and Low Level Design document preparation.
- Development according to the specified design
- Published SOAP-based web services using JAX-WS, JAXB, XSD, XMLBeans and XML (see the sketch after this list).
- The front end was developed based on the Struts MVC architecture
- SOAPUI was used to test the web services.
- The Struts and Spring frameworks were used for the newly designed UI infrastructure services to interact with the legacy application systems.
- Developed Action classes, ActionForms, validate methods and the struts-config.xml file using Struts, and used various Struts tag libraries.
- Used Enterprise JavaBeans (EJB) session beans in developing business-layer APIs.
- Hibernate was used as the ORM
- HQL and the Criteria API were used extensively.
- Developed complex SQL queries, stored procedures, functions, triggers and created indexes wherever applicable in Oracle database.
- Co-ordination with Onshore development team
- Involved in debugging and testing the application for the change requests
- Preparing weekly status reports /Monthly status reports
- Coordinated with the complete offshore team on filling weekly timesheets in Clarity and Fieldglass.
- Gave code walkthroughs to newly joined team members on the deliverables
- Planned the forecast for individuals on their task sheets.
- Prepared the test case documents for enhancements
- JUnit was used for unit testing; prepared the JUnit test cases document.
- Participated in code review and involved in integration, unit, functional testing, peer testing and integration testing.
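A minimal sketch of a JAX-WS SOAP endpoint of the kind referenced in the web-services bullet above; the service name, operation and endpoint URL are assumptions for illustration, and the published WSDL could be exercised with SOAPUI.

```java
import javax.jws.WebMethod;
import javax.jws.WebService;
import javax.xml.ws.Endpoint;

// Hypothetical JAX-WS service; the operation and its behavior are illustrative only.
@WebService
public class PaymentStatusService {

    @WebMethod
    public String getPaymentStatus(String paymentId) {
        // A real implementation would delegate to the legacy back-end systems.
        return "RECEIVED:" + paymentId;
    }

    public static void main(String[] args) {
        // Publishes the service locally for quick testing; the WSDL is served at ?wsdl.
        Endpoint.publish("http://localhost:8080/paymentStatus", new PaymentStatusService());
    }
}
```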
Environment: JDK 1.5/1.4, J2EE, Servlets, Struts, Spring, Hibernate 3/3.5/4.0, HQL, Maven 3.0, JAX-WS, JAXB, XML, XSD, SOAPUI, jQuery, CSS, JUnit, Oracle 9i/10g, SQL, PL/SQL, Quality Center, SSH shell, SSH client, PuTTY, VSS, WAS (WebSphere), Visual Studio, Microsoft Visio, Microsoft Project, UML, SharePoint, Windows XP and UNIX.
Confidential
Mainframes Developer
Confidential is a loan system under Dealer Financial Services Technology (DFST) at Bank of America. It processes the day-to-day mortgage loans and lines of credit availed by the personal and business account holders of the bank.
- DALS (Direct Advanced Loan System): System of record for all home-equity-secured loans and lines of credit, unsecured and other secured lines of credit, and unsecured and other secured installment loans.
- IALS (Indirect Advanced Loan System): System of record that supports the servicing of auto, marine, RV and motorcycle loans for the Indirect, Direct and E-lending channels.
- LTX (Loan Transaction eXchange): A batch-only application that extracts all loan transactions and sends them downstream accordingly.
- CLR (Credit Line Review): This application has a set of user-defined rules by which all lines of credit are reviewed on a regular basis.
Confidential includes a huge system of records. Processing of loan transactions allows a charge to be posted to a customer's statement and settled. The reporting system developed for the client continuously provides the statistics required for the analysis done by the users. Worked closely with clients to understand the system and the domain, and developed solutions to enhance the system according to technical and business needs.
Responsibilities:
- Analyze the requirements and document the same.
- Prepare the Impact analysis document and High-level design for the new requirement.
- Customize the product and develop new codes and then deploy in Production.
- Communicate with clients and delegate the work.
- Review the Low Level Design, Unit Test Plan and Unit Test Reports delivered from offshore
- Write complex SQL Queries using Teradata for regular and ad-hoc requirements.
- Set up the Environment and Data for Unit Testing.
- Data sanitization
- Co-ordinate with the clients for System test and User Acceptance Test
- Co-ordinate with Offshore counterpart for timely delivery of the work items with the highest Quality.
- Solve any issues or problems that come up in production.
- Maintain different versions of code, analysis, design and requirements for each customization and product enhancements
- Perform Quality Reviews to assure the quality along with the correctness of the deliverable.
- Supply reports based on the business rules defined by users.
- Developed tools that the ALS application uses extensively for quick resolution of complex production problems; these tools are of immense help during verification of batch processing.
- Created and modified DB2 schema objects like tables and indexes.
- Generated and automated about 100 Easytrieve reports for users.
- Disaster recovery planning and software upgrade validation.
Environment: JCL, COBOL, CICS, Easytrieve, REXX, VSAM, DB2, NDM, FTP, SMTP, MQ Series, File-Aid, File Manager, ALS PANEL, API, RPI, DAG, Debug Tool, Expediter, CA7 scheduler, Mainframe z/OS, Microsoft Visio, Microsoft Project, UML, SharePoint, Windows XP and UNIX.
Confidential
Mainframes Developer
Confidential is a Fun Ships company with operations in Confidential; this is a support project. The modules in this project are individual reservations, group reservations, accounting, and sales and marketing. Defects raised by the client are resolved based on priority. Defects with high priority come under the HEAT category and are delivered by the end of the day. Other defects are scheduled to be delivered once per quarter.
Responsibilities:
- Handling defects and enhancements.
- Providing on call support and data fixing in production.
- Managing the change requests.
- Logic verification and validation of COBOL code.
- Preparation and execution of unit test plan.
- Communication with clients
Environment: COBOL, JCL, REXX, C, C++, HTML, SQL, File-Aid, SAR, PDSMAN, IRIS, UNISYS Mainframes