
Hadoop Developer Resume


Richmond, VA

SUMMARY:

  • More than 10 years of experience in the analysis, design, development, and testing of mainframe-hosted legacy application systems.
  • More than 2 years of experience with Big Data ecosystem technologies, including configuring and administering Hadoop clusters on major Hadoop distributions.
  • Expertise in all stages of the Software Development Life Cycle (SDLC): requirement analysis, design specifications, coding, debugging, testing (test scenarios, test plans, test cases, test-case execution, unit test plans and unit testing, integration and system testing), documentation, and maintenance of application programs.
  • Hands-on experience installing, configuring, and using Hadoop ecosystem components such as MapReduce, HDFS, HBase, Oozie, Hive, Sqoop, Pig, ZooKeeper, and Flume, including on the Hortonworks distribution.
  • In-depth knowledge of JobTracker, TaskTracker, NameNode, DataNode, and MapReduce concepts.
  • Experience analyzing data using HiveQL, Pig Latin, and custom MapReduce programs in Java; extended Hive and Pig core functionality with custom UDFs (a brief sketch follows this summary).
  • Solid working experience with AWS (Amazon Web Services).
  • Good experience in analysis using Pig and Hive and a working understanding of Sqoop.
  • Experienced in developing MapReduce programs using Apache Hadoop for working with Big Data.
  • Experience designing, developing, and implementing connectivity products that allow efficient exchange of data between a core database engine and the Hadoop ecosystem.
  • Worked on NoSQL databases including HBase.
  • Experience with Medicaid applications and work under HIPAA.
  • Strong development experience with the Agile methodology.
  • Expert knowledge of Medicaid Management Information Systems (MMIS).
  • Excellent work experience with EDI (Electronic Data Interchange) transactions.
  • Excellent communication and interpersonal skills that contribute to completing project deliverables ahead of schedule.
  • Expertise in MVS, COBOL, DB2, CICS, VSAM, JCL.
  • Excellent communication skills; a team player who picks up new tools and software quickly as required.
  • Expert knowledge of and work experience with Object-Oriented System Development (OOSD), the Unified Modeling Language (UML), Rational tools, and use-case modeling.
  • Proficient in creating Use case diagrams, data flow diagrams, sequence diagrams and collaboration diagrams.
  • Expertise in gathering project requirements from users, writing system specifications, translating user requirements into technical specifications, preparing requirements documents, formulating requirements into design specifications, and tracking project progress.
  • Proficient in data mapping for EDI X12 transactions.
  • Expertise in the testing phase, including test scenarios, test plans, test cases, test-case execution, unit test plans and unit testing, integration testing, and system testing.
  • Strong expertise in RDBMS principles and DB2.
  • Strong technical and analytical skills, problem solving, communication and documentation skills.
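
As a brief illustration of the custom UDF work mentioned above, the following is a minimal sketch of a Hive UDF, assuming the classic org.apache.hadoop.hive.ql.exec.UDF API; the class name MaskId and the masking rule are illustrative only, not taken from any specific project.

    import org.apache.hadoop.hive.ql.exec.UDF;
    import org.apache.hadoop.io.Text;

    // Illustrative Hive UDF: masks all but the last four characters of an identifier.
    public final class MaskId extends UDF {
        public Text evaluate(Text id) {
            if (id == null) {
                return null;
            }
            String s = id.toString();
            int keep = Math.min(4, s.length());
            StringBuilder masked = new StringBuilder();
            for (int i = 0; i < s.length() - keep; i++) {
                masked.append('*');                            // mask the leading characters
            }
            masked.append(s.substring(s.length() - keep));     // keep the tail as-is
            return new Text(masked.toString());
        }
    }

In HiveQL such a function is typically registered with ADD JAR and CREATE TEMPORARY FUNCTION before use; Pig UDFs follow a similar pattern through REGISTER.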

TECHNICAL SKILLS:

Big Data: Hadoop, Map Reduce, HDFS, Hive, Pig, Sqoop, HBase, Zookeeper

Operating Systems: MS-DOS, Windows NT, OS/390, ES/9000, Linux

Databases: DB2, Oracle 8.0, SQL Server, MS Access, IMS DB

Languages: COBOL, C, C++, PL/SQL, JCL, XML, Java, PL/1, Natural

OLTP: CICS, IMS DC.

Front-End Tools: Visual Basic 6.0, MS .NET

Utility: VSAM

Tools: TSO/ISPF, SPUFI, File-Aid, Endevor, Xpediter, Pan-valet, SEEC, GOMA, INGENIUM, MQ Series, WebSphere, web services

Web Tools: HTML, FrontPage

Methodologies: OOSD, UML, Agile.

PROFESSIONAL EXPERIENCE:

Confidential, Richmond, VA

Hadoop Developer

Responsibilities:

  • Involved in developing solutions for log analysis and reporting.
  • Involved in regular Hadoop cluster maintenance, such as patching security holes and updating system packages.
  • Synchronized configuration files across the Hadoop cluster nodes.
  • Moved system log files out of the Hadoop installation directory so that all log files remain in one place even when the installation directory changes during an upgrade.
  • Configured MapReduce properties to ensure local temporary storage uses large disk partitions.
  • Created user accounts and granted users access to the Hadoop cluster.
  • Developed Hive queries to process the data and generate data cubes for visualization.
  • Analyzed web log data using HiveQL (an illustrative query appears after this list).
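
As a brief illustration, the following is a minimal sketch of the kind of HiveQL aggregation used for web log reporting, submitted from Java through the HiveServer2 JDBC driver; the connection URL and the table and column names (web_logs, log_date, status) are assumptions for illustration, not project specifics.

    import java.sql.Connection;
    import java.sql.DriverManager;
    import java.sql.ResultSet;
    import java.sql.Statement;

    // Illustrative sketch: daily hit counts per HTTP status code from a web log table.
    public class WebLogReport {
        public static void main(String[] args) throws Exception {
            Class.forName("org.apache.hive.jdbc.HiveDriver");   // HiveServer2 JDBC driver
            Connection conn = DriverManager.getConnection(
                    "jdbc:hive2://hive-host:10000/default", "hive", "");
            Statement stmt = conn.createStatement();
            ResultSet rs = stmt.executeQuery(
                    "SELECT log_date, status, COUNT(*) AS hits "
                  + "FROM web_logs GROUP BY log_date, status");
            while (rs.next()) {
                System.out.printf("%s %s %d%n",
                        rs.getString(1), rs.getString(2), rs.getLong(3));
            }
            rs.close();
            stmt.close();
            conn.close();
        }
    }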

Environment: Big Data, Hadoop, HDFS, Hive, Pig, Sqoop, HiveQL, HBase, JVM, MapReduce, Oozie, Oracle.

Confidential, New York

Hadoop Developer

Responsibilities:

  • Gained strong business knowledge of health insurance, claim processing, fraud suspect identification, the appeals process, etc.
  • Installed and configured Hadoop MapReduce and HDFS; developed multiple MapReduce jobs in Java for data cleaning and preprocessing (a brief sketch appears after this list).
  • Analyzed the Hadoop stack and various big data analytic tools, including Pig, Hive, the HBase database, and Sqoop.
  • Used Hive to expose data for further analysis and to transform files from various analytical formats into text files.
  • Involved in loading data from the UNIX file system into HDFS.
  • Involved in creating Hive tables, loading them with data, and writing Hive queries that run internally as MapReduce jobs.
  • Worked extensively in creating MapReduce jobs to power data for search and aggregation.
  • Designed a data warehouse using Hive.
  • Worked extensively with Sqoop for importing metadata from Oracle. Extensively used Pig for data cleansing.
  • Created partitioned tables in Hive; worked with business teams and created Hive queries for ad hoc access.
  • Evaluated the use of Oozie for workflow orchestration; mentored the analyst and test teams in writing Hive queries.
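
As a brief illustration of the data-cleaning jobs mentioned above, the following is a minimal sketch of a map-only MapReduce job written against the org.apache.hadoop.mapreduce API; the pipe delimiter and the expected field count are assumptions for illustration, not project specifics.

    import java.io.IOException;

    import org.apache.hadoop.conf.Configuration;
    import org.apache.hadoop.fs.Path;
    import org.apache.hadoop.io.LongWritable;
    import org.apache.hadoop.io.NullWritable;
    import org.apache.hadoop.io.Text;
    import org.apache.hadoop.mapreduce.Job;
    import org.apache.hadoop.mapreduce.Mapper;
    import org.apache.hadoop.mapreduce.lib.input.FileInputFormat;
    import org.apache.hadoop.mapreduce.lib.output.FileOutputFormat;

    // Illustrative map-only job: drops malformed records and normalizes the delimiter.
    public class CleanRecordsJob {

        public static class CleanMapper
                extends Mapper<LongWritable, Text, Text, NullWritable> {
            private static final int EXPECTED_FIELDS = 12; // assumed record width

            @Override
            protected void map(LongWritable key, Text value, Context context)
                    throws IOException, InterruptedException {
                String[] fields = value.toString().split("\\|", -1);
                if (fields.length != EXPECTED_FIELDS) {
                    return; // skip records with an unexpected number of fields
                }
                StringBuilder cleaned = new StringBuilder();
                for (int i = 0; i < fields.length; i++) {
                    if (i > 0) {
                        cleaned.append('\t');
                    }
                    cleaned.append(fields[i].trim());
                }
                context.write(new Text(cleaned.toString()), NullWritable.get());
            }
        }

        public static void main(String[] args) throws Exception {
            Job job = Job.getInstance(new Configuration(), "clean-records");
            job.setJarByClass(CleanRecordsJob.class);
            job.setMapperClass(CleanMapper.class);
            job.setNumReduceTasks(0); // map-only: no shuffle or sort needed
            job.setOutputKeyClass(Text.class);
            job.setOutputValueClass(NullWritable.class);
            FileInputFormat.addInputPath(job, new Path(args[0]));
            FileOutputFormat.setOutputPath(job, new Path(args[1]));
            System.exit(job.waitForCompletion(true) ? 0 : 1);
        }
    }

Setting the number of reduce tasks to zero keeps a cleaning pass inexpensive, since no shuffle or sort phase is required.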

Environment: Hadoop, HDFS, Hive, Pig, Sqoop, HBase, JVM, Java (jdk1.6), UNIX, MapReduce, Hortonworks Hadoop distribution, Oozie, Oracle 11g/10g.

Confidential, Memphis, TN

Hadoop Consultant

Responsibilities:

  • Responsible for architecting Hadoop clusters with CDH3.
  • Extensively involved in the installation and configuration of the Cloudera distribution of Hadoop (CDH 2 and 3), including the NameNode, Secondary NameNode, JobTracker, TaskTrackers, and DataNodes.
  • Installed and configured Hadoop ecosystem components such as HBase, Flume, Pig, and Sqoop.
  • Involved in Hadoop cluster tasks such as adding and removing nodes without affecting running jobs or data.
  • Managed and reviewed Hadoop Log files.
  • Loaded log data into HDFS using Flume. Worked extensively on creating MapReduce jobs to power data for search and aggregation.
  • Worked extensively with Sqoop for importing metadata from Oracle.
  • Designed a data warehouse using Hive.
  • Created partitioned tables in Hive.
  • Mentored analyst and test team for writing Hive Queries.
  • Extensively used Pig for data cleansing.
  • Developed Pig Latin scripts to extract the data from the web server output files to load into HDFS.
  • Developed Pig UDFs to pre-process the data for analysis (a brief sketch appears after this list).
  • Developed workflow in Oozie to automate the tasks of loading the data into HDFS and pre-processing with Pig.
  • Provided cluster coordination services through ZooKeeper.
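
As a brief illustration of the Pig UDF work mentioned above, the following is a minimal sketch assuming the org.apache.pig.EvalFunc API; the class name and the normalization rule (trim and lowercase) are illustrative only.

    import java.io.IOException;

    import org.apache.pig.EvalFunc;
    import org.apache.pig.data.Tuple;

    // Illustrative Pig UDF: trims and lowercases a text field before analysis.
    public class NormalizeText extends EvalFunc<String> {
        @Override
        public String exec(Tuple input) throws IOException {
            if (input == null || input.size() == 0 || input.get(0) == null) {
                return null;
            }
            return input.get(0).toString().trim().toLowerCase();
        }
    }

A UDF like this is normally packaged in a jar, added to the script with REGISTER, and then invoked from a FOREACH ... GENERATE statement.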

Environment: Hadoop, MapReduce, HDFS, Pig, Hive, HBase, ZooKeeper, Oozie, Java (jdk1.6), Oracle 11g/10g, PL/SQL, SQL*PLUS, Windows NT, UNIX Shell Scripting, Agile.

Confidential, Baton Rouge, Louisiana

Programmer Analyst

Responsibilities:

  • Responsible for modifications to existing COBOL/VSAM programs as per the requirements.
  • Responsible for preparing the test data and testing process for the required changes.
  • Analyzing and understanding the requirements given by the client.
  • Preparing technical specifications for the given requirements.
  • Getting approvals on the tested components from clients.
  • Responsible for understanding the requirements properly and then starting coding and unit testing.
  • Responsible for unit testing the programs in which code changes were implemented.
  • Developed and enhanced online and batch modules in COBOL and CICS, using tools such as Endevor and Xpediter.
  • Coding and unit testing the COBOL/VSAM programs as per the requirement documents.
  • Involved in finding and fixing logical errors while performing Peer reviews.
  • Debugging the modules using the Xpediter and File-Aid tools and performing unit testing.
  • Preparing Unit Test Cases.
  • Executing UTC to verify the program functionality.
  • Fixing the bugs that are encountered in testing and making sure the functionality is developed as per the requirements.

Environment: COBOL, VSAM, CICS, File-Aid, Xpediter, TSO/ISPF, SPUFI, Endevor.

Confidential, Raleigh, North Carolina

Programmer Analyst Sr Professional

Responsibilities:

  • Providing support for BSD development for EVS (Eligibility Verification System) and AVRS (Automated Voice Response System) builds.
  • Responsible for developing the TDD for EVS and AVRS builds.
  • Responsible for modifying existing COBOL/VSAM programs into COBOL/DB2 programs.
  • Analyzing and understanding the requirements given by the client.
  • Preparing a requirements traceability matrix for the given requirements.
  • Working with Clients on the Business specification Documents.
  • Developing web pages for EVS and AVRS operations.
  • Defining the table structure for the new tables which are used in these modules.
  • Responsible for unit testing the programs migrated from the legacy system.
  • Developed and enhanced online and batch modules in COBOL/DB2 under the Endevor environment.
  • Coding and unit testing the COBOL, DB2 programs as per the design documents.
  • Involved in finding and fixing logical errors while performing Peer reviews.
  • Worked on building the AVRS Call flow diagrams and call flow message tables.
  • Worked on many X12 Transactions.
  • Participated in Analysis and Coding of the EDI Transactions.
  • Worked on E-Commerce module component of the EVS and AVRS builds.

Environment: COBOL, DB2, VSAM, CICS, Platinum, File-Aid, Xpediter, TSO/ISPF, SPUFI, Endevor.

Confidential, North Carolina

Programmer Analyst

Responsibilities:

  • Responsible for modifying existing COBOL/VSAM programs into COBOL/DB2 programs.
  • Responsible for the COP team's Detailed System Design Document.
  • Analyzing and understanding the requirements given by the client.
  • Preparing a functional specification for the given requirements.
  • Getting signoff for Detailed System Design Document from clients.
  • Responsible for converting the legacy COBOL programs into Omnicaid mainframe programs.
  • Responsible for design document approvals with the client before starting coding and unit testing.
  • Responsible for unit testing the programs migrated from the legacy system.
  • Developed and enhanced online and batch modules in COBOL/DB2 under the Endevor environment.
  • Coding and unit testing the COBOL/DB2 programs as per the design documents.
  • Involved in finding and fixing logical errors while performing peer reviews.
  • Debugging the modules using the Xpediter and File-Aid tools and performing unit testing.
  • Preparing Unit Test Cases.
  • Executing UTC to verify the program functionality.
  • Fixing the bugs that are encountered in testing and making sure the functionality is developed as per the requirements.
  • Reviewing Design Specifications, Technical Specification documents and UTC.
  • Participated in weekly status meetings and conducted internal and external reviews.

Environment: COBOL, DB2, VSAM, CICS, Platinum, File-Aid, Xpediter, TSO/ISPF, SPUFI, Endevor.

Confidential, North Carolina

Programmer Analyst

Responsibilities:

  • Responsible for modifications to existing COBOL, CICS, and VSAM programs.
  • Responsible for the Provider subsystem online process.
  • Responsible for preparation of the Detailed System Design Document.
  • Analyzing and understanding the requirements given by the client.
  • Preparing a functional specification for the given requirements.
  • Getting signoff for Detailed System Design Document from clients.
  • Responsible for design document additions.
  • Responsible for design document approvals with the client before starting coding and unit testing.
  • Involved in maintenance, enhancement, and testing of the backend business processes of the Medicaid Provider subsystem.
  • Agreed on effort and schedule for deliverables, resolving issues and clarifications with clients/users.
  • Responsible for testing, covering all steps from test scenarios through system testing.
  • Worked on complex tasks related to the Provider sub-system.
  • Worked in the provider module, correcting certain provider-related validations.
  • Coded one-time programs to fix bugs related to provider and NPI fields.
  • Developed and enhanced Online Modules in COBOL/VSAM under Endevor environment.
  • Coding / modifying the COBOL, VSAM programs as per the design documents.
  • Involved in finding and fixing logical errors while performing Peer reviews of all Subsystems.
  • Debugging the online modules using the Xpediter and File-Aid tools and performing unit testing online.
  • Preparing Unit Test Cases.
  • Executing UTC to verify the program functionality.
  • Fixing the bugs that are encountered in testing and making sure the functionality is developed as per the requirements.
  • Closing defects in Review Test Defect Form and tracking the rework efforts.
  • Preparing delivery documents and making timely deliveries.
  • Participated in weekly status meetings and conducted internal and external reviews.

Environment: COBOL, VSAM, CICS, Platinum, File-Aid, Xpediter, TSO/ISPF, SPUFI, Endevor.

Confidential, Colorado

Programmer Analyst

Responsibilities:

  • Responsible for modifications to existing COBOL/DB2 programs.
  • Responsible for preparation of technical specifications.
  • Analyzing and understanding the requirements given by the client.
  • Preparing a functional specification for the given requirements.
  • Getting signoff for functional specifications from the clients.
  • Responsible for functional specification approvals before proceeding with the design documents.
  • Responsible for design document approvals with the client, then proceeding with coding and unit testing.
  • Involved in maintenance, enhancement, and testing of the backend business processes of the Medicaid subsystems: Recipient, Provider, Claims, and Third Party Liability (TPL) Management.
  • Agreed on effort and schedule for deliverables, resolving issues and clarifications with clients/users.
  • Responsible for testing, covering all steps from test scenarios through system testing.
  • Worked on tasks related to complex claim modules and the Provider sub-system.
  • Worked in the provider module, correcting certain provider-related validations.
  • Coded one-time programs to fix bugs related to provider and NPI fields.
  • Developed and enhanced Batch modules in COBOL/DB2 under Endevor environment.
  • Coding / modifying the COBOL, DB2 programs as per the design documents.
  • Involved in finding and fixing logical errors while performing Peer reviews of all Subsystems.
  • Debugging the batch modules using the Xpediter and File-Aid tools and performing unit testing through batch, system testing, and Exam Entry online.
  • Preparing Unit Test Cases.
  • Executing UTC to verify the program functionality.
  • Fixing the bugs that are encountered in testing and making sure the functionality is developed as per the requirements.
  • Reviewing Design Specifications, Technical Specification documents and UTC.
  • Closing defects in Review Test Defect Form and tracking the rework efforts.
  • Preparing delivery documents and making timely deliveries.
  • Participated in weekly status meetings, conducted internal and external reviews as well as formal walkthroughs among various teams, and documented the proceedings.

Environment: COBOL, DB2, Medicaid, Platinum, File-Aid, Xpediter, TSO/ISPF, SPUFI, Endevor.

Confidential

Sr. Programmer/Analyst

Responsibilities:

  • Responsible for modifications to existing COBOL/DB2 programs.
  • Responsible for preparation of functional specifications and design documents.
  • Analyzing and understanding the requirements given by the client.
  • Preparing a functional specification for the given requirements.
  • Coordinating functional specifications with the offshore team.
  • Helping the offshore team understand the requirements and functional specifications.
  • Obtaining error-free confirmation from the clients on the functional specifications.
  • Getting signoff for functional specifications from the clients.
  • Responsible for functional specification approvals before proceeding with the design documents.
  • Responsible for design document approvals with the client; once approved, proceeding with coding with the onsite and offshore teams.
  • Agreed on effort and schedule for deliverables, resolving issues and clarifications with clients/users.
  • Responsible for testing, covering all steps from test scenarios through system testing.
  • Distributing work to team members and assisting them in completing tasks with high quality.
  • Coding/modifying the COBOL/DB2 programs as per the design documents.
  • Participated in weekly status meetings and prepared minutes of meetings (MOMs).
  • Preparing Unit Test Cases.
  • Executing UTC to verify the program functionality.
  • Fixing the bugs that are encountered in testing and making sure the functionality is developed as per the requirements.
  • Reviewing Design Specifications, Technical Specification documents and UTC.
  • Closing defects in Review Test Defect Form and tracking the rework efforts.
  • Preparing delivery documents and making timely deliveries.
  • Monitoring the team's development activities and providing updates to management.
  • Participated in weekly status meetings, conducted internal and external reviews as well as formal walkthroughs among various teams, and documented the proceedings.
  • Documented the purpose of tasks/program changes so that personnel could understand the process and incorporate changes as and when necessary.

Environment: INGENIUM, COBOL, DB2, Ingenium interface, PL/1.

Confidential

Programmer

Responsibilities:

  • Actively involved throughout the life cycle of the project, starting from requirements gathering.
  • Worked on ACH (Automated Clearing House) jobs, analyzing them and making changes.
  • Understanding the existing legacy application through code walkthroughs, screen reviews, and reviews of technical specifications and other documentation.
  • Extracting business logic from the existing business process and documenting it as use case diagrams, class diagrams, interaction diagrams, and state diagrams in UML notation.
  • Analyzing the work requirement requests.
  • Agreed on effort and schedule for deliverables, resolving issues and clarifications with the client.
  • Understanding the individual subsystems of the existing process and preparing system-level and subsystem-level business process documents.
  • Attending teleconferences with the client for clarification of business processes.
  • Time tracking and preparing the Work Breakdown Structure for the work.
  • Involved in DB2 database modifications to cater to the enhancement requirements.
  • Cleaning up the existing legacy data.
  • Documenting the existing business process flow, the required business process changes, and the incorporated business process changes.
  • Documenting the business process using data flow diagrams, sequence diagrams, activity diagrams, and collaboration diagrams.

Environment: OS/390, MVS, COBOL, PL/1, CICS, JCL, DB2, VSAM, SPUFI, QMF, Java, MQ Series, VB Script, XML, Endevor, MS-Office, IMS DB, IMS DC, SEEC, GOMA, OOSD, UML.

Confidential

Business Associate

Responsibilities:

  • Responsible for preparation of program specification.
  • Actively participated in Requirements gathering and understanding the functional requirements.
  • Carried out the design activities.
  • Involved in coding.
  • Wrote the Test Plan, Test conditions based on the Functional Requirements and technical specification.
  • Responsible for setup of the System and integration Testing.
  • Responsible for writing the Test plan and Test cases for the functionality.
  • Documenting all the Test cases and Test scripts.
  • Responsible for testing all the changes.
  • Validating all the SQL statements using SPUFI or QMF.
  • Modifying COBOL, CICS, and DB2 programs and JCL as per requirements.
  • Actively participating in discussions on the business rules.
  • Wrote complex business logic to validate the business process.
  • Sent data to various subsystems using VSAM files, flat files, or by storing data in DB2 databases.
  • Developing new BMS maps by using SDF.
  • Implementing modifications to BMS maps using SDF, creating new online screens and modifying existing ones according to the Basic Functional Specifications document.
  • Modifying the programs to support the database and screen changes.
  • Wrote COBOL/DB2/CICS programs that interact with other business processes.
  • Involved in DB2 database modifications to cater to the enhancement requirements.
  • Debugging the programs using Xpediter to ensure the functional aspects are met.
  • Executing the test scripts to verify program functionality.
  • Assisting the quality assurance team in testing the applications.
  • Used Pan-valet as the version control tool.

Environment: OS/390, MVS, COBOL, PL/1, CICS, JCL, DB2, VSAM, Pan-valet, Mercator, SDF, SPUFI, QMF, Java, MQ Series, VB Script, XML, File-Aid, IMS DB, IMS DC, MS-Office.
