
ETL and Mainframe Developer Resume


Buffalo, NY

PROFESSIONAL SUMMARY:

  • Over 8 years of dynamic career experience and high performance in system analysis, design, development, and implementation of data warehousing systems and software programs, using a blend of IBM DataStage 9.1/8.7/8.1/8.0.1 (InfoSphere Information Server for data quality, WebSphere) and mainframe (COBOL, JCL, DB2, and VSAM) technologies.
  • Proficient in configuring new environments for ETL and mainframe projects that are being undertaken for the first time.

ETL - DataStage:

  • Excellent experience in designing, developing, documenting, and testing ETL jobs and mappings (Server, Parallel, and Sequence jobs) using DataStage to populate tables in data warehouses and data marts.
  • Proficient in developing strategies for Extraction, Transformation, and Loading (ETL) mechanisms.
  • Expert in designing Parallel jobs using various stages like Join, Merge, Lookup, Remove Duplicates, Filter, Data Set, Lookup File Set, Complex Flat File, Modify, and Aggregator, and Sequence jobs using Job Activity, Email, Wait-for-File, Execute Command, Exception Handler, Sequencer, Terminator, Start Loop, and End Loop activity stages.
  • Expert in designing Server jobs using various types of stages like Sequential file, ODBC, Hashed file, Aggregator, Transformer, Sort, Link Partitioner and Link Collector.
  • Experienced in the integration of various data sources (DB2-UDB, Oracle, PL/SQL, and MS Access) into the data staging area.
  • Expert in working with DataStage Administrator, Designer, and Director.
  • Experience in analyzing data generated by the business process, defining granularity, mapping data elements from source to target, and creating indexes and aggregate tables for data warehouse design and development.
  • Excellent knowledge of studying data dependencies using metadata stored in the repository and preparing batches for existing sessions to facilitate scheduling of multiple sessions.
  • Proven track record in troubleshooting of Data Stage jobs and addressing production issues like performance tuning and enhancement.
  • Expert in working on various operating systems like UNIX, Linux and Windows.
  • Expertise in UNIX shell scripting (Korn shell) for process automation and for scheduling DataStage jobs using Control-M.
  • Experience in using software configuration management tools like Rational Team Concert (RTC) for version control.
  • Expert in unit testing, system integration testing, implementation, and maintenance of database jobs.
  • Effective in cross-functional and global environments, managing multiple tasks and assignments concurrently with strong communication skills.
  • Expert in scheduling complex dependencies using the Control-M scheduling tool.
  • Certified in Technical Mainframe Assessment Track-1 (COBOL and JCL) and Track-2 (DB2, VSAM, and CICS) at HSBC.
  • Profound ability to quickly understand complex mainframe systems.
  • Superior proficiency with mainframe applications built with COBOL, JCL, DB2 and VSAM.
  • Integrated existing designs with new requirements.
  • Resolved complex and conflicting design issues.
  • Supervised component and code test activities.
  • Proven ability to work in a high-pressure environment.
  • Ensured complete and accurate issue tracking and reporting.
  • Identified production issues impacting code modules.
  • Good knowledge of the CA7 scheduling tool and IBM utilities such as File Manager and Debug Tool.
  • In addition to current responsibilities, working as IQA (Internal Quality Analyst) for the project, ensuring high-quality delivery of projects.
  • Attended various Banking Domain and Risk Level-I trainings at HSBC.

TECHNICAL SKILLS:

Operating Systems: Linux, z/OS, Windows, UNIX (shell scripting).

ETL Tools: IBM InfoSphere DataStage and QualityStage v9.1, v8.7, v8.1, and IBM WebSphere DataStage v8.0.1.

Mainframe: COBOL, JCL, DB2, VSAM, REXX

RDBMS: Oracle 10g/11g, DB2

Big Data: Hadoop MapReduce, Hive

WORK EXPERIENCE:

Confidential, Buffalo, NY

ETL and Mainframe Developer

Responsibilities:

  • Served as the primary on-site ETL and Mainframe Developer during the analysis, planning, design, and development stages of projects using IBM DataStage (DataStage and QualityStage v9.1) and mainframe technologies.
  • Prepared Data Mapping Documents (DMDs) and designed the ETL jobs based on the DMD with the required tables in the development environment.
  • Actively participated in decision-making and QA meetings and regularly interacted with the business analysts and development team to gain a better understanding of the business process, requirements, and design.
  • Used DataStage as an ETL tool to extract data from source systems and load the data into the Oracle staging database.
  • Designed and developed DataStage jobs to extract data from heterogeneous sources, applied transformation logic to the extracted data, and loaded it into data warehouse databases.
  • Created DataStage jobs using different stages like Transformer, Aggregator, Sort, Join, Merge, Lookup, Data Set, Funnel, Remove Duplicates, Copy, Modify, Filter, Change Data Capture, Change Apply, Sample, Surrogate Key, Column Generator, Row Generator, etc.
  • Extensively worked with Join, Lookup (Normal and Sparse), and Merge stages.
  • Extensively worked with Sequential File, Data Set, File Set, and Lookup File Set stages.
  • Extensively used Parallel stages like Row Generator, Column Generator, Head, and Peek for development and debugging purposes.
  • Used the DataStage Director and its run-time engine to schedule and run the solution, test and debug its components, and monitor the resulting executable versions on an ad hoc or scheduled basis.
  • Converted complex job designs into separate job segments and executed them through job sequencers for better performance and easier maintenance.
  • Created job sequences.
  • Maintained the data warehouse by loading dimensions and facts as part of the project; also worked on various enhancements to the FACT tables.
  • Created a shell script to run DataStage jobs from UNIX and scheduled it through the Control-M scheduling tool (an illustrative sketch of such a wrapper appears after this list).
  • Coordinated with team members and administered all onsite and offshore work packages.
  • Analyzed performance and monitored work with capacity planning.
  • Performed performance tuning of the jobs by interpreting performance statistics of the jobs developed.
  • Documented ETL test plans, test cases, test scripts, and validations based on design specifications for unit, system, and functional testing; prepared test data for testing, error handling, and analysis.
  • Participated in weekly status meetings.
  • Developed a test plan that included the scope of the release, entry and exit criteria, and the overall test strategy. Created detailed test cases and test sets and executed them manually.

Mainframes:

  • Built new COBOL programs to load input files from the ETL system into the DB2 database with the help of SPUFI, IBM File Manager, IBM Debug Tool, and Endevor.
  • Created JCL to run COBOL-DB2 programs and scheduled the jobs using the CA7 scheduling tool in the production environment.
  • Generated the reports and sent them to OnDemand for user verification.
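
For illustration, a minimal sketch of the kind of Korn shell wrapper Control-M invoked to start a DataStage job through the dsjob command line; the engine path, project, job, and parameter names below are placeholders rather than actual production values:

#!/usr/bin/ksh
# Illustrative Control-M wrapper for one DataStage job; names and paths are placeholders.
DSHOME=/opt/IBM/InformationServer/Server/DSEngine    # assumed engine install path
PROJECT=DW_STAGING                                   # placeholder project name
JOB=jb_load_loan_stg                                 # placeholder job name

. $DSHOME/dsenv                                      # load the DataStage engine environment

# -run -jobstatus waits for the job and returns its finishing status as the exit code
$DSHOME/bin/dsjob -run -jobstatus -param prmRunDate=$(date +%Y%m%d) "$PROJECT" "$JOB"
rc=$?

# Status 1 = ran OK, 2 = ran with warnings; anything else fails the Control-M job
if [[ $rc -eq 1 || $rc -eq 2 ]]; then
    exit 0
fi
$DSHOME/bin/dsjob -logsum "$PROJECT" "$JOB"          # dump the log summary for triage
exit 1

Control-M treats a non-zero exit from the wrapper as a failed run, so downstream jobs only start after a clean DataStage finish.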

Environment: Staging: DataStage v9.1, Oracle 11g, Linux, RTC for version control, Control-M v7.0 for DS job scheduling. Data load and reporting: COBOL, JCL, DB2, SPUFI, IBM File Manager, IBM Debug Tool, Endevor for version control, CA7 for mainframe job scheduling.

Confidential, Buffalo, NY

ETL and Mainframe Developer

Responsibilities:

  • Used the DataStage Designer to develop processes for extracting, cleansing, transforming, integrating and loading data into staging tables.
  • Developed complex ETL jobs using various stages like Lookup, Join, Transformer, Data Set, Row Generator, Column Generator, Complex Flat File (CFF), Sequential File, Aggregator, and Modify stages.
  • Created queries to compare data between the two databases to make sure the data matched (an illustrative sketch of such a check appears after this list).
  • Used the DataStage Director and its run-time engine to schedule running the solution, testing and debugging its components, and monitoring the resulting executable versions on an ad hoc or scheduled basis.
  • Created shared containers to incorporate complex business logic into jobs.
  • Scheduled and monitored the ETL jobs using Control-M in the UAT and production environments.
  • Created and modified batch scripts to transfer files from the ETL server to the mainframe server via Connect Direct (CDT).
  • Created job sequencers to automate the jobs.
  • Created new UNIX shell scripts and modified existing ones to run DataStage jobs, along with other scripts needed for the DataStage project.
  • Created parameter sets to assign values to jobs at run time.
  • Used Parallel Extender for parallel processing to improve performance when extracting data from the sources.
  • Worked with metadata definitions and the import and export of DataStage jobs using DataStage Manager.
  • Provided the logical data model design, generated the database, resolved technical issues, and loaded data into multiple instances.
  • Implemented PL/SQL scripts in accordance with the necessary business rules and procedures.

Mainframe Process:

  • Developed JCL to perform first-pass logic that validates the source system files to make sure the input files are complete and valid.
  • Developed COBOL-DB2 programs to load input files from the ETL and other loan systems.
  • Prepared the jobs using JCL to run the COBOL-DB2 programs and scheduled them in production using CA7.
  • Generated quarterly reports using Commercial Real Estate and Corporate Loan data and sent them to the Federal Government.
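
For illustration, a minimal sketch of the kind of reconciliation query referenced above, run from a Korn shell step with sqlplus; it assumes both the staging and target tables are reachable from one Oracle connection, and the connection string, schema, table, and column names are placeholders rather than actual production objects:

#!/usr/bin/ksh
# Illustrative reconciliation check: list staging rows missing from the target table.
# Connection string, password variable, and table/column names are placeholders.
missing=$(sqlplus -s "stg_user/${STG_PWD}@ORCL" <<'EOF'
SET PAGESIZE 0 FEEDBACK OFF HEADING OFF
SELECT loan_id || ',' || TO_CHAR(as_of_dt, 'YYYY-MM-DD')
FROM   stg_loan
MINUS
SELECT loan_id || ',' || TO_CHAR(as_of_dt, 'YYYY-MM-DD')
FROM   dw_loan_fact;
EXIT
EOF
)

if [[ -n "$missing" ]]; then
    print "Staging rows not found in the target:"
    print "$missing"
    exit 1
fi
print "Staging and target row sets match."
exit 0

An empty MINUS result means every staged row reached the target; any rows returned (or any sqlplus error text) fail the step so the mismatch can be investigated.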

Environment: Staging: DataStage v8.1, Oracle 10g, UNIX scripting, RTC for version control, Control-M for DS job scheduling.

Confidential, Buffalo, NY

ETL and Mainframe Developer

Responsibilities:

  • Gathered and analyzed the functional requirements through interaction with business users and the IT project managers located in Buffalo.
  • Prepared high-level technical and functional design documents.
  • Coordinated with the business users and offshore team to resolve functional and technical issues.
  • Prepared data mapping documents as per the business requirements.
  • Assisted the offshore team with coding and testing.
  • Performed peer review of the results before delivering to HTS counterparts/clients.
  • Managed the development life cycle to ensure high-quality project delivery.
  • Designed data integration/ETL jobs using DataStage and its foundation tools.
  • Analyzed, designed, optimized, and implemented Oracle PL/SQL stored procedures.

Environment: Staging: DataStage v8.1, Oracle 10g, UNIX scripting, RTC for version control, Control-M for DS job scheduling.

Confidential

Mainframe Developer

Responsibilities:

  • Performed impact analysis and created high-level technical design documents.
  • Design: Prepared technical specifications of components according to requirements.
  • Provided 24/7 on-call/standby support.
  • Coding: Coded programs from scratch as per technical specs and modified existing programs as requirements changed.
  • Testing: Created and executed test cases as per the test plan.
  • Prepared and maintained documents for all phases of the project.
  • Provided pre- and post-implementation support and verification.

Environment: COBOL, JCL, DB2, SPUFI, IBM File Manager, IBM Debug Tool, Endevor for version control, CA7 for mainframe job scheduling.
