
Mainframe Developer Resume


OBJECTIVE

  • In applying for this position I will be utilizing my knowledge, acquired through my 7 years of experience in the IT industry as a developer and team leader in various technologies.

SUMMARY

  • 6+ years of IT experience, including experience with IBM Mainframe, Teradata, JCL, COBOL, REXX, CA7 and DB2
  • 4+ years of IT experience, including experience with IBM InfoSphere/WebSphere DataStage 8.5/8.0.1 (Designer, Director, Administrator), implementing ETL solutions and Data Warehousing projects
  • Experienced in the design of complex Mainframe flows and jobs
  • Good experience in Teradata V2R6/V2R5/V2R4, BTEQ, FastLoad, MultiLoad and FastExport.
  • Experienced in DB2 stored procedure development in mainframe environments using COBOL, DB2 and JCL. Experience in performance tuning to reduce processing time.
  • 1+ years of extensive Data Modeling experience using Dimensional Data Modeling, Star Join Schema Modeling, Fact and Dimension Tables, Physical and Logical Data Modeling for OLTP, Enterprise Database Integration and Management among Oracle and DB2, Database Design (Physical and Logical), Data Administration and Maintenance, Application Support, Performance Tuning, and Optimization.
  • Experienced in scheduling sequence and parallel jobs using DataStage Director.
  • Excellent knowledge in Extraction, Cleansing and Modification of data from/to various Data Sources, Flat Files from Mainframe Systems and Comma Delimited files (CSV).

TECHNICAL SKILLS

ETL Tools: IBM WebSphere DataStage 8.5/6.1

Languages: C, SQL, UNIX Shell Programming, JCL, COBOL, REXX

Application Servers: IBM WebSphere Application Server 5.0/4.0/3.5

Operating Systems: UNIX, Windows 2000/XP/2003, z/OS

Database Tools: SQL*Plus, QMF, Queryman, SPUFI

Databases: DB2 9.x/8.x/7.1, SQL Server 2005/2008, Teradata

Teradata: Teradata V2R6/V2R5/V2R4, BTEQ, FastLoad, MultiLoad, FastExport and TPump.

Data Modeling: Dimensional Data Modeling, Data Modeling, Star Join Schema Modeling, Fact and Dimension Tables, Physical and Logical Data Modeling

PROFESSIONAL EXPERIENCE

Confidential

Mainframe Developer

Responsibilities:

  • Provide 24x7 support for all Pricing and Space applications in Confidential systems.
  • Writing Teradata scripts to load data from source systems to the target Teradata tables on which the Pricing Data Mart is built for reporting purposes (see the sketch after this list).
  • Responsible for analyzing new requests from users and providing optimal solutions.
  • Maintaining price changes whenever there is a change in local or state taxes.
  • Create reports by writing COBOL programs and ETL jobs as requested by customers.
  • Extensively worked with DataStage Designer and Director to load data from flat files and legacy data into the target Teradata database.
  • Working closely with the DB2 DBA to have the system modified based on new requirements.
  • Developing Technical Design and High-Level Design documents for new requirements/enhancements to the existing system.
  • Creating/Maintaining DB2 stored procedures which are used by pricing front-end systems developed in Java.
  • Creating cross-platform jobs in CA7 to trigger ETL jobs and UNIX scripts.
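
Below is a minimal sketch of the kind of BTEQ load script described above, assuming a hypothetical staging table and Pricing Data Mart target; the TDP id, logon, DDNAME and table/column names are illustrative assumptions, not taken from the actual system.

    .LOGON TDP1/PRICING_ETL,password
    DATABASE PRICING_DM;

    /* Load the mainframe extract (DDNAME INFILE) into a staging table */
    .IMPORT DATA DDNAME=INFILE
    .REPEAT *
    USING (STORE_ID INTEGER, ITEM_ID INTEGER, UNIT_PRICE DECIMAL(9,2), EFF_DATE DATE)
    INSERT INTO PRICE_STG (STORE_ID, ITEM_ID, UNIT_PRICE, EFF_DATE)
    VALUES (:STORE_ID, :ITEM_ID, :UNIT_PRICE, :EFF_DATE);

    /* Move the staged rows into the reporting table the data mart is built on */
    INSERT INTO PRICE_FACT (STORE_ID, ITEM_ID, UNIT_PRICE, EFF_DATE)
    SELECT STORE_ID, ITEM_ID, UNIT_PRICE, EFF_DATE
      FROM PRICE_STG;

    .QUIT

In a production batch flow such a script would typically be embedded in a JCL step and triggered through CA7, as noted in the bullets above.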

Environment: IBM Mainframe, IBM InfoSphere DataStage 8.1/8.5, Teradata, DB2 UDB, UNIX, COBOL flat files, CA7.

Confidential

Business Analyst

Responsibilities:

  • Worked with the Business Analysts and the DBAs on requirements gathering, analysis, testing, metrics and project coordination.
  • Designed a complex flow to handle more than 10 million records in a single job flow.
  • Writing COBOL programs to extract data from other sources/applications.
  • Optimized the queries to improve the performance of the application.
  • Writing stored procedures to process data (see the sketch after this list).
  • Involved in creating tables in the Teradata database and populating them from DataStage using the Teradata Enterprise stage and from the mainframe using TPump/MultiLoad utilities.
  • Extensively worked with DataStage Designer, Director and Administrator to load data from flat files and legacy data into the target Teradata database.
  • Performed data loading with multiple and parallel ETL processes. Understood business needs and implemented them in a functional database design.
  • Extensively worked on DataStage Job Sequencer to schedule jobs, taking care of inter-dependencies.
  • Writing Teradata scripts to load data from source systems to target Teradata tables.
  • Developed Job Sequencers with restart capability for the designed jobs using Job Activity, Exec Command, E-Mail Notification Activities, Triggers, Exception handler, Terminator stage.
  • Responsible for unit, system and integration testing. Developed test scripts, test plans and test data. Participated in UAT (User Acceptance Testing). Developed common components for ETL processes which are reusable and can be used across ETL projects.
  • Involved in low-level design and developed various jobs and performed data loads and transformations using different stages of DataStage and pre-built routines, functions and macros.
  • Responsible for creating cross-platform jobs on the Mainframe to trigger DataStage jobs.
  • Responsible for creating retry logic to restart jobs in case of server issues.
  • Responsible for monitoring all jobs that are running, scheduled, completed or failed, and troubleshooting the failed jobs.
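
As an illustration of the stored-procedure work described above, here is a minimal sketch of a native DB2 SQL stored procedure; the PRICING schema, ITEM_PRICE table and parameter names are hypothetical, not taken from the actual system.

    -- Hypothetical procedure that returns the current price of an item at a store
    CREATE PROCEDURE PRICING.GET_ITEM_PRICE
        (IN  P_ITEM_ID  INTEGER,
         IN  P_STORE_ID INTEGER,
         OUT P_PRICE    DECIMAL(9,2))
        LANGUAGE SQL
    BEGIN
        SELECT UNIT_PRICE
          INTO P_PRICE
          FROM PRICING.ITEM_PRICE
         WHERE ITEM_ID  = P_ITEM_ID
           AND STORE_ID = P_STORE_ID;
    END

A calling application, whether a Java front end or a batch COBOL program, would then invoke the procedure through a standard SQL CALL statement.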

Environment: COBOL flat files, CA7, CICS, IBM Mainframe, REXX, DB2, Teradata, IBM InfoSphere DataStage 8.1/8.5, UNIX, MS Visio, SQL Server.

Confidential

Responsibilities:

  • Worked extensively with COBOL, JCL, and CA7 on mainframe side to extract data from legacy systems.
  • Used the DataStage Designer to develop processes for extracting, cleansing, transforming, integrating, and loading data into the Data Warehouse.
  • Developed various Server and Parallel jobs using Aggregator, Filter, Funnel, Copy, Change Capture, Merge, Join and Lookup stages.
  • Responsible for FTPing the produced files to the respective servers.
  • Performed Unit Testing and tuned for better performance with updates on data warehouse tables, using DataStage Director for job monitoring and troubleshooting.
  • Responsible for creating a complex PO generation/split process that completes in 3 hours, with the capability of handling 0.75+ million POs in a single flow.
  • Extensively involved in designing and developing the Fact and Dimension tables (see the sketch after this list).
  • Created DataStage jobs and Job Sequences and tuned them for better performance.
  • Developed the data warehouse repository using DataStage Manager by importing the source and target database schemas.
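
Below is a minimal sketch of the Fact and Dimension design mentioned above, assuming a hypothetical purchase-order subject area; the table and column names are illustrative only, not the actual warehouse schema.

    -- Hypothetical vendor dimension
    CREATE TABLE VENDOR_DIM (
        VENDOR_KEY   INTEGER      NOT NULL PRIMARY KEY,
        VENDOR_NAME  VARCHAR(60)  NOT NULL,
        VENDOR_STATE CHAR(2)
    );

    -- Hypothetical PO fact table; VENDOR_KEY joins back to VENDOR_DIM
    CREATE TABLE PO_FACT (
        PO_ID       INTEGER       NOT NULL PRIMARY KEY,
        VENDOR_KEY  INTEGER       NOT NULL,
        ORDER_DATE  DATE          NOT NULL,
        ORDER_AMT   DECIMAL(11,2),
        FOREIGN KEY (VENDOR_KEY) REFERENCES VENDOR_DIM (VENDOR_KEY)
    );

Reporting queries then join the fact table to its dimensions on the surrogate keys, following the star schema pattern listed in the skills section.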

Environment: COBOL, JCL, REXX, CA7, DataStage 8.1, UNIX, DB2 UDB, Teradata, SQL, Flat files
