Mainframe Modernization Consultant Resume
SUMMARY:
- Accomplished mainframe professional with around 6 years of experience in mainframe technology, specializing in legacy modernization initiatives such as re-engineering, re-hosting, data migration, MIPS optimization, legacy code assessment, and consulting
- Extensive experience in all phases of the Software Development Life Cycle (SDLC), including design, development, enhancement, migration, and testing of applications using COBOL, JCL, DB2, VSAM, CICS, TSO/ISPF, REXX, batch scheduling, Xpediter, Endevor/ChangeMan, and Debug Facility
- Trained and certified Confidential professional in mainframe technology, with hands-on experience in projects using COBOL, DB2, JCL, and VSAM
- Proficient in gathering business requirements and functional and technical specifications, and in preparing high-level and low-level design documents (HLDs and LLDs)
- Strong experience with the DB2 relational database; expert in COBOL-DB2 application programming with exposure to DB2 utilities
- Well versed in string handling, table processing with arrays, file handling, and database interfaces within COBOL programs
- Expertise in creating and working with JCL and procedures for defining datasets, running COBOL and utility programs, and conditional processing in job steps
- Strong experience with mainframe application programming and with creating the datasets and components required for a project, such as PS and PDS members, GDGs, JCL control cards, JCLs, PROCs, copybooks, and VSAM files (KSDS, RRDS, ESDS)
- Expert in coding DDL/DML/DCL to retrieve and manipulate DB2 tables, using query management and reporting tools such as SPUFI, QMF, and File-AID for DB2 for data manipulation and cleansing
- Strong knowledge of normalization, DB2 stored procedures, functions, referential integrity, utilities, cursors, triggers, views, and subqueries (including correlated subqueries)
- Proficient in mainframe utilities and tools including Abend-AID, SYNCSORT, IEBGENER, IEFBR14, IEBCOMPR, and IDCAMS
- Good working knowledge of FTP and NDM (Connect:Direct) for sending files from one system to another
- Good exposure to version control systems such as ChangeMan and Endevor
- Experience in scheduling daily, weekly, monthly, and yearly batch jobs, and in executing and removing jobs in Control-M (CTLM)
- Monitoring batch jobs and resolving abends using Abend-AID
- Very good exposure to Micro Focus Enterprise Analyzer and Enterprise Developer, intelligence tools that analyze the code inventory
- Working on pre-sales activities such as RFPs and RFQs, conducting proofs of concept, and performing application assessments
- Exposure to working in a cross-cultural environment; demonstrated innovation and thought leadership in developing REXX- and Python-based solution accelerators (an illustrative sketch follows this list)
- Strong functional knowledge of banking, financial, and insurance applications (Confidential, card services, customer relationship)
- Strong practical knowledge of mainframe applications with a DB2 database; handled projects from initial requirements gathering through integration testing and final documentation
- Excellent project documentation skills, including creation of solution specifications, application control documents, and project design documentation
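For illustration only, a minimal sketch of the kind of Python solution accelerator referenced above: it scans locally downloaded JCL members and inventories the dataset names referenced on DD statements. The directory name, file extension, and pattern are assumptions, not details from any actual engagement.

import re
from pathlib import Path
from collections import Counter

JCL_DIR = Path("jcl_members")                 # hypothetical local copy of a JCL library
DSN_PATTERN = re.compile(r"DSN=([A-Z0-9$#@.]+)")

counts = Counter()
for member in JCL_DIR.glob("*.jcl"):          # hypothetical extension for downloaded members
    for line in member.read_text(errors="ignore").splitlines():
        if line.startswith("//*"):            # skip JCL comment statements
            continue
        counts.update(DSN_PATTERN.findall(line.upper()))

# Most frequently referenced datasets first
for dsn, n in counts.most_common():
    print(f"{n:5}  {dsn}")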
TECHNICAL SKILLS:
Hardware Platforms: IBM 3090 Mainframe
OS: OS/390 (MVS/ESA), MVS, Windows, UNIX
Front End/GUI: HTML, XML
DBMS: DB2, SQL Server
Programming Languages: COBOL, JCL, C, C++, Core Java, COCOA, CICS, VSAM
Tools: File-AID, SPUFI, QMF, Xpediter, REXX, Control-M, Quality Center, ChangeMan, Endevor, Confidential Office, VSAM, ServiceNow, Micro Focus Enterprise (Developer & Analyzer), TSO/ISPF, Easytrieve, DB2 utilities, SDSF, FTP
Scripting Languages: JavaScript, Python, REXX
Implementation models: Waterfall, Agile
WORK EXPERIENCE:
Confidential
Mainframe Modernization Consultant
Responsibilities:
- Analyzed SMF data for high-CPU-consuming jobs and gave recommendations for performance improvement and MIPS reduction (an illustrative sketch follows this list)
- Gave recommendations for DASD cleanup by analyzing DASD listings
- Implemented changes in SQL queries to extract data from DB2 tables using the SPUFI/QMF environment
- Recommended MIPS improvements using SYNCSORT and JCL changes to reduce calls to DB2
- Ran the EA tool to generate the Code Quality, Optimization, Complexity, and Dead Code reports
- Prepared a recommendation document for each proposed performance improvement change
- Utilized debugging tools including File-AID, Abend-AID, and Xpediter
- Recommended changes in COBOL-DB2 applications and in SYNCSORT steps within JCLs for MIPS optimization
- Incorporated the recommended changes into the respective applications to improve MIPS usage
- Performed unit testing of the MIPS optimization changes in the applications
- Used basic dataset, copy, exclude, move, and PDS commands along with ISPF utilities for easy access to datasets
- Used File-AID for file browsing and file comparison
- Used TSO/ISPF utilities for browsing and accessing PS, PDS, and GDG datasets
- Achieved significant savings through reduced MIPS usage for the changed applications
- Provided support to the client while deploying the applications containing the optimization changes
- Ensured timely delivery of work requests after walkthroughs and peer reviews
- Debugged and provided fixes for technical issues within the team
- Provided solutions to improve existing processes
- Extensively used IBM DB2, stored procedures, and CICS web services
- Delivered projects of various sizes, including performance tuning, production support, and troubleshooting
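For illustration only, a minimal sketch of the SMF analysis step mentioned in the first bullet above, assuming the SMF type-30 data has already been exported to a CSV file; the file name and column names (JOBNAME, CPU_SECONDS) are hypothetical.

import csv
from collections import defaultdict

cpu_by_job = defaultdict(float)
with open("smf_type30_extract.csv", newline="") as f:   # hypothetical SMF extract
    for row in csv.DictReader(f):
        cpu_by_job[row["JOBNAME"]] += float(row["CPU_SECONDS"])

# The top CPU consumers are the first candidates for MIPS-reduction review
for job, cpu in sorted(cpu_by_job.items(), key=lambda kv: kv[1], reverse=True)[:20]:
    print(f"{job:<8}  {cpu:12.2f}")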
Environment: Mochasoft client, COBOL, DB2, TSO/ISPF, Xpediter, File-AID, DB2 utilities, Micro Focus Enterprise Analyzer, SYNCSORT, FTP, NDM, JCL, JCL procedures
Mainframe Modernization Consultant
Confidential
Responsibilities:
- Involved in SOW Preparation
- Prepared a questionnaire for the SMEs of the COBOL-DB2, COBOL-CICS, COBOL-IMS, and Easytrieve applications to gather details about their environments
- Took full ownership of running the Micro Focus EA tool and generating the Code Quality, Optimization, Complexity, and Dead Code reports that describe the code inventory
- Analyzed the Code Quality, Optimization, Complexity, and Dead Code reports to make recommendations and assign scores to applications
- Prepared a document for each application describing its inventory findings, migration approach, pain points, and how amenable the application is to change
- Took full ownership of bringing junior team members up to speed on various tools and accelerators
- Wrote Python scripts to automate manual work (an illustrative sketch follows this list)
- Delivered projects of various sizes, including performance tuning, production support, and troubleshooting
- Worked extensively on job sequences to control job-flow execution using activities such as Job Activity, Email Notification, Sequencer, Routine Activity, and Exec Command Activity
- Used File-AID to create and modify VSAM (KSDS), PDS, and PS datasets
- Used TSO/ISPF utilities to create and maintain PDS and PS datasets, to monitor spool output, and to migrate files from one region to another
- Analyzed all JCL jobs and PROCs used for daily, monthly, and yearly reporting
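For illustration only, a minimal sketch of the kind of Python utility written to avoid manual work: it merges per-application CSV report exports from Enterprise Analyzer into one consolidated CSV. The folder name, file names, and the assumption that all exports share the same columns are hypothetical.

import csv
from pathlib import Path

REPORT_DIR = Path("ea_exports")            # hypothetical folder of per-application exports
rows = []
for report in sorted(REPORT_DIR.glob("*.csv")):
    with report.open(newline="") as f:
        for row in csv.DictReader(f):
            row["SOURCE_REPORT"] = report.stem   # remember which export the row came from
            rows.append(row)

if rows:
    # Assumes all exports share the same columns; headers are taken from the first row
    with open("ea_consolidated.csv", "w", newline="") as f:
        writer = csv.DictWriter(f, fieldnames=list(rows[0].keys()))
        writer.writeheader()
        writer.writerows(rows)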
Environment: Mochasoft client, COBOL, CICS, Easytrieve, DB2, TSO/ISPF, Xpediter, File-AID, Python, DB2 utilities, Micro Focus Enterprise Analyzer, SYNCSORT
Mainframe Consultant
Confidential
Responsibilities:
- Worked on Micro Focus Enterprise Developer and compiled zRef application components
- Created a server instance in Enterprise Server and configured it to come up and run for the PoC
- Set up a new database and created tables with data in SQL Server 2012 Express for use by the zRef application
- Created a new CTLM schedule in the test environment to bring new jobs into the existing schedule
- Thoroughly tested and implemented the same on Mochasoft and Rumba clients in the Azure environment
- Uploaded the resource definition file to Azure using the command-line environment
- Helped the testing team understand the re-hosted zRef application
- Documented the entire process for the team's future reference
- Coordinated with team members and planned work to meet project deadlines
- Prepared project documents for quality assurance, quality control, and future reference
- Collected inventory for high-CPU-consuming applications
- Deployed and configured Confidential in-house tools (Inventory X-Ref, CRUD, COBOL BRE) in the client environment
- Analyzed the inventory and customized the tools to cover unique scenarios found in it
- Executed the tools on the inventory and produced artifacts
- Analyzed the artifacts and determined the feasibility of offloading applications to the Hadoop environment
- Prepared the due-diligence and PoC reports for final delivery
- Used Syncsort DMX-h to offload flat files onto HDFS (a simplified sketch of the flow follows this list)
- Moved JCL/COBOL logic to Hadoop Pig using an internal tool
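For illustration only: the actual offload used Syncsort DMX-h, but the sketch below shows the overall flow in simplified form, pulling a flat file off the mainframe with FTP and landing it in HDFS with the standard hdfs dfs -put command. Host name, credentials, dataset names, and paths are hypothetical placeholders.

import subprocess
from ftplib import FTP

LOCAL_FILE = "customer_master.dat"

with FTP("mainframe.example.com") as ftp:              # hypothetical LPAR host
    ftp.login(user="batchid", passwd="secret")         # hypothetical credentials
    with open(LOCAL_FILE, "wb") as out:
        # RETR of a sequential dataset; quoting follows z/OS FTP conventions
        ftp.retrbinary("RETR 'PROD.CUSTOMER.MASTER'", out.write)

# Land the extracted file in HDFS for downstream Pig/Hive processing
subprocess.run(["hdfs", "dfs", "-put", "-f", LOCAL_FILE, "/data/landing/"], check=True)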
Environment: COBOL, CICS, Easytrieve, DB2, TSO/ISPF, Xpediter, File-AID, Python, DB2 utilities, Micro Focus Enterprise Analyzer, SYNCSORT, CTLM
Confidential
Mainframe Developer
Responsibilities:
- Involved in project requirements analysis and system study, and contributed to the estimation plan
- Performed impact analysis on applications based on project changes, as per requirements
- Prepared and organized the project-related specification, design, and other documentation as per process
- Wrote new modules and implemented code changes in COBOL-DB2 applications according to coding standards
- Performed unit testing on changed modules using masked production files
- Created new JCLs and procedures and changed existing ones for the modified modules using custom REXX scripts (an illustrative sketch follows this list)
- Created a new CTLM schedule in the test environment to bring new jobs into the existing schedule
- Performed system integration testing using the newly created CTLM schedule and the changed modules
- Used the Quality Center (QC) tool to track unit, system integration, regression, and integration test cases
- Conducted reviews, including peer reviews and reviews of unit/system test strategies, plans, and results, and handled user acceptance testing support
- Used ServiceNow to monitor production job status, route tickets to the respective teams, and work with DB2 admin teams on DB2-related changes and bulk loads
- Handled project deployment and provided production support through ServiceNow during deployment
- Analyzed the job log and resolved production ABENDs; performed impact analysis to resolve user queries.
- Actively contributed opinions and solutions to project and team problems
- Provided support for project deployment through IMR fixes, implementation, and CR activities
- Involved in training and knowledge transition for new resources joining the team
- Responsible for ensuring the quality of deliverables (components and documents)
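For illustration only: the actual JCL/PROC generation mentioned above used custom REXX scripts; the sketch below is a minimal Python analogue of the same templating idea. The job card, dataset names, and module list are hypothetical placeholders.

from string import Template

RUN_JCL = Template("""\
//${module}J JOB (ACCT),'RUN ${module}',CLASS=A,MSGCLASS=X
//STEP010  EXEC PGM=${module}
//STEPLIB  DD DSN=TEST.LOADLIB,DISP=SHR
//SYSOUT   DD SYSOUT=*
//INFILE   DD DSN=TEST.${module}.INPUT,DISP=SHR
""")

changed_modules = ["PAYCALC", "BILLEXT"]   # hypothetical changed COBOL modules

# Write one run-JCL member per changed module
for module in changed_modules:
    with open(f"{module}.jcl", "w") as f:
        f.write(RUN_JCL.substitute(module=module))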
Environment: DB2, JCL, COBOL, ServiceNow, ChangeMan, Quality Center, REXX, CTLM, TSO/ISPF, SPUFI, QMF, File-AID, Python, DB2 utilities, DMX-h SYNCSORT, Xpediter