Sr. DataStage Developer Resume
East Peoria, IL
Professional Summary
- Over 7 years of overall IT experience in analyzing, designing, developing, testing, implementing, and maintaining client/server business systems.
- More than 5 years of hands-on experience as a DataStage Consultant.
- Experience in Data Warehousing and Data Migration.
- Experience with Extraction, Transformation, and Loading (ETL) tools – Ascential WebSphere DataStage 7.0, 7.2, and 7.5, and IBM InfoSphere DataStage 8.1 and 8.5.
- Extensively worked on DataStage Parallel Extender and Server Edition.
- Used both pipeline and partition parallelism to improve performance.
- Developed Parallel jobs using various stages like Join, Merge, Lookup, Surrogate Key, SCD, Funnel, Sort, Transformer, Copy, Remove Duplicates, Filter, Pivot, and Aggregator for grouping and summarizing on key performance indicators used in decision support systems.
- Frequently used the Peek, Row Generator, and Column Generator stages for debugging.
- Expertise in the Software Development Life Cycle (SDLC) of projects – system study, analysis, physical and logical design, resource planning, coding, and implementing business applications.
- Expertise in performing data migration from various legacy systems to target databases.
- Expertise in Data Modeling, OLAP/OLTP systems, and generation of surrogate keys; data modeling experience using the Ralph Kimball and Bill Inmon methodologies, implementing Star and Snowflake schemas with the Erwin data modeling tool.
- Experience in UNIX shell scripting for file manipulation; strong knowledge of scheduling DataStage jobs using crontab, as well as familiarity with Autosys.
- In-depth knowledge of Data Warehousing & Business Intelligence concepts with emphasis on ETL and life cycle development, including requirement analysis, design, development, testing, and implementation.
- Expertise in OLTP/OLAP system study, analysis, dimensional modeling, and E-R modeling; involved in designing dimensional models (Star schema and Snowflake schema) and in database administration.
- Experience in Data Warehouse development; worked with data migration, data conversion, and Extraction/Transformation/Loading (ETL) using Ascential DataStage with DB2 UDB, Oracle, and SQL Server.
- Extensively used DataStage tools (DataStage Designer, DataStage Manager, and DataStage Director); experience in forward/reverse engineering using Erwin.
- Involved in dimensional data modeling (star schema, snowflake schema), fact and dimension table design, and physical and logical data modeling using the Erwin tool.
- Extensive experience in development, debugging, troubleshooting, monitoring, and performance tuning using DataStage Designer, DataStage Director, and DataStage Manager.
- Created job sequences and job schedules to automate the ETL process, extracting data from flat files, Oracle, and Teradata into the Data Warehouse.
- Experience in integration of various data sources like Oracle, Teradata, DB2, SQL Server, and Mainframes into ODS and DWH areas.
- Experience in writing, testing, and implementing procedures, functions, packages, and triggers at the database level using PL/SQL.
- Used PVCS, ClearCase, and Subversion to control different versions of the jobs.
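As an illustration of the crontab scheduling mentioned above, a DataStage job can be kicked off from cron via the `dsjob` command-line client. The schedule, install path, project, and job names below are hypothetical, not taken from any project described here:

```shell
# Illustrative crontab entry (minute hour day month weekday  command).
# Run the nightly load sequence at 01:30 and append output to a log.
30 1 * * * /opt/IBM/InformationServer/Server/DSEngine/bin/dsjob -run -jobstatus DWPROJ NightlyLoadSeq >> /var/log/etl/nightly_load.log 2>&1
```

The `-jobstatus` flag makes `dsjob` wait for the job to finish and return its status, so the cron log reflects whether the run succeeded.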
Technical Skills
ETL Tools: DataStage 8.1/8.0/7.5x2/7.x/6.x EE & SE (Administrator, Designer, Director, Manager), MetaStage, QualityStage, ProfileStage [Information Analyzer], Parallel Extender, Server & Parallel Jobs
Databases: Oracle 8i/9i/10g, Teradata, SQL Server, DB2 UDB/EEE, Mainframe
Operating Systems: Red Hat Enterprise Linux 4.x/3.x/2.1, HP-UX 10.x/11.x, Sun Solaris 2.5/2.6/8/9/10, IBM AIX 5.2/5.1, Windows 95/98/2000/NT/XP
Languages: SQL, PL/SQL, UNIX Shell Scripting, Perl Scripting, C, COBOL
Data Modeling Tools: Erwin 4.0, Star Schema, Snowflake Schema, Facts and Dimensions
Tools: SQL*Loader, SQL*Plus, SQL Tools
Professional Experience
Confidential, East Peoria IL November 2011-Present
Sr. DataStage Developer
Confidential is the world’s largest manufacturer of construction and mining equipment, diesel and natural gas engines and natural gas turbines.
Responsibilities:
- Worked for the ICC team and the Mach3 Middleware team.
- Worked on DataStage IIS v8.5 and IS v8.0.
- Interacted frequently with the Mach3 Middleware team.
- Interpreted the TTDs provided, developed and processed the code, and unit-tested jobs per requirements.
- Daily interaction with the Middleware team and colleagues from the SAP and Mainframe teams on issues related to inbound and outbound processes.
- Constant work with SAP IDocs and IDoc segments, the XML extract stage, MQSeries, complex flat files, datasets, flat files, the XML stage, lookups, joins, FTPing files to the mainframe, etc.
- Worked on various Middleware DataStage jobs (RICEFs) belonging to Vendor, Comp Parts, MRC Receipts, Demand & Demand PO, General Ledger, BOM, SuperBOM, VPPA Routings, Service Building Indicator, Order Acknowledgement, Change Master, 2973 Brazil input files, and many more; provided support for numerous other Middleware jobs.
- Coding for the Java Transformation stage and the XML stage.
- Extensive use of UNIX commands in sequence jobs.
- Monitored and scheduled jobs in DataStage Director and in Tidal, and resolved any issues that occurred.
- Used the Tidal job scheduling tool for off-shift support work (24x7, every seventh week) for job migrations.
- Frequently used Tufops to store input and output files, which serve as DataStage job input or output paths; Tufops makes it convenient to share files with SAP, the mainframe, DataStage, etc., per job requirements.
- Used BMC Remedy to create tickets during migration-issue support and for DEV, QA, Pre-Prod, and Prod disk space issues.
- Used Citrix for secure access to DataStage Designer and Director and to Tidal in the test, pre-prod, and prod environments.
- Worked on the CGDS migration process using the DataStage tool with DB2 UDB, SQL, and Teradata databases.
- Development of datastage design concepts, execution, testing and deployment on the client server.
- Modifying the existing Job if required.
- Used Datastage Designer for developing various jobs to extract, cleansing, transforming, integrating and loading data into Data Warehouse.
- Used Datastage Director to schedule running the jobs, monitoring scheduling and validating its components.
- Used Erwin for Data modeling.
- Frequent usage of Clear Case version control.
- Running and monitoring of Jobs using Datastage Director and checking logs.
- Involved in performing extensive Back end Testing by writing SQL queries to extract the data from database using Oracle SQL and Pl/SQL.
- Unit testing for the Jobs Developed
- Monitoring all data loads and fixing the errors
- Successive development of WIKI’s for Middleware RICEF’s or datastage jobs for the common and future issues come across in the Mach3 Middleware Team
- Used Primavera in according to datastage work requirement
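The UNIX-command work in the sequence jobs above can be sketched as a small helper of the kind an Execute Command activity might call; file names, messages, and return codes are illustrative, not taken from the project:

```shell
#!/bin/sh
# Illustrative pre-load check: confirm an inbound file exists and has data
# before a DataStage sequence proceeds. Returns 0/1/2 for OK/missing/empty.
check_inbound() {
    f="$1"
    if [ ! -f "$f" ]; then
        echo "MISSING $f"
        return 1
    fi
    if [ ! -s "$f" ]; then
        echo "EMPTY $f"
        return 2
    fi
    recs=$(wc -l < "$f" | tr -d ' ')
    echo "OK $recs records in $f"
    return 0
}
```

A sequence job could branch on the exit code, for example routing to a notification activity when the file is missing.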
Confidential, Charlotte NC September 2011-November 2011
Sr. DataStage Developer
Confidential is a leading organization which provides insurance and retirement for people who work in the academic, research, medical and cultural fields.
Responsibilities:
- Hands-on experience in transforming business-specific rules into functional specs.
- Worked on the OMNI Fund ID Remediation project.
- Worked frequently with the Data Integration Architect to create ETL standards and high-level and low-level design documents.
- Responsible for production support and involved in on-call duty for Data Integration applications.
- Experience in integration of various sources like Teradata, DB2 UDB, SQL Server, Oracle, Sybase, and MS-Access.
- Good knowledge in writing shell scripts to automate file manipulation and data loading procedures.
- Involved in system study, analysis, and project planning.
- Hands-on experience in tuning DataStage jobs: identifying and resolving performance bottlenecks at various levels, such as source and target jobs.
- Strong experience in designing Parallel jobs, Server jobs, Job Sequencers, and Batch jobs in DataStage.
- Development and support experience with Perl applications.
- Frequent use of different stages like CDC, Lookup, Join, Surrogate Key, debugging stages, Pivot, Remove Duplicates, etc.
- Involved in writing SQL queries.
- Frequently used StarTeam version control for exporting and importing jobs with the DataStage tool.
- Tuned SQL statements and stored procedures.
- Involved in unit testing and deployment of the application.
- Transferred old data from the legacy system to the application database.
- Trained users; supported and maintained the application.
- Held discussions with the client on bug fixes and customization of the application.
Environment: DataStage 8.5/8.1/8.0, Oracle 10g, Teradata, SQL, PL/SQL, Perl, COBOL, UNIX, Windows NT
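The shell-script file manipulation mentioned above might look like the following sketch; the fixed one-line header/trailer layout is an assumption for illustration only:

```shell
#!/bin/sh
# Illustrative load preparation: strip a one-line header and one-line
# trailer from a delimited extract, then drop exact duplicate records.
prepare_extract() {
    in="$1"
    out="$2"
    # sed '1d;$d' deletes the first (header) and last (trailer) lines;
    # sort -u removes exact duplicate data records.
    sed '1d;$d' "$in" | sort -u > "$out"
}
```

The cleaned file can then be handed to a DataStage job as its input path.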
Confidential, Milwaukee WI February 2010 – August 2011
Sr. ETL / Sr. DataStage Developer
Confidential is one of the world’s leading technology providers to the banking industry. FIS does processing for more than 300 banks and financial companies for around 260 different applications residing on 18 different servers with more than 80 terabytes of data a day.
Responsibilities:
- Involved in complete Data Warehouse Life Cycle from requirements gathering to end user support.
- Provided day-to-day and month-end production support for various applications, like Business Intelligence Center and Management Data Warehouse, by monitoring servers and jobs on UNIX.
- Worked in an onsite-offshore environment; assigned technical tasks, monitored the process flow, conducted status meetings, and ensured business needs were met.
- Involved in Designing, Testing and Supporting DataStage jobs.
- Developed Parallel jobs using various stages like Join, Merge, Lookup, Surrogate Key, SCD, Funnel, Sort, Transformer, Copy, Remove Duplicates, Filter, Pivot, and Aggregator for grouping and summarizing on key performance indicators used in decision support systems.
- Accomplished various development requests through mainframe utilities and CICS conversations.
- Met with clients on a weekly basis to provide better service and maintain the SLAs.
- Redesigned and modified existing jobs and shell scripts in the production environment to fix daily aborts.
- Developed plug-ins in C to implement domain-specific business rules.
- Used Control-M to schedule jobs by defining the required parameters, and monitored the flow of jobs.
- Automated the process of generating daily and monthly status reports for the processing jobs.
- Created Teradata stored procedures to generate automated testing SQL.
- Dropped indexes, removed duplicates, rebuilt indexes, and reran jobs that failed due to incorrect source data.
- Involved in the process of two client bank mergers by taking care of the customer account numbers, bank numbers, and their respective applications.
Environment: IBM InfoSphere DataStage 8.5/8.1, SunOS 5.8, Control-M 6.4.01, PL/SQL Developer 7.1, Teradata 12, Erwin, Autosys, Toad, Microsoft Visual Studio 2008 (Team Foundation Server), Case Management System, CA Harvest Change Management
Confidential, Rochester NY October 2009 – February 2010
ETL /DataStage Developer
Confidential is a leading health insurance organization in the United States. My role involved working on a team for the Claim Processor project, which aimed at developing extracts for different states; this included developing jobs from scratch and writing the shell scripts supporting them.
Responsibilities:
- Extensively worked on gathering requirements, and was involved in validating and analyzing them for the DQ team.
- Used both Pipeline and Partition Parallelism for improving performance.
- Used lookup stage with reference to Oracle tables for insert/update strategy and updating of slowly changing dimensions.
- Imported metadata from repository, created new job categories, routines and data elements using Datastage Manager.
- Involved in Performance Tuning of Jobs.
- Designed the mappings from external source files and databases, such as SQL Server, .CSV, and flat files, to operational staging targets.
- Assisted the operations support team with transactional data loads by developing SQL and UNIX scripts.
- Responsible for performance-tuning ETL procedures and STAR schemas to optimize load and query performance.
- Migrated XML data files to Oracle data mart for Data Lineage Statistics.
- Used import/export utilities to transfer data from production instance to the development environment.
- Wrote configuration files for the performance and production environments.
- Used DataStage Designer to develop processes for extracting, cleansing, transforming, integrating, and loading data into the data warehouse database.
- Worked on ETL enhancements and bug fixes as required through proper release process.
- Used DataStage Director and its run-time engine to schedule and run the solution, test and debug its components, and monitor the resulting executable versions (on an ad hoc or scheduled basis).
- Developed Korn shell scripts to automate file manipulation and data loading procedures.
- Used PVCS to control different versions of the jobs.
- Involved in performance tuning of the ETL process, and performed data warehouse testing.
- Involved in test strategy and created test scripts for the developed solution.
- Extensively designed UNIX shell scripts to handle huge files and used them in DataStage.
- Worked with Autosys for setting up production job cycles for daily, weekly, monthly loads with proper dependencies.
Environment: Ascential DataStage 7.5.2/7.5.3 (Server/Parallel), Oracle 10g/9i, DB2 UDB, PVCS, UNIX, Windows XP, Toad, SQL Developer 2.0, Autosys, Erwin
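As a sketch of the Autosys production cycles described above, a daily load with a dependent job can be defined in JIL; the job names, machine, and script paths here are hypothetical:

```
/* Illustrative JIL: nightly load followed by a dependent report job. */
insert_job: dw_daily_load
job_type: c
command: /opt/etl/bin/run_daily_load.sh
machine: etlprod01
days_of_week: all
start_times: "02:00"
alarm_if_fail: 1

insert_job: dw_daily_report
job_type: c
command: /opt/etl/bin/daily_status_report.sh
machine: etlprod01
condition: s(dw_daily_load)
```

The `condition: s(dw_daily_load)` line makes the report job run only after the load job completes successfully, which is how the daily/weekly/monthly dependencies are expressed.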
Confidential, Columbus OH September 2008 – October 2009
ETL / DataStage Developer
The Confidential Enterprise Warehouse (AFEW) was used to maintain and analyze various store needs and trends for Abercrombie & Fitch, and to provide information related to various assets and their value/status, space, and clothing lines and trend information.
Responsibilities:
- Worked extensively with Parallel stages like Copy, Join, Merge, Lookup, Row Generator, Column Generator, Modify, Funnel, Filter, Switch, Aggregator, Remove Duplicates, and Transformer.
- Gathered requirements and wrote specifications for ETL Job modules.
- Worked as SME in providing support to the team in designing the flow of complex jobs.
- Apart from providing technical support to the team, I also handled escalations.
- Worked on production support by selecting and transforming the correct source data.
- The Data Warehouse was implemented using sequential files from various source systems.
- Worked closely with Database Administrators and BAs to better understand the business requirements.
- Developed mappings for Data Warehouse and Data Mart objects.
- Performed thorough data cleansing using the Investigate stage of QualityStage, and by writing PL/SQL queries to identify and analyze data anomalies, patterns, inconsistencies, etc.
- Used DataStage Manager for importing metadata from repository, new job categories and creating new data elements.
- Designed and developed ETL jobs using the DataStage tool to load the data warehouse and Data Mart.
- Performed performance tuning of ETL jobs.
- Performed data manipulation using BASIC functions and DataStage transforms.
- Imported relational metadata information for the project.
- Developed DataStage routines for job auditing and for extracting job parameters from files.
- Created master controlling sequencer jobs using the DataStage Job Sequencer.
- Created and used DataStage shared containers and local containers for DS jobs and for retrieving error log information.
- Designed, built, and managed complex data integration and load processes.
- Developed PL/SQL scripts to perform activities at database level.
- Developed UNIX scripts to automate the Data Load processes to the target Data warehouse.
Environment: IBM Ascential DataStage 7.5 (DataStage, QualityStage, Information Analyzer, Metadata Workbench, Business Glossary), Oracle 9i/10g, DB2 UDB, Teradata, Mainframe, PL/SQL, Oracle 10g with 2-node RAC, Autosys, Erwin 4.2, TOAD, SQL Developer, PVCS, Business Objects XI, Shell Scripts, HP-UX, Windows XP
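The load-monitoring UNIX scripting above can be sketched as a simple log scan; the error patterns and log layout are assumptions for illustration, not from the project:

```shell
#!/bin/sh
# Illustrative monitor for load logs: flag a log containing Oracle or
# fatal errors and report how many matching lines were found.
scan_load_log() {
    log="$1"
    # grep -c counts matching lines; "|| true" keeps a zero count from
    # aborting the script when no errors are present.
    errs=$(grep -c -E 'ORA-[0-9]+|FATAL' "$log" || true)
    if [ "$errs" -gt 0 ]; then
        echo "ERRORS:$errs $log"
        return 1
    fi
    echo "CLEAN $log"
    return 0
}
```

A nightly wrapper could loop over the day's logs and page the on-call developer whenever the function returns non-zero.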
Confidential, Buffalo NY January 2007 – August 2008
Datastage Developer
Confidential is one of the largest banking, financial, and mortgage services organizations in the world. The project facilitates the active reporting process for the HR Benefits department by loading health insurance plan and service data for Confidential employees, along with GL data, into an Oracle database for reporting.
Responsibilities:
- Extracted, cleansed, transformed, integrated, and loaded data into a DW database using DataStage.
- Designed and created Parallel Extender jobs that distribute the incoming data concurrently across all the processors to achieve the best performance.
- Involved in designing jobs and analyzing the scope of the application, defining relationships within and between groups of data, the star schema, etc.
- Created Server jobs, stored them in shared containers, and used them in Parallel jobs.
- Imported metadata into repository and exported jobs into different projects using DataStage Manager.
- Developed shell scripts to automate file manipulation and data loading procedures.
- Used DataStage PX for splitting the data into subsets and flowing of data concurrently across all available processors to achieve job performance.
- Used UniVerse BASIC to develop user-defined routines and transformations.
- Extensively used DataStage XE Parallel Extender to perform processing of massive data volumes.
- Created Autosys scripts to schedule jobs.
- Used ClearCase for version control and migration of code between the Development, UAT, and Production environments.
Environment: Ascential DataStage 7.5.2, Oracle 9i/10g, DB2, DB2 UDB, Mainframe, PVCS, SQL, PL/SQL, TOAD, ClearCase, Autosys, Shell Scripts, HP-UX