
Sr. DataStage Consultant Resume


Detroit, MI

PROFESSIONAL SUMMARY

  • Eight years of IT Experience in Software Analysis, Design and Development in Client/Server systems, Data warehousing and Business Intelligence applications in UNIX and Windows environments.
  • Good exposure to Software Development Life Cycle (SDLC) including analysis, design, construction, testing, implementation and support.
  • Over five years of experience in building Data Warehouses/Data Marts using Ascential DataStage/IBM WebSphere DataStage (IBM Information Server).
  • Two years of experience in IBM WebSphere QualityStage/Information Analyzer.
  • Extensively worked with DataStage Parallel Extender and Server Edition.
  • Well versed in various DataStage PX strategies; implemented a number of DataStage jobs using DataStage 8.01.
  • Good knowledge of dimensional modeling (star and snowflake schemas) for data warehouses and data marts.
  • Excellent skills and experience using the different stages in the Server and Parallel Extender editions, such as Aggregator, Pivot, Join, Lookup, Sort, Merge, Funnel, Change Capture, and Change Apply.
  • Experience with QualityStage stages such as Investigate, Standardize, Match, and Survive.
  • Excellent experience in Teradata development using the BTEQ, FastLoad, MultiLoad, TPump, and FastExport utilities.
  • Experienced in developing ETL methodology for data transformation and processing in corporate-wide ETL solutions.
  • Experience integrating various data sources such as Oracle, SQL Server, complex flat files, flat files, UDB DB2, and MS Access into the staging area.
  • Knowledge of the SAP environment, including extracting data from the SAP application server.
  • Good knowledge of XML parsing and its use in IBM Information Server.
  • Strong technical exposure and a good degree of competence in business domains such as automotive, healthcare, insurance, retail, and financial services.
  • Experienced in database design, data analysis, development, SQL performance tuning, data models, ETL processes and data conversions.
  • Strong knowledge in writing and maintaining PL/SQL, Stored Procedures, Functions, Packages and Triggers.
  • Good working knowledge in writing Shell Scripts in UNIX environment.
  • Proficient in using databases such as Oracle, MS SQL Server, and DB2, and in PL/SQL and SQL*Plus.
  • Excellent communication, analytical skills and strong ability to perform as a team player.

Technical Skills

Operating Systems: Windows NT 4.0, Windows 2000/2003/XP, Sun Solaris, UNIX, Linux
Languages: Oracle PL/SQL, C, C++, XML, HTML, UNIX Shell Scripting, JCL, COBOL
Database/Tools: SQL*Plus Client, SQL*Loader, Export and Import, SQL Developer, SQL Navigator, ERwin 4.5, TOAD 9.5
ETL Tools: IBM DataStage 8.01, Ascential DataStage 7.5.2/7.0/6.0 (DataStage Enterprise, Parallel, and Server Editions), IBM WebSphere QualityStage, Information Analyzer, Informatica 8.0/7
Teradata Utilities: BTEQ, FastLoad, MultiLoad, TPump, FastExport
Databases: Oracle 10g/9i/8i, Teradata V2R4, IBM DB2, IBM Mainframes, SQL Server 7.0/2000/2005

Professional Experience

Confidential, Detroit, MI May 2009 – Present
Sr. DataStage Consultant
GMAC Financial Services is a global finance company specializing in automotive finance, real estate finance, insurance, commercial finance, and online banking. The main goal of the EDW project is to extract data from the Global Information Warehouse (GIF), standardize the source data, and load it into the Enterprise Data Warehouse (EDW) using IBM Information Server 8.1.
Responsibilities:

  • Involved in the design and documentation of the high-level design document and the detailed design document.
  • Created source to target mapping documents from staging area to Data Warehouse and from Data Warehouse to various Data Marts.
  • Involved in mapping team discussions to solve the issues in mapping document/Design.
  • Used Star Schema for loading data into Facts and Dimension tables in Oracle Warehouse.
  • Designed jobs using different parallel job stages such as Join, Merge, Lookup, Remove Duplicates, Copy, Filter, Funnel, Dataset, Lookup File Set, Change Data Capture, Modify, and Aggregator.
  • Extensively used IBM Information Server 8.1 and designed DataStage ETL jobs to extract data from heterogeneous source systems, transform it, and load it into the Data Warehouse.
  • Extensively handled slowly changing dimensions (SCD); created parameter sets in IBM InfoSphere DataStage Designer and used them in DataStage jobs.
  • Implemented slowly changing dimensions (Type 2) using both the Change Data Capture (CDC) stage and the Slowly Changing Dimension stage (see the SQL sketch after this list).
  • Used QualityStage to standardize and cleanse the source data coming from GIF.
  • Extensively worked on Integrity processes such as Investigation, Standardization, Matching, and Survivorship, along with pre-defined rule sets for data cleansing.
  • Used shared containers for business logic repeated across the project, especially for the Audit, Balance and Control process.
  • Developed complex jobs to incorporate changing business rules and perform quality checks on data received from external vendors.
  • Used DataStage Director to run and monitor jobs, and automated job control using batch logic to execute and schedule various DataStage jobs for testing.
  • Exported DataStage components and packaged projects using DataStage Designer, and created backups.
  • Wrote shell scripts for batch jobs that obtain data overnight from various locations, scheduled through the Tivoli scheduler.
  • Worked on Cognos 8.4 to generate the risk dashboards for the monthly refresh.
  • Worked on Framework Manager, Transformer, and Report Studio to create the designs, models, and reports for the monthly dashboard.
  • Involved in developing UNIX shell scripts for batch jobs and state-file creation.
  • Used Subversion (version control tool) for versioning DataStage jobs, UNIX shell scripts, and table DDLs.
  • Performed data analysis to understand and triage issues in the project.
  • Performed performance tuning of DataStage jobs, SQL, and Oracle.
  • Prepared test documents, which involved preparing test data and test cases and unit testing the jobs.
  • Provided data modeling support for numerous strategic application development projects.
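
The Type 2 handling above follows the usual expire-and-insert pattern. Below is a minimal SQL sketch of the equivalent set-based logic, assuming a hypothetical customer dimension, a CDC-staged change table, and a key sequence (all names are illustrative, not from the actual project); in the jobs themselves this work is done by the CDC and Slowly Changing Dimension stages.

    -- Step 1: expire the current row for every natural key flagged as changed.
    UPDATE dim_customer d
       SET d.effective_end_dt = TRUNC(SYSDATE) - 1,
           d.current_flag     = 'N'
     WHERE d.current_flag = 'Y'
       AND EXISTS (SELECT 1
                     FROM stg_customer_changes s
                    WHERE s.customer_nk = d.customer_nk
                      AND s.change_code = 'U');

    -- Step 2: insert a new current row for every insert or update.
    INSERT INTO dim_customer
           (customer_sk, customer_nk, customer_name, risk_segment,
            effective_start_dt, effective_end_dt, current_flag)
    SELECT dim_customer_seq.NEXTVAL, s.customer_nk, s.customer_name, s.risk_segment,
           TRUNC(SYSDATE), DATE '9999-12-31', 'Y'
      FROM stg_customer_changes s
     WHERE s.change_code IN ('I', 'U');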

Environment: IBM Information Server 8.1 (DataStage EE), QualityStage, Oracle 10g, Tivoli Scheduler, Cognos 8.4, SQL, PL/SQL, HP-UX 11, UNIX Shell Scripts.

Confidential, Des Moines, IA Sep 08 – Apr 09
DataStage Developer

Wellmark Blue Cross and Blue Shield is one of the biggest companies providing health insurance to millions of people across the United States. Wellmark Blue Cross and Blue Shield, associated with the Blue Cross Blue Shield Association (BCBSA), is part of a national network of 39 plans that insures nearly 100 million people (nearly one in three Americans) in all 50 states, the District of Columbia, and Puerto Rico. I was involved in the Membership, Providers, and Claims projects for the development of the IDS/IDW Data Warehouse. The main aim of these projects is to read the files from the mainframe, do a net change, capture the changes, apply business logic, and load the data into the Data Warehouse, a DB2 database, using IBM WebSphere DataStage 8.01.
Responsibilities:

  • Extensive experience in data warehousing and in designing the Extract, Transform & Load (ETL) environment.
  • Involved in the design and development of IDS/IDW Data Warehouse.
  • Worked in different environments like Client/Server, Data Warehousing, Data Processing and XML technologies.
  • Worked with data mapping and logical & physical modeling along with design and implementation.
  • Designed and developed processes for Extracting, Cleansing, Transforming, Integrating, and Loading data into Staging Area and then loading into the IDS/IDW Data Warehouse which is further used for creating Data Marts.
  • Well versed in DataStage configuration files and PX datasets in IBM Information Server 8.01.
  • Extensively worked with Parallel Extender/Enterprise Edition (EE) using Parallel Processing (pipeline and partition parallelism) techniques to improve job performance while working with bulk data sources.
  • Worked with Parallel Extender to speed up data processing, scalability and performance.
  • Developed many shared containers that are used for the Audit, Balance and Control processes in the system and in various DataStage jobs.
  • Designed complex jobs in IBM Information Server 8.01 by using different stages like Aggregator, Copy, Filter, Funnel, Join, Lookup, Modify, Sort and Transformer.
  • Used the Complex Flat File stage to extract data from AS400 systems in EBCDIC format and convert it to ASCII format for transformation and loading into DB2 tables.
  • Used the Change Data Capture stage to capture changes and load them into the DB2 database depending on the change condition (update, insert, delete).
  • Used the upsert method when incoming data had to be both inserted and updated in the DB2 database (see the MERGE sketch after this list).
  • Used stage variables in the Transformer to decrease the complexity of the functions and increase the performance of the job.
  • Used the XML Input stage to parse XML data and load it into delimited files for the request and response messages.
  • Parameterized DataStage jobs to allow portability and flexibility at runtime, passing values through UNIX shell scripts.
  • Extensively worked on improving job performance by using the Modify, Column Generator, and Filter stages instead of the Transformer stage.
  • Experienced in fine-tuning, troubleshooting, bug fixing, defect analysis, and enhancement of DataStage jobs for optimal performance.
  • Involved in performance tuning of parallel jobs using performance statistics and the APT dump score.
  • Used various Standard and Custom Routines in DataStage jobs for improving the performance of the job and decreasing the complexity of the job.
  • Used before/after job subroutines in job properties to perform UNIX functions before or after the DataStage jobs, further enhancing system performance.
  • Was responsible for setting the Environmental Variables and Job Parameters both at the project level and job level using DataStage Administrator and Designer.
  • Wrote shell scripts to read job parameters from files and trigger DataStage sequencers, which in turn trigger the DataStage jobs.
  • Wrote the ETL configuration file for scheduling DataStage jobs, automating ETL processes, and handling error management and audit management.
  • Involved in unit testing, system testing, and integration testing of the DataStage jobs and the ETL process.
  • Designed the mapping between the source and target columns to meet the business requirements.
  • Responsible for preparing technical design documents and test case specifications, and for sending DataStage jobs for code review by the tech lead.
  • Actively participated in team meetings to gather the business requirements and develop the specifications.
  • Mentored mid-level ETL DataStage developers on the company standards.
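
The upsert above is an insert-or-update against the same target table, handled in the jobs by the DB2 stage's upsert mode. Expressed directly in DB2 SQL, the equivalent is a MERGE; a minimal sketch with hypothetical table and column names:

    MERGE INTO dw.member_claims AS tgt
    USING stg.member_claims_delta AS src
       ON tgt.claim_id = src.claim_id
    WHEN MATCHED THEN
        -- Row already exists: refresh it with the incoming values.
        UPDATE SET claim_status = src.claim_status,
                   paid_amount  = src.paid_amount,
                   last_updt_ts = CURRENT TIMESTAMP
    WHEN NOT MATCHED THEN
        -- New row: insert it.
        INSERT (claim_id, member_id, claim_status, paid_amount, last_updt_ts)
        VALUES (src.claim_id, src.member_id, src.claim_status,
                src.paid_amount, CURRENT TIMESTAMP);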

Environment: DataStage 8.1, IBM DB2 9.5, Sybase, SQL Server 2005, XML Editor, UNIX Shell Scripting, IBM AIX 5.9, Sun Solaris 10, Windows XP, Microsoft Visio 2007

Confidential, San Antonio, TX Jan 08 – Aug 08
DataStage Developer
United Services Automobile Association (USAA) is a Fortune 200 financial services company focused on providing banking, investing, and insurance to people and families who serve, or have served, in the United States military and selected federal agencies. It is one of the largest companies for auto, homeowners, and life insurance designed to suit individual needs and budgets. The scope of this project is to build a data warehouse for customer service and integrate all the information into it for generating reports. The main objective is to collect all the information in the Customer Collection Information File (CCIF), an IMS database. The data is transferred to the Enterprise Collection Information File (ECIF) using IBM WebSphere DataStage 8.01 and MQ Series, and finally loaded into the Enterprise Data Warehouse (EDW), a DB2 database, using IBM WebSphere DataStage 8.01.
Responsibilities:

  • Involved in data modeling (star schema and snowflake schema) for loading data into the data warehouse.
  • Worked closely with data modeler and database architect during the design and development of ETL technical specification document.
  • Designed the mapping between the source and target columns using IBM WebSphere DataStage 8.01.
  • Designed transformation Logic in DataStage PX Designer for loading data from enterprise data warehouse systems.
  • Used IBM WebSphere DataStage 8.01 to extract, cleanse, transform, integrate and load data into DB2 tables.
  • Used various stages such as Lookup, Merge, Join, Copy, Change Capture, Funnel, Transformer, Aggregator, Filter, and Sort in DataStage Designer.
  • Used development/debugging stages such as Row Generator, Column Generator, Head, Tail, and Peek.
  • Used stage variables and constraints in the Transformer to filter source data in DataStage Designer.
  • Created shared containers to use the business logic in multiple jobs and for simplifying design and maintenance.
  • Experienced in fine-tuning, troubleshooting, bug fixing, and defect analysis in DataStage jobs.
  • Worked with Production support team to handle the issues and documented the process to handle the data discrepancy and job abort issues.
  • Extensively worked on Error handling, cleansing of data and creating Flat files.
  • Used DataStage Director to clear job logs, job resources, and status files, and was responsible for monitoring all jobs in DataStage Director.
  • Extensively used XML files, parsing the tables and columns from the MQ messages to load data into the DEP (Data Event Publisher).
  • Used MQ Explorer to load data from the DEP into DataStage jobs through the MQ connector in DataStage Designer.
  • Used DTS to load all the DB2 tables at once.
  • Used DataStage Administrator to assign privileges, add projects, and define new environment variables.
  • Used IBM DB2 general administration tools such as Control Center and Squirrel SQL to view tables and run queries.
  • Provided data to the front end Java developers using J2EE and Web Services.
  • Built and maintained version control across the various production versions.
  • Extensively worked on Unix/Linux OS environment.
  • Wrote shell scripts to run DataStage sequencer jobs and to assign parameters at run time.
  • Involved in unit, integration, and system testing of DataStage jobs, and in performance tuning.
  • Mentored the existing ETL developers once integration testing completed successfully and to the customer's satisfaction.

Environment: DataStage 8.01, Oracle 10g, PL/SQL, DB2 UDB, IBM DB2, XML, Squirrel SQL, Red Hat Linux, J2EE, MQ Explorer.

Confidential, Hartford, CT Jan 07 – Dec 07
DataStage Developer
Travelers Insurance insures individuals and small-to-medium businesses and is among the country's largest providers of insurance for homeowners and drivers, with offices located in various regions. Data from operational sources such as Oracle is integrated into the Data Warehouse to provide client data activity for the various departments; I was involved in the analysis, design, testing, and deployment of data from the source systems to the warehouse according to end-user requirements.
Responsibilities

  • Designed and populated a dimensional model (star and snowflake schemas) for a data warehouse and data marts.
  • Used IBM WebSphere DataStage 8.01 to distribute load among different processors by implementing pipelining and partitioning of data in Parallel Extender.
  • Used DataStage Parallel Extender to split bulk data into subsets and pass them to all available nodes for best job performance.
  • Worked with Parallel Extender to speed up data processing, scalability, and performance.
  • Implemented logic for slowly changing dimensions using the Change Capture and Change Apply stages.
  • Configured nodes for Parallel Extender according to the number of processors available, using DataStage Manager.
  • Designed jobs using different parallel job stages such as Copy, Change Capture, Funnel, Transformer, Aggregator, Filter, and Sort.
  • Worked with stages like Data Set, File Set, FTP, Merge, Join, and Lookup to perform lookups against the source and lookup datasets.
  • Involved in designing parallel jobs for create, extract, transform, load, and update operations.
  • Used QualityStage for investigating and matching data from the sources.
  • Used Information Analyzer when integrating data from the various source files, for better performance.
  • Used DataStage Designer to develop various jobs to extract, cleanse, transform, integrate, and load data into the data warehouse.
  • Converted old batch jobs into sequencer jobs to minimize the time window.
  • Generated unique surrogate keys for composite attributes while loading data into the data warehouse (see the SQL sketch after this list).
  • Developed customized Routines and Transformations.
  • Used DataStage Manager to manage the DataStage repository (view and edit), define custom routines and transforms, import and export items between DataStage systems, and exchange metadata with other data warehousing tools.
  • Used DataStage Administrator to assign privileges to users or user groups, control purging of the repository and which DataStage client applications or jobs they run, and move and rename projects.
  • Used Data Sets to extract and write data, to act as intermediate files within a job, and as reference inputs.
  • Used ERwin 3.5/4.0 as the leading data modeling tool for logical (LDM) and physical (PDM) data models.
  • Provided flexible resources to users per the business requirements by exporting to BusinessObjects universes and reports.
  • Developed scripts for downloading and uploading data as before- and after-job routines, using shell scripts and SQL.
  • Used Control-M to schedule the DataStage jobs.
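
Surrogate keys for a composite natural key can be generated by pairing a sequence with a not-exists check so each combination is keyed exactly once. A minimal Oracle sketch of that approach with hypothetical names (the jobs themselves generated keys inside DataStage; this only shows the equivalent SQL):

    -- One-time setup: sequence that supplies the surrogate keys.
    CREATE SEQUENCE policy_coverage_seq START WITH 1 INCREMENT BY 1 CACHE 500;

    -- Key each unseen (policy_no, coverage_cd) combination exactly once.
    INSERT INTO dw_policy_coverage_dim
           (policy_coverage_sk, policy_no, coverage_cd, coverage_desc)
    SELECT policy_coverage_seq.NEXTVAL, s.policy_no, s.coverage_cd, s.coverage_desc
      FROM stg_policy_coverage s
     WHERE NOT EXISTS (SELECT 1
                         FROM dw_policy_coverage_dim d
                        WHERE d.policy_no   = s.policy_no
                          AND d.coverage_cd = s.coverage_cd);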

Environment: DataStage 8.01/7.5.2 (Designer, Director, Manager, and Administrator), IBM WebSphere QualityStage/Information Analyzer, Oracle 10g/9i, SQL, PL/SQL, TOAD 8.6, ERwin 4.2, Control-M, Windows 2000/NT, AIX 3.2

Confidential, CA Oct 05 – Dec 06
DataStage Developer
PacifiCare Health Systems is a leading health care provider on the West Coast. It also supports brokers and providers so they can serve people better. Worked on the DDS (Data Delivery System) application, which provides business users with client information and adds new states in which to provide healthcare. The source data for the application is generated from DB2, arrives as datasets, and is loaded into the warehouse, Oracle 9i. The source data in flat files is transformed to apply the warehousing concepts and then loaded into the warehouse. The data is loaded into the warehouse overnight so that users have reports by morning. Because the data volume is huge and the load window limited, we used DataStage Enterprise Edition for loading data into the warehouse.
Responsibilities

  • Extensively used DataStage Designer to develop various Parallel jobs to extract, cleanse, transform, integrate and load data into Enterprise Data Warehouse.
  • Built custom parallel stages for use in the parallel environment.
  • Knowledge of configuration files for Parallel jobs.
  • Used Parallel Extender Development/Debugging stages like Row generator, Column Generator, Head, Tail and Peek stage.
  • Developed parallel jobs using Aggregator, Join, Transformer, Sort, Merge, Filter, Lookup, and Link Partitioner/Link Collector.
  • Used different partitioning methods in Parallel Extender jobs for bulk loading of data and better performance.
  • Worked with DataStage Manager to import and export metadata, DataStage Components between the projects.
  • Involved in designing source-to-target mappings from the sources to operational staging targets, using a star schema.
  • Involved in Performance Tuning of Parallel Jobs using Performance Statistics, Dump Score.
  • Used Various Standard and Custom Routines in DataStage jobs.
  • Responsible for adopting the Company Standards for Stage & Link Naming Conventions.
  • Extensively worked with DataStage Shared Containers for Re-using the Business functionality.
  • Extensively wrote user-defined SQL to override auto-generated SQL queries in DataStage (see the sketch after this list).
  • Extracted data from the legacy system to flat files according to the SAP table requirements.
  • Involved in unit testing, system testing and integration testing.
  • Created and edited the design specification documents for the jobs.
  • Actively participated in team meetings to gather the business requirements and develop the specifications.
  • Participated in the review of Technical, Business Transformation Requirements Document.
  • Participated in discussions with Team leader, Group Members and Technical Manager regarding any technical and Business Requirement issues.
  • Coordinated with team members at times of change in Business requirements and change in Data Mart Schema.
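
Overriding the auto-generated SQL lets the source database perform the join and filtering, so less data crosses into the job. A hypothetical example of such a user-defined source query (table, column, and parameter names are illustrative, not from the actual project):

    SELECT m.member_id,
           m.state_cd,
           p.plan_cd,
           p.effective_dt
      FROM members m
      JOIN plans   p
        ON p.plan_id = m.plan_id
     WHERE m.state_cd = '#pStateCd#'   -- DataStage job parameter, substituted at run time
       AND p.effective_dt >= TO_DATE('2005-01-01', 'YYYY-MM-DD');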

Environment: DataStage EE 7.5.2/7.0, Oracle 9i, DB2 5.2, UNIX Shell Scripting, AIX 4.3x, Windows NT 4.0.

Confidential, Minneapolis, MN Aug 04 – Sep 05
ETL Developer
Target Technologies provides end-to-end payment solutions to acquirers and their merchants to reliably process millions of credit, debit, and other card transactions. As part of the Data Architecture team, I was involved in gathering business requirements and preparing design documents, and was responsible for extracting data from Teradata and Oracle databases and performing conversions where needed before loading into the data warehouse.
Responsibilities

  • Involved in understanding Business Process and coordinating with Business-users to get specific user requirements to build the Data Marts.
  • Documented user requirements, translated requirements into system solutions, and developed implementation plans and schedules.
  • Modeled the star schema data marts by identifying the facts and dimensions using the ERwin data modeling tool.
  • Used DataStage 7.0 to integrate Sales and Marketing data along with Staging Area into Marketing and Sales Data Marts.
  • Created jobs in DataStage to extract from heterogeneous data sources like SQL Server 2000, PeopleSoft and Text files to Oracle 9i.
  • Used Link Partitioner to partition data to enhance the performance of DataStage jobs, and Link Collector to collect the partitioned data back together.
  • Migrated data from the staging area into the Marketing and Sales data marts to store reliable data for the decision support system, using DataStage as the ETL tool.
  • Used DataStage Designer to develop processes for extracting, cleansing, transforming, integrating, and loading data into the data warehouse database.
  • Used different stages such as FTP, Sequential File, Hashed File, Transformer, Sort, Aggregator, and ODBC to develop different jobs.
  • Used DataStage Director to run and monitor the jobs.
  • Wrote complex queries to facilitate the supply of data to other teams.

Environment: DataStage 7.0 (Designer, Manager, Director, Administrator) Server Edition, ERWin, IBM AIX 5.1, Oracle 9i, PL/SQL, Business Objects 5.0, SQL Server 2000, UNIX Shell Scripting.

Confidential, Bristol, CT Nov 03 – Jul 04
Data Warehouse Developer
This project consolidated the different warehouses into Teradata. ESPN decided to maintain a single Teradata warehouse instead of four or five small warehouses for sales, finance, HR, and so on.
Responsibilities

  • Demonstrated in-depth knowledge of the Data Warehouse life cycle and architecture in a terabyte-scale environment.
  • Full exposure to relational and multidimensional data modeling concepts.
  • Developed UNIX scripts to load data from flat files using Teradata MLoad and FLoad.
  • Developed UNIX scripts to extract the data from Oracle and load into Teradata.
  • Developed JCL to load data to Teradata using FLoad, MLoad.
  • Developed SQL scripts to implement business rules.
  • Developed archive jobs to archive application data.
  • Developed Teradata BTEQ scripts to implement the business logic (see the sketch after this list).
  • Actively participated in designing and developing the EET Data Warehouse.
  • Involved in migrating the data warehouse from DB2 to Teradata.
  • Involved in data quality testing.
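
A BTEQ script typically wraps the business-rule SQL with logon handling and an error check so the calling shell script can react to failures. A minimal sketch with hypothetical database, table, and column names (the actual business rules were project-specific):

    /* Logon line kept in a secured file rather than in the script. */
    .RUN FILE = /home/etl/.tdlogon

    /* Illustrative business rule: net amount = gross minus discount,
       loaded for yesterday's sales. */
    INSERT INTO edw.daily_sales_fact
           (sale_dt, region_cd, product_cd, net_amount)
    SELECT s.sale_dt, s.region_cd, s.product_cd,
           s.gross_amount - s.discount_amount
      FROM stage.sales_extract s
     WHERE s.sale_dt = CURRENT_DATE - 1;

    /* Return a non-zero code on failure so the wrapper script aborts. */
    .IF ERRORCODE <> 0 THEN .QUIT 8
    .LOGOFF
    .QUIT 0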

Environment: Teradata V2R4 (BTEQ, FastLoad, MultiLoad, FastExport), SQL, Oracle 9i/8i, UNIX Shell Scripting, JCL.

Confidential, Englewood, CO Jan 03 – Oct 03
PL/SQL Developer
XRM is a solution designed to automatically generate and process service orders for the Local Service Requests (LSRs) submitted by Competitive Local Exchange Carriers (CLECs). Verizon Wholesale systems support the processing of Local Service Requests for its clientele, the CLEC community. Each LSR is forwarded to workflows for processing, and reports are generated for retail and wholesale customers.
Responsibilities

  • Designed and analyzed requirements by interacting with the business users and identified business objects.
  • Designed the database objects and prepared the technical design document based on the detailed design, in compliance with the business standards.
  • Developed PL/SQL packages, procedures, functions, and database triggers; tested, debugged, modified, and documented systems based on the technical design document. Developed interface systems, performed data conversions, and extracted, transformed, and loaded data from other systems into the Oracle database.
  • Deployed new builds to the development and pre-production environments; loaded test data using Oracle SQL*Loader.
  • Prepared detailed specifications for business application projects, including development of new applications and customizations to be integrated with existing applications.
  • Designed, modified, and analyzed business applications and processes to recommend and implement system enhancements or process improvements.
  • Supported team members on software design and analysis, and learned and applied new technologies relevant to the application.
  • Used collections (PL/SQL tables, varrays, and nested tables) for reading data with BULK COLLECT and adding/updating data with FORALL operations (see the sketch after this list).
  • Resolved issues during production support of the application, making extensive use of database development tools such as TOAD and SQL Navigator.
  • Developed scripts and generated reports from them on schedule.
  • Ensured proper version control of all developed code using VSS for future reference.
  • Monitored new procedures deployed in production at month-end to catch any performance issues.
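
The BULK COLLECT/FORALL pattern above fetches rows into collections in one round trip and applies the DML in another, avoiding row-by-row context switches. A minimal PL/SQL sketch with hypothetical table and column names:

    DECLARE
        TYPE t_id_tab     IS TABLE OF stg_lsr.lsr_id%TYPE;
        TYPE t_status_tab IS TABLE OF stg_lsr.status%TYPE;
        v_ids      t_id_tab;
        v_statuses t_status_tab;
    BEGIN
        -- One round trip to read the pending service requests.
        SELECT lsr_id, status
          BULK COLLECT INTO v_ids, v_statuses
          FROM stg_lsr
         WHERE status = 'NEW';

        -- One round trip to apply all the updates.
        FORALL i IN 1 .. v_ids.COUNT
            UPDATE lsr_orders
               SET status = v_statuses(i)
             WHERE lsr_id = v_ids(i);

        COMMIT;
    END;
    /

For very large volumes the same pattern runs inside a cursor loop with FETCH ... BULK COLLECT ... LIMIT to cap the memory used per batch.
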
Environment: Oracle 9i/8i Database, PL/SQL, UNIX Shell scripting.

Education:

Bachelor's Degree in Computer Science, JNTU University, India
