ETL Architect Resume
NJ
Summary:
- A certified DataStage Developer with 9 years of extensive ETL experience in IBM WebSphere DataStage (v7.x, v8.x), designing and developing jobs using DataStage Designer, DataStage Manager and DataStage Director.
- AHM (American Health Management) 250 certified.
- 5 years of experience as an ETL Architect.
- More than 7 years of experience in the onsite/offshore model.
- Good experience as an ETL Data Architect on IIW and WCC.
- Excellent experience in ETL error-handling frameworks.
- Worked with SQL Server, DB2 UDB, Oracle, Teradata and PL/SQL.
- Good experience in DataStage installation and administration.
- Good experience in onsite/offshore coordination and in handling large ETL teams.
- Worked with Autosys, ESP and Tivoli scheduling tools to schedule DataStage jobs.
- Excellent experience using the highly scalable parallel processing infrastructure of DataStage Enterprise Edition.
- Developed UNIX shell scripts to enhance ETL job performance.
- Proven track record in troubleshooting of DataStage jobs and addressing production issues like performance tuning and enhancement.
- Used multiple stages like SAP R/3 Packs, Sequential File, Pivot, Transformer, Aggregator, Join, Lookup, Sort and Filter during ETL development.
- Involved in cleansing process of the data from different sources like SAP legacy systems, Oracle repository, IDOC formatted flat files.
- Resolved performance issues while extracting and loading data to and from SAP, Oracle, Teradata, SQL Server and DB2 using the Oracle Connector, DB2 Connector, Teradata MultiLoad, FastLoad, ABAP R/3 stage and BAPI R/3 stage.
- Experienced in customer facing role and customer management.
- Experienced in onshore/offshore project execution
- Experienced in techno-functional roles.
- Very good hands-on experience with DataStage (8.5/8.7).
- Excellent knowledge of studying the data dependencies using Metadata of DataStage and preparing job sequences for the existing jobs to facilitate scheduling of multiple jobs.
- Strong understanding of the principles of DW using Fact Tables, Dimension Tables and Star Schema modeling.
- Validated U.S. address data against the USPS database using QualityStage and the CASS plug-in stage.
- Loaded data from DB2 to IDocs using the SAP BAPI plug-in stage.
- Performed column analysis and primary key analysis using the Information Analyzer tool.
- Skilled in Estimation, planning, coordination, and execution of system applications and engineering projects.
- Strong writing and documentation skills for the management, development and control of documentation.
- Very good working knowledge of configuration management tools such as Subversion and ClearCase.
- Worked in a Global Delivery Model, successfully handling GDC onsite and offshore teams (USA, China, Philippines and India) and project teams as Tech Lead and ETL Architect.
- Trained on the Netezza Connector.
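As an illustration of the shell-based error handling mentioned above, a minimal retry wrapper might look like the following sketch. It is illustrative only: `run_with_retry` and `MAX_RETRIES` are invented names, and a real wrapper would invoke a DataStage job (via the dsjob command line) rather than an arbitrary command.

```shell
#!/bin/sh
# Hypothetical sketch of a shell-based ETL error-handling wrapper:
# retry a command up to MAX_RETRIES times, logging each outcome.
# (run_with_retry is an invented name; a real wrapper would run a
# DataStage job via dsjob instead of an arbitrary command.)
MAX_RETRIES=3

run_with_retry() {
    cmd="$1"
    attempt=1
    while [ "$attempt" -le "$MAX_RETRIES" ]; do
        if $cmd; then
            echo "SUCCESS: $cmd (attempt $attempt)"
            return 0
        fi
        echo "RETRY: $cmd failed on attempt $attempt" >&2
        attempt=$((attempt + 1))
    done
    echo "FAILED: $cmd after $MAX_RETRIES attempts" >&2
    return 1
}
```

A scheduler can then call this wrapper so that transient failures (a locked table, a dropped connection) are retried before the run is marked failed.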
Educational Qualifications:
- Master in Computer Applications (MCA)
Certification:
- IBM Websphere IIS DataStage Enterprise Edition v7.5 certification
- AHM (American Health Management) 250 certified
Software/Hardware Skills:
- Data Warehousing (9 yrs): IBM WebSphere DataStage, DataStage Enterprise Edition 7.5.2/8.0/8.2/8.5/8.7 (DS-PX), Erwin, SAP R/3, Information Analyzer, QualityStage, FastTrack, MQ plug-in, flat files, mainframe complex files, Classic Federation Server
- Databases (8 yrs): Oracle 10g, SQL Server 7.0/2000, DB2 UDB, PL/SQL, MS Access, Teradata; Netezza Connector (trained)
- Operating Systems (8 yrs): UNIX, Windows 95/98/2000/2003/XP/NT, IBM AIX, SunOS
- Languages (4 yrs): XML, C, UNIX shell script
- Configuration Management Tools (2 yrs): ClearCase, Subversion
- Scheduling Tools (8 yrs): Tivoli 1.3, AutoSys 4.5, ESP, Control-M
Professional Experience:
Project #1
Project : Common Track
Client : Confidential
Role : ETL Architect / Onsite Tech Lead
Duration : Jun 2011 - present
Environment : IBM DataStage 8.5/8.7 (Administrator, Designer, Director, Manager), Windows XP, SunOS, Oracle, Teradata, QualityStage 8.5, Information Analyzer, FastTrack
Description:
The objective of the project is to replace the existing WCC architecture with an Insurance Data Warehouse model in which all the mandatory data warehousing methods are carried out in a more sophisticated way. Data is loaded into the ETL staging area and moved to EBMR (Employee Benefits Member Repository), and the incremental data is loaded into the target RDS (Reporting Data Store) database.
IBM WebSphere DataStage 8.5 is used to load raw information from various source files, to perform multiple levels of transformation (as per business requirements) and to improve the performance of the data flow into the system.
Responsibilities:
- Designed reusable DataStage jobs to extract data from Oracle and Teradata databases, transform the data and load it into the Oracle data warehouse.
- Worked on detailed design and ETL architecture documents.
- Designed low-level (LLD) and high-level design documentation.
- Led the onsite/offshore team.
- Coordinated with the business team on functional requirements.
- Provided technical and functional knowledge to team members from onsite.
- Defined DataStage best practices for team members.
- Worked with DataStage Director to run the solution, test and debug the components, and monitor the resulting executables.
- Studied data dependencies using DataStage metadata and prepared job sequences for existing jobs to facilitate scheduling of multiple jobs.
- Used CASS to validate US postal addresses.
- Standardized address and name data using the Standardize stage in QualityStage.
Project #2
Project : Blue Harmony
Client : Confidential
Role : ETL Architect / Tech Lead / Project Lead
Duration : Oct 2009 - Jun 2011
Environment : DataStage 8.1, QualityStage, Information Analyzer, FastTrack, SunOS, SAP plug-ins (BAPI, IDoc, ABAP), MQ plug-in, XML, DB2
Description:
Blue Harmony is an enterprise transformation program that will take business unit and region-specific core processes such as Quote to Cash and Finance, and integrate them horizontally across the enterprise on a common SAP platform. Blue Harmony and the new governance model for enterprise transformation programs will accelerate our pursuit of IBM as a Globally Integrated Enterprise by integrating and streamlining processes across the enterprise.
Key Responsibilities:
- Designed DataStage ETL jobs to extract data from the Oracle database, transform it and create reports.
- Prepared the Mapping documents for Developers
- Used multiple stages like SAP R/3 Packs, Sequential File, Pivot, Transformer, Aggregator, Join, Lookup, Sort and Filter during ETL development.
- Resolved the performance issues while extracting & loading data into or from SAP using ABAP R/3 stage, BAPI R/3 stage.
- Extracted data from IDOCS using IDOC Extract Stage, applied different business rules for transformation and loaded in database.
- Extracted and transformed data from DB2 then loaded to MATMAS01 and CREMAS01 IDOCS using IDOC Load Stage.
- Loading the data into SAP ECC system for Vendor Master, Material Master and Customer Master Data using SAP IDOC load stage.
- Studied data dependencies using DataStage metadata and prepared job sequences for existing jobs to facilitate scheduling of multiple jobs.
- Loaded data from DB2 to SAP using IDOC and BAPI Stage.
- Used CASS to validate the US Postal addresses.
- Standardized address and name data using the Standardize stage with country-specific rule sets, such as the U.S. rule sets.
Project #3
Project : RailCorp
Client : Confidential
Role : Tech Lead
Duration : Feb 2009 - Oct 2009
Environment : IBM WebSphere DataStage 8.1 (Administrator, Designer, Director, Manager), Windows XP, SunOS, Oracle, Toad, PuTTY, Secure File Transfer, Information Analyzer, FastTrack
Description:
This project develops a strategic solution to provide analytical reporting capability around train performance and reliability based on the operational data sources. The proposed solution provides a central repository of consolidated and integrated data with analytical capability. The solution runs on Sun Solaris and extensively uses the capabilities of the DataStage parallel engine. As part of this solution, I designed and developed ETL jobs to move data from the drop zone to the storage area.
Key Responsibilities:
- Prepared estimates for the subject area.
- Installed DataStage 8.1 and Oracle 10g on SunOS 5.10.
- Performed DataStage administration.
- Prepared low-level design documents and analyzed requirements.
- Coordinated the onsite and offshore teams.
- Provided technical and functional support to team members.
- Reviewed new and changed components coded by the offshore team.
- Tracked offshore issues and work status.
- Adhered to client- and project-specific quality and documentation standards as part of project execution.
- Designed mapping documents, ETL architecture documents and specifications.
- Analyzed the source data and designed the source system documentation.
- Conducted meetings with the Project Manager, Business Analysts and team members on technical and business requirement issues.
- Developed detailed specifications for data management scripts.
- Documented the developed code for promotion to the production environment.
Project #4
Project : PHS feeds to FSDB (FSDB)
Client : Confidential
Role : Tech Lead
Duration : Feb 2008 - Feb 2009
Environment : DataStage 7.5.2, UNIX, IBM AIX 5.3.1, Oracle 9i, Toad, Autosys
Description:
This project creates feeds from PHS to FSDB for revenue, membership and claims. These feeds are used to load current activity into PS GL until the customer migrates to the UP systems, and also to load 3 years of history to support reserving and the migration to UP. UHC will be able to control and manage the PHS reserve process by utilizing RPS (Reserve Production System) for all PHS claims, not just the fully migrated claims. The financial close management process will follow UHC's standard process, thereby alleviating the complications of maintaining dual reserve models.
The summary level of claims history is housed in FSDB for use by RPS, but it is also stored at a lower detail level in order to support DOI audits. This claim detail data is suggested to be housed in Galaxy for possible use by additional financial departments, underwriting and customer reporting, pending their support.
UHC Finance is also requesting membership and revenue historical feeds. In order to properly track the migration of groups, detailed policy-level information is loaded for analytical purposes. Non-migrated information is also loaded to ensure membership and revenue are reported only once during the migration process. This information is used to perform per-member-per-month analysis and trend analysis, and to explain changes from month to month.
Key Responsibilities:
- Prepared low-level design documents.
- Coordinated with the business team to understand requirements.
- Prepared detailed technical design documents for developers.
- Coordinated the onsite and offshore teams.
- Provided technical and functional support to team members.
- Reviewed new and changed components coded by the offshore team.
- Adhered to client- and project-specific quality and documentation standards as part of project execution.
- Designed mapping documents, ETL architecture documents and specifications.
- Analyzed the source data and designed the source system documentation.
- Conducted meetings with the Project Manager, Business Analysts and team members on technical and business requirement issues.
- Worked with the deployment team to deploy code to different environments (SYS, UAT, PRE-PROD, PROD).
- Extensively used Autosys to schedule DataStage jobs.
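As an illustration of the Autosys scheduling noted above, the wrapper script a scheduler invokes often maps DataStage job statuses to scheduler-friendly exit codes. The sketch below is hypothetical: `map_status` is an invented name, and the exact status values returned by dsjob can vary by site and version.

```shell
#!/bin/sh
# Hypothetical sketch: map a DataStage job status (as reported after
# "dsjob -run -jobstatus PROJECT JOB") to a scheduler-friendly exit code.
# Status meanings assumed here: 1 = finished OK, 2 = finished with
# warnings (treated as success; a stricter site might fail on warnings).
map_status() {
    case "$1" in
        1|2) return 0 ;;  # success or warnings
        *)   return 1 ;;  # aborted, crashed or unknown
    esac
}

# In the real Autosys command, something like:
#   dsjob -run -jobstatus "$PROJECT" "$JOB"
#   map_status $?
```

An Autosys job definition would then point its command at such a wrapper so the scheduler sees a clean pass/fail signal instead of raw job status codes.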