
ETL Developer Resume


PROFESSIONAL SUMMARY

  • Over 7 years of IT experience in the design, development, and implementation of data integration, client/server, and Oracle applications
  • 6 years of Extraction, Transformation and Loading (ETL) experience using IBM WebSphere DataStage 8.0.1/8.0 and Ascential DataStage 7.5/7.0/6.x/5.x (Administrator, Director, Manager, Designer), Parallel Extender, and QualityStage (Integrity) in the development of Data Warehouse/Data Mart applications
  • Expertise in dimensional data modeling, star schema and snowflake modeling, identification of fact and dimension tables, normalization, and physical and logical data modeling using Erwin and Oracle Warehouse Builder to implement Business Intelligence systems
  • Experience in Full Software Development Life Cycle (SDLC) in Collecting Requirements, Design, Coding, Unit Test Plan Preparations, System Testing and Project documentation
  • Experience with both Parallel Extender jobs and server jobs in DataStage
  • Designed and developed jobs using Parallel Extender to split bulk data into subsets and dynamically distribute it to all available nodes to achieve the best job performance
  • Strong knowledge of scheduling DataStage jobs using crontab (a representative wrapper script is sketched after this list), as well as preparing test plans and testing scheduling changes to batch programs (IBM Tivoli Workload Scheduler, Autosys, and Jobtrac)
  • Excellent experience troubleshooting DataStage jobs and addressing production issues such as performance tuning and enhancements
  • Excellent knowledge of analyzing data dependencies using metadata stored in the DataStage repository
  • Experience in the integration of various data sources such as Oracle, Teradata, DB2 UDB, SQL Server, Sybase, and MS Access
  • Experience in creating reports using Mainframe SAS and Unix SAS
  • Hands-on experience with SQL*Loader for data migration from legacy systems
  • Experience in data backup, creating archives, and retrieving or restoring data using IBM Tivoli Storage Manager
  • Experienced in UNIX shell scripting (Korn, Bash, and C shell) for file manipulation
  • Good understanding of Cognos, MicroStrategy, and Business Objects OLAP tools
  • Extensive experience in client/server and internet application development using Oracle, MS SQL Server, PL/SQL, stored procedures, functions, triggers, ODBC, and Visual Basic
  • Experience with UNIX, Linux, and z/OS operating systems
  • Profound knowledge of ASP, C#, VB6, XML, Java, C++, and .NET
  • Experienced in translating user requirements to technical specifications and writing system specifications
  • Strong understanding of business processes and their interface with IT; highly organized and detail-oriented professional with strong technical skills
  • Excellent communication and interpersonal skills, with a strong commitment to team goals
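
A minimal sketch of the kind of crontab-driven wrapper script referenced above, assuming a typical ksh/dsjob setup; the engine path, project, job, and log names are illustrative placeholders, not values from any actual environment:

```sh
#!/usr/bin/ksh
# Example crontab entry (weekdays at 02:30) that would invoke this wrapper:
# 30 2 * * 1-5 /apps/etl/bin/run_dw_load.ksh >> /apps/etl/logs/run_dw_load.log 2>&1

DSHOME=/opt/Ascential/DataStage/DSEngine   # placeholder engine path
PROJECT=DW_PROJ                            # placeholder project name
JOB=seq_daily_load                         # placeholder sequencer job name

cd "$DSHOME" && . ./dsenv                  # source the DataStage engine environment

# -jobstatus waits for the job and returns its status as the exit code
# (1 = finished OK, 2 = finished with warnings; anything else is treated as failure)
"$DSHOME/bin/dsjob" -run -jobstatus "$PROJECT" "$JOB"
rc=$?

if [ "$rc" -ne 1 ] && [ "$rc" -ne 2 ]; then
    echo "$(date) $JOB ended with status $rc" >> /apps/etl/logs/run_dw_load.err
    exit 1
fi
exit 0
```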

TECHNICAL SKILLS

ETL Tools : IBM WebSphere DataStage 8.0.1/8.0, Ascential DataStage 7.5/7.1/6.0/5.2/5.1 (Parallel Extender, QualityStage)
OLAP Tools : Business Objects 6.1/5.1/5.0/4.1 (Supervisor, Designer, BO Reporter, WebI, BCA, InfoView), Cognos, MicroStrategy 7.2/7.1
Databases : Oracle 11g/10g/9i/8i/8.x, MS SQL Server 7.0/6.5, DB2 UDB, MS Access 7.0, Teradata, ODBC, PeopleSoft CRM 8.4/8.9, Sybase
Database Tools : SQL*Plus 9.2, SQL*Loader 9.2, SQL Navigator 4.1, TOAD 7.5, DB2 Import, DB2 Export
Other Tools : IBM Tivoli (TSM/TWS/TEC), Autosys 4.5, Rational ClearCase and ClearQuest, MS Visual Studio
Data Modeling Tools : Erwin 4.1/3.5, ER/Studio 3.5, Microsoft Visio 2000
Operating Systems : Windows 7/Vista/XP/2000/NT/98, UNIX, Solaris 5.8, IBM AIX 5.x, OS/390, z/OS
Prog. Languages : COBOL, SAS, SQL, PL/SQL, UNIX shell scripting, C, C++, Java, VB 5.0
Web Development : J2EE, Applets, Servlets, EJB, XML, HTML 4.0, ASP
EDUCATIONAL QUALIFICATION

  • Bachelor's degree in Computer Science Engineering

CERTIFICATIONS

  • Oracle PL/SQL Developer Certified Associate (OCA)

PROFESSIONAL EXPERIENCE

Confidential, McLean, VA May 2009 - Jan 2010
ETL Developer
Confidential is a shareholder-owned company whose people are dedicated to lowering costs and increasing access to quality housing for more of America's families. The Corporate Data Warehouse (CDW) is a centralized data repository that consolidates historical and current business-relevant data, as of the prior business day, from disparate data sources residing within the company's transaction-processing and end-user systems into one relational database for decision support, management reporting, ad-hoc and trend analysis, modeling, and data mining. The data warehouse comprises detailed, summary, and aggregate point-in-time data and is founded upon the historical archive of detailed data needed to support today's business requirements as well as the unanticipated business questions of the future. The objective of this project was to integrate mainframe IMS data from sources such as PE (Project Enterprise) and Midas (Mortgage Information Direct Access System) into the Corporate Data Warehouse.

Responsibilities:

  • Designed and created Parallel Extender and sequencer jobs that distribute incoming data concurrently across all available nodes to achieve the best performance
  • Developed COBOL/SAS/JCL programs to retrieve IMS Data and load the Data to CDW Tables
  • Used Mainframe and Unix SAS to generate various DQ reports
  • Used SAS to compare Source Data and Target Database
  • Used the DataStage Designer to develop various jobs for extracting, cleansing, transforming, integrating, debugging, and loading data into data warehouse
  • Used the DataStage Director and Manager for monitoring, validating and debugging DataStage components
  • Integrated data from several different legacy sources and loaded it to the base tables
  • Created Autosys JIL (BOX, CMD, FWT) scripts to schedule Unix shell scripts (see the JIL sketch after this list)
  • Performed Unit Testing, Integration Testing and User Acceptance Testing (UAT)
  • Created process flow diagrams using Microsoft Visio
  • Performed Tuning of the ETL programs to increase the processing and loading efficiency
  • Extracted data from mainframe flat files, Unix flat files, Sybase, and DB2 legacy systems and loaded it into the Corporate Data Warehouse
  • Used Mainframe Change Management (MCM) to manage projects, production hotlines on Mainframe
  • Used ClearCase/ClearQuest for versioning and migration of code to production
  • Monitored, debugged, and resolved production issues and supported troubleshooting
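
A minimal sketch of the Autosys JIL pattern described above (box, file-watcher, and command jobs), loaded through the jil command from a shell session; job, machine, owner, and path names are placeholders rather than the actual definitions:

```sh
#!/usr/bin/ksh
# Illustrative JIL: a box job grouping a file watcher and the command job that
# runs the ETL shell script. All names and paths are placeholders.
jil <<'EOF'
insert_job: cdw_ims_daily_box   job_type: b
owner: etlprod
date_conditions: 1
days_of_week: mo,tu,we,th,fr
start_times: "02:00"

insert_job: cdw_ims_extract_fw   job_type: f
box_name: cdw_ims_daily_box
machine: etlprod01
watch_file: /data/inbound/ims_extract.done
watch_interval: 60

insert_job: cdw_ims_load_cmd   job_type: c
box_name: cdw_ims_daily_box
machine: etlprod01
command: /apps/cdw/bin/run_ims_load.ksh
condition: s(cdw_ims_extract_fw)
std_out_file: /apps/cdw/logs/cdw_ims_load.out
std_err_file: /apps/cdw/logs/cdw_ims_load.err
EOF
```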

Environment:
Ascential DataStage 7.5.1, DB2 UDB 8.0, SAS, Sybase, Autosys, COBOL, JCL, MVS, SQL, PL/SQL, Shell Scripts, UNIX, Windows XP, OS/390, AIX 5.2, Jobtrac, ClearCase and ClearQuest, Rapid SQL.

Confidential, Berlin, VT Jan 2009 - Apr 2009
ETL Developer

Confidential is a national federation of 39 independent, community-based and locally operated Blue Cross and Blue Shield companies. As the nation's oldest and largest family of health benefits companies, the Blue Cross and Blue Shield Association prides itself on being the most recognized brand in the health insurance industry, along with many other celebrated milestones. As a developer, I was involved in loading the data marts for providers, members, and claims: extracting data from legacy source systems, cleansing and transforming the data into the staging area, and then loading it into the data mart for analyzing business performance at various stages.

Responsibilities:

  • Involved in gathering business requirements for reports and produced standard requirements-gathering documents
  • Designed the mappings between sources (external files and databases) to staging and target warehouse database
  • Used the DataStage client tools to develop processes for extracting, cleansing, transforming, integrating, and loading data into the data warehouse database (DB2 UDB)
  • Implemented the Slowly Changing Dimension (SCD Type-II) strategy in DataStage for the OLAP dimension tables storing history data (see the sketch after this list)
  • Designed and developed several extract programs to send files to third party vendors
  • Designed and developed Code table extract process
  • Created DataStage jobs using DataStage Designer; extracted data from various sources, transformed it according to requirements, and loaded it into the data warehouse schema
  • Involved in tuning DataStage jobs for better performance
  • Used the DataStage Director and the runtime engine to schedule running the solution, test and debug its components, and monitor the resulting executable versions
  • Created master controlling sequencer jobs using the DataStage Job Sequence
  • Used ClearCase/ClearQuest for versioning and migration of code to production
  • Used IBM Tivoli Workload Scheduler for scheduling and tracking the DataStage Jobs.
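
A rough SQL equivalent of the SCD Type-II handling mentioned above, shown as db2 CLP calls from a shell script; the database, table, and column names are hypothetical, and the actual change detection was done inside the DataStage jobs:

```sh
#!/usr/bin/ksh
# Sketch of Type-II dimension maintenance against DB2 UDB; all names are placeholders.
db2 connect to DMARTDB >/dev/null

# 1. Close out the current row for any member whose attributes changed in today's delta
db2 "UPDATE dw.member_dim
        SET current_flag = 'N',
            effective_end_dt = CURRENT DATE - 1 DAY
      WHERE current_flag = 'Y'
        AND member_id IN (SELECT member_id FROM stg.member_delta)"

# 2. Insert a new current version for every changed or newly arrived member
db2 "INSERT INTO dw.member_dim
            (member_id, member_name, plan_cd, effective_start_dt, effective_end_dt, current_flag)
     SELECT member_id, member_name, plan_cd, CURRENT DATE, DATE('9999-12-31'), 'Y'
       FROM stg.member_delta"

db2 connect reset >/dev/null
```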

Environment:
IBM Information Server 8.0, SAS, DB2 UDB 9.0, Linux, DB2 Control Center, SQL, PL/SQL, Shell Scripts, UNIX, Windows XP, OS/390, IBM Tivoli Workload Scheduler, ClearCase and ClearQuest, Rapid SQL.

Confidential, McLean, VA Feb 2008 - Dec 2008
ETL Developer

Confidential is a shareholder-owned company whose people are dedicated to lowering costs and increasing access to quality housing for more of America's families. The objective of this project was to integrate data from various DB2 database sources in LP (Loan Prospector) and SAP (Security Accounting Project) into FAS 140 in the Corporate Data Warehouse.

Responsibilities:

  • Designed and created Parallel Extender and sequencer jobs that distribute incoming data concurrently across all available nodes to achieve the best performance
  • Used Mainframe SAS to extract data from DB2 and Sybase Tables to create Load Ready Files
  • Used SAS to compare Source Data and Target Database
  • Used the DataStage Designer to develop various jobs for extracting, cleansing, transforming, integrating, debugging, and loading data into data warehouse
  • Used the DataStage Director and Manager for monitoring, validating and debugging DataStage components
  • Integrated data from several different legacy sources and loaded it to the base tables
  • Created Autosys JIL (BOX, CMD, FWT) Scripts to schedule Unix Shell scripts
  • Performed Unit Testing, Integration Testing and User Acceptance Testing (UAT)
  • Performed Tuning of the ETL programs to increase the processing and loading efficiency
  • Extracted data from Unix flat files, Sybase, and DB2 legacy systems and loaded it into the Corporate Data Warehouse
  • Imported/exported data between development and production using the DB2 export/import utility (see the sketch after this list)
  • Designed and developed Code table extract process
  • Used ClearCase/ClearQuest for versioning and migration of code to production
  • Monitored, debugged, and resolved production issues and supported troubleshooting
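
A minimal sketch of the DB2 export/import step referenced above, using the standard db2 EXPORT and IMPORT utilities; database aliases, schema, table, and file names are placeholders:

```sh
#!/usr/bin/ksh
# Copy a reference table from the development database to production; names are placeholders.

# On development: export the table to an IXF file
db2 connect to CDWDEV
db2 "EXPORT TO /tmp/code_table.ixf OF IXF MESSAGES /tmp/code_table_exp.msg
     SELECT * FROM cdw.code_table"
db2 connect reset

# On production: replace the table contents with the exported data
db2 connect to CDWPRD
db2 "IMPORT FROM /tmp/code_table.ixf OF IXF MESSAGES /tmp/code_table_imp.msg
     REPLACE INTO cdw.code_table"
db2 connect reset
```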

Environment:
Ascential DataStage 7.5.1, DB2, DB2 UDB 8.0, SAS, Sybase, Autosys, SQL, PL/SQL, Shell Scripts, UNIX, Windows XP, OS/390, AIX 5.2, Jobtrac, ClearCase and ClearQuest, Rapid SQL.

Confidential, Silver Spring, MD Jun 2007 - Jan 2008
ETL Developer

Confidential provides billing information in a standard format for the various Verizon billing systems across all Verizon jurisdictions. The main purpose of this project was to integrate the data from MCI systems into the existing Verizon system.

Responsibilities:

  • Gathered Business Requirements by working closely with Business users
  • Prepared design and mapping documents between the source system (mainframe) and the warehouse tables (Teradata), and designed the jobs based on an understanding of the data model and table relationships
  • Extensively worked on data extraction, transformation, loading, and analysis
  • Developed Job sequencers for executing DataStage jobs
  • Developed jobs in Parallel Extender using different stages like Join, Transformer, External filter, Row generator, Column generator, Merge, Lookup, Remove Duplicates, Copy, Filter, Funnel, Dataset, Lookup File Set, Change Data Capture, Modify, and Aggregator
  • Used Parallel Extender to distribute load among different nodes by implementing pipeline and partition parallelism
  • Used different partitioning and collecting methods to implement parallel processing
  • Created parameters and environment variables to run the jobs (see the sketch after this list)
  • Imported/exported DataStage projects and took project backups
  • Used IBM Tivoli Storage Manager for backing up data, creating archives, and restoring or retrieving data
  • Integrated data from various sources into the staging area in data warehouse for integrating and cleansing data
  • Extensively used all components of DataStage (Manager, Designer and Director) for various development and support activities
  • Used Clear Case/Clear Quest for the versioning and migration of code to production
  • Loaded and validated data from mainframe flat files into Teradata tables
  • Involved in coding the scheduling module using Java, servlets, and JSP
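
A minimal sketch of running a parameterized job with dsjob, as referenced above; the project, job, parameter names, and paths are placeholders standing in for the design-time job parameters and environment variables:

```sh
#!/usr/bin/ksh
# Run a parameterized DataStage job from the shell; all names below are placeholders.
DSHOME=/opt/Ascential/DataStage/DSEngine
cd "$DSHOME" && . ./dsenv

RUN_DATE=$(date +%Y-%m-%d)

"$DSHOME/bin/dsjob" -run -jobstatus \
    -param SRC_DIR=/data/inbound/billing \
    -param TGT_DB=TDPROD \
    -param PROCESS_DATE="$RUN_DATE" \
    -param '$APT_CONFIG_FILE=/apps/etl/config/four_node.apt' \
    BILLING_PROJ seq_billing_load
echo "seq_billing_load ended with status $?"
```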

Environment:
Ascential DataStage 7.5, Shell Scripts, Teradata, UNIX, Windows XP, Teradata SQL Assistant, Java, Sun Solaris 5.1, SQL, PL/SQL, OS/390, AIX 5.2, Jobtrac, ClearCase and ClearQuest, Rapid SQL.

Confidential, McLean, VA Feb 2006 - May 2007
ETL Developer

Confidential is a shareholder-owned company whose people are dedicated to lowering costs and increasing access to quality housing for more of America's families. The objective of this project was to integrate mainframe IMS data from sources such as PE (Project Enterprise) and Midas (Mortgage Information Direct Access System) into the Corporate Data Warehouse.

Responsibilities:

  • Designed and created Parallel Extender and sequencer jobs that distribute incoming data concurrently across all available nodes to achieve the best performance
  • Developed COBOL/SAS/JCL programs to retrieve IMS Data and load the Data to CDW Tables
  • Used Mainframe and Unix SAS to generate various DQ reports
  • Used SAS to compare Source Data and Target Database
  • Used the DataStage Designer to develop various jobs for extracting, cleansing, transforming, integrating, debugging, and loading data into data warehouse
  • Used the DataStage Director and Manager for monitoring, validating and debugging DataStage components
  • Integrated data from several different legacy sources and loaded it to the base tables
  • Created Autosys JIL (BOX, CMD, FWT) Scripts to schedule Unix Shell scripts
  • Performed Unit Testing, Integration Testing and User Acceptance Testing (UAT)
  • Performed Tuning of the ETL programs to increase the processing and loading efficiency
  • Extracted data from mainframe flat files, Unix flat files, Sybase, and DB2 legacy systems and loaded it into the Corporate Data Warehouse
  • Used Mainframe Change Management (MCM) to manage projects, production hotlines on Mainframe
  • Designed and developed Code table extract process
  • Used ClearCase/ClearQuest for versioning and migration of code to production
  • Monitored, debugged, and resolved production issues and supported troubleshooting

Environment:
Ascential DataStage 7.5.1, DB2 UDB 8.0, SAS, Sybase, Autosys, COBOL, JCL, MVS, SQL, PL/SQL, Shell Scripts, UNIX, Windows XP, OS/390, AIX 5.2, Jobtrac, ClearCase and ClearQuest, Rapid SQL.

Confidential, NY Sep 2005 - Jan 2006
ETL Developer

Confidential is a leading insurance corporation with various investment programs. The project was to maintain the customer information database supporting the centralized SAP system. The main aim of the project was to develop web services in DataStage to maintain customer information across different systems.

Responsibilities:

  • Re-architected the entire design to address performance issues
  • Developed DataStage code designed to be exposed as web services
  • Developed DataStage design strategies to integrate jobs both as batch processes and as RTI (Real Time Integration) services
  • Extensively worked on the Data Model to decrease the complexity of queries
  • Adopted a shared container approach to make the code more visible
  • Involved in developing a system plan to automate the batch runs according to profit center requirements
  • Extensively used QualityStage for data cleansing and to standardize address information
  • Authored documents for maintaining DataStage best practices
  • Responsible for developing test data and stress test analysis
  • Designed complex logic in DataStage without hashed files or flat files to improve performance
  • Wrote Unix Shell Scripts to automate the process

Environment:
Ascential DataStage 7.5 (Administrator, Manager, Designer, Director), COBOL, JCL, SAS, Windows NT, Oracle 10g, SQL, PL/SQL, SQL Server 7.0, DB2, QualityStage, Web Services Pack, Real Time Integration Services

Confidential, Wilmington, DE May 2004 - Aug 2005
DataStage Analyst

The project involved extracting data from different sources and loading it into data marts. The major work involved cleansing the data, transforming it into the staging area, and then loading it into the data marts. The data marts form an integrated data mine that provides the feed for extensive reporting and enables insight into the current and future financial situation and needs, based on the information received from the data warehouse.

Responsibilities:

  • Developed the architecture for building a data warehouse using the data modeling tool Erwin.
  • Designed the target schema definition and ETL using DataStage.
  • Mapped data items from source systems to the target system.
  • Used the DataStage Administrator to assign privileges to users or user groups (to control which DataStage client applications or jobs they see or run), move, rename, or delete projects, and manage or publish jobs from development to production status.
  • Designed and developed PL/SQL procedures, functions, and packages to create summary tables.
  • Used the DataStage Director and its run-time engine to schedule running the solution, testing and debugging its components, and monitoring the resulting executable versions (on an ad hoc or scheduled basis).
  • Worked with DataStage Manager for importing metadata from repository, new job categories and creating new data elements.
  • Wrote shell scripts for data acquisition as part of the ETL process (see the sketch after this list).
  • Analyzed the star schema in dimensional modeling.
  • Involved in jobs and in analyzing the scope of the application, defining relationships within and between groups of data, the star schema, etc.
  • Identified suitable dimensions and facts for the schema.
  • Created and loaded data warehouse tables such as dimension, fact, and aggregate tables using Ascential DataStage.
  • Attended several business sessions with the business users and created daily, monthly and quarter Analysis reports using Business Objects and altered some of the dimensional and Fact tables to satisfy their reporting needs.
  • Extensively used MetaBrokers for importing metadata from Erwin and exporting warehouse data to Business Objects for reporting purposes.
  • Used Job Control routines and Transform functions in the process of designing the job.
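
A minimal sketch of the kind of data acquisition shell script referenced above, assuming a pull-and-verify pattern; host, directory, and file names are placeholders:

```sh
#!/usr/bin/ksh
# Pull a source extract, verify it against its control count, and stage it for DataStage.
SRC_HOST=legacy01                     # placeholder source host
LANDING=/data/landing
STAGING=/data/staging
FILE=cust_extract_$(date +%Y%m%d).dat

# Fetch the extract and its control file from the source system
scp "etl@${SRC_HOST}:/outbound/${FILE}"     "${LANDING}/" || exit 1
scp "etl@${SRC_HOST}:/outbound/${FILE}.ctl" "${LANDING}/" || exit 1

# Compare the actual record count with the count recorded in the control file
expected=$(cat "${LANDING}/${FILE}.ctl")
actual=$(wc -l < "${LANDING}/${FILE}")
if [ "$actual" -ne "$expected" ]; then
    echo "Record count mismatch for ${FILE}: expected ${expected}, got ${actual}" >&2
    exit 1
fi

# Hand the verified file over to the DataStage load jobs
mv "${LANDING}/${FILE}" "${STAGING}/${FILE}"
```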

Environment:
Ascential DataStage 7.5, Business Objects 5.x, Erwin, Oracle 8i, Sybase, SQL, PL/SQL, DB2 UDB, UNIX, Windows NT 4.0, Actuate Reports 6.0.

Confidential, Detroit, MI Sep 2003 - Apr 2004
ETL Developer

Confidential, with headquarters in Detroit, Michigan, USA, is among the three automobile giants of the USA. This project aims at providing maintenance, production support, and enhancements to Chrysler Corporation. My applications were related to MOTOR PARTS (MOPAR) of Chrysler Corporation.

Responsibilities:

  • Prepared mapping documentation based on requirement specification.
  • Redesigned the existing jobs with a different logical approach to improve the performance.
  • Involved in the creation of jobs using DataStage Designer to validate, schedule, run and monitor the DataStage jobs.
  • Involved in designing the process for loading the data from all the source systems to Operational Data Store.
  • Designed jobs that extract data from multiple source systems and flat files, transform the data, and create .dat files.
  • Created sequential files for the lookup code tables and loaded them directly into the DB2 database.
  • Involved in developing shell scripts for loading .dat files into DB2 (see the sketch after this list).
  • Designed jobs with stages such as Link Collector and Link Partitioner for the initial data movement run to handle very large volumes of data.
  • Used the DataStage Designer to develop processes for extracting, cleansing, transforming, integrating, and loading data into data warehouse database.
  • Wrote Unix Shell Scripts to automate the process.
  • Prepared documentation addressing the referential integrity relationships between tables at the ETL level and the monitoring of jobs.
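
A minimal sketch of loading a delimited .dat file into DB2 from a shell script, as referenced above; database, schema, table, and file names are placeholders:

```sh
#!/usr/bin/ksh
# Load a pipe-delimited .dat file produced by the DataStage jobs into DB2.
DAT_FILE=/data/staging/parts_master.dat      # placeholder file name

db2 connect to MOPARDB
db2 "LOAD FROM ${DAT_FILE} OF DEL MODIFIED BY coldel| MESSAGES /tmp/parts_master.msg
     REPLACE INTO mopar.parts_master NONRECOVERABLE"
# Clear the check-pending state that LOAD can leave on tables with constraints
db2 "SET INTEGRITY FOR mopar.parts_master IMMEDIATE CHECKED"
db2 connect reset
```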

Environment:
Ascential DataStage 6.0 (Parallel Extender), SQL Loader 9.2, TOAD-7.5, SQL* Plus 9.2, Windows NT, Sun Solaris 5.8, Erwin 3.5, DB2 and QMF

Confidential, INDIA Oct 2002 - Aug 2003
System Analyst
Domain Party Associates - Development of Reusability Components

This project involved developing new components that can be reused by other domain areas. This allows the components to be promoted to a broader scale and reduces redundant functionality, which in turn reduces the amount of code to service and maintain, lowering expenses and improving efficiency.

Responsibilities:

  • Designed new components (3 modules), including the Technical Spec document and the Physical and Logical Interface documents
  • Coding, Preparing unit Test Case Design Document and Unit Testing
  • Issuing the Components to Production
  • Supporting during System and Integration Testing
  • Production support
  • Worked on programs for scheduling data loading and transformations from legacy systems to Oracle 8i using DataStage, SQL*Loader, and PL/SQL (see the sketch after this list)
  • Involved in the creation of jobs using DataStage Designer to validate, schedule, run, and monitor the DataStage jobs
  • Involved in designing the procedures for moving data from all source systems to the data warehousing system
  • Implemented surrogate keys using the Key Management functionality for newly inserted rows in the data warehouse
  • Exported the universe to the repository to make resources available to users
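
A minimal sketch of the SQL*Loader step referenced above, with the control file written inline from the shell; table, column, and file names are placeholders rather than the actual legacy layout:

```sh
#!/usr/bin/ksh
# Load a legacy extract into an Oracle staging table with SQL*Loader; names are placeholders.
cat > /tmp/party_load.ctl <<'EOF'
LOAD DATA
INFILE '/data/staging/party_extract.dat'
APPEND
INTO TABLE stg_party
FIELDS TERMINATED BY '|' TRAILING NULLCOLS
(
  party_id,
  party_name,
  party_type,
  load_dt  DATE "YYYY-MM-DD"
)
EOF

sqlldr userid=${ORA_USER}/${ORA_PWD}@${ORA_SID} \
       control=/tmp/party_load.ctl \
       log=/tmp/party_load.log \
       bad=/tmp/party_load.bad \
       errors=100
```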

Environment:
Ascential DataStage 6.0/6.1, SQL, Business Objects 5.1.3, Microsoft Visio, Shell Scripts, UNIX, Windows 2000, IBM 3090, COBOL, PL/1, MVS JCL, DB2 on MVS/ESA, TSO/ISPF, TSO DBX, SAS, TSO DB2MENU, and Xpediter
