
ETL Tech Lead Resume


CT

SUMMARY:

  • Around 9 years of IT experience in ETL architecture, development, enhancement, maintenance, production support, data modeling, and reporting, including business and system requirement gathering, for enterprise data warehouses in the Insurance, Sales, Entertainment and Finance domains.
  • Hands on experience using Informatica Power Center 8.x/7.x/6.x, DataStage 7.5.2(Parallel Extender), SSIS and Business Objects.
  • Expertise in designing and developing Informatica mappings from varied transformation logic using filter, expression, rank, router, aggregator, joiner, lookup, and update strategy.
  • Expertise in developing mappings, defining workflows and tasks, monitoring sessions, exporting and importing mappings and workflows, and performing backup and recovery.
  • Experience in Extraction, Transformation and Loading of data from heterogeneous source systems such as flat files, Excel, XML, VSAM, mainframe files, SAP R/3, UDB DB2, Oracle, SQL Server, and Teradata.
  • Experience in Data Modeling using dimensional data modeling, Star/Snowflake schema modeling, fact and dimension tables, and physical and logical data modeling using Erwin.
  • Extensively implemented error-handling concepts; strong testing and debugging skills and performance tuning of targets, sources, and transformation logic.
  • Expertise in Root Cause Analysis (RCA), resolving performance bottlenecks, and optimizing mappings, sessions, and workflows.
  • Experience in working with UNIX shell scripts, Linux scripts, batch scripts, and the necessary test plans to ensure successful execution of the data loading process.
  • Strong skills in analyzing database designs and coding in SQL and PL/SQL against DB2 UDB, Oracle 10g/9i/8i/8.0, SQL Server, Teradata V2R5, SQL*Plus, and Sybase database applications. Coded stored procedures, packages, and triggers using PL/SQL.
  • Ability to coordinate the flow of development initiatives, with experience in requirement analysis, design, development, and all phases of testing, including unit, integration, regression, performance, and acceptance testing.
  • Experience in PVCS, Visual SourceSafe (VSS), ClearCase, and TFS for version control and promoting code to higher environments.
  • Automation, scheduling, and monitoring of jobs with the AutoSys scheduler.
  • Management skills include team management, project planning, and monitoring; a self-starter with good communication skills and the ability to work independently and as part of a team.
  • Ability to adapt to people, circumstances, and responsibilities, with a drive for results through self-motivation and candor.

SOFTWARE/HARDWARE:

Data Warehousing ETL Tools

Informatica PowerCenter 8.x/7.x/6.x, Workflow Manager, Server Manager/Informatica Server, Repository Manager, Designer tool set, DataStage 7.5.2 Parallel Extender, SSIS

Reporting Tools (OLAP)

Business Objects 6.1

Data Modeling & Design Tool

Erwin, Microsoft Visio

Version Control Tools

PVCS, ClearCase, Visual SourceSafe, TFS

Scheduling Tools

AutoSys

RDBMS

Oracle 10g/9i/8.x, MS SQL Server 2008/2000, MS Access 2000, Teradata V2R5/V2R4, DB2 UDB V9/8.2/7.2, Sybase

Languages

C, C++, COBOL, UNIX shell scripts, Linux scripts, Java, Batch Script, XML, SQL, PL/SQL

Operating Systems

MS-DOS, Windows Vista/ XP / NT / 2000, UNIX (Sun Solaris, HP-UX, IBM-AIX), Red Hat Linux

PROFESSIONAL EXPERIENCE:

Project Title: Insurance Data Warehouse (IDW)
Confidential, CT
Role: ETL Tech Lead
Period: May 2010 to date

XL Group plc, through its subsidiaries, is a global insurance and reinsurance company providing property, casualty and specialty products to industrial, commercial and professional firms, insurance companies and other enterprises throughout the world. XL Group companies have approximately 60 offices in more than 20 countries with approximately 4,000 employees.

The Insurance Data Warehouse (IDW) at XL was built to integrate transactional Policy and Claims fact data from multiple source systems such as Mayfare, WINS, Genius, Subscribe, XL Programs, and Wbrown for decision making. Third-party tools such as the Procede application are also used to load reinsurance data into IDW, and the warehouse feeds various reporting tools such as Cognos, MSRS, and OBIEE. Two data marts were built on top of the data warehouse: FDM (Financial Datamart) and RDR (Reinsurance Data Repository). Data in the data marts is refreshed on a daily, weekly, or monthly basis depending on the application. The data warehouse also serves as a centralized information repository that helps the business arrive at better decisions and improve operational efficiency.

Responsibilities:

  • As the ETL Tech Lead, led the ETL development for enhancements to the Insurance Data Warehouse.
  • Working closely with the business users to understand the requirements and converting them into project level technical capabilities.
  • Worked with business analysts to identify the appropriate data elements for required capabilities.
  • Update the status and plan the releases through the scrum meetings.
  • Coordinating with offshore team and providing the inputs.
  • Worked with source teams to identify upstream source system changes.
  • The project involved developing mappings for moving data from AS/400 and Flat files to Staging Area (STG) and then to Data Warehouse (DWH) and then to Data Mart.
  • Developing the ETL detail design documents for each target tables (Fact and dimension tables).
  • Creating primary objects (tables, views, indexes) required for the application
  • Designing ETL jobs. Used Informatica as ETL tool.
  • Designed and developed complex mappings with varied transformation logic such as Expression, Filter, Aggregator, Router, Joiner, Update Strategy, and unconnected and connected Lookups.
  • Used Informatica Debugger to troubleshoot logical errors and runtime errors.
  • Designed and developed common modules for error checking (e.g., to check whether the reject records output file is empty and whether there are duplicate natural keys in a given table); a minimal shell sketch of such a check follows this list.
  • Performed tuning at the source, target, and Informatica mapping levels using indexes, hints, and partitioning in DB2, SQL Server, and Informatica.
  • Prepared Test Plans for Unit test and System Integrated testing of Informatica Session and Workflows.
  • Coordinated with the QA team in various testing phases by resolving defects and ensuring smooth execution of the test plans.
  • Creating the deployment documents and migrating the code to the production environment.
  • Investigated and fixed bugs that occurred in the production environment and provided on-call support.
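The reject-record check called out above can be illustrated with a minimal shell sketch; the file path and script behaviour are illustrative assumptions, not the actual project modules.

```sh
#!/bin/ksh
# Hypothetical sketch of a post-load error check: fail the load step if the
# Informatica session produced any reject records. Paths are examples only.
REJECT_FILE=/informatica/badfiles/stg_policy_fact.bad   # assumed reject file location

if [ -s "$REJECT_FILE" ]; then
    # -s is true when the file exists and is non-empty, i.e. rejects were written
    echo "ERROR: reject records found in $REJECT_FILE" >&2
    exit 1    # non-zero exit fails the calling workflow command task
fi

echo "No reject records found; load considered clean."
exit 0
```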

Environment: Informatica 8.6.1, SQL Server 2008, Windows NT/2000, Cognos 8.4, Erwin, SSIS, AutoSys

Project Title: P&C IM PLDW Maintenance Backlogs Project
Confidential, CT
Period: May 2008 to Jun 2009
Role: Business Analyst & Senior Lead ETL Developer

The Hartford Financial Services Group, Inc. is one of the largest investment and insurance companies in the United States. Tata Consultancy Services Limited (TCSL) had been chosen as the vendor for production support, maintenance, and enhancement under the Personal Lines Data Warehouse (PLDW) maintenance and backlogs project.

The project aims at supporting and enhancing the Enterprise Data Warehousing applications in the Property and Casualty division of The Hartford. The Hartford product lines include Group Benefits, Retirement Programs, Mutual Funds (Institutional), Investment Management, and Reinsurance & International, and also support Annuities, Mutual Funds, Automobile Insurance, Homeowners Insurance, Flood Insurance, Life Insurance, College Savings, International AARP-endorsed Products, and Online Commerce Protection for individuals.

The objective during this service phase is to maintain application availability for the business users and to fix production issues within the defined and agreed service levels. TCS will also look at providing enhancements to the existing system.

Role and Responsibilities

  • Interacting with the business analysts to get the business Requirements, reporting needs and creating Business Requirement Document.
  • Performed source system data analysis as per the business requirements; distributed data residing in heterogeneous data sources was consolidated onto the target Enterprise Data Warehouse database.
  • Developed Mappings, Sessions, Workflows and Shell Scripts to extract, validate, and transform data according to the business rules.
  • Identified the fact tables and slowly changing dimension tables for the PLDW data warehouse.
  • Built the necessary staging tables and work tables in the Oracle development environment.
  • Sourced the data from XML files, flat files, SQL server tables and Oracle tables.
  • Made substantial contributions in simplifying the development and maintenance of ETL by creating re-usable Mapplets and Transformation objects. Partitioned the sessions to reduce the load time.
  • Led an offshore team of six members.
  • Performed data cleansing and cache optimization
  • Developed complex mappings in Informatica to load data from source files using different transformations such as Source Qualifier, Lookup (connected and unconnected), Expression, Aggregator, Update Strategy, Joiner, Filter, Stored Procedure, Normalizer, and Router, as well as Mapplets.
  • Implemented Change Data Capture for incremental aggregation.
  • Created Slowly Changing Dimension (SCD) Type 2 mappings for developing the dimensions to maintain the complete historical data.
  • Developed Mappings that extract data to Data Warehouse and monitored the Daily, Weekly, Monthly and Quarterly Loads.
  • Created High level/detailed level design documents and also involved in creating ETL functional and technical specification.
  • Documented ETL development standards as per client requirement.
  • Involved in review of the mappings and enhancements for better optimization of the Informatica mappings, sessions and workflows.
  • Extensively worked in performance tuning of programs, ETL procedures and processes.
  • Worked on Key-Range, Round-Robin, Pass-through and Hash Auto-Keys partitions at PowerCenter session level.
  • Performed Unit, Systems and Regression Testing of the mappings. Involved in writing the Test Cases and also assisted the users in performing UAT.
  • Extensively used UNIX shell scripts to create parameter files dynamically and scheduled jobs using AutoSys; a shell sketch covering parameter file creation and the pmcmd call follows this list.
  • Created integration services, repository services and migrated the repository objects.
  • Used the pmcmd command to automate PowerCenter sessions and workflows from UNIX.
  • Performed query-based and cost-based optimization.
  • Used heterogeneous data sources (Oracle, DB2, XML files, and flat files) and imported stored procedures from Oracle for transformations.
  • Provided production support and maintenance for all the applications with the ETL process.
  • Used the web-based bug-tracking tool Mercury Quality Center for tracking and supporting maintenance and development tasks (CRs).
  • Wrote test plans and executed them during unit testing, and supported system testing, volume testing, and User Acceptance Testing.
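A simplified sketch of the dynamic parameter file creation and pmcmd invocation described above; the integration service, domain, folder, workflow, and variable names are illustrative assumptions rather than the actual project objects.

```sh
#!/bin/ksh
# Hypothetical sketch: build an Informatica parameter file for today's run,
# then start the workflow with pmcmd. All object names are examples only.
RUN_DATE=$(date +%Y%m%d)
PARAM_FILE=/informatica/params/wf_pldw_daily.param

cat > "$PARAM_FILE" <<EOF
[PLDW.WF:wf_pldw_daily]
\$\$RUN_DATE=$RUN_DATE
\$\$SRC_DIR=/data/inbound/$RUN_DATE
EOF

# Start the workflow and wait for completion; pmcmd returns non-zero on failure.
pmcmd startworkflow -sv INT_SVC -d DOMAIN_DEV -u "$INFA_USER" -p "$INFA_PWD" \
     -f PLDW -paramfile "$PARAM_FILE" -wait wf_pldw_daily
exit $?
```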

Environment: Informatica Power Center 8.3, Oracle 10g, Toad, shell scripts, PL/SQL, UNIX (AIX) and Erwin.

Project Title: CRM Migration
Confidential, NJ
Period: May 2007 to Apr 2008
Role: Senior Lead ETL Developer

American Home Assurance (AHA), a division of American International Group (AIG), the leading insurance company, wanted to migrate data from different applications (EMMA and GOALD) to a single source to build a customer-friendly CRM system that call center employees can use to provide information to customers faster and to generate various reports. The EMMA and GOALD databases are in Sybase and the DWH is in Oracle. Before loading into the DWH, intermediate files are loaded into the staging area and then into the Onyx database, from which some reports are generated. This involves migrating customer, policy, product, and campaign data to the CRM system, and moving data such as incidents, call details, prospects, and applicants to the data warehouse.

Roles and Responsibilities

  • Analyzed the existing information sources and methods, understood customer expectations, and identified problem areas.
  • Extensively used DataStage Designer, DataStage Director, DataStage Administrator and Quality Stage.
  • Involved during the analysis, planning, design, development, and implementation stages of projects using IBM WebSphere software (QualityStage, IBM Information Analyzer).
  • Designed, coded, and tested the Information Server components
  • Designed and developed DataStage jobs using parallel processing techniques, implementing pipeline and partition parallelism on an MPP system.
  • Used DataStage Enterprise Edition/Parallel Extender stages, namely Dataset, Sort, Lookup, Change Capture, Funnel, and Row Generator, in the ETL coding.
  • Developed user defined Routines and Transformations using Ascential DataStage Manager
  • Tuned the jobs for higher performance by dumping the look-up data into hash-file.
  • Coordinated with Database Admin to create appropriate indexes for faster execution of jobs.
  • Used the Administrator to manage locks on jobs and perform other administration activities for the DataStage server.
  • Developed UNIX shell scripts to automate the data load processes to the target (a minimal sketch follows this list).
  • Developed jobs to load data in slowly changing dimensions.
  • Involved in Performance Tuning of Jobs.
  • Assisted the operations support team with transactional data loads by developing SQL and UNIX scripts.
  • Used import/export utilities to transfer data from production instance to the development environment.
  • Participated in discussions with Project Manager, Business Analysts and Team Members on any technical and/or Business Requirement issues
  • Obtained detailed understanding of data sources, flat files and complex data schemas
  • Extensively used DataStage Director for job scheduling and for emailing production support with troubleshooting details from log files.
  • Extensively worked on error handling, data cleansing, creating hash files, and performing lookups for faster access of data.
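A minimal sketch of the kind of shell wrapper used to automate these DataStage loads; the project, job, and parameter names are assumptions, and the exit-code mapping should be checked against the installation's dsjob documentation.

```sh
#!/bin/ksh
# Hypothetical sketch: run a DataStage job from a shell script with dsjob
# and check its finishing status. Project/job/parameter names are examples only.
PROJECT=CRM_MIG
JOB=j_load_customer

dsjob -run -param LOAD_DATE=$(date +%Y-%m-%d) -jobstatus "$PROJECT" "$JOB"
STATUS=$?

# With -jobstatus, dsjob's exit code reflects the job's finishing status
# (commonly 1 = finished OK, 2 = finished with warnings; verify for your release).
if [ "$STATUS" -ne 1 ] && [ "$STATUS" -ne 2 ]; then
    echo "DataStage job $JOB failed with status $STATUS" >&2
    exit 1
fi
exit 0
```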

Environment: DataStage 7.5.2 Parallel Extender, IBM Information Server, Quality Stage, Information Analyzer, UNIX Shell Scripts, Sybase, Oracle 9i, Windows NT/2000.

Project Title: AR Data Migration
Confidential, NJ
Period: Oct 2006 to Apr 2007
Role: Senior ETL Developer

Domestic Brokerage Group (DBG), a division of American International Group (AIG), is seeking to replace its existing Accounts Receivable (A/R) and billing systems with an SAP-based solution, and hence the legacy data needs to be migrated. The source data exists in multiple environments such as the IMS database, VSAM files, Sybase, and DB2. This requires not only transferring the data but also converting it in line with the SAP format.

Roles and Responsibilities

  • Involved in all phases of the SDLC: requirement gathering, design, development, unit testing, UAT, production roll-out, enhancements, and production support.
  • Interacting with the business representatives to understand the requirements and determine the best approach for timely delivery of information. Writing the Software Requirement Specification for the Business requirement.
  • Ensuring timely deliveries of work items to the Client.
  • Involved in Implementing ETL standards and Best practices.
  • Used the DataStage Designer to develop processes for extracting, cleansing, transforming, integrating, and loading data into data warehouse database.
  • Obtained detailed understanding of data sources, flat files and complex data schemas
  • Developed jobs using different types of stages: Sequential File, Transformer, Aggregator, Merge, IPC, Link Partitioner, Link Collector, and Hashed File.
  • Extensively worked on Error handling, cleansing of data, Creating Hash files and performing lookups for faster access of data.
  • Used DataStage Manager to import metadata from the repository, create new job categories, and create new data elements. Created reusable repository objects and routines using DataStage Manager.
  • Performance tuning of jobs, stages, sources, and targets.
  • Used DataStage Administrator to set up DataStage projects, define DataStage user profiles, and assign privileges.
  • Extensively used DataStage Director for job scheduling and for emailing production support with troubleshooting details from log files.
  • Developed Routines using DataStage BASIC programming language.
  • Migrated jobs from development to QA to Production environments.
  • Defined production support methodologies and strategies.
  • Logged the different issues related to the development phase.


Environment: DataStage 7.5.2, SAP, Quality Stage, WISD, COBOL, IMS, Sybase, DB2 UDB, Oracle 9i, UNIX Shell Scripts, Windows NT/2000.

Project Title: Home Entertainment Data Warehouse
Confidential, CA
Period: Jan 2006 to Sep 2006
Role: ETL Developer

NBC Universal is one of the world's leading media and entertainment companies in the development, production, and marketing of entertainment, news, and information to a global audience. HEDWay is the data warehouse application used by the Finance and Sales Planning teams in NBC Universal. The data warehouse is categorized into two applications, FEG (Family Entertainment Group) and CATMAN (Category Management), based on the data each application handles. The data warehouse helps the user community generate various reports to plan business strategy based on analysis of current and historical trends.
HEDWay is developed with the objective of having a consolidated and consistent data mart for the sales and inventory divisions of the company, reflecting historical business trends built from historical data. Data from the DB2 source is selectively extracted, transformed, and loaded into the target database using the ETL tool Informatica. Several complex reports were developed using MicroStrategy.

Role and Responsibilities

  • Involved in all phases of the SDLC: requirement gathering, design, development, unit testing, UAT, production roll-out, enhancements, and production support.
  • Interacting with the business representatives to understand the requirements and determine the best approach for timely delivery of information. Writing the Software Requirement Specification for the Business requirement.
  • Ensuring timely deliveries of work items to the Client.
  • Involved in Implementing ETL standards and Best practices.
  • Analyzing the sources, targets, transforming the data, mapping the data and loading the data into targets using Informatica PowerCenter.
  • Designing and developing Informatica mappings and worked on Data modeling with Erwin.
  • Follow the quality standards and guidelines for configuration management, testing, project planning and execution.
  • Carrying out impact analysis for change requests.
  • Extensively used Router, Lookup, Aggregator, Expression and Update Strategy Transformations.
  • Designed and developed the sessions for the relevant mappings or Informatica jobs and organized as sequential / concurrent batches.
  • Worked on upgrading mappings from older Informatica versions to the latest version.
  • Worked with pre- and post-session tasks, and extracted data from the transaction system into the staging area.
  • Tuning the Mappings for Optimum Performance, Dependencies and Batch Design.
  • Schedule and run Extraction and Load process and monitor sessions using Informatica PowerCenter Workflow Manager and Workflow Monitor.
  • Created UNIX scripts to generate email notifications (see the shell sketch after this list).
  • Created the reports using Business Objects functionalities like Slice and Dice, Drill down, Cross Tab, Master/Detail and Formulas etc.
  • Unit and System testing of developed ETL mappings.
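A minimal sketch of the kind of email notification script referenced above; the recipient list, log path, and subject line are illustrative assumptions.

```sh
#!/bin/ksh
# Hypothetical sketch of a load-status email notification.
# Recipient, log path, and subject are placeholders only.
RECIPIENTS="dw-support@example.com"
LOG_FILE=/informatica/logs/hedway_daily.log
STATUS=${1:-FAILED}    # pass SUCCESS or FAILED from the calling load script

# Send the tail of the run log as the message body.
tail -100 "$LOG_FILE" | mailx -s "HEDWay daily load $STATUS - $(date '+%Y-%m-%d %H:%M')" "$RECIPIENTS"
```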

Environment: Informatica PowerCenter 8.1, Business Objects 6.1, SQL SERVER, SAP R/3 4.6c, SQL/PLSQL, DB2, UNIX Shell programming, Erwin.

Project Title: Integrated DataWarehouse 2.0
Confidential, CA
Period: Feb 2005 to Dec 2005
Role: Software Engineer / Analyst

Charles Schwab has embarked upon a strategic initiative to build an enterprise-wide data warehouse (iSchwab) that will enable fact-based decision making. The data warehouse being built will subsequently serve as a firm foundation for Business Intelligence and Data Analytics. The subject area covered in this phase is Banking & Sales Analytics. The process loads the bank customer and account details and the household-wise/category-wise Exclusive, Qualified, and Practice revenue amounts, which will be used by the Decision Support System for analysis.
The project was divided into two phases.

Move data from source data files to the staging area:
In this development phase, the data from different sources was captured in flat files with the pipe character as the delimiter and a header row carrying the record count, as illustrated in the sketch below. The data from the flat files was then moved to the staging area.
Move data from the staging area to the Data Warehouse:
In this development phase, the data from staging is subject to refinement through aggregation, summarization, and reorganization to make it usable for Decision Support System analysis.
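A minimal shell sketch of the header record-count validation implied by that file layout; the header format (a count in the second pipe-delimited field) and the script name are assumptions.

```sh
#!/bin/ksh
# Hypothetical sketch: verify that the record count in the header row of a
# pipe-delimited source file matches the actual number of detail rows.
SRC_FILE=${1:?usage: validate_file.sh <source_file>}

HEADER_COUNT=$(head -1 "$SRC_FILE" | cut -d'|' -f2)    # assumed header layout: HDR|<count>
ACTUAL_COUNT=$(($(wc -l < "$SRC_FILE") - 1))           # detail rows = total lines minus header

if [ "$HEADER_COUNT" -ne "$ACTUAL_COUNT" ]; then
    echo "ERROR: header count $HEADER_COUNT does not match detail row count $ACTUAL_COUNT" >&2
    exit 1
fi
exit 0
```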

Role and Responsibilities

  • Extensively involved in understanding the client’s database and identifying the significant fields to extract.
  • Analyzed the sources, targets, transforming the data, mapping the data and loading the data into targets using Informatica PowerCenter.
  • Involved in extraction of data from the different flat files and data from Oracle OLTP Systems.
  • Developed ETL transformations, mappings, sessions and workflows.
  • Worked extensively on different types of transformations like normalizer, expression, union, filter, aggregator, update strategy, lookup, sequence generator and joiner.
  • Used the Teradata external loading utilities FastLoad, MultiLoad, and TPump to load data efficiently into the Teradata database, and BTEQ for ad hoc reporting (see the shell-wrapped BTEQ sketch after this list).
  • Scheduling the jobs to ensure the freshness of the data.
  • Involved in different Team review meetings.
  • Wrote shell scripts in UNIX to automate the ETL process, file manipulation, and data loading procedures.
  • Supervised the testing and validation of all mappings for different applications.
  • Documented all the ETL processes carried out in Informatica.
  • Tuning the Mappings for Optimum Performance, Dependencies and Batch Design.
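A minimal sketch of how an ad hoc BTEQ step might be wrapped in a UNIX shell script; the TDPID, user, password environment variable, and table name are placeholders, not the actual project objects.

```sh
#!/bin/ksh
# Hypothetical sketch: run an ad hoc BTEQ query from a shell script.
# The TDPID, credentials, and table name are placeholders only.
bteq <<EOF
.LOGON tdprod/etl_user,${TD_PASSWORD}
SELECT load_date, COUNT(*)
FROM   stg.customer_account
GROUP BY 1
ORDER BY 1;
.LOGOFF
.QUIT
EOF
exit $?
```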

Environment: Informatica Power Center 6.1, Teradata, Shell Scripts, PL/SQL procedures, UNIX, Windows.

Project Title: Sales Opportunity Tracking System
Confidential, IL
Period: Mar 2004 to Jan 2005
Role: ETL developer

The Sales Opportunity Tracking System provides better management of opportunities created by sales managers across the US. The aim of this project is to help business executives make quick decisions based on the dimension and fact data in the data warehouse for sales tracking. Involved in extracting and transforming data into the Sales Data Mart.

Role and Responsibilities

  • Involved in development phase meetings for Business Analysis and Requirements Gathering.
  • Involved in dimensional modeling to design and develop star schemas using Erwin, identifying fact and dimension tables.
  • Designed and Documented the ETL mappings based on the Schemas designed.
  • Worked with heterogeneous sources from various channels like Oracle, SQL Server, flat files, and web logs.
  • Created complex and robust mappings and mapplets.
  • Used different transformations such as Joiner, Look-up, Rank, Expressions, Aggregator and Sequence Generator.
  • Involved in Fine-tuning SQL overrides for performance Enhancements.
  • Responsible for choosing proper partitioning methods for performance enhancements.
  • Involved in unit testing of mappings and mapplets, as well as integration testing and user acceptance testing.
  • Scheduled sessions to extract, transform, and load data into the warehouse database based on business requirements.
  • Created sessions, database connections, and batches using the Informatica Server Manager.
  • Unit test of the ETL mappings to ensure functionality

Environment: Informatica Power Center 6.1, Oracle 8i, SQL, PL/SQL, UNIX, Windows NT
