ETL Developer Resume

Richfield, MN

Summary

  • 7+ years of experience in IT, including 5 years of strong experience in Data Analysis, Data Modeling, ETL Design, and implementation of Data Warehouses and Data Marts across verticals such as Retail, Banking, Health Care, Telecommunications and Insurance.
  • Experience in data extraction, transformation and loading using Informatica Power Center 9.0.1/8.6.1/8.5/7.1 (Designer, Repository Manager, Repository Server Administration Console, Workflow Manager and Workflow Monitor).
  • Worked on integrating data from heterogeneous sources like Oracle, flat files, DB2, COBOL and XML files.
  • Designed and developed complex mapping logic using various transformations such as Unconnected and Connected lookups, Router, Filter, Expression, Aggregator, Joiner, Update Strategy and Union.
  • Experience in using XML Source and targets.
  • Extensively involved in data modeling and the design of star and snowflake schemas. Used Erwin for physical and logical data modeling.
  • Experienced in RDBMS, writing MS SQL Server and Oracle SQL queries and PL/SQL programs including Stored Procedures, Functions and Triggers.
  • Experience in integration of various data Sources like Sybase, Oracle, MS SQL Server, MS Access, Teradata, DB2, XML and flat files into the Staging area.
  • Experience with various Online Analytical Processing tools (OLAP) like Cognos 7.2/6.0 and knowledge about Business Objects 5.1/4.0.
  • Experience in writing UNIX shell scripts to support and automate the ETL process.
  • Experience working in both Windows and UNIX environment.
  • Expertise in Data Cleansing, worked extensively using Data Profiler tool to cleanse data.
  • Sound knowledge of Software Development Life Cycle and Agile Project management methodologies such as SCRUM.
  • Excellent problem solving skills with a strong technical background and good interpersonal skills. Quick learner and excellent team player, ability to meet tight deadlines and work under pressure. Keen on keeping abreast of upcoming technologies.

Technical Skills

ETL Tools

Informatica Power Center 9.1 / 8.6 and 7.2, Power Exchange, Power Connect

RDBMS

Oracle 11g/10g/9i/8i/8.0 (SQL, PL/SQL, Stored Procedures, Functions), SQL Server 2005/2000, Teradata, DB2

Data Modeling

Erwin

Reporting Tools

Business Objects XI R3/R2/6, Cognos 7.x, Cognos ReportNet

Database Tools

Toad, SQL Loader and Oracle SQL Developer

Operating Systems

Windows 2000/NT/XP, Unix, Mac OS.

Scripting

Unix Shell Scripting.

Methodologies

Star Schema, Snowflake Schema

Languages

Java, C, C++, XML

Scheduling & Other Tools

Autosys, Control-M, Live Office


Education
Bachelor's in Electronics and Instrumentation Engineering

Professional Experience

Confidential, Richfield (MN) March ’11 – Present
ETL Developer

Description:
Best Buy is a leading multi-channel global retailer and developer of technology products and services. The purpose of this project was to migrate TMS (Transportation Management System) from i2 Technologies to RedPrairie, to increase efficiency in how products are delivered to customers.

Job responsibilities:

  • Gathered business requirements from the client and translated business details into technical designs.
  • Responsible for designing the framework for various requirements.
  • Utilized agile methodology for project delivery, using practices from Scrum.
  • Developed medium-to-complex mappings using several different transformations.
  • Combined data from heterogeneous sources such as flat files, DB2, Oracle and XML.
  • Worked on executing PL/SQL scripts from Informatica mappings using stored procedure transformation.
  • Developed PL/SQL procedures and PL/SQL packages to load and retrieve data from Oracle.
  • Created reusable transformations and mapplets to use them in different mappings.
  • Created mappings which involved Slowly Changing Dimensions (SCD).
  • Created various tasks like sessions, decisions and email notifications.
  • Implemented performance tuning logic on targets, sources, mapping, sessions to provide maximum efficiency and performance.
  • Performance-tuned Informatica mappings for large data files by managing block sizes, data cache sizes, sequence buffer lengths and commit intervals.
  • Documented all mappings and workflows precisely.
  • Used SQL Loader to import data from large flat files and Excel sheets to Oracle tables.
  • Used Erwin to create logical/physical models and data structures.
  • Used Toad to create functions and stored procedures in PL/SQL.
  • Generated various reports using Cognos as per client’s requirements.
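The SCD mappings mentioned above follow the standard Type 2 pattern: when a dimension attribute changes, the current row is expired and a new version is inserted with fresh effective dates. A minimal Python sketch of that logic, purely illustrative (the actual work was done in Informatica mappings, and the field names here are hypothetical):

```python
from datetime import date

def apply_scd2(dim, incoming, today=None):
    """Type 2 SCD merge: expire changed rows, append new versions.

    dim      -- list of dicts: key, attrs, eff_from, eff_to, current
    incoming -- dict mapping natural key -> latest attribute dict
    """
    today = today or date.today()
    current = {r["key"]: r for r in dim if r["current"]}
    for key, attrs in incoming.items():
        old = current.get(key)
        if old is not None and old["attrs"] == attrs:
            continue  # attributes unchanged; keep the current version
        if old is not None:
            old["eff_to"] = today    # close out the old version
            old["current"] = False
        dim.append({"key": key, "attrs": attrs,
                    "eff_from": today, "eff_to": None, "current": True})
    return dim
```

Unchanged rows are left alone, so the dimension preserves full history while exactly one version per key stays flagged as current.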

Environment: Informatica Power Center 9.1/8.6.1/8.5/8, Framework Manager, Erwin 4.5, Oracle 11g/10g, MS SQL Server, Toad, XML, Unix, Windows XP.

Confidential, Charlotte (NC) Apr ‘09 to Feb ‘11
ETL Developer

Description:
Time Warner Cable is a leading name in the telecommunication industry, spread over 33 divisions serving various parts of the country with Cable, Hi-Speed Internet, Digital Phone and other services.
TWC has one of the nation’s largest databases, handling more than half a million transactions a day, with its billing system running on two different platforms, ICOMS (AS400) and VANTAGE (Oracle).

Job responsibilities:

  • Involved in the complete SDLC, from requirement-gathering client meetings through design, development, testing and deployment of the product.
  • Used Erwin to design tables and indexes and to define relationships.
  • Performed data cleansing on source data and registered errors in error log tables.
  • Involved in designing the mapping and workflow.
  • Extensively used Informatica to develop mappings to perform extraction, transformation & loading data into multiple environments.
  • Used Cursors and Stored Procedures for ETL Control tables and aggregate tables.
  • Responsible for production deployment.
  • Used Tidal and Unix Shell scripting for automation.
  • Parameterized mappings and sessions for reusability.
  • Extensively involved in fine tuning the database by creating indexes, partitioning the stage and DM tables, analyzing the queries.
  • Used session level partitioning to tune the mappings and sessions.
  • Administered user privileges, password management & folders.
  • Responsible for migration from Informatica 7.2 to 8.6.
  • Used Win CVS for version control.
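The data-cleansing step described above rejects bad source rows and records them in error log tables for later review. A small Python sketch of that cleanse-and-log pattern, illustrative only (the actual project used Informatica; the field names are hypothetical):

```python
def cleanse(rows, required=("account_id", "amount")):
    """Split source rows into clean rows and an error log.

    Rows missing a required field are rejected and recorded with a
    reason, mirroring the error-log-table pattern: good rows continue
    down the pipeline, bad rows are quarantined but never silently lost.
    """
    clean, errors = [], []
    for i, row in enumerate(rows):
        missing = [f for f in required if not row.get(f)]
        if missing:
            errors.append({"row": i, "reason": "missing " + ", ".join(missing)})
        else:
            clean.append(row)
    return clean, errors
```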

Brief Projects Description:
EVM Data Mart:
The purpose of this project was to develop standardized reports, used by all divisions and corporate users, from a Data Mart with the capability to drill down, slice/dice and view the data multi-dimensionally, enabling end users to view business volumes generated by each division, by service, across time periods.

  • A Snowflake schema was designed to support the application.
  • Data was staged from two systems (ICOMS and Vantage) on a daily basis as an incremental load, and the data mart was rebuilt successfully every day.
  • Various standard data cleansing operations were performed to extract data and stage it in an in-house environment.
  • Designed and developed transformation rules (business rules) to generate consolidated (fact/summary) data using Informatica.
  • Mappings were designed to handle slowly changing dimensions.
  • Maintained shared folders, shortcuts, repositories and user security via Repository Manager.
  • Used database objects such as views, partitioning and stored procedures to implement complex logic.
  • Migrated Repositories across environments.
  • Devised error handling strategies during loads, to handle volatile data.
  • Designed mapplets to minimize re-creation of similar processes.
  • Interpreted logical and physical data models for business users to determine common data definitions and establish referential integrity of the system.
  • Worked with mappings using expressions, aggregators, filters, lookup, update strategy and stored procedures transformations.
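The daily incremental load above hinges on extracting only rows changed since the previous run and carrying a high-water mark forward. A minimal Python sketch of that idea, illustrative only (the real extracts were Informatica mappings; the timestamp field name is hypothetical):

```python
def incremental_extract(rows, last_run, ts_field="updated_at"):
    """Pull only rows changed since the previous load, and return the
    new high-water mark to persist for the next run.

    rows     -- source rows as dicts carrying a change timestamp
    last_run -- high-water mark saved by the previous load
    """
    delta = [r for r in rows if r[ts_field] > last_run]
    # If nothing changed, keep the old mark rather than resetting it.
    new_mark = max((r[ts_field] for r in delta), default=last_run)
    return delta, new_mark
```

Persisting `new_mark` (e.g. in a control table) is what makes the next day's load incremental rather than a full reload.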

AD SALES (CBOL):

The objective of the project was to report timely sales metrics and accurate sales data from CBOL, enable management staff to view company-wide performance, and fulfill special requests from staff in a timely fashion.

  • Designed the data flow from source systems (SQL Server)
  • Involved in designing the data models & dimension models.
  • Devised incremental strategies for stage extracts & DM loads.
  • Designed mappings to capture dimensional changes and to maintain history of the data.

SPRINT JV:

TWC’s joint venture (JV) with Sprint to offer wireless voice service along with its other featured products.

  • Involved in devising a data flow method for importing the data, decrypting the data, loading the data into relational tables, encrypting it and pushing it to another system.
  • Used PGP keys for encryption & decryption.
  • Designed relational tables to accommodate incoming data from Sprint.

UNRATED:

The Unrated process gives the digital phone team the ability to notify the divisions which calls are not rated. All CDRs that do not have a TN match in the subscriber file, or whose call falls outside the subscriber's date range, land in the unrated bucket; the Unrated report lets the divisions know which calls these are.

  • Involved in processing the text files coming from primal and loading them into relational tables on a daily basis.
  • Designed the summary and detail tables and loaded them on an incremental basis for all the divisions.
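The unrated rule described above is simple enough to sketch directly: a CDR is unrated when its TN has no subscriber match or the call date falls outside the subscriber's service range. An illustrative Python version (the production logic lived in Informatica mappings; the field names are hypothetical):

```python
from datetime import date

def split_unrated(cdrs, subscribers):
    """Partition call detail records (CDRs) into rated and unrated.

    A CDR lands in the unrated bucket when its telephone number (TN)
    has no match in the subscriber file, or when the call date falls
    outside the subscriber's service date range.
    """
    rated, unrated = [], []
    for cdr in cdrs:
        sub = subscribers.get(cdr["tn"])
        if sub is None or not (sub["start"] <= cdr["call_date"] <= sub["end"]):
            unrated.append(cdr)
        else:
            rated.append(cdr)
    return rated, unrated
```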

Environment: Informatica Power Center 8.6/7.2, Oracle 9i/10g, SQL, PL/SQL, MS Access, Erwin 4.0, Win CVS, Tidal Scheduler 5.2.2, AIX 5.3, Windows XP Professional 2002, FTP, Toad 9.0, Cognos 7 series, ReportNet, MS Excel 2003, Sybase ASE 12.5.

Confidential, NJ July ’07 to April ‘09
DW Consultant

Description:
AIG’s Enterprise Data Warehouse was developed to support the organization’s growing need to track all HR information concerning the client’s and its subsidiaries’ employees.

Job Responsibilities:

  • Responsible for data modeling and for populating business rules via mappings.
  • Wrote Informatica ETL design documents and establishment of ETL coding standards.
  • Created mappings, mapplets and transformations from imported source and target metadata.
  • Improved mapping performance by rearranging Filter transformations early in the transformation pipeline and by filtering at the Source Qualifier level for relational databases.
  • Implemented various integrity constraints for data Integrity like Referential Integrity using Primary Key and Foreign Key relationships.
  • Wrote SQL overrides in Source Qualifier according to business requirements.
  • Created reusable mapplets and transformations.
  • Performed Informatica mapping reviews.
  • Created dimensions, hierarchies, cubes, mappings and mapplets using Informatica Power Center.
  • Monitored all the sessions that were scheduled and running.
  • Involved in debugging of failed mappings using the Debugger.
  • Wrote optimized stored procedures on Sybase 12.0 using Transact-SQL, and shell scripts to process nightly data feeds.
  • Involved in testing the integrity of the loaded data.
  • Created PL/SQL procedures for processing data within the database.
  • Created catalogs in Impromptu Administrator to create Reports.
  • Updated Impromptu catalog by adding tables and creating joins.
  • Generated various reports using Cognos as per client’s requirements.
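The performance-tuning bullets above come down to one principle: push cheap filters upstream so expensive transformations never see rows that will be discarded. A small illustrative Python comparison (the real tuning was done in Informatica Filter transformations and Source Qualifier SQL overrides):

```python
def run_pipeline(rows, keep, transform, filter_early=True):
    """Compare filtering before vs. after an expensive transformation.

    Counts transform invocations so the saving from pushing the filter
    upstream is visible; both orderings produce identical output.
    """
    count = 0

    def costly(row):
        nonlocal count
        count += 1
        return transform(row)

    if filter_early:
        out = [costly(r) for r in rows if keep(r)]          # filter first
    else:
        transformed = [costly(r) for r in rows]             # transform all
        out = [t for r, t in zip(rows, transformed) if keep(r)]
    return out, count
```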

Environment: Informatica Power Center 7.1/6.2, Sybase 12.0, SQL, PL/SQL, Cognos 7, Microsoft Windows NT, MS SQL Server 2000, Oracle 8i.

Confidential, Chattanooga (TN) Feb ‘06 to July ‘07
DW Consultant

Description:
The objective of this project was to create an FDM (Financial Data Mart) for Blue Cross Blue Shield of Tennessee, also referred to as BCBST. It involved developing a data mart providing a consolidated source of cash receipts used for accounting and reporting purposes. An interface for cash receipts, processed in Facets to the PeopleSoft general ledger, was developed and closely controlled by the business area. This was accomplished by way of process control tables that the Finance Department could update quickly and easily. The ETL operations were carried out using Informatica 7.1.

Job responsibilities:

  • Developed and implemented ETL processes using Informatica to move data from different sources into data mart.
  • Extensively worked on Source Analyzer, Warehouse Designer, Mapping Designer, Mapplet Designer, dimension validations and reusable transformations.
  • Used Designer to extract data from heterogeneous data sources like flat files and databases.
  • Utilized Lookup transformations to update Type 2 slowly changing dimensions using timestamping.
  • Created various transformations such as Filter, Expression, Joiner, Lookup, Update Strategy, Sequence Generator and Stored Procedure.
  • Created mappings & reusable transformations.
  • Used XML source qualifier transformation to transform data from a web application over to the Data mart.
  • Created schedules and ran batches and sessions.
  • Used Workflow Manager to initiate sessions, set source and target options and set server parameters and batch scheduling.
  • Developed many stored procedures, triggers, functions and packages.
  • Wrote shell scripts to perform pre-session and post-session operations.
  • Extensively worked in the performance tuning of the database, ETL procedures and processes.
  • Created list reports, cross-tab reports, chart reports and reports with extensive conditional formatting using Report Studio.
  • Created query prompts, calculations, conditions & filters, prompt pages using Report Studio.
  • Involved in scheduling Report Studio and Query Studio reports and sending reports via email to several business users.
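The pre-/post-session work above typically archives the processed source file and records an audit entry. A Python sketch of one such post-session step, illustrative only (the actual project used Unix shell scripts; the paths and log format here are hypothetical):

```python
import shutil
import time
from pathlib import Path

def post_session(src, archive_dir, audit_log):
    """Post-session step: move the processed source file into an
    archive directory and append an audit line, analogous to the
    post-session shell scripts described above.
    """
    src = Path(src)
    archive_dir = Path(archive_dir)
    archive_dir.mkdir(parents=True, exist_ok=True)
    dest = archive_dir / src.name
    shutil.move(str(src), str(dest))  # archive the consumed feed
    with open(audit_log, "a") as log:
        log.write(f"{src.name}\tarchived\t{int(time.time())}\n")
    return dest
```

Moving (rather than copying) the feed ensures a rerun cannot accidentally double-load the same file.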

Environment: Informatica PowerCenter 7.1 (Source Analyzer, Warehouse Designer, Mapping Designer, Mapplet Designer, Transformations, Workflow Manager, Workflow Monitor), Oracle 9i, SQL, PL/SQL, Stored Procedures, SQL*Loader, Erwin 4.0, Unix Shell Scripting, Cognos Series 7.x, Framework Manager, Cognos Connection, Query Studio, Report Studio, Cognos Configuration, Cognos Scheduler.
