
Sr. ETL Developer Resume


Summary:

  • Eight (8) years of work experience in the IT industry, with extensive work in Data Extraction, Transformation and Loading (ETL) using Informatica, Oracle, Microsoft Access, UNIX, XML files, Teradata V2R5, DB2, and flat files.
  • Two years of experience with Teradata V2R5, including the Teradata utilities BTEQ, MultiLoad (MLOAD), FastLoad (FLOAD), and TPump.
  • Two years of experience in Informatica administration activities.
  • Expertise in the Telecom, Insurance, and Health Care domains.
  • Comprehensive knowledge of physical and logical data modeling and performance tuning.
  • Experienced in extraction, transformation, and loading with Informatica PowerCenter 8.x/7.x/6.x/5.x using Oracle 10g/9i/8i, Teradata, flat files, COBOL, XML files, and web services (WSDL).
  • Strong work experience in data analysis, design, development, implementation, and testing of data warehouses.
  • Experienced in debugging mappings using data and error conditions, and in scheduling jobs with Control-M, Maestro, and the Informatica Scheduler.
  • Involved in migrating a data warehouse from flat files to Teradata.
  • Used Teradata external loaders such as MLOAD and FastLoad to load data from flat files into target tables in a Teradata database.
  • Experienced in handling Change Data Capture (CDC) using Informatica PowerExchange.
  • Experienced with SQL and PL/SQL; created functions, stored procedures, indexes, synonyms, tables, views, etc.
  • Experienced in creating folders, users, and groups in Informatica and assigning privileges to users/groups.
  • Created slowly changing dimension mappings, such as SCD Type II, to load data into data marts.
  • Strong experience in database partitioning and in tuning Informatica mappings that handle large volumes of data.
  • Experienced in deploying code from the Development server to UAT and from UAT to Production.
  • Expertise in unit testing of mappings and sessions; created documents such as test cases, test scripts, and test result documents.
  • Mentored team members on the project overview and ETL architecture.
  • Experienced in the onsite-offshore delivery model.
  • Experienced in designing dimensional models, star schemas, and snowflake schemas.
  • Expertise in data modeling, business process analysis, and debugging and performance tuning of sources, targets, mappings, and sessions; experienced with data modeling tools such as Erwin.
  • Strong understanding of data warehousing principles using fact tables, dimension tables, and star schema modeling, including the Ralph Kimball and Bill Inmon approaches.
  • Articulate, with excellent communication and interpersonal skills and the ability to work in a team as well as individually.
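
As one illustration of the SCD Type II pattern noted above, a minimal SQL sketch (the dimension and staging tables, their columns, and the sequence are hypothetical, not taken from any project below): expire the current row when a tracked attribute changes, then insert the new version with an open-ended effective date range.

```sql
-- Hypothetical Type II dimension: close out the current version of any
-- customer whose tracked attribute (address) changed in staging.
UPDATE dim_customer d
SET    current_flag = 'N',
       effective_end_date = CURRENT_DATE - 1
WHERE  d.current_flag = 'Y'
AND    EXISTS (SELECT 1 FROM stg_customer s
               WHERE  s.customer_id = d.customer_id
               AND    s.address <> d.address);

-- Insert a new current version for changed and brand-new customers
-- (after the update, neither has an open 'Y' row).
INSERT INTO dim_customer
       (customer_key, customer_id, address,
        effective_start_date, effective_end_date, current_flag)
SELECT dim_customer_seq.NEXTVAL, s.customer_id, s.address,
       CURRENT_DATE, DATE '9999-12-31', 'Y'
FROM   stg_customer s
WHERE  NOT EXISTS (SELECT 1 FROM dim_customer d
                   WHERE d.customer_id = s.customer_id
                   AND   d.current_flag = 'Y');
```

In Informatica this same logic is typically built with a Lookup on the dimension, an Expression to compare attributes, and an Update Strategy routing rows to update or insert.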

Technical Skills:

Data Warehousing Tools (ETL):

Informatica PowerCenter 8.x/7.x/6.x, Informatica Data Analyzer, Crystal Reports XI R2, Business Objects, Cognos 8.x.

Operating System:

Windows 98/2000/NT, UNIX, MS-DOS

Design/Application Tools:

Star Schema Modeling, Snowflake Modeling, Fact and Dimension Tables, Erwin, Toad, SQL Navigator, ClearCase, Control-M, Maestro, Microsoft Visio.

RDBMS:

Oracle 8i/9i/10g, DB2 UDB 8.1, Teradata V2R5, and MS Access

Languages/Scripting:

C, SQL, PL/SQL, UNIX shell scripting

Methodologies:

Data Modeling – Logical / Physical / Dimensional, Star / Snowflake

Educational Qualification:

  • Bachelor of Engineering in Electrical and Electronics

Project Profile:

Sr. ETL Developer
Confidential, San Diego, California

Confidential, Data Mart Phase 2, Oct 08 – Dec 09

Confidential is a separate application that was acquired by Qualcomm in 2006. The application runs on mobile phones and lets subscribers log in and perform transactions between their bank accounts.
A data mart and Cognos reports are built on the Firethorn data at the activity and subscriber levels. The data mart has three layers: Stage, Data Mart, and Summary. The Cognos reports mostly run against the Summary layer.
All reports must reach the business managers by 7:30 AM PST.

Foundation Data Warehouse (FDW), Jan 08 – Sep 08

This project deals with QUALCOMM's finance data (capital, fixed assets). The data warehouse is divided into three levels (L1, L2, and L3), each built at a different granularity. L1 is the most granular level, holding transactions at the month level.
Because there are three levels, reports can run at any level without much added complexity, which indirectly improves report performance.

Responsibilities:

  • Participated in client meetings and discussions to develop better solutions for maintaining data quality.
  • Involved in all phases of the project life cycle: analysis, design, coding, testing, and production.
  • Performed unit testing for ETL mappings and documented the results.
  • Provided solutions in Informatica to improve performance where needed.
  • Handled Change Data Capture (CDC) using Informatica PowerExchange.
  • Extensively worked in Informatica Designer, Workflow Manager, and Workflow Monitor, using the Source Analyzer, Warehouse Designer, Mapping Designer, and Mapplet Designer.
  • Created folders, users, and groups in Informatica and assigned privileges to users/groups.
  • Participated in effort estimation to ensure on-time deliverables.
  • Followed data warehousing best practices during design.
  • Resolved production tickets based on priority or urgency.
  • Developed transformation logic and designed various complex mappings and mapplets using the Designer.
  • Worked with various transformations, including Lookup, Aggregator, HTTP, Expression, Router, Filter, Update Strategy, and XML Parser.
  • Involved in data modeling and mentored team members on the project overview and architecture.
  • Worked on partitioning of data at the database level.
  • Configured and ran the Debugger within the Mapping Designer to troubleshoot mappings before the normal workflow run.
  • Extensively worked with the Informatica server running on UNIX, used Workflow Monitor to monitor it, and reviewed the error logs generated for each session run.
  • Created and scheduled Control-M jobs.
  • Deployed code from the Development server to UAT and from UAT to Production.
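
The database-level partitioning mentioned above can be sketched as Oracle range partitioning; the fact table, its columns, and the partition boundaries here are illustrative assumptions only:

```sql
-- Hypothetical monthly range-partitioned fact table (Oracle syntax).
CREATE TABLE fact_subscriber_activity (
    activity_date   DATE          NOT NULL,
    subscriber_id   NUMBER        NOT NULL,
    txn_count       NUMBER,
    txn_amount      NUMBER(12,2)
)
PARTITION BY RANGE (activity_date) (
    PARTITION p_2008_10 VALUES LESS THAN (DATE '2008-11-01'),
    PARTITION p_2008_11 VALUES LESS THAN (DATE '2008-12-01'),
    PARTITION p_max     VALUES LESS THAN (MAXVALUE)
);

-- Queries that filter on activity_date then prune untouched partitions,
-- which is what makes large-volume loads and reports faster.
SELECT COUNT(*)
FROM   fact_subscriber_activity
WHERE  activity_date >= DATE '2008-11-01';
```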

Environment: Informatica PowerExchange, Informatica PowerCenter 8.6, SQL Navigator, Oracle 10g, UNIX, Windows, SQL, Control-M.

Confidential, Oct 06 – Nov 07
ETL Developer
Confidential, San Antonio, Texas

Confidential provides insurance, banking, and finance products to its customers. USAA issues insurance policies specifically for US federal employees. This project mainly deals with property insurance policy data for different lines of business. Data is cleansed to some extent on the mainframe, then flows through the staging data store and packaging layers before finally being loaded into the data mart. Policies are processed weekly and monthly.

Responsibilities:

  • Actively participated in designing the logical and physical data models for the staging area.
  • Created mappings, mapplets, sessions, and workflows to load the staging tables.
  • Extensively used mapping parameters and mapping variables for reuse across mappings.
  • Extensively worked in Informatica Designer, Workflow Manager, and Workflow Monitor, using the Source Analyzer, Warehouse Designer, Mapping Designer, and Mapplet Designer.
  • Developed transformation logic and designed various complex mappings and mapplets using the Designer.
  • Worked with various transformations, including Lookup, Aggregator, Expression, Router, Filter, and Update Strategy.
  • Configured and ran the Debugger within the Mapping Designer to troubleshoot mappings before the normal workflow run.
  • Performed unit testing for ETL mappings and documented the results.
  • Worked in an onsite-offshore model, delivering on time.
  • Extensively worked with the Informatica server running on UNIX, used Workflow Monitor to monitor it, and reviewed the error logs generated for each session run.
  • Involved in preparing documentation for ETL standards, procedures, and naming conventions.
  • Resolved production tickets based on priority.
  • Involved in data modeling and mentored team members on the project overview and architecture.
  • Created partitions and SQL overrides in the Source Qualifier for better performance.
  • Actively participated in creating views to support the cube build process.
  • Deployed code from the Development server to UAT and from UAT to Production.
  • Created source-to-target documents for all mappings, for maintenance purposes.
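
A Source Qualifier SQL override of the kind listed above pushes joins and filters down to the database instead of pulling whole tables into the mapping; a hedged sketch, with hypothetical policy tables and columns:

```sql
-- Hypothetical Source Qualifier override: join and filter at the source
-- so only the weekly batch's active policy rows enter the mapping.
SELECT p.policy_id,
       p.line_of_business,
       p.premium_amount,
       c.customer_name
FROM   stg_policy   p
JOIN   stg_customer c ON c.customer_id = p.customer_id
WHERE  p.process_flag = 'W'            -- weekly batch only
AND    p.policy_status <> 'CANCELLED'
```

The column order in the override must match the Source Qualifier's ports, which is a common source of errors when maintaining such overrides.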

Environment: DB2 UDB V8.1, UNIX, Windows, SQL, Control-M, Informatica PowerCenter 7.1.3, Crystal Reports XI R2, Mainframe (COBOL sources), Flat Files

Confidential, Jan 05 – Sep 06
ETL Developer
Confidential, Fairfield, CT

Confidential is a $15 billion unit of General Electric Company, headquartered in the United Kingdom. Worldwide, GE Healthcare serves healthcare professionals and their patients in more than 100 countries.

Confidential provides transformational medical technologies that are shaping a new age of patient care. GE Healthcare's expertise in medical imaging and information technologies, medical diagnostics, patient monitoring and life support systems, disease research, drug discovery, and biopharmaceutical manufacturing technologies is helping physicians detect disease earlier and tailor personalized treatments for patients. GE Healthcare offers a broad range of products and services that improve productivity in healthcare and enhance patient care by enabling healthcare providers to better diagnose and treat cancer, heart disease, neurological diseases, and other conditions.

Responsibilities:

  • Created various documents, including the ETL design document, source-to-target mapping document, test cases, test scripts, and the code migration document.
  • Migrated data from source to target using Teradata utilities (FastLoad, MultiLoad).
  • Wrote SQL scripts and macros in Teradata to implement business rules.
  • Involved in building a universe on a star schema model for reporting in Business Objects.
  • Worked on the optimization team, tuning Teradata SQL for performance when loading data to the target.
  • Involved in migrating the data warehouse from flat files to Teradata.
  • Used Teradata external loaders such as MLOAD to load data from flat files into target tables in the Teradata database.
  • Extensively worked in Informatica Designer, Workflow Manager, and Workflow Monitor, using the Source Analyzer, Warehouse Designer, Mapping Designer, and Mapplet Designer.
  • Deployed code from the Development server to UAT and from UAT to Production.
  • Involved in code reviews and performance tuning of Teradata programs and ETL loads.
  • Scheduled sessions and workflows using Informatica.
  • Performed performance tuning of mappings and SQL statements.
  • Created proper Primary Indexes (PI), taking into consideration both the planned access paths and even distribution of data across all available AMPs.
  • Used generic shell scripts to clean up Informatica external loader failures (MLOAD, FastLoad, etc.) and to update various target tables.
  • Performed unit testing for ETL mappings and documented the results.
  • Worked closely with the QA team to close any bugs.
  • Worked closely with the RTP team to deploy code from Dev to QA and from QA to Production.
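
Primary Index selection as described above can be sketched in Teradata DDL; the table and columns are hypothetical, but the idea is that the PI column drives both row distribution across AMPs and single-AMP access paths:

```sql
-- Hypothetical Teradata table: a high-cardinality PI (account_id)
-- hashes rows evenly across AMPs and makes lookups by account single-AMP.
CREATE TABLE fin_asset_txn (
    account_id   INTEGER NOT NULL,
    txn_month    DATE    NOT NULL,
    asset_class  CHAR(4),
    txn_amount   DECIMAL(15,2)
)
PRIMARY INDEX (account_id);

-- A low-cardinality PI such as asset_class would skew the rows onto a few
-- AMPs, so it is avoided even when queries often filter on that column.
```
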
Environment: Oracle 8i, Informatica 6.1, Windows 2000, SQL, TOAD, UNIX, Maestro, Business Objects 5.1, Teradata V2R5

Confidential, May 03 – Oct 04
ETL developer
Confidential, Schaumburg

The PCS cost accounting DSS project requested the creation of a customized, in-house application that provides an automated and consistent method of capturing the costs incurred during all life-cycle phases of developing a Motorola cellular product. Cost Management for Engineering Projects (CMEP) is the project developed for this purpose.

The project has three main modules: Extraction, Transformation and Loading (ETL); Reports; and the CMEP application interface.

Responsibilities:

  • Studied the current system.
  • Analyzed the specification document.
  • Created the detailed design and standards documents.
  • Extensively worked in Informatica Designer, Workflow Manager, and Workflow Monitor, using the Source Analyzer, Warehouse Designer, Mapping Designer, and Mapplet Designer.
  • Created and generated SQL scripts for warehouse tables.
  • Performed unit testing for ETL mappings and documented the results.
  • Extracted data from Access and Oracle using Informatica.
  • Created and executed sessions, database connections, and worklets using Workflow Manager.
  • Used pmcmd commands to start and stop workflows.
  • Scheduled and monitored transformation processes using Informatica Workflow Manager.
  • Created the test instance and scheduled the workflows for testing.
  • Deployed code from the Development server to UAT and from UAT to Production.
  • Created unit test cases, defect logs, and the system test plan.
  • Participated in requirement review meetings as required.
Environment: Oracle 8i, Informatica 6.1, Windows XP, Business Objects 5.1, ClearCase, VB.Net, UNIX

Confidential, Jun 02 – Feb 03
ETL Developer
Confidential,
Capacity Collaboration is the component of forecast collaboration that facilitates the exchange of planning information between manufacturers and suppliers at the resource and item-resource levels. Capacity Collaboration can be either non-captive or captive.

In general, modeling capacity collaborations includes the following steps:
1. Define the entities included in the collaboration.
2. Define the data measures that are monitored in the collaboration.
3. Define the views that are used to see the collaboration data.
4. Define the actions associated with the collaboration, such as create, modify, delete, or update an object.

Responsibilities:

  • Studied the users' analysis requirements.
  • Extracted data from Oracle and flat files using Informatica.
  • Extensively worked in Informatica Designer, Workflow Manager, and Workflow Monitor, using the Source Analyzer, Warehouse Designer, Mapping Designer, and Mapplet Designer.
  • Used Router, Aggregator, Lookup, Expression, and Update Strategy transformations in mappings as required.
  • Designed and developed Aggregator, Joiner, Lookup, Filter, and Router transformation rules to generate consolidated data using the Informatica ETL tool.
  • Created mappings and mapplets and tuned them for better performance.
  • Created and executed sessions, database connections, and worklets using Workflow Manager.
  • Scheduled and monitored transformation processes using Informatica Workflow Manager.
  • Scheduled the workflows daily using the Informatica Scheduler.
Environment: Oracle 8i, Informatica 6.0, Windows 2000, TOAD, ClearCase.

Confidential, Sep 01 – May 02
ETL Developer
Confidential,
The Product Report Wizard calculates last year's gross sales billing and quantity. Using the FISCAL CALENDER table, it takes the fiscal month start date, end date, and fiscal month work days, compares these dates against the prior year to calculate gross sales billing and quantity, initializes the two fields to zero at the start date of the fiscal month, and loads the data into an Oracle table. The Product Report Wizard was developed for this purpose.
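
The calculation described above might look roughly like the following SQL; the fiscal calendar columns and the sales table are assumptions for illustration, not the project's actual schema:

```sql
-- Hedged sketch: last year's gross sales billing and quantity per fiscal
-- month, shifting this year's fiscal month boundaries back twelve months.
-- The LEFT JOIN plus NVL yields zero when a month has no sales, matching
-- the "initialize the two fields to zero" behavior described above.
SELECT fc.fiscal_month,
       NVL(SUM(s.gross_sales_bill), 0) AS ly_gross_sales_bill,
       NVL(SUM(s.quantity), 0)         AS ly_quantity
FROM   fiscal_calendar fc
LEFT JOIN sales s
       ON s.sale_date BETWEEN ADD_MONTHS(fc.month_start_date, -12)
                          AND ADD_MONTHS(fc.month_end_date,   -12)
GROUP BY fc.fiscal_month;
```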

Responsibilities:

  • Created the detailed design document and standards document.
  • Created and generated SQL scripts for tables.
  • Extracted data from Oracle and flat files using Informatica.
  • Extensively worked in Informatica Designer, Workflow Manager, and Workflow Monitor, using the Source Analyzer, Warehouse Designer, Mapping Designer, and Mapplet Designer.
  • Wrote SQL overrides and documented the Informatica mappings.
  • Designed and developed Aggregator, Lookup, and Filter transformation rules to generate consolidated data.
  • Created mappings and mapplets for better performance.
  • Created and executed sessions using Workflow Manager.
  • Scheduled the workflows five days a week using the Informatica Scheduler.
  • Participated in requirement review meetings as required.
  • Followed the Informatica scanner developed by GE for its projects to maintain Informatica standards.
  • Worked on version control.
Environment: Oracle 8i, Informatica PowerCenter 5.x, Windows NT, UNIX
