
ETL Developer Resume


Washington, DC

Professional Summary:

  • 8 years of IT experience in Data Warehousing, Database Design and ETL Processes, in test and production environments, across business domains including finance, manufacturing, and health care.
  • Highly proficient in Development, Implementation, Administration and Support of ETL processes for Large-scale Data warehouses using Informatica Power Center.
  • Worked extensively on ETL processes using Informatica Power Center 9.x/8.x/7.x.
  • SQL database experience in high-transaction, multi-server production environments.
  • Expert Knowledge of Integration Services (SSIS), Analysis Services (SSAS) and Reporting Services (SSRS).
  • Experience in Database Development, Data Warehousing, Design and Technical Management.
  • Good understanding of database and data warehousing concepts (OLTP, OLAP).
  • Hands on Experience in Installing, Configuring, Managing, Monitoring and Troubleshooting SQL Server 2005/2008.
  • Hands-on experience in tuning mappings, identifying and resolving performance bottlenecks at various levels: sources, targets, mappings, sessions, and the database.
  • Extensive experience with data modeling techniques, logical and physical database design.
  • Experience in Extracting, Transforming and Loading (ETL) data from Excel, flat files, and Oracle to MS SQL Server using DTS and SSIS.
  • SQL Server 2008/2005 RDBMS database development including T-SQL programming.
  • Extracted data from File Sources, Relational Sources, XML and COBOL sources using Informatica PowerCenter.
  • Experience in administering reports and assigning permissions to valid users for executing them.
  • Worked extensively on ETL processes using IBM Cognos Data Manager and Pentaho Kettle.
  • Worked on OLAP data warehouse modeling, design, and implementation.
  • Experience in upgrading from Informatica Powercenter 8.1 to Informatica Powercenter 9.1.
  • Experience in Creating and Updating Clustered and Non-Clustered Indexes to keep up the SQL Server performance.
  • Used PowerExchange to integrate sources such as mainframe MVS, VSAM, GDG, DB2, and XML files.
  • Experience in UNIX shell scripting, job scheduling, and communicating with the Informatica server using pmcmd (see the sketch after this list).
  • Experience in Analyzing Execution Plan and managing indexes and troubleshooting deadlocks.
  • Good knowledge in using system tables such as sysindexes, sysprocesses, sysobjects and syscomments in various queries and using Database Console Commands (DBCC).
  • Good knowledge in Normalizing and De-normalizing the tables and maintaining Referential Integrity by using Triggers and Primary and Foreign Keys.
  • In-depth understanding of Data Warehousing and Business Intelligence concepts.
  • Designed and developed efficient error handling methods and implemented them throughout the mappings in various projects.
  • Designed, tested, and deployed plans using IDQ 8.5.
  • Good knowledge of Master Data Management concepts.
  • Responsible for interacting with business partners to identify information needs and business requirements for Reports.
  • Good knowledge of Kimball and Inmon data warehouse design approaches and considerations.
  • Good understanding of dimensional models, slowly changing dimensions, and star and snowflake schemas.
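
A minimal sketch of the pmcmd usage mentioned above, assuming a Korn shell environment; the service, domain, folder, workflow, and credential names are illustrative placeholders:

#!/bin/ksh
# Start an Informatica workflow with pmcmd and propagate its exit status.
# -pv reads the password from the named environment variable, so no
# credential is hard-coded in the script.
pmcmd startworkflow \
    -sv IS_PROD -d Domain_Prod \
    -u etl_user -pv INFA_PASSWD \
    -f DW_FOLDER -wait wf_daily_load
rc=$?
if [ $rc -ne 0 ]; then
    echo "wf_daily_load failed with exit code $rc" >&2
fi
exit $rc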

Technical Skills:

Databases:

Oracle 11g/10g/9i, Oracle EBS R12/11i, DB2, SQL Server 7.0/2000/2005/2008, MS Access 2000/2005, Teradata

Languages:

Transact-SQL, PL/SQL, HTML, C, C#, Perl

Operating Systems:

Windows NT/98/2000/XP/Vista/2003, Linux, UNIX, MS-DOS, Sun Solaris.

OLAP/Reporting Tools:

SQL Server Analysis Services (SSAS), SQL Server Reporting Services (SSRS), SharePoint MOSS 2007, Business Objects 6.x, Cognos Framework Manager

ETL Tools:

SQL Server Integration Services (SSIS), SQL Server DTS, Informatica Power Center 9.x/8.x/7.x, Informatica Data Quality Suite 8.5, Cognos Data Manager, Pentaho Kettle

Data Modeling Tools:

Microsoft Visio 2000/2003

SQL Server Tools:

SQL Server Management Studio, SQL Server Query Analyzer, SQL Server mail service, DBCC, BCP, SQL Server Profiler

Web Technologies:

MS FrontPage, MS Outlook Express, FTP, TCP/IP, LAN, PHP

Other Tools:

Microsoft Office, MS Visio, Visual Basic 6

Professional Experience:

Confidential Aug 2012 – Present
Washington, DC.
Role: ETL Developer

The Federal National Mortgage Association, commonly known as Fannie Mae, is a government-sponsored enterprise (GSE). The corporation's purpose is to expand the secondary mortgage market by securitizing mortgages in the form of mortgage-backed securities (MBS), allowing lenders to reinvest their assets into more lending and in effect increasing the number of lenders in the mortgage market by reducing the reliance on thrifts. The objective of the project is to validate the Annual Loan Level Disclosure of MBS with the fixed and adjustable rate loan pools.

Responsibilities:

  • Designed and developed various Informatica mappings using transformations like Expression, Aggregator, External Procedure, Stored Procedure, Normalizer, Lookup, Filter, Joiner, Rank, Router, Update Strategy and XML.
  • Developed ad-hoc mappings for various business needs.
  • Performed design and analysis of business systems applications, systems interfaces, databases, reporting, and business intelligence systems.
  • Delivered new systems functionality supporting corporate business objectives.
  • Translated requirements and high-level design into detailed functional design specifications.
  • Tuned ETL procedures and schemas to optimize load and query performance.
  • Interpreted logical and physical data models for Business users to determine common data definitions.
  • Supported the BI team by extracting operational data from multiple sources, merging and transforming the data to facilitate enterprise-wide reporting and analysis and delivering the transformed data to coordinated data marts.
  • Involved in data validation, data integrity and database performance checks, field size validations, check constraints, and data manipulation.
  • Coordinated with business users to understand business needs and implemented them in a functional data warehouse design.
  • Used ERStudio to analyze and optimize database and data warehouse structure.
  • Implemented the concept of slowly changing dimensions to maintain current and historical data in the dimension.
  • Used the Autosys scheduler to schedule and run Informatica workflows (see the wrapper sketch after this list).
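
A sketch of the kind of wrapper script an Autosys command job might invoke here; Autosys derives the job status from the script's exit code, and every name and path below is an assumption:

#!/bin/ksh
# Wrapper invoked by an Autosys command job. Autosys marks the job
# SUCCESS on exit 0 and FAILURE otherwise, so pmcmd's status is
# passed straight through.
LOG=/var/log/etl/wf_mbs_disclosure.$(date +%Y%m%d).log
pmcmd startworkflow \
    -sv IS_PROD -d Domain_Prod \
    -u etl_user -pv INFA_PASSWD \
    -f MBS_FOLDER -wait wf_mbs_disclosure >> "$LOG" 2>&1
exit $?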

Environment: Informatica 9.5, Oracle 11g, UNIX, PL/SQL, Embarcadero ERStudio Data Architect, TOAD, Autosys, PuTTY.

Confidential, USA Nov 2011 – Aug 2012
Bellevue, WA.
Role: ETL Developer

T-Mobile USA, Inc. is an American mobile-network operator that provides wireless voice, messaging and data services in the United States, Puerto Rico and the U.S. Virgin Islands. The company is the fourth-largest wireless carrier in the U.S. market. T-Mobile plans to implement a hosted Sales Performance Management (SPM) solution to support sales performance management and the sales compensation administration process. The purpose of this project is to deliver the technology needed to replace the current systems’ functionality with an easy-to-maintain, end-to-end SPM solution.

Responsibilities:

  • Designed and developed various Informatica mappings using transformations like Expression, Aggregator, External Procedure, Stored Procedure, Lookup, Filter, Joiner, Rank, Router, Update Strategy and XML.
  • Developed ad-hoc mappings for various business needs.
  • Performed design and analysis of business systems applications, systems interfaces, databases, reporting, and business intelligence systems.
  • Delivered new systems functionality supporting corporate business objectives.
  • Translated requirements and high-level design into detailed functional design specifications.
  • Wrote stored procedures, functions, and database triggers in SQL Server 2008 (see the deployment sketch after this list).
  • Tuned ETL procedures and schemas to optimize load and query performance.
  • Interpreted logical and physical data models for Business users to determine common data definitions.
  • Supported the BI team by extracting operational data from multiple sources, merging and transforming the data to facilitate enterprise-wide reporting and analysis and delivering the transformed data to coordinated data marts.
  • Involved in data validation, data integrity and database performance checks, field size validations, check constraints, and data manipulation and updates using SQL single-row functions.
  • Coordinated with business users to understand business needs and implemented them in a functional data warehouse design.
  • Used ERStudio to analyze and optimize database and data warehouse structure.
  • Implemented the concept of slowly changing dimensions to maintain current and historical data in the dimension.
  • Facilitated and led design reviews of the functional design with other members of the technical team, communicating design, requirements, feature set, functionality, and limitations of systems/applications.
  • Provided SME-level guidance to development and application support teams during the development and deployment phases.
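
A hypothetical one-line sketch of deploying such T-SQL objects from the command line with sqlcmd, the client that ships with SQL Server 2005/2008; the server, database, and script names are placeholders. -E uses Windows authentication, and -b makes sqlcmd return a nonzero exit code if the script fails:

sqlcmd -S PRODSQL01 -d SPM_DW -E -b -i create_comp_procs.sql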

Environment: Informatica 9.1, SSIS, SQL Server 2008, PL/SQL, Transact SQL, Oracle 11g, Embarcadero ERStudio Data Architect, TOAD.

Confidential Jan 2011 – Nov 2011
Austin, TX.
Role: ETL Engineer

Golfsmith International Inc. is the world's largest superstore for golf equipment and accessories. The primary objective of this project is to bring data from the new and the old environments together, against the backdrop of the Oracle EBS R12 implementation, perform the necessary operations on the data per user requirements, and load the resulting data into the data warehouse for analysis and reporting.

Responsibilities:

  • Designed and developed various Informatica mappings using transformations like Expression, Aggregator, External Procedure, Stored Procedure, Lookup, Filter, Joiner, Rank, Router, Update Strategy and XML.
  • Developed ad-hoc mappings for various business needs.
  • Installed Informatica PowerCenter 9.1 and migrated existing processes and code from Informatica PowerCenter 8.1.
  • Responsible for developing, testing, supporting, and maintaining the ETL processes using Informatica Power Center 9.1.
  • Responsible for developing robust, reliable, low-maintenance solutions and testing them thoroughly.
  • Supported the BI team by extracting operational data from multiple sources, merging and transforming the data to facilitate enterprise-wide reporting and analysis and delivering the transformed data to coordinated data marts using Cognos 8 BI Data Manager.
  • Delivered metadata to Cognos Framework Manager to model the target data warehouse and the data repositories used in Cognos Business Intelligence.
  • Created and scheduled JobStreams in Cognos Data Manager to automate the data load processes to the target data warehouse.
  • Designed the process to write failed and rejected rows into flat files and tables using Cognos Data Manager.
  • Developed UNIX shell scripts for scheduling sessions in Informatica.
  • Wrote stored procedures, functions, and database triggers in SQL Server 2008.
  • Extensively used SSIS FTP tasks for data transfers from FTP servers to SQL Server.
  • Created SSIS packages to clean and load data to data warehouse.
  • Created SSIS package to transfer data between OLTP and OLAP databases.
  • Created jobs, alerts, and SQL Mail Agent notifications, and scheduled SSIS packages.
  • Coordinated with business users to understand business needs and implemented them in a functional data warehouse design.
  • Assisted in documenting business requirements for data warehousing and integration needs.
  • Worked extensively with Oracle Applications R12 and Oracle Applications 11.0.3.
  • Retrofitted PL/SQL scripts, functions, and procedures from Oracle Applications 11.0.3 into the R12 environment.
  • Used the UTL_FILE package in PL/SQL to develop Oracle concurrent request scripts that write table data to MS Excel worksheets (see the sketch after this list).
  • Tuned ETL procedures and schemas to optimize load and query performance.
  • Developed stored procedures using PL/SQL.
  • Developed data transfer and transformation tasks using Pentaho Kettle.
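
A minimal, hypothetical sketch of the UTL_FILE pattern referenced above, driven from a shell script; the credentials, directory object, table, and file names are assumptions (DATA_DIR must be an Oracle DIRECTORY object the schema may write to):

#!/bin/ksh
# Run a PL/SQL block that writes table rows to a CSV file via UTL_FILE.
sqlplus -s "apps/$APPS_PWD" <<'EOF'
WHENEVER SQLERROR EXIT FAILURE
DECLARE
  f UTL_FILE.FILE_TYPE;
BEGIN
  f := UTL_FILE.FOPEN('DATA_DIR', 'onhand_extract.csv', 'w');
  FOR r IN (SELECT item_id, onhand_qty FROM inv_onhand) LOOP
    UTL_FILE.PUT_LINE(f, r.item_id || ',' || r.onhand_qty);
  END LOOP;
  UTL_FILE.FCLOSE(f);
END;
/
EOF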

Environment: Informatica Power Center 9.1/8.1, IBM Cognos 8 BI Data Manager, SSIS, Pentaho Kettle, SQL Server 2008, PL/SQL, Transact SQL, Oracle EBS R12, Oracle Applications 11.0.3, Windows XP.

Confidential Sep 2009 – Dec 2010
Washington, DC
Role: ETL Engineer

The DCI Group, LLC is one of the top strategic public affairs consulting firms, associated with the telemarketing company Feather Larson Synhorst DCI and the direct-mail firm FYI Messaging. The idea of the project is to initially build data marts and then integrate all the marts into one enterprise-wide data warehouse.

Responsibilities:

  • Designed and developed various Informatica mappings using transformations like Expression, Aggregator, External Procedure, Stored Procedure, Lookup, Filter, Joiner, Rank, Router, Update Strategy and XML. Developed ad-hoc mappings for various business needs.
  • Designed, tested, and deployed plans using Informatica Data Quality Suite 8.5.
  • Developed and tested all the backend programs, Error Handling Strategies and update processes.
  • Experience in using Normalizer transformation for normalizing the XML source data.
  • Extensively used XML transformation to generate target XML files.
  • Developed Scripts to automate the Data Load processes to target Data warehouse.
  • Worked extensively with Oracle Data Integrator (ODI) and Oracle Warehouse Builder (OWB).
  • Tuned ETL procedures and schemas to optimize load and query performance.
  • Involved with the Architecture group to develop ETL metadata strategies and Informatica object reuse policies. Developed reusable Informatica Mapplets and Transformations.
  • Developed ETL technical specs, Visio diagrams for the ETL process flow, ETL load and execution plans, test cases, and test scripts.
  • Involved in production support activities with Installation and Configuration of Informatica Power Center 8.6.
  • Analyzed, Designed and Implemented the ETL architecture and generated OLAP reports.
  • Used Informatica Workflow Monitor to monitor and control jobs.
  • Built a data movement process that loads data from DB2 into Teradata by developing Korn shell scripts using Teradata SQL and utilities such as BTEQ, FastLoad, FastExport, and MultiLoad (see the sketch after this list).
  • Developed stored procedures using PL/SQL and driving scripts using Unix Shell Scripts.
  • Created BTEQ and MultiLoad scripts to load Teradata tables.
  • Worked extensively with Teradata utilities (MultiLoad, TPump, and FastLoad) to load data.
  • Wrote stored procedures, functions, and database triggers. Created database triggers on tables to generate surrogate keys.
  • Coordinated with business users to understand business needs and implemented them in a functional data warehouse design.
  • Interpreted logical and physical data models for Business users to determine common data definitions.
  • Installation, Configuration and Deployment of Informatica 8.6 with Informatica GRID option.
  • Assisted in documenting business requirements for data warehousing and integration needs.
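
A minimal Korn shell sketch of the BTEQ load pattern referenced above; the logon file, staging table, and input file are placeholders:

#!/bin/ksh
# Import a delimited DB2 extract into a Teradata staging table with BTEQ.
# The .tdlogon file holds the .LOGON string, keeping the password out
# of the script.
bteq <<'EOF'
.RUN FILE = /home/etl/.tdlogon
.IMPORT VARTEXT ',' FILE = /data/in/db2_extract.csv
.QUIET ON
.REPEAT *
USING (cust_id VARCHAR(18), bal_amt VARCHAR(20))
INSERT INTO stg.customer_bal (cust_id, bal_amt)
VALUES (:cust_id, :bal_amt);
.QUIT 0
EOF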

Environment: Informatica Power Center 8.6, Informatica Data Quality Suite 8.5, Teradata V2R6, Oracle 10g, DB2, TOAD, Erwin 3.5.2, PL/SQL, WINCVS, Windows XP, UNIX, Sun Solaris

Confidential Sept 2008 - Aug 2009
Stamford, CT
Role: ETL Engineer

UBS is a premier global financial services firm offering wealth management, investment banking, asset management and business banking services to its clients. The firm's data warehouse was developed to provide account, customer, branch, policy and employee information. This information was extracted from different OLTP applications using Informatica and transformed into a Teradata data mart.

Responsibilities:

  • Worked on Informatica Power Center 6.2 - Source Analyzer, Warehouse Designer, Mapping Designer, Mapplet Designer, Transformations, Workflow Manager (Task Developer, Worklets, and Workflow Designer), and Workflow Monitor.
  • Responsible for design, development and maintenance of Data Marts including Sales, Policy, Customer Reporting and Claims, leveraging the Informatica Power Center ETL tool, Oracle, DB2 and PL/SQL.
  • Responsible for developing, testing, supporting, and maintaining the ETL processes using Informatica Power Center.
  • Responsible for developing robust, reliable, low-maintenance solutions and testing them thoroughly.
  • Created mappings using the transformations like Source Qualifier, Aggregator, Expression, Lookup, Router, Normalizer, Filter, Update Strategy and Joiner transformations.
  • Created reusable transformations and Mapplets and used them in complex mappings.
  • Used debugger in identifying bugs in existing mappings by analyzing data flow, evaluating transformations.
  • Used Informatica Power center workflow manager to create sessions, workflows and Worklets to run with the logic embedded in the mappings.
  • Used the Informatica Server Manager to embed sessions with pre- and post-session shell scripts. Also wrapped session pmcmd commands in shell scripts for batch processing using the Autosys scheduler.
  • Developed BTEQ and MultiLoad scripts to load data to the Teradata data mart.
  • Designed and developed UNIX shell scripts to schedule jobs. Also wrote pre-session and post-session shell scripts.
  • Used crontab to schedule jobs in the production environment.
  • Unit tested and tuned SQL and ETL code for better performance.
  • Designed high-level ETL architecture for overall data transfer from OLTP to OLAP with the help of SSIS.
  • Created various Documents such as Source-To-Target Data mapping Document, Unit Test Cases and Data Migration Document.
  • Developed Complex Teradata SQL in BTEQ scripts to transform the data in Teradata staging tables before loading into EDW.
  • Developed BTEQ and FastExport scripts to extract data from the warehouse to files for downstream applications (see the sketch after this list).
  • Created stored procedures to transform the data. Worked extensively on SQL and PL/SQL for various transformation needs.
  • Actively involved in building the system test environment and migrated mappings from Development to System Test environment and executed code in QA environment.
  • Created packages in Harvest to migrate code across multiple environments through a standard transmittal process.
  • Involved in conducting a POC (proof of concept) on SAS integration with Teradata.
  • Participated in Review of Test Plan, Test Cases and Test Scripts prepared by system integration testing team.
  • Monitored the performance and identified performance bottlenecks in ETL code.
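
A hypothetical Korn shell sketch of the FastExport extract referenced above; the log table, logon file, source table, and output file are assumptions:

#!/bin/ksh
# Export warehouse rows to a flat file for downstream applications.
fexp <<'EOF'
.LOGTABLE etl_wrk.policy_feed_log;
.RUN FILE /home/etl/.tdlogon;
.BEGIN EXPORT SESSIONS 4;
.EXPORT OUTFILE /data/out/policy_feed.dat MODE RECORD FORMAT TEXT;
SELECT CAST(policy_id AS VARCHAR(18)) || '|' || status
FROM edw.policy;
.END EXPORT;
.LOGOFF;
EOF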

Environment: Informatica Power Center 6.2, Oracle 9i, DB2, Erwin 4.0, Unix Shell Scripting, Teradata V2R5, Crontab, MS PowerPoint, Business Objects 6.0/6.5, SAS, TOAD 7.5, SQL, PL/SQL, Win NT 4.0

Confidential June 2006 - Aug 2008
Bangalore, India
Role: ETL Developer

Dell Inc. is an American multinational information technology corporation that develops, sells and supports computers and related products and services. The primary objective of this project is to get data from different sources (SQL Server, Oracle, Flat file) and perform the necessary operations on the data as per the user requirement and load into the data warehouse for analysis.

Responsibilities:

  • Involved in dimensional modeling to design and develop star schemas, using ERwin 4.0 to identify fact and dimension tables.
  • Worked with different Sources such as Oracle, MS SQL Server and Flat file.
  • Used Informatica to extract data into Data Warehouse.
  • Extensively used transformations such as Source Qualifier, Router, Lookup (connected and unconnected), Update Strategy, Joiner, Expression, Aggregator, and Sequence Generator.
  • Identified and tracked the slowly changing dimensions, heterogeneous Sources and determined the hierarchies in dimensions.
  • Built data movement processes that load data from databases and files into Teradata by developing Korn shell scripts using Teradata SQL and utilities such as BTEQ, FastLoad, FastExport, MultiLoad, WinDDI, and Queryman.
  • Tuned Teradata SQL Queries.
  • Created reusable transformations and Mapplets and used them in mappings.
  • Used Informatica Repository Manager to maintain all the repositories of various applications.
  • Developed number of Complex Informatica Mappings, Mapplets and Reusable Transformations for the Claim Profitability Systems to facilitate Daily, Monthly and yearly Loading of Data.
  • Involved in fixing invalid Mappings, testing of Stored Procedures and Functions, Testing of Informatica Sessions, and the Target Data.
  • Wrote UNIX scripts and PL/SQL scripts for implementing business rules.
  • Worked on SQL tools like TOAD to run SQL queries to validate the data.
  • Designed and developed pre-session, post-session, and batch execution routines to run Informatica sessions using the Informatica Server (see the sketch after this list).
  • Created and scheduled sessions and worklets using Workflow Manager to load data into the target database.
  • Wrote PL/SQL stored Procedures and Functions for Stored Procedure Transformations.
  • Used the Control-M scheduler to automate the process.
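
A hypothetical sketch of a pre-session routine of the kind described above: it holds the session until a source trigger file arrives; the path and the one-hour timeout are assumptions:

#!/bin/ksh
# Pre-session script: wait up to 60 minutes for the source trigger file.
# A nonzero exit fails the session before any data is read.
TRIGGER=/data/in/sales_extract.done
i=0
while [ ! -f "$TRIGGER" ]; do
    i=$((i + 1))
    if [ "$i" -gt 60 ]; then
        echo "trigger file $TRIGGER never arrived" >&2
        exit 1
    fi
    sleep 60
done
exit 0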

Environment: Informatica PowerCenter 7.1.3, PowerExchange, Oracle 9i, MS SQL Server 2000, Teradata V2R5, MS Enterprise Manager, MS Query Analyzer, Flat file, XML, Control-M (scheduler), Linux, UNIX Korn scripting, Toad, Erwin 4.0, Java, JMS, Windows NT, ClearCase.

Confidential Dec 2004 - Apr 2006
Thane, India
Role: SQL Developer

Siemens is the world’s single-source leader of automation technology products engineered and manufactured for all industrial sectors. This project was to create and maintain a central data warehouse for all the data and to create reports to give better analysis to the automation and drive department.

Responsibilities:

  • Worked on complex data loading (implemented batch data cleansing and loading).
  • Used the BCP utility to export table output to text files (see the sketch after this list).
  • Worked on DTS packages and DTS Import/Export for transferring data from heterogeneous databases to SQL Server.
  • Creation/maintenance of indexes for a fast and efficient reporting process.
  • Configured the server to send automatic mail notifications to the appropriate people on DTS package failure or success.
  • Created new tables and wrote stored procedures and some user-defined functions for application developers.
  • Analyzed database growth and space requirements. Handled users, logins, and user rights.
  • Maintained a good client relationship by communicating daily status and weekly status of the project.
  • Created linked servers between different SQL Servers, and also created linked servers to the Access files used across various departments.
  • Performed DBA tasks through Enterprise Manager, Query Analyzer, English Query, and the Import/Export Wizard.
  • Used Data Manipulation Language (DML) statements to insert and update data, satisfying referential integrity constraints and ACID properties.
  • Performance tuning of SQL queries and stored procedures using SQL Profiler and Index Tuning Wizard.
  • Developed SQL scripts to Insert/Update and Delete data in MS SQL database tables.
  • Developed code that matches the prototype and specification, is maintainable, and, as necessary, is portable to other environments.
  • Created business-critical stored procedures and functions to support efficient data storage and manipulation.
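
A hypothetical sketch of the BCP export mentioned above; the database, table, file, and server names are placeholders. -c selects character mode, -t sets the field terminator, and -T uses a trusted (Windows) connection:

bcp AutomationDW.dbo.DriveReport out drive_report.txt -c -t"|" -T -S PRODSQL01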

Environment: UNIX, Shell Scripting, SQL Server 7.0/2000, Enterprise Manager, SQL Profiler, DTS, T-SQL, Replication.
