Sr. ETL/Informatica Developer Resume
PROFESSIONAL SUMMARY:
- Eight plus (8+) years of IT experience in the Analysis, Design, Development, Testing and Implementation of business application systems for Health care, Pharmaceutical, Financial, Telecom and Manufacturing Sectors.
- Strong experience in the Analysis, design, development, testing and Implementation of Business Intelligence solutions using Data Warehouse/Data Mart Design, ETL, OLAP, BI, Client/Server applications.
- Strong Data Warehousing ETL experience using Informatica 9.1/8.6.1/8.5/8.1/7.1 PowerCenter Client tools (Mapping Designer, Repository Manager, Workflow Manager/Monitor) and Server tools (Informatica Server, Repository Server Manager).
- Expertise in Data Warehouse/Data mart, ODS, OLTP and OLAP implementations teamed with project scope, Analysis, requirements gathering, data modeling, Effort Estimation, ETL Design, development, System testing, Implementation and production support.
- Extensive ETL testing experience using Informatica 9.1/8.6.1/8.5/8.1/7.1/6.2/5.1 (PowerCenter/PowerMart) (Designer, Workflow Manager, Workflow Monitor and Server Manager), Teradata and Business Objects.
- Strong experience in Dimensional Modeling using Star and Snow Flake Schema, Identifying Facts and Dimensions, Physical and logical data modeling using ERwin and ER-Studio.
- Expertise in working with relational databases such as Oracle 11g/10g/9i/8x, SQL Server 2008/2005, DB2 8.0/7.0, UDB, MS Access and Teradata.
- Strong experience in Extraction, Transformation and Loading (ETL) of data from various sources into Data Warehouses and Data Marts using Informatica Power Center (Repository Manager, Designer, Workflow Manager, Workflow Monitor, Metadata Manager), Power Exchange and Power Connect as ETL tools on Oracle, DB2 and SQL Server databases.
- Experience using the SAS ETL tool, Talend and SAS Enterprise Data Integration Server.
- Extensive experience in developing Stored Procedures, Functions, Views and Triggers, Complex SQL queries using SQL Server, TSQL and Oracle PL/SQL.
- Experience in resolving on-going maintenance issues and bug fixes; monitoring Informatica sessions as well as performance tuning of mappings and sessions.
- Experience in all phases of Data warehouse development, from requirements gathering through code development, unit testing and documentation.
- Extensive experience in writing UNIX shell scripts and automation of the ETL processes using UNIX shell scripting.
- Proficient in the Integration of various data sources with multiple relational databases like Oracle 11g/10g/9i, MS SQL Server, DB2, Teradata, VSAM files and Flat Files into the staging area, ODS, Data Warehouse and Data Mart.
- Experience in using Automation Scheduling tools like Autosys and Control-M.
- Worked extensively with slowly changing dimensions.
- Hands-on experience across all stages of Software Development Life Cycle (SDLC) including business requirement analysis, data mapping, build, unit testing, systems integration and user acceptance testing.
- Excellent interpersonal and communication skills; experienced in working with senior-level managers, business users and developers across multiple disciplines.
Education:
Friends University, Wichita, KS
- B.S. Computer Information System with Mathematics as minor.
- MBA Student
Technical Skills:
Operating Systems: Windows 2008/2007/2005/NT/XP, UNIX, MS-DOS
ETL Tools: Informatica Power Center 9.1/8.6/8.5/8.1/7.1 (Designer, Workflow Manager, Workflow Monitor, Repository manager and Informatica Server)
Databases: Oracle 11g/10g/9i/8i, MS SQL Server 2008/2005, DB2 v8.1, Teradata.
Data Modeling tools: Erwin, MS Visio
OLAP Tools: Cognos 8.0/8.1/8.2/8.4/7.0, Business Objects XI R2/6.x/5.x, OBIEE 10.1.3.4/10.1.3.3
Languages: SQL, PL/SQL, UNIX Shell scripting, C++
Scheduling Tools: Autosys, Control-M
Testing Tools: QTP, WinRunner, LoadRunner, Quality Center, Test Director
Professional Experience:
Confidential, Denver, CO Sept 2011 to May 2012
Sr. ETL/Informatica Developer
Description: Qwest is a large telecommunications carrier. Qwest Communications provides long-distance services and broadband data, as well as voice and video communications, globally. This project involved developing a data warehouse from different data feeds and other operational data sources.
Built a central database with data coming from different sources such as Oracle, SQL Server and flat files. Actively involved as an analyst in preparing design documents and interacted with the data modelers to understand the data model and design the ETL logic.
Responsibilities:
- Responsible for Business Analysis and Requirements Collection.
- Worked on Informatica Power Center tools- Designer, Repository Manager, Workflow Manager, and Workflow Monitor.
- Translated the high-level design specification into simple ETL coding and mapping standards.
- Designed and customized data models for the Data warehouse, supporting data from multiple sources in real time.
- Involved in building the ETL architecture and Source to Target mapping to load data into Data warehouse.
- Created mapping documents to outline data flow from sources to targets.
- Involved in Dimensional modeling (Star Schema) of the Data warehouse and used Erwin to design the business process, dimensions and measured facts.
- Extracted data from flat files and other RDBMS sources into the staging area and populated it into the Data warehouse.
- Maintained source definitions, transformation rules and target definitions using Informatica Repository Manager.
- Used various transformations like Filter, Expression, Sequence Generator, Update Strategy, Joiner, Stored Procedure, and Union to develop robust mappings in the Informatica Designer.
- Developed mapping parameters and variables to support SQL override.
- Created mapplets to use them in different mappings.
- Developed mappings to load into staging tables and then to Dimensions and Facts.
- Used existing ETL standards to develop these mappings.
- Worked on different tasks in Workflows such as Sessions, Event Raise, Event Wait, Decision, E-mail, Command, Worklets, Assignment, Timer and scheduling of the workflow.
- Created sessions and configured workflows to extract data from various sources, transform it, and load it into the data warehouse.
- Used Type 1 and Type 2 SCD mappings to update Slowly Changing Dimension tables.
- Extensively used SQL*Loader to load data from flat files into Oracle database tables.
- Modified existing mappings for enhancements of new business requirements.
- Used Debugger to test the mappings and fixed the bugs.
- Wrote UNIX shell scripts and pmcmd commands for FTP of files from a remote server and backup of the repository and folders (see the shell sketch after this list).
- Involved in Performance tuning at source, target, mappings, sessions, and system levels.
- Prepared migration document to move the mappings from development to testing and then to production repositories.
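As an illustration of the kind of scripting described above, here is a minimal UNIX shell (ksh) sketch; the host, domain, repository, folder and file names are placeholders rather than the actual project values, and the real scripts handled error checking and credentials more carefully.

```sh
#!/bin/ksh
# Sketch only: host, domain, repository, folder and workflow names are placeholders.

SRC_HOST=remote.example.com
SRC_DIR=/outbound
LOCAL_DIR=/data/landing

# 1. Pull the daily flat file from the remote server over FTP.
ftp -n "$SRC_HOST" <<EOF
user ftpuser ftppassword
lcd $LOCAL_DIR
cd $SRC_DIR
binary
get daily_feed.dat
bye
EOF

# 2. Back up the PowerCenter repository with pmrep.
pmrep connect -r DEV_REPO -d DEV_DOMAIN -n repo_user -x repo_password
pmrep backup -o /backup/DEV_REPO_$(date +%Y%m%d).rep

# 3. Start the load workflow with pmcmd and wait for it to finish.
pmcmd startworkflow -sv INT_SVC -d DEV_DOMAIN -u pc_user -p pc_password \
    -f DW_FOLDER -wait wf_load_stage
if [ $? -ne 0 ]; then
    echo "wf_load_stage failed" >&2
    exit 1
fi
```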
Environment: Informatica Power Center 8.6.1, Workflow Manager, Workflow Monitor, Informatica Power Connect / Power Exchange, Data Analyzer 8.1, PL/SQL, Oracle 10g/9i, Erwin, Autosys, SQL Server 2005, Sybase, UNIX AIX, Toad 9.0, Cognos 8.
Confidential, Stamford, CT Nov 2010 to July 2011
Role: ETL Consultant
XL Global Services Inc. provides the backbone Information Technology support to the XL Capital group of companies, a leading provider of insurance and reinsurance coverage, innovative risk management and financial solutions. As part of providing financial solutions, XL Global Services Inc. generates various reports presenting a comprehensive credit and risk analysis for its customers. The project was designed to develop and maintain Data Marts, loading data from various centers, held in different source systems, using ETL tools.
Responsibilities:
- Performed logical and physical data modeling using Erwin for the data warehouse database in a Star Schema.
- Using Informatica PowerCenter Designer, analyzed the source data to extract and transform it from various source systems (Oracle 10g, DB2, SQL Server and flat files), incorporating business rules using different objects and functions that the tool supports.
- Using Informatica PowerCenter created mappings and mapplets to transform the data according to the business rules.
- Used various transformations like Source Qualifier, Joiner, Lookup, SQL, Router, Filter, Expression and Update Strategy.
- Implemented slowly changing dimensions (SCD) for some of the Tables as per user requirement.
- Developed Stored Procedures and used them in the Stored Procedure transformation for data processing, and used data migration tools.
- Documented Informatica mappings in Excel spreadsheets.
- Tuned the Informatica mappings for optimal load performance.
- Used the BTEQ, FEXP (FastExport), FLOAD (FastLoad) and MLOAD (MultiLoad) Teradata utilities to export and load data to/from flat files (see the FastLoad sketch after this list).
- Created and Configured Workflows and Sessions to transport the data to target warehouse Oracle tables using Informatica Workflow Manager.
- Generated reports using OBIEE 10.1.3 for future business use.
- Carried primary responsibility for problem determination and resolution for each SAP application system database server and application server.
- Worked with the UNIX team to write UNIX shell scripts for customizing server job scheduling.
- Constantly interacted with business users to discuss requirements.
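To illustrate the Teradata loads mentioned above, here is a minimal FastLoad wrapper sketch in UNIX shell; the TDP id, credentials, database, table and file names are placeholders, not the actual project values (the real jobs also used BTEQ, FastExport and MultiLoad).

```sh
#!/bin/ksh
# Sketch only: TDP id, credentials, database, table and file names are placeholders.

DATAFILE=/data/export/policy_feed.txt

# Drive Teradata FastLoad from a here-document to bulk-load a pipe-delimited
# flat file into a staging table.
fastload <<EOF
LOGON tdprod/etl_user,etl_password;
DATABASE stage_db;
SET RECORD VARTEXT "|";
BEGIN LOADING stage_db.stg_policy
    ERRORFILES stage_db.stg_policy_err1, stage_db.stg_policy_err2;
DEFINE policy_id  (VARCHAR(20)),
       policy_amt (VARCHAR(20))
    FILE=$DATAFILE;
INSERT INTO stage_db.stg_policy (policy_id, policy_amt)
VALUES (:policy_id, :policy_amt);
END LOADING;
LOGOFF;
EOF
```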
Environment: Informatica PowerCenter Designer 8.6/8.1, Informatica Repository Manager, Oracle 10g/9i, DB2 6.1, Erwin, TOAD, SAP 3.1.H, UNIX (SunOS), PL/SQL, SQL Developer
Confidential, Pittsburgh, PA Aug 2008 – Sept 2009
Sr. ETL/Informatica Developer
Project(s): EPC BI / EQUITRANS BI/ EGC BI / ETRM BI
Description: EQT Corporation is an integrated energy company, supplying natural gas, crude oil and gas-related services to its customers. The main objective of the project was to help the organization's decision-making team monitor and improve sales and explore avenues for new business opportunities. The DW team is responsible for building the Global Data Warehouse and providing reports for the Production and Midstream groups. Worked on four capital projects: EPC BI, EQUITRANS BI, EGC BI and ETRM BI. Data was extracted from flat files, Oracle, SQL Server and DB2 into the Operational Data Store (ODS), and data from the ODS was extracted, transformed and loaded, with business logic applied, into the Global Data Warehouse using Informatica PowerCenter 9.1.0 tools.
Responsibilities:
- Interacted with Data Modelers and Business Analysts to understand the requirements and the impact of the ETL on the business.
- Designed ETL specification documents for all the projects.
- Created Tables, Keys (Unique and Primary) and Indexes in the SQL server.
- Extracted data from flat files, DB2, SQL Server and Oracle to build the Operational Data Store. Applied business logic to load the data into the Global Data Warehouse.
- Extensively worked on Facts and Slowly Changing Dimension (SCD) tables.
- Maintained source and target mappings, transformation logic and processes to reflect the changing business environment over time.
- Used various transformations like Filter, Router, Expression, Lookup (connected and unconnected), Aggregator, Sequence Generator, Update Strategy, Joiner, Normalizer, Sorter and Union to develop robust mappings in the Informatica Designer.
- Extensively used the Add Currently Processed Flat File Name port to capture the flat file name and to load the contract number derived from the file name into the target.
- Worked on complex Source Qualifier queries, Pre and Post SQL queries in the Target.
- Worked on different tasks in Workflow Manager like Sessions, Events raise, Event wait, Decision, E-mail, Command, Worklets, Assignment, Timer and Scheduling of the workflow.
- Extensively used workflow variables, mapping parameters and mapping variables.
- Created sessions, batches for incremental load into staging tables and scheduled them to run daily.
- Used shortcuts to reuse objects without creating multiple objects in the repository and inherit changes made to the source automatically.
- Implemented Informatica recommendations, methodologies and best practices.
- Implemented performance tuning logic on Targets, Sources, Mappings and Sessions to provide maximum efficiency and performance.
- Involved in Unit, Integration, System, and Performance testing levels.
- Wrote documentation describing program development, logic, coding, testing, changes and corrections.
- Migrated the code into the QA (testing) repository and supported the QA team and UAT (user acceptance testing).
- Created detailed Unit Test Document with all possible Test cases/Scripts.
- Conducted code reviews of code developed by teammates before moving it into QA.
- Provided support to develop the entire warehouse architecture and plan the ETL process.
- Modified existing mappings for enhancements of new business requirements.
- Prepared a migration document to move the mappings from the development to the testing and then to the production repositories (see the pmrep sketch after this list).
- Involved in production support.
- Worked as a fully contributing team member, under broad guidance, with independent planning and execution responsibilities.
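To illustrate the migration step referenced above, here is a minimal pmrep sketch assuming hypothetical repository, folder and workflow names; the actual promotion followed the migration document, with an import control file mapping DEV folders and connections to their TEST equivalents.

```sh
#!/bin/ksh
# Sketch only: repository, domain, folder and workflow names are placeholders.

# Export a workflow from the development repository to XML.
pmrep connect -r DEV_REPO -d DEV_DOMAIN -n repo_user -x repo_password
pmrep objectexport -n wf_load_gdw -o workflow -f EQT_DEV \
    -u /migration/wf_load_gdw.xml

# Import the XML into the test repository; the control file maps DEV folders
# and connections to their TEST equivalents.
pmrep connect -r TEST_REPO -d TEST_DOMAIN -n repo_user -x repo_password
pmrep objectimport -i /migration/wf_load_gdw.xml -c /migration/import_control.xml
```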
Environment: Informatica PowerCenter 9.1.0, Oracle 11g, SQL Server 2008, IBM iSeries (DB2), MS Access, Windows XP, Toad, Tidal, Cognos 8.4.1, SQL Developer.
Confidential, NJ October 2007 - July 2008
Role: Sr ETL Developer
NYK Lines is one of the world's premier full-service intermodal carriers. The company utilizes a vast network of ocean vessels, barges, railroads and motor carriers to link the international shipper with the consignee. Services offered include intermodal services, terminals and warehousing, insurance, and repair and maintenance.
Modules: Shipment Data Mart, Job order Cost Mart, Net Contribution Mart, DnD Mart (Detention & Demurrage)
Responsibilities:
Environment: Informatica Power Center 8.6/8.1, SQL*Loader, IDOC, RFC, HP Quality Center, Oracle9i/10g, AUTOSYS, Rational Clear case, Rational Clear Quest, Windows XP, TOAD, UNIX.
Confidential, Chicago June 2006 to Sept 2007
Role: Sr. ETL Developer
The objective of this project was to develop an Enterprise Data Warehouse (EDW) that integrates the business into a single environment. The data warehouse provides easy access to detailed data on a single platform and facilitates enterprise-wide data analysis and reporting within the business environment. The Data Warehouse was built using Informatica Power Center 8.6.1, extracting data from various sources including flat files, SAP-ABAP, Teradata and Oracle.
Responsibilities:
- Analyzed the requirements and framed the business logic for the ETL process.
- Extracted data from Oracle as one of the source databases.
- Involved in JAD sessions for the requirements gathering and understanding.
- Involved in the ETL design and its documentation.
- Interpreted logical and physical data models for business users to determine common data definitions and establish referential integrity of the system using ER-STUDIO.
- Followed Star Schema to design dimension and fact tables.
- Experienced in handling slowly changing dimensions.
- Collected and linked metadata from diverse sources, including relational databases (Oracle), XML and flat files.
- Responsible for the development, implementation and support of the databases.
- Used PL/SQL extensively to design and develop functions, procedures, triggers and packages.
- Developed mappings in Informatica to load the data including facts and dimensions from various sources into the Data Warehouse, using different transformations like Source Qualifier, JAVA, Expression, Lookup, Aggregate, Update Strategy and Joiner.
- Developed reusable Mapplets and Transformations.
- Used the data integrator tool to support batch and real-time integration and worked on the staging and integration layers.
- Optimized the performance of the mappings through various tests on sources, targets and transformations.
- Designed and developed Informatica mappings and workflows; identified and removed bottlenecks to improve the performance of mappings and workflows.
- Reviewed existing code and led efforts to tweak and tune the performance of existing Informatica processes.
- Scheduled sessions to extract, transform and load data into the warehouse database based on business requirements.
- Scheduled the tasks using Autosys.
- Loaded flat file data into the staging area using Informatica.
- Created shell scripts for generic use.
- Created high-level design documents and technical specifications, performed coding and unit testing, and resolved defects using Quality Center 10.
- Developed unit/assembly test cases and UNIX shell scripts to run along with daily/weekly/monthly batches to reduce or eliminate manual testing effort (see the row-count sketch below).
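As an example of the kind of test script mentioned above, here is a minimal UNIX shell sketch that compares staging and warehouse row counts after a batch run; the connection strings and table names are placeholders, not the actual project objects.

```sh
#!/bin/ksh
# Sketch only: connection strings and table names are placeholders.
# Quick assembly-test check: compare the staging row count with the rows
# loaded into the warehouse fact table for today's batch.

SRC_COUNT=$(sqlplus -s etl_user/etl_pwd@SRCDB <<EOF
set heading off feedback off pagesize 0
select count(*) from stg_orders;
exit;
EOF
)
SRC_COUNT=$(echo $SRC_COUNT)   # trim whitespace from the sqlplus output

TGT_COUNT=$(sqlplus -s etl_user/etl_pwd@DWHDB <<EOF
set heading off feedback off pagesize 0
select count(*) from dw_fact_orders where load_date = trunc(sysdate);
exit;
EOF
)
TGT_COUNT=$(echo $TGT_COUNT)

if [ "$SRC_COUNT" -ne "$TGT_COUNT" ]; then
    echo "Row count mismatch: source=$SRC_COUNT target=$TGT_COUNT" >&2
    exit 1
fi
echo "Row counts match: $SRC_COUNT rows"
```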
Environment: Windows XP/NT, Informatica PowerCenter 9.1/8.6, UNIX, Teradata V14, Oracle 11g, Oracle Data Integrator, SQL, PL/SQL, SQL Developer, Erwin, Oracle Designer, MS Visio, Autosys, Korn Shell, Quality Center 10.
Confidential, NJ Feb 2005 to Mar 2006
Role: Informatica Developer
Merrill Lynch is the wealth management division of Bank of America, providing corporate finance & investment banking services. The objective of the project was to build a data warehouse for customer investment deposits, funding accounts and corporate services. Data for customers, accounts and transactions was extracted from multiple sources, transformed and loaded into the target database using an ETL tool.
Responsibilities:
- Analyzed the business requirements from the functional specification to design the ETL methodology in the technical specifications.
- Developed data conversion/quality/cleansing rules and executed data cleansing activities such as data consolidation, standardization and matching, using Trillium for the unstructured flat file data.
- Responsible for developing, support and maintenance for the ETL (Extract, Transform and Load) processes using Informatica Power Center 8.5.
- Experience in integration of heterogeneous data sources like Oracle, DB2, SQL Server and Flat Files (Fixed & delimited) into Staging Area.
- Wrote SQL-Overrides and used filter conditions in source qualifier thereby improving the performance of the mapping.
- Designed and developed mappings using Source Qualifier, Expression, Lookup, Router, Aggregator, Filter, Sequence Generator, Stored Procedure, Update Strategy, joiner and Rank transformations.
- Managed the Metadata associated with the ETL processes used to populate the Data Warehouse.
- Implemented complex business rules in Informatica Power Center by creating re-usable transformations, and robust Mapplets.
- Implemented performance tuning of Sources, Targets, Mappings and Sessions by identifying bottlenecks and used Debugger to debug the complex mappings and fix them.
- Improved session performance by enabling the Incremental Aggregation property to load incremental data into the target table.
- Worked with Functional team to make sure required data has been extracted and loaded and performed the Unit Testing and fixed the errors to meet the requirements.
- Copied/exported/imported the mappings/sessions/worklets/workflows from the development to the test repository and promoted them to production.
- Used session parameters and mapping variables/parameters and created parameter files to allow flexible workflow runs based on changing variable values.
- Worked with Static, Dynamic and Persistent Cache in lookup transformation for better throughput of Sessions.
- Used the pmcmd command through UNIX to automate Power Center sessions and workflows (see the shell sketch after this list).
- Gathered business requirements from Business Analyst.
- Designed and implemented appropriate ETL mappings to extract and transform data from various sources to meet requirements.
- Designed and developed Informatica ETL mappings to extract master and transactional data from heterogeneous data feeds and load it into the target tables.
- Installed and Configured the Informatica Client tools.
- Worked on loading of data from several flat files to XML Targets.
- Designed the procedures for getting the data from all systems to Data Warehousing system.
- Created the environment for the staging area and loaded the staging area with data from multiple sources.
- Analyzed business process workflows and assisted in the development of ETL procedures for moving data from source to target systems.
- Used workflow manager for session management, database connection management and scheduling of jobs.
- Created UNIX shell scripts for Informatica ETL tool to automate sessions.
- Monitored scheduled, running, completed and failed sessions using the Workflow Monitor and debugged mappings for failed sessions.
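For illustration of the parameter-file and pmcmd automation above, here is a minimal UNIX shell sketch; the folder, workflow, parameter names and paths are placeholders, not the actual project values.

```sh
#!/bin/ksh
# Sketch only: folder, workflow, parameter names and paths are placeholders.

RUN_DATE=$(date +%Y-%m-%d)
PARAM_FILE=/infa/param/wf_load_accounts.parm

# Write a parameter file for this run; the backslashes keep the literal $$
# prefix that Informatica mapping parameters and variables use.
cat > "$PARAM_FILE" <<EOF
[ML_FOLDER.WF:wf_load_accounts]
\$\$RUN_DATE=$RUN_DATE
\$\$SRC_SYSTEM=DEPOSITS
EOF

# Start the workflow with the freshly generated parameter file.
pmcmd startworkflow -sv INT_SVC -d PROD_DOMAIN -u pc_user -p pc_password \
    -f ML_FOLDER -paramfile "$PARAM_FILE" -wait wf_load_accounts
```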
Environment: Informatica Power Center 8.5, Oracle 10g, SQL Server 2005, DB2, SQL*Plus, SQL*Loader, SQL Developer, Autosys, Flat files, UNIX, Windows 2000
Confidential, Seattle, WA Aug 2003 to Jan 2005
Role: ETL DEVELOPER
T-Mobile is one of the largest telecom companies in the USA. Joined the existing onshore BI team as an ETL Developer and successfully designed and developed business solutions. The project aimed to fulfill T-Mobile's reporting needs to better understand market trends, behavior and future opportunities and to improve their decision-making process. Coordinated with the business and P&A teams to understand the system requirements, then analyzed and designed ETL solutions to meet them. Involved in various successful releases to accomplish T-Mobile's reporting needs under the order-activation (OA) functional area.
Responsibilities:
Environment: Informatica Power Center 5.1.2/7.1, Erwin 4.5, Oracle 9i, Windows NT, Flat files, SQL, Relational Tools, Clear Case, UNIX (HP-UX, Sun Solaris, AIX) and UNIX Shell Scripts.