Informatica Resume
Fort Worth, TX
SUMMARY
- 8 years of experience in the analysis, design, development, testing and implementation of client/server applications, programming languages, query and reporting tools and customer support, with 6 years of experience in Data Warehouse applications.
- Extensive Data Warehousing experience using Informatica Power Center (Repository Manager, Designer, Server Manager) and Informatica Power Mart.
- Extensive experience working with SQL Server Integration Services (SSIS) and Data Transformation Services (DTS).
- Experience in designing complex reports, including sub-reports and formulas with complex logic, using SQL Server Reporting Services (SSRS).
- Experience in performance tuning of sources, worklets, transformations, mappings, targets, workflows, sessions and batches.
- Experience in building Universes, Stored Procedures and Free Hand SQL methods, and creating complex and ad hoc reports using Business Objects.
- Over 5 years of experience using SQL and PL/SQL.
- Worked with different data sources including flat files, VSAM files, XLS files and Oracle databases.
- Hands-on experience in Star and Snowflake schema design.
- Expertise in implementing complex Business rules by creating robust mappings, mapplets, reusable transformations and Partitioning.
- Extensive experience in Shell Scripting (Ksh/Csh).
- Exceptional analytical and problem solving skills.
SKILLS
Languages: Java, C, C++, SQL, PL/SQL, JavaScript, Visual Basic 5.0/6.0, Visual Studio
ETL Tools: Informatica Power Mart 7.1/6.2.1/5.x, Power Center 8.5/8.1.1/7.1/6.2.1/5.1.1, DTS, SQL Server Integration Services
Reporting Tools: Business Objects 6.5/XIR2, Cognos Impromptu/Powerplay, ReportNet, SQL Server Reporting Services, SQL Server Analysis Services
Testing Tools: Win Runner 6.5/7.0/7.5, Quick Test Pro 6.5/8.2, Load Runner 5.0/7.0/8.0/8.1, Test Director, Quality Center 8.2, Clear Quest
Protocols/API: TCP/IP, UDP, UNIX network Programming.
OS: Unix, AIX 5.2, Linux, Sun Solaris 5.8, WIN 95/98/NT/2000 Server
DBMS: Oracle 10g/9i/8i/7.x, Teradata V2R4, Sybase 12, DB2/400(AS/400), MySQL, SQL Server 6.5/7.0/2000/2005, MS Access 2000/97.
Data Modeling: Erwin 3.5/4.0, Rational Rose, UML
Others: Toad 7.0, SQL*Loader, Dreamweaver, Flash, Rational Rose, MS Visio, Autosys 4.0
PROFESSIONAL EXPERIENCE
Confidential, Fort Worth, TX Feb 08 - Present
Role: Sr. Informatica Developer
Description: Confidential is one of the world's leading and most progressive companies in the transportation industry. It transports almost every industrial and consumer product found in our world, touching food, shelter, clothing and energy. The purpose of this project is the enhancement of Envision to provide external and internal users with reliable, timely and accurate data that can be easily analyzed. The project involves migrating data from the Legacy data sources to SAP. The Legacy source data has to be loaded into SAP, with the necessary conversions, using interfaces such as RFC, BAPI, IDOC, LSMW and BDC.
Responsibilities:
- Worked with the Business analysts for requirements gathering, business analysis, testing and for project coordination.
- Participated in the design, development and deployment of the ETL tool Informatica and data management processes.
- Worked through the full project life cycle, from analysis to production implementation, with emphasis on identifying the sources and validating source data, developing the required logic and transformations, and creating mappings to load the data into different targets.
- Created Informatica ETL mappings that transfer data from mainframe source systems such as VSAM datasets and DB2 to SAP and flat files, and vice versa from SAP back to VSAM datasets.
- Involved in creating Mappings to load data using the standard and custom Interfaces like RFC, BAPI, and IDOC.
- Used ETL tool Informatica to extract and load data from Mainframes, Oracle and flat files to SAP.
- Created Parameter Files in many of the Complex Mappings to virtually create a User Interface.
- Implemented SCD methodology, including Type 1 and Type 2 changes, to keep track of historical data (a database-side sketch of the Type 2 logic follows this list).
- Involved in Generating and Installing SAP R/3 code for the Mappings.
- Developed data-driven approaches to support event-based job workflow scheduling.
- Used FTP tools such as Core FTP to retrieve the log files and the target files created on the UNIX server.
- Created and scheduled Worklets. Setup workflow and Tasks to schedule the loads at required frequency using Workflow Manager.
- Designed and developed Informatica mappings for data loads and data cleansing. Extensively worked on Informatica Designer and Workflow Manager.
- Extensively used many of the transformations of Informatica including lookups, Stored Procedures, Update Strategy and others.
- Developed sessions and Batches using Informatica Workflow Manager.
- Extensively worked in performance tuning of the programs, ETL Procedures and processes.
- Involved in performance improvements to the database by building Partitioned tables, Index Organized Tables and Bitmap Indexes (an illustrative DDL sketch appears after the environment listing for this role).
- Used SQL tools like TOAD to run SQL queries and validate the data in warehouse and mart.
- Converted all Desktop Intelligence reports to Web Intelligence Reports using Report conversion tool.
- Involved in fixing invalid mappings, testing of Stored Procedures and Functions, and Unit and Integration testing of Informatica Sessions, Batches and the Target Data.
- Designed and developed table structures, stored procedures, and functions to implement business rules.
- Participated in design reviews of the report system developed in Business Objects.
- Tested the data and data integrity among various sources and targets. Worked with the Production support team on various performance-related issues.
- Created report prototypes using Web Intelligence XIR2 for user presentation.
- Experienced in documenting ETL design and test results.
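The sketch below illustrates the kind of SCD Type 2 logic referred to above, expressed as database-side Oracle SQL rather than as the Informatica mapping itself; the table, column and sequence names (stg_customer, dim_customer, dim_customer_seq) are hypothetical examples, not project objects.

```sql
-- Step 1: expire the current dimension row when a tracked attribute changes.
UPDATE dim_customer d
   SET d.current_flag       = 'N',
       d.effective_end_date = TRUNC(SYSDATE) - 1
 WHERE d.current_flag = 'Y'
   AND EXISTS (SELECT 1
                 FROM stg_customer s
                WHERE s.customer_id = d.customer_id
                  AND (s.customer_name      <> d.customer_name
                       OR s.customer_segment <> d.customer_segment));

-- Step 2: insert a new current version for changed or newly arrived customers.
INSERT INTO dim_customer
       (customer_key, customer_id, customer_name, customer_segment,
        effective_start_date, effective_end_date, current_flag)
SELECT dim_customer_seq.NEXTVAL, s.customer_id, s.customer_name,
       s.customer_segment, TRUNC(SYSDATE), DATE '9999-12-31', 'Y'
  FROM stg_customer s
 WHERE NOT EXISTS (SELECT 1
                     FROM dim_customer d
                    WHERE d.customer_id  = s.customer_id
                      AND d.current_flag = 'Y');
```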
Environment: Informatica Power Center 8.5, SAP R/3 7.1, TOAD, Oracle 9i/10g, UNIX, Core FTP, Windows NT, Teradata, BTEQ, Tibco, Business Objects XIR2, DB2.
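The DDL below is a minimal, hypothetical illustration of the partitioned table, index-organized table and bitmap index work mentioned above; fact_shipment, lkp_status and their columns are invented examples, not the actual warehouse objects.

```sql
-- Range-partitioned fact table: monthly partitions keep loads manageable and
-- let the optimizer prune partitions on date-bounded queries.
CREATE TABLE fact_shipment (
    shipment_key   NUMBER        NOT NULL,
    ship_date      DATE          NOT NULL,
    origin_key     NUMBER        NOT NULL,
    status_code    VARCHAR2(10)  NOT NULL,
    freight_amount NUMBER(12,2)
)
PARTITION BY RANGE (ship_date) (
    PARTITION p_2008_01 VALUES LESS THAN (DATE '2008-02-01'),
    PARTITION p_2008_02 VALUES LESS THAN (DATE '2008-03-01'),
    PARTITION p_future  VALUES LESS THAN (MAXVALUE)
);

-- Bitmap index on a low-cardinality column; LOCAL so each partition gets its
-- own index segment.
CREATE BITMAP INDEX bix_fact_shipment_status
    ON fact_shipment (status_code) LOCAL;

-- Index-organized table for a small lookup kept physically sorted by its key.
CREATE TABLE lkp_status (
    status_code VARCHAR2(10) PRIMARY KEY,
    status_desc VARCHAR2(100)
) ORGANIZATION INDEX;
```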
Confidential, Collegeville, PA Feb 06 - Jan 08
Role: Data Warehouse Developer
Description: Confidential is one of the world's largest research-based pharmaceutical companies. It is a global leader in pharmaceuticals, consumer health care products and animal health care products. The project deals with building a real-time view of enterprise-wide data. A decision support system was built to compare and analyze product prices, their quantities and patient profiles. IMS Health data is combined with data from other sources and made available for ad hoc reporting. This Data Warehouse is used to deliver reports and information to sales and marketing management.
Responsibilities:
- Migrated SQL Server 2000 database objects to SQL Server 2005.
- Developed stored procedures for schedules, reports and other client interfaces.
- Migrated DTS objects to the SQL Server Integration Services (SSIS) environment.
- Migrated Microsoft Analysis Services to SQL Server Analysis Services (SSAS)
- Designed and deployed cubes in the SSAS environment using Snowflake and Star schema designs.
- Redesigned and deployed reports using Reporting Services 2005.
- The objective was to build a cube (business requirements, including KPIs, were defined before development) and to mentor internal staff on procedures.
- Migrated data from the SAS environment to SQL Server 2005 via SQL Server Integration Services (SSIS).
- Applied conversion and transformation procedures according to business requirements. Designed and built an SSAS cube with 10 dimensions and built partitions spanning 8 years by month.
- Added three perspective views and three drill-through actions.
- Modified and added new MDX queries to cubes.
- Managed and rebuilt partitions for cubes.
- Wrote a .NET application to process and rebuild multiple cube partitions.
Environment: SQL Server 2005, SQL Server 2000, VB.Net, Oracle 10g, Erwin, SQL Server Integration Services (SSIS), SQL Server Analysis Services (SSAS), SQL Server Reporting Services 2005, .Net Visual Studio 2005
Confidential, Atlanta, GA May 04 - Jan 06
Role: Sr.Informatica Developer
Description: Confidential is an insurance firm serving individuals with a wide range of insurance products and insurance-related services. Its products include Auto Insurance, Homeowners Insurance and Life Insurance. The Claims Management System provides the technology that assists claim professionals in administering claim practices in a timely and effective manner. This project involved the design and development of the Data Warehouse. The company's data came from different operational sources like DB2, Oracle and SQL Server and was then loaded into the Claims Data Warehouse built on Oracle using Informatica. Various Business Rules and Business Processes were applied to Extract, Transform and Load the data into the Data Warehouse.
Responsibilities:
- Worked on Dimensional modeling to design and develop STAR Schemas, identified Fact and Dimension Tables.
- Identified and tracked the slowly changing dimensions, heterogeneous Sources and determined the hierarchies in dimensions.
- Created users and groups, assigned users to groups, and granted privileges and permissions to groups and folders.
- Extensively used transformations such as Router, Aggregator, Normalizer, Source Qualifier, Joiner, Expression and Sequence Generator transformations.
- Used workflow manager for session management, database connection management and scheduling of jobs to be run in the batch process.
- Designed and developed transformations, mappings and sessions in Informatica to load target database and tuned mappings for improving performance.
- Tuned performance of Informatica sessions for large data files by increasing block size, data cache size, sequence buffer length and target based commit interval.
- Developed Shell Scripts, PL/SQL procedures, and table and index creation scripts (a representative PL/SQL sketch appears after the environment listing for this role).
- Created Pipeline partitioning to improve Session performance.
- Created and scheduled sessions and batches using Workflow Manager to load the data into the target database.
- Involved in massive data cleansing of the production data load.
- Scheduled Sessions and Batch Processes based on demand, run on time, run only once using Informatica Server Manager.
- Analyzed the systems, met with end users and business units in order to define the requirements.
- Responsible for tuning ETL procedures and STAR schemas to optimize load and query performance.
- Worked with different Sources such as Oracle, SQL Server, Excel, Flat and COBOL files. Used Informatica to extract data into Data Warehouse.
- Designed and developed universes and reports according to the user requirements.
- Developed complex reports that involved multiple data providers, master/detail charts, complex variables and calculation contexts.
- Wrote documentation to describe program development, logic, coding, testing, changes and corrections.
- Performed Unit testing and System integration testing.
Environment: Informatica Power Center 7.0, Business Objects 6.5, DB2 7.2 UDB, SQL Server, SQL*Plus, PL/SQL, Oracle 9i, Teradata, Windows NT/2000, UNIX
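A representative sketch of the PL/SQL load procedures and index creation scripts mentioned above, assuming hypothetical staging and dimension tables (stg_policy, dim_policy); the real procedures implemented client-specific business rules.

```sql
CREATE OR REPLACE PROCEDURE load_dim_policy AS
BEGIN
    -- Upsert staged policy rows into the dimension table.
    MERGE INTO dim_policy d
    USING stg_policy s
       ON (d.policy_id = s.policy_id)
    WHEN MATCHED THEN
        UPDATE SET d.policy_status = s.policy_status,
                   d.premium_amt   = s.premium_amt,
                   d.updated_dt    = SYSDATE
    WHEN NOT MATCHED THEN
        INSERT (policy_id, policy_status, premium_amt, updated_dt)
        VALUES (s.policy_id, s.policy_status, s.premium_amt, SYSDATE);
    COMMIT;
EXCEPTION
    WHEN OTHERS THEN
        ROLLBACK;
        RAISE;
END load_dim_policy;
/

-- Supporting index on the staging key used by the MERGE join.
CREATE INDEX idx_stg_policy_id ON stg_policy (policy_id);
```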
Confidential, San Francisco, CA Aug 03 - Apr 04
Role: Data Warehouse Developer
Description: Confidential is one of the most recognized apparel brands in the world. In order to quickly identify customer needs and improve services to its huge subscriber base, a business intelligence system was implemented using Informatica's PowerCenter software. With PowerCenter, a large amount of customer-related data was collected from diverse sources, including customer billing, ordering, support and service usage. Business users then analyzed this data to understand which customers stayed the longest with the service and why, and to gauge the effectiveness of promotional activities. The results have been improved customer-retention efforts and more focused, effective customer promotions.
Responsibilities:
- Involved in developing the Data Warehouse using a Star schema (an illustrative DDL sketch appears after the environment listing for this role).
- Involved in conducting design sessions and reviews. Communicated with the users to observe various business rules in implementing the data warehouse.
- Analyzed the requirements, functional specifications and identified the source data to be moved to the Data Warehouse.
- Developed mappings using Informatica Power Center Designer.
- Configured Informatica with different source systems.
- Developed PL/SQL stored procedures for loading data into the data warehouse.
- Created, scheduled and monitored the sessions and batches using Workflow Manager.
- Configured Workflow manager to commit the expected rows.
- Worked with the pre-session and post-session settings and flat files.
- Developed the workflow diagrams and worked with technical specifications.
- Defined the program specifications for the data migration programs, as well as the necessary test plans used to ensure the successful execution of the data loading processes.
- Involved in developing the mapping, which contains all the Lookups used by all the facts to improve performance.
- Involved in Performance tuning of mappings and sessions.
Environment: Oracle 9i, PL/SQL, Informatica Power Center 6.2, Windows NT, ERwin, Unix
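The DDL below sketches a minimal star schema of the kind described above, with one fact table joined to its dimensions through surrogate keys; dim_customer, dim_date and fact_order are illustrative names only, not the actual warehouse model.

```sql
CREATE TABLE dim_customer (
    customer_key  NUMBER        PRIMARY KEY,
    customer_id   VARCHAR2(20)  NOT NULL,
    customer_name VARCHAR2(100),
    region        VARCHAR2(50)
);

CREATE TABLE dim_date (
    date_key     NUMBER PRIMARY KEY,
    calendar_dt  DATE   NOT NULL,
    fiscal_month VARCHAR2(7)
);

-- The fact table carries the measures and references each dimension by its
-- surrogate key.
CREATE TABLE fact_order (
    order_key    NUMBER       PRIMARY KEY,
    customer_key NUMBER       NOT NULL REFERENCES dim_customer (customer_key),
    date_key     NUMBER       NOT NULL REFERENCES dim_date (date_key),
    order_amount NUMBER(12,2),
    order_qty    NUMBER
);
```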
Confidential, Cleveland, OH Jun 02 - Jul 03
Role: Informatica Developer
Description: Confidential is a financial services organization. The company offers a wide range of financial products and services to both commercial and retail customers throughout its state. In order to perform advanced analysis of customer data to help offer better services, a data warehouse was developed in line with the business strategy.
Responsibilities:
- Involved in writing SQL stored procedures and Shell Scripts to access data from Oracle and MS SQL Server 7.0.
- Involved in the warehouse design using a Star Schema.
- Worked extensively on Source Analyzer, Mapping designer, Warehouse Designer.
- Joined relational tables existing in separate databases using Joiner Transformation.
- Used Designer to import the sources, targets, create various transformations, mappings for extraction, transformation and loading operational data into the Staging environment.
- Created unique primary key values to replace missing primary keys using reusable Sequence Generator transformations (a database-side analogue is sketched after the environment listing for this role).
- Loaded the Target tables using Look-up transformations to receive input values directly from another transformation in the pipeline.
- Used the Router Transformation to load user-defined groups for the business units based on group filter conditions.
Environment: Informatica Power Center 6.1, Oracle 8i, SQL Server, ERWin, Windows 2000.
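The mappings used a reusable Sequence Generator transformation for this; the sketch below shows a database-side analogue in Oracle, a sequence supplying unique key values during a staging-to-target load. Object names (dim_account_seq, stg_account, dim_account) are hypothetical.

```sql
CREATE SEQUENCE dim_account_seq START WITH 1 INCREMENT BY 1 CACHE 1000;

-- Assign a generated key to every incoming row so that records whose natural
-- primary key is missing still load with a unique identifier.
INSERT INTO dim_account (account_key, source_account_id, account_name)
SELECT dim_account_seq.NEXTVAL,
       s.account_id,          -- may be NULL in the source
       s.account_name
  FROM stg_account s;
```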
Confidential, New York, NY Dec 01 - May 02
Role: ETL Developer
Description: Confidential was started by BT in the US in 1993. It had global responsibility for BT's implementations of voice, data and transmission systems. Concert later entered into a joint venture relationship with MCI (ended in 1998), and later with AT&T, which also ended. Concert is now freestanding within BT and is known as BT Ignite. It continues to provide global voice and data networks.
The main objective of the Customer Data mart was to help migrate the organization from a product-centric orientation to one driven by customer needs. It provides a standardized means for accessing customer profiles scattered across varied heterogeneous systems in different states. Transactional, interaction, demographic, behavioral and self-provided profile data were the sources. Data came in from flat files, mainframe sources, XML and relational databases, was transformed using Informatica, and was loaded into the Oracle target.
Responsibilities:
- Interacted with end-users and functional analysts to identify requirements, develop the BRD, and transform it into technical requirements.
- Extensively used ETL to load data from Oracle database, XML files, and Flat files to Oracle.
- Extensively used Transformation Language functions in the mappings to produce the desired results.
- Worked on all the transformations like Lookup, Aggregator, Expression, Filter, Router, Update Strategy, Stored Procedure and Sequence Generator.
- Created and ran pre-existing and debug sessions in the Debugger to monitor and test the sessions prior to their normal run in the Server Manager.
- Extensively made use of several features in TOAD to keep track of the various source, staging, target tables and used UltraEdit-32 to view the Session log files.
- Involved in promoting the folders from Development to Production Repositories.
Environment: Power Center 5.1, Power Connect 4.2, Oracle 8i, DB2, TOAD, UltraEdit-32, ERWIN, HP-UX.
Confidential, Hyderabad, India Jan 01 - Nov 01
Role: Oracle Developer
Description: A system was designed to monitor the inventory of the organization. The system recorded all stores transactions such as material receipt against purchase order, material issue, stock adjustment vouchers, material returned to stores, and material receipt/issue from/to the production unit. The system kept information on stores in terms of quantity and value and updated purchase orders/manufacturing orders depending on the various stock transactions. It maintained records of rejected material and kept track of material consumption by work order and account head. It produced reports such as high stock value list, items below re-order level, stores ledger and sub-ledger, items above maximum level, cost center-wise/work order-wise material consumption, and lists of non-moving and moving items.
Responsibilities:
Environment: Oracle 8.0, Windows NT