Informatica Developer Resume
San Antonio, TX
Summary
9+ years of experience in the IT industry with a strong background in analysis, design, implementation, and development as an Informatica ETL Developer: 7+ years of experience using Informatica PowerCenter/PowerMart and 2 years of experience in .NET software development.
7+ years of experience using Informatica PowerCenter/PowerMart 9.x/8.x/7.x/6.x/5.x for ETL; extensively used ETL methodologies to support data extraction, transformation, and load processing in corporate-wide ETL solutions built with Informatica PowerCenter.
2 years of experience in .NET software development using C#, VB.NET, ASP.NET, SQL Server and Oracle.
Experience using Oracle 11g/10g/9i/8i, SQL, PL/SQL, SQL*Plus, DB2, Teradata V2R6/V2R5, MS SQL Server 2008/2005/2000/7.0/6.5, Associated Data, Sybase 12.0/11.0/10, IMS, and MS Access 7.0/2000.
Extensive experience in Informatica PowerCenter (versions 8.1/7.1/6.2) and its client tools: Designer, Workflow Manager, and Workflow Monitor.
Experience in design and development with Oracle, SQL, Teradata, and DB2.
Extensively worked on Data Cleansing, Data Profiling, Data Quality using Informatica Products.
Expertise in designing and developing mappings from varied transformation logic using filter, expression, rank, router, aggregator, joiner, lookup, and update strategy.
Experience in creating UNIX shell scripts.
Strong knowledge of and experience in data warehousing.
Experience in performance tuning of Informatica sources, targets, mappings, transformations, and sessions, and in determining performance bottlenecks.
Extensive experience writing SQL and PL/SQL programs: stored procedures, triggers, and functions for back-end development.
Extensive experience in pharmaceutical sales and marketing applications, with a strong understanding of pharma data.
Strong knowledge of Informatica production support.
Strong knowledge of Software Development Life Cycle (SDLC) including Requirement analysis, Design, Development, Testing, Support and Implementation. Provided End User Training and Support.
Excellent experience with job scheduling using Workflow Manager, Control-M, and Autosys.
Experienced in Programming, Performance Tuning using PL/SQL.
Generated business reports using Business Objects and Cognos as reporting tools as and when required.
Excellent interpersonal and communication skills, with the ability to work individually as well as in a team.
Involved in enterprise data warehouse testing: integrating source data, controlling data movement, and reviewing performance.
Highly motivated self-starter with ability to handle multiple projects and meet tight deadlines.
TECHNICAL SKILLS:
ETL Tools: Informatica PowerCenter 9.1.0/8.x/7.x/6.x
1. Designer
2. Workflow Manager
3. Workflow Monitor
4. Repository Manager
MSSQL Server 2008
1. Integration Services
2. Analysis Services
3. Reporting Services
Oracle Warehouse Builder
Database: Oracle 11g/10g/9i – SQL and PL/SQL
MS SQL Server 2008/2005 – SQL and T-SQL
MS-Access 2007
IMS
DB2
Associated Data
B2B
Teradata – SQL and stored procedures
Teradata utilities (BTEQ, FLOAD, MLOAD, TPT, Fast Export)
Scripting: UNIX shell
OLAP Tools: Cognos, MicroStrategy, Business Objects
Operating Systems: Windows 95/98/2000/2003/NT/XP, UNIX
Languages: C, C++, SQL, PL/SQL, HTML, VB.
Education: M.Sc. Computer Science – Osmania University, India, May 2001
Professional Experience
Client: Confidential San Antonio, TX May 2010 -- Present
Role: Informatica Developer
USAA is a Texas-based, Texas Department of Insurance regulated, unincorporated reciprocal inter-insurance exchange and Fortune 500 financial services company offering banking, investing, and insurance to people and families who serve, or have served, in the United States military.
Responsibilities:
Worked on Power Center client tools like Source Analyzer, Warehouse Designer, Mapping Designer, Mapplet Designer and Transformations Developer.
Used most of the transformations, such as Source Qualifier, Expression, Aggregator, connected and unconnected Lookup, Filter, Router, Sequence Generator, Sorter, Joiner, and Update Strategy.
Used Workflow manager for Workflow and Session Management, database connection management and Scheduling of jobs to be run in the batch process.
Developed shell scripts and PL/SQL procedures to create and drop tables and indexes for performance as part of pre- and post-session management.
Imported data from various sources, then transformed and loaded it into data warehouse targets using Informatica.
Used shortcuts to reuse objects without creating multiple objects in the repository and inherit changes made to the source automatically.
Developing Informatica mappings and tuning them when necessary.
Optimizing Query Performance, Session Performance.
Provided production support: maintained log files, screenshots, and run statistics.
Unit and System Testing of developed mapping.
Wrote documentation describing program development, logic, coding, testing, changes, and corrections.
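The pre- and post-session index maintenance mentioned above can be sketched as a shell wrapper that generates the SQL to run around a bulk load (table, index, and column names here are hypothetical placeholders; an actual session would pipe these files through sqlplus in the pre-/post-session commands):

```shell
#!/bin/sh
# Sketch of the pre-/post-session pattern: drop an index before a bulk load,
# then rebuild it and refresh statistics afterwards. Table and index names
# are hypothetical placeholders, not from the actual project.
TGT_TABLE="SALES_FACT"
TGT_INDEX="IDX_SALES_FACT_DT"

# Pre-session SQL: drop the index so the bulk insert avoids index maintenance.
cat > pre_session.sql <<EOF
DROP INDEX ${TGT_INDEX};
EOF

# Post-session SQL: rebuild the index and gather fresh optimizer statistics.
cat > post_session.sql <<EOF
CREATE INDEX ${TGT_INDEX} ON ${TGT_TABLE} (SALE_DATE);
EXEC DBMS_STATS.GATHER_TABLE_STATS(USER, '${TGT_TABLE}');
EOF

# A real run would execute these via: sqlplus -s user/pass @pre_session.sql
echo "generated pre_session.sql and post_session.sql"
```

Dropping the index before the load and rebuilding it afterwards is usually faster than letting the database maintain the index row by row during a large insert.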
Environment: Informatica PowerCenter 9.1.0 (ETL), Oracle 10g, DB2, ERwin 4.0, SQL, PL/SQL, SQL Server 2000, Windows NT/2000, HP-UX.
Client: Confidential Rochester, NY Mar 2009 – April 2010
Role: ETL/ Informatica Developer
Carestream Health Inc is a health imaging and information technology solutions company. Carestream provides medical and dental imaging systems and healthcare IT solutions; molecular imaging systems for the life science research and drug discovery/development market segments; and x-ray film and digital x-ray products for the non-destructive testing market worldwide.
Responsibilities:
Analyzed the source system and involved in designing the ETL data load.
Developed the ETL Mappings with Complex Transformations.
Created the Workflows in the Work Flow Manager for Running the Sessions and Tasks.
Created the Job Chains for Running the Sessions using Redwood Cronacle with Various Job Steps for Multiple Data Refresh.
Created data validation scripts for validating multiple data loads.
Used Autosys for scheduling; created various UNIX shell scripts for data cleansing and the loading process, and maintained the batch processes with UNIX shell scripts.
Extensively coded PL/SQL and SQL and tuned them for better performance.
Extensive SQL querying for Data Analysis.
Performed Data Analysis and Source-To-Target Mapping.
Worked on Data Extraction, Data Transformations, Data Profiling, Data Loading, Data Conversions and Data Analysis.
Wrote, executed, performance tuned SQL Queries for Data Analysis and Profiling.
Identified and documented business functions, activities and processes, data attributes, table metadata, and general and detailed design specifications.
Wrote PL/SQL scripts for pre- and post-session tasks.
Tuned the Mappings by Removing the Source/Target bottlenecks and Expressions to improve the Throughput of the Data Loads.
Worked extensively with the Teradata utilities MultiLoad (MLOAD) and FastLoad (FLOAD).
Collected statistics against the production data.
Developed models and cubes using Cognos Transformer.
Provided production support for other teams migrating mappings from earlier versions and troubleshooting issues.
Interacted with users to fix data issues.
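The post-load "collect statistics" step above can be sketched as a shell script that generates a BTEQ script refreshing optimizer statistics on the loaded tables (database and table names are illustrative placeholders, not from the actual project):

```shell
#!/bin/sh
# Sketch: generate a BTEQ script that collects statistics on loaded tables.
# Database and table names are illustrative placeholders.
DB="PROD_DB"
TABLES="ORDER_FACT CUSTOMER_DIM"

: > collect_stats.bteq
for t in $TABLES; do
  echo "COLLECT STATISTICS ON ${DB}.${t};" >> collect_stats.bteq
done
echo ".QUIT" >> collect_stats.bteq

# A production run would execute: bteq < collect_stats.bteq > collect_stats.log 2>&1
cat collect_stats.bteq
```

Regenerating the script from a table list keeps the statistics step in sync as new tables are added to the load, rather than maintaining a hand-edited BTEQ file.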
Environment: Informatica 8.6, Teradata V2R5.0, Quest TOAD 9.5, Oracle 9.2, DB2, PL/SQL, Sun Solaris 5.8, Cognos EP Series 7 (Impromptu, Impromptu Web Reports, Transformer, PowerPlay), Upfront, Access Manager, Redwood Cronacle 5.7.
Client: Confidential Torrance, CA Mar 2008 - Feb 2009
Role: Informatica Developer
This is a data warehouse for AHM auto sales in the US and Canada regions. The operational data store (ODS) centralizes information from distributed normalized databases at the source end.
From the ODS, data is loaded into the staging area database through nightly data loads. The staging area holds all dimension information along with incremental facts. Dimensional data and incremental fact data are published from the staging area to the data warehouse; on successful load completion to the EDW, the incremental facts in the staging area are purged. An OLAP cube is built on this warehouse for decision support in sales analysis.
Responsibilities:
Worked on Informatica - Source Analyzer, Warehouse Designer, Mapping Designer and Mapplet, and Transformation Developer.
Implemented two kinds of data loading processes (daily and weekly), depending on the frequency of the source data.
Imported data from various sources, then transformed and loaded it into data warehouse targets using Informatica.
Fixed invalid mappings; tested stored procedures and functions; performed unit and integration testing of Informatica sessions, batches, and target data.
Extensively used transformations such as Router, Aggregator, Source Qualifier, Joiner, Expression, and Sequence Generator.
Scheduled Sessions and Batches on the Informatica Server using Informatica Server Manager/Workflow Manager.
Worked with pmcmd to interact with the Informatica Server from command mode and execute shell scripts.
Worked with different sources such as Oracle, MS SQL Server and flat files.
Knowledge of slowly changing dimension tables and fact tables.
Writing documentation to describe program development, logic, coding, testing, changes and corrections.
Used Control-M for scheduling; created various UNIX shell scripts for data cleansing and the loading process, and maintained the batch processes with UNIX shell scripts.
Optimizing the mappings by changing the logic to reduce run time.
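The pmcmd command-mode work above can be sketched as a small wrapper that builds the call to start a workflow and wait for it to finish (service, domain, folder, and workflow names are hypothetical; `-uv`/`-pv` take the names of environment variables holding the credentials):

```shell
#!/bin/sh
# Sketch: wrapper that builds a pmcmd call to start a workflow and wait on it.
# Service, domain, folder, and workflow names are hypothetical placeholders.
INFA_SERVICE="INT_SVC"
INFA_DOMAIN="DOM_DEV"
FOLDER="SALES_DW"
WORKFLOW="wf_daily_load"

CMD="pmcmd startworkflow -sv $INFA_SERVICE -d $INFA_DOMAIN -uv INFA_USER -pv INFA_PASS -f $FOLDER -wait $WORKFLOW"

# Dry run: print the command rather than executing it (pmcmd is not on PATH here).
echo "$CMD"
```

Running with `-wait` makes pmcmd block until the workflow completes and return its status as the exit code, which lets an external scheduler such as Control-M or Autosys detect failures.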
Environment: Informatica PowerCenter 8.1 (ETL), Teradata, Oracle 9i, B2B, SQL, PL/SQL, SQL*Loader, SQL Navigator, SQL Server 2000, Windows NT/2000, HP-UX.
Client: Confidential, CA Aug 2007 - Feb 2008
Role: ETL Developer
Amgen is one of the largest pharmaceutical transaction services companies in the United States and one of the nation's leading prescription benefits managers. Developed and maintained a central data warehouse and the applications and interfaces associated with it, covering areas such as direct sales and chargebacks, incentives and payments, and membership, to help analyze the status quo of the company and support business decisions for its further development.
Responsibilities:
- Involved in requirements gathering, business analysis with the users for the development of new applications.
- Preparation of technical specification for the development of Informatica Extraction, Transformation and Loading (ETL) mappings to load data into various tables in Data Marts and defining ETL standards.
- Optimized the existing applications at the mapping level, session level and database level for a better performance.
- Scheduled jobs using Informatica Workflow Manager, the Control-M Enterprise Manager scheduling tool, and UNIX shell scripting.
- Worked on partitioning, mapping variables, the Debugger, variable ports, etc.
- Implemented slowly changing dimensions methodology and developed mappings to keep track of historical data.
- Developed Dynamic Parameter files for the workflows.
- Identified performance issues in existing sources, targets and mappings by analyzing the data flow, evaluating transformations and tuned accordingly for better performance.
- Involved in creating stored procedures and PL/SQL Scripts.
- Converted all the stored procedures into Informatica mappings.
- Extensively used Router, Lookup, Aggregator, Expression and Update Strategy Transformations.
- Created recovery workflows in order to provide backup for the critical jobs.
- Provided 24x7 on-call support for over 100 applications on a daily basis in order to meet SLA and to provide consistent data to the users.
- Used Toad and SQL Plus for database analysis performance tuning and trouble shooting.
- Documented System Architecture Design (SAD), Mapping Document and Unit Test plans.
- Worked with ClearQuest to track requests, assigned work, etc.
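The dynamic parameter files noted above can be sketched as a generator script run before each workflow start; the `[folder.WF:workflow]` header and `$$name=value` lines follow the standard PowerCenter parameter-file layout, while the folder, workflow, and parameter names here are illustrative placeholders:

```shell
#!/bin/sh
# Sketch: generate a dynamic Informatica parameter file before a workflow run.
# Folder, workflow, and parameter names are illustrative placeholders.
FOLDER="PHARMA_DM"
WORKFLOW="wf_sales_load"
PARAM_FILE="${WORKFLOW}.param"
LOAD_DATE=$(date +%Y-%m-%d)

# \$\$ escapes keep the literal $$ mapping-parameter prefix in the output file.
cat > "$PARAM_FILE" <<EOF
[${FOLDER}.WF:${WORKFLOW}]
\$\$LOAD_DATE=${LOAD_DATE}
\$\$SRC_SYSTEM=ODS
EOF

# The workflow would then be started with: pmcmd ... -paramfile $PARAM_FILE
cat "$PARAM_FILE"
```

Regenerating the file on every run lets values such as the load date change without touching the mappings or sessions themselves.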
Environment: Informatica PowerCenter 7.1.2, Oracle 9i, Teradata, Associated Data, SQL*Plus, PL/SQL, Windows Server 2000, TOAD 7.0, Autosys, ClearQuest.
Client: Confidential, India Apr 2006 – Mar 2007
Role: Informatica Developer
This data warehouse was designed to generate reports for Megha Corporation and analyze the sales of various products. The product data is categorized by product group and product family, and the warehouse is also used to analyze product usage at different times of the year. It reports on historical data stored in various sources such as Oracle databases, XML files, and flat files. Data from the different sources was brought in using Informatica ETL and sent to Business Objects for reporting.
Responsibilities:
Developed ETL mappings and transformations using Informatica PowerCenter.
Implemented source and target definitions in the Designer.
Developed mappings using various transformations such as Update Strategy, Lookup, Stored Procedure, Router, Filter, Sequence Generator, Joiner, Aggregator, and Expression.
Designed mappings to handle slowly changing dimensions.
Used various stored procedures in the mappings.
Developed several reusable transformations and mapplets that were used in other mappings.
Worked on the server manager to run several sessions and batches concurrently and sequentially.
Designed mappings for optimum performance by optimizing expressions in the transformation expression editor, filtering source data at the Source Qualifier, and applying transformation logic efficiently.
Environment: Informatica PowerCenter, Oracle, Windows NT, PL/SQL, star schema, flat files.
Client: Confidential India Jan 2005 – Feb 2006
Role: ETL Developer – Informatica
This project supports sales analysis, helping decision makers analyze sales patterns across various business angles or dimensions. The data warehouse is strictly based on a star schema so that future additions can be accommodated without affecting the existing structure or the applications already developed.
Responsibilities:
- Involved in design & development of operational data source and data marts in Oracle
- Reviewed source data and recommend data acquisition and transformation strategy
- Involved in conceptual, logical and physical data modelling and used star schema in designing the data warehouse
- Designed ETL process using Informatica Designer to load the data from various source databases and flat files to target data warehouse in Oracle
- Used PowerMart Workflow Manager to design sessions
- Created parameter based mappings, Router and lookup transformations
- Created mapplets to reuse the transformation in several mappings
- Used Power mart Workflow Monitor to monitor the workflows
- Optimized mappings using transformation features like Aggregator, filter, Joiner, Expression, Lookups
- Created daily and weekly workflows and scheduled to run based on business needs
- Created Daily/weekly ETL process which maintained 100GB of data in target database
- Involved in Unit Testing, Integration Testing, and System Testing
Environment: Informatica PowerCenter 7.1.1, Teradata, Oracle 9i, Cognos 6.5 (Impromptu), ERwin 3.5, Sun Solaris 7, shell scripting, Windows 2000.
Client: Confidential India Jan 2003 – Dec 2004
Role : .NET Software Developer
Responsibilities:
Designed and developed dynamic ASPX web pages using C#, ASP.Net, XML, HTML, Java Script.
Involved in automating Order Imports from multiple Amazon US/UK Accounts.
Used Cascading Style Sheets (CSS) to attain uniformity of all web pages and to control the layout and look of the page easily.
Automated inventory synchronization across multiple Amazon accounts: as soon as orders are imported, the user's inventory is automatically updated to match the imported orders.
Extensively used Object Oriented Programming Concepts in developing the application.
Implemented creation and printing of picking slips and invoices, ensuring the local time zone is set so orders are managed in the user's local time.
Built an e-mail alert service in which users can customize the format of e-mails sent automatically to associated project members.
Used COM+ objects to invoke a third-party application (Dazzle) for address verification; addresses are verified automatically through the Dazzle software, which returns results in XML format.
Responsible for the deployment of the COM DLLs onto the COM+ Application servers.
Extensively used XML class libraries to parse the data into the database.
Designed a multi-tier architecture to keep the application customizable.
Implemented content-based search using XML and regular-expression functionality, a user-friendly feature that lets users query reports in a very short span of time.
Environment: Visual Studio 2002, CSS, C#.NET, ASP.NET, ADO.NET, JavaScript, SQL Server 2000, HTML, XML, Web Services.