
Informatica Developer Resume


SUMMARY:

  • Over 10 years of experience in all phases of the Software Development Life Cycle for Client/Server applications and Data Warehouse / Data Mart solutions. Involved in requirement analysis, design, development, testing and implementation using Oracle 10g/9i/8i/8/7.3, Informatica PowerCenter/PowerMart 4.7/5.1/6.0/6.2/7.1/8.1.1/8.5, Cognos 8.x/7.x/6, Business Objects 6.1, Erwin 3.5, Oracle Warehouse Builder 3.0, Designer 2000 and Developer 2000 on Windows and UNIX platforms.
  • Experience in Data Modeling and database design using Business Process Re-engineering (BPR) and Information Engineering (IE) methodologies.
  • Around 7 years of experience working with Informatica.
  • Extensive experience in Business Intelligence using Cognos 7.x/6 (Cognos Impromptu 6.0, Impromptu Web Reports 6.0 (IWR), PowerPlay 6.6, PowerPlay Enterprise Server 6.6, Transformer 6.6) and Business Objects 6.1 (Supervisor, Designer, Report Writer, Broadcast Agent, Infoview and WebIntelligence 2.6).
  • Extensive experience in developing strategies for Extraction, Transformation and Loading (ETL) of data from various sources into Data Warehouses and Data Marts using Informatica PowerMart/PowerCenter (Repository Manager, Server Manager, Mapping Designer, Mapplet Designer, Transformation Developer, Warehouse Designer) and DataStage (Manager, Designer and Director).
  • Experienced in Installation, Configuration and Administration of Informatica PowerCenter 5.x/6.x/7.1/8.1.1 and PowerMart 5.x/6.x client and server.
  • Experience in integration of various data sources from databases like MS Access, DB2, Oracle, SQL Server and Teradata, and other file formats like flat files, CSV files, COBOL files and XML files.
  • Strong working knowledge of Data Modeling methodologies using Star schema and Snowflake schema.
  • Involved in creation of Oracle Materialized Views and automation of their refresh process.
  • Excellent knowledge in shell scripting and scheduling of jobs under the UNIX Environment.
  • Good application development experience using Oracle Designer 2000 / Developer 2000 (Forms 4.5/5.0 and Reports 2.5/3.0).
  • Developed PC-based systems using C and Visual Basic to automate operations and to acquire data from remote sensing satellites.
SKILLS:

ETL Tools: Informatica PowerCenter / PowerMart 4.7/5.1/6.2/7.1/8.1.1/8.5, Oracle Warehouse Builder
Data Modeling Tools: Erwin, Designer 2000/6i
OLAP Tools: Cognos Impromptu 6.0, Impromptu Web Reports 6.0 (IWR), PowerPlay 6.6, Business Objects 6.1/5.1, WebIntelligence 2.5/2.6, Broadcast Agent 2.6/5.1 and Crystal Enterprise
Database Tools: SQL*Plus, SQL*Loader, TOAD
Languages: C, SQL and PL/SQL
Scripting Languages: UNIX Shell scripting, Perl, VBScript
Databases: Oracle 7.x/8/8i/9i, SQL Server 6.5/7.0, Sybase, Teradata V2R5
Hardware: IBM PC and compatibles, Analog and Digital Control Systems
Operating Systems: UNIX (IBM AIX, Sun Solaris), Windows 95/98/NT/2000, MS-DOS 6.22

Confidential, Charlotte, NC
Informatica Developer Sep 2009 - Present
The primary goal of the Wealth Management Performance Measurement Project is to provide access to transactional inflow and outflow data (monies coming into TIAA-CREF and monies leaving TIAA-CREF) within the OLAP Marketing Database.

  • Analyzed business requirements, performed source system analysis, prepared technical design document and source to target data mapping document
  • Designed the Data Mart, defining Entities, Attributes and the relationships between them
  • Performed impact analysis for systems and database modifications
  • Implemented Slowly Changing Dimensions - Type I & II in different mappings as per the requirements.
  • Involved in the development of Informatica mappings and mapplets, and tuned them for optimum performance, dependencies and batch design.
  • Performed incremental aggregation, built slowly changing dimensions using Lookup and Update Strategy transformations, and used reusable transformations.
  • Developed ETL processes to populate the Enterprise Product Orders data mart Dimensions and Fact using Informatica, Oracle SQL, UNIX shell scripts
  • Developed complex mappings using transformations such as Source Qualifier, Aggregator, Expression, Lookup, Filter, Router, Sequence Generator, Update Strategy and Joiner
  • Optimized performance by tuning the Informatica ETL code as well as SQL
  • Performance tuning of the Informatica mappings using various components like Parameter files, Variables and Dynamic Cache
  • Involved in developing UNIX scripts to load data to Teradata using FLOAD, TPUMP, MLOAD, BTEQ, etc.
  • Created, optimized, reviewed, and executed Teradata SQL test queries to validate the loaded data
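The Type I & II Slowly Changing Dimension handling above can be sketched in Python. This is a minimal, hypothetical illustration; the `cust_id`/`city` columns and the dates are invented for the example, not taken from the project's actual schema:

```python
from datetime import date

def scd_type1(dim_rows, incoming):
    """Type 1: overwrite the changed attribute in place; no history kept."""
    for row in dim_rows:
        if row["cust_id"] == incoming["cust_id"]:
            row["city"] = incoming["city"]
    return dim_rows

def scd_type2(dim_rows, incoming, load_date=date(2009, 9, 1)):
    """Type 2: expire the current row and insert a new active version."""
    for row in dim_rows:
        if row["cust_id"] == incoming["cust_id"] and row["active"]:
            row["active"] = False
            row["end_date"] = load_date
    dim_rows.append({"cust_id": incoming["cust_id"], "city": incoming["city"],
                     "eff_date": load_date, "end_date": None, "active": True})
    return dim_rows

# A customer moves: Type 2 keeps both the expired and the current row.
dim = [{"cust_id": 1, "city": "Charlotte", "eff_date": date(2008, 1, 1),
        "end_date": None, "active": True}]
dim = scd_type2(dim, {"cust_id": 1, "city": "Raleigh"})
```

In a mapping, the expire-then-insert step corresponds to a Lookup on the dimension followed by an Update Strategy deciding between DD_UPDATE and DD_INSERT.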

Environment: Informatica 7.1/8.6, DataStage, XML, Flat files, Siebel, OBIEE, Borland StarTeam 2005, Oracle 9i/10g, COBOL, Teradata V2R6, UNIX Shell Scripts, PL/SQL, Erwin 4.0, Windows XP & UNIX

Confidential, Louisville, KY
Informatica Developer/Admin Jun 2006 - July 2009
The Global Forecasting & Tracking application is a Legacy Evolution initiative to replace the mainframe-based Load Handling and TFCS applications used in the Airline and Ground domains. The project is executed in 3 phases to gradually migrate these applications from the mainframe to a J2EE-based distributed application. The TFCS system is used to forecast load information to destination and intermediate locations, and provides constant inventory and tracking of the UPS trailer fleet. The Load Handling system creates links between packages and containers, loads and aircraft, and maintains Load Handling (LH) transactions for gateway and hub/center users.

  • Interacted with Business Managers and users to gather requirements as needed, analyzed the data they provided and consolidated it into documentation.
  • Used Informatica Designer to import and edit the metadata of sources and targets.
  • Designed and developed Informatica workflows using various transformations for extracting data from source system and loading aggregated data into staging and target data mart.
  • Involved in development of database objects such as stored procedures and functions.
  • Developed various Transformations, validated and fine-tuned the ETL logic coded into mappings.
  • Extensively used Informatica Transformations like Union, Lookup, Aggregator, Filter, Router, Joiner, Sequence Generator, Expression, Update Strategy, Stored Procedure, Normalizer and Rank.
  • Made substantial contributions in simplifying the development and maintenance of ETL by creating re-usable Mapplets and Transformation objects.
  • Used Lookup Transformations to access data from tables that are not sources for the mapping, and used Unconnected Lookups to improve performance.
  • Used Stored Procedure Transformation for truncating the target tables before loading the data to SQL Server staging.
  • Involved in partitioning of very large tables. Used Informatica Power Connect MQ Series to extract data from message queues, and load data to target data mart.
  • Created equal partitions to ensure the smooth flow of data concurrently for better performance.
  • Used Key Range, Round Robin, Hash Key and Pass Through partitioning.
  • Used Mapplets, Parameters and Variables to implement Object Orientation techniques and facilitate the reusability of code.
  • Monitored and scheduled in-house financial applications and ensured that daily, weekly, and monthly processes completed successfully.
  • Provided support to Informatica developers including creating users, granting user/group permissions, creating folders, data sources, connections, Exchange distribution lists and monitoring workflows.
  • Involved in debugging the mappings for fixing the problems. Involved in re designing the mappings for improving the performance.
  • Used list and range partitioning. Worked with the DBA in monitoring performance-related issues.
  • Administered development, test and production environments. Migrated code from development to test and production.
  • Created error log tables and enabled the data recovery option at the session level.
  • Implemented Error handling strategy for trapping errors in a mapping and sending to an error table.
  • Developed, configured, tested & maintained batch interfaces using Informatica PowerCenter
  • Integrated all jobs using complex mappings, including mapplets and workflows, with Informatica PowerCenter 8.6 Designer and Workflow Manager; used the web-based Administration Console
  • Used Type 2 changes in Slowly Changing Dimensions tables.
  • Used dynamic group deployments for Repository level migrations
  • Involved in developing UNIX scripts to load data to Teradata using FLOAD, MLOAD, BTEQ, etc.
  • Created, optimized, reviewed, and executed Teradata SQL test queries to validate the loaded data; involved in SQL scripts and stored procedures in Teradata to implement business rules.
  • Used Teradata SQL Assistant to query the data in the target Teradata data warehouse.
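The partitioning schemes listed above (Round Robin, Hash Key, Key Range) can be illustrated with a small Python sketch; the row layout and key names here are invented for the example:

```python
def round_robin(rows, n):
    """Deal rows evenly across n partitions, one at a time."""
    parts = [[] for _ in range(n)]
    for i, row in enumerate(rows):
        parts[i % n].append(row)
    return parts

def hash_key(rows, n, key):
    """Send each row to the partition chosen by hashing its key column."""
    parts = [[] for _ in range(n)]
    for row in rows:
        parts[hash(row[key]) % n].append(row)
    return parts

def key_range(rows, ranges, key):
    """ranges is a list of (low, high) bounds, one per partition."""
    parts = [[] for _ in ranges]
    for row in rows:
        for i, (lo, hi) in enumerate(ranges):
            if lo <= row[key] < hi:
                parts[i].append(row)
                break
    return parts

orders = [{"order_id": i} for i in range(5)]
parts = round_robin(orders, 2)  # partitions of sizes 3 and 2
```

Hash Key keeps rows with the same key in the same partition (important for aggregation), while Round Robin balances volume and Key Range follows the data's natural boundaries.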

Environment: Informatica 8.1.2/8.5/8.6, DB2, PowerExchange, Change Data Capture, Informatica PowerConnect for MQ Series, DataStage, XML, Autosys, Flat files, Cognos 8, Mainframe, Oracle 9i/10g, COBOL, Teradata V2R6, UNIX Shell Scripts, PL/SQL, SQL Server 2005, Erwin 4.0, Windows XP & UNIX

Confidential, Columbus, OH Jul 2005 - May 2006
Informatica Developer

Confidential, a publicly traded company based in Columbus, Ohio, provides a variety of financial services that help consumers invest and protect their long-term assets, and offers retirement plans and services through both public- and private-sector employers. It is a leading provider of annuities, life insurance, retirement plans, and other financial services for individuals and institutional clients. These products are offered through multiple distribution channels.

  • The data migration included identifying various databases where the information/data lay scattered, understanding the complex business rules that need to be implemented and planning the data transformation methodology.
  • Used Maestro Conman (v7.0 1.13) and Composer (v7.0 1.21) to schedule and execute workflows
  • Performed quantitative analyses on the data loaded into the data warehouse on a daily, weekly, monthly and quarterly basis. Upon completion of the data warehouse refresh cycle, reconciled data with the source system. Maintained the master schedule via Maestro as to when the fact & dimension tables were populated in the data warehouse.
  • Performed regression testing of Informatica mappings & workflows after an upgrade from PowerCenter v6.1 to v7.1.3
  • Involved in developing SQL and PL/SQL codes through various procedures, functions, and packages to implement the business logics of database in Oracle.
  • Extensive experience in implementing data cleanup procedures, transformations, scripts and stored procedures, and executing test plans for loading the data successfully into the targets
  • Used Lookup Transformations to access data from tables that are not sources for the mapping, and used Unconnected Lookups to improve performance.
  • Created complex Informatica mappings to load the data mart and monitored them. The mappings involved extensive use of transformations like Aggregator, Filter, Router, Expression, Joiner, Sequence generator.
  • Involved in data normalization and access of legacy systems using Normalizer transformation.
  • Configured the mappings to handle the updates to preserve the existing records using Update Strategy Transformation.
  • Created and configured Workflows, Worklets and Sessions to transport the data to target using Informatica Workflow Manager.
  • Defined Target Load Order Plan and Constraint based loading for loading data correctly into different Target Tables.
  • Performed data mapping of EIM tables to Base tables.
  • Used Teradata SQL Assistant to query the data in the target Teradata data warehouse.
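The Target Load Order Plan and constraint-based loading mentioned above come down to loading parent (dimension) tables before the child (fact) tables that reference them. A minimal Python sketch with invented table names, assuming the foreign-key graph is acyclic:

```python
def load_order(deps):
    """deps maps each target table to the parent tables it references.
    Returns a load sequence in which parents always precede children."""
    ordered, seen = [], set()

    def visit(table):
        if table in seen:
            return
        seen.add(table)
        for parent in deps.get(table, []):
            visit(parent)       # load every referenced parent first
        ordered.append(table)

    for table in deps:
        visit(table)
    return ordered

deps = {"fact_orders": ["dim_product", "dim_customer"],
        "dim_product": [], "dim_customer": []}
order = load_order(deps)  # dimensions come before fact_orders
```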

Environment: Informatica 7.1.3, DB2, Sybase, Siebel 7.x, XML, Flat files, Business Objects 6.1, MicroStrategy, WebIntelligence 2.6, Oracle 9i, Lotus Notes, SQL*Loader, Java, Maestro Conman (v7.0 1.13), UNIX Shell Scripts, PL/SQL, SQL Server, Teradata V2R5, Oracle HRMS, PeopleSoft, Windows NT

Confidential, Miami, FL Oct 2004 - Jun 2005
Informatica Developer

Bay view and its subsidiaries deliver a wide array of products, including the purchase of residential and commercial real estate loans, the origination of small-balance commercial real estate loans, and investment in and development of commercial real estate. The business is sourced through a network of thousands of financial institutions, mortgage companies, third-party brokers and advisors.

  • Involved in Data Modeling and design of a data warehouse in star schema methodology with conformed and granular dimensions and fact tables.
  • Created and ran Sessions & Batches using Workflow Manager to load the data into the target database.
  • Using Informatica Repository Manager, maintained the repositories of various applications and created users, user groups and security access controls.
  • Created Informatica Mappings to load data using transformations like Source Qualifier, Aggregations, Expression, Joiner, Lookup, Filters, Sequence, Router and Update Strategy.
  • Designed and developed Informatica Mappings, Mapplets, Sessions, Workflows and Worklets for data loads and data cleansing.
  • Extensively worked on Database Triggers, Stored Procedures, Functions and Database Constraints optimized for maximum performance.
  • Involved in the development of Informatica mappings and also performed tuning for better performance.
  • Developed star schemas, including identifying facts and dimensions, and designed tables.
  • Wrote UNIX shell scripts to bring data from all systems into the data warehousing system. The data was standardized to store various business units in tables.
  • Analyzed the sources according to Business rules and developed Validation Database.
  • Worked with the Workflow Manager tools - Task Developer, Workflow Designer & Worklet Designer.
  • Managed the metadata associated with the ETL processes used to populate the Data Warehouse.
  • Tested all the applications and transported the data to the target Oracle warehouse tables; scheduled and ran the extraction and load processes and monitored sessions and batches using Informatica Workflow Manager.
  • Identified and tracked the slowly changing dimensions on heterogeneous Sources.
  • Tested the target data against source system tables by writing some QA Procedures.
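Testing target data against source system tables, as described above, is essentially a reconciliation check. A small Python sketch of the idea (the `acct` key column is invented for the example):

```python
def reconcile(source_rows, target_rows, key):
    """Compare source and target on a key column and report discrepancies."""
    src = {row[key] for row in source_rows}
    tgt = {row[key] for row in target_rows}
    return {"missing_in_target": src - tgt,
            "extra_in_target": tgt - src,
            "counts_match": len(source_rows) == len(target_rows)}

source = [{"acct": 1}, {"acct": 2}, {"acct": 3}]
target = [{"acct": 1}, {"acct": 3}]
report = reconcile(source, target, "acct")  # acct 2 missing in target
```

In practice the same comparison would be written as SQL MINUS/EXCEPT queries against the source and target tables.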

Environment: Informatica 7.1.3, Business Objects 6.1, WebIntelligence 2.6, WebIntelligence SDK, ZABO, JSP, Oracle 9i, Teradata BTEQ, SQL*Loader, UNIX Shell Scripts, Sun Solaris, PL/SQL, SQL Server, Teradata V2R5, TOAD.

Confidential, Thousand Oaks (CA) Jul'03 - Sep' 04

Informatica Developer
This is a data warehousing solution for a food company that mainly deals with consumer products, especially FMCG products, using point-of-sale (POS) data. The package provides methods to analyze sales revenue, product and brand flow, time-based analysis of product movement, customer choices based on product categories, fast-moving products, slow-moving products, capital blocked on slow-moving products, etc. Different types of promotional schemes and their effect on overall sales, and grouping of products (Market-Basket Analysis), are the special features covered by the package.

Responsibilities:

  • Used Informatica Designer to develop mappings, using transformations that include aggregation, update, lookup and summation.
  • Developed sessions using Workflow Manager and improved their performance.
  • Created stored procedures to transform the data and worked extensively in PL/SQL for the various transformation needs while loading the data.
  • Used transformations like Aggregator, Expression, Filter, Sequence Generator, Router, Joiner, Lookup and Stored Procedure.
  • Used the Repository Manager to give permissions to users, create new users and create repositories.
  • Created reusable transformations and mapplets and used them in mappings.
  • Created mappings with PL/SQL procedures/functions to build business rules to load data.
  • Used Informatica PowerCenter Workflow Manager to create sessions and batches to run with the logic embedded in the mappings.
  • Monitored custom business logic to address the challenges of managing the application server's infrastructure as well as monitoring the performance of the actual business logic.
  • Fine-tuned transformations and mappings for better performance.
  • Involved in the process design documentation of the Data Warehouse dimensional upgrades.
  • Extensively used Informatica for loading historical data from various tables for different departments.
  • Wrote data-loading stored procedures and functions using PL/SQL from source systems into operational data storage.
  • Created views and altered some of the dimension tables to satisfy reporting needs; created target tables in the Oracle database.
  • Responsible for presenting Data Warehousing concepts and tools to prospective clients.
  • Created database connections, repositories and domains. Analyzed the existing database and designed the Universes using the Business Objects Designer module.
  • Created Universes, Classes, Objects, Measure Objects, Conditions and Joins; involved in resolving loops, cardinalities and contexts, and checked the integrity of the Universes.
  • Created reports using Business Objects functionality like Queries, Slice and Dice, Drill Down, functions, Cross Tab, Master/Detail and Formulas.
  • Developed complex queries using different data providers in the same report. Attended to troubleshooting and knowledge transfer.
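The aggregation and summation transformations used in these mappings amount to a group-by with a summed measure. A small Python sketch, with invented sales columns, of what an Aggregator transformation computes:

```python
from collections import defaultdict

def aggregate(rows, group_key, measure):
    """Mimic an Aggregator transformation: group by one column, SUM another."""
    totals = defaultdict(float)
    for row in rows:
        totals[row[group_key]] += row[measure]
    return dict(totals)

# Hypothetical POS rows: revenue summed per brand.
sales = [{"brand": "A", "revenue": 100.0},
         {"brand": "B", "revenue": 50.0},
         {"brand": "A", "revenue": 25.0}]
totals = aggregate(sales, "brand", "revenue")
```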

Environment: Informatica PowerCenter 7.1/6.2, Business Objects 6.0/5i, DB2 7.0, Oracle 9i, Sybase, SQL Server, SQL*Loader, Windows NT/2000, Erwin 3.5.2.

Confidential, WA Nov '02- Jun '03

ETL Developer

Description:

Confidential is a Fortune 500 company and a leading financial organization. This was a Sales Data Warehousing project for analyzing customers and their transactions, accounts, balances, payments, and funds transfers between accounts of its customers as well as between the bank and other financial institutions. The project was developed for the Sales & Marketing and Finance departments to assist them in their business analysis.
Responsibilities:

  • Created Informatica Mappings and Mapplets using different transformations.
  • Extensively used transformations like Source Qualifier, Filter, Aggregator, Expression, Connected and Unconnected Lookup, Sequence Generator, Router and Update Strategy.
  • Resolved the dimension keys and performed currency conversions before loading into the target.
  • Calculated values in a single row by specifying input/output ports before writing to the target using Expression Transformation.
  • Created and developed a series of mappings for handling different cases of input data in the same source table.
  • Analyzed certain existing mappings that were producing errors and modified them to produce correct results.
  • Used parameters and variables in mappings.
  • Used Workflow Manager to create Session tasks and other tasks.
  • Involved in production monitoring using Workflow Monitor.
  • Involved in UNIT Testing.
  • Developed a template to be used as a Mapping Specification Document.
  • Involved in the customization of the mappings by analyzing the data.
  • Generated the reports using Cognos Impromptu by querying the data from different database tables as per the requirement
  • Documentation of mappings as per standards.
  • Created Catalogs and assigned joins as per the requirements.
  • Customized the reports by adding calculations, conditions and functions
  • Developed Impromptu Web Reports (IWR) and published them into Upfront.
  • Created List Reports, Cross Tab Reports and Drill Through reports.
  • Created cubes for multidimensional analysis using PowerPlay Transformer.
  • Performed data retrieval and multidimensional analysis with the PowerPlay structure.
  • Created multidimensional reports using PowerPlay.
  • Designed and developed multidimensional cubes and visualizations.
  • Applied Governor settings to the appropriate users and table weighting to improve performance.

Environment: Informatica PowerCenter 6.0, Oracle 8.0/8i, DB2, SQL Server 7.0, Windows 2000, HP-UX 4.0, Cognos Impromptu 6.0, Impromptu Web Reports 6.0, PowerPlay Transformer, Upfront, Access Manager, PowerPlay Enterprise Server, proscriptions Editor, Windows NT/2000, UNIX

Confidential, CA Jun '01 to Oct '02
ETL Developer
Description:
The Customer Data Management System is a data warehousing system that forecasts client information about accounts, debits, credits, trial balances, loan information and interest details.

Responsibilities:

  • Involved in the design of ETL structures for ETL processes, particularly to load bridge and helper tables to maintain the hierarchical nature of Slowly Changing Dimensions (SCD).
  • Set up processes to build SCDs for both Type 1 and Type 2 (using the date range and an active/inactive flag).
  • Involved in defining and maintaining shared components (such as sources, targets, mapplets and other reusable objects) in PowerMart.
  • Involved in data cleansing by creating mappings using various transformations available in Informatica such as Expression, Router, Filter, Lookup, Stored Procedure, Update Strategy etc.
  • Involved in data normalization and access of legacy systems using Normalizer transformation.
  • Involved in Performance Tuning of the ETL Application and Processes.
  • Experienced in setting up the connectivity to various sources and targets in different environments.
  • Set up sessions and batches to populate the Data Mart with data from several sources.
  • Coordinated with DBAs and the Change Control Team for cutovers and migrations.
  • Involved in setting up an automated schedule for the Data Mart build, along with responsibility for communicating with the Ops Group on all ad-hoc ETL process execution requests.
  • Developed UNIX shell and DOS command scripts to automate daily load processes.
  • Involved in documentation for audit control and regulatory needs.
  • Created & tested reports using Business Objects functionality like Queries, Slice and Dice.
  • Created graphical representation of reports such as Bar charts, Pie charts as per End user requirements.
  • Tested Universes and Reports, including database connectivity
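The data-cleansing mappings above route rows through conditional groups. A minimal Python sketch of Router-style behavior (the group names and conditions are invented): a row lands in every group whose condition it meets, and unmatched rows fall into a default group.

```python
def route(rows, rules):
    """Simplified Router transformation: a row is copied to every group
    whose condition it satisfies; rows matching nothing go to 'default'."""
    groups = {name: [] for name in rules}
    groups["default"] = []
    for row in rows:
        matched = False
        for name, condition in rules.items():
            if condition(row):
                groups[name].append(row)
                matched = True
        if not matched:
            groups["default"].append(row)
    return groups

rules = {"large": lambda r: r["amount"] > 100,
         "negative": lambda r: r["amount"] < 0}
groups = route([{"amount": 150}, {"amount": 40}, {"amount": -5}], rules)
```

A Filter transformation is the degenerate case with a single group and no default output.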

Environment: Sun Solaris, Oracle 8.x, Erwin, Informatica PowerCenter 5.1, Business Objects 5.1, WebIntelligence 2.1.

Confidential, CA Mar' 00 - May'01
Programmer/System Analyst

HOTS (Hub Office Trading System) is a transaction placement application for the Taiwan Hub Office. HOTS is used throughout the day, on an as-needed basis, on business days. The HOTS business cycle begins in Taiwan and continues in San Mateo on the same trade date.
Responsibilities:

  • Developed various ASP pages in the Visual InterDev environment.
  • Development involved writing complex validations using client-side and server-side VB/Java scripts.
  • Effectively used ADO to perform database operations.
  • Implemented APIs in Visual Basic applications.
  • Involved in writing and developing COM components for server-side applications.
  • Extensively used server-side components.
  • Extensively used MTS and IIS 4.0.
  • Developed various HTML pages in the Visual InterDev environment.
  • Used Visual SourceSafe for all source code maintenance.
  • Wrote stored procedures in SQL Server 7.0.

Environment: ASP 2.0, Visual InterDev 6.0, Visual SourceSafe, Visual Basic 6.0, ADO, SQL Server 7.0, VBScript, Scriptures, COM, DHTML, HTML, XML, Windows NT 4.0 and IIS 4.0.

Confidential, India Jan'99- Feb'00
Programmer
This is an online Financial Accounting System for Integrated Thermo Plastic Ltd. that handles various types of transactions, such as accounts receivable, accounts payable and journal invoices. The various account heads are maintained in the form of a general ledger.
Responsibilities:

  • Implemented APIs in Visual Basic applications.
  • Maintained account information at the main-account level as well as for sub-accounts.
  • Kept track of the accounting transactions.
  • Kept track of payment dues, late payments, fines, overcharges and credit history.
  • Reported account information on a monthly, quarterly and annual basis; quarterly tax information is also maintained by the system.
  • Reported all account balances for the accounting period.
  • Kept track of bank deposits such as direct deposits, check deposits, cash deposits and credits.

Environment: Visual Basic 5.0, MS Access, Windows 95 and Crystal Reports.

Education:

  • B.E (Architecture)
  • COGNOS certification in Impromptu & PowerPlay
  • Trained in Data warehousing techniques, OLAP and COGNOS analysis and reporting tools
  • Post Graduate Diploma in Computer Applications
