
Senior Informatica Developer Resume


Portland, ME

Summary:

  • About 7 years of IT experience in software analysis, design and development of client-server applications, providing Business Intelligence and Data Warehousing solutions for Decision Support Systems, OLAP and OLTP application development.
  • Over six years of ETL experience on Data Warehouse, Data Mart, Data Integration and Data Conversion projects using Informatica Power Center 9.0.1/8.6/7.1/6.2.
  • Extensively used enterprise data warehousing ETL methodologies supporting data extraction, transformation and loading in a corporate-wide ETL solution using Informatica Power Center.
  • Expertise in different schemas (Star and Snowflake) to fit reporting, query and business analysis requirements.
  • Expertise in Informatica Mappings, Mapplets, Sessions, Workflows and Worklets for data loads.
  • Expertise in Exception Handling mappings for Data Quality, Data Cleansing and Data Validation.
  • Extensively worked with Informatica Mapping Variables, Mapping Parameters and Parameter Files.
  • Strong experience in developing complex mappings using varied transformations like Unconnected and Connected Lookup, Router, Aggregator, Sorter, Rank, Joiner, Stored Procedure and Update Strategy.
  • Developed Slowly Changing Dimension mappings of Type 1, Type 2 and Type 3 (version, flag and timestamp based); a SQL sketch of the Type 2 pattern follows this list.
  • Experience in developing Reusable components and Partition sessions across the projects.
  • Experience in developing Incremental Aggregation mappings to incrementally update aggregated values in target tables.
  • Experience in troubleshooting and implementing performance tuning at various levels such as Source, Target, Mapping, Session and System in the ETL process.
  • Extensively worked on code migration across Development, System Test, UAT and Production environments.
  • Worked on Informatica Power Center upgrade from 6.2 to 7.1.4 and 8.1.1.
  • Extensive experience in full life cycle development with emphasis on Project Management, User Acceptance Testing, Programming and Documentation.
  • Extensive experience in writing PL/SQL Procedures, Triggers, Functions and Views for back end development.
  • Experience in UNIX shell scripting (Korn Shell, Bourne Shell, C Shell).
  • Involved in Manual and Automated Testing of applications on UNIX and Windows Environment.
  • Extensive knowledge of Business Intelligence / Reporting tools: Crystal Reports, Oracle Business Intelligence (OBIEE) and Business Objects.
  • Good knowledge of ER data modeling tools such as Erwin and MS Visio, along with the MS Office Suite.
  • Good exposure to the Software Development Life Cycle (SDLC) and OOAD Techniques.
  • Extensively worked with scheduling tools like Control-M and AutoSys.
  • Knowledge in full life cycle implementation of Data Warehouses pertaining to Insurance, Financial, Networking, Pharmaceutical verticals.
  • Team Player with ability to work independently and manage/lead ETL Teams.
  • Strong technical, communication, time management and interpersonal skills.
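
A minimal SQL sketch of the Type 2 slowly changing dimension pattern referenced above, expressed as database SQL rather than an Informatica mapping. The CUSTOMER_DIM, CUSTOMER_STG and CUSTOMER_DIM_SEQ objects are hypothetical stand-ins, not an actual project schema:

    -- Step 1: expire the current version of any customer whose tracked attributes changed.
    UPDATE customer_dim d
       SET d.end_date     = SYSDATE,
           d.current_flag = 'N'
     WHERE d.current_flag = 'Y'
       AND EXISTS (SELECT 1
                     FROM customer_stg s
                    WHERE s.customer_id = d.customer_id
                      AND (s.customer_name <> d.customer_name
                           OR s.customer_city <> d.customer_city));

    -- Step 2: insert a new current version for changed or brand-new customers.
    INSERT INTO customer_dim
           (customer_key, customer_id, customer_name, customer_city,
            eff_date, end_date, current_flag)
    SELECT customer_dim_seq.NEXTVAL, s.customer_id, s.customer_name, s.customer_city,
           SYSDATE, NULL, 'Y'
      FROM customer_stg s
     WHERE NOT EXISTS (SELECT 1
                         FROM customer_dim d
                        WHERE d.customer_id   = s.customer_id
                          AND d.current_flag  = 'Y'
                          AND d.customer_name = s.customer_name
                          AND d.customer_city = s.customer_city);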

Technical skills:


Languages

C, C++, Java, C#, ColdFusion, PHP, SQL, PL/SQL, UML, XHTML, JavaScript, Shell Scripting, Perl

RDBMS

Oracle 10g/9i/8i, DB2, MS SQL Server 2008/2005/2000, Teradata, MS Access

ETL Data Integration Tools

Informatica Power Center / Power Mart 6.x/7.x/8.x (Designer, Workflow Manager, Workflow Monitor, Repository Manager, Repository Server Administration Console), Power Connect for ERP and Mainframes, Power Exchange, DataStage.

Operating Systems

UNIX, AIX, Windows 2003/2000/NT/XP, Sun Solaris, Linux 7.3/8.0.

Tools & IDEs

Toad, Autosys, Control-M, SQL*Loader, SQL Server Management Studio, Teradata SQL Assistant, Quest Central for DB2, ChangeMan Version Manager, Eclipse, Visual Studio, Dreamweaver, Apache Directory Studio, Erwin 4.5/4.0, MS Visio 2007

Reporting Tools

Crystal Reports, Oracle Reports Builder, Business Objects, MS Access Reports

Professional Experience:

Confidential, Portland, ME Jan’11 – Current
Role: Senior Informatica Developer

Wright Express Corporation, together with its subsidiaries, provides payment processing and information management products and services to the commercial and government vehicle fleet industry in the United States, Canada, New Zealand, Australia and Europe. It operates in two segments, Fleet and MasterCard. The Consolidated Fleet Reporting project combines data from different sources into the Data Hub so that integrated data can be loaded into the RDW.

Responsibilities:

  • Documenting user requirements, translating them into system solutions and developing the implementation plan and schedule.
  • Extensively participated in functional and technical meetings for designing the architecture of ETL load process (mappings, sessions, workflows from source systems to staging to DW).
  • Developing Mapping Documents indicating the source tables, columns, data types, transformation required, business rules, target tables, columns and data types.
  • Creating several Informatica mappings for change data capture (CDC) to populate the data into Dimension and Fact tables.
  • Extensively used ILM tool for data masking.
  • Extensively used ETL to load data from flat file to Oracle database.
  • Responsible in developing the ETL logics and the data maps for loading the tables.
  • Developing Complex Transformations, Mapplets using Informatica to Extract, Transform and Load data into Data marts, Enterprise Data warehouse (EDW) and Operational data store (ODS).
  • Creating Informatica Exception Handling Mapping for Data Quality, Data Cleansing and Data Validations.
  • Writing complex Oracle SQL queries for Source Qualifier SQL overrides (a sample override follows this list).
  • Working on Performance Tuning of the complex transformations, mappings, sessions and SQL queries to achieve faster data loads to the data warehouse.
  • Creating session tasks, event wait and event raise tasks, command tasks, worklets, etc. in the Workflow Manager and deploying them across the DEV, QA, UAT and PROD repositories.
  • Generating Email Notifications through scripts that run in the Post session implementations.
  • Writing SQL scripts for the Pre-SQL and Post-SQL properties of session tasks.
  • Designed mappings for populating Tables for one time load and incremental loads.
  • Running catch-up loads for the applications and verifying that the catch-up process completes.
  • Analyze functional requirements provided by Business Analysts for the code changes.
  • Create workflows based on dependencies in Informatica for the code changes.
  • Writing UNIX scripts for scheduling the Control-M jobs to automate the processes.
  • Writing UNIX scripts for unit testing the ETL Code (Sessions, Workflows, Log Files).
  • Unit test the Data Mappings and Workflows and validate the data loaded into database.
  • Involved in ETL Code migration from Environment to Environment.
  • Involved in creation of Environment Document which provides instructions to Implement/Test the Project in QA and Production environments.
  • Troubleshooting issues reported in the production environment and coordinating with quality control on the final resolution of issues and the production release of programs.
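
A hedged example of the kind of Source Qualifier SQL override used for the incremental/CDC extracts above. The FLEET_TXN table and the $$LAST_EXTRACT_TS mapping parameter are illustrative names only, with the parameter supplied through the parameter file at run time:

    -- Hypothetical Source Qualifier override: pull only rows changed since the last run.
    -- $$LAST_EXTRACT_TS is expanded from the parameter file before the query executes.
    SELECT t.txn_id,
           t.account_id,
           t.txn_amount,
           t.txn_date,
           t.last_update_ts
      FROM fleet_txn t
     WHERE t.last_update_ts > TO_DATE('$$LAST_EXTRACT_TS', 'YYYY-MM-DD HH24:MI:SS')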

Environment: Informatica Power Center 8.6.1 (Repository Manager, Designer, Workflow Manager, Workflow Monitor and Repository Server Admin console), Mainframes, Sun Solaris Informatica servers, SQL, PL/SQL, Oracle 10g, TOAD, Control-M, Windows 2003/2000, Shell Scripting, XML, AIX, UNIX, Autosys, ChangeMan Version Manager and Crystal reports.


Confidential, Portland, ME May’09 – Dec’10
Role: Senior Informatica Developer

Unum is currently the largest disability insurance company worldwide. Unum provides employee benefits including disability insurance, critical illness insurance, long-term care insurance and life insurance. Worked with the DaIS team in creating the LTC data mart, which supports effective in-house reporting in the LTC division.

Responsibilities:

  • Improving workflow performance by shifting filters as close as possible to the source and selecting tables with fewer rows as the master during joins.
  • Involved in Analyzing the System requirements, implementation of star schema & Involved in designing and development of data warehouse environment.
  • Effectively used pushdown optimization in Informatica (an illustration of the pushed-down SQL follows this list).
  • Analyzed the trade-offs between pushdown optimization, MLoad and ODBC for loading Teradata.
  • Effectively used Teradata utilities MLoad, FastLoad and TPump.
  • Involved in the performance improvement project.
  • Coordinating with off-shore team on day-to-day assignments.
  • Involved in designing of testing plan (Unit Testing and system testing).
  • Tested scripts by running workflows and assisted in debugging the failed sessions.
  • Used persistent lookup caches whenever lookup data needed to be retained across workflow runs.
  • Used connected and unconnected lookups whenever appropriate, along with the use of appropriate caches.
  • Created tasks and workflows in the Workflow Manager and monitored the sessions in the Workflow Monitor.
  • Performed maintenance, including managing space, removing bad files, removing cache files and monitoring services.
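
For illustration only: roughly the shape of the SQL that full pushdown optimization generates inside Teradata when a filter, a join and an aggregation are pushed to the database. The STG_CLAIM, POLICY_DIM and LTC_CLAIM_FACT names are hypothetical, not the actual data mart objects:

    -- Pushed-down work runs as a single INSERT ... SELECT inside Teradata
    -- instead of row-by-row processing on the Integration Service.
    INSERT INTO ltc_claim_fact (policy_key, claim_month, claim_cnt, paid_amt)
    SELECT d.policy_key,
           c.claim_month,
           COUNT(*)           AS claim_cnt,
           SUM(c.paid_amount) AS paid_amt
      FROM stg_claim  c
      JOIN policy_dim d
        ON d.policy_id = c.policy_id
     WHERE c.claim_status = 'APPROVED'
     GROUP BY d.policy_key, c.claim_month;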

Environment: Informatica Power Center 9.0.1, Power Exchange, Teradata V12.0, Oracle 11g/10g SOA, SQL Assistant, SSH, WLM, UNIX AIX, Windows XP, Autosys, Data Explorer v9x, SFTP, SharePoint, Maestro.

Confidential, Detroit, MI Mar’08 – Apr’09
Role: Informatica Developer

This project was to build an Enterprise Data Warehouse (EDW) for the BCBS applications, which used SQL Server DTS for ETL. The primary objective of the project was to replace SQL Server DTS with Informatica as the ETL tool; the secondary objective was to calculate the productivity of the sales personnel and award the most productive salesperson with special incentives.

Responsibilities:

  • Involved in Analyzing the System requirements, implementation of star schema & Involved in designing and development of data warehouse environment.
  • Preparing TSD and ETL Specs based on BSD and Mapping document.
  • Conducted walkthroughs of the TSD with client tech leads, team leads, data architects and DBAs.
  • Developed code and executed UTCs.
  • Prepared the SDLC workbook and conducted walkthroughs before moving code to SIT, UAT and Production.
  • Conducted weekly status meetings to discuss the progress made that week.
  • Creating job setup document for job scheduling tool.
  • Provided technical clarifications, assigned work to the offshore team and monitored their work status daily.
  • Worked on Informatica tools like Source Analyzer, Target Designer, Mapping Designer, Workflow Manager, and Workflow Monitor.
  • Used various transformations like Joiner, Aggregator, Expression, Lookup, Filter, Union, Update Strategy, Stored Procedure and Router to implement complex logic while coding a mapping.
  • Extracted data from Flat files, SQL Server and Oracle and loaded them into Teradata.
  • Implemented different Tasks in workflows which included Session, Command, E-mail, Event-Wait etc.
  • Worked with complex queries for data validation and reporting using SQL and PL/SQL (a sample validation query follows this list).
  • Moved bulk data from various sources to the Teradata database using BTEQ scripts.
  • Wrote the BTEQ scripts needed to meet the business specifications.
  • Transferred data using Teradata utilities such as SQL Assistant, FastExport, FastLoad and MLoad.
  • Involved in Performance Tuning of SQL Queries, Sources, Targets and sessions by identifying and rectifying performance bottlenecks.
  • Worked on WLM to automate and Schedule the Workflows and jobs.
  • Tested all the mappings and sessions in the Development and UAT environments and migrated them to the Production environment after successful testing.
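
A minimal sketch of the kind of data validation query referenced above, reconciling row counts and a key measure between staging and the loaded target; STG_MEMBER_PREMIUM and EDW_MEMBER_PREMIUM are hypothetical names:

    -- Compare row counts and a summed measure between staging and target for one load date.
    SELECT 'STG'    AS data_layer, COUNT(*) AS row_cnt, SUM(premium_amt) AS total_premium
      FROM stg_member_premium
     WHERE load_date = DATE '2009-01-31'
    UNION ALL
    SELECT 'TARGET' AS data_layer, COUNT(*) AS row_cnt, SUM(premium_amt) AS total_premium
      FROM edw_member_premium
     WHERE load_date = DATE '2009-01-31';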

Environment: Informatica Power Center 8.6.1/8.1.1, Power Exchange, Teradata V12.0, SQL Server 2008, Oracle 11g/10g SOA, TOAD, SSH, WLM, UNIX AIX, Windows XP, Autosys, Data Explorer v9x, SFTP.

Confidential, Cincinnati, OH May’07 – Feb’08
Role: ETL Developer

P&G (Procter & Gamble) is a uniquely diversified consumer products company with a strong global presence. Established in 1837, the Procter & Gamble Company began as a small, family-operated soap and candle company in Cincinnati, Ohio, USA. PLIM (Post Launch Initiative Management) is a company-wide data mart initiative to track initiatives across various sites around the globe. The data sources come from different systems, including GBP_NG, Symphony, SIDI databases and data files, which are used to derive the complex metrics that track the initiatives. As a typical data mart, PLIM is classified into two streams: ETL (Extraction, Transformation and Loading) using Informatica and analytical reporting using Siebel Analytics. PLIM helps business users track the performance of each site after the launch of an initiative.

Responsibilities:

  • Used update strategy to effectively migrate data from source to target.
  • Moved the mappings from development environment to test environment.
  • Interacted with the business community and database administrators to identify the business requirements and data realities.
  • Created design documents for Informatica mappings based on business requirements.
  • Created Informatica mappings using various transformations like Joiner, Aggregator, Expression, Filter and Update Strategy.
  • Involved in the performance improvement project.
  • Involved in designing of testing plan (Unit Testing and system testing).
  • Tested scripts by running workflows and assisted in debugging the failed sessions.
  • Improving workflow performance by shifting filters as close as possible to the source and selecting tables with fewer rows as the master during joins.
  • Used persistent lookup caches whenever lookup data needed to be retained across workflow runs.
  • Used connected and unconnected lookups whenever appropriate, along with the appropriate caches (a SQL-equivalent sketch follows this list).
  • Created tasks and workflows in the Workflow Manager and monitored the sessions in the Workflow Monitor.
  • Performed maintenance, including managing space, removing bad files, removing cache files and monitoring services.
  • Set up Permissions for Groups and Users in all Development Environments.
  • Migration of developed objects across different environments.
  • Also conducted and led team meetings and provided status reports to the project manager.
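
A SQL-equivalent view of the connected lookup pattern above: the lookup resolves a natural key to a dimension surrogate key, defaulting to an "unknown" member when nothing matches. The STG_INITIATIVE and SITE_DIM names and the -1 default are hypothetical:

    -- Roughly what a connected lookup on the site dimension does, expressed as a join.
    SELECT s.initiative_id,
           s.launch_date,
           NVL(d.site_key, -1) AS site_key
      FROM stg_initiative s
      LEFT OUTER JOIN site_dim d
        ON d.site_code = s.site_code;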

Environment: Informatica Power Center 7.1, Oracle 9i, PL/SQL, Windows, ERWIN, PL/SQL Developer, DB2 and UNIX.

Confidential, Columbus, OH Mar’06 – Apr’07
Role: ETL Developer
Project Title: Application Development and Maintenance

UBS Bank is one of the largest banks in the US and intends to transform its enterprise-wide data warehousing, data movement and analytics technology environment to improve data quality and access controls, and to instantiate common data routines through the Data Integration (DI) architecture, with a view to corresponding cost reduction based on the simplification. The Information and Analytics Foundation (IAF) Program comprises two major components. One is the Application Development and Maintenance (ADM) component, which transfers the support of these environments to IBM, who will run the current environment while it is transformed from its current state and will maintain that support responsibility in the transformed state. The other is a Foundation component, which will transform the current “W” data warehouse environment and the current Bacardi environment onto a single consolidated platform with a standard data model, data integration tools and metadata tools.

Responsibilities:

  • Monitoring and tracking the applications as per the scheduled runs.
  • Handling the user queries and tickets for Informatica applications.
  • Mainly involved in ETL design & development.
  • Mainly involved in the preparation of low-level designs (mapping documents) by understanding the CPMS system.
  • Responsible in developing the ETL logics and the data maps for loading the tables.
  • Responsible for migrating code from Development to Staging and to Production (deployments).
  • Responsible for gathering and analyzing business requirements from business users.
  • Designed mappings for populating tables for one-time and incremental loads.
  • Reloading application runs and verifying that the reload process completes.
  • Restarting application runs and verifying that the restart process completes.
  • Running catch-up loads for the applications and verifying that the catch-up process completes (a reload sketch follows this list).
  • Analyze functional requirements provided by Business Analysts for the code changes.
  • Create workflows based on Dependencies in Informatica for the code changes.
  • Unit test the data mappings and workflows.
  • Validate the data loaded into database.
  • Providing status reports on all application monitoring and tracking.
  • Execute the Test cases for the code changes.
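
A hedged sketch of the reload/catch-up pattern above: the slice of business dates being reprocessed is cleared and re-inserted so that a rerun cannot double-count. DI_TXN_FACT and STG_DI_TXN are hypothetical names, as is the date range:

    -- Clear the business dates being caught up, then re-insert them from staging.
    DELETE FROM di_txn_fact
     WHERE business_date BETWEEN DATE '2006-10-01' AND DATE '2006-10-03';

    INSERT INTO di_txn_fact (account_key, business_date, txn_cnt, txn_amt)
    SELECT s.account_key,
           s.business_date,
           COUNT(*),
           SUM(s.txn_amount)
      FROM stg_di_txn s
     WHERE s.business_date BETWEEN DATE '2006-10-01' AND DATE '2006-10-03'
     GROUP BY s.account_key, s.business_date;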

Environment: Teradata, Informatica 7.1, Windows 2000 and UNIX (server)

Confidential, India Jan 2005 – Feb 2006
Role: Oracle Developer

MIC Electronics Limited is a global leader in the design, development and manufacturing of LED video displays, high-end electronic and telecommunications equipment, and the development of telecom software.

Responsibilities:

  • Created database objects for the system such as tables, views, sequences, functions, synonyms, indexes, triggers, packages and stored procedures (a DDL sketch follows this list).
  • Developed Front-end forms using the logic such as Master-detail and complex forms for the application.
  • Loaded data into various tables from Access database as well as from Excel Spreadsheets using SQL*Loader.
  • Developed stored procedures, packages and functions to get data into Forms & Reports.
  • Developed complex tabular reports for the systems using running totals, subtotals, grand totals and daily balances.
  • Developed new Data Input forms and Reports using Oracle Forms and Reports builder.
  • Wrote scripts to create temporary tables in various schemas and to audit transactions.
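
A small, hypothetical example of the kind of Oracle DDL written for the database objects above (the names are illustrative only, not the actual application schema):

    -- Table with a primary key, a supporting sequence and index, and a summary view.
    CREATE TABLE display_order (
        order_id     NUMBER(10)   NOT NULL,
        customer_id  NUMBER(10)   NOT NULL,
        order_date   DATE         DEFAULT SYSDATE,
        order_amount NUMBER(12,2),
        CONSTRAINT pk_display_order PRIMARY KEY (order_id)
    );

    CREATE SEQUENCE display_order_seq START WITH 1 INCREMENT BY 1;

    CREATE INDEX ix_display_order_cust ON display_order (customer_id);

    CREATE OR REPLACE VIEW v_customer_order_totals AS
    SELECT customer_id,
           COUNT(*)          AS order_cnt,
           SUM(order_amount) AS total_amount
      FROM display_order
     GROUP BY customer_id;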

Environment: Oracle8i, Forms 4.5 and Reports 2.5, SQL, PL/SQL, SQL*Loader, Toad, Windows 2000

Education:

Bachelor’s in Computer Engineering.
