Informatica Developer Resume Profile
Professional Summary
- 9 years of experience in data warehouse and data migration projects using the ETL tool Informatica PowerCenter 9.x/8.x, Power Exchange 8.6.1, Oracle, DB2, Teradata, and UNIX.
- Demonstrated expertise utilizing ETL tools (Informatica 9.6.1/9.1/8.x, Informatica Data Quality) and RDBMS platforms (Oracle, DB2, and Teradata).
- Worked on all phases of the data warehouse development lifecycle, from ETL architecture and design through implementation and support of new and existing applications.
- Excellent technical and analytical skills with a clear understanding of ETL design and project architecture based on reporting requirements.
- Experience in designing ETL components to meet performance requirements and in enhancing the overall Informatica ETL architecture.
- Experience in designing and implementing complex ETL solutions such as dynamic aggregation, dynamic sorting, and dynamic expression evaluation.
- Working experience in reading data from various sources: SAP using the SAP connector (file mode and stream mode), mainframe using IBM MQ Series, real-time CDC data using Power Exchange, and XML sources using the XML Source Qualifier.
- Good experience in the Teradata RDBMS using FastLoad, MultiLoad, FastExport, Teradata SQL Assistant, and BTEQ.
- Experience in Informatica Data Quality transformations such as Match, Consolidation, Exception, Parser, Standardizer, and Address Validator.
- Experience in Informatica partitioning using various partition techniques, including dynamic partitioning.
- Experience in advanced workflow concepts such as pushdown optimization and concurrent workflow execution in Informatica.
- Extensively involved in optimization and tuning of mappings and sessions in Informatica by identifying and eliminating bottlenecks.
- Experience working in an onsite-offshore model, effectively coordinating tasks between onsite and offshore teams.
- Experience in developing best practices and quality assurance standards for ETL and BI solutions; working knowledge of the reporting tool Cognos.
- Excellent problem-solving skills with a strong technical background and good interpersonal skills; a quick learner and an excellent team player.
Experience Profile
- Working as a Programmer Analyst at R2 Technologies LLC from July 2014 to present.
- Worked as an Associate Consultant at Capgemini from March 2011 to June 2014.
- Worked as a Software Delivery Engineer at Mphasis (an HP company) from December 2009 to February 2011.
- Worked as a Programmer Analyst at Solutions Delivery Inc. from January 2006 to November 2009.
Technical Skills
- ETL Tools: Informatica PowerCenter 9.6.1, 9.1, 8.x, 7.1
- Databases: Oracle, DB2, Teradata 12, SQL Server
- Operating Systems: UNIX, Windows
- Languages: SQL, PL/SQL, Shell Scripting
- Other Tools: Cognos 8.1, Informatica Power Exchange 8.6.1, Informatica Data Quality (IDQ), FastLoad, MultiLoad, FastExport, Autosys, SVN, Control-M, HPQC
- Methodologies: Star Schema, Snowflake Schema
Work Experience
Client: Confidential
Duration: Confidential
Environment: PowerCenter 9.6.1, Informatica Data Quality 9.6.1, DB2, UNIX, MSTR
Project Overview
The Zurich Data Warehouse (ZDW) is an integral part of Zurich's BI application. The ZDW is a centralized data warehouse that stores all incoming data from PSA and other external sources. This data is used to periodically feed various data marts, which are then used by the business to generate reports.
Responsibilities
- Working on application enhancements.
- Creating design documents and implementation, unit, and application test plans for new work orders.
- Creating the Design Instruction document containing the ETL solution to meet performance requirements, taking inputs from the SRS document.
- Creating the Deployment Order and Application Plan documents as part of the design deliverables.
- Integrated various data sources, including DB2, SQL Server, VSAM COBOL files in EBCDIC format, and sequential files, into the staging area.
- Designed and developed ETL and Data Quality mappings to load and transform data from the sources to the DWH using PowerCenter and IDQ.
- Worked with Informatica Data Quality transformations such as Match, Consolidation, Exception, Parser, Standardizer, and Address Validator.
- Developed the Match and Merge strategy and match-and-consolidation rules based on the customer requirements and the data.
- Built several reusable components in IDQ using Parsers, Standardizers, and reference tables.
- Performed analysis, design, and programming of ETL processes.
- Creating the test plan document containing positive and negative test cases.
- Creating understanding documents for existing applications.
- Upgraded the Informatica PowerCenter client from 9.1 to 9.6.1 and prepared a step-by-step guide for the onsite/offshore team.
- Supporting existing applications: monitoring, responding to, and assigning existing and new incidents using the Remedy tool.
Client: Confidential
Duration: Confidential
Environment: PowerCenter 9.1, DB2, SAP, UNIX, HPQC
Project Overview
The Finance Hub, part of the HUDSON program, is a technical tool that supports business processing by providing a structured interface layer connecting various source systems to the SAP ECC system. These business processes are homogenized activities resulting from the standardization of the general ledger and reporting, the common chart of accounts, centralized governance, and the consolidation of various IT systems into one finance platform.
Responsibilities
- Participated in the initial project planning and knowledge transfer sessions.
- Led a team of five developers and assigned tasks to them.
- Created detailed technical design specification documents for offshore ETL developers based upon the high-level specifications provided.
- Prepared the technical solution document and ETL mapping specification documents.
- Worked on the design and implementation of various stages.
- Worked on fast closure of UAT defects and performance tuning of the code, and presented the code to the client.
- Created ETL workflows, sessions, and mappings.
- Created ETL design documents and support documents.
- Prepared the test case document.
- Mentored team members and provided technical guidance to the project team for timely query/issue resolution.
- Performed tuning of SQL and Informatica objects, which involved tuning sources and targets, adding parallel processing, and modifying the Informatica code for optimal performance.
- Involved in the ETL process from development through testing and production environments.
- Provided post-implementation support to the production support team.
Client: First Data (Confidential)
Duration: Confidential
Environment: Informatica 9.0, Teradata 12, FastLoad, MultiLoad, FastExport, SQL Server, Tandem, BRIO
Project Overview
First Data International Germany is a major payments processor for banks, finance companies, and merchants across Europe. The client needed to build a data warehouse for both its customer-centric (KUZE) and card-centric (KAZE) data. This warehouse has two data marts, KAZE and KUZE. The existing DWH on the Tandem system was migrated to the Teradata platform, and the staging database, which was earlier on SQL Server, was migrated to Teradata. The existing DWH used C programs, Tandem scripts, and Perl scripts for the KAZE data mart and Informatica for the KUZE data mart; the new DWH implementation uses Informatica for both KAZE and KUZE.
Responsibilities
- Developed complex Informatica mappings using various transformations.
- Developed reusable code that could be used throughout the application.
- Provided estimates in man-hours for the ETL and database objects involved.
- Involved in unit testing and system integration testing.
- Created complex mappings such as incremental aggregation and Slowly Changing Dimensions.
- Coded using Teradata analytical functions and BTEQ SQL.
- Developed processes on Teradata using utilities such as MultiLoad, FastLoad, FastExport, and BTEQ.
- Heavily involved in writing complex SQL queries based on the given requirements; used volatile tables, temporary tables, and derived tables to break complex queries into simpler ones (see the sketch after this list).
- Fine-tuned transformations and mappings for better performance.
- Involved in debugging mappings using the Informatica Debugger.
- Prepared and executed test case scenarios.
- Tested the reports generated by the BRIO tool by comparing reports from the old and new systems.
- Mentored teammates on Informatica coding and testing.
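For illustration only, below is a minimal BTEQ sketch of the pattern described above: a volatile table breaks an expensive aggregation into simpler steps before the final load. The TDPID, credentials placeholder, and all object names are hypothetical and not taken from the project.

```sql
-- Hypothetical BTEQ script; logon details and object names are illustrative only.
.LOGON tdprod/etl_user,<password>;

-- Step 1: stage the expensive aggregation in a volatile table.
CREATE VOLATILE TABLE vt_txn_summary AS
(
    SELECT account_id,
           SUM(txn_amount) AS total_amount,
           COUNT(*)        AS txn_count
    FROM   stg.card_transactions
    GROUP  BY account_id
)
WITH DATA
ON COMMIT PRESERVE ROWS;

-- Step 2: use the simplified intermediate result for the final insert.
INSERT INTO dm_kaze.account_summary (account_id, total_amount, txn_count, load_dt)
SELECT account_id, total_amount, txn_count, CURRENT_DATE
FROM   vt_txn_summary;

.IF ERRORCODE <> 0 THEN .QUIT 8;
.LOGOFF;
.QUIT 0;
```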
Client: Confidential
Duration: Confidential
Environment: Informatica 8.6.1 (PowerCenter, PowerExchange), Oracle 10g, DB2, IBM MQ Series, ABC Framework, UNIX
Project Overview
JP Morgan Chase is one of the world's major investment banking firms. The Global Collateral Engine (GCE) is the central repository that stores integrated data coming from the BDAS-GCH and CCMS source systems. Data from the source systems is extracted to a staging area, transformed by applying the business logic, and loaded into the GCE system. Power Exchange and MQ Series are used to capture data changes in the source systems and load them into GCE in real time.
Responsibilities
- Developed both one-time and real-time mappings using PowerCenter 8.6.1, Power Exchange, and IBM MQ Series.
- Registered the data maps for real-time CDC (Changed Data Capture) data in Power Exchange; worked on extraction maps and Row Test in the Power Exchange Navigator.
- Worked on importing the MQ Series source and developing mappings using the MQ source.
- Worked on transformations such as Normalizer, dynamic Lookup, SQL, and Stored Procedure transformations for various business logic.
- Developed mapplets that were used in real-time and reprocess mappings.
- Developed mappings with a target update override in which a MERGE statement is used (see the sketch after this list).
- Involved in writing the SQL statements for the SQL override in the Source Qualifier as well as the target update override.
- Worked on unit testing of the data acquisition module.
- Involved in ABC Framework component development.
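For illustration only, the following is a minimal sketch of the kind of Oracle MERGE (upsert) logic that can be supplied through a target update override; in the mapping, the bind values would come from the target ports. The table and column names are hypothetical and not taken from the project.

```sql
-- Hypothetical upsert into a GCE-style target table; all names are illustrative.
MERGE INTO gce_position tgt
USING (
    SELECT :pos_id  AS position_id,   -- bound from the mapping's target ports
           :mkt_val AS market_value,
           :load_ts AS load_ts
    FROM   dual
) src
ON (tgt.position_id = src.position_id)
WHEN MATCHED THEN
    UPDATE SET tgt.market_value = src.market_value,
               tgt.load_ts      = src.load_ts
WHEN NOT MATCHED THEN
    INSERT (position_id, market_value, load_ts)
    VALUES (src.position_id, src.market_value, src.load_ts);
```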
Client: Confidential
Duration: Confidential
Environment: Informatica 8.1, Oracle 9i/10g, Mainframe, PL/SQL, Autosys, UNIX
Project Overview
The Abercrombie and Fitch Enterprise Warehouse (AFEW) was used to maintain and analyze various store needs and trends of Abercrombie & Fitch and to provide information related to various assets and their value/status, space, and clothing lines and trend information.
Responsibilities
- Developed or enhanced ETL mappings to extract data from the various sources and load it into Oracle databases or flat files as required.
- Performed error validation of the data moving from the various sources to the targets.
- Analyzed Informatica mappings, session-level properties, and logs for performance bottlenecks.
- Created mappings and reviewed technical specifications.
- Created complex jobs using Aggregator, Transformer, Filter, Joiner, and Lookup stages.
- Fine-tuned stages and jobs for better performance.
- Validated, ran, monitored, and tested the developed jobs.
- Validated the integrity of the data after a successful load.
- Involved in developing mappings, transformations, and workflows using Source Qualifier, Aggregator, connected Lookup, Expression, Filter, and Sequence Generator transformations.
- Tuned and tested the mappings using different logic to provide maximum efficiency, and implemented performance optimization for various components wherever applicable.
- Prepared the unit test case document.
Client: Confidential
Duration: Confidential
Environment: Informatica 7.1.4/8.0, Oracle, UNIX
Project Overview
Data warehouse implementation for a large polymers company covering its manufacturing, distribution, and sales analysis. The data warehouse was designed to generate reports for understanding product movement over multiple channels, channel-wise performance, etc.
The solution envisaged the creation of a central data repository with data from all the systems. The extraction, transformation, and loading of data were done after a thorough study of the systems. Further, analysis reports were created using Business Objects.
Responsibilities
- Heavily involved in ETL development.
- Created the mappings and was involved in extraction, transformation, and loading.
- Used Informatica to load data from Oracle and flat files into Oracle.
- Extracted the data from the source system and created the target database.
- Worked in the Informatica Source Analyzer and Mapping Designer and used various transformations such as Source Qualifier, Aggregator, connected/unconnected Lookup, Filter, Sequence Generator, and Joiner.
- Used SQL overrides in the Source Qualifier to meet business requirements (see the sketch after this list).
- Created targets and mapplets.
- Configured and scheduled sessions in the Workflow Manager.
- Created and tested sessions.
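For illustration only, below is a minimal sketch of a Source Qualifier SQL override of the kind described above: the join and date filter are pushed down to Oracle instead of being handled in the mapping. The table names and the $$LOAD_START_DATE mapping parameter are hypothetical and not taken from the project.

```sql
-- Hypothetical Source Qualifier override; all names and parameters are illustrative.
SELECT o.order_id,
       o.order_date,
       c.customer_name,
       o.order_amount
FROM   orders    o
JOIN   customers c
  ON   o.customer_id = c.customer_id
WHERE  o.order_date >= TO_DATE('$$LOAD_START_DATE', 'YYYY-MM-DD')
```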