Informatica Developer Resume
Charleston, WV
SUMMARY
- Over six years of IT experience in the design, development, implementation, and production support of data warehouse tools such as Informatica PowerCenter, PowerExchange, IDQ, and MDM, and of databases including DB2, Teradata, Oracle, and SQL Server on Unix.
- Extensive experience in ETL design, development, and support using Informatica PowerCenter 9.x/8.6/7.1.
- Extensive experience writing UNIX shell scripts for data warehousing tasks.
- Experience in Informatica Data Quality (IDQ) and Power Exchange.
- Interacted with business users for requirements analysis and to define business and functional specifications, using analysis techniques such as gap, risk, and data analysis.
- Well acquainted with performance tuning of sources, targets, mappings, and sessions to overcome bottlenecks in mappings.
- Extensive experience in extraction, transformation, and loading of data from heterogeneous source systems such as flat files, Oracle, Teradata, mainframe, DB2, MQ, and MS SQL Server.
- Deliver end - to-end Data Quality and ETL solutions.
- Led technical delivery and coordinated with other teams at American Express to ensure application and business goals were successfully met.
- Good understanding of star and snowflake schemas, dimensional modeling, relational data modeling, slowly changing dimensions, and data warehousing concepts.
- Proactively identify opportunities to automate the manual monitoring to prevent human errors and ensure application availability targets are met.
- Domain knowledge in healthcare, retail, finance, pharmacy, and insurance.
- Involved in all aspects of ETL: requirements gathering, defining standard interfaces for operational sources, data cleaning, devising data load strategies, and designing, developing, and unit testing mappings.
- Provide leadership in production support services including problem analysis, solution design, testing and implementation, by ensuring system/business requirements and interface dependencies are met in a timely manner with resilient high quality solutions.
- Good communication and organizational skills; Self-motivated, hardworking; Ability to work independently and in a cooperative team environment.
TECHNICAL SKILLS
Data Warehousing: Informatica PowerCenter 9.5/9.1.0/9.0.1/8.6.1 (Designer, Workflow Manager, Repository Server, Repository Manager, Server Manager, Mappings/Mapplets/Transformations), Ardent DataStage 6.0/5.1 (DataStage Administrator, DataStage Designer, DataStage Director, DataStage Manager), Power Exchange, Power Connect, Data Junction 7.5 (Map Designer, Process Designer, MetaData Query), Cognos.
Tools: HP Quality Center, Test Director, TOAD, SQL Assistant, Excel, Word.
Environment: UNIX, Sun Solaris, Windows 2003/XP/Vista.
Databases: Oracle 11g/10g/9i/8i, MS SQL Server 2000/7.0/6.5, Sybase SQL Server 12.0/11.x, DB2 UDB, MS Access 7.0.
Programming: SQL, Transact-SQL, PL/SQL, Unix Shell Scripting, C, C++, Java, VBScript, Visual Basic 6.0/5.0/4.0, HTML 4.0, DHTML.
PROFESSIONAL EXPERIENCE
Confidential, Charleston, WV
Informatica Developer
Responsibilities:
- Collected Business Analysis and Requirements by working with the business end users.
- Designed the ETL detailed design document, which serves as a guideline for ETL coding.
- Translated the higher-level model/design into Informatica ETL code.
- Created new mappings and altered existing ones for the Managed Care project.
- Involved in designing an error handling strategy for data validation, error reporting.
- Worked on Informatica PowerCenter 9 tools: Source Analyzer, Warehouse Designer, Mapping Designer, Workflow Manager, Mapplets, and Reusable Transformations.
- Populated data into staging tables, the DM schema, and the Salesforce schema from Oracle source systems (Proton and AMI), and applied business logic in transformation mappings for inserting and updating records.
- Used ETL extensively to load data from DB2 and Teradata databases to Oracle databases.
- Applied business logic in populating the necessary clinical facts and dimensions, and validated the bi-directional data by comparing it with the Salesforce data, using Relational Junction for replication.
- Created shell/Perl scripts for better handling of incoming source files: moving files between directories, extracting information such as dates from the names of continuously arriving source files, and cleaning blank XML tags in the post-session step to validate against the XSD.
- Using Informatica Designer, designed mappings that populated the data into the target star schema on an Oracle 10g instance.
- Created mappings and workflows to transform the extracted Teradata source data.
- Performed data manipulations using various Informatica Transformations like Joiner, Expression, Lookup, Aggregate, Filter, Update Strategy, Router, Normalizer, etc.
- Conducted SQL testing for DB sources for Insert and Update timestamps, Counts, Data definitions, Control, and Error logic for Facts and Reference tables.
- Involved in performance tuning of mappings and SQL statements, using query optimization and Explain Plan utilities for optimum performance, and used the Informatica Debugger.
- Used database objects such as materialized views, sequences, triggers, cursors, parallel partitioning, and stored procedures to handle complex logic and memory management.
- Applied appropriate field-level validations, such as date validations and default values, to cleanse the data.
- Maintained warehouse metadata, naming standards and warehouse standards for future application development.
- Prepared unit test plans and performed negative and positive testing.
- Implemented the Slowly Changing Dimension strategy for the Warehouse.
- Converted heavy SQL overrides in existing mappings of the QSR and CMS projects into transformation objects.
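The slowly changing dimension strategy mentioned above can be sketched outside Informatica roughly as follows. This is a minimal Type II illustration in Python; the row layout and the `plan` attribute are hypothetical, not taken from the actual warehouse.

```python
from datetime import date

def apply_scd2(dim_rows, incoming, today):
    """Type II SCD sketch: when a tracked attribute changes, expire the
    current dimension row (set end_date, clear the current flag) and
    insert a new version. Rows are dicts with keys:
    key, attrs (dict), eff_date, end_date, current.
    Note: existing current rows are expired in place."""
    out = list(dim_rows)
    current = {r["key"]: r for r in out if r["current"]}
    for row in incoming:
        existing = current.get(row["key"])
        if existing is None:
            # brand-new member: insert as the current version
            out.append({"key": row["key"], "attrs": row["attrs"],
                        "eff_date": today, "end_date": None, "current": True})
        elif existing["attrs"] != row["attrs"]:
            # attribute changed: close out the old version, open a new one
            existing["end_date"] = today
            existing["current"] = False
            out.append({"key": row["key"], "attrs": row["attrs"],
                        "eff_date": today, "end_date": None, "current": True})
        # unchanged rows are left untouched
    return out
```

In PowerCenter the same decision is typically made with a Lookup on the dimension, an Expression comparing attributes, and an Update Strategy routing rows to insert or update.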
Environment: Informatica Power Center 9.5, Oracle 10g/9i, HP Unix, Autosys Scheduler, HP Quality Center, TOAD
Confidential, Detroit, MI
Informatica Developer
Responsibilities:
- Collected data source information from all the legacy systems and existing data stores.
- Designed, configured, and tested MDM solutions.
- Created data maps in Informatica Power Exchange to extract the Mainframe sources across all subject areas.
- Worked extensively on different types of transformations like Expression, Filter, Aggregator, Lookup, Joiner, Sequence generator and Router.
- Developed complex mappings using multiple sources and targets in different databases, Flat files and Mainframe files.
- Created Tivoli workflow job streams for each subject area and scheduled them per business requirements.
- Implemented CDC process across all subject areas using MD5 function in Informatica.
- Implemented ABC (Audit, Balance, and Control) architecture for all Informatica workflows across all subject areas.
- Implemented a four-step algorithm for the member-matching process to load accurate data into the EDW.
- Created stored procedures to generate the sequence numbers for Master data tables in landing area.
- Involved in performance tuning at the session level; tuned mappings by setting buffer and cache sizes, increasing commit intervals, etc.
- Worked within a team to populate Type II slowly changing dimension customer tables from several mainframe flat files.
- Provided 24/7 troubleshooting and support during operational cycles for the historical load process.
- Completed the Independent Health Industrial Certifications.
- Worked in different subject areas - Medical, Pharmacy, Dental, Vision, Medicare, Medicare Advantage, PPO, HMO, POS, Master Medical and Care Management.
- Studied the existing environment and gathered requirements by querying the clients on various aspects.
- Identified various Data Sources and Development Environment.
- Performed data modeling and design of the data warehouse and data marts using star schema methodology, with conformed and granular dimensions and fact tables.
- Prepared user requirement documentation for mapping and additional functionality.
- Analyzed current system and programs and prepared gap analysis documents.
- Prepared Conceptual Solutions and Approach documents and gave Ballpark estimates.
- Prepared Business Requirement Documents, Software Requirement Analysis and Design Documents (SRD) and Requirement Traceability Matrix for each project workflow based on the information gathered from Solution Business Proposal document.
- Worked on Inbound, Outbound and Carve-out Data Feeds and developed mappings, sessions, workflows, command line tasks etc. for the same.
- Coordinated and worked closely with legal, clients, third-party vendors, architects, DBAs, operations, and business units to build and deploy.
- Performed extensive Data Profiling using Informatica Data Explorer.
- Extensively used ETL to load data via PowerCenter/PowerExchange from source systems such as flat files and Excel files into staging tables, and then into the target Oracle database. Analyzed the existing systems and performed a feasibility study.
- Extracted Mainframe files using Informatica Power Exchange and transformed them to be loaded in ODS tables.
- Worked with various Informatica Transformations like Joiner, Expression, Lookup, Aggregate, Filter, Update Strategy, Stored procedure, Router and Normalizer etc.
- Worked with connected and unconnected Stored Procedure transformations for pre- and post-load sessions.
- Prepared technical specification to load data into various tables in Data Marts.
- Designed and Developed pre-session, post-session routines and batch execution routines using Informatica Server to run Informatica sessions.
- Documented the complete Mappings.
- Maintained the documents with various versions using PVCS.
- Worked extensively on Mappings, Mapplets, Sessions and Workflows.
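The MD5-based CDC process described above can be sketched as follows. This is an illustrative Python equivalent of hashing the tracked columns and comparing against the previously stored hash; the column and key names are hypothetical.

```python
import hashlib

def row_hash(row, cols):
    """Concatenate the tracked columns with a delimiter and MD5-hash
    them, mirroring an MD5(col1 || '|' || col2 ...) expression in the
    mapping. None is treated as an empty string."""
    joined = "|".join("" if row.get(c) is None else str(row[c]) for c in cols)
    return hashlib.md5(joined.encode("utf-8")).hexdigest()

def detect_changes(source_rows, target_hashes, key, cols):
    """Compare each source row's hash against the hash stored on the
    target: a missing key is an insert, a differing hash is an update,
    and a matching hash is ignored (no change)."""
    inserts, updates = [], []
    for row in source_rows:
        stored = target_hashes.get(row[key])
        if stored is None:
            inserts.append(row)
        elif stored != row_hash(row, cols):
            updates.append(row)
    return inserts, updates
```

Comparing one hash instead of every tracked column keeps the change-detection lookup small, which is the usual reason for this pattern in CDC loads.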
Environment: Informatica Power Center 9.5/9.1.0 (Repository Manager, Designer, Workflow Manager, Workflow Monitor), Power Exchange Navigator 8.6.1/8.1.1, Oracle 10g, TOAD, SQL Server 2005, SQL Plus, SQL Query Analyzer, SQL Developer, MS Access, Flat Files, AIX, Windows NT, Shell Scripting, Tivoli, Clear Quest, COBOL, JCL, PVCS.
Confidential, Auburn Hills, MI
ETL/Informatica Developer
Responsibilities:
- Analysis of business requirements for end-to-end ETL process.
- Worked closely with business analysts to understand the business needs for decision support data.
- Analyzed and Created Facts and Dimension Tables for star schema.
- Performed extraction, transformation, and loading using Informatica PowerCenter to build the data mart.
- Developed sessions and batches for all mappings to load data from source flat files and tables into target tables.
- Defined data management strategies and standards, addressed issues of data ownership and governance, and designed the MDM process for data import into MDM.
- Creation of Transformations like Sequence generator, Lookup, joiner and Update Strategy transformations in Informatica Designer.
- Wrote shell scripts for data validation and reconciliation.
- Extensively worked on the performance tuning of mappings and sessions.
- Wrote stored procedures for dropping and recreating indexes for efficient Data Load.
- Designed and Developed pre-session, post-session routines and batch execution routines using Informatica Server to run Informatica sessions.
- Performed data modeling and design of the data warehouse and data marts using star schema methodology, with conformed and granular dimensions and fact tables.
- Prepared Unit Test Plans and performed Negative testing, Positive testing.
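The data validation and reconciliation scripts mentioned above usually boil down to comparing control totals between source and target after a load. A minimal Python sketch of that idea, with an illustrative amount column:

```python
def reconcile(source_rows, target_rows, amount_col):
    """Post-load reconciliation sketch: compare the row count and the
    total of a monetary column between the source extract and the
    loaded target, returning a dict of named check results."""
    checks = {
        "row_count_match": len(source_rows) == len(target_rows),
        "amount_total_match": (
            round(sum(r[amount_col] for r in source_rows), 2)
            == round(sum(r[amount_col] for r in target_rows), 2)
        ),
    }
    # overall verdict: every individual check must pass
    checks["passed"] = all(checks.values())
    return checks
```

In practice the same counts and sums would come from SQL against the staging and target tables, with a shell wrapper failing the batch when a check does not pass.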
Environment: Informatica Power Center 9.1.0 (Repository Manager, Designer, Workflow Manager, Workflow Monitor), Oracle 10g, SQL Server 2005, SQL Developer, HP-Unix.
Confidential
ETL Developer
Responsibilities:
- Involved in the Analysis of Physical Data Model for ETL mapping and the process flow diagrams.
- Created mappings to extract data from data sources such as Oracle 9i and SQL Server.
- Used Informatica Repository Manager to grant permissions to users and to create new users and repositories.
- Extensively used mapping parameters, variables, and parameter files in complex ETL mappings.
- Worked extensively with complex mappings using Expression, Aggregator, Filter, Lookup, and Stored Procedure transformations to develop and feed the data warehouse.
- Accessed mainframe data through Power Exchange and generated files to upload to the reporting data mart.
- Used SQL and PL/SQL for the stored procedures, packages, and function overrides required for the NRD ETL process.
- Created worklets, tasks, timers, and reusable objects for scheduling the workflows.
- Involved in performance tuning of complex Informatica mappings for extracting and loading data from GBT, and resolved production issues for Phase 1A.
- Created SQL*Loader scripts and used SQL*Loader to load GBT cleanup files and for data validation.
- Performed data accuracy and data integration tests, and created test cases and test data scripts for NRD Phases 1 and 1A.
- Involved in SQL query performance tuning by altering session parameters, using explain plans, and using hints.
- Interacted with GBT business super users to identify business requirements and validate the NRD report specifications.
- Wrote UNIX shell scripts and used crontab to schedule the workflows.
- Analyzed and explored Cognos for reporting purposes
- Built an Analysis Services cube from a fact table with millions of rows and many dimensions, moving data from an Oracle database to SQL Server 2000 using Analysis Manager for SQL Server 2000.
- Extensively used MDX queries for structuring and performance optimization.
Environment: Informatica Power Center 8.6.1, Java, XML, XSL, C++, Sybase, UML, TOAD, Erwin 3.5.2, Oracle 9i/8i, SQL Server 2000, Win NT, Sun Solaris 5.8, Cognos 6.6, Unix AIX 5.1/4.3.