Sr. Informatica Developer Resume
PROFESSIONAL SUMMARY
- 8+ years of IT experience in Data Warehousing and Business Intelligence, with emphasis on business requirements analysis, application design, development, testing, implementation, and maintenance of client/server Data Warehouse and Data Mart systems in the Financial, Banking, Retail, Insurance, and Pharmaceutical industries.
- 7+ years of solid ETL (Extract, Transform, Load) data integration and Data Warehouse experience using Informatica PowerCenter 8.6/8.1/8.0/7.1.4/7.1.3/7.1.2/7.1.1/7.0/6.2/6.1/5.x: Repository Admin Console, Repository Manager, Designer, Workflow Manager, Workflow Monitor, Source Analyzer, Warehouse Designer, Transformation Developer, Mapplet Designer, Mapping Designer, Workflow Designer, Task Developer, Worklet Designer, Task View, and Informatica PowerConnect for SAP/PeopleSoft/Siebel.
- Experience in performance tuning of Informatica sources, targets, mappings, transformations, and sessions.
- Expertise in Data Warehousing, Data Migration, Data Modeling, and Data Cleansing
- 5+ years of experience with databases such as Oracle 8i/9i/10g, SQL Server 2000/2005, DB2, and Oracle Siebel 8.1, and in writing SQL and PL/SQL packages, procedures, functions, triggers, and cursors per business needs.
- Solid experience in Data Warehousing concepts such as Ralph Kimball methodology, Bill Inmon methodology, OLAP, OLTP, Star Schema, Snowflake Schema, Fact tables, Dimension tables, logical data modeling, physical modeling, and dimensional data modeling using ERwin 4.2/4.0/3.5.5/3.5.2 and Microsoft Visio.
- Experience identifying, researching, and resolving root causes of ETL production issues.
- Good knowledge of UNIX and shell scripting.
- Experience in maintenance, enhancements, performance tuning of ETL code.
- Extensively worked on developing Informatica Mappings, Mapplets, Sessions, Workflows and Worklets for data loads.
- Experienced with PowerExchange and PowerConnect to connect to and import sources from external systems like Mainframe (DB2, VSAM), SAP R/3, etc.
- Experienced in using Query Studio and Report Studio (Cognos) to create ad-hoc and complex reports.
- Developed test cases for business and user requirements to perform System, Integration, and Performance testing.
- Continuously monitored the accuracy of the data and the content of delivered reports.
- Carried out training sessions for end-users (Line of Business).
- Excellent communication and interpersonal skills; able to work effectively both as a team member and individually.
TECHNICAL SKILLS
- Operating Systems: Windows (NT, 2000, 2003, XP), Linux (Red Hat), UNIX (Solaris, AIX)
- Languages: SQL, Oracle PL/SQL , T-SQL , Unix Shell Scripting, XML, C, Verilog
- ETL Tools: Informatica 8.x/7.1/6.2/5.1, PowerConnect/PowerExchange
- OLAP Tools: Cognos BI 8.x/7.x, MS OLAP, Business Objects
- Data Modeling: ERwin 3.x/4.x (ERD)
- Databases: Oracle 8i/9i/10g, MS SQL Server 2000/2005, IBM DB2, MS Access
- Job Schedulers: AutoSys, Control-M
- Version Control: MS Visual SourceSafe (VSS)
Confidential, Pleasanton, CA Jan ’09 – Dec ’09
Sr. Informatica Developer
Confidential is an integrated managed care organization based in Oakland, California, founded in 1945. The client is a consortium of three distinct groups of entities: a foundation health plan with its regional operating subsidiaries, foundation hospitals, and autonomous regional medical groups.
The ETL layer of the Incentive System imports data from various source systems into the Comp Central Database. Data is loaded on a scheduled basis each month and transformed according to a set of business rules. In addition, the ETL layer exports several files, including the Payroll File, HR Executive Comp File, and Cross-Sell Extract File.
Responsibilities:
- Interacted with Business Analyst / Line of Business to understand requirements and translated them into the appropriate technical solution
- Extensively worked on developing Informatica Mappings, Mapplets, Sessions, Workflows and Worklets for data loads.
- Responsible for mentoring Developers and Code Review of Mappings developed by other developers
- Extracted data from various heterogeneous sources like Oracle, SQL Server, and Flat Files.
- Responsible for Production Support and Issue Resolutions using Session Logs, and Workflow Logs
- Extensively used various active and passive transformations, such as Filter, Router, Expression, Source Qualifier, Joiner, Lookup, Update Strategy, Sequence Generator, Rank, and Aggregator.
- Developed PL/SQL packages, procedures, functions, triggers, and cursors per the business requirements
- Responsible for best practices like naming conventions, Performance tuning, and Error Handling
- Responsible for performance tuning at the source, target, mapping, and session levels
- Solid expertise in using both Connected and Unconnected Lookup transformations
- Extensively worked with various lookup caches like Static Cache, Dynamic Cache, and Persistent Cache
- Developed reusable transformations and reusable mapplets
- Worked with shortcuts across shared and non-shared folders
- Developed Slowly Changing Dimension mappings for Type 1 and Type 2 SCDs (a SQL sketch of the Type 2 pattern follows this list)
- Responsible for implementing Incremental Loading mappings using Mapping Variables and Parameter Files
- Responsible for identifying and resolving performance bottlenecks
- Used Update Strategy expressions (DD_INSERT, DD_DELETE, DD_UPDATE, and DD_REJECT) to insert, delete, update, or reject rows based on the requirement
- Extensively used ETL to load data from a wide range of sources, such as flat files (CSV, fixed-width, or delimited)
- Worked with session logs and workflow logs for error handling and troubleshooting in all environments
- Responsible for unit testing and integration testing of mappings and workflows
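For illustration, a minimal SQL sketch of the Type 2 SCD pattern the mappings above implemented. All table and column names here (dim_member, stg_member, eff_start_dt, and so on) are hypothetical, not the project's actual schema; in the real mappings this logic was expressed with a Lookup, an Expression, and Update Strategy flags rather than hand-written SQL.

```sql
-- Hypothetical Type 2 SCD logic, shown as SQL for clarity.

-- 1. Expire the current row for members whose tracked attributes changed.
UPDATE dim_member d
SET    d.eff_end_dt   = TRUNC(SYSDATE) - 1,
       d.current_flag = 'N'
WHERE  d.current_flag = 'Y'
AND    EXISTS (SELECT 1
               FROM   stg_member s
               WHERE  s.member_id = d.member_id
               AND   (s.plan_cd   <> d.plan_cd OR
                      s.region_cd <> d.region_cd));

-- 2. Insert a fresh current row for changed and brand-new members.
INSERT INTO dim_member
       (member_key, member_id, plan_cd, region_cd,
        eff_start_dt, eff_end_dt, current_flag)
SELECT dim_member_seq.NEXTVAL, s.member_id, s.plan_cd, s.region_cd,
       TRUNC(SYSDATE), DATE '9999-12-31', 'Y'
FROM   stg_member s
WHERE  NOT EXISTS (SELECT 1
                   FROM   dim_member d
                   WHERE  d.member_id    = s.member_id
                   AND    d.current_flag = 'Y'
                   AND    d.plan_cd      = s.plan_cd
                   AND    d.region_cd    = s.region_cd);
```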
Environment: Informatica 8.1/8.5, Cognos 8 BI/7.x, Cognos ReportNet 1.1, ERwin 4.x, SQL Server 2000, DTS, Teradata V2R5, Oracle 9i, PL/SQL, Windows 2003, Red Hat Enterprise Linux 4.0, UNIX Shell Scripting.
Confidential, NY Oct ’07 – Dec ’08
Informatica Developer
Confidential is the largest distributor of healthcare products and services to office-based practitioners. Customers include dental practices and laboratories, physician practices, and animal health clinics, as well as government and other institutions. The company operates its four business groups - Dental, Medical, International, and Technology - through a centralized and automated distribution network, which provides customers in more than 200 countries with a comprehensive selection of more than 90,000 national and private-brand products in stock, as well as over 100,000 additional products available as special-order items.
The main objective of the project was to implement a sales data mart for the insurance products. Informatica PowerCenter 7.0/6 was used as the ETL tool to extract data from the source systems and load the targets. The source systems were Oracle, flat files, and SQL Server; the targets were Oracle and flat files. Implemented various loading techniques, such as Slowly Changing Dimensions and incremental loading.
Responsibilities:
- Designed ETL mappings for CDC (change data capture)
- Analyzed functional specifications and prepared technical specifications accordingly
- Communicated regularly with the client for issue resolution via conference calls, email, and telephone
- Extensively used various transformations like Filter, Router, Sequence Generator, Look Ups, Update Strategy, Joiner, Source Qualifier, Expression, Sorter, and Aggregator.
- Extensively used mapping variables, mapping parameters, and parameter files for capturing delta loads (see the SQL sketch after this list)
- Worked with various tasks like Session, Email, and Command, as well as workflows and worklets
- Worked with the Informatica Scheduler for scheduling the delta loads and master loads
- Extensively worked with aggregate functions like Avg, Min, Max, First, Last, and Count in the Aggregator Transformation.
- Extensively used SQL Override, Sorter, and Filter in the Source Qualifier transformation
- Extensively used Normal Join, Full Outer Join, Detail Outer Join, and Master Outer Join in the Joiner transformation
- Extensively worked with various re-usable tasks, workflows, Worklets, mapplets, and re-usable transformations
- Implemented various optimizations, such as increasing the DTM buffer size, database estimation, incremental loading, incremental aggregation, validation techniques, and load efficiency
- Worked with Slowly Changing Dimensions Type 1, Type 2, and Type 3
- Performance tuning of the process at the mapping level, session level, source level, and the target level
- Extensively used Mapplets for use in mappings thereby saving valuable design time and effort
- Responsible for Production Support.
- Extensively worked with various Look up Caches like Static, Dynamic, Persistent, and Shared Caches.
- Worked with session logs, the Informatica Debugger, and performance logs for error handling when workflows or sessions failed
- Performance-tuned ETL programs; maintained and enhanced ETL code
- Worked extensively with the business intelligence team to incorporate any changes they needed in the delivered files
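As a hedged illustration of the delta-load pattern above: the Source Qualifier SQL override looked roughly like the sketch below, where $$LAST_EXTRACT_DATE is a mapping variable resolved from the parameter file before the session runs. The table and column names (sales_orders, last_update_ts) are hypothetical.

```sql
-- Hypothetical Source Qualifier override for incremental (delta) extraction.
-- $$LAST_EXTRACT_DATE is an Informatica mapping variable supplied via the
-- parameter file; only rows changed since the last successful run are read.
SELECT ord.order_id,
       ord.customer_id,
       ord.order_amt,
       ord.last_update_ts
FROM   sales_orders ord
WHERE  ord.last_update_ts >
       TO_DATE('$$LAST_EXTRACT_DATE', 'YYYY-MM-DD HH24:MI:SS')
```

After a successful run, SETMAXVARIABLE in an Expression advances $$LAST_EXTRACT_DATE, so the next session picks up only new or changed rows.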
Environment: Informatica PowerCenter 7.1/6.2 (Repository Manager, Designer, Workflow Manager, and Workflow Monitor), Oracle 9i, SQL Server, CA AutoSys, Teradata (TPump, FastLoad, MultiLoad), Business Objects, UNIX, Windows XP.
Confidential, NJ Jan ’05 – Aug ’06
ETL Informatica Consultant
Confidential traces its roots back 90 years to when an American entrepreneur founded the company's earliest predecessor in Shanghai. What began as a small insurance business grew to become one of the world's largest companies. By the end of 2007 the company had assets of approximately $1 trillion, $110 billion in annual revenues, 74 million customers, and 116,000 employees in 130 countries and jurisdictions. Yet, less than a year later, it found itself on the brink of failure and in need of emergency government assistance.
There are many subject areas at the client, such as Sales Order; we worked in the Financial subject area, which comprises the FDR module of the project. SAP, XML, and flat files are the source systems from which data is pulled into staging; from staging, the required data cleansing is done, calculations are performed, and the results are stored in the respective dimension and fact tables in the warehouse.
Responsibilities:
- Data Quality Analysis to determine cleansing requirements.
- Worked on Informatica PowerCenter tools - Source Analyzer, Warehouse Designer, Mapping and Mapplet Designer, and Transformation Developer.
- Extracted data from SAP system to Staging Area and loaded the data to the target database by ETL process using Informatica Power Center.
- Designed and developed complex Informatica mappings using transformations such as Normalizer, Router, Lookup (connected and unconnected), and Update Strategy for various data loads.
- Designed and developed various PL/SQL stored procedures to perform calculations related to fact measures. Converted the PL/SQL procedures to Informatica mappings, and also created procedures at the database level for optimum mapping performance (a hedged PL/SQL sketch follows this list).
- Performance-tuned ETL programs; maintained and enhanced ETL code
- Tuned Informatica mappings and sessions for better performance. Created sessions, workflows, and post-session email tasks, and performed various workflow monitoring and scheduling tasks
- Performed Unit testing and maintained test logs and test cases for all the mappings.
- Maintained warehouse metadata, naming standards and warehouse standards for future application development.
- Translated high-level design specifications into ETL code, following mapping standards.
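A minimal PL/SQL sketch of the kind of fact-measure procedure described above. The procedure, table, and column names (calc_net_revenue, fact_sales, and so on) are assumptions for illustration, not the project's actual objects.

```sql
-- Hypothetical procedure: derives a net-revenue measure on the fact table
-- for one reporting period.
CREATE OR REPLACE PROCEDURE calc_net_revenue (p_period_key IN NUMBER) IS
BEGIN
    UPDATE fact_sales f
    SET    f.net_revenue_amt = f.gross_revenue_amt
                             - f.discount_amt
                             - f.return_amt
    WHERE  f.period_key = p_period_key;

    COMMIT;
EXCEPTION
    WHEN OTHERS THEN
        ROLLBACK;
        RAISE;   -- let the calling session/workflow see the failure
END calc_net_revenue;
/
```

Procedures like this were either invoked from a Stored Procedure transformation or, as noted above, converted into equivalent transformation logic inside the mapping, whichever performed better.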
Environment: Windows XP, Informatica 6.2, Oracle 8i/9i, MS SQL Server 2000.
Confidential, India Sept ’02 – Dec ’04
ETL Developer
Confidential is one of the most prominent banks in India, with total assets of Rs. 1,43,146 crores as of March 31, 2007.
Product and sales data is kept at the respective branches and collected from different OLTP systems on a daily, weekly, monthly, quarterly, and yearly basis for each sales region, factory, or branch, then manually consolidated at the corporate office and aggregated per the reporting requirements.
Responsibilities:
- Data Quality Analysis to determine cleansing requirements.
- Developed mappings to populate reference data tables, which provide codes and descriptions for dimension tables in the database (see the SQL sketch after this list).
- Developed complex mappings in Informatica to load data from various sources using transformations such as Source Qualifier, Lookup (connected and unconnected), Expression, Aggregator, Update Strategy, Sequence Generator, Joiner, Filter, Rank, and Router
- Implemented efficient and effective performance tuning procedures.
- Fixed invalid mappings; tested stored procedures and functions; performed unit testing of Informatica sessions and workflows.
- Used the Debugger wizard to remove bottlenecks at the source, transformation, and target levels for optimum usage of sources, transformations, and target loads.
- Accessed source data and addressed data quality issues in legacy source applications
- Created sessions and reusable worklets in Workflow Manager and scheduled sessions to run at a specified frequency.
- Used session logs to debug sessions. Executed sessions both sequentially and concurrently for efficient execution of mappings, and used other tasks like Event-Wait, Event-Raise, Email, Command, and pre/post-session SQL.
- Developed UNIX shell scripts using pmcmd to start and stop sessions.
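To illustrate the reference-data bullet above, a hedged sketch of the join the mapping's lookup effectively performs when attaching descriptions to codes; stg_sales, ref_branch, and the column names are hypothetical.

```sql
-- Hypothetical: the join a connected Lookup on the reference table
-- effectively resolves to during the dimension load.
SELECT s.txn_id,
       s.branch_cd,
       NVL(r.branch_desc, 'UNKNOWN') AS branch_desc  -- default for unmatched codes
FROM   stg_sales s
LEFT OUTER JOIN ref_branch r
       ON r.branch_cd = s.branch_cd
```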
Environment: Informatica Power Center 5.1, Oracle 9i, SQL, Windows 2000, UNIX.
Confidential, India Mar ’01 – Aug ’02
ETL/SQL Developer
Confidential is India's largest private-sector enterprise, with businesses in the energy and materials value chain. The group's annual revenues are in excess of US$28 billion. The flagship company is a Fortune Global 500 company and the largest private-sector company in India.
The Product Profitability Reporting system project, commonly known as PPR, was conceived in 1988. The concept of the PPR system included two major components: PPR Accounting and Data Warehouse efforts. The data warehouse brings together many varied formats of transaction data into one consistent format, is reconciled to financial data, and is balanced from the transaction system to the warehouse. The PPR Accounting effort identified revenue and claims cost using a product matrix. This matrix includes SBU, Rating Line of Business, Risk, Fee Arrangement, Product, Network, Reinsurer, and Renewability. Business decisions were made to move all reporting (internal and external) from current sources to the data warehouse.
The data warehouse is the single source of data for this purpose.
Responsibilities:
- Responsibilities included designing documents based on the requirements.
- Performed extensive testing and wrote SQL queries to verify data loads.
- Performed unit testing at various levels of the ETL.
- Developed and implemented Informatica mappings for the different stages of the ETL process.
- Fixed invalid mappings and troubleshot technical problems with the database.
- Debugged sessions using session logs. Developed design documents for better understanding of existing ETL processes.
- Developed and implemented UNIX shell scripts for the start and stop procedures of the sessions
- Developed complex mappings in Informatica to load data from various sources using transformations such as Source Qualifier, Lookup (connected and unconnected), Expression, Aggregator, Update Strategy, Sequence Generator, Joiner, Filter, Rank, and Router
- Worked extensively with the business intelligence team to incorporate any changes they needed in the delivered files
- Involved in implementing user logins, roles, and profiles as part of security policies for various categories of users.
- Created database objects such as tables, views, indexes, stored procedures, triggers, and user-defined functions
- Wrote T-SQL queries to retrieve data.
- Performed SQL Server administrative tasks, including daily backup and recovery procedures.
- Worked with the development team writing functions in Visual Basic 6.0 for upload/download functionality, data transfer, and migration.
- Developed VB functions for encryption and decryption of passwords and usernames.
- Developed and analyzed Visual Basic 6.0 code for the front-end interface.
- Played an important role in ensuring timely delivery of the whole set of data and related queries to the Procurement Department.
- Studied and understood the Procurement Department's information requirements regarding vendors.
- Normalized data sets and redesigned the data from scratch, including logical and physical design.
- Created logical and physical data models using the ERwin tool.
- Loaded data using the BCP utility.
- Configured log shipping and created a standby server. Prepared estimates for database and table size growth.
- Tuned complex SQL queries (a hedged before/after sketch follows this list)
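As a hedged before/after example of that query tuning (vendor and po_header are hypothetical tables, not the project's actual schema): a correlated subquery evaluated once per row was rewritten as a single set-based aggregation and join.

```sql
-- Before (hypothetical): the subquery runs once for every vendor row.
SELECT v.vendor_id, v.vendor_name,
       (SELECT SUM(p.po_amt)
        FROM   po_header p
        WHERE  p.vendor_id = v.vendor_id) AS total_po_amt
FROM   vendor v;

-- After: aggregate once, then join; LEFT JOIN keeps vendors with no POs,
-- matching the NULL the scalar subquery would have returned.
SELECT v.vendor_id, v.vendor_name, t.total_po_amt
FROM   vendor v
LEFT JOIN (SELECT vendor_id, SUM(po_amt) AS total_po_amt
           FROM   po_header
           GROUP  BY vendor_id) t
       ON t.vendor_id = v.vendor_id;
```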
Environment: Windows XP, Informatica 5.1, SQL, Oracle 8i/9i, MS SQL Server 2000.