Informatica Tech Lead / Lead Developer Resume
Summary
- Over eight years of strong IT experience in the complete software development life cycle, including analysis, design, development, implementation, and testing of Data Warehouses in Windows and Unix environments
- Eight years of extensive industry experience managing complex, high-volume data using Informatica PowerCenter 8.x/7.1/6.2/5.1.2/5.1.1/4.7, PowerMart 7.1/6.2/5.1.1/5.0/4.7
- Proficient in implementing shell scripts, data cleansing, and stored procedures/triggers using PL/SQL
- Extensive experience with different source systems such as DB2, Oracle 10g/9i/8i, MS SQL Server 2000/7.0/6.5, Teradata, and Informix
- Experience in generating complex reports using Cognos, Business Objects, and MicroStrategy as Business Intelligence reporting tools
- Thorough understanding of Data Warehousing concepts with expert knowledge of Dimensional Data Modeling, Star schema, Snowflake schema, creation of Fact and Dimension tables, OLAP, and OLTP
- Experience in developing logical and physical data models using the Erwin 4.1/3.7 tool
- Excellent analytical, interpersonal, and communication skills; able to work in a team environment with minimal supervision; strong research and learning skills
- ETL Tools: Informatica PowerCenter 8.x/7.1/6.2/5.1.2/5.1.1/5.0, PowerMart 7.1/6.2/5.1.1/5.0/4.7.2
- Reporting Tools: Business Objects XI/6.0/5.1/5.0, Cognos 6.0, MicroStrategy 7.1
- Data Modeling: Erwin 4.1/3.7, Microsoft Visio 2000
- Database: Oracle 10g/9i/8i/8.0, MS SQL Server 2000/7.0/6.5, DB2, MS Access 2000/7.0/6.5, Teradata V2R3, Informix
- Servers: Oracle 9iAS, MTS, Microsoft IIS 6.0, Apache, Tomcat, MS Site Server.
- Operating Systems: Microsoft Windows XP/2003/2000/9x/NT, Windows 2000/2003 Server, Unix, Sun Solaris 5.8
- Programming Languages: SQL, PL/SQL, T-SQL, Unix Shell Scripting (Korn, Bourne), C, Java, Visual Basic 6.0/5.0, HTML, JavaScript, VBScript, XML
- Other Tools: SQL*Plus, SQL Loader, TOAD, MS Visual Studio
Education
Bachelor of Science in Electronics & Telecommunications Engineering
Professional Experience
Informatica Tech Lead/Lead Developer Jan 2006 - Present
Confidential, Madison, WI
American Family Insurance is a leading insurance company that offers auto, home, life & annuities, health, business, and farm & ranch insurance. It currently operates in 18 states spanning from Washington to Ohio. It is the nation's third-largest mutual property and casualty insurer and the 16th-largest property and casualty insurance company group.
Essential Responsibilities
- Worked on the initial phases of the software development life cycle by analyzing business requirements with the Business Analysts and reviewing the detailed design schema with the Modelers to gain a proper understanding of the requirements
- Worked with data modeler to design the fact and dimension tables.
- Designed the mapping specification, test plans, business documents, fact qualifier and walkthrough documents
- Worked on Requirement gathering, Analysis, Development and Production Support.
- Analyzed data, helped design the data model for the Enterprise Data Warehouse, worked with end users to help them understand the data, mentored them in the use of the reporting tool, documented the data definitions and Informatica mappings, and coded the mappings. Involved extensively in troubleshooting problems with the source data.
- Mentored AMFAM employees and consultants in Informatica PowerCenter, SQL and Business Objects usage and best practices.
Property Credit Into Rating (PCIR)
The project involved using credit and other information to rate a Property policy. This involved sending notifications to customers about rate increases, sending notifications to agents about changes in customers' premiums, tracking customer calls about inquiries, tracking the number of credit reports ordered by agents, etc. Due to the high data volume and client reporting needs, a Monthly Snapshot Data Mart was also created.
Auto Credit into Rating (ACIR)
American Family chose to implement new rating variables to more accurately price existing as well as new auto policies. The introduction of these new variables was to help the company change from a risk-selection company to a risk-pricing company. The reporting requirements of this project were similar to those of the PCIR project but more challenging because of the number and grouping of customers on the Auto policy as well as the huge volume of data.
Managing The Change
The project consisted of creating a new data mart, new reports, implementing a new tool (PDF Explode) and generating a weekly report for the agents. This would help the agents analyze how their business is impacted every week. This was all done within a very short timeframe to meet a business need of RPM.
SIS and PIE
The Sales Initiation Suite (SIS) project set the direction for the migration of quoting and applications to a web environment. The reporting needs required reading Enterprise XML, parsing the XML, and storing it in relational tables for Business Objects.
PIE involved building a property insurance estimate tool on amfam.com for home, renters, condo, and mobile home owners that gives site visitors the ability to quickly and easily get a premium estimate. This project allows customers to receive an anonymous premium estimate and drives new leads to an agent to complete the application and sale.
Responsibilities-
- Involved in the project as Lead Developer and Technical Lead.
- Led a team of Informatica and BO developers to gather and analyze requirements, create Fact Qualifiers, and develop Informatica and BO code.
- Worked on creating the Mapping Specifications, Source-to-Target document, Lineage Document, Test Plan and Test Results.
- Worked on Data Profiling, from individual fields up to database tables, to help model the Data Mart and ensure conformity across the business.
- Worked with Data Modelers to create the data model for the Enterprise Data Warehouse and the Data Mart
- Designed and developed data validation, load processes, and test cases using PL/SQL and SQL. Used the DB2 Optimizer to tune SQL, worked on complex SQL to load one of the fact tables, and was involved in designing the fact-load process to optimize the loads.
- Created XML parser to parse data using Enterprise XSD.
- Used XSLT to extract the required elements from the Enterprise XML for performance tuning
- Created Java Transformation to clean up the XML and call the XSLT functions.
- Implemented Java code in the Java transformation to handle loop logic and improve performance
- Used Type 1, Type 2, and Type 3 slowly changing dimension methodologies to keep track of historical data.
- Designed and was involved in loading exception (error) tables to test the source data
- Developed process and worktables to generate audit reports to check the data quality of Mart data.
- Created Monthly Snapshots of the Property and ACIR Marts so that the clients could have easier access to the data
- Helped the clients and BO developers understand the data needs and create BO universe and reports.
- Involved in Performance Tuning of the SQL and Informatica Mappings using Session Partitioning, Database Partitioning, tuning the transformations.
- Involved in designing, coding and testing bridge tables.
- Worked with the Business Objects team to generate reports for the clients. Also generated ad hoc reports for the clients.
- Created Mapplets to calculate Written and Earned Premiums, Exposures per policy and coverage.
- Worked with the scheduling team to ensure the correct dependencies were met and to run as many jobs as possible concurrently for faster delivery of the data
- Helped the Project Manager manage the changes requested by the clients and meet the deadlines.
- Mentored developers on Kimball methodologies and Informatica best practices for the Data Mart
- Created Universes using derived tables, implementing different loop-resolution techniques such as creating aliases and contexts.
- Used @Functions such as @Prompt (for user-defined queries), @Where (for creating conditional filters), and @Select (for creating duplicate objects) in the process of building the Universes.
- Created custom List of Values (LOVs) for better understanding of prompts by the end users.
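The audit-report pattern described above (checking mart row counts against the source before the data is released) can be sketched as a small shell routine. This is a hypothetical illustration, not project code: the function name, messages, and counts are invented, and in the real process the counts would come from DB2/Oracle queries rather than literals.

```shell
# Hypothetical sketch of a data-quality audit step: compare a source-side
# row count with the mart-side row count and flag any variance before the
# data is released to the clients.

audit_counts() {
  src_count=$1     # row count returned by the staging/source query
  mart_count=$2    # row count returned by the mart fact-table query
  if [ "$src_count" -eq "$mart_count" ]; then
    echo "AUDIT PASS: $src_count rows in both source and mart"
  else
    echo "AUDIT FAIL: source=$src_count mart=$mart_count"
    return 1
  fi
}

# In the real process the counts would come from the database,
# e.g. src_count=$(db2 -x "SELECT COUNT(*) FROM ..."); literals here.
audit_counts 1500 1500
```

A failing check returns a non-zero status, so a scheduler can hold downstream jobs until the variance is explained.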
Environment: Informatica PowerCenter 8.x/7.1.4, DB2, PL/SQL, Oracle 10g, WinSQL, Quest Central, Business Objects 6.0/XI, Java
Sr. Informatica Consultant Dec 2004 - Dec 2005
Confidential, New York
The project involved collecting, organizing, and maintaining data for decision-making regarding advertisement spots for the Olympics. The warehouse was used by business users to decide how to sell advertisement spots during the Olympics. The data generated was stored in a data warehouse on Oracle.
Responsibilities -
- Created stored procedures to load and maintain the summary table, which was a major part of decision support for the Olympics event.
- Created Informatica mappings to insert, delete, and update data in the dimension tables.
- Created shell scripts to run the Informatica mappings and call the stored procedures to load the fact tables
- Involved in troubleshooting the stored procedures and mappings to get the desired results
- Involved in checking the control reports for data loading and reloading the data in case of failures
- Involved in performance tuning of Informatica mappings, sessions, sources, and targets.
- Worked very closely with the DBAs to improve the performance of the stored procedures
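The shell-driven load above (run the Informatica workflow, then call the stored procedure for the fact load) can be sketched roughly as follows. This is a hedged sketch: the folder, workflow, and procedure names are invented, and pmcmd flags vary by PowerCenter version, so the actual invocations are left commented out.

```shell
# Hypothetical load driver: build the pmcmd command for a workflow, then
# (on a machine with the Informatica client) run it and call the PL/SQL
# fact-load procedure through sqlplus.

build_pmcmd() {
  folder=$1
  workflow=$2
  # Flag set shown is PowerCenter 7-era and version-dependent; illustrative.
  echo "pmcmd startworkflow -f $folder -wait $workflow"
}

run_load() {
  cmd=$(build_pmcmd "$1" "$2")
  echo "Running: $cmd"
  # eval "$cmd"                            # uncomment where pmcmd exists
  # sqlplus -s "$DB_USER/$DB_PASS@$DB" <<'EOF'
  # EXEC load_fact_summary;                -- invented procedure name
  # EOF
}

run_load OLYMPICS_DM wf_load_ad_spots
```

Keeping the command-building separate from execution makes the driver easy to dry-run and log before anything touches the repository or the database.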
Environment: Informatica PowerCenter 7.1, Oracle 8i, PL/SQL, TOAD, Business Objects 5.1, Unix shell scripts
Sr. Informatica Consultant Dec 2003 - Nov 2004
Confidential, Indianapolis, IN
The project involved collecting and summarizing data from various branches to support proper decision-making.
Responsibilities -
- Worked on the initial phases of the software development life cycle by analyzing business requirements with the Business Analysts and reviewing the detailed design schema with the Designers to gain a proper understanding of the requirements
- Implemented slowly changing dimensions methodology to keep track of historical data
- Designed and Developed pre-session and post-session routines to run Informatica sessions
- Involved in Performance Tuning of sources, targets and sessions to optimize load performance
- Implemented Joiner, Expression, Aggregator, Sorter, Lookup, Update Strategy, Filter, stored procedures and Router Transformations
- Used Informatica Workflow Manager to create, schedule, and monitor workflows and send messages in case of process failures.
- Created worklets which included sessions, decisions and control tasks for test loading the data
- Designed and Developed data validation, load processes, test cases using PL/SQL, SQL
- Involved in Logical and Physical Database Design using Erwin 4.1 Tool
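A pre-session routine of the kind listed above can be as simple as verifying that the expected source file has arrived before the session starts. A minimal sketch with an invented check; a non-zero exit from the pre-session command is what would stop the session.

```shell
# Hypothetical pre-session check: allow the session to start only if the
# source flat file exists and is non-empty.

check_source_file() {
  f=$1
  if [ -s "$f" ]; then
    echo "OK: $f present"
  else
    echo "MISSING: $f"
    return 1
  fi
}

# e.g. check_source_file /data/inbound/branch_summary.dat (path invented);
# a temp file stands in here so the sketch runs anywhere.
tmp=$(mktemp)
echo "sample row" > "$tmp"
check_source_file "$tmp"
rm -f "$tmp"
```

A matching post-session routine would typically archive the file and record row counts for the control report.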
Environment: Informatica PowerCenter 7.1/6.2, Oracle 9i, Teradata V2R5, PL/SQL, COBOL, Sun Solaris, Excel 2000, Erwin 4.1, Business Objects 6.0
Informatica Consultant March 2003 - Nov 2003
Confidential, Atlanta, GA
The project dealt with extracting and transforming data from various sources and business types, applying business rules to move it through the Transient, Staging, Warehouse, and Mart layers. The data was then loaded into regional data marts according to geographic and time dimensions for various strategic business rules.
Responsibilities -
- Analyzed source and target models and the business requirements associated with them, and made appropriate changes by translating the requirements into Data Warehouse architectural designs
- Designed and maintained logical and physical enterprise Data Warehouse schemas
- Improved the performance of the mappings to decrease load time and give optimum system performance
- Created a number of workflows that used groups of worklets for loading the data at the Staging and Data Warehouse stages.
- Used Workflow Manager to create and run batches for different applications
- Loaded Operational data from Oracle, DB2, SQL Server, Flat Files into Transient Tables and cleaned the data using appropriate business rules
- Involved in designing Star schema by de-normalizing database tables
- Generated Reusable Transformations, Mapplets and used them extensively in many mappings
Environment: Informatica PowerCenter 6.2, Sun Solaris 5.8, MicroStrategy 7.1, DB2, PL/SQL
Informatica Developer Mar 2001 - Feb 2003
Confidential, Hyderabad, India
Confidential is a leading Retail Software Solutions, Smart Card Systems, and Terminal-Based Integration service provider. Statistical data about the customers and inventory was organized and recorded according to region and time. Also designed and performance-tuned Data Marts to provide data for efficient decision-making.
Responsibilities -
- Developed PL/SQL procedures and Korn shell scripts to automate the process for daily and nightly loads
- Created backup of the Repository on regular intervals depending on the amount of work done.
- Implemented performance tuning of Informatica session for large data files by increasing block size, data cache size, sequence buffer length and target based commit interval
- Implemented optimization techniques in Mappings and Sessions for improving overall performance of worklets & workflows
- Extensively used the Sequence Generator transformation to replace missing keys, create unique keys, and cycle through ranges of values
- Developed Unix shell scripts in Korn shell environment, and PL/SQL code in Oracle 8i for daily and monthly data loads
- Imported XML Source files to Designer, modified groups, used in the Mappings and loaded into an XML target file.
- Created and Scheduled Sessions and Batch Process based on demand, run on time, run only once and monitored sessions using Informatica Server Manager and Shell Scripts.
- Involved in the migration/upgrade of existing mappings from Informatica v5.1 to v6.2
- Created reports using Business Objects 5.0 and formatted the reports using Business Objects features such as Slice and Dice and Drill Down
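The periodic repository backups mentioned above can be scripted around pmrep; below is a hedged sketch. The timestamped file-name helper is invented, and pmrep connect/backup options vary by PowerCenter version, so the actual client calls are left commented out.

```shell
# Hypothetical repository-backup sketch: derive a timestamped backup file
# name, then hand it to pmrep on a machine with the Informatica client.

backup_name() {
  repo=$1
  stamp=$2   # passed in (e.g. $(date +%Y%m%d)) so the helper is testable
  echo "${repo}_${stamp}.rep"
}

# Version-dependent pmrep calls, shown for illustration only:
# pmrep connect -r "$REPO" -n "$REPO_USER" -x "$REPO_PASS"
# pmrep backup  -o "$(backup_name "$REPO" "$(date +%Y%m%d)")"

backup_name DEV_REPO "$(date +%Y%m%d)"
```

Timestamped names let a nightly cron job keep a rolling window of backups without overwriting the previous one.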
Environment: Informatica PowerCenter 6.2/5.1, Business Objects 5.0, Unix shell scripting, PL/SQL, Oracle 8i, Erwin 4.1, Windows NT/2000, AutoSys 4.0