Informatica Developer Resume
SUMMARY:
- Over five years of professional experience in the IT industry, including expertise in analysis, design, development, implementation, modeling, testing, and support of data warehousing applications.
- Involved in the full life cycle development of a Data Warehouse.
- Extensive experience using Informatica to implement ETL processes for data extraction, transformation and loading.
- Clear understanding of data warehousing and Business Intelligence concepts, with emphasis on ETL and life cycle development using PowerCenter: Repository Manager, Designer, Workflow Manager and Workflow Monitor.
- Extensively worked on Dimensional modeling, Data migration, Data cleansing and Data Staging of operational sources using ETL processes for data warehouses.
- Actively involved in performance tuning, error handling and product support on various platforms.
- Experience in implementing update strategies, incremental loads and Change Data Capture (CDC).
- Knowledge of complete SDLC including Requirement Analysis, Requirement Gathering, Project Management, Design, Development, Implementation and Testing.
- Expertise in OLTP/OLAP System Study, Analysis and E-R modeling, developing database Schemas like Star schema and Snowflake schema used in relational, dimensional and multidimensional data modeling.
- Performed system analysis and QA testing, and was involved in production support.
- Experience in coding SQL and PL/SQL procedures/functions, triggers and exceptions. Good experience in relational database concepts and entity-relationship diagrams.
- Work experience in relational data modeling with Erwin.
- Experience working in UNIX environments, writing UNIX shell scripts for Informatica pre- and post-session operations.
- Excellent communication and interpersonal skills. Ability to work effectively as a team member as well as individually.
TECHNICAL SKILLS:
ETL Tools : Informatica PowerCenter 8.5/8.1.0/7.1.2/6.1/5.1.2, PowerMart 7.5/5.2, Data Transformation Studio.
Databases : Oracle 10g/9i/8i/7.x, SQL Server 2005/2008, MS Access.
BI Tools : Cognos 8
Data Modeling Tool : Erwin
Programming : SQL, PL/SQL, C, C++
Operating Systems : Red Hat Linux 5.0, UNIX (Solaris 8, AIX), Windows NT/2000
Technologies : XML
EDUCATION:
- B. Tech in Electronics and Communication Engineering
PROFESSIONAL EXPERIENCE:
Confidential, Cincinnati, Ohio
Sr. Informatica Developer May 2012 – Present
CLIENT DESCRIPTION:
Confidential, a veteran insurance company, Baldwin & Lyons, Inc., through its subsidiaries, engages in marketing and underwriting property and casualty insurance products primarily in the United States. The company provides various fleet transportation insurance products, including casualty insurance, such as motor vehicle liability, physical damage, and other liability insurance; workers compensation insurance; specialized accident (medical and indemnity) insurance products for independent contractors; fidelity and surety bonds; and inland marine products consisting of cargo insurance. It offers its fleet transportation insurance products to the motor carrier industry. The company also provides various additional services comprising risk surveys and analyses, government compliance assistance, loss control, and cost studies; and research, development, and consultation in connection with new insurance programs, including the development of computerized systems to assist customers in monitoring their accident data. In addition, it offers claims handling services to clients with self-insurance programs. Further, the company’s reinsurance assumptions business accepts cessions and retrocessions from selected insurance and reinsurance companies, principally reinsuring against catastrophes.
DESCRIPTION OF THE PROJECT
Confidential has a new policy system (AQS) that is currently being modified to add the Small Business Workers Compensation (SBWC) line of business. The goal is to incorporate the data for this line of business into the Enterprise Data Warehouse (EDW) by integrating SBWC data from AQS into the existing EDW staging environment.
Responsibilities as ETL Developer:
- Participated in Business Requirement Analysis.
- Involved in preparing design documents.
- Estimated and planned development work using Agile software development.
- Participated in daily stand-up calls with the client to report progress.
- Involved in designing ETL processes in Informatica to extract data from XML source systems and load it into the target system (SQL Server); an illustrative sketch follows this list.
- Used Data Transformation Studio to map unstructured data.
- Developed complex mappings, sessions and workflows (ETL) to meet the above requirements.
- Developed complex transformations, such as the Unstructured Data transformation.
- Debugged Informatica mappings and validated the data in the target tables.
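The XML-to-relational loads themselves were built in Informatica; purely as an illustration of the target-side shape of that work, the following is a minimal SQL sketch (SQL Server dialect) that shreds a small XML policy fragment into a staging table. All table, column and element names here are hypothetical, not the actual AQS/EDW objects.

    -- Hypothetical staging table for the shredded XML feed.
    CREATE TABLE stg_sbwc_policy (
        policy_no    varchar(20),
        effective_dt datetime,
        premium_amt  decimal(12,2)
    );

    -- A tiny sample of the kind of XML a policy feed might carry.
    DECLARE @feed xml = N'
    <Policies>
      <Policy>
        <PolicyNo>SBWC-1001</PolicyNo>
        <EffectiveDate>2012-07-01</EffectiveDate>
        <Premium>1250.00</Premium>
      </Policy>
    </Policies>';

    -- Shred one row per <Policy> element into the staging table.
    INSERT INTO stg_sbwc_policy (policy_no, effective_dt, premium_amt)
    SELECT p.value('(PolicyNo/text())[1]',      'varchar(20)'),
           p.value('(EffectiveDate/text())[1]', 'datetime'),
           p.value('(Premium/text())[1]',       'decimal(12,2)')
    FROM @feed.nodes('/Policies/Policy') AS src(p);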
Confidential, Somerset NJ Dec 2010 – April 2012
Sr. Informatica Developer
Description:
MetLife was using a support tracking tool called Remedy to register support requests, raised either directly by users or through the central on-call support team. A request raised by a user was assigned to the concerned application support team to work on and fix the issue. Remedy had its own transactional relational database to store its day-to-day activity data. Remedy offered multiple categories for registering an issue, chosen based on application priority and issue type: Incident, Service Request, and Problem. Each category had its own severity levels (Severity 1, Severity 2 and Severity 3), and each severity level had its own target time to accomplish the task.
The United States Business Production Management and Reporting warehouse requirement came about to track the support activity of all MetLife applications supported by the various production support teams under different managers.
Responsibilities:
- Implemented Support Performance Business Intelligence Solution for MetLife.
- Consolidated existing data from the Remedy data source into a uniform Oracle database.
- Used Informatica to define data imports, build scheduling workflows and define the integration flow.
- Participated in business requirement analysis for the USB PM Metrics business intelligence system.
- Participated in analysis of source system data residing in the Oracle database.
- Involved in designing ETL processes in Informatica to load data from the Oracle source system.
- Participated in extraction, transformation and loading using Informatica workflows.
- Created the staging area and loaded raw data into it for cleansing.
- Developed mappings, sessions and workflows (ETL) for SCD Types I and II to meet the above requirements; a Type II sketch follows this list.
- Involved in analysis, documentation and testing of workflows.
- Debugged Informatica mappings and validated the data in the target tables once they were loaded.
- Designed and developed mappings using various Transformations like Aggregator, Joiner, Lookup, Filter, Router and Update Strategy.
- Prepared unit test cases for testing the mappings.
- Maintained the OLAP system to provide quality data in support of DSS.
- Interacted with the client to provide the best solution.
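The Type II dimension loads mentioned above were implemented as Informatica mappings (typically built with Lookup and Update Strategy transformations); the following is only a minimal Oracle SQL sketch of the same expire-and-insert pattern, using hypothetical table, column and sequence names (dim_support_group, stg_support_group, dim_support_group_seq).

    -- Assumes dim_support_group(group_key, group_id, group_name, manager,
    -- effective_dt, expiry_dt, current_flg), a staging table stg_support_group
    -- and a sequence dim_support_group_seq.

    -- Step 1: close out the current version of any row whose tracked attributes changed.
    UPDATE dim_support_group d
       SET expiry_dt   = TRUNC(SYSDATE) - 1,
           current_flg = 'N'
     WHERE d.current_flg = 'Y'
       AND EXISTS (SELECT 1
                     FROM stg_support_group s
                    WHERE s.group_id = d.group_id
                      AND (s.group_name <> d.group_name OR s.manager <> d.manager));

    -- Step 2: insert a new current version for changed keys and for brand-new keys.
    INSERT INTO dim_support_group
          (group_key, group_id, group_name, manager, effective_dt, expiry_dt, current_flg)
    SELECT dim_support_group_seq.NEXTVAL, s.group_id, s.group_name, s.manager,
           TRUNC(SYSDATE), DATE '9999-12-31', 'Y'
      FROM stg_support_group s
     WHERE NOT EXISTS (SELECT 1
                         FROM dim_support_group d
                        WHERE d.group_id = s.group_id
                          AND d.current_flg = 'Y'
                          AND d.group_name = s.group_name
                          AND d.manager   = s.manager);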
Responsibilities as Cognos Developer:
- Created the baseline design document for reporting.
- Designed the report process, including prototype creation.
- Developed medium to complex reports using Cognos.
- Created drill-through reports and cross-tab reports in Report Studio.
- Created various reports with functionality such as prompt pages, cascading prompts and conditional filters.
- Created test cases and test scripts, and performed unit and integration testing.
- Developed reports in Report Studio with multiple prompts and different types of charts and graphs to fulfill reporting needs.
- Created ad hoc reports using Report Studio.
Environment: Informatica PowerCenter 8.1, SQL, PL/SQL, UNIX, Oracle
Confidential, Somerset NJ Feb 2009 – Dec 2010
Informatica Developer
Description:
PeopleSoft Payroll Data Mart, Reports and Interfaces is a project that provides payroll data from PeopleSoft to MetLife users and downstream applications through interfaces and reports. The weekly process of extracting data from PeopleSoft into the data mart therefore plays a major role for this application and for several downstream applications.
The project provides detailed information about MetLife United States employees, covering their compensation and absence details. Using the application, users can view the entire payroll picture for MetLife corporate employees working under the United States business through reports, with additional flexibility such as slicing and dicing.
Responsibilities:
- Analyzed the system, met with end users and business units in order to define the requirements.
- Wrote SQL Queries, PL/SQL Procedures and Shell Scripts to apply and maintain the Business Rules.
- Extracted data from Oracle and flat files and loaded it into the data warehouse.
- Translated business requirements into Informatica mappings/workflows.
- Used Source Analyzer and Warehouse designer to import the source and target database schemas, and the Mapping Designer to map the sources to the target.
- Used Informatica Designer to create complex mappings using different transformations like filter, Router, lookups, stored procedure, joiner, update strategy, expressions and aggregator transformations to pipeline data to Data Warehouse/Data Marts.
- Extensively worked in the performance tuning of the programs, ETL Procedures and processes.
- Developed complex logical and physical data models using the Erwin tool.
- Designed and developed a number of complex mappings, mapplets and reusable objects.
- Involved in fixing invalid mappings, testing stored procedures and functions, and unit and integration testing of Informatica sessions, batches and the target data.
- Involved in SQL tuning and troubleshooting.
- Created UNIX shell scripts for Informatica pre/post session operations.
- Worked with mapping variables, mapping parameters and variable functions (an incremental-extract sketch follows this list); involved in creating the target database for the data marts.
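Mapping variables and parameters were typically used to drive the incremental, weekly extracts; purely as an illustration, the following Oracle SQL sketches the high-watermark pattern with hypothetical table and column names (payroll_earnings_src, etl_load_control), not the actual PeopleSoft or data mart objects.

    -- Hypothetical control table holding the last-extract watermark per subject area.
    CREATE TABLE etl_load_control (
        subject_area      VARCHAR2(30),
        last_extract_dttm TIMESTAMP
    );

    -- Pull only source rows changed since the last successful extract.
    SELECT e.emplid,
           e.pay_end_dt,
           e.earnings_amt,
           e.last_upd_dttm
      FROM payroll_earnings_src e
     WHERE e.last_upd_dttm > (SELECT c.last_extract_dttm
                                FROM etl_load_control c
                               WHERE c.subject_area = 'PAYROLL');

    -- After a successful load, advance the watermark for the next weekly run
    -- (in Informatica this is commonly done with a mapping variable or a post-session task).
    UPDATE etl_load_control
       SET last_extract_dttm = SYSTIMESTAMP
     WHERE subject_area = 'PAYROLL';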
Environment: Informatica PowerCenter 8.1, Oracle 9i, PL/SQL, Windows NT, UNIX, Cognos 8, Erwin.
Confidential, Chicago Apr 2007 – Feb 2009
Informatica Developer
Description:
The Work Package Execution for Distribution aims at improving the visibility of the Distribution team into the performance of its partners. Orbitz partners with several affiliate websites. In order to measure and understand the performance of its affiliates in terms of revenue share (and several other parameters), a complete data mart solution needs to be built, which includes data modeling, ETL design and development, and report design and development. This project also requires the development of a dashboard solution in order to provide the senior leadership with a single, summarized view of the underlying data. The objective is to extract data stored in sources such as the ODS and load it into a single data warehouse repository in Oracle.
Responsibilities:
- Implemented Business Intelligence Solution for Orbitz.
- Participated in requirement analysis.
- Prepared design documents.
- Estimated and planned development work using Agile software development.
- Participated in calls with the onsite coordinator to understand and document the requirements.
- Designed ETL processes in Informatica to extract data from the Oracle source system, apply data transformations and load the results into the target database.
- Created the staging area to maintain valid, consistent data by implementing data cleansing logic.
- Designed and developed mappings using various transformations like Aggregator, Joiner, Lookup, Filter, Router and Update Strategy.
- Developed mappings, sessions and workflows (ETL) for SCD Type I to meet the above requirements; a Type I sketch follows this list.
- Designed ETL jobs to load data into the data warehouse tables.
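As with the other projects, the Type I loads were built as Informatica mappings; the following is only a minimal Oracle SQL sketch of the overwrite-in-place pattern (no history kept, in contrast to Type II), with some basic cleansing applied on the way in. All names (dim_affiliate, stg_affiliate, dim_affiliate_seq) are hypothetical.

    -- Assumes dim_affiliate(affiliate_key, affiliate_id, affiliate_name, region),
    -- a staging table stg_affiliate and a sequence dim_affiliate_seq.
    MERGE INTO dim_affiliate d
    USING (SELECT affiliate_id,
                  TRIM(UPPER(affiliate_name)) AS affiliate_name,  -- simple cleansing rules
                  NVL(region, 'UNKNOWN')      AS region
             FROM stg_affiliate) s
       ON (d.affiliate_id = s.affiliate_id)
     WHEN MATCHED THEN
       UPDATE SET d.affiliate_name = s.affiliate_name,
                  d.region         = s.region
     WHEN NOT MATCHED THEN
       INSERT (affiliate_key, affiliate_id, affiliate_name, region)
       VALUES (dim_affiliate_seq.NEXTVAL, s.affiliate_id, s.affiliate_name, s.region);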
Environment: Informatica PowerCenter 7.1, Oracle 9i, PL/SQL