Informatica Analyst & Developer Resume
Broomfield, CO
Summary
- 7+ years of IT industry experience in Data Modeling, Database Development and Data Warehousing, including development, implementation and testing of database and data warehouse applications for Health Care, Telecommunication Services, Retail and Financial Services.
- Worked in Informatica, Oracle, SQL, PL/SQL, UNIX and Windows environments across various industries.
- Good understanding of ETL, Dimensional Data Modeling, Slowly Changing Dimensions (SCD) and Data Warehouse Concepts.
- Extensively used data warehouse concepts and principles such as Star Schema, Snowflake Schema, Surrogate Keys and Normalization/Denormalization.
- Experience in integrating various Operational Data Sources (ODS) with multiple relational databases such as Oracle and SQL Server. Worked on integrating data from fixed-width and delimited flat files.
- Acquainted with SFDC, Netezza and Teradata.
- Hands-on experience with Informatica Designer components: Source Analyzer, Target Designer, Transformation Developer, Mapplet Designer and Mapping Designer.
- Worked extensively with complex mappings using different transformations like Source Qualifiers, Expressions, Filters, Joiners, Routers, Union, Unconnected / Connected Lookups and Aggregators.
- Strong Experience in developing Sessions, Tasks, Worklets, Workflows using Workflow Manager Tools.
- Experience using Informatica command-line utilities such as pmcmd to execute workflows in non-Windows environments.
- Extensively used Informatica Repository Manager and Workflow Monitor.
- Experience in debugging mappings. Identified bugs in existing mappings by analyzing the data flow and evaluating transformations.
- Worked with Stored Procedures, Triggers, Cursors, Indexes and Functions.
- Excellent communication, analytical, business and interpersonal skills. Highly motivated to take on independent responsibility, with the ability to contribute as a productive team member.
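A minimal sketch of the star-schema and surrogate-key concepts listed above, in Oracle-style SQL; all table and column names here are hypothetical illustrations, not taken from any actual project:

```sql
-- Illustrative star schema: one dimension with a surrogate key, one fact table.
-- Names (dim_customer, fact_orders, etc.) are hypothetical examples.
CREATE TABLE dim_customer (
    customer_sk     NUMBER        PRIMARY KEY,  -- surrogate key, warehouse-generated
    customer_id     VARCHAR2(20)  NOT NULL,     -- natural/business key from the source
    customer_name   VARCHAR2(100),
    region          VARCHAR2(50)
);

CREATE TABLE fact_orders (
    order_sk        NUMBER        PRIMARY KEY,
    customer_sk     NUMBER        REFERENCES dim_customer (customer_sk),
    order_date      DATE,
    order_amount    NUMBER(12,2)
);
```

The surrogate key (customer_sk) is typically populated from an Oracle sequence or an Informatica Sequence Generator transformation, keeping the warehouse key independent of the source system's business key.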
Technical Skills
- Operating Systems: Windows (all versions), UNIX, AIX 6.1
- ETL Tools: Informatica Power Center 9.x/8.x/7.x, Power Mart 6.2/5.1, Informatica Power Connect, Informatica Power Exchange for SFDC
- Databases: Oracle 11g/10g/9i/8i, SQL Server 2005, MS Access 2005, Teradata (FastLoad, MultiLoad, FastExport), Netezza
- Languages: SQL, PL/SQL, Shell Scripting
- Utilities: SQL*Plus, PL/SQL Developer, Toad
- Data Modeling Tools: Visio, Erwin
- Reporting Tools: Business Objects 6.5/6.1, Crystal Reports 9.2
Professional Experience
Company : Confidential- Broomfield, CO
Sept 2011 – Present
Role: Informatica Analyst & Developer
Description: Confidential is an international communications company headquartered in Broomfield, CO, and one of only six Tier 1 Internet providers in the world. This project covered the Order Mart, SMART and SMMART applications, where new changes or modifications to existing ETL mappings were required depending on client requirements. The SMART and SMMART applications are used for customer order entry via the SIEBEL, PIPELINE, CLARIFY, EON and IFO order-entry systems.
Environment: Informatica Power Center 9.1, Oracle 9i, SQL Server 2005, UNIX, Flat files, XML files, Rally (Agile-Scrum process), SQL*Plus, PL/SQL.
Responsibilities:
- Test changes and code fixes implemented as part of change requests.
- Monitor deployment activity when specific deliverables are moved to production.
- Ensure the deployed code works correctly in the production environment.
- Propose approaches and solutions for production failures and errors.
- Generate test cases using the test case generator for the Level3, Looking Glass and Wiltel releases.
- Use the code review tool to validate the mappings developed in the Level3 and Wiltel releases.
- Work as a developer on data extraction and analysis for Looking Glass source data into the target applications using the Analyzer tool, and validate the mappings using the code review tool.
- Work within Continuous Integration frameworks with a major focus on developing automated unit tests and documenting code coverage.
- Actively participated in integrating data from Global Crossing (a newly acquired company) into Level 3 data, from EON to BPMS.
Company: Confidential-Burlington, NC
Feb 2010 – Aug 2011
Role: Informatica Developer
Description: Confidential bought a pricing analytics tool called Vendavo and needed to set up the Informatica environment for it. At a high level, the pricing tool was connected to Salesforce to compare profit margins for certain tests ordered, based on geography and other factors, allowing the sales team to negotiate better and increase profit margins. The data flowed as follows: legacy system data – extract – ETL staging – transform – flat files – Vendavo, where users could run reports.
Environment: Informatica Power Center 8.1.1, SQL*Plus, PL/SQL, Oracle 9i, SQL Server 2005, UNIX, Flat files, XML files.
Responsibilities:
- Responsible for PowerCenter 8.1 installation, and for configuring the PowerCenter domain, nodes and Integration Services.
- Worked extensively with Repository Manager; responsible for designing new processes with the Application Architect.
- Created mappings for various sources like Oracle, SQL server, flat files and XML files to load and integrate data to warehouse.
- Developed various mappings, Reusable transformations and validated the ETL logic coded into mappings.
- Used OBIEE to support the team's reporting work.
- Created target definitions in Oracle, the target database.
- Implemented lookups and different transformations in the mappings.
- Developed the mapping to pull the information from different tables and used SQL Override to join the tables instead of Joiner transformation to improve the performance.
- Responsible to write complex SQL queries and PL/SQL procedures to perform database operations according to business requirements.
- Created ODBC connection for importing the source and target definitions.
- Scheduled sessions and workflows using the Informatica scheduler and the pmcmd command.
- Developed unit test cases and did unit testing for all the developed mappings.
- Scheduled all jobs using a Perl-based scheduler run from the web restart tool.
- Created documentation for processes, mappings, test cases, etc.
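The SQL Override technique mentioned above (joining in the Source Qualifier instead of using a Joiner transformation) amounts to pushing the join into the Source Qualifier's query; the tables and columns below are hypothetical:

```sql
-- Sketch of a Source Qualifier SQL Override: the join runs in the Oracle
-- source database, so the mapping reads one pre-joined stream instead of
-- joining two pipelines with a Joiner transformation.
-- Table and column names are illustrative only.
SELECT o.order_id,
       o.order_date,
       o.amount,
       c.customer_name,
       c.region
FROM   orders    o
JOIN   customers c
  ON   c.customer_id = o.customer_id
WHERE  o.order_date >= TRUNC(SYSDATE) - 1;   -- e.g. pick up the previous day's orders
```

Pushing the join into the source database reduces the rows and pipelines Informatica has to process, which is why it often outperforms a Joiner transformation when both sources live in the same relational database.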
Company: Confidential, New York City-NY
May 2008 – Nov 2009
Role: ETL Developer
Description: Confidential efficiently monitors the market and performs thorough fundamental and quantitative analysis, in-depth portfolio risk and performance analysis, economic forecasting and more. Informatica Power Center was used as the ETL tool to extract stock data and load it into target systems in the data warehouse. This data was then used to generate different kinds of reports. Thomson Reuters used rapidly changing dimensions in this project.
Environment: Informatica Power Center 8.6, Erwin 7.2, Oracle 10g, SQL, PL/SQL, Toad, Windows 2000 Server.
Responsibilities:
- Analyzed the source data coming from different sources and worked with business users and developers to develop the model.
- Imported various heterogeneous source files using Source Analyzer in the designer.
- Involved in analyzing logical model of source and target.
- Developed logical and physical data models that captured data flows using Erwin 7.2.
- Worked on B2B (Business to Business) requirements, extensively using DX (data exchange) and DT (data transform).
- Involved in identifying the sources for various dimensions and facts for different data marts according to star schema and snowflake schema design patterns.
- Designed and developed complex Informatica mappings, mapplets, reusable transformations, workflows and worklets using various tasks to facilitate daily, weekly and monthly loading of data.
- Based on the logic, used various transformations such as Source Qualifier, Normalizer, Expression, Filter, Router, Update Strategy, Sorter, Lookup, Aggregator and Joiner in the mappings.
- Involved in migrating data from staging to the data warehouse and in scheduling jobs.
- Created Connected, Unconnected and Dynamic Lookup transformations for better performance, and increased the cache file size based on the size of the lookup data.
- Used Workflow Manager for Creating, Validating, Testing and Running the sequential and concurrent Batches and Sessions.
- Used Debugger to test the mappings and fix the bugs. Created Unit Testing Document for Informatica ETL routines.
Company: Confidential- Richmond, VA
Feb 2007 – Apr 2008
Role: Informatica Developer
Description: Confidential offers a variety of consumer lending and deposit products, including credit cards, auto loans, small business loans, home equity loans, installment loans and savings products. The aim of the project was to help customer service representatives view and transact with different customers' data. The operational data of different financial departments was loaded into a central data warehouse and farmed out to different regional data marts.
Environment: Informatica Power Center 8.1, Oracle 9i, Erwin, UNIX scripts.
Responsibilities:
- Worked extensively on data analysis, cleansing and data integration, and worked to resolve inconsistencies in the data.
- Created the Mappings, Mapplets in Informatica using designer and created sessions using workflow manager.
- Responsible for creating Informatica mappings using various transformations such as Expression, Source Qualifier, Aggregator, Connected/Unconnected Lookup, Filter, Joiner, etc.
- Involved in the creation, editing and deletion of sessions and batches of sessions using Workflow Manager in Informatica.
- Implemented Materialized Views using PL/SQL for loading extended tables.
- Implemented Slowly Changing Dimensions (Type 2: versions) to update the dimensional schema.
- Delivered the new system in Agile methodology.
- Facilitated the Agile development process in the company, including requirements and design processes. Developed build and release scripts and assisted with the configuration management process.
- Monitored workflows and collected performance data to maximize session performance.
- Used Informatica efficiently and tuned the scripts for better performance results and for large data files by increasing data cache size and target based commit interval.
- Prepared user and technical documentation, including source system data locations, UNIX hosts and user login details.
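A minimal sketch of the Type 2 (versioned) Slowly Changing Dimension logic described above, assuming a hypothetical dim_account dimension and stg_account staging table:

```sql
-- Sketch of SCD Type 2 ("versions") maintenance in SQL.
-- dim_account tracks history with version, date-range and current-flag columns;
-- all names here are hypothetical.
-- Step 1: expire the current row when a tracked attribute changes.
UPDATE dim_account d
SET    d.current_flag = 'N',
       d.end_date     = SYSDATE
WHERE  d.current_flag = 'Y'
AND    EXISTS (SELECT 1
               FROM   stg_account s
               WHERE  s.account_id = d.account_id
               AND    s.branch    <> d.branch);

-- Step 2: insert a new version row for changed and brand-new accounts
-- (both now lack a current row, so one NOT EXISTS test covers both cases).
INSERT INTO dim_account (account_sk, account_id, branch, version_no,
                         start_date, end_date, current_flag)
SELECT dim_account_seq.NEXTVAL, s.account_id, s.branch,
       NVL((SELECT MAX(d.version_no) FROM dim_account d
            WHERE  d.account_id = s.account_id), 0) + 1,
       SYSDATE, NULL, 'Y'
FROM   stg_account s
WHERE  NOT EXISTS (SELECT 1 FROM dim_account d
                   WHERE  d.account_id   = s.account_id
                   AND    d.current_flag = 'Y');
```

In Informatica, the same effect is usually achieved with Lookup and Update Strategy transformations rather than hand-written SQL; the sketch only illustrates the versioning semantics.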
Company: Confidential, Overland Park - KS
Mar 2006 – Dec 2006
Role: ETL / Data Warehouse Developer
Description: One of the largest telecommunication providers in the country, Confidential offers a comprehensive range of wireless and wire-line communications services, bringing the freedom of mobility to consumers, businesses and government users. Assisted in the design of the data warehouse and gathered data from multiple source systems. Also assisted in the development, execution and documentation of system and integration test plans, and updated and maintained metadata.
Environment: Informatica power center 8.1, Oracle 8i, Flat file, PL/SQL, Windows NT, MS Visio, and SQL Server 2000.
Responsibilities:
- Requirement gathering and Business Analysis.
- Involved in the development of mappings and performance tuning.
- Extensively used ETL to load data from different databases and flat files into Oracle.
- Extensively worked on complex stored procedures, triggers and functions.
- Created sessions and batches in Workflow Manager to load the data into the target database.
- Tested the data for integrity and consistency.
- Documented the issues related to cleaning of the data.
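The stored procedures and triggers mentioned above follow a common PL/SQL pattern; a minimal sketch with hypothetical names (load_daily_usage, call_detail, daily_usage_summary):

```sql
-- Minimal PL/SQL sketch (hypothetical names): a procedure that reloads a
-- summary table for one day, and a trigger that stamps audit columns on insert.
CREATE OR REPLACE PROCEDURE load_daily_usage (p_load_date IN DATE) AS
BEGIN
    -- Make the load rerunnable: clear the day's rows, then reload them.
    DELETE FROM daily_usage_summary WHERE load_date = p_load_date;
    INSERT INTO daily_usage_summary (load_date, account_id, total_minutes)
    SELECT p_load_date, account_id, SUM(call_minutes)
    FROM   call_detail
    WHERE  TRUNC(call_date) = p_load_date
    GROUP  BY account_id;
    COMMIT;
END load_daily_usage;
/

CREATE OR REPLACE TRIGGER trg_daily_usage_audit
BEFORE INSERT ON daily_usage_summary
FOR EACH ROW
BEGIN
    :NEW.created_by   := USER;
    :NEW.created_date := SYSDATE;
END;
/
```

The delete-then-insert pattern keeps the procedure safely rerunnable for a given load date; the trigger centralizes audit stamping so every insert path records who loaded the row and when.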
Company: Confidential
Jan 2005 – Dec 2005
Role: Data Analyst
Description: Confidential is a professionally managed global software development and consulting services company catering to ISVs and software technology companies. Winjit provides IT services, systems integration, outsourcing and training solutions to major corporations. Worked on a project for the credit card division of a bank, where a customer-centric database was built to analyze transactions across finance, marketing, risk, collections and consumer relations.
Environment: Windows 95/98, UNIX, SQL, MS Access 2003, MS Excel 2003, Informatica
Responsibilities:
- Installed and configured SQL Server. Designed and developed the databases.
- Configured the DTS packages to run in periodic intervals.
- Extensively worked with DTS to load the data from source systems.
- Maintained security and data integrity of the database.