Informatica Developer & Production Support Resume
SUMMARY
- 8+ years of IT experience in Data Warehouse Development and Informatica Production Support.
- Proficient in all phases of the Software Development Life Cycle (SDLC), including requirements gathering, design, development, system testing, acceptance testing and production support
- Expertise in Requirement Analysis, Design, Coding, Testing & Implementation of ETL/DWH projects using Informatica PowerCenter 10.x/9.x/8.x, SQL, Oracle and Unix Shell Scripts
- Extensive experience with ETL tool Informatica in designing Workflows, Worklets, Mapplets, Mappings and scheduling the Workflows and sessions using scheduler tools
- Experience in documenting application use cases and providing source to target mapping requirements
- Good experience on Source Qualifier, Router, Filter, Expression, Aggregator, Joiner, Update Strategy, Unconnected and Connected lookups, Rank, Sorter transformations
- Good experience in raising incident management requests using ServiceNow and JIRA.
- Worked in Rally as Scrum Master, creating user stories and tasks and updating test cases to capture the team's work activities.
- Certified in Change Request Management, Incident Request Management and Problem Management.
- Good experience in creating Standard, Normal and Emergency RFCs to resolve critical production issues on priority.
- Deployed Informatica and Unix components from Dev to QA to Prod using GIT and Lara Deployment
- Experience in integrating various data sources such as Oracle, Microsoft SQL Server, Teradata and flat files using ETL tools.
- Experience in error handling and troubleshooting using various log files
- Worked on performance tuning, identifying and resolving performance bottlenecks at various levels such as sources, targets, mappings and sessions
- Willingness to learn new concepts and ability to articulate alternative solutions.
- Worked extensively with a QA Team in understanding test cases, test plans, User Acceptance Testing (UAT) and ensuring that the software meets the system requirements specifications
- Experience in preparing documentation like HLD, LLD and Test case documentation.
- 1+ years of experience with the Teradata database and utilities such as FastLoad, MultiLoad and BTEQ scripting.
- Good understanding of end-to-end implementation of data warehouses and strong understanding of Business Process Analysis.
- DWH concepts: ETL, Star schema and Data Modeling; experience with Normalization, Re-engineering, Dimensional Modeling, and Facts & Dimensions tables.
- Good communication skills, interpersonal skills, self-motivated, quick learner
TECHNICAL SKILLS
ETL&BI Tools: Informatica PowerCenter 8.6/9.1/9.6/10.1.0
Databases: SQL Server 2008, Oracle 11gR2/10g, Teradata
Development Tools: Toad, SQL Developer, Teradata SQL Assistant, MSSQL Studio
Operating Systems: Windows XP/7/2003, UNIX (PuTTY, WinSCP)
Programming/Languages: SQL, PL/SQL, C, UNIX Shell Scripting
Scheduling Tools: Control-M, Tidal and Automic (UC4)
Process Tools: JIRA, ServiceNow, GIT, LARA Deployment, BMC Remedy
Methodologies: Star/Snowflake, ETL, OLAP, complete software development lifecycle
Modeling: Dimension/ER Data Modeling, Logical/Physical Modeling
Reporting Tools: Brio, OBIEE
PROFESSIONAL EXPERIENCE
Confidential
Informatica Developer & Production Support
Responsibilities:
- Worked with Business in regard to the requirements, understanding them thoroughly for the complete project outcome.
- Supported 3 different applications; worked on ad hoc requests and submitted reports to the business users.
- Analyzed the business requirements, technical specification and physical data models for ETL mapping and process flow.
- Extensively worked on debt classification by using Confidential Applens platform.
- Responsible for debugging and performance tuning of targets, sources, mappings and sessions.
- Worked with different Sources such as Oracle, SQL Server, Flat files and Teradata
- Designed Informatica Specs (Stage, Dimension & Fact tables) for the current system
- Used ETL processes to extract, transform and load data into the staging area and data warehouse; created Mapplets and reused them in mappings.
- Implemented and documented all the best practices used for the data warehouse.
- Coordinated with SME’s, Managers, DBA’s, Informatica administrators (operation team) and other developers during the project life cycle.
- Reviewed code developed by team members and provided input on fixes.
- Worked in Rally, creating user stories and tasks and updating test cases to capture the team's work activities.
Environment: Informatica PowerCenter 10.4, 9.6 and 9.1, Oracle, Teradata, Toad, WinSCP, PuTTY, Automic (UC4), UNIX, Remedy, JIRA
Confidential, Phoenix, US
Informatica Developer
Responsibilities:
- Worked with Business in regard to the requirements, understanding them thoroughly for the complete project outcome.
- Analyzed the business requirements, technical specification and physical data models for ETL mapping and process flow.
- Implemented the business rules, extracted data from various sources such as SQL Server, Oracle and flat files, and loaded the required data into the Oracle database.
- Worked extensively with Informatica transformations such as Source Qualifier, Rank, Router, Filter, Joiner, Lookup, Aggregator, Union and Sorter.
- Involved in RFC creation and in Bitbucket and Lara deployments from Dev to QA and QA to Prod environments.
- Created Workflows, tasks, database connections, FTP connections using workflow manager.
- Expertise in using different tasks (Session, Command, Decision, Email, Event-Raise, Event-Wait, Control)
- Executed sessions, sequential and concurrent batches for proper execution of mappings and set up email delivery after execution.
- Implemented and documented all the best practices used for the data warehouse.
- Improved the performance of the ETL by indexing and caching.
- Responsible for debugging and performance tuning of targets, sources, mappings and sessions.
- Worked with the Admin team to migrate Informatica PowerCenter mappings and code/folders from one environment to another as part of release management.
- Implemented various performance tuning techniques on sources, targets, mappings and workflows, along with database tuning.
- Worked in Rally, creating user stories and tasks and updating test cases to capture the team's work activities.
- Good experience in creating Standard, Normal and Emergency RFCs to resolve critical production issues on priority.
- Deployed the Informatica and Unix components from Dev to QA to Prod by using GIT and Lara Deployment
- Extracted data from different sources of databases. Created staging area to cleanse the data and validated the data.
- Imported data from various sources, transformed it and loaded it into data warehouse targets using Informatica.
- Created complex Aggregator, Expression, Joiner, Filter, Router, Lookup and Update Strategy transformations.
- Handled Type 2 slowly changing dimensions to populate current and historical data into dimension and fact tables in the data warehouse.
- Designed loads to populate target tables for both one-time and incremental loads.
- Extracted, transformed and loaded data from flat-file sources to targets using transformations in the mappings.
Environment: Informatica PowerCenter 9.6 & 10.2, SQL Server, Oracle, Control-M and Unix
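The Type 2 slowly-changing-dimension handling mentioned above follows a standard compare/expire/insert pattern. The sketch below is an illustration only: SQLite stands in for the warehouse target, and the dim_customer table, column names and sample values are all hypothetical.

```python
import sqlite3

# Minimal SCD Type 2 sketch; SQLite stands in for the warehouse target.
conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.execute("""
    CREATE TABLE dim_customer (
        cust_id INTEGER, name TEXT, city TEXT,
        eff_date TEXT, end_date TEXT, current_flag TEXT
    )""")

def scd2_upsert(cust_id, name, city, load_date):
    """Expire the current row if attributes changed, then insert a new version."""
    cur.execute("SELECT name, city FROM dim_customer "
                "WHERE cust_id = ? AND current_flag = 'Y'", (cust_id,))
    row = cur.fetchone()
    if row == (name, city):
        return  # no attribute change, keep the current version
    if row is not None:
        # Close out the old version: set its end date and clear the flag.
        cur.execute("UPDATE dim_customer SET end_date = ?, current_flag = 'N' "
                    "WHERE cust_id = ? AND current_flag = 'Y'", (load_date, cust_id))
    # Insert the new current version with an open-ended end date.
    cur.execute("INSERT INTO dim_customer VALUES (?, ?, ?, ?, '9999-12-31', 'Y')",
                (cust_id, name, city, load_date))

scd2_upsert(101, "Acme", "Phoenix", "2020-01-01")
scd2_upsert(101, "Acme", "San Jose", "2020-06-01")   # city change -> new version
rows = cur.execute("SELECT city, current_flag FROM dim_customer "
                   "WHERE cust_id = 101 ORDER BY eff_date").fetchall()
# rows -> [('Phoenix', 'N'), ('San Jose', 'Y')]
```

In PowerCenter itself this same logic is typically split across Lookup, Expression and Update Strategy transformations rather than written by hand.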
Confidential, San Jose
Informatica Developer
Responsibilities:
- Worked with Business in regard to the requirements, understanding them thoroughly for the complete project outcome.
- Analyzed the business requirements, technical specification and physical data models for ETL mapping and process flow.
- Implemented the business rules, extracted data from various sources such as SQL Server, Oracle and flat files, and loaded the required data into the Oracle database.
- Worked extensively with Informatica transformations such as Source Qualifier, Rank, Router, Filter, Joiner, Lookup, Aggregator, Union and Sorter.
- Created Workflows, tasks, database connections, FTP connections using workflow manager.
- Expertise in using different tasks (Session, Command, Decision, Email, Event-Raise, Event-Wait, Control)
- Executed sessions, sequential and concurrent batches for proper execution of mappings and set up email delivery after execution.
- Implemented and documented all the best practices used for the data warehouse.
- Improved the performance of the ETL by indexing and caching.
- Responsible for debugging and performance tuning of targets, sources, mappings and sessions.
- Worked with the Admin team to migrate Informatica PowerCenter mappings and code/folders from one environment to another as part of release management.
- Implemented various performance tuning techniques on sources, targets, mappings and workflows, along with database tuning.
- Imported data from various sources, transformed it and loaded it into data warehouse targets using Informatica.
- Created complex Aggregator, Expression, Joiner, Filter, Router, Lookup and Update Strategy transformations.
- Handled Type 2 slowly changing dimensions to populate current and historical data into dimension and fact tables in the data warehouse.
- Designed loads to populate target tables for both one-time and incremental loads.
- Extracted, transformed and loaded data from flat-file sources to targets using transformations in the mappings.
Environment: Informatica PowerCenter 10.1, SQL Server, Oracle, Tidal, Unix and Windows 10
Confidential, New Jersey
Informatica Developer/Production Support
Responsibilities:
- Perform data analysis for any requirement and provide source to target mapping rule document
- Designed and developed complex aggregate, join, lookup transformation to generate and consolidate (fact and summary) data using Informatica Power Center tool.
- Used the Slowly Changing Dimensions (SCD type 2) to update the data in the target dimension tables.
- Walked through the Informatica and Oracle code to identify protected information references of columns like SSN, Last name and first name.
- Knowledge of best practices in Data Warehousing and Business Intelligence
- Designed the dimensional data model of the data warehouse; confirmed source data layouts and needs.
- Involved in Creating Fact and Dimension tables using Star schema
- Created sessions, database connections and batches using Informatica Server Manager/Workflow Manager.
- Involved in monitoring the sessions, workflows and worklets using Workflow Monitor to ensure the data is properly loaded into the Enterprise Data Warehouse.
- Created, configured and scheduled the sessions and batches for different mappings using Workflow Manager and UNIX scripts.
- Extensively used Informatica PowerCenter Workflow Manager to create sessions, workflows and batches to run with the logic implemented in the mappings.
- Developed complex mappings in Informatica to load data from various sources.
- Checked session and error logs to troubleshoot problems, and used the Debugger for troubleshooting complex issues.
- Extensively used Transformations like Router, Aggregator, Normalizer, Joiner, Expression and Lookup, Update strategy and Sequence generator
- Implemented performance tuning logic on targets, sources, mappings, sessions to provide maximum efficiency and performance.
- Parameterized the mappings and increased the re-usability.
- Created procedures to truncate data in the target before the session run.
Environment: Informatica PowerCenter 9.6, Windows 2008, Unix, Oracle, SQL Server, Autosys
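The "truncate the target before the session run" step mentioned above can be sketched in a few lines. This is an illustration only: SQLite stands in for the Oracle target, and the stg_orders table and its rows are hypothetical.

```python
import sqlite3

# Sketch of a pre-session "truncate then load" step.
# SQLite and the table name stand in for the real Oracle target.
conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.execute("CREATE TABLE stg_orders (order_id INTEGER, amount REAL)")
cur.executemany("INSERT INTO stg_orders VALUES (?, ?)", [(1, 10.0), (2, 20.0)])

def pre_session_truncate(table):
    # SQLite has no TRUNCATE statement; an unqualified DELETE plays
    # the same role of clearing the target before the load.
    cur.execute(f"DELETE FROM {table}")

pre_session_truncate("stg_orders")
cur.executemany("INSERT INTO stg_orders VALUES (?, ?)", [(3, 30.0), (4, 40.0)])
count = cur.execute("SELECT COUNT(*) FROM stg_orders").fetchone()[0]
# count -> 2: only the fresh batch remains after the pre-session truncate
```

In a PowerCenter setup this is usually a stored procedure or a pre-SQL/pre-session command on the session rather than inline code.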
Confidential, New Jersey
ETL Developer
Responsibilities:
- Scheduled ETL processes on a daily, weekly and monthly basis
- Responsible for design and development of rating data mart for financial Data Warehouse.
- Analyze business requirements, technical specification, source repositories and physical data models for ETL mapping and process flow.
- Developed mapping to load Fact and Dimension tables, for type 1 and type 2 dimensions and incremental loading and unit tested the mappings.
- Responsible for debugging and performance tuning of targets, sources, mappings and sessions.
- Executed sessions, sequential and concurrent batches for proper execution of mappings and set up email delivery after execution.
- Implemented and documented all the best practices used for the data warehouse.
- Created Workflows, tasks, database connections, FTP connections using workflow manager.
- Responsible for identifying bugs in existing mappings by analyzing data flow, evaluating transformations and fixing bugs.
- Used ETL processes to extract, transform and load data into the staging area and data warehouse
- Created mappings, reusable transformations in Mapping Designer.
- Worked with different Sources such as Oracle, SQL Server and Flat files.
- Designed interfaces, fixed bugs and performed unit testing to check for data discrepancies.
Environment: Informatica PowerCenter 9.6, Oracle 10g, SQL Server 2005, Linux and Windows 2008, Autosys
Confidential, Dublin
Informatica Developer
Responsibilities:
- Mainly involved in ETL development
- Involved in development of Stage, Dimension & fact tables mappings using Expression, Update strategy, Filter, Aggregator, Joiner and Lookup
- Involved in Fine tuning mappings as a Performance activity
- Using ETL Process to Extract, Transform and Load the data into stage area and data Warehouse
- Using Transformations to clean the data from staging area as per the data warehouse Requirements
- Created mappings, reusable transformations in Mapping Designer
- Accomplished data movement processes that load data from databases using Teradata SQL and utilities such as BTEQ, FastLoad and MultiLoad.
- Involved in analyzing and building the Teradata EDW using Teradata ETL utilities and Informatica.
- Experience with Teradata as the target for the data marts; worked with BTEQ, FastLoad and MultiLoad.
- Worked with different Sources such as Oracle, SQL Server, Flat files and Teradata
- Preparation of Program Specifications and create the database connections.
- Designed Informatica Specs (Stage, Dimension & Fact tables) for the current system
- Used ETL processes to extract, transform and load data into the staging area and data warehouse; created Mapplets and reused them in mappings.
- Used Teradata as a source and a target for a few mappings; worked with Teradata loaders within Workflow Manager to configure FastLoad and MultiLoad sessions.
- Used Shortcuts to reuse objects across folders without creating multiple objects in the repository.
- Monitored scheduled daily, weekly and monthly workflows and executed the manual weekly and monthly jobs.
Environment: Informatica PowerCenter 8.6, Oracle, SQL Server, Teradata, BTEQ, FastLoad, MultiLoad, Unix, Windows XP
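The FastLoad work listed above revolves around a control script driving the load. The sketch below assembles one in Python purely so the pieces are easy to see; the host, credentials, table and file names are hypothetical placeholders, not values from any real environment.

```python
# Assemble an illustrative Teradata FastLoad control script.
# All identifiers (tdprod, etl_user, edw_stg.cust_stage, cust.dat)
# are placeholders for this sketch.
columns = ["cust_id", "cust_name", "city"]

def fastload_script(table, err_prefix, data_file, delimiter="|"):
    defines = ", ".join(f"{c} (VARCHAR(100))" for c in columns)
    values = ", ".join(f":{c}" for c in columns)
    return "\n".join([
        "LOGON tdprod/etl_user,password;",        # placeholder credentials
        f'SET RECORD VARTEXT "{delimiter}";',     # delimited input records
        f"DEFINE {defines} FILE={data_file};",    # input layout and data file
        f"BEGIN LOADING {table} ERRORFILES {err_prefix}_e1, {err_prefix}_e2;",
        f"INSERT INTO {table} VALUES ({values});",
        "END LOADING;",
        "LOGOFF;",
    ])

script = fastload_script("edw_stg.cust_stage", "edw_wrk.cust", "cust.dat")
```

In practice a script like this would be fed to the fastload utility from a shell wrapper scheduled by the job scheduler.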