Sr. Informatica Developer/Technical Lead Resume
PROFESSIONAL SUMMARY:
- Over 11 years of Information Technology experience in Data Warehouse projects using tools like Informatica PowerCenter 9.1/8.6/8.1.1/7.x/6.x, PowerExchange 8.x, Mapping Architect for Visio and the Informatica Developer (IDQ) tool.
- Extensive experience in Banking and Financial Domains.
- Strong knowledge of Data Warehouse architecture and of designing Star Schema, Snowflake Schema, Fact and Dimension tables.
- Extensive experience in extracting, transforming and loading data using Informatica from different source systems, including flat files, RDBMS tables and Excel sheets, into the warehouse.
- Extensive experience in designing and developing complex mappings using transformations such as Unconnected/Connected Lookup, Router, Filter, Expression, Aggregator, Joiner, Union and Update Strategy.
- Hands-on experience in troubleshooting and implementing performance tuning at various levels such as source, target, mapping and session.
- Experience creating and administering repositories, folders, mappings, sessions, workflows, mapplets, user-defined functions and transformations using Repository Manager, Designer and Server Manager/Workflow Manager.
- Experience in Administrative jobs like code migration, Import/Export utilities.
- Hands on experience in creating & configuring sessions, workflows and worklets using the Workflow Manager of Informatica.
- Worked on UNIX shell scripts for scheduling Informatica pre/post-session operations.
- Proficiency in data warehousing techniques for data cleansing, Slowly Changing Dimensions (SCDs) and surrogate key assignment.
- Hands on experience through complete Software Development Life Cycle (SDLC) including Analysis, Design, Development, Testing and Implementation.
- Expertise in Developing PL/SQL Packages, Stored Procedures/Functions, triggers and Indexes.
- Expertise in creating Database objects like Tables, Views.
- Enthusiastic about learning new technologies through self-learning.
- Learned the Big Data technologies Hadoop, Hive, Pig and Sqoop.
- Interested in working on Big Data technologies.
- Excellent communication and presentation skills; a good team player and self-starter with the ability to work independently and as part of a team.
TECHNICAL SKILLS:
Data Warehousing: Informatica PowerCenter 9.1/8.x/7.x/6.x (Repository Manager, Designer, Workflow Manager, Workflow Monitor), Hadoop, Hive, Pig, Sqoop, SQL*Loader
Databases: Oracle 11g/10g/9i/8i/7.3, DB2
Data Modeling: ERwin 4.x/3.x, Ralph Kimball methodology, Bill Inmon methodology, Star Schema, Snowflake Schema, Extended Star Schema, Physical and Logical Modeling, Dimensional Data Modeling, Fact Tables, Dimension Tables, Normalization, Denormalization
Programming: UNIX Shell Scripting, SQL, PL/SQL, Java
Environment: Windows XP/2000/98, Windows NT, UNIX (Sun Solaris 10, HP-UX, AIX), Linux
Other Tools: Tivoli Scheduling, Control-M Scheduling Tool, SQL*Plus, TOAD, PuTTY, MS Office, MS Visio
PROFESSIONAL EXPERIENCE:
Confidential
Sr. Informatica Developer/Technical Lead
Environment: Informatica Power Center 9.6.1, Informatica Developer 9.6.1, Tivoli Scheduling, Netezza 7, LINUX, UNIX Shell Scripting
Responsibilities:
- Working with Data architect team to understand functional and technical requirements
- Creating source to target mapping document for ETL developers
- Documenting various ETL standards to be followed which include both Informatica and Oracle Design standards
- Creating various Informatica mappings to load data from Staging to Data warehouse (dimension and fact tables)
- Working on Informatica Power Center to create various Type 1 and Type 2 mappings using transformations like Joiner, Lookup, Router, Filter, Union, Sequence Generator, Aggregator and Expression
- Creating workflows and sessions to execute mappings using Workflow manager
- Worked extensively with mapping variables, session variables and workflow variables
- Used the Developer tool to create mappings, profiles and scorecards to understand source data inconsistencies and primary key dependencies.
- Using UNIX parameter files at session and workflow level
- Working on sessions to be executed in parallel or series based on requirement
- Working on pre- and post-session commands to send email notifications using Informatica Workflow Manager and to rename files in UNIX (a sample post-session script is sketched after this list)
- Creating Unit Test Plan and test case document for all ETL changes
- Migrating ETL changes from DEV to QA. Also involved in taking Repository backup to avoid any loss of code being developed
- Working with QA team providing Developer support and resolving any issues in test environment
- Executing data loads in DEV and QA and collecting statistics for all ETL jobs
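A minimal sketch of the kind of post-session shell command referenced above: it archives the processed feed file so the next run does not pick it up again. The directory names and file pattern are illustrative assumptions; email notification itself was configured through Workflow Manager.

```sh
#!/bin/ksh
# Post-session command sketch: archive the processed feed file.
# Paths and the file pattern below are assumptions, not project values.
SRC_DIR=/data/etl/landing          # hypothetical landing directory
ARCH_DIR=/data/etl/archive         # hypothetical archive directory
FILE_PATTERN="customer_feed_*.dat" # hypothetical feed file pattern

for f in "$SRC_DIR"/$FILE_PATTERN; do
    [ -f "$f" ] || continue                     # nothing matched this run
    ts=$(date +%Y%m%d%H%M%S)
    mv "$f" "$ARCH_DIR/$(basename "$f").$ts"    # rename with a load timestamp
done
```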
Confidential
Technical Lead
Environment: Informatica Power Center 9.5.1, Tivoli Scheduling, Oracle 11g, SQL Developer 7.1.0, LINUX, UNIX Shell Scripting
Responsibilities:
- Working with Data architect team to understand functional and technical requirements
- Creating source to target mapping document for ETL developers
- Documenting various ETL standards to be followed which include both Informatica and Oracle Design standards
- Performing code review and approving the code to higher environment
- Providing status updates to Project Manager by gathering updates from developers
- Assigning tasks to developers based on priority
- Involved in Data warehouse design to load data into Staging and the Data mart
- As a developer, involved in building various proof-of-concept (POC) mappings for the development team
- Working on loading data into Staging from Oracle and flat files
- Creating various Informatica mappings to load data from Staging to the Data mart (dimension and fact tables)
- Working on Informatica Power Center to create various Type 1 and Type 2 mappings using transformations like Joiner, Lookup, Router, Filter, Union, Sequence Generator, Aggregator and Expression
- Created reusable lookup transformation for mappings which use static tables like Geography and Time dimension tables
- Creating workflows and sessions to execute mappings using Workflow manager
- Worked extensively with mapping variables, session variables and workflow variables
- Using UNIX parameter files at the session and workflow level (a sample parameter-file generation script is sketched after this list)
- Working on sessions to be executed in parallel or series based on requirement
- Working on pre- and post-session commands to send email notifications using Informatica Workflow Manager and renaming files in UNIX
- Worked extensively with the Oracle admin to improve performance, including creating indexes and partitioning
- Using Oracle hints for better performance in SQL override queries
- Walking through each Informatica module and implementing performance tuning logic on sources, targets and sessions
- Implemented Error handling and Exception handling in Informatica Mapping Designer
- Understanding data quality scenarios from the business and implementing them in Staging
- Creating Unit Test Plan and test case document for all ETL changes
- Migrating ETL changes from DEV to QA. Also involved in taking Repository backup to avoid any loss of code being developed
- Also worked as Informatica admin, adding users to the repository, creating folders for each project being developed and granting permissions to users
- Working with QA team providing Developer support and resolving any issues in test environment
- Executing data loads in DEV and QA and collecting statistics for all ETL jobs
- Using Informatica scheduler to execute jobs accordingly
- Working closely with the BO team and Risk team (customers) to help understand their needs and make the necessary ETL changes
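A minimal sketch of how a session/workflow parameter file might be generated from UNIX, as referenced above. The folder, workflow, session, connection and variable names are illustrative assumptions, not the actual project objects.

```sh
#!/bin/ksh
# Sketch: generate the parameter file used by a daily load session.
# All object names below are assumptions for illustration only.
PARAM_DIR=/data/etl/param          # hypothetical parameter-file directory
LOAD_DATE=$(date +%Y-%m-%d)

cat > "$PARAM_DIR/wf_daily_load.param" <<EOF
[DWH_FOLDER.WF:wf_daily_load.ST:s_m_load_fact_sales]
\$\$LOAD_DATE=$LOAD_DATE
\$DBConnection_SRC=ORA_STG
\$DBConnection_TGT=ORA_DM
\$InputFile1=/data/etl/landing/sales_$LOAD_DATE.dat
EOF
```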
Confidential
Technical Lead
Environment: Informatica Power Center 9.1, Db2, Control-M Scheduling Tool, Visio, LINUX, UNIX Shell Scripting
Responsibilities:
- Worked as ETL Architect and Onsite coordinator
- Coordinating work with the offshore team for the new enhancements and development.
- Working with Data architect team to understand functional and technical requirements
- Creating source to target mapping document for ETL developers
- Documenting various ETL standards to be followed which include both Informatica and Oracle Design standards
- Performing code review and approving the code to higher environment
- Design and development of complex ETL mappings making use of Connected/Unconnected Lookups, Normalizer, Stored Procedures, SQL & Custom Transformations.
- Implementation of successful strategies in loading feeds to the Frontier Data Warehouse.
- Implemented optimization and performance tuning techniques to identify bottlenecks, including query tuning and cache management.
- Development of PL/SQL Stored Procedures to implement complex business logic and integrate them with the ETL process as pre/post Stored Procedures.
- Designed pmcmd and UNIX shell scripts to incorporate dependencies and scheduled jobs to run ETL processes (a sketch of this pattern follows).
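A minimal sketch of the pmcmd/shell dependency pattern described above: the warehouse workflow starts only if the staging workflow succeeds. The integration service, domain, folder and workflow names are assumptions, and credentials would normally be read from a secured file rather than hard-coded.

```sh
#!/bin/ksh
# Sketch: chain two workflows so the second runs only on success of the first.
# Service, domain, folder and workflow names are assumptions.
INFA_USER=etl_user
INFA_PWD=changeme            # placeholder; normally sourced from a secured file
DOMAIN=Domain_DW
INT_SVC=IS_DW
FOLDER=FRONTIER_DWH

run_wf() {
    # -wait makes pmcmd block until the workflow completes and return its status
    pmcmd startworkflow -sv "$INT_SVC" -d "$DOMAIN" \
          -u "$INFA_USER" -p "$INFA_PWD" -f "$FOLDER" -wait "$1"
}

run_wf wf_load_staging   || { echo "Staging load failed";   exit 1; }
run_wf wf_load_warehouse || { echo "Warehouse load failed"; exit 1; }
```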
Confidential
Team Lead
Environment: Informatica Power Center 9.1/8.6.1, DB2, Control-M Scheduling Tool, Visio, ERWIN, LINUX, UNIX Shell Scripting
Responsibilities:
- Understand the existing application architecture and their process.
- Go through requirement analysis with the end-to-end teams and the business.
- Understand how the requirement impacts the existing application
- Document the impacted components in the requirement documents
- Discuss with the source systems and understand the process at their end.
- Discuss with the downstream systems and explain about the impacts because of implementing the new changes.
- Document the method of change for the impacted components
- Discuss with the Data Architects and walkthrough the data elements that are impacted
- Discuss with Data Architects and finalize the LDM.
- Discuss with Data Architects and finalize the PDM.
- Set up calls with offshore and transition the functional knowledge about the new requirements.
- Discuss the LDM & PDM changes with offshore and hand over the design document
- Discuss and finalize the design approach and high level architecture for the functional changes.
- Discuss and finalize the Test Approach & Test strategy with end to end teams and share the same with offshore.
- Coordinate with the end-to-end teams, source systems and downstream systems during SIT & UAT to get the files for testing
- Coordinate with the end-to-end teams, source systems and downstream systems to rectify issues in the source files sent by the source systems.
- Track the status of the build with offshore and review the code and testing results.
- Communicate the development and testing status to the end-to-end teams
- Implement the changes in production
- Verify the change in production and provide warranty support for the changes.
Confidential
ETL Developer/Tester
Responsibilities:
- Understanding the IDN environment.
- Worked on Informatica IDQ tool.
- Created the profiles and scorecards.
- Created the Power Exchange data maps for reading EBCDIC files from mainframes.
- Involved in the discussions of the MDF architecture.
- Worked in the Testing team to test the Framework.
- Raised the issues and defects in each module
- Created the Mappings and Data Maps as a part of the testing.
- Created the Data Maps to read the fixed width EBCDIC files.
- Worked on Informatica Developer to perform the profiling.
- Created profiles to analyze column-level patterns and gather statistics to understand the source data.
- Created profiles to understand primary key dependencies.
- Created Perl scripts and UNIX scripts for file processing (a sample file-processing sketch follows this list).
- Tested the Stored Procedures which are part of the MDF framework.
- Worked in AGILE methodology.
- Participated in daily standup meetings and status meetings.
- Trained the resources within IDN to introduce the new MDF framework.
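A minimal sketch of the kind of UNIX file-processing wrapper mentioned above: it picks up incoming feed files, logs a basic size check and stages them for the downstream jobs. Directory names and the logging convention are assumptions; the actual EBCDIC parsing was done through the PowerExchange data maps, not the shell script.

```sh
#!/bin/ksh
# Sketch: stage incoming feed files for the Informatica/PowerExchange jobs.
# Paths below are assumptions for illustration only.
IN_DIR=/data/idn/incoming          # hypothetical inbound directory
WORK_DIR=/data/idn/work            # hypothetical staging directory

for f in "$IN_DIR"/*.dat; do
    [ -f "$f" ] || continue                    # nothing to process this run
    size=$(wc -c < "$f")                       # byte count (files may not be line-delimited)
    echo "$(date '+%Y-%m-%d %H:%M:%S') $(basename "$f") bytes=$size"
    mv "$f" "$WORK_DIR/"                       # hand the file to the ETL jobs
done
```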