Sr. Informatica MDM/ETL Developer Resume
Baltimore, MD
SUMMARY:
- Software professional with 9+ years of experience in Information Technology, with extensive experience in Master Data Management and as an ETL developer, with prime focus on the analysis, design, development, customization, and maintenance of data warehouse applications, delivering leading-edge software solutions.
- Experience integrating external applications with the Informatica MDM Hub using message queues.
- Experience designing, developing, testing, reviewing, and optimizing Informatica MDM (Siperian) and IDD applications.
- Excellent hands-on experience with Data Exchange, MDM, IDQ, Data Transformation, and SQL development.
- Expertise in Master Data Management concepts and methodologies, with the ability to apply this knowledge in building MDM solutions.
- Excellent experience with ETL methodologies, business intelligence, and data warehousing principles and architectures, including the concepts, design, and usage of data warehouses and data marts.
- Configured and maintained various components of the MDM Hub, including the schema, staging and landing tables, base objects, lookups, hierarchies, display queries, put queries, and query groups.
- Experience in installation and configuration of core Informatica MDM Hub components.
- Expertise in creating Mappings, Trust and Validation rules, Match Path, Match Column, Match rules, Merge properties and Batch Group creation.
- Designed, developed and translated business requirements and processes for data matching and merging rules, survivorship criteria, and data stewardship workflows.
- Configured IDD applications to enable subject area groups, subject area children, dropdown lookups, dependent lookups, sibling references, cleanse functions, etc.
- Performed root cause analysis for data quality / code / MDM related issues, and worked with different teams to bring the defects to closure.
- Provided continuing enhancement and review of MDM matching rules, data quality and validation processes.
- Created Java user exits (PostLoad, PostMerge, and PostUnMerge) to customize Hub functionality.
- Analyzed regulatory restrictions and the need to identify the golden record for master data.
- Strong understanding of Dimensional Modeling, OLAP, Star, Snowflake Schema, Fact, and Dimensional tables and DW concepts.
- Developed Informatica mappings using various transformations, including Source Qualifier, Application Source Qualifier, XML Source Qualifier, Lookup, Filter, Expression, Router, Normalizer, Joiner, Update Strategy, Rank, Aggregator, Stored Procedure, Sorter, Sequence Generator, and Web Services transformations.
- Involved in OLAP data modeling for star and snowflake schemas.
- Strong data analysis skills to ensure accuracy and integrity of data in the context of business functionality.
- Experience in the SIF framework and Application Program Interface (API) development.
- Experience using the Informatica command line utilities, such as pmcmd to execute workflows, sessions, and tasks, and pmrep to migrate code.
- Experience in performance tuning of sources, targets, and mappings, using pushdown optimization and session partitioning techniques such as round-robin, hash key, key range, and pass-through.
- Experience in integration of various data sources such as Oracle, SQL Server, DB2, flat files, and XML files.
- Scheduled ETL Jobs using Autosys scheduler, and Informatica scheduler to run daily, weekly, and monthly jobs.
- Created concurrent and sequential sessions in workflows to run jobs in various environments.
- Expertise in writing data cleanup scripts using SQL queries and UNIX scripts (a sample cleanup query follows this summary).
- Proficient in data warehousing techniques for data cleansing, Slowly Changing Dimensions (SCDs), Change Data Capture (CDC), and Customer Data Integration (CDI).
- Proven ability to manage multiple projects without affecting quality, delivering on agreed timelines.
- Prepared migration documents for moving code from development to testing and production environments.
- Expertise in performance tuning of sources, mappings, targets, and sessions.
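A representative sketch of the kind of SQL cleanup script referenced above: removing duplicate staging rows before an MDM load. This is illustrative only; the CUSTOMER_STG table, its columns, and the business key are hypothetical (Oracle syntax).

    -- Delete duplicate staging rows, keeping the most recently updated
    -- row per business key. All object names here are hypothetical.
    DELETE FROM customer_stg
    WHERE ROWID IN (
        SELECT rid
        FROM (
            SELECT ROWID AS rid,
                   ROW_NUMBER() OVER (
                       PARTITION BY src_system, customer_no  -- business key
                       ORDER BY last_updt_dt DESC            -- keep the newest
                   ) AS rn
            FROM customer_stg
        )
        WHERE rn > 1
    );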
TECHNICAL ENVIRONMENT:
ETL Tools: Informatica Power Center 5.x/6.x/7.x/8.x/9.x, Informatica MDM 10.1, IDD, IDQ, Informatica Analyst
BI Tools: Oracle Business Intelligence 11.1.1.7, MicroStrategy.
Design tools: ERWIN
Querying tools: SQL Developer, TOAD, SQL Plus
Languages: Shell Scripting, FTP Batch script, PL/SQL, SQL, C, COBOL, PASCAL, C++, Java, XML and HTML
Databases: Oracle 11g/10g/9i/8i/8.0/7.0, DB2, SQL Server 2000/2005/2008/2012/2014, MySQL, MS Access
Scheduling Tool: Informatica Scheduler, Autosys
OS: DOS, UNIX, Linux, Windows 95/98/2000/NT/XP
PROFESSIONAL EXPERIENCE:
Confidential, Baltimore, MD
Sr. Informatica MDM/ETL Developer
Responsibilities:
- Participated in business meetings to analyze the business requirements and developed a design plan.
- Coordinated with business analysts to analyze the business requirements and designed and reviewed the implementation plan.
- Worked in an Agile Software Development life cycle to successfully accommodate change in requirements.
- Created draft technical design documents based on the functional design documents.
- Involved in fact and dimension tables design sessions to create fact and dimensions for data warehouse and data marts based on project requirements.
- Developed and supported the extraction, transformation, and load (ETL) process for data migration using Informatica Power Center.
- Extensively involved in developing ETL code using the Informatica ETL tool to meet requirements for extraction, transformation, cleansing, and loading of data from source to target data structures.
- Created various transformations like filter, router, lookups, stored procedure, joiner, update Strategy, expressions, and aggregator to pipeline data to Data Warehouse/Data Marts.
- Used Match and Consolidator transformations for fuzzy matching to remove duplicates.
- Defined, configured, and optimized various MDM processes including landing, staging, base objects, foreign-key relationships, lookups, query groups, queries/custom queries, cleanse functions, batch groups and packages using Informatica MDM Hub console.
- Defined trust and validation rules before loading the data into the base tables.
- Ran stage jobs, load jobs, and match and merge jobs using the Batch Viewer and automation processes.
- Performed Data standardization of addresses using Address Doctor, and other services as defined by the business.
- Designed, developed, and translated business requirements and processes for data matching and merging rules, survivorship criteria, and data stewardship workflows.
- Implemented Land Process for loading the Student Data Set into Informatica MDM from various source systems.
- Involved in designing the MDM data model; created base objects and mappings, defined trust settings for sources, customized user exits, and customized IDD applications.
- Maintained the Java code used to build the custom web services, including Create, Update, Match, Search, Detail Search, Merge/Unmerge, Multi-Client Search, Program Participation, etc.
- Involved in writing some of the IDD User exits in Java for Merge / Unmerge.
- Designed and configured landing tables, staging tables, base objects, hierarchies, foreign-key relationships, lookups, query groups, queries/custom queries, and packages.
- Performed rule-based profiling on the client's business data to understand the degree of accuracy and duplication relevant to the MDM implementation.
- Wrote / maintained IDD user exits to perform data validation as part of the CRUD process.
- Configured match and merge rules (exact and fuzzy) for the consolidation process, along with validation rules and trust scores for the respective BO tables.
- Configured Hierarchies (Entities/ Entity Objects/ Hierarchies/ Relationship Objects/ HM Profiles) to define the hierarchy data of the corresponding data-model in IDD.
- Created IDD applications with subject areas, subject area groups, and business entities; deployed and tested IDD applications and cleanse functions, utilized the timeline feature, and exported and imported master data from flat files.
- Used SIF APIs (GET, SearchMatch, PUT, CleansePut, ExecuteBatchDelete, etc.) to search, update, cleanse, insert, and delete data.
- Created Batch deployment groups and ran audit trail for Incremental loads.
- Altered Trust scores and set up Validation rules as per the business requirements.
- Implemented Stage Jobs, Load Jobs, Match, and Merge Jobs using the Batch Viewer and Automation Processes.
- Initiated API calls using SoapUI to perform cleansing for base objects, HREF, and HXREF.
- Actively involved in gathering requirements from end users, involved in modifying various technical & functional specifications in the development.
- Used the Debugger in debugging some critical mappings to check the data flow from instance to instance.
- Developed Informatica Mappings/Workflows to load data from Oracle, DB2 Database, Flat Files into Oracle.
- Involved in migration of the Informatica components for multiple releases.
- Created and maintained Migration documentation and Process Flow for mappings and sessions.
- Created various tasks like sessions, worklets, and workflows in the workflow manager to test the mapping during development.
- Involved in Performance tuning of various mappings and sessions to increase the performance.
- Extensively worked on tuning and thereby improving the load time.
- Involved in the error checking and testing of ETL Procedures using Informatica Session log and Workflow logs.
- Developed mappings to pull information from different tables, using a SQL override to join the tables instead of a Joiner transformation to improve performance (see the sketch after this list).
- Scheduled sessions to update the target data using Workflow Manager of Informatica.
- Reviewed data transformation rules and provided technical suggestions on data transformation logic and pseudocode.
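A minimal sketch of the SQL override technique mentioned above: the join runs in the database inside the Source Qualifier, so the Integration Service receives pre-joined rows and no Joiner cache is needed. The STUDENT and ENROLLMENT tables and the $$LAST_RUN_DATE mapping parameter are hypothetical.

    -- Hypothetical Source Qualifier SQL override (Oracle syntax).
    -- Joining here avoids the memory cost of a Joiner transformation.
    SELECT s.student_id,
           s.first_name,
           s.last_name,
           e.program_cd,
           e.enroll_dt
    FROM   student s
    JOIN   enrollment e
      ON   e.student_id = s.student_id
    WHERE  e.enroll_dt >= TO_DATE('$$LAST_RUN_DATE', 'YYYY-MM-DD')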
Environment: Informatica MDM 10.1, Informatica 9.1.0/8.6.1, OBIEE 11.1.1.7.1, IDD, IDQ, Oracle 11g, DB2, SQL, Java, PL/SQL, SQL Developer, flat files (fixed width, delimited), CSV files.
Confidential, Irvine, CA
Informatica Developer/DWH Consultant
Responsibilities:
- Worked with business analysts to identify appropriate sources for data warehouse and to document business needs for decision support for data.
- Built the ETL specification document based on the functional requirements.
- Implemented sophisticated transformations using Informatica features such as Aggregator, Filter, Expression, Normalizer, Lookup, Update Strategy, and Source Qualifier.
- Created Mappings, Re-Usable Transformations and Mapplets using Designer of Informatica Power Center.
- Used the Joiner transformation to extract data from multiple sources.
- Wrote SQL overrides and used filter conditions in the Source Qualifier, thereby improving mapping performance.
- Implemented and reviewed the transformation logic in the ETL mappings according to the pre-defined ETL standards.
- Implemented performance tuning logic on sources, mapping, sessions, and targets in order to provide maximum efficiency and performance.
- Defined Target Load Order Plan for loading data into Target Tables.
- Extracted data from various heterogeneous sources like Oracle, SQL Server, DB2 and Flat Files.
- Implemented Slowly Changing Dimensions - Type I & II in different mappings as per the requirements.
- Performance tuning of the Informatica mappings using various components like Parameter files, Variables, and Dynamic Cache.
- Good understanding of various lookup caches like Static Cache, Dynamic Cache, and Persistent Cache.
- Used Push Down Optimization and Partitioning to improve the performance on Informatica.
- Coded and debugged mappings, resolving technical problems as they arose.
- Created an error log table to capture error messages and session load times.
- Used Workflow monitor to see the work flows running and get the session properties and session log.
- Created debug sessions to validate transformations, and extensively ran existing mappings in debug mode for error identification by creating breakpoints and monitoring the debug monitor.
- Created various UNIX Shell Scripts for scheduling various data cleansing scripts and loading process. Maintained the batch processes using Unix Shell Scripts.
- Wrote PL/SQL stored procedures for processing business logic in the database and tuned SQL queries for better performance (a representative error-logging procedure follows this list).
- Used Autosys JIL to schedule UNIX shell scripts and Informatica jobs.
- Worked on the migration of scripts from an older Sun Solaris version to Sun Solaris 10.
- Involved in writing unit test plans and performing unit testing on ETL mappings and workflows.
- Involved in troubleshooting the loading failure cases, including database problems.
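A minimal sketch of the error-logging pattern described above: an error log table written to by a PL/SQL stored procedure. All object names are hypothetical; the autonomous transaction keeps the log row even if the calling load rolls back.

    -- Hypothetical error log table.
    CREATE TABLE etl_error_log (
        session_name VARCHAR2(100),
        error_msg    VARCHAR2(4000),
        load_seconds NUMBER,
        logged_at    DATE DEFAULT SYSDATE
    );

    -- Hypothetical logging procedure, callable from post-session
    -- tasks or other PL/SQL blocks.
    CREATE OR REPLACE PROCEDURE log_etl_error (
        p_session_name IN VARCHAR2,
        p_error_msg    IN VARCHAR2,
        p_load_seconds IN NUMBER
    ) AS
        PRAGMA AUTONOMOUS_TRANSACTION;  -- commit the log row independently
    BEGIN
        INSERT INTO etl_error_log (session_name, error_msg, load_seconds)
        VALUES (p_session_name, p_error_msg, p_load_seconds);
        COMMIT;
    END log_etl_error;
    /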
Environment: Informatica 8.6.1, Oracle 11g/10g, Autosys, COBOL files, Netezza, MS Office, SQL Server 2008, MSRS, SSIS, DB2, SharePoint, Visual SourceSafe, Sun Solaris 10.
Confidential, Dallas, TX
Informatica Developer/DWH Consultant
Responsibilities:
- Involved in designing, developing, and delivering ETL framework.
- Worked with business users to define and document new or changing requirements.
- Translated business requirements into technical design for data transformation.
- Extracted data from Oracle, using tools such as Oracle Designer, and from flat files.
- Developed logical data models and assisted in creating physical data models.
- Designed and developed source to target data mappings. Effectively used Informatica recommended techniques for complex mapping design and for enhancing mapping performance.
- Created complex mappings using Unconnected Lookup, Sorter, Aggregator, dynamic Lookup, and Router transformations for populating target table in efficient manner.
- Configured connections to various sources and created source-to-target mappings (SCD mappings), edit rules and validations, transformations, and business rules.
- Hands-on experience with advanced optimization techniques such as concurrent caching, auto memory calculations, and pushdown optimization.
- Created various UNIX Shell Scripts for scheduling various data cleansing scripts and loading process. Maintained the batch processes using Unix Shell Scripts.
- Used techniques like source query tuning, single pass reading and caching lookups to achieve optimized performance.
- Identified data lineage using Metadata Manager.
- Developed PL/SQL stored procedures for source pre-load and target pre-load (a sketch follows this list).
- Created Informatica repository queries to document mappings and sessions.
- Created Shell Scripts to execute the sessions using pmcmd.
- Identifying Performance bottlenecks and fine-tuning ETL mappings and workflows to improve performance.
- Developed and executed System integration testing strategies. Conduct peer reviews.
- Handled Informatica administration responsibilities, including migration of code from Dev to QA and to the preproduction staging area.
- Scheduled batch operations using the UC4 automation tool for daily, weekly, monthly, and yearly financial operations.
- Used MS SQL Reporting Services for report generation and Integration Services alongside Informatica.
- Developed ad hoc reports and dashboards using BI tools such as OBIEE.
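A sketch of the target pre-load pattern mentioned above: clearing a staging table and marking an index unusable before a bulk load, with the index rebuilt in the post-load step. All object names are hypothetical.

    -- Hypothetical target pre-load procedure.
    CREATE OR REPLACE PROCEDURE trg_preload AS
    BEGIN
        -- TRUNCATE is DDL, so it must be issued via dynamic SQL in PL/SQL.
        EXECUTE IMMEDIATE 'TRUNCATE TABLE sales_stg';
        -- Mark a non-unique index unusable to speed up the bulk load;
        -- a matching post-load procedure would rebuild it.
        EXECUTE IMMEDIATE 'ALTER INDEX sales_stg_ix1 UNUSABLE';
    END trg_preload;
    /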
Environment: Informatica 8.1.1, OBIEE, Oracle 10g and Designer, SSIS, Windows Server, Perl, Linux, DB2, Sun Solaris 8, UC4
Confidential, Houston, TX
Informatica/ETL Developer
Responsibilities:
- Involved in Extraction, Transformation, and Loading (ETL) of data by using Informatica Power Center.
- Created mapping and mapplets using various transformations like Joiner, Filter, Aggregator, Lookup, Stored Procedures, Router, Sorter, Rank, Normalizer, and Update Strategy etc.
- Extensively worked on Mapping Variables, Mapping Parameters, Workflow Variables, and Session Parameters.
- Used Informatica Workflow Manager and Workflow Monitor to schedule and monitor session status.
- Created complex mappings using Unconnected Lookup, Sorter, Aggregator, Dynamic Lookup, and Router transformations to populate target tables in an efficient manner.
- Involved in performance tuning of targets, sources, mappings, and sessions.
- Created post-session and pre-session shell scripts and mail-notifications.
- Extracted data from COBOL files using copybooks.
- Used midstream XML Parser transformation to extract data from xml files.
- Used midstream XML generator transformation to generate target xml files.
- Provided Knowledge Transfer to the end users and created extensive documentation on the design, development, implementation, daily loads, and process flow of the mappings.
- Used Debugger to troubleshoot the mappings.
- Used ETL to extract and load data from Oracle, SQL Server, and flat files to Oracle.
- Involved in writing numerous functions and stored procedures.
- Designed, developed, and implemented Error handling procedures in mappings.
- Used Debugger wizard to remove bottlenecks at source level, transformation level, and target level for the optimum usage of sources, transformations, and target loads.
- Used shell scripts for automating the execution of maps.
- Designed and developed Oracle PL/SQL scripts for data import/export (see the sketch after this list).
- Created relational connections and migrated mappings from Dev to Test and from Test to Production.
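A minimal sketch of the PL/SQL export approach noted above, writing rows to a flat file with UTL_FILE. The EXP_DIR directory object and the ORDERS table are hypothetical.

    -- Hypothetical export block; EXP_DIR must be an Oracle directory
    -- object that the session has write access to.
    DECLARE
        v_file UTL_FILE.FILE_TYPE;
    BEGIN
        v_file := UTL_FILE.FOPEN('EXP_DIR', 'orders_export.csv', 'w');
        FOR r IN (SELECT order_id, customer_id, order_dt FROM orders) LOOP
            UTL_FILE.PUT_LINE(v_file,
                r.order_id || ',' || r.customer_id || ',' ||
                TO_CHAR(r.order_dt, 'YYYY-MM-DD'));
        END LOOP;
        UTL_FILE.FCLOSE(v_file);
    END;
    /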
Environment: Informatica Power Center 8.x, Oracle 9i, Mantas, SQL Server 2005, SSIS, VSAM, SQL Plus, PL/SQL, Toad 7.0, SCO UNIX 5.0.5, Windows XP, XML.
Confidential
Datawarehouse Consultant
Responsibilities:
- Worked with business analysts and business users on the design of technical specification documents for the data warehouse during multiple phases of the project.
- Participated and provided input in business review meetings for approving logical and physical data models.
- Involved in gathering user requirements and analysis.
- Involved in working with heterogeneous data sources like flat files, DB2 and Oracle.
- Implemented a Type 2 dimension model for the Customer/Supplier/Product record book, i.e., new attributes/columns were introduced in the dimension tables through slowly changing dimensions (see the sketch after this list).
- Used complex mappings involving target load order and event-based loading.
- Created various transformations like Joiner, Lookup, Router, Filter, Update Strategy, Sequence Generator, Sorter and Stored Procedure.
- Created Mapplets for reusable business rules.
- Created Shortcuts to reuse objects across folders without creating multiple objects in the repository.
- Loaded bad data using the reject loader utility and actively coordinated with the testing team during integration and regression testing.
- Dealt successfully with resource problems and reduced DBA turnaround time on data warehouse issues.
- Implemented performance tuning techniques at the system, session, mapping, transformation, source, and target levels.
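A sketch of the Type 2 logic described above, expressed as the SQL the mapping effectively performs: expire the current dimension row, then insert the new version. CUSTOMER_DIM, its columns, and the sequence are hypothetical.

    -- Step 1: close out the current version of a changed customer row.
    UPDATE customer_dim
    SET    eff_end_dt  = SYSDATE,
           current_flg = 'N'
    WHERE  customer_no = :in_customer_no
    AND    current_flg = 'Y';

    -- Step 2: insert the new version with an open-ended effective range.
    INSERT INTO customer_dim
        (customer_key, customer_no, cust_name,
         eff_start_dt, eff_end_dt, current_flg)
    VALUES
        (customer_dim_seq.NEXTVAL, :in_customer_no, :in_cust_name,
         SYSDATE, TO_DATE('9999-12-31', 'YYYY-MM-DD'), 'Y');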
Environment: Informatica Power Center 7.1/6.2, Oracle 9i, Windows 2003 R2, 10.1.3.x, HP Quality Center, Toad