
IBM InfoSphere MDM Developer Resume


Princeton, NJ

SUMMARY

  • 8 years of IT experience, with 5+ years of ETL and data integration experience developing ETL mappings using Informatica PowerCenter 9.x/8.x/7.x/6.x
  • 5+ years of experience configuring, maintaining, testing, and troubleshooting Master Data Management (MDM) solutions
  • Thorough understanding of Software Development Life Cycle (SDLC) including requirements analysis, system analysis, design, development, documentation, and implementation
  • Sound knowledge of the Pharma, Telecom, Healthcare, Insurance and Banking Domains.
  • Expertise in installing, managing, and configuring Informatica MDM Hub Server, Informatica MDM Hub Cleanse Server, Cleanse Adapters, Hub Resource Kit, and IDD
  • Experience in creating Base objects, Staging tables, foreign key relationships, queries, packages, query groups and custom functions in Informatica MDM.
  • Created and defined entity objects, entity types, relationship objects, relationship types, and hierarchies using the Hierarchies tool
  • Expertise in Informatica MDM Hub Match and Merge Rules, Batch Jobs and Batch Groups.
  • Performed data standardization of addresses using Address Doctor, Trillium, and Informatica Data Quality, and real-time data integration using SIF APIs.
  • Expertise in IBM InfoSphere MDM server implementation for customer domain.
  • Experience in MDM Areas such as Additions, Data and Behavior Extensions, Business Proxies, Suspect Processing, External Rules, Batch Framework, Notifications and DSUI
  • Experienced in customizing InfoSphere MDM Server using MDM Workbench
  • Worked on the Request/Response and Notification frameworks of InfoSphere MDM.
  • Expertise in IBM InfoSphere MDM Server installation, deployment, and upgrade.
  • Knowledge of WebSphere Application Server and JBoss
  • Expertise in Informatica 9.6/9.1/8.6/8.1 using components like Source Analyzer, Warehouse Designer, Transformation Developer, Mapplet Designer, Mapping Designer, Workflow Manager and Workflow Monitor
  • Experience in performance tuning; identified and resolved performance bottlenecks at various levels, including sources, targets, mappings, and sessions.
  • Expertise in developing mappings and sessions and running workflows using parameter files and variables.
  • Experience in UNIX environments; wrote UNIX shell scripts for Informatica pre- and post-session operations.
  • Vast experience with different databases like Oracle, Teradata, DB2, SQL Server and MS Access.
  • Expertise in end-user training, design documentation, and application/product demos.
  • Excellent analytical, programming, written and verbal communication skills with ability to interact with individuals at all levels.
  • Experience with Informatica Cloud; developed data synchronization and replication tasks.

TECHNICAL SKILLS

MDM Tools: Informatica Multidomain MDM 9.1/9.5, IDD, SIF, IBM InfoSphere MDM server 9.1/11.3, IBM MDM WCC

Programming Languages: Java/J2EE, PL/SQL, C, C++, Visual Basic 6.0, ASP.NET, XML, HTML, SQL

Application Servers: IBM WebSphere Application Server (WAS), JBOSS

ETL Tools: Informatica PowerCenter 9.x/8.x/7.x/6.x, Informatica Data Quality 8.6, Informatica Power Exchange, Informatica Cloud, Informatica Data Explorer 8.6

Database Tools: Oracle 11g/10g/9i/8i/7.x, SQL, MS SQL Server 2000/2005, Teradata V2R5/V2R4, Microsoft Access, DB2, Siebel, SOQL

Tools/IDE: Rational RAD/RTC, Eclipse, Erwin 3.5, Unified Modeling Language (UML), MS Office 2003, TOAD 7.6/8.0, PuTTY, PSFTP, SIF, UNIX, Linux, Windows NT/98/2000/2003, Sun Solaris

Job Scheduling: CA Autosys, Control-M, Informatica Scheduler, Tidal, Maestro

PROFESSIONAL EXPERIENCE

Confidential, Princeton, NJ

Informatica MDM Hub Configuration Specialist

Responsibilities:

  • Responsible for resolving production incidents as they arrive, within the SLA time frame.
  • Troubleshot and identified high-priority issues, providing short-term fixes for high-volume and recurring incidents until long-term fixes were in place
  • Provided role-based data security to data stewards
  • Monitored the Informatica MDM batch jobs, made sure the jobs ran smoothly, and troubleshot any jobs that failed.
  • Accommodated user requests and queries based on priority
  • Performed data profiling, standardization and matching.
  • Defined and configured the schema: landing tables, staging tables, base objects, foreign-key relationships, lookup systems and tables, packages, query groups, and queries/custom queries.
  • Configured trust and validation rules and set up the match/merge rule sets (see the sketch below)
  • Developed IDQ Data Quality standardization & cleansing rules using the profiling results
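A minimal illustrative sketch of the kind of query used when reviewing match results before merge, assuming a hypothetical PARTY base object named C_PARTY and its hub-generated C_PARTY_MTCH match table (table and column names are assumptions, not taken from the project):

    -- List match pairs queued for manual merge after the match batch job
    SELECT m.ROWID_OBJECT,
           m.ROWID_OBJECT_MATCHED,
           m.AUTOMERGE_IND          -- 0 = manual-merge queue, 1 = automerge
    FROM   C_PARTY_MTCH m
    WHERE  m.AUTOMERGE_IND = 0
    ORDER  BY m.ROWID_OBJECT;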

Environment: Informatica MDM 9.5, Informatica PowerCenter 9.1, Oracle 11g/10g, Toad, WebSphere Application Server, Address Doctor 5, IDD, Windows Server 2003, UNIX, Tivoli

Confidential, Durham, NC

IBM InfoSphere MDM Developer

Responsibilities:

  • Gathered requirements and created functional and technical specifications for master data requirements
  • Made changes to the MDM party model based on client requirements
  • Developed, built, and tested custom code based on client requirements
  • Deployed the application to the WAS server.
  • Performed pre- and post-code-migration tasks whenever code was deployed
  • Involved in development of data extensions, behavior extensions, and data additions
  • Involved in development of Business Proxies
  • Customized match rules for suspect duplicate processing by defining custom critical data elements.
  • Involved in development and implementation of survivorship rules
  • Upgraded IBM InfoSphere MDM server from version 9 to Version 11
  • Managed all MDM defects, analyzing and fixing issues.
  • Troubleshoot system or data driven errors, bringing a thorough understanding of MDM activity and error logs.
  • Support production releases of new functionality / new source system introduction to MDM.
  • Creating change records and interacting with various teams to manage the implementation.
  • Monitored all initial data load details for every new system
  • Implemented a response handler framework to identify, re-send and report the message failures in nightly loads
  • Acted as SME for IBM InfoSphere MDM application
  • Heavily involved in production support, resolving incoming incidents every day within the given SLA time
  • Triaged incidents, communicated within the team, and delegated each incident to the appropriate member of the right team.
  • Developed various custom data cleanup jobs to fix data defects caused by gaps in the implementation (see the sketch below)
  • Modified the DSUI functionality to support client requirements
  • Performed role of data-steward to correct the data to fix production incidents.
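A minimal sketch of the kind of SQL a data cleanup job might run, assuming a hypothetical PARTY_ADDRESS table (all table and column names are illustrative, not from the actual model):

    -- Null out address lines that contain only whitespace, left behind by a load gap
    UPDATE PARTY_ADDRESS
    SET    ADDR_LINE_TWO  = NULL,
           LAST_UPDATE_DT = SYSDATE
    WHERE  ADDR_LINE_TWO IS NOT NULL
      AND  TRIM(ADDR_LINE_TWO) IS NULL;
    COMMIT;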

Environment: IBM InfoSphere MDM Server 9.1, IBM InfoSphere MDM Server 11.0, Linux, Oracle 10g/9i, Toad 9.0.1, WAS, Ab Initio, Control-M, FACETS, Rational RAD/RTC, Serena Dimensions, HP Service Center, WebSphere Message Broker

Confidential, Frazer, PA

MDM HUB Developer

Responsibilities:

  • Configuration and creation of landing and staging tables for newly added sources
  • Added new sources to the existing MDM implementation.
  • Created mappings to load data into the staging tables
  • Built the cleanse functions based on the requirements for the mappings.
  • Defined the Foreign Key relationships and provided lookup information for the columns in the staging tables
  • Tested already existing match rules and performed match tuning by determining over and under matches.
  • Modified a few match rules based on performance.
  • Created mapping documents, functional specification documents, and technical specification documents.
  • Defined trust and validation rules and configured match rule sets to identify matches and eliminate duplicate records
  • Migrated Siperian (MDM hub) objects from Dev to QA and Production environments.
  • Configured Informatica MDM Hub on the WebLogic application server.
  • Loaded data into the landing tables using SQL scripts (see the sketch below) and used the Batch Viewer to run the Stage and Load processes
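A minimal sketch of the kind of SQL script used to populate a landing table before the Stage and Load batch jobs, assuming a hypothetical landing table C_L_CUSTOMER fed from a source extract table (all names are illustrative):

    -- Populate the landing table from the latest source extract
    INSERT INTO C_L_CUSTOMER (PKEY_SRC_OBJECT, FIRST_NAME, LAST_NAME, SRC_LAST_UPDATE_DT)
    SELECT src.CUSTOMER_ID,
           src.FIRST_NAME,
           src.LAST_NAME,
           src.LAST_UPDATE_DT
    FROM   SRC_CUSTOMER_EXTRACT src;
    COMMIT;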

Environment: Informatica Multidomain MDM 9.1, Informatica PowerCenter 9.1/8.6.1, UNIX HP-UX, Linux, Oracle 10g/9i, Toad 9.0.1, Siebel, Veeva, Salesforce.com, JBoss 5.1.0, WebLogic 11g

Confidential, Titusville, NJ

Informatica MDM Hub developer/Power Center Developer

Responsibilities:

  • Designed, documented and configured Informatica MDM Hub to load, cleanse, match, merge, and publish the MDM data
  • Configured Informatica MDM hub on JBoss application server
  • Presented hub solutions for a multidomain party-model MDM covering customers, reps, and products.
  • Designed MDM Development Methodology and defined match and merge Rules after a detailed analysis of source data
  • Configured and implemented address cleansing tool Address Doctor (AD5)
  • Configured Landing Tables, Staging Tables and Base objects
  • Defined the Foreign Key relationships and provided lookup information for the columns in the staging tables
  • Developed the mappings and used the necessary cleanse functions by analyzing the data using the data quality tool
  • Fixed the issues while getting the data loaded in stage and base tables
  • Developed complex hierarchies in the Hub by configuring the entity base objects, entity types, relationship base objects, relationship types, profiles, put and display packages
  • Configured IDD for Customer and Product centric Hub schemas.
  • Configured and implemented application security services, web services security, and message-level security using the SIF API
  • Generated and Deployed ORS-specific SIF APIs
  • Used Informatica client components such as the Designer, Workflow Manager, and Workflow Monitor
  • Developed complex mappings and mapplets in the Informatica Designer to integrate data from varied sources such as Oracle, flat files, Siebel, and SQL databases, and loaded it into the target.
  • Performed data migration and data cleansing using various transformations such as Expression, Filter, Joiner and Lookup and stored procedure transformation.
  • Developed UNIX scripts which were required to automate the data load process
  • Performed unit testing, functional testing, regression testing, and user acceptance testing for the migrated mappings; inconsistencies found in the ETL process were fixed.
  • Created UNIX Shell Scripts for pre/post session commands for automation of loads.
  • Identified and resolved performance bottlenecks at various levels, including sources, targets, mappings, and sessions in the converted maps, and improved the efficiency of the runs.
  • Used Informatica Cloud to integrate and load data into Salesforce.com.
  • Managed salesforce.com/Veeva objects.
  • Used session partitions, SQL overrides, dynamic cache memory, and index caches to improve the performance of the Informatica server.
  • Ensured all production changes comply with change management policies and procedures.
  • Developed AKD, FS and SOP documents for the operations team.
  • Collaborated closely with cross-functional teams to implement new technologies in compliance with industry standards.
  • Responsible for implementing Incremental loading of mappings using Mapping Variables and Parameter Files.
  • Worked with complex queries for data retrieval and modification in Oracle databases (see the sketch below).
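A minimal sketch of the style of Oracle query used for data retrieval, assuming hypothetical CUSTOMER and ADDRESS tables (names are illustrative, not the project schema):

    -- Retrieve the most recent address per customer for downstream comparison
    SELECT c.CUSTOMER_ID,
           c.CUSTOMER_NAME,
           a.CITY,
           a.POSTAL_CODE
    FROM   CUSTOMER c
    JOIN   (SELECT CUSTOMER_ID, CITY, POSTAL_CODE,
                   ROW_NUMBER() OVER (PARTITION BY CUSTOMER_ID
                                      ORDER BY LAST_UPDATE_DT DESC) AS rn
            FROM   ADDRESS) a
      ON   a.CUSTOMER_ID = c.CUSTOMER_ID
    WHERE  a.rn = 1;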

Environment: Informatica Multidomain MDM 9.1, IDD, SIF, Informatica PowerCenter 9.1/8.6.1/8.1.1, Informatica Cloud, Windows 2003, UNIX HP-UX, Linux, Oracle 10g/9i, Toad 9.0.1, Tidal, Maestro, Siebel, Force.com, Veeva, Salesforce.com Apex Data Loader, JBoss 5.1.0, Workbench.

Confidential, Wichita, KS

Informatica Admin

Responsibilities:

  • Responsible for setting up the new Informatica 8.6.1 server and designed the migration and testing strategy for the conversion of maps.
  • Involved in migration of Informatica PowerCenter from version 6.1.1 to 8.6.1
  • Created Migration, Development, Staging, and Production repositories.
  • Used Repository Manager to administer Repository, created roles, groups, users, and assigned them with access rights.
  • Migrated Informatica mappings, sessions and workflows from migration repository to development repository.
  • Configured Outlook on the new Informatica 8.6.1 server to enable sending e-mails from sessions during workflow and mapping runs.
  • Developed and modified complex mappings, mapplets using Informatica workflow designer to integrate data from varied sources like SQL server, Oracle and flat files and loaded into SQL server targets.
  • Used session partitions, SQL overrides, dynamic cache memory, and index caches to improve the performance of the Informatica server.
  • Extensively used RedGate and BeyondCompare to compare the data runs from Informatica versions 6.1.1 and 8.6.1
  • Provided 24x7 support for production operations (incident break/fix, change, service request, project, and databases)
  • Ensured all production changes comply with change management policies and procedures.
  • Worked with complex queries for data retrieval and modification in SQL and Oracle databases.
  • Extensively worked with VBS scripts for various file conversion processes.

Environment: Informatica PowerCenter 8.6.1/6.1.1, RedGate, BeyondCompare, Erwin, SSRS, Windows 2003, UNIX, Oracle 10g/9i, SQL Server 2000/2005, Toad 8.5, Autosys, Visual Basic Script (VBS)

Confidential, Kansas

Lead ETL Developer

Responsibilities:

  • Involved in the design, development and implementation of Customer Analytics Datamart.
  • Involved in Dimensional modeling to design and develop STAR Schema, using Erwin to design Fact and Dimension Tables.
  • Involved in migration of Informatica PowerCenter from version 7 to 8
  • Informatica administration responsibilities including migration of code from Dev to QA and to preproduction staging area
  • Successfully implemented Type 1 and Type 2 slowly changing dimensions to insert and update dimension tables in the target while maintaining history (see the sketch after this list)
  • Implemented Change Data Capture using Informatica Power Exchange.
  • Performed Data Architect activities such as data analysis, strategies for data migration/ conversion, Data Cleansing, Data Profiling, designing technical specification and table level specification along with physical and semantic data model.
  • Designed and developed an audit process to maintain data integrity and data quality: a two-step process in which source-to-target audits ensured that only the desired data was loaded, and post-load audits checked the integrity of the data within the data warehouse.
  • Created various data quality rules and data quality blocks to achieve data quality objectives
  • Used various transformations such as Source Qualifier, Expression, Joiner, Router, Filter, B2B, IDQ/custom, and Update Strategy.
  • Discovered and analyzed data anomalies across all data sources and found hidden data problems using Data Explorer
  • Worked on pushdown optimization in order to improve session performance
  • Managed metadata associated with ETL processes used to populate the data warehouse
  • Developed and modified PL/SQL procedures, functions, triggers, and reports as per business requirements
  • Created and configured session as well as used various partitioning techniques such as key-range, hash key and round robin partitioning to provide faster execution of ETL process
  • Provided end user training and Production System Support.
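A minimal sketch of a Type 2 slowly changing dimension load expressed as SQL (the actual implementation used Informatica update-strategy logic; the DIM_CUSTOMER and STG_CUSTOMER names and columns are illustrative assumptions):

    -- Step 1: expire the current dimension row when a tracked attribute changes
    UPDATE DIM_CUSTOMER d
    SET    d.CURRENT_FLAG     = 'N',
           d.EFFECTIVE_END_DT = TRUNC(SYSDATE) - 1
    WHERE  d.CURRENT_FLAG = 'Y'
      AND  EXISTS (SELECT 1
                   FROM   STG_CUSTOMER s
                   WHERE  s.CUSTOMER_ID = d.CUSTOMER_ID
                     AND  s.ADDRESS    <> d.ADDRESS);

    -- Step 2: insert a new current row for changed or brand-new customers
    INSERT INTO DIM_CUSTOMER (CUSTOMER_ID, ADDRESS, EFFECTIVE_START_DT,
                              EFFECTIVE_END_DT, CURRENT_FLAG)
    SELECT s.CUSTOMER_ID, s.ADDRESS, TRUNC(SYSDATE), DATE '9999-12-31', 'Y'
    FROM   STG_CUSTOMER s
    WHERE  NOT EXISTS (SELECT 1
                       FROM   DIM_CUSTOMER d
                       WHERE  d.CUSTOMER_ID  = s.CUSTOMER_ID
                         AND  d.CURRENT_FLAG = 'Y');
    COMMIT;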

Environment: Informatica PowerCenter 8.6/7.1, Power Exchange 8.x, Erwin 4.2, Informatica Data Quality 8.6, Windows 2003, UNIX, Oracle 10g/9i, SQL Server 2005, Teradata, BTEQ, Toad 8.5, Autosys, Business Objects

Confidential, Columbus, GA

Informatica developer

Responsibilities:

  • Responsible for requirements gathering, analysis, and end-user meetings
  • Analyzed data flow requirements and developed a scalable architecture for staging and loading data.
  • Created architectural design document based on the business process and functional requirements.
  • Developed complex mappings and mapplets in the Informatica Designer to integrate data from varied sources such as Oracle, Teradata, flat files, XML files, SQL databases, and SAP, following the design guidelines, and loaded the data into the target
  • Extensively created Tasks like Sessions, Command, Email, Timer, Assignment, Control, Decision, Event Wait and Event Raise
  • Implemented Complex Data Exchange for extracting and transforming data from unstructured and semi-structured documents such as PDF, Excel, and XML files.
  • Extensively used both Connected and Unconnected Lookup transformations
  • Extensively worked with various lookup caches like Static Cache, Dynamic Cache, and Persistent Cache
  • Wrote Oracle SQL queries based on given specifications, embedded them in UNIX scripts, and generated the parameter files
  • Performed performance tuning and optimization of Informatica mappings and sessions using features like partitions and data/index caches.
  • Performed Pipeline partitioning to optimize the performance of mappings.
  • Responsible for Unit testing and Integration testing of mappings and workflows
  • Extensively used SQL and PL/SQL scripts (see the sketch below).
  • Used SQL*Loader for bulk loading the data into the target tables
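A minimal sketch of the kind of parameter-driven Oracle SQL that might be embedded in a UNIX wrapper script for a daily extract, assuming a hypothetical ORDERS table and a run_date substitution variable (all names are illustrative):

    -- Daily extract bounded by a run date supplied from the UNIX wrapper
    SELECT o.ORDER_ID,
           o.CUSTOMER_ID,
           o.ORDER_AMOUNT,
           o.ORDER_DT
    FROM   ORDERS o
    WHERE  o.ORDER_DT >= TO_DATE('&run_date', 'YYYY-MM-DD')
      AND  o.ORDER_DT <  TO_DATE('&run_date', 'YYYY-MM-DD') + 1;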

Environment: Informatica PowerCenter 8.1, Power Connect, SAP R/3, Teradata, Oracle 9i, UNIX Shell Scripting, SQL, PL/SQL, SQL*Loader, TOAD, Erwin 3.5.5, Sun Solaris UNIX, Windows 2003
