
Master Data Management Lead Resume


South San Francisco, CA

SUMMARY

  • More than 9 years of extensive experience across the complete Software Development Life Cycle (SDLC), including system requirements gathering, architecture, design, data analysis, coding, development, testing, production support, maintenance, and enhancement on a variety of technology platforms, with special emphasis on Client/Server, Data Warehouse, MDM, and Business Intelligence applications.
  • Strong background in Master Data Management, Data Governance, and Data Warehousing, with ETL/ELT experience in Informatica PowerCenter, IDQ (Informatica Data Quality), and Siperian Master Data Management.
  • Expertise in Master Data Management concepts and methodologies, with the ability to apply that knowledge to building Customer Master, Payer Master, and Vendor Master MDM solutions.
  • Expertise in creating Mappings, Trust and Validation rules, Match Path, Match Column, Match rules, Merge properties and Batch Group creation.
  • Knowledge of implementing hierarchies, relationship types, and packages; experienced in creating and maintaining entity objects, entity types, relationship objects, packages, and profiles for hierarchy management in MDM Hub implementations, including configuring relationship types with the Hierarchies tool to enable Hierarchy Manager (HM).
  • Hands-on experience in design and configuration of landing tables, staging tables, base objects, hierarchies, foreign-key relationships, lookups, query groups, queries/custom queries, and packages.
  • Designed, Installed, Configured core Informatica/Siperian MDM Hub components such as Informatica MDM Hub Console, Hub Store, Hub Server, Cleanse Match Server, Cleanse Adapter, Data Modeling.
  • Solid expertise in data extraction, data migration, data transformation, and data integration using ETL, ELT, and data pipelines.
  • Worked on data profiling using IDE (Informatica Data Explorer) and IDQ (Informatica Data Quality) to examine different patterns in source data; a small profiling example follows this list. Proficient in developing Informatica IDQ transformations such as Parser, Classifier, Standardizer, and Decision.
  • Strong knowledge of Kimball and Inmon methodologies and models, and of dimensional modeling using star schema and snowflake schema.
  • Solid skills in Informatica Power Center, SQL, PL/SQL, Stored Procedures and Triggers, Performing Debugging, Troubleshooting and Performance Tuning.
  • Extensive experience with multiple relational databases, including Oracle, SQL Server, Teradata, and SAP.
  • Experience with Data governance suite, Informatica PDM (Persistent Data Masking), DDM (Dynamic Data Masking).
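
As context for the profiling work noted above, here is a minimal example of the kind of column-profiling query run alongside IDE/IDQ results; the table and column names (src_crm.customer, customer_email) are illustrative only, not from an actual engagement:

    -- Illustrative column profile: volume, null rate, distinct values, and value lengths
    SELECT COUNT(*)                        AS total_rows,
           COUNT(customer_email)           AS non_null_rows,
           COUNT(DISTINCT customer_email)  AS distinct_values,
           MIN(LENGTH(customer_email))     AS min_length,
           MAX(LENGTH(customer_email))     AS max_length
      FROM src_crm.customer;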

TECHNICAL SKILLS

  • Informatica 10.0/8.6/8.1/7.1.2/7.1.1/6.2/5.1.2/4.7 (Informatica PowerCenter, PowerMart, PowerExchange, Informatica Data Quality, Siperian MDM)
  • SQL*plus
  • SQL*loader
  • Teradata
  • SQL*Loader utilities
  • Dimensional Data Modeling
  • Star Schema Modeling
  • Snowflake Modeling
  • FACT and Dimensions Tables
  • Physical and Logical Data Modeling
  • Hub & Spoke Architecture
  • ERwin 4.1/3.5.2
  • TOAD
  • Sybase Power Designer
  • Oracle Warehouse Builder
  • Oracle 10g/9i/8i
  • Teradata
  • MS SQL Server 2008/2005
  • DB2
  • MS Access 97/2000
  • Sybase
  • AS400
  • iSeries
  • Google big data
  • MongoDB
  • Amazon Redshift
  • Business Objects XI
  • Cognos 8.0
  • Microstrategy
  • Tableau
  • UNIX
  • Windows XP/Vista
  • Linux
  • MS DOS 6.22
  • Mainframe
  • SQL
  • PL/SQL
  • SQL*Plus
  • XML
  • Core Java
  • UNIX Shell Scripting
  • Control-M
  • ROBOT (iSeries)

PROFESSIONAL EXPERIENCE

Confidential - South San Francisco, CA

Master Data Management Lead

Responsibilities:

  • Worked with stakeholders and business partners through the journey of delivering projects in a co-production model, developing a customer Center of Excellence focused on PRPC and related skills.
  • Provided technology and architecture thought leadership, strategic direction, and long-term vision. Areas of concentration included end-to-end integration solutions, real-time changes, security, enterprise architecture, and transformative IT technologies including business process modeling, real-time computing, and SOA.
  • Defined the overall MDM solution architecture, set technical direction, defined the component architecture, and reviewed detailed designs for accuracy and overall compliance with the defined architecture.
  • Worked with the MDM team to implement TDM for dynamic data masking to secure customer information. Executed on established architectural guidelines and the long-term architecture strategy, and ensured compliance with the reference architecture and security requirements.
  • Perform proofs of concept (PoCs) and other technical evaluations of technologies, designs, and solutions as required.
  • Designed the end-to-end MDM process to streamline Customer Master operations for GCOI.
  • Worked with vendors to identify challenges and leveraged existing solutions for Confidential customer master data for efficiency reporting.
  • Defined and designed the schema, landing tables, staging tables, base objects, foreign-key relationships, lookup systems and tables, packages, query groups, and queries/custom queries; a simplified sketch follows this list.
  • Defined the Trust and Validation rules and set up the match/merge rule sets to produce the right master records.
  • Involved with Data Steward Team for designing, documenting and configuring Informatica Data Director for supporting management of MDM data for governance.
  • Used the Hierarchies tool to configure entity base objects, entity types, relationship base objects, relationship types, profiles, and subject areas in IDD.
  • Deployed a new MDM Hub for portals in conjunction with the user interface of the IDD application.
  • Configured match rule sets and the probabilistic match engine property by enabling search-by rules in MDM according to business rules.
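
A simplified sketch of the kind of landing/staging structures and display-package view configured for the Customer Master; LNDG_CUSTOMER, STG_CUSTOMER, C_CUSTOMER, and all columns other than the standard MDM Hub system columns (ROWID_OBJECT, CONSOLIDATION_IND) are illustrative placeholders, not the actual GCOI schema:

    -- Illustrative landing table loaded by the ETL batch from the source systems
    CREATE TABLE LNDG_CUSTOMER (
      SRC_SYSTEM     VARCHAR2(30),
      SRC_CUST_ID    VARCHAR2(50),
      FIRST_NAME     VARCHAR2(100),
      LAST_NAME      VARCHAR2(100),
      ADDRESS_LINE1  VARCHAR2(200),
      CITY           VARCHAR2(100),
      STATE_CD       VARCHAR2(10),
      POSTAL_CD      VARCHAR2(20),
      LAST_UPDATE_DT DATE
    );

    -- Staging table with the same shape, populated by the stage job before load into the base object
    CREATE TABLE STG_CUSTOMER AS SELECT * FROM LNDG_CUSTOMER WHERE 1 = 0;

    -- Display-package-style view over the consolidated base object, exposing only golden records
    CREATE OR REPLACE VIEW PKG_CUSTOMER_DISPLAY AS
    SELECT ROWID_OBJECT, FIRST_NAME, LAST_NAME, CITY, STATE_CD, CONSOLIDATION_IND
      FROM C_CUSTOMER
     WHERE CONSOLIDATION_IND = 1;  -- 1 = consolidated (merged) record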

Environment: Informatica MDM Hub 10.0, Informatica IDQ 9.5.1, Siperian, JBoss, Informatica Power Center 9.6, Informatica RulePoint, Oracle 10g, PL/SQL, DVO, Tibco BW, SoapUI, Shell Scripting, Mainframes z/OS, Autosys, Toad for Oracle, Teradata, SQL Navigator, XMLSpy

Confidential - Torrance, CA

Tech Lead MDM

Responsibilities:

  • Collaborated with business users, stakeholders, and project managers to define and lay out the plan for the project scope.
  • Prepared the functional specs and the technical design for the RIS and DFFU projects.
  • Architected the technical refresh project to migrate the Informatica platform, Oracle databases, storage (NAS/SAN), and utility tools (Autosys, NDM) in conjunction with the Data Center Alignment.
  • Acted as the ETL liaison with operating system engineers (Unix) and NAS engineers in setting up NAS filers, including allocation, replication, retention, snapshots, and backups on both NetApp and Isilon platforms, and with network engineers in setting up F5 load balancers for applications such as Oracle and NDM on top of VIPs (local) and WIPs (global).
  • Migrated/provisioned new users and groups, added servers to host groups and ACLs using BoKS, and set up single sign-on using Active Directory and LDAP authentication.
  • Configured NDM (Connect:Direct) node names using F5 load balancers, including netmaps and user files, for both inbound and outbound file/data transmissions.
  • Set up Proactive Monitoring (PM) for master data governance and operations for Dealer Daily.
  • Installed and set up the DIH server, console, and plugins, and installed and configured Informatica Managed File Transfer (MFT).
  • Designed IDQ pipelines for scorecards to manage audits of retail inventory for the online and offline channels.
  • Configured and installed Informatica MDM Hub server, cleanse, resource kit, and Address Doctor.
  • Analyzed the source systems for erroneous, duplicative, and integrity issues related to the data.
  • Enhanced SQL, dynamic SQL, and PL/SQL code to run correctly and efficiently; a small dynamic SQL sketch follows this list.
  • Configured and installed the PowerExchange (PWX) logger and listener for Oracle Exadata and for DB2 z/OS 2.1.
  • Worked across teams to implement CDC for moving data from source to target, identifying changes in real time for regulatory purposes.
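
A minimal sketch of the kind of dynamic SQL / PL/SQL cleanup referenced above; the specific logic shown (rebuilding unusable indexes found at runtime) is illustrative, not the actual production code:

    -- Illustrative PL/SQL block using dynamic SQL via EXECUTE IMMEDIATE
    DECLARE
      v_sql VARCHAR2(400);
    BEGIN
      FOR idx IN (SELECT index_name
                    FROM user_indexes
                   WHERE status = 'UNUSABLE') LOOP
        v_sql := 'ALTER INDEX ' || idx.index_name || ' REBUILD';
        EXECUTE IMMEDIATE v_sql;  -- statement text is assembled and executed at runtime
      END LOOP;
    END;
    /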

Environment: Informatica MDM Hub 10.0, Informatica IDQ 9.5.1, Informatica Power Center 9.6/8.6/8.1, Power Exchange 9.6, Informatica RulePoint, Oracle 9i/10g, PL/SQL, V2R4, HP-UX, Korn Shell Scripting, Mainframes z/OS, Control-M, SQL Server 2008, Toad for Oracle, Google Cloud, AS400, DB2, Integrity Tool, ROBOT iSeries, Informatica Cloud, Teradata, SQL Navigator, RazorSQL.

Confidential - Birmingham, AL

Lead MDM

Responsibilities:

  • Responsible for various phases of SDLC from requirement gathering, analysis, design, development and testing to production.
  • Analyzed and created business/solution functional requirement document to master customer data.
  • Analyzed and profiled data from different source systems using Informatica data quality tool.
  • Generated data profiling results using Informatica data quality tool.
  • As per the profiling results, helped MDM developers to configure trust for different source systems in Informatica HUB console.
  • Worked with ETL and MDM developers and created data quality rules to standardize the data as part of MDM project.
  • Analyzed the functional requirements and created the logical model using ERwin.
  • Created the data flow and conceptual diagram for the business for better understanding and identifying the stages of data.
  • Created source to target mapping document for the ETL development and ETL testing team.
  • Set-up functionality walk-through workshops with business analysts and developers.
  • Reviewed the mapping documents from source to target landing tables in CMX ORS schema
  • Thoroughly conducted data analysis and gap analysis between source systems for MDM model.
  • Designed and developed the Test Suite in the HP ALM tool.
  • Mapped all test cases to the functional requirement document.
  • Created complex SQL scripts implementing business transformation logic to test the data in target tables against the developer-built ETL.
  • Created complex SQL scripts to test the transformation logic from the source database to the target database as part of UAT testing.
  • Developed and executed test strategies for ETL testing in a complex, high-volume MDM project.
  • Worked with business users and gathered requirement for MDM outbound views.
  • Developed scripts to create MDM outbound views so that downstream systems can source data directly from the MDM views; a representative view definition follows this list.
  • Used PL/SQL to write stored procedures to improve performance.
  • Defined the Base objects, Staging tables, foreign key relationships, static lookups, dynamic lookups, queries, packages and query groups.
  • Worked on data cleansing and standardization using the cleanse functions in Informatica MDM.
  • Used Hierarchies tool for configuring entity base objects, entity types, relationship base objects, relationship types, profiles, put and display packages and used the entity types as subject areas in IDD.
  • Defined the Trust and Validation rules and set up the match/merge rule sets to produce the right master records.
  • Configured match rule set property by enabling search by rules in MDM according to Business Rules.
  • Worked on Data Standardization, Reference Table, Scorecards, Data Validation and Modification, Data Masking, Columns, Rules, Database Interaction through IDQ/MDM.
  • Configured and installed Informatica MDM Hub server, cleanse, resource kit, and Address Doctor.
  • Used Informatica Power center Data Masking option to protect the confidential customer account information.
  • Analyzed the source systems for erroneous, duplicative, and integrity issues related to the data.
  • Enhanced SQL, dynamic SQL, and PL/SQL code to run correctly and efficiently.
  • Used Informatica proactive monitoring to detect early stage integration process failure and complex event processing.
  • Managed MDM for CCAR and Credit Risk Database in Oracle Exadata.
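
A representative outbound view of the kind described above; CMX_ORS, C_PARTY, and the business columns are placeholders for the actual ORS schema and base object, while ROWID_OBJECT, CONSOLIDATION_IND, HUB_STATE_IND, and LAST_UPDATE_DATE are standard MDM Hub system columns:

    -- Illustrative outbound view exposing consolidated (golden) party records to downstream systems
    CREATE OR REPLACE VIEW MDM_OUT_V_PARTY_GOLDEN AS
    SELECT p.ROWID_OBJECT   AS MDM_PARTY_ID,
           p.FIRST_NAME,
           p.LAST_NAME,
           p.LAST_UPDATE_DATE
      FROM CMX_ORS.C_PARTY p
     WHERE p.CONSOLIDATION_IND = 1   -- fully consolidated records only
       AND p.HUB_STATE_IND     = 1;  -- active records (not pending or soft-deleted)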

Environment: Informatica Power Center 9.6/9.1, Power Exchange, Informatica RulePoint, Informatica Data Quality, Informatica MDM Hub, Oracle 9i/10g, PL/SQL, V2R4, HP-UX, Windows 2000, Shell Scripting, Mainframes z/OS, Control-M, MicroStrategy, SQL Server 2008, Toad for Oracle.

Confidential - Columbus, OH / Dallas, TX

Senior ETL DEVELOPER

Responsibilities:

  • Performed a major role in understanding the business requirements, designing and loading the data into data warehouse.
  • Used the Informatica client tools Source Analyzer, Target Designer, Mapping Designer, Mapplet Designer, and Transformation Developer to define source and target definitions and to build the data flow from the source systems to the data warehouse.
  • Developed mappings in Informatica to load the data from various sources into the Data Warehouse, using different transformations like Joiner, Aggregator, Update Strategy, Rank, Router, Lookup, Sequence Generator, Filter, Sorter and Source Qualifier.
  • Worked on Static and Dynamic Caches for the better throughput of sessions containing Rank, Lookup, Joiner, Sorter and Aggregator transformations.
  • Created complex Informatica mappings, reusable objects of Mapplets depending on client requirements.
  • Created, optimized, reviewed, and executed Teradata SQL test queries to validate transformation rules used in source-to-target mappings and to verify data in the target tables; a representative validation query follows this list. Loaded data into Teradata using FastLoad, MultiLoad, TPump, FastExport, BTEQ, and Korn shell scripts.
  • Implemented performance tuning logic on Targets, Sources, mappings, sessions to provide maximum efficiency and performance.
  • Used CDC to move data from source to target, identifying the data in the source system that had changed since the last extraction.
  • Used Power Exchange for SAP FICO input for Moody’s rating agency.
  • Designed and Developed Oracle PL/SQL procedures, performed Data Import/Export, Data Conversions and Data Cleansing operations.
  • Used Autosys for automating Batches and Session.
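
A minimal sketch of the kind of source-to-target validation query used in this testing, assuming illustrative staging and dimension table names (stg_db.account_stg, dw_db.account_dim) rather than the project's actual schema:

    -- Rows present in the staging source but missing or different in the target dimension
    SELECT src.account_id, src.balance_amt
      FROM stg_db.account_stg AS src
    MINUS
    SELECT tgt.account_id, tgt.balance_amt
      FROM dw_db.account_dim AS tgt;

    -- Row-count reconciliation between source and target
    SELECT 'SRC' AS side, COUNT(*) AS row_cnt FROM stg_db.account_stg
    UNION ALL
    SELECT 'TGT', COUNT(*) FROM dw_db.account_dim;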

Environment: Informatica Power Center 8.6/8.1, Power Exchange, Oracle 9i/10g, PL/SQL, Teradata V2R4, HP-UX, Windows 2000, Shell Scripting, Mainframes z/OS, Autosys, OBIEE, MultiLoad, FastLoad, FastExport, Toad for Oracle.

Confidential - Minneapolis, MN

Senior ETL / Informatica Developer

Responsibilities:

  • Collaborated with business users for requirements gathering and performed data analysis on the legacy systems.
  • Involved in preparing the functional specs and prepared the Technical design.
  • Performed in-depth data analysis and implemented the cleansing process and data quality rules.
  • Designed and implemented the pre-staging and staging approach for cleansing, using Informatica ETL and UNIX.
  • Designed and developed star schema model for target database using ERWIN Data modeling.
  • Implemented Slowly Changing Dimensions Type 1 and Type 2 for inserting and updating target tables while maintaining history; a simplified Type 2 sketch follows this list.
  • Used various active and passive transformations such as Aggregator, Expression, Sorter, Router, Joiner, connected/unconnected Lookup, Stored Procedure, and Update Strategy transformations for data control, cleansing, and data movement.
  • Involved in massive data cleansing prior to data staging.
  • Extensively used Informatica client tools Source Analyzer, Warehouse designer, Mapping Designer, Mapplet Designer, Transformation Developer, Informatica Repository Manager and Informatica Workflow Manager.
  • Created sequential/concurrent Sessions/ Batches for data loading process and used Pre & Post Session SQL Script to meet business logic.
  • Designed and developed Mapplets for faster development, standardization and reusability purposes.
  • Involved in pre and post session migration planning for optimizing data load performance.
  • Extensively used ETL Informatica tools to extract data stored in MS SQL 2000, Excel, and Flat files and finally loaded into a single Data Warehouse.
  • Performed Unit testing during the mapping phase to ensure proper and efficient implementation of the transformations.
  • Extensively involved in performance tuning of the Informatica Mappings/Sessions by increasing the caching size, overriding the existing SQL.
  • Worked as team leader to implement Incremental load and CDC to data mart using Informatica for reporting purpose.
  • Maintained warehouse metadata, naming standards and warehouse standards for future application development.
  • Tuned the mappings for optimum performance, dependencies, and batch design.
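
A simplified SQL sketch of the Type 2 pattern these mappings implemented; customer_dim, customer_stg, and their columns are illustrative, and in the project itself this logic lived in Lookup/Update Strategy transformations rather than hand-written SQL:

    -- Step 1: expire the current dimension row when a tracked attribute has changed
    UPDATE customer_dim d
       SET d.current_flag     = 'N',
           d.effective_end_dt = CURRENT_DATE
     WHERE d.current_flag = 'Y'
       AND EXISTS (SELECT 1
                     FROM customer_stg s
                    WHERE s.customer_id = d.customer_id
                      AND s.address    <> d.address);

    -- Step 2: insert a new current version for staged customers with no current row
    -- (covers both brand-new customers and the rows expired in step 1)
    INSERT INTO customer_dim
          (customer_key, customer_id, address, effective_start_dt, effective_end_dt, current_flag)
    SELECT customer_dim_seq.NEXTVAL, s.customer_id, s.address,
           CURRENT_DATE, DATE '9999-12-31', 'Y'
      FROM customer_stg s
     WHERE NOT EXISTS (SELECT 1
                         FROM customer_dim d
                        WHERE d.customer_id = s.customer_id
                          AND d.current_flag = 'Y');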

Environment: Informatica Power Center 8.6.1, Metadata Manager, Power Exchange 8.6, Data Analyzer, UNIX, Oracle 10g, Flat Files, XML Files, SQL and PL/SQL, ERWIN 4.0, Toad for Oracle10g, Remedy, Ultra Edit, Microsoft Project, ESP.

Confidential - Atlanta, GA

ETL Developer

Responsibilities:

  • Worked with the business analysis team on a regular basis and assisted in implementing agile methodology on the project.
  • Designed the ETL architecture and put in place ETL standards for Informatica.
  • Administrator of several Informatica environments including production environments.
  • Created Reusable transformations and Mapplets using transformation developer and Mapplet Designer throughout project life cycle.
  • Scheduled Informatica workflows using Informatica scheduler and external scheduler Control Manager.
  • Tested Informatica jobs on the pre-production server before moving them to the production servers.
  • Created and maintained Connect Direct Connections for secure data transfer between production servers and other environments.
  • Responsible for identifying bottlenecks and fixing them through performance tuning.
  • Worked with Crontab utility in UNIX to manage and run workflow as per schedule.
  • Developed and maintained simple and complex end-user reports and reports from Informatica repository for internal use in Business Objects.
  • Installed and configured the transformation server for a Data Replication tool called Data Mirror.
  • Configured several oracle production servers for replication with the help of Data Mirror.
  • Supported Production environment for BI tools like Business Objects and Brio Hyperion 6.6.4.
  • Extensively used the SQL for Oracle and Teradata and developed PL/SQL scripts.

Environment: Informatica Power Center 7.1.4/7.1.1, Cognos, Oracle, Teradata, Toad 8.6, UNIX, HP-UX 11i, Sun Solaris 5.4, SQL, FACETS 4.X, MS Office Visio 2003.
