
Tech Lead MDM Resume


South San Francisco, CA

PROFESSIONAL SUMMARY:

  • More than 10 years of experience across the complete Software Development Life Cycle (SDLC), including system requirements collection, architecture, design, data analysis, coding, development, testing, production support, maintenance, and enhancement on a variety of technology platforms, with special emphasis on Client/Server, Data Warehouse, MDM, and Business Intelligence applications.
  • Strong Data Warehousing ETL background using Informatica 9.5.1/9.1/8.6.1/8.5, IDQ (Informatica Data Quality), and Siperian master data management.
  • Strong knowledge in Kimball and Inmon methodology and models, Dimensional modeling using Star schema and Snowflake schema.
  • Solid skills in Informatica Power Center, SQL, PL/SQL, stored procedures, and triggers; performed debugging, troubleshooting, and performance tuning.
  • Extensive experience with data Extraction, Transformation, and Loading (ETL) from disparate data sources, including multiple relational databases such as Oracle, SQL Server, Teradata, and SAP. Worked on integrating data from flat files and XML files into a common reporting and analytical data model using Informatica.
  • Experience with Data governance suite, Informatica PDM (Persistent Data Masking), DDM (Dynamic Data Masking).
  • Proficiency in data warehousing techniques: data cleansing, Slowly Changing Dimensions, surrogate key assignment, metadata, and Master Data Management, including match & merge, system trust, and publishing data.
  • Collaborated/partnered with other teams, including Data Governance, Architecture, Release Management, System Test, and Application Support, on successful delivery of system enhancements.

TECHNICAL SKILLS:

Data Warehousing: Informatica 10.0, 8.6/8.1/7.1.2/7.1.1/6.2/5.1.2/4.7 (Informatica Power Center/Power Mart/Power Exchange, Informatica Data Quality, MDM), SQL*Plus, SQL*Loader, Teradata, SQL loader utilities

Data Modeling: Dimensional Data Modeling, Star Schema Modeling, Snowflake Modeling, FACT and Dimension Tables, Physical and Logical Data Modeling, ERwin 4.1/3.5.2, TOAD, Sybase PowerDesigner, Oracle Warehouse Builder

Databases: Oracle 10g/9i/8i, Teradata, MS SQL Server 2008/2005, DB2, MS Access 97/2000, Sybase, AS/400, iSeries

Reporting Tools: Business Objects XI, Cognos 8.0, Microstrategy, Tableau

Programming: SQL, PL/SQL, SQL*Plus, XML, Core Java, UNIX Shell Scripting, Control-M, Robot (iSeries).

PROFESSIONAL EXPERIENCE:

Confidential - South San Francisco, CA

Tech Lead MDM

Responsibilities:

  • Led the entire SDLC, CR, release management, configuration management, and quality methodologies and practices.
  • Acted as liaison for clients on all aspects of Fiserv engagements within TFS.
  • Architected the technical refresh project intended to migrate the Informatica platform, Oracle databases, Storage (NAS/SAN), Utility Tools (Autosys, NDM) in conjunction with Data Center Alignment.
  • Acted as the ETL liaison between operating system (Unix) engineers and NAS engineers in setting up NAS filers, including allocation, replication, retention, snapshots, and backups on both NetApp and Isilon platforms, and with network engineers in setting up F5 load balancers for applications such as Oracle and NDM on top of VIP (local) and WIP (global).
  • Migrated/provisioned new users and groups, added servers to host groups and ACLs using BoKS, and set up single sign-on using Active Directory and LDAP authentication.
  • Configured NDM (Connect:Direct) node names using F5 load balancers, including netmaps and user files for both inbound and outbound file/data transmissions.
  • Designed and configured NAS sizing, backups, and replication, and exported filers to server hosts.
  • Remediated vulnerabilities by applying patches as part of safety and soundness efforts.
  • Worked with stakeholders, guiding them through delivering projects in a co-production model and developing a customer Center of Excellence focused on PRPC and related skills.
  • Provided technical leadership for the development team including technical and architectural support to customers. Guided all disciplines involved in a new product's introduction.
  • Provided innovation, technology and architecture-related thought leadership, strategic direction, and long-term vision. Areas of concentration include End to End integration solution, Real time changes, Security, Enterprise Architecture, and Transformative IT Technologies including Business Process Modeling, Real time Computing, and SOA.
  • Worked with the MDM team on implementing TDM for dynamic data masking to secure customer information. Executed on set architectural guidelines and long-term architecture strategy, and ensured compliance with architecture and security requirements.
  • Perform proofs of concept (PoCs) and other technical evaluations of technologies, designs, and solutions, if required.
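The dynamic data masking work above can be illustrated with a minimal Python sketch; the field names, masking policy, and role check are hypothetical stand-ins for what Informatica PDM/DDM configures declaratively:

```python
import re

def mask_value(value: str, keep_last: int = 4, mask_char: str = "*") -> str:
    """Mask all but the last `keep_last` digits of a sensitive value."""
    digits = re.sub(r"\D", "", value)
    return mask_char * max(len(digits) - keep_last, 0) + digits[-keep_last:]

def mask_row(row: dict, sensitive_cols: set, privileged: bool) -> dict:
    """Return the row unchanged for privileged users; mask sensitive columns otherwise."""
    if privileged:
        return row
    return {col: mask_value(val) if col in sensitive_cols else val
            for col, val in row.items()}

# Hypothetical customer record for illustration
row = {"name": "Jane Doe", "ssn": "123-45-6789", "account": "4111111111111111"}
print(mask_row(row, {"ssn", "account"}, privileged=False))
```

In an actual DDM deployment the masking rule is applied in-flight by the proxy based on the requesting user's role, so the database contents are never altered.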

Environment: Informatica MDM Hub 10.0, Informatica IDQ 9.5.1, Siperian, JBoss, Informatica Power Center 9.6, Informatica RulePoint, Oracle 10g, PL/SQL, DVO, Tibco BW, SoapUI, Shell Scripting, Mainframes z/OS, Autosys, Toad for Oracle, Teradata, SQL Navigator, XMLSpy

Confidential - Torrance, CA

Senior ETL Developer

Responsibilities:

  • Collaborated with business users on requirements gathering and performed data analysis on the legacy systems.
  • Prepared the functional specs and the technical design for the RIS and DFFU projects.
  • Configured high-level pushdown properties for the Data Integration Service to run mappings or profiles in an SFDC environment.
  • Worked with Informatica Power Center and the Developer UI against HDFS files and Hive tables.
  • Automated repository backups and service monitoring with shell scripts.
  • Managed Ambari/Ranger configuration changes and access, recycling all services as necessary.
  • Setup Proactive Monitoring (PM) for Power Center Governance and Operations.
  • Installed and set up the DIH server, console, and plugins, along with Informatica Managed File Transfer (MFT).
  • Used IDQ to create scorecards to manage audits of retail inventory for online and offline channels.
  • Created Audit table to better manage the Load counts in Warehouse Database.
  • Profiled the data using Informatica Analyst tool to analyze source data (Departments, party and address) coming from Legacy systems and performed Data Quality Audit.
  • Configured and installed Informatica MDM Hub server, cleanse, resource kit, and Address Doctor.
  • Analyzed the source systems for erroneous, duplicative, and integrity issues related to the data.
  • Enhanced SQL, dynamic SQL, and PL/SQL code to run properly.
  • Configured and installed the PWX logger and listener for Oracle Exadata and for DB2 z/OS 2.1.
  • Used CDC to identify and move data from source to target that had changed in the source system since the last extraction.
  • Set up hierarchy and relationship for party model and customer model in MDM Hub.
  • Worked as data steward, creating rule-based consolidation of data and trust and match/merge rules in MDM.
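The trust and match/merge consolidation above can be sketched in Python; the source names, trust values, and the exact-match key are hypothetical simplifications of MDM Hub's fuzzy match rules and survivorship configuration:

```python
# Hypothetical trust scores per source system; real values come from
# MDM Hub trust configuration and can decay over time.
TRUST = {"CRM": 80, "BILLING": 60, "LEGACY": 40}

def normalize(rec):
    """Cleanse/standardize string fields before matching."""
    return {k: v.strip().upper() if isinstance(v, str) else v
            for k, v in rec.items()}

def match_key(rec):
    """Exact match rule on standardized name + zip (a stand-in for fuzzy matching)."""
    return (rec["name"], rec["zip"])

def consolidate(records):
    """Group matching records; survive the record from the most trusted source."""
    groups = {}
    for rec in map(normalize, records):
        groups.setdefault(match_key(rec), []).append(rec)
    return [max(recs, key=lambda r: TRUST.get(r["source"], 0))
            for recs in groups.values()]
```

MDM Hub applies trust per column rather than per record, so different fields of the golden record can survive from different sources; the sketch collapses that to record-level survivorship for brevity.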

Environment: Informatica MDM Hub 10.0, Informatica IDQ 9.5.1, Informatica Power Center 9.6/8.6/8.1, Power Exchange 9.6, Informatica RulePoint, Oracle 9i/10g, PL/SQL, V2R4, HP-UX, Korn Shell Scripting, Mainframes z/OS, Control-M, SQL Server 2008, Toad for Oracle, Google Cloud, AS/400, DB2, Integrity Tool, Robot iSeries, Informatica Cloud, Teradata, SQL Navigator, RazorSQL

Confidential - Birmingham, AL

MDM Developer

Responsibilities:

  • Analyzed and created business/solution functional requirement document to master customer data.
  • Analyzed and profiled data from different source systems using Informatica data quality tool.
  • Generated data profiling results using Informatica data quality tool.
  • Based on the profiling results, helped MDM developers configure trust for different source systems in the Informatica Hub Console.
  • Worked with ETL and MDM developers and created data quality rules to standardize the data as part of MDM project.
  • Analyzed functional requirements and created the logical model using ERwin.
  • Created the data flow and conceptual diagram for the business for better understanding and identifying the stages of data.
  • Created source to target mapping document for the ETL development and ETL testing team.
  • Set-up functionality walk-through workshops with business analysts and developers.
  • Reviewed the mapping documents from source to target landing tables in the CMX ORS schema.
  • Thoroughly conducted data analysis and gap analysis between source systems for MDM model.
  • Designed and developed the Test Suite in the HP ALM tool.
  • Mapped all test cases to the function requirement document.
  • Created complex SQL scripts implementing business transformation logic to test the data in target tables against the developer-built ETL.
  • Created complex SQL scripts to test the transformation logic from source database to target database as part of UAT Testing.
  • Developed and executed test strategies for ETL testing in a complex, high-volume MDM project.
  • Worked with business users and gathered requirements for MDM outbound views.
  • Developed scripts to create MDM outbound views so that downstream systems can source data directly from MDM views.
  • Used PL/SQL to write stored procedures to increase the performance.
  • Defined the Base objects, Staging tables, foreign key relationships, static lookups, dynamic lookups, queries, packages and query groups.
  • Worked on data cleansing and standardization using the cleanse functions in Informatica MDM.
  • Used Hierarchies tool for configuring entity base objects, entity types, relationship base objects, relationship types, profiles, put and display packages and used the entity types as subject areas in IDD.
  • Defined the Trust and Validation rules and setting up the match/merge rule sets to get the right master records.
  • Configured match rule set property by enabling search by rules in MDM according to Business Rules.
  • Worked on Data Standardization, Table, Scorecards, Data Validation and Modification, Data Masking, Columns, Rules, Database Interaction through IDQ/MDM.
  • Configured and installed Informatica MDM Hub server, cleanse, resource kit, and Address Doctor.
  • Used Informatica Power center Data Masking option to protect the confidential customer account information.
  • Analyzed the source systems for erroneous, duplicative, and integrity issues related to the data.
  • Enhanced SQL, dynamic SQL, and PL/SQL code to run properly.
  • Used Informatica proactive monitoring to detect early stage integration process failure and complex event processing.
  • Managed MDM for CCAR and Credit Risk Database in Oracle Exadata.
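The data profiling and quality-audit steps above amount to computing per-column metrics; a minimal Python sketch follows (the column values and null markers are hypothetical, and the real work used the Informatica Analyst tool):

```python
import re
from collections import Counter

def profile_column(values):
    """Compute basic data-quality metrics for one source column."""
    non_null = [v for v in values if v not in (None, "", "N/A")]
    # Reduce each value to a shape pattern: letters -> A, digits -> 9
    patterns = Counter(re.sub(r"\d", "9", re.sub(r"[A-Za-z]", "A", v))
                       for v in non_null)
    return {
        "count": len(values),
        "nulls": len(values) - len(non_null),
        "distinct": len(set(non_null)),
        "top_patterns": patterns.most_common(3),
    }

# Hypothetical zip-code column pulled from a legacy source
zips = ["35203", "35204", "3520", None, "35203", "ABCDE"]
print(profile_column(zips))
```

Pattern frequencies like these are what make outliers visible at a glance: a lone `9999` or `AAAAA` among thousands of `99999` rows flags a truncated or mistyped zip code for the quality audit.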

Environment: Informatica Power Center 9.6/9.1, Power Exchange, Informatica RulePoint, Informatica Data Quality, Informatica MDM Hub, Oracle 9i/10g, PL/SQL, V2R4, HP-UX, Windows 2000, Shell Scripting, Mainframes z/OS, Control-M, Microstrategy, SQL Server 2008, Toad for Oracle.

Confidential - Columbus, OH / Dallas, TX

Senior ETL Developer

Responsibilities:

  • Performed a major role in understanding the business requirements, designing and loading the data into data warehouse.
  • Informatica client tools - Used Source Analyzer, Target designer, Mapping Designer, Mapplets Designer and Transformation Developer for defining Source & Target definitions and coded the process of data flow from source system to data warehouse.
  • Developed mappings in Informatica to load the data from various sources into the Data Warehouse, using different transformations like Joiner, Aggregator, Update Strategy, Rank, Router, Lookup, Sequence Generator, Filter, Sorter and Source Qualifier.
  • Worked on Static and Dynamic Caches for the better throughput of sessions containing Rank, Lookup, Joiner, Sorter and Aggregator transformations.
  • Created complex Informatica mappings, reusable objects of Mapplets depending on client requirements.
  • Created, optimized, reviewed, and executed Teradata SQL test queries to validate transformation rules used in source-to-target mappings and to verify data in target tables; loaded data into Teradata using FastLoad, BTEQ, FastExport, MultiLoad, and Korn shell scripts.
  • Used FastLoad, MultiLoad, and TPump for data loading.
  • Implemented performance tuning logic on Targets, Sources, mappings, sessions to provide maximum efficiency and performance.
  • Used CDC to identify and move data from source to target that had changed in the source system since the last extraction.
  • Used Power Exchange for SAP FICO input for Moody’s rating agency.
  • Designed and Developed Oracle PL/SQL procedures, performed Data Import/Export, Data Conversions and Data Cleansing operations.
  • Used Autosys for automating batches and sessions.
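The change data capture step above can be sketched as a timestamp-based incremental extract in Python; the table layout and watermark are hypothetical simplifications (Power Exchange CDC actually captures changes from database logs rather than a timestamp column):

```python
from datetime import datetime

# Hypothetical source rows with a last-modified timestamp column
SOURCE = [
    {"id": 1, "amount": 100, "updated_at": datetime(2024, 1, 1)},
    {"id": 2, "amount": 200, "updated_at": datetime(2024, 1, 5)},
    {"id": 3, "amount": 300, "updated_at": datetime(2024, 1, 9)},
]

def extract_changes(rows, last_run):
    """Pull only the rows changed since the previous extraction run."""
    return [r for r in rows if r["updated_at"] > last_run]

delta = extract_changes(SOURCE, datetime(2024, 1, 3))
print([r["id"] for r in delta])  # rows 2 and 3 changed since the last run
```

After each successful load, the job persists the new high-water mark so the next run only processes rows modified after it.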

Environment: Informatica Power Center 8.6/8.1, Power Exchange, Oracle 9i/10g, PL/SQL, Teradata V2R4, HP-UX, Windows 2000, Shell Scripting, Mainframes z/OS, Autosys, OBIEE, MultiLoad, FastLoad, FastExport, Toad for Oracle.

Confidential - Minneapolis, MN

Senior ETL / Informatica Developer

Responsibilities:

  • Collaborated with business users on requirements gathering and performed data analysis on the legacy systems.
  • Prepared the functional specs and the technical design.
  • Performed in-depth data analysis and implemented the cleansing and data quality processes.
  • Designed and implemented the pre-staging and staging approach for cleansing using Informatica ETL and UNIX.
  • Designed and developed star schema model for target database using ERWIN Data modeling.
  • Implemented Slowly Changing Dimension Type 1 and Type 2 for inserting and updating Target tables for maintaining the history.
  • Used various active and passive transformations such as Aggregator, Expression, Sorter, Router, Joiner, connected/unconnected Lookup, Stored Procedure, and Update Strategy transformations for data control, cleansing, and data movement.
  • Involved in massive data cleansing prior to data staging.
  • Extensively used Informatica client tools Source Analyzer, Warehouse designer, Mapping Designer, Mapplet Designer, Transformation Developer, Informatica Repository Manager and Informatica Workflow Manager.
  • Created sequential/concurrent Sessions/ Batches for data loading process and used Pre & Post Session SQL Script to meet business logic.
  • Designed and developed Mapplets for faster development, standardization and reusability purposes.
  • Involved in pre and post session migration planning for optimizing data load performance.
  • Extensively used ETL Informatica tools to extract data stored in MS SQL 2000, Excel, and Flat files and finally loaded into a single Data Warehouse.
  • Performed Unit testing during the mapping phase to ensure proper and efficient implementation of the transformations.
  • Extensively involved in performance tuning of the Informatica Mappings/Sessions by increasing the caching size, overriding the existing SQL.
  • Worked as team leader to implement Incremental load and CDC to data mart using Informatica for reporting purpose.
  • Maintained warehouse metadata, naming standards and warehouse standards for future application development.
  • Tuning the Mappings for Optimum Performance, Dependencies and Batch Design.
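The Type 2 slowly changing dimension logic mentioned above can be sketched in Python; the table layout and tracked attribute are hypothetical, and surrogate key generation is omitted for brevity:

```python
from datetime import date

def apply_scd2(dim, incoming, today):
    """Expire the current dimension row and insert a new version when an attribute changes."""
    for row in dim:
        if row["cust_id"] == incoming["cust_id"] and row["is_current"]:
            if row["city"] == incoming["city"]:
                return dim                 # no change: nothing to do
            row["is_current"] = False      # Type 2: close out the old version
            row["end_date"] = today
    # Insert the new current version (new customers also land here)
    dim.append({"cust_id": incoming["cust_id"], "city": incoming["city"],
                "start_date": today, "end_date": None, "is_current": True})
    return dim

dim = [{"cust_id": 7, "city": "DALLAS", "start_date": date(2023, 1, 1),
        "end_date": None, "is_current": True}]
dim = apply_scd2(dim, {"cust_id": 7, "city": "AUSTIN"}, date(2024, 6, 1))
```

A Type 1 dimension would simply overwrite the city in place; Type 2 preserves history by versioning rows with effective dates and a current-row flag, which is what the Update Strategy transformations in the mappings implement.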

Environment: Informatica Power Center 8.6.1, Metadata Manager, Power Exchange 8.6, Data Analyzer, UNIX, Oracle 10g, Flat Files, XML Files, SQL and PL/SQL, ERWIN 4.0, Toad for Oracle10g, Remedy, Ultra Edit, Microsoft Project, ESP.

Confidential - Atlanta, GA

ETL Developer

Responsibilities:

  • Worked with the business analysis team on a regular basis and assisted in implementing agile methodology on the project.
  • Designed the ETL architecture and put ETL standards in place for Informatica.
  • Administered several Informatica environments, including production environments.
  • Created Reusable transformations and Mapplets using transformation developer and Mapplet Designer throughout project life cycle.
  • Scheduled Informatica workflows using Informatica scheduler and external scheduler Control Manager.
  • Tested Informatica jobs on the pre-production server before moving them to production servers.
  • Created and maintained Connect Direct Connections for secure data transfer between production servers and other environments.
  • Responsible for determining the bottlenecks and fixing the bottlenecks with performance tuning.
  • Worked with Crontab utility in UNIX to manage and run workflow as per schedule.
  • Developed and maintained simple and complex end-user reports and reports from Informatica repository for internal use in Business Objects.
  • Installed and configured the transformation server for a Data Replication tool called Data Mirror.
  • Configured several oracle production servers for replication with the help of Data Mirror.
  • Supported Production environment for BI tools like Business Objects and Brio Hyperion 6.6.4.
  • Extensively used the SQL for Oracle and Teradata and developed PL/SQL scripts.

Environment: Informatica Power Center 7.1.4/7.1.1, Cognos, Oracle, Teradata, Toad 8.6, UNIX, HP-UX 11i, Sun Solaris 5.4, SQL, FACETS 4.X, MS Office Visio 2003.
