
Informatica ETL/MDM Technical Lead Resume


Rochester, NY

SUMMARY

  • 8+ years of IT experience with a strong background in Master Data Management (MDM), Data Warehousing, Business Intelligence, and Product Information Management (PIM), with expertise in all phases of the SDLC: requirements gathering, analysis, design, development, testing, and migration to production.
  • 3 years of proven experience in Informatica MDM Hub (Siperian), Hierarchy Manager, Informatica Data Director (IDD), design of cleanse functions, Match & Merge rule setup, and User Exits.
  • Strong experience in the design and configuration of landing tables, staging tables, base objects, Trust and Validation rules, Match and Merge rules, hierarchies, foreign key relationships, lookups, queries & packages, and cleanse functions (predefined and custom in Java) in the Informatica MDM Hub Console.
  • Good experience in configuring IDD applications, subject area groups, subject areas, subject area children, and search queries for managing MDM Hub data through IDD per business needs, working with data stewards and business users. Also experienced in creating custom Java User Exits and custom workflow processes.
  • Well versed in the creation of entities, entity types, relationship types, hierarchies, packages, and profiles for hierarchy management in MDM Hub implementations.
  • Experience in integrating external applications with the MDM Hub through SIF (Services Integration Framework) APIs, configuring message triggers through message queues for JMS listener systems, real-time processing of jobs through SIF APIs in Java, and batch processing of jobs in the MDM Hub Console.
  • Experience in developing IDQ mappings to load cleansed data into target tables using various IDQ transformations. Also experienced in data profiling and analyzing scorecards to design the data model.
  • Proficient knowledge of data warehouse and data modeling concepts such as Star Schema, Snowflake Schema, Fact & Dimension modeling, SCD types, surrogate keys, normalization/de-normalization, and semantic layers.
  • Extensive experience in integrating data from source systems such as Oracle, MS SQL Server, DB2, flat files, and XML files to build enterprise data warehouses/data marts.
  • Extensively worked on developing complex Informatica ETLs to extract, transform, and load data into target tables. Well acquainted with mappings, mapplets, sessions, worklets, workflows, tasks, and Informatica transformations such as Source Qualifier, Lookup (connected and unconnected), Expression, Aggregator, Update Strategy, Sequence Generator, Joiner, Filter, Rank, Sorter, Router, and Stored Procedure.
  • Worked predominantly on the Teradata database, with strong knowledge of Teradata architecture and concepts. Also experienced with Oracle, SQL Server, and DB2 databases.
  • Experience in creating complex database objects such as stored procedures, functions, cursors, and triggers using SQL and PL/SQL.
  • Well versed in performance tuning on the Teradata side: creating proper indexes (Primary Index, Secondary Index, Partitioned Primary Index, Keys, Join Index, and Hash Index), collecting the required statistics, and restructuring query joins after analyzing the EXPLAIN plan.
  • Teradata Certified Professional (TD 12). Proven experience with the Teradata utilities BTEQ, FastLoad, FastExport, TPump, MultiLoad, and TPT operators.
  • Expertise in identifying bottlenecks on the Informatica side (source, target, session, mapping, transformation) by debugging/reviewing session logs and tuning performance accordingly.
  • Strong in UNIX shell programming, with experience writing shell scripts for automation per business requirements.
  • Experience in creating standard/ad hoc reports in Business Objects (BO) and Cognos.
  • Strong analytical and problem-solving skills, coupled with strong leadership and confident decision-making, enabling effective solutions and high customer satisfaction.

TECHNICAL SKILLS

Data Modeling: Star Schema, Snowflake Schema, 3NF Model, Facts & Dimensions, Conceptual Schema, Logical Schema, Physical Schema

ETL Tools: Informatica Power Center v8.6/v9.1/v9.5/v9.6 and MS SSIS

MDM & Data Quality: Informatica MDM Hub Console v9.7/v9.6/v9.5, Informatica Data Director (IDD), Services Integration Framework (SIF), Informatica Developer for IDQ, Informatica Analyst, and DataFlux

Databases: Teradata v12/v13/v13.10/v14, Oracle 10g/11g, MS SQL Server 2008/2014, and DB2

Teradata Utilities: FastLoad, FastExport, TPump, MultiLoad, BTEQ, and TPT

Scheduling Tools: Tidal, Dollar U, and SQL Server Agent

Reporting Tools: SAP Business Objects (BO) and IBM Cognos v10.2.1

Operating Systems: UNIX and Windows XP/7/Server 2012

Replication Tools: Data Mover and GoldenGate

Languages: SQL, PL/SQL, Java and UNIX Shell Scripting

Other Software Tools: TOAD for Oracle, Eclipse, JBoss, SoapUI, Citrix, HP Quality Center, BMC Remedy, PVCS, PuTTY, and LiveLink (shared services)

PROFESSIONAL EXPERIENCE

Confidential, Rochester, NY

Informatica ETL/MDM Technical Lead

Responsibilities:

  • Involved in requirement analysis, design, and configuration of the MDM Hub to build the enterprise-wide MDM solution.
  • Configured base objects, staging tables, landing tables, foreign key relationships, lookups, queries, packages, query groups, and custom functions.
  • Developed mappings with various cleanse functions to load data from landing tables to staging tables.
  • Configured Trust settings and validation rules. Created Match and Merge rules for the base objects by defining match path components, match columns, and rules to identify and merge match candidates.
  • Integrated external web applications with the MDM Hub through SIF APIs. Also created message queues and configured message trigger setup for JMS listener systems.
  • Ran stage, load, match, and merge jobs in real time through SIF APIs, and in batch through automation tools by calling batch groups.
  • Configured IDD to meet data governance objectives. Created subject area groups, subject areas, subject area children, and search queries for searching data in IDD. Also created custom User Exits and custom workflow processes.
  • Created hierarchies, entity types, relationships, packages, and profiles in the Hierarchies tool to maintain hierarchies between different objects.
  • Closely worked with Data Steward Team to design and configure Informatica Data Director for supporting management of MDM data.
  • Built Physical Data Objects and developed various mappings and mapplets/rules using Informatica Data Quality (IDQ) per requirements to profile, validate, and cleanse the data.
  • Created mappings in PowerCenter to load data from external sources into the landing tables of the MDM Hub; a scheduler wrapper sketch follows this list.
  • Analyzed DTS & SSIS packages to understand the business requirements in detail and developed complex ETLs to pull data from source and populate target tables.
  • Worked extensively with transformations such as Source Qualifier, Lookup (connected and unconnected), Expression, Aggregator, Update Strategy, Sequence Generator, Joiner, Filter, Rank, Sorter, Router, and Stored Procedure, per business requirements.
  • Involved in Logical and Physical data modeling for different phases.
  • Performed performance tuning after identifying bottlenecks on the Informatica side in scenarios such as Lookup, Joiner, and Aggregator transformations, session-level partitioning, source and target bottlenecks, and changing session attributes.
  • Coordinated with the offshore team to ensure deliverables were completed within SLA.
  • Wrote complex queries and stored procedures, as scenarios required, for use in ETLs.
  • Responsible for migration of the project between different environments (Dev, QA, Prod).
  • Maintained knowledge documents for the ETLs developed and for various scenarios.
  • Communicated regularly with end users about the changes being performed.
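
A minimal sketch of how a landing-table load workflow of this kind might be wrapped for a scheduler using the PowerCenter pmcmd command-line utility. The integration service, domain, folder, workflow, and account names are hypothetical placeholders:

```sh
#!/bin/sh
# Sketch: start a PowerCenter workflow that loads MDM landing tables,
# wait for it to finish, and surface a non-zero exit code to the scheduler.
# Service/domain/folder/workflow names below are illustrative only.

INFA_USER="etl_batch"        # hypothetical service account
INFA_PWD="$PM_PASSWORD"      # password taken from the environment

pmcmd startworkflow \
    -sv IS_DEV -d Domain_Dev \
    -u "$INFA_USER" -p "$INFA_PWD" \
    -f MDM_LANDING -wait wf_load_landing_customer

if [ $? -ne 0 ]; then
    echo "wf_load_landing_customer failed; check the session log" >&2
    exit 1
fi
```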

Confidential, Piscataway, NJ

DW ETL/MDM Analyst

Responsibilities:

  • Involved in requirements gathering, analysis, design and building the MDM solution.
  • Developed mappings with different types of cleanse functions to load the data from landing tables to staging tables.
  • Configured the Base objects, Staging tables, foreign key relationships, lookups, queries, packages, query groups and custom functions in MDM Hub console.
  • Created Match and merge rules for the base objects by defining the Match Path components, Match columns and rules to identify and merge the match candidates.
  • Defined Trust and validation rules to calculate the trust scores for the records.
  • Configured IDD applications, subject area groups, subject areas, subject area children, and search queries for managing MDM Hub data through IDD per business requirements.
  • Performed data profiling on the source systems' data using the Informatica Analyst tool to produce scorecards, and measured the quality of critical data elements such as Invoice, PR, PO, Material, and Material Price.
  • Designed and developed mappings/mapplets to extract, cleanse, transform, and load data into target tables using different IDQ transformations. Also validated the mapplets as rules for use in data profiling and data modeling.
  • Extensively used Informatica Power Center as an ETL tool to extract, transform and load the data into the target tables.
  • Identified bottlenecks for long-running jobs and performed tuning on the Informatica side in scenarios such as Lookup, Joiner, and Aggregator transformations, session-level partitioning, source and target bottlenecks, and changing session attributes.
  • Developed complex queries with multiple joins to fetch data and load the target tables.
  • Performed tuning on the database side after identifying bottlenecks through the EXPLAIN plan, improving performance by creating proper indexes and partitions, collecting statistics, and rewriting queries; a BTEQ sketch follows this list.
  • Implemented the reports in Cognos as per business requirement.
  • Analyzed requests raised by business/end users and provided root cause analysis (RCA) and short-term fixes (STF). When a long-term fix (LTF) was required to rectify an issue, followed the complete SDLC and migrated the fix to production.
  • Ensured data was extracted from the 17 ERP source systems and populated into the core layer through all EDW layers by running the scheduled jobs in Tidal. Analyzed and resolved any issues with either an STF or LTF.
  • Coordinated regularly with the client/onsite counterpart and made changes per requirements. Also kept end users informed of the changes being performed.
  • Prepared knowledge domain documentation on modules developed.
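
A minimal sketch of the database-side tuning loop described above, run through BTEQ: review the optimizer's plan with EXPLAIN, then refresh statistics on the join columns. The TDPID, credentials, and table/column names are hypothetical:

```sh
#!/bin/sh
# Sketch: inspect a slow query's plan and refresh Teradata statistics.
# tdprod, etl_user, and the edw.* objects are illustrative names only.

bteq <<EOF
.LOGON tdprod/etl_user,${TD_PWD}

/* Step 1: review the optimizer's plan for the slow query */
EXPLAIN
SELECT i.invoice_id, SUM(i.amount)
FROM   edw.invoice        i
JOIN   edw.purchase_order p ON i.po_id = p.po_id
GROUP BY i.invoice_id;

/* Step 2: refresh stats on the join/grouping columns so the
   optimizer can choose a better plan */
COLLECT STATISTICS ON edw.invoice COLUMN (po_id);
COLLECT STATISTICS ON edw.invoice COLUMN (invoice_id);

.QUIT
EOF
```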

Confidential, Ann Arbor, MI

ETL/MDM Developer

Responsibilities:

  • Involved in all phases of the SDLC, from requirements, design, development, testing, and training through rollout to field users and production support.
  • Involved in understanding business processes, grain identification, and identification of dimensions and measures for OLAP applications. Designed and implemented complex Informatica mappings and performed tuning in various scenarios.
  • Involved in implementing the land process: loading customer/product data sets coming from various source systems into the Informatica MDM Hub using the ETL tool.
  • Defined the Base objects, Staging tables, foreign key relationships, lookups, queries, packages and query groups.
  • Configured Match & Merge rule sets and ran stage jobs, load jobs, and match jobs in the Batch Viewer.
  • Implemented IDD as per the Business requirements.
  • Worked with data stewards to set guidelines and benchmarks for resolving linkage and duplicate tasks in the Initiate hub.
  • Worked on address validation and name matching tasks using IDQ.
  • Extracted and loaded data using Teradata utilities such as MultiLoad, FastExport, FastLoad, TPump, BTEQ, and TPT; a FastLoad wrapper sketch follows this list.
  • Identified data quality issues, anomalies, and patterns based on business rules, then cleansed, labeled, and fixed data gaps with IDQ.
  • Wrote stored procedures in PL/SQL and UNIX scripts for automated execution of jobs.
  • Created standard/ad hoc reports in Cognos per business requirements.
  • Prepared Technical Design Documents (TDD) of the complete implementation of the Business Module.
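
A minimal sketch of a FastLoad run of the kind referenced above: a pipe-delimited flat file bulk-loaded into an empty Teradata staging table. The TDPID, credentials, file path, and table/column names are hypothetical:

```sh
#!/bin/sh
# Sketch: bulk-load a pipe-delimited file into an empty staging table.
# FastLoad requires an empty target table; all names are illustrative.

fastload <<EOF
SESSIONS 4;
ERRLIMIT 25;
LOGON tdprod/etl_user,${TD_PWD};

SET RECORD VARTEXT "|";
BEGIN LOADING stg.customer_stg
    ERRORFILES stg.customer_err1, stg.customer_err2;

DEFINE cust_id   (VARCHAR(18)),
       cust_name (VARCHAR(100))
FILE = /data/in/customer.dat;

INSERT INTO stg.customer_stg VALUES (:cust_id, :cust_name);

END LOADING;
LOGOFF;
EOF
```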

Confidential, San Jose, CA

Data warehouse/ETL Analyst

Responsibilities:

  • Involved in requirement analysis, design, development, and migration to production for extracting data from the source systems (ERP/ODS) and loading it into the Enterprise Data Warehouse (TDPROD).
  • Enhanced modules per business requirements across measures such as General Ledger (GL), Accounts Payable (AP), Headcount, Purchase Order (PO), Purchase Requisition (PR), Assets, and Revenue.
  • Developed complex ETLs to pull data from source and populate target tables using Informatica transformations such as Source Qualifier, Lookup (connected and unconnected), Expression, Aggregator, Update Strategy, Sequence Generator, Joiner, Filter, Rank, Sorter, and Router, per business requirements.
  • Developed Teradata utility (FastLoad, TPump, and MultiLoad) scripts to load data from flat files into Teradata tables.
  • Implemented SCD Type-1 and Type-2 logic in Informatica for various scenarios as required; a sketch of the Type-2 close-and-insert pattern in Teradata SQL follows this list.
  • Wrote complex Teradata queries to transform data at the database end for use as SQL overrides in Source Qualifier transformations. Also wrote stored procedures as required.
  • Performed tuning on the Informatica side in scenarios such as Lookup, Joiner, and Aggregator transformations, session-level partitioning, and changing session attributes.
  • Performed job abort analysis and provided STFs/LTFs for the production, extraction, BCP, and DataFlux systems.
  • Handled weekly/monthly/quarterly/yearly activities such as table partitioning, month-end runs, and statistics updates for the production system.
  • Performed impact analysis for code and table structure changes implemented through change requests, and ensured defect-free delivery.
  • Performed performance tuning/monitoring using different tools at the Teradata database level, and provided RCA/fixes for performance issues. Worked closely with the Teradata DBA team.
  • Analyzed cases raised by end users related to data, performance, and tracking/status, and resolved them within SLA.
  • Implemented UNIX shell scripts as required to reduce manual work.
  • Analyzed replication from TDPROD to TDPROD2 using GoldenGate and Data Mover whenever issues occurred, and fixed issues in the best possible way after analysis.
  • Communicated daily with the client/onsite coordinator regarding work status and issues, and proposed solutions after analysis.
  • Prepared knowledge domain documentation on the modules developed. Also documented issues/information in a timely manner and trained team members and new joiners.
  • Implemented data reconciliation reports/queries in the production system to assess data quality and verify that data tied between source systems and the enterprise warehouse via DataFlux.
  • Implemented various reports in Business Objects per business requirements.
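
A minimal sketch of the SCD Type-2 close-and-insert pattern referenced above, expressed in Teradata SQL via BTEQ rather than as Informatica mappings. It assumes a hypothetical customer dimension whose surrogate key is an identity column; all names are illustrative:

```sh
#!/bin/sh
# Sketch: SCD Type-2 in two passes -- expire changed current rows, then
# insert new versions. Assumes edw.customer_dim.cust_key is an identity
# column; tables and columns are illustrative only.

bteq <<EOF
.LOGON tdprod/etl_user,${TD_PWD}

/* Pass 1: close out current rows whose tracked attribute changed */
UPDATE dim
FROM edw.customer_dim AS dim, stg.customer_stg AS s
SET eff_end_dt   = CURRENT_DATE - 1,
    current_flag = 'N'
WHERE dim.cust_id      = s.cust_id
  AND dim.current_flag = 'Y'
  AND dim.cust_name   <> s.cust_name;

/* Pass 2: insert current versions for new and changed customers
   (anything left without a current row) */
INSERT INTO edw.customer_dim
    (cust_id, cust_name, eff_start_dt, eff_end_dt, current_flag)
SELECT s.cust_id, s.cust_name, CURRENT_DATE, DATE '9999-12-31', 'Y'
FROM stg.customer_stg s
LEFT JOIN edw.customer_dim d
       ON d.cust_id = s.cust_id
      AND d.current_flag = 'Y'
WHERE d.cust_id IS NULL;

.QUIT
EOF
```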

Confidential

ETL Developer

Responsibilities:

  • Worked with sources such as Oracle, SQL Server, and flat files.
  • Extensively worked with various Active transformations like Filter, Sorter, Aggregator, Router and Joiner transformations.
  • Extensively worked with various Passive transformations like Expression, Lookup (connected and unconnected), Sequence Generator, Mapplet Input and Mapplet Output transformations.
  • Developed Mapplets to implement business rules using complex logic.
  • Used Informatica Repository Manager to create Repositories, User Groups and Users based on their roles.
  • Converted the PL/SQL Procedures and SQL*Loader scripts to Informatica mappings.
  • Tuned performance of Informatica session for large data files by increasing block size, data cache size and target based commit interval.
  • Developed UNIX shell scripts to automate the FTP data transfer process to and from source systems and to schedule weekly and monthly loads/jobs; see the sketch after this list.
  • Used Informatica Designer to create complex mappings using different transformations to move data to multiple databases.
  • Designed and developed pre-session, post-session, and batch execution routines to run Informatica sessions using Informatica Server Manager.
  • Used Debugger to check the errors in mapping.
  • Generated UNIX shell scripts for automating daily load processes.
  • Managed change control implementation and coordinated daily and monthly releases and reruns.
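
A minimal sketch of the FTP automation described above: pull a daily extract from a source host and verify the file landed before downstream loads run. The host, account, paths, and file naming convention are hypothetical:

```sh
#!/bin/sh
# Sketch: fetch a daily source extract over FTP and fail fast if the
# file is missing or empty. All names below are illustrative.

SRC_HOST="src-erp01"
SRC_DIR="/outbound"
LOCAL_DIR="/data/in"
FILE="orders_$(date +%Y%m%d).dat"

ftp -n "$SRC_HOST" <<EOF
user ftpuser ${FTP_PWD}
binary
cd ${SRC_DIR}
lcd ${LOCAL_DIR}
get ${FILE}
bye
EOF

if [ ! -s "${LOCAL_DIR}/${FILE}" ]; then
    echo "${FILE} missing or empty; aborting downstream load" >&2
    exit 1
fi
```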
