BI/MDM Analyst Resume
Rochester, NY
SUMMARY
- 8+ years of IT experience with a strong background in Data Warehousing, Business Intelligence, and Master Data Management, with expertise in all phases of the SDLC: requirements gathering, analysis, design, development, testing, and migration to production.
- Proficient in data warehouse and data modeling concepts such as star schema, snowflake schema, fact and dimension modeling, SCD types, surrogate keys, normalization/de-normalization, and semantic layers (a dimension table sketch follows this summary).
- Extensive experience integrating data from source systems such as Oracle, MS SQL Server, DB2, flat files, and XML files to build enterprise data warehouses and data marts.
- Extensively worked on developing complex Informatica ETL jobs to extract, transform, and load data into target tables. Well acquainted with mappings, mapplets, sessions, worklets, workflows, tasks, and Informatica transformations such as Source Qualifier, Lookup (connected and unconnected), Expression, Aggregator, Update Strategy, Sequence Generator, Joiner, Filter, Rank, Sorter, Router, and Stored Procedure.
- Strong configuration and development experience in the Informatica Master Data Management (MDM) Hub Console, including creating stage and base objects, loading landing-to-staging tables with cleanse functions, loading stage-to-base objects, setting up validation rules, defining match and merge rules, Hierarchy Manager, and creating queries and packages.
- Good experience configuring IDD applications, subject area groups, subject areas, subject area children, and search queries for managing MDM Hub data through IDD according to business needs.
- Expertise in the Informatica MDM Hub match and merge process, batch jobs, batch groups, validation rules, and trust settings for source systems.
- Experience developing IDQ mappings to load cleansed data into target tables using various IDQ transformations, as well as in data profiling and scorecard analysis to inform data model design.
- Worked predominantly on the Teradata database, with strong knowledge of Teradata architecture and concepts.
- Developed complex database objects such as stored procedures, functions, cursors, and triggers using SQL and PL/SQL.
- Well versed in performance tuning on the Teradata database side: analyzing EXPLAIN output, creating appropriate indexes (primary index, secondary index, partitioned primary index, keys, join index, and hash index), collecting the required statistics, and rewriting query joins (a tuning sketch follows this summary).
- Proven experience with the Teradata utilities BTEQ, FastLoad, FastExport, TPump, MultiLoad, and TPT operators (a BTEQ sketch follows this summary).
- Teradata Certified Professional (TD 12).
- Expertise in identifying bottlenecks on the Informatica side (source, target, session, mapping, or transformation) by debugging and reviewing session logs, and tuning performance accordingly.
- Experienced with Oracle and SQL Server databases, including creating stored procedures as required.
- Strong UNIX shell programming skills; experienced in writing shell scripts to automate tasks per business requirements.
- Experience creating standard and ad hoc reports in Business Objects (BO) and Cognos.
- Received awards including the ‘Performance Excellence Award’ and ‘On the Spot Award’ from various clients and employers.
- Strong analytical and problem-solving skills, coupled with strong leadership and confident decision-making, enabling effective solutions and high customer satisfaction.
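A minimal sketch of the dimension modeling concepts above, as a hypothetical Teradata-style DIM_CUSTOMER table (all names illustrative): a surrogate key plus SCD Type 2 effective dating.

    CREATE TABLE dim_customer (
        customer_sk     INTEGER NOT NULL,        -- surrogate key, generated by the ETL
        customer_id     VARCHAR(20) NOT NULL,    -- natural key from the source system
        customer_name   VARCHAR(100),
        region          VARCHAR(50),
        eff_start_dt    DATE NOT NULL,           -- SCD Type 2 effective dating
        eff_end_dt      DATE,                    -- NULL for the current version of the row
        current_flag    CHAR(1) DEFAULT 'Y'
    ) PRIMARY INDEX (customer_sk);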
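A sketch of the Teradata tuning steps referenced above, against a hypothetical fact table (names and date ranges illustrative): a partitioned primary index for date-range pruning, statistics collection, and an EXPLAIN check.

    CREATE TABLE fact_sales (
        sale_id   BIGINT NOT NULL,
        sale_dt   DATE NOT NULL,
        store_id  INTEGER,
        amount    DECIMAL(18,2)
    ) PRIMARY INDEX (sale_id)
    PARTITION BY RANGE_N (sale_dt BETWEEN DATE '2010-01-01' AND DATE '2015-12-31'
                          EACH INTERVAL '1' MONTH);

    -- Give the optimizer the statistics it needs for good join plans
    COLLECT STATISTICS ON fact_sales COLUMN (sale_dt);
    COLLECT STATISTICS ON fact_sales COLUMN (store_id);

    -- Review the plan before and after tuning
    EXPLAIN SELECT store_id, SUM(amount)
    FROM fact_sales
    WHERE sale_dt BETWEEN DATE '2015-01-01' AND DATE '2015-03-31'
    GROUP BY store_id;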
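And a minimal BTEQ script of the kind listed among the utilities (the logon string, file path, and table name are placeholders), exporting a delimited extract:

    .LOGON tdpid/etl_user,password;
    .EXPORT REPORT FILE = /data/out/customer_extract.txt;

    SELECT customer_id || '|' || customer_name (TITLE '')
    FROM   edw.dim_customer
    WHERE  current_flag = 'Y';

    .EXPORT RESET;
    .LOGOFF;
    .QUIT;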
TECHNICAL SKILLS
Data Modeling: Star Schema, Snowflake Schema, 3NF Model, Facts & Dimensions, Conceptual Schema, Logical Schema, Physical Schema
ETL Tools: Informatica PowerCenter v8.6/v9.1/v9.5/v9.6 and MS SSIS
MDM & Data Quality: Informatica MDM Hub Console v9.6/v9.7, Informatica Data Director (IDD), Informatica Developer for IDQ, Informatica Analyst and Data Flux
Databases: Teradata v12/v13/v13.10/v14, Oracle 10g/11g and MS SQL Server
Teradata Tools: TD SQL Assistant, TD Administrator, Viewpoint and TD Manager
Teradata Utilities: Fast Load, TPump, Fast Export, BTEQ, MultiLoad and TPT
Scheduling Tools: Tidal and Dollar U
Reporting Tools: SAP Business Objects (BO) and IBM Cognos v10.2.1
Operating Systems: Windows XP/7/Server 2012 and UNIX
Replication Tools: Data Mover and Golden Gate
Languages: SQL, PL/SQL and UNIX Shell Scripting
Other Software Tools: TOAD for Oracle, Citrix, HP Quality Center, BMC Remedy, PVCS, PuTTY, and Livelink (shared services)
PROFESSIONAL EXPERIENCE
Confidential, Rochester, NY
BI/MDM Analyst
Responsibilities:
- Involved in requirements analysis, design, development, and migration to production for extracting data from the source systems and loading it into the Pricebank.
- Analyzed the DTS and SSIS packages to understand the business requirements in detail, and developed complex ETL jobs to pull data from sources and populate the target tables.
- Created mappings in PowerCenter to load data from external sources into the MDM Hub landing tables.
- Configured the base objects, staging tables, foreign key relationships, static and dynamic lookups, queries, packages, query groups, and custom functions.
- Developed mappings with various cleanse functions to load data from landing tables to staging tables, and defined the trust and validation rules before loading data into the base tables.
- Created match and merge rule sets for the base objects by defining the match path components, match columns, and rules.
- Ran stage, load, and match and merge jobs using the Batch Viewer and automation processes.
- Successfully implemented IDD using hierarchy configuration, creating subject area groups, subject areas, subject area children, IDD display packages in the Hub, and search queries for searching data in the IDD Data tab.
- Worked closely with the data steward team to design, document, and configure Informatica Data Director in support of MDM data management.
- Built the physical data objects and developed various mappings, mapplets, and rules in Informatica Data Quality (IDQ) to profile, validate, and cleanse the data per requirements.
- Worked extensively on transformations such as Source Qualifier, Lookup (connected and unconnected), Expression, Aggregator, Update Strategy, Sequence Generator, Joiner, Filter, Rank, Sorter, Router, and Stored Procedure, per business requirements.
- Involved in Logical and Physical data modeling for different phases.
- Performed performance tuning after identifying bottlenecks on the Informatica side in scenarios such as Lookup, Joiner, and Aggregator transformations, session-level partitioning, source and target bottlenecks, and changes to session attributes.
- Coordinated with the offshore team to ensure deliverables were completed within SLA.
- Wrote complex queries and stored procedures as needed for use in the ETL jobs (a sample query sketch follows this list).
- Responsible for migrating the project between environments (Dev, QA, Prod).
- Maintained knowledge documents for the ETL jobs developed and for various scenarios.
- Communicated regularly with end users about the changes being performed.
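A sketch of the kind of query used inside these ETL jobs (table and column names hypothetical): deduplicating staged rows so only the latest record per business key is loaded.

    -- Keep only the most recent price row per item/store combination
    SELECT item_id, store_id, price, updated_ts
    FROM (
        SELECT item_id, store_id, price, updated_ts,
               ROW_NUMBER() OVER (PARTITION BY item_id, store_id
                                  ORDER BY updated_ts DESC) AS rn
        FROM   stg_item_price
    ) t
    WHERE t.rn = 1;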
Confidential, Piscataway, NJ
Data Warehouse/MDM Technical Lead
Responsibilities:
- Extensively used Informatica PowerCenter as the ETL tool to extract, transform, and load data into the target tables.
- Identified bottlenecks in long-running jobs and performed performance tuning on the Informatica side in scenarios such as Lookup, Joiner, and Aggregator transformations, session-level partitioning, source and target bottlenecks, and changes to session attributes.
- Developed complex queries with multiple joins to fetch data and load it into the target tables.
- Performed performance tuning on the database side after identifying bottlenecks through EXPLAIN plans, improving performance by creating appropriate indexes and partitions, collecting statistics, and tuning the queries.
- Configured the base objects, staging tables, foreign key relationships, static and dynamic lookups, queries, packages, query groups, and custom functions in the MDM Hub Console.
- Created match and merge rule sets for the base objects by defining the match path components, match columns, and rules.
- Defined trust and validation rules before loading data into the base tables.
- Configured the IDD applications, subject area groups, subject areas, subject area children, and search queries for managing MDM Hub data through IDD according to business needs.
- Profiled the source systems' data using the Informatica Analyst tool to generate scorecards, and measured the quality of critical data elements such as invoice, PR, PO, material, and material price.
- Designed and developed mappings to extract, cleanse, transform and load into target tables using different IDQ transformations.
- Implemented reports in Cognos per business requirements.
- Analyzed requests raised by the business and end users and provided root cause analysis (RCA) or a short-term fix (STF). When a long-term fix (LTF) was required, followed the complete SDLC and migrated the fix to production.
- Ensured data was extracted from the 17 ERP source systems and populated through all the EDW layers into the core layer by running the scheduled jobs in Tidal; analyzed and resolved any issues with either an STF or an LTF.
- Coordinated regularly with the client and onsite counterparts to make changes per requirements, and kept end users informed of the changes being performed.
- Prepared knowledge domain documentation on modules developed.
Confidential, Ann Arbor, MI
ETL/Data Quality Analyst
Responsibilities:
- Involved in all phases of the SDLC, from requirements, design, development, and testing through training, rollout to field users, and production support.
- Involved in understanding business processes, identifying the grain, and identifying dimensions and measures for OLAP applications. Designed and implemented complex Informatica mappings and performed performance tuning in various scenarios.
- Involved in implementing the land process, loading customer/product data sets coming from various source systems into the Informatica MDM Hub using the ETL tool.
- Defined the base objects, staging tables, foreign key relationships, lookups, queries, packages, and query groups.
- Configured match & merge rule sets and ran stage jobs, load jobs, match jobs and etc. in batch viewer.
- Implemented IDD per business requirements.
- Worked with data stewards to set guidelines and benchmarks for resolving linkage and duplicate tasks in the Initiate hub.
- Worked on address validation and name-matching tasks using IDQ.
- Extracted and loaded data using Teradata utilities such as MultiLoad, FastExport, FastLoad, TPump, BTEQ, and TPT.
- Identified data quality issues, anomalies, and patterns based on business rules, then cleansed and labeled the data and fixed data gaps with IDQ.
- Wrote stored procedures in PL/SQL and UNIX scripts for automated execution of jobs (a stored procedure sketch follows this list).
- Created standard and ad hoc reports in Cognos per business requirements.
- Prepared Technical Design Documents (TDD) of the complete implementation of the Business Module.
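A minimal PL/SQL sketch of the kind of procedure used for automated job execution (the etl_job_log table and job_run_seq sequence are hypothetical):

    -- Record a job run and return its run ID; illustrative only
    CREATE OR REPLACE PROCEDURE log_job_run (
        p_job_name  IN  VARCHAR2,
        p_status    IN  VARCHAR2,
        p_run_id    OUT NUMBER
    ) AS
    BEGIN
        SELECT job_run_seq.NEXTVAL INTO p_run_id FROM dual;

        INSERT INTO etl_job_log (run_id, job_name, status, run_ts)
        VALUES (p_run_id, p_job_name, p_status, SYSDATE);

        COMMIT;
    EXCEPTION
        WHEN OTHERS THEN
            ROLLBACK;
            RAISE;
    END log_job_run;
    /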
Confidential, San Jose, CA
Data Warehouse Analyst
Responsibilities:
- Involved in requirements analysis, design, development, and migration to production for extracting data from the source systems (ERP/ODS) and loading it into the Enterprise Data Warehouse (TDPROD).
- Enhanced modules per business requirements across measures such as General Ledger (GL), Accounts Payable (AP), Headcount, Purchase Order (PO), Purchase Requisition (PR), Assets, and Revenue.
- Developed complex ETL jobs to pull data from sources and populate target tables using Informatica transformations such as Source Qualifier, Lookup (connected and unconnected), Expression, Aggregator, Update Strategy, Sequence Generator, Joiner, Filter, Rank, Sorter, and Router, per business requirements.
- Developed Teradata utility scripts (FastLoad, TPump, and MultiLoad) to load data from flat files into Teradata tables (a FastLoad sketch follows this list).
- Implemented SCD Type 1 and Type 2 logic in Informatica for various scenarios as required (see the Type 2 SQL sketch after this list).
- Wrote complex Teradata queries to transform data on the database side, for use as SQL overrides in Source Qualifier transformations, and wrote stored procedures as required.
- Performed performance tuning on the Informatica side in scenarios such as Lookup, Joiner, and Aggregator transformations, session-level partitioning, and changes to session attributes.
- Performed job abort analysis and provided STFs/LTFs for the production, extraction, BCP, and DataFlux systems.
- Handled multiple weekly, monthly, quarterly, and yearly activities for the production system, such as table partitioning, month-end runs, and statistics updates.
- Performed impact analysis for code and table structure changes implemented through change requests, ensuring defect-free delivery.
- Tuned and monitored performance at the Teradata database level using various tools, provided RCA and fixes for performance issues, and worked closely with the Teradata DBA team.
- Analyzed cases raised by end users related to data, performance, and tracking/status, and resolved them within SLA.
- Implemented UNIX shell scripts as required to reduce manual work.
- Analyzed replication from TDPROD to TDPROD2 via Golden Gate and Data Mover whenever issues occurred, and fixed the issues after analysis.
- Communicated daily with the client and onsite coordinator regarding work status and issues, and proposed solutions to implement after analysis.
- Prepared knowledge domain documentation on the modules developed, documented issues and findings in a timely manner, and trained team members and new joiners.
- Implemented data reconciliation reports and queries in the production system, via DataFlux, to assess data quality and verify that data tied out between the source systems and the enterprise warehouse (a reconciliation sketch follows this list).
- Implemented various reports in Business Objects as per business requirement.
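A minimal FastLoad script of the kind described above (the logon string, file path, and table names are placeholders), loading a pipe-delimited flat file into an empty staging table:

    /* Load a pipe-delimited file into an empty staging table */
    LOGON tdpid/etl_user,password;

    DROP TABLE edw.stg_invoice_err1;
    DROP TABLE edw.stg_invoice_err2;

    SET RECORD VARTEXT "|";

    DEFINE invoice_id (VARCHAR(20)),
           invoice_dt (VARCHAR(10)),
           amount     (VARCHAR(18))
    FILE = /data/in/invoice.txt;

    BEGIN LOADING edw.stg_invoice
          ERRORFILES edw.stg_invoice_err1, edw.stg_invoice_err2;

    INSERT INTO edw.stg_invoice (invoice_id, invoice_dt, amount)
    VALUES (:invoice_id, :invoice_dt, :amount);

    END LOADING;
    LOGOFF;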
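A sketch of the Type 2 logic in set-based SQL terms (table and column names hypothetical; the Informatica mapping implements the equivalent): expire the current row, then insert the new version.

    -- Step 1: close out the current version of any changed customer
    UPDATE dim_customer
    SET    eff_end_dt   = CURRENT_DATE - 1,
           current_flag = 'N'
    WHERE  current_flag = 'Y'
      AND  customer_id IN (SELECT customer_id FROM stg_customer_delta);

    -- Step 2: insert the new version as the current row
    -- (customer_sk assumed populated by an identity column or upstream sequence)
    INSERT INTO dim_customer
          (customer_id, customer_name, region, eff_start_dt, eff_end_dt, current_flag)
    SELECT customer_id, customer_name, region, CURRENT_DATE, NULL, 'Y'
    FROM   stg_customer_delta;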
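And a sketch of a reconciliation query tying the warehouse back to a source (names hypothetical): comparing daily row counts and amount totals, and returning only the days that do not tie.

    SELECT COALESCE(s.sale_dt, w.sale_dt) AS sale_dt,
           s.src_rows, w.edw_rows,
           s.src_amt,  w.edw_amt
    FROM  (SELECT sale_dt, COUNT(*) AS src_rows, SUM(amount) AS src_amt
           FROM   src_sales GROUP BY sale_dt) s
    FULL OUTER JOIN
          (SELECT sale_dt, COUNT(*) AS edw_rows, SUM(amount) AS edw_amt
           FROM   edw_fact_sales GROUP BY sale_dt) w
    ON     s.sale_dt = w.sale_dt
    WHERE  COALESCE(s.src_rows, 0) <> COALESCE(w.edw_rows, 0)
        OR COALESCE(s.src_amt, 0)  <> COALESCE(w.edw_amt, 0);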
Confidential
ETL Developer
Responsibilities:
- Interacted with end users to identify business requirements and translate them into logical and technical requirements.
- Worked with sources such as Oracle, SQL Server, and flat files.
- Worked extensively with active transformations such as Filter, Sorter, Aggregator, Router, and Joiner.
- Worked extensively with passive transformations such as Expression, Lookup (connected and unconnected), Sequence Generator, Mapplet Input, and Mapplet Output.
- Developed Mapplets to implement business rules using complex logic.
- Used Informatica Repository Manager to create Repositories, User Groups and Users based on their roles.
- Converted the PL/SQL Procedures and SQL*Loader scripts to Informatica mappings.
- Tuned Informatica session performance for large data files by increasing the buffer block size, data cache size, and target-based commit interval.
- Developed UNIX shell scripts to automate the FTP data transfer process to and from the source systems and to schedule weekly and monthly loads/jobs.
- Used Informatica Designer to create complex mappings using different transformations to move data to multiple databases.
- Designed and developed pre-session, post-session, and batch execution routines to run Informatica sessions using Informatica Server Manager.
- Used the Debugger to troubleshoot errors in mappings.
- Generated UNIX shell scripts for automating daily load processes.
- Managed change control implementation and coordinated daily and monthly releases and reruns.