
SAP Data Migration Consultant Resume


Profile

  • 7 years of IT experience, including more than 2 years of expertise in data modeling technology and 4+ years of hands-on specialization in SAP BusinessObjects Data Services/Data Integrator and Data Quality tools.
  • Developed and supported Extraction, Transformation and Load (ETL) processes using SAP BusinessObjects Data Services to populate tables in data warehouses and data marts.
  • Worked on data migration from legacy systems to SAP systems.
  • Performed data migration using SAP Best Practices as well as LSMW.
  • Strong hands-on experience with SAP BODS 3.2 as well as 4.0.
  • Developed complex mappings in SAP BODI to load data from various sources into the data mart, using transformations such as Query, Table Comparison, Merge, Pivot, Reverse Pivot, Lookup, Key Generation, History Preserving, Date Generation, Map Operation and Validation.
  • Extensive experience using the LSMW tool, involving frequent interaction with end users on design and functional requirements.
  • Proficient in using SAP BusinessObjects Data Services/Integrator to load data from flat files (both fixed-width and delimited) as well as from relational databases such as Oracle (SQL, PL/SQL) and SQL Server.
  • Good hands-on experience with SAP BO administration, including configuration, installation and maintenance of SAP BO 3.1, 3.2 and 4.0.
  • Data modeling expertise using Star Schema/Snowflake modeling, OLAP/ROLAP tools, and fact and dimension table design; physical and logical data modeling using ERwin 4.0/3.x and Oracle Designer; knowledge of data mart and data flow modeling.
  • Experienced with Data Quality Management, Metadata Management, Master Data Management, Process Modeling, Data Dictionary, Information Stewardship, Data Profiling, Data Quality, and Data Model Standards.
  • Experienced in working with the Data Resource Management team (DRM), Enterprise Metadata Repository team (EMR), Corporate Data Dictionary team (CDD) and Integrated Governance Board (IGB) on data quality and data integration of enterprise data assets.
  • Experienced with the data cleansing process using SAP BO, including data auditing, workflow specification/auditing, and post-Data-Services processing and controlling.
  • Extensively worked on the SAP BO Data Integrator and SAP Data Services platforms.
  • Consolidated and audited data from disparate tools and sources, including SAP business intelligence (BI), extract-transform-load (ETL) tools, relational databases, modeling tools and third-party data, into a single repository.
  • Experience in installation and configuration, creation of repositories and security domains with user groups, and security implementation.
  • Involved in identification of Facts, Measures, Dimensions and hierarchies for OLAP models.
  • Substantial development experience creating stored procedures, packages, triggers and functions in SQL and PL/SQL, along with query optimization.
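
The Lookup and Validation transforms named above can be approximated outside BODS. Below is a minimal, illustrative Python sketch of that pattern; the field names (material_id, base_uom) and reference data are hypothetical examples, not taken from any actual project mapping.

```python
# Sketch of a BODS-style Lookup + Validation step in plain Python.
# Field names and reference values are hypothetical.

def lookup(reference, key, default=None):
    """Mimic a lookup transform: return the reference value for a key."""
    return reference.get(key, default)

def validate(row, reference_uoms):
    """Mimic a Validation transform: collect rule failures for a row."""
    errors = []
    if not row.get("material_id"):
        errors.append("missing material_id")
    if lookup(reference_uoms, row.get("base_uom")) is None:
        errors.append("unknown base_uom")
    return errors

reference_uoms = {"EA": "each", "KG": "kilogram"}
rows = [
    {"material_id": "M-100", "base_uom": "EA"},
    {"material_id": "", "base_uom": "XX"},
]
# Route rows to pass/fail outputs, as the Validation transform does.
passed = [r for r in rows if not validate(r, reference_uoms)]
failed = [r for r in rows if validate(r, reference_uoms)]
```

In BODS the failing rows would flow to the transform's fail output for remediation; here they simply land in a separate list.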

SKILL SET:
Technical:
Data Warehousing: SAP BusinessObjects Data Services XI 3.2 (12.2.0, BO XI R3.1), BODI 11.7/11.5/6.5/6.1
OLAP Tools: Business Objects Developer Suite 5.1/6.5, BO XI R/R2, Business Objects Crystal Reports XI R3/R2, Business Objects Web Intelligence XI R2, SQL Developer and other reporting apps
Databases: Oracle 11g/10g/9i, DB2, MS SQL Server 2010/2008/2005/2000, SAP
Languages: SQL, PL/SQL, C, Java
Operating Systems: UNIX, Windows 95/98/00/NT/XP/Vista
GUIs: Developer 2000
Other Tools: Toad 6.3, SQL Developer, SQL*Loader
Data Modeling: ERwin 4.0/Model Manager, Oracle Designer

1. Date: June 2010 to September 2012.

Client: Confidential
Role: SAP DATA MIGRATION CONSULTANT

PROJECT SCOPE: Confidential is a clinical diagnostics and quality diagnostics manufacturing company that requires clean, accurate data in its SAP client. Data coming from plants across North America, Europe and Singapore needs to be consolidated into a single client, and all of it must be profiled and cleansed before loading. The implementation allowed zero tolerance for duplicate records. After repeated profiling, the cleansed data was moved from legacy to ECC using Best Practices and LSMW.

Responsibilities:

  • Worked extensively on all tier-one and tier-two objects, including the material master and the tier-two objects corresponding to each tier-one object.
  • Worked extensively on creating extract, in-scope and transform jobs.
  • Built extract jobs to pull data from the different source systems into staging.
  • Built in-scope jobs to extract only the data required by the business.
  • Worked extensively on transform jobs to complete the three required steps:
  • Remediation
  • Data quality management and data profiling
  • Deduplication.
  • Worked extensively to complete remediation using built-in as well as custom functions.
  • Completed data quality management using Information Steward and performed extensive data profiling.
  • Verified duplicate records against the parent objects in order to remove all duplicates.
  • Worked extensively to deliver 100 percent clean data, with zero tolerance for duplicate records, as input to the AIO jobs.
  • Worked extensively with Best Practices to push the data into SAP using IDocs.
  • Extensive knowledge of mapping, validation, lookup and enrichment for pushing data using AIO.
  • Extensive troubleshooting experience resolving defects and loading data in the DEV, QA and PROD environments.
  • Moved jobs between different mock cycles and resolved defects to keep loads running and achieve 100 percent clean results.
  • Worked extensively with SAP Information Steward.
  • Worked extensively with Oracle SQL Developer for backend testing.
  • Participated in extensive defect-resolution meetings and live troubleshooting sessions with the data team lead, SMEs and the rest of the team.
  • Held extensive meetings with business stakeholders to gather business requirements, deadlines and effort estimates.
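
The zero-tolerance deduplication described above can be sketched as a key-based match. This is an illustrative Python sketch only; the match key (normalized name plus city) and the sample records are invented for the example.

```python
# Minimal sketch of key-based deduplication with zero tolerance for
# duplicates. The business key used here is hypothetical.

def match_key(record):
    """Normalize fields so trivially different duplicates collide."""
    name = record["name"].strip().upper()
    city = record["city"].strip().upper()
    return (name, city)

def deduplicate(records):
    """Keep the first record per key; collect the rest for remediation."""
    seen, clean, duplicates = set(), [], []
    for rec in records:
        key = match_key(rec)
        if key in seen:
            duplicates.append(rec)
        else:
            seen.add(key)
            clean.append(rec)
    return clean, duplicates

records = [
    {"name": "Acme Corp ", "city": "Berlin"},
    {"name": "ACME CORP", "city": "berlin"},
    {"name": "Globex", "city": "Singapore"},
]
clean, duplicates = deduplicate(records)
```

In the actual project this check ran against parent objects before the AIO load; the sketch only shows the keep-first/route-rest mechanic.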

ENVIRONMENT: Windows, SAP BODS 4.0, SAP Information Steward 4.0, HPQC, Oracle 11g.

2. Date: December 2008 to April 2010.
Client: Confidential Mason MI.
Role: SAP Data Migration using Best Practices and LSMW.

Description:

Confidential is a leading container manufacturing company whose vendors are located all over the world. This project involved loading master data from different sources into SAP using Best Practices as well as LSMW. The enterprise project mainly concentrated on data quality, data migration and data integration.

Responsibilities:

  • Involved in the complete life cycle of a data migration from DEV to PROD.
  • Involved in data migration from legacy systems to SAP using the AIO methodology as well as LSMW.
  • Extensively used SAP Best practices to load the data.
  • Worked with high-, medium- and low-complexity objects while loading the data.
  • Involved in meetings with the process team to help them work effectively and accurately complete the mapping of data from the legacy system to the SAP system.
  • Involved in technical design of all the objects that are involved with SAP AIO methodology and LSMW.
  • Worked extensively with data profiling tools (SAP Information Steward 4.0) on the ongoing project to understand the quality of data before loading into SAP.
  • Worked extensively with workflows, data flows, try/catch blocks, lookups and validation.
  • Worked extensively in loading of master data from the legacy system to the SAP system.
  • Involved in data profiling, such as source column profiling and detailed profiling, and used Validation and Address Cleansing transformations via Data Quality in SAP BODS 3.2 and 4.0.
  • Involved in meetings with the process team in order to implement business rules and in scoping for the objects.
  • Demonstrated proven experience in understanding business requirements and delivering solutions to the client.
  • Involved in writing the custom functions as per business requirements for loading the data into SAP.
  • Worked extensively with global variables and parameters on BODS 4.0.
  • Identified bugs in existing mappings by analyzing the data flow and evaluating transformations; fixed the bugs and redesigned the mappings to improve performance.
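
The source-column profiling mentioned above can be sketched as a per-column statistics pass in the spirit of Information Steward. The column names and sample rows below are invented for illustration; real profiling ran inside Information Steward 4.0.

```python
# Rough sketch of source-column profiling: null counts, distinct
# counts and min/max per column. Column names are hypothetical.
from collections import defaultdict

def profile(rows):
    """Return simple per-column quality statistics."""
    stats = defaultdict(lambda: {"nulls": 0, "values": set()})
    for row in rows:
        for col, val in row.items():
            if val in (None, ""):
                stats[col]["nulls"] += 1
            else:
                stats[col]["values"].add(val)
    return {
        col: {
            "nulls": s["nulls"],
            "distinct": len(s["values"]),
            "min": min(s["values"]) if s["values"] else None,
            "max": max(s["values"]) if s["values"] else None,
        }
        for col, s in stats.items()
    }

rows = [
    {"vendor_id": "V1", "country": "US"},
    {"vendor_id": "V2", "country": ""},
    {"vendor_id": "V2", "country": "DE"},
]
report = profile(rows)
```

A report like this is what drives the decision, described above, of whether data is clean enough to load into SAP.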

Environment: Windows 7, SQL Server 2008, DB2, SAP BODS 4.0, SAP Information Steward 4.0.

3. Date: March 2007 to November 2008. 
Client: Confidential, Raleigh, North Carolina.
Role: Data Migration, ETL Development/Security.

Description:

The Common Access Platform – Data Foundation project lays the technical foundation for information delivery to senior management and all decision makers in the organization in order to allow for timely forward-looking decisions that will enable this organization to increase revenue, minimize loss of production, reduce operating and maintenance costs, and improve productivity.

The recommendation is to create a single repository; the “single source of truth”; and to assemble a technical platform that consists of Business Object components and SAP BW connectivity, information delivery (presentation) architecture. This technical platform will be called the Common Access Platform (CAP). Data governance practices will be implemented to ensure ongoing data quality.

This single repository will be populated with the following data subject areas, prioritized by the operations VPs and the COO:

  • Production (Daily, Weekly, Monthly)
  • Revenue (Daily Estimated / Monthly Actual / Weekly Estimated)
  • Cost of Sales (Daily Estimated / Monthly Actual / Weekly Estimated)
  • Gross Margin (Daily Estimated / Monthly Actual / Weekly Estimated)

The data from various corporate systems such as SAP, PDS, APBS billing, ZE Power database, etc. will be manipulated and transformed into complex database structures in this new data warehouse environment.

Responsibilities:

  • Involved in data migration from various corporate systems and interacted with end users and functional consultants.
  • Extensively used LSMW to obtain data from various non-SAP legacy systems.
  • Transformed data from the source systems to the target systems per the business requirements.
  • Involved in data profiling like source column data profiling, detail data profiling and used validation and address cleansing transformations by using data quality in SAP BODS 3.2, 4.0.
  • Involved in administrative tasks like Repository Configuration, Job Server configuration, Central Repository configuration, Job Scheduling, Monitoring.
  • Involved in migration of jobs and workflows from Development to Test and Production servers to perform integration and system testing.
  • Good knowledge of all kinds of lookups (lookup, lookup_ext and lookup_seq).
  • Created Workflows using object library and tool palette to execute data flows.
  • Involved in SAP BOBJ to BW integration.
  • Involved in BW Backend Development, Design and Development of SAP BW front end queries.
  • Designed and developed simple and complex transformations in Data Flows.
  • Cleaned data in SAP DI using proper data types and mappings.
  • Experience implementing recovery mechanisms for unsuccessful batch job executions.
  • Extensively worked with local repositories, central repositories, the Job Server and Web Admin client tools.
  • Created custom functions to make the code reusable.
  • Created reusable components such as workflows, data flows, and batch and real-time jobs.
  • Worked with Database functions in Scripts.
  • Extensively worked with SQL Server applications such as SSIS, stored procedures, SQL querying, T-SQL and OLTP.
  • Extensively worked on performance tuning of the jobs.
  • Extensively used try/catch blocks to handle exceptions and wrote scripts to automate the job process.
  • Good knowledge of the Web Admin/Management Console (scheduling and monitoring jobs).
  • Identified bugs in existing mappings by analyzing the data flow and evaluating transformations; fixed the bugs and redesigned the mappings to improve performance.
  • Involved in Analysis of data formats and availability.
  • Involved in Design of Extract, Transform and Load functionality.
  • Involved in Design of data models and data mart constructs.
  • Design of a dashboard capable of displaying reconciliation exceptions and sending alerts.
  • Involved in high-level design and low-level design documents, unit and integration test case preparation, and performance improvement of the ETL.
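
The recovery mechanism for failed batch jobs mentioned above can be sketched as try/catch handling plus a checkpoint, so a re-run skips steps that already succeeded. This is an illustrative analogue of BODS recovery behavior in Python; the step names and failure are hypothetical.

```python
# Sketch of batch-job recovery: try/catch around each step, with a
# checkpoint set so a re-run resumes after the last successful step.

def run_job(steps, checkpoint):
    """Run named steps in order, skipping ones already checkpointed."""
    for name, fn in steps:
        if name in checkpoint:
            continue  # recovery: skip steps completed on a prior run
        try:
            fn()
            checkpoint.add(name)
        except Exception as exc:
            # In BODS this would land in a catch block that logs the
            # error; here we simply stop and report the failing step.
            return False, name, str(exc)
    return True, None, None

def extract():
    pass  # stand-in for pulling source data into staging

def transform():
    pass  # stand-in for applying mappings and validations

def load():
    raise RuntimeError("target unavailable")  # simulated failure

checkpoint = set()
steps = [("extract", extract), ("transform", transform), ("load", load)]
ok, failed_step, _ = run_job(steps, checkpoint)
```

After fixing the target, calling `run_job(steps, checkpoint)` again would skip extract and transform, because both are checkpointed, and retry only the load.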

Environment: Business Objects Data Services 3.2 (12.2.2)/4.0, SAP BW, Oracle, SQL Server 2005, SQL, PL/SQL, Windows XP, LSMW.

4. Date: Oct 2005 – Feb 2007 
Client: Confidential.
Role: Data Analyst/ Modeler.
Responsibilities:

  • Data Modeling, Data Analysis for OLTP and OLAP systems.
  • Logical & Physical Data Modeling, which involved:
  • ER Modeling - Developing Entity Relationship Diagrams (ERD).
  • Normalizing the tables/relationships to arrive at effective Relational Schemas.
  • Identifying the facts & dimensions; grain of fact, aggregate tables for Dimensional Models.
  • Developing Snowflake Schemas by normalizing the dimension tables as appropriate.
  • Implementation of Business Rules in the Database using Constraints & Triggers.
  • Dimensional Data Modeling to deliver Multi-Dimensional STAR schemas.
  • Requirements & Business Process Analysis; Rapid Audit of the requirements and existing systems.
  • Developed, Implemented & Maintained the Conceptual, Logical & Physical Data models.
  • Design & Implementation of Data Mart; DBA coordination; DDL & DML Generation & usage.
  • Metadata & Data Dictionary Management; Data Profiling; Data Mapping.
  • Normalization techniques to arrive at effective Relational Schemas.
  • Applied Data Governance rules (primary qualifier, Class words and valid abbreviation in Table name and Column names).
  • Involved in capturing data lineage, table and column data definitions, valid values and other necessary information in the data models.
  • Documented all application information and saved it for future reference.
  • ETL Design & Implementation - Data Extraction, Transformation & Loading (using Oracle Warehouse Builder, SQL & PL/SQL).
  • Performance Tuning (Database Tuning, SQL Tuning, Application/ETL Tuning)
  • Created and altered SQL statements before sending database change requests to the DBA team.
  • Maintained and documented all CREATE and ALTER SQL statements for every release.
  • Designing Data Flows & System Interfaces.
  • Architecting Work Flows, Activity Hierarchy & Process Flows; Documenting using Interface Diagrams, Flow Charts & Specification Documents.
  • Coordinating with DBA team to implement physical models & to setup development, test, staging & production environments for MGP; BPR Management; DDL & DML Generation & usage.
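
The fact/dimension design work listed above can be illustrated with a minimal star schema. The sketch below uses Python's sqlite3 for portability; the table names, columns and sample values (dim_date, dim_plant, fact_production) are invented for the example, not taken from the actual data mart.

```python
# Minimal star-schema sketch: one fact table keyed to two dimension
# tables, plus a typical star-join aggregation. Names are hypothetical.
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.executescript("""
CREATE TABLE dim_date (date_key INTEGER PRIMARY KEY, cal_date TEXT);
CREATE TABLE dim_plant (plant_key INTEGER PRIMARY KEY, plant_name TEXT);
CREATE TABLE fact_production (
    date_key INTEGER REFERENCES dim_date(date_key),
    plant_key INTEGER REFERENCES dim_plant(plant_key),
    units_produced INTEGER
);
""")
cur.execute("INSERT INTO dim_date VALUES (20080101, '2008-01-01')")
cur.execute("INSERT INTO dim_plant VALUES (1, 'Raleigh')")
cur.execute("INSERT INTO fact_production VALUES (20080101, 1, 500)")
cur.execute("INSERT INTO fact_production VALUES (20080101, 1, 250)")

# Star-join: aggregate the fact at the grain of the joined dimensions.
cur.execute("""
SELECT p.plant_name, d.cal_date, SUM(f.units_produced)
FROM fact_production f
JOIN dim_plant p ON p.plant_key = f.plant_key
JOIN dim_date d ON d.date_key = f.date_key
GROUP BY p.plant_name, d.cal_date
""")
result = cur.fetchall()
```

Choosing the grain of the fact table (here, plant per day) is the "grain of fact" decision mentioned in the bullets above; the dimensions carry the descriptive attributes.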

Environment: Oracle 11g/ 10g, Teradata V2R6, Oracle Warehouse Builder, SQL, PL/ SQL, Sybase Power Designer 12.5/ 15, SQL Server 2000/ 2005, UNIX, Toad, Business Objects, MS office, MS Access.
