
SAP Data Architect Resume


SUMMARY:

  • Over 25 years’ experience deploying BI solutions, with a heavy focus on data management and Confidential from legacy systems, and on developing and delivering knowledge transfer and user adoption techniques.
  • Expertise in data migrations using AIO/BPDM Data Services into SAP ECC/NetWeaver BI 7.0 (SAP BW).
  • Experienced in migrating legacy data using LSMW recordings, batch input, IDocs, and BAPIs.
  • Experienced in developing the architecture for enterprise data warehousing using bus matrix architecture after consulting with business and IT to gather requirements.
  • Experienced in using the Kimball approach to dimensional modeling and ETL to implement optimized slowly changing dimension practices.
  • Experienced in dimensional modeling, including snowflaking and outrigger concepts, coupled with junk and role-playing dimension types. Modeling experience also included development of fact tables of the transaction, periodic snapshot, and accumulating snapshot types.
  • Uses a project management style that combines Agile and RUP for quick deployments and quality documentation, in order to deliver the customer a quality product.
  • Experienced in leading an offshore team in a matrix environment.
  • Experienced in custom reporting and analytics (Crystal Reports, Web Intelligence, Universes, Desktop Intelligence, and Xcelsius).
  • Experienced in managing metadata from legacy systems and using staging areas for planning and performance optimization in Data Services (BODI) projects.

PROFESSIONAL EXPERIENCE:

Confidential

SAP Data Architect

Responsibilities:

  • Migrated financial data from PeopleSoft/Oracle to SAP ECC.
  • Responsible for migrating banking information for virtual accounts (F3K9).
  • Responsible for migrating collections (FSCM) information such as promise-to-pay (UDM supervisor), customer-level and invoice-level notes, and three years of dunning history.
  • Utilized SAP-delivered function modules, converted them into RFC-enabled function modules, translated them into business objects (SWO2), and created IDocs from the business objects (BDBG).
  • As release management coordinator, responsible for creating ChaRM and transport requests, approving release notes, and attending RCB meetings to promote objects between environments.
  • Built development and QA BODS environments on SQL Server 2008 and 2012 with IP 4.0 SP07 Patch 02 and BODS 4.1 SP02; as part of the environment build, upgraded Tomcat 6.x to 7.0 per security requirements.
  • Configured Windows AD authentication for BODS per EY InfoSec security requirements.
  • Responsible for monitoring and tuning jobs; distributed jobs based on CPU utilization and memory consumption (see the sketch after this list).
  • Mentored offshore resources in installing and configuring BODS.
  • Upgraded BODS 4.1 SP2 to BODS 4.2 SP6 Patch 3.
  • Responsible for code migration and attending CRB meetings to obtain change approvals.
  • Responsible for loading HR org mini-master data using IDocs and custom ABAP programs.
  • Performance-tuned long-running jobs; for example, jobs running 5+ hours were tuned to run in under 1 hour.
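
As a rough illustration of the load-based distribution described above, the sketch below defers or dispatches a job depending on current CPU and memory utilization, sampled locally with psutil. The thresholds, job names, and the dispatch mechanism are hypothetical placeholders, not the actual BODS job launcher setup.

```python
# Illustrative sketch only: defer or dispatch batch jobs based on current CPU
# and memory utilization of the job server, sampled with psutil.
# Thresholds and job names are hypothetical.
import time
import psutil

CPU_LIMIT = 75.0   # percent; hypothetical threshold
MEM_LIMIT = 80.0   # percent; hypothetical threshold

def server_is_busy() -> bool:
    """Sample CPU over one second and read current memory utilization."""
    cpu = psutil.cpu_percent(interval=1)
    mem = psutil.virtual_memory().percent
    return cpu > CPU_LIMIT or mem > MEM_LIMIT

def dispatch(job_name: str) -> None:
    """Wait until the job server has headroom, then start the job."""
    while server_is_busy():
        print(f"{job_name}: server busy, waiting 60s before retry")
        time.sleep(60)
    # In practice the job would be started via the BODS job launcher or a
    # Management Console schedule; here we only log the decision.
    print(f"{job_name}: dispatched")

for job in ["JOB_FI_DELTA", "JOB_FSCM_NOTES", "JOB_DUNNING_HISTORY"]:
    dispatch(job)
```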

Confidential

SAP Data Architect

Responsibilities:

  • Migrated financial data from ECC to BW (PSA) and flat files.
  • Migrated ECC data using BW extractors.
  • Migrated BW data from DSOs to flat files using BODS.
  • Explored options for benchmarking and designed a strategic roadmap for ETL development.
  • Options benchmarked included RFC_READ_TABLE, ABAP data flows, staging data, and calling function modules.
  • Converted data produced by ABAP reports and built load data from that report output.
  • Split high-volume budget data flat files into multiple files, copying header data in BODS for budget uploading (see the sketch after this list).
  • Extracted the EKKO, EKPO, EKBE, EKET, T001, and LFA1 tables.
  • Extracted BW cost center data using extractor 0CO_OM_CCA_9.
  • Called SAP remote-enabled function modules:
  • ZKPP_FLEX_UPL_BATCH_COST and ZKPP_FLEX_UPL_BATCH_PROFIT to upload budget data.
  • ZKPP_APPL_LOG_READ_DB to read application logs by passing the required transaction code (KP06 for cost or 7KEX for profit) and a date range.
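
As a rough illustration of the budget-file splitting mentioned above, the sketch below cuts a large delimited flat file into fixed-size chunks and copies the header row into each chunk so every piece can be loaded independently. The file names and chunk size are hypothetical; in the actual project this was handled inside BODS rather than in Python.

```python
# Illustrative sketch only: split a large budget flat file into smaller files,
# repeating the header line in each output. Paths and the chunk size are
# hypothetical.
import csv

SOURCE = "budget_full_extract.csv"   # hypothetical input file
CHUNK_ROWS = 100_000                 # hypothetical rows per output file

with open(SOURCE, newline="") as src:
    reader = csv.reader(src)
    header = next(reader)            # header copied into every chunk

    chunk_no, row_count, writer, out = 0, 0, None, None
    for row in reader:
        if writer is None or row_count >= CHUNK_ROWS:
            if out:
                out.close()
            chunk_no += 1
            out = open(f"budget_part_{chunk_no:03d}.csv", "w", newline="")
            writer = csv.writer(out)
            writer.writerow(header)
            row_count = 0
        writer.writerow(row)
        row_count += 1
    if out:
        out.close()
```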

Confidential

Lead ETL/SAP Data Migration Architect

Responsibilities:

  • Developed an understanding of source systems and built data migration maps, working with functional consultants.
  • Developed extraction tools and transformation processes to complete conversions from legacy systems with cleansed data per SAP R/3 structures.
  • Key SAP conversions in the SAP IM and WM modules included:
  • Migrated PO/GR data using movement type 101 in SAP ECC, using Data Services with PORDCR and MBGMCR IDocs.
  • Migrated on-hand inventory using movement type 561.
  • Transferred stock from temporary storage locations to final storage bins using LSMW BAPI: business object BUS2017, method CREATEFROMDATA, message type MBGMCR, basic type MBGMCR03.
  • Created handling units using the Data Services IDoc HU_CREATE.
  • Defined partner settings in WE20 for HU_CREATE, PORDCR1, MBGMCR, etc.
  • Other key SAP conversions included:
  • Migrated S2-type service notifications (IW52) created using LSMW transaction recording; loaded multiple statuses and updated task settings depending on the status.
  • Migrated notification long-text comments of up to 99,000 characters, split into 132-byte records, using the standard batch direct input program /SAPDMC/SAP_LSMW_IMPORT_TEXTS in LSMW.
  • Production WIP with open pick requirements and goods issues
  • Service orders with open order status for customer returns
  • Used the RFC-enabled function module CLAF_CLASSIFICATION_OF_OBJECTS in Data Services to extract loaded S2 notification characteristics for post-load validation.
  • Used the RFC-enabled function module Z_AW_READ_TEXT in Data Services to extract long-text notifications for post-load validation.
  • Developed Data Services custom functions used in data flows to populate target columns dynamically.
  • Tuned Data Services jobs to load in multithreaded parallel mode, improving performance dramatically: with 60 threads running in parallel, jobs that took 8 to 10 hours ran in under an hour, particularly the PO load, since each PO creates 12 segments (see the sketch after this list).
  • Monitored IDocs and troubleshot conversions with all available SAP tools such as BD87, WE05, WE19, etc.
  • Extracted SAP R/3 tables via direct load and ABAP data flows for use in pre-load data validation before loading transaction data.
  • Verified legacy data loaded in SAP ECC with various SAP t-codes, such as ME23N for PO data and the corresponding GR in PO history by document reference number for sample data, HUMO for converted HUs, and LS33 for loaded HUs.
  • Built test plans and test cases for TUT/FUT and tracked defects in HP Quality Center.
  • Created equipment records for known part numbers, along with partners, equipment serial numbers, and UIDs, to check that POs and goods receipts were loaded with the correct serial number and UID for the associated parts.
  • Designed and developed custom WebI reports for reconciliation of data loaded into SAP ECC.
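
The parallel-load tuning above was done with Data Services degree-of-parallelism settings; purely as an illustration of the idea, the sketch below fans a list of PO documents out across 60 worker threads. The load_po function and document numbers are placeholders, not the actual IDoc posting logic.

```python
# Illustration only: fan PO load units out across 60 worker threads, mirroring
# the degree-of-parallelism tuning done in Data Services. load_po stands in
# for the real per-document load (e.g. posting a PORDCR IDoc).
from concurrent.futures import ThreadPoolExecutor, as_completed

def load_po(po_number: str) -> str:
    # Placeholder for the real load of one PO document.
    return f"{po_number} loaded"

po_numbers = [f"45000000{i:02d}" for i in range(100)]  # hypothetical documents

with ThreadPoolExecutor(max_workers=60) as pool:
    futures = {pool.submit(load_po, po): po for po in po_numbers}
    for future in as_completed(futures):
        print(future.result())
```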

Confidential

Lead ETL/SAP Data Migration Architect

Responsibilities:

  • Used Data Services (BODI) and Attunity against VSAM/DB2 to migrate data into an SQL Target system.
  • Configured optimal data stores and data flows for data movement between the involved systems
  • Developed jobs for data movement from mainframe DB2 and VSAM sources using Attunity CDC connectors.
  • Used Data Services (BODI) to generate data store configurations for facilitating migration from different environments (design, test, production).
  • Installed BODS 3.2 and 4.0, and upgraded 3.2 to 4.0 using BOBJ Platform Services and BOBJ Enterprise.
  • Implemented a data quality framework and configured dictionaries, rule files, and transforms.
  • Built an in-house team to develop an EDW on SQL Server using the Kimball methodology.
  • Developed functional requirements and data specifications for the ETL process.
  • Responsible for extracting data from various sources (SAP NetWeaver BI 7.0, SQL Server, flat files ...), mapping and transforming it to a target data model, and loading the data into the target.
  • Designed, documented, developed, tested, installed, and supported complex data extract, transformation, and load (ETL) projects.
  • Designed, coded, tested, and documented a central Enterprise Data Warehouse (EDW) for all kinds of operational reports (WebI), Crystal Reports, and dashboards (Xcelsius).
  • Defined database datastores and file formats to allow Data Services XI 3.2 to connect to the source or target databases and files.
  • Designed, created, and implemented Data Integrator scripts, workflows, and data flows in SAP BO Data Services XI 3.2 (ETL).
  • Performed simple and complex transformations in Data Flows.
  • Cleansed data in DI using proper data types and mappings.
  • Created workflows using the object library and tool palette to execute data flows.
  • Created, tested, and executed Data Integrator jobs using Data Services Designer to transfer data from source to target.
  • Debugged execution errors using Data Integrator logs (trace, statistics, and error) and by examining the target data.
  • Handled initial (i.e., history) and incremental loads into the target database.
  • Developed complex mappings utilizing built-in Data Integrator transforms and functions, including Reverse Pivot, Table Comparison, Map Operation, SQL, Key Generator, Case, Merge, and lookups.
  • Configured mappings to preserve existing records on update using the History Preserving transform (Slowly Changing Dimension Type 2); see the sketch after this list.
  • Used SAP BO Data Services cleansing transforms for de-duplication, householding, and address parsing.
  • Created custom functions and used built-in functions such as ifthenelse(), set_env(), and job_name().
  • Extensively created scripts and global and local variables.
  • Performed data standardization/cleansing exercises to improve the consistency and conformance of existing information.
  • Determined SAP readiness (format and structure, mandatory checks, pattern analysis).
  • Implemented data profiling to identify duplicates using BODS (custom profiling).
  • Implemented a recovery mechanism for unsuccessful batch job executions.
  • Designed star schemas, fact tables, and dimension tables; tuned Data Integrator mappings and transforms for better performance.
  • Implemented performance techniques such as table partitioning, bulk loading, and degree of parallelism.
  • Implemented data auditing and data validation techniques while designing flows for better data quality using Data Insight.
  • Created Data Quality jobs to cleanse and match data based on internal dictionaries.
  • Monitored jobs with the Data Integrator Web Administrator; wrote data validation algorithms using complex patterns for the ETL team.
  • Involved in job scheduling strategy and participated in weekly change control management meetings.
  • Migrated objects from development to test to production environments.
  • Optimized code according to the business rules and performed application SQL tuning.
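
For readers unfamiliar with the History Preserving transform referenced above, the sketch below shows the Slowly Changing Dimension Type 2 pattern it implements: an incoming row that differs from the current dimension row closes out the old version and inserts a new one. The column names and comparison logic are simplified assumptions, not the project's actual mapping.

```python
# Simplified SCD Type 2 sketch: expire the current dimension row and insert a
# new version when tracked attributes change. Column names are hypothetical.
from datetime import date

HIGH_DATE = date(9999, 12, 31)

def apply_scd2(dimension: list[dict], incoming: dict) -> None:
    """dimension rows carry natural_key, attrs, valid_from/to, current_flag."""
    current = next(
        (r for r in dimension
         if r["natural_key"] == incoming["natural_key"] and r["current_flag"]),
        None,
    )
    if current and current["attrs"] == incoming["attrs"]:
        return                                   # no change: nothing to do
    today = date.today()
    if current:                                  # change: expire the old version
        current["valid_to"] = today
        current["current_flag"] = False
    dimension.append({                           # insert the new version
        "natural_key": incoming["natural_key"],
        "attrs": incoming["attrs"],
        "valid_from": today,
        "valid_to": HIGH_DATE,
        "current_flag": True,
    })

dim = []
apply_scd2(dim, {"natural_key": "CUST001", "attrs": {"region": "EAST"}})
apply_scd2(dim, {"natural_key": "CUST001", "attrs": {"region": "WEST"}})
print(dim)  # two versions: the expired EAST row and the current WEST row
```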

Environment: SAP BusinessObjects Data Services 3.2/4.0, Data Quality, Data Insight, ERwin, SAP NetWeaver BI 7.0, flat files, data profiling, MS SQL Server 2005, MS DTS/SSIS 2005.

Confidential

Lead ETL/ Data Migration Architect

Responsibilities:

  • Used Data Services (BODS) against SQL Server 2005 to migrate data into a SQL Server 2005 DW target system.
  • Configured optimal data stores and data flows for data movement between the involved systems.
  • Created data flows using IDocs as source and target objects and for managing data mappings.
  • Delivered Data Services (BODI) jobs against SQL 2005 to implement SCD Type 1, Type 2 and Type 3.
  • Used Data Services (BODI) to generate data store configurations for facilitating migration from different environments (design, test, production).
  • Used parallelism in order to optimize various data flows within the project while also applying the appropriate cache type for optimal performance based on expected data movement.
  • Implemented the extract, transform, and load (ETL) of data into the different fact tables, and the Slowly Changing Dimension logic, in SAP BO Data Services.
  • Extensively used Business Objects Data Services to create jobs, workflows, and data flows for extracting data from the data warehouse.
  • Installed and configured Business Objects Data Services in a multi-tier environment.
  • Worked with various data sources such as flat files, Oracle, and SQL Server.
  • Used Query, SQL, Merge, Case, and Validation transforms to transform the data (as needed) before loading into the warehouse tables.
  • Worked with flat file (pipe-delimited) sources and implemented error handling routines.
  • Built a data warehouse for a medical billing reconciliation system for a group of hospitals.
  • Debugged errors using Data Integrator logs (trace, statistics, and error) and by examining the target data.
  • Used date, string, and database functions to manipulate and format source data as part of data cleansing; implemented history preservation using Query, Table Comparison, History Preserving, and Key Generation transforms in warehouse/dimension workflows.
  • As part of optimization, used Map Operations to route UPDATE and INSERT records in warehouse workflows.
  • Created the necessary indexes (for fields in the WHERE clause).
  • Implemented an incremental load process (see the sketch after this list).
  • Implemented the logic using persistent cache.
  • Chose the right transforms and the right design in data flows.
  • Configured batch jobs and set repository schedules on UNIX boxes using Exceed 2008.
  • Handled the cross schema (Stage, Warehouse, Data Mart) challenges between the different environments (Development, QA and Production) by granting permissions on tables using database functions in scripts before running work flows.
  • Developed and designed Data Marts in Star and Snow-Flake Schema. Published reports onto the Business Objects Enterprise XI R2, schedule periodic batch generation of reports including utilizing multiple distribution paths such as email, FTP, Info View and others.
  • Performed both Unit and System integration testing of the Data Marts.
  • The documents generated during the process included requirements specifications, design documents, project planning, change history, unit testing documents, and system testing documents.
  • Involved in extracting structured or unstructured data from databases or Flat Files to process and cleanse and remove duplicate entries.
  • Performance improvement with parallelization and grid computing.
  • Perform data quality analysis, standardization and validation, and develop data quality metrics.
  • Responsible for performance tuning mappings
  • Expert in Business Objects Data Quality processing, including data cleansing, matching, data profiling, and data enhancement.
  • Developed and tested DQ data quality processes for address standardization and geocoding.
  • Designed a consolidated data model for address processing across multiple data sources containing over a billion records; generated Data Quality export reports and exported all specified job reports at once upon execution.
  • Designed data model for normalized data matching results
  • Designed and built process flows for assigning name standardization keys to increase performance for matching on names.
  • Developed post-processing capabilities to enhance data after it is inserted into Data Warehouse.
  • Used Business Objects Data Insight XI to gain visibility into data through powerful profiling and reporting capabilities, and to drive ongoing data quality improvement through continuous monitoring.
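
As a plain-Python illustration of the incremental-load routing mentioned above (the Table Comparison plus Map Operation pattern), the sketch below splits incoming rows into insert and update sets by comparing them against the existing warehouse image; the key and column names are assumptions made for the example.

```python
# Illustration of incremental-load routing done with Table Comparison and Map
# Operation transforms: compare staged rows to the warehouse image and route
# each row to an INSERT or UPDATE set. Keys and columns are hypothetical.

def route_incremental(warehouse: dict, incoming: list[dict]):
    """warehouse: {business_key: row}; incoming: staged delta rows."""
    inserts, updates = [], []
    for row in incoming:
        key = row["claim_id"]                       # hypothetical business key
        existing = warehouse.get(key)
        if existing is None:
            inserts.append(row)                     # new key -> INSERT
        elif existing != row:
            updates.append(row)                     # changed row -> UPDATE
        # identical rows are discarded, as the Table Comparison would do
    return inserts, updates

warehouse = {"C100": {"claim_id": "C100", "amount": 250.0}}
staged = [
    {"claim_id": "C100", "amount": 275.0},          # changed -> update
    {"claim_id": "C101", "amount": 90.0},           # new -> insert
]
ins, upd = route_incremental(warehouse, staged)
print(len(ins), "inserts,", len(upd), "updates")
```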

Environment: Business Objects Data Services 3.2, Data Quality, Data Insight, ERwin, Crystal Reports XI R2, Business Objects Reports, Web Intelligence 6.5, Business Objects Enterprise XI, Business Objects Designer, UNIX, Exceed 2008, and Citrix.

Confidential

Lead ETL/ Data Migration Architect

Responsibilities:

  • Used Data Services (BODI/Q) in a Confidential from VSAM, using the mainframe interface and the Attunity connector for file access of various sizes.
  • Delivered Data Services (BODI/Q) jobs that processed flat files from VSAM, using COBOL copybooks for smaller files and the mainframe interface for larger files (see the sketch after this list).
  • Used Data Services (BODI/Q) to implement and configure Universal Data Cleanse transforms to cleanse operational data before loading into SQL Server based datamarts.
  • Created and managed data dictionaries, including cloning, merging, and backing up, for use with data cleansing transforms.
  • Enhanced the quality of operational data through deployment of data quality jobs
  • Migrated data from mainframe to datamarts in Teradata
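
The VSAM extracts above were read through COBOL copybooks; the sketch below shows the general idea in Python, slicing fixed-width records according to a hypothetical copybook layout. The field names, offsets, record length, and file name are invented for illustration and do not reflect the actual copybooks.

```python
# Illustrative only: parse fixed-width VSAM export records according to a
# (hypothetical) COBOL copybook layout. Field names, offsets, and the file
# name are invented for the example.
LAYOUT = [              # (field name, start offset, length) per the copybook
    ("policy_no", 0, 10),
    ("holder_name", 10, 30),
    ("premium_cents", 40, 9),
    ("status", 49, 1),
]
RECORD_LEN = 50

def parse_record(record: bytes) -> dict:
    row = {}
    for name, start, length in LAYOUT:
        # cp037 is an EBCDIC code page commonly used for mainframe extracts
        row[name] = record[start:start + length].decode("cp037").strip()
    return row

with open("vsam_export.dat", "rb") as f:   # hypothetical EBCDIC extract file
    while True:
        rec = f.read(RECORD_LEN)
        if len(rec) < RECORD_LEN:
            break
        print(parse_record(rec))
```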

Confidential

Lead Data/Business Analyst

Responsibilities:

  • Interviewed stakeholders and end users to gather requirements
  • Supported the project manager and led various teams on the project
  • Translated requirements from functional design to implement business rules and standardization into logical design for development team
  • Documented the specifications of the ETL Confidential processes including statistics of profiled data, and data mappings
  • Extensively used ETL to load data to Oracle database.
  • Drove the project as well as managed the development team.
  • Provided quality documentation including both functional and technical designs, ETL processes, Impact and Lineage analysis.

Environment: Quest TOAD, SQL Server 2005

Confidential

Lead ETL/ Data Migration Architect

Responsibilities:

  • Used Data Services (BODI/Q) in a Data Migration from IBM AS400 to datamarts on SQL Server.
  • Delivered Data Services (BODI/Q) jobs that processed flat files from AS400 against validation transforms containing business and standardization rules, to begin profiling the data while it was being staged (see the sketch after this list).
  • Configured the data profiler in BODI by registering the repository with the administrator and designer to profile staged data, and benchmark the quality of data.
  • Used Data Services (BODI/Q) to implement and configure Universal Data Cleanse transforms to cleanse operational data before loading into SQL Server based datamarts
  • Created and managed data dictionaries, including cloning, merging, and backing up, for use with data cleansing transforms.
  • Delivered data from the staging area to datamarts for finance and HR.
  • Achievements:
  • Enhanced the quality of operational data through deployment of data quality jobs
  • Provided a solution for Confidential and improved data quality immediately, while creating a reusable process the customer can continue to use to improve the quality of corporate data.
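
As a sketch of the validation-transform profiling described above, the snippet below applies simple business/standardization rules to staged rows and tallies pass/fail counts per rule. The rules and field names are hypothetical stand-ins for the actual validation transforms.

```python
# Sketch only: apply simple validation rules to staged AS400 rows and collect
# pass/fail counts per rule, similar in spirit to profiling staged data with
# Validation transforms. Rules and field names are hypothetical.
from collections import Counter

RULES = {
    "vendor_id_not_blank": lambda r: bool(r.get("vendor_id", "").strip()),
    "amount_is_positive":  lambda r: r.get("amount", 0) > 0,
    "state_is_two_chars":  lambda r: len(r.get("state", "")) == 2,
}

def profile(rows: list[dict]) -> Counter:
    stats = Counter()
    for row in rows:
        for name, rule in RULES.items():
            stats[(name, "pass" if rule(row) else "fail")] += 1
    return stats

staged = [
    {"vendor_id": "V001", "amount": 120.5, "state": "SC"},
    {"vendor_id": "",     "amount": -3.0,  "state": "South Carolina"},
]
for (rule, outcome), count in sorted(profile(staged).items()):
    print(f"{rule:22s} {outcome}: {count}")
```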

Confidential, Greenville, SC

Facilitator

Responsibilities:

  • Provided hands on instruction for deploying data services.
  • Delivered instruction on configuring repositories and job server for multiple developers.
  • Guided hands-on development of batch jobs through building workflows and data flows, using Korn shell scripts for job execution on UNIX boxes, and developing data models using star, snowflake, and complex snowflake schema designs.
  • Demonstrated use of the Management Console Administrator for immediate and scheduled job execution, in order to avoid conflicts with other applications that may be sourcing data from BODI datastores.
  • Explained strategies for migration from the design through production phases while configuring multi-user environments and implementing version control with use of central repositories.
  • Demonstrated profiling, which consisted of employing statistical analysis while submitting profiling tasks to benchmark data before the cleansing process using the Data Services profiler. Also taught configuration of transforms with business rules, directories, rule files, and dictionaries for parsing and cleansing data before loading into the target system.
  • Achievements:
  • Established efficient practice of migration of data from mainframe into datamarts

Confidential, Columbia, SC

Facilitator

Responsibilities:

  • Provided hands-on instruction for creating and deploying Xcelsius dashboards.
  • Delivered instruction on creating complex Crystal Reports designs to emulate the look and feel of a pivot table/crosstab.
  • Demonstrated ad hoc reporting techniques to end users using Web Intelligence.
  • Explained design and development techniques of Universes
  • Translated requirements into designing and development of Business Objects Universes for end user reporting from Financial and HR DataMarts.
  • Experienced building and deploying complex reports involving formulas, dynamic parameters, crosstabs and sub-reports
  • Developed dashboards using Xcelsius
  • Added business value to the existing platform by utilizing existing infrastructure while establishing efficient financial reporting solutions; optimized and tuned the network by increasing the number of packets transferred and data sources through higher I/O levels, and created indexes on tables.
