SAP BI/BW/BO Consultant Resume, CA
QUALIFICATIONS PROFILE
• Over 8 years of professional experience in the development and implementation of business applications, including SAP BW/BI 3.5/7.0/7.3, Business Objects XI 3.1/4.0, and DW/ETL processes using DataStage 7.5.2/8.1/8.5.
• Technical and functional expertise in SAP BI/BW 7.0/3.5. Delivered 4 full life cycle implementations for different modules in SAP BI/BW, covering requirements gathering, GAP analysis, business requirement documentation and business blueprint, POC, client approvals, go-live, and production support.
• 4 years of experience managing and leading teams in SAP BI, BO, CRM, and CO-PA, driving projects through the full development cycle within schedule and budget parameters, with strong business analysis skills.
• Thorough knowledge of data warehousing principles including data modeling, extract/transform/load processes, job scheduling, indexing, OLAP processing, defining and using aggregates based on queries and execution plans.
• Extensively worked on data modeling: Entity Relationship (ER), multi-dimensional/star schema, and extended star schema models; designed custom InfoObjects, InfoProviders, InfoCubes, DSO/ODS objects, MultiProviders, InfoSets, InfoSources, and source systems.
• Experience with SAP BW administration using the BW Administrator Workbench, installing Standard Business Content (SBC) objects, customizing SBC objects, custom data warehouse development, managing ETL jobs, troubleshooting failed processes.
• Involved in the complete upgrade of BW 3.5 to 7.0.
• Worked extensively on business processes across the following modules: FI/CO, MM, SD, PM, PP, SEM, BPS, and SCM.
• Proficient in the 5 extraction modes supported by SAP BI: mySAP Service API, DB Connect, XML, flat files, and extraction from 3rd-party ETL tools.
• Expertise in ETL activities such as data loading (full/delta upload), scheduling, monitoring, and process chains for automated data loads from different source systems.
• Proficient with LO, FI-GL, FI-AP, FI-AR, and CO-PA extraction, master data extraction, generic extraction, delta management, and Business Content (installation and enhancement).
• Experience designing Demand Planning with SAP APO and integrating BI with APO, covering versioning, realignment, disaggregation, and reporting.
• Worked on integration of SAP CRM with BW, created required cubes and load strategies for sales order processing.
• Experience extracting and loading data from various sources, including Oracle and Teradata databases, legacy systems, and Profitability and Cost Management (PCM) result sets, using DataStage and Informatica.
• Experience in migrating objects from BW 3.5 to 7.0.
• Expertise in performance tuning using aggregates, rollup, collapse, partitioning, compression, and the BI Accelerator.
• Used the BW Statistics cube to analyze query performance and aggregates performance.
• BW reporting using BEx Analyzer, BEx Web, and BEx Query Designer. Expertise in creating BW components such as queries, workbooks, calculated key figures, restricted key figures, structures, variables, exceptions, conditions, RRI, WAD web reports, and CSS style sheets.
• Experience in SAP BO Central Management Console (CMC), Central Configuration Manager (CCM), Universe Designer, Web Intelligence (WebI), BO Dashboards, Xcelsius dashboards, Web Intelligence Rich Client, Desktop Intelligence (DeskI).
• Experience in creating Business Objects (BO) Universes on InfoCubes and BEx queries.
• Expertise in creating Universes and reporting applications on top of various databases and coordinating the efforts of source data teams.
• Worked on analysis and development of ETL data flow design using Business Objects Data Services XI 3.x (SAP BODS). Good knowledge of SAP data migration using Data Integrator, and of data cleansing and address cleansing using the Data Quality tool.
• Extensive experience using DataStage 8.5/8.1/8.0/7.5.1/7.0/6.0 (DataStage Manager, DataStage Designer, DataStage Administrator, and DataStage Director) and developing DataStage Parallel Extender jobs using various stages such as Lookup, Join, Merge, Funnel, Remove Duplicates, Aggregator, DB2 Enterprise, Oracle Enterprise, Sequential File, Data Set, and File Set.
• Expertise in data warehousing concepts and strong hands-on experience with the entire SDLC, ETL programming using IBM WebSphere DataStage 8.x and Ascential DataStage Parallel Extender 7.x, server jobs, job sequencers, batch jobs, and other data warehouse lifecycle tasks such as data analysis, cleansing, transforming, debugging/testing, and data loading across source systems.
• Worked with DB2, Teradata, mainframe sources, Oracle feeds, flat files, and dataset operational sources. Extensive experience with Teradata and with projects handling very large data volumes.
• Experience in writing UNIX shell scripts and hands-on experience scheduling shell scripts using AUTOSYS.
• Excellent interpersonal and communication skills to work cross-functionally with other enterprise team members and ensure all applications and reporting solutions are clearly integrated and synchronized.
TECHNICAL SKILLS:
SAP Applications: SAP BI 7.3/7.0/3.5, SAP ECC 6.0, SAP XI 3.0, SAP PI 7.0/7.3, SAP CRM 7.0, SAP SRM 7.0, SCM-APO, Solution Manager 4.0/7.0/7.1
ETL Tools: Ascential DataStage 8.7/8.5/8.1/7.5.3/7.5.2/7.1/6.0/5.2/XE (Administrator, Manager, Designer, Director, Parallel Extender/Orchestrate, QualityStage/Integrity), DataStage Plug-In, Data Mining, OLAP and OLTP, Informatica
Reporting Tools: BEx Analyzer, BEx Web Application Designer, BEx Broadcaster, Crystal Enterprise, Business Objects 4.0/XI R3/6.x, Cognos 7.0/6.0/5.0
Data Modeling Tools: Erwin R8/4.1/4.0/3.5.2, Designer 2000/6i, Star/Snowflake modeling, logical and physical data modeling
RDBMS: Oracle 11g/10g/9i/8i, DB2, DB2 UDB, Teradata 13.0, MS Access, MS SQL Server 7.0/2000/2005, Informix, and Red Brick
Operating Systems: Windows NT/95/98/2000, UNIX (HP-UX, AIX), AS/400, OS/390, z/OS
Languages: Basic, PL/SQL, SQL, UNIX shell programming, C, C++, HTML, COBOL, JCL
Other Tools: SQL*Plus, Microsoft Office tools, HP Quality Center, Remedy, Citrix, AUTOSYS
EDUCATION AND CERTIFICATIONS:
• ITIL: Completed ITIL V3 Foundation level certification.
• B.E. in Electronics and Communications Engineering, Confidential, College of Engineering, Confidential, University, Hyderabad, India.
PROFESSIONAL EXPERIENCE:
Confidential, Irvine, CA
January 2011 – Present
SAP BI/BO Consultant
Environment: SAP ECC 6.0, SAP BI 7.0, FI-AR, FI-AP, FI-GL, FI-SL, MM, IM, ABAP/4, BEx Analyzer, Query Designer, Web Application Designer, BO Designer (XI R2.1, 3.1), WebI (XI R2.1, 3.1), Xcelsius 2008.
Responsibilities:
• Involved in the full life cycle implementation that includes Gap Analysis, design, development, testing, implementation and support.
• Understanding the business challenges and translating them into requirements.
• Activated Business Content for Accounts Payable (AP), Accounts Receivable (AR), Profitability Analysis (PA), General Ledger (GL), Special Ledger (SL), and Purchasing (MM) and customized it per client requirements.
• Developed write-optimized DSOs for the EDW layer and standard DSOs for the consolidation layer, and created direct-update DSOs for history data and snapshot scenario reports.
• Developed transformations with start routines, field routines, end routines, rule groups, and semantic groups, and created and maintained DTPs and InfoPackages for loading data into data targets.
• Created generic DataSources with delta functionality.
• Created a BW DataSource for SQL Server based on a SQL table and uploaded PCM data.
• Enhanced the master data and transaction DataSources with Z-fields, populated using user exits.
• Extracted data from the finance modules like AR, AP and GL and enhanced the standard data sources 0FI_AP_04, 0FI_AR_04 to meet the client requirement.
• Worked on Accounts payable cubes ZUAPC01, ZAAPC001 and accounts receivable cubes ZUARC001, ZAARC002 and created InfoSets and MultiProviders according to the requirement.
• Created InfoObjects, InfoSources, InfoCubes and developed transfer structure, update rules and transfer rules as per business requirement and extracted data from external sources.
• Created process chains for periodic upload of master data, transactional data, and text data and scheduled and monitored the data loads with delta update and full update mechanisms.
• Created Aggregates on InfoCubes to improve the query performance.
• Created BEx queries for reporting on InfoCubes, InfoSets, ODS objects, and MultiProviders.
• Created OLAP Universes on SAP data (InfoCubes/BEx queries) using the Information Design Tool.
• Created WebI reports and Crystal Reports and deployed them through InfoView with the BW universe as the source.
• Experience in creating formula variables, conditions, exceptions, exception aggregation, cell editors, customer exit variables and structures for developing complex custom reports.
• Created exception aggregations, cell editors, formula variables, RKFs, and CKFs in the queries.
• Extensively used calculations, variables, prompts and sorted inputs, drill down, slice and dice, alerts for creating business objects reports.
• Created inventory accuracy dashboards (Xcelsius) for management reporting on inventory management, as well as wages and costs reports from cost center accounting.
• Involved in the migration and deployment of the universes and reports across multiple domains.
Confidential, March 2010 – December 2010
Senior SAP BI/ BO Consultant
Environment: SAP ECC 6.0, SAP BI 7.0, SAP CRM 7.0, SAP SCM, FI, CO-PA, BEx Analyzer, Query Designer, Web Application Designer, BO Designer (XI R2.1), WebI (XI R2.1), Xcelsius 2008
Responsibilities:
• Involved in full life cycle implementation, right from requirement gathering, blue printing, development, testing, go-live and post go-live support.
• Held meetings with business users to gather business requirements and prepared the business blueprint based on those requirements.
• Extracted and uploaded data into InfoCubes from DataSources such as Actual Costs via delta extraction (0CO_OM_CCA_9), Internal Orders via delta extraction (0CO_OM_OPA_6), Accounts Receivable (0FI_AR_4), and Accounts Payable (0FI_AP_4) for the FI and CO modules from Business Content.
• Responsible for design, development, and extraction from the ECC source system and flat files to BW for FI and CO.
• Worked with functional consultants to map the source data (ECC) to target DataSources in BW.
• Created the CRM source system, installed BI Content in the BW system, enhanced CRM DataSources, and extracted data from SAP CRM.
• Worked on 0CRM_C08, 0CSAL_C01, 0CSALMC04 to generate various cross scenario and sales reports.
• Created Queries by defining rows, columns and free characteristics providing drill down functionality in reporting using BEx. Worked with users to understand their reporting needs.
• Scheduled daily/weekly/monthly data loads.
• Managed and monitored data loads with full upload and delta mechanisms.
• Created InfoSets for master data to enable querying on the data.
• Created summarized reports that include Plan/Actual variance measures, analysis of full cost by cost center, overdue analysis, and analysis of payment history.
• Developed and tested web reports with drill-down functionality, including updating update rules and transfer rules and verifying queries.
• Extensively involved in scheduling and monitoring using process chains in the Administrator Workbench.
• Monitored the production system and implemented solutions to errors and performance issues.
• Tuned queries using various techniques, including compression, aggregates, and MultiProviders.
• Developed single and multiple dashboards and scorecards using Business Objects.
• Created master/detail reports, cross tabs, and charts; applied slice and dice; and implemented different levels of hierarchies for drill-down techniques.
• Defined BO classes and objects in the Universe, defined cardinalities, contexts, joins, and aliases, and resolved loops in Universes using table aliases and contexts.
• Designed, developed, and deployed content using Universe Designer, Web Intelligence, Desktop Intelligence, InfoView, and the CMC.
Confidential, Dallas, Texas
January 2009 – February 2010
SAP BW Technical Lead
Environment: SAP ECC 6.0, SAP BI 7.0 / BW 3.5, SAP SCM, FI/CO, GL, AP, AR, SD, MM, BEx, Web Application Designer
Responsibilities:
• Gathered requirements from functional users in Sales and Purchasing for cross-functional analysis.
• Developed the data model to satisfy the various functional requirements by implementing queries and cubes using Business Content. The main cubes developed were Customer (0SD_C01), Sales Overview (0SD_C03), Delivery Service (0SD_C04), Purchasing Data (0PUR_C01), and Vendor Evaluation (0PUR_C02).
• Worked with functional consultant for mapping the source data (ECC) to target Data Sources in BI.
• Defined the initial sizing requirements based on source data volume estimates.
• Configured custom cubes, transfer rules, transfer structures, update rules, ODS objects, aggregates, queries, and workbooks.
• Customized and maintained DataSources such as Billing (2LIS_13_VDHDR), Sales Order (2LIS_11_VAHDR), and Delivery (2LIS_12_VCHDR) for the SD module and Purchasing Order Item (2LIS_02_ITM) for MM.
• Used process chains to schedule the InfoPackages and data transfer processes.
• Also customized primary LO Cockpit structures 2LIS_11_VAHDR, 2LIS_11_VAITM, and 2LIS_11_VASCL and set up the extraction process using the LO Cockpit and V3 updates.
• Designed InfoCubes in SCM APO BW for sales quantities and sales values. Created data marts for Plan/Actual comparison of sales quantities and order forecast reports.
• Also configured BW objects including generic data sources to extract data from R/3 systems.
• Worked with the users for defining the reporting needs and helped design/develop flexible queries and pre-calculated reports using BEX reporting features like variables, exception and conditions in Sales and Purchasing areas.
• Created SAP BI reports which include Plan/Actual revenue comparison, Order and sales value by sales representatives and Vendor delivery efficiency.
• Analyzed report performance for statistics and implemented proposals for aggregate creations and indexing.
• Tuned queries using partitioning, aggregates, and MultiProviders.
• Developed prototype and presented the demo session for user acceptance.
• Trained users in reporting functions.
Confidential, OH
January 2008 – December 2008
SAP BW Techno-Functional Consultant
Environment: SAP ECC 6.0, SAP BI 7.0, SD, MM, PP, FI, PCA, SCM, ABAP/4, BEx Analyzer, Query Designer, Report Designer, Web Application Designer
Responsibilities:
• Involved in the design, development, and extraction of data from the R/3 system and flat files into the BW system for the SD and MM modules using LO Cockpit extraction methods.
• Analyzed the InfoCubes and ODS objects to meet the reporting needs of the client.
• Loaded transaction data into the Business Content InfoCubes from InfoSources such as 2LIS_11_VASCL and 2LIS_11_VAHDR for sales, 2LIS_13_VDHDR and 2LIS_13_VDITM for billing, and 2LIS_12_VCHDR for deliveries.
• Experience in the creation of the setup tables for the LO Cockpit extraction.
• Enhanced the business content datasources like 2LIS_03_SCL, 2LIS_03_ITM and 2LIS_03_BF by appending structures and populated the fields using user exits.
• Created process chains for periodic upload of the master data as well as the transactional data.
• Involved in monitoring data transfers from source systems into BW using the PSA.
• Developed reports on the MM cubes to check the price variance for materials from different vendors.
• Created reports on SD cubes for displaying sales figures and incoming orders.
• Generated reports using BEx with conditions, exceptions, Restricted Key Figures and Calculated Key Figures
• Involved in the creation of the dashboards for sales, purchasing and delivery using Web Application Designer.
• Used aggregates, compression, and partitioning techniques to improve query performance.
• As part of production support, involved in debugging and monitoring load failures, update routines, transfer routines, and process chains; scheduling data loads; solving high-priority tickets such as delta load issues; planning ECC outages for filling setup tables for Logistics (LO) extractors; minimizing init loads; and setting up the V3 jobs.
Confidential, MN
February 2007 – December 2007
DataStage Developer
Environment: DataStage 7.5.1 (Parallel jobs, Server jobs), Oracle 9i, Sybase, Teradata, DB2 UDB, TOAD, SQL, PL/SQL, Visio, Shell Scripting, COBOL, JCL, MVS, DB2, UNIX, Control-M, PVCS
Responsibilities:
• Responsible for assisting in architectural design and development related to data warehouse and IBM InfoSphere Information Server platforms.
• Involved in understanding the scope of application, present schema, data model and defining relationship within and between groups of data.
• Administered DataStage security, denying access to non-authorized users.
• Maintained backups for emergency recovery.
• Allocated memory and processing resources for DataStage by configuring the nodes used for various purposes.
• Used DataStage Parallel Extender to load data, utilizing the available processors to improve job performance and managing system resources in the Orchestrate environment.
• Developed the necessary metadata repositories for all initiatives and ensured these met current needs and provided flexibility for future growth.
• Used Parallel Extender to run jobs efficiently.
• Managed migration of all job batches and related designs from Version 7.5 to 8.1, incorporating new functionality and notification capabilities. Incorporated functionality of pipeline and partitioning parallel processing using multiple processors to gain performance boost.
• Used DB2 EE stage for doing loads into DB2 UDB Database and to load large volumes of data into Teradata.
• Extensively used Lookup, Join, and Merge stages for joining data from various sources, and also used the Parallel Transformer, Column Generator, Funnel, Filter, Switch, Modify, Pivot, and Row Generator stages against DB2 Enterprise and Oracle Enterprise databases.
• Used the Metadata Repository for storing DataStage log messages.
• Used the Distributed Transaction stage to read and delete messages and to guarantee delivery of rows to a database.
• Used ProfileStage to reduce overall analysis time and to automate data profiling.
• Wrote SQL and PL/SQL procedures to facilitate coding.
• Used PL/SQL for writing procedures and triggers.
• Designed Conceptual, Logical and Physical Data Mart using Star Schema methodology using ERWIN data model.
• Involved in loading deal decisioning and deal processing data into the data warehouse. This required various complex data transformations involving a number of tables.
• Participated in source to target mapping design document, prepared system requirement documents based on data model and source target mapping documents.
• Optimized Query Performance, Session Performance and Reliability.
• Scheduled various ETL batch processes using AUTOSYS (a sample wrapper sketch appears after this list).
• Worked with Connect:Direct processes and FTP/SCP scripts for transferring files between servers and legacy systems.
• Created interfaces for SQL data access to corporate level operations codes used as dimensions in the Data Warehouse.
• Wrote UNIX scripts to create and migrate files for ad hoc client projects, executing custom programs based on project specifications.
• Scheduled ETL processes to avoid conflicts at the resource utilization level, database lock issues, and DataStage contention issues.
• Troubleshot, reloaded, and rejected records using DTS and SQL scripts.
• Performance tuning of ETL jobs.
• Performed unit testing of the developed jobs to ensure they met the requirements.
• Responsible for Planning UAT Testing.
• Used various QualityStage stages such as Standardize, Survive, Format Convert, Transfer, and Unijoin.
• Used ClearCase to store entire-project .dsx backups.
• Interacted with Report Users and Business Analysts to understand and document the Requirements and translated these to Technical Specifications for Designing Universes and Business Object Reports.
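A minimal sketch of the kind of Korn shell wrapper used for the AUTOSYS-scheduled DataStage loads mentioned above. The engine path, project, job, and parameter names are placeholders for illustration only, and the dsjob return-code mapping can differ between DataStage releases.

#!/bin/ksh
# Hypothetical wrapper invoked by an AUTOSYS job to run one DataStage batch job.
# DSHOME, PROJECT, JOB, and pRunDate are illustrative placeholders.
DSHOME=/opt/IBM/InformationServer/Server/DSEngine
PROJECT=DW_PROJECT
JOB=jb_load_deal_facts
RUN_DATE=$(date +%Y%m%d)

. $DSHOME/dsenv                                   # source the DataStage environment

# Run the job and wait; with -jobstatus, dsjob exits with the finishing job status.
$DSHOME/bin/dsjob -run -jobstatus -param pRunDate=$RUN_DATE $PROJECT $JOB
rc=$?

# Typically 1 = finished OK, 2 = finished with warnings (release dependent).
if [ $rc -eq 1 ] || [ $rc -eq 2 ]; then
    echo "$JOB completed with status $rc"
    exit 0
else
    echo "$JOB failed with status $rc" >&2
    exit 1                                        # non-zero exit lets AUTOSYS raise an alarm
fi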
Confidential, PA
February 2006-January 2007
Data Warehouse Developer
Environment: DataStage 7.0 (Parallel jobs, Server jobs), Oracle 9i, DB2 UDB, TOAD, XML Files, SQL Server 2000, SQL, PL/SQL, UNIX, Shell Scripting, COBOL
Responsibilities:
• Actively participated in the Team meetings to gather business requirements and in developing the Specifications.
• Participated in discussions with Project Manager, Business Analysts and Team members on any technical and/or Business Requirement issues.
• Worked with DataStage Manager to import/export metadata from databases and to move DataStage components between DataStage projects.
• Developed jobs using different stages such as Transformer, Aggregator, Source Dataset, External Filter, Row Generator, and Column Generator.
• Participated in design and in source-to-target mappings from sources to operational staging targets, using a star schema.
• Extensively worked on error handling, cleansing of data, creating hash files, and performing lookups for faster access to data.
• Used the Investigate stage to parse and analyze free-form and single-domain columns, determining the number and frequency of unique values and classifying or assigning a business meaning to each occurrence of a value within a column.
• Implemented the CRC32 DataStage function for change data capture.
• Used SQL coding for overriding the generated SQL in DataStage and also created stored procedures and functions using PL/SQL.
• Developed UNIX Shell scripts to automate file manipulation and data loading procedures.
• Used DataStage Director to schedule, monitor, clean up resources, and run jobs.
• Wrote UNIX shell scripts for file validation and for scheduling DataStage jobs (a validation sketch appears after this list).
• Extensively used Control-M for job scheduling.
• Extensively worked with DataStage Job Sequences to Control and Execute DataStage Jobs and Job Sequences using various Activities and Triggers.
• Participated in unit testing, functional testing, and integration testing, and provided process run times.
• Provided standard documentation, best practices, and common ETL project templates. Fine-tuned jobs/processes for higher performance and debugged critical/complex jobs.
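A small Korn shell sketch of the file-validation step described above; the feed name, directory, and trailer layout (a final "TRL|<count>" record) are assumptions for illustration rather than the actual client interface.

#!/bin/ksh
# Hypothetical pre-load validation of an inbound flat file.
IN_DIR=/data/inbound                              # placeholder landing directory
FEED=$IN_DIR/customer_feed.dat                    # placeholder feed name

# The file must exist and be non-empty.
if [ ! -s "$FEED" ]; then
    echo "Missing or empty feed: $FEED" >&2
    exit 1
fi

# Compare the detail record count with the count carried in the trailer record.
detail_count=$(grep -cv '^TRL|' "$FEED")
trailer_count=$(tail -1 "$FEED" | awk -F'|' '{print $2}')

if [ "$detail_count" -ne "$trailer_count" ]; then
    echo "Record count mismatch: $detail_count details vs trailer $trailer_count" >&2
    exit 1
fi

echo "Validation passed; $FEED released to the DataStage load."
exit 0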
Confidential, UT
June 2004-January 2006
DataStage Consultant
Environment: DataStage 6.1, DB2UDB, Flat Files, Teradata, Oracle9i, Shell scripts, Sybase, HP UNIX, Windows 2000
Responsibilities:
• Extensively worked on data extraction, transformation, loading, and analysis.
• Prepared technical design documents. Designed, developed, and tested server and parallel jobs using DataStage Designer per the technical design documents.
• Responsible for developing all third party (Discover) interface modules.
• Participated in Design and code walkthroughs.
• Performed administrator functions such as creating projects, setting tunables, protecting projects, and unlocking jobs, and installed DataStage plug-ins such as the Pivot stage and Informix plug-ins on UNIX servers.
• Used DataStage Version Control to keep track of multiple versions of DataStage jobs and to promote the latest job versions from Development to QA and Production environments.
• Performed export and import of DataStage components, table definitions, and routines using DataStage Manager.
• Extensively used parallel stages such as Join, Merge, Lookup, Filter, Remove Duplicates, Funnel, Row Generator, Modify, and Peek for development and debugging purposes.
• Knowledge of configuration files and partition techniques for Parallel jobs.
• Worked with buildops and developed custom parallel stages for handling special requirements such as auditing the company's history data.
• Developed Informix specific High Performance Loader (HPL) load/unload jobs for fast loading data from source files to tables.
• Automated the whole validation process using UNIX Shell scripts.
• Provided IT support to multiple environments like Production, UAT and QA environments.
• Coordinated with CM team to deploy ETL code to production environments.
• Worked with DBA team to improve the report processes by using proper indexes in the queries in ETL jobs.
• Worked with the solution delivery team to analyze errors that came up during conversion and to implement the business rules in ETL jobs per the solutions provided by the team.
• Documented all modules to describe program developments, logic, coding and testing.
• Used configuration management tools like ClearCase/ClearQuest for version control and migration.
• Created UNIX shell scripts in Korn shell (ksh) to automate different processes and to schedule DataStage jobs using wrapper scripts, as sketched below.
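A minimal Korn shell wrapper sketch of the kind mentioned in the last bullet: it resets a previously aborted DataStage job before re-running it. The engine path, project, and job names are placeholders, and the status-string matching is simplified and may vary by DataStage release.

#!/bin/ksh
# Hypothetical ksh wrapper: reset an aborted job if needed, then run it.
DSHOME=/opt/Ascential/DataStage/DSEngine          # placeholder engine path
PROJECT=DW_PROJECT
JOB=jb_third_party_interface

. $DSHOME/dsenv                                   # source the DataStage environment

# If the previous run left the job aborted, a reset run clears it.
status=$($DSHOME/bin/dsjob -jobinfo $PROJECT $JOB | grep "Job Status")
case "$status" in
    *FAILED*|*CRASHED*|*STOPPED*)
        $DSHOME/bin/dsjob -run -mode RESET -wait $PROJECT $JOB
        ;;
esac

# Normal run; pass the finishing status back to the calling scheduler.
$DSHOME/bin/dsjob -run -jobstatus $PROJECT $JOB
exit $?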