SAP BusinessObjects Data Services Developer Resume
SUMMARY
- Over five years of experience in the IT field, with a particular focus on multiple data-specific tools
- More than two years of experience as an SAP BusinessObjects Data Services (BODS) Consultant
- Experience working on Data Migration, Data Warehousing and Data Conversion projects
- Proficient in installation and configuration of BODS, including server components (Job Server, Access Server and Profiler Server) and client components, and their connectivity to underlying databases and SAP
- Experience performing Data Profiling and ETL (Extract, Transform and Load) from SAP ECC and legacy systems
- Active participation in two end-to-end projects using Agile methodology
- Experience implementing Rapid Marts and developing Slowly Changing Dimension logic, including Type 1 and Type 2
- Data Warehousing skills include developing Staging, Dimension and Fact jobs to build an Extended Star Schema, and negotiating the job schedule with the BO Admin
- Skills in scheduling jobs using SAP BODS and TIDAL Enterprise Scheduler
- Experience writing SQL for tasks such as creating and managing DB tables, using tools like TOAD and MySQL Editor
- Created DB schemas and ODBC connections for the Local, Central and Profiler repositories with the help of the DBA
- Proficient in ETL processes involving multiple sources such as flat files, XML Schemas, DTDs and Excel workbooks
- Extensive experience as a Technical Writer creating technical guides for various data-manipulation tools
- Worked with RoboHelp HTML, the MS Office Suite and MadCap software to develop technical guides
BODS Installation
- Estimating disk space; installing MySQL Server, front end and ODBC for creating DB schemas for the Local, Central and Profiler repositories; installing BODS XI 3.2 server and client components; Local Repository configuration; activating the Central Repository from the Designer; registering users and repositories in WebAdmin; setting up Datastores, including DB and SAP Datastores
BODS Development
- Rapid Marts; batch and real-time jobs; performance optimization; ETL; data modeling and data warehousing; data conversion; data migration; creating DB tables; importing metadata into the Local Repository; repository maintenance; database queries; debugging and auditing; Add-To/Check-In/Check-Out of objects with the Central Repository; creating Projects, Jobs, Work Flows, Conditionals, Data Flows, Query and other Transforms, lookups, variables, functions and custom functions; data mapping; scheduling jobs in WebAdmin; publishing jobs as web services
Software & Technologies
- Database: MySQL Server, SQL Server and Oracle
- Platforms: Microsoft Windows, Mac OS X
- Other: MS-Office Suite, RoboHelp, MadCap Software
Confidential, Connecticut
Jan 2010 - Present
Position: ETL Developer
Projects: Warranty Data Warehouse Implementation and Enhancements
Scope: Data Migration (Full Lifecycle)
- Played an active part as a Developer in the implementation of a Data Mart, building a Type-II Extended Star Schema for the client's warranty-related data; the data volume exceeded 90 million records originating from multiple legacy and SAP ECC systems
- Participated in the requirement-gathering workshop to understand the scope of the project; identified pain points and gaps during data modeling
- Analyzed the existing Webi reports generated by mainframe systems that formed the basis for this project and discussed the additional requirements
- Analyzed the mainframe applications including PHDB & SB from their IMG Guidelines to understand the Source Tables, Source systems, File Formats, Data Types, Column Names, Field Lengths & export frequency
- Created the Source-to-Target (S2T) mapping by reviewing the IMG document for the ETL job design
- Installed and configured required software, including SAP GUI and supporting applications, and requested access to the Dev, QA and PROD environments
- Installed BODS version 12.3 and completed the configuration of Datastores, the Local Repository and connections; created a Developer profile, requested log-on credentials to connect to BODI, and imported Rapid Mart ATLs and real-time jobs from templates
- Used the Designer to create new flat-file schemas, included their path on the Job Server machine, added file names using variables, and declared the data type, column name and field length for each
- Created DB tables for Staging, Dimension and Fact, imported the table definitions into the Designer, and added and converted template tables to physical tables for intermediate and reference tables
- Built around sixty ETL jobs in BODS: used initialization scripts to declare conditions and global variables with SQL syntax; built Workflows, Conditionals and Data Flows; applied statements and clauses such as ifthenelse and WHERE; declared pre-load and post-load SQL commands; used transforms such as Query, Table Comparison, History Preserving, Hierarchy Flattening, Map Operation and Key Generation; added audit and error-management fields including OPERATION$, ROWEFFDT$, ROWTRMDT$ and RSID$; and used lookup, lookup_ext, outer joins and functions including nvl, concatenation, RAND and gen_row_num()
- Performed Data Profiling, including column analysis and relationship analysis on DB tables, to verify data integrity
- Conducted job validation, auditing and syntax checks to evaluate errors, execution time and bottlenecks
- Requested test data to load the staging tables and Data Mart
- Collaborated with data providers and data consumers on data-related issues and followed up on their resolution
- Profiled the test data and tested jobs against it to evaluate referential integrity and performance statistics; applied performance optimization and tuning techniques to push resource-intensive operations such as Sort By, Group By, lookups and joins down to the database; performed debugging using breakpoints, Try/Catch, Auto-Correct Load and auto-recovery mechanisms
- Set up the Degree of Parallelism (DOP), number of threads, number of loaders, rows per commit, push-down-to-DB technique, ascending/descending sorts, Group By, Filter By, row-by-row select, sorted input, bulk load, rollup and aggregate-aware functions
- Brought surrogate keys from the TIME_DIM Rapid Mart into the Fact jobs; added fiscal dates and calendar dates before the year 2000, loading historic fiscal calendar dates from SAP R/3 FICO into TIME_DIM
- Exported Metadata in XML to be used in designing the Universe and generating Reports
- Scheduled jobs in the Management Console and in TIDAL, maintaining parent-child relationships and dependencies, and added the file_exists function in DS
- Prepared documentation, including logical and physical architecture documents, ETL presentations and test plans, and delivered knowledge transfer (KT) to the Production Support team
Confidential, Amsterdam
Oct 2008 - Dec 2010
Position: Technical Writer
Industry: IT Solutions
Project: Marketing Tool Version 1 & 2
- Conducted meetings with the tool designer and subject matter experts to understand the scope of the tool
- Worked with the DM tool and Invenna to develop a deeper understanding of and insight into the product
- Created the Developer Guide for versions 1 and 2 of the Marketing tool, software aimed at managing data so the business could gain quick insight into sales and distribution when planning future strategies
- Remained in constant contact with the developers to improve the guides during their development, as required
- Used different skins, as requested, to give the guides a better look
- Experienced in the use of software such as RoboHelp and MadCap
- Prepared guides in various formats, such as HTML and PDF
Confidential, Peshawar
June 2007 - Oct 2008
Industry: Education
Project: Technical Writing
- Analyzed and reviewed the existing manuals that needed upgrading and modified them as required
- Duties included writing technical manuals for new users of the Language Lab and conducting workshops to explain its usage
- Worked diligently to train people new to technical writing
- Conducted presentations on technical writing and business language
- Used a projector and PowerPoint slides for the presentations
Department of Nederlands als tweede taal (Dutch as a Second Language)
NT2 Diploma
SAP Division | iData Consulting Services
SAP BusinessObjects Data Services Training
Modern Languages Department
Master of Arts in Technical Writing
Bachelor of Arts in Political Science and English