SAP BODS Lead ETL Developer Resume
Denver, CO
SUMMARY:
- Overall 8 years of experience as an SAP BODS ETL Developer
- Expertise in Business Objects ETL tools: Data Services (BODS) and Data Integrator (BODI)
- Extensive experience with modeling Data Warehouse using Ralph Kimball’s dimensional modeling, star and snowflake schema approach
- Extensive experience in development of OLAP systems using BO
- Experience in deploying Business Objects and Web Intelligence reports from various sources
- Good experience with direct user interaction in requirements gathering, development and acceptance
- Extensive experience with Oracle CDC and GoldenGate as a source database for handling change data capture
- Worked with Salesforce tables as both source and target in BODS jobs
- Developed a large number of jobs, data flows, workflows, query transforms, scripts, etc. in Data Integrator
- Experience working on SAP EIM suite including Data Services, Data Quality and metadata management
- Experience in data profiling using Information Steward and implementing validation rules
- Extensive experience in working with heterogeneous data sources like RDBMS, Web Services, Flat files, XML Files, Excel workbooks, Salesforce, SAP ECC, SAP R/3
- Experience using the Management console to monitor performance and scheduling of jobs
- Experience working in a multi-user environment with local repositories and secure central repositories to handle code migration and change management in BODS
- Extensive experience as a Production Support Developer
- Designed and implemented CDC and ODS specifications in Data Integrator
- Implemented outbound messages using XML in Data Integrator
- Very proficient with the metadata repository of Data Integrator
- Hands-on experience in tuning mappings and in identifying and resolving performance bottlenecks at various levels, such as sources, targets, and mappings
- Experience with full lifecycle activities with Data Integrator - ETL design, modeling and scheduling
- Hands-on experience with UNIX shell scripting
- Experience with T-SQL (SQL Server) and PL/SQL (Oracle) programming
- Hands-on experience with SQL Server & Oracle administration
- Possess excellent Problem Solving, Investigative, and Interpersonal skills
TECHNICAL SKILLS:
Programming: C, C++, JAVA, SQL, Stored Procedures, SAP BODS scripting
Database: Oracle 7.x/8i/9i/10g/12c Exadata, MS SQL Server 7.0/2000/2005/2008
Web: Oracle 9i AS, PostgreSQL 9.2, WebLogic, ASP.NET
Operating Systems: UNIX (HP-UX), Windows 3.1/9x/NT/2000/ME/XP/7, MS-DOS, Mac OS
Business Intelligence: Dimensional Modeling, BO Data Services 4.2/3.x, Business Objects 5.1.6/6.1/XI R2, Web Intelligence, SAP HANA, SAP R/3, Informatica PowerCenter 9.1
Agile Development: JIRA, RALLY
Database Tools: Erwin, Rapid SQL, TOAD, Oracle SQL Developer
Tools: & Utilities:SQLPLUS, Vi Editor
Others: Microsoft products (Word, Excel, PowerPoint, Outlook), Visio, MS Access
PROFESSIONAL EXPERIENCE:
Confidential, Denver CO
SAP BODS Lead ETL Developer
Responsibilities:
- Led the design and development of BODS jobs to load changed data into fact tables from Oracle GoldenGate tables
- Implemented Oracle GoldenGate as part of the Exadata migration and transition from Oracle CDC
- Designed a dimensional model to integrate warehouse data from JP Morgan
- Used Informatica PowerCenter extensively to create repositories, workflows, and data mappings
- Worked with data quality transforms like address cleanse, data cleanse and match
- Performed rigorous tests to check data correctness in the Salesforce environment
- Optimized jobs to execute faster and leverage the available memory
- Worked with SAP ECC and non-SAP sources in the ETL code
- Worked with SAP R/3 data flows to pull data from SAP tables using Data Transport
- Worked with scripts that handle Oracle CDC subscriptions manually, creating, subscribing to, activating, extending, and purging CDC subscriptions from within a script (see the sketch after this list)
- Tested the data for correctness using SQL queries
- Created complex SQL queries and optimized the performance of the queries
- Worked around an obsolete test environment by tracing mock data through the pipeline and verifying that the target data stayed up to date
- Worked on multiple projects by prioritizing tasks and delivering within deadlines
- Created Tableau extracts and published them
- Identified incorrect data and its causes in Business Objects universes and corrected them
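A minimal PL/SQL sketch of the manual subscription handling described above, using Oracle's DBMS_CDC_SUBSCRIBE package (the 10g-style procedural API); the change set, table, column, and subscriber-view names are hypothetical placeholders, not actual project objects:

```sql
-- One-time setup: create a subscription, subscribe to a source table,
-- and activate it. All object names below are illustrative.
BEGIN
  DBMS_CDC_SUBSCRIBE.CREATE_SUBSCRIPTION(
    change_set_name   => 'ORDERS_CHANGE_SET',
    description       => 'Feeds the orders fact load',
    subscription_name => 'ORDERS_SUB');

  DBMS_CDC_SUBSCRIBE.SUBSCRIBE(
    subscription_name => 'ORDERS_SUB',
    source_schema     => 'SALES',
    source_table      => 'ORDERS',
    column_list       => 'ORDER_ID, ORDER_DATE, AMOUNT',
    subscriber_view   => 'ORDERS_CHG_VW');

  DBMS_CDC_SUBSCRIBE.ACTIVATE_SUBSCRIPTION(subscription_name => 'ORDERS_SUB');
END;
/

-- Per load cycle: extend the window to expose new changes, let the BODS
-- data flow read the subscriber view, then purge the consumed window.
BEGIN
  DBMS_CDC_SUBSCRIBE.EXTEND_WINDOW(subscription_name => 'ORDERS_SUB');
  -- ... ETL reads from ORDERS_CHG_VW here ...
  DBMS_CDC_SUBSCRIBE.PURGE_WINDOW(subscription_name => 'ORDERS_SUB');
END;
/
```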
Environment: Data Services 4.2, Oracle 12C, Windows 7 Enterprise, Tableau 9.0, Webi 4.1
Confidential
ETL Developer
Responsibilities:
- Designed and developed Business Objects Data Services jobs to integrate data from two different warehouses for reporting and data analysis
- Provided pointers to improve DS job design
- Worked extensively on Information Steward data profiling, data cleansing, and performance tuning
- Modified existing DS jobs to optimize performance using parallel execution
- Created a design plan to port ETL code from one environment to another and implemented it
- Resolved missing data and tables in the ported code in the new environment and made sure the loads ran without any issues
- Used the DQ transforms to cleanse data in BODS
- Designed DS jobs to eliminate duplicate records and achieve faster run times
- Solved issues with datastore ODBC connection restrictions in a UNIX environment
Environment: Data Services 4.1, Oracle 11g, Windows 7 Enterprise, UNIX.
Confidential
ETL Developer
Responsibilities:
- Performed performance tuning on existing Data Services jobs for faster processing times
- Worked with R/3 data flows to pull data from SAP tables using Data Transport
- Designed and developed Data Services jobs to bring over tables from Oracle, SAP and PostgreSQL source systems; Added new columns to the existing tables in the warehouse based on business and reporting requirements
- Worked with transformations like Query, Map Operation, Merge, Pivot, SQL, Validation etc.
- Worked extensively with Oracle PL/SQL to handle Oracle geometry data, using Oracle's predefined geometry functions
- Created an Oracle stored procedure to keep track of execution metrics for the Data Services objects and Oracle stored procedures
- Implemented full-refresh and incremental loads for warehouse tables and one-time history loads for incremental tables
- Handled change data capture (CDC) with ETL control tables that maintained last-loaded-record information for each warehouse table (see the control-table sketch after this list)
- Designed and implemented a DS job to handle hard and soft deletes from the source systems
- Optimized existing Informatica workflows for better performance in terms of processing times
- Redesigned cumbersome Data Services jobs for better modularity and control, and enhanced the reusability of objects
- Used the Data Services interactive debugger to troubleshoot errors and monitor the data at various points in a job
- Documented the projects in Microsoft Word with screenshots from Data Services Designer
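A plain-SQL sketch of the control-table CDC pattern mentioned above; the table, column, and source names are hypothetical:

```sql
-- One row per warehouse table, holding its high-water mark
CREATE TABLE etl_load_control (
  table_name     VARCHAR2(64) PRIMARY KEY,
  last_loaded_ts TIMESTAMP    NOT NULL
);

-- Incremental extract: pull only rows changed since the last load
SELECT s.*
FROM   src_orders s
WHERE  s.last_update_ts > (SELECT c.last_loaded_ts
                           FROM   etl_load_control c
                           WHERE  c.table_name = 'ORDERS');

-- After a successful load, advance the high-water mark to the newest
-- source timestamp actually loaded (avoids gaps from clock skew)
UPDATE etl_load_control c
SET    c.last_loaded_ts = (SELECT MAX(s.last_update_ts) FROM src_orders s)
WHERE  c.table_name = 'ORDERS';
```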
Environment: Data Services 4.0, Oracle 10g, PostgreSQL 9.2, Windows 7 Enterprise.
Confidential
ETL Developer
Responsibilities:
- Designed and developed Data Services jobs to load data into dimension tables and implemented Slowly Changing Dimensions (a Type 2 sketch follows this list)
- Worked with transformations like Query, History Preserving, Table Compare, Map Operation, Map CDC Operation
- Used the lookup function to populate fields from other tables
- Implemented initial (full) and delta (incremental) loads for fact tables
- Created CDC datastore and used CDC tables in jobs to load changed data into tables
- Worked with scripts that handle CDC subscriptions manually, creating, subscribing to, activating, extending, and purging subscriptions from within a script
- Tested the data for correctness using SQL queries
- Created complex SQL queries and optimized the performance of the queries
- Created new indexes on tables to optimize job performance and designed the job accordingly
- Used the Data Services interactive debugger to troubleshoot errors and monitor the data at various points in a job
- Documented the projects using the Auto-Documentation feature
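In Data Services this Type 2 behavior comes from the Table Comparison and History Preserving transforms; a plain-SQL sketch of the equivalent logic, with hypothetical table and column names:

```sql
-- Step 1: close out the current version of rows whose tracked
-- attributes changed in the staged snapshot
UPDATE dim_customer d
SET    d.valid_to   = SYSDATE,
       d.current_fl = 'N'
WHERE  d.current_fl = 'Y'
AND    EXISTS (SELECT 1
               FROM   stg_customer s
               WHERE  s.customer_id = d.customer_id
               AND    (s.city <> d.city OR s.segment <> d.segment));

-- Step 2: insert a new current version for changed customers (closed
-- out above) and for brand-new customers
INSERT INTO dim_customer
  (customer_sk, customer_id, city, segment, valid_from, valid_to, current_fl)
SELECT dim_customer_seq.NEXTVAL, s.customer_id, s.city, s.segment,
       SYSDATE, DATE '9999-12-31', 'Y'
FROM   stg_customer s
WHERE  NOT EXISTS (SELECT 1
                   FROM   dim_customer d
                   WHERE  d.customer_id = s.customer_id
                   AND    d.current_fl  = 'Y');
```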
Environment: Data Services XI 3.0, Oracle 9i/10g, Data modeling, Windows 2003 Server/XP, and BO XI R2 Designer.
Confidential, Minneapolis, MN
BO/Data Services (ETL) Developer
Responsibilities:
- Designed and setup the Data Services environment
- Implemented a secure Central repository and setup User Privileges
- Created a web service datastore to make API calls to the web service
- Executed batch scripts from Data Services to communicate with an FTP server
- De-duplicated the incoming data and mapped it using transforms like Query, Reverse Pivot, and Map Operation
- Used the lookup function to populate a field from another table
- Loaded the target table using the Table Comparison transform
- Tested the data for correctness by comparing it against the data generated by its Oracle stored-procedure counterpart (see the comparison queries after this list)
- Optimized the caching options to get optimum performance for the job
- Worked with fixed-width files that contain credit card transaction details
- Created DS jobs to produce flat files according to the specs used for credit card verification
- Applied business logic to identify the transactions to be reprocessed
- Maintained all the transaction details and the verification response in a table and updated it periodically
- Created complex SQL queries and optimized the performance of the queries
- Used the Data Services interactive debugger to troubleshoot errors
- Documented the projects using the Auto-Documentation feature
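A simple way to run that comparison in Oracle is a pair of MINUS queries; the target table names below are hypothetical:

```sql
-- Rows the DS job produced that the stored procedure did not
SELECT * FROM tgt_txn_ds
MINUS
SELECT * FROM tgt_txn_sp;

-- Rows the stored procedure produced that the DS job did not
SELECT * FROM tgt_txn_sp
MINUS
SELECT * FROM tgt_txn_ds;

-- Both queries returning zero rows means the two outputs match exactly.
```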
Environment: Data Services XI, Oracle 9i/10g, SQL Server 2005, Data modeling, Windows 2003 Server/XP, and BO XI R2 Designer
Confidential, Chicago, IL
BO/Data Integrator (ETL) Developer
Responsibilities:
- Designed the Data Model for the project
- Analyzed source data to design the logical and physical database
- Designed a star schema following the dimensional modeling approach (a DDL sketch follows this list)
- Loaded data from the Oracle database into a staging table
- Cleaned data in DI using proper data types and mappings
- Loaded dimension and fact tables
- Implemented Type 2 dimensions for history preservation
- Scheduled jobs using the Web Administrator feature
- Designed and developed several Data Integrator jobs
- Documented the entire DI project using the Auto-Documentation feature
- Resolved loops and chasm traps in the universe
- Created ad hoc Webi reports
- Designed Universe consisting of 24 classes and 450 objects
- Created linked reports using external variables at the report level and at the universe level
- Created custom hierarchies for drill and dice functionality
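A minimal DDL sketch of the kind of star schema described above, with a hypothetical sales fact keyed by surrogate keys into two dimensions:

```sql
CREATE TABLE dim_date (
  date_sk   NUMBER PRIMARY KEY,   -- surrogate key
  cal_date  DATE   NOT NULL,
  cal_month VARCHAR2(7),
  cal_year  NUMBER(4)
);

CREATE TABLE dim_product (
  product_sk   NUMBER PRIMARY KEY,     -- surrogate key
  product_id   VARCHAR2(20) NOT NULL,  -- natural key from the source
  product_name VARCHAR2(100)
);

-- The fact table carries only foreign keys and additive measures
CREATE TABLE fact_sales (
  date_sk    NUMBER NOT NULL REFERENCES dim_date (date_sk),
  product_sk NUMBER NOT NULL REFERENCES dim_product (product_sk),
  qty_sold   NUMBER,
  sale_amt   NUMBER(12,2)
);
```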
Environment: Data Integrator XI, Oracle 9i/10g, Data modeling, Windows 2003 Server/XP, BO XI R2 Designer, BO XI R2 InfoView
Confidential, Santa Monica, CA
Data Warehouse Developer
Responsibilities:
- Analyzed data sources to design simple and efficient query transforms
- Loaded data from flat files
- Designed work flows and data flows
- Cleaned data in DI using proper data types and mappings
- Extracted data from Oracle database
- Wrote a custom function to implement business-day logic for job execution (a PL/SQL equivalent is sketched after this list)
- Scheduled jobs using the Web Administrator
- Designed and developed several Data Integrator jobs
- Wrote customized scripts
- Documented the entire DI project using the Auto-Documentation feature
- Worked closely with business analysts to design efficient Data Integrator jobs
- Created contexts and aliases to resolve loops in the universe
- Worked continually on complex reports, generating reports based on user needs
- Created user prompts, conditions, and filters to improve report generation
- Modified or added universe objects according to requirements
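The original was a Data Integrator custom function; a PL/SQL sketch of equivalent business-day logic, assuming a hypothetical ref_holidays table:

```sql
-- Returns 1 when p_date is a business day (Mon-Fri and not a holiday)
CREATE OR REPLACE FUNCTION is_business_day (p_date IN DATE)
  RETURN NUMBER
IS
  v_holidays NUMBER;
BEGIN
  -- Pin the date language so the 'DY' abbreviation is NLS-independent
  IF TO_CHAR(p_date, 'DY', 'NLS_DATE_LANGUAGE=ENGLISH') IN ('SAT', 'SUN') THEN
    RETURN 0;
  END IF;

  SELECT COUNT(*) INTO v_holidays
  FROM   ref_holidays
  WHERE  holiday_date = TRUNC(p_date);

  RETURN CASE WHEN v_holidays > 0 THEN 0 ELSE 1 END;
END is_business_day;
/
```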
Environment: BO XI R2 Designer, BO XI R2 InfoView, Data Integrator XI, Oracle 9i/10g, Data modeling, Windows 2003 Server/XP
Confidential
Data Warehouse Developer
Responsibilities:
- Created Universe using Designer
- Interacted with business users to understand their business views and modified reports accordingly
- Modified the existing reports to align with the universe
- Scheduled reports using Broadcast Agent
- Applied filters, breaks, formulas, and drill functions in reports
- Modified the existing universes by merging different classes, adding new objects, and resolving loops and chasm traps with contexts
- Converted files to a single delimited file appended by date
- Used T-SQL (nested cursors, stored procedures) to load staging tables (see the sketch after this list)
- Designed work flows and data flows
- Wrote scripts to populate dimension and fact tables
- Cleaned data in DI using proper data types and mappings
- Loaded data into staging table
- Designed and developed several Data Integrator jobs
- Documented the whole DI project using the Auto-Documentation feature
- Performed unit tests against the source, staging, and data mart layers
- Wrote a customized Perl program, later invoked by DI
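A T-SQL sketch of the nested-cursor staging load mentioned above; the batch and order tables, columns, and procedure name are hypothetical:

```sql
-- Outer cursor walks unprocessed source batches; the inner cursor
-- walks each batch's rows into the staging table
CREATE PROCEDURE dbo.load_stg_orders
AS
BEGIN
  DECLARE @batch_id INT, @order_id INT, @amount MONEY;

  DECLARE batch_cur CURSOR FOR
    SELECT batch_id FROM dbo.src_batches WHERE processed = 0;
  OPEN batch_cur;
  FETCH NEXT FROM batch_cur INTO @batch_id;

  WHILE @@FETCH_STATUS = 0
  BEGIN
    DECLARE order_cur CURSOR FOR
      SELECT order_id, amount FROM dbo.src_orders WHERE batch_id = @batch_id;
    OPEN order_cur;
    FETCH NEXT FROM order_cur INTO @order_id, @amount;

    WHILE @@FETCH_STATUS = 0
    BEGIN
      INSERT INTO dbo.stg_orders (batch_id, order_id, amount)
      VALUES (@batch_id, @order_id, @amount);
      FETCH NEXT FROM order_cur INTO @order_id, @amount;
    END;

    CLOSE order_cur;
    DEALLOCATE order_cur;

    -- Mark the batch done, then move to the next one
    UPDATE dbo.src_batches SET processed = 1 WHERE batch_id = @batch_id;
    FETCH NEXT FROM batch_cur INTO @batch_id;
  END;

  CLOSE batch_cur;
  DEALLOCATE batch_cur;
END;
```

In practice a set-based INSERT ... SELECT would outperform this, but the sketch mirrors the nested-cursor approach the bullet names.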
Environment: Business Objects, Designer, InfoView, Data Integrator 6.1/6.5, MS SQL 2003, Data modeling, Windows 2003 Server/XP, Perl