ETL/SAP BODS Developer Resume
Battle Creek, MI
SUMMARY
- Over 7 years of extensive experience in Information Technology with emphasis on Business Intelligence concepts, database management systems, development, and the complete project life cycle in Data Warehousing and Client/Server technologies.
- Worked extensively with Data migration, Data marts, Data profiling, ETL Processes, Data mining, Data audits and Web reporting features for data warehouses.
- Proficient in data warehousing techniques for data cleansing, Slowly Changing Dimensions (SCD), surrogate key assignment, and change data capture (CDC), with strong knowledge of fact and dimension tables.
- Expertise in using IDocs, BAPIs, scripting, and looping, and in using ABAP dataflows to extract data from SAP ERP systems.
- Extensive Experience in preparing Test Plans, Test Scenarios, Test Cases and executing them.
- Good Exposure to all stages of Software Development Life Cycle (SDLC) and Software Test Life Cycle (STLC) and Agile methodologies.
- Experienced in interacting with business users to analyze business processes and requirements, transforming requirements into screens, designing reports, formatting, and rolling out deliverables.
- Hands on experience in implementing the data integrator transforms like history preservation, table comparison, map operation, pivot and hierarchy flattening.
- Implemented various performance optimization techniques such as table partitioning, caching, and full push-down.
- Experience in creating ETL mappings using Data Integrator to move Data from multiple sources like Flat files, Excel files, XML Files, Oracle, and Microsoft SQL Server into a common target area such as Data Marts.
- Wrote DI scripts implementing a recovery mechanism for unsuccessful batch job executions.
- Experience in setting up Development, Test, and Production environments, including setting up a central repository and migrating reusable objects such as jobs, workflows, and dataflows, both by direct Import/Export and by checking objects in and out of the Central Repository.
- Hands on experience working with the processes like Address cleansing and Match consolidation.
- Expertise in relational database systems such as Microsoft SQL Server 2008/2005, MySQL, DB2, Oracle 10g/9i, and Microsoft Access.
- Substantial development experience in creating Stored Procedures, PL/SQL, SQL, Packages, Triggers, Functions and Query optimization.
- Strong understanding of logical and physical database designs and also dimensional modeling schemas like Star and Snowflake schema.
- Extensively worked on designing, implementing, distributing and maintaining Universes.
- Expertise in Business Objects XI 3.2/XI R2/6.x/5.x with experience in Designer, Desktop Intelligence, Web Intelligence.
- Strong experience in training end users to create, modify and customize their own Reports using Business Objects Reporter, Info View and Web Intelligence.
- Excellent interpersonal skills, ability to interact with people at all levels, and good communication and presentation skills.
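One of the techniques listed above, surrogate key assignment, follows a simple pattern that can be sketched in Python. This is a hypothetical illustration only; the function and field names are invented, not taken from any project described here.

```python
# Hypothetical sketch of surrogate key assignment: natural keys from the
# source are mapped to warehouse-generated integer keys, minting a new key
# only the first time a natural key is seen.

def assign_surrogate_keys(records, key_map, next_key=1):
    """Attach a surrogate_key to each record based on its natural_key."""
    out = []
    for rec in records:
        nk = rec["natural_key"]
        if nk not in key_map:          # first time this natural key appears
            key_map[nk] = next_key
            next_key += 1
        out.append({**rec, "surrogate_key": key_map[nk]})
    return out, next_key

key_map = {}
batch, next_key = assign_surrogate_keys(
    [{"natural_key": "CUST-001"},
     {"natural_key": "CUST-002"},
     {"natural_key": "CUST-001"}],    # repeated key reuses its original value
    key_map)
```

Repeated natural keys keep the surrogate key they were first assigned, which is what lets dimension loads stay stable across runs.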
TECHNICAL SKILLS
ETL Tools: Business Objects Data Integrator 11.7/12.2, Data Quality 11.7, SAP Data Services XI 3.0/3.2.
Reporting Tools: Business Objects XI 3.1/XI R2/6.x, Universe Designer, Web Intelligence, Desktop Intelligence, Xcelsius 2008, InfoView, Live Office, CMC, and CMS.
Databases: Oracle 9i/10g/11g, SQL Server 2000/2005/2008, DB2, Teradata, MySQL, MS Access, SAW
Modeling Tools: TOAD, Erwin, and Visio.
Operating Systems: Windows 98/XP/NT/Server 2000/2003/Vista/7, Linux, UNIX.
Languages: C, C++, SQL, JAVA, Perl, Python.
Web Environment: HTML, JavaScript, ASP.NET, and XML.
IDE & Packages: SQL Navigator, Visual Studio 2005, Eclipse, MS Office Suite (Word, Excel)
PROFESSIONAL EXPERIENCE
Confidential, Oxnard, CA
ETL/SAP BODS Developer
Responsibilities:
- Involved in interacting with client for understanding the business and writing the technical specification documents, test case documents from functional specifications.
- Designed the ETL processes using Data Integrator to load data from multiple sources such as Oracle, MySQL, XML, and fixed-width flat files into a staging database, and from staging into the target data warehouse.
- Extensively used Query, Map Operation, Table Comparison, Merge, Case, SQL, and Validation transforms in Data Integrator to load data from source to target systems.
- Analyzed all tables and created various indexes to enhance query performance.
- Implemented various performance optimization techniques such as caching, pushing memory-intensive operations down to the database server, and table partitioning.
- Worked on Change Request (CR) issues and resolved them within the estimated time period.
- Extracted data from sources such as SAP ECC 6.0 and flat files, and loaded it into target flat files, Excel sheets, and Oracle tables using Data Integrator.
- Worked with SAP extraction using BAPI/RFC function calls.
- Worked with tables involving Hierarchies and resolved them using Hierarchy flattening whenever required.
- Worked with transforms such as Query, Key Generation, Pivot, Map Operation, and Table Comparison in Data Integrator.
- Wrote scripts using built-in functions such as to_date, lookup, and key generation, as well as custom functions, e.g. to check whether a load is an initial load or a bulk load.
- Involved in creation of recoverable workflows during source and target server crashes and job failures.
- Experience in debugging execution errors using Data Integrator logs (trace, statistics and error) and by examining the target data.
- Extensively used lookup_ext and lookup_seq, and created custom functions wherever repetitive code was needed.
- Wrote DI scripts for file existence checks, daily incremental reads, daily incremental updates, workflow-level script reads and updates, workflow-level error handling, and job dependencies.
- Used Data Services inbuilt functions like Aggregate Functions, Database Functions, Date Functions, Lookup Functions, Math Functions, System Functions and Validation Functions.
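The file-existence-check and daily-incremental scripts mentioned above share a common control pattern. Below is a hedged Python sketch of that logic, not actual DI script code; the state-file layout and function names are assumptions made for illustration.

```python
# Sketch of the control logic behind a "file existence check" and a
# watermark-based daily incremental read/update. Illustrative only.
import json
import os
import tempfile

def file_exists_check(path):
    """Fail fast if the expected source file has not arrived."""
    if not os.path.isfile(path):
        raise FileNotFoundError(f"source file missing: {path}")

def read_watermark(state_file, default="1900-01-01"):
    """Return the last successfully loaded date (the incremental watermark)."""
    if not os.path.isfile(state_file):
        return default
    with open(state_file) as f:
        return json.load(f)["last_load_date"]

def update_watermark(state_file, load_date):
    """Persist the new watermark only after the load succeeds."""
    with open(state_file, "w") as f:
        json.dump({"last_load_date": load_date}, f)

state = os.path.join(tempfile.mkdtemp(), "load_state.json")
first = read_watermark(state)          # no state yet -> default watermark
update_watermark(state, "2012-06-30")  # after a successful load
second = read_watermark(state)         # next run reads only newer data
```

Persisting the watermark only after a successful load is what makes the daily incremental job safely rerunnable after a failure.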
Environment: SAP Data Integrator 12.2, Business Objects XI, TOAD, Oracle 10g/11g, MySQL, XML, PL/SQL, OBIEE, Windows XP, Windows 2008 Server
Confidential, Battle Creek, MI
ETL/SAP BODS Developer
Responsibilities:
- Hands-on experience in developing jobs using transforms such as Map Operation, Table Comparison, Effective Date, History Preserving, SQL, Key Generation, Query, Case, Merge, Validation, Pivot, Reverse Pivot, and Row Generation.
- Used Data Services inbuilt functions like Aggregate Functions, Database Functions, Date Functions, Lookup Functions, Math Functions, System Functions and Validation Functions.
- Created Custom functions to make the code reusable.
- Created scripts and worked with Conditionals and While Loops to implement complex logic.
- Worked on active customer data cleansing and loading the data into BW.
- Extracted data from SAP ECC systems using ABAP dataflows, transformed the data, and staged it to data marts.
- Created ABAP Data flows and used Shared Directory Data transport method to perform the ETL operation.
- Created DS extract jobs to extract data from RDBMS tables into the SAP BW system, and exported the job execution parameters to SAP BW.
- Created batch jobs by extracting data from spreadsheets and validated the data against BW.
- In BODS, used real-time jobs to process IDocs, with the IDoc as the message source.
- Configured each real-time job with a real-time service; a real-time service handles one IDoc and is attached to the RFC client interface along with the RT service.
- The RFC client interface interacts with the SAP system and picks up the IDoc; the corresponding real-time service then processes it and inserts the data into the ODS database.
- Optimized the performance of long-running jobs using techniques such as pushing operations down to the database level, Degree of Parallelism, and collecting statistics to monitor jobs.
- Scheduled and monitored jobs using the Data Services Management Console.
- Created starting and ending scripts for each job, sent job notifications to users via scripts, and declared local and global variables.
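The Table Comparison transform used in the jobs above can be illustrated with a small Python analogue: incoming rows are compared to the target by primary key and tagged as inserts or updates, while unchanged rows produce no operation. The column names and sample data are invented for the example.

```python
# Minimal Python analogue of a table-comparison step. Illustrative only;
# the real transform runs inside Data Services against database tables.

def table_compare(incoming, target, key, compare_cols):
    """Classify each incoming row as ('I', row) insert or ('U', row) update."""
    target_by_key = {row[key]: row for row in target}
    ops = []
    for row in incoming:
        existing = target_by_key.get(row[key])
        if existing is None:
            ops.append(("I", row))    # new key -> insert
        elif any(row[c] != existing[c] for c in compare_cols):
            ops.append(("U", row))    # changed attribute -> update
        # identical rows pass through with no operation
    return ops

target = [{"id": 1, "city": "Oxnard"}]
incoming = [{"id": 1, "city": "Battle Creek"},   # changed -> update
            {"id": 2, "city": "Andover"}]        # new -> insert
ops = table_compare(incoming, target, "id", ["city"])
```

The tagged rows would then feed a map-operation-style step that turns each tag into the corresponding database statement.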
Environment: Business Objects Data Services XI 3.2, SQL Server 2005, Oracle 10g/9i, SAP ECC 3.0, SAP BI 7.0, TOAD, Web Intelligence XI R2, Windows XP, Windows 2003 Server.
Confidential, WI
ETL/SAP BODI
Responsibilities:
- Became familiar with the business, its strategy, and its implementation.
- Interacted with the client onsite to gather business requirements and to analyze and develop ETL designs.
- Gathered Customer, User and Owner requirements; analyzed and documented them.
- Used Business Objects Data Integrator XI 3.2 for ETL extraction, transformation and loading data from heterogeneous source systems.
- Designed and customized data models for the data warehouse, supporting data from multiple sources in real time.
- Tuned Data Integrator Mappings and Transformations for the better performance of jobs.
- Customized configuration of Business Objects Web Intelligence by fine tuning and optimization for better performance.
- Migrated reports from BO 6.5 to BO XI R2.
- Designed and developed Data Integrator (DI) XI scripts for data transformation.
- Cleansed data in DI using proper data types and mappings, and created workflows using the object library and tool palette to execute dataflows.
- Created multiple universes and resolved Loops by creating table aliases and contexts.
- Worked with database connections, SQL joins, cardinalities, loops, aliases, materialized views, views, aggregate conditions, parsing of objects, LOVs, and hierarchies.
- Created complex reports using functions such as @Variable, @Prompt, condition objects, and aggregate awareness functions for SQL enhancement.
- Enabled/Disabled different servers in Central Configuration Manager and Central Management Console.
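The aggregate awareness mentioned above rests on one routing idea: answer a query from the smallest pre-aggregated table whose dimensions cover the report's request. A hedged Python sketch of that selection logic follows; the table names and dimension sets are hypothetical.

```python
# Sketch of aggregate-aware table selection. In a real Universe this is
# expressed with @Aggregate_Aware in the object definitions; here the
# tables are listed from most aggregated (smallest) to detail level.

AGG_TABLES = [
    ("sales_by_year", {"year"}),
    ("sales_by_year_region", {"year", "region"}),
    ("sales_detail", {"year", "region", "product", "customer"}),
]

def pick_aggregate(requested_dims):
    """Return the first (smallest) table that can answer the request."""
    for table, dims in AGG_TABLES:
        if set(requested_dims) <= dims:
            return table
    raise ValueError("no table satisfies the requested dimensions")
```

A yearly summary hits the tiny `sales_by_year` table, while any request involving `product` falls through to the detail table, which is why aggregate awareness speeds up the common high-level reports.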
Environment: Business Objects XI R2/6.5/6.1, Data Integrator, Dashboard Manager, Xcelsius, WebI 6.1, DESKI, JSP, SQL, PL/SQL, Erwin, SQL*Loader, UNIX Shell Script, Windows 2000/NT, Oracle, SQL Enterprise Manager
Confidential, CA
Data Integrator developer
Responsibilities:
- This EDW (Enterprise Data Warehousing) project automates all the processes and sub-processes involved in the organization's resource management and space management, unit-wise.
- The application provides information such as the space allotted to a particular circle, space occupied and unoccupied by associates in a circle, billing of spaces, and expenses of spaces based on billing.
- It reports space expenses on a monthly basis for all circles in the organization, and this data is used to develop reports for the organization.
- Reviews take place monthly to inform top management of the status of each Circle/Sub-Circle.
- The application also allows users to view historical information unit-wise, supporting top management in decision making.
- The project involves providing the capability for users to develop their own customized ad-hoc reports.
- Developed and implemented ETL logic using Data Integrator according to client requirements.
- Implemented star schema, fact, and dimension concepts.
- Created reports using Designer and Web Intelligence.
- Wrote complex queries to implement the logic behind the reports.
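The star schema and fact/dimension concepts above can be shown concretely with a tiny SQLite example: one fact table of space billing joined to two dimension tables, then aggregated per circle. All table names, columns, and figures are invented for the illustration; they are not taken from the actual project.

```python
# Illustrative star schema in SQLite: a billing fact table surrounded by
# circle and month dimensions, queried with the usual fact-to-dimension joins.
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.executescript("""
CREATE TABLE dim_circle (circle_key INTEGER PRIMARY KEY, circle_name TEXT);
CREATE TABLE dim_month  (month_key  INTEGER PRIMARY KEY, month_name  TEXT);
CREATE TABLE fact_space_billing (
    circle_key INTEGER REFERENCES dim_circle(circle_key),
    month_key  INTEGER REFERENCES dim_month(month_key),
    billed_amount REAL
);
INSERT INTO dim_circle VALUES (1, 'North'), (2, 'South');
INSERT INTO dim_month  VALUES (1, 'Jan'), (2, 'Feb');
INSERT INTO fact_space_billing VALUES (1, 1, 100.0), (1, 2, 150.0), (2, 1, 80.0);
""")
rows = cur.execute("""
SELECT c.circle_name, SUM(f.billed_amount)
FROM fact_space_billing f
JOIN dim_circle c ON c.circle_key = f.circle_key
GROUP BY c.circle_name
ORDER BY c.circle_name
""").fetchall()
```

Measures live only in the fact table and descriptive attributes only in the dimensions, which is what keeps such monthly-expense queries a simple join-and-group.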
Environment: Business Objects XI R2, Business Objects Data Integrator 11.7.2, Web Intelligence, Java, Oracle 9i DW, SQL, PL/SQL, MS SQL Server 2005.
Confidential, Andover, MA
SAP BODS (Data Integrator/Data Quality) Consultant
Responsibilities:
- Responsible for gathering and analyzing business requirements.
- Responsible for ETL extraction, transformation, and loading of data from heterogeneous sources including Oracle.
- Implemented various transformation techniques in Data Services and loaded the results into flat files.
- Extensively used the Address Server for processing of the EMEA (European) addresses for Cleansing in Data Quality in conjunction with the dictionaries and directories.
- Responsible for designing the prototype for loading data into SAP BW 7.0 using an external source system (Data Services 3.0).
- Successfully configured Data Services 3.0 to load data into the business warehouse.
- Monitored loads in the BW ODS and resolved issues related to unit of measure, date fields, and currency.
- Wrote DI scripts using built-in functions such as search_replace and lookup_ext, as well as custom functions, e.g. sending an email whenever an exception is raised.
- Used SQL*Loader to load data from Excel sheets or flat files into Oracle tables without using an ETL tool.
- Resolved performance issues in full loads and delta loads using Data Services.
- Responsible for using various error handling techniques in data services to avoid duplication of rows and/or missing rows.
- Successfully implemented conversion logic in Business Objects Data Services and loaded data into the Persistent Staging Area in BW 7.0.
- Involved in performance improvement of existing ETL jobs by implementing different performance tuning techniques such as Slicing and Use of Cache Memory.
- Worked with the Data Services Management Console as database administrator to deploy and schedule batch jobs.
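The error handling above, avoiding both duplicated and missing rows, comes down to making each load idempotent: merge the staging batch into the target keyed on the primary key, so a rerun after a failure changes nothing. A hedged Python sketch of that merge, with invented field names:

```python
# Sketch of an idempotent (upsert-style) merge used to avoid duplicate or
# missing rows when a load is rerun. Illustrative only; in Data Services
# this role is played by Table Comparison plus auto-correct load options.

def merge_batch(target, batch, key="id"):
    """Upsert each batch row into target; rerunning the same batch is a no-op."""
    by_key = {row[key]: row for row in target}
    for row in batch:
        by_key[row[key]] = row       # insert new keys, overwrite existing ones
    return sorted(by_key.values(), key=lambda r: r[key])
```

Because existing keys are overwritten rather than appended, replaying a partially loaded batch can never double-count rows.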
Environment: SAP Data Services XI 3.0, Oracle 10g, ECC 6.0, SAP BW 3.5, BW 7.1, Windows Server 2003.
Confidential, Long Beach, CA
Business Objects Developer
Responsibilities:
- Successfully implemented Business Objects XI R2 infrastructure for enterprise reporting.
- Defined database Data Stores and File Formats to allow Data Integrator XI R2 to connect to the source or target database and files.
- Migrated Business Objects from 6.5 to XI R2.
- Identified and tracked slowly changing dimensions (SCDs) and heterogeneous sources, and determined the hierarchies in dimensions.
- Automated all error handling, error escalation, and email notification procedures.
- Used Business Objects Data Quality cleansing transformations for de-duplication, house-holding and global address parsing.
- Performed logical database design, physical database design, and data modeling with TOAD.
- Developed complex Universes using advanced features such as aggregate awareness, conditions, and hierarchies.
- Created contexts and aliases to resolve loops and chasm traps in Business Objects.
- Responsible for designing, building, testing, and deploying Universes, reports, and Xcelsius dashboards through Business Objects.
- Involved in designing, creating, enhancing and testing Universe, Web Intelligence reports.
- Modified report content and exported reports in multiple formats based on user input.
- Implemented Aggregate awareness & aggregate navigation with Business Objects Designer.
- Exported reports into various formats such as PDF, Excel, and Word for the convenience of end users.
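The slowly-changing-dimension tracking mentioned above is most often Type 2: when a tracked attribute changes, the current dimension row is closed out and a new version is opened with effective dates. Below is a hedged Python sketch of that rule; the field names and sample values are illustrative only.

```python
# Sketch of SCD Type 2 versioning: an open row has end_date None; a changed
# attribute expires it and appends a new version. Illustrative only; in the
# ETL tool this is the Table Comparison + History Preserving pattern.

def scd2_apply(history, incoming, key, attr, load_date):
    """Close the active row and append a new version if `attr` changed."""
    current = next((r for r in history
                    if r[key] == incoming[key] and r["end_date"] is None), None)
    if current is None or current[attr] != incoming[attr]:
        if current is not None:
            current["end_date"] = load_date      # expire the old version
        history.append({key: incoming[key], attr: incoming[attr],
                        "start_date": load_date, "end_date": None})
    return history
```

Reapplying an unchanged record leaves the history untouched, so the dimension accumulates exactly one row per distinct attribute value with non-overlapping date ranges.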
Environment: Business Objects XI R2, Data Integrator XI, Data Quality XI, Oracle 10g/9i, PeopleSoft, Hotel 360 application, QAAWS, MS Access, TOAD 8.0, Windows XP/2003 Server
Confidential
Data Integrator developer
Responsibilities:
- Involved in the analysis of three different source systems and prepared technical specifications for the ETL.
- Developed an error/exception handling mechanism to enhance the quality of data loaded into the EDW.
- Involved in enhancing the data model for new additions of dimension and reference tables.
- Responsible for data migration in order to achieve an automated migration.
- Involved in all stages of data migration.
- Created mappings and mapplets using transformations such as Source Qualifier, Aggregator, Expression, Lookup, Router, Filter, Sequence Generator, and Update Strategy.
- Collaborated with other team members to create technical specifications for ETL/ELT processes based on functional specifications, and developed ETL/ELT processes based on those specifications.
- Extensively worked with PowerConnect to import sources from external systems such as ERP.
- Created SSIS Packages to migrate slowly changing dimensions.
- Involved in setting up of SSRS 2005 for corporate internal reports.
- Used Workflow Manager for creating workflows, worklets, and email and command tasks.
- Tuned Informatica mappings and sessions by implementing parallelism, partitioning, and caching.
- Developed UNIX and Perl scripts to automate tasks involved in the loading process and scheduling.
- Used Tivoli Workload Scheduler (TWS) for scheduling.
- Performed Unit Testing and Integration Testing on the mappings
Environment: Informatica PowerCenter 7.1.1, Teradata, SSIS, Erwin, XML, Oracle, SQL, BTEQ, Power Connect, PL/SQL, SAP, UNIX, C#, SQL Server 2005, Windows XP, TOAD.