Sap Data Services Consultant Resume

CA

PROFESSIONAL SUMMARY

  • More than 7 years of extensive experience in the IT industry, including about 6 years in Business Intelligence solutions, developing data warehouses/data marts and client-server applications using Data Integrator.
  • Excellent skills in Data Integrator XI/12.2 in the design, creation, and implementation of workflows, data flows, and scripts, performing simple and complex transformations using Data Integrator platform and Data Quality transforms.
  • Expert-level mastery in designing and developing complex mappings to extract data from diverse sources including flat files, RDBMS tables, and legacy system files.
  • Experience in setting up Development, Test, and Production environments, including local and central repositories, and migrating reusable objects such as jobs, workflows, and dataflows both by direct Import/Export and by checking in/out of the Central Repository.
  • Expertise in implementing data cleansing, data profiling, transformation scripts, stored procedures/triggers, and the test plans necessary to ensure successful execution of data-loading processes.
  • Hands-on experience with recovery mechanisms for unsuccessful batch job executions.
  • Experience in debugging execution errors using Data Integrator logs (trace, statistics, and error) and by examining the target data.
  • Extensively used the Data Services Management Console to schedule and execute jobs, manage repositories, and perform metadata reporting.
  • Experience in creating and administering the metadata Data Integrator repository in a multi-user environment.
  • Proficiency in data warehousing techniques for data cleansing, Slowly Changing Dimensions, surrogate key assignment, and change data capture (see the SQL sketch after this list).
  • Extensive knowledge of all phases of the Software Development Life Cycle (SDLC): requirement analysis, design, development, testing, and documentation.
  • Worked on end-to-end implementations with data warehousing teams; strong understanding of data warehouse concepts such as star schema, snowflake schema, and multi-dimensional models for query and analysis requirements.
  • Hands-on expertise in optimizing, debugging, and testing SQL queries and stored procedures written in Oracle.
  • Experience in system support, performance tuning, backup and recovery, space management, maintenance, and troubleshooting.
  • Experience in resolving ongoing maintenance issues and bug fixes, and in performance tuning of reports.
  • Ability to understand the business environment and translate business requirements into technical solutions.
  • Outstanding communication and interpersonal skills, ability to learn quickly, good analytical reasoning, and quick adaptation to new environments and tools.
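
To illustrate the Slowly Changing Dimension and surrogate-key work mentioned above (the pattern the Data Services History Preserving and Key Generation transforms implement), here is a minimal Oracle-flavored SQL sketch. The stg_customer and dim_customer tables and the dim_customer_seq sequence are hypothetical names for illustration, not code from any of the projects below.

    -- Hypothetical SCD Type 2 load: expire the current dimension row when a
    -- tracked attribute changes, then insert a new version with a fresh
    -- surrogate key drawn from a sequence.
    UPDATE dim_customer d
       SET d.valid_to   = CURRENT_DATE,
           d.is_current = 'N'
     WHERE d.is_current = 'Y'
       AND EXISTS (SELECT 1
                     FROM stg_customer s
                    WHERE s.customer_id = d.customer_id
                      AND (s.city <> d.city OR s.segment <> d.segment));

    INSERT INTO dim_customer
           (customer_sk, customer_id, city, segment,
            valid_from, valid_to, is_current)
    SELECT dim_customer_seq.NEXTVAL,        -- surrogate key assignment
           s.customer_id, s.city, s.segment,
           CURRENT_DATE, DATE '9999-12-31', 'Y'
      FROM stg_customer s
     WHERE NOT EXISTS (SELECT 1
                         FROM dim_customer d
                        WHERE d.customer_id = s.customer_id
                          AND d.is_current  = 'Y'
                          AND d.city        = s.city
                          AND d.segment     = s.segment);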

TECHNICAL SKILLS

ETL Tools: Business Objects Data Integrator 11.7/12.2, Data Quality 11.7, SAP Data Services XI 3.0/3.2

Databases: Oracle 9i/10g/11g, SQL Server 2000/2005/2008, Sybase, DB2, Teradata, MySQL, MS Access

Modeling Tools: Mapping Composer, Mapping Designer, TOAD, Visual Paradigm, Rational Rose

Other Tools: TestDirector, Workbench, Enterprise Manager, Transaction Tracer, WinRunner, QTP, NS3 Simulator, M5 Simulator, Wireshark

OS: Windows 98/XP/NT/Server 2000/2003/Vista/7, Linux, UNIX/Solaris

Languages: C, C++, SQL, JAVA, Perl, Python

Web Environment: HTML, JavaScript, XML

IDE & Packages: SQL Navigator, Visual Studio 2005, Eclipse, MS Office Suite (Word, Excel)

WORK EXPERIENCE

Client: Confidential CA

Role: SAP Data Services Consultant

Cepheid is a leading molecular diagnostics company that is dedicated to improving healthcare by developing, manufacturing, and marketing accurate yet easy-to-use molecular systems and tests.

Cepheid had developed reports to see invoiced sales, booked sales orders, and pending shipments by entity. Cepheid needed to be able to view invoiced sales by region or by territory, starting with the current quarter and looking back four quarters. Redesigned the ETL to handle the new region and territory dimensions using Business Objects Data Services XI 3.2.

Responsibilities:

  • Worked with business process owners and stakeholders to understand and document the requirements. Interacted with the manager to understand the business requirements and implemented the business rules to redesign and enhance the existing dataflows.
  • Designed, developed and implemented solutions with data warehouse, ETL, data analysis, and BI reporting technologies.
  • Responsible for day-to-day operations and technical deployment, ensuring the solution complied with quality standards and regulations, met user requirements, and applied standards and best practices where appropriate.
  • Managed the central and local repositories using the Management console and Repository Manager.
  • Used data extracts from legacy QAD systems and SQL Server to populate the Microsoft SQL Server 2008 target systems.
  • Made changes to the structure of the database tables in SQL Server, adding and modifying column values and data types to accommodate the changes made to the ETL.
  • Used different transforms such as Query, Case, Table Comparison, Key Generation, Merge, Date Generation, and Pivot, and functions such as ifthenelse(), substr(), lookup(), and lookup_ext() (see the SQL sketch after this list).
  • Migrated the dataflows to the production environment for Go Live of the Web Intelligence business Reports which used the underlying ETL.
  • Made changes to the existing universe using the Universe Designer to incorporate the new changes in the ETL, and customized the WebI reports as per the users' requirements.
  • Validated the data in the reports and the underlying fact table against the golden data.
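
The region/territory logic above can be pictured with a small SQL sketch. This is a hedged illustration only, assuming hypothetical fact_invoice and dim_territory tables (T-SQL date math for the current-quarter-back-four-quarters window), not the actual project code.

    -- Hypothetical region/territory derivation of the kind the Query and Case
    -- transforms expressed; the LEFT JOIN plays the role of lookup_ext().
    SELECT f.invoice_id,
           f.invoice_amount,
           COALESCE(t.territory_name, 'UNASSIGNED') AS territory,
           CASE
               WHEN t.region_code IN ('W', 'SW') THEN 'WEST'
               WHEN t.region_code IN ('E', 'NE') THEN 'EAST'
               ELSE 'OTHER'
           END AS region
      FROM fact_invoice f
      LEFT JOIN dim_territory t
        ON t.territory_id = f.territory_id
     WHERE f.invoice_date >=                 -- start of current quarter, minus 4 quarters
           DATEADD(quarter, -4,
                   DATEADD(quarter, DATEDIFF(quarter, 0, GETDATE()), 0));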

Environment: Business Objects Data Services XI 3.2, MS SQL Server 2008, BOXI 3.1 and Windows XP, Universe Designer, Web Intelligence.

Client: Confidential Houston, TX

Role: SAP Data Services Consultant

Champion is a global supplier of a complete line of proprietary specialty chemicals, offering technical solutions to problems in the oil and gas industries. Champion has expanded its expertise in research and development, manufacturing, and servicing of its products for oilfield and related applications. Our team was responsible for ETL process flow designs and walkthroughs, and for data validation/analysis of the required source systems, coordinating with the business users and resolving various defects before the Go Live. Developed the data warehouse/data marts using Business Objects Data Services XI 3.2.

Responsibilities:

  • Interacted with Business Users and Managers in gathering business requirements.
  • Involved in meetings with functional users to determine flat-file layouts, data types, and naming conventions for column and table names.
  • Worked on dimensional modeling as per business needs.
  • Configured and managed repositories using the Administrator console as per business requirements.
  • Created mappings using the Designer and designed workflows to build the DW as per business rules.
  • Transforms used included Case, Merge, Query, Map Operation, Table Comparison, SQL, and Validation, along with source lookups and sequence generation (see the change-detection sketch after this list).
  • Prepared data extracts from legacy systems on Sybase and DB2 and populated SAP target systems, on which the datastore for BO was created.
  • Used various transforms such as Map Operation, Table Comparison, Row Generation, Query, History Preserving, Key Generation, Pivot, and Reverse Pivot, plus lookup and index-search functions, for transforming data.
  • Configured the mappings to handle updates and preserve existing records using the History Preserving transform (Slowly Changing Dimensions Type 2).
  • Developed and modified data flows according to the business logic.
  • Understood technical issues and identified architecture and code modifications needed to support changing user requirements across multiple Data Services jobs and applications.
  • Recovered data from failed sessions and workflows.
  • Debugged execution errors using Data Services logs (trace, statistics, and error) and by examining the target data.
  • Analyzed the types of business tools and transforms to be applied.
  • Tuned performance for large data files by increasing block size and cache size, and implemented performance-tuning logic on sources, workflows, data flows, and the SAP ECC 6.0 target system to provide maximum efficiency and performance.
  • Involved in writing unit test cases.
  • Conducted user training sessions and assisted in UAT (User Acceptance Testing).
  • Provided Knowledge Transfer to the end users and created extensive documentation on the design, development, implementation, daily loads and process flow of the mappings.
  • Implemented security for the central repository at group and object level.
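
As a sketch of the change detection that the Table Comparison transform performs before a load, here is a minimal ANSI SQL illustration. The stg_product and dim_product tables and their columns are hypothetical.

    -- Rows in staging that are new or changed relative to the target:
    -- the set Table Comparison would flag as INSERT or UPDATE rows.
    SELECT s.product_id, s.product_name, s.unit_price
      FROM stg_product s
      LEFT JOIN dim_product d
        ON d.product_id = s.product_id
     WHERE d.product_id IS NULL                 -- no match on key -> INSERT
        OR d.product_name <> s.product_name     -- attribute changed -> UPDATE
        OR d.unit_price   <> s.unit_price;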

Environment: Business Objects Data Services XI 3.2, Teradata, Sybase, Oracle 10g, MS SQL Server 2005, TOAD, and Windows XP

Client: Confidential Racine, Wisconsin

Role: SAP Data Services Consultant

A wholesale-distribution client based in Wisconsin, the company manufactures outdoor recreational equipment with a team of 1,500 employees generating over $420 million in revenue. The primary objective of the project was to integrate sales and purchasing information from heterogeneous source systems such as Oracle and SQL Server and load it into a centralized data warehouse. The role also involved replacing a legacy system no longer able to keep up with the Business Intelligence demands of the growing company.

Responsibilities:

  • Participated in the review and approval of the technical transformation requirements document.
  • Used technical transformation document to design and build the extraction, transformation, and loading (ETL) modules.
  • Performed source data assessment and identified the quality and consistency of the source data.
  • Extensively worked on Data Services XI 3.2 for migrating data from one database to another.
  • Handled BODS administration and configuration.
  • Performed ETL performance tuning.
  • Extracted data from SAP source systems using ABAP dataflows and loaded it into the SQL EDW.
  • Managed the ETL environment with regular updates to items such as address libraries received from our vendors.
  • Worked with tables involving hierarchies and resolved them using hierarchy flattening whenever required (see the recursive SQL sketch after this list).
  • Worked on complex data integration projects involving multi-source, multi-subject-area, multi-entity data environments.
  • Resolved performance issues in full loads and delta loads using Data Services.
  • Implemented performance-optimization techniques such as caching and pushing memory-intensive operations down to the database server.
  • Implemented server groups to use advanced performance-tuning features such as parallel data flows, splitting data flows into sub data flows, and Degree of Parallelism.
  • Made modifications to existing universes by adding new tables in the data warehouse, creating new objects and exported the universes to repository.
  • Responsible for designing, building, testing, and deploying Dashboards, Universes and Reports using Web Intelligence through Business Objects.
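
Hierarchy flattening (the Hierarchy_Flattening transform in Data Services) turns a parent-child table into explicit levels. A minimal sketch of the same idea with a recursive CTE, in SQL Server-style syntax, over a hypothetical product_category(category_id, parent_id, name) table:

    -- Flatten a parent-child hierarchy into level numbers and full paths.
    WITH category_tree (category_id, name, lvl, path) AS (
        SELECT category_id, name, 1,
               CAST(name AS VARCHAR(4000))
          FROM product_category
         WHERE parent_id IS NULL                      -- root nodes
        UNION ALL
        SELECT c.category_id, c.name, t.lvl + 1,
               CAST(t.path + ' > ' + c.name AS VARCHAR(4000))
          FROM product_category c
          JOIN category_tree t ON c.parent_id = t.category_id
    )
    SELECT category_id, lvl, path
      FROM category_tree;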

Environment: SAP Data Services XI 3.2, Business Objects XI, TOAD, Universe Designer, Web Intelligence, Oracle 10g/11g, MS SQL Server 2005, XML, Windows XP, Windows 2008 server

Client: Confidential St. Louis, MO

Role: Senior BO Data Integrator Developer

A.G. Edwards Inc. provides securities and commodities brokerage, investment banking, trust services, asset management, financial and retirement planning, private client services, investment management, and other related financial services to individual, governmental, and institutional clients. The Business Objects Data Integrator tool was used to maintain customer information and history preservation, and the data was in turn used for reporting purposes.

Responsibilities:

  • Responsible for analyzing business needs, collecting information from the client, and interacting with business analysts to implement solutions for those needs.
  • Used Business Objects Data Integrator to extract, clean, conform, and load data into the Oracle data warehouse. Worked with SQL and TOAD to study the data in different systems and to validate reports.
  • Used Business Objects Data Integrator for ETL extraction, transformation, and loading of data from heterogeneous source systems such as Excel, IDocs, and flat files.
  • Worked with various transforms such as SQL, Table Comparison, History Preserving, Row Generation, Query, Map Operation, Merge, Pivot, and Case, plus lookups.
  • Defined various audit points and rules to ensure that a data flow loads correct data into the data warehouse.
  • Used ABAP dataflows and pre-load and post-load commands.
  • Installed and Configured Data Insight for Data Profiling.
  • Linked Job Servers on different machines to a single repository for load balancing using the Data Integrator Server Manager.
  • Responsible for Testing and debugging various ETL jobs.
  • Wrote DI scripts using built-in functions such as search_replace() and lookup_ext(), and custom functions such as sending an email whenever an exception is raised.
  • Used various error-handling techniques in Data Services to avoid duplicated or missing rows (see the deduplication sketch after this list).
  • Worked with the Data Integrator Management Console to deploy and schedule batch jobs as database administrator.
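
One common way to guard against duplicated rows, as mentioned above, is a windowed deduplication in the staging area. A minimal Oracle-flavored sketch, assuming a hypothetical stg_trade table keyed by trade_id with an updated_at column:

    -- Keep one row per natural key, preferring the most recent update;
    -- every duplicate beyond rn = 1 is removed before the warehouse load.
    DELETE FROM stg_trade
     WHERE rowid IN (
           SELECT rid
             FROM (SELECT rowid AS rid,
                          ROW_NUMBER() OVER (PARTITION BY trade_id
                                             ORDER BY updated_at DESC) AS rn
                     FROM stg_trade)
            WHERE rn > 1);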

Environment: SAP Data Integrator 12.2, Toad v9, Oracle 10g, Windows Server 2003

Client: Confidential Tampa, FL

Role: Data Integrator Designer

USF Health provides medical education, performs research activities, and delivers healthcare services. Their major challenge was to gain greater visibility into financial, physician-performance, educational, and research data. This project aimed at developing a research-oriented DWH for fast, accurate decision-making. Objectives included consolidating information from multiple data sources and establishing an integrated business intelligence reporting and analysis solution.

Responsibilities:

  • Involved in gathering business requirements, functional requirements and data specification.
  • Analyzed and created Fact and Dimension tables.
  • Optimized existing data flows based on source- and target-side job performance, applying table partitioning and parallel execution in data flows.
  • Created ETL processes for newly added subject areas; developed data extraction, transformation, and loading programs using Data Integrator.
  • Involved in creating jobs that specify data mappings and transformations using Data Integrator Designer.
  • Used Data Integrator to load data from different sources (SQL, Text files) to the target database.
  • Responsible for creating and implementing both source- and target-based CDC using timestamps and the History Preserving transform (see the timestamp-extract sketch after this list).
  • Created recoverable workflows to handle source and target server crashes.
  • Used the Job Server to create real-time jobs, and performed tuning at source, transformation, and target levels.
  • Worked on the DI Management Console to schedule, monitor, and execute batch jobs, as well as publishing batch reports and real-time jobs via web services.
  • Used Business Objects Data Integrator interactive debugger to test the jobs and fixed the bugs.
  • Documented BODI mappings and job schedules as per company standards.
  • Worked on Error handling and auditing.
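
A minimal sketch of the source-based, timestamp-driven CDC extract described above, assuming a hypothetical src_orders table and a job_control watermark table; illustrative only, not the project's code:

    -- Pull only rows changed since the previous successful run,
    -- then advance the watermark for the next run.
    SELECT o.*
      FROM src_orders o
     WHERE o.last_modified > (SELECT last_load_ts
                                FROM job_control
                               WHERE job_name = 'ORDERS_DELTA');

    UPDATE job_control
       SET last_load_ts = CURRENT_TIMESTAMP
     WHERE job_name = 'ORDERS_DELTA';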

Environment: Data Integrator 11.7.2, Oracle 10g, MS SQL Server 2005, Windows XP/NT.

Client: Confidential Fort Lauderdale, FL

Role: BO Developer

Bank of Florida is a global financial services firm that provides its products and services to corporations, financial institutions, and individuals. The company provides services in the banking sector, wealth management, asset management, and investment and management advisory.

Responsibilities:

  • Created reports using Business Objects functionality such as queries, slice and dice, drill down, @Functions, cross tabs, master-detail, and formulas.
  • Created reports using universes as the main data providers and wrote the complex queries behind them.
  • Resolved loops, cardinalities, and contexts, and checked the integrity of the universes.
  • Created reports on the web using BO Web Intelligence against the online database, migrating from a third-party database.
  • Exported universes to the repository to make resources available to users.
  • Designed and developed universes for report generation from different databases.
  • Called one report from another and bypassed prompts when no data was used.
  • Designed and built universes, classes, and objects.
  • Created the repository, users, and user groups using the Supervisor module.
  • Created various ad-hoc reports.
  • Scheduled BO reports through Broadcast Agent.
  • Monitored the reports scheduled through the Broadcast Agent console.

Environment: Business Objects 6.5 (Designer, Reporter, Supervisor, Broadcast Agent and WebIntelligence), Oracle 8i, SQL Server 7.0, and Windows NT.

Golden Source Corporation, India/ New York, USA

Software Engineer

Golden Source provides Enterprise Data Management ("EDM") software and services to financial institutions globally. EDM allows its clients to collect, standardize, consolidate, and manage information about their securities and products, customers, transactions, and other operations, to better manage the distribution of critical data to business applications.

Responsibilities:

  • Responsible for requirements analysis, design and development of the system.
  • The project involved detailed design using E-R Diagrams and Data Flow Diagrams.
  • Created database objects such as tables, views, synonyms, indexes, sequences, and database links, as well as custom packages tailored to business requirements (see the DDL sketch after this list).
  • Developed and maintained scripts for monitoring, troubleshooting, and administration of the databases.
  • Developed and tuned SQL, triggers, and stored procedures.
  • Tuned the database, SQL statements, and schemas for optimal performance.
  • Tested the mapping of the data and database fields to the Golden Source data model.
  • Used a wide range of development tools such as PuTTY, ClearCase, and Anthill for creating deliverable packages; worked on company-specific tools such as Orchestrator, Environment Manager, and Message Specification File (MSF).
  • Performed functional testing of securities data against the Golden Source data model, as well as automation testing using Workbench; used both TestDirector and Workbench for maintaining test plans, test cases, test execution, defect management, and bug reporting.
  • Performed load testing (uploading large volumes of data in the test environment).
  • Knowledge of Data files from vendors like Bloomberg, Telekurs, FTID (Interactive Data), S&P, Crosswalk, Fidelity, etc.
  • Raised exceptions found while testing modules, components, and packages in each release.
  • Used Agile methodology and the Scrum technique.
  • Prepared documentation, user manuals, and release notes.
  • Coordinated with the QE team on test plans, test cases, and test schedules.
  • Maintained customer history using SQL*Loader and developed the data warehouse directly from flat files.
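
To make the database-object work above concrete, a small Oracle DDL sketch with illustrative names only (not Golden Source's actual schema):

    -- Hypothetical Oracle objects of the kinds listed above.
    CREATE TABLE security_master (
        security_id   NUMBER        PRIMARY KEY,
        ticker        VARCHAR2(12)  NOT NULL,
        issuer_name   VARCHAR2(100),
        last_updated  DATE          DEFAULT SYSDATE
    );

    CREATE SEQUENCE security_id_seq START WITH 1 INCREMENT BY 1;

    CREATE INDEX ix_security_ticker ON security_master (ticker);

    -- View restricting reports to recently maintained securities.
    CREATE OR REPLACE VIEW v_active_securities AS
        SELECT security_id, ticker, issuer_name
          FROM security_master
         WHERE last_updated > ADD_MONTHS(SYSDATE, -12);

    CREATE SYNONYM securities FOR security_master;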

Environment: Oracle 9i, PL/SQL, SQL*Loader, SQL*PLUS, Visual Basic 6.0, Windows XP.
