ETL Developer / SAP BODS Consultant Resume
Santa Clara, CA
SUMMARY:
- Senior technical consultant with 9+ years of experience in Data Warehousing: SAP Data Services, Data Marts, Data Integrator, Data Cleansing (Address Cleansing), Data Profiling (Data Insight), Debugging, Data Integrator Logs, Data Services Management Console, and Performance Tuning
- Extensive experience in the field of Enterprise Data Warehousing (EDW) and Data Integration
- Experienced in migrating data from a variety of source systems into SAP ECC / SAP BW / CRM.
- Heavy ETL (Extract, Transform, and Load) and Data Profiling experience with source systems including flat files, legacy systems, and SAP, using SAP Data Services (BODI / BODS 3.x / BODS 4.2)
- Strong knowledge of entity-relationship concepts, fact and dimension tables, slowly changing dimensions, and Dimensional Modeling (Star Schema and Snowflake Schema).
- Ability to design and develop complex ETL solutions by translating design requirements into feasible, reusable, and scalable technical implementations
- Created jobs, workflows, transforms, and data flows per requirements to integrate source data into the target data warehouse.
- Strong expertise in data migration projects; very familiar with the SAP Data Migration Framework / AIO methodology
- Migrated Customer Master and Material Master data into SAP ECC from various legacy sources.
- Worked with a team of BODS developers and the ECC team to achieve data migration project goals
- Experienced in PL/SQL, T-SQL, Oracle 10g/9i/8i, MS Access, SQL Server, DB2, Netezza, and Teradata databases, using TOAD, WinSQL, MSSQL, and SQL*Plus, including procedures/functions and performance tuning.
- Strong working experience with data warehouse techniques and practices, including ETL processes, dimensional data modeling, OLAP, and star schemas.
- Involved in all phases of development, from requirements gathering and blueprinting through implementation, testing, and go-live (using the ASAP methodology).
- Extensive experience building data marts and carrying out reporting activities
- Experience creating documentation such as functional and technical specifications and validation reports.
- Installed Information Steward and set it up to work with the Data Services environment.
- Profiled legacy data and created data quality rules and scorecards using Information Steward, based on Expedien’s Data Quality Assessment.
- Strong experience in SAP BOBJ (Warehouse / Reporting tool)
- Good problem-solving, reporting, and statistical skills, with excellent communication and interpersonal abilities.
- Demonstrated ability to work both in independent and team-oriented environments.
TECHNICAL SKILLS:
- SAP Business Objects Data Services 4.2/3.2
- SAP BW / BI / CRM & SAP FI, SD & MM modules
- ABAP / WEBI / SQL / BEx Queries / SQL data mart
- AIO Methodology / SAP Best Practices
- SQL/PL/SQL, TOAD, SQL Developer, SQL*Plus, UNIX Shell Scripting
- Oracle 11g/10g/9i, Teradata 6.0, SQL Server 2005/2008/MDX, MS Access, MySQL
- Baan, Clarify, Salesforce
- Nexus, Nolio, Jenkins, Perforce, ALM
- Tidal Scheduler, Redwood Software (scheduler)
PROFESSIONAL EXPERIENCE:
Confidential - Santa Clara, CA
ETL Developer/SAP BODS Consultant
Responsibilities:
- Designed, built, and maintained a separate process to support the CEEP transactions track, loading client transactional data into the MKT database per SVB coding standards.
- Brought client transaction data onto one platform to provide a holistic view of client utilization, develop client transaction profiles, and establish transaction-based triggers for Marketing, Sales, and Client Services.
- Created new tables and lookup tables to load WIRES/ACH/FX data, which Tableau dashboards use to generate reports for FX advisories.
- Worked on CRM Individual and Ben Owner entities as sources.
- Worked on ETL solutions to support Germany and Canada data privacy compliance, access monitoring, and access control.
- Worked on VPD enablement and data masking for PII and NPPI data elements to meet GDPR compliance.
- Wrote complex SQL queries with joins, sub-queries, and nested queries to test the model.
- Created detailed functional and technical design documents and translated the business processes into ETL mappings.
- Worked closely with database engineers and architects to maintain the integrity and stability of the database
- Worked with heterogeneous sources including relational sources and flat files
- Implemented slowly changing dimensions while loading data into the data warehouse (see the SQL sketch after this list).
- Developed ETL jobs for initial full loads and incremental loads
- Involved in the deployment of restartability logic in mappings to enable failed sessions to be rerun without any modifications.
- Developed daily schedules for jobs to be run on Redwood scheduler considering various job dependencies.
- Responsible for coordinating development and testing efforts with offshore team members.
- Implemented logic to control job dependencies between ETLs solely through event-raise and event-wait tasks and entries made by ETLs in pilot database tables
- Performed production support duties on a 24/7 basis post go-live.
- Demonstrated ability to quickly grasp team guidelines, processes, practices and procedures.
- Worked independently and completed assigned project responsibilities under limited supervision and aggressive deadlines
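A minimal SQL sketch of the Type 2 slowly-changing-dimension pattern referenced above; the table, column, and sequence names (STG_CLIENT, DIM_CLIENT, DIM_CLIENT_SEQ) are hypothetical, and this shows the logic in SQL terms rather than the exact job implementation:

```sql
-- Step 1: expire the current dimension row when a tracked attribute changes
-- (hypothetical names throughout).
UPDATE dim_client d
   SET d.effective_end_dt = SYSDATE,
       d.current_flag     = 'N'
 WHERE d.current_flag = 'Y'
   AND EXISTS (SELECT 1
                 FROM stg_client s
                WHERE s.client_id = d.client_id
                  AND (s.client_name <> d.client_name
                       OR s.segment  <> d.segment));

-- Step 2: insert a new current version for new and changed clients
-- (changed clients have no current row after step 1).
INSERT INTO dim_client (client_key, client_id, client_name, segment,
                        effective_start_dt, effective_end_dt, current_flag)
SELECT dim_client_seq.NEXTVAL, s.client_id, s.client_name, s.segment,
       SYSDATE, DATE '9999-12-31', 'Y'
  FROM stg_client s
 WHERE NOT EXISTS (SELECT 1
                     FROM dim_client d
                    WHERE d.client_id = s.client_id
                      AND d.current_flag = 'Y');
COMMIT;
```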
Environment: CRM, Business Objects Data Services (BODS) 4.2, Information Steward 4.1/4.2, Tableau 10.2, Oracle 10g/11g, TOAD, SQL Developer, Flat Files, DAT Files, Windows XP, UNIX, Jenkins, Nolio, Nexus, Redwood Software (scheduler), Perforce, ALM (defect tracking)
Confidential - San Francisco, CA
SAP Data Services Consultant
Responsibilities:
- Worked on moving insurance (Claims & Counters) data from legacy systems to Oracle MetaVance systems.
- Designed, built, and maintained an application and database processes for a Data Mart for the Actuarial team of a dental health care payer. This data mart is used by the actuarial staff to monitor dental health insurance policies.
- Designed, built, and maintained a separate process to support Mexico government regulatory reports called Obligaciones Pendientes de Cumplir (OPC). This process captures new claims, modified claims, and paid-claim ‘movements’ for all claims processed by Dentegra Mexico.
- Subject matter expert on Extract, Transform, Load (ETL) processes as used to support various business processes such as claims workflow, regulatory reporting, etc.
- Worked on BODS Data Quality: Customer Master address cleansing
- Enhanced and developed interfaces for Claims, Ex-Gratia, and High Risk entities.
- Extensively used ETL to load data from source systems such as flat files, Oracle, ODBC, Excel files, and CSV files into staging tables, and loaded the data into the target Oracle database (MetaVance application).
- Wrote PL/SQL and T-SQL procedures for processing business logic in the database; tuned SQL queries for better performance.
- Created dynamic procedures to drop and re-create indexes in the staging environment to facilitate faster data loads (see the sketch after this list).
- Wrote complex SQL queries with joins, sub-queries, and nested queries to test the model.
- Created scripts, jobs, workflows, and data flows, and executed jobs with Data Integrator Designer; loaded data into targets from flat files and from database tables used as sources.
- Created starting and ending scripts for each job, sent job notifications to users via scripts, and declared local and global variables.
- Responsible for testing the ETL jobs in QA before moving them to Production (Integration testing cycles and mock cycles).
- Created the model and developed EIM master data migration using SAP BODS.
- Involved in the overall conversion of claims and counters from Healthcase systems (Mexico business) to Oracle MetaVance systems, from requirements gathering through successful go-live implementation and support.
- Successfully implemented a sign-off and validation mechanism at various stages of the data transition between several business groups.
- Performed production support duties on a 24/7 basis.
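A hedged sketch of the dynamic index-management procedure mentioned above, assuming a hypothetical STG_CLAIMS staging table; this variant marks indexes unusable before the bulk load and rebuilds them afterwards, which avoids having to store the index DDL that a literal drop/re-create would require:

```sql
CREATE OR REPLACE PROCEDURE manage_stg_indexes(p_action IN VARCHAR2) AS
BEGIN
  -- Loop over every index on the staging table (hypothetical STG_CLAIMS).
  FOR ix IN (SELECT index_name
               FROM user_indexes
              WHERE table_name = 'STG_CLAIMS') LOOP
    IF UPPER(p_action) = 'DISABLE' THEN
      -- Called from the job's starting script, before the bulk load.
      EXECUTE IMMEDIATE 'ALTER INDEX ' || ix.index_name || ' UNUSABLE';
    ELSIF UPPER(p_action) = 'REBUILD' THEN
      -- Called from the job's ending script, after the load completes.
      EXECUTE IMMEDIATE 'ALTER INDEX ' || ix.index_name || ' REBUILD';
    END IF;
  END LOOP;
END;
/
```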
Environment: Business Objects Data Services (BODS) 4.0/3.2 (12.2.2.0), Data Quality, Business Objects 4.0/XI R3, Oracle 10g/11g, SQL Server 2005/2008, SQL, PL/SQL, TOAD, SQL Developer, Flat Files, DAT Files, Windows XP, UNIX, Tidal Job Scheduler, IBM Rational ClearQuest, ClearCase, WinSCP
Confidential - Hercules, CA
SAP Data Services Consultant
Responsibilities:
- Involved in gathering business requirements and preparing data mappings from legacy systems to SAP. Worked with SMEs for each system area in preparing the mappings from legacy systems to the IDoc structures for loading.
- One of the key areas where I was able to help the business and the team was in understanding the data. Many of the current business users and SMEs were not familiar with the internal data relationships and relied on my expertise to analyze those relationships and derive the mappings from the legacy systems.
- Prepared requirements documents based on the IDoc and LSMW structures available for loading.
- Prepared the mapping documents for data sets from each source system by conducting multiple interviews with the business users and respective technical teams.
- Used SAP Best Practices AIO templates to extract, transform, and load source data to SAP ECC using SAP Data Services
- Extracted data from legacy systems such as Baan and Clarify and from various data files
- Designed and developed multiple batch jobs and complex data flows and workflows using the various transforms available in SAP Data Services, such as Query, Validation, Case, and Lookup
- Validated the data based on the business rules and the IDoc loading standards
- Cleansed the data using the various cleansing transformations available from BODS
- Performed de-duplication of data based on the SAP guidelines and Bio-Rad business requirements (see the sketch after this list)
- Cleansed the data per the client's needs and transformed it per the business rules.
- Performed Data Services code migration from one environment to another, such as from DEV to QA and from QA to PROD.
- Involved in defect remediation during the UAT and QA phases, performing build activities and deploying the product to the QA and UAT environments.
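A SQL-level sketch of the de-duplication step described above; the table and column names (STG_CUSTOMER, CUST_NAME, POSTAL_CODE, LAST_UPDATE_DT) are hypothetical, and this illustrates the logic only, not the actual BODS job design:

```sql
-- Keep one survivor per business key, preferring the most recent record;
-- all names are hypothetical.
DELETE FROM stg_customer
 WHERE rowid IN (
   SELECT rid
     FROM (SELECT rowid AS rid,
                  ROW_NUMBER() OVER (
                    PARTITION BY UPPER(TRIM(cust_name)), postal_code
                    ORDER BY last_update_dt DESC) AS rn
             FROM stg_customer)
    WHERE rn > 1);
COMMIT;
```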
Environment: Oracle, PL/SQL, Salesforce, Clarify, Baan, SAP Business Objects Data Services (BODS) 4.0, MS Access, PuTTY, WinSCP.
Confidential- San Francisco, CA
Sr. ETL Developer & SAP Data Services Consultant
Responsibilities:
- Worked on moving insurance data from legacy systems to Oracle MetaVance systems.
- Enhanced and developed Interfaces for Groups, Billing and GL Profit and Loss.
- Enhanced existing interfaces for the new line of business, with extensive unit and regression testing to ensure the changes did not affect the current business structure.
- Installed Information Steward and set it up to work with the Data Services environment.
- Did comprehensive data cleansing of the legacy data and populated the cleansed data to MetaVance using Business Objects Data Services/Data Integrator (BODS/BODI).
- Built reusable data quality components.
- Built several one-time & reusable Business Objects Data Services (BODS) ETL jobs for the data extraction, cleansing and transformation.
- Performed data loading, unit testing of jobs and of the database, and scheduling and automation of jobs with the Tidal job scheduler.
- The data warehouse captures data from Oracle modules (Purchasing, Sales, Accounts Receivable, Inventory, GL)
- Worked on UNIX commands for copying and archiving files from FTP locations on a daily basis, implementing logic and daily scheduling of UNIX .sh jobs.
- Implemented ETL control, restart/recovery processes, and best practices (see the sketch after this list).
- Performance-tuned ETL jobs; data volumes have been in the terabytes.
- Implemented system profiles to make code movement easy from environment to environment.
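A hedged sketch of the ETL control and restart/recovery pattern referenced above, assuming a hypothetical ETL_JOB_CONTROL table: each job reads its last successful high-water mark, extracts only newer rows, and advances the mark only on success, so a failed run can simply be rerun:

```sql
CREATE TABLE etl_job_control (
  job_name     VARCHAR2(64) PRIMARY KEY,
  last_load_dt DATE,
  status       VARCHAR2(10)    -- RUNNING / SUCCESS / FAILED
);

-- Job start: mark the run in flight (hypothetical job name).
UPDATE etl_job_control SET status = 'RUNNING'
 WHERE job_name = 'LOAD_GL_DAILY';

-- Incremental extract driven by the high-water mark (hypothetical source table).
SELECT *
  FROM src_gl_txn t
 WHERE t.updated_dt > (SELECT last_load_dt
                         FROM etl_job_control
                        WHERE job_name = 'LOAD_GL_DAILY');

-- Job end: advance the mark only on success; after a failure the mark is
-- untouched, so rerunning the job re-extracts the same window unchanged.
UPDATE etl_job_control
   SET last_load_dt = SYSDATE, status = 'SUCCESS'
 WHERE job_name = 'LOAD_GL_DAILY';
COMMIT;
```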
Environment: Oracle Financials (Accounts Payables, Account Receivables, General Ledger), Business Objects Data Services 4.0/3.2 (12.2.2.0), Data Quality, Business Objects 4.0/XI R3, Oracle 10g/11g, SQL Server 2005/2008, SQL, PL/SQL, TOAD, SQL Developer, Flat Files, DAT Files, Windows XP, UNIX, Tidal Job Scheduler, IBM Rational ClearQuest, ClearCase, WinSCP.
Confidential - Fairfax, VA
Data Migration Consultant
Responsibilities:
- Extensively used the SAP Best Practices AIO (All-in-One) methodology
- Developed the STG, lookup, and AIO jobs and worked on the mapping, validation, and enrichment steps in the AIO framework.
- Imported metadata for hierarchies, IDocs, and tables
- Created ABAP Data Flows.
- Loaded data into the Customer Master, Material Master, and Vendor Master using DEBMAS, MATMAS, and CREMAS IDocs by referencing the SAP AIO jobs.
- Extensively used the lookup and lookup_ext functions in Data Integrator to load data from source to target via lookup tables (see the sketch at the end of this section).
- Worked on performance tuning of long-running jobs using techniques such as pushing operations down to the database level, using cache memory and Degree of Parallelism, and collecting statistics to monitor jobs.
- Extensively used Query, Map Operation, Table Comparison, Merge, Case, SQL, and Validation transforms to load data from source to target systems.
- Experience designing Business Objects Data Services (BODS) solutions for cleansing master data from legacy systems (SAP R/2 and R/3).
- Used parsing, correction, standardization, and duplicate matching to arrange information using Data Quality (Data Cleansing) Management.
- Extensively used the Associate, Match, Global Address Cleanse, Data Cleanse, and USA Regulatory Address Cleanse transforms to maintain quality data for end users.
- Defined a separate datastore for each database to allow Data Integrator XI to connect to the source or target database.
- Created starting and ending scripts for each job, sent job notifications to users via scripts, and declared local and global variables.
- Migrated and tested jobs in different instances and validated the data by comparing source and target tables.
- Migrated jobs to Data Services, ran them, and tested their validity.
- Designed and developed Data Integrator scripts for data transformation.
- Created, validated and executed jobs to transfer data from source to target.
- Experience in debugging execution errors using Data Integrator logs (trace, statistics and error) and by examining the target data.
- Used scripts for declaring variables, creating tables, and inserting data into them.
Environment: Business Objects Data Services XI 3.2, Business Objects XI R2, Oracle 10g/9i, SAP ECC 6.0, SAP R/2 and R/3, ClearQuest, TOAD, UNIX, Flat Files, SQL and PL/SQL.
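For reference, a hedged SQL equivalent of the lookup_ext usage above: lookup_ext behaves roughly like an outer join against a lookup table with a default value for non-matches. All table and column names here (STG_CUSTOMER, LKP_SALES_ORG, TGT_CUSTOMER) are hypothetical:

```sql
INSERT INTO tgt_customer (cust_id, cust_name, sales_org)
SELECT s.cust_id,
       s.cust_name,
       -- Default value when the lookup finds no match, as lookup_ext allows.
       NVL(l.sales_org, 'UNKNOWN')
  FROM stg_customer s
  LEFT JOIN lkp_sales_org l
    ON l.legacy_org_code = s.org_code;
COMMIT;
```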
Confidential - Portland, ME
SAP BODS / BW Consultant
Responsibilities:
- Responsible for loading the financial data into SAP BW from multiple source systems.
- Prepared the mapping documents for data sets from each source system by conducting multiple workshops with the business users and respective technical teams.
- Designed and developed dashboards in Finance and Sales for the Vice President and the Finance Controller.
- Designed and developed multiple batch jobs and complex data flows and workflows using the various transforms available in SAP Data Services, such as Query, Validation, and Pivot.
- Designed and developed the financial data model and loaded it by triggering InfoPackages using SAP Data Services.
- Worked extensively on the Data Migration Framework to migrate data from several source systems into the consolidated data warehouse and SAP ECC.
- Designed and developed highly formatted reports using Crystal Reports for the financial data on top of BEx queries.
Confidential - Bloomfield, CT
SAP BODS / BW Consultant
Responsibilities:
- The main goal of this project was to build financial data marts in SQL Server by pulling financial data from SAP ECC on a daily basis.
- Installed and integrated SAP Data Services with SAP ECC by creating an RFC destination and performing the required transports in SAP ECC.
- Developed batch jobs for pulling the data from SAP ECC using ABAP data flows.
- Successfully troubleshot and performance-tuned jobs, reducing the execution time of several jobs from several hours to under one hour.
- Designed and developed ad-hoc reports using WebI on top of the SQL data mart and trained super users on how to use, modify, and run them.