Data Warehouse Architect Resume Profile
Summary
- Information Technology Data Warehouse Consultant with over ten (10) years of experience, broad-based technical knowledge, and an in-depth understanding of system/software engineering lifecycle phases.
- Experienced in requirements gathering and analysis, implementation, testing, and development of ETL solutions for leading clients.
- Interfacing with clients for business requirements gathering, conducting system analysis, and finalizing technical/functional specifications.
- Designing, developing, testing, troubleshooting, and debugging applications.
- ETL experience using SAP BODS, Cognos Data Manager, Informatica, and SQL Server DTS.
- Expertise in using SAP Business Objects Data Services (BODS) 4.1/4.0: Designer, Repository Manager, Server Manager, configuring/controlling the Job Server, and Web Admin.
- Extensively used stored procedures for data cleansing and data profiling (a PL/SQL sketch appears at the end of this summary).
- Strong hands-on experience in ETL and data cleansing applications using Informatica PowerCenter, Data Services, and PL/SQL.
- Providing post-implementation application maintenance and enhancement support to the client for the product/software application.
- Performed pre-migration assessment, migration, and post-migration activities.
- Good understanding of the Ralph Kimball methodology: star and snowflake schemas, fact and dimension tables, and physical and logical data modeling.
- Experience in data modeling using SAP Sybase PowerDesigner and ERWIN.
- Experience in writing SQL queries and stored procedures in Oracle and SQL Server databases.
- Proficient in Oracle 11g/10g/9i/8i/8.0/7.x and MS SQL Server 2012/2008/2005: database triggers, materialized views, packages, stored procedures, functions, database constraints, SQL*Loader, SQL*Plus, SQL Navigator, PL/SQL, Oracle DBMS packages, and Oracle-supplied utilities.
- Used DB tools such as SQL Developer, Embarcadero, and TOAD.
- Involved in end-to-end ETL implementations.
- Used scheduling tools such as Tivoli, Autosys, and SAP Redwood CPS.
- Developed test strategies to verify ETL workflows, data flows, and transform logic based on extraction and loading.
- Strong experience carrying out functional, integration, performance, and reliability testing.
- Ability to analyze existing systems, conceptualize and design new ones, and deploy innovative solutions with high standards of quality.
- Strong analytical skills and in-depth experience in business process analysis, coupled with documentation and self-organization skills.
- Domain expertise from working in the reinsurance, insurance, telecom, manufacturing, and banking/finance industries.
- Led a team of 10 people in both onsite/offshore models.
- Cooperating and communicating with other team members for efficient project work.
- Effective in communicating complex technical information to both top-level management and end users. Flexible and resourceful, applying strong organizational, time-management, and planning skills to deliver projects on time and on budget under pressure and aggressive timelines.
- Excellent Project Management skills.
- Strong communication and interpersonal skills. Experienced effective team leader as well as team player.
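As a brief illustration of the data cleansing stored procedures mentioned above, the following is a minimal PL/SQL sketch; the stg_customer table and all column names are hypothetical, chosen only for the example.

    -- Illustrative sketch: stg_customer and its columns are hypothetical names.
    -- Standardizes and de-duplicates staged rows before downstream loading.
    CREATE OR REPLACE PROCEDURE clean_stg_customer AS
    BEGIN
      -- Trim stray whitespace and normalize case on name and email fields
      UPDATE stg_customer
         SET first_name = INITCAP(TRIM(first_name)),
             last_name  = INITCAP(TRIM(last_name)),
             email      = LOWER(TRIM(email));

      -- Null out clearly invalid email values rather than loading them
      UPDATE stg_customer
         SET email = NULL
       WHERE email IS NOT NULL
         AND NOT REGEXP_LIKE(email, '^[^@]+@[^@]+\.[^@]+$');

      -- Remove exact duplicates, keeping one row per business key
      DELETE FROM stg_customer s
       WHERE s.ROWID > (SELECT MIN(t.ROWID)
                          FROM stg_customer t
                         WHERE t.customer_no = s.customer_no);

      COMMIT;
    END clean_stg_customer;
    /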
Technical Skills
- ETL Tools : SAP Business Objects Data Services 4.1/4.0/3.2, Cognos Data Manager 10.2/10/8.4.1, Informatica 8.1, SQL Server DTS
- OLAP/BI : Cognos Report Studio 10.2/10/8.4.1
- Data Modeling : SAP Sybase PowerDesigner, Erwin
- Databases : Oracle 11g/10g/9i/8i/8.0/7.x, SQL Server 2012/2008/2005, DB2
- Languages : SQL, PL/SQL.
- Scheduling Tools : SAP Redwood CPS, Autosys, Tivoli
- Operating Systems : UNIX, Windows, MS-DOS.
Experience
Confidential
Data Warehouse Architect
- Financial Data Store (FDS) 2.0 is a data warehouse for Columbia, initiated to provide data to users.
- Provided a recommendations document on hardware sizing and architecture.
- Worked with various teams to set up user access levels and server sizing.
Data Modeling
- Worked closely with the data modeler in implementing the data warehouse data model.
Business Objects Data Services
- Designed an ETL framework (40 jobs) as part of a report POC to enhance report performance.
- Defined the best practices document.
- ETL design: use of complex ETL routines to extract data from the ODS.
- Upgraded SAP BODS 4.1 to SAP BODS 4.2.
- Created repositories and the Job Server, and managed users.
- Implemented server groups to use advanced performance tuning features such as parallel data flows.
Environment: SAP Business Objects Data Services (BODS) 4.2/4.1, Oracle 11g
Confidential
Data Warehouse Architect
- Direct Reinsurance Warehouse is an enterprise warehouse for Gen Re, initiated to provide SAP FSRI and SAP BW data to business users.
- Requirements gathering and analysis.
- Provided recommendations on hardware sizing and architecture.
- Interacting with the client, managers, SMEs, BAs, and the database team.
- Participated extensively in meetings and interviews with business and power users to determine the sources, data layout, business needs, and naming conventions for column and table names.
- Performed complex troubleshooting, root cause analysis, and solution development, and implemented reporting systems to address the issues.
- Worked with the Ralph Kimball methodology for star and snowflake schemas, fact and dimension tables, and physical and logical data modeling.
- Coordinated with other teams to ensure availability of source data.
Data Modeling
- Designed logical and physical models using SAP Sybase PowerDesigner.
- Designed views on top of the database for reporting needs.
Business Objects Data Services
- Designed ETL framework jobs.
- Created source-to-target mapping documents and job dependency charts.
- Defined the technical design document.
- Created custom ABAP programs to pull CLOB data from ECC.
- ETL design: use of complex ETL routines to extract data from the ODS, converting business functional specifications into mappings/workflows.
- Liaised with business users and business analysts regarding requirements.
- ETL design: converting business functional specifications into mappings/workflows, and testing.
- Tuned performance for large data files by increasing block size and cache size, and implemented performance tuning logic on sources, workflows, and data flows.
- Performance tuning using recovery mechanisms.
- Implemented server groups to use advanced performance tuning features such as parallel data flows.
- Involved in writing DS scripts, using built-in functions such as Search_Replace and Lookup_ext as well as custom functions for the ETL audit process.
- Created Data Services mappings to load the data warehouse; the mappings involved extensive use of simple and complex transformations such as SQL Query Transform, Table Comparison, and Query in data flows (a SQL sketch of the Table Comparison pattern appears after this list).
- Involved in System integration testing along with other team members using various source data scenarios.
- Experienced in scheduling and monitoring of jobs using SAP Redwood CPS during pre-production phases.
- Involved in code enhancement and testing based on change requests (CRs) from end users.
- Experience in debugging execution errors using the Data Services logs (trace, statistics, and error) and by examining the target data.
- Enterprise-wide data quality dashboard.
- Worked with various teams to set up user access levels and server sizing.
- Guiding the development team right from requirements to deployment.
- Leading a team of 6 in both onsite/offshore models.
- Involved in end-to-end ETL implementation.
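The Table Comparison transform noted above detects inserts and updates between a source and a comparison table. A set-based SQL equivalent, shown here as an Oracle-flavored sketch with hypothetical table and column names (stg_policy, dw_policy_dim), would be a MERGE that inserts new keys and updates changed rows:

    -- Sketch only: object names are illustrative, not from the project.
    -- NULL-sensitive comparisons would need extra handling in practice.
    MERGE INTO dw_policy_dim tgt
    USING stg_policy src
       ON (tgt.policy_no = src.policy_no)
    WHEN MATCHED THEN
      UPDATE SET tgt.status       = src.status,
                 tgt.premium_amt  = src.premium_amt,
                 tgt.last_updated = SYSDATE
       WHERE tgt.status <> src.status
          OR tgt.premium_amt <> src.premium_amt   -- update only changed rows
    WHEN NOT MATCHED THEN
      INSERT (policy_no, status, premium_amt, last_updated)
      VALUES (src.policy_no, src.status, src.premium_amt, SYSDATE);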
Environment: SAP Business Objects Data Services (BODS) 4.1/4.0, SAP ECC 6.0, MS SQL Server 2012
Confidential
Data Warehouse Architect
- Coordinated with other teams to ensure availability of source data.
- Interacting with the client, managers, SMEs, BAs, and the database team.
- Setting up Dev, UAT, and Production environments.
- Analysis of the requirements.
- Gathering requirements from the business, working closely with business leaders to understand the requirements.
- This project is to create the ultimate repository of the accounting representation of all financial transactions.
- Timely response to customer requests.
- No customer complaints on delivery or quality of deliverables.
- Development of feeds with ease according to the functional specification document.
- Worked with the Ralph Kimball methodology for star and snowflake schemas, fact and dimension tables, and physical and logical data modeling.
- Completeness of deliverables without any rework.
- Involved in performance tuning.
- Unit testing and UAT testing.
- UAT cycle support on every major release.
- Deploying into production and upgrading the production environment.
- Defining the new process for the application.
- Setting up the new security rules and regulations for users and developers.
- Creating complex workflows and promoting the code from one environment to another.
- Deploying the code in production.
- Taking care of PRD issues and release issues.
- Giving estimates for the implementation.
Business Objects Data Services
- Configuration of BODS, repository creation, and server management.
- Central Repository configuration and best practices for version control
- Scheduling the jobs and User Management in Management Console
- ETL design: use of complex ETL routines to extract data from the ODS, converting business functional specifications into mappings/workflows.
- Liaised with business users and business analysts regarding requirements.
- Developed source-to-target mappings as per the requirements.
- ETL design: converting business functional specifications into mappings/workflows, and testing.
- Exported Data Integrator jobs to different repositories for backup and code migration purposes.
- Tuned performance for large data files by increasing block size and cache size, and implemented performance tuning logic on sources, workflows, and data flows.
- Performance tuning using recovery mechanisms.
- Implemented server groups to use advanced performance tuning features such as parallel data flows, dividing a data flow into sub data flows, and Degree of Parallelism.
- Involved in writing DS scripts, using built-in functions such as Search_Replace and Lookup_ext, and custom functions such as sending an email whenever an exception is raised.
- Created Data Services mappings to load the data warehouse; the mappings involved extensive use of simple and complex transformations such as Key Generator, SQL Query Transform, Table Comparison, Case, Validation, Merge, and Lookup in data flows (a SQL sketch of the Key Generator and Validation pattern appears after this list).
- Involved in System integration testing along with other team members using various source data scenarios.
- Experienced in scheduling and monitoring of jobs using DI management console during pre-production phases.
- Worked on creating repositories and associating the Job Server with them.
- Involved in code enhancement and testing based on change requests (CRs) from end users.
- Experience in debugging execution errors using the Data Services logs (trace, statistics, and error) and by examining the target data.
- Enterprise-wide data quality dashboard.
- Guiding the development team right from requirements to deployment.
- Leading a team of 10 in both onsite/offshore models.
- Handling Project Management for different assignments.
- Involved in end-to-end ETL implementation.
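For illustration, the Key Generator and Validation transforms mentioned above correspond to the following Oracle pattern; all object names (customer_dim, stg_customer, and the sequence) are hypothetical:

    -- Sketch: a sequence supplies surrogate keys, as a Key Generator would,
    -- and the WHERE clause stands in for a simple Validation rule.
    CREATE SEQUENCE customer_dim_seq START WITH 1 INCREMENT BY 1;

    INSERT INTO customer_dim (customer_key, customer_no, customer_name, load_date)
    SELECT customer_dim_seq.NEXTVAL,      -- surrogate key assignment
           s.customer_no,
           s.customer_name,
           SYSDATE
      FROM stg_customer s
     WHERE s.customer_no IS NOT NULL;     -- reject rows without a business key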
Environment: SAP Business Objects Data Services (BODS) 4.0 (Designer, Repository Manager, DS Management Console, CMC), SAP Information Steward 4.0, MS SQL Server 2008, Oracle 10g, SQL Developer
Confidential
Sr. Technical Consultant
- Coordinated with other teams to ensure availability of source data.
- Participated extensively in meetings and interviews with business and power users to determine the flat file layouts, sources, data layout, business needs, and naming conventions for column and table names.
- Interacting with the client, managers, SMEs, BAs, and the database team.
- Provided recommendations on hardware sizing and architecture.
- Setting up Dev, UAT, and Production environments.
- Analysis of the requirements.
- Gathering requirements from the business, working closely with business leaders to understand the requirements.
- Replacing the existing Console solution with the Console Free solution and providing support for ERP LN FP3, ERP Infinium, ERP BPCS, and ERP LX. Creating data models, job streams, fact builds, and dimension builds for ERP LN FP5, FP7, and FP8 feature packs.
- Timely response to customer requests.
- No customer complaints on delivery or quality of deliverables.
- Development of feeds with ease according to the functional specification document.
- Completeness of deliverables without any rework.
- Worked with the Ralph Kimball methodology for star and snowflake schemas, fact and dimension tables, and physical and logical data modeling.
- Involved in performance tuning.
- Unit testing and UAT testing.
- UAT cycle support on every major release.
- Deploying into production and upgrading the production environment.
- Defining the new process for the application.
- Setting up the new security rules and regulations for users and developers.
- Creating complex workflows and promoting the code from one environment to another.
- Deploying the code in production.
- Taking care of PRD issues and release issues.
- Giving estimates for the implementation.
Cognos Data Manager
- Created job streams, fact and dimension builds, lookups, and user-defined functions.
- Used SQL wherever necessary, inside and outside the mappings.
- Created simple Cognos Report Studio reports as part of the Console Free solution.
- Created the Cognos Framework Manager model using stored procedures (a sketch of the ref cursor procedure pattern appears after this list).
- Guiding the development team right from requirements to deployment.
- Leading a team of 7 in both onsite/offshore models.
- Handling Project Management for different assignments
- Involved in end-to-end ETL implementation.
- Upgraded the Console Free solution developed in Cognos Data Manager 8.4 to Cognos Data Manager 10.0, 10.1.1, and 10.2.
- Certified the Console Free solution on Oracle 9i, 10g, and 11g.
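Framework Manager models built on stored procedures typically consume an Oracle procedure that returns its result set through a ref cursor. A minimal sketch of that pattern, with hypothetical names (get_sales_summary, sales_fact), is:

    -- Sketch: a procedure returning rows via SYS_REFCURSOR, the shape a
    -- stored-procedure query subject expects from an Oracle source.
    CREATE OR REPLACE PROCEDURE get_sales_summary (
      p_year   IN  NUMBER,
      p_result OUT SYS_REFCURSOR
    ) AS
    BEGIN
      OPEN p_result FOR
        SELECT product_line,
               SUM(sale_amt) AS total_sales
          FROM sales_fact
         WHERE fiscal_year = p_year
         GROUP BY product_line;
    END get_sales_summary;
    /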
Environment: IBM Cognos Data Manager 10.2/10.1.1/10.0/8.4, MS SQL Server 2008, Oracle 10g, TOAD, Cognos Framework Manager, and Cognos Report Studio.
Confidential
Sr. Software Engineer
- Bacardi is a centralized data warehouse for CSBB T - Cards used by business users; Source is a mainframe system. The Source-to-Bacardi migration was initiated as 70% of the data in Source and Bacardi is in sync.
- Analyzing the existing mainframe job process.
- Analyze and review functional requirements captured in the Data Migration Functional Specifications.
- Designing the HLD/LLD for each process.
- Preparing the high-level and detailed column mapping documents.
- Interpreting business rules and requirements captured in the Data Migration Functional Specifications.
- Interacting with the technical team to explain the functional requirements and convert them into technical specifications.
- Define the process for review and delivery of data migration documents.
- Handling mock runs and dress rehearsals.
- BODS installation
- Reviewing the BODS code for data migration.
- Identifying business rules from data owners and implementing them in BODS.
- Creating technical and business reconciliation reports to support the migration (a SQL sketch of a reconciliation check appears after this list).
- Identifying custom requirements for missing data elements and designing the solution.
- Analyzing the legacy system.
- Understanding the BODS standard content and assisting with usage of the standard content.
- Post Load Data Validations.
- Test case Preparation and reviewing with business team
- Training Data Owners and Business users
- Preparing Data Profiling reports
- Responsible for code migration to various environments.
- Design/Develop/Review of BODS ETL code.
- Created the data dictionary and reference data.
- Data Profile Management.
- Creating views for Crystal Reports.
- Scheduling of BODS ETL Jobs
- Design/Develop/Review of Data Quality Jobs using BODS as ETL
- Creating PL/SQL procedures and functions.
- Working with Oracle Text and CTXCAT search indexes.
- Performance Tuning.
- Creating a reporting engine which involved developing materialized views and UNIX scripts for Enterprise Reporting Model.
- Creating summarized dashboards using Oracle OLAP for analytical reporting.
- Version control through SVN
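A technical reconciliation report of the kind mentioned above can be driven by a query that compares row counts and amount totals between the legacy source and the target per load date. The sketch below uses hypothetical table names (legacy_txn, bacardi_txn) and portable SQL:

    -- Sketch: list only the load dates where source and target disagree
    -- on row count or transaction amount.
    SELECT COALESCE(s.load_date, t.load_date) AS load_date,
           s.src_rows, t.tgt_rows, s.src_amount, t.tgt_amount
      FROM (SELECT load_date,
                   COUNT(*)     AS src_rows,
                   SUM(txn_amt) AS src_amount
              FROM legacy_txn GROUP BY load_date) s
      FULL OUTER JOIN
           (SELECT load_date,
                   COUNT(*)     AS tgt_rows,
                   SUM(txn_amt) AS tgt_amount
              FROM bacardi_txn GROUP BY load_date) t
        ON s.load_date = t.load_date
     WHERE COALESCE(s.src_rows, -1)   <> COALESCE(t.tgt_rows, -1)
        OR COALESCE(s.src_amount, -1) <> COALESCE(t.tgt_amount, -1);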
Environment: SAP Business Objects Data Services (BODS) 3.2 (Designer, Repository Manager, DS Management Console), DB2, and UNIX.
Confidential
Sr. Software Engineer
CSBB T Customer Experience collects key channel operational metric data from numerous systems of record and manual reporting processes to build the consumer technology heatmap. The Central Metric Data Repository (CMDR) project will create a single solution for the collection, storage, analytics, and reporting of these metrics. The solution will enable more efficient resource utilization, improve data quality/control, and reduce reporting process variation.
- Re-designed the CMDR data model.
- Worked with the Ralph Kimball methodology for star and snowflake schemas, fact and dimension tables, and physical and logical data modeling.
- Designed the Security data model for CMDR application.
- Prepared CRUD Matrix.
- Preparing the high-level and detailed column mapping documents.
- Analyzed the specifications and identified the source data that needed to be moved to the data warehouse.
- Responsibilities included designing and developing mappings.
- Extracted, transformed, and loaded data from different sources (flat files and Oracle) into Oracle 10g.
- Developed various Mappings with multiple Sources, Targets, and Transformations using Informatica 8.1.
- Used heterogeneous files from Oracle, Flat Files as source.
- Designed lookup transformations to perform lookups on related data.
- Extensively worked on all the transformations like Filter, Aggregator, Expression, Router, Lookup, Update Strategy, Sequence generator, Rank.
- Worked on troubleshooting the Mappings and improving Performance.
- Created Packages and Procedure for CMDR application.
- Used SQL wherever necessary, inside and outside the mappings.
- Created materialized views for reporting (a DDL sketch appears after this list).
- Involved in end-to-end ETL implementation.
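As an illustration of the reporting materialized views mentioned above, the following Oracle sketch pre-aggregates metric data by month; mv_metric_monthly and metric_fact are hypothetical names:

    -- Sketch: a complete-refresh-on-demand materialized view that rolls
    -- metric facts up to one row per metric per month for reporting.
    CREATE MATERIALIZED VIEW mv_metric_monthly
      BUILD IMMEDIATE
      REFRESH COMPLETE ON DEMAND
    AS
    SELECT metric_id,
           TRUNC(metric_date, 'MM') AS metric_month,
           AVG(metric_value)        AS avg_value,
           COUNT(*)                 AS sample_cnt
      FROM metric_fact
     GROUP BY metric_id, TRUNC(metric_date, 'MM');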
Environment: Informatica PowerCenter 8.1 (Mapping Designer, Workflow Manager, and Workflow Monitor), Oracle 10g, SQL, PL/SQL, TOAD, ERWIN, Autosys, and UNIX.
Confidential
Sr. Software Engineer
- Jawwal, a cellular communications company, is implementing Siebel CRM, replacing its customer care functionality based on its legacy Customer Care and Billing systems, namely Eppix for post-paid customers and the Minsat rating engine for prepaid customers. The customer wants to replace most of the legacy databases with Oracle 10g. It is a target-driven data migration.
- Analyzing the current Legacy databases.
- Identifying the databases and schemas to be replaced in Phase 1.
- Identifying the tables in each database from where data needs tbe migrated.
- Documenting business rules for each legacy application.
- Created the data dictionary and reference data.
- Identifying the Source and Target Entities both in Legacy System and Siebel.
- Prepared CRUD Matrix.
- Preparing the high-level and detailed column mapping documents.
- Created packages and procedures to automate data profiling and ETL development (a PL/SQL profiling sketch appears after this list).
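A minimal sketch of the profiling automation described above: a PL/SQL procedure that records total, null, and distinct counts for one column of a legacy table. The profile_results table and both parameters are hypothetical; real use would validate the identifiers (e.g., with DBMS_ASSERT) before building dynamic SQL:

    -- Sketch: profile one column and log the result.
    CREATE OR REPLACE PROCEDURE profile_column (
      p_table  IN VARCHAR2,
      p_column IN VARCHAR2
    ) AS
      v_total    NUMBER;
      v_nulls    NUMBER;
      v_distinct NUMBER;
    BEGIN
      EXECUTE IMMEDIATE
        'SELECT COUNT(*), COUNT(*) - COUNT(' || p_column || '), ' ||
        'COUNT(DISTINCT ' || p_column || ') FROM ' || p_table
        INTO v_total, v_nulls, v_distinct;

      INSERT INTO profile_results
        (table_name, column_name, total_cnt, null_cnt, distinct_cnt, run_date)
      VALUES (p_table, p_column, v_total, v_nulls, v_distinct, SYSDATE);
      COMMIT;
    END profile_column;
    /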
Environment: SQL, PL/SQL, Oracle 10g, TOAD and Windows NT.
Confidential
Sr. Software Engineer
Electrolux is a world-leading producer of appliances and equipment for kitchen, cleaning, and outdoor use. An Enterprise Data Warehouse (EDW) has been developed and deployed for Electrolux-NA.
As a Developer:
- Migration of database.
- Re-created some of the existing job streams for better performance.
- Extraction of data from source systems present at different physical locations both for historical load and ongoing processes.
- Cleansing, Transformation and Application of business rules on the extracted data.
- Loading the transformed data into the data warehouse.
- Perform aggregations and calculations of the data warehouse data.
- Created Views for Generating reports.
As a Production Support:
- As a support person, I used to run all the job streams manually and monitor the scheduled jobs on Tivoli.
- Fix the issues on a daily basis.
- Handled enhancement work.
Environment: Cognos 8 BI Data Manager, SQL Server DTS, SQL Server 2000, Tivoli, and Windows NT.
Confidential
Software Engineer
- Employers Direct Insurance Company (EDIC) is an emerging leader in California workers' compensation insurance, focused on dramatically reducing loss ratios through aggressive management of claims, loss control, and fraud. The main objective of this project is to process the vast customer data into a warehouse to support various business requirements. Sources came from SQL Server and AS400. Analyzed the specifications and identified the source data that needed to be moved to the data warehouse.
- Responsibilities included designing and developing mappings.
- Extracted, transformed, and loaded data from different source databases (AS400 and SQL Server) to the Oracle data warehouse.
- Developed various Mappings with multiple Sources, Targets, and Transformations using Data Manager.
- Designed lookup transformations to perform lookups on related data.
- Used SQL wherever necessary, inside and outside the mappings.
- Created and ran fact builds and job streams.
- Created materialized views for report generation.
Environment: Cognos 8 BI Data Manager, Oracle 10g, SQL Developer, TOAD, Windows NT.
Previous Experience
Software Engineer, Nov 2004 to Sep 2006, Hyderabad, India.