Data Warehouse Lead Resume
USA
Summary:
Over 11 years of experience in IT, involved across the entire SDLC: understanding and analyzing business requirements and functional specifications by interacting with Business Analysts, project planning, design (logical and physical), development, testing, implementation and maintenance of various data warehouses/data marts and client/server applications in UNIX/Windows environments. This also includes effective communication and coordination within the team and with product managers/business, and strict process adherence (CMM).
Professional Experience:
- Planning and implementing projects while working in a fast-paced environment and resolving crises.
- Designed the data model for Party Gaming using the data modeling tool Unica.
- Designed the ETL models for multiple data marts, such as Dell Financial Systems (DFS) and Manufacturing.
- Worked with heterogeneous source systems such as flat files and XML files, loading them into the enterprise data warehouse using the Informatica and Ab Initio ETL tools.
- Wrote functional and technical specification documents for projects across the finance, manufacturing, insurance and reinsurance domains.
- Sound knowledge of various data warehouse modeling techniques, including the Bill Inmon and Ralph Kimball approaches.
- Sound knowledge of and experience in fine-tuning Informatica mappings, sessions and workflows.
- Extensive experience in Informatica ETL design, troubleshooting and debugging.
- Strong working experience with RDBMSs such as Oracle, SQL Server and Teradata.
- Designed Business Objects universes to satisfy end-user requirements for ad hoc reporting.
- Well-developed interpersonal and communication skills, with good documentation and presentation skills.
- Strong commitment and excellent analytical and problem-solving skills; currently leading a team of 8 members offshore and 3 members onsite.
- Expertise in all phases of the SDLC (system analysis, design, development, testing, deployment, support, documentation and configuration management).
- Good experience with UNIX shell scripting.
- Reputation for strong organizational skills, excellent communication skills, dedicated teamwork, attention to detail, and the ability to work under pressure, balance competing priorities and meet deadlines.
TECHNICAL SKILLS:
Databases : Teradata V13.1.1/V2R6, Oracle 10g/9i/8i, SQL Server 2008
Tools & Utilities : Informatica 9.0/8.6.1/7.x/6.x, Ab Initio, Microsoft Visio, Toad, Teradata SQL Assistant, Erwin, Business Objects 5.x, D2
Languages : SQL, PL/SQL, UNIX Shell Scripting, HTML
Operating System: UNIX (Sun), Windows 98, 2000, XP, GNU/Linux
Others : Control-M, Crontab and Autosys
Achievements:
- Received Striker at Dell award, Customer Experience award, Business Impact award, Individual Contributor award and Team award.
Projects:
Company : Confidential, USA Nov '11 - Till Date
Role : Data Warehouse Lead
Project : Underwriting Data Mart
Project Description:
The XL Re Underwriting Data Mart (UDM) is a data source for underwriting analytics, management information and reports on a platform common to all XL Re underwriting groups. Providing data for both operational and analytical use, the UDM strives to be the definitive source of underwriting data reported to management.
Responsibilities:
- Work with business users and managers to understand the business needs and design the data model to fulfill them.
- Provide the optimal ETL solution to extract and load data from the different source systems into the EDW.
- Establish and enforce technical standards for project deliverables.
- Design the high-level ETL data flow.
- Design the ETL reference pattern.
- Create the Technical Design Specification document.
- Create the source-to-target mapping document.
- Design POC mappings and help the ETL team with data acquisition techniques and loading strategies during the development phase.
- Optimize queries and Informatica code to improve load times (see the illustrative sketch after this list).
- Test and set up peer reviews for each module to minimize development errors.
- Create the Autosys jobs to run the ETL on a daily basis.
- Create change tickets to deploy the code into production.
- Handle and fix issues during the stabilization period.
- Prepare the support document and transition the project to the support team.
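The query tuning mentioned above typically pairs statistics collection with plan inspection in Teradata. The sketch below is only illustrative; the table and column names (udm_stg.claim_txn, udm.policy_dim, policy_id, txn_date) are assumptions, not the actual UDM schema.

    -- Collect statistics on the join and filter columns so the optimizer
    -- can choose a better plan (hypothetical table/column names).
    COLLECT STATISTICS ON udm_stg.claim_txn COLUMN (policy_id);
    COLLECT STATISTICS ON udm_stg.claim_txn COLUMN (txn_date);

    -- Inspect the access plan for a typical reporting query.
    EXPLAIN
    SELECT c.policy_id,
           SUM(c.txn_amount) AS total_incurred
    FROM   udm_stg.claim_txn c
    JOIN   udm.policy_dim p
      ON   c.policy_id = p.policy_id
    WHERE  c.txn_date BETWEEN DATE '2012-01-01' AND DATE '2012-12-31'
    GROUP  BY c.policy_id;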
Client : Confidential, Dover, USA
Project : CNG (Claims Next Generation)
Role : ETL Lead & Architect
Project Description:
The PM Claims organization is implementing a new claims processing application (CNG) built on Guidewire ClaimCenter; adoption of this product changes the system of record from the V2K ClaimsWorkBench database to the database that supports CNG.
The main goal is to build the new system for the claims subject area and decommission the existing process, as it does not serve the complete business requirements. Instead of getting the data from multiple sources in different forms that do not meet the requirements, the business decided to source it from a single point to make it an integrated solution and to maintain a single version of truth in the Personal Market warehouse.
Near-real-time data is provided through Golden Gate to the operational data store for NRT reports, as the business demands; these reports use ROLAP functionality and are refreshed every 10 minutes.
We also provide operational and analytical reports in support of the Claims Next Generation program for business analysis and decision making by business users, based on the dimensional model defined by PMIT.
Responsibilities:
- Wrote the functional specification documents based on the BRD.
- Prepared the project plan, lined up the resources and tracked project status.
- Validated that the data model captures all the functionality requested by business users.
- Optimized queries and fine-tuned the Informatica mappings.
- Designed the audit control mechanism for Golden Gate to ODS data validation.
- Designed the process to load type 2 data from the source (Oracle) to the operational data store using Golden Gate and Teradata functionality through triggers, procedures and macros for ~350 tables (see the illustrative sketch after this list).
- Coordinated and communicated with business analysts and technical leads (for any modeling or requirement changes).
- Supported the deployment phases with immediate resolutions whenever a data issue was identified.
- Proposed new, optimized solutions to make the data available for NRT reporting.
- Conducted code reviews and passed the review comments on to the onsite technical leads.
- Designed the automation mechanism for the deployment process and job scheduling.
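As a rough illustration of the type 2 loading described above, the sketch below shows the core expire-and-insert logic that a trigger or procedure on the ODS might apply when a changed row arrives from Golden Gate. All table, column and key names (ods.claim_hist, ods.claim_stg, claim_id 1001) are hypothetical, not the project's actual objects.

    -- Step 1: expire the current version of the changed row (hypothetical names).
    UPDATE ods.claim_hist
    SET    eff_end_ts  = CURRENT_TIMESTAMP,
           current_ind = 'N'
    WHERE  claim_id    = 1001        -- key of the changed row; a parameter in practice
    AND    current_ind = 'Y';

    -- Step 2: insert the new version as the open (current) record.
    INSERT INTO ods.claim_hist (claim_id, claim_status, eff_start_ts, eff_end_ts, current_ind)
    SELECT claim_id,
           claim_status,
           CURRENT_TIMESTAMP,
           TIMESTAMP '9999-12-31 23:59:59',
           'Y'
    FROM   ods.claim_stg
    WHERE  claim_id = 1001;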
Environment: MS Visio, Erwin, Informatica 8.6.1, Teradata 13.1.1, ESP, Oracle, Golden Gate, UNIX
Projects:
Company : Confidential, Jun '09 - Sep '10
# Project : DFS
Role : ETL Architect & Technical Lead
Project Description:
The objective of this project is to generate reports that help the DFS business understand the revenue generated and the loss on investment. The data comes from two different source systems, Info Lease and Vision Plus, into the Teradata DDW. During this phase we are generating Delinquency reports, which give details about DPD (Days Past Due) customers.
# Project: ODM GLOBAL FULFILLMENT June 08 to May 09
Role: ETL Architect & Technical Lead
Project Description:
The objective of this project is to bring in order status information from different applications: OFS, O2 and WTCS. A single unified data model is designed to load the data from three regions: EMEA, DAO and APJ. This helps the business use the integrated data to track order status.
# Project: EqualLogic Integration: Jan '08 to May '08
Role: ETL Architect
Project Description:
The objective of this project is to extract the EqualLogic order data from Dell Data Warehouse tables (Teradata), load it into XML files and place the files on a vendor-specific server called Tumbleweed. The data comes from different departments such as Accounts, Finance, Manufacturing, Sales and Services. The goal of this project is to generate reports such as order status, shipping and billing details, and service details specific to EqualLogic customers.
Source: Teradata
Target: XML
# Project: EMF LOGISTICS: Aug'07 to Dec'07
Role: ETL Architect and Technical Lead
Project Description:
The EMF Ireland Speedway system contains shipping information on all EMEA orders. It contains ship code rules that determine which carriers and fulfillment centers (storage places for vendor-owned inventory) are to be used; it sends out manifests and takes data feeds from the fulfillment centers (inventory updates as well as invoice and serial numbers for the items picked and shipped). The carriers send in POD status updates as well as freight cost data. In addition, the Speedway system supplies 100% of the data to the Order Inquiry Tool and the FCC tool, and it provides reporting access.
Currently, European users rely only on the Speedway feed for reporting purposes. When the ION Poland facility goes live, this shipping data will not be available in the Speedway feed, and users would have to go to the ION database for it. Even though ION and Speedway are similar to each other, they are not data-integrated; hence, in order to get shipping details for all of Europe, one would have to go to two different sources, Speedway and ION.
To resolve this issue, as part of the DDW EMFP Logistics project, FDL serves as the central repository in which data is consolidated from the Speedway and ION databases. FDL serves as the source to the DDW.
Responsibilities:
- Provided the optimal ETL solution to extract and load data from the different source systems into the EDW.
- Established and enforced technical standards for project deliverables.
- Designed the high-level ETL data flow.
- Designed the ETL reference pattern.
- Created the Technical Design Specification document.
- Reviewed the mapping documents with the Data Architect.
- Reviewed all DDLs with the Data Architect to make sure the indexes, compression and partitioning were as expected (see the illustrative DDL sketch after this list).
- Designed POC mappings and helped the ETL team with data acquisition techniques and loading strategies during the development phase.
- Tested and set up peer reviews for each module to minimize development errors.
- Created the Control-M drafts and scheduled the jobs.
- Created change tickets to deploy the code into production.
- Handled and fixed issues during the stabilization period.
- Prepared the support document and transitioned the project to the support team.
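As context for the DDL reviews noted above, the sketch below shows the kind of points that were checked in Teradata DDL: primary index choice, multi-value compression and date-based partitioning. The table, column and value names (ddw.shipment_fact, carrier_cd, and so on) are assumptions for illustration, not the actual warehouse DDL.

    -- Hypothetical fact table with compression and a partitioned primary index.
    CREATE MULTISET TABLE ddw.shipment_fact
    (
        shipment_id   INTEGER       NOT NULL,
        order_id      INTEGER       NOT NULL,
        ship_date     DATE          NOT NULL,
        carrier_cd    CHAR(10)      COMPRESS ('UPS', 'DHL'),
        ship_status   CHAR(10)      COMPRESS ('SHIPPED', 'DELIVERED'),
        freight_cost  DECIMAL(12,2)
    )
    PRIMARY INDEX (order_id)
    PARTITION BY RANGE_N (ship_date BETWEEN DATE '2007-01-01'
                                    AND     DATE '2010-12-31'
                                    EACH INTERVAL '1' MONTH);

    -- Secondary index to support lookups by shipment.
    CREATE INDEX (shipment_id) ON ddw.shipment_fact;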
Environment:
Windows XP, UNIX, Informatica 8.x, Teradata, Oracle, Business Objects, Control-M, Test Director.
# Project: GEARS (Global Enterprise Accounts Receivable System) April'07 to July'07
Role: Individual Contributor
Project Description:
The objective of this Project is to bring GEARS (Oracle A/R) data into the data warehouse so that the business can do reporting on it. Also as part of this, Finance will implement a "GAAP Calculator" application to handle revenue recognition processes. GEARS (Oracle A/R) will be the platform for the GAAP Calculator "engine". GAAP Calculator will be seamlessly integrated with Dell's Oracle AR. GEARS and GAAP Calculator will partner to develop a single source of data that is synchronized to ensure cut-off period matches for invoicing and revenue recognition. Currently, most of Dell's revenue recognition calculations are performed within the Dell Data Warehouse. These calculations are based on general business rules around revenue recognition but do not adequately address the audit trail and internal controls required under Sarbanes-Oxley.
# Project: CCFRAUD Dec'06 - Mar'07
Role: Individual Contributor
Project Description:
The objective of this project is to build one consistent process for fraud detection across the globe and bring that information into the data warehouse for reporting purposes. It replicates functionality from the US CCFraud tool as a baseline, with accommodations for regional differences where they need to exist, and implements global requirements including currency, date formats and language logic so the tool can be tailored to regional needs.
Vendor modeling is used where external address/name/phone information is available, and internal modeling where it is not.
Responsibilities:
- Created the development, testing and production sandboxes with the help of the ETL engineering team.
- Created the functional specification based on the source-to-target matrix document.
- Created the database objects such as tables, views and macros in Teradata (see the illustrative sketch after this list).
- Developed Ab Initio graphs to bring the data into the data warehouse.
- Performed unit testing and peer reviews at the end of each stage to minimize development errors.
- Created Control-M drafts for each process and scheduled the jobs.
- Created change requests to deploy the code into the production boxes.
- Provided 24x7 support during the stabilization phase.
- Created the support document and transitioned the project to the production support team.
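To illustrate the kinds of Teradata objects created, the sketch below shows a simple reporting view and a parameterized macro. The object, column and parameter names (ccf.v_fraud_case, ccf.get_cases_by_region, region_cd) are hypothetical, not the project's actual code.

    -- Reporting view over open fraud cases (hypothetical schema).
    CREATE VIEW ccf.v_fraud_case AS
    SELECT case_id,
           order_id,
           region_cd,
           fraud_score,
           reported_dt
    FROM   ccf.fraud_case
    WHERE  case_status <> 'CLOSED';

    -- Parameterized macro returning cases for a given region.
    CREATE MACRO ccf.get_cases_by_region (in_region_cd CHAR(3)) AS
    (
        SELECT case_id,
               order_id,
               fraud_score
        FROM   ccf.v_fraud_case
        WHERE  region_cd = :in_region_cd;
    );

    -- Usage:
    EXEC ccf.get_cases_by_region ('EMA');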
Environment:
Windows XP, UNIX, Ab Initio, Oracle 9i, Teradata, Control-M, D3, Test Director.
Projects:
Company : Confidential, April '06 - Nov '06
Client : Party Gaming
# Project : CAMPAIGN MANAGEMENT
Role : System Analyst
Project Description:
This project involved implementing a campaign management tool, Unica's Affinium Campaign, in the Party Gaming environment.
Affinium Campaign allows marketers to create, test, optimize, deploy and analyze multi-wave, cross-channel marketing campaigns quickly and easily. It is a Web-based solution that provides an easy-to-use graphical user interface that encapsulates complex SQL queries to support the direct marketing processes of selection, suppression, segmentation, sampling, creating output lists, populating contact history, and closed-loop response tracking and reporting.
In Party Gaming, the Affinium tool is deployed on top of its enterprise warehouse on Teradata.
Responsibilities:
- Understood and gathered the requirements based on the BRD.
- Validated that the data model captures all the functionality requested by business users.
- Designed and developed the Informatica interface for exporting campaign data on Teradata to the OLTP system on Oracle (see the illustrative sketch after this list).
- Designed and developed the Informatica interface for importing the response data from the OLTP systems on Oracle into Teradata.
- Defined and maintained the Teradata CM data model for Unica.
- Assisted the marketing team in designing flowcharts in Affinium based on the campaign requirements.
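As an illustration of the export interface, the sketch below shows the kind of extract the Teradata side might run before the rows are pushed to the Oracle OLTP system. The table and column names (cm.contact_history, cm.player_dim) are assumptions, not the actual campaign mart schema.

    -- Pull the previous day's campaign contacts for the export interface
    -- (hypothetical table/column names).
    SELECT ch.campaign_cd,
           ch.cell_cd,
           ch.player_id,
           p.email_addr,
           ch.contact_dt
    FROM   cm.contact_history ch
    JOIN   cm.player_dim p
      ON   ch.player_id = p.player_id
    WHERE  ch.contact_dt = CURRENT_DATE - 1;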
Environment: Teradata 12.0, Informatica 8.6.0, BO XI, Siebel, CISCO scheduling tool Dollar Universe, Toad, Test Director, Remedy, UNIX.
Company : Confidential, Nov'05 - March'06
Client : Confidential
# Project : Integrated Bonus System
Role : System Analyst
Project Description:
The IBS is populated with data related to the poker business, such as the number of players who visited the poker site, game types, bonus rules and the bonus awarded. The main purpose of IBS is to provide the primary source of data for reporting and analysis of the following:
- Game information in terms of Game Type and Sub Game types.
- Player information in terms of email address and bank account information.
- Bonus Rules and related Bonus information.
Responsibilities:
- Understood and analyzed the data model.
- Prepared the source-to-target matrix (mapping) document, which also lists the business rules.
- Wrote the Functional Specification document.
- Created the target tables and views in Teradata.
- Developed Informatica ETL jobs for data extraction, transformation, loading and data cleansing using several built-in transformations and standards (see the illustrative sketch after this list).
- Created, configured, scheduled and ran sessions, worklets and workflows.
- Performed unit testing and peer reviews at the end of each stage.
- Documented all of the above phases of development, including manuals for the implementation plan, mapping document, input file formats, and DDLs for staging and final tables.
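To illustrate the cleansing applied during the load, here is an SQL sketch of rules of the kind the Informatica transformations enforced; the real logic lived in the mappings rather than SQL, and the staging and target names (ibs_stg.bonus_award_stg, ibs.bonus_award) are hypothetical.

    -- Trim, standardize and default the staged bonus rows before loading them
    -- into the target (hypothetical tables and columns).
    INSERT INTO ibs.bonus_award (player_id, bonus_rule_cd, bonus_amt, award_dt)
    SELECT CAST(TRIM(stg.player_id) AS INTEGER),
           UPPER(TRIM(stg.bonus_rule_cd)),
           COALESCE(stg.bonus_amt, 0),
           CAST(stg.award_dt AS DATE FORMAT 'YYYY-MM-DD')
    FROM   ibs_stg.bonus_award_stg stg
    WHERE  stg.player_id IS NOT NULL;   -- reject rows with no player key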
Environment: Windows XP, UNIX, Informatica 7.1.3, Oracle 9i, Teradata and Control-M.
Projects:
Company : Confidential, India Sep'04 - Oct' 05
Client : EMC
Project : EMC-IDW
Role : Sr. ETL Developer
Project Description:
The IDW gets its data from various systems such as Oracle Apps, Sybase, SQL Server and flat files. It is populated with data for the Sales and Marketing and Customer Service data marts. The main purpose of the Sales and Marketing data mart is to provide the primary source of data for reporting and analysis of the following:
Financial information in terms of sales orders, specifically booking, billing, shipping and backlog transactions; inventory information in terms of balances and transfer transactions; and customer installation information for the sales orders.
Responsibilities:
- Understood the EMC IDW architecture.
- Analyzed the source data, understood the business rules and obtained clarifications from the client.
- Developed ETL jobs for data extraction, transformation, loading and data cleansing using various Informatica transformations and standards.
- Tested and set up peer reviews for each module to minimize development errors.
- Created, configured, scheduled and ran Informatica sessions, worklets and workflows.
- Monitored the jobs using Control-M and sent the daily status report to the internal project team and the client.
- Held regular, timely working sessions with the client during the data validation phase.
- Discussed the performance criteria with the client and thereby tuned the jobs for optimal performance.
- Documented the entire project end to end, including manuals for technical specifications, the mapping document, input file formats, DDLs for staging and final tables, the process monitoring manual, and the issue/escalation template.
Environment: Windows 2000, Informatica 7.1, Oracle 8i, UNIX, Control-M, TOAD, Test Director 8.0
Company : Confidential, India May '03-Aug '04
Client : Confidential
Project : Citizen Friendly Services of Transport Department (CFST) Phase I
Role : Team Lead
Project Description:
The main scope of the system is to automate the entire Transport Department across the state. The 34 computerized centers in A.P. will be networked so that sharing of data is possible, and the data is also replicated to a central server located at STA, Hyderabad.
In the first phase of the project, the Transport Department provides three categories of services: licensing, registration of transport and non-transport vehicles, and issue of fitness certificates for transport vehicles.
The benefits of computerization are a centralized database system, a paperless office, avoiding duplication, ease of administration, increased revenue through effective tax collection, quick service to customers, avoiding the role of agents, and PVC cards for licenses and registrations with photo and signature.
Responsibilities:
- Installed the Oracle software and created the database.
- Developed and enhanced SQL and PL/SQL procedures.
- Loaded the agent and license data into the system.
- Managed backup and recovery.
- Troubleshot the application software and performed database administration.
- Assisted the RTO staff with their daily queries and the issues they faced, such as correcting wrong entries made from the front end by going to the respective tables and applications at the back end and correcting or revoking them (see the illustrative sketch after this list).
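As a rough illustration of such a back-end correction, the sketch below shows a simple Oracle SQL fix of a wrongly captured licence record. The table, columns and values (cfst.licence_master, licence_no, the name and date) are hypothetical, not the department's actual schema.

    -- Correct a wrongly entered holder name and validity date
    -- (hypothetical table, columns and key value).
    UPDATE cfst.licence_master
    SET    holder_name = 'CORRECTED NAME',
           valid_upto  = TO_DATE('31-05-2005', 'DD-MM-YYYY')
    WHERE  licence_no  = 'AP09X20030012345';

    COMMIT;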
Environment: Windows NT, Oracle 8i, PL/SQL, D2K, TOAD
EDUCATION:
- Bachelor of Science