Data Modeler Designer Resume Profile
Experience Summary
- Worked extensively in Data Warehousing and Business Intelligence as a Data Architect, Data Modeler, ETL Designer and BI/System Analyst of information systems
- Designed data models for Enterprise Data Warehouse, OLAP, OLTP and MDM (Master Data Management) systems, adhering to industry standards for data warehouses
- Designed data models using both dimensional modelling and relational modelling techniques
- Strong expertise in customisation of insurance-standard data models such as IBM's IIW (Insurance Information Warehouse), the ACORD LAH data model and Teradata's FSLDM
- Performed extensive data profiling of source system data to identify data quality issues up-front
- Worked across the complete Software Development Life Cycle (SDLC) using both waterfall and Agile methodologies
Technology
Important hardware, software products, tools and methods that I have worked with are listed below:
- Operating Systems:
- Windows Variants - 7, Vista, NT, 2003, XP
- Unix, Linux Distributions
- Software Products / Data Models:
- IBM's IIW Data Model
- Teradata FSLDM Model
- ACORD LAH Model
- Industry-Leading Admin Systems:
- Oracle's AdminServer Admin System, FASAT Agency Admin System
- Databases:
- Teradata V13, Oracle 11g, 10g
- Programming Languages:
- SQL, PL/SQL, Java 5.0, C, C++, COBOL
- Tools:
- Industry Leading Tools:
- CA ERwin Data Modeler
- HP QualityCenter
- HP QuickTestPro
- Informatica Products:
- Informatica PowerCenter 9.1/9.0/8.6
- Informatica B2B and Analyst
- Informatica MDM
- Other Tools:
- Data Masketeer
- Legacy Tools:
- CA Realia COBOL V3.3, Norcom Screen IO
- Pervasive SQL 9.0
The details of the various assignments that I have handled are listed here, in chronological order.
Assignments
Project | Confidential |
Customer | Confidential |
Location | Confidential |
Period | Confidential |
Description | The project's objective is to decommission the existing legacy data warehouse on a mainframe DB2 database and migrate both Policies and Quotes of Personal LOBs (Home, Motor and Travel) to a new database, PDL (Personal Data Layer), on the Teradata platform. As part of the decommissioning, existing DB2 marts are retired and new data marts over PDL are being constructed to meet business users' reporting needs. PDL is an agnostic data model designed to fit all insurance lines of business, both Personal and Commercial. |
Role | Confidential |
Database | Confidential |
Tools | CA ERwin Data Modeler, Informatica PowerCenter v9.1 |
Responsibilities | Analyse the legacy data structure and its data elements, discuss their usage with business and IT stakeholders, and produce a data dictionary. Prepare the Logical Data Model for the Underwriting and Customer subject areas using CA ERwin Data Modeler. Analyse the existing database applications and design an Enterprise Conformed Dimension Bus that works for both the existing database applications and PDL. Design the Physical Data Model for Teradata using CA ERwin Data Modeler. Perform comprehensive data analysis of the 8 source systems' data present in the legacy MI warehouse. Create mapping documents for the Informatica PowerCenter ETL tool for all 8 source systems. Design the data acquisition and data integration strategy for all 8 source systems. Coordinate with design and development teams for smooth design and development efforts. Create PDL data marts for business needs using dimensional modelling principles. |
Project | Confidential |
Customer | Confidential |
Location | Confidential |
Period | Confidential |
Description | The Guidewire Implementation Project provides new applications for policy and claims administration, comprising Guidewire PolicyCentre (PC) and Guidewire ClaimsCentre (CC) integrated into the existing Aviva IT landscape. The goal of this project is to design the Integrated Data Store (IDS) following the Teradata FSLDM model, and to extend the usage of IDS by feeding data to the Finance EDW and UN Sanctions downstream applications. |
Role | Data Modeler Designer |
Database / Data Model | Teradata V13 / Teradata FSLDM V11 |
Tools | CA ERwin Data Modeler, Informatica PowerCenter V9.1 |
Responsibilities | Analyse the Solution Outline Design (SOD) and gather both functional and non-functional requirements. Discuss with business and IT stakeholders and prepare the Logical Data Model using CA ERwin Data Modeler, adhering to Teradata FSLDM. Design the Physical Data Model for Teradata using CA ERwin Data Modeler. Perform comprehensive data analysis of source data formatted in XML and XSDs. Create mapping documents for the Informatica PowerCenter ETL tool. Design the data acquisition and data integration strategy. |
Project | Confidential |
Customer | Confidential |
Location | Confidential |
Period | Confidential |
Description | The MDM Broker Data Hub project is a transformational initiative that changes the way Aviva Canada does business with brokers by enhancing the broker experience, providing a single golden source of truth for master data of all broker information that is accessible across the enterprise, multiple applications and business units. The project involves four major work components. First, broker information is extracted from various source systems to the MDM Landing Area using Informatica PowerCenter. Second, Informatica MDM matches multiple broker records and merges them into one best version of truth (BVT) of broker information. Third, a new broker management UI application is created using J2EE technologies following SOA methodologies and the MDM SIF (Services Integration Framework). Fourth, the BVT of broker information is synchronized with various downstream applications for smooth functionality. |
Role | MDM Data Modeler, MDM/Siperian Developer, Informatica Data Analyst |
Database / Data Model | Oracle 11g / Custom Data Model |
Tools | CA ERwin Data Modeler, Informatica MDM V9.1 |
Responsibilities | Analyse the BRD and gather both functional and non-functional requirements. Perform comprehensive data analysis of the source data. Create mapping documents for the ETL tool. Prepare the Logical Data Model (LDM) for the Broker MDM hub using ERwin Data Modeler, satisfying both MDM needs and the Broker Management UI application. Prepare the Physical Data Model (PDM) against the Oracle 11g database using ERwin Data Modeler. Configure Landing Tables, Stage Tables and Base Tables in the Informatica MDM tool based on the PDM created. Create mappings between Landing, Staging and Base Tables for master data management. |
Project | Confidential |
Customer | Confidential |
Location | Confidential |
Period | Confidential |
Description | Solvency II Pillar 1 requires insurance organizations to measure the change in the market value of assets due to various risk factors, and to perform stress tests to determine their contribution to risk and their impact on Aviva's financial position. Pillar 3 requires organizations to produce the asset portion of the QRT for Aviva to review its current financial position and comply with external regulatory reporting requirements. The IT solution includes extracting the asset information Aviva invested with Aviva Investors (AINA) and loading it into the EDW and then into SAS Analytics for further consumption. |
Role | Data Modeler |
Database / Data Model | Oracle 11g / IBM's IIW V8.4 Data Model |
Tools | ERwin Data Modeler, Multi-Model Mapper (MMM), Teradata SQL Assistant |
Responsibilities | Analyse the BRD and gather both functional and non-functional requirements for both AUSA and ACAN. Perform comprehensive data analysis of the source data. Prepare the Logical Data Model (LDM) based on IBM's IIW Enterprise Model using ERwin Data Modeler. Prepare the Physical Data Model (PDM) against the Oracle database using ERwin Data Modeler. Generate database scripts for Oracle. Configure IIW's Multi-Model Mapper for the Assets data elements and generate hyperlinks. |
Project | Confidential |
Customer | Confidential |
Location | Confidential |
Period | Confidential |
Description | The objective is to obtain the Contingent Profit Commission (CPC) component from existing Aviva systems so as to leverage the CPC modules in the Igloo internal model. The results of the CPC modelling will determine its contribution to risk and assist in projecting its impact on Aviva's financial position as part of the Solvency II regulation. |
Role | Data Modeler |
Database / Data Model | Oracle 11g / IBM's IIW V8.4 Data Model |
Tools | ERwin Data Modeler, Multi-Model Mapper (MMM), Teradata SQL Assistant |
Responsibilities | Analyse the BRD and gather both functional and non-functional requirements. Perform comprehensive data analysis of the source data. Prepare the Logical Data Model (LDM) based on IBM's IIW Enterprise Model using ERwin Data Modeler. Prepare the Physical Data Model (PDM) against the Oracle database using ERwin Data Modeler. Generate database scripts for Oracle. Configure IIW's Multi-Model Mapper for the Assets data elements. |
Project | Confidential |
Customer | Confidential |
Location | Confidential |
Period | Confidential |
Description | Aviva USA and Aviva Canada, being subsidiaries of the UK-based Aviva Group, have to comply with the Solvency II regulation set forth by the European Union. The engagement involves creation of a Data Architecture as per TOGAF architecture principles and proposing the fitment of IBM's IIW for Solvency II data needs. My focus in the engagement was to analyse the prepared Data Architecture, verify the fitment of Solvency II data requirements (the Solvency II reports RTS, SFCR and QRT, and Internal Model data requirements) against IIW v8.3 and v8.4, and advise the Architecture team on data model selection. I was also required to prepare a Conceptual Data Model in compliance with IIW V8.4. |
Role | Data Architect, Data Modeler |
Data Model | IBM's IIW V8.4 Data Model |
Tools | Erwin Data Modeler |
Responsibilities | Understand the Aviva Group requirements to comply with the Solvency II regulation. Perform high-level analysis of the data flow from various systems as part of the Data Architecture. Interview and hold discussions with business and IT stakeholders, and prepare Data Architecture and interaction diagrams following TOGAF (The Open Group Architecture Framework). Perform fitment tests of IIW V8.3 and V8.4 against Solvency II data elements. Prepare the Conceptual Data Model (CDM) in ERwin as per IBM's IIW Business Model. |
Project | Confidential |
Customer | Confidential |
Location | Confidential |
Period | Confidential |
Description | Aviva North America has initiated a finance transformation initiative to restructure finance processes and solutions across the North American region (Canada and USA). Data from different source systems are loaded into the Insurance Annuity Schema (IAS), which acts as an integration layer. The financial (FI) data is fed into the Oracle E-Business Suite through the Financial Accounting Hub (FAH) for GL processing, and the non-financial or management information (MI) data into the Financial Data Store for management reporting. The processed GL data from Oracle E-Business Suite is then transported to the Data Store from FAH R12 and then to the reporting layer, which provides enhanced reporting capabilities to the business community accessing the system. |
Role | Data Modeler, Business Systems Analyst, ETL Designer |
Database / Data Model | Teradata V13 / IBM's IIW V8.3 Data Model |
Tools | ERwin Data Modeler, HP Quality Centre, Informatica PowerCenter, Teradata SQL Assistant, Oracle SQL Developer |
Responsibilities | Gather business and functional requirements and the scope of activities across all streams, and prepare the Specification Requirement Document (SRD). Analyse source and target systems in all streams. Provide mappings of data elements between source and target for the ETL process. Analyse and understand the ACORD data model for source field mappings. Prepare the Logical Data Model (LDM) in ERwin as per IBM's IIW Enterprise Model. Assist in preparing the Physical Data Model (PDM) of the Data Store using the Teradata database. Prepare ETL mapping sheets for the data integration. Perform testing and defect management in the HP Quality Centre application. |
Project | Confidential |
Customer | Confidential |
Location | Confidential |
Period | Confidential |
Description | The project involves quality assurance in terms of functional and end-to-end testing of the AdminServer application through automated and manual test beds. AdminServer was chosen to assist in the establishment of a standardized annuity business process on a single platform for Aviva USA, to ensure consistency of administrative services. Being a transaction-based system, once the contract information is keyed in, it provides real-time financial transactions, reversals and unbundled fund functionality within a 100% browser-based system. Testing covers end-to-end New Business processing and In-force Administration activities of Annuity Administration. |
Role | Quality Assurance Analyst / Automation Engineer |
Tools | Oracle's AdminServer Admin System, FASAT Agent Admin System, HP Quality Centre (QC), HP QuickTest Pro (QTP) |
Responsibilities | Analyse business requirements and the scope of the application. Estimate testing effort and schedule for both test case design and execution. Design effective test cases using the various techniques of functional testing. Interact constantly with BAs on changes in business requirements. Perform testing and defect management in the HP QC application. Conduct walkthroughs of test documentation with the client's Business Analyst and Quality Lead. Perform regression testing of AdminServer on every patch. Ensure the overall quality of assigned tasks as per the client's quality policy, CMMI Level 5 standards and TCS' in-house quality policy iQMS (Integrated Quality Management System) in TCS' integrated Project Management System (iPMS). |
Project | Confidential |
Customer | Confidential |
Location | Confidential |
Period | Confidential |
Description | The primary goal of the project is to perform functional testing and system testing of web applications as well as stand-alone applications. The main phases of testing are documentation of test cases and testing of the application. Documentation involves analysing the requirements and preparing effective test cases. Testing concentrates on executing the prepared test cases and logging defects to ensure quality. The most challenging task is constant interaction with the client on changing business requirements and tweaking the test documentation as requirements change. |
Role | Quality Assurance Analyst |
Tools | HP Quality Centre, Compuware TrackRecord |
Responsibilities | Analyse business requirements and the scope and scalability of the application. Estimate testing effort and schedule. Prepare the test plan. Interact constantly with clients on changes in business requirements. Coordinate and manage the testing activities at offshore. Manage defects using the HP Quality Centre and Compuware TrackRecord applications. Review the offshore deliverables. Conduct walkthroughs of test documentation with the client's Business Analyst, Quality Lead and Development Lead. Execute smoke testing of the application. Ensure the overall quality of the project as per the client's quality policy, CMM Level 5 standards and TCS' in-house quality policy iQMS (Integrated Quality Management System) in TCS' integrated Project Management System (iPMS). Prepare the test summary report at the end of testing. |
Project | Confidential |
Customer | Confidential |
Location | Confidential |
Period | Confidential |
Description | The objective of the project is to migrate the legacy application 'Select Universal Life' from a DOS environment to a Windows 2003 Server environment. The Character User Interface (CUI) screens in the DOS environment were converted to a Graphical User Interface (GUI) using GUI Screen IO. The existing flat-file database, BTRIEVE, was upgraded to Pervasive SQL 9.0. Reports that had been generated from EZPage were converted to be generated using Report Writer. |
Role | COBOL Developer |
Solution Environment | Windows 2003, CA Realia COBOL V3.3, Norcom Screen IO, Pervasive SQL 9.0 |
Tools | TCS Masketeer tool, CA Realia Workbench, HP Quality Center, Compuware TrackRecord |
Responsibilities | Involved in customer interaction with the onsite team for requirement gathering. Masked production data using the TCS Masketeer tool for use in development and testing environments. Migrated Realia COBOL programs to Realia II COBOL Workbench 3.3 in the Windows 2003 environment. Migrated Screen IO programs to GUI Screen IO in the Windows 2003 environment. Migrated the data model from the Btrieve 6.1 database to Pervasive SQL 9.1. Migrated Level 2 Report Writer programs to Report Writer II 3.17. Performed code development, walkthroughs and testing. Fixed identified bugs by finding the root cause. Involved in end-to-end activities for the entire life cycle of the migration project. |