Data Modeler Resume
PROFESSIONAL SUMMARY
- About eight years of extensive experience in the complete Software Development Life Cycle (SDLC), covering Requirements Management, Data Analysis, Data Modeling, System Analysis, Architecture and Design, Development, Testing, and Deployment of business applications.
- Extensive experience in designing Data Models for OLTP and OLAP database systems.
- Strong Data Modeling experience using ER diagrams, Dimensional Data Modeling, Star Schema modeling, and Snowflake modeling using tools like Erwin and Embarcadero ER/Studio.
- Experienced in gathering and translating business requirements into technical designs and developing the physical aspects of a specified design.
- Extensive experience in Database Design, Normalization, Selective De-normalization, Forward/Reverse Engineering of applications, and Conceptual, Logical, and Physical Data Modeling using Erwin, E/R Studio, and Sybase PowerDesigner.
- Expert knowledge of SDLC (Software Development Life Cycle) and good experience in business analysis: reviewing, analyzing, and evaluating business systems and user needs, business modeling, and document processing.
- Possess strong Conceptual and Logical Data Modeling skills, with experience in JAD sessions for requirements gathering, creating data mapping documents, and writing functional specifications.
- Experienced in conducting JAD sessions, interacting with various business users, and acting as a liaison between the business and development teams.
- Designed and developed Data Marts following Star Schema and Snowflake Schema methodology, using industry-leading Data Modeling tools like ERWIN and EMBARCADERO ER Studio (a schema sketch follows this summary).
- Experienced in optimizing performance in relational and dimensional database environments by making proper use of Indexes and Partitioning techniques.
- Created an enhanced logical model in 3NF using ER/Studio, resolving many-to-many relationships between entities with associative tables.
- Performed operations such as Data Cleansing, Data Scrubbing, and Data Profiling, and maintained data governance.
- Gathered and documented Functional and Technical Design documents.
- Experienced in preparing Business Process Re-engineering Models.
- Assisted the quality assurance team in preparing test scripts and test cases.
- Possess strong analytical, verbal, and interpersonal skills that help in communicating with developers and team members.
- Proficient in using MS Office, including MS Visio and MS Project.
- Enthusiastic and project-oriented team player with solid communication and leadership skills and the ability to develop different solutions for challenging client needs.
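For illustration, a minimal sketch of the star schema pattern referenced in this summary, using hypothetical sales fact and dimension names (ANSI SQL; the actual models were built in Erwin/ER Studio against client schemas):

    -- Hypothetical customer dimension (illustrative names only)
    CREATE TABLE dim_customer (
        customer_key   INTEGER      NOT NULL PRIMARY KEY,  -- surrogate key
        customer_id    VARCHAR(20)  NOT NULL,              -- natural/business key
        customer_name  VARCHAR(100),
        state_cd       CHAR(2)
    );

    -- Hypothetical date dimension
    CREATE TABLE dim_date (
        date_key       INTEGER      NOT NULL PRIMARY KEY,  -- e.g. 20120315
        calendar_date  DATE         NOT NULL,
        month_nbr      SMALLINT,
        year_nbr       SMALLINT
    );

    -- Fact table at transaction grain: one row per sale line, keyed by the dimensions above
    CREATE TABLE fact_sales (
        date_key       INTEGER       NOT NULL REFERENCES dim_date (date_key),
        customer_key   INTEGER       NOT NULL REFERENCES dim_customer (customer_key),
        quantity       INTEGER,
        sales_amt      DECIMAL(12,2)
    );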
Education: Bachelor of Business Administration
PROFESSIONAL EXPERIENCE
Client: Confidential - Bensenville, IL    March 2012 – Feb 2013
Role: Data Modeler
Confidential has more than 6 million customers, $5 billion in annual revenue, and provides services to customers in 25 states. As a data modeler, my role was to design and implement the data warehouse to support a 360-degree view of the customer and to build a point of integration for historical and current data from all major customer touch points. The project was to design, develop, and maintain a data warehouse presenting an integrated, consistent view of enterprise-wide data, with a Decision Support System feature to compare and analyze product prices, quantities, and customer profiles without compromising any customer information.
Responsibilities:
- Attended and participated in information and Requirements Gathering sessions.
- Ensured that Business Requirements can be translated into Data Requirements.
- Created Business Requirement Documents (BRDs), such as the SRS and FRS, and integrated the requirements with underlying platform functionality.
- Translated Business Requirements into working Logical and Physical Data Models.
- Developed the Logical and physical data model and designed the data flow from source systems to Teradata tables and then to the Target system.
- Designed the technical specifications document for Teradata ETL processing of data into the master data warehouse and strategized the integration test plan and implementation.
- Developed complex MultiLoad and FastLoad scripts for loading data into Teradata tables from legacy systems (see the FastLoad sketch at the end of this section).
- Used advanced data modeling concepts such as Family of Stars, Conformed Dimensions, and the Bus Matrix to handle complex situations.
- Designed and Developed Use Cases, Activity Diagrams, and Sequence Diagrams using Unified Modeling Language (UML).
- Involved in the analysis of the existing claims processing system, the mapping phase according to functionality, and the data conversion procedure.
- Performed Normalization of the existing OLTP systems (3NF) to speed up DML statement execution.
- Designed Star and Snowflake Data Models for Enterprise Data Warehouse using ER Studio.
- Data modeling in Erwin; design of target data models for enterprise data warehouse (Teradata).
- Created and Maintained Logical Data Model (LDM) for the project. Includes documentation of all Entities, Attributes, Data Relationships, Primary and Foreign key Structures, Allowed Values, Codes, Business Rules, Glossary Terms, etc.
- Worked extensively on transaction grain, periodic snapshot grain, and accumulating snapshot grain while designing dimensional models.
- Validated and updated the appropriate LDMs against Process Mappings, Screen Designs, Use Cases, Business Object Model, and System Object Model as they evolved and changed.
- Designed the Database Tables & Created Table and Column Level Constraints using the suggested naming conventions for constraint keys.
- Worked on optimizing and tuning Teradata views and SQL to improve batch performance and data response time for users.
- Designed and developed an automated solution using Teradata macros, stored procedures, and Excel macros for business users to load reference data directly into tables.
- Maintained Data Model and synchronized it with the changes to the database.
- Involved in all phases of the Software Development Life Cycle (SDLC) methodology throughout the project life cycle.
- Attended architecture and data governance meetings to understand the project.
- Identified and mapped various data sources and their targets successfully to create a fully functioning data repository.
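For illustration, a minimal sketch of the kind of FastLoad script referenced above for the legacy loads; the TDPID, credentials, and all table, column, and file names are placeholders rather than actual project objects:

    /* Illustrative Teradata FastLoad script -- every object and file name is hypothetical */
    LOGON tdpid/etl_user,etl_password;
    DATABASE stage_db;

    SET RECORD VARTEXT "|";                     /* pipe-delimited legacy extract */

    DEFINE customer_id    (VARCHAR(18)),
           customer_name  (VARCHAR(100)),
           state_cd       (VARCHAR(2))
    FILE = /data/legacy/customer_extract.txt;

    BEGIN LOADING stage_db.customer_stg
        ERRORFILES stage_db.customer_err1, stage_db.customer_err2
        CHECKPOINT 100000;

    INSERT INTO stage_db.customer_stg
    VALUES (:customer_id, :customer_name, :state_cd);

    END LOADING;
    LOGOFF;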
Environment: EMBARCADERO ER Studio, Teradata, Toad, Windows XP, SQL Server
Client: Confidential - Providence, RI    Nov 2010 – Feb 2012
Role: Data Modeler
The project's goal was to create a database platform for the mutual funds system, connected through networks for effective data pooling from the different parties involved in the sales and management of mutual funds. The core modules that collected data were Sales Processing, Commission Calculations, Transfer of Units, Funds Management, and Signature Scanning and Retrieval. As a database developer, my role was to support the database activities.
Responsibilities:
- Extensively used Data Modeling, Data Analysis for OLTP and OLAP systems.
- Participated in JAD sessions with Business Users and Sponsors to understand and document the business requirements in alignment with the financial goals of the company.
- Participated in brainstorming sessions with application developers and DBAs to discuss various de-normalization, partitioning, and indexing schemes for the Physical Model.
- Created the Conceptual Model for the Data Warehouse using Erwin Data Modeling tool.
- Reviewed the Conceptual EDW (Enterprise Data Warehouse) Data Model with Business Users, App Dev, and Information Architects to make sure all the requirements were fully covered.
- Created Logical & Physical Data Models using Entity Relationship Diagrams.
- Designed the physical data models using data provisioning and consumption techniques in Oracle environment.
- Analyzed the existing logical model of the ODS to understand the relationships between different entities.
- Worked on Requirements Traceability Matrix to trace the business requirements back to Logical Model.
- Reviewed the Logical Model with Application Developers, ETL Team, DBAs and Testing Team to provide information about the Data Model and business requirements.
- Normalized the tables/relationships to arrive at effective Relational Schemas without any redundancies.
- Identified the Facts & Dimensions Tables and established the Grain of Fact for Dimensional Models.
- Created Multi-Way Aggregate Fact Tables as a specific summarization across dimensions of product, region and date.
- Created Snowflake Schemas by normalizing the dimension tables as appropriate, and created a Sub Dimension named Demographic as a subset of the Customer Dimension (see the DDL sketch at the end of this section).
- Created refined data models for reassessed business requirements, conforming to applicable data standards, and documented the existing system model and the changes proposed and applied to it.
- Developed, Implemented & Maintained the Conceptual, Logical & Physical Data models.
- Developed scripts that automated the DDL and DML statements used in the creation of Databases, Tables, Constraints, and updates.
- Extensively used Metadata & Data Dictionary Management; Data Profiling; Data Mapping.
- Applied Data Governance rules (primary qualifier, class words and valid abbreviation in Table name and Column names).
- Involved in capturing Data Lineage, Table and Column Data Definitions, Valid Values, and other necessary information in the data models.
- Worked on Forward Engineering to enhance the Data Structures of the Existing Database and Creating New Schemas.
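For illustration, a simplified Oracle DDL sketch of the snowflaked Customer/Demographic dimensions described above; table and column names are hypothetical rather than the production model:

    -- Hypothetical sub-dimension split out of the Customer dimension (snowflaking)
    CREATE TABLE dim_demographic (
        demographic_key  NUMBER(10)    NOT NULL PRIMARY KEY,
        age_band         VARCHAR2(20),
        income_band      VARCHAR2(20),
        household_size   NUMBER(2)
    );

    -- Customer dimension carries a foreign key to the Demographic sub-dimension
    CREATE TABLE dim_customer (
        customer_key     NUMBER(10)    NOT NULL PRIMARY KEY,   -- surrogate key
        customer_id      VARCHAR2(20)  NOT NULL,               -- natural key
        customer_name    VARCHAR2(100),
        demographic_key  NUMBER(10)    REFERENCES dim_demographic (demographic_key)
    );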
Environment: Erwin 8, Oracle 10g/11g, Toad, Windows XP, SQL Server 2008, Microsoft Excel.
Client: Confidential - FL    May 2009 – Oct 2010
Role: Data Modeler
As a data modeler for WellCare, my job was to translate business goals and strategies into IT implications and implementations. In this project I helped create predictive data models to target customers for a direct marketing campaign and to extract consumer data as needed for model development.
Responsibilities:
- Created Business Requirement Documents (BRDs), such as the SRS and FRS, and integrated the requirements with underlying platform functionality.
- Facilitated JAD sessions to determine data rules and conducted Logical Data Modeling (LDM) and Physical Data Modeling (PDM) reviews with data SMEs.
- Collected the information about different Entities and Attributes.
- Participated in Data Analysis, Data Dictionary, and Metadata Management, collaborating with Business Analysts, SMEs, ETL Developers, Data Quality Analysts, and Database Administrators for the design and implementation of the Logical Data Model.
- Designed Fact Tables and Dimension Tables for Data Mart to support all the business requirements using ER Studio.
- Assisted in generating surrogate IDs for the dimensions referenced in the Fact Table for indexed and faster access of data (see the sketch at the end of this section).
- Made changes, as required, in existing SQL functions, procedures, and packages and also wrote new code per requirements using TOAD for DB2.
- Developed the Logical Data Model (LDM) and PDM and assisted the DBA in creating the physical database by providing the DDL scripts, including Indexes, Data Quality check scripts, and Base View scripts, while adhering to database optimization.
- Aided and verified DDL implementation by DBAs; corresponded and communicated data and system specifications to the DBA, Development Teams, Managers, and Users.
- Ensured the delivery of LDM, PDM, Staging PDM, and Data Lineage Mapping documents.
- Managed the Data Model changes in all enterprise applications. Performed Impact Analysis to ensure all system leads were informed and coordinated efforts to implement changes.
- Developed, integrated, managed, and promoted Enterprise-Level Models, Physical Models, and Data Model standards that source data from multiple origins: internal, external, and third-party data.
- Part of team conducting logical data analysis and data modeling JAD sessions, communicated data-related standards.
- Participated with the Business Intelligence (BI) team in understanding the database and generating various interactive reports.
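For illustration, a simplified DB2 sketch of the surrogate-key approach noted above; the dimension name and columns are hypothetical:

    -- Hypothetical dimension with a system-generated surrogate key (DB2 identity column)
    CREATE TABLE dim_member (
        member_key   INTEGER NOT NULL
                     GENERATED ALWAYS AS IDENTITY (START WITH 1, INCREMENT BY 1),
        member_id    VARCHAR(20) NOT NULL,   -- natural key from the source system
        member_name  VARCHAR(100),
        region_cd    CHAR(3),
        PRIMARY KEY (member_key)
    );

    -- Facts join on member_key; a unique index on the natural key supports lookups during loads
    CREATE UNIQUE INDEX ix_dim_member_nk ON dim_member (member_id);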
Environment: EMBARCADERO ER Studio, DB2, Toad, Windows XP, SQL, SQL*Plus, HTML, XML.
Client: Confidential - Columbus, OH    Jan 2007 - April 2009
Role: Data Modeler
The project was to enhance the Auto Insurance Policy Transactions and Claims data warehouse according to the current business user requirements. This enhancement enabled the business users to perform predictive analysis and identify several data mining patterns.
Responsibilities:
- Created Business Requirement Documents (BRDs), such as the SRS and FRS, and integrated the requirements with underlying platform functionality.
- Identified the business requirements by going over the business process and interacting with the business analysts and management.
- Conducted Analysis and Profiling of potential Data Sources, upon high level determination of sources during Initial Scoping by the Project Team.
- Involved in discussions with clients/users to determine the Dimensions, Hierarchies and Levels, Basic Measures, and Key Metrics for the Relational and Multidimensional Data Models for the system and to understand report requirements; designed Star/Snowflake Schema Data Models.
- Coordinated Data Profiling/Data Mapping with Business Subject Matter Experts, Data Stewards, Data Architects, ETL Developers, and Data Modelers.
- Interacted with business users to analyze the business process and requirements, transformed the requirements into Conceptual, Logical, and Physical Data Models, designed the database, documented, and rolled out the deliverables.
- Involved in Dimensional Modeling, identifying the Facts and Dimensions.
- Created Custom Fact Tables to track and measure specific types of transactions that customers made.
- Updated the definitions & physical names based on the Oracle standards.
- Created Core Fact Tables to track and measure all types of transactions.
- Created Conformed Dimensions to allow Roll-up & Drill-down across other data marts.
- Used a Degenerate Dimension to carry the unique policy number for insurance claims (illustrated in the sketch at the end of this section).
- Developed Logical/ Physical Data Models using EMBARCADERO ER Studio tool across the subject areas based on the specifications and established Referential Integrity of the system.
- Maintained and Enhanced Data Model with changes and furnished with Definitions, Notes, Reference Values and Check Lists.
- Worked very closely with Data Architects and the DBA team to implement Data Model changes in the database in all environments. Generated DDL scripts for Database Modifications, Macros, Views, and SET tables.
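For illustration, a simplified Oracle sketch of a claim fact carrying the policy number as a degenerate dimension; all names are hypothetical, and the conformed dimension keys are shown without constraints:

    -- Hypothetical claim fact at claim-transaction grain
    CREATE TABLE fact_claim (
        claim_date_key   NUMBER(8)     NOT NULL,   -- key of a conformed date dimension
        customer_key     NUMBER(10)    NOT NULL,   -- key of a conformed customer dimension
        policy_nbr       VARCHAR2(20)  NOT NULL,   -- degenerate dimension: kept on the fact row,
                                                   -- no separate policy dimension table
        claim_amt        NUMBER(12,2),
        claim_cnt        NUMBER(6)
    );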
Environment: EMBARCADERO ER Studio, Oracle 10g, Toad, Windows XP, SQL Server 2000/ 2005, XML, Excel, Access.
Client: Confidential - Lakeland, FL    Jan 2005 - Dec 2006
Role: Data Modeler
The project's goal was to implement enhancements to their enterprise data warehouse model and implement data marts for their sales and inventory management, used for enterprise reporting and analysis. Additional responsibilities included physical model restructuring of the sales and inventory transactional systems.
Responsibilities:
- Gathered and analyzed business requirements through continuous discussions with the business analysts and data analysts, and performed GAP analysis of the business rules, user administration, and requirements.
- Involved in the study and understanding of existing system and data architecture and proposed changes necessary to incorporate the new data models.
- Reverse engineered the existing database structure to understand the existing data models so that any changes incorporated would synchronize with the current model.
- Created conceptual models and conducted several walkthrough meetings with Architects for approvals and also conducted team meetings for creating logical and physical models from the conceptual model.
- Created physical and logical models using Erwin to effectively translate conceptual model into logical and physical models conforming to the business and data requirements.
- Created a family of stars to effectively translate the physical model into a dimensional model using conformed dimensions, role-playing dimensions, mini and sub-dimensions, and factless fact tables.
- Incorporated enhancements into the DW in accordance with Ralph Kimball design specifications and adhering to the business needs and industry standards.
- Developed daily and weekly inventory and sales reports using MicroStrategy.
- Worked on integrating data from disparate sources and was involved with cleansing, validating, and loading client, user, and account information.
- Implemented performance tuning strategies like subquery flattening, view merging, and recreating indexes to speed up highly resource-intensive queries (see the sketch at the end of this section).
- Performed logical and physical data modeling and database design for an Oracle Retail Data ODS designed to enable Retail LOB fraud detection data mining with the use of analytics tools.
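For illustration, a hypothetical example of the subquery-flattening and index-rebuild tuning pattern mentioned above (Oracle syntax; table and index names are illustrative):

    -- Before: correlated subquery re-evaluated for each sales row
    SELECT s.store_id, s.sales_amt
    FROM   sales_txn s
    WHERE  EXISTS (SELECT 1
                   FROM   item i
                   WHERE  i.item_id = s.item_id
                   AND    i.category_cd = 'GROCERY');

    -- After: the same filter flattened into a join (item_id is assumed unique in item),
    -- which the optimizer can drive with a hash or merge join
    SELECT s.store_id, s.sales_amt
    FROM   sales_txn s
           JOIN item i ON i.item_id = s.item_id
    WHERE  i.category_cd = 'GROCERY';

    -- Rebuilding a fragmented index used by the join
    ALTER INDEX ix_sales_txn_item REBUILD;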
Environment: Erwin, Oracle 10g, DB2, Toad, Windows XP, SQL Server 2000/ 2005, XML, Excel, Access.