
Senior Data Modeler Resume


Dover, NH

PROFESSIONAL SUMMARY:

  • 7 years of expertise in Data Modeling for Data Warehouse/Data Mart development, Data Analysis for Online Transaction Processing (OLTP) and Data Warehousing (OLAP)/Business Intelligence (BI) applications
  • Experience in Big Data Hadoop Ecosystem in ingestion, storage, querying, processing, and analysis of big data.
  • Experienced in integrating various relational and non-relational sources such as DB2, Teradata, Oracle, Netezza, SQL Server, NoSQL, COBOL, XML, and flat files into a Netezza database.
  • Business Intelligence: requirements analysis, Key Performance Indicators (KPIs), metrics development, sourcing and gap analysis, OLAP concepts and methods, aggregates/materialized views and performance, rapid prototyping, tool selection, and semantic layers.
  • Excellent experience in writing SQL queries to validate data movement between different layers in a data warehouse environment.
  • Hands-on knowledge of Hive, Sqoop, MapReduce, Storm, Pig, HBase, Flume, and Spark.
  • Data Warehousing: full life-cycle project leadership, business-driven requirements gathering, capacity planning, feasibility analysis, enterprise and solution architecture, design, construction, data quality, profiling and cleansing, source-target mapping, gap analysis, data integration/ETL, SOA, ODA, data marts, Inmon/Kimball methodology, data modeling for OLTP, canonical modeling, and dimensional modeling for data warehouse star/snowflake design.
  • Experience in BI/DW solutions (ETL, OLAP, data marts), Informatica, and BI reporting tools such as Tableau and QlikView; experienced in leading teams of application, ETL, and BI developers as well as testing teams.
  • Good understanding and hands on experience with AWS S3 and EC2.
  • Good experience with the programming languages Python and Scala.
  • Logical and physical database design (tables, constraints, indexes, etc.) using Erwin, ER Studio, TOAD Modeler, and SQL Modeler.
  • Excellent knowledge in preparing required project documentation and in tracking and regularly reporting project status to all project stakeholders.
  • Extensive ETL testing experience using Informatica 9x/8x, Talend, Pentaho.
  • Heavy use of Access queries, VLOOKUP, formulas, pivot tables, etc. Working knowledge of CRM automation with Salesforce.com and SAP.
  • Experience in Dimensional Data Modeling, Star/Snowflake schema, FACT & Dimension tables.
  • Expertise in relational data modeling (3NF) and dimensional data modeling.
  • Experience in developing MapReduce programs using Apache Hadoop for analyzing big data as per requirements.
  • Practical understanding of data modeling (dimensional and relational) concepts such as star-schema modeling, snowflake-schema modeling, and fact and dimension tables (a minimal star-schema sketch follows this list).
  • Skilled in data analysis using SQL on Oracle, MS SQL Server, DB2, and Teradata.
  • Proficient in system analysis, ER/dimensional data modeling, database design, and implementing RDBMS-specific features.
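For illustration, a minimal star-schema sketch of the fact/dimension design described above; all table and column names are hypothetical:

  -- Hypothetical star schema: one fact table joined to conformed dimensions.
  CREATE TABLE dim_customer (
      customer_key   INTEGER       NOT NULL PRIMARY KEY,  -- surrogate key
      customer_id    VARCHAR(20)   NOT NULL,              -- natural/business key
      customer_name  VARCHAR(100),
      region         VARCHAR(50)
  );

  CREATE TABLE dim_date (
      date_key       INTEGER       NOT NULL PRIMARY KEY,  -- e.g. 20150630
      calendar_date  DATE          NOT NULL,
      fiscal_quarter CHAR(2)
  );

  CREATE TABLE fact_sales (
      date_key       INTEGER       NOT NULL REFERENCES dim_date (date_key),
      customer_key   INTEGER       NOT NULL REFERENCES dim_customer (customer_key),
      sales_amount   DECIMAL(18,2),
      units_sold     INTEGER
  );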

TECHNICAL SKILLS:

Project Management: Microsoft Office, Lotus Notes, MS Outlook, MS Projects

Reporting Layer: Spotfire, Tableau, Erwin Report Designer

Database Tools: Microsoft SQL Server 14.0, Teradata 15.0, Oracle 9i/11g/12c, and MS Access

Tools & Technologies: Informatica PowerCenter/Data Quality/Business Analyst/Metadata Manager

PROFESSIONAL EXPERIENCE:

Confidential, Dover, NH

Senior Data Modeler

Responsibilities:

  • Working with the Data Office, DAE, and Application Development teams to translate business needs into data solutions.
  • Responsible for creating canonical models as a part of DAE for product, party and finance domains.
  • Working on creating a conceptual model for the product domain by understanding the Liberty application, focusing on auto- and property-related entities and attributes.
  • Working with data architect to design databases and data structures that meet organizational needs using conceptual, logical, and physical data models.
  • Designing canonical models to improve efficiency, reduce data redundancy and improve data profiling.
  • Work with multiple source systems such as OM, Safeco, and Majesco, building logical models based on the business models provided.
  • Perform equivalency mapping exercises to map entities across multiple systems and ensure data lineage is maintained.
  • Standardize the model using ACORD standards, which define and represent the structure and context of messages exchanged between companies in the insurance industry.
  • Create UDPs with source-system mappings to maintain traceability and data lineage.
  • Develop Data Vault models based on the canonical model.
  • Create hubs, links, and satellites for each subject area and align the attributes with the structure of the canonical model (see the Data Vault DDL sketch after this list).
  • Design workflow and process-flow diagrams for various subject areas of the OM model using Lucidchart.
  • Develop mapping spreadsheets that provide the Data Warehouse Development ETL team with source-to-target data mappings.
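For illustration, a minimal Data Vault sketch of the hub/link/satellite structures described above, assuming hypothetical party and policy business keys:

  -- Hub: one row per business key, with load metadata.
  CREATE TABLE hub_party (
      party_hk      CHAR(32)    NOT NULL PRIMARY KEY,  -- hash of the business key
      party_bk      VARCHAR(50) NOT NULL,              -- business key from the source
      load_dts      TIMESTAMP   NOT NULL,
      record_source VARCHAR(50) NOT NULL
  );

  CREATE TABLE hub_policy (
      policy_hk     CHAR(32)    NOT NULL PRIMARY KEY,
      policy_bk     VARCHAR(50) NOT NULL,
      load_dts      TIMESTAMP   NOT NULL,
      record_source VARCHAR(50) NOT NULL
  );

  -- Link: the association between the two business keys.
  CREATE TABLE lnk_party_policy (
      party_policy_hk CHAR(32)    NOT NULL PRIMARY KEY,
      party_hk        CHAR(32)    NOT NULL REFERENCES hub_party (party_hk),
      policy_hk       CHAR(32)    NOT NULL REFERENCES hub_policy (policy_hk),
      load_dts        TIMESTAMP   NOT NULL,
      record_source   VARCHAR(50) NOT NULL
  );

  -- Satellite: descriptive attributes, versioned by load timestamp.
  CREATE TABLE sat_party_detail (
      party_hk      CHAR(32)     NOT NULL REFERENCES hub_party (party_hk),
      load_dts      TIMESTAMP    NOT NULL,
      party_name    VARCHAR(100),
      party_type    VARCHAR(30),
      record_source VARCHAR(50)  NOT NULL,
      PRIMARY KEY (party_hk, load_dts)
  );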

Environment: Erwin 9.8, Alation, SQL Developer, Microsoft Excel, Customer Service Workbench

Confidential, Albuquerque, NM

Data Architect/Modeler

Responsibilities:

  • Worked on a project called Fluent, using tools such as IBM InfoSphere Data Architect to create logical and physical data models for existing databases.
  • Working on querying the warehouse to review incoming data and build data marts from it.
  • Hands-on experience with conceptual, logical, and physical modeling using modeling tools.
  • Advises on and leads projects involving ETL-related activities and the migration or conversion of data between enterprise data systems. Coordinates interactions between central IT, business units, and data stewards to achieve desired organizational outcomes.
  • Gathered and analyzed existing physical data models for in-scope applications and proposed changes to the data models according to the requirements.
  • Apply structured and object-oriented analysis and modeling techniques to analyze, design, implement, and manage information technology systems.
  • Worked with the Architecture team to get the metadata approved for the new data elements added for this project.
  • Working on conducting an equivalency mapping between two data sources, FACETS and Health Rules Payor (HRP), at the column level based on definitions.
  • Prepared a comparison of both data sources by subject area and documented it in Collibra for data governance purposes.
  • Responsible for data profiling, data lineage, and building data models for the warehouse within the project.
  • Maintaining referential integrity among entities by identifying primary/foreign key relationships based on business rules, using appropriate data types, and applying naming standards across departments.
  • Working on equivalency mapping between two existing data sources.
  • Involved in normalization and de-normalization of existing tables for faster query retrieval; designed 3NF data models for ODS and OLTP systems and dimensional data models using star and snowflake schemas (a brief denormalization sketch follows this list).
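A brief sketch of the normalization/de-normalization trade-off mentioned above, with hypothetical tables: the 3NF design keeps region attributes in their own table, while the denormalized dimension pre-joins them so reporting queries avoid the lookup at read time (CREATE TABLE AS syntax varies slightly by database):

  -- 3NF (OLTP/ODS) design: region attributes live in their own table.
  CREATE TABLE region (
      region_id     INTEGER      NOT NULL PRIMARY KEY,
      region_name   VARCHAR(50)  NOT NULL
  );

  CREATE TABLE customer (
      customer_id   INTEGER      NOT NULL PRIMARY KEY,
      customer_name VARCHAR(100) NOT NULL,
      region_id     INTEGER      NOT NULL REFERENCES region (region_id)
  );

  -- De-normalized dimension: the join is resolved once at load time.
  CREATE TABLE dim_customer_flat AS
  SELECT c.customer_id,
         c.customer_name,
         r.region_name
  FROM   customer c
  JOIN   region   r ON r.region_id = c.region_id;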

Environment: Infosphere Data Architect IDA 9.3, Erwin 9.8, SQL Developer, Business Objects, Collibra.

Confidential, Austin, Texas

Data Architect/Modeler

Responsibilities:

  • Working in the Regulatory Compliance IT team in a Data Architect role involving data profiling, data modeling, ETL architecture, and Oracle DBA work.
  • Build an inventory of the data needed to implement the architecture.
  • Handled importing data from various data sources, performed transformations using Hive and MapReduce, and loaded the data into HDFS (see the HiveQL sketch after this list).
  • Designed the logical data model using Erwin 9.64, with entities and attributes for each subject area.
  • Developed the long-term data warehouse roadmap and architecture, and designed and built the data warehouse framework per the roadmap.
  • Designed and developed the architecture for a data services ecosystem spanning relational, NoSQL, and big data technologies.
  • Specifies the overall data architecture for all areas and domains of the enterprise, including data acquisition, ODS, MDM, data warehouse, data provisioning, ETL, and BI.
  • Advises on and enforces data governance to improve the quality and integrity of data, with oversight of the collection and management of operational data.
  • Guided and partnered with VPs and Directors in architecting solutions for the big data organization.
  • Developed data mapping, data governance, transformation, and cleansing rules for the Master Data Management architecture involving OLTP and ODS systems.
  • Designed and developed a data lake using Hadoop for processing raw and processed claims via Hive and Informatica.
  • Worked with Netezza and Oracle databases and implemented various logical and physical data models for them.
  • Performed data modeling and designed, implemented, and deployed high-performance custom applications at scale on Hadoop/Spark.
  • Developed and implemented data cleansing, data security, data profiling and data monitoring processes.
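A minimal HiveQL sketch of the ingestion pattern described above (raw files landed in HDFS exposed as an external table, then transformed into a curated table); the paths, table names, and columns are hypothetical, and dynamic-partition settings are assumed to be enabled:

  -- Expose raw claim files already landed in HDFS.
  CREATE EXTERNAL TABLE raw_claims (
      claim_id     STRING,
      member_id    STRING,
      claim_amount DOUBLE,
      service_date STRING
  )
  ROW FORMAT DELIMITED FIELDS TERMINATED BY ','
  STORED AS TEXTFILE
  LOCATION '/data/raw/claims/';

  -- Curated, columnar table partitioned by service year.
  CREATE TABLE curated_claims (
      claim_id     STRING,
      member_id    STRING,
      claim_amount DOUBLE
  )
  PARTITIONED BY (service_year INT)
  STORED AS ORC;

  INSERT OVERWRITE TABLE curated_claims PARTITION (service_year)
  SELECT claim_id,
         member_id,
         claim_amount,
         year(to_date(service_date)) AS service_year
  FROM   raw_claims
  WHERE  claim_amount IS NOT NULL;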

Environment: DB2, CA Erwin r9.64, Oracle 12c, MS Office, SQL Architect, TOAD Benchmark Factory, SQL Loader, PL/SQL, SharePoint, Talend, Redshift, SQL Server 2008/2012, Hadoop, Spark, AWS, Hive, Sqoop.

Confidential, Detroit, MI

Data Architect/ Modeler

Responsibilities:

  • Responsible for developing and supporting a data model and architecture that supports and enables the overall strategy of expanded data deliverables, services, process optimization, and advanced business intelligence.
  • Worked with the Data Vault methodology and developed normalized logical and physical database models.
  • Designed star and snowflake data models for the enterprise data warehouse using Erwin.
  • Worked with Data Steward Team for designing, documenting and configuring Informatica Data Director for supporting management of MDM data.
  • Created BTEQ, FastExport, MultiLoad, TPump, and FastLoad scripts for extracting data from various production systems (a BTEQ export sketch follows this list).
  • Owned and managed all changes to the data models. Created data models, solution designs, and data architecture documentation for complex information systems.
  • Worked on reverse-engineering data models from database instances and scripts.
  • Implemented the Data Vault modeling concept, which addresses change in the environment by separating the business keys and the associations between those business keys from the descriptive attributes of those keys, using hub and link tables and satellites.
  • Gather and analyze business data requirements and model those needs, working closely with the users of the information, the application developers, and architects to ensure the information models can meet their needs.
  • Transformed the logical data model into a physical data model, ensuring primary key and foreign key relationships in the PDM, consistency of data attribute definitions, and primary index considerations.
  • Designed ER diagrams (physical and logical, using Erwin), mapped the data into database objects, identified the facts and dimensions from the business requirements, and developed the logical and physical models using Erwin.
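A minimal Teradata BTEQ export sketch of the kind of extract script mentioned above; the logon, database, and table names are hypothetical:

  .LOGON tdprod/etl_user,password
  -- Export the result of the following SELECT to a flat file.
  .EXPORT REPORT FILE = policy_extract.txt
  SELECT policy_id,
         policy_status,
         effective_date
  FROM   prod_db.policy
  WHERE  effective_date >= DATE '2015-01-01';
  .EXPORT RESET
  .LOGOFF
  .QUIT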

Environment: Data Vault, Erwin 9.6, MySQL, PostgreSQL, SQL Server, Informatica, Tableau.

Confidential, Denver, CO

Data Analyst/Modeler

Responsibilities:

  • Responsible for the design of logical and physical data models based on analysis of complex business requirements.
  • Developed the design and process flow to ensure that the process is repeatable.
  • Participated in performance management and tuning for stored procedures, tables and database servers.
  • Created logical data models for staging, the ODS, and the data mart, as well as the time dimension.
  • Involved in maintaining and updating the metadata repository with details on the nature and use of applications/data transformations to facilitate impact analysis.
  • Created DDL scripts using ER Studio and source-to-target mappings to bring the data from the sources into the warehouse.
  • Designed the ER diagrams, logical model (relationships, cardinality, attributes, and candidate keys), and physical database (capacity planning, object creation, and aggregation strategies) for Oracle and Teradata.
  • Worked on importing and cleansing high-volume data from various sources such as Teradata, Oracle, flat files, and MS SQL Server 2005.
  • Generated comprehensive analytical reports by running SQL queries against current databases to conduct data analysis.
  • Produced PL/SQL statements and stored procedures in DB2 for extracting as well as writing data.
  • Designed the logical and physical data models, metadata, and data dictionary using Erwin for both OLTP- and OLAP-based systems.
  • Coordinated with all teams to centralize metadata management updates and to follow the standard naming and attribute standards for data and ETL jobs.
  • Wrote and executed SQL queries to verify that data had been moved from the transactional system to the DSS, data warehouse, and data mart reporting systems in accordance with requirements (a reconciliation query sketch follows this list).
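A sketch of the kind of data-movement verification query described above, comparing row counts and totals between a hypothetical transactional source and its warehouse target for one load date:

  SELECT 'SOURCE' AS layer,
         COUNT(*)          AS row_cnt,
         SUM(order_amount) AS total_amount
  FROM   oltp.orders
  WHERE  order_date = DATE '2015-06-30'
  UNION ALL
  SELECT 'TARGET' AS layer,
         COUNT(*)          AS row_cnt,
         SUM(order_amount) AS total_amount
  FROM   dw.fact_orders
  WHERE  order_date = DATE '2015-06-30';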

Environment: ER Studio, SQL Server 2000/2005/2008, SQL Server Analysis Services, SSIS, Oracle 10g, Business Objects XI, Rational Rose, DataStage, MS Office, MS Visio, SQL, Crystal Reports 9

Confidential

Data Analyst/Modeler

Responsibilities:

  • Designed new application logical/physical data models in SAP PowerDesigner.
  • Generated DDL using SAP PowerDesigner and deployed it to the data warehouse.
  • Extensively used the reverse-engineering feature of Erwin to keep the data model in sync with production.
  • Worked with business users during requirements gathering and prepared conceptual, logical, and physical data models.
  • Wrote PL/SQL statements, stored procedures, and triggers in DB2 for extracting as well as writing data (a procedure sketch follows this list).
  • Translated business requirements into working logical and physical data models for Data warehouse, Data marts and OLAP applications.
  • Designed and implemented business intelligence to support sales and operations functions and increase customer satisfaction.
  • Designed star and snowflake data models for the enterprise data warehouse using Erwin.
  • Created and maintained the Logical Data Model (LDM) for the project, including documentation of all entities, attributes, data relationships, primary and foreign key structures, allowed values, codes, business rules, glossary terms, etc.
  • Validated and updated the appropriate LDMs against process mappings, screen designs, use cases, the business object model, and the system object model as they evolved and changed.
  • Created business requirement documents and integrated the requirements with the underlying platform functionality.
  • Experience in developing dashboards and client-specific tools in Microsoft Excel and PowerPoint.
  • Responsible for the development and maintenance of logical and physical data models, along with corresponding metadata, to support applications.
  • Excellent knowledge of and experience in technical design and documentation.
  • Used forward engineering to create a physical data model with DDL that best suits the requirements from the logical data model.
  • Involved in preparing the design flow for the DataStage objects to pull data from various upstream applications, perform the required transformations, and load the data into various downstream applications.
  • Performed logical data modeling and physical data modeling (including reverse engineering) using the Erwin data modeling tool.
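A minimal sketch of a DB2 SQL stored procedure of the kind mentioned above, assuming hypothetical staging and target tables (when run from the DB2 command line, an alternate statement terminator such as @ would be needed):

  -- Copy rows for one load date from staging into the target table.
  CREATE PROCEDURE load_customer_dim (IN p_load_date DATE)
  LANGUAGE SQL
  BEGIN
      INSERT INTO dw.customer_dim (customer_id, customer_name, load_date)
      SELECT s.customer_id,
             s.customer_name,
             p_load_date
      FROM   stg.customer s
      WHERE  s.load_date = p_load_date;
  END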

Environment: Oracle 9i, PL/SQL, Solaris 9/10, Windows Server 2003 & 2008, NZSQL, Erwin 8.0, ER Studio 6.0/6.5, Toad 8.6, Informatica 8.0, IBM OS/390 (V6.0), DB2 V7.1
