Data Modeler/Analyst | Resume
Charlotte, NC
PROFESSIONAL SUMMARY:
- Experience as a Data Modeler/Analyst with a background in database development and in designing data models for both OLTP and OLAP systems in the Health Care domain
- Strong hands-on experience with data modeling tools such as Erwin, IBM Data Architect and ER Studio for developing Entity-Relationship diagrams
- Skilled in gathering and translating business requirements into technical designs and in developing the physical aspects of a specified design
- Extensive experience in data modeling: conceptual, logical/physical, relational and multi-dimensional modeling, analyzing data sources and creating interface documents
- Strong hands-on experience in forward engineering, reverse engineering and naming standards processes
- Generated DDL scripts for implementing data modeling changes
- Participated in JAD sessions and requirements gathering & identification of business subject areas
- Expert knowledge of Business Requirement Documents (BRD), Technical Specification Documents (TSD), Customer Relationship Management (CRM) and Business Rules Management (BRM)
- Involved in data analysis, data validation, data cleansing and data verification
- Experience in data analysis and data profiling using complex SQL on various source systems including Oracle and Netezza
- Proficient in performance tuning and query optimization techniques using tools such as Explain Plan, SQL Trace and TKPROF (see the tuning sketch after this list)
- Worked with ETL tools to extract and transform data from relational databases and various file formats and load it into target databases
- Excellent knowledge of SQL and of coding PL/SQL packages and procedures (a package sketch follows this list)
- Strong knowledge of designing and developing data marts following Star Schema and Snowflake Schema methodologies, using data modeling tools such as Erwin (a star schema DDL sketch follows this list)
- Experienced in different data modeling methodologies, including 3NF, Data Vault and hybrid designs, for EDW architecture
- Excellent knowledge of the Ralph Kimball and Bill Inmon approaches
- Excellent analytical skills with exceptional ability to master new technologies efficiently
- Experience in creating views, materialized views and partitions for the Oracle warehouse (a partitioning and materialized view sketch follows this list)
- Experience working with Master Data Management
- Ability to work on multiple projects independently as well as in a team
- A dedicated team player with excellent communication and interpersonal skills, a proven track record of creativity and innovation, and strong technical and managerial skills demonstrated while leading teams to meet strict project deadlines
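A minimal sketch of the Explain Plan workflow mentioned above; table and column names are hypothetical, and TKPROF itself runs from the command line against trace files:

    -- Generate an execution plan for a candidate query (Oracle)
    EXPLAIN PLAN FOR
      SELECT c.claim_id, c.claim_amount
      FROM   claims c
      JOIN   members m ON m.member_id = c.member_id
      WHERE  c.service_date >= DATE '2015-01-01';

    -- Review the plan to spot full scans, join order and estimated cardinalities
    SELECT * FROM TABLE(DBMS_XPLAN.DISPLAY);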
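A minimal PL/SQL package sketch of the kind referenced above; object names are illustrative, not from a specific project:

    -- Package specification: public interface for a staging load routine
    CREATE OR REPLACE PACKAGE claim_load_pkg AS
      PROCEDURE load_claims (p_batch_id IN NUMBER);
    END claim_load_pkg;
    /
    -- Package body: copies one batch from a raw table into staging
    CREATE OR REPLACE PACKAGE BODY claim_load_pkg AS
      PROCEDURE load_claims (p_batch_id IN NUMBER) IS
      BEGIN
        INSERT INTO claims_stg (claim_id, member_id, claim_amount)
        SELECT claim_id, member_id, claim_amount
        FROM   claims_raw
        WHERE  batch_id = p_batch_id;
        COMMIT;
      END load_claims;
    END claim_load_pkg;
    /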
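An illustrative star schema fragment (one dimension and one fact table; the health-care names are hypothetical):

    -- Dimension table: conformed member dimension
    CREATE TABLE dim_member (
      member_key   NUMBER        PRIMARY KEY,
      member_id    VARCHAR2(20)  NOT NULL,
      member_name  VARCHAR2(100),
      plan_type    VARCHAR2(30)
    );

    -- Fact table: claim facts keyed to the surrounding dimensions
    CREATE TABLE fact_claims (
      claim_key    NUMBER        PRIMARY KEY,
      member_key   NUMBER        NOT NULL REFERENCES dim_member (member_key),
      date_key     NUMBER        NOT NULL,
      claim_amount NUMBER(12,2)
    );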
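A sketch of the partitioning and materialized view work noted above, assuming a range-partitioned Oracle fact table (names are illustrative):

    -- Fact table range-partitioned by service date
    CREATE TABLE fact_claims_hist (
      claim_key    NUMBER,
      service_date DATE,
      claim_amount NUMBER(12,2)
    )
    PARTITION BY RANGE (service_date) (
      PARTITION p2014 VALUES LESS THAN (DATE '2015-01-01'),
      PARTITION p2015 VALUES LESS THAN (DATE '2016-01-01')
    );

    -- Materialized view pre-aggregating monthly claim totals for reporting
    CREATE MATERIALIZED VIEW mv_monthly_claims
    BUILD IMMEDIATE
    REFRESH COMPLETE ON DEMAND AS
    SELECT TRUNC(service_date, 'MM') AS claim_month,
           SUM(claim_amount)         AS total_amount
    FROM   fact_claims_hist
    GROUP  BY TRUNC(service_date, 'MM');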
TECHNICAL SKILLS:
Languages: Java, SQL and PL/SQL
Databases: Oracle 12c/11g/10g, DB2, MS SQL Server and Netezza
Modeling Tools: Erwin, IBM Infosphere Data Architect, ER Studio and Oracle Data modeler
Modeling Specialties: Logical & Physical Data Models with OLTP & OLAP/Dimensional modeling
Data Analysis: End user Requirement Gathering, JAD Sessions, Data profiling, Source Systems Analysis and Data Relationships
Tools: GitHub, Tableau and Power BI
Others: Teradata and Hadoop
Methodologies: SDLC, Agile and Waterfall
Operating Systems: Windows, Linux and Mac OS
WORK EXPERIENCE:
Confidential, Charlotte, NC
Data Modeler/Analyst
Responsibilities:
- Interacted with business users to analyze business processes and gather use case requirements
- Created source-to-target mappings for data migration using ER Studio/Erwin
- Debugged SQL statements and stored procedures for various business scenarios
- Developed advanced SQL queries to extract, manipulate and/or calculate information to fulfill data and reporting requirements, including identifying the tables and columns from which data is extracted
- Worked closely with data governance, SMEs and vendors to define data requirements
- Prepared use cases and user stories from the user requirements documentation and developed detailed work plans, schedules, project estimates, resource plans and status reports
- Prepared scope documents and allotted tasks to the data modeling team
- Worked with data investigation, discovery and mapping tools to scan every data record from many sources; utilized IBM InfoSphere Information Analyzer to perform data profiling
- Identified schedule risks and flagged them in the use cases
- Used the Data Vault modeling method, which was adaptable to the needs of this project (see the hub/link/satellite sketch after this list)
- Translated business requirements into working logical and physical data models for staging, Operational Data Store and data mart applications
- Utilized IBM Data Architect for creating CDM, LDM & PDM
- Utilized SVN for the multi-user data modeling environment and versioning
- Utilized the IBM FastTrack tool for creating source-to-target mappings inclusive of all transformation and data quality rules
- Forward engineered the physical model to generate DDL scripts for implementation on the Netezza database (see the Netezza DDL sketch after this list)
- Conducted database performance tuning, including creating optimal distribution and organization keys for the Netezza database
- Worked with the DBA and the ETL tool to implement regular grooming of historical tables for optimal performance
- Generated data modeling reports: data dictionaries and data model diagram documents
- Regularly conducted and participated in weekly status meetings
- Performed performance tuning on SQL queries for efficiency
- Involved in all phases of Software Development Life Cycle (SDLC) methodologies throughout the project life cycle
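A minimal Data Vault sketch (hub, satellite and link) of the approach referenced above; entity and column names are hypothetical:

    -- Hub: business key for a member
    CREATE TABLE hub_member (
      member_hk   VARCHAR(32)  NOT NULL,   -- hash of the business key
      member_id   VARCHAR(20)  NOT NULL,   -- business key
      load_dts    TIMESTAMP    NOT NULL,
      record_src  VARCHAR(50)  NOT NULL
    );

    -- Satellite: descriptive member attributes tracked over time
    CREATE TABLE sat_member (
      member_hk   VARCHAR(32)  NOT NULL,
      load_dts    TIMESTAMP    NOT NULL,
      member_name VARCHAR(100),
      plan_type   VARCHAR(30),
      record_src  VARCHAR(50)  NOT NULL
    );

    -- Link: relationship between the member and claim hubs
    CREATE TABLE lnk_member_claim (
      member_claim_hk VARCHAR(32) NOT NULL,
      member_hk       VARCHAR(32) NOT NULL,
      claim_hk        VARCHAR(32) NOT NULL,
      load_dts        TIMESTAMP   NOT NULL,
      record_src      VARCHAR(50) NOT NULL
    );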
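A Netezza DDL sketch showing distribution and organizing keys plus the routine grooming mentioned above, assuming a hypothetical claims fact table:

    -- Distribution key spreads rows across data slices; organize key supports zone-map pruning
    CREATE TABLE fact_claims_nz (
      claim_id     BIGINT        NOT NULL,
      member_id    BIGINT        NOT NULL,
      service_date DATE          NOT NULL,
      claim_amount NUMERIC(12,2)
    )
    DISTRIBUTE ON (member_id)
    ORGANIZE ON (service_date);

    -- Periodic grooming of the historical table reclaims deleted space and reorganizes data
    GROOM TABLE fact_claims_nz RECORDS ALL;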
Environment: ER Studio, IBM Data Architect 9, Netezza 7, Hadoop, MS Visio, Oracle 11g, IBM Data Stage, Toad, SQL Server
Confidential, Charlotte, NC
Data Modeler
Responsibilities:
- Developed a Conceptual model using Erwin based on requirements analysis
- Used Erwin for reverse engineering to connect to the existing database and ODS, create a graphical representation in the form of Entity-Relationship diagrams and elicit more information
- Developed the logical data models and physical data models that capture current state/future state data elements and data flows using ER Studio
- Reverse engineered DB2 databases and then forward engineered them to Teradata using ER Studio
- Created the physical data model from the logical data model using the Compare and Merge utility in ER Studio and worked with the naming standards utility
- Produced PL/SQL statements and stored procedures in DB2 for extracting as well as writing data
- Created PL/SQL procedures and triggers, generated application data, created users and privileges, and used Oracle import/export utilities (see the sketch after this list)
- Generated comprehensive analytical reports by running SQL queries against current databases to conduct data analysis
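A minimal Oracle-style sketch of the procedure and trigger work above (object names are illustrative):

    -- Stored procedure: purge archived claims older than a cutoff date
    CREATE OR REPLACE PROCEDURE purge_old_claims (p_cutoff IN DATE) IS
    BEGIN
      DELETE FROM claims_archive WHERE service_date < p_cutoff;
      COMMIT;
    END purge_old_claims;
    /
    -- Row-level trigger: audit inserts and updates on the claims table
    CREATE OR REPLACE TRIGGER trg_claims_audit
    AFTER INSERT OR UPDATE ON claims
    FOR EACH ROW
    BEGIN
      INSERT INTO claims_audit (claim_id, changed_on)
      VALUES (:NEW.claim_id, SYSDATE);
    END;
    /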
Environment: Erwin, ER Studio, Teradata, SQL and PL/SQL
Confidential, Columbus, Ohio
Data Modeler/Architect
Responsibilities:
- Interacted with Subject Matter Experts, Project Managers, developers and end users in various JAD sessions to gather requirements
- Involved in all phases of the SDLC, from requirements, design, development and testing through rollout to field users and support of the production environment
- Translated business requirements into data design requirements used to drive innovative data designs that meet business objectives
- Assisted in the migration of data models from Oracle Designer to Erwin and updated the data models to correspond to the existing database structures
- Worked with the Informatica Data Quality tool to perform data profiling
- Performed extensive file validation and data verification against the data warehouse and debugged SQL statements and stored procedures for business scenarios
- Created data mapping documents detailing the transfer of data from source to target
- Performed data analysis using SQL queries on source systems to identify data discrepancies and determine data quality
- Created and maintained logical and physical data models for the project, including documentation of all entities, attributes, data relationships, primary and foreign key structures, allowed values, codes, business rules, glossary terms, etc.
- Used Erwin to create report templates; maintained and changed the templates as needed to generate varying data dictionary formats as contract deliverables
- Developed and maintained data models, data dictionaries, data maps and other artifacts across the organization
- Involved in logical and physical database design and development, normalization and data modeling using Erwin and SQL Server Enterprise Manager
- Conducted design walkthrough sessions with the Business Intelligence team to ensure that reporting requirements were met for the business
- Created views and materialized views to meet requirements
- Generated SQL scripts and implemented the relevant databases with related properties such as keys, constraints, indexes and sequences (see the DDL sketch after this list)
- Involved in database performance tuning, which included creating indexes and optimizing SQL statements
- Built and managed an AWS data lake for data storage and analytics
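An illustrative fragment of the kind of implementation DDL described above (keys, constraints, an index, a sequence and a view); all names are hypothetical:

    -- Parent table with primary key
    CREATE TABLE member (
      member_id   NUMBER         PRIMARY KEY,
      member_name VARCHAR2(100)  NOT NULL
    );

    -- Sequence used to populate encounter surrogate keys
    CREATE SEQUENCE seq_encounter_id START WITH 1 INCREMENT BY 1;

    -- Child table with primary key, foreign key and not-null constraints
    CREATE TABLE encounter (
      encounter_id NUMBER PRIMARY KEY,
      member_id    NUMBER NOT NULL,
      encounter_dt DATE   NOT NULL,
      CONSTRAINT fk_encounter_member FOREIGN KEY (member_id) REFERENCES member (member_id)
    );

    -- Index supporting lookups by member and date
    CREATE INDEX ix_encounter_member ON encounter (member_id, encounter_dt);

    -- Reporting view restricted to the last 12 months
    CREATE OR REPLACE VIEW vw_recent_encounters AS
    SELECT encounter_id, member_id, encounter_dt
    FROM   encounter
    WHERE  encounter_dt >= ADD_MONTHS(SYSDATE, -12);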
Environment: IBM Infosphere Data Architect (IDA), DB2, Teradata, Oracle 11g, IBM Data Stage, Toad, SQL and Informatica Data-Stage Server