
Sr. Data Modeler Resume


Phoenix, AZ

SUMMARY

  • Over 7 years of professional experience in Data Modeling and Data Analysis.
  • Proficient in Software Development Life Cycle (SDLC), Project Management methodologies, and Microsoft SQL Server database management.
  • Experience working with Agile and Waterfall data modeling methodologies.
  • Experience with the Big Data Hadoop ecosystem for ingestion, storage, querying, processing, and analysis of Big Data.
  • Well versed in cloud development using MS Azure (Azure Data Lake, Azure Data Factory, Azure PaaS) and AWS services such as EC2, S3, and Redshift.
  • Good experience in conducting Joint Application Development (JAD) sessions with SMEs, Stakeholders and other project team members for requirements gathering and analysis.
  • Hands-on experience in Normalization and De-normalization techniques for optimum performance in relational and dimensional database environments.
  • Sound knowledge in Data Analysis, Data Validation, Data Cleansing, Data Verification and identifying data mismatch.
  • Excellent Knowledge of Ralph Kimball and Bill Inmon's approaches to Data Warehousing.
  • Extensive experience using ER modeling tools such as Erwin and ER/Studio, as well as Teradata and MDM tools.
  • Familiar with installation, configuration, patching, and upgrading of Tableau across environments.
  • Proficient in writing DDL, DML commands using SQL developer and Toad.
  • Experience in performance tuning on Oracle databases by leveraging explain plans and tuning SQL queries.
  • Excellent SQL programming skills; developed Stored Procedures, Triggers, Functions, and Packages using SQL and PL/SQL.
  • Experience in data transformation and data mapping from source to target database schemas, as well as data cleansing.
  • Good experience on Relational Data modeling (3NF) and Dimensional data modeling.
  • Expert in building an Enterprise Data Warehouse from scratch using both the Kimball and Inmon approaches.
  • Experience in data analysis to track data quality and for detecting/correcting inaccurate data from the databases.
  • Extensive experience using ETL and reporting tools such as SQL Server Integration Services (SSIS) and SQL Server Reporting Services (SSRS).
  • Proficient in data mart design using dimensional data modeling identifying Facts and Dimensions, Star Schema and Snowflake Schema.
  • Experience designing security at both the schema level and the accessibility level in conjunction with the DBAs.
  • Experience in designing and implementing data structures and commonly used Business Intelligence tools for data analysis.
  • Hands on experience with modeling using Erwin in both forward and reverse engineering processes.
  • Experience in Requirement gathering, System analysis, handling business and technical issues & communicating with both business and technical users.
  • Strong background in data processing, data analysis with hands on experience in MS Excel, MS Access, UNIX and Windows Servers.
  • Experience with DBA tasks involving database creation, performance tuning, creation of indexes, creating and modifying table spaces for optimization purposes.
  • Excellent analytical skills with exceptional ability to master new technologies efficiently.
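The dimensional modeling experience above (facts, dimensions, star schemas) can be illustrated with a minimal sketch in SQLite; all table and column names here are hypothetical, not taken from any actual project.

```python
import sqlite3

# Minimal star schema: one fact table referencing two dimension tables.
conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.executescript("""
CREATE TABLE dim_date (date_key INTEGER PRIMARY KEY, full_date TEXT);
CREATE TABLE dim_product (product_key INTEGER PRIMARY KEY, product_name TEXT);
CREATE TABLE fact_sales (
    date_key INTEGER REFERENCES dim_date(date_key),
    product_key INTEGER REFERENCES dim_product(product_key),
    quantity INTEGER,
    amount REAL
);
INSERT INTO dim_date VALUES (20240101, '2024-01-01');
INSERT INTO dim_product VALUES (1, 'Widget');
INSERT INTO fact_sales VALUES (20240101, 1, 3, 29.97);
""")

# A typical star-schema query joins the fact to its dimensions and aggregates.
row = cur.execute("""
SELECT d.full_date, p.product_name, SUM(f.amount)
FROM fact_sales f
JOIN dim_date d ON f.date_key = d.date_key
JOIN dim_product p ON f.product_key = p.product_key
GROUP BY d.full_date, p.product_name
""").fetchone()
print(row)  # ('2024-01-01', 'Widget', 29.97)
```

The grain of the fact table (one row per product per day here) is what determines which dimensions it references.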

TECHNICAL SKILLS

Data Modeling: Erwin 9.7, Toad, ER studio 9.7, Star-Schema Modeling, Snowflake-Schema Modeling, FACT and dimension tables, Pivot Tables

Big Data Tools: Apache Hadoop 3.0, Hive 2.3, NoSQL, Cassandra 3.11/2.2, HBase 1.2

Languages: PL/SQL, T-SQL, Unix Shell scripting, XML

Database: Oracle 12c/11g, MS SQL Server 2016/2014, DB2, Teradata 14/15, Cassandra

Testing Tools: WinRunner, LoadRunner, TestDirector, Mercury Quality Center, Rational ClearQuest

BI Tools: Tableau 7.0/8.2, SAP Business Objects, Crystal Reports

ETL/Data warehouse Tools: Informatica, SAP Business Objects XIR3.1/XIR2, Talend

Operating System: UNIX, Windows 8/7, Red Hat Linux

Other Tools: TOAD, BTEQ, MS-Office suite (Word, Excel, Project and Outlook)

PROFESSIONAL EXPERIENCE

Confidential, Phoenix, AZ

Sr. Data Modeler

Responsibilities:

  • As a Sr. Data Modeler, responsible for data warehousing, data modeling, data governance, and data architecture standards, methodologies, guidelines, and techniques.
  • Partnered with various business stakeholders and technology leaders to gather requirements and convert them into scalable technical and system requirement documents.
  • Designed rule engine to handle complicated data conversion requirements when syncing data among multiple POS systems and the centralized ERP system.
  • Coordinated with other Data Architects to design Big Data and Hadoop projects and to provide idea-driven design.
  • Designed Data lake, Master data, Security, data hub and data warehouse/data marts layers.
  • Created logical models according to the requirements and physicalized them into physical models.
  • Effectively articulated reasoning for data model design decisions and strategically incorporated team member feedback to produce the highest quality data models.
  • Worked with project and application teams to ensure that they understand and fully comply with data quality standards, architectural guidelines and designs.
  • Performed Reverse engineering of the source systems using Oracle Data modeler.
  • Involved in capturing Data Lineage, Table and Column Data Definitions, Valid Values, and other necessary information in the data models.
  • Identified Facts & Dimensions Tables and established the Grain of Fact for Dimensional Models.
  • Generated the DDL of the target data model and attached it to the JIRA ticket to be deployed in different environments.
  • Tuned DB queries/processes and improved performance.
  • Reverse engineered crystal reports (Command Performance), SSRS reports to identify logic/business rules for the Driver's Performance Metrics, Customer Order Performance, Order management and daily sales.
  • Created a data mart based on multiple POS systems for Power BI dashboards/reports.
  • Worked on Data load using Azure Data factory using external table approach.
  • Involved in creating Pipelines and Datasets to load the data onto data warehouse.
  • Worked closely with ETL SSIS developers to explain complex transformation logic.
  • Created ETL Jobs and Custom Transfer Components to move data from transaction systems to a centralized area (Azure SQL Data Warehouse) to meet deadlines.
  • Extensively used SSIS transformations such as Lookup, Derived Column, Data Conversion, Aggregate, Conditional Split, SQL Task, Script Task, and Send Mail Task.
  • Developed SSIS packages to load data from various source systems to Data Warehouse.

Environment: Oracle Data Modeler, Visio, Microsoft Outlook, Adobe PDF, DQ Analyzer, SQL Server, Azure Data Factory, Power BI, Microsoft Teams, Microsoft Visual Studio.
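A rough Python analogue of the SSIS Lookup, Derived Column, and Conditional Split transformations used in the role above; the store master data and column names are purely illustrative.

```python
# Reference data for the Lookup transformation (hypothetical store master).
store_lookup = {"S01": "Phoenix", "S02": "Tucson"}

source_rows = [
    {"store_id": "S01", "qty": 2, "unit_price": 10.0},
    {"store_id": "S02", "qty": 1, "unit_price": 5.5},
    {"store_id": "S99", "qty": 4, "unit_price": 3.0},  # no lookup match
]

matched, unmatched = [], []
for row in source_rows:
    city = store_lookup.get(row["store_id"])            # Lookup transformation
    row["line_total"] = row["qty"] * row["unit_price"]  # Derived Column
    if city is None:                                    # Conditional Split
        unmatched.append(row)
    else:
        row["city"] = city
        matched.append(row)

print(len(matched), len(unmatched))  # 2 1
```

In SSIS the unmatched branch would typically be redirected to an error output or staging table for data-quality review rather than silently dropped.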

Confidential, Lowell, AR

Sr. Data Modeler

Responsibilities:

  • Worked as a Sr. Data Modeler to generate Data Models using Erwin and developed relational database system.
  • Extensively used agile methodology as the Organization Standard to implement the data Models.
  • Worked in an Amazon Web Services (AWS) environment, provisioning and using AWS services.
  • Created semantically rich logical data models (non-relational/NoSQL) that define the Business data requirements.
  • Converted conceptual models into logical models with detailed descriptions of entities and dimensions for Enterprise Data Warehouse.
  • Involved in creating Pipelines and Datasets to load the data onto Data Warehouse.
  • Coordinated with Data Architects to design Big Data and Hadoop projects and to provide idea-driven design.
  • Created database objects in AWS Redshift and followed AWS best practices to convert data types from Oracle to Redshift.
  • Worked on NoSQL databases such as Cassandra.
  • Identified data organized into logical groupings and domains, independent of any application or system.
  • Developed data models and data migration strategies utilizing sound concepts of data modeling including star schema, snowflake schema.
  • Created S3 buckets in the AWS environment to store files, some of which are required to serve static content for a web application.
  • Developed Model aggregation layers and specific star schemas as subject areas within a logical and physical model.
  • Identified Facts & Dimensions Tables and established the Grain of Fact for Dimensional Models.
  • Configured inbound/outbound rules in AWS Security Groups according to the requirements.
  • Established measures to chart progress related to the completeness and quality of metadata for enterprise information.
  • Developed the data dictionary for various projects for the standard data definitions related to data analytics.
  • Managed storage in AWS using Elastic Block Storage, S3, created Volumes and configured Snapshots.
  • Generated the DDL of the target data model and attached it to the Jira ticket to be deployed in different environments.
  • Conducted data modeling for JAD sessions and communicated data related standards.
  • Investigate Data Quality Issues and provide recommendation & solution to address them.
  • Optimized and updated UML Models (Visio) and Relational Data Models for various applications.
  • Validated the data of reports by writing SQL queries in PL/SQL Developer against ODS.
  • Prepared reports to summarize the daily data quality status and work activities.
  • Performed ad-hoc analyses as needed and communicated the results.

Environment: Erwin 9.7, NoSQL, Sqoop, Cassandra 3.11, AWS, Hadoop 3.0, SQL, PL/SQL
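The "generate the DDL of the target data model" step above can be sketched as a small script that emits CREATE TABLE statements from a logical model; the model dictionary, table names, and types here are hypothetical stand-ins for what a tool like Erwin would forward-engineer.

```python
# Hypothetical logical model: table name -> ordered (column, type) pairs.
model = {
    "dim_customer": [("customer_key", "INTEGER PRIMARY KEY"),
                     ("customer_name", "VARCHAR(100)")],
    "fact_order":   [("customer_key", "INTEGER"),
                     ("order_total", "DECIMAL(12,2)")],
}

def generate_ddl(model):
    """Emit a CREATE TABLE statement for each table in the model."""
    statements = []
    for table, columns in model.items():
        cols = ",\n  ".join(f"{name} {ctype}" for name, ctype in columns)
        statements.append(f"CREATE TABLE {table} (\n  {cols}\n);")
    return "\n\n".join(statements)

ddl = generate_ddl(model)
print(ddl)
```

The generated script is what would be attached to a ticket and promoted through environments, so that every deployment runs identical DDL.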

Confidential, Philadelphia, PA

Data Modeler/Data Analyst

Responsibilities:

  • Worked as a Data Modeler/Analyst to generate Data Models using E/R Studio and developed relational database system.
  • Interacted with Business Analysts, SMEs, and other Data Architects to understand business needs.
  • Created Logical and Physical Data Models for relational (OLTP) systems and Star schemas with Fact and Dimension tables using E/R Studio.
  • Performed GAP analysis to analyze the difference between the system capabilities and business requirements.
  • Used Agile Method for daily scrum to discuss the project related information.
  • Responsible for Big Data initiatives and engagement including analysis, brainstorming, POCs, and architecture.
  • Designed and developed architecture for data services ecosystem spanning Relational, NoSQL, and Big Data technologies.
  • Loaded and transformed large sets of structured, semi structured and unstructured data using Hadoop/Big Data concepts.
  • Involved in Data flow analysis, Data modeling, Physical database design, forms design and development, data conversion, performance analysis and tuning.
  • Created and maintained data model standards, including master data management (MDM) and Involved in extracting the data from various sources.
  • Designed the data marts using the Ralph Kimball's Dimensional Data Mart modeling methodology using E/R Studio.
  • Worked on Master data Management (MDM) Hub and interacted with multiple stakeholders.
  • Proficient in developing Entity-Relationship diagrams, Star/Snow Flake Schema Designs, and expert in modeling Transactional Databases and Data Warehouse.
  • Worked on normalization techniques, normalized the data into 3rd Normal Form (3NF).
  • Worked on Normalization and De-normalization concepts and design methodologies such as the Ralph Kimball and Bill Inmon approaches, and implemented Slowly Changing Dimensions.
  • Implemented Forward engineering to create tables, views and SQL scripts and mapping documents.
  • Used reverse engineering to connect to existing database and create graphical representation (E-R diagram).
  • Performed performance tuning and stress-testing of NoSQL database environments to ensure acceptable database performance in production mode.
  • Developed automated data pipelines from various external data sources (web pages, APIs, etc.) to the internal data warehouse (SQL Server), then exported to reporting tools.
  • Connected to AWS Redshift through Tableau to extract live data for real time analysis.
  • Performed data analysis and data profiling using complex SQL on various sources systems including Oracle.
  • Monitored data quality and ensured the integrity of data was maintained for the effective functioning of the department.
  • Managed database design and implemented a comprehensive Star-Schema with shared dimensions.
  • Analyzed the data consuming the most resources and made changes in the back-end code using PL/SQL stored procedures and triggers.
  • Developed and maintained stored procedures, implemented changes to database design including tables and views.
  • Conducted Design reviews with the business analysts and the developers to create a proof of concept for the reports.
  • Performed detailed data analysis to analyze the duration of claim processes.
  • Created the cubes with Star Schemas using facts and dimensions through SQL Server Analysis Services (SSAS).
  • Deployed SSRS reports to Report Manager and created linked reports, snapshots, and subscriptions for the reports and worked on scheduling of the reports.
  • Generated parameterized queries for generating tabular reports using global variables, expressions, functions, and stored procedures using SSRS.
  • Developed and scheduled a variety of reports such as cross-tab, parameterized, drill-through, and sub-reports with SSRS.

Environment: E/R Studio v16, OLTP, Agile, MDM, SQL Server 2017, NoSQL, AWS, Oracle 12c, PL/SQL
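The Slowly Changing Dimension work noted above can be sketched as a minimal Type 2 update in SQLite: the current row is expired and a new current row is inserted. Table, column, and customer names are hypothetical.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.execute("""
CREATE TABLE dim_customer (
    customer_id TEXT,
    city TEXT,
    effective_from TEXT,
    effective_to TEXT,     -- NULL means this is the current version
    is_current INTEGER
)""")
cur.execute("INSERT INTO dim_customer VALUES ('C1', 'Philadelphia', '2020-01-01', NULL, 1)")

def scd2_update(cur, customer_id, new_city, as_of):
    """Type 2 change: expire the current row, then insert a new current row."""
    cur.execute("""UPDATE dim_customer
                   SET effective_to = ?, is_current = 0
                   WHERE customer_id = ? AND is_current = 1""", (as_of, customer_id))
    cur.execute("INSERT INTO dim_customer VALUES (?, ?, ?, NULL, 1)",
                (customer_id, new_city, as_of))

scd2_update(cur, "C1", "Pittsburgh", "2021-06-01")
rows = cur.execute("""SELECT city, is_current FROM dim_customer
                      ORDER BY effective_from""").fetchall()
print(rows)  # [('Philadelphia', 0), ('Pittsburgh', 1)]
```

Keeping both versions lets fact rows join to the dimension as it was at transaction time rather than only to the latest attributes.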

Confidential

Intern -Data Analyst

Responsibilities:

  • Worked with Business users during requirements gathering and prepared Conceptual, Logical and Physical Data Models.
  • Wrote PL/SQL statements, stored procedures, and triggers in DB2 for extracting as well as writing data.
  • Optimized the existing procedures and SQL statements for better performance, using EXPLAIN PLAN, HINTS, and SQL TRACE to tune SQL queries.
  • Developed interfaces able to connect to multiple databases such as SQL Server and Oracle.
  • Assisted Kronos project team in SQL Server Reporting Services installation.
  • Developed SQL Server database to replace existing Access databases.
  • Attended and participated in information and requirements gathering sessions
  • Translated business requirements into working logical and physical data models for Data warehouse, Data marts and OLAP applications.
  • Designed and created web applications to receive query string input from customers and facilitate entering the data into SQL Server databases.
  • Performed thorough data analysis for the purpose of overhauling the database using SQL Server.
  • Designed and implemented business intelligence to support sales and operations functions to increase customer satisfaction.
  • Converted physical database models from logical models, to build/generate DDL scripts.
  • Maintained warehouse metadata, naming standards and warehouse standards for future application development.
  • Extensively used ETL to load data from DB2 and Oracle databases.
  • Involved with data profiling for multiple sources and answered complex business questions by providing data to business users.
  • Worked with data investigation, discovery and mapping tools to scan every single data record from many sources.
  • Designed both 3NF data models for ODS and OLTP systems and dimensional data models using star and snowflake schemas.
  • Wrote and executed unit, system, integration, and UAT scripts in data warehouse projects.
  • Extensively used ETL methodology for supporting data extraction, transformations and loading processing, in a complex EDW using Informatica.
  • Worked extensively with Star Schema, DB2, and IMS DB.

Environment: Oracle, PL/SQL, DB2, Erwin 7.0, UNIX, Teradata SQL Assistant, Informatica, OLTP, OLAP, Data Marts, DQ Analyzer.
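The data profiling mentioned above (scanning records, detecting nulls and value distributions) can be sketched in a few lines of Python; the sample claim records below are hypothetical.

```python
# Hypothetical source records to profile.
records = [
    {"claim_id": "A1", "amount": 120.0, "status": "PAID"},
    {"claim_id": "A2", "amount": None,  "status": "OPEN"},
    {"claim_id": "A3", "amount": 80.0,  "status": "PAID"},
]

def profile(records):
    """Per-column null counts and distinct non-null value counts."""
    report = {}
    for col in records[0].keys():
        values = [r[col] for r in records]
        nulls = sum(v is None for v in values)
        distinct = len({v for v in values if v is not None})
        report[col] = {"nulls": nulls, "distinct": distinct}
    return report

print(profile(records))
```

A profile like this is typically the first pass over a new source, flagging columns with unexpected nulls or suspiciously few distinct values before any mapping work begins.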
