
Data Modeler/Data Analyst Resume


West Chester, PA

SUMMARY

  • Over 7 years of experience as a Data Modeler/Analyst with high proficiency in requirements gathering and data modeling.
  • Strong working experience with Agile, Scrum, Kanban, and Waterfall methodologies.
  • Proficient in Software Development Life Cycle (SDLC), Project Management methodologies, and Microsoft SQL Server database management.
  • Experienced in designing Conceptual, Logical and Physical data models using Erwin, Sybase Power Designer and ER Studio data modeling tools.
  • Very good experience and knowledge of Amazon Web Services, including AWS Redshift, AWS S3, and AWS EMR.
  • Hands-on experience in Normalization and De-Normalization techniques for optimum performance in relational and dimensional database environments.
  • Good experience with JAD sessions for requirements gathering, creating data mapping documents, and writing functional specifications and queries.
  • Experience in implementing data warehousing solutions involving dimensional modeling, Snowflake schema, and 3NF implementations.
  • Experience in modeling both OLTP and OLAP systems in Kimball and Inmon data warehousing environments.
  • Experienced in capturing, validating, and publishing metadata in accordance with enterprise data governance policies and MDM taxonomies.
  • Excellent experience in writing SQL queries to validate data movement between different layers in data warehouse environment.
  • Excellent experience in writing and executing unit, system, integration, and UAT scripts in data warehouse projects.
  • Experience in data analysis to track data quality and to detect and correct inaccurate data in databases.
  • Extensive experience with ETL and reporting tools such as SQL Server Integration Services (SSIS) and SQL Server Reporting Services (SSRS).
  • Extensive experience working with XML, XML schema design, and XML data.
  • Experience in data transformation, data mapping from source to target database schemas, and data cleansing.
  • Extensive experience in development of T-SQL, DML, DDL, DTS, Stored Procedures, Triggers, Sequences, Functions and Packages.
  • Experience designing security at both the schema level and the accessibility level in conjunction with the DBAs.
  • Experience in performance tuning and optimization of the database for business logic implementation.
  • Experience in integration of various data sources with multiple Relational Databases like SQL Server, Teradata, and Oracle.
  • Hands on experience with modeling using Erwin in both forward and reverse engineering processes.
  • Experience in reporting using Business Objects and Crystal Reports; designed WebI and Crystal reports.
  • Excellent communication, interpersonal, analytical, and leadership skills; a quick starter with the ability to master and apply new concepts.

TECHNICAL SKILLS

Data Modeling Tools: Erwin Data Modeler 9.7/9.6, Erwin Model Manager, ER Studio v17, and Power Designer.

Programming Languages: SQL, PL/SQL, HTML5, XML and VBA.

Reporting Tools: SSRS, Power BI, Tableau, SSAS, MS-Excel, SAS BI Platform.

Cloud Platforms: AWS (EC2, S3, Redshift) & MS Azure

OLAP Tools: Tableau 7, SAP Business Objects, SSAS, and Crystal Reports 9

Databases: Oracle 12c/11g, Teradata R15/R14, MS SQL Server 2016/2014, DB2.

Operating Systems: Windows, Unix, Sun Solaris

ETL/Data warehouse Tools: Informatica 9.6/9.1, SAP Business Objects XIR3.1/XIR2, Talend, and Pentaho.

Methodologies: RAD, JAD, RUP, UML, System Development Life Cycle (SDLC), Agile, Waterfall Model

PROFESSIONAL EXPERIENCE

Confidential - West Chester, PA

Data Modeler/Data Analyst

Responsibilities:

  • As a Data Modeler/ Data Analyst, I was responsible for all data related aspects of a project.
  • Participated in design discussions and ensured functional specifications were delivered in all phases of the SDLC in an Agile environment.
  • Responsible for Big Data initiatives and engagements, including analysis, brainstorming, POCs, and architecture.
  • Worked with Big Data (on-premises and in the cloud), Master Data Management, and Data Governance.
  • Worked on data loads using Azure Data Factory with an external-table approach.
  • Identified the entities and relationship between the entities to develop Conceptual Model using Erwin.
  • Involved in Normalization and De-Normalization of existing tables for faster query retrieval.
  • Created SSIS packages to pull data from SQL Server and exported to MS Excel Spreadsheets and vice versa.
  • Defined facts, dimensions and designed the data marts using the Ralph Kimball Dimensional Data Mart modeling methodology using Erwin.
  • Facilitated transition of logical data models into the physical database design for data marts using Inmon methodology.
  • Designed and developed dynamic advanced T-SQL, stored procedures, XML, user defined functions, parameters, views, tables, triggers and indexes.
  • Developed the required data warehouse model using a Star schema for the generalized model (a simplified DDL sketch follows this section).
  • Deployed and uploaded the SSRS reports to SharePoint Server for the end users and involved in enhancements and modifications.
  • Worked on cloud computing using Microsoft Azure with various BI technologies and explored NoSQL options for the current back end using Azure Cosmos DB (SQL API).
  • Designed and developed a Data Lake using Hadoop for processing raw and processed claims via Hive and Informatica.
  • Extensively used Star and Snowflake Schema methodologies.
  • Used Normalization (1NF, 2NF & 3NF) and De-normalization techniques for effective performance in OLTP and OLAP systems.
  • Developed dimensional model for Data Warehouse/OLAP applications by identifying required facts and dimensions.
  • Used Windows Azure SQL Reporting Services to create reports with tables, charts, and maps.
  • Created and maintained SQL Server scheduled jobs, executing stored procedures for the purpose of extracting data from DB2 into SQL Server.
  • Involved in Dimensional modeling (Star Schema) of the Data warehouse and used Erwin to design the business process, dimensions and measured facts.
  • Generated parameterized queries for generating tabular reports using global variables, expressions, functions, and stored procedures using SSRS.

Environment: Agile, MS Azure, SSIS, SQL, MS Excel 2019, SSRS, Erwin 9.7, T-SQL, XML, NoSQL, OLTP, OLAP, Hive 3.1, Hadoop 3.0
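
The star-schema work described above can be pictured with a minimal T-SQL DDL sketch. The table and column names below (DimDate, DimMember, FactClaim, ClaimAmount) are illustrative assumptions, not the actual model:

    -- Hypothetical star schema: one fact table referencing two dimension tables.
    CREATE TABLE DimDate (
        DateKey       INT      NOT NULL PRIMARY KEY,   -- surrogate key, e.g. 20190131
        CalendarDate  DATE     NOT NULL,
        CalendarMonth TINYINT  NOT NULL,
        CalendarYear  SMALLINT NOT NULL
    );

    CREATE TABLE DimMember (
        MemberKey  INT IDENTITY(1,1) NOT NULL PRIMARY KEY,  -- surrogate key
        MemberId   VARCHAR(20)  NOT NULL,                   -- natural (business) key
        MemberName VARCHAR(100) NULL
    );

    CREATE TABLE FactClaim (
        ClaimKey    BIGINT IDENTITY(1,1) NOT NULL PRIMARY KEY,
        DateKey     INT NOT NULL REFERENCES DimDate (DateKey),
        MemberKey   INT NOT NULL REFERENCES DimMember (MemberKey),
        ClaimAmount DECIMAL(12,2) NOT NULL   -- additive measure
    );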

Confidential - Chicago, IL

Data Modeler/Analyst

Responsibilities:

  • Worked as a Data Modeler to generate data models and develop relational database systems.
  • Identified and compiled common business terms for the new policy-generating system and worked on the Contract subject area.
  • Interacted with Business Analysts to gather the user requirements and participated in data modeling JAD sessions.
  • Extensively used Agile methodology as the Organization Standard to implement the data Models.
  • Worked on migrating the EDW to AWS using EMR and various other technologies.
  • Worked at conceptual/logical/physical data model level using Erwin according to requirements.
  • Initiated and conducted JAD sessions inviting various teams to finalize the required data fields and their formats.
  • Responsible for troubleshooting issues in the execution of MapReduce jobs by inspecting and reviewing log files in AWS S3.
  • Designed Data Flow Diagrams, E/R Diagrams and enforced all referential integrity constraints.
  • Performed extensive data validation and data verification against the Data Warehouse and performed debugging of SQL statements and stored procedures for business scenarios (see the reconciliation sketch after this section).
  • Created MDM base objects, Landing and Staging tables to follow the comprehensive data model in MDM.
  • Implemented Forward engineering to create tables, views and SQL scripts and mapping documents.
  • Created SSIS package to load data from Excel to SQL server using connection manager.
  • Used Normalization and De-normalization techniques for effective performance in OLTP and OLAP systems.
  • Translated business concepts into XML vocabularies by designing XML Schemas with UML.
  • Involved in Normalization (3rd normal form), De-normalization (Star Schema) for Data Warehousing.
  • Worked on Amazon Redshift and AWS, architecting a solution to load data and create data models.
  • Produced various types of reports using SQL Server Reporting Services (SSRS).
  • Worked with data compliance teams to maintain data models, metadata, and data dictionaries, and to define source fields and their definitions.
  • Conducted Design discussions and meetings to come out with the appropriate Data Mart using Kimball Methodology.
  • Designed and Developed Use Cases, Activity Diagrams, and Sequence Diagrams using UML
  • Established a business analysis methodology around the RUP (Rational Unified Process).
  • Used SQL Server and T-SQL to construct tables and apply normalization and de-normalization techniques to database tables.
  • Worked on analyzing source systems and their connectivity, discovery, data profiling and data mapping.
  • Developed and implemented data cleansing, data security, data profiling and data monitoring processes.

Environment: Agile, Erwin 9.6, MapReduce, AWS, SQL, MDM, SSIS, MS Excel 2018, OLTP, OLAP, XML, SSRS, T-SQL
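
One way to picture the data validation work called out above is a layer-to-layer reconciliation query. This is a minimal T-SQL sketch, and the stg.Policies and dw.FactPolicy names are hypothetical:

    -- Compare row counts and total amounts between a staging table and the warehouse fact.
    SELECT
        src.RowCnt              AS StagingRows,
        tgt.RowCnt              AS WarehouseRows,
        src.RowCnt - tgt.RowCnt AS RowDifference,
        src.TotalAmt            AS StagingAmount,
        tgt.TotalAmt            AS WarehouseAmount
    FROM (SELECT COUNT(*) AS RowCnt, SUM(PremiumAmount) AS TotalAmt FROM stg.Policies) AS src
    CROSS JOIN
         (SELECT COUNT(*) AS RowCnt, SUM(PremiumAmount) AS TotalAmt FROM dw.FactPolicy) AS tgt;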

Confidential - Phoenix, AZ

Data Modeler

Responsibilities:

  • Performed Data Analysis on the source data in order to understand the relationship between the entities
  • Worked with functional analysts to prepare source-to-target mapping specifications with the business logic and helped with data modeling.
  • Initiated and conducted JAD sessions inviting various teams to finalize the required data fields and their formats.
  • Used ER/Studio; many-to-many relationships between entities were resolved using associative tables.
  • Extensively involved in extracting the data using SSIS from OLTP to OLAP.
  • Worked closely with the Application development Team to Implement Security Master Data Management (MDM) Solution.
  • Created SSIS packages to Extract, Transform and load data using different transformations such as Lookup, Derived Columns, Condition Split, Aggregate and Pivot Transformation.
  • Wrote PL/SQL stored procedures, triggers, and cursors for implementing business rules and transformations (a brief sketch follows this section).
  • Created and managed many objects in large Oracle Databases containing millions of records.
  • Involved in the analysis of the existing claims processing system, mapping phase according to functionality and data conversion procedure.
  • Ensured that Business Requirements can be translated into Data Requirements.
  • Designed Star and Snowflake Data Models for Enterprise Data Warehouse using ER/ Studio.
  • Designed and developed conversion of relational data to XML on Windows, Mainframe, and UNIX.
  • Created tables, indexes, views, snapshots, and database links to see the information extracted from SQL files.
  • Developed normalized logical and physical database models to design an OLTP system for insurance applications.
  • Extensively used Metadata & Data Dictionary Management; Data Profiling and Data Mapping.
  • Worked with relational database concepts, entity-relationship diagrams, and normalization and de-normalization concepts.
  • Resolved the data type inconsistencies between the source systems and the target system using the mapping documents.
  • Supported the team in resolving SQL Reporting Services and T-SQL related issues; proficient in creating different types of reports.
  • Designed and implemented Parameterized and cascading parameterized reports using SSRS.

Environment: ER/Studio, SSIS, OLTP, OLAP, MDM, PL/SQL, XML, T-SQL, SQL
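
The PL/SQL work in this role could look roughly like the following. The procedure, table, and column names are hypothetical, and the surcharge rule is only an example of a business rule:

    -- Apply a percentage surcharge to active policies using an explicit cursor.
    CREATE OR REPLACE PROCEDURE apply_policy_surcharge (p_pct IN NUMBER) IS
        CURSOR c_policies IS
            SELECT policy_id, premium_amt
            FROM   policies
            WHERE  status = 'ACTIVE'
            FOR UPDATE OF premium_amt;
    BEGIN
        FOR r IN c_policies LOOP
            UPDATE policies
            SET    premium_amt = r.premium_amt * (1 + p_pct / 100)
            WHERE  CURRENT OF c_policies;
        END LOOP;
        COMMIT;
    END apply_policy_surcharge;
    /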

Confidential - Redwood City, CA

Data Analyst/Data Modeler

Responsibilities:

  • Created databases, tablespaces, and buffer pools in Power Designer to create physical tables in the database.
  • Involved in the data analysis and database modeling of both OLTP and OLAP environment.
  • Extensively used SQL Analyzer and wrote complex SQL queries using joins, subqueries, and correlated subqueries.
  • Designed a Star schema for the detailed data marts and plan data marts involving conformed dimensions.
  • Worked extensively in both Forward Engineering as well as Reverse Engineering using data modeling tools.
  • Developed custom logging so users can know when a row is inserted into the custom logging table by every SSIS package that executes.
  • Created Tableau scorecards and dashboards using stacked bars, bar graphs, scatter plots, geographical maps, and Gantt charts using the Show Me functionality.
  • Used forward engineering to create a Physical Data Model with DDL that best suits the requirements from the Logical Data Model
  • Developed Snowflake schemas by normalizing the dimension tables as appropriate (see the sketch after this section).
  • Performed normalization of the existing OLTP systems (3rd NF) to speed up DML statement execution.
  • Provided data sourcing methodology, resource management and performance monitoring for data acquisition.
  • Created and modified various Stored Procedures used in the application using T-SQL.
  • Extensively involved in creating PL/SQL objects i.e. Procedures, Functions, and Packages.
  • Involved in designing and developing Data Models and Data Marts that support the Business Intelligence Data Warehouse.
  • Defined and implemented ETL development standards and procedures for data warehouse environment.
  • Wrote complex expressions and calculations for SSRS reports with conditional formatting.
  • Implemented pivot tables and charts in MS Excel for summarizing data.

Environment: Power designer 16.6, OLTP, OLAP, SQL, SSIS, T-SQL, PL/SQL, SSRS, MS Excel 2016
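
Snowflaking a dimension, as mentioned above, simply normalizes repeating attributes into their own table. A minimal T-SQL sketch with hypothetical product and category names:

    -- The category attributes are moved out of the product dimension into a sub-dimension.
    CREATE TABLE DimProductCategory (
        CategoryKey  INT IDENTITY(1,1) NOT NULL PRIMARY KEY,
        CategoryName VARCHAR(50) NOT NULL
    );

    CREATE TABLE DimProduct (
        ProductKey  INT IDENTITY(1,1) NOT NULL PRIMARY KEY,
        ProductName VARCHAR(100) NOT NULL,
        CategoryKey INT NOT NULL REFERENCES DimProductCategory (CategoryKey)
    );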

Confidential

Data Analyst

Responsibilities:

  • Involved with data analysis, primarily identifying data sets, source data, source metadata, data definitions, and data formats.
  • Generated comprehensive analytical reports by running SQL queries against current databases to conduct data analysis.
  • Wrote complex SQL, PL/SQL procedures, functions, and packages to validate data and support the testing process.
  • Extensively developed Unix shell scripts to run batch jobs and communicate messages to the users.
  • Created packages using Integration Services (SSIS) for data extraction from flat files and Excel.
  • Worked on data profiling and data validation to ensure the accuracy of the data between the warehouse and source systems (see the profiling sketch after this section).
  • Created reports using SSRS from OLTP and OLAP data sources and deployed on report server.
  • Involved in data mining, transformation and loading from the source systems to the target system.
  • Involved in testing the XML files and checked whether data is parsed and loaded to staging tables.
  • Used procedures such as PROC IMPORT to load data into SAS from differently formatted files such as CSV and .txt files.
  • Created action filters, parameters and calculated sets for preparing dashboards and worksheets.
  • Reviewed T-SQL issues using Management Studio to improve performance.
  • Involved in SQL development, unit testing, and performance tuning, and ensured testing issues were resolved based on defect reports.
  • Worked on the Trillium data quality tool to monitor production systems for data anomalies and resolve issues.
  • Generated ad-hoc reports using MS Reporting Services and Crystal Reports.
  • Involved with data profiling for multiple sources and answered complex business questions by providing data to business users.

Environment: SQL, PL/SQL, SSIS, MS Excel 2012, OLTP, OLAP, SSRS, XML, T-SQL
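
A data profiling pass like the one described above typically starts with simple completeness and uniqueness counts. This is a sketch only; src.Customer and its columns are hypothetical:

    -- Row count, null count, and distinct count for a candidate key column.
    SELECT
        COUNT(*)                                             AS TotalRows,
        SUM(CASE WHEN customer_id IS NULL THEN 1 ELSE 0 END) AS NullCustomerIds,
        COUNT(DISTINCT customer_id)                          AS DistinctCustomerIds,
        COUNT(*) - COUNT(DISTINCT customer_id)               AS PotentialDuplicates
    FROM src.Customer;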
