
Sr. Data Modeler/Data Analyst Resume


SUMMARY

  • Over 7 years of working experience in Data Analysis and Data Modeling, with high proficiency in handling databases and data models.
  • Good experience working with analysis tools such as Tableau for regression analysis, pie charts, and bar graphs.
  • Strong experience architecting high-performance databases using PostgreSQL, PostGIS, MySQL, and Cassandra.
  • Excellent experience in creating cloud-based solutions and architectures using Confidential Web Services (Confidential EC2, Confidential S3, Confidential RDS) and Confidential Azure.
  • Experienced in working with Agile methodology. Assisted the quality assurance team in preparing test scripts and test cases. Documented data models, logic, coding, testing, changes, and corrections.
  • Well experienced in writing complex queries, stored procedures, functions, cursors, and packages using PL/SQL programming.
  • Solid hands-on experience with administration of data model repositories and documentation in metadata portals using tools such as Erwin, ER/Studio, and PowerDesigner.
  • Captured, validated, and published metadata in accordance with enterprise data governance policies and MDM taxonomies.
  • Defined data governance rules and standards to maintain consistency of business element names across the different data layers.
  • Experienced in Data Analysis, Data Migration, Data Cleansing, Transformation, Integration, Data Import, and Data Export through the use of multiple ETL tools such as Informatica Power Center.
  • Experienced in Data Modeling including Data Validation/Scrubbing and Operational assumptions.
  • Very good knowledge in Data Analysis, Data Validation, Data Cleansing, Data Verification and Identifying Data Mismatch.
  • Extensive experience using Excel pivot tables to run and analyze result data sets, and in UNIX scripting.
  • Extensive experience with ETL and reporting tools such as SQL Server Integration Services (SSIS) and SQL Server Reporting Services (SSRS).
  • Experience in Normalization and Denormalization processes and Logical and Physical Data Modeling techniques.
  • Experience in database performance tuning and data access optimization, writing complex SQL queries and PL/SQL blocks such as stored procedures, functions, triggers, cursors, and ETL packages.
  • Experience with SQL Server and T-SQL in constructing temporary tables, table variables, triggers, user-defined functions, views, and stored procedures.
  • Energetic and self-motivated team player with excellent communication, interpersonal, and leadership skills; thrives in both independent and collaborative work environments.

TECHNICAL SKILLS

Data Modeling Tools: ER/Studio V17, Erwin 9.7, Sybase PowerDesigner 16.6.

Cloud Management: Confidential Web Services (AWS), Confidential Redshift, MS Azure, Azure Data Factory

Big Data Tools: Hadoop 3.0, HDFS, Hive 2.3

Methodologies: RAD, JAD, RUP, UML, System Development Life Cycle (SDLC), Agile, Waterfall Model.

Programming Languages: SQL, PL/SQL, UNIX shell Scripting

Operating System: Windows, Unix, Sun Solaris

ETL/Reporting Tools: Informatica 9.6/9.1 and Tableau.

PROFESSIONAL EXPERIENCE

Confidential, Lowell, AR

Sr. Data Modeler/Data Analyst

Responsibilities:

  • Served in a Sr. Data Modeler/Analyst role to review business requirements and compose source-to-target data mapping documents.
  • Extensively used Erwin for Data modeling. Created Staging and Target Models for the Enterprise Data Warehouse.
  • Conducted and participated in JAD sessions with project managers, the business analysis team, and development teams to gather, analyze, and document business and reporting requirements.
  • Extensively used Agile methodology as the organization standard to implement the data models.
  • Configured Hadoop Ecosystems to read data transaction from HDFS and Hive.
  • Researched and developed hosting solutions using MS Azure for service solution.
  • Responsible for Big data initiatives and engagement including analysis, brainstorming, POC, and architecture.
  • Designed the Logical Data Model using Erwin with the entities and attributes for each subject area.
  • Developed data mapping, data governance, transformation, and cleansing rules for the Master Data Management architecture involving OLTP and ODS.
  • Generated the DDL of the target data model and attached it to the Jira ticket to be deployed in different environments.
  • Developed T-SQL scripts to create databases, database objects and to perform DML and DDL tasks.
  • Defined facts and dimensions and designed the data marts using Ralph Kimball's dimensional data mart modeling methodology in Erwin.
  • Worked on Data load using Azure Data factory using external table approach.
  • Developed and implemented data cleansing, data security, data profiling and data monitoring processes.
  • Developed and maintained data dictionary to create metadata reports for technical and business purpose.
  • Developed and configured the Informatica MDM hub to support the Master Data Management (MDM), Business Intelligence (BI), and data warehousing platforms and meet business needs.
  • Performed ad-hoc analyses as needed and clearly communicated the resulting findings.
  • Facilitated in developing testing procedures, test cases and User Acceptance Testing (UAT).
  • Worked with project management, business teams and departments to assess and refine requirements to design/develop BI solutions using MS Azure.

Technologies: Erwin 9.7, SQL, MS Azure, T-SQL, MDM, OLTP, ODS, MS Excel 2016, 3NF, Agile, Ralph Kimball, Hadoop 3.0, HDFS, Hive 2.3

Confidential, Cincinnati, OH

Data Modeler

Responsibilities:

  • As a Data Modeler, involved in the entire project life cycle, from requirements gathering through system integration.
  • Interacted with users and business analysts to gather requirements.
  • Reviewed business requirements and composed source-to-target data mapping documents.
  • Participated in design discussions and assured functional specifications are delivered in all phases of SDLC in an Agile Environment.
  • Coordinated with Data Architects on provisioning AWS EC2 infrastructure and deploying applications behind Elastic Load Balancing.
  • Defined facts and dimensions and designed the data marts using Ralph Kimball's dimensional data mart modeling methodology in Erwin.
  • Developed Conceptual Model and Logical Model using Erwin based on requirements analysis.
  • Designed Star and Snowflake Data Models for Enterprise Data Warehouse using Erwin.
  • Designed both 3NF data models for ODS and OLTP systems and dimensional data models using Star and Snowflake schemas.
  • Used Normalization and Denormalization techniques for effective performance in OLTP systems.
  • Used forward engineering to create tables, views, SQL scripts, and mapping documents.
  • Created Project Plan documents, Software Requirement Documents, Environment Configuration and UML diagrams.
  • Worked on NoSQL databases including Cassandra.
  • Implemented multi-data center and multi-rack Cassandra cluster.
  • Translated business concepts into XML vocabularies by designing XML Schemas with UML.
  • Worked on Confidential Redshift on AWS, architecting a solution to load data and create data models.
  • Worked on designing a Star schema for the detailed data marts and plan data marts involving conformed dimensions.
  • Developed and maintained data Dictionary to create Metadata Reports for technical and business purpose.
  • Completed enhancements for Master Data Management (MDM) and suggested an implementation of hybrid MDM.
  • Worked on designing, implementing and deploying into production an Enterprise data warehouse.
  • Developed Data Mapping, Data Governance and transformation rules for the Master Data Management Architecture involving OLTP and ODS.
  • Developed Data Migration and Cleansing rules for the Integration Architecture (OLTP, ODS, DW).
  • Worked on metadata management as part of the data governance team.
  • Conducted numerous POCs to efficiently import large data sets into the database from AWS.
  • Wrote DDL and DML statements for creating and altering tables and converting character values into numeric values.
  • Worked with MDM systems team with respect to technical aspects and generating reports.
  • Created PL/SQL procedures to support business functionalities such as bidding and allocation of inventory to shippers.
  • Performed Data profiling, data mining and identified the risks involved with data integration to avoid time delays in the project.
  • Worked on Performance Tuning of the database which includes indexes, optimizing SQL Statements.
  • Involved in capturing data lineage, table and column data definitions, valid values and others necessary information in the data model.
  • Created or modified T-SQL queries per business requirements.
  • Involved in user sessions and assisted in UAT (User Acceptance Testing).
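The DDL/DML work described above (creating and altering tables, converting character values to numeric) can be sketched with a minimal, hypothetical example; the table and column names are invented for illustration, using an in-memory SQLite database as a stand-in for the production RDBMS:

```python
import sqlite3

# Hypothetical sketch: DDL to create and alter a table, then DML with
# CAST to convert a character column into a numeric one.
conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.execute("CREATE TABLE products (sku TEXT, price_raw TEXT)")      # DDL: create
cur.execute("ALTER TABLE products ADD COLUMN price REAL")            # DDL: alter
cur.execute("INSERT INTO products (sku, price_raw) VALUES ('A1', '19.99')")  # DML
cur.execute("UPDATE products SET price = CAST(price_raw AS REAL)")   # char -> numeric
price = cur.execute("SELECT price FROM products").fetchone()[0]
print(price)
```

The same CAST pattern applies in Oracle or SQL Server DDL/DML, with vendor-specific type names.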

Technologies: Erwin 9.7, Agile, AWS, EC2, PL/SQL, XML, OLTP, ETL, Cassandra, NoSQL, EDW.

Confidential, Charlotte, NC

Sr. Data Analyst

Responsibilities:

  • As a Data Analyst, analyzed data quality issues against source systems and prepared a data quality document covering all source data.
  • Used SQL tools to run SQL queries and validate the data loaded into the target tables.
  • Created tables, views, sequences, indexes, constraints and generated SQL scripts for implementing physical data model.
  • Generated periodic reports based on statistical analysis of the data from various time frames and divisions using SQL Server Reporting Services (SSRS).
  • Worked with the ETL team on loading data from the staging area to the data warehouse.
  • Created dimensional model for reporting system by identifying required dimensions and facts using Erwin.
  • Evaluated data profiling, cleansing, integration and extraction tools.
  • Worked on creating Excel Reports which includes Pivot tables and Pivot charts.
  • Involved in extensive data validation by writing SQL queries, in back-end testing, and in working through data quality issues.
  • Performed error checking and validated ETL procedures and programs using Informatica session logs for exceptions.
  • Extracted data from legacy systems into staging area using ETL jobs & SQL queries.
  • Designed and developed architecture for data services ecosystem spanning Relational, NoSQL, and Big Data technologies.
  • Responsible for the data architecture design delivery, data model development, review, approval and Data warehouse implementation.
  • Worked on ETL testing and used the SSIS Tester automated tool for unit and integration testing.
  • Designed and created ETL framework from ground up.
  • Developed and delivered dynamic reporting solutions using SSRS.
  • Involved in designing and developing Data Models and Data Marts that support the Business Intelligence Data Warehouse.
  • Created SQL tables with referential integrity and constraints, and developed queries using SQL, SQL*Plus, and PL/SQL.
  • Performed gap analysis of current state to desired state and documented requirements to control the gaps identified.
  • Worked on ETL process of data loading from different sources and data validation process from staging area to data warehouse.
  • Managed all indexing, debugging and query optimization techniques for performance tuning using T-SQL.
  • Worked with the application and Business Analyst team to develop requirements.
  • Involved in Data profiling and performed Data Analysis based on the requirements, which helped in catching many Sourcing Issues upfront.
  • Generated comprehensive analytical reports by running SQL queries against current databases to conduct data analysis.
  • Wrote T-SQL statements for data retrieval and was involved in performance tuning of T-SQL queries and stored procedures.
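The validation work above (checking data loaded from staging into target tables) can be illustrated with a small, hypothetical sketch; the table and column names are invented, and an in-memory SQLite database stands in for the warehouse:

```python
import sqlite3

# Hypothetical row-count and mismatch reconciliation between a staging
# table and its warehouse target, the typical shape of a load-validation query.
conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.execute("CREATE TABLE stg_orders (order_id INTEGER, amount REAL)")
cur.execute("CREATE TABLE dw_orders  (order_id INTEGER, amount REAL)")
rows = [(1, 10.0), (2, 25.5), (3, 7.25)]
cur.executemany("INSERT INTO stg_orders VALUES (?, ?)", rows)
cur.executemany("INSERT INTO dw_orders VALUES (?, ?)", rows)

# Row-count check: source and target should match after the load.
src_count = cur.execute("SELECT COUNT(*) FROM stg_orders").fetchone()[0]
tgt_count = cur.execute("SELECT COUNT(*) FROM dw_orders").fetchone()[0]

# Mismatch check: rows present in staging but missing from the target.
missing = cur.execute("""
    SELECT s.order_id FROM stg_orders s
    LEFT JOIN dw_orders d ON s.order_id = d.order_id
    WHERE d.order_id IS NULL
""").fetchall()
print(src_count, tgt_count, missing)
```

In practice the same anti-join pattern runs against the real staging and warehouse schemas, often extended with column-level checksums.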

Technologies: SQL, PL/SQL, SAS, Confidential Excel 2010, T-SQL, Pivot Tables, VLOOKUP, Triggers, Stored Procedures.

Confidential, Newport Beach, CA

Data Analyst

Responsibilities:

  • Understood the data visualization requirements from the business users.
  • Wrote SQL queries to extract data from the sales data marts per the requirements.
  • Developed Tableau data visualizations using scatter plots, geographic maps, pie charts, bar charts, and density charts.
  • Designed and deployed rich graphic visualizations with drill-down and drop-down menu options and parameters using Tableau.
  • Performed data analysis and data profiling using complex SQL on various source systems including Oracle.
  • Involved in extensive data validation using SQL queries and back-end testing.
  • Worked on Normalization and De-Normalization techniques for OLTP systems.
  • Explored traffic data from Google Analytics, connected it with transaction data, presented and wrote reports for every campaign, and provided suggestions for future promotions.
  • Extracted data using SQL queries and transferred it to Confidential Excel and Python for further analysis.
  • Worked on performance tuning of SQL queries for data warehouse consisting of many tables with large amount of data.
  • Participated in updating the dimensional model and identifying the facts and dimensions.
  • Used T-SQL stored procedures to transfer data from OLTP databases to staging area and finally transfer into data marts.
  • Developed SQL queries to fetch complex data from different tables in remote databases using joins, database links and bulk collects.
  • Performed data cleaning, merging, and exporting of datasets in Tableau Prep.
  • Carried out data processing and cleaning techniques to reduce text noise and dimensionality and improve the analysis.
  • Created numerous reports using ReportLab and other Python packages; installed numerous Python modules.
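The text-noise reduction step described above can be sketched in a few lines of Python; the stop-word list and sample campaign text are illustrative only, not from the original project:

```python
import re
from collections import Counter

# Hypothetical sketch: lowercase, strip punctuation, and drop common stop
# words to reduce text noise and dimensionality before analysis.
STOP_WORDS = {"the", "a", "an", "and", "or", "of", "to", "in"}

def clean_tokens(text):
    """Normalize raw report text into a list of informative tokens."""
    tokens = re.findall(r"[a-z]+", text.lower())
    return [t for t in tokens if t not in STOP_WORDS]

report_text = "The Spring campaign drove an increase in sales and traffic."
tokens = clean_tokens(report_text)
print(Counter(tokens).most_common(3))
```

Real pipelines typically add stemming or lemmatization on top of this, but the normalize-then-filter shape is the same.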

Technologies: SQL, Python, Tableau, Business Intelligence, Access, Excel, MS Office, MS Visio, Statistics, Machine Learning.

Confidential

Data Analyst

Responsibilities:

  • Designed and implemented business intelligence to support sales and operations functions to increase customer satisfaction.
  • Involved in data analysis and data discrepancy reduction in the source and target schemas.
  • Developed complex PL/SQL procedures and packages using views and SQL joins.
  • Developed reports using different SSIS functionalities such as sorting, prompts, cascading parameters, and multi-value parameters.
  • Conducted detailed analysis of data issues, mapping data from source to target, and design and data cleansing on the data warehouse.
  • Involved in identifying the data requirements and creating a data dictionary for the functionalities.
  • Analyzed and built proofs of concept to convert SAS reports into Tableau or use SAS datasets in Tableau.
  • Created or modified T-SQL queries per business requirements.
  • Developed and optimized stored procedures for use as a data window source for complex reporting purpose.
  • Performed the batch processing of data, designed the SQL scripts, control files, batch file for data loading.
  • Coordinated with data stewards / data owners to discuss source data quality issues and resolved them based on the findings.
  • Worked with data investigation, discovery and mapping tools to scan every single data record from many sources.
  • Involved with data profiling for multiple sources and answered complex business questions by providing data to business users.
  • Worked on SQL Server concepts SSIS (SQL Server Integration Services), SSAS (Analysis Services) and SSRS (Reporting Services).
  • Developed database objects including tables, indexes, views, sequences, packages, triggers, and procedures to troubleshoot any database problems.
  • Worked on different data formats such as flat files, SQL files, databases, XML schemas, and CSV files.
  • Involved in designing Parameterized Reports for generating ad-hoc reports as per the business requirements.

Technologies: SAS, Tableau 8.1, Ad-hoc, SQL, T-SQL, Flat Files, SSIS (VS 2013), SSRS, SSAS, XML, Business Intelligence.
