Sr. Data Modeler/Data Analyst Resume

Chicago, IL

SUMMARY:

  • 8+ years of extensive experience in the complete Software Development Life Cycle (SDLC), covering Requirements Management, Data Modeling, Data Analysis, Data Mapping, System Analysis, Architecture and Design, Development, Testing and Deployment of business applications, as well as business analysis.
  • Strong Data Mapping experience using ER diagrams, Dimensional/Hierarchical data modeling, Star Schema modeling and Snowflake Schema modeling using tools like Erwin, E/R Studio and Sybase Power Designer.
  • Experienced in creating DDL scripts for implementing data model changes.
  • Experienced in creating Erwin reports in HTML and RTF formats depending upon the requirement; published data models in Model Mart, created naming convention files, and coordinated with DBAs to apply data model changes.
  • Expertise in using data modeling tools like Erwin, Power Designer and E/R Studio.
  • Excellent experience normalizing tables and dimensions up to 3NF in order to optimize performance.
  • Experienced in Data Integration techniques like Data Extraction, Transformation and Loading (ETL) from disparate source databases like Oracle, SQL Server, MS Access, flat files, CSV files and XML files into the target warehouse using various transformations in Informatica and SSIS.
  • Expertise in Physical Modeling for multiple platforms such as Oracle, Teradata, SQL Server, DB2 and Netezza.
  • Expertise in Source to Target data mapping, standardization documents, slowly changing dimension mapping creation, Star/Snowflake Schema mapping creation, RDBMS, building Data Marts and Metadata Management.
  • Experienced using Teradata SQL Assistant and data load/export utilities like BTEQ, FastLoad (FLoad), MultiLoad (MLoad) and FastExport, with exposure to TPump on UNIX/Windows environments.
  • Experienced in creating the Logical and Physical design of the Data Warehouse (both Fact and Dimension tables) using the Star Schema and Snowflake Schema approaches; a star schema sketch follows this list.
  • Proficient in Normalization (1NF/2NF/3NF)/De-normalization techniques in relational/dimensional database environments.
  • Experienced with Teradata SQL and Teradata Utilities, and extensively worked on Teradata data modeling projects.
  • Solid understanding of Rational Unified Process (RUP) using Rational Rose, Requisite Pro, Unified Modeling Language (UML), Object Modeling Technique (OMT), Extreme Programming (XP), and Object Oriented Analysis (OOA).
  • Strong experience in conducting User Acceptance Testing (UAT) and documentation of Test Cases. Expertise in designing and developing Test Plans and Test Scripts.
  • Expertise in relational database concepts, dimensional database concepts, and database architecture & design for Business Intelligence and OLAP.
  • Experienced in development methodologies like RUP, Agile, Scrum and Waterfall across the SDLC.
  • Well versed with the software development life cycle (SDLC). Experience in dealing with different data sources ranging from flat files, Oracle, Sybase, SQL Server, Teradata, MS Access and Excel.
  • Experienced with creating reports using Crystal Reports XI.
  • Extensive experience in loading high-volume data; worked extensively with data migration, data cleansing and ETL processes.
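
The following is a minimal, illustrative star schema sketch of the kind referenced above; the table and column names are hypothetical and the DDL is generic ANSI SQL rather than output from any specific project.

    -- Hypothetical dimension table with a surrogate key and descriptive attributes
    CREATE TABLE dim_customer (
        customer_key   INTEGER      NOT NULL PRIMARY KEY,  -- surrogate key
        customer_id    VARCHAR(20)  NOT NULL,              -- natural/business key
        customer_name  VARCHAR(100),
        region         VARCHAR(50)
    );

    -- Hypothetical fact table keyed by foreign keys to the dimensions
    CREATE TABLE fact_sales (
        date_key       INTEGER      NOT NULL,              -- FK to a date dimension (not shown)
        customer_key   INTEGER      NOT NULL REFERENCES dim_customer (customer_key),
        sales_amount   DECIMAL(12,2),
        quantity       INTEGER
    );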

TECHNICAL SKILLS:

Data Modeling Tools: Erwin, E/R Studio, Sybase Power Designer, MS Visio

Data Analysis: Requirement Analysis, Business Analysis, Data Flow Diagrams, Business Rules, Data Modeling

Databases: Oracle 12c/11g/10g, IBM DB2, Teradata R15/R14/R13, MS SQL Server, MS Access, Netezza

Languages: SQL, PL/SQL, T-SQL, HTML, Java, Visual Basic, C, C++

Operating Systems: Windows NT/XP/2000/7, UNIX, Linux, Sun Solaris

Other Tools: MS Office, SharePoint, Lotus Notes, Mega, Aginity, Teradata SQL Assistant

ETL Tools: Informatica PowerCenter, SSIS

PROFESSIONAL EXPERIENCE:

Confidential, Chicago, IL

Sr. Data Modeler/Data Analyst

Responsibilities:

  • Analyzed business, data and systems requirements in order to perform data modeling and design (high-level conceptual business/data models, entity-relationship and dimensional logical data models, and detailed physical database design models).
  • Built Conceptual, Logical and Physical Data Models for OLTP systems and published the LDM, PDM and Data Dictionary at the end of each sprint.
  • Responsible for the consistency of data design across projects by adhering to standards, following roadmaps and implementing strategic initiatives.
  • Designed the logical data model consisting of entities, attributes and relationships between entities, and documented the model in CA Erwin.
  • Managed all indexing, debugging and query optimization techniques for performance tuning using T-SQL.
  • Developed complex Teradata SQL code in BTEQ scripts using OLAP and aggregate functions, to name a few; a BTEQ sketch follows this list.
  • Worked in an Agile Data Modeling methodology, creating data models in sprints within an SOA architecture.
  • Involved in data governance processes; authored data transformation scripts, performed data cleansing actions, and executed data integration/consolidation processes.
  • Conducted data model reviews with developers, architects, business analysts and subject matter experts to collaborate and gain consensus.
  • Used E. F. Codd's Normalization (1NF, 2NF & 3NF) and Denormalization techniques for effective performance in OLTP and OLAP systems.
  • Designed and developed various process automation applications on the database server using Oracle products/tools, UNIX shell scripts, PL/SQL packages, procedures and database triggers based on client/server and multi-tier technologies.
  • Wrote complex SQL queries to pull the required information for business use from the database using Teradata SQL Assistant.
  • Documented the source-to-target mapping spreadsheet, which captures source and target data types and all the necessary transformation rules, which are in turn used for metadata updates.
  • Created source-to-target mapping specifications using the Informatica Data Quality tool.
  • Mapped data entities and attributes from source to target data stores and created data mapping documentation.
  • Executed SQL queries to retrieve data from databases for analysis and created Netezza SQL scripts to verify that tables loaded correctly.
  • Populated or refreshed Teradata tables using the FastLoad (FLoad) and MultiLoad (MLoad) load utilities.
  • Developed and performed standard queries to ensure data quality, identify data inconsistencies and missing data, and resolve issues as needed.
  • Cleaned up the data models in Erwin, updating metadata information where necessary.
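
As referenced above, a minimal BTEQ sketch combining an aggregate query and an OLAP (window) function; the logon details, database, table and column names are placeholders, not actual project objects.

    /* Placeholder connection details */
    .LOGON tdpid/etl_user,password;

    /* Aggregate function: monthly totals per account */
    SELECT account_id,
           EXTRACT(MONTH FROM txn_date) AS txn_month,
           SUM(txn_amount)              AS monthly_total
    FROM   edw.transactions
    GROUP  BY 1, 2;

    /* OLAP (window) function: running balance per account */
    SELECT account_id,
           txn_date,
           SUM(txn_amount) OVER (PARTITION BY account_id
                                 ORDER BY txn_date
                                 ROWS UNBOUNDED PRECEDING) AS running_balance
    FROM   edw.transactions;

    .QUIT;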

Environment: Erwin r9.6, Informatica 9.5, Oracle 12c, Teradata 15, IBM DB2, SSIS, Business Objects, SQL Server 2008/2012, SQL, PL/SQL, VBA, MS Excel, Netezza Aginity, Teradata SQL Assistant, Metadata, UNIX.

Confidential, Teaneck, NJ

Sr. Data Modeler/ Data Analyst

Responsibilities:

  • Gathered and identified business data requirements from business partners and development teams, understood the information needs, and translated those data requirements into conceptual, logical and physical database models.
  • Created Conceptual, Logical and Physical data models as per enterprise standards.
  • Designed the Logical Model into a Dimensional Model using Star Schema and Snowflake Schema in Erwin to build the data warehouse.
  • Wrote, tested and implemented Teradata FastLoad, MultiLoad and BTEQ scripts, DML and DDL.
  • Interpreted stakeholder functional and information needs and created high level business information models and/or conceptual data models understandable by the business and application owners.
  • Reverse engineered the existing data marts and identified the Data Elements (in the source systems), Dimensions, Facts and Measures required for reports.
  • Cleansed unnecessary tables and columns and redefined various attributes and relationships in the Reverse Engineering Model.
  • Provided centralized direction for data catalogs, metadata repositories, data definitions, and data relationships
  • Created data models and analytical systems for OLAP and assisted in data extraction, transformation and loading process (ETL)
  • Worked on BTEQ scripting and as part of it built complex SQLs to map the data as per the requirements.
  • Involved in data governance activities such as setting standards for data definition, class words.
  • Created and maintained the Logical Data Model (LDM) for the project, including documentation of all entities, attributes, data relationships, primary and foreign key structures, allowed values, codes, business rules, glossary terms, etc.
  • Responsible for identifying and fixing bottlenecks through performance tuning on the Netezza database.
  • Mapped the business requirements and new databases to the logical data model, which defines the project delivery needs.
  • Generated DDL statements and scripts from the Physical Data Model to create objects like tables, views, indexes, stored procedures and packages.
  • Built transformations, including aggregation and summation constructs, from source to data warehouse.
  • Adjusted and maintained SQL scripts and performed further data analysis and data aggregation.
  • Profiled data to determine primary index, secondary index and unique index candidates; a profiling sketch follows this list.
  • Maintained and enforced data architecture/administration standards, as well as standardization of column name abbreviations, domains and attributes.
  • Created and maintained enterprise metadata such as the data dictionary and data definitions.
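
As noted in the profiling bullet above, a sketch of the kind of Teradata SQL used to evaluate primary index candidates; the staging table and columns are hypothetical.

    /* Uniqueness: a good primary index candidate has a distinct count close to the row count */
    SELECT COUNT(*)                  AS total_rows,
           COUNT(DISTINCT claim_id)  AS distinct_claim_ids,
           COUNT(DISTINCT member_id) AS distinct_member_ids
    FROM   stg.claims;

    /* Skew: row distribution per AMP for one candidate column */
    SELECT HASHAMP(HASHBUCKET(HASHROW(member_id))) AS amp_no,
           COUNT(*)                                AS row_cnt
    FROM   stg.claims
    GROUP  BY 1
    ORDER  BY 2 DESC;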

Environment: Erwin r9.5, Oracle 11g, Teradata 14, DB2, SSIS, Business Objects, SQL Server 2005/2008, MS Excel, Teradata SQL Assistant, Aginity, UNIX, Informatica.

Confidential - New York, NY

Sr. Data Modeler/ Data Analyst

Responsibilities:

  • Conducted data modeling exercises in support of subject areas and/or specific client needs for data, reports or analyses, with a concern for reuse of existing data elements and alignment with existing data assets and the target enterprise data architecture.
  • Developed conceptual, logical and physical Enterprise Data Models based on industry standards.
  • Involved in preparing logical data models and conducted controlled brainstorming sessions with project focus groups.
  • Used Model Mart of Erwin for effective model management of sharing, dividing and reusing model information and design for productivity improvement.
  • Developed processes on both Teradata and Oracle using shell scripting and RDBMS utilities such as MultiLoad, FastLoad, FastExport and BTEQ (Teradata), and SQL*Plus and SQL*Loader (Oracle).
  • Translated the business requirements into workable functional and non-functional requirements at a detailed production level using Workflow Diagrams, Sequence Diagrams, Activity Diagrams and Use Case Modeling.
  • Developed Logical Dimensional Models and processed the key facts and dimensions required for the business support.
  • Applied hot fixes and version patches for Netezza, Oracle, SQL Server and Informatica in Windows environments.
  • Designed the Logical Model into Dimensional Model using Star Schema and Snowflake Schema.
  • Rationalized the relationships expressed in the logical data model to the schema of the physical data store (that is, database tables and columns).
  • Created and maintained Logical Data Model (LDM) for the project. Includes documentation of all entities, attributes, data relationships, primary and foreign key structures, allowed values, codes, business rules, glossary terms, etc.
  • Generated DDL Scripts from Physical Data Model using technique of Forward Engineering in Erwin.
  • Developed complex SQL queries and performed execution validation for remediation and analysis.
  • Identified objects and their relationships from the existing database to be used as a reference for the data mart, then transformed those objects into a physical model using Reverse Engineering in Erwin.
  • Migrated the source code from Oracle into the Netezza database.
  • Performed the data modeling effort for the gaps identified during data mapping.
  • Developed SQL queries on Teradata in order to get data from different tables using joins and database links; a join sketch follows this list.
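
A sketch of the multi-table join queries referenced above; the schema, tables, columns and date range are hypothetical.

    SELECT    c.customer_id,
              c.customer_name,
              o.order_id,
              o.order_date,
              SUM(oi.line_amount) AS order_total
    FROM      edw.customer c
    JOIN      edw.orders o
              ON o.customer_id = c.customer_id
    LEFT JOIN edw.order_item oi
              ON oi.order_id = o.order_id
    WHERE     o.order_date BETWEEN DATE '2015-01-01' AND DATE '2015-12-31'
    GROUP BY  c.customer_id, c.customer_name, o.order_id, o.order_date;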

Environment: Windows 7, Microsoft Office SharePoint 2007, Cognos, Rational Requisite Pro, MS Office (Word, Excel and PowerPoint), Teradata, MS Project, MS FrontPage 2003, MS Access, CSV files, EDI, Documentum 2.0, UML, Java, Erwin, MS Visio, Oracle 11g, Toad, Oracle Designer, SQL Server 2008, Oracle SQL Developer 2008, MicroStrategy 9.2, Tableau Report Builder.

Confidential - Miami, FL

Data Modeler/Data Analyst

Responsibilities:

  • Worked with Business Architects and System Analysts to gather business data elements from business requirements and translate them into a Logical Data Model.
  • Defined, developed and delivered consistent information and data standards, methodologies, guidelines, best practice and approved modeling techniques around data quality, data governance and data security.
  • Partnered with subject matter experts, architects and developers to capture and analyze business needs to complete all data modeling related artifacts including Conceptual, Logical and Physical Data Models.
  • Identified objects and relationships and how they fit together as logical entities; these were then translated into a physical design using forward engineering in the E/R Studio tool.
  • Performed database tuning and optimized complex SQL queries using Teradata EXPLAIN, statistics and indexes; a tuning sketch follows this list.
  • Worked extensively on SQL querying using joins, aliases, functions, triggers and indexes.
  • Worked on Informatica Designer transformations like Source Qualifier, Aggregator, Lookup, Expression, Normalizer, Filter, Router, Rank, Sequence Generator, Update Strategy and Joiner.
  • Used Reverse Engineering and Forward Engineering techniques on databases from DDL scripts and created tables and models in data mart, data warehouse and staging.
  • Cleansed unnecessary tables and columns and redefined various attributes and relationships in the Reverse Engineering Model.
  • Created Conceptual Data Model, Data Flow Diagram, Data Topology, Logical Data Model and Physical Data Model.
  • Developed scripts for loading the data into the base tables in EDW using FastLoad, MultiLoad and BTEQ utilities of Teradata.
  • Created and maintained metadata such as Data Elements dictionary for the entire Development Center Data Mart.
  • Conducted peer reviews of completed data models and plans to ensure quality and integrity from data capture through usage and archiving.
  • Used Mega Tool as well as E/R Studio to create Conceptual Data Model.
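
A minimal sketch of the Teradata tuning workflow referenced above (EXPLAIN, statistics and a secondary index); the object names are hypothetical.

    /* Review the optimizer plan for a slow query */
    EXPLAIN
    SELECT p.policy_id,
           SUM(c.paid_amount) AS total_paid
    FROM   edw.policy p
    JOIN   edw.claim  c ON c.policy_id = p.policy_id
    GROUP  BY p.policy_id;

    /* Refresh the statistics the optimizer relies on for the join column */
    COLLECT STATISTICS ON edw.claim COLUMN (policy_id);

    /* Add a non-unique secondary index to support a frequent filter predicate */
    CREATE INDEX (claim_status) ON edw.claim;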

Environment: Oracle 10g, Teradata 14, DB2, SSIS, Business Objects, SQL Server 2005/2008, ER/Studio, Windows XP, MS Excel, Netezza, SQL, PL/SQL, Teradata SQL Assistant.

Confidential

Data Analyst/Data Modeler

Responsibilities:

  • Generated reports using SQL from the Oracle database, which were used for comparison with the legacy system, and created temporary tables to store the data from the legacy system.
  • Identified the objects and relationships between the objects to develop a logical model and translated the model into a physical model using Forward Engineering in Erwin.
  • Built transformations, including aggregation and summation constructs, from the operational data source to the data warehouse.
  • Developed Data Mapping, Data Governance, and Transformation and Cleansing rules for the Master Data Management.
  • Normalized the database up to 3NF to put it into the star schema of the data warehouse.
  • Used Teradata utilities (BTEQ, MultiLoad and FastLoad) to maintain the database.
  • Developed and monitored the workflows and was responsible for performance tuning of the staging and 3NF workflows.
  • Designed Data Marts using dimensional modeling and wrote ETL packages in Oracle PL/SQL to extract data from the relational data store, transform it and load it into the data mart.
  • Implemented Referential Integrity using primary key and foreign key relationships.
  • Identified and tracked slowly changing dimensions and determined the hierarchies in dimensions; an SCD sketch follows this list.
  • Translated business concepts into XML vocabularies by designing XML Schemas with UML.
  • Created entity-relationship diagrams, functional decomposition diagrams and data flow diagrams.
  • Implemented Forward engineering to create tables, views and SQL scripts and mapping documents.
  • Denormalized the database to put it into the star schema of the data warehouse.
  • Understood the existing data model and documented suspected design issues affecting the performance of the system.
  • Conducted logical data model walkthroughs and validation.
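
A minimal Type 2 slowly changing dimension sketch of the pattern referenced above; the dimension, staging table and tracked attribute are hypothetical, and the surrogate key is assumed to be generated by the database (e.g., an identity column), so it is omitted from the insert.

    /* Expire the current row when a tracked attribute changes */
    UPDATE dim_customer
    SET    effective_end_date = CURRENT_DATE,
           current_flag       = 'N'
    WHERE  current_flag = 'Y'
    AND    customer_id IN (SELECT s.customer_id
                           FROM   stg_customer s,
                                  dim_customer d
                           WHERE  d.customer_id      = s.customer_id
                           AND    d.current_flag     = 'Y'
                           AND    d.customer_segment <> s.customer_segment);

    /* Insert new and changed customers as the current rows */
    INSERT INTO dim_customer
           (customer_id, customer_segment, effective_start_date,
            effective_end_date, current_flag)
    SELECT s.customer_id, s.customer_segment, CURRENT_DATE,
           DATE '9999-12-31', 'Y'
    FROM   stg_customer s
    WHERE  NOT EXISTS (SELECT 1
                       FROM   dim_customer d
                       WHERE  d.customer_id  = s.customer_id
                       AND    d.current_flag = 'Y');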

Environment: Erwin r8.2, Oracle SQL Developer, Oracle Data Modeler, Teradata, SSIS, Business Objects, SQL Server 2005/2008, Windows XP, MS Excel.
