
Senior Data Modeler Resume


Franklin Lakes, NJ

SUMMARY

  • 10+ years of extensive Information Technology experience in all phases of the Software Development Life Cycle, including System Analysis, Design, Data Modeling, Dimensional Modeling, Implementation and Support of various applications in OLTP and Data Warehousing.
  • Data Modeler with strong Conceptual, Logical and Physical Data Modeling skills, Data Profiling skills and experience maintaining Data Quality; experienced with JAD sessions for requirements gathering, creating data mapping documents, and writing functional specifications and queries.
  • Extensive experience in Relational and Dimensional Data modeling for creating Logical and Physical Design of Database and ER Diagrams using multiple data modeling tools like ERWIN, ER Studio and Power Designer.
  • Experience in modeling DW components: staging area, normalized layer, and reporting DB (Star Schema / data mart).
  • Experience with Mainframe systems (COBOL, JCL, CICS, VSAM, DB2, IMS, IDMS) as well as conversion of Mainframe data to ETL Staging tables.
  • Experience with ERP Systems (SAP, Oracle EBS) on different functional modules like Finance and Controlling, Sales and Distribution, Demand Planning etc.
  • Experience with different Data Warehouse architectures like Kimball, Inmon, Hub and Spoke architectures etc.
  • Expertise in ETL tools such as Informatica (PowerCenter 9.1/8.6/8.1/6.2/1.7) for building Data Marts.
  • Knowledge of Big Data, Hadoop, HDFS, MapReduce and Splunk.
  • Experienced in SQL Server Performance Tuning and Query Optimization; checked the physical and logical consistency of databases using DBCC Utilities.
  • Worked effectively with SQL Profiler, the Index Tuning Wizard and estimated query plans to tune the performance of SQL Queries and Stored Procedures.
  • Worked extensively on forward and reverse engineering processes. Created DDL scripts for implementing Data Modeling changes. Created ERWIN reports in HTML, RTF, PDF format depending upon the requirement, published Data model in model mart, created naming convention files, coordinated with DBAs to apply the data model changes.
  • Experience in modeling a source system into the DW and participating in the generation of logical and physical models.
  • Extensive experience in advanced SQL Queries and PL/SQL stored procedures.
  • Extensive experience in writing functional specifications, translating business requirements to technical specifications, and creating/maintaining/modifying database design documents with detailed descriptions of logical entities and physical tables.
  • Experience with Agile data modeling development and Scrum Methodologies.
  • Possess strong documentation and knowledge-sharing skills; conducted data modeling review sessions for different user groups and participated in requirement sessions to assess requirement feasibility.
  • Extensive Experience working with business users/SMEs as well as senior management.
  • Strong understanding of the principles of Data warehousing, Fact Tables, Dimension Tables, star and snowflake schema modeling.
  • Experience in backend programming including schema and table design, stored procedures, Triggers, Views, and Indexes.
  • Excellent analytical, inter - personal and communication skills with a strong technical background.
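The star and snowflake schema modeling experience described above can be illustrated with a minimal sketch. This uses Python's sqlite3 purely for demonstration; all table and column names are hypothetical, not drawn from any actual engagement:

```python
import sqlite3

# Minimal star schema: one fact table referencing two dimension tables.
# Hypothetical names chosen for illustration only.
conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE dim_customer (
    customer_key  INTEGER PRIMARY KEY,   -- surrogate key
    customer_name TEXT NOT NULL
);
CREATE TABLE dim_date (
    date_key      INTEGER PRIMARY KEY,
    calendar_date TEXT NOT NULL
);
CREATE TABLE fact_sales (
    sales_key    INTEGER PRIMARY KEY,
    customer_key INTEGER NOT NULL REFERENCES dim_customer(customer_key),
    date_key     INTEGER NOT NULL REFERENCES dim_date(date_key),
    sales_amount REAL NOT NULL
);
""")
tables = [r[0] for r in conn.execute(
    "SELECT name FROM sqlite_master WHERE type='table' ORDER BY name")]
print(tables)  # ['dim_customer', 'dim_date', 'fact_sales']
```

The same DDL shape, with vendor-specific types and partitioning clauses, is what a modeling tool such as Erwin or ER Studio would forward-engineer from the physical model.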

TECHNICAL SKILLS

Data Modeling: Dimensional Data Modeling, Relational Data Modeling, Star and Snowflake Schema, Fact and Dimension Tables, Conceptual, Logical and Physical Data Modeling, ER Studio 7.1.1, UML, Rational Toolkit, ERwin 4.0/3.5.2/3.x, Power Designer.

Data Warehousing: Informatica PowerCenter 8.1/8.0/7.1/7.0/6.2/6.1/5.2, Informatica PowerMart 4.7, PowerConnect, Power Exchange, Data Explorer, Data Profiling, Data Cleansing, OLAP, OLTP, SQL*Plus, PL/SQL, SSIS, DataStage 8.5/8.1.0/7.0/6.x/5.x (Designer, Manager, Director).

Reporting: Crystal Reports, Business Objects, Micro Strategy, SSRS, OBIEE, Cognos.

Databases: Oracle 11g/10g/9i/8i/8.0/7.0, DB2 8.0/7.0, MS SQL 7.0, SQL Server 2000/2005/2008/2012, MS Access 7.0/97/2000, Teradata, Quest Central for DB2.

Environment: UNIX, LINUX, Windows 98/2000/XP/Vista.

Business Domains: Banking - Mortgage Banking. Financial services - Brokerage, Trading, Portfolio management. Insurance - Property and casualty Insurance.

PROFESSIONAL EXPERIENCE

Confidential, Franklin Lakes, NJ

Senior Data Modeler

Responsibilities:

  • Worked with Business Analysts in gathering reporting requirements representing Data Design team.
  • Designed Teradata Data Warehouse tables as per the Business Objects reporting requirements.
  • Modeled the tables as per Corporate standards using Erwin, generated DDL and coordinated with the DBA on the creation of tables.
  • Prepared Report Mapping documents and Source to Target Mapping (STM) documents for loading the data to Target tables.
  • Developed ETL load SQL queries using complex joins involving tables that span the entire data model of the Teradata Data Warehouse.
  • Developed PL/SQL stored procedures to perform validations and implement transformations.
  • Used Model Mart as the repository for data models.
  • Developed SQL stored procedures as per the business requirements of the ETL load processes.
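A complex-join ETL load of the kind described above can be sketched as follows. This is a simplified illustration in Python's sqlite3, with hypothetical staging and target table names; a real Teradata load would use the same INSERT…SELECT pattern with many more joined tables:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE stg_orders    (order_id INTEGER, cust_id INTEGER, amount REAL);
CREATE TABLE stg_customers (cust_id INTEGER, region TEXT);
CREATE TABLE tgt_sales     (order_id INTEGER, region TEXT, amount REAL);
INSERT INTO stg_orders    VALUES (1, 10, 99.0), (2, 20, 50.0);
INSERT INTO stg_customers VALUES (10, 'NE'), (20, 'SW');
""")
# Load the target table by joining the staging tables (INSERT ... SELECT)
conn.execute("""
    INSERT INTO tgt_sales (order_id, region, amount)
    SELECT o.order_id, c.region, o.amount
    FROM stg_orders o
    JOIN stg_customers c ON c.cust_id = o.cust_id
""")
rows = conn.execute(
    "SELECT order_id, region, amount FROM tgt_sales ORDER BY order_id").fetchall()
print(rows)  # [(1, 'NE', 99.0), (2, 'SW', 50.0)]
```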

Environment: Erwin Data Modeler r7.3, Teradata SQL Assistant 7.1, Oracle E-Business Suite, Teradata, Business Objects, Oracle SQL Developer, Microsoft SQL Server 2008 R2.

Confidential, Somers, NY

Senior Data Modeler

Responsibilities:

  • Participated in requirements sessions with IT Business Analysts, SMEs and business users to understand and document the business requirements as well as the goals of the project.
  • Worked on building the data model using ER Studio as per the requirements, discussion and approval of the model from the BA.
  • Analyzed the source systems (Mainframe COBOL copybooks, DB2 and Oracle Databases) to understand the source data relationships along with deeper understanding of business rules and data integration checks.
  • Converted an Order Management system on the Mainframe to a web-based system.
  • Worked on designing Data Warehouse tables to load the extracts from SAP BW.
  • Extensively worked on analyzing SAP tables and custom fields added to the SAP tables.
  • Worked on creating Complex SQL queries for loading the required data from Source tables.
  • Extensively worked on preparation of data mapping documents from Mainframe source data of Copy books, VSAM files to ETL staging tables.
  • Created DDL and DML scripts as per the standards.
  • Worked on performance tuning of the SQL queries and Stored procedures.
  • Worked on fixing data quality issues like missing data, duplicate data and incorrect data entries.
  • Developed UNIX scripts required for the DB2 Data Warehouse deployed on UNIX platform.
  • Created data flow diagrams as per the requirements of the projects.
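The data quality fixes mentioned above (duplicates, missing values) typically reduce to a small set of SQL patterns. A minimal sketch in Python's sqlite3, with hypothetical staging-table and column names:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE stg_contacts (rowid_src INTEGER PRIMARY KEY, email TEXT, name TEXT);
INSERT INTO stg_contacts VALUES
    (1, 'a@x.com', 'Ann'),
    (2, 'a@x.com', 'Ann'),   -- duplicate row
    (3, NULL,      'Bob'),   -- missing email
    (4, 'c@x.com', 'Cal');
""")
# De-duplicate: keep only the earliest row per email value
conn.execute("""
    DELETE FROM stg_contacts
    WHERE rowid_src NOT IN (
        SELECT MIN(rowid_src) FROM stg_contacts GROUP BY email
    )
""")
# Profile what remains: total rows and rows still missing an email
missing = conn.execute(
    "SELECT COUNT(*) FROM stg_contacts WHERE email IS NULL").fetchone()[0]
total = conn.execute("SELECT COUNT(*) FROM stg_contacts").fetchone()[0]
print(total, missing)  # 3 1
```

Rows flagged as missing data would then be routed to a reject/repair process rather than silently loaded.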

Environment: ER Studio Data Architect 9.0, DB2, Oracle 11g, UNIX, TOAD, SQL Navigator, SAP ECC 6.0, SAP BI 7.0, Informatica V9.5

Confidential, Newark, DE

Senior Data Modeler

Responsibilities:

  • Participated in requirement gathering sessions with IT Business Analysts, SMEs, business users and Sprint Teams to understand and document the business requirements as well as the goals of the project.
  • Worked on enhancements to the Data Warehouse model using Erwin as per the Business reporting requirements.
  • Analyzed the source system (Progress Databases) to understand the source data relationships along with deeper understanding of business rules and data integration checks.
  • Implemented the Slowly changing dimension scheme (Type II) for most of the dimensions.
  • Reviewed the logical model with Business users, the ETL Team, DBAs and the testing team to provide information about the data model and business requirements.
  • Experience working on Dodd-Frank regulations in Mortgage Banking domain.
  • Worked on Agile development methodologies like Scrum, coordinating with different Sprint teams.
  • Worked on the preparation of XML file mappings to Database tables as per the application development requirements.
  • Developed SQL Stored Procedures and Autosys scheduler jobs to load data into the database.
  • Created the LDM and PDM for the database using Erwin, coordinating with the business to document proper metadata, definitions and UDPs as per the corporate standards.
  • Extensively worked on SQL in analyzing the database and querying the database as per the business scenarios.
  • Used the Model Manager option in Erwin to synchronize the data models in a Model Mart approach.
  • Integrated Data Quality routines in the Informatica mappings to standardize and cleanse Account and Contact data.
  • Extensively worked on Informatica Data Explorer, creating data profiles with comparative profiling analysis and verifying the accuracy of mapping logic.
  • Worked with the DBA to create the physical model and tables. Scheduled multiple brainstorming sessions with DBAs and the production support team to discuss views, partitioning and indexing schemes, case by case, for the facts and dimensions.
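The Slowly Changing Dimension Type II scheme used here can be sketched in a few lines: when a tracked attribute changes, the current dimension row is closed out and a new current row is inserted, preserving history. This sketch uses Python's sqlite3 with hypothetical table and column names:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("""
CREATE TABLE dim_account (
    account_key INTEGER PRIMARY KEY AUTOINCREMENT,  -- surrogate key
    account_id  INTEGER,   -- natural/business key
    branch      TEXT,      -- tracked attribute
    eff_date    TEXT,
    end_date    TEXT,
    is_current  INTEGER
)""")

def scd2_upsert(conn, account_id, branch, as_of):
    """Type II: expire the current row if the attribute changed, then insert a new row."""
    cur = conn.execute(
        "SELECT account_key, branch FROM dim_account "
        "WHERE account_id=? AND is_current=1", (account_id,)).fetchone()
    if cur and cur[1] == branch:
        return  # no change, nothing to do
    if cur:
        conn.execute(
            "UPDATE dim_account SET end_date=?, is_current=0 WHERE account_key=?",
            (as_of, cur[0]))
    conn.execute(
        "INSERT INTO dim_account (account_id, branch, eff_date, end_date, is_current) "
        "VALUES (?, ?, ?, '9999-12-31', 1)", (account_id, branch, as_of))

scd2_upsert(conn, 42, 'Newark', '2013-01-01')
scd2_upsert(conn, 42, 'Wilmington', '2013-06-01')  # branch change: history kept
rows = conn.execute(
    "SELECT branch, is_current FROM dim_account "
    "WHERE account_id=42 ORDER BY account_key").fetchall()
print(rows)  # [('Newark', 0), ('Wilmington', 1)]
```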

Environment: Erwin, Mortgage Express, Business Objects, Oracle SQL Developer, Microsoft SQL Server 2008 R2.

Confidential, Montvale, NJ

Data Modeler

Responsibilities:

  • Participated in requirement gathering session with business users and sponsors to understand and document the business requirements as well as the goals of the project.
  • Created and reviewed the conceptual model for the EDW (Enterprise Data Warehouse) with business users.
  • Analyzed the source system (JD Edwards) to understand the source data and JDE table structure along with deeper understanding of business rules and data integration checks.
  • Identified various facts and dimensions from the source system and business requirements to be used for the data warehouse.
  • Created the dimensional logical model with approximately 10 facts, 30 dimensions with 500 attributes using ER Studio.
  • Implemented the Slowly changing dimension scheme (Type II) for most of the dimensions.
  • Reviewed the logical model with Business users, the ETL Team, DBAs and the testing team to provide information about the data model and business requirements.
  • Created the DDL scripts using ER Studio and source to target mappings (S2T- for ETL) to bring the data from JDE to the warehouse.
  • Worked with the DBA to create the physical model and tables. Scheduled multiple brainstorming sessions with DBAs and the production support team to discuss views, partitioning and indexing schemes, case by case, for the facts and dimensions.
  • Worked in Mercury Quality Center to track defects logged against the logical and physical models.
  • Worked as an onsite project coordinator once the design of the database was finalized in order to implement the data warehouse according to the implementation standards.
  • Worked with the client and offshore team to make sure that the reports and dashboards were delivered on time.
  • Participated in UAT sessions to educate the business users about the reports, dashboards and the BI System. Worked with the test team to provide insights into the data scenarios and test cases.
  • Worked on data profiling and data validation to ensure the accuracy of the data between the warehouse and source systems.
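The data validation between warehouse and source systems mentioned above often starts with simple reconciliation checks: row counts and control totals must match across layers. A minimal sketch in Python's sqlite3, with hypothetical source and fact table names:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE src_orders     (order_id INTEGER, amount REAL);
CREATE TABLE dw_fact_orders (order_id INTEGER, amount REAL);
INSERT INTO src_orders     VALUES (1, 10.0), (2, 20.0), (3, 30.0);
INSERT INTO dw_fact_orders VALUES (1, 10.0), (2, 20.0), (3, 30.0);
""")

def reconcile(conn, src, tgt):
    """Compare row counts and amount control totals between two tables."""
    s_cnt, s_sum = conn.execute(
        f"SELECT COUNT(*), SUM(amount) FROM {src}").fetchone()
    t_cnt, t_sum = conn.execute(
        f"SELECT COUNT(*), SUM(amount) FROM {tgt}").fetchone()
    return s_cnt == t_cnt and s_sum == t_sum

ok = reconcile(conn, "src_orders", "dw_fact_orders")
print(ok)  # True
```

A mismatch would trigger deeper profiling, e.g. a key-level EXCEPT/MINUS comparison to locate the missing or extra rows.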

Environment: ER Studio 8.0.3, Microsoft SQL Server 2008, Microsoft SQL Management Studio, Microsoft SQL 2008 Integration Services, Teradata, Microsoft SQL 2008 Reporting Services, Microsoft SQL 2008 Analysis Services, Mercury Quality Center 9.

Confidential, Pennington, NJ

Data Modeler/Data Analyst

Responsibilities:

  • Participated in JAD session with business users and sponsors to understand and document the business requirements in alignment to the financial goals of the company.
  • Created the conceptual model for the data warehouse with emphasis on mutual funds and annuities using the Embarcadero ER Studio data modeling tool.
  • Reviewed the conceptual EDW (Enterprise Data Warehouse) data model with business users and Information architects to make sure all the requirements are fully covered.
  • Analyzed a large number of COBOL copybooks from multiple mainframe sources (16) to understand existing constraints, relationships and business rules in the legacy data.
  • Worked on rationalizing the requirements across multiple product lines.
  • Reviewed and implemented the naming standards for the entities, attributes, alternate keys, and primary keys for the logical model.
  • Created the logical model for the EDW with approximately 75 entities and 1000 attributes using ER Studio. The logical model was fully attributed to third normal form and contains both current and history tables. The data model is divided into a number of submodels for ease of understanding and comprehension.
  • Reviewed the logical model with application developers, ETL Team, DBAs and testing team to provide information about the data model and business requirements.
  • Worked with ETL to create source to target mappings (S2T).
  • Worked with DBA to create the physical model and tables. Worked on Informatica Data Quality tool for data cleansing, conformance and freshness of data.
  • Held brainstorming sessions with application developers and DBAs to discuss various de-normalization, partitioning and indexing schemes for the physical model.
  • Worked on Requirements Traceability Matrix to trace the business requirements back to logical model.

Environment: ER Studio 7.1.1, Quest Central for DB2 v4.8, COBOL copybooks, Mainframe DB2, Mercury Quality Center 9, Informatica PowerCenter 8.1

Confidential, Winston-Salem, NC

Data Modeler

Responsibilities:

  • Analyzed existing logical data model (LDM) and made appropriate changes to make it compatible with business requirements.
  • Expanded Physical Data Model (PDM) for the OLTP application using Erwin.
  • Involved in data model reviews with internal data architect, business analysts, and business users with explanation of the data model to make sure it is in-line with business requirements.
  • Created rationalized domains to bring consistency in the tables.
  • Identified source systems, their connectivity, and related tables and fields, and ensured data suitability for mapping.
  • Worked with cross-functional teams and prepared detailed design documents for production phase of current customer database application.
  • Involved in exhaustive documentation for technical phase of the project and training materials for all data management functions.
  • Used a Reverse Engineering approach to redefine entities, relationships and attributes in the data model as per new specifications in Erwin after analyzing the database systems currently in use.
  • Enforced referential integrity in the OLTP data model for consistent relationship between tables and efficient database design.
  • Developed the required data warehouse model using Star schema for the generalized model.
  • Used forward engineering approach for designing and creating databases for OLAP model.
  • Conducted design walk through sessions with Business Intelligence team to ensure that reporting requirements are met for the business.
  • Developed Data Mapping, Data Governance, Transformation and Cleansing rules for the Master Data Management Architecture involving OLTP, ODS and OLAP.
  • Collaborated with ETL, BI and DBA teams to analyze and provide solutions to data issues and other challenges while implementing the OLAP model.
  • Developed and maintained a data dictionary to create metadata reports for technical and business purposes.
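The referential integrity enforcement described above amounts to declaring foreign keys so the database rejects orphan child rows. A minimal sketch in Python's sqlite3 (note that SQLite requires a PRAGMA to enforce foreign keys; table names are hypothetical):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("PRAGMA foreign_keys = ON")  # SQLite-specific: enable FK enforcement
conn.executescript("""
CREATE TABLE customer (customer_id INTEGER PRIMARY KEY, name TEXT);
CREATE TABLE orders (
    order_id    INTEGER PRIMARY KEY,
    customer_id INTEGER NOT NULL REFERENCES customer(customer_id)
);
INSERT INTO customer VALUES (1, 'Acme');
""")
conn.execute("INSERT INTO orders VALUES (100, 1)")  # valid parent: accepted
rejected = False
try:
    conn.execute("INSERT INTO orders VALUES (101, 99)")  # no such customer
except sqlite3.IntegrityError:
    rejected = True  # orphan row rejected by the FK constraint
print(rejected)  # True
```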

Environment: Erwin 4.0, Toad, PL/SQL, Oracle 9i, SQL Server 2000, SQL*Loader, UNIX, Windows 2005

Confidential

COBOL/ETL Developer

Responsibilities:

  • Worked on identifying facts, dimensions and various other dimensional modeling concepts used in data warehousing projects.
  • Worked on Normalization and De-normalization techniques.
  • Expertise in developing report programs using COBOL.
  • Worked on mainframe applications involving COBOL, JCL and DB2. Defined relationships and cardinalities among entities.
  • Developed PL/SQL stored procedures to perform validations.
  • Created and maintained Database Objects (Tables, Views, Indexes, Partitions, Database Triggers, etc.).
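The database objects listed above (views, triggers) can be illustrated together in a short sketch. This uses Python's sqlite3 with hypothetical names: a trigger writes an audit row on every balance update, and a view exposes overdrawn accounts:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE account   (acct_id INTEGER PRIMARY KEY, balance REAL);
CREATE TABLE audit_log (acct_id INTEGER, old_balance REAL, new_balance REAL);

-- Trigger: record before/after balances on every update
CREATE TRIGGER trg_balance_audit
AFTER UPDATE OF balance ON account
BEGIN
    INSERT INTO audit_log VALUES (OLD.acct_id, OLD.balance, NEW.balance);
END;

-- View: accounts currently below zero
CREATE VIEW v_overdrawn AS
    SELECT acct_id FROM account WHERE balance < 0;

INSERT INTO account VALUES (1, 100.0);
""")
conn.execute("UPDATE account SET balance = -25.0 WHERE acct_id = 1")
audit = conn.execute("SELECT * FROM audit_log").fetchall()
over = conn.execute("SELECT acct_id FROM v_overdrawn").fetchall()
print(audit)  # [(1, 100.0, -25.0)]
print(over)   # [(1,)]
```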
