Data Modeler Resume Profile
Summary:
- Over 8 years of Information Technology experience across all phases of the software development life cycle, including system analysis, design, data modeling, dimensional modeling, implementation, and support of applications in OLTP and data warehousing environments.
- Data Modeler with strong conceptual, logical, and physical data modeling skills and data profiling skills; experienced in maintaining data quality, running JAD sessions for requirements gathering, creating data mapping documents, and writing functional specifications and queries.
- Experience with mainframe systems (COBOL, JCL, CICS, VSAM, DB2, IMS, IDMS) and with converting mainframe data into ETL staging tables.
- Expertise in gathering and analyzing information requirements, data analysis, data architecture, business E-R modeling, dimensional modeling, and ETL design.
- Experience in Informatica PowerCenter 9.x/8.x/7.x with Oracle 11g/10g/9i, SSIS, and SQL Server data warehouses in Microsoft DW/BI environments.
- Experience implementing data warehousing solutions involving dimensional modeling, snowflake schemas, and 3NF design.
- Experience integrating various data sources with multiple relational databases such as SQL Server, Teradata, Oracle, and DB2.
- Experience across the software development life cycle, including analysis, design, development, and testing, working with software developers to solidify client requirements.
- Experience in Oracle RDBMS development (PL/SQL, SQL, stored procedures, functions, packages, triggers) on databases running to terabytes in volume.
- Hands-on experience migrating database applications from legacy to newer technologies, including data movement and data mapping.
- Good experience in data transformation, source-to-target data mapping across database schemas, and data cleansing.
- Used the Teradata FastExport utility to export large volumes of data from Teradata tables and views for processing and reporting needs (a hedged example script follows this Summary).
- Experience modeling both OLTP and OLAP systems in Kimball and Inmon data warehousing environments.
- Expertise in Extract, Transform, and Load (ETL) of data from spreadsheets, database tables, and other sources using Microsoft Data Transformation Services (DTS) and Informatica.
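For reference, a minimal FastExport job of the kind described above might look like the sketch below. The log table, database, output path, and column names are hypothetical placeholders, not taken from any engagement.

    .LOGTABLE sandbox_db.fexp_claims_log;        /* restart log table (hypothetical name) */
    .LOGON tdpid/etl_user,password;              /* placeholder credentials */
    .BEGIN EXPORT SESSIONS 8;                    /* parallel export sessions */
    .EXPORT OUTFILE /data/extracts/claims_daily.dat
            MODE RECORD FORMAT TEXT;             /* flat file for downstream processing/reporting */
    SELECT  TRIM(claim_id)     || '|' ||
            TRIM(claim_status) || '|' ||
            CAST(claim_amt AS VARCHAR(20))
    FROM    edw_db.claim_fact
    WHERE   load_dt = CURRENT_DATE;              /* daily incremental slice */
    .END EXPORT;
    .LOGOFF;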
Technical Skills:
- Data Modeling Tools: Erwin r9, Erwin r8, Erwin r7.1/7.2, Rational Rose 2000, ER Studio and Oracle Designer
- OLAP Tools: Microsoft Analysis Services, Business Objects, and Crystal Reports 9
- ETL Tools: Microsoft DTS/SSIS, SSRS and Informatica 7.1.3
- Programming Languages: SQL, T-SQL, Base SAS and SAS/SQL, HTML, XML, VB.NET
- Database Tools: Microsoft SQL Server 2000/2008, Teradata, Oracle 10g/9i, and MS Access
- Packages: Microsoft Office 2010, Microsoft Project 2010, SAP, Microsoft Visio, and SharePoint Portal Server 2003/2007.
- Operating Systems: Microsoft Windows 9x/NT/2000/XP/Vista/7 and UNIX
- Quality Assurance Tools: WinRunner, LoadRunner, TestDirector, QuickTest Pro, Quality Center, Rational Functional Tester
Confidential
Sr. Data Modeler/Data Analyst
Responsibilities
- Developed the logical data models and physical data models that capture current state/future state data elements and data flows using ER Studio.
- Developed a Conceptual model using Erwin based on requirements analysis.
- Used Erwin to reverse engineer existing databases and the ODS, producing graphical entity-relationship representations and eliciting further information.
- Involved in Data Mapping activities for the data warehouse.
- Produced PL/SQL statements and stored procedures in DB2 for extracting as well as writing data.
- Ensured that production data was replicated into the data warehouse from the processing databases without data anomalies.
- Developed a dimensional model based on star and snowflake schemas for the data warehouse (a minimal DDL sketch follows this section).
- Created Physical Data Model from the Logical Data Model using Compare and Merge Utility in ER/Studio and worked with the naming standards utility.
- Reverse Engineered DB2 databases and then forward engineered them to Teradata using ER Studio.
- Provided source to target mappings to the ETL team to perform initial, full, and incremental loads into the target data mart.
- Responsible for migrating the data and data models from SQL server environment to Oracle 10g environment.
- Worked closely with the ETL SSIS developers to explain the complex data transformation logic.
- Worked extensively on the naming standards that were incorporated into the enterprise data modeling effort.
- Generated comprehensive analytical reports by running SQL queries against current databases to conduct data analysis.
Environment: Erwin r9, SQL Server 2008, SQL Server Analysis Services 2008, SSIS 2008, SSRS 2008, Oracle 10g, Business Objects XI, Rational Rose, DataStage, MS Office, MS Visio
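The star/snowflake dimensional model mentioned above can be illustrated with a minimal DDL sketch. The table and column names below are hypothetical placeholders in SQL Server syntax, not the engagement's actual schema.

    -- Hypothetical date dimension (surrogate key in YYYYMMDD form)
    CREATE TABLE dim_date (
        date_key        INT          NOT NULL PRIMARY KEY,
        calendar_date   DATE         NOT NULL,
        fiscal_quarter  CHAR(2)      NOT NULL
    );

    -- Hypothetical customer dimension with a generated surrogate key
    CREATE TABLE dim_customer (
        customer_key    INT IDENTITY(1,1) NOT NULL PRIMARY KEY,
        customer_nbr    VARCHAR(20)  NOT NULL,   -- natural/business key from the source system
        customer_name   VARCHAR(100) NOT NULL
    );

    -- Fact table keyed to the dimensions, one row per order line
    CREATE TABLE fact_sales (
        date_key        INT           NOT NULL REFERENCES dim_date (date_key),
        customer_key    INT           NOT NULL REFERENCES dim_customer (customer_key),
        order_line_id   BIGINT        NOT NULL,
        sales_amt       DECIMAL(18,2) NOT NULL,
        CONSTRAINT pk_fact_sales PRIMARY KEY (date_key, customer_key, order_line_id)
    );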
Confidential
Sr. Data Modeler/Data Analyst
Responsibilities
- Involved in preparing Logical Data Models/Physical Data Models.
- Identified source systems, their connectivity, and related tables and fields, and ensured the data was suitable for mapping.
- Designed the logical/physical data model for the E-Delivery system using ER/Studio.
- Created SQL*Loader scripts and performed table sizing, indexing, table partitioning, and SQL tuning.
- Created and tuned PL/SQL procedures and SQL queries for data validation in the ETL process (a hedged validation sketch follows this section).
- Created validation and system integration reports using Oracle Developer Suite 10g.
- Created DDL and DML scripts; created and worked with cross-reference (X-Ref) tables for data validation between different data marts/databases.
- Created PL/SQL procedures and triggers, generated application data, created users and privileges, and used the Oracle import/export utilities.
- Designed data models to GE-ERC standards: OLTP/ODS models normalized up to 3NF and denormalized OLAP data marts with star/snowflake schemas.
- Created more than 20 new models comprising more than 100 tables, using star and snowflake schemas for the data marts and data warehouse.
- Created DHTML pages using the PL/SQL Toolkit, packages, and dynamic SQL; used Apache.
- Provided conceptual and technical modeling assistance to developers and DBAs using Erwin and ModelMart; validated data models with IT team members and clients.
- Performed data analysis and application testing and tuning using ANALYZE and EXPLAIN PLAN.
- Extracted the source data from Oracle tables, MS SQL Server, sequential files, and Excel sheets.
- Developed mappings in Informatica to load data from various sources, including SQL Server, DB2, Oracle, and flat files, into the data warehouse, using transformations such as Source Qualifier.
- Experience in developing, deploying, and running IBM QualityStage jobs.
Environment: Oracle 8i, PL/SQL, SQL*Loader 8.1, Toad 4.0/3.0, UML, Erwin 3.5, K-Shell, PL/SQL Toolkit, Perl, JDBC
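A minimal sketch of the kind of PL/SQL validation routine described above, assuming hypothetical staging, target, and audit tables (stg_orders, dm_orders, etl_audit_log) that are placeholders rather than the project's actual objects:

    -- Compare staging and target row counts for a batch and log the result.
    CREATE OR REPLACE PROCEDURE validate_load (
        p_batch_id IN NUMBER
    ) AS
        v_src_cnt NUMBER;
        v_tgt_cnt NUMBER;
    BEGIN
        SELECT COUNT(*) INTO v_src_cnt
          FROM stg_orders              -- staging table loaded by SQL*Loader
         WHERE batch_id = p_batch_id;

        SELECT COUNT(*) INTO v_tgt_cnt
          FROM dm_orders               -- target data-mart table
         WHERE batch_id = p_batch_id;

        INSERT INTO etl_audit_log (batch_id, src_rows, tgt_rows, status, run_ts)
        VALUES (p_batch_id, v_src_cnt, v_tgt_cnt,
                CASE WHEN v_src_cnt = v_tgt_cnt THEN 'PASS' ELSE 'FAIL' END,
                SYSDATE);
        COMMIT;
    END validate_load;
    /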
Confidential
Sr. Data Modeler/Data Analyst
Responsibilities
- Interacted with business users to analyze business processes and requirements, transformed the requirements into conceptual, logical, and physical data models, designed the database, and documented and rolled out the deliverables.
- Conducted analysis and profiling of potential data sources, upon high level determination of sources during initial scoping by the project team.
- Developed logical/ physical data models using Erwin tool across the subject areas based on the specifications and established referential integrity of the system.
- Worked with the ETL team to create source-to-target mappings and performed validation of those mappings (a hedged reconciliation query follows this section).
- Analyzed large number of COBOL copybooks from multiple mainframe sources to understand existing constraints, relationships and business rules from the legacy data.
- Involved in data model reviews with the internal data architect, business analysts, and business users, with an in-depth explanation of the data model to make sure it was in line with business requirements.
- Understood basic business analysis concepts for logical data modeling, data flow processing, and database design.
- Created the physical data model (PDM) for the OLAP application using ER Studio.
- Participated in JAD sessions with business users and sponsors to understand and document the business requirements in alignment with the company's financial goals.
- Worked with DBA to create the physical model and tables.
- Developed and maintained the data dictionary to create metadata reports for technical and business purposes.
- Developed Data Mapping, Data Governance, Transformation and Cleansing rules for the Master Data Management Architecture involving OLTP, ODS and OLAP.
- Collaborated with the Reporting Team to design monthly summary-level cubes supporting the aggregated level of detailed reports; worked on snowflaking the dimensions to remove redundancy.
- Collaborated with ETL, BI and DBA teams to analyze and provide solutions to data issues and other challenges while implementing the OLAP model.
- Worked on identifying facts, dimensions and various other concepts of dimensional modeling which are used for data warehousing projects.
Environment: Erwin 4.0, Erwin 3.5.2, Toad, PL/SQL, Oracle 9i, SQL Server 2000, Windows 2005, ER Studio 7.1.1, Quest Central for DB2 v4.8, COBOL, Teradata, Microsoft SQL Server 2008 Reporting Services
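Source-to-target validation of the kind mentioned above often reduces to a reconciliation query. The sketch below assumes hypothetical source and target tables (src_orders, dw_order_fact) and a hypothetical load-date column, purely for illustration.

    -- Minimal source-to-target reconciliation: compare row counts and totals for one load date.
    SELECT 'SOURCE'        AS side,
           COUNT(*)        AS row_cnt,
           SUM(order_amt)  AS total_amt
      FROM src_orders
     WHERE load_dt = DATE '2010-06-30'
    UNION ALL
    SELECT 'TARGET',
           COUNT(*),
           SUM(order_amt)
      FROM dw_order_fact
     WHERE load_dt = DATE '2010-06-30';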
Confidential
Sr. Data Modeler/Analyst
Responsibilities
- Involved in creating Physical and Logical models using Erwin.
- Created and maintained database objects (tables, views, indexes, partitions, synonyms, database triggers, stored procedures) in the data model.
- Designed the ER diagrams, logical model relationships, cardinality, attributes, and candidate keys, and planned physical database capacity, object creation, and aggregation strategies for Oracle and Teradata per business requirements using Erwin.
- Extracted data from various sources like Oracle, Mainframes, flat files and loaded into the target Netezza database.
- Involved in extensive data validation using SQL queries and back-end testing.
- Used SQL to query the database in a UNIX environment.
- Wrote complex SQL queries to validate the data against different kinds of reports generated by Business Objects XI R2.
- Extensively used ETL methodology to support data extraction, transformation, and loading in a complex EDW using Informatica.
- Created tables and queries to produce additional ad-hoc reports.
- Responsible for analyzing various heterogeneous data sources, such as flat files, ASCII data, EBCDIC data, and relational data (Oracle, DB2 UDB, MS SQL Server).
- Performed data mining on Claims data using very complex SQL queries and discovered claims pattern.
- Used Teradata OLAP functions and clauses such as RANK, ROW_NUMBER, CSUM, QUALIFY, and SAMPLE (illustrated in the sketch after this section).
- Extracted data from various sources such as Oracle, Netezza, and flat files and loaded it into the target Netezza database.
- Designed and developed cubes using SQL Server Analysis Services (SSAS) in Microsoft Visual Studio 2008.
- Resolved the data type inconsistencies between the source systems and the target system using the Mapping Documents.
- Performed data transformation and data mapping from source to target database schemas, along with data cleansing.
- Resolved recurring issues by conducting and participating in JAD sessions with the users, modelers, and developers.
Environment: PL/SQL, Business Objects XI R2, Informatica 9.5/9.1/8.6, Oracle 11g, Teradata V2R13/R14.10, Teradata SQL Assistant 12.0, Netezza, PowerDesigner, Erwin, MDM
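The Teradata ordered-analytic usage noted above can be sketched as follows. The claims table and columns are hypothetical placeholders, and the running total uses the ANSI SUM ... OVER form (the equivalent of the legacy CSUM function).

    -- Rank each member's claims by amount and keep the three largest (hypothetical schema).
    SELECT  claim_id,
            member_id,
            claim_amt,
            RANK()       OVER (PARTITION BY member_id ORDER BY claim_amt DESC) AS amt_rank,
            ROW_NUMBER() OVER (PARTITION BY member_id ORDER BY claim_dt)       AS claim_seq,
            SUM(claim_amt) OVER (PARTITION BY member_id ORDER BY claim_dt
                                 ROWS UNBOUNDED PRECEDING)                     AS running_amt
      FROM  claims_db.claim_fact
    QUALIFY RANK() OVER (PARTITION BY member_id ORDER BY claim_amt DESC) <= 3;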
Confidential
Data Modeler/Analyst
Responsibilities
- Worked closely with business analysts/data modelers in the requirements-gathering process for various application-specific databases.
- Worked on two sources to bring in the data needed for a project's reporting by writing SQL extracts.
- Worked on building up the data model for Mobile Banking in the data warehouse.
- Worked on building up a SQL query to extract data from the warehouse for a reporting schedule (the Call Report for the Federal Government).
- Worked on designing a relational model and a dimensional model for Interactive Services such as Internet Banking, PC Banking, and Bill Pay for the bank.
- Worked on designing the Net Loss model for the bank, which involved data regarding loans and recoveries.
- Worked on building a conceptual and logical model, starting from a basic field-analysis spreadsheet of the Credit Mart, as part of one of my projects.
- Worked on the Metadata Repository (MRM) to keep the definitions and mapping rules up to standard.
- Worked on unstructured modeling for Next Best Offer (NBO), which involved modeling various marketing strategies involving emails, pictures, and marketing material.
- Fulfilled end-user requests that fell under general maintenance work, such as adding and removing columns and tables from the database, always with prior approval from the BAs and the Data Modeling Team.
- Used graphical entity-relationship diagramming to create new database designs via an easy-to-use graphical interface.
- Conducted GAP analysis and data mapping to derive requirements for existing systems enhancements for a project.
- Designed and customized data models for a data warehouse supporting data from multiple sources in real time.
- Used the Financial Services Logical Data Model (FSLDM) as a reference for my design work.
- Applied data cleansing/data scrubbing techniques to ensure consistency amongst data sets.
- Developed the logical data models and physical data models that capture current-state/future-state data elements and data flows using ER/Studio.
- Ensured that the logical model was compliant with the conceptual model and all the high-level business rules, and that it was integrated and documented.
- Defined the primary keys (PKs) and foreign keys (FKs) for the entities and created dimensional models (star and snowflake schemas) using the Kimball methodology.
- Facilitated JAR/JAD sessions to determine data definitions and the business rules governing the data, and facilitated review sessions on the logical and physical data models with subject matter experts (SMEs).
- Applied data naming standards, created the data dictionary, documented data model translation decisions, and maintained DW metadata.
- Used data profiling tools and techniques to ensure data quality for data requirements.
- Extensively used SQL for performance tuning (a hedged EXPLAIN PLAN sketch follows this section).
- Developed and executed queries in SQL for reporting using SQL Manager.
Environment: ER Studio, DataStage, Microsoft SQL Server 2008, IBM Data Studio, EMC Greenplum, Oracle, IBM DB2, Business Objects XI 3.1, pgAdmin III
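A minimal sketch of the SQL tuning workflow referenced above, using Oracle's EXPLAIN PLAN; the table names, columns, and date value are hypothetical placeholders, and DBMS_XPLAN.DISPLAY simply prints the optimizer's plan for review.

    -- Capture the optimizer's plan for a candidate reporting query (hypothetical schema).
    EXPLAIN PLAN FOR
    SELECT c.customer_name,
           SUM(t.txn_amt) AS total_amt
      FROM online_txn_fact t
      JOIN customer_dim    c ON c.customer_key = t.customer_key
     WHERE t.txn_dt >= DATE '2011-01-01'
     GROUP BY c.customer_name;

    -- Review the plan, then adjust indexes or rewrite the query before re-testing.
    SELECT * FROM TABLE(DBMS_XPLAN.DISPLAY);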