Sr. Data Analyst / Data Modeler Resume
Madison, WI
SUMMARY:
- Over 8 years of experience in Information Technology as a Data Analyst and Data Modeler with advanced analytical and problem-solving skills.
- Extensive experience with OLTP/OLAP systems and E-R modeling, developing Database Schemas such as Star schema and Snowflake schema used in relational, dimensional, and multidimensional modeling.
- Extensive experience in Relational Data Modeling, Dimensional Data Modeling, Logical/Physical Design, ER Diagrams, Forward and Reverse Engineering, publishing Erwin diagrams, analyzing data sources, and creating interface documents.
- Hands-on experience in SAS programming activities such as merging SAS datasets, developing SAS procedures, data cleaning, report writing, macros, formats, informats, and functions, and storing and managing data in SAS files, with a good understanding of relational databases such as Teradata and Oracle.
- Strong experience in Data Analysis, Data Migration, Data Validation, Data Cleansing, Data Verification, identifying data mismatches, Data Import, and Data Export using ETL tools such as Pentaho, Talend, Informatica PowerCenter, and Informatica BDE.
- Excellent experience with Teradata SQL queries, Teradata Indexes, and utilities such as MultiLoad, TPump, FastLoad, and FastExport.
- Experience in Data Masking using the Informatica Data Masking transformation, and strong experience using Excel and MS Access to load and analyze data based on business needs.
- Experience in understanding the business needs and gathering user requirements by conducting and participating in JAD sessions, interacting with end-users and training them on the applications developed for the organization.
- Expert knowledge of the SDLC (Software Development Life Cycle), with involvement in all project phases.
- Experience in the design, development, and implementation of enterprise-wide architecture for structured and unstructured data, providing architecture and consulting services to Business Intelligence initiatives and data-driven business applications.
- Experience in Dimensional Modeling using Star and Snowflake schema methodologies on Data Warehouse and Integration projects; a minimal schema sketch follows this summary.
- Extensively worked with the Erwin tool, using features such as Reverse Engineering, Forward Engineering, Subject Areas, Domains, and Naming Standards Documents.
- Experience with Business Process Modeling, Process Flow Modeling, and Data Flow Modeling.
- Experience in creating ETL specification documents, flowcharts, process workflows, and data flow diagrams.
- Worked with ETL processes to transfer/migrate data from relational databases and flat files into common staging tables in various formats, and from there into meaningful data in Oracle and MS SQL.
- Performed extensive data cleansing, data manipulation, data transformation, and data auditing.
- Experience in using Oracle, SQL*Plus, and SQL*Loader.
- Developed database objects such as PL/SQL packages, Oracle tables, stored procedures, and triggers used to meet business objectives.
- Developed data dictionaries and maintained metadata for each model.
- Coordinated with other developers in tuning long-running SQL queries to enhance system performance.
- Involved in debugging and developing exception-handling procedures whenever required.
- Extensive knowledge of software testing methodology, including developing Test Plans, Test Procedures, Test Case Design and Execution, and Modification Requests.
- Strong in conceptualizing and communicating enterprise data architecture frameworks for global enterprises, enabling interoperation of data warehouses, middleware, and web applications.
- Good documentation skills; produced functional and technical documentation.
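As a concrete illustration of the dimensional modeling experience summarized above, the following is a minimal star schema sketch in generic SQL; all table and column names are hypothetical, not drawn from any client engagement:

-- Dimension tables carry descriptive attributes, keyed by surrogate keys.
CREATE TABLE dim_customer (
    customer_key  INTEGER     PRIMARY KEY,   -- surrogate key
    customer_id   VARCHAR(20) NOT NULL,      -- natural/business key
    customer_name VARCHAR(100),
    region        VARCHAR(50)
);

CREATE TABLE dim_date (
    date_key      INTEGER     PRIMARY KEY,   -- e.g. 20150131
    calendar_date DATE        NOT NULL,
    month_name    VARCHAR(10),
    year_number   INTEGER
);

-- The fact table holds measures at a declared grain (one row per sale)
-- and references each dimension by its surrogate key.
CREATE TABLE fact_sales (
    customer_key  INTEGER NOT NULL REFERENCES dim_customer (customer_key),
    date_key      INTEGER NOT NULL REFERENCES dim_date (date_key),
    quantity      INTEGER,
    sales_amount  DECIMAL(12,2)
);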
TECHNICAL SKILLS
Tools: Erwin Data Modeler/ Model Mart, Power Designer, Embarcadero ER Studio
Data Warehousing: Informatica 9.6/9.5/9.1/8.6 (Repository Manager, Designer, Workflow Manager, and Workflow Monitor), Ab Initio, Talend, Pentaho
BI / ETL Tools: Informatica, DataStage, Talend, Ab Initio, Business Objects XI, Cognos, Crystal Reports, SAS
Reporting Tools: Business Objects 6.5/XI R2, Tableau, TIBCO Spotfire
Languages: JAVA, J2EE, XML, C and C++, SQL, NoSQL
Database systems: ORACLE 9i/10g/11g/12c, DB2, Teradata, Netezza, MongoDB, SQL Server 2005, MS-ACCESS
RDBMS: Netezza TwinFin, Teradata R14/R13/R12, Oracle 11g/10g/9i/8i/7.x
Programming: SQL, PL/SQL, UNIX Shell Scripting, VB Script
Environment: Windows (95, 98, 2000, NT, XP), UNIX
Databases: Oracle 11g/10g/9i/8i, SQL Server 2000, 2005
Web Technologies: HTML, DHTML, XML, XSD, SOAP, WSDL, Web services
Other Tools: TOAD, MS-Office suite (Word, Excel, Project and Outlook), BTEQ, Teradata V2R6/R12/R13/R14 SQL Assistant, Aginity
PROFESSIONAL EXPERIENCE
Confidential, Madison, WI
Sr. Data Analyst / Data Modeler
Responsibilities
- Involved in data mapping specifications to create and execute detailed system test plans; the data mapping specifies what data will be extracted from an internal data warehouse, transformed, and sent to an external entity.
- Documented logical, physical, relational and dimensional data models. Designed the Data Marts in dimensional data modeling using star and snowflake schemas.
- Prepared documentation for all entities, attributes, data relationships, primary and foreign key structures, allowed values, codes, business rules, and glossary terms as they evolved and changed during the project.
- Coordinated with DBAs on database builds and table normalization and de-normalization.
- Created, documented and maintained logical & physical database models.
- Identified the entities and relationship between the entities to develop Conceptual Model using ERWIN.
- Developed Logical Model from the conceptual model.
- Responsible for various data mapping activities from source systems.
- Involved in data model reviews with the internal data architect, business analysts, and business users, explaining the data model to make sure it was in line with business requirements.
- Involved with data profiling activities for new sources before creating new subject areas in the warehouse.
- Extensively worked on Data Governance, i.e., Metadata Management, Master Data Management, Data Quality, and Data Security.
- Redefined many attributes and relationships in the reverse engineered model and cleansed unwanted tables/columns on Teradata database as part of data analysis responsibilities.
- Performed complex data analysis in support of ad-hoc and standing customer requests
- Delivered data solutions in report/presentation format according to customer specifications and timelines
- Used a reverse engineering approach to redefine entities, relationships, and attributes in the data model per new specifications in Erwin after analyzing the database systems currently in use.
- Enforced referential integrity in the OLTP data model for consistent relationships between tables and efficient database design (see the sketch after this list).
- Created the test environment for Staging area, loading the Staging area with data from multiple sources.
- Involved in SQL development, unit testing, and performance tuning, ensuring testing issues were resolved using defect reports.
- Tested the ETL process both before and after the data validation process; tested the messages published by the ETL tool and the data loaded into various databases.
- Created UNIX scripts for file transfer and file manipulation.
- Tested the database for field size validation, check constraints, and stored procedures, cross-verifying the field sizes defined within the application against the metadata.
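To illustrate the referential integrity work described above, a minimal sketch in generic SQL; the orders/order_line tables and the constraint name are hypothetical:

-- Enforce the parent-child relationship between orders and order lines.
ALTER TABLE order_line
    ADD CONSTRAINT fk_order_line_order
    FOREIGN KEY (order_id)
    REFERENCES orders (order_id);

-- Orphaned child rows can be identified before the constraint is enabled:
SELECT ol.order_id
FROM   order_line ol
LEFT JOIN orders o ON o.order_id = ol.order_id
WHERE  o.order_id IS NULL;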
Environment: Windows XP, Informatica Power Center, QTP 9.2, Test Director 7.x, Load Runner 7.0, UNIX AIX, PERL, Shell Scripting, Data Flux, Oracle 9i, Quality Center 8.2, SQL, TOAD, PL/SQL, Flat Files
Confidential, Minneapolis MN
Sr. Data Analyst / Data Modeler
Responsibilities
- Analyzed data sources, requirements, and business rules to perform logical and physical data modeling.
- Defined the key columns for the Dimension and Fact tables of both the Warehouse and the Data Mart.
- Wrote MySQL queries from scratch and created views on MySQL for Tableau (see the sketch after this list).
- Interacted frequently with the end users and transferred knowledge to them.
- Conducted and participated in JAD sessions with the Project Managers, Business Analysis Team, and Finance and Development teams to gather, analyze, and document the business and reporting requirements.
- Integrated HP BSM with HP UCMDB 9.03.
- Updated existing models to integrate new functionality into an existing application.
- Conducted one-on-one sessions with business users to gather data warehouse requirements.
- Developed normalized Logical and Physical database models to design OLTP system
- Created dimensional model for the reporting system by identifying required dimensions and facts using Power Designer .
- Created DDL scripts to implement data modeling changes; created Power Designer reports in HTML and RTF formats depending upon the requirement; published data models in Model Mart; created naming convention files; and coordinated with DBAs to apply the data model changes.
- Used forward engineering to create a Physical Data Model with DDL that best suits the requirements from the Logical Data Model .
- Maintained and implemented data models for the Enterprise Data Warehouse using Power Designer.
- Created and maintained metadata, including table and column definitions.
- Worked with Database Administrators , Business Analysts and Content Developers to conduct design reviews and validate the developed models.
- Responsible for defining the naming standards for the data warehouse.
- Possessed strong Documentation skills and knowledge sharing among Team, conducted data modeling review sessions for different user groups, participated in sessions to identify requirement feasibility.
- Extensive experience in PL/SQL programming: stored procedures, functions, packages, and triggers.
- Reworked the existing model to create new logical and physical models that formed the basis for the new application.
- Used Power Designer for reverse engineering, connecting to the existing database and ODS to create a graphical representation in the form of entity relationships and elicit more information.
- Developed Data Migration and Cleansing rules for the Integration Architecture (OLTP, ODS, DW).
- Identified the most appropriate data sources based on an understanding of corporate data thus providing a higher level of consistency in reports being used by various levels of management.
- Verified that the correct authoritative sources were being used and that the extract, transform and load (ETL) routines would not compromise the integrity of the source data .
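A minimal sketch of the kind of MySQL view created for Tableau in this role; the schema and measure names are hypothetical. Pre-joining and pre-aggregating in a view keeps the Tableau data source simple and consistent for end users:

CREATE OR REPLACE VIEW v_monthly_sales AS
SELECT
    c.region,
    DATE_FORMAT(s.order_date, '%Y-%m') AS order_month,  -- calendar month bucket
    COUNT(*)                           AS order_count,
    SUM(s.sales_amount)                AS total_sales
FROM sales s
JOIN customers c ON c.customer_id = s.customer_id
GROUP BY c.region, DATE_FORMAT(s.order_date, '%Y-%m');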
Environment: Power Designer, UNIX, Oracle 11g, Teradata, Informix, MS Excel, Mainframes, MS Visio, Rational Rose, Requisite Pro, Tableau
Confidential, NYC, NY
Data Analyst / Data Modeler
Responsibilities
- Worked with the DBA to create the physical model and tables. Scheduled multiple brainstorming sessions with DBAs and the production support team to discuss views, partitioning, and indexing schemes, case by case, for the facts and dimensions.
- Worked on IBM Information Analyzer for data profiling and generating various reports
- Identified goals and objectives of projects. Created project plan by gathering the requirements from the management.
- Used the Erwin Data Modeler tool for relational database and dimensional data warehouse designs.
- Gathered reporting requirements by conducting a series of meetings with the business system users.
- Identified schedule risks and marked them in the use cases.
- Worked on adding new tables and columns to the Basel data mart.
- Extensively used Star Schema methodologies in building and designing the logical data model into Dimensional Models.
- Worked on Informatica TDM for data masking and Informatica DVO for data validation between different systems
- Worked on the Trillium Data Quality tool, monitoring production systems for data anomalies and resolving issues.
- Wrote complex SQL queries on Netezza and used them in Lookup SQL overrides and Source Qualifier overrides (see the sketch after this list).
- Extracted data from various sources such as Oracle, Netezza, and flat files and loaded it into the target Netezza database.
- Worked with data investigation, discovery, and mapping tools to scan every single data record from many sources.
- Wrote and executed unit, system, integration, and UAT scripts in data warehouse projects.
- Troubleshot test scripts, SQL queries, ETL jobs, and data warehouse/data mart/data store models.
- Created the DDL scripts using ER Studio and source to target mappings to bring the data from Source to the warehouse.
- Worked on model-based and data-based volumetric analysis to provide accurate space requirements to the production support team.
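A sketch of the kind of query used in the Lookup SQL overrides mentioned above: it returns only the most recent row per key, so the lookup sees a single match. Table and column names are hypothetical:

SELECT account_id, account_status, updated_ts
FROM (
    SELECT account_id,
           account_status,
           updated_ts,
           -- number rows per account, newest first
           ROW_NUMBER() OVER (PARTITION BY account_id
                              ORDER BY updated_ts DESC) AS rn
    FROM   account_history
) latest
WHERE rn = 1;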
Environment: ER Studio 8, Mainframes, Netezza, UNIX, Aginity, Informatica TDM, DVO, Information Analyzer
Confidential, Rochester, MN
Data Analyst / Data Modeler
Responsibilities
- Involved with data profiling for multiple sources and answered complex business questions by providing data to business users.
- Worked with data investigation, discovery and mapping tools to scan every single data record from many sources.
- Designed class and activity diagrams using Power Designer and UML tools like Visio
- Resolved multiple Data Governance issues to support data consistency at the enterprise level.
- Developed sustainable service models using UCMDB.
- Used Erwin's Model Mart for effective model management: sharing, dividing, and reusing model information and designs for productivity improvement.
- Created the dimensional logical model with approximately 10 facts and 30 dimensions, with 500 attributes, using ER Studio.
- Implemented the Slowly Changing Dimension scheme (Type II, and Type I where appropriate) for most of the dimensions (see the sketch after this list).
- Implemented the standard naming conventions for the fact and dimension entities and attributes of logical and physical model.
- Reviewed the logical model with business users, the ETL team, DBAs, and the testing team to provide information about the data model and business requirements.
- Worked on Mercury Quality Center to track the defect logged against the logical and physical model.
- Participated in requirement gathering session with business users and sponsors to understand and document the business requirements as well as the goals of the project.
- Created and reviewed the conceptual model for the EDW (Enterprise Data Warehouse) with business users.
- Analyzed the source system (JD Edwards) to understand the source data and JDE table structure along with deeper understanding of business rules and data integration checks.
- Identified various facts and dimensions from the source system and business requirements to be used for the data warehouse.
- Worked as an onsite project coordinator once the design of the database was finalized in order to implement the data warehouse according to the implementation standards.
- Worked with the client and offshore team to make sure that the reports and dashboards were delivered on time.
- Participated in UAT sessions to educate the business users about the reports, dashboards and the BI System .
- Worked with the test team to provide insights into the data scenarios and test cases.
- Worked on data profiling and data validation to ensure the accuracy of the data between the warehouse and source systems .
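A minimal sketch of the Type II slowly-changing-dimension handling described above, in generic SQL; the dim_account table, its columns, and the :placeholders for incoming values are hypothetical. The current row is end-dated, then a new version is inserted:

-- Expire the currently active row for the changed account.
UPDATE dim_account
SET    effective_end_date = CURRENT_DATE,
       current_flag       = 'N'
WHERE  account_id   = :changed_account_id
AND    current_flag = 'Y';

-- Insert the new version with a fresh surrogate key and an open end date.
INSERT INTO dim_account
    (account_key, account_id, account_name, region,
     effective_start_date, effective_end_date, current_flag)
VALUES
    (:new_surrogate_key, :changed_account_id, :account_name, :region,
     CURRENT_DATE, DATE '9999-12-31', 'Y');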
Environment: ER Studio 9.5, Microsoft SQL Server 2012, Mercury Quality Center 9, Ab Initio, Teradata, Oracle 11g, DB2, Informix, MS Excel, Mainframes.
Confidential
Data Analyst/Data Modeler
Responsibilities
- Analyzed functional and non-functional categorized data elements for data profiling and mapping from the source to the target data environment.
- Used SQL to query the Teradata and Oracle databases in a Linux environment.
- Tested Complex ETL Mappings and Sessions based on business user requirements and business rules to load data from source flat files and RDBMS tables to target tables.
- Designed and created test cases based on the business requirements (also referencing the detailed source-to-target mapping document and the transformation rules document).
- Developed working documents to support findings and assign specific tasks.
- Involved with data profiling for multiple sources and answered complex business questions by providing data to business users.
- Extensively used ETL methodology to support data extraction, transformation, and loading in a complex EDW using Informatica.
- Performed data reconciliation between integrated systems.
- Produced metrics reporting, data mining, and trend analysis in a helpdesk environment using Access.
- Involved in data mining, transformation and loading from the source systems to the target system.
- Supported business areas and database platforms to ensure logical data model and database design, creation, and generation follows enterprise standards, processes, and procedures
- Generated a variety of metadata artifacts
- Designed database solution for applications, including all required database design components and artifacts.
- Reviewed the logical model with business users, the ETL team, DBAs, and the testing team to provide information about the data model and business requirements.
- Worked on Mercury Quality Center to track the defect logged against the logical and physical model.
- Worked as an onsite project coordinator once the design of the database was finalized in order to implement the data warehouse according to the implementation standards.
- Worked with the client and offshore team to make sure that the reports and dashboards were delivered on time.
- Participated in UAT sessions to educate the business users about the reports, dashboards and the BI System.
- Worked with the test team to provide insights into the data scenarios and test cases.
- Worked on data profiling and data validation to ensure the accuracy of the data between the warehouse and source systems.
- Provided input into database systems optimization for performance efficiency and worked on the full lifecycle of data modeling (logical, physical, deployment).
- Maintained data in the database with consistency and integrity.
- Involved with data cleansing/scrubbing and validation.
- Created numerous Volatile, Global Temporary, Set, and MultiSet tables on Teradata (see the sketch at the end of this section).
- Used Teradata OLAP features such as RANK, ROW_NUMBER, QUALIFY, CSUM, and SAMPLE.
- Used an automated UNIX process to upload data into production tables.
- Performed slicing and dicing of data using pivot tables to identify churn rate patterns and prepared reports as required.
- Prepared in-depth data analysis reports weekly, biweekly, and monthly using MS Excel, SQL, and UNIX.
Environment: PL/SQL, Informatica 8.6, Oracle 10g, Teradata V2R12/R13.10, Teradata SQL Assistant 12.0, Erwin Data Modeler, ODS, ER Studio 7, Microsoft SQL Server 2008, Microsoft SQL Server Management Studio, Microsoft SQL Server 2008 Integration Services, Microsoft SQL Server 2008 Reporting Services, Microsoft SQL Server 2008 Analysis Services, Mercury Quality Center 9.
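A Teradata-dialect sketch illustrating the volatile tables and QUALIFY usage listed in this role; table and column names are hypothetical:

-- Session-scoped scratch table, dropped automatically at logoff.
CREATE VOLATILE MULTISET TABLE vt_daily_calls AS (
    SELECT account_id, call_date, call_count
    FROM   call_summary
    WHERE  call_date >= DATE '2012-01-01'
) WITH DATA
ON COMMIT PRESERVE ROWS;

-- QUALIFY filters on a window function: keep the latest row per account.
SELECT account_id, call_date, call_count
FROM   vt_daily_calls
QUALIFY ROW_NUMBER() OVER (PARTITION BY account_id
                           ORDER BY call_date DESC) = 1;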