Sr. Data Analyst/ Data Modeler Resume
Livonia, MI
SUMMARY
- Over 8 years of experience as a Data Modeler and Data Analyst, with high proficiency in requirements gathering and data modeling, including design and support of various applications in OLTP, Data Warehousing, OLAP and ETL environments.
- Excellence in delivering quality Conceptual, Logical and Physical Data Models for multiple projects involving various new and existing enterprise applications.
- Excellent experience with Teradata SQL queries, Teradata indexes, and utilities such as MultiLoad, TPump, FastLoad and FastExport.
- Strong experience in Data Analysis, Data Migration, Data Cleansing, Transformation, Integration, Data Import, and Data Export using multiple ETL tools such as Ab Initio and Informatica PowerCenter. Experience in testing and writing SQL and PL/SQL statements: stored procedures, functions, triggers and packages.
- Experienced working with Excel Pivot Tables and VBA macros for various business scenarios.
- Excellent experience in data mining, querying and mining large datasets to discover patterns and examine data.
- Excellent at creating various project artifacts, including specification documents, data mapping and data analysis documents.
- Proficient in Normalization (1NF/2NF/3NF) /De-normalization techniques in relational/dimensional database environments.
- Excellent experience in troubleshooting SQL queries, ETL jobs, and data warehouse/data mart/data store models.
- Experienced in performance tuning on Oracle databases by leveraging explain plans and tuning SQL queries.
- Excellent experience in writing SQL queries to validate data movement between different layers in a data warehouse environment.
- Excellent knowledge in preparing required project documentation and tracking and reporting regularly on the status of projects to all project stakeholders.
- Extensive ETL testing experience using Informatica, Talend, Pentaho
- Extensive knowledge and experience in producing tables, reports, graphs and listings using various procedures, and in handling large databases to perform complex data manipulations.
- An excellent team player and technically strong person, able to work with business users, project managers, team leads, architects and peers, thus maintaining a healthy project environment.
- Excellent SQL programming skills; developed stored procedures, triggers, functions and packages using SQL and PL/SQL, and applied performance tuning and query optimization techniques in transactional and data warehouse environments.
- Extensive experience in advanced SQL Queries and PL/SQL stored procedures.
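As an illustration of the layer-to-layer validation queries described above, the sketch below reconciles row counts and totals between a staging table and its warehouse target. It is a minimal example only: the table and column names are hypothetical, and an in-memory SQLite database stands in for the actual Teradata/Oracle environment.

```python
import sqlite3

# In-memory SQLite stands in for the warehouse; tables and columns are hypothetical.
conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.executescript("""
CREATE TABLE stg_orders (order_id INTEGER, amount REAL);
CREATE TABLE dw_orders  (order_id INTEGER, amount REAL);
INSERT INTO stg_orders VALUES (1, 10.0), (2, 25.5), (3, 7.25);
INSERT INTO dw_orders  VALUES (1, 10.0), (2, 25.5), (3, 7.25);
""")

# Row-count and amount-total reconciliation between the two layers.
cur.execute("""
SELECT (SELECT COUNT(*)    FROM stg_orders) - (SELECT COUNT(*)    FROM dw_orders),
       (SELECT SUM(amount) FROM stg_orders) - (SELECT SUM(amount) FROM dw_orders)
""")
count_diff, amount_diff = cur.fetchone()
print(count_diff, amount_diff)  # both differences are zero when the load is complete

# Rows present in staging but missing from the target (anti-join).
cur.execute("""
SELECT s.order_id FROM stg_orders s
LEFT JOIN dw_orders d ON s.order_id = d.order_id
WHERE d.order_id IS NULL
""")
missing = cur.fetchall()
print(missing)
```

The same count/sum/anti-join pattern applies unchanged on Teradata or Oracle; only the connection layer differs.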
TECHNICAL SKILLS
Programming Languages: SQL, PL/SQL, UNIX shell Scripting, PERL, AWK, SED
Databases: Oracle 10g/11g, Teradata R12/R13/R14, MS SQL Server …, MS Access, Netezza
Tools: MS-Office suite (Word, Excel, MS Project and Outlook), VSS
Testing and defect tracking Tools: HP/Mercury (Quality Center, WinRunner, QuickTest Professional, Performance Center, Requisite), Visual Source Safe
Operating System: Windows, Unix, Sun Solaris
ETL/Data Warehouse Tools: Informatica …, SAP Business Objects XI R3.1/XI R2, Web Intelligence, Talend, Tableau, Pentaho
Data Modeling: Star-Schema Modeling, Snowflake-Schema Modeling, fact and dimension tables, Pivot Tables, Erwin
Tools & Software: TOAD, MS Office, BTEQ, Teradata SQL Assistant
PROFESSIONAL EXPERIENCE
Confidential, Livonia, MI
Sr. Data Analyst/ Data Modeler
Responsibilities:
- Actively involved in creating Physical and Logical models using Erwin.
- Created and maintained database objects (tables, views, indexes, partitions, synonyms, database triggers, stored procedures) in the data model.
- Presented data scenarios via Erwin logical models and Excel mockups to better visualize the data.
- Provided subject matter expertise as appropriate to ETL requirements, information analytics, modeling & design, development, and support activities.
- Worked with requestors to develop and understand data requirements and report specifications.
- Involved with data analysis, primarily identifying data sets, source data, source metadata, data definitions and data formats.
- Designed and developed physical data models and metadata to support the requirements using Erwin.
- Involved with data profiling activities for new sources before creating new subject areas in the warehouse.
- Conducted data analysis and identified data quality issues using data profiling methodologies.
- Enforced referential integrity in the OLTP data model for consistent relationships between tables and efficient database design.
- Responsible for analyzing various heterogeneous data sources, such as flat files, ASCII data, EBCDIC data, and relational data (Oracle, DB2 UDB, MS SQL Server).
- Delivered files in various formats (e.g., Excel, tab-delimited text, comma-separated text, pipe-delimited text).
- Involved in Teradata SQL development, unit testing and performance tuning, ensuring testing issues were resolved on the basis of defect reports.
- Tested the ETL process both before and after the data validation process.
- Tested the messages published by the ETL tool and the data loaded into various databases.
- Created UNIX scripts for file transfer and file manipulation.
- Wrote and executed unit, system, integration and UAT scripts in data warehouse projects.
- Wrote and executed SQL queries to verify that data was moved from the transactional system to the DSS, data warehouse, and data mart reporting systems in accordance with requirements.
- Ensured the onsite-to-offshore transition, QA processes, and closure of problems and issues.
- Tested the database to check field size validation, check constraints and stored procedures, cross-verifying the field sizes defined in the application against the metadata.
Environment: Netezza, Erwin 9.6, DB2, Information Analyzer, MDM, Quality Center 8.2.x, Excel, MS Word, Informatica 9.5, DataFlux, Oracle 12c, SQL, TOAD, PL/SQL, Flat Files, Teradata
Confidential, Plano, TX
Sr. Data Analyst/ Data Modeler
Responsibilities:
- Worked with business users for requirements gathering, business analysis and project coordination.
- Developed a Conceptual Model and Logical Model using Erwin based on requirements analysis.
- Created various Physical Data Models based on discussions with DBAs and ETL developers.
- Worked on the data mapping process from source system to target system.
- Created dimensional model for the reporting system by identifying required facts and dimensions using Erwin.
- Extensively used Star and Snowflake Schema methodologies.
- Developed and maintained a data dictionary to create metadata reports for technical and business purposes.
- Worked on Performance Tuning of the database which includes indexes, optimizing SQL Statements.
- Used Erwin's Model Mart for effective model management, sharing, dividing and reusing model information and designs for productivity improvement.
- Implemented Forward engineering to create tables, views and SQL scripts and mapping documents.
- Prepared data dictionaries and source-to-target mapping documents to ease the ETL process and users' understanding of the data warehouse objects.
- Translated business concepts into XML vocabularies by designing XML Schemas with UML.
- Used Normalization (1NF, 2NF & 3NF) and Denormalization techniques for effective performance in OLTP and OLAP systems.
- Good understanding of and experience with the entire data migration process: analyzing the existing data, cleansing, validating, translating tables, converting, and subsequently uploading into the new platform.
- Created documentation and test cases, worked with users for new module enhancements and testing.
- Worked with business analysts to design weekly reports using Crystal Reports.
- Understood the existing data model and documented suspected design issues affecting system performance.
- Extracted data from databases such as Oracle, SQL Server and DB2 using Informatica to load it into a single repository for data analysis.
- Involved in development and implementation of SSIS, SSRS and SSAS application solutions for various business units across the organization.
- Experienced in data migration and cleansing rules for the integrated architecture (OLTP, ODS, DW).
- Wrote DDL and DML statements for creating, altering tables and converting characters into numeric values.
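A minimal sketch of the 1NF/2NF/3NF normalization work described above: a hypothetical denormalized orders table is decomposed so customer attributes depend only on the customer key, and a join confirms no information is lost. SQLite and all table/column names here are illustrative stand-ins, not the actual project schema.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()

# Hypothetical denormalized source: customer_name repeats per order and
# depends on customer_id, not on the order key (a 3NF violation).
cur.executescript("""
CREATE TABLE orders_flat (
    order_id      INTEGER PRIMARY KEY,
    customer_id   INTEGER,
    customer_name TEXT,
    amount        REAL
);
INSERT INTO orders_flat VALUES
    (1, 100, 'Acme', 10.0),
    (2, 100, 'Acme', 25.5),
    (3, 200, 'Globex', 7.25);
""")

# 3NF decomposition: customer attributes move to their own table;
# orders keep only the foreign key.
cur.executescript("""
CREATE TABLE customers (
    customer_id   INTEGER PRIMARY KEY,
    customer_name TEXT
);
CREATE TABLE orders (
    order_id    INTEGER PRIMARY KEY,
    customer_id INTEGER REFERENCES customers(customer_id),
    amount      REAL
);
INSERT INTO customers SELECT DISTINCT customer_id, customer_name FROM orders_flat;
INSERT INTO orders    SELECT order_id, customer_id, amount FROM orders_flat;
""")

# The join reconstructs the original rows, so the decomposition is lossless.
cur.execute("""
SELECT o.order_id, c.customer_name, o.amount
FROM orders o JOIN customers c USING (customer_id)
ORDER BY o.order_id
""")
rows = cur.fetchall()
print(rows)  # [(1, 'Acme', 10.0), (2, 'Acme', 25.5), (3, 'Globex', 7.25)]
```

Denormalization for OLAP reverses this trade-off, reintroducing redundancy to cut join cost in read-heavy workloads.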
Environment: Erwin, Oracle SQL Developer, Oracle Data Modeler, Teradata, SSIS, Business Objects, Oracle 12c, SQL Server 2012, SQL Assistant 13.11, DataStage 8.1, DB2, Informatica PowerCenter 9.5.
Confidential, Southfield, MI
Sr. Data Analyst/Data Modeler
Responsibilities:
- Worked on discovering entities, attributes, relationships and business rules from functional requirements
- Worked on data mapping and documenting ETL transformations.
- Worked on creating indexes to improve performance and constraints on various tables.
- Performed Data Analysis and Data Profiling and worked on data transformations and data quality rules.
- Designed both 3NF data models for ODS and OLTP systems and dimensional data models using star and snowflake schemas.
- Involved in extensive data validation by writing several complex SQL queries; involved in back-end testing and worked on data quality issues.
- Collected, analyzed and interpreted complex data for reporting and/or performance trend analysis.
- Wrote ad-hoc SQL queries and worked with SQL and Teradata databases
- Collaborated with the data warehousing team, ensuring that the data infrastructure supported the needs of the analytics team, and validated data quality.
- Worked on physical design for both SMP and MPP RDBMS, with an understanding of RDBMS scaling features.
- Wrote complex SQL queries to validate the data against different kinds of reports generated by Business Objects.
- Wrote SQL scripts to test the mappings and developed a traceability matrix of business requirements mapped to test scripts, to ensure any change control in requirements leads to test case updates.
- Developed regression test scripts for the application, was involved in metrics gathering, analysis and reporting to the concerned team, and tested the testing programs.
- Involved in Teradata SQL development, unit testing and performance tuning, ensuring testing issues were resolved on the basis of defect reports.
- Modified the automated scripts from time to time to accommodate the changes/upgrades in the application interface.
Environment: T-SQL, Informatica 8.6 (ETL), Oracle 11g, Netezza, Business Objects XI R3, Teradata SQL Assistant 12.x, TOAD, PL/SQL, Flat Files, LoadRunner
Confidential, Troy, MI
Sr. Data Analyst/ Data modeler
Responsibilities:
- Produced functional decomposition diagrams and defined the logical data model.
- Designed logical and physical data models using the database tool Erwin.
- Used forward engineering to create DDL from the Physical Data Model, based on the requirements from the Logical Data Model.
- Conducted team meetings and Joint Application Design (JAD) sessions.
- Involved in implementing the data warehouse along with architects and data governance teams, and consulted the team on design issues and implementations.
- Used Erwin for reverse engineering to connect to existing database and ODS to create graphical representation in the form of Entity Relationships and elicit more information.
- Generated comprehensive analytical reports by running SQL queries against current databases to conduct data analysis.
- Handled performance requirements for databases in OLTP and OLAP models.
- Reverse engineered from the Toad database to Erwin and generated SQL scripts through forward engineering in Erwin.
- Defined and processed facts and dimensions; designed the data marts using Ralph Kimball's dimensional data mart modeling methodology in ER Studio.
- Involved in the integration of data coming from different sources.
- Involved in the creation and maintenance of the data warehouse and repositories containing metadata.
- Designed different types of star schemas, such as detailed data marts, plan data marts and monthly summary data marts, using ER Studio, with various dimensions such as Time, Services and Customers, and various fact tables.
- Developed and maintained a data dictionary to create metadata reports for technical and business purposes.
- Involved in extensive data validation by writing several complex SQL queries; involved in back-end testing and worked on data quality issues.
- Developed regression test scripts for the application, was involved in metrics gathering, analysis and reporting to the concerned team, and tested the testing programs.
Environment: Erwin 7.3/8, Oracle 10g, Teradata 13, SQL Server 2008, SSIS, SSRS, MS Excel, Toad for MySQL 6.3, ER Studio, HTML, TOAD, SQL*Loader, Quality Center 7.2.x, SQL, PL/SQL, Flat Files
Confidential, Alpharetta, GA
Data Analyst
Responsibilities:
- Created various data mapping repository documents as part of metadata services (EMR).
- Collaborated with data modelers and ETL developers in creating the Data Functional Design documents.
- Provided inputs to the development team performing extraction, transformation and load for data marts and data warehouses.
- Performed in-depth data analysis and prepared weekly, biweekly and monthly reports using SQL, MS Excel, MS Access, and UNIX.
- Documented the complete process flow to describe program development, logic, testing, implementation, application integration and coding.
- Documented various data quality mapping documents, and audit and security compliance adherence.
- Excellent data analytical, user interaction and presentation skills.
- Good understanding of advanced statistical modeling and logical modeling using SAS.
- Comfortable manipulating and analyzing complex, high-volume, high-dimensionality data from varying data sources.
- Ability to communicate the results of analyses in a clear and effective manner.
- Understanding of basic statistical calculations.
- Wrote SQL scripts to test the mappings and developed a traceability matrix of business requirements mapped to test scripts, to ensure any change control in requirements leads to test case updates.
- Interacted with Business System Analysts and software developers to transform business and application requirements into appropriate data model solutions.
- Worked with the business and the ETL developers in the analysis and resolution of data-related problem tickets.
- Performed data analysis and data profiling using complex SQL on various source systems, including Oracle.
Environment: MS Excel, MS Access, Oracle 9i, UNIX, Windows XP, SQL, PL/SQL, Power Designer, VBA