Sr. Data Analyst/ Data Modeler Resume
Irving, TX
SUMMARY
- Around 9 years of experience in Data Modeling/Data Analysis including Data Development, Implementation and Maintenance of databases and software applications.
- Experience working with Agile and Waterfall methodologies and with the Ralph Kimball and Bill Inmon data warehousing approaches.
- Experience with the Big Data Hadoop ecosystem for ingestion, storage, querying, processing, and analysis of big data.
- Good knowledge of Normalization and Denormalization techniques for optimum performance in relational and dimensional database environments.
- Excellent knowledge in Data Analysis, Data Validation, Data Cleansing, Data Verification and identifying data mismatch.
- Performed data analysis and data profiling using complex SQL on various source systems, including Oracle and Teradata.
- Specialization in Data Modeling, Data Warehouse design, building conceptual architecture, Data Integration, and Business Intelligence solutions.
- Expert in writing SQL queries and optimizing the queries in Oracle, SQL Server and Teradata.
- Experience in designing Star schema, Snowflake schema for Data Warehouse, ODS architecture.
- Experience in designing Conceptual, Logical, and Physical data models to build the Data Warehouse.
- Strong experience in Data Analysis, Data Migration, Data Cleansing, Transformation, Integration, Data Import, and Data Export using multiple ETL tools such as Ab Initio and Informatica PowerCenter.
- Experience in testing and writing SQL and PL/SQL statements - stored procedures, functions, triggers, and packages.
- Excellent experience using Teradata SQL Assistant, Teradata Administrator, PMON, and data load/export utilities such as BTEQ, FastLoad, MultiLoad, and FastExport, with exposure to TPump, on UNIX/Windows environments, including running batch processes for Teradata.
- Extensive experience in supporting Informatica applications, data extraction from heterogeneous sources using Informatica Power Center.
- Experience in automating and scheduling Informatica jobs using UNIX Korn shell scripting and configuring cron jobs for Informatica sessions.
- Experience in designing error and exception handling procedures to identify, record and report errors.
- Solid hands-on experience with administration of the data model repository and documentation in metadata portals using tools such as Erwin, ER Studio, and Power Designer.
- Experience in conducting Joint Application Development (JAD) sessions with SMEs, Stakeholders and other project team members for requirements gathering and analysis.
- Experienced in database design for development and production environments involving Oracle, SQL Server, Netezza, MySQL, DB2, MS Access, and Teradata.
- Experienced working with Excel pivot tables and VBA macros for various business scenarios.
- Software Development Life Cycle (SDLC) experience, including requirements, specification analysis/design, and testing.
- Solid experience in creating cloud-based solutions and architecture using Amazon Web Services (Amazon EC2, Amazon S3, Amazon RDS) and Microsoft Azure.
- Excellent knowledge of creating reports in SAP Business Objects, including Webi reports for multiple data providers.
- Excellent experience in writing and executing unit, system, integration, and UAT scripts in data warehouse projects.
- Excellent experience in writing SQL queries to validate data movement between different layers in a data warehouse environment (a representative validation query follows this summary).
- Excellent experience in troubleshooting test scripts, SQL queries, ETL jobs, data warehouse/data mart/data store models.
- Extensive knowledge and experience in producing tables, reports, graphs and listings using various procedures and handling large databases to perform complex data manipulations.
- Experience in testing Business Intelligence reports generated by various BI Tools like Cognos and Business Objects.
- Excellent at creating various project artifacts, including specification documents, data mapping documents, and data analysis documents.
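The following is a minimal, illustrative example of the kind of layer-to-layer validation query referenced above; the schema, table, and column names (stg.orders, dw.fact_orders, load_dt) are hypothetical placeholders, not drawn from any specific engagement.

    -- Reconciliation check: compare row counts between the staging layer
    -- and the target warehouse layer per load date, flagging mismatches.
    SELECT s.load_dt,
           s.src_rows,
           t.tgt_rows,
           s.src_rows - t.tgt_rows AS row_diff
    FROM (SELECT load_dt, COUNT(*) AS src_rows
          FROM stg.orders
          GROUP BY load_dt) s
    LEFT JOIN (SELECT load_dt, COUNT(*) AS tgt_rows
               FROM dw.fact_orders
               GROUP BY load_dt) t
      ON s.load_dt = t.load_dt
    WHERE t.tgt_rows IS NULL
       OR s.src_rows <> t.tgt_rows;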
PROFESSIONAL EXPERIENCE
Sr. Data Analyst/ Data Modeler
Confidential, Irving, TX
Responsibilities:
- Created the logical data model from the conceptual model and converted it into the physical database design using Erwin.
- Mapped business needs/requirements to subject area model and to logical enterprise model.
- Worked with DBAs to create a best-fit physical data model from the logical data model.
- Delivered dimensional data models using ER/Studio to bring the Employee and Facilities domain data into the Oracle data warehouse.
- Enforced referential integrity in the OLTP data model for consistent relationship between tables and efficient database design.
- Developed the data warehouse model (star schema) for the proposed central model for the project; a minimal DDL sketch of this pattern follows this section.
- Created the conceptual model for the EDW (Enterprise Data Warehouse) and had it reviewed by business users.
- Developed custom reports using HTML, Python and MySQL.
- Involved in designing Redshift DB clusters, schemas, and tables.
- Worked with project team representatives to ensure that logical and physical ER/Studio data models were developed in line with corporate standards and guidelines.
- Performed performance tuning of OLTP and Data warehouse environments using SQL Plan management by creating plan baseline in order to ensure plan stability and good performance.
- Created 3NF business-area data models with denormalized physical implementations, along with data and information requirements analysis, using the Erwin tool.
- Created data structure to store the dimensions in an effective way to retrieve, delete and insert the data.
- Created ER Diagrams, Data Flow Diagrams, grouped and created the tables, validated the data, identified PK/ FK for lookup tables.
- Implemented and maintained the ETL solution to load the tables using Teradata tools, utilities, and SQL, adhering to coding standards.
- Ran UNIX commands via the PuTTY application to view table structures and stored procedures, and to grant access permissions.
- Performed manual updates to data to fulfill the business requests and ensure adherence to data governance entry policies.
- Worked on Metadata exchange among various proprietary systems using XML.
- Worked as part of the production team on ETL issues, making changes while loading the database.
- Created high level ETL design document and assisted ETL developers in the detail design and development of ETL maps using Informatica.
- Responsible for creating the staging tables and source to target mapping documents for the ETL process.
- Involved in writing queries and stored procedures using MySQL and SQL Server.
- Identified and tracked the slowly changing dimensions, heterogeneous sources and determined the hierarchies in dimensions.
- Helped in migration and conversion of data from the Sybase database into Oracle database, preparing mapping documents and developing partial SQL scripts as required.
- Generated ad-hoc SQL queries using joins, database connections and transformation rules to fetch data from legacy Oracle and SQL Server database systems.
- Used Erwin for reverse engineering to connect to existing databases and the ODS, create graphical representations in the form of Entity Relationship diagrams, and elicit more information.
- Used Model Mart of ERWIN for effective model management of sharing, dividing and reusing model information and design for productivity improvement.
- Participated in performance management and tuning for stored procedures, tables and database servers.
Environment: SQL, PL/SQL, ETL (Informatica), Sybase, Teradata SQL Assistant, Erwin, SQL Server 2012/2014, Business Objects XI, MS Excel 2010, Rational Rose, TOAD for Data Analysis, ER Studio, UNIX
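As referenced above, a minimal sketch of the star-schema pattern used for the Employee and Facilities subject areas; all table and column names here are illustrative assumptions, not the actual project model (Oracle-style DDL).

    -- Dimension tables carry surrogate primary keys.
    CREATE TABLE dim_employee (
        employee_key   NUMBER        PRIMARY KEY,
        employee_id    VARCHAR2(20)  NOT NULL,
        employee_name  VARCHAR2(100),
        department     VARCHAR2(50)
    );

    CREATE TABLE dim_date (
        date_key       NUMBER  PRIMARY KEY,
        calendar_date  DATE    NOT NULL,
        fiscal_quarter VARCHAR2(6)
    );

    -- The fact table references each dimension through its surrogate key;
    -- the foreign keys enforce referential integrity in the model.
    CREATE TABLE fact_facility_usage (
        employee_key  NUMBER       NOT NULL REFERENCES dim_employee (employee_key),
        date_key      NUMBER       NOT NULL REFERENCES dim_date (date_key),
        hours_used    NUMBER(6,2),
        visit_count   NUMBER
    );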
Sr. Data Analyst/ Data Quality Analyst
Confidential, Eden Prairie, MN
Responsibilities:
- Developed full life cycle software including defining requirements, prototyping, designing, coding, testing and maintaining software.
- Reviewed the transformation specifications written by various work streams to propose additional DQ rules.
- Collaborated with a small team of skilled professionals to create a center of excellence in developing data governance and management.
- Extensively used the Agile methodology as the organization standard to implement the data models.
- Communicated with all business functions to maintain a comprehensive understanding of data quality requirements.
- Validated existing Data Quality rules to ensure they met Data Governance requirements.
- Determined data quality requirements by studying business functions, gathering information, and evaluating output requirements and formats.
- Maintained referential integrity by introducing foreign keys and normalizing the existing data structure; worked with the ETL team and provided source-to-target mappings to implement incremental, full, and initial loads into the target data mart.
- Worked on normalization techniques. Normalized the data into 3rd Normal Form (3NF).
- Arranged various guidance sessions for programmers, engineers, system analysts, and others to clarify performance requirements, interfaces, project capabilities, and limitations.
- Developed solutions for data quality issues and collaborated with the business and IT to implement those solutions.
- Wrote SQL queries (aggregates, conditional statements, subqueries) and analyzed large data sets, resulting in problem resolutions and improvement recommendations.
- Analyzed and presented the gathered information in graphical format for the ease of business managers.
- Produced Source to target data mapping by developing the mapping spreadsheets.
- Established data quality KPIs and metrics and created the reporting necessary to measure and communicate status toward achievement of data quality targets.
- Executed the strategy for Enterprise Data functional, SOA, API and data quality testing.
- Implemented Hadoop-based and Selenium-based test frameworks.
- Provided consultation to IT and business partners on using the Hadoop-based testing platform effectively.
- Validated business data objects to ensure the accuracy and completeness of the database.
- Supported SalesForce.com maintenance with services such as periodic data cleansing and workflow.
- Implemented ETL techniques for Data Conversion, Data Extraction and Data Mapping for different processes as well as applications.
- Developed and maintained the data dictionaries, naming conventions, standards, and class word standards document.
- Involved in Data profiling in order to detect and correct inaccurate data and maintain the data quality.
- Developed Data Migration and Cleansing rules for the Integration Architecture (OLTP, ODS, DW).
- Reviewed code and designs to ensure no errors in the systems and recommended required updates where needed.
- Identified and tracked the slowly changing dimensions (SCD Types 1, 2, 3, and Hybrid/Type 6) and determined the hierarchies in dimensions; a sketch of Type 2 handling follows this section.
- Generated various reports using SQL Server Report Services (SSRS) for business analysts and the management team.
- Designed OLTP system environment and maintained documentation of Metadata.
- Used Informatica and SAS to extract, transform, and load source data from transaction systems.
- Well experienced in documenting data relationships, business rules, allowed values, evolving glossaries, and codes.
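As referenced above, a hedged sketch of how a Type 2 slowly changing dimension can be maintained in SQL; the dim_member/stg_member tables and their columns are hypothetical, and the two-step expire-then-insert pattern is one common approach rather than the exact project implementation (SQL Server syntax).

    -- Step 1: expire the current row of any member whose tracked
    -- attributes changed in the staging feed.
    UPDATE d
    SET    d.effective_end_dt = GETDATE(),
           d.is_current       = 0
    FROM   dim_member d
    JOIN   stg_member s
      ON   s.member_id = d.member_id
    WHERE  d.is_current = 1
      AND (s.address <> d.address OR s.plan_code <> d.plan_code);

    -- Step 2: insert a fresh current row for changed members (expired in
    -- step 1) and for brand-new members.
    INSERT INTO dim_member (member_id, address, plan_code,
                            effective_start_dt, effective_end_dt, is_current)
    SELECT s.member_id, s.address, s.plan_code, GETDATE(), NULL, 1
    FROM   stg_member s
    LEFT JOIN dim_member d
      ON   d.member_id = s.member_id
     AND   d.is_current = 1
    WHERE  d.member_id IS NULL;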
Sr. Data Modeler/ Data Analyst
Confidential, Reston, VA
Responsibilities:
- Gathered and translated business requirements into detailed, production-level technical specifications, new features, and enhancements to existing technical business functionality.
- Worked as part of the team conducting logical data analysis and data modeling JAD sessions, and communicated data-related standards.
- Worked on NoSQL databases including Cassandra. Implemented multi-data center and multi-rack Cassandra cluster.
- Coordinated with Data Architects on AWS provisioning EC2 Infrastructure and deploying applications in Elastic load balancing.
- Performed Reverse Engineering of the current application using Erwin, and developed Logical and Physical data models for Central Model consolidation.
- Translated logical data models into physical database models and generated DDLs for DBAs.
- Performed Data Analysis and Data Profiling and worked on data transformations and data quality rules.
- Involved in extensive data validation by writing several complex SQL queries; involved in back-end testing and worked on data quality issues.
- Collected, analyzed, and interpreted complex data for reporting and/or performance trend analysis.
- Wrote and executed unit, system, integration, and UAT scripts in data warehouse projects.
- Extensively used ETL methodology for supporting data extraction, transformations and loading processing, in a complex DW using Informatica.
- Developed and maintained sales reporting using MS Excel queries, SQL in Teradata, and MS Access.
- Involved in writing T-SQL and working on SSIS, SSRS, SSAS, Data Cleansing, Data Scrubbing, and Data Migration.
- Redefined many attributes and relationships in the reverse engineered model and cleansed unwanted tables/columns as part of Data Analysis responsibilities.
- Designed the data marts using the Ralph Kimball's Dimensional Data Mart modeling methodology using Erwin.
- Wrote complex SQL queries to validate the data against different kinds of reports generated by Business Objects XIR2.
- Worked on importing and cleansing high-volume data from various sources such as Teradata, Oracle, and flat files.
- Wrote SQL scripts to test the mappings and developed a traceability matrix of business requirements.
- Created SQL tables with referential integrity, constraints and developed queries using SQL, SQL*PLUS and PL/SQL.
- Performed GAP analysis of the current state against the desired state and documented requirements to control the gaps identified.
- Developed the batch program in PL/SQL for the OLTP processing and used UNIX shell scripts to run it via crontab; a minimal sketch of such a batch procedure follows this section.
- Identified and recorded defects with the information required for issues to be reproduced by the development team.
- Worked on the reporting requirements and was involved in generating reports for the data model using Crystal Reports.
Environment: Erwin 9.0, PL/SQL, Business Objects XIR2, Informatica 8.6, Oracle 11g, Teradata R13, Teradata SQL Assistant 12.0, Flat Files
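As referenced above, a minimal sketch of a PL/SQL batch procedure of the kind scheduled from crontab; every object name here (process_daily_batch, oltp_transactions, txn_daily_summary) is an illustrative assumption rather than the actual program.

    CREATE OR REPLACE PROCEDURE process_daily_batch AS
        v_rows PLS_INTEGER := 0;
    BEGIN
        -- Summarize yesterday's OLTP transactions into the reporting table.
        INSERT INTO txn_daily_summary (txn_date, account_id, txn_total)
        SELECT TRUNC(txn_ts), account_id, SUM(txn_amount)
        FROM   oltp_transactions
        WHERE  txn_ts >= TRUNC(SYSDATE) - 1
          AND  txn_ts <  TRUNC(SYSDATE)
        GROUP  BY TRUNC(txn_ts), account_id;

        v_rows := SQL%ROWCOUNT;
        DBMS_OUTPUT.PUT_LINE('Rows summarized: ' || v_rows);
        COMMIT;
    EXCEPTION
        WHEN OTHERS THEN
            ROLLBACK;
            RAISE;  -- let the calling shell script detect the failure
    END process_daily_batch;
    /

A shell wrapper invoking the procedure through sqlplus can then be registered as a crontab entry so the batch runs unattended.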
Sr. Data Modeler/ Data Analyst
Confidential
Responsibilities:
- Worked with Business Analysts team in requirements gathering and in preparing functional specifications and translating them to technical specifications.
- Worked with Business users during requirements gathering and prepared Conceptual, Logical and Physical Data Models.
- Planned and defined system requirements as Use Cases, Use Case Scenarios, and Use Case Narratives using UML (Unified Modeling Language) methodologies.
- Gathered the analysis reports and prototypes from business analysts belonging to different business units; participated in JAD sessions involving the discussion of various reporting needs.
- Reverse engineered the existing data marts and identified the data elements (in the source systems), dimensions, facts, and measures required for the reports.
- Conducted design discussions and meetings to arrive at an appropriate Data Warehouse design at the lowest level of grain for each of the dimensions involved.
- Created Entity Relationship Diagrams (ERD), Functional diagrams, Data flow diagrams and enforced referential integrity constraints.
- Designed a star schema for sales data involving shared (conformed) dimensions for other subject areas using Erwin Data Modeler.
- Created and maintained the Logical Data Model (LDM) for the project, including documentation of all entities, attributes, data relationships, primary and foreign key structures, allowed values, codes, business rules, and glossary terms.
- Validated and updated the appropriate LDMs against process mappings, screen designs, use cases, the business object model, and the system object model as they evolved and changed.
- Conducted design reviews with the business analysts and content developers to create a proof of concept for the reports.
- Ensured the feasibility of the logical and physical design models.
- Worked on snowflaking the dimensions to remove redundancy; an illustrative sketch follows this section.
- Wrote PL/SQL statements, stored procedures, and triggers in DB2 for extracting as well as writing data.
- Defined facts, dimensions and designed the data marts using the Ralph Kimball's Dimensional Data Mart modeling methodology using Erwin.
- Involved in Data profiling and performed Data Analysis based on the requirements, which helped in catching many Sourcing Issues upfront.
- Developed Data mapping, Data Governance, Transformation and Cleansing rules for the Data Management involving OLTP, ODS and OLAP.
- Created data masking mappings to mask the sensitive data between production and test environment.
- Normalized the database based on the newly developed model to bring it into 3NF for the data warehouse.
- Used SQL tools like Teradata SQL Assistant and TOAD to run SQL queries and validate the data in warehouse.
- Created SSIS package for daily email subscriptions to alert Tableau subscription failure using the ODBC driver and PostgreSQL database.
- Designed logical and physical data models, Reverse engineering, Complete compare for Oracle and SQL server objects using Erwin.
- Constructed complex SQL queries with subqueries and inline views per the functional needs in the Business Requirements Document (BRD).
- Supported business analysis and marketing campaign analytics with data mining, data processing, and investigation to answer complex business questions.
- Involved in designing and developing SQL server objects such as Tables, Views, Indexes (Clustered and Non-Clustered), Stored Procedures and Functions in Transact-SQL.
- Developed scripts that automated DDL and DML statements used in creations of databases, tables, constraints, and updates.
Environment: PL/SQL, Erwin 8.5, MS SQL 2008, OLTP, ODS, OLAP, SSIS, Tableau, ODBC, Transact-SQL, TOAD, Teradata SQL Assistant
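As referenced above, a small illustration of snowflaking a dimension to remove redundancy; the product/category tables are assumed for the example and are not the project's actual model. The repeating category attributes move out of the product dimension into a normalized outrigger table.

    CREATE TABLE dim_category (
        category_key   INT         PRIMARY KEY,
        category_name  VARCHAR(50) NOT NULL,
        category_group VARCHAR(50)
    );

    -- After snowflaking, dim_product stores only a key reference to the
    -- category rather than repeating its descriptive attributes per row.
    CREATE TABLE dim_product (
        product_key   INT          PRIMARY KEY,
        product_name  VARCHAR(100) NOT NULL,
        category_key  INT          NOT NULL REFERENCES dim_category (category_key)
    );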
Data Analyst/Data Modeler
Confidential, Santa Ana, CA
Responsibilities:
- Involved in Data mapping specifications to create and execute detailed system test plans. The data mapping specifies what data will be extracted from an internal data warehouse, transformed and sent to an external entity.
- Analyzed business requirements, system requirements, data mapping requirement specifications, and responsible for documenting functional requirements and supplementary requirements in Quality Center.
- Set up environments to be used for testing and defined the range of functionalities to be tested as per the technical specifications.
- Tested Complex ETL Mappings and Sessions based on business user requirements and business rules to load data from source flat files and RDBMS tables to target tables.
- Wrote and executed unit, system, integration, and UAT scripts in data warehouse projects.
- Wrote and executed SQL queries to verify that data has been moved from transactional system to DSS, Data warehouse, data mart reporting system in accordance with requirements.
- Troubleshot test scripts, SQL queries, ETL jobs, and data warehouse/data mart/data store models.
- Excellent experience and knowledge of data warehouse concepts and dimensional data modeling using the Ralph Kimball methodology.
- Responsible for different data mapping activities from source systems to Teradata.
- Created the test environment for Staging area, loading the Staging area with data from multiple sources.
- Used CA Erwin Data Modeler (Erwin) for data modeling (data requirements analysis, database design etc.) of custom developed information systems, including databases of transactional systems and data marts.
- Responsible for analyzing data from various heterogeneous sources, such as flat files, ASCII data, EBCDIC data, and relational data (Oracle, DB2 UDB, MS SQL Server).
- Delivered files in various formats (e.g., Excel, tab-delimited text, comma-separated text, pipe-delimited text).
- Executed campaigns based on customer requirements.
- Followed company code standardization rules.
- Performed ad hoc analyses as needed, with the ability to interpret and communicate the analysis.
- Involved in Teradata SQL development, unit testing, and performance tuning, ensuring testing issues were resolved using defect reports.
- Tested the ETL process both before and after data validation; tested the messages published by the ETL tool and the data loaded into various databases.
- Created UNIX scripts for file transfer and file manipulation.
- Provided support to the client in assessing how many virtual user licenses would be needed for performance testing.
- Ensured the onsite-to-offshore transition, QA processes, and closure of problems and issues.
- Tested the database to check field-size validation, check constraints, and stored procedures, cross-verifying the field sizes defined within the application against the metadata; an illustrative query follows this section.
Environment: Informatica 8.1, Data Flux, Oracle 9i, Quality Center 8.2, SQL, TOAD, PL/SQL, Flat Files, Teradata
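As referenced above, an illustrative way to cross-verify field sizes against documented metadata using the Oracle data dictionary; the metadata_reference table is a hypothetical stand-in for the project's metadata repository.

    -- Flag columns whose physical length disagrees with the documented
    -- metadata; all_tab_columns is Oracle's standard dictionary view.
    SELECT c.table_name,
           c.column_name,
           c.data_length AS actual_length,
           m.expected_length
    FROM   all_tab_columns c
    JOIN   metadata_reference m
      ON   m.table_name  = c.table_name
     AND   m.column_name = c.column_name
    WHERE  c.data_length <> m.expected_length;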