
Data Analyst Resume


Colorado Springs, CO

SUMMARY

  • 8 years of IT experience in the analysis, design, development, testing and implementation of Business Intelligence solutions using Data Warehouse/Data Mart design, ETL, OLAP, and Client/Server applications.
  • Solid back-end testing experience writing and executing SQL queries.
  • Experience in Data Analysis, Data Validation, Data Cleansing, Data Verification and identifying data mismatches.
  • Excellent testing experience in all phases and stages of the Software Testing Life Cycle and Software Development Life Cycle (SDLC), with good working knowledge of testing methodologies, disciplines, tasks, resources and scheduling.
  • Very good understanding of Data Warehousing concepts, Data Analysis, Data Warehouse Architecture and Design.
  • Developed ETL Mapping Document based on Data profiling and source system analysis.
  • Experience in Data Modeling using Erwin in Client/Server and distributed application development.
  • Expert in writing SQL queries and Test Case Design, Test Tool Usage, Test Execution, and Defect Management.
  • Extensively used ETL methodology to support data extraction, transformation and loading in a corporate-wide ETL solution using tools such as Informatica.
  • Experienced working with Excel Pivot Tables and VBA macros for various business scenarios.
  • Strong experience in Data Analysis, Data Migration, Data Cleansing, Transformation, Integration, Data Import, and Data Export using multiple ETL tools such as Ab Initio and Informatica PowerCenter. Experience in testing and writing SQL and PL/SQL statements: stored procedures, functions, triggers and packages.
  • Expertise in testing complex business rules by creating mappings and various transformations.
  • Strong working experience on DSS (Decision Support Systems) applications and Extraction, Transformation and Load (ETL) of data from Legacy systems.
  • Strong database experience, including Oracle 10g/9i/8i and MS SQL Server 2000/7.0.
  • Good experience in Data Modeling using Star Schema and Snowflake Schema; well versed with UNIX shell wrappers, KSH and Oracle PL/SQL programming.
  • Extensive experience in writing SQL to validate database systems and for back-end database testing; a representative validation query is sketched after this list.
  • Good experience with data sources, data profiling, data validation, and developing low-level design patterns based on business and functional requirements.
  • Excellent communication, analytical and interpersonal skills.
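
A minimal sketch of the kind of back-end validation SQL referenced above, assuming hypothetical staging and target tables (STG_CUSTOMER, DM_CUSTOMER) keyed by CUSTOMER_ID:

    -- Row-count reconciliation between staging and target
    -- (all table names here are hypothetical placeholders)
    SELECT (SELECT COUNT(*) FROM stg_customer) AS stg_cnt,
           (SELECT COUNT(*) FROM dm_customer)  AS dm_cnt
    FROM   dual;

    -- Keys present in staging but missing from the target
    SELECT customer_id FROM stg_customer
    MINUS
    SELECT customer_id FROM dm_customer;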

TECHNICAL SKILLS

OPERATING SYSTEMS: Windows XP/NT/95/2000, Linux 8.x

LANGUAGES KNOWN: C, PL/SQL, SQL*Plus

RDBMS: Oracle 9i/10g/11g, Teradata R12/R13, Netezza, SQL Server

REPORTING TOOLS: Crystal Reports 6.5, Business Objects XI R2

SCRIPTING LANGUAGES: VB Script, Java Script, XSLT

DATABASES: Oracle 9i/10g, Teradata, SQL Server

DATA MODELING TOOLS: Erwin 3.5.1/4.x, Designer 2000

DATA WAREHOUSING: Informatica PowerCenter 9.x/8.x, ETL, Data Mining, TOAD

PROFESSIONAL EXPERIENCE

Confidential, Westchester, PA

Sr. Data analyst

Responsibilities:

  • Prepared test cases based on the Technical Specification document.
  • Involved in fixing invalid mappings, testing of stored procedures and functions, and Unit and Integration testing of Informatica Sessions, Batches and the Target Data.
  • Involved in Unit and System Testing of the OLAP report functionality and validation of the data displayed in the reports.
  • Analysis of functional and non-functional categorized data elements for data profiling and mapping from the source to the target data environment; developed working documents to support findings and assign specific tasks.
  • Analyzed and tested various reports for Hierarchies, Aggregation, Conditions and Filters.
  • Checked the reports for naming inconsistencies and improved user readability.
  • Involved in Unit test plans and performed Unit Testing on ETL Mappings and Workflows.
  • Executed stored procedures with valid and invalid parameters.
  • Responsible for testing all new and existing ETL data warehouse components.
  • Moved all mappings and workflows that succeeded in the Testing environment from Development to Production.
  • Performed Verification, Validation, and Transformations on the Input data (Text files, XML files) before loading into target database.
  • Used tools such as TOAD 8.5.1 when performing database testing.
  • Tracked and reported defects using the defect tracking tool Mantis 1.0.8.
  • Preparation of defect reports.
  • Interacted with functional analysts to understand requirements and write high level test scripts.
  • Reviewed ERD dimensional model diagrams to understand the relationships and cardinalities to help in preparation of integrity test cases.
  • Written test cases for data extraction, data transformation, and reporting.
  • Responsible for testing schemas, joins, data types and column values among source systems, Staging and the Data Mart; a representative comparison query is sketched after this list.
  • Analyzed the objectives and scope of each stage of testing process from the Test plan.
  • Interacted with business analysts to gather the requirements for business and performance testing.
  • Responsible for performing the data validation, process flow, dependency, Functionality Testing and User Acceptance Testing.
  • Extensively used Quality Center to prepare test cases, execution of test cases and bug tracking.
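
A minimal sketch of the kind of Staging-to-Data-Mart comparison described in the list above; the table and column names (STG_POLICY, DM_POLICY, POLICY_ID, STATUS) are hypothetical placeholders:

    -- Flag rows whose column values differ between Staging and the Data Mart
    SELECT s.policy_id,
           s.status AS stg_status,
           d.status AS dm_status
    FROM   stg_policy s
    JOIN   dm_policy  d ON d.policy_id = s.policy_id
    WHERE  NVL(s.status, '~') <> NVL(d.status, '~');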

Environment: Erwin 4.5/4.0, Informatica Power Center 8.1/9.1, Power Connect/Power Exchange, Oracle 11g, Mainframes, DB2, MS SQL Server 2008 R2, SQL, PL/SQL, XML, Windows NT 4.0, Sun Solaris Unix 2.6, Unix Shell Scripting.

Confidential, Colorado Springs, CO

Data Analyst

Responsibilities:

  • Involved in understanding the Logical and Physical Data model using the Erwin tool.
  • Involved in Data mapping specifications to create and execute detailed system test plans. The data mapping specifies what data will be extracted from an internal data warehouse, transformed and sent to an external entity.
  • Analysis of functional and non-functional categorized data elements for data profiling and mapping from the source to the target data environment; developed working documents to support findings and assign specific tasks.
  • Worked on claims data and extracted data from various sources such as flat files, Oracle and Mainframes.
  • Worked with data investigation, discovery and mapping tools to scan every single data record from many sources.
  • Analyzed business requirements, system requirements, data mapping requirement specifications interacting with client, developers and QA team.
  • Extensively tested the reports for data accuracy and universe-related errors.
  • Tested several dashboards and deployed them across the organization to monitor performance.
  • Used SQL tools to run SQL queries and validate the data loaded into the target tables.
  • Validated report data by writing SQL queries in PL/SQL Developer against the ODS; a representative check is sketched after this list.
  • Involved in user training sessions and assisting in UAT (User Acceptance Testing).
  • Strong ability in developing advanced SQL queries to extract, manipulate and calculate information to fulfill data and reporting requirements, including identifying the tables and columns from which data is extracted.
  • Metrics reporting, data mining and trend analysis in a helpdesk environment using Access.
  • Performed data management projects and fulfilled ad-hoc requests according to user specifications using data management software and tools such as Perl, Toad, MS Access, Excel and SQL.
  • Wrote SQL scripts to test the mappings and developed a Traceability Matrix of Business Requirements mapped to Test Scripts to ensure any change control in requirements leads to a test case update.
  • Involved in extensive data validation by writing several complex SQL queries, in back-end testing, and in working with data quality issues.
  • Worked with end users to gain an understanding of information and core data concepts behind their business.
  • Assisted in defining business requirements for the IT team and created BRD and functional specifications documents along with mapping documents to assist the developers in their coding.
  • Identify & record defects with required information for issue to be reproduced by development team.
  • Designed and developed database models for the operational data store, data warehouse, and federated databases to support client enterprise Information Management Strategy.
  • Flexible to work late hours to coordinate with offshore team.
  • Experience in creating UNIX scripts for file transfer and file manipulation.
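
A minimal sketch of the kind of report-versus-ODS validation described in the list above; the ODS table and columns (ODS_CLAIM, CLAIM_MONTH, CLAIM_AMOUNT) are hypothetical placeholders:

    -- Recompute the report aggregate directly from the ODS so it can be
    -- compared with the totals displayed on the report
    SELECT claim_month,
           SUM(claim_amount) AS ods_total,
           COUNT(*)          AS claim_cnt
    FROM   ods_claim
    GROUP  BY claim_month
    ORDER  BY claim_month;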

Environment: Informatica PowerCenter 9.0/8.6, TOAD, Oracle 11g/9i, PL/SQL, MS Access, Quality Center 9.2, MS Excel 2007, Java, Business Objects XI R2, Teradata V2R12, Teradata SQL Assistant 12.0

Confidential, PA

Data Analyst

Responsibilities:

  • Analysis of functional and non-functional categorized data elements for data profiling and mapping from the source to the target data environment; developed working documents to support findings and assign specific tasks.
  • Involved in Data mapping specifications to create and execute detailed system test plans. The data mapping specifies what data will be extracted from an internal data warehouse, transformed and sent to an external entity.
  • Identified issues, information and behaviors during the adoption of a proprietary information management system.
  • Did data profiling for multiple compliance feeds.
  • Worked with data investigation, discovery and mapping tools to scan every single data record from many sources.
  • Extensively used ETL methodology to support data extraction, transformation and loading in a corporate-wide ETL solution using Informatica.
  • Performed data reconciliation between integrated systems; a representative reconciliation query is sketched after this list.
  • Metrics reporting, data mining and trend analysis in a helpdesk environment using Access.
  • Wrote complex SQL queries to validate the data against different kinds of reports generated by Business Objects XI R2.
  • Extensively used MS Access to pull data from various databases and integrate the data.
  • Assisted in the oversight for compliance to the Enterprise Data Standards
  • Worked on importing and cleansing high-volume data from various sources such as DB2, Oracle and flat files onto SQL Server 2005.
  • Worked with Excel Pivot tables.
  • Created and monitored workflows using Workflow Designer and Workflow Monitor.
  • Performed data management projects and fulfilled ad-hoc requests according to user specifications using data management software and tools such as Perl, Toad, MS Access, Excel and SQL.
  • Wrote SQL scripts to test the mappings and developed a Traceability Matrix of Business Requirements mapped to Test Scripts to ensure any change control in requirements leads to a test case update.
  • Involved in extensive data validation by writing several complex SQL queries, in back-end testing, and in working with data quality issues.
  • Developed regression test scripts for the application, was involved in metrics gathering, analysis and reporting to the concerned team, and tested the test programs.
  • Identify & record defects with required information for issue to be reproduced by development team.
  • Flexible to work late hours to coordinate with offshore team.
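
A minimal sketch of the kind of cross-system reconciliation described in the list above; the two system tables and their columns (SYS_A_TXN, SYS_B_TXN, TXN_ID, AMOUNT) are hypothetical placeholders:

    -- Find transactions missing on either side, or with mismatched amounts
    SELECT NVL(a.txn_id, b.txn_id) AS txn_id,
           a.amount                AS sys_a_amount,
           b.amount                AS sys_b_amount
    FROM   sys_a_txn a
    FULL OUTER JOIN sys_b_txn b ON b.txn_id = a.txn_id
    WHERE  a.txn_id IS NULL
       OR  b.txn_id IS NULL
       OR  a.amount <> b.amount;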

Environment: Quality Center 9.2, MS Excel 2007, PL/SQL, Business Objects XI R3, Informatica 8.6 (ETL), Oracle 10g, Teradata V2R12, Teradata SQL Assistant 12.0

Confidential, Charlotte, NC

Data Analyst

Responsibilities:

  • Designed and created test cases based on the business requirements (also referred to the Source-to-Target detailed mapping document and the Transformation Rules document).
  • Involved in Data mapping specifications to create and execute detailed system test plans. The data mapping specifies what data will be extracted from an internal data warehouse, transformed and sent to an external entity.
  • Involved in extensive data validation using SQL queries and back-end testing.
  • Used SQL for querying the database in a UNIX environment.
  • Developed separate test cases for ETL process (Inbound & Outbound) and reporting
  • Involved with Design and Development team to implement the requirements.
  • Developed and manually executed test scripts to verify the expected results.
  • Designed and developed ETL processes using the Informatica ETL tool for dimension and fact file creation.
  • Involved in Manual and Automated testing using QTP and Quality Center.
  • Conducted Black Box testing (Functional, Regression and Data-Driven) and White Box testing (Unit and Integration), covering positive and negative scenarios.
  • Tracked, reviewed and analyzed defects and compared results using Quality Center.
  • Participating in the MR/CR review meetings to resolve the issues.
  • Defined the Scope for System and Integration Testing
  • Prepared and submitted summarized audit reports and took corrective actions.
  • Involved in uploading master and transactional data from flat files, preparation of test cases, and sub-system testing.
  • Document and publish test results, troubleshoot and escalate issues
  • Preparation of various test documents for ETL process in Quality Center.
  • Involved in Test Scheduling and milestones with the dependencies
  • Functionality testing of email notifications for ETL job failures, aborts or data issue problems.
  • Identified, assessed and communicated potential risks associated with the testing scope, product quality and schedule.
  • Created and executed test cases for ETL jobs that upload master data to the repository; a representative post-load check is sketched after this list.
  • Responsible for understanding and training others on enhancements and new features developed.
  • Conduct load testing and provide input into capacity planning efforts.
  • Provided support to the client in assessing how many virtual user licenses would be needed for performance testing, specifically load testing using Load Runner.
  • Create and execute test scripts, cases, and scenarios that will determine optimal system performance according to specifications.
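
A minimal sketch of the kind of post-load check used in the master data upload test cases above; the repository table and key column (MDM_PRODUCT, PRODUCT_CODE) are hypothetical placeholders:

    -- Uploaded master data should contain no duplicate or NULL business keys
    SELECT product_code, COUNT(*) AS dup_cnt
    FROM   mdm_product
    GROUP  BY product_code
    HAVING COUNT(*) > 1;

    SELECT COUNT(*) AS null_key_cnt
    FROM   mdm_product
    WHERE  product_code IS NULL;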

Environment: Windows XP, Informatica Power Center 6.1/7.1, QTP 9.2, Test Director 7.x, Load Runner 7.0, Oracle 10g, UNIX AIX 5.2, PERL, Shell Scripting

Confidential, Denver, CO

Data Warehouse Analyst

Responsibilities:

  • Designed and created test cases based on the business requirements (also referred to the Source-to-Target detailed mapping document and the Transformation Rules document).
  • Involved in extensive data validation using SQL queries and back-end testing.
  • Used SQL for querying the database in a UNIX environment.
  • Developed separate test cases for ETL process (Inbound & Outbound) and reporting
  • Involved with Design and Development team to implement the requirements.
  • Developed and manually executed test scripts to verify the expected results.
  • Designed and developed ETL processes using the Informatica ETL tool for dimension and fact file creation.
  • Involved in Manual and Automated testing using QTP and Quality Center.
  • Conducted Black Box testing (Functional, Regression and Data-Driven) and White Box testing (Unit and Integration), covering positive and negative scenarios.
  • Tracked, reviewed and analyzed defects and compared results using Quality Center.
  • Participating in the MR/CR review meetings to resolve the issues.
  • Defined the Scope for System and Integration Testing
  • Prepared and submitted summarized audit reports and took corrective actions.
  • Involved in uploading master and transactional data from flat files, preparation of test cases, and sub-system testing.
  • Document and publish test results, troubleshoot and escalate issues
  • Preparation of various test documents for ETL process in Quality Center.
  • Involved in Test Scheduling and milestones with the dependencies
  • Functionality testing of email notifications for ETL job failures, aborts or data issue problems.
  • Identified, assessed and communicated potential risks associated with the testing scope, product quality and schedule.
  • Created and executed test cases for ETL jobs to upload master data to the repository.
  • Responsible for understanding and training others on enhancements and new features developed.
  • Conduct load testing and provide input into capacity planning efforts.
  • Provided support to the client in assessing how many virtual user licenses would be needed for performance testing, specifically load testing using Load Runner.
  • Create and execute test scripts, cases, and scenarios that will determine optimal system performance according to specifications.
  • Modified the automated scripts from time to time to accommodate the changes/upgrades in the application interface.
  • Tested the database for field size validation, check constraints and stored procedures, cross-verifying the field sizes defined within the application against the metadata; a representative data dictionary query is sketched below.
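
A minimal sketch of the metadata cross-check described in the last bullet, using Oracle's standard USER_TAB_COLUMNS dictionary view; the table name 'DM_ACCOUNT' is a hypothetical placeholder:

    -- Pull column data types and sizes from the data dictionary so they can be
    -- cross-verified against the field sizes defined in the application
    SELECT column_name,
           data_type,
           data_length,
           data_precision,
           data_scale,
           nullable
    FROM   user_tab_columns
    WHERE  table_name = 'DM_ACCOUNT'
    ORDER  BY column_id;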

Environment: Windows XP, Informatica Power Center 6.1/7.1, QTP 9.2, Test Director 7.x, Load Runner 7.0, Oracle 10g, UNIX AIX 5.2, PERL, Shell Scripting
