
Sr. QA / ETL Tester / Data Analyst / Reports Tester Resume


Chicago, IL

SUMMARY

  • Over 8 years of IT experience in Quality Assurance and Software Testing of various business applications in Client/Server environments, Web-based applications, Data Warehousing, and Business Intelligence solutions.
  • Excellent working knowledge of System Development Life Cycle (SDLC), Software Testing Life Cycle (STLC) and Defect Life Cycle.
  • Experienced in Functional, System, Application, Performance, Load, Integration, Regression, Black Box, Stress, Smoke, Recovery, Reports, browser compatibility, UAT, and GUI testing.
  • Expertise in writing Test Plans, Test Strategies, Test Procedures, Use Cases, Test Cases, Test Scripts, RTMs, Defect Status Reports, and Test Summary Reports.
  • Proficiency in Defect Management, including defect creation, modification, tracking, and reporting using industry-standard tools like HP Quality Center and Rational ClearQuest.
  • Expert in writing complex SQL queries for back-end and report data validations (a representative query sketch appears at the end of this summary).
  • Solid experience with database query tools such as TOAD, SQL Navigator, SQL Assistant, SQL*Plus, and SQL Developer.
  • Experience in maintaining test environments, including requesting deployments and data loads, database backups, restarting servers, and troubleshooting issues.
  • Tested several complex reports generated by IBM Cognos, Business Objects XI, and MicroStrategy, including Dashboards, Summary reports, Master-Detail reports, Drill-down reports, and Scorecards.
  • Experience in testing reports for both cosmetic/static content and data validations as per report specification documents.
  • Familiar with SQL Server linked servers; a linked server is simply a connection to an Object Linking and Embedding Database (OLE DB) data source, and OLE DB is a Microsoft standard API for retrieving data from a wide variety of data sources.
  • Experience with UNIX commands and Shell Scripting.
  • Experience in Data Analysis, Data Cleansing (Scrubbing), Data Validation and Verification, Data Conversion, Data Migrations and Data Mining.
  • Expert level skills in testing Enterprise Data Warehouses using Informatica Power Center, DataStage, Ab Initio and SSIS ETL tools.
  • Have experience in testing reconcile (initial) and delta (daily) loads.
  • Experienced in using Oracle SQL*Loader for bulk loading data from flat files into database tables.
  • Experienced in interacting with Clients, Business Analysts, UAT Users and Developers.
  • Knowledge of the phases of the Agile and Scrum methodologies.
  • Excellent interpersonal, communication, documentation and presentation skills.
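
A representative back-end validation of the kind described above, shown as a minimal sketch; CUSTOMER_SRC (staging) and CUSTOMER_DIM (target) are hypothetical table names used only for illustration:

    -- Row count reconciliation between the source staging table and the warehouse target
    -- (customer_src and customer_dim are illustrative names, not actual project tables)
    SELECT (SELECT COUNT(*) FROM customer_src) AS src_count,
           (SELECT COUNT(*) FROM customer_dim) AS tgt_count
    FROM   dual;

    -- Column-level comparison: any row returned exists in the source but is missing
    -- or was transformed incorrectly in the target
    SELECT customer_id, first_name, last_name, status_code
    FROM   customer_src
    MINUS
    SELECT customer_id, first_name, last_name, status_code
    FROM   customer_dim;

Running the MINUS query in both directions also surfaces rows that appear in the target with no corresponding source record.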

TECHNICAL SKILLS

QA Methodologies: Iterative, Agile, Waterfall and Spiral Methodology

QA Tools: HP Quality Center, TFS (Team Foundation Server), Rational ClearQuest

BI/Reporting Tools: Business Objects XI, MicroStrategy, IBM Cognos, Crystal Reports, SQR, SSRS, SSAS

Databases: Oracle 11g/10g/9i/8i, Microsoft Access, SQL Server 2005/2008, MySQL, Teradata V2R6/V2R5

Query Tools: TOAD, Rapid SQL, SQL Developer, SQL Navigator, Teradata SQL Assistant, SQL Plus

ETL Tools: Informatica PowerCenter 8.x/7.x, Ab Initio, DataStage 7.x/8.x, SSIS

Programming Languages: SQL, PL/SQL, XML, C, SQR (Structured Query Reporting)

Operating Systems: Microsoft Windows (Vista, XP, 2000, NT 4.0), OS/2, UNIX (Sun Solaris, HP-UX)

Others: MS Office (SharePoint, Outlook, Word, Excel, PowerPoint, Communicator), MS Project, PuTTY, Remedy

PROFESSIONAL EXPERIENCE

Confidential, Chicago, IL

Sr. QA / ETL Tester/ Data Analyst/ Reports Tester

Responsibilities:

  • Examined and understood the requirements from the client and developed test cases based on functional requirements, report specifications, data mapping requirements, and system specifications.
  • Used Quality Center for Test Planning, Test Designing, Test Analysis, Test Execution, Defect Tracking and Test Result maintenance.
  • Prepared Test Plans and Test Cases for Reports testing and ETL testing.
  • Prepared test data for positive and negative test scenarios for Functional Testing as documented in the test plan.
  • Wrote test cases for unit-level, functional, and integration testing.
  • Tracked and reported defects effectively to improve communication and reduce delays.
  • Performed exhaustive unit-level, integration, and regression testing for all modules of the application.
  • Reviewed the test cases written from the Change Request document; testing was performed based on Change Requests and Defect Requests.
  • Wrote several complex SQL queries for data verification, data validation, data quality checks.
  • Extensive querying using TOAD to run SQL queries and monitor quality & integrity of data.
  • Involved in preparation of System Test Results after Test case execution.
  • Worked with database validation, constraints validation, record counts, source-to-target validation, random sampling, and error processing.
  • Validated data, accuracy of calculated fields, data shown, selection criteria, and subtotals for the OBIEE reports (see the query sketch after this list).
  • Validated the front end reporting components created using OBIEE Reports as well as performed back end testing, making sure that the correct data for a specific report are being pulled out from the Data Warehouse.
  • Performed defect tracking and reporting with a strong emphasis on root-cause analysis to determine where and why defects were being introduced in the development process.
  • Involved in maintenance of the project post deployment and resolving deployment issues.
  • Tested data migration to ensure that integrity of data was not compromised.
  • Validated various SSIS packages and SSRS reports according to data mapping specifications.
  • Used Analytix as a data mapping requirements management tool.
  • Used HP Quality Center as Test Management tool.
  • Used Jira as the defect management tool and SharePoint as the version control tool.
  • Extensively used Informatica Workflow Manager to run workflows/mappings and monitored the session logs in Informatica Workflow Monitor.
  • Responsible for testing initial and daily loads of ETL jobs.
  • Performed a smoke (shake-out) test before a load was accepted for testing.
  • Extensively used Oracle database to test the Data Validity and Integrity for Data Updates, Deletes & Inserts.
  • Worked in the role of data analyst in mapping and scrubbing sensitive data within the application.
  • Verified Informatica session logs to identify errors that occurred during ETL execution.
  • Performed regression testing after defects were fixed and changes were made to the Functional or Business Requirement Document.
  • Actively participated in stand-up meetings, story reviews, and retrospectives of the Agile process.
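
As an illustration of the report-versus-warehouse checks above, the following minimal sketch recomputes a report subtotal and a calculated field directly from the database so they can be compared with the values displayed on the OBIEE report; SALES_FACT, PRODUCT_DIM, their columns, and the date range are assumptions made only for the example:

    -- Recompute a category subtotal and a calculated field straight from the warehouse
    -- so they can be compared with the figures shown on the OBIEE report.
    -- Table names, column names, and the prompt date range below are illustrative only.
    SELECT p.product_category,
           SUM(f.sales_amount)                      AS category_subtotal,
           ROUND(SUM(f.sales_amount) / COUNT(*), 2) AS avg_sale_amount   -- calculated field
    FROM   sales_fact  f
    JOIN   product_dim p ON p.product_key = f.product_key
    WHERE  f.sale_date BETWEEN DATE '2013-01-01' AND DATE '2013-03-31'   -- report prompt values
    GROUP  BY p.product_category
    ORDER  BY p.product_category;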

Environment: Informatica PowerCenter 9.1, Oracle 11g, SQL, PL/SQL, OBIEE, SQL Server 2008, T-SQL, SSIS, SSRS, SSAS, SharePoint, Quality Center 11, Jira, UNIX, Remedy, SnagIT, CompareIT, MS Office

Confidential, Chicago, IL

Sr. QA Analyst/ETL Tester/Data Analyst/ Reports Tester

Responsibilities:

  • Analyzed the requirements from the client and developed test cases based on functional requirements, report specifications, data mapping requirements, and system specifications.
  • Prepared Test Plans and Test Cases for Reports testing and ETL jobs from the requirements.
  • Prepared test data for positive and negative test scenarios for Functional Testing as documented in the test plan.
  • Reviewed the test cases written from the Change Request document; testing was performed based on Change Requests and Defect Requests.
  • Prepared System Test Results after test case execution.
  • Effectively coordinated with the development team for closing a defect.
  • Wrote several complex SQL queries for data verification and data quality checks.
  • Tested Reports for the data quality, accuracy of calculated fields, accuracy of the data shown, selection criteria shown, accuracy of subtotals, accuracy of report totals, sequence of columns.
  • Validated the Reports for valid and invalid prompts.
  • Tested the reports for usability factors such as report initiation time, report run time, and prompt entry.
  • Validated the front end reporting components created using Business Objects Reports as well as back end testing, making sure that the correct data for a specific report are being pulled out from the Data Warehouse.
  • Validated various Business Objects reports for data quality and format.
  • Prepared Test Scenarios by creating Mock data based on the different test cases.
  • Tested report layout, including report title, report run date, page number, report versioning, overall appearance, readability of fonts, column headings, and titles.
  • Validated data, calculated fields, totals, and subtotals for the Business Objects and MicroStrategy reports.
  • Tested the dashboards and performed data validations according to report specification documents.
  • Performed integration and regression testing of the Informatica mappings.
  • Prepared test cases for functional, regression, and system integration testing in Test Director.
  • Performed functional testing (count verification, data validation, referential integrity tests, transformation validation, and various constraint checks); see the query sketch after this list.
  • Performed defect tracking and reporting with a strong emphasis on root-cause analysis to determine where and why defects were being introduced in the development process.
  • Involved in maintenance of the project post deployment and resolving deployment issues.
  • Tested data migration to ensure that integrity of data was not compromised.
  • Used HP Quality Center as the Test Management tool.
  • Verified session logs to identify errors that occurred during ETL execution.
  • Extensively used Informatica Workflow Manager to run workflows/mappings and monitored the session logs in Informatica Workflow Monitor.
  • Performed a smoke (shake-out) test before a load was accepted for testing.
  • Extensively used Oracle database to test the Data Validity and Integrity for Data Updates, Deletes & Inserts.
  • Worked in the role of data analyst in mapping and scrubbing sensitive data within the application.
  • Developed UNIX Shell Scripts for scheduling various data cleansing scripts and the loading process.
  • Provided the management with weekly QA documents like Test metrics, Daily status reports, Test schedules and Test Exit Report.
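
The referential-integrity and transformation checks referenced above typically took a form like the following sketch; ORDER_FACT, CUSTOMER_DIM, CUSTOMER_SRC, and the gender decode rule are hypothetical and used only to show the pattern:

    -- Referential integrity: fact rows whose customer key has no matching dimension row
    SELECT f.order_id, f.customer_key
    FROM   order_fact f
    WHERE  NOT EXISTS (SELECT 1
                       FROM   customer_dim d
                       WHERE  d.customer_key = f.customer_key);

    -- Transformation validation: flag rows where the loaded description does not match
    -- the decode rule defined in the mapping specification (rule shown is an assumption)
    SELECT s.customer_id, s.gender_code, d.gender_desc
    FROM   customer_src s
    JOIN   customer_dim d ON d.customer_id = s.customer_id
    WHERE  d.gender_desc <> CASE s.gender_code
                              WHEN 'M' THEN 'Male'
                              WHEN 'F' THEN 'Female'
                              ELSE 'Unknown'
                            END;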

Environment: Oracle 10g/11g, Business Objects XI, SQL Developer, SQL Server 2008, TOAD, Informatica PowerCenter 8.6.1, HP Quality Center, PL/SQL, UNIX, PuTTY, Session log files, Flat files, XML files, Remedy, SnagIT, CompareIT, MS Office, ClearQuest (CQ), and ClearCase.

Confidential, Chicago, IL

QA Analyst/ETL Tester/Data Analyst/ Reports Tester

Responsibilities:

  • Analyzed the requirements from the client and developed test cases based on functional requirements, general requirements, and system specifications.
  • Prepared test data for positive and negative test scenarios for Functional Testing as documented in the test plan.
  • Extensively used Cognos for generating the reports and validating them according to business requirements.
  • Validated data, accuracy of calculated fields, data shown, selection criteria, and subtotals for the Cognos reports.
  • Tested reports in Cognos using Analysis Studio, Report Studio, and Query Studio.
  • Tested reports for column headings, field titles, column and field alignment, field spacing and size, fields in correct format (dates, currency etc).
  • Wrote several complex SQL queries for data verification and data quality checks (see the sketch after this list).
  • Verified session logs to identify errors that occurred during ETL execution.
  • Created Test Cases, traceability matrix based on mapping document and requirements.
  • Reviewed the test cases written from the Change Request document; testing was performed based on Change Requests and Defect Requests.
  • Prepared System Test Results after test case execution.
  • Effectively coordinated with the development team for closing a defect.
  • Prepared Test Scenarios by creating Mock data based on the different test cases.
  • Performed defect tracking and reporting with a strong emphasis on root-cause analysis to determine where and why defects were being introduced in the development process.
  • Ran DataStage jobs using DataStage Director as well as UNIX scripts.
  • Used Rational ClearQuest as the defect tracking tool.
  • Performed a smoke (shake-out) test before a load was accepted for testing.
  • Extensively used Oracle database to test Data Validity and Integrity for Data Updates, Deletes & Inserts.
  • Worked in the role of data analyst in mapping and scrubbing sensitive data within the application.
  • Provided the management with weekly QA documents like test metrics, reports, and schedules.
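
The data quality checks mentioned above generally followed patterns like this minimal sketch; ACCOUNT_DIM and its columns are illustrative names, not actual project objects:

    -- Duplicate-key check on a target table feeding the Cognos reports
    SELECT account_id, COUNT(*) AS dup_count
    FROM   account_dim
    GROUP  BY account_id
    HAVING COUNT(*) > 1;

    -- Mandatory-column check: business keys and key report attributes should never be NULL
    SELECT COUNT(*) AS null_rows
    FROM   account_dim
    WHERE  account_id   IS NULL
       OR  account_name IS NULL
       OR  open_date    IS NULL;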

Environment: Oracle 10g, Cognos 8.0, SQL, Informatica PowerCenter 8.4, DataStage, Teradata V2R6/V2R5, SQL Assistant, Rational ClearCase, Rational ClearQuest, PL/SQL, UNIX, PuTTY, Session log files, Flat files, XML files, DB2, Sybase, HP Quality Center, Autosys, SQL Server 2008, and TOAD.

Confidential, Minneapolis, MN

QA Analyst/DW/Reports Tester

Responsibilities:

  • Designed and Created test plan, test scenarios and test cases for Data warehouse and ETL testing.
  • Responsible for translating business requirements into quality assurance test cases.
  • Reviewed test scenarios, test cases, and Data Warehouse test results.
  • Developed test scripts using SQL queries to validate data.
  • Prepared Regression Test Plans, Requirements Traceability Matrices (RTM), positive and negative test scenarios, detailed Test Scripts, Test Kickoff documents, Test Scorecards for test progress status, Test Results, Release Checklists, Lessons Learned documents, and a Regression Test Suite for future use.
  • Responsible for testing initial and daily loads of ETL jobs.
  • Validated various Ab Initio graphs according to functional requirements.
  • Interacted with the design team to decide on the various dimensions and facts to test the application; a measure-reconciliation query sketch follows this list.
  • Planned ahead of time to test the mapping parameters and variables by discussing them with BAs.
  • Extensively used Rational ClearQuest to track and manage defects.
  • Tested several reports detailing price/volume trends and expense variance over previous periods using financial reporting and Interactive Reporting.
  • Extensively tested several MicroStrategy Reports for data quality, fonts, headers, footers, and cosmetics.
  • Tested Informatica mappings for various ETL functions.
  • Extensively involved in testing the ETL process from data sources such as SAP, PeopleSoft, Teradata, SQL Server, Oracle, and flat files into the target Teradata and Oracle databases as per the data models.
  • Analyzed the testing progress by conducting walk through meetings with internal quality assurance groups and with development groups.
  • Held periodic meetings with Directors, Project Managers, and Clients to implement new approaches and techniques.
  • Responsible for documenting the process for future references.
  • Maintained in-depth knowledge of the data, processes, and applications in the assigned area of responsibility.
  • Reviewed test activities through daily Agile software development stand-up meetings.
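
A typical fact-versus-source reconciliation of the kind referenced above, shown as a hedged sketch; EXPENSE_SRC, EXPENSE_FACT, and their columns are hypothetical names chosen only to illustrate the check:

    -- Reconcile an aggregated measure between the source extract and the fact table;
    -- any row returned indicates a period whose totals do not match (variance expected to be zero)
    SELECT s.period_id,
           s.src_total,
           f.fact_total,
           s.src_total - f.fact_total AS variance
    FROM  (SELECT period_id, SUM(expense_amount) AS src_total
           FROM   expense_src
           GROUP  BY period_id) s
    JOIN  (SELECT period_id, SUM(expense_amount) AS fact_total
           FROM   expense_fact
           GROUP  BY period_id) f
      ON  f.period_id = s.period_id
    WHERE s.src_total <> f.fact_total;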

Environment: Mercury Quality Center, Informatica, MicroStrategy, Rational ClearCase, Rational ClearQuest, Rational RequisitePro, TOAD, Perl, PuTTY, WebLogic, Oracle 10g/9i, SQL Server 2005, UNIX, PL/SQL, Win XP

Confidential, Minneapolis, MN

DWH Tester

Responsibilities:

  • Analyzed business requirements and module-specific functionalities to identify test requirements and formulate an effective master test plan.
  • Responsible for review of Functional Requirement Specification and System Design Specification documents for testing
  • Performed extensive manual testing on critical functionalities of the application.
  • Tested ETL graphs that extract data from different databases such as Oracle and SQL Server and from flat files, and load it into Oracle.
  • Designed and developed QA documentation such as Test Cases and Test Scenarios from business and functional requirements.
  • Involved in testing the Ab Initio ETL graphs by validating whether the mappings adhere to development standards and naming conventions, whether they do what the technical design says they should do, and whether they work correctly in relation to other processes in the data logistical flow.
  • Analyzed issues by checking the log files in the AIX environment.
  • Used Mercury Quality Center for Test Planning, Test Designing, Test Analysis, Test Execution, Defect Tracking, and Test Result maintenance.
  • Extensive querying using TOAD to run SQL queries and monitor quality & integrity of data.
  • Actively participated in creating requirements Traceability matrices and Test plans

Environment: Oracle 9i, SQL Server 2005, Rapid SQL, Ab Initio, SQL, PL/SQL, Autosys, TOAD, IBM AIX, MS Access, UNIX, Mercury Quality Center
