
Performance Test Engineer Resume


Santa Clara, CA

SUMMARY

  • Around 7 years of experience in software quality assurance, with expertise in testing web-based e-commerce applications and GUI-based client/server applications through manual and automated testing using Mercury Interactive tools: WinRunner, TestDirector, Quality Center (QC), QuickTest Professional (QTP), and LoadRunner.
  • Experience with defect-reporting tools such as TestDirector, Bugzilla, Mercury Quality Center, and Rational ClearQuest, including defect tracking, results analysis, and status reporting.
  • Well-versed in all stages of the Software Development Life Cycle (SDLC) and Software Testing Life Cycle (STLC).
  • Experience in both manual and automated testing, including gap and risk analysis.
  • Strong technical skills in testing, reporting, and analysis of commercial and enterprise applications.
  • Hands-on experience testing application functionality from both business and end-user perspectives.
  • Expertise in using Quality Center, QTP, LoadRunner, and the IBM Rational testing suite.
  • Experience in testing Java and C++ applications, as well as in data migration testing.
  • Experience in executing test scripts written in VBScript.
  • Experience in many types of testing, including functional, GUI, regression, user-interface, integration, user-acceptance, smoke, system, localization, and back-end testing of client/server and web-based applications across various browsers and operating systems.
  • Experience in developing detailed performance test plans and test cases, and in testing on multiple operating systems such as Windows, UNIX, and Linux.
  • Expertise in performance testing using LoadRunner, including detailed analysis of result reports and graphs.
  • Familiar with object-oriented programming, with experience in testing environments built on Java, C++, and Visual Basic.
  • Experience with J2EE components such as Servlets, JavaServer Pages (JSP), AJAX, and WebSphere, and with web-client technologies such as XML, XSLT, HTML, DHTML, and JavaScript.
  • Extensive experience in creating stored procedures and triggers in T-SQL on SQL Server 2000/2005, using tools such as Query Analyzer.
  • Proficient in writing scripts in VBScript, SQL, and PL/SQL.
  • Hands-on experience performing all QA management activities to ensure QA milestones are achieved.
  • Good interpersonal, written, and verbal communication skills.
  • Excellent analytical, programming, troubleshooting, and problem-solving abilities.

TECHNICAL SKILLS

Testing Tools: LoadRunner, QTP, WinRunner, Quality Center, Rational Robot, Test Manager, JUnit, Selenium, Performance Center 9.5

Test Scripting Languages: TSL, VBScript, SQABasic

Programming Languages: C++, Java, SQL

Web Technologies: J2EE, EJB, JSP, JDBC, XML, HTML, ASP 2.0, .NET

Databases/Directories: Oracle 10g, DB2, Sybase, SQL Server 2000, MS Access, LDAP

Database Tools: TOAD, JXplorer, WinSQL, RapidSQL

Web/Application Servers: WebLogic, IIS, Tomcat, WebSphere

Bug Trackers: TestDirector, ClearQuest, Quality Center

GUI Tools: Visual Basic, Crystal Reports

Version Control Systems: MS VSS, CVS, ClearCase

Operating Systems: Windows XP/2000/NT/98, UNIX, MS-DOS, Mac OS X, AIX, Linux, HP-UX

Office Tools: MS Office (Word, Excel, PowerPoint, Visio, Project)

Mainframe: TE3270, TE5750

Browsers: Internet Explorer (Windows), Mozilla, AOL, Safari (Mac)

SDLC Methodologies: RUP, Agile, Waterfall, Spiral, TDD

PROFESSIONAL EXPERIENCE

Confidential, Virginia

Performance Test Engineer

Responsibilities:

  • Analyzed functional requirements and designed detailed test cases and scenarios based on them.
  • Developed test plans, test cases (test scenarios), test steps, and test scripts.
  • Designed and developed test cases and test steps for bringing the application under automated testing with WinRunner.
  • Created, debugged, and ran test scripts in WinRunner.
  • Performed different types of testing, including unit, integration, regression, and data-driven testing.
  • Performed black-box testing, including smoke, regression, integration, and functional testing, using QuickTest Professional.
  • Developed automation scripts in QC to automate smoke and regression testing.
  • Tracked, reviewed, analyzed, and compared defects using TestDirector.
  • Implemented verification checkpoints using WinRunner window/object GUI checkpoints, bitmap checkpoints, and text checkpoints.
  • Created the automation test environment and designed the strategy for complete test coverage.
  • Performed performance testing and created load test scenarios using LoadRunner; a Vuser script sketch of this kind of work appears after this list.
  • Documented and communicated test results.
  • Tested the cash management system (account balance reporting, internal transfers/stop payments, wire transfers, ACH transfers/payroll, direct deposit/tax payments, security).
  • Managed the testing process, scheduled batch tests, and logged and tracked defects using Quality Center.
  • Generated detailed bug reports, pass/fail reports, and comparison charts.
  • Attended periodic meetings with developers to resolve technical issues and interacted with them to ensure overall software quality.
  • Worked with the development team to ensure testing issues were resolved based on defect reports.
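Below is a minimal, hypothetical sketch of the kind of web-protocol LoadRunner (VuGen) script such work involves: one timed transaction with a registered content check. The transaction name, URL, and verification text are placeholders for illustration, not values from the original scripts.

    /* Hypothetical VuGen (web/HTTP) action: timed transaction plus content check. */
    Action()
    {
        /* Register a text check before the request; SaveCount stores the
           number of matches found in the server response. */
        web_reg_find("Text=Account Balance",
                     "SaveCount=balance_found",
                     LAST);

        lr_start_transaction("View_Balance_Report");

        web_url("BalanceReport",
                "URL=https://example.internal/cash/balanceReport",  /* placeholder URL */
                "Resource=0",
                "Mode=HTML",
                LAST);

        /* Pass or fail the timed transaction based on the content check. */
        if (atoi(lr_eval_string("{balance_found}")) > 0)
            lr_end_transaction("View_Balance_Report", LR_PASS);
        else
            lr_end_transaction("View_Balance_Report", LR_FAIL);

        return 0;
    }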

Environment: Windows 2000/XP, VB, VB.NET, Oracle, Quality Center, WinRunner, TestDirector, LoadRunner.

Confidential, Arden Hills, MN

Performance Test Engineer

Responsibilities:

  • Responsible for designing, creating, and executing performance benchmarks and for identifying the root causes of performance defects.
  • Identified real-world scenarios and performed complex usage-pattern analysis.
  • Independently developed LoadRunner test scripts according to test specifications and requirements.
  • Executed multi-user performance tests with LoadRunner, using online monitors, real-time output messages, and other features of the LoadRunner Controller.
  • Performed in-depth analysis to isolate points of failure in the application.
  • Investigated and troubleshot performance problems in a lab environment, and also analyzed performance problems in production.
  • Developed automation frameworks, including data-driven, keyword-driven, and modular frameworks.
  • Automated a mainframe application using a terminal emulator.
  • Developed driver scripts, startup scripts, and utility functions in QC.
  • Created and scheduled load test scenarios using the LoadRunner Controller.
  • Recorded Vuser scripts using LoadRunner VuGen; a script sketch of this kind appears after this list.
  • Used VuGen extensively for LoadRunner scripting across different protocols.
  • Configured load generators to ramp up and exercise the application with the required number of virtual users in LoadRunner.
  • Generated reports and graphs using the LoadRunner Analysis tool.
  • Executed test cases in all phases of testing, including GUI, functional, regression, integration, system, end-to-end, and UAT.
  • Performed manual testing and maintained documentation on different types of testing, such as positive and negative testing; performed functional testing using QTP.
  • Participated in unit testing, working with developers.
  • Performed database validation, integration testing, and data manipulation using SQL.
  • Scheduled and ran jobs on a daily basis.
  • Coordinated with project managers, business analysts, and system analysts to set up pre-validation and validation environments for script execution.
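A minimal sketch of a recorded and then parameterized web-protocol Vuser script of the kind described above is shown below. The form fields, URL, and the {username}/{password} parameters are hypothetical placeholders; in practice the parameters would be fed from a VuGen data file, and ramp-up and virtual-user counts are configured in the Controller scenario rather than in the script.

    /* Hypothetical VuGen (web/HTTP) login step, parameterized after recording. */
    Action()
    {
        lr_think_time(5);   /* emulate user think time between steps */

        lr_start_transaction("Login");

        web_submit_data("login.do",                                   /* placeholder form */
                        "Action=https://example.internal/app/login.do",
                        "Method=POST",
                        "Mode=HTML",
                        ITEMDATA,
                        "Name=userId",   "Value={username}", ENDITEM, /* VuGen parameter */
                        "Name=password", "Value={password}", ENDITEM, /* VuGen parameter */
                        LAST);

        lr_end_transaction("Login", LR_AUTO);

        return 0;
    }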

Environment: Java, J2EE, HTML, ASP, VBScript, JavaScript, Windows 2003, UNIX, DB2, LoadRunner, QuickTest Professional, Quality Center.

Confidential, Santa Clara, CA

Performance Test Engineer

Responsibilities:

  • Prepared test plans, test methodology, test cases, and test scripts based on Business Requirements Documents (BRDs).
  • Participated in requirements gathering, requirements analysis, business analysis, and use-case analysis.
  • Performed performance testing using HP LoadRunner and conducted cross-browser testing.
  • Used LoadRunner 9.0 VuGen extensively to generate automated test scripts.
  • Inserted rendezvous points to measure transaction response times under concurrent load; a sketch appears after this list.
  • Tracked and reported bugs throughout the test cycle.
  • Conducted meticulous, exhaustive manual testing of entire applications using manual test scripts.
  • Executed and managed load and volume test scripts in LoadRunner.
  • Worked with LoadRunner monitors such as the SQL database monitor, system resource monitors, and web/application server monitors.
  • Analyzed graphs such as transaction response time, hits per second, pages downloaded per second, throughput, and network delay time, and generated reports using LoadRunner Analysis.
  • Generated, analyzed, and interpreted reports after each performance test execution.
  • Suggested areas for improvement in terms of performance enhancements.
  • Developed automated scripts in QC for regression testing.
  • Conducted batch testing for automation programs.
  • Handled exceptions using recovery scenarios.
  • Used TestDirector for storing QC scripts and for executing scripts, test plans, and test cases.
  • Updated test cases and test scripts in TestDirector as requirements changed.
  • Developed SQL queries to check data validity and database integrity.
  • Used Rational ClearQuest as the online bug tracker.
  • Accurately and concisely reported progress and defect status using Rational ClearQuest.
  • Prepared daily status reports and submitted them to management for review.
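The sketch below illustrates how a rendezvous point is typically used in a VuGen script: all virtual users pause at the rendezvous and are released together so the following transaction's response time is measured under concurrent load. The rendezvous, transaction, and URL names are hypothetical placeholders, not the original application's values.

    /* Hypothetical VuGen (web/HTTP) step synchronized by a rendezvous point. */
    Action()
    {
        /* All Vusers wait here until the Controller's rendezvous policy releases them. */
        lr_rendezvous("checkout_peak");

        lr_start_transaction("Submit_Order");

        web_url("SubmitOrder",
                "URL=https://example.internal/store/submitOrder",  /* placeholder URL */
                "Resource=0",
                "Mode=HTML",
                LAST);

        lr_end_transaction("Submit_Order", LR_AUTO);

        return 0;
    }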

Environment: LoadRunner, Web Client, Quality Center, Bugzilla, SQL, .NET, Windows 2000/NT/98, manual testing.

Confidential, Atlanta, GA

BA/QA Analyst

Responsibilities:

  • Interacted with the business analyst team to analyze and review the requirements document and ensure that every requirement was covered by at least one test case.
  • Participated in design, code, and walkthrough meetings with the development and design team members.
  • Reviewed the requirements and use-case diagrams to write test cases and a test plan covering all possible scenarios.
  • Created test sets and documented test cases in the Test Plan to manage the test run process.
  • Used Test director for the documentation of test cases and test scripts.
  • Executed test cases manually and with an automation tool, compared and analyzed actual versus expected results, and reported all deviations to the appropriate individuals for resolution.
  • Performed regression testing to verify all the functionality works after certain builds.
  • Used QTP to convert manual test cases into automated test scripts, performing data-driven, functional, and regression testing.
  • Parameterized the scripts to avoid code redundancy and hard coding of frequently changing values in QTP.
  • Defined GUI Checkpoints to check text boxes, combo boxes, checkboxes and images using QTP.
  • Used Quality Center to log and track defects status. Wrote and tracked anomalies in the Quality Center defect tracking system.
  • Validated test cases for various batch-processing transactions at their scheduled times and verified log files for any issues arising during the tests.
  • Performed data-integrity testing on an Oracle database, validating and testing data using TOAD.
  • Wrote SQL queries to retrieve data from the database for data-integrity checks.
  • Maintained documents related to the testing process, discussed work progress, and monitored and suggested enhancements to the test strategy and test plans.

Environment: Windows XP, Quality Center, QTP, VBScript, Access, Oracle, J2EE, TOAD.

Confidential

QA Tester

Responsibilities:

  • Analyzed the Business Requirements Documents and Functional Specifications to gain a better understanding of the system from both technical and business perspectives.
  • Performed functional testing using Mercury Interactive WinRunner through both manual and automated testing processes.
  • Recorded the application in Context Sensitive mode and inserted GUI checkpoints on multiple objects to check their properties.
  • Developed and maintained automated test scripts using Winrunner.
  • Created and maintained a regression test suite to check the progress of the testing process by performing identical tests before and after defect fixes.
  • Reported bugs using TestDirector.
  • Monitored the browser’s ability to identify the various web objects and hyper links and reported the broken links.
  • Performed backend testing using SQL queries to check backend data and data integrity.
  • Facilitated unattended test runs by using exception handling with the WinRunner Recovery Manager.

Environment: Visual Basic, WinRunner, MS Project, TestDirector, SQL, HTML, Windows NT/2000.

Confidential

Manual QA Tester

Responsibilities:

  • Analyzed the test plan defining the test environment and the phases of testing.
  • Worked on the different phases of testing and the resources required for them.
  • Wrote manual test scripts supporting the available business requirements.
  • Executed test cases manually to check GUI and Functional features of the application.
  • Identified test data specific to the test case.
  • Performed back-end testing to ensure data consistency with the front end.
  • Wrote and executed SQL queries against the database.
  • Worked on Crystal Reports for generating reports.
  • Generated detailed bug reports, pass/fail reports, and comparison charts.
  • Worked closely with the build integration team and developers to perform testing activities.
  • Performed End-to-End testing manually.

Environment: SQL Server, Windows 2000, Java, XML, HTML, Visual Basic 6.0, Crystal Reports, ASP, Java Script.
