Test Engineer Resume

Woodlawn, MD

SUMMARY

  • Over 9 years of experience in Software Quality Assurance and Software Testing of Client/Server and Web-based applications; seeking a role as a Software Test Analyst/Engineer.
  • Extensive experience with Financial, Communications, and Health Care applications.
  • Expertise in analyzing requirement specifications; developing test plans, test cases, and test scripts; and planning QA methodologies, including the Rational Unified Process.
  • Excellent understanding of the Software Development Life Cycle, with emphasis on Black Box Testing, Database Testing, GUI Testing, Integration Testing, System Testing, Regression Testing, Load and Performance Testing, and User Acceptance Testing.
  • Strong manual testing skills and proficient in automated testing tools such as QTP, WinRunner, LoadRunner, and TestDirector/Quality Center.
  • Expert at performing database testing of back-end tables and data manipulation using SQL.
  • Hands-on experience in writing and executing automated test scripts, tracking defects, and working with the development team to resolve them.
  • Specialist in using Quality Center/TestDirector for global test management, bug tracking, and reporting.
  • Experienced in Client/Server application development using Oracle, Sybase, MS SQL Server, MS Access, Visual Basic, VB.NET, and Crystal Reports.
  • Extensive experience in setting up and troubleshooting LAN, WAN, and wireless networks, Windows Server, and Active Directory, and in recovering data from physically and logically damaged hard drives.
  • Proficient in using Quality Center and Team Foundation Server.
  • Excellent communication and interpersonal skills with the ability to quickly learn new technology.
  • Self-motivated and responsible, with the ability to lead as well as work in a team.

TECHNICAL SKILLS

Languages: C, C#, C++, SQL, and Java.

Test Tools: QTP, WinRunner, LoadRunner, Team Foundation Server.

Bug Reporting: TestDirector, Quality Center.

Operating Systems: Windows 95/98/NT/2000/XP/ME/Vista/Windows Server 2003 and Linux.

Web Languages: HTML, DHTML, ColdFusion, ASP, and XML.

Front Ends: Visual Basic.

RDBMS: Oracle, SQL Server, MS Access, Sybase.

Scripting: TSL, VBScript, JavaScript, SQL, PL/SQL, C#, C++

Web Servers: IIS

App. Servers: WebLogic 5.3

Messaging Middleware: WebSphere MQ

Network: FTP, HTTP, TCP/IP, Telnet

Software: PVCS, MS Word, MS Excel, MS PowerPoint, MS FrontPage, MS Outlook, Adobe Photoshop, Dreamweaver, Norton, Symantec, VSS, and AutoCAD.

PROFESSIONAL EXPERIENCE

Confidential, Woodlawn, MD

Test Engineer

Responsibilities:

  • Designed test requirements based on business rules for the application under test.
  • Prepared Test Plans and Test Cases based on Business, Detailed, and User Requirements.
  • Prepared test data for interpreting positive, negative, and regression results.
  • Tested different modules and their functionalities.
  • Developed testing schedules for system, functional, end-to-end, and automation testing; reviewed known risks upfront with the project team; and followed up on defects with the development team until they were resolved.
  • Generated the Requirement Traceability Matrix (RTM), performance test metrics, daily and weekly status reports, bug status reports, QA status reports, and risk and failure analysis documents.
  • Used Quality Center for requirement analysis, scheduling, and generating test cases.
  • Developed automated scripts using QTP for various business processes once the environment was stable.
  • Performed back-end testing by writing SQL and PL/SQL queries in TOAD to verify the integrity of the application against the Oracle databases.
  • Involved in developing entry and exit criteria and defined the pass and fail standards.
  • Wrote and maintained SQL validation scripts to verify application outputs against the database (see the sketch at the end of this section).
  • Compared the existing and proposed architectures, documenting differences in software versions, configuration changes, and scaling factors.
  • Identified key business scenarios with application specialists and business analysts.
  • Interacted directly with developers, project managers, and stakeholders; executed performance tests and reported results at both low and high levels.
  • Researched past application response-time metrics and business transactions with the production support team to develop realistic test scenarios and load models.

Environment: Quality Center, QTP, SoapUI, Web Services, J2EE, Java, EJB, Web Server, BEA WebLogic 8.1 App Server
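
Illustrative SQL validation sketch: a minimal VBScript/ADODB example of the kind of back-end check described above, not taken from the actual project; the DSN, credentials, table name, and expected count are hypothetical.

    Dim conn, rs, expectedCount
    expectedCount = 150                                    ' expected row count (hypothetical, from the test data sheet)
    Set conn = CreateObject("ADODB.Connection")
    conn.Open "DSN=ORA_UAT;UID=qa_user;PWD=qa_pass"        ' hypothetical Oracle DSN and credentials
    Set rs = conn.Execute("SELECT COUNT(*) AS cnt FROM claims WHERE status = 'PROCESSED'")
    If CLng(rs.Fields("cnt").Value) = expectedCount Then
        WScript.Echo "PASS: processed record count matches the expected value"
    Else
        WScript.Echo "FAIL: expected " & expectedCount & ", found " & rs.Fields("cnt").Value
    End If
    rs.Close
    conn.Close

Run standalone under Windows Script Host; inside a QTP test the result would typically be reported through Reporter.ReportEvent instead of WScript.Echo.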

Confidential, Baltimore, MD

Software Tester

Responsibilities:

  • Evaluated test results to identify performance defects in mainframe systems.
  • Tested the application manually as well as with automated test scripts (QTP).
  • Developed VBScript code for all modules that needed to be automated.
  • Extensively used VBScript to develop and execute automated test scripts in QuickTest Professional.
  • Developed a keyword-driven test automation framework in QuickTest Professional (see the sketch at the end of this section).
  • Developed and enhanced VBScript code to obtain correct results, handled dynamically changing objects through VBScript, and split scripts into a number of reusable Actions.
  • Parameterized test data for data-driven testing, enabling retesting with multiple data sets.
  • Ran batch tests from Quality Center by launching QTP.

Environment: HP Quality Center, QuickTest Professional, Windows Server, SQL Server, .NET, Web Services
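
Illustrative keyword-driven driver sketch in QTP VBScript. The keywords, data-table columns, and test objects (assumed to exist in the object repository) are hypothetical; the framework's actual keywords are not reproduced here. The driver reads a keyword from the current data-table row and dispatches to the matching step or reusable Action.

    Dim keyword
    keyword = DataTable("Keyword", dtGlobalSheet)          ' keyword for the current data-table row
    Select Case keyword
        Case "Login"
            Browser("Portal").Page("Login").WebEdit("UserName").Set DataTable("User", dtGlobalSheet)
            Browser("Portal").Page("Login").WebEdit("Password").Set DataTable("Pwd", dtGlobalSheet)
            Browser("Portal").Page("Login").WebButton("SignIn").Click
        Case "SubmitClaim"
            RunAction "SubmitClaim", oneIteration          ' reusable Action split out of the main script
        Case Else
            Reporter.ReportEvent micWarning, "Driver", "Unknown keyword: " & keyword
    End Select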

Confidential, Manchester, NH

Performance Tester

Responsibilities:

  • Involved in writing test conditions, test data, and test cases based on Functional Specifications.
  • Detected bugs and classified them based on severity.
  • Translated use cases into test cases and test procedures.
  • Performed High Level Design document reviews. Participated in Feature Design review meetings and presented test case reviews, strategy, and feature functionality.
  • Identified and established test requirements and documented them in TestDirector for requirements management.
  • Documented test cases and conducted manual testing using Quality Center.
  • Created detailed periodic status reports for senior management to keep them posted on the progress of implementation.
  • Attended periodic meetings, teleconferences and led discussions on problem resolution.
  • Determined system project line readiness and performed regression analysis.
  • Extensively performed system testing to validate user interface, workflow and overall functionality using simulators and on real hardware.
  • Performed integration testing between various modules and with hardware interfaces.
  • Performed GUI, functional, and performance testing manually, and used QTP and LoadRunner to automate the testing process.
  • Worked within a team to resolve Relational Database problems.
  • Implemented a full transaction engine using XML messaging/dialog from scratch.
  • Updated XML documents to reflect new user requirements (see the sketch at the end of this section).
  • Created automation test scripts using QTP tool and performed interface, functionality, and regression testing on new builds of the software.

Environment: Manual Testing, LoadRunner, QTP, Quality Center, Web Services, Windows Server, .NET, C#, HTTP, XML, SQL, Oracle, Java.
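
Illustrative XML update sketch (VBScript/MSXML) of the kind of document maintenance described in the XML-related items above; the file path and node names are hypothetical and do not reflect the project's actual message schema.

    Dim doc, node
    Set doc = CreateObject("MSXML2.DOMDocument.6.0")
    doc.async = False
    If doc.Load("C:\testdata\payment_request.xml") Then    ' hypothetical message file
        Set node = doc.selectSingleNode("//Transaction/Amount")
        If Not node Is Nothing Then
            node.text = "125.00"                           ' update the field under test
            doc.save "C:\testdata\payment_request_updated.xml"
        End If
    Else
        WScript.Echo "XML failed to load: " & doc.parseError.reason
    End If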

Confidential, Reston, VA

Test Engineer

Responsibilities:

  • Reviewed Business Requirement Documents and the Functional Specifications.
  • Documented test cases corresponding to business rules and other operating conditions.
  • Involved in Manual Testing, Integration Testing, System Testing, User Acceptance Testing, Regression Testing, Functional Testing, Load Testing, and Performance Testing.
  • Wrote test cases and test scripts for the functional testing.
  • Reported bugs and sent email notifications to the developers using Quality Center.
  • Maintained and executed test cases and test scripts using Quality Center.
  • Performed independent verification and validation (IV&V) of third-party test efforts.
  • Automated the test scripts for functional testing using QuickTest Professional.
  • Enhanced the test scripts through checkpoints and parameterization (see the sketch at the end of this section).
  • Performed backend testing by creating database check points using Quick Test Pro.
  • Performed Load, Stress, Volume Testing using LoadRunner.
  • Worked with performance test team and created, executed, and analyzed performance test scripts, including baseline testing, performance testing, and volume testing.
  • Generated Vuser scripts with VuGen for the identified business scenarios and enhanced them through correlation and parameterization.
  • Set up and tested load balancing and failover.
  • Performed load-balancing testing with LoadRunner.
  • Monitored SQL Server Resources through SQL Server Monitors to analyze the database bottlenecks.
  • Analyzed average CPU usage, response time, and transactions per second for each scenario.
  • Gathered the results from each test run and conducted in-depth analysis on the transaction response times and the performance of each server.
  • Generated detailed reports that include graphs and tables for various performance object counters and application transaction times.

Environment: QTP, Quality Center, LoadRunner, SQL Server, Microsoft Office tools, .NET, J2EE, SQL, PL/SQL, PVCS, Windows, WebLogic, Java, HTML, XML, and Oracle.
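
Illustrative parameterization-and-checkpoint sketch in QTP VBScript: a step driven from the action's local data sheet followed by a simple verification reported through Reporter. The data-sheet columns and test objects are hypothetical and assumed to exist in the object repository.

    Dim expectedStatus, actualStatus
    expectedStatus = DataTable("ExpectedStatus", dtLocalSheet)      ' driven from the action's data sheet
    Browser("LoanApp").Page("Search").WebEdit("LoanId").Set DataTable("LoanId", dtLocalSheet)
    Browser("LoanApp").Page("Search").WebButton("Find").Click
    actualStatus = Browser("LoanApp").Page("Results").WebElement("Status").GetROProperty("innertext")
    If Trim(actualStatus) = Trim(expectedStatus) Then
        Reporter.ReportEvent micPass, "Status check", "Status matches: " & expectedStatus
    Else
        Reporter.ReportEvent micFail, "Status check", "Expected " & expectedStatus & ", found " & actualStatus
    End If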

Confidential, New York, NY

QA Analyst

Responsibilities:

  • Created Test plan and Test cases as per the business requirements.
  • Maintained the test matrix with the latest test result information.
  • Conducted User Acceptance Testing (UAT) and provided training to end-users.
  • Developed and coordinated Test plans and Test procedures from requirements.
  • Analyzed System Requirement and Functional Requirement documents for creating Test Plans and Test Cases for the application.
  • Prepared test data for the positive and negative test cases.
  • Performed System, Regression, and End-to-End testing for various interfaces.
  • Used TestDirector for documentation, test planning, and bug/defect tracking.
  • Prepared weekly status reports in MS Excel for senior management.
  • Performed extensive testing of SQL statements and stored procedures on Oracle databases and suggested improvements.
  • Developed SQL Scripts to validate the test cases.
  • Performed GUI and functionality testing manually and used QTP to automate the testing process.
  • Developed test scripts for data-driven tests using QTP and analyzed the results (see the sketch at the end of this section).
  • Involved in performance testing of the application using Mercury LoadRunner, working with all three components: VuGen, Controller, and Analysis.
  • Responsible for software QA and analysis, with emphasis on performance testing and functional decomposition of web-based (SOA) and legacy systems architecture. Drove the effort to create performance test specifications for all end-to-end components (legacy mainframe, Java servlets, database queries) of a cross-platform application upgrade.
  • Performed integration, system, end-to-end, and performance testing of client/server applications, including requirements analysis, coding automated test scripts with Mercury QTP, running manual usability tests, and reporting metrics and recommendations to project management.
  • Performed QA of benchmark beta testing, including server and benchmark setup, installation, batch-file editing, and results database compilation and reporting.

Environment: Mercury TestDirector, LoadRunner, QTP, Linux, WebLogic, SOAP, Windows, .NET, C++, VBScript, JavaScript, Oracle, SQL, and PL/SQL.
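
Illustrative data-driven setup sketch in QTP VBScript: importing an external Excel sheet into the run-time data table and iterating over its rows, the general pattern behind the data-driven tests listed above. The workbook path, sheet names, column names, and test objects are hypothetical.

    Dim row
    DataTable.ImportSheet "C:\testdata\accounts.xls", "Accounts", "Action1"   ' hypothetical workbook and sheets
    For row = 1 To DataTable.GetSheet("Action1").GetRowCount
        DataTable.GetSheet("Action1").SetCurrentRow row
        Browser("Billing").Page("Lookup").WebEdit("AccountNo").Set DataTable("AccountNo", "Action1")
        Browser("Billing").Page("Lookup").WebButton("Search").Click
    Next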
