Sr. QA Analyst Resume

Overland Park, KS

SUMMARY

  • Over 7 years of cumulative experience in the Software Development Life Cycle, with emphasis on Performance Engineering using Mercury Interactive tools and the Rational Test Suite under Windows and Unix environments.
  • Extensive software testing exposure including manual and automated testing of Client-Server, Multi-Tier, Stand-alone and Web-based applications using WinRunner, LoadRunner, QuickTest Pro, Quality Center, and TestDirector.
  • Experience in System Specification Analysis, Testing Methodology and Test Plan Formulation.
  • Experience with and understanding of automation frameworks.
  • Extensive experience in designing Test Cases, Test Scenarios, Test Scripts and Test reports of manual and automated tests.
  • Possess excellent skills in Manual Testing along with Automation Testing using WinRunner, QuickTest Pro, LoadRunner, Wily, and TestDirector/Quality Center.
  • Proficient in WinRunner GUI maps, checkpoints, synchronization points, parameterization (Data-Driven test) and break points for debugging.
  • Wrote LoadRunner scripts and enhanced them with C functions; parameterized cookies, stored dynamic content in LoadRunner functions, and used client-side secure certificates. Wrote text checks, created scenarios for concurrent (rendezvous) and sequential users, and configured run-time settings for HTTP and iterations. Simulated modem speeds to bring the test scenarios closer to real-world conditions. Monitored CPU, memory, ASP requests, network, web connections and throughput while running baseline, performance, load, stress and soak tests (a scripting sketch illustrating these elements appears after this list).
  • Conducted load testing with thin and thick clients simultaneously; scripted thick clients in WinRunner and thin clients using the Web and Citrix protocols.
  • Handled proxy servers and recorded need-based header information into the scripts.
  • Measured response times at the sub-transaction level across web, application and database servers using Optimal Application Expert; concentrated heavily on transactions per second during testing.
  • Monitored Oracle database performance for indexes, sessions, connections, poorly written SQL queries and deadlocks for each component of the WSJ application.
  • Involved in system and performance testing of numerous Banking and Loan Applications.
  • Experience in Rational Unified Process (RUP), Clear case, Clear Quest, Rational Requisite Pro and UML.
  • Worked extensively with automated testing tools such as WinRunner, LoadRunner, TestDirector, and the LRVB Add-in. Created scenarios, ran tests with IP spoofing in in-process and multithreaded environments, and analyzed and documented results.
  • Hands-on experience analyzing business, technical, and functional requirements; developed and executed test plans, test cases and test strategies.
  • Documented defects found during testing using Quality Center/TestDirector and communicated recorded problems to the responsible QA or development personnel with root cause analysis.
  • Experience in communicating test cases and defects to the Independent Verification and Validation (IV&V) team.
  • Self-starter with the ability to quickly adapt to and master new concepts and applications.
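
For illustration, a minimal LoadRunner VuGen (C) action sketching the script elements described above (transactions, rendezvous points, text checks, data-driven parameters). The transaction name, rendezvous point, URL and parameter names are hypothetical placeholders, not taken from any actual project:

    // Minimal VuGen (C) action -- illustrative sketch only
    Action()
    {
        // Text check: verify the expected content arrives with the next page
        web_reg_find("Text=Welcome", LAST);

        // Hold Vusers at a rendezvous point to generate concurrent load
        lr_rendezvous("login_rendezvous");

        lr_start_transaction("01_Login");

        // {UserName} and {Password} come from a parameter file (data-driven test)
        web_submit_data("login",
            "Action=https://example.com/login",   // hypothetical URL
            "Method=POST",
            ITEMDATA,
            "Name=username", "Value={UserName}", ENDITEM,
            "Name=password", "Value={Password}", ENDITEM,
            LAST);

        lr_end_transaction("01_Login", LR_AUTO);

        // Think time; pacing, iterations and modem-speed emulation are
        // configured in the Run-time Settings rather than in the script
        lr_think_time(5);

        return 0;
    }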

TECHNICAL SKILLS

Testing tools: LoadRunner, WinRunner, QuickTest Pro, Wily, NeoLoad

Scripting Languages: TSL, Shell script, VB script, SQA Basic

Bug Reporting Tools: Quality Center, TestDirector, ClearQuest, Rally

Operating Systems: Windows 95/98/NT/2000/XP, Unix, Red Hat Linux

Programming Languages: C, C++, Java, VB, SQL

Front-end tools: MS Visual Basic, MS FrontPage, Macromedia Dreamweaver

Databases: MS SQL Server 2000, MS Access 2000, DB2, Oracle

Web Technologies and Scripting Languages: HTML/DHTML, XML, ASP, .Net, JSP, VBScript, JavaScript, FLASH

Other Software: Adobe Photoshop, MS Office 2000, LabVIEW, GIS ARC/INFO.

PROFESSIONAL EXPERIENCE

Confidential, Overland Park, KS

Sr. QA Analyst

Responsibilities:

  • Developed test plans and wrote detailed test cases and test scripts for new and existing applications such as Hosted Checkout, Virtual Terminal, MGD (Mercury Gift on Demand) and CRM.
  • Responsible for analyzing application and component behavior under heavier loads and optimizing server configurations.
  • Interacted with the Business teams and the end users to gather requirements.
  • Wrote and uploaded requirements, test cases and test scripts into Quality Center.
  • Documented defects found during testing using Quality Center and communicated problems to the responsible QA or development personnel with Root Cause Analysis.
  • Communicated test cases and defects to the Independent Verification and Validation team.
  • Extensively used LoadRunner 11.0, later upgraded to 11.5, for performance and stress testing.
  • Developed LoadRunner scripts for applications such as Hosted Checkout, Virtual Terminal, MGD and CRM using the Web HTTP/HTML, Web Services and Ajax TruClient protocols. Enhanced scripts with C functions, parameterized cookies, stored dynamic content in LoadRunner functions, and used client-side secure certificates. Wrote text checks, created scenarios for concurrent (rendezvous) and sequential users, and configured run-time settings for HTTP and iterations. Simulated modem speeds to bring the test scenarios closer to real-world conditions. Monitored CPU, memory, network, web connections and throughput while running baseline, performance, load, stress and soak tests.
  • Analyzed performance test results and sent reports and graphs that were compared against baseline results.
  • Wrote high-level LoadRunner scripts in the Virtual User Generator for single-user, baseline and soak (endurance) scenarios, storing dynamically varying object IDs in parameters and validating correct downloads of HTML pages by checking content in the page source. Parameterized unique IDs, stored dynamic content in variables and passed the values to web submit steps under the HTTP protocol. Handled cookies properly, used proxy servers and recorded need-based header information into the scripts (a correlation sketch appears after this list).
  • Performed regression and performance testing for the DBRA (Database Re-Architecture) project, in which all new and existing applications were upgraded and deployed onto new servers.
  • Used SiteScope to monitor CPU and memory for all web, application and database servers.
  • Monitored the database for session, connection pool and memory issues.
  • Interacted with DBAs, developers, systems engineers and network engineers during testing and isolated bottlenecks at different levels.
  • Analyzed SQL Server database connections and table indexes; deadlock issues were resolved by applying proper indexes and triggers.
  • Performed regression testing and generated additional scripts/test cases for each version.
  • Developed User Requirement Specification (URS) documents.
  • Performed Functionality, Regression, Integration and Compatibility Testing.
  • SQL Traces (SQL Profiler) were recorded and analyzed.
  • Built machines equivalent to the production environment and installed SSL certificates.
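
As a rough sketch of the correlation approach mentioned above (capturing dynamically varying IDs and passing them to subsequent web submit steps), assuming hypothetical boundaries, parameter names and URLs:

    // Illustrative VuGen (C) correlation sketch -- boundaries and URLs are
    // hypothetical, not taken from the Hosted Checkout application
    Action()
    {
        // Register capture of a dynamically generated ID from the next response;
        // the left/right boundaries depend on the real server output
        web_reg_save_param("SessionID",
            "LB=sessionId=\"",
            "RB=\"",
            "NotFound=ERROR",
            LAST);

        web_url("checkout_start",
            "URL=https://example.com/hostedcheckout/start",
            "Mode=HTML",
            LAST);

        // Re-inject the captured value so the script survives dynamic content
        web_url("checkout_submit",
            "URL=https://example.com/hostedcheckout/submit?sessionId={SessionID}",
            "Mode=HTML",
            LAST);

        return 0;
    }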

Environment: Quality Center 11.0, LoadRunner 11.0 & 11.5, QTP 11.0, MS SQL Server, IIS server, Load Balancer, Performance Center, JAVA, Linux, VuGen, TestDirector, J2EE Diagnostic Tool, Web, Windows XP, AIX, IE, Firefox.

Confidential, Fremont, CA

Sr. QA Engineer

Responsibilities:

  • Gathered business requirements by studying the application and collecting information from developers and the business.
  • Installed and configured the LoadRunner environment as well as NeoLoad for some of the Flex applications.
  • Wrote detailed defect documents and test cases, executed test scripts and performed defect tracking.
  • Used the Virtual User Generator to create VuGen scripts for the web protocol and NeoLoad scripts for Flex; ensured that quality issues were appropriately identified, analyzed, documented, tracked and resolved in Quality Center.
  • Performed regression/manual testing and updated all existing test case documents.
  • Developed and deployed load test scripts for end-to-end performance testing using LoadRunner and uploaded all test scripts into ALM.
  • Implemented and maintained an effective performance test environment.
  • Identified and eliminated performance bottlenecks during the development lifecycle.
  • Produced regular project status reports for senior management to ensure on-time project launch.
  • Conducted duration, stress and baseline tests.
  • Verified that new or upgraded applications met specified performance requirements.
  • Used the Controller to launch 100 and 200 concurrent users to generate load.
  • Identified queries that were taking too long and optimized them to improve performance.
  • Worked closely with software developers and took an active role in ensuring that software components met the highest quality standards.
  • Used Performance Center to verify application response times under different load conditions (a response-time check sketch appears after this list).
  • Identified and analyzed memory leaks at each component level.
  • Analyzed database stored procedure executions, indexes and deadlocks under load; tuned SQL Server by adjusting parallelism, updating statistics and applying hint settings.
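
As an illustration of how a response-time requirement can be enforced inside a script, a VuGen (C) sketch that fails a transaction when it exceeds a threshold. The 3-second target, transaction name and URL are assumed for the example:

    // Illustrative response-time check -- threshold and names are hypothetical
    Action()
    {
        double elapsed;

        lr_start_transaction("02_Search");

        web_url("search",
            "URL=https://example.com/search?q=test",
            "Mode=HTML",
            LAST);

        // Read the running duration while the transaction is still open
        elapsed = lr_get_transaction_duration("02_Search");

        if (elapsed > 3.0) {
            lr_error_message("02_Search took %.2f s, exceeding the 3 s target", elapsed);
            lr_end_transaction("02_Search", LR_FAIL);
        } else {
            lr_end_transaction("02_Search", LR_PASS);
        }

        return 0;
    }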

Environment: LoadRunner 11.0, NeoLoad 3.2, Quality Center 11.0, QTP 11.0, Oracle, MS SQL Server, WebLogic, WebSphere, Load Balancer, Performance Center, JAVA, Linux, VuGen, TestDirector, J2EE Diagnostic Tool, Ethereal, Web, Windows XP, AIX, IE, Netscape, Firefox

Confidential, Los Angeles, CA

QA Analyst

Responsibilities:

  • Developed test strategies and test plans and executed them across multiple projects.
  • Interacted with business analysts and end users to gather requirements and develop User Requirement Specification (URS) documents.
  • Worked closely with Production Managers, Technical Managers and Business Managers in planning, scheduling, developing, and executing performance tests.
  • Interacted with developers during testing for identifying memory leaks, fixing bugs and for optimizing server settings at web, app and database levels.
  • Responsible for writing manual test cases and defect documents.
  • Developed and executed performance, volume and stress tests.
  • Executed LoadRunner scenarios using the LoadRunner Controller (Performance Center) and analyzed the results through LoadRunner Analysis to find bottlenecks in network and server resources, including deadlock conditions, database connectivity problems and system crashes under load.
  • Developed test scenarios to properly load/stress the system in a lab environment and monitored/debugged performance and stability problems.
  • Collaborated with the performance development lead, applying performance development experience to identify performance risk areas for the application.
  • Parameterized large and complex test data to accurately depict production trends (see the parameterization sketch after this list).
  • Validated the scripts to make sure they executed correctly and met the scenario description.
  • Responsible for monitoring system and application performance using Wily.
  • Created Single User, Base Line and Soak test scenarios. Random pacing between iterations was introduced to get the desired transactions per hour.
  • Designed and developed test environment plans for each product tested.
  • Responsible for documenting test results after each test cycle and reporting all the test performance results.
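
As a sketch of the parameterization and pacing approach referenced above, a VuGen (C) action that reads input from a parameter file and randomizes think time between iterations. The {AccountNumber} parameter, URL and 5-15 second range are assumed for illustration:

    // Illustrative data-driven action with randomized pacing -- names are hypothetical
    Action()
    {
        // Log which data-file row this iteration is using
        lr_output_message("Running iteration with account %s",
                          lr_eval_string("{AccountNumber}"));

        lr_start_transaction("03_AccountLookup");

        web_url("account_lookup",
            "URL=https://example.com/accounts/{AccountNumber}",
            "Mode=HTML",
            LAST);

        lr_end_transaction("03_AccountLookup", LR_AUTO);

        // Random 5-15 second think time to approximate the desired transactions
        // per hour (pacing can also be set in the Run-time Settings)
        lr_think_time(5 + rand() % 11);

        return 0;
    }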

Environment: Quality Center 11.0, LoadRunner 11.0, Wily, Citrix, LDAP, Oracle Database, VuGen, MS SQL Server, WebLogic, WebSphere, Load Balancer, HP RUM, IBM Mainframe AS/400, Performance Center 11.0, JAVA, TestDirector, J2EE Diagnostic Tool, Web, Windows XP, Solaris, AIX, IE.

Confidential, Boise, ID

QA Test Engineer

Responsibilities:

  • Responsible for the successful execution of QA testing projects.
  • Met with business end users and product developers to document and understand the product performance expectations.
  • Protocols used: Web, Citrix ICA, Oracle (2-Tier), Web/Winsock Dual Protocol, C Vuser.
  • Developed regression test cases and performance test strategies and executed them on multiple, simultaneous projects.
  • Created consistent, reusable VuGen scripts containing multiple interdependent transactions (a reusable-transaction sketch appears after this list).
  • Used the Virtual User Generator to create VuGen scripts for the web protocol; ensured that quality issues were appropriately identified, analyzed, documented, tracked and resolved in Quality Center.
  • Developed and deployed test scripts for end-to-end performance testing using LoadRunner.
  • Implemented and maintained an effective performance test environment.
  • Identified and eliminated performance bottlenecks during the development lifecycle.
  • Produced regular project status reports for senior management to ensure on-time project launch.
  • Responsible for creation of non-functional product requirements and the engagement of the performance team early in the product cycle.
  • Created and reviewed test schedules, test plans, test cases and test scenarios, and worked with QC for defect tracking. Provided frequent test progress reports to development and upper management teams.
  • Developed the tool suite used for automated regression, performance and scalability testing.
  • Developed front-end performance test scripts in WinRunner and all back-end load test scripts in LoadRunner.
  • Used as many as 50 WinRunner host terminals and 100 LoadRunner virtual users to performance test the project via the LoadRunner Controller.
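
As a sketch of the reusable, interdependent transaction structure described above, a small VuGen (C) helper that wraps each page request in a named transaction and stops the iteration when an earlier step fails. The helper name, transaction names and URLs are hypothetical:

    // Illustrative reusable transaction wrapper -- names and URLs are hypothetical
    int run_step(char *tx_name, char *url)
    {
        int rc;

        lr_start_transaction(tx_name);
        rc = web_url(tx_name, url, "Mode=HTML", LAST);   // url must include "URL="
        lr_end_transaction(tx_name, rc == LR_PASS ? LR_PASS : LR_FAIL);
        return rc;
    }

    Action()
    {
        // Later steps depend on earlier ones, so abandon the iteration on failure
        if (run_step("01_Home",  "URL=https://example.com/")      != LR_PASS) return 0;
        if (run_step("02_Login", "URL=https://example.com/login") != LR_PASS) return 0;
        run_step("03_Report", "URL=https://example.com/report");

        return 0;
    }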

Environment: LoadRunner 9.51, Performance Center, SiteScope, Wily, VuGen, Oracle, MS SQL Server, WinRunner, Load Balancer, JAVA, Quality Center, AS/400, Citrix, J2EE Diagnostic Tool, Web, Windows 2000/XP, HP-UX, AIX

Confidential, Atlanta, GA

QA Analyst

Responsibilities:

  • Involved in gathering business requirements, studying the application and collecting information from developers and the business.
  • Installed and configured the LoadRunner environment, including Load Generators and Controllers.
  • Used Performance Center to verify the Application Response Time under different load conditions.
  • Developed test plans, wrote test cases for various new and existing applications and uploaded the requirements into Quality Center.
  • Created Vuser scripts that contain tasks performed by each Vuser, tasks performed by Vusers as a whole, and tasks measured as transactions.
  • Developed Vuser scripts in the Web and Citrix protocols.
  • Designed tests for Benchmark and Stress testing.
  • Created scripts in VuGen for SAP ERP functionality to check the transactional data flow and Load impact.
  • Added performance measurements for Oracle, WebLogic and IIS in LoadRunner TestCenter.
  • Analyzed results using the LoadRunner Analysis tool and analyzed Oracle database connections, sessions and WebLogic log files.
  • Responsible for analyzing application and component behavior under heavier loads and optimizing server configurations.
  • Maintained test matrix and bug database and generated monthly reports.
  • Interacted with developers during testing for identifying memory leaks, fixing bugs and for optimizing server settings at web, app and database levels.
  • Used the LoadRunner tool for testing and monitoring; actively participated in enhancement meetings focused on making the website more intuitive and interesting.

Environment: LoadRunner, Wily, WinRunner, Citrix, QuickTest Pro, LDAP, Oracle, VuGen, MS SQL Server, WebLogic, WebSphere, Load Balancer, IBM Mainframe AS/400, Performance Center, JAVA, TestDirector, J2EE Diagnostic Tool, Ethereal, JMeter, Web, Windows 2000/XP, Solaris, AIX, IE.

Confidential, Bellevue, WA

QA tester

Responsibilities:

  • Defined performance goals and objectives based on client requirements and inputs.
  • Worked extensively with the Web, Citrix, Click and Script, and Oracle protocols in LoadRunner.
  • Working experience with LoadRunner, including VuGen, the Controller and the Analysis tool.
  • Ensured the compatibility of all application platform components, configurations and their upgrade levels in production, and made the necessary changes to the lab environment to match production.
  • Partnered with the software development organization to analyze system components and performance and identify needed changes in the application design.
  • Involved in analyzing, interpreting and summarizing meaningful and relevant results in a complete Performance Test Report.
  • Worked closely with software developers and took an active role in ensuring that software components met the highest quality standards.
  • Wrote and executed SQL queries to validate test results.
  • Installed the Load Generator and Analysis components on the Windows platform and verified connectivity of the Controller with the data center hosting the application.
  • Provided support to the development team in identifying real-world use cases and appropriate workflows.
  • Performed in-depth analysis to isolate points of failure in the application.
  • Assisted in the production of testing and capacity certification reports.
  • Investigated and troubleshot performance problems in a lab environment, including analysis of performance problems in a production environment.
  • Created Test Calendars for writing test plans.
  • Worked closely with clients.
  • Interfaced with developers, project managers, and management in the development, execution and reporting of test performance results.

Environment: LoadRunner, WinRunner, QuickTest Pro, Siebel, LDAP, Oracle, MS SQL Server, WebLogic, WebSphere, Load Balancer, Performance Center, JAVA, Linux, VuGen, TestDirector, J2EE Diagnostic Tool, Ethereal, JMeter, Web, Windows 2000/XP, Solaris, AIX, IE, Netscape, Firefox

Confidential, Portsmouth, NH

QA tester

Responsibilities:

  • Performed load and stress testing using LoadRunner for various web and client-server architectures for Property and Casualty applications.
  • Analyzed the requirement and design documents.
  • Involved in writing Test Plans by incorporating Performance Testing Objectives, Testing Environment, User Profiles, Risks, Test Scenarios, Explanation about the Tools used, Schedules and Analysis, Monitors and Presentation of results.
  • Wrote LoadRunner scripts and enhanced them with C functions; parameterized users, stored dynamic content in LoadRunner functions, and used client-side secure certificates. Wrote text checks, created scenarios for concurrent (rendezvous) and sequential users, and configured run-time settings for HTTP and iterations. Simulated modem speeds to bring the test scenarios closer to real-world conditions. Monitored CPU, memory, ASP requests, network, web connections and throughput while running the various scenarios in LoadRunner.
  • Created Single User, Base Line and Soak test scenarios. Random pacing between iterations was introduced to get the desired transactions per hour.
  • Analyzed results using the LoadRunner Analysis tool and analyzed Oracle database connections, sessions and WebLogic log files.
  • Responsible for analyzing application and component behavior under heavier loads and optimizing server configurations.
  • Worked closely with Production Managers, Technical Managers and Business Managers in planning, scheduling, developing, and executing performance tests.
  • Interacted with developers during testing for identifying memory leaks, fixing bugs and for optimizing server settings at web, app and database levels.
  • Used Load Runner tool for testing and monitoring.

Environment: LoadRunner 7.8, LoadRunner TestCenter 7.8, VTS (Virtual Table Server), WebSphere, Windows 2000 Advanced Server, IBM Mainframe AS/400, Apache, IIS 5, VuGen, Livelink 9.2, BEA WebLogic 8.1 SP1, Servlets, EJB, Solaris 5.8, Oracle Database.
