Performance Test Analyst Resume

Houston, TX

SUMMARY

  • Extensive experience in performance testing and performance engineering, from requirements gathering through results analysis and interpretation.
  • Experience with JMeter and the HP LoadRunner components (Virtual User Generator, Controller, Performance Center, Analysis).
  • Experience with multiple HP LoadRunner protocols (Web HTTP/HTML, Web Services, Ajax TruClient, SAP GUI, and SAP Web).
  • Analyzed test results (TPS, hits per second, transaction response time, CPU utilization, etc.) using HP LoadRunner Analysis and various monitoring tools, and prepared test reports.
  • Identified performance issues using monitoring tools such as AppDynamics and HP SiteScope.
  • Wrote HP LoadRunner scripts, enhanced them with C functions, parameterized input data, and captured dynamic content with LoadRunner correlation functions (a short sketch follows this summary).
  • Experience analyzing performance bottlenecks such as high CPU usage and memory leaks using HP SiteScope.
  • Performance tested .NET and Java applications using HP LoadRunner.
  • Experience in creating Clips and executing Compositions in SOASTA CloudTest.
  • Administered Dynatrace and Splunk for the client team to monitor network activity and daily operations.
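
A minimal VuGen sketch of the scripting style summarized above, combining a recorded Web (HTTP/HTML) step, a web_reg_save_param correlation, and plain C checks. The URL, boundaries, and parameter names are illustrative placeholders, not values from any actual engagement.

    Action()
    {
        // Register capture of a dynamic value (e.g., a CSRF token) returned by
        // the next response; boundaries and parameter names are illustrative.
        web_reg_save_param("CsrfToken",
                           "LB=name=\"csrf\" value=\"",
                           "RB=\"",
                           "NotFound=warning",
                           LAST);

        lr_start_transaction("Open_Login_Page");
        web_url("login_page",
                "URL=http://example.com/login",   // placeholder URL
                "Mode=HTML",
                LAST);
        lr_end_transaction("Open_Login_Page", LR_AUTO);

        // Plain C enhancement: with NotFound=warning, an uncaptured parameter
        // still evaluates to its literal name, which signals a failed capture.
        if (strcmp(lr_eval_string("{CsrfToken}"), "{CsrfToken}") == 0) {
            lr_error_message("CSRF token was not captured");
        } else {
            lr_output_message("Captured token: %s", lr_eval_string("{CsrfToken}"));
        }

        return 0;
    }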

TECHNICAL SKILLS

Testing Tools: HP LoadRunner, BlazeMeter, JMeter

Test Management: HP APM, Bugzilla, JIRA, Slack

PROFESSIONAL EXPERIENCE

Confidential, Houston, TX

Performance Test Analyst

Responsibilities:

  • Responsible for gathering all requirements related to performance testing and load test strategy.
  • Created Ajax TruClient scripts using HP LoadRunner 12.5.
  • Executed Compositions in SOASTA CloudTest.
  • Correlated data in Clips using Session Packet Wizard in SOASTA CloudTest.
  • Created Clips for mobile and desktop applications in SOASTA CloudTest.
  • Created custom Compositions in SOASTA CloudTest to mimic real life scenarios.
  • Modified Ajax TruClient transactions and events in HP LoadRunner 12.5 and Performance Center.
  • Recorded TruClient scripts in the Mozilla Firefox browser and modified event configurations.
  • Modified XPath locators during script recording in the TruClient protocol.
  • Developed dynamic scripts to integrate with backend changes in the TruClient protocol.
  • Measured page and API response times during performance testing of Ajax TruClient protocol scripts.
  • Worked with various protocols such as Web (HTTP/HTML), Web (Click and Script), Ajax (Click and Script), and LR Java for performance testing on HP LoadRunner 12.5 (see the sketch at the end of this list).
  • Ramped up virtual users to a peak load of 300 concurrent users using the Ajax TruClient protocol.
  • Verified application performance after each load test by monitoring HP SiteScope and UNIX server metrics.
  • Set up agents for Java and .NET applications in Dynatrace.
  • Created agent groups in Dynatrace.
  • Created custom sensors in Dynatrace to manually set up the entry point of transactions.
  • Created custom dashboards and dashlets in Dynatrace as requested by application and business teams.
  • Created JVM arguments for the application startup configuration to connect the application with Dynatrace.
  • Resolved Level 1, Level 2, and Level 3 incidents from Production environment using Dynatrace.
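
TruClient scripts themselves are edited in the browser-based TruClient IDE rather than in C, so the sketch below illustrates the equivalent transaction timing in the Web (HTTP/HTML) protocol also used here: page and API calls wrapped in separate transactions so their response times are reported independently. URLs and transaction names are illustrative assumptions.

    Action()
    {
        // Each transaction brackets only the request being measured.
        lr_start_transaction("Home_Page");
        web_url("home",
                "URL=http://example.com/",        // placeholder URL
                "Mode=HTML",
                LAST);
        lr_end_transaction("Home_Page", LR_AUTO);

        // Think time sits outside the transactions so it does not inflate timings.
        lr_think_time(5);

        // A separate transaction isolates the API call so its response time
        // can be tracked independently of full-page timings.
        lr_start_transaction("Search_API");
        web_custom_request("search",
                           "URL=http://example.com/api/search?q=test",  // placeholder
                           "Method=GET",
                           "Resource=0",
                           LAST);
        lr_end_transaction("Search_API", LR_AUTO);

        return 0;
    }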

Environment: HP LoadRunner 12.5, HP Performance Center, HP SiteScope, Web Services, Dynatrace, Hadoop

Confidential, Princeton, NJ

Performance Test Engineer

Responsibilities:

  • Interacted with Business Analyst and application teams to discuss the performance requirements and load test strategy.
  • Developed performance test plans, test scripts, test scenarios based on business requirements
  • Developed Vuser scripts using the Web (HTTP/HTML), Ajax TruClient, Web Services, and Web (Click and Script) protocols.
  • Administered the performance testing tools including VuGen, Analysis, and Cloud based Load Generators in Performance Center.
  • Identified testing methodology for load, stress testing based on the business processes and analyzed the business requirements along with Product Manager.
  • Enhanced VuGen scripts by parameterizing the input test data to minimize data caching, unique constraint, and data dependency issues.
  • Inserted rendezvous points into scripts to instruct Vusers to perform a specific task simultaneously.
  • Inserted and configured manual and automatic correlation to handle dynamic data in scripts, including unique session values (a combined sketch follows this list).
  • Created manual and automated scenarios using the LoadRunner Controller, set up runtime settings, configured Load Generators, and assigned the number of virtual users.
  • Configured ramp-up and ramp-down and calculated the proper duration of the proposed load tests.
  • Extensively used HP LoadRunner 12.x for Developing Vuser Scripts.
  • Created customized LoadRunner VuGen scripts at API level with manual correlation, user defined functions, development libraries (classes and methods), and error handling.
  • Enhanced Vuser scripts by adding correlations, parameters, and checking/validation functions.
  • Executed Performance tests using Performance Center.
  • Monitored graphs such as transaction response time and analyzed server performance status, hits per second, throughput, Windows resources, and database server resources.
  • Worked closely with the development team and provided assistance in performance tuning.
  • Analyzed load patterns and created test scenarios to emulate real-life stress conditions.
  • Analyzed performance transactions and server resource monitors for meaningful results across the entire test run using HP LoadRunner Analysis.
  • Conducted meetings with developers, the application team, and the business team to analyze defects and evaluate test executions.
  • Involved in the decision making with the management for final application release.
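
A hedged sketch of how the rendezvous point and manual correlation mentioned above typically look in a VuGen script. The boundaries, URLs, and parameter names are assumptions made for illustration, and {ItemCode} is a hypothetical parameter fed from a .dat file.

    Action()
    {
        // Manual correlation: register capture of the dynamic session value
        // returned by the next response (boundaries are illustrative).
        web_reg_save_param_ex("ParamName=SessionId",
                              "LB=name=\"sessionId\" value=\"",
                              "RB=\"",
                              SEARCH_FILTERS,
                              "Scope=Body",
                              LAST);

        lr_start_transaction("Open_Order_Page");
        web_url("order_page",
                "URL=http://example.com/order/new",   // placeholder URL
                "Mode=HTML",
                LAST);
        lr_end_transaction("Open_Order_Page", LR_AUTO);

        // Rendezvous point: Vusers wait here until released by the Controller,
        // so the submit transaction hits the server simultaneously.
        lr_rendezvous("submit_order");

        lr_start_transaction("Submit_Order");
        web_submit_data("submit",
                        "Action=http://example.com/order/submit",  // placeholder
                        "Method=POST",
                        ITEMDATA,
                        "Name=sessionId", "Value={SessionId}", ENDITEM,
                        "Name=item",      "Value={ItemCode}",  ENDITEM,  // .dat parameter
                        LAST);
        lr_end_transaction("Submit_Order", LR_AUTO);

        return 0;
    }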

Environment: HP LoadRunner 12.x, HP Performance Center, .Net, SAP, MySQL, Java, SiteScope, SQL Server, and Oracle

Confidential, Atlanta, GA

Performance Test Engineer

Responsibilities:

  • Prepared test plans and developed test cases for all performance testing.
  • Involved in manual and automation testing of web and client-server applications.
  • Recorded multiple VuGen scripts and performed parameterization and correlation.
  • Analyzed business requirements, functional specification, and the required documents for testing.
  • Allocated priorities to all the test cases, taking into consideration the product module priorities.
  • Used JIRA for requirements management, planning, scheduling, test execution, defect tracking, and defect management.
  • Interacted with Business Analyst and application teams to discuss the performance requirements and load test strategy.
  • Developed Vuser scripts using the Web (HTTP/HTML), Oracle NCA, and Oracle Web Applications 11i protocols, with parameterization, correlation, and added ANSI C and Oracle NCA functions.
  • Administered the automated performance testing tools including VuGen, Analysis, Controller and Load Generator.
  • Identified testing methodology for load, stress testing based on the business processes and analyzed the business requirements along with Product Manager.
  • Enhanced VuGen scripts by parameterizing the input test data to minimize data caching, unique constraint, and data dependency issues.
  • Inserted rendezvous points into scripts to instruct Vusers to perform a specific task simultaneously.
  • Inserted and configured manual correlation to handle dynamic data in scripts, including unique session values.
  • Created manual and automated scenarios in Performance Center, set up runtime settings, configured Load Generators, and assigned the number of virtual users.
  • Configured Ramp Up, Ramp Down, and calculated proper duration of the proposed Load test.
  • Extensively used HP LoadRunner for Developing VUser Scripts.
  • Created customized LoadRunner VuGen scripts at the API level with manual correlation, user-defined functions, development libraries (classes and methods), and error handling (see the sketch after this list).
  • Enhanced VUser scripts by adding correlations, parameters, and checking/validation functions.
  • Executed Performance tests using Performance Center.
  • Monitored graphs such as transaction response time and analyzed server performance status, hits per second, throughput, Windows resources, and database server resources.
  • Used Performance Center to execute tests and maintain scripts.
  • Analyzed load patterns and created test scenarios to emulate real-life stress conditions.
  • Conducted meetings with developers, the application team, and the business team to analyze defects and evaluate test executions.
  • Involved in the decision making with the management for final application release.
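
A small sketch of the user-defined functions and error handling referred to above: a reusable content check registered before a request, with the transaction failed and the iteration skipped if the expected text is missing. The function names, transaction names, and URL are illustrative assumptions.

    // Registers a content check for the next request and opens a transaction.
    void start_checked_transaction(char *txn, char *expected_text)
    {
        char text_arg[256];

        sprintf(text_arg, "Text=%s", expected_text);
        web_reg_find(text_arg, "SaveCount=check_count", LAST);
        lr_start_transaction(txn);
    }

    // Closes the transaction with pass/fail based on the registered check.
    void end_checked_transaction(char *txn)
    {
        if (atoi(lr_eval_string("{check_count}")) == 0) {
            lr_error_message("Expected content not found in %s", txn);
            lr_end_transaction(txn, LR_FAIL);
            // Skip the rest of this iteration rather than aborting the Vuser.
            lr_exit(LR_EXIT_ITERATION_AND_CONTINUE, LR_FAIL);
        } else {
            lr_end_transaction(txn, LR_PASS);
        }
    }

    Action()
    {
        start_checked_transaction("Home_Page", "Welcome");
        web_url("home", "URL=http://example.com/", "Mode=HTML", LAST);   // placeholder URL
        end_checked_transaction("Home_Page");
        return 0;
    }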

Environment: HP Quality Center, HP LoadRunner, JIRA, HP Performance Center, .Net, HP SiteScope, SQL Server, Oracle.

Confidential, Minneapolis, MN

Performance Test Analyst

Responsibilities:

  • Worked closely with Business Analysts and Developers to gather Application Requirements and Business Processes in order to formulate the test plan.
  • Developed scripts using LoadRunner by recording/playback as well as by writing custom functions.
  • Created a detailed System Test Plan and procedures.
  • Performed load testing, stress testing, endurance testing, and performance testing on JAVA and .NET applications.
  • Assisted with the implementation and execution of all aspects of the testing activities including planning, creation and execution of test cases, test scripts, test reports.
  • Created custom testing scenarios in the Controller with rendezvous points and custom ramp-up and ramp-down times.
  • Reported mismatches between expected and actual application behavior to the development team using JIRA.
  • Developed VUser scripts using Web (HTTP/HTML) protocols.
  • Developed test cases and virtual user scripts using HP LoadRunner for load and performance testing of the application.
  • Parameterized input test data in order to reduce data caching and create real world heavy load on application server and database server.
  • Enhanced VuGen scripts by implementing manual correlation to handle dynamic values including session IDs, dates, and timestamps (see the sketch after this list).
  • Created load test scenarios, added the required number of Vusers, and scheduled the scenarios for execution.
  • Designed performance test scenarios using LoadRunner, conducted stress tests, and analyzed the results.
  • Extensively utilized HP Performance Center to create and administer Load Test, Stress Test and Volume Test.
  • Created standard reports and related graphs and provided them to the project management and development teams.
  • Identified and documented functional requirements and mapped them to individual test case requirements.
  • Developed daily status reports for the Development, Configuration, DBA, and Network teams.
  • Analyzed various graphs including Database Resource graphs, Network Monitor graphs, User Graphs, Error Graphs, Transaction graphs and Web Server Resource Graphs.
  • Provided comprehensive test summary reports to extended teams after analyzing results.
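
A brief sketch of handling the date and timestamp values mentioned in the correlation bullet above, generated at run time with lr_save_datetime instead of replaying hard-coded recorded values. The form fields, URL, and parameter names are illustrative assumptions.

    Action()
    {
        // Build current and next-day date strings, plus a time-of-day string,
        // into LoadRunner parameters for use in later requests.
        lr_save_datetime("%Y-%m-%d", DATE_NOW, "TodayDate");
        lr_save_datetime("%Y-%m-%d", DATE_NOW + ONE_DAY, "TomorrowDate");
        lr_save_datetime("%H:%M:%S", TIME_NOW, "CurrentTime");

        lr_start_transaction("Create_Booking");
        web_submit_data("booking",
                        "Action=http://example.com/booking",   // placeholder URL
                        "Method=POST",
                        ITEMDATA,
                        "Name=startDate",   "Value={TodayDate}",               ENDITEM,
                        "Name=endDate",     "Value={TomorrowDate}",            ENDITEM,
                        "Name=requestedAt", "Value={TodayDate} {CurrentTime}", ENDITEM,
                        LAST);
        lr_end_transaction("Create_Booking", LR_AUTO);

        return 0;
    }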

Environment: Windows Server, Oracle, HP LoadRunner, HP SiteScope, HP Performance Center, JIRA

Confidential, Germantown, MD

QA Engineer

Responsibilities:

  • Analyzed business requirements and developed Test cases based on Use Cases to evaluate the functionalities.
  • Responsible for implementing effective quality assurance processes and practices.
  • Managed weekly meetings with the QA team to obtain status, gather innovative ideas for continuous process improvements and delegate workload.
  • Wrote complex SQL queries using SQL*Plus to validate backend functionality of the application.
  • Strong knowledge of descriptive programming using VBScript.
  • Generated automated scripts using HP UFT and enhanced scripts using various parameterizations.
  • Parameterized test scripts to run with multiple sets of test data.
  • Performed verification using text and database checkpoints and synchronization points of the application in UFT.
  • Created and Implemented Centralized Shared Object Repositories to reduce script maintenance time.
  • Created function libraries for common functions for better code reusability.
  • Created Data Driven Framework to reduce automated scripts maintenance time.
  • Documented weekly automation status reports covering the application under test, the corresponding test cases that had been automated, script information, verification point information, corresponding defect information, and the expected status for the following week.
  • Reviewed product requirements and functional and design specifications to determine and prepare automated test scripts.
  • Performed web services and API testing using SoapUI against SOAP and REST web services.
  • Prepared Reusable functions, which improve the robustness, re-usability, and maintainability of the test scripts & Frameworks.
  • Communicated application testing status with application developers, the project manager, the Scrum Master, other team members, and various stakeholders.
