Sr Performance Test Engineer Resume

Detroit, MI

SUMMARY:

  • 11 years of extensive experience in performance testing of client/server and web-based applications.
  • Proficient in Planning, Developing, Scripting, Executing, and Analyzing Performance Tests.
  • Extensive experience in automated and manual testing, including performance, load, stress, concurrency, endurance, functional, and regression testing of client/server and web-based applications using HP testing tools such as LoadRunner and Quality Center.
  • Extensive knowledge of different protocols: Web (HTTP/HTML), Siebel, Oracle NCA, Oracle Web Applications 11i, Ajax Click & Script, AJAX TruClient, Web (Click & Script), and Web Services.
  • Extensive experience creating performance test scripts using LoadRunner VuGen.
  • Experienced in script enhancement using ANSI C functions, protocol specific functions, parameterization, and correlation.
  • Well versed in configuring runtime settings in VuGen or the Controller to define how scripts run.
  • Good understanding of applications built on Java frameworks. Worked on Apache Tomcat, JBoss, and WebSphere application-server tuning for performance.
  • Experienced in monitoring CPU, Memory, ASP Requests, Network, Web connections and throughput while running Baseline, Performance, Load, Stress and Soak testing.
  • Expertise in tracking defects using tracking tools such as HP Quality Center/ALM (Application Lifecycle Management), and Clear Quest.
  • Experience in performance monitoring tools like BAC, NMon, PerfMon, SiteScope, Wily Introscope, AppDynamics, Dynatrace, SPLUNK, HP Performance Manager and other tools.
  • Expertise in performance testing using LoadRunner with Detailed Analysis of Reports and Graphs.
  • Experience working with Putty to Monitor the Application Server, Core Tier, Database Server Logs.
  • Have experience working on monitoring tools like HP Diagnostics, HP BSM, SiteScope, Wily Introscope, Dynatrace, and Splunk.
  • Expertise in analyzing, interpreting and summarizing meaningful and relevant results in a complete Performance Test Report.
  • Hands-on experience with network analysis tools such as HttpWatch and Fiddler for tuning website performance by measuring download times, caching behavior, and the number of network round trips. Measured website performance in manual and automated test environments by capturing metrics such as JavaScript execution time, rendering time, CPU utilization, asynchronous requests, and network requests.
  • Good experience with J-Profiler, ANTS Profiler, HTTP Watch, Fiddler.
  • Excellent ability to understand complex scenarios and business problems and transfer the knowledge to other users/developers in the most comprehensible manner.
  • Quick learner with respect to the latest technologies, best practices, and systems.
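The response-time analysis mentioned above (average and 90th-percentile transaction times, as reported in LoadRunner Analysis graphs) can be sketched in plain Java. This is a minimal illustration of the statistics involved; the class and method names are assumptions, not LoadRunner's API.

```java
import java.util.Arrays;

// Minimal sketch: computing the average and 90th-percentile response
// times that performance-test analysis reports, from raw transaction
// timings. Names here are illustrative only.
public class ResponseTimeStats {

    // Average response time in seconds.
    public static double average(double[] timings) {
        double sum = 0;
        for (double t : timings) sum += t;
        return sum / timings.length;
    }

    // Nearest-rank percentile: sort the timings, then take the value
    // at position ceil(p/100 * n).
    public static double percentile(double[] timings, int p) {
        double[] sorted = timings.clone();
        Arrays.sort(sorted);
        int rank = (int) Math.ceil(p / 100.0 * sorted.length);
        return sorted[Math.max(rank - 1, 0)];
    }

    public static void main(String[] args) {
        double[] timings = {0.8, 1.2, 0.9, 3.4, 1.1, 0.7, 2.0, 1.5, 0.9, 1.0};
        System.out.printf("avg=%.2fs p90=%.2fs%n",
                average(timings), percentile(timings, 90));
    }
}
```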

PROFESSIONAL EXPERIENCE:

Sr Performance Test Engineer

Confidential - Detroit, MI

Responsibilities:

  • Involved in decision-making and planning process with application Architect, BSAs and Project Manager to define performance-testing tools, testing approaches and capacity planning as per product owner expectations. Analyzed the Performance requirement and design documents.
  • Wrote LoadRunner scripts, enhanced them with C functions, parameterized users, stored dynamic content using LoadRunner functions, and used client-side secure certificates.
  • Executed native mobile application performance testing using the TruClient Native Mobile protocol.
  • Develop test scenarios to properly load / stress the system in a lab environment and monitor / debug performance & stability problems.
  • Creating Test Plans by incorporating Performance Testing Objectives, Testing Environment, User Profiles, Risks, Test Scenarios, Explanation about the Tools used, Schedules and Analysis, Monitors and Presentation of results.
  • Worked extensively with the Web/HTTP, SOAP/REST API Web Service, TruClient Web, and TruClient Mobile protocols in LoadRunner, and used the Virtual User Generator (VuGen) to create scripts, ensuring that quality issues were appropriately identified, analyzed, documented, tracked, and resolved.
  • Experience working on AWS cloud integrated with JMeter for load testing web applications.
  • Hands on Experience with JMeter for developing test scripts with Http protocol.
  • Experience creating Jenkins pipelines for JMeter integration and writing shell scripts to execute JMeter tests.
  • Hands on experience using BlazeMeter for developing performance test scripts.
  • Worked with the SDLC team to troubleshoot the root cause of DB- and application-server issues using Dynatrace.
  • Collected performance monitoring statistics and coordinated with technical architects and business analysts to analyze performance bottlenecks and provide recommendations to improve application performance.
  • Used DynaTrace to measure web site performance in test environment to capture performance metrics of key product features.
  • Experience with Splunk log analysis; wrote regex functions to debug server log files.
  • Used JIRA for Project Planning, Sprint Planning and defect tracking for different Projects.
  • Installed and fine-tuned Dynatrace AppMon and DC RUM servers.
  • Responsible for analyzing application and components behavior with heavier loads and optimizing server configurations.
  • Experience with Application server tuning and database tuning for performance testing.
  • Hands on experience with SQL Server for creating performance test data.
  • Experience with SQL Server creating indexes on tables and debugging performance bottlenecks.
  • Used Wily Introscope for performance test monitoring to take measurement of % CPU usage, JVM Heap memory Usage, Average response times, and database monitoring.
  • Experience working with JProfiler to debug performance bottlenecks, including method-level exceptions and database queries.
  • Debugged into performance bottlenecks using Splunk Log Analysis.
  • Followed Agile & Scrum Methodology in this project.
  • Experience working with Postman and SoapUI to validate service-level components.
  • Responsible for analyzing, interpreting and summarizing meaningful and relevant results in a complete Performance Test Report and present to Project Manager and Business executives.
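The Splunk regex work above boils down to pattern extraction over server log lines. A minimal Java sketch of the same idea follows; the log format and field names (`uri=`, `status=`, `time=…ms`) are assumptions for illustration, not from any specific application.

```java
import java.util.regex.Matcher;
import java.util.regex.Pattern;

// Minimal sketch of the kind of regex used (e.g. in Splunk's rex command)
// to pull response times out of application-server log lines.
public class LogRegexDemo {

    // Matches e.g. "... uri=/checkout status=200 time=843ms"
    private static final Pattern RESPONSE_TIME =
            Pattern.compile("uri=(\\S+)\\s+status=(\\d{3})\\s+time=(\\d+)ms");

    // Returns the response time in milliseconds, or -1 if the line
    // does not match the expected format.
    public static long extractMillis(String logLine) {
        Matcher m = RESPONSE_TIME.matcher(logLine);
        if (m.find()) {
            return Long.parseLong(m.group(3));
        }
        return -1;
    }

    public static void main(String[] args) {
        String line = "2023-04-01 10:15:32 INFO uri=/checkout status=200 time=843ms";
        System.out.println(extractMillis(line)); // prints 843
    }
}
```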

Environment: HP LoadRunner 12.55, Mobile Center, HP Performance Center 12.55, DynaTrace 6.3, HTTP/HTML Protocol, CloudWatch monitoring tool, TruClient Web Protocol, TruClient Mobile and Native Mobile Protocol, Splunk, Web services, MS Office 2013, MS SQL Server 2015, Postman, SoapUI, Fiddler, JProfiler

Performance Test Lead

Confidential - Augusta, Maine

Responsibilities:

  • Coordinated with internal business departments on their information system's needs.
  • Gathered and analyzed business requirements and procedures.
  • Responsible for developing the performance test strategies, plans and cases.
  • Developed Vuser scripts that contain tasks performed by each Vuser, tasks performed by Vuser's as a whole, and tasks measured as transactions.
  • Responsible for parameterizing large and complex test data to accurately depict production trends.
  • Coordinated creation of stress environments to conduct stress/load testing.
  • Monitored the performance of the application and database servers during test runs using tools such as Dynatrace and SiteScope.
  • Developed test scripts in JMeter using the HTTP protocol and Web Services.
  • Utilized BeanShell processors and regex functions in JMeter to customize scripts.
  • Monitored Different kinds of Graphs including Throughput, Hits/Sec, Transaction Response time, Windows Resources (Memory Utilization, CPU Utilization, Threads, etc) while executing the scripts from loadRunner, Performance Center.
  • Prepared and executed test scripts using JMeter and SoapUI to perform Web Services testing, and ran load tests in BlazeMeter.
  • Conducting WSDL review meetings to understand the requirement of each Web Service.
  • Execute each Web Service manually by testing each operation in the WSDL.
  • Performance tested SOA based application using Web Services Protocol.
  • Hands on Experience with Application server tuning with server types like Apache Tomcat and JBoss Application Server.
  • Experience working on Splunk Log analysis for debugging performance issues.
  • Written Regex functions with Splunk to debug Application Server and DB Server log files.
  • Extensively used Load Runner to conduct performance testing of the application.
  • Prepared Load Runner scenarios for Load and Performance testing using different host systems.
  • Developed Load Runner Vugen Scripts using Correlation to parameterize dynamic values.
  • Experience working on Oracle Database for creating performance test data and resolving performance bottlenecks.
  • Analyzed memory leak or high usage of memory of java application using heap dump.
  • Used Visual VM to generate and analyze thread dumps and heap dumps, track down memory leaks, and do lightweight memory and CPU profiling.
  • Correlated and parameterized scripts as well as configured the Run Time settings in Virtual User Generator.
  • Developed Scenarios with different schedules like Ramp-up, Duration.
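The heap-dump analysis described above typically surfaces one pattern again and again: a static, unbounded collection dominating the retained set. A minimal sketch of that leak pattern, with illustrative names (not from any real application), is:

```java
import java.util.ArrayList;
import java.util.List;

// Minimal sketch of the classic leak pattern that shows up in a heap
// dump as one ever-growing collection dominating the retained set.
public class LeakDemo {

    // Anti-pattern: a static, unbounded cache is reachable for the
    // JVM's lifetime, so every payload added here is never collected.
    static final List<byte[]> REQUEST_CACHE = new ArrayList<>();

    static void handleRequest(byte[] payload) {
        REQUEST_CACHE.add(payload); // entries are added but never evicted
    }

    public static void main(String[] args) {
        for (int i = 0; i < 10_000; i++) {
            handleRequest(new byte[1024]); // ~10 MB retained after the loop
        }
        // In a VisualVM heap dump this surfaces as an ArrayList holding
        // 10,000 byte[] instances, rooted at LeakDemo.REQUEST_CACHE.
        System.out.println("retained entries: " + REQUEST_CACHE.size());
    }
}
```

Tracking the dominator tree back to such a GC root is what turns "high usage of memory" in the graphs into an actionable defect.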

Environment: LoadRunner, JMeter, MQ Series, Web Services, BlazeMeter, Dynatrace, Splunk, SiteScope, VuGen scripts, SAP ECC, BI, BW, SAP GUI, SAP Web, Oracle.

Performance Test Lead

Confidential - Hartford, CT

Responsibilities:

  • Assisted the project team in identifying and documenting performance test requirements.
  • Worked with business and technology leads to identify the appropriate data for testing, and prepare that data for the test cases
  • Designed and developed performance testing scripts, functions, scenarios, processes for simple to complex testing situations using HP LoadRunner.
  • Responsible for recording scripts for different business scenario and involved in script enhancement with heavy correlations and parameterization.
  • Responsible for load testing on web application using JMeter.
  • Worked on different protocols; Web (HTTP/HTML), Ajax Click & Script, AJAX TruClient, Web (Click & Script), Web services.
  • Involved in test data preparation for the Parameterized values in the scripts for multiple scenarios
  • Designed the performance test scenarios for smoke test, baseline test, scalability test and stress test.
  • Used Scheduler to schedule the scenarios for User's Ramp up/Ramp down in LoadRunner Controller and assigned Vusers group to different Load Generators
  • Observed the entire load test run for any failures/errors and monitored metrics such as Transaction Response Times, Running Virtual Users, Hits Per Second and Windows Resources graph
  • Used AppDynamics to measure web site performance in test environment to capture performance metrics of key product features.
  • Configured Client Side and Server Metrics in AppDynamics to Monitor Application Server and Database Metrics.
  • Performed problem solving analysis and root cause for system functionality and testing challenges using LoadRunner Analysis Tool
  • Opened defect in Quality Center with necessary information and assigned it to development team
  • Worked closely with development team to resolve the defect and ensure that it is resolved and closed accordingly
  • Involved in Performance testing of online batch jobs, creating necessary files for exports and imports batch jobs
  • Created reports for online batch jobs by capturing processing time from batch server logs and ensured that the records are processed within SLA
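The batch-job SLA reporting above amounts to deriving each job's processing time from the start and end timestamps in the batch server logs and comparing it against the SLA. A minimal Java sketch follows; the timestamp format and the 900-second SLA value are assumptions for illustration.

```java
import java.time.Duration;
import java.time.LocalDateTime;
import java.time.format.DateTimeFormatter;

// Minimal sketch: derive a batch job's processing time from log
// timestamps, then check it against an agreed SLA.
public class BatchSlaReport {

    private static final DateTimeFormatter FMT =
            DateTimeFormatter.ofPattern("yyyy-MM-dd HH:mm:ss");

    public static long processingSeconds(String start, String end) {
        return Duration.between(
                LocalDateTime.parse(start, FMT),
                LocalDateTime.parse(end, FMT)).getSeconds();
    }

    public static boolean withinSla(long seconds, long slaSeconds) {
        return seconds <= slaSeconds;
    }

    public static void main(String[] args) {
        long secs = processingSeconds("2023-04-01 02:00:00", "2023-04-01 02:12:30");
        // Assumed SLA of 900 s (15 minutes) per batch run.
        System.out.println(secs + "s within 900s SLA: " + withinSla(secs, 900));
    }
}
```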

Environment: HP LoadRunner 9.52/11, JMeter, AppDynamics, Windows Server 2003, WinSCP, Wily Introscope, Oracle 10g/11g, Toad for Oracle, Oracle SQL Developer, Quality Center 10, Web services, AIX.

Performance Tester

Confidential - Bentonville, AR

Responsibilities:

  • Studied the URS document and created the Functional Requirement Specification document.
  • Worked according to the activities laid down in each phase of Software development life cycle and Coordinated the environment setup for Testing.
  • Meet with client groups to determine performance requirements and goals and determine test strategies based on requirements and architecture.
  • Responsible for automating test cases into test scripts using WinRunner 8.2, QTP, and LoadRunner.
  • Responsible for providing Performance Requirements guidance, Performance Testing, Performance Monitoring, and Workload Modeling.
  • Identified and classified Manual and Automated test cases by isolating the repetitive actions.
  • Developed detailed Manual Test cases and Scenarios. Studied Requirements and designed manual test cases accordingly.
  • Identifying the functional test cases for Regression Testing and automated these Test Scripts using QTP.
  • Installed the Citrix client to talk with the Citrix server and record the traffic going back and forth.
  • Create scripts to enable the Controller to measure the performance of Web server under various load conditions.
  • Monitored Daily and Scheduled reports generated by UAT analysts and System as a Vuser.
  • Analyzed load and generation reports from scheduled runs against the online reports.
  • Created Database Vuser scripts to simulate client activities and performed Load, Stress and Performance test using LoadRunner/Performance Center
  • Generated and Created VuGen scripts using Vuser Generator and Created Scenarios in LoadRunner Controller.
  • Used LoadRunner Analysis to create graphs and reports from the load test results to correlate system information and identify both bottlenecks and performance issues
  • Analyzed LoadRunner/Performance Center test results. Involved in preparing the test plan and test cases based on analysis of the business requirements.
  • Inserted rendezvous points in order to simulate heavy loads for conducting Load Testing.
  • Used ramp-up and ramp-down to simulate real-time scenarios.
  • Identified functionality and performance issues, including deadlock conditions, database connectivity problems, and system crashes under load.
  • Provided management and the vendor with analysis reports and recommendations, which resulted in tuning of the application; vertical scaling and garbage-collection tuning were performed. Communicated with the vendor to resolve issues.
  • Confirmed the scalability of the new servers and application under test after the architecture redesign.
  • Conducted weekly meetings with Project Head, Business and development teams.
  • Executed the scenarios, analyzed the graphs, and coordinated with the DBAs and network admins to ensure optimum performance.
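A rendezvous point, as used above to simulate heavy loads, holds all virtual users at a barrier and releases them simultaneously so the server takes a truly concurrent spike. A minimal sketch of the same mechanism in plain Java threads (names are illustrative, not LoadRunner's API):

```java
import java.util.concurrent.CountDownLatch;
import java.util.concurrent.atomic.AtomicInteger;

// Minimal sketch of a rendezvous point: every "vuser" thread waits at
// the barrier until all have arrived, then all fire at once.
public class RendezvousDemo {

    public static int runConcurrentBurst(int vusers) {
        CountDownLatch rendezvous = new CountDownLatch(vusers);
        CountDownLatch done = new CountDownLatch(vusers);
        AtomicInteger requestsSent = new AtomicInteger();

        for (int i = 0; i < vusers; i++) {
            new Thread(() -> {
                rendezvous.countDown();
                try {
                    rendezvous.await();             // hold until every vuser arrives
                    requestsSent.incrementAndGet(); // stand-in for the real request
                } catch (InterruptedException e) {
                    Thread.currentThread().interrupt();
                } finally {
                    done.countDown();
                }
            }).start();
        }
        try {
            done.await(); // wait for the whole burst to finish
        } catch (InterruptedException e) {
            Thread.currentThread().interrupt();
        }
        return requestsSent.get();
    }

    public static void main(String[] args) {
        System.out.println("burst size: " + runConcurrentBurst(50)); // prints "burst size: 50"
    }
}
```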

Environment: Windows NT, Citrix, QA Load, WinRunner, LoadRunner 11.0/11.50/12.0, Quality Center, Performance Center 11.0/11.50/12.0, Oracle DB, QTP, MS Office, MS Access, MS Visio, MS Project.