
Performance Tester Resume


New York

SUMMARY

  • Over 10 years of experience as an end-to-end (E2E) enterprise application performance engineer, spanning system requirements, analysis, design, coding, test effort estimation, risk forecasting, and testing of web, web services, backend, and GUI-based applications in the financial, payment, e-commerce, and banking sectors.
  • Good experience using the APM tool Dynatrace to monitor business transactions across all tiers (web/app/DB) of an application.
  • Developed scenario-based tests with JMeter scripts.
  • Created, scheduled, and ran scenarios using JMeter and generated the necessary graphs.
  • Worked extensively in JMeter to create Thread Groups and test web applications under various loads across key business scenarios.
  • Created and executed JMeter scripts for performance testing of portals.
  • Expertise in performance testing using Apache JMeter, HP LoadRunner (VuGen, Controller, Analysis), ALM (QC), HP Performance Center, and HP SiteScope.
  • Expertise in creating JMeter test scripts for web-based applications, using various plugins and samplers for API, database, and SSH calls.
  • Expert in creating system architecture diagrams and test scope documents (POC/RFP); planning, designing, coding, executing, and monitoring tests; analyzing results; identifying potential bottlenecks; and recommending concrete steps customers can take to improve application performance.
  • Experience with LoadRunner components: VuGen, Controller, Analysis, Load Generator and with the components of JMeter.
  • Experience with distributed testing in JMeter in GUI and non-GUI mode across master and slave machines.
  • Experience testing web services applications and APIs using SoapUI and similar tools such as JMeter.
  • Outstanding ability to write advanced LoadRunner scripts for Web (HTTP/HTML), Siebel, Web Services, Mobile, and Socket protocols using parameterization, correlation, randomization, rendezvous points, custom requests, checkpoints, if-else logic, and other LR and C functions such as atoi, itoa, and strcmp.
  • Experienced in consulting with architects, developers, operations, project managers, and product owners to proactively analyze system performance and create solution plans.
  • Comprehensive understanding of the Software Development Life Cycle (SDLC), change/release management, and staging environment management.
  • Experience working within agile development processes (Scrum, XP, Kanban, etc.).
  • Excel at identifying performance thresholds that require environment scaling, recommending the hardware resources needed to meet increased scalability requirements, and creating detailed test reports for business and stakeholder presentations.
  • Used monitoring tools such as Dynatrace, Windows Performance Monitor, nmon, vmstat, iostat, HP Diagnostics, SolarWinds, Datadog, and Application Insights.
  • Excellent knowledge and skills in monitoring CPU, memory, network, web connections, throughput, transaction response times, web/app server metrics (Windows/Linux/AIX), database metrics, and J2EE performance while running baseline, performance, load, stress, and soak tests.
  • Exceptional understanding of functional testing: functional test scripts, system testing, integration testing, end-to-end testing, regression testing, and User Acceptance Testing (UAT), using tools such as ALM (QC), Rally, and Jira to map test cases to defects.
  • Key player in Performance Testing/Engineering and solution development.

TECHNICAL SKILLS

Tools: SoapUI, JMeter, Dynatrace, Splunk, LoadRunner, HP ALM, Quality Center, Jira, SharePoint, Blueprint, TFS.

Databases: ORACLE, MS SQL Server.

Languages: SQL, PL/SQL, C, C++, Visual Basic 6.0.

Web Technologies: VB Script, HTML, XML

Operating Systems: UNIX, Linux, MS-DOS, Windows 2000/XP/7

MS Office Professional: Word, Excel, PowerPoint, Access, and MS Visio.

PROFESSIONAL EXPERIENCE

Confidential, New York

Performance Tester

Responsibilities:

  • Participated in requirement gathering from business analysts and technical architects.
  • Identify key business scenarios from application specialists and business analysts.
  • Gathered test data, application response time metrics i.e., SLA and critical business transactions for developing realistic test scenarios and load models.
  • Worked closely with the development team and customer to understand the application development components and created a test approach document for NFR proposal.
  • Planned and designed performance tests, identify the test environment and the performance acceptance criteria.
  • Recorded scripts using different protocols (Web HTTP/HTML, web services).
  • Excluded unwanted requests from scripts by filtering traffic before recording and by regenerating the script from the recording summary after recording.
  • Verified that tests were recorded correctly using text checks and image checks.
  • Enhanced and developed scripts in LoadRunner VuGen using LR functions.
  • Performed parameterization and correlation on static and dynamic values in the scripts.
  • Worked on WSDL files, JSON/XML requests for Web services.
  • Executed load, scalability, and failover tests in the LoadRunner Controller/ALM.
  • Ran endurance tests for multiple applications.
  • Executed multiple performance tests to validate the application under test and assess the metrics used in capacity planning, aided by LR Analysis graphs.
  • Analyzed the graphs generated by LoadRunner Analysis and communicated bottlenecks to system administrators and developers.
  • Identified component bottlenecks while executing application tests.
  • Used monitoring tools like Dynatrace, HP Sitescope, and Perfmon to detect, isolate and resolve issues proactively.
  • Assisted Application Developers and technical support staff in identifying and resolving defects.

Environment: HP LoadRunner 11.x/12.x, JMeter 5.0, Web HTTP/HTML, Perfmon, Dynatrace, C, Java.

Confidential, Jersey City, NJ

Performance Tester

Responsibilities:

  • Provided support for performance testing using JMeter; tasks included developing test plans, test scripts, and reports.
  • Worked in an Agile development environment, participating in weekly Scrum meetings for application development.
  • Handled multiple project deliverables simultaneously; drove NFR meetings with business and project stakeholders; assessed application architecture; provided performance test scope and solutions; and estimated project effort.
  • Defined the performance test strategy and created/enhanced performance test scripts with HP/Micro Focus VuGen 12.53.
  • Created and executed peak load, stress, and endurance tests using Performance Center / LoadRunner 12.53 for web UI applications, REST APIs, SOAP requests, and MS Azure cloud applications.
  • Diagnosed performance bottlenecks using Dynatrace and monitored application health with Grafana, Application Insights, and HP OV monitoring tools.
  • Analyzed and reviewed test results with project stakeholders; worked closely with developers and the middleware/DevOps team to reproduce and fix performance defects.
  • Managed and drove the technical effort of multiple onshore and offshore resources; participated in business risk assessment, ROI, and other financial analyses.
  • Created API tests (REST/SOAP) using JMeter web service and HTTP samplers.
  • Worked with DBA team for requirement gathering and test data collection for specific bind variables and parameters.
  • Worked with the production monitoring team and other team admins for requirement analysis and POC validation.
  • Created a basic scenario in which each script is based on a unique SQL query and has a different set of users.
  • Ran failover tests to verify DB recovery functionality against the SLA.
  • Monitored CPU and RAM utilization while scaling the injector box to determine its Vuser-handling capacity.
  • Helped offshore team members to understand the Business flow and requirements from the client.
  • Coordinate with other functional team members for the shared environment while executing the test.
  • Created workload profile based on the production peak load analysis.
  • Created several POCs at the client's request to support the team and validate proposed approaches.

Environment: Apache JMeter, Dynatrace, Splunk, SoapUI, IBM MQ, IBM WebSphere (WAS), AIX, DB2, Toad, BMC, MTPuTTY, Cygwin, host servers, Apple Pay, Samsung Pay, device testing, mobile testing, VM

Confidential, Falls Church VA

QA Tester

Responsibilities:

  • Gathered non-functional requirements; designed, developed, and executed a performance measurement plan used as the basis for assessing process capability.
  • Developed HP LoadRunner VuGen scripts, using the Virtual User Generator to emulate load-critical application transactions.
  • Developed a self-service framework that gives developers test access at the click of a button and emails results automatically as soon as a test finishes.
  • Performed root cause analysis of the system and found bottlenecks using the monitoring tools CA Introscope and Dynatrace.
  • Coordinated with the functional QA team and Developers regarding the issues.
  • Performed Automated Load, Stress, Endurance and Peak Hour testing.
  • Monitored resources to identify performance bottlenecks, analyzed test results with the development and database teams, and reported findings to clients using LoadRunner.
  • Involved in performance test planning, setting goals for the release, and participating in project-level status meetings.
  • Tracked and reported the errors discovered using ALM Performance Center.
  • Prepared the gap analysis document by analyzing both the requirement specification and design specification documents.
  • Involved in preparing the high-level test plan and developed test cases in accordance with the functional specifications.
  • Responsible for creating test scenarios with WebLOAD, executing them, and documenting the results and scenarios.
  • Created performance test reports using LoadRunner Analysis.
  • Worked extensively with WebLOAD to analyze application performance under varying load and stress conditions.
  • Executed load tests on web-based and client-based applications.
  • Executed load tests to verify application optimizations using LoadRunner Controller.
  • Created automated LoadRunner test scripts for load, stress, and performance testing of Citrix, HTTP/HTML, and Web Services protocol-based applications.

Environment: Windows, Linux, LoadRunner (VuGen, Controller, and Analysis), JMeter, Performance Center, Application Insights, Dynatrace, Jenkins, SoapUI, WSDL, XML, JSON, Azure, CI/CD

Confidential, Pittsburgh, PA

QA Tester

Responsibilities:

  • Performed data-driven testing using Selenium WebDriver, JUnit functions, and JDBC.
  • Conducted hands-on functional and system integration testing; reported, tracked, and followed up on issues in a timely manner.
  • Assisted with developing the test plan timeline.
  • Consulted with product owners and developers to fully understand intended Features and functionality.
  • Gathered non-functional requirements; designed, developed, and executed a performance measurement plan used as the basis for assessing process capability.
  • Developed HP LoadRunner VuGen scripts, using the Virtual User Generator to emulate load-critical application transactions.
  • Developed a self-service framework that gives developers test access at the click of a button and emails results automatically as soon as a test finishes.
  • Performed root cause analysis of the system and found bottlenecks using the monitoring tools CA Introscope and Dynatrace.
  • Coordinated with the functional QA team and Developers regarding the issues.
  • Performed Automated Load, Stress, Endurance and Peak Hour testing.
  • Monitored resources to identify performance bottlenecks, analyzed test results with the development and database teams, and reported findings to clients using LoadRunner.
  • Involved in performance test planning, setting goals for the release, and participating in project-level status meetings.
  • Assisted and cooperated with co-workers, supervisors, and management.
  • Worked with development teams to create test plans for application enhancements and fixes.
  • Responsible for all aspects of the QA cycle for assigned projects
  • Tracked and reported the errors discovered using ALM Performance Center.
  • Analyzed LoadRunner online graphs and reports to determine where performance delays occurred: network or client delays, CPU performance, I/O delays, database locking, or other issues at the database server.

Environment: Java, JDK, IntelliJ IDEA, POM, SQL Server, Jenkins, Git, HTML, JavaScript, CSS, JSON, XML, Maven, JIRA, Application Insights, Dynatrace, SoapUI, WSDL

Confidential, Dublin, OH

QA Tester

Responsibilities:

  • Worked with the development team to understand technical design and architecture for test planning.
  • Wrote Test Cases and Test Procedures based on the Test Plan.
  • Analyzed project documentation and prepared detailed Test cases.
  • Analyzed, understood, and estimated requirements.
  • Performed Positive and Negative testing.
  • Developed a self-service framework that gives developers test access at the click of a button and emails results automatically as soon as a test finishes.
  • Performed root cause analysis of the system and found bottlenecks using the monitoring tools CA Introscope and Dynatrace.
  • Coordinated with the functional QA team and Developers regarding the issues.
  • Conducted UI, functional, regression, and acceptance testing, verifying actual results against expected results to ensure they complied with the predefined requirements.
  • Generated Defects Report summary from Quality Center/ALM for discussing in defect calls.
  • Used Quality Center/ALM for Manual Scripts execution, Result analysis and Defect reporting.
  • Involved in performance test planning, setting goals for the release, and participating in project-level status meetings.
  • Tracked and reported the errors discovered using ALM Performance Center.
  • Analyzed LoadRunner online graphs and reports to determine where performance delays occurred: network or client delays, CPU performance, I/O delays, database locking, or other issues at the database server.
  • Prepared the gap analysis document by analyzing both the requirement specification and design specification documents.
  • Reported bugs to developers using JIRA.
  • Used SQL queries in Automation tool to perform database testing and data retrieval.
  • Performed Back-end database testing using SQL queries.
  • Created and maintained Automation Framework using Cucumber.
  • Wrote Features with Scenarios using Gherkin language.
  • Involved in Weekly Status Meetings with development and management teams.

Environment: Java, Confluence, HP Quality Center, SQL, Windows NT/2000/XP, JIRA, Bitbucket, Jenkins
