Sr Performance Engineer Resume
New Jersey
SUMMARY
- Over 10 years of Quality Assurance experience with strong expertise in Performance/Load and Stress Testing using HP Performance Center/LoadRunner.
- Experienced with the Mercury test suite (TestDirector/Quality Center, LoadRunner, WinRunner and QuickTest Professional) and Rational tools.
- Extensive experience in automated testing of Web-based and client/server applications, with proficiency in Load and Performance Testing. Good experience with Agile methodology.
- Experience in analysis, design, implementation, execution, maintenance and documentation of system testing.
- Proficient in writing test plans, test cases, test scripts and test result reports.
- Performed Performance Testing, Functional Testing and Regression Testing using automated testing tools including LoadRunner, Performance Center, Quick Test Pro, Quality Center, WinRunner and Test Director.
- Significant experience load testing various applications, including .NET, WebSphere, J2EE, CRM, Business Objects and Citrix implementations.
- Extensive experience using LoadRunner for Performance Testing, Stress Testing, Longevity Testing and Regression Testing.
- Proficient in creating and enhancing scripts, executing tests and analyzing results using LoadRunner, Performance Center and JMeter.
- Experienced in Design and Execution of Test criteria, Scenarios, and Scripts from requirements.
- Participated in project design and review meetings.
- Experienced in planning and translating software business requirements into test conditions, executing all types of tests, and identifying and logging software bugs for business process improvement.
TECHNICAL SKILLS
Automation tools: LoadRunner 11.0/9.5/9.1/8.1/7.8/6.5, JMeter, WinRunner 6/7/8.2x, QuickTest Pro 6.5/8.0, JProbe, Selenium, TestDirector 6.5/7.0/7.6, Requisite Pro, Test Manager, HP Diagnostics Server, HP SiteScope
LoadRunner Protocols: Web Services, Citrix, Oracle NCA, PeopleSoft 8, ODBC, Sybase CTlib, Sybase DBlib, IMAP, SMTP, POP3, Web HTTP/HTML.
Databases: Oracle 11i/8i/8.0/7.0, Teradata V2R3/V2R4/V2R5, Sybase 11.x/12.0, MS SQL Server 6.5/7.0/2000, MS Access 7.0/97/2000, IBM DB2 UDB 7.0, Informix
Languages: SQL, PL/SQL (Stored Procedures, Functions, Triggers, Cursors), Pro*C, C/C++, HTML 4.0, Visual Basic 6.0/5.0.
Web: Java Web Server 1.2, Microsoft Personal Web Server, WebLogic Server 5.x, HTTP.
Operating Systems: Sun Solaris 2.6/2.7, HP-UX, IBM AIX 4.2/4.3, MS-DOS 6.22, Win 3.x/95/98, Win NT 4.0, Win 2000, Windows XP, SCO UNIX, HP 9000
Other: MS Word 2000/XP, Visio 5.0, TestDirector 7.2/7.6.
PROFESSIONAL EXPERIENCE
Confidential, New Jersey
Sr Performance Engineer
Responsibilities:
- Configured the LoadRunner Controller for running tests and verified that the LoadRunner scripts worked as expected on different load generator machines.
- Developed Vuser scripts using LoadRunner Web (HTTP/HTML).
- Performed extensive functional testing of the AI APIs between versions 3.5 and 4.1.
- Used JMeter for API testing as well as for performance testing.
- Developed load test scripts with LoadRunner for the entire site and implemented parameterization, pacing and correlation.
- During API validation, created JMX files to exhaustively test the API name/value parameters against large data sets.
- Responsible for all phases: planning, script development, execution of Performance Center scenarios, and analysis.
- Applied parameterization and correlation to all web service scripts (an illustrative sketch follows this list).
- Conducted basic sanity checks on the APIs before starting performance tests.
- Created and executed API-based test cases for new APIs and any APIs undergoing changes.
- In performance/load/stress testing, focused on the following scenarios: end-user search performance without the AI caching layer, with 50% cached queries and with 100% cached queries, gathering performance metrics such as (a) response time per query, (b) throughput, (c) data size and (d) any query whose response time exceeded the SLA value (timeouts).
- Responsible for preparing the data sets from production servers.
- Gathered performance metrics for the getSuggestion, search, feed statistics and getCategory APIs.
- Tested backward compatibility for specific clients.
- In addition, supported production traffic monitoring during peak traffic and new product launches. Used the Splunk system, which indexes the request access logs generated by the WebLogic servers, to filter logs by API, client and language and to provide statistics to upper management.
- Collected this data, including bot-attack data from production, and used it to drive load tests.
- Analyzed the results with team members and, after refining the data, made recommendations.
- Set up meetings with team members to discuss performance issues.
- Filed bugs based on test results and followed up with the dev team on critical bugs; helped the dev team reproduce hard-to-reproduce bugs in their desired environment, and promptly validated fixed bugs in the IT/UAT environments.
- During performance/load tests, monitored system parameters such as CPU and memory utilization and disk space usage, and filed radars for any excess utilization or memory leaks.
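The following is a minimal LoadRunner VuGen (C) sketch of the parameterized, per-query timing approach described above; the host, endpoint path and {QueryTerm} parameter are hypothetical stand-ins rather than values from this engagement.

```c
/* Minimal VuGen sketch -- hypothetical endpoint and parameters.
   {QueryTerm} is assumed to be a file-based parameter built from the
   production-derived query data set; {Host} is a script parameter. */
Action()
{
    /* Time each query individually so response time per query can be
       compared against the SLA in Analysis. */
    lr_start_transaction("search_query");

    web_url("search",
        "URL=https://{Host}/api/search?q={QueryTerm}&client=web",
        "Resource=0",
        "Mode=HTML",
        LAST);

    lr_end_transaction("search_query", LR_AUTO);

    return 0;
}
```

Pacing, the 50%/100% cached-query mixes and the Vuser ramp-up would be driven from the run-time settings and the Performance Center scenario rather than from the script itself.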
Environment: JMeter, LoadRunner, Performance Center, Diagnostics Tool, Web, Windows 2000/XP, SoapUI, SQL Server, Network Analysis, Quality Center.
Confidential, Washington
Sr Performance Engineer
Responsibilities:
- Involved in preparing high-level scenarios based on Agile methodologies for each sprint.
- Developed the test plan and a traceability matrix mapping requirements to test cases.
- Developed load test scripts with LoadRunner for the entire site and implemented parameterization, pacing and correlation.
- Worked closely with clients and interfaced with developers, project managers and management during development.
- Developed Vuser scripts using LoadRunner Web (HTTP/HTML).
- Tested web services applications using a SOAP client as well as WSDL files.
- Responsible for web services testing and for testing AJAX HTTP requests.
- Enhanced Vuser scripts by introducing timer blocks and by parameterizing user IDs to run the scripts for multiple users.
- Responsible for testing backend Oracle database.
- Extensively monitored the UNIX servers and monitored WebSphere using HP Diagnostics.
- Created various Vuser scripts based on the critical transactions used by real-time users, using LoadRunner VuGen.
- Responsible for setting runtime settings in LoadRunner.
- Correlated the dynamically created session data in the load test scripts in VuGen to synchronize with the application.
- Responsible for performance testing using LoadRunner and JMeter.
- Developed Load/Stress scenarios for performance testing using the LoadRunner Controller.
- Configured the Tomcat server, database server, Apache server and static servers in SiteScope to monitor memory utilization, CPU utilization, throughput, network connections, etc., in LoadRunner.
- Defined and configured SLAs for hits/sec, throughput and transactions per second in LoadRunner (a complementary script-side check is sketched after this list).
- Responsible for monitoring different graphs such as Throughput, Hits/Sec, Transaction Response time and Windows Resources while executing the scripts from LoadRunner.
- Analyzed the results of the Load test by using LoadRunner Analysis tool to identify bottlenecks.
- Configured Production Server System settings on Load Test Servers and Created Load/Stress testing scenarios for performance testing using LoadRunner Controller by creating virtual users.
- Prepared detailed Performance Test Analysis Report with Graphs and the application bottlenecks from the scripts execution.
- Performed Backend testing by integrating SQL queries within scripts and validated the backend workflow under load testing.
- Developed and executed complex SQL Queries and Procedures to perform database testing.
- All the bugs were tracked and updated in defect tracking tool JIRA.
- Participated in the Go/No-go meetings.
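The snippet below is a hedged VuGen (C) sketch of a SOAP-style web service call that combines manual correlation of a session token with a script-side response-time check; the service URL, SOAPAction value, XML envelope, parameter names and the 2.5-second threshold are illustrative assumptions, complementing the Controller-defined SLAs mentioned above.

```c
/* Illustrative VuGen sketch -- hypothetical service, envelope and threshold. */
Action()
{
    double elapsed;

    /* Manual correlation: capture the dynamic session id returned by the
       service so later calls can replay it. */
    web_reg_save_param("SessionToken",
        "LB=<sessionId>", "RB=</sessionId>",
        "NotFound=warning",
        LAST);

    web_add_header("SOAPAction", "urn:getQuote");

    lr_start_transaction("ws_get_quote");

    web_custom_request("ws_get_quote",
        "URL=https://{Host}/services/QuoteService",
        "Method=POST",
        "Resource=0",
        "EncType=text/xml; charset=utf-8",
        "Body=<soapenv:Envelope xmlns:soapenv=\"http://schemas.xmlsoap.org/soap/envelope/\">"
             "<soapenv:Body><getQuote><symbol>{Symbol}</symbol></getQuote></soapenv:Body>"
             "</soapenv:Envelope>",
        LAST);

    /* Script-side check: fail the transaction when the call breaches the
       assumed 2.5 s response-time budget. */
    elapsed = lr_get_transaction_duration("ws_get_quote");
    if (elapsed > 2.5)
        lr_end_transaction("ws_get_quote", LR_FAIL);
    else
        lr_end_transaction("ws_get_quote", LR_AUTO);

    return 0;
}
```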
Environment: LoadRunner, Performance Center, JMeter, HP Quality Center, SiteScope, UNIX, Windows, Wily Introscope, Java, JBoss, WebLogic, Oracle, XML, SQL Server, Network Analysis, MS Access, MS Visio, MS Project, AJAX, VB, J2EE, HTML.
Confidential, Portland, OR
Sr Performance Engineer
Responsibilities:
- Understood and digested the business requirements and prepared the performance test plan and test scenarios.
- Created a number of Load scripts for Data seeding purposes.
- Involved as a Performance Testing Analyst responsible for establishing the Individual Benchmarks and Baselines for test applications.
- Created a number of LR Scripts for Applications deployed on a variety of platforms including Java J2EE, .NET.
- The PeopleSoft system is capable of receiving real-time requests for various records and provides a user interface (GUI) to view all the data that comes from the BCBS application.
- Tested nearly 10 major BizTalk applications that integrate data from Kelly front-office systems (Kelly Staff Net, Bullhorn) and Vendor Management Systems into Kelly back-office systems (PeopleSoft/mainframe). Testing also included a .NET application and a few web services that are part of the integration architecture.
- Tested the data flow between the KSN-PeopleSoft, KSN-BH, BH-KSN and Time and Expense Integration (TEI) applications to make sure all values sent from the KSN application to PeopleSoft via the BizTalk application were mapped correctly.
- Responsible for scripting all load testing scenarios for various sub-systems using a variety of protocols, including Web (HTTP/HTML), Web (Click and Script) and Web Services.
- Created LoadRunner scripts manually as and when needed, without relying much on the recording feature.
- Performed manual correlation without relying on LoadRunner VuGen's Correlation Studio feature (a hand-coded example follows this list).
- Involved in performing volume testing based on the production volumes and cycles.
- Responsible for creating the Load Distribution tables for various scripting modules involved.
- Performed manual testing of various business scenarios for which automation was not an option.
- Responsible for coordinating the Batch processes alongside the Online performance testing efforts.
- Used HP Quality center for Test management and defect tracking.
- Responsible for analyzing the load test runs and presenting the results to management.
- Generated load test reports, and distributed and published them.
- Shared knowledge and helped the team troubleshoot scripting and other performance testing activities.
- Maintained an exclusive Load Test Database instance dedicated to Load Testing activity.
- Responsible for getting the database rolled back after the load tests are completed.
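Below is a hand-coded VuGen (C) sketch of the manual scripting and manual correlation approach described above; the URLs, form-token boundaries and the file-based {OrderId}/{Amount} data-seeding parameters are hypothetical.

```c
/* Hand-written sketch (no recording, no Correlation Studio) -- hypothetical app. */
Action()
{
    /* Manual correlation: register boundaries for the dynamic form token
       before requesting the page that returns it. */
    web_reg_save_param("FormToken",
        "LB=name=\"token\" value=\"", "RB=\"",
        LAST);

    lr_start_transaction("open_order_form");
    web_url("order_form",
        "URL=https://{Host}/orders/new",
        "Resource=0",
        "Mode=HTML",
        LAST);
    lr_end_transaction("open_order_form", LR_AUTO);

    lr_start_transaction("submit_order");
    /* Data-seeding style submit: {OrderId} and {Amount} come from a
       file-based parameter prepared from production volumes. */
    web_submit_data("submit_order",
        "Action=https://{Host}/orders",
        "Method=POST",
        "Mode=HTML",
        ITEMDATA,
        "Name=token",   "Value={FormToken}", ENDITEM,
        "Name=orderId", "Value={OrderId}",   ENDITEM,
        "Name=amount",  "Value={Amount}",    ENDITEM,
        LAST);
    lr_end_transaction("submit_order", LR_AUTO);

    return 0;
}
```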
Environment: HP LoadRunner 11.0/9.5, HP Performance Center 9.5, HP Diagnostics Server, HP SiteScope, XML, ASP.NET, ASP 3.0, Microsoft DNA, BEA, Web Services, JMeter, PeopleSoft, HP Quality Center 10.0, JavaScript
Confidential, Saint Paul, MN
QA/Performance Analyst
Responsibilities:
- Analyzed the requirements and attended several test-survey meetings with the clients to collect Application Performance specifications and SLA parameters needed as part of deliverables.
- Involved as a Performance Testing Analyst for establishing the individual benchmarks and baselines for Java J2EE applications and ERP modules including BI/BW, FI/CO and Finance.
- Responsible for scripting all load testing scenarios for various sub-systems using a variety of protocols, including Web (HTTP/HTML).
- Created a number of Load scripts for Data seeding purposes.
- Performed manual correlation without relying on LoadRunner VuGen's Correlation Studio feature.
- Involved in performing volume testing based on the production volumes and cycles.
- Responsible for creating the Load Distribution tables for various scripting modules involved.
- Responsible for coordinating the Batch processes alongside the Online performance testing efforts.
- Maintained an exclusive Load Test Database instance dedicated to Load Testing activity.
- Responsible for generating load test results and publishing them on the internal portal.
- Responsible for getting the database rolled back after the load tests are completed.
- Responsible for coordinating new transports to the performance testing environments.
- Responsible for scheduling and kicking off load tests via HP Performance Center/Controller involving a variety of load combination scenarios.
- Drilled down into problematic pages in Analysis to find out where performance degradation was occurring (a script-side content check of the kind sketched after this list complements this analysis).
- Analyzed all the various performance metrics involved in the test run, such as web resources, Windows resources (via SiteScope), IIS counters, J2EE monitors, .NET counters and Oracle counters.
- Pinpointed the bottlenecks present in different layers of the application, identified a memory leak in the app and made recommendations to resolve it.
- Performed a cross-comparison of LoadRunner results between iterative baseline tests.
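The sketch below illustrates, under assumed page names and marker text, a script-side content check that flags pages failing to render correctly under load; in practice such checks back up the drill-down work done in Analysis.

```c
/* VuGen sketch of a content check -- page URL and marker text are assumptions. */
Action()
{
    int found;

    /* Count occurrences of a marker that only appears on a fully rendered page. */
    web_reg_find("Text=Account Summary",
        "SaveCount=summary_count",
        LAST);

    lr_start_transaction("account_summary");

    web_url("account_summary",
        "URL=https://{Host}/portal/accountSummary.do",
        "Resource=0",
        "Mode=HTML",
        LAST);

    found = atoi(lr_eval_string("{summary_count}"));
    if (found == 0) {
        lr_error_message("Account summary page did not render correctly under load");
        lr_end_transaction("account_summary", LR_FAIL);
    } else {
        lr_end_transaction("account_summary", LR_AUTO);
    }

    return 0;
}
```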
Environment: LoadRunner 9.0/8.1, HP Quality Center 9.0, HP Performance Center 8.1, Java J2EE, SAP BI.
Confidential, Holmdel, New Jersey
QA/Performance Analyst
Responsibilities:
- Involved as a Software Quality Assurance Analyst in SQA analysis, design and testing of the application.
- Created detailed Test plan, Test Scenarios and Test cases for the supply chain applications according to the business requirements.
- Scripted various user scenarios and created test scripts to emulate the real user, using LoadRunner's VuGen utility.
- Parameterized and correlated many variables such as SessionID, CustomerID and ServerName in the recorded scripts to exactly simulate the real user (see the sketch following this list).
- Created and scheduled endurance, load and stress test scenarios using LoadRunner's Controller to test the application under all kinds of loads.
- Analyzed the response times of various transactions under load using LoadRunner Analysis and found the bottlenecks in the application using monitors such as the application server, database (Oracle) and network monitors.
- Developed reports and graphs to present the load/stress test results to the Working Team.
- Interacted and worked with the development team to solve the problems encountered in the test scenario run.
- Involved in generating and modifying the LoadRunner scripts and the corresponding configuration files involved therein.
- Involved in running the load scenarios at the various loads designed.
- Verified the volume of transactions completed over a specified period of time at varying loads, measured as transactions per second (TPS).
- Identified the potential bottlenecks with increase in load on the system.
- Identified system resource utilization/availability at varying loads (memory, % processor time for different processes and the system, request queuing, % disk activity level, % paging level, % cache hit ratio).
- Verified the average growth in unavailable physical memory (private bytes) for a specific process over a specified number of transactions under a specified workload.
- Verified the server re-cycling period, before the application runs out of resources (memory).
- Responsible for exporting the generated test steps, and ultimately the test plan, using the TestDirector 7.2 plug-in for Microsoft Excel 2000.
- Involved in database testing with WinRunner 7.0 and the Microsoft Query utility, using runtime database checkpoints.
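A minimal VuGen (C) sketch of the SessionID correlation and CustomerID parameterization described above; the boundaries, field names and URLs are hypothetical, and {CustomerID}/{Password} are assumed to be file-based parameters.

```c
/* Illustrative sketch -- hypothetical login flow and parameter boundaries. */
Action()
{
    /* Capture the server-generated SessionID so subsequent requests replay it. */
    web_reg_save_param("SessionID",
        "LB=sessionId=", "RB=&",
        LAST);

    lr_start_transaction("login");
    web_submit_data("login",
        "Action=https://{Host}/login",
        "Method=POST",
        "Mode=HTML",
        ITEMDATA,
        "Name=customerId", "Value={CustomerID}", ENDITEM,   /* file parameter */
        "Name=password",   "Value={Password}",   ENDITEM,
        LAST);
    lr_end_transaction("login", LR_AUTO);

    lr_think_time(5);   /* emulate a real user's pause between steps */

    lr_start_transaction("view_orders");
    web_url("view_orders",
        "URL=https://{Host}/orders?sessionId={SessionID}&customerId={CustomerID}",
        "Resource=0",
        "Mode=HTML",
        LAST);
    lr_end_transaction("view_orders", LR_AUTO);

    return 0;
}
```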
Environment: LoadRunner 7.0/6.5, WinRunner 7.0, TestDirector 7.2, TeamTrack 4.30, Microsoft JVM, Windows NT, JavaScript, Oracle 7.3, Microsoft ASP, MTS, PC Anywhere, eGain, XML
Confidential, Pittsburgh, PA
QA/Performance Analyst
Responsibilities:
- Responsible for generating the key virtual user scripts using the LoadRunner VuGen 7.0 utility for the Web (HTTP/HTML), LDAP and WinSock protocols.
- Responsible for monitoring the application's performance under load using the key web server monitors, web application server monitors for WebSphere, IIS 5.0 and Apache monitors, and NT performance monitors.
- Made many enhancements to the recorded scripts by correlating, parameterizing, inserting debugging messages, performing string manipulation and making other script enhancements as and when needed (a brief sketch follows this list).
- Configured various WebSphere monitors for WAS applications to figure out which of the several servlets/JSPs caused the problem.
- Created quantifiable load with test-scenarios for various applications (both standalone and integration) using LoadRunner's Controller.
- Responsible for running the LoadRunner scenarios for the Vusers using LoadRunner Controller 7.0 and monitoring server response times, throughput, hits/sec, transactions/sec, transaction response time under load, web server monitors, app server monitors, system monitors such as Java processes, and a host of other performance metrics.
- Involved in analyzing the results after running the LoadRunner scenarios using LoadRunner 7.0 Analysis and the host of graphs it can generate.
- Involved in the Regression testing of the App using WinRunner 7.0.
- Involved in writing custom functions and call scripts, adding them to the Function Generator of WinRunner 7.0, writing compiled modules and firing them in the startup scripts.
- Involved in executing the generated WinRunner scripts remotely using the Test Lab utility in TestDirector 7.0i.
- Involved in the management of the WinRunner scripts and the bug database using TestDirector 7.0i.
- Used various servers and ran SQL queries in SQL Server 7.0 on the back end to ensure the proper transaction of data during various tests.
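The short VuGen (C) sketch below shows the kind of debugging-message and string-manipulation enhancements referred to above; the {BranchCode}/{AccountNumber} parameters and lookup URL are illustrative assumptions.

```c
/* Illustrative sketch -- hypothetical parameters and endpoint. */
Action()
{
    char full_id[64];

    /* String manipulation: build the composite lookup key the backend expects
       from two separately captured/parameterized values. */
    sprintf(full_id, "%s-%s",
        lr_eval_string("{BranchCode}"),
        lr_eval_string("{AccountNumber}"));
    lr_save_string(full_id, "FullAccountId");

    /* Debugging message written to the replay/Controller log. */
    lr_output_message("Looking up account %s", lr_eval_string("{FullAccountId}"));

    lr_start_transaction("account_lookup");
    web_url("account_lookup",
        "URL=https://{Host}/accounts/{FullAccountId}",
        "Resource=0",
        "Mode=HTML",
        LAST);
    lr_end_transaction("account_lookup", LR_AUTO);

    return 0;
}
```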
Environment: LoadRunner VuGen 7.0, Controller 7.0, Analysis 7.0, WinRunner 7.0, QuickTest Pro 5.0, TestDirector 7.0i & 2, Java J2EE, JSP, Oracle 8i, XML, IBM WebSphere
Confidential, Orange Park FL
QA/Performance Analyst
Responsibilities:
- Studied Use Cases, Work Flow diagrams and Prototypes developed by Business Analysis team.
- Actively Involved in Creating and implementing Test Plan. Constantly reviewed project documentation and updated test deliverables.
- Was involved in acquiring the nonfunctional requirements and setting up the SLAs.
- Created and executed test cases, updated them from time to time, and prepared test data for various test scenarios.
- Created Scripts using LoadRunner for business critical applications.
- Conducted data integrity tests. Tested business logic incorporated with stored procedures. Used PL/SQL Developer for backend/database testing.
- Participated in various project meetings.
- Logged defects and created defect reports and defect metrics from TestDirector.
- Created invoices, inventory adjustments and sales test data using the POS application. Extracted the test data from the source and verified it in the staging and EDW databases using SQL queries.
- Tested log cleanup component to clean the log messages for Sales Forecast Extract (SFX), Sales Forecast feed (SFF), Store Replenishment Extract (SRX), Store Replenishment Feed (SRF) and log cleanup component itself.
- Created and executed scenarios that emulated typical working conditions using LoadRunner Controller.
- Created virtual users in LoadRunner for Performance testing and analyzed the reports based on the different scenarios.
Environment: VB.NET, Windows XP, Informatica 6.1, LoadRunner 7.5, SQL Server 2000, Teradata V2R4, WebLogic 6.x, MVS, IBM 4690 POS, MQ E-Adapter, MicroStrategy 7.5.2, Harvest 5.1.1, QuickTest Pro 6.5, TestDirector 7.6
Confidential, Dallas, Texas
QA/Performance Analyst
Responsibilities:
- Involved in setting up the test lab for testing on distributed clients with different OS and browser combinations for functional testing and performance testing.
- Developed all the necessary Vuser scripts using LoadRunner VuGen 6.5 to accurately emulate the users for load and stress testing of the application as a whole and of the backend systems; responsible for creating test scenarios and scheduling them in Controller 6.5 from scratch.
- Involved in running the scenarios for the generated Vuser scripts in the LoadRunner 6.5 Controller.
- Generated the analysis files from the results files using LoadRunner 6.5 Analysis.
- Merged necessary graphs and analyzed to find out the bottlenecks.
- Performed system testing consisting of individual test cases and integrated, regression and performance evaluation stages. Used SQL and PL/SQL to generate the test data and verify the outputs.
Environment: LoadRunner 6.5, WinRunner 6.0, MS ASP, MS Site Server Commerce Edition 3.0, COM+, MS Exchange Server, Windows 2000 Server, SQL Server 7.0, Visual Studio 6.0, RSW e-Load, e-Monitor, Microsoft VSS