Sr. Performance Test Analyst Resume
Pittsburgh, PA
SUMMARY:
- Over six years of diverse experience in information technology with an emphasis on performance testing and software QA analysis.
- Experienced in testing both web-based and client/server applications. Experienced in the successful completion of software projects by executing software quality activities throughout the software development life cycle. Actively involved in test plan development, test case development, test case execution, test results analysis, and defect reporting.
- Experienced with testing at different levels (smoke, functional, regression, integration, GUI, system, and performance testing). Extensive experience in developing or supporting test automation using HP LoadRunner, Apache JMeter, and HP Quick Test Pro (QTP). Exposed to performance, stress, and volume testing using LoadRunner.
- Experience in analyzing application performance requirements and designing performance and load test scenarios, test environments, test data, etc. Extensive experience with different LoadRunner protocols (Web HTTP/HTML, Ajax TrueClient, Oracle NCA, and Citrix).
- Hands-on experience using automated tools such as Performance Center, and test management using HP Quality Center. Experience with diagnostic tools such as Dynatrace and HP Diagnostics to identify root causes. Hands-on experience with performance testing JVM-based applications.
- Experienced in designing scenarios in the LoadRunner Controller for load, stress, endurance, and standalone tests, along with analyzing results and reporting.
- Working knowledge of cloud load testing using BlazeMeter with JMeter against REST and web/HTTP services, including JSON, messaging, real-time traffic, and authentication/authorization.
- Familiar with iOS and Android devices for mobile performance testing. Knowledge of automated build and continuous integration tools (Maven, Jenkins, Taurus, Selenium WebDriver) and AWS. Working experience with HP performance testing products (Performance Center, ALM) as well as Selenium, Jira, SoapUI, and Dynatrace AppMon.
- Experience in VuGen scripting using C, VB, Java, and JavaScript, and experienced in writing complex SQL queries for back-end testing against Oracle and MS SQL Server.
- Knowledge of Oracle, SQL, XML APIs, web technologies, and ALM. Experienced with relational database management systems and back-end database testing.
- Hands-on experience across the entire performance testing life cycle under agile methodologies; skilled in load test result analysis. Hands-on experience in application monitoring, database tuning, performance debugging, and identifying the root cause of system bottlenecks.
- Performed end-to-end and web testing; enhanced automation scripts using Quick Test Pro and SoapUI. Hands-on experience in mobile device performance testing. Experience with reporting tools and report testing using Dynatrace AppMon to understand application behavior and performance in production and test environments; analyzed Hotspots, PurePaths, and Timelines in Dynatrace AppMon.
- A motivated self-starter with exceptional team-building, leadership, project management, and interpersonal skills.
TECHNICAL EXPERTISE:
Software Testing Tools: LoadRunner, JMeter, BlazeMeter, Performance Center/ALM, HP Quality Center, PuTTY, Quick Test Pro (QTP), Fiddler, Developer Tools, HP Diagnostics
Operating Systems: MS-DOS, UNIX, Windows 9x/NT/2000/XP/Vista
Databases: MySQL, MS Access, Oracle, SQL Server, DB2 and H2
Web Performance: Fiddler, Webpagetest, PageSpeed Insights and HttpWatch
APM/Profiling Tools: Dynatrace, Dynatrace Ajax Edition, SiteScope, Site24x7, Wily Introscope
Methodologies: Agile/Scrum-Sprint, Waterfall, Iterative, V-Model and RUP.
Languages: C, UNIX Shell, SQL, XML, WSDL, HTML, CSS, SOAP/REST, VBScript, JSON, Java
Servers: WebLogic, WebSphere, Tomcat, Apache, JBoss, and FTP
Others: Microsoft Office, MS PowerPoint, MobiTest, mobiReady, DeviceAnywhere, Firebug, VMware, Perfmon, SharePoint, CRM, JSON Viewer, Git, Jenkins & Taurus, and DOORS
PROFESSIONAL EXPERIENCE:
Confidential, Pittsburgh, PA
Sr. Performance Test Analyst
Responsibilities:
- As a Senior Performance Engineer, working alongside the product development team and collaborating with the agile team. Responsible for creating and maintaining a test framework that measures and reports on the performance of the application suite.
- Preparing test plans based on an understanding of the business requirements and system architecture, and performing functional, integration, end-to-end integration, regression, performance, and UAT testing. Creating the performance test strategy and designing/defining performance test scenarios in LoadRunner 12.01/12.50 and Performance Center 12.01/12.50.
- Extensively used HP LoadRunner to develop Vuser scripts in the PeopleSoft Enterprise, Web HTTP/HTML, Ajax TrueClient, Web Services, and Silverlight protocols. Enhanced Vuser scripts with parameterization, correlation, and rendezvous points, and customized LoadRunner scripts in C with string manipulation and conditional loops (a minimal script sketch follows this list).
- Used HP LoadRunner to execute multi-user performance tests, watched online monitors and real-time output messages, and applied LR functions.
- Uploaded scripts and executed performance tests using Performance Center 12.01. Used LoadRunner for SOA testing by creating SOAP requests from the validated WSDL file.
- Configured scenarios and set up the monitors to capture the performance of the Application servers, Web servers and Database servers using Performance Center.
- Identify potential bottlenecks per the defined process, escalate risks, and mitigate issues. Communicate and collaborate closely with developers, business analysts, and internal stakeholders to provide guidance toward resolving issues.
- Monitored online graphs such as Transactions per Second, Throughput, and client-side Response Time, analyzed them after test completion, and reviewed code to identify any code errors.
- Analyzed various graphs generated including Database Monitors, Network Monitor graphs, User Graphs, Error Graphs, Transaction graphs and Web Server Resource Graphs.
- Responsible for reviews and documentation to report status to the project manager. Provided recommendations to the application owner on steps to meet performance goals.
- Monitoring response times in the PurePath dashboard using the Dynatrace client, and monitoring both test and production environments in Dynatrace. Engaged to implement Dynatrace for real-time performance monitoring, directing performance optimization efforts and using Dynatrace to help differentiate bad days from busy days.
- Identifying hotspots and isolating performance problems in Internet Explorer and Firefox using Dynatrace. Working with a cross-functional team of internal and external resources; mentoring, coaching, and guiding team members (two offshore and one onsite).
- Automation testing using JMeter 3.0, developing performance load/stress test scripts.
- Performed baseline, stress, and high-volume concurrent-user tests using JMeter, monitored system performance during the load tests, and measured database response time, HTTP requests, login, and proxy server performance. Performing API testing with JMeter and developing and deploying load test scripts for end-to-end performance testing.
- Functional and performance testing with SoapUI to verify web services, and HTTP monitoring with HttpWatch. Using Fiddler to create scripts for Smart View scenarios and for pre-performance testing.
- Developing status reports that communicate the appropriate level of detail on testing and listener performance results. Effectively communicated performance results to application owners and management, and provided feedback and recommendations to development teams and management.
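A minimal VuGen sketch (Web HTTP/HTML protocol) of the scripting techniques referenced above: correlation with web_reg_save_param, file-based parameterization, transactions, a rendezvous point, and a SOAP request posted with web_custom_request. The URLs, parameter names, and correlation boundaries are hypothetical placeholders, not the actual application's.

/* Illustrative Action() for a VuGen Web - HTTP/HTML script. The web_* and
   lr_* functions are provided by the standard VuGen headers included via
   globals.h; all endpoints and parameters below are placeholders. */
Action()
{
    int order_count;

    /* Correlation: capture a server-generated token returned by the next
       request (left/right boundaries are hypothetical). */
    web_reg_save_param("SessionToken",
                       "LB=name=\"token\" value=\"",
                       "RB=\"",
                       LAST);

    lr_start_transaction("01_Login");

    /* Parameterization: {pUser} / {pPassword} come from a VuGen parameter
       file so each Vuser submits different login data. */
    web_submit_data("login",
                    "Action=https://app.example.com/login",
                    "Method=POST",
                    "Mode=HTML",
                    ITEMDATA,
                    "Name=username", "Value={pUser}",     ENDITEM,
                    "Name=password", "Value={pPassword}", ENDITEM,
                    LAST);

    lr_end_transaction("01_Login", LR_AUTO);

    /* C-level customization: branch on a value resolved at run time. */
    order_count = atoi(lr_eval_string("{pOrderCount}"));
    if (order_count > 0)
        lr_output_message("Vuser %s has %d open orders",
                          lr_eval_string("{pUser}"), order_count);

    /* Rendezvous point so all Vusers hit the next step together. */
    lr_rendezvous("checkout");

    lr_start_transaction("02_SOAP_GetQuote");

    /* SOAP request built from the service WSDL and posted as raw XML. */
    web_custom_request("GetQuote",
                       "URL=https://app.example.com/services/QuoteService",
                       "Method=POST",
                       "EncType=text/xml; charset=utf-8",
                       "Body=<soapenv:Envelope xmlns:soapenv=\"http://schemas.xmlsoap.org/soap/envelope/\">"
                            "<soapenv:Body><GetQuote><symbol>{pSymbol}</symbol></GetQuote></soapenv:Body>"
                            "</soapenv:Envelope>",
                       LAST);

    lr_end_transaction("02_SOAP_GetQuote", LR_AUTO);

    return 0;
}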
Environment: LoadRunner, JMeter/BlazeMeter, Application Lifecycle Management (ALM), Performance Center, Dynatrace, Citrix Protocol, Quality Center, Java, .NET, J2EE, jQuery, SMAPI, SAS, WebSphere, WebLogic, OPNET, SharePoint, SQL, N-Tier Architecture, Windows 7, VMware, Apache, Tomcat and browsers (IE, Firefox, Safari, Chrome)
Confidential, East Lansing, MI
Performance Tester/ Functional QA Analyst
Responsibilities:
- As a QA performance tester and functional tester, I was responsible for setting the standard performance test plan and communicating with the Homebrew team members to discuss test reports and follow-up stories in the JIRA/Agile backlog.
- Writing and executing load, volume, and performance tests for a Java-based platform. Building performance tests for web and cloud-based applications on Linux, and creating load tests for the Profile Web Service and database application across multiple high-profile Java projects. Creating thread groups and requests and running them against the Connection Service. Monitoring application and multi-web-server metrics and analyzing PerfMon metrics.
- Analyzed memory load, CPU, threads, response codes, and network I/O load for performance bottleneck triage, using commands such as top, Perfmon, wget, sar, and vmstat.
- Used an FTP program to upload files for offline testing and to review server log files.
- Running SQL query performance tests with JMeter over the JDBC protocol under a given load, capturing the impact of performance issues, and sharing results. Used Badboy to create embedded scripts and integrate them with JMeter.
- Participated in the development of project plans by outlining QA tasks, deliverables, deadlines, time estimates, etc. Developed and maintained key performance indicators (KPIs) for the various applications, including response time, failover, time to failure, and recovery.
- Worked extensively with the Web/HTTP, Mobile TrueClient, and Web Services/SOAP protocols and developed scripts using HP LoadRunner. Performed test data management/automation and data-driven testing with JMeter.
- Built test automation for UI and WCF/REST services and created reusable, shareable components using JMeter on the Linux platform. Monitored server CPU usage and analyzed performance metrics to determine root causes.
- Assisted QA performance team members by developing scripts in JMeter, running tests on the BlazeMeter cloud platform, and sharing results. Executed different scenarios using JMeter samplers, controllers, and listeners, such as benchmarking, increasing load, and stress.
- Tested application servers with manual navigation using the Dynatrace Ajax Edition; analyzed Hotspots, PurePaths, and Timelines, and monitored the existing KPIs and their metrics, such as Time to First Impression, Time to Fully Loaded, and Time Spent in JavaScript.
- Installed and configured AppDynamics and monitored business transactions and response times. Attended daily stand-up Scrum meetings and weekly status meetings, and sent weekly status reports to the manager.
Environment: LoadRunner, Spring Boot, JMeter, J2EE, Dynatrace, IBM WebSphere, MySQL, IBM HTTP Server, XML, JVM, CA LISA, Cucumber, Oracle Linux 6, F5, Tomcat, Java 8, WebLogic, ActiveMQ, JIRA, ETL, iOS 7, Mobile Safari, SoapUI, VMware, Android 4.2, and Selenium (WebDriver)
Confidential, Minneapolis, MN
Performance Test Engineer & Functional Tester
Responsibilities:
- Performed Smoke, Functional, Systems Integration, Regression, Database testing and Performance testing at various phases of the development and test cycles
- Extensively worked with the Web, Mobile TrueClient, and Web Services (SOAP) protocols in LoadRunner, simulating virtual users and transactions, simulating user think time and pacing, and configuring VuGen settings.
- Configured the LoadRunner Controller and load generators and executed performance tests across multiple cycles of test scripts. Developed and implemented load and stress tests with LoadRunner and presented performance statistics to the application teams.
- Uploaded scripts, created timeslots and scenarios, maintained scripts, and ran load tests in Performance Center. Analyzed test results such as response time, transactions per second, and throughput graphs.
- Monitored the application server by analyzing server access logs, and debugged application and performance issues using debugging proxy tools such as Fiddler and Firebug.
- Measured and analyzed response time and TPS/throughput under load over a LAN connection for mobile application performance.
- Profiled the application with tools such as VisualVM and JConsole. Used Wily Introscope to monitor J2EE applications and monitored resource metrics to find performance bottlenecks. Performed GUI testing using QTP.
- Designed and developed automated test scripts and executed them to perform verification tests on the application using Quick Test Pro. Developed complex SQL queries to perform back-end testing in the MS SQL Server RDBMS.
Environment: LoadRunner, JMeter, .NET, DeviceAnywhere, J2EE, JProbe, JSP, IIS, MS SQL, C++, Bugzilla, XML, JVM, UNIX, WebLogic, WebSphere, Oracle, IBM HTTP Server, Performance Center, ALM, QC, SoapUI, VMware, Wily Introscope
Confidential, VA
Software QA Analysts
Responsibilities:
- Analyzed the functional requirements and design specification documents to ensure that the system met all technical and business requirements of the applications. Manually generated and implemented templates for test plans, test cases, and test scripts, and performed manual testing on the entire application.
- Performed integration, system, user acceptance, functional, and regression testing. Used Quality Center to develop test cases and test scripts and to execute the scripts. Worked closely with software developers, business analysts, system administrators, and other project management personnel involved in the Software Development Life Cycle (SDLC).
- Performed back-end testing by verifying the data in the Oracle database.
- Used PVCS Tracker to investigate software bugs and interacted with developers to resolve technical issues.
- Involved in team meetings with representatives from development, database management, configuration management, and requirements management to identify and correct defects.