Performance Test Lead Resume
Minneapolis
PROFESSIONAL SUMMARY:
- Professional with more than 13 years of IT experience in performance engineering of applications, covering performance testing, application monitoring, and tuning.
- Gather non-functional requirements, review functional specification documents for application development and enhancements, and prepare test plan documents for performance engineering.
- Responsible for performance testing project activities including requirements gathering, planning, designing scripts and creating test data, executing load tests, and reporting observations & recommendations.
- Responsible for the end-to-end (E2E) performance testing lifecycle, including monitoring, performance analysis, performance tuning, and root cause analysis of bottlenecks.
- Coordinate and collaborate with developers and testers to develop, test, and implement application features and requirements.
- Coordinate onsite and offshore performance engineering teams.
- Perform proofs of concept using LoadRunner and JMeter. Responsible for the development of performance test frameworks.
- Deliver complete performance engineering using test scripting and execution skills with Performance Center, LoadRunner, StormRunner, and JMeter across various protocols.
- Enhance scripts with transactions and verification checks, and implement error-handling code (see the sketch at the end of this summary).
- Create different scenarios based on load patterns. Upload JMeter scripts to Bitbucket and execute jobs in Jenkins.
- Improve the performance of web applications by identifying potential bottlenecks and analyzing system performance.
- Monitor server utilization, heap memory, garbage collection, and application problems using Dynatrace OneAgent.
- Responsible for performing memory and heap dump analysis. Monitor databases such as Oracle and SQL Server and provide query tuning using execution plans.
- Monitor and analyze application servers such as Apache Tomcat and IBM WebSphere using Sysdig and Kibana.
- Monitor web, application, and database servers using HP OpenView, Performance Manager, and Grafana.
- Tune JVMs and provide recommendations for optimal performance.
- Provide detailed Dynatrace analysis reports and tuning recommendations.
- Provide performance issue logs from Kibana using Elasticsearch and Logstash.
- Monitor Production volumes, response times and logs using Splunk.
- Monitor and execute tests for Microsoft Azure cloud applications using Application Insights.
- Use Azure DevOps for activities such as creating stories, tasks, and templates, managing sprints and schedules, and defining points for each story.
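Illustrative sketch (not tied to any specific engagement): a minimal LoadRunner Web (HTTP/HTML) action showing the transaction markers, verification check, and error-handling pattern referenced above. The transaction name, URL, check text, and the {p_username} parameter are hypothetical placeholders.

    Action()
    {
        /* Register a text check before the request; SaveCount lets the script branch on the result. */
        web_reg_find("Text=Order Confirmation",
                     "SaveCount=confirm_count",
                     LAST);

        lr_start_transaction("submit_order");

        web_url("submit_order",
                "URL=https://app.example.local/orders/submit",
                "Mode=HTML",
                LAST);

        /* Fail the transaction explicitly when the expected text is missing. */
        if (atoi(lr_eval_string("{confirm_count}")) == 0) {
            lr_error_message("Order confirmation not found for user %s",
                             lr_eval_string("{p_username}"));   /* placeholder parameter */
            lr_end_transaction("submit_order", LR_FAIL);
            return 0;
        }

        lr_end_transaction("submit_order", LR_PASS);
        return 0;
    }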
TECHNICAL SKILLS:
Testing and supporting tools: LoadRunner, StormRunner, JMeter (BlazeMeter), Performance Center, SoapUI, Oracle SQL Developer, PuTTY, Postman, Jira, WinSCP, SQL Query Analyzer, Azure DevOps, Grafana, ELA, Kibana, Splunk, Jenkins, Bitbucket
Protocols: Web, RTE, Web services, Oracle NCA, Java Vuser, Siebel Web, SAP GUI, RDP
Languages: C, Java, .NET, SQL
Databases: Oracle 11g, SQL Server and Cassandra
Performance monitoring tools: SiteScope 11, Perfmon, UCPS, Performance Manager, Wily Introscope, Dynatrace AppMon, Dynatrace OneAgent, Grafana, Splunk, Elasticsearch, Logstash, Application Insights
Operating Systems: Windows, Unix, and Linux
PROFESSIONAL EXPERIENCE:
Confidential, Minneapolis
Performance Test Lead
Responsibilities:
- Work with the business team to discuss and identify non-functional requirements.
- Review and finalize the critical business processes with the business, project, and support teams.
- Create the test strategy and review it with the business and all other teams.
- Coordinate with global (offshore) teams on project deliverables.
- Configure the performance monitoring metrics for load and peak test execution.
- Performance test scripting using HP VuGen and JMeter; enhance LoadRunner scripts with technical code for proper exception handling and business validation (see the correlation sketch below).
- Complete performance engineering using test scripting/execution skills with HP Performance Center and LoadRunner across various protocols. Review the LoadRunner scripts and share comments with the offshore team.
- Execute the load and peak tests and share the test results with the business and project teams.
- Configure Dynatrace on application and web servers to monitor utilization.
- Set up monitoring in Dynatrace and Grafana for Confidential applications. Analyze PurePaths and provide deep-dive analysis of high response times.
- Responsible for performing memory and heap dump analysis. Work with databases such as Oracle and SQL Server. Monitor and analyze application servers such as Apache Tomcat and IBM WebSphere using Kibana.
- Monitor PurePaths, application metrics, and related data in Dynatrace.
- Review the test results shared by the offshore team and provide recommendations if there are any performance issues.
- Provide tuning recommendations and retest after fixes to verify the performance improvement.
- Arrange meetings with the project and business teams if there are any performance issues.
- Prepare the final test closure memo document and obtain sign-off before the code moves to production.
Environment: LoadRunner, Dynatrace OneAgent, JBoss, Performance Manager, Oracle, SQL Server, Grafana, Microsoft Azure, Splunk, Azure DevOps, Kibana, JMeter, BlazeMeter.
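Illustrative correlation sketch for the script-enhancement work above, assuming a dynamic sessionId token in the login response; the boundaries, parameter name, transaction names, and URLs are hypothetical placeholders.

    Action()
    {
        /* Capture a dynamic session token from the next response (left/right boundaries are placeholders). */
        web_reg_save_param("corr_session_id",
                           "LB=name=\"sessionId\" value=\"",
                           "RB=\"",
                           "NotFound=warning",
                           LAST);

        lr_start_transaction("login");
        web_url("login",
                "URL=https://app.example.local/login",
                "Mode=HTML",
                LAST);
        lr_end_transaction("login", LR_AUTO);

        /* With NotFound=warning an uncaptured parameter evaluates to its own name, so detect that and abort the iteration. */
        if (strcmp(lr_eval_string("{corr_session_id}"), "{corr_session_id}") == 0) {
            lr_error_message("Correlation failed: sessionId not captured");
            return 0;
        }

        /* Reuse the captured value in the follow-up business step. */
        lr_start_transaction("view_account");
        web_url("view_account",
                "URL=https://app.example.local/account?sessionId={corr_session_id}",
                "Mode=HTML",
                LAST);
        lr_end_transaction("view_account", LR_AUTO);

        return 0;
    }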
Confidential, Wisconsin
Performance Test Lead
Responsibilities:
- Gathering user stories and non-functional requirements for performance testing.
- Monitored Dynatrace for high response times, CPU utilization, garbage collection (GC), and method-level hotspots.
- Monitored application and integration servers using QPASA and App Watch.
- Created JMeter web service scripts and executed tests.
- Monitored Oracle Enterprise Manager to identify DB contention and long running queries causing high response times.
- Shared test reports and recommendations with the client.
- Raised defects and worked with the development team to fix performance bottlenecks.
Environment: JMeter, Dynatrace, Cassandra, Oracle Enterprise Manager, QPASA, WebLogic 11g, App Watch
Confidential, USA
Performance Test Lead
Responsibilities:
- The goal is to conduct the performance evaluation in an environment that mirrors production.
- Responsible for test design and execution for performance systems, accountable for the following:
- Knowledge Transition for the applications in Performance testing scope.
- Meet QA environment stakeholders and identify applications and their owners.
- Meet with application owners and collect application data. Participating in the performance design review with the implementation team and providing review comments, as and when required.
- Scheduling, identifying and tracking progress of project milestones.
- Offshore coordination and delivery/execution per agreed SLAs.
- Creating weekly project status reports and delivering them to the customer.
- Manage project risk and escalate issues to appropriate level of leadership for resolution.
- Manage the full execution of the quality assessment process, including but not limited to test case execution, progress of test cycles, and status and details of testing for Performance, Network & Firewall testing.
- Developing the performance quality consulting plan and its implementation. Analyzing the requirements and outlining the best possible test options to the customers.
- Develop/maintain accurate scripts and scenarios to support Performance Test execution needs.
- Environment setup and verification of infrastructure upgrades using the scripts developed.
- Providing Support and guidance in different phases of the project by measuring the output Quality Criteria artifacts across all different work streams.
- Preparation of risk assessments and capacity management, and advising management regarding application performance under various load capacities.
- Create detailed test scenarios for identifying performance bottlenecks and implement them using HP LoadRunner/Performance Center (see the workload sketch below).
- Design and execute different types of performance workload scenarios for enterprise applications.
- Collecting and compiling various usage reports from multiple sources and preparing the final performance analysis report.
- Monitoring load balancer to analyze the load sharing between servers and optimize them to handle bulk requests to avoid bottlenecks in the network.
Environment: HP Application Lifecycle Management (HP Quality Center), CA Wily Introscope
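Illustrative sketch of how a workload scenario step can be shaped at the script level with a rendezvous point and think time; virtual-user distribution, pacing, and ramp-up are configured separately in the Controller/Performance Center. The rendezvous name, transaction name, and URL are hypothetical placeholders.

    Action()
    {
        /* Rendezvous point lets the Controller synchronize virtual users to create a concurrency spike. */
        lr_rendezvous("peak_search");

        lr_start_transaction("search_catalog");
        web_url("search_catalog",
                "URL=https://app.example.local/search?q=router",
                "Mode=HTML",
                LAST);
        lr_end_transaction("search_catalog", LR_AUTO);

        /* Think time models the user delay between steps; run-time settings can scale or cap it. */
        lr_think_time(5);

        return 0;
    }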
Confidential, USA
Performance Test Lead
Responsibilities:
- Interacting with the client to gather requirements, volumes, application configuration, etc.
- Created Test Strategy/Plan and approach as per Non-functional requirements.
- Created Work-Load Model using the production volumes.
- Discussing with the functional and data migration teams and SMEs for data setup and data creation.
- Creating SAP GUI scripts using LoadRunner and validating the scripts in ALM Performance Center.
- Reviewing the LoadRunner scripts against scripting standards.
- Monitoring the SAP system components and resource utilizations during test execution.
- Creating the monitoring set up and identifying key metrics for SAP web/application servers.
- Monitored ST03, STAD, ST04, SM12, SM66, SDF MON.
- Preparing & finalizing the test analysis report.
- Extract the Automatic Workload Repository (AWR) report and analyze the database performance.
- Tune the SAP application/web servers if there is any response time deviation.
- Involved in creating Performance project sign-off document.
Environment: SAP Oracle, SAP HANA, OTC, PTP, STP, PME, Performance Center 11.52, QC 11.52, HP OpenView, ST03, STAD, ST04, SM12, SM66, SDF MON
Confidential
Senior Performance Test Engineer
Responsibilities:
- Involved in gathering requirements for the Episys, Store line, and Bizerba applications using a transactional volume model and discussing them with the business team.
- Preparation of test strategy, test plan and approach as per Non-functional requirements.
- Discussed with business team and finalized the business critical scenarios.
- Successfully tested Oracle Fusion Middleware components such as Oracle Data Integrator (ODI), service-oriented architecture (SOA), Store line Service Bus (SSB), and Retail Integration Bus (RIB) during end-to-end testing.
- Created data management plan and involved in test plan creation.
- Created and reviewed integration end-to-end test cases for Episys, Bizerba, and Store line and uploaded them to Quality Center.
- Discussed with development, integration and database teams and resolved critical defects.
- Created LR scripts using the Oracle Web Applications 11i, Web Services, and RDP protocols.
- Reviewed the Virtual User Generator (VuGen) scripts based on Morrisons scripting standards.
- Analyzed AWR report and provided Recommendations.
- Analyzed Episys, Bizerba and Store line applications and created proof of concept document.
Environment: Episys, Bizerba, RIB, Store line, Oracle Fusion Middleware, SOA, ODI, SSB, Performance Center 9.52, Quality Center 9.2, SiteScope 11, Oracle SQL Developer, HP Performance Manager, WebLogic
Confidential
Senior Performance Engineer
Responsibilities:
- Involved in gathering requirements for the Oracle Retail Warehouse Management System (ORWMS) using a transactional volume model and discussing them with the business team.
- Preparation of test strategy, test plan and approach as per Non-functional requirements.
- Discussed with business team and finalized the business critical scenarios.
- Successfully tested Oracle Fusion Middleware components such as Oracle Data Integrator (ODI), service-oriented architecture (SOA), Oracle Service Bus (OSB), and Retail Integration Bus (RIB) during end-to-end testing.
- Created data management plan and involved in test plan creation.
- Created and reviewed integration end-to-end test cases for Oracle Retail Merchandising System (ORMS), Oracle Retail Invoice Matching (OReIM), and Oracle Retail Warehouse Management (ORWMS) and uploaded them to Quality Center.
- Discussed with development, integration and database teams and resolved critical defects.
- Created LR scripts using the Oracle Web Applications 11i and RTE protocols.
- Reviewed the LoadRunner scripts based on Morrisons scripting standards.
- Monitored all the components and resource utilizations during test execution.
- Prepared & finalized the test analysis report.
- Created Performance Test completion report after project signoff.
- Created Project Hand Over document after project signoff.
- Involved in creating Performance Test Quality Gate document.
- Analyzed RMS and WMS applications and created proof of concept document.
Confidential
Performance Test Engineer
Responsibilities:
- Involved in Gathering Performance requirement and prepared the Proof of concept.
- Preparation of test strategy, test plan and approach as per Non-functional requirements.
- Assistance in validating the volumetric requirements based on transaction volume model.
- Tested ASDA application with JMeter performance testing tool and prepared proof of concept report.
- Involved in end to end Monitoring setup.
- Create LoadRunner scripts for the required business scenarios using the Web (HTTP/HTML) protocol (see the sketch below).
- Reviewed the LoadRunner scripts based on Wal-Mart scripting standards and provided review comments.
- Built Performance test scenarios to reflect expected live usage of the system, using the pre-defined sets of business transactions to uncover and fix performance and scalability issues.
- Tuning the software elements in collaboration with the development teams as appropriate.
- Monitoring the web, app and database layers with Perfmon Counters.
- Analyze graphs for web, App and DB server system parameters.
- Created Performance Test completion report after project signoff.
- Created Project Hand Over document after project signoff.
- Involved in creating Performance Test Quality Gate document.
Environment: Apache Tomcat 2.26, Oracle 10g, JBoss 4.0.5, Windows XP, VuGen 9.52, Performance Center 9.52, JMeter, Badboy
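Illustrative Web (HTTP/HTML) sketch of a parameterized business step of the kind scripted above; the form action, field names, and the {p_item_id} parameter are hypothetical placeholders.

    Action()
    {
        lr_start_transaction("add_to_basket");

        /* Form POST for a typical business step; the item id comes from a data file via a parameter. */
        web_submit_data("add_to_basket",
                        "Action=https://app.example.local/basket/add",
                        "Method=POST",
                        "Mode=HTML",
                        ITEMDATA,
                        "Name=itemId",   "Value={p_item_id}", ENDITEM,
                        "Name=quantity", "Value=1",           ENDITEM,
                        LAST);

        lr_end_transaction("add_to_basket", LR_AUTO);
        return 0;
    }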
Confidential
Senior Technical Associate
Responsibilities:
- Performance and resilience testing for application servers, middleware, and databases.
- Created realistic workload scenarios for load, resilience, operations readiness, platform regression, and performance regression tests and executed them using LoadRunner.
- Involved in the preparation of the test data and test environment prior to starting tests.
- Monitor the application statistics using UCPS.
- Analyze the performance test results, document preliminary and summary reports, identify bottlenecks
- Analyzed Backend call responses.
- Interact with the clients and other teams, attending the Conference calls.
- Prepared the Test track report, Risk & Issue register.
- Extracted the Statspack or AWR report and analyzed the database performance.
- Provided measurements for future releases and ensured the application meets non-functional performance requirements.
- Participated in project review meetings & discussions
Environment: WebLogic 8.1, Cyclone, HUB, STAA, Zeus LB, Sun V890, IBM MQ, LoadRunner and Java scripts, Quality Center, UCPS.
Confidential
Senior Technical Associate
Responsibilities:
- Performed pre-test and post-test activities for all tests.
- Monitored the WebLogic console, CPU, memory, and space utilization, along with TPS, response times, and other graphs in the LR online monitors.
- Analyzed the graphs & reports based on the target requirements.
- Executed different types of tests like Sanity, Load, Stress and benchmark tests.
- Identified the bottlenecks, provided the recommendations and tuned the application.
- Prepared & finalized the test analysis report.
Confidential
Technical Associate
Responsibilities:
- Overall coordination and monitoring of the performance related activities for CCM Perf Testing.
- Resolving outstanding issues and queries if any.
- Liaising with designers on issues encountered while pumping XMLs.
- Review deliverables, Conduct team meetings.
- Discussed with the technical team to confirm the rate at which to pump messages into the different queues.
- Configured the harness tool with different order IDs, queue names, etc.
- Identified the bottlenecks, provided the recommendations and tuned the application.
- Prepared & finalized the test analysis report.
Environment: CCM Harness Tool, UCPS, Weblogic console
Confidential
Technical Associate
Responsibilities:
- Understanding of application architecture.
- Performed pre-test and post-test activities for all tests.
- Monitored the WebLogic console, CPU, memory, and space utilization, along with TPS, response times, and other graphs in the LR online monitors.
- Created performance test scripts using LoadRunner.
- Involved in the creation of the workload synthesis.
- Analyzed the graphs & reports based on the target requirements.
Environment: IBM MQ, WebLogic 8.1, Cyclone, Zeus LB, Sun V890, stubs and Java scripts, LoadRunner 8.0
Confidential
Technical Associate
Responsibilities:
- Involved in the analysis of volumetrics during the environment preparation and setup phase to derive and build the required test cases, scripts, and test data, and in the provision of the test data, test environment, and test harnesses procured, built, and configured during this stage.
- Built performance scripts for the Siebel Web protocol and configured the DLLs for auto-correlation.
- Built Performance scenarios (Load, Stress, Soak) to reflect expected live usage of the system, using the pre-defined sets of business transactions.
- Performed Informal Test Execution to prove the readiness of the test environment and the performance test tool scripts.
- Measure the solution/application under test performance and report against pre-determined Performance Test requirements.
- Determine whether or not the solution/application under test can reliably achieve the predicted transaction throughput levels whilst maintaining an acceptable level of performance.
- Involved in the preparation of Test plan and Specification based on the volumetric information.
- Undertook a performance tuning exercise by running baseline tests to identify areas that were bottlenecked or underperforming; analyzed the loading profile and server statistics and identified the bottlenecks.
- Monitoring and Verification of test results - including collection and analysis of test results, response times and business transaction throughputs, server statistics.
- Identified the bottlenecks on the Siebel application server side.
- Reporting of test results - including reporting against objectives and requirements.
Environment: WebLogic 8.1, Cyclone, Zeus LB, CSM, Sun V890, Siebel
Confidential
Technical Associate
Responsibilities:
- Understanding of application architecture.
- Generated and enhanced the scripts using LoadRunner.
- Created load test scenarios based on scheduling patterns (ramp-up, duration, ramp-down).
- Created the Click Stream Workflows for performance scenarios.
- Monitored all the components and resource utilizations during test execution.
- Prepared & finalized the test analysis report.
Environment: Siebel 7.7, Oracle 10i, WebLogic