Performance Test Lead Resume
SUMMARY:
- Senior Software Testing Professional with 14+ years of experience overseeing complex projects from requirements through the delivery phase. Highly resourceful, customer-focused leader who promotes strategic planning and modern engineering principles and consistently delivers high-quality work. Coaches and leads teams on quality and standard SDLC methodologies such as Agile and Waterfall.
- Managed and led the Non-Functional Testing practice for the Confidential - MAHIX and BG International accounts.
- Knowledge of Performance Testing Life Cycle and proficient in analyzing the test results.
- Troubleshoot bottlenecks with minimal support from technical teams.
- Led performance engineering activities to mitigate release-specific changes and P1/P2 production incidents.
- Expertise in Load, Stress, Endurance, Capacity & Resiliency testing
- Proficient in testing tools such as LoadRunner, Performance Center, Application Lifecycle Management (ALM), Quality Center and SiteScope
- Strong at analyzing server logs using Splunk. Configured and collected various performance metrics, including CPU, memory dump analysis and disk utilization statistics, using Wily Introscope, nmon, perfmon, SiteScope and application monitoring tools such as AppDynamics and Dynatrace.
- Implemented and rolled out AppDynamics and Splunk monitoring capability for production and non-production environments; this enabled environmental issues and incidents to be detected and analyzed within minutes instead of hours.
- Collaborate with Senior Performance and Resilience Engineers to resolve complex problems that span multiple technologies, platforms and networks.
- Coached the team into a new way of working with Agile and Jira, resulting in a fully engaged team that worked smarter, with greater efficiency and doubled velocity.
- Profound knowledge of data warehousing, including Netezza, Hadoop and Oracle, and of BI reporting tools such as OBIEE, MicroStrategy and Tableau
- Expertise in writing SQL and Hive queries to test the completeness, quality, redundancy and correctness of data loaded into a DWH or data mart using ETL mappings, transformations, derivation logic and lookups
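The source-to-target reconciliation checks described above can be sketched roughly as follows. This is an illustrative example only: the table and column names are hypothetical, and SQLite stands in for the actual warehouse engines (Netezza, Oracle, Hive) named in this resume.

```python
# Illustrative source-to-target reconciliation: compare row counts, then
# list rows present in the source but missing from the target.
# Table/column names are hypothetical; SQLite is a stand-in engine.
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.executescript("""
CREATE TABLE src_customer (id INTEGER, name TEXT);
CREATE TABLE tgt_customer (id INTEGER, name TEXT);
INSERT INTO src_customer VALUES (1, 'Ann'), (2, 'Bob'), (3, 'Cy');
INSERT INTO tgt_customer VALUES (1, 'Ann'), (2, 'Bob');  -- row 3 not loaded
""")

# Completeness check: row counts must match between source and target.
src_count = cur.execute("SELECT COUNT(*) FROM src_customer").fetchone()[0]
tgt_count = cur.execute("SELECT COUNT(*) FROM tgt_customer").fetchone()[0]

# Correctness check: rows in the source that are absent from the target.
missing = cur.execute(
    "SELECT id, name FROM src_customer EXCEPT SELECT id, name FROM tgt_customer"
).fetchall()

print(src_count, tgt_count, missing)
```

The same count-and-EXCEPT pattern carries over to HiveQL or Netezza SQL with only dialect-level changes.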
TECHNICAL SKILLS PROFILE:
Testing Tools: Dynatrace User Experience Management (UEM), AppDynamics, Oracle OEM, Tableau, Hue, Datameer, Tectia, LoadRunner (LR), JMeter, Application Lifecycle Management (ALM), Performance Center (PC), SiteScope, Quick Test Professional (QTP), Quality Center (QC), Business Availability Center (BAC)
Databases: Oracle, Netezza, Hadoop
OS: Windows, UNIX, AIX
Performance Monitoring tools: Wily, SPARK, nmon, perfmon, IBM DataStage, IBM Netezza Performance Portal
Other Tools: Bugzilla, Microsoft Visual SourceSafe, Tortoise SVN, Toad, DbVisualizer, SQL Developer
Languages: C, Java, SQL
PROFESSIONAL EXPERIENCE:
Confidential
Performance Test Lead
Responsibilities:
- Baseline system performance; identify and benchmark the system's ability to execute a given volume of transactions under production-level load conditions.
- Worked on server sizing and workload analysis for a second staging environment to be used for performance testing intermediary releases
- Gathered requirements to understand the impact of each release's changes on the performance test cycle and whether new scripts needed to be generated for newly added features
- Coordinate with the team on script generation (HP VuGen & Performance Center), modification and environment readiness before each test execution cycle
- Leveraged Splunk advanced features to slash test report generation from 2 days to 15 minutes and test results analysis from 1 day to 2 hours.
- Use Dynatrace to monitor application behavior during the test execution cycle. Analyze PurePaths and the memory, CPU and disk utilization of all application servers and web servers involved.
- Use Oracle OEM to monitor the database server when issues are encountered
- Manage defects in HP ALM
- Integrate initial non-functional testing into the CI/CD pipeline
- Under the anticipated peak production load, derived from 2014 OE observation, the Performance Testing team measures page-to-page response times within the Confidential applications.
- Create a report comparing current response times with those of previous releases, highlighting any that do not fit within the SLA
- Review performance test results for each release with the Commonwealth QA Manager, MassIT Release Manager and Confidential Release Manager.
- Create a comparison report in Tableau comparing Staging and Production server application configuration parameters before and after each release. This was an additional task done by the performance test team to help the development team
Environment: Dynatrace User Experience Management (UEM), Splunk, Oracle OEM, HP Vugen, HP Performance center, HP ALM, Tableau
Confidential, NJ, USA
Test Lead/Coordinator for Performance testing, Data Integration testing and UAT testing
Responsibilities:
- Ensure, by reviewing JCI Quality Assurance (QA) Standards with the team, that scope, execution documentation and requirement traceability are properly covered:
- Planning functional and non-functional requirement analysis, strategy development and schedule
- Managing and leading the test team to write/design test cases with documentation, leverage and link appropriate SIT cases, review and approve cases, execute the documented test cases, and document defects using JIRA.
- Leading the performance test team to identify the scenarios under test, walk through scenario steps, analyze the workload model and obtain sign-off on the performance test plan.
- Create dashboards for test case execution progress and defect summary
- Working closely with the Amazon team on cloud performance monitors that can be accessed during the performance test cycle
- Work with the Infrastructure team to get application performance monitoring tools such as AppDynamics configured and made accessible to the test team.
- Create a detailed UAT execution plan broken down by date, feature, UAT user role and module
Environment: Micro Focus LoadRunner (Web HTTP/HTML protocol), AWS, AppDynamics, Postman, JIRA
Confidential, Warren, NJ, USA
ETL/Hadoop Test Lead
Responsibilities:
- Collaborate with Business Analysts to ensure comprehensive requirements-based testing is achieved
- Keep track of new and updated requirements
- Well versed in software functional testing practices, methodologies, and standards
- Developed and maintained innovative, repeatable QA test plans as well as performance plans based on functional requirements, use cases, user interface designs, system documents and domain knowledge
- Development of Test Plan, Test Scenarios and Test Summary Reports
- Assigning tasks to team members and tracking them
- Plan, Analyze, Design, develop, execute, and maintain Hive Scripts for reconciliation of data loaded in Hadoop
- Perform the tasks of updating test plan status to the test manager on a weekly basis
- Check that mapping documents cover the business scope
- Ensure that data elements in the BRD have been captured and implemented in EAP in the desired L1, L2, L4 and L5 layers.
- Ensure that personal information such as a customer's SSN or DOB is masked or encrypted in the original content sources. In the SIT/UAT environments this information should have restricted access.
- Work on different tools like Hive, Hue, Datameer and Tectia and write Hive queries to validate data correctness in Hadoop
- Work closely with SIT team and address all their issues and questions
- Wrote complex queries to compare the counts and data from source to target.
- Wrote complex queries to identify exact mismatches of column data between the source file and the target table.
- Extensively worked on Mapping Variables
- Extensively worked on Derivation rules
- Actively work with Business Analysts and Development team
- Upload Test cases, Prepare Test lab and execute test cases in HP ALM
- Raise defects, manage the defect report using Quality Center and verify fixes in review meetings with the development team lead.
- Follow Test process and continuously improve the quality of test process
- Ensure Good communication and constructive work relationship with Team as well as with Client.
- Mentor Test resources
Environment: Hadoop, ETL Testing, Hive queries, Hue, Datameer, UNIX, HP ALM, Sharepoint
Confidential, MA, USA
ETL Performance Test Lead
Responsibilities:
- Design and conduct various performance testing (load test and stress test) to ensure CDC, ETL Job processes adhere to NFRs defined for R3 & R4 Release
- Design and execute performance testing interfaces impacting RMS EDW
- Analyze the source data file structure and its schema so that data can be loaded and validated correctly
- Execute Data stage sequence/Jobs using IBM DataStage
- Monitor Job execution status like Job execution time through Datastage Director
- Monitor the Netezza database hardware status, active queries, active sessions, and other related information using IBM Netezza Performance Portal
- Validate the data is properly loaded in the Netezza tables using Toad
- Gather nmon statistics for the DataStage and database servers
- Perform analysis on Job execution and relate the nmon statistics using an internal monitoring tool called SPARK
- Suggest recommendations. Generate test analysis results and share it with development team
- Raise the issues encountered using ALM
Environment: IBM Infosphere Information Server (IIS) v8.7, IBM Infosphere CDC process, IBM DataStage v8.7, Netezza, IBM Cognos
Confidential, Warren, NJ, USA
ETL Test Lead /Manual Functional Tester
Responsibilities:
- Collaborate with Business Analyst to ensure comprehensive requirements-based testing is achieved
- Keep track of new and updated requirements
- Well versed in software functional testing practices, methodologies, and standards
- Developed and maintained innovative, repeatable QA test plans as well as performance plans based on functional requirements, use cases, user interface designs, system documents and domain knowledge
- Development of Test Plan, Test Scenarios and Test Summary Reports
- Assigning tasks to team members and tracking them
- Plan, Analyze, Design, develop, execute and maintain SQL Scripts for reconciliation of data loaded.
- Perform the tasks of updating test plan status to the test manager on a weekly basis
- Prepare traceability matrix
- Involved in writing and reviewing SQL queries to verify data quality and calculations
- Wrote complex queries to compare the counts and data from source to target.
- Wrote complex queries to identify exact mismatches of column data between the source file and the target table.
- Extensively worked on Mapping Variables
- Extensively worked on Derivation rules
- Closely work with Business Analysts and Development team
- Verify that data from the database flows correctly to the UI
- Execute functional test cases
- Upload Test cases, Prepare Test lab and execute test cases in HP ALM
- Raise defects, manage the defect report using Quality Center and verify fixes in review meetings with the development team lead.
- Follow Test process and continuously improve the quality of test process
- Ensure Good communication and constructive work relationship with Team as well as with Client.
- Prepare presentation on productivity matrix of the team
- Mentor Test resources
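The column-level mismatch queries mentioned above follow a common pattern: join source and target on the business key and flag which column differs. Below is a minimal sketch of that pattern; the tables and columns are hypothetical, and SQLite stands in for the actual Oracle database.

```python
# Illustrative column-level mismatch check: join source and target on the
# key and report, per key, which column disagrees. Names are hypothetical;
# SQLite is used here as a stand-in for the real database engine.
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.executescript("""
CREATE TABLE src (id INTEGER, city TEXT, balance REAL);
CREATE TABLE tgt (id INTEGER, city TEXT, balance REAL);
INSERT INTO src VALUES (1, 'Newark', 100.0), (2, 'Warren', 250.0);
INSERT INTO tgt VALUES (1, 'Newark', 100.0), (2, 'Warren', 275.0);  -- drift
""")

mismatches = cur.execute("""
    SELECT s.id,
           CASE WHEN s.city    <> t.city    THEN 'city'    END,
           CASE WHEN s.balance <> t.balance THEN 'balance' END
    FROM src s JOIN tgt t ON s.id = t.id
    WHERE s.city <> t.city OR s.balance <> t.balance
""").fetchall()

# Each result row carries the key plus the name of any mismatched column.
print(mismatches)
```

In practice the same query is extended with an outer join (or a separate EXCEPT pass) to also catch keys that exist on only one side.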
Environment: ETL Testing, Manual Testing, SQL, SQL Developer, Oracle Database, UNIX, Quality Center
CONFIDENTIAL
Sr. Performance engineer
Responsibilities:
- Meetings and discussions with the development team/client to understand the system architecture, requirements and business processes to be considered for testing
- Perform a POC on the system, analyze the protocol for testing and the types of tests to perform, and prepare estimations accordingly
- Project planning, effort estimation, work distribution, tracking, monitoring & controlling, and project status reporting to the Project Manager and business stakeholders.
- Prepare/Review Performance Test Plans and Test Strategy
- Guide team on Load Runner and Performance testing concepts
- Co-ordination with the functional team for the functional help
- Plan Test executions
- Parameterized and correlated the scripts and enhanced them according to the test case.
- Executed the baseline performance tests for each release to verify the performance changes for significant business transactions.
- Ensure test environment reflects requirements for test execution
- Customize parameterization in data files via LoadRunner to test the application with different sets of data.
- Inserted rendezvous points to create intense load on the server and thereby to measure server performance.
- Used various techniques such as ramp-up, ramp-down and transaction points in LoadRunner.
- Conducted load, stress and soak testing using Load Runner by creating rendezvous points to simulate heavy user load, and transaction points to test application response time.
- Use HP ALM to check load generator availability, upload VuGen scripts in zip format, design and execute scenarios, and collate results from the load generators
- Worked on Analysis tool to capture the Response Times, Passed/Failed Transactions, Errors
- Analyzed the Load Runner results to measure the Average CPU usage, Response time, Transactions per second.
- Analyzed various performance Monitors to find System Bottlenecks, Network Bottlenecks, CPU & Memory Utilization.
- Analyzed graphs such as Response time, Running Vusers, Throughput, Hits per Second and Web Page Breakdown Graphs
- Interacted with Developers and other Teams for Data requirements and Resolving performance issues.
- Responsible for capturing performance metrics and monitoring the application servers, database servers and Windows boxes.
- Monitored Servers using Wily
- Provide recommendations related to performance
- Provide daily status to client and development team
- Prepared analysis document for the test and identified performance issues of software and hardware
Environment: LoadRunner - Web (HTTP/HTML) protocol, Application Lifecycle Management (ALM)/Quality Center
Sr. Performance Engineer
Confidential
Responsibilities:
- Prepare Test step documents for the identified scenarios
- Co-ordination with the functional team for the functional help
- Prepare Volumetric for Business functions based on Critical success factors
- Generate scripts using VuGen (multi-protocol Siebel-Web, HTTP/HTML).
- Parameterized and correlated the scripts and enhanced them according to the test case.
- Executed the baseline performance tests for each release to verify the performance changes for significant business transactions.
- Ensure test environment reflects requirements for test execution
- Customize parameterization in data files via LoadRunner to test the application with different sets of data.
- Inserted rendezvous points to create intense load on the server and thereby to measure server performance.
- Used various techniques such as ramp-up, ramp-down and transaction points in LoadRunner.
- Soak test, stress test and peak test the system and analyze whether transaction response times meet the CSFs
- Worked on Analysis tool to capture the Response Times, Passed/Failed Transactions, Errors
- Analyzed the Load Runner results to measure the Average CPU usage, Response time, Transactions per second.
- Analyzed various performance Monitors to find System Bottlenecks, Network Bottlenecks, CPU & Memory Utilization.
- Analyzed graphs such as Response time, Running Vusers, Throughput, Hits per Second
- Interacted with Developers and other Teams for Data requirements and Resolving performance issues.
- Monitor the Unix counters of Siebel object Managers
- Generate customized reports for the Performance test results
- Send Daily status update mails to the client
- Mentor Test resources
Environment: Load Runner - Siebel protocol, Quality center
ETL Test lead
Confidential
Responsibilities:
- Collaborate with Business Analyst to ensure comprehensive requirements - based testing is achieved
- Developed and maintained innovative, repeatable QA test plans as well as performance plans based on functional requirements, use cases, user interface designs, system documents and domain noledge
- Development of Test Plan, Test Scenarios and Test Summary Reports
- Plan, Analyze, Design, develop, execute, and maintain SQL Scripts for reconciliation of data loaded.
- Analyze data using SQL, Excel, and UNIX
- Check that the ETL jobs execute without any errors
- Analyze the response time taken for ETL Jobs to load the data in database
- Involved in writing and reviewing SQL queries to verify data quality and calculations
- Wrote complex queries to compare the counts and data from source to target.
- Extensively worked on Mapping Variables
- Wrote SQL queries to validate the FACT, DIMENSIONS and AGGREGATE tables
- Raise Defects, manage Defect Report using Bugzilla and verify fixes.
- Also Handle Administrative work of Bugzilla.
- Follow Test process and continuously improve the quality of test process
- Ensure Good communication and constructive work relationship with Team as well as with Client.
- Mentor Test resources
Environment: Manual Testing, SQL, DB Visualizer, Netezza Database, UNIX, Bugzilla, SAS DI Studio - for ETL job
Performance Engineer
Confidential
Responsibilities:
- Prepare Test step documents for the identified scenarios
- HP had provided a set of QC 10-specific scripts for the core functionality of QC. The scripts were modified to suit the company's database and different projects.
- Some scripts were newly generated for functionality not supported by the HP-delivered scripts, as HP provided scripts for a SQL database while the organization used an Oracle database
- Load test the system under different loads and compare response times with the earlier QC v9.2 (performance testing of QC 9.2 was performed the previous year)
- Customize parameterization in data files via LoadRunner to test the application with different sets of data.
- Used various techniques such as ramp-up, ramp-down and transaction points in LoadRunner.
- Worked on Analysis tool to capture the Response Times, Passed/Failed Transactions, Errors
- Analyzed the Load Runner results to measure the Average CPU usage, Response time, Transactions per second.
- Analyzed graphs such as Response time, Running Vusers, Throughput, Hits per Second
- Generate a comparative report for the response time with different load and earlier QC v9.2 Response time. Prepare a closure report and share it with client
- Send daily status update report to the client
Environment: Load Runner - Web (Http/Html) protocol, Quality Center
QA Analyst
Confidential
Responsibilities:
- Understand the requirements and prepare Test cases for the identified scenarios and upload them to Quality center
- Test different modules like Timesheet, General Ledger Reports, Data Migration
- Co-ordination with the Business team for the functional help
- Manually perform the Functional testing and integration testing
- Ensuring that Test Cases cover all requirements, not missing any, and that Test Cases and the Test Report meet the agreed quality standards.
- Developed test plans, test cases, test scripts and procedures, traceability matrix, and test result reports.
- Performed negative testing to find how functions and variables behave when they encounter invalid and unexpected values.
- Involved in end-to-end defect cycle
- Provide support to the end users during UAT
- Co-ordinate with development team for the issues faced
- Daily status report with BA and Development lead
- Send Daily and Weekly status update mails to the client
Environment: Manual Testing, Navision.
QA Analyst
Confidential
Responsibilities:
- Understand the requirements and prepare Test cases for the identified scenarios and upload them to Quality center
- Test different Service Management modules like Incident, Interaction, Change and Change Management
- Co-ordination with the Business team for the functional help. Manually perform the Functional testing and integration testing
- Involved in end-to-end defect cycle
- Generate Performance testing scripts for the identified scenarios like Bulk update, Ticket queues using Vugen 9.5
- Parameterized and correlated the scripts and enhanced them according to the test case.
- Execute the load test using Performance center 9.5
- Customize parameterization in data files via LoadRunner to test the application with different sets of data.
- Inserted rendezvous points to create intense load on the server and thereby to measure server performance.
- Used various techniques such as ramp-up, ramp-down and transaction points in LoadRunner.
- Worked on Analysis tool to capture the Response Times, Passed/Failed Transactions, Errors
- Analyzed the Load Runner results to measure the Average CPU usage, Response time, Transactions per second.
- Analyzed graphs such as Response time, Running Vusers, Throughput, Hits per Second
- Interacted with Developers and other Teams for Data requirements and Resolving performance issues.
- Provide support to the end users during UAT
- Co-ordinate with development team for the issues faced
- Daily status report with BA and Development lead. Send Daily and Weekly status update mails to the client
Environment: Manual Testing, Load Runner - Web (Http/Html) protocol, Quality Center
Confidential - USA
Sr. Performance engineer
Responsibilities:
- Development of Test scripts for the identified critical business processes using LoadRunner
- Consolidation and parameterization of test scripts, reducing the number of automated scripts relative to the number of test cases.
- User friendly documentation of Test Deliverables including test cases, test scripts, Analysis Reports
- Parameterized and correlated the scripts and enhanced them according to the test case.
- Ensure test environment reflects requirements for test execution
- Conducted load, stress and soak testing using Load Runner by creating rendezvous points to simulate heavy user load, and transaction points to test application response time.
- Customize parameterization in data files via LoadRunner to test the application with different sets of data.
- Inserted rendezvous points to create intense load on the server and thereby to measure server performance.
- Implement network virtualization by Integrating Shunra with HP Loadrunner to recreate multiple network scenarios
- Used various techniques such as ramp-up, ramp-down and transaction points in LoadRunner.
- Worked on Analysis tool to capture the Response Times, Passed/Failed Transactions, Errors
- Analyzed the Load Runner results to measure the Average CPU usage, Response time, Transactions per second.
- Analyzed various performance Monitors to find System Bottlenecks, Network Bottlenecks, CPU & Memory Utilization.
- Analyzed graphs such as Response time, Running Vusers, Throughput, Hits per Second and Web Page Breakdown Graphs
- Interacted with Developers and other Teams for Data requirements and Resolving performance issues.
- Responsible for capturing performance metrics and monitoring the application servers, database servers and Windows boxes.
- Provide recommendations related to performance
- Provide daily status to client and development team
- Prepared analysis document for the test and identified performance issues of software and hardware
- Configure BPM for Monitoring AUT
- Explore and perform POCs on other testing tools such as JMeter, IBM Rational Functional Tester and Bugzilla
Environment: HP Quick Test Professional, HP Load Runner Web (Http/Html) protocol, HP Business Availability center.
QA Analyst
Confidential
Responsibilities:
- Understanding requirements, preparing test plan, writing Test Cases and testing of modules such as File Explorer, Security, Display, Time management, USB, File Transfer, Memory card
- Prepare test scripts to test the APIs of modules such as File Explorer and Security.
- Testing with Rational Purify to detect memory leaks and generating reports.
- Testing both on Simulator and on Target (Mobile Handset)
- Ensuring that Test Cases cover all requirements, not missing any, and that Test Cases and the Test Report meet the agreed quality standards.
- Developed test plans, test cases, test scripts and procedures, traceability matrix, and test result reports.
- Performed negative testing to find how functions and variables behave when they encounter invalid and unexpected values.
- Testing Games specific features, and behavior of the game on various compatible devices.
- Tracking closure of defects.
- Random testing of all the modules of the handset.
- Achieved Best Performance Award and Client appreciation
Environment: Manual Testing.