
Senior Automation Test Engineer Resume

Dulles, VA

SUMMARY

  • Senior Automation Tester with 7.5 years of strong performance testing and quality assurance experience across various industry domains.
  • Experienced in performing load and stress testing of web applications and web services using LoadRunner.
  • Experienced in developing LoadRunner scenarios by creating Vuser scripts, hosts, and Vusers, and in analyzing test results with online monitor graphs and reports.
  • Experienced in Spike, Endurance and Stress Testing using Performance Center.
  • Experienced in developing load testing scripts to find server bottlenecks and database deadlocks using SQL Diagnostic Manager.
  • Experienced in server hardening tests and server health checks using open source tools such as JMeter.
  • Good knowledge of writing SQL queries in Oracle, SQL Server, and DB2 database systems, with extensive experience conducting back-end database testing.
  • Experienced in generating data-driven QTP scripts that access the back-end database.
  • Strongly experienced in establishing baseline performance metrics and identifying bottlenecks for specific business workflows, to be measured against future changes across applications and URLs, using LoadRunner.
  • Expert in LoadRunner Analysis with custom templates.
  • Experienced in using JMeter for back-end database testing over JDBC and ODBC connections.
  • Experienced in testing LDAP, FTP, and SOAP services using JMeter.
  • Expert in the Windows Typeperf and Perfmon utilities for creating custom configuration files, collecting Windows resource statistics remotely, and generating reports with PAL.
  • Experienced in using the vmstat, sar, and topas utilities and System Monitor to measure UNIX system performance under load.
  • Experienced in generating detailed reports that include graphs and tables for various performance object counters and application transaction times using LoadRunner.
  • Experienced with SiteScope and Wily Introscope for monitoring the entire infrastructure.
  • Extensively experienced in creating and executing Windows batch files and UNIX shell scripts.
  • Expert in scheduling jobs on both Windows and UNIX systems through Task Scheduler and cron.
  • Experienced in virtualization technologies such as Microsoft Hyper-V, Oracle VMware, and VMware Workstation.
  • Experienced in using utilities such as RDP, PuTTY, Process Explorer, PoolMon, and Fiddler.
  • Experienced in identifying memory leak, Java heap, and garbage collection issues.
  • Experienced in using monitoring tools such as Task Manager, Process Explorer, Performance Monitor, Resource Monitor, and Data Warehouse Monitor on Windows systems; JConsole to monitor Java-based applications; and System Monitor and top on UNIX systems.
  • Experience with multiple network services and protocols such as: DNS, DHCP, WINS, NAT, TCP/IP, FTP, SMTP, POP3 and IMAP.
  • Experienced in monitoring database performance using SQL Diagnostic manager.
  • Experienced in collecting and analyzing database performance using SQL Profiler and Dynamic Management Views (DMVs) in MS SQL Server.
  • Experienced with load testing and performance testing in a network environment.

TECHNICAL SKILLS

  • JMeter
  • LoadRunner
  • Performance Center
  • Quality Center
  • QTP
  • Atlassian JIRA
  • Rational ClearQuest
  • Windows 7, XP, Windows 2000, and Unix
  • Oracle, DB2, and SQL Server
  • Cognos 7 Series, Cognos 8, Cognos Connection, ReportNet, Report Studio, and Query Studio
  • SSRS
  • Informatica PowerCenter 8.1
  • Fiddler
  • Badboy
  • Wireshark
  • MS PowerPoint, Word, Excel, and Outlook

PROFESSIONAL EXPERIENCE

Senior Automation Test Engineer

Confidential, Dulles, VA

Responsibilities:

  • Conducted meetings and walkthroughs with users, developers, and business analysts to gather information about business processes.
  • Designed, developed, and implemented test plans, scripts, and tools using the detailed business requirements document provided by the business analysts.
  • Developed testing processes and approaches according to standards.
  • Extensively used the Web (HTTP/HTML) and Web Services protocols in LoadRunner.
  • Involved in generating scripts and handling correlation, parameterization, transaction points, debugging, and various other functions using LoadRunner VuGen.
  • Performed manual correlation to identify and handle dynamic data in Vuser scripts (an illustrative sketch follows this list).
  • Responsible for developing and managing test data and the test environment.
  • Used Performance Center to create scenarios and set up monitors to track load generators for performance testing.
  • Involved in scalability and bottleneck testing of application.
  • Designed manual and goal-oriented scenarios using the LoadRunner Controller module.
  • Responsible for executing scenarios and analyzing results.
  • Ramped up virtual users in load tests to reach a peak load of 1,200 concurrent users.
  • Extensively used Controller to perform and execute Baseline, Ramp-Up, Endurance, and Stress test cycles.
  • Used Wily Introscope for identifying response time and memory leaks during load tests.
  • Performed in-depth analysis to isolate points of failure in the application
  • Analyzed Throughput, Hits per Second, Transactions per Second, and Rendezvous graphs using the LoadRunner Analysis tool.
  • Involved in analyzing Runtime, System Resources and Transactions Graphs.
  • Reported average and 90th-percentile response times to the development team.
  • Reported and tracked issues/defects using Quality Center, provided management with various test metrics, and generated Quality Center reports and graphs.
  • Actively participated in Defect Review meetings on a daily basis.
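
A minimal, illustrative VuGen (C) sketch of the correlation, parameterization, and transaction handling described above; the host name, boundary strings, and parameter names are hypothetical placeholders, not values from the actual project.

    Action()
    {
        // Hypothetical boundary-based correlation: capture the dynamic
        // session id returned by the server into {SessionToken}.
        web_reg_save_param("SessionToken",
                           "LB=sessionId=",
                           "RB=&",
                           LAST);

        lr_start_transaction("Login");

        // {UserName} and {Password} are drawn from a VuGen parameter file.
        web_submit_data("login",
                        "Action=http://example.internal/app/login",
                        "Method=POST",
                        "Mode=HTML",
                        ITEMDATA,
                        "Name=user", "Value={UserName}", ENDITEM,
                        "Name=pass", "Value={Password}", ENDITEM,
                        LAST);

        lr_end_transaction("Login", LR_AUTO);

        // Reuse the correlated value in a subsequent request.
        web_url("dashboard",
                "URL=http://example.internal/app/dashboard?sessionId={SessionToken}",
                "Mode=HTML",
                LAST);

        return 0;
    }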

Environment: LoadRunner, Quality Center, J2EE, Apache Tomcat, Apache HTTP Server, UDB/DB2, SQL Developer, Linux, Windows 7/XP, Tibco BPM, JDE, SiteScope

Senior QA Analyst/Performance Tester

Confidential, Chicago, IL

Responsibilities:

  • Developed test scripts for Performance Testing, Stress Testing using LoadRunner.
  • Executed transactions manually and verified back-end functionality and data.
  • Wrote PL/SQL statements to extract and update data in the tables.
  • Analyzed key scenarios to identify crucial functional areas of the application, and created and executed LoadRunner test scripts against those key functional areas for stress and performance testing.
  • Created VUser Scripts in Virtual User Generator (VuGen) as per the business requirements.
  • Created Scenarios using LoadRunner Controller by using different techniques such as Schedule by Scenario, Schedule by Group, Ramp Up, and Ramp Down.
  • Analyzed the number of hits per second, average throughput, and response times of individual transactions over the specified duration using LoadRunner.
  • Inserted transaction points, rendezvous points and comments into the LoadRunner Vuser scripts to understand load conditions better.
  • Created an automated job to run the LoadRunner script using Windows Task Scheduler.
  • Examined system behavior and performance to expose the application bottlenecks using Load Runner while generating actual load.
  • Collected performance data from the application and web servers using the Typeperf and Perfmon utilities and analyzed the resulting log files.
  • Measured UNIX system performance under load using the vmstat, sar, topas, and nmon utilities and System Monitor.
  • Performed server hardening tests and server health checks with open source tools such as JMeter and BadBoy.
  • Performed FTP server testing using JMeter.
  • Created and executed batch file in Windows and shell script in UNIX environment.
  • Performed back-end database testing with ODBC and JDBC connections using JMeter.
  • Created baseline performance metrics in the UAT environment using LoadRunner and JMeter.
  • Worked with virtualization technologies such as Microsoft Hyper-V, Oracle VMware, and VMware Workstation.
  • Generated, analyzed, and published LoadRunner test results and documented the testing process.

Environment: HP Quality Center, Team Foundation Server, LoadRunner, JMeter, Badboy, Windows, MS SQL Server, SQL Management Studio, Linux, MS Office.

QA Analyst

Confidential, Cincinnati, OH

Responsibilities:

  • Generated Vuser scripts in LoadRunner for performance and load testing of the application under various loads.
  • Created an automated job to run the LoadRunner script using Windows Task Scheduler.
  • Collected and analyzed database performance using SQL Profiler, Activity Monitor, and Dynamic Management Views (DMVs) in MS SQL Server, and the Statspack and TKProf utilities in Oracle.
  • Created and executed batch file in Windows and shell script in UNIX environment.
  • Performed back-end database testing with ODBC and JDBC connections using JMeter.
  • Performed server monitoring using tools such as Task Manager, Process Explorer, Performance Monitor, Resource Monitor, and Data Warehouse Monitor on Windows systems; JConsole to monitor Java-based applications; and System Monitor and topas on UNIX systems.
  • Created baseline performance metrics in the UAT environment using LoadRunner and JMeter.
  • Generated, analyzed, and published LoadRunner test results and documented the testing process.
  • Parameterized dynamic data with baseline test data for each load and developed recovery scenarios for smooth runs when scheduling over 100 scripts in Test Lab.
  • Participated in various meetings and discussed Enhancements and Modification Request issues.

Environment: HP Quality Center, QTP, LoadRunner, Jmeter, Badboy, Oracle, Unix-Aix, Windows, MS SQL Server, SQL Management Studio, MS Office.

QA Analyst

Confidential, Schaumburg, IL

Responsibilities:

  • Responsible for gathering and analyzing performance test requirements.
  • Designing Performance project timelines based on given Production go-live dates of the applications.
  • Preparing Performance Test Plans, Test cases and Click stream diagrams.
  • Designing and developing automated scripts using HP LoadRunner based on business use cases of the application, extensively using the Web (HTTP/HTML) and other protocols.
  • Developing test harnesses in VuGen; customizing the test scripts for correlation and parameterization and setting up the run-time settings.
  • Designing Performance test Scenarios using LoadRunner to evaluate the performance of the applications developed in Multi-tier architecture
  • Executing different kinds of Performance tests such as Smoke test(s), Capacity test(s), and Soak or endurance test(s).
  • Monitoring Throughput, Hits per Second, and other key Online Monitor graphs during a Performance test.
  • Identifying Performance bottlenecks in the system from the Performance test(s) executed on an application.
  • Detecting Memory leaks in the application from a Soak or Endurance tests.
  • Analyzing performance test results and sharing quick findings as preliminary results with the stakeholders.
  • Using open source tools like Fiddler and Guidewire Profiler logs for UI analysis.
  • Generating Oracle AWR reports with the help of the DBA team and sharing them with developers for SQL tuning opportunities.
  • Helping application teams tune the application server configuration for JVM heap size.
  • Logging performance defects in Quality Center and assigning them to the application teams.
  • Organizing daily/weekly status calls with application teams and preparing meeting minutes and action items.
  • Preparing a summary report of performance test findings, conducting a walkthrough of the report with stakeholders, and sending the final version to them for review and sign-off.
  • Attending a daily Scrum call with the on-site and offshore Performance Engineering teams and reporting status to management.
  • Attending weekly Performance test engineering meetings for better performance testing process.

Environment: LoadRunner (VuGen, Controller, Analysis), IBM WebSphere, Optier, SoapUI Pro, Fiddler, Windows, HTML, XML, J2EE, Oracle, and SQL.

QA Tester

Confidential, Miami, FL

Responsibilities:

  • Provided performance testing estimates at the very beginning of the project initiation stage.
  • Analyzed the sizing and system architecture.
  • Prepared performance test strategy for component performance testing and E2E integration performance testing. Identified the key performance parameters to be measured during testing.
  • Prepared a workload model projecting system capacity for the next five years and got it approved by key stakeholders such as architects and the business.
  • Installed LoadRunner components, established the connection between the Controller and load generators, and ensured the required components were up and running.
  • Created shell scripts for collecting CPU usage, available memory, redo, and I/O stats from the web and DB servers.
  • Closely worked with DB team in fixing the long running SQL queries.
  • Created custom C scripts for capturing output values and used extensive manual correlation (a minimal sketch follows this list).
  • Created scripts for generating test data and increasing the DB volume
  • Responsible for reporting results to the customer and getting sign-off for key deliverables such as the test strategy, workload model, and closure report.
  • Reviewed test scripts and various reports prepared by test engineers.
  • Executed Load testing, stress testing and endurance testing for 16 hrs.
  • Developed the test plan for tasks and obtained stakeholder support for the plan.
  • Documented, implemented, monitored, and enforced all testing processes per the standards defined by the test plan.
  • Onsite liaison to offshore QA and development team for clarification of issues and identification of gaps.
  • Organized status meetings and sent status reports (daily, weekly, etc.) to the client.
  • Implemented best practices in project planning, execution, management, and reusable test scripts.
  • Performed root cause analysis of issues and provided recommendations for issue prevention.
  • Carried out problem management activities and assisted with performance management.
  • Responsible for monthly effort confirmation and invoice generation for the project.
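
A minimal, illustrative sketch of the kind of custom C code used inside a VuGen action to capture and reuse an output value; the parameter names ({OrderResponse}, {OrderId}) and the log message are hypothetical placeholders.

    Action()
    {
        char *raw_value;
        char order_id[64] = "";

        // {OrderResponse} is assumed to have been captured earlier with
        // web_reg_save_param(); evaluate it into a C string.
        raw_value = lr_eval_string("{OrderResponse}");

        // Trim the value and save it into a new parameter so later
        // requests can reference it as {OrderId}.
        strncpy(order_id, raw_value, sizeof(order_id) - 1);
        lr_save_string(order_id, "OrderId");

        // Log the captured value to the replay log for debugging.
        lr_output_message("Captured order id: %s", lr_eval_string("{OrderId}"));

        return 0;
    }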

Environment: LoadRunner, RAC, IBM DB2, Siebel 8.2, JDE 8.x integration, OBI Analytics.

Systems Engineer

Confidential

Responsibilities:

  • Worked closely with Production Managers, Technical Managers and Business Managers in planning, scheduling, developing, and executing performance tests.
  • Developed performance test plans for new application releases and coordinated the performance engineering team through completion of performance testing projects.
  • Acted as the LoadRunner expert; met with SDC engineers to determine performance requirements and goals, and determined test strategies based on requirements and architecture.
  • Performed Performance testing using LoadRunner and developed test scripts and scenarios.
  • Created test script for the application using the web protocol in the VuGen component of LoadRunner.
  • Enhanced the scripts using the Vuser Generator and performed parameterization and correlation to meet the requirements.
  • Created various checkpoints in the scripts using LoadRunner (see the sketch after this list).
  • Inserted start and end transaction points in the scripts.
  • Created test scenarios for running the testing using LoadRunner.
  • Inserted rendezvous points to emulate the behavior of the browser under heavy load conditions.
  • Profiled slow-performing areas of the application and system resources, and identified bottlenecks and opportunities for performance improvements.
  • Executed performance test scenarios and analyzed results; identified functionality and performance issues, including deadlock conditions, database connectivity problems, and system crashes under load.
  • Supported testing efforts, project transactions, test network setup, and functional, integration, regression, and automation testing.
  • Responsible for writing Test Plans for Internal and Integration Test environments
  • Performed High Level Design document reviews; participated in feature design review meetings and presented test case reviews, strategy, and feature functionality.
  • Responsible for load tests using LoadRunner, creating scenarios for performance testing of the application by simulating real-time user load.
  • Performed load and performance testing using LoadRunner to validate system response times for designated transactions and business functions.
  • Configured Web/Application/Database server monitoring setup using LoadRunner Controller.
  • Generated, analyzed, and interpreted reports after performance test execution.
  • Analyzed results for transaction response time, transactions under load, transaction summary by user, hits per second, and throughput.
  • Activated and configured monitors and added the desired performance counters to the graphs.
  • Used the Transactions and Web Resource monitors to pinpoint bottlenecks.
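
A minimal, illustrative VuGen (C) sketch of the checkpoints and rendezvous points described above; the rendezvous, transaction, and URL names are hypothetical placeholders.

    Action()
    {
        // Rendezvous point: the Controller holds Vusers here and releases
        // them together to generate a synchronized load spike.
        lr_rendezvous("submit_order");

        // Text checkpoint registered before the request; the step fails
        // if the expected confirmation string is missing.
        web_reg_find("Text=Order Confirmed",
                     "Fail=NotFound",
                     LAST);

        lr_start_transaction("Submit_Order");

        web_url("order_status",
                "URL=http://example.internal/orders/status",
                "Mode=HTML",
                LAST);

        lr_end_transaction("Submit_Order", LR_AUTO);

        return 0;
    }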

Environment: Unix, LoadRunner, Shell Scripting, Oracle, HTML, Quality Center, J2EE, C#, .Net, Java Servlets, JSP, JavaScript, WebSphere, XML, Perl.
