
Sr. QA Analyst Resume


VA

Professional Accomplishments:
ISTQB Certified Tester
Microsoft Certified Professional
Advanced Diploma in Computer Applications

Academic Qualifications:
Master of Technology in IT (M.Tech, IT)
Bachelor of Engineering in Computer Science (BE, CSE)
     
Objective:
To obtain an active and dynamic position in an innovative development environment as a Software QA & Test Lead Engineer, where I can apply my expertise to the growth of both the organization and myself.

Professional Summary:

Over 10 years of QA experience in the financial, wireless, and pharmaceutical industries, developing test strategy (test plans, test scenarios, test scripts, testing methodology, and test reports) as well as UAT and automation testing of various client/server and web applications. Experienced in change management and configuration process setup. Specialized in test automation using QTP, WinRunner, LoadRunner, and TestDirector.

  • Proactive involvement in test strategies, planning and execution following the SDLC to ensure that software, systems, and services meet minimum company standards against defined functional and non-functional requirements.
  • Involved in offshore and onsite coordination with development and testing teams, and in setting up the testing process tailored to client and business requirements.
  • Experience in managing test teams, test effort and size estimation, creating test strategies, methodologies, and test plans, and defining the entry and exit criteria for each test phase in the testing life cycle.
  • Good at defining and implementing test metrics for measuring testing quality, productivity, and test coverage, and at preparing Test Summary Reports.
  • Responsible for deliverables and meeting deadlines, ensuring the team meets requirements within the specified time.
  • Experience working with multiple operating systems, including Unix and Linux environments, with shell scripting.
  • Experience with testing at different levels (unit, functional, integration, system, load and performance testing).
  • Experience using automation tools such as QuickTest Professional, WinRunner, TestDirector, and LoadRunner, and in designing and implementing automation frameworks.
  • Conducted training classes throughout the company on automation (QTP 8.2) and manual testing skills.
  • FAR (Functional Area Representative) of CMMi level 5 appraisal process.
  • Strong in Analyzing Business specifications and Developing Test Plans, Test Scripts and Test Cases and executing them.
  • Experience in data migration, pre-conversion baselining, and post-conversion testing.
  • Extensive experience in ad-hoc, functional, regression, user acceptance, system, GUI, load, stress, performance, integration, and alpha/beta testing; knowledge of Section 508 accessibility testing.
  • Worked on financial, banking, insurance, wireless, and billing applications. Experience in wireless technology (RFID) and Java-based web services.
  • Good understanding of Software Development Life Cycle (SDLC), Quality Assurance Life Cycle (QALC), and QA Methodologies
  • Extensively used Rational Tools (ClearQuest, ClearCase, DOORS) and Mercury Tools (WinRunner, TestDirector), PVCS Tracker, and JIRA.
  • Managed and updated progress toward team objectives and assisted the team with root cause analysis and problem solving.
  • Mentored and supported the team to maintain a positive work environment.
  • Reviewed tasks and documents and conducted walkthroughs according to the coverage analysis to manage issues to closure.
  • Hands-on experience in database testing and executing SQL queries.
  • Extensive experience in Black Box testing techniques.
  • Proficiency in testing GUI applications. Significant experience in Window level Integration Testing, System Testing and Compatibility Testing.
  • Expertise in tracking bugs with several widely used bug tracking tools, and extensive experience implementing the bug life cycle.
  • Good knowledge of the software testing process and its implementation. Experience preparing test strategy, test methodology, test plans, test cases, and test summaries.
  • Excellent communication, documentation, and team problem-solving skills, with analytical and programming ability in a high-speed, quality-conscious, multitasking environment.

 

Technical Profile:
 

  • Languages: C++, Java, J2EE, .NET, COBOL, HTML, DHTML, JavaScript, SQL, PL/SQL, Visual Basic, XML, ASP
  • Operating Systems:  Unix, Windows XP/2000/NT/98/95, DOS, OS/390
  • Methodologies: Object Oriented Methodologies
  • Database: DB2, Oracle 10g, SQL Server, MS Access; Clients: TOAD, RapidSQL 7.4
  • Internet Technologies:  HTML 4.0, Java
  • Web/Application Server: IBM Web Sphere, MS-Internet Information Server 4.0, Apache
  • Software Automation Tools: WinRunner, LoadRunner, QTP 10.1, Selenium
  • Rational Tools: Rational Clear Quest, Rational Requisite Pro, Rational Test Manager, Rational Clear Case
  • Test Management Tool: Test Director and Quality Center 10.0
  • Requirements Management Tool: Telelogic DOORS 8.1
  • Protocols: HTTP, TCP/IP

Confidential,
VA                                                                                                                                                                    March 2007 – Present
Sr. QA Analyst/Risk Specialist

Servicer and Investor Reporting (Cash & MBS Release, Consolidation, Servicer Release, and MSR) is a re-engineering effort to transition several major legacy systems supporting the servicing and investor reporting business to a new architecture and platform that will better meet the current and future needs of the business. The Servicer and Investor Reporting portion of this project covers the processing of cash loans in portfolio, portfolio-generated mortgage-backed securities (PFP MBS), and SWAP MBS (lender-formed pools). Portions of this architecture may be replaced in subsequent releases as needed to support both business and technical requirements.

Responsibilities:

• Evaluated business requirement and technical specification documents to craft test strategies and LOE assessments accounting for durations, constraints, assumptions, and other factors needed to accomplish test objectives.
• Reviewed and validated design specifications to ensure requirements traceability and to determine application/component functional readiness requirements.
• Created and maintained test plans defining the test objectives, methods, and tools to be employed for the assigned project.
• Created and contributed to detailed project schedules, conveying clearly defined milestones for test preparation, execution, and compliance reporting activities and deliverables. Executed assigned tasks in sync with the test schedule.
• Involved in the creation, distribution, and walkthrough of software test cases, scripts, and other documents surrounding testing activities, ensuring that all testing activities and deliverables were conducted/produced in compliance with company standards. This included:
            Organization, coordination, and participation in test plan/case reviews
            Design and preparation of test data (HP Quality Center 10.0)
            Progress reporting against assigned tasks
            Raising, recording, and retesting defects (Rational ClearQuest 2003)
• Influenced the use of automation tools to manage problems, develop test cases, execute regression and load testing, and facilitate the creation of supporting documents.
• Communicated with the development team and created periodic status reports. Involved in extensive testing of various scenarios along with the implementation team; resolved issues and problems.
• Participated in status and review meetings and served as an integral member of the QA test team, including testing other features/products whenever necessary.
• Designed test scenarios so that achievement of the functional requirements is verified under both positive and negative conditions.
• Conducted duration, stress, and baseline tests to verify that new or upgraded applications meet specified performance requirements.
• Independently developed LoadRunner test scripts according to test specifications/requirements.
• Verified and validated data from the backend using complex SQL joins.
• Executed batches and verified the data using UNIX shell script commands.
• Involved in writing complex SQL queries using TOAD to extract data from the Oracle database for backend testing.
• Through backend testing, verified that the integration between the application and the database works as expected, and that changes made in the database are reflected in the application.
• Used path analysis to determine the required paths to test all possible combinations with respect to the requirements.
• Utilized VBScript extensively for parameterization and exception handling.
• Verified actual results against expected results and investigated discrepancies.
• Carried out extensive testing with the QuickTest Pro 9.1 testing tool using different test cases reflecting various real-time business situations; analyzed test results and suggested suitable corrective actions.
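Backend verification of this kind, using a join to confirm that application-reported rows actually exist in the database, can be sketched as follows. The schema and table names here are purely illustrative, not taken from the actual project, and SQLite stands in for Oracle:

```python
import sqlite3

# Hypothetical schema for illustration: a loan table and a remittance
# table, joined to confirm every reported remittance maps to a loan.
conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.executescript("""
    CREATE TABLE loan (loan_id INTEGER PRIMARY KEY, upb REAL);
    CREATE TABLE remittance (rem_id INTEGER PRIMARY KEY,
                             loan_id INTEGER, amount REAL);
    INSERT INTO loan VALUES (1, 100000.0), (2, 250000.0);
    INSERT INTO remittance VALUES (10, 1, 500.0), (11, 2, 1200.0),
                                  (12, 99, 300.0);  -- orphan row
""")

# LEFT JOIN to surface remittances whose loan is missing: a typical
# backend data-integrity check before comparing against the UI.
cur.execute("""
    SELECT r.rem_id
    FROM remittance r
    LEFT JOIN loan l ON l.loan_id = r.loan_id
    WHERE l.loan_id IS NULL
""")
orphans = [row[0] for row in cur.fetchall()]
print(orphans)  # any rows listed here are defects to raise
```

Any orphan IDs the query returns would be logged as defects against the batch that produced them.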

            Environment: Windows XP, Oracle 10.2, TOAD 9.5, Unix, LoadRunner, QuickTest Pro 10.0, Quality Center 10.0, Rational ClearQuest 2003, Rational ClearCase 2003, Microsoft Office Visio 2007, Telelogic DOORS 8.1

 

       Confidential,
MD                                                                March 2005 – Feb 2007
       Sr. Quality System Analyst/Architect

RAVE (Rating and Validation Engine) is a backend application that validates shipment information and rates it accordingly. It is the backend engine for products such as WorldShip, CampusShip, and web implementations. RAVE validates the different options provided by UPS (per the tech spec and functional document) based on the customer's choice. Once these options are validated, it rates the shipment according to the business rules.

Other applications:
NRF (New Rating Framework) is an interface between RAVE and WorldShip.
DLC (Data Lookup Component) is a component that checks HAZMAT class regulations based on origin, destination, and services, running in a .NET environment with XML input and output.
Database Update Process.

Responsibilities included:

·         Led a team as a member of the product team responsible for determining the direction of future products.
·         Performed integration and system testing as part of the testing effort on different operating systems (Solaris, Linux, and Windows).
·         Scheduled and allocated work among the team members, ensuring deliverables on time.
·         Analyzed test results and created reports for management review.
·         Involved in writing the test plans for functional testing and user acceptance testing.
·         Involved in interacting with business analysts, developers, and the client to resolve issues.
·         Developed the strategy for manual testing of the application and validated the strategy for client compliance.
·         Performed white box testing using the code provided by the developers.
·         Extensively performed manual testing to ensure the quality of the software.
·         Responsible for creating varied test data to extensively test functionality.
·         Maintained the QA test environment as part of the SDLC, enforcing SDLC-approved procedures, rules, and regulations.
·         Created SQL queries in SQL Query Analyzer for analyzing 3- and 4-point data in the SQL Server database.
·         Conducted integration testing between the customer site and the administration site to verify the data flow.
·         Wrote and modified scripts for regression testing.
·         Responsible for all phases of component testing, including scheduling test groups, authoring test specifications, and analyzing errors.
·         Updated the test scripts and test plans based on application enhancements.
·         As part of the QA team's responsibilities, administered the SQL Server and the different databases used by the QA team.
·         Developed test strategy documents that were helpful for new members of the team.
·         Developed test matrices and scenarios to run scripts for users on multiple operating systems.
·         Generated detailed bug reports, pass/fail reports, and comparison charts.
·         Performed class mapping of the objects and executed error handling and exception cases.
·         Other duties included helping/training QA members, assisting the documentation team in understanding system functionality, coordinating efforts with the development team, and reporting status to QA project managers.
·         Set up the QA SQL Server and installed SQL Express.
·         Maintained the server as admin and granted access to users.
·         Created a port on the server available to off-network users.
·         Loaded the databases and granted privileges to databases.
·         Performed QA SQL Server maintenance.
·         Specialized in HAZMAT and NMFC codes.
·         Performed specialized database testing on HAZMAT materials.
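Varied test data of the sort described above is often built around boundary values. A minimal sketch, where the field and its valid range are made-up examples rather than actual RAVE limits:

```python
# Boundary-value test data for a hypothetical rating field: given a
# valid range, produce the classic below/on/above boundary cases.
def boundary_cases(lo, hi):
    return sorted({lo - 1, lo, lo + 1, hi - 1, hi, hi + 1})

# e.g. a package-weight field valid from 1 to 150 lbs (illustrative)
cases = boundary_cases(1, 150)
print(cases)  # [0, 1, 2, 149, 150, 151]
```

The two out-of-range values (0 and 151) exercise the negative path; the rest confirm the engine accepts every in-range boundary.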

Environment: MS SQL Server 2000/SQL Express, MS Visual SourceSafe 6.0, Rational, Windows XP, Windows 2000 Server, Windows 2003 Server, Unix (Solaris), Linux, C++, C#, Merant Tracker, TeamTrack, XML

               
Confidential,

New York                                                                                                                                             April 2004 – Feb 2005

            QA Lead/Analyst/Architect
           
            The Verizon Sales and Customer Support department has a comprehensive intranet Technical Administration Center (TAC) helpdesk for the RAC (Resource Allocation Center) to troubleshoot telephone, terminal, and software maintenance. Oracle serves as the reference database storing data on telephone switches, terminals, and software PINs. This helpdesk was installed across the entire Verizon footprint (MD, DC, PA, WV, VA, NE, MA, RI, ME, and VT).

  • Performed manual as well as automated testing using the WinRunner testing tool.
  • Analyzed and developed test plans for the application under automated and manual test cases.
  • Performed manual testing procedures for the upgrade of application modules from 8.x to 9.x.
  • Analyzed and reviewed source application objects.
  • Involved in various Testing stages of Software Development Life Cycle (SDLC)
  • Involved in QC life cycle internally with walkthroughs, inspections and testing the relevant documents.
  • Executed test scripts for client and web testing and tested the entire web applications on both Internet Explorer and Netscape Navigator.
  • Involved in QC peer reviews and testing processes for validating the deliverables.
  • Performed manual testing of all the transactions involving Ticket creation, Ticket assignment, Actions taken, Tracking history and Closing the ticket.
  • Involved in various testing like System Acceptance testing, GUI testing, Integration testing, Navigation testing and Regression testing and UAT on the web application.
  • Involved in logging, recording the Defects and test management using Test Director.
  • Summarized test results in formal test analysis reports, analyzed and reported significant bugs and utilized defect tracking system to track all defects.
  • Generated automated UAT test scripts using TSL of Win Runner based on system requirements for their application.
  • Created user-defined functions in TSL and accessed them from function libraries as well as from the function generator.
  • Tested general categories like sanity, regression, functionality, performance and back-end using SQL queries and SQL Navigator.
  • Conducted UAT back-end testing by querying databases to synchronize testing databases and checked for data Integrity and proper routing based on workflow rules at each step.
  • Performed back-end testing including replication of data sets into UNIX tables and check for successful transactions.
  • Checked database to determine successful transaction of test data from the application by establishing connectivity using SQL commands.
  • Wrote shell scripts with Oracle SQL and PL/SQL to generate test data using SQL Navigator.
  • Performed regression testing for the functionality of Applications.
  • Created test scripts using Load Runner to perform load and stress tests.
  • Performed load, stress and performance testing of the module using Load Runner by creating VUsers for different scenarios and further enhanced the recorded scripts.
  • Worked as a team to set goals and achieve them.
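The LoadRunner VUser scenarios mentioned above amount to running many concurrent simulated users and collecting response times. A rough stand-in sketch using threads, where the transaction body is a placeholder rather than a real recorded script:

```python
import time
from concurrent.futures import ThreadPoolExecutor

# Stand-in for one virtual user's transaction; in LoadRunner this
# would be a recorded script hitting the application under test.
def vuser_transaction():
    start = time.perf_counter()
    time.sleep(0.01)          # placeholder for the real request
    return time.perf_counter() - start

# Run 20 virtual users concurrently and collect response times.
with ThreadPoolExecutor(max_workers=20) as pool:
    timings = list(pool.map(lambda _: vuser_transaction(), range(20)))

avg = sum(timings) / len(timings)
print(f"avg response: {avg:.3f}s over {len(timings)} vusers")
```

Enhancing the recorded scripts, as noted above, corresponds here to replacing the placeholder body with parameterized requests and checks.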

            Environment: WinRunner 7.0, LoadRunner 7.0, TestDirector 7.0, Oracle 7.1/8i, SQL Navigator, and PVCS.

 

Confidential,
CA                                                                                                                                                                                 Feb 2003 – Mar 2004
      Senior Software Test Engineer 

            Project Title:           Intel Pump Tracking Pilot (an RFID product)
            Platform:                Windows 2000 Professional
            Environment:         Java, J2EE
            Database:               MySQL
            Testing:                  Manual
            Bug Reporting:      Company's standard bug tracking tool (RAID)

            This tracking server is a web-based application in which vacuum pumps are tracked through RFID readers to know the status and history of the pumps. The application tracks the status of each item for the different service providers acting on behalf of the user. There is an Admin tool for registrations and for assigning user permissions; an ALE module where the RFID readers are initiated for the transmission of data; and a tracking server where the data is processed under predefined rules.

            Key Functions:

  • Analyzed business requirements documents and user requirements and developed a test plan based on the organization's test strategy.
  • Developed and executed test cases manually and compared actual with expected results.
  • Performed black box, functional, regression, integration, system, and acceptance testing.
  • Conducted Load Testing, Configuration Testing and Alpha / Beta Testing.
  • Integrated with development team and discussed the technical problems, reported bugs and supported the team.
  • Used Company’s standard bug format to report the bugs.
  • Involved in the implementation and customer support of the product.

 

            Environment: Windows 2000 Professional, Java, J2EE, MySQL, RAID.

 

Confidential,
LA                                                                                                                                                                      Jan 2000 – Dec 2002
       Senior QA Analyst 
           
       In Business of Kids is a web-based banking application integrated with credit unions and banks in the United States. Parents are the administrators for their registered kids, so the transactions are between kids and parents, with predefined permissions set by the parents. The application has three modules, Bank Wizard (BW), Kids Module (KM), and Parents Module (PM), and four servers (Dev, Test, Prod, QA) to be tested.
           
       Key Functions:

  • Analyzed Business Requirement documents (BRD), Software Requirement specification (SRS) and developed Test Plan.
  • Involved in setting up the Test Environment and Test Data.
  • Involved in Offshore and Onsite coordination with Development team and testing teams.
  • Developed and executed test cases manually and compared the actual with expected results.
  • Developed automation scripts using Mercury Interactive QuickTest Pro 8.2.
  • Performed Black box Testing, Ad-hoc Testing, Regression Testing, and Functionality Testing and Backend Testing.
  • Logged and tracked defects using Test Director Defect’s module.
  • Wrote SQL queries to check the data integrity of MySQL database.
  • Executed batch files for regression testing on Linux environment.
  • Integrated with development team and discussed the technical problems, reported bugs and supported the team.
  • Involved in changing the test cases for regression testing according to the new functionality of the application and executing them.
  • Participated in daily conference meetings with end-user clients to track the progress of the testing effort.
  • Involved in the implementation and customer support of the product.
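Comparing actual with expected results per test case, as in the regression runs above, reduces to a simple pass/fail report. A toy sketch with hypothetical case names and outcomes:

```python
# Minimal expected-vs-actual comparison, as done per test case during
# regression runs; the case names and values here are illustrative.
expected = {"login_valid": "welcome", "login_bad_pw": "error"}
actual   = {"login_valid": "welcome", "login_bad_pw": "locked"}

report = {case: ("PASS" if actual.get(case) == exp else "FAIL")
          for case, exp in expected.items()}
print(report)  # {'login_valid': 'PASS', 'login_bad_pw': 'FAIL'}
```

Each FAIL entry would then be logged as a defect in the Test Director Defects module.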

 

            Environment: Linux, Java, J2EE, MySQL, JIRA, PVCS Tracker, QTP 8.2
