
Quality Assurance & Testing Resume


State of New Jersey

Summary
15+ years of experience as a software professional, including 7 years as a QA Manager, 2 years as a Business Analyst, and 3 years as a QA Lead managing a team of talented individuals. Substantial testing experience with web-based, desktop, distributed, and server-side (non-UI) applications on .NET and Java EE platforms; well versed in end-to-end testing best practices and conversant with test and QA principles, methods, disciplines, methodologies, and technologies.

Technical Skills

Manual Testing, Quality Center, LoadRunner, WinRunner, QTP, IBM Rational Tools, C, TSYS card processor, ACAPS mainframe, Goldmine CRM application, MS Project, VersionOne, Agile Project Management, Rational Unified Process, Quality Process Improvement, Quality Audits, Resource Planning, Management and Development

Education

MBA, Management of Information Systems

Key Responsibilities/Accomplishments/Achievements

The main areas of my experience as a Sr. QA Manager across the organizations I have worked for are given below. Responsibilities cover my experience in Quality Assurance, Quality Control, and Testing practice; each specific experience or responsibility applies to the particular organization in which it was gained.

Supervisor and Technical Guidance

  • Provided necessary definition, development and deployment of the Organization’s product quality assurance strategy, addressing all phases of product development.
  • Directly supervised Test Leads, Quality Assurance/Quality Control coordinators and Testers including the preparation and delivery of staff performance evaluations and career development activities.
  • Managed department and overall expectations pertaining to setting accurate schedules, costs and resources.
  • Ensured delivery against QA department goals and objectives, i.e. meeting commitments and coordinating overall quality assurance schedule.

Process Management

  • Developed the Quality Management Plan describing the Quality Policy and Practices within the organization.
  • Maintained product consistency throughout the product cycle, including the design, define, and build phases, through quality checkpoints/Quality Gates and testing.
  • Developed and managed quality assurance metrics for performance improvement of all processes.
  • Developed various Checklists for Internal Review of all Deliverables.
  • Implemented ongoing quality improvement processes working with interdepartmental teams.

Project Management

  • Managed the planning and execution of product testing efforts, including all associated resources to meet committed delivery dates.
  • Provided effective communication regarding issues, objectives, initiatives and performance to plan.
  • Worked regularly with Project Managers and Delivery Managers to develop project schedules and resource allocation models for QA-related and other activities such as software deployment, environments, test data, and Internal Audit results, including handling of Non-Conformances.

Organizational Liaison

  • Manage all Software Quality Assurance and Testing issues with related groups such as the Functional Team, Development, Database, Implementation, and Training (wherever applicable)

Quality Management and Testing

  • Assure the viability, functionality and effectiveness of essential tools and metrics for quality evaluation.
  • Anticipate program release problems and take corrective action, escalating as needed, to resolve and achieve effective results
  • Approve Test Strategy, Test Plans, Test Cases and all other Test related documentation
  • Coordinate with Test Lead for availability of Test Data, Test Environment, all Entrance and Exit Criteria.
  • Conduct Test Readiness Reviews.
  • Coordinate and support User Acceptance Testing effort.

Deployment and Delivery

  • Coordinate the delivery of software to Testing, UAT, and Sandbox environments (wherever applicable)
  • Responsible for creating checklists for software deployment.

Production Support

  • Troubleshooting and high-tier support.

Documentation

  • Established and maintained policy for documentation of all project deliverables.
  • Review all documentation before it is sent out to Stakeholders (wherever applicable).

Professional Experience

Confidential, July 2010 - Present

Sr. Test Manager

CAI has been responsible for managing the entire QA and Testing practice for IBC (Independence Blue Cross) in Philadelphia, working in accordance with the e-PMO office for all Front, Middle, and Back Office projects. Multiple projects are handled simultaneously under the supervision of onshore and offshore Test Leads and Testers for the two major organizations, Independence Blue Cross and AmeriHealth. The interfaces involved include Provider Portals, Member Portals, FutureScripts and other mail order organizations, and various sub-groups of benefits.

I am responsible for all Front Office projects within the QA organization, heading a team of approximately 8 Leads and more than 35 testers, onshore and offshore.

Key Functions and Responsibilities

  • Overseeing and Managing Testing Delivery Services for all front office projects
  • Managing every aspect of QA tasks within the SDLC process
  • Resource Planning and allocation for Test Leads and Testers to all projects
  • Work as escalation point on any Quality issues
  • Approve Test Models, Test Execution Certification
  • Work with Business for Acceptance Test Sign-Off procedure
  • Determine Quality Metrics
  • Performance Reviews and hiring for testing resources
  • Establish process improvements based on CAI best practices and standards
  • Mentor QA associates for accomplishing respective tasks/assignments
  • Impart knowledge and training in the CAI VeriCenter to newly hired testing professionals on QA practices and procedures
  • Guide the Internal Audit procedure and work closely with the Audit group from IBC
  • Work closely with Infrastructure group to determine scope and feasibility of Automation of Regression efforts
  • Approve Performance Testing results and work with Business to meet the SLAs.
  • All other administrative duties and responsibilities within the team
Confidential, Sept 2007 – July 2010

Manager - Quality Assurance & Testing, State of New Jersey - Motor Vehicles Commission

I worked on the State of NJ MVC's prestigious MATRX (Motor Vehicles Automatic Transaction System) project, a total revamp and upgrade of the Motor Vehicle transaction system from the old legacy platform to a web-based system built on a Java EE architecture.

I worked as the Quality Assurance and Testing Manager, heading a group of 22 people across 2 locations in the US and responsible for the overall Quality System implementation and maintenance on the project. I was in charge of all Quality Assurance, Quality Control, and Testing activities. We started this project with 6 managers and grew to about 105 in strength, including full-time employees and consultants. My key accomplishment on this large project was building the total quality system within the organization from scratch. I had to define, benchmark, standardize, and lay down measurement criteria for processes across the whole SDLC, including hiring an appropriate workforce comprising Functional Test Leads, QA/QC Coordinators, Manual and Automated Testers, and Performance Analysts.

Key Functions and Responsibilities

  • Develop and implement comprehensive short term and long term QA standards, strategies, and methodologies
  • Develop and execute at a team level comprehensive system and component test packages that emulate customer workflows and usage by analyzing customer patterns, product requirements, and product specifications from both technical and business perspectives
  • Develop an Overall Quality Management Plan defining the Quality System Processes and Procedures
  • Defined Matrices for all the processes and deliverables
  • Participate in the SDLC processes and procedures used to develop and support the product
  • Define and implement quality strategy and processes.
  • Act as a liaison between Development and all customers (both internal and external) on quality related topics
  • Co-ordinate frequently with all levels of Management – EDS, Oversight Vendors and State of NJ
  • Manage and report on the quality of products both released and under development
  • Mentor and educate employees on QA processes, procedures, and documentation
  • Provide project and personnel management to the QA team
  • Manage Quality Assurance Audit procedure – both Internal and External
  • Co-ordinate Non-Conformances across the SDLC phases
  • Monitor Corrective Action against the Non-Conformances and organize Internal Audits
  • Developed Checklists for All Audit Types – Process Audits, Product/Deliverable Audits and Stage-Gate Quality Audits
  • Define, collect, and report on quality metrics
  • Drive improvement activities based on data from quality metrics
  • Lead development and execution of plans to achieve the quality strategy and processes.
  • Lead development of test plans, cases, and scripts based on business requirements and processes, in line with defined workflows and functional requirements.
  • Co-ordinate functional, regression, and end-to-end system test cases.
  • Assist with the maintenance and evolution of system requirements documentation.
  • Manage documentation and maintain all system defects.
  • Provide written reports detailing the results of testing for analysis and correction.
  • Provide Quality Management Reports based on Quality Activities – Audits, Reviews, Testing
  • Manage resources for both Manual and Automated Testing
  • The Project Management Methodology used was Rational Unified Process (RUP)
  • Managed all Rational tools such as RequisitePro, Rational Test Manager, and Rational ClearQuest
As a QA and Testing Manager, I was responsible for all types of White Box and Black Box Testing, Performance Testing, and Functional Testing, and also supported UAT. I coordinated with all stakeholders and oversight vendors for acceptance of all deliverables and process checkpoints. A major achievement was the development of a detailed Quality Management Plan; testing of the Foundation Layer and Business Solutions was also a major success, and implementation went smoothly.

Confidential, July 2005 – Sept 2007

QA Manager - Projects

The Company Corporation (TCC), a division of CSC, is responsible for incorporating small businesses and providing other related services. The IT department in TCC maintains and develops 4 basic websites – Incorporate.com, LLC.com, CorpAmerica.com and Instacorp.com – along with a main Admin site that maintains all Admin functions. The basic framework is client-server, connected to an Oracle database and a mainframe application. I also have experience with the Goldmine CRM application, which was launched after an 8-month effort to introduce it into our current infrastructure, a huge undertaking this year.

Responsibilities and Functions

  • Reporting to IT Director - TCC
  • Follow the Agile Project Management Process – Scrum
  • Attend daily Scrum meetings, Backlog Estimation meetings, Sprint Planning meeting, and Sprint Retrospective meetings.
  • Analyze Business Requirements (Stories) and coordinate with the Stakeholders and Business Development Team
  • Coordinate with Development Team to translate these stories into logical language – through UMLs.
  • Create FIT test cases for Unit/Integration tests for the developers for creating fixtures.
  • Enhance the stories during estimation to create tasks and tests
  • Risk assessment and mitigation strategy with the Project team
  • Coordinate with the Tester to understand the Test cases
  • Schedule testing timelines, allocate resources, and monitor environments
  • Monitor progress of testing on a daily basis
  • Co-ordinate performance testing with the infrastructure team
  • Manage Multiple Projects and Resources
  • Conduct and participate in team building exercises
  • Hold regular team meetings for projects and team building
  • Manage onshore and offshore resources and their work allocations
  • Automate repetitive tests using LoadRunner 9.0
  • Co-ordinate with Offshore Development Team for all TCC Admin website functions
  • Represent TCC in the Global CSC QA management team for process improvement
  • Use of Quality Center for reporting through Requirement, and Defect modules
  • Graphical defect trend analysis for regression suite using HP Quality Center.
  • Defect aging analysis and Root Cause analysis to improve overall quality management systems within the organization.
  • Maintain, Review and approve process documentation – wherever applicable

As a QA Project Manager, apart from onshore responsibilities, I was responsible for offshore development with 3 developers and 5 testers. One of my major achievements was to organize and set up a Total Quality Management system in line with the rest of the organization's locations in terms of implementing and maintaining QA best practices; within a year and a half, all offshore facilities were following a uniform quality system across the board. One of my regular tasks was to plan and obtain Admin stories/requirements from stakeholders and create tasks for the offshore team. The offshore team was in Hyderabad, India, requiring travel there to oversee the progress of important development work.

Onshore duties included working on new stories with the Business Development group for upcoming sprints, analyzing and allocating priorities based on stakeholder allocation and ease of development in the current environment or framework. I then estimated these stories with the Development team, creating tasks and translating the stories into a more technical format wherever necessary, and worked with testers to understand the test cases/scenarios they had prepared to ensure adequate testing coverage. In every sprint I took over some of the more important testing tasks, which might involve database upgrades, Quartz jobs for database feeds to various applications, etc.

A major achievement was a process improvement around the automated test tooling, which reduced defects raised from existing functions by almost 85% over 3 months.

I also worked on a new reporting tool, which creates reports for the Call Center Sales and Service groupings covering web and phone orders and conversion rates for the CSRs.

Confidential, Jun 2001 – June 2005


QA Manager /Sr. QA Lead

I worked on projects involving major system changes within the organizational architecture as well as Card Services web portal enhancements. I also worked on major implementation projects such as the First USA rehost to the Bank One architecture, the Collateral Asset Management System, the Bank One redesign, Retail and Credit Card Integration, the I3 (TSYS) implementation, and Online Rewards and Statements.

Responsibilities

  • Handle and manage all Merger Integration projects
  • Create, manage, oversee all Project Deliverables – Test Plan, Test Cases, Test Reports
  • Implement strict Quality Control practices through regular Quality Audits of QA Methodologies, practices and project deliverables to maintain uniformity across the Lines of Business
  • Extensive use of Quality Center as Central Repository of QA Deliverables
  • Expert understanding in Quality Center Modules – Requirement, Test Plan, Test Lab and Defect Modules
  • Reporting Quality Metrics, Requirement Coverage, Execution Reports and Defect Status reports from Quality Center
  • Ensure critical paths and processes are managed efficiently and mitigation strategies are documented
  • Ensure adherence to the laid quality systems methodologies and practices
  • Follow JP Morgan Chase Best Practices throughout my team.
  • Reporting and Monitoring Quality Center Graphs for Trend analysis and continuous process improvement.
  • Manage Onshore and Offshore resources
  • Maintain Enterprise product integrity throughout the Lifecycle
  • Scheduling and maintaining project QA timelines, resources and environments

Confidential, DE, Sep 1999 – Jun 2001


SQM Consultant


Analyzed and tested a web-based application called Just-in-Time, which acts as a concierge service for all First USA card members. I was also involved in testing another web application called At Your Request, an additional web-based service for FUSA customers. A further web-based application called Reminder Service, an extension of the existing Just-in-Time integrated with Card Member Services, was also planned and tested before launch.

Responsibilities

  • Analyze and Study Business and Functional Requirements
  • Develop a Test outline
  • Meet with the Developers, DBA’s, Webmasters to discuss various Test related issues
  • Co-ordinate with the various Teams to set up Test Environment
  • Develop and Document Test Plan for the whole Test project
  • Develop Test Cases – functional, performance, exceptions, UAT based on various functionalities
  • Used Test Director as the Central Repository to store Test Data
  • Executed all the Test cases and recorded defects in Test Director
  • Generated Test Reports based on Document Generator
  • Performed Y2K Testing for the applications based on different critical dates
  • Performed Load Testing with web virtual users, using LoadRunner
  • Automated existing manual test scripts in WinRunner to perform Regression Testing
  • Reported outstanding issues in regular status reports and meetings and actively participated in resolving them

The majority of the above projects were carried out in a client-server environment on Windows 98 with IE 4.1, but the applications were also tested on multiple platforms and browsers such as Windows 95/NT with IE 5.0, Netscape 4.x, AOL, etc. The database used was Oracle, and we also used Access to view the data.

Confidential, NC, Jun 1999 - Aug 1999

QA Analyst


Analyzed and tested a web-based application called eBusPlans that displays data from a remote database.
The key functional areas were:

  • The project was based on the requirements for testing a web-based application
  • Analyzed requirements and designed a system document to understand what functionality needed to be tested
  • Developed System Test Outline
  • Developed Test Plan
  • Developed and executed Test cases, and determined whether deviations from the Test cases were defects; this was done through CMVC, IBM's bug tracking system
  • Kept continuous track of test status, including % completion and status of defects
  • Managed the whole test effort independently to complete the project on schedule while meeting all exit criteria
  • Specifically, the System Test was developed and executed
  • Placed emphasis on Load Testing by creating the test bed
  • Conceived and developed new Tests within scope, such as Server Simulation Tests
  • Documented final results

This project was carried out in a client-server environment, on Windows 95 with the Netscape 4.04 browser and an AIX server hosting the database.
