QA Analyst / Tester Resume
Seattle, WA
SUMMARY
- Over 10 years of diverse experience in Information Technology with emphasis on the SDLC, QA/QC processes, test methodology, and the QA lifecycle
- Experience in QA validation of Data Warehouse/Data Analytics, Web, Client-Server, and mainframe systems.
- Expertise in handling QA projects from gathering requirements from business users/development teams, creating test plans and test strategies, leading the team, and reporting to senior management, through working with business users in the UAT phase, handling deployments, and performing post-deployment testing.
- Performed both Manual and automated tests using Mercury tools.
- Performed Integration testing, System testing, Black Box functionality testing, Regression testing, Back end testing, User Acceptance Testing and improvement of QA Process.
- Experience in Testing Database Applications (ORACLE 8i/9i/10g, MS SQL Server, DB2 and MS Access) using querying tools.
- Experience working in Agile / Scrum Methodology, Waterfall, CMMI and RUP.
- Experience with test management using HP ALM/Mercury Quality Center and Mercury Test Director.
- Strong analytical, written, communication, and interpersonal skills; willingness and enthusiasm to learn; and the ability to respond to problem situations and participate actively with professional peers and team members.
PROFESSIONAL EXPERIENCE
Confidential, Richmond, VA
QA Team lead
Responsibilities:
- Deployed, configured, and validated GPOs, print queues, drive mappings, and anti-virus software on Windows 7 workstations.
- Validated the different task sequences of the Windows 7 images for compliance with InfoSec/Architecture requirements on the most prevalent hardware models
- Involved in application/package compatibility validation for Windows 7
- Coordinated and conducted SIT/UAT sessions with the different LOB teams
- Provisioned virtual desktops to the SIT/UAT users via the VMware View Administrator tool
- Involved in the daily scrum, sprint planning, sprint review and retrospective meetings
- Created user stories based on the product backlog items discussed in the sprint planning meeting
- Tracked the product/release/sprint burndown charts in the daily scrums to monitor work status
- Coordinated with onsite and offshore testers on work status on a daily basis
- Weekly QA metrics reporting on test execution, test planning, defect status, requirements coverage and execution to the program audience.
- Implemented and oversaw quality assurance methodologies, processes, and procedures during all project phases.
- Managed requirements, test cases, defects, and requirements traceability metrics across the entire program using Application Lifecycle Management to ensure that requirements were fully covered by test cases.
- Served as the primary point of contact for Quality Services across all stakeholders and facilitated delivery of the Clarity implementation from requirements through end-to-end testing
- Analyzed business and system requirements and collaborated extensively with business analysts, the project manager, and UAT resources to develop a high-level master test plan
- Involved in gathering requirements for different modules in Clarity.
- Responsible for providing the Test Execution reports to the Leadership team for each of the Releases
- Tested Complex Portlets, Actuate and Business Objects Reports.
- Executed the Regression suites for all the Upgrades of Clarity
- Involved in providing design solutions for the Enhancements in Clarity Processes.
- Involved in maintenance of the Clarity Configurations - OBS Structure, lookup values, Portlet defaults and Security Groups
- Involved in testing the LDAP interface for Clarity to assign appropriate Security groups to the Users
- Provided application support and responded to queries raised by customers
- Supervised test case estimation, creation, and execution for all new functionality and interface releases.
- Created project-specific test cases for the source table extraction process, INIT files, NRCH files, clean files, load-ready files, target table record counts and validations, word counts, field-level and file-level validations, signal/dat files, and key field validations.
- Conducted verification and validation using SQL queries and Ab Initio commands based on source-to-target field mappings and the different ETL stages, as well as the archiving and purging process.
- Created test cases for Ab Initio plans, graphs, and psets (db-to-clean and create-delta) covering clean files, delta files, individual jobs, and plans, and verified the loading process and target tables.
- Ab Initio jobs were built using Plan>IT; executed individual jobs as part of system testing and executed plans as part of integration testing
- Worked on testing OTS (One Time Shot) job execution to test the previous day's files, converting serial files to multifiles.
- Designed and developed the QA automation architecture with QTP as the execution engine
- Developed test driver scripts
- Developed library files and reusable components
- Automated regression test suite execution and defect reporting to the customer
- Developed automation scripts in a data-driven framework (see the sketch after this list)
- Performed regression testing using QTP in a keyword-driven framework
- Re-engineered the automation framework for data-driven execution
- Extensively used generic function libraries to drive the test case scenarios
- Served as a functional mentor for all new resources joining the Clarity Support Team
- Designed and built a regression suite for product testing to be executed offshore per schedule
- Used Application Lifecycle Management for managing the defect flow.
- Actively participated in defect status and review meetings to resolve defects (SIT/UAT) in an efficient and timely manner.
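The data-driven regression framework referenced above was built in QTP; the sketch below shows the same pattern in plain Python for illustration only, with the test step and data file names (run_login_test, testdata.csv) invented rather than taken from the project.

```python
import csv

def run_login_test(username, password, expected):
    """Hypothetical test step; a real suite would drive the application here."""
    actual = "success" if username and password else "failure"
    return actual == expected

def run_data_driven_suite(data_file):
    """Execute one test case per row of the data file and report pass/fail."""
    results = []
    with open(data_file, newline="") as f:
        for row in csv.DictReader(f):
            passed = run_login_test(row["username"], row["password"], row["expected"])
            results.append((row["case_id"], "PASS" if passed else "FAIL"))
    for case_id, status in results:
        print(f"{case_id}: {status}")
    return results

if __name__ == "__main__":
    # Assumes a CSV with case_id, username, password, expected columns.
    run_data_driven_suite("testdata.csv")
```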
Environment: Clarity V12.1.0, Windows 2008 R2 / SP2, Windows 2003 Standard R3 Server, VMware View Administrator 5.1.1, SCCM 2007 SP2 & SCCM R2, SCOM 2007, IIS 6.0/7.0, UNIX, Ab Initio GDE 1.15.7.2, Co>Operating 2.15.4.2, Plan>IT, SQL Server 2000/2005, SQL Server Management Studio, Oracle 9/10g, Mercury Quality Center 10.0, ALM 11.5, QTP 10.0, ILM 2007, Exchange 2007, VersionOne.
Confidential, Durham, NC
QA Team lead
Responsibilities:
- Tested the backend database using SQL queries.
- Performed data validation testing after the ETL process using DataStage.
- Reviewed the ETL specifications documents and resolved the issues in the design specifications with the developers and Architect
- Tested and validated end to end Data Conversion and Data Migration.
- Tested the Mappings between Source and Target tables.
- Worked with data files (Flat Files, XML, SSIS, SAS data files).
- Moved the data files (Flat Files, SSIS, SAS) across environments.
- Tested the ETL process (data extraction, data loading, and data transformations); a sample source-to-target validation sketch follows this list.
- Tested the Source, Target databases and Data Marts.
- Used m_dump on the serial/multifiles with the corresponding DMLs to verify file correctness and rejected records.
- Worked on Control-M Scheduling of jobs; worked closely with Tech lead, Developers and project SMEs in preparing the Job schedule for new process by comparing existing flows and executing the jobs in QA environment.
- Worked with multiple sources/targets (Oracle, SQL Server).
- Used DataStage Designer to test jobs for extracting, cleaning, transforming and loading data into data warehouse
- Used DataStage Director to run and monitor the DataStage jobs.
- Validated the Conditions, Rules by using SQL and PL/SQL
- Worked with SQL, PL/SQL and functions to test the database integrity.
- Wrote SQL statements to extract data from the Tables.
- Extracted and transferred source data from SQL Server and Oracle databases.
- Worked extensively with Database Procedures, Functions, Cursors, Joins and Triggers using PL/SQL to apply business rules.
- Tested the data extraction process, assessed data quality, performed data profiling, and validated data marts.
- Tested the Data warehouse Reports generated in Cognos.
- Tested the List Reports, Crosstab reports, Charts, Maps in Cognos.
- Worked with Sorting/Grouping, Filters/Prompts in Cognos.
- Performed database testing with SQL queries using LEFT and RIGHT JOINs and conditional queries against the test database, including queries to identify anonymous users of the application.
- Involved in Database testing along with developers and coordinated with developers to solve the problems encountered in the application.
- Tested and executed test cases listed in MS Excel.
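As a rough illustration of the source-to-target checks described above (row counts plus a key-based comparison), the sketch below uses an in-memory SQLite database and hypothetical SRC_CUSTOMER/TGT_CUSTOMER tables; the project itself ran equivalent SQL against Oracle and SQL Server.

```python
import sqlite3

# Rows present in the source but missing from the target (simplified key check).
MISSING_ROWS_SQL = """
    SELECT s.customer_id
    FROM   SRC_CUSTOMER s
    LEFT JOIN TGT_CUSTOMER t ON t.customer_id = s.customer_id
    WHERE  t.customer_id IS NULL
"""

def validate_load(conn):
    """Compare source/target row counts and report keys missing from the target."""
    cur = conn.cursor()
    src_count = cur.execute("SELECT COUNT(*) FROM SRC_CUSTOMER").fetchone()[0]
    tgt_count = cur.execute("SELECT COUNT(*) FROM TGT_CUSTOMER").fetchone()[0]
    missing = cur.execute(MISSING_ROWS_SQL).fetchall()
    print(f"source rows={src_count}, target rows={tgt_count}, missing keys={len(missing)}")
    return src_count == tgt_count and not missing

if __name__ == "__main__":
    conn = sqlite3.connect(":memory:")
    conn.executescript("""
        CREATE TABLE SRC_CUSTOMER (customer_id INTEGER);
        CREATE TABLE TGT_CUSTOMER (customer_id INTEGER);
        INSERT INTO SRC_CUSTOMER VALUES (1), (2), (3);
        INSERT INTO TGT_CUSTOMER VALUES (1), (2);  -- key 3 failed to load
    """)
    print("load valid:", validate_load(conn))
```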
Environment: Oracle 10g, SSIS, SAS, DataStage 8, Cognos, Teradata, DB2, Rapid SQL, SQL Assistant, SQL, PL/SQL, SQL Server, VersionOne, MS Office, MS Visio, MS Project, Outlook, Windows XP Pro, Subversion, SecureShell, XML Notepad, XML Marker.
Confidential, Richmond, VA
Senior QA Analyst
Responsibilities:
- Deployed, configured, and validated desktop threads, GPOs, print queues, drive mappings, and anti-virus software on workstations.
- Active Directory objects structure implementation, migration, and validation.
- Creation and verification of Active Directory Schema, Objects, OUs, GPOs, Containers, Groups, Sites and Subnets and Site Links.
- Instrumental in validating and overseeing the rollout of Branch server infrastructure to host the CA Unicenter / Microsoft SCCM software in compliance with the proposed Windows Server 2008 AD structure for Confidential Finance.
- Set up and configured the CA Unicenter/Microsoft SCCM test environment for the four solution areas: Software Delivery, Remote Control, Asset Management, and Mobile Computing.
- Conducted and documented a proof of concept to evaluate two asset inventory solutions, CA Asset Inventory and HP Enterprise Discovery, and validate that they could operate in coexistence.
- Provided Analysis, Status Reports of the project and test execution progress to the Team
- Designed and Documented the Test Plan, Test Strategy and Identified the Test Data for the project.
- Implemented and oversaw quality assurance methodologies, processes, and procedures during all project phases.
- Set up and configured the System Center Operations Manager (SCOM) solution for the enterprise.
- Managed Requirements across the entire program, Test Cases, Defects and Requirements Traceability Metrics to ensure that the requirements are covered fully by the test cases using Rational Requisite Pro and Mercury Quality Center.
- Actively participated in the Requirements Gap Analysis and Requirements Management sessions.
- Deployment of remediated enterprise application package on pre and post migrated workstations via SCCM.
- Weekly projection of test forecasting numbers for QA metrics reporting and aligning the test planning and execution numbers for the test progress.
- Weekly QA metrics reporting on test execution, test planning, defect status, requirements coverage and execution to the program audience.
- Coordinated and conducted UAT sessions with the Bank IT teams and application owners for remediated applications.
- In depth defect analysis and categorization based on the resolution and lessons learnt summarization between the project phases.
- Involved in preparing test data, taking data from production and mocking it up, and preparing master and recycle files for testing the end-to-end ETL process.
- Worked on testing the file retention policy for the temp directory (signal and dat files), main directory (serial and multifiles), archive directory, serial logs, admin logs, rejects, and error files.
- Ran jobs and plans from both the GDE and the command line, verified job status and successful execution, and reviewed the admin/serial logs, error and reject files, and serial/multifiles for test analysis and troubleshooting.
- Involved in creating Test Data Management (TDM) requests for system (scrubbed data without NPI) and UAT test data preparation using HP Service Manager, creating, assigning, and tracking service request tickets.
- Involved in data mockup testing, creating data mockup scenarios using Ab Initio test graphs and executing them in the QA environment
- Created a project sandbox in the home directory with override paths, executed the jobs using the GDE, and analyzed the data flows and test results
- Worked closely with developers in preparing the TOSS-G and the production implementation steps (OTS, stage table, master and recycle files) and executing them in QA before implementing in production.
- Using the GDE, created test and validation graphs and mocked up data files per test scenario (positive and negative) using Ab Initio components (Reformat, Merge, Gather, Leading Records, Filter by Expression, Generate Records) and executed them in the QA/UAT regions; a simplified mock-data sketch follows this list.
- Executed the test cases in QC, logged, assigned, and tracked defects, and conducted smoke, sanity, and regression testing for each build.
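The mock data files mentioned above were produced with Ab Initio test graphs; the following is only a rough Python equivalent of generating a mix of positive and negative delimited records for reject testing, with the field layout and file name invented for illustration.

```python
import random

FIELDS = ["account_id", "amount", "status"]  # hypothetical record layout

def make_record(valid=True):
    """Build one pipe-delimited record; negative cases get a non-numeric amount."""
    amount = f"{random.uniform(1, 500):.2f}" if valid else "NOT_A_NUMBER"
    return "|".join([str(random.randint(10000, 99999)), amount, "A" if valid else "X"])

def write_mock_file(path, positive=8, negative=2):
    """Write a small mix of good and bad records for ETL reject testing."""
    with open(path, "w") as f:
        f.write("|".join(FIELDS) + "\n")
        for _ in range(positive):
            f.write(make_record(valid=True) + "\n")
        for _ in range(negative):
            f.write(make_record(valid=False) + "\n")

if __name__ == "__main__":
    write_mock_file("mock_accounts.dat")  # hypothetical QA input file
```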
Environment: Windows 2008 R2 / SP2, Windows 2003 Standard R3 Server, VMware, SCCM 2007 SP2 & SCCM R2, SCOM 2007, SQL Server 2005 SP3, IIS 6.0/7.0, CA Unicenter r11.2, HP Enterprise Discovery, Novell ZENworks 6.5, Novell eDirectory, UNIX, Ab Initio GDE 1.15.7.2, Co>Operating 2.15.4.2, Plan>IT, SQL Server 2000/2005, SQL Server Management Studio, Oracle 9/10g, Rational Suite (ClearQuest, ClearCase), Teradata, SQL Assistant, Mercury Quality Center 9.0/10.0, Rational Requisite Pro, Quest NDS Migrator, ILM 2007, Exchange 2007.
Confidential, St. Louis, MO
Senior QA Analyst / Tester
Responsibilities:
- Participated in the Agile / Scrum Planning and assisted in reviewing the requirements for gaps in coverage.
- Actively participated in the SCRUM meetings for daily update and Iteration Planning meetings
- Performed Static testing of the Stories (requirements) for gaps and coverage.
- Created Test plans and test cases in Quality Center.
- Created automation scripts using the Automation framework (QTP)
- Created automation scripts (QTP Scripts) for testing the HTTP services.
- Used the Hybrid Automation framework (QTP) that was already built. Performed Beta testing on the Framework to make it bug free before publishing to the QA team.
- Manually tested various functionalities (Add, delete, search, edit, bulk update, HTTP services) of the application.
- Performed Unit test for ER Model changes using the TIBCO designer.
- Developed XML files and Parsed the XML messages.
- Performed end-to-end tests of the application by pumping HL7 messages through various Epic interfaces.
- Performed integration tests using ADT, ORU, and ORM messages and verified that the correct data was populated (see the sketch after this list).
- Created reports from Quality center to provide the status of test execution and Defect statuses.
- Used Version One to create QA tasks, track defects and provide test estimates as well as track time worked against each task.
- Tested the REST services from the UI as well as the service itself by executing VUGen Scripts.
- Extensively used TOAD to verify source data and target data after the successful workflow runs using SQL.
- Extensively Used SQL to validate backend database changes, deletes and update.
- Developed SQL queries to perform database testing and check for data inconsistencies.
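The HL7 checks above were performed through the Epic interfaces; below is a minimal, illustrative Python parse of an ADT message's MSH and PID segments. The sample message content is made up, though the pipe/caret encoding and field positions follow standard HL7 v2 conventions.

```python
SAMPLE_ADT = (
    "MSH|^~\\&|SENDER|FAC|RECEIVER|FAC|202301011200||ADT^A01|MSG0001|P|2.3\r"
    "PID|1||123456^^^MRN||DOE^JOHN||19800101|M\r"
)

def parse_hl7(message):
    """Split an HL7 v2 message into {segment_name: field_list}."""
    segments = {}
    for seg in filter(None, message.replace("\n", "\r").split("\r")):
        fields = seg.split("|")
        segments[fields[0]] = fields
    return segments

def check_adt(message):
    """Verify the message type and that the patient identifier field is populated."""
    segs = parse_hl7(message)
    msg_type = segs["MSH"][8]    # MSH-9: message type
    patient_id = segs["PID"][3]  # PID-3: patient identifier list
    assert msg_type.startswith("ADT"), f"unexpected message type {msg_type}"
    assert patient_id, "PID-3 (patient identifier) is empty"
    return True

if __name__ == "__main__":
    print("ADT check passed:", check_adt(SAMPLE_ADT))
```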
Environment: Java, SQL Developer, Oracle, TIBCO Designer 5.5, Quality Center, QuickTest Pro (QTP), VersionOne, MS Office, MS Visio, MS Project, Outlook, Windows XP Pro, Subversion, SecureShell, XML Notepad, XML Marker.
Confidential, Seattle, WA
QA Analyst / Tester
Responsibilities:
- Developed Test Plans, Test Cases and Test Scripts by following Agile Process.
- Requirements management using Mercury Quality Center.
- Participated in Business Analysis, Requirement analysis and Use-Case analysis.
- Worked with Objects/trees in Active Directory.
- Tested the migration of user accounts across servers and validated the logon.
- Performed role-based security testing and single sign-on testing.
- Installation of files and programs on servers.
- Troubleshot and monitored network connections.
- Test Cases and Test Scripts management using Mercury Quality Center.
- Tested the configuration, workflows and reports.
- Tested X12 EDI 837 and 835 transactions for claims (a minimal parsing sketch follows this list).
- Involved in Database testing along with developers and coordinated with developers to solve the problems encountered in the application.
- Tested and executed test cases listed in MS Excel.
- Participated in bug and enhancement review meetings, assigning bugs and enhancement requests to developers and following up.
- Conducted System, Functional, GUI, Regression and UAT Testing.
- Used Mercury Quality Center for managing the defect flow.
- Worked with SQL queries and UNIX shell scripts.
- Participated in GO/NO-GO Meetings.
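The 837/835 claim testing above was done against the application's EDI feeds; the snippet below is a tiny illustrative X12 segment check in Python. The sample transaction is abbreviated and invented, and a real interchange would need full envelope and delimiter handling.

```python
SAMPLE_837 = (
    "ISA*00*          *00*          *ZZ*SENDER         *ZZ*RECEIVER       "
    "*230101*1200*^*00501*000000001*0*T*:~"
    "ST*837*0001*005010X222A1~"
    "CLM*PATIENT123*150.00***11:B:1*Y*A*Y*Y~"
    "SE*3*0001~"
)

def segments(edi, seg_term="~", elem_sep="*"):
    """Split an X12 interchange into lists of elements per segment."""
    return [s.split(elem_sep) for s in edi.split(seg_term) if s.strip()]

def check_claim(edi):
    """Confirm an 837 transaction set exists and the CLM charge amount is numeric."""
    segs = segments(edi)
    st = next(s for s in segs if s[0] == "ST")
    clm = next(s for s in segs if s[0] == "CLM")
    assert st[1] == "837", f"expected 837 transaction, got {st[1]}"
    float(clm[2])  # CLM02: total claim charge amount must be numeric
    return True

if __name__ == "__main__":
    print("837 check passed:", check_claim(SAMPLE_837))
```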
Environment: Mercury Quality Center 8.0/9.0, QTP 8.2/9.0, LDAP, Active Directory, TCP/IP, Mainframes, ASP.NET, VB.NET, TIBCO, HTML, XML, SOAP, SQL Server 2000, UNIX, Oracle, MS Office, Windows 2003 Server.
Confidential, Indianapolis, IN
Quality Assurance Tester
Responsibilities:
- Participated in the testing effort for a pharmaceutical clinical applications system.
- Planned, developed, and executed tests using the QA methodology.
- Involved in the system’s Requirement Analysis, Design Document, and unit testing.
- Developed Test Plans of the application as per technical specifications.
- Validated the claims data and conducted volume testing.
- Participated in system and functional testing manually and automatically with Mercury Tools.
- Involved in testing of web-application screens.
- Participated in generating test scripts using WinRunner and Test Director.
- Wrote SQL to test the application for data integrity (see the sample checks after this list).
- Worked with UNIX shell scripts.
- Used MS Access to build simple applications
- Tested User Interface inconsistency and application functionality.
- Managed the defect information in Test Director.
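As an illustration of the data integrity SQL referenced above, the sketch below runs two representative checks (duplicate keys and orphaned child rows) against hypothetical CLAIM/MEMBER tables in an in-memory SQLite database; the original work used comparable queries against Oracle 8i.

```python
import sqlite3

# Hypothetical tables; each check should return zero rows on clean data.
INTEGRITY_CHECKS = {
    "duplicate claim keys":
        "SELECT claim_id FROM CLAIM GROUP BY claim_id HAVING COUNT(*) > 1",
    "claims without a member":
        "SELECT claim_id FROM CLAIM c "
        "WHERE NOT EXISTS (SELECT 1 FROM MEMBER m WHERE m.member_id = c.member_id)",
}

def run_integrity_checks(conn):
    """Run each named query; any returned rows indicate an integrity failure."""
    failures = {}
    for name, sql in INTEGRITY_CHECKS.items():
        rows = conn.execute(sql).fetchall()
        if rows:
            failures[name] = rows
        print(f"{name}: {'OK' if not rows else f'{len(rows)} offending rows'}")
    return failures

if __name__ == "__main__":
    conn = sqlite3.connect(":memory:")
    conn.executescript("""
        CREATE TABLE MEMBER (member_id INTEGER);
        CREATE TABLE CLAIM (claim_id INTEGER, member_id INTEGER);
        INSERT INTO MEMBER VALUES (10);
        INSERT INTO CLAIM VALUES (1, 10), (1, 10), (2, 99);  -- duplicate + orphan
    """)
    run_integrity_checks(conn)
```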
Environment: Java 2.0, J2EE, HTML, XML, UML, Oracle 8i, JavaScript, Mercury Test Director 8.0, WinRunner 7.6, Mainframes (COBOL, DB2, JCL, CICS, MVS), Crystal Reports, Lotus Notes 5.0, IBM WebSphere, MQ Series, UNIX, NT
Confidential
Quality Assurance Tester
Responsibilities:
- Participated in testing effort of E-commerce applications and Web applications.
- Involved in the development of System Test Plan and test scripts using business and system requirement documents.
- Responsible for component QA testing - using predefined test scenarios.
- Designed and developed the test strategy and approach for testing and tested the applications.
- Performed manual testing for the execution of a large number of test cases and performed integration testing to ensure that every module works together as a whole.
- Conducted application testing manually
- Tracked defects using remote bug tracking system
- Worked with QMF to write queries to test the backend database
Environment: VB, JAVA, Oracle, Java Script, J2EE, Windows NT, MS-Office