SAP Integration Test Manager Resume
SUMMARY:
- A results-oriented, hands-on QA/Test Manager who has worked in multiple industries.
- 25+ years of work history and a proven track record in Software Quality Assurance, with a primary focus on Software Testing and strong knowledge of the disciplines that intersect with Testing: Quality Management, Project Management, Business Process Re-engineering, Release Management, Configuration Management, Change Management, Business Analysis and Development.
- Has worked in a variety of roles requiring both hands-on and management skills.
- Extensive work experience across numerous applications, developed in house or purchased as commercial off-the-shelf (COTS) software, in a variety of industries and configurations including client/server, web and mainframe, with SDLC diversity spanning Waterfall, Iterative and Agile.
- Extremely motivated, enthusiastic self-starter, who is not afraid to challenge the status quo.
- Requires minimal direction and possesses a unique ability to be both a visionary and strategic thinker while at the same time being extremely detail oriented and analytical.
- Very resourceful, able to effectively multi-task while still maintaining a strong emphasis on quality.
- Work experience lends itself to performing in a fast-paced, hectic, dynamically changing environment, adapting as needed.
- Highly focused and goal/results oriented.
- Not afraid of hard work or challenges and can handle aggressive deadlines, pressure and heavy workloads.
- Excellent people and communication skills.
- Effectively interfaces with all levels of personnel.
- Very team and service oriented, always ensuring the clients’ expectations are set and managed, and business needs met.
- Looking to obtain a position that provides the opportunity to offer leadership and guidance while exercising hands-on experience in an SQA environment that is either just starting up or requires attention.
- Goal is to improve and optimize the Software Quality Management process within an organization by building strong, motivated, self-directed teams, demonstrating existing Test/Project management abilities, and implementing quality standards and automation where feasible.
SKILL:
- Managing the Software Process, Measuring the Software Process
- Total Quality Management (TQM) / ISO9000 Training / on-the-job initiatives
- Industries:
- Pharmaceutical
- Leasing
- Warehouse/Distribution
- Hospitality
- Health Care
- Financial Services
- Retirement Services
- Publishing
- Transportation / Intermodal
- Retail
- Applications Tested:
- Accounts Payable
- Procurement
- Accounts Receivable
- Workflow
- Fixed Assets
- General Ledger
- Billing
- Sales
- Food & Beverage
- Inventory
- Warehouse Management
- Contracts
- Interfaces
- ETL / ITL
- Version Control Utility
- Advertising
- Purge Utility
- Content Management
- Languages (Training/Exposure):
- Visual Basic T/E
- Java E
- COBOL T/E
- Turbo Pascal T/E
- Specialized Software (Training/Exposure):
- Winrunner T/E
- Jira
- Quality Center / Test Director
- ClearQuest E
- Bugzilla
- PVCS E
- Loadrunner T
- Quick Test Pro T/E
- Platforms:
- SUN Solaris
- Web
- IBM 3090/MVS
- Client Server
- Windows 2000/98/95/NT/XP
- UNIX / AIX / Linux
- Novell
- DEC/VMS
- HP3000/9000
- AS400
- Data Bases:
- Oracle
- MS Access
- DB2
- DBASE IV
- MSSQL
- Other Software:
- MS Project
- MS Excel
- MS Outlook
- CC Mail
- MS Word
- Lotus Notes
- All Clear Flow Charting
- Visio
- MS Powerpoint
- Internet Explorer
- SharePoint / Groove
- MySQL (QMF)
- RUP
- CMM
- Agile
- Waterfall
PROFESSIONAL EXPERIENCE:
Confidential
SAP Integration Test Manager
Responsibilities:
- Responsible for the Test Planning and Execution of approximately 80 interfaces impacting approximately 20 functional areas and many 3rd party systems.
- Managed the Testing of 16 Functional Workstreams and 10 Stream Leads
- Achieved 100% readiness and successful test execution for the Partner Dry Run initiative, with the ability to track at a very granular level the testing of each interface throughout the various hops in the end-to-end workflow.
- Generated a new status report that provides, at a glance, progress throughout test execution, including test execution metrics, planned vs. actual, schedule deviations, defect tracking and progress over the prior day's activity. This is now used by the senior leadership team.
- In minimal time, created a long-overdue, in-demand test schedule that was then used to bridge a huge gap and line up the appropriate support resources globally.
- Trained the team and conducted a kickoff meeting on all aspects of the testing process prior to the test execution phase, including mandates by the Quality Risk and Compliance group.
- Achieved 100% readiness for the Market Integration testing that included the creation of a formalized Test Plan, Test Schedule, Defect Tracking, Resource utilization, Test Data creation, Test Case creation, escalation of risks and issues, mentoring team members as needed.
- Daily management of the test execution phase, including conducting daily status meetings, managing and adjusting the schedule as needed, test management review and sign-off of completed test cases using official SOPs, defect management using ALM and official SOPs, issue escalation, generation of daily/weekly status reports, and liaison with management, technical teams and test leads.
- Responsible for driving all test closure activities including Test Closure documentation and Lessons Learned.
- Based on exemplary work performance during the Integration Testing phase, was added to the End to End Test Team to assist with planning, scheduling and driving the daily E2E Test execution meetings.
- As the MINT testing phase was winding down, filled the UAT/Project Manager role for Warehouse Management and Distribution (JDA Red Prairie) testing due to an unexpected vacancy. Brought the area from behind schedule/incomplete to on-time delivery, escalating and resolving numerous issues that were not previously on leadership's radar and which, if not addressed, would have impacted Go Live.
- Trained a junior team member to become an additional Test Manager during the process, then subsequently managed the entire initiative myself.
- Responsible for taking ongoing training required to comply with all GxP and ITIM SOPs.
- Traveled to the Pittsburgh office during the initial stages of the warranty period to assist with ticket resolution during Early Life Support. Enhanced the tracking of ICE (Integration and Connecting ERP) tickets by implementing a collaborative, real-time SharePoint solution to track and manage ticket updates.
- Traveled to RTP, North Carolina with senior management to conduct knowledge transfer and lessons learned workshops for the US Pharma group prior to the commencement of their USPG SAP rollout.
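A status report like the one described above generally rolls up planned vs. actual execution, schedule deviation, and open-defect counts per workstream. A minimal sketch in Python (all workstream names and figures are hypothetical, not taken from the project):

```python
# Hypothetical sketch of a daily test-execution status roll-up:
# planned vs. actual counts, deviation, pass rate, and open defects.
from dataclasses import dataclass

@dataclass
class WorkstreamStatus:
    name: str
    planned: int      # test cases planned to date
    executed: int     # test cases actually executed
    passed: int
    open_defects: int

def summarize(streams):
    """Return per-stream progress rows plus an overall roll-up."""
    rows = []
    for s in streams:
        deviation = s.executed - s.planned          # negative = behind plan
        pass_rate = (s.passed / s.executed * 100) if s.executed else 0.0
        rows.append((s.name, s.planned, s.executed, deviation,
                     round(pass_rate, 1), s.open_defects))
    total_planned = sum(s.planned for s in streams)
    total_executed = sum(s.executed for s in streams)
    return rows, total_planned, total_executed

# Illustrative data only
streams = [
    WorkstreamStatus("Order to Cash", planned=40, executed=35, passed=33, open_defects=4),
    WorkstreamStatus("Warehouse Mgmt", planned=25, executed=25, passed=24, open_defects=1),
]
rows, planned, executed = summarize(streams)
print(f"Overall: {executed}/{planned} executed ({executed / planned:.0%})")
for row in rows:
    print(row)
```

In practice the same roll-up would be fed from a test management tool export rather than hand-entered records.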
Test Manager
Confidential
Responsibilities:
- Deep Dive into both the Test Automation & Test Data Management activities & processes performed by the team (workbench)
- Performing an assessment of effectiveness, processes, status reporting, and team dynamics (assumed Keyword automation approach was to be retained)
- Documenting & escalating gaps, issues and risks encountered, in addition to noting what is working effectively
- Implementing short term controls and solutions to fix gaps in current approach
- Facilitating cross team meetings to define scope of work to be performed (backlog and future)
- Partnering with Offshore vendor to further refine the scope of their work effort, review Offshore estimates and approve SOW
- Building estimation models to plan out the entire team's work effort, which is then entered into MS Project and used to manage and track work efforts
- Analyze and enhance existing status report(s) to provide more comprehensive reporting to management
- Monitor and manage daily progress through status reports and checkpoint meetings
- Central point of contact for automation issue escalation and facilitation of resolution
- Mentor, coach and motivate team members
- Work closely with offshore vendor to improve communications and productivity as needed
- Work with Onshore Team to increase productivity by providing structure where none previously existed
- Evaluate Quality Center to determine the additional fields needed to more effectively categorize tests that have special automation handling needs and to control automation process.
- Work with Manual QA Team to improve the quality of inputs to the automation process
- Working / interacting with all levels of management to communicate status, address issues and obtain direction when needed
- Generated forecasts for end of
- Of the 8,470 tests to be automated, only 16% of the automated tests were scripted with 0% validated / unable to be executed. No plan in place to complete.
- 100% tests scripted, 80% validated / executed.
- Absence of a process to upgrade the automation scripts to the current code base with each release. A minimal number of automated scripts were at the current code base, and no plan was in place to upgrade them.
- 100% of the scripted tests were updated to the current code base.
- Absence of Test Data to support onshore validation/execution
- Repeatable process implemented using a Gold Copy approach. Data entered once, backed up, and copied to two target data sets, one used by the Offshore Team, the other by the Onshore Team.
- Absence of a Review process to validate Offshore deliverables and scripts
- Repeatable process defined to review every test script designed and unit tested by the Offshore Team. Output automated and automatically updated in Quality Center.
- The status report did not reflect true progress, only that of the Offshore Team, sending a misleading message to key stakeholders and project team members who were expecting a completed automated regression suite with a quick time to market for each major regression/release test. ROI was impacted as more manual testing had to be planned.
- Designed and built a new status report to track, at a more granular level, the health and status of each capability, covering planned vs. actual, an issue recap and the plan to “Go Green”.
- Limited / ambiguous guidance and direction provided to Offshore Team by Onshore Team.
- Improved communication to Offshore Team. When rolling out new Test Data process, formalized walkthrough was provided to Offshore, feedback solicited and incorporated accordingly.
- Absence of focus by Onshore Team when devising solution to Business / process needs.
- Created a project profile document for the Onshore Team to follow, requiring them to map out all options with pros/cons and the work effort involved.
- Test Data Management
- TDM scripts kept failing and were unable to run on more than 3-4 VDI machines simultaneously.
- TDM scripts now run across 10 VDI machines simultaneously. They were evaluated by architect and corrected where needed. Environmental issues were assessed and additional message queues added to prevent backlog causing scripts to fail.
- Absence of an estimation model to calculate effort on Test Data Creation
- Estimation now in place to calculate both manual and automated test data creation
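The Gold Copy approach described above (data entered once, then cloned for each team so offshore and onshore runs never contend for the same data) can be sketched as follows, assuming simple file-based data sets; all paths, file names and team names are illustrative:

```python
# Minimal sketch of a "Gold Copy" test-data refresh. The gold copy is
# entered once, backed up, and cloned into one target per team.
# Paths and team names are hypothetical, not from the project.
import shutil
from pathlib import Path

def refresh_from_gold(gold: Path, targets: dict[str, Path]) -> None:
    """Replace each team's data set with a fresh clone of the gold copy."""
    for team, dest in targets.items():
        if dest.exists():
            shutil.rmtree(dest)          # drop the consumed/modified copy
        shutil.copytree(gold, dest)      # restore pristine data
        print(f"Refreshed {team} data set at {dest}")

# Gold copy entered once (illustrative content)
gold = Path("testdata/gold")
gold.mkdir(parents=True, exist_ok=True)
(gold / "customers.csv").write_text("id,name\n1,Acme\n")

refresh_from_gold(gold, {
    "offshore": Path("testdata/offshore"),
    "onshore": Path("testdata/onshore"),
})
```

The key property is that each test cycle starts from an identical, known-good data state, which is what makes the process repeatable.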
Confidential
QA /UAT Test Manager
Responsibilities:
- Performed a gap analysis for the CIO on the entire project to identify the key activities required to move the project from an almost stalled state to where it needed to be in order to meet project milestones.
- Devised overall Test Strategy/Approach and Test Plan for the initiative
- Created Test schedule using MS Project from Requirements Review through Production Deployment, which included the creation of a Test estimation work effort model, a resourcing plan and planning assumptions
- Successfully devised and implemented a Defect Management process & system, which included the setup & configuration of the tool using Bugzilla as well as presentations to communicate the new process to the project team.
- Analyzed Business workflows and requirements - Performed an assessment and identified / escalated the absence and / or incompleteness of key requirements.
- Worked within budget constraints to design a Test Management framework that facilitated tracking of requirements review, captured test scenarios to be approved by the business (from which actual test cases would be derived), evaluated several test management tools to store test cases, mapped tests to requirements (traceability matrix), and ultimately logged execution results in a way that could be easily reported on.
- Built an excellent relationship with both the vendor and the Business where communication had been very strained, contributing to greater efficiency in the overall process and obtaining needed information & training.
- Formalized the documentation of Test Scripts and built a Test Suite for ongoing regression testing against future releases, in a manner that made it easy to extrapolate the test execution data for incorporation into the status report.
- Devised test strategies and created test scenarios / cases for critical aspects of the testing initiative which includes Data Synchronization between Sales Force (Translations and ETL), the new Billing System and Legacy systems, inbound and outbound interfaces with the General Ledger system and Data Conversion.
- Generated a status report that effectively communicates the planned vs. actual test execution progress using metrics and a graph by key process workflows along with highlighting defect metrics and key issues.
- Built several key collaborative tracking mechanisms using MS Groove (now part of SharePoint 2010) to manage project tasks, Requirements Analysis, and the Release Management process (from vendor packaging to customization/installation).
- Performed research on various Test Management Tools for possible future implementation
- Trained and motivated a team of Business users, who were assisting with the testing effort prior to UAT, on how to create tests, while hiring a QA staff.
- Identified roles & skills set needed to achieve test objectives.
- Performed hands-on testing where needed
Confidential
Test Manager
Responsibilities:
- Implemented a repeatable Global Test Management process that resulted in the ability to quantifiably track actual Acceptance Test execution progress and results.
- Successfully devised and implemented a testing strategy for integrated vertical testing on a Global level that supported a “Follow the Sun” approach maximizing a 24 hr test window.
- Successfully designed the approach and coordinated the testing effort of geographically distributed test/engineering teams across 7 different countries, where a 24x7 “Follow the Sun” approach was taken.
- Redesigned test scripts for short term use making them more efficient to navigate for historical data, repeatability purposes and easier to extract results for reporting purposes.
- Successfully rolled out the centralization of an issue management process and tool that encompassed the migration and training of 123+ users.
- Successfully promoted “out of the box” strategy to drive team members from local / regional mindsets (silos) to that of a Global one.
Confidential
QA Test/Project Manager / SDLC Process Manager
Responsibilities:
- Ramped up quickly in an effort to build up a QA Department for a huge initiative already in the Elaboration phase.
- Designed & implemented QA processes for Requirements Review, Test Planning, Test Creation, Test Execution, User Acceptance Testing and Defect Management, in accordance with RUP methodology. Used the AllClear flowcharting package to capture workflows, supplemented by Word documents using Playscript format and checklists.
- Introduced concepts that had not existed in prior testing initiatives: traceability, walkthroughs with the BA team on requirements, impact analysis of the project on the existing application, back-end testing/data validation, template tests for documenting tests in Quality Center, parameterization of data values in test steps, automation, etc.
- Developing metrics to track QA progress and product / process quality in all phases of SDLC.
- Overall management of the QA effort, including Manual, Automated, and Performance testing.
- Identified automation of performance needs based on new technologies / functionality and impact on existing applications
- Capacity planning - Identified QA resource needs based on project scope.
- Staffed QA resources for Manual, Automated and Performance Testing for project initiatives
- Created new estimation model that facilitated the ability to provide timely estimates for 14+ projects requiring AFE (Approval for Expenditure)
- Generated and maintained detailed QA Project Plans for each of the 14+ projects, that encompassed WBS, resource assignment, capacity planning & scheduling for the Elaboration, Construction & Transition phases.
- Managing QA project budget/schedule from Inception to Transition - Created tool to track weekly QA budget, reconciling forecasted projections to actuals, making appropriate changes to strategy, etc., where needed. Tracking could be done at manual, automation and performance testing levels as well as overall QA effort.
- Effectively managing and providing solutions for a variety of resource, offshore, budget, scheduling, cost, etc. issues
- Developed collaborative working relationships with all cross team disciplines involved in the SDLC as needed (Business Analysts, Development, Tech Services, Project Management Office, Business, etc.)
- Work with QA Lead to devise QA strategy for each project depending on the release it is to be deployed with (includes resource utilization, test creation and execution)
- Work with QA Technical Manager to maintain integrity of the QA Test environment as it relates to code pushes, release merges, data refreshes, patching. Established guidelines, project plans, standardized sets of tasks and tests needed to carry out the initiatives and ensure integrity of the environment after effort.
- Run daily QA Status Meetings during Transition Phase and provide daily / on demand updates to Senior Leadership Team as needed (Directors, VP and CIO).
- Coordinated UAT Effort. Hold Kick off meeting, work closely with UAT Business Project Managers and UAT team to establish Defect Management process, track UAT progress, identify Exit criteria, etc.
- Generate & present weekly QA status to Program Management Team
- Coaching and mentoring QA Team
- Create presentations on an as needed basis
- Responsible for transitioning QA Team mindset from Waterfall to RUP methodology.
- Motivating, mentoring and challenging entire QA team members to achieve both project and individual professional goals.
- Periodic travel to other Confidential location.
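An estimation model of the kind described above typically multiplies test counts by per-complexity baseline hours and adds a contingency for planning, retest and reporting overhead. A minimal sketch (all rates, categories and the contingency factor are hypothetical, not from the project):

```python
# Hypothetical QA work-effort estimation model: tests bucketed by
# complexity, baseline hours per bucket, plus a contingency factor.
HOURS_PER_TEST = {"simple": 1.0, "medium": 2.5, "complex": 5.0}  # illustrative rates
CONTINGENCY = 0.15  # planning, defect retest, status-reporting overhead

def estimate_effort(test_counts: dict[str, int]) -> float:
    """Return estimated person-hours for a project's test effort."""
    base = sum(HOURS_PER_TEST[c] * n for c, n in test_counts.items())
    return round(base * (1 + CONTINGENCY), 1)

# e.g. a project with 40 simple, 20 medium and 8 complex tests
print(estimate_effort({"simple": 40, "medium": 20, "complex": 8}))
```

The resulting figures would then be loaded into MS Project as task durations, as the bullet above describes, so actuals can be reconciled against the forecast each week.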
Confidential
QA Manager / Process Improvement
Responsibilities:
- Managed testing efforts for 4 tracks of parallel development and a production line that runs concurrently with new development; 72 modules in total needed to be rolled out to complete the application rollout. This included:
- Defining Testing Strategy / Approach for both Functional and Conversion Testing
- Review of user requirements and issues resulting from use case review
- Monitor progress / scheduling impacts
- Coached/motivated team members, resulting in higher levels of morale and productivity
- Review test execution results
- Escalate / facilitate resolution of issues with development team / business analysts
- Defect management - assessed and dispatched issues from all QA & UAT Test cycles that resulted in defects, enhancements, training issues, requirement issues, etc.
- Defined, generated, tracked and analyzed testing statistics on all test cycles, making necessary improvements to process-, resource-, or schedule-related issues as needed.
- Hands on where needed in the actual creation of Test Cases, reviewing use cases, activity diagrams, data model, use case models, prototypes etc.
- Implemented cross training /knowledge transfer amongst team members
- Conduct /facilitate post mortem reviews for each release ensuring remediations get acted upon accordingly.
- Generation & publication of release notes targeted for UAT and production users
- Generate status reports
- Coordinate / Facilitate User Acceptance Testing
- Engaging users to prepare for UAT test effort - Assist them in what they need to do, test, etc.
- Scheduling and facilitating UAT and Daily Checkpoint meetings
- Escalation and resolution of user defects and issues / concerns to development and business analyst teams
- Escalation of new user requirements or modifications to existing user requirements raised during UAT
- Creation of UAT Kickoff Meeting presentation
- Work with users and auditors regarding SOX controls for auditing purposes
- Defined, documented and formalized UAT process
- Project Management:
- Scope out work effort and generate QA project plan for, new development deployments, production releases and re-engineering effort. Created spreadsheets to assist in determining estimated effort required to perform QA tasks.
- Assist in the creation and ongoing review of master rollout project plan for entire application. Created generic release project plan for each testing effort that is fine-tuned with each release.
- Implementing Functional Regression Test Automation:
- Defining Automation Framework
- Process, procedures and standards
- Scope of automation - Smoke, Functional Regression, GUI Testing
- Starting up initiative to do Performance / Load testing
- SDLC Process Management
- Assessed all SDLC processes and their impact on QA, making recommendations where applicable (Requirements Management, Design, Development/Unit Testing, Code Management, Release Planning & Scheduling).
- Active in re-engineering SDLC processes that included requirements gathering, code management, development, release planning / scheduling, etc.
- Part of joint management team that holds weekly workshops to address key process issues.
- Re-engineered existing / created new QA processes:
- Set up Test Director to manage all testing assets (Requirements, Test Cases and Test Execution results)
- Identify strategy for organizing Test Asset information and determining customizations needed to support daily operations and reporting.
- Defined and documented overall QA Test Strategy
- Defined strategy for creating a Regression test bed, where the existing test bed:
- Contained tests that were obsolete, inaccurate or non-existent
- Was unable to distinguish core tests in the event of scope reduction
- Was unable to distinguish automation candidates and track life cycle of automation
- Defined Test Maintenance strategy: the Life Cycle of a Test (adding new/maintaining existing tests)
- Defined Test Execution Strategy based on revised Release Strategy / Release Schedule.
- Defined and documented overall QA processes, procedures, workflows and standards (including Test Plan and Test Case). Used the AllClear flowcharting package to document all departmental workflows.
- Determining reports / metrics needed to support the QA process.
- Defined and documented new Defect Management / Enhancement process using Test Director
- In the process of defining the User Acceptance Test process
- Defining metrics to determine sufficient test coverage by functional area
- Developed strategy for managing data conversions from legacy to new system.
- Defined strategy for the creation and maintenance of static test data, which did not previously exist.
- Release Planning & Scheduling: Worked on defining the Release Planning process, determining release contents, and resolving build/environment-related issues
- Defined departmental roles / responsibilities for QA and other areas
- Interviewing and hiring a QA staff - replacing existing staff and adding to it (consulting and permanent positions)
Confidential
SQA Architect / Project Lead
Responsibilities:
- Analysis of the entire QA process flow, manual test management strategies for 10 product lines, testing assets, and automation strategies.
- Improved, redefined and documented the entire QA infrastructure - full QA cycle process flow: research & analysis, identify solutions, document pros & cons, solicit team & management input, implement solutions - update tools, flowchart processes, generate process/procedure Word documentation.
- Drastically improved, redefined & documented Test Management Strategy, Methodology & Process: Test Planning, Test Case Design, Test Execution, Test Maintenance, Management of Test Data.
- Redefined the Defect Tracking Process: Identification of software bugs, documentation, escalation, & resolution of them.
- Drastically improved and redefined automation process & strategies.
- Generated departmental statistics from analysis, and established ongoing metrics by which to gauge progress.
- Generated numerous PowerPoint presentations to convey statistics and overall process improvement recommendations to management.
- Created Roles & Responsibilities document that never previously existed.
- Interviewed candidates to fill a variety of open departmental positions.
- Analyzed existing Test Management model & process in order to emulate more efficient model in Test Director.
- Defined business requirements for each of the Test Director modules in an effort to support the newly implemented QA Process, Including Requirements, Test Plan & Test Lab.
- Developed an integrated process flow and Test Director model for use by all teams.
- Defined & documented process & procedures for using each Test Director module
- Designed an enterprise-wide Test Director model to support use by all teams and product lines including Quality Assurance, Development, Product Support & Customer Support.
- Updated Test Director with all required custom user defined fields.
- Defined and documented requirements for importing 3rd party automation results into Test Director.
- Defined and documented process for importing tests and requirements into Test Director.
- Created several project plans using MS Project to fully implement Test Director (up to 70 pages).
- Created centralized issues / knowledge DB using MS Access in order to leverage implementation issue resolution and lessons learned information, critical to all rollouts.
- Created a centralized report inventory database to keep track of enterprise wide reporting requirements.
- Project Administration - Adding users, creating custom user groups, new projects
- Defined security requirements for multiple user groups sharing common repository.
- Prepared Training Material and trained staff of 25+
- Defined, documented and implemented disaster recovery process
- Analyzed current automation methodologies to accurately assess the state of automation.
- Performed Functional automation tool analysis, negotiated pricing and acquired tools
- Generated an ROI to determine feasibility of automation and comparison of 2 different tools.
- Defined and documented automation strategy, methodologies, and process.
- Set up an automation pilot using one automation developer & tester for proof of concept.
- Generated an enterprise wide automation rollout project plan.
- Responsible for establishing offshore QA process for products whose testing was performed overseas.
- Implemented verification process to ensure validity of offshore testing
- Managed Offshore Testing Team, escalating & resolving issues as encountered.
Confidential
Project Management Office Coordinator
Responsibilities:
- Established and implemented departmental project standards where none existed before: project definition template, project management tool, project request/approval process, status reports and status presentations.
- Generated consolidated weekly and monthly project status reports and monthly presentations.
- Analyzed departmental project resource allocation, scheduling conflicts, project deliverables, milestones, etc.
- Tracked and monitored departmental project progress
- Trained project leaders on newly implemented project standards.
- Conducted weekly project status meetings, assisting project leaders as needed, escalating and resolving project risks and issues.
- Identified, documented and flowcharted ‘existing’ departmental processes and procedures.
- Analyzed, modified and documented ‘existing’ departmental processes to achieve level II/III CMM.
- Established process metrics by which to gauge and improve departmental efficiency and effectiveness.
- Reviewed departmental organization and processes for cost effectiveness; proposed recommendations.
- Evaluated/automated internal tools to increase efficiency/reduce costs; negotiated software purchases.
- Implemented testing standards where none previously existed:
- Created a standardized/centralized global test repository using Test Director in order to effectively and efficiently maintain, manage and report on test planning activities.
- Established standardized procedures for creating, maintaining and updating test cases/scripts.
- Generated a standardized testing project plan
- Established and documented standardized testing procedures for unit, system, integration, user acceptance, regression and stress testing.
- Assisted in the creation and review of all test cases/scripts as needed.
- Determined new data requirements/conversions.
- Assisted test team in troubleshooting defects and executing test cases as required.
- Managed testing effort, monitored software defects, conducted Test Team status meetings.
- Generated all testing deliverables including: Statement of Work, Test Approach/Planning Document, Project Plan, Testing Release Notes, Test Execution statistics.
- Generated a standardized project plan (90 pg.) that encompassed all aspects of the SQA department that were required to successfully and efficiently deploy the software (none previously existed).
- Determined upgrade feasibility, identified risks, technical requirements, required resources, etc.
- Planned, coordinated and managed entire upgrade effort from in house receipt to deployment.
- Monitored and tracked project status, deliverables and issues. Conducted cross team status meetings.
- Managed vendor relations and client expectations.
- Established a standardized process for documenting and requesting software changes requests.
- Ensured all change requests were properly documented/ incorporated in the software.
- Ensured all user interface customizations were properly version controlled and applied across four user interfaces - Java, Visual Basic, Neuron Data and Character Cell.
- Established a centralized and standardized global design documentation repository to maintain all detailed customization information.
- Identified, documented and maintained approximately 7700 customizations (not previously accounted for)
- Established/documented standardized procedures for applying software customization reaching level II/III CMM.
- Established and configured new testing environments: Certification, Unit, System, Model
- Managed and maintained testing environments, ensuring proper migration path, resolving synchronization discrepancies
- Established/documented standardized version control and migration procedures achieving levels II/III CMM
- Gathered requirements/managed change requests for the version control/migration utility
- Reviewed, consolidated and merged departmental release notes
- Packaged and deployed software and relevant documentation to regional markets
- Performed risk assessment, established contingency plans, created Y2K hotline, created global web site
- Ensured compliance, managed internal remedial and outsource effort, reviewed all generated documentation
Confidential
QA Tester/Analyst
Responsibilities:
- Generated test plan and test approach documents.
- Created detailed test cases and test scripts.
- Performed Unit, System, Integration, Regression and User Acceptance Testing.
- Identified application defects, determined root cause of problem.
- Updated, maintained and reported system defects using an Access based system.
- Gathered, reviewed and translated business requirements into functional specifications.
- Performed QA analysis/review of user interface, functional, host and third-party interface specifications.
- Generated end user documentation of system functionality (Operators Guide).
- Conducted end user training and provided post deployment support.
Confidential
Senior QA Tester/ Analyst
Responsibilities:
- Created and executed test cases for both batch and online programs; analyzed all related output.
- Identified, researched and documented all application defects. Resolved pending system issues
- Developed a defect tracking system to maintain and track all application defects and enhancements.
- Generated testing statistics - Developed PC programs to provide testing statistics.
- Planned, managed and coordinated testing activities in order to meet aggressive deadlines.
- Participated in TQM and ISO9000 meetings to prepare for ISO9000 certification.
- Established, documented and flowcharted departmental business processes/procedures.
- Generated user procedures for new business requirements and system constraints.
- Automated existing manual user procedures. Performed QA analysis on system documentation.
- Scheduled and managed the departmental and company wide training effort for new equipment mgt. system.
- Generated training material and conducted training classes for approximately 50 professionals.
- Designed and developed a database to track and monitor training results.
- Generated and documented testing procedures and guidelines.
- Created and executed system/regression test cases/scripts; reviewed all related output.
- Identified, researched, documented and escalated all system defects.
- Designed and implemented a global database to track all issues, problems, special procedures/rules.
- Generated conversion project plans. Defined, documented and managed conversion requirements.