QA Analyst / BA Resume
Washington, DC
SUMMARY:
- Strong expertise in quality assurance and software testing of n-tier application systems
- Expert in analyzing and reviewing complex functional, business, and technical requirements, translating them into test cases and scripts, and preparing detailed test plans
- Demonstrated extensive knowledge and understanding of Structured Analysis and Design as it relates to test case design; well versed in writing test cases from requirements documents
- Good knowledge of the software development life cycle (SDLC), pre-production validation, and user acceptance testing
- Extensive exposure to quality standards and practices across all life cycle stages, with experience managing and improving CMMI Level 3 processes and implementing continuous improvement initiatives with IT process integration capabilities
- Good understanding of and experience with different QA methodologies, use cases, sequence diagrams, Siebel CRM concepts, and RDBMS concepts. Testing skills include white box, black box, user acceptance, regression, integration, performance, smoke, and data-driven testing.
- Sound knowledge of system performance patterns and system concurrency (e.g., synchronization, threading, serialization, logical/physical contention)
- Sound knowledge of performance test project management and test delivery in large-scale project integration
- Experience in both front-end and back-end testing
- Experience in Salesforce testing
- Extensively involved in manual testing of applications
- Experience in RDBMS technology and web development using HTML, CSS, DHTML, JavaScript, and VBScript
- Flexible and versatile to adapt to any new environment and work on any project.
- Expertise in version control and bug tracking tools such as PVCS Tracker and Rational ClearQuest
- Strong analytical skills, with the ability to create and refine technical documents and data, analyze performance results, and create formal technical reports
- Very effective at project/task coordination, with an excellent ability to track and follow up on tasks
- Excellent oral and written communication skills
SOFTWARE SKILLS:
Testing Tools: Team Foundation Server, Quality Center (ALM), Quick Test Professional (UFT), LoadRunner, JMeter
Defect Tracking Tools: Quality Center, JIRA, TFS
Version Control Tools: PVCS, Clear Case, VSS
Databases: MS Access, Oracle 10g/9i/8i/8.x, Sybase ASE 12.5/12
Programming Languages: TSL, HTML, JavaScript, VBScript, C, C++, PL/SQL
Internet Technologies: JSP, HTML, ASP, VBScript, DHTML, XML
Application/Case Tools: TOAD, Erwin 5.x/3.5, SQL Navigator, Visio, Golden
Operating Systems: Windows (all versions), MS-DOS, UNIX (AIX, HP-UX)
Web Servers: BEA WebLogic Server, Apache Tomcat, IIS 4.0, PWS, and MTS
Auxiliary Skills: UML, MS Office Suite, Visio, MS FrontPage
PROFESSIONAL EXPERIENCE
Confidential, Washington, DC
QA Analyst / BA
Responsibilities:
- Gather and write requirements for CMMI Level 3 using FISMA 2002 compliance in an Agile/Waterfall methodology
- Perform QA audits of the SARS auditing tool application based on updates from the stakeholders
- Once requirements or new enhancements are gathered, conduct artifact audits by gathering, examining, and documenting appropriate and sufficient evidence of project artifacts to ensure compliance with established project management and system development life cycle policies and procedures
- Design and create test cases and scripts to address business and technical use cases.
- Participate in troubleshooting and triaging of issues with different teams to drive towards root cause identification and resolution.
- Created requirements and logged and maintained test cases, defect reports, and traceability using TFS
- Documented test strategy and testing guidelines for implementation of services.
- Support production deployment of applications and perform “validation testing”.
- Write SQL queries to verify data accuracy
- Assist with application installation and user acceptance testing (UAT)
- Provide assistance and advice to business users in the effective use of applications and information technology
- Provide creative and practical solutions to meet Enlightened's goals
- Identify, analyze, and document IT hardware/software defects, questionable functions, errors, inconsistencies, and deviations from standards, and suggest options to conform to standards, both individually and in a team environment
Environment: IE, Firefox, Chrome, Firebug, Snipping tool, .NET, Visual Studio, TFS, Excel, Word.
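Back-end checks like the SQL data-accuracy queries mentioned above usually reduce to comparing row counts and row contents between a source and a target table. A minimal sketch using Python's built-in sqlite3 (the table and column names are hypothetical, not from any project above):

```python
import sqlite3

# In-memory database standing in for a real test environment (hypothetical schema).
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE source_orders (id INTEGER, amount REAL);
    CREATE TABLE target_orders (id INTEGER, amount REAL);
    INSERT INTO source_orders VALUES (1, 10.0), (2, 25.5), (3, 40.0);
    INSERT INTO target_orders VALUES (1, 10.0), (2, 25.5), (3, 40.0);
""")

# Row-count check: source and target should match after a load.
src_count = conn.execute("SELECT COUNT(*) FROM source_orders").fetchone()[0]
tgt_count = conn.execute("SELECT COUNT(*) FROM target_orders").fetchone()[0]
assert src_count == tgt_count, f"row count mismatch: {src_count} vs {tgt_count}"

# Value check: rows present in the source but not the target indicate bad data.
mismatches = conn.execute("""
    SELECT id, amount FROM source_orders
    EXCEPT
    SELECT id, amount FROM target_orders
""").fetchall()
assert mismatches == [], f"unexpected differences: {mismatches}"
print("data verification passed")
```

The same `EXCEPT`-style comparison runs unchanged against Oracle (as `MINUS`) or SQL Server, which is what makes it a convenient template for data-accuracy test cases.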
Confidential, Bethesda, MD
Security Test Engineer
Responsibilities:
- Good understanding of techniques, standards, and state-of-the-art capabilities for DUO authentication
- Perform QA audits in an IT environment for Waterfall and Agile projects
- Conduct artifact audits by gathering, examining, and documenting appropriate and sufficient evidence of project artifacts to ensure compliance with established project management and system development life cycle policies and procedures
- Test VMware connections over different VPN connections during Remote Desktop Connection testing with different user accounts
- Test the accessibility of the application after remote connection, verifying that jobs run as expected
- Write and execute test cases for the ServiceNow application for in-house maintenance and perform regression testing
- Perform regression testing on the company website and maintain the test cases and defect tracker in Excel
- Integrated tools with ServiceNow applications utilizing MID Server and email technologies
- Test the scalability and landscape orientation of the AGI application on iPad and iPhone
- Work in an Agile/Waterfall environment
- Monitor scheduled batch jobs
Environment: Windows XP, Firefox, Firebug, Snugget, SharePoint, Word, Excel
Confidential, Washington, DC
QA Engineer
Responsibilities:
- Establishing, sustaining, driving, and monitoring quality assurance processes, practices, and controls in support of application and system requirements, development, and test activities throughout the software development and sustainment life cycles
- After receiving new enhancements or requirements from upper management, conduct artifact audits by gathering, examining, and documenting appropriate and sufficient evidence of project artifacts to ensure compliance with established project management and system development life cycle policies and procedures
- Writing stress and performance test plans, creating and executing stress and performance test scripts, and leading the testing activities; able to adapt quickly to an existing, complex environment
- Develop, maintain, and execute automated and manual test scripts for requirement validation, integration, regression, performance, and usability testing
- Documenting defects using a bug tracking system and reporting defects to engineers and product managers
- Produce status reports related to the testing process
- Perform analysis related to enhancements to the application
- Review and maintain all project documentation in accordance with department procedures
- Experience in VBScript and in using UFT in an integrated environment (step generator, synchronization, actions, recovery scenarios, and methods)
- Draft, edit, and implement Help Text files for existing and new modules
- Lead Lessons Learned or Project Review sessions, and identify potential work process improvements
- Review requirement docs and provide feedback.
- Create test data and verify the data sets.
- UAT testing for the entire application
- Balancing mortgage calculations manually using the formulas
- Working in an Agile/Waterfall environment
- Execute batch files using UNIX commands.
Environment: HP ALM, Windows, IE, Firefox, .NET, Jira, Firebug 2.0.11, Cucumber, Selenium, JMeter, SharePoint, SQL Server Management Studio 2008 Express, NUnit 2.6.4, Jing from TechSmith, FirePath, MS Access, Linux environment
Confidential, Washington, DC
Performance Engineer
Responsibilities:
- Conduct system performance testing to ensure system reliability, capacity and scalability.
- Work with testing team to develop performance test plans and cases.
- Analyze test results and coordinate with development teams for bug fixes.
- Generate test summary reports for management review.
- Analyze root causes of performance issues and provide corrective actions.
- Suggest new tools and techniques to improve testing efficiency.
- Assist in project planning, scheduling, budgeting and tracking activities.
- Provide support in project design, development and deployment activities.
- Develop automated test scenarios and environments for performance testing.
- Review and recommend improvements to existing test procedures.
Environment: JMeter 2.11, HP ALM, Windows, IE, Windchill, Safari, Firefox, Java, Informatica, CipherCloud, Integration Hub, Salesforce Cloud
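Performance runs like those described above boil down to driving concurrent transactions and summarizing response times against a target. A minimal sketch in Python; the timed function and the 100 ms SLA are hypothetical stand-ins for a real JMeter/HTTP transaction and a real service-level target:

```python
import statistics
import time
from concurrent.futures import ThreadPoolExecutor

def transaction():
    """Stand-in for one request to the system under test (hypothetical)."""
    start = time.perf_counter()
    time.sleep(0.01)  # simulate ~10 ms of server work
    return time.perf_counter() - start

# Drive 50 transactions with 10 concurrent virtual users.
with ThreadPoolExecutor(max_workers=10) as pool:
    timings = list(pool.map(lambda _: transaction(), range(50)))

# Summarize the run the way a load-test report would: average and 90th percentile.
avg_ms = statistics.mean(timings) * 1000
p90_ms = statistics.quantiles(timings, n=10)[-1] * 1000
print(f"avg={avg_ms:.1f} ms  p90={p90_ms:.1f} ms  samples={len(timings)}")

# A pass/fail threshold (hypothetical 100 ms SLA) turns the numbers into a verdict.
assert p90_ms < 100, "90th percentile exceeded the response-time target"
```

Reporting the 90th percentile rather than the average mirrors how LoadRunner and JMeter summary reports flag tail latency, which is usually what capacity and scalability verdicts hinge on.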
Confidential, Ashburn, VA
QA Engineer/Tester
Responsibilities:
- Worked closely with my manager to create the Test Strategy to establish a context for testing based on high-level test objectives, risk for the components under test, test environment needs and the organizational standards and controls to be applied to the testing activities.
- Identify/resolve potential application performance issues at all levels (i.e., application, database, network, infrastructure)
- Performed Load, Stress and Performance testing of front end using LoadRunner
- Create load testing reports to summarize test findings, identify any known risks, and make recommendations for improvement
- Provided assistance for Unit Testing of Modules written in C/C++, Java
- Led the web services testing activities using SOAPUI Pro
- Validated XML responses against the XSD format and made sure that there were no SOAP faults
- Performed data-driven testing with SOAPUI Pro and tested with large data sets
- Validated the WSDL and XML files in SOAPUI and verified SOAP requests and responses for various services
- Wrote Java and VB scripts for minor custom testing when required
- Validated data conversion requirements and data mappings.
- Created the Test Plan covering what will be tested and how it will be tested, the resources and timelines needed for testing, and the assumptions, issues, dependencies and risks associated with the plan.
- Involved in the development of system test cases based on the business/system requirement documents and business rules.
- Created expected results for data conversion by extracting data from the DB2 database, applying data scrubbing and conversion logic, and storing the results in the expected results database (Oracle)
- Created an Access application that automated the creation of expected results for back-end and report testing. This application used DB2 and Oracle source databases and created expected results in the form of Access reports and tables.
- Created and executed SQL queries that compared expected results to actual results, and updated test results in Quality Center
- Closely interacted with the business analysts, designers and software developers to analyze application functionality and navigational flow for Automated and Manual Testing.
Environment: Rational Test Manager, Rational ClearQuest, QTP, Quality Center, LoadRunner 11.5, Java, J2EE, XML, Flex, Oracle 10g, TOAD, PL/SQL, DB2, UNIX shell scripting, MS Project, WSDL, XML, SOAPUI, SQL Navigator, Windows XP, Ascential, Informatica
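SOAP-fault checks like the ones above can also be scripted outside SOAPUI. A minimal sketch with Python's standard xml.etree; the response payload, operation name, and expected value are hypothetical, and a real test would capture the XML from the service under test:

```python
import xml.etree.ElementTree as ET

SOAP_ENV = "http://schemas.xmlsoap.org/soap/envelope/"

# Hypothetical SOAP 1.1 response payload (not from any service named above).
response = """<?xml version="1.0"?>
<soap:Envelope xmlns:soap="http://schemas.xmlsoap.org/soap/envelope/">
  <soap:Body>
    <GetQuoteResponse><Price>42.50</Price></GetQuoteResponse>
  </soap:Body>
</soap:Envelope>"""

root = ET.fromstring(response)

# A SOAP fault appears as a Fault element inside the Body; its absence is
# one of the pass criteria for the response validation described above.
body = root.find(f"{{{SOAP_ENV}}}Body")
faults = body.findall(f"{{{SOAP_ENV}}}Fault")
assert faults == [], "service returned a SOAP fault"

# Verify the element and value the test case expects are in the response.
price = body.find("GetQuoteResponse/Price")
assert price is not None and price.text == "42.50"
print("SOAP response validation passed")
```

Schema (XSD) conformance would be a separate step with a validating library, since xml.etree only checks well-formedness; the sketch covers the fault and expected-value assertions.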
Confidential, Boston, MA
QA Engineer
Responsibilities:
- Created, maintained, tracked the details and project schedule for System Performance team
- Conducted pre- and post-migration performance testing of the application and compared test results to ensure no performance degradation
- Used LoadRunner 11 for stress testing of the customer-facing interface
- Used UNIX shell scripts for file management, running large-volume data loads and batch jobs, installing builds on test machines, and verifying the log files
- Using LoadRunner Analysis, generated various reports for upper management
- Created specific scenarios for load-balance testing with LoadRunner, such as server failover during peak traffic periods
- Coordinated effectively between the development and testing teams
- Involved in manual testing of test cases.
- Developed test cases using WSDL and schema files, which define the web service request, response, methods/operations, and endpoint of the web service to be tested
- Responsible for setting up the web services project using WSDL in SOAPUI, and provided setup help to other team members
- Tested web services using a client generated with SOAPUI to track SOAP request and response messages
- Used QTP to automate a few scenarios
- Investigated and documented software faults and interfaced with developers to resolve technical issues.
- Developed and executed the test cases using QC 11
- Participated from time to time in end-to-end testing for every release, using both manual and automated scripts with WinRunner, and monitored server responses using Perl scripts
- Used SQL queries for back end testing.
Environment: Oracle, DB2, SQL, Windows, UNIX, Linux Server, Test Director, SOAPUI, SOX, HTTP, SSL, IE, Firefox, Safari
Confidential, McLean, VA
QA Engineer/Tester
Responsibilities:
- Worked as Quality Assurance Tester / Analyst
- As a QA Tester actively involved in Analysis and Design meetings to ensure that assigned functionality meets the user requirements
- Worked closely with developers to better understand the requirement specs with a view to improve quality of the application
- Worked on writing test plan, test cases, test scripts and test matrix.
- Extensively involved in system testing, regression testing, integration testing
- Used Test Director for Test Planning, Test design, Test execution, defect tracking and defect reporting
- Analyzed Business Requirements
- Modified existing WinRunner test scripts as part of regression testing
- Wrote UNIX shell scripts to support the development team
- Validated the layout of feed files from different systems
- Verified shell scripts to make sure they invoke SQL*Loader properly
- Used SQL scripts/queries to verify that the data loaded properly
- Ran reports to verify that they pull the correct data from the database tables
- Performed negative testing by passing wrong or duplicate values to the application
- Performed web testing to check that all hyperlinks are valid and HTML code usage is correct
- Tested high-priority defects in the production environment
- Coordinated with users and business analysts to resolve issues
- Executed test cases using the Test Lab in Test Director and logged defects in TD
- Involved in implementing new software into production
- Used PVCS Tracker to track the defects during UAT
- Involved in setting up test environment
Environment: Oracle 8, Sun Solaris, Actuate Reports, HTML, JavaScript, Pro*C, C++, Oracle Internet Application Server, PVCS, TOAD, ClearQuest, WinRunner
Confidential
QA Engineer/Tester
Responsibilities:
- Working on project deliverable schedule meetings with project management teams
- Working on documentation and the collection of metrics and reports
- Responsible for implementing the QA process at the project level
- Tested the capacity, load, stress, and overall performance of the servers
- Writing and executing batch jobs on UNIX servers
- Performed load and stress testing using LoadRunner, including the creation and execution of load test scripts per the test plan; SiteScope monitoring and smoke testing
- Responsible for creating the load test scenarios from scratch, including assigning users for each script, adding monitors for the system and application counters, assigning load test generators, and configuring the runtime settings and scheduler aspects of the LoadRunner Controller
- Used DOORS as the requirements and traceability management tool
- Participate in business calls and review business requirements, system requirements, and usability feedback
- Developed a detailed test plan and testing strategy for the entire application and developed various test cases using Test Director
- Extensively involved in database and manual testing
- Captured the SQL statements from application execution and manually checked the results as part of extensive database testing
- Used TOAD and SQL for executing the SQL queries
- Performed extensive database testing, comparing data before and after loads
- Tested the web services using a SOAP tool
- Participated from time to time in regression and end-to-end testing for each release, using both manual and automated scripts with WinRunner, and monitored server responses using Perl scripts
- Prepared test data and loaded it into the testing databases
- Reported problems using the Test Director/Quality Center defect tracking system, followed up with development teams, and made sure defects were fixed in an appropriate timeframe
- Attended technical walk-through sessions and daily stand-up calls
- Participated in daily/weekly stand-up meetings with business analysts and developers
- Used Bugzilla as the defect tracking tool
Environment: Windows XP, Java, JavaScript, J2EE, Oracle 8i, UNIX, HTML, XML, XSLD, CSS, ATG Dynamo 6.4, Documentum Content Server 5.2.5, WinRunner, LoadRunner, Quality Center, DOORS, Bugzilla, and TOAD