Software Test Documentation
Project Name
Version
Date
i. Introduction
This test plan for Mobile App xxx supports the following objectives:
1. To define the tools to be used throughout the testing process.
2. To communicate to the responsible parties the items to be tested, set expectations
around schedule, and define environmental needs.
3. To define how the tests will be conducted.
ii. Test Items:
List the test items (software / products) and their versions. These are the things you intend to test within the scope of this test plan. The list can be developed from the software application's test objectives inventory as well as from other sources of documentation and information, such as:
● Requirements Specifications
● Design Specifications
You can use the table below to list the items:
ITEMS TO BE TESTED | VERSION NUMBER
Item 1             | V1.3
Item 2             |
Item 3             |
Item 4             |
Item 5             |
iii. Features to Be Tested:
This is a listing of what is to be tested from the user's viewpoint of what the system does. It is not a technical description of the software but a user's view of its functions. It is recommended to identify the test design specification associated with each feature or set of features.
Features to be tested include the following:
FEATURE TO BE TESTED | DESCRIPTION | PRIORITY
Feature 1            |             | L / M / H
Feature 2            |             | L / M / H
Feature 3            |             | L / M / H
Feature 4            |             | L / M / H
Feature 5            |             | L / M / H
Feature 6            |             | L / M / H
Feature 7            |             | L / M / H
iv. Features Not To Be Tested:
This is a listing of what is not to be tested, from both the user's viewpoint of what the system does and a configuration management / version control view. It is not a technical description of the software but a user's view of its functions.
FEATURE NOT TO BE TESTED | REASON
Feature 1                |
Feature 2                |
Feature 3                |
Feature 4                |
Feature 5                |
Feature 6                |
Feature 7                |
v. Approach:
This is the overall test strategy for this test plan; it should be appropriate to the level of the plan (master, acceptance, etc.) and should be consistent with all higher- and lower-level plans. Overall rules and processes should be identified.
Describe the overall approach to testing and specify the following (a brief illustrative sketch of an automated functional check follows this list):
o Testing levels (e.g. unit, integration, system, acceptance)
o Testing types (e.g. functional, regression, performance, security)
o Testing methods (e.g. manual exploratory testing, automated scripted testing)
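As a minimal sketch of one such testing method (automated functional testing), the example below assumes pytest as the test runner; validate_login() is a hypothetical placeholder standing in for a real call into the application under test, not an actual function of Mobile App xxx.

```python
# Minimal sketch of an automated functional check, assuming pytest as the
# runner. validate_login() is a hypothetical stand-in for the real
# application call under test.
import pytest


def validate_login(username: str, password: str) -> bool:
    # Placeholder rule standing in for the application's real logic.
    return bool(username) and len(password) >= 8


@pytest.mark.parametrize("username,password,expected", [
    ("alice", "correct-horse-battery", True),   # happy path
    ("alice", "short", False),                  # boundary: password too short
    ("", "correct-horse-battery", False),       # missing username
])
def test_validate_login(username, password, expected):
    assert validate_login(username, password) is expected
```

A real suite would replace the placeholder with calls against the test items listed in section ii and would be run with the tools listed under Environmental Needs.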
vi. Pass/Fail Criteria:
What are the completion criteria for this plan? This is a critical aspect of any test plan and should be appropriate to the level of the plan. The goal is to determine whether or not a test item has passed the test process; an illustrative automated check of such criteria follows the table below.
CRITERIA | DESCRIPTION
Item 1   | xxx
Item 2   |
Item 3   |
Item 4   |
Item 5   |
Item 6   |
Item 7   |
Item 8   |
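As a hedged illustration only, the snippet below shows how completion criteria such as a minimum pass rate and no open critical defects could be checked automatically; the 95% threshold is an assumed example value, not one mandated by this plan.

```python
# Illustrative check of completion (pass/fail) criteria. The 95% pass-rate
# threshold and the "no open critical defects" rule are assumed examples.
def exit_criteria_met(passed: int, failed: int, open_critical_defects: int,
                      min_pass_rate: float = 0.95) -> bool:
    executed = passed + failed
    if executed == 0:
        return False  # nothing executed yet, so the plan cannot be complete
    pass_rate = passed / executed
    return pass_rate >= min_pass_rate and open_critical_defects == 0


# Example: 190 of 200 executed cases passed, but one critical defect is open.
print(exit_criteria_met(passed=190, failed=10, open_critical_defects=1))  # False
```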
vii. Suspension Criteria:
Know when to pause a series of tests, or possibly terminate a set of tests altogether. Once testing is suspended, how is it resumed, and what are the potential impacts (e.g. additional regression testing)?
If the number or type of defects reaches a point where follow-on testing has no value, it makes no sense to continue the test; you are just wasting resources. A simple illustrative stoppage check follows the points below.
· Specify what constitutes stoppage for a test or series of tests, and what the acceptable level of defects is that will allow testing to proceed past the defects.
· Testing after a truly fatal error will generate conditions that may be identified as defects but are in fact ghost errors caused by the earlier defects that were ignored.
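The sketch below is a hedged example of such a stoppage rule, suspending the cycle once open defects of a given severity reach an assumed threshold; the severity names and limits are illustrative, not values required by this plan.

```python
# Illustrative suspension/resumption rule based on open defect counts by
# severity. The severities and thresholds are assumed example values.
SUSPENSION_THRESHOLDS = {"blocker": 1, "critical": 3}


def should_suspend(open_defects):
    """Suspend testing when any severity reaches its threshold."""
    return any(open_defects.get(severity, 0) >= limit
               for severity, limit in SUSPENSION_THRESHOLDS.items())


def can_resume(open_defects):
    """Resume (followed by a regression pass) once counts fall below threshold."""
    return not should_suspend(open_defects)


print(should_suspend({"blocker": 1, "critical": 0}))  # True: suspend testing
print(can_resume({"blocker": 0, "critical": 2}))      # True: resume + regression
```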
viii. Test Deliverables:
• Once all defects reported during testing have been fixed and no further defects are found, the report will be deployed to the client's test site by the PM.
• One round of testing will be done by QA on the client's test site if required. The report will be delivered, along with sample output, by email to the respective lead and the Report group.
• QA will submit the filled-in hard copy of the delivery slip to the respective developer.
• Once the lead receives the hard copy of the delivery slip filled in by QA and the developer, he or she will send the report delivery email to the client.
ix. Testing Tasks
There should be tasks identified for each test deliverable. Include all inter-task
dependencies, skill levels, etc. These tasks should also have corresponding tasks
and milestones in the overall project tracking process (tool).
If this is a multi-phase process, or if the application is to be released in increments, there may be parts of the application that this plan does not address. These areas need to be identified to avoid any confusion should defects be reported back on those future functions. This will also allow users and testers to avoid incomplete functions and prevent wasting resources chasing non-defects.
TASK NAME | NOTES
Test Planning |
Review Requirements documents |
Create test basis |
Functional specifications written and delivered to the testing team |
Iteration 2 deploy to QA test environment |
Functional testing |
System testing |
Regression testing |
Resolution of final defects and final build testing |
Deploy to Staging environment |
Performance testing |
Release to Production |
Prepare test summary report |
x. Environmental Needs:
Are there any special requirements for this test plan, such as:
· Special hardware such as simulators, static generators, etc.
· How will test data be provided? Are there special collection requirements or specific ranges of data that must be provided?
· How much testing will be done on each component of a multi-part feature?
· Special power requirements.
· Specific versions of other supporting software.
· Restricted use of the system during testing.
· Tools (both purchased and created).
· Communications:
  o Web
  o Client/Server
  o Network
  o Topology
  o External
  o Internal
  o Bridges/Routers
1- Testing Tools
PROCESS              | TOOL
Test case creation   |
Test case tracking   |
Test case execution  |
Test case management |
Defect management    |
Test reporting       |
Checklist creation   |
Project structure    |
2- Test Environment
For example (a small data-structure sketch of this support matrix follows the list):
o Support level 1 (browsers):
  Windows 8: Edge, Chrome (latest), Firefox (latest)
  Mac OS X: Chrome (latest), Firefox (latest)
  Linux Ubuntu: Chrome (latest), Firefox (latest)
o Support level 1 (devices):
  iPhone 5 / 6, iPad 3, Nokia Lumia 910, Google Nexus 7, LG G3.
o Support level 2:
  Windows 7: IE 9+, Chrome (latest), Firefox (latest)
  Windows XP: IE 8, Chrome (latest), Firefox (latest)
o Support level 3: anything else
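The fragment below is a small sketch, under the assumption that the team wants this support matrix in machine-readable form (for example, to drive browser configuration in CI); the entries simply mirror the example levels listed above.

```python
# Sketch of the support-level matrix encoded as data so it can drive
# environment checks or CI job generation. Entries mirror the example
# support levels listed in this section and are assumptions for this project.
SUPPORT_MATRIX = {
    1: {
        "Windows 8": ["Edge", "Chrome (latest)", "Firefox (latest)"],
        "Mac OS X": ["Chrome (latest)", "Firefox (latest)"],
        "Linux Ubuntu": ["Chrome (latest)", "Firefox (latest)"],
    },
    2: {
        "Windows 7": ["IE 9+", "Chrome (latest)", "Firefox (latest)"],
        "Windows XP": ["IE 8", "Chrome (latest)", "Firefox (latest)"],
    },
}


def support_level(platform, browser):
    """Return the support level for a platform/browser pair; 3 means 'anything else'."""
    for level, platforms in sorted(SUPPORT_MATRIX.items()):
        if browser in platforms.get(platform, []):
            return level
    return 3  # support level 3: anything else


print(support_level("Windows 8", "Edge"))     # 1
print(support_level("Windows 10", "Safari"))  # 3
```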
xi. Responsibilities
There should be a responsible person for each aspect of the testing and the test process.
Each test task identified should also have a responsible person assigned.
This includes all areas of the plan; here are some examples:
· Setting risks.
· Selecting features to be tested and not tested.
· Setting overall strategy for this level of plan.
· Ensuring all required elements are in place for testing.
· Providing for resolution of scheduling conflicts, especially if testing is
done on the production system.
· Who provides the required training?
· Who makes the critical go/no go decisions for items not covered in the test plans?
· Who delivers each item in the test items section?
List the responsibilities of each team / role / individual:
STAFF MEMBER | ROLE            | RESPONSIBILITIES
             | Project Manager | 1. Acts as the primary contact for the development and QA teams.
             |                 | 2. Responsible for the project schedule and the overall success of the project.
             | QA Lead         | 1. Participation in the project plan creation/update process.
             |                 | 2. Planning and organization of the test process for the release.
             |                 | 3. Coordinate with QA analysts/engineers on any issues/problems encountered during testing.
             |                 | 4. Report progress on work assignments to the PM.
             | QA              | 1. Understand the requirements.
             |                 | 2. Write and execute test cases.
             |                 | 3. Prepare the RTM.
             |                 | 4. Review test cases and the RTM.
             |                 | 5. Defect reporting and tracking.
             |                 | 6. Retesting and regression testing.
             |                 | 7. Bug review meetings.
             |                 | 8. Preparation of test data.
             |                 | 9. Coordinate with the QA Lead on any issues.
xii. Schedule
The schedule should be based on realistic and validated estimates. If the estimates for the development of the application are inaccurate, the entire project plan will slip, and testing, as part of the overall project plan, will slip with it.
Testing will begin 4 weeks prior to the launch date. The first round of testing should be completed in 1 week.
Task Name | Start | Finish | Notes
Test Planning | | |
Review Requirements documents | | |
Create test basis | | |
Functional specifications written and delivered to the testing team | | |
Iteration 2 deploy to QA test environment | | |
Functional testing | | |
System testing | | |
Regression testing | | |
Resolution of final defects and final build testing | | |
Deploy to Staging environment | | |
Performance testing | | |
Release to Production | | |
Prepare test summary report | | |
xiii. Risks And Contingencies:
What are the overall risks to the project with an emphasis on the testing process?
· Lack of personnel resources when testing is to begin.
· Lack of availability of required hardware, software, data or tools.
· Late delivery of the software, hardware or tools.
· Delays in training on the application and/or tools.
· Changes to the original requirements or designs.
RISK TYPE    | DETAILS | RISK RATING | CONTINGENCY
SCHEDULE     | A slip in the schedule in one of the other phases could result in a subsequent slip in the test phase. | L/M/H |
TECHNICAL    | As this is a new system, if there is a failure the old system can be used. | L/M/H |
MANAGEMENT   | Management can reduce the risk of delays by supporting the test team. | L/M/H |
PERSONNEL    | It is important to have experienced testers on this project. | L/M/H |
REQUIREMENTS | The test plan and its schedule are based on the requirements document. | L/M/H |
xiv. Approvals
The names and titles of all persons who must approve this plan.
#         | PROJECT MANAGER | QA LEAD
NAME      |                 |
SIGNATURE |                 |