A Test Plan is a critical document that outlines the objectives, scope, strategy, schedule, resources, and deliverables of testing. It serves as a blueprint for the testing process and ensures that all stakeholders have a clear understanding of how testing will be conducted.
Let’s break down the key components of a test plan one by one:
Example of Scope Definition:
| In-Scope | Out-of-Scope |
|---|---|
| - Login functionality | - Third-party integration testing |
| - User registration | - Backend database migration |
| - Payment processing | - Mobile app compatibility testing |
The test approach outlines how testing will be performed, including techniques, tools, and strategies.
Key Elements:
- Techniques to be Used
- Tools and Automation Strategy
- Levels of Testing
Example of Test Approach for a Web Application:
Example of Resource Requirements:
| Resource Type | Details |
|---|---|
| Human Resources | 1 Test Manager, 3 Test Analysts |
| Tools | Selenium, JIRA, Postman |
| Hardware | 2 Windows servers, 2 Linux VMs |
Example of a Test Schedule:
| Activity | Start Date | End Date |
|---|---|---|
| Test Planning | Jan 1 | Jan 5 |
| Test Case Design | Jan 6 | Jan 15 |
| Test Execution | Jan 16 | Jan 31 |
| Defect Fixing and Retesting | Feb 1 | Feb 7 |
| Test Closure and Reporting | Feb 8 | Feb 10 |
Milestones:
Steps in Risk Management:
Example of Risk Management:
| Risk | Likelihood | Impact | Mitigation Plan |
|---|---|---|---|
| Delay in test environment setup | High | High | Arrange alternative testing environments. |
| Test data is unavailable | Medium | High | Generate synthetic test data or request backup. |
| Team member unavailability | Medium | Medium | Cross-train team members for critical tasks. |
Examples of Exit Criteria:
- At least 95% test coverage achieved
- 0 critical defects remaining open

Example of a Test Plan Summary:
| Section | Description |
|---|---|
| Test Objectives | Ensure login functionality works correctly. |
| Scope | Test login, search, and payment modules only. |
| Test Approach | Black-box testing for functionalities. |
| Resource Requirements | 2 testers, Selenium, 2 Windows machines. |
| Schedule and Milestones | Execution: Jan 16–31; Closure: Feb 10. |
| Risks | Delay in environment setup → Use backup server. |
| Exit Criteria | 95% test coverage, 0 critical defects remaining. |
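The exit criteria from the sample plan (95% test coverage, 0 critical defects) can be checked programmatically. A minimal sketch; the function name and thresholds are illustrative, not prescribed by the ISTQB syllabus:

```python
def exit_criteria_met(coverage_pct, open_critical_defects,
                      min_coverage=95.0, max_critical=0):
    """Return True when testing may stop under the sample plan's criteria."""
    return coverage_pct >= min_coverage and open_critical_defects <= max_critical

# Coverage target reached and no critical defects remain:
print(exit_criteria_met(96.2, 0))   # True
# Coverage is sufficient but one critical defect is still open:
print(exit_criteria_met(96.2, 1))   # False
```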
| Component | Purpose | Example |
|---|---|---|
| Test Objectives | Define measurable goals for testing. | Verify login functionality works correctly. |
| Scope of Testing | Specify what will and will not be tested. | Include: Login; Exclude: Database testing. |
| Test Approach/Strategy | Outline techniques, tools, and testing methods. | Use Selenium for automated regression tests. |
| Resource Requirements | Specify team, tools, and environments. | 3 testers, JIRA for defect tracking. |
| Schedule and Milestones | Define timelines and key milestones. | Test execution: Jan 16–Jan 31. |
| Risk Management | Identify and mitigate risks. | Mitigation: Use backup test environments. |
| Exit Criteria | Define conditions to stop testing. | 90% coverage, 0 critical defects. |
Test metrics are measurable values used to assess testing progress and performance. Below are common metrics:
| Metric | Description | Example |
|---|---|---|
| Number of Test Cases Executed | How many test cases have been run. | “200 out of 300 test cases executed.” |
| Number of Defects Found | Total defects identified during testing. | “15 defects identified so far.” |
| Defects Fixed vs. Open | How many defects have been fixed vs. pending. | “10 defects fixed, 5 still open.” |
| Test Coverage Percentage | Percentage of requirements or code tested. | “80% of requirements are tested.” |
| Defect Detection Rate | Rate at which defects are being detected. | “Finding 5 defects per day on average.” |
Let’s assume a test team is working on a project with 100 test cases.
| Day | Planned Execution | Actual Execution | Defects Found |
|---|---|---|---|
| Day 1 | 20 | 15 | 5 |
| Day 2 | 40 | 35 | 8 |
| Day 3 | 60 | 55 | 12 |
Observation: Actual execution trails the plan by five test cases each day, so corrective action (test control) may be needed.
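Monitoring data like the table above can be compared against the plan to flag slippage automatically. A hypothetical sketch (the data structure and tolerance parameter are assumptions):

```python
# Daily monitoring data from the example (cumulative test cases executed).
progress = [
    {"day": 1, "planned": 20, "actual": 15},
    {"day": 2, "planned": 40, "actual": 35},
    {"day": 3, "planned": 60, "actual": 55},
]

def behind_schedule(rows, tolerance=0):
    """Return the days on which actual execution trails the plan by more than the tolerance."""
    return [r["day"] for r in rows if r["planned"] - r["actual"] > tolerance]

print(behind_schedule(progress))               # [1, 2, 3]
print(behind_schedule(progress, tolerance=5))  # []
```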
Test control involves making decisions and taking corrective actions based on the insights gathered through test monitoring.
- Reallocating Resources
- Updating Test Plans
- Prioritizing Test Cases
- Rescheduling Test Activities
Scenario: Testing is behind schedule because of delayed test environment setup.
Control Actions:
- Reallocate testers to test design work until the environment is ready.
- Prioritize high-risk test cases for execution once testing can resume.
- Update the test schedule and communicate revised dates to stakeholders.
Configuration management involves managing changes to software, test artifacts, and related documents to ensure consistency, traceability, and version control.
- Version Control
- Change Control
- Baseline Management
Scenario:
Configuration Management Activities:
| Activity | Purpose | Examples |
|---|---|---|
| Test Monitoring | Measure test progress using metrics. | Track executed test cases, defects, coverage. |
| Test Control | Adjust plans and resources to stay on track. | Reallocate resources, update test schedules. |
| Configuration Management | Manage changes and versions systematically. | Use Git for version control and baseline tracking. |
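Baseline management can be illustrated with a minimal in-memory registry: commit new versions of an artifact, then freeze the current versions under a label. This is a sketch of the concept only; real projects would use a tool such as Git:

```python
class ArtifactRegistry:
    """Track versions of test artifacts and frozen baselines (illustrative only)."""

    def __init__(self):
        self.versions = {}   # artifact name -> list of contents, one per version
        self.baselines = {}  # baseline label -> {artifact: version number}

    def commit(self, name, content):
        """Store a new version of an artifact; returns its 1-based version number."""
        self.versions.setdefault(name, []).append(content)
        return len(self.versions[name])

    def baseline(self, label):
        """Freeze the current version of every artifact under a label."""
        self.baselines[label] = {n: len(v) for n, v in self.versions.items()}

reg = ArtifactRegistry()
reg.commit("login_tests", "TC-01..TC-10")
reg.baseline("release-1.0")
reg.commit("login_tests", "TC-01..TC-12")   # new version after the baseline
print(reg.baselines["release-1.0"])          # {'login_tests': 1}
print(len(reg.versions["login_tests"]))      # 2
```

Even after new versions are committed, the baseline still points at the exact versions that were frozen, which is what makes test results reproducible.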
A risk is a potential problem or uncertain event that may impact the success of the project or product.
Types of Risks:
- Project risks: threaten the project's schedule, budget, or resources (e.g., tester unavailability).
- Product risks: threaten the quality of the product itself (e.g., incorrect tax calculations).
Examples of Risks:
Risk Priority = Likelihood × Impact
| Risk | Likelihood | Impact | Priority |
|---|---|---|---|
| Test environment delay | High | High | High |
| Incorrect tax calculations in software | Medium | High | Medium |
| Tester unavailability | Medium | Medium | Medium |
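The Likelihood × Impact rule can be expressed with an ordinal scale. The numeric scale and thresholds below are assumptions chosen to reproduce the table above; the syllabus does not mandate a particular scoring scheme:

```python
LEVELS = {"Low": 1, "Medium": 2, "High": 3}

def risk_priority(likelihood, impact):
    """Map Likelihood x Impact onto a Low/Medium/High priority (illustrative scale)."""
    score = LEVELS[likelihood] * LEVELS[impact]
    if score >= 9:
        return "High"
    if score >= 4:
        return "Medium"
    return "Low"

print(risk_priority("High", "High"))      # High   (matches row 1)
print(risk_priority("Medium", "High"))    # Medium (matches row 2)
print(risk_priority("Medium", "Medium"))  # Medium (matches row 3)
```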
| Step | Details |
|---|---|
| Identification | Test environment may not be ready for execution. |
| Assessment | Likelihood: High, Impact: High → Priority: High |
| Mitigation | Plan to use a cloud environment (AWS or Azure) if delays occur. |
| Monitoring | Weekly checks on environment readiness; escalate delays immediately. |
A defect is an issue where the actual behavior of the software deviates from the expected behavior.
The defect lifecycle represents the stages a defect passes through from identification to resolution.
| Status | Description |
|---|---|
| New | The defect is reported and logged for the first time. |
| Assigned | The defect is assigned to a developer for fixing. |
| In Progress | The developer is working on resolving the defect. |
| Fixed | The defect has been fixed by the developer. |
| Retested | Testers verify that the defect fix works as expected. |
| Closed | The defect has been verified and is now resolved. |
| Deferred | The defect will be fixed in a later release due to low priority. |
| Rejected | The defect is invalid, not reproducible, or works as designed. |
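The lifecycle can be modeled as a small state machine. The allowed transitions below are one plausible reading of the status table, not an ISTQB-mandated workflow; real defect trackers configure their own:

```python
# Allowed defect-status transitions (one plausible workflow, not ISTQB-mandated).
TRANSITIONS = {
    "New": {"Assigned", "Rejected", "Deferred"},
    "Assigned": {"In Progress"},
    "In Progress": {"Fixed"},
    "Fixed": {"Retested"},
    "Retested": {"Closed", "Assigned"},  # reopen if the fix fails retesting
    "Deferred": {"Assigned"},
    "Rejected": set(),                   # terminal states
    "Closed": set(),
}

def advance(current, new):
    """Move a defect to a new status, rejecting invalid transitions."""
    if new not in TRANSITIONS[current]:
        raise ValueError(f"Cannot move defect from {current} to {new}")
    return new

status = "New"
for step in ["Assigned", "In Progress", "Fixed", "Retested", "Closed"]:
    status = advance(status, step)
print(status)  # Closed
```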
A defect report provides a detailed description of the issue to help developers understand and resolve it quickly.
| Field | Description | Example |
|---|---|---|
| Defect ID | A unique identifier for the defect. | DEF_001 |
| Summary | A brief description of the defect. | “Login button does not respond.” |
| Steps to Reproduce | Clear, step-by-step instructions to reproduce the defect. | 1. Open login page. 2. Click ‘Login’. |
| Actual Result | What the system does incorrectly. | Login button does nothing. |
| Expected Result | What the system should do. | Redirect to homepage. |
| Severity | Impact of the defect (Critical, Major, Minor). | Critical |
| Priority | Urgency to fix the defect (P1 = High, P2 = Medium, P3 = Low). | P1 |
| Environment | The environment where the defect was found. | Windows 10, Chrome 95. |
| Field | Details |
|---|---|
| Defect ID | DEF_002 |
| Summary | “Password reset link throws 404 error.” |
| Steps to Reproduce | 1. Open login page. 2. Click ‘Forgot Password’. 3. Click reset link in email. |
| Actual Result | 404 error page is displayed. |
| Expected Result | Password reset page should open. |
| Severity | Critical |
| Priority | P1 |
| Environment | Windows 10, Chrome 95 |
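The report fields above map naturally onto a simple data structure. A hypothetical sketch using a Python dataclass (field names are illustrative, not a standard schema):

```python
from dataclasses import dataclass

@dataclass
class DefectReport:
    """Fields of a defect report, mirroring the tables above (names are illustrative)."""
    defect_id: str
    summary: str
    steps_to_reproduce: list
    actual_result: str
    expected_result: str
    severity: str      # Critical / Major / Minor
    priority: str      # P1 / P2 / P3
    environment: str

# The example report DEF_002 as a DefectReport instance:
report = DefectReport(
    defect_id="DEF_002",
    summary="Password reset link throws 404 error.",
    steps_to_reproduce=["Open login page.", "Click 'Forgot Password'.",
                        "Click reset link in email."],
    actual_result="404 error page is displayed.",
    expected_result="Password reset page should open.",
    severity="Critical",
    priority="P1",
    environment="Windows 10, Chrome 95",
)
print(report.defect_id, report.priority)  # DEF_002 P1
```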
Test reporting provides stakeholders with updates about testing progress, defects, and results.
Example of a Test Progress Report:
| Metric | Value |
|---|---|
| Test Cases Executed | 120/150 (80%) |
| Defects Found | 20 |
| Defects Fixed | 15 |
| Test Coverage | 85% |
A summary report is created at the end of testing to communicate overall testing results and outcomes.
Example of a Defect Summary:
| Severity | Count | Fixed | Open |
|---|---|---|---|
| Critical | 5 | 5 | 0 |
| Major | 10 | 8 | 2 |
| Minor | 5 | 5 | 0 |
While the test plan is a management artifact, it is also a work product subject to static testing techniques. A test plan is not simply written and forgotten: like requirements or design documents, it should be reviewed (e.g., via a walkthrough or inspection) before execution begins so that defects in the plan itself are found early.
This table helps clarify the difference between “management-level” and “execution-level” testing activities—a frequent source of confusion in ISTQB questions.
| Activity Type | Test Management Activity | Test Execution Activity |
|---|---|---|
| Planning | Define objectives, scope, schedule in the test plan | Design test cases based on requirements |
| Control | Adjust scope, schedule, or resources based on test progress | Execute test cases; retest fixed defects |
| Monitoring | Collect and analyze metrics (e.g., coverage, defect trends) | Log defects, record test results |
| Reporting | Create progress reports and summaries for stakeholders | Document status of each test case |
| Closure | Ensure exit criteria are met, assess lessons learned | Finalize defect retests and close test cycles |
ISTQB Exam Tip:
Be prepared to identify which activities belong to test management vs. execution, especially in scenario-based questions.
What is the purpose of a test plan?
A test plan defines the scope, objectives, approach, resources, and schedule for testing activities.
The test plan provides guidance for the testing process and ensures that testing activities align with project goals. It typically includes information about test levels, test types, responsibilities, environments, entry and exit criteria, and risk considerations. By documenting these elements, the test plan helps coordinate the work of testers, developers, and stakeholders.
What is test monitoring in ISTQB?
Test monitoring is the activity of tracking testing progress and comparing actual results against the planned objectives.
Metrics such as test case execution progress, defect discovery rates, and coverage levels are used to evaluate whether testing is proceeding as expected. Monitoring allows managers to identify deviations from the plan and determine whether corrective actions are required.
What is configuration management in testing?
Configuration management ensures that all test artifacts and system components are properly identified, versioned, and controlled.
Testing involves multiple artifacts such as test cases, scripts, data sets, environments, and software versions. Configuration management tracks these elements so that tests are executed against the correct versions of the system and associated artifacts.
What is defect management?
Defect management is the process of identifying, recording, tracking, and resolving defects throughout the testing lifecycle.
When testers discover failures, they create defect reports describing the issue, reproduction steps, and severity. The defect management process tracks these reports through states such as open, assigned, fixed, retested, and closed. Proper defect tracking ensures visibility of issues, supports communication between testers and developers, and helps prioritize fixes based on severity and business impact.