5. Test Management

5.1 Test Organization

  • Independent testing
  • Tasks of Test Leader and Tester

5.1.1 Recognize the importance of independent testing

5.1.2 Explain the benefits and drawbacks of independent testing within an organization

Benefits:

  • More objective and consistent
  • Specialist testing expertise
  • Dedicated team

Drawbacks:

  • Could be isolated from the development team
  • Communication problems, feeling of alienation
  • Blame games and political backstabbing
  • Developers may lose their sense of responsibility for quality

5.1.3 Recognize the different team members to be considered for the creation of a test team

  • Test Leader
  • Tester

5.1.4 Recall the tasks of a typical test leader and tester

Test leader

  • Test strategy
  • Coordinate
  • Plan
  • Metrics
  • Contribute
  • Initiate and monitor
  • Control
  • Decide tools
  • Test summary report
  • Design test environment
  • Configuration management
  • Automation

Tester

  • Analyse for testability
  • Implement test environment
  • Create test specification
  • Prepare test data
  • Implement test
  • Execute and log the tests
  • Evaluate the results
  • Measure performance
  • Report deviations
  • Automate test
  • Use tools
  • Review

5.2 Test Planning and Estimation

Test Planning:

  • Test policy and Test strategy
  • Estimating techniques
  • Test Plan

Test Approaches:

  • Analytical
  • Model-based
  • Methodical
  • Process-compliant or Standard-compliant
  • Regression-averse
  • Dynamic and heuristic
  • Consultative

Entry and Exit criteria

5.2.1 Recognize the different levels and objectives of test planning

5.2.2 Summarize the purpose and content of the test plan, test design specification and test procedure documents according to the ‘Standard for Software Test Documentation’ (IEEE Std 829-1998)

See Documentation

5.2.3 Differentiate between conceptually different test approaches, such as analytical, model-based, methodical, process/standard compliant, dynamic/heuristic, consultative and regression-averse

  • Analytical: risk and/or requirements-based
  • Model based: stochastic testing
  • Methodical: checklist and failure based
  • Process or standard compliant: e.g. Agile methodologies such as extreme programming
  • Dynamic and heuristic: error guessing, exploratory testing
  • Consultative: ask user/developer
  • Regression averse: highly automated

5.2.4 Differentiate between the subject of test planning for a system and scheduling test execution

5.2.5 Write a test execution schedule for a given set of test cases, considering prioritization, and technical and logical dependencies
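As a sketch, the "prioritization plus dependencies" idea can be expressed as a topological sort that breaks ties by priority. The test case names, priorities, and dependencies below are invented for illustration, not part of the syllabus:

```python
# Sketch: build a test execution schedule from priorities and dependencies.
# Test case names, priorities, and dependencies are illustrative only.
from graphlib import TopologicalSorter

# Each test case: priority (1 = highest) and the tests it depends on.
test_cases = {
    "TC1_login":    {"priority": 1, "depends_on": []},
    "TC2_search":   {"priority": 2, "depends_on": ["TC1_login"]},
    "TC3_checkout": {"priority": 1, "depends_on": ["TC1_login", "TC2_search"]},
    "TC4_logout":   {"priority": 3, "depends_on": ["TC1_login"]},
}

def schedule(cases):
    """Order tests so dependencies run first; break ties by priority."""
    ts = TopologicalSorter({name: set(c["depends_on"]) for name, c in cases.items()})
    ts.prepare()
    order = []
    while ts.is_active():
        # Among tests whose dependencies are satisfied, run highest priority first.
        for name in sorted(ts.get_ready(), key=lambda n: cases[n]["priority"]):
            order.append(name)
            ts.done(name)
    return order

print(schedule(test_cases))
# → ['TC1_login', 'TC2_search', 'TC4_logout', 'TC3_checkout']
```

Note that a technical dependency (TC3 needs TC2's results) can force a lower-priority test to run before a higher-priority one.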

5.2.6 List test preparation and execution activities that should be considered during test planning

  • Test objectives
  • Scope and risks
  • Testing approach
  • Integrating with the SDLC
  • What, who and how to test
  • Scheduling
  • Resources
  • Test documentation
  • Selecting metrics
  • Automation

5.2.7 Recall typical factors that influence the effort related to testing

Product characteristics:

  • Test basis quality
  • Size of product
  • Complexity of product
  • Non-functional requirements
  • Security requirements
  • Required documentation

Development process characteristics:

  • Organisation stability
  • Tools used
  • Test process
  • Time pressure
  • Skills

Outcome of testing:

  • number of defects
  • amount of review required

5.2.8 Differentiate between two conceptually different estimation approaches: the metrics-based approach and the expert-based approach

  • Metrics-based may use
    • size * complexity
    • per test case effort
    • tester to developer ratio
  • Expert-based uses experience

5.2.9 Recognize/justify adequate entry and exit criteria for specific test levels and groups of test cases (e.g., for integration testing, acceptance testing or test cases for usability testing)

Entry criteria

  • Test environment availability
  • Test tool readiness
  • Testable code
  • Test data available
  • Test design complete
  • Software documentation available

Exit criteria

  • Certain level of coverage
  • No high-priority or severe defects outstanding
  • Cost
  • Schedule achieved
  • All high-risk areas fully tested, with only minor residual risks outstanding
  • All planned tests have been run
  • Status of the important quality characteristics for the system
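Exit criteria like these can be checked mechanically once thresholds are agreed. A minimal sketch, with illustrative thresholds (90% coverage, zero severe defects, all planned tests executed) that are assumptions, not syllabus values:

```python
# Sketch: an automated exit-criteria check. Thresholds are illustrative.

def exit_criteria_met(coverage, open_severe_defects, planned, executed):
    """True when the example exit thresholds are all satisfied."""
    return (coverage >= 0.90                # required coverage level reached
            and open_severe_defects == 0    # no severe defects outstanding
            and executed >= planned)        # all planned tests have been run

print(exit_criteria_met(coverage=0.95, open_severe_defects=0,
                        planned=120, executed=120))  # → True
```

In practice exit criteria may also be relaxed deliberately (e.g. when cost or schedule limits are reached), which is a management decision rather than a mechanical check.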

5.3 Test Progress Monitoring and Control

  • Test monitoring
  • Test reporting and control
  • Test summary report

5.3.1 Recall common metrics used for monitoring test preparation and execution

  • Percentage of work done in test case preparation (or percentage of planned test cases prepared)
  • Percentage of work done in test environment preparation
  • Test case execution (e.g. number of test cases run/not run, and test cases passed/failed)
  • Defect information (e.g. defect density, defects found and fixed, failure rate, and retest results)
  • Test coverage of requirements, risks or code
  • Subjective confidence of testers in the product
  • Dates of test milestones
  • Testing costs, including the cost compared to the benefit of finding the next defect or running the next test
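Two of the metrics above can be sketched as simple ratios over raw counts. The numbers are illustrative:

```python
# Sketch: computing common monitoring metrics from illustrative raw counts.

def pass_rate(passed, executed):
    """Fraction of executed test cases that passed."""
    return passed / executed

def defect_density(defects, kloc):
    """Defects per thousand lines of code (KLOC)."""
    return defects / kloc

print(pass_rate(90, 100))      # → 0.9
print(defect_density(45, 30))  # → 1.5 defects/KLOC
```

Such metrics feed test reporting (what happened during the test period) and test control (what corrective action to take, e.g. re-prioritizing tests when defect density is high in one area).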

5.3.2 Explain and compare test metrics for test reporting and test control (e.g., defects found and fixed, and tests passed and failed) related to purpose and use

5.3.3 Summarize the purpose and content of the test summary report document according to the 'Standard for Software Test Documentation' (IEEE Std 829-1998)

  • test summary report : A document summarizing testing activities and results. It also contains an evaluation of the corresponding test items against exit criteria.

5.4 Configuration Management

Establish and maintain the integrity of the products, and ensure that all items of testware are identified, version controlled, tracked for changes, and related to each other.

5.4.1 Summarize how configuration management supports testing


  • Determining what is needed
  • Configuration items management
  • Build process
  • Version control

5.5 Risk and Testing

  • Risk likelihood (probability) and impact
  • Project and product risks
  • Risk-based testing approach

5.5.1 Describe a risk as a possible problem that would threaten the achievement of one or more stakeholders’ project objectives

5.5.2 Remember that the level of risk is determined by likelihood (of happening) and impact (harm resulting if it does happen)

Risk = probability x impact
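The formula can be used to rank risks for test prioritization. A minimal sketch; the risk items, probability scale (0–1) and impact scale (1–10) are invented for illustration:

```python
# Sketch: ranking product risks by level = probability × impact.
# Risk items and scales (probability 0-1, impact 1-10) are illustrative.
risks = {
    "data loss":       {"probability": 0.1, "impact": 10},
    "slow search":     {"probability": 0.6, "impact": 3},
    "UI misalignment": {"probability": 0.8, "impact": 1},
}

def risk_level(r):
    return r["probability"] * r["impact"]

# Highest risk level first: test these areas earliest and most thoroughly.
ranked = sorted(risks, key=lambda name: risk_level(risks[name]), reverse=True)
print(ranked)  # → ['slow search', 'data loss', 'UI misalignment']
```

Note how a likely-but-minor problem can outrank a severe-but-rare one; risk-based testing uses this ranking to decide test order and depth.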

5.5.3 Distinguish between the project and product risks

Product risks

  • system or software

Project risks

  • way the work is carried out. e.g. late delivery of software for testing

5.5.4 Recognize typical product and project risks

5.5.5 Describe, using examples, how risk analysis and risk management may be used for test planning

Risk responses:

  • mitigated
  • covered by a contingency plan
  • transferred
  • ignored (accepted)

5.6 Incident Management

  • Incident Management
  • Incident Logging
  • Test Incident Report

5.6.1 Recognize the content of an incident report according to the 'Standard for Software Test Documentation' (IEEE Std 829-1998)

  • incident report: A document reporting on any event that occurred, e.g. during the testing, which requires investigation.

5.6.2 Write an incident report covering the observation of a failure during testing.
