
Executing a software testplan
Disclosure Number: IPCOM000014819D
Original Publication Date: 2001-Nov-16
Included in the Prior Art Database: 2003-Jun-20
Document File: 6 page(s) / 84K

Publishing Venue




This is the abbreviated version, containing approximately 29% of the total text.


Executing a software testplan


A system is described that will execute a list of test descriptions, which we call a testplan.

    The test descriptions each consist of an environment specification and a task specification. (1) The environment specification is subdivided into items not specific to the product under test (e.g. operating system type and version, thread library to use) and items specific to the product under test but not to the particular test (e.g. runtime configuration details). (2) The task specification describes the test apart from its environment (e.g. test that the product conforms to its advertised behaviour on page 100 of the user manual).
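The two-part test description might be modelled as follows. This is a minimal sketch; all field names (and the example values) are illustrative assumptions, since the disclosure only names the two parts and gives examples of what each contains.

```python
from dataclasses import dataclass, field

@dataclass
class EnvironmentSpec:
    # Items not specific to the product under test
    os_type: str
    os_version: str
    thread_library: str
    # Items specific to the product under test, but not to this particular test
    runtime_config: dict = field(default_factory=dict)

@dataclass
class TaskSpec:
    # Named so that the execution system can decide how to run the test
    name: str
    description: str

@dataclass
class TestDescription:
    environment: EnvironmentSpec
    task: TaskSpec

test = TestDescription(
    environment=EnvironmentSpec("Linux", "2.4", "pthreads",
                                {"heap_size": "64M"}),
    task=TaskSpec("manual.page100.conformance",
                  "Product conforms to advertised behaviour on page 100"),
)
```

The hierarchical task name ("manual.page100.conformance") anticipates the naming scheme discussed later in the disclosure.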

    Each machine available for running test programs is recorded in a test machines database. A messaging system is implemented on a control machine. For each machine in the test machines database there is a queue in the messaging system.

    A testplan reader program reads the test descriptions in the testplan and matches each environment specification to a machine by looking in the test machines database. A message containing that test description is written to that machine's queue. This is repeated until the whole testplan has been read. The messages on the queues in the messaging system are transferred to the relevant test machines.
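The reader's matching loop can be sketched as below, with in-process queues standing in for the messaging system. The machine records, the matching rule (exact attribute equality), and the data shapes are all assumptions for illustration; the disclosure does not specify them.

```python
from queue import Queue

# Test machines database: one record per machine available for testing.
machines_db = [
    {"name": "testbox1", "os_type": "Linux", "os_version": "2.4"},
    {"name": "testbox2", "os_type": "AIX", "os_version": "5.1"},
]

# One queue per machine in the test machines database.
queues = {m["name"]: Queue() for m in machines_db}

testplan = [
    {"environment": {"os_type": "Linux", "os_version": "2.4"},
     "task": {"name": "manual.page100.conformance"}},
]

def match_machine(env, machines):
    """Return the first machine whose attributes satisfy the
    environment specification, or None if no machine matches."""
    for machine in machines:
        if all(machine.get(k) == v for k, v in env.items()):
            return machine
    return None

# The testplan reader: match each test description to a machine and
# write it as a message on that machine's queue.
for test in testplan:
    machine = match_machine(test["environment"], machines_db)
    if machine is not None:
        queues[machine["name"]].put(test)
```

In the described system the queues live in a messaging system on the control machine and their contents are transferred to the test machines; the in-process queues here only illustrate the routing step.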

    A test executor program runs on each test machine. It receives messages from the control machine and decides how to run each test: based on the task specification, it loads whatever modules are needed to make that decision.
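One plausible way the executor could pick modules is to map the leading component of a hierarchical task name onto a handler module. This mapping is an assumption; the disclosure only says that modules are loaded based on the task specification.

```python
import importlib

def handler_module_name(task_name: str) -> str:
    """Map a hierarchical task name to the name of the module that
    knows how to run tests in that part of the namespace, e.g.
    "manual.page100.conformance" -> "handlers.manual"."""
    return "handlers." + task_name.split(".")[0]

def load_handler(task_name: str):
    """Dynamically load the handler module for a task."""
    return importlib.import_module(handler_module_name(task_name))
```

The `handlers` package and its layout are hypothetical; the point is that the executor need not know every test in advance, only how to resolve a task name to code.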

    The result of the test (and other attendant data) is sent back to the control machine. The result and data are stored in a status database on the control machine together with the environment specification and task specification that were used to define the test. Cross-referencing between the testplan and status database (for the purposes of status reports etc) is then trivial.
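A status database of this shape makes the cross-referencing step concrete: each row keeps the result next to the environment and task specifications that defined the test. The schema and use of SQLite/JSON here are assumptions; the disclosure does not name a storage technology.

```python
import json
import sqlite3

# Status database on the control machine. Each row stores the result
# and attendant data together with the environment specification and
# task specification that defined the test.
db = sqlite3.connect(":memory:")
db.execute("""CREATE TABLE status (
    environment TEXT,
    task        TEXT,
    result      TEXT,
    data        TEXT)""")

def record_result(env, task, result, data=""):
    """Store a test result alongside the specs that defined it."""
    db.execute("INSERT INTO status VALUES (?, ?, ?, ?)",
               (json.dumps(env), json.dumps(task), result, data))

record_result({"os_type": "Linux", "os_version": "2.4"},
              {"name": "manual.page100.conformance"},
              "pass")

# Cross-referencing for a status report is then a simple lookup
# against the same specifications that appear in the testplan.
row = db.execute("SELECT result FROM status WHERE task LIKE ?",
                 ("%manual.page100%",)).fetchone()
```

Because the stored rows carry the same specifications as the testplan entries, matching results back to planned tests needs no extra bookkeeping.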


    The system requires that a list of tests be provided. The task specifications are named such that the execution system can decide how to run them. A hierarchical namespace is useful, though not necessary, for this.

    The testplan might be generated according to ad-hoc requirements. In this case the generation of the testplan and the running of tests are notionally part of a single activity, which we call a run.

Notable existing patents

US5021997: Test automation system

New ideas

Notable inventions in this system may include:

1. The test description consists of an environment specification and a task specification.
2. Test machines are utilised by looking them up in a database.
3. Tests are run asynchronously on many machines using a messaging system.
4. The test executor program inspects the test description and loads such modules as may be necessary to decide how to run the test.
5. The results of the test are stored in close proximity to the testplan to assist cr...