A tool to highlight test output differences to normal behaviour Disclosure Number: IPCOM000201697D
Publication Date: 2010-Nov-18
Document File: 2 page(s) / 52K

Publishing Venue

The Prior Art Database


This article describes a method of detecting important deviations from normal behaviour in an automated test system and alerting the user when one occurs. Testers often have to trawl through vast amounts of output data from regression tests, spending much time simply looking for differences from expected results. The method described here automates this process, reducing the need for testers to carry out such inspection manually and instead alerting them when the system notices important changes.

Application testers and developers often run automated test suites to help test an application. These test cases usually produce a large volume of debug and test output stating what has passed, what has failed, or what the test is currently doing. Once a test suite has completed, a tester will often look through all of the test output to ensure the test adhered to its normal behaviour. The tester is looking for behaviour that differs from normal, which may highlight a problem such as a regression when a regression testing suite has been run against the application.

    For a tester this process is slow and entirely manual: it is very time-consuming and fairly uninteresting. Testers are often heavily loaded and their time is in high demand, so they do not always get the time to closely inspect test results and test output, meaning possible problems can be missed. A smarter solution would speed up this process and flag possible problems to the tester automatically, whilst maintaining reliable and consistent processing of test results.

    The core idea of this invention is a tool that assesses the normal condition of test suites and their results. It builds an understanding of what a normal result is for a specific test, and from this it can flag to a tester when test output looks different to normal.
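    One way this idea could be sketched (this is an illustrative assumption, not the disclosed implementation) is to build a baseline recording how often each output line has appeared across past runs, then flag lines in a new run that fall below a familiarity threshold:

```python
# Sketch of learning "normal" output from past runs and flagging
# lines in a new run that rarely or never appeared before.
# All function names and the min_ratio value are illustrative assumptions.
from collections import Counter


def build_baseline(past_runs):
    """Count how many past runs each distinct output line appeared in."""
    counts = Counter()
    for run in past_runs:
        counts.update(set(run))  # one vote per run, not per repeated line
    return counts


def flag_anomalies(baseline, num_runs, new_run, min_ratio=0.5):
    """Return lines seen in fewer than min_ratio of the past runs."""
    return [line for line in set(new_run)
            if baseline[line] / num_runs < min_ratio]


past = [["setup ok", "test A passed", "teardown ok"],
        ["setup ok", "test A passed", "teardown ok"]]
baseline = build_baseline(past)
new = ["setup ok", "test A FAILED: timeout", "teardown ok"]
print(flag_anomalies(baseline, len(past), new))  # → ['test A FAILED: timeout']
```

    Counting one vote per run (rather than per occurrence) keeps lines that merely repeat within a single run from looking more "normal" than they are.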

    The tool is pointed at the root folder of the test suite's debug output and test results. Over time it determines what is normal based upon a statistical comparison of the test content. The tester can set a minimum level of difference at which the tool should notify them that a possibly unique change has occurred in the test results.
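    The tester-set notification threshold might work along these lines (a hedged sketch; the file layout, similarity measure, and 10% threshold are assumptions rather than details from the disclosure):

```python
# Sketch: compare each new result file to its stored baseline and
# notify when the difference exceeds a tester-chosen threshold.
# Here similarity is measured with difflib's SequenceMatcher ratio.
import difflib


def difference_ratio(baseline_text, new_text):
    """0.0 means identical output, 1.0 means completely different."""
    matcher = difflib.SequenceMatcher(None, baseline_text, new_text)
    return 1.0 - matcher.ratio()


def check_results(baselines, new_results, threshold=0.10):
    """Yield (test_name, difference) for results exceeding the threshold."""
    for name, new_text in new_results.items():
        diff = difference_ratio(baselines.get(name, ""), new_text)
        if diff > threshold:
            yield name, diff


baselines = {"suite_a.log": "10 tests run\n10 passed\n0 failed\n"}
new_results = {"suite_a.log": "10 tests run\n7 passed\n3 failed\nERROR: db timeout\n"}

for name, diff in check_results(baselines, new_results):
    print(f"{name}: {diff:.0%} different from normal")
```

    In a full tool the `baselines` and `new_results` dictionaries would be populated by walking the root results folder, and the `threshold` parameter would be exposed as the tester's minimum-difference setting.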

    The advantages using...