
Automated Status Tracking System for Software Testing

IP.com Disclosure Number: IPCOM000110607D
Original Publication Date: 1992-Dec-01
Included in the Prior Art Database: 2005-Mar-25
Document File: 2 page(s) / 79K

Publishing Venue

IBM

Related People

Carpenter, ER: AUTHOR [+5]

Abstract

This process was developed to address the problem of gathering and presenting accurate and timely status information in a large testing environment and storing that data for future reference. The environment used in the initial implementation was made up of approximately 81 networked systems from different vendors. Testing was operated 24 hours a day, 7 days a week. The test environments were automated, and data logs were generated continuously, creating large amounts of testcase results requiring analysis. To keep up with this volume of test information, and to be able to process it in time to use the results of testing, an automated process for evaluating and presenting the information was needed.

This text was extracted from an ASCII text file.
This is the abbreviated version, containing approximately 52% of the total text.


      The testing environments were run so that results from all tests were output in the same format, which allowed the resulting log files to be processed automatically.  Testing was scheduled to run for a set number of hours; when that time was up, status processing occurred automatically, and graphs of the results were generated for each set of tests immediately after those tests ended.  The steps of this process are: the log files of the test results are transferred to the server machine; the log files are processed; the percentage of passing tests is determined at varying levels, from the individual testcases to each machine to the overall percentage of all machines combined; and the information is stored in a database for future reference, along with the date and time when the testing occurred.  The whole process takes less than 15 minutes from the time the testing finishes to the time the graphs are being printed.  To accomplish these objec...
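      The status-processing steps above can be sketched in code.  The following is a minimal illustration, not the original implementation: the log record format (a list of testcase name and PASS/FAIL pairs per machine) and the function name are assumptions, since the disclosure does not specify them.

```python
from datetime import datetime


def process_logs(logs_by_machine):
    """Compute pass percentages at the machine and overall levels.

    `logs_by_machine` maps a machine name to a list of
    (testcase, "PASS" or "FAIL") tuples -- a hypothetical uniform
    record format standing in for the unspecified log layout.
    """
    # Record when this status snapshot was produced, so it can be
    # stored in a database alongside the results for future reference.
    summary = {"machines": {}, "timestamp": datetime.now().isoformat()}
    total_passed = total_run = 0

    for machine, results in logs_by_machine.items():
        passed = sum(1 for _, status in results if status == "PASS")
        # Per-machine pass percentage (one of the "varying levels").
        summary["machines"][machine] = {
            "run": len(results),
            "pass_pct": 100.0 * passed / len(results) if results else 0.0,
        }
        total_passed += passed
        total_run += len(results)

    # Overall percentage across all machines combined.
    summary["overall_pct"] = (
        100.0 * total_passed / total_run if total_run else 0.0
    )
    return summary


# Example: two machines' log files after a scheduled test run.
snapshot = process_logs({
    "sysA": [("tc1", "PASS"), ("tc2", "FAIL")],
    "sysB": [("tc3", "PASS")],
})
```

In a full system, `snapshot` would then be written to the database and fed to the graphing step; here it simply holds the per-machine and overall pass percentages plus the timestamp.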