Automated Computer Performance Model Calibration Method Using Rule Based Inferencing And Generate-And-Test Paradigm

IP.com Disclosure Number: IPCOM000100543D
Original Publication Date: 1990-May-01
Included in the Prior Art Database: 2005-Mar-15
Document File: 3 page(s) / 122K

Publishing Venue

IBM

Related People

Potok, TE: AUTHOR

Abstract

Model calibration is the process of identifying and adjusting modeled performance indicators that exceed their measured counterparts by an unjustifiably large percentage. Model calibration is necessary because the performance data used to create a model is incomplete and, at times, inaccurate. Estimated values are used to supply the missing or inaccurate information. Often, additional information about the system is required to create a model that reflects the measured system. Below is an automated process based on the generate-and-test paradigm for calibrating a computer performance model.


      Initialization:  A small pool of likely calibration parameters
is selected, a small arbitrary change is made to each parameter in
the pool, and the performance indicators resulting from the model run
are recorded.  This provides cause-and-effect information on several
model parameters.  This information need only be collected on the
first attempt to calibrate a model.
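
      As an illustration of this phase, the following sketch (written in
Python, which is not part of the original disclosure) perturbs each
parameter in a small pool by an arbitrary amount and records the
resulting change in each performance indicator.  The model run_model and
the parameter and indicator names are hypothetical stand-ins for an
actual performance model.

# Initialization phase sketch: perturb each candidate calibration
# parameter by a small arbitrary amount and record how the modeled
# performance indicators respond.

def run_model(params):
    # Hypothetical stand-in for the performance model: a toy
    # single-server queue with utilization and response time.
    utilization = params["cpu_service_time"] * params["arrival_rate"]
    return {
        "cpu_utilization": utilization,
        "response_time": params["cpu_service_time"] / (1.0 - utilization),
    }

def collect_cause_and_effect(base_params, pool, delta=0.05):
    """Record the indicator changes caused by a small change to each parameter."""
    baseline = run_model(base_params)
    effects = {}
    for name in pool:
        perturbed = dict(base_params)
        perturbed[name] *= (1.0 + delta)          # small arbitrary change
        indicators = run_model(perturbed)
        effects[name] = {k: indicators[k] - baseline[k] for k in baseline}
    return effects

base = {"cpu_service_time": 0.010, "arrival_rate": 50.0}
print(collect_cause_and_effect(base, ["cpu_service_time", "arrival_rate"]))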

      Problem Determination:  The percentage differences in the
measured and modeled performance indicators are calculated and
compared against acceptable tolerances.  If the differences are
within the specified tolerances, then the model is calibrated;
otherwise, the model is uncalibrated.  The next step in the problem
determination phase is to filter the uncalibrated indicators by
rule-based inferencing to determine the root causes of the calibration
problem.  The indicators that are used to determine calibration are
often related.  Several indicators may be uncalibrated; however, one
root problem may be skewing several of the indicators.  By converting
knowledge about modeling into rules, the root calibration problem can
be found, thus reducing the domain of the problem space.
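
      The sketch below illustrates this problem-determination phase
under assumed indicator names, tolerances, and a single toy filtering
rule; the actual rule base of the disclosure is not reproduced here.

# Problem-determination phase sketch: compute percentage differences,
# flag indicators outside tolerance, then filter to root causes with a
# (toy) rule base.

def percentage_differences(measured, modeled):
    """Percent by which each modeled indicator differs from its measured value."""
    return {k: 100.0 * (modeled[k] - measured[k]) / measured[k] for k in measured}

def uncalibrated_indicators(diffs, tolerances):
    """Indicators whose percentage difference exceeds the acceptable tolerance."""
    return {k: d for k, d in diffs.items() if abs(d) > tolerances[k]}

def root_causes(uncalibrated):
    # Example rule: when CPU utilization is off, a response-time error is
    # treated as a side effect, so only the utilization problem is kept.
    causes = set(uncalibrated)
    if "cpu_utilization" in causes:
        causes.discard("response_time")
    return causes

measured   = {"cpu_utilization": 0.50, "response_time": 0.020}
modeled    = {"cpu_utilization": 0.62, "response_time": 0.031}
tolerances = {"cpu_utilization": 10.0, "response_time": 15.0}   # percent

problems = uncalibrated_indicators(percentage_differences(measured, modeled),
                                   tolerances)
if not problems:
    print("model is calibrated")
else:
    print("root calibration problems:", root_causes(problems))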

      Solution Determination:  If calibration problems are found,
then all reasonable model parameters that may solve the calibration
problem are found through rule-based inferencing.  This solution
space is then bounded to eliminate unlikely solutions.  Three types
of bounding information are used:  problem information, feasibility
information, and historical information.  Problem information limits
the solution space to those parameters that are likely to solve a
specific calibration problem. Feasibility information eliminates
those solutions that propose infeasible changes.  Historical
information is used to reject parameters that produced little or no
change to uncalibrated indicators when tried during a previous
calibration session.
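
      The sketch below shows how the three kinds of bounding information
might be applied to a rule-generated candidate set; the rules,
feasibility flags, and history values are illustrative assumptions
rather than the disclosure's actual knowledge base.

# Solution-determination phase sketch: generate candidate parameters by
# rule, then bound the solution space with problem, feasibility, and
# historical information.

# Toy rule base: which parameters can plausibly fix which root problem.
PROBLEM_RULES = {
    "cpu_utilization": ["cpu_service_time", "arrival_rate", "path_length"],
    "response_time":   ["cpu_service_time", "priority", "memory_size"],
}

# Feasibility: parameters whose values were measured directly should not
# be changed to force calibration.
FEASIBLE = {"cpu_service_time": True, "arrival_rate": False,
            "path_length": True, "priority": True, "memory_size": True}

def bounded_solutions(root_problems, history):
    """history maps a parameter name to the effect it had in a prior session."""
    candidates = set()
    for problem in root_problems:
        # Problem information: parameters likely to solve this specific problem.
        candidates.update(PROBLEM_RULES.get(problem, []))
    # Feasibility information: eliminate infeasible changes.
    candidates = {p for p in candidates if FEASIBLE.get(p, False)}
    # Historical information: reject parameters that previously produced
    # little or no change in the uncalibrated indicators.
    return {p for p in candidates if abs(history.get(p, 1.0)) > 0.01}

history = {"path_length": 0.0}   # tried before, produced no change
print(bounded_solutions({"cpu_utilization"}, history))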

      Planning:  The following formula defines the predictive model
that estimates the impact that a change in a parameter will have on
the performance indicators of the model.
      B  = a vector of the measured performance indicators.
      M  = a vector of the current modeled performance
     in...