Cache Miss Director - A Means of Prefetching Cache Missed Lines

IP.com Disclosure Number: IPCOM000050035D
Original Publication Date: 1982-Aug-01
Included in the Prior Art Database: 2005-Feb-09
Document File: 1 page(s) / 12K

Publishing Venue

IBM

Related People

Driscoll, GC: AUTHOR [+6]

Abstract

In a computing system with a cache, when the CPU encounters a storage reference that cannot be satisfied by the cache (a demand miss), processing usually must be delayed for an access to main storage. Because of the increasing disparity between CPU and main storage speeds, the penalty for cache misses is both substantial and increasing in cycles per instruction. Therefore, one needs to forestall "demand misses" by guessing what lines will be needed in the cache.

Cache Miss Director - A Means of Prefetching Cache Missed Lines

Assume a system in which the staging of lines between main storage and the cache does not interfere substantially with the CPU's references to the cache.

A "Cache Miss Directory" (CMD) is set forth, that is, a hardware table of addresses. The CMD contains "trigger" addresses and for each trigger address a set of "target" addresses. When a demand miss occurs, the CMD is searched for a trigger address equal to the address that caused the demand miss. If one is found, a series of "anticipatory" cache misses are generated, one for each target address that is associated with the trigger address and that would cause a demand miss if referenced. The expectation is that these lines will be needed, and the goal is to get them in the cache (or at least to start getting them) before a demand miss occurs for them.

To prevent immediate needs from being overshadowed by probable future needs, anticipatory misses are handled at the lowest priority. Further...
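
A minimal sketch of that priority rule follows, again in C. The two-queue arrangement and the helper routines dequeue_demand_miss() and dequeue_anticipatory_miss() are assumptions of the sketch; the disclosure states only that anticipatory misses run at the lowest priority.

    #include <stdbool.h>
    #include <stdint.h>

    /* One outstanding request for a line from main storage. */
    struct miss_request {
        uint32_t line_addr;
    };

    /* Assumed queue accessors; each returns false when its queue is empty. */
    extern bool dequeue_demand_miss(struct miss_request *out);
    extern bool dequeue_anticipatory_miss(struct miss_request *out);

    /* Select the next main-storage access: a pending demand miss (an
     * immediate CPU need) is always served before any anticipatory miss
     * generated by the CMD. */
    bool next_storage_request(struct miss_request *out)
    {
        if (dequeue_demand_miss(out))
            return true;
        return dequeue_anticipatory_miss(out);
    }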