
Cache Prefetching Scheme With Increased Timeliness And Conditional Prefetches for a Two-Level Cache Structure

IP.com Disclosure Number: IPCOM000121108D
Original Publication Date: 1991-Jul-01
Included in the Prior Art Database: 2005-Apr-03
Document File: 2 page(s) / 79K

Publishing Venue

IBM

Related People

Ignatowski, M: AUTHOR [+2]

Abstract

Disclosed is an algorithm for initiating cache prefetches for data in a very timely manner using conditional prefetches to reduce contention. Other features included in the scheme help exploit a two-level cache structure.

This text was extracted from an ASCII text file.
This is the abbreviated version, containing approximately 52% of the total text.


      Prefetching cache lines is one way to reduce the memory access
penalty in computer systems that employ caches.  However, timing and
contention details largely determine the real performance benefit of
prefetching.  Studies have shown that the key to making a cache
prefetching scheme work well is to prefetch in the most timely manner
possible while avoiding interference with the other cache misses
initiated by the processor.

      The disclosed algorithm assumes a prefetching mechanism located
in the system control or memory control element.  A large level two
cache may be located in this element as well.  The algorithm also
introduces a new memory request type, called a pseudo-miss.  It will
be sent to the prefetching mechanism the first time a prefetched line
is actually used by a processor.  A pseudo-miss will not actually
return any data (since that line is already in the cache).  However,
it will be used by the prefetching mechanism just like any other miss
to potentially initiate the next prefetch.
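
      As an illustration only (not part of the original disclosure),
the handling of misses and pseudo-misses by the prefetching mechanism
might be modeled as in the C sketch below.  The request-type names,
the helper routines, and the line addresses are hypothetical; the
sketch only shows that a pseudo-miss returns no data yet drives the
prefetch engine exactly as a demand miss does.

    #include <stdint.h>
    #include <stdio.h>

    /* Hypothetical request types seen by the prefetching mechanism in
     * the system control / memory control element.  PSEUDO_MISS is the
     * new type: it is sent the first time a processor actually uses a
     * line that was brought into the cache by an earlier prefetch. */
    enum request_type { DEMAND_MISS, PSEUDO_MISS };

    struct mem_request {
        enum request_type type;
        uint64_t line_addr;          /* referenced cache-line address */
    };

    /* Stand-in for the data return path taken by a real miss. */
    static void return_line_to_cache(uint64_t line_addr)
    {
        printf("return line %llu to the requesting cache\n",
               (unsigned long long)line_addr);
    }

    /* Stand-in for the prefetch engine, which may start the next
     * prefetch in response to this request. */
    static void consider_next_prefetch(uint64_t line_addr)
    {
        printf("prefetch engine examines line %llu\n",
               (unsigned long long)line_addr);
    }

    /* Both request types drive the prefetch engine identically; only a
     * demand miss also returns data, since a pseudo-miss refers to a
     * line that is already resident in the cache. */
    static void on_request(const struct mem_request *req)
    {
        if (req->type == DEMAND_MISS)
            return_line_to_cache(req->line_addr);
        consider_next_prefetch(req->line_addr);
    }

    int main(void)
    {
        struct mem_request miss   = { DEMAND_MISS, 100 };
        struct mem_request pseudo = { PSEUDO_MISS, 101 };
        on_request(&miss);    /* ordinary miss */
        on_request(&pseudo);  /* first use of a prefetched line */
        return 0;
    }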

      The prefetching mechanism is signalled whenever a cache miss or
pseudo-miss occurs.  It then takes the following steps:
 1.   Determine what data is to be prefetched, if any.  A simple
scheme may just choose the next sequential line (a sketch of this
step follows the list).
 2.   Mak...
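
      Only step 1 and the beginning of step 2 are present in this
abbreviated record.  As a rough, hypothetical sketch of step 1 under
the simple next-sequential-line policy mentioned above (the line size
and function name are assumptions, not taken from the disclosure):

    #include <stdint.h>

    #define LINE_SIZE 128u   /* assumed cache-line size in bytes */

    /* Step 1: decide what data to prefetch, if any.  Under the simple
     * policy named above, the candidate is the line that sequentially
     * follows the line that missed (or pseudo-missed). */
    static uint64_t next_sequential_line(uint64_t miss_addr)
    {
        uint64_t line = miss_addr / LINE_SIZE;   /* line of the access */
        return (line + 1) * LINE_SIZE;           /* start of next line */
    }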