
Methods of Specifying Data Prefetching without using a Separate Instruction

IP.com Disclosure Number: IPCOM000115743D
Original Publication Date: 1995-Jun-01
Included in the Prior Art Database: 2005-Mar-30
Document File: 2 page(s) / 119K

Publishing Venue

IBM

Related People

Song, SP: AUTHOR

Abstract

Three data prefetch mechanisms are disclosed. Data prefetching is a way of reducing the data cache miss frequency and the associated penalty by pre-loading the cache line that will soon be accessed by a load or a store instruction.


Methods of Specifying Data Prefetching without using a Separate Instruction

      Three data prefetch mechanisms are disclosed.  Data
prefetching is a way of reducing the data cache miss frequency and
the associated penalty by pre-loading the cache line that will soon
be accessed by a load or a store instruction.

      A data cache miss can cause significant performance
degradation because, when it occurs, the processor is prevented from
executing the instructions that follow the load or store instruction
that caused the miss until the miss is resolved.  The degree of
degradation is proportional to the difference between the effective
instruction execution time and the effective memory access time; the
greater the difference between the two, the greater the degradation.
With the growing trend toward processors that can execute multiple
instructions per cycle and have increasingly short cycle times
relative to the memory access time, the data cache miss penalty will
only grow worse.

      Data prefetching can reduce the occurrence of data cache
misses.  The basic mechanism of data prefetching is to give the
processor a hint that a load or a store access will be made to a
memory address in the near future.  The hint enables the processor
to check whether the line that contains a copy of the specified
memory location is present in the data cache.  If the line is
present, no action is taken.  If not, the processor may fetch the
missing line into the data cache so that a future load or store
access to this line will not incur a miss.  Data prefetching is most
beneficial if the hint is given early enough for the processor to
finish fetching the line before the actual load or store access to
the line is made.
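
      As an illustration of the timing involved, the following C
sketch issues a prefetch hint a fixed number of iterations ahead of
the actual access.  The __builtin_prefetch intrinsic and the chosen
prefetch distance are assumptions for illustration only (on PowerPC
the intrinsic typically lowers to a dcbt instruction); the
disclosure itself concerns how such hints are conveyed to the
processor.

    #include <stddef.h>

    /* Illustrative sketch: sum an array while hinting the cache
       about elements a few iterations ahead.  The distance AHEAD is
       arbitrary here; in practice it should cover the memory
       latency so the line arrives before it is actually loaded. */
    #define AHEAD 16

    long sum_with_prefetch(const long *a, size_t n)
    {
        long total = 0;
        for (size_t i = 0; i < n; i++) {
            if (i + AHEAD < n)
                __builtin_prefetch(&a[i + AHEAD], 0, 1); /* read hint */
            total += a[i];
        }
        return total;
    }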

      There are several ways for a program to provide a data prefetch
hint to a processor.  In the existing PowerPC implementation, the
dcbt (data cache block touch) and dcbts (data cache block touch for
store) instructions are provided for that purpose.  The intention of
this scheme is that a program would use a dcbt(s) instruction for
each load (or store) instruction that may incur a data cache miss.
The problem with this method of using an instruction to provide a
hint is that each hint takes valuable resources, such as a location
in the program memory, in the instruction cache and in the
instruction decoder, away from another instruction.  If the number of
hints used in a given program is small, this method is quite
acceptable.  However, it is difficult to keep the number of hints
small because most existing compilers do not have the necessary
information to accurately predict whether or not a load or store
access will incur a miss.  Whenever a compiler is not sure about an
access, it will either use a hint, in which case it could use as many
hints as there are accesses, or not use a hint, in which case the
usefulness of da...
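
      For reference, a dcbt hint of the kind described above could
be issued from C on a PowerPC target using compiler inline assembly.
The sketch below assumes GCC-style inline-assembly syntax and is an
illustration only; it is not part of the original disclosure.

    /* Illustrative only: touch the cache line containing p with a
       dcbt (data cache block touch) hint.  Assumes a PowerPC target
       and a GCC-compatible compiler; verify the syntax against the
       toolchain and ISA documentation in use. */
    static inline void touch_line(const void *p)
    {
        __asm__ __volatile__("dcbt 0,%0" : : "r"(p) : "memory");
    }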