History Overflow Mechanism

IP.com Disclosure Number: IPCOM000043981D
Original Publication Date: 1984-Oct-01
Included in the Prior Art Database: 2005-Feb-05
Document File: 2 page(s) / 25K

Publishing Venue

IBM

Related People

Liu, L: AUTHOR [+7]

Abstract

In a cache line prediction mechanism, lost history information is preserved in a history overflow buffer (H2) and reloaded into the regular history buffer (H1) when a history pattern in H2 repeats. The disclosed cache line prediction mechanism is based on the next line reference information collected in the past. A history buffer H1 is used at L1 (first level cache). H1 has the same configuration as the L1 directory and may be implemented by adding a field to each entry of L1. For each line in L1, the corresponding entry in H1 contains the next line information (presumably a line ID). A valid bit, attached to each entry of H1, indicates the validity of the corresponding next line information. When a line is referenced in L1 with valid next line information, the recorded next line will be prefetched.

At least one non-text object (such as an image or picture) has been suppressed.
This is the abbreviated version, containing approximately 52% of the total text.


History Overflow Mechanism

In a cache line prediction mechanism, lost history information is preserved in a history overflow buffer (H2) and reloaded into the regular history buffer (H1) when a history pattern in H2 repeats. The disclosed cache line prediction mechanism is based on the next line reference information collected in the past. A history buffer H1 is used at L1 (first level cache). H1 has the same configuration as the L1 directory and may be implemented by adding a field to each entry of L1. For each line in L1, the corresponding entry in H1 contains the next line information (presumably a line ID). A valid bit, attached to each entry of H1, indicates the validity of the corresponding next line information. When a line is referenced in L1 with valid next line information, the recorded next line will be prefetched.

A history overflow buffer H2 is maintained below the L1 level. Each entry of H2 consists of a line ID and a next line ID. H2 is structured like a usual cache directory, with congruence classes, sets, an LRU (Least Recently Used) replacement algorithm and certain status bits (e.g., the valid bits). H2 is filled with information that overflows from the L1 level: when a line is replaced from L1 and has a valid entry in H1, the corresponding pair of line IDs is added to H2. In this case H2 is searched using the L1 line ID as the key, and the old information of this line is replaced with the entry from H1. When a line is fetched into L1, the H2 directory is searched and its associated valid next line information, if present, is also restored in H1. At L1, the information in H1 is updated dynamically; the criteria for updating H1 depend on the individual prediction mechanism. The figu...
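
As a rough illustration of how H1 and H2 interact, the following Python sketch models the three events described above: an L1 reference (predict and prefetch from H1), an L1 line replacement (overflow the H1 entry into H2), and an L1 line fetch (restore history from H2 into H1). The class and method names (HistoryOverflow, reference, replace_line, fetch_line), the prefetch callback, and the use of a single LRU-ordered dictionary in place of a set-associative H2 directory with congruence classes are assumptions made for this sketch, not part of the original disclosure.

    from collections import OrderedDict

    class HistoryOverflow:
        def __init__(self, h2_capacity=1024, prefetch=None):
            # H1: per-L1-line next-line information, keyed by line ID.
            # Presence of a key plays the role of the H1 valid bit.
            self.h1 = {}
            # H2: overflow buffer of (line ID -> next line ID) pairs,
            # modeled as one LRU-ordered dictionary rather than
            # congruence classes with sets.
            self.h2 = OrderedDict()
            self.h2_capacity = h2_capacity
            self.prefetch = prefetch or (lambda line_id: None)

        def reference(self, line_id, next_line_id=None):
            # On an L1 reference: if H1 holds valid next-line
            # information, prefetch the recorded next line.
            if line_id in self.h1:
                self.prefetch(self.h1[line_id])
            # Optionally record new history; the exact update
            # criteria depend on the individual prediction mechanism.
            if next_line_id is not None:
                self.h1[line_id] = next_line_id

        def replace_line(self, line_id):
            # On L1 replacement: a valid H1 entry overflows into H2,
            # replacing any old information kept there for this line.
            next_line_id = self.h1.pop(line_id, None)
            if next_line_id is None:
                return
            if line_id in self.h2:
                del self.h2[line_id]            # replace old entry
            elif len(self.h2) >= self.h2_capacity:
                self.h2.popitem(last=False)     # LRU eviction
            self.h2[line_id] = next_line_id

        def fetch_line(self, line_id):
            # On an L1 fetch: if H2 remembers this line's history,
            # restore the next-line information into H1.
            if line_id in self.h2:
                self.h1[line_id] = self.h2[line_id]
                self.h2.move_to_end(line_id)    # touch for LRU order

In actual hardware, H2 would be organized like a normal cache directory, with congruence classes, set associativity, LRU state and per-entry valid bits; the flat ordered dictionary above only approximates that replacement behavior for illustration.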