Prefetching in a Multilevel Memory Hierarchy

IP.com Disclosure Number: IPCOM000049471D
Original Publication Date: 1982-Jun-01
Included in the Prior Art Database: 2005-Feb-09

Publishing Venue

IBM

Related People

Bennett, BT: AUTHOR [+4]

Abstract

The key aspect set forth here, for a three-level memory hierarchy comprising level 1 (L1), level 2 (L2), and level 3 (L3), is the incorporation of the prefetching mechanism within the directory of the second-level cache. An example of the value of this idea is sequential line fetching on instruction misses from the first-level cache: if the sequential pattern of prior line references is already maintained by the L2 directory, it need not be maintained separately by a Cache Miss History Table, resulting in a clear saving in hardware.


Prefetching in a Multilevel Memory Hierarchy


The introduction of a third level in the memory hierarchy (see figure) to eliminate the impact of longer main memory access times has the additional benefit of allowing the highest level (L1) to be smaller and closer (in cycles) to the processing unit. Performance can thus be improved further through the resulting potential for a shorter processor cycle time.
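
As a rough illustration (the figures below are assumed for the example, not taken from the disclosure): if an L1 hit costs 1 cycle, an L1 miss served from L2 costs 10 cycles, an L2 miss served from L3 costs 40 cycles, and a miss to main memory costs 200 cycles, then with miss ratios of 5% out of L1, 20% out of L2, and 10% out of L3, the average access time is about 1 + 0.05 × (10 + 0.20 × (40 + 0.10 × 200)) = 2.1 cycles, compared with about 1 + 0.05 × 200 = 11 cycles for the same L1 backed directly by main memory.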

The small size of L1 increases the opportunity for prefetching cache lines from the second level (L2). A smaller cache regularizes the miss sequence, so the directory of the second level can be used to control the prefetching: it monitors the miss activity from L1 and sets bits within the directory that later re-invoke miss sequences it has previously observed. The optimum prefetching strategy will depend on the size of the L1 cache and on the bits (space) available in the directory for recording that history.
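
As a concrete illustration of the bit-in-the-directory scheme, the following is a minimal software sketch in C. It models an L2 directory whose entries each carry a single prefetch bit: the bit is set when an L1 instruction miss to one line is immediately followed by a miss to the next sequential line, and on later misses it triggers a prefetch of that next line. The table size, the one-bit sequential predictor, and all identifiers are assumptions made for illustration; the disclosure does not specify an implementation.

    /*
     * Sketch (assumed implementation): each L2 directory entry carries a
     * "prefetch next sequential line" bit.  The bit is set when an L1
     * instruction miss to line N follows a miss to line N-1, and it is
     * consulted on later misses to prefetch line N+1 from L2 into L1.
     */
    #include <stdbool.h>
    #include <stdio.h>

    #define L2_LINES 256                /* assumed number of tracked lines */

    struct l2_dir_entry {
        bool prefetch_next;             /* set when a miss to this line has been
                                           followed by a miss to the next line */
    };

    static struct l2_dir_entry l2_dir[L2_LINES];
    static int last_miss = -1;          /* most recent L1 instruction-miss line */

    /* Invoked by the L2 directory on every instruction miss reported by L1. */
    static void l2_handle_l1_miss(int line)
    {
        /* Learn the sequential pattern: the previous miss was to line - 1. */
        if (last_miss == line - 1)
            l2_dir[(line - 1) % L2_LINES].prefetch_next = true;

        printf("fetch    line %d into L1\n", line);  /* demand fetch (stand-in) */

        /* Replay a previously observed sequence: prefetch the next line. */
        if (l2_dir[line % L2_LINES].prefetch_next)
            printf("prefetch line %d into L1\n", line + 1);

        last_miss = line;
    }

    int main(void)
    {
        /* Pass 1 over a sequential instruction stream trains the directory
           bits; pass 2 shows the same misses now triggering prefetches.    */
        for (int pass = 0; pass < 2; pass++)
            for (int line = 100; line < 104; line++)
                l2_handle_l1_miss(line);
        return 0;
    }

A real implementation would act on the cache transfer paths rather than printing, and the per-entry bits could be widened to record non-sequential miss sequences as well, at the cost of more directory space.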

[Figure: block diagram of the three-level memory hierarchy (L1, L2, L3); images not included in this text extraction.]