
Cache LRU overriding in order to prevent thrashing

IP.com Disclosure Number: IPCOM000013911D
Original Publication Date: 2000-Mar-01
Included in the Prior Art Database: 2003-Jun-19
Document File: 1 page(s) / 30K

Publishing Venue

IBM

Abstract

Disclosed is a method and apparatus in a computer system for controlling cache line Least Recently Used (LRU) status explicitly via instructions. This invention operates in a computer system having one or more levels of cache.


< Introduction >

   Disclosed is a method and apparatus in a computer system for controlling cache line
Least Recently Used (LRU) status explicitly via instructions.

This invention operates in a computer system having one or more levels of cache.

< Problem solved by this invention >

   In a computer system, a cache is a small, fast memory located close to
the CPU that holds the code or data that is likely to be accessed again.
When the CPU does not find a data item it needs in the cache, a cache miss occurs,
and the data is retrieved from main memory and placed into the cache. At that moment,
the data item in the sacrificial (victim) cache line is purged.

   To reduce the chance of throwing out information that will be needed soon,
accesses to cache lines are recorded; this record is the cache line's LRU status.

The cache line replaced is the one that has been unused for the longest time.
This replacement is transparent to the software: upon a miss, the sacrificial
cache line is selected automatically by the hardware based on the LRU algorithm
described above.
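The replacement policy described above can be sketched in software. The following C model is illustrative only, not the disclosed hardware mechanism: each way in a set carries an age counter, the most recently used way has age zero, and on a miss the hardware-equivalent logic picks an invalid way or the way with the largest age as the victim. All names (`cache_set_t`, `pick_victim`, `access_set`) are hypothetical.

```c
#include <assert.h>

#define WAYS 8  /* one set of an 8-way set-associative cache */

/* Hypothetical per-set LRU state: age[i] counts accesses since way i
 * was last used; a larger age means less recently used. */
typedef struct {
    unsigned tag[WAYS];
    int      valid[WAYS];
    unsigned age[WAYS];
} cache_set_t;

/* Return the way that holds `tag`, or -1 on a miss. */
static int lookup(const cache_set_t *s, unsigned tag)
{
    for (int i = 0; i < WAYS; i++)
        if (s->valid[i] && s->tag[i] == tag)
            return i;
    return -1;
}

/* Select the sacrificial way: an invalid way if any exists,
 * otherwise the way with the largest age (least recently used). */
static int pick_victim(const cache_set_t *s)
{
    int victim = 0;
    for (int i = 0; i < WAYS; i++) {
        if (!s->valid[i])
            return i;
        if (s->age[i] > s->age[victim])
            victim = i;
    }
    return victim;
}

/* After way `used` is referenced it becomes the most recently used
 * (age 0); every other way ages by one. */
static void touch(cache_set_t *s, int used)
{
    for (int i = 0; i < WAYS; i++)
        s->age[i]++;
    s->age[used] = 0;
}

/* Access `tag`, filling the set on a miss. Returns 1 on hit, 0 on miss. */
static int access_set(cache_set_t *s, unsigned tag)
{
    int way = lookup(s, tag);
    int hit = (way >= 0);
    if (!hit) {
        way = pick_victim(s);   /* sacrificial line chosen here */
        s->tag[way]   = tag;
        s->valid[way] = 1;
    }
    touch(s, way);
    return hit;
}
```

For example, after filling all eight ways with tags 0 through 7 and then re-touching tag 0, a ninth distinct tag evicts tag 1, the least recently used line, while the freshly touched tag 0 survives.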

   When the CPU is working on a data structure that is too large for the cache to hold
at one time, the cache starts to thrash. Under a thrash condition, a data item in the
cache is evicted before being reused. This creates extra data transfer on the system bus.

   For example, when a cache is implemented with 8-way set associativity, there are
eight potential candidates for the sacrificial cache line. A large data structure
will occupy all eight entries, even though the data structure is unlikely to be
referenced again soon, because a newly loaded line always wins the LRU competition.
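The thrashing behavior can be demonstrated with a small model. The sketch below is an assumption-laden illustration, not the disclosed apparatus: it keeps one 8-way set as a most-recently-used-first list and sweeps over a working set. With eight lines the second pass hits every time; with nine lines, one more than the associativity, every single access misses, because each newly loaded line evicts exactly the line that will be needed next. The names (`access_lru`, `sweep_hits`) are hypothetical.

```c
#include <assert.h>

#define WAYS 8  /* associativity of the modeled set */

/* Minimal 8-way set model: tags[0] is most recently used,
 * tags[count-1] is least recently used. Returns 1 on hit, 0 on miss;
 * on a miss the LRU entry is replaced by the new tag. */
static int access_lru(unsigned tags[WAYS], int *count, unsigned tag)
{
    int pos = -1;
    for (int i = 0; i < *count; i++)
        if (tags[i] == tag) { pos = i; break; }

    int hit = (pos >= 0);
    if (!hit) {
        if (*count < WAYS)
            pos = (*count)++;   /* fill an empty way */
        else
            pos = WAYS - 1;     /* evict the LRU way */
        tags[pos] = tag;
    }
    /* Bubble the accessed tag to the MRU position. */
    for (int i = pos; i > 0; i--) {
        unsigned t = tags[i];
        tags[i]     = tags[i - 1];
        tags[i - 1] = t;
    }
    return hit;
}

/* Sweep `passes` times over `n` distinct lines that all map to the
 * same set; return the total number of hits observed. */
static int sweep_hits(unsigned n, int passes)
{
    unsigned tags[WAYS];
    int count = 0, hits = 0;
    for (int p = 0; p < passes; p++)
        for (unsigned t = 0; t < n; t++)
            hits += access_lru(tags, &count, t);
    return hits;
}
```

Sweeping twice over eight lines yields eight hits on the second pass, while sweeping twice over nine lines yields zero hits in eighteen accesses: the pure LRU policy performs at its worst exactly when the working set barely exceeds the associativity.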

< Previous art >

   In this situation, with the previous art, the cache might simply be locked to prevent data
f...