also used for primary memory, which could eliminate the need for cache memory altogether, there are some very good practical and economic reasons. SRAM costs as much as six times more than DRAM, and storing the same amount of data as DRAM would require much more space on the motherboard.
Caching in Operation
The CPU operates internally faster than RAM is able to supply data and instructions to it.
In turn, RAM operates faster than the hard disk. Caching solves the speed issues between
these devices by serving as a buffer between faster devices (the processor or RAM) and
slower devices (RAM or the hard disk).
As discussed in Chapters 3 and 7, the CPU interacts with RAM through a series of
wait states. During a wait state, the CPU pauses to allow a certain number of clock cycles
for the data it has requested to be located and transferred from RAM to its registers. If the
data is not in RAM already and must be fetched from the hard disk, additional wait states
are invoked and the CPU waits even longer for its data. One of the primary purposes of
the cache memory is to eliminate the cycles burned in CPU wait states. Eliminating any
CPU idleness should make the entire system more productive and efficient.
Locality of Reference
The principle of locality of reference is a design philosophy in computing that is based on
the assumption that the next data or instructions to be requested are very likely to be lo-
cated immediately following the last data or instructions requested by the CPU. Using
this principle, caching copies data or instructions just beyond the data requested into the
cache memory in anticipation of the CPU asking for it. How successful the caching sys-
tem is at making its assumptions determines the effectiveness of the caching operation.
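To make the idea concrete, here is a minimal sketch (in Python) of a cache that exploits locality of reference. The block size, the number of cache lines, and the sequential access pattern are illustrative assumptions, not a model of any particular CPU: on a miss, the cache pulls in an entire block of neighboring addresses, so the requests that immediately follow are already waiting in cache.

```python
# Minimal sketch of locality of reference in a tiny direct-mapped cache.
# The sizes below are hypothetical, chosen only to keep the example small.

BLOCK_SIZE = 16          # assumed: 16 bytes fetched per cache line
CACHE_LINES = 8          # assumed: an 8-line direct-mapped cache

cache = {}               # maps cache line index -> block currently stored there
hits = misses = 0

def access(address):
    """Simulate the CPU requesting one byte at 'address'."""
    global hits, misses
    block = address // BLOCK_SIZE       # which block the byte belongs to
    line = block % CACHE_LINES          # which cache line that block maps to
    if cache.get(line) == block:
        hits += 1                       # the anticipated data was already cached
    else:
        misses += 1                     # not cached: fetch the whole block
        cache[line] = block

# A CPU reading 64 consecutive bytes misses once per 16-byte block,
# then hits on the remaining 15 bytes of each block.
for addr in range(64):
    access(addr)

print(f"hits={hits}, misses={misses}")  # hits=60, misses=4
```

Because each miss loads a whole block rather than a single byte, the sequential reads that follow are served from cache instead of triggering new trips to RAM.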
As iffy as this may sound, PC caching systems surprisingly get a cache hit about 90
to 95 percent of the time. The cache memory's hit ratio determines its effectiveness. Each
time the caching system is correct in anticipating which data or instructions the CPU will
want and has it in cache, it is tallied as a hit. The number of hits divided by the total re-
quests for data by the CPU is how the hit ratio is calculated. Of course, if the CPU asks for
data that is not in cache, the data must be requested from RAM and a cache miss, a definite
caching no-no, is tallied.
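Using the illustrative counts from the sketch above, the hit ratio works out exactly as described: the number of hits divided by the total requests made by the CPU.

```python
# Hit ratio = hits / total CPU requests, using the hypothetical counts above.
hits, misses = 60, 4
hit_ratio = hits / (hits + misses)
print(f"hit ratio = {hit_ratio:.1%}")   # 93.8%, inside the typical 90-95 percent range
```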
Saving Trips
If your PC did not have cache memory, all requests for data and instructions by the CPU
would be served from RAM. Only the data requested would be supplied, and there would
be no anticipation of what the CPU would be asking for next. This would be something like
if every time you wanted a cold one, you had to run to the store for just one can, bottle, or
cup of your favorite drink. If the CPU is very busy, it could get bogged down in memory
requests, just like if you were very thirsty, you would spend all of your time running to and
from the store.