
Write a short note on the write-through and write-back policies of cache memory.

Write Through:

  • Simplest Technique: Write-through is the simplest approach: every write operation updates both the cache and main memory simultaneously.
  • Memory Traffic: This keeps main memory consistent at all times, but it generates significant memory traffic, since every write goes to main memory as well as to the cache.
  • Disadvantage: The major drawback is that this increased memory traffic can make main memory a bottleneck.
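The policy above can be sketched in a few lines. This is a minimal illustration, not a hardware model; the class and its methods (`WriteThroughCache`, `write`, `read`) are hypothetical names, and dictionaries stand in for the cache and main memory.

```python
# Sketch of a write-through cache: every write updates both the
# cache and main memory, so memory is never stale, at the cost of
# one memory access per store.

class WriteThroughCache:
    def __init__(self, memory):
        self.memory = memory   # backing store: address -> value
        self.cache = {}        # cache contents: address -> value

    def write(self, addr, value):
        self.cache[addr] = value    # update the cache...
        self.memory[addr] = value   # ...and main memory at the same time

    def read(self, addr):
        if addr not in self.cache:              # miss: fetch from memory
            self.cache[addr] = self.memory[addr]
        return self.cache[addr]

memory = {0: 10, 1: 20}
c = WriteThroughCache(memory)
c.write(0, 99)
print(memory[0])  # 99: main memory already reflects the write
```

Note that the write itself is the expensive step here: each `write` call touches `self.memory`, which is exactly the traffic the write-back policy below avoids.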

Write Back:

  • Minimize Memory Writes: Write-back is a technique designed to reduce memory writes. In this method, updates are made only in the cache initially, without immediately updating the main memory.
  • Update Bit: When a write operation occurs in the cache, an “UPDATE” bit (often called a dirty bit) associated with the cache slot is set to indicate that the corresponding block in the cache has been modified.
  • Write to Main Memory: When a cache block needs to be replaced, it is written back to the main memory only if the UPDATE bit is set. This means that only modified cache blocks are written back to main memory, reducing memory traffic compared to write-through.
  • Problem with Write Back: However, a drawback of write-back is that portions of main memory may become outdated or invalid because updates are first made only in the cache. This can pose problems when I/O modules access main memory, as they may not see the most current data if it’s only present in the cache.
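The write-back steps above can be sketched as follows. Again this is a simplified illustration with hypothetical names (`WriteBackCache`, `_ensure_room`); a dirty flag plays the role of the UPDATE bit, and the first cached entry is evicted for simplicity rather than by a real replacement policy such as LRU.

```python
# Sketch of a write-back cache with an UPDATE (dirty) bit.
# Writes touch only the cache; main memory is updated lazily,
# and only when a *modified* block is evicted.

class WriteBackCache:
    def __init__(self, memory, capacity=2):
        self.memory = memory
        self.capacity = capacity
        self.cache = {}   # addr -> (value, update_bit)

    def write(self, addr, value):
        self._ensure_room(addr)
        self.cache[addr] = (value, True)   # set the UPDATE bit

    def read(self, addr):
        if addr not in self.cache:
            self._ensure_room(addr)
            self.cache[addr] = (self.memory[addr], False)
        return self.cache[addr][0]

    def _ensure_room(self, addr):
        if addr in self.cache or len(self.cache) < self.capacity:
            return
        victim, (value, dirty) = next(iter(self.cache.items()))
        if dirty:                          # write back only modified blocks
            self.memory[victim] = value
        del self.cache[victim]

memory = {0: 10, 1: 20, 2: 30}
c = WriteBackCache(memory)
c.write(0, 99)
print(memory[0])   # still 10: main memory is stale until eviction
c.read(1)
c.read(2)          # forces eviction of block 0, which is dirty
print(memory[0])   # 99: the modified block was written back
```

The first print illustrates the drawback mentioned above: between the write and the eviction, an I/O module reading main memory directly would see the stale value 10.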

Temporal and Spatial Locality:

  • Temporal Locality: This refers to the tendency for a processor to access memory locations that have been recently used. It suggests that recently accessed instructions or data are likely to be accessed again soon.
  • Spatial Locality: Spatial locality is the tendency of a program to access memory locations that are clustered together. It implies that instructions or data stored near recently accessed ones are also likely to be accessed soon.

Implications for Cache Management:

  • Temporal Aspect: Cache systems exploit temporal locality by storing recently accessed data in the cache, anticipating that it will be needed again soon.
  • Spatial Aspect: Similarly, cache systems leverage spatial locality by fetching not just the requested data but also adjacent data into the cache, assuming that neighboring data will likely be accessed soon as well.
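The spatial aspect can be made concrete with a small miss-counting sketch. The names (`count_misses`, `BLOCK`) and the 4-word block size are illustrative assumptions: on a miss, the cache fetches the whole block containing the requested word, so sequential accesses hit on the neighbours brought in by an earlier miss.

```python
# Sketch: a cache that fetches a whole BLOCK-word line on each miss,
# exploiting spatial locality. Addresses in the same block after the
# first one are hits.

BLOCK = 4  # words per cache block (illustrative)

def count_misses(addresses):
    cached_blocks = set()
    misses = 0
    for addr in addresses:
        block = addr // BLOCK            # block containing this word
        if block not in cached_blocks:
            misses += 1
            cached_blocks.add(block)     # fetch addr and its neighbours
    return misses

sequential = list(range(16))                      # strong spatial locality
scattered = [0, 40, 80, 120, 4, 44, 84, 124,
             8, 48, 88, 128, 12, 52, 92, 132]    # no two in the same block

print(count_misses(sequential))  # 4: one miss per 4-word block
print(count_misses(scattered))   # 16: every access misses
```

Both traces touch 16 words, but the sequential one misses only once per block, which is exactly why caches prefetch adjacent data.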
