Cache & Memory Hierarchy


Cache Memory

Cache memory is a very high-speed memory placed between the CPU and main memory so that memory accesses can proceed at close to the speed of the CPU.

It is used to reduce the average time to access data from the main memory. The cache is a smaller and faster memory which stores copies of the data from frequently used main memory locations. Most CPUs have several independent caches, including separate instruction and data caches.

  • Cache memory is much faster than main memory, and its access time is much smaller than that of main memory.



Cache Performance

When the processor needs to read or write a location in main memory, it first checks for a corresponding entry in the cache.

The cache checks for the contents of the requested memory location in any cache lines that might contain that address.

  • If the processor finds that the memory location is in the cache, a cache hit has occurred and data is read from the cache.
  • If the processor does not find the memory location in the cache, a cache miss has occurred. For a cache miss, the cache allocates a new entry and copies in data from main memory, then the request is fulfilled from the contents of the cache.

The performance of cache memory is frequently measured in terms of a quantity called Hit ratio.

Hit ratio = hits / (hits + misses) = number of hits / total accesses
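As a rough illustration (the loop below is not from the original text), counting hits and misses over a stream of accesses gives the hit ratio directly; the cache is modelled here as a simple Python set with no capacity limit, which is an assumption made only for this sketch.

```python
# Minimal sketch (assumption: the "cache" is just a set of addresses,
# with no capacity limit or replacement) to show how the hit ratio is counted.
def hit_ratio(accesses):
    cache = set()
    hits = misses = 0
    for addr in accesses:
        if addr in cache:
            hits += 1          # cache hit: data served from the cache
        else:
            misses += 1        # cache miss: data fetched from main memory
            cache.add(addr)    # and copied into the cache
    return hits / (hits + misses)

# Example: repeated accesses to the same two locations
print(hit_ratio([2, 6, 2, 6, 2, 6]))   # 4 hits out of 6 accesses ≈ 0.67
```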

Cache Mapping

The three different types of mapping used for cache memory are as follows:

  • Direct mapping
  • Associative mapping
  • Set-associative mapping

Direct mapping: In direct mapping, each memory block is assigned to a specific line in the cache. If a line is already occupied by a memory block when a new block needs to be loaded, the old block is simply overwritten. The memory address is split into two parts, an index field and a tag field; the tag is stored in the cache along with the data, while the index selects the cache line. Direct mapping's performance is directly proportional to the hit ratio.
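A minimal sketch of the index/tag split described above, assuming a hypothetical direct-mapped cache of 8 word-addressed lines; the sizes and names are illustrative, not taken from the text:

```python
NUM_LINES = 8                      # assumed cache size: 8 lines (index = 3 bits)

# Each cache line holds a (tag, data) pair; None means the line is empty.
lines = [None] * NUM_LINES

def direct_mapped_access(addr, memory):
    index = addr % NUM_LINES       # low-order bits select exactly one line
    tag = addr // NUM_LINES        # remaining high-order bits are the tag
    entry = lines[index]
    if entry is not None and entry[0] == tag:
        return entry[1], "hit"
    # Miss: the old block in this line (if any) is simply replaced.
    data = memory[addr]
    lines[index] = (tag, data)
    return data, "miss"
```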


Associative mapping: In this type of mapping, associative memory is used to store both the content and the address of the memory word. Any block can go into any line of the cache. This means that the word-id bits are used to identify which word in the block is needed, while the tag becomes all of the remaining address bits. This allows any block to be placed anywhere in the cache, and it is considered the fastest and most flexible form of mapping.
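A comparable sketch for associative mapping, under the same illustrative assumptions; here the whole address serves as the tag, any line may hold any block, and the FIFO replacement policy is just one possible choice:

```python
from collections import OrderedDict

CAPACITY = 8                       # assumed number of cache lines

cache = OrderedDict()              # maps tag (full address) -> data

def associative_access(addr, memory):
    if addr in cache:              # compare the tag against every line
        return cache[addr], "hit"
    if len(cache) == CAPACITY:     # cache full: evict one block (FIFO here)
        cache.popitem(last=False)
    cache[addr] = memory[addr]     # any free line can receive the block
    return cache[addr], "miss"
```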



Set-associative mapping: This form of mapping is an enhanced form of direct mapping in which the drawbacks of direct mapping are removed. Set-associative mapping addresses the problem of possible thrashing in the direct-mapping method: instead of having exactly one line that a block can map to in the cache, a few lines are grouped together to form a set, and a block in memory can map to any one of the lines of that specific set. Set-associative mapping therefore allows each index in the cache to hold blocks from two or more different main-memory addresses. It combines the best of the direct and associative cache mapping techniques.
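A sketch of a 2-way set-associative lookup under assumed parameters (4 sets of 2 lines each, least-recently-used replacement within a set); only the lines of the selected set are searched:

```python
NUM_SETS = 4                       # assumed: 4 sets, 2 lines per set (2-way)
WAYS = 2

sets = [[] for _ in range(NUM_SETS)]   # each set is a small list of (tag, data)

def set_associative_access(addr, memory):
    index = addr % NUM_SETS            # index bits pick the set
    tag = addr // NUM_SETS             # remaining bits are the tag
    s = sets[index]
    for i, (t, data) in enumerate(s):  # search only within the selected set
        if t == tag:
            s.append(s.pop(i))         # mark as most recently used
            return data, "hit"
    if len(s) == WAYS:                 # set full: evict the LRU line
        s.pop(0)
    s.append((tag, memory[addr]))
    return s[-1][1], "miss"
```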


The cache hit ratio for this initialization loop is
(A) 0%
(B) 25%
(C) 50%
(D) 75%
Answer: (C)

Explanation:

Cache hit ratio = number of hits / total accesses = 1024 / (1024 + 1024) = 1/2 = 0.5 = 50%

So option (C) is correct.

Cache Organization (Introduction)

Cache is close to the CPU and faster than main memory, but it is also smaller than main memory. Cache organization is about mapping data in memory to a location in the cache.

A Simple Solution: One way to perform this mapping is to take the last few bits of the long memory address as the small cache address and place the data at that location.

Problems With Simple Solution: The problem with this approach is that we lose the information in the high-order bits, so we have no way to tell which high-order address the stored data belongs to.


Solution is Tag: To handle the above problem, extra information is stored in the cache to record which block of memory occupies each line. This additional information is stored as a tag.


What is a Cache Block?

Since programs have spatial locality (once a location is retrieved, it is highly probable that nearby locations will be retrieved in the near future), a cache is organized in the form of blocks. Typical cache block sizes are 32 bytes or 64 bytes.
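To make the block idea concrete, here is a hedged sketch of how a byte address could be decomposed for a hypothetical direct-mapped cache with 64-byte blocks and 128 lines (an 8 KB cache; the sizes are assumptions chosen for illustration):

```python
BLOCK_SIZE = 64        # bytes per block  -> 6 offset bits (assumed)
NUM_LINES  = 128       # lines in the cache -> 7 index bits (assumed)

def split_address(addr):
    offset = addr % BLOCK_SIZE                 # byte within the block
    index  = (addr // BLOCK_SIZE) % NUM_LINES  # which cache line
    tag    = addr // (BLOCK_SIZE * NUM_LINES)  # identifies the memory block
    return tag, index, offset

# Neighbouring bytes share a block, which is why spatial locality pays off:
print(split_address(0x1A2B))   # (0, 104, 43)
print(split_address(0x1A2C))   # (0, 104, 44) -- same tag and index, only the offset differs
```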


The above arrangement is a direct-mapped cache, and it has the following problem.

We have discussed above that the last few bits of the memory address are used to address into the cache and the remaining bits are stored as the tag. Now imagine that the cache is very small, with only 2 index bits, so the last two bits of the main memory address select the cache line. If a program accesses 2, 6, 2, 6, 2, …, every access causes a miss, because 2 and 6 map to the same location in the cache and keep evicting each other.
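The thrashing described above can be reproduced with a tiny simulation; assuming a 4-line direct-mapped cache (2 index bits, matching the example), addresses 2 and 6 collide on the same line and every access misses:

```python
NUM_LINES = 4                              # 2 index bits, as in the example (assumed size)
lines = [None] * NUM_LINES                 # each line holds one tag (data omitted)

def access(addr):
    index, tag = addr % NUM_LINES, addr // NUM_LINES
    hit = lines[index] == tag
    lines[index] = tag                     # on a miss, the old block is evicted
    return "hit" if hit else "miss"

print([access(a) for a in [2, 6, 2, 6, 2, 6]])
# ['miss', 'miss', 'miss', 'miss', 'miss', 'miss'] -- 2 and 6 keep evicting each other
```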


Solution to the above problem – Associativity

What if we could store data at any place in the cache? The above problem would disappear, but a fully associative search would slow the cache down, so we do something in between.
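Continuing the same illustrative simulation, grouping the four lines into two 2-way sets lets 2 and 6 stay resident together, so only the first two accesses miss:

```python
NUM_SETS = 2                               # same 4 lines, now as 2 sets of 2 ways (assumed)
WAYS = 2
sets = [[] for _ in range(NUM_SETS)]       # each set holds up to two tags

def access(addr):
    index, tag = addr % NUM_SETS, addr // NUM_SETS
    s = sets[index]
    if tag in s:
        return "hit"
    if len(s) == WAYS:
        s.pop(0)                           # evict the oldest tag in the set
    s.append(tag)
    return "miss"

print([access(a) for a in [2, 6, 2, 6, 2, 6]])
# ['miss', 'miss', 'hit', 'hit', 'hit', 'hit'] -- 2 and 6 now coexist in one set
```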


FAQs on Cache & Memory Hierarchy

1. What is the purpose of the cache in a computer's memory hierarchy?
Ans. The cache in a computer's memory hierarchy is used to store frequently accessed data and instructions. It acts as a temporary storage that is closer to the processor, allowing for faster access compared to the main memory. By keeping frequently used data in the cache, the computer can reduce the time it takes to fetch information from the main memory, improving overall system performance.
2. How does the memory hierarchy in a computer system work?
Ans. The memory hierarchy in a computer system consists of multiple levels, including registers, cache, main memory, and secondary storage. Each level has different capacities, access times, and costs. When the processor needs data, it first checks the registers, which are the fastest but have the smallest capacity. If the data is not found in the registers, it moves to the cache, which has a larger capacity but slightly longer access times. If the data is still not found in the cache, it then accesses the main memory, which has a larger capacity but longer access times compared to the cache. Finally, if the data is not found in the main memory, it retrieves it from the secondary storage, such as a hard disk, which has the largest capacity but the longest access times.
3. How does the cache improve the performance of a computer system?
Ans. The cache improves the performance of a computer system by reducing the average time it takes to access data and instructions. When the processor needs data, it first checks the cache, which has faster access times than the main memory. If the data is found in the cache (a cache hit), the processor can fetch it quickly; if it is not (a cache miss), it must wait for the longer access time of main memory. Frequent hits reduce the overall latency and improve the speed of the system.
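As a hedged illustration of that point, the average (effective) access time can be estimated from the hit rate and the per-level latencies; the numbers below are assumptions, not figures from the text:

```python
# Assumed latencies (ns) and hit rate for a simple two-level example:
cache_time, memory_time = 1, 100   # ns
cache_hit_rate = 0.95

# Effective access time = hit_rate * cache_time + miss_rate * (cache_time + memory_time)
eat = cache_hit_rate * cache_time + (1 - cache_hit_rate) * (cache_time + memory_time)
print(eat)   # 6.0 ns on average, far closer to cache speed than to memory speed
```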
4. What is the concept of cache coherence in a multi-core processor system?
Ans. Cache coherence refers to the consistency of shared data in a multi-core processor system. In such systems, each core has its own cache, and when multiple cores access the same memory location simultaneously, there is a possibility of data inconsistency. Cache coherence protocols are used to ensure that all cores observe a consistent view of shared data. These protocols manage the invalidation and updating of cache lines to maintain data coherence and prevent data corruption or incorrect results due to concurrent access.
5. What are the different levels of cache in a computer system's memory hierarchy?
Ans. A computer system's memory hierarchy typically consists of multiple levels of cache, including L1 cache, L2 cache, and sometimes L3 cache. The L1 cache is the closest to the processor and has the smallest capacity but the fastest access times. It is divided into separate instruction and data caches. The L2 cache is larger than the L1 cache and has slightly longer access times. It acts as a middle layer between the L1 cache and the main memory. In some systems, there may be an additional L3 cache, which is larger than the L2 cache but has longer access times. The L3 cache serves as a shared cache for multiple cores or processors in a system.