
CACHE PERFORMANCE

When the processor needs to read or write a location in main memory, it first checks for a
corresponding entry in the cache.

 If the processor finds that the memory location is in the cache, a cache hit has occurred, and the data is read from the cache.
 If the processor does not find the memory location in the cache, a cache miss has occurred. On a miss, the cache allocates a new entry and copies in the data from main memory; the request is then fulfilled from the contents of the cache.

The performance of cache memory is frequently measured in terms of a quantity called the hit ratio:

Hit ratio = hits / (hits + misses) = number of hits / total accesses

Cache performance can be improved by using a larger cache block size, using higher associativity, reducing the miss rate, reducing the miss penalty, and reducing the time to hit in the cache.
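The hit ratio formula above can be sketched directly in Python; the function name and the example counts are illustrative, not from the text:

```python
def hit_ratio(hits: int, misses: int) -> float:
    """Fraction of memory accesses satisfied by the cache: hits / (hits + misses)."""
    total = hits + misses
    if total == 0:
        raise ValueError("no accesses recorded")
    return hits / total

# Example: 80 hits out of 100 total accesses gives a hit ratio of 0.8.
print(hit_ratio(80, 20))  # 0.8
```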

CACHE LINES

Cache memory is divided into equal-size partitions called cache lines.

 While designing a computer’s cache system, the size of the cache line is an important parameter.
 The size of the cache line affects many parameters of the caching system.

The following results discuss the effect of changing the cache block (or line) size in a caching system.

Result-01: Effect of Changing Block Size on Spatial Locality-

The larger the block size, the better the spatial locality.

Explanation-

Keeping the cache size constant, we have-

Case-01: Decreasing the Block Size-


 A smaller block size contains a smaller number of nearby addresses.
 Thus, fewer nearby addresses are brought into the cache.
 This increases the chances of a cache miss, which reduces the exploitation of spatial locality.
 Thus, the smaller the block size, the worse the spatial locality.

Case-02: Increasing the Block Size-


 A larger block size contains a larger number of nearby addresses.
 Thus, more nearby addresses are brought into the cache.
 This increases the chances of a cache hit, which increases the exploitation of spatial locality.
 Thus, the larger the block size, the better the spatial locality.

Result-02: Effect of Changing Block Size on Cache Tag in Direct Mapped Cache-

In a direct mapped cache, the block size does not affect the cache tag in any way.

Explanation-

Keeping the cache size constant, we have-

Case-01: Decreasing the Block Size-


 Decreasing the block size increases the number of lines in the cache.
 With the decrease in block size, the number of bits in the block offset decreases.
 However, with the increase in the number of cache lines, the number of bits in the line number increases.
 So, (number of bits in line number) + (number of bits in block offset) remains constant.
 Thus, there is no effect on the cache tag.


Case-02: Increasing the Block Size-


 Increasing the block size decreases the number of lines in the cache.
 With the increase in block size, the number of bits in the block offset increases.
 However, with the decrease in the number of cache lines, the number of bits in the line number decreases.
 So, (number of bits in line number) + (number of bits in block offset) remains constant.
 Thus, there is no effect on the cache tag.

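The bit arithmetic for both cases can be sketched in Python. The 32-bit address width and 1 KB cache size are illustrative assumptions, not values from the text:

```python
from math import log2

def direct_mapped_split(addr_bits: int, cache_bytes: int, block_bytes: int):
    """Return (tag_bits, line_bits, offset_bits) for a direct mapped cache."""
    offset_bits = int(log2(block_bytes))
    lines = cache_bytes // block_bytes          # cache size / block size
    line_bits = int(log2(lines))
    tag_bits = addr_bits - line_bits - offset_bits
    return tag_bits, line_bits, offset_bits

# Halving the block size adds one line-number bit and removes one offset bit,
# so the tag width stays the same:
print(direct_mapped_split(32, 1024, 64))  # (22, 4, 6)
print(direct_mapped_split(32, 1024, 32))  # (22, 5, 5)
```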

Result-03: Effect of Changing Block Size on Cache Tag in Fully Associative Cache-

In a fully associative cache, decreasing the block size enlarges the cache tag, and vice versa.

Explanation-

Keeping the cache size constant, we have-

Case-01: Decreasing the Block Size-

 Decreasing the block size decreases the number of bits in the block offset.
 With fewer block offset bits, the number of bits in the tag increases.

Case-02: Increasing the Block Size-

 Increasing the block size increases the number of bits in the block offset.
 With more block offset bits, the number of bits in the tag decreases.
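In a fully associative cache there is no line-number field, so the address is split into only tag and offset. A minimal sketch, again assuming an illustrative 32-bit address:

```python
from math import log2

def fully_associative_split(addr_bits: int, block_bytes: int):
    """Return (tag_bits, offset_bits); the whole rest of the address is tag."""
    offset_bits = int(log2(block_bytes))
    return addr_bits - offset_bits, offset_bits

# Halving the block size removes one offset bit, so the tag grows by one bit:
print(fully_associative_split(32, 64))  # (26, 6)
print(fully_associative_split(32, 32))  # (27, 5)
```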

Result-04: Effect of Changing Block Size on Cache Tag in Set Associative Cache-

In a set associative cache, the block size does not affect the cache tag in any way.

Explanation-

Keeping the cache size constant, we have-

Case-01: Decreasing the Block Size-


 Decreasing the block size increases the number of lines in the cache.
 With the decrease in block size, the number of bits in the block offset decreases.
 With the increase in the number of cache lines, the number of sets in the cache increases.
 With the increase in the number of sets, the number of bits in the set number increases.
 So, (number of bits in set number) + (number of bits in block offset) remains constant.
 Thus, there is no effect on the cache tag.


Case-02: Increasing the Block Size-

 Increasing the block size decreases the number of lines in the cache.
 With the increase in block size, the number of bits in the block offset increases.
 With the decrease in the number of cache lines, the number of sets in the cache decreases.
 With the decrease in the number of sets, the number of bits in the set number decreases.
 So, (number of bits in set number) + (number of bits in block offset) remains constant.
 Thus, there is no effect on the cache tag.

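The set associative case can be sketched the same way; the 32-bit address, 1 KB cache, and 2-way associativity are illustrative assumptions:

```python
from math import log2

def set_associative_split(addr_bits: int, cache_bytes: int, block_bytes: int, ways: int):
    """Return (tag_bits, set_bits, offset_bits) for a k-way set associative cache."""
    offset_bits = int(log2(block_bytes))
    sets = cache_bytes // (block_bytes * ways)   # lines are grouped into sets of `ways`
    set_bits = int(log2(sets))
    return addr_bits - set_bits - offset_bits, set_bits, offset_bits

# With cache size and associativity fixed, changing the block size trades
# offset bits for set-number bits, leaving the tag width unchanged:
print(set_associative_split(32, 1024, 64, 2))  # (23, 3, 6)
print(set_associative_split(32, 1024, 32, 2))  # (23, 4, 5)
```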

Result-05: Effect of Changing Block Size on Cache Miss Penalty-

A smaller cache block incurs a lower cache miss penalty.

Explanation-
 When a cache miss occurs, the block containing the required word has to be brought from main memory.
 If the block size is small, the time taken to bring the block into the cache is less, so a smaller miss penalty is incurred.
 If the block size is large, the time taken to bring the block into the cache is more, so a larger miss penalty is incurred.
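The effect of miss penalty on overall performance is commonly summarized by the average memory access time (AMAT); the cycle counts below are illustrative assumptions:

```python
def average_access_time(hit_time: float, miss_rate: float, miss_penalty: float) -> float:
    """AMAT = hit time + miss rate * miss penalty (all in the same time unit)."""
    return hit_time + miss_rate * miss_penalty

# Doubling the miss penalty (e.g., because a larger block takes longer to
# fetch) raises the average access time even if the miss rate is unchanged:
print(average_access_time(1.0, 0.05, 20.0))  # 2.0 cycles
print(average_access_time(1.0, 0.05, 40.0))  # 3.0 cycles
```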

Result-06: Effect of Cache Tag on Cache Hit Time-

A smaller cache tag ensures a lower cache hit time.

Explanation-
 Cache hit time is the time required to determine whether the required block is in the cache or not.
 It involves comparing the tag of the generated address with the tags of the cache lines.
 The smaller the cache tag, the less time is taken to perform the comparisons; hence, a smaller cache tag ensures a lower cache hit time.
 Conversely, the larger the cache tag, the more time is taken to perform the comparisons; thus, a larger cache tag results in a higher cache hit time.

PRACTICE PROBLEM BASED ON CACHE LINE-


Problem-

In designing a computer’s cache system, the cache block or cache line size is an important
parameter. Which of the following statements is correct in this context?

(A) A smaller block size implies better spatial locality
(B) A smaller block size implies a smaller cache tag and hence lower cache tag overhead
(C) A smaller block size implies a larger cache tag and hence lower cache hit time
(D) A smaller block size incurs a lower cache miss penalty
Solution-

Option (D) is correct. (Result-05)

Reasons-

Option (A) is incorrect because-

 A smaller block size does not imply better spatial locality.
 On the contrary, the larger the block size, the better the spatial locality.

Option (B) is incorrect because-

 In direct mapped and set associative caches, changing the block size has no effect on the cache tag.
 In a fully associative cache, decreasing the block size makes the cache tag larger.
 Thus, a smaller block size does not imply a smaller cache tag in any cache organization.

Option (C) is incorrect because-

 “A smaller block size implies a larger cache tag” is true only for a fully associative cache.
 A larger cache tag does not imply a lower cache hit time; rather, the cache hit time is increased.

What is the cache block size?

In this example, the block size is 4 bytes. Since each cache block is 4 bytes, a 256-byte cache has 256/4 = 64 sets or cache lines. The incoming address is divided into bits for the offset and the tag.

Where are the block size bits in an address?

In a nutshell, the block offset bits determine the block size (how many bytes are in a cache row), and the index bits determine how many rows (sets) there are. The capacity of the cache is therefore 2^(block offset bits + index bits) multiplied by the associativity (number of ways). In this case that is 2^(4+4) × 4 = 256 × 4 = 1 kilobyte.
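That capacity calculation can be checked with a few lines of Python; note that interpreting the final factor as the associativity (number of ways) is an assumption about the worked example:

```python
def cache_capacity(offset_bits: int, index_bits: int, ways: int) -> int:
    """Capacity in bytes: block size (2^offset) * number of sets (2^index) * ways."""
    return (2 ** offset_bits) * (2 ** index_bits) * ways

# 4 offset bits (16-byte blocks), 4 index bits (16 sets), 4 ways:
print(cache_capacity(4, 4, 4))  # 1024 bytes = 1 kilobyte
```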

What is the cache block or line size (in words)?

Each cache line is 1 word (4 bytes).

What is word size in cache?


A cache memory may, for example, have a line size of eight 64-bit words and a capacity of 4K words; the word size here is 64 bits (8 bytes).

How do I check my cache size?

1. Right-click on the Start button and click on Task Manager.
2. On the Task Manager screen, click on the Performance tab, then click on CPU in the left pane. In the right pane, you will see the L1, L2 and L3 cache sizes listed under the “Virtualization” section.


How do I know my cache line size?

Data is transferred between the cache and main memory in fixed-size chunks; the size of these chunks is called the cache line size. Common cache line sizes are 32, 64 and 128 bytes. A cache can only hold a limited number of lines, determined by the cache size. For example, a 64 kilobyte cache with 64-byte lines has 1024 cache lines.
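The line count in that example is just an integer division, sketched below:

```python
def num_cache_lines(cache_bytes: int, line_bytes: int) -> int:
    """Number of lines a cache of the given size can hold."""
    return cache_bytes // line_bytes

# A 64 kilobyte cache with 64-byte lines holds 1024 lines:
print(num_cache_lines(64 * 1024, 64))  # 1024
```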

What is a cache block?

cache block – The basic unit for cache storage. May contain multiple bytes/words of data.
Because different regions of memory may be mapped into a block, the tag is used to
differentiate between them. valid bit – A bit of information that indicates whether the data in
a block is valid (1) or not (0).

What is a computer’s word size?


The word size of a computer generally indicates the largest integer it can process in a single instruction and the size of a memory address, which is usually, but not necessarily, the same as the integer size. The main consequence of the word size is how much memory the processor can address.

How big is the block size of the cache?

Consider an example where the cache block size is 32 bytes and byte addressing is used; with four-byte words, this is 8 words per block. If an access trace produces four hits out of 12 accesses, the hit rate is 4/12 ≈ 33%.

What’s the line size of a cache memory?

A cache memory has a line size of eight 64-bit words and a capacity of 4K words.

How big is a 64-bit word cache?


A 64-bit word is 8 bytes. Line size: 8 words per line means 8 × 8 bytes = 64 bytes = 2^6 bytes per line. Cache size: 4K words means 4096 × 8 bytes = 32K bytes in total.


The main elements of cache design are: cache size, block size, mapping function, replacement algorithm, and write policy. These are explained below.

1. Cache Size:

Even moderately small caches can have a significant impact on performance.

2. Block Size:

Block size is the unit of data exchanged between the cache and main memory. As the block size increases from very small to larger sizes, the hit ratio will at first increase because of the principle of locality: data in the vicinity of a referenced word are likely to be referenced in the near future. As the block size increases, more useful data are brought into the cache.
a. The hit ratio will begin to decrease, however, as the block becomes even larger and the probability of using the newly fetched data becomes less than the probability of reusing the data that must be displaced from the cache to make room for the new block.
3. Mapping Function:

When a new block of data is read into the cache, the mapping function determines which cache location the block will occupy. Two constraints affect the design of the mapping function. First, when one block is read in, another may have to be replaced.
a. We would like to do this in such a way as to minimize the probability that we will replace a block that will be needed in the near future. The more flexible the mapping function, the more scope we have to design a replacement algorithm that maximizes the hit ratio. Second, the more flexible the mapping function, the more complex the circuitry required to search the cache to determine whether a given block is present.
4. Replacement Algorithm:

The replacement algorithm chooses, within the constraints of the mapping function, which block to replace when a new block is to be loaded into the cache and the cache already has all slots filled with other blocks. We would like to replace the block that is least likely to be needed again in the near future. Although it is impossible to identify such a block with certainty, a reasonably effective strategy is to replace the block that has been in the cache longest with no reference to it.
a. This policy is referred to as the least-recently-used (LRU) algorithm. Hardware mechanisms are required to identify the least-recently-used block.
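Real caches implement LRU in hardware, but the policy itself can be sketched in a few lines of Python (the class and method names here are illustrative):

```python
from collections import OrderedDict

class LRUCache:
    """Minimal LRU replacement sketch: evict the entry unused for the longest time."""
    def __init__(self, capacity: int):
        self.capacity = capacity
        self.entries = OrderedDict()  # key -> value, least recently used first

    def access(self, key, value):
        if key in self.entries:
            self.entries.move_to_end(key)         # hit: mark as most recently used
        else:
            if len(self.entries) >= self.capacity:
                self.entries.popitem(last=False)  # miss with full cache: evict LRU
            self.entries[key] = value

cache = LRUCache(2)
cache.access("A", 1)
cache.access("B", 2)
cache.access("A", 1)   # touching A makes B the least recently used entry
cache.access("C", 3)   # cache is full, so B is evicted
print(list(cache.entries))  # ['A', 'C']
```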
5. Write Policy:

If the contents of a block in the cache are altered, it is necessary to write the block back to main memory before replacing it. The write policy dictates when the memory write operation takes place. At one extreme, the writing can occur every time the block is updated.
a. At the other extreme, the writing occurs only when the block is replaced. The latter policy minimizes memory write operations but leaves main memory in an obsolete state. This can interfere with multiple-processor operation and with direct access by I/O hardware modules.
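The two extremes described above are the write-through and write-back policies. A minimal single-block sketch, with main memory modeled as a plain dictionary (all names here are illustrative):

```python
class WriteThroughBlock:
    """Every update is propagated to main memory immediately."""
    def __init__(self, memory: dict):
        self.memory = memory
        self.value = None

    def write(self, addr, value):
        self.value = value
        self.memory[addr] = value       # memory is always up to date

class WriteBackBlock:
    """Updates stay in the cache; memory is written only on eviction."""
    def __init__(self, memory: dict):
        self.memory = memory
        self.addr = None
        self.value = None
        self.dirty = False

    def write(self, addr, value):
        self.addr, self.value, self.dirty = addr, value, True

    def evict(self):
        if self.dirty:                  # write back only if the block was altered
            self.memory[self.addr] = self.value
            self.dirty = False

memory = {}
wb = WriteBackBlock(memory)
wb.write(0x10, 42)
print(0x10 in memory)  # False: main memory is stale until eviction
wb.evict()
print(memory[0x10])    # 42
```

The stale-memory window shown in the last lines is exactly why write-back caches complicate multiprocessor and I/O coherence.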

References
Reference Books:

 J.P. Hayes, “Computer Architecture and Organization”, Third Edition.


 Mano, M., “Computer System Architecture”, Third Edition, Prentice Hall.

 Stallings, W., “Computer Organization and Architecture”, Eighth Edition, Pearson Education.
Text Books:

 Carpinelli, J.D., “Computer Systems Organization & Architecture”, Fourth Edition, Addison Wesley.

 Patterson and Hennessy, “Computer Architecture”, Fifth Edition, Morgan Kaufmann.


Other References

 Cache Memory Design - GeeksforGeeks

 https://www.gatevidyalay.com/cache-line-cache-line-size-cache-memory/
 https://www.geeksforgeeks.org/cache-memory-in-computer-organization/
 https://stackoverflow.com/questions/8107965/concept-of-block-size-in-a-cache
