Computer System Architecture
Memory Part II: Caching

+ The processor issues memory operations of the form <op, addr>, where op is i-fetch, read, or write.
+ Goal: optimize the memory system organization to minimize the average memory access time for typical workloads.

Direct Mapped Cache

+ Mapping: address is modulo the number of blocks in the cache
  - The low-order bits of the block address form the cache index that selects an entry; the remaining high-order bits are stored as the tag and compared on every access, and a valid bit marks whether the entry holds data (a lookup sketch in this style appears at the end of this section).

Direct Mapped Cache (cont.)

+ For MIPS: the address is split into a cache tag, a cache index, and a byte offset within the block; each cache entry holds a valid bit, a cache tag, and the cache data.
+ Taking advantage of spatial locality: make the block several words wide, so that one miss brings the neighboring words into the cache as well.

Extreme Example: single big line

+ Cache Size = 4 bytes
+ Block Size = 4 bytes
+ Only ONE entry in the cache
(Figure: a one-entry cache with a Valid Bit, a Cache Tag, and the Cache Data.)

Extreme Example: single big line (cont.)

+ If an item is accessed, it is likely that it will be accessed again soon
  - But it is unlikely that it will be accessed again immediately!!!
  - The next access will likely be a miss
+ Continually loading data into the cache but discarding (forcing out) it before it is used again
+ Worst nightmare of a cache designer: the Ping Pong effect (demonstrated in the second sketch at the end of this section)

Block Size Tradeoff

+ In general, a larger block size takes advantage of spatial locality, BUT:
  - A larger block size means a larger miss penalty:
    + It takes longer to fill up the block
  - If the block size is too big relative to the cache size, the miss rate will go up:
    + Too few cache blocks
+ In general, Average Access Time:
  = Hit Time x (1 - Miss Rate) + Miss Penalty x Miss Rate
  (worked through in the third sketch at the end of this section)

Performance

+ Increasing the block size tends to decrease the miss rate.

Performance (cont.)

+ Use split caches because there is more spatial locality in code.
(Table: instruction miss rate, data miss rate, and effective combined miss rate for different block sizes.)

Block Size Tradeoff (cont.)

(Figure: Miss Penalty, Miss Rate, and Average Access Time plotted against Block Size.)
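To make the modulo mapping concrete, here is a minimal C sketch of a direct-mapped cache lookup. It is only an illustration of the idea on the slides: the block size (16 bytes), the number of blocks (64), the names CacheLine and access_cache, and the example addresses are all assumptions and do not come from the lecture's MIPS example.

/* Hypothetical direct-mapped cache: NUM_BLOCKS blocks of BLOCK_SIZE
 * bytes each.  The block address modulo NUM_BLOCKS picks the entry
 * (the index); the remaining high-order bits are kept as the tag. */
#include <stdio.h>
#include <stdint.h>
#include <stdbool.h>

#define BLOCK_SIZE 16u   /* bytes per block (assumed)      */
#define NUM_BLOCKS 64u   /* entries in the cache (assumed) */

typedef struct {
    bool     valid;      /* does this entry hold real data?      */
    uint32_t tag;        /* high-order bits of the block address */
} CacheLine;

static CacheLine cache[NUM_BLOCKS];

/* Returns true on a hit; on a miss, installs the block and returns false. */
static bool access_cache(uint32_t addr)
{
    uint32_t block_addr = addr / BLOCK_SIZE;        /* drop the byte offset */
    uint32_t index      = block_addr % NUM_BLOCKS;  /* modulo mapping       */
    uint32_t tag        = block_addr / NUM_BLOCKS;  /* remaining upper bits */

    if (cache[index].valid && cache[index].tag == tag)
        return true;                                /* hit                  */

    cache[index].valid = true;                      /* miss: fill the block */
    cache[index].tag   = tag;
    return false;
}

int main(void)
{
    uint32_t addrs[] = { 0x0000, 0x0004, 0x0400, 0x0000 };
    for (int i = 0; i < 4; i++)
        printf("addr 0x%04x -> %s\n", (unsigned)addrs[i],
               access_cache(addrs[i]) ? "hit" : "miss");
    return 0;
}

With these assumed sizes the cache holds 1 KB, so 0x0000 and 0x0400 map to the same index: the access to 0x0004 hits (spatial locality within the block), while the final access to 0x0000 misses because 0x0400 has forced it out of the entry they share.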
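The ping-pong behaviour from the single-big-line example can be shown with the same kind of sketch shrunk to the slide's parameters: one 4-byte block and nothing else. The two alternating addresses (0x00 and 0x10) are made up; any two addresses that fall in different blocks produce the same result.

/* Hypothetical one-entry cache, as in the "single big line" example:
 * Cache Size = Block Size = 4 bytes, so there is no index field at all. */
#include <stdio.h>
#include <stdint.h>
#include <stdbool.h>

#define BLOCK_SIZE 4u      /* bytes, from the slide */

static bool     valid = false;
static uint32_t stored_tag;

static bool access_one_entry_cache(uint32_t addr)
{
    uint32_t tag = addr / BLOCK_SIZE;   /* the whole block address is the tag */
    if (valid && stored_tag == tag)
        return true;                    /* hit                                */
    valid      = true;                  /* miss: force out the old block      */
    stored_tag = tag;
    return false;
}

int main(void)
{
    /* Alternate between two different blocks (addresses are assumed). */
    for (int i = 0; i < 6; i++) {
        uint32_t addr = (i % 2) ? 0x10u : 0x00u;
        printf("addr 0x%02x -> %s\n", (unsigned)addr,
               access_one_entry_cache(addr) ? "hit" : "miss");
    }
    return 0;
}

Every one of the six accesses misses: each reference forces out the block that the next reference needs, which is exactly the ping-pong effect the slide warns about.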
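Finally, a short sketch that plugs example numbers into the average-access-time formula exactly as the slide writes it (Miss Penalty weighted by Miss Rate). The numbers used here (a 1-cycle hit time, a 17-cycle miss penalty, a 6% miss rate) are invented for illustration and are not from the lecture.

/* Average Access Time = Hit Time x (1 - Miss Rate) + Miss Penalty x Miss Rate */
#include <stdio.h>

static double avg_access_time(double hit_time, double miss_penalty,
                              double miss_rate)
{
    return hit_time * (1.0 - miss_rate) + miss_penalty * miss_rate;
}

int main(void)
{
    /* 1 x 0.94 + 17 x 0.06 = 0.94 + 1.02 = 1.96 cycles on average */
    printf("%.2f cycles\n", avg_access_time(1.0, 17.0, 0.06));
    return 0;
}

Raising either the miss rate or the miss penalty pushes this average up, which is the tension behind the block size tradeoff: bigger blocks lower the miss rate through spatial locality but raise the miss penalty.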
