Wednesday, April 6, 2011

Cache memory

A CPU cache is a cache used by the central processing unit of a computer to reduce the average time to access memory. The cache is a smaller, faster memory which keeps copies of the data from the most frequently used main memory locations. As long as most memory accesses are to cached memory locations, the average latency of memory accesses will be closer to the cache latency than to the latency of main memory.
When the processor needs to read from or write to a location in main memory, it first checks whether a copy of that data is in the cache. If so, the processor immediately reads from or writes to the cache, which is much faster than reading from or writing to main memory.
Most modern desktop and server CPUs have at least three independent caches: an instruction cache to speed up executable instruction fetch, a data cache to speed up data fetch and store, and a translation lookaside buffer (TLB) used to speed up virtual-to-physical address translation for both executable instructions and data. The data cache is usually organized as a hierarchy of multiple cache levels (L1, L2, etc.; see Multi-level caches).
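To make the hit/miss flow above concrete, here is a minimal sketch in Python of the "check the cache before main memory" idea. The capacity, the simple LRU eviction policy, and the latency numbers are illustrative assumptions for this example, not any real CPU's parameters.

CACHE_LATENCY = 1      # assumed cost of a cache hit (arbitrary units)
MEMORY_LATENCY = 100   # assumed cost of going all the way to main memory

class SimpleCache:
    def __init__(self, capacity=4):
        self.capacity = capacity
        self.lines = {}          # address -> data currently held in the cache
        self.order = []          # recency list for a simple LRU eviction

    def read(self, address, main_memory):
        if address in self.lines:                 # cache hit: fast path
            self.order.remove(address)
            self.order.append(address)
            return self.lines[address], CACHE_LATENCY
        data = main_memory[address]               # cache miss: fetch from RAM
        if len(self.lines) >= self.capacity:      # evict the least recently used line
            victim = self.order.pop(0)
            del self.lines[victim]
        self.lines[address] = data
        self.order.append(address)
        return data, MEMORY_LATENCY

main_memory = {addr: addr * 10 for addr in range(16)}
cache = SimpleCache()
total = 0
for addr in [0, 1, 0, 1, 2, 0]:                   # repeated addresses hit in the cache
    _, cost = cache.read(addr, main_memory)
    total += cost
print("average access time:", total / 6)          # drops toward CACHE_LATENCY as the hit rate rises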

Cache memory organization

Cache memory is random access memory (RAM) that a computer or microprocessor can access more quickly than it can access regular RAM. As the microprocessor processes data, it looks first in the cache memory, and if it finds the data there (from a previous read), it does not have to do the more time-consuming read from the larger main memory.

Two types of caching are commonly used in personal computers: MEMORY caching and DISK caching.

A memory cache, sometimes called a cache store or RAM cache, is a portion of memory made of high-speed static RAM (SRAM) instead of the slower and cheaper dynamic RAM (DRAM) used for main memory. Cache memory dramatically raises the performance of a computer system at relatively little cost.

A disk cache is a portion of system memory used to cache reads and writes to the hard disk. It is often considered the most important type of cache on a PC, because the speed differential between the layers it bridges (system RAM and the hard disk) is the greatest. Disk caching works on the same principle as memory caching, but instead of using high-speed SRAM, a disk cache uses conventional main memory.
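As a rough illustration of that principle, the sketch below keeps recently read disk blocks in ordinary main memory so repeated reads skip the slow device. The block size, the cache size, and the read_block helper are assumptions made up for this example, not a real operating system's disk cache.

from functools import lru_cache

BLOCK_SIZE = 4096                        # assumed block size in bytes

@lru_cache(maxsize=128)                  # the in-RAM disk cache holds up to 128 blocks
def read_block(path, block_number):
    """Read one block from disk; runs only on a cache miss."""
    with open(path, "rb") as f:
        f.seek(block_number * BLOCK_SIZE)
        return f.read(BLOCK_SIZE)

# first = read_block("somefile.bin", 0)  # miss: goes to the hard disk
# again = read_block("somefile.bin", 0)  # hit: served from main memory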
Also worth knowing about:
1. Cache memory organization.

2. Cache memory and its mapping. (The transformation of data from main memory to cache memory is referred to as the mapping process; a sketch of this follows the list.)

3. Cache memory in a computer. (Cache that is built into the CPU is faster than a separate cache because it runs at the speed of the microprocessor itself.)

4. Cache memory architecture. (Cache-only memory architecture (COMA) is a computer memory organization used in multiprocessors.)

5. Cache memory enhances:
(a) memory capacity  (b) memory access time
(c) secondary storage capacity  (d) secondary storage access time
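As a hedged illustration of the mapping process mentioned in item 2, the sketch below shows how a main-memory address can be split into tag, index, and offset fields for a direct-mapped cache. The geometry chosen here (64-byte lines, 256 lines) is an example, not any specific processor's layout.

LINE_SIZE = 64        # example: bytes per cache line
NUM_LINES = 256       # example: number of lines in the cache
OFFSET_BITS = LINE_SIZE.bit_length() - 1     # 6 bits select the byte within a line
INDEX_BITS = NUM_LINES.bit_length() - 1      # 8 bits select the cache line

def map_address(address):
    offset = address & (LINE_SIZE - 1)                    # byte within the line
    index = (address >> OFFSET_BITS) & (NUM_LINES - 1)    # which cache line the block maps to
    tag = address >> (OFFSET_BITS + INDEX_BITS)           # identifies which memory block is stored
    return tag, index, offset

print(map_address(0x12345678))   # -> (tag, index, offset) for this address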

For a more detailed description of cache memory, follow this link.
Link -  http://www.bunkclass.com/upload_download/download_doc_details.php?docid=637&f=265799239.doc#
