RAM & Cache
Cache Memory
Cache memory is a high-speed, small-sized type of volatile computer memory that plays a crucial role in improving CPU (Central Processing Unit) performance. It serves as a bridge between the CPU and the slower, larger main memory (RAM). In the memory hierarchy it sits between the CPU registers, which are smaller and faster still, and main memory.
The primary purpose of cache memory is to reduce the time it takes for the CPU to access frequently used data and instructions, thereby enhancing overall system performance.
Cache memory is static RAM (SRAM), which stores each bit in a flip-flop circuit.
Cache is usually split into levels: L1, L2, and L3, each with different sizes, speeds, and proximity to the CPU core.
L1 Cache (Level 1)
- Closest to the CPU core (often built into it).
- Smallest (typically 16KB–128KB per core).
- Fastest (latency: ~1–2 CPU cycles).
- Often split into:
  - L1d (Data) cache
  - L1i (Instruction) cache
- Used for: Storing the most frequently used data and instructions.
- Because of its speed, it's accessed first by the CPU.
L2 Cache (Level 2)
- Larger than L1 (usually 128KB–1MB per core).
- Slower than L1 but still very fast (latency: ~3–15 cycles).
- May be dedicated per core or shared between a few cores.
- Used for: Storing data that isn’t in L1 but is still likely to be reused soon.
L3 Cache (Level 3)
- Largest (usually 2MB–64MB, shared among all CPU cores).
- Slowest of the three (latency: ~20–50+ cycles), but still faster than RAM.
- Shared across cores on the same processor (helps with inter-core communication).
- Used for: Reducing access times to DRAM and improving performance in multi-core tasks.
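The effect of this layered hierarchy can be sketched with an average memory access time (AMAT) calculation. The latencies and hit rates below are illustrative assumptions in line with the ranges above, not measurements from a real CPU, and the model is simplified: it only counts the latency of the level that finally hits.

```python
# Illustrative average memory access time (AMAT) across the cache hierarchy.
# Latencies (in CPU cycles) and hit rates are assumed example figures.

def amat(levels, memory_latency):
    """Expected cycles per access: each level is tried in order;
    a miss at every level falls through to main memory."""
    expected = 0.0
    p_reach = 1.0  # probability the access reaches this level
    for latency, hit_rate in levels:
        expected += p_reach * latency * hit_rate
        p_reach *= (1 - hit_rate)  # only misses continue to the next level
    expected += p_reach * memory_latency
    return expected

# (latency_cycles, hit_rate) for L1, L2, L3 -- assumed values
hierarchy = [(2, 0.90), (10, 0.70), (35, 0.80)]
print(round(amat(hierarchy, memory_latency=200), 2))
```

Even with main memory assumed to cost 200 cycles, the high L1 hit rate keeps the expected cost per access to only a few cycles, which is why the hierarchy works.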
What happens when requested data is not found in cache memory?
Cache Hits & Misses
A cache hit occurs when the CPU requests a piece of data or instruction, and that data is found in the cache memory. In other words, the data needed by the CPU is already present in the cache.
A cache miss occurs when the CPU requests data or instructions that are not present in the cache memory. In this case, the cache does not contain the required data, and the CPU must fetch it from a slower memory hierarchy level, typically from the main memory (RAM).
Cache hits are desirable because they result in very fast memory access times. The CPU can retrieve the required data directly from the cache, avoiding the longer latency associated with accessing data from main memory (RAM) or even slower storage devices like hard drives or SSDs.
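The hit/miss behaviour described above can be sketched with a tiny cache model. This is a minimal illustration, not how real hardware caches are organised: the capacity, the address stream, and the choice of LRU (least recently used) eviction are all assumptions for the example.

```python
from collections import OrderedDict

class Cache:
    """Toy fully-associative cache with LRU eviction, counting hits/misses."""

    def __init__(self, capacity):
        self.capacity = capacity
        self.lines = OrderedDict()  # address -> data, ordered by recency
        self.hits = 0
        self.misses = 0

    def access(self, address):
        if address in self.lines:
            self.hits += 1
            self.lines.move_to_end(address)     # mark as most recently used
        else:
            self.misses += 1                    # would fetch from RAM here
            if len(self.lines) >= self.capacity:
                self.lines.popitem(last=False)  # evict least recently used
            self.lines[address] = True

cache = Cache(capacity=2)
for addr in [0x10, 0x20, 0x10, 0x30, 0x20]:
    cache.access(addr)
print(cache.hits, cache.misses)  # → 1 4
```

Note how the second access to `0x10` is a hit, but `0x20` is evicted before it is reused, turning its second access into a miss: with a small cache, the access pattern matters as much as the capacity.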
What does a cache hit mean in computer memory?
Pros & Cons of Cache Memory
Advantages
- Cache memory is faster than RAM because it is built from flip-flop (SRAM) circuits and located on, or very close to, the CPU.
Disadvantages
- Cache memory is more expensive than RAM because SRAM circuitry is more complex than DRAM circuitry.
- It is also more energy-intensive, because the flip-flop circuits are constantly active.
How does cache memory improve computer performance?
RAM
RAM (Random Access Memory) is a type of volatile computer memory that serves as the primary workspace for actively used data and instructions within a computer system. It utilises dynamic RAM (DRAM), a form of memory that stores each bit in a capacitor. DRAM is much cheaper per byte than static RAM, allowing much larger amounts of data to be stored.
What does RAM stand for in computer memory?
Why do we need RAM?
RAM is much faster than other storage devices like hard drives and SSDs. It provides the CPU with rapid access to data and instructions, enabling quick data retrieval and manipulation. This speed is crucial for the efficient operation of applications and the overall responsiveness of the computer.
Without RAM, the computer would have to load data directly from the HDD or SSD, which is significantly slower. This would create processing bottlenecks severe enough to slow the computer to the point of being unusable.
What is the main function of RAM in a computer?
When is RAM used?
During Bootup
When you power on your computer or restart it, the computer's BIOS or UEFI firmware initializes hardware components and loads the operating system (e.g., Windows, macOS, Linux) from non-volatile storage (e.g., a hard drive or SSD) into RAM. The operating system's kernel and essential system files are loaded into RAM during this process.
When software is opened
When you launch a software application (e.g., a web browser, word processor, or video game), the operating system loads the necessary program files and libraries from storage into RAM. This includes executable code, user interface elements, and data structures needed to run the application.
When Files are accessed
When you open a file (e.g., a document, image, or video), the relevant portions of that file are read from storage and loaded into RAM.
What is the primary role of RAM during the computer bootup process?
Virtual Memory
In cases where physical RAM is limited, modern operating systems use a technique called virtual memory. They allocate a portion of the storage device (e.g., a hard drive or SSD) as virtual RAM.
When physical RAM becomes scarce, less frequently used data is moved to this virtual memory, allowing more critical data to remain in physical RAM.
This process, known as paging or swapping, enables efficient memory management.
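The paging process can be sketched with a simple page-replacement simulation. This is an illustrative model only: the frame count, the reference string, and the choice of FIFO replacement (evict the oldest resident page) are assumptions, and real operating systems use more sophisticated policies.

```python
from collections import deque

def fifo_page_faults(references, num_frames):
    """Count page faults for a reference string under FIFO replacement."""
    frames = deque()          # resident pages, oldest first
    faults = 0
    for page in references:
        if page not in frames:
            faults += 1       # page fault: load the page from backing store
            if len(frames) == num_frames:
                frames.popleft()  # evict the oldest page (swap it out)
            frames.append(page)
    return faults

refs = [1, 2, 3, 4, 1, 2, 5, 1, 2, 3, 4, 5]
print(fifo_page_faults(refs, num_frames=3))  # → 9
print(fifo_page_faults(refs, num_frames=4))  # → 10
```

This particular reference string is a classic example of Belady's anomaly: under FIFO, adding a fourth frame actually produces more page faults, which is one reason replacement policy matters as much as the amount of physical RAM.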
Advantages of RAM
Speed
RAM is significantly faster than secondary storage devices like hard drives and SSDs. It provides quick access to data and instructions, which is essential for the rapid execution of applications and tasks.
Data Access
RAM offers random access, which means that any part of the memory can be accessed quickly with nearly the same speed. This is important for efficient data retrieval and manipulation.
Cost
The DRAM used in RAM is cheaper than the SRAM used in cache memory and so can be installed in much greater quantities on a computer system.
Disadvantages of RAM
Volatility
RAM is volatile memory, meaning that it loses its data when the computer is powered off or restarted. This makes it unsuitable for long-term data storage.
Limited Capacity
RAM has limited capacity compared to secondary storage devices. While you can have multiple gigabytes of RAM, it's still relatively smaller than the storage capacity of hard drives and SSDs, limiting the amount of data that can be held in RAM.
Cost
RAM can be expensive, especially as you aim for larger capacities and higher speeds. This can be a significant cost factor in building or upgrading a computer system.
Review: Fill in the Blanks
Cache memory is divided into different levels, specifically ____, ____, and ____, each varying in size, speed, and proximity to the CPU core. The L1 cache is the closest and fastest, typically ranging from ____ per core, and is used for storing the most frequently accessed data and instructions. In contrast, L2 cache is larger than L1, usually between ____, and while slower than L1, it still serves to store data likely to be reused soon.
A cache hit occurs when the CPU finds the requested data in the cache, leading to very fast memory access times. Conversely, a cache miss happens when the data is absent from the cache, requiring the CPU to fetch it from the slower main memory. Cache memory has advantages, such as being faster than RAM due to its use of flip-flop circuits, but it is also more expensive and energy-intensive.
RAM, or ____, serves as the primary workspace for actively used data and instructions within a computer. It is a type of volatile memory that utilizes ____ (DRAM) and offers rapid access to data, which is crucial for the efficient operation of applications. However, RAM is limited in capacity compared to secondary storage devices and loses all stored data when the computer is powered off.