
The term cache is derived from French, meaning 'to hide' or 'hidden', which is quite fitting considering its role in the digital world. In the simplest of terms, a cache is a hardware or software component that stores data so that future requests for that data can be served faster. The data stored in a cache might be the result of an earlier computation or a copy of data stored elsewhere.
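The core idea — storing the result of an earlier computation so future requests are served faster — can be sketched in a few lines of Python using the standard library's functools.lru_cache as a simple in-memory cache (the function name and counter here are illustrative):

```python
from functools import lru_cache

call_count = 0

@lru_cache(maxsize=128)  # keep up to 128 recent results in memory
def expensive_lookup(key: str) -> str:
    """Stand-in for a slow computation or remote fetch."""
    global call_count
    call_count += 1
    return key.upper()

expensive_lookup("report")   # first request: computed and stored in the cache
expensive_lookup("report")   # second request: served from the cache
print(call_count)            # the underlying function ran only once
```

The second call never reaches the function body at all, which is exactly the speed-up a cache provides.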

Types of cache

Several types of cache exist in computing, each serving a distinct purpose. These include memory cache, disk cache, and web cache, among others. Each type has its own characteristics and functionality, and all of them contribute to the overall performance of a computer system.

Memory cache, for instance, is a small amount of high-speed random access memory (RAM) dedicated for temporary storage of frequently accessed data. Disk cache, on the other hand, is a portion of RAM used to speed up access to data on a disk. Web cache is a mechanism used by internet browsers to store web page resources on a user's computer, thereby speeding up subsequent access to the same web pages.

Memory cache

Memory cache, also known as CPU cache, is a high-speed static random access memory (SRAM) that a computer microprocessor can access more quickly than it can access regular random access memory (RAM). This cache is used to store frequently accessed data and instructions. The primary goal of a memory cache is to increase the speed at which the processor can access this data.

Memory cache is typically integrated directly into the CPU or located in a separate chip next to the CPU and connected via a high-speed bus. The size of the memory cache can greatly affect the performance of the computer. A larger cache allows for more data to be stored closer to the CPU, reducing the time it takes for the processor to access this data.

Disk cache

Disk cache, also known as disk buffering, serves a similar purpose to memory cache but is used for data read from or written to a disk. The disk cache holds data that has recently been read and, in some cases, adjacent data areas that are likely to be accessed next. Write caching is also used to gather small writes and commit them as a single larger write to a disk, reducing the overhead of writing data to disk.

Disk cache can be a dedicated portion of RAM, but it can also be a portion of the general memory that the operating system allocates as needed. The size of a disk cache can also impact system performance. A larger disk cache allows for more data to be stored for quick access, reducing the number of direct reads and writes to the disk, which are slower operations.

Web cache

Web cache, also known as HTTP cache, is used by web browsers and web servers to store web documents, such as HTML pages and images, to reduce server lag. A web cache stores copies of documents passing through it; subsequent requests for the same document are satisfied from the cache if certain conditions are met. This can significantly improve the efficiency and performance of web browsing.

Web caching reduces the amount of data that needs to be transmitted across the network, as the same request can be served from the cache. This not only reduces bandwidth usage but also reduces the load on the server, as it doesn't need to serve all clients itself. Web caching is a key aspect of HTTP, and all modern web browsers and web servers support it.
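The "certain conditions" under which a cached copy may be served are governed by HTTP headers such as Cache-Control. Below is a minimal sketch of the freshness check a browser cache performs; it is deliberately simplified (real HTTP caching also involves ETag validators, the Vary header, and revalidation requests):

```python
import time

def is_fresh(cached_at: float, cache_control: str, now: float) -> bool:
    """Return True if a cached response is still fresh under max-age."""
    for directive in cache_control.split(","):
        directive = directive.strip()
        if directive == "no-store":
            return False                         # response must not be served from cache
        if directive.startswith("max-age="):
            max_age = int(directive.split("=", 1)[1])
            return (now - cached_at) <= max_age  # fresh while within max-age seconds
    return False  # no freshness information: treat as stale and revalidate

t0 = time.time()
print(is_fresh(t0, "max-age=3600", t0 + 60))    # cached one minute ago: still fresh
print(is_fresh(t0, "max-age=3600", t0 + 7200))  # older than one hour: stale
print(is_fresh(t0, "no-store", t0))             # may not be cached at all
```

When the check fails, the browser falls back to fetching the resource from the server again.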

Cache and cybersecurity

While cache serves an important role in improving system performance, it also presents certain cybersecurity concerns. Cached data can be a target for cybercriminals as it often contains sensitive information. If a hacker can access a cache, they may be able to retrieve this information.

Furthermore, cache poisoning is a type of cyber attack where the attacker corrupts the cache data of a web server or a web browser by replacing it with malicious data. This can lead to users being redirected to malicious websites or being served malicious content when they request a web page.

Cache attacks

Cache attacks are a type of side-channel attack where the attacker observes the cache's behavior and uses this information to infer the data stored in the cache. These attacks can be particularly effective as they can be performed without needing direct access to the cache or the data it contains.

There are several types of cache attacks, including evict-and-time, prime-and-probe, and flush-and-reload attacks. Each of these involves observing a different aspect of the cache's behavior to infer the data it contains. Such attacks can be used to retrieve sensitive information, such as encryption keys, from a system.
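The flush-and-reload idea can be illustrated with a toy simulation. The access "times" below are modeled constants (hit = 1, miss = 100) chosen purely to show how timing differences leak which data a victim touched; this is a conceptual sketch, not code for attacking real hardware:

```python
HIT, MISS = 1, 100  # modeled access latencies, not real measurements

class ToyCache:
    """Simulated cache that reports a fast time for cached lines."""

    def __init__(self):
        self.lines = set()

    def flush(self, addr) -> None:
        self.lines.discard(addr)        # FLUSH: attacker evicts a monitored line

    def access(self, addr) -> int:
        latency = HIT if addr in self.lines else MISS
        self.lines.add(addr)            # any access loads the line into the cache
        return latency

cache = ToyCache()
addresses = range(4)
for a in addresses:
    cache.flush(a)                      # step 1: flush every monitored address

secret = 2
cache.access(secret)                    # step 2: victim touches one address

# step 3 (RELOAD): the one fast access reveals which address the victim used
timings = {a: cache.access(a) for a in addresses}
inferred = min(timings, key=timings.get)
print(inferred)                         # the attacker recovers the secret index: 2
```

The attacker never reads the victim's data directly; the timing difference alone betrays it, which is what makes this a side-channel attack.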

Cache poisoning

Cache poisoning is a type of attack in which the attacker corrupts cached data by replacing it with malicious entries. In the best-known form, DNS cache poisoning, the attacker replaces a legitimate DNS record with one pointing to a malicious IP address. When users then request the website's address from the DNS server, they are directed to the malicious site instead. This can be used to perform phishing attacks or to spread malware.

Cache poisoning can be difficult to detect as the malicious data is stored in the cache, and the DNS server continues to serve this data until it is refreshed. This makes cache poisoning a potent threat in the realm of cybersecurity.

Preventing cache-based attacks

There are several strategies for preventing cache-based attacks. These include regularly clearing the cache, using secure protocols, and implementing security measures at the hardware level. Regularly clearing the cache can prevent an attacker from accessing sensitive information stored in the cache.

Using secure protocols, such as HTTPS, can prevent cache poisoning by ensuring that the data transferred between the server and the client is encrypted. Hardware-level security measures, such as isolated cache architecture, can prevent cache attacks by isolating different processes' caches, preventing one process from accessing another's cache data.


In conclusion, cache is a fundamental concept in computing and cybersecurity. While it plays a crucial role in improving system performance, it also presents certain cybersecurity risks. Understanding these risks and how to mitigate them is key to maintaining a secure system.

As technology continues to evolve, the role of cache in computing and cybersecurity is likely to continue to grow. Therefore, a deep understanding of cache and its implications for cybersecurity is essential for anyone involved in the field.

This post has been updated on 17-11-2023 by Sofie Meyer.


About the author

Sofie Meyer is a copywriter and phishing aficionado here at Moxso. She has a master's degree in Danish and a keen interest in cybercrime, which resulted in a master's thesis project on phishing.

Similar definitions

Telemetry Vanity domain Wireless fidelity Truncate Data Manipulation Language Semantics Single sign-on (SSO) Kali Linux Nonce Network block device (NBD) TL;DR Brute force attack Not safe for work (NSFW) Internet protocol address (IP) Non-volatile memory (NVM)