A cache is a mechanism for temporarily storing (caching) data locally in order to reduce access time to data stored far away.

- Stackoverflow.com Wiki

It’s a mistake to feel complacent about the state of the art of computing, no matter when you live. There’s always another bottleneck.


Getting caching right yields huge performance benefits, saves bandwidth, and reduces server costs, but many sites half-arse their caching, creating race conditions resulting in interdependent resources getting out of sync.


I was reading about optimal ways of caching, mainly to offload the server by reducing the number of hits. 304s are beautiful, aren’t they? Moving your scaling to the frontend has many benefits, including serving content faster and fewer origin hits. Let’s look at the various levels where caches can be implemented.
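Those beautiful 304s come from conditional requests: the server tags a response with an ETag, the client echoes it back in If-None-Match, and if the resource hasn't changed the server answers 304 with an empty body. A minimal sketch of that revalidation logic (the `handle_request` helper and the header shapes are hypothetical, not from the article):

```python
import hashlib

def handle_request(body, if_none_match=None):
    """Hypothetical handler showing ETag-based revalidation."""
    # Derive a validator from the current resource content.
    etag = '"%s"' % hashlib.sha256(body).hexdigest()[:16]
    headers = {"ETag": etag, "Cache-Control": "max-age=60"}
    if if_none_match == etag:
        # Client's cached copy is still valid: 304, no body sent.
        return 304, b"", headers
    # Full response; the client caches it along with the ETag.
    return 200, body, headers

# First request: full 200 response carrying the ETag.
status, body, headers = handle_request(b"<h1>hello</h1>")
# Revalidation: the client echoes the ETag and gets a body-less 304.
status2, body2, _ = handle_request(b"<h1>hello</h1>", headers["ETag"])
```

The bandwidth saving is the empty body: the origin spends a hash comparison instead of re-sending the payload.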


Caching done badly has serious consequences. Try your hardest not to cache data; but if you really do have to, make sure you do it right.


How Reddit monitors, tunes, and scales their memcached infrastructure.


You want to keep an object around only as long as you have memory available, do ya? Then you need the WeakReference class.
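The WeakReference idea exists in most managed runtimes; Python's analogue is the standard-library `weakref` module. A weak reference lets you hand an object to a cache without that cache keeping the object alive, so it can be collected as soon as the last strong reference goes away (the `CacheEntry` class below is a made-up example):

```python
import gc
import weakref

class CacheEntry:
    """Hypothetical cached object; weakref needs a class instance."""
    def __init__(self, value):
        self.value = value

entry = CacheEntry("expensive result")
ref = weakref.ref(entry)   # a weak reference does NOT keep entry alive

assert ref() is entry      # target is still strongly reachable
del entry                  # drop the only strong reference
gc.collect()               # CPython frees on refcount; collect() for safety
assert ref() is None       # dereferencing now yields None
```

`weakref.WeakValueDictionary` builds a whole memory-sensitive cache on the same principle: entries vanish automatically once nothing else uses them.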


If you have ever bought milk at the supermarket, then you can understand server-side and browser-side caching.


One of the easiest and most popular ways to increase system performance is caching. When we introduce a cache, we automatically duplicate our data, so it is important to keep the cache and the data source in sync (more or less, depending on the requirements of your system) whenever changes occur. In this article, we will go through the most common cache synchronization strategies, their advantages and disadvantages, and popular use cases.
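Two of the strategies usually covered under that heading are cache-aside (the application loads on a miss and invalidates on write) and write-through (every write updates the cache and the source together). A minimal sketch, assuming a plain dict stands in for the database and the class names are my own:

```python
class CacheAside:
    """Cache-aside (lazy loading): the application manages the cache itself."""
    def __init__(self, db):
        self.db = db               # source of truth (a dict stands in for a DB)
        self.cache = {}

    def read(self, key):
        if key in self.cache:      # cache hit
            return self.cache[key]
        value = self.db[key]       # cache miss: load from the source
        self.cache[key] = value    # populate for subsequent reads
        return value

    def write(self, key, value):
        self.db[key] = value
        self.cache.pop(key, None)  # invalidate rather than update: fewer stale races

class WriteThrough(CacheAside):
    """Write-through: writes go to the cache and the source in one step."""
    def write(self, key, value):
        self.db[key] = value
        self.cache[key] = value    # cache is always as fresh as the source

store = CacheAside({"user:1": "Ada"})
store.read("user:1")               # miss: pulls from db, fills the cache
store.write("user:1", "Grace")     # updates db, evicts the cached copy
store.read("user:1")               # miss again: reloads the new value
```

Cache-aside trades an extra miss after each write for simpler consistency; write-through keeps reads fast and fresh at the cost of slower writes.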