A cache is a mechanism for temporarily storing (caching) data locally in order to reduce access time to data stored far away.

- Stackoverflow.com Wiki
12 articles, 2 books.

It’s a mistake to feel complacent about the state of the art of computing, no matter when you live. There’s always another bottleneck.


Getting caching right yields huge performance benefits, saves bandwidth, and reduces server costs, but many sites half-arse their caching, creating race conditions that leave interdependent resources out of sync.
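
A common way to avoid that out-of-sync problem is to fingerprint asset URLs by their content and serve them with a long max-age, so a page only ever references the asset versions it was built against. A minimal sketch of the idea (the fingerprint_asset helper and the header values are illustrative, not taken from the linked article):

```python
import hashlib
from pathlib import Path

def fingerprint_asset(path: str) -> str:
    """Return a content-hashed filename, e.g. app.css -> app.3f2a9c1b.css.

    Because the name changes whenever the content changes, the file can be
    cached "forever"; a new deploy references new URLs, so a page and its
    assets cannot drift out of sync.
    """
    p = Path(path)
    digest = hashlib.sha256(p.read_bytes()).hexdigest()[:8]
    return f"{p.stem}.{digest}{p.suffix}"

# Headers you would attach when serving the fingerprinted file (illustrative):
LONG_LIVED_HEADERS = {"Cache-Control": "public, max-age=31536000, immutable"}
```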


I was reading up on optimal ways of caching, mainly to offload the server by reducing the number of hits. 304s are beautiful, aren’t they? Moving your scaling to the frontend has a lot of benefits, including serving content faster, fewer origin hits, … Let’s look at the various levels where caches can be implemented.
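
For context, a 304 is the payoff of a conditional request: the client sends back the validator it holds (If-None-Match / If-Modified-Since) and the server replies with an empty 304 when nothing has changed. A minimal, standard-library-only sketch of that decision (the conditional_get helper is hypothetical, not from the article):

```python
import hashlib

def conditional_get(body: bytes, if_none_match: str | None):
    """Decide between a full 200 response and an empty 304.

    The ETag here is just a hash of the body; the client echoes it back
    on its next request via If-None-Match.
    """
    etag = '"' + hashlib.sha256(body).hexdigest()[:16] + '"'
    if if_none_match == etag:
        # Validator matches: send no body, the client re-uses its cached copy.
        return 304, {"ETag": etag}, b""
    return 200, {"ETag": etag, "Cache-Control": "max-age=60"}, body

# First request: no validator yet -> 200 with an ETag.
status, headers, _ = conditional_get(b"hello", None)
# Revalidation: content unchanged, validator matches -> 304, no body on the wire.
status, headers, _ = conditional_get(b"hello", headers["ETag"])
```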


Caching done badly has bad implications. Try your hardest not to cache data, but if you really do have to, make sure you do it right.


How Reddit monitors, tunes, and scales their memcached infrastructure.
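
The post is about operating memcached at scale; the application-side pattern it usually backs is plain cache-aside: check the cache, fall back to the source of truth on a miss, write back with a TTL. A minimal sketch using the pymemcache client (the server address, key scheme, and load_user stand-in are assumptions, not Reddit's code):

```python
from pymemcache.client.base import Client

cache = Client(("localhost", 11211))  # assumed local memcached instance

def load_user(user_id: str) -> bytes:
    # Stand-in for the real source of truth (database, internal API, ...).
    return f"user:{user_id}:payload".encode()

def get_user(user_id: str) -> bytes:
    key = f"user:{user_id}"
    cached = cache.get(key)            # hit: memcached already has it
    if cached is not None:
        return cached
    value = load_user(user_id)         # miss: go to the backing store
    cache.set(key, value, expire=300)  # write back with a 5-minute TTL
    return value
```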


You want to keep an object around only as long as you have memory available, do ya? Then you need the WeakReference class.
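
That blurb is about the WeakReference class found in runtimes like Java and .NET; Python's analogue is the weakref module. A tiny sketch of the idea (BigObject is just a stand-in): a weak reference never keeps its target alive, so the cached object is free to be collected as soon as no strong reference remains.

```python
import weakref

class BigObject:
    """Stand-in for something expensive you'd rather not pin in memory."""
    def __init__(self, payload):
        self.payload = payload

obj = BigObject("lots of data")
ref = weakref.ref(obj)   # weak reference: does not keep obj alive

print(ref() is obj)      # True while a strong reference still exists
del obj                  # drop the last strong reference
print(ref())             # None: the object was free to be garbage-collected
```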