Home
BitFaster.Caching is a high performance in-memory caching library for .NET.
The most popular caching library for .NET is arguably MemoryCache (available as System.Runtime.Caching.MemoryCache and via Microsoft.Extensions.Caching.Memory), but it's a heavyweight option with limitations (see below). In particular, MemoryCache is not a good fit when the number of possible cached values is huge: everything will be cached, which either wastes memory (expensive) or, worse, runs out of memory (failure). I have seen out-of-memory failures in production. BitFaster.Caching provides bounded-size caches with a focus on performance. By explicitly choosing how many items to cache, the developer controls the cache budget and runaway memory usage is prevented.
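A minimal sketch of a bounded cache using BitFaster.Caching's ConcurrentLru (the key/value types and the ExpensiveLookup helper are illustrative assumptions, not part of the library):

```csharp
using System;
using BitFaster.Caching.Lru;

class Example
{
    static void Main()
    {
        // Bounded to 128 items: once full, the LRU policy evicts
        // cold items instead of growing without limit.
        var lru = new ConcurrentLru<int, string>(128);

        // GetOrAdd runs the value factory only on a cache miss.
        string value = lru.GetOrAdd(42, k => ExpensiveLookup(k));
        Console.WriteLine(value);
    }

    // Hypothetical expensive operation standing in for an RPC or disk read.
    static string ExpensiveLookup(int key) => key.ToString();
}
```

The capacity passed to the constructor is the cache budget: memory use is bounded by design, not estimated after the fact.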
Since BitFaster.Caching is generic, non-string keys can be used without allocating a new key string for each lookup. A cache provides a speedup whenever a lookup is faster than computing, querying, or fetching the value. With fast lookups that don't allocate, caching can achieve speedups in lower-level code (e.g. cheap computations such as parsing or formatting small JSON objects), in addition to RPC calls or disk reads. This makes it practical to plug caching into several layers of a program without exploding memory consumption.
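For example, a value-type key that implements IEquatable&lt;T&gt; can be hashed and compared with no per-lookup allocation at all (the TileKey type and LoadTile helper below are hypothetical, for illustration only):

```csharp
using System;
using BitFaster.Caching.Lru;

// Hypothetical composite key; a readonly struct implementing IEquatable<T>
// lets the cache hash and compare it without boxing or string allocation.
readonly struct TileKey : IEquatable<TileKey>
{
    public readonly int X;
    public readonly int Y;
    public TileKey(int x, int y) { X = x; Y = y; }
    public bool Equals(TileKey other) => X == other.X && Y == other.Y;
    public override bool Equals(object obj) => obj is TileKey k && Equals(k);
    public override int GetHashCode() => HashCode.Combine(X, Y);
}

class Program
{
    static void Main()
    {
        var tiles = new ConcurrentLru<TileKey, byte[]>(1024);

        // No key string is built per lookup; the struct is used directly.
        byte[] tile = tiles.GetOrAdd(new TileKey(3, 7), k => LoadTile(k));
        Console.WriteLine(tile.Length);
    }

    // Stand-in for a disk read or rendering step.
    static byte[] LoadTile(TileKey key) => new byte[16];
}
```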
MemoryCache is perfectly serviceable, but it has some limitations:
- Makes heap allocations when the natural key is not a string (e.g. an int or Guid key must be converted to a string or boxed).
- Is not 'scan' resistant: a one-time pass over all keys will load everything into memory, evicting items that are actually useful.
- Does not scale well with concurrent writes.
- Contains perf counters that can't be disabled.
- Uses a heuristic to estimate memory used, and evicts items on a timer. The 'trim' process may remove useful items, and if the timer does not fire soon enough the resulting memory pressure can be problematic (e.g. an induced GC).
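The first limitation is visible in the MemoryCache API itself: in Microsoft.Extensions.Caching.Memory, keys are typed as object, so a value-type key is boxed on every call (a minimal sketch; the surrounding setup is illustrative):

```csharp
using Microsoft.Extensions.Caching.Memory;

class BoxingExample
{
    static void Main()
    {
        using var cache = new MemoryCache(new MemoryCacheOptions());

        int key = 42;

        // Set and TryGetValue take an object key, so the int
        // is boxed (one heap allocation) on each call.
        cache.Set(key, "value");
        cache.TryGetValue(key, out string value);

        // A generic cache such as ConcurrentLru<int, string>
        // avoids this allocation entirely.
    }
}
```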