ConcurrentTLru

Alex Peck edited this page May 19, 2022 · 17 revisions

ConcurrentTLru is a thread-safe, bounded-size pseudo TLRU (time-aware least recently used) cache. It's exactly like ConcurrentLru, but each item additionally has a time-to-live (TTL). This page describes how to use the ConcurrentTLru.

Usage

ConcurrentTLru is intended to be a drop-in replacement for ConcurrentDictionary, with the added benefit of a bounded size enforced by a TLRU eviction policy.

The code samples below illustrate how to create a TLru, then get, remove, and update items:

Constructor

int capacity = 666;
TimeSpan ttl = TimeSpan.FromMinutes(5);
var lru = new ConcurrentTLru<int, SomeItem>(capacity, ttl);

Get

bool success1 = lru.TryGet(1, out var value);
var value1 = lru.GetOrAdd(1, (k) => new SomeItem(k));
var value2 = await lru.GetOrAddAsync(0, (k) => Task.FromResult(new SomeItem(k)));

Remove

bool success2 = lru.TryRemove(1);
lru.Clear();

Update

var item = new SomeItem(1);
bool success3 = lru.TryUpdate(1, item);
lru.AddOrUpdate(1, item);

Diagnostics

Console.WriteLine(lru.HitRatio);

// enumerate keys
foreach (var k in lru.Keys)
{
   Console.WriteLine(k);
}

// enumerate key value pairs
foreach (var kvp in lru)
{
   Console.WriteLine($"{kvp.Key} {kvp.Value}");
}

// register event on item removed
lru.ItemRemoved += (source, eventArgs) => Console.WriteLine($"{eventArgs.Reason} {eventArgs.Key} {eventArgs.Value}");

How it works

The ConcurrentTLru implementation is the same as ConcurrentLru, described here. The only difference is that the TLru eviction policy evaluates the age of an item on each lookup (i.e. TryGet/GetOrAdd/GetOrAddAsync); if the item has expired, it is discarded at lookup time. Expired items can also be discarded when the cache is at capacity and new items are added: internally, items transition from hot to warm to cold, and expired items can be evicted at these transition points.
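The lookup-time expiry described above can be observed directly. This is an illustrative sketch (the 100 ms TTL and string values are arbitrary choices, not recommendations):

```csharp
using System;
using System.Threading;
using BitFaster.Caching.Lru;

// Sketch: an item becomes unreachable once its TTL elapses,
// even though no background thread removed it.
var lru = new ConcurrentTLru<int, string>(128, TimeSpan.FromMilliseconds(100));

lru.GetOrAdd(1, k => "value");
Console.WriteLine(lru.TryGet(1, out _));  // True: item is still fresh

Thread.Sleep(200);                        // wait past the TTL

// The expired item is detected and discarded at lookup time.
Console.WriteLine(lru.TryGet(1, out _));  // False: item has expired
```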

Note that expired items are not eagerly evicted while the cache is below capacity, since there is no background thread performing cleanup. TLru therefore bounds the staleness of items returned by reads, but does not forcefully evict stale items until capacity is reached.
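A sketch of this lazy-expiry behavior, assuming the cache exposes a Count property (the TTL value here is illustrative):

```csharp
using System;
using System.Threading;
using BitFaster.Caching.Lru;

// Sketch: with no background cleanup thread, an expired item still
// occupies a cache slot until it is looked up (or until capacity
// pressure evicts it at a hot/warm/cold transition point).
var lru = new ConcurrentTLru<int, string>(128, TimeSpan.FromMilliseconds(100));

lru.GetOrAdd(1, k => "stale");
Thread.Sleep(200);                // the item is now past its TTL

Console.WriteLine(lru.Count);     // likely still 1: expired but not yet discarded

lru.TryGet(1, out _);             // lookup detects the expiry and discards the item
Console.WriteLine(lru.Count);     // expired item has now been purged
```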