ConcurrentTLru

Alex Peck edited this page May 20, 2022 · 17 revisions

ConcurrentTLru is a thread-safe, bounded-size pseudo TLRU (time-aware least recently used cache). It is exactly like ConcurrentLru, except that items also have a time-to-live (TTL) and expire once that duration elapses. This page describes how to use ConcurrentTLru.

Usage

ConcurrentTLru is intended to be a drop-in replacement for ConcurrentDictionary, with the added benefit of bounded size enforced by a TLRU eviction policy.

The code samples below illustrate how to create a ConcurrentTLru, then get, remove, and update items:

Constructor

int capacity = 666;
TimeSpan ttl = TimeSpan.FromMinutes(5);
var lru = new ConcurrentTLru<int, SomeItem>(capacity, ttl);

Get

bool success1 = lru.TryGet(1, out var value);
var value1 = lru.GetOrAdd(1, (k) => new SomeItem(k));
var value2 = await lru.GetOrAddAsync(0, (k) => Task.FromResult(new SomeItem(k)));

Remove

bool success2 = lru.TryRemove(1);
lru.Clear();

Update

var item = new SomeItem(1);
bool success3 = lru.TryUpdate(1, item);
lru.AddOrUpdate(1, item);

Diagnostics

Console.WriteLine(lru.HitRatio);

// enumerate keys
foreach (var k in lru.Keys)
{
   Console.WriteLine(k);
}

// enumerate key value pairs
foreach (var kvp in lru)
{
   Console.WriteLine($"{kvp.Key} {kvp.Value}");
}

// register event on item removed
lru.ItemRemoved += (source, args) => Console.WriteLine($"{args.Reason} {args.Key} {args.Value}");

How it works

ConcurrentTLru uses the same core algorithm as ConcurrentLru, described on the ConcurrentLru page. In addition, the TLru item eviction policy evaluates the age of an item on each lookup (i.e. TryGet/GetOrAdd/GetOrAddAsync); if the item has expired, it is discarded at lookup time. Expired items can also be discarded when the cache is at capacity and new items are added: as each new item is added, existing items may transition from hot to warm to cold, and expired items can be evicted at these transition points.

Note that expired items are not eagerly evicted while the cache is below capacity, since there is no background thread performing cleanup. TLru therefore bounds the staleness of items returned by reads, but stale items are not forcefully evicted until capacity is reached.
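The lookup-time expiry check described above can be sketched in isolation. This is a minimal illustration, not the library's actual types: a hypothetical item that records its creation timestamp via Stopwatch.GetTimestamp, and a freshness test equivalent to the one TLru applies inside TryGet/GetOrAdd before returning a value.

```csharp
using System;
using System.Diagnostics;

// Hypothetical simplified item: the real library similarly stamps each
// item with a Stopwatch timestamp when it is created.
public class TimestampedItem<V>
{
    public V Value;
    public readonly long CreatedTimestamp = Stopwatch.GetTimestamp();

    public TimestampedItem(V value) { Value = value; }
}

public static class ExpiryCheck
{
    // True while the item's age is within the TTL. On a real lookup,
    // an expired item is discarded instead of being returned.
    public static bool IsFresh<V>(TimestampedItem<V> item, TimeSpan ttl)
    {
        long elapsedTicks = Stopwatch.GetTimestamp() - item.CreatedTimestamp;
        double elapsedSeconds = (double)elapsedTicks / Stopwatch.Frequency;
        return elapsedSeconds <= ttl.TotalSeconds;
    }
}
```

Because the check runs on every lookup, its cost is dominated by the call to Stopwatch.GetTimestamp, which is the subject of the benchmark below.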

Why is ConcurrentTLru slower than ConcurrentLru?

On every lookup, item age is calculated and compared to the TTL. Internally, this results in a call to Stopwatch.GetTimestamp(), which is relatively expensive compared to the dictionary lookup that fetches the item.

It is possible to get a considerable speedup by using Environment.TickCount. However, this is based on a 32-bit counter that wraps around, so it can only be used reliably for 49.8 days.
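The 49.8-day limit comes from TickCount being a 32-bit millisecond counter: it wraps from int.MaxValue to int.MinValue roughly every 2^32 ms. As a sketch (these names are illustrative, not the library's), unchecked integer subtraction still yields the correct elapsed time across a single wrap, but any interval longer than the wrap period becomes ambiguous:

```csharp
using System;

public static class TickMath
{
    // Elapsed milliseconds between two TickCount readings. Unchecked
    // subtraction is correct across a single 32-bit wraparound, but an
    // interval spanning more than one wrap period cannot be recovered.
    public static int ElapsedMs(int startTicks, int nowTicks)
    {
        unchecked { return nowTicks - startTicks; }
    }
}
```

For example, a reading taken just before the counter wraps and another taken just after still subtract to the correct small interval, which is why TickCount-based expiry works only while the process lifetime stays under the wrap period.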

| Method               | Runtime            | Mean      | Ratio |
|----------------------|--------------------|-----------|-------|
| DateTimeUtcNow       | .NET 6.0           | 24.545 ns | 1.00  |
| EnvironmentTickCount | .NET 6.0           | 1.624 ns  | 0.06  |
| StopWatchGetElapsed  | .NET 6.0           | 16.349 ns | 0.67  |
| DateTimeUtcNow       | .NET Framework 4.8 | 57.072 ns | 1.00  |
| EnvironmentTickCount | .NET Framework 4.8 | 1.577 ns  | 0.03  |
| StopWatchGetElapsed  | .NET Framework 4.8 | 25.015 ns | 0.44  |

Stopwatch is used by default because it is the fastest method that gives reliable expiry in all cases.

If your process is guaranteed to run for less than 49.8 days, you can create a TLru with TickCountLruItem and TLruTicksPolicy like this:

public sealed class CustomTLru<K, V> : TemplateConcurrentLru<K, V, TickCountLruItem<K, V>, TLruTicksPolicy<K, V>, TelemetryPolicy<K, V>>
{
    public CustomTLru(int concurrencyLevel, int capacity, TimeSpan timeToLive)
        : base(concurrencyLevel, capacity, EqualityComparer<K>.Default, new TLruTicksPolicy<K, V>(timeToLive), default)
    { 
    }
...
}