TL;DR: For AsyncLoadingCache, is there a mechanism to limit the number of concurrent load operations, with requests that arrive while loads are in flight being coalesced into them?

Good day. There is a large existing system with custom caches, which is being replaced with Caffeine. This cache is called concurrently by many callers, each needing singular values, and given the established nature of the system this is hard to change. The data-fetching operation against the DB is expensive, and it would be convenient to say that only ever e.g. two loading operations can happen at the same time, with singular requests that arrive while they are in flight (and that need fetching) instead being fetched as part of one of those loads. I can see how one could achieve this by building around Caffeine, but is there some way to achieve it inside it? Thanks for your help.
Replies: 1 comment 3 replies
You can look at the contributed coalescing bulkloader. I believe this returns the singular futures, queues the requests, batch-loads them, and completes the pending futures. This type of functionality is nice but a little out of scope for this library, similar to retry and fallbacks being served by combining with Failsafe. Unfortunately I don't know of a generic coalescing library, but that would compose nicely with our cache loader.
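In the absence of a generic coalescing library, the pattern described above can be sketched by hand. The following is a minimal, hypothetical illustration, not the contributed bulkloader's actual API: the `Coalescer` class, its parameters, and the fixed two-thread load pool are all assumptions for this sketch. Individual futures are handed out immediately, keys are queued, and one bulk load completes them all; the bounded pool caps how many loads run concurrently.

```java
import java.util.LinkedHashMap;
import java.util.Map;
import java.util.Set;
import java.util.concurrent.CompletableFuture;
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import java.util.concurrent.ScheduledExecutorService;
import java.util.concurrent.TimeUnit;
import java.util.function.Function;

// Hypothetical coalescer: each get(key) call receives its own future, but
// the keys are queued and loaded together in one bulk call. A fixed
// two-thread pool caps the number of bulk loads running at any one time.
final class Coalescer<K, V> {
  private final Function<Set<K>, Map<K, V>> bulkLoader;
  private final int maxBatchSize;
  private final long maxDelayMillis;
  private final Map<K, CompletableFuture<V>> pending = new LinkedHashMap<>();
  private final ScheduledExecutorService scheduler =
      Executors.newSingleThreadScheduledExecutor();
  private final ExecutorService loadPool = Executors.newFixedThreadPool(2);

  Coalescer(Function<Set<K>, Map<K, V>> bulkLoader,
      int maxBatchSize, long maxDelayMillis) {
    this.bulkLoader = bulkLoader;
    this.maxBatchSize = maxBatchSize;
    this.maxDelayMillis = maxDelayMillis;
  }

  // Returns a future for a single key; duplicate requests for a key that
  // is already queued share the same pending future.
  synchronized CompletableFuture<V> get(K key) {
    CompletableFuture<V> existing = pending.get(key);
    if (existing != null) {
      return existing;
    }
    CompletableFuture<V> future = new CompletableFuture<>();
    pending.put(key, future);
    if (pending.size() == 1) {
      // First key of a new batch: flush after a short delay at the latest.
      scheduler.schedule(this::flush, maxDelayMillis, TimeUnit.MILLISECONDS);
    } else if (pending.size() >= maxBatchSize) {
      flush();
    }
    return future;
  }

  private synchronized void flush() {
    if (pending.isEmpty()) {
      return; // the timer fired after a size-triggered flush already ran
    }
    Map<K, CompletableFuture<V>> batch = new LinkedHashMap<>(pending);
    pending.clear();
    // The expensive load runs on the bounded pool, so at most two bulk
    // loads are ever in flight; each caller's future is completed after.
    CompletableFuture.supplyAsync(() -> bulkLoader.apply(batch.keySet()), loadPool)
        .whenComplete((results, error) -> batch.forEach((key, future) -> {
          if (error != null) {
            future.completeExceptionally(error);
          } else {
            future.complete(results.get(key));
          }
        }));
  }

  void shutdown() {
    scheduler.shutdown();
    loadPool.shutdown();
  }
}
```

A coalescer like this could sit behind an `AsyncCacheLoader.asyncLoad` implementation so that the cache still deduplicates per-key, while the loader batches the misses. The error handling here is deliberately coarse (one failed bulk load fails every future in the batch); a production version would likely want per-key fallback and cancellation of the stale timer after a size-triggered flush.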