Add clarifying JavaDoc (see #537)
In #537, a deadlock occurs because a reload operation blocks waiting
for other reloads to complete. The individual reloads are batched
through a blocking queue, which drains tasks as work is performed.
Because in-flight reloads are tracked in a secondary map, a reload is
triggered from within a map computation. This ensures that a stale
reload does not insert outdated data: the tracking entry is removed
when the reload completes or when the original entry is removed or
modified.

Because a reload blocks while holding a map lock, when the batch
completes the future's callback to remove the tracking entry collides
on the same map lock due to hashing. The task queue is therefore never
drained to allow more work, so the reload holding the lock is stuck
forever. Due to hash collisions, other usages of the map (such as
eviction) may also collide and fully prevent the cache from performing
any more writes.
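The collision can be sketched with a plain `ConcurrentHashMap`, with no Caffeine involved: a computation that blocks while holding a bin lock stalls any computation on a colliding key. This is a minimal illustration, assuming the JDK's implementation detail that keys 0 and 64 share a bin in a default-sized (16-bin) table, since `64 & 15 == 0`:

```java
import java.util.concurrent.ConcurrentHashMap;
import java.util.concurrent.CountDownLatch;

public class BinLockDemo {
  public static void main(String[] args) throws Exception {
    var map = new ConcurrentHashMap<Integer, String>();
    var latch = new CountDownLatch(1);
    var started = new CountDownLatch(1);

    // Thread 1: a computation that blocks while holding the bin lock,
    // standing in for the reload that waits on the batch queue.
    var t1 = new Thread(() -> map.compute(0, (k, v) -> {
      started.countDown();
      try { latch.await(); } catch (InterruptedException e) {
        Thread.currentThread().interrupt();
      }
      return "loaded";
    }));
    t1.start();
    started.await();

    // Thread 2: a computation on a colliding key (same bin), standing in
    // for the future's callback that removes the in-flight entry.
    var t2 = new Thread(() -> map.compute(64, (k, v) -> "other"));
    t2.start();
    t2.join(500);  // give it time; it stays blocked on the bin lock
    System.out.println("colliding compute finished: " + !t2.isAlive());

    latch.countDown();  // release thread 1 so the demo terminates
    t1.join();
    t2.join();
  }
}
```

In the real deadlock nothing ever releases the latch: thread 1 is waiting for work that only thread 2's callback can complete, and thread 2 is waiting on thread 1's bin lock.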
ben-manes committed May 3, 2021
1 parent 7b28541 commit 754b19d
Showing 3 changed files with 18 additions and 0 deletions.
@@ -48,6 +48,8 @@ public interface AsyncCacheLoader<K extends Object, V extends Object> {

/**
* Asynchronously computes or retrieves the value corresponding to {@code key}.
* <p>
* <b>Warning:</b> loading <b>must not</b> attempt to update any mappings of this cache directly.
*
* @param key the non-null key whose value should be loaded
* @param executor the executor with which the entry is asynchronously loaded
@@ -71,6 +73,8 @@ public interface AsyncCacheLoader<K extends Object, V extends Object> {
* This method should be overridden when bulk retrieval is significantly more efficient than many
* individual lookups. Note that {@link AsyncLoadingCache#getAll} will defer to individual calls
* to {@link AsyncLoadingCache#get} if this method is not overridden.
* <p>
* <b>Warning:</b> loading <b>must not</b> attempt to update any mappings of this cache directly.
*
* @param keys the unique, non-null keys whose values should be loaded
* @param executor the executor with which the entries are asynchronously loaded
@@ -92,6 +96,9 @@ public interface AsyncCacheLoader<K extends Object, V extends Object> {
* {@code null} is computed. This method is called when an existing cache entry is refreshed by
* {@link Caffeine#refreshAfterWrite}, or through a call to {@link LoadingCache#refresh}.
* <p>
* <b>Warning:</b> loading <b>must not</b> attempt to update any mappings of this cache directly
* or block waiting for other cache operations to complete.
* <p>
* <b>Note:</b> <i>all exceptions thrown by this method will be logged and then swallowed</i>.
*
* @param key the non-null key whose value should be loaded
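The warning added above can be illustrated with a short sketch (the class name and the key-to-value transformation are hypothetical): an `AsyncCacheLoader` should compute and return the value and let the cache install it, never write to the cache it is loading for.

```java
import com.github.benmanes.caffeine.cache.AsyncLoadingCache;
import com.github.benmanes.caffeine.cache.Caffeine;
import java.util.concurrent.CompletableFuture;

public class LoaderExample {
  public static void main(String[] args) {
    AsyncLoadingCache<String, String> cache = Caffeine.newBuilder()
        .maximumSize(100)
        .buildAsync((key, executor) ->
            // Correct: compute and return the value; the cache installs it.
            // Wrong would be to capture the cache and call cache.put(...) or
            // cache.synchronous().invalidate(...) from inside this loader,
            // which may run within the cache's own atomic map operation.
            CompletableFuture.supplyAsync(key::toUpperCase, executor));

    System.out.println(cache.get("hello").join());  // prints HELLO
  }
}
```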
@@ -223,6 +223,9 @@ Map<K, V> getAll(Iterable<? extends K> keys,
* Returns access to inspect and perform low-level operations on this cache based on its runtime
* characteristics. These operations are optional and dependent on how the cache was constructed
* and what abilities the implementation exposes.
* <p>
* <b>Warning:</b> policy operations <b>must not</b> be performed from within an atomic scope of
* another cache operation.
*
* @return access to inspect and perform advanced operations based on the cache's characteristics
*/
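A minimal sketch of the policy warning, assuming the `Cache.policy()` and `Policy.eviction()` API: policy operations belong outside any atomic map operation, not inside a `compute` on the cache's map view.

```java
import com.github.benmanes.caffeine.cache.Cache;
import com.github.benmanes.caffeine.cache.Caffeine;

public class PolicyExample {
  public static void main(String[] args) {
    Cache<String, Integer> cache = Caffeine.newBuilder()
        .maximumSize(10)
        .build();
    cache.put("a", 1);

    // Wrong: a policy call inside compute runs within the map's atomic
    // scope and can deadlock on the cache's internal locks, e.g.
    //   cache.asMap().compute("a", (k, v) -> {
    //     cache.policy().eviction().ifPresent(e -> e.setMaximum(5)); // don't
    //     return v;
    //   });

    // Right: finish the atomic operation first, then adjust the policy.
    cache.asMap().compute("a", (k, v) -> v + 1);
    cache.policy().eviction().ifPresent(e -> e.setMaximum(5));
    System.out.println(cache.getIfPresent("a"));  // prints 2
  }
}
```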
@@ -123,6 +123,8 @@ default CompletableFuture<? extends V> asyncLoad(K key, Executor executor) {
* This method should be overridden when bulk retrieval is significantly more efficient than many
* individual lookups. Note that {@link AsyncLoadingCache#getAll} will defer to individual calls
* to {@link AsyncLoadingCache#get} if this method is not overridden.
* <p>
* <b>Warning:</b> loading <b>must not</b> attempt to update any mappings of this cache directly.
*
* @param keys the unique, non-null keys whose values should be loaded
 * @param executor the executor with which the entries are asynchronously loaded
@@ -151,6 +153,9 @@ default CompletableFuture<? extends V> asyncLoad(K key, Executor executor) {
* returned. This method is called when an existing cache entry is refreshed by
* {@link Caffeine#refreshAfterWrite}, or through a call to {@link LoadingCache#refresh}.
* <p>
* <b>Warning:</b> loading <b>must not</b> attempt to update any mappings of this cache directly
* or block waiting for other cache operations to complete.
* <p>
* <b>Note:</b> <i>all exceptions thrown by this method will be logged and then swallowed</i>.
*
* @param key the non-null key whose value should be loaded
@@ -172,6 +177,9 @@ default CompletableFuture<? extends V> asyncLoad(K key, Executor executor) {
* {@code null} is computed. This method is called when an existing cache entry is refreshed by
* {@link Caffeine#refreshAfterWrite}, or through a call to {@link LoadingCache#refresh}.
* <p>
* <b>Warning:</b> loading <b>must not</b> attempt to update any mappings of this cache directly
* or block waiting for other cache operations to complete.
* <p>
* <b>Note:</b> <i>all exceptions thrown by this method will be logged and then swallowed</i>.
*
* @param key the non-null key whose value should be loaded
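The reload warning can be sketched the same way (the class name and the value derivation are illustrative): `reload` should derive the new value from its arguments and return it, rather than reading from or blocking on the cache, because it may already run while an internal map lock is held.

```java
import com.github.benmanes.caffeine.cache.CacheLoader;
import com.github.benmanes.caffeine.cache.Caffeine;
import com.github.benmanes.caffeine.cache.LoadingCache;
import java.time.Duration;

public class ReloadExample {
  public static void main(String[] args) {
    LoadingCache<String, Integer> cache = Caffeine.newBuilder()
        .refreshAfterWrite(Duration.ofMinutes(1))
        .build(new CacheLoader<String, Integer>() {
          @Override public Integer load(String key) {
            return key.length();
          }
          @Override public Integer reload(String key, Integer oldValue) {
            // Correct: derive the new value and return it. Do not call
            // cache.get(otherKey) or block waiting on another cache
            // operation here; the reload may hold an internal map lock.
            return oldValue + key.length();
          }
        });

    System.out.println(cache.get("abc"));  // prints 3
  }
}
```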
