Sandbox cache backend errors in cache_memoize().

Review Request #13446 — Created Dec. 4, 2023 and submitted

Information

Djblets
release-3.x

Reviewers

If a cache backend was misbehaving or down, calls to cache_memoize()
or cache_memoize_iter() could easily fail, propagating errors and
breaking the application. This could interfere with state generation or
simply halt operations on a server.

These calls are now fully insulated from any errors coming from the
backend. The _CacheContext.load_value(), store_value(), and
store_many() internal helpers now catch exceptions and log them,
providing useful information to surface any issues in the logs
immediately.
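
As an illustration (not the actual Djblets implementation), a minimal sketch of this kind of helper-level sandboxing could look like the following, assuming a Django-style cache backend and a module-level logger, with hypothetical `_load_value()`/`_store_value()` helpers standing in for the real `_CacheContext` methods:

```python
import logging

from django.core.cache import cache

logger = logging.getLogger(__name__)


def _load_value(cache_key):
    """Fetch a value from cache, logging and re-raising backend errors."""
    try:
        return cache.get(cache_key)
    except Exception as e:
        logger.exception('Error fetching data from cache for key "%s": %s',
                         cache_key, e)
        raise


def _store_value(cache_key, value):
    """Store a value in cache, logging and re-raising backend errors."""
    try:
        cache.set(cache_key, value)
    except Exception as e:
        logger.exception('Error storing data in cache for key "%s": %s',
                         cache_key, e)
        raise
```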

The functions calling these helpers then catch the exceptions,
optionally add more context to the logs, and gracefully discard them.

For a standard cache_memoize(), this means that generated data will
still be returned, and any failure to get or set cached data will be
silently swallowed. This is largely the behavior we already had, with
one exception (no pun intended): upon an exception, a variable would be
left undefined, breaking the call. It's now initialized correctly.
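
Building on the hypothetical helpers sketched above, the flow looks roughly like this (a simplified sketch, not the real cache_memoize() code), with the result variable initialized before the `try` block so a cache error can no longer leave it undefined:

```python
def cache_memoize_sketch(key, lookup_callable):
    """Return cached data if possible, falling back to generating it.

    Any cache backend failure is logged by the helpers above and then
    discarded here, so the caller always gets data back.
    """
    data = None  # Set up front so a cache error can't leave it undefined.

    try:
        data = _load_value(key)
    except Exception:
        # Already logged by the helper; treat it as a cache miss.
        pass

    if data is None:
        data = lookup_callable()

        try:
            _store_value(key, data)
        except Exception:
            # Failing to store isn't fatal; just return the new data.
            pass

    return data
```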

The bulk of the new sandboxing comes from working with large or iterable
data. We already had protection when loading data from cache (mostly to
account for missing chunked cache items), in that we'd fall back on
generating new data. But we now protect when setting cached data as
well. If we fail to set a cache key, we give up on setting any prior
keys for that data, and just yield the newly-generated data as before.
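
For the iterable case, the idea is roughly the following (again a simplified sketch with hypothetical names, reusing `_store_value()` from above, not the actual cache_memoize_iter() code): once a chunk fails to store, caching is abandoned for the rest of the data, but iteration continues uninterrupted.

```python
def cache_memoize_iter_sketch(key, items_callable, chunk_size=10):
    """Yield generated items, caching them in chunks until a store fails."""
    chunk = []
    chunk_index = 0
    storing = True

    for item in items_callable():
        yield item

        if storing:
            chunk.append(item)

            if len(chunk) >= chunk_size:
                try:
                    _store_value('%s-%d' % (key, chunk_index), chunk)
                except Exception:
                    # Give up on caching this result, but keep yielding.
                    storing = False

                chunk = []
                chunk_index += 1

    # Store any trailing partial chunk, unless we've already given up.
    if storing and chunk:
        try:
            _store_value('%s-%d' % (key, chunk_index), chunk)
        except Exception:
            pass
```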

This should all protect from bad cache server behavior,
misconfigurations, and outages in code that uses cache_memoize() and
friends.
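
From a caller's perspective, the net effect is that code like the following keeps working even when the cache backend is unreachable, just without the caching benefit. A small usage sketch, assuming the `djblets.cache.backend` module path and a hypothetical report-building function:

```python
from djblets.cache.backend import cache_memoize


def build_expensive_report(report_id):
    # Hypothetical stand-in for an expensive computation.
    return 'Report data for %s' % report_id


def get_report(report_id):
    # If the cache backend is unreachable, cache_memoize() now logs the
    # error and falls back to calling the lambda instead of raising.
    return cache_memoize('report-%s' % report_id,
                         lambda: build_expensive_report(report_id))
```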

Testing Done:

All Djblets and Review Board unit tests pass.

Summary: Sandbox cache backend errors in cache_memoize().
ID: 4b0db2cd509d5c2572c36de8574d15c87b2121f4
david
  Ship It!

maubin
  Ship It!

chipx86
Review request changed

Status: Closed (submitted)

Change Summary:

Pushed to release-3.x (20278e2)