Fix other re-entrancy nits for the lru_cache.

Keep references for oldkey and oldvalue so they can't
trigger a __del__ method to reenter our thread.
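The hazard can be sketched with a hypothetical value type whose __del__ re-enters the cached function; the Chatty class and the call sequence below are illustrative, not part of this commit:

```python
from functools import lru_cache

calls = []

class Chatty:
    """Hypothetical value whose finalizer re-enters the cached function."""
    def __init__(self, n):
        self.n = n

    def __del__(self):
        # Fires when the cache evicts this value and drops its last
        # reference.  If that happened while the linked list was only
        # half updated, this re-entrant call would see inconsistent state.
        if self.n == 1:
            f(99)

@lru_cache(maxsize=2)
def f(n):
    calls.append(n)
    return Chatty(n)

f(1)
f(2)
f(3)   # evicts the entry for key 1; Chatty(1).__del__ re-enters f()
```

By holding oldkey and oldvalue in locals until the links and the cache dictionary are consistent again, the final decref (and any resulting __del__) can only run at a safe point.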

Move the cache[key]=link step to the end, after the link
data is in a consistent state.

Under exotic circumstances, the cache[key]=link step could
trigger reentrancy (i.e. the key would have to have a hash
exactly equal to that for another key in the cache and the
key would need an __eq__ method that makes a reentrant call
to our cached function).
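These exotic circumstances can be sketched with a hypothetical key type whose instances all share one hash value and whose first equality comparison re-enters the cached function; the CollidingKey class below is illustrative, not from this commit:

```python
from functools import lru_cache

@lru_cache(maxsize=8)
def f(key):
    return id(key)

class CollidingKey:
    """Hypothetical key: every instance hashes alike, and the first
    equality check re-enters the cached function."""
    reentered = False

    def __hash__(self):
        return 1  # force every instance into the same hash bucket

    def __eq__(self, other):
        if not CollidingKey.reentered:
            CollidingKey.reentered = True
            f(CollidingKey())   # re-entrant call mid-lookup
        return self is other

a, b = CollidingKey(), CollidingKey()
f(a)   # first insert: nothing to collide with, __eq__ never runs
f(b)   # lookup collides with a, so __eq__ runs and re-enters f()
```

Because the cache dictionary is now updated only after the links are consistent, a re-entrant call triggered during that dictionary operation observes a valid cache.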
Raymond Hettinger 2013-03-04 03:34:09 -05:00
parent 0392342673
commit f2c17a9276

@@ -267,19 +267,23 @@ def lru_cache(maxsize=128, typed=False):
                     # computed result and update the count of misses.
                     pass
                 elif full:
-                    # use root to store the new key and result
-                    root[KEY] = key
-                    root[RESULT] = result
-                    cache[key] = root
+                    # use the old root to store the new key and result
+                    oldroot = root
+                    oldroot[KEY] = key
+                    oldroot[RESULT] = result
                     # empty the oldest link and make it the new root
-                    root = root[NEXT]
-                    del cache[root[KEY]]
+                    root = oldroot[NEXT]
+                    oldkey = root[KEY]
+                    oldvalue = root[RESULT]
                     root[KEY] = root[RESULT] = None
+                    # now update the cache dictionary for the new links
+                    del cache[oldkey]
+                    cache[key] = oldroot
                 else:
                     # put result in a new link at the front of the queue
                     last = root[PREV]
                     link = [last, root, key, result]
-                    cache[key] = last[NEXT] = root[PREV] = link
+                    last[NEXT] = root[PREV] = cache[key] = link
                     currsize += 1
                     full = (currsize == maxsize)
                 misses += 1