A cache object that deletes the least-recently-used items. If you put more stuff in it, then items will fall out.
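
A minimal usage sketch, assuming the classic lru-cache API (a max option plus set/get); the size and keys here are only illustrative:

    var LRU = require('lru-cache')
    var cache = new LRU({ max: 500 })        // keep at most 500 entries

    cache.set('user:1', { name: 'Ada' })     // inserting beyond max evicts the least-recently-used entry
    cache.get('user:1')                      // => { name: 'Ada' }, and marks the key as recently used
    cache.get('missing')                     // => undefined
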
mru lru cache

This is a collection of different data structures and utilities, implemented in JavaScript. It's written and tested using Node.js, which is also the target platform. CircularBuffer - a data structure that uses a single, fixed-size buffer as if it were connected end-to-end. When the buffer is filled, new data is written starting at the beginning of the buffer, overwriting the oldest data.
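
A sketch of the circular-buffer idea described above (an illustration, not the library's actual class); writes wrap around and overwrite the oldest slot once the fixed capacity is reached:

    class CircularBuffer {
      constructor (capacity) {
        this.buf = new Array(capacity)
        this.capacity = capacity
        this.next = 0          // index where the next write lands
        this.length = 0        // number of valid items, at most capacity
      }
      push (item) {
        this.buf[this.next] = item                    // overwrite whatever was here
        this.next = (this.next + 1) % this.capacity
        if (this.length < this.capacity) this.length++
      }
      toArray () {                                    // oldest to newest
        const start = this.length < this.capacity ? 0 : this.next
        return Array.from({ length: this.length }, (_, i) => this.buf[(start + i) % this.capacity])
      }
    }
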
algorithms datastructures ds sort-map data-structures avl avltree trie ternarysearchtrie multiwaytrie skiplist btree binarysearch binarysearchtree sortedmap lru cache eviction maximize least-recently-used loading guava async delay-queue delayqueue priorityqueue heap binaryheap bitset bitarray bitvector circularbuffer ringbuffer graph directed undirected shortest cycle path topological bloomfilter bloom filter

All of these features are optional and are off by default, so you can pick and choose those you wish to enable. For anything else, please see the documentation.
caching transactions lru distributed-systems expiration

A finite key-value map using the Least Recently Used (LRU) algorithm, where the most recently used items are "kept alive" while older, less recently used items are evicted to make room for newer ones. Useful when you want to limit memory use to only hold commonly used things.
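
One common way to build such a map in JavaScript is to lean on Map's insertion order: reads re-insert the key to mark it as most recently used, and writes evict the oldest key once the limit is hit. A generic sketch of that technique (not this package's code):

    class LRUMap {
      constructor (limit) { this.limit = limit; this.map = new Map() }
      get (key) {
        if (!this.map.has(key)) return undefined
        const value = this.map.get(key)
        this.map.delete(key)
        this.map.set(key, value)            // re-insert: now the most recently used
        return value
      }
      set (key, value) {
        if (this.map.has(key)) this.map.delete(key)
        this.map.set(key, value)
        if (this.map.size > this.limit) {
          this.map.delete(this.map.keys().next().value)   // evict the least recently used
        }
      }
    }
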
cache lru buffer map

moize is a consistently blazing fast memoization library for JavaScript. It handles multiple parameters (including default values) without any additional configuration, and offers a large number of options to satisfy any number of potential use-cases. All parameter types are supported, including circular objects, functions, etc. There are also a number of shortcut methods to memoize for unique use-cases.
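
A small usage sketch; moize is called as a function, and the option shown (maxSize) is from memory, so treat it as an assumption and check the moize docs:

    const moize = require('moize')

    const slowAdd = (a, b = 1) => a + b              // default parameters work without extra configuration
    const fastAdd = moize(slowAdd, { maxSize: 5 })   // cache the 5 most recent argument combinations

    fastAdd(1, 2)   // computed
    fastAdd(1, 2)   // returned from cache
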
memoization performance cache expire lru memoize optimize promise ttl

StorageLRU is an LRU implementation that can be used with local storage, or other storage mechanisms that support a similar interface. Note: this library is written in CommonJS style. To use it in the browser, please use tools like Browserify or Webpack.
cache localstorage lru

Useful when you need to cache something and limit memory usage. Inspired by the hashlru algorithm, but it uses a Map instead, so keys can be of any type, not just strings, and values can be undefined.
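
The hashlru approach keeps two fixed-size stores and swaps them instead of tracking per-item recency. A sketch of that idea using Map, as the description suggests (illustration only, not this package's source):

    function createCache (max) {
      let size = 0
      let current = new Map()     // recently written/read entries
      let previous = new Map()    // the prior generation; dropped wholesale when current fills up

      function write (key, value) {
        current.set(key, value)
        if (++size >= max) {
          size = 0
          previous = current       // everything in the old generation becomes evictable
          current = new Map()
        }
      }

      return {
        set: write,
        get (key) {
          if (current.has(key)) return current.get(key)
          if (previous.has(key)) {
            const value = previous.get(key)
            write(key, value)      // promote the entry to the current generation
            return value
          }
          return undefined
        },
        has: (key) => current.has(key) || previous.has(key)
      }
    }
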
lru quick cache caching least recently used fast map hash buffer

Cache your async lookups and don't fetch the same thing more often than necessary. Let's say you have to look up stat info for paths, but you're OK with only looking it up once every 10 minutes (since it doesn't change that often), you want to limit your cache size to 1000 objects, and you never want two stat calls for the same file happening at the same time (since that's silly and unnecessary).
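
A sketch of that stat example, assuming the async-cache style of constructor (max, maxAge, and a load function); the option names are from memory, so double-check them against the README:

    var AsyncCache = require('async-cache')
    var fs = require('fs')

    var statCache = new AsyncCache({
      max: 1000,                   // keep at most 1000 stat results
      maxAge: 1000 * 60 * 10,      // consider them fresh for 10 minutes
      load: function (path, cb) {  // called once per key; concurrent gets for the same path share this call
        fs.stat(path, cb)
      }
    })

    statCache.get('/some/path', function (err, stat) {
      // the first call hits fs.stat; later calls within 10 minutes come from the cache
    })
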
async cache lru

Memoize the given function fn, using async-lru, a simple async LRU cache supporting O(1) set, get, and eviction of old keys. The function must be a Node.js-style function, where the last argument is a callback.
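
A usage sketch of wrapping a Node.js-style function; the package name and the option shown are assumptions:

    const memo = require('memo-async-lru')   // assumed package name
    const fs = require('fs')

    // fs.readFile is Node.js-style: its last argument is a callback(err, result)
    const readFile = memo(fs.readFile, { max: 100 })

    readFile('/etc/hosts', (err, buf) => { /* the result is cached for subsequent calls */ })
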
memoize nodejs browser browserify lru lru-cache callback memo node.js async-lru

Since values are fetched asynchronously, the get method takes a callback rather than returning the value synchronously. While there is a set(key, value) method to manually seed the cache, typically you'll just call get and let the load function fetch the key for you.
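
A sketch of the load/get pattern described, assuming an async-lru-style constructor that takes a load function; exact option names are assumptions, and fetchFromServer is a hypothetical fetcher:

    const AsyncLRU = require('async-lru')

    const lru = new AsyncLRU({
      max: 500,
      load: (key, cb) => {
        fetchFromServer(key, cb)             // fetch the value for a missing key, e.g. over the network
      }
    })

    lru.get('some-key', (err, value) => {
      // value was either cached or fetched via load()
    })

    lru.set('other-key', 'seed value')       // optional: manually seed the cache
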
async lru lru-cache nodejs browser least-recently-used cache async-cache async-lru

This caches the results of store.get() calls using lru-cache. See the lru-cache docs for the full list of configuration options. MIT. Copyright (c) Feross Aboukhadijeh.
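
A sketch of wrapping a chunk store, assuming a cache-chunk-store-style constructor whose second argument is passed through to lru-cache; the names here are from memory:

    const CacheChunkStore = require('cache-chunk-store')
    const FSChunkStore = require('fs-chunk-store')

    // Wrap any abstract-chunk-store implementation; get() results are kept in an LRU cache.
    const store = new CacheChunkStore(new FSChunkStore(1024), { max: 100 })  // { max } is an lru-cache option

    store.put(0, Buffer.alloc(1024), (err) => {
      store.get(0, (err, chunk) => {
        // repeated get(0) calls are now served from memory
      })
    })
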
abstract-chunk-store chunk lru least-recently-used cache store

A very simple and extremely fast LRU cache for Node.js.
cache lru simple fast

Least Recently Used cache for client or server. Lodash provides a memoize function with a cache that can be swapped out as long as it implements the right interface. See the lodash docs for more on memoize.
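
The interface lodash expects is the Map method interface (clear, delete, get, has, set), so any cache exposing those methods, LRU or otherwise, can be dropped in. A minimal sketch using a size-bounded Map (illustration only, not a specific package):

    const _ = require('lodash')

    // Any constructor whose instances implement clear/delete/get/has/set will do.
    class BoundedMapCache extends Map {
      set (key, value) {
        super.set(key, value)
        if (this.size > 100) this.delete(this.keys().next().value)  // drop the oldest entry
        return this
      }
    }

    _.memoize.Cache = BoundedMapCache
    const square = _.memoize(n => n * n)   // memoized results now live in the bounded cache
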
lru cache tiny client server least recently used

LookupBy is a thread-safe lookup table cache for ActiveRecord that reduces normalization pains. Please create Issues to submit bug reports and feature requests. However, I ask that you kindly review these bug reporting guidelines first.
lru lookup caching activerecord rails

Distributed cache using Redis as the cache store and an in-memory LRU for fast access to already-loaded data. createRedisClient must be a function that creates and returns a node-redis instance compatible with v0.8. This way, your app retains control over instantiating the Redis client and handling its errors.
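
A sketch of the createRedisClient factory the description calls for; the factory itself uses the node-redis createClient API, while the cache constructor at the end is hypothetical since the description doesn't name it:

    var redis = require('redis')

    // The library asks the app to supply the client, so the app decides
    // connection settings and owns error handling.
    function createRedisClient () {
      var client = redis.createClient(6379, '127.0.0.1')
      client.on('error', function (err) {
        console.error('redis error:', err)
      })
      return client
    }

    // Hypothetical wiring; the actual constructor and option names depend on the library:
    // var cache = createCache({ createRedisClient: createRedisClient, lruMaxItems: 1000 })
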
distributed cache redis lru memory

Cluster-aware lru-cache: a cache object that deletes the least-recently-used items. If you put more stuff in it, then items will fall out.
lru-cache lru cache cluster

Resilient and performant in-memory cache for Node.js. Unless you're able to cache resources forever, use maxAge together with staleWhileRevalidate to get fault-tolerant, zero-latency cache refreshes.
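
A sketch of how maxAge and staleWhileRevalidate combine; only those two option names come from the description, and everything else (the require, maxSize, revalidate, fetchFromOrigin) is an assumption:

    const Cache = require('stale-lru-cache')          // assumed package name
    const cache = new Cache({
      maxSize: 100,
      maxAge: 600,                  // entries are fresh for 10 minutes
      staleWhileRevalidate: 86400,  // after that, keep serving the stale value for up to a day...
      revalidate: (key, cb) => {    // ...while this refreshes it in the background
        fetchFromOrigin(key, cb)    // hypothetical fetcher; on error the stale value keeps being served
      }
    })

    cache.get('https://example.com/resource')  // zero-latency even when the entry is stale
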
lru cache stale-while-revalidate max-age cache-control

The implementation is inspired by node-lru-cache by Isaac Schlueter. The motivation for this project is to provide an Object.create fallback so it works on IE8. Alternatively, download the tarball.
cache lru

A caching library for Python.
caching memoization lru mru lfu fifo lifo rr python3