[LeetCode]LRU Cache
2014-06-30 01:24
Design and implement a data structure for a Least Recently Used (LRU) cache. It should support the following operations: get and set.
get(key) - Get the value (will always be positive) of the key if the key exists in the cache, otherwise return -1.
set(key, value) - Set or insert the value if the key is not already present. When the cache reaches its capacity, it should invalidate the least recently used item before inserting a new item.
Analysis:
First, for the cache to have O(1) lookup, a hashmap is needed to map each key to its entry. The harder part of LRU is handling cache entry replacement in O(1) as well.
Simply put, if the cache entries are stored in a linked list, we only need to maintain the invariant that head-to-tail order is newest-to-oldest. Whenever a node is accessed, move it to the head. When the cache is full, delete the tail node and insert the new node at the head.
So two data structures are needed:
1. a hashmap, mapping each key to the position of its entry
2. a list, recording the entries from most to least recently used. It does not have to be a list; a vector would also work, but list already provides the head/tail operations, while with a vector you would have to write them yourself.
With a singly linked list, unlinking an arbitrary node is O(n), because you must walk from the head to find its predecessor.
With a doubly linked list it is O(1).
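The complexity point above can be made concrete with a deliberately naive sketch (the class name and fields here are hypothetical, for illustration only): a HashMap for lookup plus an ArrayDeque for recency order. The deque's remove(Object) is a linear scan, which is exactly the cost the doubly linked list design eliminates.

```java
import java.util.ArrayDeque;
import java.util.Deque;
import java.util.HashMap;
import java.util.Map;

// Naive LRU sketch: correct behavior, but get/set are O(n),
// because Deque.remove(Object) scans the whole deque.
class NaiveLRU {
    private final int capacity;
    private final Map<Integer, Integer> map = new HashMap<>();
    private final Deque<Integer> recency = new ArrayDeque<>(); // front = newest

    NaiveLRU(int capacity) { this.capacity = capacity; }

    int get(int key) {
        if (!map.containsKey(key)) return -1;
        recency.remove(key);      // O(n) scan: the bottleneck
        recency.addFirst(key);    // mark as most recently used
        return map.get(key);
    }

    void set(int key, int value) {
        if (map.containsKey(key)) {
            recency.remove(key);  // O(n) again
        } else if (map.size() == capacity) {
            map.remove(recency.removeLast()); // evict least recently used
        }
        recency.addFirst(key);
        map.put(key, value);
    }
}
```

The hand-written doubly linked list below keeps the same eviction behavior while making both the unlink and the relink O(1).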
Two implementations follow: one in Java with a hand-written doubly linked list, and one in C++ using std::list (itself a doubly linked list).
Java (hand-written doubly linked list)
import java.util.HashMap;

public class LRUCache {
    static class CacheEntry {
        int key;
        int value;
        CacheEntry pre;
        CacheEntry next;

        CacheEntry(int k, int v) {
            this.key = k;
            this.value = v;
        }
    }

    private final int capacity;
    private final HashMap<Integer, CacheEntry> map;
    private final CacheEntry head, tail; // sentinels of the doubly linked list

    public LRUCache(int capacity) {
        this.capacity = capacity;
        this.map = new HashMap<>(capacity);
        head = new CacheEntry(-1, -1);
        tail = new CacheEntry(-1, -1);
        head.next = tail;
        tail.pre = head;
    }

    public int get(int key) {
        // If the key exists, refresh its recency and return the value.
        CacheEntry entry = map.get(key);
        if (entry == null) return -1;
        moveToHead(entry);
        return entry.value;
    }

    public void set(int key, int value) {
        CacheEntry entry = map.get(key);
        if (entry != null) { // key present: just update value and refresh recency
            entry.value = value;
            moveToHead(entry);
            return;
        }
        entry = new CacheEntry(key, value);
        moveToHead(entry);
        map.put(key, entry);
        if (map.size() > capacity) { // over capacity: evict the tail entry
            map.remove(removeEnd());
        }
    }

    // Unlink the entry just before the tail sentinel and return its key.
    private int removeEnd() {
        CacheEntry entry = tail.pre;
        entry.pre.next = tail;
        tail.pre = entry.pre;
        entry.pre = null;
        entry.next = null;
        return entry.key;
    }

    // Unlink the entry (if already linked) and relink it right after the head sentinel.
    private void moveToHead(CacheEntry entry) {
        if (entry.next != null && entry.pre != null) {
            entry.pre.next = entry.next;
            entry.next.pre = entry.pre;
        }
        entry.pre = head;
        entry.next = head.next;
        head.next.pre = entry;
        head.next = entry;
    }
}
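As an aside, the JDK already ships this exact pattern: java.util.LinkedHashMap threads a doubly linked list through its hash entries. Constructed with accessOrder = true and with removeEldestEntry overridden, it behaves as an LRU cache in a few lines. The class name and helper method below are illustrative, not part of the original solution.

```java
import java.util.LinkedHashMap;
import java.util.Map;

// LRU cache built on LinkedHashMap: accessOrder = true means get() moves
// an entry to the most-recently-used end; removeEldestEntry evicts the
// least recently used entry automatically whenever put() exceeds capacity.
class LinkedHashMapLRU extends LinkedHashMap<Integer, Integer> {
    private final int capacity;

    LinkedHashMapLRU(int capacity) {
        super(capacity, 0.75f, true); // third argument: access order, not insertion order
        this.capacity = capacity;
    }

    @Override
    protected boolean removeEldestEntry(Map.Entry<Integer, Integer> eldest) {
        return size() > capacity; // evict when over capacity
    }

    int getOrMinusOne(int key) {
        return getOrDefault(key, -1); // getOrDefault also counts as an access
    }
}
```

This is handy for production code; for the interview, the hand-written list above shows you understand what LinkedHashMap does internally.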
C++ (std::list, a doubly linked list)
#include <list>
#include <unordered_map>
using namespace std;

class LRUCache {
public:
    struct CacheEntry {
        int key;
        int value;
        CacheEntry(int k, int v) : key(k), value(v) {}
    };

    LRUCache(int capacity) {
        m_capacity = capacity;
    }

    int get(int key) {
        if (m_map.find(key) == m_map.end())
            return -1;
        MoveToHead(key);
        return m_map[key]->value;
    }

    void set(int key, int value) {
        if (m_map.find(key) == m_map.end()) {
            CacheEntry newItem(key, value);
            if ((int)m_LRU_cache.size() >= m_capacity) { // full: evict from tail
                m_map.erase(m_LRU_cache.back().key);
                m_LRU_cache.pop_back();
            }
            m_LRU_cache.push_front(newItem);             // insert at head
            m_map[key] = m_LRU_cache.begin();
            return;
        }
        m_map[key]->value = value;
        MoveToHead(key);
    }

private:
    int m_capacity;
    list<CacheEntry> m_LRU_cache;                        // head = most recently used
    unordered_map<int, list<CacheEntry>::iterator> m_map; // key -> node iterator

    // Move the node for `key` to the head: copy its data, erase the old
    // position (O(1) given the iterator), and reinsert at the front.
    void MoveToHead(int key) {
        auto updateEntry = *m_map[key];
        m_LRU_cache.erase(m_map[key]);
        m_LRU_cache.push_front(updateEntry);
        m_map[key] = m_LRU_cache.begin();
    }
};