Implement An LRU Cache - The LRU Cache Eviction Policy ("LRU Cache" on LeetCode)
Free 5-Day Mini-Course: https://backtobackswe.com
Try Our Full Platform: https://backtobackswe.com/pricing
📹 Intuitive Video Explanations
🏃 Run Code As You Learn
💾 Save Progress
❓New Unseen Questions
🔎 Get All Solutions
Subscribe To Live Tech Offers: https://offerfeed.io
Join Our Coaching Service: https://backtobackswe.com/coaching
Question: Implement an LRU Cache.
It is a cache replacement policy that says to evict the least recently used item in the cache.
Every time an item is used, it moves to the "front" of the cache, since it was just used and now has priority.
Items that are not used drift toward the "end" of the cache and eventually get evicted: they are the least recently used items, and they never got bumped forward by being accessed.
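The bump-to-front/evict-from-end behavior described above can be sketched with Python's collections.OrderedDict, which keeps insertion order and supports O(1) reordering (a minimal illustration of the policy, not the implementation from the video):

```python
from collections import OrderedDict

cache = OrderedDict()
cache["a"] = 1
cache["b"] = 2
cache["c"] = 3

# Using "a" bumps it to the most-recently-used end of the ordering.
cache.move_to_end("a")

# Evicting pops from the least-recently-used end.
lru_key, lru_value = cache.popitem(last=False)
print(lru_key)  # "b" -- it was never used after insertion
```

Here "a" survives eviction because it was touched, while "b" did not get a bump and is evicted first.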
Additional Cache Eviction Policies: https://en.wikipedia.org/wiki/Cache_replacement_policies
What is a Cache?
A cache is a hardware or software component that stores data so that future requests for that data can be served faster; the data stored in a cache might be the result of an earlier computation or a copy of data stored elsewhere.
What Is An LRU Cache?
So an LRU cache is a store of items that lets future access to those items be serviced quickly, using the LRU policy for eviction.
The Constraints/Operations
Lookup of cache items must be O(1)
Addition to the cache must be O(1)
The cache should evict items using the LRU policy
The Approach
There are many ways to do this but the most common approach is to use 2 critical structures: a doubly linked list and a hashtable.
Our Structures
Doubly Linked List: This will hold the items that our cache has. We will have n items in the cache.
This structure satisfies the constraint for fast addition since any doubly linked list item can be added or removed in O(1) time with proper references.
Hashtable: The hashtable gives us O(1) access to any item in the doubly linked list, avoiding an O(n) search for items; the LRU entry always sits at the tail of the doubly linked list.
This is a very common pattern: we use a structure with special insertion or removal properties (a BST, linked list, etc.) and back it up with a hashtable so we do not re-traverse the structure every time we need to find an element.
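The two-structure design above can be sketched in Python as follows (a minimal sketch of the standard approach; names like Node and _add_to_front are illustrative, not from the video):

```python
class Node:
    """Doubly linked list node holding one cache entry."""
    def __init__(self, key, value):
        self.key = key
        self.value = value
        self.prev = None
        self.next = None

class LRUCache:
    """Hashtable for O(1) lookup + doubly linked list for O(1) reordering.
    Most recently used items sit just after the head; the LRU item sits
    just before the tail."""

    def __init__(self, capacity):
        self.capacity = capacity
        self.map = {}  # key -> Node
        # Sentinel head/tail nodes avoid edge-case checks on empty lists.
        self.head = Node(None, None)
        self.tail = Node(None, None)
        self.head.next = self.tail
        self.tail.prev = self.head

    def _remove(self, node):
        """Unlink a node from the list in O(1)."""
        node.prev.next = node.next
        node.next.prev = node.prev

    def _add_to_front(self, node):
        """Link a node right after the head sentinel in O(1)."""
        node.next = self.head.next
        node.prev = self.head
        self.head.next.prev = node
        self.head.next = node

    def get(self, key):
        if key not in self.map:
            return -1
        node = self.map[key]
        self._remove(node)           # bump to front: it was just used
        self._add_to_front(node)
        return node.value

    def put(self, key, value):
        if key in self.map:
            self._remove(self.map[key])
        node = Node(key, value)
        self.map[key] = node
        self._add_to_front(node)
        if len(self.map) > self.capacity:
            lru = self.tail.prev     # least recently used is at the tail
            self._remove(lru)
            del self.map[lru.key]
```

For example, with capacity 2, after put(1, 1), put(2, 2), get(1), put(3, 3), key 2 is evicted (it is the least recently used) while key 1 survives because get(1) bumped it to the front.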
Complexities
Time
Both the get and put methods are O(1) in time complexity.
Space
We use O(n) space since we store up to n items, where n is the capacity of the cache.
++++++++++++++++++++++++++++++++++++++++++++++++++
HackerRank: https://www.youtube.com/channel/UCOf7UPMHBjAavgD0Qw5q5ww
Tushar Roy: https://www.youtube.com/user/tusharroy2525
GeeksForGeeks: https://www.youtube.com/channel/UC0RhatS1pyxInC00YKjjBqQ
Jarvis Johnson: https://www.youtube.com/user/VSympathyV
Success In Tech: https://www.youtube.com/channel/UC-vYrOAmtrx9sBzJAf3x_xw
++++++++++++++++++++++++++++++++++++++++++++++++++
This question is number 13.3 in "Elements of Programming Interviews" by Adnan Aziz, Tsung-Hsien Lee, and Amit Prakash.
Video: Implement An LRU Cache - The LRU Cache Eviction Policy ("LRU Cache" on LeetCode), from the channel Back To Back SWE