How to Implement an LRU Cache in Swift
Learn how to implement an LRU cache in Swift in this tutorial, along with a comparison of linked lists versus arrays.
Introduction
An LRU (Least Recently Used) cache is similar to a dictionary: it stores data associated with a key. The difference between a dictionary and a cache is that the latter has a limited capacity. Every time we reach the capacity, the cache deletes the least recently used element.
In this article, we are going to see how to implement an LRU cache in Swift.
Happy reading!
Getting Started
First of all, we must understand which data structures we should use to implement our LRU cache. There are different ways to implement it; in this version, we will use:
- Doubly Linked List: This is the core of our implementation. We need this list to store the elements of our cache. We don't use an Array because it would be slower: the LRU policy moves recently used elements to the head very often, and moving an element to the head of an Array (index `0`) forces a shift to the right of all the other elements (see the sketch after this list).
- `Dictionary<Key, ListNode>`: The problem with a doubly linked list is that its lookup complexity is O(n). We can remove this bottleneck with a dictionary, which has a lookup complexity of O(1). We'll use this dictionary to store the nodes of our list.
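To make the cost difference concrete, here is a minimal sketch (not from the original article) contrasting the two moves:

```swift
// Moving an element to the front of an Array costs O(n),
// because the other elements must be shifted:
var array = [3, 2, 1]
let value = array.remove(at: 2)  // O(n): elements after index 2 shift left
array.insert(value, at: 0)       // O(n): every element shifts right
// array is now [1, 3, 2]

// In a doubly linked list, the same move is a constant number of
// pointer updates, O(1), once we already hold a reference to the node.
```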
In the next section, we are going to see how to implement the doubly linked list. If you already know it, feel free to jump to the section LRU Cache.
Doubly Linked List
For this article, we don’t need a complete doubly linked list implementation. For this reason, we’ll implement only the methods used in the cache.
The first step is to create a new class `DoublyLinkedList`, which accepts a generic value `T` to store inside the nodes:
```swift
final class DoublyLinkedList<T> {
}
```
Then, we must create a class for the nodes:
```swift
final class DoublyLinkedList<T> {

    final class Node<T> {
        var payload: T
        var previous: Node<T>?
        var next: Node<T>?

        init(payload: T) {
            self.payload = payload
        }
    }
}
```
In this implementation, we use a nested `Node` class. If you use a Swift version older than 3.1, you must create this class outside `DoublyLinkedList`: nested classes with generic values are supported only starting from Swift 3.1.
Then, we must keep track of how many elements are stored in the list:
```swift
private(set) var count: Int = 0
```
The operations on a linked list can sometimes be complex to implement. For this reason, we store references to the first and last elements to make our life easier:
```swift
private var head: Node<T>?
private var tail: Node<T>?
```
Now, we can start implementing the list methods:
addHead
We need a method to add a new element to the list. We add it at the head to be compliant with the LRU policy: a new element is the most recently used one:
```swift
func addHead(_ payload: T) -> Node<T> {
    let node = Node(payload: payload)

    // In both branches below, the new node becomes the head
    // and the count grows by one.
    defer {
        head = node
        count += 1
    }

    guard let head = head else {
        // Empty list: the new node is also the tail.
        tail = node
        return node
    }

    // Link the new node in front of the current head.
    head.previous = node
    node.previous = nil
    node.next = head

    return node
}
```
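As a quick sanity check, here is a hypothetical usage sketch of the list as defined so far:

```swift
let list = DoublyLinkedList<Int>()
_ = list.addHead(1)  // list: 1
_ = list.addHead(2)  // list: 2 <-> 1
print(list.count)    // prints "2"
```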
moveToHead
The concept of an LRU cache is to keep the most recently used elements at the beginning of the list. For this reason, we need a method to move a node to the head:
```swift
func moveToHead(_ node: Node<T>) {
    guard node !== head else { return }

    // Unlink the node from its current position.
    let previous = node.previous
    let next = node.next
    previous?.next = next
    next?.previous = previous

    // Link it in front of the current head, keeping the old head's
    // back-pointer consistent.
    node.next = head
    node.previous = nil
    head?.previous = node

    if node === tail {
        tail = previous
    }

    self.head = node
}
```
removeLast
When our cache is full, we need a method to remove the last element, which is the least recently used one:
```swift
func removeLast() -> Node<T>? {
    guard let tail = self.tail else { return nil }

    let previous = tail.previous
    previous?.next = nil
    self.tail = previous

    if count == 1 {
        head = nil
    }

    count -= 1

    // 1
    return tail
}
```
1. The value of this `tail` is not the same as `self.tail`. It's the value of the old `tail`, which comes from the optional binding in the `guard` at the beginning of this method.
Finally, we can add a typealias for our `Node` type to use in the cache implementation:
```swift
typealias DoublyLinkedListNode<T> = DoublyLinkedList<T>.Node<T>
```
The final version of our list implementation should look like this:
```swift
typealias DoublyLinkedListNode<T> = DoublyLinkedList<T>.Node<T>

final class DoublyLinkedList<T> {

    final class Node<T> {
        var payload: T
        var previous: Node<T>?
        var next: Node<T>?

        init(payload: T) {
            self.payload = payload
        }
    }

    private(set) var count: Int = 0

    private var head: Node<T>?
    private var tail: Node<T>?

    func addHead(_ payload: T) -> Node<T> {
        let node = Node(payload: payload)

        defer {
            head = node
            count += 1
        }

        guard let head = head else {
            tail = node
            return node
        }

        head.previous = node
        node.previous = nil
        node.next = head

        return node
    }

    func moveToHead(_ node: Node<T>) {
        guard node !== head else { return }

        let previous = node.previous
        let next = node.next
        previous?.next = next
        next?.previous = previous

        node.next = head
        node.previous = nil
        head?.previous = node

        if node === tail {
            tail = previous
        }

        self.head = node
    }

    func removeLast() -> Node<T>? {
        guard let tail = self.tail else { return nil }

        let previous = tail.previous
        previous?.next = nil
        self.tail = previous

        if count == 1 {
            head = nil
        }

        count -= 1

        return tail
    }
}
```
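Before moving on, we can sanity-check the list with a short hypothetical example:

```swift
let list = DoublyLinkedList<String>()
let a = list.addHead("a")          // list: a
_ = list.addHead("b")              // list: b <-> a

list.moveToHead(a)                 // list: a <-> b

let removed = list.removeLast()    // removes "b", the last element
print(removed?.payload ?? "none")  // prints "b"
print(list.count)                  // prints "1"
```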
LRU Cache
It's time to implement our cache. We can start by creating a new `CacheLRU` class:

```swift
final class CacheLRU<Key: Hashable, Value> {
}
```
The generic type `Key` must be `Hashable` because we'll use it as the key of the dictionary that stores our list nodes.
A cache stores data associated with keys, like a dictionary. Unfortunately, our doubly linked list accepts only a value payload, not a key. To solve this problem, we can create a struct that wraps the value and its key. This way, our list nodes will store a `CachePayload` object that contains both the key and the value:
```swift
final class CacheLRU<Key: Hashable, Value> {

    private struct CachePayload {
        let key: Key
        let value: Value
    }
}
```
Then, we should add the two data structures—a doubly linked list and a dictionary:
```swift
private let list = DoublyLinkedList<CachePayload>()
private var nodesDict = [Key: DoublyLinkedListNode<CachePayload>]()
```
As we saw in the introduction, an LRU cache has a limited capacity. We can inject this capacity in the `init` method and store it in a private property:
```swift
private let capacity: Int

init(capacity: Int) {
    self.capacity = max(0, capacity)
}
```
We use the method `max` to avoid invalid `capacity` values less than zero.
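As a consequence, a non-positive capacity yields a cache that never stores anything. Once the `setValue` and `getValue` methods below are in place, this hypothetical snippet shows the effect:

```swift
let zeroCache = CacheLRU<Int, String>(capacity: -1) // capacity clamped to 0
zeroCache.setValue("One", for: 1) // evicted immediately: count > capacity
zeroCache.getValue(for: 1)        // nil
```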
Now, we can implement the two cache methods to `get` and `set` the elements:
setValue
With the method `setValue`, we can add or update an element for a specific key. The value is always moved to the beginning of the list, since it becomes the most recently used element:
```swift
func setValue(_ value: Value, for key: Key) {
    // 1
    let payload = CachePayload(key: key, value: value)

    // 2
    if let node = nodesDict[key] {
        node.payload = payload
        list.moveToHead(node)
    } else {
        let node = list.addHead(payload)
        nodesDict[key] = node
    }

    // 3
    if list.count > capacity {
        let nodeRemoved = list.removeLast()
        if let key = nodeRemoved?.payload.key {
            nodesDict[key] = nil
        }
    }
}
```
1. Create a payload object that wraps the `key` and `value` to be stored in the list.
2. If the list already stores an element for that specific key, we update its value and move it to the beginning of the list. Otherwise, we create a new node and add it as the head of the list.
3. If adding the new element exceeded the capacity of the cache, we remove the last element of the list and delete its key from the dictionary. (A short walkthrough of these steps follows below.)
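Here is a hypothetical walkthrough of the update path described in step 2:

```swift
let cache = CacheLRU<String, Int>(capacity: 2)

cache.setValue(1, for: "a")  // list: a(1)
cache.setValue(2, for: "b")  // list: b(2) <-> a(1)
cache.setValue(3, for: "a")  // "a" is updated and moved to the head: a(3) <-> b(2)

cache.getValue(for: "a")     // 3
```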
getValue
With the method `getValue`, we can retrieve an element for a specific key. Every time we retrieve an element, it is moved to the beginning of the list as the most recently used element:
```swift
func getValue(for key: Key) -> Value? {
    guard let node = nodesDict[key] else { return nil }

    list.moveToHead(node)

    return node.payload.value
}
```
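Note that a read refreshes recency and therefore changes which element gets evicted next. A hypothetical example:

```swift
let cache = CacheLRU<Int, String>(capacity: 2)

cache.setValue("One", for: 1)
cache.setValue("Two", for: 2)

cache.getValue(for: 1)          // "One": 1 becomes the most recently used
cache.setValue("Three", for: 3) // evicts 2 instead of 1

cache.getValue(for: 2)          // nil
cache.getValue(for: 1)          // "One"
```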
The final version of our cache implementation should look like this:
```swift
final class CacheLRU<Key: Hashable, Value> {

    private struct CachePayload {
        let key: Key
        let value: Value
    }

    private let capacity: Int
    private let list = DoublyLinkedList<CachePayload>()
    private var nodesDict = [Key: DoublyLinkedListNode<CachePayload>]()

    init(capacity: Int) {
        self.capacity = max(0, capacity)
    }

    func setValue(_ value: Value, for key: Key) {
        let payload = CachePayload(key: key, value: value)

        if let node = nodesDict[key] {
            node.payload = payload
            list.moveToHead(node)
        } else {
            let node = list.addHead(payload)
            nodesDict[key] = node
        }

        if list.count > capacity {
            let nodeRemoved = list.removeLast()
            if let key = nodeRemoved?.payload.key {
                nodesDict[key] = nil
            }
        }
    }

    func getValue(for key: Key) -> Value? {
        guard let node = nodesDict[key] else { return nil }

        list.moveToHead(node)

        return node.payload.value
    }
}
```
We can use this cache like this:
```swift
let cache = CacheLRU<Int, String>(capacity: 2)

cache.getValue(for: 5)  // nil

cache.setValue("One", for: 1)
cache.setValue("Eleven", for: 11)
cache.setValue("Twenty", for: 20)

cache.getValue(for: 1)  // nil. We exceeded the capacity with the previous `setValue`, and `1` was the last element.
cache.getValue(for: 11) // "Eleven"
```
You can find a Gist with the Cache LRU implementation here.
Conclusion
That's all for our LRU cache.
Nowadays, we have a lot of memory available for our apps. Despite this, we may still need a cache with a limited capacity to save memory space, for example when we have to cache space-consuming objects like images.
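As a sketch of that use case, here is a hypothetical image cache built on top of our `CacheLRU`; the synchronous URL-loading helper is illustrative only and not part of the original article:

```swift
import UIKit

// A hypothetical cache of decoded images, keyed by URL.
let imageCache = CacheLRU<URL, UIImage>(capacity: 50)

func cachedImage(for url: URL) -> UIImage? {
    // Return the cached image if present (this also refreshes its recency).
    if let image = imageCache.getValue(for: url) {
        return image
    }

    // Otherwise, load and decode the image, then store it in the cache.
    // Data(contentsOf:) is synchronous and used here only for brevity.
    guard let data = try? Data(contentsOf: url),
          let image = UIImage(data: data) else { return nil }

    imageCache.setValue(image, for: url)
    return image
}
```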
Update:
I've found out that Array is faster than linked lists. Since the pure version of an LRU cache uses a doubly linked list, I'll leave the current implementation. But keep in mind that with a Swift Array we would have a faster implementation.
You can watch an interesting video about arrays and linked lists here.
Published at DZone with permission of Marco Santarossa, DZone MVB. See the original article here.