A Generic Swift Queue and ARC

In my current project, I set off to implement a generic queue in Swift, and settled on the use of a linked list for doing so, where my queue is a collection of singly linked nodes. As it turned out, this ended up being fertile ground for illustrating Swift’s use of Automatic Reference Counting (ARC) for memory management, something I take on in this post.

For the visual learners, here is an illustration to assist the discussion, courtesy of Paper by 53 and my iPad Pro (with Apple Pencil, of course):

[Illustration: a three-row diagram of the queue as a chain of linked nodes, annotated with orange reference counts]

The top row represents three people waiting in line at Walmart customer service (who hasn’t been there?). The second row shows the first person has been helped (dequeued), leaving two. And the third row shows two new customers have arrived (enqueued), eager to make their returns, before anyone else has been helped. The orange numbers correspond to reference counts (more on that in a moment).
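
To make the diagram concrete, here is roughly how that sequence of events plays out with the Queue type defined later in this post (the customer names are made up for illustration):

let line = Queue<String>()   // the Walmart customer service line

line.enqueue("Ann")          // top row: three customers waiting
line.enqueue("Bob")
line.enqueue("Cal")

let helped = line.dequeue()  // second row: "Ann" is helped and leaves
print(helped)                // Optional("Ann")

line.enqueue("Dee")          // third row: two new customers arrive
line.enqueue("Eve")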

Queue Node

So to begin, let’s take a look at the building block of the queue, the node (represented by the circles in the diagram):

/**
 The component of a queue that provides the queue's content and sequence.
 */
class QueueNode<T> {

  /// The data element stored in the queue.
  var element: T

  /// The node that is "next in line", or `nil` if there is no subsequent node.
  var next: QueueNode?

  /**
   Wraps the given `element` into a node, leaving `next` set to `nil`.
   */
  init(element: T) {
    self.element = element
  }
}

Because the node is part of a linked list, it needs to be a class: linked lists rely on reference pointers, and only Swift classes, as reference types, can be “pointed” to. (In contrast, assigning a structure or enumeration stores a copy of the value in the receiving variable, property, or parameter.)
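
As a quick standalone illustration of that difference (this sketch is not part of the queue code):

struct ValueNode { var element: Int }

class ReferenceNode {
  var element: Int
  init(element: Int) { self.element = element }
}

var v1 = ValueNode(element: 1)
var v2 = v1               // assignment copies the value
v2.element = 99           // v1.element is still 1

let r1 = ReferenceNode(element: 1)
let r2 = r1               // assignment copies the reference, not the object
r2.element = 99           // r1.element is now 99 as well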

Queue

Next, let’s look at the class that takes the QueueNode building block and uses it to form a fully functional Queue data structure:

/**
 A generic collection that provides standard enqueueing and dequeuing functionality.
 As elements are enqueued, they are conceptually added to "the back of the line;" 
 as they are dequeued, they are removed from "the front of the line." 
 */
class Queue<T> {

  /// The first enqueued node or that following the last dequeued node.
  private var front: QueueNode<T>?

  /// The last enqueued node.
  private var back: QueueNode<T>?

  /// The number of elements in the queue.
  private(set) var count = 0
  
  /// True if there are no elements in the queue, false otherwise.
  var isEmpty: Bool {
    return count == 0
  }
 

I want to interrupt here to point out that the class properties front and back defined above are effectively pointers, as is the local variable node in the enqueue() method below (refer back to the diagram).

  /**
   Adds the given `element` to the back of the queue.
    - complexity: O(1)
   */
  func enqueue(element: T) {
    
    let node = QueueNode<T>(element: element)
    
    // First node
    if front == nil {
      front = node
      back = node
    }
      
    // Subsequent nodes
    else {
      back!.next = node
      back = node
    }

    count += 1
  }
 

It is at the let node = QueueNode<T>(element: element) line that this class allocates memory for, creates, initializes, and assigns a reference to a new QueueNode object (all done behind the scenes by Swift). In accordance with ARC, as long as this object has a property, variable, or parameter pointing to it, it will remain allocated. The local node variable is short-lived, though, so the only potential direct references held by Queue are front and back. It is the nodes themselves that maintain the next references keeping the chain of nodes alive (again, see the diagram).
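
Here is the same ARC principle in a standalone sketch (the Tracked class exists only for this illustration):

class Tracked {
  let id: Int
  init(id: Int) { self.id = id }
  deinit { print("Tracked \(id) deallocated") }
}

var a: Tracked? = Tracked(id: 1)   // reference count: 1
var b: Tracked? = a                // reference count: 2
a = nil                            // count drops to 1; the object stays alive
b = nil                            // count drops to 0; prints "Tracked 1 deallocated"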

  /**
   Removes and returns the element at the front of the queue.
   - complexity: O(1)
   */
  func dequeue() -> T? {
     
    // Queue is empty
    if front == nil { return nil }
     
    // Remove first node/element
    let element = front!.element
    front = front!.next
     
    // Deallocate dequeued back node
    if front == nil {
      back = nil
    }
 
    count -= 1
     
    return element
  }
 

Consider the front = front!.next assignment in dequeue(), where front is reassigned to the object pointed to by its own next property. This leaves the former front node with nothing referring to it, triggering implicit and immediate deallocation; there’s no need to release it explicitly, such as with a nil assignment.

In contrast, if we have dequeued the last node, we do need to release it explicitly by reassigning back to nil; otherwise back would retain that node until the queue itself is deallocated (at which time back is automatically set to nil).
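
As a quick check (a sketch that assumes the deinit logging added to QueueNode in the next section), dequeuing the only element releases its node on the spot:

let q = Queue<String>()
q.enqueue("Only customer")   // front and back both reference the new node
_ = q.dequeue()              // front moves on to nil, then back is set to nil;
                             // with nothing left referencing the node, it is
                             // deallocated right here and "deinit" is printed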

Automatic Queue Deallocation

In the context of queue deallocation, such as when the reference count for a Queue object drops to zero, front is set to nil. When that happens, the associated node’s reference count also drops to zero, triggering its deallocation. Deallocating that node sets its next property to nil, which decrements the adjacent node’s reference count to zero, triggering that node’s deallocation, and so on. This chain reaction continues until the last node in the linked list is reached.

As discussed in the context of dequeue(), this last node is also pointed to by the queue’s back property, which prevents its reference count from dropping to zero and thereby keeps the object alive. It is not until back is also set to nil that the reference count drops to zero and the last node is deallocated.

The process of deallocating the nodes between front and back happens automatically courtesy of ARC memory management built into Swift and Objective-C. You can see it in action by using this script and addition to QueueNode:

class QueueNode<T> { // continued
  deinit {
    print("\(#function)")
  }
}

func testQueue() {
  let n = 10
  let queue = Queue<Int>()
  for i in 1...n {
    queue.enqueue(i)
  }
}
testQueue()
print("done")

Once testQueue() exits, the Queue object its queue variable references is automatically deallocated as described, which in this case results in 10 debugger console output lines that read “deinit” before “done” appears.

Large Queues

Now you might find that for large queues, trouble ensues in the form of an EXC_BAD_ACCESS runtime error, presumably because the chain-reaction deallocation described above recurses once per node and can eventually exhaust the stack. I found that taking the reins on deallocating the individual QueueNodes ensures queues of any size deallocate flawlessly.
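
To see the failure mode for yourself, something like the following stress test should do it before the fix below is in place (the queue size at which the crash occurs will vary):

func stressTest() {
  let queue = Queue<Int>()
  for i in 1...1_000_000 {
    queue.enqueue(i)
  }
}   // queue deallocates here; with a large enough count, expect EXC_BAD_ACCESS
stressTest()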

Returning to our Queue definition, this is a simple matter of adding a deinit method that calls upon a removeAll() method familiar to users of Swift’s collection types:

  /**
   Explicitly deinitializes all enqueued nodes.
   - complexity: O(n)
   */
  deinit {
    removeAll()
  }

  /**
   Removes all elements from the queue.
   - complexity: O(n)
   */
  func removeAll() {
    while front != nil {
      front = front!.next
    }
    back = nil
    count = 0
  }
}

Note we haven’t completely given up on ARC. Just as in dequeue(), the same implicit deallocation occurs here: as soon as the front = front!.next assignment is made, the node that was at the front is deallocated. It is that one-node-at-a-time reassignment of front that short-circuits the automatic chain-reaction deallocation discussed above. Were we to replace the while loop with a simple front = nil, we would be back to the default behavior we had without a deinit method.

And yes, for you lines-of-code hounds out there, I could have written removeAll() as a simple one-liner with the loop while dequeue() != nil { }. I contend, though, that the given approach is superior for performance reasons, however negligible. There’s a bit more overhead in calling dequeue() 100 million times, let’s say, than in dereferencing that many nodes directly (in a single test, this amounted to 45 versus 65 seconds, roughly a 45% increase).
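
If you want to reproduce that comparison, here is a rough timing sketch (it assumes the Queue defined above; absolute numbers will vary by machine and build configuration):

import Foundation

let n = 1_000_000                     // scale up to taste
let queue = Queue<Int>()
for i in 1...n {
  queue.enqueue(i)
}

let start = CFAbsoluteTimeGetCurrent()
queue.removeAll()                     // direct node dereferencing
// while queue.dequeue() != nil { }   // alternative: repeated dequeue() calls
print("teardown took \(CFAbsoluteTimeGetCurrent() - start) seconds")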

More to Come

That does it for now. In my next post, I conform Queue to SequenceType so it can be iterated with the likes of:

for element in queue {
  // do something with element
}