

extension Queue: SequenceType

Previously, I presented a linked-list-based Queue class in the context of a discussion around Automatic Reference Counting (ARC). Here I extend that class to conform to the SequenceType protocol, and then add some bonus initializers that mirror those given for Swift’s collections.

In simplest terms, it is the SequenceType protocol that allows us to iterate over the elements in the Queue:

for element in queue {
  // do something with element
}

But that's just the beginning. Taking a look at the Apple documentation and this excellent post by Nate Cook on the Swift Collection Protocols, we find that because of this iterability, we gain significant functionality for free, a subset of which includes the likes of enumerate, map–filter–reduce, reverse, sort, and so on. Good stuff!
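
To give a flavor of what we get for free once the conformance below is in place, here's a quick sketch; it assumes a Queue<Int> containing the elements 3, 7, 2 and 9 (the same values used in the initializer examples later on):

for (index, element) in queue.enumerate() {
  print("\(index): \(element)")              // "0: 3", "1: 7", ...
}
let doubled  = queue.map { $0 * 2 }          // [6, 14, 4, 18]
let bigOnes  = queue.filter { $0 > 5 }       // [7, 9]
let total    = queue.reduce(0, combine: +)   // 21
let backward = queue.reverse()               // [9, 2, 7, 3]
let ordered  = queue.sort()                  // [2, 3, 7, 9]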

Now, a SequenceType makes no guarantees as to whether the sequence is consumed, so if we were interested only in iteration and didn’t care about that (or perhaps wanted it to be consumed), we could destructively iterate like so:

while let element = queue.dequeue() {
  // do something with element
}

However, our Queue is constructed in such a way that it need not be consumed on iteration.
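
Here's a small illustration of that point (assuming the conformance we're about to add): iterating the same queue twice, here via Array's sequence initializer, yields the same elements both times, and count is untouched.

let firstPass  = Array(queue)   // e.g. [3, 7, 2, 9]
let secondPass = Array(queue)   // the same elements again
// firstPass == secondPass, and queue.count is unchanged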

Generator

So let’s give Queue an iterator (generator in Swift vernacular) and declare its conformance to the SequenceType protocol. There’s not much to it, at least as far as Queue is concerned:

extension Queue: SequenceType {

  /**
   Returns a `SequenceType` generator for iterating over the queue.
   */
  func generate() -> QueueGenerator<T> {
    return QueueGenerator(queue: self)
  }
}

Beyond that, we need only define the QueueGenerator returned by generate(). It takes our Queue object and, with its front property and each node’s next property, non-destructively iterates over the elements in the queue:

/**
 The generator by which `Queue` conforms to `SequenceType`.
 */
struct QueueGenerator<T>: GeneratorType {
  
  /// The queue being iterated over.
  var queue: Queue<T>
  
  /// The current node.
  var node: QueueNode<T>?
  
  /**
   Initializes iteration at the front of the given `queue`.
   */
  init(queue: Queue<T>) {
    self.queue = queue
    node = queue.front
  }
  
  /**
   Returns the next element in the queue, or `nil` if the queue is empty or all nodes have been iterated.
   */
  mutating func next() -> T? {
    let element = node?.element
    node = node?.next
    return element
  }
}

Notice that like the previously defined dequeue() method, the next() method is not returning a queue node, but rather the element of type T that is wrapped by the node.
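
In fact, the for-in loop shown earlier is essentially sugar for driving the generator by hand; roughly speaking (this is a sketch, not the compiler's literal expansion):

var generator = queue.generate()
while let element = generator.next() {
  // do something with element
}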

And that's all there is to it for SequenceType conformance!

Initializers

But before we move on, let’s go back to our original Queue definition and drop in some initializers that allow Queue construction to work like Swift’s own Array and Set types:

class Queue<T>: ArrayLiteralConvertible { // continued

  /**
   Construct an empty queue.
   */
  init() { }
  
  /**
   Construct an instance containing `elements`.
   */
  required init(arrayLiteral elements: T...) {
    for element in elements {
      enqueue(element)
    }
  }
  
  /**
   Construct an instance from an arbitrary sequence with elements of type `Element`.
   */
  init<S: SequenceType where S.Generator.Element == T>(_ sequence: S) {
    for element in sequence {
      enqueue(element)
    }
  }
}

The first initializer was implied by the previous omission of any initializers, but now that we've added the other two, we must define it explicitly. Its body is empty because Queue's stored properties are either optionals, which are automatically initialized to nil, or have default values (count starts at 0). Since that is the "default" state of an empty queue, there is nothing left for it to initialize.

The second initializer uses Swift's variadic parameter syntax (the ellipsis) to accept a comma-separated list of literal values, which is made available inside the initializer as an Array. This is all that's needed for ArrayLiteralConvertible protocol conformance, so I add that to the class declaration. This in turn necessitates the use of the required modifier on the initializer, as the initializer is required by the protocol.
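
With ArrayLiteralConvertible in place, we can also write a queue directly as an array literal, provided the element type is clear from context; a small sketch:

// The literal below is handed to init(arrayLiteral:) behind the scenes.
let letters: Queue<String> = ["a", "b", "c"]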

Finally, the third initializer allows us to build a queue from any SequenceType, such as an Array, Set, or Dictionary. The syntax used here is that of the generic parameter clause, which introduces a new type parameter S constrained in terms of the Queue's generic parameter T. The constraint says S conforms to the SequenceType protocol and its associated type S.Generator.Element is T. In other words, we can pass in any sequence whose elements are of type T.

Here’s what those initializers look like in application.

let array = [3, 7, 2, 9]
let set = Set(arrayLiteral: 3, 7, 2, 9)
let dict = ["A": 3, "B": 7, "C": 2, "D": 9]
let queue1 = Queue<Int>()
let queue2 = Queue(arrayLiteral: 3, 7, 2, 9)
let queue3 = Queue(array)
let queue4 = Queue(set)
let queue5 = Queue(dict.values)
let queue6 = Queue(queue2)

Some points of note:

  • We’re able to drop the <Int> type specification on any but the first initializer, thanks to Swift’s type inference.
  • We can now initialize one Queue with another.
  • Only the Queue and Array types maintain their order as initialized, so the arrangement of the elements for queue4 and queue5 is not necessarily 3, 7, 2, and 9.

A Generic Swift Queue and ARC

In my current project, I set off to implement a generic queue in Swift, and settled on the use of a linked list for doing so, where my queue is a collection of singly linked nodes. As it turned out, this ended up being fertile ground for illustrating Swift’s use of Automatic Reference Counting (ARC) for memory management, something I take on in this post.

For the visual learners, here is an illustration to assist the discussion, courtesy of Paper by 53 and my iPad Pro (with Apple Pencil, of course):

[Illustration: three rows depicting the queue as a line of customers, with reference counts in orange]

The top row represents three people waiting in line at Walmart customer service (who hasn’t been there?). The second row shows the first person has been helped (dequeued), leaving two. And the third row shows two new customers have arrived (enqueued), eager to make their returns, before anyone else has been helped. The orange numbers correspond to reference counts (more on that in a moment).

Queue Node

So to begin, let’s take a look at the building block of the queue, the node (represented by the circles in the diagram):

/**
 The component of a queue that provides the queue's content and sequence.
 */
class QueueNode<T> {

  /// The data element stored in the queue.
  var element: T

  /// The node that is "next in line", or `nil` if there is no subsequent node.
  var next: QueueNode?

  /**
   Wraps the given `element` into a node, leaving `next` set to `nil`.
   */
  init(element: T) {
    self.element = element
  }
}

Because the node is part of a linked list, it must be a class: linked lists rely on reference pointers, and only Swift classes, as reference types, can be "pointed" to. (In contrast, any assignment of a structure or enumeration stores a copy of the object in the receiving variable, property, or parameter.)
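
To make that distinction concrete, here's a minimal sketch contrasting the two; the RefBox and ValBox types are illustrative only and not part of the queue:

class RefBox  { var value = 0 }   // reference type
struct ValBox { var value = 0 }   // value type

let a = RefBox()
let b = a            // b points to the same instance as a
b.value = 42         // a.value is now 42 as well

var c = ValBox()
var d = c            // d is an independent copy of c
d.value = 42         // c.value is still 0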

Queue

Next, let’s begin taking a look at the class that takes the QueueNode building block and with it forms a fully functional Queue data structure:

/**
 A generic collection that provides standard enqueueing and dequeuing functionality.
 As elements are enqueued, they are conceptually added to "the back of the line;" 
 as they are dequeued, they are removed from "the front of the line." 
 */
class Queue<T> {

  /// The first enqueued node or that following the last dequeued node.
  private var front: QueueNode<T>?

  /// The last enqueued node.
  private var back: QueueNode<T>?

  /// The number of elements in the queue.
  private(set) var count = 0
  
  /// True if there are no elements in the queue, false otherwise.
  var isEmpty: Bool {
    return count == 0
  }
 

I want to interrupt here to point out that the class properties front and back defined above are effectively pointers, as is the local variable node in the enqueue() method below (reference the diagram).

  /**
   Adds the given `element` to the back of the queue.
    - complexity: O(1)
   */
  func enqueue(element: T) {
    
    let node = QueueNode<T>(element: element)
    
    // First node
    if front == nil {
      front = node
      back = node
    }
      
    // Subsequent nodes
    else {
      back!.next = node
      back = node
    }

    count += 1
  }
 

It is the let node = QueueNode<T>(element: element) line in enqueue() where this class allocates memory for, creates, initializes, and assigns a reference to a new QueueNode object (all done behind the scenes by Swift). In accordance with ARC, as long as this object has a property, variable, or parameter pointing to it, it will remain allocated. That local node reference is short-lived, though, so the only potential direct references held by Queue are front and back. It is the nodes themselves that maintain the next references keeping the chain of nodes alive (again, the diagram).
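
To make the bookkeeping concrete, here's a rough usage trace of who holds a reference after a few enqueues, in the spirit of the orange numbers in the diagram (a sketch, not actual ARC output):

let line = Queue<Int>()
line.enqueue(1)   // front -> node(1) <- back                        node(1): 2 refs
line.enqueue(2)   // front -> node(1) -> node(2) <- back             node(1): 1, node(2): 2
line.enqueue(3)   // front -> node(1) -> node(2) -> node(3) <- back  node(1): 1, node(2): 1, node(3): 2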

  /**
   Removes and returns the element at the front of the queue.
   - complexity: O(1)
   */
  func dequeue() -> T? {
     
    // Queue is empty
    if front == nil { return nil }
     
    // Remove first node/element
    let element = front!.element
    front = front!.next
     
    // Deallocate dequeued back node
    if front == nil {
      back = nil
    }
 
    count -= 1
     
    return element
  }
 

Consider the front = front!.next assignment in dequeue(), where front is reassigned to the object pointed to by its own next property. This leaves the node formerly known as first with nothing referring to it, triggering implicit and immediate deallocation; there's no need to explicitly deallocate it, such as with a nil assignment.

In contrast, if we have dequeued the last node, we need to explicitly release it by reassigning back to nil; otherwise back will retain that node until the queue itself is deallocated (at which time it is automatically set to nil).

Automatic Queue Deallocation

In the context of queue deallocation, such as when the reference count for a Queue object drops to zero, front is released. When that happens, the front node's reference count also drops to zero, triggering its deallocation. That deallocation in turn releases the node's next reference, which decrements the adjacent node's reference count to zero, thereby triggering that node's deallocation, and so on. This chain reaction continues until the last node in the linked list is reached.

As discussed in the context of dequeue(), this last node is also pointed to by the queue's back property, thus preventing its reference count from dropping to zero, and thereby keeping the object alive. It is not until back is set to nil (the back = nil assignment above) that the reference count drops to zero and the last node is deallocated.

The process of deallocating the nodes between front and back happens automatically courtesy of ARC memory management built into Swift and Objective-C. You can see it in action by using this script and addition to QueueNode:

class QueueNode<T> { // continued
  deinit {
    print("\(#function)")
  }
}

func testQueue() {
  let n = 10
  var queue = Queue<Int>()
  for i in 1...n {
    queue.enqueue(i)
  }
}
testQueue()
print("done")

Once testQueue() exits, you’ll see its queue variable is automatically deallocated as described, which in this case would result in 10 debugger console output lines that read “deinit” before “done” appears.

Large Queues

Now you might find that for large queues, trouble ensues in the form of an EXC_BAD_ACCESS runtime error. I found that taking the reins on deallocating the individual QueueNodes ensures the deallocation of any sized queue works flawlessly.
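
To see that failure mode for yourself, something along these lines should reproduce it; the element count here is an assumption for illustration, as the actual tipping point depends on the environment:

func stressQueue() {
  let queue = Queue<Int>()
  for i in 1...1_000_000 {   // illustrative count; the real threshold varies
    queue.enqueue(i)
  }
}   // queue is released here; without the deinit below, deallocating
    // a very long chain of nodes is where the crash shows up
stressQueue()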

Returning to our Queue definition, this is a simple matter of adding a deinit method that calls upon a removeAll() method familiar to users of Swift’s collection types:

  /**
   Explicitly deinitializes all enqueued nodes.
   - complexity: O(n)
   */
  deinit {
    removeAll()
  }

  /**
   Removes all elements from the queue.
   - complexity: O(n)
   */
  func removeAll() {
    while front != nil {
      front = front!.next
    }
    back = nil
    count = 0
  }
}

Note we haven't completely given up on ARC. Just as it did in dequeue(), the same implicit deallocation occurs here: as soon as the front = front!.next assignment is made, the node that was the front node is deallocated. Where we short-circuit the automatic chain-reaction deallocation discussed above is in that front reassignment. Were we to replace the while loop with a simple front = nil, we would be back to the default behavior we had without a deinit method.

And yes, for you lines-of-code hounds out there, I could have written removeAll() as a simple one-liner with the loop while dequeue() != nil { }. I contend, though, that the given approach is superior for performance reasons, however negligible. There's a bit more overhead in calling dequeue() 100 million times, let's say, than in dereferencing that many nodes directly (in a single test, this amounted to 45 versus 65 seconds, or nearly half again as long).
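
For reference, that one-liner version would look like this; dequeue() already advances front, releases back when the queue empties, and keeps count in sync:

func removeAll() {
  while dequeue() != nil { }
}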

More to Come

That does it for now. In my next post I conform the Queue to SequenceType so it can be iterated with the likes of:

for element in queue {
  // do something with element
}