GCD concurrency tutorial for beginners
The Grand Central Dispatch (GCD, or simply Dispatch) framework is based on the underlying thread pool design pattern. This means that there is a fixed number of threads spawned by the system – based on some factors like the number of CPU cores – and they're always available, waiting for tasks to be executed concurrently. 🚦
Creating threads on the fly is an expensive task, so GCD organizes tasks into specific queues, and later on the tasks waiting in these queues get executed on a proper and available thread from the pool. This approach leads to great performance and low execution latency. We can say that the Dispatch framework is a really fast and efficient concurrency framework designed for modern multi-core hardware and its needs.
Concurrency, multi-tasking, CPU cores, parallelism and threads
A processor can run tasks written by you programmatically; this is usually called coding, developing or programming. The code executed by a CPU core is a thread. So your app is going to create a process that is made up of threads. 🤓
In the past a processor had one single core, so it could only deal with one task at a time. Later on time-slicing was introduced, so CPUs could execute threads concurrently using context switching. As time passed, processors gained more horsepower and more cores, so they became capable of real multi-tasking using parallelism. ⏱
Nowadays a CPU is a very powerful unit, capable of executing billions of tasks (cycles) per second. Because of this high speed, Intel introduced a technology called hyper-threading. They divided CPU clock cycles between (usually two) processes running at the same time, so the number of available threads essentially doubled. 📈
As you can see, concurrent execution can be achieved with various techniques, but you don't need to care about that much. It's up to the CPU architecture how it solves concurrency, and it's the operating system's job to decide how many threads are spawned for the underlying thread pool. The GCD framework hides all of this complexity, but it's always good to know the basic principles. 👍
Synchronous and asynchronous execution
Every work item can be executed either synchronously or asynchronously.
Have you ever heard of blocking and non-blocking code? This is the same situation here. With synchronous tasks you'll block the execution queue, but with async tasks your call will instantly return and the queue can continue executing the remaining tasks (or work items, as Apple calls them). 🚧
Synchronous execution
When a work item is executed synchronously with the sync method, the program waits until execution finishes before the method call returns.
Your function is most likely synchronous if it has a return value, so func load() -> String
is probably going to block the thread it runs on until the resource is completely loaded and returned.
Asynchronous execution
When a work item is executed asynchronously with the async method, the method call returns immediately.
Completion blocks are a good sign of async methods. For example, if you look at this method: func load(completion: (String) -> Void)
you can see that it has no return type, but the result of the function is passed back to the caller later on through a block.
This is a typical use case: if you have to wait for something inside your method, like reading the contents of a huge file from disk, you don't want to block your CPU just because of the slow IO operation. There can be other tasks that are not IO-heavy at all (math operations, etc.), and these can be executed while the system is reading your file from the physical hard drive. 💾
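To make the two shapes concrete, here is a minimal sketch (the loadSync / loadAsync names and the hard-coded result are made up for illustration):

```swift
import Dispatch
import Foundation

// Synchronous version: the caller blocks until the value is ready.
func loadSync() -> String {
    sleep(1) // simulates a slow IO operation
    return "file contents"
}

// Asynchronous version: returns immediately, delivers the result later.
func loadAsync(completion: @escaping (String) -> Void) {
    DispatchQueue.global().async {
        sleep(1) // simulates a slow IO operation
        completion("file contents")
    }
}

print(loadSync()) // blocks for about a second, then prints
loadAsync { contents in
    // in a command-line tool you'd have to keep the process alive to see this
    print(contents)
}
```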
With dispatch queues you can execute your code synchronously or asynchronously. With synchronous execution the queue waits for the work; with async execution the code returns immediately without waiting for the task to complete. ⚡️
Dispatch queues
As I mentioned before, GCD organizes tasks into queues; these are just like the queues at the shopping mall. On every dispatch queue, tasks will be executed in the same order as you add them to the queue – FIFO: the first task in the line will be executed first – but you should note that the order of completion is not guaranteed. Tasks will be completed according to their code complexity. So if you add two tasks to the queue, a slow one first and a fast one later, the fast one can finish before the slower one. ⌛️
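A quick sketch of this behavior (the sleep durations are arbitrary): the slow task is enqueued first, but the fast one finishes first.

```swift
import Dispatch
import Foundation

let queue = DispatchQueue(label: "com.theswiftdev.queues.concurrent",
                          attributes: .concurrent)
let group = DispatchGroup()

queue.async(group: group) { // added first, but slow
    sleep(2)
    print("slow task done")
}
queue.async(group: group) { // added second, but fast
    print("fast task done") // this prints first
}
group.wait()
```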
Serial and concurrent queues
There are two types of dispatch queues. Serial queues can execute one task at a time; these queues can be utilized to synchronize access to a specific resource. Concurrent queues, on the other hand, can execute multiple tasks in parallel at the same time. A serial queue is just like one line in the mall with one cashier; a concurrent queue is like one single line that splits towards two or more cashiers. 💰
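The cashier analogy in code (labels are arbitrary): tasks on the serial queue always run and finish in submission order, while tasks on the concurrent queue may interleave.

```swift
import Dispatch

let serial = DispatchQueue(label: "com.theswiftdev.queues.serial")
let concurrent = DispatchQueue(label: "com.theswiftdev.queues.concurrent",
                               attributes: .concurrent)
let group = DispatchGroup()

for i in 1...3 {
    serial.async(group: group) { print("serial \(i)") }         // always 1, 2, 3
}
for i in 1...3 {
    concurrent.async(group: group) { print("concurrent \(i)") } // any order
}
group.wait()
```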
Main, global and custom queues
The main queue is a serial one; every task on the main queue runs on the main thread.
Global queues are system-provided concurrent queues shared throughout the operating system. There are exactly four of them, organized by high, default and low priority, plus an IO-throttled background queue.
Custom queues can be created by the user. Custom concurrent queues are always mapped into one of the global queues by specifying a Quality of Service (QoS) property. In most cases, if you want to run tasks in parallel it is recommended to use one of the global concurrent queues; you should only create custom serial queues.
System-provided queues
- Serial main queue
- Concurrent global queues
    - high priority global queue
    - default priority global queue
    - low priority global queue
    - global background queue (IO throttled)
Custom queues by quality of service
- userInteractive (UI updates) -> serial main queue
- userInitiated (async UI related tasks) -> high priority global queue
- default -> default priority global queue
- utility -> low priority global queue
- background -> global background queue
- unspecified (lowest) -> low priority global queue
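For completeness, here is how a custom queue can declare the QoS class it should be mapped to (the labels and QoS values below are just examples):

```swift
import Dispatch

// A custom serial queue running at utility (low) priority.
let serialUtility = DispatchQueue(
    label: "com.theswiftdev.queues.custom.serial",
    qos: .utility
)

// A custom concurrent queue mapped to the userInitiated global queue.
let concurrentUserInitiated = DispatchQueue(
    label: "com.theswiftdev.queues.custom.concurrent",
    qos: .userInitiated,
    attributes: .concurrent
)
```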
Enough of the theory; let's see how to use the Dispatch framework in action! 🎬
How to use the DispatchQueue class in Swift?
Here is how you can get all of the queues from above using the brand new GCD syntax available from Swift 3. Please note that you should always use a global concurrent queue instead of creating your own, except if you are going to use the concurrent queue for locking with barriers to achieve thread safety; more on that later. 😳
How to get a queue?
import Dispatch
DispatchQueue.main
DispatchQueue.global(qos: .userInitiated)
DispatchQueue.global(qos: .userInteractive)
DispatchQueue.global(qos: .background)
DispatchQueue.global(qos: .default)
DispatchQueue.global(qos: .utility)
DispatchQueue.global(qos: .unspecified)
DispatchQueue(
label: "com.theswiftdev.queues.serial"
)
DispatchQueue(
label: "com.theswiftdev.queues.concurrent",
attributes: .concurrent
)
So executing a task on a background queue and updating the UI on the main queue after the task finished is pretty easy using dispatch queues.
DispatchQueue.global(qos: .background).async {
    // do your background work here
    DispatchQueue.main.async {
        // update the UI here
    }
}
Sync and async calls on queues
There is no big difference between sync and async methods on a queue. Sync is just an async call with a semaphore (explained later) that waits for the return value. A sync call will block the caller, while an async call will return immediately. 🎉
let q = DispatchQueue.global()
let text = q.sync {
    return "this will block"
}
print(text)
q.async {
    print("this will return instantly")
}
Basically if you need a return value use sync, but in every other case just go with async. DEADLOCK WARNING: you should never call sync on the main queue from the main thread, because it will cause a deadlock and a crash. A safe way to do sync calls on the main queue / thread is to check whether you are already on the main thread first. 👌
Never call sync on a serial queue from that serial queue's own thread!
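One common shape for such a safe main-thread sync call looks something like this (a sketch, not the author's exact snippet; the syncOnMain name is made up):

```swift
import Dispatch
import Foundation

extension DispatchQueue {
    // Runs the block synchronously on the main thread,
    // without deadlocking if we are already on it.
    static func syncOnMain<T>(_ block: () throws -> T) rethrows -> T {
        if Thread.isMainThread {
            return try block()
        }
        return try DispatchQueue.main.sync(execute: block)
    }
}

let value = DispatchQueue.syncOnMain { "hello from the main thread" }
print(value)
```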
Delay execution
You can simply delay code execution using the Dispatch framework.
DispatchQueue.main.asyncAfter(deadline: .now() + .seconds(2)) {
    // this block runs after a 2 second delay
}
Perform a concurrent loop
Dispatch queues simply allow you to perform iterations concurrently.
DispatchQueue.concurrentPerform(iterations: 5) { (i) in
print(i)
}
Debugging
Oh, by the way, this is for debugging purposes only, but you can return the name of the current queue by using this little extension. Don't use it in production code!!!
extension DispatchQueue {
static var currentLabel: String {
.init(validatingUTF8: __dispatch_queue_get_label(nil))!
}
}
Using DispatchWorkItem in Swift
DispatchWorkItem encapsulates work that can be performed. A work item can be dispatched onto a DispatchQueue and within a DispatchGroup. A DispatchWorkItem can also be set as a DispatchSource event, registration, or cancellation handler.
So, just like with operations, by using a work item you can cancel a running task. Also, work items can notify a queue when their task is completed.
var workItem: DispatchWorkItem?
workItem = DispatchWorkItem {
    for i in 1..<6 {
        guard let item = workItem, !item.isCancelled else {
            print("cancelled")
            break
        }
        sleep(1)
        print(String(i))
    }
}
workItem?.notify(queue: .main) {
    print("done")
}
DispatchQueue.global().asyncAfter(
    deadline: .now() + .seconds(2)
) {
    workItem?.cancel()
}
DispatchQueue.main.async(execute: workItem!)
Concurrent duties with DispatchGroups
So you need to perform multiple network calls in order to construct the data required by a view controller? This is where DispatchGroup can help you. All of your long-running background tasks can be executed concurrently, and when everything is ready you'll receive a notification. Just be careful: you have to use thread-safe data structures, so always modify arrays, for example, on the same thread! 😅
func load(delay: UInt32, completion: () -> Void) {
sleep(delay)
completion()
}
let group = DispatchGroup()
group.enter()
load(delay: 1) {
    print("1")
    group.leave()
}
group.enter()
load(delay: 2) {
    print("2")
    group.leave()
}
group.enter()
load(delay: 3) {
    print("3")
    group.leave()
}
group.notify(queue: .main) {
    print("done")
}
Note that you always have to balance the enter and leave calls on the group. The dispatch group also allows us to track the completion of different work items, even if they run on different queues.
let group = DispatchGroup()
let queue = DispatchQueue(
label: "com.theswiftdev.queues.serial"
)
let workItem = DispatchWorkItem {
    print("start")
    sleep(1)
    print("end")
}
queue.async(group: group) {
    print("group start")
    sleep(2)
    print("group end")
}
DispatchQueue.global().async(
    group: group,
    execute: workItem
)
group.notify(queue: .main) {
    print("done")
}
One more thing you can use dispatch groups for: imagine that you're displaying a nicely animated loading indicator while you do some actual work. It might happen that the work is done faster than you'd expect, so the indicator animation can't finish. To solve this situation you can add a small delay task, so the group will wait until both tasks finish. 😎
let queue = DispatchQueue.global()
let group = DispatchGroup()
let n = 9
for i in 0..<n {
    queue.async(group: group) {
        print("\(i): Running async task...")
        sleep(3)
        print("\(i): Async task completed")
    }
}
group.wait()
print("done")
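The delay-task trick mentioned above can be sketched like this (the one-second minimum is arbitrary): one group entry for the real work, another for the minimum display time, so the wait returns only when both are done.

```swift
import Dispatch
import Foundation

let group = DispatchGroup()

// the actual work, which may finish very quickly
group.enter()
DispatchQueue.global().async {
    print("work done")
    group.leave()
}

// a minimum display time for the loading indicator
group.enter()
DispatchQueue.global().asyncAfter(deadline: .now() + 1) {
    group.leave()
}

group.wait() // returns after at least one second
print("hide the loading indicator")
```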
Semaphores
A semaphore is simply a variable used to handle resource sharing in a concurrent system. It's a really powerful object; here are a few important examples in Swift.
How to make an async task synchronous?
The answer is simple: you can use a semaphore (bonus points for timeouts)!
enum DispatchError: Error {
case timeout
}
func asyncMethod(completion: (String) -> Void) {
    sleep(2)
    completion("done")
}
func syncMethod() throws -> String {
    let semaphore = DispatchSemaphore(value: 0)
    let queue = DispatchQueue.global()
    var response: String?
    queue.async {
        asyncMethod { r in
            response = r
            semaphore.signal()
        }
    }
    _ = semaphore.wait(timeout: .now() + 5)
    guard let result = response else {
        throw DispatchError.timeout
    }
    return result
}
let response = try? syncMethod()
print(response ?? "no response")
Lock / single access to a resource
If you want to avoid race conditions you are probably going to use mutual exclusion. This can be achieved using a semaphore object, but if your object needs heavy reading capability you should consider a dispatch barrier based solution instead. 😜
class LockedNumbers {
    let semaphore = DispatchSemaphore(value: 1)
    var elements: [Int] = []
    func append(_ num: Int) {
        _ = self.semaphore.wait(timeout: DispatchTime.distantFuture)
        print("appended: \(num)")
        self.elements.append(num)
        self.semaphore.signal()
    }
    func removeLast() {
        _ = self.semaphore.wait(timeout: DispatchTime.distantFuture)
        defer {
            self.semaphore.signal()
        }
        guard !self.elements.isEmpty else {
            return
        }
        let num = self.elements.removeLast()
        print("removed: \(num)")
    }
}
let items = LockedNumbers()
items.append(1)
items.append(2)
items.append(5)
items.append(3)
items.removeLast()
items.removeLast()
items.append(3)
print(items.elements)
Wait for multiple tasks to complete
Just like with dispatch groups, you can also use a semaphore object to get notified when multiple tasks finish. You just have to wait for it…
let semaphore = DispatchSemaphore(value: 0)
let queue = DispatchQueue.global()
let n = 9
for i in 0..<n {
    queue.async {
        print("run \(i)")
        sleep(3)
        semaphore.signal()
    }
}
print("wait")
for i in 0..<n {
    semaphore.wait()
    print("completed \(i)")
}
print("done")
Batch execution using a semaphore
You can create thread pool-like behavior to simulate limited resources using a dispatch semaphore. So, for example, if you want to download lots of images from a server you can run batches of x at a time. Quite handy. 🖐
print("start")
let sem = DispatchSemaphore(value: 5)
for i in 0..<10 {
    DispatchQueue.global().async {
        sem.wait()
        sleep(2)
        print(i)
        sem.signal()
    }
}
print("end")
The DispatchSource object
A dispatch source is a fundamental data type that coordinates the processing of specific low-level system events.
Signals, descriptors, processes, ports, timers and many more: everything is handled through the dispatch source object. I really don't want to get into the details here; it's quite low-level stuff. You can monitor files, ports and signals with dispatch sources. Please just read the official Apple docs. 📄
I'd like to show just one example here, using a dispatch source timer.
let timer = DispatchSource.makeTimerSource()
timer.schedule(deadline: .now(), repeating: .seconds(1))
timer.setEventHandler {
    print("hello")
}
timer.resume()
Thread safety using the Dispatch framework
Thread safety is an inevitable topic when it comes to multi-threaded code. In the beginning I mentioned that there is a thread pool under the hood of GCD. Every thread has a run loop object associated with it; you can even run them by hand. If you create a thread manually, a run loop will be added to that thread automatically.
let t = Thread {
    print(Thread.current.name ?? "")
    let timer = Timer(timeInterval: 1, repeats: true) { t in
        print("tick")
    }
    RunLoop.current.add(timer, forMode: .default)
    RunLoop.current.run()
    RunLoop.current.run(mode: .common, before: Date.distantPast)
}
t.name = "my-thread"
t.start()
You should not do this in real code; it's for demo purposes only. Always use GCD queues!
Queue != Thread
A GCD queue is not a thread; if you run multiple async operations on a concurrent queue, your code can run on any available thread that fits the needs.
Thread safety is all about avoiding messed-up variable states
Imagine a mutable array in Swift. It can be modified from any thread. That's not good, because eventually the values inside it are going to be messed up like hell if the array is not thread-safe. For example, multiple threads try to insert values into the array. What happens? If they run in parallel, which element is going to be added first? This is why you sometimes need to create thread-safe resources.
Serial queues
You can use a serial queue to enforce mutual exclusivity. All of the tasks on the queue will run serially (in FIFO order); only one task runs at a time and tasks have to wait for each other. One big downside of this solution is speed. 🐌
let q = DispatchQueue(label: "com.theswiftdev.queues.serial")
q.async {
    // async task: the caller does not wait for it
}
q.sync {
    // sync task: the caller waits until it finishes
}
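Putting this into practice, a thread-safe array wrapper built on a serial queue could look something like this (a sketch; the SafeArray name is made up):

```swift
import Dispatch

final class SafeArray<T> {
    private var storage: [T] = []
    private let queue = DispatchQueue(label: "com.theswiftdev.queues.serial")

    func append(_ element: T) {
        queue.async { // writes don't block the caller
            self.storage.append(element)
        }
    }

    var elements: [T] {
        queue.sync { storage } // reads wait for all pending writes
    }
}

let numbers = SafeArray<Int>()
DispatchQueue.concurrentPerform(iterations: 100) { i in
    numbers.append(i)
}
print(numbers.elements.count) // 100
```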
Concurrent queues using barriers
You can send a barrier task to a queue by providing an extra flag to the async method. If a task like this arrives at the queue, it ensures that nothing else will be executed until the barrier task has finished. To sum this up, barrier tasks are sync (points) tasks for concurrent queues. Use async barriers for writes, sync blocks for reads. 😎
let q = DispatchQueue(label: "com.theswiftdev.queues.concurrent", attributes: .concurrent)
q.async(flags: .barrier) {
    // write operation: executed exclusively
}
q.sync {
    // read operation: can run in parallel with other reads
}
This approach results in extremely fast reads in a thread-safe environment. You can also use serial queues, semaphores or locks; it all depends on your current situation, but it's good to know all the available options, isn't it? 🤐
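Here is a sketch of this reader-writer pattern wrapped into a small class (the ThreadSafeValue name and API are made up for illustration):

```swift
import Dispatch

final class ThreadSafeValue<T> {
    private var internalValue: T
    private let queue = DispatchQueue(
        label: "com.theswiftdev.queues.concurrent",
        attributes: .concurrent
    )

    init(_ value: T) {
        self.internalValue = value
    }

    var value: T {
        queue.sync { internalValue } // reads can run in parallel
    }

    func mutate(_ transform: @escaping (inout T) -> Void) {
        queue.async(flags: .barrier) { // writes are exclusive
            transform(&self.internalValue)
        }
    }
}

let counter = ThreadSafeValue(0)
DispatchQueue.concurrentPerform(iterations: 100) { _ in
    counter.mutate { $0 += 1 }
}
print(counter.value) // 100
```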
Some anti-patterns
You have to be very careful with deadlocks, race conditions and the readers-writers problem. Usually, calling the sync method on a serial queue will cause you most of the trouble. Another issue is thread safety, but we've already covered that part. 😉
let queue = DispatchQueue(label: "com.theswiftdev.queues.serial")
queue.sync {
    // deadlock: the outer sync blocks the serial queue...
    queue.sync {
        // ...so this inner sync can never start
    }
}
DispatchQueue.global(qos: .utility).sync {
    // potential deadlock if this code was started from the main thread
    DispatchQueue.main.sync {
        // never executed in that case
    }
}
The Dispatch framework (aka GCD) is an amazing one; it has huge potential and it really takes some time to master it. The real question is: which path is Apple going to take in order to bring concurrent programming to a whole new level? Promises or async/await, maybe something completely new; let's hope that we'll see something in Swift 6.