How does Swift prevent Concurrency Thread Explosion? by Lee Kah Seng

What I discovered is quite fascinating!


A few weeks ago, I read an article by Wojciech Kulik, where he talks about some of the shortcomings of the Swift Concurrency framework. In one section, Wojciech briefly mentioned thread explosion, and how Swift Concurrency can prevent it by limiting the number of threads to the number of CPU cores so that the system never overcommits.

This made me wonder… is it really so? How does it work behind the scenes? Can we trick the system into creating more threads than CPU cores?

We are going to answer all of these questions in this article. So without further ado, let’s dive straight in.

So, what is thread explosion? Thread explosion is a situation where a system has a large number of threads running simultaneously, which eventually causes performance problems and memory overhead.

There is no clear answer to how many threads are considered too many. As a general benchmark, we can refer to the example given in this WWDC video, where a system running 16 times more threads than it has CPU cores is considered to be undergoing thread explosion.

Since Grand Central Dispatch (GCD) does not have a built-in mechanism that prevents thread explosion, it is very easy to trigger one using a dispatch queue. Consider the following code:
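The snippet below is a minimal sketch of that idea. The helper name dispatchGlobal, the QoS, and the 3-second sleep are my assumptions; only the HeavyWork class name and the count of 150 come from the article itself.

```swift
import Foundation

class HeavyWork {
    // Simulate a long-running, blocking job on a global concurrent queue.
    static func dispatchGlobal(seconds: UInt32) {
        DispatchQueue.global(qos: .userInitiated).async {
            sleep(seconds)
        }
    }
}

// Kick off 150 blocking work items at once.
// GCD keeps spawning new threads because every existing one is blocked.
for _ in 1...150 {
    HeavyWork.dispatchGlobal(seconds: 3)
}
```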

Once executed, the above code will spawn a total of 150 threads, causing a thread explosion. This can be verified by stopping the execution and checking the Debug Navigator.

[Image: Xcode Debug Navigator showing the thread explosion caused by GCD]

Now that you have learned how to trigger a thread explosion, let’s try to execute the same code using Swift Concurrency and see what happens.

As we all know, there are 3 levels of task priority in Swift Concurrency, namely userInitiated, utility, and background, with userInitiated being the highest priority, followed by utility, and background being the lowest. So let’s go ahead and update our HeavyWork class accordingly:
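Here’s a rough sketch of what the updated class could look like. The method names (runUserInitiatedTask and friends) are my assumptions; the article only tells us that each helper creates a Task with the given priority and prints a timestamp.

```swift
import Foundation

class HeavyWork {
    static func runUserInitiatedTask(seconds: UInt32) {
        Task(priority: .userInitiated) {
            // Log a timestamp so the Xcode console shows when this task gets to run.
            print("userInitiated: \(Date())")
            sleep(seconds)   // block the thread to simulate heavy work
        }
    }

    static func runUtilityTask(seconds: UInt32) {
        Task(priority: .utility) {
            print("utility: \(Date())")
            sleep(seconds)
        }
    }

    static func runBackgroundTask(seconds: UInt32) {
        Task(priority: .background) {
            print("background: \(Date())")
            sleep(seconds)
        }
    }
}
```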

Every time a task is created, we print out a timestamp. We can then use it to see what is happening behind the scenes.

With the updated HeavyWork class in place, let’s start with the first test.

Test 1: Creating tasks with the same priority level

This test is basically the same as the dispatch queue example we saw earlier, but instead of using GCD, we’ll use Task from Swift Concurrency to spawn the work.
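Assuming the hypothetical helpers sketched earlier, the test could look like this (the iteration count mirrors the 150 used in the GCD example):

```swift
// Spawn 150 tasks, all with the same userInitiated priority.
for _ in 1...150 {
    HeavyWork.runUserInitiatedTask(seconds: 3)
}
```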

The following are the logs captured from the Xcode console.

[Image: Swift Concurrency running up to 6 threads at a time]

As you can see (from the timestamps), thread creation stopped when the thread count reached 6, which perfectly corresponds to the number of CPU cores on my 6-core iPhone 12. New tasks only get to run after an ongoing task has completed its execution. As a result, there can be a maximum of 6 threads running simultaneously at any time.

Note:

The iOS simulator will always limit the maximum thread count to 1, regardless of the device selected. So make sure to run the above test on a real device for a more accurate result.

For a clearer picture of what’s really happening behind the scenes, let’s pause the execution.

[Image: Tasks with ‘userInitiated’ priority running on a concurrent queue]

It looks like everything we just saw is controlled by a concurrent queue named “com.apple.root.user-initiated-qos.cooperative”.

Based on the above observation, it is safe to say how Swift Concurrency prevents thread explosion from happening: it keeps a dedicated cooperative concurrent queue that limits the maximum number of threads so that it does not exceed the number of CPU cores.

Test 2: Creating all tasks at once from high to low priority level

Now, let’s go a little deeper by adding tasks with different priorities to the test.
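Sticking with the hypothetical helpers from before, the test could look something like this (the 150-per-loop count is an assumption carried over from the first test):

```swift
// Create all tasks at once, from the highest priority down to the lowest.
for _ in 1...150 {
    HeavyWork.runUserInitiatedTask(seconds: 3)
}

for _ in 1...150 {
    HeavyWork.runUtilityTask(seconds: 3)
}

for _ in 1...150 {
    HeavyWork.runBackgroundTask(seconds: 3)
}
```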

Note that we are creating the highest-priority tasks (userInitiated) first, followed by utility and background. Based on our previous observation, I was expecting to see 3 queues with 6 threads running simultaneously in each queue, meaning a total of 18 threads being spawned. Surprisingly, that is not the case. Take a look at the following screenshot:

[Image: Task distribution when all tasks start at once, from high to low priority]

As you can see, both the utility and background queues limit their maximum number of threads to 1 when the higher-priority (userInitiated) queue is saturated. In other words, we can have at most 8 threads running simultaneously in this test.

This is such an interesting find! Saturating the high-priority queue somehow prevents the other, lower-priority queues from spawning more threads.

But what if we reverse the order of the priority levels? Let’s find out!

Test 3: Creating all tasks at once from low to high priority level

First, let’s update the execution code:
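A sketch of what the reversed version might look like, again assuming the hypothetical helpers and counts from before:

```swift
// Same as Test 2, but starting from the lowest priority.
for _ in 1...150 {
    HeavyWork.runBackgroundTask(seconds: 3)
}

for _ in 1...150 {
    HeavyWork.runUtilityTask(seconds: 3)
}

for _ in 1...150 {
    HeavyWork.runUserInitiatedTask(seconds: 3)
}
```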

Here comes the result:

[Image: Task distribution when all tasks start at once, from low to high priority]

The result we get is exactly the same as “Test 2”.

It seems that the system is smart enough to give way to the higher-priority tasks and run them first, even though we started the lowest-priority tasks first. Also, the system still prevents us from creating more than 8 concurrent threads, so we still haven’t been able to trigger a thread explosion in this test. Good job, Apple!

Test 4: Creating tasks from low to high priority level with breaks in between

In real-life situations, it is very unlikely that we start a bunch of tasks with different priority levels all at once. So let’s make the situation more realistic by adding a short break between each for loop. Note that we are still using the low-to-high order in this test.
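Here’s a rough sketch of what that could look like. The 2-second pause is my assumption; the article only says “a short break”.

```swift
// Low-to-high order again, but with a short pause between each batch.
for _ in 1...150 {
    HeavyWork.runBackgroundTask(seconds: 3)
}

sleep(2)

for _ in 1...150 {
    HeavyWork.runUtilityTask(seconds: 3)
}

sleep(2)

for _ in 1...150 {
    HeavyWork.runUserInitiatedTask(seconds: 3)
}
```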

The result we get is quite interesting.

[Image: Task distribution from low to high priority with breaks in between]

As you can see, after the second break, all 3 queues are running multiple threads. It seems that if we start a low-priority queue first and let it run for some time, the high-priority queues will not suppress the threads of the low-priority queues.

I have executed this test twice; the maximum number of threads varies slightly, but it works out to roughly 3 times the number of CPU cores.

Is this considered thread explosion?

I don’t think so, because 3 times more threads than CPU cores is still well below the 16-times threshold I mentioned earlier. In fact, I think Apple allows this intentionally in order to strike a better balance between execution performance and multi-threading overhead. Reach out to me on Twitter if you have other perspectives; I’d really like to hear your thoughts.
