Solving the Issue of new Thread().Start Causing High Concurrency and 100% CPU Usage


While taking over a project, I noticed the following code was being used extensively:

new Thread(() => {
  //do something
}).Start();

The intent is simply to move time-consuming operations onto a background thread so the page returns sooner and the user experience improves.

The Issue

However, the problem with this approach is that every request spawns a brand-new thread. Each thread gets its own stack (about 1 MB by default) and competes for the OS scheduler, so under high concurrency the thread count explodes and CPU usage frequently hits 100%.

Of course, if your project is full of such code and hasn't crashed yet, it at least indicates that not many users are hitting this method.

The Solution

Naturally, I wanted to optimize the project. My first thought was to use a queue, but I found that the project wasn’t using a queue at all. Many operations were still done in an outdated way: there was a task table, and whenever a task appeared, it would add content to the table, and then a scheduled task would execute every minute to process the tasks.

So, I thought about how to fix the issue with minimal changes, and then consider further improvements later.

The core problem was the excessive creation of threads. The simplest solution is to limit their number. This is where ThreadPool.QueueUserWorkItem comes into play.

Many of you may already be familiar with ThreadPool.QueueUserWorkItem. Below is Microsoft's explanation:

It queues a method for execution and specifies the data object that the method uses. This method executes when a thread pool thread becomes available.

Here’s how it works:

protected static Logger Logger = LogManager.GetCurrentClassLogger();

public ActionResult Index() {
  Logger.Debug("Execution started");
  // Queue the long-running work on the thread pool instead of new Thread():
  // Index() returns immediately, and the work item runs when a pool thread is free.
  ThreadPool.QueueUserWorkItem(new WaitCallback(InsertNewsInfoExt), "param");
  Logger.Debug("Execution ended");
  return View();
}

// Must match the WaitCallback signature: void (object state).
private void InsertNewsInfoExt(object info) {
  Logger.Debug("InsertNewsInfoExt execution started");
  Thread.Sleep(1000 * 200); // simulate a long-running operation (200 seconds)
  Logger.Debug("InsertNewsInfoExt execution ended");

  // The try/catch keeps an unhandled exception on this raw thread
  // from tearing down the process.
  new Thread(t => {
    try {
      Logger.Debug("Thread execution");
    } catch (Exception ex) {
      Logger.Error(ex.Message);
    }
  }).Start();
}
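As an aside, on newer versions of .NET the same "run it on the pool" effect is usually written with Task.Run, which also schedules the delegate on the thread pool but returns a Task for observing completion and exceptions. A minimal sketch (DoBackgroundWork is an illustrative stand-in, not a method from the project above):

```csharp
using System;
using System.Threading.Tasks;

class Example
{
    static void Main()
    {
        // Task.Run queues the delegate to the thread pool, just like
        // ThreadPool.QueueUserWorkItem, but returns a Task that can be
        // awaited or inspected for exceptions.
        Task t = Task.Run(() => DoBackgroundWork("param"));
        t.Wait(); // in real code you would await instead of blocking
        Console.WriteLine("done");
    }

    // Illustrative stand-in for a long-running operation.
    static void DoBackgroundWork(object info)
    {
        Console.WriteLine($"processing {info}");
    }
}
```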

According to the MSDN documentation, early versions of the .NET Framework defaulted the thread pool to 25 threads per available processor; later runtimes use much higher defaults. Either way, you can change the limit explicitly with the SetMaxThreads method:

// Cap worker threads and I/O completion threads at 1000 each
ThreadPool.SetMaxThreads(1000, 1000);
// Queue a work item; it runs as soon as a pool thread is free
ThreadPool.QueueUserWorkItem(new WaitCallback(InsertNewsInfoExt), "param");
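One detail worth noting: SetMaxThreads returns a bool, and it rejects values below the processor count or the pool's current minimums without applying them, so the result is worth checking. A small console sketch (assuming a machine with fewer than 1000 logical processors, so the request succeeds):

```csharp
using System;
using System.Threading;

class Config
{
    static void Main()
    {
        // SetMaxThreads returns false and changes nothing when the requested
        // values are below the processor count (or the current minimums).
        bool ok = ThreadPool.SetMaxThreads(1000, 1000);
        Console.WriteLine(ok ? "limits applied" : "limits rejected");

        // Confirm the new caps.
        ThreadPool.GetMaxThreads(out int worker, out int io);
        Console.WriteLine($"max worker threads: {worker}, max I/O threads: {io}");
    }
}
```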

Related Parameters

  • GetAvailableThreads: Retrieves the number of additional work items the pool can start without queuing (the maximum minus the threads currently in use).

  • GetMaxThreads: Retrieves the maximum number of threads in the thread pool. Requests beyond this number are queued until a thread becomes available.

  • GetMinThreads: Retrieves the minimum number of idle threads the thread pool keeps on standby for new requests.

  • QueueUserWorkItem: Queues a method for execution on a thread pool thread. If no thread is idle, the work item waits in the queue.

  • SetMaxThreads: Sets the maximum number of threads in the thread pool.

  • SetMinThreads: Sets the minimum number of threads the thread pool retains.
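The query methods above can be exercised together. The sketch below just prints the pool's current limits; the actual numbers vary by machine and runtime version:

```csharp
using System;
using System.Threading;

class PoolInfo
{
    static void Main()
    {
        ThreadPool.GetMinThreads(out int minWorker, out int minIo);
        ThreadPool.GetMaxThreads(out int maxWorker, out int maxIo);
        ThreadPool.GetAvailableThreads(out int freeWorker, out int freeIo);

        // Each pair is (worker threads, I/O completion threads).
        Console.WriteLine($"min: {minWorker}/{minIo}");
        Console.WriteLine($"max: {maxWorker}/{maxIo}");
        Console.WriteLine($"available: {freeWorker}/{freeIo}");

        // Before any work items are queued, "available" is typically
        // equal to "max" because no pool threads are busy.
        Console.WriteLine(freeWorker == maxWorker ? "pool idle" : "pool busy");
    }
}
```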

This approach caps the number of concurrently running threads, eliminating the unbounded creation of Thread objects, and it required only minimal changes to the existing code.