Introduction to Multithreading in C#
Every application runs with at least one thread. So what is a thread? A thread is a lightweight unit of execution within a process; a single process can contain many threads. In this article I will give an introductory discussion of threading: why it is used, and what multithreading means in C#/.NET.
Each thread defines a unique flow of control. One common example of the use of threads is the implementation of concurrent programming by modern operating systems. Using threads avoids wasting CPU cycles and increases the efficiency of an application; because threads share their process's resources, they are often described as lightweight processes.
Life Cycle of Thread in C#
The life cycle of a thread starts when an object of the System.Threading.Thread class is created and ends when the thread is terminated or completes execution.
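This life cycle can be observed through the thread's ThreadState property. Below is a minimal sketch (the worker's body and the timing are illustrative, not part of the original article):

```csharp
using System;
using System.Threading;

class ThreadLifeCycleDemo
{
    static void Main()
    {
        // A thread object exists before the thread runs: Unstarted state.
        Thread worker = new Thread(() => Thread.Sleep(500));
        Console.WriteLine(worker.ThreadState);   // Unstarted

        worker.Start();   // the thread begins executing
        worker.Join();    // wait here until the thread completes

        Console.WriteLine(worker.ThreadState);   // Stopped
    }
}
```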
What is Main Thread in C#
In C#, when a program starts execution, the main thread is created automatically. Threads created using the Thread class are called child threads of the main thread. You can access the currently executing thread through the CurrentThread static property of the Thread class, as the example below shows:
using System;
using System.Threading;

namespace MultithreadingApplication
{
    class MainThreadProgram
    {
        static void Main(string[] args)
        {
            Thread th = Thread.CurrentThread;
            th.Name = "MainThread";
            Console.WriteLine("This is {0}", th.Name);
            Console.ReadKey();
        }
    }
}
What is Multithreading in C#
A multithreaded application allows you to run several threads concurrently, all within the same process. So, in theory, you can run step 1 in one thread while simultaneously running step 2 in another thread, step 3 in its own thread, and even step 4 in its own thread; steps 1 through 4 would then run concurrently. In theory, if all four steps took about the same time and each ran on its own processor core, you could finish the work in roughly a quarter of the time a single-threaded program would take (assuming a four-processor machine).
So why isn't every program multithreaded? Because along with speed comes complexity. Imagine that step 1 somehow depends on information computed in step 2. The program might not run correctly if step 1 finishes its calculation before step 2, or vice versa.
Thread Safety
To deal with this, we force one thread to wait inside a code block while another thread finishes its work. This activity, known as thread blocking or thread synchronization, lets us control the timing of threads running simultaneously inside our program.
In C# we lock on a particular object (usually an instance of a class) and prevent any other thread from entering the code that uses that object until the thread currently holding the lock is done with it.
// Shared memory variable between the two threads,
// used to indicate which thread we are in.
private string _threadOutput = "";
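The full unsynchronized program is not reproduced here, so below is a hedged sketch of what it might look like, reconstructed from the field above and the method names used later in the article (the class name, Main method, and Sleep timings are assumptions for illustration):

```csharp
using System;
using System.Threading;

class RaceConditionDemo
{
    // Shared memory between the two threads (no locking -- this is the bug).
    private string _threadOutput = "";
    private volatile bool _stopThreads = false;

    void DisplayThread1()
    {
        while (_stopThreads == false)
        {
            Console.WriteLine("Display Thread 1");
            _threadOutput = "Hello Thread1";
            Thread.Sleep(1000); // simulate a lot of processing
            // By now thread #2 may already have overwritten _threadOutput.
            Console.WriteLine("Thread 1 Output --> {0}", _threadOutput);
        }
    }

    void DisplayThread2()
    {
        while (_stopThreads == false)
        {
            Console.WriteLine("Display Thread 2");
            _threadOutput = "Hello Thread2";
            Thread.Sleep(1000); // simulate a lot of processing
            Console.WriteLine("Thread 2 Output --> {0}", _threadOutput);
        }
    }

    static void Main()
    {
        var demo = new RaceConditionDemo();
        new Thread(demo.DisplayThread1).Start();
        new Thread(demo.DisplayThread2).Start();
        Console.ReadKey();          // run until a key is pressed
        demo._stopThreads = true;   // signal both loops to exit
    }
}
```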
In the above program we expect output in the sequence Thread 1 Output --> Hello Thread1 and Thread 2 Output --> Hello Thread2, but for the most part the results are completely unpredictable: in thread #1 we sometimes get Hello Thread2.
The reason we see these results is that in a multithreaded program such as this, the two methods DisplayThread1 and DisplayThread2 execute simultaneously.
Each method shares the variable _threadOutput. So although thread #1 assigns _threadOutput the value "Hello Thread1" and displays it two lines later, it is possible that somewhere between the time thread #1 assigns the variable and the time it displays it, thread #2 assigns _threadOutput the value "Hello Thread2". Not only are these strange results possible, they are quite frequent, as the program's output shows. This painful problem is an all-too-common bug in threaded programming known as a race condition.
Synchronizing two Threads by locking them
The best way to avoid race conditions is to write thread-safe code, which prevents these issues from cropping up in the first place. There are several techniques for writing thread-safe code. One is to share memory as little as possible; another is to synchronize access to the memory that must be shared.
void DisplayThread1()
{
    while (_stopThreads == false)
    {
        // Lock on the current instance of the class for thread #1.
        lock (this)
        {
            Console.WriteLine("Display Thread 1");
            _threadOutput = "Hello Thread1";
            Thread.Sleep(1000); // simulate a lot of processing
            // Tell the user we are in thread #1.
            Console.WriteLine("Thread 1 Output --> {0}", _threadOutput);
        } // lock released for thread #1 here
    }
}

void DisplayThread2()
{
    while (_stopThreads == false)
    {
        // Lock on the current instance of the class for thread #2.
        lock (this)
        {
            Console.WriteLine("Display Thread 2");
            _threadOutput = "Hello Thread2";
            Thread.Sleep(1000); // simulate a lot of processing
            // Tell the user we are in thread #2.
            Console.WriteLine("Thread 2 Output --> {0}", _threadOutput);
        } // lock released for thread #2 here
    }
}
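For completeness, here is a hedged sketch of how these two methods might be wired up and started (the class name, Main method, and timings are assumptions, not from the original article). It also locks on a dedicated private object rather than on this, which is generally preferred in C# because external code can otherwise take the same lock and cause deadlocks:

```csharp
using System;
using System.Threading;

class SynchronizedDemo
{
    private string _threadOutput = "";
    private volatile bool _stopThreads = false;

    // A dedicated lock object is generally preferred over lock (this).
    private readonly object _sync = new object();

    void DisplayThread1()
    {
        while (!_stopThreads)
        {
            lock (_sync)
            {
                _threadOutput = "Hello Thread1";
                Thread.Sleep(100); // simulate processing
                Console.WriteLine("Thread 1 Output --> {0}", _threadOutput);
            }
        }
    }

    void DisplayThread2()
    {
        while (!_stopThreads)
        {
            lock (_sync)
            {
                _threadOutput = "Hello Thread2";
                Thread.Sleep(100); // simulate processing
                Console.WriteLine("Thread 2 Output --> {0}", _threadOutput);
            }
        }
    }

    static void Main()
    {
        var demo = new SynchronizedDemo();
        var t1 = new Thread(demo.DisplayThread1);
        var t2 = new Thread(demo.DisplayThread2);
        t1.Start();
        t2.Start();

        Thread.Sleep(1000);        // let the threads run for a while
        demo._stopThreads = true;  // signal both loops to exit
        t1.Join();
        t2.Join();
    }
}
```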
Now, all thread output is nicely synchronized. You always get Thread 1 Output --> Hello Thread1 and Thread 2 Output --> Hello Thread2.
Note, however, that locking comes at a price. When one thread holds a lock, any other thread that needs it must wait until the lock is released. In essence, you have slowed the program down, because while a thread is waiting to use the shared memory it isn't doing any other work. Therefore use locks sparingly: don't lock every method in your code, only those that actually touch shared memory.
Summary
With a multithreaded program, two threads can enter a piece of code at the same time and wreak havoc on the results. But when written carefully, a multithreaded program can also significantly improve the performance of an application.
Cheers!