Threads are the smallest unit of execution within a process, allowing tasks to be parallelized and system resources to be used more efficiently. Threads share memory space and other resources with the other threads in the same process, which simplifies communication and synchronization between parallel tasks. They can communicate and synchronize with each other using various techniques, such as locks, semaphores, and condition variables. Careful design and management of threads are necessary to avoid issues such as data races, deadlocks, and resource contention.
Improved performance: Multitasking and multithreading allow numerous tasks or
threads to execute concurrently, optimizing the utilization of available CPU and computational
resources. This can greatly enhance a system's or application's overall performance and speed.
Resource sharing: Through multithreading, several threads within a single process can access
the same memory space, file descriptors, and other resources. This results in effective utilization
of resources and minimized memory consumption.
Cost reduction: By making efficient use of CPU and system resources, multitasking and
multiprocessing can reduce computing costs by processing more tasks simultaneously or
executing tasks faster, reducing the need for additional hardware or computational power.
Fault tolerance: In a multiprocessing system, if one process experiences a failure, the other
processes running on separate cores can continue their execution, providing a level of fault
tolerance.
Responsiveness: In a multitasking environment, when a task or process is waiting for a response
from an external resource (e.g., I/O operation), another task can still be executed, ensuring
continuous responsiveness and smooth user experience.
Scalability: Multiprocessing leverages the power of multi-core CPUs, as parallel execution of
tasks can be distributed across different CPU cores, increasing computational capacity and
speeding up execution times.
Java individual assignment
A process is an instance of a program that runs independently and has its own memory space, while a
thread is a lightweight unit of execution within a process that shares the same memory space as other
threads.
A process is an independent and self-sufficient unit of execution in a system. Each process has its own
memory space, including the heap memory area and the resource files necessary for it to run. Being
completely separate from each other, processes do not share their memory and resources, which creates a
degree of isolation, preventing direct interference. The mechanism for inter-process communication is
more complicated and slower than for threads. The operating system’s task scheduler allocates CPU time
to these processes. Consequently, processes are heavyweight and demand more resources and time for
their creation and termination.
On the other hand, threads are lightweight and comprise a sequence of instructions within a process. They
are also known as sub-processes that share the same memory area, including the heap memory along with
the original process. Multiple threads belonging to the same process can work concurrently, allowing
better performance on multi-core and parallel processing systems. While this sharing of resources speeds
up communication between threads, it also raises potential synchronization and race-condition problems.
Developers must be cautious when designing multi-threaded code.
The main difference between processes and threads lies in their level of isolation and resource usage.
Processes are more isolated from each other, have their own address space, and require more resources to
create. Threads, on the other hand, share resources with other threads within the same process and can be
created more quickly.
A process takes more time to terminate, while a thread takes less time to terminate.
A process takes more time for creation and context switching, while a thread takes less time for
creation and context switching.
A process is less efficient in terms of communication, while a thread is more efficient in terms
of communication.
Processes are isolated, while threads share memory.
A process is called a heavyweight process, while a thread is a lightweight process.
Process switching uses an interface in the operating system, while thread switching does not
require a call into the operating system or an interrupt to the kernel.
If one process is blocked, it does not affect the execution of other processes,
while if a user-level thread is blocked, all other user-level threads of that process are blocked.
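The memory-sharing half of this comparison can be sketched directly in Java (the class and variable names below are illustrative, not from the assignment): threads started inside one process all see the same heap objects, whereas two separate processes would each have their own copy.

```java
public class SharedHeapDemo {
    // A heap array visible to every thread of this process
    static int[] shared = new int[2];

    // Two threads write to the same heap object; join() makes the
    // writes visible to the caller afterwards
    static int demo() {
        Thread a = new Thread(() -> shared[0] = 1);
        Thread b = new Thread(() -> shared[1] = 2);
        a.start();
        b.start();
        try {
            a.join();
            b.join();
        } catch (InterruptedException e) {
            Thread.currentThread().interrupt();
        }
        return shared[0] + shared[1];
    }

    public static void main(String[] args) {
        System.out.println(demo()); // prints 3
    }
}
```

Both writes land in the same array because the threads share the process's memory; a second process could only exchange these values through slower inter-process communication.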
Thread-based multitasking involves establishing and controlling multiple threads within a single process.
Each thread operates independently with shared memory space. A thread is the smallest execution unit
within a process and consists of a program counter, a register set, and a stack space. When multiple
threads are created within a process, they function concurrently, enabling applications to execute various
tasks simultaneously without the need for running separate processes. This method decreases the
overhead associated with process creation and termination and communication between processes.
Multithreading, on the other hand, is a specific kind of thread-based multitasking that allows multiple
threads within a single process to run in parallel on multiple processor cores. This method not only
enhances system resource usage but also improves application performance and responsiveness. Modern
operating systems and programming languages, including Windows, Linux, Java, and C++, support
multithreading, enabling developers to construct high-performance applications capable of handling
multiple tasks concurrently.
Thread-based multitasking and multithreading offer several benefits. First, these methods enable better
resource utilization since threads within a process share memory space, file handles, and other resources,
reducing overall system overhead. Second, multithreading can significantly boost application
performance, especially on multi-processor systems with multiple cores, as tasks can run concurrently.
Third, these methods can increase the responsiveness of applications as tasks can run simultaneously
without impeding other operations such as updating the user interface.
However, there are challenges associated with thread-based multitasking and multithreading. One key
concern is synchronization since multiple threads can potentially access the same memory space and
resources at the same time, leading to race conditions and inconsistent data. As a result, developers must
ensure that suitable synchronization mechanisms are implemented to prevent these problems.
Additionally, managing threads, particularly in complex applications, can be complex and necessitate
careful design and planning to avoid resource contention, deadlocks, and other potential issues.
Multi-threading refers to the process of running multiple threads concurrently within a single
process. This allows programs to perform multiple tasks simultaneously, thereby increasing
throughput and reducing the overall execution time. For example, a multi-threaded web server
can handle multiple client requests simultaneously, thus serving multiple users at once. In multi-
threading, the operating system schedules and manages the execution of multiple threads,
ensuring that each thread gets an opportunity to run and consume the CPU resources.
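As a minimal sketch of this idea (the names RequestPoolDemo and handleAll are illustrative, not part of any real server), a fixed pool of worker threads from java.util.concurrent can service many simulated requests concurrently:

```java
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import java.util.concurrent.TimeUnit;
import java.util.concurrent.atomic.AtomicInteger;

public class RequestPoolDemo {
    // Counts simulated "requests"; a real server would do I/O instead
    static int handleAll(int requests, int workers) {
        AtomicInteger handled = new AtomicInteger();
        ExecutorService pool = Executors.newFixedThreadPool(workers);
        for (int i = 0; i < requests; i++) {
            // each simulated request runs on one of the worker threads
            pool.submit(handled::incrementAndGet);
        }
        pool.shutdown();
        try {
            pool.awaitTermination(10, TimeUnit.SECONDS);
        } catch (InterruptedException e) {
            Thread.currentThread().interrupt();
        }
        return handled.get();
    }

    public static void main(String[] args) {
        // 4 worker threads service 20 requests concurrently
        System.out.println("Handled: " + handleAll(20, 4)); // prints Handled: 20
    }
}
```

The pool, not the application, schedules which worker picks up each task, mirroring how the operating system schedules raw threads.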
Thread synchronization is a technique used to control the order in which threads access shared
resources. In concurrent programming, it is common for multiple threads to access shared data
structures or resources simultaneously. This can lead to inconsistencies and race conditions that
produce incorrect results. Synchronization ensures that only one thread accesses a shared
resource at a time, preventing such issues. Developers use various synchronization mechanisms,
such as locks, semaphores, or condition variables, to coordinate the sequence of thread execution
and protect the integrity of shared resources.
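One concrete way to apply such a lock in Java is java.util.concurrent.locks.ReentrantLock; the sketch below (class name illustrative) guards a shared counter so that concurrent increments are never lost:

```java
import java.util.concurrent.locks.ReentrantLock;

public class LockCounter {
    private final ReentrantLock lock = new ReentrantLock();
    private int count = 0;

    public void increment() {
        lock.lock();       // only one thread may hold the lock at a time
        try {
            count++;       // the protected critical section
        } finally {
            lock.unlock(); // always release, even if an exception occurs
        }
    }

    public int getCount() {
        lock.lock();
        try {
            return count;
        } finally {
            lock.unlock();
        }
    }

    // Run several threads against one counter and return the final value
    public static int runThreads(int threads, int perThread) {
        LockCounter c = new LockCounter();
        Thread[] ts = new Thread[threads];
        for (int i = 0; i < ts.length; i++) {
            ts[i] = new Thread(() -> {
                for (int j = 0; j < perThread; j++) c.increment();
            });
            ts[i].start();
        }
        try {
            for (Thread t : ts) t.join();
        } catch (InterruptedException e) {
            Thread.currentThread().interrupt();
        }
        return c.getCount();
    }

    public static void main(String[] args) {
        System.out.println(runThreads(4, 1000)); // prints 4000
    }
}
```

The try/finally pattern around unlock() is the standard idiom: it guarantees the lock is released even when the critical section throws.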
Thread priorities are a way to define the importance of different threads relative to each other.
By assigning priorities to threads, the operating system can determine which threads should be
given preference and scheduled to execute first. Higher priority threads will typically execute
before lower-priority threads, potentially leading to more responsive applications and improved
resource utilization. However, setting priorities should be approached with caution, as it can lead
to issues such as priority inversion, where a higher priority thread is blocked by a lower priority
thread, ultimately causing the overall performance to degrade. It is also crucial to note that thread
priorities' actual impact and behavior may vary depending on the underlying operating system
and its scheduling policies.
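In Java, priorities are set with Thread.setPriority using constants from MIN_PRIORITY (1) to MAX_PRIORITY (10); the sketch below (names illustrative) shows the mechanics, but the priority is only a hint to the scheduler and guarantees nothing about actual ordering:

```java
public class PriorityDemo {
    public static void main(String[] args) throws InterruptedException {
        Thread low = new Thread(() -> System.out.println("low-priority task"));
        Thread high = new Thread(() -> System.out.println("high-priority task"));

        low.setPriority(Thread.MIN_PRIORITY);   // 1
        high.setPriority(Thread.MAX_PRIORITY);  // 10

        // The actual execution order remains platform-dependent;
        // a higher priority does not guarantee running first.
        high.start();
        low.start();
        high.join();
        low.join();
    }
}
```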
In summary, multi-threading and thread synchronization, along with priorities, play a crucial role in
concurrent programming, as they enable efficient and controlled execution of multiple tasks.
These concepts help developers build scalable, responsive, and resource-efficient programs that
can take advantage of today's increasingly parallel processing computing architectures.
Mutex (Mutual Exclusion Object): A synchronization primitive that provides exclusive access to
a shared resource. Threads must obtain the mutex before accessing the shared resource and
release it afterward. If a mutex is locked by another thread, the requesting thread will wait until
the mutex is unlocked.
Semaphores: Synchronization objects that help manage access to a set of shared resources. They
maintain a count of available resources (e.g., memory buffers, connections). When a resource is
requested, the count is reduced, and when a resource is released, the count is increased. If there
are no resources available, requesting threads must wait for resources to be freed.
Monitors: High-level synchronization constructs that combine mutual exclusion and condition
variables. Monitors ensure that only one thread can access shared data at a time and allow threads
to wait for specific conditions before proceeding with execution.
Condition Variables: Synchronization mechanisms that allow threads to wait until a particular
condition is met. Threads can be notified of the condition and then continue execution. Typically
used alongside mutexes or critical sections.
Critical Sections: Code sections that are protected by a lock, ensuring that only one thread can
execute the code at a time. Critical sections are frequently used for short sections of code
where performance is crucial.
Barriers: Synchronization constructs that enable multiple threads to wait at a specific execution
point until all participating threads reach that point. When all threads have reached the barrier,
they are released and can continue execution.
Read/Write Locks: Read/Write Locks allow several threads to access shared data concurrently for
reading, while ensuring exclusive access for writing. This improves performance and reduces
contention, particularly when read operations outnumber write operations.
Atomic operations: Operations that are completed in a single, uninterruptible step. Atomic
operations can be used for synchronization by ensuring a particular action is completed without
being interrupted by other threads, guaranteeing consistency and preventing race conditions.
Wait and Notify: A synchronization mechanism that allows one or more threads to wait for
specific events or conditions and be notified when they occur. Typically used in conjunction with
locks or other synchronization constructs.
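As a minimal sketch of the atomic-operations entry above (class and method names illustrative), java.util.concurrent.atomic.AtomicInteger makes a shared counter safe without any explicit lock:

```java
import java.util.concurrent.atomic.AtomicInteger;

public class AtomicDemo {
    // incrementAndGet() performs the read-modify-write as one
    // uninterruptible step, so no lock is needed for a plain counter
    static int countWith(int threads, int perThread) {
        AtomicInteger counter = new AtomicInteger(0);
        Thread[] ts = new Thread[threads];
        for (int i = 0; i < ts.length; i++) {
            ts[i] = new Thread(() -> {
                for (int j = 0; j < perThread; j++) counter.incrementAndGet();
            });
            ts[i].start();
        }
        try {
            for (Thread t : ts) t.join();
        } catch (InterruptedException e) {
            Thread.currentThread().interrupt();
        }
        return counter.get();
    }

    public static void main(String[] args) {
        System.out.println(countWith(8, 1000)); // prints 8000
    }
}
```

With a plain int and `counter++` the same program could lose updates, because the increment would be three separate steps that other threads can interleave with.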
import java.io.*;

// Class 1: helper class with a synchronized method
class SharedDataPrinter {
    // Method (synchronized): only one thread at a time may run it
    synchronized public void display(String str)
    {
        // Body reconstructed (the original listing was truncated):
        // print one character at a time so that interleaving would be
        // visible if the method were not synchronized
        for (int i = 0; i < str.length(); i++) {
            System.out.print(str.charAt(i));
            try { Thread.sleep(100); } catch (InterruptedException e) { }
        }
    }
}

// Class 2: helper class extending the Thread class
class Thread1 extends Thread {
    SharedDataPrinter p;

    public Thread1(SharedDataPrinter p) { this.p = p; }

    public void run()
    {
        // Print statement
        p.display("Geeks");
    }
}

class Thread2 extends Thread {
    SharedDataPrinter p;

    public Thread2(SharedDataPrinter p) { this.p = p; }

    public void run()
    {
        // Print statement
        p.display(" for Geeks");
    }
}

// Class 3: main class (reconstructed, as the original listing broke off here)
class GFG {
    public static void main(String[] args)
    {
        SharedDataPrinter printer = new SharedDataPrinter();

        // Both threads share the same printer object
        Thread1 t1 = new Thread1(printer);
        Thread2 t2 = new Thread2(printer);
        t1.start();
        t2.start();
    }
}
Semaphores: Semaphores allow a fixed number of threads to access a shared resource at the
same time. A semaphore maintains a count of available permits, which can be acquired and
released by threads.
import java.util.concurrent.Semaphore;

public class SemaphoreExample
{
    // creating the Semaphore with an initial permit count of 3
    static Semaphore semaphore = new Semaphore(3);

    static class DemoThread extends Thread
    {
        String name = "";

        // constructor of the DemoThread class
        DemoThread(String name)
        {
            this.name = name;
        }

        public void run()
        {
            try
            {
                System.out.println("Thread " + name + " : acquiring lock...");
                System.out.println("Thread " + name + " : available Semaphore permits: " + semaphore.availablePermits());

                // acquire a permit; the permit count is decremented by 1
                semaphore.acquire();
                System.out.println("Thread " + name + " : got the permit!");
                try
                {
                    for (int i = 1; i <= 5; i++)
                    {
                        System.out.println("Thread " + name + " : is performing operation " + i + ", available Semaphore permits: " + semaphore.availablePermits());
                        // sleep for 2 seconds
                        Thread.sleep(2000);
                    }
                }
                finally
                {
                    // release the permit; the permit count is incremented by 1
                    // (this part and the main method were truncated in the original listing)
                    System.out.println("Thread " + name + " : releasing lock...");
                    semaphore.release();
                }
            }
            catch (InterruptedException e)
            {
                e.printStackTrace();
            }
        }
    }

    public static void main(String[] args)
    {
        // start more threads than there are permits, so some must wait
        for (char c = 'A'; c <= 'D'; c++)
        {
            new DemoThread(String.valueOf(c)).start();
        }
    }
}
Monitors: Monitors, also known as synchronized blocks, allow threads to enter a critical
section of code one at a time and ensure that other threads wait until the previous thread
exits the critical section.
class CubbyHole {
    private int contents;
    private boolean available = false;

    public synchronized int get() {
        // the monitor is acquired by the Consumer
        while (available == false) {
            try {
                wait();
            } catch (InterruptedException e) {
            }
        }
        available = false;
        notifyAll();
        // the monitor is released by the Consumer
        return contents;
    }

    public synchronized void put(int value) {
        // the monitor is acquired by the Producer
        while (available == true) {
            try {
                wait();
            } catch (InterruptedException e) {
            }
        }
        contents = value;
        available = true;
        notifyAll();
        // the monitor is released by the Producer
    }
}
Q8. Mutual exclusion in threads and its types, with sample code examples
Mutual exclusion is a property of process synchronization which states that “no two processes
can exist in the critical section at any given point of time”. The term was first coined by Dijkstra.
Any process synchronization technique being used must satisfy the property of mutual exclusion,
without which it would not be possible to get rid of a race condition.
The need for mutual exclusion comes with concurrency. There are several kinds of concurrent
execution:
1. Interrupt handlers
2. Interleaved preemptively scheduled processes/threads
3. Multiprocessor clusters, with shared memory
4. Distributed systems
Mutual exclusion methods are used in concurrent programming to avoid the simultaneous use
of a common resource, such as a global variable, by pieces of computer code called critical
sections. The requirement of mutual exclusion is that when process P1 is accessing a shared
resource R1, no other process should be able to access R1 until P1 has
finished its operation on it.
Examples of such resources include files, I/O devices such as printers and shared data
structures.
Approaches to implementing mutual exclusion:
1. Software method: Leave the responsibility with the processes themselves. These methods are
usually highly error-prone and carry high overheads.
2. Hardware method: Special-purpose machine instructions are used for accessing shared
resources. This method is faster but cannot provide a complete solution: hardware alone cannot
guarantee the absence of deadlock and starvation.
3. Programming language method: Provide support through the operating system or through
the programming language.
Requirements of mutual exclusion:
1. At any time, only one process is allowed to enter in its critical section.
2. Solution is implemented purely in software on a machine.
3. A process remains inside its critical section for a bounded time only.
4. No assumption can be made about relative speeds of asynchronous concurrent processes.
5. A process cannot prevent any other process from entering its critical section.
6. A process must not be indefinitely postponed from entering its critical section.
Code

import java.util.ArrayList;
import java.util.List;

// MutableInteger was missing from the original listing; a minimal
// version is supplied here. Its increment() is not synchronized, so
// concurrent increments are subject to a race condition.
class MutableInteger {
    private int value;

    public void increment() {
        value++; // read-modify-write: not atomic
    }

    public int getValue() {
        return value;
    }
}

class IncrementingRunnable implements Runnable {
    private final MutableInteger mutableInteger;

    public IncrementingRunnable(MutableInteger mutableInteger) {
        this.mutableInteger = mutableInteger;
    }

    @Override
    public void run() {
        for (int i = 0; i < 10_000; i++) {
            mutableInteger.increment();
        }
    }
}

public class Main {
    public static void main(String[] args) throws InterruptedException {
        List<Thread> threads = new ArrayList<>();
        // Variable to increment from multiple threads
        MutableInteger integer = new MutableInteger();
        // Run 10 threads to increment the same variable
        for (int i = 0; i < 10; i++) {
            Thread thread = new Thread(new IncrementingRunnable(integer));
            thread.start();
            threads.add(thread);
        }
        // Wait until all threads are finished
        for (Thread thread : threads) {
            thread.join();
        }
        // Without mutual exclusion the result may be less than 100000
        System.out.println("Result value: " + integer.getValue());
    }
}
1. Synchronized Method:
You can use the synchronized keyword in a method declaration to ensure mutual exclusion.

// class skeleton restored from the description below; the original
// listing kept only the method bodies, and the class name is assumed
class Counter {
    private int count = 0;

    public synchronized void increment() {
        count++;
    }

    public synchronized void decrement() {
        count--;
    }

    public synchronized int getCount() {
        return count;
    }
}

In this example, the synchronized keyword is applied to the increment, decrement, and getCount
methods to ensure mutual exclusion.
2. Static Synchronization:
Static synchronization is used to synchronize static methods in a class. To achieve mutual exclusion for
static methods, use the synchronized keyword with the method declaration.

// class skeleton restored from the description below; the original
// listing kept only the method bodies, and the class name is assumed
class GlobalCounter {
    private static int globalCount = 0;

    public static synchronized void incrementGlobal() {
        globalCount++;
    }

    public static synchronized void decrementGlobal() {
        globalCount--;
    }

    public static synchronized int getGlobalCount() {
        return globalCount;
    }
}

In this example, the synchronized keyword is applied to the static methods incrementGlobal,
decrementGlobal, and getGlobalCount.
3. Synchronized Block:
A synchronized block takes an explicit lock object, so only the critical section is guarded rather
than the whole method.

// class skeleton restored around the surviving synchronized blocks;
// the class and method names are assumed
class BlockCounter {
    private int blockCount = 0;
    private final Object lock = new Object();

    public void increment() {
        synchronized (lock) {
            blockCount++;
        }
    }

    public void decrement() {
        synchronized (lock) {
            blockCount--;
        }
    }

    public int getCount() {
        synchronized (lock) {
            return blockCount;
        }
    }
}
1. wait(): This method forces the current thread to wait until another thread calls notify() or
notifyAll() on the same object (or until it reaches an optional timeout).
2. notify(): This method wakes up a single, arbitrarily chosen thread that is waiting on the same
object.
3. notifyAll(): This method wakes up all threads that are waiting on the same object.
class SharedResource {
    private String data;
    private boolean dataAvailable = false;

    public synchronized void put(String data) {
        while (dataAvailable) {
            try {
                wait();
            } catch (InterruptedException e) {
                e.printStackTrace();
            }
        }
        this.data = data;
        this.dataAvailable = true;
        notifyAll();
    }

    public synchronized String get() {
        while (!dataAvailable) {
            try {
                wait();
            } catch (InterruptedException e) {
                e.printStackTrace();
            }
        }
        dataAvailable = false;
        notifyAll();
        return data;
    }
}

class Producer extends Thread {
    private final SharedResource sharedResource;

    public Producer(SharedResource sharedResource) {
        this.sharedResource = sharedResource;
    }

    @Override
    public void run() {
        for (int i = 1; i <= 5; i++) {
            sharedResource.put("Data-" + i);
            try {
                Thread.sleep(1000);
            } catch (InterruptedException e) {
                e.printStackTrace();
            }
        }
    }
}

class Consumer extends Thread {
    private final SharedResource sharedResource;

    public Consumer(SharedResource sharedResource) {
        this.sharedResource = sharedResource;
    }

    @Override
    public void run() {
        for (int i = 1; i <= 5; i++) {
            System.out.println("Consumed: " + sharedResource.get());
            try {
                Thread.sleep(1000);
            } catch (InterruptedException e) {
                e.printStackTrace();
            }
        }
    }
}

// main class (name assumed; the original listing was fragmentary)
public class InterThreadCommunicationExample {
    public static void main(String[] args) {
        SharedResource sharedResource = new SharedResource();
        Thread producerThread = new Producer(sharedResource);
        Thread consumerThread = new Consumer(sharedResource);
        producerThread.start();
        consumerThread.start();
    }
}
In this example, we have created a SharedResource class that contains a data variable, a
dataAvailable flag, and two synchronized methods: put() and get(). The put() method is responsible
for adding data to the shared resource, while the get() method retrieves data from it.
We have also created a Producer class and a Consumer class, which represent two threads that
interact through the shared resource. The Producer adds data to the shared resource while the
consumer retrieves and processes it.
In the main method, we create a shared resource object, a Producer thread and a Consumer thread.
After starting both threads, they will use the wait() and notifyAll() methods to synchronize their
access to the shared resource. This ensures that the producer waits if the consumer has not
consumed the previous data, and the consumer waits if there is no new data available.
As a result, the producer and consumer threads are able to communicate with each other and
coordinate their actions to ensure smooth access to the shared resource.
1. Define a shared resource that contains the data and methods for accessing that data in a
synchronized manner.
2. Use the wait() method in the methods of the shared resource to make a thread wait if the
required conditions are not met (e.g., data availability).
3. Use the notify() or notifyAll() methods in the methods of the shared resource to wake up the
threads that are waiting on the same object when the required conditions are met.
4. Create the threads (for example, a producer and a consumer) that operate on the shared
resource.
5. Start the threads and observe the inter-thread communication through the correct execution of
their actions.
In this way, inter-thread communication enables different threads to work together efficiently in a
coordinated manner while accessing shared resources, enhancing the performance of the program
and ensuring correct execution.
Summary
A process is an independent execution environment, containing its own memory space and
resources. A thread is a part of the process that runs concurrently with other threads,
sharing the process' resources but maintaining its own stack and program counter.
For thread synchronization, various techniques are employed, such as locks, semaphores,
and monitors. Code examples showcasing these techniques can be found in many
programming languages, such as Java and C++.
Mutual exclusion ensures that only one thread can access a shared resource at a time. This
can be achieved using different methods like synchronized methods, static synchronization,
and synchronized blocks with code examples in languages like Java.