
Java individual assignment

Q1, What is a Thread?


A thread is a single sequence of instructions and the basic unit of CPU utilization within a process. It is a
small set of instructions designed to be scheduled and executed by the CPU independently of the rest of
the process. It forms the smallest unit of execution inside a process and shares the memory space, file
descriptors, and other resources with the other threads belonging to the same process. Threads enable
parallelization of tasks, allowing a program to perform multiple tasks concurrently and thereby improving
overall efficiency and performance.

Threads can communicate and synchronize with each other using various techniques, like locks,
semaphores, and condition variables. Careful design and management of threads are necessary to avoid
issues like data races, deadlocks, and resource contention.

In summary, threads are the smallest unit of execution within a process, allowing for parallelization of tasks
and more efficient use of system resources. They share memory space and other resources with other
threads in the same process, simplifying communication and synchronization between parallel tasks.
Careful management of threads is crucial to avoid issues like data races and deadlocks.
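
As a minimal sketch (class and thread names are chosen here for illustration), a thread in Java can be
created by passing a Runnable to the Thread constructor and calling start():

// Minimal sketch: two threads running concurrently inside one process.
public class HelloThreads {
    public static void main(String[] args) throws InterruptedException {
        Runnable task = () -> System.out.println(
                Thread.currentThread().getName() + " is running");

        Thread t1 = new Thread(task, "worker-1");
        Thread t2 = new Thread(task, "worker-2");

        t1.start();   // run() executes concurrently with the main thread
        t2.start();

        t1.join();    // wait for both workers to finish
        t2.join();
    }
}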

Q2, Why Multithreading, Multitasking, Multiprocessing?


There are many reasons why we use multithreading, multitasking, and multiprocessing; some of them are:

 Improved performance: These ideas allow for the concurrent execution of numerous tasks or
threads, optimizing the utilization of available CPU and computational resources. This can
greatly enhance a system's or application's overall performance and speed.
 Resource sharing: Through multithreading, several threads within a single process can access
the same memory space, file descriptors, and other resources. This results in effective utilization
of resources and minimized memory consumption.
 Cost reduction: By making efficient use of CPU and system resources, multitasking and
multiprocessing can reduce computing costs by processing more tasks simultaneously or
executing tasks faster, reducing the need for additional hardware or computational power.
 Fault tolerance: In a multiprocessing system, if one process experiences a failure, the other
processes running on separate cores can continue their execution, providing a level of fault
tolerance.
 Responsiveness: In a multitasking environment, when a task or process is waiting for a response
from an external resource (e.g., I/O operation), another task can still be executed, ensuring
continuous responsiveness and smooth user experience.
 Scalability: Multiprocessing leverages the power of multi-core CPUs, as parallel execution of
tasks can be distributed across different CPU cores, increasing computational capacity and
speeding up execution times.


Q3, Process vs thread

A process is an instance of a program that runs independently and has its own memory space, while a
thread is a lightweight unit of execution within a process that shares the same memory space as other
threads.

A process is an independent and self-sufficient unit of execution in a system. Each process has its own
memory space, including the heap memory area and the resource files necessary for it to run. Being
completely separate from each other, processes do not share their memory and resources, which creates a
degree of isolation, preventing direct interference. The mechanism for inter-process communication is
more complicated and slower than for threads. The operating system’s task scheduler allocates CPU time
to these processes. Consequently, processes are heavyweight and demand more resources and time for
their creation and termination.

On the other hand, threads are lightweight and comprise a sequence of instructions within a process. They
are sometimes described as sub-processes that share the same memory area, including the heap, with the
parent process. Multiple threads belonging to the same process can work concurrently, allowing better
performance on multi-core and parallel processing systems. While this sharing of resources speeds up
communication between threads, it also raises potential synchronization and race-condition problems, so
developers must be cautious when designing multi-threaded code.

The main difference between processes and threads lies in their level of isolation and resource usage.
Processes are more isolated from each other, have their own address space, and require more resources to
create. Threads, on the other hand, share resources with other threads within the same process and can be
created more quickly.

Let's put it this way:

 A process takes more time to terminate, while a thread takes less time to terminate.
 A process takes more time for creation and context switching, while a thread takes less time for
creation and context switching.
 A process is less efficient in terms of communication, while a thread is more efficient in terms
of communication.
 Processes are isolated from each other, while threads share memory.
 A process is called a heavyweight unit of execution, while a thread is lightweight.
 Process switching uses an interface of the operating system, while thread switching does not
require a call into the operating system or an interrupt to the kernel.
 If one process is blocked, the execution of other processes is not affected, while if a user-level
thread is blocked, all other user-level threads of the same process are blocked.
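
To make the distinction concrete, the following Java sketch is illustrative only (the child command passed
to ProcessBuilder is an arbitrary choice): a thread can read and write this process's variables directly,
while a child process runs in its own address space and must communicate through pipes, files, or sockets.

import java.io.IOException;

public class ProcessVsThread {
    static int shared = 0;   // visible to every thread of this process

    public static void main(String[] args) throws IOException, InterruptedException {
        // A thread shares this process's memory, so it can update 'shared' directly.
        Thread t = new Thread(() -> shared++);
        t.start();
        t.join();
        System.out.println("shared after thread: " + shared);

        // A child process gets its own memory space; it cannot see 'shared'.
        Process child = new ProcessBuilder("java", "-version").inheritIO().start();
        System.out.println("child process exited with code " + child.waitFor());
    }
}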


Q4, Thread-based multitasking and multithreading


Thread-based multitasking and multithreading are crucial programming strategies for enhancing the
effectiveness, responsiveness, and concurrency of software applications. These approaches allow a
program to run several tasks concurrently, utilizing the same memory space and resources, significantly
improving the application's overall performance. This essay will explore the fundamental concepts of
thread-based multitasking and multithreading and their benefits and challenges.

Thread-based multitasking involves establishing and controlling multiple threads within a single process.
Each thread operates independently with shared memory space. A thread is the smallest execution unit
within a process and consists of a program counter, a register set, and a stack space. When multiple
threads are created within a process, they function concurrently, enabling applications to execute various
tasks simultaneously without the need for running separate processes. This method decreases the
overhead associated with process creation and termination and communication between processes.

Multithreading, on the other hand, is a specific kind of thread-based multitasking that allows multiple
threads within a single process to run in parallel on multiple processor cores. This method not only
enhances system resource usage but also improves application performance and responsiveness. Modern
operating systems and programming languages, including Windows, Linux, Java, and C++, support
multithreading, enabling developers to construct high-performance applications capable of handling
multiple tasks concurrently.

Thread-based multitasking and multithreading offer several benefits. First, these methods enable better
resource utilization since threads within a process share memory space, file handles, and other resources,
reducing overall system overhead. Second, multithreading can significantly boost application
performance, especially on multi-processor systems with multiple cores, as tasks can run concurrently.
Third, these methods can increase the responsiveness of applications as tasks can run simultaneously
without impeding other operations such as updating the user interface.

However, there are challenges associated with thread-based multitasking and multithreading. One key
concern is synchronization since multiple threads can potentially access the same memory space and
resources at the same time, leading to race conditions and inconsistent data. As a result, developers must
ensure that suitable synchronization mechanisms are implemented to prevent these problems.
Additionally, managing threads, particularly in complex applications, can be complex and necessitate
careful design and planning to avoid resource contention, deadlocks, and other potential issues.
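
As a brief sketch of thread-based multitasking (the pool size and task count are arbitrary choices for
illustration), Java's ExecutorService runs several tasks concurrently inside one process, sharing its
memory and other resources:

import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import java.util.concurrent.TimeUnit;

public class MultitaskingSketch {
    public static void main(String[] args) throws InterruptedException {
        // A fixed pool of worker threads inside this single process.
        ExecutorService pool = Executors.newFixedThreadPool(4);

        for (int i = 1; i <= 8; i++) {
            final int taskId = i;
            pool.submit(() -> System.out.println(
                    "task " + taskId + " on " + Thread.currentThread().getName()));
        }

        pool.shutdown();                              // stop accepting new tasks
        pool.awaitTermination(5, TimeUnit.SECONDS);   // wait for submitted tasks
    }
}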


Q5, Multi-threading, thread synchronization, and priorities

Multi-threading refers to the process of running multiple threads concurrently within a single
process. This allows programs to perform multiple tasks simultaneously, thereby increasing
throughput and reducing the overall execution time. For example, a multi-threaded web server
can handle multiple client requests simultaneously, thus serving multiple users at once. In multi-
threading, the operating system schedules and manages the execution of multiple threads,
ensuring that each thread gets an opportunity to run and consume the CPU resources.

Thread synchronization is a technique used to control the order in which threads access shared
resources. In concurrent programming, it is common for multiple threads to access shared data
structures or resources simultaneously. This can lead to inconsistencies and race conditions that
produce incorrect results. Synchronization ensures that only one thread accesses a shared
resource at a time, preventing such issues. Developers use various synchronization mechanisms,
such as locks, semaphores, or condition variables, to coordinate the sequence of thread execution
and protect the integrity of shared resources.

Thread priorities are a way to define the importance of different threads relative to each other.
By assigning priorities to threads, the operating system can determine which threads should be
given preference and scheduled to execute first. Higher priority threads will typically execute
before lower-priority threads, potentially leading to more responsive applications and improved
resource utilization. However, setting priorities should be approached with caution, as it can lead
to issues such as priority inversion, where a higher priority thread is blocked by a lower priority
thread, ultimately causing the overall performance to degrade. It is also crucial to note that thread
priorities' actual impact and behavior may vary depending on the underlying operating system
and its scheduling policies.
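
A minimal sketch of thread priorities in Java (thread names are chosen for illustration); as noted above,
the actual effect depends on how the JVM maps these values onto the operating system's scheduler:

public class PrioritySketch {
    public static void main(String[] args) {
        Runnable task = () -> System.out.println(
                Thread.currentThread().getName() + " finished");

        Thread low = new Thread(task, "low-priority");
        Thread high = new Thread(task, "high-priority");

        low.setPriority(Thread.MIN_PRIORITY);    // 1
        high.setPriority(Thread.MAX_PRIORITY);   // 10

        low.start();
        high.start();
        // The scheduler may prefer the high-priority thread, but this is a hint,
        // not a guarantee, and behavior varies across platforms.
    }
}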

In summary, multi-threading and thread synchronization, along with priorities, play a crucial role in
concurrent programming, as they enable efficient and controlled execution of multiple tasks.
These concepts help developers build scalable, responsive, and resource-efficient programs that
can take advantage of today's increasingly parallel processing computing architectures.


Q6, Types of synchronization (process and thread)

 Mutex (Mutual Exclusion Object): A synchronization primitive that provides exclusive access to
a shared resource. Threads must obtain the mutex before accessing the shared resource and
release it afterward. If a mutex is locked by another thread, the requesting thread will wait until
the mutex is unlocked.
 Semaphores: Synchronization objects that help manage access to a set of shared resources. They
maintain a count of available resources (e.g., memory buffers, connections). When a resource is
requested, the count is reduced, and when a resource is released, the count is increased. If there
are no resources available, requesting threads must wait for resources to be freed.
 Monitors: High-level synchronization constructs that combine mutual exclusion and condition
variables. Monitors ensure that only one thread can access shared data at a time and allow threads
to wait for specific conditions before proceeding with execution.
 Condition Variables: Synchronization mechanisms that allow threads to wait until a particular
condition is met. Threads can be notified of the condition and then continue execution. Typically
used alongside mutexes or critical sections.
 Critical Sections: Code sections that are protected by a lock, ensuring only one thread can
execute the code simultaneously. Critical sections are frequently used for short sections of code
where performance is crucial.
 Barriers: Synchronization constructs that enable multiple threads to wait at a specific execution
point until all participating threads reach that point. When all threads have reached the barrier,
they are released and can continue execution.
 Read/Write Locks: Read/Write Locks allow several threads to access shared data concurrently for
reading, while ensuring exclusive access for writing. This improves performance and reduces
contention, particularly when read operations outnumber write operations.
 Atomic operations: Operations that are completed in a single, uninterruptible step. Atomic
operations can be used for synchronization by ensuring a particular action is completed without
being interrupted by other threads, guaranteeing consistency and preventing race conditions (a
minimal sketch follows this list).
 Wait and Notify: A synchronization mechanism that allows one or more threads to wait for
specific events or conditions and be notified when they occur. Typically used in conjunction with
locks or other synchronization constructs.
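
As a minimal sketch of the atomic-operations entry above (the class name and loop bounds are arbitrary),
java.util.concurrent.atomic.AtomicInteger increments a shared counter without any explicit lock:

import java.util.concurrent.atomic.AtomicInteger;

public class AtomicSketch {
    // incrementAndGet() executes as a single, uninterruptible step,
    // so no explicit lock is needed around it.
    private static final AtomicInteger counter = new AtomicInteger(0);

    public static void main(String[] args) throws InterruptedException {
        Thread[] workers = new Thread[4];
        for (int i = 0; i < workers.length; i++) {
            workers[i] = new Thread(() -> {
                for (int j = 0; j < 1_000; j++) {
                    counter.incrementAndGet();
                }
            });
            workers[i].start();
        }
        for (Thread w : workers) {
            w.join();
        }
        System.out.println("Final value: " + counter.get());   // always 4000
    }
}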


Q7, Types of thread synchronization with sample code example


 Locks: Locks provide a simple way to synchronize threads. A lock can be held by only one
thread at a time, which ensures mutual exclusion and prevents race conditions. The example below
uses Java's intrinsic lock via a synchronized method; an explicit ReentrantLock variant is sketched
after it.

import java.io.*;

// Helper class acting as the shared resource.
// Mutual exclusion is provided by the synchronized display() method:
// only one thread at a time can hold this object's intrinsic lock.
class SharedDataPrinter {

    synchronized public void display(String str)
    {
        for (int i = 0; i < str.length(); i++) {
            System.out.print(str.charAt(i));

            // Sleep for 100 milliseconds so the interleaving would be
            // visible if the method were not synchronized.
            try {
                Thread.sleep(100);
            }
            catch (InterruptedException e) {
                Thread.currentThread().interrupt();
            }
        }
    }
}

// First helper class extending the Thread class.
class Thread1 extends Thread {

    SharedDataPrinter p;

    public Thread1(SharedDataPrinter p)
    {
        this.p = p;
    }

    // run() is invoked when start() is called in main().
    public void run()
    {
        p.display("Geeks");
    }
}

// Second helper class (similar to Thread1).
class Thread2 extends Thread {

    SharedDataPrinter p;

    public Thread2(SharedDataPrinter p) { this.p = p; }

    public void run()
    {
        p.display(" for Geeks");
    }
}

// Main class
class GFG {

    public static void main(String[] args)
    {
        // Shared resource used to print strings one character at a time.
        SharedDataPrinter printer = new SharedDataPrinter();

        // Thread objects sharing the same printer.
        Thread1 t1 = new Thread1(printer);
        Thread2 t2 = new Thread2(printer);

        // Start both threads.
        t1.start();
        t2.start();
    }
}
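
Besides intrinsic locks, Java provides explicit locks in java.util.concurrent.locks. The following sketch
(class and field names are illustrative) guards the same kind of critical section with a ReentrantLock:

import java.util.concurrent.locks.Lock;
import java.util.concurrent.locks.ReentrantLock;

class LockedCounter {

    private final Lock lock = new ReentrantLock();
    private int count = 0;

    public void increment() {
        lock.lock();            // only one thread at a time can hold the lock
        try {
            count++;            // critical section
        } finally {
            lock.unlock();      // always release, even if an exception occurs
        }
    }

    public int getCount() {
        lock.lock();
        try {
            return count;
        } finally {
            lock.unlock();
        }
    }
}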


 Semaphores: Semaphores allow a fixed number of threads to access a shared resource at the
same time. A semaphore maintains a count of available permits, which can be acquired and
released by threads.

import java.util.concurrent.Semaphore;

public class SemaphoreExample
{
    // Semaphore created with 3 permits: at most 3 threads may hold a permit at once.
    static Semaphore semaphore = new Semaphore(3);

    static class DemoThread extends Thread
    {
        String name = "";

        // Constructor of the DemoThread class.
        DemoThread(String name)
        {
            this.name = name;
        }

        public void run()
        {
            try
            {
                System.out.println("Thread " + name + " : acquiring lock...");
                System.out.println("Thread " + name + " : available Semaphore permits is: "
                        + semaphore.availablePermits());

                // acquire() takes a permit and decrements the permit count by 1.
                semaphore.acquire();
                System.out.println("Thread " + name + " : got the permit!");
                try
                {
                    for (int i = 1; i <= 5; i++)
                    {
                        System.out.println("Thread " + name + " : is performing operation " + i
                                + ", available Semaphore permits : " + semaphore.availablePermits());
                        // Sleep for 2 seconds.
                        Thread.sleep(2000);
                    }
                }
                finally
                {
                    System.out.println("Thread " + name + " : releasing lock...");
                    // release() returns the permit after successful execution.
                    semaphore.release();
                    // Print the number of permits now available.
                    System.out.println("Thread " + name + " : available Semaphore permits is: "
                            + semaphore.availablePermits());
                }
            }
            catch (InterruptedException e)
            {
                e.printStackTrace();
            }
        }
    }

    // Main method
    public static void main(String[] args)
    {
        // Print the total number of available permits.
        System.out.println("Total available Semaphore permits is: " + semaphore.availablePermits());

        // Create and start four threads named A, B, C, and D.
        DemoThread t1 = new DemoThread("A");
        t1.start();
        DemoThread t2 = new DemoThread("B");
        t2.start();
        DemoThread t3 = new DemoThread("C");
        t3.start();
        DemoThread t4 = new DemoThread("D");
        t4.start();
    }
}


 Monitors: Monitors, implemented in Java with synchronized methods or blocks together with
wait() and notify(), allow threads to enter a critical section of code one at a time and ensure
that other threads wait until the previous thread exits the critical section.
class CubbyHole {
    private int contents;
    private boolean available = false;

    public synchronized int get() {
        // The monitor (this object's lock) is acquired by the Consumer.
        while (available == false) {
            try {
                wait();   // release the monitor and wait until put() notifies
            } catch (InterruptedException e) {
            }
        }
        available = false;
        notifyAll();
        return contents;
        // The monitor is released when the method returns.
    }

    public synchronized void put(int value) {
        // The monitor (this object's lock) is acquired by the Producer.
        while (available == true) {
            try {
                wait();   // release the monitor and wait until get() notifies
            } catch (InterruptedException e) {
            }
        }
        contents = value;
        available = true;
        notifyAll();
        // The monitor is released when the method returns.
    }
}

Q8, Mutual exclusion in threads and its types with a sample code example

Mutual exclusion is a property of process synchronization which states that “no two processes
can exist in the critical section at any given point of time”. The term was first coined by Dijkstra.
Any process synchronization technique being used must satisfy the property of mutual exclusion,
without which it would not be possible to get rid of a race condition.
The need for mutual exclusion comes with concurrency. There are several kinds of concurrent
execution:
1. Interrupt handlers
2. Interleaved preemptively scheduled processes/threads
3. Multiprocessor clusters, with shared memory
4. Distributed systems
 Mutual exclusion methods are used in concurrent programming to avoid the simultaneous use
of a common resource, such as a global variable, by pieces of computer code called critical
sections. The requirement of mutual exclusion is that, when process P1 is accessing a shared
resource R1, no other process should be able to access resource R1 until process P1 has
finished its operation with resource R1.
 Examples of such resources include files, I/O devices such as printers, and shared data
structures.
 Approaches to implementing mutual exclusion:
1. Software method: Leave the responsibility with the processes themselves. These methods are
usually highly error-prone and carry high overheads.
2. Hardware method: Special-purpose machine instructions are used for accessing shared
resources. This method is faster but cannot provide a complete solution; hardware solutions
cannot guarantee the absence of deadlock and starvation.
3. Programming language method: Provide support through the operating system or through
the programming language.
Requirements of mutual exclusion:
1. At any time, only one process is allowed to enter its critical section.
2. The solution is implemented purely in software on a machine.
3. A process remains inside its critical section for a bounded time only.
4. No assumption can be made about the relative speeds of asynchronous concurrent processes.
5. A process running outside its critical section must not prevent other processes from entering
their critical sections.
6. A process must not be indefinitely postponed from entering its critical section.

Code:

import java.util.ArrayList;
import java.util.List;

// Minimal shared counter assumed by this example; increment() is synchronized
// so that only one thread at a time can update the value (mutual exclusion).
class MutableInteger {
    private int value = 0;

    public synchronized void increment() {
        value++;
    }

    public synchronized int getValue() {
        return value;
    }
}

class IncrementingRunnable implements Runnable {
    private final MutableInteger mutableInteger;

    public IncrementingRunnable(MutableInteger mutableInteger) {
        this.mutableInteger = mutableInteger;
    }

    @Override
    public void run() {
        for (int i = 0; i < 10_000; i++) {
            mutableInteger.increment();
        }
    }
}

public class Main {
    public static void main(String[] args) throws InterruptedException {
        List<Thread> threads = new ArrayList<>();

        // Variable to increment from multiple threads.
        MutableInteger integer = new MutableInteger();

        // Run 10 threads that all increment the same variable.
        for (int i = 0; i < 10; i++) {
            Thread thread = new Thread(new IncrementingRunnable(integer));
            thread.start();
            threads.add(thread);
        }

        // Wait until all threads are finished.
        for (Thread thread : threads) {
            thread.join();
        }

        // With mutual exclusion the result is always 10 * 10_000 = 100000.
        System.out.println("Result value: " + integer.getValue());
    }
}

Q9, How to achieve mutually exclusive threads (synchronized method, static synchronization and synchronized block with sample code examples)

Mutual exclusion in threading is a concept used to prevent multiple threads from simultaneously
executing a critical region of code, ensuring that only one thread can access shared resources at a time.
In Java, you can achieve mutual exclusion using synchronized methods, static synchronization, and
synchronized blocks. Here are some examples of using these techniques:

1. Synchronized Method:

You can use the synchronized keyword in a method declaration to ensure mutual exclusion.

public class Counter {

    private int count = 0;

    public synchronized void increment() {
        count++;
    }

    public synchronized void decrement() {
        count--;
    }

    public synchronized int getCount() {
        return count;
    }
}

In this example, the synchronized keyword is applied to the increment, decrement, and getCount
methods to ensure mutual exclusion.

2. Static Synchronization:

Static synchronization is used to synchronize static methods in a class. To achieve mutual exclusion for
static methods, use the synchronized keyword with the method declaration.

public class GlobalCounter {

    private static int globalCount = 0;

    public static synchronized void incrementGlobal() {
        globalCount++;
    }

    public static synchronized void decrementGlobal() {
        globalCount--;
    }

    public static synchronized int getGlobalCount() {
        return globalCount;
    }
}

In this example, the synchronized keyword is applied to the static methods incrementGlobal,
decrementGlobal, and getGlobalCount.
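
A static synchronized method acquires the lock on the Class object (here GlobalCounter.class) rather
than on an instance, so the following block form is an equivalent sketch of incrementGlobal():

public static void incrementGlobal() {
    synchronized (GlobalCounter.class) {   // class-level lock, shared by all instances
        globalCount++;
    }
}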

3. Synchronized Block:

A synchronized block is used to achieve mutual exclusion by synchronizing on an object.

public class BlockCounter {

    private int blockCount = 0;

    private final Object lock = new Object();

    public void incrementBlock() {
        synchronized (lock) {
            blockCount++;
        }
    }

    public void decrementBlock() {
        synchronized (lock) {
            blockCount--;
        }
    }

    public int getBlockCount() {
        synchronized (lock) {
            return blockCount;
        }
    }
}
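
For completeness, a small usage sketch (the thread count and loop bound are arbitrary) exercises the
Counter class from the first example; because increment() is synchronized, the final count is
deterministic:

public class MutualExclusionDemo {
    public static void main(String[] args) throws InterruptedException {
        Counter counter = new Counter();

        Runnable task = () -> {
            for (int i = 0; i < 10_000; i++) {
                counter.increment();
            }
        };

        Thread t1 = new Thread(task);
        Thread t2 = new Thread(task);
        t1.start();
        t2.start();
        t1.join();
        t2.join();

        System.out.println("Final count: " + counter.getCount());   // always 20000
    }
}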

Q10, Inter-thread communication (implemented using wait(), notify() and notifyAll() with code examples)

Inter-thread communication is the process by which two or more threads share information and
synchronize their actions. In Java, inter-thread communication can be achieved using the wait(),
notify(), and notifyAll() methods provided by the Object class.

Here is a brief explanation of these three methods:

1. wait(): This method forces the current thread to wait until another thread calls notify() or
notifyAll() on the same object (or until it reaches an optional timeout).

2. notify(): This method wakes up a single arbitrarily chosen thread that is waiting on the same
object's monitor.

3. notifyAll(): This method wakes up all threads that are waiting on the same object.

Let's consider an example to understand inter-thread communication:

class SharedResource {

    private String data;

    private boolean dataAvailable = false;

    public synchronized void put(String data) {
        while (dataAvailable) {
            try {
                wait();   // wait until the consumer has taken the previous data
            } catch (InterruptedException e) {
                e.printStackTrace();
            }
        }
        this.data = data;
        this.dataAvailable = true;
        notifyAll();      // wake up any waiting consumer
    }

    public synchronized String get() {
        while (!dataAvailable) {
            try {
                wait();   // wait until the producer has put new data
            } catch (InterruptedException e) {
                e.printStackTrace();
            }
        }
        dataAvailable = false;
        notifyAll();      // wake up any waiting producer
        return data;
    }
}

class Producer implements Runnable {

    private SharedResource sharedResource;

    public Producer(SharedResource sharedResource) {
        this.sharedResource = sharedResource;
    }

    @Override
    public void run() {
        for (int i = 0; i < 5; i++) {
            sharedResource.put("Data-" + i);
            System.out.println("Produced: Data-" + i);
            try {
                Thread.sleep(1000);
            } catch (InterruptedException e) {
                e.printStackTrace();
            }
        }
    }
}

class Consumer implements Runnable {

    private SharedResource sharedResource;

    public Consumer(SharedResource sharedResource) {
        this.sharedResource = sharedResource;
    }

    @Override
    public void run() {
        for (int i = 0; i < 5; i++) {
            String data = sharedResource.get();
            System.out.println("Consumed: " + data);
            try {
                Thread.sleep(1000);
            } catch (InterruptedException e) {
                e.printStackTrace();
            }
        }
    }
}

public class InterThreadCommunicationExample {

    public static void main(String[] args) {
        SharedResource sharedResource = new SharedResource();

        Thread producerThread = new Thread(new Producer(sharedResource));
        Thread consumerThread = new Thread(new Consumer(sharedResource));

        producerThread.start();
        consumerThread.start();
    }
}


In this example, we have created a SharedResource class that contains a data variable, a
dataAvailable flag, and two synchronized methods: put() and get(). The put() method is responsible
for adding data to the shared resource, while the get() method retrieves data from it.

We have also created a Producer class and a Consumer class, which represent two threads that
interact through the shared resource. The Producer adds data to the shared resource while the
consumer retrieves and processes it.

In the main method, we create a shared resource object, a Producer thread and a Consumer thread.
After starting both threads, they will use the wait() and notifyAll() methods to synchronize their
access to the shared resource. This ensures that the producer waits if the consumer has not
consumed the previous data, and the consumer waits if there is no new data available.

As a result, the producer and consumer threads are able to communicate with each other and
coordinate their actions to ensure smooth access to the shared resource.

To summarize, inter-thread communication in Java is achieved through the following steps:

1. Define a shared resource that contains the data and methods for accessing that data in a
synchronized manner.

2. Use the wait() method in the methods of the shared resource to make the thread wait if the
required conditions are not met (e.g., data availability).

3. Use the notify() or notifyAll() methods in the methods of the shared resource to wake up the
threads that are waiting on the same object when the required conditions are met.

4. Create threads that interact with the shared resource.

5. Start the threads and observe the inter-thread communication through the correct execution of
their actions.


In this way, inter-thread communication enables different threads to work together efficiently in a
coordinated manner while accessing shared resources, enhancing the performance of the program
and ensuring correct execution.

Summary

 A thread is a lightweight unit of execution within a process, allowing multiple tasks to be
executed concurrently. It makes efficient use of CPU resources and enables faster and more
responsive applications.

 Multithreading, multitasking, and multiprocessing improve application performance and
efficiency. Multithreading allows parallel execution within a single process, while
multitasking manages multiple threads or processes simultaneously. Multiprocessing
utilizes multiple CPUs or cores for executing tasks in parallel.

 A process is an independent execution environment, containing its own memory space and
resources. A thread is a part of the process that runs concurrently with other threads,
sharing the process's resources but maintaining its own stack and program counter.

 Thread-based multitasking and multithreading involve managing multiple threads within a
single process. This leads to efficient execution by sharing resources while keeping
individual threads lightweight.

 Thread synchronization ensures that multiple threads execute in an ordered and
predictable way. Thread priorities can be set to manage the order in which threads execute.

 Synchronization can be of two types: process synchronization and thread synchronization.
Process synchronization coordinates activities between different processes, while thread
synchronization focuses on coordinating activities within the same process.

 For thread synchronization, various techniques are employed, such as locks, semaphores,
and monitors. Code examples showcasing these techniques can be found in many
programming languages, such as Java and C++.

 Mutual exclusion ensures that only one thread can access a shared resource at a time. This
can be achieved using different methods like synchronized methods, static synchronization,
and synchronized blocks with code examples in languages like Java.

 Inter-thread communication is crucial for proper thread synchronization. It can be
implemented using methods like wait(), notify(), and notifyAll(). These methods allow
threads to coordinate with each other, ensuring resources are accessed only when they're
available, all while avoiding deadlock situations. Code examples can be found in languages
like Java, demonstrating how these methods can be effectively implemented in a
multi-threaded application.
