Java Concurrency (Pseudo-Parallelism): Part II

Kumar Shivam
7 min read · Apr 4, 2020

Before discussing the multi-threading way of achieving concurrency, let’s understand what a thread is:-

· What is a thread?

“Thread is a light-weight process”

As we know, a process is heavy-weight because the OS has to do a lot of background work to create one: allocating buffers and maintaining metadata (process ID, environment, working directory, registers, stack, heap, file descriptors, shared-library details, and inter-process communication structures such as pipes, semaphores, queues and shared memory) at different levels, and continually notifying the scheduler and the memory manager, because every process has its own memory space. Processes do not share their memory spaces, which avoids data inconsistency and memory corruption issues. Hence, creating a new process is slow, expensive and heavyweight. Threads, on the other hand, are created inside a process.

So we can say that “a process is a container of execution contexts, and a thread inside that process is an execution context”.

That being said, a process can have multiple threads irrespective of the processor (i.e. single-core or multi-core).

All the threads inside a process share the process’s memory and can communicate state or data among themselves. Each thread also has its own metadata (stack, registers, scheduler attributes, policies, thread state). Threads exist at both the user level and the kernel level. In the “many-to-many” threading model, a batch of user threads is multiplexed onto an equal (or smaller) number of kernel threads, so a blocking kernel call does not halt the whole process and the application does not stop responding. Threads also minimize context-switching time. Considering these properties, we can call a thread a light-weight process.

Creating a limited number of threads keeps the application responsive and lets us manage and utilize resources efficiently. On the contrary, if we create too many threads, we take a performance hit.

Multi-threading in Java:-

The application running in a multi-core, multi-threaded environment

Multithreading is a Java feature that allows concurrent execution of two or more parts of a program, where each part can handle a different task at the same time, making optimal use of the available resources, especially when the computer has multiple CPUs. Multi-threading extends the idea of multitasking into a single application: we can sub-divide specific tasks within an application into individual threads, and each of those threads can run in parallel. The OS divides processing time among the different applications and their threads.

Thread Lifecycle:-

A thread can be created by using two mechanisms:-

· Extending the “Thread” class

· Implementing the “Runnable” interface

Thread creation by extending the “Thread” class

EX:
public class HelloThread extends Thread {
    // run() contains the code executed by the new thread.
    @Override
    public void run() {
        System.out.println("Hello from a thread!");
    }

    public static void main(String[] args) {
        // start() spawns a new thread, which then calls run().
        (new HelloThread()).start();
    }
}

The Thread class extends the Object class and implements the Runnable interface.

Commonly used constructors of the Thread class:-

· Thread()

· Thread(String name)

· Thread(Runnable r)

· Thread(Runnable r, String name)
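
A minimal sketch showing these constructors in use (the thread names and the lambda body are illustrative choices, not from the original article):

public class ConstructorDemo {
    public static void main(String[] args) {
        Runnable task = () -> System.out.println(Thread.currentThread().getName());

        Thread t1 = new Thread();                  // Thread()
        Thread t2 = new Thread("worker-2");        // Thread(String name)
        Thread t3 = new Thread(task);              // Thread(Runnable r)
        Thread t4 = new Thread(task, "worker-4");  // Thread(Runnable r, String name)

        // t1 and t2 are created but not started in this sketch.
        t3.start();  // prints an auto-generated name such as "Thread-1"
        t4.start();  // prints "worker-4"
    }
}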

Thread creation by implementing the “Runnable” Interface

public class SampleClass implements Runnable {
    // run() contains the code executed by the new thread.
    @Override
    public void run() {
        System.out.println("Thread has ended");
    }

    public static void main(String[] args) {
        SampleClass ex = new SampleClass();
        // Pass the Runnable to a Thread and start it.
        Thread t1 = new Thread(ex);
        t1.start();
        System.out.println("Hello");
    }
}
Possible O/p (the order depends on thread scheduling):- Hello
Thread has ended

How does the JVM keep track of each thread’s execution?

The JVM provides a method-call stack to every thread. As bytecodes execute, the method-call stack keeps track of the local variables and the parameters the JVM passes to each method, as well as the method’s return value.

What is happening internally?

When a Java application starts, the main method is invoked and the JVM allocates one thread to run the application. All the code executes on that thread until it runs thread-creation code (i.e. Thread threadObj = new Thread(), or a Thread built from a Runnable) and starts it: the JVM then spawns a new thread and attaches that method call or code to it. Each thread waits for its turn on the processor (CPU) and has its own local memory area (a stack, with a stack frame per method call) to store results temporarily. Once a thread completes its execution, the garbage collector flushes the de-referenced objects from the heap.

The heap is the memory area shared by all threads. The stack is the private memory area of each running thread.

What will happen if we keep on creating threads?

Since threads are managed by the OS, thread creation is always a costly operation. Each thread consumes a non-trivial amount of memory for its stack. Gradually this leads to resource starvation and degrades the performance of the application.

How can we fix the above problem?

To mitigate this problem to a certain extent, Java supports the “thread pool” concept.

How does Thread pool work?

Thread pool

A thread pool is a collection of pre-initialized threads. It allows N tasks to be executed by the same small set of threads. Whenever a task needs to execute, it first waits in the task queue. A thread from the pool is then assigned to the task and stays with it for the span of its execution. Once the execution completes, the thread is detached from the task and returned to the pool. Threads remain in the pool even when there are no tasks. A watcher keeps monitoring the queue for new tasks; as soon as a task reaches the queue, a pool thread picks it up and executes it.
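
For illustration, here is a minimal thread-pool sketch using java.util.concurrent.ExecutorService (the pool size of 4 and the print-only tasks are arbitrary choices for the example):

import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import java.util.concurrent.TimeUnit;

public class ThreadPoolDemo {
    public static void main(String[] args) throws InterruptedException {
        // A fixed pool of 4 pre-initialized worker threads.
        ExecutorService pool = Executors.newFixedThreadPool(4);

        // Submit 10 tasks; they wait in the pool's queue until a worker is free.
        for (int i = 0; i < 10; i++) {
            final int taskId = i;
            pool.submit(() ->
                System.out.println("Task " + taskId + " on " + Thread.currentThread().getName()));
        }

        // Stop accepting new tasks and wait for the submitted ones to finish.
        pool.shutdown();
        pool.awaitTermination(1, TimeUnit.MINUTES);
    }
}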

Challenges of using Thread pools:-

1. Deadlock:- Executing threads may end up waiting for results from tasks that are still sitting in the queue, while those tasks cannot run because no free threads are available.

2. Thread Leakage:- This occurs if a thread is removed from the pool to execute a task but is not returned to the pool when the task completes, for example because an exception was thrown.

3. Resource Thrashing:- If the thread pool size is very large, then time is wasted in context switching between threads.

Why do we need Synchronization?

In a multi-core, multi-threaded environment, multiple tasks are executed by multiple threads running in parallel. In such an environment, threads must be synchronized when they access a shared object concurrently, to avoid state corruption or other unexpected behaviour. Synchronization is only required when the shared object is mutable; if shared objects are read-only or immutable, synchronization is not required. The JVM ensures that Java synchronized code is executed by only one thread at a time.

EX:-
public class Singleton {
    private static volatile Singleton _instance;

    public static Singleton getInstance() {
        if (_instance == null) {              // first check, without taking the lock
            synchronized (Singleton.class) {
                if (_instance == null) {      // second check, with the lock held
                    _instance = new Singleton();
                }
            }
        }
        return _instance;
    }
}
//Note:- If the object used as the lock of a synchronized block is null, the block throws a “NullPointerException”.

Challenges in concurrency:-

1) Race condition:- It occurs within the critical section of our code when two or more threads access shared data and try to change it at the same time.

Solution:- This can be avoided with proper thread synchronization within critical sections, using techniques such as the following (see the sketch after this list):-

a) Lock

b) Atomic variables

c) Message passing
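
As a quick illustration of option (b), here is a minimal sketch that removes the race on a shared counter by using AtomicInteger (the thread count and iteration count are arbitrary):

import java.util.concurrent.atomic.AtomicInteger;

public class RaceFreeCounter {
    // Atomic variable: increments are thread-safe without explicit locking.
    private static final AtomicInteger counter = new AtomicInteger(0);

    public static void main(String[] args) throws InterruptedException {
        Runnable work = () -> {
            for (int i = 0; i < 10_000; i++) {
                counter.incrementAndGet();  // no lost updates, unlike a plain counter++
            }
        };
        Thread t1 = new Thread(work);
        Thread t2 = new Thread(work);
        t1.start();
        t2.start();
        t1.join();
        t2.join();
        System.out.println(counter.get());  // always 20000
    }
}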

2) Deadlock:- This occurs when two or more threads are blocked forever, waiting for each other. This situation arises when multiple threads need the same locks but obtain them in a different order.

Solution:-

a) Avoid nested locks:- Avoid acquiring another lock while already holding one; if multiple locks are unavoidable, every thread should acquire them in the same order (a lock-ordering sketch follows this list).

b) Avoid unnecessary locks:- Try to reduce the number of things we lock as much as we can.
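
A minimal sketch of the lock-ordering idea mentioned above (the two lock objects and the worker method are illustrative): because every thread acquires lockA before lockB, a circular wait cannot form.

public class LockOrderingDemo {
    private static final Object lockA = new Object();
    private static final Object lockB = new Object();

    // Every caller acquires lockA first, then lockB, so no circular wait is possible.
    static void doWork(String who) {
        synchronized (lockA) {
            synchronized (lockB) {
                System.out.println(who + " holds both locks");
            }
        }
    }

    public static void main(String[] args) {
        new Thread(() -> doWork("t1")).start();
        new Thread(() -> doWork("t2")).start();
    }
}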

3) Starvation:- Starvation refers to a situation where a thread is unable to make any progress (or progresses very slowly) despite being in a runnable state.

Solution:-

a) ReentrantLock:- A ReentrantLock created in fair mode grants the lock to the thread that has been waiting the longest. If a thread can’t acquire a lock, it should release any locks it already holds and try again later (a fair-lock sketch follows this list).

b) Mutex (Mutual Exclusion):- It provides controlled access to a shared resource or critical section by allowing only one thread at a time. Once a thread acquires the mutex, all other threads attempting to acquire the same mutex are blocked until the first thread releases it.

c) Semaphore:- If we want multiple threads to run at once while still preventing starvation, a counting semaphore bounds how many of them can enter the critical section.
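
A minimal sketch of option (a), a ReentrantLock in fair mode (the worker loop and thread names are illustrative): with fairness enabled, the longest-waiting thread acquires the lock next, so no thread starves.

import java.util.concurrent.locks.ReentrantLock;

public class FairLockDemo {
    // true = fair mode: the longest-waiting thread gets the lock next.
    private static final ReentrantLock lock = new ReentrantLock(true);

    public static void main(String[] args) {
        Runnable worker = () -> {
            for (int i = 0; i < 3; i++) {
                lock.lock();
                try {
                    System.out.println(Thread.currentThread().getName() + " acquired the lock");
                } finally {
                    lock.unlock();
                }
            }
        };
        for (int i = 0; i < 4; i++) {
            new Thread(worker, "worker-" + i).start();
        }
    }
}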

4) Livelock:- It refers to a situation where two or more threads are so busy relinquishing the CPU to each other that overall hardly any progress is made. A thread often acts in response to the action of another thread. If the other thread’s action is also a response to the action of this thread, we may have a livelock.

Unlike deadlock, threads are not blocked when livelock occurs. They are simply too busy responding to each other to resume work.

Solution:-

a) ReentrantLock:- Acquire locks with tryLock and a timeout, and back off for a random interval on failure, so that the threads stop perpetually reacting to each other (a sketch follows).
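
A minimal sketch of this idea (the lock names, timeout and back-off interval are assumptions for the example): each thread tries to take both locks and, if the second is busy, releases what it holds and retries after a random pause, so the two threads stop endlessly yielding to each other.

import java.util.concurrent.ThreadLocalRandom;
import java.util.concurrent.TimeUnit;
import java.util.concurrent.locks.ReentrantLock;

public class BackoffDemo {
    private static final ReentrantLock lock1 = new ReentrantLock();
    private static final ReentrantLock lock2 = new ReentrantLock();

    // Try to take both locks; if the second is busy, release the first and back off.
    static void acquireBoth(ReentrantLock first, ReentrantLock second) throws InterruptedException {
        while (true) {
            first.lockInterruptibly();
            try {
                if (second.tryLock(50, TimeUnit.MILLISECONDS)) {
                    try {
                        System.out.println(Thread.currentThread().getName() + " got both locks");
                        return;
                    } finally {
                        second.unlock();
                    }
                }
            } finally {
                first.unlock();
            }
            // Random back-off so the two threads stop retrying in lock-step.
            Thread.sleep(ThreadLocalRandom.current().nextInt(10, 50));
        }
    }

    public static void main(String[] args) {
        new Thread(() -> {
            try { acquireBoth(lock1, lock2); } catch (InterruptedException ignored) { }
        }, "t1").start();
        new Thread(() -> {
            try { acquireBoth(lock2, lock1); } catch (InterruptedException ignored) { }
        }, "t2").start();
    }
}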

Conclusion: In this section we have discussed how we can achieve concurrency using multi-threading, its challenges, and how we can overcome them. Multi-threading is such a broad topic that I was not able to cover all of its concepts in this blog.
