
Imagine you have thread A and thread B.

They both synchronise on the same object, and inside that synchronised block there is a shared variable they both update:
static boolean commonVar = false;
Object lock = new Object();

...

void threadAMethod() {
    ...
    while (commonVar == false) {
        synchronized (lock) {
            ...
            commonVar = true;
        }
    }
}

void threadBMethod() {
    ...
    while (commonVar == true) {
        synchronized (lock) {
            ...
            commonVar = false;
        }
    }
}
So, when thread A enters its while loop and acquires the lock, it does what it has to do and
sets commonVar to true. Then thread B comes along, enters its while loop and, since commonVar
is now true, it is able to acquire the lock. It does so, executes the synchronised block, and
sets commonVar back to false. Now thread A gets its next CPU window: it was about to quit the
while loop, but thread B has just set the flag back to false, so the cycle repeats all over
again. The threads keep doing something (so they're not blocked in the traditional sense), but
it amounts to pretty much nothing.
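
To make this concrete, here is a minimal, runnable sketch of the scenario above (the class name LivelockDemo and the main method wiring are my own additions; the original snippet only shows the two methods):

    public class LivelockDemo {

        static boolean commonVar = false;
        static final Object lock = new Object();

        static void threadAMethod() {
            while (commonVar == false) {
                synchronized (lock) {
                    // ... thread A's actual work would go here ...
                    commonVar = true;
                }
            }
            System.out.println("A left its loop");
        }

        static void threadBMethod() {
            while (commonVar == true) {
                synchronized (lock) {
                    // ... thread B's actual work would go here ...
                    commonVar = false;
                }
            }
            System.out.println("B left its loop");
        }

        public static void main(String[] args) {
            new Thread(LivelockDemo::threadAMethod, "A").start();
            new Thread(LivelockDemo::threadBMethod, "B").start();
            // Whether the two threads end up endlessly undoing each other's update,
            // or simply exit their loops, depends entirely on how they are interleaved.
        }
    }

In most runs both threads will just terminate; the livelock only shows up under an unlucky interleaving, which brings me to the next point.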

It may also be worth mentioning that livelock does not necessarily have to appear here. I'm
assuming that the scheduler favours the other thread once the synchronised block finishes
executing. Most of the time I think that's a hard-to-hit expectation, and it depends on many
things happening under the hood.
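
If you wanted to nudge the scheduler towards that hand-off (purely my tweak, not part of the example above), you could yield right after releasing the lock, which makes it more likely that the other thread runs and flips the flag back before the loop condition is re-checked:

    void threadAMethod() {
        while (commonVar == false) {
            synchronized (lock) {
                // ... thread A's work ...
                commonVar = true;
            }
            Thread.yield(); // hint to the scheduler; still no guarantee of a livelock
        }
    }

Even then it remains non-deterministic, since Thread.yield() is only a hint.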
