The threading module in Python allows us to spin up native operating system threads to execute multiple tasks concurrently.
by Tim Ojo · Oct. 30, 17 · Big Data Zone · Tutorial
import threading

lock = threading.Lock()

### assume that the code below runs in multiple threads ###
lock.acquire()  # acquire the lock, preventing other threads from doing so
try:
    pass  # access the shared resource here
finally:
    lock.release()  # release the lock so that other blocked threads can now run
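The same acquire/release pattern is usually written with a with statement, which releases the lock even if the block raises. A minimal runnable sketch, using a hypothetical shared counter as the protected resource:

```python
import threading

counter = 0
lock = threading.Lock()

def increment(n):
    global counter
    for _ in range(n):
        with lock:  # acquires on entry, releases on exit, even on exceptions
            counter += 1

threads = [threading.Thread(target=increment, args=(10_000,)) for _ in range(4)]
for t in threads:
    t.start()
for t in threads:
    t.join()

print(counter)  # 40000: the lock prevents lost updates
```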
from queue import Queue

queue = Queue()

### assume the code below runs in a separate thread t1 ###
def producer(queue):
    item = make_an_item()
    queue.put(item)

### assume the code below runs in a separate thread t2 ###
def consumer(queue):
    item = queue.get()  # gets an item put in the queue by another thread; blocks if no item is there yet
    # process the item
    queue.task_done()  # marks the last item retrieved as done
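Putting the two halves together gives a complete runnable sketch. Here a simple range of numbers stands in for make_an_item(), an assumption made for illustration:

```python
import threading
from queue import Queue

queue = Queue()
results = []

def producer(queue):
    for item in range(5):       # stands in for repeated make_an_item() calls
        queue.put(item)

def consumer(queue):
    for _ in range(5):
        item = queue.get()      # blocks until an item is available
        results.append(item * 2)
        queue.task_done()       # mark the retrieved item as processed

t1 = threading.Thread(target=producer, args=(queue,))
t2 = threading.Thread(target=consumer, args=(queue,))
t1.start(); t2.start()
queue.join()                    # blocks until every put() has a matching task_done()
t1.join(); t2.join()

print(results)  # [0, 2, 4, 6, 8]
```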
However, the standard implementation of Python has a global interpreter lock (GIL), which makes Python easier to implement and faster to run for single-threaded programs. But because the GIL allows only one thread to execute at a time, threading is not suitable for CPU-bound tasks (tasks in which most of the time is spent performing computation instead of waiting on IO). For those, we have the multiprocessing package. The multiprocessing package uses processes instead of threads as the actors of parallel execution, and its API mimics the threading API as much as possible, to reduce the dissonance between the two and make switching easier.
Python 3.2 also introduced futures, via the concurrent.futures module. In Python, a future represents a pending result, and it also allows us to manage the execution of the computation that produces that result. Future API methods include result(), cancel(), and add_done_callback(fn):
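A minimal sketch of these methods using concurrent.futures.ThreadPoolExecutor; the square function is an arbitrary example:

```python
from concurrent.futures import ThreadPoolExecutor

def square(x):
    return x * x

with ThreadPoolExecutor(max_workers=2) as executor:
    future = executor.submit(square, 7)   # schedule the computation, get a future back
    future.add_done_callback(lambda f: print("done:", f.result()))
    # future.cancel() would attempt to cancel the call if it has not started yet
    result = future.result()              # block until the pending result is available

print(result)  # 49
```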