Unlocking the Power of Python Multiprocessing
Speed Up Your CPU-Bound Tasks with Efficient Parallelism
If you’ve ever worked with Python, you might have noticed that it doesn’t always make full use of your computer’s CPU cores. This limitation is due to the Global Interpreter Lock (GIL), which ensures that only one thread executes Python bytecode at a time. While this simplifies many aspects of programming in Python, it can become a bottleneck for CPU-bound tasks.
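To see the limitation concretely, here is a minimal sketch (the count_down helper and the iteration count are illustrative, not from the original article): a pure-Python busy loop run twice, first sequentially and then in two threads. On CPython, the threaded version usually takes about as long as the sequential one, because the GIL lets only one thread execute bytecode at a time.
import threading
import time

def count_down(n):
    """A pure-Python busy loop that keeps the CPU fully occupied."""
    while n > 0:
        n -= 1

if __name__ == "__main__":
    N = 50_000_000  # Illustrative size; adjust for your machine

    # Two calls, one after another
    start = time.time()
    count_down(N)
    count_down(N)
    print(f"Sequential: {time.time() - start:.2f} seconds")

    # The same two calls in two threads: typically no faster on CPython,
    # because the GIL prevents the threads from running bytecode in parallel.
    start = time.time()
    t1 = threading.Thread(target=count_down, args=(N,))
    t2 = threading.Thread(target=count_down, args=(N,))
    t1.start()
    t2.start()
    t1.join()
    t2.join()
    print(f"Two threads: {time.time() - start:.2f} seconds")
Exact timings will vary by machine, but the threaded run demonstrates why threads alone don't help CPU-bound Python code.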
The solution? Python’s multiprocessing module. In this tutorial, we'll dive into the basics of Python multiprocessing, exploring how it can help you unleash the full potential of your machine's CPU cores.
Before jumping into the code, let’s understand when multiprocessing is helpful:
- CPU-bound tasks: These include operations like numerical computations, image processing, or data transformation, where the CPU is the bottleneck.
- Parallelism: Multiprocessing allows you to split tasks across multiple processes, enabling true parallel execution.
With multiprocessing, we can achieve significant performance gains for CPU-intensive tasks. Let’s see it in action.
Getting Started with Multiprocessing
Let’s begin with a simple example to demonstrate how multiprocessing works. Imagine a CPU-bound task that squares numbers from a list.
Example: Squaring Numbers
import multiprocessing
import time

def square(n):
    """Function to compute the square of a number."""
    time.sleep(1)  # Simulating a time-consuming task
    return n * n

if __name__ == "__main__":
    numbers = [1, 2, 3, 4, 5]

    # Sequential Execution
    start_time = time.time()
    results = [square(n) for n in numbers]
    print(f"Sequential Results: {results}")
    print(f"Sequential Execution Time: {time.time() - start_time:.2f} seconds")

    # Parallel Execution
    start_time = time.time()
    with multiprocessing.Pool(processes=5) as pool:
        results = pool.map(square, numbers)
    print(f"Parallel Results: {results}")
    print(f"Parallel Execution Time: {time.time() - start_time:.2f} seconds")
Output:
Sequential Results: [1, 4, 9, 16, 25]
Sequential Execution Time: 5.00 seconds
Parallel Results: [1, 4, 9, 16, 25]
Parallel Execution Time: 1.00 seconds
In the parallel execution, the Pool object manages multiple processes and distributes the tasks among them. This allows the squaring operation to run concurrently, cutting down the total execution time.
Key Concepts in Multiprocessing
1. Processes
Each process runs independently and has its own memory space. To create a process, you can use the Process class:
from multiprocessing import Process

def greet(name):
    print(f"Hello, {name}!")

if __name__ == "__main__":
    p = Process(target=greet, args=("Alice",))
    p.start()  # Launch the child process
    p.join()   # Wait for it to finish
2. Pools
The Pool class simplifies parallel processing by managing multiple worker processes for you.
- map: Distributes a function across multiple inputs.
- apply: Runs a function with a single set of arguments and blocks until the result is ready.
- apply_async: Runs a function asynchronously, returning an AsyncResult whose get() method retrieves the value later.
Example: Using apply and apply_async
from multiprocessing import Pool
import time

def cube(n):
    return n ** 3

if __name__ == "__main__":
    with Pool(processes=2) as pool:
        # Using apply
        result = pool.apply(cube, (3,))
        print(f"Apply Result: {result}")

        # Using apply_async
        async_result = pool.apply_async(cube, (4,))
        print(f"Apply Async Result: {async_result.get()}")
Output:
Apply Result: 27
Apply Async Result: 64
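The difference between the two is easy to miss when get() is called right away. Here is a minimal sketch (using a hypothetical slow_cube helper with an artificial one-second delay, not part of the original example): apply blocks until the worker returns, while apply_async comes back almost immediately and the result is collected later.
from multiprocessing import Pool
import time

def slow_cube(n):
    time.sleep(1)  # Pretend this is an expensive computation
    return n ** 3

if __name__ == "__main__":
    with Pool(processes=2) as pool:
        start = time.time()

        # apply blocks: roughly one second passes before this call returns.
        result = pool.apply(slow_cube, (2,))
        print(f"apply -> {result} after {time.time() - start:.2f}s")

        # apply_async returns an AsyncResult right away; the main process can keep working.
        promise = pool.apply_async(slow_cube, (3,))
        print(f"apply_async returned after {time.time() - start:.2f}s")

        # get() waits for the worker to finish and returns its result.
        print(f"apply_async -> {promise.get()} after {time.time() - start:.2f}s")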
Synchronization and Communication
When working with multiple processes, you might need to share data or synchronize their actions. Python’s multiprocessing module provides tools for this:
1. Queues
Queues are used for safe communication between processes:
from multiprocessing import Process, Queue

def producer(queue):
    for i in range(5):
        queue.put(i)

def consumer(queue):
    # queue.empty() is not a reliable check between concurrently running processes;
    # it works here only because the producer has already finished and been joined.
    while not queue.empty():
        print(queue.get())

if __name__ == "__main__":
    queue = Queue()
    p1 = Process(target=producer, args=(queue,))
    p2 = Process(target=consumer, args=(queue,))
    p1.start()
    p1.join()  # Wait until the queue is fully populated before starting the consumer
    p2.start()
    p2.join()
Output:
0
1
2
3
4
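When the producer and consumer actually run at the same time, a common pattern (a sketch, not from the original article) is to send a sentinel value so the consumer knows when to stop, instead of polling queue.empty():
from multiprocessing import Process, Queue

def producer(queue):
    for i in range(5):
        queue.put(i)
    queue.put(None)  # Sentinel: signals the consumer that no more items will arrive

def consumer(queue):
    while True:
        item = queue.get()  # Blocks until an item is available
        if item is None:
            break
        print(item)

if __name__ == "__main__":
    queue = Queue()
    p1 = Process(target=producer, args=(queue,))
    p2 = Process(target=consumer, args=(queue,))
    p1.start()
    p2.start()  # Both processes run concurrently this time
    p1.join()
    p2.join()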
2. Locks
Locks prevent processes from accessing shared resources simultaneously:
from multiprocessing import Process, Lock

def task(lock, i):
    with lock:  # Only one process at a time can hold the lock
        print(f"Process {i} is working")

if __name__ == "__main__":
    lock = Lock()
    processes = [Process(target=task, args=(lock, i)) for i in range(5)]
    for p in processes:
        p.start()
    for p in processes:
        p.join()
Output (the order may vary between runs, since the processes are scheduled independently):
Process 0 is working
Process 1 is working
Process 2 is working
Process 3 is working
Process 4 is working
Real-Life Example: Image Processing
Imagine you have a folder containing hundreds of images that need to be resized. Processing them one by one would be slow, but with multiprocessing, you can process them concurrently.
from multiprocessing import Pool
from PIL import Image
import os

def resize_image(image_path):
    """Resize an image to 100x100 pixels."""
    with Image.open(image_path) as img:
        img = img.resize((100, 100))
        img.save(f"resized_{os.path.basename(image_path)}")  # Saved to the current working directory

if __name__ == "__main__":
    image_folder = "path_to_images"
    image_files = [os.path.join(image_folder, f) for f in os.listdir(image_folder) if f.endswith('.jpg')]

    with Pool(processes=4) as pool:
        pool.map(resize_image, image_files)

    print("Image resizing complete!")
The console output is simply:
Image resizing complete!
while the resized copies (resized_image1.jpg, resized_image2.jpg, resized_image3.jpg, and so on for every image in the folder) are written to the working directory.
In this example, the Pool splits the image resizing task among four processes, speeding up the operation significantly.
Best Practices
- Avoid Shared State: Each process has its own memory. Minimize dependencies on shared data to prevent race conditions.
- Use if __name__ == "__main__":: Always guard the entry point of the program to avoid recursive process creation.
- Set Process Limits: Don’t create too many processes; use the number of available CPU cores as a guide (see the sketch after this list).
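For that last point, the core count can be queried at runtime. A minimal sketch, assuming a generic CPU-bound work function (the function and its arguments are placeholders):
import multiprocessing

def work(n):
    """Placeholder for a CPU-bound task."""
    return sum(i * i for i in range(n))

if __name__ == "__main__":
    # Pool() already defaults to the number of available cores;
    # passing it explicitly documents the intent.
    n_workers = multiprocessing.cpu_count()
    print(f"Using {n_workers} worker processes")
    with multiprocessing.Pool(processes=n_workers) as pool:
        results = pool.map(work, [200_000] * 8)
    print(results)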
Conclusion
Python’s multiprocessing module is a powerful tool for achieving parallelism in CPU-bound tasks. With a clear understanding of its basics, you can speed up your code significantly by utilizing all your CPU cores.
Experiment with these examples and see the performance gains for yourself. Happy coding!