
Understanding Processes, Threads, Concurrency, and Process Pools

This article explains the concepts of processes and threads, their differences, interaction methods, the relationship between them, the three execution states of a task, the distinctions among parallel, concurrent, and serial execution, and the purpose and operation of process pools in operating systems.

Laravel Tech Community

1. Process

A process is an instance of a program that has been loaded into memory and is being executed; it is the smallest unit of resource allocation and serves as a container for threads.

When a program runs, the running instance is called a process.

A process is allocated its own memory space for code and data; it is the smallest unit of resource management.

Processes are isolated from one another; they interact through inter-process communication (IPC) mechanisms such as pipes, signals, message queues, shared memory, or network sockets (e.g., TCP/IP).

Program vs. Process

A program is a static collection of code and data stored on disk, while a process is the dynamic execution of that program: it is created when the program is launched and has a lifecycle that ends when execution terminates.
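To make the program-versus-process distinction concrete, here is a minimal Python sketch (the article itself contains no code, so the language choice is an assumption): the same program text is launched as a second live process with its own PID and memory space.

```python
# Minimal sketch: a static program becomes a second running process.
import multiprocessing
import os

def work():
    # Runs inside the child process, which has its own PID and memory.
    print(f"child pid={os.getpid()}, parent pid={os.getppid()}")

if __name__ == "__main__":
    p = multiprocessing.Process(target=work)
    p.start()   # the program text is now a second live process
    p.join()    # the child's lifecycle ends here
    print(f"parent pid={os.getpid()}")
```

The parent and child print different PIDs, showing that one program gave rise to two distinct process instances.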

2. Thread

A thread is the smallest unit of execution that the operating system can schedule; it exists within a process and represents a single sequential flow of control.

Multiple threads can run concurrently within the same process, sharing the same memory space for interaction.

A thread is an execution path within a process; it runs the program's code using its process's resources rather than being allocated resources of its own, and it is the smallest execution unit of a program.
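A minimal Python sketch of the point above: two threads inside one process read and write the same variable, which is exactly why shared-memory access needs synchronization such as a lock.

```python
# Minimal sketch: threads in one process share memory, so they can
# interact through an ordinary variable (guarded by a lock).
import threading

counter = 0
lock = threading.Lock()

def increment(n):
    global counter
    for _ in range(n):
        with lock:          # shared memory requires synchronization
            counter += 1

threads = [threading.Thread(target=increment, args=(10_000,)) for _ in range(2)]
for t in threads:
    t.start()
for t in threads:
    t.join()
print(counter)  # 20000: both threads updated the same shared variable
```

Without the lock, the two threads could interleave their read-modify-write steps and lose updates.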

3. Relationship Between Process and Thread

Example: Opening a chat application creates a process; opening features like “Space”, “Scan”, or “Settings” creates threads within that process.

Thus, a process contains one or more threads, and every thread belongs to exactly one process.

4. Summary

Process: an executing application; the smallest unit of resource allocation.

Thread: the basic unit of CPU time allocation; the smallest unit of program execution.

A process owns an entire address space, while a thread needs only a small private stack (plus registers), sharing everything else with its process.

Every program has at least one process, and every process has at least one thread.

Threads can create and destroy other threads, and multiple threads in the same process can run concurrently.

Parallel, Concurrent, and Serial Execution

Concurrency: multiple tasks appear to run simultaneously (pseudo‑parallelism) using time‑slicing on a single core.

Implemented with multitasking techniques on a single‑core CPU.

Parallelism: multiple tasks truly run at the same time, requiring multiple CPU cores.

Parallel execution is only possible on multi‑core systems; otherwise, only concurrency is achievable.

Serial: tasks run one after another, completing one before starting the next.
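The modes above can be contrasted with a small Python sketch (task durations are illustrative): serial execution finishes one wait before starting the next, while threads let the waits overlap concurrently.

```python
# Minimal sketch: serial vs. concurrent execution of I/O-bound tasks.
import threading
import time

def task():
    time.sleep(0.2)   # stands in for waiting on I/O

# Serial: each task completes before the next starts (~0.6 s total).
start = time.perf_counter()
for _ in range(3):
    task()
serial = time.perf_counter() - start

# Concurrent: three threads overlap their waiting (~0.2 s total).
start = time.perf_counter()
threads = [threading.Thread(target=task) for _ in range(3)]
for t in threads:
    t.start()
for t in threads:
    t.join()
concurrent = time.perf_counter() - start

print(f"serial {serial:.2f}s, concurrent {concurrent:.2f}s")
```

True parallelism of CPU-bound work would additionally require multiple cores (e.g., via `multiprocessing`); the threaded version here only demonstrates concurrency.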

Task Execution States

A running process cycles through three states: Ready, Running, and Blocked.

Ready

A process that has all required resources except the CPU is in the Ready state, waiting for CPU time; ready processes are placed in a ready queue.

Running

When the scheduler assigns the CPU to a Ready process, it enters the Running state. Single‑core systems have only one Running process, while multi‑core systems can have several.

Blocked (Sleep)

A Running process may become Blocked when it must wait for I/O, a higher‑priority task, or other events, causing the OS to revoke its CPU time.

State Transitions

Processes repeatedly transition between Ready, Running, and Blocked states as resources become available or tasks complete.

Synchronous vs. Asynchronous Communication

Synchronous

The sender waits for a response before sending the next message.

Two programs are tightly coupled; one thread blocks until the other completes.

Asynchronous

The sender does not wait for a response and can continue sending messages.

Threads operate independently without waiting for each other.

Examples

Synchronous: You hear a call to eat and immediately go; if you don’t hear it, the caller repeats until you respond.

Asynchronous: You receive a call to eat, but you may go immediately or later; sending a message is like asynchronous communication—you can message one person and then another without waiting.
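A minimal Python sketch of the two styles (the `reply` function is hypothetical): the synchronous call blocks until the answer arrives, while the asynchronous version submits the request and keeps working.

```python
# Minimal sketch: synchronous (blocking) vs. asynchronous (non-blocking) calls.
from concurrent.futures import ThreadPoolExecutor
import time

def reply(msg):
    time.sleep(0.2)          # stands in for the other side taking time
    return f"got: {msg}"

# Synchronous: the sender waits for the response before continuing.
answer = reply("dinner?")    # blocks ~0.2 s
print(answer)                # got: dinner?

# Asynchronous: the sender fires the request and continues immediately.
with ThreadPoolExecutor() as pool:
    future = pool.submit(reply, "dinner?")
    print("message sent, doing other things...")  # runs before the reply
    print(future.result())                        # collect the reply later
```

The "doing other things" line prints before the reply arrives, mirroring the messaging example: you can move on without waiting for each answer.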

Process Pool

What Is a Process Pool?

A process pool consists of a fixed number of pre‑created “resource processes” managed by a “manager process”.

Why Use a Process Pool?

Creating and destroying processes for each task is costly; a pool reuses processes, reducing overhead and improving concurrency.

How It Works

Define a pool with a set number of processes.

When a task arrives, an idle process from the pool handles it.

After completion, the process returns to the pool instead of terminating.

If all pool processes are busy, incoming tasks wait until a process becomes free.

The fixed pool size limits the maximum concurrent processes, simplifying OS scheduling and achieving better concurrency.
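The steps above can be sketched with Python's `multiprocessing.Pool` (one possible realization of the idea, not necessarily how any particular framework implements it): three long-lived workers serve six tasks, so no process is created or destroyed per task.

```python
# Minimal sketch: a fixed pool of 3 worker processes handles 6 tasks,
# reusing workers instead of spawning one process per task.
import multiprocessing
import os

def handle(task_id):
    # Each task reports which pooled worker process served it.
    return task_id, os.getpid()

if __name__ == "__main__":
    with multiprocessing.Pool(processes=3) as pool:
        results = pool.map(handle, range(6))
    worker_pids = {pid for _, pid in results}
    # All 6 tasks were served by at most 3 distinct worker processes.
    print(f"{len(results)} tasks, {len(worker_pids)} workers")
```

The number of distinct PIDs never exceeds the pool size, illustrating how the pool caps concurrency and reuses its processes.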

Resource and Manager Processes

Resource processes are idle processes ready to execute tasks; the manager process creates, assigns, and recycles them, communicating via IPC mechanisms such as signals, semaphores, message queues, or pipes.
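A minimal sketch of the manager/resource-process split using one of the IPC mechanisms mentioned above, a message queue: the manager (here, the main process) assigns tasks over a queue and recycles the workers with a sentinel value.

```python
# Minimal sketch: a manager process assigns tasks to resource
# processes over an IPC message queue and then recycles them.
import multiprocessing

def worker(tasks, results):
    # A resource process: loops pulling tasks until told to stop.
    while True:
        item = tasks.get()
        if item is None:          # sentinel from the manager: shut down
            break
        results.put(item * item)

if __name__ == "__main__":
    tasks, results = multiprocessing.Queue(), multiprocessing.Queue()
    procs = [multiprocessing.Process(target=worker, args=(tasks, results))
             for _ in range(2)]
    for p in procs:
        p.start()
    for n in range(4):            # manager assigns work
        tasks.put(n)
    for _ in procs:               # manager recycles the workers
        tasks.put(None)
    for p in procs:
        p.join()
    out = sorted(results.get() for _ in range(4))
    print(out)  # [0, 1, 4, 9]
```

The same pattern works with pipes or shared semaphores; the queue simply makes the manager-to-worker handoff explicit.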

Written by

Laravel Tech Community

Specializing in Laravel development, we continuously publish fresh content and grow alongside the elegant, stable Laravel framework.
