Concurrency & Parallelism
Concurrency
Concurrency is the illusion of tasks running at the "same time": their execution is interleaved, not truly simultaneous.
It happens when tasks run on the same core, or on different virtual/logical cores.
Thread
A thread is the smallest unit of execution.
Each thread executes a sequence of instructions and has a context, which stays in the CPU registers while the thread is running on the CPU.
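The notes don't name a language, so here is a minimal sketch of spawning threads using Python's threading module (each thread runs the same function with its own call stack and context):

```python
import threading

results = []

def greet(name):
    # Each thread executes this function independently,
    # with its own call stack and register context.
    results.append(f"hello from {name}")

threads = [threading.Thread(target=greet, args=(f"worker-{i}",)) for i in range(2)]
for t in threads:
    t.start()   # hand the thread to the scheduler
for t in threads:
    t.join()    # wait for it to finish
```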
Time Sharing
Context Switch
Context switching is swapping one context for another, fast enough that the tasks look like they are running at the same time.
The machine offers a pre-defined amount of time (time-slice) to each context to run.
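As an illustrative aside, CPython exposes its own interpreter-level analogue of the time-slice (this is the interpreter's thread-switch interval, not the OS scheduler's slice, which is managed by the kernel):

```python
import sys

# How long a Python thread may run before the interpreter
# considers switching to another thread (default: 5 ms).
interval = sys.getswitchinterval()
print(interval)

# A smaller interval means more frequent switches:
# more responsiveness, more switching overhead.
sys.setswitchinterval(0.001)
```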
Scheduler (O.S)
The O.S. scheduler is the task that manages context switches, preempting contexts that do not cooperate and hog the CPU.
True Parallelism
Is when two tasks can each run on a separate physical core, meaning they truly run at the same time.
They will run in parallel unless they share the same resource (e.g. a file, a printer, etc.).
Multiple threads handling the same resource at the same time is messy (it may lead to race conditions), so this is where a Mutex comes to help.
Race Condition
Is when multiple threads compete to access the same resource.
Without knowing which will access it first, the outcome can be non-deterministic or unexpected.
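A classic demonstration: two or more threads doing a read-modify-write on a shared counter lose updates, because another thread can run between the read and the write. This sketch makes the race visible by splitting the increment into two steps:

```python
import threading

counter = 0

def increment(times):
    global counter
    for _ in range(times):
        # Read-modify-write is NOT atomic: another thread can be
        # scheduled between the read and the write, losing updates.
        value = counter
        counter = value + 1

threads = [threading.Thread(target=increment, args=(100_000,)) for _ in range(4)]
for t in threads:
    t.start()
for t in threads:
    t.join()

# Expected 400000, but the final value is usually lower and
# varies from run to run: a non-deterministic result.
print(counter)
```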
Mutex (Locks)
Is a flag that signals exclusivity over a resource.
The thread that accesses it first marks the resource as "used", so subsequent threads wait until the resource is "free" again.
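In Python this flag is threading.Lock. A sketch of the same shared counter, now guarded by a mutex so the read-modify-write becomes effectively atomic:

```python
import threading

counter = 0
lock = threading.Lock()

def increment(times):
    global counter
    for _ in range(times):
        # Only one thread at a time may hold the lock; any other
        # thread that tries to acquire it waits until it is released.
        with lock:
            counter += 1

threads = [threading.Thread(target=increment, args=(100_000,)) for _ in range(4)]
for t in threads:
    t.start()
for t in threads:
    t.join()

print(counter)  # always 400000
```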
Dead Locks
Happen when threads forget, or fail, to clear the "used" flag on a resource, or when two threads each hold a lock the other one needs.
Either way, they will wait forever for a flag clearance that never comes.
Blocking I/O
Is when a thread stays blocked until it finishes an operation involving an I/O resource (e.g. writing to disk, or using a network connection).
Asynchronous I/O deals with these blocks: an event is emitted when the I/O finishes, so the thread doesn't need to synchronously wait for it.
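A sketch of the asynchronous approach using Python's asyncio, with asyncio.sleep standing in for a real I/O operation: while one "I/O" is pending, control goes back to the event loop so the other can make progress:

```python
import asyncio
import time

async def fake_io(name, delay):
    # await yields control to the event loop while the "I/O"
    # is pending, instead of blocking the thread.
    await asyncio.sleep(delay)
    return name

async def main():
    start = time.perf_counter()
    # Both operations are in flight at the same time, so the
    # total wall time is ~0.2s, not 0.4s.
    results = await asyncio.gather(fake_io("a", 0.2), fake_io("b", 0.2))
    elapsed = time.perf_counter() - start
    return results, elapsed

results, elapsed = asyncio.run(main())
print(results, elapsed)
```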