
Concurrency and Threads



Definition:
Concurrency is when two tasks overlap in execution.
In programming, concurrency typically arises in situations like these:
  • When two processes are assigned to different cores on a machine by the kernel, and both cores execute the process instructions at the same time.
  • When more connections arrive before earlier connections are finished, and need to be handled immediately.
More generally, it’s when we need to handle multiple tasks at about the same time.
That’s it. That’s all concurrency is. Parallel execution is when two tasks literally run at the same instant, for example on different cores, which makes it a special case of concurrent execution.

4 levels of Concurrency:
-- Machine instruction level
-- HLL (high-level language) statement level
-- Unit level
-- Program level

There are two types of concurrency:
1. Physical Concurrency
2. Logical Concurrency

Physical Concurrency: Several program units from the same program literally execute simultaneously on different processors.

Logical Concurrency: Several program units from the same program appear, to the programmer and the application software, to execute simultaneously on different processors. In fact, their execution is interleaved on a single processor.

From the programmer's and language designer's point of view, both kinds of concurrency look the same.

Thread: At the hardware level, a thread is an independent set of values for the processor registers of a single core. At the program level, a thread is a sequence of instructions within a program that can be executed independently of other code.

Multithreading: Multithreading is the execution of multiple threads at (roughly) the same time. A thread here is a lightweight subprocess, the smallest unit of processing, and a separate path of execution that shares the memory area of its process. In short, multithreading is a technique that allows a program or process to execute many tasks concurrently: in parallel on a multiprocessor system, or in an interleaved fashion on a single-processor system. Multithreading is a specialized form of multitasking.
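As a quick illustration (the class and thread names below are invented for this sketch, not taken from any particular source), here is a minimal Java example that starts two threads whose output interleaves with each other and with the main thread:

public class MultithreadingDemo {
    public static void main(String[] args) throws InterruptedException {
        // Each Runnable is a separate path of execution sharing the process's memory.
        Runnable task = () -> {
            for (int i = 0; i < 3; i++) {
                System.out.println(Thread.currentThread().getName() + " step " + i);
            }
        };

        Thread t1 = new Thread(task, "worker-1");
        Thread t2 = new Thread(task, "worker-2");

        t1.start();   // both threads now run concurrently with main
        t2.start();

        t1.join();    // wait for both threads to finish before exiting
        t2.join();
    }
}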

Task Synchronization: A mechanism that controls the order in which tasks execute.
• Two kinds of synchronization


Cooperation Synchronization: Task A must wait for task B to complete some specific activity before task A can continue its execution, e.g., the producer-consumer problem


Competition Synchronization: Two or more tasks must use some resource that cannot be simultaneously used, e.g., a shared counter.
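To make competition synchronization concrete, here is a small illustrative Java sketch (the class name SharedCounter is made up for the example) in which a shared counter is protected with synchronized methods so that only one thread at a time can update it:

public class SharedCounter {
    private int count = 0;

    // synchronized provides competition synchronization:
    // only one thread at a time may execute this method on the same object.
    public synchronized void increment() {
        count++;
    }

    public synchronized int get() {
        return count;
    }

    public static void main(String[] args) throws InterruptedException {
        SharedCounter counter = new SharedCounter();
        Runnable work = () -> {
            for (int i = 0; i < 100_000; i++) {
                counter.increment();
            }
        };

        Thread t1 = new Thread(work);
        Thread t2 = new Thread(work);
        t1.start();
        t2.start();
        t1.join();
        t2.join();

        System.out.println(counter.get()); // always 200000 thanks to synchronization
    }
}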


--------------------------
Task : a unit of a program, similar to a subprogram, that can be in concurrent execution with other units of the same program.

Synchronization : a mechanism that controls the order in which tasks execute.

Competition Synchronization : synchronization between two or more tasks when they require a single resource that cannot be used by all of them at the same time.

Cooperation Synchronization : synchronization in which a task must wait for another task to finish some activity before it can begin or continue its own execution.
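As a rough sketch of cooperation synchronization, the example below uses Java's standard BlockingQueue for the producer-consumer problem mentioned above; the consumer must wait until the producer has actually put something into the buffer (the class and variable names are invented for illustration):

import java.util.concurrent.ArrayBlockingQueue;
import java.util.concurrent.BlockingQueue;

public class ProducerConsumerDemo {
    public static void main(String[] args) {
        // A bounded buffer: put() blocks when full, take() blocks when empty.
        BlockingQueue<Integer> queue = new ArrayBlockingQueue<>(5);

        Thread producer = new Thread(() -> {
            try {
                for (int i = 1; i <= 10; i++) {
                    queue.put(i);                      // waits if the buffer is full
                    System.out.println("produced " + i);
                }
            } catch (InterruptedException e) {
                Thread.currentThread().interrupt();
            }
        });

        Thread consumer = new Thread(() -> {
            try {
                for (int i = 1; i <= 10; i++) {
                    int value = queue.take();          // waits until something is produced
                    System.out.println("consumed " + value);
                }
            } catch (InterruptedException e) {
                Thread.currentThread().interrupt();
            }
        });

        producer.start();
        consumer.start();
    }
}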

Liveness : the property that if some event, such as program completion, is supposed to occur, it will eventually occur. That is, progress is actually made.

Race Condition : when two or more tasks race to use a shared resource without synchronization, and the behavior of the program depends on which task gets there first.
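To see a race condition in code, here is a deliberately broken Java sketch (illustrative only): two threads increment an unsynchronized counter, and because count++ is not atomic the final value is usually less than the expected 200000:

public class RaceConditionDemo {
    // Plain int field with no synchronization: count++ is a read-modify-write, not atomic.
    static int count = 0;

    public static void main(String[] args) throws InterruptedException {
        Runnable work = () -> {
            for (int i = 0; i < 100_000; i++) {
                count++;              // two threads race on this shared resource
            }
        };

        Thread t1 = new Thread(work);
        Thread t2 = new Thread(work);
        t1.start();
        t2.start();
        t1.join();
        t2.join();

        // Typically prints a value below 200000 because some updates are lost.
        System.out.println("count = " + count);
    }
}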

Deadlock : a condition in which two or more tasks are blocked forever, each waiting for a resource held by another, so the program can make no further progress.
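A classic way a deadlock arises is two tasks acquiring two locks in opposite order. The sketch below (lock names invented for illustration) usually hangs, with each thread holding one lock and waiting forever for the other:

public class DeadlockDemo {
    private static final Object lockA = new Object();
    private static final Object lockB = new Object();

    public static void main(String[] args) {
        Thread t1 = new Thread(() -> {
            synchronized (lockA) {                  // t1 holds A ...
                pause();
                synchronized (lockB) {              // ... and waits for B
                    System.out.println("t1 acquired both locks");
                }
            }
        });

        Thread t2 = new Thread(() -> {
            synchronized (lockB) {                  // t2 holds B ...
                pause();
                synchronized (lockA) {              // ... and waits for A
                    System.out.println("t2 acquired both locks");
                }
            }
        });

        t1.start();
        t2.start();   // with the pauses in place, the two threads usually deadlock
    }

    private static void pause() {
        try {
            Thread.sleep(100);
        } catch (InterruptedException e) {
            Thread.currentThread().interrupt();
        }
    }
}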

------------------------------  
Tasks that do not require any kind of synchronization: tasks that do not depend on the outcome or output of any other task.

Design Issues for Concurrency:
  • Competition and cooperation synchronization
  • Controlling task scheduling
  • How an application can influence task scheduling
  • How and when tasks start and end execution
  • How and when tasks are created
  • Asynchronous vs. synchronous interaction
  • Allocating shared resources
  • Race conditions
  • Deadlock
 Task descriptor:
A data structure that stores all of the relevant information about the execution state of a task.
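Purely as an illustration, a task descriptor could be modeled roughly like this in Java; the exact contents depend on the language runtime or operating system, and the fields shown here are typical examples invented for the sketch, not taken from the notes:

// Illustrative only: a minimal model of the information a scheduler
// might keep about each task.
public class TaskDescriptor {
    enum State { NEW, READY, RUNNING, BLOCKED, TERMINATED }

    int taskId;            // identity of the task
    State state;           // current execution state
    int priority;          // used by the scheduler to pick the next task
    long programCounter;   // where to resume execution
    long[] savedRegisters; // register values saved at the last context switch
    long stackPointer;     // top of the task's private stack
}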






