Concurrency

Optional readings for this topic from Operating Systems: Principles and Practice: Chapter 5 up through Section 5.1.

Independent and Cooperating Threads

Independent thread: one that can't affect or be affected by the rest of the universe.

  • Its state isn't shared in any way by any other thread.
  • Deterministic: input state alone determines results.
  • Reproducible.
  • Can stop and continue with no impact on behavior (only time varies).

For independent threads, the scheduling order doesn't matter.

Cooperating threads: those that share state.

  • Behavior is nondeterministic: depends on relative execution sequence and cannot be predicted in advance.
  • Behavior may be irreproducible.

Example: one thread writes ABC to a console window while another concurrently writes CBA. The output can be any interleaving that preserves each thread's own order (ABCCBA, ACBCBA, CABCBA, ...), and different runs may produce different results; see the sketch below.
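
A minimal sketch of this nondeterminism, assuming POSIX threads and writing one character at a time so interleavings are visible (the writer helper and the choice of strings are made up for this illustration):

#include <pthread.h>
#include <stdio.h>

/* Each thread emits its string one character at a time, so the
   scheduler is free to interleave the two outputs arbitrarily. */
static void *writer(void *arg) {
    const char *s = arg;
    while (*s != '\0') {
        putchar(*s);
        fflush(stdout);
        s++;
    }
    return NULL;
}

int main(void) {
    pthread_t a, b;
    pthread_create(&a, NULL, writer, "ABC");
    pthread_create(&b, NULL, writer, "CBA");
    pthread_join(a, NULL);
    pthread_join(b, NULL);
    putchar('\n');
    return 0;
}

Running this several times typically yields different outputs, which is exactly the irreproducibility described above.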

Why permit threads to cooperate?

The basic assumption for cooperating threads is that the order of some operations is irrelevant; certain operations are independent of certain other operations. Examples:

  • Thread 1: A = 1;
    Thread 2: B = 2;
    Here the order doesn't matter: A and B end up with the same values either way.
  • Thread 1: A = B+1;
    Thread 2: B = 2*B;
    Here the order does matter: the final values of A and B depend on which thread runs first (see the sketch below).
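
To make the second pair concrete, here is a sketch that simulates the two possible schedules sequentially; the initial value B = 2 is an assumption made only for illustration:

#include <stdio.h>

int main(void) {
    int A, B;

    /* Schedule 1: thread 1's statement runs first. */
    B = 2;
    A = B + 1;      /* A becomes 3 */
    B = 2 * B;      /* B becomes 4 */
    printf("thread 1 first: A=%d B=%d\n", A, B);

    /* Schedule 2: thread 2's statement runs first. */
    B = 2;
    B = 2 * B;      /* B becomes 4 */
    A = B + 1;      /* A becomes 5 */
    printf("thread 2 first: A=%d B=%d\n", A, B);

    return 0;
}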

Atomic Operations

Before we can say anything about the behavior of cooperating threads, we must know that some operation is atomic: it happens in its entirety, without interruption, and cannot be observed partway through.

  • Single-word references and assignments are atomic on almost all systems: A=B will always read a clean value of B and store a clean value into A (this is not necessarily true for larger objects such as arrays or records).
  • If you don't have an atomic operation, you can't make one. Fortunately, hardware designers give us atomic ops.
  • If you have any atomic operation, you can use it to generate higher-level constructs and make parallel programs work correctly. This is the approach we'll take in this class.
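
As a sketch of that approach (one possible illustration, not necessarily the construction used later): given a hardware atomic test-and-set, here exposed through C11's atomic_flag, one can build a simple spin lock; the acquire/release names are invented for this sketch.

#include <stdatomic.h>

static atomic_flag lock = ATOMIC_FLAG_INIT;   /* clear = free, set = held */

void acquire(atomic_flag *l) {
    /* test_and_set atomically sets the flag and returns its previous
       value; keep retrying until the previous value was clear. */
    while (atomic_flag_test_and_set(l)) {
        /* spin until the current holder releases the lock */
    }
}

void release(atomic_flag *l) {
    atomic_flag_clear(l);                     /* mark the lock free again */
}

A thread brackets its critical section with acquire(&lock) ... release(&lock); the atomicity of test-and-set is what guarantees that only one thread at a time can win the flag.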

The "Too Much Milk" Problem

The basic problem:

          Person A                       Person B
3:00      Look in fridge: no milk
3:05      Leave for store
3:10      Arrive at store                Look in fridge: no milk
3:15      Leave store                    Leave home
3:20      Arrive home, put milk away     Arrive at store
3:25                                     Leave store
3:30                                     Arrive home: too much milk!

What is the correct behavior? Someone buys milk if it is needed, but at most one person buys it.

More definitions:

  • Synchronization: using atomic operations to ensure correct operation of cooperating threads.
  • Critical section: a section of code, or collection of operations, in which only one thread may be executing at a given time. E.g. shopping.
  • Mutual exclusion: mechanisms used to create critical sections. Typically, mutual exclusion is achieved with a locking mechanism: prevent others from doing something while you do it. For example, before shopping, leave a note on the refrigerator: don't shop if there is a note. (A lock-based sketch follows this list.)
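
As a preview of where this section is heading, here is a minimal sketch of milk buying protected by a ready-made locking primitive (a pthreads mutex); buy_milk() is the placeholder from the example, and fridge_lock and buy_milk_if_needed are names invented for this sketch:

#include <pthread.h>

void buy_milk(void);                      /* placeholder from the example */

static pthread_mutex_t fridge_lock = PTHREAD_MUTEX_INITIALIZER;
static int milk = 0;

void buy_milk_if_needed(void) {
    pthread_mutex_lock(&fridge_lock);     /* "leave the note"        */
    if (milk == 0) {                      /* look in the fridge      */
        buy_milk();                       /* shop                    */
        milk = 1;                         /* fridge is restocked     */
    }
    pthread_mutex_unlock(&fridge_lock);   /* "remove the note"       */
}

The attempts below show how hard it is to get this same guarantee using only atomic reads and writes of shared variables.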

First attempt at computerized milk buying (assume atomic reads and writes):

if (milk == 0) {
  if (note == 0) {
    note = 1;
    buy_milk();
    note = 0;
  }
}

This can still end up with too much milk: a thread can be interrupted right after seeing note == 0 but before setting note = 1; the other thread then also sees no milk and no note, and both buy.

Second attempt: change the meaning of the note. A buys if there is no note; B buys only if there is a note.

Thread A
if (note == 0) {
  if (milk == 0) {
    buy_milk();
  }
  note = 1;
}

Thread B
if (note == 1) {
  if (milk == 0) {
    buy_milk();
  }
  note = 0;
}

This forces A and B to take turns: once A leaves a note, A never buys again until B removes it, so if B stops checking the fridge the milk can run out with nobody buying it.

Third attempt: use separate notes for A and B.

Thread A
noteA = 1;
if (noteB == 0) {
  if (milk == 0) {
    buy_milk();
  }
}
noteA = 0;

Thread B
noteB = 1;
if (noteA == 0) {
  if (milk == 0) {
    buy_milk();
  }
}
noteB = 0;

Now the two can never both buy, but if they run at about the same time each may leave its note, see the other's note, and give up: nobody buys milk even though it is needed.

Fourth attempt: we just need a way to decide who will buy milk when both leave notes (somebody has to hang around to make sure that the job gets done). Thread A is unchanged from the third attempt; Thread B now waits until A's note is gone:

Thread B
noteB = 1;
while (noteA == 1) {
  // do nothing;
}
if (milk == 0) {
  buy_milk();
}
noteB = 0;

  • This solution works (milk gets bought whenever it is needed, and at most one of them buys it), but it has two disadvantages:
    • Asymmetric (and complex) code.
    • While B is waiting it is consuming resources (busy-waiting).
  • Peterson's algorithm gives a cleaner, symmetric solution for two threads, though it still relies on busy-waiting; see the sketch below.
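
For reference, a sketch of Peterson's algorithm, written with C11 sequentially consistent atomics so the shared loads and stores are not reordered; the names peterson_enter, peterson_exit, wants, and turn are invented for this sketch.

#include <stdatomic.h>
#include <stdbool.h>

/* Shared state for two threads with ids 0 and 1. */
static atomic_bool wants[2];   /* wants[i]: thread i wants to enter   */
static atomic_int  turn;       /* which thread yields if both want in */

void peterson_enter(int self) {
    int other = 1 - self;
    atomic_store(&wants[self], true);    /* leave our "note"                */
    atomic_store(&turn, other);          /* offer the tiebreak to the other */
    /* Busy-wait while the other thread wants in and holds the tiebreak. */
    while (atomic_load(&wants[other]) && atomic_load(&turn) == other) {
        /* spin */
    }
}

void peterson_exit(int self) {
    atomic_store(&wants[self], false);   /* remove our note */
}

Each thread brackets its milk-buying code with peterson_enter(id) and peterson_exit(id). The code is symmetric and guarantees mutual exclusion, but a waiting thread still spins; eliminating busy-waiting is what higher-level synchronization primitives such as locks and semaphores provide.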