definition of wait-free (referring to parallel programming)

Posted by tecuhtli on Stack Overflow
Published on 2010-05-17T18:18:12Z


In Maurice Herlihy's paper "Wait-free synchronization", he defines wait-free as follows:

"A wait-free implementation of a concurrent data object is one that guarantees that any process can complete any operation in a finite number of steps, regardless of the execution speeds of the other processes." www.cs.brown.edu/~mph/Herlihy91/p124-herlihy.pdf

Let's take one operation op from the universe of operations.

(1) Does the definition mean: "Every process completes a certain operation op in the same finite number n of steps."?

(2) Or does it mean: "Every process completes a certain operation op in some finite number of steps. So one process may complete op in k steps and another process in j steps, where k != j."?

Just by reading the definition I would understand meaning (2). However, this makes no sense to me, since a process executing op in k steps one time and in k + m steps another time still meets the definition, yet those m extra steps could be spent in a waiting loop. If meaning (2) is right, can anybody explain to me why it describes wait-freedom?

In contrast to (2), meaning (1) would guarantee that op is always executed in the same number of steps k. So there can't be any additional m steps spent, e.g., in a waiting loop.
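To make the waiting-loop concern concrete, here is a hypothetical sketch in C++ (not from the question itself) contrasting an increment whose step count is bounded regardless of other threads with a compare-and-swap retry loop whose step count depends on how often competing threads win the race. It assumes `fetch_add` maps to a single hardware atomic instruction on the target platform:

```cpp
#include <atomic>

std::atomic<int> counter{0};

// Bounded steps per call, independent of what other threads do:
// this is the flavor of guarantee "wait-free" is after.
int incrementWaitFree() {
    return counter.fetch_add(1) + 1;
}

// Lock-free but not wait-free: the CAS loop may retry an unbounded
// number of times if other threads keep updating counter first --
// exactly the kind of waiting loop the question worries about.
int incrementLockFree() {
    int old = counter.load();
    // On failure, compare_exchange_weak reloads the current value
    // into old, so we simply retry with the fresh value.
    while (!counter.compare_exchange_weak(old, old + 1)) {
    }
    return old + 1;
}
```

Note that even the wait-free version need not take the *same* number of machine steps on every call; the point of the sketch is only that its step count has a bound that does not depend on the other processes.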

Which meaning is right and why?

Thanks a lot, sebastian

© Stack Overflow or respective owner
