Re: [tlaplus] Re: Fairness in programming languages
Thank you, Markus AND William. Sorry.
On Wednesday, 10 November 2021 at 19:01:07 UTC-3 Jones Martins wrote:
Thank you, Markus.
I supposed it was mostly at the kernel level. It's also a nice reminder about ReentrantLock; I'll read its documentation.
May I ask, regarding your BlockingQueue spec: why do you need to prove deadlock freedom, i.e. why is it necessary? And is it good practice to prove such properties for specs of that size?
On 11/10/21 8:34 AM, Willy Schultz wrote:
> Typically, whether or not liveness is ensured (i.e. via fairness) for a
> multi-threaded concurrent program (e.g. in C++) may be dependent on the
> policy of the OS scheduler. It may also, for example, be dependent on
> the scheduling/queuing policy implemented in the synchronization
> primitives you use (e.g. locks, condition variables, etc.). In general,
> though, I usually consider such fairness concerns in real world
> concurrent programs as things that are "under the hood" e.g. at the OS
> kernel level.
> This might vary from instance to instance, though. For example, if you
> have a program where threads are scheduled in user space, you may have
> more fine-grained control over the scheduling policies. Also, in a
> distributed system, where separate programs are operating concurrently,
> fairness assumptions for each process may also become more explicit
> and/or controllable e.g. "if a process is non-faulty for long enough and
> has a message in its queue then it will eventually process the message"
> or something like that.
Java's ReentrantLock is an example of a synchronization primitive
where fairness is exposed to the programmer. By the way, the (fair)
lock essentially implements the fairness constraint of a high-level
spec.
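To make this concrete, here is a minimal sketch (my own illustration, not from the original thread) of how ReentrantLock exposes the fairness choice: passing `true` to the constructor requests a fair policy, under which the longest-waiting contending thread acquires the lock next, whereas the default non-fair lock permits barging.

```java
import java.util.concurrent.atomic.AtomicInteger;
import java.util.concurrent.locks.ReentrantLock;

public class FairLockDemo {
    // Runs two threads that each acquire the lock 1000 times and
    // returns the total number of acquisitions observed.
    static int run(boolean fair) throws InterruptedException {
        // fair == true requests FIFO hand-off under contention;
        // fair == false (the default) allows barging.
        ReentrantLock lock = new ReentrantLock(fair);
        AtomicInteger acquisitions = new AtomicInteger();
        Runnable task = () -> {
            for (int i = 0; i < 1000; i++) {
                lock.lock();
                try {
                    acquisitions.incrementAndGet();
                } finally {
                    lock.unlock();
                }
            }
        };
        Thread t1 = new Thread(task);
        Thread t2 = new Thread(task);
        t1.start();
        t2.start();
        t1.join();
        t2.join();
        return acquisitions.get();
    }

    public static void main(String[] args) throws InterruptedException {
        System.out.println("fair lock acquisitions: " + run(true));
    }
}
```

Both policies guarantee mutual exclusion; the fair variant additionally ensures that every waiting thread eventually acquires the lock, which is the kind of liveness guarantee a high-level spec would express as a fairness constraint.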