On Wednesday, October 16, 2024 at 9:04:24 AM UTC-4 Benjamin Parsons-Willis wrote:
Hi all!
I am working with TLA+/PlusCal, although this may not be the appropriate tool for what I want to test.
I am creating a specification for a system that must either achieve consistency or report failure, in a constrained environment where a total ordering of events (mutations/assignments) cannot always be derived.
For instance, we may receive events that look like:
Event 1: assignment of state to value 3; occurred between t=2 and t=3
Event 2: assignment of state to value 2; occurred between t=2 and t=4
With the above, it would not be possible to derive an output value, and this would raise a failure condition.
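To make the failure condition concrete, here is a rough sketch of how I imagine the overlap check in TLA+ (the operator names and the record shape are just my own invention):

```tla
(* An event carries an assigned value and an uncertainty interval [lo, hi]. *)
Event(val, lo, hi) == [value |-> val, lo |-> lo, hi |-> hi]

(* Two events are unordered iff their uncertainty intervals overlap. *)
Unordered(e1, e2) == e1.lo < e2.hi /\ e2.lo < e1.hi

(* Consensus fails when two unordered events assign different values,
   e.g. Event(3, 2, 3) and Event(2, 2, 4) above. *)
Conflict(e1, e2) == Unordered(e1, e2) /\ e1.value # e2.value
```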
Regardless, to model this I need to be able to model the passage of time.
Events are generated by 'polling' processes, which emit events at a fixed time increment.
A synchronised clock can be assumed for all processes.
I have run into difficulty representing this clock variable without producing an infinite state space.
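The naive thing I tried looks roughly like the sketch below (names mine); without the artificial MaxClock bound, every tick yields a new state and TLC never terminates:

```tla
VARIABLES clock, events

MaxClock == 10  \* artificial bound just to keep model checking finite;
                \* real time is of course unbounded

Tick == /\ clock < MaxClock
        /\ clock' = clock + 1
        /\ UNCHANGED events
```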
Having read about VIEW, I understand that I can instruct TLC to ignore the global clock, but the clock leaks into the specification regardless, because timestamps are embedded in the events themselves.
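As I understand it, VIEW is set up roughly like this (a sketch of my attempt), but since events record their timestamps, projecting out clock alone does not collapse the state space:

```tla
(* In the spec: a state view that omits the clock variable. *)
View == <<events>>
```

```
\* In the TLC configuration file:
VIEW View
```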
I have another process, a 'coordinator', which takes these events, establishes a partial ordering, and then resolves all ordered cases (or mergeable cases, like unordered mutations) to either establish a consensus value or note that consensus could not be reached (overlapping assignments).
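The coordinator's resolution step, as I picture it, is something like the following sketch (again, all names are mine; the merge of unordered-but-compatible mutations is elided):

```tla
(* e1 strictly precedes e2 iff e1's interval ends before e2's begins. *)
Precedes(e1, e2) == e1.hi <= e2.lo

(* Every pair of distinct events is ordered one way or the other. *)
TotallyOrdered(S) ==
    \A e1, e2 \in S : e1 = e2 \/ Precedes(e1, e2) \/ Precedes(e2, e1)

(* If the set is totally ordered, the last assignment wins;
   otherwise report that no consensus value exists. *)
Resolve(S) ==
    IF TotallyOrdered(S)
    THEN [ok |-> TRUE,
          value |-> (CHOOSE e \in S :
                        \A f \in S : f = e \/ Precedes(f, e)).value]
    ELSE [ok |-> FALSE]
```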
Is TLA+ the best tool for this task?
If so, what are some approaches to consider? I have looked at "Real-Time is easy", but I have not been able to match it to my use case; perhaps I am not understanding it?
Thanks for any insight you could offer.