
Re: [tlaplus] TLC on the GPU



You can find the details of how TLC model checking works in Section 14.3.1 of Specifying Systems: https://lamport.azurewebsites.net/tla/book-02-08-08.pdf.

I think the challenge is that TLC effectively "brute-forces" state-space exploration by performing a breadth-first graph search. GPUs are optimized for data-parallel arithmetic over regular memory layouts, whereas graph search involves irregular memory access and data-dependent branching, so traditional graph-search algorithms don't translate neatly to GPU execution.
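To make the shape of the problem concrete, here is a minimal sketch of explicit-state breadth-first exploration, roughly the strategy Section 14.3.1 describes. This is an illustration, not TLC's actual code; the `next_states` and `invariant` functions and the toy counter model are hypothetical stand-ins:

```python
from collections import deque

def explore(init_states, next_states, invariant):
    """Breadth-first exploration of the reachable state space.
    Returns a state violating the invariant, or None if none is reachable."""
    seen = set(init_states)
    frontier = deque(init_states)
    while frontier:
        state = frontier.popleft()
        if not invariant(state):
            return state  # counterexample found
        for succ in next_states(state):
            if succ not in seen:
                seen.add(succ)
                frontier.append(succ)
    return None  # invariant holds on every reachable state

# Toy model: a counter that increments modulo 5 must stay below 5.
bad = explore({0}, lambda s: {(s + 1) % 5}, lambda s: s < 5)
```

Note that the inner loop chases pointers through a hash set and a queue; that irregular, serial access pattern is exactly what maps poorly onto GPU hardware.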

There may be a way to rephrase these algorithms, though, to make them amenable to vector processing. I found a couple of papers/presentations on the topic:

https://research.nvidia.com/sites/default/files/pubs/2011-08_High-Performance-and/BFS%20TR.pdf
https://on-demand.gputechconf.com/gtc/2012/presentations/S0600-Scalable-GPU-Graph-Traversal.pdf
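The key idea in work like the above is level-synchronous BFS: instead of popping one state at a time off a queue, the whole current frontier is expanded in one bulk step, and each state's expansion is independent of the others. A rough sequential sketch of that formulation (hypothetical helper names, not code from either paper):

```python
def bfs_levels(init_states, next_states):
    """Level-synchronous BFS: expand the entire frontier in one bulk
    step per iteration. Each state's expansion is independent, which
    is the property GPU BFS kernels exploit."""
    seen = set(init_states)
    frontier = set(init_states)
    levels = [frontier]
    while frontier:
        # This loop over the frontier is embarrassingly parallel:
        # on a GPU, each thread would expand one frontier state.
        next_frontier = set()
        for s in frontier:
            next_frontier |= next_states(s)
        frontier = next_frontier - seen
        seen |= frontier
        if frontier:
            levels.append(frontier)
    return levels
```

The hard parts on a real GPU are the pieces this sketch hides: deduplicating the frontier and testing membership in `seen` without a serial hash set, which is where most of the cleverness in those papers goes.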

Also, if you are mainly interested in performance and aren't tied to the GPU as the way to get it, there is the Apalache model checker as well: https://apalache.informal.systems/. It translates (bounded) model checking into an SMT constraint problem, which solvers such as Z3 can often dispatch quickly. I haven't used it, but I have always thought this project looks very interesting.


On Sun, Feb 6, 2022 at 1:07 PM <sglaser@xxxxxxxxxxx> wrote:
Let me know if I can help with this.

I don’t know enough about the TLA+ internals to know what makes sense here.

I’m an occasional TLA+ user and work for Nvidia. 

Steve Glaser
sglaser@xxxxxxxxxx
On Feb 6, 2022, 8:46 AM -0800, Jones Martins <jonesmvc@xxxxxxxxx> wrote:
Hi everyone,

Would a GPU model checker be useful for large models in TLA⁺? GPUs are much faster than multicore CPU systems in certain scenarios; would model checking be one of them?
I'm assuming that, in order to model-check, one would need to bring the parser to the GPU as well, so that reading from RAM doesn't become a bottleneck.

Best,

Jones

--
You received this message because you are subscribed to the Google Groups "tlaplus" group.
To unsubscribe from this group and stop receiving emails from it, send an email to tlaplus+unsubscribe@xxxxxxxxxxxxxxxx.
To view this discussion on the web visit https://groups.google.com/d/msgid/tlaplus/cc2b665d-9863-485e-bbac-04c652f0e92fn%40googlegroups.com.