The conventional perspective on Markov chains considers decision problems concerning the probabilities with which temporal properties are satisfied by traces of visited states. However, consider the following query about a stochastic system modelling the weather: given the conditions today, will there be a day with less than a 50% chance of rain? The conventional perspective is ill-equipped to decide such problems, which concern the evolution of the initial distribution. The alternative perspective we consider views Markov chains as distribution transformers: the focus is on the sequence of distributions over states at each step, where the evolution is driven by the underlying stochastic transition matrix. More precisely, given an initial distribution vector μ and a stochastic transition matrix M, we ask whether the ensuing sequence of distributions (μ, Mμ, M²μ, …) satisfies a given temporal property. This is a special case of the model-checking problem for linear dynamical systems, which is not known to be decidable in full generality. The goal of this article is to delineate the classes of instances for which this problem can be solved, under the assumption that the dynamics is governed by stochastic matrices.
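As a rough illustration of the distribution-transformer view (not taken from the paper), the following Python sketch iterates the distribution sequence μ, Mμ, M²μ, … for a small two-state weather chain and scans a finite prefix for a day on which the chance of rain drops below 50%. The transition matrix, initial distribution, and horizon are invented for this example; a bounded scan like this cannot settle the paper's decision problem, which asks about the full infinite sequence.

import numpy as np

# Hypothetical two-state weather chain (state 0: rain, state 1: sun).
# The numbers are illustrative only. Columns sum to 1, so the distribution
# evolves as mu_{t+1} = M @ mu_t (column-stochastic convention).
M = np.array([[0.8, 0.3],   # P(rain tomorrow | rain today), P(rain tomorrow | sun today)
              [0.2, 0.7]])  # P(sun tomorrow  | rain today), P(sun tomorrow  | sun today)

mu = np.array([0.9, 0.1])   # today: 90% chance of rain

def first_day_below(M, mu, threshold=0.5, horizon=50):
    """Return the first step t < horizon with P(rain) < threshold, else None.

    Only a finite prefix of the distribution sequence is inspected; the
    decision problem studied in the paper concerns the infinite sequence.
    """
    for t in range(horizon):
        if mu[0] < threshold:
            return t
        mu = M @ mu
    return None

print(first_day_below(M, mu))

With these particular parameters the chance of rain decreases monotonically towards its stationary value of 0.6, so the scan prints None: no day on the inspected prefix falls below the 50% threshold.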
Submitted, 2024. 19 pages.
PDF
© 2024 Rajab Aghamov, Christel Baier, Toghrul Karimov, Joris Nieuwveld, Joël Ouaknine, Jakob Piribauer, and Mihir Vahanwala.