Markoff chain


Definitions

Noun
a Markov process in which the parameter takes discrete time values; a discrete-time Markov process
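
Since the definition turns on the time parameter being discrete, a minimal sketch may help: the chain occupies one state at each integer time step, and the next state depends only on the current one. The states and transition probabilities below are invented purely for illustration.

```python
import random

# Illustrative transition table for a two-state discrete-time Markov chain.
# Each row lists (next_state, probability) pairs summing to 1.
transitions = {
    "sunny": [("sunny", 0.9), ("rainy", 0.1)],
    "rainy": [("sunny", 0.5), ("rainy", 0.5)],
}

def step(state):
    """Sample the next state from the current state's transition row."""
    r = random.random()
    cumulative = 0.0
    for nxt, p in transitions[state]:
        cumulative += p
        if r < cumulative:
            return nxt
    return transitions[state][-1][0]  # guard against rounding

def walk(start, n):
    """Return the chain's states at discrete times 0, 1, ..., n."""
    states = [start]
    for _ in range(n):
        states.append(step(states[-1]))
    return states
```

The "discrete time values" of the definition are the integer indices 0, 1, ..., n of the list returned by `walk`; a Markov process with a continuous parameter would instead evolve over real-valued time.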