Markov chain


Definitions

Noun
a Markov process for which the parameter takes discrete time values
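
To illustrate the "discrete time values" in the definition, here is a minimal sketch in Python of a discrete-time Markov chain; the two weather states and their transition probabilities are hypothetical, chosen only for illustration.

```python
import random

# Hypothetical two-state chain: the states and probabilities below are
# illustrative assumptions, not drawn from any source.
TRANSITIONS = {
    "sunny": {"sunny": 0.9, "rainy": 0.1},
    "rainy": {"sunny": 0.5, "rainy": 0.5},
}

def step(state: str) -> str:
    """Sample the next state; it depends only on the current state."""
    row = TRANSITIONS[state]
    return random.choices(list(row), weights=list(row.values()))[0]

def simulate(start: str, n_steps: int) -> list[str]:
    """Advance the chain through n_steps discrete time steps."""
    chain = [start]
    for _ in range(n_steps):
        chain.append(step(chain[-1]))
    return chain

print(simulate("sunny", 10))
```

Each call to `step` uses only the current state, which captures the Markov property, and the loop index plays the role of the discrete time parameter in the definition.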