Markoff process

Definitions

Noun
a simple stochastic process in which the distribution of future states depends only on the present state, not on how the process arrived at that state
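
The memoryless property in the definition can be illustrated with a small simulation. The following is an illustrative sketch only: the two weather states and their transition probabilities are invented for the example and are not part of the definition.

```python
import random

# Illustrative two-state Markov process: each row gives the probability
# of moving to each next state, conditioned only on the current state.
TRANSITIONS = {
    "sunny": {"sunny": 0.8, "rainy": 0.2},
    "rainy": {"sunny": 0.4, "rainy": 0.6},
}

def next_state(state, rng):
    # The next state is drawn from a distribution that depends only on
    # the current state (the Markov property), not on earlier history.
    probs = TRANSITIONS[state]
    return rng.choices(list(probs), weights=list(probs.values()))[0]

def simulate(start, steps, seed=0):
    # Generate a sample path of the process, one transition at a time.
    rng = random.Random(seed)
    chain = [start]
    for _ in range(steps):
        chain.append(next_state(chain[-1], rng))
    return chain

print(simulate("sunny", 5))
```

Because the transition probabilities depend only on the current state, the entire process is specified by the transition table and a starting state.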
