Library / English Dictionary
MARKOV PROCESS
I. (noun)
Sense 1
Meaning:
A stochastic process in which the probability distribution of future states depends only on the present state, not on the sequence of states by which it arrived at the present state
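The defining "memoryless" condition in the meaning above can be written symbolically; the following is a standard formulation for the discrete-time case (the entry's definition also covers continuous-time parameters):

```latex
P(X_{n+1} = x \mid X_n = x_n, X_{n-1} = x_{n-1}, \dots, X_0 = x_0)
  = P(X_{n+1} = x \mid X_n = x_n)
```

That is, conditioning on the entire history gives the same distribution for the next state as conditioning on the present state alone.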
Synonyms:
Markoff process; Markov process
Classified under:
Nouns denoting natural processes
Hypernyms ("Markov process" is a kind of...):
stochastic process (a statistical process involving a number of random variables depending on a variable parameter (which is usually time))
Hyponyms (each of the following is a kind of "Markov process"):
Markoff chain; Markov chain (a Markov process whose time parameter takes discrete values)
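A Markov chain, the discrete-time case listed above, can be illustrated with a minimal simulation. The two-state weather model and its transition probabilities below are illustrative assumptions, not part of the entry; the point is that the next state is sampled from a distribution that depends only on the current state:

```python
import random

# Hypothetical two-state model (states and probabilities are illustrative).
# Each entry maps a state to its next-state distribution -- the Markov
# property: the future depends only on the present state.
TRANSITIONS = {
    "sunny": [("sunny", 0.8), ("rainy", 0.2)],
    "rainy": [("sunny", 0.4), ("rainy", 0.6)],
}

def step(state, rng):
    """Sample the next state using only the current state."""
    states, weights = zip(*TRANSITIONS[state])
    return rng.choices(states, weights=weights)[0]

def simulate(start, n_steps, seed=0):
    """Run the chain for n_steps, returning the list of visited states."""
    rng = random.Random(seed)
    path = [start]
    for _ in range(n_steps):
        path.append(step(path[-1], rng))
    return path

print(simulate("sunny", 5, seed=1))
```

Note that `simulate` never consults earlier states: `step` receives only `path[-1]`, which is exactly the memorylessness described in the definition.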