
Markov Definition

Markovian
adjective
of, relating to, or resembling a Markov process or Markov chain, especially by having probabilities defined in terms of transition from the possible existing states to other states
Markov chain
noun
a usually discrete stochastic process (such as a random walk) in which the probabilities of occurrence of various future states depend only on the present state of the system or on the immediately preceding state, and not on the path by which the present state was achieved — called also Markoff chain
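The common thread in these senses is the Markov property: the probability of moving to a next state depends only on the current state, not on how that state was reached. As an informal illustration (a minimal sketch, assuming a made-up two-state weather model with hypothetical transition probabilities), a discrete Markov chain can be simulated by repeatedly drawing the next state from the current state's transition probabilities:

import random

# Hypothetical transition probabilities for a two-state weather model.
# Each row gives P(next state | current state); each row sums to 1.
TRANSITIONS = {
    "sunny": {"sunny": 0.8, "rainy": 0.2},
    "rainy": {"sunny": 0.4, "rainy": 0.6},
}

def step(state):
    # The next state is drawn using only the current state (the Markov property).
    probs = TRANSITIONS[state]
    return random.choices(list(probs), weights=list(probs.values()))[0]

def simulate(start, n_steps):
    # Build a sample path; nothing earlier than the current state is ever consulted.
    path = [start]
    for _ in range(n_steps):
        path.append(step(path[-1]))
    return path

print(simulate("sunny", 10))  # e.g. ['sunny', 'sunny', 'rainy', ...]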
Markov process
noun
a stochastic process (such as Brownian motion) that resembles a Markov chain except that the states are continuous; also : Markov chain — called also Markoff process
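For the continuous-state sense, a rough sketch (assuming standard Brownian motion approximated by independent Gaussian increments of variance dt) shows the same property: each new position is generated from the current position alone.

import random

def brownian_path(n_steps, dt=0.01, start=0.0):
    # Approximate standard Brownian motion: each increment is Gaussian with
    # mean 0 and variance dt, so the next position depends only on the current one.
    x = start
    path = [x]
    for _ in range(n_steps):
        x += random.gauss(0.0, dt ** 0.5)
        path.append(x)
    return path

print(brownian_path(5))  # e.g. [0.0, 0.03..., -0.05..., ...]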
