
Markovian Definition

Markovian
adjective (Mar-kov-ian)
of, relating to, or resembling a Markov process or Markov chain especially by having probabilities defined in terms of transition from the possible existing states to other states
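
To make the adjective concrete, here is a minimal Python sketch of a system with Markovian transitions, assuming a hypothetical two-state weather model; the states, names, and probabilities are purely illustrative, not from any real source:

    import random

    # Illustrative transition probabilities: TRANSITIONS[current][next].
    # The Markovian property is that these depend only on the current state.
    TRANSITIONS = {
        "sunny": {"sunny": 0.8, "rainy": 0.2},
        "rainy": {"sunny": 0.4, "rainy": 0.6},
    }

    def next_state(current: str) -> str:
        """Draw the next state using only the present state."""
        probs = TRANSITIONS[current]
        return random.choices(list(probs), weights=list(probs.values()))[0]

    state = "sunny"
    for _ in range(5):
        state = next_state(state)
        print(state)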
Markov chain
noun
a usually discrete stochastic process (such as a random walk) in which the probabilities of occurrence of various future states depend only on the present state of the system or on the immediately preceding state and not on the path by which the present state was achieved —called also Markoff chain
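
The random walk named in the definition is the standard illustration: the probability of each future position depends only on the present position, never on the path taken to reach it. A minimal Python sketch, with illustrative symmetric step probabilities:

    import random

    def random_walk(steps: int, start: int = 0) -> list[int]:
        """Simulate a symmetric random walk; each step is +1 or -1 with equal odds."""
        path = [start]
        for _ in range(steps):
            # The next position is determined by the present position alone.
            path.append(path[-1] + random.choice([-1, 1]))
        return path

    print(random_walk(10))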
Markov process
noun
a stochastic process (such as Brownian motion) that resembles a Markov chain except that the states are continuous; also : Markov chain —called also Markoff process
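
Brownian motion, cited in the definition, is the classic continuous-state case. A minimal Python sketch that approximates a standard Brownian path on a discrete time grid (the grid spacing and path length are arbitrary illustrative choices):

    import math
    import random

    def brownian_path(n_steps: int, dt: float = 0.01) -> list[float]:
        """Each increment is Gaussian with variance dt and depends only on the present value."""
        x = 0.0
        path = [x]
        for _ in range(n_steps):
            x += random.gauss(0.0, math.sqrt(dt))
            path.append(x)
        return path

    print(brownian_path(5))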
