Hyper Dictionary




Meaning of MARKOV CHAIN

 
Computing Dictionary

Definition:

(Named after Andrei Markov) A model of sequences of events in which the probability of the next event depends only on the outcome of the event immediately preceding it.

A Markov process is governed by a Markov chain.
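As a minimal sketch of the idea, the following hypothetical two-state "weather" chain encodes the transition probabilities in a table and simulates a sequence of states; each step depends only on the current state (the names and probabilities here are illustrative, not from the definition above):

```python
import random

# Hypothetical transition table: P[current][next]; each row sums to 1.
P = {
    "sunny": {"sunny": 0.8, "rainy": 0.2},
    "rainy": {"sunny": 0.4, "rainy": 0.6},
}

def step(state, rng):
    """Draw the next state from the current state's transition row."""
    r = rng.random()
    cumulative = 0.0
    for nxt, p in P[state].items():
        cumulative += p
        if r < cumulative:
            return nxt
    return nxt  # guard against floating-point rounding

def simulate(start, n, seed=0):
    """Run the chain for n steps, returning the visited states."""
    rng = random.Random(seed)
    states = [start]
    for _ in range(n):
        states.append(step(states[-1], rng))
    return states

print(simulate("sunny", 5, seed=1))
```

Note that `step` consults only the current state, never the earlier history; that restriction is exactly the Markov property described above.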

In simulation, the principle of the Markov chain is applied to the selection of samples from a probability density function to be applied to the model. SIMSCRIPT II.5 uses this approach for some modelling functions.
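One common way a Markov chain is used to draw samples from a density is the random-walk Metropolis algorithm (a standard Markov-chain Monte Carlo method, sketched here as an illustration rather than as how SIMSCRIPT II.5 implements it):

```python
import math
import random

def metropolis(log_density, x0, n_samples, step=1.0, seed=0):
    """Random-walk Metropolis sampler.

    Each proposal depends only on the current sample, so the
    accepted values form a Markov chain whose long-run
    distribution matches the target density.
    """
    rng = random.Random(seed)
    x = x0
    samples = []
    for _ in range(n_samples):
        proposal = x + rng.gauss(0.0, step)
        # Accept with probability min(1, p(proposal) / p(x)),
        # computed in log space for numerical stability.
        if math.log(rng.random()) < log_density(proposal) - log_density(x):
            x = proposal
        samples.append(x)
    return samples

# Target: a standard normal density, up to a constant factor.
samples = metropolis(lambda x: -0.5 * x * x, x0=0.0, n_samples=20000)
```

With enough steps, the sample mean and variance approach those of the target distribution, even though the sampler only ever compares the density at two neighbouring points.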

 

 

COPYRIGHT © 2000-2013 HYPERDICTIONARY.COM