Markov chain
noun
Markov chain  n.  (Also spelled Markoff chain)  (Statistics) A random process (Markov process) in which the probabilities of discrete states in a series depend only on the properties of the immediately preceding state or the next preceding state, independent of the path by which the preceding state was reached. It differs from the more general Markov process in that the states of a Markov chain are discrete rather than continuous. Certain physical processes, such as the diffusion of a molecule in a fluid, are modelled as Markov chains. See also random walk.
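
As an illustration (not part of the dictionary entry itself), the random walk cross-referenced above is among the simplest Markov chains: the next position depends only on the current position, never on the path taken to reach it. A minimal Python sketch, with all names and values chosen purely for this example:

    import random

    # Illustrative sketch: a symmetric random walk on the integers is a
    # Markov chain whose next state depends only on the current position.
    def random_walk(steps, start=0):
        """Return the list of positions visited by a simple random walk."""
        position = start
        path = [position]
        for _ in range(steps):
            position += random.choice([-1, 1])  # step up or down with equal probability
            path.append(position)
        return path

    print(random_walk(20))

Run many such walks and the spread of final positions mimics the diffusion of a molecule in a fluid, as the definition notes.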






Collaborative International Dictionary of English 0.48

Words linked to "Markov chain": Markoff process


