Markov chain


Definition

We begin with a probability space $(\Omega, \mathcal{F}, \mathbb{P})$. Let $I$ be a countable set, $(X_n : n \geq 0)$ be a collection of random variables taking values in $I$, $\mathbf{T} = (t_{ij} : i, j \in I)$ be a stochastic matrix, and $\lambda$ be a distribution on $I$. We call $(X_n)_{n \geq 0}$ a Markov chain with initial distribution $\lambda$ and transition matrix $\mathbf{T}$ if:

  1. $X_0$ has distribution $\lambda$.

  2. For all $n \geq 0$ and all $i_0, \ldots, i_{n+1} \in I$,
     $\mathbb{P}(X_{n+1} = i_{n+1} \mid X_0 = i_0, \ldots, X_n = i_n) = t_{i_n i_{n+1}}$.

That is, the next value of the chain depends only on the current value, not on any previous values. This is often summed up in the pithy phrase, “Markov chains have no memory.”

As a special case of (2) we have that $\mathbb{P}(X_{n+1} = j \mid X_n = i) = t_{ij}$ whenever $i, j \in I$. The values $t_{ij}$ are therefore called the transition probabilities of the Markov chain.
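For example, take $I = \{1, 2\}$ with

$$\mathbf{T} = \begin{pmatrix} 0.9 & 0.1 \\ 0.5 & 0.5 \end{pmatrix}.$$

From state $1$ the chain stays at $1$ with probability $t_{11} = 0.9$ and jumps to $2$ with probability $t_{12} = 0.1$; from state $2$ it moves to either state with probability $0.5$. Each row sums to $1$, which is exactly the requirement that $\mathbf{T}$ be a stochastic matrix.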

Discussion

Markov chains are arguably the simplest examples of random processes. They come in discrete-time and continuous-time versions; the discrete-time version is presented above.
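The definition translates directly into a simulation procedure: draw $X_0$ from $\lambda$, then repeatedly draw the next state from the row of $\mathbf{T}$ indexed by the current state. The following is a minimal sketch in Python, assuming a finite state space; the function name sample_chain and the dictionary encoding of $\lambda$ and $\mathbf{T}$ are illustrative choices, not standard notation.

    import random

    def sample_chain(states, init_dist, transition, n_steps, rng=random):
        """Sample a path X_0, X_1, ..., X_{n_steps} of a Markov chain.

        states     -- list enumerating the (here finite) state space I
        init_dist  -- dict mapping each state to its probability under lambda
        transition -- dict of dicts: transition[i][j] is the entry t_{ij} of T
        n_steps    -- number of transitions to simulate
        """
        # Condition 1: X_0 has distribution lambda.
        x = rng.choices(states, weights=[init_dist[s] for s in states])[0]
        path = [x]
        for _ in range(n_steps):
            # Condition 2: X_{n+1} is drawn from row t_{x,.} of T, using
            # only the current state x -- the chain has no memory.
            x = rng.choices(states, weights=[transition[x][s] for s in states])[0]
            path.append(x)
        return path

    # The two-state example from above, started in state 1 with certainty.
    states = [1, 2]
    lam = {1: 1.0, 2: 0.0}
    T = {1: {1: 0.9, 2: 0.1},
         2: {1: 0.5, 2: 0.5}}
    print(sample_chain(states, lam, T, n_steps=10))

Each step consults only the current state when choosing the next one, which is precisely the memorylessness expressed by condition (2).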
