
In probability theory, an additive Markov chain is a Markov chain with an additive conditional probability function. Here the process is a discrete-time Markov chain of order m and the transition probability to a state at the next time is a sum of functions, each depending on the next state and one of the m previous states.


Definition

An additive Markov chain of order m is a sequence of random variables X1, X2, X3, ..., possessing the following property: the probability that a random variable Xn takes a certain value xn, given that the values of all previous variables are fixed, depends on the values of the m previous variables only (Markov chain of order m), and the influence of the previous variables on the generated one is additive,

\( {\displaystyle \Pr(X_{n}=x_{n}\mid X_{n-1}=x_{n-1},X_{n-2}=x_{n-2},\dots ,X_{n-m}=x_{n-m})=\sum _{r=1}^{m}f(x_{n},x_{n-r},r).} \)
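This conditional probability can be sketched as a sampler. The state space and the concrete additive terms f below are illustrative assumptions; the definition only requires that, for a valid chain, the per-state sums form a probability distribution.

```python
import random

def additive_markov_step(rng, history, states, f):
    """Sample X_n for an additive Markov chain of order m = len(history).

    history        -- the m previous states, most recent last
    f(x, x_prev, r) -- additive contribution of the state r steps back
    For a well-defined chain the per-state sums are probabilities adding
    up to 1 over the state space; random.choices renormalizes, which also
    absorbs small numerical error.
    """
    m = len(history)
    probs = [sum(f(x, history[-r], r) for r in range(1, m + 1)) for x in states]
    return rng.choices(states, weights=probs)[0]

# Illustrative f (an assumption): each of the m = 3 previous states nudges
# X_n toward its own value; the terms sum to 1 over the states {0, 1}.
def f(x, x_prev, r):
    m = 3
    return (0.5 + (0.2 if x == x_prev else -0.2)) / m

rng = random.Random(1)
chain = [0, 1, 1]  # initial m = 3 states
for _ in range(20):
    chain.append(additive_markov_step(rng, chain[-3:], [0, 1], f))
```

Note that f depends on the next state, one past state, and the lag r only, never on pairs of past states; that restriction is what makes the chain additive.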

Binary case

A binary additive Markov chain is one whose state space consists of two values only, Xn ∈ { x1, x2 }; for example, Xn ∈ { 0, 1 }. The conditional probability function of a binary additive Markov chain can be represented as

\( {\displaystyle \Pr(X_{n}=1\mid X_{n-1}=x_{n-1},X_{n-2}=x_{n-2},\dots )={\bar {X}}+\sum _{r=1}^{m}F(r)(x_{n-r}-{\bar {X}}),} \)

\( {\displaystyle \Pr(X_{n}=0\mid X_{n-1}=x_{n-1},X_{n-2}=x_{n-2},\dots )=1-\Pr(X_{n}=1\mid X_{n-1}=x_{n-1},X_{n-2}=x_{n-2},\dots ).} \)

Here \( {\bar {X}} \) is the probability of finding Xn = 1 in the sequence, and F(r) is referred to as the memory function. The value of \( {\bar {X}} \) and the function F(r) contain all the information about the correlation properties of the Markov chain.
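The binary conditional probability above translates directly into a generator. The concrete memory function and the i.i.d. initialization of the first m states are assumptions made for this sketch.

```python
import random

def binary_additive_chain(n, m, xbar, F, seed=0):
    """Generate n steps of a binary additive Markov chain of order m:

        Pr(X_i = 1 | past) = xbar + sum_{r=1}^{m} F(r) * (x[i-r] - xbar)

    xbar is the probability of a 1 and F the memory function.
    """
    rng = random.Random(seed)
    # Seed the history with m i.i.d. bits at level xbar (an assumption).
    x = [1 if rng.random() < xbar else 0 for _ in range(m)]
    for i in range(m, n):
        p = xbar + sum(F(r) * (x[i - r] - xbar) for r in range(1, m + 1))
        p = min(max(p, 0.0), 1.0)  # clamp against an ill-chosen F
        x.append(1 if rng.random() < p else 0)
    return x

# Example: xbar = 0.5 with a short, exponentially decaying memory function.
chain = binary_additive_chain(10000, m=5, xbar=0.5, F=lambda r: 0.3 * 0.5 ** (r - 1))
```

Because the chosen F sums to less than 1, the conditional probability stays strictly inside (0, 1) and the clamp is never actually triggered for this example.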
Relation between the memory function and the correlation function

In the binary case, the correlation function between the variables \( X_{n} \) and \( X_{k} \) of the chain depends only on the distance r = n − k. It is defined as follows:

\( K(r)=\langle (X_{n}-{\bar {X}})(X_{n+r}-{\bar {X}})\rangle =\langle X_{n}X_{n+r}\rangle -{\bar {X}}^{2}, \)

where the symbol \( \langle \cdots \rangle \) denotes averaging over all n. By definition,
\( K(-r)=K(r),K(0)={\bar {X}}(1-{\bar {X}}). \)

There is a relation between the memory function and the correlation function of the binary additive Markov chain:[1]

\( K(r)=\sum _{s=1}^{m}K(r-s)F(s),\qquad r=1,2,\dots \,. \)
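For a fixed small order, this relation is a set of linear equations that can be inverted to recover the memory function from measured correlations. The following sketch assumes m = 2 (larger m needs a general linear solver) and uses K(−r) = K(r) and Cramer's rule.

```python
def memory_from_correlations(K0, K1, K2):
    """Solve K(r) = sum_{s=1}^{2} K(r-s) F(s) for F(1), F(2).

    With K(-1) = K(1), the r = 1, 2 equations read

        K(1) = K(0) F(1) + K(1) F(2)
        K(2) = K(1) F(1) + K(0) F(2)

    and are solved here by Cramer's rule.
    """
    det = K0 * K0 - K1 * K1
    F1 = (K1 * K0 - K2 * K1) / det
    F2 = (K0 * K2 - K1 * K1) / det
    return F1, F2

# Round trip: build K(1), K(2) from a known memory function F(1) = 0.3,
# F(2) = 0.1, then recover that memory function from the correlations.
K0 = 0.25                  # = xbar * (1 - xbar) with xbar = 0.5
K1 = K0 * 0.3 / (1 - 0.1)  # from K(1) = K(0) F(1) + K(1) F(2)
K2 = K1 * 0.3 + K0 * 0.1   # from K(2) = K(1) F(1) + K(0) F(2)
F1, F2 = memory_from_correlations(K0, K1, K2)
```

The round trip returns F(1) = 0.3 and F(2) = 0.1 exactly, confirming that the two equations determine the memory function uniquely whenever the determinant K(0)² − K(1)² is nonzero.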

See also

Examples of Markov chains

References

S.S. Melnyk, O.V. Usatenko, and V.A. Yampol'skii (2006). "Memory functions of the additive Markov chains: applications to complex dynamic systems". Physica A. 361 (2): 405–415. doi:10.1016/j.physa.2005.06.083.


A.A. Markov (1906). "Rasprostranenie zakona bol'shih chisel na velichiny, zavisyaschie drug ot druga" [Extension of the law of large numbers to quantities depending on each other]. Izvestiya Fiziko-matematicheskogo obschestva pri Kazanskom universitete, 2nd series, vol. 15: 135–156.
A.A. Markov (1971). "Extension of the limit theorems of probability theory to a sum of variables connected in a chain". Reprinted in Appendix B of: R. Howard, Dynamic Probabilistic Systems, volume 1: Markov Chains. John Wiley and Sons.
S. Hod; U. Keshet (2004). "Phase transition in random walks with long-range correlations". Phys. Rev. E. 70: 015104. arXiv:cond-mat/0311483. Bibcode:2004PhRvE..70a5104H. doi:10.1103/PhysRevE.70.015104.
S.L. Narasimhan; J.A. Nathan; K.P.N. Murthy (2005). "Can coarse-graining introduce long-range correlations in a symbolic sequence?". Europhys. Lett. 69 (1): 22. arXiv:cond-mat/0409042. Bibcode:2005EL.....69...22N. doi:10.1209/epl/i2004-10307-2.
S. Ramakrishnan (1981). "Finitely Additive Markov Chains". Transactions of the American Mathematical Society. 265 (1): 247–272. JSTOR 1998493.
