- A finite set of states $\alpha = 1, \dots, N$
- Transfer probability: if at step $t$ the probability of the system sitting at the state $\beta$ is $\rho_\beta(t)$, then at the next step the probability of the system sitting at the state $\alpha$ is: $\rho_\alpha(t+1) = \sum_\beta P_{\alpha\beta}\,\rho_\beta(t)$
Writing down the matrix form, it looks like: $\vec{\rho}(t+1) = P \cdot \vec{\rho}(t)$
- Positivity: $P_{\alpha\beta} \ge 0$
- Conservation of probability: $\sum_\alpha P_{\alpha\beta} = 1$
- Typically, $P_{\alpha\beta} \ne P_{\beta\alpha}$: the transfer matrix need not be symmetric.
- When the system reaches its equilibrium, there is a particular distribution $\vec{\rho}^*$ fulfilling $\sum_\beta P_{\alpha\beta}\,\rho^*_\beta = \rho^*_\alpha$, i.e. $P\vec{\rho}^* = \vec{\rho}^*$
- Ergodicity: after a finite number of steps, the system can evolve from any state $\alpha$ to any final state $\beta$. For Markov chains this property is also called irreducibility.
- $P$ can be diagonalizable. (Here $P$ needs to be regular or positive: $(P^n)_{\alpha\beta} > 0$ for some $n$, or $P_{\alpha\beta} > 0$.) And for each eigenvalue $\lambda$ of $P$, there exists at least one right eigenvector and one left eigenvector.[1] (The matrix must be well behaved; otherwise there are plenty of counterexamples, such as those collected on the website in [2].)
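The properties above can be sketched concretely. Here is a hypothetical 3-state chain in NumPy (the matrix values are made up for illustration; NumPy and the variable names are assumptions, not part of the notes):

```python
import numpy as np

# Columns are "from" states, rows are "to" states, so that
# rho(t+1) = P @ rho(t); each column sums to one.
P = np.array([
    [0.5, 0.2, 0.3],
    [0.3, 0.6, 0.1],
    [0.2, 0.2, 0.6],
])

assert (P >= 0).all()                   # positivity
assert np.allclose(P.sum(axis=0), 1.0)  # conservation of probability

rho = np.array([1.0, 0.0, 0.0])         # start surely in state 0
for _ in range(100):
    rho = P @ rho                       # one Markov step

# After many steps rho approaches the equilibrium distribution,
# which satisfies P @ rho = rho.
assert np.allclose(P @ rho, rho, atol=1e-10)
print(rho)
```

Note the column-stochastic convention matches the formulas above; much of the Markov-chain literature instead uses row-stochastic matrices with left multiplication.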
Theorem 1: The matrix $P$ has at least one right eigenvector $\vec{\rho}^*$ with eigenvalue one.

Proof: $P$ has a left eigenvector with eigenvalue one -- the vector $\vec{\sigma}$ all of whose components are one, $\sigma_\alpha = 1$:

$\sum_\alpha \sigma_\alpha P_{\alpha\beta} = \sum_\alpha P_{\alpha\beta} = 1 = \sigma_\beta$

Hence $P$ must have an eigenvalue equal to one, and hence (property 6) it must also have a right eigenvector with eigenvalue one.

Theorem 2: Any right eigenvector $\vec{\rho}^\lambda$ with eigenvalue $\lambda$ different from one must have components that sum to zero.

Proof: $\vec{\rho}^\lambda$ is a right eigenvector, $\sum_\beta P_{\alpha\beta}\,\rho^\lambda_\beta = \lambda\,\rho^\lambda_\alpha$. Hence

$\lambda \sum_\alpha \rho^\lambda_\alpha = \sum_\alpha \sum_\beta P_{\alpha\beta}\,\rho^\lambda_\beta = \sum_\beta \Big(\sum_\alpha P_{\alpha\beta}\Big)\,\rho^\lambda_\beta = \sum_\beta \rho^\lambda_\beta$

So there are only two possibilities: $\lambda = 1$, or $\sum_\alpha \rho^\lambda_\alpha = 0$.

Theorem 3 (Perron-Frobenius theorem): Let $A$ be an $n \times n$ matrix with all positive matrix elements. Then $A$ has a positive eigenvalue $\lambda_0$, of multiplicity one, whose corresponding right and left eigenvectors have all positive components. Furthermore, any other eigenvalue $\lambda$ of $A$ must be smaller, $|\lambda| < \lambda_0$.

Proof:[3] Look at the sphere $S = \{x \in \mathbb{R}^n : |x| = 1\}$ and intersect it with the space $E = \{x : x_i \ge 0\}$. This gives a closed, bounded set $K$. The matrix $A$ defines a map $T(x) = Ax/|Ax|$ on $K$ because the entries of the matrix are non-negative. Because they are positive, $T(K)$ is contained in the interior of $K$. This map is a contraction: there exists $0 < \mu < 1$ such that

$d(T(x), T(y)) \le \mu\, d(x, y)$

where $d$ is the geodesic sphere distance. Such a map has a unique fixed point $x^*$ by Banach's fixed point theorem.[4][5] This is the eigenvector we were looking for: $Ax^* = \lambda_0 x^*$ with $\lambda_0 = |Ax^*| > 0$. We have seen now that on $K$ there is only one eigenvector, so every other eigenvector must have a coordinate entry which is negative. Write $|v|$ for the vector with coordinates $|v_i|$. For an eigenvector $v$ with $Av = \lambda v$, the computation

$|\lambda|\,|v| = |Av| \le A|v|$ (componentwise)

shows that $|\lambda| \le \lambda_0$, because $A|v|$ is a vector with length smaller than or equal to $\lambda_0 L$, where $L$ is the length of $|v|$; from $|\lambda| L \le \lambda_0 L$ with $L$ non-zero we get $|\lambda| \le \lambda_0$. The "$\le$" which appears in the displayed formula is however a strict inequality for some $i$ if one of the coordinate entries of $v$ is negative, since all $A_{ij} > 0$. Having established $|\lambda| < \lambda_0$, the proof is finished.
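The three theorems can be verified numerically on a toy matrix (again a made-up positive stochastic matrix; NumPy is an assumption of this sketch, not part of the notes):

```python
import numpy as np

P = np.array([
    [0.5, 0.2, 0.3],
    [0.3, 0.6, 0.1],
    [0.2, 0.2, 0.6],
])

evals, evecs = np.linalg.eig(P)

# Theorem 1: eigenvalue one exists (the all-ones vector is a left eigenvector).
i = int(np.argmin(np.abs(evals - 1.0)))
assert np.isclose(abs(evals[i]), 1.0)

# Perron-Frobenius: every other eigenvalue is strictly smaller in magnitude.
others = np.delete(np.abs(evals), i)
assert (others < 1.0).all()

# The eigenvector for eigenvalue one can be chosen with all-positive components.
rho_star = np.real(evecs[:, i])
rho_star = rho_star / rho_star.sum()  # normalize to a probability distribution
assert (rho_star > 0).all()

# Theorem 2: eigenvectors with eigenvalue != 1 have components summing to zero.
for j in range(len(evals)):
    if j != i:
        assert abs(evecs[:, j].sum()) < 1e-10
print(rho_star)
```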
Corollary 1 of Theorem 3: Combining theorem 2 and theorem 3, we can see the largest eigenvalue of $P$ is 1.

Corollary 2 of Theorem 3: An ergodic Markov chain has a unique time-independent probability distribution $\vec{\rho}^*$.

Corollary 3 of Theorem 3: One can find a complete set of right eigenvectors for $P$, once the Markov chain matches the condition of detailed balance, $P_{\alpha\beta}\,\rho^*_\beta = P_{\beta\alpha}\,\rho^*_\alpha$.

Step 1: There is a transform turning $P$ into a symmetric matrix: define $Q_{\alpha\beta} = P_{\alpha\beta}\sqrt{\rho^*_\beta/\rho^*_\alpha}$. Detailed balance gives $Q_{\beta\alpha} = P_{\beta\alpha}\sqrt{\rho^*_\alpha/\rho^*_\beta} = \dfrac{P_{\alpha\beta}\,\rho^*_\beta}{\rho^*_\alpha}\sqrt{\rho^*_\alpha/\rho^*_\beta} = Q_{\alpha\beta}$, so $Q$ is symmetric.

Step 2: We already know from linear algebra that every real symmetric matrix has a complete orthonormal basis of eigenvectors.[6] We record the eigensystem as $\sum_\beta Q_{\alpha\beta}\,\tau^\lambda_\beta = \lambda\,\tau^\lambda_\alpha$.

Step 3: Then the transform $\rho^\lambda_\alpha = \sqrt{\rho^*_\alpha}\,\tau^\lambda_\alpha$ changes the eigenvectors of $Q$ into eigenvectors of $P$:

$\sum_\beta P_{\alpha\beta}\,\rho^\lambda_\beta = \sum_\beta Q_{\alpha\beta}\sqrt{\rho^*_\alpha/\rho^*_\beta}\,\sqrt{\rho^*_\beta}\,\tau^\lambda_\beta = \sqrt{\rho^*_\alpha}\sum_\beta Q_{\alpha\beta}\,\tau^\lambda_\beta = \lambda\sqrt{\rho^*_\alpha}\,\tau^\lambda_\alpha = \lambda\,\rho^\lambda_\alpha$

So we can see the $\vec{\rho}^\lambda$ are right eigenvectors of $P$, and they also form a complete set.
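The three steps of Corollary 3 can be sketched numerically. Here $\rho^*$ and the transition matrix are made-up values, constructed so that $P_{\alpha\beta}\rho^*_\beta = P_{\beta\alpha}\rho^*_\alpha$ holds exactly:

```python
import numpy as np

rho_star = np.array([0.2, 0.3, 0.5])
P = np.array([
    [0.35, 0.10, 0.20],
    [0.15, 0.50, 0.24],
    [0.50, 0.40, 0.56],
])

# Check detailed balance (P[a,b]*rho[b] is symmetric) and stochasticity.
assert np.allclose(P * rho_star[None, :], (P * rho_star[None, :]).T)
assert np.allclose(P.sum(axis=0), 1.0)

# Step 1: Q[a,b] = P[a,b] * sqrt(rho[b]/rho[a]) is symmetric.
s = np.sqrt(rho_star)
Q = P * s[None, :] / s[:, None]
assert np.allclose(Q, Q.T)

# Step 2: a real symmetric matrix has a complete orthonormal eigenbasis.
lams, taus = np.linalg.eigh(Q)

# Step 3: rho^lam = sqrt(rho_star) * tau^lam are right eigenvectors of P.
for lam, tau in zip(lams, taus.T):
    rho_lam = s * tau
    assert np.allclose(P @ rho_lam, lam * rho_lam)
```

The transform in Step 1 is just the similarity $Q = D^{-1/2} P D^{1/2}$ with $D = \mathrm{diag}(\rho^*)$, so $P$ and $Q$ share eigenvalues.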
Aside: detailed balance
A chain satisfies detailed balance if there is some probability distribution $\vec{\rho}^*$ satisfying

$P_{\alpha\beta}\,\rho^*_\beta = P_{\beta\alpha}\,\rho^*_\alpha$

for each pair of states $\alpha$ and $\beta$. Once the system satisfies detailed balance, it also satisfies global balance (thermal equilibrium): summing the relation over $\beta$ gives $\sum_\beta P_{\alpha\beta}\,\rho^*_\beta = \rho^*_\alpha \sum_\beta P_{\beta\alpha} = \rho^*_\alpha$. The reverse derivation needs some extra condition (detailed balance is stronger than the equilibrium condition): 1) the chain is irreducible,
or 2) the initial distribution is $\vec{\rho}^*$ and the process is reversible. We construct a reverse chain

$\tilde{P}_{\alpha\beta} = \dfrac{P_{\beta\alpha}\,\rho^*_\alpha}{\rho^*_\beta}$

and reversibility means $\tilde{P} = P$, which is exactly detailed balance. So if the chain is reversible, we have detailed balance. There is also a good reference on distinguishing conditional probability and joint probability.[7]

A discrete dynamical system with a finite number of states can be guaranteed to converge to an equilibrium distribution $\vec{\rho}^*$ if the computer algorithm

1. is Markovian (has no memory),
2. is ergodic (can reach everywhere and is acyclic), and
3. satisfies detailed balance.

Proof: With detailed balance, $P$ has a complete set of eigenvectors $\vec{\rho}^\lambda$. Since our algorithm is ergodic, there is only one right eigenvector with eigenvalue one, which we can choose to be the stationary distribution $\vec{\rho}^*$; all the other eigenvalues $\lambda$ have $|\lambda| < 1$. Decompose the initial condition as $\vec{\rho}(0) = \vec{\rho}^* + \sum_{\lambda \ne 1} a_\lambda \vec{\rho}^\lambda$; then

$\vec{\rho}(t) = P^t \vec{\rho}(0) = \vec{\rho}^* + \sum_{\lambda \ne 1} a_\lambda \lambda^t \vec{\rho}^\lambda \to \vec{\rho}^* \quad (t \to \infty)$

J. P. Sethna, Statistical Mechanics: Entropy, Order Parameters, and Complexity, Chpt. 8

References:
- ^https://www.math./~freiwald/309markov.pdf
- ^http://people.math./~knill/teaching
- ^http://people.math./~knill/teaching/math19b_2011/handouts/lecture34.pdf
- ^http://www./~md3405/FPT.pdf
- ^http://www./~siegelj/SetTheoryandTopology/BanachFPT.html
- ^https://people.eecs./~ananth/223Spr07/ee223spr07_lec8.pdf
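As a closing numerical check of the convergence statement (Markovian + ergodic + detailed balance implies relaxation to $\vec{\rho}^*$), here is a sketch with a made-up matrix constructed to satisfy detailed balance with $\rho^* = (0.2, 0.3, 0.5)$:

```python
import numpy as np

rho_star = np.array([0.2, 0.3, 0.5])
P = np.array([
    [0.35, 0.10, 0.20],
    [0.15, 0.50, 0.24],
    [0.50, 0.40, 0.56],
])
assert np.allclose(P * rho_star[None, :], (P * rho_star[None, :]).T)

rho = np.array([1.0, 0.0, 0.0])   # arbitrary initial condition
for t in range(200):
    rho = P @ rho

# rho(t) = rho* + sum_{lam != 1} a_lam * lam**t * rho^lam -> rho*,
# since every |lam| < 1 for lam != 1.
assert np.allclose(rho, rho_star, atol=1e-10)
print(rho)
```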