To better understand Markov chains, we need to introduce some definitions. The first definition concerns the accessibility of states from each other: If it is possible to go from state i to state j, we say that state j is accessible from state i. In particular, we can provide the following definitions.
We say that state j is accessible from state i, written as i→j, if p^(n)_ij > 0 for some n. We assume every state is accessible from itself since p^(0)_ii = 1. Two states i and j are said to communicate, written as i↔j, if they are accessible from each other. In other words, i↔j means i→j and j→i.
Communication is an equivalence relation. That means that
- every state communicates with itself, i↔i;
- if i↔j, then j↔i;
- if i↔j and j↔k, then i↔k.
Therefore, the states of a Markov chain can be partitioned into communicating classes such that only members of the same class communicate with each other. That is, two states i and j belong to the same class if and only if i↔j.
Example 11.6
Consider the Markov chain shown in Figure 11.9. It is assumed that when there is an arrow from state i to state j, then pij>0. Find the equivalence classes for this Markov chain.
Figure 11.9 - A state transition diagram.
Solution
There are four communicating classes in this Markov chain. Looking at Figure 11.10, we notice that states 1 and 2 communicate with each other, but they do not communicate with any other nodes in the graph. Similarly, nodes 3 and 4 communicate with each other, but they do not communicate with any other nodes in the graph. State 5 does not communicate with any other states, so it by itself is a class. Finally, states 6, 7, and 8 form another class. Thus, here are the classes:
Class 1 = {state 1, state 2},
Class 2 = {state 3, state 4},
Class 3 = {state 5},
Class 4 = {state 6, state 7, state 8}.
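To make the class-finding procedure concrete, here is a minimal Python sketch that partitions states into communicating classes by checking mutual reachability. The arrow structure below is hypothetical, chosen only to be consistent with the classes listed above; the actual arrows of Figure 11.9 may differ.

```python
def reachable(adj, start):
    """Return the set of states reachable from `start` (including itself)."""
    seen, stack = {start}, [start]
    while stack:
        u = stack.pop()
        for v in adj[u]:
            if v not in seen:
                seen.add(v)
                stack.append(v)
    return seen

def communicating_classes(adj):
    """Partition the states into communicating classes:
    i and j are in the same class iff each is reachable from the other."""
    reach = {s: reachable(adj, s) for s in adj}
    classes, assigned = [], set()
    for i in adj:
        if i in assigned:
            continue
        cls = {j for j in adj if j in reach[i] and i in reach[j]}
        classes.append(sorted(cls))
        assigned |= cls
    return classes

# Hypothetical arrows, consistent with the classes found in the solution
# (the exact arrows of Figure 11.9 may differ).
arrows = {1: [2], 2: [1, 3],
          3: [4], 4: [3, 5],
          5: [6],
          6: [7], 7: [8], 8: [6]}

print(communicating_classes(arrows))
# [[1, 2], [3, 4], [5], [6, 7, 8]]
```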
A Markov chain is said to be irreducible if it has only one communicating class. As we will see shortly, irreducibility is a desirable property in the sense that it can simplify analysis of the limiting behavior.
A Markov chain is said to be irreducible if all states communicate with each other.
Looking at Figure 11.10, we notice that there are two kinds of classes. In particular, if at any time the Markov chain enters Class 4, it will always stay in that class. On the other hand, for other classes this is not true. For example, if X0=1, then the Markov chain might stay in Class 1 for a while, but at some point, it will leave that class and it will never return to that class again. The states in Class 4 are called recurrent states, while the other states in this chain are called transient.
In general, a state is said to be recurrent if, any time that we leave that state, we will return to that state in the future with probability one. On the other hand, if the probability of returning is less than one, the state is called transient. Here, we provide a formal definition:
For any state i, we define fii = P(Xn = i, for some n ≥ 1 | X0 = i). State i is recurrent if fii = 1, and it is transient if fii < 1.
It is relatively easy to show that if two states are in the same class, either both of them are recurrent, or both of them are transient. Thus, we can extend the above definitions to classes. A class is said to be recurrent if the states in that class are recurrent. If, on the other hand, the states are transient, the class is called transient. In general, a Markov chain might consist of several transient classes as well as several recurrent classes.
Consider a Markov chain and assume X0 = i. If i is a recurrent state, then the chain will return to state i any time it leaves that state. Therefore, the chain will visit state i an infinite number of times. On the other hand, if i is a transient state, the chain will return to state i with probability fii < 1. Thus, in that case, the total number of visits to state i will be a Geometric random variable with parameter 1 − fii.
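As a quick check of the geometric-visits claim, here is a small simulation sketch. It uses a hypothetical two-state chain in which state 0 returns to itself with probability 0.6 and otherwise jumps to an absorbing state, so that f00 = 0.6 and the expected total number of visits to state 0 is 1/(1 − f00) = 2.5.

```python
import random

def average_visits(p_return=0.6, trials=100_000, seed=1):
    """Hypothetical chain: state 0 returns to itself w.p. p_return,
    otherwise it jumps to an absorbing state and never comes back.
    So f00 = p_return and the number of visits to 0 is Geometric(1 - f00)."""
    rng = random.Random(seed)
    total = 0
    for _ in range(trials):
        visits = 1                      # X0 = 0 counts as the first visit
        while rng.random() < p_return:  # each success is one more return to 0
            visits += 1
        total += visits
    return total / trials

print(average_visits())   # close to 2.5
print(1 / (1 - 0.6))      # theoretical mean 1/(1 - f00) = 2.5
```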
Example 11.7
Show that in a finite Markov chain, there is at least one recurrent class.
Solution

We show this by contradiction. Suppose the chain has a finite number of states 1, 2, ⋯, r, and that all of them are transient. Then each state is visited only a finite number of times, so for each state k there is a last time Mk after which the chain never visits state k again. But then, if

n ≥ max{M1, M2, ⋯, Mr},

Xn cannot be equal to any of the states 1, 2, ⋯, r. This is a contradiction, so we conclude that there must be at least one recurrent state, which means that there must be at least one recurrent class.

Consider the Markov chain shown in Figure 11.11. There is a periodic pattern in this chain. Starting from state 0, we only return to 0 at times n = 3, 6, ⋯. In other words, p^(n)_00 = 0 if n is not divisible by 3. Such a state is called a periodic state with period d(0) = 3.
The period of a state i is the largest integer d satisfying the following property: p^(n)_ii = 0 whenever n is not divisible by d. The period of i is shown by d(i). If p^(n)_ii = 0 for all n > 0, then we let d(i) = ∞.
- If d(i) > 1, we say that state i is periodic.
- If d(i) = 1, we say that state i is aperiodic.
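The period can also be computed numerically: d(i) equals the greatest common divisor of the set of times n with p^(n)_ii > 0. Below is a minimal sketch that does this for state 0, assuming for illustration a simple three-state cycle 0→1→2→0 as a stand-in for the periodic pattern of Figure 11.11 (the actual figure may contain more states).

```python
from functools import reduce
from math import gcd

import numpy as np

# Three-state cycle 0 -> 1 -> 2 -> 0, a stand-in for Figure 11.11.
P = np.array([[0.0, 1.0, 0.0],
              [0.0, 0.0, 1.0],
              [1.0, 0.0, 0.0]])

# Times n at which the n-step return probability p^(n)_00 is positive
return_times = [n for n in range(1, 20)
                if np.linalg.matrix_power(P, n)[0, 0] > 0]

print(return_times)               # [3, 6, 9, 12, 15, 18]
print(reduce(gcd, return_times))  # 3, so d(0) = 3 and state 0 is periodic
```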
You can show that all states in the same communicating class have the same period. A class is said to be periodic if its states are periodic. Similarly, a class is said to be aperiodic if its states are aperiodic. Finally, a Markov chain is said to be aperiodic if all of its states are aperiodic.
If i↔j, then d(i) = d(j).
Why is periodicity important? As we will see shortly, it plays a role when we discuss limiting distributions. It turns out that in a typical problem, we are given an irreducible Markov chain, and we need to check if it is aperiodic.
How do we check that a Markov chain is aperiodic? Here is a useful method. Remember that two numbers m and l are said to be co-prime if their greatest common divisor (gcd) is 1, i.e., gcd(l, m) = 1. Now, suppose that we can find two co-prime numbers l and m such that p^(l)_ii > 0 and p^(m)_ii > 0. That is, we can go from state i to itself in l steps, and also in m steps. Then, we can conclude that state i is aperiodic. If we have an irreducible Markov chain, this means that the chain is aperiodic. Since the number 1 is co-prime to every integer, any state with a self-transition is aperiodic.
Consider a finite irreducible Markov chain Xn:
- If there is a self-transition in the chain (pii > 0 for some state i), then the chain is aperiodic.
- If we can go from some state i to itself in l steps and also in m steps, where gcd(l, m) = 1, then the chain is aperiodic.
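Here is a small sketch of that check on a hypothetical two-state chain with a self-transition at state 0: it collects a few return lengths and looks for a co-prime pair, which certifies aperiodicity for an irreducible chain.

```python
from itertools import combinations
from math import gcd

import numpy as np

# Hypothetical irreducible chain with a self-transition at state 0.
P = np.array([[0.5, 0.5],
              [1.0, 0.0]])

return_times = [n for n in range(1, 10)
                if np.linalg.matrix_power(P, n)[0, 0] > 0]

# Two co-prime return lengths are enough to conclude aperiodicity.
aperiodic = any(gcd(l, m) == 1 for l, m in combinations(return_times, 2))
print(return_times)  # [1, 2, 3, ...]; the self-loop already gives length 1
print(aperiodic)     # True: state 0, hence the whole chain, is aperiodic
```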
Example 11.8
Consider the Markov chain in Example 11.6.
Solution