The transition matrix for [link] is given below.
Write the transition matrix from a) Monday to Thursday, b) Monday to Friday.
In writing a transition matrix from Monday to Thursday, we are moving from one state to another in three steps. That is, we need to compute the third power of the transition matrix.
b) To find the transition matrix from Monday to Friday, we are moving from one state to another in four steps. Therefore, we compute the fourth power of the transition matrix.
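These powers are easy to compute numerically. The sketch below uses NumPy with an assumed two-state (walk, bicycle) transition matrix, since the actual entries appear only in the matrix referenced above; substitute the real values from [link].

```python
import numpy as np

# Assumed illustrative transition matrix; the actual entries come from
# the matrix referenced in the text. States are ordered (walk, bicycle),
# and T[i, j] is the probability of moving from state i to state j in one day.
T = np.array([[0.50, 0.50],
              [0.25, 0.75]])

T3 = np.linalg.matrix_power(T, 3)  # Monday to Thursday: three steps
T4 = np.linalg.matrix_power(T, 4)  # Monday to Friday: four steps

print(T3)
print(T4)
```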
It is important that the student is able to interpret the above matrix correctly. For example, the entry in the row for walking and the column for bicycling gives the probability that, if Professor Symons walked to school on Monday, he will bicycle to school on Friday.
There are certain Markov chains that tend to stabilize in the long run, and they are the subject of [link] . It so happens that the transition matrix we have used in all the above examples is just such a Markov chain. The next example deals with the long term trend or steady-state situation for that matrix.
Suppose Professor Symons continues to walk and bicycle according to the transition matrix given in [link] . In the long run, how often will he walk to school, and how often will he bicycle?
As mentioned earlier, as we take higher and higher powers of our transition matrix, it should stabilize.
Therefore, in the long run, Professor Symons will walk to school a fixed fraction of the time and bicycle the rest, with the fractions given by the entries of the common row of the stabilized matrix.
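This stabilization can be observed numerically by raising the matrix to a high power; again an assumed illustrative matrix stands in for the one referenced in the text.

```python
import numpy as np

# Assumed illustrative two-state (walk, bicycle) transition matrix;
# the actual entries come from the matrix referenced in the text.
T = np.array([[0.50, 0.50],
              [0.25, 0.75]])

# For a regular Markov chain, high powers of T converge to a matrix
# whose rows are all equal to the fixed probability vector.
Tn = np.linalg.matrix_power(T, 50)
print(Tn)  # the two rows are (numerically) identical
```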
When this happens, we say that the system is in steady-state, or a state of equilibrium. In this situation, all row vectors are equal. If the original transition matrix is an n by n matrix, we get n row vectors that are all the same. We call this common vector a fixed probability vector or the equilibrium vector. In the above problem, the fixed probability vector is the common row of the stabilized matrix. Furthermore, if the equilibrium vector E is multiplied by the original transition matrix T, the result is the equilibrium vector E again. That is,

ET = E
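The defining property of the equilibrium vector can be verified directly: multiplying it by the transition matrix returns the same vector. The sketch below uses an assumed illustrative matrix in place of the one referenced in the text.

```python
import numpy as np

# Assumed illustrative transition matrix (states: walk, bicycle);
# the actual entries come from the matrix referenced in the text.
T = np.array([[0.50, 0.50],
              [0.25, 0.75]])

# The equilibrium vector E is the common row of a high power of T.
E = np.linalg.matrix_power(T, 50)[0]

# Multiplying E by T gives E back: E T = E.
print(E)
print(E @ T)
```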
At the end of [link], we took the transition matrix and started taking higher and higher powers of it. The matrix started to stabilize, and finally it reached its steady-state, or state of equilibrium. When that happened, all the row vectors became the same, and we called one such row vector a fixed probability vector or an equilibrium vector. Furthermore, we discovered that ET = E, where E is the equilibrium vector and T is the transition matrix.
In this section, we wish to answer the following four questions.
Does every Markov chain reach the state of equilibrium?
Answer: A Markov chain reaches a state of equilibrium if it is a regular Markov chain. A Markov chain is said to be a regular Markov chain if some power of its transition matrix has only positive entries.
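This definition can be tested mechanically: keep multiplying the transition matrix by itself and check whether some power has only positive entries. The helper below, with its illustrative example matrix, is an assumption of this rewrite, not part of the original text.

```python
import numpy as np

def is_regular(T, max_power=20):
    """Return True if some power of T, up to max_power, has all
    strictly positive entries, i.e. the chain is regular."""
    T = np.array(T, dtype=float)
    P = T.copy()
    for _ in range(max_power):
        if np.all(P > 0):
            return True
        P = P @ T
    return False

# Illustrative example: this matrix has a zero entry, but its square
# has only positive entries, so the chain is regular.
T = np.array([[0.0, 1.0],
              [0.5, 0.5]])
print(is_regular(T))  # True
```

By contrast, the identity matrix is not regular: every power of it keeps the same zero entries, so its chain never reaches equilibrium from an arbitrary start.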