MTH6141 Random Processes, Spring 2012, Exercise Sheet 4
Please drop your answer to Question 3 only in the red box on the second floor
of the maths building by 13:00 on Tuesday 14th February 2012.
You are strongly encouraged to attempt at least Q1 and Q4 in addition to the
assessed question.
Please send comments and corrections to [email protected].
1. Look back to your answer to Question 1 on Sheet 3. For which of the five
parts do we have that the expected proportion of time spent in a particular
state up to time t tends to a limit independent of the initial distribution? In
each case say what the expected proportion of time spent in state 1 is in the
long run, and whether it depends on the initial distribution.
2. Show that the "intercommunicates" relation on the states of a Markov chain
is an equivalence relation on S.
3. Consider the Markov chain on state space $\{1, 2, 3, 4, 5, 6, 7\}$ with transition
matrix
$$
P = \begin{pmatrix}
\frac{1}{3} & \frac{1}{2} & 0 & \frac{1}{6} & 0 & 0 & 0 \\
\frac{4}{5} & 0 & 0 & 0 & \frac{1}{5} & 0 & 0 \\
0 & 0 & \frac{1}{2} & 0 & \frac{1}{2} & 0 & 0 \\
0 & 0 & \frac{1}{4} & \frac{1}{4} & 0 & \frac{1}{4} & \frac{1}{4} \\
0 & 0 & 0 & \frac{1}{3} & \frac{2}{3} & 0 & 0 \\
0 & 0 & 0 & 0 & 0 & 0 & 1 \\
0 & 0 & 0 & 0 & 0 & \frac{1}{4} & \frac{3}{4}
\end{pmatrix}
$$
(a) Determine the communicating classes of states of the chain.
(b) Calculate the first return probabilities $f_{11}^{(t)}$ and the return probability
$f_{11}$. Is state 1 transient or recurrent? If it is transient, calculate the
expected number of returns to 1.
(c) Calculate the first return probabilities $f_{66}^{(t)}$ and the return probability
$f_{66}$. Is state 6 transient or recurrent? If it is transient, calculate the
expected number of returns to 6.
(d) Without computing $f_{33}$ exactly, provide a simple argument that it is
strictly less than 1, and hence that state 3 is transient.
(e) Using the above information, and the solution to the 2-state chain from
the notes, make an educated guess at the (unique) equilibrium distribution
of the chain. Verify your guess.
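
If you want to check a candidate equilibrium distribution numerically, the
following sketch may help (plain Python, no external libraries; the function
name is_equilibrium and the parameter values a, b in the illustration are my
own choices, not taken from the notes). It tests whether a vector pi satisfies
pi P = pi and sums to 1; to verify your guess for part (e), substitute the
7x7 matrix above and your candidate distribution.

    # Minimal check that a candidate distribution pi is an equilibrium of P,
    # i.e. that pi P = pi and the entries of pi sum to 1.
    def is_equilibrium(pi, P, tol=1e-9):
        n = len(P)
        if abs(sum(pi) - 1.0) > tol:
            return False
        for j in range(n):
            column = sum(pi[i] * P[i][j] for i in range(n))
            if abs(column - pi[j]) > tol:
                return False
        return True

    # Illustration on a 2-state chain with transition matrix [[1-a, a], [b, 1-b]]
    # (my parametrisation, which may differ from the notes); its equilibrium
    # distribution is (b/(a+b), a/(a+b)).
    a, b = 0.5, 0.25
    P2 = [[1 - a, a], [b, 1 - b]]
    pi2 = [b / (a + b), a / (a + b)]
    print(is_equilibrium(pi2, P2))   # True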
4. Consider the Markov chain on state space $\{0, 1, 2, \ldots\}$ with transition probabilities:
$$
p_{ij} = \begin{cases}
1 & \text{if } i = 0 \text{ and } j = 1; \\
\frac{1}{3} & \text{if } i \geq 1 \text{ and } j = i + 1; \\
\frac{2}{3} & \text{if } i \geq 1 \text{ and } j = i - 1; \\
0 & \text{otherwise.}
\end{cases}
$$
Does this Markov chain have an equilibrium distribution? If so, what is it?
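
A rough way to sanity-check an answer to Question 4 is to simulate the chain
and look at the empirical proportion of time spent in the low-numbered states;
if an equilibrium distribution exists, these proportions should be close to it
for long runs. The sketch below is such a simulation (the function name and
the number of steps are my own choices), using only the standard library.

    import random

    def simulate(steps=1_000_000, seed=0):
        random.seed(seed)
        counts = {}
        state = 0
        for _ in range(steps):
            counts[state] = counts.get(state, 0) + 1
            if state == 0:
                state = 1                  # p_{0,1} = 1
            elif random.random() < 1 / 3:
                state += 1                 # p_{i,i+1} = 1/3 for i >= 1
            else:
                state -= 1                 # p_{i,i-1} = 2/3 for i >= 1
        return {s: c / steps for s, c in sorted(counts.items())}

    proportions = simulate()
    for s in range(5):
        print(s, round(proportions.get(s, 0.0), 4))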
5. If $a$ is a state of a Markov chain, we say that $a$ is 2-periodic if $p_{aa}^{(k)} = 0$ for
all odd $k$.
(a) Show that if i is 2-periodic and j is an element of the same communicating
class as i then j is 2-periodic. (We say that 2-periodicity is a class
property.)
[Hint: Assume that i is 2-periodic but j is not 2-periodic and deduce a
contradiction.]
(b) Can you find a Markov chain that contains both a loop (that is, a state s
with $p_{ss} > 0$) and a 2-periodic state?
(c) Can you find an irreducible Markov chain that contains both a loop and
a 2-periodic state?
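
For experimenting with parts (b) and (c), the sketch below (helper names are
mine) checks whether a state a looks 2-periodic by inspecting the diagonal
entry of $P^k$ for odd k up to a finite bound. Since only finitely many k are
inspected, a "True" answer is evidence rather than a proof.

    def matmul(A, B):
        n = len(A)
        return [[sum(A[i][k] * B[k][j] for k in range(n)) for j in range(n)]
                for i in range(n)]

    def looks_2_periodic(P, a, max_k=50):
        """True if (P^k)_{aa} == 0 for every odd k <= max_k."""
        power = P
        for k in range(1, max_k + 1):
            if k % 2 == 1 and power[a][a] > 0:
                return False
            power = matmul(power, P)
        return True

    # Example: the chain that flips deterministically between two states,
    # so it can return to state 0 only at even times.
    P_flip = [[0.0, 1.0], [1.0, 0.0]]
    print(looks_2_periodic(P_flip, 0))   # True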
6. [This is a harder question. Actually, I don't know the exact answer!] Consider
a regular Markov chain on a finite state space $S$. Let
$m = \min\{t : p_{ij}^{(t)} > 0 \text{ for all } i, j \in S\}$. We know from the definition of regular that $m$ is
well defined. How large may $m$ be as a function of $n = |S|$?
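
One way to build intuition for Question 6 is to compute m directly for small
examples. The sketch below (the helper name and the example chain are my own)
multiplies out powers of P and reports the smallest t for which every entry of
$P^t$ is strictly positive; exploring how large this can be for a given n would
mean running it over many n-state chains.

    def smallest_all_positive_power(P, max_t=10_000):
        n = len(P)
        power = [row[:] for row in P]
        for t in range(1, max_t + 1):
            if all(power[i][j] > 0 for i in range(n) for j in range(n)):
                return t
            # multiply the current power by P
            power = [[sum(power[i][k] * P[k][j] for k in range(n))
                      for j in range(n)] for i in range(n)]
        return None   # not regular, or m exceeds max_t

    # Example: a 3-state cycle with a self-loop at one state; the chain is
    # regular, but several steps are needed before all entries are positive.
    P = [[0.0, 1.0, 0.0],
         [0.0, 0.0, 1.0],
         [0.1, 0.0, 0.9]]
    print(smallest_all_positive_power(P))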