MA 2213 - Tutorial 3

This document contains 11 problems related to Markov processes. The problems involve analyzing transition probability matrices, determining whether processes are Markov chains, calculating multi-step transition probabilities, and finding steady state distributions. A variety of Markov processes are considered, including models of computer systems, weather patterns, machines with working and non-working states, and processes involving balls in an urn.

Uploaded by Manas Ambati

MA 2213: Problem Sheet 3: Markov Process

1. A small computer lab has 2 terminals. The number of students working in this lab is
recorded at the end of every hour. A lab assistant notices the following pattern:
(i) If there are 0 or 1 students in the lab, then one hour later the number of
students is equally likely to have increased by 1 or remained unchanged.
(ii) If there are 2 students in the lab, then one hour later the number of
students is equally likely to have decreased by 1 or remained unchanged.
(a) Write the transition probability matrix for this Markov chain.
(b) Is this a regular Markov chain? Justify your answer.
(c) Suppose there is nobody in the lab at 7 am. What is the probability of nobody
working in the lab at 10 am?
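Parts (a) and (c) can be sketched numerically; the matrix below is one reading of the pattern, assuming states {0, 1, 2} students and hourly steps:

```python
import numpy as np

# States: 0, 1, or 2 students in the lab.
# From 0 or 1: stay or go up by one, each with probability 0.5.
# From 2: stay or go down by one, each with probability 0.5.
P = np.array([[0.5, 0.5, 0.0],
              [0.0, 0.5, 0.5],
              [0.0, 0.5, 0.5]])

# 7 am -> 10 am is three one-hour steps.
P3 = np.linalg.matrix_power(P, 3)
p_empty = P3[0, 0]  # probability the lab is empty at 10 am given empty at 7 am
```

Under this reading, p_empty works out to 0.125.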

2. A computer system can operate in two different modes. Every hour, it remains in
the same mode or switches to a different mode according to the transition
probability matrix

       P = [ 0.4  0.6 ]
           [ 0.6  0.4 ]
(a) Compute the 2-step transition probability matrix.
(b) If the system is in Mode I at 5:30 pm, what is the probability that it will be in
Mode I at 8:30 pm on the same day?
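A quick numerical sketch of both parts, taking the matrix directly from the problem statement (5:30 pm to 8:30 pm is three hourly steps):

```python
import numpy as np

# Two modes; matrix read directly from the problem statement.
P = np.array([[0.4, 0.6],
              [0.6, 0.4]])

P2 = np.linalg.matrix_power(P, 2)  # part (a): 2-step transition matrix
P3 = np.linalg.matrix_power(P, 3)  # 5:30 pm -> 8:30 pm is three steps
p_mode1 = P3[0, 0]                 # part (b), starting from Mode I
```

This gives P2 = [[0.52, 0.48], [0.48, 0.52]] and p_mode1 = 0.496.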

3. The pattern of sunny and rainy days on planet Rainbow is a homogeneous Markov chain
with two states. Every sunny day is followed by another sunny day with probability
0.8. Every rainy day is followed by another rainy day with probability 0.6. Compute
the probability that April 1 next year is rainy on Rainbow.
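Since "April 1 next year" is far in the future, the answer is the steady-state probability of rain; a sketch using a high matrix power, under the natural two-state encoding:

```python
import numpy as np

# States: 0 = sunny, 1 = rainy.
P = np.array([[0.8, 0.2],
              [0.4, 0.6]])

# A date far in the future: both rows of P^n converge to the steady state.
pi = np.linalg.matrix_power(P, 200)[0]
p_rainy = pi[1]
```

This converges to p_rainy = 1/3, independent of the weather today.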

4. A computer device can be either in a busy mode (state 1), processing a task, or in
an idle mode (state 2), when there are no tasks to process. In busy mode, it can
finish the task and enter idle mode in any given minute with probability 0.2. In
idle mode, it receives a new task in any given minute with probability 0.1 and
enters busy mode. The initial state X0 is idle. Let Xn be the state of the device
after n minutes.
(a) Find the distribution of X2 .
(b) Find the steady-state distribution of Xn .
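Both parts can be checked numerically; the encoding below (0 = busy, 1 = idle) is an assumption for the sketch:

```python
import numpy as np

# States: 0 = busy, 1 = idle (the problem's states 1 and 2).
P = np.array([[0.8, 0.2],    # busy: finishes a task with probability 0.2
              [0.1, 0.9]])   # idle: receives a task with probability 0.1

x0 = np.array([0.0, 1.0])                 # X0 is idle
x2 = x0 @ np.linalg.matrix_power(P, 2)    # part (a): distribution of X2

# Part (b): for a two-state chain with switch probabilities alpha = 0.2 and
# beta = 0.1, the steady state is (beta, alpha) / (alpha + beta).
pi = np.array([0.1, 0.2]) / 0.3
```

This gives X2 ~ (0.17, 0.83) and steady state (1/3, 2/3).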
5. A Markov chain has the transition probability matrix

       P = [ 0.3   ·    0  ]
           [  0    0    ·  ]
           [  1    ·    ·  ]
(a) Fill in the blanks.
(b) Show that this is a regular Markov chain.
(c) Compute the steady-state probabilities.
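A numerical sketch for parts (b) and (c), assuming the blank entries are the ones forced by each row summing to 1 (0.7, 1, and 0, 0):

```python
import numpy as np

# Assumed completion: each row of a stochastic matrix must sum to 1.
P = np.array([[0.3, 0.7, 0.0],
              [0.0, 0.0, 1.0],
              [1.0, 0.0, 0.0]])

# Regular: some power of P has all entries strictly positive.
k = next(n for n in range(1, 20)
         if (np.linalg.matrix_power(P, n) > 0).all())

# Steady state via a high power: every row of P^n converges to pi.
pi = np.linalg.matrix_power(P, 500)[0]
```

Under this completion, P^4 is the first all-positive power, and the steady state is (5/12, 7/24, 7/24).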

6. A Markov chain has 3 possible states: A, B, and C. Every hour, it makes a transition
to a different state. From state A, transitions to states B and C are equally likely.
From state B, transitions to states A and C are equally likely. From state C, it always
makes a transition to state A. Find the steady-state distribution of states.
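The description translates directly into a transition matrix; a sketch that finds the steady state by iterating the chain:

```python
import numpy as np

# States A, B, C in that order; rows taken from the problem statement.
P = np.array([[0.0, 0.5, 0.5],   # from A: B or C, equally likely
              [0.5, 0.0, 0.5],   # from B: A or C, equally likely
              [1.0, 0.0, 0.0]])  # from C: always to A

pi = np.linalg.matrix_power(P, 200)[0]
```

This converges to (4/9, 2/9, 1/3) for states A, B, C.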

7. An urn initially contains five black balls and five white balls. The following experiment
is repeated indefinitely: A ball is drawn from the urn; if the ball is white, it is put back
in the urn, otherwise it is left out. Let Xn be the number of black balls remaining in
the urn after n draws from the urn. Assume n ≥ 5.
(a) Is Xn a Markov process? If so, find the appropriate transition probabilities.
(b) Do the transition probabilities depend on n?
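One way to see the transition probabilities for part (a): white balls are always returned, so when i black balls remain the urn holds i + 5 balls. A sketch of the resulting kernel:

```python
from fractions import Fraction

# If i black balls remain, the urn holds i + 5 balls (5 white, always returned).
def transition(i, j):
    if j == i - 1:
        return Fraction(i, i + 5)   # a black ball is drawn and left out
    if j == i:
        return Fraction(5, i + 5)   # a white ball is drawn and put back
    return Fraction(0)

# The probabilities depend only on the current state i, not on the draw
# number n, which answers part (b).
```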

8. Let Xn be an iid Bernoulli process, and let Yn be given by Yn = Xn + Xn−1 . Consider
the vector process defined by Zn = (Xn , Xn−1 ).
(a) Show that Yn is not a Markov process.
(b) Show that Zn is a Markov process.
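For part (a), a counterexample by exact enumeration, assuming Bernoulli(1/2) steps: two pasts with the same Y_{n-1} but different earlier values give different next-step probabilities.

```python
from itertools import product
from fractions import Fraction

# P(Y3 = 2 | Y2 = 1, Y1 = y1_value), computed exactly over X0..X3 with p = 1/2.
def cond_prob(y1_value):
    hits = total = 0
    for x in product([0, 1], repeat=4):
        y1, y2, y3 = x[0] + x[1], x[1] + x[2], x[2] + x[3]
        if y2 == 1 and y1 == y1_value:
            total += 1
            hits += (y3 == 2)
    return Fraction(hits, total)

p_after_0 = cond_prob(0)   # earlier value Y1 = 0
p_after_2 = cond_prob(2)   # earlier value Y1 = 2
# The two probabilities differ, so Y_n is not Markov.
```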

9. Show that the following autoregressive process is a Markov process:

Yn = rYn−1 + Xn , Y0 = 0,

where Xn is an iid process. Find the transition pdf if Xn is an iid Gaussian sequence.
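Given Y_{n-1} = y, the next value Y_n = r·y + X_n depends only on y and the fresh noise, so the transition pdf is the density of X_n shifted to mean r·y. A sketch of the Gaussian case, assuming X_n ~ N(0, σ²):

```python
import math

# Transition pdf f(y | y_prev), assuming X_n ~ N(0, sigma^2):
# Y_n = r*y_prev + X_n is Gaussian with mean r*y_prev and variance sigma^2.
def transition_pdf(y, y_prev, r, sigma):
    mean = r * y_prev
    coef = 1.0 / (sigma * math.sqrt(2 * math.pi))
    return coef * math.exp(-(y - mean) ** 2 / (2 * sigma ** 2))
```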

10. A certain part of a machine can be in two states: working or undergoing repair. A
working part fails during the course of a day with probability α. A part undergoing
repair is put into working order during the course of a day with probability β. Let Xn
be the state of the part.
(a) Show that Xn is a two-state Markov chain and give its one-step transition proba-
bility matrix P .
(b) Find the n-step transition probability matrix P n .
(c) Find the steady-state probability for each of the two states.
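The standard two-state closed form for P^n can be checked numerically; the values of α and β below are illustrative, not from the problem:

```python
import numpy as np

alpha, beta = 0.3, 0.5            # illustrative failure/repair probabilities
P = np.array([[1 - alpha, alpha],
              [beta, 1 - beta]])  # states: 0 = working, 1 = under repair

# Standard closed form for part (b), with s = 1 - alpha - beta:
def P_n(n):
    s = 1 - alpha - beta
    limit = np.array([[beta, alpha], [beta, alpha]]) / (alpha + beta)
    decay = np.array([[alpha, -alpha], [-beta, beta]]) / (alpha + beta)
    return limit + (s ** n) * decay

# Part (c): pi = (beta, alpha) / (alpha + beta).
pi = np.array([beta, alpha]) / (alpha + beta)
```

The decay term vanishes geometrically, so P^n tends to the rank-one limit whose rows are the steady state.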

11. A machine consists of two parts that fail and are repaired independently. A working
part fails during any given day with probability a. A part that is not working is
repaired by the next day with probability b. Let Xn be the number of working parts
on day n.
(a) Show that Xn is a three-state Markov chain and give its one-step transition prob-
ability matrix P .
(b) Show that the steady state pmf π is binomial with parameter p = b/(a + b).
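Part (b) can be verified numerically for sample values of a and b (illustrative, not from the problem): build P from the independent per-part dynamics and compare the steady state to the binomial pmf.

```python
import numpy as np
from math import comb

a, b = 0.2, 0.3   # illustrative failure/repair probabilities
p = b / (a + b)

def binom_pmf(k, n, q):
    return comb(n, k) * q**k * (1 - q)**(n - k)

# States: number of working parts (0, 1, or 2). From i working parts,
# f of them fail (Binomial(i, a)) and r of the 2 - i broken ones are
# repaired (Binomial(2 - i, b)), independently; next state is i - f + r.
P = np.zeros((3, 3))
for i in range(3):
    for f in range(i + 1):
        for r in range(2 - i + 1):
            P[i, i - f + r] += binom_pmf(f, i, a) * binom_pmf(r, 2 - i, b)

pi = np.linalg.matrix_power(P, 500)[0]
target = np.array([binom_pmf(k, 2, p) for k in range(3)])
```

Each part independently settles to working probability b/(a + b), and the parts are independent, which is why the count of working parts is binomial in steady state.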
