
Applied probability, Lent 2025. [email protected]

Example Sheet 2

1. For any $r \times r$ matrix $A$, show that the exponential $e^A = \sum_{k \ge 0} A^k/k!$ is a well-defined $r \times r$ matrix (the series converges entrywise), and that if $A_1$ and $A_2$ commute, then
$$e^{A_1 + A_2} = e^{A_1} \cdot e^{A_2}.$$
Calculate $P(t) = e^{tQ}$ where
$$Q = \begin{pmatrix} -2 & 1 & 1 \\ 1 & -1 & 0 \\ 2 & 1 & -3 \end{pmatrix}.$$
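As a numeric sanity check on the closed-form calculation, the sketch below (plain NumPy, truncated power series) verifies that each $P(t)$ is a stochastic matrix and checks the semigroup identity $e^{(s+t)Q} = e^{sQ} e^{tQ}$, a special case of the commuting statement since $sQ$ and $tQ$ commute:

```python
import numpy as np

def matrix_exp(A, terms=60):
    """Approximate e^A by truncating the power series sum_k A^k / k!."""
    E = np.eye(A.shape[0])
    term = np.eye(A.shape[0])
    for k in range(1, terms):
        term = term @ A / k
        E = E + term
    return E

# The Q-matrix from the problem.
Q = np.array([[-2.0, 1.0, 1.0],
              [1.0, -1.0, 0.0],
              [2.0, 1.0, -3.0]])

P1 = matrix_exp(1.0 * Q)       # P(1) = e^Q
P_half = matrix_exp(0.5 * Q)   # P(1/2); P(1/2) P(1/2) should equal P(1)
```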

2. Let $(X_t)_{t \ge 0}$ be a Markov chain on the integers with transition rates $q_{i,i+1} = \lambda q_i$, $q_{i,i-1} = \mu q_i$, and $q_{i,j} = 0$ if $|j - i| \ge 2$, where $\lambda + \mu = 1$ and $q_i > 0$ for all $i$. Write down the $Q$-matrix of this chain and draw the diagram corresponding to this chain. Then
(a) Find the probability, starting from 0, that Xt hits i, for any i ≥ 1.
(b) Compute the expected total time spent in state i, starting from 0.
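For part (a), hitting probabilities depend only on the jump chain, which here moves up with probability $\lambda$ and down with probability $\mu$. A numeric sketch (truncating $\mathbb{Z}$ far below at $-M$, an assumption of this check) can be compared against the candidate answer $(\lambda/\mu)^i$ when $\lambda < \mu$:

```python
import numpy as np

# Solve h(s) = lam*h(s+1) + mu*h(s-1) on states -M, ..., i with
# h(-M) = 0 (truncation) and h(i) = 1 (target level reached).
lam, mu, i, M = 0.3, 0.7, 4, 60
n = M + i + 1                  # states -M, ..., i map to indices 0, ..., n-1
A = np.eye(n)
b = np.zeros(n)
b[-1] = 1.0                    # boundary condition h(i) = 1
for s in range(1, n - 1):      # interior balance equations
    A[s, s + 1] = -lam
    A[s, s - 1] = -mu
h = np.linalg.solve(A, b)
h0 = h[M]                      # hitting probability started from state 0
```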
3. A salesperson flies around between Amsterdam, Berlin, and Cambridge as follows. She stays
in each city for an exponential amount of time with mean 1/2 month if the city is B or C, but
with mean 1/4 month if the city is A. From A she goes to B or C with probability 1/2 each;
from B she goes to A with probability 3/4 and to C with probability 1/4; from C she always
goes back to A. (a) Find the limiting fraction of time she spends in each city. (b) What is
her average number of trips each year from Berlin to Amsterdam?
4. Consider a fleet of N buses. Each bus breaks down independently at rate µ, whereupon it is
sent to the depot for repair. The repair shop can only repair one bus at a time, and each bus
takes an exponential time of parameter λ to repair. Find the equilibrium distribution of the
number of buses in service.
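A sketch of one way to check the answer numerically, assuming the state is the number of buses in service: this is a birth-death chain (births at rate $\lambda$ while some bus is in the depot, deaths at rate $k\mu$ from state $k$), so detailed balance gives the equilibrium, which should be Poisson$(\lambda/\mu)$ truncated to $\{0, \dots, N\}$:

```python
import numpy as np
from math import factorial

N, lam, mu = 10, 2.0, 0.5
rho = lam / mu

# Detailed balance: pi_k * lam = pi_{k+1} * (k+1) * mu.
pi = np.ones(N + 1)
for k in range(N):
    pi[k + 1] = pi[k] * lam / ((k + 1) * mu)
pi /= pi.sum()

# Candidate closed form: Poisson(rho) truncated to {0, ..., N}.
trunc_poisson = np.array([rho**k / factorial(k) for k in range(N + 1)])
trunc_poisson /= trunc_poisson.sum()
```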
5. N frogs are playing near a pond. When they are in the sun they are too hot, so they jump
into the pond at rate 1. When they are in the pond they are too cold, so they jump out at rate
2. Write down the Q-matrix for $X_t$, the number of frogs in the sun at time t. Show that
$\pi_i = \binom{N}{i} p^i (1 - p)^{N-i}$ is an invariant distribution for this chain, for some suitable value of
$p \in (0, 1)$. Explain.
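Each frog is its own two-state chain (sun-to-pond at rate 1, pond-to-sun at rate 2), which suggests $p = 2/3$. The sketch below builds the $Q$-matrix for a small $N$ and checks $\pi Q = 0$ for the binomial candidate:

```python
import numpy as np
from math import comb

N, p = 6, 2.0 / 3.0

# From state i (frogs in the sun): each of the i sunning frogs jumps in
# at rate 1, each of the N - i pond frogs jumps out at rate 2.
Q = np.zeros((N + 1, N + 1))
for i in range(N + 1):
    if i > 0:
        Q[i, i - 1] = float(i)
    if i < N:
        Q[i, i + 1] = 2.0 * (N - i)
    Q[i, i] = -Q[i].sum()

pi = np.array([comb(N, i) * p**i * (1 - p)**(N - i) for i in range(N + 1)])
```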
6. Customers arrive at a certain queue in a Poisson process of rate λ. The single ‘server’ has
two states A and B, state A signifying that he is ‘in attendance’ and state B that he is having
a tea-break. Independently of how many customers are in the queue, he fluctuates between
these states as a Markov chain Y on {A, B} with Q-matrix

$$\begin{pmatrix} -\alpha & \alpha \\ \beta & -\beta \end{pmatrix}.$$

The total service time of any customer is exponentially distributed with parameter µ and is
independent of the chain Y and of the service times of other customers.
Show that there exists $\theta \in (0, 1]$ such that, for all $k \ge 0$,
$$\mathbb{P}(X \text{ hits } A_0 \mid X_0 = A_k) = \theta^k,$$
where $A_k$ denotes the state of the chain in which there are $k$ customers and the server is in
state $A$.
Show that $(\theta - 1) f(\theta) = 0$, where
$$f(\theta) = \lambda^2 \theta^2 - \lambda(\lambda + \mu + \alpha + \beta)\theta + (\lambda + \beta)\mu.$$

By considering f (1) or otherwise, prove that X is transient if µβ < λ(α + β), and explain
why this is intuitively obvious.
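A numeric sketch of the dichotomy, assuming $\theta$ is the smallest root of $(\theta - 1) f(\theta) = 0$ in $(0, 1]$ (the minimal-solution characterisation of hitting probabilities): when $\mu\beta < \lambda(\alpha + \beta)$, $f$ should have a root strictly inside $(0, 1)$, and otherwise $\theta = 1$:

```python
import numpy as np

def theta(lam, mu, alpha, beta):
    """Smallest root of (theta - 1) f(theta) = 0 in (0, 1]."""
    coeffs = [lam**2, -lam * (lam + mu + alpha + beta), (lam + beta) * mu]
    roots = np.roots(coeffs)
    candidates = [r.real for r in roots
                  if abs(r.imag) < 1e-12 and 0 < r.real <= 1]
    return min(candidates) if candidates else 1.0  # theta = 1 if f has no such root

# Transient example: mu*beta = 1 < 2 = lam*(alpha + beta)  =>  theta < 1.
t1 = theta(lam=1.0, mu=1.0, alpha=1.0, beta=1.0)
# Stable example: mu*beta = 10 > 1 = lam*(alpha + beta)  =>  theta = 1.
t2 = theta(lam=1.0, mu=10.0, alpha=0.0, beta=1.0)
```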
7. Consider the continuous-time Markov chain (Xt )t≥0 on Z with non-zero transition rates

$$q_{i,i-1} = i^2 + 1, \qquad q_{ii} = -2(i^2 + 1), \qquad q_{i,i+1} = i^2 + 1.$$

Is (Xt )t≥0 recurrent? Is (Xt )t≥0 positive recurrent?
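One ingredient worth checking by hand is that $\nu_i = 1/(i^2 + 1)$ is an invariant measure (the jump chain is simple random walk on $\mathbb{Z}$, with uniform invariant measure, and $\nu_i \propto 1/q_i$). A componentwise numeric check:

```python
# Rates of the chain in this problem.
def q(i, j):
    if abs(i - j) == 1:
        return i**2 + 1
    if i == j:
        return -2 * (i**2 + 1)
    return 0

def nu(i):
    return 1.0 / (i**2 + 1)

# (nu Q)_j = nu(j-1) q(j-1, j) + nu(j+1) q(j+1, j) + nu(j) q(j, j),
# which should vanish for every j.
balance = [nu(j - 1) * q(j - 1, j) + nu(j + 1) * q(j + 1, j)
           + nu(j) * q(j, j) for j in range(-20, 21)]
```

Since $\sum_i 1/(i^2+1) < \infty$, the measure normalises, which is the relevant observation for the positive-recurrence part.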

8. Consider the continuous-time Markov chain (Yt )t≥0 on Z with non-zero transition rates

$$q_{i,i-1} = 3^{|i|}, \qquad q_{ii} = -3^{|i|+1}, \qquad q_{i,i+1} = 2 \cdot 3^{|i|}.$$

Show that Y is transient. Show that Y has an invariant distribution. Explain.
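For the invariant distribution, a birth-death chain satisfies detailed balance, so a candidate $\pi$ can be built from the rate ratios and checked numerically on a truncated window (the truncation at $\pm M$ is an artifact of this sketch):

```python
def q(i, j):
    if j == i + 1:
        return 2 * 3**abs(i)
    if j == i - 1:
        return 3**abs(i)
    if j == i:
        return -3 * 3**abs(i)
    return 0

# Build pi from pi_0 = 1 via detailed balance:
# pi_{i+1} = pi_i * q(i, i+1) / q(i+1, i), and symmetrically downwards.
M = 15
pi = {0: 1.0}
for i in range(0, M):
    pi[i + 1] = pi[i] * q(i, i + 1) / q(i + 1, i)
for i in range(0, -M, -1):
    pi[i - 1] = pi[i] * q(i, i - 1) / q(i - 1, i)
total = sum(pi.values())   # bounded as M grows, so pi normalises
```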

9. Calls arrive at a telephone exchange as a Poisson process of rate λ, and the lengths of
calls are independent exponential random variables of parameter µ. Assuming that infinitely
many telephone lines are available, set up a Markov chain model for this process.
Show that for large $t$ the distribution of the number of lines in use at time $t$ is approximately Poisson with mean $\lambda/\mu$.
Show that the expected number of lines in use at time $t$, given that $n$ are in use at time 0,
is $n e^{-\mu t} + \lambda(1 - e^{-\mu t})/\mu$.
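The stated conditional mean should solve $m'(t) = \lambda - \mu\, m(t)$, $m(0) = n$ (calls arrive at rate $\lambda$; each line in use ends at rate $\mu$). A finite-difference check of that ODE, with arbitrary illustrative parameters:

```python
import math

lam, mu, n = 3.0, 0.5, 7

def m(t):
    """Candidate conditional mean from the problem statement."""
    return n * math.exp(-mu * t) + lam * (1.0 - math.exp(-mu * t)) / mu

# Central finite difference: m'(t) - (lam - mu * m(t)) should be ~ 0.
h = 1e-6
residuals = []
for t in [0.1, 1.0, 5.0]:
    deriv = (m(t + h) - m(t - h)) / (2 * h)
    residuals.append(deriv - (lam - mu * m(t)))
```

Note also that $m(t) \to \lambda/\mu$ as $t \to \infty$, matching the Poisson limit's mean.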

10. Let G = (V, E) be a finite weighted graph: so each edge e = (x, y) is endowed with a
weight wxy ≥ 0. We assume that wxy = wyx , so the weights are undirected. Consider the
Simple Random Walk (Xt , t ≥ 0) on G, which is the Markov chain on V with transition rates
$q_{xy} = w_{xy}$ for $x \ne y$.
Pick a vertex $v \in V$ uniformly at random and, given $v$, consider two independent simple
random walks $(X_s, s \ge 0)$ and $(X'_s, s \ge 0)$ started from $v$. Show that $d(X_t, X'_t)$ has the
same distribution as $d(v, X_{2t})$, where $d(x, y)$ denotes the graph distance between $x$ and $y$: the
smallest number of edges separating $x$ from $y$.
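The identity can be verified exactly on a small example: symmetric rates make $P_t$ a symmetric matrix, so averaging over a uniform $v$ turns the product $P_t(v,x)P_t(v,y)$ into $P_{2t}(x,y)$. A sketch on an arbitrary 4-vertex weighted graph:

```python
import numpy as np
from itertools import product

def matrix_exp(A, terms=80):
    """e^A by truncated power series (small matrices only)."""
    E, term = np.eye(len(A)), np.eye(len(A))
    for k in range(1, terms):
        term = term @ A / k
        E = E + term
    return E

# Symmetric weights (chosen arbitrarily); Q has off-diagonal w_xy.
w = np.array([[0.0, 1.0, 2.0, 0.0],
              [1.0, 0.0, 0.0, 0.5],
              [2.0, 0.0, 0.0, 1.5],
              [0.0, 0.5, 1.5, 0.0]])
n = len(w)
Q = w - np.diag(w.sum(axis=1))

# Graph distances by BFS over edges of positive weight.
d = np.full((n, n), n, dtype=int)
for s in range(n):
    d[s, s] = 0
    frontier = [s]
    while frontier:
        nxt = []
        for x in frontier:
            for y in range(n):
                if w[x, y] > 0 and d[s, y] > d[s, x] + 1:
                    d[s, y] = d[s, x] + 1
                    nxt.append(y)
        frontier = nxt

t = 0.7
Pt, P2t = matrix_exp(t * Q), matrix_exp(2 * t * Q)

# Distribution of d(X_t, X'_t) vs distribution of d(v, X_{2t}), v uniform.
dist_pair, dist_single = np.zeros(n + 1), np.zeros(n + 1)
for v, x, y in product(range(n), repeat=3):
    dist_pair[d[x, y]] += Pt[v, x] * Pt[v, y] / n
for v, y in product(range(n), repeat=2):
    dist_single[d[v, y]] += P2t[v, y] / n
```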
11. Let Q be irreducible and let (Xt , t ≥ 0) be the corresponding continuous-time Markov
chain. Fix h > 0 and let Zn = Xnh , n ≥ 0. Show that Z is a discrete-time Markov chain and
give its transition matrix. Show that X is recurrent if and only if Z is recurrent.
Show that, in the non-explosive case, X is positive recurrent if and only if Z is positive
recurrent. Do they have the same invariant distribution?
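For the last question, note that $\pi Q = 0$ implies $\pi e^{hQ} = \pi$ term by term in the series. A numeric sketch, reusing the $Q$-matrix of problem 1 as an arbitrary irreducible example:

```python
import numpy as np

def matrix_exp(A, terms=60):
    E, term = np.eye(len(A)), np.eye(len(A))
    for k in range(1, terms):
        term = term @ A / k
        E = E + term
    return E

Q = np.array([[-2.0, 1.0, 1.0],
              [1.0, -1.0, 0.0],
              [2.0, 1.0, -3.0]])

# Invariant distribution of the continuous-time chain: pi Q = 0, sum = 1.
stack = np.vstack([Q.T, np.ones(3)])
pi = np.linalg.lstsq(stack, np.array([0.0, 0.0, 0.0, 1.0]), rcond=None)[0]

# Transition matrix of the h-skeleton Z_n = X_{nh}.
h = 0.3
Ph = matrix_exp(h * Q)
```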
12. Let $X$ be a non-explosive irreducible Markov chain on $S$ and let $\pi$ be a reversible invariant
distribution. Let $E = \{ f : S \to \mathbb{R} : \sum_x f(x)^2 \pi(x) < \infty \}$ and define an inner product on $E$
by setting
$$(f \mid g)_\pi = \sum_x f(x) g(x) \pi(x).$$

(a) Let $q_t(x, y) = p_t(x, y)/\pi_y$ and let $q_t^x$ be the function defined by $q_t^x(y) = q_t(x, y)$. Show
that $q_t(x, y) = q_t(y, x)$. Show that
$$q_{t+s}(x, y) = (q_t^x \mid q_s^y)_\pi$$
and deduce that $q_t^x \in E$.


(b) Define a quadratic form
$$\mathcal{E}(f, g) = \frac{1}{2} \sum_{x,y} \pi_x q_{xy} (f(y) - f(x))(g(y) - g(x)),$$

whenever this is defined. The quantity $\mathcal{E}(f, g)$ is sometimes called the Dirichlet form of the
Markov chain, and $\mathcal{E}(f, f)$ is called the Dirichlet energy of the function $f$. Show that

$$(Qf \mid g)_\pi = -\mathcal{E}(f, g).$$

(In particular this shows Q is self-adjoint).


(c) Fix $x \in S$ and let $\psi(t) = q_{2t}(x, x)$. Deduce first that $\psi$ is decreasing, and then that it is convex.
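The identity in (b) can be checked numerically on any small reversible chain; the sketch below uses an arbitrary three-state birth-death chain (reversible by construction) and random test functions:

```python
import numpy as np

rng = np.random.default_rng(0)

# Birth rates 0->1, 1->2 and death rates 1->0, 2->1 (arbitrary).
up, down = np.array([1.0, 2.0]), np.array([3.0, 1.0])
Q = np.array([[-up[0],             up[0],             0.0],
              [down[0], -(down[0] + up[1]),         up[1]],
              [    0.0,           down[1],      -down[1]]])

# Reversible invariant distribution via detailed balance.
pi = np.ones(3)
pi[1] = pi[0] * up[0] / down[0]
pi[2] = pi[1] * up[1] / down[1]
pi /= pi.sum()

f, g = rng.standard_normal(3), rng.standard_normal(3)

lhs = np.sum(pi * (Q @ f) * g)   # (Qf | g)_pi
rhs = -0.5 * sum(pi[x] * Q[x, y] * (f[y] - f[x]) * (g[y] - g[x])
                 for x in range(3) for y in range(3) if x != y)  # -E(f, g)
```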
