
Lecture 3: Random Variables

Chapter 2 introduces the concept of random variables, defining them as functions that assign real numbers to sample points in a random experiment. It discusses the properties and definitions of discrete and continuous random variables, including their probability mass functions and density functions, as well as their means and variances. The chapter also covers special distributions such as Bernoulli, binomial, Poisson, uniform, exponential, and normal distributions.


Chapter 2

2.1 INTRODUCTION
In this chapter, the concept of a random variable is introduced. The main purpose of using a
random variable is so that we can define certain probability functions that make it both convenient
and easy to compute the probabilities of various events.

2.2 RANDOM VARIABLES


A. Definitions:
Consider a random experiment with sample space S. A random variable X(ζ) is a single-valued real function that assigns a real number, called the value of X(ζ), to each sample point ζ of S. Often we use the single letter X for this function in place of X(ζ), and use r.v. to denote the random variable.
Note that the terminology used here is traditional. Clearly a random variable is not a variable at all in the usual sense; it is a function.
The sample space S is termed the domain of the r.v. X, and the collection of all numbers [values of X(ζ)] is termed the range of the r.v. X. Thus the range of X is a certain subset of the set of all real numbers (Fig. 2-1).
Note that two or more different sample points might give the same value of X(ζ), but two different numbers in the range cannot be assigned to the same sample point.


Fig. 2-1 Random variable X as a function.

EXAMPLE 2.1 In the experiment of tossing a coin once (Example 1.1), we might define the r.v. X as (Fig. 2-2)
X(H) = 1    X(T) = 0

Note that we could also define another r.v., say Y or Z, with
Y(H) = 0, Y(T) = 1    or    Z(H) = 0, Z(T) = 0
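To see the "function on S" view concretely, here is a minimal Python sketch (illustrative only; the names S, X, Y, Z simply mirror Example 2.1), in which each r.v. is a mapping from sample points to real numbers.

# Sample space of one coin toss and three random variables defined on it,
# each represented as a function (here, a dict) from sample points to reals.
S = ["H", "T"]

X = {"H": 1, "T": 0}   # X(H) = 1, X(T) = 0
Y = {"H": 0, "T": 1}   # Y(H) = 0, Y(T) = 1
Z = {"H": 0, "T": 0}   # Z(H) = 0, Z(T) = 0 (a constant function is still a r.v.)

for name, rv in [("X", X), ("Y", Y), ("Z", Z)]:
    print(name, "maps S into the range", sorted(set(rv[s] for s in S)))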

B. Events Defined by Random Variables:


If X is a r.v. and x is a fixed real number, we can define the event (X = x) as
(X = x) = {ζ: X(ζ) = x}
Similarly, for fixed numbers x, x₁, and x₂, we can define the following events:
(X ≤ x) = {ζ: X(ζ) ≤ x}
(X > x) = {ζ: X(ζ) > x}
(x₁ < X ≤ x₂) = {ζ: x₁ < X(ζ) ≤ x₂}

Fig. 2-2 One random variable associated with coin tossing.

These events have probabilities that are denoted by


P(X = x) = P{ζ: X(ζ) = x}
P(X ≤ x) = P{ζ: X(ζ) ≤ x}
P(X > x) = P{ζ: X(ζ) > x}
P(x₁ < X ≤ x₂) = P{ζ: x₁ < X(ζ) ≤ x₂}
EXAMPLE 2.2 In the experiment of tossing a fair coin three times (Prob. 1.1), the sample space S₃ consists of eight equally likely sample points S₃ = {HHH, ..., TTT}. If X is the r.v. giving the number of heads obtained, find (a) P(X = 2); (b) P(X < 2).
(a) Let A ⊂ S₃ be the event defined by X = 2. Then, from Prob. 1.1, we have
    A = (X = 2) = {ζ: X(ζ) = 2} = {HHT, HTH, THH}
Since the sample points are equally likely, we have
    P(X = 2) = P(A) = 3/8
(b) Let B ⊂ S₃ be the event defined by X < 2. Then
    B = (X < 2) = {ζ: X(ζ) < 2} = {HTT, THT, TTH, TTT}
and P(X < 2) = P(B) = 4/8 = 1/2
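A brute-force numerical check of Example 2.2 in Python (a sketch, not from the text; the helper function X and the list names are mine):

from itertools import product

# Enumerate the 8 equally likely outcomes of three fair coin tosses and
# count heads to get the events (X = 2) and (X < 2) of Example 2.2.
outcomes = ["".join(t) for t in product("HT", repeat=3)]

def X(outcome):
    return outcome.count("H")        # number of heads in the outcome

A = [w for w in outcomes if X(w) == 2]   # event (X = 2)
B = [w for w in outcomes if X(w) < 2]    # event (X < 2)

print("P(X = 2) =", len(A), "/", len(outcomes))   # 3/8
print("P(X < 2) =", len(B), "/", len(outcomes))   # 4/8 = 1/2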

2.3 DISTRIBUTION FUNCTIONS


A. Definition :
The distribution function [or cumulative distribution function (cdf)] of X is the function defined by
    F_X(x) = P(X ≤ x)    −∞ < x < ∞    (2.4)
Most of the information about a random experiment described by the r.v. X is determined by the behavior of F_X(x).

B. Properties of F_X(x):
Several properties of F_X(x) follow directly from its definition (2.4).
1. 0 ≤ F_X(x) ≤ 1    (2.5)
2. F_X(x₁) ≤ F_X(x₂) if x₁ < x₂    (2.6)
3. lim (x→∞) F_X(x) = F_X(∞) = 1    (2.7)
4. lim (x→−∞) F_X(x) = F_X(−∞) = 0    (2.8)
5. lim (x→a⁺) F_X(x) = F_X(a⁺) = F_X(a),    a⁺ = lim (0<ε→0) (a + ε)    (2.9)

Property 1 follows because F_X(x) is a probability. Property 2 shows that F_X(x) is a nondecreasing function (Prob. 2.5). Properties 3 and 4 follow from Eqs. (1.22) and (1.26):
    lim (x→∞) P(X ≤ x) = P(X < ∞) = P(S) = 1
    lim (x→−∞) P(X ≤ x) = P(X ≤ −∞) = P(∅) = 0
Property 5 indicates that F_X(x) is continuous on the right. This is the consequence of the definition (2.4).

Table 2.1

x     (X ≤ x)                                     F_X(x) = P(X ≤ x)
-1    ∅                                           0
0     {TTT}                                       1/8
1     {TTT, TTH, THT, HTT}                        4/8
2     {TTT, TTH, THT, HTT, HHT, HTH, THH}         7/8
3     S                                           1
4     S                                           1

EXAMPLE 2.3 Consider the r.v. X defined in Example 2.2. Find and sketch the cdf F_X(x) of X.
Table 2.1 gives F_X(x) = P(X ≤ x) for x = −1, 0, 1, 2, 3, 4. Since the value of X must be an integer, the value of F_X(x) for noninteger values of x must be the same as the value of F_X(x) for the nearest smaller integer value of x. The cdf F_X(x) is sketched in Fig. 2-3. Note that F_X(x) has jumps at x = 0, 1, 2, 3, and that at each jump the upper value is the correct value for F_X(x).


Fig. 2-3
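The staircase cdf of Example 2.3 can be checked numerically with the short Python sketch below (the pmf is built by enumeration, and the function name F is my own); its output reproduces the entries of Table 2.1.

from fractions import Fraction
from itertools import product

# Build the pmf of X = number of heads in three fair tosses, then evaluate
# the cdf F_X(x) = sum of p_X(x_k) over all x_k <= x (Eq. (2.18)).
outcomes = list(product("HT", repeat=3))
pmf = {}
for w in outcomes:
    k = w.count("H")
    pmf[k] = pmf.get(k, Fraction(0)) + Fraction(1, len(outcomes))

def F(x):
    return sum(p for k, p in pmf.items() if k <= x)

for x in [-1, 0, 0.5, 1, 2, 3, 4]:
    print("F_X(", x, ") =", F(x))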

C. Determination of Probabilities from the Distribution Function:


From definition (2.4), we can compute other probabilities, such as P(a < X ≤ b), P(X > a), and P(X < b) (Prob. 2.6):
    P(a < X ≤ b) = F_X(b) − F_X(a)    (2.10)
    P(X > a) = 1 − F_X(a)    (2.11)
    P(X < b) = F_X(b⁻),    b⁻ = lim (0<ε→0) (b − ε)    (2.12)

2.4 DISCRETE RANDOM VARIABLES AND PROBABILITY MASS FUNCTIONS


A. Definition :
Let X be a r.v. with cdf F_X(x). If F_X(x) changes values only in jumps (at most a countable number of them) and is constant between jumps--that is, F_X(x) is a staircase function (see Fig. 2-3)--then X is called a discrete random variable. Alternatively, X is a discrete r.v. only if its range contains a finite or countably infinite number of points. The r.v. X in Example 2.3 is an example of a discrete r.v.

B. Probability Mass Functions:


Suppose that the jumps in F_X(x) of a discrete r.v. X occur at the points x₁, x₂, ..., where the sequence may be either finite or countably infinite, and we assume x_i < x_j if i < j.
Then
    F_X(x_i) − F_X(x_{i−1}) = P(X ≤ x_i) − P(X ≤ x_{i−1}) = P(X = x_i)    (2.13)
Let
    p_X(x) = P(X = x)    (2.14)
The function p_X(x) is called the probability mass function (pmf) of the discrete r.v. X.

Properties of p_X(x):
1. 0 ≤ p_X(x_k) ≤ 1,    k = 1, 2, ...    (2.15)
2. p_X(x) = 0    if x ≠ x_k (k = 1, 2, ...)    (2.16)
3. Σ_k p_X(x_k) = 1    (2.17)
The cdf F_X(x) of a discrete r.v. X can be obtained by
    F_X(x) = P(X ≤ x) = Σ (x_k ≤ x) p_X(x_k)    (2.18)

2.5 CONTINUOUS RANDOM VARIABLES AND PROBABILITY DENSITY FUNCTIONS


A. Definition:
Let X be a r.v. with cdf F_X(x). If F_X(x) is continuous and also has a derivative dF_X(x)/dx which exists everywhere except at possibly a finite number of points and is piecewise continuous, then X is called a continuous random variable. Alternatively, X is a continuous r.v. only if its range contains an interval (either finite or infinite) of real numbers. Thus, if X is a continuous r.v., then (Prob. 2.18)
    P(X = x) = 0    (2.19)
Note that this is an example of an event with probability 0 that is not necessarily the impossible event ∅.
In most applications, the r.v. is either discrete or continuous. But if the cdf F_X(x) of a r.v. X possesses features of both discrete and continuous r.v.'s, then the r.v. X is called a mixed r.v. (Prob. 2.10).

B. Probability Density Functions:

Let
    f_X(x) = dF_X(x)/dx
The function f_X(x) is called the probability density function (pdf) of the continuous r.v. X.

Properties of f_X(x):
1. f_X(x) ≥ 0    (2.21)
2. ∫ (−∞ to ∞) f_X(x) dx = 1    (2.22)
3. f_X(x) is piecewise continuous.
4. P(a < X ≤ b) = ∫ (a to b) f_X(x) dx
The cdf F_X(x) of a continuous r.v. X can be obtained by
    F_X(x) = P(X ≤ x) = ∫ (−∞ to x) f_X(ξ) dξ    (2.24)
By Eq. (2.19), if X is a continuous r.v., then
    P(a < X ≤ b) = P(a ≤ X ≤ b) = P(a ≤ X < b) = P(a < X < b) = ∫ (a to b) f_X(x) dx
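As a numerical illustration of Eqs. (2.22) and (2.24) (a sketch only; the example pdf f(x) = 2x on 0 < x < 1 and the crude midpoint Riemann sum are my own choices, not the text's):

# The cdf is the running integral of the pdf, and the total area under the
# pdf is 1; both are approximated here by a midpoint Riemann sum.
def f(x):
    return 2.0 * x if 0.0 < x < 1.0 else 0.0     # example pdf

def F(x, n=100000):
    lo, hi = 0.0, max(x, 0.0)                    # f vanishes below 0
    dx = (hi - lo) / n
    return sum(f(lo + (i + 0.5) * dx) for i in range(n)) * dx

print("total area         ~", F(10.0))                # ~ 1
print("F(0.5)             ~", F(0.5))                 # ~ 0.25
print("P(0.25 < X <= 0.5) ~", F(0.5) - F(0.25))       # ~ 0.1875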

2.6 MEAN AND VARIANCE


A. Mean:
The mean (or expected value) of a r.v. X, denoted by μ_X or E(X), is defined by
    μ_X = E(X) = Σ_k x_k p_X(x_k)               X: discrete
    μ_X = E(X) = ∫ (−∞ to ∞) x f_X(x) dx        X: continuous

B. Moment:

The nth moment of a r.v. X is defined by
    E(X^n) = Σ_k x_k^n p_X(x_k)                 X: discrete
    E(X^n) = ∫ (−∞ to ∞) x^n f_X(x) dx          X: continuous

Note that the mean of X is the first moment of X.

C. Variance:
The variance of a r.v. X, denoted by σ_X² or Var(X), is defined by
    σ_X² = Var(X) = E{[X − E(X)]²}    (2.28)
Thus,
    σ_X² = Σ_k (x_k − μ_X)² p_X(x_k)                X: discrete
    σ_X² = ∫ (−∞ to ∞) (x − μ_X)² f_X(x) dx         X: continuous


Note from definition (2.28) that
    Var(X) ≥ 0
The standard deviation of a r.v. X, denoted by σ_X, is the positive square root of Var(X).
Expanding the right-hand side of Eq. (2.28), we can obtain the following relation:
    σ_X² = E(X²) − [E(X)]²
which is a useful formula for determining the variance.
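A quick numerical sketch of these definitions (the pmf of the number of heads in three fair tosses is used purely as an example; nothing here is specific to the text):

# Mean, second moment, and variance of a small discrete r.v., checking the
# shortcut Var(X) = E(X^2) - [E(X)]^2 against the defining formula (2.28).
pmf = {0: 1/8, 1: 3/8, 2: 3/8, 3: 1/8}

mean    = sum(x * p for x, p in pmf.items())               # E(X)
m2      = sum(x**2 * p for x, p in pmf.items())            # E(X^2), the 2nd moment
var_def = sum((x - mean)**2 * p for x, p in pmf.items())   # E[(X - E(X))^2]
var_alt = m2 - mean**2                                     # shortcut formula

print(mean, var_def, var_alt)    # 1.5 0.75 0.75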

2.7 SOME SPECIAL DISTRIBUTIONS


In this section we present some important special distributions.

A. Bernoulli Distribution:
A r.v. X is called a Bernoulli r.v. with parameter p if its pmf is given by
    p_X(k) = P(X = k) = p^k (1 − p)^(1−k)    k = 0, 1
where 0 ≤ p ≤ 1. By Eq. (2.18), the cdf F_X(x) of the Bernoulli r.v. X is given by
    F_X(x) = 0        x < 0
           = 1 − p    0 ≤ x < 1
           = 1        x ≥ 1
Figure 2-4 illustrates a Bernoulli distribution.

Fig. 2-4 Bernoulli distribution.

The mean and variance of the Bernoulli r.v. X are
    μ_X = E(X) = p
    σ_X² = Var(X) = p(1 − p)

A Bernoulli r.v. X is associated with some experiment where an outcome can be classified as
either a "success" or a "failure," and the probability of a success is p and the probability of a failure is
1 - p. Such experiments are often called Bernoulli trials (Prob. 1.61).
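A minimal Bernoulli sketch in Python (p = 0.3 is an arbitrary value chosen only for illustration):

# Bernoulli(p): pmf p^k (1-p)^(1-k) for k = 0, 1, and its mean p and
# variance p(1 - p) computed directly from the pmf.
p = 0.3
pmf = {k: p**k * (1 - p)**(1 - k) for k in (0, 1)}

mean = sum(k * q for k, q in pmf.items())
var  = sum((k - mean)**2 * q for k, q in pmf.items())

print(pmf)          # {0: 0.7, 1: 0.3}
print(mean, var)    # 0.3 and 0.21, i.e. p and p(1 - p)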

B. Binomial Distribution:
A r.v. X is called a binomial r.v. with parameters (n, p) if its pmf is given by
    p_X(k) = P(X = k) = C(n, k) p^k (1 − p)^(n−k)    k = 0, 1, ..., n    (2.36)
where 0 ≤ p ≤ 1 and
    C(n, k) = n!/[k!(n − k)!]
which is known as the binomial coefficient. The corresponding cdf of X is
    F_X(x) = Σ (k=0 to ⌊x⌋) C(n, k) p^k (1 − p)^(n−k)
where ⌊x⌋ is the largest integer that does not exceed x.
Figure 2-5 illustrates the binomial distribution for n = 6 and p = 0.6.


Fig. 2-5 Binomial distribution with n = 6, p = 0.6.

The mean and variance of the binomial r.v. X are (Prob. 2.28)
    μ_X = E(X) = np
    σ_X² = Var(X) = np(1 − p)

A binomial r.v. X is associated with some experiments in which n independent Bernoulli trials are
performed and X represents the number of successes that occur in the n trials. Note that a Bernoulli
r.v. is just a binomial r.v. with parameters (1, p).
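The binomial pmf and its moments can be checked with a few lines of Python (a sketch; n = 6, p = 0.6 match Fig. 2-5, everything else is my own choice):

from math import comb

# Binomial(n, p): pmf C(n, k) p^k (1-p)^(n-k); the mean and variance computed
# from the pmf should equal np and np(1 - p).
n, p = 6, 0.6
pmf = [comb(n, k) * p**k * (1 - p)**(n - k) for k in range(n + 1)]

mean = sum(k * pk for k, pk in enumerate(pmf))
var  = sum((k - mean)**2 * pk for k, pk in enumerate(pmf))

print("sum of pmf:", sum(pmf))                        # 1.0
print("mean:", mean, " np:", n * p)                   # both 3.6
print("var :", var,  " np(1-p):", n * p * (1 - p))    # both 1.44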

C. Poisson Distribution:
A r.v. X is called a Poisson r.v. with parameter λ (> 0) if its pmf is given by
    p_X(k) = P(X = k) = e^(−λ) λ^k/k!    k = 0, 1, ...    (2.40)
The corresponding cdf of X is
    F_X(x) = e^(−λ) Σ (k=0 to ⌊x⌋) λ^k/k!
Figure 2-6 illustrates the Poisson distribution for λ = 3.



Fig. 2-6 Poisson distribution with λ = 3.

The mean and variance of the Poisson r.v. X are (Prob. 2.29)
    μ_X = E(X) = λ
    σ_X² = Var(X) = λ
The Poisson r.v. has a tremendous range of applications in diverse areas because it may be used as an approximation for a binomial r.v. with parameters (n, p) when n is large and p is small enough so that np is of a moderate size (Prob. 2.40); a short numerical sketch of this approximation follows the examples below.
Some examples of Poisson r.v.'s include
1. The number of telephone calls arriving at a switching center during various intervals of time
2. The number of misprints on a page of a book
3. The number of customers entering a bank during various intervals of time
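A small numerical sketch of the Poisson pmf and of the binomial approximation mentioned above (λ = 3 matches Fig. 2-6; n = 1000 and p = λ/n are arbitrary illustrative values):

from math import exp, factorial, comb

# Poisson(lam) pmf e^(-lam) lam^k / k!, compared with a Binomial(n, p) pmf
# having large n, small p, and np = lam.
lam = 3.0
n = 1000
p = lam / n                                   # np = 3

for k in range(6):
    poisson  = exp(-lam) * lam**k / factorial(k)
    binomial = comb(n, k) * p**k * (1 - p)**(n - k)
    print(k, round(poisson, 4), round(binomial, 4))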

D. Uniform Distribution:
A r.v. X is called a uniform r.v. over (a, b) if its pdf is given by
    f_X(x) = 1/(b − a)    a < x < b
           = 0            otherwise
The corresponding cdf of X is
    F_X(x) = 0                    x ≤ a
           = (x − a)/(b − a)      a < x < b
           = 1                    x ≥ b

Figure 2-7 illustrates a uniform distribution.


The mean and variance of the uniform r.v. X are (Prob. 2.31)
    μ_X = E(X) = (a + b)/2
    σ_X² = Var(X) = (b − a)²/12

Fig. 2-7 Uniform distribution over (a, b).

A uniform r.v. X is often used where we have no prior knowledge of the actual pdf and all
continuous values in some range seem equally likely (Prob. 2.69).
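A numerical sketch of the uniform mean and variance (the interval (2, 5) and the Riemann-sum approach are arbitrary illustrative choices):

# Approximate E(X) and Var(X) for a uniform r.v. over (a, b) by integrating
# x f(x) and (x - mean)^2 f(x) numerically, then compare with (a+b)/2 and (b-a)^2/12.
a, b = 2.0, 5.0
f = 1.0 / (b - a)                       # constant pdf on (a, b)
n = 100000
dx = (b - a) / n
xs = [a + (i + 0.5) * dx for i in range(n)]

mean = sum(x * f * dx for x in xs)
var  = sum((x - mean)**2 * f * dx for x in xs)

print(mean, (a + b) / 2)                # ~3.5 vs 3.5
print(var, (b - a)**2 / 12)             # ~0.75 vs 0.75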

E. Exponential Distribution:
A r.v. X is called an exponential r.v. with parameter λ (> 0) if its pdf is given by
    f_X(x) = λ e^(−λx)    x > 0
           = 0            x < 0
which is sketched in Fig. 2-8(a). The corresponding cdf of X is
    F_X(x) = 1 − e^(−λx)    x > 0
           = 0              x < 0
which is sketched in Fig. 2-8(b).

Fig. 2-8 Exponential distribution.

The mean and variance of the exponential r.v. X are (Prob. 2.32)
    μ_X = E(X) = 1/λ
    σ_X² = Var(X) = 1/λ²

The most interesting property of the exponential distribution is its "memoryless" property. By
this we mean that if the lifetime of an item is exponentially distributed, then an item which has been
in use for some hours is as good as a new item with regard to the amount of time remaining until the
item fails. The exponential distribution is the only distribution which possesses this property (Prob.
2.53).
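The memoryless property is easy to verify numerically (a sketch; the values of λ, s, and t are arbitrary):

from math import exp

# Check P(X > s + t | X > s) = P(X > t) for an exponential r.v., using the
# survival function P(X > x) = e^(-lam x).
lam, s, t = 0.5, 2.0, 3.0

def survival(x):
    return exp(-lam * x)

lhs = survival(s + t) / survival(s)     # P(X > s + t | X > s)
rhs = survival(t)                       # P(X > t)

print(lhs, rhs)                         # both equal e^(-1.5) ~ 0.2231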

F. Normal (or Gaussian) Distribution:


A r.v. X is called a normal (or Gaussian) r.v. if its pdf is given by
    f_X(x) = [1/√(2πσ²)] e^(−(x−μ)²/(2σ²))    −∞ < x < ∞    (2.52)
The corresponding cdf of X is
    F_X(x) = [1/√(2πσ²)] ∫ (−∞ to x) e^(−(ξ−μ)²/(2σ²)) dξ    (2.53)
This integral cannot be evaluated in a closed form and must be evaluated numerically. It is convenient to use the function Φ(z), defined as
    Φ(z) = [1/√(2π)] ∫ (−∞ to z) e^(−ξ²/2) dξ    (2.54)
to help us evaluate the value of F_X(x). Then Eq. (2.53) can be written as
    F_X(x) = Φ((x − μ)/σ)
Note that
    Φ(−z) = 1 − Φ(z)
The function Φ(z) is tabulated in Table A (Appendix A). Figure 2-9 illustrates a normal distribution.

Fig. 2-9 Normal distribution.

The mean and variance of the normal r.v. X are (Prob. 2.33)
    μ_X = E(X) = μ
    σ_X² = Var(X) = σ²
We shall use the notation N(μ; σ²) to denote that X is normal with mean μ and variance σ². A normal r.v. Z with zero mean and unit variance--that is, Z = N(0; 1)--is called a standard normal r.v.
Note that the cdf of the standard normal r.v. is given by Eq. (2.54). The normal r.v. is probably the
most important type of continuous r.v. It has played a significant role in the study of random phenomena in nature. Many naturally occurring random phenomena are approximately normal. Another
reason for the importance of the normal r.v. is a remarkable theorem called the central limit theorem.
This theorem states that the sum of a large number of independent r.v.'s, under certain conditions,
can be approximated by a normal r.v. (see Sec. 4.8C).
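In code, Φ(z) can be evaluated through the error function, and F_X(x) = Φ((x − μ)/σ) follows by standardizing (a sketch; the identity Φ(z) = ½[1 + erf(z/√2)] is standard, and μ = 10, σ = 2 are arbitrary):

from math import erf, sqrt

# Standard normal cdf via the error function, and the cdf of N(mu; sigma^2)
# obtained by standardizing: F_X(x) = Phi((x - mu) / sigma).
def phi(z):
    return 0.5 * (1.0 + erf(z / sqrt(2.0)))

mu, sigma = 10.0, 2.0
for x in (8.0, 10.0, 12.0):
    print(x, phi((x - mu) / sigma))      # ~0.1587, 0.5, 0.8413

print(phi(-1.0), 1.0 - phi(1.0))         # equal, illustrating Phi(-z) = 1 - Phi(z)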

2.8 CONDITIONAL DISTRIBUTIONS


In Sec. 1.6 the conditional probability of an event A given event B is defined as
    P(A | B) = P(A ∩ B)/P(B)    P(B) > 0
The conditional cdf F_X(x | B) of a r.v. X given event B is defined by
    F_X(x | B) = P(X ≤ x | B) = P{(X ≤ x) ∩ B}/P(B)
The conditional cdf F_X(x | B) has the same properties as F_X(x). (See Prob. 1.37 and Sec. 2.3.) In particular,
    F_X(−∞ | B) = 0        F_X(∞ | B) = 1    (2.60)
    P(a < X ≤ b | B) = F_X(b | B) − F_X(a | B)    (2.61)
If X is a discrete r.v., then the conditional pmf p_X(x_k | B) is defined by
    p_X(x_k | B) = P(X = x_k | B) = P{(X = x_k) ∩ B}/P(B)
If X is a continuous r.v., then the conditional pdf f_X(x | B) is defined by
    f_X(x | B) = dF_X(x | B)/dx
Solved Problems

RANDOM VARIABLES
2.1. Consider the experiment of throwing a fair die. Let X be the r.v. which assigns 1 if the number
that appears is even and 0 if the number that appears is odd.
(a) What is the range of X ?
(b) Find P(X = 1 ) and P(X = 0).
The sample space S on which X is defined consists of 6 points which are equally likely:
S = {1, 2, 3, 4, 5, 6}
(a) The range of X is R_X = {0, 1}.
(b) (X = 1) = {2, 4, 6}. Thus, P(X = 1) = 3/6 = 1/2. Similarly, (X = 0) = {1, 3, 5}, and P(X = 0) = 1/2.

2.2. Consider the experiment of tossing a coin three times (Prob. 1.1). Let X be the r.v. giving the
number of heads obtained. We assume that the tosses are independent and the probability of a
head is p.
(a) What is the range of X ?
(b) Find the probabilities P(X = 0), P(X = 1), P(X = 2), and P(X = 3).
The sample space S on which X is defined consists of eight sample points (Prob. 1.1):
S = {HHH, HHT, ..., TTT}
(a) The range of X is R_X = {0, 1, 2, 3}.

(b) If P(H) = p, then P(T) = 1 - p. Since the tosses are independent, we have

2.3. An information source generates symbols at random from a four-letter alphabet {a, b, c, d} with probabilities P(a) = 1/2, P(b) = 1/4, and P(c) = P(d) = 1/8. A coding scheme encodes these symbols into binary codes as follows:
    a    0
    b    10
    c    110
    d    111
Let X be the r.v. denoting the length of the code, that is, the number of binary symbols (bits).
(a) What is the range of X?
(b) Assuming that the generations of symbols are independent, find the probabilities P(X = 1), P(X = 2), P(X = 3), and P(X > 3).
(a) The range of X is R_X = {1, 2, 3}.
(b) P(X = 1) = P[{a}] = P(a) = 1/2
    P(X = 2) = P[{b}] = P(b) = 1/4
    P(X = 3) = P[{c, d}] = P(c) + P(d) = 1/4
    P(X > 3) = P(∅) = 0

2.4. Consider the experiment of throwing a dart onto a circular plate with unit radius. Let X be the
r.v. representing the distance of the point where the dart lands from the origin of the plate.
Assume that the dart always lands on the plate and that the dart is equally likely to land
anywhere on the plate.
(a) What is the range of X ?
(b) Find (i) P(X < a) and (ii) P(a < X < b), where a < b ≤ 1.
(a) The range of X is R_X = {x: 0 ≤ x ≤ 1}.
(b) (i) (X < a) denotes that the point is inside the circle of radius a. Since the dart is equally likely to fall anywhere on the plate, we have (Fig. 2-10)
    P(X < a) = (πa²)/(π · 1²) = a²
(ii) (a < X < b) denotes the event that the point is inside the annular ring with inner radius a and outer radius b. Thus, from Fig. 2-10, we have
    P(a < X < b) = (πb² − πa²)/(π · 1²) = b² − a²

DISTRIBUTION FUNCTION
2.5. Verify Eq. (2.6).
Let x₁ < x₂. Then (X ≤ x₁) is a subset of (X ≤ x₂); that is, (X ≤ x₁) ⊂ (X ≤ x₂). Then, by Eq. (1.27), we have
    P(X ≤ x₁) ≤ P(X ≤ x₂)    or    F_X(x₁) ≤ F_X(x₂)

Fig. 2-10

2.6. Verify (a) Eq. (2.10); (b) Eq. (2.11); (c) Eq. (2.12).
(a) Since (X ≤ b) = (X ≤ a) ∪ (a < X ≤ b) and (X ≤ a) ∩ (a < X ≤ b) = ∅, we have
    P(X ≤ b) = P(X ≤ a) + P(a < X ≤ b)
or
    F_X(b) = F_X(a) + P(a < X ≤ b)
Thus,
    P(a < X ≤ b) = F_X(b) − F_X(a)
(b) Since (X ≤ a) ∪ (X > a) = S and (X ≤ a) ∩ (X > a) = ∅, we have
    P(X ≤ a) + P(X > a) = P(S) = 1
Thus,
    P(X > a) = 1 − P(X ≤ a) = 1 − F_X(a)
(c) Now
    P(X < b) = P[lim (0<ε→0) (X ≤ b − ε)] = lim (0<ε→0) P(X ≤ b − ε) = lim (0<ε→0) F_X(b − ε) = F_X(b⁻)

2.7. Show that
(a) P(a ≤ X ≤ b) = P(X = a) + F_X(b) − F_X(a)    (2.64)
(b) P(a < X < b) = F_X(b) − F_X(a) − P(X = b)
(c) P(a ≤ X < b) = P(X = a) + F_X(b) − F_X(a) − P(X = b)
(a) Using Eqs. (1.23) and (2.10), we have
    P(a ≤ X ≤ b) = P[(X = a) ∪ (a < X ≤ b)]
                 = P(X = a) + P(a < X ≤ b)
                 = P(X = a) + F_X(b) − F_X(a)
(b) We have
    P(a < X ≤ b) = P[(a < X < b) ∪ (X = b)]
                 = P(a < X < b) + P(X = b)
Again using Eq. (2.10), we obtain
    P(a < X < b) = P(a < X ≤ b) − P(X = b)
                 = F_X(b) − F_X(a) − P(X = b)
(c) Similarly,
    P(a ≤ X ≤ b) = P[(a ≤ X < b) ∪ (X = b)]
                 = P(a ≤ X < b) + P(X = b)
Using Eq. (2.64), we obtain
    P(a ≤ X < b) = P(a ≤ X ≤ b) − P(X = b)
                 = P(X = a) + F_X(b) − F_X(a) − P(X = b)

2.8. Let X be the r.v. defined in Prob. 2.3.
(a) Sketch the cdf F_X(x) of X and specify the type of X.
(b) Find (i) P(X ≤ 1), (ii) P(1 < X ≤ 2), (iii) P(X > 1), and (iv) P(1 ≤ X ≤ 2).
(a) From the result of Prob. 2.3 and Eq. (2.18), we have
    F_X(x) = 0      x < 1
           = 1/2    1 ≤ x < 2
           = 3/4    2 ≤ x < 3
           = 1      x ≥ 3
which is sketched in Fig. 2-11. The r.v. X is a discrete r.v.
(b) (i) We see that
    P(X ≤ 1) = F_X(1) = 1/2
(ii) By Eq. (2.10),
    P(1 < X ≤ 2) = F_X(2) − F_X(1) = 3/4 − 1/2 = 1/4
(iii) By Eq. (2.11),
    P(X > 1) = 1 − F_X(1) = 1 − 1/2 = 1/2
(iv) By Eq. (2.64),
    P(1 ≤ X ≤ 2) = P(X = 1) + F_X(2) − F_X(1) = 1/2 + 3/4 − 1/2 = 3/4

Fig. 2-11

2.9. Sketch the cdf F_X(x) of the r.v. X defined in Prob. 2.4 and specify the type of X.
From the result of Prob. 2.4, we have
    F_X(x) = P(X ≤ x) = 0     x < 0
                      = x²    0 ≤ x < 1
                      = 1     x ≥ 1
which is sketched in Fig. 2-12. The r.v. X is a continuous r.v.

Fig. 2-12

2.10. Consider the function given by
(a) Sketch F(x) and show that F(x) has the properties of a cdf discussed in Sec. 2.3B.
(b) If X is the r.v. whose cdf is given by F(x), find (i) P(X ≤ 1/4), (ii) P(0 < X ≤ 1/4), (iii) P(X = 0), and (iv) P(0 < X < 1/4).
(c) Specify the type of X.
(a) The function F(x) is sketched in Fig. 2-13. From Fig. 2-13, we see that 0 ≤ F(x) ≤ 1 and F(x) is a nondecreasing function, F(−∞) = 0, F(∞) = 1, F(0) = 1/2, and F(x) is continuous on the right. Thus, F(x) satisfies all the properties [Eqs. (2.5) to (2.9)] required of a cdf.
(b) (i) We have
(ii) By Eq. (2.10),
(iii) By Eq. (2.12),
(iv) By Eq. (2.64),
(c) The r.v. X is a mixed r.v.

Fig. 2-13

2.11. Find the values of constants a and b such that
    F_X(x) = (1 − a e^(−x/b)) u(x)
is a valid cdf, where u(x) denotes the unit step function.
To satisfy property 1 of F_X(x) [0 ≤ F_X(x) ≤ 1], we must have 0 ≤ a ≤ 1 and b > 0. Since b > 0, property 3 of F_X(x) [F_X(∞) = 1] is satisfied. It is seen that property 4 of F_X(x) [F_X(−∞) = 0] is also satisfied. For 0 ≤ a ≤ 1 and b > 0, F(x) is sketched in Fig. 2-14. From Fig. 2-14, we see that F(x) is a nondecreasing function and continuous on the right, and properties 2 and 5 of F_X(x) are satisfied. Hence, we conclude that the given F(x) is a valid cdf if 0 ≤ a ≤ 1 and b > 0. Note that if a = 0, then the r.v. X is a discrete r.v.; if a = 1, then X is a continuous r.v.; and if 0 < a < 1, then X is a mixed r.v.

Fig. 2-14

DISCRETE RANDOM VARIABLES AND PMF'S


2.12. Suppose a discrete r.v. X has the following pmf:
    p_X(1) = 1/2    p_X(2) = 1/4    p_X(3) = 1/4
(a) Find and sketch the cdf F_X(x) of the r.v. X.
(b) Find (i) P(X < 1), (ii) P(1 < X ≤ 3), (iii) P(1 ≤ X ≤ 3).
(a) By Eq. (2.18), we obtain
    F_X(x) = 0      x < 1
           = 1/2    1 ≤ x < 2
           = 3/4    2 ≤ x < 3
           = 1      x ≥ 3
which is sketched in Fig. 2-15.
(b) (i) By Eq. (2.12), we see that
    P(X < 1) = F_X(1⁻) = 0
(ii) By Eq. (2.10),
    P(1 < X ≤ 3) = F_X(3) − F_X(1) = 1 − 1/2 = 1/2
(iii) By Eq. (2.64),
    P(1 ≤ X ≤ 3) = P(X = 1) + F_X(3) − F_X(1) = 1/2 + 1 − 1/2 = 1

Fig. 2-15

2.13. (a) Verify that the function p(x) defined by
    p(x) = (3/4)(1/4)^x    x = 0, 1, 2, ...
         = 0               otherwise
is a pmf of a discrete r.v. X.
(b) Find (i) P(X = 2), (ii) P(X ≤ 2), (iii) P(X ≥ 1).
(a) It is clear that 0 ≤ p(x) ≤ 1 and
    Σ (x=0 to ∞) p(x) = (3/4) Σ (x=0 to ∞) (1/4)^x = (3/4) · 1/(1 − 1/4) = 1
Thus, p(x) satisfies all properties of the pmf [Eqs. (2.15) to (2.17)] of a discrete r.v. X.
(b) (i) By definition (2.14),
    P(X = 2) = p(2) = (3/4)(1/4)² = 3/64
(ii) By Eq. (2.18),
    P(X ≤ 2) = p(0) + p(1) + p(2) = 3/4 + 3/16 + 3/64 = 63/64
(iii) By Eq. (1.25),
    P(X ≥ 1) = 1 − P(X = 0) = 1 − 3/4 = 1/4

2.14. Consider the experiment of tossing an honest coin repeatedly (Prob. 1.35). Let the r.v. X denote
the number of tosses required until the first head appears.
(a) Find and sketch the pmf p_X(x) and the cdf F_X(x) of X.
(b) Find (i) P(1 < X ≤ 4), (ii) P(X > 4).
(a) From the result of Prob. 1.35, the pmf of X is given by
    p_X(x) = (1/2)^x    x = 1, 2, ...
Then by Eq. (2.18),
    F_X(x) = P(X ≤ x) = Σ (k=1 to n) (1/2)^k = 1 − (1/2)^n    n ≤ x < n + 1,  n = 1, 2, ...

These functions are sketched in Fig. 2-16.

(b) (i) By Eq. (2.10),
    P(1 < X ≤ 4) = F_X(4) − F_X(1) = 15/16 − 1/2 = 7/16
(ii) By Eq. (2.11),
    P(X > 4) = 1 − P(X ≤ 4) = 1 − F_X(4) = 1 − 15/16 = 1/16

Fig. 2-16

2.15. Consider a sequence of Bernoulli trials with probability p of success. This sequence is observed until the first success occurs. Let the r.v. X denote the trial number on which this first success occurs. Then the pmf of X is given by
    p_X(x) = P(X = x) = (1 − p)^(x−1) p    x = 1, 2, ...    (2.67)
because there must be x − 1 failures before the first success occurs on trial x. The r.v. X defined by Eq. (2.67) is called a geometric r.v. with parameter p.
(a) Show that p_X(x) given by Eq. (2.67) satisfies Eq. (2.17).
(b) Find the cdf F_X(x) of X.
(a) Recall that for a geometric series, the sum is given by
    Σ (n=0 to ∞) a r^n = a/(1 − r)    |r| < 1    (2.68)
Thus,
    Σ (x=1 to ∞) p_X(x) = Σ (x=1 to ∞) (1 − p)^(x−1) p = p · 1/[1 − (1 − p)] = 1
(b) Using Eq. (2.68), we obtain
    P(X > k) = Σ (x=k+1 to ∞) (1 − p)^(x−1) p = (1 − p)^k
Thus,
    P(X ≤ k) = 1 − P(X > k) = 1 − (1 − p)^k
and
    F_X(x) = P(X ≤ x) = 1 − (1 − p)^x    x = 1, 2, ...
Note that the r.v. X of Prob. 2.14 is the geometric r.v. with p = 1/2.
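A quick numerical confirmation of the geometric cdf derived above (p = 0.25 is an arbitrary illustrative value):

# Summing the geometric pmf (1-p)^(x-1) p for x = 1, ..., k reproduces the
# closed-form cdf 1 - (1-p)^k.
p = 0.25
for k in (1, 2, 5, 10):
    partial_sum = sum((1 - p)**(x - 1) * p for x in range(1, k + 1))
    closed_form = 1 - (1 - p)**k
    print(k, partial_sum, closed_form)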

2.16. Let X be a binomial r.v. with parameters (n, p).


(a) Show that p_X(k) given by Eq. (2.36) satisfies Eq. (2.17).

(b) Find P(X > 1) if n = 6 and p = 0.1.
(a) Recall that the binomial expansion formula is given by
    (a + b)^n = Σ (k=0 to n) C(n, k) a^k b^(n−k)
Thus, by Eq. (2.36),
    Σ (k=0 to n) p_X(k) = Σ (k=0 to n) C(n, k) p^k (1 − p)^(n−k) = [p + (1 − p)]^n = 1
(b) Now
    P(X > 1) = 1 − P(X = 0) − P(X = 1)
             = 1 − (0.9)^6 − 6(0.1)(0.9)^5 = 1 − 0.531441 − 0.354294 ≈ 0.114

2.17. Let X be a Poisson r.v. with parameter λ.
(a) Show that p_X(x) given by Eq. (2.40) satisfies Eq. (2.17).
(b) Find P(X > 2) with λ = 4.
(a) By Eq. (2.40),
    Σ (k=0 to ∞) p_X(k) = e^(−λ) Σ (k=0 to ∞) λ^k/k! = e^(−λ) e^λ = 1
(b) With λ = 4, we have
    P(X = 0) = e^(−4),  P(X = 1) = 4e^(−4),  P(X = 2) = 8e^(−4)
and
    P(X ≤ 2) = P(X = 0) + P(X = 1) + P(X = 2) = 13e^(−4) ≈ 0.238
Thus,
    P(X > 2) = 1 − P(X ≤ 2) ≈ 1 − 0.238 = 0.762

CONTINUOUS RANDOM VARIABLES AND PDF'S


2.18. Verify Eq. (2.19).
From Eqs. (1.27) and (2.10), we have
    P(X = x) ≤ P(x − ε < X ≤ x) = F_X(x) − F_X(x − ε)
for any ε ≥ 0. As F_X(x) is continuous, the right-hand side of the above expression approaches 0 as ε → 0. Thus, P(X = x) = 0.

2.19. The pdf of a continuous r.v. X is given by
    f_X(x) = 1/3    0 < x < 1
           = 2/3    1 < x < 2
           = 0      otherwise
Find the corresponding cdf F_X(x) and sketch f_X(x) and F_X(x).
By Eq. (2.24), the cdf of X is given by
    F_X(x) = 0                                                x < 0
           = ∫ (0 to x) (1/3) dξ = x/3                         0 ≤ x < 1
           = ∫ (0 to 1) (1/3) dξ + ∫ (1 to x) (2/3) dξ = (2/3)x − 1/3    1 ≤ x < 2
           = 1                                                x ≥ 2
The functions f_X(x) and F_X(x) are sketched in Fig. 2-17.

Fig. 2-17

2.20. Let X be a continuous r.v. X with pdf
    f_X(x) = kx    0 < x < 1
           = 0     otherwise
where k is a constant.
(a) Determine the value of k and sketch f_X(x).
(b) Find and sketch the corresponding cdf F_X(x).
(c) Find P(1/4 < X ≤ 2).
(a) By Eq. (2.21), we must have k > 0, and by Eq. (2.22),
    ∫ (−∞ to ∞) f_X(x) dx = ∫ (0 to 1) kx dx = k/2 = 1
Thus, k = 2 and
    f_X(x) = 2x    0 < x < 1
           = 0     otherwise
which is sketched in Fig. 2-18(a).
(b) By Eq. (2.24), the cdf of X is given by
    F_X(x) = 0                        x ≤ 0
           = ∫ (0 to x) 2ξ dξ = x²     0 < x ≤ 1
           = ∫ (0 to 1) 2ξ dξ = 1      1 < x
which is sketched in Fig. 2-18(b).

Fig. 2-18

2.21. Show that the pdf of a normal r.v. X given by Eq. (2.52) satisfies Eq. (2.22).
From Eq. (2.52), with the substitution y = (x − μ)/σ,
    ∫ (−∞ to ∞) f_X(x) dx = [1/√(2πσ²)] ∫ (−∞ to ∞) e^(−(x−μ)²/(2σ²)) dx = [1/√(2π)] ∫ (−∞ to ∞) e^(−y²/2) dy
Let
    I = ∫ (−∞ to ∞) e^(−y²/2) dy
Then
    I² = ∫ (−∞ to ∞) ∫ (−∞ to ∞) e^(−(x² + y²)/2) dx dy
Letting x = r cos θ and y = r sin θ (that is, using polar coordinates), we have
    I² = ∫ (0 to 2π) ∫ (0 to ∞) e^(−r²/2) r dr dθ = 2π
Thus,
    I = √(2π)
and
    ∫ (−∞ to ∞) f_X(x) dx = [1/√(2π)] I = 1

2.22. Consider the function
    f(x) = a e^(−(x − 1/2)²)    −∞ < x < ∞
Find the value of a such that f(x) is a pdf of a continuous r.v. X.

If f(x) is a pdf of a continuous r.v. X, then by Eq. (2.22), we must have
    ∫ (−∞ to ∞) f(x) dx = a ∫ (−∞ to ∞) e^(−(x − 1/2)²) dx = 1
Now by Eq. (2.52), the pdf of N(1/2; 1/2) is (1/√π) e^(−(x − 1/2)²). Thus,
    a ∫ (−∞ to ∞) e^(−(x − 1/2)²) dx = a√π ∫ (−∞ to ∞) (1/√π) e^(−(x − 1/2)²) dx = a√π = 1
from which we obtain a = 1/√π.


2.23. A r.v. X is called a Rayleigh r.v. if its pdf is given by
    f_X(x) = (x/σ²) e^(−x²/(2σ²))    x > 0
           = 0                       x < 0
(a) Determine the corresponding cdf F_X(x).
(b) Sketch f_X(x) and F_X(x) for σ = 1.
(a) By Eq. (2.24), the cdf of X is
    F_X(x) = ∫ (0 to x) (t/σ²) e^(−t²/(2σ²)) dt    x > 0
Let y = t²/(2σ²). Then dy = (1/σ²) t dt, and
    F_X(x) = ∫ (0 to x²/(2σ²)) e^(−y) dy = 1 − e^(−x²/(2σ²))    x > 0
and F_X(x) = 0 for x < 0.
(b) With σ = 1, we have
    f_X(x) = x e^(−x²/2)    x > 0
and
    F_X(x) = 1 − e^(−x²/2)    x > 0
These functions are sketched in Fig. 2-19.
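A numerical cross-check of the Rayleigh result for σ = 1 (a sketch; the midpoint-rule integrator is my own illustrative choice):

from math import exp

# Integrating the Rayleigh pdf x e^(-x^2/2) from 0 to x numerically should
# reproduce the cdf 1 - e^(-x^2/2) found in part (a).
def pdf(x):
    return x * exp(-x * x / 2.0) if x > 0 else 0.0

def cdf_numeric(x, n=20000):
    dx = x / n
    return sum(pdf((i + 0.5) * dx) for i in range(n)) * dx

for x in (0.5, 1.0, 2.0):
    print(x, cdf_numeric(x), 1 - exp(-x * x / 2.0))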

2.24. A r.v. X is called a gamma r.v. with parameter (α, λ) (α > 0 and λ > 0) if its pdf is given by
    f_X(x) = [λ^α x^(α−1) e^(−λx)]/Γ(α)    x > 0
           = 0                             x < 0
where Γ(α) is the gamma function defined by
    Γ(α) = ∫ (0 to ∞) x^(α−1) e^(−x) dx    α > 0