
SC224: Tutorial Sheet 3

Problems based on Jointly Distributed Random Variables, Expectation, Variance, Covariance and the Weak Law of Large Numbers.
Pb 1) The joint PMF of a discrete random vector $(X_1, X_2)$ is given by the following table:

        x2 \ x1 |  −1    0    1
        --------+---------------
           0    | 1/9  2/9  1/9
           1    | 1/9  2/9  1/9
           2    |  0   1/9   0

a) Determine the covariance of $X_1$ and $X_2$. (Ans: 0.)


b) Calculate the correlation coefficient $\rho_{X_1,X_2} = \frac{\mathrm{Cov}(X_1, X_2)}{\sqrt{\mathrm{Var}(X_1)\,\mathrm{Var}(X_2)}}$ of $X_1$ and $X_2$. (Ans: 0.)
c) Are $X_1$ and $X_2$ independent random variables? Justify your answer.
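
A direct enumeration of the table (a Python sketch, assuming numpy is available) is a quick way to check parts a) and b):

```python
import numpy as np

# Joint pmf from the table: rows are x2 = 0, 1, 2; columns are x1 = -1, 0, 1
pmf = np.array([[1, 2, 1],
                [1, 2, 1],
                [0, 1, 0]]) / 9.0
x1 = np.array([-1, 0, 1])
x2 = np.array([0, 1, 2])

E1  = (pmf.sum(axis=0) * x1).sum()     # E[X1] from the column marginals
E2  = (pmf.sum(axis=1) * x2).sum()     # E[X2] from the row marginals
E12 = (pmf * np.outer(x2, x1)).sum()   # E[X1 X2]
print(E12 - E1 * E2)                   # Cov(X1, X2) = 0.0
```

Note that a covariance of 0 does not by itself settle part c); compare, for instance, $P[X_1 = -1, X_2 = 2] = 0$ with the product of the corresponding marginals.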

Pb 2) The moment generating function of a random variable X is a function $M_X(t)$ of a free parameter t, defined by $M_X(t) = E[e^{tX}]$ (if it exists).

(i) Compute the moment generating functions for the following distributions.
a) Geometric distribution with parameter p.
b) Uniform distribution over the interval [a, b].
(ii) If the moment generating function exists for a random variable X, then show that the nth moment about the origin (i.e., $E[X^n]$) can be found by evaluating the nth derivative of the moment generating function at $t = 0$.
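
For part (ii), a symbolic check is easy to run. The sketch below assumes sympy and takes as given the closed-form MGF of a geometric(p) variable supported on {1, 2, ...}, which part (i)a asks you to derive:

```python
import sympy as sp

t, p = sp.symbols('t p', positive=True)

# MGF of a geometric(p) r.v. on {1, 2, ...}, valid for (1 - p) e^t < 1
M = p * sp.exp(t) / (1 - (1 - p) * sp.exp(t))

# Part (ii): the nth derivative at t = 0 gives the nth moment E[X^n]
EX  = sp.simplify(sp.diff(M, t, 1).subs(t, 0))   # E[X]   = 1/p
EX2 = sp.simplify(sp.diff(M, t, 2).subs(t, 0))   # E[X^2] = (2 - p)/p**2
print(EX, sp.simplify(EX2 - EX**2))              # mean 1/p, variance (1 - p)/p**2
```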

Pb 3) Suppose that we have a resistance R whose value follows a uniform law between 900 and 1100 Ω. What is the density of the corresponding conductance G = 1/R?
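
For Pb 3, simulation gives a useful sanity check of whatever density you derive via the change-of-variables formula (a sketch, assuming numpy):

```python
import numpy as np

rng = np.random.default_rng(0)
G = 1.0 / rng.uniform(900.0, 1100.0, size=1_000_000)   # G = 1/R

# Histogram of G on its support [1/1100, 1/900]; compare the bar heights
# with your analytic f_G(g) evaluated at the bin centers.
heights, edges = np.histogram(G, bins=50, density=True)
centers = 0.5 * (edges[:-1] + edges[1:])
print(centers[:3], heights[:3])
```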

Pb 4) (Universality of the uniform distribution) Let X be a real-valued random variable and let $U \sim \mathcal{U}([0, 1])$. Since $F_X : \mathbb{R} \to [0, 1]$ is not always one-to-one, we define $F_X^{-1}$ by
$$F_X^{-1}(u) = \sup\{x \in \mathbb{R} : F_X(x) \le u\}.$$
Show that $F_X^{-1}(U)$ and X have the same distribution.
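
The statement of Pb 4 is the basis of inverse-transform sampling. A minimal illustration with an Exp(1) target, where $F_X$ is strictly increasing so the generalized inverse coincides with the ordinary one (numpy assumed):

```python
import numpy as np

rng = np.random.default_rng(1)
U = rng.uniform(size=200_000)

# For Exp(1), F_X(x) = 1 - e^{-x}, so F_X^{-1}(u) = -log(1 - u)
X = -np.log(1.0 - U)

# The empirical CDF of F_X^{-1}(U) should match F_X
for x in (0.5, 1.0, 2.0):
    print(x, (X <= x).mean(), 1 - np.exp(-x))
```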

Pb 5) Show that if X and Y are independent random variables, then so are g(X) and h(Y) for any (measurable) functions g and h.

Pb 6) Two random variables X and Y are said to be uncorrelated if their covariance is 0. Suppose X and Y are independent, uniformly distributed random variables over the common interval [0, 1]. Define Z = X + Y and W = X − Y. Show that Z and W are uncorrelated but not independent.
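
A simulation makes the conclusion of Pb 6 tangible (a sketch, assuming numpy): the sample covariance of Z and W is near 0, yet conditioning on Z visibly changes the distribution of W.

```python
import numpy as np

rng = np.random.default_rng(2)
X = rng.uniform(size=1_000_000)
Y = rng.uniform(size=1_000_000)
Z, W = X + Y, X - Y

print(np.cov(Z, W)[0, 1])                    # ~0: uncorrelated

# But Z and W are not independent: Z > 1.8 forces X, Y > 0.8, hence |W| < 0.2
print((np.abs(W) < 0.1).mean())              # unconditional: ~0.19
print((np.abs(W[Z > 1.8]) < 0.1).mean())     # conditional on Z > 1.8: ~0.75
```
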
Pb 7) Let the joint PDF of random variables X and Y be defined as
$$f_{X,Y}(x, y) = k\cos(x + y) \quad \text{for } 0 \le x \le \frac{\pi}{4},\ 0 \le y \le \frac{\pi}{4}.$$
Determine the constant k and the marginal probability density functions $f_X(x)$ and $f_Y(y)$ of X and Y. Are the random variables X and Y orthogonal? Justify. (The random variables X and Y are said to be orthogonal if the mathematical expectation $E[XY] = 0$.)
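
For Pb 7, the normalizing condition $\iint f_{X,Y}\,dx\,dy = 1$ can be checked symbolically (a sketch, assuming sympy):

```python
import sympy as sp

x, y, k = sp.symbols('x y k', positive=True)
total = sp.integrate(sp.cos(x + y), (x, 0, sp.pi / 4), (y, 0, sp.pi / 4))
print(sp.solve(sp.Eq(k * total, 1), k))  # k = 1/(sqrt(2) - 1), i.e. sqrt(2) + 1
```
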
Pb 8) Let X and Y be linearly dependent real-valued random variables. Show that X and Y are not independent (in the probability sense).
Pb 9) (Multinomial Distribution) Let Ω be a sample space associated with a random experiment E, and let $B_1, B_2, \ldots, B_n$ be a partition of Ω. Assume that we perform m independent repetitions of the experiment E and that the probability $p_k = P[B_k]$ is constant from one repetition to another. If $X_k$ denotes the number of times that the event $B_k$ has occurred among the m repetitions, for $k = 1, 2, \ldots, n$, then determine the joint PMF of the random vector $(X_1, X_2, \ldots, X_n)$ and $\mathrm{Cov}(X_i, X_j)$. Also, calculate the expectation and variance of the random variable $X = \frac{1}{n}\sum_{k=1}^{n} X_k$.
Pb 10) Show that if $X \ge 0$ and $E(X) = \mu$, then $P(X \ge \sqrt{\mu}) \le \sqrt{\mu}$.
Pb 11) Let X have variance $\sigma_X^2$ and Y have variance $\sigma_Y^2$. Show that $-1 \le \rho_{X,Y} \le 1$. Further, argue that if $\rho_{X,Y} = 1$ or $-1$, then X and Y are related by $Y = a + bX$, where $b > 0$ if $\rho_{X,Y} = 1$ and $b < 0$ if $\rho_{X,Y} = -1$.
Pb 12) Consider n independent trials, each of which results in one of the outcomes $i = 1, 2, 3$ with respective probabilities $p_1, p_2, p_3$, where $\sum_{i=1}^{3} p_i = 1$. Let $N_i$ denote the number of trials that result in outcome i, and show that $\mathrm{Cov}(N_1, N_2) = -n p_1 p_2$. Also explain why it is intuitive that this covariance is negative.
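
For Pb 12, the covariance identity can be checked by simulation (numpy assumed; numpy's multinomial sampler models exactly this trial scheme):

```python
import numpy as np

rng = np.random.default_rng(3)
n, p = 100, np.array([0.2, 0.3, 0.5])
N = rng.multinomial(n, p, size=500_000)   # each row: (N1, N2, N3)

print(np.cov(N[:, 0], N[:, 1])[0, 1])     # empirical Cov(N1, N2)
print(-n * p[0] * p[1])                   # claimed value: -n p1 p2 = -6.0
```
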
Pb 13) Suppose that X is a random variable with mean and variance both equal to 20. What can be said about $P[0 \le X \le 40]$?
Pb 14) From past experience, a professor knows that the test score of a student taking her
final examination is a random variable with mean 75.
(a) Give an upper bound to the probability that a student’s test score will exceed
85.
(b) Suppose in addition the professor knows that the variance of a student’s test
score is equal to 25. What can be said about the probability that a student will
score between 65 and 85?
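
Both Pb 13 and Pb 14 are exercises in Markov's and Chebyshev's inequalities; once you have derived the bounds, the arithmetic is one line each (plain Python, shown for Pb 14):

```python
mu, var = 75, 25

# (a) Markov (for a nonnegative score): P[X > 85] <= E[X]/85
print(mu / 85)              # 15/17 ~ 0.88

# (b) Chebyshev: P[|X - 75| >= 10] <= var/10**2 = 1/4,
#     hence P[65 < X < 85] >= 3/4
print(1 - var / 10**2)      # 0.75
```
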
Problems based on Special Discrete Random Variables.
Pb 1) The moment generating function of a random variable X is a function $M_X(t)$ of a free parameter t, defined by $M_X(t) = E[e^{tX}]$ (if it exists).

(i) Compute the moment generating functions for the following distributions.
a) Bernoulli distribution with probability of success p.
b) Binomial distribution with parameters n and p.
c) Poisson distribution with parameter λ > 0.
(ii) Using the moment generating function, find the mean and variance of the above-mentioned distributions. Further, argue that a sum of independent Binomial (Poisson) random variables follows a Binomial (Poisson) distribution.
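
As in the previous section, sympy can confirm part (ii) once part (i) is done; a sketch for the Poisson case, taking the MGF $e^{\lambda(e^t - 1)}$ as given:

```python
import sympy as sp

t, lam = sp.symbols('t lam', positive=True)

# MGF of Poisson(lam): E[e^{tX}] = exp(lam * (e^t - 1))
M = sp.exp(lam * (sp.exp(t) - 1))

EX  = sp.diff(M, t, 1).subs(t, 0)                  # lam
EX2 = sp.diff(M, t, 2).subs(t, 0)                  # lam + lam**2
print(sp.simplify(EX), sp.simplify(EX2 - EX**2))   # mean and variance: both lam

# The product of two such MGFs is exp((lam1 + lam2)(e^t - 1)): again a Poisson
# MGF, which is the closure-under-sums claim for independent Poisson variables.
```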

Pb 2) An urn contains n balls numbered 1 through n. If you withdraw m balls randomly in sequence, each time replacing the ball selected previously, find $P[X = k]$, $k = 1, \ldots, n$, where X is the maximum of the m chosen numbers.
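
A simulation check for Pb 2 (numpy assumed; n and m are illustrative values): compare the empirical pmf of the maximum with your formula.

```python
import numpy as np

rng = np.random.default_rng(4)
n, m = 10, 3
draws = rng.integers(1, n + 1, size=(1_000_000, m))  # draws with replacement
X = draws.max(axis=1)

emp = np.bincount(X, minlength=n + 1)[1:] / len(X)
print(emp)   # compare entry k with your P[X = k] at n = 10, m = 3
```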

Pb 3) If X is a binomial random variable with expected value 6 and variance 2.4, find
P [X = 5].
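
For Pb 3, the two given moments pin down the parameters, after which the pmf is a one-liner (plain Python; the values in the comment follow from the moment equations):

```python
from math import comb

# E[X] = n*p = 6 and Var(X) = n*p*(1 - p) = 2.4 give 1 - p = 0.4,
# so p = 0.6 and n = 10
n, p = 10, 0.6
print(comb(n, 5) * p**5 * (1 - p)**5)   # P[X = 5] ~ 0.2007
```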

Pb 4) On average, 5.2 hurricanes hit a certain region in a year. What is the probability that
there will be 3 or fewer hurricanes hitting this year?
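
Pb 4 models the hurricane count as Poisson with λ = 5.2, as this section intends; scipy evaluates the required tail sum directly (a check for your hand computation):

```python
from scipy.stats import poisson

# P[X <= 3] = sum_{k=0}^{3} e^{-5.2} * 5.2**k / k!
print(poisson.cdf(3, mu=5.2))   # ~0.238
```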

Pb 5) The number of eggs laid on a tree leaf by an insect of a certain type is a Poisson
random variable with parameter λ. However, such a random variable can be observed
only if it is positive, since if it is 0 then we cannot know that such an insect was on
the leaf. If we let Y denote the observed number of eggs, then

P [Y = i] = P [X = i|X > 0]

where X is Poisson with parameter λ. Find E[Y ].
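
For Pb 5, a simulation of the conditional law gives a target to check your closed form against (numpy assumed; λ = 2 is illustrative):

```python
import numpy as np

rng = np.random.default_rng(5)
lam = 2.0
X = rng.poisson(lam, size=2_000_000)
Y = X[X > 0]             # the observed counts: X conditioned on X > 0

print(Y.mean())          # compare with your formula for E[Y] at lam = 2
```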

Pb 6) Suppose that
$$P[X = a] = p, \qquad P[X = b] = 1 - p.$$
Show that $\frac{X - b}{a - b}$ is a Bernoulli random variable. Find Var(X).

Pb 7) Each game you play is a win with probability p. You plan to play 5 games, but if you
win the fifth game, then you will keep on playing until you lose. Find the expected
number of games that you lose.
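
Pb 7's stopping rule is easy to mis-handle, so a direct simulation is a worthwhile check (numpy assumed; p = 0.7 is illustrative):

```python
import numpy as np

rng = np.random.default_rng(6)
p, trials = 0.7, 200_000
total_losses = 0
for _ in range(trials):
    games = rng.random(5) < p     # win/loss results of the 5 planned games
    losses = int((~games).sum())
    if games[4]:                  # won the fifth game: play on until a loss,
        losses += 1               # so the extra run ends with exactly one loss
    total_losses += losses
print(total_losses / trials)      # compare with your formula at p = 0.7
```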

Pb 8) Ten balls are to be distributed among 5 urns, with each ball going into urn i with probability $p_i$, where $\sum_{i=1}^{5} p_i = 1$. Let $X_i$ denote the number of balls that go into urn i. Assume that events corresponding to the locations of different balls are independent.

a) What type of random variable is $X_i$? Be as specific as possible.
b) For $i \neq j$, what type of random variable is $X_i + X_j$?
c) Find $P[X_1 + X_2 + X_3 = 7]$.
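
A simulation for part c) (numpy assumed; the $p_i$ below are illustrative, since the problem leaves them general):

```python
import numpy as np

rng = np.random.default_rng(7)
p = np.array([0.2, 0.1, 0.3, 0.25, 0.15])        # illustrative urn probabilities
counts = rng.multinomial(10, p, size=1_000_000)  # rows: (X1, ..., X5)

s = counts[:, :3].sum(axis=1)                    # X1 + X2 + X3
print((s == 7).mean())   # compare with your closed-form answer for these p_i
```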
