Random Variables
Dr. O. I. Oshin
1
Random Variables - Introduction
• A random variable is a function that assigns a numerical value to
each outcome of an experiment (see the sketch below).
• The range of a random variable is always a set of real numbers.
• Examples: the number of heads when a coin is tossed three times,
the number of injuries in a football match, and the number of times
a ‘3’ appears when two dice are tossed.
• Random variables are denoted by uppercase letters such as X and
Y, and the values taken by them are denoted by lowercase letters
with subscripts such as x1, x2, y1, y2.
• Random variables can either be discrete or continuous.
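As a quick illustration of the definition above, here is a minimal Python sketch (an assumed example, not taken from the slides) of a random variable as a function that maps each outcome of three coin tosses to a real number:

from itertools import product

# Sample space of three coin tosses: tuples like ('H', 'T', 'H')
sample_space = list(product('HT', repeat=3))

# The random variable X maps each outcome to a real number (here, the number of heads)
def X(outcome):
    return outcome.count('H')

for omega in sample_space:
    print(omega, '->', X(omega))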
2
Discrete Versus Continuous
• A discrete random variable is a random variable that takes on a
finite number of values in a finite observation interval; it takes on
specific, separated values only.
• A continuous random variable is one that takes on an infinite
number of values within a finite interval; its values lie within a
continuous range.
3
Probability Mass Function/Distribution
• A probability mass function is a mathematical function that describes
the possible values of a discrete random variable and their associated
probabilities.
Example:
• Probability distribution for the coin experiment.
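A short Python sketch of one way to tabulate such a distribution, assuming the coin experiment means counting heads in three fair tosses:

from itertools import product
from collections import Counter
from fractions import Fraction

outcomes = list(product('HT', repeat=3))          # 8 equally likely outcomes
counts = Counter(o.count('H') for o in outcomes)  # value of X -> number of outcomes

# PMF: P(X = x) = (favourable outcomes) / (total outcomes)
pmf = {x: Fraction(n, len(outcomes)) for x, n in counts.items()}
print(pmf)   # e.g. P(X=0) = 1/8, P(X=1) = 3/8, P(X=2) = 3/8, P(X=3) = 1/8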
4
Exercise 1
• Consider an experiment involving the rolling of two dice. The sum
of the points on the two dice is a random variable ‘N’. Obtain the
probability distribution for the random variable.
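A possible Python sketch for this exercise, enumerating the 36 equally likely outcomes of the two dice:

from itertools import product
from collections import Counter
from fractions import Fraction

rolls = list(product(range(1, 7), repeat=2))      # 36 equally likely (die1, die2) pairs
counts = Counter(a + b for a, b in rolls)         # N = sum of the points on the two dice

pmf = {n: Fraction(c, 36) for n, c in sorted(counts.items())}
for n, p in pmf.items():
    print(f"P(N = {n}) = {p}")   # e.g. P(N = 7) = 1/6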
5
Cumulative Distribution Function
• The Cumulative Distribution Function (CDF) of a random variable
‘X’ may be defined as the probability that the random variable ‘X’
will take a value less than or equal to ‘x’, where ‘x’ is a given
dummy variable.
• The CDF of a random variable enables one to have a probabilistic
description of a random variable.
• It is the probability that the outcome of an experiment will be one
of the outcomes for which X≤ x. The probability of this event is
denoted as P(X≤ x).
6
Properties of CDF
1. The distribution function FX(x) is bounded between 0 and 1 i.e. 0
≤ FX(x) ≤ 1.
2. FX(-∞) = 0 and FX(∞) = 1
3. The distribution function FX(x) is a monotonic non-decreasing
function of x, i.e.
FX(x1) ≤ FX(x2) if x1 < x2
• For a discrete r.v., the CDF over the complete range of x may be
expressed as the running sum of the probability masses:
FX(x) = Σ P(X = xi) over all xi ≤ x (see the sketch below).
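A minimal Python sketch of this running sum, reusing the two-dice sum as an assumed example:

from itertools import product
from collections import Counter
from fractions import Fraction

counts = Counter(a + b for a, b in product(range(1, 7), repeat=2))
pmf = {n: Fraction(c, 36) for n, c in sorted(counts.items())}

# F_X(x) = sum of P(X = x_i) over all x_i <= x
cdf, running = {}, Fraction(0)
for n, p in pmf.items():
    running += p
    cdf[n] = running

print(cdf[7])    # P(N <= 7) = 7/12
print(cdf[12])   # P(N <= 12) = 1, consistent with FX(+inf) = 1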
7
Probability Density Function
• Probability Density Function (PDF) is an alternative probabilistic
description of a random variable.
• It is the derivative of the CDF of a random variable with respect to
a given dummy variable.
• It gives a more convenient representation for a continuous
random variable.
• The PDF may be expressed as fX(x) = dFX(x)/dx, as illustrated in the
sketch below.
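A hedged sympy sketch of this relationship, using an assumed exponential CDF as the example:

import sympy as sp

x = sp.symbols('x', positive=True)
F = 1 - sp.exp(-2*x)            # assumed example CDF of an exponential RV on x > 0
f = sp.diff(F, x)               # PDF: f_X(x) = dF_X(x)/dx
print(f)                        # 2*exp(-2*x)
print(sp.integrate(f, (x, 0, sp.oo)))   # area under the PDF = 1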
8
Properties of PDF
1. PDF is always non-negative for all values of x.
2. The area under the PDF curve is always equal to unity.
3. The CDF may be obtained by integrating PDF.
4. The probability of the event {x1 < X ≤ x2} is simply given by the
area under the PDF curve in the range x1 < X ≤ x2.
9
Example
• The CDF for a certain random variable X is FX(x) = x^(1/4) for all values 0 <
𝑥 < 1.
i. Determine the PDF expression.
ii. Verify that the area under the PDF curve is unity
iii. Find P(1/2 < 𝑋 < 1)
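A sympy sketch of the three parts, assuming the CDF reads FX(x) = x^(1/4) on 0 < x < 1 (a reconstruction of the slide's expression, so treat it as an assumption):

import sympy as sp

x = sp.symbols('x', positive=True)
F = x**sp.Rational(1, 4)                            # assumed CDF on 0 < x < 1
f = sp.diff(F, x)                                   # (i) PDF by differentiation of the CDF
print(sp.simplify(f))                               # 1/(4*x**(3/4))
print(sp.integrate(f, (x, 0, 1)))                   # (ii) area under the PDF = 1
print(sp.integrate(f, (x, sp.Rational(1, 2), 1)))   # (iii) P(1/2 < X < 1) = 1 - (1/2)**(1/4)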
10
Exercise 1
• A continuous random variable has a probability density function
expressed as
i. Verify that the area under the curve is unity.
ii. Determine the probability that it will take a value between 1 and
3.
11
Exercise 2
• Let X be a CRV whose PDF fX(𝑥) is given for the interval 0 < 𝑥 < 𝐶. What
is the value of ‘C’ that makes fX(𝑥) a valid PDF?
12
Assignment
• Given the PDF, where ‘X’ is a random variable whose values lie in the
range x = -∞ to +∞:
i. What is the relationship between ‘a’ and ‘b’?
ii. Determine the probability that the outcome lies between 1 and 2.
13
Joint CDF
• The Joint Cumulative Distribution Function describes the outcome of
an experiment in terms of two random variables.
• It is the probability that a random variable ‘X’ will take a value less
than or equal to ‘x’ and that a random variable ‘Y’ will take a value
less than or equal to ‘y’.
• It is denoted by FXY(x, y), which is P(X ≤ x, Y ≤ y).
• It is the probability that the outcome of an experiment will result in
a sample point lying within the range (-∞ < X ≤ x, -∞ < Y ≤ y) of the
joint sample space.
14
Properties of Joint CDF
1. The Joint CDF is a non-negative function.
2. The Joint CDF is a monotonic non-decreasing function of x and
y.
15
Joint PDF
• The Joint Probability Density Function of any two random variables
X and Y may be defined as the partial derivative of the joint
cumulative distribution function FXY(x, y) with respect to the given
dummy variables x and y.
• Mathematically, fXY(x, y) = ∂²FXY(x, y) / ∂x ∂y.
16
Properties of Joint PDF
1. The joint PDF is non-negative.
2. The total volume under the surface of the joint PDF is unity.
17
Statistically Independent RVs
• If 2 RVs are statistically independent, i.e. the outcome of X has
nothing to do with the outcome of Y, then the joint PDF of the RVs
becomes the product of the two separate PDFs (a check is sketched
after the equations below).
fXY(x, y) = fX(x) × fY(y)
P(x1 < X ≤ x2, y1 < Y ≤ y2) = ∫ from y1 to y2 ∫ from x1 to x2 fXY(x, y) dx dy
= ∫ from y1 to y2 fY(y) dy × ∫ from x1 to x2 fX(x) dx
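A small sympy sketch checking this factorisation for an assumed pair of independent exponential random variables:

import sympy as sp

x, y = sp.symbols('x y', positive=True)
f_X = sp.exp(-x)                 # assumed marginal PDF of X (x > 0)
f_Y = 2*sp.exp(-2*y)             # assumed marginal PDF of Y (y > 0)
f_XY = f_X * f_Y                 # joint PDF of independent RVs is the product

x1, x2, y1, y2 = 0, 1, 0, 2
lhs = sp.integrate(f_XY, (x, x1, x2), (y, y1, y2))                  # P(x1 < X <= x2, y1 < Y <= y2)
rhs = sp.integrate(f_X, (x, x1, x2)) * sp.integrate(f_Y, (y, y1, y2))
print(sp.simplify(lhs - rhs))    # 0: the joint probability equals the product of the two integrals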
18
Example 1
• The joint probability density function of two random variables X
and Y is given as:
Determine the value of constant C.
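As a hedged illustration of the technique, the sketch below finds the normalising constant for an assumed joint PDF C·x·y on the unit square (not necessarily the density intended above); the total volume under the joint PDF must equal unity:

import sympy as sp

x, y, C = sp.symbols('x y C', positive=True)
f_XY = C * x * y                                   # assumed joint PDF on 0 < x < 1, 0 < y < 1

volume = sp.integrate(f_XY, (x, 0, 1), (y, 0, 1))  # total volume under the joint PDF
print(sp.solve(sp.Eq(volume, 1), C))               # [4]: C = 4 makes the volume unity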
19
Exercise
• Let X and Y be two jointly CRVs with a JPDF of:
i. Find the constant c
ii. Find 𝑃(0 < 𝑋 < 1/2, 0 < 𝑌 < 3/4)
20
Exercise
• A joint probability density function of two random variables X and Y is
given as:
i. Show that the volume under the surface of the JPDF curve is unity.
ii. P(X < 1)
iii. P(X > Y)
iv. P(X + Y < 1)
21
Marginal Densities
• Marginal densities or Marginal probability density functions result
when the probability density function for any single random
variable is obtained from a joint PDF.
• We can find marginal PDFs of X and Y from their joint PDF. In
particular, by integrating over all y’s we obtain 𝑓𝑋(𝑥) and vice
versa.
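A minimal sympy sketch of this integration, using an assumed joint PDF as the example:

import sympy as sp

x, y = sp.symbols('x y', nonnegative=True)
f_XY = x + sp.Rational(3, 2) * y**2          # assumed joint PDF on 0 <= x <= 1, 0 <= y <= 1

f_X = sp.integrate(f_XY, (y, 0, 1))          # marginal of X: integrate over all y
f_Y = sp.integrate(f_XY, (x, 0, 1))          # marginal of Y: integrate over all x
print(f_X)                                   # x + 1/2
print(f_Y)                                   # 3*y**2/2 + 1/2
print(sp.integrate(f_X, (x, 0, 1)))          # 1: each marginal is itself a valid PDF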
22
Exercise
• Let X and Y be two jointly CRVs with a JPDF of:
i. Find the constant c
ii. Find the marginal PDFs (𝑓𝑋(𝑥) and 𝑓𝑌(𝑦))
23
Example
• The joint probability density of the random variables X and Y is
i. Find fX(x)
ii. ASSIGNMENT: fY(y) = 1/(1 + y)². Show the workings.
iii. Show that the random variables are dependent on each other.
24
Conditional PDF
• When one of two random variables takes a fixed value, the PDF of the
other, conditioned on that value, is called a conditional PDF.
25
Properties of Conditional PDF
1. A conditional PDF is always non-negative.
2. The area under the conditional PDF curve is always unity.
3. If the two random variables are statistically independent, their
conditional PDFs are reduced to their marginal densities.
26
Example
fXY(x, y) = cxy for 0 < x < 1, 0 < y < 1; 0 otherwise.
• Find c
fXY(x, y) = cxy for 0 < x < y, 0 < y < 1; 0 otherwise.
1. Find the conditional PDF of 𝑋 given 𝑌 = 𝑦
2. Find the conditional PDF of 𝑌 given 𝑋 = 𝑥
3. Determine if X and Y are dependent on each other
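A sympy sketch of parts 1–3 for the second density above, using fX|Y(x|y) = fXY(x, y)/fY(y) and fY|X(y|x) = fXY(x, y)/fX(x):

import sympy as sp

x, y = sp.symbols('x y', positive=True)
c = sp.symbols('c', positive=True)

f_XY = c * x * y                                         # joint PDF on 0 < x < y < 1

# Normalise over the triangular support to find c
c_val = sp.solve(sp.Eq(sp.integrate(f_XY, (x, 0, y), (y, 0, 1)), 1), c)[0]
f_XY = f_XY.subs(c, c_val)                               # c = 8

f_Y = sp.integrate(f_XY, (x, 0, y))                      # marginal of Y: 4*y**3
f_X = sp.integrate(f_XY, (y, x, 1))                      # marginal of X: 4*x*(1 - x**2)

print(sp.simplify(f_XY / f_Y))                           # f_X|Y(x|y) = 2x/y**2 on 0 < x < y
print(sp.simplify(f_XY / f_X))                           # f_Y|X(y|x) = 2y/(1 - x**2) on x < y < 1
print(sp.simplify(f_XY - f_X * f_Y) == 0)                # False: X and Y are dependent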
27
Exercise
• Let X and Y be two jointly CRVs with a joint PDF of
i. Find the conditional PDF of X given 𝑌 = 𝑦
ii. 𝑃(𝑋 < 1/2 | 𝑌 = 𝑦)
iii. Find the conditional PDF of 𝑌 given 𝑋 = 𝑥
iv. 𝑃(𝑌 < 1 | 𝑋 = 𝑥)
28
Statistical Averages of RVs
• Statistical averages give useful summary information about a random
variable at a glance.
• Examples of statistical averages include: mean/average,
moments, standard deviation, variance etc.
• They are special characteristics of the Probability Density
Function of the random variables.
29
Mean or Average
• In general terms, the mean or average of a random variable with
equally likely values is given by the summation of all the values of
the random variable divided by the total number of values it can
take.
• The mean value of a random variable is also known as the
Expectation of the random variable.
30
Mean value of Discrete RVs
• The mean value of a discrete random variable is the summation of
the values of the random variable, each weighted by its
probability.
Mean value of Continuous RVs
For a continuous random variable, the summation over the values of
the random variable converts to an integration over the complete
range of x: {-∞ < x < ∞}. The mean value is obtained by weighting the
values of the random variable by the PDF and integrating.
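A short Python sketch of both cases, using the two-dice sum for the discrete part and an assumed exponential PDF for the continuous part:

import sympy as sp
from itertools import product
from collections import Counter
from fractions import Fraction

# Discrete: E[N] = sum of n * P(N = n) for the two-dice sum
counts = Counter(a + b for a, b in product(range(1, 7), repeat=2))
E_N = sum(Fraction(n * c, 36) for n, c in counts.items())
print(E_N)                                   # 7

# Continuous: E[X] = integral of x * f_X(x) dx over the whole range
x = sp.symbols('x', positive=True)
f_X = 2 * sp.exp(-2 * x)                     # assumed PDF (exponential, x > 0)
print(sp.integrate(x * f_X, (x, 0, sp.oo)))  # 1/2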
31
Example
• Consider a random variable ‘X’ uniformly distributed in the
interval (-π, π) with the PDF fX(x) = 1/(2π) for -π < x < π.
Determine the mean value of X.
32
Mean Value of Functions of Continuous RVs
• You can also determine the mean of a function of a given random
variable.
• For a given random variable X, you can determine the mean of a
function of X, say g(X), over the complete range of x: {−∞ < 𝑥 < ∞}.
E[g(X)] = ∫ from -∞ to +∞ g(x) fX(x) dx
33
Example
• Consider a random variable ‘X’ uniformly distributed in the
interval (-π, π) with the PDF fX(x) = 1/(2π),
and another random variable ‘Z’ that is a function of ‘X’ such that Z =
cos(X).
Determine the mean value of Z.
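A sympy sketch of this example (and the previous one): with fX(x) = 1/(2π) on (-π, π), both E[X] and E[cos X] evaluate to zero:

import sympy as sp

x = sp.symbols('x', real=True)
f_X = sp.Rational(1, 2) / sp.pi                 # uniform PDF on (-pi, pi): 1/(2*pi)

E_X = sp.integrate(x * f_X, (x, -sp.pi, sp.pi))           # mean of X
E_Z = sp.integrate(sp.cos(x) * f_X, (x, -sp.pi, sp.pi))   # mean of Z = cos(X)
print(E_X, E_Z)                                 # 0 0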
34
Moments
• The “moments” of a random variable are the expected values of
powers or related functions of the random variable.
• The nth moment of any random variable ‘X’ may be defined as the
mean value of Xⁿ, i.e. E[Xⁿ].
• The first two moments are the most important.
• The first moment of any random variable is the same as its mean
value.
• The second moment is known as the mean square value of the
random variable ‘X’.
35
Central Moments
• Central Moments can be defined as the moments of the
difference between a random variable and its mean.
• The first central moment is 0.
• The second central moment is known as the Variance of the
random variable.
• The square root of the variance gives the Standard deviation of the
random variable. If the mean = 0, SD is simply the root mean
square (rms) value of the random variable.
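A minimal sympy sketch, for an assumed PDF, relating the first two moments to the variance and standard deviation:

import sympy as sp

x = sp.symbols('x', positive=True)
f_X = 3 * x**2                                  # assumed PDF on 0 < x < 1

m1 = sp.integrate(x * f_X, (x, 0, 1))           # first moment (mean): 3/4
m2 = sp.integrate(x**2 * f_X, (x, 0, 1))        # second moment (mean square): 3/5
var = m2 - m1**2                                # variance = second central moment
print(m1, m2, var, sp.sqrt(var))                # 3/4 3/5 3/80 sqrt(15)/20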
36
Example
• Let X be a continuous random variable with PDF given as:
i. Verify that the area under the curve is unity.
ii. What is the expectation of X.
iii. Determine the second moment of X.
iv. Find the variance and standard deviation of X.
37
Exercise
• The probability density function of a random variable ‘X’ is given
as:
Determine E[X], E[X²] and σX.
38
Uniform Distribution
• A distribution is said to be uniform if the value of the PDF is the
same for all possible values of the random variable.
• Prove that the mean and variance of a random variable ‘X’, having
a uniform distribution in the interval [a, b], are E[X] = (a + b)/2 and
Var(X) = (b - a)²/12 (a symbolic check follows below).
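A sympy verification sketch of these two results (a check under the stated uniform PDF, not the written proof itself):

import sympy as sp

x, a, b = sp.symbols('x a b', real=True)
f_X = 1 / (b - a)                               # uniform PDF on [a, b]

mean = sp.simplify(sp.integrate(x * f_X, (x, a, b)))
var = sp.simplify(sp.integrate((x - mean)**2 * f_X, (x, a, b)))
print(mean)                                     # a/2 + b/2
print(sp.factor(var))                           # (a - b)**2/12, i.e. (b - a)**2/12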
39
Gaussian Distribution
• Gaussian Distribution is defined for continuous random variables.
• It is also known as Normal probability distribution.
• It is very important in the analysis of communication and
statistical systems.
• The PDF for a Gaussian random variable with mean m and standard
deviation σ is expressed as:
fX(x) = (1 / (σ√(2π))) exp(-(x - m)² / (2σ²))
40
Properties of Gaussian PDF
1. The peak value occurs at x = m, i.e. the mean value. This peak value,
1/(σ√(2π)), may be obtained by putting x = m into the Gaussian PDF.
2. The plot of Gaussian PDF exhibits even symmetry around the
mean value.
3. The area under the PDF curve is ½ for all values of x above the
mean and ½ for all values of x below the mean.
4. As the standard deviation tends to 0, the Gaussian PDF
approaches an impulse function located at x = m.
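A numerical sketch checking properties 1–3 with scipy's normal distribution, for an assumed m = 0 and σ = 1:

import numpy as np
from scipy.stats import norm

m, sigma = 0.0, 1.0                                 # assumed mean and standard deviation
X = norm(loc=m, scale=sigma)

print(X.pdf(m), 1 / (sigma * np.sqrt(2 * np.pi)))   # peak value at x = m equals 1/(sigma*sqrt(2*pi))
print(X.pdf(m - 1.5), X.pdf(m + 1.5))               # even symmetry about the mean
print(X.cdf(m))                                     # area below the mean = 0.5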
41