KASDI MERBAH Ouargla University
Faculty of New Technologies of Information and Communication
Department of Electronic and Telecommunications
Probabilities and Statistics (Part B), L2 ST, S3
Chapter 3: Random variables (discrete and continuous)
I. Introduction
A random variable is a real function that maps the elements of the sample space to points
on the real axis. A random variable is denoted by a capital letter (X, Y, Z, …), and any
particular real value of the random variable is denoted by a lowercase letter (x, y, z, …). In this
chapter, we present the two types of random variables: discrete and continuous. We start the
chapter by defining the step and impulse functions, because they are used to sketch the
probability density function (pdf) and the cumulative distribution function (cdf) of discrete
random variables (r.v.).
The same translations as on the unit step function can be applied to the unit impulse function
to obtain Aδ(x − x0). A very useful integral for deducing the mathematical expression of a pdf
from its graphical representation is the sifting property:

∫ f(x) δ(x − x0) dx = f(x0)   (integral over all x)
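A short numerical sketch of this sifting property (the pulse approximation and function names are illustrative, not from the text):

```python
# The sifting property -- integral of f(x)*delta(x - x0) dx = f(x0) --
# can be checked numerically by replacing delta(x - x0) with a narrow
# rectangular pulse of width eps and height 1/eps (unit area): the
# integral of f(x)*(1/eps) over the pulse is the average of f over it.

def sifting_integral(f, x0, eps=1e-4, n=10_001):
    """Approximate integral of f(x) * delta(x - x0) dx (midpoint rule)."""
    h = eps / n
    return sum(f(x0 - eps / 2 + (k + 0.5) * h) for k in range(n)) / n

# Sifting f(x) = x**2 at x0 = 2 returns f(2) = 4 (up to O(eps**2)).
value = sifting_integral(lambda x: x ** 2, 2.0)
print(round(value, 6))   # 4.0
```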
I.1. Discrete Random Variables
If a random variable X can assume only a particular finite or countably infinite set of values x1, x2,
…, xn, then X is said to be a discrete random variable. If we associate each outcome xi with a
number P(xi) = P(X = xi), called the probability of xi, then the numbers P(xi), sometimes denoted
Pi for simplicity, i = 1, 2, …, must satisfy the following conditions:

1. P(xi) ≥ 0 for all i
2. Σi P(xi) = 1
The distribution function or cumulative distribution function (CDF) of a random variable X is
defined as:

FX(x) = P(X ≤ x) = Σ_{xi ≤ x} P(xi)
The pdf of a random variable X can be given in a table as:
xi x1 x2 … xn
P(xi) P(x1) P(x2) … P(xn)
Example 1:
Consider the experiment of rolling two dice. Let X represent the sum of numbers that show up on
the upper faces of the two dice.
a) What is the probability that X is between 4 and 6 inclusive?
b) Determine P(X ≥ 5).
c) Sketch the pdf and the cdf of X.
Solution:
a) When rolling two dice, we obtain the following table of probabilities:
Number on each upper face:
        1      2      3      4      5      6
  1    1/36   1/36   1/36   1/36   1/36   1/36
  2    1/36   1/36   1/36   1/36   1/36   1/36
  3    1/36   1/36   1/36   1/36   1/36   1/36
  4    1/36   1/36   1/36   1/36   1/36   1/36
  5    1/36   1/36   1/36   1/36   1/36   1/36
  6    1/36   1/36   1/36   1/36   1/36   1/36
From the table, P(X = 4) = 3/36, P(X = 5) = 4/36, and P(X = 6) = 5/36, so

P(4 ≤ X ≤ 6) = 12/36 = 1/3

b) P(X ≥ 5) = 1 − P(X ≤ 4) = 1 − (1 + 2 + 3)/36 = 30/36 = 5/6.
c) The pdf and cdf of X are sketched in the following graphs:
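The three parts can be checked with a short Python sketch (illustrative, standard library only) that builds the pdf and cdf of the sum of two dice:

```python
from fractions import Fraction
from itertools import product

# Build the pdf of X = sum of the two upper faces: each of the 36
# ordered outcomes has probability 1/36.
pdf = {}
for d1, d2 in product(range(1, 7), repeat=2):
    s = d1 + d2
    pdf[s] = pdf.get(s, Fraction(0)) + Fraction(1, 36)

# a) P(4 <= X <= 6)
p_a = sum(pdf[s] for s in (4, 5, 6))
# b) P(X >= 5) = 1 - P(X <= 4)
p_b = 1 - sum(pdf[s] for s in (2, 3, 4))
# c) cdf: F(x) = P(X <= x)
cdf = {s: sum(p for v, p in pdf.items() if v <= s) for s in sorted(pdf)}

print(p_a)   # 1/3
print(p_b)   # 5/6
```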
I.2. Continuous Random Variables
X is called a continuous random variable if its distribution function FX(x) can be represented as

FX(x) = P(X ≤ x) = ∫_{−∞}^{x} fX(t) dt

where fX(x) is a probability density function; it must satisfy these two conditions:

1. fX(x) ≥ 0 for all x
2. ∫ fX(x) dx = 1   (integral over all x)
Example 2:
Let fX(x) be a function defined as
a) Find the constant c such that fX(x) is a density function.
b) Compute P(1 < X < 2).
c) Find the distribution function FX (x).
Solution:
(a) The first condition is satisfied because fX(x) is nonnegative over the given range of x.
For fX(x) to be a density function, we find the constant c from the second condition,
∫ fX(x) dx = 1. Solving this integral, we obtain the unique value that satisfies the
condition: c = 2/9. Hence:
The curves of the pdf and the cdf of X are as follows:
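A hedged numerical sketch of part (a): the exact form of fX(x) is not reproduced here, so the code assumes, for illustration only, a density of the form fX(x) = c·x on (0, 3), which is consistent with the value c = 2/9 found above:

```python
from fractions import Fraction

# Hypothetical density for illustration: f_X(x) = c*x on (0, 3), zero
# elsewhere -- a form consistent with the worked value c = 2/9.
# The antiderivative of c*x is c*x**2/2, so the normalization condition
# integral_0^3 c*x dx = c*9/2 = 1 gives c = 2/9.
c = Fraction(2, 9)

def F(x):
    """CDF on [0, 3]: F_X(x) = integral_0^x c*t dt = c*x**2/2."""
    x = Fraction(x)
    if x <= 0:
        return Fraction(0)
    if x >= 3:
        return Fraction(1)
    return c * x ** 2 / 2

# Normalization check, and P(1 < X < 2) = F(2) - F(1).
print(F(3))          # 1
print(F(2) - F(1))   # 1/3
```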
II. Moments
II.1. Mathematical expectation
The mathematical expectation (expected value, or mean value) of a discrete random variable X is
given by:

E[X] = Σi xi P(xi)
For a continuous random variable X with density function fX(x), the expectation of X is defined
as:

E[X] = ∫ x fX(x) dx   (integral over all x)
Example 3:
Let the random variable X be the result of rolling a fair die. Find the expected value of X.
Solution:
In this experiment, xi = 1, 2, …, 6 and P(xi) = 1/6 for each i. Then:

E[X] = (1 + 2 + 3 + 4 + 5 + 6)/6 = 21/6 = 3.5
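The same computation as a minimal Python sketch:

```python
from fractions import Fraction

# E[X] = sum of x_i * P(x_i); for a fair die each face has P(x_i) = 1/6.
expectation = sum(x * Fraction(1, 6) for x in range(1, 7))
print(expectation)   # 7/2, i.e. 3.5
```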
Example 4:
Consider the random variable X with the density function given by the following figure.
- Find E[X].
Solution:
The expected value of X is computed from E[X] = ∫ x fX(x) dx, which for this density gives
E[X] = 0.
Let X be a random variable. Then the function g(X) is also a random variable, and its expected
value, E[g(X)], is

E[g(X)] = Σi g(xi) P(xi)   (discrete case)
E[g(X)] = ∫ g(x) fX(x) dx   (continuous case, integral over all x)
Properties:
1. If c is any constant, then

E[c] = c

and

E[cX] = c E[X]
2. If the function g(X) = X^n, n = 0, 1, …, then

E[X^n] = Σi xi^n P(xi)   (discrete)   or   E[X^n] = ∫ x^n fX(x) dx   (continuous)

This is called the nth moment of the random variable X about the origin. For n = 2, we obtain the
second moment of X, E[X²], which is called the mean-square value.
II.2. Variance and standard deviation
The variance V(X) is defined as

V(X) = σx² = E[(X − E[X])²] = E[X²] − (E[X])²

where σx is called the standard deviation of X.
Example 5:
Calculate the variance of the random variable given in example 4.
Solution:
In example 4, we found that E[X] = 0; hence:

V(X) = E[X²] − (E[X])² = E[X²]
Properties:
1. The variance and the standard deviation are nonnegative quantities (V(X) ≥ 0, σx ≥ 0).
2. If c is any constant, then

V(c) = 0

and

V(cX) = c² V(X)
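A minimal sketch checking these two properties on a fair die (the example and c = 3 are illustrative choices, not from the text):

```python
from fractions import Fraction

# Check V(c) = 0 and V(cX) = c**2 * V(X) for a fair die and c = 3.
pmf = {x: Fraction(1, 6) for x in range(1, 7)}

def var(g):
    """Variance of g(X): E[(g(X) - E[g(X)])**2]."""
    mean = sum(g(x) * p for x, p in pmf.items())
    return sum((g(x) - mean) ** 2 * p for x, p in pmf.items())

c = 3
print(var(lambda x: c))       # 0      -> V(c) = 0
print(var(lambda x: x))       # 35/12  -> V(X)
print(var(lambda x: c * x))   # 105/4  -> 9 * 35/12 = c**2 * V(X)
```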
II.3. Moment generating function
The moment generating function (MGF), MX(t), of a random variable X is defined by

MX(t) = E[e^(tX)]

It is obtained for a discrete random variable X as

MX(t) = Σi e^(t xi) P(xi)

and for a continuous random variable X as

MX(t) = ∫ e^(tx) fX(x) dx   (integral over all x)

Recall that the Maclaurin series of the function e^x is

e^x = 1 + x + x²/2! + x³/3! + … = Σn x^n/n!

Hence, we can write the MGF as

MX(t) = 1 + t E[X] + t² E[X²]/2! + t³ E[X³]/3! + …
Taking the derivative of MX(t) with respect to t, we obtain

M′X(t) = E[X] + t E[X²] + t² E[X³]/2! + …

Setting t = 0, we obtain

M′X(0) = E[X]

Similarly, we obtain

M″X(0) = E[X²]

In the same manner, we obtain all the moments of X as

MX^(n)(0) = E[X^n]

where MX^(n)(t) denotes the nth derivative of MX(t) with respect to t.
To obtain any moment from the MGF, follow these steps:
1- Find the MGF MX(t).
2- Fix the order r of the moment.
3- Differentiate the MGF r times with respect to t.
4- Set t = 0.
Example 6:
Let X be a continuous random variable whose pdf is given by:

fX(x) = λ e^(−λx), x ≥ 0

where λ is a strictly positive number and RX = [0, +∞).
a) Find the mgf Mx(t).
b) Find the first and the second moment of X using the mgf.
Solution:
a) The MGF is obtained as follows:

MX(t) = ∫_0^{+∞} e^(tx) λ e^(−λx) dx = λ ∫_0^{+∞} e^(−(λ−t)x) dx = λ/(λ − t), for t < λ

b) The first moment is E[X] and the second moment is E[X²]; they can be calculated from the
MGF as follows:

M′X(t) = λ/(λ − t)²  so  E[X] = M′X(0) = 1/λ
M″X(t) = 2λ/(λ − t)³  so  E[X²] = M″X(0) = 2/λ²
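As a cross-check, assuming the exponential density fX(x) = λe^(−λx) for x ≥ 0, whose MGF is λ/(λ − t) for t < λ, the first two moments can be recovered by numerical differentiation at t = 0. This is an illustrative sketch; λ = 2 is an arbitrary choice:

```python
# MGF of the exponential density f_X(x) = lam*exp(-lam*x), x >= 0:
#   M_X(t) = lam / (lam - t), valid for t < lam.
# Central differences at t = 0 recover the moments:
#   M'(0)  = E[X]    = 1/lam
#   M''(0) = E[X**2] = 2/lam**2
lam = 2.0                      # arbitrary illustrative rate
M = lambda t: lam / (lam - t)

h = 1e-5
m1 = (M(h) - M(-h)) / (2 * h)            # first moment,  expect 0.5
m2 = (M(h) - 2 * M(0) + M(-h)) / h ** 2  # second moment, expect 0.5
print(round(m1, 4), round(m2, 4))        # 0.5 0.5
```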
III. Two-dimensional random variables
If X and Y are two continuous random variables, then we define the joint probability density
function (or simply the joint density function) of X and Y by

fXY(x, y) ≥ 0

and

∫∫ fXY(x, y) dx dy = 1   (integral over the whole plane)

The joint distribution of X and Y is

FXY(x, y) = P(X ≤ x, Y ≤ y) = ∫_{−∞}^{x} ∫_{−∞}^{y} fXY(u, v) dv du

The joint distribution FXY(x, y) has the following properties:

1. 0 ≤ FXY(x, y) ≤ 1
2. FXY(−∞, y) = FXY(x, −∞) = 0 and FXY(+∞, +∞) = 1
3. FXY(x, y) is nondecreasing in each of x and y
The joint density function can be obtained from the distribution function using this expression:

fXY(x, y) = ∂²FXY(x, y) / ∂x ∂y
III.1. Marginal functions
The marginal distribution function of X, FX(x) = P(X ≤ x), is obtained using the following
expression:

FX(x) = FXY(x, +∞)

Also, the marginal distribution function of Y, FY(y) = P(Y ≤ y), is obtained in the same manner:

FY(y) = FXY(+∞, y)

Then, the marginal density functions of X and Y are, respectively, given by

fX(x) = ∫ fXY(x, y) dy   (integral over all y)

and

fY(y) = ∫ fXY(x, y) dx   (integral over all x)
III.2. Conditional density functions
The conditional density functions of X given Y and of Y given X can be obtained, respectively,
using the following expressions:

fX|Y(x|y) = fXY(x, y) / fY(y), for fY(y) > 0

and

fY|X(y|x) = fXY(x, y) / fX(x), for fX(x) > 0
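A hedged illustration of the marginal and conditional definitions; the joint density fXY(x, y) = x + y on the unit square is hypothetical, chosen only because it is nonnegative and integrates to 1:

```python
# Hypothetical joint density for illustration (not from the text):
#   f_XY(x, y) = x + y on the unit square, zero elsewhere.
# Marginal:    f_X(x) = integral_0^1 (x + y) dy = x + 1/2
# Conditional: f_{X|Y}(x|y) = f_XY(x, y) / f_Y(y) = (x + y) / (y + 1/2)

def f_XY(x, y):
    return x + y if 0 <= x <= 1 and 0 <= y <= 1 else 0.0

def f_X(x, n=2000):
    """Marginal of X: integrate f_XY over y on [0, 1] (midpoint rule)."""
    h = 1.0 / n
    return sum(f_XY(x, (j + 0.5) * h) * h for j in range(n))

def f_X_given_Y(x, y):
    """Conditional density of X given Y = y; f_Y has the same form as f_X."""
    return f_XY(x, y) / f_X(y)

# The marginal should match x + 1/2:
print(round(f_X(0.3), 4))   # 0.8
# and the conditional density should integrate to 1 over x:
total = sum(f_X_given_Y((i + 0.5) / 2000, 0.4) / 2000 for i in range(2000))
print(round(total, 4))      # 1.0
```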
Example 7:
Solution:
Hence:
III.3. Covariance and correlation coefficient
If X and Y are two random variables with joint density function fXY(x, y), then

E[XY] = ∫∫ x y fXY(x, y) dx dy   (integral over the whole plane)

The covariance of two random variables X and Y is:

Cxy = E[(X − E[X])(Y − E[Y])] = E[XY] − E[X] E[Y]

and the correlation coefficient can be calculated from the covariance using this equation:

ρxy = Cxy / (σx σy)

If X and Y are independent random variables, then:

E[XY] = E[X] E[Y]

Hence, Cxy = ρxy = 0.
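The covariance and correlation formulas can be sketched on a hypothetical discrete pair, chosen only for illustration: X is a fair die and Y = X mod 2 (1 when X is odd, 0 when X is even):

```python
import math
from fractions import Fraction

# Joint pmf of the hypothetical pair (X, Y): X = fair die, Y = X % 2.
pmf = {(x, x % 2): Fraction(1, 6) for x in range(1, 7)}

def E(g):
    """E[g(X, Y)] = sum over (x, y) of g(x, y) * P(x, y)."""
    return sum(g(x, y) * p for (x, y), p in pmf.items())

EX, EY = E(lambda x, y: x), E(lambda x, y: y)
cov = E(lambda x, y: x * y) - EX * EY          # C_xy = E[XY] - E[X]E[Y]
var_X = E(lambda x, y: x * x) - EX ** 2
var_Y = E(lambda x, y: y * y) - EY ** 2
rho = float(cov) / math.sqrt(float(var_X * var_Y))  # rho = C_xy/(sigma_x sigma_y)

print(cov)             # -1/4
print(round(rho, 4))   # about -0.2928
```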
Example 8:
The probability density function of the two-dimensional random variable (X, Y), over the area
shown in the following figure, is given by
- Find ρxy.
Solution: