Random Variables
Basic Preliminaries
Random Experiment:: An experiment whose outcomes are not predictable.
Sample Point:: Each outcome of a random experiment.
Sample Space:: The totality or aggregate of sample points, denoted by $S$ or $\Omega$.
Event:: A subset of the sample space.
Impossible Event:: An event which does not contain any sample point.
Certain Event:: An event which contains all sample points.
Mutually Exclusive Events:: The happening of any one event excludes the happening of the other event.
Exhaustive Events:: The events $A_1, A_2, \ldots, A_n$ are said to be exhaustive if $\bigcup_{i=1}^{n} A_i = S$.
Independent Events:: The occurrence or non-occurrence of one event does not affect the occurrence or non-occurrence of the other.
Probability
If a random experiment results in $n$ mutually exclusive and equally likely outcomes, of which $m$ are favourable to event $A$, then the probability of event $A$, denoted $p(A)$, is defined as $p(A) = m/n$.
Standard Results ::
1. $0 \le p(A) \le 1$
2. $p(A^C) = p(\bar{A}) = 1 - p(A)$
3. $p(\emptyset) = 0$, $p(S) = 1$
4. If $A \subseteq B$ then $p(A) \le p(B)$
5. $p(A \cup B) = p(A) + p(B) - p(A \cap B)$
6. If $A$ and $B$ are mutually exclusive events then $p(A \cap B) = 0$
7. In particular, if $A$ and $B$ are mutually exclusive, $p(A \cup B) = p(A) + p(B)$
8. Conditional Probability: $p(A/B) = \dfrac{p(A \cap B)}{p(B)}$, $p(B/A) = \dfrac{p(A \cap B)}{p(A)}$.
Thus $p(A \cap B) = p(A)\,p(B/A) = p(B)\,p(A/B)$.
For $n$ events $A_1, A_2, \ldots, A_n$ the probability of the intersection event is
$p(A_1 \cap A_2 \cap A_3 \cap \cdots \cap A_n) = p(A_1)\,p(A_2/A_1)\,p(A_3/A_1 \cap A_2)\cdots p(A_n/A_1 \cap A_2 \cap \cdots \cap A_{n-1})$.
A sufficient condition for the above result to hold is $p(A_1 \cap A_2 \cap \cdots \cap A_{n-1}) > 0$.
9. If $A$ and $B$ are independent events then $p(A \cap B) = p(A)\,p(B)$.
10. If $A_1, A_2, \ldots, A_n$ is a set of mutually exclusive and exhaustive events then $\sum_{i=1}^{n} p(A_i) = p(S) = 1$.
11. Binomial Probability :: An experiment is performed $n$ times repeatedly. $A$ is an event, called a success, with probability $p$. If event $A$ occurs $r$ times among the $n$ trials then $P(r \text{ successes}) = {}^nC_r\, p^r q^{n-r}$, where $p$ is the probability of success and $q = 1 - p$ the probability of failure.
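These rules can be checked by direct enumeration on an equally likely sample space. The following Python sketch (an illustration added here, not part of the original notes; the helpers `prob` and `binom_pmf` are our own names) uses two fair dice:

```python
from fractions import Fraction
from math import comb

# Equally likely sample space: all outcomes of two fair dice.
space = [(i, j) for i in range(1, 7) for j in range(1, 7)]

def prob(event):
    # p(A) = m/n: favourable outcomes over total outcomes, kept exact.
    return Fraction(sum(1 for s in space if event(s)), len(space))

def A(s): return s[0] + s[1] == 7   # sum of the faces is 7
def B(s): return s[0] == 3          # first die shows 3

# Addition rule: p(A ∪ B) = p(A) + p(B) − p(A ∩ B)
lhs = prob(lambda s: A(s) or B(s))
rhs = prob(A) + prob(B) - prob(lambda s: A(s) and B(s))

# Conditional probability: p(A/B) = p(A ∩ B) / p(B)
p_a_given_b = prob(lambda s: A(s) and B(s)) / prob(B)

# Binomial probability: P(r successes in n trials)
def binom_pmf(n, r, p):
    return comb(n, r) * p**r * (1 - p)**(n - r)
```

Exact `Fraction` arithmetic avoids floating-point noise when comparing both sides of the addition rule.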
Random Variables
Let $\Omega$ be a sample space associated with a given random experiment. A function defined from $\Omega$ to a number set is known as a Random Variable. Thus a random variable is a function whose domain is the sample space of some random experiment.
The collection of values actually taken by a random variable is known as the Range of the random variable. Depending on the range, random variables are classified into the following two types.
Discrete Random Variables:: A random variable whose range is a discrete set, such as a subset of the natural numbers, is known as a discrete random variable.
e.g. Number of accidents on the Mumbai-Pune expressway, number of neutrons emitted by a radioactive isotope, number of students in a class.
Continuous Random Variables:: A random variable whose range is an interval is known as a continuous random variable.
e.g. Lifetime of an electrical or electronic component, height of students, amount of cosmic radiation, time between arrivals of two flights at a certain airport, time taken for surgery at a certain hospital.
Note : Random variables arising through counting processes are usually discrete in nature, while those arising through measuring processes are continuous in nature.
Examples of Random Variables
1. Consider the experiment of tossing two fair coins simultaneously. The sample space is $\Omega = \{HH, HT, TH, TT\}$. Let $X_1 =$ count of heads in a single toss. By this, every sample point gets assigned a real number, viz. $X_1(HH) = 2$, $X_1(HT) = X_1(TH) = 1$, $X_1(TT) = 0$.
Thus $X_1$ is a random variable defined on $\Omega$. Here Range of $X_1 = \{0, 1, 2\}$. Since the range of $X_1$ is a subset of a discrete set, $X_1$ is a discrete random variable.
Similarly, $X_2 =$ count of tails in a single toss is also a discrete random variable defined on $\Omega$.
2. Consider the experiment of tossing two fair dice simultaneously.
Sample space $\Omega = \{(i, j) : 1 \le i, j \le 6\}$.
i) $X_1 =$ sum of the numbers appearing on the two faces, i.e., $X_1((i, j)) = i + j$.
Range of $X_1 = \{2, 3, 4, 5, 6, 7, 8, 9, 10, 11, 12\}$.
ii) $X_2 =$ product of the numbers appearing on the two faces, i.e., $X_2((i, j)) = ij$.
Range of $X_2 \subseteq \{1, 2, 3, \ldots, 36\}$.
iii) $X_3 =$ minimum or maximum of the two numbers appearing on the two faces, i.e.,
$X_3((i, j)) = \min(i, j)$ or $\max(i, j)$. Range of $X_3 = \{1, 2, 3, 4, 5, 6\}$.
Note : The ranges of $X_1$, $X_2$ and $X_3$ are all subsets of the set of natural numbers (a discrete set).
Hence $X_1$, $X_2$ and $X_3$ are discrete random variables (drv).
iv) $X_4 =$ quotient of the two numbers appearing on the two faces, i.e., $X_4((i, j)) = i/j$. Range of $X_4 = \{\,i/j : 1 \le i, j \le 6\,\}$. Here $X_4$ takes integer as well as non-integer rational values.
The table containing the values of $X$ along with the probabilities given by the probability mass function is called the probability distribution of the random variable $X$.
Examples
1. Consider the experiment of tossing 3 coins simultaneously.
$\Omega = \{HHH, HHT, HTH, THH, HTT, TTH, THT, TTT\}$.
Let $X$ be the count of heads. Then the probability distribution of $X$ can be tabulated as follows.

X = x_i :      0     1     2     3
p(X = x_i) :  1/8   3/8   3/8   1/8
2. Consider the experiment of tossing 2 fair dice simultaneously.
$\Omega = \{(i, j) : 1 \le i, j \le 6\}$. Let $X$ be the sum of the numbers appearing on the faces.

X = x_i :      2     3     4     5     6     7     8     9     10    11    12
p(X = x_i) :  1/36  2/36  3/36  4/36  5/36  6/36  5/36  4/36  3/36  2/36  1/36
3. A point is chosen at random in a circle of radius $r$. Let $X$ be the distance of the point from the centre of the circle. Then the range of $X$ is the closed interval with end points $0$ and $r$, i.e., the range is $[0, r]$. Here $X$ is a continuous random variable.
Distribution Function or Cumulative Distribution Function (c. d. f.)
Let $X$ be a discrete random variable with range set $\{x_1, x_2, \ldots, x_n\}$ and probability distribution

X = x_i :     x_1   x_2   x_3   ...   x_n
p(X = x_i) :  p_1   p_2   p_3   ...   p_n

The distribution function of $X$, denoted $F(a)$, is the probability of the event $X \le a$, i.e., $F(a) = p(X \le a) = \sum_{x_i \le a} p_i$.
Note that: i) The Probability Mass Function (pmf) is a function of a discrete variable. There are two ways to represent a pmf graphically: the bar chart and the histogram. The sum of the lengths of the bars in the bar chart is 1, whereas the sum of the areas of the rectangles in the histogram is 1.
ii) $F(a)$ is a function of a real continuous variable $a$. The graph of this function is a step (staircase) function.
Graphs of pmf and cdf : (figures omitted)
Illustrative Examples
Q 1) Let X be the difference between the number of heads and the number of tails obtained when a fair coin is tossed 3 times. What are the possible values of X and its probability mass function? Also write the distribution function of X.
Soln. $\Omega = \{HHH, HHT, HTH, THH, HTT, TTH, THT, TTT\}$. X can take values from $\{-3, -1, 1, 3\}$, with pmf

X = x_i :     -3    -1     1     3
p(X = x_i) :  1/8   3/8   3/8   1/8
Q 2) A fair die is rolled twice. Find the possible values of the random variable X and its associated probability mass function, where X is the maximum of the values appearing in the 2 rolls.
Soln. $\Omega = \{(i, j) : 1 \le i, j \le 6\}$. X can take values from 1 to 6, i.e.,
Range of $X = \{1, 2, 3, 4, 5, 6\}$. The pmf is

X = x_i :      1     2     3     4     5     6
p(X = x_i) :  1/36  3/36  5/36  7/36  9/36  11/36
Q 3) A random variable X takes values $-3, 1, 2, 5$ with respective probabilities $\dfrac{2k-3}{10}$, $\dfrac{k+1}{10}$, $\dfrac{k-1}{10}$ and $\dfrac{k-2}{10}$. Determine the distribution of X.
Soln. Since the assignments are probabilities, equating the total to one:
$\dfrac{2k-3}{10} + \dfrac{k+1}{10} + \dfrac{k-1}{10} + \dfrac{k-2}{10} = 1 \Rightarrow 5k - 5 = 10 \Rightarrow k = 3$. Hence the distribution of X is

X :         -3     1     2     5
p(X = x) :  3/10  4/10  2/10  1/10
Q 4) A random variable X has the probability mass function (pmf) shown in the following table. Find the value of the unknown k. Hence write the pmf and cdf of X. Draw graphs of the pmf and cdf. Also find i) $p(1 \le X < 3)$ ii) $p(1 < X \le 3)$ iii) $p(X \le 1)$ iv) $p(X \ge 5)$.

X :         1     2      3      4       5
p(X = x) :  k/36  3k/36  4k/36  10k/36  18k/36

Soln. Since the probabilities must add to one, $\dfrac{k + 3k + 4k + 10k + 18k}{36} = 1 \Rightarrow 36k = 36 \Rightarrow k = 1$.
i) $p(1 \le X < 3) = p(X = 1) + p(X = 2) = \dfrac{1}{36} + \dfrac{3}{36} = \dfrac{4}{36} = \dfrac{1}{9}$.
ii) $p(1 < X \le 3) = p(X = 2) + p(X = 3) = \dfrac{3}{36} + \dfrac{4}{36} = \dfrac{7}{36}$.
iii) $p(X \le 1) = \dfrac{1}{36}$. iv) $p(X \ge 5) = \dfrac{18}{36} = \dfrac{1}{2}$.
1. Mathematical Expectation / Theoretical Mean (analogous to Centre of Gravity)
The theoretical mean or expectation of X, denoted $E(X)$ or $\mu$, is defined as
$E(X) = \sum_{i=1}^{n} x_i p_i$.
The expected value of X provides a central point of the distribution.
Note: The expected value of a random variable may not be a value actually taken by the variable.
2. Variance
The variance of X is denoted $\mathrm{Var}(X)$.
$\mathrm{Var}(X) = \sum_{i=1}^{n} (x_i - E(X))^2\, p_i$. This can be simplified as
$\mathrm{Var}(X) = E(X^2) - (E(X))^2$, where $E(X^2) = \sum_{i=1}^{n} x_i^2\, p_i$.
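A minimal Python sketch of these two formulas (illustrative; the `expectation` and `variance` helpers are our own names, not from the notes), applied to the number of heads in three fair coin tosses:

```python
from fractions import Fraction

def expectation(dist):
    # E(X) = Σ x_i p_i
    return sum(x * p for x, p in dist.items())

def variance(dist):
    # Var(X) = E(X²) − (E(X))²
    ex = expectation(dist)
    ex2 = sum(x * x * p for x, p in dist.items())
    return ex2 - ex**2

# pmf of the number of heads in 3 fair coin tosses (table above).
dist = {0: Fraction(1, 8), 1: Fraction(3, 8), 2: Fraction(3, 8), 3: Fraction(1, 8)}
```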
The pmf of X is

X = x_i :       0     1     2     3     4     5      6
p(X = x_i) :  1/49  3/49  5/49  7/49  9/49  11/49  13/49

and the c.d.f. $F(a) = p(X \le a)$ is

X = x_i :   0     1     2      3      4      5     6
F(a) :    1/49  4/49  9/49  16/49  25/49  36/49    1
Q 2) Determine k such that the following functions are p.m.f.s.
Soln. 1. $P(x) = kx$, $x = 1, 2, 3, \ldots, 10$
$k(1 + 2 + 3 + 4 + 5 + 6 + 7 + 8 + 9 + 10) = 1 \Rightarrow k = \dfrac{1}{55}$
2. $P(x) = k\,\dfrac{2^x}{x!}$, $x = 0, 1, 2, 3$
$k\left(1 + 2 + 2 + \dfrac{4}{3}\right) = 1 \Rightarrow k = \dfrac{3}{19}$
3. $P(x) = k(2x^2 + 3x + 1)$, $x = 0, 1, 2, 3$
$k(1 + 6 + 15 + 28) = 1 \Rightarrow k = \dfrac{1}{50}$
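Each normalization can be verified in a few lines of Python (an illustrative check, not part of the notes):

```python
from fractions import Fraction
from math import factorial

# 1. k Σ x  for x = 1..10  must equal 1
k1 = Fraction(1, sum(range(1, 11)))

# 2. k Σ 2^x / x!  for x = 0..3  must equal 1
k2 = 1 / sum(Fraction(2**x, factorial(x)) for x in range(4))

# 3. k Σ (2x² + 3x + 1)  for x = 0..3  must equal 1
k3 = Fraction(1, sum(2*x*x + 3*x + 1 for x in range(4)))
```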
Q 3) Verify whether the assignment $p(X = n) = 2^{-n}$, $n = 1, 2, 3, \ldots$ is a probability mass function for the random variable X.
Soln. To be a probability mass function,
i) $p(X = n) \ge 0 \;\forall\, n$
ii) $\sum_{n=1}^{\infty} p(X = n) = 1$
Since $p(X = n) = \dfrac{1}{2^n}$ is an exponential function taking only positive values, condition i) holds.
Further, $\sum_{n=1}^{\infty} p(X = n) = \sum_{n=1}^{\infty} \left(\dfrac{1}{2}\right)^n = \sum_{n=0}^{\infty} \left(\dfrac{1}{2}\right)^n - 1 = \dfrac{1}{1 - (1/2)} - 1 = 2 - 1 = 1$.
Hence condition ii) also holds. Therefore the assignment is a probability mass function.
Q 4) A box contains 8 items of which 2 are defective. A person draws 3 items from the box. Determine the expected number of defective items he has drawn.
Soln. Let X be the number of defective items drawn. Since the draws are made without replacement, the pmf of X is

X :         0     1      2
p(X = x) :  5/14  15/28  3/28

(e.g. $p(X = 0) = \dfrac{{}^6C_3}{{}^8C_3} = \dfrac{20}{56} = \dfrac{5}{14}$). Hence $E(X) = 0 \cdot \dfrac{5}{14} + 1 \cdot \dfrac{15}{28} + 2 \cdot \dfrac{3}{28} = \dfrac{21}{28} = \dfrac{3}{4}$.
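Drawing without replacement makes X hypergeometric; a short Python cross-check of the expectation (our own sketch, not from the notes):

```python
from fractions import Fraction
from math import comb

# 8 items, 2 defective, 3 drawn without replacement.
N, D, n = 8, 2, 3

def p_defective(x):
    # Hypergeometric pmf: C(D, x) · C(N−D, n−x) / C(N, n)
    return Fraction(comb(D, x) * comb(N - D, n - x), comb(N, n))

expected = sum(x * p_defective(x) for x in range(min(D, n) + 1))
```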
Axioms
i) $0 \le p(a \le X \le b) \le 1$, or, $0 \le \int_a^b f(x)\,dx \le 1$
ii) $p(a \le X \le b) = \int_a^b f(x)\,dx$
Properties of c.d.f.
1) $F(a)$ is monotonically increasing, i.e., $a < b \Rightarrow F(a) \le F(b)$
2) $\lim_{a \to \infty} F(a) = 1$
3) $\lim_{a \to -\infty} F(a) = 0$
4) $p(a < X \le b) = F(b) - F(a)$
Note that : For a continuous random variable X,
$p(a \le X \le b) = p(a < X \le b) = p(a \le X < b) = p(a < X < b)$
5) $p(X > a) = 1 - p(X \le a) = 1 - F(a)$
Illustrative Examples
ii) $p(1 \le X \le 2) = F(2) - F(1) = 1 - \dfrac{1}{2} = \dfrac{1}{2}$.
Q 2) The time in years X required to complete a software project has the p.d.f.
$f(x) = \begin{cases} kx(1-x), & 0 < x < 1 \\ 0, & \text{otherwise} \end{cases}$
Compute the probability that the project will be completed in less than four months.
Soln. $\int_0^1 f(x)\,dx = 1 \Rightarrow k\left(\dfrac{1}{2} - \dfrac{1}{3}\right) = 1 \Rightarrow k = 6$.
Four months is $\dfrac{1}{3}$ year. The probability that the project will be completed in less than four months is
$p\left(X < \dfrac{1}{3}\right) = \int_0^{1/3} 6x(1-x)\,dx = \dfrac{7}{27} \approx 0.259$.
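A quick numeric cross-check in Python (illustrative; `integrate` is a simple midpoint-rule helper we define here, not a library routine):

```python
# Verify k = 6 normalizes f, and that p(X < 1/3) ≈ 7/27.
def integrate(f, a, b, steps=200_000):
    # Midpoint-rule numerical integration of f over [a, b].
    h = (b - a) / steps
    return sum(f(a + (i + 0.5) * h) for i in range(steps)) * h

f = lambda x: 6 * x * (1 - x)

total = integrate(f, 0.0, 1.0)            # should be 1: f is a valid pdf
p_four_months = integrate(f, 0.0, 1 / 3)  # should be ≈ 7/27 ≈ 0.259
```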
Q 4) X is a continuous random variable with probability density function
$f(x) = \begin{cases} kx, & 0 \le x < 2 \\ 2k, & 2 \le x < 4 \\ 6k - kx, & 4 \le x \le 6 \end{cases}$
Find k and the mean.
Soln. $\int_0^6 f(x)\,dx = 1 \Rightarrow \int_0^2 kx\,dx + \int_2^4 2k\,dx + \int_4^6 (6k - kx)\,dx = 1$
Evaluating the integrals and simplifying, we have $2k + 4k + 2k = 8k = 1 \Rightarrow k = \dfrac{1}{8}$.
$E(X) = \int_0^6 x f(x)\,dx = \int_0^2 kx^2\,dx + \int_2^4 2kx\,dx + \int_4^6 (6kx - kx^2)\,dx$
$= k\left[\dfrac{x^3}{3}\right]_0^2 + 2k\left[\dfrac{x^2}{2}\right]_2^4 + 6k\left[\dfrac{x^2}{2}\right]_4^6 - k\left[\dfrac{x^3}{3}\right]_4^6$
Simplifying, $E(X) = 3$.
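The value of k and the mean can be cross-checked numerically (an illustrative Python sketch, not from the notes):

```python
# Piecewise pdf from Q4 with k = 1/8 (the value found above).
def integrate(f, a, b, steps=200_000):
    # Midpoint-rule numerical integration of f over [a, b].
    h = (b - a) / steps
    return sum(f(a + (i + 0.5) * h) for i in range(steps)) * h

k = 1 / 8

def f(x):
    if 0 <= x < 2:
        return k * x
    if 2 <= x < 4:
        return 2 * k
    if 4 <= x <= 6:
        return 6 * k - k * x
    return 0.0

total = integrate(f, 0, 6)                  # should be 1
mean = integrate(lambda x: x * f(x), 0, 6)  # should be 3
```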
Q 5) i) Verify that $f(x) = 6x(1 - x)$, $0 \le x \le 1$, is a probability density function for the diameter of a cable.
ii) Find the condition on b such that $p(X < b) = p(X > b)$.
Soln. i) At $x = 0$, $f(x) = 0$. For $0 < x < 1$, $0 < 1 - x < 1$ and $0 < 6x < 6$. Therefore $0 < 6x(1-x) < 6$. Hence $f(x)$ is non-negative for $0 \le x \le 1$.
Further, $\int_0^1 f(x)\,dx = \int_0^1 6x(1-x)\,dx = 6\left(\dfrac{1}{2} - \dfrac{1}{3}\right) = 1$.
This proves $f(x)$ is a pdf on $0 \le x \le 1$.
ii) $p(X < b) = p(X > b) \Rightarrow p(X < b) = 1 - p(X \le b)$.
Since the random variable is continuous, $p(X \le b) = p(X < b)$. Therefore
$2\,p(X \le b) = 1 \Rightarrow p(X \le b) = \dfrac{1}{2} \Rightarrow \int_0^b f(x)\,dx = \dfrac{1}{2}$.
Solving the above equation we have $4b^3 - 6b^2 + 1 = 0$.
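The cubic $4b^3 - 6b^2 + 1 = 0$ can be solved on $(0, 1)$ by bisection; since $f$ is symmetric about $x = 1/2$, the root (the median of the distribution) is exactly $1/2$. An illustrative Python sketch:

```python
# Median b of the cable-diameter pdf f(x) = 6x(1−x):
# solve 4b³ − 6b² + 1 = 0 on (0, 1) by bisection.
def bisect(g, lo, hi, tol=1e-12):
    while hi - lo > tol:
        mid = (lo + hi) / 2
        if g(lo) * g(mid) <= 0:
            hi = mid
        else:
            lo = mid
    return (lo + hi) / 2

g = lambda b: 4 * b**3 - 6 * b**2 + 1
b = bisect(g, 0.0, 1.0)   # f is symmetric about 1/2, so b = 1/2 exactly
```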
Q 6) A continuous random variable X has pdf $f(x) = 3x^2$, $0 \le x \le 1$. Find a such that $p(X > a) = 0.05$.
Soln. $p(X > a) = 0.05 \Rightarrow \int_a^1 f(x)\,dx = 0.05$, i.e.,
$\int_a^1 3x^2\,dx = 1 - a^3 = 0.05 \Rightarrow a^3 = 0.95 \Rightarrow a = (0.95)^{1/3} \approx 0.983$.
Variance
$\mathrm{Var}(X) = E(X^2) - (E(X))^2 = \sum_i i^2 p(i) - p^2 = p - p^2 = p(1 - p)$
Applications
Situations where the Bernoulli distribution is commonly used:
1) Items produced by a machine: defective / non-defective.
2) Students appearing for an examination: pass / fail.
Binomial Distribution
A binomial random variable counts the number of successes when n Bernoulli trials are performed. Suppose n independent Bernoulli trials are performed, each resulting in a success with probability p. If X represents the number of successes that occur in the n trials, then X is said to be a binomial random variable, and its probability distribution is known as the binomial distribution with parameters $(n, p)$.
Probability Mass Function (p.m.f.)
The probability mass function of a binomial random variable having parameters $(n, p)$ is given by $P(X = i) = {}^nC_i\, p^i (1-p)^{n-i}$, $i = 0, 1, \ldots, n$.
Note that $\sum_{i=0}^{n} {}^nC_i\, p^i (1-p)^{n-i} = (p + 1 - p)^n = 1$.
Expectation
$E(X) = \sum_{i=0}^{n} i\,p(i) = \sum_{i=1}^{n} i\,{}^nC_i\, p^i (1-p)^{n-i}$
Using $i\,{}^nC_i = n\,{}^{n-1}C_{i-1}$ and putting $r = i - 1$,
$E(X) = np \sum_{r=0}^{n-1} {}^{n-1}C_r\, p^r (1-p)^{n-1-r} = np\,(p + 1 - p)^{n-1} = np$.
Variance
$E(X^2) = \sum_{i=0}^{n} i^2 p(i) = \sum_{i=1}^{n} \big(i(i-1) + i\big)\,{}^nC_i\, p^i (1-p)^{n-i} = \sum_{i=2}^{n} i(i-1)\,{}^nC_i\, p^i (1-p)^{n-i} + E(X)$
Now, $i(i-1)\,{}^nC_i = \dfrac{i(i-1)\,n!}{i!\,(n-i)!} = \dfrac{n!}{(i-2)!\,(n-i)!} = n(n-1)\,\dfrac{(n-2)!}{(i-2)!\,((n-2)-(i-2))!} = n(n-1)\,{}^{n-2}C_{i-2}$
Therefore, putting $r = i - 2$,
$\sum_{i=2}^{n} i(i-1)\,{}^nC_i\, p^i (1-p)^{n-i} = n(n-1)\,p^2 \sum_{r=0}^{n-2} {}^{n-2}C_r\, p^r (1-p)^{(n-2)-r} = n(n-1)\,p^2\,(p + 1 - p)^{n-2} = n(n-1)\,p^2$
Hence $E(X^2) = n(n-1)p^2 + np$ and
$\mathrm{Var}(X) = E(X^2) - (E(X))^2 = n(n-1)p^2 + np - n^2p^2 = np - np^2 = np(1-p) = npq$.
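The results $E(X) = np$ and $\mathrm{Var}(X) = npq$ can be confirmed by summing the pmf directly (illustrative Python with exact arithmetic, not part of the notes):

```python
from fractions import Fraction
from math import comb

def binom_dist(n, p):
    # Full binomial pmf as a dict {i: P(X = i)}.
    return {i: comb(n, i) * p**i * (1 - p)**(n - i) for i in range(n + 1)}

n, p = 10, Fraction(1, 4)
dist = binom_dist(n, p)

mean = sum(i * q for i, q in dist.items())                 # np = 5/2
var = sum(i * i * q for i, q in dist.items()) - mean**2    # npq = 15/8
```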
Illustrative Examples
Q 1) The probability that a bomb dropped from a plane will strike the target is the probability
that (i) exactly two will strike the target (ii) at least two will strike the target.
Soln. Random Variable X : Count of bombs dropped from a plane striking the target
1 1
p & n 6 X B 6, .
5 5
(i) Exactly two will strike the target.
4444
2 4
1 4
P( X 2) C2 15 (0.2)6 153
6
0.24576
5 5 555555
(ii) At least two will strike the target.
P( X 2) 1 P( X 2)
Random Variable
1 P( X 0) P( X 1)
0 6 5
1 4 1 4
1 6
5 5 5 5
4 4 4 4 4 4 6 45
1 6
5 5 5 5 5 5 5
45 45
1 6 [4 6] 1 102 1 0.65536 0.34464
5 5 55
Q 2) The probability that an entering student will graduate is 0.4. Determine the probability that out of 5 students (a) none (b) one (c) at least one will graduate.
Soln. Random Variable X : count of graduating students. $p = 0.4$, $n = 5 \Rightarrow X \sim B(5, 0.4)$.
(a) $P(X = 0) = (0.4)^0 (0.6)^5 = 0.07776$
(b) $P(X = 1) = 5\,(0.4)(0.6)^4 = 0.2592$
(c) $P(X \ge 1) = 1 - P(X = 0) = 0.92224$
Q 3) Point out the fallacy of the statement : The mean of a binomial distribution is 3 and the variance is 5.
Soln. $np = 3$ & $npq = 5 \Rightarrow q = \dfrac{5}{3} > 1$, which is impossible since q is a probability.
Q 4) The mean and variance of a binomial distribution are 6 and 2 respectively. Find $P(X \ge 3)$.
Soln. $np = 6$ & $npq = 2 \Rightarrow q = \dfrac{1}{3}$,
$p = 1 - \dfrac{1}{3} = \dfrac{2}{3}$ and $n = 9$.
$P(X \ge 3) = 1 - P(X < 3) = 1 - \big[P(X = 0) + P(X = 1) + P(X = 2)\big]$
$= 1 - \left[\left(\dfrac{1}{3}\right)^9 + 9 \cdot \dfrac{2}{3}\left(\dfrac{1}{3}\right)^8 + {}^9C_2\left(\dfrac{2}{3}\right)^2\left(\dfrac{1}{3}\right)^7\right] = 1 - \dfrac{1 + 18 + 144}{3^9} = 1 - \dfrac{163}{19683} \approx 0.9917$
Similarly, $P(X \ge 1) = 1 - P(X = 0) = 1 - \left(\dfrac{1}{3}\right)^9 = 0.999949$.
Q 5) If the probability that a new born child is a male is 0.6, find the probability that in a family of 5 children there are exactly 3 boys.
Soln. X : count of male children. Given $p = 0.6$ & $n = 5 \Rightarrow X \sim B(5, 0.6)$.
$P(X = 3) = {}^5C_3 (0.6)^3 (1 - 0.6)^2 = \dfrac{5!}{3!\,2!}(0.6)^3 (0.4)^2 = 10\,(0.6)^3 (0.4)^2 = 0.3456$.
Q 6) Out of 800 families with 5 children each, how many would you expect to have
i) 3 boys?
ii) either 2 or 3 boys?
Assume equal probabilities for boys and girls.
Soln. Let X = number of boys. Given $p = \dfrac{1}{2}$, $n = 5 \Rightarrow X \sim B\left(5, \dfrac{1}{2}\right)$.
i) $p(X = 3) = {}^5C_3 \left(\dfrac{1}{2}\right)^3 \left(\dfrac{1}{2}\right)^2 = 0.3125$
Total number of families with 3 boys $= 0.3125 \times 800 = 250$
ii) $p(X = 2 \text{ or } X = 3) = {}^5C_2 \left(\dfrac{1}{2}\right)^2 \left(\dfrac{1}{2}\right)^3 + {}^5C_3 \left(\dfrac{1}{2}\right)^3 \left(\dfrac{1}{2}\right)^2 = 0.625$
Total number of families with either 2 or 3 boys $= 0.625 \times 800 = 500$.
Poisson Distribution
A random variable X is said to follow the Poisson distribution if its probability mass function is given by $p(X = r) = \dfrac{e^{-\lambda}\,\lambda^r}{r!}$, $r = 0, 1, 2, 3, \ldots$. Here $\lambda$ is known as the parameter of the distribution.
Expectation and Variance
If $X \sim P(\lambda)$ then $E(X) = \lambda$, $\mathrm{Var}(X) = \lambda$ and $sd(X) = \sqrt{\lambda}$.
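Both moment formulas can be checked by truncating the infinite sum (illustrative Python; the cut-off at r = 100 is our assumption, safe for small λ since the tail terms are negligible):

```python
from math import exp, factorial

lam = 4.0

def poisson_pmf(r, lam):
    # p(X = r) = e^{−λ} λ^r / r!
    return exp(-lam) * lam**r / factorial(r)

# Truncate the infinite range at r = 100 (tail is negligible for λ = 4).
mean = sum(r * poisson_pmf(r, lam) for r in range(101))
var = sum(r * r * poisson_pmf(r, lam) for r in range(101)) - mean**2
```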
Normal Distribution
A continuous random variable X is said to be normally distributed, $X \sim N(\mu, \sigma^2)$, if its pdf is
$f(x) = \dfrac{1}{\sigma\sqrt{2\pi}}\,e^{-(x-\mu)^2/(2\sigma^2)}$
where $\mu =$ mean and $\sigma =$ standard deviation.
Note that if $Z = \dfrac{x - \mu}{\sigma}$ then $E(Z) = 0$ & $\mathrm{Var}(Z) = 1$.
In this case, $Z \sim N(0, 1)$ is known as the standard normal variate, whose pdf is
$f(z) = \dfrac{1}{\sqrt{2\pi}}\,e^{-z^2/2}$
Properties of normal distribution
1. The normal curve is bell-shaped and symmetric about its mean.
2. The probability of x lying between $x_1$ & $x_2$ is given by the area under the normal curve from $x_1$ to $x_2$.
In terms of the standard normal variate, $p(0 \le Z \le z_1) = \dfrac{1}{\sqrt{2\pi}} \int_0^{z_1} e^{-z^2/2}\,dz$.
This integral is called the probability integral or the error function.
Note that:
i) The area under the normal curve between the ordinates $x = \mu - \sigma$ & $x = \mu + \sigma$ is 0.6826, i.e. 68.26%. Approximately $\dfrac{2}{3}$ of the values lie within these limits.
ii) The area under the normal curve between $x = \mu - 2\sigma$ & $x = \mu + 2\sigma$ is 0.9544, nearly 95.5%, which implies that about $4\tfrac{1}{2}\%$ of values lie outside these limits.
iii) 99.73% of values lie inside $x = \mu - 3\sigma$ to $x = \mu + 3\sigma$.
iv) 95% of values lie inside $x = \mu - 1.96\sigma$ to $x = \mu + 1.96\sigma$, i.e. only 5% lie outside these limits.
v) 99% of values lie inside $x = \mu - 2.58\sigma$ to $x = \mu + 2.58\sigma$, i.e. only 1% of values lie outside these limits.
# Here onwards we will denote the pdf of the standard normal variable z by the symbol $f(z)$.
# Areas under the standard normal curve can be obtained from statistical tables.
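These areas follow from the error function: $p(\mu - k\sigma < X < \mu + k\sigma) = \mathrm{erf}(k/\sqrt{2})$. An illustrative check using only the Python standard library:

```python
from math import erf, sqrt

def area_within(k):
    # p(μ − kσ < X < μ + kσ) for a normal variate equals erf(k/√2).
    return erf(k / sqrt(2))
```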
Illustrative Examples
Q 1) A certain number of articles manufactured in one batch were classified into 3 categories according to a particular characteristic: less than 50, between 50 and 60, and over 60. If this characteristic is known to be normally distributed, determine the mean and s.d. for this batch if 60%, 35% and 5% were found in these categories.
Soln. Let $Z = \dfrac{X - \mu}{\sigma}$.
$P(X < 50) = 0.60$, so with $z_1 = \dfrac{50 - \mu}{\sigma}$ we need $\int_0^{z_1} f(z)\,dz = 0.10$; from tables, $z_1 \approx 0.25$.
$P(X > 60) = 0.05$, so with $z_2 = \dfrac{60 - \mu}{\sigma}$ we need $\int_0^{z_2} f(z)\,dz = 0.45$; from tables, $z_2 \approx 1.645$.
Thus $50 - \mu = 0.25\,\sigma$ and $60 - \mu = 1.645\,\sigma$. Subtracting, $10 = 1.395\,\sigma \Rightarrow \sigma \approx 7.2$ and $\mu \approx 48.2$.
Q 2) In a normal distribution, 31% of items are under 45 and 8% are over 64. Find the mean and s.d.
Soln. Let $Z = \dfrac{X - \mu}{\sigma}$, $z_1 = \dfrac{45 - \mu}{\sigma}$ and $z_2 = \dfrac{64 - \mu}{\sigma}$.
$P(Z < z_1) = 0.31 < 0.5$, so $z_1$ is negative and $\int_0^{|z_1|} f(z)\,dz = 0.5 - 0.31 = 0.19$.
From tables, $z_1 \approx -0.5$.
$P(Z > z_2) = 0.08$, so $\int_0^{z_2} f(z)\,dz = 0.5 - 0.08 = 0.42$.
From tables, $z_2 \approx 1.4$.
Thus $\dfrac{45 - \mu}{\sigma} = -0.5$ & $\dfrac{64 - \mu}{\sigma} = 1.4$.
Solving, $\mu = 50$ & $\sigma = 10$.
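The same two-equation method can be scripted; the sketch below (our own, using `math.erf` in place of table look-ups) reproduces μ = 50 and σ = 10:

```python
from math import erf, sqrt

def phi(z):
    # Standard normal cdf via the error function.
    return 0.5 * (1 + erf(z / sqrt(2)))

# From tables: P(X < 45) = 0.31 ⇒ z₁ ≈ −0.5; P(X > 64) = 0.08 ⇒ z₂ ≈ 1.4.
# Solve 45 = μ + z₁σ and 64 = μ + z₂σ:
z1, z2 = -0.5, 1.4
sigma = (64 - 45) / (z2 - z1)   # 10
mu = 45 - z1 * sigma            # 50
```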
Q 3) A manufacturer of air-mail envelopes knows from experience that the weight of an envelope is normally distributed with mean 1.95 gm and s.d. 0.05 gm. About how many envelopes weighing (i) 2 gm or more, (ii) 2.05 gm or more can be expected in a given packet of 100 envelopes?
Soln. $\mu = 1.95$ & $\sigma = 0.05$.
(i) $P(X \ge 2) = P\left(Z \ge \dfrac{2 - 1.95}{0.05}\right) = P(Z \ge 1) = 0.5 - \int_0^1 f(z)\,dz = 0.5 - 0.3413 = 0.1587$,
so about 16 envelopes per 100 weigh 2 gm or more.
(ii) $P(X \ge 2.05) = P(Z \ge 2) = 0.5 - 0.4772 = 0.0228$, so about 2 envelopes per 100 weigh 2.05 gm or more.
The joint distribution of a bivariate random variable $(X, Y)$ is displayed in a table of probabilities $P(X = x_i, Y = y_j)$; the row and column totals give the marginal probabilities, and the grand total is $P(Y = y_1) + P(Y = y_2) + \cdots + P(Y = y_n) = 1$.
Marginal Distributions
Let $\{((x_i, y_j), P(x_i, y_j)),\ i = 1, \ldots, m,\ j = 1, \ldots, n\}$ represent the joint probability distribution of a bivariate random variable $(X, Y)$.
The marginal distribution of X is obtained as $P[X = x_i] = \sum_{j=1}^{n} P[X = x_i, Y = y_j]$.
Similarly, the marginal distribution of Y is $P[Y = y_j] = \sum_{i=1}^{m} P[X = x_i, Y = y_j]$, and $\sum_{i=1}^{m} \sum_{j=1}^{n} P(x_i, y_j) = 1$.
Correlation Coefficient :
It is a measure of the correlation between the two random variates. It is given by the formula
$\rho = \dfrac{\mathrm{Cov}(X, Y)}{sd(X)\,sd(Y)} = \dfrac{\mathrm{Cov}(X, Y)}{\sigma_X\,\sigma_Y}$, where $\mathrm{Cov}(X, Y) = E(XY) - E(X)E(Y)$.
Note that : $\rho$ always lies between $-1$ and $1$. Depending on the value of $\rho$, the following conclusions can be drawn about the variables:
$\rho = 0 \Rightarrow$ variates are uncorrelated.
$0 < \rho < 1 \Rightarrow$ variates are positively correlated.
$-1 < \rho < 0 \Rightarrow$ variates are negatively correlated.
$\rho = 1 \Rightarrow$ there is perfect positive correlation.
$\rho = -1 \Rightarrow$ there is perfect negative correlation.
Independent Random Variables:
Two random variables are said to be independent if the product of the marginal probabilities equals the corresponding joint probability:
$P_{ij} = P(X = x_i)\,P(Y = y_j)\ \forall\, i, j$.
Note that: Two independent random variables are always uncorrelated, but the converse is not true. This means Uncorrelated $\nRightarrow$ Independent.
Illustrative Example
Q 1) Consider the following bivariate distribution.

          Y :    0     1    P(X)
X = -1 :         0    1/3   1/3
X =  0 :        1/3    0    1/3
X =  1 :         0    1/3   1/3
P(Y) :          1/3   2/3   Total 1

Soln. The marginal distributions are

X :      -1    0    1
P(X) :  1/3  1/3  1/3

Y :       0    1
P(Y) :  1/3  2/3

$E(X) = (-1)\dfrac{1}{3} + 0 \cdot \dfrac{1}{3} + 1 \cdot \dfrac{1}{3} = 0$ and $E(Y) = 0 \cdot \dfrac{1}{3} + 1 \cdot \dfrac{2}{3} = \dfrac{2}{3}$.
$E(XY) = (-1)(1)\dfrac{1}{3} + (0)(0)\dfrac{1}{3} + (1)(1)\dfrac{1}{3} = 0$.
$\mathrm{Cov}(X, Y) = E(XY) - E(X)E(Y) = 0$, so $\rho(X, Y) = 0$.
Therefore the variates are uncorrelated.
Now $P(X = -1, Y = 0) = 0 \ne P(X = -1)\,P(Y = 0) = \dfrac{1}{3} \cdot \dfrac{1}{3}$,
so X and Y are not independent.
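The whole computation (marginals, covariance, independence check) takes a few lines of Python (an illustrative sketch; the dictionary layout is our own choice):

```python
from fractions import Fraction
from itertools import product

# Joint distribution of (X, Y) from the table above (zero cells omitted).
joint = {(-1, 1): Fraction(1, 3), (0, 0): Fraction(1, 3), (1, 1): Fraction(1, 3)}

xs, ys = (-1, 0, 1), (0, 1)
px = {x: sum(joint.get((x, y), 0) for y in ys) for x in xs}   # marginal of X
py = {y: sum(joint.get((x, y), 0) for x in xs) for y in ys}   # marginal of Y

ex = sum(x * p for x, p in px.items())
ey = sum(y * p for y, p in py.items())
exy = sum(x * y * p for (x, y), p in joint.items())
cov = exy - ex * ey   # 0 ⇒ uncorrelated

# Independent only if every joint cell factors into its marginals.
independent = all(joint.get((x, y), 0) == px[x] * py[y] for x, y in product(xs, ys))
```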
Q 2) Let X and Y be independent random variables with the following distributions :

X :            1    2        Y :            5    10   15
p(X = x_i) :  0.6  0.4       p(Y = y_j) :  0.2  0.5  0.3

Find the joint distribution of X and Y.
Soln. Since X and Y are independent, the joint probabilities are the products of the marginal probabilities. Hence the joint distribution of X and Y is

       Y :    5     10    15    Marginal probabilities of X
X = 1 :      0.12  0.30  0.18   0.6
X = 2 :      0.08  0.20  0.12   0.4
Marginal probabilities of Y :
             0.2   0.5   0.3    Total 1
Q 3) A fair coin is tossed three times. Let X equal 0 or 1 according as a head or a tail occurs on the first toss. Let Y equal the total number of heads that occur in the three tosses.
a) Write the marginal distributions of X and Y.
b) Write the joint distribution of X and Y.
c) Find $\mathrm{Cov}(X, Y)$.
d) Find $\rho(X, Y)$. Comment on the value of $\rho(X, Y)$.
e) Determine whether X and Y are independent.
Soln. The sample space of the experiment is
$\Omega = \{HHH, HHT, HTH, THH, HTT, THT, TTH, TTT\}$
a) For r. v. X we have $X(HHH) = X(HHT) = X(HTH) = X(HTT) = 0$
and $X(THH) = X(THT) = X(TTH) = X(TTT) = 1$.
For r. v. Y we have $Y(HHH) = 3$, $Y(HHT) = Y(HTH) = Y(THH) = 2$,
$Y(THT) = Y(TTH) = Y(HTT) = 1$, $Y(TTT) = 0$.
Marginal Distributions
$p(X = 0) = p(\{HHH, HHT, HTH, HTT\}) = \dfrac{1}{2}$ and $p(X = 1) = p(\{THH, THT, TTH, TTT\}) = \dfrac{1}{2}$.
Hence the marginal distributions of X and Y are

X :            0    1
p(X = x_i) :  1/2  1/2

Y :            0    1    2    3
p(Y = y_j) :  1/8  3/8  3/8  1/8
b) $p(X = 0, Y = 0) = p(\emptyset) = 0$, $p(X = 0, Y = 1) = p(\{HTT\}) = \dfrac{1}{8}$,
$p(X = 0, Y = 2) = p(\{HHT, HTH\}) = \dfrac{1}{4}$, $p(X = 0, Y = 3) = p(\{HHH\}) = \dfrac{1}{8}$,
$p(X = 1, Y = 0) = p(\{TTT\}) = \dfrac{1}{8}$, $p(X = 1, Y = 1) = p(\{THT, TTH\}) = \dfrac{1}{4}$,
$p(X = 1, Y = 2) = p(\{THH\}) = \dfrac{1}{8}$, $p(X = 1, Y = 3) = p(\emptyset) = 0$.
Hence the joint distribution is

       Y :    0     1     2     3    Marginal probabilities of X
X = 0 :       0    1/8   1/4   1/8   1/2
X = 1 :      1/8   1/4   1/8    0    1/2
Marginal probabilities of Y :
             1/8   3/8   3/8   1/8   Total 1
c) $E(X) = \sum x_i\, p(X = x_i) = \dfrac{1}{2}$, $E(Y) = \sum y_j\, p(Y = y_j) = \dfrac{3}{2}$,
$E(XY) = \sum_{i,j} x_i y_j\, p(X = x_i, Y = y_j) = 1 \cdot 1 \cdot \dfrac{1}{4} + 1 \cdot 2 \cdot \dfrac{1}{8} = \dfrac{1}{2}$.
$\mathrm{Cov}(X, Y) = E(XY) - E(X)E(Y) = \dfrac{1}{2} - \dfrac{1}{2} \cdot \dfrac{3}{2} = -\dfrac{1}{4}$.
d) $\rho(X, Y) = \dfrac{\mathrm{Cov}(X, Y)}{\sigma_X\,\sigma_Y}$, where $\sigma_X = \sqrt{\mathrm{Var}(X)}$ and $\sigma_Y = \sqrt{\mathrm{Var}(Y)}$.
$\mathrm{Var}(X) = E(X^2) - (E(X))^2 = \dfrac{1}{2} - \dfrac{1}{4} = \dfrac{1}{4}$, so $\sigma_X = \dfrac{1}{2}$.
$\mathrm{Var}(Y) = E(Y^2) - (E(Y))^2 = 3 - \dfrac{9}{4} = \dfrac{3}{4}$, so $\sigma_Y = \dfrac{\sqrt{3}}{2}$.
$\rho(X, Y) = \dfrac{-1/4}{(1/2)(\sqrt{3}/2)} = -\dfrac{1}{\sqrt{3}}$.
Since $\rho(X, Y) < 0$, the variates X and Y are negatively correlated.
e) $p(X = 0) = \dfrac{1}{2}$, $p(Y = 0) = \dfrac{1}{8}$, but $p(X = 0, Y = 0) = 0 \ne p(X = 0)\,p(Y = 0)$.
Hence the variables are not independent.