LECTURE 7: Conditioning on a random variable; Independence of r.v.'s
• Conditional PMFs
• Conditional expectations
- Total expectation theorem
• Independence of r.v. 's
- Expectation properties
- Variance properties
• The variance of the binomial
• The hat problem: mean and variance
Conditional PMFs

$p_{X|A}(x) = P(X = x \mid A)$   $p_{X|Y}(x \mid y) = P(X = x \mid Y = y)$

$p_{X|Y}(x \mid y)$ is defined for $y$ such that $p_Y(y) > 0$;   $\sum_x p_{X|Y}(x \mid y) = 1$
Joint PMF $p_{X,Y}(x, y)$ (entries in twentieths):

y\x |  1     2     3     4
 4  | 1/20  2/20  2/20   0
 3  | 2/20  4/20  1/20  2/20
 2  |  0    1/20  3/20  1/20
 1  |  0    1/20   0     0

$p_{X,Y}(x, y) = p_Y(y)\, p_{X|Y}(x \mid y)$

$p_{X,Y}(x, y) = p_X(x)\, p_{Y|X}(y \mid x)$
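As a numerical companion (an illustration added here, not part of the original slides), the Python sketch below encodes the table above as a dictionary, zero entries omitted, and recovers a conditional PMF by renormalizing one row of the joint PMF:

```python
from fractions import Fraction as F

# Joint PMF p_{X,Y}(x, y) from the table above (zero entries omitted).
joint = {
    (2, 1): F(1, 20),
    (2, 2): F(1, 20), (3, 2): F(3, 20), (4, 2): F(1, 20),
    (1, 3): F(2, 20), (2, 3): F(4, 20), (3, 3): F(1, 20), (4, 3): F(2, 20),
    (1, 4): F(1, 20), (2, 4): F(2, 20), (3, 4): F(2, 20),
}

def p_Y(y):
    # Marginal: p_Y(y) = sum_x p_{X,Y}(x, y)
    return sum(p for (x, yy), p in joint.items() if yy == y)

def p_X_given_Y(x, y):
    # p_{X|Y}(x | y) = p_{X,Y}(x, y) / p_Y(y), defined only when p_Y(y) > 0
    return joint.get((x, y), F(0)) / p_Y(y)

print([p_X_given_Y(x, 3) for x in range(1, 5)])  # 2/9, 4/9, 1/9, 2/9 -- sums to 1
```

Exact rational arithmetic via `fractions` keeps these checks free of floating-point noise.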
Conditional PMFs involving more than two r.v.'s
• Self-explanatory notation
$p_{X|Y,Z}(x \mid y, z)$
$p_{X,Y|Z}(x, y \mid z)$
• Multiplication rule
$P(A \cap B \cap C) = P(A)\, P(B \mid A)\, P(C \mid A \cap B)$

$p_{X,Y,Z}(x, y, z) = p_X(x)\, p_{Y|X}(y \mid x)\, p_{Z|X,Y}(z \mid x, y)$
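A minimal sketch of the three-variable multiplication rule, with hypothetical toy PMFs invented only for illustration: build a joint PMF from a marginal and two conditionals, and confirm it is a valid PMF.

```python
from fractions import Fraction as F

# Hypothetical toy distributions, chosen only to illustrate the factorization.
p_X = {0: F(1, 2), 1: F(1, 2)}
p_Y_given_X = {(0, 0): F(1, 3), (1, 0): F(2, 3),   # keys are (y, x)
               (0, 1): F(3, 4), (1, 1): F(1, 4)}
p_Z_given_XY = {(z, x, y): F(1, 2)                  # Z: fair coin whatever (x, y) is
                for z in (0, 1) for x in (0, 1) for y in (0, 1)}

# p_{X,Y,Z}(x, y, z) = p_X(x) p_{Y|X}(y | x) p_{Z|X,Y}(z | x, y)
joint3 = {(x, y, z): p_X[x] * p_Y_given_X[(y, x)] * p_Z_given_XY[(z, x, y)]
          for x in (0, 1) for y in (0, 1) for z in (0, 1)}
assert sum(joint3.values()) == 1  # a legitimate joint PMF
```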
Conditional expectation
$E[X] = \sum_x x\, p_X(x)$   $E[X \mid A] = \sum_x x\, p_{X|A}(x)$   $E[X \mid Y = y] = \sum_x x\, p_{X|Y}(x \mid y)$
• Expected value rule
$E[g(X)] = \sum_x g(x)\, p_X(x)$   $E[g(X) \mid A] = \sum_x g(x)\, p_{X|A}(x)$

$E[g(X) \mid Y = y] = \sum_x g(x)\, p_{X|Y}(x \mid y)$
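Continuing the earlier sketch (reusing the `joint` dictionary): a hypothetical helper, named here `E_given_Y`, that evaluates $E[g(X) \mid Y = y]$ directly from the definition.

```python
def E_given_Y(y, g=lambda x: x):
    # E[g(X) | Y = y] = sum_x g(x) p_{X|Y}(x | y)
    pY = sum(p for (x, yy), p in joint.items() if yy == y)
    return sum(g(x) * p / pY for (x, yy), p in joint.items() if yy == y)

print(E_given_Y(3))                   # E[X | Y = 3]   = 7/3
print(E_given_Y(3, lambda x: x * x))  # E[X^2 | Y = 3] = 59/9
```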
Total probability and expectation theorems

• $A_1, \ldots, A_n$: partition of $\Omega$

• $p_X(x) = P(A_1)\, p_{X|A_1}(x) + \cdots + P(A_n)\, p_{X|A_n}(x)$

$p_X(x) = \sum_y p_Y(y)\, p_{X|Y}(x \mid y)$

• $E[X] = P(A_1)\, E[X \mid A_1] + \cdots + P(A_n)\, E[X \mid A_n]$

$E[X] = \sum_y p_Y(y)\, E[X \mid Y = y]$
• Fine print: also valid when $Y$ is a discrete r.v. that ranges over an infinite set, as long as $E[|X|] < \infty$
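A numerical check of the total expectation theorem on the running example (reusing `joint` and `E_given_Y` from the sketches above):

```python
# Marginal p_Y, then average the conditional expectations against it.
pY = {y: sum(p for (x, yy), p in joint.items() if yy == y) for y in range(1, 5)}

E_X_direct = sum(x * p for (x, y), p in joint.items())       # E[X] from the joint
E_X_total  = sum(pY[y] * E_given_Y(y) for y in range(1, 5))  # sum_y p_Y(y) E[X | Y=y]
assert E_X_direct == E_X_total                               # both equal 49/20
```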
Independence
• of two events: $P(A \cap B) = P(A) \cdot P(B)$;  equivalently, $P(A \mid B) = P(A)$

• of a r.v. and an event: $P(X = x \text{ and } A) = P(X = x) \cdot P(A)$, for all $x$

• of two r.v.'s: $P(X = x \text{ and } Y = y) = P(X = x) \cdot P(Y = y)$, for all $x, y$

$p_{X,Y}(x, y) = p_X(x)\, p_Y(y)$, for all $x, y$

• $X, Y, Z$ are independent if $p_{X,Y,Z}(x, y, z) = p_X(x)\, p_Y(y)\, p_Z(z)$, for all $x, y, z$
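A minimal sketch of an independence test for two r.v.'s with finite support, checking the defining identity cell by cell (it assumes the joint PMF is given as a dict keyed by `(x, y)`, as in the earlier sketches):

```python
def independent(joint_pmf):
    # X, Y independent iff p_{X,Y}(x, y) = p_X(x) p_Y(y) for ALL x, y.
    xs = {x for x, y in joint_pmf}
    ys = {y for x, y in joint_pmf}
    pX = {x: sum(joint_pmf.get((x, y), 0) for y in ys) for x in xs}
    pY = {y: sum(joint_pmf.get((x, y), 0) for x in xs) for y in ys}
    return all(joint_pmf.get((x, y), 0) == pX[x] * pY[y]
               for x in xs for y in ys)
```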
Example: independence and conditional independence
Joint PMF $p_{X,Y}(x, y)$ (same table as before, entries in twentieths):

y\x |  1     2     3     4
 4  | 1/20  2/20  2/20   0
 3  | 2/20  4/20  1/20  2/20
 2  |  0    1/20  3/20  1/20
 1  |  0    1/20   0     0

• Independent?
• What if we condition on $X \le 2$ and $Y \ge 3$? (See the check below.)
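Applying the `independent` helper from the previous sketch (and reusing `joint`): the answer is no, but conditioning on the event $A = \{X \le 2,\ Y \ge 3\}$ makes the conditional joint PMF factor.

```python
print(independent(joint))  # False: e.g. p_{X,Y}(1, 1) = 0 while p_X(1) p_Y(1) > 0

# Condition on A = {X <= 2, Y >= 3}: keep those cells and renormalize by P(A).
A = {(x, y): p for (x, y), p in joint.items() if x <= 2 and y >= 3}
PA = sum(A.values())                       # P(A) = 9/20
cond = {xy: p / PA for xy, p in A.items()}
print(independent(cond))   # True: X and Y are conditionally independent given A
```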
Independence and expectations

• In general: $E[g(X, Y)] \neq g(E[X], E[Y])$

• Exceptions: $E[aX + b] = a\,E[X] + b$;  $E[X + Y + Z] = E[X] + E[Y] + E[Z]$
• If $X$, $Y$ are independent: $E[XY] = E[X]\, E[Y]$

$g(X)$ and $h(Y)$ are also independent: $E[g(X)\, h(Y)] = E[g(X)]\, E[h(Y)]$
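A quick enumeration check of the product rule for an independent pair (the distributions and the functions `g`, `h` below are hypothetical, chosen only for illustration):

```python
from fractions import Fraction as F
from itertools import product

pX = {x: F(1, 3) for x in (1, 2, 3)}  # X uniform on {1, 2, 3}
pY = {y: F(1, 2) for y in (0, 1)}     # Y uniform on {0, 1}, independent of X
g = lambda x: x * x
h = lambda y: 2 * y + 1

lhs = sum(g(x) * h(y) * pX[x] * pY[y] for x, y in product(pX, pY))
rhs = sum(g(x) * pX[x] for x in pX) * sum(h(y) * pY[y] for y in pY)
assert lhs == rhs  # E[g(X) h(Y)] = E[g(X)] E[h(Y)] under independence
```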
Independence and variances

• Always true: $\mathrm{var}(aX) = a^2\, \mathrm{var}(X)$;  $\mathrm{var}(X + a) = \mathrm{var}(X)$

• In general: $\mathrm{var}(X + Y) \neq \mathrm{var}(X) + \mathrm{var}(Y)$

If $X$, $Y$ are independent: $\mathrm{var}(X + Y) = \mathrm{var}(X) + \mathrm{var}(Y)$
• Examples (see the sketch after this list):

- If $X = Y$: $\mathrm{var}(X + Y) = \mathrm{var}(2X) = 4\, \mathrm{var}(X)$

- If $X = -Y$: $\mathrm{var}(X + Y) = \mathrm{var}(0) = 0$

- If $X$, $Y$ independent: $\mathrm{var}(X - 3Y) = \mathrm{var}(X) + 9\, \mathrm{var}(Y)$
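The same independent pair from the previous sketch (reusing `pX`, `pY`) confirms the last example numerically, $\mathrm{var}(X - 3Y) = \mathrm{var}(X) + 9\,\mathrm{var}(Y)$:

```python
def var(pmf):
    # var = E[V^2] - (E[V])^2, computed exactly over a finite PMF
    m = sum(v * q for v, q in pmf.items())
    return sum((v - m) ** 2 * q for v, q in pmf.items())

# PMF of W = X - 3Y, built from the independent pair (pX, pY).
pW = {}
for x, qx in pX.items():
    for y, qy in pY.items():
        w = x - 3 * y
        pW[w] = pW.get(w, 0) + qx * qy

assert var(pW) == var(pX) + 9 * var(pY)  # 35/12 on this example
```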
Variance of the binomial
• X: binomial with parameters n, p
- number of successes in n independent trials
$X_i = 1$ if the $i$th trial is a success, $X_i = 0$ otherwise  (indicator variable)
$X = X_1 + \cdots + X_n$

Each $X_i$ is Bernoulli: $\mathrm{var}(X_i) = E[X_i^2] - (E[X_i])^2 = p - p^2 = p(1 - p)$. Since the trials are independent, $\mathrm{var}(X) = \mathrm{var}(X_1) + \cdots + \mathrm{var}(X_n) = n\, p(1 - p)$.
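An exact check of $\mathrm{var}(X) = n\,p(1-p)$ straight from the binomial PMF (a minimal sketch; the values of `n` and `p` are arbitrary test choices):

```python
from fractions import Fraction as F
from math import comb

n, p = 10, F(1, 4)
pmf = {k: comb(n, k) * p**k * (1 - p)**(n - k) for k in range(n + 1)}

mean = sum(k * q for k, q in pmf.items())
variance = sum((k - mean) ** 2 * q for k, q in pmf.items())
assert mean == n * p and variance == n * p * (1 - p)  # 5/2 and 15/8 here
```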
The hat problem

• $n$ people throw their hats in a box and then pick one at random

- All permutations equally likely

- Equivalent to picking one hat at a time

• $X$: number of people who get their own hat

Find $E[X]$
$X_i = 1$ if person $i$ selects own hat, $X_i = 0$ otherwise

$X = X_1 + \cdots + X_n$, and $E[X_i] = P(X_i = 1) = 1/n$, so $E[X] = n \cdot (1/n) = 1$
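A brute-force confirmation by enumerating all permutations for a small $n$ (an illustration; `n = 6` is an arbitrary choice):

```python
from fractions import Fraction as F
from itertools import permutations

n = 6
perms = list(permutations(range(n)))  # all n! equally likely hat assignments
matches = [sum(perm[i] == i for i in range(n)) for perm in perms]

E_X = F(sum(matches), len(perms))
print(E_X)  # 1, regardless of n
```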
The variance in the hat problem

• $X$: number of people who get their own hat

Find $\mathrm{var}(X)$

$X_i = 1$ if person $i$ selects own hat, $X_i = 0$ otherwise

$X = X_1 + X_2 + \cdots + X_n$
• $\mathrm{var}(X) = E[X^2] - (E[X])^2$, where $X^2 = \sum_i X_i^2 + \sum_{i \neq j} X_i X_j$

• $E[X_i^2] = P(X_i = 1) = 1/n$, and for $i \neq j$, $E[X_i X_j] = P(X_i = 1 \text{ and } X_j = 1) = \frac{1}{n} \cdot \frac{1}{n-1}$

So $E[X^2] = n \cdot \frac{1}{n} + n(n-1) \cdot \frac{1}{n(n-1)} = 2$, and $\mathrm{var}(X) = 2 - 1 = 1$
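Continuing the enumeration from the previous sketch (reusing `matches`, `perms`, and `E_X`): the second moment comes out to exactly 2, so the variance is 1 for every $n \ge 2$.

```python
E_X2 = F(sum(m * m for m in matches), len(perms))  # E[X^2] = 2 for n >= 2
print(E_X2 - E_X ** 2)                             # var(X) = 1
```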
MIT OpenCourseWare
https://ocw.mit.edu
Resource: Introduction to Probability
John Tsitsiklis and Patrick Jaillet
The following may not correspond to a particular course on MIT OpenCourseWare, but has been provided by the author as an individual learning resource.
For information about citing these materials or our Terms of Use, visit: https://ocw.mit.edu/terms.