
EE 046868 Problem Set 3

Gal Mendelson galmen@campus.technion.ac.il

Expectation. Conditional Expectation.

Some theory: Interchanging limits and integrals.

Monotone convergence. Let $\{X_n\}$ be a non-negative, non-decreasing sequence of RVs (meaning $X_n(\omega) \geq 0$ and $X_{n+1}(\omega) \geq X_n(\omega)$). Set $X(\omega) = \lim_{n \to \infty} X_n(\omega)$. Then $\lim_{n \to \infty} E[X_n] = E[\lim_{n \to \infty} X_n]$, or in short, $E[X_n] \to E[X]$, where the convergence is monotonic non-decreasing.
Dominated convergence. If $X_n \xrightarrow{a.s.} X$, and $|X_n| \leq Y$ for all $n$, and $E[Y] < \infty$, then $E[X_n] \to E[X]$.

Bounded convergence. A special case of dominated convergence, where $Y$ is a constant.

Fatou's Lemma. If $X_n \geq 0$, then $\liminf_{n \to \infty} E[X_n] \geq E[\liminf_{n \to \infty} X_n]$.
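As a quick numerical illustration of the monotone convergence statement above (my addition, not part of the original problem set): truncating a random variable at level $n$ gives $X_n = X \wedge n$, which is non-negative and non-decreasing in $n$. The choice $X \sim \mathrm{Exp}(1)$ below is an assumption made only for this sketch.

```python
# Monotone convergence, numerically: X ~ Exp(1) (assumed for illustration),
# X_n = min(X, n) is non-negative and non-decreasing in n, so the estimated
# E[X_n] should increase toward E[X] = 1.
import numpy as np

rng = np.random.default_rng(0)
X = rng.exponential(scale=1.0, size=1_000_000)   # samples of X

for n in [0.5, 1, 2, 4, 8]:
    Xn = np.minimum(X, n)                        # truncate at level n
    print(f"n = {n:>3}: E[X_n] ~= {Xn.mean():.4f}")
print(f"         E[X]   ~= {X.mean():.4f}")
```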

Problems.

1. Interchanging limits and integrals. Please prove the correctness of the examples you
provide. You can use examples we saw in class.
(a) Give an example where $X_n \xrightarrow{a.s.} X$, but $\lim_{n \to \infty} E[X_n] \neq E[\lim_{n \to \infty} X_n]$.
(b) Let $\{X_n\}$ be a non-negative sequence of RVs. Prove $E[\sum_{k=1}^{\infty} X_k] = \sum_{k=1}^{\infty} E[X_k]$.
(c) Let $\{X_n\}$ be an i.i.d., non-negative sequence of RVs. Let $N$ be an integer-valued RV, independent of the sequence $\{X_n\}$. Prove $E[\sum_{k=1}^{N} X_k] = E[X_1] E[N]$, without using conditional probability (the smoothing theorem, for example). A numerical sanity check of this identity is sketched right after this problem.
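The following Monte Carlo check of the identity in (c) is my own addition; the distributions $X_k \sim \mathrm{Exp}$ with mean $0.5$ and $N \sim \mathrm{Poisson}(3)$, drawn independently, are assumptions made only for illustration.

```python
# Check E[sum_{k=1}^N X_k] = E[X_1] * E[N] by simulation.
# Assumed for illustration: X_k ~ Exp(mean 0.5), N ~ Poisson(3), independent.
import numpy as np

rng = np.random.default_rng(1)
trials = 100_000
N = rng.poisson(lam=3.0, size=trials)             # integer-valued N
sums = np.array([rng.exponential(scale=0.5, size=n).sum() for n in N])

print("E[sum_{k<=N} X_k] ~=", sums.mean())        # empirical value
print("E[X_1] * E[N]      =", 0.5 * 3.0)          # 1.5
```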

2. Integration - the essentials.

(a) Riemann Integration. Let $h$ be a bounded function on a bounded interval of the form $(a, b]$. Define $\int_a^b h(x)\,dx$. Next, if $h$ is defined over $\mathbb{R}$, define $\int_{\mathbb{R}} h(x)\,dx$.
(b) Lebesgue Integration of a RV. Given a probability space $(\Omega, \mathcal{F}, P)$ and a RV $X$, define the Lebesgue integral of $X$, $E[X] = \int_{\Omega} X(\omega)\,P(d\omega)$. Do this in four steps:
• $X$ is an indicator RV.
• $X$ is a simple RV.
• $X$ is non-negative.
• $X$ is a RV, and $E[X^+]$ and $E[X^-]$ exist (recall $x^+ = x \vee 0$, $x^- = -(x \wedge 0)$).

3. Expectation and conditional expectation. Connection to undergraduate probability.


Lebesgue-Stieltjes integration. Let $F$ be a cumulative distribution function (CDF). Since $F$ is a CDF, it is right continuous and defines a probability measure $P$ on the Borel subsets of $\mathbb{R}$ by setting $P((a, b]) = F(b) - F(a)$ for half-open intervals $(a, b]$, and then extending to all Borel sets via $\sigma$-additivity.
Given a Borel measurable function $h$ on $\mathbb{R}$, the Lebesgue-Stieltjes integral of $h$ with respect to $F$ is defined to be the Lebesgue integral of $h$ with respect to $P$. That is

$\int_{\mathbb{R}} h(x)\,dF(x) := \int_{\mathbb{R}} h(x)\,P(dx)$.
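As an optional numerical note (my addition, not part of the problem set): the Lebesgue-Stieltjes integral $\int h\,dF$ equals $E[h(X)]$ for $X$ distributed according to $F$, so it can be estimated by sampling from $F$. The choices $F =$ standard normal CDF and $h(x) = x^2$ below are assumptions made only for this illustration.

```python
# Estimate the Lebesgue-Stieltjes integral  int h dF  by Monte Carlo.
# Assumed for illustration: F = standard normal CDF, h(x) = x**2,
# so the integral equals E[X**2] = 1 for X ~ N(0, 1).
import numpy as np

rng = np.random.default_rng(2)
x = rng.standard_normal(1_000_000)      # samples from the measure induced by F
print("int h dF ~=", (x**2).mean())     # should be close to 1
```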

(a) Change of variables formula to calculate expectation. Given are a probability space $(\mathbb{R}, \mathcal{B}, P)$ (i.e. the real line, the Borel $\sigma$-field and some probability measure), a RV $X$, and a function $g : \mathbb{R} \to \mathbb{R}$. Prove $E[g(X)] = \int g(x)\,dF(x)$ for 4 cases:
• $g$ is an indicator function: $g(X) = 1_B(X)$, for some $B \in \mathcal{B}$.
• $g$ is a simple function: $g(X) = \sum_{m=1}^{n} c_m 1_{B_m}(X)$, where $c_m \in \mathbb{R}$ and $B_m \in \mathcal{B}$.
• $g$ is a non-negative function. Use $g_n(x) = (\lfloor 2^n g(x) \rfloor / 2^n) \wedge n$ to approximate $g$ (a numerical sketch of this approximation appears right after this problem).
• $g$ is a Borel measurable function. Use $g = g^+ - g^-$.

(b) The change of variables formula is much more general. For example, $E[g(X, Y)] = \int\!\int g(x, y)\,dF_{XY}(x, y)$.

Suppose $X$ and $Y$ are independent RVs (this means that their distribution is a product measure, i.e. $dF_{XY}(x, y) = dF_X(x)\,dF_Y(y)$). Let $\varphi(x, y)$ be a Borel measurable function, and let $h(x) = E[\varphi(x, Y)]$. Prove $E[\varphi(X, Y)\,|\,X] = h(X)$. In particular, show $E[Y|X] = E[Y]$.
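A small numerical sketch (my addition) of the dyadic approximation $g_n$ from part (a), using the assumed example $g(x) = e^x$; it shows $g_n(x)$ increasing to $g(x)$ at a fixed point $x$.

```python
# Dyadic staircase approximation g_n(x) = (floor(2**n * g(x)) / 2**n) ∧ n.
# Assumed example for illustration: g(x) = exp(x); g_n(x) increases to g(x).
import numpy as np

def g(x):
    return np.exp(x)

def g_n(x, n):
    return min(np.floor(2.0**n * g(x)) / 2.0**n, n)

x = 1.3
for n in [1, 2, 4, 8, 16]:
    print(f"n = {n:>2}: g_n(x) = {g_n(x, n):.6f}")
print(f"        g(x)   = {g(x):.6f}")
```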

4. Stopping times.

(a) Define a stopping time with respect to a filtration $\{\mathcal{F}_n\}$.


(b) Let $\tau$ and $\sigma$ be stopping times with respect to a filtration $\{\mathcal{F}_n\}$. Prove that $\tau + \sigma$ is also a stopping time with respect to the same filtration.
(c) Suppose $\sigma \leq \tau$ a.s. Is $\tau - \sigma$ a stopping time? Prove that it is or provide a counterexample.

5. Martingales, Optional stopping. Consider the simple, symmetric random walk given by $S_n = \sum_{i=1}^{n} X_i$, $S_0 = 0$, where the $X_i$'s are i.i.d., such that $P(X_i = 1) = 0.5$ and $P(X_i = -1) = 0.5$. Let $\mathcal{F}_n = \sigma(X_1, \ldots, X_n)$. Fix some $a < 0$ and $b > 0$. Define the event $A = \{\text{the walk reaches } a \text{ before it reaches } b\}$. Define the stopping time

$T = \min\{n : S_n \in \{a, b\}\}$.

A simulation sketch for this setup appears after part (d).

(a) Prove that $S_n$ is a martingale with respect to $\{\mathcal{F}_n\}$.
(b) Find $P(A)$ using the optional stopping theorem and the previous exercise.
(c) Prove that $S_n^2 - n$ is a martingale with respect to $\{\mathcal{F}_n\}$.
(d) Use the previous results to find $E[T]$.
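For intuition only (my addition, not part of the problem set), the sketch below simulates the stopped walk for the assumed values $a = -3$, $b = 5$ and reports empirical estimates of $P(A)$ and $E[T]$; the exact expressions are what parts (b) and (d) ask you to derive.

```python
# Simulate the simple symmetric random walk until it first hits a or b,
# and estimate P(A) = P(hit a before b) and E[T].
# The values a = -3, b = 5 are an arbitrary choice for illustration.
import numpy as np

rng = np.random.default_rng(3)
a, b = -3, 5
trials = 20_000
hit_a = 0
total_T = 0

for _ in range(trials):
    S, t = 0, 0
    while S not in (a, b):
        S += 1 if rng.random() < 0.5 else -1   # one +/- 1 step
        t += 1
    hit_a += (S == a)
    total_T += t

print("P(A) ~=", hit_a / trials)
print("E[T] ~=", total_T / trials)
```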
