Chebyshev's Inequality and Consistent Estimators

- Chebyshev's inequality bounds the probability that a random variable deviates from its mean by a given amount: for any random variable X with mean μ and variance σ², the probability that X deviates from μ by at least k standard deviations is at most 1/k².
- The weak law of large numbers states that if X₁, X₂, ... are independent random variables with the same mean μ and finite variance, then the sample mean (X₁ + ... + Xₙ)/n converges in probability to the true mean μ as n approaches infinity.
- An estimator θ̂ is consistent if it converges in probability to the true parameter value θ as the sample size n approaches infinity.

STAT 410

Chebyshev's Inequality:

Let X be any random variable with mean μ and variance σ². For any ε > 0,

    P( | X − μ | < ε ) ≥ 1 − σ²/ε²

or, equivalently,

    P( | X − μ | ≥ ε ) ≤ σ²/ε².

Setting ε = kσ, k > 1, we obtain

    P( | X − μ | < kσ ) ≥ 1 − 1/k²

or, equivalently,

    P( | X − μ | ≥ kσ ) ≤ 1/k².

That is, for any k > 1, the probability that the value of any random variable will be within k standard deviations of its mean is at least 1 − 1/k².
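As a quick check (not part of the original handout), here is a minimal Python sketch comparing the empirical tail probability with the 1/k² bound; the exponential(1) distribution and the values of k are arbitrary illustrative choices:

```python
import numpy as np

rng = np.random.default_rng(0)

# Any distribution works; Chebyshev uses only the mean and variance.
# Exponential(1) has mu = 1 and sigma = 1.
x = rng.exponential(scale=1.0, size=1_000_000)
mu, sigma = 1.0, 1.0

for k in (1.5, 2.0, 3.0):
    tail = np.mean(np.abs(x - mu) >= k * sigma)  # empirical P(|X - mu| >= k*sigma)
    print(f"k = {k}: empirical tail = {tail:.4f}, Chebyshev bound 1/k^2 = {1 / k**2:.4f}")
```

The empirical tail is always below the bound; for heavy-tailed choices of the distribution it comes closer to it.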

Def

Let U₁, U₂, ... be an infinite sequence of random variables, and let W be another random variable. Then the sequence { Uₙ } converges in probability to W if, for all ε > 0,

    lim n→∞ P( | Uₙ − W | ≥ ε ) = 0,

and we write Uₙ →ᵖ W.

Def

An estimator θ̂ for θ is said to be consistent if θ̂ →ᵖ θ, i.e.,

for all ε > 0, P( | θ̂ − θ | ≥ ε ) → 0 as n → ∞.

The (Weak) Law of Large Numbers:

Let X₁, X₂, ... be a sequence of independent random variables, each having the same mean μ and each having variance less than or equal to v < ∞. Let

    Mₙ = ( X₁ + ... + Xₙ ) / n,   n = 1, 2, ....

Then Mₙ →ᵖ μ. That is, for all ε > 0, lim n→∞ P( | Mₙ − μ | ≥ ε ) = 0.

Let X₁, X₂, ... be i.i.d. with mean μ and standard deviation σ. Let

    X̄ₙ = ( X₁ + ... + Xₙ ) / n,   n = 1, 2, ....

Then X̄ₙ →ᵖ μ. That is, for all ε > 0, lim n→∞ P( | X̄ₙ − μ | ≥ ε ) = 0.
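A minimal simulation sketch (not in the original notes; Uniform(0, 1), ε = 0.05, and the sample sizes are arbitrary choices) showing the tail probability P( | X̄ₙ − μ | ≥ ε ) shrinking as n grows:

```python
import numpy as np

rng = np.random.default_rng(0)

mu, eps = 0.5, 0.05    # mean of Uniform(0, 1); tolerance epsilon
reps = 1_000           # simulated sample means per sample size

for n in (10, 100, 1_000, 10_000):
    xbar = rng.uniform(0.0, 1.0, size=(reps, n)).mean(axis=1)
    tail = np.mean(np.abs(xbar - mu) >= eps)  # estimate of P(|Xbar_n - mu| >= eps)
    print(f"n = {n:6d}: fraction of runs with |Xbar_n - mu| >= {eps}: {tail:.3f}")
```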

Let Yₙ be the number of successes in n independent Bernoulli trials with probability p of success on each trial. Then for all ε > 0,

    P( | Yₙ/n − p | ≥ ε ) ≤ p(1 − p) / (nε²),

and Yₙ/n →ᵖ p.
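A worked example of using the bound (the numbers ε = 0.05 and the 0.05 error level are illustrative choices, not from the handout):

```latex
% How large must n be so that P(|Y_n/n - p| >= 0.05) <= 0.05,
% whatever the unknown p is?  Since p(1 - p) <= 1/4 for every p in [0, 1],
\[
  P\left( \left| \tfrac{Y_n}{n} - p \right| \ge 0.05 \right)
    \le \frac{p(1 - p)}{n (0.05)^2}
    \le \frac{1}{4 n (0.05)^2}
    = \frac{100}{n},
\]
% which is at most 0.05 once n >= 2000.
```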

    Xₙ →ᵖ X, Yₙ →ᵖ Y  ⇒  Xₙ + Yₙ →ᵖ X + Y

    Xₙ →ᵖ X, a = const  ⇒  aXₙ →ᵖ aX

    Xₙ →ᵖ a, g continuous at a  ⇒  g(Xₙ) →ᵖ g(a)

    Xₙ →ᵖ X, g continuous  ⇒  g(Xₙ) →ᵖ g(X)

    Xₙ →ᵖ X, Yₙ →ᵖ Y  ⇒  XₙYₙ →ᵖ XY
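For instance (an illustration not in the original statement), the third rule combines with the Bernoulli result above:

```latex
% From the Bernoulli result, Y_n/n -> p in probability, and the
% function g(t) = t(1 - t) is continuous at the constant p, so
\[
  \frac{Y_n}{n}\left( 1 - \frac{Y_n}{n} \right)
    \xrightarrow{P} p\,(1 - p).
\]
```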

1.

Let X₁, X₂, ..., Xₙ be a random sample of size n from the distribution with probability density function

    f(x; θ) = (1/θ) x^((1 − θ)/θ),   0 < x < 1,   0 < θ < ∞,

and f(x; θ) = 0 otherwise.

Recall:  E(X) = 1/(1 + θ),   E(ln X) = −θ.

a)  Show that θ̂ = −(1/n) Σᵢ₌₁ⁿ ln Xᵢ is a consistent estimator of θ.

b)  Show that θ̃ = (1 − X̄)/X̄ = 1/X̄ − 1 is a consistent estimator of θ.
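A solution sketch (mine, not from the handout; it uses Var(ln X) = θ², which follows from the density above since −ln X / θ is exponential(1)):

```latex
% (a) ln X_1, ..., ln X_n are i.i.d. with E(ln X) = -theta and
%     Var(ln X) = theta^2 < infinity, so by the WLLN
\[
  \frac{1}{n} \sum_{i=1}^{n} \ln X_i \xrightarrow{P} -\theta,
  \qquad \text{hence} \qquad
  \hat{\theta} = -\frac{1}{n} \sum_{i=1}^{n} \ln X_i \xrightarrow{P} \theta .
\]
% (b) By the WLLN, \bar{X} -> 1/(1 + theta) in probability, and
%     g(t) = (1 - t)/t is continuous at t = 1/(1 + theta) > 0, so
\[
  \tilde{\theta} = \frac{1 - \bar{X}}{\bar{X}}
    \xrightarrow{P} \frac{1 - \frac{1}{1+\theta}}{\frac{1}{1+\theta}} = \theta .
\]
```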

Similarly to Chebyshev's Inequality,

    P( | θ̂ − θ | ≥ ε ) ≤ E[ ( θ̂ − θ )² ] / ε² = MSE( θ̂ ) / ε²,

so an estimator whose MSE tends to 0 as n → ∞ is consistent.
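As an example of this criterion (a standard fact, not worked in the original text):

```latex
% For an i.i.d. sample with mean mu and variance sigma^2, E(Xbar_n) = mu,
% so MSE(Xbar_n) = Var(Xbar_n) = sigma^2 / n, and
\[
  P\left( \left| \bar{X}_n - \mu \right| \ge \varepsilon \right)
    \le \frac{\sigma^2}{n \varepsilon^2} \longrightarrow 0
  \quad \text{as } n \to \infty,
\]
% which re-derives the consistency of \bar{X}_n.
```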

2.

Let X₁, X₂, ..., Xₙ be a random sample of size n from a uniform distribution on the interval ( 0, θ ).

a)  Show that θ̃ = 2X̄ is a consistent estimator of θ.

b)  Show that θ̂ = max Xᵢ is a consistent estimator of θ.
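A minimal simulation sketch (Python; θ = 3 is an arbitrary choice for illustration) showing both estimators settling near θ as n grows:

```python
import numpy as np

rng = np.random.default_rng(0)
theta = 3.0   # true parameter; arbitrary illustrative value

for n in (10, 100, 1_000, 10_000):
    x = rng.uniform(0.0, theta, size=n)
    print(f"n = {n:6d}: 2*Xbar = {2 * x.mean():.4f}, max Xi = {x.max():.4f}")
```

Note that 2X̄ fluctuates around θ on both sides, while max Xᵢ approaches θ from below.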
