
Lecture Notes
Advanced Engineering Mathematics

Aman Kanshotia
Assistant Professor
Sekhawati Group of Institutes, Sikar
September 2, 2025

1 Probability and Statistics


Random Experiment
An experiment whose outcome or result can be predicted with certainty is called a deterministic experiment.
Although all possible outcomes of an experiment may be known in advance, the outcome of a particular performance of the experiment cannot be predicted, owing to a number of unknown causes. Such an experiment is called a random experiment.
e.g., whenever a fair die is thrown, it is known that one of the 6 possible outcomes will occur, but which one cannot be predicted in advance.

Sample Space
The set of all possible outcomes of the random experiment, usually denoted by S. In the mathematical definition below, these outcomes are assumed to be equally likely.

Event
A subset of S consisting of possible outcomes.

Mathematical definition of Probability


Let S be the sample space and A an event associated with a random experiment, and let n(S) and n(A) be the numbers of elements in S and A. Then the probability of the event A occurring, denoted by P(A), is defined as

P(A) = n(A)/n(S)
Note:
1. It is obvious that 0 ≤ P (A) ≤ 1.


2. If A is an impossible event, P (A) = 0.


3. If A is a certain event, P (A) = 1.
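The classical definition above can be checked by direct counting. A minimal Python sketch (the notes contain no code; `classical_prob` is an illustrative helper, not part of the text):

```python
from fractions import Fraction

def classical_prob(sample_space, event):
    """P(A) = n(A)/n(S) for a finite sample space of equally likely outcomes."""
    favourable = [s for s in sample_space if event(s)]
    return Fraction(len(favourable), len(sample_space))

die = range(1, 7)                                        # one throw of a fair die
p_even = classical_prob(die, lambda s: s % 2 == 0)       # 3/6 = 1/2
p_impossible = classical_prob(die, lambda s: s > 6)      # impossible event: 0
p_certain = classical_prob(die, lambda s: 1 <= s <= 6)   # certain event: 1
```

The three notes above correspond to the three computed values: the probability always lies in [0, 1], is 0 for the impossible event, and 1 for the certain event.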
A set of events is said to be mutually exclusive if the occurrence of any one of them excludes the occurrence of the others; that is, no two of the events can occur simultaneously. In particular,

P(A1 ∩ A2 ∩ A3 ∩ . . . ∩ An ∩ . . .) = 0
Axiomatic definition of Probability


Let S be the sample space and A be an event associated with a random experiment. Then
the probability of the event A, P(A) is defined as a real number satisfying the following
axioms.

1. 0 ≤ P (A) ≤ 1

2. P (S) = 1

3. If A and B are mutually exclusive events, P (A ∪ B) = P (A) + P (B)

4. If A1 , A2 , A3 , . . . , An , . . . are mutually exclusive events,

P (A1 ∪ A2 ∪ A3 ∪ . . . ∪ An . . .) = P (A1 ) + P (A2 ) + P (A3 ) + . . . + P (An ) . . .

Important Theorems

Theorem 1: Probability of impossible event is zero.


Proof: Let S be the sample space (the certain event) and ϕ the impossible event.
The certain event and the impossible event are mutually exclusive.

P (S ∪ ϕ) = P (S) + P (ϕ) (Axiom 3)

S∪ϕ=S

P (S) = P (S) + P (ϕ)

P (ϕ) = 0, hence the result.


Theorem 2: If Ā is the complementary event of A, then P(Ā) = 1 − P(A) ≤ 1.


Proof: Let A denote the occurrence of the event and Ā its non-occurrence.
Occurrence and non-occurrence of the event are mutually exclusive.

P (A ∪ Ā) = P (A) + P (Ā)

A ∪ Ā = S ⇒ P (A ∪ Ā) = P (S) = 1

∴ 1 = P (A) + P (Ā)

P(Ā) = 1 − P(A), and since P(Ā) ≥ 0, it follows that P(A) ≤ 1.

Theorem 3: (Addition theorem) If A and B are any 2 events,

P (A ∪ B) = P (A) + P (B) − P (A ∩ B) ≤ P (A) + P (B).

Proof: We know, A = AB̄ ∪ AB and B = ĀB ∪ AB.

P (A) = P (AB̄) + P (AB), P (B) = P (ĀB) + P (AB) (Axiom 3)

P (A) + P (B) = P (AB̄) + P (AB) + P (ĀB) + P (AB)

= P (AB̄) + P (ĀB) + 2P (AB)

P (A ∪ B) = P (AB̄) + P (ĀB) + P (AB)

P (A ∪ B) = P (A) + P (B) − P (A ∩ B) ≤ P (A) + P (B).

Note: The theorem can be extended to any 3 events, A, B and C

P (A ∪ B ∪ C) = P (A) + P (B) + P (C) − P (A ∩ B) − P (B ∩ C) − P (C ∩ A) + P (A ∩ B ∩ C)
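The three-event addition theorem can be verified by exhaustive counting over a small sample space. A sketch with arbitrarily chosen die events (an illustration, not part of the notes):

```python
from fractions import Fraction

S = set(range(1, 7))          # one throw of a fair die
A = {2, 4, 6}                 # even
B = {4, 5, 6}                 # greater than 3
C = {1, 2, 3}                 # at most 3

def P(E):
    # classical probability by counting
    return Fraction(len(E), len(S))

lhs = P(A | B | C)
rhs = (P(A) + P(B) + P(C)
       - P(A & B) - P(B & C) - P(C & A)
       + P(A & B & C))
```

Here A ∪ B ∪ C = S, so both sides equal 1.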

Theorem 4: If B ⊂ A, P (B) ≤ P (A).


Proof: B and AB̄ are mutually exclusive events such that B ∪ AB̄ = A.

P (B ∪ AB̄) = P (A)

P (B) + P (AB̄) = P (A) ⇒ P (B) ≤ P (A).

Conditional Probability
The conditional probability of an event B, assuming that the event A has happened, is


denoted by P(B/A) and defined as

P(B/A) = P(A ∩ B)/P(A), provided P(A) ≠ 0

Product theorem of probability


Rewriting the definition of conditional probability, we get

P (A ∩ B) = P (A)P (B/A)

The product theorem can be extended to 3 events, A, B and C as follows:

P (A ∩ B ∩ C) = P (A)P (B/A)P (C/A ∩ B)
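The definition of conditional probability and the product theorem can be checked by enumeration. A hedged sketch using two die throws (the events A and B are illustrative choices):

```python
from fractions import Fraction
from itertools import product

S = list(product(range(1, 7), repeat=2))   # two throws of a fair die

def P(event):
    # probability of an event given as a predicate on outcomes
    return Fraction(sum(1 for s in S if event(s)), len(S))

A = lambda s: s[0] == 6                    # first throw shows a six
B = lambda s: s[0] + s[1] >= 10            # total is at least 10

p_B_given_A = P(lambda s: A(s) and B(s)) / P(A)   # P(B/A) = P(A ∩ B)/P(A)
```

With these choices P(A ∩ B) = 3/36 and P(A) = 1/6, so P(B/A) = 1/2, and the product theorem P(A ∩ B) = P(A) P(B/A) holds by construction.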

Note:
1. If A ⊂ B, P(B/A) = 1, since A ∩ B = A.
2. If B ⊂ A, P(B/A) ≥ P(B), since A ∩ B = B and P(B)/P(A) ≥ P(B), as P(A) ≤ P(S) = 1.
3. If A and B are mutually exclusive events, P(B/A) = 0, since P(A ∩ B) = 0.
4. If P(A) > P(B), then P(B/A) < P(A/B).
5. If A1 ⊂ A2, then P(A1/B) ≤ P(A2/B).

Independent Events
A set of events is said to be independent if the occurrence of any one of them does not
depend on the occurrence or non-occurrence of the others.
If the two events A and B are independent, the product theorem takes the form

P (A ∩ B) = P (A) × P (B)

Conversely, if P(A ∩ B) = P(A) × P(B), the events are said to be independent (pairwise independent).
The product theorem can be extended to any number of independent events. If
A1 , A2 , A3 , . . . , An are n independent events, then

P (A1 ∩ A2 ∩ A3 ∩ . . . ∩ An ) = P (A1 ) × P (A2 ) × P (A3 ) × . . . × P (An )
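The extended product rule for independent events can likewise be verified by enumeration; a sketch with three fair coin tosses (helper names are assumptions, not from the notes):

```python
from fractions import Fraction
from itertools import product

S = list(product((0, 1), repeat=3))        # three tosses of a fair coin (1 = head)

def P(event):
    return Fraction(sum(1 for s in S if event(s)), len(S))

# H[i] = "the i-th toss is a head"; physically independent events
H = [lambda s, i=i: s[i] == 1 for i in range(3)]

joint = P(lambda s: all(h(s) for h in H))  # P(H0 ∩ H1 ∩ H2)
prod = P(H[0]) * P(H[1]) * P(H[2])         # P(H0) × P(H1) × P(H2)
```

Both sides come out to 1/8, as the product rule predicts.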

Theorem 5:
If the events A and B are independent, the events Ā and B are also independent.
Proof: The events A ∩ B and Ā ∩ B are mutually exclusive such that

(A ∩ B) ∪ (Ā ∩ B) = B

∴ P (A ∩ B) + P (Ā ∩ B) = P (B)


P (Ā ∩ B) = P (B) − P (A ∩ B)

= P (B) − P (A)P (B) (∵ A and B are independent)

= P (B)[1 − P (A)] = P (Ā)P (B)

Hence proved.

Theorem 6:
If the events A and B are independent, the events Ā and B̄ are also independent.
Proof:
P(Ā ∩ B̄) = 1 − P(A ∪ B)    (by De Morgan’s law, Ā ∩ B̄ is the complement of A ∪ B)

= 1 − [P (A) + P (B) − P (A ∩ B)] (Addition theorem)

= [1 − P (A)] − P (B)[1 − P (A)]

= P (Ā)P (B̄)

Problem 1:
From a bag containing 3 red and 2 black balls, 2 balls are drawn at random. Find the
probability that they are of the same colour.
Solution:
Let A be the event of drawing 2 red balls. Let B be the event of drawing 2 black balls.

∴ P(A ∪ B) = P(A) + P(B) = 3C2/5C2 + 2C2/5C2 = 3/10 + 1/10 = 2/5

Problem 2:
When 2 cards are drawn from a well-shuffled pack of playing cards, what is the probability
that they are of the same suit?
Solution:
Let A be the event of drawing 2 spade cards. Let B be the event of drawing 2 club cards.
Let C be the event of drawing 2 hearts cards. Let D be the event of drawing 2 diamond
cards.
∴ P(A ∪ B ∪ C ∪ D) = 4 · 13C2/52C2 = 4 · 78/1326 = 4/17

Problem 3:
When A and B are mutually exclusive events such that P(A) = 1/2 and P(B) = 1/3, find P(A ∪ B) and P(A ∩ B).


Solution:

P(A ∪ B) = P(A) + P(B) = 1/2 + 1/3 = 5/6, P(A ∩ B) = 0

Problem 4:
If P (A) = 0.29, P (B) = 0.43, find P (A ∩ B̄), if A and B are mutually exclusive.
Solution:
We know A ∩ B̄ = A
P (A ∩ B̄) = P (A) = 0.29

Problem 5:
A card is drawn from a well-shuffled pack of playing cards. What is the probability that
it is either a spade or an ace?
Solution:
Let A be the event of drawing a spade. Let B be the event of drawing an ace.

P(A ∪ B) = P(A) + P(B) − P(A ∩ B) = 13/52 + 4/52 − 1/52 = 16/52 = 4/13
Problem 6:
If P (A) = 0.4, P (B) = 0.7 and P (A ∩ B) = 0.3, find P (Ā ∩ B̄).
Solution:
P (Ā ∩ B̄) = 1 − P (A ∪ B)

= 1 − [P (A) + P (B) − P (A ∩ B)]

= 0.2

Problem 7:
If P (A) = 0.35, P (B) = 0.75 and P (A ∪ B) = 0.95, find P (Ā ∪ B̄).
Solution:

P (Ā ∪ B̄) = 1 − P (A ∩ B) = 1 − [P (A) + P (B) − P (A ∪ B)] = 0.85

Problem 8:
A lot consists of 10 good articles, 4 with minor defects and 2 with major defects. Two articles are chosen from the lot at random (without replacement). Find the probability that (i) both are good, (ii) both have major defects, (iii) at least 1 is good, (iv) at most 1 is good, (v) exactly 1 is good, (vi) neither has major defects, (vii) neither is good.
Solution:
(i) P(both are good) = 10C2/16C2 = 45/120 = 3/8

(ii) P(both have major defects) = 2C2/16C2 = 1/120

(iii) P(at least 1 is good) = (10C1 · 6C1 + 10C2)/16C2 = (60 + 45)/120 = 7/8

(iv) P(at most 1 is good) = (6C2 + 10C1 · 6C1)/16C2 = (15 + 60)/120 = 5/8

(v) P(exactly 1 is good) = (10C1 · 6C1)/16C2 = 60/120 = 1/2

(vi) P(neither has major defects) = 14C2/16C2 = 91/120

(vii) P(neither is good) = 6C2/16C2 = 15/120 = 1/8
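The counting arguments of Problem 8 are easy to verify with Python's `math.comb`; a quick check of parts (i)–(iv) and (vii):

```python
from fractions import Fraction
from math import comb

good, minor, major = 10, 4, 2
defective = minor + major                  # 6 articles that are not good
ways = comb(good + defective, 2)           # C(16, 2) = 120 ways to draw 2 articles

p_both_good = Fraction(comb(good, 2), ways)
p_both_major = Fraction(comb(major, 2), ways)
p_at_least_one_good = Fraction(comb(good, 1) * comb(defective, 1) + comb(good, 2), ways)
p_at_most_one_good = Fraction(comb(defective, 2) + comb(good, 1) * comb(defective, 1), ways)
p_neither_good = Fraction(comb(defective, 2), ways)
```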

Problem 9:
If A, B and C are any 3 events such that P(A) = P(B) = P(C) = 1/4, P(A ∩ B) = P(B ∩ C) = 0 and P(C ∩ A) = 1/8, find the probability that at least 1 of the events A, B and C occurs.
Solution: Since A ∩ B ∩ C ⊂ A ∩ B and P(A ∩ B) = 0, we have P(A ∩ B ∩ C) = 0.

P(A ∪ B ∪ C) = P(A) + P(B) + P(C) − P(A ∩ B) − P(B ∩ C) − P(C ∩ A) + P(A ∩ B ∩ C)

= 3/4 − 0 − 0 − 1/8 + 0 = 5/8
Problem 10:
A box contains 4 bad and 6 good tubes. Two are drawn out from the box at a time. One
of them is tested and found to be good. What is the probability that the other one is
also good?
Solution: Let A be the event that the first tube drawn is good and B the event that the other tube drawn is also good.

P(both tubes are good) = P(A ∩ B) = 6C2/10C2 = 15/45 = 1/3

P(B|A) = P(A ∩ B)/P(A) = (1/3)/(6/10) = 5/9    (by conditional probability)

Problem 11:
In a shooting test, the probability of hitting the target is 1/2 for A, 2/3 for B and 3/4
for C. If all of them fire at the target, find the probability that (i) none of them hits the
target and (ii) at least one of them hits the target.
Solution: Let A, B and C be the events of A, B and C hitting the target, respectively.

P(A) = 1/2, P(B) = 2/3, P(C) = 3/4

P(Ā) = 1/2, P(B̄) = 1/3, P(C̄) = 1/4

P(none hits) = P(Ā ∩ B̄ ∩ C̄) = P(Ā) × P(B̄) × P(C̄) = 1/2 × 1/3 × 1/4 = 1/24

P(at least one hits) = 1 − P(none hits) = 1 − 1/24 = 23/24
Random Variable:
A random variable is a real-valued function whose domain is the sample space of a random experiment, taking values on the real line R.

Discrete Random Variable:


A discrete random variable is one which can take only a finite or countably infinite number of values, with definite probabilities associated with each of them.

Probability Mass Function:


Let X be a discrete random variable assuming the values x1, x2, . . . , xn. With each of these values we associate a number

P(X = xi) = p(xi), (i = 1, 2, . . . , n)

called the probability of xi, satisfying the following conditions:

1. pi ≥ 0 ∀ i, i.e., the pi’s are all non-negative;

2. Σ_{i=1}^{n} pi = p1 + p2 + · · · + pn = 1, i.e., the total probability is one.

Continuous Random Variable:


A continuous random variable is one which can assume every value between two specified
values with a definite probability associated with each.


Distribution Function or Cumulative Distribution Function

1. Discrete Variable:
The distribution function of a discrete random variable X is defined as

F(x) = P(X ≤ x) = Σ_{xi ≤ x} p(xi).

2. Continuous Variable:
The distribution function of a continuous random variable X is defined as

F(x) = P(X ≤ x) = ∫_{−∞}^{x} f(t) dt.

Mathematical Expectation
The expected value of the random variable X is defined as

1. If X is a discrete random variable,

E(X) = Σ_i xi p(xi),

where p(x) is the probability function of X.

2. If X is a continuous random variable,

E(X) = ∫_{−∞}^{∞} x f(x) dx,

where f(x) is the probability density function of X.
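The discrete expectation formula can be illustrated with a small sketch, using a fair die as the pmf (an illustration, not part of the notes):

```python
from fractions import Fraction

# pmf of a fair die as a dictionary {value: probability}
pmf = {x: Fraction(1, 6) for x in range(1, 7)}

def expectation(pmf, g=lambda x: x):
    """E[g(X)] = sum of g(x) * p(x) over the support."""
    return sum(g(x) * p for x, p in pmf.items())

mean = expectation(pmf)                             # E(X) = 21/6 = 7/2
second_moment = expectation(pmf, lambda x: x * x)   # E(X²) = 91/6
```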

Properties of Expectation:

1. If C is a constant, then E(C) = C.

Proof:
Let X be a discrete random variable; then

E(X) = Σ x p(x)

Now

E(C) = Σ C p(x) = C Σ p(x) = C, since Σ_{i=1}^{n} pi = p1 + p2 + · · · + pn = 1


2. If a, b are constants, then E(aX + b) = aE(X) + b.

Proof:
Let X be a discrete random variable; then

E(X) = Σ x p(x)

Now

E(aX + b) = Σ (ax + b) p(x) = Σ ax p(x) + Σ b p(x)

= a Σ x p(x) + b Σ p(x) = a E(X) + b, since Σ_{i=1}^{n} pi = 1

3. If a and b are constants, then Var(aX + b) = a² Var(X).

Proof:

Var(aX + b) = E[(aX + b − E(aX + b))²]

= E[(aX + b − (aE(X) + b))²]

= E[(a(X − E(X)))²] = a² E[(X − E(X))²]

= a² Var(X)

4. If a is a constant, then Var(aX) = a² Var(X).

Proof:

Var(aX) = E[(aX − E(aX))²]

= E[(aX − aE(X))²]

= E[(a(X − E(X)))²] = a² E[(X − E(X))²]

= a² Var(X)

5. Prove that Var(X) = E(X²) − [E(X)]².

Proof: Writing µ = E(X),

Var(X) = E[(X − E(X))²]

= E[X² + (E(X))² − 2X E(X)]

= E[X² + µ² − 2Xµ]

= E(X²) + E(µ²) − E(2Xµ)

= E(X²) + µ² − 2µ E(X)

= E(X²) + µ² − 2µ²

= E(X²) − µ²

∴ Var(X) = E(X²) − [E(X)]²
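Properties 3 and 5 can be checked numerically on a small pmf; a sketch (the pmf is an arbitrary illustration):

```python
from fractions import Fraction

pmf = {0: Fraction(1, 4), 1: Fraction(1, 2), 2: Fraction(1, 4)}
E = lambda g: sum(g(x) * p for x, p in pmf.items())

mean = E(lambda x: x)                                 # E(X) = 1
var_definition = E(lambda x: (x - mean) ** 2)         # E[(X − E X)²]
var_shortcut = E(lambda x: x * x) - mean ** 2         # E(X²) − [E(X)]²

mean_lin = E(lambda x: 3 * x + 4)                     # E(3X + 4)
var_lin = E(lambda x: (3 * x + 4 - mean_lin) ** 2)    # Var(3X + 4) = 9 Var(X)
```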

Problem 1
If the probability distribution of X is given as

X 1 2 3 4
P (X) 0.4 0.3 0.2 0.1

Find P(1/2 < X < 7/2 | X > 1).


Solution:

P(1/2 < X < 7/2 | X > 1) = P(1/2 < X < 7/2, X > 1)/P(X > 1)

= P(X = 2 or 3)/P(X = 2, 3 or 4) = [P(X = 2) + P(X = 3)]/[P(X = 2) + P(X = 3) + P(X = 4)]

= (0.3 + 0.2)/(0.3 + 0.2 + 0.1) = 0.5/0.6 = 5/6.

Problem 2
A random variable X has the following probability distribution

X −2 −1 0 1 2 3
P (X) 0.1 K 0.2 2K 0.3 3K
a) Find K. b) Evaluate P (X < 2) and P (−2 < X < 2).
c) Find the cdf of X. d) Evaluate the mean of X.
Solution:
a) Since Σ P(X) = 1,

0.1 + K + 0.2 + 2K + 0.3 + 3K = 1

6K + 0.6 = 1 ⇒ 6K = 0.4 ⇒ K = 0.4/6 = 1/15

b) P(X < 2) = P(X = −2, −1, 0, 1)

= P(X = −2) + P(X = −1) + P(X = 0) + P(X = 1)

= 0.1 + 1/15 + 0.2 + 2/15 = 3/30 + 2/30 + 6/30 + 4/30 = 15/30 = 1/2

Also,

P(−2 < X < 2) = P(X = −1, 0, 1)

= P(X = −1) + P(X = 0) + P(X = 1)

= 1/15 + 0.2 + 2/15 = 2/5
c) The distribution function of X is given by F(x) = P(X ≤ x):

X = x    P(X = x)    F(x) = P(X ≤ x)
−2       1/10        1/10
−1       1/15        1/10 + 1/15 = 1/6
0        1/5         1/6 + 1/5 = 11/30
1        2/15        11/30 + 2/15 = 1/2
2        3/10        1/2 + 3/10 = 4/5
3        1/5         4/5 + 1/5 = 1
d) The mean of X is defined by E(X) = Σ x P(x):

E(X) = (−2 × 1/10) + (−1 × 1/15) + (0 × 1/5) + (1 × 2/15) + (2 × 3/10) + (3 × 1/5)

= −1/5 − 1/15 + 2/15 + 3/5 + 3/5 = 16/15

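Problem 2's results can be verified with exact rational arithmetic; a quick check:

```python
from fractions import Fraction

K = Fraction(1, 15)
pmf = {-2: Fraction(1, 10), -1: K, 0: Fraction(1, 5), 1: 2 * K,
       2: Fraction(3, 10), 3: 3 * K}

total = sum(pmf.values())                                  # must be 1
p_less_than_2 = sum(p for x, p in pmf.items() if x < 2)    # P(X < 2)
p_between = sum(p for x, p in pmf.items() if -2 < x < 2)   # P(−2 < X < 2)
mean = sum(x * p for x, p in pmf.items())                  # E(X)
```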
Problem 3
A random variable X has the following probability function:

X       0    1    2    3    4    5    6     7
P(X)    0    K    2K   2K   3K   K²   2K²   7K² + K
Find (i) K, (ii) P(X < 6), P(X ≥ 6) and P(0 < X < 5), (iii) the distribution function of X, (iv) P(1.5 < X < 4.5 | X > 2), (v) E(3X − 4) and Var(3X − 4), (vi) the smallest value of n for which P(X ≤ n) > 1/2.


Solution:
(i) Since Σ P(X) = 1,

K + 2K + 2K + 3K + K² + 2K² + 7K² + K = 1

10K² + 9K − 1 = 0 ⇒ K = 1/10 or K = −1

As P(X) cannot be negative, K = 1/10.
(ii)
P(X < 6) = P(X = 0) + P(X = 1) + · · · + P(X = 5)

= 1/10 + 2/10 + 2/10 + 3/10 + 1/100 = 81/100

Now,

P(X ≥ 6) = 1 − P(X < 6) = 1 − 81/100 = 19/100

Also,

P(0 < X < 5) = P(X = 1) + P(X = 2) + P(X = 3) + P(X = 4)

= K + 2K + 2K + 3K = 8K = 8/10 = 4/5

(iii) The distribution function of X is given by F(x) = P(X ≤ x):

X = x    P(X = x)    F(x) = P(X ≤ x)
0        0           0
1        1/10        1/10
2        2/10        3/10
3        2/10        5/10
4        3/10        8/10
5        1/100       81/100
6        2/100       83/100
7        17/100      1

(iv)

P(1.5 < X < 4.5 | X > 2) = [P(X = 3) + P(X = 4)]/[1 − (P(X = 0) + P(X = 1) + P(X = 2))]

= (5/10)/(1 − 3/10) = (5/10)/(7/10) = 5/7

(v)

E(X) = Σ x p(x) = 1 × 1/10 + 2 × 2/10 + 3 × 2/10 + 4 × 3/10 + 5 × 1/100 + 6 × 2/100 + 7 × 17/100 = 3.66

E(X²) = Σ x² p(x) = 1 × 1/10 + 4 × 2/10 + 9 × 2/10 + 16 × 3/10 + 25 × 1/100 + 36 × 2/100 + 49 × 17/100 = 16.8

Var(X) = E(X²) − [E(X)]² = 16.8 − (3.66)² = 3.4044

∴ E(3X − 4) = 3E(X) − 4 = 3 × 3.66 − 4 = 6.98 and Var(3X − 4) = 9 Var(X) = 30.6396

(vi) Since F(3) = 5/10 = 1/2 and F(4) = 8/10 > 1/2, the smallest value of n for which P(X ≤ n) > 1/2 is n = 4.
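A quick exact-arithmetic check of Problem 3's K, P(X < 6) and part (vi):

```python
from fractions import Fraction

K = Fraction(1, 10)
pmf = {0: Fraction(0), 1: K, 2: 2 * K, 3: 2 * K, 4: 3 * K,
       5: K ** 2, 6: 2 * K ** 2, 7: 7 * K ** 2 + K}

p_less_than_6 = sum(p for x, p in pmf.items() if x < 6)

# smallest n with cumulative probability strictly above 1/2
cdf, n = Fraction(0), None
for x in sorted(pmf):
    cdf += pmf[x]
    if cdf > Fraction(1, 2):
        n = x
        break
```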



Problem 4 The probability mass function of random variable X is defined as

P(X = 0) = 3C², P(X = 1) = 4C − 10C², P(X = 2) = 5C − 1,

where C > 0, and P (X = r) = 0 if r ̸= 0, 1, 2.


Find (i) the value of C, (ii) P(0 < X < 2 | X > 0), (iii) the distribution function of X, (iv) the largest value of x for which F(x) < 1/2.

Solution:
(i) Since Σ_x P(x) = 1,

P(0) + P(1) + P(2) = 1

3C² + (4C − 10C²) + (5C − 1) = 1

7C² − 9C + 2 = 0 ⇒ C = 1 or C = 2/7

Since C = 1 is not applicable (it would make P(X = 1) negative), we take C = 2/7.

The probability distribution is

X       0       1       2
P(X)    12/49   16/49   21/49

(ii)

P(0 < X < 2 | X > 0) = P((0 < X < 2) ∩ (X > 0))/P(X > 0)

= P(0 < X < 2)/P(X > 0) = P(X = 1)/[P(X = 1) + P(X = 2)]

= (16/49)/(16/49 + 21/49) = 16/37

(iii) The distribution function of X is F(x) = P(X ≤ x):

X    F(x) = P(X ≤ x)
0    F(0) = 12/49 ≈ 0.24
1    F(1) = P(X = 0) + P(X = 1) = 12/49 + 16/49 = 28/49 ≈ 0.57
2    F(2) = P(X = 0) + P(X = 1) + P(X = 2) = 12/49 + 16/49 + 21/49 = 1

(iv) The largest value of x for which F(x) = P(X ≤ x) < 1/2 is x = 0.



Problem 5
If

P(x) = x/15, x = 1, 2, 3, 4, 5;  P(x) = 0, elsewhere,

find (i) P(X = 1 or 2) and (ii) P(1/2 < X < 5/2 | X > 1).
Solution:
(i)
P(X = 1 or 2) = P(X = 1) + P(X = 2) = 1/15 + 2/15 = 3/15 = 1/5

(ii)

P(1/2 < X < 5/2 | X > 1) = P((1/2 < X < 5/2) ∩ (X > 1))/P(X > 1)

= P(X = 2)/P(X > 1) = (2/15)/(1 − P(X = 1)) = (2/15)/(14/15) = 2/14 = 1/7

Problem 6
A continuous random variable X has a probability density function

f(x) = 3x², 0 ≤ x ≤ 1.

Find a such that P (X ≤ a) = P (X > a).


Solution:
Since P(X ≤ a) = P(X > a), each must equal 1/2, because the total probability is 1.

P(X ≤ a) = 1/2

∫_0^a f(x) dx = 1/2

∫_0^a 3x² dx = 1/2

[x³]_0^a = a³ = 1/2

∴ a³ = 1/2, a = (1/2)^{1/3}
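Problem 6's answer can be checked numerically through the cdf F(x) = x³ implied by f(x) = 3x²:

```python
a = 0.5 ** (1.0 / 3.0)        # the value found above, a = (1/2)^(1/3)

def F(x):
    """cdf of f(x) = 3x² on [0, 1]: F(x) = x³."""
    return x ** 3

left = F(a)                   # P(X ≤ a)
right = 1.0 - F(a)            # P(X > a)
```

Both halves equal 1/2, so a splits the distribution evenly.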

Problem 7
A random variable X has the p.d.f. f(x) given by

f(x) = Cx e^{−x}, x > 0;  f(x) = 0, x ≤ 0.

Find the value of C and the cumulative distribution function of X.


Solution:
Since ∫_{−∞}^{∞} f(x) dx = 1,

∫_0^∞ Cx e^{−x} dx = 1

C [−(x + 1)e^{−x}]_0^∞ = 1

C [0 − (−1)] = 1 ⇒ C = 1

Thus, f(x) = x e^{−x} for x > 0 and f(x) = 0 for x ≤ 0.

The cumulative distribution function of X is

F(x) = ∫_0^x f(t) dt = ∫_0^x t e^{−t} dt = [−t e^{−t} − e^{−t}]_0^x = −x e^{−x} − e^{−x} + 1

= 1 − (1 + x)e^{−x}, x > 0


Problem 8
If a random variable X has the p.d.f.

f(x) = (1/2)(x + 1), −1 < x < 1;  f(x) = 0, otherwise,

find the mean and variance of X.


Solution:

µ = E[X] = ∫_{−1}^{1} x f(x) dx = (1/2) ∫_{−1}^{1} x(x + 1) dx = (1/2) ∫_{−1}^{1} (x² + x) dx

= (1/2) [x³/3 + x²/2]_{−1}^{1} = (1/2)[(1/3 + 1/2) − (−1/3 + 1/2)] = (1/2)(2/3) = 1/3

Now,

µ′₂ = E[X²] = ∫_{−1}^{1} x² f(x) dx = (1/2) ∫_{−1}^{1} (x³ + x²) dx

= (1/2) [x⁴/4 + x³/3]_{−1}^{1} = (1/2)[(1/4 + 1/3) − (1/4 − 1/3)] = (1/2)(2/3) = 1/3

Hence,

Variance = µ′₂ − µ² = 1/3 − (1/3)² = 1/3 − 1/9 = (3 − 1)/9 = 2/9
= 9

Problem 9
A continuous random variable X that can assume any value between X = 2 and
X = 5 has a probability density function given by f (x) = k(1 + x). Find P (X < 4).
Solution:
Given X is a continuous random variable whose pdf is

f(x) = k(1 + x), 2 < x < 5;  f(x) = 0, otherwise.

Since ∫_{−∞}^{∞} f(x) dx = 1 ⇒ ∫_2^5 k(1 + x) dx = 1,

k [(1 + x)²/2]_2^5 = 1 ⇒ k [(1 + 5)²/2 − (1 + 2)²/2] = 1

k [36/2 − 9/2] = 1 ⇒ k (27/2) = 1 ⇒ k = 2/27.

∴ f(x) = (2/27)(1 + x), 2 < x < 5;  f(x) = 0, otherwise.

P(X < 4) = ∫_2^4 (2/27)(1 + x) dx = (2/27) [(1 + x)²/2]_2^4 = (2/27)[(1 + 4)²/2 − (1 + 2)²/2] = (2/27)(25/2 − 9/2) = (2/27)(8) = 16/27.

Problem 10
A random variable X has density function given by

f(x) = 2e^{−2x}, x ≥ 0;  f(x) = 0, x < 0.

Find the m.g.f.


Solution:
M_X(t) = E(e^{tX}) = ∫_0^∞ e^{tx} f(x) dx = ∫_0^∞ e^{tx} · 2e^{−2x} dx = 2 ∫_0^∞ e^{(t−2)x} dx.


For convergence we need t < 2. Then

M_X(t) = 2 [e^{(t−2)x}/(t − 2)]_0^∞ = 2/(2 − t), t < 2.

Problem 11
The pdf of a random variable X is given by

f(x) = 2x, 0 ≤ x ≤ b;  f(x) = 0, otherwise.

For what value of b is f(x) a valid pdf? Also find the cdf of X with the above pdf.
Solution:
Since ∫_{−∞}^{∞} f(x) dx = 1 ⇒ ∫_0^b 2x dx = 1,

[x²]_0^b = 1 ⇒ b² − 0 = 1 ⇒ b = 1

∴ f(x) = 2x, 0 ≤ x ≤ 1;  f(x) = 0, otherwise.

For 0 ≤ x ≤ 1: F(x) = P(X ≤ x) = ∫_{−∞}^{x} f(t) dt = ∫_0^x 2t dt = [t²]_0^x = x²

For x < 0: F(x) = ∫_{−∞}^{x} 0 dt = 0

For x > 1: F(x) = ∫_{−∞}^{0} 0 dt + ∫_0^1 2t dt + ∫_1^x 0 dt = [t²]_0^1 = 1

F(x) = 0, x < 0;  F(x) = x², 0 ≤ x ≤ 1;  F(x) = 1, x > 1.

Problem 12:
A random variable X has density function

f(x) = K/(1 + x²), −∞ < x < ∞.

Determine K and the distribution function. Evaluate the probability P(X ≥ 0).
Solution:


Since ∫_{−∞}^{∞} f(x) dx = 1,

∫_{−∞}^{∞} K/(1 + x²) dx = 1 ⇒ K [tan⁻¹ x]_{−∞}^{∞} = 1 ⇒ K [π/2 − (−π/2)] = 1 ⇒ Kπ = 1 ⇒ K = 1/π

F(x) = ∫_{−∞}^{x} f(t) dt = ∫_{−∞}^{x} (1/π) · 1/(1 + t²) dt = (1/π) [tan⁻¹ x − (−π/2)]

F(x) = (1/π) [π/2 + tan⁻¹ x], −∞ < x < ∞

P(X ≥ 0) = ∫_0^∞ (1/π) · 1/(1 + x²) dx = (1/π) [tan⁻¹ x]_0^∞ = (1/π)(π/2 − tan⁻¹ 0) = 1/2
Problem 13:
If X has the probability density function f(x) = K e^{−3x}, x > 0; f(x) = 0, otherwise, find K, P(0.5 ≤ X ≤ 1) and the mean of X.
Solution:
Since ∫_{−∞}^{∞} f(x) dx = 1,

∫_0^∞ K e^{−3x} dx = 1

K [e^{−3x}/(−3)]_0^∞ = 1 ⇒ K · (1/3) = 1 ⇒ K = 3

P(0.5 ≤ X ≤ 1) = ∫_{0.5}^{1} 3e^{−3x} dx = 3 [e^{−3x}/(−3)]_{0.5}^{1} = e^{−1.5} − e^{−3}

Mean of X = E(X) = ∫_0^∞ x f(x) dx = ∫_0^∞ 3x e^{−3x} dx

= 3 [−x e^{−3x}/3 − e^{−3x}/9]_0^∞ = 3 [0 − (−1/9)] = 1/3

Hence, the mean of X is E(X) = 1/3.


Problem 14: If X is a continuous random variable with pdf given by

f(x) = Kx in 0 ≤ x ≤ 2;  f(x) = 2K in 2 ≤ x ≤ 4;  f(x) = 6K − Kx in 4 ≤ x ≤ 6;  f(x) = 0 elsewhere.

Find the value of K and also the cdf F(x).


Solution:
Since ∫_{−∞}^{∞} f(x) dx = 1,

∫_0^2 Kx dx + ∫_2^4 2K dx + ∫_4^6 (6K − Kx) dx = 1

K [x²/2]_0^2 + 2K [x]_2^4 + K [6x − x²/2]_4^6 = 1

K(4/2) + 2K(4 − 2) + K[(36 − 18) − (24 − 8)] = 1 ⇒ K[2 + 4 + 2] = 1 ⇒ 8K = 1 ⇒ K = 1/8

We know that F(x) = ∫_{−∞}^{x} f(t) dt.

If x < 0, then F(x) = 0.

If 0 ≤ x ≤ 2, then

F(x) = ∫_0^x Kt dt = (1/8)(x²/2) = x²/16

If 2 ≤ x < 4, then

F(x) = ∫_0^2 Kt dt + ∫_2^x 2K dt = 1/4 + (x − 2)/4 = (x − 1)/4

If 4 ≤ x ≤ 6, then

F(x) = ∫_0^2 Kt dt + ∫_2^4 2K dt + ∫_4^x (6K − Kt) dt

= 1/4 + 1/2 + (1/8)[6t − t²/2]_4^x

= 3/4 + (1/8)[(6x − x²/2) − (24 − 8)] = (−x² + 12x − 20)/16

If x > 6, then F(x) = 1.

F(x) = 0, x < 0;  x²/16, 0 ≤ x ≤ 2;  (x − 1)/4, 2 ≤ x < 4;  (−x² + 12x − 20)/16, 4 ≤ x ≤ 6;  1, x ≥ 6.

Problem 15:
A random variable X has the p.d.f.

f(x) = 2x, 0 < x < 1;  f(x) = 0, otherwise.

Find: (i) P(X < 1/2), (ii) P(1/4 < X < 1/2), (iii) P(X > 3/4 | X > 1/2).
Solution:
(i)
P(X < 1/2) = ∫_0^{1/2} 2x dx = [x²]_0^{1/2} = 1/4

(ii)
P(1/4 < X < 1/2) = ∫_{1/4}^{1/2} 2x dx = [x²]_{1/4}^{1/2} = 1/4 − 1/16 = 3/16

(iii)
P(X > 3/4) = ∫_{3/4}^{1} 2x dx = [x²]_{3/4}^{1} = 1 − 9/16 = 7/16

P(X > 1/2) = ∫_{1/2}^{1} 2x dx = [x²]_{1/2}^{1} = 1 − 1/4 = 3/4

P(X > 3/4 | X > 1/2) = P(X > 3/4)/P(X > 1/2) = (7/16)/(3/4) = 7/12
Problem 16:
Let the random variable X have the p.d.f.

f(x) = (1/2) e^{−x/2}, x > 0;  f(x) = 0, otherwise.

Find the moment generating function, mean and variance of X.

Solution:

M_X(t) = E(e^{tX}) = ∫_{−∞}^{∞} e^{tx} f(x) dx = (1/2) ∫_0^∞ e^{tx} e^{−x/2} dx = (1/2) ∫_0^∞ e^{x(t − 1/2)} dx

= (1/2) [e^{x(t−1/2)}/(t − 1/2)]_0^∞ = 1/(1 − 2t), if t < 1/2

E(X) = [d/dt M_X(t)]_{t=0} = [2/(1 − 2t)²]_{t=0} = 2

E(X²) = [d²/dt² M_X(t)]_{t=0} = [8/(1 − 2t)³]_{t=0} = 8

Var(X) = E(X²) − [E(X)]² = 8 − 4 = 4

Moment Generating Function (MGF) and Moments


The Moment Generating Function (MGF) of a random variable X is a function
that helps summarize all the moments (expected values of powers) of X. It is defined as
the expected value of the exponential function etX :

M_X(t) = E[e^{tX}]

where t is a real number for which this expectation exists (i.e., the integral or sum converges).

Uses of MGF
• To generate moments: The MGF encodes all moments of X. By differentiating
the MGF with respect to t and evaluating at t = 0, we can find the moments (mean,
variance, skewness, etc.).

• To characterize distributions: The MGF uniquely determines the probability


distribution if it exists in an open interval around 0.

• To simplify calculations: Especially useful for sums of independent random


variables, since the MGF of a sum is the product of individual MGFs.

Moments of a Random Variable:


Moments are quantitative measures related to the shape of the probability distribution of a random variable. They provide information about characteristics like location (mean), spread (variance), skewness, and kurtosis.
Types of Moments

1. Raw Moments (Moments about the origin): The nth raw moment of X is
defined as:

µ′n = E[X n ]

• The first raw moment µ′1 = E[X] is the mean or expected value.

• The second raw moment µ′2 = E[X 2 ] is used to find variance and spread.

2. Central Moments (Moments about the mean): The nth central moment of X
is:

µn = E[(X − µ)n ]

where µ = E[X] is the mean.

• The first central moment µ1 = 0 always.

• The second central moment µ2 = Var(X) = E[(X − µ)2 ] is the variance, which
measures spread.

• The third central moment measures skewness (asymmetry).

• The fourth central moment measures kurtosis (peakedness or tail behavior).


MGF for Different Types of Random Variables


1. For Discrete Random Variables: If X is discrete with probability mass function
(PMF) p(x), the MGF is defined as:

M_X(t) = E[e^{tX}] = Σ_x e^{tx} p(x)

where the sum is over all possible values of X.

2. For Continuous Random Variables: If X is continuous with probability density


function (PDF) f (x), the MGF is:
M_X(t) = E[e^{tX}] = ∫_{−∞}^{∞} e^{tx} f(x) dx

How to find moments from MGF?


The Moment Generating Function (MGF) of a random variable X is defined as:

M_X(t) = E[e^{tX}], for all t in some neighborhood of 0.

The MGF is very useful because it can be used to compute the moments of a random
variable. The nth moment about the origin (also called the raw moment) is obtained by
differentiating the MGF n times with respect to t, and then evaluating at t = 0:

E[Xⁿ] = M_X^{(n)}(0) = [dⁿ/dtⁿ M_X(t)]_{t=0}.

That is:

• The 1st derivative of MX (t) at t = 0 gives the mean.

• The 2nd derivative of MX (t) at t = 0 gives the second raw moment E[X 2 ].

• Higher order derivatives give higher order moments.

Examples

1. Mean (First Moment):


E[X] = MX′ (0).

2. Variance: Variance is defined as:

Var(X) = E[X 2 ] − (E[X])2 .


Using the MGF:

Var(X) = M″_X(0) − [M′_X(0)]².
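The derivative recipe above can be imitated numerically with finite differences; a sketch using the MGF M(t) = 1/(1 − 2t) from Problem 16 (the step size h is an arbitrary choice):

```python
def M(t):
    """MGF of Problem 16 above: M(t) = 1/(1 - 2t), valid for t < 1/2."""
    return 1.0 / (1.0 - 2.0 * t)

h = 1e-5
m1 = (M(h) - M(-h)) / (2 * h)                  # central difference ≈ M'(0) = E[X]
m2 = (M(h) - 2 * M(0.0) + M(-h)) / h ** 2      # ≈ M''(0) = E[X²]
variance = m2 - m1 ** 2                        # Var(X) = M''(0) − [M'(0)]²
```

The approximations recover E[X] ≈ 2, E[X²] ≈ 8 and Var(X) ≈ 4, matching the exact derivative calculation.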

Skewness and Kurtosis from Raw Moments


Raw Moments: The rth raw moment of a random variable X is defined as:

µ′r = E[X r ], r = 1, 2, 3, . . .

Thus,
µ′1 = E[X], µ′2 = E[X 2 ], µ′3 = E[X 3 ], µ′4 = E[X 4 ].

Central Moments in terms of Raw Moments

• Mean: µ = µ′1 = E[X].

• Variance (second central moment):

µ2 = E[(X − µ)2 ] = µ′2 − (µ′1 )2 .

• Third central moment:

µ3 = E[(X − µ)3 ] = µ′3 − 3µ′1 µ′2 + 2(µ′1 )3 .

• Fourth central moment:

µ4 = E[(X − µ)4 ] = µ′4 − 4µ′1 µ′3 + 6(µ′1 )2 µ′2 − 3(µ′1 )4 .

Definition of Skewness

Skewness is a measure of the asymmetry of a probability distribution about its mean.


It is defined as the normalized third central moment:

Skewness(X) = γ₁ = µ₃/µ₂^{3/2},

where µ2 is the variance.


In terms of raw moments:

Skewness(X) = [µ′₃ − 3µ′₁µ′₂ + 2(µ′₁)³]/[µ′₂ − (µ′₁)²]^{3/2}.


Definition of Kurtosis

Kurtosis is a measure of the peakedness or tail heaviness of a distribution compared


to the normal distribution. It is defined as the normalized fourth central moment:

Kurtosis(X) = γ₂ = µ₄/µ₂².

In terms of raw moments:

Kurtosis(X) = [µ′₄ − 4µ′₁µ′₃ + 6(µ′₁)²µ′₂ − 3(µ′₁)⁴]/[µ′₂ − (µ′₁)²]².

Interpretation

• For skewness:

– γ1 = 0: symmetric distribution.
– γ1 > 0: positively skewed (right tail longer).
– γ1 < 0: negatively skewed (left tail longer).

• For kurtosis:

– γ2 = 3: mesokurtic (normal-like).
– γ2 > 3: leptokurtic (sharper peak, heavier tails).
– γ2 < 3: platykurtic (flatter peak, lighter tails).
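The raw-moment formulas for γ₁ and γ₂ can be exercised on a concrete distribution; a sketch using a fair die (symmetric, hence zero skewness, and platykurtic):

```python
# raw moments of a fair die, then γ1 and γ2 via the central-moment formulas above
values = range(1, 7)
mu_raw = [sum(x ** r for x in values) / 6 for r in range(5)]   # mu_raw[r] = E[X^r]

m = mu_raw[1]                                                  # mean
mu2 = mu_raw[2] - m ** 2                                       # variance
mu3 = mu_raw[3] - 3 * m * mu_raw[2] + 2 * m ** 3               # third central moment
mu4 = mu_raw[4] - 4 * m * mu_raw[3] + 6 * m ** 2 * mu_raw[2] - 3 * m ** 4

gamma1 = mu3 / mu2 ** 1.5      # skewness: 0, the die is symmetric
gamma2 = mu4 / mu2 ** 2        # kurtosis: 2121/1225 ≈ 1.73 < 3, platykurtic
```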

Figure 1: Skewness and Kurtosis


2 Probability Distributions
2.1 Binomial Distribution
Binomial Distribution is a probability distribution used to model the number of successes
in a fixed number of independent trials, where each trial has only two possible outcomes:
success or failure. This distribution is useful for calculating the probability of a specific
number of successes in scenarios like flipping coins, quality control, or survey predictions.
The binomial distribution is based on Bernoulli trials, where each trial has an independent and identical chance of success. The probability distribution of a Bernoulli trial is called the Bernoulli distribution.
Dichotomous Experiments An experiment is said to be a dichotomous experiment if it results in only two possible outcomes: success or failure. Examples include tossing a coin (Head or Tail), checking a product (Defective or Non-defective), etc.
Bernoulli Trials A trial is called a Bernoulli trial if:

• The trial has only two outcomes: success or failure.

• The probability of success remains constant for every trial.

• The trials are independent of each other.

The probability distribution of a Bernoulli trial is called the Bernoulli Distribution.


Conditions for Binomial Distribution The Binomial distribution can be used in
scenarios where the following conditions are satisfied:

1. Fixed Number of Trials: There is a set number of trials or experiments (denoted


by n), such as flipping a coin 10 times.

2. Two Possible Outcomes: Each trial has only two possible outcomes, often labeled as “success” and “failure.” For example, getting heads or tails in a coin flip.

3. Independent Trials: The outcome of each trial is independent of the others,


meaning the result of one trial does not affect the result of another.

4. Constant Probability: The probability of success (denoted by p) remains the


same for each trial. For example, if you are flipping a fair coin, the probability of
getting heads is always 0.5.

Definition of Binomial Distribution A random variable X is said to follow the binomial distribution if it assumes only non-negative values and its probability mass function is given by

P(X = x) = p(x) = nCx p^x q^{n−x}, x = 0, 1, 2, . . . , n; q = 1 − p;  p(x) = 0, otherwise

Notation: X ∼ B(n, p), read as: X follows the binomial distribution with parameters n and p.
Problem 1 Find m.g.f. of Binomial distribution and find its mean and variance.
Solution M.G.F. of the Binomial distribution:

M_X(t) = E[e^{tX}] = Σ_{x=0}^{n} e^{tx} P(X = x)

= Σ_{x=0}^{n} nCx p^x q^{n−x} e^{tx}

= Σ_{x=0}^{n} nCx (pe^t)^x q^{n−x}

By the binomial theorem,

M_X(t) = (q + pe^t)^n

Binomial Theorem
The Binomial Theorem provides a way to expand expressions of the form (x + y)^n, where n is a non-negative integer. It states that

(x + y)^n = Σ_{k=0}^{n} nCk x^{n−k} y^k,

where nCk = n!/[k!(n − k)!], k = 0, 1, 2, . . . , n.
Mean of the Binomial distribution

Mean = E(X) = M′_X(0) = [n(q + pe^t)^{n−1} · pe^t]_{t=0} = np, since q + p = 1

E(X²) = M″_X(0) = [n(n − 1)(q + pe^t)^{n−2} (pe^t)² + npe^t (q + pe^t)^{n−1}]_{t=0}

E(X²) = n(n − 1)p² + np = n²p² + np(1 − p) = n²p² + npq

Variance = E(X²) − [E(X)]² = npq

∴ Mean = np, Variance = npq
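The results Mean = np and Variance = npq can be confirmed directly from the pmf with exact arithmetic; a sketch with assumed parameters n = 10, p = 3/10:

```python
from fractions import Fraction
from math import comb

n, p = 10, Fraction(3, 10)
q = 1 - p
# binomial pmf: P(X = x) = nCx p^x q^(n-x)
pmf = {x: comb(n, x) * p ** x * q ** (n - x) for x in range(n + 1)}

mean = sum(x * px for x, px in pmf.items())
variance = sum(x * x * px for x, px in pmf.items()) - mean ** 2
```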

Note: In a binomial distribution the mean (np) is always greater than the variance (npq), since 0 < q < 1.
