
Adama Science & Technology University

School of Electrical Engineering and Computing


Department of Electronics and Communication
Engineering

Introduction to Communication systems


(ECEG- 3202)

Chapter 4
Noise
Outline
 Review of Probability and Random Variables
 Overview of random processes
 Sources of Noise
 Mathematical frequency domain representation of Noise
 Gaussian and white noise characteristics
 Superposition of Noises, Linear filtering of Noises
 Signal to Noise Ratio, Figure of Merit
 Equivalent Noise resistance of Amplifier and temperature of system
Review of Probability and Random Variables
Review of Probability theory and Random Variables
 Probability theory is based on phenomena that can be modeled by an experiment whose outcome is subject to chance.
Definition: if a random experiment is repeated n times (n trials) and the event A is observed m times (m occurrences), the probability of A is the relative frequency
P(A) = lim_{n→∞} m/n
 When all outcomes are equally likely, this reduces to
P(A) = N_A / N
where N is the number of possible outcomes of the random experiment and N_A is the number of outcomes favorable to the event A.
For example:
A 6-sided die has 6 outcomes, 3 of them are even,
Thus P(even) = 3/6 = 1/2
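As a quick numerical illustration of the relative-frequency definition (a minimal sketch, not from the original slides; Python with NumPy is assumed):

```python
import numpy as np

rng = np.random.default_rng(seed=1)

# Relative-frequency estimate of P(even) for a fair 6-sided die:
# m/n converges to N_A/N = 3/6 = 0.5 as the number of trials n grows.
for n in (100, 10_000, 1_000_000):
    rolls = rng.integers(1, 7, size=n)     # outcomes 1..6
    m = np.count_nonzero(rolls % 2 == 0)   # occurrences of the event "even"
    print(n, m / n)
```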
Review of Probability theory and Random Variables (cont…)
Axiomatic definition of Probability
 A probability law (measure or function) that assigns probabilities to
events such that:
 P(A) ≥ 0
 P(S) =1
 If A and B are disjoint events (mutually exclusive),
i.e. A ∩ B = ∅, then P(A ∪ B) = P(A) + P(B)
Some Useful Properties
 Probability of the impossible event: P(∅) = 0
 Complement of A: P(Ā) = 1 − P(A)
 If A and B are two events, then P(A ∪ B) = P(A) + P(B) − P(A ∩ B)
Cont…
 If the sample space consists of n mutually exclusive events A_1, A_2, …, A_n such that S = A_1 ∪ A_2 ∪ … ∪ A_n, then
P(A_1) + P(A_2) + … + P(A_n) = 1
Conditional Probability
 Let A and B be two events. The probability of event B given that event A has occurred is called the conditional probability:
P(B | A) = P(A ∩ B) / P(A)
 If the occurrence of B has no effect on A, we say A and B are independent events. In this case P(B | A) = P(B).
 Combining both, we get P(A ∩ B) = P(A) P(B) when A and B are independent.
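A small simulation (illustrative only, assuming NumPy) checks the conditional-probability and independence relations for two independent dice:

```python
import numpy as np

rng = np.random.default_rng(seed=2)
n = 1_000_000

die1 = rng.integers(1, 7, size=n)   # two independent dice
die2 = rng.integers(1, 7, size=n)

A = die1 == 6          # event A: first die shows 6
B = die2 % 2 == 0      # event B: second die is even

print(A.mean())        # P(A) ≈ 1/6
print(B[A].mean())     # P(B | A) ≈ P(B) = 1/2, by independence
print((A & B).mean())  # P(A ∩ B) ≈ P(A)·P(B) = 1/12
```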
Random Variables
Definition: A random variable is the assignment of a variable to represent a random experiment.
 X(s) denotes a numerical value for the outcome s.
 When the sample space is a number line, x = s.
 An RV can be defined as a function that maps the sample space of an experiment to the real numbers. Mathematically, an RV is expressed as X: S → R, where X is the RV, S is the sample space, and R is the set of real numbers.
Random Variables(cont…)
 There are two basic types of random variables,
 Discrete Random Variables
 Continuous Random Variables

Discrete Random Variables


 It takes on a finite (or countable) number of values. The probability function associated with it is called the Probability Mass Function (PMF).
 If X is a discrete random variable and the PMF of X is P(x_i) = p_i, then
 0 ≤ p_i ≤ 1
 ∑ p_i = 1, where the sum is taken over all possible values of x
Example: Let S = {0, 1, 2} with the PMF tabulated below. Find the value of P(X = 0).

x        0    1    2
P(X=x)   0.2  0.3  0.5

Solution: P(X = 0) = 0.2
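The PMF above can be checked by sampling (a sketch assuming NumPy; the values and probabilities are the ones from the example):

```python
import numpy as np

rng = np.random.default_rng(seed=3)

values = np.array([0, 1, 2])
pmf    = np.array([0.2, 0.3, 0.5])   # satisfies 0 <= p_i <= 1 and sum(p_i) = 1

# Draw samples and compare empirical frequencies with the PMF.
samples = rng.choice(values, size=100_000, p=pmf)
for v, p in zip(values, pmf):
    print(v, p, np.mean(samples == v))   # empirical frequency ≈ p
```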
Random Variables(cont…)
Continuous Random Variables
 It takes on an uncountably infinite number of values. The probability function associated with it is called the Probability Density Function (PDF).
 If X is a continuous random variable with PDF f(x), then P(a < X < b) = ∫_a^b f(x) dx, and
f(x) ≥ 0 for all x
∫ f(x) dx = 1 over all values of x
Example: if X has a PDF f(x), normalized so that ∫ f(x) dx = 1, then
P(1 < X < 2) = ∫_1^2 f(x) dx
Two Random Variables

CDF: F_{X,Y}(x, y) = P(X ≤ x, Y ≤ y)

Marginal CDFs:
F_X(x) = ∫_{−∞}^{x} ∫_{−∞}^{∞} f_{X,Y}(u, v) dv du
F_Y(y) = ∫_{−∞}^{∞} ∫_{−∞}^{y} f_{X,Y}(u, v) dv du

PDF: f_{X,Y}(x, y) = ∂² F_{X,Y}(x, y) / ∂x ∂y, with ∫_{−∞}^{∞} ∫_{−∞}^{∞} f_{X,Y}(u, v) du dv = 1

Marginal PDFs:
f_X(x) = ∫_{−∞}^{∞} f_{X,Y}(x, v) dv
f_Y(y) = ∫_{−∞}^{∞} f_{X,Y}(u, y) du

Conditional PDF: f_Y(y | x) = f_{X,Y}(x, y) / f_X(x)
Statistical Averages of RV
 The expected value of a random variable is a measure of the average value that the random variable takes in a large number of experiments:
E[X] = ∫_{−∞}^{∞} x f_X(x) dx = μ_X
 Function of a random variable Y = g(X):
E[Y] = ∫_{−∞}^{∞} y f_Y(y) dy,  E[g(X)] = ∫_{−∞}^{∞} g(x) f_X(x) dx
 nth moments:
E[X^n] = ∫_{−∞}^{∞} x^n f_X(x) dx
E[X²] = ∫_{−∞}^{∞} x² f_X(x) dx  (mean-square value of X)
 Central moments:
E[(X − μ_X)^n] = ∫_{−∞}^{∞} (x − μ_X)^n f_X(x) dx
E[(X − μ_X)²] = ∫_{−∞}^{∞} (x − μ_X)² f_X(x) dx = σ_X²  (variance of X)
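To make these averages concrete, the sketch below (illustrative, not from the slides; NumPy assumed, and the Gaussian parameters μ = 1, σ = 2 are arbitrary) estimates the mean, mean-square value, and variance from samples:

```python
import numpy as np

rng = np.random.default_rng(seed=4)

# Gaussian RV with mu = 1, sigma = 2, so E[X^2] = mu^2 + sigma^2 = 5.
x = rng.normal(loc=1.0, scale=2.0, size=1_000_000)

mean        = x.mean()                 # ≈ mu_X = 1
mean_square = np.mean(x**2)            # second moment ≈ 5
variance    = np.mean((x - mean)**2)   # central second moment ≈ sigma_X^2 = 4

print(mean, mean_square, variance)
```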
Overview of random processes
Overview of random processes
Definition: a random process is described as a time-varying random
variable
 A random process may be viewed as a collection of random variables, with time t as a parameter running through all real numbers.
Mean of the random process:
μ_X(t) = E[X(t)] = ∫_{−∞}^{∞} x f_{X(t)}(x) dx
Cont.
Definition: a random process is first-order stationary if its pdf does not change with time.
Definition: the autocorrelation is the expected value of the product of two random variables at different times:
R_X(t_1, t_2) = E[X(t_1) X(t_2)]
 where X(t_1) and X(t_2) are random variables obtained by observing the process at times t_1 and t_2, respectively.
 It is a measure of the degree to which two time samples of the same stochastic process are related.
Definition: the autocovariance of a stationary random process is
C_X(t_1, t_2) = E[(X(t_1) − μ_X)(X(t_2) − μ_X)] = R_X(t_1 − t_2) − μ_X²
Properties of Autocorrelation
Definition: the autocorrelation of a (wide-sense) stationary stochastic process depends only on the time difference τ:
R_X(τ) = E[X(t + τ) X(t)]
 Mean-square value: R_X(0) = E[X²(t)]
 Autocorrelation is an even function: R_X(τ) = R_X(−τ)
 Autocorrelation has its maximum at zero: |R_X(τ)| ≤ R_X(0)
Example
Consider the process X(t) = A cos(ω_c t + Θ), where A and ω_c are constants and the R.V. Θ is uniform in the interval (0, 2π). Find the mean function and autocorrelation function.
Solution
The probability density function of the RV Θ is
f_Θ(θ) = 1/(2π), 0 ≤ θ ≤ 2π
Mean function:
μ_X(t) = E[A cos(ω_c t + Θ)] = (A/2π) ∫_0^{2π} cos(ω_c t + θ) dθ = 0
Example (Cont…)
Autocorrelation function:
R_X(t + τ, t) = E[A² cos(ω_c(t + τ) + Θ) cos(ω_c t + Θ)]
= (A²/2) E[cos(ω_c(2t + τ) + 2Θ)] + (A²/2) cos(ω_c τ)
= (A²/2) cos(ω_c τ)
since the expectation of the double-frequency term over the uniform phase is zero.
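The result can be verified numerically over the ensemble of phases (a sketch assuming NumPy; A, ω_c, t, and τ are arbitrary illustrative values):

```python
import numpy as np

rng = np.random.default_rng(seed=5)

A, wc = 2.0, 2 * np.pi * 5.0     # amplitude and angular frequency (illustrative)
t, tau = 0.3, 0.1                # observation time and lag

# Ensemble over the random phase Theta ~ Uniform(0, 2*pi).
theta = rng.uniform(0.0, 2 * np.pi, size=1_000_000)
x1 = A * np.cos(wc * t + theta)
x2 = A * np.cos(wc * (t + tau) + theta)

print(x1.mean())                       # mean function ≈ 0
print(np.mean(x1 * x2))                # ensemble autocorrelation
print(0.5 * A**2 * np.cos(wc * tau))   # theory: (A^2/2) cos(wc*tau)
```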
Cross-correlation
Two random processes X(t) and Y(t) have the cross-correlation
R_XY(t, u) = E[X(t) Y(u)]
For jointly wide-sense stationary processes the correlations depend only on the time difference τ:
R_X(t, t − τ) = R_X(τ),  R_Y(u, u − τ) = R_Y(τ),  R_XY(t, u) = R_XY(τ)
Power Spectral Density
Definition: the Fourier transform of the autocorrelation function is called the power spectral density:
S_X(f) = ∫_{−∞}^{∞} R_X(τ) e^{−j2πfτ} dτ
R_X(τ) = ∫_{−∞}^{∞} S_X(f) e^{j2πfτ} df
 Consider the units of X(t): Volts or Amperes.
 Autocorrelation is the projection of X(t) onto itself.
 The resulting units are Watts (normalized to 1 Ohm).
Properties of PSD
 Zero-frequency value of PSD: S_X(0) = ∫_{−∞}^{∞} R_X(τ) dτ
 Mean-square value: E[X²(t)] = ∫_{−∞}^{∞} S_X(f) df
 PSD is non-negative: S_X(f) ≥ 0
 PSD of a real-valued RP is even: S_X(−f) = S_X(f)
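A numerical check of the mean-square property (a sketch assuming NumPy, not from the slides): the periodogram estimate of S_X(f) for unit-variance white noise integrates to E[X²] ≈ 1.

```python
import numpy as np

rng = np.random.default_rng(seed=6)

fs, n = 1000.0, 2**16               # sampling rate (Hz) and number of samples
x = rng.normal(0.0, 1.0, size=n)    # unit-variance white-noise realization

# Two-sided periodogram, normalized so that the area under S_X(f)
# equals the mean-square value E[X^2].
X = np.fft.fft(x)
psd = np.abs(X)**2 / (n * fs)
df = fs / n                          # frequency resolution

print(np.mean(x**2))                 # E[X^2] ≈ 1
print(np.sum(psd) * df)              # area under the PSD ≈ E[X^2]
```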
Ensemble & Time Average
Mean and autocorrelation can be determined in two ways:

◦ The experiment can be repeated many times and the average taken over all
these functions. Such an average is called ensemble average.
◦ Take any one of these functions as being representative of the ensemble and find the average from a number of samples of this one function. This is called a time average.
Ergodicity & Stationarity
If the time average and ensemble average of a random function are the same, it is said to be ergodic.
A random function is said to be stationary if its statistics do not change as a function of time.
◦ This is also called strict-sense stationarity (vs. wide-sense stationarity).

Any ergodic function is also stationary.
For a stationary signal the time average equals the ensemble average:
⟨x(t)⟩ = x̄
Stationarity is defined as:
p(x_1, x_2, t_1, t_2) = p(x_1, x_2, τ)
◦ where τ = t_2 − t_1
And the autocorrelation function is:
r(τ) = Σ_{x_1, x_2} x_1 x_2 p(x_1, x_2, τ)
Sources of Noise
Noise
 Noise is a random signal that exists in communication systems.
 Noise in electrical terms may be defined as any unwanted introduction of energy tending to interfere with the proper reception and reproduction of transmitted signals.
Cont…
• Practically, we cannot avoid the existence of unwanted signals together with the modulated signal transmitted by the transmitter.
• This unwanted signal is called noise.
• Noise is a random signal that exists in communication systems.
• A random signal cannot be represented by a simple equation.
• The existence of noise degrades the quality of the received signal at the receiver.
Noise effects
• Degrades system performance for both analog and digital systems.
• The receiver cannot understand the sender.
• The receiver cannot function as it should.
• Reduces the efficiency of the communication system.
Sources of noise
1. External noise
 External noise is defined as the type of noise that is generated externally to the receiver.
 External noise is analyzed qualitatively.
 External noise may be classified as:
a) Atmospheric Noise: atmospheric noise, also known as static noise, is a natural source of disturbance caused by lightning, discharges in thunderstorms, and other natural disturbances occurring in nature.
Cont...
b) Extraterrestrial Noise: noise that originates from the sun and outer space. It is subdivided into
i) Solar Noise
ii) Cosmic Noise
Solar noise is the noise that originates from the sun.
The sun radiates a broad spectrum of frequencies, including those used for broadcasting.
The sun is an active star and is constantly changing.
Cont…
 Cosmic Noise
• Distant stars also radiate noise in much the same way as the sun.
• The noise received from them is called black-body noise.
• Noise also comes from distant galaxies in much the same way as it comes from the Milky Way.
Cont…
c) Industrial Noise: sources of industrial noise are automobiles, aircraft, ignition of electric motors, and switching gear.
 A main cause of industrial noise is high-voltage wires. This noise is generally produced by discharges present during operation.
 Noise made by man easily outstrips any other between the frequencies of 1 and 600 MHz.
 This includes such things as car and aircraft ignition, electric motors, switching equipment, leakage from high-voltage lines, etc.
2. Internal Noise

 This is the noise generated by any of the active or passive devices found
in the receiver.
 This type of noise is random and difficult to treat on an individual basis
but can be described statistically.
 Random noise power is proportional to the bandwidth over which it is
measured.
Types of internal noise

Internal noise is the type of noise that is generated internally, i.e., within the communication system or in the receiver.

Internal noises are classified as:

a) Shot Noise: this noise generally arises in active devices due to the random behavior of charge carriers. In an electron tube, shot noise is produced by the random emission of electrons from the cathode.

b) Partition Noise: when the current in a circuit divides between two or more paths, the noise generated is known as partition noise. It is caused by random fluctuations in the division.
Cont…
c) Low- Frequency Noise : They are also known as FLICKER NOISE. These type
of noise are generally observed at a frequency range below few kHz. Power
spectral density of these noise increases with the decrease in frequency. That why
the name is given Low- Frequency Noise.

d) High- Frequency Noise : These noises are also known TRANSIT- TIME Noise.
They are observed in the semi-conductor devices when the transit time of a charge
carrier while crossing a junction is compared with the time period of that signal.

e) Thermal Noise : Thermal Noise are random and often referred as White Noise
or Johnson Noise. Thermal noise are generally observed in the resistor or the
sensitive resistive components of a complex impedance due to the random and
rapid movement of molecules or atoms or electrons.
Mathematical frequency domain representation of Noise
Mathematical frequency domain representation of Noise
 In a communication system, received signals are passed through filters. The filters are characterized in the frequency domain.
 So, to determine the effect of filters on noise, it is required to have a frequency-domain representation of noise.
 Consider a particular sample function of noise within an interval of time duration T (i.e., −T/2 to +T/2), as shown in figure 1 (a) and (b).
Cont…
 The noise signal of fig 1(b) appears to be periodic with period T and can be represented by a Fourier series.
 The Fourier series is written as [you can refer to the proof from textbooks]:
n(t) = a_0 + Σ_{k=1}^{∞} [a_k cos(2πk t/T) + b_k sin(2πk t/T)]
 Assuming that the noise signal has no d.c. component, a_0 = 0.
 Then the above equation can be rewritten as:
n(t) = Σ_{k=1}^{∞} [a_k cos(2πk t/T) + b_k sin(2πk t/T)] ……(1)
n(t) = Σ_{k=1}^{∞} [a_k cos(2πkΔf t) + b_k sin(2πkΔf t)], with Δf = 1/T ……(2)
 The polar Fourier series is represented as
n(t) = Σ_{k=1}^{∞} c_k cos(2πkΔf t + θ_k) ……(3)
Cont…
 where c_k = √(a_k² + b_k²) and θ_k = −tan⁻¹(b_k/a_k) ……(4)
 With the above representation we can write the average power over the interval as
P = (1/T) ∫_{−T/2}^{T/2} n²(t) dt ……(5)(6)
 By Parseval's theorem, this equals the sum of the powers of the individual components:
P = Σ_{k=1}^{∞} c_k²/2 ……(7)
(proof: [Link])
 The above equation is the addition of individual components; the power of each component is then
P_k = c_k²/2 ……(8)
 The average power P is equal to the area under the power spectral density curve, i.e.,
P = ∫_{−∞}^{∞} S_n(f) df ……(9)
 Considering each component individually, the above equation becomes (for a discrete psd):
Cont…
 For the one component at f_k = kΔf,
P_k = c_k²/2 = S_n(f_k) Δf ……(10)
 Putting in the value of Δf = 1/T, we get
S_n(f_k) = T c_k²/2 ……(11)
 By the Fourier series relations,
c_k² = a_k² + b_k² ……(12)
 Then equation (11) becomes
S_n(f_k) = T (a_k² + b_k²)/2 ……(13)
 Equation (11) gives the power spectral density (psd) of the frequency components.
 Since k varies from −∞ to +∞, the power spectrum is double-sided.
 Fig 2 shows the power spectrum of an arbitrary signal.
Cont…
 Figure 1(b) treated the noise signal as periodic, but a real noise signal cannot be periodic. Thus, consider the limit T → ∞.
 Equations (2) and (6) then represent the actual noise signal of figure (a), i.e.,
n(t) = lim_{T→∞} Σ_{k=1}^{∞} [a_k cos(2πkΔf t) + b_k sin(2πkΔf t)] ……(14)
and equation (6) becomes
P = lim_{T→∞} (1/T) ∫_{−T/2}^{T/2} n²(t) dt ……(15)
 These two equations represent the noise in the frequency domain.
 The values of a_k and b_k (or c_k) do not remain fixed, since the noise signal is not deterministic; therefore a_k and b_k become random variables for noise signals.
 We then write A_k in place of a_k and B_k in place of b_k, since they are random variables. Their ensemble averages are zero, i.e.,
E[A_k] = E[B_k] = 0 ……(16)
Gaussian and white noise characteristics
Gaussian noise
 Gaussian noise is a kind of signal noise that has a probability density function (pdf) equal to that of the normal distribution (which is also known as the Gaussian distribution).
 The PDF of a Gaussian random variable z is given by:
f(z) = (1/(σ√(2π))) exp(−(z − μ)²/(2σ²))
where μ is the mean and σ² is the variance.
 A special case is white Gaussian noise, in which the values at any pair of times are identically distributed and statistically independent (and hence uncorrelated). In communication-channel testing and modelling, Gaussian noise is used as additive white noise to generate additive white Gaussian noise (AWGN).
Cont…
Figure: (A) a signal without noise; (B) the same signal with Gaussian noise.
White noise
 Noise in an idealized form is known as WHITE NOISE.
 WHITE NOISE contains all frequency components in equal amounts, just as white light consists of all colors of light.
 If the probability distribution of a white noise is specified by a Gaussian distribution function, it is called white Gaussian noise.
 Since the power density spectrum of thermal and shot noise is independent of frequency, they are referred to as white Gaussian noise.
 The power spectral density of white noise is expressed as
S_n(f) = N_0/2
 Here the factor 1/2 has been included to show that half of the power is associated with the positive frequencies and the remaining half with the negative frequencies, as shown in the figure below.
Cont…
 Gaussian noise means the probability density function of the noise has a Gaussian distribution, which basically defines the probability of the signal having a certain value. White noise, by contrast, means that the signal power is distributed equally over all frequencies (equivalently, its samples are uncorrelated in time).
 Gaussianity refers to the probability distribution with respect to the value, in this context the probability of the signal falling within any particular range of amplitudes, while the term 'white' refers to the way the signal power is distributed (i.e., independently) over time or among frequencies.
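The two properties can be seen separately in simulation (a sketch assuming NumPy; σ is arbitrary): the samples below are Gaussian in amplitude and 'white' in the sense of vanishing autocorrelation at nonzero lags.

```python
import numpy as np

rng = np.random.default_rng(seed=7)

sigma = 0.5
w = rng.normal(0.0, sigma, size=2**16)   # white Gaussian noise: i.i.d. samples

# Gaussian: amplitude statistics follow N(0, sigma^2).
print(w.mean(), w.var())                 # ≈ 0, ≈ sigma^2 = 0.25

# White: autocorrelation ≈ 0 for every nonzero lag.
for lag in (1, 5, 50):
    print(lag, np.mean(w[:-lag] * w[lag:]))
```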
Superposition of Noises, Linear filtering of Noises
Superposition of Noises, Linear filtering of Noises
Superposition of Noises
 Superposition of noises occurs when two or more waves simultaneously pass through a point: the disturbance at the point is given by the sum of the disturbances each wave would produce in the absence of the other waves.
 Principle of superposition: 'For all linear systems, the principle of superposition states that the net response caused by two or more stimuli is the sum of the responses that would have been caused by each stimulus individually.'
Linear filtering of Noises
 Linear filtering of a signal is a controlled scaling of the signal components in the frequency domain. Reducing the components in the center of the frequency domain (the low frequencies) gives the high-frequency components an increased relative importance, and thus high-pass filtering is performed.

[Reading assignment]
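As a small illustration of linear filtering of noise (a sketch, not from the slides; it assumes SciPy is available and uses an arbitrary 4th-order Butterworth low-pass), the output PSD of filtered white noise is shaped as S_out(f) = |H(f)|² S_in(f):

```python
import numpy as np
from scipy import signal

rng = np.random.default_rng(seed=8)

fs = 1000.0
w = rng.normal(0.0, 1.0, size=2**16)      # white-noise input

# Low-pass filter the noise (4th-order Butterworth, 100 Hz cutoff).
b, a = signal.butter(4, 100.0, fs=fs)
y = signal.lfilter(b, a, w)

# Welch PSD estimates of input and output.
f, s_in = signal.welch(w, fs=fs)
_, s_out = signal.welch(y, fs=fs)

# In the passband the ratio S_out/S_in ≈ |H(f)|^2 ≈ 1.
mask = (f > 5) & (f < 80)
print(np.median(s_out[mask] / s_in[mask]))
```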
Signal to Noise Ratio, Figure of Merit
Signal to Noise Ratio
SNR = P_S / P_N
 where P_S is the signal power in watts and P_N is the noise power in watts; in decibels, SNR_dB = 10 log_10(P_S/P_N).
 The Hartley-Shannon theorem (also called Shannon's limit) states that the maximum data rate for a communication channel is determined by the channel's bandwidth and SNR:
C = B log_2(1 + SNR)
 An SNR of zero dB means that the noise power equals the signal power.
 Thus, if a signal voltage V_s is associated with a noise voltage source V_n, then the ratio of signal power to noise power becomes
SNR = P_S / P_N = (V_s / V_n)²
 Because the power spectral density is power per unit bandwidth, for noise of psd N_0 occupying a bandwidth B the above expression may be expressed as SNR = P_S / (N_0 B).
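A short worked example of these relations (illustrative values only, assuming NumPy):

```python
import numpy as np

p_signal = 2e-3          # signal power: 2 mW (illustrative)
p_noise  = 5e-6          # noise power: 5 uW (illustrative)

snr = p_signal / p_noise
print(snr, 10 * np.log10(snr))   # 400, ≈ 26 dB

# Hartley-Shannon limit for a 10 kHz channel at this SNR.
B = 10e3
print(B * np.log2(1 + snr))      # ≈ 86.5 kbit/s
```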
Noise figure/Factor (F)
 Electrical noise is defined as electrical energy of random amplitude, phase, and frequency.
 It is present in the output of every radio receiver.
 The noise is generated primarily within the input stages of the receiver system itself.
 Noise generated at the input and amplified by the receiver's full gain greatly exceeds the noise generated further along the receiver chain.
 The noise performance of a receiver is described by a figure of merit called the noise figure:
F = (S/N)_input / (S/N)_output
NF = 10 log_10(F)  (dB)
Note: power gain in decibels is defined as follows:
Gain_dB = 10 log_10(P_out / P_in)
where P_in is the power applied to the input and P_out is the power from the output.
Equivalent Noise resistance of Amplifier and temperature of system
Equivalent Noise resistance of Amplifier and temperature of system
The available thermal noise power is expressed as:
N = kTB
 The equivalent noise temperature is defined as ''the temperature at which a noisy resistor has to be maintained such that, by connecting the resistor to the input of a noiseless version of the system, it produces the same available noise power at the output of the system as that produced by all the sources of noise in the actual system''. It depends only on system parameters.
T_e = (F − 1) T
where
T = environment temperature (kelvin)
N = noise power (watts)
k = Boltzmann's constant (1.38 × 10⁻²³ J/K)
B = total noise bandwidth (hertz)
T_e = equivalent noise temperature
F = noise factor
Noise Factor of a Device (Friis' formula)
 The noise factor of a device is related to its noise temperature T_e by
F = 1 + T_e / T_0,  where T_0 = 290 K is the reference temperature.
 If several devices are cascaded, the total noise factor can be found with Friis' formula:
F = F_1 + (F_2 − 1)/G_1 + (F_3 − 1)/(G_1 G_2) + … + (F_n − 1)/(G_1 G_2 ⋯ G_{n−1})
where F_n is the noise factor for the n-th device, and G_n is the power gain (linear, not in dB) of the n-th device.
 The first amplifier in a chain usually has the most significant effect on the total noise figure, because the noise figures of the following stages are reduced by the preceding stage gains.
Friis formula for noise factor
 Friis's formula is used to calculate the total noise factor of a cascade of stages, each with its own noise factor and power gain (assuming that the impedances are matched at each stage).
 The total noise factor is given as
F_total = F_1 + (F_2 − 1)/G_1 + (F_3 − 1)/(G_1 G_2) + …
 Friis's formula can be equivalently expressed in terms of noise temperature:
T_e = T_e1 + T_e2/G_1 + T_e3/(G_1 G_2) + …
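A minimal cascade calculator based on Friis' formula (a sketch assuming Python/NumPy; the two-stage chain at the bottom is hypothetical):

```python
import numpy as np

def db_to_linear(x_db):
    """Convert a dB quantity (gain or noise figure) to a linear ratio."""
    return 10 ** (x_db / 10)

def friis_noise_factor(stages_db):
    """Total noise factor of a cascade.

    stages_db: list of (noise_figure_dB, gain_dB) tuples, input stage first.
    Returns the total noise factor F (linear).
    """
    f_total = 0.0
    g_cum = 1.0                           # cumulative gain of preceding stages
    for i, (nf_db, g_db) in enumerate(stages_db):
        f = db_to_linear(nf_db)
        f_total += f if i == 0 else (f - 1.0) / g_cum
        g_cum *= db_to_linear(g_db)
    return f_total

# Hypothetical chain: LNA (NF 1 dB, G 20 dB) then mixer (NF 10 dB, G -6 dB).
F = friis_noise_factor([(1.0, 20.0), (10.0, -6.0)])
print(F, 10 * np.log10(F))                # total noise figure ≈ 1.3 dB
```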
Example-1
1. A receiver has a noise power bandwidth of 15 kHz. A resistor that matches the receiver input impedance is connected across the antenna terminals. What is the noise power contributed by this resistor in the receiver bandwidth at temperature T?
Given
B = 15 kHz, k = 1.38 × 10⁻²³ J/K
Solution
Noise power N = kTB = 1.38 × 10⁻²³ × T × 15 × 10³ W
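Numerical evaluation of N = kTB (a sketch; the slide's temperature value is taken here as T = 290 K purely as an illustrative assumption):

```python
k = 1.38e-23   # Boltzmann's constant, J/K
T = 290.0      # temperature, kelvin (assumed for illustration)
B = 15e3       # receiver noise bandwidth, Hz

N = k * T * B  # available thermal noise power
print(N)       # ≈ 6.0e-17 W
```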
Example-2
2. Consider the communication system shown in Fig. 1 and the related quantities referred to each element in the system. The system is a chain composed of a receiving antenna at a temperature Ta = 350 K, a first amplifier of gain Gd1 = 12 dB and noise figure F1 = 4 dB, and a second amplifier of gain Gd2 = 30 dB and noise figure F2 = 9 dB. Compute the useful signal power at the input of the receiver required to obtain an output SNR = 45 dB. Assume an equivalent noise bandwidth BN = 10 MHz.
Solution
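A sketch of the standard chain computation for this example (assuming the usual T0 = 290 K reference temperature, which the slides do not state explicitly):

```python
import numpy as np

k, T0 = 1.38e-23, 290.0      # Boltzmann's constant; reference temperature (assumed)

Ta = 350.0                   # antenna temperature, K
G1 = 10 ** (12 / 10)         # first-amplifier gain (linear)
F1 = 10 ** (4 / 10)          # first-amplifier noise factor (linear)
F2 = 10 ** (9 / 10)          # second-amplifier noise factor (linear)
B  = 10e6                    # equivalent noise bandwidth, Hz

# Friis cascade noise factor and equivalent noise temperature.
F    = F1 + (F2 - 1) / G1
Te   = (F - 1) * T0          # receiver equivalent noise temperature ≈ 565 K
Tsys = Ta + Te               # system noise temperature ≈ 915 K

# Input-referred noise power, and the signal power needed for SNR = 45 dB.
N_in = k * Tsys * B          # ≈ 1.26e-13 W
S_in = 10 ** (45 / 10) * N_in
print(Te, Tsys, N_in, S_in)  # S_in ≈ 4.0e-9 W (about -54 dBm)
```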
Exercise
3. Assume a receiving system for TV signals composed of the following chain of elements after the receiving antenna:
1) a pre-amplifier with gain Gt = 20 dB and noise figure F1 = 6 dB
2) a coaxial cable with an attenuation A2 = 3 dB
3) an amplifier with gain G3 = 60 dB and noise figure F3 = 16 dB
Compute the total noise figure of the chain shown in Fig. 2, and the total noise figure obtained when removing the pre-amplifier and increasing the amplifier gain by 20 dB, with the same noise figure.
Any questions?
