Signal Detection Theory Elements

Signal detection theory models how a receiver detects signals transmitted over a noisy channel. The model includes an information source, a modulator, a channel that adds noise, a sampler, and a receiver that decides which message was sent. In the binary case, with two messages, the receiver compares a sample to a threshold and decides which hypothesis is true. The minimum risk criterion chooses the threshold to minimize the average cost of the possible detection errors; the ideal observer criterion minimizes the probability of error; and the maximum likelihood criterion chooses the hypothesis under which the sample is most likely.


Chapter III.

Elements of Signal Detection Theory


The model for signal detection

Figure 1: Signal detection model

▶ Contents:
  ▶ Information source: generates messages a_n with probabilities p(a_n)
  ▶ Modulator: transmits a signal s_n(t) for message a_n
  ▶ Channel: adds random noise
  ▶ Sampler: takes samples from the signal s_n(t)
  ▶ Receiver: decides which message a_n has been transmitted
Example

▶ A simple case (binary):
  ▶ two messages, a0 and a1
  ▶ the signals are constants (e.g. 0 for a0, 5 for a1)
  ▶ take just one sample
  ▶ decide by comparing the sample with a threshold
▶ General case: many messages, various signals, more samples (or continuous observation)
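The binary example above can be sketched as a small simulation (a minimal sketch; the signal levels 0 and 5, N(0, 1) noise, and the midpoint threshold 2.5 are the assumed parameters of the example):

```python
import random

def detect(sample, threshold=2.5):
    """Decide D1 if the sample is above the threshold, else D0."""
    return 1 if sample > threshold else 0

random.seed(0)
errors = 0
trials = 10_000
for _ in range(trials):
    a = random.randint(0, 1)        # message a0 or a1, equally likely
    s = 0.0 if a == 0 else 5.0      # constant signal for each message
    r = s + random.gauss(0.0, 1.0)  # channel adds N(0, 1) noise
    if detect(r) != a:
        errors += 1
print(errors / trials)              # empirical probability of error
```

With the signals 5σ apart and the threshold at the midpoint, decision errors are rare.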
Detection for the binary case

▶ Receiver guesses between two hypotheses:
  ▶ H0: a0 has been transmitted
  ▶ H1: a1 has been transmitted
▶ The sample is r = s + n
  ▶ with more samples, these quantities become vectors: r = s + n, where r, s and n are vectors of samples
▶ Decision based on regions:
  ▶ if r is in region R0, then decide D0: a0 was sent
  ▶ if r is in region R1, then decide D1: a1 was sent
  ▶ for a single sample, the regions are intervals: below/above the threshold
  ▶ for 2 samples, the regions are areas in a 2D plane, etc.
▶ Possible errors:
  ▶ false alarm: a0 was sent, but D1 was decided
    ▶ probability: P(D1 ∩ a0)
  ▶ miss: a1 was sent, but D0 was decided
    ▶ probability: P(D0 ∩ a1)
Minimum risk (cost) criterion

▶ How do we choose the threshold? We need criteria.
  ▶ in general: how do we delimit the regions Ri?
▶ Minimum risk (cost) criterion: assign costs to the decisions and minimize the average cost
  ▶ Cij = cost of decision Di when the symbol was aj
  ▶ C00 = cost of a correct a0 detection
  ▶ C10 = cost of a false alarm
  ▶ C01 = cost of a miss
  ▶ C11 = cost of a correct a1 detection
▶ The risk = the average cost:

  R = C00 P(D0 ∩ a0) + C10 P(D1 ∩ a0) + C01 P(D0 ∩ a1) + C11 P(D1 ∩ a1)

▶ Minimum risk criterion: choose the regions so as to minimize the risk R
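The risk formula translates directly into code (a sketch; the cost values and joint probabilities below are illustrative assumptions, with correct decisions costing nothing and a miss costing twice a false alarm):

```python
def risk(C00, C10, C01, C11, joint):
    """Average cost R, where joint[(i, j)] = P(Di ∩ aj)."""
    return (C00 * joint[(0, 0)] + C10 * joint[(1, 0)]
            + C01 * joint[(0, 1)] + C11 * joint[(1, 1)])

# Illustrative joint probabilities of decision Di with message aj:
joint = {(0, 0): 0.45, (1, 0): 0.05, (0, 1): 0.10, (1, 1): 0.40}
R = risk(0.0, 1.0, 2.0, 0.0, joint)  # only the two error terms contribute
```

Only the false-alarm and miss terms contribute here, since the correct-decision costs are zero.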


Computations

▶ Proof at the blackboard:
  ▶ use Bayes' rule
  ▶ notation: w(r|aj) is the likelihood of r under aj
  ▶ probabilities: P(Di|aj) = ∫_{Ri} w(r|aj) dV
▶ Conclusion: the decision rule is

  Λ(r) = w(r|a1) / w(r|a0) ≷ (C10 − C00) p(a0) / [(C01 − C11) p(a1)] = K

▶ Interpretation: the costs and the prior probabilities move the threshold
▶ The logarithm can also be applied (useful for the normal distribution):

  ln Λ(r) ≷ ln K

▶ Example at the blackboard: random noise with N(0, σ²), one sample
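The one-sample Gaussian decision rule can be sketched in log form (the signal levels, σ and K passed in the usage below are illustrative assumptions; with K = 1 the rule reduces to the maximum likelihood case):

```python
import math

def gauss_pdf(x, mean, sigma):
    """Density of N(mean, sigma^2) at x."""
    return math.exp(-(x - mean) ** 2 / (2 * sigma ** 2)) / (sigma * math.sqrt(2 * math.pi))

def decide_min_risk(r, s0, s1, sigma, K):
    """Return 1 (decide D1) if ln Λ(r) > ln K, else 0 (decide D0)."""
    log_lr = math.log(gauss_pdf(r, s1, sigma)) - math.log(gauss_pdf(r, s0, sigma))
    return 1 if log_lr > math.log(K) else 0
```

For example, with s0 = 0, s1 = 5, σ = 1 and K = 1, the rule decides D1 exactly when the sample exceeds the midpoint 2.5.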


Ideal observer criterion

▶ Minimize the probability of decision error, Pe
  ▶ (definition of Pe)
▶ Particular case of minimum risk, with
  ▶ C00 = C11 = 0 (correct decisions bear no cost)
  ▶ C10 = C01 (equal cost for either kind of wrong decision)
▶ The decision rule becomes

  w(r|a1) / w(r|a0) ≷ p(a0) / p(a1)
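For one sample with N(0, σ²) noise, taking the logarithm of this rule reduces it to a shifted threshold on r (a sketch; the signal levels and priors in the usage are illustrative assumptions):

```python
import math

def ideal_observer_threshold(s0, s1, sigma, p0, p1):
    """Threshold on r that minimizes the probability of error:
    decide D1 when r exceeds it, D0 otherwise."""
    return (s0 + s1) / 2 + sigma ** 2 * math.log(p0 / p1) / (s1 - s0)
```

With equal priors the threshold sits at the midpoint of the two signal levels; making a0 more probable pushes the threshold toward a1, so more samples are attributed to a0.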
Maximum likelihood criterion

▶ Particular case of the above, with equal message probabilities:

  w(r|a1) / w(r|a0) ≷ 1

  ln [w(r|a1) / w(r|a0)] ≷ 0

▶ Example at the blackboard: random noise with N(0, σ²), one sample
▶ Example at the blackboard: random noise with N(0, σ²), two samples
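The two-sample maximum likelihood decision can be sketched as choosing the closer signal vector, which is equivalent to maximizing the likelihood when the noise on each sample is independent equal-variance Gaussian (the signal levels below are assumed from the earlier example):

```python
def decide_ml(samples, s0=0.0, s1=5.0):
    """Return 1 (decide D1) if the sample vector is closer to the
    constant-signal vector for a1 than to the one for a0, else 0."""
    d0 = sum((r - s0) ** 2 for r in samples)  # squared distance to a0's signal
    d1 = sum((r - s1) ** 2 for r in samples)  # squared distance to a1's signal
    return 1 if d1 < d0 else 0
```

For equal-variance Gaussian noise, the log-likelihood of each hypothesis differs from the negative squared distance only by constants, so minimum distance and maximum likelihood give the same decision.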
