Introduction To Communications: Source Coding
Introduction to Communications
Lecture 31: Source Coding
This lecture:
1. Information Theory.
2. Entropy.
3. Source Coding.
4. Huffman Coding.
Ref: CCR pp. 697–709, A Mathematical Theory of Communication.
point-to-point communication.
0 ≤ H(X) ≤ log M.
▶ We'll use H_b(X) when we need to be explicit about using a logarithm to the base b.
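As a quick numerical check of the bounds above, the sketch below (an illustration, not part of the lecture) computes H_b(X) for a few distributions over M outcomes: a deterministic source hits the lower bound 0, and a uniform source hits the upper bound log M.

```python
import math

def entropy(probs, base=2):
    """H_b(X) = -sum_x p(x) log_b p(x); zero-probability outcomes contribute 0."""
    return -sum(p * math.log(p, base) for p in probs if p > 0)

M = 4
print(entropy([1.0, 0.0, 0.0, 0.0]))       # deterministic: 0.0, the lower bound
print(entropy([1 / M] * M))                # uniform: log2(4) = 2.0, the upper bound
print(entropy([0.5, 0.25, 0.125, 0.125]))  # 1.75, strictly between the bounds
```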
identified.
▶ These sequences are enumerated using binary
block sizes?
▶ We'll start with codes for a single symbol, i.e., N = 1.
▶ Consider a variable-length source code where
[Figure: Huffman tree for five symbols with p0 = 0.25, p1 = 0.25, p2 = 0.2, p3 = 0.15, p4 = 0.15. Step 1 merges p3 and p4 into a node of probability 0.3; Step 2 merges 0.2 and 0.25 into 0.45; Step 3 merges 0.25 and 0.3 into 0.55; Step 4 merges 0.45 and 0.55 into the root with p = 1, with branches labelled 0 and 1 at each merge.]
▶ Average code length is 2.3 bits and the entropy is 2.29 bits.
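The merge procedure in the figure can be sketched with a min-heap; this is an illustrative implementation (the function name and the tiebreak field are our own), which reproduces the average length of 2.3 bits and the entropy of about 2.29 bits for the five probabilities above.

```python
import heapq
import math

def huffman_code_lengths(probs):
    """Build a Huffman tree and return the codeword length of each symbol.

    Heap entries are (probability, tiebreak, symbol indices in the subtree);
    the tiebreak integer keeps tuple comparison well-defined on equal
    probabilities.
    """
    heap = [(p, i, [i]) for i, p in enumerate(probs)]
    heapq.heapify(heap)
    lengths = [0] * len(probs)
    tiebreak = len(probs)
    while len(heap) > 1:
        p1, _, s1 = heapq.heappop(heap)  # two least-probable subtrees
        p2, _, s2 = heapq.heappop(heap)
        for s in s1 + s2:
            lengths[s] += 1              # each merge adds one bit to their codewords
        heapq.heappush(heap, (p1 + p2, tiebreak, s1 + s2))
        tiebreak += 1
    return lengths

probs = [0.25, 0.25, 0.2, 0.15, 0.15]
lengths = huffman_code_lengths(probs)
avg_len = sum(p * l for p, l in zip(probs, lengths))
entropy = -sum(p * math.log2(p) for p in probs)
print(avg_len)  # 2.3 bits/symbol
print(entropy)  # ~2.29 bits, so the code is within 0.01 bit of the entropy
```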