Image Compression
Lossless
- Information preserving
- Low compression ratios
Lossy
- Not information preserving
- High compression ratios
Compression ratio: C = n1 / n2, where n1 and n2 are the numbers of information-carrying units (e.g., bits) in the original and compressed data sets.
Relative Data Redundancy: R = 1 - 1/C
Example: if C = 10 (i.e., 10:1), then R = 0.9, i.e., 90% of the data in the original representation is redundant.
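A minimal Python sketch of the two definitions above; the original and compressed sizes are assumed values, not from the slides:

def compression_ratio(n1_bits, n2_bits):
    # C = n1 / n2 (bits before vs. after compression)
    return n1_bits / n2_bits

def relative_redundancy(c):
    # R = 1 - 1/C
    return 1.0 - 1.0 / c

n1 = 256 * 256 * 8          # assumed: a 256 x 256, 8-bit original image
n2 = n1 // 10               # assumed: compressed to one tenth of the original size
C = compression_ratio(n1, n2)
R = relative_redundancy(C)
print(f"C = {C:.1f}:1, R = {R:.2f} ({R:.0%} of the data is redundant)")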
Types of Data Redundancy
(1) Coding redundancy
(2) Interpixel redundancy
(3) Psychovisual redundancy
Expected value: $E(X) = \sum_x x\,P(X = x)$
Coding Redundancy
Average number of bits per pixel: $L_{avg} = \sum_k l(r_k)\,p_r(r_k)$
Case 1: l(r_k) = constant length
Example:
Coding Redundancy (contd)
Case 2: l(r_k) = variable length (assign fewer bits to the more probable gray levels)
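A small Python sketch of the two cases for an illustrative 4-level source; the probabilities and code lengths are assumptions, not the slides' example:

import numpy as np

probs = np.array([0.4, 0.3, 0.2, 0.1])     # assumed p_r(r_k) for a 4-level source
fixed_lengths = np.array([2, 2, 2, 2])     # Case 1: l(r_k) = constant (2 bits each)
variable_lengths = np.array([1, 2, 3, 3])  # Case 2: prefix code 0, 10, 110, 111

# L_avg = sum_k l(r_k) * p_r(r_k)
print(f"L_avg (fixed)    = {np.sum(fixed_lengths * probs):.2f} bits/pixel")
print(f"L_avg (variable) = {np.sum(variable_lengths * probs):.2f} bits/pixel")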
Interpixel redundancy
Correlation: $f(x) \circ g(x) = \int_{-\infty}^{\infty} f(a)\,g(x+a)\,da$
Auto-correlation: f(x) = g(x)
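A minimal Python sketch (assuming numpy) of a discrete, normalized autocorrelation of one image row; values close to 1 at small lags indicate interpixel redundancy. The helper name and test row are assumptions:

import numpy as np

def normalized_autocorrelation(row, max_lag):
    # A(d) = mean of f(n) * f(n + d), normalized by A(0)
    row = row.astype(np.float64)
    n = len(row)
    a = np.array([np.mean(row[:n - d] * row[d:]) for d in range(max_lag + 1)])
    return a / a[0]

# Assumed test row: a smooth profile, so neighboring pixels are highly correlated.
row = 128 + 100 * np.sin(np.linspace(0, np.pi, 256))
print(np.round(normalized_autocorrelation(row, 5), 4))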
Interpixel redundancy (contd)
To reduce interpixel redundancy, some kind of transformation must be applied to the data (e.g., thresholding, DFT, DWT).
Example: threshold the original image, then code each line of the thresholded image (e.g., 11 0000...11...000...) as a sequence of runs, using (1+10) bits per (value, run length) pair.
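A small Python sketch of the idea above: threshold a row and code it as (value, run length) pairs at 1 + 10 bits per pair. The threshold and sample row are assumptions:

def run_length_encode(binary_row):
    # Code a binary row as (value, run length) pairs.
    pairs = []
    value, length = binary_row[0], 1
    for b in binary_row[1:]:
        if b == value:
            length += 1
        else:
            pairs.append((value, length))
            value, length = b, 1
    pairs.append((value, length))
    return pairs

row = [200, 210, 30, 25, 28, 31, 190, 195, 20, 22, 18]   # assumed 8-bit pixel row
binary = [int(v > 128) for v in row]                      # thresholded: 1 1 0 0 0 0 1 1 0 0 0
pairs = run_length_encode(binary)
print(pairs)                                              # [(1, 2), (0, 4), (1, 2), (0, 3)]
print(f"raw: {8 * len(row)} bits, run-length coded: {(1 + 10) * len(pairs)} bits")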
Psychovisual redundancy
The human visual system does not respond with equal sensitivity to all visual information; information of low perceptual importance can be removed (e.g., by quantization) without noticeably degrading perceived image quality.
How do we measure information? The information conveyed by an event E with probability P(E) is $I(E) = -\log P(E)$ units of information!
Entropy (average information per source output): $H = -\sum_j P(a_j)\,\log P(a_j)$ units/pixel
The units depend on the base of the logarithm (e.g., bits/pixel for base 2).
Redundancy - revisited
Redundancy: R = L_avg - H
where: L_avg is the average number of bits used to code each pixel of the image and H is the entropy (if L_avg = H, there is no redundancy).
Entropy Estimation (contd)
First-order estimate of H: use the normalized gray-level histogram of the image as an estimate of the pixel probabilities and compute $\hat{H} = -\sum_k p_r(r_k)\,\log_2 p_r(r_k)$ bits/pixel.
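A minimal Python/numpy sketch of the first-order estimate; the test image is an assumption:

import numpy as np

def first_order_entropy(image):
    # H_hat = -sum_k p(r_k) * log2 p(r_k), with p estimated from the histogram.
    hist = np.bincount(image.ravel(), minlength=256).astype(np.float64)
    p = hist / hist.sum()
    p = p[p > 0]                      # 0 * log 0 is taken as 0
    return float(-np.sum(p * np.log2(p)))

rng = np.random.default_rng(0)
image = rng.integers(0, 256, size=(64, 64), dtype=np.uint8)    # assumed test image
print(f"H_hat = {first_order_entropy(image):.2f} bits/pixel")  # close to 8 for uniform noise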
Differences in Entropy Estimates
Differences in Entropy Estimates (contd)
What is the entropy of the difference image?
How close is the first-order estimate $\hat{H}$ to the true entropy H?
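A small Python sketch comparing the first-order estimate of an image with that of its horizontal difference image; a lower value for the differences points to interpixel redundancy. The test image is an assumption:

import numpy as np

def first_order_entropy(values):
    # Works for any integer-valued array (differences may be negative).
    _, counts = np.unique(values.ravel(), return_counts=True)
    p = counts / counts.sum()
    return float(-np.sum(p * np.log2(p)))

x = np.linspace(0, 4 * np.pi, 128)
column_values = np.round(127 + 100 * np.sin(x)).astype(int)
image = np.tile(column_values, (128, 1))          # assumed smooth test image
diff = np.diff(image, axis=1)                     # horizontal difference image

print(f"H_hat(image)      = {first_order_entropy(image):.2f} bits/pixel")
print(f"H_hat(difference) = {first_order_entropy(diff):.2f} bits/pixel")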
Fidelity Criteria
Subjective: based on human observers
Objective: mathematically defined criteria
Subjective Fidelity Criteria
Objective Fidelity Criteria
Root-mean-square error: $e_{rms} = \left[\frac{1}{MN}\sum_{x,y}\big(\hat{f}(x,y) - f(x,y)\big)^2\right]^{1/2}$
Mean-square signal-to-noise ratio: $SNR_{ms} = \frac{\sum_{x,y}\hat{f}(x,y)^2}{\sum_{x,y}\big(\hat{f}(x,y) - f(x,y)\big)^2}$
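A minimal Python sketch of these two criteria; the "reconstructed" image below is simulated by adding noise, purely as an assumption for the demo:

import numpy as np

def rms_error(f, f_hat):
    e = f_hat.astype(np.float64) - f.astype(np.float64)
    return float(np.sqrt(np.mean(e ** 2)))

def snr_ms(f, f_hat):
    f_hat = f_hat.astype(np.float64)
    e = f_hat - f.astype(np.float64)
    return float(np.sum(f_hat ** 2) / np.sum(e ** 2))

rng = np.random.default_rng(1)
f = rng.integers(0, 256, size=(64, 64)).astype(np.float64)       # "original" image
f_hat = np.clip(f + rng.normal(0, 5, size=f.shape), 0, 255)      # simulated reconstruction
print(f"e_rms = {rms_error(f, f_hat):.2f}, SNR_ms = {snr_ms(f, f_hat):.1f}")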
Huffman Coding
Forward pass: repeatedly combine the two lowest-probability symbols (source reduction).
Backward Pass: assign code symbols going backwards, starting from the final reduced source.
Huffman Coding (contd)
Lavg assuming Huffman coding:
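The slides' worked numbers are not reproduced here; below is a minimal Python sketch that builds a Huffman code with a heap and computes L_avg. The symbol probabilities are illustrative assumptions:

import heapq

def huffman_code(probs):
    # Repeatedly merge the two least probable nodes; prepend 0/1 to their codewords.
    heap = [(p, i, {s: ""}) for i, (s, p) in enumerate(probs.items())]
    heapq.heapify(heap)
    counter = len(heap)
    while len(heap) > 1:
        p1, _, c1 = heapq.heappop(heap)
        p2, _, c2 = heapq.heappop(heap)
        merged = {s: "0" + w for s, w in c1.items()}
        merged.update({s: "1" + w for s, w in c2.items()})
        heapq.heappush(heap, (p1 + p2, counter, merged))
        counter += 1
    return heap[0][2]

probs = {"a1": 0.4, "a2": 0.3, "a3": 0.1, "a4": 0.1, "a5": 0.06, "a6": 0.04}  # assumed
code = huffman_code(probs)
L_avg = sum(probs[s] * len(code[s]) for s in probs)
print(code)
print(f"L_avg = {L_avg:.2f} bits/symbol")   # optimal variable-length code for these probabilities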
Arithmetic Coding - Example
Encode the message: 1 2 3 3 4
Symbol probabilities: P(1) = 0.2, P(2) = 0.2, P(3) = 0.4, P(4) = 0.2
1) Start with the interval [0, 1)
2) Subdivide [0, 1) based on the probabilities of the symbols
3) For each symbol of the message, keep only the sub-interval of the current interval assigned to that symbol
The final interval is [0.06752, 0.0688); transmit any number inside it, e.g., 0.068 (must be inside the sub-interval).
Example (contd)
Decode 0.572 (code length = 4):
At each step, find the sub-interval that contains the current value, output the corresponding symbol, and rescale the value to [0, 1) within that sub-interval.
Decoded message: 3 3 1 2 4
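A minimal Python sketch of this example, using the symbol probabilities implied by the interval above (P(1) = P(2) = P(4) = 0.2, P(3) = 0.4). Plain floating point is enough for such a short message; practical coders use integer arithmetic with renormalization:

PROBS = {1: 0.2, 2: 0.2, 3: 0.4, 4: 0.2}

def sub_intervals(probs):
    # Map each symbol to its sub-interval [low, high) of [0, 1).
    intervals, low = {}, 0.0
    for s, p in probs.items():
        intervals[s] = (low, low + p)
        low += p
    return intervals

def encode(message, probs):
    intervals, low, high = sub_intervals(probs), 0.0, 1.0
    for s in message:
        s_low, s_high = intervals[s]
        low, high = low + (high - low) * s_low, low + (high - low) * s_high
    return low, high                       # any number inside [low, high) encodes the message

def decode(value, probs, n_symbols):
    intervals, out = sub_intervals(probs), []
    for _ in range(n_symbols):
        for s, (s_low, s_high) in intervals.items():
            if s_low <= value < s_high:
                out.append(s)
                value = (value - s_low) / (s_high - s_low)   # rescale to [0, 1)
                break
    return out

print(encode([1, 2, 3, 3, 4], PROBS))   # approximately (0.06752, 0.0688)
print(decode(0.572, PROBS, 5))          # [3, 3, 1, 2, 4]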
LZW Coding
(addresses interpixel redundancy)
Requires no prior knowledge of symbol probabilities.
Initial Dictionary
Consider a 4x4, 8-bit image:
39 39 126 126
39 39 126 126
39 39 126 126
39 39 126 126
The dictionary is initialized with the 256 gray levels (locations 0-255); locations 256-511 are initially empty:
Dictionary Location   Entry
0                     0
1                     1
...                   ...
255                   255
256                   -
...                   ...
511                   -
LZW Coding (contd)
As the encoder examines the image pixels, gray-level sequences (i.e., blocks) that are not in the dictionary are assigned to a new entry.
Example (image rows: 39 39 126 126):
- Is 39 in the dictionary? Yes.
- What about 39-39? No.
- Add 39-39 at dictionary location 256.
Example
39 39 126 126
39 39 126 126
39 39 126 126
39 39 126 126
Concatenated sequence: CS = CR + P, where CR is the currently recognized sequence and P is the next pixel.
CR = empty
repeat
   P = next pixel
   CS = CR + P
   If CS is found in the dictionary D:
      (1) No output
      (2) CR = CS
   else:
      (1) Output D(CR)
      (2) Add CS to D
      (3) CR = P
until no more pixels
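A minimal Python sketch of this encoding loop applied to the 4x4 example image (the function name is an assumption):

def lzw_encode(pixels, max_dict_size=512):
    dictionary = {(g,): g for g in range(256)}     # locations 0-255 hold the gray levels
    next_code = 256
    output, cr = [], ()
    for p in pixels:
        cs = cr + (p,)                             # CS = CR + P
        if cs in dictionary:
            cr = cs                                # keep growing the recognized sequence
        else:
            output.append(dictionary[cr])          # output D(CR)
            if next_code < max_dict_size:
                dictionary[cs] = next_code         # add CS to the dictionary
                next_code += 1
            cr = (p,)                              # CR = P
    output.append(dictionary[cr])                  # flush the last recognized sequence
    return output, dictionary

image = [39, 39, 126, 126] * 4                     # the 4x4, 8-bit example image, row by row
codes, d = lzw_encode(image)
print(codes)                                       # [39, 39, 126, 126, 256, 258, 260, 259, 257, 126]
print({v: k for k, v in d.items() if v >= 256})    # the sequences added at locations 256+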
Decoding LZW
The decoder builds an identical dictionary on the fly as it reads the code words, so the dictionary itself does not need to be transmitted.
e.g., (0,1)(1,1)(0,1)(1,0)(0,2)(1,4)(0,2)
Bit-plane coding
(addresses interpixel redundancy)
Process each bit plane individually: decompose the image into a series of binary images (one per bit) and compress each binary image separately (e.g., using run-length coding); see the sketch below.
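A minimal Python/numpy sketch of the bit-plane decomposition; the test image is an assumption:

import numpy as np

def bit_planes(image):
    # Returns an array of shape (8, H, W); plane k holds bit k (7 = most significant).
    return np.stack([(image >> k) & 1 for k in range(8)])

rng = np.random.default_rng(2)
image = rng.integers(0, 256, size=(4, 4), dtype=np.uint8)   # assumed test image
planes = bit_planes(image)
reconstructed = sum(planes[k].astype(np.uint16) << k for k in range(8))
assert np.array_equal(reconstructed, image)                 # the planes are a lossless decomposition
print(planes[7])                                            # most significant bit plane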
Transform Coding
Divide the N x N image into non-overlapping n x n subimages (~ (N/n)^2 subimages), transform each subimage, and keep only the most significant coefficients.
Example: Fourier Transform - approximate the image using only K << N frequencies:
$\hat{f}(x,y) = \sum_{u=0}^{K-1}\sum_{v=0}^{K-1} T(u,v)\,e^{j2\pi(ux+vy)/N}$
Transform Selection
T(u,v) can be computed using various transformations, e.g., DFT, DCT (Discrete Cosine Transform), or KLT (Karhunen-Loeve Transform).
DCT
Forward: $T(u,v) = \alpha(u)\,\alpha(v)\sum_{x=0}^{n-1}\sum_{y=0}^{n-1} f(x,y)\,\cos\!\left[\frac{(2x+1)u\pi}{2n}\right]\cos\!\left[\frac{(2y+1)v\pi}{2n}\right]$
Inverse: $f(x,y) = \sum_{u=0}^{n-1}\sum_{v=0}^{n-1}\alpha(u)\,\alpha(v)\,T(u,v)\,\cos\!\left[\frac{(2x+1)u\pi}{2n}\right]\cos\!\left[\frac{(2y+1)v\pi}{2n}\right]$
where $\alpha(u) = \sqrt{1/n}$ if u = 0 and $\alpha(u) = \sqrt{2/n}$ if u > 0 (similarly for $\alpha(v)$).
DCT (contd)
Reconstructed images obtained by truncating 50% of the transform coefficients: the DCT yields a more compact transformation, i.e., a smaller reconstruction error for the same number of retained coefficients.
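A minimal Python sketch of transform coding on a single block, assuming scipy is available for the 2D DCT: transform, keep only a low-frequency subset of the coefficients (here 16 of 64), and reconstruct. The test block and the number of retained coefficients are assumptions:

import numpy as np
from scipy.fft import dctn, idctn

def truncate_dct(block, keep):
    # Transform, keep only the keep x keep low-frequency corner, and invert.
    coeffs = dctn(block.astype(np.float64), norm="ortho")
    mask = np.zeros_like(coeffs)
    mask[:keep, :keep] = 1.0
    return idctn(coeffs * mask, norm="ortho")

x = np.arange(8)
block = 128 + 50 * np.sin(x / 3.0)[None, :] + 10 * x[:, None]   # assumed smooth 8 x 8 block
recon = truncate_dct(block, keep=4)                             # keep 16 of 64 coefficients
rms = np.sqrt(np.mean((recon - block) ** 2))
print(f"RMS reconstruction error: {rms:.2f} gray levels")       # small, because the block is smooth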
DFT: has implicit n-point periodicity, so truncation introduces discontinuities at block boundaries.
DCT: has implicit 2n-point periodicity, which avoids these boundary discontinuities and reduces blocking artifacts.
JPEG Compression
Accepted as an international image compression standard in 1992.
Encoder: forward DCT -> quantizer -> entropy encoder; decoder: entropy decoder -> dequantizer -> inverse DCT.
JPEG - Steps
1. Divide the image into non-overlapping 8 x 8 subimages.
2. Level-shift the gray levels (subtract 128).
3. Compute the DCT of each subimage.
4. Quantization: C'(u,v) = round[ C(u,v) / T(u,v) ], where T(u,v) is the quantization table; coefficients that contribute little become zero.
JPEG Steps (contd)
5. Order the quantized coefficients using zig-zag ordering (groups the low-frequency coefficients first and creates long runs of zeros).
6. Encode the coefficients:
- DC coefficients: predictive coding (encode the difference from the DC coefficient of the previous block).
- Each coefficient is mapped to an intermediate symbol sequence: symbol_1 (SIZE, the number of bits needed for the amplitude) and symbol_2 (AMPLITUDE); for AC coefficients, symbol_1 also records the run of preceding zeros.
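A minimal Python sketch of steps 2-5 for one 8 x 8 block, assuming scipy for the DCT. The test block is an assumption; T below is the widely used example luminance quantization table (the standard permits other tables):

import numpy as np
from scipy.fft import dctn

T = np.array([
    [16, 11, 10, 16,  24,  40,  51,  61],
    [12, 12, 14, 19,  26,  58,  60,  55],
    [14, 13, 16, 24,  40,  57,  69,  56],
    [14, 17, 22, 29,  51,  87,  80,  62],
    [18, 22, 37, 56,  68, 109, 103,  77],
    [24, 35, 55, 64,  81, 104, 113,  92],
    [49, 64, 78, 87, 103, 121, 120, 101],
    [72, 92, 95, 98, 112, 100, 103,  99],
])

def zigzag_indices(n=8):
    # Traverse the block by anti-diagonals, alternating direction (zig-zag order).
    return sorted(((i, j) for i in range(n) for j in range(n)),
                  key=lambda ij: (ij[0] + ij[1],
                                  ij[0] if (ij[0] + ij[1]) % 2 else ij[1]))

def jpeg_block(block):
    shifted = block.astype(np.float64) - 128               # step 2: level shift
    coeffs = dctn(shifted, norm="ortho")                   # step 3: 2D DCT
    quantized = np.round(coeffs / T).astype(int)           # step 4: C'(u,v) = round(C(u,v)/T(u,v))
    return [quantized[i, j] for i, j in zigzag_indices()]  # step 5: zig-zag order

block = 100.0 + 8.0 * np.add.outer(np.arange(8), np.arange(8))   # assumed smooth 8 x 8 block
print(jpeg_block(block))   # high-frequency entries quantize to zero -> long runs of zeros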
Effect of Quantization: non-homogeneous 8 x 8 block
(original block)
Effect of Quantization: non-homogeneous 8 x 8 block (contd)
(quantized and de-quantized DCT coefficients)
Effect of Quantization: non-homogeneous 8 x 8 block (contd)
(reconstructed vs. original block)
The reconstruction error is high: a non-homogeneous block has significant high-frequency content, which quantization discards.
Case Study: Fingerprint Compression
No blocky artifacts (WSQ is wavelet-based, not block-DCT based).
WSQ Algorithm
Compression ratio: the FBI's target bit rate is around 0.75 bits per pixel (bpp), i.e., roughly 8/0.75 ≈ 10.7:1 for 8-bit fingerprint images.
Sequential: the image is encoded in a single top-to-bottom, left-to-right scan.
Progressive: the image is encoded in multiple scans, each refining the previous reconstruction (coarse to fine).
Progressive JPEG (contd)
(the image is encoded at multiple resolutions: N/4 x N/4, N/2 x N/2, N x N)
More Methods