Review Article

Underwater Image Processing: State of the Art of Restoration and Image Enhancement Methods

Copyright © 2010 R. Schettini and S. Corchs. This is an open access article distributed under the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.
The underwater image processing area has received considerable attention within the last decades, showing important achievements. In this paper we review some of the most recent methods that have been specifically developed for the underwater environment. These techniques are capable of extending the range of underwater imaging, improving image contrast and resolution. After considering the basic physics of light propagation in the water medium, we focus on the different algorithms available in the literature. The conditions for which each of them has been originally developed are highlighted, as well as the quality assessment methods used to evaluate their performance.
(i) Image restoration aims to recover a degraded image using a model of the degradation and of the original image formation; it is essentially an inverse problem. These methods are rigorous but they require many model parameters (like attenuation and diffusion coefficients that characterize the water turbidity) which are only scarcely known in tables and can be extremely variable. Another important parameter required is the depth estimation of a given object in the scene.

(ii) Image enhancement uses qualitative subjective criteria to produce a more visually pleasing image; these methods do not rely on any physical model of the image formation. These kinds of approaches are usually simpler and faster than deconvolution methods.

In what follows we give a general view of some of the most recent methods that address the topic of underwater image processing, providing an introduction to the problem and enumerating the difficulties found. Our scope is to give the reader, in particular one who is not a specialist in the field and who has a specific problem to address and solve, an indication of the available methods, focusing on the imaging conditions for which they were developed (lighting conditions, depth, environment where the approach was tested, quality evaluation of the results) and considering the model characteristics and assumptions of the approach itself. In this way we wish to guide the reader toward the technique that best suits his problem or application.

In Section 2 we briefly review the optical properties of light propagation in water and the image formation model of Jaffe-McGlamery, following in Section 3 with a report of the image restoration methods that take this image model into account. In Section 4, works addressing image enhancement and color correction in the underwater environment are presented. We include a brief description of some of the most recent methods. When possible, some examples (images before and after correction) that illustrate these approaches are also included. Section 5 considers the lighting problems and Section 6 focuses on image quality metrics. Finally, the conclusions are sketched in Section 7.

2. Propagation of Light in the Water

In this section we focus on the special transmission properties of light in the water. Light interacts with the water medium through two processes: absorption and scattering. Absorption is the loss of power as light travels in the medium and it depends on the index of refraction of the medium. Scattering refers to any deflection from a straight-line propagation path. In the underwater environment, deflections can be due to particles of size comparable to the wavelengths of travelling light (diffraction), or to particulate matter with a refraction index different from that of the water (refraction).

According to the Lambert-Beer empirical law, the decay of light intensity is related to the properties of the material (through which the light is travelling) via an exponential dependence. The irradiance E at position r can be modeled as:

E(r) = E(0) exp(−c r),    (1)

where c is the total attenuation coefficient of the medium. This coefficient is a measure of the light loss from the combined effects of scattering and absorption over a unit length of travel in an attenuating medium. Typical attenuation coefficients for deep ocean water, coastal water and bay water are 0.05 m^−1, 0.2 m^−1, and 0.33 m^−1, respectively.

Assuming an isotropic, homogeneous medium, the total attenuation coefficient c can be further decomposed as a sum of two quantities a and b, the absorption and scattering coefficients of the medium, respectively:

E(r) = E(0) exp(−a r) exp(−b r).    (2)

The total scattering coefficient b is the superposition of all scattering events at all angles through the volume scattering function β(θ) (this function gives the probability for a ray of light to be deviated by an angle θ from its direction of propagation):

b = 2π ∫_0^π β(θ) sin θ dθ.    (3)

The parameters a, b, c, and β(θ) represent the inherent properties of the medium and their knowledge should theoretically permit us to predict the propagation of light in the water. However, all these parameters depend on the location r (in a three-dimensional space) and also on time. Therefore, the corresponding measurements are a complex task and computational modeling is needed.

McGlamery [1] laid out the theoretical foundations of the optical image formation model, while Jaffe [2] extended the model and applied it to design different subsea image acquisition systems. Modeling of underwater imaging has also been carried out by Monte Carlo techniques [3].

In this section we follow the image formation model of Jaffe-McGlamery. According to this model, the underwater image can be represented as the linear superposition of three components (see Figure 1). An underwater imaging experiment consists of tracing the progression of light from a light source to a camera. The light received by the camera is composed of three components: (i) the direct component E_d (light reflected directly by the object that has not been scattered in the water), (ii) the forward-scattered component E_f (light reflected by the object that has been scattered at a small angle) and (iii) the backscatter component E_b (light reflected by objects not in the target scene but that enters the camera, for example due to floating particles). Therefore, the total irradiance E_T reads:

E_T = E_d + E_f + E_b.    (4)

Spherical spreading and attenuation of the source light beam is assumed in order to model the illumination incident upon the target plane. The reflected illumination is then computed as the product of the incident illumination and the
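The attenuation relations (1)-(3) are easy to explore numerically. The sketch below is a minimal illustration, not taken from the paper: it evaluates the Lambert-Beer decay for the three typical attenuation coefficients quoted above, and approximates the integral in (3) with a midpoint rule; the isotropic β(θ) used in the example is a hypothetical placeholder.

```python
import numpy as np

# Typical total attenuation coefficients c (per metre) quoted in the text.
C_WATER = {"deep ocean": 0.05, "coastal": 0.20, "bay": 0.33}

def surviving_fraction(c, r):
    """Fraction E(r)/E(0) of irradiance left after r metres of travel,
    following the Lambert-Beer law E(r) = E(0) exp(-c r) of (1)."""
    return float(np.exp(-c * r))

def total_scattering(beta, n=4096):
    """Total scattering coefficient b = 2*pi * int_0^pi beta(theta) sin(theta) dtheta
    of (3), approximated with a midpoint rule on n subintervals."""
    dtheta = np.pi / n
    theta = (np.arange(n) + 0.5) * dtheta          # midpoints of [0, pi]
    return float(np.sum(2.0 * np.pi * beta(theta) * np.sin(theta)) * dtheta)

if __name__ == "__main__":
    for name, c in C_WATER.items():
        print(f"{name:10s}: {surviving_fraction(c, 10.0):.3f} of the light survives 10 m")
    # Hypothetical isotropic phase function beta(theta) = 1/(4*pi), so b = 1.
    print(total_scattering(lambda t: np.full_like(t, 1.0 / (4.0 * np.pi))))
```

In bay water (c = 0.33 m^−1), for instance, only about 3.7% of the light survives a 10 m path, which is why the usable imaging range is so short.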
EURASIP Journal on Advances in Signal Processing

Figure 1: The three components of underwater optical imaging: direct component (straight line), forward component (dashed line) and backward scatter component (dash-dot line).

Figure 2: Coordinate system of the Jaffe-McGlamery model.

where the function g is given by

g(x, y, R_c, G, c, B) = [exp(−G R_c) − exp(−c R_c)] F^−1{exp(−B R_c w)}.    (7)
yield good results but these configurations are contrast-limited at greater ranges. If longer distances are desired (2-3 attenuation lengths), systems with separated camera and lights are preferred, but backscattering problems appear as the distance increases. For greater distances more sophisticated technology is required, like, for example, laser range-gated systems and synchronous scan imaging.

3. Image Restoration

A possible approach to deal with underwater images is to consider the image transmission in water as a linear system [8]. Image restoration aims at recovering the original image f(x, y) from the observed image g(x, y) using (if available) explicit knowledge about the degradation function h(x, y) (also called the point spread function, PSF) and the noise characteristics n(x, y):

g(x, y) = f(x, y) ∗ h(x, y) + n(x, y),    (10)

where ∗ denotes convolution. The degradation function h(x, y) includes the system response from the imaging system itself and the effects of the medium (water in our case). In the frequency domain, we have:

G(u, v) = F(u, v) H(u, v) + N(u, v),    (11)

where (u, v) are spatial frequencies and G, F, H, and N are the Fourier transforms of g, f, h, and n, respectively. The system response function H in the frequency domain is referred to as the optical transfer function (OTF) and its magnitude is referred to as the modulation transfer function (MTF). Usually, the system response is expressed as a direct product of the optical system itself and the medium:

H(u, v) = H_system(u, v) H_medium(u, v).    (12)

The better the knowledge we have about the degradation function, the better are the results of the restoration. However, in practical cases, there is insufficient knowledge about the degradation and it must be estimated and modeled. In our case, the sources of degradation in underwater imaging include turbidity, floating particles and the optical properties of light propagation in water. Therefore, underwater optical properties have to be incorporated into the PSF and MTF. The presence of noise from various sources further complicates these techniques.

Recently, Hou et al. [9-11] incorporated the underwater optical properties into the traditional image restoration approach. They assume that blurring is caused by strong scattering due to water and its constituents, which include various sized particles. To address this issue, they incorporated measured in-water optical properties into the point spread function in the spatial domain and the modulation transfer function in the frequency domain. The authors modeled H_medium for circularly symmetrical response systems (2-dimensional space) as an exponential function

H_medium(φ, r) = exp[−D(φ) r].    (13)

The exponent, D(φ), is the decay transfer function obtained by Wells [12] for seawater within the small angle approximation

D(φ) = c − b [1 − exp(−2π θ_0 φ)] / (2π θ_0 φ),    (14)

where θ_0 is the mean square angle, and b and c are the total scattering and attenuation coefficients, respectively. The system (camera/lens) response was measured directly from calibrated imagery at various spatial frequencies. In-water optical properties during the experiment were measured: absorption and attenuation coefficients, particle size distributions and volume scattering functions. The authors implemented an automated framework termed Image Restoration via Denoised Deconvolution. To determine the quality of the restored images, an objective quality metric was implemented. It is a wavelet decomposed and denoised perceptual metric constrained by a power spectrum ratio (see Section 6). Image restoration is carried out and medium optical properties are estimated. Both modeled and measured optical properties are taken into account in the framework. The images are restored using PSFs derived from both the modeled and measured optical properties (see Figure 3).

Trucco and Olmos [13] presented a self-tuning restoration filter based on a simplified version of the Jaffe-McGlamery image formation model. Two assumptions are made in order to design the restoration filter. The first one assumes uniform illumination (direct sunlight in shallow waters) and the second one is to consider only the forward component E_f of the image model as the major degradation source, ignoring the backscattering E_b and the direct component E_d. This appears reasonable whenever the concentration of particulate matter generating backscatter in the water column is limited. A further simplification considers the difference of exponentials in the forward scatter model (6) as an experimental constant K (with typical values between 0.2 and 0.9)

K ≈ exp(−G R_c) − exp(−c R_c).    (15)

Within these assumptions, from (7), a simple inverse filter in the frequency domain is designed as follows (the parameter B is approximated by c):

G(f, R_c, c, K) ≈ K exp(−c R_c w).    (16)

Optimal values of these parameters were estimated automatically for each individual image by optimizing a quality criterion based on a global contrast measure (optimality is defined as achieving minimum blur). Therefore, low-backscatter and shallow-water conditions represent the optimal environment for this technique. The authors assessed both qualitatively (by visual inspection) and quantitatively the performance of the restoration filter. They assessed quantitatively the benefits of the self-tuning filter as a preprocessor for image classification: images were classified as containing or not containing man-made objects [14, 15]. The quantitative tests with a large number of frames from real videos show an
Figure 3: Image taken at 7.5 m depth in Florida. The original (a), the restored image based on measured MTF (b) and the restored image based on modeled MTF (c). Courtesy of Hou et al. [9].
important improvement to the classification task of detecting man-made objects on the seafloor. The training videos were acquired under different environments: instrumented tank, shallow and turbid water conditions in the sea.

Liu et al. [16] measured the PSF and MTF of seawater in the laboratory by means of the image transmission theory and used Wiener filters to restore the blurred underwater images. The degradation function H(u, v) is measured in a water tank. An experiment is constructed with a slit image and a light source. In a first step, the one-dimensional light intensity distribution of the slit images at different water path lengths is obtained. The one-dimensional PSF of seawater can then be obtained by a deconvolution operation. Then, according to the circular symmetry property of the PSF of seawater, the 2-dimensional PSF can be calculated by mathematical methods. In a similar way, MTFs are derived. These measured functions are used for blurred image restoration. The standard Wiener deconvolution process is applied. The transfer function W(u, v) reads

W(u, v) = H*(u, v) / (|H(u, v)|^2 + S_n / S_f),    (17)

where S_n and S_f are the power spectra of the noise and the original image, respectively, and H*(u, v) is the conjugate matrix of H(u, v) (the measured result described previously). The noise is regarded as white noise, and S_n is a constant that can be estimated from the blurred images with noise, while S_f is estimated as

S_f(u, v) = [S_g(u, v) − S_n(u, v)] / |H(u, v)|^2,    (18)

where S_g is the power spectrum of the blurred image. Then, the spectrum of the restored image is

F(u, v) = G(u, v) H*(u, v) / (|H(u, v)|^2 + S_n / S_f).    (19)

A parametric Wiener filter is also used by the authors, and both deconvolution methods are compared.

Schechner and Karpel [17] exploit the polarization effects in underwater scattering to compensate for visibility degradation. The authors claim that image blur is not the dominant cause of image contrast degradation, and they associate underwater polarization with the prime visibility disturbance that they want to delete (veiling light or backscattered light). The Jaffe-McGlamery image formation model is applied under natural underwater lighting, exploiting the fact that veiling light is partially polarized horizontally [18]. The algorithm is based on a couple of images taken through a polarizer at different orientations. Even when the raw images have very low contrast, their slight differences provide the key for visibility improvement. The method automatically accounts for dependencies on object distance, and estimates a distance map of the scene. A quantitative estimate for the visibility improvement is defined as a logarithmic function of the backscatter component. Additionally, an algorithm to compensate for the strong blue hue is also applied. Experiments conducted in the sea show improvements of scene contrast and color correction, nearly doubling the underwater visibility range. In Figure 4 a raw image and its recovered version are shown.
Figure 5: Raw image (a), de-scattered image (b) [19].
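A sketch of the kind of polarization-based de-scattering used in [17, 19]: two frames are taken through the polarizer at the orientations of minimal and maximal veiling light, the backscatter is estimated from their difference, and the scene radiance is compensated for the estimated transmission. The estimators below are one common formulation, not necessarily the exact ones of the papers; the degree of polarization p of the backscatter and its value at infinity b_inf are assumed known.

```python
import numpy as np

def descatter(i_min, i_max, p, b_inf):
    """Recover object radiance from two polarizer frames.

    i_min, i_max : frames with minimal / maximal veiling light
    p            : degree of polarization of the backscatter (0 < p <= 1)
    b_inf        : backscatter value at infinite distance
    """
    total = i_min + i_max                              # total scene irradiance
    b_hat = (i_max - i_min) / p                        # estimated backscatter
    t_hat = np.clip(1.0 - b_hat / b_inf, 1e-3, 1.0)    # estimated transmission
    return (total - b_hat) / t_hat                     # compensated radiance
```

Since the estimated transmission decays with distance, a relative distance map of the scene can be read off t_hat, which is how such methods estimate scene range as a by-product.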
Figure 6: Pairs of images before (a) and after (b) Bazeille et al.'s processing. Image courtesy of Bazeille et al. [20].
with distance. Remaining noise corresponding to sensor noise, floating particles and miscellaneous quantization errors is suppressed using a generic self-tuning wavelet-based algorithm. The use of the adaptive smoothing filter significantly improves edge detection in the images. Results on simulated and real data are presented.

The color recovery is also analyzed by Torres-Mendez and Dudek [27], but from a different perspective: it is formulated as an energy minimization problem using learned constraints. The idea on which the approach is based is that an image can be modeled as a sample function of a stochastic process known as a Markov Random Field. The color correction is considered as a task of assigning a color value to each pixel of the input image that best describes its surrounding structure, using the training image patches. This model uses multi-scale representations of the color corrected and color depleted (bluish) images to construct a probabilistic algorithm that improves the color of underwater images. Experimental results on a variety of underwater scenes are shown.

Ahlen et al. [28] apply underwater hyperspectral data for color correction purposes. They develop a mathematical stability model which gives a value range for wavelengths that should be used to compute the attenuation coefficient values that are as stable as possible in terms of variation with depth. Their main goal is to monitor coral reefs and marine
Figure 7: Original images (a), after correction with ACE (b). Image courtesy of Chambah et al. [23].
Figure 8: Original images (a), images after enhancement using Iqbal et al.'s technique (b). Image courtesy of Iqbal et al. [25].
habitats. Spectrometer measurements of a colored plate at various depths are performed. The hyperspectral data is then color corrected with a formula derived from Beer's law

I(z′) = I(z) exp[c(z) z − c(z′) z′],    (20)

where I(z) is the pixel intensity in the image for depth z and c(z) is the corresponding attenuation coefficient calculated from spectral data. In this way, they obtain images as if they were taken at a much shallower depth than in reality. All hyperspectral images are "lifted up" to a depth of 1.8 m, where almost all wavelengths are still present (they have not been absorbed by the water column). The data is finally brought back into the original RGB space.

Another approach to improve color rendition is proposed by Petit et al. [29]. The method is based on light attenuation inversion after processing a color space contraction using quaternions. Applied to the white vector (1, 1, 1) in the RGB space, the attenuation gives a hue vector H characterizing the water color

H = (exp{−c_R z}, exp{−c_G z}, exp{−c_B z}),    (21)

where c_R, c_G, and c_B are the attenuation coefficients for the red, green and blue wavelengths, respectively. Using this reference axis, geometrical transformations in the color space are computed with quaternions. Pixels of water areas of the processed images are moved to gray or to colors with a low saturation, whereas the objects remain fully colored. In this way, object contrasts are enhanced and the bluish aspect of the images is removed. Two example images before and after correction by Petit et al.'s algorithm are shown in Figure 9.

5. Lighting Problems

In this section we summarize the articles that have been specifically focused on solving lighting problems. Even if this aspect was already taken into account in some of the methods presented in the previous sections, we review here the works that have addressed this kind of problem in particular, proposing different lighting correction strategies.

Garcia et al. [30] analyzed how to solve the lighting problems in underwater imaging and reviewed different techniques. The starting point is the illumination-reflectance model, where the image f(x, y) sensed by the camera is considered as a product of the illumination i(x, y), the reflectance function r(x, y) and a gain factor g(x, y), plus an offset term o(x, y):

f(x, y) = g(x, y) · i(x, y) · r(x, y) + o(x, y).    (22)

The multiplicative factor c_m(x, y) = g(x, y) · i(x, y) due to light sources and camera sensitivity can be modeled as a smooth function (the offset term is ignored). In order to model the non-uniform illumination, a Gaussian-smoothed version of the image is proposed. The smoothed image is intended to be an estimate of how much the illumination field (and camera sensitivity) affects every pixel. The acquired image is corrected by a point-by-point division by the smoothed image f_s, giving rise to an estimate of the ideal image r_ideal

r_ideal = [f(x, y) / f_s(x, y)] δ,    (23)

where δ is a normalization constant. Next, the contrast of the resulting image is emphasized, giving rise to an equalized version of r.

Some authors compensate for the effects of non-uniform lighting by applying local equalization to the images [31, 32]. The non-uniformity of the lighting demands a special treatment for the different areas of the image, depending on the amount of light they receive. The strategy consists in defining an n × n neighborhood, computing the histogram of this area and applying an equalization function, but modifying uniquely the central point of the neighborhood [33]. A similar strategy is used by Zuiderveld [34].

An alternative model consists of applying homomorphic filtering [30]. This approach assumes that the illumination factor varies smoothly through the field of view, generating low frequencies in the Fourier transform of the image (the offset term is ignored). Taking the logarithm of (22), the multiplicative effect is converted into an additive one:

ln f(x, y) = ln c_m(x, y) + ln r(x, y).    (24)

Taking the Fourier transform of (24) we obtain

F(u, v) = C_m(u, v) + R(u, v),    (25)

where F(u, v), C_m(u, v), and R(u, v) are the Fourier transforms of ln f(x, y), ln c_m(x, y), and ln r(x, y), respectively. Low frequencies can be suppressed by multiplying these components by a high-pass homomorphic filter H given by

H(u, v) = [1 + exp(−s(√(u^2 + v^2) − w_0))]^−1 + ρ,    (26)

where w_0 is the cutoff frequency, s is a multiplicative factor and ρ is an offset term. This filter not only attenuates non-uniform illumination but also enhances the high frequencies, sharpening the edges.

Rzhanov et al. [35] disregard the multiplicative constant c_m, considering the lighting of the scene as an additive factor which should be subtracted from the original image

r(x, y) = f(x, y) − Φ(x, y) + δ,    (27)

where Φ(x, y) is a two-dimensional polynomial spline and δ is a normalization constant.

Garcia et al. [30] tested and compared the different lighting-correction strategies for two typical underwater situations. The first one considers images acquired in shallow waters at sundown (simulating deep ocean). The vehicle carries its own light, producing a bright spot in the center of the image. The second sequence of images was acquired in shallow waters on a sunny day. The evaluation methodology for the comparisons is qualitative. The best results have been obtained by the homomorphic filtering and the point-by-point correction by the smoothed image. The authors emphasize that both methods consider the illumination field as multiplicative and not subtractive.
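The homomorphic correction of (24)-(26) can be sketched as below. The cutoff, slope and offset values are illustrative placeholders, not taken from the paper:

```python
import numpy as np

def homomorphic_correct(f, w0=0.1, s=30.0, rho=0.5):
    """Attenuate slowly-varying illumination following (24)-(26): the log
    makes the lighting additive, the high-pass filter of (26) suppresses
    its low frequencies, and the exponential maps the result back."""
    lf = np.log(np.maximum(f, 1e-6))                       # eq. (24)
    F = np.fft.fft2(lf)                                    # eq. (25)
    u = np.fft.fftfreq(f.shape[0])[:, None]                # frequency grid
    v = np.fft.fftfreq(f.shape[1])[None, :]
    radius = np.sqrt(u ** 2 + v ** 2)
    H = 1.0 / (1.0 + np.exp(-s * (radius - w0))) + rho     # eq. (26)
    return np.exp(np.real(np.fft.ifft2(H * F)))
```

Because the filter is built directly on the unshifted fftfreq grid, no fftshift is needed; the DC term (radius 0) is attenuated toward ρ, which is what removes the smooth illumination field.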
Figure 9: Original image (a), corrected by Petit et al.'s algorithm (b). Image courtesy of Petit et al. [29].
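The water-hue reference axis of (21) is straightforward to compute. The per-channel attenuation coefficients below are hypothetical placeholders (in water, red is attenuated much faster than blue), and the quaternion rotation of the full method is not reproduced here:

```python
import numpy as np

# Hypothetical per-channel attenuation coefficients (per metre); real values
# depend on the water body, with red absorbed much faster than blue.
C_RGB = np.array([0.60, 0.10, 0.05])   # (c_R, c_G, c_B)

def water_hue_axis(z, c_rgb=C_RGB):
    """Hue vector H of (21): the white vector (1, 1, 1) after per-channel
    exponential attenuation over a path length z (metres), normalized to a
    unit axis for the subsequent color-space rotation."""
    h = np.exp(-c_rgb * z)
    return h / np.linalg.norm(h)
```

At z = 0 the axis coincides with the gray axis (1, 1, 1)/√3; with increasing path length it tilts toward blue, which is the bluish cast the color-space contraction then removes.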
6. Quality Assessment

In the last years, many different methods for image quality assessment have been proposed and analyzed with the goal of developing a quality metric that correlates with perceived quality measurements (for a detailed review see [36]). Peak Signal to Noise Ratio and Mean Squared Error are the most widely used objective image quality/distortion metrics. In the last decades, however, a great effort has been made to develop new objective image quality methods which incorporate perceptual quality measures by considering human visual system characteristics. Wang et al. [37] propose a Structural Similarity Index that does not treat the image degradation as an error measurement but as a structural distortion measurement.

The objective image quality metrics are classified in three groups: full-reference (there exists an original image with which the distorted image is to be compared), no-reference or "blind" quality assessment, and reduced-reference quality assessment (the reference image is only partially available, in the form of a set of extracted features).

In the present case of underwater image processing, no original image is available for comparison and, therefore, no-reference metrics are necessary. Within the above cited methods for enhancement and restoration, many of the authors use subjective quality measurements to evaluate the performance of their methods. In what follows we focus on the quantitative metrics used by some of the authors to evaluate the algorithm performance and image quality in the specific case of underwater images.

Besides visual comparison, Hou and Weidemann [38] also propose an objective quality metric for the typical scattering-blurred underwater images. The authors measure the image quality by its sharpness using the gradient or slope of edges. They use wavelet transforms to remove the effect of scattering when locating edges and further apply the transformed results in constraining the perceptual metric. Images are first decomposed by a wavelet transform to remove random and medium noise. The sharpness of the edges is determined by linear regression, obtaining the slope angle between grayscale values of edge pixels versus location. The overall sharpness of the image is the average of the measured grayscale angles weighted by the ratio of the power of the high frequency components of the image to the total power of the image (WGSA metric). The metric has been used in their automated image restoration program and the results demonstrate consistency for different optical conditions and attenuation ranges.

Focusing on underwater video processing algorithms, Arredondo and Lebart [39] propose a methodology to quantitatively assess the robustness and behavior of algorithms in the face of underwater noise. The principle is to degrade test images with simulated underwater perturbations, and the focus is to isolate and assess independently the effects of the different perturbations. These perturbations are simulated with varying degrees of severity. Jaffe and McGlamery's model is used to simulate blur and unequal illumination. Different levels of blurring are simulated using the forward-scattered component of images taken at different distances from the scene: R_c in (6) is increased, varying from R1 to R2 meters to the scene at intervals ΔR. The non-uniform lighting is simulated by placing the camera at distances between d1 and d2 meters, at intervals of Δd. In order to isolate the effect of non-uniform lighting, only the direct component is taken into account. The lack of contrast is simulated by histogram manipulation. As a specific application, different optical flow algorithms for underwater conditions are compared. A well known ground-truth synthetic sequence is used
Algorithm | Model's characteristics and assumptions | Experiments and data set | Image quality evaluation
Liu et al. [16] 2001 | Measurement of the PSF of water and image restoration. Standard and parametric Wiener deconvolution. | Measurements in a controlled environment. Set up: light source, slit images at 1-3 m in a water tank. Restoration of images taken in turbid water. | Visual inspection.
Schechner and Karpel [17] 2005 | Polarization associated with the prime visibility disturbance to be deleted (backscatter). Natural lighting. | Polarizer used to analyze the scene. Experiments in the sea (scene 26 m deep). | Visual inspection. Quantitative estimate for the visibility improvement. Estimation of the distance map of the scene.
Treibitz and Schechner [19] 2009 | Polarization-based method for visibility enhancement and distance estimation in scattering media. Artificial illumination. | Experiments in real underwater scenes: Mediterranean Sea, Red Sea and Lake of Galilee. | Visual inspection. Quantitative estimate for the visibility improvement.

Image enhancement and color correction methods

Bazeille et al. [20] 2006 | Automatic pre-processing. Natural and artificial illumination. | Deep marine habitats. Scenes with man-made objects on the sea floor. | Visual inspection. Quantitative index: closeness of the histogram to an exponential distribution, and tests for object recognition on the sea floor.

Table 1: Continued.

Algorithm | Model's characteristics and assumptions | Experiments and data set | Image quality evaluation
Ahlen et al. [28] 2007 | Hyperspectral data for color correction. Natural illumination. | Test image: colored plate at 6 m depth in the sea. Coral reefs and marine habitats. | Visual inspection.
Petit et al. [29] 2009 | Enhancement method: color space contraction using quaternions. Natural and artificial lighting. | Marine habitats at both shallow and deep waters. | Visual inspection.
Garcia et al. [30] 2002 | Compensating for lighting problems: non-uniform illumination. | Shallow waters on a sunny day. Shallow waters at sundown (simulating deep ocean). | Visual inspection.
Arredondo and Lebart [39] 2005 | Video processing algorithms. Simulations of perturbations. Natural and artificial lighting. | Test images are degraded with simulated perturbations. Simulations in shallow (1-7 m) and deep waters. | Visual inspection. Quantitative evaluation: mean angular error measured in motion estimation for different methods as a function of Gaussian noise.
for the experiments. The true motion of the sequence is known, and it is possible to measure quantitatively the effect of the degradations on the optical flow estimates. In [39], the different methods available are compared: the angular deviation between the estimated velocity and the correct one is measured, using an attenuation coefficient typical of the deep ocean. It is shown that the angular error increases linearly with the Gaussian noise for all the methods compared.

In order to assess the quality of their adaptive smoothing method for underwater image denoising, Arnold-Bos et al. [26] proposed a simple criterion based on a general result by Pratt [40]: for most well-contrasted and noise-free images, the distribution of the gradient magnitude histogram is close to exponential, except for a small peak at low gradients corresponding to homogeneous zones. They define a robustness index between 0 and 1 (linked to the variance of the linear regression of the gradient magnitude histogram) that measures the closeness of the histogram to an exponential distribution. The same index was also used by Bazeille et al. [20] to evaluate the performance of their algorithm.

In Table 1 we summarize the articles reviewed above, indicating the model assumptions and imaging conditions for which they have been developed and tested, as well as the image quality assessment method used to evaluate the corresponding results.

Making a quantitative comparison of the above-cited methods, and judging which of them gives the best or worst results, is beyond the scope of this article. Such a comparison would require a common database on which the corresponding algorithms could be tested according to specific criteria. To our knowledge, no such underwater database exists at present; building one would therefore be a future research line from which the underwater community would certainly benefit. We have, however, pointed out how each of the algorithms has been evaluated by its own authors: subjectively (by visual inspection) or objectively (by the implementation of an objective image quality measure). The majority of the algorithms reviewed here have been evaluated using subjective visual inspection of their results.

7. Conclusions

The difficulty of obtaining visibility of objects at long or short distance in underwater scenes presents a challenge to the image processing community. Even though numerous approaches for image enhancement are available, they are mainly limited to ordinary images, and few approaches have been specifically developed for underwater images. In this article we have reviewed some of them with the intention of bringing the information together for a better comprehension and comparison of the methods. We have summarized the available methods for image restoration and image enhancement, focusing on the conditions for which each of the algorithms has been originally developed. We have also analyzed the methodology used to evaluate the algorithms' performance, highlighting the works where a quantitative quality metric has been used.

As pointed out by our analysis, to boost underwater image processing, a common suitable database of test images for different imaging conditions, together with standard criteria for the qualitative and/or quantitative assessment of the results, is still required.

Nowadays, leading advancements in optical imaging technology [41, 42] and the use of sophisticated sensing techniques are rapidly increasing the ability to image objects in the sea. Emerging underwater imaging techniques and technologies make it necessary to adapt and extend the above-cited methods to, for example, handle data from multiple sources that can extract three-dimensional scene information. On the other hand, studying the vision systems of underwater animals (their physical optics, photoreceptors, and neurophysiological mechanisms) will certainly give us new insights into the processing of underwater images.
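The angular deviation used in [39] to score optical-flow estimates can be sketched as follows. This is only an illustration, not the authors' implementation; it assumes the common convention of lifting each flow vector to a homogeneous 3-vector (u, v, 1) before measuring the angle, which keeps the error finite for very small velocities.

```python
import numpy as np

def angular_error_deg(u_est, v_est, u_true, v_true):
    """Per-pixel angular deviation (degrees) between estimated and
    ground-truth optical flow fields.

    Each flow vector is lifted to a homogeneous 3-vector (u, v, 1),
    a standard convention for optical-flow angular error, and the
    angle between the estimated and true vectors is returned.
    """
    est = np.stack([u_est, v_est, np.ones_like(u_est)], axis=-1)
    true = np.stack([u_true, v_true, np.ones_like(u_true)], axis=-1)
    cos = np.sum(est * true, axis=-1) / (
        np.linalg.norm(est, axis=-1) * np.linalg.norm(true, axis=-1)
    )
    # Clip guards against rounding slightly outside [-1, 1].
    return np.degrees(np.arccos(np.clip(cos, -1.0, 1.0)))
```

Averaging this map over the image gives a single score per method, so the growth of the error with the injected Gaussian noise can be plotted for each optical-flow estimator.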
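The gradient-histogram robustness index of Arnold-Bos et al. [26] described above can be sketched as below. This is a minimal illustration under stated assumptions, not the published implementation: the binning, the number of low-gradient bins skipped, and the use of the coefficient of determination of the log-histogram fit as the index in [0, 1] are all choices made here for concreteness.

```python
import numpy as np

def robustness_index(image, n_bins=64):
    """Sketch of a gradient-histogram robustness index in [0, 1].

    For a well-contrasted, noise-free image the gradient-magnitude
    histogram is approximately exponential, so its logarithm is
    approximately linear in the gradient magnitude. The index is the
    goodness (R^2) of a linear regression on the log-histogram.
    """
    img = np.asarray(image, dtype=float)
    # Gradient magnitude via finite differences.
    gy, gx = np.gradient(img)
    mag = np.hypot(gx, gy)
    hist, edges = np.histogram(mag, bins=n_bins)
    centers = 0.5 * (edges[:-1] + edges[1:])
    keep = hist > 0
    # Skip the first bins: the low-gradient peak from homogeneous zones.
    keep[:2] = False
    x, y = centers[keep], np.log(hist[keep])
    if x.size < 3:
        return 0.0
    # Linear regression of the log-histogram; R^2 measures how close
    # the histogram is to an exponential distribution.
    slope, intercept = np.polyfit(x, y, 1)
    residuals = y - (slope * x + intercept)
    ss_res = np.sum(residuals ** 2)
    ss_tot = np.sum((y - y.mean()) ** 2)
    r2 = 1.0 - ss_res / ss_tot if ss_tot > 0 else 0.0
    return float(np.clip(r2, 0.0, 1.0))
```

A score close to 1 indicates a near-exponential gradient histogram (well contrasted, low noise); heavy noise or poor contrast pushes the score toward 0, which is why the same index could serve both [26] and Bazeille et al. [20] as an output-quality measure.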