
Image and Vision Computing 25 (2007) 523–530

www.elsevier.com/locate/imavis

Cloud covering denoising through image fusion
Salvador Gabarda, Gabriel Cristóbal *

Instituto de Óptica ‘‘Daza de Valdés’’ (CSIC), Serrano 121, Madrid 28006, Spain

Received 12 March 2004; received in revised form 27 October 2005; accepted 27 March 2006

Abstract

This paper presents a solution to the cloud removal problem, based on a recently developed image fusion methodology that consists of applying a 1-D pseudo-Wigner distribution (PWD) transformation to the source images and of using a pixel-wise cloud model. Both features can also be interpreted as a denoising method centered on a pixel-level measure. Such a procedure is able to process sequences of multi-temporal registered images affected by spatially variant noise. The goal is to provide a clean 2-D image after removing the spatially variant noise disturbing the set of multi-temporal registered source images. This is achieved by taking as reference a statistically parameterized model of a cloud prototype. Using this model, a pixel-wise measure of the noise degree of the source images can be calculated through their PWDs. This denoising procedure makes it possible to choose the noise-free pixels from the set of given source images. The applicability of the method to the cloud removal paradigm is illustrated with different sets of artificial and natural cloudy or foggy images, partially occluded by clouds in different regions. Another advantage of the present approach is its reduced computational cost, since the 1-D case is preferred over a full 2-D implementation of the PWD.
© 2006 Elsevier B.V. All rights reserved.

Keywords: Wigner distribution; Image fusion; Image enhancement

1. Introduction

The recovery of an image from a degraded realization has been the subject of many contributions in the area of image restoration, under the names of image deblurring or image deconvolution [1]. When image restoration is accomplished without any "a priori" knowledge about the degradation process, we are dealing with blind image deconvolution methods [2]. If the blurring is not homogeneously distributed, the defocusing process affects different regions of the image with different strength. This scenario is referred to as space-variant blurring [3]. A special case of space-variant degradation occurs when multi-focus or multi-temporal images are available and, therefore, image fusion methods [4,5] can be applied. The method presented in this paper applies to multi-temporal images and describes a fusion algorithm based on transforming the source images by a pseudo-Wigner distribution (PWD), following a methodology recently developed by the authors. In a previous paper [6] the theoretical background of this fusion methodology was described and experimentally validated for defocused images coming from a digital camera. Later on, the method was successfully applied to multi-focus microscopic images [7]. However, the main contribution of this paper consists in extending the method to multi-temporal images for cloud removal purposes, under a similar fusion scenario. This case is formally different from the ones previously treated, where the origin of the degradation was image blurring. Out-of-focus blur affects digital camera images by segmenting the view, generally into two regions: foreground and background. Microscopic images present a very narrow depth of focus, originating a continuous change of the in-focus regions on equally spaced realizations. Here, in contrast, the origin of the degradation is occluding noise, ranging from a light haze to a dense cloud. Therefore, this case has fundamental differences that require separate treatment. Several experiments with artificial and realistic images are presented here to illustrate the performance of the method.

☆ Submitted to Image and Vision Computing, Elsevier Science, 2004 (in revision).
* Corresponding author. Tel.: +34 91 561 6800x2305; fax: +34 91 564 5557.

0262-8856/$ - see front matter © 2006 Elsevier B.V. All rights reserved.
doi:10.1016/j.imavis.2006.03.007

This paper is structured as follows. The mathematical background of the method, including a brief introduction to the pseudo-Wigner distribution, is described in Section 2 as the basis of the fusion process. Section 3 gives an introduction to the cloud removal problem and the particularities of our method. Some examples of image fusion are given in Section 4, together with a quantitative fusion quality assessment study performed on some ground-truth test images. Finally, conclusions are drawn in Section 5.

2. Mathematical background

Modeling and restoring images affected by spatially variant degradation is a challenging problem that still requires further attention in order to achieve a successful solution. One way of approaching the spatially variant degradation case is by means of conjoint spatial/spatial-frequency representations [8]. One of the most representative methods is the Wigner distribution (WD), a bilinear (quadratic) signal representation introduced by Wigner [9]. A comprehensive discussion of the WD properties can be found in a series of classical articles by Claasen and Mecklenbräuker [10]. Originally, the WD was applied to continuous variables as follows. Consider an arbitrary 1-D function z(x). The WD of z(x) is given by

$$W(x,u)=\int_{-\infty}^{\infty} z\!\left(x+\frac{a}{2}\right) z^{*}\!\left(x-\frac{a}{2}\right) e^{-iua}\,da, \qquad (1)$$

where "*" denotes the complex conjugate. By considering the shifting parameter a as a variable, Eq. (1) represents the Fourier transform (FT) of the product z(x + a/2)z*(x − a/2), where u denotes the spatial-frequency variable; hence, the WD can be interpreted as the local spectrum of the signal z(x). The Wigner distribution satisfies many desirable mathematical properties [8]. Although the WD was initially defined for continuous-variable functions, Claasen and Mecklenbräuker proposed at the beginning of the eighties a first definition for discrete-variable functions. However, some attempts to extend definitions of the WD to discrete signals have not yet been completely successful [11]. For this application, we have selected the following discrete Wigner distribution, similar to the one proposed by Claasen and Mecklenbräuker and also by Brenner [12]:

$$W(n,m)=2\sum_{k=-N/2}^{N/2-1} z(n+k)\,z^{*}(n-k)\,e^{-2i\left(\frac{2\pi m}{N}\right)k} \qquad (2)$$

This equation is limited to a spatial interval [−N/2, N/2 − 1] and has to be considered a pseudo-Wigner distribution (PWD). In Eq. (2), n and m represent the spatial and space-frequency discrete variables, respectively, and k is the shifting parameter, which is also discrete. Once the n ≡ (x, y) position on a given image is fixed, Eq. (2) originates an N-component vector. Furthermore, Eq. (2) must be interpreted as the discrete Fourier transform (DFT) of the product r(n, k) = z(n + k)z*(n − k). It is worth noting that the discretizing process implies the loss of some properties of the continuous WD. One important property preserved within the definition given by Eq. (2) is the inversion property, which is a greatly desirable feature for the recovery of the original signal and which allows performing local filtering operations on the images under consideration.

The quality of the source images can be formulated as follows. Consider an original image z and a reference degraded (i.e., by convolution) image g = z * h0. After applying the PWD to z(n) and g(n) separately, let us identify their respective transformations as Wz and Wg. From a filtering point of view, it can be helpful to consider that the PWDs of the functions z(n) and g(n) are related by the product

$$W_g(n,m)=W_z(n,m)\,H_1(n,m) \qquad (3)$$

where H1(n, m) can be interpreted as a pseudo-filtering function, since g might not exist [13,14]. For a thorough discussion see Ref. [15]. If the Fourier transformation is represented by F[·], Eq. (3) can be rewritten as

$$F[r_z(n,k) * h_1(n,k)] = F[r_z(n,k)]\,F[h_1(n,k)] \qquad (4)$$

where the convolution affects only the variable k and the relationship r_g(n, k) = r_z(n, k) * h_1(n, k) between the respective product functions holds, by the convolution property of the Fourier transform, where z(n), g(n), r_z(n, k) and r_g(n, k) all represent real non-negative functions. On the other hand, according to the energy conservation principle, h_0(n) and h_1(n, k) will not present negative coefficients. Consequently, the following conditions are satisfied:

$$\sum_{n=-N/2}^{N/2-1} h_0(n)=1 \ \text{ and } \ \forall n,\; h_0(n)\ge 0 \;\Rightarrow\; \|h_0(n)\|\le 1 \qquad (5)$$

$$\sum_{k=-N/2}^{N/2-1} h_1(n,k)=1 \ \text{ and } \ \forall n,k,\; h_1(n,k)\ge 0 \;\Rightarrow\; \|h_1(n,k)\|_{n=f}\le 1 \qquad (6)$$

Then, by applying the properties of the norm ||·|| to Eq. (4), the following expression is obtained:

$$\|F[r_z(n,k)*h_1(n,k)]\|_{n=f} \le \|F[r_z(n,k)]\|_{n=f} \qquad (7)$$

If the images to be fused have the same energy, Parseval's theorem tells us that equality holds in Eq. (7). Nevertheless, differences between PWDs belonging to different degraded versions of the same image will not necessarily vanish. Consequently, these differences can be used for measuring the quality of the source images. Thus, distances {d_i} can be measured from the same pixel location n ≡ f ≡ (x, y) of the source images {z_i} to a worst-case reference. Theoretically, the largest distance corresponds to the least locally degraded pixel. Then, by comparing the distances {d_i}, a decision map can be obtained, which provides an indexed matrix of noise-free regions. Therefore, the noise-free pixels from the source images can be assigned to a clean resulting image.
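For concreteness, the following minimal sketch (our own illustration, not code from the paper) evaluates Eq. (2) for every pixel of one image row, following the paper's row-wise sliding window with N = 8; the function name `pwd_1d` and the edge-padding border rule are assumptions, since the paper does not specify how borders are handled.

```python
import numpy as np

def pwd_1d(z, N=8):
    """Sketch of Eq. (2): the N-component 1-D pseudo-Wigner distribution
    of every pixel of one image row. Edge padding is an assumption."""
    z = np.asarray(z, dtype=float)
    half = N // 2
    ks = np.arange(-half, half)               # shifting parameter k
    ms = np.arange(-half, half)               # discrete frequency m
    # DFT kernel with the doubled exponent of Eq. (2): e^{-2i(2*pi*m/N)k}
    kernel = np.exp(-2j * (2 * np.pi / N) * np.outer(ms, ks))
    zp = np.pad(z, half, mode="edge")
    W = np.empty((len(z), N), dtype=complex)
    for i in range(len(z)):
        n = i + half                          # position in the padded row
        r = zp[n + ks] * zp[n - ks]           # r(n, k) = z(n+k) z*(n-k), z real
        W[i] = 2.0 * kernel @ r
    # For real z the PWD is real up to a small residue coming from the
    # asymmetric k = -N/2 term of the sum, so we keep the real part.
    return W.real
```

A full PWD field for an image can then be assembled by stacking `pwd_1d` over the rows, giving the (height, width, N) array of pixel-wise vectors assumed by the fusion sketch given in Section 3 below.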

3. The cloud removal problem

Optical remote sensing, as in typical satellite applications, has to cope with the so-called cloud cover problem, an important difficulty affecting the observation of the earth's surface. Diverse techniques have been proposed to solve the problem, including methods such as thresholding [16] or wavelet decomposition [17]. Typically, a clean image can be produced by creating a cloud-free mosaic from several multi-temporal images related to the same area of interest [18]. There are inherent difficulties in this matter when radiance variations are used as the main feature to discriminate cloudy and clear conditions, given that the intensity of the cloud can occasionally match the intensity of the earth. In this paper we propose a method insensitive to the intensity of image pixels, based on characterizing the frequency content of the cloud by a PWD pattern discernible solely from the earth's pattern. A set of multi-temporal images, containing partially cloud-covered views, is processed together to build an entirely clear image of the site. The cloudy areas in the input images are discarded and substituted by cloud-free areas from other images of the same set acquired at a different time. The whole procedure can be considered an automatic "cut-and-paste" method. The method operates in a pixel-wise scheme, allowing it to deal with cloudy cases as diverse as fogs or mists.

Fig. 1A and B show two images that present artificially generated cloudy regions, located in different parts of the image. Fig. 1C and D, to the right, represent the PWDs of the same given pixel position in both images. As explained in the previous section, the 1-D PWD is a pixel-wise vector with the same number of components as the window used in its calculation. Namely,

$$\mathrm{PWD}(x,y)=\mathbf{w}=[w_{-N/2},\ldots,w_{N/2-1}] \qquad (8)$$

represents the PWD of a pixel located at position (x, y) of a given image, calculated with a window of N pixels.

On the other hand, some multi-temporal images, such as satellite images, are often corrupted by clouds, and studies of the statistics of clouds, considered as noise, can be found in the literature [19]. Accordingly, the cloudy spots on multi-temporal images can be modeled by generating fractal noise through the midpoint displacement algorithm or by generating a noise image with a 1/f power spectrum [19]. An averaged cloud prototype was synthesized from a collection of randomly generated clouds and used to define the PWD pattern used as reference.
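As an illustration of the 1/f modeling just cited, the sketch below (our own illustration, not the authors' code) synthesizes a cloud-like field by shaping white Gaussian noise in the Fourier domain; the exponent `beta` (beta = 1 matches the cited 1/f power spectrum; larger values give smoother fields) and the final normalization are assumptions.

```python
import numpy as np

def synth_cloud(size=256, beta=1.0, seed=0):
    """Synthesize a cloud-like field with a ~1/f**beta power spectrum
    by filtering white noise; amplitudes are scaled by 1/f**(beta/2)."""
    rng = np.random.default_rng(seed)
    noise = rng.standard_normal((size, size))
    F = np.fft.fft2(noise)
    fy = np.fft.fftfreq(size)[:, None]
    fx = np.fft.fftfreq(size)[None, :]
    f = np.hypot(fx, fy)
    f[0, 0] = 1.0                      # avoid division by zero at DC
    cloud = np.real(np.fft.ifft2(F / f ** (beta / 2)))
    # Normalize to [0, 1] so the field can be blended over a test image.
    cloud -= cloud.min()
    return cloud / cloud.max()
```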

Fig. 1. Simulation of a cloud covering condition. In this example the camera is static, and the aerial images are partially occluded by clouds that change in shape and position from "A" to "B". (C) PWD of a cloudy pixel. (D) PWD of the same pixel at a ground position. Note the different frequency content of samples "C" and "D". The horizontal axis in "C" and "D" represents the spatial frequency position, whereas the vertical axis represents the spectral value content.

Then, the cloud was modeled by a vector as

$$\mathrm{PWD}_{\mathrm{cloud}}(x,y)=\mathbf{w}_{\mathrm{cloud}}=[0,0,\ldots,1,\ldots,0,0] \qquad (9)$$

We use a normalized version for earth and cloud, i.e., ||w|| = ||w_cloud|| = 1. The smooth change of gray values in the cloud model causes all elements in (9), except the central one, to be zero. Normalization makes the algorithm independent of the brightness of the pixels, i.e., the darkness or brightness of the cloud is irrelevant; only the local space-frequency content counts.

Considering the frequency content of the 1-D pixel-wise PWD, it is possible to process the data of a set of source images to identify the cloudy pixels. The method proposed herein involves scanning the original images with a sliding window of N pixels in a row and assigning to each pixel its N-component PWD vector. In such a way, we gather both spatial and spatial-frequency information for every pixel of the image. If we have multi-temporal information of the same area of observation, it is possible to compare the PWDs of homologous pixels with the PWD cloud model given by expression (9) to identify the cloudy pixels. A clean image is obtainable if the cloudy pixels are substituted by cloud-free pixels from other images of the same sequence.

Our method is based on the idea that every pixel in the image has a different N-component PWD vector associated with it; thus, a measure can be taken to determine its greatest distance to a reference cloud model. This approach is closely related to shape matching [20]. The selection of a similarity measure is required in order to take an accurate decision, since dissimilarity can be coupled with the notion of distance. Dealing with a distance metric, it is feasible to set a reference model. In this application, the reference model is the cloud prototype. In particular, such a cloud prototype is obtained by averaging a localized representation, such as the PWD, of selected cloud regions. It is well known that the more defocused an image is, the more the high frequencies are diminished and, consequently, the more its PWD is affected. Cloud is an extreme case where high frequencies are completely missing (see Fig. 1C). The main idea behind this reference is to have a worst case that allows establishing a distance measurement at pixel level from the PWD of every one of the M source images to the reference, i.e.,

$$d_i(x,y)=\|\mathbf{w}_i(x,y)-\mathbf{w}_{\mathrm{cloud}}(x,y)\|,\quad i\in\{1,2,\ldots,M\} \qquad (10)$$

The norm operator in Eq. (10) is defined as usual,

$$\|\mathbf{n}\|=\left(\sum_{i=-N/2}^{N/2-1} n^{2}(i)\right)^{1/2} \qquad (11)$$

where n represents an arbitrary vector of real numbers. The selected distance (Eq. (10)) is a Euclidean distance and belongs to the more general family of Minkowski distances [20].

The maximum value of d among the M different source images determines the image where the least cloudy pixel is located, i.e.,

$$I(x,y)=\arg\max_i\,[d_i(x,y)],\quad i\in\{1,2,\ldots,M\} \qquad (12)$$

Here I(x, y) has the form of a decision map, determining the image from which the gray pixel values have to be selected. Fig. 2A shows an example of the map obtained after processing the source images of Fig. 1A and B. Black pixels, labeled '1', indicate that those pixels were taken from the image of Fig. 1A and pasted into the resulting clean image shown in Fig. 2B. Analogously, white pixels are labeled '2' and indicate that pixels were taken from the image of Fig. 1B and pasted into Fig. 2B.

The whole fusion procedure can be summarized as follows. First, the 1-D PWD of each pixel is calculated using a segment of N pixels, computed according to the procedure shown in Sections 2 and 3. A small value of this analysis mask (e.g., N = 8) permits higher computational savings and a localized spectral analysis.
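To make Eqs. (9), (10) and (12) concrete, the following sketch (our illustration; the names and the (height, width, N) array layout are assumptions) normalizes the per-pixel PWD vectors, measures their Euclidean distance to the cloud reference, and builds the decision map and the cut-and-paste fused image.

```python
import numpy as np

def fuse_cloudy_stack(images, pwd_field, N=8):
    """Sketch of the decision-map fusion of Eqs. (9)-(12).

    images    : list of M registered 2-D arrays of identical shape.
    pwd_field : function mapping an image to its per-pixel PWD vectors,
                assumed to return an array of shape (h, w, N), e.g.
                built row by row with a 1-D PWD such as pwd_1d above.
    """
    w_cloud = np.zeros(N)
    w_cloud[N // 2] = 1.0                  # Eq. (9): only the central element
    d = []
    for img in images:
        W = pwd_field(img)                               # (h, w, N)
        norm = np.linalg.norm(W, axis=-1, keepdims=True)
        Wn = W / np.maximum(norm, 1e-12)                 # ||w|| = 1: brightness-free
        d.append(np.linalg.norm(Wn - w_cloud, axis=-1))  # Eq. (10)
    decision = np.argmax(np.stack(d), axis=0)            # Eq. (12)
    fused = np.choose(decision, images)                  # cut-and-paste step
    return fused, decision
```

For the two-image example of Fig. 1, `decision` plays the role of the map of Fig. 2A, with values 0 and 1 standing in for the labels '1' and '2'.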

Fig. 2. Result of processing the images of Fig. 1A and B. (A) Decision map. (B) Cloud-free resulting image. Black pixels in (A) represent which pixels from Fig. 1A have been incorporated into the fused image shown in (B), whereas white pixels represent which pixels from Fig. 1B have been incorporated into the fused image shown in (B).

This procedure is repeated for every image row until the full image is covered. Then, the Euclidean distance to the reference cloud model is obtained pixel by pixel using Eq. (10). The highest distance according to this criterion allows extracting the least noisy pixels of the source images. Since the source images (Fig. 1A and B) can be considered non-overlapping degraded versions of an unknown ground-truth image, the decision map shown in Fig. 2A allows computing the fused image (Fig. 2B). This decision map is an indexing matrix whose elements are related one-to-one with the pixels of the images. In the example, they have a value of '1' when d1(x, y) > d2(x, y) and a value of '2' otherwise.

4. Experimental results

A realistic example is presented in Fig. 3. Here the camera is fixed, looking at a city landscape and taking multi-temporal photographs under different foggy situations, ranging from clear weather to a thick mist. The method produces a cloud-free result, as shown in Fig. 4B.

The results of the examples presented in Figs. 1–4 illustrate the performance of the method presented here. It is worth noting that an area of the resulting image cannot be better than the best view of the same area in the source images. This is due to the fact that this method is actually a cut-and-paste method, and not an image enhancement method. For instance, the result shown in Fig. 2B has the same quality level as the undegraded areas of the source images; the only benefit is that the cloud has been removed. It corresponds to the way the picture would look if there had been no cloud when the pictures were taken.

In order to objectively quantify the quality of the method, a numerical test was performed using the six views and the six clouds shown in Fig. 5. Hence, 36 examples, similar to the one shown at the bottom of this figure, are possible. The method had to discriminate cloud from earth. The activity map generated by the process indicates the origin of the pixels used in the final cut-and-paste process, which produces the estimated clean output. Black pixels in the decision map indicate that pixels must be taken from the input labeled as B, whereas white pixels indicate that they must be taken from input 3; in other words, white pixels indicate misclassified pixels. This particular case has a percentage of correct decisions (PCD) of 95.41, i.e., the percentage of earth identified as earth (here equal to the percentage of cloud identified as cloud). The numerical results of the 36 examples elaborated with the images shown in Fig. 5 are summarized in Table 1. The results shown in the table indicate that the quality is highly dependent on the morphological content of the earth (see the differences in the values from images A to F in Table 1). One observable consequence is that earth spots having a smooth surface morphology are misclassified (image F). This effect can be explained by the fact that some earth patterns are very similar to the cloud pattern used as a model, preventing an accurate discrimination.

To illustrate the above explanation, a new example is presented in Fig. 6, where nine multi-temporal MODIS views of the Mississippi River Plume are shown. Here the cloud and the sea have similar spatial frequency contents. This means that it is impossible to discriminate water from cloud surfaces using only the spatial frequency information of the source images; more information, such as the brightness differences between water and cloud, is required. To process the images of this example, expression (10) was applied without a previous normalization of the vectors, keeping the intensity information active. As the clouds in these images appear as the brightest areas, the cloud model was here restricted to a white surface with its corresponding PWD. The result is shown in Fig. 7. A simple inspection of this figure shows a good approximation of the whole area under a perfect cloud-free condition.
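The MODIS variant just described simply drops the normalization step so that brightness stays informative; a minimal sketch of that change (our illustration; the white-surface reference vector `w_white` is a parameter here, not a value given in the paper) follows.

```python
import numpy as np

def distance_unnormalized(W, w_white):
    """Variant of Eq. (10) used for the MODIS example: no prior
    normalization of the PWD vectors, so intensity remains active.
    W       : per-pixel PWD vectors, shape (h, w, N).
    w_white : PWD vector of a uniform white (bright) surface, assumed
              here to be the cloud reference for this example."""
    return np.linalg.norm(W - w_white, axis=-1)
```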

Fig. 3. Ten frames corresponding to different realistic weather situations (April 2002). Source: WILD (Weather and Illumination Database), Columbia
University, Computer Science Dept. In this example the camera is static in all the frames and therefore the registration process between them is not needed.

Fig. 4. (A) Decision map corresponding to images of Fig. 3. Every gray-value encodes from which image the pixels should be taken, i.e., black from top
left frame of Fig. 3, white from bottom right frame of Fig. 3. (B) Fused image. Not all input images are necessarily represented in the fused output.

Fig. 5. Top: six views and six artificially generated clouds. Bottom: a particular example obtained when processing view B against cloud 3. The activity map generated by the process indicates the origin of the pixels to be used in the final cut-and-paste step. Black pixels in the decision map indicate that pixels must be taken from the input labeled as B, whereas white pixels indicate that they must be taken from input 3. In other words, white pixels indicate misclassified pixels. PCD stands for the percentage of correct decisions and indicates the percentage of earth identified as earth.

Table 1
Quality measurements

Image A vs.   Image B vs.   Image C vs.   Image D vs.   Image E vs.   Image F vs.   Total
clouds 1–6    clouds 1–6    clouds 1–6    clouds 1–6    clouds 1–6    clouds 1–6    average
98.9890       93.3954       99.5539       99.3882       99.4233       83.2171       95.6612

Percentage of correct decisions (PCD). Values represent the PCD averaged after processing every image with the six cloud realizations shown in Fig. 5. The average of the 36 possible cases appears in the last column of the table.
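The PCD values of Table 1 can be reproduced for such synthetic tests with a small helper; the sketch below is our own, hypothetical formulation of the verbally defined measure, assuming the ground-truth location of the cloudy source is known at every pixel.

```python
import numpy as np

def pcd(decision_map, cloudy_source_mask):
    """Percentage of correct decisions (PCD), as a sketch.

    decision_map       : per-pixel index of the source chosen by Eq. (12).
    cloudy_source_mask : per-pixel index of the source that is cloud-
                         covered there (known ground truth in the
                         synthetic tests of Fig. 5; assumes exactly one
                         cloudy source per pixel).
    A decision is counted as correct when the cloudy source is avoided."""
    correct = decision_map != cloudy_source_mask
    return 100.0 * np.count_nonzero(correct) / correct.size
```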

Finally, to put the results of Table 1 in perspective with other results published elsewhere, they can be compared with the measurements performed by Warner et al. [21] with images originated by the Measurement of Pollution in the Troposphere (MOPITT) instrument, aboard the Earth Observing System (EOS) Terra spacecraft. Their validation results showed that the MOPITT cloud detection threshold works well for scenes covered with more than 5–10% cloud cover, with the PCD ranging from 99.11 to 99.89, and poorly when cloud cover is less than 5%.

Fig. 6. Nine frames corresponding to multi-temporal MODIS views of the Mississippi River Plume (photographs from the Institute for Marine Remote
Sensing of the University of South Florida).

Fig. 7. (A) Decision map corresponding to images of Fig. 6. Every gray-value encodes from which image the pixels should be taken, i.e., black from top
left frame of Fig. 6, white from bottom right frame of Fig. 6. (B) Fused image. Not all input images are necessarily represented in the fused output.

Their cloud detection procedure is a thresholding method that compares the observed radiance against a model-calculated cloud-free radiance. Jedlovec and Laws [22] propose a new method called the bi-spectral threshold (BTH), based on a modification of the bi-spectral spatial coherence (BSC) method developed by Guillory et al. [23], and present some results with GOES imagery. Their results range from 87.76 to 96.34 for the PCD related to cloud detection.

5. Conclusions

A new cloud removal method, based on a pixel-wise PWD analysis of multi-temporal source images, has been presented and applied to different sequences of multi-temporal images. This method is able to operate under a cut-and-paste fusion scheme when two or more multi-temporal registered images are available. Experimental results show that the method behaves well when earth patterns present a rich morphological structure. Quality decreases when the earth's morphology tends to be regular, approaching the specific characteristics postulated to define the cloud model. In such cases the intensity information must be taken into account to define the cloud model, in addition to its frequency content. Once the cloud model is defined and parameterized, the images can be processed in an automatic way. Additionally, based on the performance shown by this method, new fusion applications for multi-temporal images can be expected, especially if multi-sensor information is considered in the process. One remarkable feature of the present method is its reduced computational cost: the use of a small 1-D window for the PWD analysis greatly decreases its computational time.

Acknowledgements

This work has been partially supported by the following grants: TEC2004-00834, TEC2005-24739-E, TEC2005-24046-E and 2004CZ0009 from the Spanish Ministry of Education and Science, and the PI040765 project from the Spanish Ministry of Health.

References

[1] R.L. Lagendijk, J. Biemond, Basic methods for image restoration and identification, in: A. Bovik (Ed.), Handbook of Image & Video Processing, Academic Press, San Diego, 2000, pp. 125–139.
[2] D. Kundur, D. Hatzinakos, Blind image deconvolution, IEEE Signal Processing Magazine 13 (3) (1996) 43–64.
[3] A.N. Rajagopalan, S. Chaudhuri, Space-variant approaches to recovery of depth from defocused images, Computer Vision and Image Understanding 68 (3) (1997) 309–329.
[4] D. Kundur, D. Hatzinakos, H. Leung, A novel approach to multispectral blind image fusion, in: B.V. Dasarathy (Ed.), Sensor Fusion: Architectures, Algorithms, and Applications, Proceedings of SPIE, vol. 3067, 1997, pp. 83–93.
[5] Z. Zhang, R.S. Blum, A categorization of multiscale-decomposition-based image fusion schemes with a performance study for a digital camera application, Proceedings of the IEEE 87 (1999) 1315–1328.
[6] S. Gabarda, G. Cristóbal, Multifocus image fusion through the pseudo-Wigner distribution, Optical Engineering 44 (4) (2005) 047001-1–047001-9.
[7] S. Gabarda, G. Cristóbal, F. Sroubek, Image fusion schemes using local spectral methods, Applications of Computer Vision Workshop, Prague, 2004.
[8] L.D. Jacobson, H. Wechsler, Joint spatial/spatial-frequency representation, Signal Processing 14 (1988) 37–68.
[9] E. Wigner, On the quantum correction for thermodynamic equilibrium, Physical Review 40 (1932) 749–759.
[10] T.A.C.M. Claasen, W.F.G. Mecklenbräuker, The Wigner distribution – a tool for time–frequency analysis, parts I–III, Philips Journal of Research 35 (1980) 217–250, 276–300, 372–389.
[11] J.C. O'Neill, P. Flandrin, W.J. Williams, On the existence of discrete Wigner distributions, EDICS Number: SPL.SP.2.3, Time–Frequency Signal Analysis, March 1998.
[12] K.H. Brenner, A discrete version of the Wigner distribution function, Proceedings of EURASIP: Signal Processing II: Theories and Applications, 1983, pp. 307–309.
[13] C. Gonzalo, Filtrado Óptico y Digital de Imágenes de Forma Espacialmente Variante Mediante la Función de Distribución de Wigner [Optical and Digital Space-Variant Image Filtering through the Wigner Distribution Function], PhD Thesis, Instituto de Óptica, CSIC, Madrid, 1989.
[14] C. Gonzalo, J. Bescós, L.R. Berriel-Valdós, J. Santamaría, Space-variant filtering through the Wigner distribution function, Applied Optics 28 (1989) 730–736.
[15] N.S. Subotic, B.E.A. Saleh, Time variant filtering of signals in the mixed time–frequency domain, IEEE Transactions on Acoustics, Speech and Signal Processing 33 (1985) 1479–1485.
[16] J.X. Warner, J.C. Gille, D.P. Edwards, D.C. Ziskin, W. Smith, Cloud detection and clearing for the Earth Observing System Terra satellite Measurements of Pollution in the Troposphere (MOPITT) experiment, Applied Optics 40 (8) (2001) 1269–1284.
[17] S. Wisetphanichkij, K. Dejhan, F. Cheevasuvit, S. Mitatha, C. Netbut, Multi-temporal cloud removing based on image fusion with additive wavelet decomposition, in: Proceedings of the 20th Asian Conference on Remote Sensing, Hong Kong, China, November 1999, CD.
[18] M. Li, S.C. Liew, L.K. Kwoh, Automated production of cloud-free and cloud shadow-free image mosaics from cloudy satellite imagery, in: Proceedings of the XXth ISPRS Congress, 12–13 July 2004, Istanbul, Turkey.
[19] L. Galleani, L. Cohen, G. Cristóbal, B. Suter, Generation and denoising of images with clouds, SPIE, Seattle, Washington, USA, 8–10 July 2002.
[20] R.M. Haralick, L.G. Shapiro, Computer and Robot Vision, vol. I, Addison-Wesley, 1992.
[21] J.X. Warner, J.C. Gille, D.P. Edwards, D.C. Ziskin, M.W. Smith, P.L. Bailey, L. Rokke, Cloud detection and clearing for the Earth Observing System Terra satellite Measurements of Pollution in the Troposphere (MOPITT) experiment, Applied Optics 40 (8) (2001).
[22] G.J. Jedlovec, K. Laws, Operational cloud detection in GOES imagery, in: Preprints, 11th Conference on Satellite Meteorology and Oceanography, Madison, WI, Amer. Meteor. Soc., 2001, pp. 412–415.
[23] A.R. Guillory, J.M. Lecue, G.J. Jedlovec, B.N. Whitworth, Cloud filtering using a bi-spectral spatial coherence approach, in: 9th Conference on Satellite Meteorology and Oceanography, AMS, Paris, France, 1998, pp. 374–376.
