Abstract—Obstacle detection under adverse weather conditions, especially foggy conditions, is a challenging task because the contrast is drastically reduced. In particular, in daylight fog, the contrast of images grabbed by in-vehicle cameras in the visible light range is drastically degraded, which makes current driver assistance that relies on cameras very sensitive to weather conditions. The visibility distance is calculated from the camera projection equations and the blurring due to the fog. The effects of daylight fog vary across the scene and are exponential with respect to the depth of scene points.

Key-Words—Fog removal, atmospheric visibility distance, contrast restoration, genetic algorithm.

I. INTRODUCTION

Recently, many systems have been developed that use computers and various sensors to assist driving. Some notable examples include self-steering by white-line detection, a rear-end collision-prevention system that operates by measuring the distance to the vehicle ahead, a danger notification system that recognizes pedestrians, and a system that automatically operates the windshield wipers upon recognizing rain drops. When considering a driving assistance system, we cannot ignore changes in weather conditions, since in such adverse weather conditions as rain, snow, or fog, driving is more difficult than in fair conditions, leading to a significant increase in the accident rate. Therefore, a close relationship exists between driver assistance and weather recognition. In this paper, we focus on fog detection. Fog negatively influences human perception of traffic conditions, making for potentially dangerous situations. Automatic lighting of fog lamps, speed control, and rousing of attention are examples of potential assistance to be realized with respect to fog recognition. Under foggy conditions, the distance to a preceding vehicle's tail lamp is perceived to be 60% further away than under fair conditions. Furthermore, fog changes significantly both temporally and spatially, and as a result there is a need for real-time detection using in-vehicle sensors. Installing large numbers of sensors along roads might be one solution, though it would not accurately reflect a driver's visual condition and would also be a very expensive system to establish. Considering these problems, we propose a method that classifies fog density into three levels using in-vehicle camera images and millimetre-wave (mm-W) radar data. The image from the in-vehicle camera reflects the driver's visual conditions, which are vital when driving; this is the prime advantage of using an in-vehicle camera. We also evaluate the degradation in visibility of images that are captured in foggy conditions, especially by focusing on the change in visibility of a preceding vehicle. We must also take into account the distance to the targets to determine the fog density, because under the same fog condition, nearby objects are easy to see while distant objects are not. We therefore use mm-W radar together with the in-vehicle camera, since it can measure distance without being influenced by adverse weather.

Under adverse meteorological conditions, the contrast of images that are grabbed by a classical in-vehicle camera in the visible light range is drastically degraded, which makes current in-vehicle applications relying on such sensors very sensitive to weather conditions. An in-vehicle vision system should take fog effects into account to be more reliable. A first solution is to adapt the operating thresholds of the system or to momentarily deactivate it if these thresholds have been surpassed. A second solution is to remove weather effects from the image beforehand. Unfortunately, the effects vary across the scene: they are exponential with respect to the depth of scene points. Consequently, space-invariant filtering techniques cannot be directly used to adequately remove weather effects from images. A judicious approach is to detect and characterize weather conditions, estimate the decay in the image, and then remove it.
II. FOG EFFECTS ON VISION

A. Visual Properties of Fog

The attenuation of luminance through the atmosphere was studied by Koschmieder, who derived an equation relating the apparent luminance L of an object located at distance d to the luminance L0 measured close to this object:

L = L0 e^(−βd) + L∞ (1 − e^(−βd))   (1)

where β is the extinction coefficient of the atmosphere and the second term, which increases with distance, is also named air light. L∞ is the atmospheric luminance; in the presence of fog, it is also the background luminance on which the target can be detected. The previous equation may be written as

L − L∞ = (L0 − L∞) e^(−βd)   (2)

On the basis of this equation, Duntley developed a contrast attenuation law, stating that a nearby object exhibiting a contrast C0 with respect to the background is perceived at distance d with the contrast

C = (L − L∞)/L∞ = C0 e^(−βd)   (3)

This expression serves to base the definition of a standard dimension called the "meteorological visibility distance" Vmet, i.e., the greatest distance at which a black object (C0 = −1) of a suitable dimension can be seen in the sky on the horizon, with the threshold contrast set at 5%. It is, thus, a standard dimension that characterizes the opacity of a fog layer. This definition yields the expression Vmet = −ln(0.05)/β ≈ 3/β.

Fig. 1. Fog or haze luminance is due to the scattering of daylight. Light coming from the sun and scattered by atmospheric particles toward the camera is the air light A; it increases with the distance. The light emanating from the object R is attenuated by scattering along the line of sight; the direct transmission T of R decreases with distance.

B. Camera Response

Let us denote f the camera response function, which models the mapping from scene luminance to image intensity by the imaging system, including optics as well as electronics. The intensity I of a pixel is the result of f applied to the sum of the air light A and the direct transmission T. We assume that the mapping between the incident energy on the charge-coupled-device (CCD) sensor and the intensity in the image is linear. This is generally the case for short exposure times because it prevents the CCD array from being saturated. Furthermore, short exposure times (1–4 ms) are used on in-vehicle cameras to reduce the motion blur. This assumption can thus be considered as valid, and (1) becomes

I = R e^(−βd) + A∞ (1 − e^(−βd))

where R is the intensity corresponding to the intrinsic luminance of the scene point and A∞ is the background sky intensity.

C. Enhancement

The aim of image enhancement is to improve the interpretability or perception of information in images for human viewers, or to provide "better" input for other automated image-processing techniques. For image contrast enhancement by sub-block overlapped histogram equalization, however, the complexity of the computation is very high.
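The attenuation and visibility relations above can be checked numerically. The following is a minimal Python sketch of Koschmieder's law, Duntley's contrast attenuation, and the resulting meteorological visibility distance; the function names and the sample extinction coefficient are our illustrative choices, not part of the paper.

```python
import math

def apparent_luminance(L0, L_inf, beta, d):
    """Koschmieder's law: apparent luminance of an object at distance d."""
    t = math.exp(-beta * d)          # direct-transmission factor
    return L0 * t + L_inf * (1.0 - t)

def apparent_contrast(C0, beta, d):
    """Duntley's contrast attenuation law: C = C0 * exp(-beta * d)."""
    return C0 * math.exp(-beta * d)

def v_met(beta, threshold=0.05):
    """Meteorological visibility distance: the distance at which the
    contrast of a black object (C0 = -1) falls to the 5% threshold,
    i.e. -ln(0.05)/beta, approximately 3/beta."""
    return -math.log(threshold) / beta

beta = 0.03  # extinction coefficient in 1/m (fairly dense fog, assumed value)
print(v_met(beta))                                  # ~ 99.9 m
print(apparent_contrast(-1.0, beta, v_met(beta)))   # -0.05 by construction
```

Note that at d = 0 the apparent luminance reduces to L0, and as d grows it converges to the atmospheric luminance L∞, matching the discussion above.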
III. CONTRAST-RESTORATION METHOD

Restoration Principle

Here, we describe a simple method to restore scene contrast from an image of a foggy scene. Let us consider a pixel with known depth d. Its intensity I is given by the fog model of Section II, I = R e^(−βd) + A∞ (1 − e^(−βd)). The pair (A∞, β) characterizes the weather condition and is estimated beforehand; consequently, R can be directly estimated for all scene points from

R = I e^(βd) + A∞ (1 − e^(βd))

Fig 3. Restoration Principle Outputs
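The restoration equation is simply the algebraic inverse of the fog model. A minimal Python sketch, assuming A∞, β, and the pixel depth d are known (all sample values below are ours, for illustration only):

```python
import math

def fog_model(R, A_inf, beta, d):
    """Observed intensity of a scene point with intrinsic intensity R."""
    t = math.exp(-beta * d)
    return R * t + A_inf * (1.0 - t)

def restore(I, A_inf, beta, d):
    """Restoration: R = I*exp(beta*d) + A_inf*(1 - exp(beta*d))."""
    e = math.exp(beta * d)
    return I * e + A_inf * (1.0 - e)

# Round trip: fogging a pixel and restoring it recovers the intrinsic value.
R, A_inf, beta, d = 40.0, 200.0, 0.02, 75.0
I = fog_model(R, A_inf, beta, d)
print(restore(I, A_inf, beta, d))  # recovers 40.0 (up to rounding)
```

Because the correction grows exponentially with d, noise in distant pixels is amplified as well; in practice the restored intensity would be clipped to the valid range.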
The restoration pipeline is: foggy image → gray-scale conversion → edge detection (binary image) → genetic algorithm → iteration to reduce intensity levels → output.
1. Foggy Image

First, a foggy image is taken as input. The foggy image may be in any format, i.e., the color image may be JPEG, bitmap, etc. The foggy (color) image is converted into a grayscale image. A color image is a 3-dimensional image, whereas a grayscale image is a 2-dimensional image; image extraction is possible in 2 dimensions, while extraction from a 3-dimensional image is quite complicated. To convert the grayscale image into an edge map, the Canny algorithm is used.

2. Edge Detection

Edge detection is a fundamental tool in image processing; it reduces the amount of data to be processed while preserving the important structural properties of an image. If the edge detection step is successful, the subsequent task of interpreting the information contents in the original image may therefore be substantially simplified. However, it is not always possible to obtain such ideal edges from real-life images of moderate complexity. Edges extracted from nontrivial images are often hampered by fragmentation (meaning that the edge curves are not connected), missing edge segments, and false edges not corresponding to interesting phenomena in the image, thus complicating the subsequent task of interpreting the image data.
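Steps 1 and 2 can be sketched in a few lines of Python with NumPy. We use the common Rec. 601 luminance weights for the gray-scale conversion and a simple gradient-magnitude threshold as a stand-in for the Canny detector used in the paper (the threshold value and image are illustrative assumptions):

```python
import numpy as np

def to_grayscale(rgb):
    """Collapse a 3-D color array (H x W x 3) into a 2-D luminance array
    using the Rec. 601 weights."""
    return rgb[..., 0] * 0.299 + rgb[..., 1] * 0.587 + rgb[..., 2] * 0.114

def edge_map(gray, thresh=50.0):
    """Binary edge image from the gradient magnitude (a simplified
    stand-in for the Canny algorithm, which adds Gaussian smoothing,
    non-maximum suppression, and hysteresis thresholding)."""
    gy, gx = np.gradient(gray.astype(float))
    mag = np.hypot(gx, gy)
    return mag > thresh

# Toy image: dark left half, bright right half -> one vertical edge.
img = np.zeros((8, 8, 3), dtype=np.uint8)
img[:, 4:, :] = 255
edges = edge_map(to_grayscale(img))
print(edges.any())  # True: the step between the halves is detected
```

In the paper's pipeline the binary edge image produced here is what the genetic algorithm stage operates on.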
3. Gray Scale Images

A grayscale image (also called gray-scale, gray scale, or gray-level) is a data matrix whose values represent intensities within some range. MATLAB stores a grayscale image as an individual matrix, with each element of the matrix corresponding to one image pixel. By convention, this documentation uses the variable name I to refer to grayscale images. The matrix can be of class uint8, uint16, int16, single, or double. While grayscale images are rarely saved with a color map, MATLAB uses a color map to display them. For a matrix of class single or double, using the default grayscale color map, the intensity 0 represents black and the intensity 1 represents white. For a matrix of type uint8, uint16, or int16, the intensity intmin(class(I)) represents black and the intensity intmax(class(I)) represents white.

IV. GENETIC ALGORITHM

It is not possible to cover everything about genetic algorithms in these pages, but you should get some idea of what genetic algorithms are and what they could be useful for. Do not expect any sophisticated mathematical theories here. Genetic algorithms are a part of evolutionary computing, which is a rapidly growing area of artificial intelligence. As you can guess, genetic algorithms are inspired by Darwin's theory of evolution. Simply said, a solution to a problem solved by genetic algorithms is evolved.

A. Initialization

Initially, many individual solutions are randomly generated to form an initial population. The population size depends on the nature of the problem, but typically contains several hundreds or thousands of possible solutions. Traditionally, the population is generated randomly, covering the entire range of possible solutions (the search space). Occasionally, the solutions may be "seeded" in areas where optimal solutions are likely to be found.
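The initialization step above amounts to sampling random bit-string chromosomes. A minimal sketch (population size, chromosome length, and seed are our illustrative assumptions):

```python
import random

def init_population(pop_size, chrom_len, seed=None):
    """Randomly generate an initial population of bit-string chromosomes,
    covering the search space uniformly."""
    rng = random.Random(seed)
    return [[rng.randint(0, 1) for _ in range(chrom_len)]
            for _ in range(pop_size)]

pop = init_population(pop_size=100, chrom_len=16, seed=42)
print(len(pop), len(pop[0]))  # 100 16
```

"Seeding" would simply mean replacing some of these random chromosomes with hand-constructed ones before the first generation.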
During each successive generation, a proportion of the existing population is selected to breed a new generation. Individual solutions are selected through a fitness-based process, where fitter solutions (as measured by a fitness function) are typically more likely to be selected. Certain selection methods rate the fitness of each solution and preferentially select the best solutions. Other methods rate only a random sample of the population, as rating every solution may be very time-consuming.
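Tournament selection is one example of the second kind of method mentioned above: only a random sample of the population is rated. A minimal sketch (the toy one-max fitness and sample size are our assumptions):

```python
import random

def tournament_select(population, fitness, k=3, rng=random):
    """Rate only a random sample of size k and return the fittest of it."""
    sample = rng.sample(population, k)
    return max(sample, key=fitness)

def ones(chrom):
    """Toy fitness: the number of 1-bits in the chromosome."""
    return sum(chrom)

pop = [[0, 0, 0], [1, 0, 0], [1, 1, 1]]
winner = tournament_select(pop, ones, k=3)
print(winner)  # [1, 1, 1] when the whole population is sampled
```

With k smaller than the population size, weaker individuals occasionally win, which preserves diversity while still favoring fitter solutions.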
B. Reproduction

The next step is to generate a second-generation population of solutions from those selected, through genetic operators: crossover (also called recombination) and/or mutation. For each new solution to be produced, a pair of "parent" solutions is selected for breeding from the pool selected previously. By producing a "child" solution using the above methods of crossover and mutation, a new solution is created which typically shares many of the characteristics of its "parents". New parents are selected for each new child, and the process continues until a new population of solutions of appropriate size is generated. Although reproduction methods that are based on the use of two parents are more "biology inspired", some research suggests that more than two parents can generate higher-quality chromosomes.

Fig 7. Input Images

Simple generational genetic algorithm pseudo code:
1. Choose the initial population of individuals.
2. Evaluate the fitness of each individual in that population.
3. Repeat on this generation until termination (time limit, sufficient fitness achieved, etc.):
a. Select the best-fit individuals for reproduction.
b. Breed new individuals through crossover and mutation operations to give birth to offspring.
c. Evaluate the individual fitness of the new individuals.
d. Replace the least-fit population with the new individuals.

Genetic algorithms are simple to implement, but their behavior is difficult to understand. In particular, it is difficult to understand why these algorithms frequently succeed at generating solutions of high fitness when applied to practical problems; in some cases their solutions have been highly effective, unlike anything a human engineer would have produced, and inscrutable as to how they arrived at that solution. The building block hypothesis (BBH) offers one explanation: the algorithm performs adaptation by identifying and recombining "building blocks", i.e., low-order schemata with above-average fitness.

The simplest algorithm represents each chromosome as a bit string. Typically, numeric parameters can be represented by integers, though it is possible to use floating-point representations. The floating-point representation is natural to evolution strategies and evolutionary programming; the notion of real-valued genetic algorithms has been offered but is really a misnomer, because it does not really represent the building block theory that was proposed by Holland in the 1970s. This theory is not without support, though, based on theoretical and experimental results. The basic algorithm performs crossover and mutation at the bit level. Other variants treat the chromosome as a list of numbers which are indexes into an instruction table, nodes in a linked list, or combinations of the above.
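The generational pseudo code above can be turned into a short runnable sketch. Everything below, including the one-max fitness, tournament size, crossover and mutation rates, is our illustrative choice and not taken from the paper:

```python
import random

def run_ga(fitness, chrom_len=20, pop_size=30, generations=60,
           crossover_rate=0.9, mutation_rate=0.02, seed=1):
    """Simple generational GA: initialize, then repeat select ->
    crossover -> mutate -> replace, and return the best individual."""
    rng = random.Random(seed)
    # 1. Choose the initial population of bit-string individuals.
    pop = [[rng.randint(0, 1) for _ in range(chrom_len)]
           for _ in range(pop_size)]
    for _ in range(generations):            # 3. repeat until termination
        def select():
            # a. Fitness-based selection: binary tournament.
            a, b = rng.sample(pop, 2)
            return a if fitness(a) >= fitness(b) else b
        children = []
        while len(children) < pop_size:
            p1, p2 = select(), select()
            # b. Breed via one-point crossover and bit-flip mutation.
            if rng.random() < crossover_rate:
                cut = rng.randrange(1, chrom_len)
                child = p1[:cut] + p2[cut:]
            else:
                child = p1[:]
            child = [bit ^ (rng.random() < mutation_rate) for bit in child]
            children.append(child)
        # d. Replace the old population with the new individuals.
        pop = children
    return max(pop, key=fitness)            # best individual found

best = run_ga(fitness=sum)  # maximize the number of 1-bits ("one-max")
print(sum(best))
```

In the paper's pipeline the fitness function would instead score candidate intensity mappings of the foggy image; the one-max problem is used here only to keep the sketch self-contained.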
Fig 10: Fog Removal Algorithm Outputs (Genetic Algorithm)
V. Conclusion
In this paper, we proposed a method that classifies fog
density according to a visibility feature of a preceding
vehicle and the distance to the vehicle. We obtained
promising results through an experiment using data
collected from an in-vehicle camera while driving the
vehicle. From the results, we confirmed that the
proposed method could make judgments that comply
with human perception.
In the future, we will consider an improved visibility
feature that does not vary depending on the type or
colour of the preceding vehicle. In addition, we will
consider the situation in which there is no preceding
vehicle at all.
VI. References
1. N. Hautière, J.-P. Tarel, and D. Aubert, "Mitigation of
visibility loss for advanced camera-based driver
assistance," IEEE Trans. Intell. Transp. Syst., vol. 11,
no. 2, Jun. 2010.
2. R. T. Tan, "Visibility in bad weather from a single
image," in Proc. IEEE Conf. Comput. Vis. Pattern
Recog., 2008.
3. N. Hautière, J.-P. Tarel, and D. Aubert, “Towards fog-
free in-vehicle vision systems through contrast
restoration,” in Proc. IEEE Conf. Comput. Vis. Pattern
Recog., 2007, pp. 1–8.
4. N. Hautière, J.-P. Tarel, J. Lavenant, and D. Aubert,
"Automatic fog detection and estimation of visibility
distance through use of an onboard camera," Mach.
Vis. Appl., vol. 17, no. 1, pp. 8–20, Apr. 2006.
5. S. G. Narasimhan and S. K. Nayar, "Contrast
restoration of weather degraded images," IEEE Trans.
Pattern Anal. Mach. Intell., vol. 25, no. 6, pp. 713–724,
Jun. 2003.