
This article was downloaded by: [Gitam University] on: 04 October 2013, at: 20:56
Publisher: Taylor & Francis Informa Ltd, registered in England and Wales, Registered Number: 1072954. Registered office: Mortimer House, 37-41 Mortimer Street, London W1T 3JH, UK

International Journal of Remote Sensing
Publication details, including instructions for authors and subscription information: http://www.tandfonline.com/loi/tres20

Review article Multisensor image fusion in remote sensing: Concepts, methods and applications
C. Pohl & J. L. Van Genderen
Published online: 25 Nov 2010.

To cite this article: C. Pohl & J. L. Van Genderen (1998) Review article Multisensor image fusion in remote sensing: Concepts, methods and applications, International Journal of Remote Sensing, 19:5, 823-854, DOI: 10.1080/014311698215748 To link to this article: http://dx.doi.org/10.1080/014311698215748



Int. J. Remote Sensing, 1998, vol. 19, no. 5, 823-854

Review article
Multisensor image fusion in remote sensing: concepts, methods and applications
C. POHL
Western European Union - Satellite Centre (WEUSC), Research Division, P.O. Box 511, 28850 Torrejón de Ardoz (Madrid), Spain


and J. L. VAN GENDEREN
International Institute for Aerospace Survey and Earth Sciences (ITC), P.O. Box 6, 7500 AA Enschede, The Netherlands

(Received 5 May 1996; in final form 30 May 1997)

Abstract. With the availability of multisensor, multitemporal, multiresolution and multifrequency image data from operational Earth observation satellites, the fusion of digital image data has become a valuable tool in remote sensing image evaluation. Digital image fusion is a relatively new research field at the leading edge of available technology. It forms a rapidly developing area of research in remote sensing. This review paper describes and explains mainly pixel based image fusion of Earth observation satellite data as a contribution to multisensor integration oriented data processing.

1. Introduction

Earth observation satellites provide data covering different portions of the electromagnetic spectrum at different spatial, temporal and spectral resolutions. For the full exploitation of increasingly sophisticated multisource data, advanced analytical or numerical data fusion techniques are being developed (Shen 1990). Fused images may provide increased interpretation capabilities and more reliable results, since data with different characteristics are combined. The images vary in spectral, spatial and temporal resolution and therefore give a more complete view of the observed objects. It is the aim of image fusion to integrate different data in order to obtain more information than can be derived from each of the single sensor data alone ('1 + 1 = 3'). A good example is the fusion of images acquired by sensors sensitive to the visible/infrared (VIR) with data from active synthetic aperture radar (SAR). The information contained in VIR imagery depends on the multispectral reflectivity of the target illuminated by sunlight. SAR image intensities depend on the characteristics of the illuminated surface target as well as on the signal itself. The fusion of these disparate data contributes to the understanding of the objects observed. Image fusion has many aspects to be looked at. Before being able to implement and use an image fusion approach, some of the questions that need to be answered by the user include:

- What is the objective/application of the user?
- Which types of data are the most useful for meeting these needs?
0143-1161/98 $12.00 © 1998 Taylor & Francis Ltd

- Which is the 'best' technique of fusing these data types for that particular application?
- What are the necessary pre-processing steps involved?
- Which combination of the data is the most successful?

These and other questions comprise a large number of parameters to be considered. The most important question to be answered first is: what is the application the data is needed for? Knowing that, it is possible to define the necessary spectral and spatial resolution, which again has an impact on the selection of the remote sensing data. The selection of the sensor depends on satellite and sensor characteristics such as:

- orbit;
- platform;
- imaging geometry of optical and radar satellites;
- spectral, spatial and temporal resolution (Pohl and Genderen 1993).

The availability of specific data plays an important role too. It depends on the satellite coverage, operational aspects of the space agency running the satellite, atmospheric constraints such as cloud cover, financial issues, etc. Naturally, the same is valid for the observed area. The application also defines which season and weather conditions might be of relevance to the fused results. Especially the topography has a great influence on the fused remote sensing data, besides the actual ground cover and land use.

The next step is the choice of a suitable fusion level; the pre-processing steps depend on this. In the case of pixel based image fusion the geocoding is of vital importance. Related to the geometric correction of the data, details such as the:

- geometric model;
- ground control points (number, distribution, accuracy);
- digital elevation model;
- resampling method, etc.

need further consideration (Pohl 1996). Particularly when fusing very disparate data sets, e.g. VIR and SAR, the resulting grey values might not refer to physical attributes. The data has to be carefully evaluated using ground truth for verification (Pohl 1996, Polidori and Mangolini 1996). Another relevant issue in order to make full use of the benefits of image fusion is the selection of appropriate interpretation methods. The question of which technique to choose is very closely related to the definition of evaluation criteria. Both fields are rather difficult to define and depend very much on empirical results (further considerations on this topic are reviewed in §4 and §6.2). Bearing in mind all these considerations, further research is required to generalise and operationalize image fusion. Image fusion requires well-defined techniques as well as a good understanding of the input data.

This review is meant to contribute to the comprehension of image fusion, including the definition of terms, the explanation of existing techniques and the assessment of achievements in image fusion. It is structured in five sections. Following this introduction, a definition of image fusion provides the concepts involved. Then the paper explains why and in which cases image fusion might be useful. Thereafter, the existing techniques are reviewed, including the necessary processing steps, followed by an overview of the evaluation criteria for fused data. Finally, benefits and limitations of image fusion are summarized in the concluding section.

Figure 1. Processing levels of image fusion.

2. Concept of image fusion
Data fusion is a process dealing with data and information from multiple sources to achieve refined/improved information for decision making (Hall 1992). A general definition of image fusion is given as 'Image fusion is the combination of two or more different images to form a new image by using a certain algorithm' (Genderen and Pohl 1994). Image fusion is performed at three different processing levels, according to the stage at which the fusion takes place:
1. Pixel
2. Feature
3. Decision.

Image fusion at pixel level means fusion at the lowest processing level, referring to the merging of measured physical parameters. It uses raster data that is at least co-registered but most commonly geocoded. The geocoding plays an essential role because misregistration causes artificial colours or features in multisensor data sets which falsify the interpretation later on. It includes the resampling of image data to a common pixel spacing and map projection, the latter only in the case of geocoding. An illustration of the concept of pixel based fusion is visualised in figure 1. A comparison of methods for the geometric and radiometric correction of remote sensing data is given in Cheng et al. (1995) and Toutin (1994).

Fusion at feature level requires the extraction of objects recognised in the various data sources, e.g. using segmentation procedures. Features correspond to characteristics extracted from the initial images, depending on their environment, such as extent, shape and neighbourhood (Mangolini 1994). These similar objects (e.g. regions) from multiple sources are assigned to each other and then fused for further assessment using statistical approaches or Artificial Neural Networks (ANN).

Decision- or interpretation-level fusion represents a method that uses value-added data, where the input images are processed individually for information extraction. Here the images are already interpreted to reach the information or knowledge level before being fused. The obtained information is then combined applying decision rules to reinforce common interpretation, resolve differences and furnish a better understanding of the observed objects (Shen 1990).
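Pixel based fusion thus presupposes inputs on a common pixel spacing. A minimal sketch of this resampling step, assuming a simple integer zoom factor and nearest-neighbour interpolation (an illustrative choice, not a method prescribed by the paper), could look as follows:

```python
# Hypothetical sketch (not from the paper): nearest-neighbour resampling of a
# low-resolution band to the pixel spacing of a high-resolution band, the
# minimal prerequisite for pixel based fusion described above.

def resample_nearest(band, factor):
    """Upsample a 2-D band (list of lists) by an integer zoom factor."""
    out = []
    for row in band:
        expanded = [v for v in row for _ in range(factor)]  # repeat columns
        out.extend([expanded] * factor)                     # repeat rows
    return out

low_res = [[10, 20],
           [30, 40]]
high = resample_nearest(low_res, 2)
# each low-resolution pixel now covers a 2x2 block of the high-resolution grid
```

In practice the resampling is done within the geocoding step using a proper geometric model; the sketch only illustrates why a low-resolution pixel ends up covering a block of high-resolution pixels, which is the source of the blocky appearance mentioned in §4.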

In the literature a large number of terms can be found. Some terms define different approaches to image fusion, whilst others can be used equivalently. A very wide field of applications and approaches in image fusion is summarised by the terms synergy (Genderen et al. 1994) or synergism (Harris and Graham 1976) of remote sensing data. Other expressions accepted in this context are image merging (Carper et al. 1990), image integration (Welch and Ehlers 1988) and multi-sensor data fusion (Franklin and Blodgett 1993). Other scientists evaluate multisensor image data in the context of combined (Lichtenegger 1991), coincident (Crist 1984), complementary (Koopmans and Forero 1993), composited (Daily et al. 1979) or co-registered (Rebillard and Nguyen 1982) data analysis. Referring to a different level of image processing are the words information fusion (Shufelt and McKeown 1990) and data integration; the latter term is often used in the context of geographical information systems (GIS) (Ehlers 1993). Data integration comprises algorithms for image fusion too (Nandhakumar 1990). Often, in the case of data fusion, not only remote sensing images are fused, but further ancillary data (e.g. topographic maps, GPS coordinates, geophysical information, etc.) contribute to the resulting image (Harris and Murray 1989).

A simple overlay of multi-source data in a Red-Green-Blue (RGB) colour space integrates the data set. In these cases an alteration of the digital numbers amongst the different image channels is not always involved. The replacement of one of the three channels, or parts of it, with an image from another data source is called substitution (Suits et al. 1988). Keys et al. (1990) and Franklin and Blodgett (1993) describe the fusion as the computation of three new values for a pixel based on the known relation between the input data for the location in the image. A broader view of the term is presented by Mangolini (1994). He describes data fusion as a group of methods and approaches using multisource data of different nature to increase the quality of information contained in the data.

Apart from the three levels at which fusion can be performed, image fusion can be applied to various types of data sets (the references given are not exhaustive but meant as examples):

- single sensor-temporal (Weydahl 1993), e.g. SAR multitemporal for change detection;
- multi-sensor-temporal (Pohl and Genderen 1995), e.g. ERS-1/ERS-2 VIR/SAR image mapping;
- single sensor-spatial (Cliche et al. 1985), e.g. high/low resolution panchromatic/multi-spectral SPOT;
- multi-sensor-spatial (Chavez et al. 1991), e.g. high/low resolution SPOT/Landsat;
- single data-multi-sensor (Guyenne 1995);
- remote sensing data with ancillary data (Janssen et al. 1990), e.g. image with topographic maps.

Image fusion requires the input of data that provide complementary rather than redundant information; the input images differ in terms of spatial, spectral or temporal characteristics. This literature review mainly tackles pixel based image fusion; techniques used to fuse data at pixel level are described in §4. Having introduced the concept of image fusion, the following section outlines some goals of image fusion.

3. Objectives of image fusion
Image fusion is a tool to combine multisource imagery using advanced image processing techniques. It aims at the integration of disparate and complementary data to enhance the information apparent in the images, as well as to increase the reliability of the interpretation. This leads to more accurate data (Keys et al. 1990) and increased utility (Rogers and Wood 1990). It is also stated that fused data provides for robust operational performance, i.e. increased confidence, reduced ambiguity, improved reliability and improved classification (Rogers and Wood 1990). Studying the literature, it appears that image fusion is applied to digital imagery in order to:

- sharpen images (Chavez et al. 1991);
- improve geometric corrections (Strobl et al. 1990);
- provide stereo-viewing capabilities for stereophotogrammetry (Bloom et al. 1988);
- enhance certain features not visible in either of the single data alone (Leckie 1990);
- complement data sets for improved classification (Schistad-Solberg et al. 1994);
- detect changes using multitemporal data (Duguay et al. 1994);
- substitute missing information (e.g. clouds in VIR, shadows in SAR) in one image with signals from another sensor image (Aschbacher and Lichtenegger 1990);
- replace defective data (Suits et al. 1988).

The following paragraphs describe the achievements of image fusion in more detail.

3.1. Image sharpening
Image fusion can be used as a tool to increase the spatial resolution. In that case high-resolution panchromatic imagery is fused with low-resolution, often multispectral, image data. Examples of fusing XS/PAN or TM/PAN for resolution enhancement were published, amongst others, by Simard (1982), Cliche et al. (1985), Price (1987), Carper et al. (1990), Franklin and Blodgett (1993), Pellemans et al. (1993) and Ranchin et al. (1996). In this way the spectral resolution may be preserved while a higher spatial resolution is incorporated, which represents the information content of the images in much more detail (Franklin and Blodgett 1993, Pohl 1996). A distinction has to be made between pure visual enhancement (superimposition) and real interpolation of data to achieve higher resolution (wavelets), the latter being proposed amongst others by Mangolini (1994) and Ranchin et al. (1996). A special case forms the fusion of channels from a single sensor for resolution enhancement. Other approaches increase the spatial resolution of the output channel using a windowing technique on the six multispectral bands of TM (Sharpe and Kerr 1991), and the lower resolution thermal channel can be enhanced using the higher resolution spectral channels (Moran 1990). The fusion of SAR/VIR does not only result in the combination of disparate data but may also be used to spatially enhance the imagery involved.

3.2. Improvement of registration accuracy
The multi-sensor image registration gains importance in areas with frequent cloud cover, where VIR imagery is not continuously available. Geometric accuracy and the increase of scales using fusion techniques are of concern to mapping and map updating (Chiesa and Tyler 1990, Pohl 1996). The drawbacks of conventional image-to-map registration are the differences in appearance of the features used as control or tie points.

1988.6. 1986. 3. Pohl and J. Improved classi® cation The classi® cation accuracy of remote sensing images is improved when multiple source image data are introduced to the processing. Images from microwave and optical sensors o er complementary information that helps in discriminating the di erent classes. A new trend in this respect is the ANN approach since the conventional statistical techniques were found inappropriate for processing multisensor data from SAR and VIR.5.). SAR / SAR (multiple incidence angles) and VIR/ SAR were successfully used. Therefore radar images can contribute di erent signals due to di erences in surface roughness. Of advantage is an integrative recti® cation approach which iteratively improves the registration accuracy ( Ehlers 1991 ). The combination of these temporal images enhances information on changes that Downloaded by [Gitam University] at 20:56 04 October 2013 . 3. The feature enhancement capability of image fusion is visually apparent in VIR/ VIR combinations that often results in images that are superior to the original data ( Keys et al. Multisensor stereo mapping requires accurately co-registered input data and shows improved results in comparison with image-to-map registration ( Welch et al. Raggam et al.g. 1990 ). Bloom et al. 1994. This means that a certain area on the Earth is observed at di erent times of the day.. Creation of stereo data sets It is proven that multisensor stereo data sets can help to overcome the lack of information. 1985. Some constraints exist depending on radiometric di erences of the images used as stereo pair ( Domik et al. 3. in order to maximise the amount of information extracted from satellite image data useful products can be found in fused images ( Welch and Ehlers 1987). shape and moisture content of the observed ground cover. 3. Franklin and Blodgett 1993. 1992. observed ( Daily et al. Ye 1993 b). ERS-1. due to cloud cover. Welch et al. e. 1979. Toutin and Rivard 1995 ). 1990. 
Multisensor image fusion enhances semantic capabilities of the images and yields information which is otherwise unavailable or hard to obtain from single sensor data (Mitiche and Aggarwal 1986 ).4. The positioning of points is facilitated if they are localised in similar `views’ (remote sensing data). Some vegetation species can not be separated due to their similar spectral response. T emporal aspects for change detection Image fusion for change detection takes advantage of the di erent con® gurations of the platforms carrying the sensors. month or year. 1990 ). Working with VIR data users rely on the spectral signature of the ground targets in the image. Conclusively.3.828 C. Based on the orbital characteristics the so called repeat-cycle of the satellite is de® ned and varies from system to system from daily ( NOAA) up to tenths of days ( Landsat. Combinations of VIR/ VIR (di erent spatial resolution). van Genderen conventional image to map registration are the di erences in appearance of the features used as control or tie points. Buchroithner et al. SPOT. L . Ehlers 1991. etc. The use of multisensor data in image classi® cation becomes more and more popular with the increased availability of sophisticated software and hardware facilities to handle the increasing volumes of data. Feature enhancement Taking advantage of the di erent physical nature of microwave and optical sensor systems the fusion of those data results in an enhancement of various features  sou et al.

Temporal image fusion is applicable to images from the same sensor as well as to multiple sensor data. The images to be fused should be acquired at similar seasons, to account for seasonal changes which might influence the change detection capabilities of this approach. Multitemporal SAR data provides good potential for change detection analysis because of its all-weather capability (Weydahl 1992, Kattenborn et al. 1993, Kohl et al. 1994). It is nearly impossible to acquire multisensor data simultaneously, so that fusion of data from different sensors mostly includes a temporal factor. An overview on image enhancement techniques for change detection from multisensor data, including image differencing, ratioing and PCA, is given in Mouat et al. (1993).

3.7. Overcoming gaps
The images acquired by satellite remote sensing are influenced by a number of effects based on the carrier frequency of the electromagnetic waves. It is well known that VIR sensors are hindered by clouds to obtain information on the observed objects on Earth. Even the shadows of the clouds influence the interpretability of the imagery. SAR, on the other hand, suffers from severe terrain induced geometric distortions based on its side-looking geometry (layover, foreshortening, shadow), and the evaluation of single SAR imagery is difficult due to the ambiguities contained in SAR images (De Groof et al. 1992). To overcome these influences it is possible to combine different images acquired by the same or a different instrument. Apart from creating simple or complex mosaics, there are other techniques, such as the Optimal Resolution Approach (Haefner et al. 1993), IHS and others, to fuse the data for the purpose of filling the gaps; these are described in more detail in the following section. The evaluation of their usefulness and disadvantages is discussed later in §6.

4. Image fusion techniques
This section describes techniques to fuse remote sensing images. An overall processing flow for the fusion of optical and radar satellite data is given in figure 2. After having corrected the remote sensing images for system errors, the data is further radiometrically processed, depending on the application. There is a necessity of correcting the input imagery for radiometric distortions which occur especially in VIR data, e.g. atmospheric and illumination correction, in order to create compatible data sets. Optical imagery is influenced by the atmosphere during data acquisition and therefore needs correction and/or other radiometric enhancements such as edge enhancement. In the case of SAR, speckle reduction is an elementary operation in many applications. The speckle reduction can be performed at various stages of the processing; it is advisable to speckle filter before geocoding, which implies an improved object identification for GCP measurements (Dallemand et al. 1992). Some approaches allow the filtering and resampling during the geocoding process in one step, to reduce the number of times the data is resampled. Another aspect to be considered when dealing with SAR is the 16 to 8-bit data conversion. According to Knipp (1993), the 16 to 8-bit data conversion in SAR processing should be performed after speckle reduction, in order to reduce the loss of information. Following the radiometric processing, the data are geometrically corrected: in some cases the data is geocoded, in others only co-registered to coincide on a pixel by pixel basis, depending on the height variations in the area contained in the data. Again it has to be pointed out that this is a major element of pixel level image fusion.
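The ordering recommended above, speckle reduction first and the 16 to 8-bit conversion afterwards (Knipp 1993), can be sketched as follows. The 3x3 mean filter and the global linear stretch are illustrative stand-ins for a real speckle filter and radiometric scaling, not methods prescribed by the paper:

```python
# Illustrative sketch only: a 3x3 mean filter as a stand-in for speckle
# reduction, followed by a linear 16-bit to 8-bit conversion. The conversion
# is applied AFTER the filtering step, as recommended above.

def mean_filter3(img):
    """Smooth a 2-D image with a 3x3 mean filter (edge pixels kept as-is)."""
    h, w = len(img), len(img[0])
    out = [row[:] for row in img]
    for y in range(1, h - 1):
        for x in range(1, w - 1):
            window = [img[y + dy][x + dx] for dy in (-1, 0, 1) for dx in (-1, 0, 1)]
            out[y][x] = sum(window) // 9
    return out

def to_8bit(img):
    """Linearly rescale digital numbers to the displayable 0-255 range."""
    flat = [v for row in img for v in row]
    lo, hi = min(flat), max(flat)
    scale = 255 / (hi - lo) if hi > lo else 0
    return [[round((v - lo) * scale) for v in row] for row in img]

sar_16bit = [[120, 130, 125, 140],
             [500, 118, 122, 135],   # 500 simulates a speckle spike
             [119, 121, 124, 138],
             [117, 123, 126, 133]]
result = to_8bit(mean_filter3(sar_16bit))  # filter first, then convert
```

Reversing the order would let isolated speckle spikes dominate the 8-bit stretch and compress the useful signal range, which is the information loss the recommendation guards against.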

Subsequently, the data can be fused using one of the fusion techniques described in this review. The resulting image map can be further evaluated and interpreted in relation to the desired application.

Figure 2. Processing flow chart for pixel based image fusion (Pohl 1996).

In general, the techniques can be grouped into two classes: (1) colour related techniques, and (2) statistical/numerical methods. The first comprises the colour composition of three image channels in the RGB colour space, as well as more sophisticated colour transformations, e.g. IHS and HSV. Statistical approaches are developed on the basis of channel statistics, including correlation and filters; techniques like PCA and regression belong to this group. The numerical methods follow arithmetic operations such as image differencing and ratios, but also the adding of a channel to other image bands. A sophisticated numerical approach uses wavelets in a multiresolution environment. These fusion techniques are very sensitive to misregistration. In some cases, especially if image data of very different spatial resolution are involved, the resampling of the low resolution data to the pixel size of the high resolution image might cause a blocky appearance of the data. Therefore a smoothing filter can be applied before actually fusing the images (Chavez 1987). The next sections describe these techniques in more detail.

4.1. Band selection
Some techniques only allow a limited number of input bands to be fused (e.g. RGB, IHS), whilst others can be performed with any number of selected input bands. One method that relies on statistics in order to select the data containing most of the variance is the selection method developed by Chavez et al. (1982), called the Optimum Index Factor (OIF), mathematically described in equation (1):

    OIF = sum_{i=1..3} s_i / sum_{j=1..3} |cc_j|    (1)

where s_i = standard deviation of the digital numbers for band i, and cc_j = correlation coefficient between any two of the three bands.
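Equation (1) translates directly into code. The following sketch, with assumed digital numbers and an exhaustive search over band triplets (an illustration, not the authors' implementation), selects the combination with the highest OIF:

```python
# Sketch of the Optimum Index Factor (OIF) of Chavez et al. (1982) as given
# in equation (1): summed band standard deviations divided by the summed
# absolute correlation coefficients of the three band pairs.

from itertools import combinations
from statistics import pstdev, mean

def correlation(a, b):
    """Pearson correlation coefficient of two equally sized bands."""
    ma, mb = mean(a), mean(b)
    cov = sum((x - ma) * (y - mb) for x, y in zip(a, b)) / len(a)
    return cov / (pstdev(a) * pstdev(b))

def oif(bands):
    """OIF for exactly three bands, each a flat list of digital numbers."""
    s = sum(pstdev(b) for b in bands)
    cc = sum(abs(correlation(a, b)) for a, b in combinations(bands, 2))
    return s / cc

# assumed toy bands: pick the triplet with the highest OIF
bands = {
    'b1': [10, 12, 14, 16],
    'b2': [11, 13, 15, 17],   # nearly identical to b1 (highly correlated)
    'b3': [40, 10, 35, 5],
    'b4': [7, 30, 2, 28],
}
best = max(combinations(bands, 3), key=lambda t: oif([bands[k] for k in t]))
```

A high OIF favours triplets with large variance and low mutual correlation, which is exactly the "most information in three channels" criterion described above.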

Kaufmann and Buchroithner (1994) suggest selecting the three bands with the highest variance. The principal component analysis is another solution to reduce the number of channels containing the majority of the variance for the purpose of image fusion (Yésou et al. 1993 a). Another approach is to select the bands which are the most suitable to a certain application; this requires a priori knowledge by the user (Sheffield 1985). Others provide a certain band combination based on an algorithm which takes into account the statistics of the scene, including correlations between image channels (Sheffield 1985). Other considerations related to the OIF can be found in Chavez et al. (1984), Keys et al. (1990) and Yésou et al. (1993 a).

4.2. Colour related techniques
There are a variety of techniques to display image data in colour. In general, colours are mathematically described in two ways:
1. Tristimulus values.
2. Chromaticity.
Each of these has several special cases. Tristimulus values are based on three different spectral curves, which can represent filters or emission spectra. RGB refers to the tristimulus values associated with a colour monitor. A special representation of the spectral curves are the CIE tristimulus values, obtained with the CIE tristimulus coefficients (see figure 3). A chromaticity representation consists of the luminosity of an object and two quotients of the luminosity, leading to intensity, hue and saturation. A chromatic representation based on the CIE tristimulus values and using Cartesian coordinates is shown in figure 3.

Figure 3. CIE chromaticity diagram (Haydn et al. 1982).

4.2.1. Colour composites (RGB)
The so-called additive primary colours allow one to assign three different types of information (e.g. image channels) to the three primary colours red, green and blue. Together they form a colour composite that can be displayed with conventional media, e.g. a cathode ray tube (CRT). The RGB composite is one of the most common techniques to colour composite multi-sensor data and is mathematically described first in the following section. Its advantages are its simplicity and the fact that other colour representations have to be transformed to RGB in order to be displayed on a colour monitor (Haydn et al. 1982, Russ 1995).
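As a toy illustration of this assignment, the following sketch maps three channels to red, green and blue after a simple linear stretch to the displayable 0-255 range, a minimal look-up-table operation (the band values are assumed, not taken from the paper):

```python
# Illustrative sketch: build an RGB colour composite from three channels,
# each linearly stretched to 0-255 before display.

def stretch(channel):
    """Linear contrast stretch of a flat list of digital numbers to 0-255."""
    lo, hi = min(channel), max(channel)
    if hi == lo:
        return [0 for _ in channel]
    return [round(255 * (v - lo) / (hi - lo)) for v in channel]

def rgb_composite(red, green, blue):
    """Per-pixel (R, G, B) display triplets from three stretched bands."""
    return list(zip(stretch(red), stretch(green), stretch(blue)))

vir_band = [13, 57, 200, 255]     # assumed optical channel
sar_band = [800, 1200, 950, 400]  # assumed SAR channel, different value range
other    = [0, 10, 5, 2]
composite = rgb_composite(vir_band, sar_band, other)
```

The per-channel stretch is what distributes the available 0-255 grey values over each band's actual data range; more elaborate LUT or histogram operations refine the same idea for visual interpretation.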

The colour composite facilitates the interpretation of multichannel image data due to the variations in colours based on the values in the single channels. The possibilities of varying the composite are manifold; depending on the selection of the input image channels, the fused data will show different features. It might be of advantage to invert input channels before combining them in the RGB display with other data, depending on the objects of interest to be highlighted (Pohl et al. 1994). Very important for the colour composite is the distribution of the available 0-255 grey values to the range of the data (Harrison and Jupp 1990). The grey scale value is stored in a look up table (LUT) that contains the voltages sent to the display tube, e.g. a cathode ray tube (CRT). Operations on the LUT and the histogram of the image data can enhance the colour composite for visual interpretation (Russ 1995).

Examples of successfully used colour composites of optical and microwave data are described by Aschbacher and Lichtenegger (1990), Hinse and Coulombe (1994), Comhaire et al. (1994), Dallemand et al. (1992), Marek and Schmidt (1994), Oprescu et al. (1994), Vornberger and Bindschadler (1992) and Pohl et al. (1994). On the basis of the complementary information from VIR (spectral reflectivity) and SAR (surface roughness), the features are enhanced in the fused imagery. Work in the field of geology was published, amongst others, by Daily et al. (1979), Zobrist et al. (1979) and Yésou et al. (1993 b). Reports about multi-sensor optical composites can be found in Welch et al. (1985) and Chavez (1987). Multitemporal ERS-1 SAR colour composites were used by Comhaire et al. (1994), and multisensor SAR fusion by RGB is reported by Marek and Schmidt (1994). In many cases the RGB technique is applied in combination with another image fusion procedure, e.g. IHS, PCA and others, which are explained in the coming sections.

4.2.2. Intensity-Hue-Saturation (IHS)
The IHS colour transformation effectively separates spatial (I) and spectral (H, S) information from a standard RGB image. It relates to the human colour perception parameters. The mathematical context is expressed by equation (2) (a-c):

    [ I  ]   [  1/√3    1/√3    1/√3 ] [ R ]
    [ n1 ] = [  1/√6    1/√6   -2/√6 ] [ G ]        (2 a)
    [ n2 ]   [  1/√2   -1/√2      0  ] [ B ]

    H = tan^(-1)(n2/n1)                             (2 b)

    S = √(n1² + n2²)                                (2 c)

I relates to the intensity, while 'n1' and 'n2' represent intermediate variables which are needed in the transformation; H and S stand for hue and saturation. There are two ways of applying the IHS technique in image fusion: direct and substitutional. The first refers to the transformation of three image channels assigned to I, H and S.

The second approach transforms three channels of the data set representing RGB into the IHS colour space, which separates the colour aspects into its average brightness (intensity), its dominant wavelength contribution (hue) and its purity (saturation) (Gillespie et al. 1986, Carper et al. 1990). Then, one of the components is replaced by a fourth image channel which is to be integrated. Most commonly the intensity channel is substituted. When fusing VIR with SAR data, the substituted intensity corresponds to the surface roughness, whilst both the hue and the saturation are related to the surface reflectivity or composition (Grasso 1993). In many published studies the channel that replaces one of the IHS components is contrast stretched to match the latter. A reverse transformation from IHS to RGB, as presented in equation (3), converts the data into its original image space to obtain the fused image (Hinse and Proulx 1995 b):

    [ R ]   [ 1/√3    1/√6    1/√2 ] [ I  ]
    [ G ] = [ 1/√3    1/√6   -1/√2 ] [ n1 ]        (3)
    [ B ]   [ 1/√3   -2/√6      0  ] [ n2 ]

The IHS transformation can be performed either in one or in two steps. The two step approach includes the possibility of contrast stretching the individual I, H and S channels (Rast et al. 1991). It has the advantage of resulting in colour enhanced fused imagery (Ehlers 1991). A variation of the IHS fusion method applies a stretch to the hue and saturation components before they are combined and transformed back to RGB (Zobrist et al. 1979); this is called colour contrast stretching (Gillespie et al. 1986). Replacing the intensity, the sum of the bands, by a higher spatial resolution value and reversing the IHS transformation leads to composite bands (Chavez et al. 1991). These are linear combinations of the original (resampled) multispectral bands and the higher resolution panchromatic band (Campbell 1993).

The use of the IHS technique in image fusion is manifold, but based on one principle: the replacement of one of the three components (I, H or S) of one data set with another image. The IHS technique has become a standard procedure in image analysis. It serves the colour enhancement of highly correlated data (Gillespie et al. 1986), feature enhancement (Daily 1983), the improvement of spatial resolution (Welch and Ehlers 1987, Carper et al. 1990) and the fusion of disparate data sets (Harris et al. 1990, Ehlers 1991). More results using IHS image fusion are reported by Rast et al. (1991), Jutz and Chorowicz (1993), Koopmans and Richetti (1993), Oprescu et al. (1994), Smara et al. (1996) and Yildimi et al. (1996). A closely related colour system to IHS (sometimes also called HSI) is the HSV: hue, saturation and value (Russ 1995). An example of HSV image fusion was presented by Chiesa and Tyler (1990).
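The substitutional principle of equations (2) and (3) can be sketched per pixel: transform RGB to (I, n1, n2), replace the intensity by another co-registered channel, and transform back. The numeric values below are illustrative only, not data from the paper:

```python
# Hedged sketch of intensity-substitution IHS fusion using the transform
# pair of equations (2) and (3). H and S of equations (2 b, 2 c) are carried
# implicitly by (n1, n2), so substituting I leaves them unchanged.

import math

S3, S6, S2 = math.sqrt(3), math.sqrt(6), math.sqrt(2)

def rgb_to_in1n2(r, g, b):
    """Forward transform, equation (2 a)."""
    i  = (r + g + b) / S3
    n1 = (r + g - 2 * b) / S6
    n2 = (r - g) / S2
    return i, n1, n2

def in1n2_to_rgb(i, n1, n2):
    """Reverse transform, equation (3) (transpose of the forward matrix)."""
    r = i / S3 + n1 / S6 + n2 / S2
    g = i / S3 + n1 / S6 - n2 / S2
    b = i / S3 - 2 * n1 / S6
    return r, g, b

def ihs_fuse(rgb_pixel, new_intensity):
    """Replace I of an RGB pixel by a co-registered high-resolution value."""
    _, n1, n2 = rgb_to_in1n2(*rgb_pixel)
    return in1n2_to_rgb(new_intensity, n1, n2)

# sanity check: substituting the original intensity restores the pixel
i, n1, n2 = rgb_to_in1n2(120, 80, 60)
restored = in1n2_to_rgb(i, n1, n2)
```

Because the forward matrix of equation (2 a) is orthonormal, the reverse transform is simply its transpose, which is why the round trip is lossless.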
The use of the IHS technique in image fusion is manifold, but based on one principle: the replacement of one of the three components (I, H or S) of one data set with another image. Most commonly the intensity channel is substituted. The transformation of three channels of the data set representing RGB into the IHS colour space separates the colour aspects into its average brightness (intensity), its dominant wavelength contribution (hue) and its purity (saturation) (Gillespie et al. 1986, Carper et al. 1990). The IHS technique has become a standard procedure in image analysis. It serves colour enhancement of highly correlated data (Gillespie et al. 1986), feature enhancement (Daily 1983), the improvement of spatial resolution (Welch and Ehlers 1987, Ehlers 1991) and the fusion of disparate data sets (Harris et al. 1990, Ehlers 1991). It has the advantage of resulting in colour enhanced fused imagery (Ehlers 1991).

A variation of the IHS fusion method applies a stretch to the hue and saturation components before they are combined and transformed back to RGB (Zobrist et al. 1979). In this case the intensity corresponds to the surface roughness, while both the hue and the saturation are related to the surface reflectivity or composition (Grasso 1993). More results using IHS image fusion are reported by Carper et al. (1990), Rast et al. (1991), Jutz and Chorowicz (1993), Koopmans and Richetti (1993), Oprescu et al. (1994), Smara et al. (1996) and Yildimi et al. (1996).

4.2.3. Luminance-chrominance
Another colour encoding system called YIQ has a straightforward transformation from RGB with no loss of information. Y, the luminance, is just the brightness of a panchromatic monochrome image; it combines the red, green and blue signals in proportion to the human eye's sensitivity to them.

The I and Q components of the colour are chosen for compatibility with the hardware used: I is essentially red minus cyan, while Q is magenta minus green. The relation between YIQ and RGB is shown in equation (4 a, b):

    ( Y )   ( 0.299   0.587   0.114 )   ( R )
    ( I ) = ( 0.596  -0.274  -0.322 ) . ( G )        (4 a)
    ( Q )   ( 0.211  -0.523   0.312 )   ( B )

    ( R )   ( 1.000   0.956   0.621 )   ( Y )
    ( G ) = ( 1.000  -0.272  -0.647 ) . ( I )        (4 b)
    ( B )   ( 1.000  -1.106   1.703 )   ( Q )

Since the Y, I and Q components are less correlated than the RGB ones, this transformation offers better possibilities for enhancing an image.

4.3. Statistical/numerical methods
In this section all operations are grouped together which deal with mathematical combinations of image channels. They comprise addition, subtraction, multiplication and division (ratio), as well as combinations of these operations.

4.3.1. Arithmetic combinations
Certain combinations of bands of images can lead to image sharpening, i.e. higher spatial resolution. There are a large number of publications containing suggestions on how to fuse high resolution panchromatic images with lower resolution multispectral data to obtain high resolution multispectral imagery. Details can be found in Simard (1982), Cliche et al. (1985), Pradines (1986), Price (1987), Welch and Ehlers (1987), Carper et al. (1990), Ehlers (1991), Mangolini et al. (1993), Munechika et al. (1993) and Pellemans et al. (1993).

Adding and multiplication
To enhance the contrast, adding and multiplication of images are useful, and the possibilities of combining the data using multiplication or summation are manifold. An example is the process expressed by equation (5 a, b):

    DNf = A (w1 DNa + w2 DNb) + B        (5 a)
    DNf = A DNa DNb + B                  (5 b)

where DNf, DNa and DNb refer to the digital numbers of the final fused image and the input images a and b, respectively; A and B are scaling factors and w1 and w2 are weighting parameters. The choice of weighting and scaling factors may improve the resulting images. The integration of SAR can also improve the spatial structure of an image because it introduces the surface roughness to the image. This method was successfully applied to Landsat-TM and SPOT PAN data (Yésou et al. 1993 b).
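The matrix pair of equation (4 a, b) can be checked numerically with a minimal sketch (assuming NumPy; the matrices are the standard YIQ coefficients as reconstructed above, rounded to three decimals):

```python
import numpy as np

# Equation (4 a): RGB -> YIQ; equation (4 b): YIQ -> RGB.
RGB2YIQ = np.array([[0.299,  0.587,  0.114],
                    [0.596, -0.274, -0.322],
                    [0.211, -0.523,  0.312]])
YIQ2RGB = np.array([[1.000,  0.956,  0.621],
                    [1.000, -0.272, -0.647],
                    [1.000, -1.106,  1.703]])

def rgb_to_yiq(rgb):   # rgb: (..., 3) array
    return rgb @ RGB2YIQ.T

def yiq_to_rgb(yiq):
    return yiq @ YIQ2RGB.T

pixel = np.array([0.4, 0.7, 0.2])
roundtrip = yiq_to_rgb(rgb_to_yiq(pixel))
```

Because the published coefficients are rounded, the round trip reproduces the input only to about two decimal places, which is sufficient for 8-bit imagery.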
Common combinations are SPOT PAN and SPOT XS, or SPOT PAN and Landsat TM. This was shown in an example by Guo and Pinliang (1989).

Difference and ratio images
Difference or ratio images are very suitable for change detection (Singh 1989, Mouat et al. 1993). The ratio method is even more useful because of its capability to emphasise slight signature variations (Zobrist et al. 1979).
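A minimal ratio-based change detection sketch follows (assuming NumPy; the tail thresholds are illustrative values, not ones given in the text):

```python
import numpy as np

def change_mask(date1, date2, low=0.8, high=1.25):
    """Ratio-based change detection: a ratio near 1 indicates no change;
    pixels in the lower or upper tail of the ratio distribution are
    flagged as change. Thresholds here are illustrative."""
    ratio = date1.astype(float) / np.maximum(date2.astype(float), 1e-12)
    return (ratio < low) | (ratio > high)

d1 = np.array([[100, 100], [ 50, 200]], dtype=float)
d2 = np.array([[ 98, 102], [100, 100]], dtype=float)
mask = change_mask(d1, d2)
```

In practice the thresholds are chosen from the tails of the observed ratio histogram, which is exactly the critical step discussed below.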

In digital change detection, two images from different dates are divided, band by band if the image data have more than one channel. If the intensity of the reflected energy is nearly the same in each image then the ratio image pixel is one: it indicates no change. The critical part of this method is selecting appropriate threshold values in the lower and upper tails of the distribution representing change pixel values. However, differences do not always refer to changes, since other factors, like differences in illumination, atmospheric conditions, sensor calibration, ground moisture conditions and registration of the two images, can lead to differences in radiance values. In some cases the resulting difference image contains negative values, and therefore a constant has to be added to produce positive digital numbers. In this respect the normalisation of the data is of advantage, as indicated in equation (6) (Griffiths 1988):

    (XS3 - XS2)/(XS3 + XS2) - (TM4 - TM3)/(TM4 + TM3) + C        (6)

with C = 128 for positive values.

A ratio for spatial enhancement is summarised by equation (7) (Munechika et al. 1993):

    DNHybridXS(i) = DNPAN DNXS(i) / DNSynPAN        (7)

where DNHybridXS(i) is the i-th band of the fused high resolution image, DNPAN the corresponding pixel in the high resolution input PAN image, DNXS(i) the super-pixel in the i-th band of the input low resolution XS image, and DNSynPAN the corresponding pixel in a low resolution synthetic PAN image, created from the low resolution multispectral bands that overlap the spectral response of the input high resolution PAN. The aim of this method is to maintain the radiometric integrity of the data while increasing the spatial resolution.

That is also the aim of the Brovey transform, named after its author. It is a formula that normalises the multispectral bands used for a RGB display and multiplies the result by any other desired data to add the intensity or brightness component to the image. The algorithm is shown in equation (8), where DNfused means the DN of the resulting fused image produced from the input data in n multispectral bands multiplied by the high resolution image DNhighres:

    DNfused = DNb1 / (DNb1 + DNb2 + ... + DNbn) DNhighres        (8)
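Equation (8) can be sketched directly (assuming NumPy and that the multispectral bands have already been resampled to the high resolution grid; the function name is our own):

```python
import numpy as np

def brovey(ms_bands, highres):
    """Brovey transform of equation (8): each multispectral band is normalised
    by the sum of all bands and multiplied by the high resolution channel."""
    ms = np.stack([b.astype(float) for b in ms_bands])      # (n, h, w)
    total = np.maximum(ms.sum(axis=0), 1e-12)
    return [band / total * highres for band in ms]

b1 = np.full((2, 2), 10.0)
b2 = np.full((2, 2), 30.0)
b3 = np.full((2, 2), 60.0)
pan = np.full((2, 2), 200.0)
f1, f2, f3 = brovey([b1, b2, b3], pan)
```

By construction the fused bands sum to the high resolution channel, so the inter-band proportions (the colour) of the input are preserved while the brightness comes from the sharper image.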
4.3.2. Principal component analysis
The PCA is useful for image encoding, image data compression, image enhancement, digital change detection, multitemporal dimensionality and image fusion. It is a statistical technique that transforms a multivariate data set of intercorrelated variables into a data set of new, uncorrelated linear combinations of the original variables: it generates a new set of axes which are orthogonal. The approach for the computation of the principal components (PCs) comprises the calculation of:
1. the covariance (unstandardised PCA) or correlation (standardised PCA) matrix,
2. the eigenvalues and eigenvectors, and
3. the PCs.
An inverse PCA transforms the combined data back to the original image space. PCA in image fusion has two approaches:
1. PCA of a multichannel image with replacement of the first principal component by a different image (Principal Component Substitution, PCS) (Chavez et al. 1991), or
2. PCA of all multi-image data channels (Yésou et al. 1993 a).
The first version follows the idea of increasing the spatial resolution of a multichannel image by introducing an image with a higher resolution. The higher resolution image replaces PC1 since it contains the information which is common to all bands, while the spectral information is unique for each band (Chavez et al. 1991); PC1 accounts for maximum variance, which can maximise the effect of the high resolution data in the fused image (Shettigara 1992). The channel which will replace PC1 is stretched to the variance and average of PC1. The second procedure integrates the disparate natures of multisensor input data in one image: the image channels of the different sensors are combined into one image file and a PCA is calculated from all the channels. Examples of image fusion applying the first and the second method of PCA are reported by Yésou et al. (1993 a) and Richards (1984).
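The PCS procedure (forward PCA, replacement of PC1 by the stretched high resolution channel, inverse PCA) can be sketched as follows. This is a minimal illustration assuming NumPy; it uses the unstandardised (covariance) variant and our own function name:

```python
import numpy as np

def pca_substitution(ms, pan):
    """Principal Component Substitution sketch: forward PCA of the
    multispectral bands, replacement of PC1 by the high resolution channel
    (stretched to the mean and variance of PC1), inverse PCA.
    ms: (h, w, n) array, pan: (h, w) array on the same grid."""
    h, w, n = ms.shape
    x = ms.reshape(-1, n).astype(float)
    mean = x.mean(axis=0)
    xc = x - mean
    cov = np.cov(xc, rowvar=False)            # unstandardised PCA
    eigval, eigvec = np.linalg.eigh(cov)      # ascending eigenvalue order
    eigvec = eigvec[:, ::-1]                  # put PC1 first
    pcs = xc @ eigvec
    p = pan.reshape(-1).astype(float)
    # stretch the replacement channel to PC1's mean and variance
    p = (p - p.mean()) / (p.std() + 1e-12) * pcs[:, 0].std() + pcs[:, 0].mean()
    pcs[:, 0] = p
    fused = pcs @ eigvec.T + mean
    return fused.reshape(h, w, n)

rng = np.random.default_rng(3)
ms = rng.random((8, 8, 3))
pan = rng.random((8, 8))
fused = pca_substitution(ms, pan)
```

Because the replacement channel is matched to PC1's statistics, the band means of the fused result stay close to those of the input, which is the radiometric property PCS aims for.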

The use of the correlation matrix implies a scaling of the axes so that the features receive a unit variance; it prevents certain features from dominating the image because of their large digital numbers. The signal-to-noise ratio (SNR) is significantly improved by applying the standardised PCA (Singh and Harrison 1985, Shettigara 1992). The PCA approach is sensitive to the choice of area to be analysed: better results are obtained if the statistics are derived from the whole study area rather than from a subset area (Fung and LeDrew 1987).

A similar approach to the PCS is accomplished in the C-stretch (colour stretch) (Rothery and Francis 1987) and the D-stretch (de-correlation stretch) (Ehlers 1987, Campbell 1993). In C-stretching, PC1 is discarded, or set to a uniform DN across the entire image, before applying the inverse transformation. This yields three colour stretched bands which, when composited, retain the colour relations of the original colour composite, while albedo and topographically induced brightness variations are removed. In D-stretching, three channel multispectral data are transformed on to principal component axes, stretched to give the data a spherical distribution in feature space, and then transformed back onto the original axes (Jutz and Chorowicz 1993). The de-correlation stretch helps to overcome the perceived problem that the original data often occupy a relatively small portion of the overall data space (Campbell 1993).
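The D-stretch can be sketched in a few lines (assuming NumPy; equalising the component variances is what produces the "spherical" distribution described above):

```python
import numpy as np

def decorrelation_stretch(ms):
    """D-stretch sketch: transform three bands onto principal component axes,
    rescale each component to equal variance (a spherical distribution in
    feature space), then transform back onto the original axes."""
    h, w, n = ms.shape
    x = ms.reshape(-1, n).astype(float)
    mean = x.mean(axis=0)
    xc = x - mean
    cov = np.cov(xc, rowvar=False)
    eigval, eigvec = np.linalg.eigh(cov)
    pcs = xc @ eigvec
    pcs /= np.sqrt(np.maximum(eigval, 1e-12))   # equalise the variances
    out = pcs @ eigvec.T + mean
    return out.reshape(h, w, n)

# three strongly correlated bands, as is typical for multispectral imagery
rng = np.random.default_rng(4)
base = rng.random((16, 16, 1))
ms = np.concatenate([base + 0.05 * rng.random((16, 16, 1)) for _ in range(3)],
                    axis=2)
stretched = decorrelation_stretch(ms)
```

After the stretch the band-to-band covariance matrix is (up to numerical error) the identity, i.e. the data fill the feature space instead of clustering along the diagonal.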
Two types of PCA can be performed: selective or standard. The standard PCA uses all available bands of the input image, e.g. TM 1-7, whereas the selective PCA uses only a selection of bands which are chosen based on a priori knowledge or application purposes (Yésou et al. 1993 a). In the case of TM, the first three PCs contain 98-99 per cent of the variance and are therefore sufficient to represent the information. The correlation coefficient reflects the tightness of a relation for a homogeneous sample; however, shifts in the band values due to markedly different cover types also influence the correlations, and particularly the variances (Campbell 1993). The PCA technique can also be found under the expression Karhunen-Loève approach (Zobrist et al. 1979).

4.3.3. High pass filtering
Another approach to enhance the spatial resolution of multispectral data adds the spatial to the spectral information: high pass filtering (HPF) in combination

with band addition. The high spatial resolution image is filtered with a small high-pass filter, resulting in the high frequency part of the data, which is related to the spatial information. This is pixel-wise added to the low resolution bands (Tauch and Kähler 1988, Shettigara 1992, Jutz and Chorowicz 1993). The aim of this technique is to sharpen imagery. Nevertheless, the HPF method has limitations in passing on important textural information from the high resolution band to the low resolution data (Shettigara 1992).

4.3.4. Regression variable substitution
Multiple regression derives a variable, as a linear function of multi-variable data, that will have maximum correlation with univariate data. In image fusion the regression procedure is used to determine a linear combination (replacement vector) of an image channel that can be replaced by another image channel; this method is called regression variable substitution (RVS). To achieve the effect of fusion, the replacement vector should account for a significant amount of variance or information in the original multi-variate data set (Shettigara 1992). The method can be applied to spatially enhance data, or for change detection, with the assumption that pixels acquired at time one are a linear function of another set of pixels received at time two. Using the predicted value obtained from the least-squares regression, the difference image is formed from the regression value and the pixel of time one (Singh 1989, Shettigara 1992).

4.3.5. Canonical variate substitution
The canonical variate analysis is a technique which produces new composite bands based on linear combinations of the original bands. It derives linear combinations of the bands which maximise the differences between training or reference classes, relative to the variation within the training classes. Successful implementation of this approach requires the definition of homogeneous areas or training classes for each of the information classes of interest (Campbell 1993).

4.3.6. Component substitution
Shettigara (1992) presented a slightly different view of categorising certain image fusion techniques already discussed in the preceding sections. He suggests looking at them in a more general way and calls any technique that involves a forward transformation, the replacement of a variable and a backward transformation a COmponent Substitution (COS) technique. The most popular COS technique, already described above, is the IHS colour transformation. Other methods which can be fitted into the COS model are Regression Variable Substitution (RVS), Principal Component Substitution (PCS) and Standardised PC Substitution (SPS).

4.3.7. Wavelets
A mathematical tool developed originally in the field of signal processing can also be applied to fuse image data, following the concept of multiresolution analysis (MRA) (Mallat 1989). The wavelet transform creates a summation of elementary functions (wavelets) from arbitrary functions of finite energy. The weights assigned to the wavelets are the wavelet coefficients, which play an important role in the determination of structure characteristics at a certain scale in a certain location. Another application is the automatic geometric registration of images, one of the pre-requisites to pixel based image fusion (Djamdji et al. 1993). The interpretation of structures or image details depends on the image scale, which is hierarchically compiled in a pyramid produced during the MRA (Ranchin and Wald 1993).
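One level of such a multiresolution decomposition, and a detail-injection fusion in the spirit described here, can be sketched with the simplest wavelet, the Haar basis. This is our own minimal illustration assuming NumPy; operational implementations use longer filters and several pyramid levels:

```python
import numpy as np

def haar_decompose(img):
    """One level of a 2-D Haar multiresolution analysis: returns the
    approximation plane and three detail (wavelet coefficient) planes.
    img: (2h, 2w) array."""
    a = img[0::2, 0::2].astype(float)
    b = img[0::2, 1::2].astype(float)
    c = img[1::2, 0::2].astype(float)
    d = img[1::2, 1::2].astype(float)
    approx = (a + b + c + d) / 4.0
    horiz  = (a - b + c - d) / 4.0
    vert   = (a + b - c - d) / 4.0
    diag   = (a - b - c + d) / 4.0
    return approx, horiz, vert, diag

def haar_reconstruct(approx, horiz, vert, diag):
    """Exact inverse of haar_decompose."""
    h2, w2 = approx.shape
    out = np.empty((2 * h2, 2 * w2))
    out[0::2, 0::2] = approx + horiz + vert + diag
    out[0::2, 1::2] = approx - horiz + vert - diag
    out[1::2, 0::2] = approx + horiz - vert - diag
    out[1::2, 1::2] = approx - horiz - vert + diag
    return out

# Detail injection: keep the low resolution band as the approximation and
# take the wavelet coefficients from the high resolution channel.
rng = np.random.default_rng(5)
pan = rng.random((8, 8))       # high resolution channel
low = rng.random((4, 4))       # co-registered low resolution band
_, h, v, d = haar_decompose(pan)
synthetic = haar_reconstruct(low, h, v, d)
```

The synthetic image has the spatial detail of the high resolution channel while its coarse (approximation) content is exactly the low resolution band, which is the spectral-preservation idea behind wavelet fusion.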

The wavelet transform in the context of image fusion is used to describe differences between successive images provided by the MRA. Once the wavelet coefficients are determined for the two images of different spatial resolution, a transformation model can be derived to determine the missing wavelet coefficients of the lower resolution image. Using these, it is possible to create a synthetic image from the lower resolution image at the higher spatial resolution. This image contains the preserved spectral information together with the higher resolution, hence showing more spatial detail. The method is called ARSIS, an abbreviation of the French definition 'amélioration de la résolution spatiale par injection de structures' (Ranchin et al. 1996).

4.4. Combined approaches
In some cases it might not be enough to follow one approach in order to achieve the required results. Aspects such as:
(a) cloud cover problem not solved,
(b) colour assignment not acceptable,
(c) data stretch insufficient, and
(d) blurred image
are reasons for investigating other possibilities and the optimisation of techniques. The most successful techniques are developed from combinations of techniques, such as multiple band combinations, mosaic and other fusion techniques. Therefore, the use of combined techniques plays an important role in image fusion. The following paragraphs provide a brief overview of the manifold possibilities of combining techniques in order to fuse remote sensing images. Many other combinations are of course also possible; those described serve only as examples of the combinations used by various authors.

4.4.1. IHS / other techniques
The IHS transformation offers a large number of possibilities to fuse remote sensing data with, e.g., ratios and PCA as input. It is widely used to integrate especially very disparate data such as VIR and SAR (Ehlers 1991, Koopmans and Richetti 1993, Loercher and Wever 1994). Arithmetic methods combined with IHS, including weighting factors, can enhance the remote sensing data for visual interpretation (Yésou et al. 1993 a, 1993 b). The resulting data can then be fused using multiplication with SAR imagery. This creates sharpened VIR/SAR image combinations.

4.4.2. RGB / other techniques
The display of remote sensing data in RGB can be found with many of the fusion techniques mentioned above.
The arithmetic combination of image channels together with RGB display facilitates the interpretation in many applications (Lichtenegger et al. 1991). The combinations of IHS/RGB and PCA/RGB have found recognition especially in geology (Haydn et al. 1982, Grasso 1993, Chiuderi and Fini 1993).

4.4.3. HPF / band combinations
It can be of advantage to subtract pre-processed data (e.g. HPF filtered imagery) from the original data in order to enhance lines and edges. A further development of the method described is the introduction of multispectral information to the merge of high-resolution PAN with SAR.
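The HPF idea of this subsection can be sketched as follows (assuming NumPy; the 3x3 box filter and the function names are our own simplifications of the "small high-pass filter" described in §4.3.3):

```python
import numpy as np

def box_lowpass(img, k=3):
    """Simple k x k box (mean) filter with edge padding."""
    pad = k // 2
    p = np.pad(img.astype(float), pad, mode='edge')
    out = np.zeros_like(img, dtype=float)
    for dy in range(k):
        for dx in range(k):
            out += p[dy:dy + img.shape[0], dx:dx + img.shape[1]]
    return out / (k * k)

def hpf_fusion(lowres_band, highres, k=3):
    """HPF sketch: the high frequency part of the high resolution image
    (original minus low-pass) is added pixel-wise to the low resolution band,
    assumed to be already resampled to the same pixel grid."""
    high_freq = highres.astype(float) - box_lowpass(highres, k)
    return lowres_band.astype(float) + high_freq

rng = np.random.default_rng(6)
pan = rng.random((8, 8))
band = rng.random((8, 8))
fused = hpf_fusion(band, pan)
```

A featureless (constant) high resolution image contributes no high frequencies, so the multispectral band passes through unchanged; only lines and edges are injected.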

4.4.4. Mosaic / other techniques
In order to solve the cloud cover problem effectively, the mosaic approach offers a wide variety of possibilities in connection with other fusion techniques. SAR data is introduced to areas of no information in the optical data, i.e. clouds and their shadows. Likewise, SAR data from different sensors or orbits can reduce the regions of foreshortening, layover and shadow. The idea of inputting optical data for this purpose often fails, because the mountainous areas which cause these geometric distortions in the SAR are also the reason for cloud coverage; the contribution of other SAR imagery therefore represents the more operational solution. Areas that are not covered by one sensor might be contained in another.

5. Examples of image fusion
There is an increasing number of applications in which multisensor images are used to improve and enhance image interpretation. This section gives a couple of examples of multisensor image fusion comprising the combination of multiple images and ancillary data with remote sensing images:
- topographic mapping and map updating,
- land use, agriculture and forestry,
- flood monitoring,
- ice and snow monitoring, and
- geology.
Each section contains a list of references for further reading on these topics.

5.1. Topographic mapping and map updating
Image fusion as a tool for topographic mapping and map updating has its importance in the provision of up-to-date information. In the field of topographic mapping or map updating, often combinations of VIR and SAR are used. The optical data serves as reference, whilst the SAR data, which can be acquired at any time, provides the most recent situation. Work in this field has been published, amongst others, by Essadiki (1987), Tauch and Kähler (1988), Perlant (1992), Albertz and Tauch (1994), Kaufmann and Buchroithner (1994), Matte (1994), Pohl (1995) and Pohl and Genderen (1995).
Once the cloud/shadow mask (VIR) and the layover/foreshortening/shadow mask (SAR) have been produced, it is possible to introduce all types of data in the various elements of the mask (Pohl 1996). The results of the triple sensor image fusion combine VIR/SAR and high spatial resolution in one image (Pohl 1996).

5.2. Land use, agriculture and forestry
Regarding the classification of land use, the combination of VIR with SAR data helps to discriminate classes which are not distinguishable in the optical data alone, based on the complementary information provided by the two data sets (Toll 1985, Franklin and Blodgett 1993, Brisco and Brown 1995, Hussin and Shaker 1996). Similarly, crop classification in agriculture applications is facilitated (Ahern et al. 1978, Brisco and Brown 1995). In addition, the two data sets complement each other in terms of the information contained in the imagery. Concerning multisensor SAR image fusion, the difference in incidence angles may solve ambiguities in the classification results (Brisco et al. 1995).

In general there are two advantages of introducing SAR data into the fusion process with optical imagery:
1. SAR data is available at any time of the day or year, independent of cloud cover or daylight; this makes it a valuable data source in the context of the regular temporal data acquisition necessary for monitoring purposes.
2. SAR is sensitive to the di-electric constant, which is an indicator of the humidity of the soil, and many SAR systems provide images in which water can be clearly distinguished from land.
Multitemporal SAR is a valuable data source in countries with frequent cloud cover and is successfully used in crop monitoring (ESA 1995 a, 1995 b, Mangolini and Arino 1996 a, 1996 b). Optical and microwave image fusion is also well known for the purpose of identifying and mapping forest cover and other types, since the use of multisensor data improves the interpretation capabilities of the images as compared to the results obtained with the individual sensors (Leckie 1990, Lozano-Garcia and Hoffer 1993). With the implementation of fusion techniques using multisensor optical data, the accuracy of urban area classification is improved, mainly due to the integration of multispectral with high spatial resolution data (Griffiths 1988, Haack and Slonecker 1991).

5.3. Flood monitoring
In the field of the management of natural hazards, flood monitoring using multisensor VIR/SAR images plays an important role. The VIR image represents the land use and the water bodies before flooding; for the representation of the pre-flood situation the optical data provides a good basis. SAR data acquisition at the time of the flood can be used to identify flood extent and damage. The combined optical and microwave data provide a unique combination that allows more accurate identification. Others rely on multitemporal SAR image fusion to assess flood extents and damage (Matthews and Gaffney 1994, Badji 1995).
Especially for developing countries, the fusion of SAR data with VIR is a cost effective approach which enables continuous monitoring (Nezry et al. 1995). Examples of multisensor fusion for flood monitoring are described by Corves (1994), Yésou et al. (1995), and Fellah and Tholey (1996). Frequently, multitemporal SAR or SAR/VIR combinations are used together with topographic maps (Bonansea 1995, Brakenridge 1995, MacIntosh and Profeti 1995, Otten and Persie 1995).

5.4. Ice and snow monitoring
The fusion of data in the field of ice monitoring provides results with higher reliability and more detail (Ramseier et al. 1993, Armour et al. 1994). Regarding the use of SAR from different orbits for snow monitoring, the amount of distorted areas due to layover, shadow and foreshortening can be reduced significantly (Haefner et al. 1993).

5.5. Geology
Multisensor image fusion is well implemented in the field of geology and is a widely applied technique for geological mapping. Geological

features which are not visible in the single data sets alone are detected from integrated imagery. In most cases VIR is combined with SAR, based on the fact that the data sets complement each other: they introduce information on soil geochemistry, vegetation and land use (VIR), as well as soil moisture, topography and surface roughness (SAR), as reported amongst others by Daily et al. (1979), Blom and Daily (1982), Haydn et al. (1982), Rebillard and Nguyen (1982), Reimchen (1982), Aarnisalo (1984), Baker and Henderson (1988), Evans (1988), Hopkins et al. (1988), Paradella et al. (1988), Taranik (1988), Guo and Pinliang (1989), Harris and Murray (1989), Harris et al. (1990), Grasso (1993), Jutz and Chorowicz (1993), Koopmans and Forero (1993), Koopmans and Richetti (1993), Yésou et al. (1993 a, 1993 b) and Ray et al. (1995).

6. Advantages and limitations
In order to summarise the review, some issues related to advantages and disadvantages, along with constraints of image fusion, are discussed below. The characteristics of fused image data depend very much on the applied pre-processing and the image fusion technique. The user of multisensor data in the context of image fusion has to be aware of the physical characteristics of the input data, in order to be able to select appropriate processing methods and to judge the resulting data (Pohl 1996).

6.1. Selection of appropriate data set and techniques
The decision on which technique is the most suitable is very much driven by the application; therefore, it is very difficult to make general statements on the quality of a fusion technique. The choice of a data set is application dependent, as is the technique used to fuse the data (Pohl 1996). In addition, the type of data available and statistical measurements, e.g. the correlation matrix, may support the choice. Other limiting factors with respect to the data selection are operational constraints of the sensors that are supposed to acquire the data. Also, the temporal aspect should not be underestimated: changes in the area between the acquisition dates of the imagery might introduce problems in the fused image.

6.1.1. Pre-processing
All sensor-specific corrections and enhancements of image data have to be applied prior to image fusion, since the fusion techniques refer to sensor-specific effects. A general rule of thumb is to first produce the best single-sensor geometry and radiometry (geocoding, filtering, etc.) and then fuse the images. Any spatial enhancement performed prior to image fusion will benefit the resulting fused image. The importance of geometric accuracy in avoiding artefacts and misinterpretation in pixel based image fusion should not be underestimated: pixels registered to each other should refer to the same object on the ground. This implies that the data should be geocoded with sub-pixel accuracy. An advantage is the possibility of filtering and enhancing the data during the geocoding process, in order to avoid multiple resampling. The data has to be resampled at the pixel spacing required for the desired image fusion (Pohl 1996).
The constraints are related to the disturbance of the spectral content of the input data and a blurring effect when introducing images with a low SNR. After image fusion the contribution of each sensor cannot be distinguished or quantified in order to be treated accordingly. The DEM therefore plays an important

L .). T echniques and image combinations Categorized by technique the following sections summarize observations relevant to the judgement of the usefulness of a certain fusion technique: 6. coastal zone monitoring.. This implies that the following factors need consideration: 1. Subsequently. tidal inlets) ( Pohl 1996). soil moisture studies. The fusion of SAR with VIR data improves the interpretation of the SAR data. RGB channel assignment signi® cantly in¯ uences the visibility of features and visual interpretation by a human interpreter ( blue = water. green = land. RGB Digital numbers in single images in¯ uence the colours in the RGB composite.842 C.1.2.1. oceanographic objects) ( Pohl 1996). Band combinations The use of linear combinations have the disadvantage that these band combinations lead to a high correlation of the resulting bands and that part of the spectral information is lost compared to the original multispectral image ( Ehlers 1987. 1991). etc. As a result. The resulting fused image depends very much on the appearance and content of the SAR data. 6.1. RGB overlay protects the contribution from optical imagery from being greatly a ected by speckle from SAR ( Pohl 1996).4. features that are detectable on SAR data can be introduced to the VIR data by image fusion to complement the data (e. 6. In¯ uence of terrain on SAR backscatter reduces the suitability of the data for recognizing features such as roads. tidal activities. The PCA technique enables the integration of more than three types of data ( Zobrist et al . The appearance of SAR signi® cantly in¯ uences the feature visibility in the fused VIR/ SAR image.1. The water-land boundaries are well de® ned in the fused images.1. The technique is simple and does not require CPU time-intensive computations. etc. The need for DEMs of high quality and appropriate grid spacing is evident. the SAR data have to be selected according to the ® eld of interest of the application.2. 6. 

Related to the IHS technique the hue has to be carefully controlled since it associates meaningful colour with well de® ned characteristics of the input.1.g. in particular. The informative aspects are presented in IHS using readily identi® able and quanti® able colour attributes that can be distinctly perceived. It is necessary to have access to a variety of situations. These products are often delivered with the SAR image itself. IHS The IHS o ers a controlled visual presentation of the data. variance and correlation of imagery were taken into account as indicators during the mathematical quality assessment ( Ranchin et al.. It is essential to match the histograms of the various input data to each other. An advantage of the IHS versus the RGB is the ability of integrating four instead of three channels in the fused image ( Rothery and Francis 1987 ). for the three-dimensional impression of topography and change detection. 6. Numerical variations can be uniformly represented in an easily perceived range of colours ( Harris et al . image bias. They implemented mathematical conditions to judge the quality of merged imagery in respect to their improvement of spatial resolution while preserving the spectral content of the data. The identi® cation of foreshortening. A disadvantage is the reduced spatial detail compared to original optical data ( Pohl 1996).Multisensor image f usion in remote sensing 843 Principal component SAR images show potential for topographic mapping. ( 1996 ). IHS fused imagery have the capability of allocating data from the SAR to cloud covered areas without having to identify the clouds at an earlier stage.1.2. With the use of di erence images a comparison was made with the original data introduced to the fusion process. e. Mosaic The mosaic has an important position amongst the image fusion techniques as far as cloud removal from VIR and the replacement of radiometrically distorted SAR data is concerned. 
Downloaded by [Gitam University] at 20:56 04 October 2013 .5. combination of principal components with optical data ( Pohl 1996 ). layover and shadow areas in the SAR is based on DEM calculations and pure geometry. relevant parameters and ground truth ( Polidori and Mangolini 1996 ). Another possibility is to validate ® ndings from fused data by testing the methods on simulated data ( known parameters) followed by a comparison with actual data sets. It can be used in combination with any other image fusion technique ( Pohl 1996 ). In addition. the evaluation of the achieved results becomes relatively complex because of the di erent sources of data that are involved. Assessment criteria Working in the ® eld of multisensor image fusion.2. 1996 ). A numerical quality assessment of image fusion processes was recently introduced by Ranchin et al.2. 6. 6. The di erent aspects of image acquisition of the various sensors have to be considered as well as the approach of the image fusion itself plays a role. This is valid. The speckle is preserved from the SAR data in the fused image. This is a critical point for the optical imagery. The result depends very much on the quality of the mask designed for the mosaic. 1990 ).6. It shows similarities with the Brovey transformation in terms of spectral content of the imagery. The possibilities have not yet been fully explored. Other techniques produce images which are di cult to interpret quantitatively and qualitatively because the statistical properties have been manipulated and the original integrity of the data is disturbed.
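The bias, variance and correlation indicators used in such numerical quality assessments can be sketched in a few lines. The Python fragment below is purely illustrative (the function and array names are ours, not from the studies cited above): it compares one original spectral band with its fused counterpart.

```python
import numpy as np

def fusion_quality_indicators(original_band, fused_band):
    """Per-band indicators of spectral preservation: bias of the
    means, relative change of variance, and the correlation
    coefficient between the original and the fused band."""
    o = original_band.astype(float).ravel()
    f = fused_band.astype(float).ravel()
    bias = o.mean() - f.mean()
    var_diff = (o.var() - f.var()) / o.var()
    corr = np.corrcoef(o, f)[0, 1]
    return bias, var_diff, corr

# Toy check: a fused band identical to the original preserves all statistics.
rng = np.random.default_rng(0)
band = rng.integers(0, 255, size=(64, 64))
bias, var_diff, corr = fusion_quality_indicators(band, band)
```

A fusion method that respects the spectral content should keep the bias and variance difference near zero and the correlation near one for every band.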

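The intensity-substitution idea behind the IHS technique reviewed above can be caricatured as follows: transform the low-resolution colour composite into hue, saturation and intensity, replace the intensity with the co-registered high-resolution channel (e.g. SAR or panchromatic), and transform back. This sketch uses Python's standard `colorsys` HSV conversion as a simple stand-in for a true IHS transform, and all identifiers are hypothetical.

```python
import colorsys
import numpy as np

def ihs_substitution(rgb, pan):
    """Pixel-wise hue/saturation/value decomposition of an RGB
    composite; the value (intensity) component is replaced by the
    high-resolution image before converting back. Inputs in [0, 1]."""
    out = np.empty_like(rgb, dtype=float)
    rows, cols, _ = rgb.shape
    for i in range(rows):
        for j in range(cols):
            h, s, _v = colorsys.rgb_to_hsv(*rgb[i, j])
            out[i, j] = colorsys.hsv_to_rgb(h, s, pan[i, j])
    return out

# Hypothetical 1x1 example: hue and saturation survive, intensity is replaced.
rgb = np.array([[[0.2, 0.4, 0.6]]])
pan = np.array([[0.9]])
fused = ihs_substitution(rgb, pan)
```

Because hue and saturation are untouched, the colour assignment discussed above stays meaningful while the spatial detail comes from the substituted channel.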
Levels of image fusion
Some considerations regarding the judgement of image fusion are necessary with regard to the processing levels mentioned above. The pixel-based methods have the advantage over feature- or information-level fusion of using the most original data; a loss of information, which occurs during a feature extraction process, is avoided (Mangolini 1994). A constraint of pixel-based fusion is the necessity of accurate geocoding, including a resampling of the data. The geometric correction requires knowledge of the sensor viewing parameters, along with software that takes into account the image acquisition geometry. This is especially important for SAR data processing, due to the side-looking geometry of the sensor. In areas with considerable height variations a digital elevation model (DEM) is necessary. In the case of large differences in the spatial resolution of the input data, problems arise from the limited (spatial) compatibility of the data; the difficulty of comparing pixels from heterogeneous data is also mentioned in the literature. Furthermore, pixel-based methods deal with very large data volumes, and long computation times are involved. Feature-level image fusion uses corresponding structures, which makes the geometry a less critical issue.

Operational image fusion
The fusion of multisensor data is limited by several factors. Often, the lack of simultaneously acquired multisensor data hinders the successful implementation of image fusion; the lack of available ground truth (Lozano-Garcia and Hoffer 1993) forms another constraint. Since there is no standard procedure for selecting the optimal data set, the user is often forced to work empirically to find the best result. The use of multisensor data requires more skills of the user in order to handle the data appropriately, which also invokes the availability of software that is able to process the data. The use of more than one scene has, of course, an impact on the costs involved.

Future improvements
A very important aspect of future research and development activities is to place emphasis on methods to estimate and assess the quality of fused imagery. A start is the mathematical approach, along with a set of criteria to check the radiometric integrity of the fused imagery, described by Mangolini et al. (1993) and Ranchin et al. (1996). The validation of the fusion algorithm bears some insecurity (Polidori and Mangolini 1996). In this respect, much more research is required to provide objective evaluation methods for fused imagery (Pohl 1996). The development of powerful hardware, along with more and more sophisticated software, allows the implementation of algorithms and techniques that require large data volumes and time-intensive computation, such as the wavelet approach. Improvements of existing techniques and the development of new ones are currently under way, e.g. concerning ANN and the wavelet transform, as already mentioned in the context of the ARSIS method (Ranchin and Wald 1993); a combination of wavelet transform and ANN classification was already presented by Melis and Lazzari (1994). In addition, the involvement of expert systems in a GIS can support the integration and evaluation of fused image data.
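The multiresolution idea underlying the wavelet-based methods can be conveyed in a few lines: split the high-resolution image into a low-pass approximation and a residual detail image, then inject only the detail into the (resampled) spectral band. The sketch below uses a crude box filter rather than a real wavelet decomposition, so it only illustrates the structure of such schemes; all names are our own.

```python
import numpy as np

def box_smooth(img, k=3):
    """Crude low-pass approximation: k x k moving average,
    with edge padding so the output keeps the input shape."""
    pad = k // 2
    padded = np.pad(img.astype(float), pad, mode="edge")
    out = np.zeros(img.shape, dtype=float)
    for di in range(k):
        for dj in range(k):
            out += padded[di:di + img.shape[0], dj:dj + img.shape[1]]
    return out / (k * k)

def inject_details(ms_band, high_res):
    """Add the high-frequency part of the high-resolution image
    (image minus its low-pass approximation) to the spectral band,
    leaving the band's own radiometry as the base."""
    details = high_res.astype(float) - box_smooth(high_res)
    return ms_band.astype(float) + details

# A featureless high-resolution image contributes no detail at all.
band = np.ones((5, 5))
flat = np.full((5, 5), 7.0)
fused = inject_details(band, flat)
```

In a genuine ARSIS-style scheme the decomposition would be a wavelet transform and the injected coefficients would be modelled between sensors, but the additive detail-injection structure is the same.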

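The mosaic approach to cloud removal described earlier can be sketched minimally: derive a cloud mask from the optical image and substitute co-registered SAR values under the mask. Operational masks are far more elaborate, and, as noted above, the result depends very much on their quality; the brightness threshold and all names below are purely illustrative.

```python
import numpy as np

def mosaic_cloud_fill(vir, sar, cloud_threshold=200):
    """Replace presumed cloud pixels (bright optical digital numbers)
    with the co-registered SAR values; all other pixels keep the
    original VIR value. Returns the mosaic and the mask used."""
    cloud_mask = vir >= cloud_threshold
    return np.where(cloud_mask, sar, vir), cloud_mask

# Hypothetical 2x2 digital numbers with one bright (cloudy) optical pixel.
vir = np.array([[10, 250], [120, 30]])
sar = np.array([[70, 80], [90, 60]])
fused, mask = mosaic_cloud_fill(vir, sar)
```

Because the substitution is purely per-pixel, this scheme combines freely with any of the other fusion techniques discussed above.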
More flexible satellite sensors are expected to contribute to the optimisation of the input data set (e.g. Radarsat). Future satellites with multiple sensors on board will provide the opportunity to acquire a data set simultaneously; an example is SPOT-4, with the VEGETATION programme (1 km resolution) as well as the multispectral HRV at 20 m resolution (Pohl 1996). The operation of ERS-1 and ERS-2 in the so-called 'Tandem mode' provides a solution to problems that occur due to temporal changes of ground cover between the acquisition dates of SAR data. Value-added products are to be provided by the data processing agencies, along with special training of the data users (Polidori and Mangolini 1996). With upcoming improved sensors, the importance of multisensor image fusion and interpretation will increase and open new possibilities in Earth observation for operational applications for the protection and development of the Earth's environment (Pohl 1996).

Acknowledgments
The authors thank Dr H. Yésou for his valuable comments and suggestions on the first draft of the manuscript. ITC's research on image fusion has been externally funded by the CEC (CHRX-CT93-0310), EURIMAGE, EUROSENSE, ESA, EOSAT, MPA, MCS, NLR and NASDA. Their support to this research is gratefully acknowledged by the authors.

Appendix A. Abbreviations and acronyms
ANN    Artificial Neural Network
ARSIS  amélioration de la résolution spatiale par injection de structures
CAMS   Calibrated Airborne Multispectral Scanner
COS    COmponent Substitution
CRT    Cathode Ray Tube
DN     Digital Number
HPF    High Pass Filtering
HSV    Hue Saturation Value colour representation
IHS    Intensity Hue Saturation colour space
LUT    Look Up Table
MRA    MultiResolution Approach
OIF    Optimum Index Factor
PAN    PANchromatic band
PC1    First Principal Component
PCA    Principal Component Analysis
PCS    Principal Component Substitution
RGB    Red Green Blue colour space
RVS    Regression Variable Substitution
SAR    Synthetic Aperture Radar
SIR-B  Shuttle Imaging Radar Experiment B
SNR    Signal to Noise Ratio
SPOT   Système d'Observation de la Terre
SPS    Standardised PC Substitution
TIMS   Thermal Infrared Multispectral Scanner
TM     Landsat Thematic Mapper
VIR    Visible and InfraRed
XS     SPOT multispectral bands (3)
YIQ    Luminance Chrominance colour space

. In Application Achievements of ERS-1. B loom. J . Multisensor data merging for digital landscape models.. R . V ibulsresth.. A . assessment. X . O fren... Statistical methods for selecting Landsat MSS ratios.. and R yerson. C ampbell. R . SP-1176 / 11. Downloaded by [Gitam University] at 20:56 04 October 2013 . G . J . 1988. A . In Application Achievements of ERS-1. The use of Intensity-HueSaturation transformations for merging SPOT Panchromatic and multispectral image data.. 31. and L ichtenegger. B adji. G . C . F ... Spaceborne SAR data for land cover classi® cation and change detection. F . and B rown. 56. Flood monitoring and assessment in Piemonte. 1993. pp. A lbertz. 1995. Tropical mangrove vegetation mapping using advanced remote sensing and GIS technology. In Proceedings ISPRS Symposium.. S . and F u.. Pohl and J. 52. Digest of International Geoscience and Remote Sensing Symposium . pp. Colorado Springs A hern. An integrated package for (Colorado Springs: ISRS). Cepadue  s Editions ( Paris: CNES). and C harupat. Mapping from space Ð Cartographic applications of satellite image data... and B owman. New V iews of the Earth ( Paris: European Space Agency). B . A . 4± 8... R . J . R . 1990. 1992.. and D aily. F ielding. France. R . 1983. 1984 . G oodenough. 4. p. Proceedings of the International Symposium on Remote Sensing of Environment.. G rey.E. G iri. P . Radar image processing for rock type discrimination. Preliminary V ersion (Paris: European Space Agency). T iangco. T . J . 95. M .... Proceedings of T hird T hematic Conference. 6± 10 June 1994. and D obson. W .. S .. p. Towards more quantitative extraction of information from remotely sensed data. B aker. Canada (Ottawa: International Archives of Photogrammetry and Remote Sensing ). Simultaneous microwave and optical wavelength observations of agricultural targets. 1995. B rakenridge. Ottawa. and T auch. and A lmer. 1988. 1994. 
A demonstration of stereophotogrammetry with combined SIR-B and Landsat-TM images. Earth Observation Quarterly. 1982. 32. G .. held in Sinaia. P . 343± 351. J . Image processing and integration of geophysical. B . held in Sydney. 1637± 1646. F . Remote Sensing for Exploration Geology.E.. Flood mapping in the Mississippi. S trobl. 1990. J . 2. and S owers. Multidate SAR / TM synergism for crop classi® cation in Western Canada. 731± 738. Photogrammetric Engineering and Remote Sensing.. 9. S .. J . Australia. L . Analysis of Earth Observation Space Data Integration. W . F . 114± 115. D . A . P . T . 1978. B .E.. E hrismann. B onansea. B risco.. 29± 40. 1023± 1038. R . R aggam. M . Digital merging of Landsat TM and digitized NHAP data for 1 : 24 000 scale image mapping. N .. B uchroithner. C arper. C hen. T . S uselo. B erlin.. 8. A rmour. N .. L . M . J . R . results. G . B risco. C havez.. 1982.. C havez. and K ieffer.. 299± 306. 459± 467. P . Flood plain management in Wallonia. N . International Journal of Remote Sensing.1± 1. 20. L .. B . 28± 30 October 1992 (Paris: EARSeL ). I. Proceedings CNES Conference. M . In Application Achievements of ERS-1. Advanced Remote Sensing. Proceedings EARSeL W orkshop. 127± 142. G . Results of the GEOSAT programme to evaluate SPOT data for petroleum and mineral exploration. New V iews of the Earth ( Paris: European Space Agency). Executive Summary. 107± 128. 1. S . GeoJournal .. J . Romania. and H enderson.. Canadian Journal of Remote Sensing.. A schbacher. 1995. A schbacher. E . ADC Austria and Asian Institute of T echnology (Bangkok: AIT). SPOT 1: Image utilization. F .. D elso.. R . Paris. 1994. 29± 37. 1995. pp. New V iews of the Earth. T ransactions on Geoscience and Remote Sensing. J . U laby. 1009± 1014.. MSS and other data as a tool for numerical exploration in glaciated Precambrian terrain. 1994. pp. 61.. B . van Genderen A arnisalo.. L illesand.. B . Photogrammetric Engineering and Remote Sensing. J . E . 
Complementary nature of SAR and optical data: a case study in the Tropics. and the fusion of radar and passive microwave data. SP-1176 / 11. T . the processing and analysis of SAR imagery.. 34± 42.846 References C. D . 92. B lom.8. C . M . 23± 30. A . pp. 1. Conference Proceedings.. P . L . Journal of Applied Photographic Engineering .. Photogrammetric Engineering and Remote Sensing. 1987..

. L eberl. D jamdji. C hiesa. S . F arr. Italy. GeoInformatics ’95.. 1992. C heng. Stein (New York: I. 311± 316. and T yler. O zer. V . B ijaoui.E. Proceedings 12th Annual I.. .. A comparison of geometric models for multisource data fusion. S ides. Proceedings of First ERS-1 Pilot Project W orkshop.. Comparison of coincident Landsat-4 MSS and TM data over an agricultural region. F . Washington DC. 1979. T .. U. 645± 653. 86± 98. A ..S. D .E. Proceedings of First ERS-1 Pilot Project W orkshop. L ichtenegger.. Florence. J .. J . D aily. C omhaire. 295± 303.. P ... SP-365 (Paris: European Space Agency).. Integrating remotely sensed data from di erent sensors for change detection. Cannes. Photogrammetric Engineering and Remote Sensing.S. Downloaded by [Gitam University] at 20:56 04 October 2013 data in the morphologic and bathymetric study of the north sea and the Scheldt Estuary. 1987. K aufmann.-L . 1994. and S ieber. S . Photogrammetric Engineering and Remote Sensing.. D allemand. 1± 8. 297± 302. P . 22± 24 June 1994 . 1009± 1116. Image Processing and Remote Sensing. 26± 28 May 1995. The contribution of ERS-1 and SPOT pp. edited by R. G . J . 18± 21 May 1987 ( New York: I. D aily. Photogrammetric Engineering and Remote Sensing. P . T ransactions on Geoscience and Remote Sensing. W . 45. G .. GIS and GPS in Sustainable Development and Environmental Monitoring. 1993. Multitemporal ERS-1 SAR images of the Brahmaputra ¯ ood plains in Northern Bangladesh.S. 1343± 1345.. Data fusion techniques for remotely sensed images. T oledo. A . and J aspar. 22± 24 June 1994 . I . and R eichert.. 555± 561. Assessment of multi-temporal ERS-1 SAR Landsat TM data for mapping the Amazon river ¯ oodplain. 11± 17.. T echnical Papers.E. G . and M anieri. F .A. 1984. C . D esnos. C . Ann Arbor... 492± 497.E. T . M . Geometrical registration of images: The multiresolution approach. .E. International Space Year: Space Remote Sensing. 24. L . C havez. p.E. J . A . 
Multi incidence angle SIR-B experiment over Argentina: Generation of secondary image products. 57. H owarth. Proceedings 50th Annual Meeting of ASPRS. pp. P ... R . Comparison of three di erent methods to merge multiresolution and multispectral data: TM & SPOT pan. 1995.. Williamson with collaboration of T.. M ayer... Integration of the SPOT Pan channel into its multispectral mode for image sharpness enhancement. T echnical Papers 1990. B . C hiuderi. and P ohl. . C liche. C . E . U. J . C rist. P audyal. T .E. D e G roof.. 49. A . 349± 355. and F ini. A . 333± 000. T outin. France. G uptill.E. A . 1992. J . SP-359 (Paris: European Space Agency).E. 1984. (Washington: ASPRS).E. 51. Y . A . S owter. A . 51. G . D omik..A.. Proceedings 50th Annual Meeting ASP-ACSM Symposium. 59. T imevarying Image Processing and Moving Object Recognition. S . T echnical Papers. Washington DC.. Image Processing Techniques for TM data. Proceedings of International Symposium. U.. 1985. and E lachi.. Hue-saturation-intensity split-spectrum processing of SEASAT radar imagery. H . B onn..E. Photogrammetric Engineering and Remote Sensing. pp. S . pp. and A nderson. 1994. C . A .A.. International Geoscience and Remote Sensing Symposium (IGARSS ’87)... C .E. Geologic interpretation from composited radar and Landsat imagery. and C imino. 508± 517. 129± 132. Photogrammetric Engineering and Remote Sensing.. Spain. C . C . and L eD rew. Proceedings First ERS-1 Symposium. 1986.). International Geoscience and Remote Sensing Symposium (IGARSS ’92).E.-P . Data fusion of o -nadir SPOT panchromatic images with other digital data sources.). A . Houston. 4. Hong Kong (Hong Kong: Chinese University of Hong Kong ). pp. Combined analysis of ERS-1 SAR and visible /infrared RS data for land cover/ land use mapping in tropical zone: a case study in Guinea.. H older.. Remote Sensing. The TREES project: Creating an environment for the routine classi® cation and image processing of ERS-1 SAR imagery.S. M . 
Proceedings of 4th W orkshop. Spain. M . ACSM-ASPRS Annual Convention. Earth Observation Quarterly. A .A. and S ardar. 1996. C orves. F .. T oledo. ASPRS.E.Multisensor image f usion in remote sensing 847 C havez. R . I. 2. P . 10± 11 June 1993 ... U.. pp.. Proceedings of the I. SP-365 (Paris: European Space Agency). 1993.. D uguay. 1990. E . 6± 10. P . J . 6± 8 November 1992.. S . Space at the Service of our Environment.E.. and T eillet. H . 728± 742. 1983. and B owell. 1991. pp.

18± 26...). Scotland. and W alker. J . R . GIS /L IS ACSM-ASPRS Fall Convention. L.. L . 53. van Genderen and V. 48. 1993. 1995 b. IHS transform for the integration of radar imagery Downloaded by [Gitam University] at 20:56 04 October 2013 . ERS-1 /2 multitemporal tandem image: Helsinki (Finland ). Helsinki. J . Atlanta. H arris. 209± 235. Earth Observation Quarterly. 1994. Proceedings of the XII Congress of ISP. Remote Sensing of Environment . F ranklin.. van. G illespie.. 1989. 11 September 1994 . K . techniques and applications. 19. and P iesbergen. 1993. and P inliang. and B lodgett.). Multisensor image fusion techniques in remote sensing. 13. 1976. photogrammetry and cartography: the geoinformatics approach... 149. Colour enhancement of highly correlated images. 1993. H . M .. ISPRS Journal of Photogrammetry and Remote Sensing. ERS-1 SAR images of ¯ ood in Northern Europe (January 1995). edited by J. L . Universita E hlers. T . CAS-IRSA. In Application Achievements of ERS-1. G . N . Application of Principal Component Analysis to change detection. G . H arris. 1991. T oledo. 1991. R . and M urray. 11± 14 October 1993 . 46. Integrated MSS-SAR-SPOT-geophysical geochemical data for exploration geology in Yeder area. 28 October 1991 (Atlanta: ASPRS). 1986. G enderen. N u ¨ esch D . Capabilities and limitations of ERS-1 SAR data for snowcover determination in mountainous regions. G uyenne. and G raham. U. T echnical Papers.E. 6.. pp. and L eD rew. E . 1987. 1± 8. G . E vans. Hamburg.E. An example of satellite multisensor data fusion. P .. Sensor fusion for locating villages in Sudan. H aack. Space at the Service of our Environment. Mathematical techniques in multisensor data f usion (Norwood: Artech House Inc. No. K ahle. Wissenschaftliche Arbeiten der Fachrichtung Vermessungswesen der È t Hanover.. C . 129± 144. 1995 a. Proceedings EARSeL W orkshop..E. Pohl and J. 1988. pp. Preliminary V ersion ( Paris: European Space Agency). G rasso. A .. E hlers. A . 
SP-361 ( Paris: European Space Agency).. D . and S lonecker. France. M eier. K onecny. B97± B107.. J .. H .. C racknell. Integrative Auswertung von digitalen Bildern aus der Satelliten- photogrammetrie und-fernerkundung im Rahmen von geographischen Informationssystemen. 1649± 58. 1993. Photogrammetric Engineering and Remote Sensing. L . Supplement to Earth Observation Quarterly. 229± 234. New V iews of the Earth. I.. H olecz. 1987. 1995. pp.. ESA.. and P ohl. E ssadiki.. H . Integration of GIS. H . Decorrelation and HSI contrast stretches. and T holey.. Computers and Geoscience .-D . Photogrammetric Engineering and Remote Sensing. D .. Strasbourg. GIS GEO-Informations-Systeme . G enderen.. 25. G riffiths. 4.. Landsat-radar synergism. D . Monitoring urban change from Landsat TM and SPOT satellite imagery by image di erencing. International Geoscience and Remote Sensing Symposium (IGARSS ’88). 73± 80. 20. Synergy of remotely sensed data: European scienti® c network in the ® eld of remote sensing. E . 971± 976. Finland.. 1989.E. Flood risk evaluation and monitoring in North East France. Edinburgh. 1987 . 59± 66. A . Proceedings of the Second ERS-1 Symposium. Proceedings of the First ERS-1 Pilot Project W orkshop.S. C . 19± 30. 47. 116± 119. Dissertation. van Genderen E hlers. remote sensing.. B . 1992. H aefner.. Application of the IHS colour transformation for 1 : 24 000-scale geologic mapping: a low cost SPOT alternative. pp. G uo. SP-365 ( Paris: European Space Agency). pp. A . 493± 497. Germany. 13± 16 September 1988 ( New York: I. L . L . F ung. IT C Journal ... 59. Intelligent Image Fusion. M . J . 1996. van. pp. C .. and S ieber. R . 1987-1. Image fusion: Issues. E . E . Multitemporal image Maastricht ( The Netherlands). pp. Earth Observation Quarterly..848 C. Remote Sensing and Environment .A. 18± 23. D . R . S . F ellah. F . Spain. Multisensor classi® cation of sedimentary rocks. 1988. T . M . 22± 24 June 1994 . 47. 577± 583.E. N . Proceedings I. ESA. 14. F . 
B . 1994. Cappellini ( Enschede: ITC).E.. A combination of panchromatic and multispectral SPOT images for topographic mapping. T .. D .. M . H all.. Germany.

. Conference Proceedings.... Paris.Multisensor image f usion in remote sensing 849 with geophysical data. F . 1990... Red Sea Hills (Sudan). Ko È nigswinter. T . R . K oopmans. 14. K achhwalha. E . 1993. and P rotman. S chmidt. Paris.. T . Flood mapping in Germany. High resolution detection and monitoring of changes using ERS-1 time series. 1993. 56. 6± 9. F . J .. and C horowicz. ACSM-ASPRS Annual Convention.. G . H ussin. K . G . N ezry. J . and B uchroithner.-G . Canada ( Vancouver: CSRS). 451± 456. 599± 607.. 1993. and B are. 1993. ISPRS Journal of Photogrammetry and Remote Sensing.. Bulletin d’Information Quadrimestriel (Quebec) . B . Part 2 (Melbourne: CSRIO Publications). KWR-1000-Daten. S . Photogrammetric Engineering and Remote Sensing. N avail. H aydn. J . pp. pp.. 3. M urray. C . Geological mapping and detection of oblique extension structures in the Kenyan Rift Valley with a SPOT/ Landsat-TM datamerge. M erembeck. France. D e G randi. M . G . Photogrammetric Engineering and Remote Sensing. 27 September± 1 October 1993 . H . S . Preliminary V ersion (Paris: European Space Agency). Optical and radar satellite image fusion techniques and their applications in monitoring natural resources and land use change. 803± 810. 133± 137. La spatiocarte regionale.. Optimal geological data extraction from SPOTRadar synergism with samples from Djebel Amour (Algeria).. J . Cepadues Editions (Paris: CNES). van der . 4. K ohl.. Proceedings CNES Conference. V ancouver. Earth Observation Quarterly. 28± 37. N . SPOT and ERS Applications. R ... Application of the IHS color transform to the processing of multisensor data and image enhancement. 263± 274. and S ieber. 1996. Zeitschrift f u K eys. Application Achievements of ERS-1. T echnical Papers 1990. International Journal of Remote Sensing.. M . and S chriver. 10± 13 May 1993. and D e G roof.. pp... MicroBRIAN Resource Manual. pp.. 1990.. N ezry. results.. 1993. and S haker. H arris. 238± 249.. 1990. K oopmans. Numerimage. 
Remote Sensing of Arid and Semi-Arid L ands. 1677± 1688. H . 48. A . 1994. assessment. K annen. From Optics to Radar. 2± 4.. J . J . Downloaded by [Gitam University] at 20:56 04 October 2013 . 177± 190. J . N .-H .. Herstellung und Anwendungsmo È glichkeiten von Satellitenkarten durch digitale Kombination von Landsat-TM und È r Photogrammetrie und Fernerkundung . Numerimage. M .. Germany.-H . pp. W . 923± 926. 26± 28 March 1996 (Berlin: VDE).. J anssen. Integrating topographic data with remote sensing for land-cover classi® cation. 46. 1988. Speckle reduction applied to ERS-1 SAR images. Proceedings ISPRS Working Group IV / 2 W orkshop and Conference.. B .. Structural analysis of the Jura mountains± Rhine Graben intersection for petroleum exploration using SPOT stereoscopic data. and P roulx. Temporal and multisensor approach in forest /vegetation mapping and corridor identi® cation for e ective management of Rajaji Nat. S . B rovey. I ( Paris: European Space Agency). 1994. 2. R . F .. H inse. 635± 642. D .. 3105± 3114. E . L . Crop acreage estimation with ERS-1 PRI images.. and C oulombe. L .. Uttar Pradesh. Germany. A prototype example of sensor fusion used for a siting analysis. and R ichetti... Image Processing and Remote Sensing. Proceedings European Conference (EUSAR ’96). L . and P hillips. L . H enkel. 14. R . pp. India. France (Paris: Cepad ). L . J utz. R . International Journal of Remote Sensing. Proceedings of the Second ERS-1 Symposium. H opkins. D alke. 1503± 1506. N . N . Introduction to image processing. p. 11± 14 October. J . H . H . 62. R . J aarsma. B . and J upp. S . IHS transform for the integration of radar imagery with other remotely sensed data. Hanover. H arrison. F . Space at the Service of our Environment. R . K attenborn. E . and H irose. H inse. 1994. K aufmann. Y . New V iews of the Earth. B .. pp. T . B erger. E . Airborne SAR and Landsat MSS as complementary information source for geological hazard mapping. B . 1995. Park.... 1982. 
International Mapping f rom Space. K nipp. 1990. 1995.. D . 56. 1993. Proceedings of International Symposium (Cairo: ISRS). and L inden. M . B . Synthetic Aperture Radar.. A . Germany. 15.. 1631± 1641. and F orero. G . A . Z . 124± 125. 3. Hamburg. SPOT 1: Image utilization. K . Proceedings of the 12th Canadian Symposium on Remote Sensing. Bulletin d ’Information Quadrimestriel (Quebec) . E . O berstadler.. L . A . R .. M . A . Sirte Basin ( Libya) and Magdalena Valley (Colombia). Le radar ¼ seul ou avec d’autres?.

A . 6± 8 February 1996 ( Paris: EARSeL ). 243± 246. 49. Photogrammetric Engineering and Remote Sensing. S alvaggio. 1990.. G .. T . Downloaded by [Gitam University] at 20:56 04 October 2013 . 9± 12. P . J . 1994. J . 11± 15 September 1994 (Strasbourg: ERIM). 1994. A Window-based technique for combining Landsat Thematic Mapper thermal data with higher-resolution multispectral data over agricultural land. B . K . 2677± 2694. 1994.. and W ald. M atthews. B akx. S .. M arek. 15 November 1994. P . J . Geocarto International . M . I. R . C . S . G .. Numerimage. The use of ERS SAR data to manage ¯ ood emergencies at a smaller scale. S . W arnick... L oercher. M unechika. H . and P rofeti. T oledo. 14.850 C.. P . M angolini. Y .. Earth Observation Quarterly . R . K . 25... 39± 50. and W ever. M ahin. from ERS-1 SAR image data.. M itiche. and L azzari. 1994. et d’images radar. Multi-sensor analysis for land use mapping in Tunisia. . pp. Cannes. 1993. pp. 1989. O . Optical Engineering . France. France. van Genderen K oopmans.. K . From Optics to Radar.. 674± 693. Title page & p. M atte... 1989. M angolini.. A . J . L’ortho-image au service de la cartographie. 22± 24 June 1994 . J . with wavelet transform unsupervised neural networks. 199± 209.. 66. 1993.. Strasbourg... 67± 72. Dissertation published at the University of Nice± Sophia Antipolis. M ouat. 56. Project Report Prosper Project. M . France ( Paris: Cepad ). L . Earth Observation Quarterly. 11. M . J .. 1995. Preliminary results of the comparative analysis of ERS-1 and ALMAZ-1 SAR data. D . D allemand. pp. 10± 13 May 1993. Proceedings of First ERS-1 Pilot Project W orkshop. 1± 6. J . France. M . P . Remote sensing techniques in the analysis of change detection. J . M angolini. Apport de la fusion d’images satellitaires multicapteurs au niveau pixel en te  le  de  tection et photo-interpre  tation. 22± 24 June 1994 . 2.. 1991.. Multitemporal and single image features extraction. SPOT and ERS Applications. 
L ichtenegger. pp. D . J . G . A .. 11± 15. R ebillard. 1237± 1246. pp... S . Bulletin d ’Information Quadrimetriel (Quebec) . 1991. L eckie. and G affney.. 1995. Synergistic e ects of combined Landsat-TM and SIR-B data for forest resources assessment. Fusion d’images SPOT multispectrale ( XS) et panchromatique (P). van. D . 1993. ESA Bulletin. and A l F asatwi. ERS monitors ¯ ooding in The Netherlands.. Spain. Conference Proceedings. Improvement of agriculture monitoring by the use of ERS /SAR and Landsat / TM. M . F ..-H . J . 380± 386. ISPRS Journal of Photogrammetry and Remote Sensing. and A rino. 1994. K . Synergism of SAR and visible /infrared data for forest type discrimination. 94. and C alabresi. A theory for multiresolution signal decomposition: The wavelet representation. G . pp. and H offer. L ichtenegger. R anchin. 6 ± 8 December 1995 (Paris: ESA publication). 125± 128. Paris.. Pohl and J. and S chott. International Journal of Remote Sensing.. D ijk. 33. 1994. 12± 18. SP-365 (Paris: European Space Agency). M allat... 337± 342. N . ESA Bulletin.. 3.-F .. L .. Proceedings of Second ERS Applications Workshop. 29± 35. 1990 . R . C . 51. and A rino.. and S chmidt. Final Report. SP-365 ( Paris: European Space Agency). 1986. M angolini.. T ransactions on Pattern Analysis and Machine Intelligence. G . M . Resolution enhancement of multispectral image data to improve classi® cation accuracy..K. ERS-1SAR and Landsat-TM multitemporal fusion for crop statistics. Fusion of Earth Data. O . CNES-SPOT Image-ITC. 81. G . III-471± III-478... 1996 b.E. Spain. M . Proceedings of First ERS-1 Pilot Project Workshop. 119± 121. and A ggarwal. 1996 a. T . U. Ireland during early 1994 using ERS-1 SAR data. L ichtenegger. M oran. and B uchroithner.E. Proceedings EARSeL Conference. 1993. 59. G . A study of ¯ ooding on Lough Corrib.. Photogrammetric Engineering and Remote Sensing. L ozano-G arcia. A . 493± 500.. T oledo. 
Proceedings First International Airborne Remote Sensing Conference and Exhibition. Combining optical /infrared and SAR images for improved remote sensing interpretation. Photogrammetric Engineering and Remote Sensing.E. M . 56. M elis. M . and L ancaster. M ac Intosh. L ondon. Multiple sensor integration / fusion through image processing: a review. SPOT- RADAR synergism for geological mapping in arid areas. R eichert. G . Integrated use of optical and radar data from the MACEurope campaign 1991 for geological mapping of the Hekla area (Iceland ).

17± 19. 1987. 1996.. ESA SP-361 (Paris: European Space Agency). O tten. some contributions to data calibration and validation. L . 2165± 2184. Example of SPOT/ ERS-1 complementary. P .. van... W .. L . 1995. France ( Paris: ESA). Production of SPOT/ ERSI image maps. 119± 128. 122± 123.. pp. Combining panchromatic and multispectral imagery from dual resolution satellite instruments.A. W .. L .Multisensor image f usion in remote sensing 851 N andhakumar. France. Proceedings of W orkshop. SAR image of ¯ oods using PHARS system. P rice. pp. 1994. C . van. C . pp. S erban. 1990.. In Application Achievements of ERS-1. 199± 204. I . regarding the integration of SAR-ERS data with other types of satellite and ground information.. P ohl. C . 1986 . Space at the Service of our Environment. Potentials and limitations of multisensor data fusion. A . Spain. A phenomenologica l approach to multisource data integration: analyzing infrared and visible data. C . 660. L iu. P ohl.. 78± 102.. pp.. 14± 15 June 1990 . and M angolini.E tchegorry.S. B . G .. pp. pp. M eneses.. 1992. F . J ordans. Spectral and spatial attribute evaluation of SPOT data in geological mapping of Precambrian terrains in semi-arid environment of Brazil. pp.. 6± 8 February 1996 . pp.. The 1995 ¯ ood in the Netherlands from space.. 39 ( Enschede: ITC). 49. assessment. P . C . SPOT/ ERS Image maps: Topographic map updating in Indonesia. T .. P ohl. 1993. IT C Journal . J . 1994. 15. 1994-4. C . 331± 336. ISPRS Journal of Photogrammetry and Remote Sensing. J . P radines. IT C publication No. C . R . and W ang. N . Image fusion of microwave and optical remote sensing data for map updating in the Tropics. Proceedings EUROPT O ’95. Proceedings of the Second ERS-1 Symposium. M ougin. New V iews of the Earth. SP-365 ( Paris: European Space Agency). France.. Y . and G enderen. NASA Conference N ezry. 1993. results. D .. E . Cannes. Tropical vegetation mapping with combined visible and SAR spaceborne data. 