Monitoring Gas Pipelines
1. Keywords
Remote Sensing, Gas Transmission Pipeline Monitoring, Security, Infrastructure Monitoring, Unmanned Aerial Vehicles, Optics, SAR, Image Processing, Feature Extraction
2. Abstract
It is in the interest of any gas company to maintain the value of its pipelines and to protect them effectively against damage caused by third parties. As a result of global progress in high-resolution remote sensing and image processing technology, it is now possible to design natural gas pipeline monitoring systems built on remote sensors and context-oriented image processing software. Recent developments in UAV technology demonstrate their suitability as platforms for such customer-driven missions [1,2]. Two different scenarios for a UAV-based gas pipeline monitoring system are discussed.
3. Introduction
The legal framework in place to ensure the safe operation and supervision of oil & gas transmission pipelines differs substantially from country to country. Regardless of the requirements imposed by authorities, oil & gas companies themselves take a host of measures to ensure that their pipeline systems are operated in a safe, economic and environmentally friendly way and that they are protected effectively against damage caused by third parties. In terms of network size across Europe, the oil segment is rather small compared with the roughly 200,000 km of natural gas transmission pipelines. The focus of this study is therefore on the monitoring tasks of the European natural gas transmission network; a later expansion to other networks would widen the application of such remote sensing systems. The monitoring methods most widely used for natural gas transmission pipelines are foot patrols along the pipeline route and aerial surveillance using small planes or helicopters. These patrols prevent developments and events which could place high-pressure pipelines, their surroundings or the security of supply at risk. Although these methods ensure a high level of safety in pipeline operation, their cost is also very high.
¹ e-mail: dieter.hausamann@dlr.de
• Soil upheaval, erosion, deep vehicle tracks, water-logged surfaces,
• Planting of new shrubs and trees,
• Discolouring of vegetation above the pipeline,
• Temporary deposition of materials and agricultural products.
In addition, any transport and work carried out within a 200 m-wide strip must be reported if there is reason to
believe that it may affect the pipeline route at a later stage.
A gas emission detection system must be capable of identifying possible small gas escapes with flow rates of
0.01 - 10 m³/hr at an early stage. Any major gas emission caused by severe damage to a pipeline is detected and re-
ported directly by other systems.
These monitoring tasks have to be carried out throughout the year at regular intervals, largely regardless of
weather conditions. Although the areas monitored differ quite significantly in terms of soil characteristics, vegetation
and building density, it is important for future remote sensing methods employed to be usable in almost all types of
terrain.
On the basis of a rapid automatic data fusion and evaluation process, the future remote monitoring system must be capable of identifying objects and situations that represent threats to the pipeline. Its price/performance ratio must at least match, and ideally exceed, that of the methods presently in use.
5. Suitable optical and radar sensor technology for monitoring gas pipelines
Optical and infrared remote sensing systems
General principle:
Optical sensors collect, through a lens system, the sunlight reflected by the earth's surface. Normally only light in a specific spectral window is gathered, e.g. red, green or blue light. High-resolution optical systems sense the light intensity for very small areas on the earth's surface, in the order of 0.5 to several metres.
Given this principle, optical sensors can be characterised by a number of important features:
1. Optical sensors correspond to the human eye in many respects, which makes interpretation easy.
2. Optical sensors can reach relatively high spatial resolutions: less than 0.5 m from space and a few centimetres from the air are possible. Limiting factors for the spatial resolution are the sensitivity of the detectors, the size of the lens or mirror, and diffraction.
3. Optical sensors can only be used in daytime, since they rely on a passive light source, the sun.
4. Optical sensors are hindered by cloud cover, which is an important operational limitation, since Europe in particular is frequently covered with clouds.
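The resolution limits set by aperture size (point 2 above) can be estimated with the Rayleigh diffraction criterion. A minimal sketch; the apertures and altitudes are illustrative assumptions, not values from the paper:

```python
# Rough diffraction-limited ground resolution (Rayleigh criterion):
#   resolution ~ 1.22 * wavelength * distance / aperture_diameter
# All numbers below are illustrative, not taken from the paper.

def ground_resolution(wavelength_m: float, distance_m: float, aperture_m: float) -> float:
    """Smallest resolvable ground distance for a diffraction-limited optic."""
    return 1.22 * wavelength_m * distance_m / aperture_m

# A hypothetical 0.5 m aperture at 600 km orbit, green light (550 nm):
res_space = ground_resolution(550e-9, 600e3, 0.5)   # ~0.8 m: sub-metre from space
# A hypothetical 5 cm lens on an aircraft at 1 km altitude:
res_air = ground_resolution(550e-9, 1e3, 0.05)      # ~1.3 cm: centimetres from the air
```

This matches the orders of magnitude quoted above: sub-metre resolution from space, centimetres from the air.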
Imaging mechanisms:
For a digital optical/IR system, different mechanisms for imaging a large area can be distinguished:
• Rotating (whiskbroom) scanner: there is only one sensor element per spectral channel. To build up a complete image, the element scans the earth perpendicular to the flight direction via a rotating mirror; the along-track image dimension is built up by the motion of the platform over the earth.
• Pushbroom scanner: the sensor consists of a long row of light-sensitive elements (CCDs) with which a complete image line is observed in one moment; the second image dimension is built up by the motion of the platform over the earth's surface.
• Matrix sensors: these sensors have a matrix of CCD elements (a focal plane array, FPA), so that a complete image can be gathered in one moment.
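The pushbroom mechanism can be sketched as a small simulation: one full cross-track line per time step, with the along-track dimension supplied by platform motion. Function names and values are hypothetical, for illustration only:

```python
# Sketch of pushbroom image formation: at each time step one complete
# cross-track line of detector values is read out, and the along-track
# dimension is built up from the platform's forward motion.
# Toy simulation with made-up values; not real sensor software.
import random

def read_ccd_line(n_pixels: int) -> list:
    """Stand-in for one readout of the line of light-sensitive CCDs."""
    return [random.random() for _ in range(n_pixels)]

def acquire_pushbroom_image(n_lines: int, n_pixels: int) -> list:
    image = []
    for _ in range(n_lines):                   # platform advances one ground line
        image.append(read_ccd_line(n_pixels))  # whole line observed in one moment
    return image                               # n_lines x n_pixels raster

img = acquire_pushbroom_image(n_lines=4, n_pixels=512)
```

A whiskbroom scanner would differ only in the inner step: a single element sweeps across the 512 pixel positions via the rotating mirror instead of reading them all at once.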
Spectral information:
Optical sensors can have a different number of spectral bands with different bandwidths:
• Panchromatic sensors. They have only one, relatively broad, spectral band, generally covering the region of 500 to 700 nm and resulting in a 'black and white' image. The image therefore contains little spectral (colour) information: differences between objects with different colours but the same intensities cannot be observed. On the other hand, very high spatial resolutions can be reached, because the wide spectral band collects relatively much energy. The 500 to 700 nm band corresponds to the human eye; the blue region (400 to 500 nm) is not included because this region is often disturbed by atmospheric influences. Sometimes the spectral band is extended up to 800 nm, with the advantage that much of the energy reflected by vegetation is included, so that vegetation can clearly be recognised in the images.
• Multispectral sensors. These sensors contain a limited number of spectral bands (3 to 10), defined by spectral filters which only pass light within a certain spectral region, generally with a bandwidth of 10 to 100 nm; the bandwidth also depends on the spatial resolution to be obtained and on the spectral information to be gathered. Because of the narrower spectral bands, the spatial resolution (and/or the radiometric resolution) of multispectral images is in general lower than for panchromatic images. Most multispectral sensors have at least three bands: green (550 nm), red (650 nm) and near infrared (850 nm). These three bands are represented as a false colour image, in blue, green and red respectively. Such images contain most of the information sensed by the human eye, plus information on vegetation, which is represented in the near-infrared channel. Many sensor systems have only these three bands. Some multispectral sensors have additional spectral bands:
• Blue band (400 to 500 nm). Combined with the green and red bands this gives natural colour images. This band also contains information on the atmosphere, so it can be used for the atmospheric correction of the other spectral bands.
• Bands in the infrared (1 - 15 µm). Spectral bands in the near and middle infrared (900 to 2500 nm) mainly contain information on vegetation and soil moisture, whereas the spectral bands around 10 µm characterise especially the temperature.
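The false-colour mapping described above (green, red and near-infrared bands shown in the blue, green and red display channels) can be sketched as follows; the red and NIR bands also give a simple vegetation index (NDVI, a standard index, not named in the paper). The tiny 2x2 "images" and reflectance values are invented for illustration:

```python
# Sketch of a false-colour composite: display (R, G, B) = (NIR, red, green),
# plus NDVI as a simple vegetation measure. Toy 2x2 bands, values in 0..1.

green = [[0.20, 0.30], [0.20, 0.10]]
red   = [[0.10, 0.30], [0.10, 0.10]]
nir   = [[0.60, 0.30], [0.70, 0.10]]   # vegetation reflects strongly in NIR

def false_colour(green, red, nir):
    """Per-pixel display tuple (R, G, B) = (NIR, red, green)."""
    return [[(nir[i][j], red[i][j], green[i][j])
             for j in range(len(green[0]))] for i in range(len(green))]

def ndvi(red, nir):
    """Normalised difference vegetation index: near 1 for vegetation."""
    return [[(n - r) / (n + r) for r, n in zip(r_row, n_row)]
            for r_row, n_row in zip(red, nir)]

composite = false_colour(green, red, nir)  # vegetation appears red on screen
veg_index = ndvi(red, nir)                 # high where NIR dominates red
```

Because the NIR band drives the red display channel, vegetation stands out in red in such composites, which is why they are useful for monitoring the vegetated pipeline corridor.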
Developments in high-resolution optical systems have been relatively fast during the last years. For spaceborne systems, several commercial satellites have become available with panchromatic resolutions better than 1 m and multispectral resolutions better than 3 m. For airborne platforms, high-resolution optical digital sensors have been available for a long time already. Currently a subdivision can be made between four types of systems:
• High-quality photogrammetric systems. The first operational photogrammetric systems are currently coming to market. The resolution and geometric requirements of these systems are very high; main applications are large-scale mapping and DEM generation. The processing software for these systems is very sophisticated and almost fully automated. An example is the Z/I Imaging system. For the application of pipeline monitoring these systems are probably of too high quality and therefore too costly.
• Multispectral scanning systems. Multispectral airborne scanners have existed for about 20 years, and more and more operational systems are becoming available. Various multispectral sensors operate in the VIS/NIR region, both pushbroom scanners and matrix cameras. Operational sensors also exist for the VIS, NIR and MIR regions, e.g. the Daedalus (whiskbroom) scanners, which operate with 10 to 20 channels. An important aspect of these systems is that the data processing software is strongly tied to the system (sensor, radiometry, geometry) and therefore needs to be fully operational.
• Digital camera based systems. The quality of commercial digital cameras is increasing rapidly, which means that these cameras are more and more suited for operational use. Especially the high-end cameras provide good detail and radiometric sensitivity, and offer possibilities to influence the spectral band definition, quantisation and automatic read-out of the imagery. Operational systems exist consisting of one or a cluster of digital cameras, including a highly automated and operational processing chain.
• Digital video based systems. The quality of digital video cameras has also increased during the last years, although the number of pixels and the sensitivity remain a limitation. It is expected that within a number of years very high resolution video cameras will enter the market with non-interlaced images of 1000 x 1500 pixels; these might be very well suited for airborne monitoring systems. Examples of operational video-based airborne monitoring systems already exist, even based on false colour video.
This airborne sensor is a bulky instrument (see Fig. 2), but it is an operational system (including the data processing chain), it has bands from the visible to the thermal infrared wavelength range, and (most important) it is a highly economic and technically simple system. These are the main reasons for choosing the Daedalus as a suitable optical and infrared sensor system for a demonstration campaign. With this configuration, quite a range of different airborne and spaceborne platforms and mission scenarios, as well as wavelength ranges and other sensor parameters, can be investigated. However, the Daedalus system is not suitable for operation on a UAV.
For the utilisation on a UAV platform, suitable lightweight IR sensor technology is available on the market, e.g. an
AGEMA 570 infrared camera with a real-time data acquisition system (Fig. 3).
• Principle: Uncooled Microbolometer Array, temperature stabilized
• Detector: Vanadium Oxide, 320 x 240 Pixel
• Spectral Range: 7.5 - 13 µm
• FOV: 12° x 9° resp. 24° x 18°
• Spatial Resolution: 0.65 mrad resp. 1.3 mrad (1 m at 1.5 km resp. 0.7 km)
• Temperature Resolution: 0.15 K @ 300 K
• Framerate: 50 Hz
• Weight: ca. 2 kg without objectives and power supply
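The spatial-resolution figures in the list follow directly from the instantaneous field of view: the ground footprint of one pixel is IFOV (in radians) times stand-off distance. A quick cross-check, using only values from the list, which also shows that the finer 0.65 mrad optic is the one that resolves 1 m at about 1.5 km:

```python
# Cross-check of the AGEMA 570 spatial-resolution figures:
# ground sample distance = IFOV [rad] * distance [m].
import math

def footprint_m(ifov_mrad: float, distance_m: float) -> float:
    """Ground footprint of one pixel at the given stand-off distance."""
    return ifov_mrad * 1e-3 * distance_m

# Distance at which each IFOV resolves 1 m on the ground:
d_065 = 1.0 / 0.65e-3    # ~1538 m: the 0.65 mrad optic gives 1 m at ~1.5 km
d_130 = 1.0 / 1.30e-3    # ~769 m:  the 1.3 mrad optic gives 1 m at ~0.7 km

# Consistency of IFOV with FOV and pixel count: 12 deg across 320 pixels
ifov_check = math.radians(12) / 320 * 1e3   # ~0.65 mrad, matching the list
```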
For both UAV cases, the technical feasibility of the total system and platform is only the first step: Once all (fu-
ture) requirements and standards for a safe operation are met, the crucial aspect of cost-effectiveness can be considered,
which has to take into account all cost categories including potentially high expenses for certification and operation.
10. Acknowledgement
This work has partly been accomplished in the frame of the EU projects PRESENSE (http://www.presense.net/), Contract No. ENK6-CT2001-00553, and UAVNET (http://www.uavnet.com/), Contract No. G4RT-CT-2001-05053. The respective inputs from both consortia, especially from Intermap Technologies and NLR, are kindly
acknowledged. Special thanks to Marcus Schwäbisch (Intermap Technologies) and Peter Reinartz (DLR) for processing
the sensor data.
11. References
[1] Hausamann, D., “Civil Applications of UAVs – User Approach”, Shephard’s Civil UAV Symposium, London,
UK, 17 – 19 July (2002).
[2] Hausamann, D., Brokx, W., “User Driven UAV Applications - Pipeline Monitoring and other Examples”, Proc.
First European Conference on the Applied Scientific Use of UAV Systems, Kiruna, SW, 10 - 11 June (2002).
[3] Zirnig, W., Hausamann, D., Schreier, G., “A Concept For Natural Gas Transmission Pipeline Monitoring Based
on New High-Resolution Remote Sensing Technologies”, Contributed Paper, International Gas Research Confer-
ence 2001, Amsterdam 5th – 8th November (2001).
[4] Zirnig, W., Hausamann, D., Schreier, G., “High-Resolution Remote Sensing Used to Monitor Natural Gas Pipelines”, Earth Observation Magazine, 11, pp. 12-17 (2002).
[5] eCognition (2003); most recent information under www.definiens-imaging.com.
[6] Benz, U. C., Schreier, G. (2001): OSCAR - object oriented segmentation and classification of advanced radar
allows automated information extraction. In: Proceedings of IGARSS 2001, July 2001, Sydney.
[7] Blaschke, T., Strobl, J. (2001): What’s wrong with pixels? Some recent developments interfacing remote sensing
and GIS. In: GeoBIT/GIS 6: 12-17. http://www.definiens-imaging.com/down/GIS200106012.pdf
12. Figures
[Figure graphic: imaging geometry sketch, panels A), B) and C), showing the azimuth direction x, the range direction r, the swath width SW and the associated look angles]
[Figure 5 graphic: flowchart — raster & vector data plus human knowledge stored in “rule bases” → classification based on “rule bases” → automatic generation of object hierarchy → result: information from images ready to use in geographical information systems]
Figure 5: Processing steps and data flow within object-oriented eCognition: raster and collateral vector information are turned into a hierarchy of objects. Pre-defined rule bases – applied to this object hierarchy – classify and label the objects. The results can be exported to a GIS for further analysis.
Figure 6: Identification of objects near a pipeline track based on 1 m resolution IKONOS images. The right image contains
the eCognition object classes, separated into areas outside and inside the pipeline surveillance corridor.
[Figure 7 graphic: block diagram — satellite and airborne imagery → feature extraction → data fusion & comparison of features and scenery with the GIS reference → change detection → alarming system]
Figure 7: Sketch of a fully operational image analysis system for pipeline monitoring. The principle would be the comparison of objects – found in the imagery – with reference GIS information available for the pipeline track. Some identified objects (such as permanent new buildings) need to be written into the GIS so that they do not cause false alarms during the next inspection.
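The comparison principle described in the Figure 7 caption can be sketched in a few lines: detected objects are matched against a GIS reference for the pipeline track, unknown objects raise an alarm, and permanent objects are written back to the GIS. All object names, classes and the `inspect` function are hypothetical illustrations, not part of any described system:

```python
# Sketch of object-vs-GIS comparison for pipeline monitoring (hypothetical):
# unknown objects alarm; permanent ones (e.g. new buildings) update the GIS
# so they do not trigger a false alarm on the next inspection.

def inspect(detected, gis_reference, permanent_classes=("building",)):
    """Return alarm-worthy objects and register permanent ones in the GIS."""
    alarms = []
    for obj in detected:                       # e.g. {"id": ..., "class": ...}
        if obj["id"] in gis_reference:
            continue                           # already known: no alarm
        alarms.append(obj)                     # new object near the pipeline
        if obj["class"] in permanent_classes:
            gis_reference.add(obj["id"])       # GIS update: no repeated alarm
    return alarms

gis = {"road-7", "house-3"}
detected = [{"id": "house-3", "class": "building"},
            {"id": "house-9", "class": "building"},
            {"id": "excavator-1", "class": "vehicle"}]
first = inspect(detected, gis)    # the new house and the excavator both alarm
second = inspect(detected, gis)   # house-9 is now in the GIS: only the excavator
```

The excavator keeps alarming on every pass, which is the desired behaviour for transient third-party activity, while the confirmed building alarms only once.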
Figure 8: Pipeline Monitoring Test Site