A Heads Up Display Based On A DGPS and Real Time Accessible Geo-Spatial Database For Low Visibility Driving
There are two computers used for the generation of the HUD: a geo-spatial map database manager and a graphic image generator. The processor (QNX, PC-104) used for map database management was also used for collecting other sensory information from the vehicle. The second processor (Windows 98, Dell Inspiron 3000 notebook) was used only for generating the screen projection. Query results from the geo-spatial database are delivered through a local Ethernet network. Details about the communications between the database managing processor and the graphics processor are explained in the geo-spatial database section.

An image projector and combiner system manufactured by Delco was used for prototype development. This HUD was originally designed and sold for displaying character-based information in patrol cars. Its NTSC video-displaying feature was used for the HUD development described in this paper. Video output from the graphics processor was fed into a scan converter which converts the VGA input into the needed NTSC video signal. The unit projects the image onto a combiner: a partially reflective but essentially transparent, optically correct plastic lens. The light reaching the eyes is a combination of the light passing through the lens and the light reflected from the projector. The driver therefore sees two images superimposed together. The image passing through the combiner comes from the actual frontal field of view, while the reflected image is generated by the graphics processor. The combiner's optical characteristics allow it to generate a virtual screen projected to float 30-40 ft ahead of the unit. This feature, which results in a virtual focus in front of the vehicle, ensures that the driver's eyes do not have to focus back and forth between the real image and the virtual image, thus reducing eye strain.

The HUD software was developed using Microsoft Visual C++ 6.0 under the Windows 98 operating system and with the QNX real-time operating system software development tools.

The position of the vehicle and the information stored in the map database is expressed in terms of a global coordinate frame located in the North American Datum 1983
(NAD83) Minnesota South State Plane coordinates in units of meters. The raw DGPS string, which contains latitudinal and longitudinal angle data coming from the DGPS, was converted from degrees into global coordinates. The vehicle coordinate frame was defined as a frame moving with the GPS antenna mounted on top of the vehicle. A local coordinate frame was attached to the driver's eye; see Figure 2. From his/her viewpoint, straight forward was defined as positive y, the axis to the right was assigned as positive x, and straight up was positive z. The extracted map data was converted into a local coordinate frame that moves with the vehicle.

[Figure 2. Coordinate Systems -- the eye coordinate system (x, y, z), the vehicle coordinate system (VX, VY, VZ), the global coordinate system with origin OG (X, Y, Z), and a road location data point Pk (Xk, Yk, Zk)]

EXPERIMENTAL SETUP

A Navistar International truck was used as the test bed for the experiments. No yaw rate gyro or magnetometer was used for estimating the vehicle's heading angle. The vehicle's heading angle was estimated only from the trajectory of recent DGPS position values. A simple difference method, which calculates the vector angle from a past position of the vehicle to the current position, was used to estimate vehicle heading. By increasing the look-back distance, noise in the heading estimate was attenuated. The effect of the look-back distance on the heading angle estimate will be discussed in the following experimental results section.

Live video images were captured while a driver was driving the test truck on a road. Position data coming from the DGPS was also simultaneously stored. A Canon Optura digital video camcorder was used to record the projected HUD screens during actual driving situations. The camcorder was mounted at the driver's right eye position using custom mounting brackets. The optical image stabilizer in the camcorder was enabled while taking video images. The digitally stored images were transferred to a PC, then processed and analyzed. To synchronize the beginning of the video image stream, a special mark was put on the video screen by the HUD software when recording was started.

Sampling for error analysis was done at two-second intervals along the centerline. A special grid mark was drawn to synchronize these two-second intervals, as shown in Figure 3. In Figure 3, the yellow lines (which may not be apparent in a gray-scale image reproduction) are computer generated. The three segments along the centerline are reference marks to measure mismatch error. Each horizontal line segment is 0.5m long, and the gap between two line segments is also 0.5m. The height of the vertical mark is also 0.5m.

Errors associated with the projected lane boundaries are computed by comparing the distance the projected lines are displaced from the actual lane boundary. The lateral displacement of the projected lane boundary from the actual lane boundary at any of the three marks is computed knowing that the length of the reference mark is 0.5m. This lateral displacement is then normalized by dividing by the distance to the camera, which yields a value for the normalized visual sighting angle.

[Figure 3. Reference marks for error analysis (blown up from central portion of Figure 1)]

The error was measured at three different 'look-ahead' distances: 60m (196.8ft), 90m (295.3ft), and 120m (393.7ft), as measured from the driver's eye. The topmost horizontal grid mark, i.e. the furthest one in Figure 3, is 120m ahead.
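The heading estimate and the global-to-vehicle transformation used in the setup can be sketched as follows. This is a minimal illustration, not the system's actual source: the function names are ours, and the rotation convention simply follows the frame definitions given earlier (+y straight ahead, +x to the driver's right).

```cpp
#include <cmath>

// Vehicle heading (radians, counter-clockwise from the global +X axis)
// estimated by the simple difference method: the vector angle from a
// past DGPS position to the current one. Increasing the look-back
// distance (the separation between the two fixes) attenuates noise.
double estimateHeading(double xPast, double yPast,
                       double xNow,  double yNow) {
    return std::atan2(yNow - yPast, xNow - xPast);
}

// Transform a map point from global (state plane) coordinates into the
// vehicle-carried frame described in the text: translate by the vehicle
// position, then resolve along the forward and rightward axes.
void globalToVehicle(double px, double py,       // map point (global)
                     double vx, double vy,       // vehicle position
                     double heading,             // from estimateHeading
                     double& outRight, double& outAhead) {
    double dx = px - vx, dy = py - vy;
    double c = std::cos(heading), s = std::sin(heading);
    outAhead = dx * c + dy * s;   // projection onto the forward axis (+y)
    outRight = dx * s - dy * c;   // projection onto the rightward axis (+x)
}
```

A longer look-back distance simply means choosing a past fix further behind the current one before calling `estimateHeading`, which is how the noise attenuation described above is obtained.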
The objective of the conformal HUD is to construct and project road characteristics onto the display screen (or combiner) so that they exactly match the real road characteristics. To quantify how well this is done, the "visual sight angle" (vsa) was used to describe the mismatch error. The visual sight angle is defined by the ratio of the actual lateral error associated with the lane projection and the distance to the eye point, as shown below:

    vsa = (lateral error at distance x) / (distance x)

The visual sight angle normalizes the error.

[Figure: Error (degree) versus Time (sec) for look-ahead distances 60m, 90m, and 120m]
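For concreteness, the definition above can be sketched in C++ (the function name is ours; the paper gives only the formula):

```cpp
#include <cmath>

// Visual sight angle (vsa) in degrees: the lateral mismatch between the
// projected and the actual lane boundary, normalized by the distance
// from the eye. For the small angles involved, the simple ratio in
// radians matches the definition vsa = lateral error / distance.
double visualSightAngleDeg(double lateralErrorM, double distanceM) {
    const double kRadToDeg = 180.0 / 3.14159265358979323846;
    return (lateralErrorM / distanceM) * kRadToDeg;
}
```

For example, a 0.5m lateral error at a 120m look-ahead distance gives a vsa of about 0.24 degrees, consistent with the approximately 0.25 degrees quoted in the experimental results.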
EXPERIMENTAL RESULTS

[...] side to side weaving motion indicated that the system could indeed achieve an average error of approximately 0.25 degrees (equivalent to 0.5m at 120m).

[Figure 6. Effect on error or vsa of look-back distances 0.5m, 1.0m, 1.5m at 30mph -- Error (degree) versus Time (sec), measured at 60m ahead]

The mismatch errors or vsa measured at the different look-ahead distances (i.e., 60m, 90m, and 120m) can be found in Figure 7. The important errors to analyze are those during the no-transient condition (i.e., during steady-state driving from 18 to 26 seconds and from 38 seconds on). The system was calibrated at the 60m distance; thus we would expect that the largest error would occur at the furthest point, i.e. at 120m. Although we expected that there would be an error offset between the three look-ahead distances, we assumed that they would be constant for all speeds. However, we found that the error offsets were larger (by about 0.1 degree) at higher speeds than at slower speeds. This appears to be the result of the vehicle motion from the time that the GPS position is acquired until the [...]. It can be seen that the errors at all three distances match quite well.

[Figure 7. Effect on error or vsa at different forward-looking distances at 30 mph; estimate based on look-back distance 1.0 m -- Error (degree) versus Time (sec) for E60m, E90m, E120m]

Figure 8 depicts the total time measured for the computation associated with the coordinate transformation from DGPS signals in the global coordinate system to the perspective projection in the eye coordinate system, including clipping the results outside the field of view; it also includes the HUD screen drawing time for a typical driving situation. Time to screen refresh is measured from the moment that the DGPS data is received. This screen refresh time (which includes the computation time above) was less than 16ms in most cases. The segment from 100 to 300 sampling points is for the case when the vehicle is moving in a straight line. The centerline and both side line segments were drawn in the field of view. Drawing was limited to 1000 ft ahead in the heading direction of the vehicle. The reduction in times occurred when there was little or nothing to draw, i.e. everything was out of the visual field of the combiner. During the segment from 50 to 100 sample points, the vehicle moved to the left and then to the right within the driving lane. The vehicle changed lanes completely during the segment between 350 and 450 points.

[Figure 8. Time (ms) versus Screen Update Sequence number]
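The projection and clipping step timed above can be sketched as follows. The virtual-screen distance and the field-of-view half-angle are assumptions for illustration only; the paper states just the frame convention (x right, y forward, z up) and the 1000 ft drawing limit.

```cpp
#include <cmath>

// Perspective projection of a point in the eye frame (x right, y
// forward, z up) onto a virtual screen a fixed distance ahead.
// Returns false when the point must be clipped: behind the eye,
// beyond the 1000 ft (304.8 m) drawing limit, or outside an assumed
// horizontal half-angle of the combiner's field of view.
bool projectToScreen(double x, double y, double z,
                     double screenDistM, double halfFovRad,
                     double& sx, double& sy) {
    const double kMaxDrawM = 304.8;                  // 1000 ft limit
    if (y <= 0.0 || y > kMaxDrawM) return false;     // behind / too far
    if (std::fabs(std::atan2(x, y)) > halfFovRad) return false;
    sx = x * screenDistM / y;   // similar triangles: scale by depth
    sy = z * screenDistM / y;
    return true;
}
```

When every transformed point is clipped (nothing in the visual field of the combiner), no drawing is done, which matches the reduction in refresh times noted above.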
GEO-SPATIAL DATABASE

The image of the landscape projected onto the HUD is based on data retrieved from and stored in a geo-spatial database. The database contains all the relevant road and geographic information needed by the HUD and other systems onboard the snowplow. The geographic data is stored, arranged, and searched by its spatial attributes, hence geo-spatial. The central database provides consistent geo-spatial data to all vehicle systems, and allows for easy updates and expansions.

The HUD displays to the driver geo-spatial information including the current lane boundaries. The lane boundaries that are drawn can correspond to the lane striping on the road or the shoulder, or they can represent the lanes where they would be if they existed (for example, on a gravel road). Within the database, the lane boundaries are stored as lines and curves. To query the database, a polygon is defined, and the query processor searches the database to locate objects that intersect or lie within that polygon. After the entire database is searched, the results are sent back to the task which sent the query request. The HUD must accommodate a moving vehicle; therefore, the HUD must continually query the database for new data that has entered its field of view. The HUD requires all road information in the area covered by its field of view, and so defines the polygon used to query the database. Figure 9 shows the HUD's field of view and the corresponding query polygon overlaying a section of a database.

[Figure 9. HUD's Field of View and Query Polygon]

Since the HUD must continually query the database, there are strict limits on query response time and communication latency. Experiments have been designed to measure these values.

GEO-SPATIAL DATABASE EXPERIMENTS

System Architecture. The Geo-spatial Database Management System (GDMS) processes queries and maintains the database. The GDMS runs on a PC104 single board computer, based on a 333MHz AMD K6 processor, running the QNX/Neutrino real-time operating system. As stated before, the HUD is driven by a Windows-based PC. For the HUD to update its image, the result of a query is sent over a local Ethernet from the GDMS on the PC104 to the Windows computer driving the HUD. To control communications between the GDMS and the HUD, a client process running under the QNX/Neutrino operating system updates the polygon and queries the database for the HUD. This client process then sends the query results across the Ethernet, using UDP protocols (described later), to the HUD processor. Two experiments were designed to measure the GDMS performance and Ethernet communications. Experiment 1 measures the query response time of the GDMS, and Experiment 2 measures the latency of the Ethernet communication.
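The query processor's internals are not published in the paper. As a sketch of the test it describes (objects that intersect or lie within the query polygon), a standard ray-casting point-in-polygon check over an object's stored points can stand in; a full implementation would also test edge-edge intersections for objects that cross the polygon boundary without a vertex inside.

```cpp
#include <vector>

struct Pt { double x, y; };

// Ray-casting point-in-polygon test: cast a horizontal ray from p to
// +infinity and count how many polygon edges it crosses; an odd count
// means the point is inside.
bool pointInPolygon(const Pt& p, const std::vector<Pt>& poly) {
    bool inside = false;
    for (std::size_t i = 0, j = poly.size() - 1; i < poly.size(); j = i++) {
        if ((poly[i].y > p.y) != (poly[j].y > p.y) &&
            p.x < (poly[j].x - poly[i].x) * (p.y - poly[i].y) /
                      (poly[j].y - poly[i].y) + poly[i].x)
            inside = !inside;
    }
    return inside;
}

// An object (a single point, or the point list of a line or curve)
// is returned by the query if any of its points falls inside the
// query polygon.
bool objectHitsPolygon(const std::vector<Pt>& obj,
                       const std::vector<Pt>& poly) {
    for (const Pt& p : obj)
        if (pointInPolygon(p, poly)) return true;
    return false;
}
```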
EXPERIMENT 1 - QUERY RESPONSE

To determine the response time of the query processor and GDMS, a sample database was compiled and a series of queries were performed and timed. The sample database was compiled from photogrammetry data received from the Minnesota Department of Transportation. It covers all four lanes of Highway 101 for an 11.6 km segment between Rogers and Elk River, Minnesota. The sample database contains 3720 separate objects, ranging from single-point objects, including signs and stop lights, to multiple-point objects, including guard rails and lane boundaries. Figure 10 shows 400 meters taken from the sample database, representing a typical section.

[...] The third computer consisted of another PC104 with the same specifications and performance as the PC104 computer running the GDMS, and it used the QNX/Neutrino real-time operating system. To simplify communications, the User Datagram Protocol (UDP) was used to send packets of data over the local Ethernet between the GDMS and the HUD. The overhead involved with the Transmission Control Protocol (TCP) was not needed in such a small local network. The maximum packet size for UDP is 2048 bytes, so large query results must be cut into packets and sent in multiple pieces.
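The fragmentation step implied by the 2048-byte packet limit can be sketched as follows; serialization, sequence numbering, and the actual socket calls are omitted, and the function name is ours.

```cpp
#include <algorithm>
#include <cstddef>
#include <vector>

// Split a serialized query result into chunks that fit the 2048-byte
// packet size used over the local Ethernet link, ready to be sent as
// individual UDP datagrams.
std::vector<std::vector<unsigned char>>
splitIntoPackets(const std::vector<unsigned char>& result,
                 std::size_t maxPacket = 2048) {
    std::vector<std::vector<unsigned char>> packets;
    for (std::size_t off = 0; off < result.size(); off += maxPacket) {
        std::size_t len = std::min(maxPacket, result.size() - off);
        packets.emplace_back(result.begin() + off,
                             result.begin() + off + len);
    }
    return packets;
}
```

A 5000-byte query result, for instance, would be sent as three packets (2048 + 2048 + 904 bytes), in line with the 1 to 3 packets typically observed for HUD queries against existing databases.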
To stress the GDMS beyond normal limits, the sample database was intentionally left 'raw'; no smoothing or compression was used to remove extraneous information. The experiment was to execute a series of queries on the sample database and to measure and record the time from when a query was submitted to when the results were returned. The size of the query result was also recorded. A query was performed every 10 meters along the length of the sample database. Figure 11 shows the query response time versus the number of objects found during the query.

As was expected, as the density of objects increases, the query takes longer and returns more objects. Figure 12 shows the query response time versus the size, in bytes, of the query results. This correlates well with Figure 11, since more objects found leads to more bytes of data within the results. Knowing the size of the query results in bytes relates to the second experiment, since the query results have to be sent through the local Ethernet between the GDMS and HUD computers.

Table 1 summarizes the results of this experiment. The results show that on average it takes 0.0327 sec per query, or 0.73 milliseconds per object, to query the database.
[Figure 11. Query Time versus Query Size measured in Number of Objects Retrieved -- Query Size (Number of Objects Found by Query), 0 to 140, versus query time, 0 to 0.1 seconds]

[Figure 12. Query Time versus Query Size in Bytes -- Query Size (Bytes), 0 to 40000, versus query time, 0 to 0.1 seconds]

Transfer Time for One Packet (seconds)
    Average               0.00834
    Standard Deviation    0.00656
    Minimum               0.00181

Using existing databases, rather than the sample database used in Experiment 1, the HUD query results typically span 1 to 3 UDP packets. The maximum time for query processing and communications to the HUD processor was shown to be 0.17 (or 0.0835 + 3*0.0313) seconds for large query results.

CONCLUSIONS AND FURTHER RESEARCH

A conformal HUD to assist drivers by presenting augmented visual information was