
US010393872B2

(12) United States Patent
Brisimitzakis et al.

(10) Patent No.: US 10,393,872 B2
(45) Date of Patent: Aug. 27, 2019

(54) CAMERA AUGMENTED BICYCLE RADAR SENSOR SYSTEM

(71) Applicant: Garmin Switzerland GmbH, Schaffhausen (CH)

(72) Inventors: Evangelos V. A. Brisimitzakis, Lenexa, KS (US); Ross G. Stirling, Cochrane (CA); Kenneth A. Carlson, Gardner, KS (US); Franz A. Struwig, Western Cape (ZA); Nolan van Heerden, Stellenbosch (ZA)

(73) Assignee: Garmin Switzerland GmbH (CH)

(*) Notice: Subject to any disclaimer, the term of this patent is extended or adjusted under 35 U.S.C. 154(b) by 341 days.

(21) Appl. No.: 15/372,092

(22) Filed: Dec. 7, 2016

(65) Prior Publication Data: US 2017/0160392 A1, Jun. 8, 2017

Related U.S. Application Data

(60) Provisional application No. 62/264,539, filed on Dec. 8, 2015.

(51) Int. Cl.: G01S 13/93 (2006.01); G01S 13/86 (2006.01); (Continued)

(52) U.S. Cl.: CPC: G01S 13/931 (2013.01); G01S 7/003 (2013.01); G01S 7/062 (2013.01); G01S 7/24 (2013.01); (Continued)

(58) Field of Classification Search: CPC: G01S 13/931; G01S 2013/9332; G01S 7/04; G01S 7/41; B62J 2099/0013; B62J 2099/002; B62J 2099/0026; B62J 27/00 (Continued)

(56) References Cited

U.S. PATENT DOCUMENTS
5,005,661 A 4/1991 Taylor et al. ... 180/219
5,781,145 A 7/1998 Williams et al. ... 342/20
(Continued)

OTHER PUBLICATIONS
Printout from https://www.dcrainmaker.com/2014/07/hands-on-backtracker-radar.html; published prior to Jun. 3, 2016.
(Continued)

Primary Examiner — Timothy A Brainard
(74) Attorney, Agent, or Firm — Samuel M. Korte; Max M. Ali

(57) ABSTRACT

A bicycle radar system including a camera is disclosed. The system may include a radar unit and a bicycle computing device that are in communication with one another. The radar unit may transmit radar signals, receive return signals (reflections), and process the returned radar signals to determine a location and velocity of one or more targets located in a sensor field behind a user's bicycle. The radar unit may also include an integrated camera to selectively provide images or video of an area behind the bicycle in the camera's field of view. The radar unit may analyze the returned radar signals and images and/or video to track the location of targets located behind the bicycle. The bicycle computing device or the radar unit may also selectively activate the camera based upon the satisfaction of particular conditions.

18 Claims, 7 Drawing Sheets
[Representative drawing: user interface screen 440 showing the navigation instruction "Left on Main St." (405), an informational overlay 410, a camera view 444 with tracked target icons 402A/402B and 403A/403B (408), and the fields 414 "Dist. to Turn 300 ft", "Time to Turn 0:20", and 412 "Tap to go back".]
US 10,393,872 B2
Page 2

(51) Int. Cl.: G01S 7/06 (2006.01); G01S 7/24 (2006.01); G01S 7/00 (2006.01)

(52) U.S. Cl.: CPC: G01S 13/867 (2013.01); G01S 2013/936 (2013.01); G01S 2013/9353 (2013.01); G01S 2013/9357 (2013.01); G01S 2013/9378 (2013.01)

(58) Field of Classification Search: USPC: 342/55. See application file for complete search history.

(56) References Cited

U.S. PATENT DOCUMENTS
6,731,202 B1 5/2004 Klaus ... 340/425.5
7,061,372 B2 6/2006 Gunderson ... 340/435
7,079,024 B2 7/2006 Alarcon ... 340/539.11
9,449,518 B2 9/2016 Mochizuki
2003/0201929 A1* 10/2003 Lutter ... G01S 7/032; 342/52
2008/0186382 A1 8/2008 Tauchi ... 348/148
2009/0033475 A1 2/2009 Zuziak et al. ... 340/432
2013/0127638 A1 5/2013 Harrison ... 340/903
2015/0025789 A1* 1/2015 Einecke ... G01S 13/72; 701/408
2015/0228066 A1* 8/2015 Farb ... G06K 9/00805; 348/148
2016/0363665 A1 12/2016 Carlson et al. ... 342/146

OTHER PUBLICATIONS
Printout from https://www.dcrainmaker.com/2015/10/cycliq-fly6-review.html; published prior to Jun. 3, 2016.
Printout from https://www.dcrainmaker.com/2015/01/cycliq-mounted-camera.html; published prior to Jun. 3, 2016.
Printout from http://www.slowtwitch.com/Products/2011_Interbike_Cerevellum_Hindsight_35_2349.html; published prior to Jun. 3, 2016.

* cited by examiner
[Sheet 1 of 7, FIG. 1: block diagram of an example mobile electronic device environment 100. GPS satellites (signal data 108), a cellular provider 128 (phone service 132), and an internet provider (content 134) connect through network(s) to a mobile electronic device 102 containing a processor 104; a memory 106 holding map data 116, a user interface 136, applications 138, a browser 140, a navigation interface 142, and a route selection interface with metric(s) 154, topography information 156, a difficulty rating 158, and distance/time information; a position-determining component 112 with a GPS receiver; a display device with touch screen; I/O devices 124; a communication component 126; a speaker 178; and a haptic feedback element 180. An example navigation screen 148 shows distance, time, and elevation fields.]
[Sheet 2 of 7, FIGS. 2A-2B: an example radar sensor system environment 200 shown from two different perspectives, with a bicycle-mounted radar unit and reference numerals 202-208.]
[Sheet 3 of 7, FIG. 3: block diagram of a radar sensor system 300. A radar unit 308 includes a processor 352, a communication unit 354, a sensor array 356, a camera 358, a power unit 360, a taillight assembly 362, and a memory unit 364 holding processing modules (reference numerals 365-375, including a tracking module 367 and a video processing module 369). The radar unit communicates over link 301 with a mobile electronic device 306.]
[Sheet 4 of 7, FIG. 4A: example user interface screen 400 showing a navigation instruction 404 ("Left on Main St.", 405), an informational overlay 410 with a tracking bar 426 marked at 50 ft, 150 ft, and 250 ft, target icons 402A/402B and 403A, and the fields 414 "Dist. to Turn 300 ft", 416 "Time to Turn 0:20", and 412 "Tap to go back" (see also numeral 424).]
[Sheet 5 of 7, FIG. 4B: example user interface screen 440, similar to FIG. 4A, with navigation instruction 418 ("Left on Main St.", 405), informational overlay 410 with tracking bar 426, a video region 444, target icons 402A/402B and 403A/403B (408), and the fields "Dist. to Turn 300 ft", "Time to Turn 0:20" (416), and "Tap to go back" (412).]
[Sheet 6 of 7, FIG. 4C: example user interface screen 480 with navigation instruction 418 ("Left on Main St."), informational overlay 426, a live video area 482 showing detected vehicles 484 and 486, target icons 402A/402B and 403A/403B (406, 408), and the fields 414 "Dist. to Turn 300 ft", 416 "Time to Turn 0:20", and 412 "Tap to go back".]
[Sheet 7 of 7, FIG. 5: method flow 500:
502: Generate radar sensor signals and receive reflections of the radar sensor signals.
504: Analyze the radar sensor signals to determine a velocity and location of one or more targets located in a sensor field.
506: Transmit target data including the velocity and location for the one or more targets from the analysis of the radar sensor data.
508: Has a trigger condition been satisfied? If NO, continue monitoring; if YES:
510: Cause a camera to begin capturing video data.
512: Analyze the video data to determine a velocity and location of one or more targets located in the camera's field of view.
514: Transmit target data including the velocity and location for the one or more targets from the analysis of the video data.]
CAMERA AUGMENTED BICYCLE RADAR SENSOR SYSTEM

CROSS-REFERENCE TO RELATED APPLICATIONS

The priority benefit of U.S. Provisional Patent Application No. 62/264,539, entitled "Bicycle Rear Radar Sensor and Camera," filed on Dec. 8, 2015, is claimed and the disclosure of which is hereby incorporated by reference in its entirety.

TECHNICAL FIELD

The present disclosure relates generally to bicycle radar systems and, more particularly, to a bicycle radar system that is mounted to a bicycle and augmented with a camera to provide enhanced user awareness of vehicles and other cyclists located behind the bicycle radar system.

BACKGROUND

Cyclists often have limited visibility of their surroundings, particularly of moving targets (e.g., vehicles, bicycles, objects, obstacles, etc.) located behind them. Radar signals may be output and reflections of the outputted radar signals may be used to detect nearby targets in a sensor field, such as an area of interest behind the cyclist, and present information related to the detected target(s) to the cyclist. However, radar systems typically include a transmitting antenna and a radar sensor (receiving antenna) that detects one or more targets traveling near the bicycle to which the bicycle radar system is mounted. A rear-mounted radar system may detect a vehicle approaching the bicycle from behind. Radar systems mounted to a moving object may be improved by incorporating a camera having a field of view at least partially overlapping with the sensor field of the radar sensor.

SUMMARY

The present disclosure is directed to technology that encompasses a radio detection and ranging (RADAR or radar) sensor system having a radar sensor, a camera, and a user interface device to provide situational awareness indicators. In embodiments, the radar sensor system includes a radar unit including a radar sensor housing that is mountable to the bicycle and that encloses, wholly or partially, the radar sensor, the camera, a processor, and a transceiver. The radar unit may be mounted on a bicycle in a position such that a portion of the radar sensor faces an area behind the bicycle. The radar unit may be configured to transmit radar signals, receive a reflection of the transmitted radar signals, and output radar sensor signals corresponding to the received reflections. The radar sensor signals may be analog signals indicating unprocessed measurements of radar reflections (radar beam returns) received by the radar sensor in a sensor field of the radar unit. The camera's field of view at least partially overlaps with the sensor field of the radar sensor. For instance, the sensor field of the radar sensor may be associated with an area having a size (width, height, and depth) that is approximately equal to the area of the camera's field of view. The camera generates images and/or video data for the field of view captured by the camera (hereafter called the "video data," the "image data," or both).

In embodiments, the radar sensor system includes a mobile electronic device (e.g., a bicycle computer, smart phone, smart watch, head-mounted in-sight display, portable navigation device, or the like). The mobile electronic device may be mounted to the bicycle or worn by the user (e.g., head-mounted, wrist-worn, etc.) in a position such that its display is viewable by the cyclist. For example, the mobile electronic device may be mountable to or sized to fit within a holder mounted to the user's bicycle, such as the handlebars, or to be mounted to sunglasses worn by the user.

The technology encompassed by the present disclosure may further comprise informing or alerting a cyclist of identified targets that may be of interest to the user by providing one or more situational awareness indicators on a display, using a haptic feedback element to provide one or more vibrations, or using a speaker of the mobile electronic device to provide audible feedback. The mobile electronic device may include one or more processors and the radar unit may include one or more processors. The processors may be used, independently or together, to analyze the radar sensor signals and video and/or image data in order to generate target data relating to one or more targets located in the radar sensor's sensor field and the camera's field of view. In implementations, the processor in the radar unit may identify at least one target located in a sensor field in proximity to the radar sensor using the reflected radar sensor signals and generate target data that may be wirelessly transmitted (or communicated) to the mobile electronic device. The target data may include data relating to a relative distance and velocity of one or more targets based upon radar signals and/or image analysis. As a result, the target data may include data (e.g., the relative distance and velocity of one or more targets and threat levels associated with targets) derived from one or more radar sensor signals and/or derived from one or more images captured by the camera. The processor of the mobile electronic device or the processor of the radar unit may determine information, such as situational awareness indicators, threat levels, and/or location and/or velocity information relating to one or more target(s), to aid a user with riding a cycle in areas having stationary and/or moving objects along the user's route from a starting point to a destination.

In embodiments, situational awareness information, determined based on target data, may be provided using the user interface device of the radar sensor system. In embodiments, the radar sensor system presents situational awareness information using a user interface device (e.g., a display, a speaker, a haptic feedback element, etc.) of the mobile electronic device that is accessible by (e.g., viewable, audible, in contact with, etc.) the user while riding the cycle. For instance, in embodiments where the mobile electronic device is worn on a wrist of a user, the mobile electronic device may provide situational awareness information to the user using the haptic feedback element and/or the speaker, either with or without presenting situational awareness information on a display device of the mobile electronic device.
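The summary above describes target data only functionally. As a rough illustration (not part of the patent text), the sketch below models a target record carrying the relative distance and velocity of a target, an optional threat level, and a tag for whether the record was derived from radar reflections, camera images, or both; all names and the Python representation are assumptions.

```python
from dataclasses import dataclass
from enum import Flag, auto
from typing import Optional


class Source(Flag):
    RADAR = auto()   # derived from reflected radar sensor signals
    CAMERA = auto()  # derived from captured images/video


@dataclass
class TargetData:
    """One target in the sensor field / field of view (hypothetical format)."""
    target_id: int
    distance_m: float         # relative distance to the bicycle, in meters
    velocity_mps: float       # relative velocity, in meters per second
    source: Source            # radar, camera, or radar | camera
    threat_level: Optional[int] = None  # e.g., 0 = low ... 2 = high


# Example: a vehicle 50 m behind, closing at 8 m/s, seen by both sensors.
vehicle = TargetData(1, 50.0, 8.0, Source.RADAR | Source.CAMERA)
```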
The use of a camera in addition to a radar sensor enables the radar sensor system to enhance the situational awareness indicators in a manner that would not be feasible using only the reflections of radar sensor signals. For instance, live video data may be transmitted by the radar unit to the mobile electronic device, which may provide a real-time view of targets located behind the user's bicycle. The situational awareness indicators may include, for example, text, symbols, icons, highlighting, flashing colors, dimmed or brightened portions of a displayed screen, or navigational information (turn arrow) presented on the display, and so forth, which are provided via a user interface of the mobile electronic device, which may include the display. In this way, the display may provide a situational awareness indicator to inform the user of the presence of one or more targets within a detectable range of the radar sensor (and thus the cyclist), a threat level associated with the detected targets, and/or live video of the detected targets. In this manner, the cyclist has improved situational awareness to ride the bicycle using information relating to the detected target, which may be in front of the cyclist, behind the cyclist, or to the left or right side of the cyclist. In some implementations, the situational awareness indicators may be paired with other information (e.g., guidance, positioning, or location information).

The situational awareness information presented on the display may include a determined location of a detected target (e.g., an approaching vehicle, pedestrian, cyclist, object, animal, other cyclist, etc.) relative to the bicycle based on a received radar sensor signal, a determined range of the target to the cyclist, a direction of approach of the target, a determined awareness level of the target, a threat level, a current lane occupied by the target, and so forth. For example, a situational awareness indicator may be a tracking bar with an icon illustrative of a target location based on received target data, a dynamic representation of a distance between the target and the bicycle using two icons, a brightness or color of an edge of the display or navigational information (turn arrow) presented on the display, or a numeric time gap between the target and the bicycle based on the target data. In embodiments, the mobile electronic device can also present location information or positioning data (e.g., geographic coordinates, altitude, map data, navigation information, and so forth) based on a current geographic position received by the processor from a position-determining component located within the mobile electronic device or the radar unit. Furthermore, the mobile electronic device may present threat level indicators or video of the road behind the bicycle.

This summary is provided to introduce a selection of concepts in a simplified form that are further described below in the detailed description. This summary is not intended to identify key features or essential features of the claimed subject matter, nor is it intended to be used to limit the scope of the claimed subject matter. Other aspects and advantages of the present technology will be apparent from the following detailed description of the embodiments and the accompanying drawing figures.

BRIEF DESCRIPTION OF THE DRAWINGS

The figures described below depict various aspects of the system and methods disclosed herein. It should be understood that each figure depicts an embodiment of a particular aspect of the disclosed system and methods, and that each of the figures is intended to accord with a possible embodiment thereof. Further, whenever possible, the following description refers to the reference numerals included in the following figures, in which features depicted in multiple figures are designated with consistent reference numerals.

FIG. 1 is a block diagram illustrating an example mobile electronic device environment 100 including a mobile electronic device that can implement a radar sensor system in accordance with embodiments of the technology;

FIGS. 2A-2B illustrate an example radar sensor system environment 200 from two different perspectives;

FIG. 3 is a block diagram example of a radar sensor system 300, according to an embodiment;

FIGS. 4A-4C are schematic illustration examples of user interface screens, according to an embodiment; and

FIG. 5 illustrates a method flow 500, according to an embodiment.

DETAILED DESCRIPTION

The following text sets forth a detailed description of numerous different embodiments. However, it should be understood that the detailed description is to be construed as exemplary only and does not describe every possible embodiment since describing every possible embodiment would be impractical. In light of the teachings and disclosures herein, numerous alternative embodiments may be implemented.

A radar sensor system can inform or alert a cyclist about targets, obstacles, and other objects in proximity to his or her bicycle. For clarity, while the term "bicycle" is used throughout the description for consistency and simplicity, the present invention should not be construed to be limited to use with a bicycle. Embodiments could include a bicycle, unicycle, tricycle, or any other human force-powered vehicle. A cyclist may be assisted by a bike computer having a geographic positioning system (GPS) receiver and a processor configured to provide information. In these scenarios, situational awareness of nearby moving vehicles and bicycles may be helpful for the cyclist to identify an appropriate moment to perform a turn or lane change. In embodiments, situational awareness indicators may be presented on a display viewable by the cyclist, or the situational awareness information may be provided using a haptic feedback element or a speaker of the mobile electronic device. For example, a mobile electronic device that is mounted on the handlebars of the bicycle may include a display viewable by the cyclist that can present situational awareness information (e.g., an indication of determined location(s) of a target(s), the range of the target to the cyclist, the direction of approach of the target, the awareness level of the target, and so forth) based on target data corresponding to identified targets located proximate to the bicycle. In embodiments where the radar sensor system is implemented as two or more separate components, the target data is received by the mobile electronic device from a transceiver of the radar unit mounted to the bicycle. In embodiments, the mobile electronic device may be worn on a user's head or mounted to sunglasses worn by the user. Various measurements determined from an analysis of the target data may be provided to a user. The display of the mobile electronic device may also present location information (e.g., geographic coordinates, altitude, and so forth) of the bicycle based on the current geographic position of the bicycle communicated to the processor from a position-determining component.

Embodiments also include utilizing image analysis to provide additional functionality that would not be feasible for a processor relying solely upon reflections of the radar sensor signals. For instance, a processor included in the radar unit may analyze video and/or image data to correlate one or more targets located behind the user's bicycle to a particular road lane and present this information on a display. Furthermore, although the velocity and position of various targets may be ascertained using reflections of radar sensor signals, different vehicles may have similar radar profiles regardless of their size. However, by analyzing the video and/or image data, the size of a target may be readily ascertained and the appropriate threat level (which may be based upon the detected size and/or position of the target) may be conveyed to the cyclist for improved situational awareness.
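To make the size and lane ideas concrete, here is a minimal sketch assuming the radar unit's video pipeline has already produced a bounding box for each radar-tracked target; the pixel thresholds and the three-way lane split are invented for illustration and are not taken from the patent.

```python
def size_class(box_w_px: int, box_h_px: int) -> str:
    """Estimate target size from its image bounding-box area.

    Radar returns can look similar for a truck and a car, so the camera
    image is used to separate them (thresholds are illustrative).
    """
    area = box_w_px * box_h_px
    if area > 40_000:
        return "large"   # e.g., truck or bus; may warrant a higher threat level
    if area > 10_000:
        return "medium"  # e.g., passenger car
    return "small"       # e.g., another cyclist


def road_lane(box_center_x_px: int, frame_w_px: int) -> str:
    """Correlate a target to a road lane from its horizontal position
    in the camera frame (assumes a simple three-lane split)."""
    third = frame_w_px / 3
    if box_center_x_px < third:
        return "left lane"
    if box_center_x_px < 2 * third:
        return "center lane"
    return "right lane"
```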
In embodiments, the mobile electronic device and/or radar unit may include a position-determining component, such as a global positioning system (GPS) receiver, configured to determine a current geographic position of the bicycle, a transceiver configured to receive target data from a transceiver coupled with a radar sensor of the bicycle, a display, and a processor coupled with the position-determining component, the transceiver, and the display. The processor of the mobile electronic device may be configured to determine one or more situational awareness indicators based on the received target data and to cause the display to present location information based on the geographic position determined by the position-determining component and the one or more situational awareness indicators (e.g., an indication of a detected target, a range of the target to the cyclist, a direction of approach of the target, an awareness level, and so forth). Additionally, the mobile electronic device may present a threat level associated with various targets, a current lane occupied by one or more targets, and/or live video captured behind the bicycle.

In some embodiments, the radar sensor system may be implemented as two or more separate components, while in other embodiments the radar sensor system may be integrated as a single component. For instance, the radar sensor system may include a radar unit (or radar housing) containing a radar sensor and a mobile electronic device having a processor configured to present situational awareness indicators informing or alerting a cyclist of one or more targets, such as moving vehicles, pedestrians, cyclists, and/or other obstacles, determined to be in proximity to his or her cycle (e.g., bicycle, unicycle, tricycle, or other human force-powered vehicle). The radar sensor may be configured to transmit a radar signal, receive a reflection of the transmitted radar signal, and output a radar sensor signal corresponding to the received reflection. The radar sensor signal may be generated by the processor of the radar unit or the radar sensor. For instance, the radar sensor signal may be an analog signal representing unprocessed radar reflections (radar beam returns) received by the radar sensor in a sensor field of the radar sensor.

The radar sensor may face an area proximate to (front, behind, left, right, or any combination thereof) the cycle to which the radar sensor system is mounted, where radar signals may be output and reflections of the outputted radar signals from target(s) may be received (i.e., the sensor field of the radar sensor). The radar unit can detect one or more targets (e.g., vehicles, objects, pedestrians, animals, and so forth) in range of the bicycle based on reflections (radar beam returns) received by the radar sensor from one or more targets located within a sensor field of the radar sensor.

The radar sensor system may also include a camera facing a field of view proximate to (front, behind, left, right, or any combination thereof) the bicycle. Furthermore, the camera may be configured to capture video and/or images, which are analyzed by one or more processors included in the radar unit. The processor of the radar unit may then generate target data based on an analysis of the radar sensor signal and/or the captured video or images based upon the occurrence of certain conditions. In embodiments, the target data includes information identifying targets proximate to the bicycle regardless of whether the target data was generated based upon an analysis of the reflected radar sensor signals and/or the captured video or images. The radar unit may be mounted on the user's bicycle such that the radar sensor and camera face any area proximate to the bicycle, such as an area to the front, behind, the left side, the right side, or any combination thereof.

The mobile electronic device may also provide situational awareness information via audible alerts provided by a speaker. For example, the speaker may output a unique tone when at least one target is detected by the processor or output a tone for every new target detected. In embodiments, the processor may control the speaker to adjust a volume or pattern of output tones based on a determined awareness level of one or more targets. The processor may control the speaker to adjust a pattern of tones output by the speaker based on a determined direction of approach of the target. In embodiments, the speaker may include two speakers operating in stereo. The processor may control the two stereo speakers to adjust the tone's volume, pattern, or duration to provide feedback relating to a determined direction of approach of one or more targets identified by the processor. The processor may control the speaker to output one or more pre-recorded messages, such as "On your right" or "On your left," to provide a cyclist situational awareness of targets determined to be located in proximity of the user and his bicycle to which the radar sensor system is mounted.

The mobile electronic device may also provide situational awareness information using haptic feedback. The mobile electronic device may include a motor and a vibrating element that may be controlled by a processor to produce vibrations of constant or varying intensity. For instance, a processor may control the haptic feedback element to produce a vibration when at least one target is determined to exist in a sensor field of a radar sensor or a field of view of a camera (e.g., behind the cyclist) or when a new target is identified by a processor in the radar unit. In embodiments, a processor may control the haptic feedback element to adjust vibration intensity (strength) or a pattern of the vibrations based on a determined awareness level of a target or a determined direction of approach of the target.

The processor of the mobile electronic device or the processor of the radar unit may analyze the target data to determine information, such as situational awareness indicators relating to one or more target(s), to aid a user with riding a bicycle in areas having stationary and/or moving objects along the user's route from a starting point to a destination. The processor of the mobile electronic device may receive the detected current geographic position and target data from the position-determining component and a transceiver of the radar unit, respectively, and may be configured to determine one or more situational awareness indicators based on the target data, which may include information corresponding to targets proximate to the bicycle, and cause the display to present the location information (e.g., location or geographical position, altitude, or navigation data in text, symbols, a graphical (e.g., map) representation, or the like) and a situational awareness indicator.
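One way to realize the audible alerts described above is sketched below; the volume steps, repetition counts, and stereo panning values are assumptions, and only the "On your right"/"On your left" message strings come from the text above.

```python
def audible_alert(direction: str, awareness_level: int) -> dict:
    """Choose playback parameters for one detected target.

    direction: "left" or "right" (determined direction of approach).
    awareness_level: 0 = low, 1 = moderate, 2 = high.
    Returns hypothetical settings for the device speaker(s).
    """
    message = "On your right" if direction == "right" else "On your left"
    return {
        "message": message,                          # pre-recorded message
        "volume": (0.4, 0.7, 1.0)[awareness_level],  # louder at higher levels
        "tone_repeats": 1 + awareness_level,         # denser tone pattern
        "stereo_pan": 1.0 if direction == "right" else -1.0,  # pan toward target
    }
```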
The situational awareness indicator may be a tracking bar with an icon illustrative of a target location based on the target data, a dynamic representation of a distance between the target and the bicycle using two icons, a brightness or color of an edge of the display or navigational information (turn arrow) presented on the display, or a numeric time gap between the target and the bicycle based on the target data corresponding to targets proximate to the bicycle. The situational awareness indicator may include text, symbols, or an iconic or graphical representation located on or adjacent to a map, textual, or symbolic representation of location or positioning data, or any combination thereof. For example, the processor of the mobile electronic device can cause the display to present a map with an icon associated with the detected target on the map or present a tracking bar next to the map with an iconic representation of the detected target relative to the user's bicycle. The processor of the mobile electronic device can also cause the display to show text, symbols, icons, highlighting, flashing colors, dimmed or brightened portions of a displayed screen, and so forth to indicate an awareness level (e.g., "low awareness level," "moderate awareness level," or "high awareness level") associated with the detected target. Furthermore, the processor associated with the mobile electronic device may cause the display to show other indicators or information such as a threat level based upon the size and/or position of a target, live video captured by the camera of the radar unit, or an indication of a road lane occupied by the detected target.

In implementations, the processor of the mobile electronic device is configured to cause the display to present a first indicator when a detected target is determined to be in proximity (front, behind, left side, right side, or any combination thereof) to the bicycle. For example, the processor of the mobile electronic device may be configured to cause the display to present a tracking bar when a target is determined to be present within a detectable range of the radar sensor or is detected to be present within threshold proximity of the radar unit (and thus the bicycle) based on the target data. The processor of the mobile electronic device may also be configured to cause the display to present an icon illustrative of the target detected to be proximate to the radar unit on the tracking bar when the target is determined to be present within a threshold distance from the bicycle based on the target data corresponding to targets proximate to the bicycle.

In some implementations, the processor of the mobile electronic device may be further configured to cause the display to present a dynamic representation of a distance determined by the processor between the bicycle and a target determined to be present proximate to the bicycle based on the received target data, using an icon illustrative of the target and a second icon illustrative of the bicycle. The separation between the icons is representative of the distance between the bicycle and a target based on the target data corresponding to targets proximate to the bicycle. For example, the processor of the mobile electronic device may be configured to cause the display to show a substantially instantaneous or periodically updated representation of the tracking bar, where the cyclist icon and the target icon are presented closer to one another, or further away from one another, based on changes in the distance between the cyclist and the target.

In another example, the situational awareness indicator determined by the processor of the mobile electronic device is presented as a brightness or color of at least one portion of one or more edges of a display (including a display screen) to indicate an awareness level determined in association with a target determined to be present in proximity to the bicycle. The processor may be configured to cause a change in the brightness or color of an edge of a display device or navigational information (turn arrow) presented on the display of the mobile electronic device to indicate the presence of one or more targets proximate to the bicycle in an area of interest corresponding to the radar sensor's sensor field and/or the camera's field of view. Information relating to the targets may be provided in target data communicated by a transceiver of the radar unit to the processor of the mobile electronic device. For example, the processor of the mobile electronic device can cause at least one edge of the display or presented navigational information (turn arrow) to change color (e.g., change to red, yellow, or green) to indicate an awareness level (i.e., a suggested level of awareness of the cyclist's surroundings that the cyclist may wish to employ) associated with a target determined to be present (detected) proximate to the bicycle based on the target data corresponding to targets proximate to the user's bicycle.

The awareness level (as well as a threat level, when applicable) associated with a target may be determined based on one or more factors such as, but not limited to, a determined distance between the cyclist and the detected target, a determined approaching speed of the target or relative speeds of the cyclist and target, a determined rate of acceleration or deceleration of an approaching target, a determined change of direction (e.g., turn, lane change, etc.) of an approaching target, a number of targets, a determined size of the target, map or route information (e.g., predicted visibility due to turns, hills, trees, and other geographic features, weather information, etc.), any combination of the foregoing, and so on, based on the target data corresponding to targets proximate to the bicycle. In some implementations, the processor of the mobile electronic device may also be configured to cause a change in brightness or color of the at least one portion of the edge of the screen of the display or navigational information (turn arrow) presented on the display in response to determining a target in a first direction associated with the edge corresponding to the determined direction of the target relative to the location and/or orientation of the mobile electronic device display.
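A toy scoring function combining a few of the factors listed above (distance, approach speed, acceleration, and size) is shown below; the weights and thresholds are invented, and a real implementation could fold in the remaining factors (number of targets, direction changes, map/route visibility, weather) the same way.

```python
def awareness_level(distance_m: float, closing_speed_mps: float,
                    accel_mps2: float, size: str) -> str:
    """Map target kinematics and size to "low", "moderate", or "high".

    All weights and cutoffs are illustrative, not from the patent.
    """
    score = 0.0
    if distance_m < 30:
        score += 2.0
    elif distance_m < 100:
        score += 1.0
    score += max(0.0, closing_speed_mps) / 5.0  # faster approach raises score
    score += max(0.0, accel_mps2) / 2.0         # accelerating targets raise it
    score += {"small": 0.0, "medium": 0.5, "large": 1.0}.get(size, 0.0)
    if score >= 4.0:
        return "high"
    return "moderate" if score >= 2.0 else "low"
```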
The processor of the mobile electronic device may also be configured to cause a change in brightness or color of at least a portion of a second edge of the display in response to determining that a target is present in a second direction associated with the second edge based on the target data corresponding to targets proximate to the bicycle. For example, the processor may be configured to cause the right edge of the mobile electronic device display or navigational information (turn arrow) presented on the device display to change color or brightness to indicate an approaching vehicle or other target determined to be present (detected) in a right sensor field of the radar sensor, and the left edge of the display to change color or brightness to indicate an approaching vehicle or other target determined to be present (detected) in a left sensor field of the radar sensor. Similarly, the processor may be configured to cause the right edge of the mobile electronic device display or navigational information (turn arrow) presented on the device display to change color or brightness to indicate an approaching vehicle or other target determined to be present (detected) in a right portion of a field of view of the camera, and the left edge of the display to change color or brightness to indicate an approaching vehicle or other target determined to be present (detected) in a left portion of a field of view of the camera.

The processor of the mobile electronic device may also be configured to cause a change in brightness or color of at least a portion of multiple edges of the display or navigational information (turn arrow) presented on the display in response to determining that a target is present in a third direction associated with the combination of edges corresponding to the determined direction of the target relative to the location and/or orientation of the mobile electronic device display. For example, the processor may be configured to cause the left and right edges of the display or navigational information (turn arrow) presented on the display to change color and/or brightness to indicate an approaching vehicle or other target, the position of which is determined based on target data, located in a rear (or any other) sensor field of the radar sensor or field of view of the camera. The color and/or brightness change may be greater (increased) if a target determined to be located in the sensor field of the radar sensor or field of view of the camera is determined to be traveling faster than (approaching) the bicycle on which the radar unit and mobile electronic device are mounted than for targets determined to be located in the sensor field that are determined to be traveling at the same or slower speed than the bicycle.

Similarly, in embodiments where audible or haptic feedback is provided to communicate situational awareness information, the change in volume of the audible output and/or the intensity of the haptic feedback (vibration) may be greater (increased) if a target determined to be located in the sensor field of the radar sensor or field of view of the camera is determined to be traveling faster than (approaching) the bicycle on which the radar unit and mobile electronic device are mounted than for targets determined to be located in the sensor field that are determined to be traveling at the same or slower speed than the bicycle. For example, the display color or brightness, speaker volume, or haptic feedback may be changed to the highest (e.g., brightest, loudest, most intense or strongest) configuration of the display, speaker, or haptic feedback element if a target determined to be located in the sensor field of the radar sensor or field of view of the camera is determined to be quickly approaching the radar unit and the bicycle at a rate of at least three times the current speed of the bicycle, which is determined by the processor of the mobile electronic device or the processor of the radar unit based on information provided by a position-determining component. In such a manner, the user may be informed of relevant targets (objects) proximate to the user and take precautionary or corrective measures, if necessary.
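The edge selection and the "faster targets produce a stronger change" rule might be sketched as follows. The saturation point (a target moving at three or more times the bicycle's speed) is taken from the example above; the baseline value and the linear ramp between the two extremes are assumptions.

```python
def edge_indicator(direction: str, target_speed_mps: float,
                   bike_speed_mps: float) -> dict:
    """Pick display edge(s) and an intensity for an approaching target.

    direction: "left", "right", or "rear" relative to the bicycle; a rear
    target lights both edges. Intensity is normalized to 0.0-1.0 and can
    equally drive speaker volume or vibration strength.
    """
    edges = {"left": ["left"], "right": ["right"],
             "rear": ["left", "right"]}[direction]
    if bike_speed_mps <= 0 or target_speed_mps >= 3 * bike_speed_mps:
        intensity = 1.0   # brightest / loudest / strongest configuration
    elif target_speed_mps <= bike_speed_mps:
        intensity = 0.3   # target pacing the bicycle or falling behind
    else:
        ratio = target_speed_mps / bike_speed_mps   # between 1.0 and 3.0
        intensity = 0.3 + 0.7 * (ratio - 1.0) / 2.0
    return {"edges": edges, "intensity": intensity}
```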
Situational awareness indicators may also include metrics associated with one or more targets determined to be present (detected) in the sensor field of the radar sensor or a field of view of the camera in the radar unit. For example, the processor of the mobile electronic device may be configured to determine a time gap associated with a determined distance between the bicycle to which the radar unit is mounted and a moving or stationary target detected in proximity to the bicycle, and cause the display to present the determined time gap. In embodiments where audible or haptic feedback is provided to communicate situational awareness information, a speaker of the mobile electronic device, or a speaker in wireless communication with the mobile electronic device, may output a message indicating the presence of a target proximate to the cyclist and a determined estimate of time until an approaching target will reach the cyclist. The mobile electronic device may identify a target approaching the radar unit (and the cyclist), and determine the time required for the target to reach the radar unit based on the current velocity of the target and the cyclist's bicycle. For instance, the processor may cause an audible signal such as, "vehicle identified fifty (50) feet behind, will reach bicycle in thirty (30) seconds."

In implementations, the processor of the mobile electronic device or the processor of the radar unit may use the target data to determine the time gap based on the distance between the bicycle and the detected target and the relative speeds of the bicycle and the detected target. The processor of the mobile electronic device or the processor of the radar unit may determine current locations of the bicycle and target(s) determined to be located in the sensor field or the camera's field of view based on inputs such as, but not limited to, location information (e.g., location or positioning data measured by the position-determining component), communicated information (e.g., a communication received from the detected target), bicycle speed measurements (e.g., from a bicycle speedometer), and so forth.
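The time gap reduces to distance divided by closing speed. A minimal sketch, with a worked check against the quoted alert (a target fifty feet behind that arrives in thirty seconds implies a closing speed of 50/30 feet per second):

```python
from typing import Optional


def time_gap_s(distance_m: float, bike_speed_mps: float,
               target_speed_mps: float) -> Optional[float]:
    """Seconds until an approaching target reaches the bicycle.

    Returns None when the target is not closing, since no finite
    time gap exists in that case.
    """
    closing_mps = target_speed_mps - bike_speed_mps
    if closing_mps <= 0:
        return None
    return distance_m / closing_mps


FEET = 0.3048  # meters per foot
assert round(time_gap_s(50 * FEET, 0.0, (50 / 30) * FEET)) == 30
```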
The radar unit, including at least one radar sensor, is mountable to a bicycle being ridden by the user, and the mobile electronic device is also mountable to the same bicycle in a position in which its display is viewable by the cyclist, to the user's wrist, or to an accessory (e.g., sunglasses) worn by the user on his head. In embodiments where the situational awareness information is presented on a display device of the mobile electronic device, it is to be understood that the mobile electronic device may be mounted anywhere as long as its display device may be seen by the user while riding the bicycle. For example, the mobile electronic device may be mountable to or sized to fit within a holder mounted to a steering assembly (e.g., handlebars) of the bicycle. In embodiments where the situational awareness information is provided using a speaker or a haptic feedback element, the mobile electronic device may not include a display, or a display of the mobile electronic device does not need to be mounted in a location where it may be seen by the user while riding the bicycle. In embodiments, the mobile electronic device may be coupled with or in communication with (wired or wirelessly) headphones, or a mobile device in communication with headphones, such that audible information may be output to the user by the headphones. For instance, the mobile electronic device may determine situational awareness information for one or more targets determined to be in proximity to the bicycle and then cause the headphones to output audible alert tones or messages (e.g., "vehicle approaching to your right").

In some embodiments, the mobile electronic device is physically connected (e.g., wired) to one or more radar units mounted on the bicycle such that one or more radar sensors may have a sensor field in front of, behind, to the left side, and/or to the right side of the bicycle. In embodiments, the mobile electronic device may include or integrate a radar sensor. In other embodiments, a transceiver of the mobile electronic device may be configured for wireless communication with a transceiver of the radar unit.

Once a target has approached the bicycle from behind and the target begins travelling at approximately the same velocity as the user's bicycle, which may result in the threat level from the target exceeding a threshold level, the processor of the radar unit may activate a camera to capture video data of objects in a field of view of the camera to assist the cyclist in assessing a threat level posed by the target. Therefore, embodiments include the camera of the radar unit selectively capturing video and/or image data, which may be analyzed by the processor of the radar unit to generate the target data. In this way, the target data may include information that is based upon the radar sensor signals or the analyzed video and/or image data. In other words, the target data may include data (e.g., the relative distance and velocity of one or more targets) derived from radar sensor signals and/or data derived from images captured by the camera. Therefore, when the target data includes information based upon a video and/or image analysis of captured data, the target data may additionally or alternatively include data identifying any suitable type of information upon which the aforementioned situational awareness indicators are based (e.g., the relative distance and velocity of one or more targets). The use of a camera is also advantageous in that the size of objects may be more accurately ascertained, which may be used to calculate and display a higher threat level for larger targets. Furthermore, the radar unit may transmit live video data to the mobile electronic device, which is used by the processor of the mobile electronic device to display real-time video of targets behind the bicycle, particularly when targets pose an imminent threat to the cyclist and/or when the targets can no longer be detected via analysis of the radar sensor signals.
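Tying the trigger condition above to the flow of FIG. 5, a simplified monitoring pass might look like the sketch below. The specific trigger shown (a target pacing the bicycle within a small speed difference, or a threat level at or above a threshold) is one example condition, and every device object and method name here is hypothetical.

```python
def trigger_satisfied(target, bike_speed_mps: float,
                      threat_threshold: int = 2) -> bool:
    """Example trigger: the target travels at approximately the same
    velocity as the bicycle, or its threat level crosses a threshold."""
    pacing = abs(target.speed_mps - bike_speed_mps) < 0.5  # speed over ground
    return pacing or (target.threat_level or 0) >= threat_threshold


def monitoring_pass(radar, camera, transceiver, bike_speed_mps: float) -> None:
    """One pass of the FIG. 5 flow (502-514) on hypothetical devices."""
    reflections = radar.receive_reflections()             # 502
    targets = radar.analyze(reflections)                  # 504: velocity/location
    transceiver.send(targets)                             # 506
    if any(trigger_satisfied(t, bike_speed_mps) for t in targets):  # 508
        camera.start_capture()                            # 510
        video_targets = camera.analyze_frames()           # 512
        transceiver.send(video_targets)                   # 514
```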
FIG. 1 is a block diagram illustrating an example mobile electronic device environment 100 including a mobile electronic device that can implement a radar sensor system in accordance with embodiments of the technology. The environment 100 includes a mobile electronic device 102 (e.g., a bicycle computing device such as the GARMIN™ EDGE™ bicycle computer, GARMIN™ VARIA VISION™ head-mounted in-sight display, GARMIN™ VIRB™ action camera, smart phone, smart watch, etc.) operable to provide navigation functionality to the user of the mobile electronic device 102. The mobile electronic device 102 may be configured in a variety of ways. For example, a mobile electronic device 102 may be configured for use during fitness and/or sporting activities, such as recreational and competitive bike riding. However, the mobile electronic device 102 can also comprise a sport watch, a golf computer, a smart phone providing fitness or sporting applications (apps), a hand-held GPS device, and so forth. It is contemplated that the techniques may be implemented in any mobile electronic device that includes navigation functionality. Thus, the mobile electronic device 102 may also be configured as a portable navigation device (PND), a mobile phone, a hand-held portable computer, a tablet, a personal digital assistant, a multimedia device, a media player, a gaming device, combinations thereof, and so forth. In the following description, a referenced component, such as mobile electronic device 102, may refer to one or more devices, and therefore by convention reference may be made to a single device (e.g., the mobile electronic device 102) or multiple devices (e.g., the mobile electronic devices 102, the plurality of mobile electronic devices 102, and so on) using the same reference number.

In FIG. 1, the mobile electronic device 102 is illustrated as including a processor 104 and a memory 106. The processor 104 may perform the functions described herein independent of the processors included in the radar unit or in conjunction with one or more processors included in the radar unit, using wired or wireless communication to communicate information between the processors of the radar sensor system. The processor 104 provides processing functionality for the mobile electronic device 102 and may include any number of processors, micro-controllers, or other processors, and resident or external memory for storing data and other information accessed or generated by the mobile electronic device 102. The processor 104 and the one or more processors included in the radar unit may execute one or more software programs or computer-readable instructions that implement the operations described herein. The processor 104 and the one or more processors included in the radar unit are not limited by the materials from which they are formed or the processing mechanisms employed therein and, as such, may be implemented via semiconductor(s) and/or transistors (e.g., electronic integrated circuits (ICs)), and so forth.

The memory 106 is an example of device-readable storage media that provides storage functionality to store various data associated with the operation of the mobile electronic device 102, such as the software program and code segments mentioned above, or other data to instruct the processor 104 and other elements of the mobile electronic device 102 to perform the techniques described herein. Although a single memory 106 is shown, a wide variety of types and combinations of memory may be employed. The memory 106 may be integral with the processor 104, stand-alone memory, or a combination of both. The memory 106 may include, for example, removable and non-removable memory elements such as random access memory (RAM), read-only memory (ROM), Flash (e.g., secure digital (SD) card, mini-SD card, micro-SD card), solid-state disk (SSD), magnetic, optical, universal serial bus (USB) memory devices, and so forth.

The mobile electronic device 102 is further illustrated as including functionality to determine position. For example, the mobile electronic device 102 may receive signal data 108 transmitted by one or more position data platforms and/or position data transmitters, examples of which are depicted as Global Positioning System (GPS) satellites 110. More particularly, the mobile electronic device 102 may include a position-determining component 112 that may manage and process signal data 108 received from GPS satellites 110 via a GPS receiver 114. The position-determining component 112 is representative of functionality operable to determine a geographic position through processing of the received signal data 108. The signal data 108 may include various data suitable for use in position determination, such as timing signals, ranging signals, ephemerides, almanacs, and so forth.

Position-determining component 112 may also be configured to provide a variety of other position-determining functionality. Position-determining functionality, for purposes of discussion herein, may relate to a variety of different navigation techniques and other techniques that may be supported by "knowing" one or more positions. For instance, position-determining functionality may be employed to provide position/location information, timing information, speed information, and a variety of other navigation-related data. Accordingly, the position-determining component 112 may be configured in a variety of ways to perform a wide variety of functions. For example, the position-determining component 112 may be configured for bicycle navigation (e.g., implemented within a bicycle computer); however, the position-determining component 112 may also be configured for other vehicle navigation or tracking.

The position-determining component 112, for example, can use signal data 108 received via the GPS receiver 114 in combination with map data 116 that is stored in the memory 106 to generate navigation instructions (e.g., turn-by-turn instructions to an input destination or POI), show a current position on a map, and so on. Position-determining component 112 may include one or more antennas to receive signal data 108 as well as to perform other communications, such as communication via one or more networks 118 described in more detail below. The position-determining component 112 may also provide other position-determining functionality, such as to determine an average speed, calculate an arrival time, and so on.

Although a GPS system is described and illustrated in relation to FIG. 1, it should be apparent that a wide variety of other positioning systems may also be employed, such as other global navigation satellite systems (GNSS), terrestrial-based systems (e.g., wireless-phone based systems that broadcast position data from cellular towers), wireless networks that transmit positioning signals, and so on. For example, position-determining functionality may be implemented through the use of a server in a server-based architecture, from a ground-based infrastructure, through one or more sensors (e.g., gyros, odometers, and magnetometers), use of "dead reckoning" techniques, and so on.
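For the ancillary metrics mentioned above (average speed and arrival time), a trivial sketch; the function names are invented.

```python
def average_speed_mps(distance_m: float, elapsed_s: float) -> float:
    """Average speed over a ride segment."""
    return distance_m / elapsed_s if elapsed_s > 0 else 0.0


def arrival_time_s(remaining_m: float, avg_speed_mps: float) -> float:
    """Estimated seconds to the destination at the current average speed."""
    return float("inf") if avg_speed_mps <= 0 else remaining_m / avg_speed_mps
```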
US 10 ,393 , 872 B2
13 14
architecture, from a ground-based infrastructure, through one or more sensors (e.g., gyros, odometers, and magnetometers), use of "dead reckoning" techniques, and so on.

The mobile electronic device 102 may include a display device 120 to display information to a user of the mobile electronic device 102. In embodiments, the display device 120 may comprise an LCD (Liquid Crystal Diode) display, a TFT (Thin Film Transistor) LCD display, an LEP (Light Emitting Polymer) or PLED (Polymer Light Emitting Diode) display, and so forth, configured to display text and/or graphical information such as a graphical user interface. The display device 120 may be backlit via a backlight such that it may be viewed in the dark or other low-light environments.

The display device 120 may be provided with a touch screen 122 to receive input (e.g., data, commands, etc.) from a user. For example, a user may operate the mobile electronic device 102 by touching the touch screen 122 and/or by performing gestures on the touch screen 122. In some embodiments, the touch screen 122 may be a capacitive touch screen, a resistive touch screen, an infrared touch screen, combinations thereof, and the like. The mobile electronic device 102 may further include one or more input/output (I/O) devices 124 (e.g., a keypad, buttons, a wireless input device, a thumbwheel input device, a trackstick input device, and so on). The I/O devices 124 may include one or more audio I/O devices, such as a microphone, speakers, and so on.

The mobile electronic device 102 may also include a communication component 126 representative of communication functionality to permit mobile electronic device 102 to send/receive data between different devices (e.g., components/peripherals) and/or over the one or more networks 118. Communication component 126 may be a transceiver coupled with the processor 104. Communication component 126 may be representative of a variety of communication components and functionality including, but not limited to: one or more antennas; a browser; a transmitter and/or receiver; a transceiver; a wireless radio; data ports; software interfaces and drivers; networking interfaces; data processing components; and so forth.

The one or more networks 118 are representative of a variety of different communication pathways and network connections which may be employed, individually or in combinations, to communicate among the components of the environment 100. In embodiments, networks 118 may include wireless communication between communication component 126 (transceiver) and a transceiver within the radar unit. Thus, the one or more networks 118 may be representative of communication pathways achieved using a single network or multiple networks. Further, the one or more networks 118 are representative of a variety of different types of networks and connections that are contemplated, including, but not limited to: the Internet; an intranet; a satellite network; a cellular network; a mobile data network; wired and/or wireless connections; and so forth.

Examples of wireless networks include, but are not limited to, networks configured for communications according to: one or more standards of the Institute of Electrical and Electronics Engineers (IEEE), such as 802.11 or 802.16 (Wi-Max) standards; Wi-Fi standards promulgated by the Wi-Fi Alliance; ZigBee standards promulgated by the ZigBee Alliance; Bluetooth standards promulgated by the Bluetooth Special Interest Group; ANT or ANT+ standards promulgated by Dynastream Innovations, Inc.; and so on.

Wired communications are also contemplated, such as through universal serial bus (USB), Ethernet, serial connections, and so forth.

The mobile electronic device 102, through functionality represented by the communication component 126, may be configured to communicate via one or more networks 118 with a cellular provider 128 and an Internet provider 130 to receive mobile phone service 132 and various content 134, respectively. Content 134 may represent a variety of different content, examples of which include, but are not limited to: information relating to high-risk geographic areas (e.g., intersections, streets, etc.); map data, which may include route information; web pages; services; music; photographs; video; email service; instant messaging; device drivers; real-time and/or historical weather data; instruction updates; and so forth.

The mobile electronic device 102 is illustrated as including a user interface 136, which is storable in memory 106 and executable by the processor 104. The user interface 136 is representative of functionality to control the display of information and data to the user of the mobile electronic device 102 via the display device 120. In some implementations, the display device 120 may not be integrated into the mobile electronic device and may instead be connected externally using universal serial bus (USB), Ethernet, serial connections, and so forth. The user interface 136 may provide functionality to allow the user to interact with one or more applications 138 of the mobile electronic device 102 by providing inputs via the touch screen 122 and/or the I/O devices 124. For example, the user interface 136 may cause an application programming interface (API) to be generated to expose functionality to an application 138 to configure the application for display by the display device 120 or in combination with another display. In embodiments, the API may further expose functionality to configure the application 138 to allow the user to interact with an application 138 by providing inputs via the touch screen 122 and/or the I/O devices 124.

Applications 138 may comprise software, which is storable in memory 106 and executable by the processor 104, to perform a specific operation or group of operations to furnish functionality to the mobile electronic device 102. Example applications 138 may include bike riding applications, navigation/guidance applications, fitness applications, exercise applications, health applications, diet applications, cellular telephone applications, instant messaging applications, email applications, photograph sharing applications, calendar applications, address book applications, and so forth.

In implementations, the user interface 136 may include a browser 140. The browser 140 may enable the mobile electronic device 102 to display and interact with content 134 such as a webpage within the World Wide Web, a webpage provided by a web server in a private network, and so forth. The browser 140 may be configured in a variety of ways. For example, the browser 140 may be configured as an application 138 accessed by the user interface 136. The browser 140 may be a web browser suitable for use by a full-resource device with substantial memory and processor resources (e.g., a smart phone, a personal digital assistant (PDA), etc.). However, in one or more implementations, the browser 140 may be a mobile browser suitable for use by a low-resource device with limited memory and/or processing resources (e.g., a mobile telephone, a portable music device, a transportable entertainment device, etc.).
Such mobile browsers typically conserve memory and processor resources, but may offer fewer browser functions than web browsers.

The mobile electronic device 102 is illustrated as including a navigation interface 142, which may be implemented by program instructions stored in memory 106 and executable by the processor 104. The navigation interface 142 represents functionality to access map data 116 that is stored in the memory 106 to provide mapping and navigation functionality to aid the user of the mobile electronic device 102 with traveling from a starting location to a destination. For example, the navigation interface 142 may generate navigation information 144 that includes maps and/or map-related content for display by display device 120. As used herein, map-related content includes information associated with maps generated by the navigation interface 142 and may include route information, POIs, information associated with POIs, map legends, controls for manipulation of a map (e.g., scroll, pan, etc.), street views, aerial/satellite views, and the like, displayed on or as a supplement to one or more maps. Map-related content may be retrieved from map data 116, content 134, other third-party sources, or any combination thereof.

In one or more implementations, the navigation interface 142 may be configured to utilize the map data 116 to generate navigation information 144 that includes maps and/or map-related content for display by the mobile electronic device 102 independently of content sources external to the mobile electronic device 102. Thus, for example, the navigation interface 142 may be capable of providing mapping and navigation functionality when access to external content 134 is not available through network 118. It is contemplated, however, that the navigation interface 142 may also be capable of accessing a variety of content 134 via the network 118 to generate navigation information 144 including maps and/or map-related content for display by the mobile electronic device 102 in one or more implementations.

The navigation interface 142 may be configured in a variety of ways. For example, the navigation interface 142 may be configured as an application 138 accessed by the user interface 136. The navigation interface 142 may utilize position data determined by the position-determining component 112 to show a current position of the user (e.g., the mobile electronic device 102) on a displayed map, furnish navigation instructions (e.g., turn-by-turn instructions to an input destination or POI), calculate traveling distance/time information 168 (e.g., distance 148 and time 162 shown in FIG. 1), and so on.

As illustrated in FIG. 1, the navigation interface 142 further includes a route selection interface 146, which is also storable in memory 106 and executable by the processor 104. The route selection interface 146 causes the display device 120 of the mobile electronic device 102 to be configured to display route selection information. In the implementation shown, the route selection information is illustrated in the format of a map page 150 that includes a route graphic 152 representing a route that may be traversed by a cyclist using the mobile electronic device 102 (e.g., by a bicycle in or on which the mobile electronic device 102 is mounted or carried). The route selection interface 146 can also provide various metrics 154 such as topography information 156, a difficulty rating 158 associated with traversing a geographic area, elevation data 164, and so forth.

The mobile electronic device 102 is further illustrated as including functionality to provide audible and tactile (vibration-based) feedback to a user. In embodiments, the mobile electronic device 102 includes a speaker 178 and a haptic feedback element 180. Speaker 178 may be any sound-producing element (e.g., speaker, headset, mono or stereo headphones, etc.). Haptic feedback element 180 may be a vibration-producing component such as a motor coupled to an eccentric load.

The mobile electronic device 102 may include the speaker 178 and haptic feedback element 180 in addition to or in lieu of display device 120. For instance, in embodiments where mobile electronic device 102 may not be mounted or worn in a position in which its display device 120 may be seen by a cyclist while riding a bicycle, speaker 178 may provide audible communication of situational awareness information determined by processor 104 to the cyclist. Similarly, haptic feedback element 180 may provide tactile communication of situational awareness information determined by processor 104 to the cyclist.

FIGS. 2A-2B illustrate an example radar sensor system environment 200 from two different perspectives. As shown in FIG. 2A, radar sensor system environment 200 includes a bicycle 202, to which a mobile electronic device 206 and radar unit 208 are mounted, and a target 204, which is a vehicle in this example. In an embodiment, mobile electronic device 206 may be an implementation of mobile electronic device 102, as shown in FIG. 1 and discussed above. Furthermore, in an embodiment, radar unit 208 may be an implementation of radar unit 308, as shown in FIG. 3 and discussed further below.

Although a bicycle is shown in FIGS. 2A and 2B as an example, embodiments also include bicycle computing device 206 and radar unit 208 being mounted or affixed to any suitable type of human-powered or motor-driven vehicle instead of bicycle 202. For example, mobile electronic device 206 and radar unit 208 may be mounted to a unicycle, a tricycle, a scooter, a motorcycle, a car, a forklift, etc. Furthermore, although target 204 is shown in FIGS. 2A-2B as a vehicle, embodiments include mobile electronic device 206 and radar unit 208 detecting any suitable number and/or type of targets that may pose a potential threat to the cyclist riding bicycle 202 (or alternative vehicle, as the case may be). For example, target 204 may include one or more pedestrians, other cyclists, trucks, debris, etc.

Furthermore, mobile electronic device 206 and radar unit 208 are shown in FIG. 2A as being separate components. However, in some embodiments, mobile electronic device 206 and radar unit 208 may be integrated as a single component. In such a case, each of mobile electronic device 206 and radar unit 208 may be suitably mounted such that target data may be appropriately collected and information such as situational awareness indicators may be conveyed to the cyclist.

In embodiments, the mobile electronic device 206 and radar unit 208 are operable to implement the features described in accordance with radar sensor system environment 200. For example, the mobile electronic device 206 may include or be configured to wirelessly communicate with radar unit 208 (or multiple radar units). For example, radar unit 208 may include a radar sensor having a sensor field in an area proximate to the bicycle and a camera facing a field of view in the area proximate to the bicycle. The radar unit 208 may be mounted to a front, rear, or side portion of the bicycle 202 such that the sensor field and/or the camera's field of view may be directed in front of the bicycle, behind the bicycle, to the right side of the bicycle, to the left side of the bicycle, or any combination thereof. In embodiments, the radar unit 208, the radar sensors, the mobile electronic device 206, or portions of each of these devices may be built
into another device. The radar unit 208 may also be a standalone device having a transceiver enabling wireless communications with the mobile electronic device 206.

The mobile electronic device 206 and radar unit 208 together form a radar sensor system. This radar sensor system is operable to detect objects, vehicles, people, animals, and other targets in proximity to the bicycle 202 located within sensor fields and/or the camera's field of view to assess and/or present situational awareness indicators or recommendations to the cyclist based on target data corresponding to the objects, vehicles, people, animals, and other targets. For example, as illustrated in FIG. 2A, the radar unit 208 may be configured to identify and detect one or more targets that enter a sensor field and/or field of view behind the bicycle. For instance, upon approaching bicycle 202 from behind, target 204 may be detected by radar unit 208 based on the returns (reflections) of transmitted radar signals in a sensor field behind the bicycle 202, or based on image data or video data for a field of view captured by the camera included in the radar unit. The camera's field of view at least partially overlaps with the sensor field of the radar sensor. For instance, the camera's field of view may be associated with an area having a size (width, height, and depth) that is approximately equal to the area of the sensor field. Target data may be generated by the processor of the radar unit 208 based on the detected target(s) 204.

The mobile electronic device 206 may be configured to wirelessly receive the target data from a transceiver within the radar unit 208, to determine a location of target 204, and to notify the cyclist of the target 204 by presenting one or more situational awareness indicators on a display (e.g., display device 120, as shown in FIG. 1). The target data may include, for example, information relating to the velocity, range, recommended awareness level, azimuth angle, threat level, or any other information corresponding to the target determined to be present in a sensor field and/or field of view proximate to the bicycle 202. The velocity or position of the detected target 204 may be used by mobile electronic device 206 to determine an appropriate situational awareness level and/or recommended course of action. Processor 104 of mobile electronic device 206 may then present the situational awareness level and/or recommended course of action to the cyclist. Further details and examples of how this information may be presented are discussed below with reference to FIGS. 4A-4C.

As shown in FIG. 2A, the mobile electronic device 206 may be implemented as any suitable type of device configured to communicate with radar unit 208, to receive target data and/or live video data from radar unit 208, and to send data to radar unit 208 to control various functions of radar unit 208. For example, the mobile electronic device 206 may be mounted on the handlebars of bicycle 202, as shown in FIG. 2A. Thus, in an embodiment, mobile electronic device 206 may be implemented, for example, as a bicycle computing device or bicycle accessory (e.g., Garmin™ EDGE and VARIA devices) that displays information to the cyclist such as navigational data, directions, routes, traffic, advanced performance metrics, VO2 max, cycling dynamics, etc., in addition to the information that is determined using target data received from radar unit 208. Alternatively, in embodiments, the mobile electronic device may be worn on a user's head (e.g., Garmin™ VARIA VISION™ head-mounted in-sight display). In some embodiments, the mobile electronic device 206 includes a communication component 126 that is physically connected (e.g., wired) to a communication interface of the radar unit 208 (or multiple radar units 208). In embodiments, the radar sensor may be enclosed entirely or partially within the mobile electronic device 206 as a separate device or integrated with the radar unit 208.

In an embodiment, radar unit 208 may be mounted or otherwise affixed to bicycle 202 and directed behind bicycle 202. As shown in FIG. 2A, radar unit 208 may transmit radar signals (e.g., radio-frequency signals of a particular frequency or band of frequencies) in the sensor field, receive a reflection of the transmitted radar signals reflected from various targets located in the sensor field (e.g., target 204), and output a radar sensor signal corresponding to the received reflection. Continuing this example, a processor within radar unit 208 may process the radar sensor signal to generate target data indicative of the velocity and range of target 204 relative to bicycle 202. The radar unit 208 may transmit the target data to the mobile electronic device 206, which may present this information to the cyclist on a display, audibly, or using haptic feedback. Additional details regarding radar unit 208 are further discussed below.

FIG. 2B shows an alternative perspective of bicycle 202 and target 204 on a road, with target 204 in the same lane as bicycle 202. For clarity, bicycle 202 is shown in FIG. 2B without the mobile electronic device 206 or radar unit 208, which remain mounted to bicycle 202. As shown in FIG. 2B, target 204 initially follows bicycle 202 in the same lane; target 204 approaches bicycle 202, as indicated by the dashed line, and then reduces its speed to approximately the same speed as bicycle 202 to avoid a collision. In an embodiment, radar unit 208 continuously or periodically operates its radar sensor to identify the presence of one or more targets behind the user. As a result, radar unit 208 may initially generate target data indicating the range and velocity of target 204 relative to bicycle 202. Once target 204 begins travelling at approximately the same velocity as bicycle 202, the processor of radar unit 208 may activate the camera to capture video data (one or more images) of target 204 because a threat level posed by target 204 may exceed a predetermined threshold level.

Embodiments enable a user to determine whether the previously identified target 204 passed bicycle 202 or turned onto another road (or is otherwise not present) or whether the previously identified target 204 is now traveling directly behind the user. Therefore, embodiments include a processor of mobile electronic device 206 and/or radar unit 208 determining, from the initial target data (i.e., the target data calculated using the radar sensor signals), that a previously identified target 204 is no longer being detected, and then beginning to analyze available video and/or image data captured via a camera included in radar unit 208 to determine the relative location and velocity of target 204. If the processor of mobile electronic device 206 determines that target 204 is no longer traveling in the camera's field of view behind bicycle 202, information corresponding to target 204 may be removed from the display of mobile electronic device 206. If the processor of mobile electronic device 206 determines that target 204 is still traveling in the camera's field of view behind bicycle 202, the display of mobile electronic device 206 may present to the cyclist a determined range of target 204, a direction of approach of the target 204, a determined awareness level of target 204, a threat level associated with target 204, a current lane occupied by target 204, and other information relating to target 204.
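The target data fields enumerated above (range, relative velocity, azimuth angle, awareness level, threat level) map naturally onto a small record type. The following Python sketch is purely illustrative and is not part of the disclosed embodiments; the field names, the awareness-level rule, and the 30 m / 2 m/s cutoffs are all assumptions introduced here for clarity.

    from dataclasses import dataclass
    from enum import Enum

    class AwarenessLevel(Enum):
        LOW = 1
        MEDIUM = 2
        HIGH = 3

    @dataclass
    class TargetData:
        """One tracked target, as reported by the radar unit to the head unit."""
        target_id: int
        range_m: float       # distance behind the bicycle, in meters
        velocity_mps: float  # relative (closing) speed, m/s; positive = approaching
        azimuth_deg: float   # bearing relative to straight behind the bicycle
        source: str          # "radar" or "camera", whichever produced the estimate

    def awareness_level(t: TargetData) -> AwarenessLevel:
        # Illustrative rule only: nearby, fast-closing targets rank highest.
        if t.range_m < 30.0 and t.velocity_mps > 2.0:
            return AwarenessLevel.HIGH
        if t.velocity_mps > 0.0:
            return AwarenessLevel.MEDIUM
        return AwarenessLevel.LOW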
In some embodiments, the manner in which information relating to target 204 determined to be traveling behind bicycle 202 is presented to the cyclist remains the same as when the relative location and velocity of target 204 is determined via the radar sensor signals. In other embodiments, different types of information, such as the live video and/or other information, may be communicated to mobile electronic device 206 and presented upon the target data indicating that target 204 is still traveling in the camera's field of view behind bicycle 202, or upon the initial target data no longer indicating the relative location and velocity of a previously identified target 204.

Furthermore, in various embodiments, the processor in radar unit 208 may perform particular functions associated with the analysis of the video and/or image data provided by the camera in radar unit 208 periodically, continuously, or upon receipt of a suitable command received from mobile electronic device 206. For example, to conserve battery power, radar unit 208 may by default analyze radar sensor signals to generate target data identifying the radar sensor as the data source used to calculate the conveyed information, such as relative target position and velocity. Once the target data indicates that a target has "disappeared," mobile electronic device 206 may transmit one or more commands to radar unit 208 to activate the camera to begin capturing live video data and/or image data that may be analyzed by the processor in the radar unit 208 to calculate new target data that is transmitted to mobile electronic device 206. Of course, radar unit 208 may determine if and when to perform these functions independently (without receiving commands from the mobile electronic device 206). Further details associated with such embodiments are discussed below.

Embodiments include radar unit 208 determining information, in addition to the relative location and velocity of one or more targets, from the analysis of captured video and/or image data. For example, radar unit 208 may determine a size of target 204 by analyzing captured image and/or video data and including this information in the transmitted target data, allowing mobile electronic device 206 to present a threat level proportional to this calculated size and/or the proximity of the target 204. To provide another example with reference to FIG. 2B, the processor of radar unit 208 may analyze one or more frames of the captured video to correlate target 204 to its appropriate road lane and include this information as part of the transmitted target data (or as a separate data transmission), allowing mobile electronic device 206 to display this information. The details of such operations are further discussed below.

FIG. 3 is a block diagram example of a radar sensor system 300, according to an embodiment. In an embodiment, radar sensor system 300 includes a mobile electronic device 306 and a radar unit 308. In an embodiment, mobile electronic device 306 may be an implementation of mobile electronic device 102 or mobile electronic device 206, as shown in FIGS. 1 and 2, respectively, and discussed above. Furthermore, in an embodiment, radar unit 308 may be an implementation of radar unit 208, as shown in FIG. 2 and discussed above. Again, although mobile electronic device 306 and radar unit 308 are illustrated as two separate components in FIG. 3, embodiments include mobile electronic device 306 and radar unit 308 being integrated as a single component that may be mounted in any suitable location to facilitate the functionality of both mobile electronic device 306 and radar unit 308. Regardless of whether mobile electronic device 306 and radar unit 308 are implemented as separate devices or integrated into a single device, the various components shown in FIG. 3 may be interconnected (e.g., within a single device or within each respective device) and/or coupled with one another to facilitate the various functionality described herein. Such couplings and interconnections are not shown in FIG. 3, however, for purposes of brevity.

In embodiments in which mobile electronic device 306 and radar unit 308 are implemented as separate devices, mobile electronic device 306 and radar unit 308 may be configured to communicate with one another via one or more wired and/or wireless links (e.g., link 301). This communication may include, for example, live video data and/or target data transmissions from radar unit 308 to mobile electronic device 306. To provide another example, this communication may include the transmission of one or more commands from mobile electronic device 306 to radar unit 308.

Again, to facilitate these communications, mobile electronic device 306 and radar unit 308 may be configured to support communications in accordance with any suitable number and/or type of wired and/or wireless communication protocols. Examples of suitable communication protocols may include personal area network (PAN) communication protocols (e.g., BLUETOOTH), ultra-low power communication protocols (e.g., ANT and ANT+), Wi-Fi communication protocols, radio frequency identification (RFID) and/or near field communication (NFC) protocols, cellular communication protocols, Internet communication protocols (e.g., Transmission Control Protocol (TCP) and Internet Protocol (IP)), etc.
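To make the command traffic concrete, the sketch below frames a hypothetical one-byte command set that a head unit might send to the radar unit over link 301. The message IDs, the framing (sync byte, sequence number, checksum), and the command names are invented for illustration and do not correspond to any published BLUETOOTH, ANT, or ANT+ profile.

    from enum import IntEnum

    class RadarCommand(IntEnum):
        # Hypothetical command identifiers for the radar unit.
        CAMERA_ON = 0x01
        CAMERA_OFF = 0x02
        START_CAPTURE = 0x03
        STOP_CAPTURE = 0x04
        FREEZE_BUFFER = 0x05  # stop overwriting the rolling video buffer

    def encode_command(cmd: RadarCommand, seq: int) -> bytes:
        """Frame a command as [sync, sequence, command, checksum]."""
        payload = bytes([0xA5, seq & 0xFF, int(cmd)])
        checksum = sum(payload) & 0xFF
        return payload + bytes([checksum])

    # Example: instruct the radar unit to power its camera on.
    frame = encode_command(RadarCommand.CAMERA_ON, seq=7)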
For example, link 301 may represent one or more wired communication links (e.g., a cable connection such as a universal serial bus (USB) connection, a wired Ethernet connection, etc.) and/or one or more wireless communication links (e.g., a BLUETOOTH connection, an ANT or ANT+ connection, a Wi-Fi connection, a cellular connection, etc.) between mobile electronic device 306 and radar unit 308.

Radar unit 308 may be implemented as any suitable type of computing device suitable for being mounted or otherwise affixed to a bicycle and configured to identify one or more targets proximate to a bicycle, to generate target data indicative of the position and velocity of such targets, to capture and/or analyze image and/or video data, and to transmit target data and/or image and/or video data in accordance with the embodiments described herein. In an embodiment, radar unit 308 may include a processor 352, a communication unit 354, a sensor array 356, a camera 358, a power unit 360, a taillight assembly 362, and a memory unit 364. Radar unit 308 may include additional elements such as, for example, interactive buttons, switches, and/or knobs, memory card slots, ports, memory controllers, interconnects, etc., which are not shown in FIG. 3 or further described herein for purposes of brevity.

Processor 352 may be implemented as any suitable type and/or number of processors, such as a host processor of radar unit 308, for example. To provide additional examples, processor 352 may be implemented as an application specific integrated circuit (ASIC), an embedded processor, a central processing unit associated with radar unit 308, etc. Processor 352 may be coupled with and/or otherwise configured to communicate, control, operate in conjunction with, and/or affect operation of one or more of communication unit 354, sensor array 356, camera 358, power unit 360, taillight assembly 362, and/or memory unit 364 via one or more wired and/or wireless interconnections, such as any suitable number of data and/or address buses, for example. These interconnections are not shown in FIG. 3 for purposes of brevity.

For example, processor 352 may be configured to retrieve, process, and/or analyze data stored in memory unit 364, to store data to memory unit 364, to replace data stored in memory unit 364, to analyze reflected radar transmissions and output a radar sensor signal corresponding to the received reflection, to generate target data, to capture video and/or image data, to receive commands transmitted from mobile electronic device 306, to control various functions of radar unit 308, etc. Additional details associated with such functions are further discussed below.

Communication unit 354 may be configured to support any suitable number and/or type of communication protocols to facilitate communications between mobile electronic device 306 and radar unit 308. Communication unit 354 may be configured to facilitate the exchange of any suitable type of information between radar unit 308 and mobile electronic device 306 (e.g., via link 301), and may be implemented with any suitable combination of hardware and/or software to facilitate such functionality. For example, communication unit 354 may be implemented with any number of wired and/or wireless transceivers, ports, connectors, antennas, etc. In an embodiment, communication unit 354 may function to enable radar unit 308 to wirelessly connect to mobile electronic device 306 and to provide bi-directional communications between mobile electronic device 306 and radar unit 308. The data transmitted from radar unit 308 may be referred to herein as "radar unit data," and may contain the aforementioned target data as well as other types of data described throughout this disclosure (in separate data transmissions or as part of the same data transmission).

Sensor array 356 may be implemented as any suitable number and/or type of sensors configured to measure, monitor, and/or quantify one or more environmental characteristics. These sensor measurements may result in the acquisition and/or generation of different types of sensor data, for example, which may be processed by processor 352 and/or transmitted to mobile electronic device 306 via communication unit 354 as part of the target data or as a separate data transmission. Such sensor data transmissions may include, for example, processed sensor data (e.g., data indicating the actual measured values) and/or the raw sensor data output from each particular sensor, which may be processed by mobile electronic device 306 to determine the actual measured values.

For example, sensor array 356 may include one or more radar sensors and/or transducers (which may utilize, e.g., radar, light detection and ranging (Lidar), and/or ultrasonic sensors). Sensor array 356 may include one or more radar sensors that are configured to transmit radar signals (e.g., RF signals) in various directions across a particular range of angles, to receive reflected radar signals at one or more individual radar sensors, and to output radar sensor signals using the reflected radar signals. These radar sensor signals may include, for example, analog signals that represent unprocessed measurements associated with each individual radar sensor's radar transmission and a time of return for its respective reflected radar signal. In some embodiments, the radar sensor signals may then be processed by processor 352 to determine the actual relative speed and location of one or more targets and included as part of a target data transmission.

Sensor array 356 may also include accelerometers, gyroscopes, perspiration detectors, compasses, speedometers, magnetometers, barometers, thermometers, proximity sensors, light sensors (e.g., light intensity detectors), photodetectors, photoresistors, photodiodes, Hall Effect sensors, electromagnetic radiation sensors (e.g., infrared and/or ultraviolet radiation sensors), ultrasonic and/or infrared range detectors, humistors, hygrometers, altimeters, biometrics sensors (e.g., heart rate monitors, blood pressure monitors, skin temperature monitors), microphones, etc. When sensor array 356 is implemented with one or more accelerometers, sensor array 356 may utilize such accelerometers to measure the acceleration of radar unit 308 in one or more directions and, as a result, measure the acceleration of the bicycle to which radar unit 308 is mounted. This data may be utilized locally by radar unit 308, for example, to operate taillight assembly 362, as further discussed below.

In other embodiments, the target data may include the radar sensor signals as unprocessed data, and the processor of mobile electronic device 306 may analyze the radar sensor signals to calculate the actual relative speed and location of one or more targets located in the sensor field. In other words, the target data may be processed by either mobile electronic device 306 or radar unit 308 based upon considerations such as design preferences and the battery and processor limitations of each device. In any event, the target data may indicate the velocity and location of various targets with respect to the velocity and location of radar unit 308. In this way, when radar unit 308 is mounted to a bicycle and directed to a region behind the bicycle, the target data indicates the location and velocity of various targets behind the bicycle with respect to the velocity and location of the bicycle.

Sensor array 356 may be configured to sample sensor measurements and/or to generate target data from radar signal reflections continuously or in accordance with any suitable recurring schedule, such as, for example, on the order of several milliseconds (e.g., 10 ms, 100 ms, etc.), once per second, once every 5 seconds, once every 10 seconds, once every 30 seconds, once per minute, etc. Sensor array 356 may also be controlled via one or more commands received from mobile electronic device 306, as further discussed below.

Camera 358 may be configured to capture image data and/or video data over one or more consecutive frames, including capturing live video data, of objects in the field of view of camera 358. In an embodiment, camera 358 may selectively capture image and/or video data in response to various commands received from mobile electronic device 306 and/or upon various trigger conditions being satisfied, as further discussed herein. In an embodiment, camera 358 may be housed within or otherwise integrated as part of radar unit 308, and strategically mounted within radar unit 308 such that, when radar unit 308 is mounted on a bicycle, camera 358 may capture image and/or video data of the road and/or other objects in the field of view behind the bicycle to which radar unit 308 is mounted.

Camera 358 may include any suitable combination of hardware and/or software such as image sensors, optical stabilizers, image buffers, frame buffers, charge-coupled devices (CCDs), complementary metal oxide semiconductor (CMOS) devices, etc., to facilitate this functionality. Camera 358 may store the image and/or video data to any suitable portion of memory unit 364, which may be stored in a "rolling buffer" format such that stored data is overwritten periodically, such as every 15 minutes, every hour, etc., unless a user intervenes (e.g., by powering down radar unit 308 or indicating that video recording should be stopped using any suitable interactive techniques such as a button, which is not shown in FIG. 3 for purposes of brevity). In this way, the image and/or video data may be stored in memory unit 364 such that, in the event that an accident or other noteworthy event occurs, the stored data may be saved or copied to another device as needed.
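The "rolling buffer" behavior described above can be sketched as a fixed-capacity queue of video segments in which the oldest segment is discarded until the buffer is frozen. This is a simplified illustration, not the disclosed implementation; the segment granularity and the 60-segment capacity are assumptions.

    from collections import deque

    class RollingVideoBuffer:
        """Fixed-capacity video store: oldest segments are overwritten until frozen."""

        def __init__(self, capacity_segments: int = 60):  # e.g., 60 x 15 s segments
            self._segments = deque(maxlen=capacity_segments)
            self._frozen = False

        def append(self, segment: bytes) -> None:
            # Once frozen (after an accident or a user request), stop overwriting.
            if not self._frozen:
                self._segments.append(segment)

        def freeze(self) -> None:
            """Preserve current contents so they can be copied to another device."""
            self._frozen = True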
The camera's field of view at least partially overlaps with the sensor field of the radar sensor. For instance, the camera's field of view may be associated with an area having a size (width, height, and depth) that is approximately equal to the area of the sensor field. Additionally or alternatively, camera 358 may be utilized to determine whether other components of radar unit 308 are configured properly. For example, sensor array 356 may include one or more radar sensors, which need to be mounted in such a manner that they are not obstructed in order to operate correctly. Because camera 358 may be mounted in close proximity to sensor array 356, an obstruction to the field of view detected by camera 358 would likely result in a similar obstruction to sensor array 356. In an embodiment, processor 352 may be configured to detect whether camera 358 has a clear field of view, for example, as part of an initial startup, initialization, or calibration procedure, and communicate this information to mobile electronic device 306 so this may be conveyed to a user. This detection may include, for example, momentarily transmitting live video data to the mobile electronic device 306 and allowing a user to view the live video data, check for obstructions, or otherwise verify that radar unit 308 has been properly aligned and mounted to the rear of the bicycle. This may also include, for example, processor 352 analyzing the live video and determining whether one or more obstructions exist in the camera's field of view using any suitable image processing techniques (e.g., determining whether no objects are within a threshold distance of the camera, determining that no shadows or other dark objects otherwise conceal a portion of the field of view, etc.). In the event that an obstruction is detected, mobile electronic device 306 (or radar unit 308) may sound an alarm or provide other suitable feedback to the user to verify that the alignment and mounting configuration of radar unit 308 is correct.
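One simple realization of the dark-obstruction check mentioned above is to flag frames in which a large fraction of pixels remains near-black. The sketch below is illustrative only; the grayscale-frame input and the intensity and area thresholds are assumptions rather than the disclosed technique.

    def field_of_view_obstructed(gray_frame, dark_threshold=16,
                                 max_dark_fraction=0.20) -> bool:
        """Return True if enough of the frame is dark to suggest an obstruction.

        gray_frame: two-dimensional sequence of 8-bit grayscale intensities.
        """
        total = 0
        dark = 0
        for row in gray_frame:
            for pixel in row:
                total += 1
                if pixel < dark_threshold:
                    dark += 1
        return total > 0 and (dark / total) > max_dark_fraction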
Power unit 360 may be configured to act as a power source for radar unit 308. Power unit 360 may be implemented as any suitable type of power source that facilitates power delivery to one or more portions of radar unit 308 to provide functionality for various components of radar unit 308. Examples of implementations of power unit 360 may include any suitable type of rechargeable battery, an array of rechargeable batteries, fuel cells, etc.

Taillight assembly 362 may be configured with any suitable number and/or type of illuminating components, such as lightbulbs, light-emitting diodes (LEDs), etc., which may be arranged in a particular manner and/or have varying intensities. In an embodiment, processor 352 may control the manner in which taillight assembly 362 illuminates the various illuminating components based upon changes in acceleration of the bicycle as detected from sensor data generated by one or more accelerometers that are implemented as part of sensor array 356. For example, taillight assembly 362 may include several illuminating components positioned in a horizontal line. When deceleration exceeding a threshold value is detected, processor 352 may cause taillight assembly 362 to illuminate more of the illuminating components, to cause the illuminating components to increase in brightness, to flash, etc. In this way, taillight assembly 362 may function similarly to a vehicle's brake lights, illuminating as the bicycle slows down and turning off otherwise.
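The brake-light behavior just described reduces to mapping a measured deceleration onto the number of horizontally arranged elements to illuminate. In this sketch the 0.5 and 4.0 m/s² thresholds and the five-element assembly are invented values for illustration.

    def taillight_elements_lit(decel_mps2: float, num_elements: int = 5) -> int:
        """Map deceleration (m/s^2, positive = slowing) to lit taillight elements."""
        if decel_mps2 <= 0.5:   # coasting or accelerating: brake indication off
            return 0
        if decel_mps2 >= 4.0:   # hard braking: illuminate every element
            return num_elements
        # Scale linearly between the two thresholds.
        fraction = (decel_mps2 - 0.5) / (4.0 - 0.5)
        return max(1, round(fraction * num_elements))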
In accordance with various embodiments, memory unit 364 may be a computer-readable non-transitory storage device that may include any suitable combination of volatile memory (e.g., a random access memory (RAM)) and/or non-volatile memory (e.g., battery-backed RAM, FLASH, etc.). Memory unit 364 may be configured to store instructions executable on processor 352. These instructions may include machine-readable instructions that, when executed by processor 352, cause processor 352 to perform various acts as described herein. Memory unit 364 may also be configured to store any other suitable data used in conjunction with radar unit 308, such as target data, sensor data, live video data, etc.

Camera control module 365 is a region of memory unit 364 configured to store instructions that, when executed by processor 352, cause processor 352 to perform various acts in accordance with applicable embodiments as described herein. In an embodiment, camera control module 365 includes instructions that, when executed by processor 352, cause processor 352 to control the state of camera 358 and/or when image and/or video data is captured, stored, and/or transmitted.

In various embodiments, processor 352 may execute instructions stored in camera control module 365 to interpret commands received from mobile electronic device 306 via link 301 and/or commands received locally, for example, in the form of user input (e.g., via appropriate interaction with radar unit 308, the details of which are not shown for purposes of brevity). For example, upon receiving one or more commands from the mobile electronic device 306, processor 352 may execute instructions stored in camera control module 365 to determine the appropriate function and to cause camera 358 to perform that function. For example, if the mobile electronic device 306 transmits a command to change the powered state of camera 358, then processor 352 may execute instructions stored in camera control module 365 to cause camera 358 to turn on or turn off in accordance with the particular command. To provide another example, processor 352 may execute instructions stored in camera control module 365 to interpret commands such as when to begin capturing image and/or video data, when to store image and/or video data in memory unit 364, when to stop the rolling buffer of image and/or video data stored in memory unit 364 and not overwrite the stored data, etc.

Sensor processing module 367 is a region of memory unit 364 configured to store instructions that, when executed by processor 352, cause processor 352 to perform various acts in accordance with applicable embodiments as described herein. In an embodiment, sensor processing module 367 includes instructions that, when executed by processor 352, cause processor 352 to analyze radar sensor signals output from one or more radar sensors included as part of sensor array 356, to determine relevant information from this analysis, and to generate target data including this determined information. For example, processor 352 may execute instructions stored in sensor processing module 367 to analyze the radar sensor signals to identify the location and/or speed of various targets located in the sensor field. This may include, for example, converting radar sensor signals collected over a time period from analog to digital signals, analyzing the time of return associated with the radar sensor signals, and correlating each radar sensor signal to a particular radar sensor in sensor array 356 to determine a size, location, and velocity of one or more targets located in the sensor field. Sensor processing module 367 may then format this information as part of a target data transmission, which is then transmitted to mobile electronic device 306 via communication unit 354.
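The core of the analysis attributed to sensor processing module 367 — converting an echo's time of return into a range, and successive ranges into a relative velocity — can be illustrated as follows. The two-sample velocity estimate is a deliberate simplification of whatever filtering a real unit would apply.

    SPEED_OF_LIGHT = 299_792_458.0  # meters per second

    def range_from_time_of_return(t_return_s: float) -> float:
        """Radar range: the signal travels out and back, hence the factor of 2."""
        return SPEED_OF_LIGHT * t_return_s / 2.0

    def relative_velocity(range_prev_m: float, range_now_m: float,
                          dt_s: float) -> float:
        """Positive result means the target is closing on the bicycle."""
        return (range_prev_m - range_now_m) / dt_s

    # Example: round-trip times 0.1 s apart shrink from 0.334 us to 0.333 us.
    r1 = range_from_time_of_return(0.334e-6)  # ~50.1 m
    r2 = range_from_time_of_return(0.333e-6)  # ~49.9 m
    v = relative_velocity(r1, r2, 0.1)        # ~1.5 m/s, closing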
Video processing module 369 is a region of memory unit 364 configured to store instructions that, when executed by processor 352, cause processor 352 to perform various acts in accordance with applicable embodiments as described herein. In an embodiment, video processing module 369 includes instructions that, when executed by processor 352, cause processor 352 to analyze image and/or video data to determine whether one or more targets (or portions of targets) are contained in the image and/or video data captured by camera 358 (i.e., in the field of view of camera 358).

To perform video analysis, video processing module 369 may include any suitable number and/or type of video processing algorithms. For example, memory unit 364 may be configured to store various training data models. These training data models may include, for example, ranges of video data metrics that indicate when a particular target to be detected (or a portion of a target) is contained within video data. These video data metrics may include any metrics suitable for the classification of live video data images by comparing the video data metrics to the training data models. For example, the video data metrics may indicate brightness, groupings of pixels forming specific sizes, patterns, or shapes, pixel coloration, edges detected within the live video data, contrasting portions within the live video data, etc.

Based on the output from the executed classification algorithm on the live video data, a determination may be made based upon the characteristics utilized by that particular classification algorithm. Video processing module 369 may store any suitable type and/or number of classification algorithms to make this determination. For example, video processing module 369 may store instructions that, when executed by processor 352, cause processor 352 to execute a linear classifier algorithm, a support vector machine algorithm, a quadratic classifier algorithm, a kernel estimation algorithm, a boosting meta-algorithm, a decision tree algorithm, a neural network algorithm, a learning vector quantization algorithm, etc.
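A minimal stand-in for the metric-versus-training-model comparison described above: compute a few scalar metrics per frame and test them against stored ranges. The metrics and range values here are placeholders; a production system would instead apply one of the trained classifiers listed above.

    def frame_metrics(gray_frame):
        """Very coarse video data metrics: mean brightness and contrast span."""
        pixels = [p for row in gray_frame for p in row]
        mean_brightness = sum(pixels) / len(pixels)
        return {"brightness": mean_brightness,
                "contrast": max(pixels) - min(pixels)}

    # Hypothetical training data model: metric ranges observed when a
    # vehicle is present in the camera's field of view.
    VEHICLE_MODEL = {"brightness": (40, 200), "contrast": (60, 255)}

    def matches_model(metrics, model) -> bool:
        return all(lo <= metrics[name] <= hi
                   for name, (lo, hi) in model.items())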
Furthermore, embodiments include video processing module 369 including instructions that, when executed by processor 352, cause processor 352 to determine not only whether particular objects are located in the field of view (the captured image and/or video), but also the velocity and location of those objects with respect to radar unit 308. To do so, embodiments include processor 352 analyzing one or more frames of captured video to identify one or more reference objects associated with a particular fixed or known length located within the field of view of camera 358.

For example, using an edge detection algorithm or other suitable algorithm, processor 352 may identify line segments associated with dashed road lane lines. Federal guidelines establish that each dashed road lane line be 10 feet long, with the empty spaces in between measuring 30 feet. In an embodiment, video processing module 369 may include instructions that enable processor 352 to identify such dimensions within a video frame and to calculate a proportion between pixels and the actual measurement associated with such known fixed-length objects. This proportion, once known, may then be used to determine the dimensions associated with other objects (such as the targets) in the live video by applying the pixel-to-length ratio to an identified number of pixels occupied by other objects. The distance between radar unit 308 and various other targets may be calculated, for example, by identifying an object adjacent to the target having a fixed or known dimension, and applying the pixel-to-length ratio for the object to the nearby target in the field of view. Furthermore, once the dimensions of target objects are known, the velocity at which these targets are moving may be calculated, for example, using the frame capture rate associated with the captured video and the change in each target's position between each frame.
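The lane-line calibration above can be made concrete: a dashed lane segment of known 10-foot length spanning a measured number of pixels yields a feet-per-pixel ratio, which then converts a target's frame-to-frame pixel displacement into an apparent speed. The pixel counts and frame rate in this sketch are illustrative assumptions.

    LANE_DASH_FEET = 10.0  # guideline length of a dashed road lane line

    def feet_per_pixel(dash_length_px: float) -> float:
        return LANE_DASH_FEET / dash_length_px

    def target_speed_fps(displacement_px: float, ratio_ft_per_px: float,
                         frame_rate_hz: float) -> float:
        """Apparent speed from per-frame pixel displacement."""
        return displacement_px * ratio_ft_per_px * frame_rate_hz

    ratio = feet_per_pixel(dash_length_px=80.0)    # 0.125 ft per pixel
    speed = target_speed_fps(displacement_px=4.0,  # 4 px moved between frames
                             ratio_ft_per_px=ratio,
                             frame_rate_hz=30.0)   # -> 15 ft/s toward the camera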
In other words, the location and velocity of targets relative to the bicycle may be determined either from an analysis of the radar sensor signals (e.g., via execution of instructions stored in sensor processing module 367 by processor 352) or from an analysis of captured video data (e.g., via execution of instructions stored in video processing module 369 by processor 352). Embodiments include radar unit 308 tracking one or more targets, i.e., providing the position and velocity of one or more targets in the target data, to facilitate mobile electronic device 306 continuing to convey this information by switching between the two aforementioned analyses.

Therefore, embodiments include target tracking module 371 including instructions that, when executed by processor 352, cause processor 352 to control when each analysis is performed. Thus, target tracking module 371 is a region of memory unit 364 configured to store instructions that, when executed by processor 352, cause processor 352 to perform various acts in accordance with applicable embodiments as described herein. In an embodiment, target tracking module 371 includes instructions that, when executed by processor 352, cause processor 352 to control which source of data (i.e., radar sensor signals or video) is used to calculate the position and velocity of one or more targets included as part of the target data.

To do so, embodiments include processor 352 executing instructions stored in target tracking module 371 to determine if and when one or more trigger conditions have occurred. When a trigger condition occurs, radar unit 308 may activate camera 358, power up or power down camera 358, start or stop capturing, analyzing, and/or transmitting image and/or video data, etc. As further discussed below, processor 352 may interpret and execute various commands upon the occurrence of a trigger condition based upon the particular mode of operation of camera 358 and/or radar unit 308.

Capturing, storing, and/or transmitting video may be a particularly power-intensive operation, causing continuous operation of camera 358 to drain power unit 360. Therefore, embodiments include radar unit 308, via processor 352 executing instructions stored in target tracking module 371, causing radar unit 308 to capture video and/or images only when certain conditions are satisfied or in specific situations. The following conditions are explained with the assumption that the radar sensor signals are collected continuously or otherwise available at any time, and that the video is selectively captured, stored, transmitted, and/or analyzed. However, embodiments also encompass the opposite of this scenario. That is, embodiments may also include the video data being continuously captured and the radar sensors being selectively powered on, with the radar sensor signals being generated and/or analyzed based upon similar or identical conditions as described below. In this alternate scenario, the velocity and location of targets may be determined initially (i.e., included in the initial target data) from a video or image analysis instead of an analysis of the radar sensor signals.

Video Analysis Trigger Conditions

In various embodiments, processor 352 may execute instructions stored in target tracking module 371 to cause communication unit 354 to issue commands to camera 358 when certain trigger conditions are met, resulting in radar unit 308 activating or powering on camera 358, capturing video data, analyzing the video data, and/or transmitting the video data. In the event that video is continuously being captured, processor 352 may instead analyze the captured video upon such a condition being satisfied, as such commands are not necessary in that scenario. Examples of various trigger conditions are further discussed below.

For example, if an analysis of the radar sensor signals does not indicate the presence of any targets in the sensor
field for a predetermined threshold period of time (e.g., 30 seconds, 1 minute, etc.), then this event may serve as a video analysis trigger condition. In this way, embodiments include radar unit 308 periodically verifying, via an analysis of the video data, that no targets are located behind the bicycle.

To provide another example, embodiments include mobile electronic device 306 transmitting commands to radar unit 308 to turn on camera 358 and to analyze received live video in accordance with any suitable schedule. Alternatively, radar unit 308 may locally issue such commands independently of mobile electronic device 306. In this instance, the trigger condition may be, for example, the passage of a particular interval of time such as 15 seconds, 30 seconds, etc., such that video data is analyzed in accordance with a recurring schedule. In other words, radar unit 308 may periodically analyze captured video in addition to or as an alternative to the other trigger conditions described herein. In this way, periodic analysis of the captured video may provide additional information and feedback to a user in addition to the information obtained via an analysis of the radar sensor signals.

In an embodiment, processor 352 may analyze the radar sensor signals over a period of time as the data is received from sensor array 356. Therefore, the velocity of one or more targets as indicated by the radar sensor signals may be tracked over time as a result of processor 352 executing instructions stored in sensor processing module 367, as discussed above. This tracked velocity information may also be used as the basis of one or more trigger conditions. For example, processor 352 may execute instructions stored in target tracking module 371 to determine whether a target's deceleration profile matches (e.g., within a threshold tolerance) that of one or more predetermined deceleration profiles. In other words, upon detecting (from the radar sensor signals) that a particular target is slowing at a rate that exceeds a threshold deceleration, this may trigger processor 352 to switch how velocity and location tracking is performed for that target (or for all targets) by changing from an analysis based upon the radar sensor signals to an analysis based upon the video data, and including the results of one of these analyses as part of the target data.

To provide an additional example, instead of using the deceleration of one or more targets as a trigger condition, processor 352 may determine when one or more targets have a relative velocity that is approximately equal to that of the bicycle. To do so, processor 352 may determine when the relative instantaneous velocity of a particular target in the sensor field is less than a predetermined relative threshold velocity (e.g., 2 mph, 4 mph, etc.). If so, then this particular condition is considered satisfied, and processor 352 may switch how velocity and location tracking is performed for that target (or for all targets) by changing from an analysis based upon the radar sensor signals to an analysis based upon the video data.
As an additional example, the history or "trend" of a target's tracked velocity and/or location may also be used as the basis for one or more trigger conditions. That is, processor 352 may analyze radar sensor signals over a period of time to track the location and/or velocity of one or more targets in the sensor field. As discussed above, embodiments enable a user to determine whether the previously identified target 204 passed bicycle 202 or turned onto another road (or is otherwise not present) or whether the previously identified target 204 is now traveling directly behind the user. Using the history of tracked locations and/or velocities, processor 352 may determine which of these conditions applies (e.g., by activating camera 358 and analyzing video or image data to determine whether a target is traveling behind the user's bicycle).
To provide yet another example, embodiments include processor 352 executing instructions stored in target tracking module 371 to identify if a particular target, once detected in the sensor field, is lost within some predetermined window of time after the target's initial detection (e.g., a fixed window of time that is not based upon the target's initial velocity). For instance, if the radar sensor signals are analyzed and a target is detected in the sensor field, the location and velocity of the target may be determined and a timer or other point of reference in time (e.g., a timestamp) may be generated. If the radar sensor signals later indicate (e.g., within the next 15 seconds, 30 seconds, etc.) that the target is no longer present in the sensor field (e.g., target passed bicycle, target turned onto another road, etc.), then this particular trigger condition is satisfied for processor 352 to evaluate objects located in the field of view of camera 358. Such embodiments may be particularly important, for example, in areas where traffic often changes unexpectedly, such that video analysis may not need to be performed when traffic behind the bicycle is turning off as opposed to being behind the bicycle but no longer detected via the radar sensor signal analysis.
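As a concrete picture of the lost-target window just described, the short sketch below may help; the class layout, the window value, and the use of a monotonic clock are assumptions for illustration, not the claimed implementation.

```python
import time

# Hypothetical sketch of the lost-target trigger: a target that disappears
# from the radar sensor field within a fixed window after first detection
# satisfies the condition for evaluating the camera's field of view.

class LostTargetTrigger:
    def __init__(self, window_s=30.0):
        self.window_s = window_s          # fixed window, not velocity-based
        self.first_seen = {}              # target id -> detection timestamp

    def update(self, detected_ids, now=None):
        """Return ids whose track was lost within the window."""
        now = time.monotonic() if now is None else now
        for tid in detected_ids:
            self.first_seen.setdefault(tid, now)
        lost = [tid for tid, t0 in self.first_seen.items()
                if tid not in detected_ids and (now - t0) <= self.window_s]
        for tid in lost:
            del self.first_seen[tid]      # stop tracking once reported
        return lost

trigger = LostTargetTrigger(window_s=15.0)
trigger.update({"car-1"}, now=0.0)        # target enters the sensor field
if trigger.update(set(), now=8.0):        # gone 8 s later: within the window
    print("Trigger: evaluate objects in the camera field of view")
```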
Regardless of how the analysis of video data is triggered, in accordance with various embodiments, radar unit 308 may continue to analyze the radar sensor signals (or do so periodically, such as every 5 seconds, every 10 seconds, etc.) corresponding to the sensor field while the video data corresponding to the field of view of camera 358 is analyzed. In the event that the relative velocity of the target resumes above a threshold relative velocity (or another trigger condition is no longer satisfied), then radar unit 308 may switch back to analyzing the radar sensor signals to determine the relative velocity and location of one or more targets and/or cause camera 358 to power down or otherwise stop capturing, storing, and/or transmitting video.

In embodiments in which relative target velocity is used as the basis of a trigger condition, the relative velocity threshold that triggers radar unit 308 to switch from an analysis based upon the radar sensor signals to an analysis based upon the video or image data of the field of view of camera 358 may be the same value or a different value than the relative velocity threshold that triggers radar unit 308 to switch back to an analysis based upon reflections of radar sensor signals from the sensor field. For example, different relative velocity threshold values may be used such that, once a video or image data analysis is triggered, a higher relative velocity threshold is required to switch back to a radar sensor signal analysis than the initial relative velocity threshold that triggered the video or image data analysis. In this way, data analysis switching may be performed in a hysteretic manner to better ensure smooth and consistent transitions between both types of data analyses. Again, this may be facilitated, for example, by either switching data analyses (when video data is continuously captured) or by powering down camera 358 or otherwise stopping video from being captured (when the video is not continuously captured), as the case may be.
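A minimal sketch of this hysteretic switching follows; the two threshold values and the state-machine shape are illustrative assumptions rather than the disclosed design.

```python
# Hypothetical sketch of hysteretic switching between radar-based and
# video-based tracking. A low threshold triggers the switch to video; a
# higher threshold is required to switch back, avoiding rapid toggling.

class AnalysisSwitch:
    def __init__(self, enter_mph=2.0, exit_mph=4.0):
        assert exit_mph > enter_mph   # hysteresis band
        self.enter_mph = enter_mph    # relative speed below this -> video
        self.exit_mph = exit_mph      # relative speed above this -> radar
        self.mode = "radar"

    def update(self, rel_speed_mph):
        if self.mode == "radar" and abs(rel_speed_mph) < self.enter_mph:
            self.mode = "video"       # target pacing the bicycle
        elif self.mode == "video" and abs(rel_speed_mph) > self.exit_mph:
            self.mode = "radar"       # target clearly moving again
        return self.mode

switch = AnalysisSwitch()
for v in [10.0, 1.5, 3.0, 4.5]:       # relative speeds over time
    print(v, "->", switch.update(v))  # 10.0 radar, 1.5 video, 3.0 video, 4.5 radar
```

Note that the intermediate value 3.0 mph does not switch the mode back: it lies inside the hysteresis band, which is precisely the smoothing behavior described above.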
The above examples discuss situations in which the video or image data is either captured or analyzed when the radar sensor signals no longer indicate the presence of a target. This may occur when a target has passed bicycle 202, turned onto another road (or is otherwise not present), or when the previously identified target 204 is now traveling directly behind the user. However, in some situations, it may be preferable to present or record live video of the field of view of camera 358 upon initially detecting a target, and then stop presenting or capturing the live video once the target has passed. Such embodiments may be particularly useful, for example, when the bicycle is traveling in an area that does not have many targets to track.
In an embodiment, the trigger condition may be based upon one or more targets having an assessed threat level in excess of a predetermined threshold. For example, as discussed further below, threat levels of targets may be based upon the determined size and/or proximity of a target to the bicycle, as well as other factors. In an embodiment, the processor of mobile electronic device 306 may determine the threat level based on an analysis of the target data. In some embodiments, the mobile electronic device 306 may determine when the trigger condition is satisfied based upon a target exceeding a predetermined threat level, and send a command to the radar unit 308 that causes the radar unit 308 to activate camera 358 and begin capturing, analyzing, and/or transmitting video data. In other embodiments, this decision to turn on camera 358 based on the determined threat level associated with a target in the sensor field may be made independently by processor 352 of radar unit 308.
In any event, processor 352 may selectively switch from determining the location and/or threat level of a target located in the sensor field relative to the bicycle using radar sensor signals to determining the location and/or threat level of a target relative to the bicycle using video data captured by the camera 358 of objects in its field of view. Again, this location and/or threat level may be included in the target data that is transmitted to the mobile electronic device, regardless of which source of data is used to determine this information.
In other words, a first trigger condition may include a target being initially detected in the sensor field via analysis of the radar sensor signals. This first trigger condition, when satisfied, may cause video to be captured, transmitted to mobile electronic device 306, stored, and/or analyzed. Furthermore, a second trigger condition may include the target passing the bicycle. This second trigger condition, when satisfied, causes the video to stop being captured, transmitted, stored, and/or analyzed. In this way, video footage may be stored over brief intervals of time when targets pose potential threats to a bicycle, and otherwise not stored permanently. This video data may be stored in memory unit 364, for example, and/or transmitted to mobile electronic device 306, which in turn presents the video data, in various embodiments. Additional details of how video data may be displayed in this manner are further discussed below.
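These paired trigger conditions amount to a simple capture window. The following sketch illustrates the idea under assumed names; the classes and their interfaces are placeholders, not the disclosed apparatus.

```python
# Hypothetical sketch of the paired trigger conditions described above:
# a first trigger (target detected by radar) starts video capture, and a
# second trigger (target has passed the bicycle) stops it, so footage is
# only retained for the brief interval in which a target is present.

class CaptureWindow:
    def __init__(self, camera):
        self.camera = camera
        self.recording = False

    def on_radar_update(self, targets_behind):
        if targets_behind and not self.recording:
            self.camera.start()          # first trigger condition satisfied
            self.recording = True
        elif not targets_behind and self.recording:
            self.camera.stop()           # second trigger: target passed
            self.recording = False

class FakeCamera:                        # stand-in for a rear-facing camera
    def start(self): print("camera: capturing video")
    def stop(self): print("camera: capture stopped")

window = CaptureWindow(FakeCamera())
window.on_radar_update(["car-1"])        # target detected behind bicycle
window.on_radar_update([])               # target no longer detected
```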
Threat Assessment

Threat assessment module 373 is a region of memory unit 364 configured to store instructions that, when executed by processor 352, cause processor 352 to perform various acts in accordance with applicable embodiments as described herein. In an embodiment, threat assessment module 373 includes instructions that, when executed by processor 352, cause processor 352 to categorize the threat level of one or more targets located in the sensor field of sensor array 356 or field of view of camera 358. For example, as discussed above, processor 352 may execute instructions stored in video processing module 369 to track the location and velocity of targets using video data. The categorized threat level of each target may be based upon, for example, the relative location to the bicycle and/or the size of each target calculated from one or more of such video processing algorithms.

That is, embodiments include processor 352 calculating one or more dimensions of various targets in the live video (located in the field of view of camera 358). These dimensions may be any suitable portion of each target, such as those measured with respect to the front side of a vehicle (e.g., height and width). Once these dimensions are calculated, processor 352 may execute instructions stored in threat assessment module 373 to compare the dimensions to a range of predetermined dimensional models associated with various threat classifications. To provide an illustrative example, memory unit 364 may store a set of dimensional models corresponding to a large vehicle, such as a semi-truck, that represents a high threat level. Continuing this example, memory unit 364 may also store other sets of dimensional models corresponding to a sport utility vehicle (SUV), a mid-sized vehicle, and a compact vehicle, each representing a decreasing threat level in accordance with decreasing dimensions. Once the dimensions of a particular target are identified, processor 352 may correlate the target to one of these dimensional models and assess the target's threat level as the threat level of the dimensional model to which it has been correlated.

This correlation may be performed in any suitable manner. For example, processor 352 may attempt to match a calculated target dimension to a range of dimensions associated with each threat level stored in memory unit 364 (e.g., overall width or height). Processor 352 may then determine which of the stored dimensional models has a range of dimensions best matching the corresponding calculated target dimension. Processor 352 may determine the threat level corresponding to the matched dimensional model and assess the target's threat level as the corresponding threat level.
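A minimal sketch of this dimensional-model matching follows, assuming vehicle width as the compared dimension; the width ranges, threat values, and function name are illustrative assumptions only.

```python
# Hypothetical sketch of matching a measured target dimension against
# stored dimensional models, each carrying a threat level. The dimension
# ranges (vehicle widths in meters) are illustrative values only.

DIMENSIONAL_MODELS = [
    {"name": "compact",    "width_m": (1.4, 1.7), "threat": 1},
    {"name": "mid-size",   "width_m": (1.7, 1.9), "threat": 2},
    {"name": "SUV",        "width_m": (1.9, 2.2), "threat": 3},
    {"name": "semi-truck", "width_m": (2.2, 2.8), "threat": 4},
]

def classify_threat(measured_width_m):
    """Assign the threat level of the model whose width range best fits."""
    best = None
    for model in DIMENSIONAL_MODELS:
        lo, hi = model["width_m"]
        if lo <= measured_width_m <= hi:
            return model["threat"], model["name"]
        # Otherwise remember the nearest range as a fallback match.
        dist = min(abs(measured_width_m - lo), abs(measured_width_m - hi))
        if best is None or dist < best[0]:
            best = (dist, model)
    return best[1]["threat"], best[1]["name"]

print(classify_threat(2.45))   # -> (4, 'semi-truck')
print(classify_threat(1.65))   # -> (1, 'compact')
```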
To provide another example, threat assessment module 373 may assess threats based upon other factors in addition to, or instead of, target size. Such threat assessments may be based upon any suitable combination of information obtained by analyzing the radar sensor signals and/or by analyzing captured video data. For example, as discussed below with regards to lane determination module 375, processor 352 may correlate one or more targets to a respective road lane, and may track each target as it moves between road lanes. In an embodiment, threat assessment module 373 may include instructions that, when executed by processor 352, cause processor 352 to utilize various metrics related to road lane usage to determine potential threats. For example, the rate at which a target changes lanes over a period of time may be compared to a threshold rate (e.g., 2 lane changes every 15 or 30 seconds, 4 lane changes within 60 seconds, etc.). Upon exceeding this threshold rate, a target may be associated with an increased threat level. To provide another example, a target that is severely skewed within its own lane or that straddles more than one lane may similarly be marked as being an increased threat level to the cyclist. To provide yet another example, the threat level may be modified in accordance with a range of predetermined distances from the bicycle, such that the target's threat level is increased the closer the target is to the bicycle, which may be in addition to the aforementioned threat assessment techniques or as an alternative to such techniques.
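The lane-change-rate metric just described can be pictured with a short sketch; the threshold values and the class shape below are assumptions chosen only to mirror the example figures above.

```python
# Hypothetical sketch of the road-lane-usage metric described above:
# a lane-change rate compared against a threshold rate within a
# sliding time window. Threshold values are illustrative.

from collections import deque

class LaneChangeMonitor:
    def __init__(self, max_changes=2, window_s=15.0):
        self.max_changes = max_changes
        self.window_s = window_s
        self.changes = deque()        # timestamps of observed lane changes
        self.last_lane = None

    def observe(self, lane, now):
        """Record the target's current lane; return True if its rate of
        lane changes exceeds the threshold (elevated threat)."""
        if self.last_lane is not None and lane != self.last_lane:
            self.changes.append(now)
        self.last_lane = lane
        while self.changes and now - self.changes[0] > self.window_s:
            self.changes.popleft()    # drop changes outside the window
        return len(self.changes) > self.max_changes

monitor = LaneChangeMonitor(max_changes=2, window_s=15.0)
for t, lane in [(0, 1), (3, 2), (6, 1), (9, 2)]:
    if monitor.observe(lane, t):
        print(f"t={t}s: erratic lane changes, raise threat level")
```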
Furthermore, because the location and velocity of each target with respect to the bicycle may be tracked over time, a trajectory may be calculated for each target. For example, by using the previous and current velocity and heading of a particular target, this information may be extrapolated to determine a future path for that target. This extrapolation may be applied to any suitable sample size of previously tracked information (e.g., the previous 5 seconds of data, the previous 10 seconds, etc.). In various embodiments, this trajectory information may then be utilized as the basis for one or more threat assessments. For instance, if a target's trajectory, when considered in conjunction with that of the bicycle, would result in a collision (or a proximity within some threshold distance) between the bicycle and the target, then processor 352 may assess this situation as a threat to the cyclist and cause communication unit 354 to transmit this information as part of the target data (or a separate data transmission), which is then conveyed to the user via mobile electronic device 306.
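As one concrete rendering of this trajectory check, the sketch below linearly extrapolates recent position samples and tests projected proximity; the coordinate frame, sample rate, horizon, and threshold are all illustrative assumptions.

```python
# Hypothetical sketch of the trajectory-based threat check described
# above: recent position samples are extrapolated forward and compared
# against the bicycle's own projected path. Values are illustrative.

def extrapolate(positions, dt, horizon_s):
    """Linearly extrapolate the last observed velocity over the horizon.
    positions: list of (x, y) samples taken every dt seconds."""
    (x0, y0), (x1, y1) = positions[-2], positions[-1]
    vx, vy = (x1 - x0) / dt, (y1 - y0) / dt
    return x1 + vx * horizon_s, y1 + vy * horizon_s

def collision_threat(target_track, bike_track, dt, horizon_s, thresh_m=2.0):
    tx, ty = extrapolate(target_track, dt, horizon_s)
    bx, by = extrapolate(bike_track, dt, horizon_s)
    gap = ((tx - bx) ** 2 + (ty - by) ** 2) ** 0.5
    return gap < thresh_m     # projected proximity within threshold

# Target closing from 20 m behind at ~5 m/s faster than the bicycle.
target = [(0.0, -20.0), (0.0, -15.0)]   # one sample per second
bike = [(0.0, 0.0), (0.0, 0.0)]         # bicycle-relative frame
if collision_threat(target, bike, dt=1.0, horizon_s=3.0):
    print("Trajectory threat: convey warning via mobile electronic device")
```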
To provide yet another example, processor 352 may utilize other forms of information to assess potential threats. For example, sensor array 356 may include a microphone that records audio data, which may be captured with video data or as a separate sensor measurement. Processor 352 may continuously analyze (or upon the same trigger conditions being satisfied as described herein with respect to the analysis of the video data) such audio data to determine whether a particular target should be audible to the cyclist. That is, when audio data indicates noise above a particular threshold level, then processor 352 may cause communication unit 354 to transmit this information as part of the target data (or a separate data transmission), which is then conveyed to the cyclist via mobile electronic device 306. Such embodiments may be particularly useful, for example, to provide a third source of threat assessment should the analysis of the radar sensor signals and the video data both fail to indicate the presence of a target.
Regardless of the type of threat identified, embodiments include mobile electronic device 306 displaying various threats in any suitable manner to adequately convey to a user the severity and/or type of threat, which is further discussed below with respect to FIGS. 4A-4C.
Lane Determination

Lane determination module 375 is a region of memory unit 364 configured to store instructions that, when executed by processor 352, cause processor 352 to perform various acts in accordance with applicable embodiments as described herein. In an embodiment, lane determination module 375 includes instructions that, when executed by processor 352, cause processor 352 to correlate the target to a road lane within the road on which it is traveling. For example, as discussed above, processor 352 may execute instructions stored in video processing module 369 to track the location and velocity of targets within captured video data corresponding to a field of view of camera 358. In an embodiment, processor 352 may further execute instructions stored in lane determination module 375 to utilize one or more of these video data processing algorithms to correlate a particular target to its current road lane.

For example, as discussed above, line segments associated with the road lane lines may be identified via edge detection (or other suitable techniques). Solid and dashed road lane lines may have pixel dimensions of a threshold size that are greater than other identified line segments within the live video data. Once the road lane lines are identified, processor 352 may execute instructions stored in lane determination module 375 to identify the shape of the road and the number of road lanes in the live video. This determination may be made, for example, using the cartographic data utilized for navigational functions performed by mobile electronic device 306 to verify the calculated number of road lanes.

Once the overall number of road lanes is determined, embodiments include processor 352 mapping or correlating the position of each target in the video to its respective road lane. Again, the cartographic data, which may be received from mobile electronic device 306, may be used to supplement or assist in this correlation. For example, if a target is identified as traveling in the second lane from the right side of the road, then processor 352 may correlate this lane position to the actual map of the same road based upon the current location of mobile electronic device 306. In an embodiment, processor 352 may repeat this process over time to track each target as it moves between road lanes. That is, as each target changes between different road lanes, processor 352 may keep track of this information and transmit this data as target data (or a separate data transmission) to mobile electronic device 306. The details of how the correlated road lane information may be displayed are further discussed below with reference to FIGS. 4A-4C.
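The final lane-assignment step can be sketched compactly. The fragment below assumes the lane-boundary x-coordinates (at the image row where the target sits) have already been recovered via edge detection; only the hypothetical mapping from a target's position to a lane index is shown.

```python
# Hypothetical sketch of correlating a detected target to a road lane,
# given sorted lane-boundary x positions (pixel columns) recovered from
# edge-detected lane lines. Values are illustrative only.

import bisect

def lane_of(target_x, boundaries):
    """Return the 1-based lane index (counted from the left) for a
    target centered at target_x, given sorted lane-boundary x values."""
    if not boundaries[0] <= target_x <= boundaries[-1]:
        return None                    # target is off the detected road
    return bisect.bisect_right(boundaries, target_x)

# Three lanes bounded by four detected lane lines (pixel columns).
lane_lines = [120, 320, 520, 720]
print(lane_of(430, lane_lines))        # -> 2 (second lane from the left)
print(lane_of(650, lane_lines))        # -> 3
```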
FIGS. 4A-4C are schematic illustration examples of user interface screens used in conjunction with a radar sensor system, according to an embodiment. Each of FIGS. 4A-4C shows various types of awareness indicators using the target data and/or other data received from a radar unit that is used as part of a radar sensor system. In an embodiment, FIGS. 4A-4C correspond to example displays shown by a mobile electronic device (e.g., mobile electronic device 306, as shown in FIG. 3) based on target data received from a radar unit (e.g., radar unit 308, as shown in FIG. 3).

In some implementations, processor 104 is configured to cause the display device 120 to present a route 422 and an icon 424 (e.g., a triangle) indicative of the cyclist's position on the route 422, as shown in FIG. 4A. The display screen 400 may also show a street name 418, route name, or other geographic data. The processor 104 may also cause the display device 120 to show guidance information on the display screen 400. For example, the display screen 400 in FIG. 4A shows directions for the cyclist to make a left turn ahead. In some implementations, the display screen 400 may show an arrow 420 on the route 422 and a "left turn" icon in the upper left corner of the display screen 400 with a distance (e.g., 300 ft.) to the left turn (not shown). The display screen 400 illuminates the sides (e.g., edges 402A and 402B) of the display screen 400, or the navigational information (turn arrow) presented on the display screen 400, in a low awareness color (e.g., green) to indicate that the determined awareness level is low at the moment or for the upcoming turn. A distance indicator 414 may also be shown to indicate that the left turn is approximately 300 feet ahead of the cyclist. A time indicator 416 may also be shown to indicate that the left turn is approximately 20 seconds ahead of the cyclist based on his or her current speed and location. Textual instructions with a street name 418 (e.g., "Left on Main St.") may be shown to guide the cyclist on the route 422. One or more navigational selections 412 (e.g., "Tap to go back") may be shown to allow the cyclist to make changes to the route or stop routing.
The display screen 400 can also show a sensor connection status icon 404, indicating that the mobile electronic device 102 is wirelessly connected to the radar unit. Sensor connection status icon 404 may be presented or shaded in a color (e.g., green) to indicate connectivity. In some implementations, the first processor 104 is configured to cause the display device 120 to indicate that a wireless connection with a transceiver (within radar unit 208 or 308) coupled with one or more radar sensors is active (connected) or disconnected from the mobile electronic device 102 by changing the color or shading of sensor curves in the sensor connection status icon 404 shown on the display screen 400. In some implementations, sensor connection status icon 404 may be accompanied by a notification displayed at any suitable location on display screen 400, such as "sensor has been disconnected," or any other sort of visual or auditory indication. After a period of time, the processor 104 may be configured to cause the display device 120 to remove the notification.
In some implementations, a situational awareness indicator determined by processor 104 may include a brightness or color of at least a portion of an edge (e.g., edge 402A or 402B) of the display screen 400 or navigational information (turn arrow) presented on the display screen 400. Processor 104 is configured to cause a change in brightness or color of an edge 402A or 402B or navigational information to provide a situational awareness level to the cyclist. For example, the display screen 400 can indicate a low level of recommended awareness with a slight change in brightness or dimming of edge 402A and/or 402B, and greater changes in brightness or dimming of edge 402A and/or 402B corresponding to higher levels of recommended awareness, such as when a vehicle is rapidly approaching or near the cyclist. The display screen 400 may also indicate a low level of recommended awareness by changing a color at edge 402A and/or 402B or navigational information (turn arrow) to a low awareness color such as green, indicate higher levels of recommended awareness by changing a color at edge 402A and/or 402B or navigational information (turn arrow) to a moderate awareness color such as yellow or orange, and indicate the highest level of recommended awareness by changing a color at edge 402A and/or 402B or navigational information (turn arrow) to a highest awareness color such as red.
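For illustration, the awareness-color selection just described reduces to a simple mapping; the level boundaries and the inputs chosen below (distance and closing speed) are assumptions, not the disclosed logic.

```python
# Hypothetical sketch of the awareness-level color mapping described
# above; the level boundaries and color names are illustrative only.

def edge_color(distance_m, closing_speed_mps):
    """Map a target's proximity and closing speed to the awareness color
    used to illuminate edges 402A/402B or the navigational turn arrow."""
    if distance_m < 10 or closing_speed_mps > 8:
        return "red"       # highest recommended awareness
    if distance_m < 30 or closing_speed_mps > 4:
        return "orange"    # moderate awareness
    if distance_m < 60:
        return "yellow"
    return "green"         # low awareness

print(edge_color(distance_m=50, closing_speed_mps=2))   # -> yellow
print(edge_color(distance_m=8, closing_speed_mps=3))    # -> red
```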
For example, processor 104 may receive target data from the radar unit (e.g., radar unit 208 or 308) indicating the position and velocity of targets relative to the bicycle (based upon an analysis of radar sensor signals or an analysis of video and/or image data, as discussed above). When this target data indicates that a target may be traveling nearby, as shown in FIG. 4A by dots 406, 408, and 410, the processor 104 may be configured to cause the display device 120 to illuminate the sides (e.g., edges 402A and 402B) of the display screen 400 or navigational information (turn arrow) presented on the display screen 400 in an awareness color (e.g., orange) corresponding to the determined threat level to indicate the awareness level.
The processor 104 may be configured to cause the display device 120 to present the tracking bar 403A and/or 403B on the display screen 400 to indicate a detected target (e.g., a rear-approaching vehicle) as one or more of the dots 406, 408, and/or 410. In various embodiments, dots 406, 408, and 410 as shown in FIG. 4A may represent three distinct targets located at different distances from the bicycle (dot 405 representing the user's location). In some embodiments, dots 406, 408, and 410 may relate to the progression of a single target as it approaches the bicycle from behind. In any event, the distance from the cyclist to the target determined to be present in a sensor field or field of view of the camera (i.e., ascertained from the target data) is represented by the position of each dot 406, 408, and 410 on the tracking bar 403B, relative to dot 405 representing the cyclist. In some embodiments, text may accompany each dot to indicate the distance of each target relative to the cyclist. Although two tracking bars 403A and 403B are shown in FIG. 4A (and FIGS. 4B-4C), embodiments include one of the tracking bars 403A or 403B being displayed at one time on display screen 400 or both tracking bars 403A and 403B being displayed at the same time. For example, tracking bar 403A may be used instead of tracking bar 403B, as shown in FIGS. 4A-4C, to display target locations according to user preference.
The dots and the accompanying text (when presented) may update periodically as new target data is received to indicate a current position of each target over time. Again, because the target data may include target distance and velocity information based upon an analysis of radar sensor signals and/or video or image data, embodiments include the position of dots 406, 408, and 410 updating regardless of the velocity at which the target is traveling. In this way, the changes in the position of each target over time may be readily and seamlessly conveyed to the cyclist in a continuous manner.

Furthermore, display screen 400 may include both awareness level indicators and/or threat level indicators based upon various factors. For example, as shown in FIG. 4A, edges 402A and 402B of the display screen 400 or navigational information (turn arrow) presented on the display screen 400 may be illuminated in a high awareness color (e.g., red) to indicate that the awareness level is high. This may be the case, for example, when the detected target represented by dot 410 is nearer to the cyclist or approaching at a faster speed than the targets represented by dots 406 and 408.

Threat level indicators may also be conveyed to the cyclist on display screen 400 in various ways. Again, the threat level of a particular target may be determined from the size of the target, the target's proximity to the cyclist, the velocity of the target, the target's trajectory, etc. In FIG. 4A, targets 406, 408, and 410 may represent three different targets behind the cyclist. As each target is nearer to the cyclist, that target's respective dot may increase in size proportional to the threat level. The size of the target's dot may also increase, for example, based upon the determined size of the target based on video or image data. For example, although dot 406 is smaller than dot 410 in FIG. 4A, these dots could be the same size if the target represented by dot 406 was determined to be of a much larger size than the target represented by dot 410.

In various embodiments, processor 104 may change the appearance of dots presented in tracking bar 403A and/or 403B in any suitable manner to adequately convey the classified threat level of each target, such as by changing colors, flashing, etc. Additionally or alternatively, the threat level information associated with each target (or the closest target) and/or other relevant information may be presented in an information overlay 426, as shown in FIG. 4B. For example, the information overlay 426 may display information in the form of text such as the velocity, position, and threat level classification of one or more targets (e.g., the closest target represented by dot 410), information related to the threat classification (e.g., "large vehicle approaching"), etc.
In some implementations, processor 104 is configured to cause the display device 120 to present live video data captured by the radar unit (e.g., radar unit 208 or 308) behind the cyclist, as shown in FIG. 4B. For example, display screen 440 includes the tracking bars 403A and 403B, dots 406, 408, and 410, and other similar icons and user interface functionality as display screen 400. However, instead of the map and route information previously displayed in display screen 400, as shown in FIG. 4A, the central portion 444 of display screen 440 as shown in FIG. 4B includes a screenshot of live video behind the bicycle. In this example, the live video shown in central portion 444 of display screen 440 includes that of the target corresponding to dot 410.
In an embodiment, the live video may be captured by the radar unit and transmitted to the mobile electronic device upon the radar unit receiving a command from the mobile electronic device requesting the live video. For example, as discussed above, once processor 104 determines a threat level associated with the target associated with dot 410 that exceeds a threshold threat level classification, the mobile electronic device may transmit a command to the radar unit. The radar unit may receive this command and, in response, begin capturing and transmitting live video, allowing the mobile electronic device to present the received live video, as shown in FIG. 4B. In an embodiment, once the determined threat level falls below the threshold threat level classification, display screen 440 may revert back to display screen 400. In this way, the mobile electronic device may display different types of information to a user (cyclist) based upon the threat level associated with a particular target.
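The command exchange just described can be sketched as follows; the message names, threshold value, and class interfaces are assumptions for illustration and are not the patented protocol.

```python
# Hypothetical sketch of the command exchange described above: the
# mobile electronic device requests live video once a target's threat
# level crosses a threshold, and releases it when the threat subsides.

class MobileDevice:
    def __init__(self, radar_unit, threshold=3):
        self.radar = radar_unit
        self.threshold = threshold
        self.showing_video = False

    def on_target_data(self, threat_level):
        if threat_level >= self.threshold and not self.showing_video:
            self.radar.handle_command("START_LIVE_VIDEO")
            self.showing_video = True        # e.g., display screen 440
        elif threat_level < self.threshold and self.showing_video:
            self.radar.handle_command("STOP_LIVE_VIDEO")
            self.showing_video = False       # revert to display screen 400

class RadarUnit:
    def handle_command(self, command):
        print("radar unit received:", command)

device = MobileDevice(RadarUnit())
device.on_target_data(threat_level=4)   # exceeds threshold: video starts
device.on_target_data(threat_level=1)   # falls below: video stops
```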
In some implementations, the processor 104 is configured to cause the display device 120 to present an indication of each target within its road lane, as shown in FIG. 4C. For example, display screen 480 includes the tracking bars 403A and 403B, dots 406, 408, and 410, and other similar icons and user interface functionality as display screens 400 and 440. However, display screen 480 includes a top-down view of the road, lane dividing lines, and targets traveling on the road behind the bicycle such that an indication of the tracked location of several targets relative to the bicycle is presented with the respective road lane for each target. In some embodiments, this top-down view may be presented as a particular mode of operation instead of the routing information shown in FIG. 4A. But in other embodiments, the top-down view shown in display screen 480 may be a transition from the navigational information shown in display screen 400. For example, similar to display screen 440, display screen 480 may be displayed upon the mobile electronic device detecting a threat level of a target exceeding a particular threat level classification. In such a case, display screen 480 may transition back to display screen 400 when the classified threat level falls below the classified threshold threat level.
Display screen 480 may present three targets 482, 484, and 486. Each of these targets may correspond, for example, to dots 406, 408, and 410, respectively. As shown in FIG. 4C, each target may be shaded or colored in any suitable manner to adequately convey that particular target's threat level. In the example shown in FIG. 4C, target 486 corresponds to dot 410 and is associated with the highest threat level. Target 482 corresponds to dot 408 and is associated with the next highest threat level, and target 484 corresponds to dot 406 and is associated with the lowest threat level. As an illustrative example, target 486 may be displayed in red, target 482 may be displayed in orange, while target 484 may be displayed in yellow. Processor 104 may determine a higher threat level for targets 482 and 486 (than 484) because these two targets are traveling in the same lane as the user. The proximity to the user and determined size of target 486 make it a higher threat to the user than target 482. As additional target data is received indicating new lane locations and threat levels for each target, the threat level and position of each target may be determined by processor 104 and updated accordingly.
In the example shown in FIG. 4C, the size of each dot may correspond to the threat level of each target. For instance, the size of the dots may be constant, changed based upon their distance to the bicycle, or updated in size or color to match that of the threat level indicated by each target's color. In this way, different types of threats may be conveyed in different ways to the cyclist via display screen 480. For example, the size of the dots may represent a threat based upon each target's proximity to the cyclist, while each target's coloration may represent a threat level based upon the target's size or lane-changing patterns.

FIG. 5 illustrates a method flow 500, according to an embodiment. In the embodiment, one or more regions of method 500 (or the entire method 500) may be implemented by any suitable device. For example, one or more regions of method 500 may be performed by mobile electronic device 306 and/or radar unit 308, as shown in FIG. 3.

Method 500 may begin when radar sensor signals are generated (block 502). These radar sensor signals may include, for example, radar sensor signals output from one or more radar sensors and reflections received from one or more targets located in a sensor field (block 502).

Method 500 may include one or more processors analyzing the reflected radar sensor signals to determine the velocity and/or location of one or more targets located in a sensor field (block 504). The location and/or velocity of these targets may be relative to that of the device which obtained the radar sensor signals (e.g., radar unit 308, as shown in FIG. 3) (block 504).

Method 500 may include one or more processors transmitting target data including the velocity and/or location for the one or more targets from the analysis of the radar sensor signals (block 506). This target data may be received, for example, by a mobile electronic device (e.g., mobile electronic device 306, as shown in FIG. 3), which interprets this information and presents it to a user on a display in any suitable manner. In various embodiments, upon receiving the target data, this information may be presented in accordance with the screenshots shown and described with reference to FIGS. 4A-4C.

Method 500 may include one or more processors determining whether a trigger condition has been satisfied (block 508). This may include, for example, the various trigger conditions discussed herein, such as the determination of the target being classified as a particular threat level (e.g., the threat level exceeding a predetermined threshold level), the passage of a predetermined threshold time period, the determination that a target matches a predetermined deceleration profile, the lack (absence) of any targets being detected for a particular time period based on radar sensor signals, user preference for the radar unit to utilize both the radar return signals and the video data collected by the camera, failure of the radar sensor, etc. (block 508). In embodiments, the processor may activate the camera based on a signal from a sensor array (e.g., accelerometers, gyroscopes, perspiration detectors, compasses, speedometers, magnetometers, barometers, thermometers, proximity sensors, light sensors (e.g., light intensity detectors), photodetectors, photoresistors, photodiodes, Hall Effect sensors, electromagnetic radiation sensors (e.g., infrared and/or ultraviolet radiation sensors), ultrasonic and/or infrared range detectors, humistors, hygrometers, altimeters, biometrics sensors (e.g., heart rate monitors, blood pressure monitors, skin temperature monitors), microphones, etc.) exceeding a predetermined level. If the trigger condition is not satisfied, then method 500 may revert back to continuing to analyze radar sensor signals (block 504), and transmitting the target data based upon the analysis of the radar sensor signals (block 506). However, if the trigger condition is satisfied, then method 500 may continue to utilize video and/or image data to identify target(s) located in a field of view of a camera (block 510).
Method 500 may include one or more processors causing a camera to begin capturing video (block 510). This may include, for example, issuing an appropriate command to power on or otherwise control a camera, resulting in the camera capturing, storing, and/or transmitting captured video depicting objects located in the field of view of the camera (block 510).
Method 500 may include one or more processors analyzing the video data to determine a velocity and/or location of one or more targets located in the camera's field of view (block 512). This may include, for example, the analysis of the captured video in accordance with any suitable number and/or type of video processing algorithms, as discussed herein (block 512).
Method 500 may include one or more processors transmitting (wirelessly or wired) target data including the velocity and/or location for the one or more targets determined by analyzing the video data (block 514). Again, this target data may be received, for example, by a mobile electronic device (e.g., mobile electronic device 306, as shown in FIG. 3), which interprets this information and presents it to a user on a display in any suitable manner. In various embodiments, this information may be displayed in accordance with the screenshots shown and described with reference to FIGS. 4A-4C.
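Pulling the blocks of method 500 together, the following end-to-end sketch shows one possible control flow; every name below is a placeholder chosen for illustration, and the trigger used in the example (no radar targets seen) is merely one of the conditions listed above.

```python
# Hypothetical end-to-end sketch of method flow 500: analyze radar
# returns (blocks 502-506), test a trigger condition (block 508), and
# fall through to video-based tracking (blocks 510-514) when satisfied.

def method_500_step(radar, camera, transmit, trigger_satisfied):
    targets = radar.analyze()                 # blocks 502-504
    transmit(targets)                         # block 506
    if not trigger_satisfied(targets):
        return "radar"                        # revert to radar analysis
    camera.power_on()                         # block 510
    video_targets = camera.analyze_video()    # block 512
    transmit(video_targets)                   # block 514
    return "video"

class DemoRadar:
    def analyze(self): return []              # no targets detected

class DemoCamera:
    def power_on(self): print("camera on")
    def analyze_video(self): return [{"id": "car-1", "dist_m": 12.0}]

# Trigger: no targets seen by radar for some period (simplified here).
mode = method_500_step(DemoRadar(), DemoCamera(),
                       transmit=lambda t: print("target data:", t),
                       trigger_satisfied=lambda t: len(t) == 0)
print("tracking mode:", mode)
```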
Some of the Figures described herein illustrate example block diagrams having one or more functional components. It will be understood that such block diagrams are for illustrative purposes and the devices described and shown may have additional, fewer, or alternate components than those illustrated. Additionally, in various embodiments, the components (as well as the functionality provided by the respective components) may be associated with or otherwise integrated as part of any suitable components. For example, any of the functionality described herein with reference to the radar unit may be performed by the mobile electronic device.
It should be understood that, unless a term is expressly defined in this patent application using the sentence "As used herein, the term '______' is hereby defined to mean . . ." or a similar sentence, there is no intent to limit the meaning of that term, either expressly or by implication, beyond its plain or ordinary meaning, and such term should not be interpreted to be limited in scope based on any statement made in any section of this patent application.
Although the foregoing text sets forth a detailed description of numerous different embodiments, it should be understood that the detailed description is to be construed as exemplary only and does not describe every possible embodiment because describing every possible embodiment would be impractical, if not impossible. In light of the foregoing text, numerous alternative embodiments may be implemented, using either current technology or technology developed after the filing date of this patent application.

Having thus described various embodiments of the technology, what is claimed as new and desired to be protected by Letters Patent includes the following:

1. A radio detection and ranging (radar) unit for a bicycle, comprising:
  a radar sensor configured to transmit a radar signal and receive a reflection of the transmitted radar signal, the radar sensor outputting a radar sensor signal corresponding to the received reflection;
  a camera;
  a wireless transceiver configured to transmit target data; and
  a processor coupled with the radar sensor, the camera, and the wireless transceiver, the processor configured to:
    determine a location of a target relative to the bicycle based on the received radar sensor signal,
    determine a threat level associated with the target based on the determined location of the target,
    generate the target data indicative of the location of the target relative to the bicycle and the threat level associated with the target, and
    turn on the camera based on the determined threat level associated with the target.

2. The radar unit of claim 1, wherein the processor is further configured to:
  determine a size of the target using video data generated by the camera, and
  determine the threat level associated with the target further based on the determined size of the target.

3. The radar unit of claim 1, wherein the wireless transceiver is further configured to transmit video data captured by the camera to a bicycle computing device upon the threat level associated with the target exceeding a threshold threat level.

4. The radar unit of claim 1, wherein the processor is further configured to selectively switch from determining the location of the target relative to the bicycle and the threat level associated with the target using the radar sensor signal generated by the radar sensor unit to determining the location of the target relative to the bicycle and the threat level associated with the target using video data captured by the camera.

5. The radar unit of claim 4, wherein the processor is further configured to determine the location of the target relative to the bicycle and the threat level associated with the target using the video data.

6. The radar unit of claim 1, wherein the processor is further configured to cause the camera to power on and to capture video data when the processor does not determine a location of a target relative to the bicycle for a duration of time exceeding a threshold time period.

7. The radar unit of claim 1, wherein the wireless transceiver is further configured to receive a command transmitted from a bicycle computing device, and
  wherein the processor is further configured to cause the camera to power on in response to the received command.

8. A radio detection and ranging (radar) unit mountable to a bicycle, comprising:
  a radar sensor configured to transmit a radar signal and receive a reflection of the transmitted radar signal, the radar sensor outputting a radar sensor signal corresponding to the received reflection;
  a camera configured to selectively capture video data behind the bicycle;
  a processor coupled with the radar sensor and the camera, the processor configured to:
    determine a location of a target based on the received radar sensor signal,
    determine a size of the target based on the captured video data,
    determine a threat level associated with the target based on the determined location and size of the target, and
    periodically generate target data indicative of the location and size of the target behind the bicycle; and
  a wireless transceiver coupled with the processor, the wireless transceiver being configured to transmit the target data to a bicycle computing device,
  wherein the target data transmitted via the wireless transceiver causes the bicycle computing device to present an indication of the tracked location of the target relative to the bicycle.

9. The radar unit of claim 8, wherein the processor is further configured to cause the camera to power off and to stop capturing video data once the determined threat level drops below a threshold threat level.

10. The radar unit of claim 8, wherein the processor is further configured to cause the camera to power on and to capture the video data when the processor does not determine a location of a target relative to the bicycle for a duration of time exceeding a threshold time period.

11. The radar unit of claim 8, wherein the wireless transceiver is further configured to receive a command transmitted from the bicycle computing device, and
  wherein the processor is further configured to cause the camera to power on in response to the received command.

12. The radar unit of claim 8, wherein the processor controls the wireless transceiver to transmit video data captured by the camera to a bicycle computing device upon the threat level associated with the target exceeding a threshold threat level.

13. The radar unit of claim 12, wherein the target data transmitted via the wireless transceiver further includes the determined threat level, and
  wherein the bicycle computing device is configured to cause a change in brightness or color of an edge of a display device in response to receiving the transmitted target data.

14. A radio detection and ranging (radar) unit mountable to a bicycle, comprising:
  a radar sensor configured to transmit a radar signal and receive a reflection of the transmitted radar signal, the radar sensor outputting a radar sensor signal corresponding to the received reflection;
  a camera configured to capture video data of a road behind the bicycle; and
  a processor coupled with the radar sensor and the camera, the processor configured to:
    determine a location of a target based on the received radar sensor signal,
    analyze the video data to identify road lane markers within the road and a current road lane of a target behind the bicycle,
    determine a size of the target based upon the analysis of the video data,
    classify the target with a threat level proportional to the determined size of the target, and
    generate target data indicative of the determined location of the target and the identified current road lane occupied by the target behind the bicycle, the target data including the classified threat level; and
  a wireless transceiver coupled with the processor, the wireless transceiver being configured to transmit the target data to a bicycle computing device to cause the bicycle computing device to present an indication of the identified location of the target in the identified road lane.

15. The radar unit of claim 14, wherein in response to receiving the transmitted target data, the bicycle computing device presents an indication of the tracked location of the target relative to the bicycle in the target's correlated road lane within the road.

16. The radar unit of claim 14, wherein the processor is configured to cause the camera to power off and to stop capturing video data once the determined threat level drops below a threshold threat level.

17. The radar unit of claim 14, wherein the processor is configured to identify the location of the target using the video data.

18. The radar unit of claim 14, wherein:
  the target is from among a plurality of targets;
  the processor is further configured to analyze the video data to correlate each of the plurality of targets to a respective road lane; and
  in response to receiving the transmitted target data, the bicycle computing device presents an indication of the tracked location of each of the plurality of targets relative to the bicycle in each target's respective road lane.
