United States Patent — Maluf
(10) Patent No.: US 7,873,181 B1
(45) Date of Patent: Jan. 18, 2011

(54) VISUAL IMAGE SENSOR ORGAN REPLACEMENT: IMPLEMENTATION

(75) Inventor: David A. Maluf, Mountain View, CA (US)

(73) Assignee: The United States of America as represented by the Administrator of the National Aeronautics and Space Administration, Washington, DC (US)

(*) Notice: Subject to any disclaimer, the term of this patent is extended or adjusted under 35 U.S.C. 154(b).

(21) Appl. No.: 11/525,600

(22) Filed: Sep. 21, 2006

Related U.S. Application Data
(63) Continuation-in-part of application No. 11/239,450, filed on Sep. 28, 2005.

(51) Int. Cl.: G06K 9/00 (2006.01); G06K 9/36 (2006.01)
(52) U.S. Cl.: 382/100; 382/276
(58) Field of Classification Search: 382/100; see application file for complete search history.

Primary Examiner: Vu Le
Assistant Examiner: Amara Abdi
(74) Attorney, Agent, or Firm: John F. Schipper; Robert M. Padilla

(57) ABSTRACT

Method and system for enhancing or extending visual representation of a selected region of a visual image, where the visual representation is interfered with or distorted, by supplementing a visual signal with at least one audio signal having one or more audio signal parameters that represent one or more visual image parameters, such as: vertical and/or horizontal length range of the region; change in a parameter value that characterizes the visual image, with respect to a reference parameter value; and time rate of change in a parameter value that characterizes the visual image. Region dimension can be changed to emphasize change with time of a visual image.

OTHER PUBLICATIONS

Meijer, "An Experimental System for Auditory Image Representations," IEEE Transactions on Biomedical Engineering, vol. 39, No. 2, Feb. 1992, IEEE.
Grantham, "Spatial Hearing and Related Phenomena," Hearing, Handbook of Perception and Cognition, 2nd Edition, 1995, Academic Press, USA.
Mansur et al., "Sound Graphs: A Numerical Data Analysis Method for the Blind."
Pickles, "Neural Signal Processing," Hearing, Handbook of Perception and Cognition, 2nd Edition, 1995, Academic Press, USA.
Stern et al., "Models of Binaural Interaction," Hearing, Handbook of Perception and Cognition, 2nd Edition, 1995, Academic Press, USA.
Yates, "Cochlear Structure and Function," Hearing, Handbook of Perception and Cognition, 2nd Edition, 1995, Academic Press, USA.

[Representative drawing (FIG. 3): block diagram of the mapping device, including a first stage receiver/processor 31-1, a region locator and sizing mechanism 32, a color sensing mechanism 33, a brightness sensing mechanism 34, a time delay mechanism 35, and an audible signal formation mechanism 36 issuing audible signal components (ASC).]

[FIG. 4: receiver/processor 40 with carrier/envelope frequency analyzer, envelope amplitude analyzer, phase difference analyzer 43, baseline function analyzer 44, relative signal analyzer 45, and an overall time delay.]
[FIG. 5: flow chart of the method. Step 51: represent at least one selected region of a visual image by at least first, second, third and fourth selected visual image components, including at least one of: vertical location (relative to a signal recipient) of the visual image region, horizontal location of the region, brightness of the region, and predominant hue or wavelength of the region. Step 52: map the visual image component representatives onto at least first, second, third and fourth audible signal attributes, drawn from the following attributes: carrier signal frequency, envelope signal frequency, carrier signal-envelope signal phase difference at a selected time, baseline amplitude, envelope signal amplitude relative to baseline amplitude, and time duration of the signal. Step 53: present the audible signal attributes sequentially in an audibly perceptible manner. Step 54 (alternative): incorporate the audible signal attributes into one or more audible signals that are presented in an audibly perceptible manner.]

[FIG. 6: battlefield scenario with combatants 61-i and associated appliances 63-i.]

[FIG. 7: impact effect distance d(E) (e.g., d(E; death)) versus estimated explosive load E.]

[FIGS. 8A and 8B: monotone decreasing warning frequency f_a(t; n; decr) and monotone increasing warning frequency f_a(t; n; incr), each terminating or plateauing at an end frequency f(end) on one side or the other of f(caut), according to whether d(n; sep) is greater than or less than d(E).]

[FIGS. 9A and 9B: undulatory-frequency versions of the warning signal f_a(t; n; 2) for d(n; sep) ≤ d(E) and for d(n; sep) > d(E).]

[FIG. 10: observers O1(x1, y1, z1) and O2(x2, y2, z2) with lines of sight to the projectile.]

[FIG. 11: projectile 62-1 launched from launch site LS-2, following trajectory 67-1 toward impact site IS.]

VISUAL IMAGE SENSOR ORGAN REPLACEMENT: IMPLEMENTATION

RELATED APPLICATION

This application is a continuation-in-part of a patent application entitled "Visual Image Sensor Organ Replacement," U.S. Ser. No. 11/239,450, filed 28 Sep. 2005.

ORIGIN OF THE INVENTION

This invention was made, in part, by one or more employees of the U.S. government. The U.S. government has the right to make, use and/or sell the invention described herein without payment of compensation, including but not limited to payment of royalties.

FIELD OF THE INVENTION

This invention relates to implementation of the use of audio signal parameters as representatives of time varying or constant visual signal parameter values.

BACKGROUND OF THE INVENTION

Present development of fast, cheap and miniaturized electronics and sensory devices opens new pathways for the development of sophisticated equipment to overcome limitations of the human senses. Humans rely heavily on vision to sense the environment in order to achieve a wide variety of goals. However, visual sensing is generally available only for a limited visible range of wavelengths, roughly 400 nm (nanometers) to 720 nm, which is a small fraction of the range of wavelengths (180 nm through about 10,000 nm) at which interesting physical effects and/or chemical effects occur. Audible sensing, over an estimated audible range of 20 Hz (Hertz) to 20,000 Hz, is similarly limited, but this range is a larger fraction of the corresponding range at which interesting effects occur. Further, use of binaural hearing to provide audible cues on depth and relative location is generally better developed than are the corresponding mechanisms associated with formation of visible images.

Since the time of Aristotle (384-322 BC), humans have been interested in perceiving what is beyond normal "vision." Roentgen's discovery of X-rays enabled him to see inside living tissue, and "vision" was thereby extended beyond the naked eye.
In the following years, imaging and sensing techniques have developed so rapidly that astronomy, medicine and geology are just a few of the areas where sensing beyond the normal visual spectrum has been found useful. Altering and extending human "vision" changes our perception of the world.

According to some recent research in evolution of the visual system for animals, reported in "What Birds See" by Timothy H. Goldsmith, Scientific American, July 2006, pp. 68-75, certain bird species have a tetra-chromatic color sensing system, with color bands spanning the near-ultraviolet, violet, green and red wavelengths, in contrast to the tri-chromatic (for primates, humans and some birds) or bi-chromatic (for other animals) color sensing systems that cover only two or three visible wavelength bands.

FIG. 1 illustrates an undulating, audibly perceptible signal s_a(t) having a single information-bearing (envelope) frequency and a single carrier frequency. This signal can be characterized by: an envelope frequency f_e and a corresponding time rate of change of the envelope frequency df_e/dt (analogous to "chirping" or to a Doppler shift); a carrier frequency f_c; an envelope frequency phase φ_e at a selected time t_0; a carrier frequency phase φ_c at the selected time; a baseline function amplitude b(t), defining a baseline curve BB, and a corresponding time rate of change of baseline amplitude db/dt; a non-undulatory signal amplitude a_0 (or a(t)), measured relative to the baseline curve BB; and a time interval (duration) Δt for the signal. The human ear may be able to distinguish the phase difference, Δφ = φ_e − φ_c, but cannot distinguish the absolute phases. An audible signal equation incorporating all these features is

s_a(t) = b(t) + a(t)·sin{2πf_e(t − t_0) + φ_e}·sin{2πf_c(t − t_0) + φ_c}.   (1)

The maximum number of parameters for the signal shown in FIG. 1 that may be distinguished by the human ear is M = 6, if the (absolute) selected time t_0 and the absolute phases are not included. These M signal parameters may be used to audibly represent a corresponding visual region of an image, such as vertical and horizontal coordinate ranges (versus time) of the visual region (relative to a fixed two-dimensional or three-dimensional system), estimated distance s(t) and/or rate of change of distance ds/dt to a selected center of the region, region brightness, overall region brightness, and region predominant hue (color) or wavelength. Optionally, these audible signal parameters can be presented simultaneously or sequentially, for any corresponding visual image region that is so represented. In a sequential presentation, one or more additional audible signal parameters may be included if the information corresponding to the additional parameter value(s) is necessary for adequate representation of the image region.

The visual image may be decomposed into a sequence of K selected visual image regions R_k (k = 1, ..., K, with K ≥ 1; contiguous or non-contiguous; overlapping or non-overlapping) that make up part or all of the total visual image, for example as illustrated in FIG. 2. The sequence of regions R_k, and the corresponding sequence of audible signal parameters, need not exhaust the set of all regions that together make up the visual image.
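For illustration only, the following is a minimal sketch (not part of the patent) that synthesizes one audible signal component of the form of Eq. (1); the sample rate, the particular parameter values and the function names are assumptions chosen for the example.

```python
import numpy as np

def synthesize_component(duration_s, f_env, f_car, phi_env=0.0, phi_car=0.0,
                         amplitude=0.3, baseline=0.0, sample_rate=44100):
    """Generate samples of s_a(t) = b + a*sin(2*pi*f_e*t + phi_e)*sin(2*pi*f_c*t + phi_c).

    All argument names and default values are illustrative assumptions; the patent
    specifies only the functional form of the signal, not an implementation.
    """
    t = np.arange(0.0, duration_s, 1.0 / sample_rate)
    envelope = np.sin(2.0 * np.pi * f_env * t + phi_env)
    carrier = np.sin(2.0 * np.pi * f_car * t + phi_car)
    return baseline + amplitude * envelope * carrier

# Example: a 2-second component with a 5 Hz envelope riding on an 800 Hz carrier.
samples = synthesize_component(duration_s=2.0, f_env=5.0, f_car=800.0)
print(samples.shape)  # (88200,)
```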
Preferably, the image regions are chosen according to which regions are of most interest. For example, when an image has a single image region (less than the entire image) where one or more image parameters is changing substantially with time, this region may be a primary focus; and if this region slowly changes its location or its physical extent within the total image, the location and breadth of this image region should correspondingly change with time. That is, the horizontal and vertical bounds and/or the center of an image region may move with time within the total image.

If the visual image changes between one time t and a subsequent time, the audible parameters representing each selected region R_k may also change with time, in a sequential manner. FIG. 1 graphically illustrates signal parameters corresponding to a mapping that can be implemented to represent a group of visual signal parameters, representing a selected region R_k of the total image, by an audibly perceptible signal or signals.

In one approach, a visual image region R_k is selected and optionally isolated, and the corresponding audibly perceptible signal parameters are presented (1) sequentially within a time interval of selected length (e.g., 5-30 sec) or (2) as part of a single audio signal that incorporates two or more selected audio signal parameter values.

If an audible signal parameter changes with time, continuously or discretely, this change can be presented according to several options: (i) change the audible parameter value continuously at a rate that corresponds to the time rate of change of the corresponding visual parameter value; (ii) change the audible parameter value discretely at a rate corresponding to a discrete time rate of change of the visual parameter value; and (iii) change the audible parameter value discretely, by a selected amount, only when the magnitude of the difference between a first value and a second value of the parameter is at least equal to a threshold magnitude, which will vary with the nature of the visual parameter.
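A minimal sketch, assuming a sampled visual parameter and hypothetical step and threshold values, of the three update policies (i)-(iii) just described; none of these names or numbers appear in the patent.

```python
def continuous_update(audible_value, visual_rate, dt):
    """Option (i): track the visual parameter's rate of change continuously."""
    return audible_value + visual_rate * dt

def discrete_rate_update(audible_value, visual_prev, visual_curr, gain=1.0):
    """Option (ii): step the audible value by the discrete change in the visual value."""
    return audible_value + gain * (visual_curr - visual_prev)

def threshold_update(audible_value, visual_ref, visual_curr, step, threshold):
    """Option (iii): change by a fixed step only when the visual change meets a threshold."""
    if abs(visual_curr - visual_ref) >= threshold:
        direction = 1.0 if visual_curr > visual_ref else -1.0
        return audible_value + direction * step, visual_curr  # new reference value
    return audible_value, visual_ref

# Example of option (iii): a brightness change smaller than the threshold is ignored.
value, ref = threshold_update(audible_value=440.0, visual_ref=0.50,
                              visual_curr=0.52, step=20.0, threshold=0.05)
print(value, ref)  # 440.0 0.5
```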
Humans and primates rely heavily on tri-chromatic vision to sense and react to the environment in order to achieve various goals. By contrast, other animals rely heavily, but not exclusively, on smell (e.g., rodents), on sound (e.g., some birds), or on tetra-chromatic vision (other birds). The invention augments or replaces a human sensory visual system, which is deficient in many respects, with one or more auditory signals, in order to achieve the following:

(1) Provide a capacity to sense beyond the human visible light range of the electromagnetic spectrum.

(2) Increase the capacity of human sensing resolution, beyond the number of rods and cones in the human eye (approximately 120 million rods and 6 million color sensing cones) that limit the resolution of the images, particularly because humans rely on the subset of cones located in the fovea, which provide humans the highest visual acuity of approximately 1 minute of arc resolution within a field of view less than 12 degrees horizontal by 4 degrees vertical in humans.

(3) Provide a wider angle equivalent of visual sensory perception, where the shape and location of human eyes limit the effective human field of view to about 200 degrees horizontally by about 150 degrees vertically.

(4) Improve the ability of a human to sense distances, which is presently relatively poor and can be confounded by a wide variety of visual cues.

(5) Allow compensation for movement by the human or changes in the scene; for example, motion smear or blur can make it difficult to resolve images at resolutions achievable when the perspective of an image is not moving or changing.

(6) Allow splitting of user attention (multi-tasking using two or more senses), where a visual image limits the range of other activities that a person can do simultaneously, such as monitoring gauges and reading text concurrently.

(7) Provide audibly perceptible changes in an audible parameter value that correspond to changes, continuous or discrete, in a visual parameter value that are too small or subtle for a human eye to sense or respond to.

(8) Provide an audible parameter value that changes in an audibly perceptible manner only when the corresponding visual parameter changes by at least a threshold amount, where the threshold is selectable according to the environment.

Using the invention, a wide variety of tasks that are difficult or cumbersome to accomplish using primarily visual indicia can be met, including the following:

Enabling the user to substantially simultaneously focus attention on multiple aspects of a visual field, or its audible field equivalent;

Embedded human sensing of aircraft performance; and

Audibly indicating micro-fractures and/or thermal distortion in materials.

In order to increase the visual image resolution obtainable via an auditory representation, a mapping is performed to distribute the image in time. Three-dimensional spatial brightness and multi-spectral maps of a sensed image are processed using real-time image processing techniques (e.g., histogram normalization) and are transformed into one or more two-dimensional maps of an audio signal as a function of frequency and of time.
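The patent names histogram normalization only as an example of a real-time preprocessing step; the following is one conventional way such a step might be carried out on a brightness map (a generic histogram-equalization sketch, with all details assumed and not taken from the patent).

```python
import numpy as np

def equalize_brightness(brightness, levels=256):
    """Histogram-equalize a 2-D brightness map so its values spread over [0, 1]."""
    flat = brightness.ravel()
    hist, bin_edges = np.histogram(flat, bins=levels,
                                   range=(flat.min(), flat.max() + 1e-9))
    cdf = np.cumsum(hist).astype(np.float64)
    cdf /= cdf[-1]                               # cumulative distribution, normalized to 1
    bin_index = np.digitize(flat, bin_edges[1:-1])
    return cdf[bin_index].reshape(brightness.shape)

# Example: a synthetic low-contrast image becomes full-range after equalization.
image = np.random.normal(loc=0.5, scale=0.05, size=(64, 64))
equalized = equalize_brightness(image)
print(equalized.min(), equalized.max())
```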
The invention uses a Visual Instrument Sensory Organ Replacement (VISOR) system to augment the human visual system by exploiting the capabilities of the human auditory system. The human brain is far superior to most existing computer systems in rapidly extracting relevant information from blurred, noisy and redundant images. This suggests that the available auditory bandwidth is not yet exploited in an optimal way. Although image processing techniques can manipulate, condense and focus the information (e.g., using Fourier Transforms), keeping the mapping as direct and simple as possible may also reduce the risk of accidentally filtering out important clues. Even a perfect, non-redundant sound representation is subject to loss of relevant information in a non-perfect human hearing system. Also, a complicated, non-redundant visual image-to-audible image mapping may well be more difficult to learn and comprehend than a straightforward visual mapping, while the mapping system would increase in complexity and cost.

FIG. 3 schematically illustrates a mapping device used to transform selected visual image parameters associated with a region to an audible signal with audibly perceptible parameters. One or more visual image region ("VIR") representations is received and analyzed by a first stage signal receiver-processor ("R/P") 31-1. The first stage R/P 31-1 analyzes a received VIR and provides one or more (preferably as many as possible) of the following visual signal characterization parameters to a second stage R/P 31-2: vertical and horizontal coordinate ranges of the region and/or its center; an optional adjustment in size of the region viewed; region predominant hue (color) or wavelength; and region average brightness and region peak brightness, using a region locator and sizing mechanism 32, a region predominant (or average) color sensing mechanism 33 and a region brightness sensing mechanism 34. Output signals from the locator mechanism 32, from the color mechanism 33 and from the brightness mechanism 34 are received by a third stage R/P 31-3, which provides a collection of audible signal parameters, including a time rate of change (TRC) of at least one parameter value.

As an example: the predominant or average color output signal from the region color sensing mechanism 33 can be used to determine the envelope frequency f_e; the region brightness output signal from the region brightness sensing mechanism 34 can be used to determine the envelope relative amplitude, a_0 (constant) or a(t); and the vertical and horizontal location output signals from the locator mechanism 32 can be used to determine the time duration Δt (if the visual image region locations are indexed by a one-dimensional index), or to determine the time duration Δt and the envelope frequency f_e (if the visual image region locations are indexed using a two-dimensional index), or a change rate db/dt or df_e/dt. The four visual signal parameters can be assigned to four of six audibly perceptible signal parameters (FIG. 1) in (6 choose 4) = (6·5·4·3)/(4·3·2·1) = 15 distinguishable ways. More generally, N visual signal parameters can be assigned to M (≥ N) audibly perceptible signal parameters in (M choose N) distinguishable ways.

Where the time rate of change option (i) is used for a visual signal parameter value ψ, one can form an approximating second degree polynomial through three consecutive sampled values,

ψ(t; app) = Σ_{k = p−1, p, p+1} ψ(t_k) · Π_{j ≠ k} (t − t_j)/(t_k − t_j),   (2)

and compute ψ(t) and dψ/dt using the approximating polynomial ψ(t; app) and its derivative dψ(t; app)/dt, respectively. Approximating polynomials of degree higher than two can also be used here.

Where the time rate of change option (ii) is used for a visual signal parameter value ψ, a sequence of ratios

{ψ(t_{p+1}) − ψ(t_p)} / (t_{p+1} − t_p)   (3)

is computed for the sequence of times {t_p}.

Where the time rate of change option (iii) is used for a visual signal parameter value ψ, the quantities of interest become the differences Δψ(t_p) = ψ(t_p) − ψ(t_p′), where t_p′ is the most recent preceding time at which the audibly presented parameter value was changed; the audible parameter value is changed, by the selected amount, only where |Δψ(t_p)| is at least equal to the threshold magnitude.
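A minimal sketch of option (i)'s local second-degree approximation: fit a quadratic through the three most recent samples of a visual parameter ψ and evaluate both the smoothed value and its time derivative. The fitting routine and the sample values are assumptions made for the example, not taken from the patent.

```python
import numpy as np

def quadratic_value_and_rate(times, values, t_eval):
    """Fit psi(t; app) = a0 + a1*t + a2*t**2 through three samples and return
    (psi(t_eval; app), d psi/dt at t_eval)."""
    a2, a1, a0 = np.polyfit(times, values, deg=2)   # highest-degree coefficient first
    value = a0 + a1 * t_eval + a2 * t_eval ** 2
    rate = a1 + 2.0 * a2 * t_eval
    return value, rate

# Example: brightness samples at t = 0.0, 0.1, 0.2 s; evaluate at the newest sample time.
value, rate = quadratic_value_and_rate([0.0, 0.1, 0.2], [0.40, 0.46, 0.54], t_eval=0.2)
print(round(value, 3), round(rate, 3))  # smoothed value and fitted rate of change
```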
The analysis performed by each of the mechanisms 32, 33 and 34 is not instantaneous, and the associated time delays may not be the same for each analyzer. For this reason, an overall time delay, at least as long as the largest of the individual analyzer delays, is preferably imposed, using a time delay mechanism 35, before an audible signal (or audible signal sequence) incorporating the 1 through M = M1 + M2 audible signal parameters is audibly displayed, where M2 is the number of parameter values that can change with time and M1 is the number of remaining parameters. If the audible signal parameters are displayed sequentially, not simultaneously or collectively, this time delay might be reduced or eliminated. The overall time delay is implemented by the time delay mechanism 35, which incorporates an appropriate time delay value for each of the audible signal parameters received from the first stage R/P 31-1. An audible signal formation mechanism 36 (optional) forms and issues either: (1) an audibly perceptible, ordered sequence of the set of M audible signal components ASC(m), m = 1, ..., M (or a subset thereof), or (2) a collective audibly perceptible signal APS incorporating the set (or a subset) of the audible signal components. The output signal from the audible signal formation mechanism 36 is perceived by a human or other animal recipient.

The R/P 40, illustrated in FIG. 4, includes one or more of the following: a carrier/envelope frequency (f_c, f_e) analyzer 41; an envelope amplitude analyzer 42; an envelope-carrier frequency phase difference (Δφ) analyzer 43 and a baseline function (b(t)) analyzer 44, which estimate the phase difference at a selected time and determine the baseline function and the baseline time rate of change; and a relative signal amplitude (a_0 or a(t)) analyzer 45, relative to the baseline function at a corresponding time.

The analysis performed by each of these analyzers is not instantaneous, and the associated time delays may not be the same for each analyzer. For this reason, an overall time delay is preferably imposed before an audible signal incorporating the M converted visual signal parameters is audibly displayed. If the converted visual signal parameters are audibly displayed sequentially rather than simultaneously, this delay might be reduced or eliminated. The overall time delay is implemented by a time delay mechanism 46, which incorporates an appropriate time delay for each of the audible parameters received from the R/P 31. A signal formation module 47 forms a composite audible signal representing an audible image component, and issues this component as an output signal.

Where the amplitude a(t) = a_0 is constant, the signal shown in FIG. 1 may be represented in the alternative form

s_a(t) = b(t) + (a_0/2)·[cos{2π(f_c − f_e)(t − t_0) + (φ_c − φ_e)} − cos{2π(f_c + f_e)(t − t_0) + (φ_c + φ_e)}].

The carrier/envelope frequency analyzer 41 forms a sequence of correlation signals, computed over a time interval of length T,

C1(f) = ∫_0^T s_a(t)·cos(2πf·t) dt,
C2(f) = ∫_0^T s_a(t)·sin(2πf·t) dt,

at each of a spaced apart sequence of "translated" carrier frequencies f in a selected carrier frequency range, where f_c is not yet known, and provides two spaced apart frequencies, f_1 = f_c + f_e and f_2 = f_c − f_e, associated with the VIR, at which the correlation combination C1² + C2² has the highest magnitudes. The envelope and carrier frequencies are then estimated from

f_c = (f_1 + f_2)/2,   f_e = (f_1 − f_2)/2.

The envelope-carrier phase difference Δφ and the relative amplitude a_0 are determined by computing correlations of the received signal with envelope-carrier product sinusoids of selectable phase, from which the quantities a_0 (or a(t)) and Δφ are easily determined. The baseline function b(t) is then determined from

b(t) = s_a(t) − a(t)·sin{2πf_e(t − t_0) + φ_e}·sin{2πf_c(t − t_0) + φ_c}.

The frequency difference (f_c − f_e) and frequency sum (f_c + f_e) values are distinguished from each other in a normally functioning human auditory system if the difference between the sum frequency and the difference frequency is at least equal to a threshold value, such as 250 Hz.
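A minimal sketch of the two-peak frequency search just described: correlate the received signal with sinusoids over a grid of trial frequencies, take the two strongest responses as f_1 = f_c + f_e and f_2 = f_c − f_e, and recover f_c and f_e. The grid spacing, peak-picking rule and test parameters are assumptions made for the example.

```python
import numpy as np

def estimate_carrier_and_envelope(signal, sample_rate, freq_grid):
    """Return (f_c, f_e) estimated from the two largest correlation magnitudes C1^2 + C2^2."""
    freq_grid = np.asarray(freq_grid, dtype=float)
    t = np.arange(len(signal)) / sample_rate
    power = []
    for f in freq_grid:
        c1 = np.sum(signal * np.cos(2.0 * np.pi * f * t))   # C1(f)
        c2 = np.sum(signal * np.sin(2.0 * np.pi * f * t))   # C2(f)
        power.append(c1 ** 2 + c2 ** 2)
    power = np.asarray(power)
    f2, f1 = np.sort(freq_grid[np.argsort(power)[-2:]])     # two strongest, f2 < f1
    return (f1 + f2) / 2.0, (f1 - f2) / 2.0

# Example: a(t)*sin(2*pi*5*t)*sin(2*pi*800*t) has spectral lines at 795 Hz and 805 Hz.
fs = 8000.0
t = np.arange(0.0, 2.0, 1.0 / fs)
s = 0.3 * np.sin(2 * np.pi * 5.0 * t) * np.sin(2 * np.pi * 800.0 * t)
grid = np.arange(700.0, 900.0, 1.0)
print(estimate_carrier_and_envelope(s, fs, grid))  # approximately (800.0, 5.0)
```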
FIG. 5 is a flow chart illustrating a method for practicing the invention. In step 51, at least one selected region of a visual image is represented by N selected visual image parameters, including at least one of: vertical and horizontal location coordinates and time rate of change of location coordinate(s), relative to a signal recipient, of the region; region predominant hue or wavelength; region brightness (average and/or peak); and time rate of change of a visual signal parameter. In step 52, the visual image region representatives are mapped onto M audible signal attributes (M ≥ N), drawn from the following set of attributes: carrier signal frequency; envelope signal frequency and time rate of change of envelope frequency; carrier signal-envelope signal phase difference at a selected time; baseline amplitude and time rate of change of baseline amplitude; envelope signal amplitude relative to baseline amplitude; and signal time duration. In step 53, the audible signal attributes are presented sequentially in an audibly perceptible manner to the recipient. In an alternative to step 53 (step 54), the audible signal attributes are received and incorporated in one or more audible signals that is/are presented in an audibly perceptible manner to a recipient.

The invention can be applied to provide audibly perceptible and distinguishable signals, representing one or more selected regions of a visually perceptible image, for a sight-impaired person. Where more than one VIR is represented, the audible signal representatives of the VIRs are preferably presented sequentially, with a small separation time interval (as little as a few tens of msec) between consecutive representatives.

Where the visual image is a line drawing or other binary representation, the audible signal components can be configured to represent curvilinear and linear shapes, sizes and intersections. Where the visual image primarily represents interaction of dominant color masses in different regions of the image, the dominant hues and shapes of these interacting regions can be represented audibly. For other reasons, a non-sight-impaired person may prefer to focus attention on attributes of a region of an image that can be represented more accurately or intuitively by a non-visual signal, for example, to extend the (visual) wavelength range of signals that can be perceived.

The invention can be applied to "enrich" image detail, or to manifest more clearly some image details that are not evident where the region is viewed solely with reference to visible light wavelengths. For example, some details of a region may be hidden or muddled when viewed in visible wavelength light, but may become clear when the region is illuminated with, or viewed by, an instrument that is sensitive to, near-infrared light (wavelength λ ≈ 0.7-2 μm) or mid-infrared light (about 2-20 μm) or ultraviolet light (λ ≤ 0.4 μm). These hidden details can be converted to audible signal parameter values that are more easily audibly perceived as part of a received signal. Operated in this manner, the invention can separately compensate for a relatively narrow (or relatively broad) visible wavelength sensitivity of the viewer and a relatively narrow (or relatively broad) auditory frequency sensitivity of the same viewer or of a different viewer-recipient. Operated in this manner, the visible wavelength sensitivity of a first (visual image) viewer of the image region can be adjusted and compensated for electronically by adjusting the audible frequency range of one or more of the audible signal parameters, before the transformed audible signal is received by the same viewer or by a different viewer.
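One way such a conversion could be arranged, purely as an illustration: map an optical wavelength (possibly outside the visible band) onto an audible envelope frequency by log-linear scaling. The wavelength and frequency ranges chosen here are assumptions; the patent does not prescribe a particular mapping.

```python
import math

def wavelength_to_envelope_freq(wavelength_um,
                                band=(0.2, 20.0),         # UV through mid-IR, micrometers
                                audible=(100.0, 2000.0)):  # target envelope range, Hz
    """Log-linearly map a wavelength in `band` onto an envelope frequency in `audible`."""
    lo, hi = math.log(band[0]), math.log(band[1])
    frac = (math.log(wavelength_um) - lo) / (hi - lo)   # 0.0 at band[0], 1.0 at band[1]
    frac = min(max(frac, 0.0), 1.0)                     # clamp out-of-band wavelengths
    return audible[0] + frac * (audible[1] - audible[0])

# Example: visible green (0.55 um), near-IR (1.5 um) and UV (0.3 um) map to distinct tones.
for wl in (0.55, 1.5, 0.3):
    print(wl, round(wavelength_to_envelope_freq(wl), 1))
```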
The invention can also be applied to provide audible signal components representing shape signatures, sizes and estimated separation distances for objects that cannot be seen, or that are seen very imperfectly, because of signal interference, signal distortion and/or signal attenuation by the ambient environment. This may occur in a hazardous environment where fluids present provide an opaque, darkened or translucent view of objects in the environment, including moving or motionless persons and objects that present a hazard.

This interference may also occur in an airborne environment in which rain, snow, hail, sleet, fog, condensation and/or other environmental attributes prevent reasonably accurate visual perception of middle distance and far distance objects. A visual image region that is likely to experience interference can be converted and presented as a sequence of audio signal attributes that can be more easily or more accurately perceived or interpreted by an operator of an aircraft (airborne or on the ground). The audio signal attributes may be extended to include an estimated closing velocity between the operator/aircraft and the not-yet-seen object.

The invention can also be applied, in an environment of visual "confusion," to focus on and provide information only on important details among a clutter of unimportant details. The important details may be characterized by certain parameters, and the system may focus on initially-visual details that possess one or more of these parameters, converting the relevant information to audibly perceptible signals that contain this (converted) information. An example is a specified aircraft approaching a destination airport surrounded by other airborne aircraft: the specified aircraft may wish to focus on and receive relevant (converted) audibly perceptible information for the immediately preceding aircraft and the immediately following aircraft in a queue formed by an air traffic controller to provide an orderly sequence of touchdowns at the destination airport.

The invention can also be applied where visual signals representing the image are more likely to experience signal interference, signal distortion, signal attenuation and/or similar signal impairments than are selected corresponding audible signals that represent certain parameters in these visual signals. The visual signals (now converted to audible signals) may be transmitted through the ambient environment with reduced signal interference, reduced signal distortion and/or reduced signal attenuation, and may be interpreted more accurately by a signal recipient.

The invention can also be applied to provide an audible signal representing P dimensions (P ≥ 2), formed or converted from a two-dimensional visual image region. The audible signal may, for example, provide depth cues, clues about a dominant hue or color or brightness, if any, and clues about the maximum fineness of detail associated with the image region, in addition to normal two-dimensional information.

Consider a visual image region, such as a limited region of the image, and let ψ(t) represent an image region parameter that changes with time. The parameter may change continuously, or even differentiably, but in other more general situations ψ(t) may also change by a discrete amount at each of a sequence of spaced apart times {t_n}, as {Δψ(t_n)}, where Δψ(t_n) is discussed in the preceding. Assuming that ψ(t_n) ≠ 0 for n = 1, 2, ..., one can form a normalized parameter

χ(t_n) = {ψ(t_n) − ψ(t_0)} / ψ(t_0)   or   χ(t_n) = ψ(t_n) / ψ(t_{n−1}),

which represents a difference or a ratio of a subsequent parameter value relative to an initial or preceding parameter value. This difference or ratio can be represented audibly by a baseline signal amplitude difference (b(t_n) − b(t_0)), an amplitude ratio b(t_n)/b(t_0), an envelope frequency difference, or an envelope frequency ratio, among other combinations.
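A minimal sketch of the normalized-parameter idea just described, with hypothetical helper names: express a region parameter as a difference or ratio relative to a reference value and use it to scale, for example, the envelope frequency (one of the several audible representations listed above).

```python
def normalized_change(current, reference, mode="ratio"):
    """Return (current - reference)/reference or current/reference for a nonzero reference."""
    if mode == "difference":
        return (current - reference) / reference
    return current / reference

def apply_to_envelope(f_env_ref, current, reference):
    """Scale a reference envelope frequency by the parameter ratio."""
    return f_env_ref * normalized_change(current, reference, mode="ratio")

# Example: region brightness rising from 0.40 to 0.50 raises a 400 Hz envelope to 500 Hz.
print(apply_to_envelope(f_env_ref=400.0, current=0.50, reference=0.40))  # 500.0
```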
As an example of application of the VISOR system, consider a battlefield situation in which one or more combatants, or one or more equipment items, are exposed to artillery shells or other projectiles, as illustrated in FIG. 6 and discussed in more detail in Appendices A, B and C.

In a first version, each combatant wears or carries a location determination ("LD") system, such as GPS, and is aware of the combatant's present location coordinates within an accuracy of a few meters. A simple differential equation for a projectile ballistic trajectory is posited, and two or more observations, spaced apart in time, of the projectile location from each of two observers allows estimation of the relevant shell trajectory parameters, including projectile launch point, projectile impact point and time, and projectile explosive load, from which projectile injury and/or projectile lethality regions can be estimated. Where the combatants are or may be within the disability or injury or lethality region for the projectile, the combatants can be notified collectively of this development by use of an audible (or visual) warning signal, such as a signal with monotonically decreasing (or increasing) frequency, with a final frequency value f(end) that is near to or below a frequency corresponding to disablement or injury or lethality. In this instance, each combatant receives a separate audible (or visual) warning signal with monotonically varying frequency, having a final frequency f(end) that is specific for that combatant's present location on the battlefield. That is, one combatant may be within an injury/lethality region, and another combatant may be outside this region, with a separate audible (or visual) warning signal for each.

Where M (≥ 3) observations of projectile location are provided, projectile trajectory accuracy is enhanced by use of a statistically weighted average of trajectory location points. Distinction between trajectories of two or more projectiles that are present at substantially the same time is also available.

In a second version, the audible (or visual) warning signal has an undulatory signal frequency and/or an undulatory signal intensity, which is different for a combatant located inside, as opposed to outside, a probable disability or injury or lethality region relative to the estimated impact site.

In third and fourth versions, at least one (reference) combatant, but less than all the combatants, wears or carries an LD system, and the injury or lethality region is estimated for the reference combatant. When the reference combatant is within the injury or lethality region, an appliance worn by the reference combatant issues an audible (or visual) warning signal that is recognized by all nearby combatants.

Consider a battlefield situation in which one or more combatants 61-i (i = 1, 2, ...) are exposed to artillery shells or other projectiles, as illustrated in FIG. 6. Each combatant 61-i wears or carries an appliance 63-i (i = 1, 2, ...), including a receiver-processor for GPS signals and/or other location determination ("LD") signals, received from LD transmitters 65-j (j = 1, ..., J; J ≥ 2) that are spaced apart from the combatants. The appliance 63-i associated with each combatant 61-i is aware of the appliance location coordinates (x_i, y_i, z_i) to within an acceptably small inaccuracy. If differential GPS ("DGPS") signals are used, the appliance location can be determined to within an inaccuracy of no more than one meter.

A projectile 62 is launched from a launch site location LS, spaced apart from the combatants 61-i, roughly targeting the combatants and following a trajectory 67 that can be visually or (preferably) electromagnetically observed and estimated. For example, a trajectory estimation system disclosed in Appendices A, B and C, or any other system with acceptable promptness of response, can be used for trajectory observation and estimation. A trajectory observation and estimation system 66 observes and provides a prompt, accurate estimation of the projectile trajectory 67, including but not limited to an estimate of the impact site location IS for the projectile 62. The location coordinates (x_IS, y_IS, z_IS) for the projectile impact location are promptly transmitted to each appliance 63-i, which promptly computes the separation distance

d(i; sep) = {(x_i − x_IS)² + (y_i − y_IS)² + (z_i − z_IS)²}^(1/2)

and generates an audible, time varying signal S_a(t; i), illustrated graphically in different versions in FIGS. 8A, 8B, 9A and 9B, that is communicated to the associated combatant 61-i. Optionally, each combatant 61-i receives a separately determined audible signal S_a(t; i) that is chosen or customized for that combatant's hearing system (including taking account of that combatant's hearing acuity or audible signal sensitivity versus frequency). In one version, the audible signal S_a(t; i) begins at a relatively high, but audibly perceptible, frequency f(0), and quickly and monotonically decreases to an end frequency f(end) that is monotonically decreasing with decrease of the separation distance d(i; sep) for the particular combatant 61-i. Optionally, the audible signal S_a(t; i) either terminates at the end frequency f(end) or continues at that end frequency.

In a first version, the combatant 61-i or the appliance 63-i compares the end frequency f(end) with a frequency f(caution) (for which the combatant has been trained) to determine if the estimated impact location of the projectile is close enough (within a distance d(E), depending upon the estimated explosive load E, illustrated in FIG. 7) to the combatant's own location to possibly cause death or serious injury to an exposed combatant. If f(end) ≤ f(caution), the combatant quickly takes defensive maneuvers, such as reducing exposure to the projectile's explosive force. If the estimated impact site location IS is further away and f(end) > f(caution), the combatant may elect to take no defensive maneuvers. Normally, a cautionary distance d(caution) varies inversely with an estimate of the explosive load E carried by the projectile 62.

In a second version, the appliance 63-i numerically compares the estimated separation distance d(i; sep) with d(caution), computed for the estimated explosive load of the projectile 62, and communicates the result of this comparison audibly to the associated combatant 61-i, using an audible signal S_a(t; i) with monotonically varying frequency f(t) that decreases to f(end) (to be compared mentally with f(caution)) according to the separation distance d(i; sep). Alternatively, the audible signal frequency f(t) may increase monotonically as d(i; sep) decreases, so that f(end) ≥ f(caution) causes implementation of defensive maneuvers by the combatant 61-i.
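A minimal sketch of the per-combatant computation just described: compute d(i; sep) from the appliance and estimated impact coordinates, map it to an end frequency f(end) that falls as the distance shrinks, and compare against f(caution). The particular frequency mapping and numbers are assumptions made for illustration.

```python
import math

def separation_distance(appliance_xyz, impact_xyz):
    """d(i; sep): Euclidean distance from the combatant's appliance to the estimated impact site."""
    return math.dist(appliance_xyz, impact_xyz)

def end_frequency(d_sep, d_caution, f_caution=600.0, f_max=2000.0):
    """Monotone map: f(end) rises with d(i; sep) and equals f(caution) at d_sep = d_caution."""
    return min(f_max, f_caution * d_sep / d_caution)

def must_take_cover(d_sep, d_caution, f_caution=600.0):
    """First-version test: defensive maneuvers when f(end) <= f(caution)."""
    return end_frequency(d_sep, d_caution, f_caution) <= f_caution

# Example: impact estimated about 80 m away, cautionary distance 120 m for the estimated load.
d = separation_distance((10.0, 20.0, 0.0), (60.0, 82.0, 0.0))
print(round(d, 1), must_take_cover(d, d_caution=120.0))  # 79.6 True
```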
In the first version, each combatant 61-i receives a separately determined audible signal S_a(t; i). In the second version, applicable where the relative locations of a group of combatants are substantially unchanging (the combatants remain in place or move as a group), a single audible signal S_a(t) can be provided, keyed to a separation distance d(ref; sep) of the estimated impact location from a reference combatant (real or virtual) for the group. In this version, the frequency range f(end) ≤ f ≤ f(0) and the cautionary frequency f(caution) are preferably chosen to take account of the hearing acuities of each member in the group of combatants. Different versions of this example are discussed in detail in Appendices A, B and C.

As another example, consider a visual image region, a portion of a larger image, in which an object of interest moves toward the viewer or away from the viewer at substantial speed. Because of this movement, the apparent size (diameter viewed transverse to the direction of sight) of the object changes substantially with time. If the visual image is reduced in size and (re)defined so that this object is the dominant feature of the (resulting) visual image, change of the diameter with time can be represented as an envelope frequency or baseline amplitude, for example, with envelope frequency or baseline amplitude changing in proportion to the increase (or decrease) with time of the diameter of the object.

The system includes a "focus" mechanism (optional) that permits a visual image region (part of a larger image) to be discretely or continuously reduced or increased or otherwise adjusted in size (horizontally and vertically, independent of each other) to redefine, and focus on, a selected smaller visual image region, in order to more clearly display the image temporal changes that are of most importance. This adjustment in visual image region size can be implemented discretely by drawing an initial quadrilateral or other polygon (rectangle, trapezoid, etc.) as a border around the region to which the visual image is restricted. Optionally, the resulting visual image region, thus redefined, maintains its new shape, with an image diameter that increases or decreases according to the change in diameter that occurs as a result of definition of the selected smaller visual image region.
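As an illustration of the proportional mapping just described (all scale factors are assumed, and the baseline-amplitude alternative would be handled the same way): track the apparent diameter of the dominant object over time and let the envelope frequency change in proportion to it.

```python
def envelope_from_diameter(diameter_px, diameter_ref_px, f_env_ref=300.0):
    """Envelope frequency proportional to the object's apparent diameter."""
    return f_env_ref * diameter_px / diameter_ref_px

# Example: an approaching object whose apparent diameter doubles also doubles the envelope frequency.
for d in (40, 60, 80):   # diameter samples in pixels as the object closes in
    print(d, envelope_from_diameter(d, diameter_ref_px=40))   # 300.0, 450.0, 600.0
```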
Appendix A

Example of Projectile Trajectory Estimation

Where combatants are present on a battlefield and are exposed to artillery or armored vehicle fire, the injury and fatality count can be reduced if a probable impact point and a probable lethality radius for the shell or other projectile can be estimated.

Where the separation distance satisfies d(n; sep) > d(E), so that injury or death from explosion of the incoming projectile is less likely or unlikely, the audible (or visual) signal frequency f(t; n; decr) will stop decreasing at an end frequency f(end) ≥ f(caut), and the audible (or visual) signal frequency f(t; n; incr) will stop increasing at an end frequency f(end) ≤ f(caut), as illustrated in FIGS. 8A and 8B, respectively.

In this first version, the location of each combatant 61-n is known to within an inaccuracy of a few meters or less, using an LD device that is part of the receiver-processor 63-n, and a separation distance d(n; sep) is calculated for each combatant 61-n, preferably using information received at and/or computed by the corresponding receiver-processor 63-n. The condition in Eq. (A-2) is tested separately for each combatant 61-n to determine if this condition is satisfied. For each combatant 61-n for which the condition (A-2) is satisfied, the receiver-processor 63-n generates a first audible (or visual) warning signal s(t; n; decr) in which the monotone decreasing frequency f(t; n; decr) decreases to substantially below the cautionary frequency f(caut), or alternatively the monotone increasing frequency f(t; n; incr) increases to substantially above the cautionary frequency f(caut). Where the condition (A-2) is not satisfied for a particular combatant 61-n, the monotone decreasing frequency f(t; n; decr) terminates or plateaus at an end frequency f(end) > f(caut), and the monotone increasing frequency f(t; n; incr) terminates or plateaus at an end frequency f(end) < f(caut).

The projectile trajectory solution r(t) can be expressed in terms of two vector parameters, b(1,2) and c(1,2) (Eq. (B-2)), where r(t_1) and r(t_2) are two observations of location coordinates for the projectile at distinct times t_1 and t_2, and t_0 is a selected but arbitrary time value (e.g., t_0 = (t_1 + t_2)/2). Where the observation time values t_1 and t_2 are known, the solution r(t) can be extended backward and forward in time to estimate a launch time t = t_L, a corresponding launch site LS, an impact time t = t_I, and a corresponding impact site IS, for which

r(t_L) ∈ S_L,   (B-3)
r(t_I) ∈ S_I,   (B-4)

where S_L is a known launch surface and S_I is a known impact surface (e.g., planar or spheroidal). The launch time t_L, the impact time t_I, the launch site coordinates (x_L, y_L, z_L), and the impact site coordinates (x_I, y_I, z_I) are estimated from Eqs. (B-2), (B-3) and (B-4).

Two or more projectile observations (at times t_1 and t_2, with t_1 < t_2) from each of two or more spaced apart observers are used to provide trajectory vector values r(t_1) and r(t_2). Where projectile observations are available at I > 2 distinct times t_i (i = 1, 2, ..., I), with t_1 < t_2 < ... < t_I, one can obtain a potentially more accurate estimation of the trajectory by replacing the trajectory parameters b(1,2) and c(1,2) with filtered or statistically weighted parameter values, respectively, defined by

b = Σ_{i1<i2} w(i1, i2)·b(i1, i2),
c = Σ_{i1<i2} w(i1, i2)·c(i1, i2),

where the w(i1, i2) are normalized, non-negative filter weights satisfying

Σ_{i1<i2} w(i1, i2) = 1.

The projectile launch velocity, |dr/dt| at t = t_L, as determined immediately after launch, may be computed and used to estimate the projectile explosive load E, relying in part upon a database of launch velocity for each of the different projectiles in the adversary's arsenal. The projectile explosive load E, plus a reference curve or database of separation distance d(E) for serious injury or lethality (FIGS. 8A and 8B), is used to estimate whether a combatant is within a probable injury or lethality region for the estimated impact site.

Where the value of F_1 is known (e.g., from local wind observations), the projectile mass m can be estimated. By consulting an appropriate database of the adversary's projectiles and comparing the projectile launch velocity, determined immediately after launch, plus the projectile mass m, an estimate of the projectile explosive load E can be made. Using the information contained in FIG. 7 for the particular projectile used, an impact effect distance d(E) corresponding to serious injury or lethality or equipment disablement can be estimated and used in the preceding development, in connection with Eq. (A-2), to determine which audible (or visual) warning signal S(t; n) should be provided for a particular combatant, or for all combatants in a given region. Two or more different warning signals may be provided, corresponding to different dangers from projectile impact.
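A simplified sketch of the Appendix B idea under a drag-free ballistic assumption (the patent's trajectory model is more general, and its exact form is not reproduced here): two timed position observations determine the initial position and velocity, after which the solution is extrapolated forward to the impact surface z = 0. The flat impact surface, the absence of drag, and all numbers are assumptions made for the example.

```python
import numpy as np

G = np.array([0.0, 0.0, -9.81])  # gravitational acceleration, m/s^2

def fit_ballistic(r1, t1, r2, t2):
    """From observations r(t1), r(t2) of a drag-free trajectory, return (r0, v0) at t = 0."""
    r1, r2 = np.asarray(r1, float), np.asarray(r2, float)
    # r(t) = r0 + v0*t + 0.5*G*t^2  ->  two vector equations in the unknowns r0, v0.
    v0 = (r2 - r1 - 0.5 * G * (t2 ** 2 - t1 ** 2)) / (t2 - t1)
    r0 = r1 - v0 * t1 - 0.5 * G * t1 ** 2
    return r0, v0

def impact_point(r0, v0):
    """Extrapolate to the later root of z(t) = 0 (flat impact surface assumed)."""
    a, b, c = 0.5 * G[2], v0[2], r0[2]
    t_impact = (-b - np.sqrt(b ** 2 - 4 * a * c)) / (2 * a)
    return t_impact, r0 + v0 * t_impact + 0.5 * G * t_impact ** 2

# Example: two radar fixes on a shell, 1 s apart.
r0, v0 = fit_ballistic([120.0, 0.0, 95.0], 2.0, [180.0, 0.0, 120.0], 3.0)
t_i, r_i = impact_point(r0, v0)
print(round(t_i, 2), np.round(r_i, 1))   # estimated impact time (s) and impact coordinates (m)
```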
Appendix C

Trajectory Observations

It is assumed here that the projectile P is observed at substantially the same time t from two or more known observation locations O1 and O2, having the respective location coordinates (x_1, y_1, z_1) and (x_2, y_2, z_2). Each observer measures or infers the projectile location coordinates (x_p, y_p, z_p) at substantially the same time, through measurement of a separation distance d(m) (m = 1, 2) and of angular coordinates that determine a line of sight from the observer location O_m to the projectile location P. The observer locations O1 and O2 and the projectile location P have coordinates referenced to an arbitrary but fixed coordinate system, for example one aligned with the plane Π determined by the line segments O1-P and O2-P.

A directly measured separation distance d(m) can be determined by emitting and receiving a radar or other electromagnetic or acoustic signal that is reflected from the projectile and measuring the round-trip signal time Δt(m):

d(m) = c·Δt(m)/2   (m = 1, 2),   (C-1)

where c is the velocity of signal propagation in the ambient medium.

The line of sight segment L_m from O_m to P is described parametrically as follows for each observation point:

x = x_m + d·cos(φ_m)·sin(θ_m),
y = y_m + d·sin(φ_m)·sin(θ_m),
z = z_m + d·cos(θ_m),   (C-2)

where (θ_m, φ_m) are spherical coordinates, referenced to the same coordinate system, and d is the distance along the line of sight. Writing Eqs. (C-2) for m = 1 and m = 2 gives the relations (C-2a) and (C-2b), and one recovers the consistency relations

(x_p − x_m)² + (y_p − y_m)² + (z_p − z_m)² = d(m)²   (m = 1, 2).

The location (x_p, y_p, z_p) is the unique intersection of the line of sight segments L_1 and L_2: setting d = d(1) along L_1 and d = d(2) along L_2 in Eqs. (C-2a) and (C-2b) and requiring that the two parameterizations yield the same point determines the projectile coordinates.   (C-3)
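A minimal sketch of the Appendix C geometry under simplifying assumptions: each observer supplies its position and a unit line-of-sight vector toward the projectile, and the projectile location is estimated as the point closest (in least squares) to both rays; with noise-free data this is the exact intersection described by Eqs. (C-2a) and (C-2b). The least-squares formulation is an assumption; the patent works with the spherical-coordinate form directly.

```python
import numpy as np

def triangulate(o1, u1, o2, u2):
    """Estimate the point nearest (in least squares) to the two rays o_m + d*u_m."""
    o1, u1, o2, u2 = (np.asarray(v, float) for v in (o1, u1, o2, u2))
    # Each unit direction u contributes a projector (I - u u^T) onto the plane normal to its ray.
    A = np.zeros((3, 3))
    b = np.zeros(3)
    for o, u in ((o1, u1), (o2, u2)):
        P = np.eye(3) - np.outer(u, u)
        A += P
        b += P @ o
    return np.linalg.solve(A, b)

# Example: two observers 500 m apart sight a projectile at (200, 300, 400).
p_true = np.array([200.0, 300.0, 400.0])
o1, o2 = np.array([0.0, 0.0, 0.0]), np.array([500.0, 0.0, 0.0])
u1 = (p_true - o1) / np.linalg.norm(p_true - o1)   # unit line-of-sight vectors
u2 = (p_true - o2) / np.linalg.norm(p_true - o2)
print(np.round(triangulate(o1, u1, o2, u2), 1))    # [200. 300. 400.]
```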
