(19) United States
(12) Patent Application Publication          (10) Pub. No.: US 2017/0205061 A1
     BROERS et al.                           (43) Pub. Date: Jul. 20, 2017

(54) STADIUM LIGHTING AIMING SYSTEM AND METHOD

(71) Applicant: PHILIPS LIGHTING HOLDING B.V., Eindhoven (NL)

(72) Inventors: HARRY BROERS, Eindhoven (NL); RUBEN RAJAGOPALAN, Eindhoven (NL); WEI PIEN LEE, Eindhoven (NL); CHRIS DAMKAT, Eindhoven (NL); BART ANDRE SALTERS, Eindhoven (NL)

(21) Appl. No.: 26,418

(22) PCT Filed: Jul. 6, 2015

(86) PCT No.: PCT/EP2015/065281
     § 371 (c)(1), (2) Date: Jan. 13, 2017

(30) Foreign Application Priority Data
     Jul. 17, 2014 (EP) .................. 14174356

Publication Classification

(51) Int. Cl.:
     F21V 21/15   (2006.01)
     G06T 7/70    (2006.01)
     G06K 9/46    (2006.01)
     H04N 7/18    (2006.01)

(52) U.S. Cl.:
     CPC ...... F21V 21/15 (2013.01); H04N 7/188 (2013.01); G06T 7/70 (2017.01); G06K 9/4604 (2013.01); F21W 2131/105 (2013.01)

(57) ABSTRACT

A lighting aiming system for aiming a stadium lighting system, the lighting aiming system comprising: a luminaire (5), the luminaire having a mounting position and an orientation and configured to generate a light beam along an optical axis; a camera (1) configured to capture an image, the camera (1) coupled to the luminaire and having a defined relationship between a field of view of the camera and the optical axis; a memory (33) configured to store lighting information comprising a desired aiming location of the luminaire, and the memory further configured to store feature information comprising an expected location of at least one feature; and a processor (31) configured to determine and output aiming information based on the feature information, lighting information and image to enable a determination of whether the luminaire is correctly aimed.

Patent Application Publication    Jul. 20, 2017    Sheet 1 of 7    US 2017/0205061 A1

FIG. 1
FIG. 2
FIG. 3
FIG. 4
FIG. 5
FIG. 6
FIG. 7

STADIUM LIGHTING AIMING SYSTEM AND METHOD

FIELD OF THE INVENTION

[0001] This invention is generally related to a lighting aiming system and method, and in particular a lighting aiming system and method which is compatible with large area lighting systems.

BACKGROUND OF THE INVENTION

[0002] In many high-end large-area lighting applications, such as those utilising Philips ArenaVision products for example, a large number of luminaires are distributed around an area to be illuminated in an attempt to create a uniform light intensity across said given area. Examples of such applications include arena lighting systems for illuminating sports arenas, e.g. field, pitch or stadium lighting, facade lighting, shop floor lighting, parking lot lighting and so on.

[0003] A football stadium, for example, may have a lighting plan or design where the lighting system contains a large number of luminaires, each located on the stadium and aimed at a location or point on the pitch to attempt to provide a suitable lighting effect. Typically the lighting system installer receives a light plan which contains, for each luminaire, information such as the type of luminaire, the mounting location and orientation of the luminaire, and the aiming location or point (typically relative to the centre of the field). Based on this information the installer mounts the luminaires; the installer also has to direct each luminaire to its aiming location in the field using the lighting plan orientation values.

[0004] From the luminaire location, the installer has a clear overview of the field but it is very difficult to accurately determine the aiming location in the field.
To improve the accuracy of the alignment procedure, the installer can use a grid created by manually putting visual markers on the field at the required coordinates and a laser pointer aligned with the luminaire optical axis. In such a way the alignment is a matter of aiming the laser spot at the requested visually interpolated locations on the grid. In this procedure the placement of the visual markers on the field is an elaborate task and the alignment itself based on the laser spot is prone to error.

[0005] One way to overcome such difficulties is to preset the luminaires mounted on prefabricated jigs on small scale models of the lighting plan, such as discussed within U.S. Pat. No. 8,717,882. However, such systems are unable to adapt to any changes to the building of the stadium. In other words it is difficult or impossible to 'fine tune' the luminaires in an effective way. It has furthermore been proposed, such as disclosed in US published application US20130268246, to attach a camera to the luminaire so that the installer is able to 'see' where the light from the luminaire will be directed by producing a 'cropped' image reflecting the modelled light pattern. This cropped image can then be compared with a further 'wide-angle' camera image to determine the location of the light beam and pattern relative to a wide-angle image and so determine whether the luminaire is directed in the required direction.
SUMMARY OF THE INVENTION

[0006] The above concern is addressed by the invention as defined by the claims.

[0007] According to an embodiment of the invention, there is provided a lighting aiming system for aiming a large area lighting system, the large area lighting system for lighting an area to be illuminated, the lighting aiming system comprising: a luminaire, the luminaire having a mounting position and an orientation and configured to generate a light beam along an optical axis; a camera configured to capture an image, the camera coupled to the luminaire and having a defined relationship between a field of view of the camera and the optical axis; a memory configured to store lighting information comprising a desired aiming location of the luminaire within the area to be illuminated, and the memory further configured to store feature information comprising an expected location of at least one feature within the area to be illuminated, which is different to the desired aiming location; and a processor configured to determine and output aiming evaluation information based on the feature information, lighting information and image to enable a determination of whether the luminaire is correctly aimed. In such embodiments the luminaire can be correctly aimed by the use of an image to determine the luminaire's current aim and comparing the current aim with a stored lighting plan containing information as to the luminaire's desired aim.

[0008] The defined relationship between the field of view of the camera and the optical axis may be at least one of: a known orientation offset between the field of view of the camera and the optical axis; the field of view of the camera including the optical axis; the field of view of the camera centred on the intersection of the optical axis and a surface. In such embodiments the camera can be aligned with the optical axis and therefore 'see' the aiming point of the luminaire.
Or in some situations the camera can be offset at a known orientation relative to the luminaire optical axis in order that the camera is able to apply a narrow field of view to capture an image containing a known feature which is located off the luminaire's optical axis but which can be used to aim the luminaire.

[0009] The processor configured to determine and output aiming evaluation information may be configured to generate an image position based on a mapping of the distance, within a field of view of the camera, between the desired aiming location of the luminaire, from a point of view of the camera, and the expected location of at least one feature, from a point of view of the camera, and configured to generate a graphical indicator at the image position to be applied to the image to indicate the expected location of the at least one feature within the image. In such embodiments the user or installer can see the difference between the luminaire's current aim and the expected aim by comparing the position of the feature within the image with the graphical indicator.

[0010] The processor configured to determine and output aiming evaluation information may be configured to analyse the image to determine the at least one feature within the image, to determine the feature position within the image, and to analyse the feature position within the image using the lighting information and feature information to determine and output the aiming evaluation information comprising an indicator as to whether the luminaire is correctly aimed. In such a manner the processor can identify features within the image and then use these determined features as reference points. In such embodiments the processor is able to determine the geometry behind the aiming of the luminaire and therefore to determine the location of the luminaire and/or the orientation of the luminaire by comparing the image feature locations with the expected locations given the location and/or orientation of the luminaire.

[0011] The aiming evaluation information
may further comprise a luminaire orientation adjustment signal based on the analysis of the feature position within the image using the lighting information and feature information.

[0012] The lighting aiming system may further comprise an electronically controllable motor configured to receive the luminaire orientation adjustment signal and to actuate the luminaire based on the luminaire orientation adjustment signal such that the difference between the desired aiming position and an aiming position based on the mounting location and orientation of the luminaire is reduced. In such situations the luminaires can be aimed and changed in their aim multiple times without the need to employ expensive high level working equipment.

[0013] The lighting aiming system may further comprise a tilt sensor coupled to the luminaire, wherein the processor may be configured to determine whether the luminaire is correctly aimed in a first, around a horizontal axis, orientation based on analysing the tilt sensor output using the lighting information; and to determine whether the luminaire is correctly aimed in a second, around a vertical axis, orientation based on the aiming evaluation information. In such embodiments the tilt
aiming determination and correction can be performed using the tilt sensors and significantly reduce the complexity of the pan aiming determination and correction using the camera and 'computer vision' apparatus and methods as discussed herein.

[0014] The lighting aiming system may further comprise a distance or range sensor coupled to the luminaire, wherein the processor may be configured to determine whether the luminaire is correctly aimed in a first, around a horizontal axis, orientation based on analysing the distance or range sensor output using the lighting information; and to determine whether the luminaire is correctly aimed in a second, around a vertical axis, orientation based on the aiming evaluation information. In such embodiments the tilt aiming determination and correction can be performed using the range or distance sensors and significantly reduce the complexity of the pan aiming determination and correction using the camera and 'computer vision' apparatus and methods as discussed herein.

[0015] The lighting aiming system may further comprise a display, wherein the display may be configured to receive and display a visual representation of information relating to the aiming evaluation information. In such a manner the user can be provided an indicator as to how to adjust the luminaire and so correctly aim it.

[0016] The display may be an augmented reality display showing a representation of the luminaire aiming location from the viewpoint of the display, the luminaire aiming location being based on the aiming evaluation information.

[0017] The at least one feature may be at least one of: at least one beacon located at determined location(s), and wherein the feature information comprises the expected location(s) of the at least one beacon; and an intrinsic 2D or 3D feature located at a defined location, and wherein the feature information comprises the location of the 2D or 3D feature.
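The beacon-based check described above can be sketched in a few lines: the expected image position of a beacon is compared against the position at which the beacon is actually detected, and the pixel offset indicates the aiming error. The function names and the tolerance value below are illustrative assumptions, not anything specified in the patent.

```python
# Sketch of the beacon comparison: expected vs detected image position.
# All names and the 5-pixel tolerance are illustrative assumptions.

def aiming_error(expected_px, detected_px):
    """Pixel offset between where a beacon should appear and where it does."""
    dx = detected_px[0] - expected_px[0]
    dy = detected_px[1] - expected_px[1]
    return dx, dy

def correctly_aimed(expected_px, detected_px, tolerance_px=5.0):
    """True when the beacon offset is within the allowed pixel tolerance."""
    dx, dy = aiming_error(expected_px, detected_px)
    return (dx * dx + dy * dy) ** 0.5 <= tolerance_px

# Beacon expected at the centre of a 1920x1080 frame, detected 12 pixels
# to the right: the luminaire is not correctly aimed.
error = aiming_error((960, 540), (972, 540))      # (12, 0)
aimed = correctly_aimed((960, 540), (972, 540))   # False
```

In a full system the detected position would come from the image analysis of paragraph [0010] and the expected position from the stored feature information.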
[0018] In such embodiments the memory is configured to store information indicating an expected image with beacon position and use this to compare with the actual image captured by the camera within which the actual beacon is positioned. The difference between the expected image beacon position and the actual image beacon position may then in some embodiments be used to determine an aiming error and correction. For example the intrinsic feature may be a stadium stand arrangement, a goal post, a centre circle or centre spot mark on the field. In such a manner no additional markers or beacons are required to be placed at locations on the field.

[0019] According to a second aspect there is provided a method of aiming a large area lighting system, the large area lighting system for lighting an area to be illuminated, the method comprising: locating a luminaire in a mounting location with an orientation such that the luminaire is configured to generate a light beam along an optical axis; coupling a camera to the luminaire with a defined relationship between a field of view of the camera and the optical axis; capturing an image with the camera; storing lighting information comprising a desired aiming position of the luminaire within the area to be illuminated; storing feature information comprising an expected location of at least one feature within the area to be illuminated, which is different to the desired aiming location; determining aiming evaluation information based on the feature information, the lighting information and the image; and outputting the aiming evaluation information to enable a determination of whether the luminaire is correctly aimed.

[0020] The defined relationship between a field of view of the camera and the optical axis may be at least one of: a known orientation offset between the field of view of the camera and the optical axis; the field of view of the camera including the optical axis; the field of view of the camera centred on the intersection
of the optical axis and a surface. In such embodiments the camera can be aligned with the optical axis and therefore 'see' the aiming point of the luminaire. Or in some situations the camera field of view may be set at a known orientation relative to the luminaire optical axis in order that the camera is able to apply a narrow field of view to capture an image containing a known feature which is located off the luminaire's optical axis but which can be used to aim the luminaire.

[0021] Determining aiming evaluation information may comprise: generating an image position based on a mapping of the distance, within a field of view of the camera, between the desired aiming location of the luminaire, from a point of view of the camera, and the expected location of at least one feature, from a point of view of the camera; generating a graphical indicator at the image position; and applying to the image the graphical indicator at the image position to indicate the expected position of the at least one feature within the image.

[0022] Determining the aiming evaluation information may comprise: analysing the image to determine the at least one feature within the image; determining the feature position within the image; analysing the feature position within the image using the lighting information and feature information; and generating an aiming indicator as to whether the luminaire is correctly aimed based on the analysis of the feature position within the image using the lighting information and feature information.

[0023] The method may further comprise generating a luminaire orientation adjustment signal based on the analysis of the feature position within the image using the lighting information and feature information, and outputting aiming evaluation information comprises outputting the luminaire orientation adjustment signal.

[0024] The method may further comprise: receiving at an electronically controllable motor the luminaire orientation adjustment signal; and actuating the luminaire by the electronically controllable motor based on the luminaire orientation adjustment signal such that the difference between the
desired aiming position and an aiming position based on the mounting position and orientation of the luminaire is reduced.

[0025] The method may comprise: determining whether the luminaire is correctly aimed in a first, around a horizontal axis, orientation based on analysing a tilt sensor output using the lighting information; and determining whether the luminaire is correctly aimed in a second, around a vertical axis, orientation based on the aiming evaluation information.

[0026] The method may comprise: determining whether the luminaire is correctly aimed in a first, around a horizontal axis, orientation based on analysing a range or distance sensor output using the lighting information; and determining whether the luminaire is correctly aimed in a second, around a vertical axis, orientation based on the aiming evaluation information.

[0027] The method may comprise receiving and displaying a visual representation of information relating to the aiming evaluation information on a display unit.

[0028] Displaying a visual representation of information may comprise displaying via an augmented reality display a representation of the luminaire aiming position from the viewpoint of the display, the luminaire aiming position being based on the aiming evaluation information.

[0029] The at least one feature may be at least one of: at least one beacon located at determined location(s), and wherein the feature information comprises the expected location(s) of the at least one beacon; and an intrinsic 2D or 3D feature located at a defined location, and wherein the feature information comprises the location of the 2D or 3D feature.

BRIEF DESCRIPTION OF THE DRAWINGS

[0030] Examples of the invention will now be described in detail with reference to the accompanying drawings, in which:

[0031] FIG. 1 shows an example lighting aiming system according to some embodiments;

[0032] FIG. 2 shows an example lighting aiming system in operation according to a first set of embodiments;

[0033] FIG.
3 shows an example lighting aiming system in operation according to a second set of embodiments;

[0034] FIG. 4 shows an example lighting aiming system in operation determining at least one 2D and one 3D example feature within a stadium according to some embodiments;

[0035] FIG. 5 shows an example lighting aiming system using an additional tilt sensor according to some embodiments;

[0036] FIG. 6 shows an example lighting aiming system using an additional distance or range sensor according to some embodiments; and

[0037] FIG. 7 shows a flow diagram of the operation of the lighting aiming system according to some embodiments.

DETAILED DESCRIPTION OF THE EMBODIMENTS

[0038] The concepts as described with respect to the embodiments herein are for a lighting aiming system, in particular a lighting aiming system to enable the direction of flood light luminaires to improve workflow efficiency. The lighting aiming system uses a camera to determine the correct aiming of a luminaire based on extracted visual landmarks or features positioned in the area to be illuminated (either introduced or intrinsically located within the area). An alignment error can furthermore be derived by analysis of the determined features (measured orientation) compared with the required orientation according to the light plan. The installer can in some embodiments as described herein then be guided by suitable audio or visual means to reduce the measured misalignment and so direct the luminaire. A similar approach can also be applied to luminaires with motorized actuators. Although the following examples have been described in particular with respect to stadia or arenas such as football stadia, it would be understood that the apparatus and methods described herein could be applied to various large scale lighting applications, such as facade lighting, shop floor lighting, parking lot lighting, or even large areas in which it is not possible to place markers on the target locations, such as lighting for swimming pool arenas, for example.
The lighting system designer typically uses a virtual plane to design the light plan. In football stadiums the virtual plane is usually taken to be the football pitch. However, in swimming pools the virtual plane is typically the water level of the pool. In an empty pool it is difficult to place markers at the target location on the virtual plane, and in a filled pool the markers would be floating (and moving). Similarly the apparatus and methods described herein can be applied to lighting situations where the lighting plane is uneven, for example in velodromes where the cycling floor is banked and makes a steep angle.

[0039] With respect to FIG. 1 an example lighting aiming system is shown. The lighting aiming system shown in FIG. 1 comprises a luminaire 5, which can be any suitable luminaire type and configuration. The luminaire 5 can as described herein be mounted within the stadium via a suitable mounting point or jig. The mounting point or jig can be configured to be adjustable in order to change the orientation of the luminaire. This adjustment in some embodiments is a two dimensional adjustment. Suitable adjustments can be any two or three of the following: an orientation around a horizontal axis (a tilt adjustment), an orientation around a vertical axis (a pan adjustment), and an orientation around the optical axis of the luminaire (a roll adjustment).

[0040] The lighting aiming system can comprise a camera 1. The camera 1 can in some embodiments be coupled to the luminaire 5. The camera in some embodiments is detachably coupled to the luminaire 5 on a fixing jig.
In some embodiments the camera 1 can be permanently coupled or mounted on the luminaire 5 or form part of an integral luminaire assembly comprising luminaire and camera. The camera 1 is coupled to the luminaire in such a manner that there is a defined relationship between a field of view of the camera and the optical axis. For example the defined relationship between the field of view of the camera and the optical axis may be that the field of view of the camera includes the optical axis of the luminaire, in other words that there is a small but known orientation offset between the field of view of the camera and the optical axis. Preferably the field of view of the camera is such that the centre of an image captured by the camera is the intersection of the optical axis of the luminaire with the pitch or stadium surface; in other words preferably the centre of the image captured by the camera is the aiming spot or point of the luminaire. However it would be understood that the defined relationship may be a known orientation offset between the field of view of the camera and the optical axis. This offset may enable the camera to have a narrow field of view to capture an image containing a known feature which is located off the luminaire's optical axis but which can be used to aim the luminaire in a manner as described herein.

[0041] The camera can be any suitable camera or imaging means configured to capture an image and pass the image to an aiming device 3. For example in some embodiments the camera 1 comprises lenses or optics to enable an adjustment of the field of view of the camera, such as a zooming operation, such that the camera is configured to capture a first image or set of images at a first zoom level with a wider field of view to enable coarse aiming, and a second image or set of images at a second zoom level with a narrower field of view to enable fine aiming.
In the application herein the term camera should be interpreted as any image capturing apparatus, including both passive and active imaging examples such as lidar devices and infra-red cameras, and should not be limited to visual wavelength cameras.

[0042] The lighting aiming system in some embodiments further comprises an aiming apparatus 3. The aiming apparatus 3 in some embodiments is configured to receive the image or image data from the camera 1 and based on the images determine whether the luminaire is correctly aimed. In some embodiments the aiming apparatus is implemented by a suitable computer or processing apparatus configured to receive (and transmit) data wirelessly or using wired or cable connections. The computer or processing device in some embodiments is a portable or mobile apparatus suitable for being carried by the installer or the luminaire system.

[0043] The aiming apparatus 3 in some embodiments therefore comprises at least one processor 31 configured to process image data received from the camera 1. The aiming apparatus 3 furthermore comprises at least one memory 33. The memory in some embodiments comprises a portion of memory allocated to storing program code or data to be run or executed on the processor 31, such as the feature determination, feature position determination, and aiming determination operations described herein. Furthermore the memory 33 in some embodiments comprises a portion of memory allocated to storing data to be processed. For example in some embodiments the memory 33 is configured to store a lighting plan or information based on the lighting plan to enable the processor 31 to determine whether the luminaire is correctly aimed. In some embodiments the memory 33 comprises information based on desired or 'aiming' orientations of the luminaire, the desired or 'aiming' location on the pitch, or any suitable information based on an 'aiming' requirement.
This information can for example be determined or found from the lighting plan or table of luminaires installed or to be installed in the stadium: the type of luminaire, the mounting or placement location of the luminaires (relative to a known datum such as the centre point or spot of the stadium), the desired orientation of the luminaire, and the desired aiming point of the luminaire (relative to the known datum). Furthermore as discussed herein in some embodiments the information based on a desired aiming location of the luminaire can be in the form of simulated or pre-determined generated images from the viewpoint of a camera (luminaire) mounted at the desired location and orientation and aimed at the desired aiming location or spot. The memory 33 in some embodiments may comprise information based on any features to be used in the aiming operations. As will be described hereafter the features can be intrinsic to the architecture or structure within which the lighting effect is being generated, for example features within the stadium. In some embodiments the features are aiming or target features to be located at known or determined locations. Thus for example any parameters associated with the feature (aiming, target or intrinsic) are stored in the memory 33. These can for example be the location, the shape, size, colour or light pattern of the feature.

[0044] The memory 33 can be any suitable memory such as semiconductor memory, and in some embodiments the memory 33 comprises at least one part volatile memory and one part non-volatile memory. In some embodiments at least part of the memory is located separate from the aiming apparatus 3. For example in some embodiments a part of the memory, such as the memory comprising the lighting plan or information based on the lighting plan, is located on a server remote from the aiming apparatus 3 and can be retrieved by the processor 31 via a suitable data connection.
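The per-luminaire lighting-plan entries and feature parameters described above might, as a rough sketch, be represented by records such as the following. All field names and example values are illustrative assumptions for this sketch; the patent does not specify a data format.

```python
# Illustrative records for the lighting plan and feature information held
# in the memory 33. Field names and values are assumptions, not from the
# patent.
from dataclasses import dataclass

@dataclass
class LuminaireRecord:
    luminaire_type: str
    mounting_location: tuple   # (x, y, z) in metres, relative to a known datum
    orientation: tuple         # desired (pan, tilt, roll) in degrees
    aiming_location: tuple     # desired (x, y) aiming point on the pitch

@dataclass
class FeatureRecord:
    name: str                  # e.g. "centre spot" or "beacon 3"
    location: tuple            # expected (x, y, z) location in metres
    shape: str = "spot"        # shape, size, colour etc. as stored parameters
    colour: str = "white"

# A one-luminaire plan aimed 10 m from the centre spot, and one intrinsic
# feature (the centre spot itself at the datum).
plan = [LuminaireRecord("ArenaVision LED", (50.0, -5.0, 30.0),
                        (12.0, -40.0, 0.0), (10.0, 0.0))]
features = [FeatureRecord("centre spot", (0.0, 0.0, 0.0))]
```

Such records could equally live on a remote server, as paragraph [0044] notes, and be fetched by the processor over a suitable data connection.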
[0045] The processor 31 can therefore be configured to receive image data from the camera 1. In some embodiments the processor is further configured to receive the lighting information and the feature information. The processor 31 can then analyse the image data, the lighting information and the feature information and determine and output aiming evaluation information based on the analysis to enable a determination of whether the luminaire is correctly aimed. For example in some embodiments the processor can be configured to generate an image position (or an image offset from a defined position in an image) based on a mapping of the distance between the desired aiming location of the luminaire and the expected location of at least one feature with respect to the image viewpoint. Having determined an image position, the processor can then generate a graphical indicator at the image position to be applied to the image to indicate the expected position of the at least one feature within the image. Although it would be understood that within the following description the terms location and position are interchangeable, for clarity reasons the term location is used with respect to a physical location or position, such as the aiming location or the location of the feature within the stadium, and the term position is used with respect to a camera image or information within the image or based on the image. When the graphical indicator is aligned with the feature then the luminaire is correctly aimed. Furthermore the installer or user can, when the graphical indicator does not align with the feature in the image, attempt to adjust the luminaire to align the graphical indicator with the feature on the image.

[0046] In some embodiments the processor 31 can furthermore be configured to perform image processing and analyse the image to determine the at least one feature within the image.
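The mapping from a physical location to an image position described in paragraph [0045] can be illustrated with a minimal pinhole-camera sketch. The pan/tilt conventions, focal length and principal point below are assumptions for illustration; the patent does not prescribe a particular camera model.

```python
import math

# Minimal pinhole sketch: project a world point (z up, metres) into pixel
# coordinates for a camera at `cam`, panned by pan_deg about the vertical
# axis and tilted down by tilt_deg. Intrinsics (f_px, cx, cy) are assumed.

def project_point(world, cam, pan_deg, tilt_deg, f_px=1000.0, cx=960.0, cy=540.0):
    p, t = math.radians(pan_deg), math.radians(tilt_deg)
    # Camera axes derived from pan and tilt.
    fwd = (math.cos(t) * math.cos(p), math.cos(t) * math.sin(p), -math.sin(t))
    right = (-math.sin(p), math.cos(p), 0.0)
    up = (math.sin(t) * math.cos(p), math.sin(t) * math.sin(p), math.cos(t))
    # Vector from camera to the world point.
    d = (world[0] - cam[0], world[1] - cam[1], world[2] - cam[2])
    dot = lambda a, b: a[0] * b[0] + a[1] * b[1] + a[2] * b[2]
    depth = dot(d, fwd)
    if depth <= 0.0:
        raise ValueError("point is behind the camera")
    u = cx + f_px * dot(d, right) / depth
    v = cy - f_px * dot(d, up) / depth
    return u, v

# A camera 30 m up, tilted 45 degrees down towards its aiming point 30 m
# away on the pitch: the aiming point projects to the image centre, so a
# graphical indicator for a feature at that location would be drawn there.
u, v = project_point((30.0, 0.0, 0.0), (0.0, 0.0, 30.0), 0.0, 45.0)
```

An expected feature location projected this way gives the image position at which the graphical indicator of paragraph [0045] would be applied.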
As described hereafter the feature can for example be a visual marker or beacon placed upon the field, or can be determined based on the intrinsic structure or configuration of the stadium or pitch.

[0047] The processor 31 furthermore is configured to determine the position of the feature(s) within the image. Furthermore the position is defined in some embodiments relative to a known point or locus. For example in some embodiments the position of the feature(s) is determined relative to the centre point of the image; however it would be understood that in some embodiments the position of the feature(s) is determined relative to one of the corners or edges of the image.

[0048] The processor 31, having determined the position of the feature(s), can then be configured to determine or analyse the position of the feature or features within the image using the lighting information and the feature information. The processor 31 can use this analysis to generate and output aiming evaluation information to enable a determination of whether or not the luminaire is correctly aimed. For example the processor 31 can determine the image position (or an image offset from a defined position in an image) between the feature in the image and an expected feature position and generate aiming evaluation information or an aiming or orientation adjustment signal further based on the analysis.

[0049] In some embodiments the aiming apparatus further comprises a display unit 35. The display unit can be any suitable display technology suitable for providing a visual or graphical indicator to the installer or user of the aiming apparatus 3. In some embodiments the display unit is a detachable or separate part from the aiming apparatus and configured to receive the aiming evaluation information (such as the orientation adjustment signal) and display a visual representation of information relating to the aiming evaluation information.
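A toy sketch of such a graphical indicator: mark the expected feature position on the captured image with a crosshair so the installer can compare it against where the feature actually appears. The plain list-of-lists image and the marker size are illustrative assumptions; a real implementation would draw on the camera frame.

```python
# Draw a crosshair at the expected feature position in a greyscale image
# represented as a 2-D list of pixel values (illustrative assumption).

def draw_crosshair(image, x, y, size=3, value=255):
    """Mark pixel (x, y) with a crosshair of the given half-width."""
    h, w = len(image), len(image[0])
    for dx in range(-size, size + 1):
        if 0 <= y < h and 0 <= x + dx < w:
            image[y][x + dx] = value
    for dy in range(-size, size + 1):
        if 0 <= y + dy < h and 0 <= x < w:
            image[y + dy][x] = value
    return image

# A small 32x16 'frame' with the expected feature position marked at (10, 8).
frame = [[0] * 32 for _ in range(16)]
draw_crosshair(frame, 10, 8)
```

When the luminaire is correctly aimed, the feature in the camera image lines up with this overlaid marker.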
In some embodiments the display unit 38 can receive the image captured by the camera and ‘overlay the aiming evaluation information suchas the visual ‘or gnphical indicator. In some embostimments the image ea furthermore be processed. For example the determined fea- tures can be visually enance. 10050] In some embodiments the display unit can be Jmplemented as an pplication oF program operating on @ tablet computer, mobile device or mobile phone. In some ‘embodiments the display technology is an aigmented reality ‘display such as for example augmented reality plases 001) In some embodiments the aiming evalation infor- mation can be processed by the apparatus i and output via an audible output apparatus, such asa speaker or headphone The audible output of the aiming evaluation informatio ‘may in some embodiments compliment the visu oF graphi- ‘al ouput and fueter indicate tothe installer or user whether the luminaire is correctly aimed. For example a headset could attempt to produce a spatial audio source signal indicaing Wo the installer oF user a rotation around the vertical axis, Thus a left spatial signal could indicate an adjustment to the left and a right spatial signal indicate an adjustment © the right 0082] In some embodiments the lighting aiming system Ther comprises controllable motor 37 configured 10 receive the orientation adjustment signal. The motor, for ‘example & stepper motor, can be coniigured to move oF fsctuate the luminaite Sto roduce any siming err. In such ubodiments the Inminaire 5 can be mounted on 9 tit and Jul. 20, 2017 ‘pan gimbal which is actuated or ‘controlled motor 37. The controllable motor 37 can ‘more in some embodiments move or actuate the luminsice ia ‘roll orientation. Although most lumingires have a rotation ‘variant Hight projection with respect tothe optical contre tnd therefore it is not always necessary 10 adapt the ‘mechianeal rl angle, roll movement ean produce an effect {or asymmetric light projection moles. 
In some embodiments the mechanical roll can be measured with a tilt device such as an accelerometer in the plane perpendicular to the optical axis, and an alignment grid overlay, as part of the aiming evaluation information, can be generated and projected on the input image so that the user can detect any misalignment and correct the roll angle accordingly.

[0053] In some embodiments the lighting aiming system further comprises additional sensors 39. For example in some embodiments the lighting aiming system comprises a tilt sensor coupled to the luminaire 5 and configured to determine a horizontal axis (tilt) orientation value which can be passed to the processor 31. In such embodiments the processor 31 can be configured to receive the tilt sensor output and determine whether the luminaire is correctly aimed around a horizontal axis orientation by analysing the tilt sensor output using the information stored in the memory 33, such as comparing the measured tilt angle and the desired tilt angle for the luminaire. Furthermore in such embodiments the processor 31 can be configured to determine whether the luminaire is correctly aimed around a vertical axis orientation (a pan orientation) based on the analysis of the image and the information stored in the memory 33 as discussed herein. In such a manner the image analysis as discussed herein is simplified, as the tilt orientation is determined separately and does not need to be determined from the analysis of the image using the information stored in the memory. In some embodiments the tilt sensor can be used to check or calibrate the determination of whether the aiming of the luminaire is correct based on the analysis of the image as discussed herein.
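The tilt-sensor comparison of paragraph [0053] reduces to comparing a measured angle against the desired angle from the lighting plan. The sketch below is illustrative only; the tolerance value and function names are assumptions, not from the patent.

```python
# Sketch of paragraph [0053]: compare the tilt sensor's measured angle
# against the desired tilt stored in memory and emit a correction.

def tilt_adjustment(measured_deg, desired_deg, tolerance_deg=0.5):
    """Return 0.0 if the luminaire is correctly aimed within tolerance,
    otherwise the signed tilt correction (degrees) to apply."""
    error = desired_deg - measured_deg
    return 0.0 if abs(error) <= tolerance_deg else error

tilt_adjustment(33.0, 35.0)    # 2.0: tilt up by two degrees
tilt_adjustment(34.75, 35.0)   # 0.0: within tolerance, correctly aimed
```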
Furthermore in some embodiments the use of the tilt sensor can be combined with the analysis of the image as discussed herein to more accurately determine the aiming of the luminaire.

[0054] Similarly, in some embodiments the lighting system further comprises a range or distance sensor mounted on the luminaire 5 and configured to determine a distance or range from the luminaire to the aiming position or point. In such embodiments the processor 31 can receive the output of the distance or range sensor and analyse this based on the information in the memory 33 in order to determine the horizontal or tilt orientation. In such embodiments the processor 31 can be configured to receive the range or distance sensor output and determine whether the luminaire is correctly aimed around a horizontal axis orientation by analysing the range or distance sensor output using the information stored in the memory 33, such as comparing the measured distance or range against an expected distance between the luminaire and the aiming position or spot. Furthermore in such embodiments the processor 31 can be configured to determine whether the luminaire is correctly aimed around a vertical axis orientation (a pan orientation) based on the analysis of the image and the information stored in the memory 33 as discussed herein. In such a manner the image analysis as discussed herein is simplified, as the tilt orientation is determined separately and does not need to be determined from the analysis of the image using the information stored in the memory. In some embodiments the range or distance sensor can be used to check or calibrate the determination of whether the aiming of the luminaire is correct based on the analysis of the image as discussed herein. Furthermore in some embodiments the use of the range or distance sensor can be combined with the analysis of the image as discussed herein to more accurately determine the aiming of the luminaire.
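The range-sensor check of paragraph [0054] can be sketched under a simple assumed geometry (not stated in the patent): a luminaire at mounting height h aimed at a spot a horizontal distance d away should measure a range of sqrt(h² + d²), so a measured range far from that value indicates a tilt error.

```python
# Illustrative sketch of paragraph [0054]: check tilt via a range sensor
# by comparing the measured range to the geometrically expected range.
import math

def expected_range(mount_height_m, horizontal_dist_m):
    """Straight-line distance from luminaire to aiming spot on the field."""
    return math.hypot(mount_height_m, horizontal_dist_m)

def range_check(measured_m, mount_height_m, horizontal_dist_m, tol_m=0.5):
    """True when the measured range matches the expected range within tol_m."""
    return abs(measured_m - expected_range(mount_height_m, horizontal_dist_m)) <= tol_m

# 30 m mast, aiming spot 40 m out horizontally -> expected 50 m range:
range_check(50.2, 30.0, 40.0)   # True: within 0.5 m of the expected 50 m
```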
[0055] In the following examples the processor generates the aiming evaluation information based on the image processing embodiments described above. However it would be understood that the following apparatus can be configured to generate the aiming evaluation information in the form of an image position applied to the image. In other words, generating and outputting aiming evaluation information by generating an image position based on a mapping of the distance between the desired aiming location of the luminaire and the expected location of at least one feature with respect to the image viewpoint, in a manner as described herein. Once an image position is determined, a graphical indicator is applied to the image at the image position to indicate the expected position of the at least one feature within the image.

[0056] In some embodiments the processor 31 can furthermore be configured to perform image processing and analyse the image to determine the at least one feature within the image.

[0057] With respect to FIG. 2, an example operation of the lighting aiming system is shown according to a first set of embodiments. The stadium is represented in FIG. 2 by a football field 100 and an example luminaire 5 (coupled to a camera 1) having an optical axis or aiming direction 103 which intersects with the football field 100 at an aiming location or point 107 on the surface of the field. The luminaire 5 in this example is located at an a priori known luminaire location (x,y,z: X1,Y1,Z1) but has an unknown or as yet not accurately determined horizontal and vertical orientation.

[0058] Furthermore located on the field is at least one visual marker 101 at a known location. In this example a single visual marker 101 is located at the centre spot of the field (x,y,z: 0,0,0). The visual marker can be any suitable object such as a ball, a reflector, or an active light source or beacon. It would be understood that a beacon could be defined both as an active and a passive beacon.
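The mapping in paragraph [0055], from a world-space distance between the desired aiming location and an expected feature location to an image position for the graphical indicator, can be sketched with a pinhole-camera model. The focal length, pixel pitch, and small-offset geometry below are assumptions for illustration, not parameters from the patent.

```python
# Illustrative pinhole projection for paragraph [0055]: a lateral world
# offset X (metres), seen at depth Z along the optical axis, lands
# f * X / Z from the image centre on the sensor.

def world_offset_to_pixels(lateral_m, depth_m, focal_mm=50.0, pixel_um=5.0):
    """Map a lateral world offset to a pixel offset from the image centre."""
    sensor_mm = focal_mm * lateral_m / depth_m   # offset on the sensor, mm
    return sensor_mm * 1000.0 / pixel_um         # mm -> pixels

# A feature 1 m to the side of the aiming spot, 100 m from the camera:
world_offset_to_pixels(1.0, 100.0)   # -> 100.0 pixels from image centre
```

The graphical indicator of paragraph [0055] would then be drawn at the image centre plus this computed pixel offset.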
Furthermore the terms beacon and visual indicator should not be limited to the human visual range of wavelengths but defined with regard to the camera sensor sensitivity range. In some embodiments different objects may have different characteristics, such as different colours or shapes of markers, or coded (pulsed) light sources, in order to enable the visual markers to be distinguished from each other.

[0059] The camera 1 coupled or mounted to the luminaire 5 can be configured to capture an image as described herein and shown in FIG. 2 as image 110. The image 110, in this example, has a centre which shows the aiming spot or point 107. The camera can in some embodiments pass this image to the processor 31.

[0060] The processor 31 can then be configured to analyse the image 110 to determine a feature (which in this example is the visual marker 101). The determination of the visual indicator can be based on any known parameter such as colour, shape, size or beacon pattern. For example the image can be colour filtered to identify the visual indicator having a specific colour, or the image can be processed to determine a specific shape. Having determined the feature (visual marker 101) within the image 110, the processor can then be configured to determine the visual marker 101 position within the image 110. In the example shown herein the position within the image 110 is determined based on the number of pixel rows (vertical offset) Δr and the number of pixel columns (horizontal offset) Δc from the centre of the image (representing the aiming spot or point 107).

[0061] In some embodiments the processor 31 can be configured to use the offset (the number of rows of pixels and columns of pixels) from the centre of the image to define a vector line 105 originating at the position on the image plane through the optical axis and ending at the marker location in the field.
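The colour filtering mentioned in paragraph [0060] can be sketched as a threshold plus centroid computation. This is a deliberately minimal illustration: the image is a plain nested list of RGB tuples, and the red-marker threshold is an invented assumption, not a value from the patent.

```python
# Sketch of paragraph [0060]: colour-filter an image for a marker of a
# specific colour and return the centroid of the matching pixels.

def find_marker_centroid(image,
                         is_marker=lambda px: px[0] > 200 and px[1] < 80 and px[2] < 80):
    """Return the (row, col) centroid of pixels passing the colour test,
    or None if the marker is not visible in the image."""
    hits = [(r, c) for r, row in enumerate(image)
                   for c, px in enumerate(row) if is_marker(px)]
    if not hits:
        return None
    n = len(hits)
    return (sum(r for r, _ in hits) / n, sum(c for _, c in hits) / n)

# A 3x3 test image with a single red 'marker' pixel at (row=1, col=2):
img = [[(0, 0, 0)] * 3 for _ in range(3)]
img[1][2] = (255, 0, 0)
find_marker_centroid(img)   # -> (1.0, 2.0)
```

The returned centroid is exactly the position that paragraph [0060] then converts into the (Δr, Δc) offset from the image centre.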
This vector 105 can, in some embodiments, be used to derive an estimation of the rotation angles around the row and column directions with respect to the optical axis, or in other words to determine an estimate of the orientation(s) of the luminaire, knowing the relative locations of the luminaire, the feature (or visual marker), and the "distance" between the aiming point and the feature.

[0062] In such situations the processor 31 can furthermore retrieve from the lighting plan, or information based on the lighting plan, the expected or desired orientation(s) of the luminaire. The processor can compare the determined or measured orientation(s) with the expected orientation(s) and therefore determine whether the luminaire is correctly aimed or not. Furthermore from the difference between the determined or measured orientation(s) and the expected orientation(s) the processor can generate a correction or orientation adjustment signal to be passed to the display unit or to the motor as described herein.

[0063] In some embodiments at least some of the geometry calculations can be performed prior to the determination of whether the luminaire is correctly aimed. For example in some embodiments the processor 31 can use the information based on the lighting plan (the expected luminaire location, orientation(s) and potentially the aiming spot with respect to the field), the location of the visual indicator, and knowledge of the camera field of view to pre-determine an expected image offset or position.
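The conversion in paragraphs [0061]–[0062] from pixel offsets (Δr, Δc) to rotation-angle estimates can be sketched with an assumed pinhole model; the focal length and pixel pitch below are illustrative assumptions, not values from the patent.

```python
# Sketch of paragraphs [0061]-[0062]: turn the pixel offsets from the
# image centre into tilt/pan angular errors about the optical axis.
import math

def offsets_to_angles(delta_r, delta_c, focal_mm=50.0, pixel_um=5.0):
    """Return (tilt_error_deg, pan_error_deg) for the given pixel offsets,
    using angle = atan(offset_on_sensor / focal_length)."""
    f_px = focal_mm * 1000.0 / pixel_um      # focal length in pixels
    tilt = math.degrees(math.atan2(delta_r, f_px))
    pan = math.degrees(math.atan2(delta_c, f_px))
    return tilt, pan

tilt, pan = offsets_to_angles(60, 180)
# With a 10000-pixel focal length, 60 px and 180 px offsets correspond to
# roughly 0.34 and 1.03 degrees of tilt and pan error respectively.
```

These angular errors are what the processor would compare against the expected orientation(s) from the lighting plan.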
Furthermore in some embodiments the lighting plan can itself comprise an expected image offset value or position. In such embodiments the processor 31 can then compare the expected image offset values or positions against the determined image offset values or positions in order to determine whether the luminaire is correctly aimed or not. Furthermore the difference between the expected and determined image offsets or positions can be used to generate a correction or orientation adjustment signal to be passed to the display unit or to the motor as described herein.

[0064] In some further embodiments the expected image offset or position can be generated as a simulated image comprising a simulated visual marker at the expected image offset or image position, which can be overlaid with respect to the captured image and presented to the user of the display unit. The displayed simulated visual marker can be used to inform the installer or user of the aiming apparatus, by means of a suitable graphical user interface, of the correct luminaire orientation.

[0065] With respect to FIG. 3 an example of the lighting aiming system is shown according to a second set of embodiments. Whereas in the example shown in FIG. 2 there was a single visual indicator, in this example multiple visual markers (or beacons) at known locations are placed in the field. This is shown with respect to FIG. 3 by the field having a second visual marker 201 located at a second known position (x,y,z: X2,Y2,Z2).

[0066] The processor 31 can receive the image 210. The image 210 captured by the camera 1 shows both of these visual markers 101, 201. The processor can then furthermore determine the 'features' or visual markers 201, 101. The processor 31 can then determine the position of the visual markers with respect to the captured image.
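The expected-versus-measured comparison described above can be sketched as a coarse adjustment signal. Everything here is an illustrative assumption: the tolerance, the hint vocabulary, and especially the sign conventions, since the actual correction direction depends on the camera/luminaire geometry the patent leaves to the lighting plan.

```python
# Sketch of paragraph [0063]: compare the pre-computed expected image
# offset with the measured one and report where the residual lies.

def adjustment_signal(expected_rc, measured_rc, tol_px=5):
    """Return (row_hint, col_hint) describing where the feature sits in
    the image relative to its expected position: 'ok', 'down'/'up',
    'right'/'left'."""
    dr = measured_rc[0] - expected_rc[0]
    dc = measured_rc[1] - expected_rc[1]
    row_hint = 'ok' if abs(dr) <= tol_px else ('down' if dr > 0 else 'up')
    col_hint = 'ok' if abs(dc) <= tol_px else ('right' if dc > 0 else 'left')
    return row_hint, col_hint

# The feature appears 40 pixels left of its expected column:
adjustment_signal((60, 180), (60, 140))   # -> ('ok', 'left')
```

A display unit could render these hints directly, or a motor controller could translate them (with the correct sign convention for the installation) into pan/tilt steps.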
This is once again shown in the image 210, where the first visual marker 101 has a Δr1 pixel rows (vertical) offset and a Δc1 pixel columns (horizontal) offset from the centre of the image (representing the aiming spot or point 107), and the second visual marker 201 has a Δr2 pixel rows (vertical) offset and a Δc2 pixel columns (horizontal) offset from the centre of the image (representing the aiming spot or point 107). In such a manner the processor can then determine the displacement vectors 105, 205. The processor 31 can furthermore analyse these vectors 105, 205 to derive an estimation of the rotation angles around the row and column directions with respect to the optical axis. In other words the processor can determine an estimate of the orientation(s) of the luminaire, knowing the relative locations of the luminaire, the feature (or visual marker), and the 'distance' between the aiming point and the feature's known location. The use of multiple features is advantageous as it permits an average aiming position to be determined, with the ability to reduce any error introduced by a single measurement. Furthermore with sufficient numbers of features or visual indicators determined, the processor 31 can furthermore estimate further parameters associated with the luminaire, such as the luminaire rotation orientation (which is useful for non-symmetrical light patterns) or the camera location within the stadium (in other words the luminaire location within the stadium). Thus for example with two features or visual markers a rotation around the optical axis can be estimated, while in the case of one feature (marker) this roll rotation is assumed to be zero and thus neglected. For the luminaire location, 4 features or beacons in the 2D image are required. Where range or distance data is available for the features, only three beacons are required to derive a luminaire location.
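The error-reduction benefit of multiple markers in paragraph [0066] amounts to averaging the per-marker estimates. The sketch below is illustrative; the per-marker angle values are invented numbers, not from the patent.

```python
# Sketch of paragraph [0066]: average the orientation estimates derived
# from several markers to reduce single-measurement error.

def average_estimate(estimates):
    """estimates: list of (tilt_deg, pan_deg) tuples, one per marker.
    Return the component-wise mean."""
    n = len(estimates)
    return (sum(t for t, _ in estimates) / n,
            sum(p for _, p in estimates) / n)

# Two markers giving slightly different single-measurement estimates:
average_estimate([(0.30, 1.00), (0.38, 1.06)])   # approx. (0.34, 1.03)
```

With more markers a least-squares fit could additionally recover the roll angle and luminaire position the paragraph mentions, but the simple mean already illustrates the averaging idea.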
Where the 2D orientation of the plane of the field can also be determined, then only two features in the image are needed in order to estimate or extract the luminaire location. This information can in some embodiments be directly estimated or extracted by, for example, analysis of a 3D image from a 2D laser scanner.

[0067] The processor 31 can in some embodiments use the estimated luminaire location to determine the identity of the luminaire from the image alone. Thus in some embodiments the processor need not know the location of the luminaire, but can determine the location based on the image and from this location use the lighting plan, or information based on the lighting plan, to identify the luminaire and then determine from the lighting plan or the information based on the lighting plan the expected associated aiming location and/or expected orientation estimation.

[0068] The processor 31 can then compare the expected aiming location and/or expected orientation(s) against the determined aiming location and/or orientation(s) to determine whether the identified luminaire is correctly aimed. As described previously, this analysis of the feature position within the image using the information based on the lighting plan to determine whether the luminaire is correctly aimed can for example be a comparison of the determined orientation(s), or determined image displacement(s), or simulated image(s) of expected visual markers overlaid on the captured image.

[0069] Although the examples shown with respect to FIGS. 2 and 3 use visual markers placed on known locations of the field, it would be understood that in some embodiments the image can be analysed to determine inherent or intrinsic 2-dimensional or 3-dimensional features. For example FIG. 4 shows an example edge detection filtered image from a camera. Within this image are shown two possible intrinsic features which can be used as reference features with known locations within the stadium. FIG. 4 shows an example 2-dimensional feature in the form of the field marking of the penalty box 305. The markings on the field are made at known locations and with known sizes, and the image comprising these can be used to determine a point on the feature suitable for generating a suitable image displacement vector. Furthermore an example 3-dimensional feature shown in FIG. 4 is a stadium seating lower tier 301 edge. It would be understood that a stadium would have many such features which could be identified with known locations and then determined within an image. For example the intrinsic features could also be the floodlight luminaires themselves, since in the light plan the location of each luminaire is known. Therefore observed luminaires could be used as passive or active landmarks. By activation (static or coded light) of a specific luminaire, a beacon can be created for the observing camera. Based on the location of the luminaire to be directed, a luminaire can be activated as a beacon at a location resulting in better aiming precision. The aiming tool or apparatus in some embodiments could therefore indicate the beacon luminaire resulting in the highest aiming precision. In such embodiments using the stadium lights may result in the optical axes of the luminaire and camera being non-aligned or offset. Capturing images of the stadium lights mounted high above the field as well as the target locations on the field could require a very large field of view. In such embodiments therefore there may be a defined mechanical offset between the optical axis of the luminaire and the field of view of the camera system, in such a way that the luminaires can be observed by the camera regardless of the luminaire target location on the field.
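The coded-light beacon idea in paragraph [0069] can be sketched as matching a known on/off pattern across a stack of frames, which distinguishes the activated luminaire from other steady bright spots. The frame format, pattern, and threshold below are illustrative assumptions, not from the patent.

```python
# Sketch of paragraph [0069]: find the pixel whose brightness over a
# sequence of frames matches a known coded (pulsed) light pattern.

def find_coded_beacon(frames, pattern, threshold=128):
    """frames: list of 2D grayscale images (nested lists), one frame per
    pattern bit.  Return the (row, col) of the pixel whose on/off sequence
    matches the pattern, or None."""
    rows, cols = len(frames[0]), len(frames[0][0])
    for r in range(rows):
        for c in range(cols):
            seq = [1 if f[r][c] >= threshold else 0 for f in frames]
            if seq == pattern:
                return (r, c)
    return None

# Beacon at (0, 1) blinks on-off-on; a steady light at (1, 0) stays on:
frames = [[[0, 255], [255, 0]],
          [[0,   0], [255, 0]],
          [[0, 255], [255, 0]]]
find_coded_beacon(frames, [1, 0, 1])   # -> (0, 1)
```

Note how the steady light is rejected: only the pixel whose temporal signature matches the code is accepted, which is why coded light makes the markers distinguishable from each other and from ambient sources.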
[0070] Subsequently the luminaire can be activated manually by an operator or automatically by the system. It would be understood that in some embodiments, as many of these shapes have defined straight or substantially straight lines, any optical perspective effects can be determined and allowed for.

[0071] In such embodiments the processor 31 can be configured to extract the feature or object with a unique 2-dimensional or 3-dimensional shape according to any known technique or method.

[0072] With respect to FIG. 5 a further example of the lighting aiming system is shown according to a further set of embodiments. In this example the aiming system comprises a tilt sensor configured to determine a tilt angle 41. The tilt sensor can, as described herein, be used as a complementary way to determine the orientation angle of the luminaire. For example the tilt sensor can in some embodiments be used to measure the rotation around the horizontal axis and therefore
