(19) United States
(12) Patent Application Publication — POYNTER
(10) Pub. No.: US 2023/0062315 A1
(43) Pub. Date: Mar. 2, 2023

(54) LIVE VENUE PERFORMANCE SENSOR CAPTURE AND VISUALIZATION OVER GAME NETWORK

(71) Applicant: MSG Entertainment Group, LLC, New York, NY (US)
(72) Inventor: Benjamin POYNTER, New York, NY (US)
(73) Assignee: MSG Entertainment Group, LLC, New York, NY (US)
(21) Appl. No.: 17/461,466
(22) Filed: Aug. 30, 2021

Publication Classification

(51) Int. Cl.: G06T 13/80 (2006.01); G06F 3/01 (2006.01); G06T 13/40 (2006.01); A63F 13/35 (2006.01); A63F 13/285 (2006.01); H04W 4/021 (2006.01)
(52) U.S. Cl.: CPC G06T 13/80 (2013.01); A63F 13/35 (2014.09); A63F 13/285 (2014.09); G06F 3/011 (2013.01); G06F 3/016 (2013.01); G06T 13/40 (2013.01); H04W 4/021 (2013.01)

(57) ABSTRACT

A system and methods for providing interactive content to an audience is disclosed. In aspects, the system implements methods to receive sensor data captured by one or more sensors affixed to a performer. A graphical visualization is rendered based on the sensor data. The graphical visualization is transmitted to one or more external devices associated with one or more members of the audience for incorporation into an application executing on the one or more external devices.

[Drawing sheets 2-5 of 5: FIG. 2A (control flow 200 with sensor data 202, receiver module 204, rendering engine 206, graphical visualization 208, application 210, inputs 212, interaction module 214, graphic 216, haptic feedback 218, control parameters 220); FIG. 2B; FIG. 3 (method steps 302, 304, 306); FIG. 4 (computing system architecture with control unit 402 through communication interface 418, remote devices 420, communication infrastructure 422, network 110).]

LIVE VENUE PERFORMANCE SENSOR CAPTURE AND VISUALIZATION OVER GAME NETWORK

BACKGROUND

[0001] Live performances such as concerts or theatre are typically one-way interactions in which one or more performers present for an audience. In a traditional audience-performer relationship, the interaction between the performer and the audience flows only from the performer to the audience. Even if interactions occur from the audience to the performer, these are typically minimal interactions on the part of the audience. They can include an audience chanting in response to a request by the performer, an audience singing lyrics with a song performed, an audience holding lighters or glow sticks to illuminate a venue, an audience clapping in response to a performance, filling out a questionnaire following the show, etc. For the audience that chooses to witness a live performance, they mostly consume and only seldom participate, which leaves the value of attending a live performance with more to be desired.
BRIEF DESCRIPTION OF THE DRAWINGS

[0002] The patent or application file contains at least one drawing executed in color. Copies of this patent or patent application publication with color drawing(s) will be provided by the Office upon request and payment of the necessary fee.

[0003] The accompanying drawings, which are incorporated herein and form a part of the specification, illustrate aspects of the present disclosure and, together with the description, further serve to explain the principles of the disclosure and to enable a person skilled in the pertinent art to make and use the disclosure.

[0004] FIG. 1 is a venue for providing interactive content to one or more members of an audience, in exemplary aspects of the present disclosure.

[0005] FIG. 2A is a control flow for providing the interactive content, in exemplary aspects of the present disclosure.

[0006] FIG. 2B shows an example of how motion capture data can be generated from sensors affixed to a performer and how graphical visualizations can be generated for an application based on the motion capture data, in exemplary aspects of the present disclosure.

[0007] FIG. 3 is an example method for providing interactive content for incorporation into an application executing on one or more external devices, in exemplary aspects of the present disclosure.

[0008] FIG. 4 is an example architecture of the components that may be used to implement a computing system which provides the interactive content, in exemplary aspects of the present disclosure.

DETAILED DESCRIPTION

[0009] Aspects disclosed herein provide a system and a method for providing interactive content to an audience. The system and the method can merge the notions of a sensor-assisted live performance with interactive gaming to provide one or more members of an audience and the performer with an immersive and highly interactive experience with one another. In aspects, this is done via an interactive application, such as a video game, which incorporates sensor data received from sensors affixed to the performer. In aspects, the received sensor data may be incorporated into the application via an engine capable of generating and/or displaying graphical visualizations. In aspects, the one or more audience members can interact with the graphical visualizations of the live performer as a result of the sensors affixed to the live performer. The interaction can be via a device on which the application is installed. In aspects, the interactions can result in changes to a venue, such as changes to a large display within a venue in which the performer is performing. In aspects, the interactions can also result in feedback being provided to the performer. In aspects, the interactions can also allow one or more of the audience members to control aspects related to the performance via the application. Thus, back-and-forth interactions may be generated between the performer, a venue in which the performance is taking place, and the one or more members of the audience.

[0010] In aspects, the system can perform the aforementioned functionality by implementing methods to receive sensor data captured by one or more sensors affixed to a performer. In aspects, the system can render a graphical visualization based on the sensor data via game engine logic. In aspects, the system can transmit the graphical visualization to one or more external devices associated with one or more members of the audience for incorporation into an application executing on the one or more external devices.
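The receive-render-transmit flow described in the preceding paragraph can be summarized in code. The following Python sketch is illustrative only and is not taken from the disclosure; the names (SensorFrame, render_visualization, broadcast) and the dictionary-based visualization format are assumptions.

```python
from dataclasses import dataclass
from typing import Any, Iterable

@dataclass
class SensorFrame:
    """One reading from a sensor affixed to the performer."""
    sensor_id: str
    kind: str      # "motion", "acoustic", or "biometric"
    payload: dict  # e.g., joint coordinates, frequency, heart rate

def render_visualization(frame: SensorFrame) -> dict:
    """Stand-in for game-engine logic that turns sensor data into a
    graphical visualization (e.g., a rigged-character pose)."""
    return {"type": "rigged_character", "source": frame.kind,
            "state": frame.payload}

def broadcast(visualization: dict, devices: Iterable[Any]) -> None:
    """Send the rendered visualization to each audience device for
    incorporation into the application running on it."""
    for device in devices:
        device.send(visualization)
```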
In aspects, the sensor data captured can include a motion capture data, an acoustic data, or a biometric data. In aspects, the graphical visualization may be a rigged scene, and the rigged scene may be rendered to change based on the sensor data. In the case of a rigged scene, this is in reference to a rigged character model displayed on the graphical visualization. In aspects, the graphical visualization may be a rigged character, and the rigged character may be rendered to mimic movements of the performer based on the sensor data. In aspects, the system can transmit the graphical visualization to the one or more external devices in real-time from when the sensor data is captured. In aspects, the system can receive one or more inputs from the one or more external devices via the application and cause an interaction with the performer or a venue in which the performer is acting based on the one or more inputs. In aspects, the interactions can include generating a graphic for display on a display interface in the venue. In aspects, the interactions can include generating a haptic feedback and transmitting the haptic feedback to the performer. In aspects, the interactions can include enabling the one or more members of the audience to control one or more mechanical elements in the venue via the application.

[0011] The following aspects are described in sufficient detail to enable those skilled in the art to make and use the disclosure. It is to be understood that other aspects are evident based on the present disclosure, and that system, process, or mechanical changes may be made without departing from the scope of an embodiment of the present disclosure.

[0012] In the following description, numerous specific details are given to provide a thorough understanding of the disclosure. However, it will be apparent that the disclosure may be practiced without these specific details. In order to avoid obscuring an embodiment of the present disclosure, some well-known system configurations, architectures, and process steps are not disclosed in detail.

[0013] The drawings showing aspects of the system are semi-diagrammatic, and not to scale. Some of the dimensions are for the clarity of presentation and are shown exaggerated in the drawing figures. Similarly, although the views in the drawings are for ease of description and generally show similar orientations, this depiction in the figures is arbitrary for the most part. Generally, the disclosure may be operated in any orientation.

[0014] The term "module" or "unit" referred to herein may include software, hardware, or a combination thereof in an embodiment of the present disclosure, in accordance with the context in which the term is used. For example, the software may be machine code, firmware, embedded code, or application software. Also for example, the hardware may be circuitry, a processor, a special purpose computer, an integrated circuit, integrated circuit cores, or a combination thereof. Further, if a module or unit is written in the system or apparatus claims section below, the module or unit is deemed to include hardware circuitry for the purposes and the scope of the system or apparatus claims.

[0015] The modules or units in the following description of the aspects may be coupled to one another as described or as shown. The coupling may be direct or indirect, without or with intervening items between coupled modules or units. The coupling may be by physical contact or by communication between modules or units.

System Overview and Function
[0016] FIG. 1 shows a venue 100 for providing interactive content to one or more members of an audience 106, in exemplary aspects of the present disclosure. In aspects, the interactive content can allow the one or more members of the audience 106 to participate in a performance, such as a live performance, via an application 210 (for example, a video game, as will be further described with respect to FIG. 2A). In aspects, and as illustrated in FIG. 1, the venue 100 can represent a location for hosting an event. For example, the venue 100 can represent a music venue, for example, a music theater, a music club, and/or a concert hall. In aspects, the venue 100 may be a sporting venue, for example, an arena, a convention center, and/or a stadium, or any other suitable location in which a performer 114 can conduct a performance. In aspects, the performer 114 may be, for example, a singer, an actor, a musician, a machine, a computer, or any other mechanical device or individual that can conduct the performance in front of the one or more members of the audience 106. In aspects, the event can represent a musical event, a theatrical performance, a sporting event, etc.

[0017] In aspects, a computing system of the venue 100 can enable presentation of the interactive content to the one or more members of the audience 106 within the venue 100. In aspects, the interactive content may be any audio, visual, or sensory content, such as sounds, music, graphics, lights, movies, still or moving images, smells, or combinations thereof, as examples. In aspects, the interactive content can also take the form of the application 210, such as an interactive game, and/or graphical visualizations incorporated into the application 210. In aspects, the application 210 and/or graphical elements or visualizations may be provided to the one or more members of the audience 106 via one or more external devices 108 on which the application 210 is being executed.

[0018] In aspects, the one or more external devices 108 may be, for example, a mobile device, a laptop computer, a desktop computer, a tablet computer, a portable video game console, a virtual reality (VR) headset, an augmented reality (AR) device (such as a licensed AR wearable pair of glasses), or any similar device on which the application 210 and/or graphical elements or visualizations to be incorporated into the application 210 may be played or displayed.

[0019] In aspects, the computing system can include a host server machine 102 to facilitate the transmission of the interactive content to the one or more members of the audience 106. In aspects, the host server machine 102 may be part of a backend computing infrastructure of the venue 100. In aspects, the host server machine 102 may be implemented as any of a variety of centralized or decentralized computing devices. For example, the host server machine 102 may be a laptop computer or a desktop computer, or may be part of a decentralized computing device such as grid-computing resources, a virtualized computing resource, cloud computing resources, routers, switches, a peer-to-peer distributed computing device, or a combination thereof. The host server machine 102 may be centralized in a single room within the venue 100, distributed across different rooms, distributed across different geographic locations, or may be embedded within the network 110.

[0020] The network 110 may be any telecommunications network such as a wireless or wired network. The network 110 may span and represent a variety of telecommunication networks and telecommunication network topologies.
For example, the network 110 can include wireless communication, wired communication, optical communication, ultrasonic communication, or a combination thereof. For example, satellite communication, cellular communication, Bluetooth, Infrared Data Association standard (IrDA), wireless fidelity (WiFi), and worldwide interoperability for microwave access (WiMAX) are examples of wireless communication that may be included in the network 110. Cable, Ethernet, digital subscriber line (DSL), fiber optic lines, fiber to the home (FTTH), and plain old telephone service (POTS) are examples of wired communication that may be included in the network 110. Further, the network 110 may traverse a number of network topologies and distances. For example, the network 110 may include direct connection, personal area network (PAN), local area network (LAN), metropolitan area network (MAN), wide area network (WAN), or a combination thereof.

[0021] In aspects, the network 110 can connect the computing systems or devices of the venue 100 to one or more devices, for example, the host server machine 102 to the one or more external devices 108 via, for example, a LAN. Additionally, the network 110 can connect additional devices of the venue 100 to the host server machine 102 and/or the one or more external devices 108. For example, in aspects, the venue 100 can have a display interface 112, such as a video screen, a television, light emitting diode (LED) displays, etc., that may be part of an audiovisual system of the venue 100, and that may be connected to the host server machine 102 and/or the one or more external devices 108 via the network 110. In aspects, the network 110 can enable all three to interact with one another. In aspects, and as an example, the network 110 can enable the host server machine 102 to transmit the application 210 (e.g., an interactive video game) and/or graphical elements or visualizations to be incorporated into the application 210 to the one or more external devices 108.

[0022] In aspects, the performer 114 can have one or more sensors 116 affixed to the performer 114. In aspects, the one or more sensors 116 can obtain one or more pieces of data that can facilitate and/or affect the generation of the interactive content. For example, the generated interactive content can be graphical visualizations 208 of FIG. 2A that can be displayed on the one or more external devices 108 or the display interface 112. For example, the one or more sensors 116 may be motion capture sensors, microphones, piezoelectric sensors, biometric sensors, VR tracking sensors, or other suitable sensors as recognized by a person of ordinary skill in the art. In aspects, the one or more sensors 116 can generate the one or more pieces of data to include one of a motion capture data, an acoustic data generated by an acoustic instrument, or a biometric data originating from the performer 114 and/or devices or instruments connected to the performer 114. In aspects, the one or more sensors 116 may be affixed to the performer 114. For example, the one or more sensors 116 may be affixed to a suit worn by the performer 114, the body of the performer 114, instruments being played by the performer 114, a microphone into which the performer 114 is singing, etc. In aspects, the one or more sensors 116 can obtain the one or more pieces of data based on the type of sensor and transmit that data to the host server machine 102 for further processing and incorporation into the application 210.
For example, in aspects, the one or more pieces of data may be used to generate portions of the application 210 and/or graphical elements or visualizations to be incorporated into the application 210. How the aforementioned functions are executed will be discussed further below.

[0023] FIG. 2A shows a control flow 200 for providing the interactive content to the one or more members of the audience 106, in exemplary aspects of the present disclosure. In aspects, the control flow 200 can include one or more modules to facilitate providing the interactive content. In aspects, these can include a receiver module 204, a rendering engine 206, and an interaction module 214. While the aforementioned modules are shown as being implemented on the host server machine 102 in FIG. 2A, this is exemplary. In other aspects, some or all of the modules may be implemented on other devices, such as the external device 108 or other devices of the venue 100, using a client-server architecture. For the purposes of discussion, and with respect to FIG. 2A however, it will be assumed that the modules are implemented on the host server machine 102. How these modules interact with other components of the venue 100, the performer 114, and the one or more external devices 108 to provide the interactive content to the one or more members of the audience 106 will be discussed below.

[0024] In aspects, the receiver module 204 can enable receipt of the one or more pieces of data received from the one or more sensors 116 (of FIG. 1). For purposes of discussion with respect to FIG. 2A, the one or more pieces of data will be referred to as sensor data 202. The sensor data 202 is shown in FIG. 2A as elements {202(a), 202(b), ..., 202(n)}. The sensor data 202 can represent different streams of the one or more pieces of data received from the one or more sensors 116. For example, in aspects, sensor data 202(a) can represent motion capture data received from motion sensors affixed to the performer 114 of FIG. 1. In aspects, sensor data 202(b) can represent acoustic data received from a microphone into which the performer 114 is singing. In aspects, sensor data 202(n) can represent biometric data, such as a heartbeat, moisture data measuring whether the performer 114 is sweating and how much, etc., received from biometric sensors affixed to the performer 114.

[0025] In aspects, the receiver module 204 can receive the sensor data 202 via an interface or software that can deliver the sensor data 202 to the receiver module 204. In aspects, the interface may be an application programming interface (API), which can transmit the sensor data 202 to the receiver module 204. In aspects, once the sensor data 202 is received, the receiver module 204 can further transmit the sensor data 202 to a rendering engine 206 for further processing.
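As a rough illustration of the receiver module 204 described in the two preceding paragraphs, the sketch below accepts tagged streams corresponding to sensor data 202(a)-202(n) and forwards each frame onward. The callback-style delivery, stream identifiers, and payload fields are assumptions, not the API of any particular product.

```python
class ReceiverModule:
    """Sketch of receiver module 204: accepts tagged sensor streams
    and forwards each frame to a consumer (standing in for the
    rendering engine 206 of FIG. 2A)."""

    def __init__(self, forward):
        self.forward = forward  # callable, e.g., the rendering engine
        self.latest = {}        # last frame seen per stream

    def on_sensor_data(self, stream_id: str, kind: str, payload: dict) -> None:
        # Invoked by the delivery interface (e.g., an API hook) for each
        # frame arriving from the sensors 116 affixed to the performer.
        frame = {"stream": stream_id, "kind": kind, "payload": payload}
        self.latest[stream_id] = frame
        self.forward(frame)

# Hypothetical streams mirroring sensor data 202(a), 202(b), ..., 202(n):
receiver = ReceiverModule(forward=print)
receiver.on_sensor_data("202a", "motion", {"joint": "elbow_l", "xyz": (0.1, 1.3, 0.4)})
receiver.on_sensor_data("202b", "acoustic", {"freq_hz": 880.0})
receiver.on_sensor_data("202n", "biometric", {"heart_rate_bpm": 128})
```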
[0026] In aspects, the rendering engine 206 may be a software and/or hardware component that enables the transformation of the sensor data 202 to a graphical visualization 208 to be incorporated into an application 210. For example, the rendering engine 206 can transform the sensor data 202 to provide a variety of interactive experiences, with an example being a mass-interactive application which accepts the sensor data 202 and integrates the aforementioned data into a video game played by the one or more members of the audience 106. In aspects, the video game can include one or more genres of games, such as an action game, an adventure game, a fighting game, a platform game, a puzzle game, a racing game, a role-playing game, a rhythm game, a shooter game, a simulation game, a sports game, a strategy game, and/or any other suitable genre of game that solicits direct and/or active participation of the one or more members of the audience 106. In aspects, the graphical visualization 208 may be, for example, a rigged scene within the application 210, a rigged character of the application 210, etc. The rigged scene or the rigged character may be a virtual scene or character from the rendering engine 206, which contains a fully virtual (2-dimensional or 3-dimensional) character or scene model, which accepts visual changes based on the sensor data 202. In aspects, the graphical visualization 208 may be or be part of the interactive content provided to the one or more members of the audience 106 of FIG. 1.

[0027] In aspects, the rendering engine 206 may be part of a game engine or graphics rendering and/or generation engine, which can transform the sensor data 202 to the graphical visualization 208. In aspects, the game engine or graphics rendering engine may be, for example, Unity™, Unreal™, Notch™, TouchDesigner™, or any other suitable game or graphics rendering and/or generation engine that can transform the sensor data 202 to the graphical visualization 208.

[0028] In aspects, the rendering engine 206 can receive the sensor data 202 via an API. In aspects, once received, the rendering engine 206 can transform the sensor data 202 to the graphical visualization 208 via various pre-built libraries of the rendering engine 206. For example, in the example of the sensor data 202 being motion capture data, a person of ordinary skill in the art will recognize that the aforementioned examples of the rendering engine 206 (e.g., Unity™, Unreal™, etc.) can have pre-built libraries that can recognize the motion capture data received and transform the motion capture data to virtual poses. In aspects, the virtual poses can further be used to render a graphical visualization 208 based on the virtual poses to mimic movements captured by the one or more sensors. A person of ordinary skill in the art will also recognize that such libraries may be built programmatically with the rendering engine 206. For the purposes of discussion with respect to FIG. 2A, it is assumed that pre-built libraries will be used. In aspects, the pre-built libraries can be further customized to implement some of the functionality described in this disclosure. A person of ordinary skill in the art will recognize the functionality for which customization is required.

[0029] In aspects, the graphical visualization 208 may be a rigged character. In aspects, the rendering engine 206 can, for example, transform the motion capture data to virtual poses to allow the rigged character to mimic movements based on the motion capture data. For example, if the performer 114 moves his or her leg, the motion capture data can reflect this movement, and the rendering engine 206 can transform that motion capture data to have the rigged character also move its leg in the same or a similar manner.
In the aforementioned aspects, moving any joint would also be recognized and transmitted throughout the venue space.

[0030] In aspects, if the sensor data 202 is acoustic data generated by vibration readings during a live musical performance, the rendering engine 206 can analyze the frequencies of the sounds received and, based on the sounds received, render the graphical visualization 208 to reflect the frequencies. For example, if the performer 114 sings a high-pitch note above a pre-determined frequency, a rigged character can change colors on the display interface 112 and/or on the one or more external devices 108.

[0031] In aspects, if the sensor data 202 is biometric data, the rendering engine 206 can determine if, for example, the performer 114 is sweating and at what rate, or whether the performer 114 has an elevated heartbeat above a pre-determined threshold, and based on the same, change the color of the rigged character on the display interface 112 and/or on the one or more external devices 108. The aforementioned are examples. Other renderings may be made according to the need of the application 210.

[0032] In aspects, if the graphical visualization 208 is a rigged scene, similar renderings as previously discussed with respect to the rigged character may be made, but with respect to the rigged scene. For example, sensor data 202 reflecting certain movements of the performer 114 can result in the rendering engine 206 rendering the rigged scene to change. For example, movements of the performer to the right or the left of a stage can result in a rigged scene changing, for example, from day to night, or, for example, from a forest scene to an urban scene, given the foot positional data of a particular character within the scene. In aspects, other renderings can include changes of background colors of the rigged scene, changes in rigged characters appearing within the rigged scene, etc. The aforementioned are exemplary. The renderings can change and depend on the need of the application 210.

[0033] In aspects, once the rendering engine 206 generates and/or renders the graphical visualization 208, the rendering engine 206 can transmit the graphical visualization 208 to the one or more external devices 108 of FIG. 1 for incorporation into the application 210. In aspects, the application 210 may be executing on the one or more external devices 108. In aspects, the one or more external devices 108 may be associated with the one or more members of the audience 106 and display and/or execute the application 210 for the one or more members of the audience 106. In aspects, the transmission may be via an API. In aspects, the API can interface between the rendering engine 206 and the application 210. In aspects, and as previously indicated, the graphical visualization 208 may be transmitted via the API to be incorporated into the application 210.
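The threshold-style rendering rules of paragraphs [0029]-[0031] above could look roughly like the following. The specific thresholds, field names, and color choices are invented for illustration; the disclosure leaves them to the needs of the application 210.

```python
PITCH_THRESHOLD_HZ = 1000.0  # illustrative "pre-determined frequency"
HEART_RATE_THRESHOLD = 120   # illustrative bpm threshold

def update_visualization(frame: dict, character: dict) -> dict:
    """Apply rendering rules of the kind described above to a rigged
    character's display state. Thresholds and fields are assumptions."""
    kind, payload = frame["kind"], frame["payload"]
    if kind == "motion":
        # Drive the rig so the character mimics the performer's movement.
        character["pose"] = payload  # e.g., joint coordinates
    elif kind == "acoustic" and payload["freq_hz"] > PITCH_THRESHOLD_HZ:
        character["color"] = "gold"  # a high note changes the color
    elif kind == "biometric" and payload["heart_rate_bpm"] > HEART_RATE_THRESHOLD:
        character["color"] = "red"   # an elevated heartbeat changes the color
    return character
```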
[0034] In aspects, the rendering engine 206 can also directly transmit the graphical visualization 208 to the display interface 112 in conjunction with transmitting to the one or more external devices 108. In this way, the graphical visualization 208 can be shared and displayed across multiple devices and/or projected on different displays.

[0035] In aspects, once the graphical visualization 208 is incorporated into the application 210, the one or more members of the audience 106 can interact with the application 210 on each of their respective one or more external devices 108. In aspects, and based on the genre of the application 210, the graphical visualization 208 may be mirrored to each of the one or more external devices 108 for display. For example, if the graphical visualization 208 shows a rigged character running based on the movements of the performer 114, the running may be mirrored to each of the one or more external devices 108 for display. In aspects, based on the display of the graphical visualization 208, each of the one or more members of the audience 106 can provide one or more inputs 212 from their respective one or more external devices 108 into the application 210. In aspects, the one or more inputs 212 can allow the one or more members of the audience 106 to interact with the application 210. In aspects, the one or more inputs 212 can include swipes, inputs via buttons, screen manipulations, taps, voice inputs, textual inputs, etc., used to interact with the application 210. In aspects, the one or more inputs 212 may be used to help the one or more members of the audience 106 individually or collectively achieve a goal in the application 210. For example, the goal may be to assist a rigged character to survive a phase of the application 210, achieve a score, perform a task, or appropriate logic commonly associated with an interactive work.

[0036] In aspects, the application 210 can collect, store, and/or accumulate the one or more inputs 212 as part of the logic of the application 210. In aspects, once received, the application 210 can further process the one or more inputs 212 to generate a parameter or variable that may be used to cause an interaction with the performer 114 and/or the venue 100 of FIG. 1. By way of example, in an exemplary embodiment, if the one or more inputs 212 are an accumulation of taps that result in certain characters in the application 210 being removed from potential to attack, to help a motion-capture-controlled rigged character mimicking the movements of the performer 114 get past a level of the application 210, the logic of the application 210 can determine based on the accumulation of taps that the goal has been achieved and generate a variable or parameter indicating that the goal has been achieved. In aspects, the variable or parameter may be, for example, a character or numerical value, such as a letter "Y," or a string such as "COMPLETE" indicating the goal has been achieved, or a binary value or integer, for example "1," indicating the goal has been achieved. In aspects, the application 210 can transmit the variable or parameter back to the host server machine 102, which can process the variable or parameter using the interaction module 214 to cause the interaction with the performer 114 and/or the venue 100.
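A minimal sketch of the tap-accumulation logic in the preceding paragraph appears below; the tap target and the returned "COMPLETE" string are illustrative stand-ins for whatever variable or parameter the application 210 actually emits.

```python
from typing import Optional

class GoalTracker:
    """Sketch of application-side logic: accumulated inputs 212
    (here, taps) yield a goal-achieved parameter for transmission
    back to the host server machine 102."""

    def __init__(self, taps_required: int = 1000):  # target is an assumption
        self.taps_required = taps_required
        self.taps = 0

    def register_input(self, input_event: dict) -> Optional[str]:
        if input_event.get("type") == "tap":
            self.taps += 1
        if self.taps >= self.taps_required:
            # Could equally be "Y" or the integer 1, per the disclosure.
            return "COMPLETE"
        return None
```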
[0037] In aspects, the interaction module 214 can enable causing the interaction with the performer 114 and/or the venue 100. In aspects, the interaction may be customized and take on a variety of forms based on the context of the application 210, the nature of the performance, the desired effect of the interaction, etc. A designer of the computing system can determine the customization for the interaction. For example, in aspects, the interaction can include generating a graphic 216 for display on the display interface 112 of FIG. 1 of the venue 100. In aspects, the interaction can include generating a haptic feedback 218 to be transmitted to the performer 114. In aspects, the interaction can include generating a control parameter 220 that can enable the one or more members of the audience 106 to control one or more mechanical elements in the venue 100 via the application 210.

[0038] For example, in aspects where the interaction can include generating the graphic 216, based on the one or more members of the audience 106 achieving a goal while interacting with the application 210, the application 210 can transmit the variable or parameter indicating the goal has been achieved. Based on the same, the interaction module 214 can process the variable or parameter to generate the graphic 216. In aspects, the graphic 216 may be displayed on the display interface 112. In aspects, the graphic 216 can indicate that the goal was achieved, can indicate the one or more members of the audience 106 that achieved the highest score to achieve the goal, can show a rigged character show up, can trigger a special visualization to appear on respective displays, and more contextualized logic befitting the interactive application 210, etc. The aforementioned are exemplary, and any graphic 216 or visual read by the rendering engine 206 may be generated based on the need of the application 210 and/or the experience desired for the performance.

[0039] In aspects, where the interaction includes generating a haptic feedback 218 to be transmitted to the performer 114, the application 210 can transmit the variable or parameter indicating the goal has been achieved, and based on the same, the interaction module 214 can process the variable or parameter to generate the haptic feedback 218. In aspects, the haptic feedback 218 can include a vibration or a series of vibrations that may be transmitted to the performer 114, either to a suit worn by the performer, directly to the body of the performer 114, or to an instrument being played by the performer 114. In aspects, the haptic feedback 218 can indicate to the performer 114 that the goal during mass-interaction with an application 210 was achieved and allow the performer 114 to take further actions, such as singing a certain song based on receiving the haptic feedback 218, executing input to the rhythm of a color change, or advancing to the next phase of a performance-based interactive experience.
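The preceding paragraphs (and paragraph [0040] below) describe three interaction forms the interaction module 214 can produce. A dispatch sketch, with invented message formats, might look like this:

```python
def handle_goal_parameter(parameter: str, mode: str) -> dict:
    """Sketch of interaction module 214: turn a goal-achieved variable
    or parameter from the application 210 into one of the interactions
    described above. Message formats are assumptions."""
    if parameter != "COMPLETE":
        return {}
    if mode == "graphic":
        # Graphic 216 for the venue's display interface 112.
        return {"target": "display_112", "graphic": "goal_achieved_banner"}
    if mode == "haptic":
        # Haptic feedback 218 routed to the performer's suit or instrument.
        return {"target": "performer_114", "vibration_pattern": [0.2, 0.2, 0.6]}
    if mode == "control":
        # Control parameter 220 granting audience control of venue elements.
        return {"target": "venue_100", "unlock": ["lights", "smoke_machine"]}
    return {}
```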
[0040] In aspects, where the interaction includes generating a control parameter 220 that can enable the one or more members of the audience 106 to control one or more mechanical elements in the venue 100 via the application 210, the application 210 can transmit the variable or parameter indicating a goal shared by the one or more members of the audience 106 within a venue 100 has been achieved, and the interaction module 214 can process the variable or parameter to generate the control parameter 220. In aspects, the control parameter 220 can take the form of a signal that can activate one or more mechanical elements in the venue 100. In aspects, the mechanical elements can include robots, lights, smoke machines, or other mechanical apparatuses that may be used as part of a musical or theatrical performance. In aspects, the control parameter 220 can enable activation of robots, lights, smoke machines, etc. In aspects, the control parameter 220 can provide a link to the one or more members of the audience 106 and allow the one or more members of the audience 106, or a subset thereof, to directly control the one or more mechanical elements in the venue 100 via the application 210. In an exemplary embodiment, the control parameter 220 can do so by, for example, receiving a variable or parameter from the application 210 indicating which of the one or more members of the audience 106 contributed most to achieving the shared goal, and based on the same, provide a link to the one or more members of the audience 106 identified to allow only those members of the audience 106 to control the mechanical elements.

[0041] In aspects, in accordance with the nature of the performance, it is desirable to perform the aforementioned functions with respect to FIG. 2A in real-time. For example, in aspects, when the live performance is in real-time, the transmission of the graphical visualization 208 to the one or more external devices 108 may be done within milliseconds or seconds from when the sensor data 202 is captured. Thus, the rendering of the graphical visualization 208 should be achieved within milliseconds or seconds from when the sensor data 202 is received so that it may be quickly redirected to the one or more external devices 108 in order for the one or more members of the audience 106 to be able to provide the one or more inputs 212. This can allow for an interactive and immersive experience in which the one or more members of the audience 106 can interact with the performer 114 within milliseconds or seconds during the live performance.

[0042] It has been discovered that the system and methods described above significantly improve the state of the art over conventional systems. This is because the system and methods provide a novel way to integrate a performance with interactive gaming to provide one or more members of the audience 106, the performer 114, and a venue 100 an immersive and interactive experience with one another. This is achieved by integrating data obtained from the performance (e.g., the sensor data 202) and using the obtained data to generate one or more aspects of an application 210, for example by generating a graphical visualization 208, to obtain feedback from players of the game (e.g., the one or more members of the audience 106) to cause an interaction between the one or more members of the audience 106, the venue 100, and the performer 114.

[0043] It has been further discovered that implementing the aforementioned system to function in real-time, by, for example, transmitting the graphical visualization 208 to the one or more external devices 108 within milliseconds or seconds from when the sensor data 202 is captured, improves the state of the art because it provides for a system in which one or more members of the audience 106 can be presented with real-time interaction via an application 210, such as a gaming-based application, on which they can act. This ability increases engagement by the one or more members of the audience 106 with the performer 114 because it allows for a novel way for individuals to interact with a performer 114 rather than passively receiving a performance.
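The real-time constraint of paragraph [0041] can be made concrete with a per-frame latency check; the 50 ms budget below is an assumption standing in for "milliseconds or seconds."

```python
import time

FRAME_BUDGET_S = 0.050  # assumed end-to-end budget per sensor frame

def process_frame(frame: dict, render, broadcast) -> None:
    """Render one sensor frame and redistribute it, flagging frames
    whose render-and-transmit latency exceeds the assumed budget."""
    start = time.monotonic()
    broadcast(render(frame))
    elapsed = time.monotonic() - start
    if elapsed > FRAME_BUDGET_S:
        print(f"frame over budget: {elapsed * 1000:.1f} ms")
```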
[0044] The modules described with respect to FIG. 2A may be implemented as instructions stored on a non-transitory computer readable medium to be executed by one or more computing units such as a processor, a special purpose computer, an integrated circuit, integrated circuit cores, or a combination thereof. The non-transitory computer readable medium may be implemented with any number of memory units, such as a volatile memory, a nonvolatile memory, an internal memory, an external memory, or a combination thereof. The non-transitory computer readable medium may be integrated as a part of the computing system of the venue 100 (e.g., the host server machine 102) or installed as a removable portion of the computing system of the venue 100.

[0045] FIG. 2B shows an example of how motion capture data can be generated from the one or more sensors affixed to a performer 114, and how graphical visualizations can be generated for an application 210 based on the motion capture data, in exemplary aspects of the present disclosure. While FIG. 2B shows how graphical visualizations can be generated based on one type of sensor data 202 (i.e., the motion capture data), similar principles apply to other types of data captured by sensors, such as the biometric data and acoustic data. A person of ordinary skill in the art will recognize how the sensor data 202 can be transformed into graphical visualizations for these other types of data based on the disclosures herein.

[0046] Continuing with the example, FIG. 2B shows the performer 114 with the one or more sensors 222 affixed to his or her body. The one or more sensors 222 of FIG. 2B can be the same as those shown with respect to FIG. 1 (i.e., the one or more sensors 116). In FIG. 2B, each of the one or more sensors 222 is labeled as {222a, 222b, ..., 222n}. In aspects, the one or more sensors 222 can be motion capture sensors, such as inertial sensors, which can capture the skeletal movements of the performer 114. For example, and as shown in FIG. 2B, the one or more sensors 222 can capture skeletal movements such as arm movements, joint movements, head movements, leg movements, and other similar bodily movements. As shown in FIG. 2B, the performer 114 is shown moving his or her arm. This arm movement can be captured by the one or more sensors 222. In aspects, once the movements are captured, motion capture data can be generated representing the movements. For example, coordinate data representing the performer's 114 arm position in a 2D or 3D plane can be generated representing the movements. In aspects, the data generated (e.g., coordinate data) can be transmitted by the one or more sensors 222 and/or electronics coupled to the one or more sensors 222 to the host server machine 102. The electronics can be, for example, communication circuits with antennas that can transmit the data generated.

[0047] Once received, the host server machine 102 can process the motion capture data utilizing the rendering engine 206. Based on the principles described with respect to FIG. 2A, the rendering engine 206 can utilize pre-built and/or customized libraries to transform the motion capture data into the graphical visualizations for display in an application 210 executing on one or more external devices 108, and/or on the display interface 112 in the venue 100. As shown in FIG. 2B, the graphical visualizations are displayed in an application 210. The application 210 of FIG. 2B is shown to be a video game. In aspects, and as shown in FIG. 2B, the graphical visualizations can be a rigged character 224. Based on the motion capture data, the rigged character 224 can mimic movements of the performer 114. For example, if the performer 114 moves his or her arm up or down, the rigged character 224 can also mimic the movements in a mirrored fashion and move its arm up and down in the same manner.
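As a toy version of the FIG. 2B flow just described, the sketch below maps joint coordinate data onto a rigged character so the movements are mirrored. Negating the x-coordinate is an illustrative simplification; a production rig would use the engine's own retargeting and skeleton types.

```python
def mirror_pose(joint_coordinates: dict) -> dict:
    """Map performer joint coordinates (from motion capture sensors 222)
    onto rigged character 224 so the character mirrors the movement."""
    pose = {}
    for joint, (x, y, z) in joint_coordinates.items():
        pose[joint] = (-x, y, z)  # mirror across the vertical axis
    return pose

# Hypothetical frame: the performer raises the right arm.
frame = {"shoulder_r": (0.30, 1.50, 0.00), "elbow_r": (0.45, 1.70, 0.10)}
print(mirror_pose(frame))
```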
[0048] As shown in FIG. 2B, the application 210 is executing on one or more external devices 108. As a result, the one or more members of the audience 106 can also interact with the application 210 via the one or more external devices 108. For example, and as exemplified in FIG. 2B, if the application 210 is a video game, one or more members of the audience 106 can interact with the application 210 via one or more inputs 212. For example, and as shown in FIG. 2B, the one or more inputs 212 can be provided using a button/icon 226 indicating an action to be taken if the one or more members of the audience 106 press the button/icon 226. For example, in FIG. 2B, the button/icon 226 shown can trigger the rigged character 224 to fire a weapon at an enemy 228 as a part of the video game. In this way, the performer 114 can help guide, target, or position the rigged character 224, and the one or more members of the audience 106 can provide the triggering mechanism to utilize a weapon in the video game. In this way, the performer 114 and the one or more members of the audience 106 can interact with one another via the application 210 to achieve a common goal.

Methods of Operation

[0049] FIG. 3 shows an example method 300 for providing interactive content for incorporation into an application 210 executing on one or more external devices 108. In aspects, method 300 may be performed by utilizing one or more computing devices, modules, or units of the venue 100 or of the one or more external devices 108. In aspects, and as shown in step 302, method 300 can have one or more computing devices receive sensor data 202 captured by one or more sensors affixed to a performer 114. In aspects, and as shown in step 304, the one or more computing devices can render a graphical visualization 208 based on the sensor data 202. In aspects, and as shown in step 306, the one or more computing devices can transmit the graphical visualization 208 to one or more external devices 108 associated with one or more members of the audience 106 for incorporation into an application 210 executing on the one or more external devices 108.

[0050] In aspects, the sensor data 202 can include a motion capture data, an acoustic data, or a biometric data, or data collected from an array of hardware which may monitor information as it relates to a performer 114.

[0051] In aspects, the graphical visualization 208 can be a rigged scene generated by the rendering engine 206, which can be rendered to change based on the sensor data 202.

[0052] In aspects, the graphical visualization 208 can be a rigged character, which can be rendered to mirror movements of the performer 114 based on the sensor data 202.

[0053] In aspects, the transmission of the graphical visualization 208 by the one or more computing devices to the one or more external devices 108 can be done in real-time from when the sensor data 202 is captured.

[0054] In aspects, the one or more computing devices can receive one or more inputs 212 from the one or more external devices 108 via the application 210. In aspects, the one or more computing devices can cause an interaction with the performer 114 or a venue 100 in which the performer 114 is performing based on the one or more inputs 212. In aspects, the interaction can be in real-time.
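On the audience side, the button/icon interaction of paragraph [0048] and the input handling of paragraph [0054] reduce to emitting input events toward the host. A sketch with an assumed transport callable and message format:

```python
class ApplicationClient:
    """Sketch of the audience-facing side of application 210 on an
    external device 108; the send callable and message format are
    assumptions."""

    def __init__(self, send):
        self.send = send  # e.g., a network call to host server machine 102

    def on_button_press(self, button_id: str) -> None:
        # E.g., button/icon 226 of FIG. 2B, which triggers the rigged
        # character 224 to fire at enemy 228.
        self.send({"type": "tap", "button": button_id})

# Hypothetical usage:
client = ApplicationClient(send=print)
client.on_button_press("fire_weapon_226")
```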
[0055] In aspects, the one or more computing devices can cause the interaction by generating a graphic 216 for display on a display interface 112 in the venue 100.

[0056] In aspects, the one or more computing devices can cause the interaction by generating a haptic feedback 218 to be transmitted to the performer 114.

[0057] In aspects, the one or more computing devices can cause the interaction by enabling the one or more members of the audience 106 to control one or more mechanical elements in the venue 100 via the application 210. This can be done by, for example, generating a control parameter 220 to enable the control.

[0058] The operations of method 300 can be performed, for example, by the host server machine 102, in accordance with aspects described above.

Components of the Computing System that Provides the Interactive Content

[0059] FIG. 4 shows an example architecture 400 of the components that may be used to implement a computing system which provides the interactive content, in exemplary aspects of the present disclosure. In aspects, the components may be a part of any of the servers (e.g., the host server machine 102) or computers of the venue 100. In aspects, the components can include a control unit 402, a storage unit 406, a communication unit 416, and a user interface 412. The control unit 402 may include a control interface 404. The control unit 402 may execute a software 410 to provide some or all of the intelligence of the host server machine 102. The control unit 402 may be implemented in a number of different ways. For example, the control unit 402 may be a processor, an application specific integrated circuit (ASIC), an embedded processor, a microprocessor, a hardware control logic, a hardware finite state machine (FSM), a digital signal processor (DSP), a field programmable gate array (FPGA), or a combination thereof.

[0060] The control interface 404 may be used for communication between the control unit 402 and other functional units or devices of the computing system. The control interface 404 may also be used for communication that is external to the functional units or devices of the computing system. The control interface 404 may receive information from the functional units or devices of the computing system, or from remote devices 420, or may transmit information to the functional units or devices of the computing system, or to remote devices 420. The remote devices 420 refer to units or devices external to the computing system, for example, the one or more external devices 108.

[0061] The control interface 404 may be implemented in different ways and may include different implementations depending on which functional units or devices of the computing system or remote devices 420 are being interfaced with the control unit 402. For example, the control interface 404 may be implemented with optical circuitry, waveguides, wireless circuitry, wireline circuitry to attach to a bus, an application programming interface (API), or a combination thereof. The control interface 404 may be connected to a communication infrastructure 422, such as a bus, to interface with the functional units or devices of the computing system or remote devices 420.
[0062] The storage unit 406 may store the software 410. For illustrative purposes, the storage unit 406 is shown as a single element, although it is understood that the storage unit 406 may be a distribution of storage elements. Also for illustrative purposes, the storage unit 406 is shown as a single hierarchy storage system, although it is understood that the storage unit 406 may be in a different configuration. For example, the storage unit 406 may be formed with different storage technologies forming a memory hierarchical system including different levels of caching, main memory, rotating media, or off-line storage. The storage unit 406 may be a volatile memory, a nonvolatile memory, an internal memory, an external memory, or a combination thereof. For example, the storage unit 406 may be a nonvolatile storage such as nonvolatile random access memory (NVRAM), Flash memory, disk storage, or a volatile storage such as static random access memory (SRAM) or dynamic random access memory (DRAM).

[0063] The storage unit 406 may include a storage interface 408. The storage interface 408 may be used for communication between the storage unit 406 and other functional units or devices of the computing system. The storage interface 408 may also be used for communication that is external to the computing system. The storage interface 408 may receive information from the other functional units or devices of the computing system or from remote devices 420, or may transmit information to the other functional units or devices of the computing system or to remote devices 420. The storage interface 408 may include different implementations depending on which functional units or devices of the computing system or remote devices 420 are being interfaced with the storage unit 406. The storage interface 408 may be implemented with technologies and techniques similar to the implementation of the control interface 404.

[0064] The communication unit 416 may allow communication to devices, components, modules, or units of the computing system or to remote devices 420. For example, the communication unit 416 may permit the computing system to communicate between its components, such as the host server machine 102 and the display interface 112. The communication unit 416 may further permit the devices of the computing system to communicate with remote devices 420, such as an attachment, a peripheral device, the one or more external devices 108, or a combination thereof, through the network 110.

[0065] The network 110 may span and represent a variety of networks and network topologies. For example, the network 110 may be a part of a network and include wireless communication, wired communication, optical communication, ultrasonic communication, or a combination thereof. For example, satellite communication, cellular communication, Bluetooth, Infrared Data Association standard (IrDA), wireless fidelity (WiFi), and worldwide interoperability for microwave access (WiMAX) are examples of wireless communication that may be included in the network 110. Cable, Ethernet, digital subscriber line (DSL), fiber optic lines, fiber to the home (FTTH), and plain old telephone service (POTS) are examples of wired communication that may be included in the network 110. Further, the network 110 may traverse a number of network topologies and distances. For example, the network 110 may include direct connection, personal area network (PAN), local area network (LAN), metropolitan area network (MAN), wide area network (WAN), or a combination thereof.
[0066] The communication unit 416 may also function as a communication hub allowing the computing system to function as part of the network 110 and not be limited to be an end point or terminal unit to the network 110. The communication unit 416 may include active and passive components, such as microelectronics or an antenna, for interaction with the network 110.

[0067] The communication unit 416 may include a communication interface 418. The communication interface 418 may be used for communication between the communication unit 416 and other functional units or devices of the computing system or to remote devices 420. The communication interface 418 may receive information from the other functional units or devices of the computing system, or from remote devices 420, or may transmit information to the other functional units or devices of the computing system or to remote devices 420. The communication interface 418 may include different implementations depending on which functional units or devices are being interfaced with the communication unit 416. The communication interface 418 may be implemented with technologies and techniques similar to the implementation of the control interface 404.

[0068] The user interface 412 may present information generated by the computing system. In aspects, the user interface 412 allows a user of the computing system to interface with the devices of the computing system or remote devices 420. The user interface 412 may include an input device and an output device. Examples of the input device of the user interface 412 may include a keypad, buttons, switches, touchpads, soft-keys, a keyboard, a mouse, or any combination thereof to provide data and communication inputs. Examples of the output device may include a display unit 414. The control unit 402 may operate the user interface 412 to present information generated by the computing system. The control unit 402 may also execute the software 410 to present information generated by the computing system, or to control other functional units of the computing system. The display unit 414 may be any graphical user interface such as a display, a projector, a video screen, or any combination thereof. The display unit 414 may be implemented with similar technologies as the display interface 112.

[0069] The above detailed description and aspects of the disclosed computing system are not intended to be exhaustive or to limit the disclosed computing system to the precise form disclosed above. While specific examples for the computing system are described above for illustrative purposes, various equivalent modifications are possible within the scope of the disclosed computing system, as those skilled in the relevant art will recognize. For example, while processes and methods are presented in a given order, alternative implementations may perform routines having steps, or employ systems having processes or methods, in a different order, and some processes or methods may be deleted, moved, added, subdivided, combined, or modified to provide alternatives or sub-combinations. Each of these processes or methods may be implemented in a variety of different ways. Also, while processes or methods are at times shown as being performed in series, these processes or blocks may instead be performed or implemented in parallel, or may be performed at different times.
[0070] The resulting method 300 and computing system are cost-effective, highly versatile, and may be implemented by adapting components for ready, efficient, and economical manufacturing, application, and utilization. Another important aspect of the present disclosure is that it valuably supports and services the historical trend of reducing costs, simplifying systems, and/or increasing performance.

[0071] These and other valuable aspects of the present dis-
