United States Patent (US011823344B2)

(10) Patent No.: US 11,823,344 B2
(45) Date of Patent: Nov. 21, 2023

(54) MOBILE DEVICE TRACKING MODULE WITHIN A VR SIMULATION

(71) Applicant: MSG Entertainment Group, LLC, New York, NY (US)

(72) Inventors: Helena Poynter, New York, NY (US); David E. Rodriguez (US)

(*) Notice: Subject to any disclaimer, the term of this patent is extended or adjusted under 35 U.S.C. 154(b) by 0 days.

(21) Appl. No.: 17/314,452

(22) Filed: May 7, 2021

(65) Prior Publication Data: US 2022/0358736 A1, Nov. 10, 2022

(51) Int. Cl.: G06T 19/20; G06T 7/246; G06T 7/73

(52) U.S. Cl.: CPC G06T 19/20 (2013.01); G06T 7/246 (2017.01); G06T 7/73 (2017.01); G06T 2219/2004 (2013.01)

(58) Field of Classification Search: None. See application file for complete search history.

Primary Examiner: Ryan M. Gray
(74) Attorney, Agent, or Firm: Sterne, Kessler, Goldstein & Fox P.L.L.C.

(57) ABSTRACT

Disclosed herein are system, method and computer program product embodiments for a mobile device tracking system within a VR simulation. The method includes determining a position and orientation of a mobile device tracking module (e.g., case) attached to a mobile device, calculating a position and orientation of the mobile device based at least partially on a position and orientation of the tracking module, simulating a real world environment, generating a virtual visualization of the mobile device, and rendering a VR simulation based at least partially on the position and orientation of the display screen of the mobile device. The position and orientation of the display screen provides a virtual position and orientation of the display screen relative to a virtual origin within the VR simulation.

19 Claims, 12 Drawing Sheets

[Drawing sheets 1-12 (FIGS. 1A-11) are not reproduced here; the figures are described in the Brief Description of the Drawings below.]
US 11,823,344 B2

MOBILE DEVICE TRACKING MODULE WITHIN A VR SIMULATION

CROSS-REFERENCE TO RELATED APPLICATIONS AND INCORPORATION BY REFERENCE

This application incorporates by reference, in its entirety, U.S. application Ser. No. 17/314,193, filed May 7, 2021, entitled "Tool For Mobile App Development And Testing Using A Physical Mobile Device." U.S. application Ser. No. 17/314,193 describes an environment for mobile app development and testing through mobile device visualization within a Virtual Reality (VR) simulation environment.

BACKGROUND OF THE DISCLOSURE

When developing, designing, or conceiving real-world environment-based mobile applications (apps), it may be cumbersome and inefficient for a developer to visit a real-world environment each time they would like to test their mobile app (application) accurately within that environment. Also, it may be difficult to visualize and play prototypes of the app without physically being at the real-world environment.

For example, if a developer is creating a mobile game that interacts with a large amusement park ride, it may be difficult to test the game efficiently with enough iterations (tests). The developer would have to travel to the real-world amusement park/venue to test every new build of their game. This travel may be a challenge, especially when developers are working from remote locations.

BRIEF DESCRIPTION OF THE DRAWINGS

The accompanying drawings, which are included to provide further understanding of the disclosure and are incorporated in and constitute a part of this specification, illustrate exemplary embodiments.

FIG. 1A illustrates an exploded block diagram of a mobile device with tracking module, according to some embodiments.

FIG. 1B illustrates a fully assembled tracking module, according to some embodiments.

FIG. 1C illustrates a partial system diagram of an active tracking module, according to some embodiments.

FIG. 2A illustrates an active mobile device tracking module, according to some embodiments.

FIG. 2B illustrates a passive mobile device tracking module, according to some embodiments.
FIG. 3 illustrates an Inside-Out tracking system, according to some embodiments.

FIG. 4 illustrates an Outside-In tracking system, according to some embodiments.

FIG. 5 illustrates visualizing a mobile device user's hands within VR, according to some embodiments.

FIG. 6 illustrates a system for a virtual mobile device visualization within a virtual reality (VR) simulation, according to some embodiments.

FIG. 7 illustrates a system for a virtual mobile device visualization within a virtual reality (VR) simulation, according to some embodiments.

FIG. 8 illustrates a system for tracking a mobile device for a virtual mobile device visualization within a virtual reality (VR) simulation, according to some embodiments.

FIG. 9 illustrates a system for tracking a mobile device for a virtual mobile device visualization within a virtual reality (VR) simulation, according to some embodiments.

FIG. 10 illustrates a mobile app simulation rendering a cube as an alpha/AR overlay, according to some embodiments.

FIG. 11 illustrates an example computer system, according to some embodiments.

The present disclosure will now be described with reference to the accompanying drawings. In the drawings, like reference numbers may indicate identical or functionally similar elements.

DETAILED DESCRIPTION OF THE DISCLOSURE

Provided herein are system, apparatus, device, method and/or computer program product embodiments, and/or combinations and sub-combinations thereof, for a mobile device tracking system within a VR simulation. The detailed description to follow describes various embodiments providing an environment to test a mobile application (app) that may be, for example, currently in development. A conventional test environment may include a personal computer (PC) with application coding software, a game engine, testing libraries, compilers, renderers, etc. However, a unique challenge exists for the conventional testing of mobile apps that function by interacting with a real-world environment. Conventionally, these mobile apps require a tester to travel to the real-world environment to properly test any elements that directly interact with that location.

The detailed description to follow is directed to a tracking system to track a position (location) and orientation of a physical mobile device operational within a virtual reality world test environment. The location and orientation of the physical mobile device may be input to a VR simulation testing system to simulate a position (location) and orientation of a virtual rendering of the mobile device within a VR simulation. As a tester moves the physical mobile device around during testing, its corresponding virtual rendering will move accordingly in the VR simulation.

In some embodiments, during testing, the tester holds a mobile device (e.g., smartphone) in their hands while wearing a VR headset (VR device). The simulation of the real-world environment is fed as imagery to the VR headset. A wearer of the VR headset would be able to look around the real-world environment as if they were there in person. The simulated VR environment may exist in different forms such as LIDAR scanned environments, sculpted 3D environments in a game engine, or separate software which enables a VR connection.

In some embodiments, a mobile device tracking module may be operative with the physical mobile device and is configured with active tracking mechanisms (e.g., GPS, gyroscope, accelerometer, magnetometer, inclinometer, etc.) to allow for precise tracking relative to a VR device.
In another embodiment, lights on an exterior of the tracking module (e.g., LEDs) allow for precise tracking relative to a VR device with cameras.

In some embodiments, a mobile device tracking module may be operative with the physical mobile device and is configured with passive tracking mechanisms (e.g., markings, colors, patterns, etc.) and is tracked with cameras to allow for precise tracking relative to a VR device.

In some embodiments, the cameras are onboard the VR device (Inside-Out tracking) or are external to the VR device (Outside-In tracking).

In one example, a venue represents a real-world environment for hosting an event. The venue may represent a music venue, such as a music theater, a music club, and/or a concert hall, a sporting venue, such as an arena, a convention center, and/or a stadium, and/or any other suitable venue. The event may represent a musical event, a theatrical event, a sporting event, a motion picture, and/or any other suitable event that may present interactive content to members of an audience within the venue.

Conventionally, while developing an audience interactive app for an audience member's smartphone, a developer would compile a test version of their app and download it to their smartphone. The developer, or a tester, would then conventionally travel to the venue to test the latest version of their app, for example, recent upgrades or new features. However, travel is expensive, time consuming and not always useful, as the venue may not be fully operational (e.g., under construction).

In the various embodiments, a simulation of the real-world environment may be used in the test environment as a substitute for the tester being on-location, namely, physically present at the real-world environment. In these embodiments, imagery from the real-world environment may be collected by video, pictures, LIDAR scans, camera arrays, pre-recorded imagery, etc. This imagery is then used to generate a simulation of the real-world environment that may then be rendered on a standalone computer or on a virtual reality (VR) headset.

If one were actually at the venue playing an interactive app on their mobile device (smartphone), they would also see the smartphone itself and their hands/fingers as they interacted with the smartphone. Therefore, in various embodiments, a visualization of the smartphone is simulated and overlaid on the simulation of the real-world environment (venue). Hand and finger movements may also be simulated and overlaid on the simulation of the real-world environment (venue).

Also, if one were actually at the venue playing an interactive app on their mobile device (smartphone), they would also see the smartphone move relative to the venue as they moved the smartphone in different directions or pointed it at a specific item of interest in the venue. Therefore, in various embodiments, a position and orientation of the smartphone relative to the VR headset is determined and then simulated relative to a virtual point of origin within the simulation of the real-world environment (venue), as sketched below.
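For illustration only, the relative-pose step described in the preceding paragraph can be sketched with 4x4 homogeneous transforms. The function names, frame conventions, and pose values below are hypothetical; the patent does not prescribe a particular math library or representation.

```python
# Minimal sketch: express the tracked phone pose relative to the VR headset,
# then re-anchor that relative pose at a virtual origin inside the simulation.
import numpy as np

def make_pose(rotation: np.ndarray, translation: np.ndarray) -> np.ndarray:
    """Build a 4x4 homogeneous transform from a 3x3 rotation and a 3-vector."""
    pose = np.eye(4)
    pose[:3, :3] = rotation
    pose[:3, 3] = translation
    return pose

def phone_pose_in_simulation(world_T_headset, world_T_phone, sim_T_origin):
    # Pose of the phone as seen from the headset; the shared tracking
    # frame cancels out of the product.
    headset_T_phone = np.linalg.inv(world_T_headset) @ world_T_phone
    # Re-anchor the relative pose at the virtual origin of the VR scene.
    return sim_T_origin @ headset_T_phone

# Example: phone held 40 cm in front of the headset, slightly below eye level.
world_T_headset = make_pose(np.eye(3), np.array([0.0, 1.7, 0.0]))
world_T_phone = make_pose(np.eye(3), np.array([0.0, 1.5, -0.4]))
print(phone_pose_in_simulation(world_T_headset, world_T_phone, np.eye(4)))
```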
In addition, if one were actually at the venue playing an interactive app on their mobile device (smartphone), they would also physically interact with the smartphone (e.g., touch, vibration, audio, video, etc.). Therefore, in various embodiments, physical interactions the tester has with the smartphone they are holding are recorded and simulated on the visualization of the smartphone in the VR headset.

Mobile app signals directed to the phone (e.g., audio, video, and vibrations) may also be simulated and communicated to the smartphone, such that the tester actually feels, for example, haptic feedback. Sending those signals to the smartphone gives the tester a "real-world" feel, as they would have if they were physically using the app at the venue.

In various embodiments described herein, the technology described herein may allow a developer to cut down on iteration and travel time when developing, debugging, and/or testing their mobile applications (apps) at real-world environments, like location-based entertainment (LBE) venues including indoor or outdoor installations, within a Virtual Reality (VR) environment. By combining physical and/or simulated data from a physical and virtual mobile device and rendering the physical and/or the simulated data within a VR simulation, the various embodiments described herein may provide the developer with a hands-on accurate representation and feel for how their mobile app will work at a real-world environment. For example, the developer may test their location-based mobile apps, for example, dual screen mobile apps, mobile games, and mobile augmented/mixed reality apps, to provide some examples, using a physical mobile device within a VR simulation.

The technology described herein, in various embodiments, may run in real time, or near-real time, and may include a mobile device tracking module, such as a tracking case, that attaches to a physical mobile device in the real world to allow for precise inside-out or outside-in tracking of its physical location coordinates and screen boundary coordinates relative to a virtual origin within a VR simulation.

In some embodiments, the system combines simulated and physical input, camera, microphone, sensor, display, and audio data between a VR simulation, a remote app running on a physical mobile device, and a mobile app simulation running and being developed on a mobile app development PC.

In some embodiments, the system transmits and combines simulated and physical input, camera, microphone, sensor, display, and audio data between a VR simulation and a mobile app running on a physical mobile device without the use of a mobile app development PC.

In some embodiments, a virtual mobile device visualization within the VR simulation combines physical and simulated inputs, such as, but not limited to, camera, microphone, sensor, display, and audio data from the VR simulation. The remote app or mobile app runs on a physical mobile device, or optionally a mobile app simulation running and currently being developed on a mobile app development PC.

In some embodiments, the system uses tracked mobile device case screen boundary coordinates to provide a video pass-through visualization of the physical mobile device screen by cropping a video feed from the VR headset based on calculated screen boundary coordinates. For example, markers or identifiers, which can be on the corners of the tracked mobile device, allow the cropping of the video feed based on identifying their position and orientation.
These various embodiments, which are described in further detail below, represent one or more electronic software tools that, when executed by one or more computing devices, processors, controllers, or other devices that will be apparent to those skilled in the relevant art(s) without departing from the spirit and scope of the present disclosure, may analyze, process, and/or translate audio, video, and movement and/or their corresponding digital commands.

Embodiments of the disclosure may be implemented in hardware, firmware, software, or any combination thereof. Embodiments of the disclosure may also be implemented as instructions stored on a machine-readable medium, which may be read and executed by one or more processors. A machine-readable medium may include any mechanism for storing or transmitting information in a form readable by a machine (e.g., a computing device). For example, a machine-readable medium may include read only memory (ROM); random access memory (RAM); magnetic disk storage media; optical storage media; flash memory devices; electrical, optical, acoustical or other forms of propagated signals (e.g., carrier waves, infrared signals, digital signals, etc.); and others. Further, firmware, software, routines, and instructions may be described herein as performing certain actions. However, it should be appreciated that such descriptions are merely for convenience and that such actions in fact result from computing devices, processors, controllers, or other devices executing the firmware, software, routines, instructions, etc.

FIG. 1A illustrates an exploded block diagram of a mobile device with a mobile device tracking module 103 (tracking module), according to some embodiments. System 100 may include mobile device 102 (e.g., smartphone, tablet, wearable computer, etc.) representing a real-world input/output system that may interact in a virtual reality (VR) or augmented reality (AR) environment.

In various embodiments, mobile device tracking module 103 is configured in an active tracking embodiment as an interface 104 (e.g., case/mobile device holder), with an electronic components module 106 and power components module 108. While shown as three sections, tracking module 103 can be a single module or any number of modules and vary in size without departing in scope from the technology described herein.

Tracking module 103 is attached to mobile device 102 to track a position (location) and orientation (e.g., location in six degrees of movement) of the mobile device 102 relative to a proximate (e.g., arm's length) Virtual Reality (VR) headset 308 (see FIGS. 3-8), as will be discussed in greater detail hereafter. The position and orientation of the physical mobile device may be input to a VR simulation testing system to simulate a position and orientation of a virtual rendering of the mobile device within a VR simulation. As a tester moves the physical mobile device around during testing, its corresponding virtual rendering will move accordingly in the VR simulation.

A tracking module's position and orientation, relative to the VR headset, provides physical location coordinates and screen boundary coordinates relative to a virtual origin within a VR simulation. For example, if the mobile device's position and orientation are known, then the system can calculate the mobile device boundary coordinates as well as screen perimeter coordinates based on known mobile device dimensions (e.g., model and model specs).
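As a hedged sketch of that calculation, the following assumes the tracking pipeline already yields a 4x4 pose for the device and models the screen as a centered rectangle whose width and height come from the model specs; a per-model offset of the screen from the device center (part of the stored dimensional information discussed next) is omitted for brevity, and all values are illustrative.

```python
import numpy as np

def screen_corner_coordinates(device_pose: np.ndarray,
                              width_m: float, height_m: float) -> np.ndarray:
    """Return the four screen corners in world coordinates.

    device_pose: 4x4 homogeneous transform of the device center; the screen
    is modeled as a width x height rectangle in the device's local X/Y plane.
    """
    w, h = width_m / 2.0, height_m / 2.0
    corners_local = np.array([[-w, -h, 0.0, 1.0],
                              [ w, -h, 0.0, 1.0],
                              [ w,  h, 0.0, 1.0],
                              [-w,  h, 0.0, 1.0]])
    return (device_pose @ corners_local.T).T[:, :3]

pose = np.eye(4)  # device at the tracking origin for the example
print(screen_corner_coordinates(pose, 0.0656, 0.1420))
```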
The model and dimensional information can be stored locally in memory of any of: the mobile device, the tracking module, the VR device, the virtual simulation system, the mobile application development system, or remotely in server storage, such as cloud based storage systems.

In one embodiment, mobile device tracking module 103 includes an interfacing member 104 (e.g., case) to secure to the mobile device 102. For example, the interfacing member may comprise a back half of a custom phone case conventionally used for a specific phone model or form factor. However, any interfacing member that securely attaches to the mobile device can be substituted without departing from the scope of the technology described herein.

In addition, one or more sections 106 and 108 are integrated with or separately attached to interfacing member 104. As will be discussed in greater detail in association with FIG. 1C, electronic components module 106 may include one or more active sensors, a computer processor, communication circuitry and/or interfaces, and position or orientation sensors. Power components module 108 may include known power sources and charging mechanisms (e.g., batteries and charger or a power connection to the mobile device or other external power source).

FIG. 1B illustrates a mobile device 102 with mobile device tracking module 103 attached thereto, according to some embodiments. As shown, a mobile device tracking module 103 attaches securely to a mobile device (e.g., smartphone). While shown as a similar size and shape as the mobile device, the mobile device tracking module may be of any shape, size, design, or material (e.g., 3D printed from liquid plastics). In some embodiments, the tracking module is integrated within the mobile device itself (i.e., using components of the mobile device for active tracking).

FIG. 1C illustrates a system diagram of an active tracking module, according to some embodiments. In some embodiments, electronic components module 106 may include a computer processor 110 (e.g., a microprocessor) with associated computer memory 114, configured to track a position (location) and orientation of the attached mobile device 102. In addition, the computer processor may process data inputs/outputs through interfaces 112 and process position and orientation data captured by known location and orientation devices 118-124. Location and orientation devices include, but are not limited to, global positioning system (GPS) 118, active sensors 120 (e.g., LIDAR (distance), optical sensors (object detection), inclinometer (tilt), etc.), accelerometer 122 (force caused by vibration or a change in motion (e.g., acceleration)) and gyroscope 124 (orientation and angular velocity). Inputs from buttons 128 provide control (on/off, synchronization, wireless connectivity, etc.). LED actuator 126 provides electrical signals to one or more LEDs that may be arranged on an outside of the tracking module. In addition to various known status lights (e.g., on/off), the actuators may light up LEDs strategically arranged (e.g., four corners of a front side) on the tracking module to be actively tracked by cameras, as will be discussed in greater detail with respect to FIG. 2A. Power components module 108 may include known power sources and charging mechanisms (e.g., battery 132 and charger 130 or a power connection to the mobile device).
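A rough sketch of the telemetry such a module might assemble before transmitting it is given here. This is not firmware from the patent; the sensor driver object and field layout are invented for illustration and merely keyed to the reference numerals of FIG. 1C.

```python
import json
import time
from dataclasses import dataclass, asdict

@dataclass
class TrackingSample:
    timestamp: float   # seconds since epoch
    gps: tuple         # (lat, lon, alt) from GPS 118
    accel: tuple       # m/s^2 from accelerometer 122
    gyro: tuple        # rad/s from gyroscope 124
    leds_on: bool      # state driven by LED actuator 126

def read_sample(sensors) -> TrackingSample:
    """Poll each hypothetical sensor driver once and stamp the result."""
    return TrackingSample(
        timestamp=time.time(),
        gps=sensors.gps.read(),
        accel=sensors.accelerometer.read(),
        gyro=sensors.gyroscope.read(),
        leds_on=sensors.led_actuator.is_on(),
    )

def encode(sample: TrackingSample) -> bytes:
    """Serialize for transmission, e.g., over BLE or the USB data port."""
    return json.dumps(asdict(sample)).encode("utf-8")
```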
While shown as two modules, the number, size and shape of these modules can vary without departing from the scope of the technology described herein.

In the active mobile device tracking module, communication circuitry 116 may send position and orientation data to the VR simulator (e.g., the VR headset), a standalone VR simulation system, a mobile app development system or the mobile device 102. For example, the tracking module would actively be recognized by the VR device (e.g., helmet) by communications (e.g., Bluetooth®) including position and orientation data relative to the VR device. The VR device may communicate information to the tracking module, such as presence, activation, synchronization, etc.

Communications may be through wired (e.g., through a data port connection 113 (e.g., USB)) or wireless 117 communication mediums. For example, the communications may be implemented on a cellular network (e.g., a long-term evolution (LTE) network, a code division multiple access (CDMA) network, a 3G network, a 4G network, a 5G network, Bluetooth®, Bluetooth Low Energy (BLE) or another type of known or next generation network, etc.), a public land mobile network (PLMN), a local area network (LAN), a wide area network (WAN), a metropolitan area network (MAN), a telephone network (e.g., the Public Switched Telephone Network (PSTN)), a private network, an ad hoc network, an intranet, the Internet, a fiber optic-based network, a cloud computing network, and/or the like, and/or a combination of these or other types of networks.

In some embodiments, the tracking module is operatively coupled with the mobile device 102 (e.g., wirelessly, a wired connection through the data port, etc.). The mobile device is configured to implement one or more of the tracking module elements, such as a tracking application (app) determining position, orientation and/or communication with the VR simulator (e.g., the VR headset), a standalone VR simulation system or a mobile app development system.

In some embodiments, a mobile device tracking module 103 secures to the mobile device and includes elements (active or passive) to track a position and orientation of the mobile device 102. In one embodiment, the mobile device tracking module 103 attaches with a friction fit (tightly coupled), but may include fasteners such as straps, snaps, adhesive strips, wraps, mechanical fasteners such as clips, rotational mechanisms, tilt mechanisms, hinged connectors, separate grips, adjustable tighteners, etc. In some embodiments, the tracking module may utilize a friction fit concept, but be attached through thin bars around the mobile form factor as opposed to a case which fully wraps around the mobile form factor.

In addition, the mobile device tracking module may be constructed of molded plastic or metal, and include flexing or flexible material, gel, moldable elements, handles, grips, etc.

FIG. 2A illustrates an active mobile device tracking module 201 configured with lights, according to some embodiments. In some embodiments, the mobile device tracking module (e.g., case) includes lights 202 strategically placed on an outside surface of the tracking module. For example, they are arranged around a perimeter of the module, on four corners, or in a pattern. Lights 202 may be light emitting diodes (LEDs), liquid crystal displays (LCDs) or equivalents. In some embodiments, the lights may be tracked by built-in sensors or cameras located onboard a VR headset 308 (FIG. 3, Inside-Out tracking). LED actuator 126 (FIG. 1C) turns LEDs on/off for continuous tracking or for tracking during a selected time period.
The lights may be powered by the tracking module power components 108 or by power received from the mobile device. In some embodiments, the lights are flashed for detection purposes. In some embodiments, the lights are tracked by one or more cameras or sensors separate from the VR headset (FIG. 4, Outside-In tracking).

FIG. 2B illustrates a passive mobile device tracking module 204 configured with markers, according to some embodiments. The passive tracking module includes a case with passive markers: high contrast fiducial markers 206 (e.g., reflective or electroluminescent) and/or 208 (dark) are placed along front edges of the tracking module. These high contrast fiducial markers may include, but are not limited to, printed designs and patterns, bar codes 210, bright colors 212, or other types of markings around the edge of the mobile device case, so that the case is clearly visible to a camera that may be used with machine vision to track those fiducial markers. Multiple known machine vision libraries exist with the capability to track objects based on colors or patterns. The markers may be of a single type or include multiple types (e.g., as shown). In some embodiments, the passive tracking module does not require the electrical or power components shown in FIG. 1C.

The fiducial markers may be located at various positions around the case, and vary in shape, repetition and design. These fiducial markers provide a target for the VR headset 308 to track a position and orientation of the mobile device (e.g., smartphone) in space by triangulating between 3 or more known marker positions on the case and by knowing the dimensions of the phone case. With this information the system may approximate the 3D location and orientation of the mobile device case in space relative to the VR headset camera position.
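One conventional way to realize this marker-based pose recovery is a perspective-n-point (PnP) solve. The OpenCV sketch below is illustrative only and not prescribed by the patent; the marker coordinates, detected pixel locations, and camera intrinsics are invented for the example.

```python
import numpy as np
import cv2

# Marker positions on the case, in meters, in the case's own frame.
object_points = np.array([[-0.035, -0.075, 0.0],
                          [ 0.035, -0.075, 0.0],
                          [ 0.035,  0.075, 0.0],
                          [-0.035,  0.075, 0.0]])

# Pixel locations where a marker detector found them in the headset camera.
image_points = np.array([[310.0, 420.0],
                         [390.0, 418.0],
                         [392.0, 560.0],
                         [308.0, 562.0]])

# Intrinsics of the calibrated camera; zero distortion assumed.
camera_matrix = np.array([[600.0, 0.0, 320.0],
                          [0.0, 600.0, 240.0],
                          [0.0, 0.0, 1.0]])
dist_coeffs = np.zeros(5)

ok, rvec, tvec = cv2.solvePnP(object_points, image_points,
                              camera_matrix, dist_coeffs)
if ok:
    rotation, _ = cv2.Rodrigues(rvec)  # case orientation as a 3x3 matrix
    print("case position relative to camera (m):", tvec.ravel())
```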
This type of case also may include conventional machine vision configuration and setup steps so that it may be properly calibrated for tracking.

In another embodiment, the fiducial markers are tracked by one or more cameras or sensors separate from the VR headset.

In all embodiments, the mobile device and attached case may operate in portrait or landscape mode without departing from the scope of the technology described herein.

FIG. 3 illustrates an Inside-Out tracking system, as per some embodiments. Inside-Out tracking system 300 is implemented with onboard cameras or sensors (e.g., optical or infrared) as part of a VR headset. These onboard cameras 302 are used to track objects using machine vision techniques without the use of external base stations or separate external cameras. VR headsets 308 provide basic support for Inside-Out tracking of the headset position. These headsets may track active lights or passive markers on the mobile device tracking module 103 and use machine vision techniques to track the motion of the lights/markers; then, since they know the exact position of those markers, they may calculate the orientation and position of the mobile device tracking module.

Since a location of the mobile device tracking module relative to the physical edges of the mobile device case 103 is known, a predetermined measured look-up table of relative distances from the edges of each physical mobile device case (known dimensions of cases to smartphone dimensions including screen size) with respect to the location and orientation of the tracker is created. This provides a calculated location of the mobile device case's bounding rectangle (perimeter) that may be used within the VR simulation.

For example, once the position and orientation of the mobile device tracking module are known, stored information (in the mobile device tracking module if active, or in the VR device if passive) reflecting relative position information of the mobile device tracking module to the mobile device may be used to calculate the position and orientation of the mobile device. In one example embodiment, the mobile device tracking module is a tracking case that fits to the form factor of the mobile device, such as a smartphone case. Additional stored information (e.g., in the mobile device tracking module or mobile device, if active tracking, or in the VR device, VR simulation system, mobile device app development system, or cloud, if passive) may include a form factor (dimensions) of the mobile device and relative position and dimensions of a perimeter of a display screen on the mobile device.

This similar technique may be used for Inside-Out tracking of a custom-built mobile device case. For example, the custom-built case would include fiduciary markers, LED lights, or other forms of identification embedded within the case itself to enable the tracking of the mobile device's position and orientation data.

In some embodiments, Inside-Out tracking is applied to active tracking modules 201, where the active tracking module 201 includes lights 202 (visible or infrared) that surround the mobile device and are powered by the mobile device or batteries. The lights may have various positions around the case. These lights allow the VR headset to track the position of the mobile device 102 in space by triangulating between 3 or more known light positions on the case and by knowing the dimensions of the mobile device case. With this information the system may approximate the 3D location of the mobile device in space relative to the VR headset camera position.
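In both the passive and active variants, the case-to-device and case-to-screen geometry comes from the predetermined look-up table described above. A minimal sketch of such a table follows; the entry values are invented placeholders, not measurements from the patent.

```python
# Per supported case model: device dimensions, where the display rectangle
# sits, and the tracker's offset from the device center (all in meters).
CASE_GEOMETRY = {
    "case_model_a": {
        "device_size_m": (0.0716, 0.1467),           # width, height
        "screen_offset_m": (0.0030, 0.0085),         # screen origin from edge
        "screen_size_m": (0.0656, 0.1420),
        "tracker_offset_m": (0.0, -0.0050, 0.0045),  # tracker -> device center
    },
}

def screen_rect_for(case_model: str):
    """Return (offset, size) of the display rectangle for a known case."""
    entry = CASE_GEOMETRY[case_model]
    return entry["screen_offset_m"], entry["screen_size_m"]

print(screen_rect_for("case_model_a"))
```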
FIG. 4 illustrates an Outside-In tracking system, as per some embodiments. In an Outside-In tracking embodiment, the system 400 makes use of one or more external stand-alone cameras 402 located near (e.g., in the same room as) the VR headset 308 to track the mobile device tracking module 103. Camera(s) 402 track(s) the mobile device tracking module's position and orientation relative to the VR headset 308 (both devices are located within the camera's field of view). If two cameras are present, they will each process image data reflecting relative positioning as seen in their respective fields of view. This information is fed (e.g., wirelessly 404) to the VR headset 308 so it may recognize the position and orientation of the mobile device tracking module within the virtual space of the VR headset 308. As the VR headset and the mobile device tracking module are being tracked in the same calibrated space at once, they can be correlated. Known image processing techniques for recognizing the same objects in multiple images (videos) use, for example, tie points, vanishing points, and recognition of common vertices and planes to calculate the relative position and orientation. Other known image processing techniques for processing multiple images of a same object(s) can be substituted without departing from the scope of the technology disclosed herein.

As previously described, since a location of the mobile device tracking module relative to the physical edges of the mobile device case 103 is known, a predetermined measured look-up table of relative distances from the edges of each physical mobile device case (e.g., known dimensions of cases to smartphone dimensions including screen size) with respect to the location and orientation of the tracker is created. This provides a calculated location of the mobile device case's bounding rectangle (perimeter) that may be used within the VR simulation.

For example, once the position and orientation of the mobile device tracking module are known, stored information (in the mobile device tracking module if active, or in the VR device if passive) reflecting relative position information of the mobile device tracking module to the mobile device may be used to calculate the position and orientation of the mobile device. In one example embodiment, the mobile device tracking module is a tracking case that fits to the form factor of the mobile device, such as a smartphone case. Additional stored information (in the mobile device tracking module if active, or in the VR device if passive) may include a form factor (dimensions) of the mobile device and relative position and dimensions of a perimeter of a display screen on the mobile device.

Another method of providing the rectangle's bounding box is to track the edges of the fiducial markers within the VR headset captured video feed, combine that with known geometry and dimensions of the mobile device case through the use of a look-up table for each mobile device supported, and calculate a best fit using the known mobile device tracking module 103 geometry and dimensions. This will tell the system which pixels in the VR headset captured video are within the boundaries of the mobile device case, and the system may use this as a mask to poke a window through the VR simulation to the VR headset captured video, or it may crop the captured video and overlay it on the VR simulation, as sketched below.
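A hedged sketch of that crop-and-overlay step: once the four screen corners have been projected into the headset camera image, the region can be rectified into an upright rectangle (the video pass-through of the real screen), or turned into a mask for punching a window through the simulation. The corner ordering and output size below are assumptions.

```python
import numpy as np
import cv2

def crop_screen_region(headset_frame: np.ndarray, corners_px: np.ndarray,
                       out_w: int = 360, out_h: int = 780) -> np.ndarray:
    """corners_px: 4x2 pixel corners ordered TL, TR, BR, BL."""
    target = np.array([[0, 0], [out_w - 1, 0],
                       [out_w - 1, out_h - 1], [0, out_h - 1]],
                      dtype=np.float32)
    warp = cv2.getPerspectiveTransform(corners_px.astype(np.float32), target)
    return cv2.warpPerspective(headset_frame, warp, (out_w, out_h))

def screen_mask(frame_shape, corners_px: np.ndarray) -> np.ndarray:
    """Binary mask marking which headset pixels lie inside the screen."""
    mask = np.zeros(frame_shape[:2], dtype=np.uint8)
    cv2.fillConvexPoly(mask, corners_px.astype(np.int32), 255)
    return mask
```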
location in six degrees of movement) of the mobile physical device 102 relative to Viral Reality (VR) hoadset 308. ‘VR is.a simulated experience that ean be similar to the real world, Applications of virtual reality include entersin- ment (eg. video games), education (eg, medical or military training) and business (¢g virtual meetings). Other distinct 'ypes of VR-siyle technology inelade augmented reality and ised reality, sometimes refered to a5 extended realty or XR. ‘Currently, standard virtual reality systems se ether vie ‘wal reality headsets or mul-projected environments. 10 enerate realistic images, sounds and other sensations that Simulate a user's physical presence ina virwal environment AA person using Viral reality equipment is able «0 look found the artfcial world, move around init, and interact ‘with viral features or items, The effect is commonly rete By VR headsets consisting of head-mounted dis play’ witha smal seen in front ofthe eyes, but ean also be fread through specially designed rooms with multiple large sereens. Virtual reality typically incomporats auditory and video feedback, but may also allow other types of sensory and force feedback throughs haptic technology. ‘When developing a new VR application, especially for ge scale environments, travelling 1o a subject leation to test the application as itis boing developed may not be convenient or possible. Therefor, in various embodiments described herein, a viewal ‘device visualization ‘within a VR simulation 606 combines physical and sim- Tote inputs, seh as, ha not Kite o, camer, micropione, sensor, display, and audio data from the VR simulation 606. AA mobile app under development is installed and execu (computer processor) on the physical mobile device 102. In ‘nition, in one embodiment, the system 600 uses tracked mobile device case sereen boundary coordinates to provide 4 Video pass-through visualization of the physical mobile device sereen by cropping a video fed frm the VR headset $308 based on calculated sercen boundary coordinates (.e, coordinates of perimeter of display sercea) of physical mobile device 102 US 11,823,344 B2 W For example, a test mobile app is installed on the mobile device 102, the mobile device 102 is then heldimaved by a wearer of the VR headset 308 during testing A viral enviroment, corresponding o a real-world location where the app will be used on a mobile device afer testing is simulated, The real inputsoutputs of the mobile device, as the tester interacts with the mobile device while wearing the ‘VR headset, are combined with simulated motile device Jnputsoutputs, a virualization ofthe mobile device, and the Virtual environment and fed a imagery 1 the VR ineadset ‘By combining physical and simolated data from » physi cal and vietual mobile device and rendering that withias VR simulation, the systom may provide the developer a hands: fm accurate representation and fel for how tir mobile app ‘ill work at a real-world locaton. For example, the devel ‘oper may test their location based mobile apps, dual screen ‘mobile apps, mobile games, and mobile sigmentedmixed realty apps using a physical mobile device within a VR simulation, IG, Tillastates a system 700 fora viral mobile device visualization within a. viral reality (VR). simulation, fccording to some embodiments, System 700 for @ viral Iobite device visualization within a virtual realty (VR) simulation, seeording to some embodiments, System 700 includes physical mobile device 102 (ex, smatpbone, tablet, wearable computer, ete.) 
FIG. 7 illustrates a system 700 for a virtual mobile device visualization within a virtual reality (VR) simulation, according to some embodiments. System 700 includes physical mobile device 102 (e.g., smartphone, tablet, wearable computer, etc.) representing a real-world input/output system that may interact in a virtual reality (VR) or augmented reality (AR) environment. This physical mobile device (mobile device) 102, in various embodiments, includes a mobile device tracking module 103 to establish a position (e.g., location in six degrees of movement) of the mobile device 102 relative to a proximate (near) Virtual Reality (VR) headset 308. The mobile device tracking module can be a case with passive elements, such as markings, an active case with electronic location elements, or alternatively be built in to the smartphone itself.

VR is a simulated experience that can be similar to the real world. Applications of virtual reality include entertainment (e.g., video games), education (e.g., medical or military training) and business (e.g., virtual meetings). Other distinct types of VR-style technology include augmented reality and mixed reality, sometimes referred to as extended reality or XR.

Currently, standard virtual reality systems use either virtual reality headsets or multi-projected environments to generate realistic images, sounds and other sensations that simulate a user's physical presence in a virtual environment. A person using virtual reality equipment is able to look around the virtual environment, move around in it, and interact with virtual features or items. The effect is commonly created by VR headsets consisting of a head-mounted display with a small screen in front of the eyes, but can also be created through specially designed rooms with multiple large screens. Virtual reality typically incorporates auditory and video feedback, but may also allow other types of sensory and force feedback through haptic technology.

When developing a new VR application, or updating an existing VR application, especially for large scale environments, travelling to a subject location to test the application as it is being developed may not be convenient or possible. Therefore, in various embodiments described herein, a virtual mobile device visualization within a VR simulation 706 combines physical and simulated inputs from the physical mobile device, such as, but not limited to, camera, microphone, sensor, display, and audio data (e.g., mobile device actions performed during testing). A remote app runs on a physical mobile device 102 collecting the physical inputs and communicates this data to a mobile app simulation 702 running on a mobile app development PC. This mobile app simulation may allow a mobile app under development to run on a standalone computer without repeated compiling and loading of the mobile app to the physical mobile device each time a change is made to the mobile app. In this way, the developer can continuously update or upgrade the code of the mobile app and test using the mobile device, but without requiring execution on the mobile device. The mobile app simulation can also communicate data and configuration information to both the physical mobile device as well as the VR simulation.
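A minimal sketch of the remote-app side of this arrangement, under the assumption of a plain TCP connection carrying newline-delimited JSON, is shown below. The patent does not specify a wire protocol; the address and sample format are hypothetical.

```python
import json
import socket

DEV_PC = ("192.168.1.50", 9000)  # hypothetical development PC address

def forward_inputs(samples) -> None:
    """samples: iterable of dicts such as {'t': 0.016, 'touch': [x, y]}."""
    with socket.create_connection(DEV_PC) as conn:
        for sample in samples:
            # Newline-delimited JSON keeps framing trivial on the PC side.
            conn.sendall((json.dumps(sample) + "\n").encode("utf-8"))

# Example (requires a listener on the development PC to be running):
# forward_inputs([{"t": 0.016, "touch": [188, 402]},
#                 {"t": 0.033, "gyro": [0.00, 0.12, -0.01]}])
```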
FIG. 8 illustrates a system 800 for tracking a mobile device for a virtual mobile device visualization within a virtual reality (VR) simulation 806, according to some embodiments. Physical mobile device 102 may include a downloaded mobile app. A VR PC or standalone VR device (e.g., VR headset 308) provides a VR simulation 806 of a real-world location/environment to test the mobile application.

As previously described in FIGS. 1-4, a mobile device tracking module 103 attaches to a physical mobile device 102 in the real world to allow for precise inside-out or outside-in tracking of its physical location coordinates and screen boundary coordinates relative to a virtual origin within a VR simulation, transmitting and combining simulated and physical input, camera, microphone, sensor, display, and audio data between a VR simulation and a mobile app running on a physical mobile device.

A VR simulation 806 virtually recreates the context and physical real-world environment (LBE venues, outdoor or indoor installations) where the mobile app can be used, and includes a virtual mobile device visualization within the VR simulation which combines the physical and simulated input, camera, microphone, sensor, display, and audio data from the VR simulation and the mobile app running on a physical mobile device.

In various embodiments, Outside-In tracking 802 (FIG. 4) and Inside-Out tracking 804 (FIG. 3) methods are configured to track a physical mobile device 102 screen bounding rectangle for captured video within VR headset 308.

FIG. 9 illustrates a system work flow 900 for a virtual mobile device visualization within a virtual reality (VR) simulation, according to some embodiments. Physical mobile device 102 may include a downloaded app to capture inputs/outputs of the mobile device during testing. A VR PC or standalone VR device (e.g., VR headset 308) generates a VR simulation 906 of a real-world location/environment to test the mobile application.

A mobile app development PC or equivalent may implement mobile app simulation 905. This mobile app simulation can be executed as standalone software (e.g., a compiled build) or as a simulation generated from within the game engine it is being developed with.

In various embodiments, trackers using, for example, Outside-In tracking 802 (FIG. 4) and Inside-Out tracking 804 (FIG. 3) methods are configured to track a physical mobile device 102 screen bounding rectangle (perimeter of display) for captured video within VR headset 308. To visualize the physical mobile device as an overlay on the virtual environment, the mobile device's location and orientation relative to the VR headset is tracked. In some embodiments, the display screen perimeter (bounding rectangle) can be tracked to render a cropped VR headset captured video.

FIG. 10 illustrates an example embodiment 1000 implementing a mobile app simulation rendering a cube 1004 as an alpha/AR overlay 1008 that gets combined with a virtual mobile device camera capture 1006. The virtual mobile device camera is capturing a virtual scene from a position of the virtual mobile device, and the VR or mobile app combines the alpha/AR overlay with the capture of the virtual scene. In operation, the VR simulation sends the position and orientation of the virtual mobile device to the mobile app simulation 1002 so that the mobile app knows where it is relative to the virtual environment 1012. With this information, the system may properly render the viewpoint of the cube 1004 in the example for the alpha/AR overlay. This figure also illustrates a VR simulation with optional second screen video sources 1014 streaming 1016 into the VR environment 1012.
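The composite described for FIG. 10 amounts to alpha-blending the app's RGBA overlay onto the virtual camera capture. The numpy sketch below uses synthetic frames and is illustrative only; frame sizes and the opaque "cube" region are invented.

```python
import numpy as np

def composite_overlay(capture_rgb: np.ndarray,
                      overlay_rgba: np.ndarray) -> np.ndarray:
    """capture_rgb: HxWx3 uint8; overlay_rgba: HxWx4 uint8, same size."""
    alpha = overlay_rgba[..., 3:4].astype(np.float32) / 255.0
    blended = (overlay_rgba[..., :3].astype(np.float32) * alpha
               + capture_rgb.astype(np.float32) * (1.0 - alpha))
    return blended.astype(np.uint8)

capture = np.full((135, 240, 3), 80, dtype=np.uint8)   # virtual camera frame
overlay = np.zeros((135, 240, 4), dtype=np.uint8)      # transparent overlay
overlay[50:85, 100:140] = (255, 200, 0, 255)           # opaque "cube" region
print(composite_overlay(capture, overlay).shape)
```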
Network 1010 may include one or more wired and/or wireless networks. For example, the network 1010 may include a cellular network (e.g., a long-term evolution (LTE) network, a code division multiple access (CDMA) network, a 3G network, a 4G network, a 5G network, another type of next generation network, etc.), a public land mobile network (PLMN), a local area network (LAN), a wide area network (WAN), a metropolitan area network (MAN), a telephone network (e.g., the Public Switched Telephone Network (PSTN)), a private network, an ad hoc network, an intranet, the Internet, a fiber optic-based network, a cloud computing network, and/or the like, and/or a combination of these or other types of networks.

The representative functions described herein may be implemented in hardware, software, or some combination thereof. For instance, the representative functions may be implemented using computer processors, computer logic, application specific integrated circuits (ASIC), digital signal processors, etc., as will be understood by those skilled in the art(s) based on the discussion given herein. Accordingly, any processor that performs the functions described herein is within the scope and spirit of the embodiments presented herein.

The following describes a general-purpose computer system that may be used to implement embodiments of the disclosure presented herein. The present disclosure may be implemented in hardware, or as a combination of software and hardware. Consequently, the disclosure may be implemented in the environment of a computer system or other processing system. An example of such a computer system 1100 is shown in FIG. 11. Computer system 1100 includes one or more processors (also called central processing units, or CPUs), such as processor 1104. Processor 1104 may be a special purpose or a general purpose digital signal processor. Processor 1104 is connected to a communication infrastructure 1106 (for example, a bus or network). Various software implementations are described in terms of this exemplary computer system. After reading this description, it will become apparent to a person skilled in the relevant art how to implement the disclosure using other computer systems and/or computer architectures.

Computer system 1100 also includes user input/output device(s) 1103, such as monitors, keyboards, pointing devices, etc., that communicate with communication infrastructure 1106 through user input/output interface(s) 1102.

Computer system 1100 also includes a main memory 1108, preferably random access memory (RAM), and may also include a secondary memory 1110. The secondary memory 1110 may include, for example, a hard disk drive 1112, and/or a RAID array 1116, and/or a removable storage drive 1114, representing a floppy disk drive, a magnetic tape drive, an optical disk drive, etc. The removable storage drive 1114 reads from and/or writes to a removable storage unit 1118 in a well-known manner. Removable storage unit 1118 represents a floppy disk, magnetic tape, optical disk, etc. As will be appreciated, the removable storage unit 1118 includes a computer usable storage medium having stored therein computer software and/or data.

In alternative implementations, secondary memory 1110 may include other similar means for allowing computer programs or other instructions to be loaded into computer system 1100. Such means may include, for example, a removable storage unit 1122 and an interface 1120.
Examples of such means may include a program cartridge and cartridge interface (such as that found in video game devices), a removable memory chip (such as an EPROM or PROM) and associated socket, and other removable storage units 1122 and interfaces 1120 which allow software (e.g., instructions) and data to be transferred from the removable storage unit 1122 to computer system 1100.

Computer system 1100 may also include a communications interface 1124. Communications interface 1124 enables computer system 1100 to communicate and interact with any combination of remote devices, remote networks, remote entities, etc. (individually and collectively referenced by reference number 1128). Examples of communications interface 1124 may include a modem, a network interface (such as an Ethernet card), a communications port, a PCMCIA slot and card, etc., that are coupled to a communications path 1126. The communications path 1126 may be implemented using wire or cable, fiber optics, a phone line, a cellular phone link, an RF link and other communications links or channels.

The terms "computer program medium" and "computer usable medium" are used herein to generally refer to media such as removable storage drive 1114, a hard disk installed in hard disk drive 1112, or other hardware type memory. These computer program products are means for providing or storing software (e.g., instructions) to computer system 1100.

Computer programs (also called computer control logic) are stored in main memory 1108 and/or secondary memory 1110. Computer programs may also be received via communications interface 1124. Such computer programs, when executed, enable the computer system 1100 to implement the present disclosure as discussed herein. In particular, the computer programs, when executed, enable the processor 1104 to implement the processes and/or functions of the present disclosure. For example, when executed, the computer programs enable processor 1104 to implement part of or all of the steps described above with reference to the flowcharts herein. Where the disclosure is implemented using software, the software may be stored in a computer program product and loaded into computer system 1100 using RAID array 1116, removable storage drive 1114, hard drive 1112 or communications interface 1124.

In other embodiments, features of the disclosure are implemented primarily in hardware using, for example, hardware components such as Application Specific Integrated Circuits (ASICs) and programmable or static gate arrays or other state machine logic. Implementation of a hardware state machine so as to perform the functions described herein will also be apparent to persons skilled in the relevant art(s).

The aforementioned description of the specific embodiments will so fully reveal the general nature of the disclosure that others may, by applying knowledge within the skill of the art, readily modify and/or adapt for various applications such specific embodiments, without undue experimentation, without departing from the general concept of the present disclosure. Therefore, such adaptations and modifications are intended to be within the meaning and range of equivalents of the disclosed embodiments, based on the teaching and guidance presented herein.
It is to be understood that the phraseology or terminology herein is for the purpose of description and not of limitation, such that the terminology or phraseology of the present specification is to be interpreted by the skilled artisan in light of the teachings and guidance.

References in the specification to "one embodiment," "an embodiment," "an exemplary embodiment," etc., indicate that the embodiment described may include a particular feature, structure, or characteristic, but every embodiment may not necessarily include the particular feature, structure, or characteristic. Moreover, such phrases are not necessarily referring to the same embodiment. Further, when a particular feature, structure, or characteristic is described in connection with an embodiment, it is submitted that it is within the knowledge of one skilled in the art to affect such feature, structure, or characteristic in connection with other embodiments whether or not explicitly described.

The exemplary embodiments described herein are provided for illustrative purposes, and are not limiting. Other exemplary embodiments are possible, and modifications may be made to the exemplary embodiments within the spirit and scope of the disclosure. Therefore, the specification is not meant to limit the disclosure. Rather, the scope of the disclosure is defined only in accordance with the following claims and their equivalents.

Embodiments may be implemented in hardware (e.g., circuits), firmware, software, or any combination thereof. Embodiments may also be implemented as instructions stored on a machine-readable medium, which may be read and executed by one or more processors. A machine-readable medium may include any hardware mechanism for storing information in a form readable by a machine (e.g., a computing device). For example, a machine-readable medium may include read only memory (ROM); random access memory (RAM); magnetic disk storage media; optical storage media; flash memory devices; and other hardware implementations. Further, firmware, software, routines, and instructions may be described herein as performing certain actions. However, it should be appreciated that such descriptions are merely for convenience and that such actions in fact result from computing devices, processors, controllers, or other devices executing the firmware, software, routines, instructions, etc. Further, any of the implementation variations may be carried out by a general-purpose computer.

In embodiments having one or more components that include one or more processors, one or more of the processors may include (and/or be configured to access) one or more internal and/or external memories that store instructions and/or code that, when executed by the processor(s), cause the processor(s) to perform one or more functions and/or operations related to the operation of the corresponding component(s) as described herein and/or as would be appreciated by those skilled in the relevant art(s).

It is to be appreciated that the Detailed Description section, and not the Abstract section, is intended to be used to interpret the claims. The Abstract section may set forth one or more but not all exemplary embodiments of the present invention as contemplated by the inventor(s), and thus is not intended to limit the present invention and the appended claims in any way.

The present invention has been described above with the aid of functional building blocks illustrating the implementation of specified functions and relationships thereof.
The boundaries of these functional building blocks have been arbitrarily defined herein for the convenience of the description. Alternate boundaries may be defined so long as the specified functions and relationships thereof are appropriately performed.

The foregoing description of the specific embodiments will so fully reveal the general nature of the invention that others may, by applying knowledge within the skill of the art, readily modify and/or adapt for various applications such specific embodiments, without undue experimentation, without departing from the general concept of the present invention. Therefore, such adaptations and modifications are intended to be within the meaning and range of equivalents of the disclosed embodiments, based on the teaching and guidance presented herein. It is to be understood that the phraseology or terminology herein is for the purpose of description and not of limitation, such that the terminology or phraseology of the present specification is to be interpreted by the skilled artisan in light of the teachings and guidance.

The breadth and scope of the present invention should not be limited by any of the above-described exemplary embodiments, but should be defined only in accordance with the following claims and their equivalents.

What is claimed is:

1. A method for mobile application (app) development, the method comprising:
    determining, by a mobile app development system, a position and orientation of a mobile device tracking module attached to a mobile device;
    calculating, by the mobile app development system, a position and orientation of the mobile device relative to a proximate virtual reality (VR) device based on the position and orientation of the mobile device tracking module;
    simulating, by the mobile app development system, a real world environment to generate a simulated environment;
    generating, by the mobile app development system, a virtual visualization of the mobile device, the virtual visualization combining physical inputs and simulated inputs; and
    rendering, by the mobile app development system, the virtual visualization of the mobile device within the simulated environment based on the position and orientation of the mobile device;
    wherein the position and orientation of the mobile device provides a virtual position and orientation of the mobile device relative to a virtual origin within the rendering, and wherein the calculating further comprises determining a position and orientation of a display screen of the mobile device based on predetermined dimensional information of the mobile device.

2. The method of claim 1, wherein the determining comprises receiving, from one or more cameras, the position and orientation of the mobile device tracking module.

3. The method of claim 2, further comprising the one or more cameras detecting identifiable markings around a perimeter of the mobile device tracking module.

4. The method of claim 1, wherein the determining comprises receiving, from one or more cameras of the VR device, the position and orientation of the mobile device tracking module.
5. The method of claim 1, wherein the determining the position and orientation of the mobile device tracking module includes receiving position and orientation information from active tracking elements within the mobile device tracking module, the active tracking elements comprising any of a Global Positioning System (GPS), a gyroscope, an accelerometer, a magnetometer, an inclinometer, sensors, or a light system.

6. The method of claim 1, wherein the determining comprises receiving, from one or more cameras, the position and orientation of the mobile device tracking module and the position and orientation of the VR device.

7. The method of claim 1, further comprising determining location coordinates of a perimeter of the mobile device.

8. The method of claim 1, further comprising determining location coordinates of a boundary of the display screen of the mobile device.

9. The method of claim 1, further comprising communicating, by the mobile app development system, the rendering to the VR device.

10. The method of claim 4, further comprising displaying the rendering within the VR device.

11. A system for tracking a mobile device, the system comprising:
    a virtual reality (VR) device configured to:
        receive position and orientation information of a mobile device tracking module attached to the mobile device;
        determine, based on the position and orientation information of the mobile device tracking module, a position and orientation of the mobile device relative to the VR device;
        simulate a real world environment to generate a simulated environment;
        generate a virtual visualization of the mobile device, the virtual visualization combining physical inputs and simulated inputs; and
        render the virtual visualization of the mobile device within the virtual simulated environment based on the position and orientation of the mobile device;
    wherein the position and orientation of the mobile device provides a virtual position and orientation of the mobile device relative to a virtual origin within the rendering, and wherein the determining further comprises determining a position and orientation of a display screen of the mobile device based on predetermined dimensional information of the mobile device.

12. The system of claim 11, wherein the VR device is further configured with one or more cameras and the position and orientation information of the mobile device tracking module is received from the one or more cameras.

13. The system of claim 12, wherein the one or more cameras are configured to track the position and orientation of the tracking module based on machine vision techniques.

14. The system of claim 11, wherein the position and orientation information of the mobile device tracking module is received from a stand-alone camera configured to track the position and orientation of the mobile device tracking module and send the position and orientation of the mobile device tracking module to the VR device.

15. The system of claim 11, wherein the mobile device tracking module comprises active tracking elements including any of a Global Positioning System (GPS), a gyroscope, an accelerometer, a magnetometer, an inclinometer, sensors, or a light system.

16. The system of claim 11, wherein the mobile device tracking module comprises passive tracking elements of identifiable markings around a perimeter of the mobile device tracking module.

17. The system of claim 11, wherein the mobile device tracking module comprises any of an active sensor, a computer processor, communication circuitry, an interface, a position sensor, an orientation sensor, or a power source.
18. The system of claim 11, wherein the mobile device comprises: a smartphone, a tablet, or a wearable computer.

19. A non-transitory computer-readable device having instructions stored thereon that, when executed by at least one computing device, cause the at least one computing device to perform operations comprising:
    determining a position and orientation of a mobile device tracking module attached to a mobile device;
    calculating a position and orientation of the mobile device relative to a proximate virtual reality (VR) device based on the position and orientation of the mobile device tracking module;
    simulating a real world environment to generate a simulated environment;
    generating a virtual visualization of the mobile device, the virtual visualization combining physical inputs and simulated inputs; and
    rendering the virtual visualization of the mobile device within the simulated environment based on the position and orientation of the mobile device;
    wherein the position and orientation of the mobile device provides a virtual position and orientation of the mobile device relative to a virtual origin within the rendering, and wherein the calculating further comprises determining a position and orientation of a display screen of the mobile device based on predetermined dimensional information of the mobile device.
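For readers who want a concrete picture of the pose calculation recited in claim 1, the following minimal sketch (not part of the patent's disclosure; all names and offset values are hypothetical) shows one way to compose the tracked module pose with a fixed module-to-screen offset derived from predetermined dimensional information of the device, and then express the screen pose relative to a virtual origin, using Python with NumPy:

    # Illustrative sketch of the claim 1 pose calculation (hypothetical names).
    # Poses are 4x4 homogeneous transforms.
    import numpy as np

    def pose(rotation: np.ndarray, translation: np.ndarray) -> np.ndarray:
        """Build a 4x4 homogeneous transform from a 3x3 rotation and 3-vector."""
        T = np.eye(4)
        T[:3, :3] = rotation
        T[:3, 3] = translation
        return T

    # Predetermined dimensional information for the device model: the fixed
    # rigid offset from the tracking module mount to the center of the
    # display screen (placeholder values, in meters).
    MODULE_TO_SCREEN = pose(np.eye(3), np.array([0.0, -0.07, 0.004]))

    def screen_pose_in_simulation(module_pose_world: np.ndarray,
                                  virtual_origin_world: np.ndarray) -> np.ndarray:
        """Calculate the display screen's pose relative to the virtual origin.

        module_pose_world: tracked module pose in the real-world tracking frame.
        virtual_origin_world: pose of the simulation's origin in the same frame.
        """
        screen_world = module_pose_world @ MODULE_TO_SCREEN
        # Express the screen pose relative to the virtual origin so a renderer
        # can place the virtual visualization of the device in the simulation.
        return np.linalg.inv(virtual_origin_world) @ screen_world

The same composition can be applied each frame, so the rendered virtual visualization follows the tracked module wherever it moves.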
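Claims 5 and 15 recite active tracking elements such as a gyroscope and an accelerometer. A common way to combine such sensors into a stable orientation estimate is a complementary filter; the sketch below is illustrative only and not necessarily how the patented module fuses its sensors. It blends short-term gyroscope integration with the absolute gravity reference supplied by an accelerometer:

    # Minimal complementary-filter sketch for gyroscope + accelerometer fusion
    # (one possible approach; function and parameter names are hypothetical).
    import numpy as np

    def fuse_orientation(pitch_roll: np.ndarray, gyro_rates: np.ndarray,
                         accel: np.ndarray, dt: float,
                         alpha: float = 0.98) -> np.ndarray:
        """Update [pitch, roll] (radians) from gyro rates and an accelerometer.

        gyro_rates: angular velocity about the x and y axes (rad/s).
        accel: 3-axis accelerometer sample (any consistent units).
        alpha: blend factor; the gyro dominates short-term motion while the
               accelerometer slowly corrects integration drift.
        """
        # Integrate the gyroscope for a short-term estimate.
        predicted = pitch_roll + gyro_rates * dt
        # The gravity direction from the accelerometer gives an absolute
        # tilt reference (valid when the device is not accelerating hard).
        ax, ay, az = accel
        accel_pitch = np.arctan2(-ax, np.hypot(ay, az))
        accel_roll = np.arctan2(ay, az)
        return alpha * predicted + (1 - alpha) * np.array([accel_pitch, accel_roll])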
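Claims 3, 13, and 16 recite one or more cameras detecting identifiable markings around the perimeter of the tracking module and tracking it with machine vision techniques. One plausible realization, sketched under the assumptions that the markings are fiducial markers and that OpenCV's aruco module is available (exact API names vary across OpenCV versions, and camera_matrix/dist_coeffs would come from a prior calibration), recovers the module pose with solvePnP:

    # Hypothetical camera-based passive tracking sketch using OpenCV.
    import cv2
    import numpy as np

    MARKER_SIZE = 0.03  # marker edge length in meters (placeholder)

    def module_pose_from_frame(gray, camera_matrix, dist_coeffs):
        """Return (rvec, tvec) of the first detected marker, or None."""
        dictionary = cv2.aruco.getPredefinedDictionary(cv2.aruco.DICT_4X4_50)
        corners, ids, _ = cv2.aruco.detectMarkers(gray, dictionary)
        if ids is None:
            return None
        # 3D corner coordinates of a square marker centered at the origin,
        # ordered to match the detector's TL, TR, BR, BL corner order.
        s = MARKER_SIZE / 2.0
        object_pts = np.array([[-s, s, 0], [s, s, 0],
                               [s, -s, 0], [-s, -s, 0]], dtype=np.float32)
        image_pts = corners[0].reshape(4, 2).astype(np.float32)
        ok, rvec, tvec = cv2.solvePnP(object_pts, image_pts,
                                      camera_matrix, dist_coeffs)
        return (rvec, tvec) if ok else None

In practice, several markers around the module's perimeter would be detected together so that at least one remains visible from any viewing angle, as the claims' "around a perimeter" language suggests.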
