
International Journal of Geo-Information

Article
Holo3DGIS: Leveraging Microsoft HoloLens in 3D
Geographic Information
Wei Wang 1,2, Xingxing Wu 1,2, Guanchen Chen 3 and Zeqiang Chen 1,2,*
1 State Key Laboratory of Information Engineering in Surveying, Mapping and Remote Sensing,
Wuhan University, 129 Luoyu Road, Wuhan 430079, China; wangwei8091@163.com (W.W.);
xingxingwu@whu.edu.cn (X.W.)
2 Collaborative Innovation Center of Geospatial Technology, 129 Luoyu Road, Wuhan 430079, China
3 TianShui Sanhe Digital Surveying and Mapping Institute Co., Ltd., 1 Mingzhudong Road, Tianshui 741000,
China; sinoma-cgc@163.com
* Correspondence: ZeqiangChen@whu.edu.cn; Tel.: +86-138-7102-5965

Received: 4 January 2018; Accepted: 8 February 2018; Published: 9 February 2018

Abstract: Three-dimensional geographic information systems (3D GIS) attempt to understand and
express the real world from the perspective of 3D space. Currently, 3D GIS perspective carriers
are mainly 2D and not 3D, which influences how 3D information is expressed and further affects
the user cognition and understanding of 3D information. Using mixed reality as a carrier of 3D
GIS is promising and may overcome problems when using 2D perspective carriers in 3D GIS.
The objective of this paper is to propose an architecture and method to leverage the Microsoft
HoloLens in 3D geographic information (Holo3DGIS). The architecture is designed according to
three processes for developing holographic 3D GIS; the three processes are the creation of a 3D
asset, the development of a Holo3DGIS application, and the compiler deployment of the Holo3DGIS
application. Basic geographic data of Philadelphia were used to test the proposed methods and
Holo3DGIS. The experimental results showed that the Holo3DGIS can leverage 3D geographic
information with the Microsoft HoloLens. By changing the carrier of 3D geographic information from
a 2D computer screen to the 3D holographic perspective of the HoloLens mixed reality glasses,
Holo3DGIS changes the traditional vision, body sense, and interaction modes, enabling GIS users to
experience real 3D GIS.

Keywords: mixed reality; Microsoft HoloLens; 3D GIS; Holo3DGIS

1. Introduction
Three-dimensional geographic information systems (3D GIS) attempt to understand and express
the real world from the perspective of 3D space [1–3]. Currently, 3D GIS perspective carriers are mainly
2D (e.g., a computer screen) rather than 3D (e.g., a head-mounted display), which affects the expression of
3D information and further affects user cognition and understanding of 3D information. Virtual reality
(VR), augmented reality (AR), and mixed reality (MR) headsets are common 3D perspective carriers. VR,
AR, and MR enhance a user’s sense of experience and reality [4], but they are different from each other.
VR places a user in a digital scene with an adequate immersion experience, but it cuts off the physical
world [5–7]. AR presents a real physical world to a user and overlays digital content over the physical
world in real time [8–11]. However, AR simply superimposes virtual objects onto the real environment
and cannot properly handle the relationship between virtual objects and real objects. MR requires that
the system correctly handle this relationship. MR adopts the advantages of both VR and AR and achieves
a symbiotic blend between reality and virtuality: it maps the physical world into the virtual world
completely and in real time, so the user can see both real-world information and virtual
information [12,13]. MR allows a user to experience depth,


spatial persistence, and perspective, while AR cannot [14]. Therefore, using MR as the carrier of 3D GIS is promising and may overcome the problem of 2D perspective carriers in 3D GIS. Based on the above advantages, a typical MR head-mounted display, the Microsoft HoloLens, is used as an example in this paper to show how to express and visualize 3D geographic information from a 3D perspective. The HoloLens is a pair of 3D perspective holographic glasses with a central processing unit (CPU), a graphics processing unit (GPU), and a holographic processing unit (HPU) [15]. The HPU is a proprietary chip that handles real-time spatial mapping and processing. The HoloLens has more advanced functions than traditional AR devices, such as stereoscopic 3D display, gaze design, gesture design, spatial sound design, voice design, and spatial mapping [16]. Currently, the HoloLens has been widely used in visualization research, including the application of the HoloLens in disasters and emergency management [17], the construction of virtual laboratories [18], the Mars exploration project [19], anatomical medicine, pharmacy, and other fields [20].

However, there is a problem: the main problem is the lack of a commonly shared data structure and interface between MR and 3D geospatial information. To address the problem of accessing 3D geospatial information content on the MR platform, the objective of this paper is to propose an architecture and method to leverage the Microsoft HoloLens in 3D geographic information. Following this idea, a holographic three-dimensional geographic information system (Holo3DGIS) is realized. Finally, basic geographic data of Philadelphia were used to test the proposed methods and Holo3DGIS. The experimental results showed that Holo3DGIS can leverage the Microsoft HoloLens in 3D geographic information.

The remainder of this paper is organized as follows: Section 2 describes the implementation of the proposed method regarding the integration of the HoloLens and 3D GIS. Section 3 describes the contents and results of the experiment and discusses the experiment. Section 4 concludes this paper and outlines future work.

2. Methods

The main goal of this paper is to realize the integration of 3D geographic information and mixed reality with a head-mounted HoloLens display. To achieve this goal, Holo3DGIS is designed, as shown in Figure 1, with three processes: the creation of 3D assets, the development of Holo3DGIS applications, and the compiler deployment of Holo3DGIS applications. The creation of the 3D asset layer provides 3D geographical scene content for the Holo3DGIS application. The development of the Holo3DGIS application layer is responsible for designing the interaction between 3D geographic information and users. The compiler deployment of the Holo3DGIS application layer ensures that the application is deployed to the HoloLens or a high-performance computer. The architecture is designed according to these three processes for developing a holographic 3D GIS. For the first process, three methods are proposed: the preprocessing of 2D basic geographic data, fast 3D modelling based on a specific set of rules, and the transformation of a general 3D model format. For the second process, four interactive methods are proposed: gaze design, gesture design, voice design, and spatial mapping. For the third process, two deployment methods are introduced: the deployment pattern for small scenes and the deployment pattern for large scenes.

Figure 1. System architecture diagram.

2.1. Creation of the 3D Asset

A 3D asset mainly includes a geographic scene model and a third-party 3D model, which provide material and content for the Holo3DGIS application. As shown in Figure 1, the creation of the 3D asset consists of a geographic information layer and a third-party model layer. The geographic information layer is responsible for preprocessing 2D basic geographic information and rapidly executing 3D modelling based on specific rules to create a geographic scene model for Holo3DGIS. The geographic scene model is a 3D geographical model transformed from basic 2D geographic data. The third-party 3D model is a model built by other modelling software, making full use of existing 3D models; the model layer mainly provides third-party 3D models for Holo3DGIS. Since the default formats of different modelling software are inconsistent, format converters are needed to convert the different model formats into a general 3D model with the FBX format (FBX is a free 3D data interchange format across platforms).

Figure 2 shows a flowchart of the 3D asset creation method in Holo3DGIS. The three processes of the flowchart are the preprocessing of 2D basic geographic data, the rapid 3D modelling of the data based on specific rules, and the transformation of the general 3D model format.

Figure 2. 3D asset creation method flowchart in the Holo3DGIS application.

2.1.1. Preprocessing of 2D Basic Geographic Data

The preprocessing of 2D basic geographic data is the basis for creating the 3D geographic scene model. Basic geographic data mainly include digital elevation models (DEMs), basemaps, point features (e.g., infrastructure and trees), polyline features (e.g., streets), polygon features (e.g., buildings), and other data. Upon obtaining the basic geographic data, the data require preprocessing. The preprocessing includes two parts: the transformation of 2D geographic features into 3D geographic features and the levelling of surface data. The data processing steps are as follows:

Step 1: 2D features are transformed into 3D features based on the terrain undulation (using the ArcGIS ModelBuilder tool). The main idea is to complete the transformation of 2D features into 3D features by interpolating the Z information of the features from the DEM; 2D point, line, and polygon features can be transformed by batch processing into 3D point, line, and polygon features. The specific transformation algorithm flow is shown in Figure 3.

Figure 3. The algorithm flowchart of 2D features transformed into 3D features.

Step 2: The 3D polygons are burned into the DEM. Because most 3D polygons cannot merge into the DEM nicely, they need to be burned into the DEM. Burning includes two parts: editing the triangulated irregular network (TIN) and transforming the TIN into a grid. The specific algorithm flow chart for burning 3D polygons into DEMs is shown in Figure 4.

Figure 4. Algorithm flow chart for burning 3D polygons into DEMs.

Step 3: The processed vector data, basemap, DEM image, and other data are stored in the file geodatabase. File geodatabase storage and management of the 3D geographic data provide the data interface for rapid 3D modelling using the CityEngine software.
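The core of Step 1 is lifting each 2D vertex to 3D by interpolating a Z value from the DEM. The following C# fragment is a minimal sketch of that idea only; in the actual workflow this step is performed by the ArcGIS ModelBuilder tools, and the row-major grid layout, origin, and cell size used here are assumptions for illustration.

using System;

// Minimal sketch of Step 1: lift a 2D vertex to 3D by bilinear interpolation of a
// DEM grid. The grid layout, origin and cell size are assumptions; the actual
// workflow performs this step with the ArcGIS ModelBuilder tools.
public struct Point3D { public double X, Y, Z; }

public static class DemInterpolation
{
    public static Point3D LiftTo3D(float[,] dem, double originX, double originY,
                                   double cellSize, double x, double y)
    {
        // Map coordinates -> fractional grid indices (point assumed inside the DEM).
        double col = (x - originX) / cellSize;
        double row = (y - originY) / cellSize;

        int c0 = Math.Max(0, Math.Min((int)Math.Floor(col), dem.GetLength(1) - 1));
        int r0 = Math.Max(0, Math.Min((int)Math.Floor(row), dem.GetLength(0) - 1));
        int c1 = Math.Min(c0 + 1, dem.GetLength(1) - 1);
        int r1 = Math.Min(r0 + 1, dem.GetLength(0) - 1);
        double fc = col - c0, fr = row - r0;

        // Bilinear blend of the four surrounding DEM cells gives the Z value.
        double z = dem[r0, c0] * (1 - fc) * (1 - fr)
                 + dem[r0, c1] * fc * (1 - fr)
                 + dem[r1, c0] * (1 - fc) * fr
                 + dem[r1, c1] * fc * fr;

        return new Point3D { X = x, Y = y, Z = z };
    }
}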

2.1.2. Rapid 3D Modelling Based on Specific Rules

Three-dimensional geographic modelling requires combined geometric and attribute data from the 2D geographic information. Based on the CGA (computer generated architecture) [21] rules of CityEngine, this method can quickly batch-generate 3D models of buildings, roads, infrastructure, and other geographic data. This paper uses the contour and attribute information of the preprocessed GIS vector and raster data. Through rule-driven files, the rules are repeatedly optimized to create high-quality 3D content; fine segmentation elements are classified, and the corresponding rules are defined for batch modelling of the same categories. These rules can improve the efficiency of the 3D modelling. The model generated by these rules can also be manually adjusted in height, texture, and other attributes to facilitate personalized customization. Different geographic objects use different rule files (e.g., the rule files for buildings and roads are different). Table 1 shows the rule file for the 3D modelling of buildings.

The general process for modelling buildings is as follows. First, the 2D surface features are stretched (extruded) based on the height property of the building; then, after stretching, the buildings are divided and segmented according to the actual needs, by the objects that need to be modelled. Finally, textures are attached to the segmented sections using the map function. The result of the CGA file execution is a textured 3D building model.

Table 1. CGA rule file for the 3D modelling of buildings.

Data preparation: 2D building vector, contour line, and texture map.

Lot --> extrude(height) Building
    // Creating the 3D buildings
Building --> comp(f){ front: Frontfacade | side: Sidefacade | top: roof }
    // Divide the building into three parts: the front, the sides, and the roof

Frontfacade --> split(y) Repeat{ Floor }
    // Divide the building into target floors
Floor --> split(x) Repeat{ Wall | Tile | Wall }
    // Split the floors into walls, windows, doors, and floors
(Wall | Tile | Window) --> texture(texture_function)
    // Paste the texture for each segmented part with the mapping function

Interpretative statement: the terms Lot, Building, Frontfacade, Floor, and (Wall | Tile | Window) are variables in the CGA rule file; extrude(), comp(), split(), Repeat(), and texture() are functions of the CGA rules; the symbol --> is similar to equate (=).

2.1.3. Transformation of the General 3D Model Format

Three-dimensional geographic scenes require the use of existing 3D models with different formats (such as Max, obj, x, FBX, and DAE) to save money by reusing 3D models. Different formats of 3D models are incompatible and heterogeneous; therefore, they must be transformed into a general 3D model format. The FBX format is a common and generally-accepted 3D model format, so the different formats of 3D models are transformed into the FBX format. The specific conversion process is shown in Figure 5. First, the 3D model file is opened to obtain its format; then, the format of the file is determined and an existing reading algorithm that matches the format is found. Information on the points, vectors, planes, normal vectors, textures, and materials is obtained, and these data are stored in the FBX general storage structure. Finally, the 3D resource is transformed into a uniformly-compatible 3D model.

Figure 5. Flowchart of the model format conversion.
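The conversion flow in Figure 5 amounts to detecting the source format and dispatching to a matching reader before writing a common FBX output. The C# sketch below shows that dispatch logic only; SceneData, the reader classes, and FbxWriter are hypothetical placeholders, since the paper does not name a specific conversion library.

using System;
using System.Collections.Generic;
using System.IO;

// Minimal sketch of the Figure 5 flow: detect the source format from the file
// extension, parse it with a matching reader, and write the common FBX output.
// SceneData, ObjReader, ThreeDsReader, StlReader and FbxWriter are placeholders.
public class SceneData { /* points, normal vectors, planes, textures, materials ... */ }

public interface IModelReader { SceneData Read(string path); }

public class ObjReader : IModelReader { public SceneData Read(string path) { /* parse OBJ */ return new SceneData(); } }
public class ThreeDsReader : IModelReader { public SceneData Read(string path) { /* parse 3DS */ return new SceneData(); } }
public class StlReader : IModelReader { public SceneData Read(string path) { /* parse STL */ return new SceneData(); } }

public class FbxWriter { public void Write(SceneData scene, string path) { /* write FBX */ } }

public static class FormatConverter
{
    static readonly Dictionary<string, IModelReader> Readers =
        new Dictionary<string, IModelReader>(StringComparer.OrdinalIgnoreCase)
        {
            { ".obj", new ObjReader() },
            { ".3ds", new ThreeDsReader() },
            { ".stl", new StlReader() },
        };

    // Convert any supported model file into the general FBX interchange format.
    public static void ConvertToFbx(string inputPath, string outputPath)
    {
        string ext = Path.GetExtension(inputPath);
        IModelReader reader;
        if (!Readers.TryGetValue(ext, out reader))
            throw new NotSupportedException("No reader registered for " + ext);

        SceneData scene = reader.Read(inputPath);  // format-specific parsing
        new FbxWriter().Write(scene, outputPath);  // common FBX output
    }
}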
Through the abovementioned three steps, the creation of the 3D asset is completed.

2.2. Holo3DGIS App Development

As shown in Figure 1, the development of the Holo3DGIS application mainly comprises two parts. The first part is the geographic scene model, which is created by the 3D asset layer; the 3D model provides the basic content for the development of the Holo3DGIS application. The second part is the interactive design, which is responsible for the interaction between the user, the virtual geographic world, and the physical world. In this paper, the Unity3D game engine is used as the core development platform, and this study uses the HoloToolkit Software Development Kit (SDK) integrated with the Unity3D game engine; the HoloToolkit SDK integrates the basic class library that is responsible for the interaction design. Figure 6 shows a logic diagram of the interactive design, where the voice design, gaze design, and gesture design are responsible for achieving a natural and concise interaction between users and the 3D geographic information, and the spatial mapping design is responsible for the interaction between the virtual world and the physical world and for their seamless integration. Minimal code sketches illustrating the four designs are provided after the list below.

• Gaze design: Based on eye tracking technology, this method is the fastest way to obtain interactions between users and Holo3DGIS applications. It is also the first form of user interaction input and is used to track holographic objects. The design of the gaze function needs to not only capture the location of the user's sight, but also add voice or visual information as feedback to the user. The specific implementation of the gaze function is based on the location and direction of the user's head, using the physical ray of the Unity3D engine: after the ray collides with a holographic object, the collision result includes the location of the collision point and the information of the collision object, which enables the tracking of objects in the holographic geographical scene.

• Gesture design: The function of the gesture design is to transform the user's intention into actual action. By tracking and capturing the input gesture based on the location and state of the user's hand, the system automatically creates the appropriate response to manipulate the objects in the holographic geographic scene. Holo3DGIS only recognizes three gestures: the air-tap, the navigation gesture, and manipulation. The air-tap triggers click and double-click events, which are used to select objects; the navigation gesture is used to realize the rotation of holographic objects; and manipulation controls the movement of holographic objects.

• Voice design: The purposes of the voice design and the gesture design are the same, namely to realize interactions with objects in the holographic geographic scene, but their methods of realization are different. Interaction between the user and the holographic object through voice control can take the place of the actions of the user's hands. Voice control provides users with a voice command experience by setting keywords and corresponding behaviours in the application; when the user speaks a keyword, the pre-set action is executed. The keywords and their corresponding behaviours are stored in a hash table, keyword recognizer instances are initialized by registering the keywords, and the registered voice commands accomplish the subsequent processing. The following rules need to be followed in the design of voice control: (1) create very concise commands; (2) use simple vocabulary; (3) avoid using single-tone commands; and (4) avoid using the HoloLens system commands. On the one hand, this is convenient for the user's memory; on the other hand, it is convenient for system identification.
• Spatial mapping: Spatial mapping, which refers to the ability to model the physical world with a 3D camera, can introduce the virtual world into the real world and place the holographic 3D geographical scene into the most suitable location in the user's physical space, achieving a combination of the real world and the virtual geographic world. This is one of the indicators that distinguishes MR from AR. The spatial mapping design mainly consists of two steps. First, the HoloLens depth camera scans the target physical space; the scanned data and built-in triangulation are used to model and digitize the physical world. Through surface mapping, the HoloLens avoids placing images in obstructed positions and generates an experience that is contextualized to the user's location. The second step is to calculate whether or not the digitized physical space can fully accommodate the virtual geographic scene. If it can, the combination of the virtual and real worlds is realized; if not, the system provides the user with feedback that the virtual and real worlds cannot be combined. In the holographic application, the virtual world refers to the geographic scene created by the 3D asset layer, and the physical world refers to the target space acquired by HoloLens scanning and digitization; the real physical world is registered into the virtual world through spatial mapping.

Figure 6. Logic diagram of the interactive design.
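The first sketch illustrates the gaze design described above: a physics ray is cast from the user's head (the main camera on the HoloLens) along its forward direction, and the hit object is remembered. This is a minimal Unity C# outline, not the paper's actual implementation; the class name and the 10 m range are assumptions.

using UnityEngine;

// Minimal gaze sketch: cast a physics ray from the user's head (the main camera
// on HoloLens) along its forward direction and remember what it hits.
public class GazeSketch : MonoBehaviour
{
    public float maxGazeDistance = 10.0f;   // assumed range, not from the paper
    public GameObject FocusedObject { get; private set; }

    void Update()
    {
        Vector3 headPosition = Camera.main.transform.position;
        Vector3 gazeDirection = Camera.main.transform.forward;

        RaycastHit hitInfo;
        if (Physics.Raycast(headPosition, gazeDirection, out hitInfo, maxGazeDistance))
        {
            // The collision result carries the hit point and the hit object,
            // which can drive a cursor or highlight in the holographic scene.
            FocusedObject = hitInfo.collider.gameObject;
        }
        else
        {
            FocusedObject = null;
        }
    }
}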
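The second sketch outlines the gesture design, assuming the HoloLens input API of the Unity releases of that period (the GestureRecognizer class lived in UnityEngine.VR.WSA.Input in Unity 5.x and UnityEngine.XR.WSA.Input in later versions, and event names differ slightly between releases); it is a sketch, not the paper's exact code.

using UnityEngine;
using UnityEngine.XR.WSA.Input;   // UnityEngine.VR.WSA.Input in Unity 5.x

// Minimal gesture sketch: recognize air-tap and navigation gestures and forward
// them to the object currently targeted by gaze.
public class GestureSketch : MonoBehaviour
{
    GestureRecognizer recognizer;
    public GazeSketch gaze;   // the gaze component from the previous sketch

    void Start()
    {
        recognizer = new GestureRecognizer();
        recognizer.SetRecognizableGestures(GestureSettings.Tap |
                                           GestureSettings.NavigationX |
                                           GestureSettings.NavigationY);

        // Air-tap: select or query the holographic object under the gaze cursor.
        recognizer.TappedEvent += (source, tapCount, headRay) =>
        {
            if (gaze != null && gaze.FocusedObject != null)
                gaze.FocusedObject.SendMessage("OnSelect",
                    SendMessageOptions.DontRequireReceiver);
        };

        // Navigation: use the normalized hand offset to rotate the focused object.
        recognizer.NavigationUpdatedEvent += (source, normalizedOffset, headRay) =>
        {
            if (gaze != null && gaze.FocusedObject != null)
                gaze.FocusedObject.transform.Rotate(Vector3.up,
                    -normalizedOffset.x * 2.0f);
        };

        recognizer.StartCapturingGestures();
    }

    void OnDestroy()
    {
        recognizer.StopCapturingGestures();
        recognizer.Dispose();
    }
}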
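The third sketch corresponds to the voice design: the keyword-to-behaviour table maps naturally onto Unity's KeywordRecognizer (UnityEngine.Windows.Speech). The example keywords below are illustrative only and are not the paper's actual command set.

using System;
using System.Collections.Generic;
using UnityEngine;
using UnityEngine.Windows.Speech;

// Minimal voice sketch: keywords and their behaviours are stored in a dictionary
// (the "hash table" in the text), registered with a KeywordRecognizer, and the
// bound action runs when the recognizer reports a match.
public class VoiceSketch : MonoBehaviour
{
    KeywordRecognizer recognizer;
    readonly Dictionary<string, Action> keywords = new Dictionary<string, Action>();

    void Start()
    {
        // Illustrative commands only; the paper's real command set is not listed here.
        keywords.Add("rotate city", () => Debug.Log("Start rotating the scene"));
        keywords.Add("reset view", () => Debug.Log("Reset the holographic scene"));

        recognizer = new KeywordRecognizer(new List<string>(keywords.Keys).ToArray());
        recognizer.OnPhraseRecognized += args =>
        {
            Action action;
            if (keywords.TryGetValue(args.text, out action))
                action();   // execute the behaviour bound to the spoken keyword
        };
        recognizer.Start();
    }

    void OnDestroy()
    {
        if (recognizer != null)
        {
            if (recognizer.IsRunning) recognizer.Stop();
            recognizer.Dispose();
        }
    }
}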
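The fourth sketch illustrates the second spatial mapping step, checking whether the scanned room can accommodate the virtual geographic scene, reduced to a bounding-box comparison. How the scanned room volume is obtained (e.g., from the spatial mapping meshes) is outside this fragment and is assumed.

using UnityEngine;

// Minimal sketch of the "does the real space fit the virtual scene?" check:
// combine the renderer bounds of the virtual geographic scene and test whether
// they fit inside a bounds volume derived from the spatial mapping scan.
public static class SceneFitCheck
{
    public static Bounds GetSceneBounds(GameObject sceneRoot)
    {
        Renderer[] renderers = sceneRoot.GetComponentsInChildren<Renderer>();
        if (renderers.Length == 0)
            return new Bounds(sceneRoot.transform.position, Vector3.zero);

        Bounds bounds = renderers[0].bounds;
        for (int i = 1; i < renderers.Length; i++)
            bounds.Encapsulate(renderers[i].bounds);   // grow to cover every part
        return bounds;
    }

    // roomBounds is assumed to be extracted from the scanned spatial mapping mesh.
    public static bool FitsInRoom(GameObject sceneRoot, Bounds roomBounds)
    {
        Bounds scene = GetSceneBounds(sceneRoot);
        return roomBounds.Contains(scene.min) && roomBounds.Contains(scene.max);
    }
}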

2.3. Holo3DGIS App Compiler Deployment

The rendering ability of the HoloLens is limited compared with that of a high-performance personal computer (PC). By taking into account the performance of the ATOM processor (the smallest and least powerful processor in Intel history) and the HoloLens, Microsoft restricts the volume of a HoloLens application program: a holographic application cannot exceed 900 MB [22]. Therefore, this article uses two methods to deploy Holo3DGIS based on the application volume. When the volume of the Holo3DGIS application is within 900 MB, the small scene deployment pattern is adopted; when the Holo3DGIS application exceeds the limit of 900 MB, the large scene deployment pattern is adopted.

2.3.1. The Deployment Pattern of Small Scenes

The deployment pattern of small scenes deploys Holo3DGIS applications to the HoloLens devices directly, as shown in Figure 7. The process of computing and rendering the entire Holo3DGIS application is accomplished independently by the HoloLens.

Figure 7. Holo3DGIS application deployment for small scenes.

2.3.2. The Deployment Pattern of Large Scenes

Since oversized Holo3DGIS applications exceed the rendering ability of the HoloLens, which leads to HoloLens rendering delays, this paper designs a deployment pattern for large scenes to solve the problem. The main technology of the deployment pattern for large scenes is the use of the HoloLens holographic remote player middleware. With the holographic remote player technology, the HoloLens is connected to the Unity3D engine on a PC, and all rendering calculations are transferred from the HoloLens to a high-performance PC. The computer passes the rendered calculations of the holographic image through wireless fidelity (Wi-Fi) and transmits them to the HoloLens in the form of frames. The HoloLens is only responsible for displaying, capturing, and transmitting interactive information, such as gaze, gesture, and voice, to the PC in real time. During this process, the holographic remote player acts as an intermediate bridge between the PC and the HoloLens. Figure 8 shows the Holo3DGIS application deployment diagram for large scenes.

Figure 8. Holo3DGIS application deployment for large scenes.

3. Experiments and Results

Based on the proposed methodologies, this paper uses basic 2D Philadelphia geographic data and the developer version of the Microsoft HoloLens. An experimental scheme based on the integration of the HoloLens and 3D GIS is designed. The main purposes of the experiment are as follows: first, the real physical world is introduced into the virtual three-dimensional digital city of Philadelphia through the use of spatial mapping technology, and the combination of the virtual world and the real world is integrated. The second goal is to use the 3D stereoscopic display technology of the HoloLens to bring a new and real 3D visual experience to the user. The third goal is to manipulate the holographic digital city by natural interactive means, such as gaze, gesture, and voice, which changes the interactive modes of traditional 3D GIS. The experiment mainly uses GIS technology, rapid 3D city modelling technologies, and mixed reality technologies. The technologies were implemented through the ArcGIS (Environmental Systems Research Institute, Redlands, CA, USA), CityEngine (Environmental Systems Research Institute, Zurich, Switzerland), Unity3D (Unity Technologies, San Francisco, CA, USA), and HoloLens (Microsoft, Redmond, WA, USA) platforms. The main contents of the experiment are described below.

3.1. Experimental Preparation

The experimental preparation mainly includes three parts: data, software, and hardware. Data are the basic content of the experiment; software is used to develop and deploy Holo3DGIS applications; and hardware renders and runs the platform of the Holo3DGIS application.

• Data: 2D basic geographic information data of Philadelphia [23], including a digital elevation model (DEM), basemaps, point features (e.g., infrastructure and trees), polyline features (e.g., streets), polygon features (e.g., buildings), and other data.

• Software: ArcGIS 10.2 (2D GIS data preprocessing), CityEngine 2014.1 (rapid 3D modelling based on specific rules), Unity3D 5.5.0f3 (core engine for the development of Holo3DGIS applications), Visual Studio Community 2015 (platform for Holo3DGIS application development, compilation, and deployment), and Windows 10 Professional (HoloLens development requires Windows 10 Pro, Enterprise, or Education).

• Hardware: Microsoft HoloLens Developer Edition [24]; PC (Dell, Intel(R) Xeon(R) CPU E3-1220 v3 at 3.10 GHz (4 CPUs), ~3.1 GHz).

3.2. Experimental Results and Analyses

This experiment mainly realizes the introduction of a 3D geographical scene into the physical world and provides a holographic 3D geographic information system to users.

3.2.1. Introduction of the Virtual Geographic World into the Physical World

Based on the methods and the design of the experiment, the 3D digital city of Philadelphia is superimposed onto the physical world, and the combination of the two is realized. Figure 9 shows a schematic map of the introduction of the digital city of Philadelphia into the physical world. The white background is a physical wall, and the urban scene is the virtual Philadelphia 3D geographic scene; this is the integration result of the holographic digital city and the physical world. The Holo3DGIS used in this experiment has the following characteristics:

• Interactive features: the digital city in the Holo3DGIS application can respond to the gaze, gestures, and voice instructions of the GIS user. In addition to interacting with people, the digital city, as a part of the physical world, can interact with the surface of the physical world.

• Depth and perspective information: when the digital city in the Holo3DGIS application is placed in the physical world, it has a wealth of depth and perspective information, and the distance and angle between the user and digital city entities are measurable. In addition, geographic scenes and objects in the digital Philadelphia scene are affected by the light and sound of the users' peripheral world.

• Spatial persistence: the digital Philadelphia scene is inserted into the real world by an anchor in space; when a GIS user returns, the digital Philadelphia scene remains in its original position. Therefore, the digital Philadelphia scene is part of the physical world and has the characteristics of a real-time virtual object.

Figure 9. Schematic map showing the introduction of the digital Philadelphia scene into the physical world.

3.2.2. New Interactive Mode of the 3D GIS

Traditional 3D GIS only provides a single way to interact, using the mouse and the keyboard, while Holo3DGIS provides several types of natural interaction, such as gaze, gesture, and voice, to achieve interaction between users and 3D geographic information. It brings users a new way to interact with three-dimensional geographic information systems. Out of the three interactions in Holo3DGIS, gaze is the first input and the primary form of targeting within the holographic geographic scene; gaze reveals where the user is looking in the world and determines their intent [25]. Gesture and voice interactions are used to implement specific GIS operations. Holo3DGIS is conceptually similar to traditional 3D GIS interaction: gaze is equivalent to the role of a mouse on a PC, and gesture and voice interactions are equivalent to the left key of the mouse. The difference between the two is mainly the level of experience and interaction; the former operates three-dimensional geographic information systems in a more natural way.

The principle of the gaze design is to use the position and orientation of the user's head, not their eyes, to determine the gaze vector. As the user looks around the room, the gaze vector intersects with the holographic 3D geographical scene, and the intersection point is where sight is focused. Figure 10 shows a schematic of the gaze interaction. The light ball locations are where the user's line of sight is located in the geographical scene; these light balls are used to implement feedback to the user's line of sight. Using the gaze interaction, users can browse the city of Philadelphia from different scales and angles, which allows them to better understand the virtual geographic world.

Figure 10. Gaze interaction schematic.

To achieve gesture interactions, the hand needs to be tracked first to capture the gesture. The range of HoloLens gestures is only within the conical range ahead of the device, which is called the gesture frame. When the user's hand moves away from the gesture frame, a direction vector reminds the user to place the hand in the identifiable area. The Holo3DGIS application only recognizes two gestures, "Put the index finger back to the face" and "Click (index finger points face down)", and automatically ignores all other gestures. Gesture interactions, in addition to the query of geographic entities through air-tapping, also include navigation gestures to achieve the rotation of holographic geographic scenes; using manipulation gestures, users can browse the digital city of Philadelphia from different angles. Figure 11 shows a schematic of the gesture interaction; the schematic shows the properties and spatial information of the holographic object viewed through the gesture interaction.

Figure 11. Gesture interaction schematic.

The voice design is similar to the gesture operation, but it does not require hand gestures; it is most convenient for users to operate three-dimensional geographic information with voice commands.

3.2.3. Performance Comparison

In this experiment, the same application is deployed by the two deployment modes for a small scenario and a large scenario, and the CPU and GPU usage percentages corresponding to the two modes are obtained. A 576 MB Holo3DGIS application is deployed on the HoloLens and on the PC, and the rates of use of the HoloLens CPU and GPU are compared. As shown in Figures 12 and 13, when the deployment of the holographic application was transferred from the HoloLens to the computer, the rates of use of the HoloLens CPU and GPU engines declined: the utilization rate of the HoloLens CPU is reduced from 71% to 47%, the utilization rate of GPU Engine0 is reduced from 34.96% to 23.81%, and the utilization rate of GPU Engine1 increased from 0% to 15.18%. In the figures, GPU Engine0 refers to the HoloLens headset GPU, while GPU Engine1 refers to the computer GPU. The results show that the large scenario deployment mode can reduce the power consumption of the HoloLens. From the above experiments, it can be seen that if the speed of the network and the performance of the computers meet the experimental requirements, the HoloLens can also be used as a rendering platform for geographic scene data.

Figure 12. Holo3DGIS application deployed on the HoloLens headset: CPU and GPU usage percentages.

Figure 13. Holo3DGIS application deployed on the PC: HoloLens CPU and GPU usage percentages.

3.2.4. Human Computer Interaction Test

The human computer interaction (HCI) test is aimed at reporting the users' experience with the Holo3DGIS application. Through the HCI experiment, we test the availability of the Holo3DGIS application and improve its design to make it more user-friendly. The test covers the Microsoft HoloLens and the Holo3DGIS application deployed on the HoloLens. The process of the HCI experiment includes three parts: the selection of testers, the design of the evaluation index system and questionnaire, and the test result report. We invited 10 test subjects, who are professors in 3D GIS, leaders of surveying- and mapping-related companies, PhDs in surveying and mapping or computer science, Masters in GIS, and bachelor students in finance. The HCI experiment developed a first-level evaluation index system, which contained visual clarity, comfort, easy learning, interoperability, system efficiency, and flexibility. The second-level evaluation index system was established after the first-level index system and is used as a questionnaire for the test users, to record their experience with the Holo3DGIS application.

The test result shows that the users have positive subjective feelings about the human computer interaction interface of the Holo3DGIS application. The main reason is as follows: Holo3DGIS overcomes the problem of the 2D perspective carriers and the interaction problem in 3D GIS, and it brings a completely different form of interaction compared with traditional 3D GIS. It changes the traditional vision, body sense, and interaction modes, which enables GIS users to experience real 3D GIS. In particular, users have a high evaluation of the visual impact and novelty of Holo3DGIS, and users are also very satisfied with the Holo3DGIS interaction. In sum, Holo3DGIS applications can bring a perfect experience for the GIS user. We also found some problems with Holo3DGIS from the questionnaire survey. The main problem is that users are not satisfied with the voice control and with the comfort in use. The problem of voice interaction is mainly related to the accuracy of the user's pronunciation: Holo3DGIS will not recognize a voice command if the user's pronunciation is not correct. In addition, because the user has difficulty adapting to the device at the beginning, the user may suffer from dizziness, and the HoloLens weight can cause neck strain and nose discomfort after extended use. Additionally, the HoloLens used in the test is a developer edition.

3.2.5. Limitations and Deficiencies

Although these methods can achieve the integration of 3D GIS with the HoloLens, there are several problems:

• The Holo3DGIS application does not consider how to organize and schedule geographic scene data according to the display requirements. Using an MR device, such as the HoloLens, to visualize 3D geospatial data may therefore introduce a delay, because geographic scenes can exceed the rendering capabilities of the computers connected to the HoloLens.

• Holo3DGIS applications can be developed based on the Unity3D engine, but the development mode is then limited to the Unity3D engine. To build individual development engines, the Holographic DirectX 11 application programming interface (API) is needed; DirectX 11 lacks a corresponding base library, the difficulty coefficient of development is larger, and it requires designs from scratch.

• Holo3DGIS applications are suitable for the first-person perspective, but they can also be used for third-person browsing. However, the effect is not satisfactory. The main reason is that the third-person perspective relies on the mixed reality capture system in the HoloLens device portal, which uses a 2-million-pixel red-green-blue (RGB) camera to capture the video stream; the result is not as good as wearing the HoloLens holographic glasses directly. All of the experimental screenshots in this article are obtained through a third-person mixed reality capture screen stream; the screenshot of the spatial mapping captures both the real physical world and the holographic virtual image, while the gaze and gesture screenshots capture only the holographic virtual image. To improve the resolution of the screenshots, the pixel limitation of the third-person perspective must be solved. Currently, the third-person viewing screens used at HoloLens conferences are shown through spectator view; relying on a single-lens reflex (SLR) camera and other high-definition multimedia interface (HDMI) image recording equipment can provide a wider view, and the resolution of the output picture can reach 1080p or even 4K. Spectator view can be used to capture the mixed reality picture, but it is a method that requires two HoloLens devices, an HDMI-capable camcorder, and a high-performance computer for support.

• MR devices, such as the HoloLens, have a very limited field of view (approximately 30° by 17.5°), which may narrow the user's visual space. In addition, the user may suffer from fatigue because their eyes need to focus on the HoloLens plane within an inch of their eyes for quite some time, and the HoloLens weight can cause neck strain after extended use. These drawbacks may cause discomfort to the GIS user.

4. Conclusions

This paper focuses on the integration of MR and 3D GIS. Currently, there are barriers and challenges when developing MR 3D environments for the visualization and inspection of 3D geospatial information. The main problem is the lack of commonly-shared data structures and interfaces between MR and 3D geospatial information. To address this problem and access 3D geospatial information content with MR, this paper proposes an architecture and methods to use the Microsoft HoloLens with 3D geographic information. The proposed method for the development of holographic 3D GIS is based on three processes: generation of the 3D asset, development of the 3D GIS application, and compiler deployment of the 3D GIS application. Basic geographic data of Philadelphia were used to test the proposed methods and Holo3DGIS. The experimental results showed that Holo3DGIS can use the Microsoft HoloLens in 3D geographic information. It changes the traditional vision, body sense, and interaction mode, which enables GIS users to experience real 3D GIS. This method presents a new approach for data visualization in 3D geographic information systems. The integration and interaction of virtual geographic information scenes and real-world data through mixed reality technology will greatly improve the GIS user's application experience.
The advantages of the integration of MR and 3D GIS are as follows:

• The real world is a complex and imperfect three-dimensional map. The integration of MR and 3D GIS outputs an interactive feedback loop between the real world, the virtual geographic world, and the user to enhance the sense of authenticity throughout the user's experience.

• Holo3DGIS has perspective information and depth information, which enables GIS users to experience real 3D GIS, and it provides a new research direction and means for 3D GIS data visualization.

• MR provides a set of new input and output interactions for the GIS, such as gaze, gesture, and voice, instead of the traditional mouse and keyboard; the latter is not as efficient as the former.

• GIS provides functions for the collection, storage, processing, and spatial analysis of geographic data for a mixed reality, and GIS broadens the scope and application of mixed realities.

Further studies will focus on the following aspects:

• Commonly-shared data structures and interfaces between MR and GIS need to be established. Currently, GIS access to the MR platform requires the transformation of data formats. On the one hand, the transformation of data formats could cause the data quality to become worse, which would diminish the user's experience; on the other hand, this could increase the cost of MR development.

• By designing a service-oriented Holo3DGIS framework, publishing 3D geographic data as services, and designing samples, accurate interfaces and contracts for communication can be obtained. MR could gain access to virtual geographic scene services in a standard manner, MR geographic resources could be accessed on demand, and the virtual geographic scene service could be used by other MR users in the network.

• By adding a geospatial analysis model service to the Holo3DGIS application, the dynamic display of changes in geographic processes in holographic mixed realities could be obtained.

• By adopting distributed computing technologies, holographic rendering calculations could be transferred from a single computer to multiple computers for parallel computing. Solving the lack of computing ability of a single computer would make larger geographic scenes possible.

Author Contributions: Xingxing Wu and Wei Wang conceived of the idea of the paper. Xingxing Wu, Zeqiang Chen, and Guanchen Chen analyzed the data and performed the experiments. Xingxing Wu and Zeqiang Chen wrote the paper.

Acknowledgments: This work was supported by grants from the National Key R and D Program of China (no. 2017YFC0803700), the National Nature Science Foundation of China (NSFC) Program (no. 41771422), the Nature Science Foundation of Hubei Province (no. 2017CFB616), the innovative group program of Hubei Province (no. 2016CFA003), and the fundamental research funds for the central universities (no. 2042017kf0211).

Conflicts of Interest: The authors declare no conflict of interest.

References

1. Macatulad, E.G.; Blanco, A.C. 3DGIS-Based Multi-Agent Geosimulation and Visualization of Building Evacuation Using GAMA Platform. Int. Arch. Photogramm. Remote Sens. Spat. Inf. Sci. 2014, XL-2, 87–91. [CrossRef]
2. Verbree, E.; Van Maren, G.; Germs, R.; Jansen, F.; Kraak, M.J. Interaction in virtual world views: Linking 3D GIS with VR. Int. J. Geogr. Inf. Sci. 1999, 13, 385–396. [CrossRef]
3. Germs, R.; Van Maren, G.; Verbree, E.; Jansen, F.W. A multi-view VR interface for 3D GIS. Comput. Graph. 1999, 23, 497–506. [CrossRef]
4. Virtual Reality: A Review. In Proceedings of the 2nd International Conference on Advance Trends in Engineering and Technology, Jaipur, India, 17–19 April 2014.
5. Rebelo, F.; Noriega, P.; Duarte, E.; Soares, M. Using virtual reality to assess user experience. Hum. Factors 2012, 54, 964–982. [CrossRef] [PubMed]
6. Using the computer-driven VR environment to promote experiences of natural world immersion. In Proceedings of the SPIE Engineering Reality of Virtual Reality, Burlingame, CA, USA, 4–5 February 2013; Volume 8649.
7. The use of virtual reality as an educational tool. In Proceedings of the 3rd International Technology, Education and Development Conference, Valencia, Spain, 9–11 March 2009; pp. 185–225.

8. Wellner, P.; Mackay, W. Computer augmented environments: Back to the real world. Commun. ACM 1993, 36, 24–26. [CrossRef]
9. Berryman, D.R. Augmented reality: A review. Med. Ref. Serv. Q. 2012, 31, 212–218. [CrossRef] [PubMed]
10. Roth, D.; Yang, M.H.; Ahuja, N. Learning to recognize three-dimensional objects. Neural Comput. 2002, 14, 1071–1103. [CrossRef] [PubMed]
11. Uchiyama, H.; Saito, H.; Servières, M.; Moreau, G. AR City Representation System Based on Map Recognition Using Topological Information. Lect. Notes Comput. Sci. 2009, 5622, 128–135.
12. Insley, S. Augmented Reality: Merging the Virtual and the Real. 2003.
13. Wang, X.; Dunston, P.S. User perspectives on mixed reality tabletop visualization for face-to-face collaborative design review. Autom. Constr. 2008, 17, 399–412. [CrossRef]
14. Brigham, T.J. Reality Check: Basics of Augmented, Virtual, and Mixed Reality. Med. Ref. Serv. Q. 2017, 36, 171–178. [CrossRef] [PubMed]
15. HoloLens Hardware Details. Available online: https://developer.microsoft.com/en-us/windows/mixed-reality/hololens_hardware_details (accessed on 12 November 2017).
16. Furlan, R. The future of augmented reality: Hololens: Microsoft's AR headset shines despite rough edges [Resources_Tools and Toys]. IEEE Spectr. 2016, 53, 21. [CrossRef]
17. Holodisaster: Leveraging Microsoft HoloLens in Disaster and Emergency Management. Available online: https://www.researchgate.net/publication/312320915_Holodisaster_Leveraging_Microsoft_HoloLens_in_Disaster_and_Emergency_Management (accessed on 18 December 2017).
18. Haffner, O.; Kučera, E.; Kozák, Š.; Bisták, P. Virtual Laboratory Based on Node.js Technology and Visualized in Mixed Reality Using Microsoft HoloLens. 2017.
19. NASA, Microsoft Collaboration will Allow Scientists to 'Work on Mars'. Available online: https://www.jpl.nasa.gov/news/news.php?feature=4451 (accessed on 18 December 2017).
20. Sielhorst, T.; Feuerstein, M.; Navab, N. Advanced Medical Displays: A Literature Review of Augmented Reality. J. Disp. Technol. 2008, 4, 451–467. [CrossRef]
21. Müller, P.; Wonka, P.; Haegler, S.; Ulmer, A.; Van Gool, L. Procedural modeling of buildings. ACM Trans. Graph. 2006, 25, 614–623. [CrossRef]
22. Performance Recommendations for HoloLens Apps. Available online: https://developer.microsoft.com/en-us/windows/mixed-reality/performance_recommendations_for_hololens_apps (accessed on 20 December 2017).
23. Geographic Information Data of Philadelphia. Available online: https://pan.baidu.com/s/1eSuAOCi (accessed on 20 December 2017).
24. The Commercial Suite and the Development Edition are Fully Equipped for All HoloLens Developers, Whether You're Working as an Individual or a Team. Available online: https://www.microsoft.com/en-us/hololens/buy#buyenterprisefeatures (accessed on 20 December 2017).
25. Gaze. Available online: https://developer.microsoft.com/en-us/windows/mixed-reality/gaze (accessed on 20 December 2017).

© 2018 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (http://creativecommons.org/licenses/by/4.0/).