
2018 IEEE International Symposium on Mixed and Augmented Reality Adjunct (ISMAR-Adjunct)

Reproducing Material Appearance of Real Objects using Mobile Augmented Reality

Seiji Tsunezaki* (Saitama University)
Ryota Nomura† (Saitama University)
Takashi Komuro‡ (Saitama University)
Shoji Yamamoto§ (Tokyo Metropolitan College of Industrial Technology)
Norimichi Tsumura¶ (Chiba University)

Figure 1: Mobile augmented reality system for reproducing material appearance of real objects

ABSTRACT

In this paper, we propose a system that can reproduce the material appearance of real objects using mobile augmented reality (AR). Our proposed system allows a user to manipulate a virtual object, whose model is generated from the shape and reflectance of a real object, using the user's own hand. The shape of the real object is reconstructed by integrating depth images of the object, which are captured using an RGB-D camera from different directions. The reflectance of the object is obtained by estimating the parameters of a reflectance model from the reconstructed shape and color images, assuming that a single light source is attached to the camera. We measured the shape and reflectance of several real objects and presented their material appearance using mobile AR. We confirmed that users were able to obtain the perception of materials from changes in the gloss and burnish of the objects by rotating them with their own hands.

Keywords: reflectance property, reflectance measurement, object manipulation, mobile display, RGB-D camera

Index Terms: Human-centered computing—Human-computer interaction (HCI)—Interaction paradigms—Mixed / augmented reality

1 INTRODUCTION

With the spread of online shopping, it has become possible to see and purchase products through the Internet. However, since users cannot examine a product by taking it in their hands, they often fail to obtain a perception of its material. A system that reproduces the realistic material appearance of real objects is therefore required.

To reproduce realistic material appearance, mobile augmented reality (AR) systems have been developed that allow a user to manipulate, with his or her hand, a virtual object that is superimposed on the real space and presented on a mobile display or a head-mounted display [3, 4]. However, these systems use manually designed models for the virtual objects and do not use models generated from real objects.

There have also been studies on measuring the reflectance properties of real objects using light sources and cameras placed in various directions around a target object [5, 6]. Exact reflectance properties can be measured by providing incident light from various directions and observing the reflected light in various directions. However, this requires a large-scale apparatus and a long measurement time.

On the other hand, some studies estimate the shape, light sources, and reflectance properties only from images captured by a monocular camera [1, 9]. These studies need neither a large-scale apparatus nor a long measurement time. However, the amount of information obtained from the images of a monocular camera is limited, which may cause unstable estimation and incorrect results.

Based on these studies, we propose a mobile AR system that reproduces the material appearance of a real object whose shape and reflectance are measured using a simple apparatus. A user can obtain the perception of material by manipulating, with his or her own hand, a virtual object that is displayed on a mobile display.

*e-mail: tsunezaki@is.ics.saitama-u.ac.jp
†e-mail: nomura@is.ics.saitama-u.ac.jp
‡e-mail: komuro@mail.saitama-u.ac.jp
§e-mail: yamasho@metro-cit.ac.jp
¶e-mail: tsumura@faculty.chiba-u.jp

2 MOBILE AR SYSTEM FOR REPRODUCING MATERIAL APPEARANCE OF REAL OBJECTS

Figure 1 shows the concept of our proposed system. A user can manipulate, with his or her own hand, a virtual object that is displayed on a mobile display. The model of the virtual object is generated from the shape and reflectance of a real object, which are measured using a simple apparatus consisting of an RGB-D camera, a light source attached to the camera, and a turntable.

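The reflectance estimation described in Sect. 2.2 fits a Blinn-Phong model to per-vertex intensity observations by least squares. The Python sketch below shows one way such a fit could be set up, assuming the single light source is collocated with the camera and the per-vertex normal and per-frame light/view directions are already known from the reconstructed shape and the estimated camera motion. The function name, the fixed set of shininess candidates, and the grid search over them are our own illustration, not the authors' implementation.

```python
import numpy as np

def fit_blinn_phong(normal, light_dirs, view_dirs, intensities,
                    shininess_candidates=(5, 10, 20, 50, 100)):
    """Fit Blinn-Phong diffuse/specular coefficients for one vertex.

    For a fixed shininess p, the model
        I = kd * max(n.l, 0) + ks * max(n.h, 0)**p
    is linear in (kd, ks), so each candidate p is solved by linear
    least squares over all frames and the best-fitting p is kept.
    """
    n = normal / np.linalg.norm(normal)
    l = light_dirs / np.linalg.norm(light_dirs, axis=1, keepdims=True)
    v = view_dirs / np.linalg.norm(view_dirs, axis=1, keepdims=True)
    h = l + v                                    # half vector (unnormalized)
    h /= np.linalg.norm(h, axis=1, keepdims=True)

    diffuse = np.clip(l @ n, 0.0, None)          # per-frame n.l
    spec_base = np.clip(h @ n, 0.0, None)        # per-frame n.h

    best = None
    for p in shininess_candidates:
        A = np.stack([diffuse, spec_base ** p], axis=1)
        coeffs, *_ = np.linalg.lstsq(A, intensities, rcond=None)
        residual = np.linalg.norm(A @ coeffs - intensities)
        if best is None or residual < best[0]:
            best = (residual, coeffs[0], coeffs[1], p)
    _, kd, ks, p = best
    return kd, ks, p
```

Fixing the shininess makes the remaining problem linear and hence solvable in closed form; the paper does not state how the shininess itself is obtained, so the grid search here is only one plausible choice.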
978-1-5386-7592-2/18/$31.00 ©2018 IEEE


DOI 10.1109/ISMAR-Adjunct.2018.00126
Figure 2: A user is manipulating an object whose model is generated from the shape and reflectance of a real object, using his or her own hand.
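The interaction shown in Figure 2 maps a tracked hand pose, with six degrees of freedom, onto the virtual object's pose (Sect. 2.1). The following minimal sketch shows one way such a mapping could work, assuming poses arrive as a translation plus Euler angles; all names and the relative-motion scheme are our own illustration, not the system of Nomura et al. [8].

```python
import numpy as np

def pose_matrix(translation, euler_xyz):
    """Build a 4x4 rigid transform from a translation vector and
    intrinsic X-Y-Z Euler angles (radians) -- one common encoding of
    the six degrees of freedom of a tracked hand."""
    ax, ay, az = euler_xyz
    rx = np.array([[1, 0, 0],
                   [0, np.cos(ax), -np.sin(ax)],
                   [0, np.sin(ax),  np.cos(ax)]])
    ry = np.array([[ np.cos(ay), 0, np.sin(ay)],
                   [0, 1, 0],
                   [-np.sin(ay), 0, np.cos(ay)]])
    rz = np.array([[np.cos(az), -np.sin(az), 0],
                   [np.sin(az),  np.cos(az), 0],
                   [0, 0, 1]])
    m = np.eye(4)
    m[:3, :3] = rx @ ry @ rz
    m[:3, 3] = translation
    return m

def update_object_pose(object_pose, hand_prev, hand_now):
    """Apply the hand's relative motion between two frames to the
    object pose, so the object follows the hand rigidly."""
    relative = hand_now @ np.linalg.inv(hand_prev)
    return relative @ object_pose
```

Applying the *relative* hand motion each frame, rather than copying the absolute hand pose, lets the user grab the object at any position and carry its current orientation along.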

2.1 Providing perception of materials using mobile AR

We use the mobile AR system developed by Nomura et al. [8] for presenting the material appearance of objects. To realize natural interaction with virtual objects, the system provides visual information that is consistent with the user's somatic sensation by displaying user-perspective images on a mobile display. The system allows the user to manipulate a virtual object with his or her own hand with six degrees of freedom (three for translation and three for rotation) by mapping the hand pose to the object's translation and rotation. The user can obtain the perception of materials from changes in the gloss and burnish of the objects by rotating them with his or her own hand.

2.2 Measuring shape and reflectance of real objects

To measure the reflectance properties of real objects, the system captures depth and color images of an object rotating on the turntable using the RGB-D camera. The shape of the object is reconstructed by integrating the depth images captured from different viewpoints. The reflectance of the object is obtained by estimating the parameters of a reflectance model from the reconstructed shape and the color images.

Figure 3 shows the measurement scene. We connected a Microsoft Kinect to a PC and used it to capture the images of a target object. We used a small LED light as a single light source and fixed the light to the camera.

Figure 3: Measurement scene

The 3D shape of the target object is reconstructed by integrating the depth images captured from different viewpoints using the RGB-D camera. We used KinectFusion [7] for the 3D shape reconstruction. Based on the ICP (Iterative Closest Point) algorithm, KinectFusion aligns point clouds with high accuracy and estimates the camera motion.

The system then calculates the correspondence between each vertex of the reconstructed 3D model and the color pixels in all the captured frames. By reprojecting the vertex to all the camera viewpoints, the system determines which pixel in each color image the vertex corresponds to. By picking the corresponding pixel values, the intensity change at the vertex over all frames is obtained.

The reflectance property of the target object is estimated from the reconstructed shape, the acquired intensity change, and the light source information. We used the Blinn-Phong model [2] as the reflectance model. To obtain the parameters of this model, the least squares method is applied to fit the reflectance model to the observed values.

3 DEMO DESCRIPTION

In our demonstration, participants can experience a new mobile AR system that reproduces the material appearance of a real object. The participants can manipulate, with their own hands, a virtual object with six degrees of freedom whose model is generated from the shape and reflectance of the real object. By grasping and manipulating the virtual object with their hands, the participants feel as if they were holding the real object. In addition, the participants can see the object from various directions, such as from the side and from the back, by rotating it.

Figure 2 shows examples of presenting the material appearance of real objects that were measured using our measurement system. A user manipulates a virtual object, whose model is generated from the shape and reflectance of a real object, using his or her own hand.

The unique point of our demonstration is that our system reproduces the material appearance of real objects using mobile augmented reality. Our system can be applied to online shopping, for example, to let users see the material appearance of products. Our demonstration will draw a crowd, since the system is easy to use and presents realistic material appearance to the participants.

REFERENCES

[1] J. T. Barron et al. Shape, illumination, and reflectance from shading. IEEE Transactions on Pattern Analysis and Machine Intelligence, 37(8):1670–1687, 2015.
[2] J. F. Blinn. Models of light reflection for computer synthesized pictures. In Proc. of SIGGRAPH 1977, pages 192–198, 1977.
[3] M. L. Caballero et al. Behand: Augmented virtuality gestural interaction for mobile phones. In Proc. of MobileHCI 2010, pages 451–454, 2010.
[4] A. Hettiarachchi et al. Annexing reality: Enabling opportunistic use of everyday objects as tangible proxies in augmented reality. In Proc. of CHI 2016, pages 1957–1967, 2016.
[5] H. Li et al. Automated three-axis gonioreflectometer for computer graphics applications. Optical Engineering, 45(4), 2005.
[6] G. Muller et al. Rapid synchronous acquisition of geometry and appearance of cultural heritage artefacts. In Proc. of VAST 2005, pages 13–20, 2005.
[7] R. A. Newcombe et al. KinectFusion: Real-time dense surface mapping and tracking. In Proc. of ISMAR 2011, pages 127–136, 2011.
[8] R. Nomura et al. Mobile augmented reality for providing perception of materials. In Proc. of MUM 2017, pages 501–506, 2017.
[9] R. Xia et al. Recovering shape and spatially-varying surface reflectance under unknown illumination. ACM Transactions on Graphics, 35(6):1–12, 2016.

