IEEE TRANSACTIONS ON HAPTICS, VOL. 13, NO. 2, APRIL-JUNE 2020
Abstract—We present a framework for the acquisition and parametrization of object material properties. The introduced acquisition device, denoted as Texplorer2, is able to extract surface material properties while a human operator is performing exploratory procedures. Using the Texplorer2, we scanned 184 material classes which we labeled according to biological, chemical, and geological naming conventions. Based on these real material recordings, we introduce a novel set of mathematical features which align with corresponding material properties defined in perceptual studies from related work and classify the materials using common machine learning techniques. Validation results of the proposed multi-modal features lead to an overall classification accuracy of 90.2% ± 1.2% and an F1 score of 0.90 ± 0.01 using the random forest classifier. For the sake of comparison, a deep neural network is trained and tested on images of the material surfaces; it outperforms (90.7% ± 1.0%) the hand-crafted feature-based approach yet leads to more critical misclassifications in terms of the proposed taxonomy.

Index Terms—Surface Haptics, material scanning, content-based features.

I. INTRODUCTION

and macroscopic roughness of a surface. These two tactile dimensions received immense attention during almost a century, beginning with the duplex theory of roughness by Katz [4] and continuing with the research studies and surveys of Lederman et al. [5], Klatzky et al. [6], Bensmaïa et al. [7], [8], and Bergmann Tiest et al. [9], [10]. In parallel, auditory perception during finger-surface interaction is linked to the perception of roughness and is cognitively fused into an overall understanding of the touched material according to Schroeder et al. [11] and Yau et al. [12].

Altogether, these EPs lead to a set of haptic material properties which we intend to capture by manually recording object surface interaction data using a novel human-operated object scanner, denoted as Texplorer2. Data traces resulting from these scans provide the input for supervised material classification based on multi-modal features, e.g., content-based tactile features. Additionally, such features also enable the potential reproduction, i.e., haptic display, of the material properties.

A. Tactile Dimensions

There is still an active discussion about the feature space
STRESE et al.: HAPTIC MATERIAL ANALYSIS AND CLASSIFICATION INSPIRED BY HUMAN EXPLORATORY PROCEDURES 405
necessary physical interaction with an object, and hence, is influenced by active scan-time parameters. For example, Romano et al. [18] have shown that acceleration signals resulting from tool–surface interaction heavily depend on the scanning speed and force. Current technical systems that attempt to solve this challenging task mainly concentrate on the acquisition and display of vibro-tactile roughness signals, using acceleration sensors during tool-mediated material interactions as investigated by Kuchenbecker et al. [17] and Culbertson et al. [19], [20]. These vibro-tactile signals are also used to recognize material surfaces using robots [21]–[25] and during human freehand movements by Burka et al. [26] and in our previous work [27], [28]. Beyond tool-mediated accelerometer scanning setups, other methodologies measure the vibro-tactile signal propagation through the human skin, as shown by Sano et al. [29], Vardar et al. [30], and Visell et al. [31], or are based on tribometry scans, as for instance conducted by Colgate et al. [32]. The vibro-tactile sensing capabilities can even exceed human performance in material identification, as shown by Fishel et al. [22] using the BioTac sensor.

Acquiring and modeling other relevant tactile dimensions has also been studied extensively in the past. For example, stereoscopy-based approaches which determine the intensity of surface structures have been presented recently by Choi et al. [33]. Comparably, Dulik et al. [34] use infrared light during non-contact scans to infer the surface height profile. Thermal conductivities can be measured using thermistors or, as shown by Aujeszky et al. [35] or Choi et al. [36], using a thermal camera based on nondestructive infrared recordings. Low-frequency hardness is usually modeled as spring stiffness according to Hooke's law, as done by Basdogan et al. [37], and can be measured as the normal force divided by the indentation depth. High-frequency hardness is related to tapping responses measured with accelerometers and has been extensively examined by Okamura et al. [38] and Kuchenbecker et al. [39].

Two different approaches exist to capture all the aforementioned tactile dimensions. A material scanner can be wielded either by a human operator or by a robotic scanning system which precisely controls its interaction parameters.

A human operator-based approach reveals a specific variability in these scan-time parameters. For example, a sensorized tool generally is moved within a speed range of about 20 mm/s up to 240 mm/s, as shown by Culbertson et al. [19]. The application of different sensing domains and the design of robust features, as shown in our previous work [27], [28], can mitigate the dependencies on these parameters.

Robotic scanning setups, as presented by Sinapov et al. [40], Jamali et al. [21], Fishel et al. [22], or Hoelscher et al. [24], have the advantage of precise and controllable movement procedures on predefined material sample slices. The current state-of-the-art scanning platform is the BioTac Toccare [15], which allows for a comprehensive scan of a predefined material sample to assess its tactile properties and can be considered the gold standard for automated planar tactile sensing. Beyond two-dimensional surface scans, Chu et al. [41] have shown that a PR2 robot equipped with two BioTac sensors successfully extracts relevant haptic properties to perform the task of object recognition. The robot applied a series of four of the six EPs (except lifting and contour following) on each of the 60 objects used in the experiments. They computed the mean and the maximum value of the low-frequency signals measured with the BioTac and converted the recorded high-frequency signals into a non-normalized energy spectral density (ESD). The total energy of the ESD curve, the spectral centroid, the variance, the skewness, and the kurtosis transform the ESD into single-valued features. Kaboli et al. [25] extended this idea using a Shadow Hand equipped with five BioTac sensors to perform classification of 120 objects with an accuracy of 100%. However, it is not clear which of the presented material samples belonged to one and the same material class. The ambiguous class names, such as carpets (textures 58–78), do not allow for a generalization of the material identification. It is also not clear whether such basic features are inherently descriptive enough to sufficiently represent the material samples in a haptic context.

We argue that both approaches have their advantages and provide solutions for different applications. For example, the BioTac Toccare might be a good choice to scan planar material samples for industrial or commercial applications, whereas hand-held operator systems perform well as mobile and versatile haptic property scanners. Generally, robotic systems can reliably reproduce desired recording procedures and generate more stationary material interaction signals on planar material samples. By contrast, a human operator can easily adapt the scan procedure to complex object geometries and freely scan any material sample without prior preparation of the scanned object. Humans can evaluate much faster where and how a scan should be performed, especially for very complex object geometries. Secondly, robotic solutions require great effort to reduce the motor vibrations which affect the accelerometer recordings during the robot-surface interaction. The cost and effort to set up and run such a system are significantly higher than for a human-operated scanner; this point might be leveraged by cost scalability in the future. To the best of our knowledge, the BioTac Toccare system currently costs about $3·10^5 and the hand-held operator-wielded device that we propose in this article about $3·10^3.

C. Contributions

We envision a mobile human-operated material scanner to capture multi-modal haptic data that is associated with meaningful labels of real-world material samples. We present mathematical formulations of tactile features which were conceptualized and partly summarized by other experts in the field to allow for successful material classification in combination with audio-visual features and to provide parameters possibly used for tactile display. The following list summarizes the main contributions of this paper.

- We present a taxonomy based on geological, biological, and chemical naming conventions and provide a novel database (LMT Haptic Material Database) consisting of 184 material classes. The collected images and data traces have a total file size of about 80 GB.
- We present a mobile material scanning setup (Texplorer2, two units are shown in Fig. 1), which
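The ESD summary statistics used in the related work above (total energy, spectral centroid, variance, skewness, and kurtosis of an energy spectral density) can be sketched as follows. This is an illustrative reconstruction under our own assumptions (FFT-based ESD, central spectral moments), not the code of [22] or [41].

```python
import numpy as np

def esd_features(signal, fs):
    """Summarize a vibration signal by statistics of its energy
    spectral density (ESD): total energy, spectral centroid,
    variance, skewness, and kurtosis."""
    esd = np.abs(np.fft.rfft(signal)) ** 2          # non-normalized ESD
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / fs)
    total_energy = esd.sum()
    p = esd / total_energy                          # ESD as a distribution over frequency
    centroid = (freqs * p).sum()
    variance = (((freqs - centroid) ** 2) * p).sum()
    skewness = (((freqs - centroid) ** 3) * p).sum() / variance ** 1.5
    kurtosis = (((freqs - centroid) ** 4) * p).sum() / variance ** 2
    return total_energy, centroid, variance, skewness, kurtosis
```

For two equal-amplitude tones at 50 Hz and 150 Hz, the centroid lands at 100 Hz and the spectral variance at 2500 Hz², which makes the moments easy to sanity-check.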
Fig. 2. Granite is an igneous (S1) stone (C6) which is either polished (left: P2) or otherwise physically processed (right: pointed, P3). The difference in the surface structure heavily influences its haptic feel.
Publicly available material databases for haptics research (e.g., from Culbertson et al. [43], Hassan et al. [44], or our prior database from our website [42]) provide a great variety of different haptic data scans. However, these databases are based on colloquial material names (e.g., stone tile) which can lead to ambiguities across different industrial or commercial branches. In addition, these databases typically contain between 100 and 200 differently shaped surface textures which often belong to the same material classes. Consequently, we performed an interdisciplinary survey on the chemical, biological, and geological material names and defined the material class labels according to these observations.

A. Taxonomy

Our naming convention works as follows. C denotes the major material class, e.g., metals or woods. The identifier S considers the material sub-class, e.g., hardwood or softwood. M denotes a specific material, e.g., lead, tin, or aluminum, on a fine classification level. Many materials come along with different

Due to the large amount of woods available, we currently face a class imbalance towards C1 which needs to be considered whenever the classification system has to distinguish on a coarse scale between, e.g., woods and other classes. However, fine-level material classification is not affected since we have the same number of observations (instances) for each material class.

We actively record new material samples to extend the already extensive database. Several materials, e.g., concrete or glass, exist in dozens of different processing shapes and mixtures. We introduced the identifiers P and X to take these material-specific properties into account and will significantly increase the number of differently processed materials in the database in future work.

Considering the number of different sensing domains and the required data to represent these different material appearances, we still face an ongoing task. That is also why we suggest to follow taxonomies like the one proposed in this article to unambiguously connect haptic data traces to the exact material. Note that the number of woods alone goes well beyond 10,000, for example. The number of various minerals, stones, or metal alloys and their processing shapes is similarly excessive.
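As a sketch, the naming convention can be captured in a small label structure. The concrete identifier strings below (C6 for stones, S1 for igneous, P2 for polished) follow the example of Fig. 2; the field layout itself is our own illustrative choice, not a format defined in the article.

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class MaterialLabel:
    """Hierarchical label following the C/S/M/P/X naming convention:
    major class C, sub-class S, material M, processing P, mixture X."""
    c: str       # major class, e.g., "C6" (stones, per Fig. 2)
    s: str       # sub-class, e.g., "S1" (igneous)
    m: str       # specific material, e.g., "granite"
    p: str = ""  # processing identifier, e.g., "P2" (polished)
    x: str = ""  # optional mixture identifier

    def coarse(self):
        # Coarse-scale classification only distinguishes major classes.
        return self.c

    def fine(self):
        # Fine-level classification uses the full identifier tuple.
        return (self.c, self.s, self.m, self.p, self.x)

# Hypothetical example after Fig. 2: polished granite.
granite = MaterialLabel(c="C6", s="S1", m="granite", p="P2")
```

Such a structure makes the coarse/fine distinction from the class-imbalance discussion explicit: the imbalance only matters when classifying on `coarse()` labels.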
TABLE I
TAXONOMY OF THE CLASSES C AND SUB-CLASSES S. ALL MATERIALS M ARE INTEGRATED INTO THIS LABELING STRUCTURE
Fig. 3. Overview of the Texplorer2 units. The operator bag and laptop tray
are not depicted.
A. Texplorer2 Overview

Based on the idea of a haptic camera (haptography) from Kuchenbecker et al. [17], we see the necessity for an intuitive and flexible low-cost operator approach for the measurement of any material sample. We completely redesigned the Texplorer device from our previous work [28] and added new components. Inspired by the design of the Proton Pack by Burka et al. [26], we embed the scanning system into a bag to allow for mobile scanning using a laptop, a laptop supporter/tray (Rocket Packs), and a power bank (Anker). A USB data acquisition card (NI USB-6002), attached to a Windows 7 laptop Lenovo 80VR (i7-7700HQ CPU at 2,800 MHz, 16 GB RAM), collects the multi-modal sensor data. In addition to the lower cost and simple scanning procedure, we further identify the advantage of possibly scanning materials of arbitrary shape, geometry, and size. The operator performs multiple scanning steps with four mobile devices, which we denote as Texplorer2 units (TU1–4).

Fig. 3 shows the four main units of the Texplorer2 and the following list summarizes the general operating principle and the target haptic dimension of each unit.

- TU1 records acceleration and audio signals to calculate microscopic roughness features as well as reflectance data for macroscopic roughness definition. It is slid over the material sample.
- TU2 records normal and tangential forces to describe the surface friction, pressure and folding FSR information to describe hardness, and thermal cooling data to describe warmth. Beyond that, it acts as a metal and magnetic detector. This unit is first placed on the material sample to collect static touch information, and thereafter used to press on and squeeze the material sample.
- TU3 captures surface images, magnified surface images, and images with additional illumination using a smartphone (Apple iPhone 8) camera.
- TU4 measures the mass and the volume to estimate the density of an object sample by placing the sample on top of the scale and inside the measuring jug.

B. Texplorer2 Units and Data Processing

1) Texplorer2 Unit 1: Micro- and Macroscopic Roughness: The first Texplorer2 unit is rolled by the operator over the surface of an object as shown in Fig. 4. Three bearings ensure that the influence of the applied force is mitigated and the distance between the surface and the IR sensor is constant.

a) Vibro-Tactile Signals: We use a three-axis acceleration sensor ADXL335 (Adafruit) with a range of ±3 g to acquire vibro-tactile signals v, i.e., to capture microscopic roughness information with a sampling rate of 3,000 Hz. We use two different kinds of tool tips during separate scans: the bare finger which propagates finger-surface interaction signals
Fig. 6. Close-up view of the static touch part of TU2. The electric conductive tissues reliably detect metals, the Peltier element and MLX90614 measure thermal cooling, and an additional magnetometer unit (right) instantly detects the change of the external magnetic field caused by the neodymium stack through a material.

Fig. 7. Recorded magnetic data during static touch. Aluminum (left) does not change the magnetic field, steel wool (middle) is paramagnetic, and cast iron (right) reveals significant magnetic properties (ferromagnetic). Diamagnetic materials, which are currently not part of the database, would generate positive values. All other materials lead to a mean value of ΔB = 0.
denote the recorded temperature data array as t. Note that the required change in position can be automated in a future version of the Texplorer2.

b) Force Sensitive Resistor (FSR) Signals: Surface friction information is recorded using the index finger during a sliding motion over the object surface. Our second Texplorer2 unit contains two FSRs (FSR400, Interlink) for normal and tangential sliding force measurements, see Fig. 5, middle. These sensors are used to determine the exerted normal force and the tangentially exerted friction force during the sliding movement. In the following, we consider the intensity values of these sensor measurements only; the orientation of the forces is captured by the sensor placement (normal and tangential). We denote the force arrays f = [f_1, f_2, ..., f_n] from a scan as f'_fr,n and f'_fr,t, respectively. The raw FSR sensor values f' (in volts) grow logarithmically with increasing force. We convert each voltage value f' in f' to a force applying the expression

f = 9.81 m/s^2 · 0.01 kg · e^(f'/1.33 V)   (1)

according to the inverse force-voltage relation diagram (Interlink FSR integration guide), leading to the normal and friction force arrays f_fr,n and f_fr,t. Note that the normal FSR, which contacts the surface, is equipped with a 6 mm diameter PLA sphere.

The same FSR used for the normal force is also used to infer hardness properties. Additionally, a Flex Sensor (FS-L-0095-103-ST, Spectra Symbol) collects information about the object flexibility. For example, if a piece of paper lies flat on a table and a robotic material scanner presses on it, the hardness of the underlying table is measured. However, humans can easily lift the paper to fold it. In this case, the Flex Sensor will change its angular value, indicating flexibility. Note that the FSR for the normal force from the friction estimate is used alongside the Flex Sensor during the pressing and folding procedure. This step helps to distinguish between papers, leather, and fabrics, for example. The operator first presses on the material surface and releases it, using the FSR for normal force acquisition. Secondly, the operator grabs the whole material, if possible, and tries to fold it between the thumb and index finger. The Flex Sensor records the folding angle α_f while the normal force FSR records the applied folding force f_f in parallel.

c) Metal and Magnet Detector Signals: As in our previous work [28], we use two electric conductive tissues as electrodes (see Fig. 6, left) during the static contact with a material to identify metals. This procedure leads to an inverted binary array m̃ ∈ {5 V, 0 V}, which we map to the values m ∈ {−1, 1}. Only electrically conductive materials, e.g., metals, pull down the DAQ input to ground (0 V) during scan time, leading to an m = 1 array. We set a binary value metal = 1 if m̄ > 0.

As a minor contribution, we add a magnetometer to measure the change of the magnetic field through a material sample. More specifically, we apply a stationary external magnetic field at a constant distance (thickness of the casing) and measure the influence of the underlying material sample. We 3D-printed a casing which contains a 3-axis digital magnetometer MAG3110 (Sparkfun) PCB and a stack of eight neodymium permanent magnets, shown in Fig. 6 (right). We zero-calibrate the MAG3110 using these permanent magnets, which expose a strong and constant magnetic field, to rule out the influence of the Earth's magnetic field. The resulting data traces are denoted as ΔB and shown for selected materials in Fig. 7.

Magnetism is not a tactile property, but the change of the magnetic field characterizes several metals and their alloys. We can reliably distinguish between ferro-, non-, and diamagnetic materials within a second during static touch and thereby even exceed human material classification capabilities. Still, the thickness of the material sample ideally should be included in the future. Features based on metal or magnetic material detection as well as thermal cooling are not influenced by other sensing domains (such as vibro-tactile signals) and lead to further independent, and hence characteristic, features.

d) Data Processing: Since the operator index finger may create an offset force while touching the TU2 friction FSR, we need to subtract the mean value of the first force samples from the friction force signal.

3) Texplorer2 Unit 3: Visual Properties: Visual information is linked to haptic perception as summarized by Lacey et al. [2], and hence contains information about relevant material properties.

a) Camera: The images of the materials, denoted as I_disp, are captured using an iPhone 8 (Apple) camera with a resolution of 3024 × 4032 pixels. To infer macroscopic details, we further attach a magnifying lens (four times zoom) and capture macro images I_macro. Note that the distance between each surface and the camera was held constant using a Styrofoam bumper to ensure the same recording distance and perspective. We further use the camera flashlight to capture
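The FSR voltage-to-force conversion of Eq. (1) and the offset-removal step from the data-processing paragraph can be sketched as follows. The exponential form is our reading of the garbled printed equation and should be checked against the Interlink FSR integration guide, and the baseline length of 100 samples is an assumed value, not one stated in the article.

```python
import numpy as np

def fsr_voltage_to_force(volts):
    """Eq. (1), as reconstructed: map raw FSR voltages, which grow
    logarithmically with force, back to forces in newtons using the
    9.81 m/s^2 * 0.01 kg scale and the 1.33 V voltage constant."""
    return 9.81 * 0.01 * np.exp(np.asarray(volts, dtype=float) / 1.33)

def remove_finger_offset(f_friction, n_baseline=100):
    """Subtract the mean of the first samples so the operator's
    resting finger force does not bias the friction signal."""
    f_friction = np.asarray(f_friction, dtype=float)
    return f_friction - np.mean(f_friction[:n_baseline])
```

At 0 V the reconstructed mapping yields the 0.0981 N scale factor, and it grows monotonically with voltage, matching the stated logarithmic voltage–force relation.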
illuminated images I_illu which provide further information about the object's surface reflectivity.

b) Data Processing: We remove 100 pixels from each border of the image to reduce the number of artifacts.

Fig. 8. Volumetric approach based on Archimedes' principle. We use different measuring jugs to infer the volume of arbitrarily-shaped objects by displacing either water or micropearls (diameter 1 mm). The images showing the exploratory procedures originate from Lederman et al. [1].

4) Texplorer2 Unit 4: Mass and Volume: The material samples are placed on an electronic scale (Smart Weight, range of 0.1 g – 2 kg) to measure their weight m, which is noted manually. The density ρ = m/V has been determined and published online for the majority of all available materials (e.g., in [49]). However, for a new query to the database, very complex object geometries, and undocumented densities, we still need an approach to estimate the volume V. This task is complex, and an automatic robotic setup may not be able to extract the volume of complex object geometries without additional effort, time, and cost, whereas a human operator can intuitively carry out displacement techniques to estimate the volume. For example, a set of measuring jugs can be filled with water and the object is placed inside; the displacement is proportional to the volume of the object. If the object could be damaged by water, we use polystyrene micro-pearls instead. To read the measuring scale accurately and avoid uneven micro-pearl distribution, we place a vibration motor (Newgen Medicals Vibration Plate) below the jugs and apply a constant 20 Hz vibration before each manual reading of the scale, as shown in Fig. 8. The overall displacement procedure has major advantages compared to infrared or stereoscopic approaches. First, the volume of concave objects or hole-containing ones can be determined, and second, the volume of highly reflective objects can be measured.

C. Recordings

The TUM Department of Architecture holds the largest collection of building materials among all German universities [45]. We scanned most of the available material samples along with materials from our previous work [28] to set up the largest public haptic database (to the best of our knowledge). In our current version of the database, two human operators scanned each material (five times in total) and stored the data traces as .mat files which

D. Discussion

We do not explicitly measure the scanning parameters during the device–surface interaction. Burka et al. [51] have shown that variable scan-time parameters do not necessarily decrease the classification performance whenever options exist to mitigate their influence by intelligent acquisition device design or the application of robust features. Several aspects of the Texplorer2 account for these dependencies, and we also identify more options for future optimization. The bearings of TU1 compensate for heavy pressure forces during the acquisition of vibro-tactile signals. The fixed distance of the IR reflectance sensor to the surface also makes sliding scans robust in terms of surface structure detection. The major remaining dependency is the scanning speed. Different operators may scan the surfaces differently fast, yet within a specific range of this dependency. Culbertson et al.'s [19] work on tool-mediated setups shows an average speed of 130 mm/s, which gives a good indication of how fast humans wield such acquisition systems. We need to accept a remaining variance in the vibro-tactile and audio signal scans, but are confident that the inclusion of other modalities, good device design, and robust features can still lead to successful classification.

Another important point to discuss is the number of modalities used in this article. Fig. 9 shows all currently recorded modalities for one material sample. Haptic material interaction is a multi-faceted procedure; we already employ nine different sensors leading to twenty different kinds of signals. The application of different sensors reduces potential correlations between subsequently extracted features, but poses challenges for the required amount of data to be collected. We believe that content-based haptic data acquisition, potentially used for display as well, requires multi-modal data acquisition. Versatile material scanners are required to cope with this challenge and potentially will add further sensors. For example, another sensing domain not yet examined on larger material databases is the electric susceptibility χ, which is very characteristic for electrically non-conductive materials.

IV. FEATURES FOR MATERIAL CLASSIFICATION

We define and use a set of tactile as well as non-tactile features based on the multi-modal data traces introduced in the previous section. Note that several parameters of the features are the result of feature optimization and were iteratively determined.

A. TacTUM Tactile Features

Fishel et al. [16] propose a set of representative tactile features which we understand as a more detailed description of the five major tactile dimensions from Okamoto et al. [14]. To
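The density estimate ρ = m/V behind the displacement procedure of TU4 can be sketched in a few lines; the jug-reading interface in milliliters is an illustrative choice, not the article's software.

```python
def density_from_displacement(mass_g, level_before_ml, level_after_ml):
    """Estimate density in g/cm^3 via Archimedes' principle: the
    displaced water (or micro-pearl) volume equals the sample
    volume, with 1 ml == 1 cm^3."""
    volume_cm3 = level_after_ml - level_before_ml
    if volume_cm3 <= 0:
        raise ValueError("sample must displace a positive volume")
    return mass_g / volume_cm3
```

For example, a 27 g sample that raises the jug level from 100 ml to 110 ml has a density of 2.7 g/cm^3, which is plausible for aluminum.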
TABLE II
MACROTEXTURE NOTATION OVERVIEW
the best of our knowledge, there is no explicit mathematical definition of these features publicly available. Hence, and as a major contribution, we propose the equations for the calculation of these features in the following.

1) Macroscopic Roughness: The macrotexture (MaTX^TUM), macrotexture coarseness (MaCO^TUM), and macrotexture regularity (MaRG^TUM) describe the macroscopic roughness features of a material surface.

a) Macrotexture MaTX^TUM: We define a surface-dependent macrotexture feature based on our macroscopic roughness strength feature from [28]. Several sensing dimensions determine the physical existence of surface structural patterns. We believe three different signals to be appropriate for calculating this feature value. Table II shows the auxiliary functions which we use for the MaTX^TUM calculation.

The first component is related to acceleration spikes in the vibro-tactile signals. We define a spike detection algorithm on the absolute values of the vibro-tactile sliding signals, with N being the length of v_abs. We first compute a 100-point window simple moving average array v_abs^(100) of v_abs. The moving average window length is an optimization

As a third component, the IR sensor's reflectance signal data is related to the height profile of a material surface. During a scan, variations in the reflectance signals capture the change of the current surface-sensor distance. We calculate the moving average r^(100) and consider the standard deviation of the difference r − r^(100) as our reflectance variation (RV) value

RV = σ(r − r^(100))   (6)

Finally, MaTX^TUM is calculated as a linear combination

MaTX^TUM = w_1 · VS + w_2 · FV + w_3 · RV   (7)

using the weighting parameters w_{1,2,3}, which are determined during feature optimization (Section V-A2) as (3, 2, 4).

b) Macrotexture Coarseness MaCO^TUM: The existence of visual patterns on a surface indicates whether the material surface is coarse or not. Consequently, we apply the Harris Corner Detector (HCD) [52] to identify the n most characteristic image corner points in our surface images I_disp. We use the speeded-up robust feature (SURF) extractor [53] on these corner pixels to extract and sort the scale property sc in a descending-order array sc (using the Matlab function detectSURFFeatures()). We consider the visual
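The 100-point moving average and the reflectance variation RV of Eq. (6) can be sketched as follows. The spike-detection threshold in the last function is our own assumption, since the excerpt does not state the exact spike criterion used on v_abs.

```python
import numpy as np

def moving_average(x, w=100):
    """Simple w-point moving average ('valid' part only); the window
    length of 100 samples is the optimized value from the text."""
    return np.convolve(x, np.ones(w) / w, mode="valid")

def reflectance_variation(r, w=100):
    """RV of Eq. (6): standard deviation of the reflectance signal
    minus its moving average, aligned to the window end."""
    r = np.asarray(r, dtype=float)
    return float(np.std(r[w - 1:] - moving_average(r, w)))

def spike_count(v_abs, w=100, k=3.0):
    """Count acceleration spikes as residuals over the moving average
    exceeding k standard deviations (assumed criterion)."""
    v_abs = np.asarray(v_abs, dtype=float)
    resid = v_abs[w - 1:] - moving_average(v_abs, w)
    return int(np.sum(resid > k * np.std(resid)))
```

A flat reflectance trace yields RV near zero, while a rough trace yields a large RV, which matches the intent of Eq. (6) as a surface-structure indicator.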
Fig. 12. Texplorer Unit 2 while being pressed on a deformable material sam-
ple. The normal force FSR (in red color) is used to infer the tactile compliance.
Fig. 11. Separation of normal and tangential (friction) forces for the calcula- Note that the indentation depth is constant (thanks to the attached hemisphere
tion of tactile stiction FSTTUM and sliding resistance FRSTUM . Note that the and TU2 casing) after the operator presses D x ¼ 4.0 mm into the deformable
increase of the friction force before the normal force increase may result from, material sample.
e.g., operator finger movements inside the TU2.
3) Friction: Tactile stiction (FST^TUM) and sliding resistance (FRS^TUM) can be considered equivalent to the static and dynamic friction coefficients of a material surface.

a) Tactile Stiction FST^TUM and Sliding Resistance FRS^TUM: As long as the normal force f_fr,n is approximately zero, the operator has not noticeably pressed on the material. An increasing normal force then indicates the beginning of the static friction phase. The normal and friction force (f_fr,t) values are then used to model friction based on Coulomb's law.

Since we cannot directly measure the transition from sticking to sliding over the surface with the TU2, the end of the static phase needs to be determined with a different approach. We observe that the friction and normal forces increase differently whenever the operator starts touching the material in order to perform a slide. We shift a 10 ms window over the signals to identify the maximum positive slopes Δf_fr,max and Δf_n,max and define tactile stiction as

    FST^TUM = Δf_fr,max / Δf_n,max    (16)

Fig. 11 visualizes exemplary signal traces and the corresponding slopes.

The signals after the end of the static segment describe the sliding phase, which lasts until the normal force decreases to zero again. The FSR signals in the sliding segment are denoted as f_fr,FRS and f_n,FRS, respectively. Consequently, sliding resistance is calculated as

    FRS^TUM = f_fr,FRS / f_n,FRS    (17)

b) Adhesive Tack ATK^TUM: The effort required to break contact with a surface [15] is proportional to the adhesive forces of a surface. We define it as a quintary, operator-decided value:

    ATK^TUM = 0,    if not adhesive
              0.25, if slightly adhesive
              0.5,  if uncertain
              0.75, if adhesive
              1,    if strongly adhesive    (18)

There are several reasons not to define this feature based on measurements from the sensing device. First, the majority of naturally occurring material surfaces are not noticeably adhesive. Second, the adhesion properties of any sensing device differ from those of a human finger; a material surface scanned with a biomimetic sensor made of silicone may result in a larger adhesive tack feature value than expected. Third, an operator can intuitively assess the stickiness of a material during several trials. Fourth, any oily or sticky surface might damage the device, and the detection of such surface conditions is still challenging for robotic systems. Lastly, the inclusion of the operator judgment represents another variable that does not depend on any other sensing domain.

4) Compliance: Operator-based material scanning approaches reveal an advantage over robot-based approaches for complex object structures in terms of hardness estimation. Humans generally identify the most suitable locations for grasping and folding/pinching a material sample. We use the normal FSR and Flex sensor signals recorded during the pressing and folding actions of the operator to infer hardness-related features. We first determine whether a material is deformable with regard to reasonable human interaction forces. The folding angle values ᾱ_f of the TU2 act as a tie-breaker in this binary decision. We evaluate the expression s(ᾱ_f) < 1° prior to any compliance feature calculation. If an object is rigid, all five compliance features are set to zero.

a) Tactile Compliance CCP^TUM: Tactile compliance refers to the spring stiffness [37] of a material sample and is defined as the ratio of the required normal force f_n and the achieved indentation depth Δx. Thanks to the design of our Texplorer2, we are faced with a fixed Δx and measure the corresponding (normal) force as shown in Fig. 12. The tactile compliance then is

    CCP^TUM = 0,             if s(ᾱ_f) < 1°
              f_n,max / Δx,  else    (19)

with

    f_n,max = max_{1≤i≤n} f_n,i    (20)

being the maximum force and Δx = 4.0 mm measured with a caliper. If the material sample is deformed up to this indentation depth, the additional force is compensated by the whole TU2 casing. Note that either a 3D-printed PLA or a rubber (Shore hardness scale value > 75) hemisphere is applicable in this context.
Authorized licensed use limited to: SLUB Dresden. Downloaded on June 02,2023 at 07:51:49 UTC from IEEE Xplore. Restrictions apply.
414 IEEE TRANSACTIONS ON HAPTICS, VOL. 13, NO. 2, APRIL-JUNE 2020
    CDF^TUM = s(ᾱ_f)    (21)
STRESE et al.: HAPTIC MATERIAL ANALYSIS AND CLASSIFICATION INSPIRED BY HUMAN EXPLORATORY PROCEDURES 415
dominant color, and magnetic feature works well for metal classification. The surface reflectivity is also highly characteristic and independent of operator exploratory motions since the data is captured during static touch.

feature space. Larger correlations can be expected for sub-dimensions belonging to the same major tactile dimension. For example, FST^TUM and FRS^TUM belong to the dimension of friction and are intuitively correlated.
TABLE IV
BEST FEATURES FROM DIFFERENT TESTED DOMAINS

TABLE V
CRITICAL CORRELATIONS OF FINAL TACTUM FEATURES. NOTE THAT THE FEATURES RELATED TO STIFFNESS ARE CORRELATED DUE TO THE FACT THAT THE DATABASE CONTAINS MANY MORE RIGID THAN DEFORMABLE MATERIALS, AND HENCE, THE CORRESPONDING STIFFNESS FEATURE VALUES ARE SET TO 0
Fig. 16. Results of the coarse classification with X10 (top: accuracy, bottom: F1 score) for different classifiers. The x-axis shows the sets of features used to train the classifiers.

Fig. 17. Results of the fine classification with X10. The dotted line shows the best accuracy of 90.2% ± 1.2% and F1 score of 0.90 ± 0.01.

Fig. 18. Results of the fine classification with X3 (top: accuracy, bottom: F1 score) using only the three best features from different domains and related work.

Fig. 19. DL network based on AlexNet [68] adapted to perform a fine classification of the 184 material classes in this article. Each convolutional layer is followed by a batch normalization and rectified linear unit (ReLU) layer. If not explicitly stated, the stride was set to 2 and the padding to "same".

VGGVD [70] have been proposed for surface texture classification. Bello et al. [71] extensively compared deep networks with hand-crafted image features. Deep networks outperform other approaches in terms of texture classification whenever non-stationary textures and multiple acquisition conditions are considered. By contrast, hand-crafted features are superior in distinguishing stationary textures under steady imaging conditions and have proven to be more robust than CNN-based features to variations in image rotation [71]. Due to the rapid progress of deep learning, a comparison to the hand-crafted feature approach is inevitable in this context.

1) Network Architecture and Results: Liu et al. [57] summarize that the network structures AlexNet, VGGM, VGGVD, and TCNN literally solved the texture classification problem on the largest existing texture data-sets such as CUReT [72], KTH-TIPS2 [73], or ALOT [74] by achieving a classification performance of 99% according to Cimpoi et al. [69]. That is why we use our database images with these previously defined network structures and do not go further into the network design details. Fig. 19 shows the DL network which we use in this article. It is transfer-learned from AlexNet [68], which has proven to work excellently for object recognition. We also cross-validated the adapted network with the ALOT database (images sliced to a fixed size of 768 x 330 x 3) and achieved a comparable result of 95%. The actual bottleneck is the amount of correctly labeled data to learn from. Existing deep learning-based networks almost perfectly classify the available texture image databases, as summarized in [57]. Consequently, we consider our extensive image database with the new material labeling taxonomy as highly suitable for further validations. We sliced the captured RGB images I_disp from Section III into [500 x 500] RGB patches, leading to 175 images per material class with no pixel overlap between slices. We rotated each image by 90° to augment this data-set, leading to 350 images per material class. We applied a split of 80% training and 20% test data. Fig. 20 shows a few randomly selected patches from our DL database, which we denote as the TexTUM database.

We use the following hyper-parameters for training:
- Solver: stochastic gradient descent with momentum (SGDM) of 0.9; alternatively: Adam optimizer
- Mini-batch size of 128
- Initial learning rate of 0.01
- L2 regularization of 0.0001
- Dropout layers with 40% dropout

Overall, the DL network achieves a classification accuracy of 90.7% ± 1.0% after 40 epochs of training and hence slightly outperforms all hand-crafted approaches.

2) Misclassification Comparison of Deep Learning and Hand-Crafted Features: Overall, the image-based DL network
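The patch extraction and augmentation steps described above can be sketched as follows. This is only an illustration of the procedure (array-based patching with NumPy, a hypothetical image size, and a random split are assumed), not the authors' pipeline.

```python
import numpy as np

PATCH = 500  # non-overlapping [500 x 500] patches, as in the TexTUM database

def slice_into_patches(image, patch=PATCH):
    """Cut an RGB image into non-overlapping patch x patch tiles (no pixel overlap)."""
    h, w, _ = image.shape
    return [image[r:r + patch, c:c + patch]
            for r in range(0, h - patch + 1, patch)
            for c in range(0, w - patch + 1, patch)]

def augment_rotate90(patches):
    """Double the data-set by adding a 90-degree rotation of every patch."""
    return patches + [np.rot90(p) for p in patches]

def train_test_split(samples, train_frac=0.8, seed=0):
    """Random 80%/20% split of the augmented patches."""
    rng = np.random.default_rng(seed)
    idx = rng.permutation(len(samples))
    cut = int(train_frac * len(samples))
    return [samples[i] for i in idx[:cut]], [samples[i] for i in idx[cut:]]
```

With 175 patches per class, the 90° rotation yields the 350 images per class mentioned above before the 80/20 split is applied.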
E. Discussion

We can draw the conclusion that a selection of features from different modalities performs best when combined with popular classifiers like naive Bayes or random forest, which have also been applied successfully to comparable classification tasks. Ten features from different domains already lead to a satisfying performance at different classification label detail levels, and we observe even larger accuracies and F1 scores

ACKNOWLEDGEMENT

We would like to thank the Chair of Building Construction and Material Science (TUM Department of Architecture) for access to the collection of building materials, and specifically Dipl.-Ing. Johann Weber for providing extensive information about the materials. In addition, the authors would like to thank the anonymous reviewers for their constructive comments which helped to improve this article.
APPENDIX
TABLE VI
MATERIAL CATEGORIZATION CONVENTION TABLE. FOR A COMPLETE LIST PLEASE REFER TO OUR WEBSITE [42]
[23] J. M. Romano and K. J. Kuchenbecker, "Methods for robotic tool-mediated haptic surface recognition," in Proc. IEEE Haptics Symp., 2014, pp. 49–56.
[24] J. Hoelscher, J. Peters, and T. Hermans, "Evaluation of tactile feature extraction for interactive object recognition," in Proc. Humanoids, 2015, pp. 310–317.
[25] M. Kaboli and G. Cheng, "Robust tactile descriptors for discriminating objects from textural properties via artificial robotic skin," IEEE Trans. Robot., vol. 34, no. 4, pp. 985–1003, Aug. 2018.
[26] A. Burka et al., "Proton: A visuo-haptic data acquisition system for robotic learning of surface properties," in Proc. IEEE Int. Conf. Multisensor Fusion Integrat. Intell. Syst., 2016, pp. 58–65.
[27] M. Strese, C. Schuwerk, A. Iepure, and E. Steinbach, "Multimodal feature-based surface material classification," IEEE Trans. Haptics, vol. 10, no. 2, pp. 226–239, Apr.–Jun. 2017.
[28] M. Strese, Y. Boeck, and E. Steinbach, "Content-based surface material retrieval," in Proc. IEEE World Haptics Conf., 2017, pp. 352–357.
[29] S. Sato, S. Okamoto, Y. Matsuura, and Y. Yamada, "Wearable finger pad sensor for tactile textures using propagated deformation on a side of a finger: Assessment of accuracy," in Proc. IEEE Int. Conf. Syst., Man, Cybern., 2015, pp. 892–896.
[30] Y. Vardar, B. Güçlü, and C. Basdogan, "Effect of waveform on tactile perception by electrovibration displayed on touch screens," IEEE Trans. Haptics, vol. 10, no. 4, pp. 488–499, Oct.–Dec. 2017.
[31] Y. Visell and Y. Shao, "Learning constituent parts of touch stimuli from whole hand vibrations," in Proc. IEEE Haptics Symp., 2016, pp. 253–258.
[32] R. Grigorii, M. Peshkin, and J. E. Colgate, "High-bandwidth tribometry as a means of recording natural textures," in Proc. IEEE World Haptics Conf., Munich, Germany, 2017, pp. 629–634.
[33] S. Shin and S. Choi, "Geometry-based haptic texture modeling and rendering using photometric stereo," in Proc. IEEE Haptics Symp., San Francisco, CA, USA, 2018, pp. 262–269.
[34] M. Dulik and L. Ladanyi, "Surface detection and recognition using infrared light," in Proc. IEEE Elektro, 2014, pp. 159–164.
[35] T. Aujeszky, G. Korres, and M. Eid, "Measurement-based thermal modeling using laser thermography," IEEE Trans. Instrum. Meas., vol. 67, no. 6, pp. 1359–1369, Jun. 2018.
[36] H. Choi, S. Cho, S. Shin, H. Lee, and S. Choi, "Data-driven thermal rendering: An initial study," in Proc. IEEE Haptics Symp., San Francisco, CA, USA, 2018, pp. 344–350.
[37] C. Basdogan and M. A. Srinivasan, "Haptic rendering in virtual environments," Handbook Virtual Environ., vol. 1, pp. 117–134, 2002.
[38] A. M. Okamura, J. T. Dennerlein, and R. D. Howe, "Vibration feedback models for virtual environments," in Proc. IEEE Int. Conf. Robot. Automat., 1998, vol. 1, pp. 674–679.
[39] K. J. Kuchenbecker, J. Fiene, and G. Niemeyer, "Improving contact realism through event-based haptic feedback," IEEE Trans. Visual. Comput. Graph., vol. 12, no. 2, pp. 219–230, Mar./Apr. 2006.
[40] J. Sinapov and V. Sukhoy, "Vibrotactile recognition and categorization of surfaces by a humanoid robot," IEEE Trans. Robot., vol. 27, no. 3, pp. 488–497, Jun. 2011.
[41] V. Chu et al., "Robotic learning of haptic adjectives through physical interaction," Robot. Auton. Syst., vol. 63, pp. 279–292, 2015.
[42] M. Strese, C. Schuwerk, R. Chaudhari, and E. Steinbach, "Haptic texture database," [Online]. Available: https://zeus.lmt.ei.tum.de/downloads/texture/
[43] H. Culbertson, J. J. Lopez Delgado, and K. J. Kuchenbecker, "One hundred data-driven haptic texture models and open-source methods for rendering on 3D objects," in Proc. IEEE Haptics Symp., 2014, pp. 319–325.
[44] W. Hassan, A. Abdulali, M. Abdullah, S. C. Ahn, and S. Jeon, "Towards universal haptic library: Library-based haptic texture assignment using image texture and perceptual space," IEEE Trans. Haptics, vol. 11, no. 2, pp. 291–303, Apr.–Jun. 2018.
[45] F. Musso and J. Weber, TUM collection of building materials. (2019). [Online]. Available: https://www.ar.tum.de/en/ebb/collection-of-building-materials/
[46] H.-J. Bargel and G. Schulze, Werkstoffkunde. Berlin, Germany: Springer-Verlag, 2008.
[47] N. Landin, J. M. Romano, W. McMahan, and K. J. Kuchenbecker, "Dimensional reduction of high-frequency accelerations for haptic rendering," in Proc. Int. Conf. Human Haptic Sensing Touch Enabled Comput. Appl., Springer, 2010, pp. 79–86.
[48] H. Culbertson, J. M. Romano, P. Castillo, M. Mintz, and K. J. Kuchenbecker, "Refined methods for creating realistic haptic virtual textures from tool-mediated contact acceleration data," in Proc. IEEE Haptics Symp., 2012, pp. 385–391.
[49] E. Meier. The wood database. (2019). [Online]. Available: https://www.wood-database.com/
[50] Material Archiv. (2019). [Online]. Available: http://www.materialarchiv.ch/
[51] A. Burka and K. J. Kuchenbecker, "Handling scan-time parameters in haptic surface classification," in Proc. IEEE World Haptics Conf., 2017, pp. 424–429.
[52] C. Harris and M. Stephens, "A combined corner and edge detector," in Proc. Alvey Vision Conf., 1988, vol. 15, no. 50, pp. 1–5.
[53] H. Bay, T. Tuytelaars, and L. Van Gool, "Speeded up robust features (SURF)," in Proc. Eur. Conf. Comput. Vision, Springer, 2006, pp. 404–417.
[54] H. Tamura, S. Mori, and T. Yamawaki, "Textural features corresponding to visual perception," IEEE Trans. Syst., Man, Cybern., vol. 8, no. 6, pp. 460–473, Jun. 1978.
[55] J. P. Den Hartog, Mechanical Vibrations. Courier Corporation, p. 55 ff, 1985.
[56] W. M. Bergmann Tiest and A. M. Kappers, "Tactile perception of thermal diffusivity," Attention Perception Psychophys., vol. 71, no. 3, pp. 481–489, 2009.
[57] L. Liu, J. Chen, P. Fieguth, G. Zhao, R. Chellappa, and M. Pietikäinen, "From BoW to CNN: Two decades of texture representation for texture classification," Int. J. Comput. Vision, vol. 127, no. 1, pp. 74–109, 2019.
[58] R. M. Haralick, K. Shanmugam, and I. H. Dinstein, "Textural features for image classification," IEEE Trans. Syst., Man, Cybern., vol. SMC-3, no. 6, pp. 610–621, Nov. 1973.
[59] R. Monzel. Haralick texture features Matlab. (2019). [Online]. Available: https://de.mathworks.com/matlabcentral/fileexchange/58769-haralicktexturefeatures
[60] S. Sornapudi. Tamura features. (2019). [Online]. Available: https://github.com/Sdhir/TamuraFeatures
[61] T. Ojala, M. Pietikainen, and T. Maenpaa, "Multiresolution gray-scale and rotation invariant texture classification with local binary patterns," IEEE Trans. Pattern Anal. Mach. Intell., vol. 24, no. 7, pp. 971–987, Jul. 2002.
[62] T. Giannakopoulos and A. Pikrakis. Introduction to audio analysis. (2019). [Online]. Available: https://de.mathworks.com/matlabcentral/fileexchange/45831-matlab-audio-analysis-library
[63] P. Flach, Machine Learning: The Art and Science of Algorithms That Make Sense of Data. Cambridge, U.K.: Cambridge University Press, 2012.
[64] I. Guyon and A. Elisseeff, "An introduction to variable and feature selection," J. Mach. Learn. Res., vol. 3, pp. 1157–1182, Mar. 2003.
[65] P. Domingos, "A few useful things to know about machine learning," Commun. ACM, vol. 55, no. 10, pp. 78–87, 2012.
[66] M. Strese, C. Schuwerk, and E. Steinbach, "Surface classification using acceleration signals recorded during human freehand movement," in Proc. IEEE World Haptics Conf., Evanston, IL, USA, 2015, pp. 214–219.
[67] Y. LeCun, L. Bottou, Y. Bengio, and P. Haffner, "Gradient-based learning applied to document recognition," Proc. IEEE, vol. 86, no. 11, pp. 2278–2324, Nov. 1998.
[68] A. Krizhevsky, I. Sutskever, and G. E. Hinton, "ImageNet classification with deep convolutional neural networks," in Proc. Adv. Neural Inf. Process. Syst., 2012, pp. 1097–1105.
[69] M. Cimpoi, S. Maji, and A. Vedaldi, "Deep filter banks for texture recognition and segmentation," in Proc. IEEE Conf. Comput. Vision Pattern Recognit., 2015, pp. 3828–3836.
[70] K. Simonyan and A. Zisserman, "Very deep convolutional networks for large-scale image recognition," 2014, arXiv:1409.1556.
[71] R. Bello-Cerezo, F. Bianconi, F. Di Maria, P. Napoletano, and F. Smeraldi, "Comparative evaluation of hand-crafted image descriptors vs. off-the-shelf CNN-based features for colour texture classification under ideal and realistic conditions," Appl. Sci., vol. 9, no. 4, p. 738, 2019.
[72] K. J. Dana, B. Van Ginneken, S. K. Nayar, and J. J. Koenderink, "Reflectance and texture of real-world surfaces," ACM Trans. Graph., vol. 18, no. 1, pp. 1–34, 1999.
[73] P. Mallikarjuna, A. T. Targhi, M. Fritz, E. Hayman, B. Caputo, and J.-O. Eklundh, The KTH-TIPS2 Database. Stockholm, Sweden: Computational Vision and Active Perception Laboratory (CVAP), 2006.
[74] J. M. Geusebroek, Amsterdam library of textures (ALOT). (2019). [Online]. Available: http://aloi.science.uva.nl/public/alot/
[75] Y. Gao, L. A. Hendricks, K. J. Kuchenbecker, and T. Darrell, "Deep learning for tactile understanding from visual and haptic data," in Proc. IEEE Int. Conf. Robot. Automat., 2016, pp. 536–543.
[76] M. Ji, L. Fang, H. Zheng, M. Strese, and E. Steinbach, "Preprocessing-free surface material classification using convolutional neural networks pretrained by sparse autoencoder," in Proc. IEEE 25th Int. Workshop Mach. Learn. Signal Process., 2015, pp. 1–6.
[77] H. Zheng, L. Fang, M. Ji, M. Strese, Y. Özer, and E. Steinbach, "Deep learning for surface material classification using haptic and visual information," IEEE Trans. Multimedia, vol. 18, no. 12, pp. 2407–2416, Dec. 2016.

Matti Strese received the degree in electrical engineering from the Technical University of Munich, Germany. He received the Master of Science degree in July 2014. After graduating, he joined the Chair of Media Technology, Technical University of Munich, in September 2014, where he is a member of the Research Staff and working toward the Ph.D. degree. His current research interests include the analysis of haptic texture signals, surface classification, and artificial surface synthesis devices.

Lara Brudermueller received the Bachelor of Science degree in management and technology with specialization in electrical engineering from the Technical University of Munich, Germany. She is currently working toward the Master of Science degree in robotics, cognition, and intelligence from the Technical University of Munich. In the summer of 2018, she wrote her bachelor's thesis in the field of haptics and object surface recognition at the Chair of Media Technology.

Jonas Kirsch received the Bachelor of Science degree in electrical and computer engineering from the Technical University of Munich and is currently working toward the Master of Science degree. In the summer of 2018, he completed his bachelor thesis in the field of haptics and tactile data acquisition and has been working ever since at the Chair of Media Technology.

Eckehard Steinbach (M'96–SM'08–F'15) received the degree in electrical engineering from the University of Karlsruhe (Germany), University of Essex, Britain, and ESIEE in Paris. From 1994 to 2000, he was a member of the Research Staff of the Image Communication Group, University of Erlangen-Nuremberg, Germany, where he received the engineering doctorate in 1999. From February 2000 to December 2001, he was a Postdoctoral Fellow with the Information Systems Laboratory of Stanford University. In February 2002, he joined the Department of Electrical and Computer Engineering, Technical University of Munich, Germany, where he is currently a Full Professor of Media Technology. His current research interests include the area of haptic and visual communication, indoor mapping, and localization.