Dentronics: Towards robotics and artificial intelligence in dentistry

Article in Dental Materials · April 2020


DOI: 10.1016/j.dental.2020.03.021

5 authors, including: Jasmin Grischke (Hannover Medical School) and Lars Johannsmeier (Franka Robotics GmbH)



Dentronics: Towards Robotics and Artificial Intelligence in Dentistry

Abstract

Objectives: This paper provides an overview of existing applications and concepts of robotic systems
and artificial intelligence in dentistry. This review aims to provide the community with novel inputs
and argues for an increased utilization of these recent technological developments, referred to as
Dentronics, in order to advance dentistry.

Methods: First, background on developments in robotics, artificial intelligence (AI) and machine
learning (ML) that may enable novel assistive applications in dentistry is reviewed (Sec A). Second, a
systematic technology review evaluating existing state-of-the-art applications of AI, ML and
robotics in the context of dentistry is presented (Sec B).

Results: A systematic literature search in PubMed yielded a total of 558 results. 41 studies related
to ML, 53 studies related to AI and 49 original research papers on robotics applications in dentistry
were included. ML and AI have been applied in dental research to analyze large amounts of data to
eventually support dental decision making, diagnosis, prognosis and treatment planning with the help
of data-driven analysis algorithms. So far, only a few robotic applications
have made it to reality, mostly restricted to pilot use cases.

Significance: The authors believe that dentistry can greatly benefit from the current rise of digital
human-centered automation and be transformed towards a new robotic, ML and AI-enabled era. In
the future, Dentronics will enhance reliability, reproducibility, accuracy and efficiency in dentistry
through the democratized use of modern dental technologies, such as medical robot systems and
specialized artificial intelligence. Dentronics will increase our understanding of disease pathogenesis,
improve risk-assessment strategies, diagnosis and disease prediction, and finally lead to better
treatment outcomes.

Objectives

Various types of robots are already part of our everyday life: they support production in industrial
applications, cut our grass and clean our floors [1]. The current generation of human-safe robots is
finally able to work directly with human co-workers, assist them and relieve them of tedious and
laborious routine tasks [2, 3]. At the same time, they show a high degree of performance and are
becoming economically more relevant. In the near future, this will lead to the introduction of robots
to various new application areas, in particular as co-workers not only in the industrial sector but also
in the public domain in the form of smart assistants [4, 5]. One such promising area may be
dentistry, since it provides manifold opportunities for assistive work and automation of simple
routine tasks, supporting dental staff while improving quality of work and care. However, there are
still major challenges and limitations that need to be overcome [6].

In the next decades more and more manual work will be automated as an inevitable consequence of
technological progress. A study from Oxford University describes the susceptibility of jobs being
computerized and found dental assistants and dental hygienists to be at elevated risk, whereas
dentists were scored with a low susceptibility [7]. We agree with this assessment since robotics and
ML will likely not replace dentists but complement their expertise and support them in order to
reach a new level of accuracy, better patient experience and treatment success. Already in 1967
Jenkins described a robot dental secretary and today more and more robot application scenarios
have made it to reality [8]. Improving healthcare quality and efficiency is demanding since the
processes in place have not yet been properly standardized. Consequently, researchers and
practitioners increasingly advocate digital process redesign as a valuable way to reduce practice
variation and improve the overall quality of health care [9-11].

By making use of smart robot technology in dental offices, e.g. as assistants to the dentist, the
current technological infrastructure could be augmented in many beneficial ways. Among the most
motivating factors may actually be human factors, i.e. the potential mental and physical overload of
human assistants after long hours of work that requires constant focus or the lack of ergonomic work
environments. This may lead to further problems such as decrease of hygiene and mistakes during
medical exams, disease diagnosis or treatment planning. In addition, general negligence with regard
to daily routine tasks (e.g. rinsing of dental chair waterpipes, cleaning of surfaces and instruments in
the dental office) is possible. Since robot assistants do not tire and are able to repeat their workflows
indefinitely, human resources could be freed for other tasks that robots are not able to do, such as
direct social interaction with patients or other work with high cognitive requirements.

Beyond the use of robots in dental assistance, more invasive use cases have been reported in the
literature, such as autonomous implant placement [12], potentially combined with complex
3D navigation [13, 14], or tooth preparation procedures.

Another area of interest is the use of robot technology in dental education. Universities train their
students with the help of advanced simulation and haptic device interaction or even full-body robotic
patients [15] to improve fundamentally important skills for future dentists prior to real patient
contact [16-19].

As it is with many technologies that are introduced into a new environment, various obstacles of very
different nature may occur. One such hurdle is the fact that technological developments for medical
applications are extremely costly and the necessary technological advances have only recently been
introduced into industrial and assistive robotics. Recently however, modern robot technology has
developed rapidly, and even medical robots have been applied in some clinical areas as well as in the
field of dentistry [20]. Another important aspect may be the unknown patient compliance and
acceptance among dental professionals [21]. A study among patients found that female participants
are less motivated to undergo robotic medical treatments than male participants; irrespective of
gender, willingness to undergo robot-supported medical interventions decreases with the
invasiveness of the procedure [21].

The possibilities of future combinations of robotics and dentistry are manifold. The most immediate
one is obviously to create a working implementation that adheres to all necessary requirements, such
as human safety, human-centered interaction, human-robot communication and reliable, intuitive-to-
use manipulation skills, and that can be employed on a large scale with specific initial use-cases [6].
Building on this thought we outline the technological advances that have been made which enable
the utilization of robotic support in dentistry. Furthermore, we motivate this step by describing the
opportunities that emerge from a combination of robotics, artificial intelligence, machine learning
and dentistry (Figure 1). In the following, we refer to this combination as Dentronics as described in
our previous work [6, 22] and defined below.

Figure 1 Vision of a possible robot and artificial intelligence service network to support future dentistry.

Definition of Dentronics: Dentronics is the hypernym of a wide field of modern dental
technologies, such as medical robot systems and specialized artificial intelligence, including
hardware, software, human-machine interaction, robot safety and assistive functions. It describes
assistive, diagnostic, predictive and invasive human-centered technological tools, used to enhance
reliability, reproducibility, accuracy and efficiency in dentistry.

Section A

Background on developments in robotics, artificial intelligence (AI) and machine learning (ML)

The authors present their expert knowledge to point out recent advances in interactive force-
sensitive robotics and related artificial intelligence (AI) methods and data-driven machine learning
(ML) relevant to dentistry, in order to provide the dental community with an overview of the current
state-of-the-art enabler technologies in Dentronics.

Interactive force-sensitive robotics

There are several enabler technologies in robotics, AI and ML that may contribute to the
development of novel methodologies in dentistry. In the following, we review these technologies
with regard to the vision of Dentronics shown in Fig 1. In the context of robotics, we distinguish
between mechatronics, control, safety and human-robot interaction. Furthermore, we review AI
technologies used in robotic applications essential to Dentronics, specifically also ML technologies
and algorithms currently used in dentistry.

Mechatronics
The development of the mechatronic components of robots has long been an incremental process
that only seventeen years ago started to yield the first light-weight robot designs, which would years
later convincingly demonstrate the feasibility of human-robot collaboration [23]. This design
principle has been extended and further developed [24, 25]. The most distinguishing factor of these
systems beyond their lightweight design is their use of joint torque sensors in every joint and the
resulting inherently or actively compliant and reflexive joints [26].

Beyond next-generation industrial systems, the medical domain in particular has been a driving
factor in developing novel robot platforms such as the research system DLR MIRO [27, 28] or the commercially
very successful da Vinci system [29]. The latter is utilized worldwide by many large clinics for
minimally invasive surgeries such as prostatectomies, cardiac valve repair and gynecologic
procedures. Much research in this area has also been done on continuum robots [30] that may be
better suited for a number of tasks related to minimally-invasive surgery. An overview of research
and system design in the medical domain can be found in [31].

In recent years the development of collaborative and safe robots has accelerated significantly, with
robotic systems that enable fenceless operation. With commercially available collaborative robots
such as the Universal Robots systems (Universal Robots A/S, Odense, Denmark), the Rethink Robotics
(Rethink Robotics GmbH, Rheinböllen, Germany) systems Baxter and Sawyer [32, 33] and the KUKA iiwa
and iisy platforms (KUKA AG, Augsburg, Germany), robots have become sensitive. With the Franka Emika
Panda (Franka Emika GmbH, Munich, Germany), these developments have recently reached a point
where the robot's tactile perception enables new applications and the systems have started to become
affordable while remaining highly capable, an important factor for the overall acceptance of
collaborative robots.

Control
The controller of a robot determines how its joints are actuated depending on sensory input and is
therefore directly related to the way it is explicitly or implicitly interacting with the environment. The
application of robotic systems in a dental scenario requires the capability of sensitive physical
interaction without causing any form of harm to the environment or the system itself. In robotics,
impedance control [34] became an essential framework that still serves as the basis for many
modern control systems fulfilling this requirement. It enables robots to safely interact with the
environment in a similar manner as humans do by mimicking the compliant human motor control
behavior. This stands in stark contrast to classical position-controlled industrial robots used at
assembly lines and in workspaces segregated from humans. Also of major importance for the
foundation of modern robot control is the operational space formulation developed in [35] that
fundamentally changed the way a robot task is seen and defined.

More recent research on the topic of control for collaborative robots, especially also in case of
kinematic redundancy, can be found in [36-38].
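
To make the impedance idea concrete, the following minimal one-dimensional sketch (an illustrative toy model of our own, not any controller from the cited works) renders the end-effector as a point mass pulled toward a desired position by a virtual spring-damper:

```python
# Minimal 1-DOF impedance-control sketch (illustrative only):
# the end-effector is modeled as a point mass m, and the controller
# applies F = K*(x_d - x) - D*v, a virtual spring-damper pulling
# the mass toward the desired position x_d.

def simulate_impedance(x0=0.0, x_d=0.1, K=500.0, D=45.0, m=1.0,
                       dt=1e-3, steps=2000):
    """Integrate the closed-loop dynamics with semi-implicit Euler."""
    x, v = x0, 0.0
    for _ in range(steps):
        force = K * (x_d - x) - D * v   # impedance law
        a = force / m                   # Newton: a = F / m
        v += a * dt
        x += v * dt
    return x

final_x = simulate_impedance()
# The mass settles near the 0.1 m target without a stiff position
# loop; lowering K makes the behavior more compliant, the essence
# of safe physical interaction.
```

Lowering the virtual stiffness K trades tracking accuracy for compliance, which is exactly the knob that distinguishes these controllers from classical stiff position control.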

Safety
Human safety is among the most important aspects of a human-robot co-working scenario. Especially
in the last decade, extensive research [39] has been conducted to determine the requirements of safe
robots, and studies on injuries have resulted in the development of novel design paradigms that aim
to make modern robots inherently safe. In particular, some works
[40] address safety issues and led to new safety standards such as ISO 13482-3 (Safety of machinery -
Functional safety of safety-related electrical, electronic and programmable electronic control
systems). Within the soft robotics paradigm joint torque sensors with suitable disturbance observers
are used for human-robot contact handling and, more general, for unified collision handling and
reflex reaction [41]. To prevent injuries of human co-workers in unintended collisions, safe motion
control methods were developed [42], determining the maximum allowed velocity for ensuring
human safety by means of an injury database and the current robot configuration, an essential
component to let humans and robots share physical spaces and seamlessly interact. Other work
developed methods towards safety-aware robot and task design [43]. Beyond the isolated robot
alone, the workplace in its entirety should be considered when designing a safe work
environment [44]. Within the European Economic Area a robot system has to be provided with a CE
marking which confirms the safety, health and environmental protection standards for products [45,
46].
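
The velocity-limiting idea behind safe motion control can be illustrated with a simple speed-and-separation sketch; the thresholds and the linear ramp below are invented for illustration and are not taken from the cited injury database or any safety standard:

```python
# Illustrative speed-and-separation sketch: scale the robot's
# commanded speed with the measured human-robot distance.
# The numeric thresholds are made-up illustration values,
# not limits from ISO standards or the cited injury data.

def safe_speed(distance_m, v_max=1.5, stop_dist=0.2, full_dist=1.0):
    """Return an allowed Cartesian speed [m/s] for a given
    human-robot separation [m]: zero inside stop_dist, full
    speed beyond full_dist, and a linear ramp in between."""
    if distance_m <= stop_dist:
        return 0.0
    if distance_m >= full_dist:
        return v_max
    # linear interpolation between the two thresholds
    return v_max * (distance_m - stop_dist) / (full_dist - stop_dist)
```

A production system would replace the linear ramp with limits derived from an injury database and the current robot configuration, as described above, but the monotone distance-to-speed mapping is the core mechanism.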

Human-Robot Interaction
The understanding, designing, and evaluation of robotic systems for use by or with humans is
referred to as human-robot interaction [47].

There are various modes of interaction between humans and robots. Physical human-robot
interaction (pHRI) [48] has become increasingly relevant in modern robotics [25] and will also play an
important role in Dentronics applications. A former review article found haptics to be one of the key
elements for robotics in dentistry [49]. Safe pHRI requires collaborative and sensitive robots and
suitable compliant behavior [40] made possible by appropriate controllers as described above. An
example for physical communication between humans and robots are haptic gestures [50]. They
allow the human to relay context-dependent intentions to the robot by touching it. The robot’s
estimation of external forces can then process amplitude, direction and duration as well as other
haptic signatures of the touch and use it as a means of communication. Extensions with machine-
learning-based data analysis even allow autonomous contact classification [51]. In a broader context,
button interfaces are also related to physical interaction, especially if they are mounted on the robot
in order to form an integrated direct control as for example on the Franka Emika Panda arm [52] or
the Baxter platform [33].
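
As a toy illustration of haptic-gesture interpretation, the following sketch discriminates a short "tap" from a longer "hold" using amplitude and duration thresholds on an estimated external-force signal; the thresholds and gesture names are our own assumptions, not the classifiers of [50] or [51]:

```python
# Illustrative haptic-gesture discrimination from an estimated
# external-force signal: a short, strong contact is read as a
# "tap", a longer contact as a "hold". The thresholds are invented
# for illustration; a real system would use learned classifiers.

def classify_touch(force_samples, dt=0.01, tap_max_s=0.3,
                   min_force=2.0):
    """force_samples: list of external-force magnitudes [N]
    sampled every dt seconds; returns a gesture label."""
    contact = [f for f in force_samples if f > min_force]
    if not contact:
        return "no_contact"
    duration = len(contact) * dt   # total time above threshold
    return "tap" if duration <= tap_max_s else "hold"
```

Amplitude, direction and other haptic signatures mentioned above would enter as further features; machine-learning-based contact classification [51] generalizes exactly this thresholding idea.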

Another rather basic form of contact-based interaction is graphical interfaces such as [53], where it
is important not to overwhelm the user with information but to focus on the current context [54].
However, these methods may not be as intuitive as direct physical interaction with the robot.

Contact-free interaction modalities like visual interaction [55] based on RGB-D camera systems or
similar technologies such as infrared have been researched for many years. Especially in dental
scenarios this would be of great benefit, as the dentist usually cannot move around freely to
interact with a robot directly. In order to establish such communication, visually recognizable
gestures like waving, hand opening/closing or pointing are utilized [56-58]. Visual communication
assumes that the dentist can move the hands but cannot move around to touch the robot directly;
after short auditory or visual command cues, the dentist has the hands free while the robot
does its work autonomously. Furthermore, voice recognition and foot-pedal-controlled commands
are possible.

More advanced techniques involve motion tracking of humans or face recognition [59]. Their
advantages are that no direct contact with the robot is necessary and a certain degree of comfort
for the human user is created, since they do not have to alter their location. Despite major
advances over the last years, such systems are still sensitive to factors such as changing lighting
and obstacles, and still require substantial computational resources.

Auditory interaction includes verbal communication and general sound signals to relay information
between humans and robots [60]. Simple sound signals are often employed to support other means
of communication e.g. a confirmation sound when pressing buttons or performing a haptic gesture.
Verbal communication requires much more advanced algorithms and still is prone to errors in
practice. While text-to-speech (TTS) is comparatively widespread and easy to implement, general
speech recognition is a difficult problem especially in a quite unstructured scenario such as a dental
office. By leveraging large amounts of data, neural networks have proven to be a promising approach
[61]. A recent overview can be found in [62]. Other important work, early and recent, can be found in
[63-66].

Programming
Before the emergence of collaborative and soft robots in the last decade, most platforms required a
tedious and time-consuming programming process even for the simplest tasks. Today's
technological level allows for much more intuitive and efficient programming schemes [67].
Kinesthetic teaching is a very common way to program collaborative robots and involves a human
user directly guiding the robot and teaching it the intended motions and skills. The increasingly
intuitive and human-centered design of user interfaces with powerful task setup and programming
enables non-professionals to learn to interact with robots in very little time [68].
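
Kinesthetic teaching boils down to recording configurations while the human guides the compliant robot and replaying them later. A deliberately minimal sketch (real systems record continuous trajectories, gripper actions and impedance profiles; the class and method names here are hypothetical):

```python
# Kinesthetic teach-and-replay in the simplest possible form.
# Illustrative only: the class and method names are our own
# invention, not the API of any commercial robot.

class TeachPendant:
    def __init__(self):
        self.waypoints = []

    def record(self, joint_positions):
        """Store the current joint configuration while the human
        guides the compliant robot by hand."""
        self.waypoints.append(list(joint_positions))

    def replay(self, move_fn):
        """Replay the taught motion through a robot-specific move
        command, injected here as a callable."""
        for wp in self.waypoints:
            move_fn(wp)

# usage sketch: record two poses, then replay them into a log
pendant = TeachPendant()
pendant.record([0.0, -0.5, 1.2])
pendant.record([0.3, -0.2, 0.9])
log = []
pendant.replay(log.append)
```

The point of the sketch is the workflow, not the data structure: teaching happens by physical demonstration rather than by writing motion code.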

Machine Learning

The term machine learning (ML) summarizes different methods for making use of possibly large
amounts of data to learn and self-improve from experience. It is distinct from classical
artificial intelligence (AI), which is mostly concerned with planning and search problems, knowledge
representation and logical reasoning. A comprehensive overview of ML in general can be found in
[69]. More specialized branches of machine learning are e.g. transfer learning [70], reinforcement
learning [71, 72], object detection [73, 74] and speech recognition [75, 76]. Examples of recent
applications and research works are translators such as DeepL [77], diagnosis tools [78], speech
recognition frameworks [79] or manipulation learning schemes for robots [80].
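
At its core, "learning from data" means adjusting model parameters to reduce an error measured on examples. A deliberately tiny, generic illustration (textbook gradient descent on a one-parameter model, not a method from the cited works):

```python
# A minimal example of learning from data: fit the slope w of
# y = w * x by gradient descent on the mean squared error.
# Generic textbook ML for illustration only.

def fit_slope(xs, ys, lr=0.01, epochs=500):
    w = 0.0
    n = len(xs)
    for _ in range(epochs):
        # gradient of mean squared error with respect to w
        grad = sum(2 * (w * x - y) * x for x, y in zip(xs, ys)) / n
        w -= lr * grad
    return w

xs = [1.0, 2.0, 3.0, 4.0]
ys = [2.0, 4.0, 6.0, 8.0]   # data generated with slope 2
w = fit_slope(xs, ys)        # converges near 2.0
```

The branches listed above (transfer learning, reinforcement learning, object detection, speech recognition) all elaborate this same loop: a parameterized model, an error signal, and an update rule.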

Artificial Intelligence

In the context of robotics, the field of autonomous task planning comprises methods from classical
artificial intelligence such as tree-search algorithms and symbolic task planning [81] that are used to
autonomously plan a sequence of actions in order to achieve a desired goal. Most of these methods
originate from works that are unrelated to robotics yet very much applicable. An overview of
heuristic planners can be found in [82] and in [83] an incremental version of the well-known A*
algorithm (which is used in various contexts to search for optimal solutions, e.g. path planning, action
planning or scheduling) is developed. Furthermore, [84] and [85] provide a compact summary of
many elements in task planning.
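
For readers unfamiliar with A*, a minimal grid-world version (a toy stand-in for the task planners discussed above, not the incremental variant of [83]) looks like this:

```python
import heapq

# Minimal A* on a 4-connected grid with the Manhattan-distance
# heuristic; an illustrative toy, not a symbolic task planner.

def astar(start, goal, blocked, width, height):
    def h(p):  # admissible heuristic: Manhattan distance to goal
        return abs(p[0] - goal[0]) + abs(p[1] - goal[1])

    frontier = [(h(start), 0, start, [start])]  # (f, cost, node, path)
    seen = set()
    while frontier:
        _, cost, node, path = heapq.heappop(frontier)
        if node == goal:
            return path
        if node in seen:
            continue
        seen.add(node)
        x, y = node
        for nx, ny in ((x + 1, y), (x - 1, y), (x, y + 1), (x, y - 1)):
            nxt = (nx, ny)
            if 0 <= nx < width and 0 <= ny < height and nxt not in blocked:
                heapq.heappush(frontier,
                               (cost + 1 + h(nxt), cost + 1, nxt,
                                path + [nxt]))
    return None  # no route exists

# usage sketch: find a route around a blocked column in a 3x3 grid
path = astar((0, 0), (2, 2), {(1, 0), (1, 1)}, 3, 3)
```

In symbolic task planning the same search runs over abstract actions and world states instead of grid cells, with a domain-specific heuristic in place of the Manhattan distance.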

Generally, autonomous robots depend on a knowledge base in order to reliably perform their
assigned tasks. Examples for stored knowledge are taxonomies of skills [86-89], i.e. the capabilities of
the robot, 3D-maps of the surroundings [90] or general information about the robot [91].
Furthermore, they have to be able to reason about current events and new information [92, 93] in
order to adapt to new situations.

Section B

Materials and Methods


Systematic review of the literature

Materials and Methods: A systematic review of the literature was performed in one database
(PubMed) to identify literature on ML, AI and robotic applications in dentistry. The following search
terms were used: “dent* AND robot* OR artificial intelligence OR machine learning”. All research items
published in the five years up to 01.01.2020 were included. Furthermore, a hand search in the
reference lists of the identified literature and a second electronic search in three databases
(Google Scholar, PubMed, IEEE Xplore) were performed. Data extraction was performed from full-text analysis. Extracted
data included the area of application within dentistry (Table 1 and 2) and for robot technology the
estimated readiness level according to the information provided in the original research papers (Figure
2). The presented technologies were assigned to a technological readiness level according to the
information given in the publications. Systems were assigned to level 1 when a basic system
principle was observed and reported. Level 2 is achieved when a full concept for a system has been
formulated. Levels 3 and 4 include in-vitro validation, whereas levels 5 and 6 are field validations.
Technological readiness level 7 is achieved when a prototype is demonstrated in the operating
environment. Level 8 is reserved for qualified complete systems and level 9 is reached when
a system has been proven in end-use operations. Furthermore, the application areas were put together
in order to point out the advances within the different fields.
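
For tabulating the reviewed systems, the level criteria paraphrased above can be encoded as a simple lookup (our own convenience helper; the wording follows this section's summary, not Mankins' original definitions):

```python
# TRL criteria as paraphrased in the text above, encoded as a
# lookup table for tabulating reviewed systems. Wording follows
# this section's summary, not Mankins' original 1995 definitions.

TRL = {
    1: "basic system principle observed and reported",
    2: "full system concept formulated",
    3: "in-vitro validation (early)",
    4: "in-vitro validation (extended)",
    5: "field validation (early)",
    6: "field validation (extended)",
    7: "prototype demonstrated in operating environment",
    8: "complete and qualified system",
    9: "system proven in end-use operations",
}

def describe_trl(level):
    """Return the criterion text for a TRL, or a fallback label."""
    return TRL.get(level, "unknown level")
```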

Full-text analysis of the identified original research and review papers was performed to extract and
present relevant state-of-the-art knowledge on present activities of the dental community in the field
of Dentronics.

Figure 2 The technological advance of a system may be described using the technological readiness levels (TRL 1-9)
introduced by Mankins in 1995 [94].

Results

Robotics in dentistry

In the following section we provide a technology review of state-of-the-art robot applications in
dentistry published so far. The literature search in PubMed identified a total of 137
research items on robots in dentistry. Of these, 14 were included for data extraction; another 35
original research papers were identified through hand search and in other electronic databases. In
total, 49 studies on robot technology in dentistry were included for data extraction. Table 1
summarizes all studies and gives an overview of the actual technology readiness level of the systems
presented in the reviewed literature and sums up the studies in the same field. Review of the current
literature identified 10 application domains of robots in dentistry.
Table 1 sums up the current literature on robot applications in dentistry and informs about the estimated technology
readiness level of the systems.

Applications involving robot technology were published in the following 10 application areas within the
scope of Dentronics:
1. Maxillofacial surgery
A handful of reviews describe possible scenarios and applications of digitalization in maxillofacial
surgery with a special emphasis on the combination of implant dentistry and prosthodontics [95-99]
or on the positive effects of robot-assisted surgery in head and neck cancer [100]. Several authors
made advances towards fully automated implantology [98, 101-107]; however, accuracy and
reliability generally do not yet seem adequate. There are a few exceptions to this,
but those systems are not widely available or accessible, i.e. intuitively usable by dentists
[99]. The development of computer-assisted implant surgery based on the concept of prosthetic-
driven implantology and CT-scan analysis has been reviewed. Although advances in technological
readiness have been made, issues such as high costs and the inherent complexity of the techniques and
hardware utilized are still to be overcome. For example, a six-axis robotic arm was proposed to assist
a surgeon in orthognathic surgery; based on 3D data from a CT scan, positioning is planned by the
surgeon preoperatively. During the operation the robot is proposed to register patient movements
by real-time tracking. Orthognathic surgery was performed in a preliminary experiment with a
jawbone skull phantom [108]. However, Woo et al. found that before automated orthognathic
surgery can be tested in human studies the available software needs to be optimized and safety of
the hardware augmented [108].

A pilot phantom study concluded that implant placement with a six-axis robotic arm can improve
accuracy of the operation in zygomatic implant placement. The authors recommended implementing
force feedback by adding a haptic device to the presented system in future research [107].

A surgical robotic application which has made it to reality is an invasive robotic assistant for dental
implantology. It was approved for operative use by the US Food and Drug Administration (FDA) in
March 2017. The product is called Yomi and is produced by Neocis (Neocis Inc., Miami, USA). Based
on 3D-data from a CT the dentist plans the implant position. During surgery the robotic arm drills the
hole in the jawbone and places the implant according to the planning while the dentist can follow the
position of the burr in real-time, owing to the software, which allows the dentist to adjust placement
position of the implant intraoperatively. Also in 2017, in China, a robot fully autonomously placed
two implants in a human patient, observed by a dentist who did not intervene in the process. This case
was reported in the China Morning Post; the robot was developed by the Fourth Military Medical
University's Stomatological Hospital and Beihang University. A study on a robotic system for guided
implant placement initially tested the robustness, usability and safety of a new method which has
been integrated into a navigation system, called RoboDent, for dental implant surgery with 45
patient datasets [109].

Studies found real-time tracking to be one of the key elements to be addressed in order to create a
system capable of performing implant surgery in patients [110]. Robotic implantology was also
proposed to enable drilling of complex forms of dental implants, thereby enhancing flexibility in
prosthetic rehabilitation of patients with reduced bone supply [111]. Other studies describe robot-
assisted implant placement [112] or phantom experiments for dental implant placement with the
help of industrial robot systems (MELFA RV-3S, Mitsubishi Electric Corporation, Tokyo, Japan). Those
industrial robots have six degrees of freedom (DOF) and a position repeatability of ± 0.02 mm, and
showed an error of 1.42 ± 0.70 mm [104].

Human-related factors (such as reduced concentration, trembling, distraction or reduced vision)
affect the accuracy and safety of maxillofacial surgery. A study proposed an autonomous surgical
system aiming to conduct maxillofacial surgeries under the assistance and surveillance of the
surgeon. A navigation module and a robot were seamlessly integrated into this system and a drilling
experiment was conducted on five 3D printed mandible models to test the pose detecting capability
and evaluate the operational performance. The experiment showed that this system was able to
successfully guide the robot finishing the operation regardless of the mandible position [113].

2. Robotic education
The idea of a dental training robot was first described in 1969 [114]. The application of a humanoid in
dental education was tested in 2017. A humanoid, a full-body patient simulation system (SIMROID),
was tested in a study among dental students to find out whether a robotic patient helps students
familiarize themselves with real patients more realistically than the commonly used dummies [16].
“Hanako”, the SIMROID, stands 165 cm tall. It comes with a metal skeleton and a vinyl chloride-based
gum skin. “Hanako” is an interesting contribution to education in dentistry, as the SIMROID imitates a
human in its actions and expressions. It can verbally express pain, roll its eyes, blink, shake its head in
pain, and perform movements of the jaw, tongue, elbow and wrist. Furthermore, it can even simulate a
gag reflex with a uvula sensor, and also induce bleeding and saliva flow
[115]. Tanzawa et al. introduced a medical emergency robot with the aim to help dental students to
get familiar with emergency situations [116]. Another piece of robotic educational equipment described
in the literature is the ROBOTUTOR. This tool was developed as an alternative to a clinician to
demonstrate tooth-cleaning techniques to patients. It is a robotic device to train and show brushing
techniques. A study among patients showed that the ROBOTUTOR was the most attractive method
(according to patient evaluation) for dental health care education compared to other methods
(clinician or video audio tutorial). However, it was less effective than the clinician [117]. A haptic-
based tooth drilling simulator was introduced for dental education with an implemented collision
detection system to give force sensation to the user and make the virtual reality (VR) experience
more realistic [15]. A study found the best learning of basic dental motor skills in trainees receiving a
combination of VR training with haptic feedback and verbal feedback from a human instructor [118]. Other
studies investigated the use of VR and haptic devices for training of dental implant placement or oral
anesthesia [119, 120]. VR laboratories with haptic devices are increasingly becoming part of the regular
curricula in dental education and have been found to improve students' learning efficiency and outcomes
[118, 121, 122]. A review of different VR systems in dental education can be found in [123].

Additionally, a study investigated the learning experience of pre-clinical dental students using 3D
printed teeth designed with realistic pulp cavities and simulated carious lesions [124].

3. Tooth preparation
Preparation of a tooth for crowns and bridges is a routine task for the dentist, although even after
years of practical experience it is still challenging. The challenge is to reduce the tooth sufficiently to
create space for the prosthetic rehabilitation with a minimum of damage to sound tooth structure.
The idea of a robotic arm used for tooth preparation or preparation support for the dentist seems
tempting and sensible. A mechatronic system to support the dentist in drilling has been tested in
vitro and showed good results; however, it has not yet been validated in a clinical setting. The
dentist's positioning accuracy was 53 % better with the mechatronic system [125] than without it. Yuan
et al. described a robotic tooth preparation system [126] with the following hardware components:
1) an intraoral 3D scanner (TRIOS, 3Shape A/S, Copenhagen, Denmark) to obtain 3D data of the patient's target tooth, adjacent teeth, opposing teeth and the tooth fixture; 2) computer-aided design (CAD)/computer-aided manufacturing (CAM) software for designing the target preparation shape and generating a 3D motion path for the laser; 3) an effective low-heat laser suitable for hard-tissue preparation; 4) a 6-DoF robot arm; and 5) a tooth fixture connecting the robotic device with the target tooth and protecting the adjacent teeth from laser cutting, designed using SOLIDWORKS software (Dassault Systèmes SOLIDWORKS Corporation, Waltham, MA, USA). Other tooth preparation devices have also been tested for their accuracy. A system with micro-robots controlling a picosecond laser showed a preparation accuracy that met clinical needs, with an error of about 0.089 ± 0.026 mm [127, 128]. Another tooth preparation system for veneers, with a rotating diamond instrument mounted on a robotic arm, was compared to manual crown preparation by a dentist [129] and produced better results. The average repeatability of the system was about 40 µm [126, 130].
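Figures such as the 40 µm above are typically obtained by commanding the same pose many times and measuring the spread of the reached positions around their centroid, in the spirit of the ISO 9283 pose-repeatability criterion. The sketch below illustrates this calculation; the coordinate data are invented for the example and not taken from the cited studies.

```python
# Hedged sketch of a positional-repeatability measurement: repeat the same
# commanded pose, record the reached positions, report the mean deviation
# from the centroid. The run data below are made up for illustration.

import math

def repeatability(points):
    """Mean Euclidean distance of repeated positions from their centroid
    (same unit as the input coordinates)."""
    n = len(points)
    cx = sum(p[0] for p in points) / n
    cy = sum(p[1] for p in points) / n
    cz = sum(p[2] for p in points) / n
    return sum(math.dist(p, (cx, cy, cz)) for p in points) / n

# Five repeated approaches to the same target, coordinates in millimetres
runs = [(10.000, 5.000, 2.000),
        (10.004, 5.001, 1.998),
        (9.998, 4.997, 2.003),
        (10.001, 5.002, 2.001),
        (9.997, 5.000, 1.998)]
print(f"{repeatability(runs) * 1000:.1f} µm")  # prints "3.1 µm"
```

ISO 9283 adds a multiple of the standard deviation to this mean; the centroid-spread term above is the essential ingredient either way.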

4. Testing of toothbrushes
The efficiency of toothbrushes and their abrasiveness towards enamel may be tested with high repeatability and comparability using robotic systems. For example, an in vitro study [131] with a six-axis robot compared the efficiency of two different toothbrushes under clinical hand brushing and in vitro robotic brushing. The results showed that robotic brushing of teeth is an alternative for plaque removal studies and may even replace clinical studies [131].
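The repeatability advantage comes from the robot replaying an identical stroke trajectory for every brush under test. A minimal sketch of generating such a trajectory is shown below; the stroke length, cycle count and contact depth are invented for the example, not parameters of the cited study.

```python
# Illustrative sketch: a reproducible back-and-forth brushing stroke as a
# list of Cartesian waypoints (x, y, z) in millimetres. Replaying the same
# waypoint list for each brush removes operator variability.

import math

def brushing_waypoints(stroke_len_mm=20.0, cycles=3, points_per_cycle=8):
    """Oscillating stroke along x at a constant contact depth z = -1 mm."""
    waypoints = []
    n = cycles * points_per_cycle
    for i in range(n + 1):
        phase = 2 * math.pi * i / points_per_cycle
        x = 0.5 * stroke_len_mm * math.sin(phase)   # sinusoidal sweep
        waypoints.append((round(x, 3), 0.0, -1.0))  # fixed contact depth
    return waypoints

traj = brushing_waypoints()
print(len(traj), traj[0], traj[2])
```

In a real test rig the z-coordinate would be replaced by force control so that contact pressure, not depth, is held constant across brushes.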

5. Root canal treatment and plaque removal

Root canal treatment is a procedure that demands high accuracy. Usually, a dentist specialized in endodontics works under magnification to ensure an adequate view of the root canal. Nelson et al. published the idea of a robotic system for assistance during root canal treatment: the so-called "vending machine" was intended to supply the dentist with the necessary root canal instruments during treatment [132] in order to reduce distraction from the operating site. A recent study proposed the application of catalytic micro-robots to destroy biofilms within the root canal and tested the system in vitro. Furthermore, the authors discussed the use of these systems for other applications such as prevention of tooth decay or peri-implant infection [132, 133].

6. Orthodontics and jaw movement

A novel system that generates the dental arch form has been developed; it can be used to bend orthodontic wires [134]. Edinger described a robot for the dental office for the first time in 1991; later, he described a robotic system to reproduce condylar movements [135, 136]. Virtual articulators are one of the technological bases necessary to fully rethink and digitalize dental workflows. They enable the simulation of occlusal changes in the digital world and may be strongly empowered by AI in the future, e.g. to simulate the use of dental materials on a patient-individual basis or to simulate treatment outcomes of implant placement or maxillofacial surgeries [137].
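At its core, a virtual articulator applies rigid-body transforms to the digitised mandible, e.g. approximating opening as a rotation about the transverse condylar hinge axis. The sketch below shows that single operation; the axis location and opening angle are invented for the example, whereas real systems derive them from jaw-motion tracking.

```python
# Illustrative sketch of the basic virtual-articulator operation: rotate a
# digitised mandibular point about a condylar hinge axis parallel to the
# x-axis. Axis position and angle here are invented for the example.

import math

def rotate_about_hinge(point, hinge_y, hinge_z, angle_rad):
    """Rotate a 3D point (x, y, z) about an x-parallel hinge axis passing
    through (y=hinge_y, z=hinge_z)."""
    x, y, z = point
    dy, dz = y - hinge_y, z - hinge_z
    c, s = math.cos(angle_rad), math.sin(angle_rad)
    return (x,
            hinge_y + c * dy - s * dz,
            hinge_z + s * dy + c * dz)

# Incisal point 100 mm anterior to the hinge axis, 10 degrees of opening
incisal = (0.0, 100.0, 0.0)
opened = rotate_about_hinge(incisal, 0.0, 0.0, math.radians(10))
print(opened)  # the incisal point moves about 17 mm along z
```

Chaining such transforms over time, with axis parameters fitted per patient, is what lets the articulator replay individual condylar movements digitally.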

7. Material testing
Robotic dental wear and mastication simulators have been proposed to test tooth filling materials [138] or dental implant materials [139, 140]. One of these systems was driven by a robot with 6 DoF [141, 142]. In another study, dental impression materials were tested with the help of a robotic arm [143].

8. Tooth arrangement for full dentures

Robotic assistance may also be helpful in supporting the dental technician [144]. A novel system that generates the dental arch has been developed and can be used to fabricate full dentures; traditional methods of manually determining the dental arch may soon be replaced by robotic assistance generating a more individualized arch. A study on this tooth-arrangement robot showed that it was very accurate [134]: tooth arrangement for full dentures showed an error of 1.64 mm at an arch width (x-axis) of 37.25 mm [145].

9. X-ray imaging

Positioning of the film/sensor and the X-ray source by a 6-DoF robotic arm has been proposed and was found to have no adverse effects. Results showed that the robotic system was superior to the mechanical alignment approach owing to its excellent accuracy and repeatability [146, 147]. Another application presented in the literature is a robot equipped with a skull to investigate the influence of head movement on the accuracy of 3D imaging [148].

10. Robot assistant

A prototypical robot assistant was proposed in [6]. The authors investigated the possibility of active robotic support during treatments by handling instruments via a multimodal communication framework aimed at dentists as users. It comprises bilateral physical human-robot interaction, touch display input, speech input and visual gestures. In their approach, they used a state-of-the-art safe, collaborative and sensitive 7-DoF robot and conducted a user study to explore the feasibility of different human-robot interaction modalities in dentistry, as shown in [6, 22].
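A recurring design question in such multimodal frameworks is how to arbitrate when several channels produce commands. One simple strategy is sketched below: keep only commands above a confidence threshold and let the most recent one win. The channel names, commands and threshold are illustrative and not taken from the cited framework, which may arbitrate differently.

```python
# Hedged sketch of multimodal command arbitration for a robot assistant:
# events from speech, touch display and gesture channels are filtered by
# confidence, and the newest confident command is executed. All names and
# thresholds here are invented for illustration.

def arbitrate(events, min_confidence=0.7):
    """events: list of (timestamp, channel, command, confidence).
    Return the command of the newest sufficiently confident event,
    or None if no channel produced a confident command."""
    confident = [e for e in events if e[3] >= min_confidence]
    if not confident:
        return None
    return max(confident, key=lambda e: e[0])[2]

events = [
    (0.1, "gesture", "hand_over_mirror", 0.55),  # too uncertain: ignored
    (0.4, "speech", "hand_over_probe", 0.92),
    (0.6, "touch", "hand_over_probe", 1.00),     # newest confident command
]
print(arbitrate(events))  # -> hand_over_probe
```

Recency-wins is only one policy; safety-critical channels (e.g. a physical stop gesture) would typically override it regardless of timestamps.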

Machine learning and artificial intelligence in Dentistry

The systematic literature search in PubMed revealed 142 studies published within the last five years. 101 studies were excluded as they did not report on machine learning in dentistry. 41 studies were included, and full-text analysis was performed to summarize state-of-the-art knowledge:

In medical applications, ML may enable computers, for example, to perform clinical diagnoses and suggest treatments [149]. ML has the potential to detect relationships and patterns in big data [150] and has been introduced to diagnose and predict diseases or to propose treatments. However, ML methods must not be mistaken for being generally capable of human-level performance. There has been some work on transfer learning, i.e. generalizing acquired knowledge to different domains; yet, when attempting to generalize knowledge about already learned problems, the achieved performance is far from what humans are capable of [151-154]. Furthermore, any algorithm can only be as good as the data provided. All ML applications identified through this review are limited to a specific research question and cannot be applied directly to new datasets or new research questions. The necessary understanding of underlying processes and structures, and the ability to transfer this understanding to novel problems, still seems to be beyond the capabilities of existing ML technology. ML in dentistry is generally not intended to replace the dentist, but rather to provide a second, informed opinion based on mathematical decision making and prediction. Clearly, although existing methods can be very powerful in the specific domain they have been designed for, they are still far from being general-purpose tools. The authors of a short communication published in
Quintessence International describe possible application scenarios of ML in dentistry and advocate
results produced by these methods as a second opinion in disease diagnosis and treatment planning
[155].

The specific methods of artificial neural networks (ANNs) have been introduced in the dental field for the diagnosis of visually confirmed dental caries and impacted teeth, and for diagnosis in dental radiography; they were reviewed in a publication in 2018 [156].

A recent review on deep learning in dental imaging found that the number of publications in ML-dentistry increased from 2 in 2016 to 14 in 2018. Today, the field covers numerous application scenarios, such as neural networks for reading dental radiographs [157] that outperform dentists in the sensitivity of identifying caries [158], prediction of oral treatment need in children [159], classification of dental plaque, and treatment planning for orthognathic surgery [160] or orthodontic applications [161]. Studies elucidated the use of convolutional neural network (CNN) algorithms to detect dental caries and concluded that such ML techniques are efficient and effective for diagnosing dental caries [162]. Recently, studies on the implementation of ML for oral cancer prediction [163] and prognosis using a deep-learning-based CNN algorithm were published [164]. Furthermore, machine learning has been advocated as a tool to increase understanding of pathogenesis, to support diagnosis, to develop new risk-assessment strategies, and to predict periodontal disease or bisphosphonate-related osteonecrosis of the jaw [150, 165, 166]. Interestingly, ML has also been discussed for the automation of treatment planning in aesthetic and cosmetic dentistry [167].
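The CNNs used in these radiograph studies are built from one repeated operation: a small convolution filter slid over the image, followed by a non-linearity. The sketch below shows that single building block in plain Python; the 3x3 edge filter and the toy intensity patch are invented for illustration, whereas trained caries detectors learn their filters from labelled images.

```python
# One CNN building block -- valid 2D cross-correlation followed by ReLU --
# written out in plain Python. The kernel and "radiograph" patch below are
# illustrative only; real detectors learn many such filters from data.

def conv2d_relu(image, kernel):
    """Valid 2D cross-correlation followed by ReLU, as in one CNN layer."""
    kh, kw = len(kernel), len(kernel[0])
    out = []
    for i in range(len(image) - kh + 1):
        row = []
        for j in range(len(image[0]) - kw + 1):
            s = sum(image[i + u][j + v] * kernel[u][v]
                    for u in range(kh) for v in range(kw))
            row.append(max(s, 0.0))  # ReLU non-linearity
        out.append(row)
    return out

# Toy 4x4 intensity patch with a vertical bright/dark boundary, probed with
# a vertical edge-detector kernel: the response is uniform and positive.
patch = [[1.0, 1.0, 0.0, 0.0]] * 4
edge_kernel = [[1, 0, -1],
               [1, 0, -1],
               [1, 0, -1]]
print(conv2d_relu(patch, edge_kernel))  # -> [[3.0, 3.0], [3.0, 3.0]]
```

A full caries classifier stacks many such layers with pooling and a final decision layer, and optimizes all filter weights against labelled radiographs.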

The PubMed search with the search terms dent* AND artificial intelligence yielded 279 studies. 53 were found to be relevant for AI in dentistry; however, after removal of duplicates from the ML section and of review papers, 17 original research papers were included.

Studies investigated the use of AI (Table 2) in digital dental radiography analysis for caries diagnostics [162, 166, 168-170], radiographic landmarks [171], sinusitis [172], root fracture [173, 174] or differential jaw pain diagnosis [175]. Furthermore, genome-wide expression profiles have been investigated in periodontitis [176]. Reviews discussed the possible impact of AI on esthetic dentistry [167], orthodontics [177], oral surgery [178] or future concepts of oral healthcare in general [155, 156, 179].

Table 2 summarizes the current literature on artificial intelligence and machine learning in dentistry.

Other applications were reported for the prediction of negative treatment outcomes in prosthodontics, such as debonding [180], and for the assessment of attractiveness or orthodontic treatment need [146-148, 181-184]. Interestingly, the use of AI has also been investigated for age estimation in forensic science [185-188].

Discussion

Robotics is an emerging field in dentistry; however, research in this area is still sparse. Only a few studies have proposed robotics technology for dental applications, and only a minority of the published studies reported prototype systems in the operational area (TRL 7). Throughout the literature, authors claim that robotic systems enhance reliability, reproducibility and accuracy in their test applications. Despite this, the amount of research done in this field has remained limited owing to the lack of available and accessible systems in recent years. However, as shown in Sec. B, such systems are becoming reality, and this obstacle may be overcome in the next years.

Another reason why robotics is still a field of low interest in dentistry may be that non-experts lack the knowledge needed to program and control such systems. Consequently, research in this domain still relies on efficient collaboration between engineers and dentists. This may change in the near future as the robotics community researches novel programming paradigms and interaction methodologies to make communication between robots and humans as intuitive as possible (see Sec. A). Furthermore, the use of AI methods to autonomously plan tasks and reason about the environment may further reduce the effort on the user's side when operating a robot.

Studies on ML in dentistry date back to 2016 and have become increasingly popular since then. However, only a few studies on ML have been published in dental journals so far. Still, the potential of this particular research field is undeniable, and future research may lead to new opportunities such as openly available data and easy-to-use ML tools. These may be used to enhance and personalize diagnosis, prognosis, decision making and treatment planning in dentistry in the future [8].

The studies reviewed in this paper show in how many ways Dentronics can help both clinicians and scientists to improve work chains, e.g. in forensic science: the use of Dentronics in this field has been reviewed in [189], and the authors came to the conclusion that digitalization, including robot technology, is a helpful tool for human identification. For clinical scientists, the idea of having a swarm of micro-robots to destroy biofilms is tempting, considering the impact of biofilms in oral diseases such as caries, periodontitis, mucositis or peri-implantitis. Peri-implantitis in particular is a growing threat owing to the rising number of implants placed every year [190, 191].

Unfortunately, most interdisciplinary research combining engineering and dentistry in the field of Dentronics focuses on implantology, although the invasive character of this application may impair acceptance of the technology among patients and dentists. Hence, these highly invasive applications are poorly suited as forerunners, and research in the field of assistive robotics seems more promising for facilitating the introduction of this new robot-enabled era. Research on educational robotics in university environments also seems to be a promising initiator to introduce Dentronics and to take the first hurdle towards acceptance of robots among future dentists [192].

Another important point is the effort required of dentists and dental assistants to learn to work with these new technologies. Older generations may be more accustomed to familiar tools and rather skeptical about adapting. New generations of dentists, however, can be considered digital natives, and their experience might lead them to use digital tools more naturally. Moreover, in light of the expected developments in robotics, AI and ML, future generations may even be considered "robonatives" as defined in [193].

All robot applications presented in this review, from robot assistants to tooth preparation tools, material testing and the arrangement of teeth in dentures, have an inherent potential to advance dentistry far beyond digitalization, into a new world in which digitalization reaches out to manipulate the physical world. However, the overall technological readiness is still low, and more effort and research are needed to create value in the field of Dentronics. On the other hand, there are numerous approaches in the research community to explore the potentials and challenges of integrating robotics, AI and ML into dentistry, so the speed of innovation in this novel field should increase in the next years.

Conclusion

Dentistry is moving towards a new era of data-driven and robot-assisted medicine. However, the latest step changes in modern robot technology, ML and AI have not yet been fully introduced into dental research, nor have they reached the technological readiness and cost-efficiency required to enter the dental market. Educational systems have already made it to reality. Robot dental assistants and other applications, e.g. oral surgery, tooth arrangement, orthodontics or material testing, are promising, though the most critical challenges for Dentronics, besides the high costs and difficult operability of the systems, are the still rather basic sensory and manipulation capabilities of robotic systems and their lack of learning abilities. Moreover, increased intuitiveness of the systems, combined with broad educational efforts and the introduction of affordable systems, are key challenges that need to be overcome to truly introduce Dentronics. ML will push forward diagnostic measures, ease treatment planning, reduce treatment errors and ultimately increase the effectiveness of the overall health system. Today, however, the use of ML is still restricted to pilot use cases and narrowly defined research questions. More flexible systems with broader application areas are needed to reach valuable, human-level performance. Future dentists need to be familiarized with Dentronics, including digital and real-world human-robot interaction skills.

Acknowledgement

Funding: No funding apart from the authors' institutions was available for this study.

Conflict of interest disclosure

S. Haddadin has a potential conflict of interest as shareholder of Franka Emika GmbH.

References

[1] Rus D. The Robots Are Coming. https://www.foreignaffairs.com/articles/2015-06-16/robots-are-coming. 08.10.2019
[2] Haddadin S, Suppa M, Fuchs S, Bodenmüller T, Albu-Schäffer A, Hirzinger G. Konzepte für den
Roboterassistenten der Zukunft - Towards the Robotic Co-Worker. at - Automatisierungstechnik.
2010;58:695-708.
[3] Chandrasekaran B, Conrad JM. Human-robot collaboration: A survey. SoutheastCon 2015. 2015. p. 1-8.
[4] Forlizzi J, DiSalvo C. Service robots in the domestic environment: a study of the roomba vacuum in
the home. Proceedings of the 1st ACM SIGCHI/SIGART conference on Human-robot interaction. Salt
Lake City, Utah, USA: ACM; 2006. p. 258-65.
[5] Ivanov S, Webster C, Berezina K. Adoption of robots and service automation by tourism and
hospitality companies. 2017.
[6] Grischke J, Johannsmeier L, Eich L, Haddadin S. Dentronics: Review, First Concepts and Pilot Study
of a new Application Domain for collaborative Robots in Dental assistance. 2019 International
Conference on Robotics and Automation (ICRA): IEEE; 2019.
[7] Frey CB, Osborne MA. The future of employment: How susceptible are jobs to computerisation?
Technological Forecasting and Social Change. 2017;114:254-80.
[8] Jenkins P. Buy a robot secretary? Dent Manage. 1967;7:72 passim.
[9] Cernea S, Raz I. Insulin Therapy: Future Perspectives. Am J Ther. 2019.
[10] Jedamzik S. [Digital health and nursing : The future is now]. Unfallchirurg. 2019;122:670-5.
[11] Brokel JM, Harrison MI. Redesigning care processes using an electronic health record: a system's
experience. Jt Comm J Qual Patient Saf. 2009;35:82-92.
[12] Miller RJ. Navigated surgery in oral implantology: a case study. Int J Med Robot. 2007;3:229-34.
[13] Guo Y, Guo C. Maxillary-fronto-temporal approach for removal of recurrent malignant infratemporal fossa tumors: Anatomical and clinical study. J Craniomaxillofac Surg. 2014;42:206-12.
[14] Bell RB, Markiewicz MR. Computer-assisted planning, stereolithographic modeling, and
intraoperative navigation for complex orbital reconstruction: a descriptive study in a preliminary
cohort. J Oral Maxillofac Surg. 2009;67:2559-70.
[15] Razavi M, Talebi HA, Zareinejad M, Dehghan MR. A GPU-implemented physics-based haptic
simulator of tooth drilling. Int J Med Robot. 2015;11:476-85.
[16] Abe S, Noguchi N, Matsuka Y, Shinohara C, Kimura T, Oka K, et al. Educational effects using a
robot patient simulation system for development of clinical attitude. Eur J Dent Educ. 2018;22:e327-
e36.
[17] Murbay S, Neelakantan P, Chang JWW, Yeung S. "Evaluation of the introduction of a dental
virtual simulator on the performance of undergraduate dental students in the pre-clinical operative
dentistry course". Eur J Dent Educ. 2019.
[18] Mirghani I, Mushtaq F, Allsop MJ, Al-Saud LM, Tickhill N, Potter C, et al. Capturing differences in
dental training using a virtual reality simulator. Eur J Dent Educ. 2018;22:67-71.
[19] de Boer IR, Lagerweij MD, Wesselink PR, Vervoorn JM. The Effect of Variations in Force Feedback
in a Virtual Reality Environment on the Performance and Satisfaction of Dental Students. Simul
Healthc. 2019;14:169-74.
[20] Jeelani S, Dany A, Anand B, Vandana S, Maheswaran T, Rajkumar E. Robotics and medicine: A
scientific rainbow in hospital. J Pharm Bioallied Sci. 2015;7:S381-3.
[21] Milner MN, Anania EC, Candelaria-Oquendo K, Rice S, Winter SR, Ragbir NK. Patient Perceptions
of New Robotic Technologies in Clinical Restorative Dentistry. J Med Syst. 2019;44:33.
[22] Grischke J, Johannsmeier L, Eich L, Haddadin S. Dentronics: Review, First Concepts and Pilot
Study of a New Application Domain for Collaborative Robots in Dental Assistance.
https://www.youtube.com/watch?v=M8s9bS8qkRE&t=9s. 28.01.2020
[23] Hirzinger G, Sporer N, Albu-Schaffer A, Hahnle M, Krenn R, Pascucci A, et al. DLR's torque-
controlled light weight robot III-are we reaching the technological limits now? Proceedings 2002 IEEE
International Conference on Robotics and Automation (Cat No02CH37292)2002. p. 1710-6 vol.2.
[24] Albu-Schäffer A, Haddadin S, Ott C, Stemmer A, Wimböck T, Hirzinger G. The DLR Lightweight
Robot – Design and Control Concepts for Robots in Human Environments. INDUSTRIAL ROBOT-AN
INTERNATIONAL JOURNAL. 2007;34:376-85.
[25] Bischoff R, Kurth J, Schreiber G, Koeppe R, Albu-Schaeffer A, Beyer A, et al. The KUKA-DLR
Lightweight Robot arm - a new reference platform for robotics research and manufacturing. ISR
2010 (41st International Symposium on Robotics) and ROBOTIK 2010 (6th German Conference on
Robotics)2010. p. 1-8.
[26] Albu-Schäffer A, Eiberger O, Fuchs M, Grebenstein M, Haddadin S, Ott C, et al. Anthropomorphic
Soft Robotics – From Torque Control to Variable Intrinsic Compliance. In: Pradalier C, Siegwart R,
Hirzinger G, editors. 14 th International Symposium of Robotic Research. Lucerne, Switzerland:
Springer Berlin Heidelberg; 2009. p. 185-207.
[27] Hagn U, Konietschke R, Tobergte A, Nickl M, Jörg S, Kübler B, et al. DLR MiroSurge: a versatile
system for research in endoscopic telesurgery. Int J Comput Assist Radiol Surg. 2010;5:183-93.
[28] Hagn U, Nickl M, Jörg S, Passig G, Bahls T, Nothhelfer A, et al. The DLR MIRO: A versatile
lightweight robot for surgical applications. Industrial Robot: An International Journal. 2008;35:324-
36.
[29] Bodner J, Wykypiel H, Wetscher G, Schmid T. First experiences with the da Vinci operating robot
in thoracic surgery. Eur J Cardiothorac Surg. 2004;25:844-51.
[30] Burgner-Kahrs J, Rucker DC, Choset H. Continuum Robots for Medical Applications: A Survey.
IEEE Transactions on Robotics. 2015;31:1261-80.
[31] Beasley R. Medical Robots: Current Systems and Research Directions. Journal of Robotics.
2012;2012.
[32] Ackerman E. Sawyer: Rethink Robotics Unveils New Robot.
https://spectrum.ieee.org/automaton/robotics/industrial-robots/sawyer-rethink-robotics-new-
robot. 28.10.2019
[33] Fitzgerald C. Developing baxter. 2013 IEEE Conference on Technologies for Practical Robot
Applications (TePRA)2013. p. 1-6.
[34] Hogan N. Impedance Control: An Approach to Manipulation: Part II—Implementation. Journal of
Dynamic Systems, Measurement, and Control. 1985;107:8.
[35] Khatib O. A unified approach for motion and force control of robot manipulators: The
operational space formulation. IEEE Journal on Robotics and Automation. 1987;3:43-53.
[36] Albu-Schaffer A, Ott C, Frese U, Hirzinger G. Cartesian impedance control of redundant robots:
recent results with the DLR-light-weight-arms. 2003 IEEE International Conference on Robotics and
Automation (Cat No03CH37422)2003. p. 3704-9 vol.3.
[37] Ott C. Cartesian Impedance Control of Redundant and Flexible-Joint Robots2008.
[38] Schindlbeck C, Haddadin S. Unified passivity-based Cartesian force/impedance control for rigid
and flexible joint robots via task-energy tanks. 2015 IEEE International Conference on Robotics and
Automation (ICRA)2015. p. 440-7.
[39] Haddadin S, Albu-Schäffer A, Hirzinger G. Requirements for safe robots: measurements, analysis and new insights. The International Journal of Robotics Research. 2009;28:11-2.
[40] Haddadin S. Towards Safe Robots: Springer Berlin Heidelberg; 2014.
[41] Johannsmeier L, Haddadin S. A Hierarchical Human-Robot Interaction-Planning Framework for
Task Allocation in Collaborative Industrial Assembly Processes. IEEE Robotics and Automation Letters.
2017;2:41-8.
[42] Haddadin S, Haddadin S, Khoury A, Rokahr T, Parusel S, Burgkart R, et al. On making robots
understand safety: Embedding injury knowledge into control. Int J Rob Res. 2012;31:1578-602.
[43] Mansfeld N, Hamad M, Becker M, Marin AG, Haddadin S. Safety Map: A Unified Representation
for Biomechanics Impact Data and Robot Instantaneous Dynamic Properties. IEEE Robotics and
Automation Letters. 2018;3:1880-7.
[44] Michalos G, Makris S, Tsarouchi P, Guasch T, Kontovrakis D, Chryssolouris G. Design
Considerations for Safe Human-robot Collaborative Workplaces. Procedia CIRP. 2015;37:248-53.
[45] CE marking. http://ec.europa.eu/growth/single-market/ce-marking_en. 08.10.19
[46] Guide to application of the Machinery Directive 2006/42/EC. 2.1 ed: European Commission;
2017.
[47] Michael AG, Alan CS. Human-Robot Interaction: A Survey: Now Foundations and Trends; 2008.
[48] De Santis A, Siciliano B, De Luca A, Bicchi A. An atlas of physical human–robot interaction.
Mechanism and Machine Theory. 2008;43:253-70.
[49] Kapoor S, Arora P, Kapoor V, Jayachandran M, Tiwari M. Haptics - touchfeedback technology
widening the horizon of medicine. J Clin Diagn Res. 2014;8:294-9.
[50] Argall BD, Billard AG. A survey of Tactile Human-Robot Interactions. Robot Auton Syst.
2010;58:1159-76.
[51] Golz S, Osendorfer C, Haddadin S. Using tactile sensation for learning contact knowledge:
Discriminate collision from physical interaction. 2015 IEEE International Conference on Robotics and
Automation (ICRA)2015. p. 3788-94.
[52] Franka Emika. https://franka.de/. 28.10.2019
[53] Daniel B, Korondi P, Sziebig G, Thomessen T. Evaluation of Flexible Graphical User Interface for
Intuitive Human Robot Interactions. Acta Polytechnica Hungarica. 2014;11:135-51.
[54] Marion P, Fallon MF, Deits R, Valenzuela A, Pérez-D'Arpino C, Izatt G, et al. Director: A User
Interface Designed for Robot Operation with Shared Autonomy. J Field Robotics. 2017;34:262-80.
[55] Khan RZ, Ibraheem NA. Survey on Gesture Recognition for Hand Image Postures. Computer and
Information Science. 2012;5:110-21.
[56] Ende T, Haddadin S, Parusel S, Wüsthoff T, Hassenzahl M, Albu-Schäffer A. A human-centered
approach to robot gesture based communication within collaborative working processes. 2011
IEEE/RSJ International Conference on Intelligent Robots and Systems2011. p. 3367-74.
[57] Gleeson B, Maclean K, Haddadi A, Croft E, Alcazar J. Gestures for industry Intuitive human-robot
communication from human observation2013.
[58] Rautaray SS, Agrawal A. Vision based hand gesture recognition for human computer interaction:
a survey. Artif Intell Rev. 2015;43:1-54.
[59] Li SZ, Jain AK. Handbook of Face Recognition: Springer Publishing Company, Incorporated; 2011.
[60] Stiefelhagen R, Fugen C, Gieselmann R, Holzapfel H, Nickel K, Waibel A. Natural human-robot
interaction using speech, head pose and gestures. 2004 IEEE/RSJ International Conference on
Intelligent Robots and Systems (IROS) (IEEE Cat No04CH37566)2004. p. 2422-7 vol.3.
[61] Graves A, Mohamed A, Hinton G. Speech recognition with deep recurrent neural networks.
2013 IEEE International Conference on Acoustics, Speech and Signal Processing2013. p. 6645-9.
[62] Mavridis N. A review of verbal and non-verbal human-robot interactive communication. Robot
Auton Syst. 2015;63:22-35.
[63] Waibel A, Lee K-F. Readings in speech recognition: San Mateo, Calif. : Morgan Kaufmann
Publishers; 1990.
[64] Waibel A, Hanazawa T, Hinton G, Shikano K, Lang KJ. Phoneme recognition using time-delay
neural networks. IEEE Transactions on Acoustics, Speech, and Signal Processing. 1989;37:328-39.
[65] Müller M, Stüker S, Waibel A. Language Adaptive Multilingual CTC Speech Recognition. In:
Karpov A, Potapova R, Mporas I, editors. Speech and Computer. Cham: Springer International
Publishing; 2017. p. 473-82.
[66] Miao Y, Gowayyed M, Na X, Ko T, Metze F, Waibel A. An empirical exploration of CTC acoustic
models. 2016 IEEE International Conference on Acoustics, Speech and Signal Processing
(ICASSP)2016. p. 2623-7.
[67] Safeea M, Neto P. KUKA Sunrise Toolbox: Interfacing Collaborative Robots With MATLAB. IEEE
Robotics & Automation Magazine. 2019;26:91-6.
[68] Akan B. Human Robot Interaction Solutions for Intuitive Industrial Robot Programming 2012.
[69] Schapire RE. The Boosting Approach to Machine Learning: An Overview. In: Denison DD, Hansen
MH, Holmes CC, Mallick B, Yu B, editors. Nonlinear Estimation and Classification. New York, NY:
Springer New York; 2003. p. 149-71.
[70] Weiss K, Khoshgoftaar TM, Wang D. A survey of transfer learning. Journal of Big Data. 2016;3:9.
[71] Busoniu L, Babuska R, Schutter BD. A Comprehensive Survey of Multiagent Reinforcement
Learning. IEEE Transactions on Systems, Man, and Cybernetics, Part C (Applications and Reviews).
2008;38:156-72.
[72] Kober J, Bagnell JA, Peters J. Reinforcement learning in robotics: A survey. The International
Journal of Robotics Research. 2013;32:1238-74.
[73] Voulodimos A, Doulamis N, Doulamis A, Protopapadakis E. Deep learning for computer vision: A
brief review. Computational intelligence and neuroscience. 2018;2018.
[74] Guo Y, Liu Y, Oerlemans A, Lao S-Y, Wu S, Lew M. Deep learning for visual understanding: A
review. Neurocomputing. 2015;187.
[75] Padmanabhan J, Premkumar M. Machine Learning in Automatic Speech Recognition: A Survey.
IETE Technical Review. 2015;32:1-12.
[76] Nassif A, Shahin I, Attili I, Azzeh M, Shaalan K. Speech Recognition Using Deep Neural Networks:
A Systematic Review. IEEE Access. 2019;PP:1-.
[77] DeepL. https://www.deepl.com/translator. 28.01.2020
[78] Manogaran G, Vijayakumar V, Varatharajan R, Malarvizhi Kumar P, Sundarasekar R, Hsu C-H.
Machine Learning Based Big Data Processing Framework for Cancer Diagnosis Using Hidden Markov
Model and GM Clustering. Wireless Personal Communications. 2018;102:2099-116.
[79] Dario A, Sundaram A, Rishita A, Jingliang B, Eric B, Carl C, et al. Deep Speech 2 : End-to-End
Speech Recognition in English and Mandarin. PMLR; 2016. p. 173-82.
[80] Johannsmeier L, Gerchow M, Haddadin S. A Framework for Robot Manipulation: Skill Formalism,
Meta Learning and Adaptive Control. 2019 International Conference on Robotics and Automation
(ICRA): IEEE; 2019. p. 5844-50.
[81] Russell S, Norvig P. Artificial Intelligence: A Modern Approach, Global Edition: Pearson Education
Limited; 2016.
[82] Bonet B, Geffner H. Planning as heuristic search. Artificial Intelligence. 2001;129:5-33.
[83] Sven K, Maxim L, Yaxin L, David F. Incremental Heuristic Search in AI. AI Magazine. 2004;25.
[84] Barr A, Feigenbaum EA. The Handbook of Artificial Intelligence: Elsevier Science; 2014.
[85] Nilsson NJ. Principles of Artificial Intelligence: Elsevier Science; 2014.
[86] Huckaby JOD, Christensen HI. A Taxonomic Framework for Task Modeling and Knowledge
Transfer in Manufacturing Robotics. CogRob@AAAI2012.
[87] Björkelund A, Malec J, Nilsson K, Nugues P. Knowledge and Skill Representations for Robotized
Production. IFAC Proceedings Volumes. 2011;44:8999-9004.
[88] Erdem E, Aker E, Patoglu V. Answer set programming for collaborative housekeeping robotics:
representation, reasoning, and execution. Intell Serv Robot. 2012;5:275-91.
[89] Stenmark M, Malec J. Knowledge-based instruction of manipulation tasks for industrial robotics.
Robotics and Computer-Integrated Manufacturing. 2015;33:56-67.
[90] Kai MW, Armin H, Maren B, Cyrill S, Wolfram B. OctoMap: A probabilistic, flexible, and compact
3D map representation for robotic systems. In Proc of the ICRA 2010 workshop2010.
[91] Tenorth M, Beetz M. Representations for robot knowledge in the KnowRob framework. Artificial
Intelligence. 2017;247:151-69.
[92] Bessiere P, Laugier C, Siegwart R. Probabilistic Reasoning and Decision Making in Sensory-Motor
Systems2008.
[93] Hertzberg J, Chatila R. AI Reasoning Methods for Robotics. 2008. p. 207-23.
[94] Mankins J. Technology Readiness Level – A White Paper. 1995.
[95] Hassfeld S, Brief J, Raczkowsky J, Marmulla R, Mende U, Ziegler C. Computer-based approaches
for maxillofacial interventions. Minimally Invasive Therapy & Allied Technologies. 2003;12:25-35.
[96] Bhambhani R, Bhattacharyya J, Sen S. Digitization and Its Futuristic Approach in Prosthodontics.
The Journal of Indian Prosthodontic Society. 2013;13.
[97] Hamilton J. Robots, bionics, and bioengineered replacement parts in dentistry. J Calif Dent
Assoc. 2006;34:31-40.
[98] Gulati M, Anand V, Salaria S, Jain DN, Gupta S. Computerized implant-dentistry: Advances
toward automation. Journal of Indian Society of Periodontology. 2015;19:5-10.
[99] Azari A, Nikzad S. Computer-assisted implantology: historical background and potential
outcomes-a review. Int J Med Robot. 2008;4:95-104.
[100] Du YF, Chen N, Li DQ. [Application of robot-assisted surgery in the surgical treatment of head
and neck cancer]. Zhonghua Kou Qiang Yi Xue Za Zhi. 2019;54:58-61.
[101] Kasahara Y, Kawana H, Usuda S, Ohnishi K. Telerobotic-assisted bone-drilling system using
bilateral control with feed operation scaling and cutting force scaling. Int J Med Robot. 2012;8:221-9.
[102] Chiarelli T, Lamma E, Sansoni T. A fully 3D work context for oral implant planning and
simulation. International journal of computer assisted radiology and surgery. 2010;5:57-67.
[103] Fortin T, Guillaume C, Bianchi S, Buatois H, Coudert J-L. Precision of transfer of preoperative
planning for oral implants based on Cone-Beam-CT scan images through a robotic drilling machine -
an in vitro study. Clinical oral implants research. 2003;13:651-6.
[104] Sun X, McKenzie FD, Bawab S, Li J, Yoon Y, Huang JK. Automated dental implantation using
image-guided robotics: registration results. Int J Comput Assist Radiol Surg. 2011;6:627-34.
[105] Schiroli G, Angiero F, Zangerl A, Benedicenti S, Ferrante F, Widmann G. Accuracy of a flapless
protocol for computer-guided zygomatic implant placement in human cadavers: expectations and
reality. The international journal of medical robotics + computer assisted surgery : MRCAS.
2016;12(1):102-8.
[106] Widmann G, Stoffner R, Keiler M, Zangerl A, Widmann R, Puelacher W, et al. A laboratory
training and evaluation technique for computer-aided oral implant surgery. The international journal
of medical robotics + computer assisted surgery : MRCAS. 2009;5(3):276-83.
[107] Cao Z, Qin C, Fan S, Yu D, Wu Y, Chen X. Pilot study of a surgical robot system for
zygomatic implant placement. Medical Engineering & Physics. 2019.
[108] Woo S-Y, Lee S-J, Yoo J-Y, Han J-J, Hwang S-J, Huh K-H, et al. Autonomous bone reposition
around anatomical landmark for robot-assisted orthognathic surgery. Journal of Cranio-Maxillofacial
Surgery. 2017;45:1980-8.
[109] Maikuma Y, Usui K, Araki K, Mataki S, Kurosaki N, Furuya N. Evaluation of an Articulated
Measuring Apparatus for Use in the Oral Cavity. Dental materials journal. 2003;22:168-79.
[110] Chen X, Lin Y, Wu Y, Wang C. Real-time motion tracking in image-guided oral implantology. Int J
Med Robot. 2008;4:339-47.
[111] Sun X, Yoon Y, Li J, McKenzie F. Automated image-guided surgery for common and complex
dental implants. Journal of medical engineering & technology. 2014;38:1-9.
[112] Miller RJ. Navigated surgery in oral implantology: a case study. Int J Med Robot. 2007;3:229-34.
[113] Ma Q, Kobayashi E, Wang J, Hara K, Suenaga H, Sakuma I, et al. Development and preliminary
evaluation of an autonomous surgical system for oral and maxillofacial surgery. The International
Journal of Medical Robotics and Computer Assisted Surgery. 2019;15:e1997.
[114] A dental training robot. J Kans State Dent Assoc. 1969;53:161.
[115] Takanobu H, Okino A, Takanishi A, Madokoro M, Miyazaki Y, Maki K. Dental Patient Robot.
2006 IEEE/RSJ International Conference on Intelligent Robots and Systems; 2006. p. 1273-8.
[116] Tanzawa T, Futaki K, Kurabayashi H, Goto K, Yoshihama Y, Hasegawa T, et al. Medical
emergency education using a robot patient in a dental setting. Eur J Dent Educ. 2013;17:e114-9.
[117] Ahire M, Dani N, Muttha R. Dental health education through the brushing ROBOTUTOR: A new
learning experience. Journal of Indian Society of Periodontology. 2012;16:417-20.
[118] Al-Saud LM, Mushtaq F, Allsop MJ, Culmer PC, Mirghani I, Yates E, et al. Feedback and motor
skill acquisition using a haptic dental simulator. Eur J Dent Educ. 2017;21:240-7.
[119] Chen X, Sun P, Liao D. A patient-specific haptic drilling simulator based on virtual reality for
dental implant surgery. Int J Comput Assist Radiol Surg. 2018;13:1861-70.
[120] Corrêa CG, Machado MAAM, Ranzini E, Tori R, Nunes FLS. Virtual Reality simulator for dental
anesthesia training in the inferior alveolar nerve block. J Appl Oral Sci. 2017;25:357-66.
[121] Su Yin M, Haddawy P, Suebnukarn S, Schultheis H, Rhienmora P. Use of haptic feedback to train
correct application of force in endodontic surgery. Proceedings of the 22nd International Conference
on Intelligent User Interfaces: ACM; 2017. p. 451-5.
[122] Ihm J-J, Seo D-G. Does Reflective Learning with Feedback Improve Dental Students’ Self-
Perceived Competence in Clinical Preparedness? Journal of dental education. 2016;80:173-82.
[123] Rekow ED. Digital Dentistry: A Comprehensive Reference and Preview of the Future:
Quintessence Publishing Company Limited; 2018.
[124] Höhne C, Schmitter M. 3D Printed Teeth for the Preclinical Education of Dental Students. J Dent
Educ. 2019;83:1100-6.
[125] Ortiz Simon JL, Martinez AM, Espinoza DL, Romero Velazquez JG. Mechatronic assistant system
for dental drill handling. Int J Med Robot. 2011;7:22-6.
[126] Wang D, Wang L, Zhang Y, Lv P, Sun Y, Xiao J. Preliminary study on a miniature laser
manipulation robotic device for tooth crown preparation. Int J Med Robot. 2014;10:482-94.
[127] Yuan FS, Wang Y, Zhang YP, Sun YC, Wang DX, Lyu PJ. [Study on the appropriate parameters of
automatic full crown tooth preparation for dental tooth preparation robot]. Zhonghua Kou Qiang Yi
Xue Za Zhi. 2017;52:270-3.
[128] Yuan F, Wang Y, Zhang Y, Sun Y, Wang D, Lyu P. An automatic tooth preparation technique: A
preliminary study. Scientific Reports. 2016;6:25281.
[129] Otani T, Raigrodski A, Mancl L, Kanuma I, Rosen J. In vitro evaluation of accuracy and precision
of automated robotic tooth preparation system for porcelain laminate veneers. The Journal of
prosthetic dentistry. 2015;114.
[130] Wang L, Wang D, Zhang Y, Ma L, Sun Y, Lu P. An Automatic Robotic System for Three-
Dimensional Tooth Crown Preparation Using a Picosecond Laser. Lasers in Surgery and Medicine.
2014;46.
[131] Lang T, Staufer S, Jennes B, Gaengler P. Clinical validation of robot simulation of toothbrushing -
comparative plaque removal efficacy. BMC Oral Health. 2014;14:82.
[132] Nelson CA, Hossain SG, Al-Okaily A, Ong J. A novel vending machine for supplying root canal
tools during surgery. J Med Eng Technol. 2012;36:102-16.
[133] Hwang G, Paula A, Hunter E, Liu Y, Babeer A, Karabucak B, et al. Catalytic antimicrobial robots
for biofilm eradication. Science Robotics. 2019;4:eaaw2388.
[134] Jiang JG, Zhang YD. Motion planning and synchronized control of the dental arch generator of
the tooth-arrangement robot. Int J Med Robot. 2013;9:94-102.
[135] Edinger D. [Robot system for the dental office]. Phillip J. 1991;8:301-2, 5-6, 8.
[136] Edinger DH. Accuracy of a robotic system for the reproduction of condylar movements: a
preliminary report. Quintessence Int. 2004;35:519-23.
[137] Lepidi L, Chen Z, Ravida A, Lan T, Wang HL, Li J. A Full-Digital Technique to Mount a Maxillary
Arch Scan on a Virtual Articulator. J Prosthodont. 2019;28:335-8.
[138] Conserva E, Menini M, Tealdo T, Bevilacqua M, Pera F, Ravera G, et al. Robotic chewing
simulator for dental materials testing on a sensor-equipped implant setup. Int J Prosthodont.
2008;21:501-8.
[139] Tahir AM, Jilich M, Trinh DC, Cannata G, Barberis F, Zoppi M. Architecture and design of a
robotic mastication simulator for interactive load testing of dental implants and the mandible. The
Journal of Prosthetic Dentistry. 2019;122:389.e1-.e8.
[140] Sen N, Us YO. Fatigue survival and failure resistance of titanium versus zirconia implant
abutments with various connection designs. J Prosthet Dent. 2019;122:315.e1-.e7.
[141] Raabe D, Harrison A, Ireland A, Alemzadeh K, Sandy J, Dogramadzi S, et al. Improved single- and
multi-contact life-time testing of dental restorative materials using key characteristics of the human
masticatory system and a force/position-controlled robotic dental wear simulator. Bioinspiration &
Biomimetics. 2011;7:016002.
[142] Raabe D, Alemzadeh K, Harrison AL, Ireland AJ. The chewing robot: a new biologically-inspired
way to evaluate dental restorative materials. Conf Proc IEEE Eng Med Biol Soc. 2009;2009:6050-3.
[143] Carvalho A, Brito P, Santos J, Caramelo FJ, Veiga G, Vasconcelos B, et al. Evaluation of two
dental impression materials using a robot arm. Bull Group Int Rech Sci Stomatol Odontol.
2011;50:36-7.
[144] Zhang YD, Jiang JG, Liang T, Hu WP. Kinematics modeling and experimentation of the multi-
manipulator tooth-arrangement robot for full denture manufacturing. J Med Syst. 2011;35:1421-9.
[145] Zhang YD, Jiang JG, Lv PJ, Wang Y. Coordinated control and experimentation of the dental arch
generator of the tooth-arrangement robot. Int J Med Robot. 2010;6:473-82.
[146] Burdea GC, Dunn SM, Levy G. Evaluation of robot-based registration for subtraction
radiography. Med Image Anal. 1999;3:265-74.
[147] Burdea GC, Dunn SM, Immendorf CH, Mallik M. Real-time sensing of tooth position for dental
digital subtraction radiography. IEEE Trans Biomed Eng. 1991;38:366-78.
[148] Spin-Neto R, Mudrak J, Matzen LH, Christensen J, Gotfredsen E, Wenzel A. Cone beam CT image
artefacts related to head motion simulated by a robot skull: visual characteristics and impact on
image quality. Dentomaxillofac Radiol. 2013;42:32310645.
[149] Chan Y-K, Chen Y-F, Pham T, Chang W, Hsieh M-Y. Artificial Intelligence in Medical Applications.
Journal of Healthcare Engineering. 2018;2018:2.
[150] Kim DW, Kim H, Nam W, Kim HJ, Cha IH. Machine learning to predict the occurrence of
bisphosphonate-related osteonecrosis of the jaw associated with dental extraction: A preliminary
report. Bone. 2018;116:207-14.
[151] Sun S. A survey of multi-view machine learning. Neural Computing and Applications. 2013;23.
[152] Torrey L, Shavlik J. Transfer Learning. In: Handbook of Research on Machine Learning Applications
and Trends: Algorithms, Methods, and Techniques. Hershey, PA, USA: IGI Global; 2010. p. 242-64.
[153] Devin C, Gupta A, Darrell T, Abbeel P, Levine S. Learning modular neural network policies for
multi-task and multi-robot transfer. 2017.
[154] Haddadin S, Johannsmeier L. The Art of Manipulation: Learning to Manipulate Blindly. 2018.
[155] Mupparapu M, Wu CW, Chen YC. Artificial intelligence, machine learning, neural networks, and
deep learning: Futuristic concepts for new dental diagnosis. Quintessence Int. 2018;49:687-8.
[156] Park WJ, Park JB. History and application of artificial neural networks in dentistry. Eur J Dent.
2018;12:594-601.
[157] Lee JH, Kim DH, Jeong SN. Diagnosis of cystic lesions using panoramic and cone beam
computed tomographic images based on deep learning neural network. Oral Dis. 2020;26:152-8.
[158] Mansour RF, Al-Marghilnai A, Hagas ZA. Use of artificial intelligence techniques to determine
dental caries: A systematic review. 2019.
[159] Wang Y, Hays RD, Marcus M, Maida CA, Shen J, Xiong D, et al. Developing Children's Oral Health
Assessment Toolkits Using Machine Learning Algorithm. JDR Clin Trans Res.
2019:2380084419885612.
[160] Choi HI, Jung SK, Baek SH, Lim WH, Ahn SJ, Yang IH, et al. Artificial Intelligent Model With
Neural Network Machine Learning for the Diagnosis of Orthognathic Surgery. J Craniofac Surg.
2019;30:1986-9.
[161] Hwang J-J, Jung Y-H, Cho B-H, Heo M-S. An overview of deep learning in the field of dentistry.
Imaging Sci Dent. 2019;49:1-7.
[162] Hung M, Voss MW, Rosales MN, Li W, Su W, Xu J, et al. Application of machine learning for
diagnostic prediction of root caries. Gerodontology. 2019;36:395-404.
[163] Wang X, Yang J, Wei C, Zhou G, Wu L, Gao Q, et al. A personalized computational model
predicts cancer risk level of oral potentially malignant disorders and its web application for
promotion of non-invasive screening. J Oral Pathol Med. 2019.
[164] Xie G, Dong C, Kong Y, Zhong JF, Li M, Wang K. Group Lasso Regularized Deep Learning for
Cancer Prognosis from Multi-Omics and Clinical Features. Genes (Basel). 2019;10:240.
[165] Ryder MI. Periodontics in the USA: An introduction. Periodontol 2000. 2020;82:9-11.
[166] Lee JH, Kim DH, Jeong SN, Choi SH. Detection and diagnosis of dental caries using a deep
learning-based convolutional neural network algorithm. J Dent. 2018;77:106-11.
[167] Blatz MB, Chiche G, Bahat O, Roblee R, Coachman C, Heymann HO. Evolution of Aesthetic
Dentistry. J Dent Res. 2019;98:1294-304.
[168] Schwendicke F, Golla T, Dreher M, Krois J. Convolutional neural networks for dental image
diagnostics: A scoping review. Journal of Dentistry. 2019;91:103226.
[169] Schwendicke F, Elhennawy K, Paris S, Friebertshäuser P, Krois J. Deep learning for caries lesion
detection in near-infrared light transillumination images: A pilot study. Journal of Dentistry.
2020;92:103260.
[170] Casalegno F, Newton T, Daher R, Abdelaziz M, Lodi-Rizzini A, Schürmann F, et al. Caries
Detection with Near-Infrared Transillumination Using Deep Learning. Journal of Dental Research.
2019;98:1227-33.
[171] Du X, Chen Y, Zhao J, Xi Y. A Convolutional Neural Network Based Auto-Positioning Method For
Dental Arch In Rotational Panoramic Radiography. Conf Proc IEEE Eng Med Biol Soc. 2018;2018:2615-
8.
[172] Murata M, Ariji Y, Ohashi Y, Kawai T, Fukuda M, Funakoshi T, et al. Deep-learning classification
using convolutional neural network for evaluation of maxillary sinusitis on panoramic radiography.
Oral Radiol. 2019;35:301-7.
[173] Mikrogeorgis G, Eirinaki E, Kapralos V, Koutroulis A, Lyroudia K, Pitas I. Diagnosis of vertical root
fractures in endodontically treated teeth utilising Digital Subtraction Radiography: A case series
report. Aust Endod J. 2018;44:286-91.
[174] Johari M, Esmaeili F, Andalib A, Garjani S, Saberkari H. Detection of vertical root fractures in
intact and endodontically treated premolar teeth by designing a probabilistic neural network: an ex
vivo study. Dentomaxillofac Radiol. 2017;46:20160107.
[175] Nam Y, Kim HG, Kho HS. Differential diagnosis of jaw pain using informatics technology. J Oral
Rehabil. 2018;45:581-8.
[176] Kebschull M, Papapanou PN. Exploring Genome-Wide Expression Profiles Using Machine
Learning Techniques. Methods Mol Biol. 2017;1537:347-64.
[177] Allareddy V, Venugopalan S, Nalliah R, Caplin J, Lee M, Allareddy V. Orthodontics in the era of
big data analytics. Orthodontics & Craniofacial Research. 2019;22:8-13.
[178] Pereira KR, Sinha R. Welcome the “new kid on the block” into the family: artificial intelligence
in oral and maxillofacial surgery. British Journal of Oral and Maxillofacial Surgery. 2020;58:83-4.
[179] Wu Q, Zhao Y-M, Bai S-Z, Li X. Application of robotics in stomatology. International journal of
computerized dentistry. 2019;22:251-60.
[180] Yamaguchi S, Lee C, Karaer O, Ban S, Mine A, Imazato S. Predicting the debonding of CAD/CAM
composite resin crowns with AI. Journal of Dental Research. 2019.
[181] Thanathornwong B. Bayesian-Based Decision Support System for Assessing the Needs for
Orthodontic Treatment. Healthc Inform Res. 2018;24:22-8.
[182] Takada K, Yagi M, Horiguchi E. Computational Formulation of Orthodontic Tooth-Extraction
Decisions Part I: To Extract or Not To Extract. The Angle orthodontist. 2009;79:885-91.
[183] Patcas R, Timofte R, Volokitin A, Agustsson E, Eliades T, Eichenberger M, et al. Facial
attractiveness of cleft patients: a direct comparison between artificial-intelligence-based scoring and
conventional rater groups. Eur J Orthod. 2019;41:428-33.
[184] Patcas R, Bernini DAJ, Volokitin A, Agustsson E, Rothe R, Timofte R. Applying artificial
intelligence to assess the impact of orthognathic treatment on facial attractiveness and estimated
age. Int J Oral Maxillofac Surg. 2019;48:77-83.
[185] Merdietio Boedi R, Banar N, De Tobel J, Bertels J, Vandermeulen D, Thevissen PW. Effect of
Lower Third Molar Segmentations on Automated Tooth Development Staging using a Convolutional
Neural Network. J Forensic Sci. 2019.
[186] De Tobel J, Radesh P, Vandermeulen D, Thevissen PW. An automated technique to stage lower
third molar development on panoramic radiographs for age estimation: a pilot study. The Journal of
forensic odonto-stomatology. 2017;35:42.
[187] Štepanovský M, Ibrová A, Buk Z, Velemínská J. Novel age estimation model based on
development of permanent teeth compared with classical approach and other modern data mining
methods. Forensic science international. 2017;279:72-82.
[188] Raith S, Vogel E, Anees N, Keul C, Güth J-F, Edelhoff D, et al. Artificial Neural Networks as a
powerful numerical tool to classify specific features of a tooth based on 3D scan data. Computers in
Biology and Medicine. 2016;80.
[189] Nagi R, Konidena A, Rakesh D, Jain S, Kaur N, Mann A. Digitization in forensic odontology: A
paradigm shift in forensic investigations. Journal of Forensic Dental Sciences. 2019;11:5.
[190] Dreyer H, Grischke J, Tiede C, Eberhard J, Schweitzer A, Toikkanen S, et al. Epidemiology and
risk factors of peri‐implantitis: A systematic review. Journal of Periodontal Research. 2018;53.
[191] Grischke J, Eberhard J, Stiesch M. Antimicrobial dental implant functionalization strategies -A
systematic review. Dent Mater J. 2016;35:545-58.
[192] Rekow ED. Digital dentistry: The new state of the art - Is it disruptive or destructive? Dent
Mater. 2020;36:9-24.
[193] Haddadin S, Johannsmeier L, Schmid J, Ende T, Parusel S, Haddadin S, et al. roboterfabrik: A
Pilot to Link and Unify German Robotics Education to Match Industrial and Societal Demands. In:
Lepuschitz W, Merdan M, Koppensteiner G, Balogh R, Obdržálek D, editors. Robotics in Education.
Cham: Springer International Publishing; 2019. p. 3-17.
Figure 1

Figure 2
