Jürgen Beyerer, Raphael Hagmanns, Daniel Stadler
Pattern Recognition
Also of interest
Mathematics of Deep Learning
An Introduction
Leonid Berlyand, Pierre-Emmanuel Jabin, 2023
ISBN 978-3-11-102431-8, e-ISBN (PDF) 978-3-11-102555-1
Category Theory
Invariances and Symmetries in Computer Science
Zoran Majkic, 2023
ISBN 978-3-11-108056-7, e-ISBN (PDF) 978-3-11-108167-0
Mathematical Logic
An Introduction
Daniel Cunningham, 2023
ISBN 978-3-11-078201-1, e-ISBN (PDF) 978-3-11-078207-3
Algorithms
Design and Analysis
Sushil C. Dimri, Preeti Malik, Mangey Ram, 2021
ISBN 978-3-11-069341-6, e-ISBN (PDF) 978-3-11-069360-7
Jürgen Beyerer, Raphael Hagmanns, Daniel Stadler
Pattern Recognition
2nd Edition
Authors
Prof. Dr.-Ing. habil. Jürgen Beyerer
Fraunhofer Institute of Optronics, System Technologies and Image Exploitation (IOSB)
Fraunhoferstr. 1
76131 Karlsruhe
juergen.beyerer@iosb.fraunhofer.de
-and-
Institute for Anthropomatics and Robotics, Chair IES
Karlsruhe Institute of Technology (KIT)
Haid-und-Neu-Str. 7
76131 Karlsruhe
Raphael Hagmanns
Institute for Anthropomatics and Robotics, Chair IES
Karlsruhe Institute of Technology (KIT)
Haid-und-Neu-Str. 7
76131 Karlsruhe
raphael.hagmanns@kit.edu
Daniel Stadler
Institute for Anthropomatics and Robotics, Chair IES
Karlsruhe Institute of Technology (KIT)
Haid-und-Neu-Str. 7
76131 Karlsruhe
daniel.stadler@kit.edu
ISBN 978-3-11-133919-1
e-ISBN (PDF) 978-3-11-133920-7
e-ISBN (EPUB) 978-3-11-133941-2
www.degruyter.com
Preface
Pattern Recognition ⊂ Machine Learning ⊂ Artificial Intelligence: This
relation could give the impression that pattern recognition is only a tiny, very special-
ized topic. That, however, is misleading. Pattern recognition is a very important field
of machine learning and artificial intelligence with its own rich structure and many
interesting principles and challenges. For humans, and also for animals, their natural
abilities to recognize patterns are essential for navigating the physical world which they
perceive with their naturally given senses. Pattern recognition here performs an impor-
tant abstraction from sensory signals to categories: on the most basic level, it enables
the classification of objects into “Eatable” or “Not eatable” or, e.g., into “Friend” or “Foe”.
These categories (or, synonymously, classes) do not always have a tangible character. Ex-
amples of non-material classes are, e.g., “secure situation” or “dangerous situation”. Such
classes may even shift depending on the context, for example, when deciding whether
an action is socially acceptable or not. Therefore, everybody is very much acquainted, at
least at an intuitive level, with what pattern recognition means in our daily life. This fact
is surely one reason why pattern recognition as a technical subdiscipline is a source of so
much inspiration for scientists and engineers. In order to implement pattern recognition
capabilities in technical systems, it is necessary to formalize it in such a way that the
designer of a pattern recognition system can systematically engineer the algorithms and
devices necessary for a technical realization. This textbook summarizes a lecture course
about pattern recognition that one of the authors (Jürgen Beyerer) has been giving for
students of technical and natural sciences at the Karlsruhe Institute of Technology (KIT)
since 2005. The aim of this book is to introduce the essential principles, concepts and chal-
lenges of pattern recognition in a comprehensive and illuminating presentation. We
try to explain all aspects of pattern recognition in a clear, self-contained fashion. Facts
are explained with a sufficiently deep mathematical treatment, but without going into
the very last technical details of a mathematical proof. The explanations given will aid
readers in understanding the essential ideas and in comprehending their interrelations.
Above all, readers will gain the big picture that underlies all of pattern recognition.
The authors would like to thank their peers and colleagues for their support:
Special thanks are owed to Dr. Ioana Gheţa who was very engaged during the early
phases of the lecture “Pattern Recognition” at the KIT. She prepared most of the many
slides and accompanied the course along many lecture periods.
Thanks as well to Dr. Martin Grafmüller and to Dr. Miro Taphanel for supporting
the lecture Pattern Recognition with great dedication.
Moreover, many thanks to Prof. Michael Heizmann and Prof. Fernando Puente León
for inspiring discussions, which have positively influenced the evolution of the lecture.
Thanks to Christian Hermann and Lars Sommer for providing additional figures
and examples of deep learning. Our gratitude also goes to our friends and colleagues Alexey
Pak, Ankush Meshram, Chengchao Qu, Christian Hermann, Ding Luo, Julius Pfrommer,
Julius Krause, Johannes Meyer, Lars Sommer, Mahsa Mohammadikaji, Mathias Anneken,
Mathias Ziebarth, Miro Taphanel, Patrick Philipp, and Zheng Li for providing valuable
input and corrections for the preparation of this manuscript.
Lastly, we thank De Gruyter for their support and collaboration in this project.
Contents
Preface V
List of Figures XV
Notation XIX
Introduction XXIII
2 Features 10
2.1 Types of features and their traits 10
2.1.1 Nominal scale 10
2.1.2 Ordinal scale 12
2.1.3 Interval scale 12
2.1.4 Ratio scale and absolute scale 13
2.2 Feature space inspection 13
2.2.1 Projections 14
2.2.2 Intersections and slices 15
2.3 Transformations of the feature space 17
2.4 Measurement of distances in the feature space 17
2.4.1 Basic definitions 19
2.4.2 Elementary norms and metrics 20
2.4.3 A metric for sets 21
2.4.4 Metrics on the ordinal scale 23
2.4.5 The cosine distance 23
2.4.6 The Kullback–Leibler divergence 24
2.4.7 The t-distributed stochastic neighbor embedding 29
2.4.8 Tangential distance measure 31
2.5 Normalization 34
2.5.1 Alignment, elimination of physical dimension, and leveling of
proportions 35
2.5.2 Lighting adjustment of images 35
2.5.3 Distortion adjustment of images 39
2.5.4 Dynamic time warping 40
2.6 Selection and construction of features 41
2.6.1 Descriptive features 41
2.6.2 Model-driven features 45
2.6.3 Construction of invariant features 50
2.7 Dimensionality reduction of the feature space 59
2.7.1 Principal component analysis 60
2.7.2 Kernelized principal component analysis 73
2.7.3 Independent component analysis 79
2.7.4 Multiple discriminant analysis 83
2.7.5 Dimensionality reduction with t-SNE 90
2.7.6 Autoencoder 91
2.7.7 Dimensionality reduction by feature selection 92
2.7.8 Bag of words 94
2.8 Exercises 99
Bibliography 315
Glossary 319
Index 325
List of Tables
Tab. 1 Capabilities of humans and machines in relation to pattern recognition XXV
Tab. 7.1 Character sequences generated by Markov models of different order 237
List of Figures
Fig. 1 Examples of artificial and natural objects XXIII
Fig. 2 Industrial bulk material sorting system XXIV
Fig. 3.1 Example of a random distribution of mixed discrete and continuous quantities 104
Fig. 3.2 The decision space 105
Fig. 3.3 Workflow of the MAP classifier 108
Fig. 3.4 3-dimensional probability simplex in barycentric coordinates 108
Fig. 3.5 Connection between the likelihood ratio and the optimal decision region 112
Fig. 3.6 Decision of a MAP classifier in relation to the a posteriori probabilities 113
Fig. 3.7 Underlying densities in the reference example for classification 114
Fig. 3.8 Optimal decision regions 115
Fig. 3.9 Naive Bayes parameter estimation 117
Fig. 3.10 Risk of the minimax classifier 119
Fig. 3.11 Decision boundary with uneven priors 122
Fig. 3.12 Decision regions of a generic Gaussian classifier 124
Fig. 3.13 Decision regions of a generic two-class Gaussian classifier 124
Fig. 3.14 Decision regions of a Gaussian classifier with the reference example 125
Fig. 3.15 Iris flowers modeled with a GMM 126
Fig. 7.1 Techniques for extending linear discriminants to more than two classes 192
Fig. 7.2 Nonlinear separation by augmentation of the feature space 194
Fig. 7.3 Decision regions of a linear regression classifier 195
Fig. 7.4 Four steps of the perceptron algorithm 197
Fig. 7.5 Feed-forward neural network with one hidden layer 199
Fig. 7.6 Single layer of a feed-forward neural network 201
Fig. 7.7 Updates of weights in a feed-forward neural network 202
Fig. 7.8 Decision regions of a feed-forward neural network 205
Fig. 7.9 Neuron activation of an autoencoder with three hidden neurons 206
Fig. 7.10 Pre-training with stacked autoencoders 209
Fig. 7.11 Comparison of ReLU and sigmoid activation functions 210
Fig. 7.12 A single convolution block in a convolutional neural network 211
Fig. 7.13 High level structure of a convolutional neural network 211
Fig. 7.14 Types of features captured in convolution blocks of a convolutional neural network 213
Fig. 7.15 Detection and classification of vehicles in aerial images with CNNs 214
Fig. 7.16 Structure of an exemplary CNN used for feature extraction 215
Fig. 7.17 Building block of ResNet 216
Fig. 7.18 Scheme of a variational autoencoder 217
Fig. 7.19 Scheme of a mixture density network 217
Fig. 7.20 Classification with maximum margin 220
Fig. 7.21 Decision regions of a hard margin SVM 227
Fig. 7.22 Geometric interpretation of the slack variables 229
Fig. 7.23 Decision regions of a soft margin SVM 230
Fig. 7.24 Decision boundaries of hard margin and soft margin SVMs 231
Fig. 7.25 Toy example of a matched filter 232
Fig. 7.26 Discrete first order Markov model with three states 236
Fig. 7.27 Discrete first order hidden Markov model 238
Fig. 7.28 Illustration of a recurrent neural network 246
Fig. 7.29 Unrolling of a recurrent neural network 247
Fig. 7.30 Long short-term memory cell 248
Fig. 7.31 Transformer architecture 250
Fig. 7.32 Concept of a generative adversarial network 253
Fig. 7.33 Scheme of C-RNN-GAN 254
Fig. 9.1 Relation of the world model and training and test sets 274
Fig. 9.2 Sketch of different class assignments under different model families 275
Fig. 9.3 Expected test error, empirical training error, and VC confidence vs. VC dimension 276
Fig. 9.4 Classification error probability 279
Fig. 9.5 Classification outcomes in a 2-class scenario 280
Fig. 9.6 Performance indicators for a binary classifier 280
Fig. 9.7 Examples of ROC curves 282
Fig. 9.8 Converting a multi-class confusion matrix to binary confusion matrices 283
Fig. 9.9 Five-fold cross-validation 285
Fig. 9.10 Schematic example of AdaBoost training 287
Fig. 9.11 AdaBoost classifier obtained by training in Figure 9.10 288
Fig. 9.12 Reasons to refuse to classify an object 288
Fig. 9.13 Classifier with rejection option 289
Fig. 9.14 Rejection criteria and the corresponding rejection regions 290
Notation
General identifiers
Special identifiers
c Number of classes
d Dimension of feature space
D Set of training samples
i, j, k Indices along the dimension, i.e., i, j, k ∈ {1, . . . , d}, or along the number of samples, i.e.,
i, j, k ∈ {1, . . . , N}
I Identity matrix
j Imaginary unit, j² = −1
J Fisher information matrix
k(⋅,⋅) Kernel function
k(⋅) Decision function
K Decision space
l Cost function l : Ω0 /∼ × Ω/∼ → ℝ
L Cost matrix, L ∈ ℝ^((c+1)×c)
m Feature vector
mi Feature vector of the i-th sample
m ij The j-th component of the i-th feature vector
M ij The component at the i-th row and j-th column of the matrix M
M Feature space
N Number of samples
o Object
ω Class of objects, i.e., ω ⊆ Ω
ω0 Rejection class
Ω Set of objects (the relevant part of the world), Ω = {o1, . . . , oN}
Ω/∼ The domain factorized w.r.t. the classes, i.e., the set of classes Ω/∼ = {ω1, . . . , ωc}
Ω0/∼ The set of classes including the rejection class, Ω0/∼ = Ω/∼ ∪ {ω0}
General sets
Special symbols
Abbreviations
[Figure 2: Industrial bulk material sorting system. Labeled components: computer, line-scan camera, illumination, bulk material, conveyor, ejection stage, background plate.]
These lecture notes on pattern recognition are mainly concerned with the last two issues.
The process of designing a pattern recognition system will be covered in its entirety,
and the underlying mathematical background of the required building blocks will be
given in depth.
Pattern recognition systems are generally parts of larger systems, in which pattern
recognition is used to derive decisions from the result of the classification. Industrial
sorting systems are typical of this (see Figure 2). Here, products are processed differently
depending on their class memberships.
Hence, as a pattern recognition system is not an end in itself, the design of such a system
has to consider the consequences of a bad decision caused by a misclassification. This
invites a comparison of the abilities of human and machine. The main advantage of auto-
matic pattern recognition is that it can execute recurring classification tasks with great
speed and without fatigue. However, an automatic classifier can only discern the classes
that were considered in the design phase and it can only use those features that were
defined in advance. A pattern recognition system to tell apples from oranges may label
a pear as an apple and a lemon as an orange if lemons and pears were not known in
the design phase. The features used for classification might be chosen poorly and not be
discriminative enough. Different environmental conditions (e.g., lighting) in the labora-
tory and in the field that were not considered beforehand might impair the classification
performance, too. Humans, on the other hand, can use their associative and cognitive
capabilities to achieve good classification performance even in adverse conditions. In
addition, humans are capable of undertaking further actions if they are unsure about a
decision. The contrasting abilities of humans and machines in relation to pattern recog-
nition are compared in Table 1. In many cases one will choose to build a hybrid system:
easy classification tasks are processed automatically, while ambiguous cases require
human intervention, which may be aided by the machine, e.g., by providing a selection
of the most probable classes.
1 Fundamentals and definitions
The aim of this chapter is to describe the general structure of a pattern recognition
system and properly define the fundamental terms and concepts that were partially
used in the Introduction already. A description of the generic process of designing a
pattern recognizer will be given and the challenges at each step will be stated more
precisely.
Definition 1.1 (Equivalence relation). Let Ω be a set of elements with some relation ∼. Sup-
pose further that o, o1 , o2 , o3 ∈ Ω are arbitrary. The relation ∼ is said to be an equivalence
relation if it fulfills the following conditions:
1. Reflexivity: o ∼ o.
2. Symmetry: o1 ∼ o2 ⇔ o2 ∼ o1 .
3. Transitivity: o1 ∼ o2 and o2 ∼ o3 ⇒ o1 ∼ o3 .
Given an equivalence relation ∼ on Ω and an element o ∈ Ω, the equivalence class
[o]∼ := {õ ∈ Ω | õ ∼ o} is the set of all elements that are equivalent to o. The object o is
also called a representative of the set [o]∼. In the context of pattern recognition, each
o ∈ Ω denotes an object and each [o]∼ denotes a class. A different approach to classifying
every element of a set is given by partitioning the set: a partition of Ω is a collection of
nonempty, pairwise disjoint subsets of Ω whose union is Ω.
It is easy to see that equivalence relations and partitions describe synonymous concepts:
every equivalence relation induces a partition and every partition induces an equiva-
lence relation.
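The correspondence between equivalence relations and partitions can be made concrete in code. The following minimal Python sketch (our illustration, not taken from the book) computes the partition Ω/∼ induced by a given relation; the predicate is assumed to be a genuine equivalence relation:

    # Minimal sketch: compute the partition Omega/~ induced by an
    # equivalence relation on a finite set. The predicate equiv(a, b) is
    # assumed to be reflexive, symmetric, and transitive (not checked here).

    def partition(omega, equiv):
        """Return the equivalence classes of omega as a list of lists."""
        classes = []
        for o in omega:
            for cls in classes:
                if equiv(o, cls[0]):   # compare with one representative
                    cls.append(o)
                    break
            else:
                classes.append([o])    # o starts a new equivalence class
        return classes

    # Example: two integers are equivalent if they agree modulo 3.
    print(partition(range(10), lambda a, b: a % 3 == b % 3))
    # -> [[0, 3, 6, 9], [1, 4, 7], [2, 5, 8]]

Note that transitivity is what allows the inner loop to compare against a single representative of each class rather than against all of its members.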
The underlying principle of all pattern recognition is illustrated in Figure 1.1. On the
left it shows—in abstract terms—the world and a (sub)set Ω of objects that live within
the world.
[Figure 1.1: The principle of pattern recognition. The domain Ω ⊆ World with classes
ω1, . . . , ωc is mapped by sensing, measuring, and characterizing (oi → mi) into the
feature space, which decision boundaries partition into decision regions R1, . . . , Rc.]
The set Ω is given by the pattern recognition task and is also called the domain. Only
the objects in the domain are relevant to the task; this is the so-called closed
world assumption. The task also partitions the domain into classes ω1 , ω2 , ω3 , . . . ⊆ Ω. A
suitable mapping associates every object o i to a feature vector mi ∈ M inside the feature
space M. The goal is now to find rules that partition M along decision boundaries so that
the classes of M match the classes of the domain. Hence, the rule for classifying an object
o is
ω̂(o) := ωi if m(o) ∈ Ri. (1.2)
This means that the estimated class ω̂(o) of object o is set to the class ωi if the feature
vector m(o) falls inside the region Ri. For this reason, the Ri are also called decision
regions. The concept of a classifier can now be stated more precisely:
Definition 1.3 (Classifier). A classifier is a collection of rules that state how to evaluate
feature vectors in order to sort objects into classes. Equivalently, a classifier is a system
of decision boundaries in the feature space.
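To make Equation (1.2) concrete, consider the following minimal Python sketch. The decision regions Ri here are induced by a nearest-prototype rule, which is one simple, hypothetical way to obtain decision boundaries; it is our illustration, not the book's method:

    import numpy as np

    # Sketch of Equation (1.2): assign object o to class omega_i if its
    # feature vector m(o) lies in decision region R_i. Here R_i is the set
    # of points whose nearest prototype is prototype i (a hypothetical choice).

    prototypes = np.array([[0.0, 0.0],    # prototype of class omega_1
                           [3.0, 3.0]])   # prototype of class omega_2

    def classify(m):
        """Return the index i of the decision region R_i that contains m."""
        distances = np.linalg.norm(prototypes - m, axis=1)
        return int(np.argmin(distances))

    print(classify(np.array([0.5, 0.2])))  # -> 0, i.e., omega_1
    print(classify(np.array([2.8, 3.1])))  # -> 1, i.e., omega_2

The decision boundary of this toy classifier is the perpendicular bisector between the two prototypes; any other system of decision boundaries defines a classifier in the same sense.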
Readers experienced in machine learning will find these concepts very familiar. In fact,
machine learning and pattern recognition are closely intertwined: pattern recognition
is (mostly) supervised learning, as the classes are known in advance. This topic will be
picked up again later in this chapter.
In the previous section it was already mentioned that a pattern recognition system maps
objects onto feature vectors (see Figure 1.1) and that the classification is carried out in the
feature space. This section focuses on the steps involved and defines the terms pattern
and feature.
[Figure 1.2: Processing pipeline of a pattern recognition system: sensing, preprocessing, and segmentation of the objects in Ω yield the patterns, from which features are extracted.]
Figure 1.2 shows the processing pipeline of a pattern recognition system. In the first
steps, the relevant properties of the objects from Ω must be put into a machine-readable
form. These first steps (yellow boxes in Figure 1.2) are usually performed by
methods of sensor engineering, signal processing, or metrology, and are not directly part
of the pattern recognition system. The result of these operations is the pattern of the
object under inspection.
Definition 1.4 (Pattern). A pattern is the collection of the observed or measured proper-
ties of a single object.
The most prominent pattern is the image, but patterns can also be (text) documents, audio
recordings, seismograms, or indeed any other signal or data. The pattern of an object is
the input to the actual pattern recognition, which is itself composed of two major steps
(gray boxes in Figure 1.2): previously defined features are extracted from the pattern and
the resulting feature vector is passed to the classifier, which then outputs an equivalence
class according to Equation (1.2).
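The pipeline of Figure 1.2 can be read as a chain of mappings from object to class. The following schematic Python sketch shows this composition; every stage function is a placeholder of our own invention, standing in for real sensor drivers and signal processing:

    # Schematic sketch of the processing pipeline in Figure 1.2.
    # All stage implementations below are placeholders for illustration.

    def sense(obj):                 # sensing: object -> raw signal
        return obj["raw_signal"]

    def preprocess(signal):         # preprocessing, e.g., denoising
        return signal

    def segment(signal):            # segmentation: isolate the pattern
        return signal

    def extract_features(pattern):  # feature extraction: pattern -> m in M
        return [sum(pattern) / len(pattern), max(pattern)]

    def classify(m):                # classifier: feature vector -> class index
        return 0 if m[0] < 1.0 else 1

    def pattern_recognition_system(obj):
        pattern = segment(preprocess(sense(obj)))
        return classify(extract_features(pattern))

    print(pattern_recognition_system({"raw_signal": [0.2, 0.4, 0.1]}))  # -> 0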
A feature is any quality or quantity that can be derived from the pattern, for example,
the area of a region in an image, the count of occurrences of a key word within a text, or
the position of a peak in an audio signal.
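Each of these example features is simple to compute. A short sketch on toy data (illustrative only; the image, text, and signal are made up):

    import numpy as np

    # 1. Area of a region in a binary image: the count of foreground pixels.
    image = np.array([[0, 1, 1],
                      [0, 1, 0],
                      [0, 0, 0]])
    area = int(image.sum())                      # -> 3

    # 2. Count of occurrences of a key word within a text.
    text = "the quick brown fox jumps over the lazy dog"
    count = text.split().count("the")            # -> 2

    # 3. Position of a peak in a sampled audio signal.
    signal = np.array([0.1, 0.7, 0.3, 0.9, 0.2])
    peak_position = int(np.argmax(signal))       # -> 3

    print(area, count, peak_position)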