Marcello Pelillo
Edwin Hancock (Eds.)
LNCS 10746
Lecture Notes in Computer Science 10746
Commenced Publication in 1973
Founding and Former Series Editors:
Gerhard Goos, Juris Hartmanis, and Jan van Leeuwen
Editorial Board
David Hutchison
Lancaster University, Lancaster, UK
Takeo Kanade
Carnegie Mellon University, Pittsburgh, PA, USA
Josef Kittler
University of Surrey, Guildford, UK
Jon M. Kleinberg
Cornell University, Ithaca, NY, USA
Friedemann Mattern
ETH Zurich, Zurich, Switzerland
John C. Mitchell
Stanford University, Stanford, CA, USA
Moni Naor
Weizmann Institute of Science, Rehovot, Israel
C. Pandu Rangan
Indian Institute of Technology Madras, Chennai, India
Bernhard Steffen
TU Dortmund University, Dortmund, Germany
Demetri Terzopoulos
University of California, Los Angeles, CA, USA
Doug Tygar
University of California, Berkeley, CA, USA
Gerhard Weikum
Max Planck Institute for Informatics, Saarbrücken, Germany
More information about this series at http://www.springer.com/series/7412
Editors
Marcello Pelillo, Ca’ Foscari University of Venice, Venice, Italy
Edwin Hancock, University of York, York, UK
LNCS Sublibrary: SL6 – Image Processing, Computer Vision, Pattern Recognition, and Graphics
This Springer imprint is published by the registered company Springer International Publishing AG
part of Springer Nature
The registered company address is: Gewerbestrasse 11, 6330 Cham, Switzerland
Preface
This volume contains the papers presented at the 11th International Conference on
Energy Minimization Methods in Computer Vision and Pattern Recognition
(EMMCVPR). The year 2017 marked the 20th anniversary of the inaugural meeting
of the conference series, which took place in Venice in May 1997, and we wanted to
celebrate by bringing the event back to Venice. The conference was held from October
30 to November 1, 2017, in the unique setting of Palazzo Franchetti, a historical
Venetian palace facing the Grand Canal. The conference was co-located with ICCV
2017, the 16th International Conference on Computer Vision, which was held in
Venice Lido the week before EMMCVPR.
The call for papers produced 51 submissions, resulting in the 37 papers appearing in
this volume, nine of which were presented as long talks, and the remainder as short
talks. We make no distinction between these two types of paper in this book.
The accepted papers cover a wide range of problems and perspectives, including
clustering and quantum computing, image processing and video analysis, shading and
light, as well as optimization methods, and address both theoretical issues and
real-world problems. In addition to the contributed papers, the conference featured
two invited talks by Joachim Buhmann, from ETH Zurich (Switzerland), and Paolo
Frasconi, from the University of Florence (Italy).
We would like to take this opportunity to express our gratitude to all those who
helped to organize the conference. First of all, thanks are due to the members of the
Steering and the Scientific Committees and to all the additional reviewers. Special
thanks are due to the members of the Organizing Committee, Sebastiano Vascon and
Ismail Elezi, together with Cristiana Fiandra and her team at “The Office” for managing
the conference organization, and also Roberta D’Argenio and Agnese Boscarol, from
the European Center for Living Technology (ECLT), for their administrative support.
We gratefully acknowledge generous financial support from the Bertinoro International
Center for Informatics (BICI) and the Department of Environmental Sciences, Infor-
matics and Statistics (DAIS) of Ca’ Foscari University of Venice.
We also offer our appreciation to the editorial staff at Springer for producing this
book and for supporting the event through publication in the LNCS series. We thank
all the authors and the invited speakers for helping to make this event a success, and
producing a high-quality publication to document the event. We warmly acknowledge
the help of Sebastiano Vascon in assembling the final proceedings volume.
Program Committee
Ognjen Arandjelovic University of St. Andrews, UK
Xiang Bai Huazhong University of Science and Technology,
China
Michael Bronstein University of Lugano, Switzerland
Andres Bruhn University of Stuttgart, Germany
Sotirios Chatzis CUT, Cyprus
Xilin Chen Institute of Computing Technology, China
Jim Clark McGill University, Canada
Daniel Cremers Technical University of Munich, Germany
Gianfranco Doretto West Virginia University, USA
Francisco Escolano University of Alicante, Spain
Mario Figueiredo Instituto de Telecomunicações,
Instituto Superior Técnico, Portugal
Alexander Fix Oculus Research, USA
Boris Flach Czech Technical University in Prague, Czech Republic
Davi Geiger NYU, USA
Edwin Hancock University of York, UK
Hiroshi Ishikawa Waseda University, Japan
Ian Jermyn Durham University, UK
Fredrik Kahl Lund University, Sweden
Zoltan Kato University of Szeged, Hungary
Anton Konushin Lomonosov Moscow State University, Russia
Hamid Krim North Carolina State University, USA
Longin Jan Latecki Temple University, USA
Hongdong Li ANU, Australia
Stan Li CBSR, CASIA, China
Stephen Maybank Birkbeck College, UK
Marcello Pelillo University of Venice, Italy
Thomas Pock Graz University of Technology, Austria
Antonio Robles-Kelly CSIRO, Australia
Emanuele Rodola Technical University of Munich, Germany
Samuel Rota Bulò Fondazione Bruno Kessler, Italy
Christoph Schnoerr Heidelberg University, Germany
Fiorella Sgallari University of Bologna, Italy
Ali Shokoufandeh Drexel University, USA
Hugues Talbot Université Paris Est, France
Wenbing Tao Huazhong University of Science and Technology,
China
Additional Reviewers
1 Introduction
After decades of being a mainly theoretical endeavor, quantum computing now
appears to be becoming practical. The first working quantum computers are being sold,
major industrial players invest in corresponding research and development,
and a growing number of voices predict rapid technological progress [1–3]. All
of these developments will likely impact machine learning and pattern recognition,
because quantum computers promise fast solutions to the kind of optimization
problems dealt with in these fields [4–8].
In this paper, we are therefore concerned with the feasibility of quantum
computing for pattern recognition. In particular, we consider binary clustering
of numerical or relational data and show how to approach these problems using
adiabatic quantum computing [9].
Our interest in adiabatic quantum computing stems from the fact that recent
technological progress has happened mainly in this area [10,11]. Available devices
are geared towards finding low-energy states of Ising models

$$H(s) = \sum_{i,j=1}^{n} Q_{ij}\, s_i s_j + \sum_{i=1}^{n} q_i\, s_i \qquad (1)$$
This can easily be seen using Fisher’s analysis of variance [17]. It establishes
that the total scatter of a sample of data points can be written as

$$S_T = \sum_{x \in X} \| x - \bar{x} \|^2 = S_W(k) + \frac{1}{2n}\, S_B(k) \qquad (6)$$
Interestingly, this expression provides an intuition for the fact that k-means
clustering is agnostic of cluster shapes and distances and thus tends to produce
clusters of about equal size even in situations where this seems unreasonable
[18]. In order for SB in (7) to be large, both the distance ‖x̄1 − x̄2‖ between
the two cluster centers and the product n1 n2 of the two cluster sizes have to be
large. However, since the sum n1 + n2 = n is fixed, the product of the sizes will
be maximal if n1 = n2 = n/2.
6 C. Bauckhage et al.
and, using this heuristic, the problem of k = 2 means clustering becomes that of solving

$$\operatorname{argmax}_{\bar{x}_1, \bar{x}_2} \; \| n_1 \bar{x}_1 - n_2 \bar{x}_2 \|^2. \qquad (9)$$
Next, we express the norm in (9) in a form that does not explicitly depend
on the cluster means x̄i. To this end, we gather the given data in a data matrix
X = [x1, ..., xn] ∈ R^{m×n} and introduce two binary vectors z1, z2 ∈ {0,1}^n
which indicate cluster memberships in the sense that entry l of zi is 1 if xl ∈ Xi
and 0 otherwise. This way, we can write n1 x̄1 = X z1 as well as n2 x̄2 = X z2
and thus

$$\| n_1 \bar{x}_1 - n_2 \bar{x}_2 \|^2 = \| X (z_1 - z_2) \|^2 = \| X s \|^2. \qquad (10)$$

Here, s is guaranteed to be a bipolar vector because, in hard k-means clustering,
every given data point is assigned to one and only one cluster so that

$$s = z_1 - z_2 \in \{-1, +1\}^n. \qquad (11)$$
This, however, establishes that there is an Ising model formulation for the
problem of k = 2 means clustering of zero-mean data. On the one hand, zero-mean
data implies

$$\bar{x} = \frac{1}{n}\,(n_1 \bar{x}_1 + n_2 \bar{x}_2) = 0 \;\Leftrightarrow\; n_1 \bar{x}_1 = -n_2 \bar{x}_2 \qquad (12)$$

so that the two means will always be of opposite sign. On the other hand, since
‖Xs‖² = sᵀXᵀXs, the problem in (9) is equivalent to solving

$$\operatorname{argmax}_{s \in \{-1,1\}^n} \; s^T X^T X\, s. \qquad (13)$$

Because of (12) this will necessarily produce a vector s whose entries are not all
equal and thus induce a clustering of the data in X. If we furthermore substitute
Q = XᵀX, the problem in (13) can also be written as

$$\operatorname{argmin}_{s \in \{-1,1\}^n} \; -\sum_{i,j=1}^{n} Q_{ij}\, s_i s_j. \qquad (14)$$
Looking at (14), two final remarks appear to be in order. First, the
coupling matrix Q = XᵀX is a Gram matrix. The clustering heuristic derived
in this section therefore allows for invoking the kernel trick and is thus applicable
to a wide variety of practical problems.

Second, (14) exposes the “hardness” of k = 2 means clustering, for it
reveals it to be an integer programming problem. That is, it shows that binary
clustering amounts to finding a suitable label vector s ∈ {−1,1}^n whose entries assign data
points to clusters. A naïve approach to k = 2 means clustering would therefore
be to evaluate (14) for each of the 2^n possible assignments of n data points to 2
clusters. On a conventional digital computer this is of course impractical if n is
large. On a quantum computer, on the other hand, we could prepare a system of
n qubits that is in a superposition of 2^n states, each of which reflects a possible
solution. The trick is then to evolve the system such that, when its wave function
is collapsed, it is more likely to be found in a state that corresponds to a good
solution. Below, we discuss how to accomplish this.
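For very small n, the naïve classical enumeration just described is easy to carry out. The following sketch (illustrative data, not code from the paper) builds the coupling matrix Q = XᵀX for a zero-mean sample and searches all 2ⁿ bipolar label vectors for the maximizer of sᵀQs, which is equivalent to minimizing the Ising energy in (14):

```python
import itertools
import numpy as np

# toy data: two groups of points in R^2 (illustrative values)
X = np.array([[-2.0, -1.9, -2.1, 2.0, 1.9, 2.1],
              [-1.0, -1.1, -0.9, 1.0, 1.1, 0.9]])
X = X - X.mean(axis=1, keepdims=True)   # enforce zero mean
n = X.shape[1]

Q = X.T @ X                             # Gram (coupling) matrix

# naive enumeration of all 2^n bipolar label vectors s in {-1,+1}^n
best_s, best_val = None, -np.inf
for bits in itertools.product([-1, 1], repeat=n):
    s = np.array(bits)
    val = s @ Q @ s                     # maximizing s^T Q s = minimizing -sum Q_ij s_i s_j
    if val > best_val:
        best_s, best_val = s, val

print(best_s)   # one of the two symmetric optimal labelings
```

The optimum comes in a symmetric pair s and −s, mirroring the two equivalent labelings of the same bipartition.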
count the degrees of the vertices vi. The degree matrix of G is the diagonal matrix
D = diag(d), and it is easily verified that both the degree matrix and the
outer product matrix ddᵀ are symmetric and positive semidefinite.

The Laplacian matrix of G is given by L = D − A. Again, if G is undirected,
its Laplacian is a symmetric and positive semidefinite matrix.
Finally, the volume vol(G) of G amounts to the sum of all its vertex degrees

$$\operatorname{vol}(G) = \sum_{i=1}^{n} d_i = \mathbf{1}^T d. \qquad (16)$$
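These definitions can be checked numerically on a small undirected graph; the adjacency matrix below is an arbitrary example chosen for illustration, not one from the paper:

```python
import numpy as np

# adjacency matrix of a small undirected graph (arbitrary example)
A = np.array([[0, 1, 1, 0],
              [1, 0, 1, 0],
              [1, 1, 0, 1],
              [0, 0, 1, 0]], dtype=float)

d = A.sum(axis=1)          # vertex degrees
D = np.diag(d)             # degree matrix
L = D - A                  # graph Laplacian L = D - A

vol = d.sum()              # vol(G) = 1^T d, twice the number of edges
print(vol)                 # 8.0

# L is symmetric positive semidefinite: all eigenvalues are >= 0
eigvals = np.linalg.eigvalsh(L)
print(np.all(eigvals >= -1e-10))  # True
```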
$$s_i = \begin{cases} +1, & \text{if } v_i \in V_1 \\ -1, & \text{if } v_i \in V_2 \end{cases} \qquad (18)$$

indicate which of the clusters a vertex will belong to after the cut, straightforward
algebra [20] reveals that

$$\operatorname{cut}(V_1, V_2) = \frac{1}{4}\, s^T L\, s. \qquad (19)$$
However, since naïvely minimizing (19) might lead to unbalanced partitions,
various modifications have been proposed. A particularly popular heuristic is to
consider normalized cuts

$$\operatorname{Ncut}(V_1, V_2) = \frac{s^T L\, s}{\eta_1} + \frac{s^T L\, s}{\eta_2} = s^T L\, s \cdot \frac{\eta_1 + \eta_2}{\eta_1\, \eta_2} \qquad (20)$$

which trade off the size of cuts and the volumes of the resulting subgraphs [19].
Finally, it is easy to see that the problem of minimizing (20) with respect to
s ∈ {−1,1}^n is an integer programming problem and thus NP-hard. The Ncut
problem is therefore usually relaxed to spectral clustering [19,21,22]. However,
on an adiabatic quantum computer, we could again prepare a system of n qubits
which are in a superposition of the 2n possible solutions and evolve the system
such that the optimal solution is more likely to be measured. This, however,
requires an optimization objective that can be expressed as an Ising model.
Since the Ncut objective in (20) does not meet this requirement, we next derive
a balanced cut criterion that corresponds to an Ising model.
so that the squared difference of the volumes of the subgraphs that result from
a graph cut can be written as

$$(\eta_1 - \eta_2)^2 = s^T d\, d^T s. \qquad (22)$$

This observation immediately leads to yet another heuristic for balanced
minimum cuts: combining (19) and (22), we can formalize graph clustering as the
task of solving the following equality-constrained quadratic binary optimization
problem

$$s^* = \operatorname{argmin}_{s \in \{-1,1\}^n} \; s^T L\, s \quad \text{s.t.} \quad s^T d\, d^T s = 0. \qquad (23)$$
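For small graphs, (19), (22), and (23) can be checked by brute force. The toy graph below (a hypothetical example with two triangles joined by one bridge edge) is not from the paper; the loop enumerates all bipolar vectors, enforces the equal-volume constraint sᵀddᵀs = 0, and scores each candidate with the cut identity (19):

```python
import itertools
import numpy as np

# a small graph with two natural communities (illustrative)
A = np.zeros((6, 6))
edges = [(0, 1), (0, 2), (1, 2),        # community 1
         (3, 4), (3, 5), (4, 5),        # community 2
         (2, 3)]                        # single bridge edge
for i, j in edges:
    A[i, j] = A[j, i] = 1.0

d = A.sum(axis=1)
L = np.diag(d) - A

best_s, best_cut = None, np.inf
for bits in itertools.product([-1, 1], repeat=6):
    s = np.array(bits, dtype=float)
    if s @ np.outer(d, d) @ s != 0:     # constraint in (23): equal volumes
        continue
    cut = 0.25 * s @ L @ s              # cut value via (19)
    if 0 < cut < best_cut:              # skip degenerate zero-cut labelings
        best_s, best_cut = s, cut

print(best_s, best_cut)   # splits {0,1,2} from {3,4,5}, cutting one edge
```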
$$\frac{\partial}{\partial t}\, |\psi(t)\rangle = -i\, H(t)\, |\psi(t)\rangle \qquad (32)$$

where we have set ℏ = 1. In adiabatic quantum computing, we consider periods
ranging from t = 0 to t = τ and assume the Hamiltonian at time t to be given
as a convex combination of two static Hamiltonians, namely

$$H(t) = \left(1 - \frac{t}{\tau}\right) H_B + \frac{t}{\tau}\, H_P. \qquad (33)$$
where σ_z^i denotes the Pauli spin matrix σz acting on the ith qubit, that is

$$\sigma_z^i = \underbrace{I \otimes \cdots \otimes I}_{i-1 \text{ terms}} \otimes\; \sigma_z \otimes \underbrace{I \otimes \cdots \otimes I}_{n-i \text{ terms}} \qquad (35)$$

where σ_x^i is defined as above, this time with respect to the Pauli spin matrix σx.
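The construction in (35) maps directly onto Kronecker products. The helper below is a sketch (not code from the paper) that builds σ_z^i for an n-qubit system and checks two basic properties:

```python
import numpy as np

I = np.eye(2)
sigma_z = np.array([[1.0, 0.0],
                    [0.0, -1.0]])

def pauli_z(i, n):
    """sigma_z acting on qubit i (1-based) of an n-qubit system, cf. (35)."""
    op = np.array([[1.0]])
    for k in range(1, n + 1):
        op = np.kron(op, sigma_z if k == i else I)
    return op

# sanity checks: sigma_z^2 on 3 qubits is an 8x8 matrix that squares to identity
Z2 = pauli_z(2, 3)
print(Z2.shape)                          # (8, 8)
print(np.allclose(Z2 @ Z2, np.eye(8)))  # True
```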
To compute a clustering, we then let |ψ(t)⟩ evolve from |ψ(0)⟩ to |ψ(τ)⟩
where |ψ(0)⟩ is chosen to be the ground state of HB. That is, if λ denotes the
smallest eigenvalue of HB, the initial state |ψ(0)⟩ of the system corresponds to
the solution of

$$H_B\, |\psi(0)\rangle = \lambda\, |\psi(0)\rangle. \qquad (37)$$
Ising Models for Binary Clustering via Adiabatic Quantum Computing

Finally, upon termination of its evolution, a measurement is performed on
the n-qubit system. This will cause the wave function |ψ(τ)⟩ to collapse to a
particular basis state, and the probability for this state to be |ψi⟩ is given by the
amplitude |ai(τ)|². However, since the adiabatic evolution was steered towards
the problem Hamiltonian HP, basis states that correspond to ground states of
HP are more likely to be found.
On an adiabatic quantum computer, this process can be carried out physically.
On a digital computer, we may simulate it by numerically solving

$$|\psi(\tau)\rangle = -i \int_0^{\tau} H(t)\, |\psi(t)\rangle \, dt. \qquad (38)$$
5 Practical Examples

In this section, we present several examples which demonstrate the practical
feasibility of the above ideas. Our examples are of a didactic nature and mainly
intended to illustrate the adiabatic evolution of n-qubit systems. We thus restrict
ourselves to rather small n = n1 + n2 in order to be able to comprehensibly
visualize how the amplitudes of 2^n basis states evolve over time.

In each experiment, we simulate quantum adiabatic evolutions on a digital
computer. To this end, we set up the corresponding problem Hamiltonian HP,
the beginning Hamiltonian HB, and its ground state |ψ(0)⟩ as discussed above
and use the Python quantum computing toolbox QuTiP [23] to numerically
solve (38) for t ∈ [0, τ = 50].
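Such a simulation can also be sketched without QuTiP by stepping the Schrödinger equation (32) with short piecewise-constant propagators. The two-qubit toy below uses a transverse-field beginning Hamiltonian and an arbitrary diagonal problem Hamiltonian chosen for illustration (it is not one of the paper's experiments):

```python
import numpy as np

n = 2                                   # two qubits -> 4 basis states
dim = 2 ** n

# beginning Hamiltonian H_B = -(sigma_x^1 + sigma_x^2); its ground state
# is the uniform superposition over all basis states
sx = np.array([[0.0, 1.0], [1.0, 0.0]])
I2 = np.eye(2)
HB = -(np.kron(sx, I2) + np.kron(I2, sx))

# toy problem Hamiltonian: diagonal Ising energies (arbitrary illustrative
# values); its ground states are the basis states |01> and |11>
HP = np.diag([2.0, -1.0, 3.0, -1.0])

psi = np.ones(dim, dtype=complex) / np.sqrt(dim)   # ground state of H_B

tau, steps = 50.0, 2000
dt = tau / steps
for k in range(steps):
    t = (k + 0.5) * dt
    H = (1 - t / tau) * HB + (t / tau) * HP        # interpolation (33)
    w, V = np.linalg.eigh(H)                       # exact short-time propagator
    psi = (V * np.exp(-1j * w * dt)) @ (V.conj().T @ psi)

probs = np.abs(psi) ** 2
print(probs.round(3))   # probability mass concentrates on states 1 and 3
```

Because τ = 50 is slow compared with the spectral gap of this small system, the final measurement probabilities concentrate on the two ground states of HP.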
Fig. 1. Example of adiabatic quantum k = 2 means clustering using the Ising model
in (14). (a) zero-mean sample of 8 data points; (b) corresponding clustering result; (c)
adiabatic evolution of a system of 8 qubits. During its evolution over time t, the system
is in a superposition of 2^8 = 256 basis states |ψi⟩, each representing a possible binary
clustering. Initially, it is equally likely to find the system in any of these states. At the
end, two basis states have noticeably higher amplitudes |ai|² than the others and are
therefore more likely to be measured; these are |00111111⟩ and |11000000⟩ and they
both induce the result in (b).
Figure 2(a) shows n = 16 data points arranged in a pattern that suggests the
use of kernel k-means clustering. Accordingly, we choose the coupling matrix for
the Ising model in (14) to be a centered kernel matrix
$$Q_{ij} = k(x_i, x_j) - \frac{1}{n}\sum_{l} k(x_i, x_l) - \frac{1}{n}\sum_{k} k(x_k, x_j) + \frac{1}{n^2}\sum_{k,l} k(x_k, x_l) \qquad (39)$$

where $k(x_i, x_j) = \exp\bigl(-\|x_i - x_j\|^2 / 2\sigma^2\bigr)$ and σ = 0.25. During its quantum
adiabatic evolution, the corresponding 16-qubit system |ψ(t)⟩ is in a superposition
of 2^16 = 65536 basis states but evolves to a configuration in which
two of these basis states have much higher amplitudes than the others (see
Fig. 2(c)). Here, the two most likely basis states are |1111111100000000⟩ and
|0000000011111111⟩ and they cluster the data in a manner a human observer
would expect (see Fig. 2(b)).
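The centering in (39) is equivalent to multiplying the kernel matrix K on both sides by the centering matrix H = I − (1/n)11ᵀ. The sketch below checks this equivalence on randomly generated illustrative data (not the sample from the paper):

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(2, 16))            # 16 points in R^2 (illustrative)
n = X.shape[1]
sigma = 0.25

# RBF kernel matrix k(x_i, x_j) = exp(-||x_i - x_j||^2 / (2 sigma^2))
sq = ((X[:, :, None] - X[:, None, :]) ** 2).sum(axis=0)
K = np.exp(-sq / (2 * sigma ** 2))

# centering via the centering matrix: Q = H K H with H = I - (1/n) 1 1^T
H = np.eye(n) - np.ones((n, n)) / n
Q = H @ K @ H

# term-by-term form of (39): subtract row means and column means, add grand mean
Q2 = K - K.mean(axis=0, keepdims=True) - K.mean(axis=1, keepdims=True) + K.mean()
print(np.allclose(Q, Q2))   # True
```

A useful side effect of the centering is that every row and column of Q sums to zero, since H annihilates the all-ones vector.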
[Figs. 2 and 3: (b) binary clustering result; (c) evolution of amplitudes of basis states |ψi⟩]
graph can be in. At t = 0, all basis states are equally likely but over time their
amplitudes begin to increase or decrease. At t = τ , two of the basis states have
considerably higher amplitudes than the others so that a measurement of the
quantum system at this time will likely cause it to collapse to either one of these
more probable states.
Table 2 shows the top 8 most likely basis states for the qubit system of this
experiment to be found in at the end of its adiabatic evolution. Looking at this
table, we recognize that the two most likely basis states are |0000000000111111⟩
and |1111111111000000⟩ which, due to obvious symmetries, both produce the
result shown in Fig. 3(b).
6 Related Work
7 Summary
After decades of being a mainly theoretical concept, quantum computing is now
becoming practical. Companies like IBM, Google, or Microsoft invest increasing
resources into corresponding research and development [1] and further rapid
progress is expected [2,3]. All of this will likely impact statistical learning and
pattern recognition, because quantum computers have the potential to accelerate
the kind of optimization procedures that are at the heart of many algorithms in
these fields.
Since, at the time of this writing, adiabatic quantum computers appear to
be the most advanced, we were concerned with adiabatic quantum computing for
data analysis, especially for data clustering. From an abstract point of view,
the problem of setting up pattern recognition algorithms for adiabatic quantum
computing can be seen as the problem of expressing their objective functions in
terms of Ising models because adiabatic quantum computers are tailored towards
minimizing Ising energies. We thus derived Ising models for (kernel) k = 2-means
clustering and balanced graph cuts. Simulated examples demonstrated that the
Schrödinger equations governing the dynamics of corresponding qubit systems
can indeed be used to cluster data.
References
1. Technology Quarterly: Quantum Devices. The Economist, March 2017
2. Castelvecchi, D.: Quantum computers ready to leap out of the lab in 2017. Nature
541(7635), 9–10 (2017). https://doi.org/10.1038/541009a
3. Gibney, E.: D-Wave upgrade: how scientists are using the world’s most controversial
quantum computer. Nature 541(7638), 447–448 (2017). https://doi.org/10.1038/
541447b
4. Grover, L.: From Schrödinger’s equation to the quantum search algorithm. J. of
Phys. 56(2), 333–348 (2001). https://doi.org/10.1119/1.1359518
5. Aïmeur, E., Brassard, G., Gambs, S.: Quantum clustering algorithms. In: Proceedings ICML (2007)
6. Aïmeur, E., Brassard, G., Gambs, S.: Quantum speed-up for unsupervised learning.
Mach. Learn. 90(2), 261–287 (2013). https://doi.org/10.1007/s10994-012-531305
7. Lloyd, S., Mohseni, M., Rebentrost, P.: Quantum algorithms for supervised and
unsupervised machine learning. arXiv:1307.0411 [quant-ph] (2013)
8. Wiebe, N., Kapoor, A., Svore, K.: Quantum algorithms for nearest-neighbor meth-
ods for supervised and unsupervised learning. Quantum Inf. Comput. 15(3–4),
316–356 (2015)
9. Albash, T., Lidar, D.: Adiabatic quantum computing. arXiv:1611.04471 [quant-ph]
(2016)
10. Bian, Z., Chudak, F., Macready, W., Rose, G.: The Ising model: teaching an old
problem new tricks. Technical report, D-Wave Systems (2010)
11. Johnson, M., Amin, M., Gildert, S., Lanting, T., Hamze, F., Dickson, N., Harris,
R., Berkley, A., Johansson, J., Bunyk, P., Chapple, E., Enderud, C., Hilton, J.,
Karimi, K., Ladizinsky, E., Ladizinsky, N., Oh, T., Perminov, I., Rich, C., Thom,
M., Tolkacheva, E., Truncik, C., Uchaikin, S., Wang, J., Wilson, B., Rose, G.:
Quantum annealing with manufactured spins. Nature 473(7346), 194–198 (2011).
https://doi.org/10.1038/nature10012
12. Born, M., Fock, V.: Beweis des Adiabatensatzes. Zeitschrift für Physik 51(3–4),
165–180 (1928). https://doi.org/10.1007/BF01343193
13. Lloyd, S.: Least squares quantization in PCM. IEEE Trans. Inf. Theory 28(2),
129–137 (1982). https://doi.org/10.1109/TIT.1982.1056489
14. Hartigan, J., Wong, M.: Algorithm AS 136: a k-means clustering algorithm. J. Roy.
Stat. Soc. C 28(1), 100–108 (1979). https://doi.org/10.2307/2346830
15. MacQueen, J.: Some methods for classification and analysis of multivariate observa-
tions. In: Proceedings Berkeley Symposium on Mathematical Statistics and Prob-
ability (1967)
16. Aloise, D., Deshpande, A., Hansen, P., Popat, P.: NP-hardness of Euclidean
sum-of-squares clustering. Mach. Learn. 75(2), 245–248 (2009). https://doi.org/
10.1007/s10994-009-5103-0
17. Fisher, R.: On the probable error of a coefficient of correlation deduced from a small
sample. Metron 1, 3–32 (1921)
18. MacKay, D.: Information Theory, Inference, and Learning Algorithms. Cambridge
University Press, Cambridge (2003)
19. Shi, J., Malik, J.: Normalized cuts and image segmentation. IEEE Trans. PAMI
22(8), 888–905 (2000). https://doi.org/10.1109/34.868688
20. Dhillon, I.: Co-clustering documents and words using bipartite spectral graph par-
titioning. In: Proceedings KDD (2001)
21. von Luxburg, U.: A tutorial on spectral clustering. Stat. Comput. 17(4), 395–416
(2007). https://doi.org/10.1007/s11222-007-9033-z
22. Fiedler, M.: A property of eigenvectors of nonnegative symmetric matrices and its
application to graph theory. Czech. Math. J. 25(4), 619–633 (1975)
23. Johansson, J., Nation, P., Nori, F.: QuTiP 2: a Python framework for the dynamics
of open quantum systems. Comput. Phys. Commun. 184(4), 1234–1240 (2013).
https://doi.org/10.1016/j.cpc.2012.11.019
24. Biamonte, J., Wittek, P., Pancotti, N., Rebentrost, P., Wiebe, N., Lloyd, S.: Quan-
tum machine learning. arXiv:1611.09347 [quant-ph] (2016)
25. Dunjko, V., Taylor, J., Briegel, H.: Quantum-enhanced machine learning. Phys.
Rev. Lett. 117(13), 130501 (2016). https://doi.org/10.1103/PhysRevLett.117.
130501
26. Schuld, M., Sinayskiy, I., Petruccione, F.: An introduction to quantum
machine learning. Contemp. Phys. 56(2), 172–185 (2014). https://doi.org/10.1080/
00107514.2014.964942
27. Wiebe, N., Kapoor, A., Svore, K.: Quantum perceptron models. In: Proceedings
NIPS (2016)
28. Wittek, P.: Quantum Machine Learning. Academic Press, London (2014)
29. Ushijima-Mwesigwa, H., Negre, C., Mniszewski, S.: Graph partitioning using quan-
tum annealing on the D-Wave system. arXiv:1705.03082 [quant-ph] (2017)
30. Newman, M.: Modularity and community structure in networks. PNAS 103(23),
8577–8582 (2006). https://doi.org/10.1073/pnas.0601602103
Quantum Interference and Shape
Detection
1 Introduction
We are given an image I containing multiple shapes, some of which are possibly
occluded and/or deformed, and the data may contain clutter noise. To detect
the shapes in I, we propose a new method, which is rooted in quantum theory.
The main advantage of the quantum method over existing methods is its high
accuracy and resistance to deformations, thanks to the quantum interference
phenomenon as described, e.g., by Feynman and Hibbs [1] and Feynman [2].

The input data is specified by a set of N feature points X = {x1, x2, ..., xN}
in R^D. Here we focus on 2-dimensional data, so D = 2. An image I may be
specified by the set of points, as shown for example in Fig. 1. Otherwise, feature
points can be extracted from I using, for example, SIFT features, HOG features,
or maxima of wavelet responses.
In the previous chapter I have spoken of Mr. Warner and his fruit
orchard. The old saying that misfortunes, like blessings, never come
singly was verified in their case. One day, not long after the previous
incident, Miss Warner was in the orchard pruning some young trees.
As she moved away from a tree she had finished with, she felt a
sharp slap on her right thigh and knew at once she had trodden on a
snake, which animals are very numerous in that part of the country.
Her bush training had taught her that it is safest in such a case to
stand quite still until you know which end of the snake you are
treading on. Perchance it may be on its head, and if so you can
easily dispatch it; and if, unfortunately, you are on its tail, well, no
earthly power can save you from being bitten before you can jump
clear. As ill-luck would have it, Miss Warner had trod upon the
snake’s tail and it had retaliated by digging its poisoned fang into her
thigh. It was just about to make another stab, when she struck at it
with her pruning knife, cutting it in two and killing it instantly. Then
she coolly wiped the point of the knife on her dress, and deliberately
made a cross cut into her thigh where the snake had bitten her.
Then, while the blood was spurting from the wound, she called out to
her father who came running to her, knowing by the sound of her
voice that something was the matter. In a few seconds he had torn
his handkerchief into strips and tightly bound the leg above and
below the wound. Then, saddling his horse and one for the wounded
girl, they set off on their twelve mile ride to Newcastle.
You can imagine what that ride was like to Miss Warner, up hill and
down dale as fast as the horses could go, the great gash in her leg
an agony of pain. It required endurance, nerve, and pluck—
qualities our colonial bush-reared maidens are in no way deficient in.
Her father, too, had a very good reason for letting her ride on
horseback. Snake poison, as is well known, causes sleepiness,
which, if succumbed to, knows no waking. Had Mr. Warner taken her
in a trap, he would not have been able to prevent her from falling
asleep, so had put her upon her horse, hoping to reach Newcastle
before the poison took effect. They had ridden about nine miles
when Miss Warner became very faint, and could scarcely keep her
seat on her horse. Just then they met a young man riding out to the
lake district, and as soon as he heard the state of affairs, he at once
turned his horse round and went back to Newcastle to obtain a
doctor.
Fortunately the doctor was in, he immediately ordered his carriage,
and taking his instrument case and some antidote to counteract the
snake poison, he set out and met Mr. and Miss Warner just outside
the town. He at once helped her off her horse, then she was taken
more dead than alive to the nearest house, and all that medical skill
could do and suggest was done for her, but it was fully three months
before she was able to return home, and then looked a perfect wreck
in comparison to her former robust self. But the brave spirit was in no
way quenched by the suffering she had gone through.
The other brave girl was the daughter of an old boat-builder named
Parrell, who lived on the banks of Lake McQuarrie, with his wife and
his one daughter, Jennie, who was seventeen years of age, and had
been born in the bush of Australia. Like most of the girls reared in the
bush, she was a fearless horse-woman, a strong swimmer, a first-
class shot with a revolver, as cool as a cucumber at all times, and, to
crown all, one of the prettiest girls in those parts, at least I thought
so, and many a young fellow beside, but Jennie would have none of
us, but would laugh and shake her head at our attempts to oust each
other in our efforts to win her favour, but it was all to no purpose,
Jennie remained heart-whole, and we sighed in vain.
The house they lived in was built of weatherboard, and stood at
the mouth of a small creek, where it emptied itself into the lake. All
the rooms were on the ground floor, and were divided by a thin
partition about six feet high, thus making two bedrooms and a good-
sized living room, all furnished very comfortably, the beds used being
the ordinary trestle camp beds. One night I had gone over to try and
get a chat with Jennie—the night was hot and sultry, there was not a
breath of air moving, the day had been one of the hottest we had
had, the mosquitoes were terribly vicious. When I got there I found
that Mrs. Parrell and Jennie, unable to bear the heat and mosquitoes
any longer for that day, had gone to bed under their mosquito
curtains, so Mr. Parrell and I sat on a log outside the cottage door.
There was some satisfaction in being near the object of my
admiration at any rate, and her father always gave me a very hearty
welcome, which was something in my favour, at least I thought so.
They had been in bed about two hours—while we had been yarning
—when Mrs. Parrell heard her daughter quietly calling her.
“What do you want, lassie?” she replied. “Can’t you sleep?”
“No, mother, there is a snake in my bed,” the girl answered. “It is
lying on my naked legs. I dare not move or it will bite me. Tell my
father quick.”
The terrified mother needed no second bidding, springing out of
bed she rushed to the door and told her husband.
“Hush, mother,” he said quietly, laying his hand on her arm, “don’t
make a sound, or you will do more harm than good.” Then quietly he
crept into his own room and speaking softly, said:
“Where is it now, Jennie? Don’t be afraid, lassie.”
“On my stomach, father, and it is working up to my breast,” she
replied in a low tone.
I thought, of course, the father would have rushed into the room,
and at one blow would have killed the reptile. Not so, the old man
had learnt by experience a better way than that.
The door of the bedroom was just at the foot of the girl’s bed, and
any noise, either by opening or shutting it, would have startled the
snake, and, perhaps, made it plunge its poisoned fang into Jennie’s
body, and thus have ended her young happy life, but the father knew
just what to do under the circumstances. Going to a cupboard that
stood in the corner of his room, he took down an old violin, then
crossing, without a sound, over to the end of the room farthest away
from the bed, he drew the bow lightly over the strings making a soft
plaintive sound. The moment the snake heard the noise, it poised its
head up on the girl’s breast, and as the soft plaintive notes floated
about the room it began to wriggle along towards the foot of the bed
in the direction of the sound. The brave girl kept her nerve and
presence of mind marvellously. She knew full well her very life
depended on it, for the slightest movement on her part, while the
snake was on her body, would have been fatal. But as soon as she
felt the snake slip off the bed she sprang up and out of the way. The
snake, when it heard her move, made a dart for a small knot hole in
the planking of the floor, through which it had entered the room, but it
had Jennie to reckon with, and before it could reach it she had
thrown a pillow upon it. Then her father rushed in and dispatched it
with a stick. He brought it outside, and we measured it, and found it
was three feet nine inches in length and about as thick as a
broom handle. It was a carpet snake, one of the most deadly
enemies to be encountered in bush life. A few minutes afterwards,
Jennie and her mother having dressed, came outside, and I could
not help telling her how much I admired her nerve and courage, and
asked how she knew it was a snake.
“Why because it was so cold and clammy,” she replied, making a
wry face. “A snake is always cold, no matter how hot the place it
gets into may be.”
The next day I had occasion to go to Newcastle on some business
for Mr. Williams, and looked forward with some pleasure to the
twelve-mile ride through the bush on the back of old Blunderbuss, a horse
that had once belonged to that king of bushrangers, Captain Morgan,
and after having passed through several people’s hands, had
become the property of Mr. Williams. We had only gone a few miles
on our way, when I began to notice that what little air there was had
an acrid smell about it, and every step it seemed to become more
close and stifling. Blunderbuss, too, began to twitch his ears and
sniff. Then a little farther on the cause of it burst upon me: the bush
was on fire. For a second or two I pulled up, and looking round, to
my amazement the trees near by began to blaze and crackle, and
there I was with fire in front of me and sweeping along the path
over which I had come. There was nothing to be done but try and get
through to where it was already burnt, so putting spurs to the horse
we began our race for life. Hotter and hotter grew the blast, as I
urged the panting beast forward. Kangaroos leaped out from among
the burning scrub and fled onward; a greater enemy than man was
upon them. Then a drove of wild horses went madly past, while the
birds shrieked and fluttered above our heads, powerless to help
themselves. Onward, still onward we went, the flames hissing and
crackling over our heads, and still we seemed to be no nearer to the
outlet, no track was visible, and all around was the thick hot air.
Blunderbuss, too, seemed to be pulling in quite the opposite direction;
then it flashed across my mind that perhaps the instinct of the poor
beast would lead us into safety, so, giving him his head and hoping
for the best, I let him go his own way. In an instant he wheeled almost
round, and sped onward towards what seemed a veritable inferno of
flaming trees. With head outstretched and feet scarce touching the
hot earth he dashed through the blazing mass, and presently, to my
joy, I saw that he was making for an old bridle path that led around
the cliffs to Newcastle; and thankful I was, for had it not been for the
instinct of Blunderbuss we might both have perished miserably. As
we slackened our speed now that the danger was past, I turned to
look at the sight we were leaving behind. Overhead was a dusky
canopy of thick black smoke; between us and the still raging fire
stood the black bare trees; farther away the sparks were dropping
from the thick smoke like a hailstorm, and then again it looked like a
moving curtain of crimson smoke, with the falling and blazing twigs
and small branches like coloured fireworks, all blue, red and
yellow, while tongues of flame licked up the dry scrub and grass.
It was a grand sight to look at when you were safely out of it, but
we were both in a sorry state, for our hair was singed and our skin
blistered where the flames had touched us. The next day
Blunderbuss and I returned to Belmont; needless to say, we gave the
still-burning bush a wide berth, and reached home before nightfall.
The fire burned for the best part of a week, and when, some
weeks afterwards, I passed that way again, I marvelled that we had
ever come through that furnace of flame.
CHAPTER XXI
A Dangerous Enterprise