Derong Liu · Shengli Xie
Yuanqing Li · Dongbin Zhao
El-Sayed M. El-Alfy (Eds.)
LNCS 10635
Neural Information Processing
24th International Conference, ICONIP 2017
Guangzhou, China, November 14–18, 2017
Proceedings, Part II
Lecture Notes in Computer Science 10635
Commenced Publication in 1973
Founding and Former Series Editors:
Gerhard Goos, Juris Hartmanis, and Jan van Leeuwen
Editorial Board
David Hutchison
Lancaster University, Lancaster, UK
Takeo Kanade
Carnegie Mellon University, Pittsburgh, PA, USA
Josef Kittler
University of Surrey, Guildford, UK
Jon M. Kleinberg
Cornell University, Ithaca, NY, USA
Friedemann Mattern
ETH Zurich, Zurich, Switzerland
John C. Mitchell
Stanford University, Stanford, CA, USA
Moni Naor
Weizmann Institute of Science, Rehovot, Israel
C. Pandu Rangan
Indian Institute of Technology, Madras, India
Bernhard Steffen
TU Dortmund University, Dortmund, Germany
Demetri Terzopoulos
University of California, Los Angeles, CA, USA
Doug Tygar
University of California, Berkeley, CA, USA
Gerhard Weikum
Max Planck Institute for Informatics, Saarbrücken, Germany
More information about this series at http://www.springer.com/series/7407
Editors

Derong Liu
Guangdong University of Technology, Guangzhou, China

Shengli Xie
Guangdong University of Technology, Guangzhou, China

Yuanqing Li
South China University of Technology, Guangzhou, China

Dongbin Zhao
Institute of Automation, Chinese Academy of Sciences, Beijing, China

El-Sayed M. El-Alfy
King Fahd University of Petroleum and Minerals, Dhahran, Saudi Arabia
The organizers thank Springer for publishing the proceedings in the prestigious LNCS series and for sponsoring the best paper awards at ICONIP 2017.
General Chair
Derong Liu Chinese Academy of Sciences and Guangdong University
of Technology, China
Advisory Committee
Sabri Arik Istanbul University, Turkey
Tamer Basar University of Illinois, USA
Dimitri Bertsekas Massachusetts Institute of Technology, USA
Jonathan Chan King Mongkut’s University of Technology, Thailand
C.L. Philip Chen The University of Macau, SAR China
Kenji Doya Okinawa Institute of Science and Technology, Japan
Minyue Fu The University of Newcastle, Australia
Tom Gedeon Australian National University, Australia
Akira Hirose The University of Tokyo, Japan
Zeng-Guang Hou Chinese Academy of Sciences, China
Nikola Kasabov Auckland University of Technology, New Zealand
Irwin King Chinese University of Hong Kong, SAR China
Robert Kozma University of Memphis, USA
Soo-Young Lee Korea Advanced Institute of Science and Technology,
South Korea
Frank L. Lewis University of Texas at Arlington, USA
Chu Kiong Loo University of Malaya, Malaysia
Baoliang Lu Shanghai Jiao Tong University, China
Seiichi Ozawa Kobe University, Japan
Marios Polycarpou University of Cyprus, Cyprus
Danil Prokhorov Toyota Technical Center, USA
DeLiang Wang The Ohio State University, USA
Jun Wang City University of Hong Kong, SAR China
Jin Xu Peking University, China
Gary G. Yen Oklahoma State University, USA
Paul J. Werbos Retired from the National Science Foundation, USA
Program Chairs
Shengli Xie Guangdong University of Technology, China
Yuanqing Li South China University of Technology, China
Dongbin Zhao Chinese Academy of Sciences, China
El-Sayed M. El-Alfy King Fahd University of Petroleum and Minerals,
Saudi Arabia
Program Co-chairs
Shukai Duan Southwest University, China
Kazushi Ikeda Nara Institute of Science and Technology, Japan
Weng Kin Lai Tunku Abdul Rahman University College, Malaysia
Shiliang Sun East China Normal University, China
Qinglai Wei Chinese Academy of Sciences, China
Wei Xing Zheng University of Western Sydney, Australia
Regional Chairs
Cesare Alippi Politecnico di Milano, Italy
Tingwen Huang Texas A&M University at Qatar, Qatar
Dianhui Wang La Trobe University, Australia
Publicity Chairs
Jun Fu Northeastern University, China
Min Han Dalian University of Technology, China
Yanjun Liu Liaoning University of Technology, China
Stefano Squartini Università Politecnica delle Marche, Italy
Kay Chen Tan National University of Singapore, Singapore
Kevin Wong Murdoch University, Australia
Simon X. Yang University of Guelph, Canada
Publication Chairs
Ding Wang Chinese Academy of Sciences, China
Jian Wang China University of Petroleum, China
Finance Chair
Xinping Guan Shanghai Jiao Tong University, China
Registration Chair
Qinmin Yang Zhejiang University, China
Conference Secretariat
Biao Luo Chinese Academy of Sciences, China
Bo Zhao Chinese Academy of Sciences, China
Contents

Deep Learning

DeepBIBX: Deep Learning for Image Based Bibliographic Data Extraction (p. 286)
Akansha Bhardwaj, Dominik Mercier, Andreas Dengel, and Sheraz Ahmed

Boxless Action Recognition in Still Images via Recurrent Visual Attention (p. 663)
Weijiang Feng, Xiang Zhang, Xuhui Huang, and Zhigang Luo

Brain-Computer Interface

Recognition of Voluntary Blink and Bite Based on Single Forehead EMG (p. 759)
Jianhai Zhang, Wenhao Huang, Shaokai Zhao, Yanyang Li, and Sanqing Hu

Composite and Multiple Kernel Learning for Brain Computer Interface (p. 803)
Minmin Miao, Hong Zeng, and Aimin Wang

Computational Finance
Tree Structure CNN for Automated Theorem Proving

K. Peng and D. Ma

1 Introduction

Automated theorem proving (ATP) is a subfield of automated reasoning and mathematical logic. The goal of ATP is to prove that a conjecture is a logical consequence of a set of axioms and hypotheses. The traditional approach is to use a formal language, as in systems such as Isabelle [13] and HOL [18], to encode the axioms and carry out the reasoning. For example, [15] gives a formulation built from premise-conclusion pairs, and [17] introduces a way to produce procedures and intermediate steps. Nevertheless, the whole ATP process depends strongly on the researcher's experience, because one must predict whether a statement will be useful in the proof of a given conjecture (a process called premise selection), and there are tens of thousands of candidate statements; all the computer can do is help people complete the logical inference. Meanwhile, although a formal proof can require several person-years of highly time-consuming effort, the results remain limited: formal proof still cannot handle complex systems [10].
In recent years, machine learning has become a popular technology for ATP problems [2,4]. For example, [11] provides a method that uses machine learning to build an ATP system, and [5] provides a machine learning dataset for ATP named HolStep, demonstrating state-of-the-art performance on it of 85% accuracy. But those results do not generalize, because the model ignores the most basic feature of ATP: recursion.
In this paper, we therefore combine recursion with a convolutional neural network (CNN) to help decide intermediate proof steps. We introduce the elementary ideas of our approach in the sections that follow.
© Springer International Publishing AG 2017
D. Liu et al. (Eds.): ICONIP 2017, Part II, LNCS 10635, pp. 3–12, 2017.
https://doi.org/10.1007/978-3-319-70096-0_1
Fig. 1. Top: a traditional linear CNN structure. Bottom: a tree-structure CNN with five leaf nodes
2 Related Work
Work combining machine learning and ATP has focused on two aspects: premise selection and strategy selection. The basic theorem-proving task is premise selection: given a number of proven facts and a conjecture to be proved, the problem of selecting the facts most likely to yield a successful proof is called premise selection [2]. This task is crucial for the efficiency of state-of-the-art automatic techniques [3]. In [9], the authors implement the SInE classifier to address large-scale theory reasoning.
The subsequent theorem-proving task is strategy selection, which means using the selected premises to finish the proof in a precise order. Modern ATP systems such as Vampire [11] and E [16] include languages that can describe a strategy and allow a user to specify the ordering.
Finally, machine learning provides an effective way to help people choose inference steps. In [1], the authors propose a learning-based premise selection method that yields a 50% improvement on the benchmark over SInE, a state-of-the-art system. In [6], the authors successfully apply such methods to higher-order logic proving.
3 Task Description
In a computer or mathematical system, there are some properties that we take to be true; we call them axioms. For example, rev[ ] = [ ], where rev denotes the reverse operator for a list and [ ] is the empty list: reversing an empty list yields the empty list. A property that we do not yet know to hold, or that we want to verify, such as rev(rev xs) = xs for a list xs, is called a conjecture; rev(rev xs) = xs means that reversing a list twice returns the original list. The statements that can be used with the axioms to prove a conjecture are called premises, and our job is to choose a premise from a set of candidate premises. Because humans have to specify intermediate steps in tens of thousands of theorems, this is time-consuming work that can take years [10]. In this paper we therefore focus on the task of deciding whether a premise is helpful to the final result. Our model is accordingly a binary classifier: if the premise is helpful for the final conclusion, it belongs to the positive class; otherwise it belongs to the negative class.
First, we give an example to explain what main goals and subgoals are; they are the basic and important feature of formal proof. Suppose there is a list xs, and we wish to prove that reversing the list twice is equal to the list itself. Represented formally, this is the main goal

rev(rev xs) = xs. (1)

Proving it by induction on xs produces two subgoals, one for the empty list and one for a nonempty list:

rev(rev [ ]) = [ ], (2)

rev(rev (a#list)) = a#list. (3)

Here a#list denotes the list whose first element (head) is a and whose tail is list. To prove the first subgoal, we need an axiom:

rev[ ] = [ ]. (4)

(In general, a symbol such as & stands for an operator, for example #.) This premise is helpful to our result, so it belongs to the positive class. From this example we can see that every time we choose a premise and make an inference, we obtain some new subgoals, and to prove those new subgoals we need more premises: this is a recursive process. Because of this recursion, the sequence of the proof is also important; for example, if we tried to prove the second subgoal rev(rev (a#list)) = a#list first, the proof would not succeed.
From this example, we can conclude that if we want a CNN to handle this task, the CNN must be able to deal with a recursive process; only then can we obtain a good result. Since a tree structure is a natural way to handle recursion, we design a tree-structure CNN. (A machine-checked version of the example above is sketched below.)
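To make the recursive subgoal structure concrete, here is a minimal machine-checked version of the running example. This sketch is ours, written in Lean 4 rather than in the paper's toolchain; `List.reverse` plays the role of rev, and `a :: l` is the paper's a#list. The induction produces exactly the two subgoals (2) and (3) above.

```lean
-- Minimal sketch (ours, Lean 4): the running example rev (rev xs) = xs.
theorem rev_rev (α : Type) (xs : List α) :
    xs.reverse.reverse = xs := by
  induction xs with
  | nil => rfl            -- subgoal (2): rev (rev []) = []
  | cons a l ih =>        -- subgoal (3): rev (rev (a :: l)) = a :: l
    simp [ih]             -- closed with library lemmas about reverse
```

The `cons` case is exactly the authors' point: discharging it needs the induction hypothesis for the tail, so the proof recurses into ever smaller subgoals.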
4 Model

Our model takes two inputs, a dependency (axiom) block and a conjecture block, and produces one output: 0 implies that the dependency (axiom) block has no relationship, or a negative relationship, with the conjecture block, while 1 implies that the dependency (axiom) block has a positive relationship with the conjecture block. We describe the input format specifically in Sect. 5.
The first layer is a word-embedding layer, which converts the dependency (axiom) block and the conjecture block into 256-dimensional vectors. This layer is implemented by the open-source framework Keras, and we use its API directly; the underlying principles are described in [12]. This step is not a vital part of our model, because we do not need to understand the internals of word embedding; we only need its output. We then process the vectors with CNN and max-pooling layers. The difference between our model and [2] is that we regard the output of every CNN layer as a subgoal of the ATP process, so we merge the outputs of all CNN layers into a whole. After that, we use a bidirectional LSTM to process the conjecture and an LSTM to process the dependency (axiom). Finally, we choose binary cross-entropy as the loss function and apply L2 regularization to prevent overfitting. The whole structure is shown in Fig. 2.
The key idea of this work is to enable the neural network to learn the recursive feature of ATP, and to accomplish this we use the tree-structure CNN. The tree-structure CNN differs from all previous work in an important respect: previous approaches do not explicitly incorporate this recursive feature of ATP into the model and therefore generalize poorly, whereas our model incorporates recursion and is expected to generalize substantially better.
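The paper describes the architecture only in prose and figures, so the following is a minimal Keras sketch of one plausible reading of this section, not the authors' code. The vocabulary size, sequence length, number of CNN levels, filter counts, kernel sizes, and LSTM widths are all assumptions, and the per-level merge is a linear-chain approximation of the tree-style aggregation shown in Fig. 1.

```python
# Minimal sketch (assumed hyperparameters) of the two-input model:
# embedding -> stacked Conv1D/pooling with all level outputs merged,
# an LSTM on the dependency branch, a BiLSTM on the conjecture branch,
# binary cross-entropy loss and L2 regularization.
import tensorflow as tf
from tensorflow.keras import Model, layers, regularizers

VOCAB = 2000     # assumption: token vocabulary size
SEQ_LEN = 512    # assumption: padded/truncated token sequence length
l2 = regularizers.l2(1e-4)  # L2 regularization, as stated in the text

def branch(name, rnn):
    """One input branch; keeps every conv level's pooled output so the
    per-level 'subgoal' features can be merged into a whole."""
    inp = layers.Input(shape=(SEQ_LEN,), name=name)
    x = layers.Embedding(VOCAB, 256)(inp)   # 256-dim word embedding
    level_outputs = []
    for filters in (64, 64, 64):            # assumption: three CNN levels
        x = layers.Conv1D(filters, 5, padding="same", activation="relu",
                          kernel_regularizer=l2)(x)
        level_outputs.append(layers.GlobalMaxPooling1D()(x))
        x = layers.MaxPooling1D(2)(x)
    merged = layers.concatenate(level_outputs)  # merge all CNN outputs
    return inp, layers.concatenate([merged, rnn(x)])

dep_in, dep_feat = branch("dependency", layers.LSTM(128))
conj_in, conj_feat = branch("conjecture",
                            layers.Bidirectional(layers.LSTM(128)))
out = layers.Dense(1, activation="sigmoid", kernel_regularizer=l2)(
    layers.concatenate([dep_feat, conj_feat]))

model = Model([dep_in, conj_in], out)
model.compile(optimizer="adam", loss="binary_crossentropy",
              metrics=["accuracy"])
```

Merging the pooled output of every convolutional level, rather than only the last, is the part that corresponds to treating each level's output as a subgoal.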
5 Experiments
The dataset was created by Google [2,5] and is well suited to machine learning tasks that are highly relevant to ATP. It contains 2,013,046 training examples and 196,030 testing examples. The dataset, together with a description of its format, is available from http://cl-informatik.uibk.ac.at/cek/holstep/.
The inputs and labels in this dataset are organized as follows. Each input file consists of a conjecture block, a number of dependency (axiom) blocks, and a number of training/testing example blocks. The conjecture block starts with an 'N' and consists of three lines:

N name of the conjecture
C text representation of the conjecture
T tokenization of the conjecture

Each dependency (axiom) block starts with a 'D' and consists of three lines:

D name of the dependency (axiom)
A text representation of the dependency (axiom)
T tokenization of the dependency (axiom)

Each training/testing example starts with the symbol + or −, where + means useful in the final proof and − means not useful, and consists of two lines:

+ text representation of the intermediate step
T tokenization of the intermediate step
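The paper does not include loading code; as an illustration only, a minimal reader for the block format just described might look like the following sketch (the function name and record layout are ours):

```python
# Hedged sketch of a HolStep-format reader; field meanings follow the
# three block types described above, with minimal error handling.
def parse_holstep(path):
    """Yield (kind, name_or_label, text, tokens) records from one file;
    kind is 'conjecture', 'dependency', or 'example'."""
    with open(path, encoding="utf-8") as f:
        lines = [ln.rstrip("\n") for ln in f]
    i = 0
    while i < len(lines):
        tag = lines[i][:1]
        if tag == "N":                        # conjecture: N, C, T lines
            yield ("conjecture", lines[i][2:], lines[i + 1][2:],
                   lines[i + 2][2:])
            i += 3
        elif tag == "D":                      # dependency: D, A, T lines
            yield ("dependency", lines[i][2:], lines[i + 1][2:],
                   lines[i + 2][2:])
            i += 3
        elif tag in ("+", "-"):               # example: +/- text, T tokens
            label = 1 if tag == "+" else 0    # + useful, - not useful
            yield ("example", label, lines[i][2:], lines[i + 1][2:])
            i += 2
        else:
            i += 1                            # skip blank/unknown lines
```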
Our model is implemented in TensorFlow and Keras, and each model was trained on an NVIDIA GTX 1070. The complete evaluation on the HolStep dataset is given in Tables 1 and 2; we run experiments with all five models on HolStep and compare our result with four others:

1D CNN+LSTM and 1D CNN. These models were proposed by [2,5]; they are simple but serviceable.

2-layer CNN+LSTM [8]. The difference between 1D CNN+LSTM and 2-layer CNN+LSTM is that 1D CNN+LSTM has only one input, whereas 2-layer CNN+LSTM has two inputs: statements and the conjecture. The structure of this model is shown in Fig. 2.

VGG-16. VGG-16 was proposed by the Oxford Visual Geometry Group and won the ImageNet 2014 competition. In this paper, we test whether this model can also be used on a natural-language problem [14].

ResNet. ResNet is a residual-learning framework that eases the training of networks substantially deeper than those used previously: it explicitly reformulates the layers as learning residual functions with reference to the layer inputs, instead of learning unreferenced functions [7] (see the sketch after this list).

Tree-structure CNN+BILSTM. This is our model: a tree-structure CNN in which the LSTM is replaced by a bidirectional LSTM, as shown in Fig. 2. The first variant has only one input, the dependency (axiom) blocks; the second variant has two inputs, the dependency (axiom) blocks and the conjecture blocks. The structure of the first variant is shown in Fig. 2.
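To unpack the residual reformulation just mentioned: a residual block learns a function F(x) and outputs F(x) + x, so the layers fit a correction relative to the input rather than an unreferenced mapping. A minimal sketch (ours, in Keras, with illustrative layer sizes) follows.

```python
# Minimal sketch of one residual block: output = F(x) + x.
# Assumes x already has `filters` channels so the addition is valid.
import tensorflow as tf
from tensorflow.keras import layers

def residual_block(x, filters):
    f = layers.Conv1D(filters, 3, padding="same", activation="relu")(x)
    f = layers.Conv1D(filters, 3, padding="same")(f)  # F(x)
    y = layers.add([f, x])                            # F(x) + x shortcut
    return layers.Activation("relu")(y)
```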
6 Results
First, we compare traditional classification approaches with our model. The traditional approaches include SVM, KNN, and logistic regression. Experimental results are presented in Tables 1 and 2 (models marked with * are ours). Our model yields 90% accuracy on the training set, which shows that the tree-structure CNN can deal with the recursive process well. Additionally, our model yields 85% accuracy on the test set, 5% lower than on the training set. This difference is due to (1) a shortage of training data, leading to overfitting, and (2) the complexity of the dependency relationship between the conjecture block and the dependency (axiom) blocks. Meanwhile, SVM, KNN, and logistic regression also fail to achieve good results, because traditional methods cannot handle recursive information; they can only measure similarity in a geometric space, such as Euclidean space.
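The paper does not specify how these traditional baselines were configured, so the following is only a plausible sketch of such a geometric baseline: TF-IDF features over proof-step tokens fed to a linear classifier (hyperparameters are illustrative; a KNN baseline would swap in `KNeighborsClassifier` the same way).

```python
# Hedged sketch of a traditional baseline on token strings; the purely
# geometric TF-IDF representation is exactly what misses recursion.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline
from sklearn.svm import LinearSVC

def make_baseline(kind="logreg"):
    clf = LogisticRegression(max_iter=1000) if kind == "logreg" else LinearSVC()
    return make_pipeline(TfidfVectorizer(token_pattern=r"\S+"), clf)

# steps:  list of whitespace-separated token strings, one per example
# labels: 1 if the step was useful in the proof, else 0
# model = make_baseline("svm").fit(steps, labels)
```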
Second, we compare other CNN models with ours. The results are shown in Tables 3 and 4, from which we can conclude that the CNN models are better than the traditional approaches. However, VGG-16 and ResNet (50 layers), which perform very well in the ImageNet competition, do not obtain satisfactory results on this dataset. This suggests that computer-vision models and NLP models may not transfer to one another, because the basic features of images and of ATP are different; we still need to choose a suitable architecture for each specific task.
Fig. 2. Left: the construction of our neural network; we merge the outputs of all CNN layers into a whole and apply a bidirectional LSTM to the result. Right: the construction of the model of [2]

Fig. 3. Left: the relationship between epoch and accuracy. Right: the relationship between epoch and loss
For the 1D CNN+LSTM and 2-layer CNN+LSTM models, the accuracies on the training set and the test set are similar, which shows that the dataset is large enough for those models. For Tree-structure CNN+BILSTM, however, the dataset is not sufficient: Tree-structure CNN+BILSTM is more complicated than 1D CNN+LSTM and 2-layer CNN+LSTM, so our model can learn the features better, but it needs more data to train.
7 Discussion
In this paper, we propose a deep learning model that predicts the usefulness of a statement for the final result. Our work can improve ATP techniques and save time in constructing formal proofs. Our model has significant generalization ability because it captures the basic characteristic of ATP: recursion. Also, because the process of ATP is an ordered sequence, we use a bidirectional LSTM, and the experimental results show our model reaching 90% accuracy, 5% higher than [5]. Some problems remain, however. The first is that we need more datasets to demonstrate our model's generalization ability; unfortunately, there are few datasets in this area, so one of our future tasks is to build more ATP datasets. The second problem is that, although the tree-structure CNN improves accuracy, it is a simple model that contains only a few nodes, whereas ATP always involves a huge number of subgoals; when the ATP task is too complicated, our model may not work well. At the same time, if we add more nodes to the model, training becomes highly time-consuming, and more nodes may not improve accuracy significantly. The third problem is that even though we improve accuracy by 5%, much room for improvement remains, so in the future we plan to revise the structure of the LSTM, changing its linear structure into a tree structure; we hope this revision will improve accuracy.
Finally, we note an interesting example from our experiments, shown in Fig. 4.
Fig. 4. The proof relationships between the main goal and its subgoals. A → B means that we need A to prove B. The ellipses indicate that more axioms are needed to prove D and E
Fig. 4 clearly shows the proof relationship between the subgoals and the main goal. A is the main goal of our proof; to prove A, we need A1, A2, and A3, so A1, A2, and A3 have a positive relationship with the main goal. It is easy to see that all the leaf nodes in this tree have a positive relationship with the main goal. However, in our experimental results, our model classifies a leaf node in the seventh layer of this tree (not shown in Fig. 4) as having a negative relationship, whereas SVM gives the correct answer. From this example, we can draw a conclusion: ATP is very complex, containing many axioms whose relationships are themselves complex, so we need to combine traditional methods with CNNs; a CNN or an SVM alone may not be successful.
References
1. Alama, J., Heskes, T., Kühlwein, D., Tsivtsivadze, E., Urban, J.: Premise selection for mathematics by corpus analysis and kernel methods. J. Autom. Reasoning 52(2), 191–213 (2014)
2. Alemi, A.A., Chollet, F., Irving, G., Szegedy, C., Urban, J.: DeepMath - deep sequence models for premise selection. arXiv preprint arXiv:1606.04442 (2016)
3. Aspinall, D., Kaliszyk, C.: What's in a theorem name? In: Blanchette, J.C., Merz, S. (eds.) ITP 2016. LNCS, vol. 9807, pp. 459–465. Springer, Cham (2016). doi:10.1007/978-3-319-43144-4_28