Lecture Notes in Computer Science 10635
Commenced Publication in 1973
Founding and Former Series Editors:
Gerhard Goos, Juris Hartmanis, and Jan van Leeuwen

Editorial Board
David Hutchison
Lancaster University, Lancaster, UK
Takeo Kanade
Carnegie Mellon University, Pittsburgh, PA, USA
Josef Kittler
University of Surrey, Guildford, UK
Jon M. Kleinberg
Cornell University, Ithaca, NY, USA
Friedemann Mattern
ETH Zurich, Zurich, Switzerland
John C. Mitchell
Stanford University, Stanford, CA, USA
Moni Naor
Weizmann Institute of Science, Rehovot, Israel
C. Pandu Rangan
Indian Institute of Technology, Madras, India
Bernhard Steffen
TU Dortmund University, Dortmund, Germany
Demetri Terzopoulos
University of California, Los Angeles, CA, USA
Doug Tygar
University of California, Berkeley, CA, USA
Gerhard Weikum
Max Planck Institute for Informatics, Saarbrücken, Germany
More information about this series at http://www.springer.com/series/7407
Derong Liu · Shengli Xie · Yuanqing Li · Dongbin Zhao · El-Sayed M. El-Alfy (Eds.)

Neural Information Processing

24th International Conference, ICONIP 2017
Guangzhou, China, November 14–18, 2017
Proceedings, Part II
Editors

Derong Liu
Guangdong University of Technology, Guangzhou, China

Shengli Xie
Guangdong University of Technology, Guangzhou, China

Yuanqing Li
South China University of Technology, Guangzhou, China

Dongbin Zhao
Institute of Automation, Chinese Academy of Sciences, Beijing, China

El-Sayed M. El-Alfy
King Fahd University of Petroleum and Minerals, Dhahran, Saudi Arabia

ISSN 0302-9743    ISSN 1611-3349 (electronic)
Lecture Notes in Computer Science
ISBN 978-3-319-70095-3    ISBN 978-3-319-70096-0 (eBook)
https://doi.org/10.1007/978-3-319-70096-0

Library of Congress Control Number: 2017957558

LNCS Sublibrary: SL1 – Theoretical Computer Science and General Issues

© Springer International Publishing AG 2017


This work is subject to copyright. All rights are reserved by the Publisher, whether the whole or part of the
material is concerned, specifically the rights of translation, reprinting, reuse of illustrations, recitation,
broadcasting, reproduction on microfilms or in any other physical way, and transmission or information
storage and retrieval, electronic adaptation, computer software, or by similar or dissimilar methodology now
known or hereafter developed.
The use of general descriptive names, registered names, trademarks, service marks, etc. in this publication
does not imply, even in the absence of a specific statement, that such names are exempt from the relevant
protective laws and regulations and therefore free for general use.
The publisher, the authors and the editors are safe to assume that the advice and information in this book are
believed to be true and accurate at the date of publication. Neither the publisher nor the authors or the editors
give a warranty, express or implied, with respect to the material contained herein or for any errors or
omissions that may have been made. The publisher remains neutral with regard to jurisdictional claims in
published maps and institutional affiliations.

Printed on acid-free paper

This Springer imprint is published by Springer Nature


The registered company is Springer International Publishing AG
The registered company address is: Gewerbestrasse 11, 6330 Cham, Switzerland
Preface

ICONIP 2017 – the 24th International Conference on Neural Information Processing – was held in Guangzhou, China, continuing the ICONIP conference series, which started in 1994 in Seoul, South Korea. Over the past 24 years, ICONIP has been held in Australia, China, India, Japan, Korea, Malaysia, New Zealand, Qatar, Singapore, Thailand, and Turkey. ICONIP has now become a well-established, popular, and high-quality conference series on neural information processing in the region and around the world. With the growing popularity of neural networks in recent years, we have witnessed an increase in the number of submissions and in the quality of papers.
Guangzhou, Romanized as Canton in the past, is the capital and largest city of southern
China’s Guangdong Province. It is also one of the five National Central Cities at the
core of the Pearl River Delta. It is a key national transportation hub and trading port.
November, with its comfortable weather, is the best month of the year to visit Guangzhou.
All participants of ICONIP 2017 had a technically rewarding experience as well as a
memorable stay in this great city.
A neural network is an information processing structure inspired by biological nervous systems, such as the brain. It consists of a large number of highly interconnected processing elements, called neurons, and has the capability of learning from examples. The field of neural networks has evolved rapidly in recent years. It has become a fusion of a number of research areas in engineering, computer science, mathematics, artificial intelligence, operations research, systems theory, biology, and neuroscience. Neural networks have been widely applied in control, optimization, pattern recognition, image processing, signal processing, etc.
ICONIP 2017 aimed to provide a high-level international forum for scientists,
researchers, educators, industrial professionals, and students worldwide to present
state-of-the-art research results, address new challenges, and discuss trends in neural
information processing and applications. ICONIP 2017 invited scholars in all areas of
neural network theory and applications, computational neuroscience, machine learning,
and others.
The conference received 856 submissions from 3,255 authors in 56 countries and
regions across all six continents. Based on rigorous reviews by the Program Committee
members and reviewers, 563 high-quality papers were selected for publication in the
conference proceedings. We would like to express our sincere gratitude to all the
reviewers for the time and effort they generously gave to the conference. We are very
grateful to the Institute of Automation of the Chinese Academy of Sciences, Guang-
dong University of Technology, South China University of Technology, Springer’s
Lecture Notes in Computer Science (LNCS), IEEE/CAA Journal of Automatica Sinica
(JAS), and the Asia Pacific Neural Network Society (APNNS) for their financial
support. We would also like to thank the publisher, Springer, for their cooperation in publishing the proceedings in the prestigious LNCS series and for sponsoring the best paper awards at ICONIP 2017.

September 2017

Derong Liu
Shengli Xie
Yuanqing Li
Dongbin Zhao
El-Sayed M. El-Alfy
ICONIP 2017 Organization

General Chair
Derong Liu Chinese Academy of Sciences and Guangdong University
of Technology, China

Advisory Committee
Sabri Arik Istanbul University, Turkey
Tamer Basar University of Illinois, USA
Dimitri Bertsekas Massachusetts Institute of Technology, USA
Jonathan Chan King Mongkut’s University of Technology, Thailand
C.L. Philip Chen The University of Macau, SAR China
Kenji Doya Okinawa Institute of Science and Technology, Japan
Minyue Fu The University of Newcastle, Australia
Tom Gedeon Australian National University, Australia
Akira Hirose The University of Tokyo, Japan
Zeng-Guang Hou Chinese Academy of Sciences, China
Nikola Kasabov Auckland University of Technology, New Zealand
Irwin King Chinese University of Hong Kong, SAR China
Robert Kozma University of Memphis, USA
Soo-Young Lee Korea Advanced Institute of Science and Technology,
South Korea
Frank L. Lewis University of Texas at Arlington, USA
Chu Kiong Loo University of Malaya, Malaysia
Baoliang Lu Shanghai Jiao Tong University, China
Seiichi Ozawa Kobe University, Japan
Marios Polycarpou University of Cyprus, Cyprus
Danil Prokhorov Toyota Technical Center, USA
DeLiang Wang The Ohio State University, USA
Jun Wang City University of Hong Kong, SAR China
Jin Xu Peking University, China
Gary G. Yen Oklahoma State University, USA
Paul J. Werbos Retired from the National Science Foundation, USA

Program Chairs
Shengli Xie Guangdong University of Technology, China
Yuanqing Li South China University of Technology, China
Dongbin Zhao Chinese Academy of Sciences, China
El-Sayed M. El-Alfy King Fahd University of Petroleum and Minerals,
Saudi Arabia

Program Co-chairs
Shukai Duan Southwest University, China
Kazushi Ikeda Nara Institute of Science and Technology, Japan
Weng Kin Lai Tunku Abdul Rahman University College, Malaysia
Shiliang Sun East China Normal University, China
Qinglai Wei Chinese Academy of Sciences, China
Wei Xing Zheng University of Western Sydney, Australia

Regional Chairs
Cesare Alippi Politecnico di Milano, Italy
Tingwen Huang Texas A&M University at Qatar, Qatar
Dianhui Wang La Trobe University, Australia

Invited Session Chairs


Wei He University of Science and Technology Beijing, China
Dianwei Qian North China Electric Power University, China
Manuel Roveri Politecnico di Milano, Italy
Dong Yue Nanjing University of Posts and Telecommunications,
China

Poster Session Chairs


Sung Bae Cho Yonsei University, South Korea
Ping Guo Beijing Normal University, China
Yifei Pu Sichuan University, China
Bin Xu Northwestern Polytechnical University, China
Zhigang Zeng Huazhong University of Science and Technology, China

Tutorial and Workshop Chairs


Long Cheng Chinese Academy of Sciences, China
Kaizhu Huang Xi’an Jiaotong-Liverpool University, China
Amir Hussain University of Stirling, UK
James Kwok Hong Kong University of Science and Technology, SAR China
Huajin Tang Sichuan University, China

Panel Discussion Chairs


Lei Guo Beihang University, China
Hongyi Li Bohai University, China
Hye Young Park Kyungpook National University, South Korea
Lipo Wang Nanyang Technological University, Singapore

Award Committee Chairs


Haibo He University of Rhode Island, USA
Zhong-Ping Jiang New York University, USA
Minho Lee Kyungpook National University, South Korea
Andrew Leung City University of Hong Kong, SAR China
Tieshan Li Dalian Maritime University, China
Lidan Wang Southwest University, China
Jun Zhang South China University of Technology, China

Publicity Chairs
Jun Fu Northeastern University, China
Min Han Dalian University of Technology, China
Yanjun Liu Liaoning University of Technology, China
Stefano Squartini Università Politecnica delle Marche, Italy
Kay Chen Tan National University of Singapore, Singapore
Kevin Wong Murdoch University, Australia
Simon X. Yang University of Guelph, Canada

Local Arrangements Chair


Renquan Lu Guangdong University of Technology, China

Publication Chairs
Ding Wang Chinese Academy of Sciences, China
Jian Wang China University of Petroleum, China

Finance Chair
Xinping Guan Shanghai Jiao Tong University, China

Registration Chair
Qinmin Yang Zhejiang University, China

Conference Secretariat
Biao Luo Chinese Academy of Sciences, China
Bo Zhao Chinese Academy of Sciences, China
Contents

Deep Learning

Tree-Structure CNN for Automated Theorem Proving . . . 3
Kebin Peng and Dianfu Ma

Training Deep Autoencoder via VLC-Genetic Algorithm . . . 13
Qazi Sami Ullah Khan, Jianwu Li, and Shuyang Zhao

Training Very Deep Networks via Residual Learning with Stochastic Input Shortcut Connections . . . 23
Oyebade K. Oyedotun, Abd El Rahman Shabayek, Djamila Aouada, and Björn Ottersten

Knowledge Memory Based LSTM Model for Answer Selection . . . 34
Weijie An, Qin Chen, Yan Yang, and Liang He

Breast Cancer Malignancy Prediction Using Incremental Combination of Multiple Recurrent Neural Networks . . . 43
Dehua Chen, Guangjun Qian, Cheng Shi, and Qiao Pan

TinyPoseNet: A Fast and Compact Deep Network for Robust Head Pose Estimation . . . 53
Shanru Li, Liping Wang, Shuang Yang, Yuanquan Wang, and Chongwen Wang

Two-Stage Temporal Multimodal Learning for Speaker and Speech Recognition . . . 64
Qianli Ma, Lifeng Shen, Ruishi Su, and Jieyu Chen

SLICE: Structural and Label Information Combined Embedding for Networks . . . 73
Yiqi Chen and Tieyun Qian

An Ultrasonic Image Recognition Method for Papillary Thyroid Carcinoma Based on Depth Convolution Neural Network . . . 82
Wei Ke, Yonghua Wang, Pin Wan, Weiwei Liu, and Hailiang Li

An STDP-Based Supervised Learning Algorithm for Spiking Neural Networks . . . 92
Zhanhao Hu, Tao Wang, and Xiaolin Hu
An End-to-End Approach for Bearing Fault Diagnosis Based on a Deep Convolution Neural Network . . . 101
Liang Chen, Yuxuan Zhuang, Jinghua Zhang, and Jianming Wang

Morph-CNN: A Morphological Convolutional Neural Network for Image Classification . . . 110
Dorra Mellouli, Tarek M. Hamdani, Mounir Ben Ayed, and Adel M. Alimi

Combating Adversarial Inputs Using a Predictive-Estimator Network . . . 118
Jeff Orchard and Louis Castricato

A Parallel Forward-Backward Propagation Learning Scheme for Auto-Encoders . . . 126
Yoshihiro Ohama and Takayoshi Yoshimura

Relation Classification via Target-Concentrated Attention CNNs . . . 137
Jizhao Zhu, Jianzhong Qiao, Xinxiao Dai, and Xueqi Cheng

Comparing Hybrid NN-HMM and RNN for Temporal Modeling in Gesture Recognition . . . 147
Nicolas Granger and Mounîm A. el Yacoubi

Patterns Versus Characters in Subword-Aware Neural Language Modeling . . . 157
Rustem Takhanov and Zhenisbek Assylbekov

Hierarchical Attention BLSTM for Modeling Sentences and Documents . . . 167
Xiaolei Niu and Yuexian Hou

Bi-Directional LSTM with Quantum Attention Mechanism for Sentence Modeling . . . 178
Xiaolei Niu, Yuexian Hou, and Panpan Wang

An Efficient Binary Search Based Neuron Pruning Method for ConvNet Condensation . . . 189
Boyu Zhang, A.K. Qin, and Jeffrey Chan

CNN-LSTM Neural Network Model for Quantitative Strategy Analysis in Stock Markets . . . 198
Shuanglong Liu, Chao Zhang, and Jinwen Ma

Learning Inverse Mapping by AutoEncoder Based Generative Adversarial Nets . . . 207
Junyu Luo, Yong Xu, Chenwei Tang, and Jiancheng Lv

Fast and Accurate Image Super Resolution by Deep CNN with Skip Connection and Network in Network . . . 217
Jin Yamanaka, Shigesumi Kuwashima, and Takio Kurita
Generative Moment Matching Autoencoder with Perceptual Loss . . . 226
Mohammad Ahangar Kiasari, Dennis Singh Moirangthem, and Minho Lee

Three-Means Ternary Quantization . . . 235
Jie Ding, JunMin Wu, and Huan Wu

Will Outlier Tasks Deteriorate Multitask Deep Learning? . . . 246
Sirui Cai, Yuchun Fang, and Zhengyan Ma

The Effect of Task Similarity on Deep Transfer Learning . . . 256
Wei Zhang, Yuchun Fang, and Zhengyan Ma

Exploiting the Tibetan Radicals in Recurrent Neural Network for Low-Resource Language Models . . . 266
Tongtong Shen, Longbiao Wang, Xie Chen, Kuntharrgyal Khysru, and Jianwu Dang

Learning Joint Multimodal Representation Based on Multi-fusion Deep Neural Networks . . . 276
Zepeng Gu, Bo Lang, Tongyu Yue, and Lei Huang

DeepBIBX: Deep Learning for Image Based Bibliographic Data Extraction . . . 286
Akansha Bhardwaj, Dominik Mercier, Andreas Dengel, and Sheraz Ahmed

Bio-Inspired Deep Spiking Neural Network for Image Classification . . . 294
Jingling Li, Weitai Hu, Ye Yuan, Hong Huo, and Tao Fang

Asynchronous, Data-Parallel Deep Convolutional Neural Network Training with Linear Prediction Model for Parameter Transition . . . 305
Ikuro Sato, Ryo Fujisaki, Yosuke Oyama, Akihiro Nomura, and Satoshi Matsuoka

Efficient Learning Algorithm Using Compact Data Representation in Neural Networks . . . 315
Masaya Kibune and Michael G. Lee

Regularizing CNN via Feature Augmentation . . . 325
Liechuan Ou, Zheng Chen, Jianwei Lu, and Ye Luo

Effectiveness of Adversarial Attacks on Class-Imbalanced Convolutional Neural Networks . . . 333
Rafael Possas and Ying Zhou

Sharing ConvNet Across Heterogeneous Tasks . . . 343
Takumi Kobayashi
Training Deep Neural Networks for Detecting Drinking Glasses Using Synthetic Images . . . 354
Abdul Jabbar, Luke Farrawell, Jake Fountain, and Stephan K. Chalup

Image Segmentation with Pyramid Dilated Convolution Based on ResNet and U-Net . . . 364
Qiao Zhang, Zhipeng Cui, Xiaoguang Niu, Shijie Geng, and Yu Qiao

Deep Clustering with Convolutional Autoencoders . . . 373
Xifeng Guo, Xinwang Liu, En Zhu, and Jianping Yin

An Incremental Deep Learning Network for On-line Unsupervised Feature Extraction . . . 383
Yu Liang, Yi Yang, Furao Shen, Jinxi Zhao, and Tao Zhu

Compressing Low Precision Deep Neural Networks Using Sparsity-Induced Regularization in Ternary Networks . . . 393
Julian Faraone, Nicholas Fraser, Giulio Gambardella, Michaela Blott, and Philip H.W. Leong

A Feature Learning Approach for Image Retrieval . . . 405
Junfeng Yao, Yao Yu, Yukai Deng, and Changyin Sun

Soft-Margin Softmax for Deep Classification . . . 413
Xuezhi Liang, Xiaobo Wang, Zhen Lei, Shengcai Liao, and Stan Z. Li

Temporal Attention Neural Network for Video Understanding . . . 422
Jegyung Son, Gil-Jin Jang, and Minho Lee

Regularized Deep Convolutional Neural Networks for Feature Extraction and Classification . . . 431
Khaoula Jayech

Soccer Video Event Detection Using 3D Convolutional Networks and Shot Boundary Detection via Deep Feature Distance . . . 440
Tingxi Liu, Yao Lu, Xiaoyu Lei, Lijing Zhang, Haoyu Wang, Wei Huang, and Zijian Wang

Very Deep Neural Networks for Hindi/Arabic Offline Handwritten Digit Recognition . . . 450
Rolla Almodfer, Shengwu Xiong, Mohammed Mudhsh, and Pengfei Duan

Layer Removal for Transfer Learning with Deep Convolutional Neural Networks . . . 460
Weiming Zhi, Zhenghao Chen, Henry Wing Fung Yueng, Zhicheng Lu, Seid Miad Zandavi, and Yuk Ying Chung
Music Genre Classification Using Masked Conditional Neural Networks . . . 470
Fady Medhat, David Chesmore, and John Robinson

Reinforced Memory Network for Question Answering . . . 482
Anupiya Nugaliyadde, Kok Wai Wong, Ferdous Sohel, and Hong Xie

Hybrid Deep Learning for Sentiment Polarity Determination of Arabic Microblogs . . . 491
Sadam Al-Azani and El-Sayed M. El-Alfy

Low Frequency Words Compression in Neural Conversation System . . . 501
Sixing Wu, Ying Li, and Zhonghai Wu

A Width-Variable Window Attention Model for Environmental Sensors . . . 512
Cuiqin Hou, Yingju Xia, Jun Sun, Jing Shang, Ryozo Takasu, and Masao Kondo

Memorizing Transactional Databases Compressively in Deep Neural Networks for Efficient Itemset Support Queries . . . 521
Yi Ji and Yukio Ohsawa

Offensive Sentence Classification Using Character-Level CNN and Transfer Learning with Fake Sentences . . . 532
Suin Seo and Sung-Bae Cho

Hierarchical Hybrid Attention Networks for Chinese Conversation Topic Classification . . . 540
Yujun Zhou, Changliang Li, Bo Xu, Jiaming Xu, Jie Cao, and Bo Xu

Aggregating Class Interactions for Hierarchical Attention Relation Extraction . . . 551
Kaiyu Huang, Si Li, and Guang Chen

Tensorial Neural Networks and Its Application in Longitudinal Network Data Analysis . . . 562
Mingyuan Bai, Boyan Zhang, and Junbin Gao

3HAN: A Deep Neural Network for Fake News Detection . . . 572
Sneha Singhania, Nigel Fernandez, and Shrisha Rao

Hierarchical Parameter Sharing in Recursive Neural Networks with Long Short-Term Memory . . . 582
Fengyu Li, Mingmin Chi, Dong Wu, and Junyu Niu

Robust Deep Face Recognition with Label Noise . . . 593
Jirui Yuan, Wenya Ma, Pengfei Zhu, and Karen Egiazarian
Weakly-Supervised Dual Generative Adversarial Networks for Makeup-Removal . . . 603
Xuedong Hou, Yun Li, and Tao Li

Analysis of Gradient Degradation and Feature Map Quality in Deep All-Convolutional Neural Networks Compared to Deep Residual Networks . . . 612
Wei Gao and Mark D. McDonnell

Single-Image Super-Resolution for Remote Sensing Data Using Deep Residual-Learning Neural Network . . . 622
Ningbo Huang, Yong Yang, Junjie Liu, Xinchao Gu, and Hua Cai

Layer-Wise Training to Create Efficient Convolutional Neural Networks . . . 631
Linghua Zeng and Xinmei Tian

Learning Image Representation Based on Convolutional Neural Networks . . . 642
Zhanbo Yang, Fei Hu, Jingyuan Wang, Jinjing Zhang, and Li Li

Heterogeneous Features Integration in Deep Knowledge Tracing . . . 653
Lap Pong Cheung and Haiqin Yang

Boxless Action Recognition in Still Images via Recurrent Visual Attention . . . 663
Weijiang Feng, Xiang Zhang, Xuhui Huang, and Zhigang Luo

Compositional Sentence Representation from Character Within Large Context Text . . . 674
Geonmin Kim, Hwaran Lee, Bokyeong Kim, and Soo-young Lee

Ultra-deep Neural Network for Face Anti-spoofing . . . 686
Xiaokang Tu and Yuchun Fang

License Plate Detection Using Deep Cascaded Convolutional Neural Networks in Complex Scenes . . . 696
Qiang Fu, Yuan Shen, and Zhenhua Guo

Brain-Computer Interface

Task-Free Brainprint Recognition Based on Degree of Brain Networks . . . 709
Wanzeng Kong, Qiaonan Fan, Luyun Wang, Bei Jiang, Yong Peng, and Yanbin Zhang

Optimized Echo State Network with Intrinsic Plasticity for EEG-Based Emotion Recognition . . . 718
Rahma Fourati, Boudour Ammar, Chaouki Aouiti, Javier Sanchez-Medina, and Adel M. Alimi
A Computational Investigation of an Active Region in Brain Network Based on Stimulations with Near-Infrared Spectroscopy . . . 728
Xu Huang, Raul Fernandez Rojas, Allan C. Madoc, Keng-Liang Ou, and Sheikh Md. Rabiul Islam

An Algorithm Combining Spatial Filtering and Temporal Down-Sampling with Applications to ERP Feature Extraction . . . 739
Feifei Qi, Yuanqing Li, Zhenfu Wen, and Wei Wu

Intent Recognition in Smart Living Through Deep Recurrent Neural Networks . . . 748
Xiang Zhang, Lina Yao, Chaoran Huang, Quan Z. Sheng, and Xianzhi Wang

Recognition of Voluntary Blink and Bite Base on Single Forehead EMG . . . 759
Jianhai Zhang, Wenhao Huang, Shaokai Zhao, Yanyang Li, and Sanqing Hu

Multimodal Classification with Deep Convolutional-Recurrent Neural Networks for Electroencephalography . . . 767
Chuanqi Tan, Fuchun Sun, Wenchang Zhang, Jianhua Chen, and Chunfang Liu

An Improved Visual-Tactile P300 Brain Computer Interface . . . 777
Hongyan Sun, Jing Jin, Yu Zhang, Bei Wang, and Xingyu Wang

A New Hybrid Feature Selection Algorithm Applied to Driver's Status Detection . . . 786
Peng-fei Ye, Lan-lan Chen, and Ao Zhang

Deep Learning Method for Sleep Stage Classification . . . 796
Ling Cen, Zhu Liang Yu, Yun Tang, Wen Shi, Tilmann Kluge, and Wee Ser

Composite and Multiple Kernel Learning for Brain Computer Interface . . . 803
Minmin Miao, Hong Zeng, and Aimin Wang

Transfer Learning Enhanced Common Spatial Pattern Filtering for Brain Computer Interfaces (BCIs): Overview and a New Approach . . . 811
He He and Dongrui Wu

EEG-Based Driver Drowsiness Estimation Using Convolutional Neural Networks . . . 822
Yuqi Cui and Dongrui Wu

Real-Time fMRI-Based Brain Computer Interface: A Review . . . 833
Yang Wang and Dongrui Wu
Computational Finance

Dynamic Bidding Strategy Based on Probabilistic Feedback in Display Advertising . . . 845
Yuzhu Wu, Shumin Pan, Qianwen Zhang, and Jinkui Xie

Dempster-Shafer Fusion of Semi-supervised Learning Methods for Predicting Defaults in Social Lending . . . 854
Aleum Kim and Sung-Bae Cho

Robust Portfolio Risk Minimization Using the Graphical Lasso . . . 863
Tristan Millington and Mahesan Niranjan

Non-Negative Matrix Factorization with Exogenous Inputs for Modeling Financial Data . . . 873
Steven Squires, Luis Montesdeoca, Adam Prügel-Bennett, and Mahesan Niranjan

Stacked Denoising Autoencoder Based Stock Market Trend Prediction via K-Nearest Neighbour Data Selection . . . 882
Haonan Sun, Wenge Rong, Jiayi Zhang, Qiubin Liang, and Zhang Xiong

Ten-Quarter Projection for Spanish Central Government Debt via WASD Neuronet . . . 893
Yunong Zhang, Zhongxian Xue, Mengling Xiao, Yingbiao Ling, and Chengxu Ye

Data Augmentation Based Stock Trend Prediction Using Self-organising Map . . . 903
Jiayi Zhang, Wenge Rong, Qiubin Liang, Haonan Sun, and Zhang Xiong

Deep Candlestick Mining . . . 913
Andrew D. Mann and Denise Gorse

Author Index . . . 923


Deep Learning
Tree-Structure CNN for Automated Theorem Proving

Kebin Peng and Dianfu Ma

School of Computer Science and Engineering, Beihang University,
College Road 37, Haidian District, Beijing 100191, China
kebinpeng@gmail.com, madf@act.buaa.edu.cn

Abstract. The most difficult and laborious part of Automated Theorem Proving (ATP) is that people must search among millions of intermediate steps to finish a proof. In this paper, we present a novel neural network that can effectively help people finish this work. Specifically, we design a tree-structure CNN involving a bidirectional LSTM. We compare our model with other neural network models in experiments on the HolStep dataset, a machine learning dataset for higher-order logic theorem proving. Compared to previous approaches, our model improves accuracy significantly, reaching 90% accuracy on HolStep.

1 Introduction
Automated theorem proving (ATP) is a subfield of automated reasoning and mathematical logic. The goal of ATP is to prove that a conjecture is a logical consequence of axioms and hypotheses. The traditional way to do ATP is to use a formal proof system such as Isabelle [13] or HOL [18] to encode axioms and perform reasoning. For example, [15] gives a form made of premise-conclusion pairs, and [17] introduces a way to produce procedures and intermediate steps. Nevertheless, in ATP the whole process depends strongly on the researcher's experience, because people need to predict whether a statement is useful in the proof of a given conjecture (we call this process premise selection), and there are tens of thousands of statements. All the computer can do is help people complete the logical inference. Meanwhile, although formal proof requires several person-years and is highly time-consuming, the results are still limited: formal proof still cannot verify complex systems [10].
In recent years, machine learning has become a popular technology for solving ATP problems [2,4]. For example, [11] provides a method that uses machine learning to build an ATP system. In [5], the authors provide a dataset named HolStep, a machine learning dataset for ATP, and at the same time demonstrate state-of-the-art performance on HolStep, reaching 85% accuracy. But the results do not generalize well, because this model ignores the most basic feature of ATP: recursion.
So in this paper, we join recursion with a convolutional neural network to help people decide on intermediate steps. We introduce the elementary characteristics of formal proof (subgoal, main goal, and recursion) into a CNN and propose a novel neural network called the Tree-structure CNN. The experimental results show that recursion is an effective way to tackle formal proof, especially premise selection. Specifically, we change the traditional linear structure of a CNN into a tree structure in order to cope with recursion. In our evaluation, we run experiments on HolStep and compare our approach with four other models. The experimental results demonstrate that our model yields significant accuracy improvements compared to [5], reaching 90% accuracy on HolStep.
The rest of this paper is organized as follows. In Sect. 2 we review related work. In Sect. 3 we introduce ATP and its basic feature, recursion, which is the most important motivation of this paper. Our model and its motivation are presented in Sect. 4. Experiments and results are shown in Sects. 5 and 6. Finally, the discussion is in Sect. 7.

Fig. 1. Top: the traditional linear CNN structure. Bottom: a tree-structure CNN with five leaf nodes

2 Related Work
Work combining machine learning and ATP focuses on two aspects: premise selection and strategy selection. The basic theorem-proving task is premise selection: given a number of proven facts and a conjecture to be proved, the problem of selecting the facts most likely to yield a successful proof is called premise selection [2]. The task is crucial for the efficiency of state-of-the-art automatic techniques [3]. In [9], the authors implement the SInE classifier to tackle large-scale theory reasoning.
The subsequent theorem-proving task is strategy selection, which means using the premises to finish the proof in a precise order. Modern ATP systems, for instance Vampire [11] or E [16], include a language that can describe the strategy and allow a user to specify the ordering.

Finally, machine learning methods provide an effective way to help people choose inference steps. In [1], the authors present a learning-based premise selection method that yields a 50% improvement on the benchmark over SInE, a state-of-the-art system. In [6], the authors successfully apply such methods to higher-order logic proving.

3 Task Description
In a computer or mathematical system, there are some properties that we take to be true; we call them axioms. For example, rev[ ] = [ ], where rev denotes the reverse operator on lists and [ ] is the empty list: reversing an empty list yields the empty list itself. A property that we do not yet know to hold, or want to verify, such as rev(rev xs) = xs for a nonempty list xs (reversing a list twice yields the original list), is called a conjecture. If a conjecture can be proved from the axioms, it can in turn be used as a premise, and our job is to choose a premise from a set of premises. Because humans have to specify intermediate steps across tens of thousands of theorems, this is time-consuming work that can take several person-years [10]. So in this paper we focus on the task of deciding whether a premise is helpful to the final result. Our model is therefore a binary classification model: if the premise is helpful for the final conclusion, it belongs to the positive class; otherwise it belongs to the negative class.
First, we give an example to explain the main goal and subgoals, the basic and important features of formal proof. Consider a list xs. We wish to prove that reversing the list twice is equal to the list itself. Represented formally, this gives the following equation:

rev(rev xs) = xs (1)


rev is the reversal operation and xs is the list. This is our main goal, which we seek to prove. First we need to prove a basic situation, an empty list; this is one of the subgoals. We can write it formally like this:

rev(rev[ ]) = [ ]. (2)

Second, we need to prove another subgoal:

rev(rev list) = list => rev(rev(a#list)) = a#list. (3)

a#list denotes the list with head a and tail list, i.e., the list obtained by prepending the element a to list. To prove the first subgoal, we need an axiom:

rev[ ] = [ ]. (4)

To prove the second subgoal, we need a premise:

rev(xs&ys) = (rev ys)&(rev xs). (5)


The symbol & denotes a binary list operator; for identity (5) to hold, it is naturally read as list concatenation. This premise is helpful to our result, so it belongs to the positive class. From this example, we can see that every time we choose a premise and make an inference, we obtain some new subgoals; to prove those new subgoals, we need more premises. This is a recursive process. Also, because of this recursion, the order of the proof is important: for example, if we tried to prove the second subgoal rev(rev(a#list)) = a#list first, the proof would not succeed.
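As a quick, purely illustrative sanity check (assumptions of this sketch: rev is ordinary list reversal, and & is read as list concatenation), the identities above can be verified on Python lists:

def rev(xs):
    return xs[::-1]               # list reversal

assert rev([]) == []                           # axiom (4)
xs, ys = [1, 2, 3], [4, 5]
assert rev(rev(xs)) == xs                      # main goal (1)
assert rev(xs + ys) == rev(ys) + rev(xs)       # premise (5), with & read as +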
From this example, we conclude that if we want a CNN to handle this task, it must be able to deal with a recursive process; only then can we obtain a good result. Since a tree structure is a natural way to represent recursion, we design a tree-structure CNN.

4 Network Structure and Motivation


4.1 Motivation
The basic characteristic of ATP is that the proof process is recursive, as the bottom of Fig. 1 shows. The bottom of Fig. 1 is a tree: every node in this tree is a goal that needs to be proved in ATP. Specifically, the root of the tree is the main goal, and the leaf nodes are subgoals. The proof process searches the tree in depth-first order, as sketched below. From this feature we draw two motivations: first, we change the linear CNN structure into a tree structure; second, we combine the CNN with an LSTM (not shown in Fig. 1).
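The following is only an illustrative sketch of this goal-tree view; the Goal class and the example statements are hypothetical and are not part of the paper's implementation.

# Hypothetical sketch: a proof visits a tree of goals in depth-first order.
from dataclasses import dataclass, field

@dataclass
class Goal:
    statement: str
    subgoals: list = field(default_factory=list)

def dfs(goal, visit):
    visit(goal.statement)        # attempt this goal first
    for sub in goal.subgoals:    # then recurse into the subgoals it spawned
        dfs(sub, visit)

main_goal = Goal('rev (rev xs) = xs', [
    Goal('rev (rev []) = []'),                                       # subgoal (2)
    Goal('rev (rev list) = list ==> rev (rev (a#list)) = a#list'),   # subgoal (3)
])
dfs(main_goal, print)  # prints the goals in the order a prover would visit them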
For the first motivation, we believe the traditional linear CNN structure, shown at the top of Fig. 1, cannot deal with recursion very well. We therefore design a tree-structure CNN shaped like the proof process. We use two CNN layers to handle the two input parts, the conjecture block and the dependency (axiom) block, separately. Then we join the outputs of those two layers together, as at the joint node in Fig. 1. To prove the subgoal, we still need the dependency (axiom) block, so we feed the dependency (axiom) block into a CNN layer again. The conjecture block and dependency (axiom) block are described specifically in Sect. 5.
For the second motivation, we combine an LSTM with the tree-structure CNN because order is very important in ATP. There is an ordering relationship between the main goal and the subgoals; if it is disordered, the proof process fails, that is, the final goal cannot be proved. Given its ability to process ordered sequences, an LSTM is a natural choice for our model.
The key of our model is the tree-structure CNN shown in Fig. 1. Based on this hierarchical model, we can track the salient feature of formal proof: the subgoal. The subgoal is a very common and important feature of formal proof, which we presented in Sect. 3. The core idea of our model is recursion: we use the tree structure to represent the recursive process of ATP. In our model, recursion corresponds to the tree structure.

4.2 Network Structure


We are inspired by [2,5], and our model is likewise based on a deep convolutional neural network. Our model has two inputs, the dependency (axiom) block and the conjecture block, and one output: 0 implies that the dependency (axiom) block has no or a negative relationship with the conjecture block, and 1 implies that it has a positive relationship with the conjecture block. We describe the input format specifically in Sect. 5.
The first layer is a word embedding layer, which converts the dependency (axiom) block and conjecture block into 256-dimensional vectors. This layer is implemented by the open-source framework Keras, and we use its API directly; the underlying principles are described in [12]. This step is not a vital part of our model, because we do not need to understand the internals of word embedding, only its result. We then process the vectors with a CNN layer and a max-pooling layer. The difference between our model and [2] is that we regard the output of every CNN layer as a subgoal of ATP, so we merge the outputs of all CNN layers into a whole. After that, we use a bidirectional LSTM to handle the conjecture and an LSTM to handle the dependency (axiom). Finally, we choose binary cross-entropy as the loss function and utilize L2 regularization to prevent overfitting. The whole structure is indicated exactly in Fig. 3.
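A rough Keras sketch of one plausible reading of this pipeline is given below; it is not the authors' code. The vocabulary size, sequence length, filter count, kernel size, and LSTM width are assumptions for illustration (the paper specifies only the 256-dimensional embedding, the binary cross-entropy loss, and L2 regularization), and the tree-shaped merging is simplified to a single join of the two branches.

# Hedged sketch of the two-input premise-selection model (assumed hyperparameters).
from tensorflow.keras import layers, models, regularizers

VOCAB_SIZE = 2000   # assumption: size of the HolStep token vocabulary
MAX_LEN = 512       # assumption: padded token-sequence length

def branch(name, rnn):
    """One branch of the tree: embedding -> 1D convolution -> pooling -> RNN."""
    inp = layers.Input(shape=(MAX_LEN,), name=name)
    x = layers.Embedding(VOCAB_SIZE, 256)(inp)  # 256-dim embedding, as in the paper
    x = layers.Conv1D(256, 7, activation='relu',
                      kernel_regularizer=regularizers.l2(1e-4))(x)  # L2 regularization
    x = layers.MaxPooling1D(3)(x)
    return inp, rnn(x)

# Bidirectional LSTM for the conjecture, plain LSTM for the dependency (axiom).
conj_in, conj_vec = branch('conjecture', layers.Bidirectional(layers.LSTM(128)))
dep_in, dep_vec = branch('dependency', layers.LSTM(128))

merged = layers.Concatenate()([conj_vec, dep_vec])   # the "joint node" of the tree
out = layers.Dense(1, activation='sigmoid')(merged)  # 1 = dependency useful for conjecture

model = models.Model(inputs=[conj_in, dep_in], outputs=out)
model.compile(optimizer='adam', loss='binary_crossentropy', metrics=['accuracy'])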
The key idea of this work is to enable the neural network to learn the recursive feature of ATP; to this end, we use the tree-structure CNN. The tree-structure CNN differs from all previous works in an important aspect: previous approaches do not explicitly incorporate this recursive feature of ATP into the model, so those models generalize poorly, whereas our model incorporates recursion and generalizes much better.

5 Experiments
We evaluate on the HolStep dataset, built by the authors of [2,5], which is well suited for machine learning tasks highly relevant to ATP. There are 2,013,046 training examples and 196,030 testing examples in total. The dataset, together with a description of the format used, is available from http://cl-informatik.uibk.ac.at/cek/holstep/.
The inputs and labels of this dataset are structured as follows. Each input file consists of a conjecture block, a number of dependency (axiom) blocks, and a number of training/testing example blocks. The conjecture block starts with an 'N' and consists of three lines:
N name of the conjecture
C text representation of the conjecture
T tokenization of the conjecture
Each dependency (axiom) block starts with a 'D' and consists of three lines:
D name of the dependency (axiom)
A text representation of the dependency (axiom)
T tokenization of the dependency (axiom)
Each training/testing example starts with the symbol '+' or '-', where '+' means useful in the final proof and '-' means not useful, and consists of two lines:
+ text representation of the intermediate step
T tokenization of the intermediate step
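For concreteness, a minimal reader for this file format could look as follows. This is a hedged sketch derived only from the layout listed above (that each tag is followed by a single space is an assumption), not the official HolStep parser.

# Hedged sketch of a HolStep file reader (assumed: 'X ' tag-plus-space prefix).
def read_holstep(path):
    conjecture, dependencies, examples = None, [], []
    with open(path, encoding='utf-8') as f:
        lines = [ln.rstrip('\n') for ln in f]
    i = 0
    while i < len(lines):
        tag = lines[i][:1]
        if tag == 'N':            # conjecture block: name, text, tokenization
            conjecture = {'name': lines[i][2:],
                          'text': lines[i + 1][2:],
                          'tokens': lines[i + 2][2:].split()}
            i += 3
        elif tag == 'D':          # dependency (axiom) block: name, text, tokenization
            dependencies.append({'name': lines[i][2:],
                                 'text': lines[i + 1][2:],
                                 'tokens': lines[i + 2][2:].split()})
            i += 3
        elif tag in ('+', '-'):   # labeled intermediate step: text, tokenization
            examples.append({'label': 1 if tag == '+' else 0,
                             'text': lines[i][2:],
                             'tokens': lines[i + 1][2:].split()})
            i += 2
        else:
            i += 1                # skip anything unrecognized (e.g., blank lines)
    return conjecture, dependencies, examples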

Our model is implemented in TensorFlow and Keras. Each model was trained on an Nvidia GTX 1070. The complete evaluation on the HolStep dataset is given in Tables 1 and 2; we run experiments on all five models on HolStep and compare our result with four others:
1D CNN+LSTM and 1D CNN. These models were proposed by [2,5]. They are simple but effective models.
2-layer CNN+LSTM [8]. The difference between 1D CNN+LSTM and 2-layer CNN+LSTM is that 1D CNN+LSTM has only one input, whereas 2-layer CNN+LSTM has two inputs: statements and conjecture. The structure of this model is shown in Fig. 2.
VGG-16. VGG-16 was proposed by the Oxford Visual Geometry Group and won the ImageNet 2014 competition. In this paper, we examine whether this model can be used to deal with natural-language problems [14].
ResNet. ResNet is a residual learning framework that eases the training of networks substantially deeper than those used previously. It explicitly reformulates the layers as learning residual functions with reference to the layer inputs, instead of learning unreferenced functions [7].
Tree-structure CNN+BiLSTM. Our model uses the tree-structure CNN, and we change the LSTM to a bidirectional LSTM, as shown in Fig. 3. The first model has only one input: dependency (axiom) blocks. The second model has two inputs: dependency (axiom) blocks and conjecture blocks. The structure of the first model is shown in Fig. 2.

6 Results
First, we compare traditional classification approaches with our model. These include SVM, KNN, and logistic regression. Experimental results are presented in Tables 1 and 2 (the model marked * is ours). Our model yields 90% accuracy on the training dataset, which shows that the tree-structure CNN can handle the recursive process well. Additionally, our model yields 85% accuracy on the test dataset, 5% lower than on the training dataset. This difference is due to (1) a lack of training data and overfitting, and (2) the dependency relationship between the conjecture block and the dependency (axiom) blocks being very complex. Meanwhile, SVM, KNN, and logistic regression do not achieve good results either, because traditional methods cannot handle the recursive information; they can only measure similarity in a geometric space, such as Euclidean space.
Second, we compare other CNN models with our model; the results are shown in Tables 3 and 4. From Tables 3 and 4, we can conclude that the CNN models are better than the traditional approaches. However, VGG-16 and ResNet (50 layers), which perform very well in the ImageNet contest, do not obtain satisfactory results on this dataset. This suggests that computer vision models may not transfer directly to NLP-style tasks, because the basic features of images and of ATP are different, so we still need to choose a suitable method for each specific task.

Fig. 2. Left: the construction of our neural network; we merge the output of every CNN layer into a whole and apply a bidirectional LSTM to the result. Right: the construction of [2]

Fig. 3. Left: the relationship between epoch and accuracy. Right: the relationship between epoch and loss

Table 1. Train dataset.

Model                 Accuracy
KNN                   70.1%
SVM                   72.5%
Logistic regression   63.2%
Tree-CNN+BiLSTM*      90%

Table 2. Test dataset.

Model                 Accuracy
KNN                   68.8%
SVM                   69.1%
Logistic regression   60.7%
Tree-CNN+BiLSTM*      85%

Table 3. Train dataset.

Model                 Accuracy
VGG-16                56.3%
ResNet (50 layers)    67.4%
CNN                   81.6%
1D CNN+LSTM           82.4%
2-layer CNN+LSTM      85%
Tree-CNN+BiLSTM*      90%

Table 4. Test dataset.

Model                 Accuracy
VGG-16                54.7%
ResNet (50 layers)    61.2%
CNN                   80.5%
1D CNN+LSTM           79.6%
2-layer CNN+LSTM      82.1%
Tree-CNN+BiLSTM*      85%

For the 1D CNN+LSTM model and the 2-layer CNN+LSTM model, the accuracies on the training set and test set are comparable, which shows that for those models the dataset is sufficient. Nonetheless, for the tree-structure CNN+BiLSTM the dataset is not sufficient: it is more complicated than 1D CNN+LSTM and 2-layer CNN+LSTM, so while our model can learn the features better, it needs more data to train.

7 Discussion
In this paper, we propose a deep learning model that can predict the usefulness of a statement for the final result. Our work can improve ATP techniques and save time in formal proof. Our model has significant generalization ability because it captures the basic characteristic of ATP: recursion. Also, the process of ATP is an ordered sequence, so we use a bidirectional LSTM; the experimental results show our model reaching 90% accuracy, 5% higher than [5]. But there are still some problems. The first is that we need more datasets to demonstrate our model's generalization ability. Unfortunately, there are few datasets in this area, so one of our future works is creating more ATP datasets. The second problem is that although the tree-structure CNN improves accuracy, it is a simple model that only contains several nodes, whereas ATP always involves a huge number of subgoals. So when the ATP task is too complicated, our model may not work well. At the same time, if we add more nodes to our model, training will become highly time-consuming, and more nodes may not improve the accuracy significantly. The third problem is that even though we improve the accuracy by 5%, there is still much room for improvement. So in the future we plan to revise the structure of the LSTM, changing its linear structure into a tree structure as well; we hope this revision will improve the accuracy.
Finally, we note an interesting example from our experiment, shown in Fig. 4.

Fig. 4. The proof relationship between the main goal and subgoals. A → B means we need A to prove B. The ellipses mean that more axioms are needed to prove D and E

Figure 4 clearly shows the proof relationship between subgoals and the main goal. A is the main goal of our proof. If we want to prove A, we need A1, A2, and A3, so A1, A2, and A3 have a positive relationship with the main goal. It is easy to see that all the leaf nodes in this tree have a positive relationship with the main goal. However, we find in our experimental results that our model classifies a leaf node in the seventh layer of this tree as having a negative relationship (the seventh layer is not shown in Fig. 4), whereas SVM gives the correct answer. From this example, we can draw a conclusion: ATP is very complex. It contains many axioms, and the relationships between them are also complex. So we need to combine traditional methods with CNNs; a CNN or an SVM alone may not be successful.

References
1. Alama, J., Heskes, T., Kühlwein, D., Tsivtsivadze, E., Urban, J.: Premise selection for mathematics by corpus analysis and kernel methods. J. Autom. Reasoning 52(2), 191–213 (2014)
2. Alemi, A.A., Chollet, F., Irving, G., Szegedy, C., Urban, J.: DeepMath – deep sequence models for premise selection. arXiv preprint arXiv:1606.04442 (2016)
3. Aspinall, D., Kaliszyk, C.: What's in a theorem name? In: Blanchette, J.C., Merz, S. (eds.) ITP 2016. LNCS, vol. 9807, pp. 459–465. Springer, Cham (2016). doi:10.1007/978-3-319-43144-4_28
Another random document with
no related content on Scribd:
way, but not quite like the town doctors; and the ministers are very
nice—” This she said in a hesitating undertone, not expressive of
hearty concurrence, and ended in a firmer voice, “but not like my
own.”
“’Deed, mem,” said Hillend, “gi’e us farmers a gude miller an’ a gude
smith, an’ we can do weel enough wi’ ony ministers or doctors that
likes to come.”
“That wasna bad for Hillend,” said Bell.
“Well, Bell,” said Mr. Walker, “I thought it rather hard on the ministers
when I first heard the story, but—” And here he gave his views of the
Non-Intrusionists with, for him, unusual fervour, and added, “Now I
quite agree with Hillend, that congregations should accept, and
welcome, and honour the ministers who are appointed over them.”
“That’s without a doubt,” said Bell; “and esteem them very highly for
their work’s sake.”
The news of Mr. Walker’s appointment to Blinkbonny
was received with first a stare, then a shrug of the “AS YOU
shoulders, then a pretty general feeling that “they LIKE IT.”
might have had worse.” He was certainly not a shining
light, but he was a nice man, had a large family, and it would be a
good change for them. And although the local poetaster circulated a
sorry effusion on the subject, in which he, without acknowledgment,
stole from Cowper’s Needless Alarm,—

“A mutton statelier than the rest,”—

and—

“His loving mate and true,


But more discreet than he, a Moorland ewe,”—

changing the original “Cambrian” to “Moorland,” it did not take, and


Blinkbonny on its personal and social and “soft” side was ready to
“entertain” Mr. Walker.
He carried the news of his own appointment to the manse, and
although it surprised Mr. Barrie at the moment, he heartily wished
him every success and comfort, and added that he would find the
manse at his service by the time he was inducted. Mr. Walker
assured Mr. Barrie that there was no hurry, as “he did not see that
they could possibly come in until after the harvest was past at
Middlemoor.”
When Bell heard that Mr. Walker was coming to
Blinkbonny, she forgot her usual good manners. “Mr. JEWS AND
Walker!—Walker o’ Middlemoor!—fat Walker’s gotten BRITHERS.
the kirk, has he? He’s a slow coach—pity the folk that
gangs to hear him; but ’deed they’ll no’ mony gang. He minds me o’
Cauldwell’s speech at the cattle show. After Sir John palavered away
about the grand stock, and praised Cauldwell for gettin’ sae money
prizes, the decent man just said, ‘Sir John and gentlemen, thank ye
a’ kindly. I’m nae hand o’ makin’ a speech. I may be a man among
sheep, but I’m a sheep among men.’” And Bell showed how
changeable human affections are; for although Mr. Walker and she
had been hand-and-glove friends, she summed up with, “Mr. Walker
will never fill Mr. Barrie’s shoon [shoes]. I never could thole[11] him an’
his filthy tobacco smoke. Ugh! ma puir kitchen will sune be in a
bonny mess; an’ I dinna ken what to think about the things in the
garden an’ outhouses that are ours, for, as Mrs. Walker ance said to
me, her motto was, ‘Count like Jews and ’gree like brithers.’”

[11] Endure.

But when the settling up came, Bell found Mrs. Walker “easy dealt
wi’,”—not only satisfied with her valuation, but very complimentary as
to the state in which everything was left, and very agreeable—very.
CHAPTER VII.
OUT OF THE OLD HOME AND INTO THE NEW.

“Confide ye aye in Providence, for Providence is kind,


An’ bear ye a’ life’s changes wi’ a calm an’ tranquil mind;
Though press’d an’ hemm’d on every side, ha’e faith an’ ye’ll win through,
For ilka blade o’ grass keps its ain drap o’ dew.”

James Ballantine.

M R. BARRIE had written to Sir John McLelland, thanking him for


his uniform kindness, and saying that he had disjoined himself
from the Established Church. He also wrote to the clerk of his
presbytery to the same effect, adding that he would leave the manse
as soon as he could.
A short time sufficed to put Knowe Park into habitable order.
Whenever this was known, Mr. Barrie was cumbered by proffers of
help from the farmers in the parish. He could have had fifty carts to
remove his furniture for one that he required; and acts or offers of
considerate attention were so showered on him that he was
embarrassed by them.
At length the day came for “flitting.” It was a fine morning in the
middle of summer,—everything was looking its best. The manse in
itself was a charming place. To Mr. and Mrs. Barrie and their children
it had been a happy home, and in their inmost hearts it was hallowed
by many tender associations; and the church was endeared to Mr.
Barrie as he recalled the pleasant meetings therein with his beloved
flock. The parting was a bitter ordeal, trying to flesh and blood, and
as such they felt it very keenly.
THE MELODY OF JOY AND PEACE.
At the hour for family worship, the men who were
taking down the furniture and making it ready for being
carted were asked to come to the “books;” and they
told afterwards that in singing the 23d Psalm their
voices quivered, and that there was a lump in their
throat as the 138th Psalm was read as the “ordinary”
for the morning, for the circumstances seemed to give additional
meaning to such parts of it as—“strengthenedst me with strength in
my soul,” “though I walk in the midst of trouble, Thou wilt revive me,”
“the Lord will perfect that which concerneth me,” “forsake not the
works of Thine own hands.”
As soon as the first cart was laden and off, Bell went to Knowe Park
to get things put rightly in and up. The three elder children had
resolved to flit their own belongings. James took his small barrow,
filled with a confused load of skates, books, etc. Mary carried her
little chair, Black Tam the negro doll, and some books and toys;
Lewie his little chair, a toy horse, and a whip. They had reached the
post office (which stood a little back from the main street), and were
resting on the broad open pavement in front of it, James sitting on
his barrow, the others in their chairs.
DR. GUTHRIE AND THE BAIRNS.
Dr. Guthrie, who had been spending a day or two in
the neighbourhood, was calling at the post office.
Soon, as his quick eye rested on the singular group,
his face became radiant with such a smile as he
could give, and which the children returned very
frankly. He went close to them, stooped down and patted Mary’s
cheek, got his hand under her chin and stroked it playfully, all the
while looking kindly in her face; then glancing at her lap, he said:
“What’s the name of that fine doll, my wee pet? is it Sambo, or
Pompey, or what?”
“That’s Black Tam,” said Mary. “It was Nellie’s doll, and I’m taking it to
our new house.”
“Nellie’s, was it? And is Nellie too old for dolls now, and has she
given it to you? He looks as if he had seen better days.”
“Oh! please, sir, Nellie’s dead,” said Mary, looking towards the
churchyard; “she’s buried over there.”
“But Bell and mamma say that Nellie’s in heaven,” said Lewie very
decidedly.
The suddenness and beauty of Lewie’s answer strongly affected Dr.
Guthrie. He took out his snuff-box and took a moderate pinch, then
clapped Lewie’s head, and said:
“Yes, my wee man, you’re right; Nellie’s in heaven. But what’s your
name?”
James now took speech in hand: “My name’s James Barrie, and this
is Mary, and this is Lewie. We’re flitting from the manse over yonder;”
and he pointed in the same direction as Mary had looked. But Dr.
Guthrie, thus suddenly brought into contact with this stern reality of
the Disruption, had again to apply to his snuff-box, and was in the
act of taking it out of his pocket when Sir John McLelland drove up to
the post office and alighted. Dr. Guthrie and he knew one another as
members of Assembly, and they shook hands cordially, Sir John
expressing surprise at seeing the doctor there.
“Sir John,” said the doctor, “excuse me,”—and he dried the tear that
was coursing down his cheek,—“do you know these children?”
Sir John had not observed the group, but he looked at them long
enough to admit of Dr. Guthrie pulling out his box, taking one good
snuff, and getting another ready for despatch in his fingers.
“Oh, yes,” said Sir John, “they are Mr. Barrie’s children;” then looking
at James: “How are mamma and papa keeping?”
The children had risen, and the boys had taken off their caps when
Sir John appeared. In answer to the question James said: “They’re
quite well, thank you, sir; we’re all going to our new house to-day;
we’re helping to flit.”
Dr. Guthrie took his reserve snuff, looked first at Sir John, then at the
children, and swinging his hand so that it pointed to the children,
then to the manse, and resting it now towards them and again
towards it, he recited with much feeling, for he seemed deeply
moved:
“From scenes like these old Scotia’s grandeur springs,
This makes her loved at home, revered abroad;
Princes and lords are but the breath of kings,
An honest man’s the noblest work of God.”

GIFF-GAFF.
By this time several of the villagers were attracted by
the scene, and they scarcely could repress the cheer
that was struggling for vent in their throats. Respect
for Sir John, however, kept it down until he drove away, when a right
hearty greeting was given to Dr. Guthrie, in whose eyes the tear still
trembled, and many pressed forward to grasp his hand,—none more
warmly than Kennedy the tailor, who, producing his snuff-box, said:
“Ye’ll excuse me, sir; I dinna ken ye, but—ye’ll excuse me, sir—but
would ye do me the honour of takin’ a snuff out of my box?”
“Certainly, my good friend,” said the doctor; “and we’ll giff-gaff,”
handing his box to the tailor, and helping himself out of Kennedy’s
dimpled, black-looking, oval-shaped tin box.
The tailor took a pinch, said it was “prime snuff,” and added: “Burns
is a great poet, and that was a grand verse you gied us the noo, and
the occasion’s worthy o’t. Mr. Barrie’s an honest man, but he’s far
mair, he’s a patriot-martyr.”
The last cartload had left the manse; there was nothing for Mr. and
Mrs. Barrie to do but lock the door and follow. They paid a farewell
visit to each room. Their footsteps sounded harshly through the
house, now empty and dreary; still they were loath to leave. When
they were fairly outside of the front door they lingered on its step;
then Mr. Barrie, with a quick “This will never do,” locked the door and
withdrew the key.
They were bracing themselves for their trying walk past the church,
past the churchyard, and through the village, when a noise, a familiar
noise, yet with an eerie wail in it, made them both start. It came from
old Tibby the cat—Nellie’s Tibby. Bell had carried her to Knowe Park
in a basket as carefully as if she had been Nellie herself, and had
shut her up in a room. When the children came, James and Mary
had got strict orders to watch her; but Tibby had beaten them all and
got off, and home and into some quiet corner of the manse, whence,
when the door was locked, she crept out, uttering her wailing protest.
“Poor Tibby,” said Mrs. Barrie, “we must take you with us.”
When the door was re-opened, Tibby was easily caught. She had
evidently felt convinced, after a bewildered ramble through the empty
house, that there was some reason for her late transportation and
imprisonment.
This little incident re-opened the floodgates of tender memories, and
forced tears from Mrs. Barrie’s eyes, although by that time the
fountain had been largely drawn upon. She felt thankful to have
something else than herself to think of; and Tibby’s presence in her
arm, tucked cosily into the corner of her shawl, served to divide her
attention, and supplied a sufficient amount of occupation to make the
walk less trying to her. She leaned heavily on Mr. Barrie’s arm, partly
from weariness, partly from excitement.
THE NEW HOME.
When they reached Knowe Park, Bell had tea set for
them in the parlour; and the children, having already
made a complete round of the whole premises, gave
at the tea-table cheering proofs that they had not lost
their appetites, as well as curious details of what they had
discovered in their ramblings over their new home.
Bell had got the bedrooms into wonderful order for their
accommodation at night, and this deprived kind neighbours of the
pleasure they would have had in “putting up” for a few nights all or
any of the family. Within a few days they all felt quite at home, and
the additional work entailed by making the manse things go as far as
they could, kept them so busy that they were surprised at their
having got over the flitting, and especially the “leaving” of the manse,
so soon and so quietly.
I did not think it possible that Bell could have wrought harder than I
had always known her to do; but she did, and soon Knowe Park was
as much to her, in as far as the garden and live stock were
concerned, as the old homestead had been. And although Guy the
beadle offered to bring out of the manse garden whatever she
wished, Bell had enough and to spare, and told Guy to use for
himself what he liked, and after that only to sell what was ripe or
“near spoiling.”
True to his trust, Guy brought her a fair sum of money obtained in
this way, which she handed to Mr. Barrie, not Mrs. Barrie as usual,
telling him how it had come. Mr. Barrie was greatly pleased with Guy
and Bell, and thanked them warmly; but to Bell’s astonishment he
handed her back the money, and said: “Give it to the poor, Bell, and
oh! let us be thankful we have something to give away.”
This was several steps in advance of Bell’s notions of what was
called for, and she spoke to Mrs. Barrie about it. Mrs. Barrie was well
aware that she would need to be very economical, but Mr. Barrie’s
“thankful to have something to give away” was so like himself, and
the money had come so unexpectedly, that she said:
“Certainly, Bell, we’ll carry out Mr. Barrie’s wishes; and when
something has thus come that we can give, let us be thankful to get
the more blessedness, for it is more blessed to give than to receive.”
Bell could not quite go in with this doctrine. She thought for a little,
and then said hesitatingly:
“Just so, mem; but you’ll surely no’ object to me selling whatever’s to
spare at Knowe Park, mem, will ye? I think less o’ what comes frae
the auld manse; an’ I’m aye gaun to ca’t that, an’ this house is to be
the manse. No’ the new manse, but the manse—the manse.”
BOTH RIGHT.
“Do as you have always done, Bell; no directions I
could give would serve you so well as your own good
sense. And I have been so unsettled by the events of
the past two months that I hardly know my own mind;
but one thing I do know, and feel—” here Mrs. Barrie’s eyes filled,
and she finished the sentence with a trembling voice, “and that is,
that you have been a sister and a mother to us all,—a Deborah and
a Ruth, a Martha and a Dorcas put together. May God reward you.”
This was nearly too much for Bell, but the necessity of getting on and
getting through was pressing her strongly. She accordingly braced
herself up, and said in a cheerful tone:
“Mrs. Barrie, I’ve gotten ower a’ my fears an’ cares o’ a worldly kind
about this kirk business, an’ I’m humbled to think that I spoke to you
an’ the minister an’ ithers as I did, an’ that I didna join the noble army
till after the battle was won; but noo,” said she with great solemnity, “I
pray that I may mak’ up for my faintin’ in the day o’ adversity by
settin’ my face like a flint to my wark,” and here she lowered her
tone. “But I’m forgettin’ mysel’, an’ we maun a’ set the stout heart to
the stey [steep] brae, an’ gather up the loins o’ our minds and heads
and hands, and no’ turn back like Lot’s wife. We’re gaun to dae fine
here: the range is very licht on the coals; an’ the hens are takin’ to
the place, an’ layin’ weel; an’ Daisy’s up to her knees in clover,” and
here Bell put on her blithest look, “an’ I never saw either Mr. Barrie or
you lookin’ better. And we maunna let it be said that we’re
‘unsettled,’ when in every sense o’ the word we’re settled, and weel
settled,—we couldna be better,—we’re just real weel set.”
Bell’s hearty speech put Mrs. Barrie into good spirits. She left the
kitchen with a smile on her lip and a warm thought in her heart,
which found expression as she walked through the lobby in “Thank
God for Bell!”
SCIENCE AND POULTRY.
Bell was contentedly happy because she was
constantly busy, and her schemes prospered. From
the day Mr. Barrie had hinted at the possibility of their
leaving the manse, she set herself to contrive if by any
means she could be more than ever one of the bread-
winners, and her first attempt was on the hens. Some one had told
her about the increased yield of eggs which Sir John’s henwife had
got by some changes she had made in the food and treatment of her
poultry. Bell adopted the new system, and improved on it. She
succeeded far beyond her expectations, and with a happy face told
me of her luck one afternoon when she was ordering some
peppercorns and other spices, with which to experiment still further
on a notion of her own.
“I’ve been trying different plans wi’ my hens. I first gied them dry
grain, and they did but middlin’; then I gied them rough meal, an’
they did better; syne I boiled their meat, an’ put a ‘curn’[12] o’ spice
in’t, an’ they did splendid—far mair than paid for the extra meat; then
I got a cracknel frae the candlemaker (ane o’ yon dark, cheese-
lookin’ things that they make out o’ the rinds o’ fat, an’ skins, an’ sic
like that comes out o’ their tallow), and boiled a bit o’ it among their
meat, and the result was extraordinary; they just laid on an’ on till
they actually reduced themselves to fair skeletons. I was fair
affronted to see them about the place, an’ I had to gi’e them a rest
an’ change their victuals. Now I try to mix their meat so as to get
them baith to lay weel an’ to be size for the table. But ye’ll hae seen
what grand eggs I’ve been sending to yoursel’, an’ how mony mair
than before?”

[12] A small quantity.

I knew that to be the case, and said so. Bell continued:
“But besides that, early in the spring I got some settings o’ eggs that
they say are a grand kind, and the birds are a gude size a’ready. I
got them from Dan Corbet, an’ so I wadna like to say very muckle
about them, for Dan’s no’ aye to lippen[13] to. ’Deed, since we’ve
come to live nearer him, I’m no sae high about them, for he has a
vermin o’ game-cocks about him, and they whiles cross the north
park and fecht wi’ mine—they’re a fair torment.”

[13] Trust.

Dan Corbet was a “queer mixture.” He was a native of Blinkbonny,
but had been out of the parish for several years; report said he had
been a smuggler on the west coast of Scotland. He returned to his
native parish about the year 1820, with scars on his face, and
without one of his eyes, which gave him a sinister look. For some
years he had been night-watchman in the churchyard, as the
outrageous custom of violating the sanctity of the grave in order to
procure subjects for surgical demonstration and actual use in
teaching anatomy had sent a thrill of horror over Scotland, and had
led to the systematic watching of churchyards by at least two
individuals every night. Dan was the paid regular watchman, and at
least one or more respectable householders by turns watched with
him. Dan’s reckless character fitted him for the dreary post; and
although none of those who watched with him respected him, they
found that he was always wakeful, and, in the matter in hand,
trustworthy.
THE DUMMIE DOCTOR.
When the night watching was given up, Dan
maintained himself by doing on a larger scale the odd
sorts of jobs which he had sometimes taken in hand in
order to add to his salary as watchman, or “dummie
doctor,” as he was called. My older readers will
remember with what feelings of indignation the resurrectionists or
dummie doctors (for these were the names given to the violators of
the graves) were spoken of, and that after their disappearance the
odious name, “dummie doctor,” sometimes stuck to the watchman.
Dan acted amongst the surrounding farmers as butcher, mole-
catcher, rat-catcher, and, in a rough way, as a veterinary surgeon;
was employed as extra hand at sheep-shearings, corn-threshings,
etc. He was a regular attender of local cattle markets, fairs, races,
and games; a good and keen fisher, and strongly suspected of being
a poacher, but never convicted. He was a wiry, spare, athletic man of
about 5 feet 11 inches high, with a weatherbeaten countenance, thin
grizzled hair, and a long stride. He lived in a cottage, divided by a
single park-breadth from Knowe Park, and kept a perfect menagerie
of dogs, ferrets, goats, and fowls—the latter being principally game
sorts. His favourite pastime was cock-fighting; but it was, to Dan’s
great regret, being discountenanced and put down. He had a variety
of surnames; “the Corbie,” as a contraction of his own name, was the
most common, but he was known as the “Mowdie” (mole), the “Rat,”
the “Doctor,” the “Vet.,” and “Gemmie,” as well as the “Dummie
Doctor” or “Dummie.”
The eggs he had given to Bell were not from his stock, but had been
got in exchange for some of these; and as he had sometimes been
employed by Bell as a butcher, there was a trade connection
between them, but the intimacy had been purely “professional,” as
Dan, in the matter of social position or religion, was looked on as
quite an outcast; and the description of him, in this respect, ranged
from “a poor creature” to “an awfu’ man.”
Dan had got a setting of eggs from a very rare strain of game fowls,
and had been loud in laying off their properties to his cronies, some
of whom, on the night that Dan “set” them, took them carefully from
under the hen and put ducks’ eggs in their place; they then crossed
the field, got over Knowe Park wall, and put Dan’s eggs under one of
Bell’s “clockers,”[14] using every precaution not to injure the eggs, as
well as to avoid detection.

[14] Clucking hens.

Dan waited long and wearily for his expected brood; he looked for
them on the reckoned day, but it passed, and the next, and the next,
until a full week had elapsed, and still no birds. Early on the eighth
morning he determined to “pitch” the eggs away, and was angrily
stooping down to lift off the hen, which, although it was a great
favourite and a “splendid sitter,” would have had a rough toss and a
long one, when he heard a cheep.
HIDDEN TREASURES.
The welcome sound was marrow to his bones. “Eh!”
was his first exclamation; “what’s that? is’t possible
after a’?” He heard more cheeping. “Isn’t it a gude
thing I’ve been sae patient?” Then looking at the hen,
which, but a minute before, he was preparing to use
very roughly, he said, “Eh, grannie, grannie, ye’re the best clocker in
the county; eh, my auld darlin’, my queen o’ beauty, ye’ll no’ want
your handfu’ o’ groats for this—I’ll gi’e ye a peck; jist anither day,
grannie, an’ ye’ll get oot wi’ yer darlin’s, ye ace o’ diements!”
The cheeping had now become very decided, and Dan, again
addressing grannie, said: “Sit on, my flower o’ the flock, my fail-me-
never, hap[15] the giant-killers wi’ yer bonnie, golden, cosy feathers
just till the nicht, till their wee jackets an’ glancin’ spurs are dry; an’
I’ll bring a’ the neebors about seven o’clock when they come hame,
and I’ll open the door, an’ ye’ll march out like Wellington at the head
o’ the Scotch Greys at Waterloo; and will they no’ stare when they
see your sturdy family following ye like the Royal Artillery?”

[15] Cover carefully.

He then locked the door, and “warned” his cronies and neighbours to
come “sharp seven,” and they would see something really worth their
while.
Dan was in the fidgets all afternoon. Shortly before seven o’clock a
small crowd had gathered in his garden, to which Dan told the
pedigree of the birds, and spoke of their qualities in the most glowing
terms.
“Let’s see them, Dan,” said several voices; “let’s see them.”
“I’m waiting for Watty,” said Dan; and turning to a boy, said, “Gang to
the house-end, ma man, an’ see if he’s no’ comin’;” then addressing
his visitors, he said, “Watty’s the only man that I’m feared for in this
district; his birds hae beaten mine owre often; I’ll tether him noo, or
I’m cheated.”
As Dan finished this speech, Watty, a queer-looking customer
wearing a hairy skull-cap, smoking a short black pipe, and with both
hands in his pockets, joined the gathering. He gave a side nod to
Dan, and said “Hoo’s a’?” to the company.
’TWIXT THE CUP AND THE LIP.
“Noo for the show!” said Dan, as he unlocked the
hen-house (it was coal-house, goat-house, and
served various other purposes), and flung the door
wide open, saying, “Come awa’, grannie, wi’ your
‘royal family.’ There’s a pictur’, men, for ye.”
Grannie’s family had been restless, because hungry and particularly
thirsty, and she and they obeyed Dan’s summons with great
readiness and even haste.
Watty, who had till then smoked on in silence, quickly took the pipe
out of his mouth, stooped a little, shaded his eyes with one hand,
and seemed sadly puzzled. His first remark was:
“Man, Dan, they’ve awfu’ braid nebs” (broad bills).
“Braid nebs, or no’ braid nebs,” said Dan, “the game’s there onyway.”
“May be,” said Watty, “but they have maist awfu’ braid nebs,” for by
this time he and all the onlookers had “smelt a rat;” “and in ma
opinion they’re jucks.”
“Ye’re a juck!” said Dan, looking at him fiercely.
“Dinna look at me, Dan, look at them; look at their nebs, look at their
wab-feet—is thae no jucks?”
A second glance revealed to Dan that this was too true.
Roars of laughter, which only such an audience can give, ensued, in
which “Braid nebs,” “Gemm jucks,” “Grannie’s royal family,” “Tether
Watty,” were heard amidst the noisy peals of the uncontrolled and
apparently uncontrollable merriment.
Dan looked unutterable things; his face was one of dismal agony. He
took side glances at the crowd; each followed by a long look—a
perplexed, vindictive look—at the ducklings; whilst all the while the
crowd waxed merrier, and laughed louder as they saw his miserable,
heartbroken countenance.
Watty stooped down to lift a duckling, saying at the same time, “Man,
Dan, have ye lost your sicht? Div ye no’ see that thae’s jucks? Look
at their nebs, their feet, their size; hear their weet-weet;” but
“Grannie” barred the pass, flew at his hand, and pecked it sharply.
This revived the sorely afflicted Dan, and rousing himself, he said,
“Weel dune, grannie!” which the crowd received with a cheer and a
very loud laugh.
One of the onlookers, wishing to soothe Dan, said: “Jucks are as
gude as hens ony day, Dan; an’ they’re healthy-like birds.”
“You ignorant gomeral![16] you senseless blockhead! you born idiot!”
said Dan, his excitement increasing as he proceeded; “jucks like
game-cocks! jucks like the kind o’ game-cocks that should ha’ been
there, that were set by my ain hands! haud yer bletherin’[17] tongue.
Somebody’s been puggyin’[18] me. If I kent wha dared to tak’ their
nap[19] aff me, I wad gi’e them what they wad mind a’ their days; I
would fell them!”

[16] Stupid fellow.

[17] Foolish talking.

[18] Playing monkey tricks.

[19] Fun.

A large crowd had now collected in Dan’s garden, and when the
new-comers heard the cause of the merriment, they joined in it and
kept it up.
LET SLEEPING DOGS LIE.
“What are ye a’ doin’ laughin’ there at, like
heeawnies [hyenas]? Out o’ this, every one o’ ye, or
I’ll gar some o’ ye laugh on the ither side o’ yer lug
[ear]!” said Dan, looking daggers.
“Lock them up, Dan, for fear the witches change them into turkeys,”
said one of the crowd.
This made Dan furious: he seized an old spade which lay on the top
of his hen-house, and vowed that he “would fell ony man that said
another word.”
“If ye can catch him,” said a waif, with a knowing wink; and he made
off as fast as he could.
“If I can what?” said Dan. “I believe you’re the vagabond that’s
puggied me, and I’ll catch ye, supple an’ a’ as ye think ye are!”
Dan started, holding the spade over his head, fury in his eye,
vengeance in his heart. The crowd saw that his blood was up, and
cried, “Run, run, run for your very life!”
The man got into the field that lay between Dan’s cottage and Knowe
Park; Dan followed, as did also many of the crowd. The pursued
man, repenting of his rashness, and fearing the worst, as well he
might, made straight for Knowe Park wall.
Bell had heard the laughter when milking Daisy; Mr. and Mrs. Barrie
had heard it when taking an evening stroll in the garden, and all
three were standing at the wall wondering what could cause it, as the
laughter was unusually boisterous. They saw the chase begin. The
flying man observed Mr. Barrie, and made toward him as to a city of
refuge. When Mr. Barrie saw Dan rushing on, so dangerously armed
and so furious, he cried loudly, “Stop, Corbett! stop! I command you.”
This made Dan slacken his pace and lower his spade, but he walked
sulkily on with the crowd, saying, “I’m no’ dune wi’ him yet. I’ll gi’e
him’t for this yet.—Wait a wee, just wait a wee,” until they came to
the wall of the garden.
“Whatever is all this about?” said Mr. Barrie. “What’s wrong, Corbett,
that you are so furious?”
“A’s wrang, sir, a’s wrang. I’ve been rubbit [i.e. robbed], an’ insulted,
an’ chagareened by that—” It took Dan a little time to select an
epithet strong enough for the occasion, and at the same time fit for
the minister’s ears. This was a difficult matter; many rushed to his
tongue-end, strong, withering, seasoned; undoubtedly, had it not
been for Mr. Barrie, he would have fired them off in a volley, and
greatly relieved himself thereby. At length he hurled out, “that
unhanged vagabond, he’s puggied me, but—”
Mr. Barrie looked at Dan, and said, “Stop, Corbett, say no more till
your passion cools;” then turning to the crowd he said, “What is the
cause of this unseemly uproar?”
PROBING THE WOUND.
Watty and several others began to explain the affair,
but every one that attempted it had to stop after saying
a word or two; even the offending man, although now
quite safe, was unable to get beyond “Dan set hens’
eggs” for laughing, and every man in the field was
writhing in fits and contortions, through excessive laughter, with the
exception of Dan, on whom the laughter was telling like oil on a
flame.
Mr. Barrie looked at Dan, and seeing that he was becoming even
more ferocious, said calmly: “Corbett, from the behaviour of the
crowd I suspect they have been playing some trick on you, and they
evidently have succeeded to their entire satisfaction, but to your
great annoyance. Please tell me really what has excited you.”
Dan told his story. The laughter was quite as general, but became
more distant as he proceeded, for whilst telling his tale he scowled
on the “grinning baboons,” as he called them, and clutched his
spade angrily, which still further widened the circle. Although Mr.
Barrie remained grave, Mrs. Barrie could not but laugh quietly, and
Bell, sheltered by an evergreen shrub, did so heartily, repeating,
“Well, I never!” All at once she stopped, thought a little, then saying
to herself, “That explains it,” she came close to the wall at the point
where Dan stood, and said: “There’s a brood o’ chickens, lang-leggit,
sharp-nebbit things, come to me that I never set; they’re maybe
yours, they’re no ours—they’re come-o’-wills.”
“What!” said Dan; “whan did they come out?”
“This day week exactly.”
“Let’s see them. Come in, Watty, an’ gie’s your skill o’ them,” said
Dan, with a happier but still nervous face; then addressing himself to
Bell, he said: “Hoo mony came oot?”
“Eleven out o’ thirteen; there were twa eggs did naething.”
“That’s very gude; that’s grand!” said Dan, who was already climbing
the wall to get in.
“Had ye no’ better wait till the morn’s mornin’?” said the considerate
Bell. “They’re a’ shut up for the nicht, an’ cosy under their mother’s
wing; ye’ll disturb them, puir things.”
“I maun see them the nicht; I’ll no’ live if I dinna see them the noo,
but I’ll be real canny wi’ them. Come on.”
BETTER LO’ED YE CANNA BE.
Dan, Watty, and Bell went to the “cavie” or hencoop,
folded back the old bag which had been dropt over the
front of it to keep the inmates warm, and Dan saw to
his intense delight two little heads peeping from under
their feathery covering. His educated although single
eye at once settled the kind: “Game, game, every inch o’ them, and
baith cocks!” Then turning to his crony he said: “Watty, you’ll lift the
hen canny, canny, an’ I’ll tak’ stock.”
The result was “six cocks an’ five hens, the real true-blue breed,”
declared by Dan, and confirmed by Watty, with the addition of, “Dan,
ye’re rich noo.”
Bell would not hear of them being shifted that night, and ultimately
persuaded Dan to “leave them wi’ her hen till they were pickin’ for
themselves; she would take care o’ them, an’ nae cats could get
near them, for she had just gotten new nets.”
Dan got Bell to take the ducks,—“he couldn’t bear them; there was
nae water for them; his fowls wad dab them till there was no’ ane
left; it wad be a great obleegement to him.”
When Dan got home he could not rest; he smartly took down his
fishing-rod and strode to the waterside. The evening air cooled him,
and he was further consoled by a good take. Under the “bass” (straw
door-mat) at Knowe Park kitchen door next morning, Bell found a
ten-pound salmon and three good large trouts—possibly they had
not passed the water-bailiffs. Bell looked at all sides of the question
of “what to do with them?” Many difficulties presented themselves to
her honest, correct mind, and as the greatest of these was, “What
else could she do with them?” she took in the foundlings and used
them well.
There was a little coming and going between Bell and Dan, until the
chickens were able to shift for themselves. When that was the case,
he carried them carefully over to his own house, and shared it with
them for a few months. The ducklings throve with Bell, and she
repaid Dan for them and the fish (for she found out that her guess as
to its having come from Dan was correct) in several ways, but
principally by occasional dozens of her “buttered” eggs. When eggs
were abundant, and therefore cheap, she preserved a large quantity
by rubbing them when newly laid with a very little butter all over, and
keeping them in salt. It was generally thought that she had some
special receipt or “secret,” for her buttered eggs had a fresh, curdy,
rich flavour that few preservers could attain to.
A penurious old maid had complained to Bell that “she did not
understand her hens; she was quite provoked at them, because in
the summer-time, when eggs were only sixpence the dozen, they
laid lots, but in the winter-time, when they were more than double
that price, they would not lay at all.”
CATCHING A TARTAR.
Bell’s reply was: “I daresay no’; but ’deed, mem, ye’ll
need to baith feed them better, an’ keep them
cleaner and cosier, or they’ll do but little for you.”
The nicknames by which Dan had formerly been distinguished were,
after the affair of the ducklings, dropt entirely out of use, and he was
thereafter spoken of as “Braidnebs,” although none could use it in his
hearing with impunity.
Thomas Scott, the farmer of Babbie’s Mill, a forward ill-bred man,
was speaking in the market to Mr. Taylor, the elder already referred
to in these “Bits.” Dan chanced to pass near them, and the miller
said, loud enough for him and the most of the folks about the cross
to hear him, “Braidnebs or no’ braidnebs, the game’s there onyway.”
Dan scowled at the miller, and tried to suppress his rage. In his own
words, “I tried to steek[20] my mouth, but there was a rattlin’ in my
throat like to choke me. I lookit at Mr. Taylor. He kent,[21] ’deed a’body
kent, that the miller’s wife was a yammerin’[22] petted cat, an’ I said,
‘Maister Taylor, there’s a big bubblyjock[23] gangs about Babbie’s Mill
yonder, but he’s dabbit[24] to death wi’ a hen.’”

[20] Shut.

[21] Knew.

[22] Grumbling.

[23] Turkey-cock.

[24] Pecked.

Poor “Babbie’s Mill” was well known to be “hen-pecked” at home,
and the laugh was so cleverly, so deservedly, so daringly turned
against him, that he was nonplussed for a little; but he screwed up
his courage, and tried to look disdainfully at Dan. Dan’s single eye
was glaring at him, and the blank socket of his other eye was
twitching nervously. The miller looked bold, and said: “Go about your
business, ye ill-tongued scoundrel!”
“Ye what?” shrieked Dan, going close up to the miller, who stept back
and tried to move off; but Dan followed him closely, and poured out,
in a voice compounded of bawling, howling, and hissing, whilst all
the while his arm moved quickly up and down: “What did ye say?—
ill-tongued? Wha has as ill a tongue as yoursel’, if it be na your wife?
Ye’ll daur to insult a man in the middle o’ the street that wasna
meddlin’ wi’ you, an’ then speak o’ him being ill-tongued! Gae hame
to Babbie’s Mill an ‘clapper’ there like yer auld mill, an’ tak’ double
‘mouters’[25] out o’ ither folk’s sacks to fill yer ain. Ye’re no’ mealy-
mooed [mouthed] though ye’re a miller; dicht the stour aff your ain
tongue before ye try to mend ither folks. You should be the last man
to ca’ onybody a scoundrel; them that meets ye in the market wad
think butter wadna melt in yer mouth, but let them gang to Babbie’s
Mill an’ they’ll find ye can chew gey hard beans. What d’ye think o’
