Fakhri Karray
Aurélio Campilho
Farida Cheriet (Eds.)
LNCS 10317

Image Analysis
and Recognition
14th International Conference, ICIAR 2017
Montreal, QC, Canada, July 5–7, 2017
Proceedings

Lecture Notes in Computer Science 10317
Commenced Publication in 1973
Founding and Former Series Editors:
Gerhard Goos, Juris Hartmanis, and Jan van Leeuwen

Editorial Board
David Hutchison
Lancaster University, Lancaster, UK
Takeo Kanade
Carnegie Mellon University, Pittsburgh, PA, USA
Josef Kittler
University of Surrey, Guildford, UK
Jon M. Kleinberg
Cornell University, Ithaca, NY, USA
Friedemann Mattern
ETH Zurich, Zurich, Switzerland
John C. Mitchell
Stanford University, Stanford, CA, USA
Moni Naor
Weizmann Institute of Science, Rehovot, Israel
C. Pandu Rangan
Indian Institute of Technology, Madras, India
Bernhard Steffen
TU Dortmund University, Dortmund, Germany
Demetri Terzopoulos
University of California, Los Angeles, CA, USA
Doug Tygar
University of California, Berkeley, CA, USA
Gerhard Weikum
Max Planck Institute for Informatics, Saarbrücken, Germany
More information about this series at http://www.springer.com/series/7412
Fakhri Karray Aurélio Campilho

Farida Cheriet (Eds.)

Image Analysis
and Recognition
14th International Conference, ICIAR 2017
Montreal, QC, Canada, July 5–7, 2017
Proceedings

Editors
Fakhri Karray
University of Waterloo
Waterloo, ON, Canada

Aurélio Campilho
University of Porto
Porto, Portugal

Farida Cheriet
Polytechnique Montréal
Montreal, QC, Canada

ISSN 0302-9743 ISSN 1611-3349 (electronic)


Lecture Notes in Computer Science
ISBN 978-3-319-59875-8 ISBN 978-3-319-59876-5 (eBook)
DOI 10.1007/978-3-319-59876-5

Library of Congress Control Number: 2017943053

LNCS Sublibrary: SL6 – Image Processing, Computer Vision, Pattern Recognition, and Graphics

© Springer International Publishing AG 2017


This work is subject to copyright. All rights are reserved by the Publisher, whether the whole or part of the
material is concerned, specifically the rights of translation, reprinting, reuse of illustrations, recitation,
broadcasting, reproduction on microfilms or in any other physical way, and transmission or information
storage and retrieval, electronic adaptation, computer software, or by similar or dissimilar methodology now
known or hereafter developed.
The use of general descriptive names, registered names, trademarks, service marks, etc. in this publication
does not imply, even in the absence of a specific statement, that such names are exempt from the relevant
protective laws and regulations and therefore free for general use.
The publisher, the authors and the editors are safe to assume that the advice and information in this book are
believed to be true and accurate at the date of publication. Neither the publisher nor the authors or the editors
give a warranty, express or implied, with respect to the material contained herein or for any errors or
omissions that may have been made. The publisher remains neutral with regard to jurisdictional claims in
published maps and institutional affiliations.

Printed on acid-free paper

This Springer imprint is published by Springer Nature


The registered company is Springer International Publishing AG
The registered company address is: Gewerbestrasse 11, 6330 Cham, Switzerland
Preface

ICIAR 2017 was the 14th edition in the series of annual conferences on image analysis
and recognition, offering a forum for participants to interact and present their latest
research contributions in theory, methodology, and applications of image analysis and
recognition. ICIAR 2017, the International Conference on Image Analysis and
Recognition, was held in Montreal, Canada, during July 5–7, 2017. ICIAR is organized by
AIMI, the Association for Image and Machine Intelligence, a not-for-profit organization
registered in Ontario, Canada.
We received a total of 133 papers from 33 countries. Before the review process, all
the papers were checked for similarity using a comparison database of scholarly work.
The review process was carried out by members of the Program Committee and other
reviewers. Each paper was reviewed by at least two reviewers, and checked by the
conference chairs. A total of 73 papers were finally accepted and appear in these
proceedings. We would like to express our gratitude to the authors for their contri-
bution, and we thank the reviewers for the careful evaluation and feedback provided to
the authors. It is this collective effort that resulted in the strong conference program and
high-quality proceedings.
We were very pleased to include three outstanding keynote talks: “The Role of
Ultrasound and Augmented Reality for Intra-cardiac Beating Heart Interventions” by
Terry Peters, Western University, Canada; “Matrix-Tensor Representation and Deep
Learning in Large-Scale Image Data Analysis” by Tien Bui, Concordia University,
Canada; and “Discovering Deep Knowledge from Biosequence and Temporal
Sequence Data” by Andrew K. C. Wong, University of Waterloo, Canada. We have
also included in this year's program a special session on Machine Learning for
Medical Image Computing, organized by Alex Wong of the University of Waterloo,
Canada, and Farzad Khalvati of the Sunnybrook Research Institute in Toronto, Canada.
We would like to express our gratitude to the keynote speakers and the special session
organizers for accepting our invitation to share their vision and recent advances in their
areas of expertise.
We would like to thank Khaled Hammouda, the webmaster of the conference, for
maintaining the website, managing the registrations, interacting with the authors, and
preparing the proceedings. We are also grateful to Springer’s editorial staff, for sup-
porting this publication in the LNCS series. We also would like to acknowledge the
members of the local Organizing Committee for their assistance and support during the
organization of the conference, and the École Polytechnique de Montreal for making
the venue available.

Finally, we were very pleased to welcome all the participants to ICIAR 2017 in
Montreal. For those who were not able to attend, we hope this publication provides a
good view into the research work presented at the conference, and we look forward to
meeting you at the next ICIAR conference.

July 2017 Fakhri Karray


Aurélio Campilho
Farida Cheriet
Organization

General Chairs

Fakhri Karray
University of Waterloo, Canada
karray@uwaterloo.ca

Aurélio Campilho
University of Porto, Portugal
campilho@fe.up.pt

Farida Cheriet
École Polytechnique de Montréal, Canada
farida.cheriet@polymtl.ca

Webmaster

Khaled Hammouda
Waterloo, Ontario, Canada
khaledh@aimiconf.org

Supported by

AIMI – Association for Image and Machine Intelligence

Department of Electrical and Computer Engineering


Faculty of Engineering
University of Porto
Portugal

CPAMI – Centre for Pattern Analysis and Machine Intelligence


University of Waterloo
Canada

Center for Biomedical Engineering Research


INESC TEC - INESC Technology and Science
Portugal

Program Committee
A. Abate University of Salerno, Italy
J. Alba-Castro University of Vigo, Spain
E. Alegre University of Leon, Spain
L. Alexandre University of Beira Interior, Portugal
H. Araujo University of Coimbra, Portugal
A. Barkah University of Waterloo, Canada
J. Barron University of Western Ontario, Canada
J. Batista University of Coimbra, Portugal
S. Bedawi University of Waterloo, Canada
J. Boisvert CNRC, Ottawa, Canada
G. Bonnet-Loosli UCA, Université Clermont Auvergne, France
F. Camastra University of Naples Parthenope, Italy
M. Camplani University of Bristol, UK
J. Cardoso INESC TEC and University of Porto, Portugal
M. Coimbra University of Porto, Portugal
A. Dawoud University of Southern Mississippi, USA
J. Debayle Ecole Nationale Supérieure des Mines de Saint-Etienne
(ENSM-SE), France
G. Doretto West Virginia University, USA
L. Duong École de technologie supérieure, Canada
M. El-Sakka University of Western Ontario, Canada
P. Fallavollita University of Ottawa, Canada
A. Farahat Hitachi America, Ltd. R&D, USA
J. Fernandez CNB-CSIC, Spain
R. Fisher University of Edinburgh, UK
D. Frejlichowski West Pomeranian University of Technology, Szczecin,
Poland

G. Giacinto University of Cagliari, Italy


V. Gonzalez-Castro University of Leon, Spain
G. Grossi University of Milan, Italy
L. Heutte Université de Rouen Normandie, France
T. Hurtut Polytechnique Montreal, Canada
L. Igual University of Barcelona, Spain
D. Jin National Institutes of Health, USA
F. Khalvati University of Toronto, Canada
A. Khamis CPAMI, University of Waterloo, Canada
Y. Kita National Institute AIST, Japan
A. Kong Nanyang Technological University, Singapore
M. Koskela CSC, IT Center for Science Ltd., Finland
A. Kuijper TU Darmstadt and Fraunhofer IGD, Germany
P. Langlois Polytechnique Montreal, Canada
H. Lombaert École de technologie supérieure, Canada
J. Lorenzo-Ginori Universidad Central Marta Abreu de Las Villas, Cuba
A. Marçal University of Porto, Portugal
J. Marques Instituto Superior Tecnico, Portugal
M. Melkemi University of Haute Alsace, France
A. Mendonça University of Porto, Portugal
M. Mignotte University of Montreal, Canada
P. Morrow University of Ulster, UK
M. Nappi University of Salerno, Italy
H. Ogul Baskent University, Turkey
C. Ou University of Waterloo, Canada
C. Ouali University of Waterloo, Canada
M. Penedo CITIC, Universidade da Coruña, Spain
P. Pina Instituto Superior Técnico, Universidade de Lisboa,
Portugal
A. Pinho University of Aveiro, Portugal
J. Pinto Instituto Superior Técnico, Portugal
L. Prevost ESIEA (Ecole d’ingénieurs du monde numérique), France
P. Radeva Universitat de Barcelona, CVC, Spain
B. Remeseiro Universitat de Barcelona, Spain
H. Ren Samsung Research USA, USA
J. Rouco INESC TEC, Portugal
K. Roy North Carolina A&T State University, USA
A. Sappa Computer Vision Center, Spain
G. Schaefer Loughborough University, UK
P. Scheunders University of Antwerp, Belgium
L. Seoud CNRC, Ottawa, Canada
J. Sequeira Ecole Supérieure d’Ingénieurs de Luminy, France
J. Silva University of Porto, Portugal
J. Sousa Instituto Superior Técnico, Portugal
S. Sural Indian Institute of Technology, India
A. Taboada-Crispi Universidad Central Marta Abreu de Las Villas, Cuba

X. Tan Nanjing University of Aeronautics and Astronautics, China


J. Tavares University of Porto, Portugal
A. Torsello Università Ca’ Foscari Venezia, Italy
A. Uhl University of Salzburg, Austria
M. Vento Università di Salerno, Italy
E. Vrscay University of Waterloo, Canada
Z. Wang University of Waterloo, Canada
M. Wirth University of Guelph, Canada
A. Wong University of Waterloo, Canada
X. Xie Swansea University, UK
J. Xue University College London, UK
P. Yan Philips Research, USA
P. Zemcik Brno University of Technology, Czech Republic
H. Zhou Queen’s University Belfast, UK
R. Zwiggelaar Aberystwyth University, UK

Additional Reviewers
H. Bendaoudi Polytechnique Montreal, Canada
C. Chevrefils Polytechnique Montreal, Canada
X. Clady UPMC, France
A. Cunha University of Trás-os-Montes-e-Alto-Douro, Portugal
A. Elmogy University of Tanta, Egypt
R. Farah Polytechnique Montreal, Canada
A. Galdran INESC TEC Porto, Portugal
F. Girard Polytechnique Montreal, Canada
M. Hortas University of A Coruña, Spain
C. Kandaswamy University of Porto, Portugal
Y. Miao University of Waterloo, Canada
F. Monteiro Polytechnic Institute of Bragança, Portugal
P. Negri UADE, Argentina
J. Novo University of A Coruña, Spain
H. Oliveira INESC TEC, Portugal
N. Rodriguez Universidade da Coruña, Spain
L. Teixeira University of Porto, Portugal
L. Zhang Hong Kong Polytechnic University, Hong Kong,
SAR China
F. Zhou Tsinghua University, China
Contents

Machine Learning in Image Recognition

A Weight-Selection Strategy on Training Deep Neural Networks


for Imbalanced Classification . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 3
Antonio Sze-To and Andrew K.C. Wong

End-to-End Deep Learning for Driver Distraction Recognition . . . . . . . . . . . 11


Arief Koesdwiady, Safaa M. Bedawi, Chaojie Ou, and Fakhri Karray

Deep CNN with Graph Laplacian Regularization for Multi-label


Image Annotation . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 19
Jonathan Mojoo, Keiichi Kurosawa, and Takio Kurita

Transfer Learning Using Convolutional Neural Networks


for Face Anti-spoofing. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 27
Oeslle Lucena, Amadeu Junior, Vitor Moia, Roberto Souza,
Eduardo Valle, and Roberto Lotufo

Depth from Defocus via Active Quasi-random Point Projections:


A Deep Learning Approach . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 35
Avery Ma, Alexander Wong, and David Clausi

Machine Learning for Medical Image Computing

Discovery Radiomics via a Mixture of Deep ConvNet Sequencers


for Multi-parametric MRI Prostate Cancer Classification . . . . . . . . . . . . . . . 45
Amir-Hossein Karimi, Audrey G. Chung, Mohammad Javad Shafiee,
Farzad Khalvati, Masoom A. Haider, Ali Ghodsi, and Alexander Wong

Discovery Radiomics for Pathologically-Proven Computed Tomography


Lung Cancer Prediction . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 54
Devinder Kumar, Audrey G. Chung, Mohammad J. Shaifee,
Farzad Khalvati, Masoom A. Haider, and Alexander Wong

Left Ventricle Wall Detection from Ultrasound Images Using Shape


and Appearance Information . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 63
Gerardo Tibamoso, Sylvie Ratté, and Luc Duong

Probabilistic Segmentation of Brain White Matter Lesions


Using Texture-Based Classification . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 71
Mariana Bento, Yan Sym, Richard Frayne, Roberto Lotufo,
and Letícia Rittner

A Machine Learning-Driven Approach to Computational Physiological


Modeling of Skin Cancer . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 79
Daniel S. Cho, Farzad Khalvati, David A. Clausi, and Alexander Wong

Ejection Fraction Estimation Using a Wide Convolutional Neural Network. . . . 87


AbdulWahab Kabani and Mahmoud R. El-Sakka

Fully Deep Convolutional Neural Networks for Segmentation


of the Prostate Gland in Diffusion-Weighted MR Images . . . . . . . . . . . . . . . 97
Tyler Clark, Alexander Wong, Masoom A. Haider, and Farzad Khalvati

Image Enhancement and Reconstruction

Compensated Row-Column Ultrasound Imaging System


Using Three Dimensional Random Fields. . . . . . . . . . . . . . . . . . . . . . . . . . 107
Ibrahim Ben Daya, Albert I.H. Chen, Mohammad Javad Shafiee,
Alexander Wong, and John T.W. Yeow

Curvelet-Based Bayesian Estimator for Speckle Suppression


in Ultrasound Imaging . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 117
Rafat Damseh and M. Omair Ahmad

Object Boundary Based Denoising for Depth Images. . . . . . . . . . . . . . . . . . 125


Mayoore S. Jaiswal, Yu-Ying Wang, and Ming-Ting Sun

A Note on Boosting Algorithms for Image Denoising . . . . . . . . . . . . . . . . . 134


Cory Falconer, C. Sean Bohun, and Mehran Ebrahimi

Image Segmentation

K-Autoregressive Clustering: Application on Terahertz Image Analysis . . . . . 145


Mohamed Walid Ayech and Djemel Ziou

Scale and Rotation Invariant Character Segmentation from Coins . . . . . . . . . 153


Ali K. Hmood, Tamarafinide V. Dittimi, and Ching Y. Suen

Image Segmentation Based on Solving the Flow in the Mesh


with the Connections of Limited Capacities . . . . . . . . . . . . . . . . . . . . . . . . 163
Michael Holuša, Andrey Sukhanov, and Eduard Sojka

Motion and Tracking

Exploiting Semantic Segmentation for Robust Camera


Motion Classification. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 173
François-Xavier Derue, Mohamed Dahmane, Marc Lalonde,
and Samuel Foucher

An Event-Based Optical Flow Algorithm for Dynamic Vision Sensors. . . . . . 182


Iffatur Ridwan and Howard Cheng

People’s Re-identification Across Multiple Non-overlapping Cameras


by Local Discriminative Patch Matching . . . . . . . . . . . . . . . . . . . . . . . . . . 190
Rabah Iguernaissi, Djamal Merad, and Pierre Drap

3D Computer Vision

Hybrid Multi-modal Fusion for Human Action Recognition . . . . . . . . . . . . . 201


Bassem Seddik, Sami Gazzah, and Najoua Essoukri Ben Amara

Change Detection in Urban Streets by a Real Time Lidar Scanner


and MLS Reference Data . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 210
Bence Gálai and Csaba Benedek

Creating Immersive Virtual Reality Scenes Using a Single RGB-D Camera . . . 221
Po Kong Lai and Robert Laganière

Sunshine Hours and Sunlight Direction Using Shadow Detection in a Video. . . 231
Palak Bansal, Chao Sun, and Won-Sook Lee

People-Flow Counting Using Depth Images for Embedded Processing . . . . . . 239


Guilherme S. Soares, Rubens C. Machado, and Roberto A. Lotufo

Salient Object Detection in Images by Combining Objectness Clues


in the RGBD Space. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 247
François Audet, Mohand Said Allili, and Ana-Maria Cretu

Feature Extraction

Development of an Active Shape Model Using the Discrete


Cosine Transform . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 259
Kotaro Yasuda and M. Omair Ahmad

Ground Plane Segmentation Using Artificial Neural Network


for Pedestrian Detection . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 268
Jorge Candido and Mauricio Marengoni

An Improved Directional Convexity Measure for Binary Images . . . . . . . . . . 278


Péter Bodnár and Péter Balázs

Learning Salient Structures for the Analysis of Symmetric Patterns . . . . . . . . 286


Jaime Lomeli-R. and Mark S. Nixon

Triplet Networks Feature Masking for Sketch-Based Image Retrieval . . . . . . 296


Omar Seddati, Stéphane Dupont, and Saïd Mahmoudi

Are You Smiling as a Celebrity? Latent Smile and Gender Recognition. . . . . 304
M. Dahmane, S. Foucher, and D. Byrns

An Empirical Analysis of Deep Feature Learning for RGB-D


Object Recognition . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 312
Ali Caglayan and Ahmet Burak Can

Image Registration Based on a Minimized Cost Function


and SURF Algorithm. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 321
Mohannad Abuzneid and Ausif Mahmood

A Better Trajectory Shape Descriptor for Human Activity Recognition . . . . . 330


Pejman Habashi, Boubakeur Boufama, and Imran Shafiq Ahmad

Detection and Classification

Gaussian Mixture Trees for One Class Classification in Automated


Visual Inspection . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 341
Matthias Richter, Thomas Längle, and Jürgen Beyerer

Shadow Detection for Vehicle Classification in Urban Environments . . . . . . . 352


Muhammad Hanif, Fawad Hussain, Muhammad Haroon Yousaf,
Sergio A. Velastin, and Zezhi Chen

Input Fast-Forwarding for Better Deep Learning . . . . . . . . . . . . . . . . . . . . . 363


Ahmed Ibrahim, A. Lynn Abbott, and Mohamed E. Hussein

Improving Convolutional Neural Network Design via Variable


Neighborhood Search . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 371
Teresa Araújo, Guilherme Aresta, Bernardo Almada-Lobo,
Ana Maria Mendonça, and Aurélio Campilho

Fast Spectral Clustering Using Autoencoders and Landmarks . . . . . . . . . . . . 380


Ershad Banijamali and Ali Ghodsi

Improved Face and Head Detection Based on Traditional Middle


Eastern Clothing . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 389
Abdulaziz Alorf and A. Lynn Abbott

Unsupervised Group Activity Detection by Hierarchical Dirichlet Processes . . . 399


Ali Al-Raziqi and Joachim Denzler

Classification Boosting by Data Decomposition Using Consensus-Based


Combination of Classifiers . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 408
Vitaliy Tayanov, Adam Krzyżak, and Ching Suen

Classification Using Mixture of Discriminative Learners: The Case


of Compositional Data . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 416
Elvis Togban and Djemel Ziou

Biomedical Image Analysis

Mesh-Based Active Model Initialization for Multiple Organ


Segmentation in MR Images. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 429
M.R. Mohebpour, F. Guibault, and F. Cheriet

Sperm Flagellum Center-Line Tracing in Fluorescence 3D+t Low


SNR Stacks Using an Iterative Minimal Path Method . . . . . . . . . . . . . . . . . 437
Paul Hernandez-Herrera, Fernando Montoya, Juan M. Rendón,
Alberto Darszon, and Gabriel Corkidi

Curvelet-Based Classification of Brain MRI Images . . . . . . . . . . . . . . . . . . 446


Rafat Damseh and M. Omair Ahmad

A Novel Automatic Method to Evaluate Scoliotic Trunk Shape Changes


in Different Postures . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 455
Philippe Debanné, Ola Ahmad, Stefan Parent, Hubert Labelle,
and Farida Cheriet

Breast Density Classification Using Local Ternary Patterns


in Mammograms . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 463
Andrik Rampun, Philip Morrow, Bryan Scotney, and John Winder

Segmentation of Prostate in Diffusion MR Images via Clustering . . . . . . . . . 471


Junjie Zhang, Sameer Baig, Alexander Wong, Masoom A. Haider,
and Farzad Khalvati

Facial Skin Classification Using Convolutional Neural Networks . . . . . . . . . 479


Jhan S. Alarifi, Manu Goyal, Adrian K. Davison, Darren Dancey,
Rabia Khan, and Moi Hoon Yap

Automatic Detection of Globules, Streaks and Pigment Network Based


on Texture and Color Analysis in Dermoscopic Images . . . . . . . . . . . . . . . . 486
Amaya Jiménez, Carmen Serrano, and Begoña Acha

Image Analysis in Ophthalmology

Learning to Deblur Adaptive Optics Retinal Images . . . . . . . . . . . . . . . . . . 497


Anfisa Lazareva, Muhammad Asad, and Greg Slabaugh

A Deep Neural Network for Vessel Segmentation of Scanning Laser


Ophthalmoscopy Images . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 507
Maria Ines Meyer, Pedro Costa, Adrian Galdran, Ana Maria Mendonça,
and Aurélio Campilho

Adversarial Synthesis of Retinal Images from Vessel Trees . . . . . . . . . . . . . 516


Pedro Costa, Adrian Galdran, Maria Ines Meyer, Ana Maria Mendonça,
and Aurélio Campilho

Automated Analysis of Directional Optical Coherence Tomography Images . . . 524


Florence Rossant, Kate Grieve, and Michel Paques

Contrast Enhancement by Top-Hat and Bottom-Hat Transform with


Optimal Structuring Element: Application to Retinal Vessel Segmentation . . . 533
Rafsanjany Kushol, Md. Hasanul Kabir, Md Sirajus Salekin,
and A.B.M. Ashikur Rahman

Retinal Biomarkers of Alzheimer’s Disease: Insights from Transgenic


Mouse Models . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 541
Rui Bernardes, Gilberto Silva, Samuel Chiquita, Pedro Serranho,
and António Francisco Ambrósio

Particle Swarm Optimization Approach for the Segmentation


of Retinal Vessels from Fundus Images . . . . . . . . . . . . . . . . . . . . . . . . . . . 551
Bilal Khomri, Argyrios Christodoulidis, Leila Djerou,
Mohamed Chaouki Babahenini, and Farida Cheriet

Retinal Vessel Segmentation from a Hyperspectral Camera Images . . . . . . . . 559


Rana Farah, Samuel Belanger, Reza Jafari, Claudia Chevrefils,
Jean-Philippe Sylvestre, Frédéric Lesage, and Farida Cheriet

Remote Sensing

The Potential of Deep Features for Small Object Class Identification


in Very High Resolution Remote Sensing Imagery . . . . . . . . . . . . . . . . . . . 569
M. Dahmane, S. Foucher, M. Beaulieu, Y. Bouroubi, and M. Benoit

Segmentation of LiDAR Intensity Using CNN Feature Based


on Weighted Voting . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 578
Masaki Umemura, Kazuhiro Hotta, Hideki Nonaka, and Kazuo Oda

A Lattice-Theoretic Approach for Segmentation of Truss-Like


Porous Objects in Outdoor Aerial Scenes . . . . . . . . . . . . . . . . . . . . . . . . . . 586
Hrishikesh Sharma, Tom Sebastian, and Balamuralidhar Purushothaman

Non-dictionary Aided Sparse Unmixing of Hyperspectral Images


via Weighted Nonnegative Matrix Factorization . . . . . . . . . . . . . . . . . . . . . 596
Yaser Esmaeili Salehani and Mohamed Cheriet

Stroke Width Transform for Linear Structure Detection: Application


to River and Road Extraction from High-Resolution Satellite Images. . . . . . . 605
Moslem Ouled Sghaier, Imen Hammami, Samuel Foucher,
and Richard Lepage

Applications

Real Time Fault Detection in Photovoltaic Cells by Cameras on Drones . . . . 617


Alessandro Arenella, Antonio Greco, Alessia Saggese, and Mario Vento

Cow Behavior Recognition Using Motion History Image Feature . . . . . . . . . 626


Sung-Jin Ahn, Dong-Min Ko, and Kang-Sun Choi

Footnote-Based Document Image Classification . . . . . . . . . . . . . . . . . . . . . 634


Sara Zhalehpour, Andrew Piper, Chad Wellmon, and Mohamed Cheriet

Feature Learning for Footnote-Based Document Image Classification . . . . . . 643


Sherif Abuelwafa, Mohamed Mhiri, Rachid Hedjam, Sara Zhalehpour,
Andrew Piper, Chad Wellmon, and Mohamed Cheriet

Analysis of Sloshing in Tanks Using Image Processing . . . . . . . . . . . . . . . . 651


Rahul Kamilla and Vishwanath Nagarajan

Light Field Estimation in the Ultraviolet Spectrum . . . . . . . . . . . . . . . . . . . 660


Julien Couillaud, Djemel Ziou, and Wafa Benzaoui

Author Index . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 671


Machine Learning in Image Recognition
A Weight-Selection Strategy on Training Deep
Neural Networks for Imbalanced Classification

Antonio Sze-To1,2 and Andrew K.C. Wong1,2(B)


1
Centre for Pattern Analysis and Machine Intelligence, University of Waterloo,
Waterloo, ON N2L 3G1, Canada
{hy2szeto,akcwong}@uwaterloo.ca
2
Department of Systems Design Engineering, University of Waterloo,
Waterloo, ON N2L 3G1, Canada

Abstract. Deep Neural Networks (DNN) have recently received great


attention due to their superior performance in many machine-learning
problems. However, the use of DNN is still impeded, if the input data is
imbalanced. Imbalanced classification refers to the problem that one class
contains a much smaller number of samples than the others in classifica-
tion. It poses a great challenge to existing classifiers including DNN, due
to the difficulty in recognizing the minority class. So far, there are still
limited studies on how to train DNN for imbalanced classification. In this
study, we propose a new strategy to reduce over-fitting in training DNN
for imbalanced classification based on weight selection. In training DNN,
by splitting the original training set into two subsets, one used for train-
ing to update weights, and the other for validation to select weights, the
weights that render the best performance in the validation set would be
selected. To our knowledge, it is the first systematic study to examine a
weight-selection strategy on training DNN for imbalanced classification.
Demonstrated by experiments on 10 imbalanced datasets obtained from
MNIST, the DNN trained by our new strategy outperformed the DNN
trained by a standard strategy and the DNN trained by cost-sensitive
learning with statistical significance (p = 0.00512). Surprisingly, the DNN
trained by our new strategy was trained on 20% fewer training images,
corresponding to 12,000 fewer training images, but still achieved superior
performance on all 10 imbalanced datasets. The source code
is available in https://github.com/antoniosehk/WSDeepNN.

Keywords: Weight selection · Training strategy · Deep neural networks · Imbalanced data · Classification

1 Introduction

Deep Learning [1] has recently achieved great success, attaining superhuman
performance in domains such as beating the world champion in Go [2]. It
has been shown to outperform other machine-learning techniques in a variety of
tasks [1]. In essence, Deep Learning refers to the use of Deep Neural Networks


(DNN), defined as artificial neural networks with at least 3 hidden layers [3].
Its success is built upon the advent of fast graphics processing units (GPU) and
the availability of large amounts of data [1]. Nevertheless, if the data is highly
skewed, or imbalanced [4], even DNN cannot avoid the resulting negative impact.
Imbalanced classification [5–7] refers to the problem that one class contains
a much smaller number of samples than the other classes in classification. Such
phenomenon exists in many real-world machine learning applications [6], e.g.
fraud detection, medical diagnostics and equipment failure prediction. Due to the
difficulty in recognizing patterns of the minority class, imbalanced classification
poses a great challenge to existing classifiers including DNN [8].
Throughout the years, many methods [5–7] have been developed for tackling
imbalanced classification. They can be categorized into two types: (1) re-
sampling [9], which changes the balance between classes by considering the local
characteristics of the samples, and (2) cost-sensitive learning [10], which assigns
higher misclassification costs to the minority class.
However, studies on training DNN on imbalanced classification are still
scanty. Most of them applied traditional methods such as re-sampling [8,11]
or cost-sensitive learning [12] or both [13]. Recent trends include regularizing
the loss function [14,15] during training or representation learning [4,16].
In this study, we propose a new strategy to reduce over-fitting in training
DNN for imbalanced classification based on weight selection. In training DNN,
by splitting the original training set into two subsets, one used for training to
update weights, and the other for validation to select weights, the weights that
render the best performance in the validation set would be selected. To our
knowledge, it is the first systematic study to examine a weight-selection strategy
on training DNN for imbalanced classification. Our objective is to investigate if
such a strategy can improve the predictive performance of DNN on imbalanced
classification, even when the DNN are trained on a reduced training set.

2 Methodology
2.1 Problem Definition
Let $S_{train} = \{(x_i, y_i)\}_{i=1}^{N}$ be a set of $N$ training samples, where $x_i \in X$ is an
$n$-dimensional vector and $y_i \in Y = \{1, \ldots, C\}$ is a class label associated with
the instance $x_i$. Given $S_{train}$, the problem is to learn a function $f : X \to Y$,
such that
$$\mathbb{E}_{(x,y)\sim D}\left[L(f(x); y)\right] \tag{1}$$
is minimized, where $D$ is the data distribution over $X \times Y$, $x \in X$, $y \in Y$, and
$L(z; y)$ is a loss function that measures the loss if we predict $y$ as $z$. Equation 1
represents the generalization error of a learnt function. In practice, it would be
approximated by a set of testing samples $S_{test} = \{(x_i, y_i)\}_{i=1}^{M}$ as follows:
$$Test_f(S_{test}) = \frac{1}{M}\sum_{i=1}^{M} L(f(x_i); y_i) \tag{2}$$
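To make Eq. 2 concrete, the following is a minimal sketch (not from the paper) of the empirical test loss, assuming the learnt function f and the loss L are available as Python callables; the names are illustrative only:

```python
import numpy as np

# A toy illustration of Eq. 2: the generalization error of a learnt function f
# is approximated by averaging the per-sample loss over the M testing samples.
def empirical_test_loss(f, x_test, y_test, loss):
    predictions = [f(x) for x in x_test]
    return np.mean([loss(z, y) for z, y in zip(predictions, y_test)])

# Example with a 0-1 loss, i.e. the misclassification rate.
zero_one_loss = lambda z, y: float(z != y)
```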

2.2 Proposed Method


Deep Neural Networks (DNN) [1] can be abstractly represented as a function
$g_\theta : X \to Y$, where $g$ represents the neural network configuration (e.g., the number
of hidden layers) while $\theta$ is the model parameter that represents the weights.
By applying the back-propagation algorithm [1] on $S_{train}$, a set of weights can be
obtained. To achieve a smaller loss value, a standard strategy is to run the
back-propagation algorithm iteratively on $S_{train}$; in each iteration (epoch), an
updated set of weights is obtained. According to [17], the procedure can be
simplified as shown in Algorithm 1. We denote this procedure as the standard
strategy for training DNN in this study.

Algorithm 1. DNN
Input: a set of training samples $S_{train}$, a function $g$, an initialized model parameter $\theta_1$
Output: a model parameter $\theta$ corresponding to the function $g$
  initialize $\theta = \theta_1$
  for $t = 1$ to epochs do
    $\theta_{t+1} = \mathrm{backprop}(S_{train}, \theta_t)$
    $\theta = \theta_{t+1}$
  end for
  return $\theta$
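As a rough illustration only, Algorithm 1 could be realized with Keras (the library adopted later in Sect. 3.4); build_model is a hypothetical helper returning a compiled network, and one fit() call with epochs=1 stands in for backprop(S_train, θ_t):

```python
# A minimal sketch of Algorithm 1 (the standard strategy), assuming Keras and
# a hypothetical build_model() that returns a compiled network g_theta.
def train_standard(build_model, x_train, y_train, epochs=50, batch_size=128):
    model = build_model()                        # initialize theta = theta_1
    for _ in range(epochs):                      # for t = 1 to epochs
        # One epoch of back-propagation over the full training set.
        model.fit(x_train, y_train, epochs=1,
                  batch_size=batch_size, verbose=0)
    return model                                 # the weights after the last epoch
```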

The problem of the standard strategy is that it is prone to over-fitting, par-


ticularly for imbalanced data. In addition to traditional approaches such as re-
sampling and cost-sensitive learning, here we propose a new strategy known as
Validation-Loss (VL) strategy, as shown in Algorithm 2. It is to split the input
training set into two sets, one for training the DNN to update weights, and the
other for performance validation to select weights. The weights that render
the minimum loss value on the validation set would be selected.

Algorithm 2. DNN+VL
Input: a set of training samples $S_{train}$, a function $g$, an initialized model parameter $\theta_1$
Output: a model parameter $\theta$ corresponding to the function $g$
  split $S_{train}$ into $T_{train}$ and $T_{validate}$
  initialize $\theta = \theta_1$
  initialize $vloss = Test_{g_\theta}(T_{validate})$
  for $t = 1$ to epochs do
    $\theta_{t+1} = \mathrm{backprop}(T_{train}, \theta_t)$
    if $vloss > Test_{g_{\theta_{t+1}}}(T_{validate})$ then
      $vloss = Test_{g_{\theta_{t+1}}}(T_{validate})$
      $\theta = \theta_{t+1}$
    end if
  end for
  return $\theta$
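A sketch of how Algorithm 2 could be implemented with Keras is given below; it assumes a hypothetical build_model() helper and a model compiled with at least one metric (so that evaluate() returns [loss, ...]), and uses the 80/20 split adopted later in Sect. 3.2:

```python
from sklearn.model_selection import train_test_split

# A sketch of Algorithm 2 (DNN+VL): train on T_train, evaluate on T_validate
# after every epoch, and keep the weights with the lowest validation loss.
def train_with_weight_selection(build_model, x, y, epochs=50, batch_size=128):
    x_tr, x_va, y_tr, y_va = train_test_split(x, y, test_size=0.2)  # T_train, T_validate
    model = build_model()
    best_weights = model.get_weights()
    best_vloss = model.evaluate(x_va, y_va, verbose=0)[0]   # initialize vloss
    for _ in range(epochs):
        model.fit(x_tr, y_tr, epochs=1, batch_size=batch_size, verbose=0)
        vloss = model.evaluate(x_va, y_va, verbose=0)[0]
        if vloss < best_vloss:            # weight selection on the validation set
            best_vloss = vloss
            best_weights = model.get_weights()
    model.set_weights(best_weights)       # restore the selected weights
    return model
```

In practice, Keras's ModelCheckpoint callback with save_best_only=True, combined with a validation split passed to fit(), achieves a similar weight selection in a single call.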

3 Experiments and Results


3.1 Datasets
Ten imbalanced image datasets were obtained from the widely used MNIST
dataset [18] following the procedure described in [19]. It is a handwritten digit
dataset, containing 60,000 training images and 10,000 testing images. Each image
is associated with a distinct digit as its class. There are 10 digit classes (0, 1,
2, . . . , 9) in total. All images are grayscale with dimensions of 28 × 28.
Following [19], we obtained an imbalanced image dataset (Dataset-i) by treating
digit class i as the positive (+ve) class and the other digit classes as the negative
(−ve) class, with the imbalance ratio (ImR) provided in Table 1. The ImR was
calculated by dividing the number of negative samples (majority) by the number
of positive samples (minority); the higher the ImR, the more imbalanced the
dataset. All image gray levels were divided by 255 so that each pixel value lies
between 0 and 1.

Table 1. A summary of the 10 imbalanced image datasets obtained from MNIST [18].
Each dataset has two classes, −ve and +ve, where the +ve class represents a single
digit (minority class) and the −ve class represents the remaining digits (majority
class). The imbalance ratios (ImR) were calculated by dividing the number of −ve
samples by the number of +ve samples; the higher the ImR, the more imbalanced the dataset.

Training −ve Training +ve Training ImR Testing −ve Testing +ve Testing ImR
Dataset-0 54077 5923 9.13 9020 980 9.20
Dataset-1 53258 6742 7.90 8865 1135 7.81
Dataset-2 54042 5958 9.07 8968 1032 8.69
Dataset-3 53869 6131 8.79 8990 1010 8.90
Dataset-4 54158 5842 9.27 9018 982 9.18
Dataset-5 54759 5421 10.10 9108 892 10.21
Dataset-6 54082 5918 9.14 9042 958 9.44
Dataset-7 53735 6265 8.58 8972 1028 8.73
Dataset-8 54149 5851 9.25 9026 974 9.27
Dataset-9 54051 5949 9.09 8991 1009 8.91
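The construction of these datasets can be sketched as follows, assuming the MNIST loader shipped with Keras; the function name is illustrative:

```python
import numpy as np
from keras.datasets import mnist

# A sketch of how Dataset-i above could be constructed: digit i becomes the
# +ve (minority) class, all other digits the -ve (majority) class, pixels are
# scaled to [0, 1], and ImR = #(-ve) / #(+ve) as in Table 1.
def make_imbalanced_dataset(i):
    (x_tr, y_tr), (x_te, y_te) = mnist.load_data()
    x_tr = x_tr.reshape(-1, 784).astype("float32") / 255.0
    x_te = x_te.reshape(-1, 784).astype("float32") / 255.0
    y_tr_bin = (y_tr == i).astype("int64")   # 1 = +ve class, 0 = -ve class
    y_te_bin = (y_te == i).astype("int64")
    imr = float(np.sum(y_tr_bin == 0)) / np.sum(y_tr_bin == 1)
    return (x_tr, y_tr_bin), (x_te, y_te_bin), imr
```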

3.2 Experiments
Five classifiers were adopted in experiments under a binary classification setting:
(1) support vector machine (SVM) with a radial basis function (RBF) kernel;
(2) decision tree constructed by CART (Classification and Regression Trees)
algorithm; both of them were adopted as baseline algorithms and were trained
on the entire training set of each dataset. (3) DNN: a DNN trained by a standard
strategy, i.e. Algorithm 1, training on the entire training set of each dataset
(which contains 60,000 images). (4) DNN-CL: a DNN trained by a standard
strategy with cost-sensitive learning, i.e., setting the misclassification cost of the
minority class relative to the majority class to ImR:1; (5) DNN-VL: a DNN trained
by the proposed strategy, i.e., Algorithm 2, training only on a randomly selected
48,000 (80%) images from the training set of each dataset for weight updates,
with the remaining 12,000 (20%) images used for weight selection. All classifiers
were tested on the 10,000 testing images of each dataset.

[Fig. 1 appears here; see caption below. Legend: TR = training, VA = validation,
TE = testing; the network layer sizes shown in the diagram are 784-512-256-128-2.]

Fig. 1. (a) A DNN with 3 hidden layers was trained to classify whether a digit image
pertains to the +ve class (minority) or the −ve class (majority) on the 10 imbalanced
datasets obtained from MNIST (Table 1). (b) An illustration of the standard strategy,
where the DNN is trained on the entire training set to update weights. (c) An illustration
of the proposed strategy, where the original training set is split into two subsets, one
for training to update weights and the other for weight selection. The weights that
obtain the best performance on the validation set are selected.
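For the DNN-CL classifier, the stated cost ratio could be expressed, for example, through Keras class weights; the paper gives the ImR:1 cost ratio but not the exact mechanism, so the mapping below is an assumption:

```python
# A sketch of one way to realize the DNN-CL setting with Keras class weights:
# errors on the minority (+ve, label 1) class are penalized ImR times more
# heavily than errors on the majority (-ve, label 0) class.
def train_cost_sensitive(model, x_train, y_train, imr, epochs=50, batch_size=128):
    class_weight = {0: 1.0, 1: float(imr)}   # cost ratio ImR:1, as described above
    model.fit(x_train, y_train, epochs=epochs, batch_size=batch_size,
              class_weight=class_weight, verbose=0)
    return model
```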

3.3 Evaluation
Following [20], the prediction performance of the classifiers was evaluated by
the area under the receiver operating characteristic (ROC) curve to enable the
comparison over a range of prediction thresholds, and also by the area under
the precision-recall curve (PRC) to provide a more informative representation
of performance assessment for highly skewed datasets.
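Both measures are available in scikit-learn; a minimal sketch, assuming scores holds the predicted probabilities of the +ve class for the test images, is shown below (average precision is used here as an estimator of the area under the PRC, since the paper does not specify the estimator):

```python
from sklearn.metrics import roc_auc_score, average_precision_score

# Evaluation measures used in this study: area under the ROC curve and
# area under the precision-recall curve.
def evaluate_classifier(y_test, scores):
    roc_auc = roc_auc_score(y_test, scores)            # area under the ROC curve
    prc_auc = average_precision_score(y_test, scores)  # area under the PRC
    return roc_auc, prc_auc
```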

3.4 Implementation and Parameter Setting


To build the DNN, the deep learning library Keras (http://keras.io/) with
Theano backend [21] was adopted. The DNN has 5 layers, as illustrated in Fig. 1.
The 1st layer is an input layer with 784 (= 28 × 28) dimensions. The 2nd, 3rd and
4th layers are hidden layers of rectified linear units (ReLU) [22] with 512, 256 and
128 dimensions, respectively. The 5th layer is an output layer of softmax with 2
dimensions. The loss function was categorical cross-entropy. The optimizer was
RMSProp [20]. The number of training epochs and the batch size were 50 and 128,
respectively. All other parameters were left at their default values unless specified
otherwise.
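An illustrative reconstruction of this network in Keras (not the authors' released code, which is available at the GitHub link in the abstract) might look as follows:

```python
from keras.models import Sequential
from keras.layers import Dense

# Illustrative reconstruction of the described network: 784-dimensional input,
# ReLU hidden layers of 512, 256 and 128 units, a 2-way softmax output,
# categorical cross-entropy loss and the RMSProp optimizer.
def build_model():
    model = Sequential()
    model.add(Dense(512, activation="relu", input_dim=784))
    model.add(Dense(256, activation="relu"))
    model.add(Dense(128, activation="relu"))
    model.add(Dense(2, activation="softmax"))
    model.compile(loss="categorical_crossentropy", optimizer="rmsprop",
                  metrics=["accuracy"])
    return model
```

With this builder, the earlier sketches of Algorithms 1 and 2 can be run with epochs = 50 and batch size = 128 as stated; labels are assumed to be one-hot encoded (e.g., via keras.utils.to_categorical).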

Table 2. A comparison of the prediction performance among 5 classifiers in terms


of the area under the receiver operating characteristic (ROC) curve in testing. Two-
tailed Wilcoxon Signed-Rank Test was conducted on the ROC values obtained by DNN
and DNN-VL, as well as the ROC values obtained by DNN-CL and DNN-VL. In both
tests, we obtained a p-value of 0.00512 < 0.05.

Testing ROC SVM (RBF) Decision tree DNN DNN-CL DNN-VL


Dataset-0 0.99864 0.96504 0.99549 0.99555 0.99917
Dataset-1 0.99857 0.97613 0.99671 0.99436 0.99921
Dataset-2 0.98468 0.91820 0.99297 0.99088 0.99874
Dataset-3 0.98835 0.91336 0.99299 0.99558 0.99865
Dataset-4 0.99323 0.92614 0.99383 0.99363 0.99817
Dataset-5 0.98109 0.92016 0.98830 0.99558 0.99921
Dataset-6 0.99577 0.95732 0.99120 0.99261 0.99880
Dataset-7 0.98999 0.94502 0.99244 0.98820 0.99602
Dataset-8 0.98351 0.89631 0.98670 0.97505 0.99835
Dataset-9 0.97748 0.90388 0.98679 0.98825 0.99575

Table 3. A comparison of the prediction performance among 5 classifiers in terms


of the area under the precision-recall curve (PRC) in testing. Two-tailed Wilcoxon
Signed-Rank Test was conducted on the PRC values obtained by DNN and DNN-
VL, as well as the PRC values obtained by DNN-CL and DNN-VL. In both tests, we
obtained a p-value of 0.00512 < 0.05.

Testing PRC SVM (RBF) Decision tree DNN DNN-CL DNN-VL


Dataset-0 0.99630 0.96852 0.99709 0.99526 0.99847
Dataset-1 0.99715 0.97564 0.99749 0.99591 0.99851
Dataset-2 0.97197 0.92987 0.99457 0.99034 0.99696
Dataset-3 0.98099 0.91459 0.99400 0.99251 0.99708
Dataset-4 0.98123 0.93479 0.99355 0.99020 0.99675
Dataset-5 0.97172 0.92722 0.99148 0.99414 0.99741
Dataset-6 0.99136 0.96317 0.99349 0.99367 0.99740
Dataset-7 0.98191 0.94941 0.99232 0.98696 0.99429
Dataset-8 0.96266 0.90656 0.99249 0.97733 0.99520
Dataset-9 0.95822 0.91355 0.99095 0.98462 0.99363
Also, SVM (RBF) and Decision Tree were implemented using scikit-learn [23]
with default parameter settings. All experiments were run on a computer with
8.0 GB RAM, an i5-2410M 2.30 GHz CPU (4 cores) and a GT520M graphics card.
These settings were used in all experiments unless further specified.

3.5 Results
Table 2 shows the comparative results on the area under the ROC curve in the
testing images. We observe that in all datasets DNN-VL obtained a higher ROC
than DNN and DNN-CL, and the baseline algorithms, SVM (RBF) and Decision
Tree. We conducted a Two-tailed Wilcoxon Signed-Rank Test on the ROC values
obtained by DNN and DNN-VL, as well as the ROC values obtained by DNN-
CL and DNN-VL. Both tests are statistically significant (p = 0.00512 < 0.05).
Similar results evaluated by the area under the PRC in the testing images are
observed in Table 3.
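The reported significance test can be reproduced from the per-dataset ROC values in Table 2; a sketch with SciPy follows (the exact p-value may differ slightly from the reported 0.00512 depending on SciPy's exact versus normal-approximation mode):

```python
from scipy.stats import wilcoxon

# Two-sided Wilcoxon signed-rank test on paired ROC values (DNN vs. DNN-VL)
# across the 10 imbalanced datasets, using the values listed in Table 2.
roc_dnn    = [0.99549, 0.99671, 0.99297, 0.99299, 0.99383,
              0.98830, 0.99120, 0.99244, 0.98670, 0.98679]
roc_dnn_vl = [0.99917, 0.99921, 0.99874, 0.99865, 0.99817,
              0.99921, 0.99880, 0.99602, 0.99835, 0.99575]
statistic, p_value = wilcoxon(roc_dnn, roc_dnn_vl)  # two-sided by default
print(statistic, p_value)
```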

4 Conclusion
In this study, we proposed a new strategy, known as weight-selection strategy,
for training Deep Neural Networks (DNN) on imbalanced classification. To our
knowledge, it is the first systematic study to examine a weight-selection strategy
on training DNN for imbalanced classification. Illustrated by experiments on
10 imbalanced datasets obtained from MNIST, the DNN trained by the new
strategy outperformed the DNN trained by a standard strategy and the DNN
trained by cost-sensitive learning with statistical significance (p = 0.00512). It
should be noted that the DNN trained by our new strategy was trained on
20% fewer training images, corresponding to 12,000 fewer training images, but still
achieved superior performance. Future extensions of this work include
incorporating the proposed strategy with existing techniques to investigate if the
performance can be further improved.

References
1. LeCun, Y., Bengio, Y., Hinton, G.: Deep learning. Nature 521(7553), 436–444
(2015)
2. Silver, D., Huang, A., Maddison, C.J., Guez, A., Sifre, L., Van Den Driessche, G.,
Schrittwieser, J., Antonoglou, I., Panneershelvam, V., Lanctot, M., et al.: Master-
ing the game of go with deep neural networks and tree search. Nature 529(7587),
484–489 (2016)
3. Bengio, Y.: Learning deep architectures for AI. Found. Trends Mach. Learn. 2(1),
1–127 (2009)
4. Huang, C., Li, Y., Change Loy, C., Tang, X.: Learning deep representation for
imbalanced classification. In: Proceedings of the IEEE Conference on Computer
Vision and Pattern Recognition, pp. 5375–5384 (2016)
5. Sun, Y., Wong, A.K., Kamel, M.S.: Classification of imbalanced data: a review.
Int. J. Pattern Recogn. Artif. Intell. 23(04), 687–719 (2009)

6. He, H., Garcia, E.A.: Learning from imbalanced data. IEEE Trans. Knowl. Data
Eng. 21(9), 1263–1284 (2009)
7. Haixiang, G., Yijing, L., Shang, J., Mingyun, G., Yuanyue, H., Bing, G.: Learn-
ing from class-imbalanced data: review of methods and applications. Expert Syst.
Appl. 73, 220–239 (2017)
8. Masko, D., Hensman, P.: The impact of imbalanced training data for convolutional
neural networks. In: Degree Project in Computer Science, pp. 1–28. KTH Royal
Institute of Technology (2015)
9. Chawla, N.V., Bowyer, K.W., Hall, L.O., Kegelmeyer, W.P.: SMOTE: synthetic
minority over-sampling technique. J. Artif. Intell. Res. 16, 321–357 (2002)
10. Ting, K.M.: A comparative study of cost-sensitive boosting algorithms. In: Proceed-
ings of the 17th International Conference on Machine Learning. Citeseer (2000)
11. Oquab, M., Bottou, L., Laptev, I., Sivic, J.: Learning and transferring mid-level
image representations using convolutional neural networks. In: Proceedings of the
IEEE Conference on Computer Vision and Pattern Recognition, pp. 1717–1724
(2014)
12. Khan, S.H., Bennamoun, M., Sohel, F., Togneri, R.: Cost sensitive learning of deep
feature representations from imbalanced data. arXiv preprint arXiv:1508.03422
(2015)
13. Zhang, C., Gao, W., Song, J., Jiang, J.: An imbalanced data classification algorithm
of improved autoencoder neural network. In: 2016 Eighth International Conference
on Advanced Computational Intelligence (ICACI), pp. 95–99. IEEE (2016)
14. Shen, W., Wang, X., Wang, Y., Bai, X., Zhang, Z.: Deepcontour: a deep convolu-
tional feature learned by positive-sharing loss for contour detection. In: Proceed-
ings of the IEEE Conference on Computer Vision and Pattern Recognition, pp.
3982–3991 (2015)
15. Wang, S., Liu, W., Wu, J., Cao, L., Meng, Q., Kennedy, P.J.: Training deep neural
networks on imbalanced data sets. In: 2016 International Joint Conference on
Neural Networks (IJCNN), pp. 4368–4374. IEEE (2016)
16. Ng, W.W., Zeng, G., Zhang, J., Yeung, D.S., Pedrycz, W.: Dual autoencoders
features for imbalance classification problem. Pattern Recogn. 60, 875–889 (2016)
17. Sutskever, I.: Training recurrent neural networks. Ph.D. thesis, University of
Toronto (2013)
18. LeCun, Y., Bottou, L., Bengio, Y., Haffner, P.: Gradient-based learning applied to
document recognition. Proc. IEEE 86(11), 2278–2324 (1998)
19. An, J., Cho, S.: Variational autoencoder based anomaly detection using recon-
struction probability. In: Special Lecture on IE, vol. 2, pp. 1–18. SNU Data Mining
Center (2015)
20. Tieleman, T., Hinton, G.: Lecture 6.5-rmsprop: divide the gradient by a running
average of its recent magnitude. COURSERA: Neural Netw. Mach. Learn. 4(2)
(2012)
21. Bastien, F., Lamblin, P., Pascanu, R., Bergstra, J., Goodfellow, I., Bergeron, A.,
Bouchard, N., Warde-Farley, D., Bengio, Y.: Theano: new features and speed
improvements. arXiv preprint arXiv:1211.5590 (2012)
22. Nair, V., Hinton, G.E.: Rectified linear units improve restricted Boltzmann
machines. In: Proceedings of the 27th International Conference on Machine Learn-
ing (ICML-10), pp. 807–814 (2010)
23. Pedregosa, F., Varoquaux, G., Gramfort, A., Michel, V., Thirion, B., Grisel, O.,
Blondel, M., Prettenhofer, P., Weiss, R., Dubourg, V., et al.: Scikit-learn: machine
learning in python. J. Mach. Learn. Res. 12(Oct), 2825–2830 (2011)
End-to-End Deep Learning for Driver
Distraction Recognition

Arief Koesdwiady(B) , Safaa M. Bedawi, Chaojie Ou, and Fakhri Karray

Center for Pattern Analysis and Machine Intelligence, University of Waterloo,


Waterloo, ON, Canada
{abkoesdw,sbedawi,c9ou,karray}@uwaterloo.ca

Abstract. In this paper, an end-to-end deep learning solution for driver


distraction recognition is presented. In the proposed framework, the
features from the pre-trained convolutional neural network VGG-19 are
extracted. Despite the variation in illumination conditions, camera position,
driver ethnicity, and gender in our dataset, our best fine-tuned model,
VGG-19, achieved the highest test accuracy of 95% and an average accuracy
of 80% per class. The model is tested with the leave-one-driver-out
cross-validation method to ensure generalization. The results show that
our framework avoided the overfitting problem which typically occurs in
low-variance datasets. A comparison of our framework with the
state-of-the-art XGBoost shows that the proposed approach outperforms
XGBoost in accuracy by approximately 7%.

Keywords: Driver distraction · Deep learning · Intelligent transportation system

1 Introduction

Distracted driving is much more dangerous than most people realize. According
to the most recently published World Health Organization (WHO) report, it
was estimated that, in 2015, over 1.25 million people were killed on the roads
worldwide, making road traffic injuries a leading cause of death globally [6]. Driver
errors still remain the main cause of road accidents. Using the cellphone
for texting, talking and navigation, as well as drowsiness, are different types of
activities that drastically decrease drivers' attention to the road. Against this background,
research has focused on helping to improve these alarming statistics. Research
includes, but is not limited to: identifying driver behaviour to help
the industry create solutions that reduce the effect of driver distraction [3], and
self-driving cars [1], where lane-marking detection, path planning,
and control are the areas of enhancement. In [3], a review is provided of the algorithms
used in driver distraction detection. The main methods focus on
face detection, face/hand tracking and detection of facial landmarks [9]. Among
the algorithms used are: SVM, Histogram of Oriented Gradients, Artificial Neural
Networks (ANN), and Deep Neural Networks (DNN). The reported results for


these methods show that they are affected, to various degrees, by illumination,
skin colour and camera position inside the car.
In this paper, an end-to-end deep learning-based classifier is investigated for
driver distraction detection. This is motivated by its performance in self-driving
cars [1] and by the goal of having the system operate without direct human interaction. The
purpose of this paper is to investigate the robustness of the suggested framework
under different illuminations, different driver types and different camera
positions. To the best of our knowledge, an end-to-end deep learning framework has
not been used for driver distraction detection.
The rest of this paper is organized as follows. An overview of the deep learning
framework, off-the-shelf feature extraction, dimensionality reduction and their use
in classification is given in Sect. 2. In Sect. 3, our dataset, experimental setup and
fine-tuning approaches are described. Analysis of our results is given in Sect. 4,
and finally the conclusion and future work are presented in Sect. 5.

2 End-to-End Deep Learning Framework


The driver distraction recognition problem can be treated as a multi-class
classification process that maps input observations to a driver state. The developed
system includes three main components, as shown in Fig. 1. The first component
is a variant of a deep convolutional neural network used for highly abstracted
feature extraction. It is followed by a max-pooling layer, which reduces the dimension
of the features. The last component includes 6 fully connected layers and a softmax
layer.

Fig. 1. End-to-end deep learning framework.
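A minimal Keras sketch of the pipeline in Fig. 1 is given below; it is an illustration under stated assumptions, not the authors' implementation, and the fully connected layer sizes are placeholders since the paper specifies six fully connected layers but not their widths:

```python
from keras.applications.vgg19 import VGG19
from keras.layers import Dense, GlobalMaxPooling2D
from keras.models import Model

# Sketch of the three-component pipeline: a pre-trained VGG-19 convolutional
# base for feature extraction, a max-pooling layer to reduce feature dimension,
# and fully connected layers ending in a softmax over driver states.
def build_distraction_classifier(num_states, input_shape=(224, 224, 3)):
    base = VGG19(weights="imagenet", include_top=False, input_shape=input_shape)
    x = GlobalMaxPooling2D()(base.output)
    for units in (4096, 2048, 1024, 512, 256, 128):   # illustrative layer sizes
        x = Dense(units, activation="relu")(x)
    outputs = Dense(num_states, activation="softmax")(x)
    return Model(inputs=base.input, outputs=outputs)
```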


Another random document with
no related content on Scribd:
"Us must face a stormy night, my dear," said William; "but 'twill pass,
however long. Fetch up a good lot of peat, and keep the hot milk and the
sago pudden down here; and trust me, if your eyes close, to be on the
watch."

"They won't close," promised Auna.

"I've stopped the cuckoo clock," explained William, "because the noise
vexed him. We'll be still as death to-night, for silence is physic in itself."

He ate and drank and she bade him take a stiff glass of spirits and water.

"You've done wonders, and I'll never forget it, and no more will father
when he's well again," declared Auna. "You smoke your pipe now and have
a rest down here, and I'll fetch up the easy, wicker chair, so you'll sit up
there comfortable. But you mustn't smoke in his room, Mr. Marydrew."

She went out presently, to feed the ponies, while William lit his pipe,
with an ear cocked lest the sufferer should wake.

"Dark and still, and a fox barking far ways off," said Auna when she
returned.

Jacob did not wake till nearly midnight and then they wrestled with him
for an hour, but made him take his food and physic. He was in agony, yet
his mind seemed a little clearer and he restrained himself. Then fever
dreams swept over him. Under the light of the shaded lamp, hidden in a
corner, he could see his daughter, and still he believed her Margery; but
Margery come from beyond the grave. He thought his wife's spirit was
beside him and the belief acted as an anodyne.

"Say you've come for me! Say you've come for me! Say you are going
to forgive me and fetch me home!" he cried. And then Auna saw that he
wept; so she bent over him and wiped away his tears. Her own fell, too, and
when he had lapsed again into insensibility, she crept away and sobbed
silently in a corner. This happened in the dead of the night and William,
despite his resolutions, was sound asleep beside the fire at the time. She was
glad that he slept and had not seen her cry.
She pulled herself together and took heart, mended the fire and sat alert
between the two unconscious men while the long night drained away.

Jacob did not waken until four o'clock, with a shout that brought
William to his feet.

"Dang my old wig, if I haven't had forty winks," he said, "why for did
you let me drop off, Auna?"

CHAPTER XI

JACOB LIVES

A trained nurse arrived before noon on the following day and Dr.
Cousins drove her up. The physician found that Jacob was responding to his
remedies and no special symptoms of any gravity had appeared. The
disease promised to run its course, but whether it would destroy the patient
remained to be seen. He was dangerously ill and still delirious. Jacob could
not state his case, but the doctor inclined to hope that he suffered less of
acute pain. He assured Auna that she had every reason to be hopeful and the
nurse, at a later hour in the day, declared the same. She was chiefly
concerned with the sick man's reserves of strength and, seeing his powerful
frame, felt sanguine.

William went home on his pony after the doctor's visit, and an hour later
John Henry and Peter came up together, bringing fresh milk and other
comforts. They did not see their father, but they stopped with Auna for an
hour or two and spoke of the Huxams.

Great decisions had been made and their grandparents were returning to
the post-office, while Jeremy and Jane would occupy the new house for the
present, until the time came to let it furnished for the summer.

John Henry now saw through his uncle and spoke scornfully of him.

"A worthless waster—that's all he is—and no good for anything. All


talk and smiles and fine manners; and under 'em nothing. He ain't got any
brains and he ain't got any guts; and everybody in the land knows it but
grandmother."

"Does she like going back?" asked Auna.

"Yes—else she wouldn't go. I believe she's been at a loose end ever
since she left the shop. She hates doing nothing and being waited on. It's
contrary to her nature, and she's been very queer lately and frightened
grandfather off and on. She'll be jolly glad to get back into the midst of
things, and so will he. There's a lot of work in 'em both yet and it's good for
their money and for the rest of the family that they're going back. There
wouldn't have been any business left if Uncle Jeremy had messed about
there much longer."

"To think we thought him such a wonder when we were little!" said
Peter.

"When you get up to be a man yourself, you soon see what most men
are worth," answered John Henry. "He'll never do anything and, if his
children are like him, they'll all go in the workhouse soon or late; and if
they're workers, then he'll live on them come he grows old."

Auna asked after her sister and heard that she was well. Then the
brothers departed and Peter promised to return ere long.

"In about three days Nurse Woolcombe says we shall know for certain if
father's going to live," Auna told them; "but I know now. I'm positive sure
he's going to live."

"'Tis doubtful if it would be a good thing," declared John Henry. "I


speak for himself, of course. I'm sure nobody wants him to die."

"It will be the turning point for father perhaps," thought Peter. "He
might quiet down after a shaking like this."

Jacob Bullstone suffered for many days and it was a fortnight before
Auna could find herself sleeping natural sleep and waking without dread.
The sick man mended and fell back again. His strength wore down and a
first relapse found him in the gravest danger, for his heart was weary. But
he pulled up again with devoted nursing and skilled attention. To Auna,
Nurse Woolcombe became as a goddess, and she sealed a friendship with
the widow that lasted for life. They worked together and the younger was
skilful and understanding.

The head symptoms continued distressing and Bullstone lapsed into
delirium on several occasions after periods of sanity. His temperature
puzzled the nurse and she strove to distract the patient's mind from himself.
But, for a time, he continued impatient and declared that no friendly hand
would have desired to prolong his life. Then he grew more reasonable, and
when the crisis was past and his sons were permitted to see him, while
secretly amazed at the sight of the pallid and shrunken ghost they called
their father, both found him peaceful and in a frame of mind they hardly
remembered.

His face was white and a growth of beard also helped to disguise the
countenance they knew; but his great, dark eyes no longer roamed restlessly
over them. They were dimmed by much pain, yet they were gentle and
steady. He spoke little and his voice had weakened to a whisper; but he
listened and nodded affirmation. His chief concern was with Avis.

"Is her child born?" he asked thrice, at intervals.

"The child's due," said John Henry, "and she's very well, so Bob tells
me."

He looked up at the great lads solemnly.

"John Henry and Peter," he said—"my two sons come up to see me.
That's good."
"And lots to show you when you're about again, father. You'll be
surprised at 'Red House Rover's' new puppies."

"The old dog ran home. He thought I was going to die; but he didn't
bargain for doctor and Auna. They've saved me I suppose."

"The Lord of Hosts have saved you, father," declared John Henry, who
had great faith in his God.

"Believe it if you can," said Jacob.

They spoke of the Huxams, but found him not interested.

"Is the child of Avis born?" he asked again.

They left him and told Auna that they were well pleased and that he had
been kind to them.

"He won't die now; but I expect he'll be a bed-lier for evermore. You've
got to face that, Auna," said John Henry.

"So long as he lives till he's at peace, nothing else matters," she
answered. "But he'll be better than a bed-lier. Nurse says he'll walk in a
month and get back his nature mighty quick, when he can eat strong food
again."

Jacob mended slowly and the weather held against him, for the spring
was harsh and chill. The light increased with the cold and early March
found snow on the moors and a harsh spell of wind from the north-east. The
doctor begged Jacob to return to Red House, that his cure might be
hastened; but this he declined to do. He was calm and patient now, though
very weak. He liked to be alone and he expressed a great desire to see
William Marydrew.

Then came good news and Auna had the joy of telling him.

"Avis has got a girl baby, father; and 'tis a beautiful, perfect little child
with Bob's eyes. And Avis is doing well and the baby's going to be called
'Margery.' And Margery Elvin's a pretty name Nurse Woolcombe says; and
so it is."

The news did Bullstone good service and occupied his mind.

"The name of 'Margery' will be upon my tongue again," he said, "and I


must school myself to speak it and hear it, Auna. I shall be very glad to see
the little creature. You're worthy to be your mother's daughter, and that's the
highest praise you'll ever get from human lips. And may the child of Avis
be worthy to be her granddaughter."

Auna felt very happy.

"It never rains but it pours, father," she said. "I've had a letter to-day—a
letter from Great-Uncle Lawrence Pulleyblank. His writing's gone very
spindley and up and down, because he's so old; but when you're equal to it
he wants you to travel to Plymouth for the sea air; and if you won't go, then
he wants for me to go, when you can spare me for a week or two."

"Wishful for me to go?"

She began to speak, but stopped, since her words would have to deal
with the incidents of her mother's death.

"I dearly hope you'll do it when you can."

"No, no; but thank the man. You shall go to him. I won't be so selfish
about you as I have been. I've kept your light under a bushel too long,
because you were all I had left. Others must share you, I reckon. You're pale
as a davered rose through so much nursing. You shall go presently and
make a good holiday with the old chap."

He asked constantly for William, and hoped that each fine day might
bring him.

"I've got a wonderful thing to tell him, that only such a man would
understand," he explained to Nurse Woolcombe.

They concealed the fact that Mr. Marydrew had been ill with bronchitis.
Then Auna went to see him and was able to tell her father that Billy had
returned to health, after a chill.

"He's all right and is coming up the first soft day in May," she promised.

Jacob himself began to regain strength, and there fell a morning when
Auna went to Brent to bid a barber climb to Huntingdon and shave him.

"He's a proper ghost, and you mustn't be frightened, Mr. Prynn," she
said; "but he's going on all right, and now he's wishful to have his beard
away."

Yet, before the barber came, Jacob changed his mind again.

"I'll wear it for a sign," he said. "I'll let it be."

At last William arrived and Jacob greeted him with affection. The old
man and Auna sat one on each side of his chair and Auna held her father's
hand.

"I'm going downstairs again next week," explained Jacob. He was


sitting by the fire.

William expressed great pleasure at his appearance.

"A far different creature from what I left," he said. "Then you was a
burning, fiery furnace, my dear. But now you be glad for a bit of fire
outside yourself. Can you catch heat from it?"

"And you—you've been bad, too, I'm vexed to learn."

"Only the tubes," explained William. "My tubes was filled up, so as I
had to fight for air a bit; but us old oaks takes a lot of throwing. I'm good
for another summer anyway. Spring's afoot down the vale."

"I've had great thoughts in the shadow of death, William. I've come
through, as you see, and shall live a bit longer. At death's door I knocked
and they wouldn't let me enter in. You can't get so close, though, without
learning many things. Yet I wouldn't be without what I know. It points to
peace—a withered sort of peace, where no hope is."

"You can't live without hope, my dear man. It's so needful as the air you
breathe."

"Yes, you can live without it; you can do your duty without it. I heard a
laugh yesterday night—'twas myself. Nature made me laugh, because to be
without hope is almost beyond reason, and anything outside reason makes
us laugh."

William regarded him doubtfully.

"I thought to find you'd thrown over all these silly fancies," he said.
"You must keep a hand on yourself, Jacob, now you've come through and
are going to live. It's bad to laugh when there's nothing to laugh at. You
mustn't do that. Emma Andrews laughed for three days; and she went down
to the river and drowned herself on the fourth."

"I'm all right. Between ourselves, Billy, I had bats in the belfry for a
time after my wife died. I know it now and I'm surprised that none marked
it. After the trial came a great flash of light to my mind. From within it
came and made all the past look dark—burned it to dust and cinders. Only
the future mattered, and it wasn't the judge and jury showed me I had been
wrong—it wasn't them at all. It was the flash of light. Then hope got hold
on me like a giant and I hoped too much. That was my punishment—to
hope too much and not see hope had died. But my sickness has drained the
poison out of me, and though my frame is weak, my brain is clear. I see and
I can put things together. I've come to a great thought—a shattering thing
but true."

"A comforting thing then."

"No—truth is seldom comforting. But it puts firm ground under you and
shows you where to stand and how to protect yourself against hope. I'm a
well-educated man, William, and though I've fallen far below all that I was
taught as a boy, I've risen again now. But life's too short for most of us to
learn how to live it. Too short to get away from our feelings, or look at it all
from outside. But I can now. I've reached to that. I can look at myself, and
skin myself, and feel no more than if I was peeling a potato."

William began to be uneasy.

"Leave yourself alone. Have you seen your granddaughter yet?"

"No; but my heart goes out to her. Don't look fearful: I'm all right. I
haven't done with my children, or their children. I'm human still. I can take
stock of myself, thanks to my forgotten wisdom—lost when Margery died,
and found again. A bit ago I was growing awful cold. I felt not unkindly to
the world, you must know, but cold was creeping into me, body and soul. I
didn't love as I used, nor hate as I used, nor care as I used. I didn't want to
see what I couldn't see, nor do what I couldn't do. All was fading out in a
cold mist. Then I had my great illness, and there was no more mist, and I
began to link up again with the world. Nothing could have done it but that.
And then I got the bird's-eye view denied to most of us, but reached by me
through great trials.

"For look at it, Billy. First there was my faulty nature and little
experience. No experience of life—an only son, kept close by loving
parents—and with the awful proneness to be jealous hid in me, like poison
in a root. Then fate, or chance, to play trick after trick upon me after
marriage and build up, little by little, the signs of my great, fancied wrong.
Signs that another man would have laughed at, but proofs—deadly proofs
of ruin to my jaundiced sight. And the cunning, the craft to heap these
things on my head—all shadows to a sane man; but real as death to me!
First one, then another—each a grain of sand in itself, but growing,
growing, till the heap was too heavy for me to bear.
"And whose work was it? 'God' you say, since He's responsible for all
and willed it so. God, to plan a faulty man and start him to his own
destruction; God, to make me love a woman with a mother like Margery's,
so that, when the wounds might be healed, there was that fiend ever ready
and willing and watchful to keep 'em open. God, to will that I should never
hear my wife forgive me, though she had forgiven me; God, to let her die
before I could get to her and kneel at her feet!

"No, William, a tale like this leaves a man honest, or else mad. And I'll
be honest and say that no loving, merciful, all-powerful Father ever treated
his children so. Mark how calm I am—no fury, no lamentation, no rage
now. Just clear sight to see and show the way of my downfall. Your God
could have given me a pinch of fine character to save me. He could have
made me more generous, more understanding of my pure wife, less
suspicious, less secret, less proud, less mean. He could have built me not to
head myself off from everything and bring back night and ruin on my head.
But no.

"We'll allow I got all I deserved—we'll confess that as I was made


cracked, I had to break. But what about Margery? Did she get what she
deserved? Did she earn what she was called to suffer—a creature, sweet as
an open bud, to be drawn through dirt and horror and things evil and foul to
early death? Was that the work of the all powerful and all just?"

"Always remember there's the next world, Jacob."

"Can fifty next worlds undo the work of this one? Can eternity alter
what I did and what she suffered here? The next world's no way out,
William. The balance isn't struck there, because evil never can turn into
good, either in earth, or heaven. You can wipe away tears, but you can't
wipe away what caused them to flow. And where have I come to now, think
you? Another great light I've seen, like the light that blazed to Paul; but it
blazed a different story to me. When I see a man praising God, I'm
reminded of a mouse that runs to hide in the fur of the cat that's killing it. I
no longer believe in God, William, and I'll tell you why. Because I think too
well of God to believe in Him. D'you understand that? I wish He existed; I
wish we could see His handicraft and feel His love; but let us be brave and
not pretend. The sight of my little life and the greater sight of the whole
world as it is—these things would drive God's self into hell if He was just.
He'd tumble out of His heaven and call on the smoke of the pit to hide Him
and His horrible works. And so I've come to the blessed, grey calm of
knowing that what I suffered there was none to save me from. It's a sign of
the greatness of man that he could give all his hard-won credit to God,
William, and invent a place where justice would be done by a Being far
nobler, finer, truer and stronger than himself. But proofs against are too
many and too fearful. The world's waiting now for another Christ to wake
us to the glory of Man, William, because the time has come when we're old
enough to trust ourselves, and walk alone, and put away childish things. We
deserve a good God—or none."

The ancient listened patiently.

"Your mind is working like a river—I see that," he answered. "But be


patient still, Jacob. Keep yourself in hand. You'll find yourself yet; you be
on the right road I expect; and when you do find yourself, you'll find your
God was only hid, not dead."

He uttered kindly thoughts and they talked for an hour together. Then
Auna and William descended to the kitchen and he ate with her. He was
happy at what he had seen rather than at what he had heard.

"Forget all your father says," begged William. "He's going to be a strong
and healthy man. I mark the promise in him. A great victory for doctor and
you and nurse. There's a bit of fever in his mind yet; but the mind be always
the last to clean up again after a great illness. His talk be only the end of his
torments running away—like dirty water after a freshet."

"Do you think, if we could get him down to see Margery Elvin
christened, it would be a useful thing, Mr. Marydrew?" asked Auna, and
William approved the idea.

"By rights he ought to go to Church and thank God for sparing him," he
said. "But, be that as it will, if he saw his grandchild made a faithful
follower and heard a hymn sung out and the organs rolling, it might all help
to do the good work."
"I'll try to bring it about, though it may be a very difficult thing to
manage," she said.

"You make a valiant effort," urged William, "and tell your sister to hold
over the event till her father's man enough to come down and lend a hand."

He returned to Jacob before he left Huntingdon.

CHAPTER XII

THE CHRISTENING

On a day six weeks later, Jacob went down among men and, at the
desire of his children, attended the baptism of his grandchild. The families
assembled and the time was afternoon on Sunday. All interested, save
Judith Huxam, were present, and after the ceremony ended, a little company
trailed up the hill to Owley, that they might drink tea together and cut the
christening cake. Avis and Auna walked side by side and Auna carried the
baby; while behind them came Peter, Robert Elvin and his mother. John
Henry had joined his Aunt Jane Huxam and her little boys; Jeremy and
Adam Winter followed them and Jacob Bullstone, with Barlow Huxam,
walked fifty yards in the rear. They talked earnestly together and Barlow
had the more to say.

He was full of great anxieties, yet did not fail to express regret at his
son-in-law's illness and satisfaction that he had been restored to health.

"A triumph for your constitution and the doctor's skill. I've thought
upon you and not left you out of my prayers," he said.

"Yes; I've come through; and it was worth while. Time will show,"
answered Jacob.

"A thoughtful moment, when first you see yourself as grandfather,"
commented Barlow, "and still more so when you've only got to wait till a
little one can talk to hear yourself called 'great-grandfather.' That's how it is
with me now."

"How d'you find yourself taking up the reins at the post-office once
again?"

"The power is still there, thank God," answered Mr. Huxam. "But time
don't stand still. Life goes pretty light with me, but in confidence I may tell
you it doesn't go so light with my wife. You don't understand her and I don't
expect you to do it, Jacob; and she don't understand you; but you've been
through heavy waters; you've brought forth deep things out of darkness, in
Bible words, and I may tell you that all's not right with my wife. She was a
bit cheerfuller at first, when we went back into harness and let Jeremy and
his family go to the residence, but it was a flash in the pan. Judy's brooding
again and speaking in riddles, and I'm much put about for the future—the
future here below I mean—not in the world to come."

Bullstone spoke quietly of his own thoughts on the subject of his
mother-in-law.

"I've hated your wife with a deadlier hate than I thought was in my
nature," he said. "But not now. Before I had my great illness, I always
hoped to see her face to face once more and have speech with her; because I
was much feared of a thing happening, Barlow. I'd meant to see her alone
and warn her, by all that she held sacred, to play fair if she got to Margery
before I did. Such was my blinded sight, then, that I thought it might lie in
her power, if she went on before, to poison my wife's mind in heaven, as
she did on earth, so that if I came I'd still get no forgiveness. But that was
all mist and dream and foolishness, of course. If there was a heaven, there
would be no bearing false witness in it. But there's no heaven and no
meetings and no Margery. It's all one now. Things must be as they are, and
things had to be as they were, because I'm what I am, and Judith Huxam is
what she is."

"A very wrongful view," replied Huxam. "But, for the minute, your
feelings are beside the question, Jacob, and I'm not faced in my home with
any fog like that, but hard facts. And very painful and tragical they may
prove for me and all my family. You'll understand that she never could
forgive the fearful day we journeyed to Plymouth—you and me; and she
held that I'd gone a long way to put my soul in peril by taking you there to
see your dying wife. That's as it may be, and I've never been sorry for what
I did myself, and I don't feel that it put a barrier between me and my Maker.
But now the case is altered and I'm faced with a much more serious matter.
Judy don't worry about me no more and she don't worry about any of us,
but, strange to relate, she worries about herself!"

"I've heard a whisper of it."

"You might say it was the Christian humility proper to a saint of God;
but this mighty gloom in her brain gets worse. Once, between ourselves, it
rose to terror, at half after three of a morning in last March. She jumped up
from her bed and cried out that Satan was waiting for her in the street.
That's bad, and I spoke to Dr. Briggs behind her back next day."

"Her religion was always full of horrors, and the birds are coming back
to roost."

"That's a very wrong view and I won't grant it," answered Barlow.

"Who can look into the heart of another? Who can know the driving
power behind us?"

"It's not her heart: it's her poor head. Briggs is watching over her and he
don't like it any more than I do. There's a well-known condition of the
human mind called 'religious melancholy,' Jacob; and it's a very dangerous
thing. And it's got to be stopped, or else a worse state may over-get her."

"She looks back and mourns maybe? Perhaps it's only her frozen
humanity thawing with the years."

"She don't look back. Never was a woman less prone to look back. She
looks forward and, owing to this delusion of the mind, she don't like what
she sees and it makes her terrible glum. Her eyes are full of thunder, and her
voice is seldom heard now."
"We reap what we sow."

"Not always. She's walked hand in hand with her God ever since she
came to years of understanding, and it's a hard saying that such a woman
deserved to lose her hope and suffer from a disordered mind."

"It's not a disordered mind that loses hope, Barlow—only a clear one.
Hope's not everything."

"Hope is everything; and if the mind weakens, then the life of the soul
stops and there's nothing left but an idiot body to watch until its end. I've
got to face the chance of Judy's immortal part dying, though her clay may
go on walking the earth for another twenty years; so you'll understand I'm
in pretty deep trouble."

"It may not happen."

"It may not. But I'm warned."

Jacob expressed no great regret, for the things that now entered his mind
he could not, or would not, utter.

Mr. Huxam pursued his own grey thought.

"Sometimes it happens that these people who are overthrown by


religion, by the dark will of their Creator, have got to be put away from
their friends altogether; because too much religion, like too much learning,
topples over the brain."

"Perhaps it's only conscience pricking her."

"In a lesser one it might be that; but to hear such a woman as her
wondering in the small hours whether, after all, she is redeemed, that's not
conscience—it's a breakdown of the machinery. 'Could I lose my own soul
by saving Margery's?' she asked me once, and such a question of course
means a screw loose."

Bullstone did not answer and Barlow presently feared that he might
have said too much. He sighed deeply.

"Keep this from every human ear," he begged. "I may be wrong. There
may be a high religious meaning in all this that will come to light. We must
trust where only we can trust."

"You'll find where that is, if you live long enough and suffer long
enough," was all the other answered.

A cheerful spirit marked the little celebration at Owley and, for the first
time, Jacob held his granddaughter in his arms. He had brought a gift—a
trinket of silver with a moonstone set in it—that he had purchased before
their marriage for Margery.

Auna and her father walked home together afterwards up the long slope
from Owley to the moor. He was calm and gracious and they spoke of the
girl's coming visit to her great-uncle.

"I wish you'd change your mind even now and come along with me,"
she said. "You'd do Uncle Lawrence good very like."

"No, I shouldn't do him good, and a town's too great a thought for me
yet a while. Not but what I want to do a bit of good, to return a little of all
that's been done for me. But opportunity doesn't lack. I'll get in touch with
my fellow-creatures slow and gradual, one by one. They frighten me too
much all together. They always did; but I'll come back to them, like a ghost,
presently."

"You're not a ghost any more. Look how fine you stood among the
people to-day, and how pleased they were to see you," said Auna.

"Jeremy's going to drive up and fetch you Monday week. He's a


gentleman at large for the minute. Idleness always finds that man at his
best."

"But he'll be a chemist next, and he's reading about it already. He says
that the goal's in sight, and he feels that, as a dispensing chemist, he will
come out like the sun from behind a cloud."

"A very ornamental man, but would have done better as a tree, Auna.
There's many a human would have given more pleasure and less trouble as
a tree in a wood."

Auna laughed.

"He'd have been a very good-looking tree—one of the silver birches


perhaps."

"Margery Elvin is now of the Household of Faith," said Jacob suddenly.

"And weren't she good when parson took her, father?"

"I shouldn't wonder, if all goes well, whether I don't go down and see
her again when you're away."

"I do hope you will then, and write me a letter to say how you are."

"I shall be full of thinking about you. I have thoughts about you. I am
going to make you happy."

"Be happy yourself and I shall be."

"No, that can't happen; but I'll make you happy another way than that. I
can look ahead as I never did before. It will come in good time. Patience is
greater than happiness. I'll go back into the world some day. But my soul
must be quiet—quite quiet for a little longer yet, Auna, please."

Jeremy came at the appointed time to drive his niece into Brent. She
was going to spend one night with her Aunt Jane at the villa and proceed
next morning to Plymouth. To the last moment she was busy with
arrangements for Jacob's comfort. They had a milch cow at Huntingdon,
and Bullstone milked it himself. A red dog had also settled down with him.

Jeremy was anxious about his mother, but turned to another subject as
he and his niece journeyed down the hills together. Auna had waved her last
farewell to her father and he had waved back. Then Jeremy touched a
personal matter.

"I want for you to sound Uncle Lawrence about his money," he said.
"You're a clever and understanding girl and can be trusted. It's time his
family knew his intentions, Auna, and you might be doing a useful thing if
you were able to get a word out of him."

"If he says anything I'll remember it," she promised.

"I hope he'll live for a long time yet; and when he's got to go, I hope
you'll have the money, if you want it."

"I'm thinking not of myself, but my children, Auna, and your Aunt
Jane."

"I'll send the boys post-card pictures from Plymouth," she promised,
"and I'm hopeful to send some fine fishes to father if I can."

CHAPTER XIII

THE PROMISE

Three men were talking at Shipley Bridge, and one prepared to leave the
others and ascend to the Moor. But he was in no great hurry. Adam Winter
and old William listened to George Middleweek, who had come from Red
House. His talk concerned the Bullstones and he spoke of Peter.

"He's wise for his years, but I laugh to see the real boy moving and
feeling behind the parrot cry of what he's been taught to say and feel. A
young woman has turned him down at Brent. She loves somebody else and
haven't got no use for Peter; and he damns his luck one minute, and the next
says that everything that happens must happen. But it's taking him all his
time to believe it as well as say it, and, meanwhile, nature will out and a
dog or two have had to yelp for Master Peter's troubles. The sorrows of the
dog-breeder be often visited on the dog I reckon."

"Life runs over a lot of innocent dogs, no doubt," said William, "and
leaves 'em mangled and wondering what they've done to be disembowelled,
just as they thought they was being so good and faithful."

"To a man of Peter Bullstone's mind, the thought that a girl could refuse
him is very vexatious I expect," admitted Adam. "His mother's family ain't
out of the wood yet. My Aunt Amelia tells me that Judith Huxam was
catched by Barlow going to the police-station to give herself up for fancied
crimes! It looks like they'll have to put her away."

"That's what God brings His Chosen Few to, Adam!"

"No, George. You mustn't say things like that in my hearing, please. All
that happens is part of the pattern, and who can judge of the pattern from
the little piece under his own eye? Not the wittiest man among us."

"Cant!" answered George. "But if that woman's drove mad—her, who


have driven so many others the same way—then that's one to the good for
your precious Maker, Adam: a bit of plain justice that us common men can
understand."

They spoke of Jacob Bullstone.

"He'll be excited to-day I shouldn't wonder," said George, "because his


daughter is going back to him this evening. Peter drives in for her
presently."

"Auna wrote me a letter full of woe," William told them. "She's one of
they young hearts from which even us frozen old blids can catch heat. What
d'you think? Her Great-Uncle Pulleyblank's minded to make her his heir,
and she's prayed him to leave his money to her Uncle Jeremy, because he
wants it and she don't! But Pulleyblank knows his Jeremy too well I hope."

"And Jacob seems as if he was more in tune too," declared William.


"He's doing kind things to lonely people—such as don't run up against
much kindness as a rule," explained Adam. "The lonely have a way to smell
out the other lonely ones. He sits very quiet now for hours at a time,
perched on a moor-stone so still as a heron."

The master of Shipley got upon his horse, which stood tethered under an
oak beside the hedge. Then he rode off to climb the waste lands, while the
others went on their way.

Winter had come to dinner with Jacob, and he found him cheerful and
exalted before the thought of his child's return.

He explained his hopes and purposes, but with diffidence.

"If men, such as you and Marydrew, still think well of it, I don't say but
what I might slip back again to some quiet spot," he said. "I shouldn't feel
that I'd got any right, exactly, to thrust in again among folk—such a thing as
I've grown to be; but if it was only for William's sake, I'd come. He always
held out that I'd be saved for some usefulness, and I'd like to make good his
words."

"Come then," urged Winter, "don't go back on it."

"I'm a thought clearer sighted than I have been, Adam, and more
patient. How's Samuel? I understand him now so well as you do yourself."

They talked of common interests, but Bullstone grew restless as the sun
went westerly, and he did not seek to stay his guest when the farmer rose to
return.

"I've got everything in fine fettle for her—for Auna," he explained.


"She's coming back from her sea-faring to-day. Peter goes in for her and
she'll walk along alone from Shipley, and the boy will be up with her box
to-morrow morning. The moor's in a cheerful mind to-night, but I dread to
hear her say she likes the sea better."

"No fear of that while you kennel up here. But I hope we shall have you
both down before autumn."
