Stefan Wermter Cornelius Weber
Wlodzislaw Duch Timo Honkela
Petia Koprinkova-Hristova Sven Magg
Günther Palm Alessandro E.P. Villa (Eds.)
Lecture Notes in Computer Science 8681
Commenced Publication in 1973
Founding and Former Series Editors:
Gerhard Goos, Juris Hartmanis, and Jan van Leeuwen
Editorial Board
David Hutchison
Lancaster University, UK
Takeo Kanade
Carnegie Mellon University, Pittsburgh, PA, USA
Josef Kittler
University of Surrey, Guildford, UK
Jon M. Kleinberg
Cornell University, Ithaca, NY, USA
Alfred Kobsa
University of California, Irvine, CA, USA
Friedemann Mattern
ETH Zurich, Switzerland
John C. Mitchell
Stanford University, CA, USA
Moni Naor
Weizmann Institute of Science, Rehovot, Israel
Oscar Nierstrasz
University of Bern, Switzerland
C. Pandu Rangan
Indian Institute of Technology, Madras, India
Bernhard Steffen
TU Dortmund University, Germany
Demetri Terzopoulos
University of California, Los Angeles, CA, USA
Doug Tygar
University of California, Berkeley, CA, USA
Gerhard Weikum
Max Planck Institute for Informatics, Saarbruecken, Germany
Volume Editors
Stefan Wermter
Cornelius Weber
Sven Magg
University of Hamburg, Hamburg, Germany
E-mail: {wermter, weber, magg}@informatik.uni-hamburg.de
Wlodzislaw Duch
Nicolaus Copernicus University, Torun, Poland
E-mail: wduch@is.umk.pl
Timo Honkela
University of Helsinki, Helsinki, Finland
E-mail: timo.honkela@helsinki.fi
Petia Koprinkova-Hristova
Bulgarian Academy of Sciences, Sofia, Bulgaria
E-mail: pkoprinkova@bas.bg
Günther Palm
University of Ulm, Ulm, Germany
E-mail: guenther.palm@uni-ulm.de
Alessandro E.P. Villa
University of Lausanne, Lausanne, Switzerland
E-mail: alessandro.villa@unil.ch
Preface

[Figure: distribution of paper evaluation scores; x-axis: OCS Score, ranging from 0.5 to 4.]

[...] reviews, delivering an average of 4.3 reviews per paper. This helped to obtain a reliable evaluation score for each paper, which was computed by the Springer Online Conference Service (OCS) by averaging the reviewers' ratings and taking into account the reviewers' confidences.
Papers were sorted with respect to their scores (see figure) and 108 papers with
a score of 2.0 or higher were accepted. Furthermore, the multiple professional
reviews delivered valuable feedback to all authors.
The conference program featured 24 sessions, each containing 3 talks, arranged in 2 parallel tracks. There were 2 poster sessions with 33 posters and 2 live demonstrations of research results. Talks and posters were categorized into topical areas, which provide the titles for the conference sessions and for the chapters of this proceedings volume. The chapters follow roughly the chronological order of the conference sessions.
We would like to thank all the participants for their contributions to the conference program and to these proceedings. Many thanks go to the local organizers for their support and hospitality. We also express our sincere thanks to all active reviewers for their assistance in the review procedures and for their valuable comments and recommendations.
General Chair
Stefan Wermter Hamburg, Germany
Program Co-chairs
Wlodzislaw Duch Torun, Poland, ENNS Past-President
Timo Honkela Helsinki, Finland
Petia Koprinkova-Hristova Sofia, Bulgaria
Günther Palm Ulm, Germany
Alessandro E.P. Villa Lausanne, Switzerland, ENNS President
Cornelius Weber Hamburg, Germany
Recurrent Networks
Sequence Learning
Dynamic Cortex Memory: Enhancing Recurrent Neural Networks
for Gradient-Based Sequence Learning . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 1
Sebastian Otte, Marcus Liwicki, and Andreas Zell
Learning and Recognition of Multiple Fluctuating Temporal Patterns
Using S-CTRNN . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 9
Shingo Murata, Hiroaki Arie, Tetsuya Ogata, Jun Tani,
and Shigeki Sugano
Regularized Recurrent Neural Networks for Data Efficient Dual-Task
Learning . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 17
Sigurd Spieckermann, Siegmund Düll, Steffen Udluft,
and Thomas Runkler
Human-Machine Interaction
Human Activity Recognition on Smartphones with Awareness of Basic
Activities and Postural Transitions . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 177
Jorge-Luis Reyes-Ortiz, Luca Oneto, Alessandro Ghio, Albert Samá,
Davide Anguita, and Xavier Parra
Deep Networks
Variational EM Learning of DSBNs with Conditional Deep Boltzmann
Machines . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 257
Xing Zhang and Siwei Lyu
Theory
Optimization
Row-Action Projections for Nonnegative Matrix Factorization . . . . . . . . . 299
Rafal Zdunek
Layered Networks
Mix-Matrix Transformation Method for Max-Cut Problem . . . . . . . . . . . . 323
Iakov Karandashev and Boris Kryzhanovsky
Vision
Supervised Learning
Ensembles
Dynamic Ensemble Selection and Instantaneous Pruning for Regression
Used in Signal Calibration . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 475
Kaushala Dias and Terry Windeatt
Regression
Fast Sensitivity-Based Training of BP-Networks . . . . . . . . . . . . . . . . . . . . . 507
Iveta Mrázová and Zuzana Petřı́čková
Classification
A CFS-Based Feature Weighting Approach to Naive Bayes Text
Classifiers . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 555
Shasha Wang, Liangxiao Jiang, and Chaoqun Li
Neuroscience
Cortical Models
Excitation/Inhibition Patterns in a System of Coupled Cortical
Columns . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 651
Daniel Malagarriga, Alessandro E.P. Villa,
Jordi Garcı́a-Ojalvo, and Antonio J. Pons
Self-generated Off-line Memory Reprocessing Strongly Improves
Generalization in a Hierarchical Recurrent Neural Network . . . . . . . . . . . . 659
Jenia Jitsev
Applications
Technical Systems
RatSLAM on Humanoids - A Bio-Inspired SLAM Model Adapted
to a Humanoid Robot . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 789
Stefan Müller, Cornelius Weber, and Stefan Wermter
Precise Wind Power Prediction with SVM Ensemble Regression . . . . . . . . 797
Justin Heinermann and Oliver Kramer
Neural Network Approaches to Solution of the Inverse Problem
of Identification and Determination of Partial Concentrations of Salts
in Multi-component Water Solutions . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 805
Sergey Dolenko, Sergey Burikov, Tatiana Dolenko,
Alexander Efitorov, Kirill Gushchin, and Igor Persiantsev
Lateral Inhibition Pyramidal Neural Network for Detection of Optical
Defocus (Zernike Z5 ) . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 813
Bruno J.T. Fernandes and Diego Rativa
Development of a Dynamically Extendable SpiNNaker Chip Computing
Module . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 821
Rui Araújo, Nicolai Waniek, and Jörg Conradt
The Importance of Physiological Noise Regression in High Temporal
Resolution fMRI . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 829
Norman Scheel, Catie Chang, and Amir Madany Mamlouk
Development of Automated Diagnostic System for Skin Cancer:
Performance Analysis of Neural Network Learning Algorithms
for Classification . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 837
Ammara Masood, Adel Ali Al-Jumaily, and Tariq Adnan
Demonstrations
Entrepreneurship Support Based on Mixed Bio-Artificial Neural
Network Simulator (ESBBANN) . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 845
Eugenio M. Fedriani and Manuel Chaves-Maza
1 Introduction
Almost two decades ago, Hochreiter et al. [6] introduced long short-term memory (LSTM) as a novel learning architecture to solve hard long-time-lag problems such as context-sensitive grammars. The key idea of that work was the introduction of multiplicative units, which simulate the behavior of gates to set, read, and reset [2] the internal state of the memory cell. The internal state is provided by a self-connected linear cell referred to as the constant error carousel (CEC). LSTM networks have been successfully applied to various tasks, such as speech [4] and handwriting recognition [5], optical character recognition [10], and OCT-based lung tumor detection [9].
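The multiplicative gating described above can be made concrete with a minimal single-cell LSTM step. This is our own illustration with scalar weights and hypothetical weight names, not code from the cited work:

```python
import math

def sigmoid(v):
    return 1.0 / (1.0 + math.exp(-v))

# One forward step of a minimal single-cell LSTM. The three sigmoid gates
# multiplicatively "set" (input gate), "reset" (forget gate), and "read"
# (output gate) the internal state s, which lives in the self-connected
# linear CEC. Weight names in w are placeholders.
def lstm_step(x, h_prev, s_prev, w):
    i = sigmoid(w["xi"] * x + w["hi"] * h_prev)    # input gate
    f = sigmoid(w["xf"] * x + w["hf"] * h_prev)    # forget gate
    o = sigmoid(w["xo"] * x + w["ho"] * h_prev)    # output gate
    g = math.tanh(w["xg"] * x + w["hg"] * h_prev)  # squashed cell input
    s = f * s_prev + i * g                         # CEC: linear self-loop
    h = o * math.tanh(s)                           # gated cell output
    return h, s
```

Because the state update is linear in `s_prev`, the error signal can flow back through many time steps without the vanishing that plagues classical RNNs.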
While in the original work some connections between the gates existed (though not in the same structured manner as proposed in this paper), they were removed in later work because of training issues [3]. Note that earlier LSTM variants were trained using a truncated gradient, whereas today it is more common to use the full gradient computed with back-propagation through time (BPTT). However, we think there is great potential for speeding up the training and
S. Wermter et al. (Eds.): ICANN 2014, LNCS 8681, pp. 1–8, 2014.
© Springer International Publishing Switzerland 2014
Fig. 1. Illustration of the Dynamic Cortex Memory. As in the LSTM model, one or more CECs are surrounded by three gating neurons. Each gate is linked with every other gate in both directions (cortex) and has a self-recurrent connection (gate state).
Input Gate:
$$\upsilon_\iota^t = \sum_{i \in I} w_{i\iota} x_i^t + \sum_{h \in H} w_{h\iota} x_h^{t-1} + \sum_{c \in D} w_{c\iota} s_c^{t-1}$$

Forget Gate:
$$\upsilon_\phi^t = \sum_{i \in I} w_{i\phi} x_i^t + \sum_{h \in H} w_{h\phi} x_h^{t-1} + \sum_{c \in D} w_{c\phi} s_c^{t-1}$$

Cell Input:
$$\upsilon_c^t = \sum_{i \in I} w_{ic} x_i^t + \sum_{h \in H} w_{hc} x_h^{t-1} \tag{5}$$

Cell State:
$$s_c^t = x_\iota^t f_c(\upsilon_c^t) + x_\phi^t s_c^{t-1} \tag{6}$$

Output Gate:
$$\upsilon_\omega^t = \sum_{i \in I} w_{i\omega} x_i^t + \sum_{h \in H} w_{h\omega} x_h^{t-1} + \sum_{c \in D} w_{c\omega} s_c^{t-1}$$

Cell Output:
$$x_c^t = x_\omega^t g_c(s_c^t) \tag{9}$$

Output Gate:
$$\delta_\omega^t = \varphi_\omega'(\upsilon_\omega^t) \left( \sum_{c \in D} g_c(s_c^t)\, \epsilon_c^t + w_{\omega\iota} \delta_\iota^{t+1} + w_{\omega\phi} \delta_\phi^{t+1} + w_{\omega\omega} \delta_\omega^{t+1} \right) \tag{12}$$

Cell State:
$$\zeta_c^t = x_\omega^t\, g_c'(s_c^t)\, \epsilon_c^t + x_\phi^{t+1} \zeta_c^{t+1} + w_{c\iota} \delta_\iota^{t+1} + w_{c\phi} \delta_\phi^{t+1} + w_{c\omega} \delta_\omega^t \tag{13}$$

Cell Input:
$$\delta_c^t = x_\iota^t f_c'(\upsilon_c^t)\, \zeta_c^t \tag{14}$$

Forget Gate:
$$\delta_\phi^t = \varphi_\phi'(\upsilon_\phi^t) \left( \sum_{c \in D} s_c^{t-1} \zeta_c^t + w_{\phi\iota} \delta_\iota^{t+1} + w_{\phi\phi} \delta_\phi^{t+1} + w_{\phi\omega} \delta_\omega^t \right) \tag{15}$$

Input Gate:
$$\delta_\iota^t = \varphi_\iota'(\upsilon_\iota^t) \left( \sum_{c \in D} f_c(\upsilon_c^t)\, \zeta_c^t + w_{\iota\iota} \delta_\iota^{t+1} + w_{\iota\phi} \delta_\phi^t + w_{\iota\omega} \delta_\omega^t \right) \tag{16}$$
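The forward pass (Eqs. 5, 6, and 9) can be sketched in NumPy. This is our own illustration, not the authors' code: the weight names, the choice of tanh for f_c and g_c, the use of the previous cell state in all three peephole terms, and the single concatenated gate-recurrence vector (covering both the cortex links and the gate-state self-loops) are assumptions.

```python
import numpy as np

def sigmoid(v):
    return 1.0 / (1.0 + np.exp(-v))

def dcm_step(x, h_prev, s_prev, g_prev, W):
    """One forward step of a single DCM block with D cells.

    g_prev holds the previous activations of the three gates (iota, phi,
    omega); feeding it back into every gate is what distinguishes the
    DCM (cortex + gate state) from a plain LSTM block.
    """
    def gate(name):
        v = (W[name + "_x"] @ x + W[name + "_h"] @ h_prev
             + W[name + "_s"] * s_prev     # peephole from the cell state
             + W[name + "_g"] @ g_prev)    # cortex and gate-state recurrence
        return sigmoid(v)

    x_iota, x_phi, x_omega = gate("iota"), gate("phi"), gate("omega")
    v_c = W["c_x"] @ x + W["c_h"] @ h_prev          # Eq. (5): cell input
    s = x_iota * np.tanh(v_c) + x_phi * s_prev      # Eq. (6): cell state
    x_c = x_omega * np.tanh(s)                      # Eq. (9): cell output
    return x_c, s, np.concatenate([x_iota, x_phi, x_omega])
```

With all weights at zero, every gate sits at 0.5 and the cell state simply halves each step, which makes the multiplicative structure easy to check by hand.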
3 Experiments
In this section DCMs are compared with LSTMs on three different sequential
problems with long term dependencies. In order to provide a fair comparison our
general proceeding for the experiments was to first look for a training configura-
tion that works well for the LSTM and then use exactly the same configuration
for the DCM.
3.1 Storing
In this relatively simple experiment the task is to memorize a value at a marked position within a sequence and to keep it until the end of the sequence. Note that just keeping a constant value for several time steps (> 10) and reproducing it precisely afterwards is difficult for classical RNNs.
A training sample is generated as follows. Given is a sequence over $\mathbb{R}^2$ of length $T/2 + \mathrm{rand}(T/2)$, where the components of the vector at time step $t$ are denoted by $u_1^t$ and $u_2^t$. All $u_1^t$ are randomly chosen from $[0, 1]$. One sequence index $\hat{t}$ is randomly selected and $u_2^{\hat{t}}$ is set to 1, while all other $u_2^t$ with $t \neq \hat{t}$ are 0. The target value of the sequence is then $u_1^{\hat{t}}$.
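The sample generation just described can be sketched as follows; the function and variable names are ours:

```python
import random

# Generate one storing-task sample: a 2D input sequence of random length
# in [T/2, T), with a marker channel that flags the value to memorize.
def make_sample(T):
    length = T // 2 + random.randrange(T // 2)
    u1 = [random.random() for _ in range(length)]  # values drawn from [0, 1)
    u2 = [0.0] * length                            # marker channel, all zeros
    marked = random.randrange(length)
    u2[marked] = 1.0                               # mark exactly one position
    inputs = list(zip(u1, u2))                     # sequence over R^2
    target = u1[marked]                            # value to reproduce at the end
    return inputs, target
```

For the experiment below, samples would be generated with `T = 60`, so sequence lengths fall between 30 and 59.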
The tested architectures are a standard LSTM, a DCM, a DCM with no cortex, and a DCM with no gate state. Each consists of two input units, a single self-recurrent hidden block, and one non-linear output unit. Training was performed with gradient descent with a momentum term, where the gradient was computed with BPTT, using 6,000 training samples generated with T = 60 and 200 epochs.
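The gradient-descent-with-momentum update mentioned above can be sketched as follows; the learning rate and momentum coefficient are placeholders, since the paper's actual settings are not stated here:

```python
# One parameter update of gradient descent with momentum: a decaying
# velocity accumulates past gradients and is added to the weight.
# eta (learning rate) and mu (momentum) are hypothetical values.
def momentum_step(w, grad, velocity, eta=0.01, mu=0.9):
    velocity = mu * velocity - eta * grad  # mix old velocity with new gradient
    return w + velocity, velocity
```

The momentum term smooths the update direction over consecutive BPTT gradients, which helps on the ill-conditioned error surfaces typical of recurrent networks.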
Fig. 2 depicts the typical training behavior of all four architectures on the given learning problem. The corresponding generalization results are given in Table 1. With the gate states added (DCM with no cortex), the network behaves turbulently in the very early phase of training, but afterwards it converges slightly faster than the LSTM. However, despite the lower final training error,
[Figure 2: training error (MSE, logarithmic scale, approx. 0.0016 to 0.04) over 200 epochs for LSTM, DCM, DCM (no cortex), and DCM (no gate state).]
Table 1. Generalization results

Network         Training error   T = 60    T = 100   T = 1,000
LSTM            0.007            99.3 %    96.4 %    67.4 %
DCM (no cort)   0.006            99.6 %    95.6 %    60.0 %
DCM (no gs)     0.003            100 %     100 %     72 %
DCM             0.002            100 %     100 %     72.2 %