Pattern Recognition Letters 15 (1994) 403-408
a Department of Electrical Engineering, National Cheng Kung University, Tainan, Taiwan 701, ROC
b Institute of Information Engineering, National Cheng Kung University, 1 University Road, Tainan, Taiwan 701, ROC
Received 3 September 1992; revised 17 September 1993
Abstract
A Bayesian neural network for separating characters with the same number of linear-like strokes and cross-points is proposed. It is trained by an incremental learning vector quantization algorithm, which endows the system with incremental learning ability.
Key words: Handwritten Chinese character; Neural network; Shape coding; Recognition
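The incremental learning vector quantization training mentioned in the abstract can be illustrated with a classic LVQ1-style update step. This is an illustrative sketch only: the learning rate `alpha` and the attract/repel rule below are the standard LVQ1 formulation, not necessarily the paper's exact incremental variant.

```python
import numpy as np

def lvq1_step(prototypes, labels, x, y, alpha=0.05):
    """One LVQ1 update step (illustrative sketch, not the paper's exact
    rule): move the best-matching prototype toward the sample x when the
    labels agree, and away from it when they differ."""
    d = np.linalg.norm(prototypes - x, axis=1)   # distance to every prototype
    j = int(np.argmin(d))                        # best-matching prototype
    sign = 1.0 if labels[j] == y else -1.0       # attract on match, repel on mismatch
    prototypes[j] += sign * alpha * (x - prototypes[j])
    return j

# Toy usage: two prototypes, one per class.
protos = np.array([[0.0, 0.0], [1.0, 1.0]])
winner = lvq1_step(protos, [0, 1], np.array([0.2, 0.1]), y=0)
# protos[0] has moved toward the sample: [0.01, 0.005]
```

Because each update touches only the winning prototype, new training samples can be absorbed one at a time, which is what makes this family of algorithms naturally incremental.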
Fig. 2. Two different handwriting variations of a character and their two upper corner codes.

Fig. 3. The fundamental architecture of a Bayesian neural network (input slab with units 1, 2, ..., N).
\times \exp\left[ -\sum_{n=1}^{N} \frac{(x_{k;n} - \mu_{j;m;n})^2}{2\sigma_{j;m;n}^2} \right], \quad m = 1, 2, \ldots, M, \qquad (2)

where N is the dimension of the sub-cluster center and of the input vector, \mu_{j;m;n} is the mean of sub-cluster W_j(m), and \sigma_{j;m;n} represents the standard deviation in W_j(m). Then we select P_{j;k} = \max(P_{m;k}), m = 1, 2, \ldots, M, and check whether or not P_{j;k} is larger than a threshold TH. If it is larger than TH, then X_k is assigned to the jth sub-cluster, and the mean and variance of the jth sub-cluster are adapted according to

The recognition network consists of L fundamental Bayesian neural networks, and each fundamental Bayesian neural network represents a Chinese character. For example, the chosen group S5F2 contains 18 characters, and all of these characters have 5 linear-like strokes and 2 cross-points, as shown in Fig. 5. The number L is therefore equal to 18. The output of each fundamental Bayesian neural network is the distance between the input unknown pattern X_k and the reference character. Namely, if X_k is closest to the ith Chinese character in the network, then the output of the ith fundamental Bayesian neural network will be the minimum one. The arbitration comparator uses the following formula
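Under a diagonal-Gaussian reading of Eq. (2), the sub-cluster selection, the threshold test against TH, and the minimum-distance arbitration across the L fundamental networks can be sketched as follows. The function names, the log-space evaluation, and the running-moment adaptation rule are assumptions made for illustration; the paper's own update formulas and the value of TH are not reproduced in this excerpt.

```python
import numpy as np

def subcluster_likelihoods(x, mus, sigmas):
    """P(m;k) for each of the M sub-clusters, modelled here as a product
    of N independent Gaussians per Eq. (2), evaluated in log space for
    numerical stability (the normalizing constant is assumed)."""
    z = (x - mus) / sigmas                                   # (M, N) standardized residuals
    log_p = -0.5 * np.sum(z ** 2, axis=1) - np.sum(np.log(sigmas), axis=1)
    return np.exp(log_p)

def assign_and_adapt(x, mus, sigmas, counts, TH):
    """Select j = argmax_m P(m;k); if P(j;k) > TH, assign x to the jth
    sub-cluster and adapt its mean and standard deviation with a
    running-moment rule (an assumed update; the paper's own adaptation
    formulas are not shown in this excerpt)."""
    p = subcluster_likelihoods(x, mus, sigmas)
    j = int(np.argmax(p))
    if p[j] <= TH:
        return -1            # below threshold: x is not assigned to any existing sub-cluster
    counts[j] += 1
    eta = 1.0 / counts[j]
    mus[j] += eta * (x - mus[j])
    sigmas[j] = np.sqrt((1.0 - eta) * sigmas[j] ** 2 + eta * (x - mus[j]) ** 2)
    return j

def arbitrate(distances):
    """Arbitration comparator: the character whose fundamental Bayesian
    network reports the smallest distance to X_k wins."""
    return int(np.argmin(distances))
```

For the S5F2 group described above, `distances` would hold L = 18 network outputs, and `arbitrate` would return the index of the recognized character.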
Fig. 4. The recognition architecture for each class (inputs: the feature vector and four-corner code of X_k; references: the four-corner codes of word 1, word 2, ..., word L; outputs selected by arbitration comparators).
Table 1. The recognition results of two test files.