Systems and Computers in Japan, Vol. 30, No. 6, 1999
Translated from Denshi Joho Tsushin Gakkai Ronbunshi, Vol. J81-D-II, No. 1, January 1998, pp. 127–136

Optimization of Fuzzy Reasoning by Genetic Algorithm Using Variable Bit-Selection Probability

Makoto Ohki, Toshiaki Moriyama, and Masaaki Ohkita

Tottori University, Japan

SUMMARY

Genetic algorithms (GA) are known as optimization algorithms that can avoid convergence to local solutions by global search in solution space. However, especially in the field of control, when fuzzy reasoning is to be optimized, global optimal solutions are not necessarily required. In many cases, local solutions obtained at low cost are preferable. For this purpose, GAs are used in combination with other optimization algorithms, for example, steepest descent or pattern search. In so doing, however, there is a problem of differently representing the parameters to be optimized. Besides, complicated software is required to implement such combined methods. A method is proposed in this paper to provide locality in search space by varying the bit-selection probability in GA-based mutations in accord with learning progress. This makes possible local search in the vicinity of good solutions found in the course of optimization, resulting in rapid finding of local optimal solutions. © 1999 Scripta Technica, Syst Comp Jpn, 30(6): 54–63, 1999

Key words: Fuzzy reasoning; optimization algorithm; genetic algorithm; mutation; bit-selection probability

CCC0882-1666/99/060054-10

1. Introduction

Genetic algorithms (GAs) [1–3] are a variety of optimization algorithms. However, distinct from the steepest descent and other methods, GAs offer solutions for numerical optimum search, and can also be applied to combinatorial search. In this paper, the application of GA to optimization of fuzzy reasoning is discussed.

With optimization of fuzzy reasoning using GA, there may be cases in which no good solutions are found for certain fitness values, e.g., when a large number of parameters are to be optimized. One of the reasons appears to be related to the bit-selection probability (BSP) used for mutations. Conventional GAs use a fixed BSP, and hence mutations always result in global changes of the parameters to be optimized. This makes global search for global solutions possible, but on the other hand, local search around some good solution is impossible. For this reason, a technique to adjust the GA search range is proposed in this paper. Specifically, in the course of mutation, the BSP value is altered slightly for each bit of the parameters to be optimized, so that the value of the BSP is modified by learning. In other words, local search is performed around certain good solutions so that local solutions can be found in their vicinity.

There is another method, called simulated annealing (SA), that allows probabilistic search like the GA, and several learning methods merging SA with GA have been proposed [4–6]. With the method proposed by Sirag et al. [4], mutational probability is given by a parameter corresponding to the acceptance probability in SA to obtain a global optimal solution. On the other hand, with the modified SA proposed by Koakutsu et al. [5], the difference in estimation before and after mutation and the acceptance probability defined through temperature are used to identify refused individuals and replace them with good individuals achieving better estimation than the population average to implement selection. In so doing, the temperature is reduced by 10% for every generation. In the recent study by Jeong et al. [6], the same definition of acceptance probability is used as in the aforementioned research by Koakutsu. However, if the fitness does not improve within a certain range of generations, the number of generations for learning is reset to unity so that the high-temperature state is restored and the probability of avoiding local optimal solutions grows. Here, the initial temperature T0 is set according to the worst estimation of the initial individuals and is then reduced at a rate of T0/log q, where q is the number of learning generations.

As in SA, the proposed method uses the principle of elite preservation, in which declined individuals are not passed to future generations. Further, mutations based on the variable-rate selection probability do not necessitate processing of the aforementioned temperature decrease.

Other examples of fuzzy reasoning optimization using GA include methods combining GA with various optimization techniques, e.g., the steepest descent method [7, 8]. In addition, the methods have been extended to include search based on fuzzy reasoning rules, and positive results have been reported [9]. In all cases, the parameters are readjusted using optimization algorithms other than GA because a locally stable solution offering good fitness, that is, a local optimal solution, is required as the end solution. Readjustment is performed primarily through the steepest descent method or the simplex method. In so doing, fuzzy inference estimation is required for every parameter adjustment to determine the search direction vector. If fuzzy reasoning is optimized using such algorithms, there normally exist many parameters to be optimized. Therefore, fuzzy reasoning estimation requires numerous iterations, and finally, the optimization time becomes a problem. In addition, combining different learning methods is complicated in practice.

The method proposed in this paper shows only a slight change in optimization overhead with an increasing number of parameters to be optimized because only the GA is used as an optimization algorithm. In other words, the estimation iterations do not increase in number in direct proportion to the number of parameters when the gradient and search direction are determined, in contrast to the steepest descent method or pattern search.

The problem treated in this study is formulated in section 2, and section 3 suggests an approach to the problem based on variable bit-selection probability. Examples verifying the proposed method are given in section 4, and section 5 discusses its effectiveness. Finally, the results of this study are summed up in section 6.

2. Optimization of Fuzzy Reasoning by GA

With GA, the parameters to be optimized are modified stochastically. Then, objects for subsequent search are generated and evaluated repeatedly. Such search objects are called individuals or genes. An algorithm cycle is called a generation, and a set of generated search objects, that is, genes, is called a population. Several GA procedures are used to generate the population of the next generation.

A formulation of fuzzy reasoning is offered below, gene coding is described, and the relevant GA procedures are explained.

2.1. Genotype coding of fuzzy reasoning

Fuzzy reasoning is used widely in the field of control. Among several types of fuzzy reasoning, so-called simplified fuzzy reasoning is quite popular. In this case, the consequent part is defined as a real number rather than as a membership function (MSF). Simplified fuzzy reasoning [10, 11] is assumed in this study. With simplified fuzzy reasoning, fuzzy rules are described as follows:

    Rule i: IF x1 is Ai1 AND x2 is Ai2 AND ... AND xN is AiN THEN y = wi    (1)

Here i (i = 1, 2, . . . , M) is the rule number, y is the reasoning output, Ai1, . . . , AiN are fuzzy labels defined for the respective input variables, assumed here to represent the corresponding MSF, and wi denotes the consequent real numbers. In these rules, antecedent compatibility is expressed as follows:

    μi = Ai1(x1) · Ai2(x2) · ... · AiN(xN)    (2)

The fuzzy reasoning output y is defined by weighted averaging as in the following expression:

    y = (Σ_{i=1}^{M} μi wi) / (Σ_{i=1}^{M} μi)    (3)

The MSF used in expression (1) is defined as shown in Fig. 1, and is composed of three parameters. Using these parameters, the grade is calculated as follows:

    (4)

The gene is configured as shown in Fig. 2. One gene represents a set of rules, that is, fuzzy reasoning. Fields are provided in the gene to express every rule. In every rule
field, three parameters represent the antecedent MSF for each rule, and the consequent real values are assigned. All parameters are represented as binary numbers of finite bit length in a fixed-point format. This allows easy implementation in computer programming languages. In this study, the parameters to be optimized are represented as 16-bit data (short integer data in the C language).

Fig. 1. MSF as a triangular function.

Fig. 2. Configuration of a gene.

2.2. Operations of GA

When developing a GA, a variety of implementations are available, from rudimentary to rather complicated, such as those offering multipoint crossover. This study uses a simple GA (which does not mean that a more complicated GA could not be used). GA-based optimization involves the following procedures:

1. Generating initial individuals
Using random numbers, a certain number of genes is generated.

2. Evaluating all individuals and preserving superior ones
If two or more evaluation criteria are specified, multiple superior individuals with respect to each criterion (semi-elite), and a single superior individual in terms of the total evaluation, are selected. In this study, the higher the evaluation index, the worse the evaluation, and conversely, the lower the evaluation index, the better the evaluation, that is, the better the fitness.

3. Generating the population of a new generation
The elite and semi-elite individuals selected in procedure 2 are subjected to combined crossover/mutation as explained below, and new individuals are generated, to be evaluated as a new generation.

4. Crossover
A pair of genes is chosen from the new individuals and subjected to one-point crossover, as shown in Fig. 3. A randomly selected crossover position is taken as the rule field boundary.

Fig. 3. Example of a crossover.

5. Mutation
A rule field is selected at random for the gene subjected to mutation. For all parameters in this field, every bit is inverted using a fixed BSP, as shown in the following expression (see Fig. 4):

    Pb(ξ) = const.,  LSB ≤ ξ ≤ MSB    (5)

Here ξ is a parameter's bit position, and Pb(ξ) is the BSP at position ξ.

Fig. 4. Fixed BSP.

When the BSP is fixed, new individuals obtained by mutation always offer global changes. Therefore, mutation
may be said always to perform global search. However, when superior individuals are obtained in the course of optimization, the conventional GAs described above do not have sufficient capability for searching the solution space in the vicinity of these superior individuals. This is compensated for, but only to some degree, by randomly selecting the parameters for mutation.
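As a concrete illustration, the fixed-BSP mutation of procedure 5 above can be sketched as follows. This is a minimal sketch, not code from the paper; the function and parameter names are our own, and the probability value passed in is arbitrary. Each bit of a 16-bit parameter is inverted with the same position-independent probability, as in expression (5).

```python
import random

BITS = 16  # each parameter is coded as a 16-bit fixed-point integer (Section 2.1)

def mutate_fixed(param: int, pb: float, rng: random.Random) -> int:
    """Invert each bit of one parameter with a fixed bit-selection
    probability pb, independent of the bit position (expression (5))."""
    for xi in range(BITS):
        if rng.random() < pb:
            param ^= 1 << xi  # invert bit xi
    return param & 0xFFFF
```

Because the MSB is flipped as often as the LSB, a single mutation can move the parameter anywhere in its range; this is the "global change" referred to above.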

3. Variable Bit-Selection Probability

To allow a search for local optimal solutions in the GA, the variable bit-selection probability (VBSP) is introduced. Here, two definitions of the VBSP are proposed, together with a method of varying the VBSP in accordance with the learning evaluation.

3.1. Form of VBSP

The VBSP can be defined in various forms depending on the bit length of the parameters to be optimized. This study considers the definition of the BSP in terms of a triangular function across all bits of the parameters to be optimized, and its definition in terms of a triangular window function for a subset of the bits.

The definition of the VBSP by a triangular function is illustrated in Fig. 5(a). In the diagram, bit position ξ of the parameter to be optimized is plotted on the abscissa, and the domain of definition is from the least-significant bit (LSB) position through the most-significant bit (MSB) position:

    LSB ≤ ξ ≤ MSB    (6)

The vertical axis represents the BSP with respect to bit position ξ. Here the vertex of the triangular function is placed at the point (η, 0.5). The ends of the function base are placed at (LSB − 1.0) and (MSB + 1.0), so that the VBSP does not equal 0 at any bit position. The VBSP is adapted to the learning fitness by moving the vertex position η, as will be explained below. With the VBSP defined by a triangular function, Pb(ξ) is nonzero for all bits of the parameter to be optimized. Therefore, the mutations are endowed with locality while maintaining the possibility of global search offered by the conventional methods.

Fig. 5. Definitions of VBSPs.

The definition of the VBSP in terms of a triangular window function is shown in Fig. 5(b). Here a triangular window with its center at (η, 0.5) and a width of 2 bits is defined. This central position η is moved, as will be explained below. With such a VBSP defined by a triangular window function, Pb(ξ) includes zero areas. Therefore, in the course of mutation, areas can be specified where bit inversion is not needed, which should reduce the processing load.

3.2. VBSP change

Normally it is assumed in GA-based learning that initial individuals have poor fitness and that the fitness improves as learning proceeds. Therefore, in the initial stage of learning, a rough search should be performed across as wide a range as possible, while gradually focusing on the elite area. To realize this approach, at the initial stage of learning, bits close to the MSBs of the parameters to be optimized are subjected to intensive mutation, and as learning proceeds, mutation gradually shifts toward the LSB.

The VBSP parameter η is varied in the following way:

1. Initially, learning occurs while using a fixed BSP as given in expression (5).

2. The initial fitness is taken as e0 when the elite estimation meets the following condition:

    e(q) ≤ eT    (7)

Here q is the number of generations, e(q) is the elite fitness for the q-th generation, and eT is the threshold for the initial elite fitness.

3. From this generation on, the vertex is shifted from (e0, ηmax) toward (ef, ηmin), where ef denotes the target fitness, as shown in Fig. 6.
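The two VBSP forms of section 3.1 and the vertex schedule of section 3.2 can be sketched as follows. This is a hedged reconstruction, not the paper's code: the peak at (η, 0.5), the triangular base at LSB − 1 and MSB + 1, and a window leaving at most five bits with nonzero probability follow the text, but the exact slopes and the linear interpolation of η between (e0, ηmax) and (ef, ηmin) are our assumptions.

```python
LSB, MSB = 0, 15   # bit positions of a 16-bit parameter

def triangular_bsp(xi: int, eta: float) -> float:
    """Fig. 5(a): peak (eta, 0.5), base ends at LSB-1 and MSB+1,
    so every bit keeps a nonzero selection probability."""
    if xi <= eta:
        return 0.5 * (xi - (LSB - 1.0)) / (eta - (LSB - 1.0))
    return 0.5 * ((MSB + 1.0) - xi) / ((MSB + 1.0) - eta)

def window_bsp(xi: int, eta: float, half_width: float = 2.0) -> float:
    """Fig. 5(b): triangular window centred at (eta, 0.5); bits outside
    the window get probability 0 and can be skipped during mutation."""
    d = abs(xi - eta)
    return max(0.0, 0.5 * (1.0 - d / (half_width + 1.0)))

def vertex(e: float, e0: float, ef: float,
           eta_max: float = float(MSB), eta_min: float = float(LSB)) -> float:
    """Fig. 6: shift the vertex from (e0, eta_max) toward (ef, eta_min)
    as the elite fitness e improves; a linear schedule is assumed here."""
    t = min(max((e0 - e) / (e0 - ef), 0.0), 1.0)
    return eta_max - t * (eta_max - eta_min)
```

With `half_width = 2.0`, `window_bsp` is nonzero only for the five integer bit positions η ± 2, matching the bit count used in the processing-load discussion of section 5.4.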
Fig. 6. Setting of η by fitness.

The target fitness ef is the fitness that must be achieved through learning. Hence, learning terminates as soon as the fitness becomes equal to, or smaller than, the target value.

4. Examples of VBSP Application

The application of the proposed method to two different types of problems is explained below. The first deals with modeling a relationship expressed by a 2-input-1-output numerical function using fuzzy reasoning. The other involves acquisition of fuzzy reasoning rules for an autonomous mobile robot to turn left at a crossroad.

4.1. Approximation of 2-input-1-output function

Consider modeling by fuzzy reasoning of the three following 2-input-1-output functions defined on the domain D = {(x1, x2) | 0 ≤ x1 ≤ 1, 0 ≤ x2 ≤ 1}:

    (8)
    (9)
    (10)
    (11)
    (12)

Fig. 7. Approximation of 2-input-1-output function: problem setting.

The evaluation for gene G, that is, the fitness, is defined in terms of the mean square error in the following way:

    E = (1/n) Σ (y0 − yf)²    (13)

Here y0 and yf are the target output and the fuzzy reasoning output, respectively, and the average is taken over the n sample points covering the input range. In addition to this fitness covering the whole input range, the range is divided into two parts, as shown in Fig. 7(a), and a partial fitness is defined for each part as follows:

    (14)
    (15)

Here D1 and D2 denote range 1 and range 2, respectively.

In this example, estimation is performed throughout the input range. Therefore, the antecedent MSF covers the input range in full, as shown in Fig. 7(a), and the genes are configured as shown in Fig. 7(b). In particular, a gene includes a field for the antecedent MSF of variable x1, a field for the antecedent MSF of variable x2, and a field for the consequent real number.

The crossover position is fixed at the boundary between the two antecedent MSF fields, and in the consequent field it is selected at random, as shown in Fig. 7(b).

Populations are configured using elite and semi-elite individuals, as illustrated in Fig. 7(c). In the diagram, the elite individual and the two semi-elite individuals are designated by E and S1, S2, respectively, and the symbols x and c stand for crossover and mutation, respectively.

To verify the effect of the VBSP, three different instances were examined, as follows:

(M-1) A conventional fixed BSP given by expression (5) was applied to all genes.
(M-2) The fixed BSP was applied to genes 1–9, and the remaining genes 10–18 were handled using the VBSP proposed in this paper.
(M-3) The VBSP was applied to all genes.

Figure 8 presents the results obtained with the initial fitness threshold in expression (7) set to eT = 0.3 and the target fitness to ef = 0. In all cases, the fitness for each generation is the average of 10 learning cycles.

Fig. 8. Change of fitness in numerical approximation examples.

4.2. Travel of an autonomous mobile robot at a crossroad

Another problem used for verification dealt with making an autonomous mobile robot turn left at a crossroad.

The input parameters for the autonomous mobile robot are x1, x2, x3, and φ, as shown in Fig. 9(a), and the output is the robot's steering angle. Therefore, the fuzzy reasoning to be optimized is of the 4-input-1-output type. The MSF and genes for all input variables were configured as explained in section 2. In this example, however, a semi-elite is not defined, and the population is composed of eight
individuals, as shown in Fig. 9(b). Here S is the next-to-elite gene as evaluated in the previous step. In addition, (A or B)c stands for mutation where either A or B is chosen with equal probability of 50%.

Fig. 9. Travel at a crossroad: problem setting.

Here a gene's fitness is defined in the following way. At the moment when the mobile robot enters the crossroad, estimation f1 is

    (16)

Here Bf and Br are the tread lengths of the front and rear wheels, respectively. When the target exit is reached, estimation f2 is given by

    (17)

If a specified traveling distance is exceeded, for example, if the robot arrives at an incorrect exit or is turning within the crossroad, the estimation is given by

    (18)

If a collision occurs before the target exit is reached,

    (19)

Here Dcollision is the distance from the collision point to the target exit. Finally, the fitness is defined as follows:

    (20)

To confirm the efficiency of the VBSP, three different instances were examined, as follows:

(M-1) The conventional fixed BSP given by expression (5) was applied to all genes.
(M-2) A fixed BSP was applied to genes 1–4, and the remaining genes 5–8 were handled using the VBSP proposed in this paper.
(M-3) The VBSP was applied to all genes.

Fig. 10. Change of fitness in the crossroad travel example.

Figure 10 presents the results obtained with the initial fitness threshold in expression (7) set to eT = 500 and with
a target fitness ef = 100. In all cases, the fitness for each generation is the average of 10 learning cycles.

5. Discussion

As is obvious from the learning results shown in Figs. 8 and 10, the application of the VBSP to learning proved efficient in all cases, ensuring rapid and good solutions. When only the fixed BSP was used (M-1), fitness improvement ceased at an early stage of learning, and better solutions were no longer obtained. On the other hand, when only the triangular VBSP was applied (M-3, triangular) and when a combination of the fixed BSP and the triangular window VBSP was used (M-2, triangular window), the results were nearly the same as with (M-1) up to a certain point, but thereafter better solutions were obtained in subsequent generations. Besides, subtle improvements in good solutions were often observed. This may be attributed to the fact that when the VBSP is used for mutations, local search is performed around the good solutions obtained up to the current generation, which is helpful in achieving even better solutions. The individual cases are examined below in more detail.

5.1. Efficiency in approximating the 2-input-1-output function

Among the problems related to approximating 2-input-1-output functions, the planar function in expression (8) is comparatively simple. The function in (9) represents a smooth curved surface. When approximating it using a triangular MSF, the mean square error is always nonzero, and at most three peaks exist within the domain. Thus, this function can be considered the next simplest example problem after function (8). Expression (10) represents a discontinuous function, whose approximation is a relatively complicated problem. Thus, the fitness obtained when learning is completed depends on the target function. With function (8), learning was completed at a fitness of 0.06638 for (M-1), whereas the triangular modification (M-3) and the triangular window modification (M-2) produced better fitness values of 0.02068 and 0.01347, respectively. A similar tendency was also observed for functions (9) and (10).

5.2. Effectiveness in the robot travel problem

In this example, inference rules were sought for an evaluation related to multiple history-dependent parameters (in our case, travel at the crossroad's entrance and exit was primarily evaluated). Therefore, as distinct from the approximation problems, some generalization capability was required here to estimate external conditions from the control input–output relationships. In such cases, generalization performance is assumed to be dictated by the MSF shape. In other words, learning results can be expected to improve dramatically when the VBSP is applied. As seen in Fig. 10, the triangular modification (M-3) and the triangular window modification (M-2) produced better learning results than (M-1). Although a fitness of only about 82.2802 could be obtained with (M-1), the triangular modification (M-3) and the triangular window modification (M-2) produced better fitness values of 62.5329 and 70.2247, respectively.

5.3. Effect of the triangular VBSP

In the triangular VBSP, the value of the BSP is set highest at the vertex of the triangular function and is defined for all parameter bits. Therefore, a local search feature is provided while maintaining the global search capability of the conventional fixed BSP, so that in all examples (M-2) offered better results than (M-1), and (M-3) offered even better results than (M-2).

5.4. Effect of the triangular window VBSP

The triangular VBSP (M-2) produced better results than (M-3) with the triangular window VBSP. This may be explained by the fact that the BSP is defined for only a subset of the parameter bits. When only the triangular window VBSP (M-3) is used, global search capabilities are not supported. Therefore, if no good solutions are obtained near the MSB at the initial stage, the next generation is unlikely to produce better solutions. On the other hand, if the triangular window VBSP is applied together with the fixed BSP (M-2), the global search function of the fixed BSP is combined with the local search function of the triangular window VBSP, which yields good results.

Now consider the volume of computation required for mutational processing. With the triangular window VBSP, a nonzero BSP is assigned to five bits at most. Therefore, in 16-bit parameters, as in this study, the remaining 11 bits are not processed. Normally, mutational processing includes random number generation, threshold processing, and inversion for each bit. Thus, mutational processing is expected to be reduced to about 30% compared to the conventional method using a fixed BSP.

5.5. Setting the initial fitness threshold

In the examples presented above, the initial fitness threshold eT was set without special consideration. However, if eT is not set properly, VBSP-based mutations will not necessarily produce a good search. This problem is examined below in more detail.
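The roughly 30% processing-load estimate at the end of section 5.4 above follows from simple bit counting; the snippet below merely restates that arithmetic.

```python
full_bits = 16     # fixed BSP: a random draw, threshold test, and possible
                   # inversion are performed for every one of the 16 bits
window_bits = 5    # triangular window VBSP: at most 5 bits carry a nonzero BSP
ratio = window_bits / full_bits
print(ratio)       # 0.3125, i.e., about 30% of the fixed-BSP mutation workload
```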
Results obtained for the examples given in section 4.1 through five learning cycles, with eT set to 0.1, 0.2, . . ., 1.0, are presented in Figs. 11(a)–(d). The initial fitness threshold eT and the fitness obtained at learning completion are represented by the abscissa and ordinate, respectively. The minimum, average, and maximum fitness are shown for every value of the initial fitness threshold. Diagrams (a) and (c) pertain to the approximation of function (8), and diagrams (b) and (d) pertain to the approximation of function (9). Diagrams (a) and (b) pertain to the triangular modification (M-3), and diagrams (c) and (d) to the triangular window modification (M-2).

In Fig. 11(a), at a small eT of 0.1, the switch from the fixed BSP to the VBSP comes late and yields relatively poor results. However, for 0.2 ≤ eT ≤ 1.0, good results are obtained. On the other hand, in Fig. 11(b), 0.4 ≤ eT ≤ 1.0 proves to be an appropriate range. Since function (9) is more difficult to approximate than function (8), it is assumed that a higher initial threshold eT should be set for it. However, if eT is set too high, the switch to the VBSP takes place too early, so that the chances of performing a search near the MSB become small.

In Fig. 11(c), nearly the same results are obtained across the whole range 0.1 ≤ eT ≤ 1.0. In contrast, Fig. 11(d) shows good results at 0.6 ≤ eT ≤ 0.8. When eT is set higher, the switch to the VBSP may occur too soon, so that the search near the MSB is insufficient.

Thus, eT is assumed to depend on the form of the VBSP and the problem to be solved. In section 4.1, the threshold eT was fixed at 0.3 for all examples, but different thresholds yielded better results in some cases. Therefore, the problem of setting the initial fitness threshold calls for further investigation.
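To tie section 3 together, the overall two-stage scheme can be sketched as a toy run on a single 16-bit parameter. This is only an illustrative sketch under our own assumptions: the fitness |g − target| stands in for the fuzzy-reasoning error, the population size and probability values are arbitrary, and the linear η schedule mirrors Fig. 6 with a target fitness of 0.

```python
import random

BITS, MSB = 16, 15

def pb_profile(xi, eta):
    # Triangular VBSP: peak 0.5 at the vertex eta, falling linearly to zero
    # just outside the bit range [0, MSB], so every bit stays selectable.
    if xi <= eta:
        return 0.5 * (xi + 1.0) / (eta + 1.0)
    return 0.5 * (MSB + 1.0 - xi) / (MSB + 1.0 - eta)

def mutate(g, pb, rng):
    # Invert bit xi with probability pb(xi).
    for xi in range(BITS):
        if rng.random() < pb(xi):
            g ^= 1 << xi
    return g

def optimise(target, eT, pop_size=8, fixed_pb=0.05, max_gen=3000, seed=3):
    # Two-stage search: a fixed BSP until the elite fitness first satisfies
    # e(q) <= eT (condition (7)), then a triangular VBSP whose vertex
    # drifts from the MSB toward the LSB as the fitness approaches 0.
    rng = random.Random(seed)
    elite = rng.randrange(1 << BITS)
    e = abs(elite - target)
    e0 = None
    history = [e]
    for _ in range(max_gen):
        if e == 0:                                   # target fitness reached
            break
        if e0 is None and e <= eT:
            e0 = e                                   # switch on the VBSP
        if e0 is None:
            pb = lambda xi: fixed_pb                 # stage 1: fixed BSP
        else:
            t = 1.0 - e / e0                         # progress toward ef = 0
            eta = MSB * (1.0 - t)                    # vertex: MSB -> LSB
            pb = lambda xi, eta=eta: pb_profile(xi, eta)
        candidates = [mutate(elite, pb, rng) for _ in range(pop_size - 1)]
        for g in candidates:
            if abs(g - target) < e:                  # elite preservation
                elite, e = g, abs(g - target)
        history.append(e)
    return elite, history
```

On such a toy problem the elite error is non-increasing by construction; the point is only the shape of the control flow, not the numbers it produces.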

Fig. 11. Comparison of learning results at varied eT.

6. Conclusions

This paper has dealt with problems of optimizing fuzzy reasoning by a GA. With problems of this sort, acceptable solutions obtainable at low cost are often required rather than global optimal solutions. In this context, the authors have proposed varying the bit-selection probability in GA mutation according to the learning fitness. This imparts locality to the search range, thus making possible a concentrated search in the vicinity of good solutions produced in the course of learning. Then, local solutions can be found quickly in the process of GA-based learning.

The effectiveness of the proposed method was confirmed by examples of 2-input-1-output function approximation and the travel of an autonomous mobile robot at a crossroad. However, the proposed method requires some problem-specific knowledge, such as the initial fitness threshold and the target fitness. In the future, the VBSP technique should be developed to perform without such problem-specific knowledge.

REFERENCES

1. Goldberg DE. Genetic algorithms in search, optimization and machine learning. New York: Addison-Wesley; 1989.
2. Nomura H, Hayashi I, Wakami N. A self-tuning method of fuzzy reasoning by genetic algorithm. In: Fuzzy control systems. CRC Press; 1993, chap 16, pp 337–354.
3. Sakawa M, Tanaka M. Genetic algorithms. Tokyo: Asakura Shoten; 1995.
4. Sirag D, Weisser P. Toward a unified thermodynamic genetic operator. Proc 2nd Intl Conf Genetic Algorithms 1987:116–122.
5. Koakutsu S et al. Block placement by improved simulated annealing based on genetic algorithm. Trans IEICE 1990;J73-A:87–94.
6. Jeong IK, Lee JJ. Adaptive simulated annealing genetic algorithm for system identification. Eng Appl Artif Intell 1996;9:523–532.
7. Katayama R, Kajitani Y, Nishida Y. A self-generating and tuning method for fuzzy modeling using interior penalty method and its application to knowledge acquisition of fuzzy controller. In: Fuzzy control systems. CRC Press; 1993, chap 9, pp 197–224.
8. Ohki M et al. Self-tuning of fuzzy rules using piecewise linear membership functions. Trans IEE Jpn 1995;116-C:776–784.
9. Fukuda T et al. Structure organization of hierarchical fuzzy model using genetic algorithm. J Jpn Soc Fuzzy Theory Systems 1995;7:988–996.
10. Kanno M. Fuzzy control. Tokyo: Kogyo Shinbunsha; 1996.
11. Maeda M, Murakami S. Self-tuning fuzzy controller. Trans Soc Instrum Control Eng 1988;24:191–197.

AUTHORS (from left to right)

Makoto Ohki (member) completed his MS program at Tottori University in 1988 and was employed by Oki Electric Co.
Since 1994 he has been an assistant at Tottori University. His research interests include development of LISP workstations,
research in parallel computers for signal processing, autonomous mobile robots, and optimization of fuzzy reasoning. He is a
member of the IEE Japan.

Toshiaki Moriyama (student member) graduated from Tottori University in 1996 and was employed by Tottori prefectural
office. His research interests include autonomous mobile robots and automation of fuzzy reasoning.

Masaaki Ohkita (member) completed postgraduate studies at Osaka Prefectural University in 1968 and was employed
as an assistant by Tottori University. In 1990, he became an assistant professor and since 1993 he has been a professor. He has
a D.Eng. degree. His research interests include the load characteristics of power networks, function approximation, numerical
analysis and its applications to electric circuit analysis, and autonomous mobile robots. He is a member of the Information
Processing Society of Japan, the Robotics Society of Japan, the Japanese Fuzzy Technology Society, and IMACS.
