Abstract—In this paper we propose a new method to represent
information granules by Gaussian functional forms. First, the
fuzzy granules are extracted from data by a fuzzy clustering
algorithm. Then, they are properly represented by Gaussian
functions determined by solving a constrained quadratic
programming problem on membership values returned by the
clustering algorithm. Simulation results show that compact and
robust fuzzy granules are attained, with the appreciable feature
of being represented in a short functional form.
Index Terms—Fuzzy information granulation, Gaussian
membership functions, Constrained quadratic programming,
Fuzzy clustering.
I. INTRODUCTION
Fuzzy information granulation is the process of discovering
pieces of information, called information granules,
expressed in terms of fuzzy theory [1], [2], [3]. The
attained granules can be successively used in Fuzzy
Information Systems (FIS) to perform inferences on the
working environment.
Fuzzy clustering is a general unsupervised method to
induce fuzzy granules (i.e. clusters) that represent groups of
observations that are “close” in the sense of some predefined
metric. Many fuzzy clustering algorithms return a prototype
vector and a partition matrix that contains the membership
values of each observation to each cluster [4]. Such a partition
matrix imposes large memory requirements, since its space
complexity is linear in both the number of observations and the
number of clusters. Moreover, the partition matrix does not
convey any direct information about fuzzy memberships of
new data. For these reasons, when a FIS is built on the derived
clusters, only prototype information is usually used to define
the fuzzy granules, while the partition matrix is partially or
totally ignored.
G. Castellano is with the Computer Science Department, University of
Bari, Via Orabona 4, 70125 Bari, Italy (phone: +39 080 5442456; fax: +39
080 5443156; email: castellano@di.uniba.it).
A. M. Fanelli is with the Computer Science Department, University of
Bari, Via Orabona 4, 70125 Bari, Italy (email: fanelli@di.uniba.it).
C. Mencar is with the Computer Science Department, University of Bari,
Via Orabona 4, 70125 Bari, Italy (email: mencar@di.uniba.it).

When Gaussian functions are adopted to represent fuzzy
granules, one problem is choosing the widths of the membership
functions. Indeed, while the centers of the Gaussian functions
can coincide with the prototypes calculated by the clustering
algorithm, there is no analytical way to define their widths if
the partition matrix is ignored.
In the literature, some heuristic techniques have been proposed
to define the Gaussian widths [5]; however, most of them
require the introduction of user-defined parameters and
do not exploit the information about the fuzzy clusters
discovered by the clustering algorithm. Often, the widths are
chosen by trial and error, so that the Gaussian functions are
neither too flat (too much overlapping) nor too peaked (not
covering the whole input space). The consequence is a large
waste of time, which adds to the useful information provided
by the clustering algorithm being left unexploited.
In this work, we propose a new method to define fuzzy
information granules represented in terms of Gaussian
membership functions, with the main feature that widths are
calculated by exploiting the information conveyed by the
partition matrix of the clustering algorithm. The key
advantage of the proposed approach is the ability of
automatically finding “good” Gaussian representations of
fuzzy granules in terms of mean squared error. The approach
does not require trial-and-error procedures or strong
constraints, such as imposing the same width for all the
granules (i.e. isotropic Gaussian functions).
The proposed method can be applied with any fuzzy
clustering algorithm that returns a prototype vector and the
corresponding partition matrix. In this work, we use Fuzzy
C-Means (FCM) [4] as the basic clustering algorithm from which
the Gaussian representation is derived.
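For reference, the core FCM iteration alternates a prototype update and a membership update until the partition stabilizes. The following Python sketch is illustrative only (it is not the MATLAB toolbox implementation used later in the experiments; the function name and defaults are ours):

```python
import numpy as np

def fcm(X, c, m=2.0, iters=100, tol=1e-6, seed=0):
    """Minimal Fuzzy C-Means sketch: returns prototypes P (c x n)
    and partition matrix U (N x c) with rows summing to 1."""
    rng = np.random.default_rng(seed)
    X = np.asarray(X, dtype=float)
    U = rng.random((X.shape[0], c))
    U /= U.sum(axis=1, keepdims=True)                 # random fuzzy partition
    for _ in range(iters):
        W = U ** m
        P = W.T @ X / W.sum(axis=0)[:, None]          # prototype update
        # squared distances ||x_i - p_j||^2, floored to avoid division by zero
        d2 = np.maximum(((X[:, None, :] - P[None, :, :]) ** 2).sum(axis=2), 1e-12)
        # membership update: u_ij = 1 / sum_k (d_ij^2 / d_ik^2)^(1/(m-1))
        U_new = 1.0 / ((d2[:, :, None] / d2[:, None, :]) ** (1.0 / (m - 1))).sum(axis=2)
        if np.abs(U_new - U).max() < tol:
            return P, U_new
        U = U_new
    return P, U
```

On two well-separated point clouds, for instance, the prototypes converge to the cloud centers and each row of U sums to one.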
The paper is organized as follows. Section II defines the
Gaussian representation as a constrained quadratic
programming problem. In Section III, a real-world
experimentation is carried out to validate the approach, and in
Section IV some final conclusions are drawn.
II. GAUSSIAN REPRESENTATION OF FUZZY GRANULES
A fuzzy clustering algorithm can be described as a function
that accepts a training set of observations and returns a set of
prototypes as well as a partition matrix. The number of
clusters may be predefined or determined by the algorithm.
Hence, a generic fuzzy clustering algorithm may be
formalized as:
A Compact Gaussian Representation of Fuzzy
Information Granules
Giovanna Castellano, Anna M. Fanelli, Member, IEEE and Corrado Mencar
  fc : X^m → X^c × [0,1]^(m×c),   X ⊆ R^n,  m > 1,  c ≥ 1        (1)
such that:

  fc(x_1, x_2, …, x_m) = (P, U)        (2)
where:

  P = [p_1, p_2, …, p_c]        (3)

is the matrix of all prototypes (one for each column), and:
  U = [u_1, u_2, …, u_c] = [u_ij],  i = 1,2,…,m,  j = 1,2,…,c        (4)
is the partition matrix, which contains the membership value
of each observation to each cluster.
The objective is to find, for each discovered cluster, a
Gaussian representation corresponding to the following
functional form:

  μ_[ω,C](x) := exp( −(x − ω)^T C (x − ω) )        (5)
where ω is the center and C is the inverse of the width
matrix. Matrix C should be symmetric positive definite (s.p.d.)
so that the function graph has the classical “bell” shape
centered on ω. In many cases, a simpler diagonal positive
width matrix is required. Indeed, if C is a diagonal matrix,
that is:
  C := diag c = diag(c_1, c_2, …, c_n),  c_i > 0        (6)
then the fuzzy granule can be represented as a product of
independent scalar exponential functions:

  μ_[ω,C](x) = ∏_{i=1}^{n} μ_[ω_i,c_i](x_i) = ∏_{i=1}^{n} exp( −c_i (x_i − ω_i)² )        (7)
The problem can be decomposed into c independent
subproblems, one finding the best representation for each
cluster discovered by the clustering algorithm. Hence, in the
following we concentrate on a single cluster and omit the
cluster index j when unnecessary.
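With a diagonal width matrix, evaluating the membership of Eq. (7) factorizes over the coordinates. A small Python sketch of this evaluation (illustrative; the function name, arguments, and example values are ours):

```python
import numpy as np

def gaussian_membership(x, omega, c):
    """Membership of observation x in a granule with center omega and
    diagonal inverse-width vector c, i.e. exp(-sum_i c_i (x_i - omega_i)^2)."""
    d = np.asarray(x, dtype=float) - np.asarray(omega, dtype=float)
    return float(np.exp(-np.sum(np.asarray(c, dtype=float) * d ** 2)))

# The joint form and the product of scalar exponentials coincide (Eq. (7)):
mu = gaussian_membership([0.5, 0.7], omega=[0.6, 0.68], c=[0.09, 0.10])
```

Here the product of the per-coordinate exponentials equals the single exponential of the summed quadratic form, which is exactly the decomposition exploited by Eq. (7).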
Generally, there is no exact solution to the problem, i.e.
there is no pair (ω, C) such that:

  ∀i : μ_[ω,C](x_i) = u_i   and   ∀x ≠ 0 : x^T C x > 0        (8)
In order to choose the “best” Gaussian representation,
some error function has to be defined. Because of the
nonlinearity of the equations in (8), it is not possible to apply
general linear systems theory. On the other hand, the equation
system in (8) is equivalent to the following:
  ∀i : −(x_i − ω)^T C (x_i − ω) = log u_i,   C s.p.d.        (9)
The system (9) can be rewritten as:
  ∀i : x̂_i^T C x̂_i = −log u_i,   C s.p.d.        (10)
where the center of the Gaussian membership function is
set equal to the cluster prototype:
  ω = p_j        (11)
and the following change of variables is done:
  x̂_i = x_i − ω        (12)
By imposing C to be positive diagonal, the system can be
further simplified as:
  ∀i : ∑_{k=1}^{n} x̂_ik² c_k = −log u_i,   c_k > 0        (13)
where x̂_i = [x̂_ik]_{k=1,2,…,n}.
The equations in (13) form a constrained linear system;
generally, it has no exact solution, so a constrained least
squared error minimization problem can be formulated as
follows:
  minimize:   f(c) = (1/m) ∑_{i=1}^{m} ( ∑_{k=1}^{n} x̂_ik² c_k + log u_i )²
  subject to: c > 0        (14)
If the following matrix is defined:
  H = [ x̂_ik² ],  i = 1,2,…,m,  k = 1,2,…,n        (15)
then, excluding the constant terms, the problem (14) can be
reformulated as:
  minimize:   f′(c) = ½ c^T G c + g^T c
  subject to: c > 0        (16)
where:
  G = 2 H^T H        (17)
and:
  g = 2 H^T log u        (18)
The problem can be solved with classical constrained
quadratic programming techniques. Usually, quadratic
programming algorithms only accept constraints in the form:
  A c ≥ b        (19)
In this case, it is useful to express the constraints of the
objective function in the form:
  c ≥ c_min        (20)

where the vector c_min defines the maximum admissible
amplitudes and is provided manually. If c_min = 0, then all
possible amplitudes are admissible, even infinite.
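The whole fitting step (Eqs. (12)-(20)) reduces to a bounded linear least-squares problem. As an illustration, it can be sketched with SciPy's `lsq_linear` in place of the MATLAB Optimization Toolbox routine used by the authors; the function name, the clipping floor on the memberships, and the sanity-check values below are our own assumptions:

```python
import numpy as np
from scipy.optimize import lsq_linear

def gaussian_widths(X, omega, u, c_min=0.0):
    """Fit the diagonal inverse-widths c of one granule by minimizing
    sum_i (sum_k xhat_ik^2 c_k + log u_i)^2 subject to c >= c_min."""
    Xhat = np.asarray(X, dtype=float) - np.asarray(omega, dtype=float)  # Eq. (12)
    H = Xhat ** 2                                                       # Eq. (15)
    # right-hand side of Eq. (13); floor u to avoid log(0)
    b = -np.log(np.clip(np.asarray(u, dtype=float), 1e-300, 1.0))
    return lsq_linear(H, b, bounds=(c_min, np.inf)).x

# Sanity check: memberships generated by a known diagonal Gaussian
# should be recovered up to numerical error.
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 2))
c_true = np.array([0.5, 2.0])
u = np.exp(-np.sum(c_true * X ** 2, axis=1))
c_fit = gaussian_widths(X, omega=[0.0, 0.0], u=u)
```

When the memberships come exactly from a Gaussian, the linear system (13) is consistent and the bounded solver returns the true widths; on real partition matrices it returns the constrained least-squares fit of Eq. (14).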
The objective function f can be usefully rewritten as:
  f(c) = (1/m) ∑_{i=1}^{m} ( log [ exp( −x̂_i^T · diag c · x̂_i ) / u_i ] )²        (21)
which is the mean squared log-ratio between the Gaussian
membership approximation and the actual membership value
assigned by the clustering algorithm. The squared log-ratio is
a nonnegative function with global minimum 0 attained at 1. By
expanding the Taylor series of the squared log-ratio centered
at 1, it can be observed that, in a sufficiently small
neighborhood of 1, the function can be approximated by:
  (log x)² = (x − 1)² + O( (x − 1)³ )        (22)
In such a neighborhood, the following approximation can be
done:

  ε = ( log [ μ_[ω,C](x_i) / u_i ] )² ≈ ( μ_[ω,C](x_i) / u_i − 1 )²        (23)
This implies that:

  ( μ_[ω,C](x_i) − u_i )² ≈ u_i² ε ≤ ε        (24)
As a consequence, if the objective function assumes small
values, the resulting Gaussian membership function
approximates the partition matrix with a small mean squared
error. This property validates the proposed approach.
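The quality of the approximation in Eq. (22) is easy to verify numerically; the check below is our own illustration, not part of the original experimentation:

```python
import math

# Compare the squared log-ratio with its quadratic approximation near 1.
# The gap shrinks cubically as x approaches 1 (remainder term of Eq. (22)).
for x in (0.90, 0.95, 1.01, 1.05, 1.10):
    gap = abs(math.log(x) ** 2 - (x - 1) ** 2)
    print(f"x = {x:.2f}  |  |(log x)^2 - (x - 1)^2| = {gap:.2e}")
```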
The space complexity of the proposed representation is
O(nc), while the memory required for storing the partition
matrix is O((m+n)c). In this sense, the proposed approach
leads to a compact representation of fuzzy granules.
III. SIMULATION RESULTS
In this section we use a computer experiment to illustrate
the proposed approach. As an information granulation
problem, we have chosen the North East dataset (fig. 1),
containing 123,593 postal addresses (represented as points)
covering three metropolitan areas (New York, Philadelphia and
Boston) [6]. The dataset can be grouped into three clusters,
with a lot of noise in the form of uniformly distributed rural
areas and smaller population centers.
We have used FCM to generate three fuzzy clusters from
the dataset. Successively, the prototype vector and the
partition matrix returned by FCM were used by the proposed
method to obtain a Gaussian representation of the three
clusters. For FCM and quadratic programming, the MATLAB®
R11.1 Fuzzy Logic Toolbox and Optimization Toolbox have been
used, respectively.

Figure 1: The North East dataset
Figure 2: Fuzzy cluster for Philadelphia city and its Gaussian representation
Figure 3: Fuzzy cluster for Boston city and its Gaussian representation
Figure 4: Fuzzy cluster for New York city and its Gaussian representation
Centers and widths of the derived Gaussian functions are
reported in Table I. Figures 2, 3 and 4 depict, for each cluster,
both the membership values in the partition matrix (as grey
levels) and the radial contours of the corresponding Gaussian
function.
As can be seen in the figures, the Gaussian granules obtained
by the proposed approach properly model some qualitative
concepts about the available data. Specifically, regarding each
cluster as one of the three metropolitan areas (Boston, New
York, Philadelphia), the membership values of postal addresses
can be interpreted as degrees of closeness to one city (the
cluster prototype). Such a concept is not easily captured by the
clusters discovered by FCM alone since, as the figures
illustrate, the membership values of the addresses do not
always decrease as the distance from the cluster prototype
increases.
Table I also reports the Mean Squared Error (MSE) between
each Gaussian granule and the corresponding fuzzy cluster,
defined as:
  E_j = (1/m) ∑_{i=1}^{m} ( μ_[ω_j,C_j](x_i) − u_ij )²        (25)
The low values of MSE for each granule demonstrate how
well the resulting Gaussian membership functions
approximate the partition matrix of FCM.
In order to evaluate the derived Gaussian information
granules quantitatively, the Xie-Beni index has been used as a
compactness and separation validity measure [7]. Such a
measure is defined as:
  S = ( ∑_{j=1}^{c} ∑_{i=1}^{m} ϑ_ij² ‖p_j − x_i‖² ) / ( m · min_{i≠j} ‖p_i − p_j‖² )        (26)
where:
  ϑ_ij = u_ij  for FCM clusters;   ϑ_ij = μ_[ω_j,C_j](x_i)  for Gaussian granules        (27)
In other words, the Xie-Beni index for the FCM clusters has
been computed directly on the partition matrix returned by the
clustering algorithm. Conversely, for Gaussian granules the
measure has been computed by recalculating the membership
values of each observation of the dataset with the derived
Gaussian membership functions.
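The index in Eq. (26) can thus be computed with a single routine for both representations, passing the appropriate membership matrix ϑ. A Python sketch (ours; the function name and the toy example are assumptions, not the original code):

```python
import numpy as np

def xie_beni(X, P, Theta):
    """Xie-Beni compactness/separation index: Theta[i, j] is the
    membership of observation i in cluster j (Eq. (27))."""
    X, P, Theta = (np.asarray(a, dtype=float) for a in (X, P, Theta))
    m = X.shape[0]
    d2 = ((X[:, None, :] - P[None, :, :]) ** 2).sum(axis=2)   # ||x_i - p_j||^2
    sep = ((P[:, None, :] - P[None, :, :]) ** 2).sum(axis=2)  # prototype distances
    np.fill_diagonal(sep, np.inf)                             # exclude i == j
    return float((Theta ** 2 * d2).sum() / (m * sep.min()))
```

For FCM clusters, Theta is the partition matrix U; for the Gaussian granules, Theta is rebuilt by evaluating the fitted membership functions on the data.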
Table II summarizes a comparison between the fuzzy granules
extracted by FCM alone and those obtained by the proposed
approach, in terms of Xie-Beni index, number of floating
point operations (FLOPS) and time/memory requirements on an
Intel™ Pentium® III 500 MHz with 128 MB RAM.
As can be seen, the Xie-Beni index values for the
Gaussian granules and the FCM clusters are comparable. The
slight difference is due to the nature of the proposed method,
which generates convex (Gaussian) approximations of the
partition matrix; the partition matrix is generally not convex,
i.e. it may assume high values even for points very distant
from the prototype (see figs. 2-4).
The time required for representing granules with Gaussian
functional forms is negligible compared to the time required
by FCM; hence the total computational cost of the proposed
method (FCM + Gaussian representation) is comparable with
that of FCM alone. More importantly, the method provides a
compact representation of the granules. Indeed, each Gaussian
granule is fully described by only a prototype vector and a
diagonal width matrix. As a consequence, once granules have
been represented by Gaussian functions, the partition matrix
can be discarded, thus saving a large amount of memory.
IV. CONCLUSIONS
In this paper, we have proposed a method to derive a
Gaussian representation of information granules by solving a
constrained quadratic programming problem. Unlike heuristic
techniques, no hyperparameter has to be specified, and the
granule representation fully exploits the information
returned by the fuzzy clustering algorithm used to extract
granules from data. The derived granules have good features
in terms of fine approximation and compact representation.
Moreover, they are very robust against noise, as the real-world
experimentation showed, and they can be usefully integrated
in most inference systems to perform fuzzy reasoning about
the working environment.
REFERENCES
[1] L.A. Zadeh, Fuzzy sets and information granularity. In M.M. Gupta,
R.K. Ragade and R.R. Yager, eds., Advances in Fuzzy Set Theory and
Applications, North Holland, Amsterdam, 1979, pp. 3-18.
[2] L.A. Zadeh, Towards a theory of fuzzy information granulation and its
centrality in human reasoning and fuzzy logic. Fuzzy Sets and Systems,
Vol. 90 (1997), pp. 111-127.
[3] W. Pedrycz, Granular computing: an introduction. In Proc. of IFSA-
NAFIPS 2001, Vancouver, Canada, 2001, pp. 1349-1354.
TABLE I
PARAMETERS OF GAUSSIAN INFORMATION GRANULES

Measure      Boston             New York           Philadelphia
Center       (0.6027, 0.6782)   (0.3858, 0.4870)   (0.1729, 0.2604)
Amplitudes   (0.0906, 0.1027)   (0.0580, 0.0606)   (0.1013, 0.1151)
MSE          0.0360             0.0203             0.0347

TABLE II
PERFORMANCE MEASUREMENTS

Quantity          FCM                       Gaussian representation
Xie-Beni index    0.1656                    0.2687
FLOPS             792M                      14.1M
Time required     138.1 s (81 iterations)   14.7 s
Memory required   2,966,280 B               144 B
[4] J.C. Bezdek, Pattern Recognition with Fuzzy Objective Function
Algorithms. Plenum, New York, 1981.
[5] S. Haykin, Neural Networks: A Comprehensive Foundation, Prentice-
Hall, NJ, 1999.
[6] G. Kollios, D. Gunopulos, N. Koudas, S. Berchtold, Efficient Biased
Sampling for Approximate Clustering and Outlier Detection in Large
Datasets. IEEE Transactions on Knowledge and Data Engineering, to
appear, 2002. Available:
http://dias.cti.gr/~ytheod/research/datasets/spatial.html
[7] X.L. Xie, G. Beni, A Validity Measure for Fuzzy Clustering, IEEE
Transactions on Pattern Analysis and Machine Intelligence, 1991, 13(4),
pp. 841-846.