DOI:10.3233/IFS-130810
IOS Press
Abstract. In this paper we have introduced the notion of distance between two single valued neutrosophic sets and studied its
properties. We have also defined several similarity measures between them and investigated their characteristics. A measure of
entropy of a single valued neutrosophic set has also been introduced.
Keywords: Single valued neutrosophic set, Hausdorff distance, similarity measure, weights, entropy, cardinality
1064-1246/14/$27.50 © 2014 – IOS Press and the authors. All rights reserved
1246 P. Majumdar and S.K. Samanta / On similarity and entropy of neutrosophic sets
Definition 2.6. The intersection of two SVNSs A and B is a SVNS C, written as C = A ∩ B, which is defined as follows:

tC(x) = min(tA(x), tB(x)); iC(x) = min(iA(x), iB(x)) and fC(x) = max(fA(x), fB(x)) ∀x ∈ X.

In this section we introduce the notion of distance between two single valued neutrosophic sets A and B in the universe X = {x1, x2, ..., xn}.

Definition 3.1. Let

A = Σ_{i=1}^{n} xi / <tA(xi), iA(xi), fA(xi)>  and  B = Σ_{i=1}^{n} xi / <tB(xi), iB(xi), fB(xi)>

be two single valued neutrosophic sets in X = {x1, x2, ..., xn}. Then

• The Hamming distance between A and B is defined as follows:

  dN(A, B) = Σ_{i=1}^{n} {|tA(xi) − tB(xi)| + |iA(xi) − iB(xi)| + |fA(xi) − fB(xi)|}   (1)

• The normalized Hamming distance between A and B is defined as follows:

  lN(A, B) = (1/(3n)) Σ_{i=1}^{n} {|tA(xi) − tB(xi)| + |iA(xi) − iB(xi)| + |fA(xi) − fB(xi)|}   (2)

• The Euclidean distance between A and B is defined as follows:

  eN(A, B) = √( Σ_{i=1}^{n} [(tA(xi) − tB(xi))² + (iA(xi) − iB(xi))² + (fA(xi) − fB(xi))²] )   (3)

Example 3.2. Let X = {a, b, c, d} be the universe and A and B be two single valued neutrosophic sets in X defined as follows:

A = { a/<0.5, 0.2, 0.9>, b/<0.8, 0.4, 0.2>, c/<0.3, 0.8, 0.7>, d/<0.6, 0.3, 0.5> },

B = { a/<0.7, 0.4, 0.2>, b/<0.5, 0.5, 0.3>, c/<0.1, 0.2, 0.3>, d/<0.8, 0.1, 0.6> }.

Then the distance between A and B will be dN(A, B) = 3.3. Similarly the other three distances will be lN(A, B) = 3.3/12 ≈ 0.275, eN(A, B) ≈ 1.15 and qN(A, B) ≈ 0.33.
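The intersection of Definition 2.6 and the distances of Definition 3.1 are straightforward to compute. Below is a minimal Python sketch, representing an SVNS as a dict mapping each element of X to its (t, i, f) triple — a representation chosen here for illustration. The defining Equation (4) for qN lies outside this excerpt; the sketch assumes the normalized Euclidean form √(Σ.../(3n)), which is consistent with the value 0.33 reported in Example 3.2.

```python
from math import sqrt

def svns_intersection(A, B):
    """Definition 2.6: componentwise min for t and i, max for f."""
    return {x: (min(A[x][0], B[x][0]),   # truth: min
                min(A[x][1], B[x][1]),   # indeterminacy: min
                max(A[x][2], B[x][2]))   # falsity: max
            for x in A}

def distances(A, B):
    """Hamming (1), normalized Hamming (2), Euclidean (3), and an
    assumed normalized Euclidean form for qN (Eq. (4) is not in this
    excerpt)."""
    n = len(A)
    d = sum(abs(a - b) for x in A for a, b in zip(A[x], B[x]))
    sq = sum((a - b) ** 2 for x in A for a, b in zip(A[x], B[x]))
    return d, d / (3 * n), sqrt(sq), sqrt(sq / (3 * n))

# The two sets of Example 3.2
A = {"a": (0.5, 0.2, 0.9), "b": (0.8, 0.4, 0.2),
     "c": (0.3, 0.8, 0.7), "d": (0.6, 0.3, 0.5)}
B = {"a": (0.7, 0.4, 0.2), "b": (0.5, 0.5, 0.3),
     "c": (0.1, 0.2, 0.3), "d": (0.8, 0.1, 0.6)}

d, l, e, q = distances(A, B)
# d ≈ 3.3, l ≈ 0.275, e ≈ 1.15, q ≈ 0.33 (cf. Example 3.2)
```

For instance, the intersection assigns to each element the triple <min t, min i, max f>, so here a ↦ <0.5, 0.2, 0.9>.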
4. Similarity measure between two single valued neutrosophic sets

In this section we define the notion of similarity between two SVNSs and adopt several methods for calculating it. The first method is based on the distances defined in the previous section, the second on a matching function, and the last on membership grades. In general a similarity measure between two SVNSs is a function s : N(X)² → [0, 1] which satisfies the following properties:

(i) s(A, B) ∈ [0, 1],
(ii) s(A, B) = 1 ⇔ A = B,
(iii) s(A, B) = s(B, A),
(iv) A ⊂ B ⊂ C ⇒ s(A, C) ≤ s(A, B) ∧ s(B, C).   (9)

Individual measures may satisfy further properties in addition to (9).

4.1. Distance based similarity measure

Similarity is inversely proportional to the distance between two sets. Using the distances defined in Equations (1)–(4) we define measures of similarity s1 between two SVN sets A and B as follows:

Then the following result can be easily proved.

Proposition 4.1.2. The distance based similarity measure s1 between two SVNSs A and B satisfies the following properties:

(i) 0 ≤ s1(A, B) ≤ 1
(ii) s1(A, B) = 1 iff A = B
(iii) s1(A, B) = s1(B, A)
(iv) A ⊂ B ⊂ C ⇒ s1(A, C) ≤ s1(A, B) ∧ s1(B, C)

Proof. Results (i)–(iii) hold trivially from the definition. We prove only (iv).
Let A ⊂ B ⊂ C. Then we have

tA(x) ≤ tB(x) ≤ tC(x); iA(x) ≤ iB(x) ≤ iC(x) and fA(x) ≥ fB(x) ≥ fC(x) ∀x ∈ X.

Now

|tA(x) − tB(x)| ≤ |tA(x) − tC(x)| and |tB(x) − tC(x)| ≤ |tA(x) − tC(x)| will hold.

Similarly, |iA(x) − iB(x)| ≤ |iA(x) − iC(x)| and |iB(x) − iC(x)| ≤ |iA(x) − iC(x)|, and also

|fA(x) − fB(x)| ≤ |fA(x) − fC(x)| and |fB(x) − fC(x)| ≤ |fA(x) − fC(x)| hold.
Thus

d(A, B) ≤ d(A, C) ⇒ s1(A, B) ≥ s1(A, C) and
d(B, C) ≤ d(A, C) ⇒ s1(B, C) ≥ s1(A, C),
⇒ s1(A, C) ≤ s1(A, B) ∧ s1(B, C).

This is true for all the distance functions defined in Equations (1) to (4). Hence the result.

4.2. Similarity measure based on membership degrees

Another measure of similarity s2 between two SVN sets A and B could be defined as follows:

s2(A, B) = Σ_{i=1}^{n} {min{tA(xi), tB(xi)} + min{iA(xi), iB(xi)} + min{fA(xi), fB(xi)}} / Σ_{i=1}^{n} {max{tA(xi), tB(xi)} + max{iA(xi), iB(xi)} + max{fA(xi), fB(xi)}}   (12)

Example 4.2.1. Here the similarity measure between the two SVN sets defined in Example 3.2 will be s2(A, B) = 3.8/7.1 ≈ 0.535.

Proposition 4.2.2. The similarity measure s2 between two SVNSs A and B satisfies the following properties:

(i) 0 ≤ s2(A, B) ≤ 1
(ii) s2(A, B) = 1 iff A = B
(iii) s2(A, B) = s2(B, A)
(iv) A ⊂ B ⊂ C ⇒ s2(A, C) ≤ s2(A, B) ∧ s2(B, C)

Proof. Properties (i) and (iii) follow directly from the definition. For (ii),

s2(A, B) = 1
⇒ Σ_{i=1}^{n} {min{tA(xi), tB(xi)} + min{iA(xi), iB(xi)} + min{fA(xi), fB(xi)}} = Σ_{i=1}^{n} {max{tA(xi), tB(xi)} + max{iA(xi), iB(xi)} + max{fA(xi), fB(xi)}}
⇒ Σ_{i=1}^{n} {[min{tA(xi), tB(xi)} − max{tA(xi), tB(xi)}] + [min{iA(xi), iB(xi)} − max{iA(xi), iB(xi)}] + [min{fA(xi), fB(xi)} − max{fA(xi), fB(xi)}]} = 0.

Since each bracketed term is ≤ 0, for each xi

min{tA(xi), tB(xi)} − max{tA(xi), tB(xi)} = 0,
min{iA(xi), iB(xi)} − max{iA(xi), iB(xi)} = 0 and
min{fA(xi), fB(xi)} − max{fA(xi), fB(xi)} = 0.

Thus tA(xi) = tB(xi), iA(xi) = iB(xi) and fA(xi) = fB(xi) for every xi, i.e. A = B.

(iv) Now we prove the last result.
Let A ⊂ B ⊂ C. Then min{tA(x), tB(x)} = tA(x), max{tA(x), tB(x)} = tB(x), min{fA(x), fB(x)} = fB(x) and max{fA(x), fB(x)} = fA(x) (and similarly for the other pairs), so termwise

∴ s2(A, B) = (tA(x) + iA(x) + fB(x)) / (tB(x) + iB(x) + fA(x))
           ≥ (tA(x) + iA(x) + fC(x)) / (tC(x) + iC(x) + fA(x)) = s2(A, C).

Again similarly we have

tB(x) + iB(x) + fC(x) ≥ tA(x) + iA(x) + fC(x) and
tC(x) + iC(x) + fA(x) ≥ tC(x) + iC(x) + fB(x),

so that s2(B, C) ≥ s2(A, C). Hence s2(A, C) ≤ s2(A, B) ∧ s2(B, C).

… common. So for a specific disease all the symptoms are not equally important. Here we can assign weights to each symptom corresponding to a particular disease, so that we have a universe of symptoms where each symptom has its corresponding weight. In this case we cannot use the similarity measures described in 4.1 and 4.2.

Again, a suitable function, called a matching function, is often used to measure the similarity between two sets. Chen [2, 3] first introduced the notion of a matching function. Here a weighted similarity measure sw between A and B has been defined using a matching function as follows:

sw(A, B) = Σ_{i=1}^{n} ωi (tA(xi)·tB(xi) + iA(xi)·iB(xi) + fA(xi)·fB(xi))² / Σ_{i=1}^{n} ωi {(tA(xi)² + iA(xi)² + fA(xi)²)·(tB(xi)² + iB(xi)² + fB(xi)²)} = Σ_{i=1}^{n} ωi (PA(xi)·PB(xi))² / Σ_{i=1}^{n} ωi (PA(xi)²·PB(xi)²),   (13)

where PA(xi)·PB(xi) and PA(xi)² abbreviate the dot product and squared norm of the membership-grade vectors (tA(xi), iA(xi), fA(xi)) and (tB(xi), iB(xi), fB(xi)).
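Both (12) and (13) are easy to check numerically on the two sets of Example 3.2. A minimal Python sketch follows; the equal weights ωi = 0.25 are an illustrative assumption, not taken from the paper.

```python
def s2(A, B):
    """Similarity (12): summed componentwise minima over maxima."""
    num = sum(min(a, b) for x in A for a, b in zip(A[x], B[x]))
    den = sum(max(a, b) for x in A for a, b in zip(A[x], B[x]))
    return num / den

def s_w(A, B, w):
    """Weighted matching-function similarity (13)."""
    num = den = 0.0
    for x in A:
        tA, iA, fA = A[x]
        tB, iB, fB = B[x]
        num += w[x] * (tA * tB + iA * iB + fA * fB) ** 2
        den += w[x] * ((tA**2 + iA**2 + fA**2) * (tB**2 + iB**2 + fB**2))
    return num / den

# The two sets of Example 3.2
A = {"a": (0.5, 0.2, 0.9), "b": (0.8, 0.4, 0.2),
     "c": (0.3, 0.8, 0.7), "d": (0.6, 0.3, 0.5)}
B = {"a": (0.7, 0.4, 0.2), "b": (0.5, 0.5, 0.3),
     "c": (0.1, 0.2, 0.3), "d": (0.8, 0.1, 0.6)}
w = {x: 0.25 for x in A}   # assumed equal weights, for illustration only

# s2(A, B) = 3.8/7.1 ≈ 0.535, as in Example 4.2.1;
# s_w stays in [0, 1] by the Cauchy-Schwarz inequality.
```

Note that the numerator of each term in (13) never exceeds the denominator (Cauchy–Schwarz), which is what keeps sw bounded by 1 regardless of the chosen weights.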
Entropy as a measure of fuzziness was first mentioned by Zadeh [15] in 1965. Later De Luca and Termini [5] axiomatized non-probabilistic entropy. According to them, the entropy E of a fuzzy set A should satisfy the following axioms:

(DT1) E(A) = 0 iff A ∈ 2^X (i.e. A is a crisp set)
(DT2) E(A) = 1 iff µA(x) = 0.5 ∀x ∈ X
(DT3) E(A) ≤ E(B) iff A is less fuzzy than B, i.e. if µA(x) ≤ µB(x) ≤ 0.5 ∀x ∈ X or if µA(x) ≥ µB(x) ≥ 0.5 ∀x ∈ X.

Proposition 5.2. E1 satisfies all the axioms given in Definition 5.1.

Proof.
(i) For a crisp set A, iA(x) = 0 ∀x ∈ X; hence E1(A) = 0 holds.
(ii) If A is such that (tA(x), iA(x), fA(x)) = (0.5, 0.5, 0.5) ∀x ∈ X, then

tA(x) + fA(x) = 1 and iA(x) − iA^c(x) = 0.5 − 0.5 = 0 ∀x ∈ X ⇒ E1(A) = 1.
7. Conclusion

The recently proposed notion of neutrosophic sets is a general formal framework for studying uncertainties arising from 'indeterminacy' factors. From the philosophical point of view, it has been shown that a neutrosophic set generalizes a classical set, fuzzy set, interval valued fuzzy set, intuitionistic fuzzy set etc. A single valued neutrosophic set is an instance of a neutrosophic set which can be used in real scientific and engineering applications. Therefore the study of single valued neutrosophic sets and their properties has considerable significance both for applications and for understanding the fundamentals of uncertainty. In this paper we have introduced some measures of similarity and entropy of single valued neutrosophic sets for the first time. These measures are consistent with similar considerations for other sets such as fuzzy sets and intuitionistic fuzzy sets.

Acknowledgments

We sincerely thank the anonymous reviewers for their valuable comments, which helped us put the paper in its present form.

References

[2] S.M. Chen, A new approach to handling fuzzy decision making problems, IEEE Transactions on Systems, Man and Cybernetics 18 (1988), 1012–1016.
[3] S.M. Chen and P.H. Hsiao, A comparison of similarity measures of fuzzy values, Fuzzy Sets and Systems 72 (1995), 79–89.
[4] C.C. Cheng and K.H. Liao, Parameter optimization based on entropy weight and triangular fuzzy number, International Journal of Engineering and Industries 2(2) (2011), 62–75.
[5] A. De Luca and S. Termini, A definition of a non-probabilistic entropy in the setting of fuzzy sets theory, Information and Control 20 (1972), 301–312.
[6] A. Kaufmann, Introduction to the Theory of Fuzzy Subsets, Vol. 1: Fundamental Theoretical Elements, Academic Press, New York, 1975.
[7] B. Kosko, Fuzzy entropy and conditioning, Information Sciences 40(2) (1986), 165–174.
[8] P. Majumdar and S.K. Samanta, Softness of a soft set: Soft set entropy, Annals of Fuzzy Mathematics and Informatics, 2012 (accepted).
[9] E. Pasha, Fuzzy entropy as cost function in image processing, in: Proc. of the 2nd IMT-GT Regional Conf. on Math., Stat. and Appl., Universiti Sains Malaysia, Penang, 2006.
[10] F. Smarandache, A Unifying Field in Logics. Neutrosophy: Neutrosophic Probability, Set and Logic, American Research Press, Rehoboth, 1999.
[11] E. Szmidt and J. Kacprzyk, Entropy for intuitionistic fuzzy sets, Fuzzy Sets and Systems 118 (2001), 467–477.
[12] H. Wang et al., Single valued neutrosophic sets, in: Proc. of 10th Int. Conf. on Fuzzy Theory and Technology, Salt Lake City, Utah, 2005.
[13] R.R. Yager, On the measure of fuzziness and negation, Part I: Membership in the unit interval, International Journal of General Systems 5 (1979), 189–200.
[14] L. Zadeh, Fuzzy sets, Information and Control 8 (1965), 338–353.
[15] L. Zadeh, Fuzzy sets and systems, in: Proc. Symp. on Systems Theory, Polytechnic Institute of Brooklyn, New York, 1965.