Journal of Intelligent & Fuzzy Systems 26 (2014) 1245–1252

DOI:10.3233/IFS-130810
IOS Press

On similarity and entropy of neutrosophic sets


Pinaki Majumdar (a,b,*) and Syamal Kumar Samanta (b)
(a) Department of Mathematics, M.U.C. Women's College, Burdwan, West Bengal, India
(b) Department of Mathematics, Visva-Bharati, Santiniketan, West Bengal, India
(*) Corresponding author. E-mail: pmajumdar2@rediffmail.com

Abstract. In this paper we have introduced the notion of distance between two single valued neutrosophic sets and studied its
properties. We have also defined several similarity measures between them and investigated their characteristics. A measure of
entropy of a single valued neutrosophic set has also been introduced.

Keywords: Single valued neutrosophic set, Hausdorff distance, similarity measure, weights, entropy, cardinality

1. Introduction

In many practical situations and in many complex systems, such as biological, behavioural and chemical ones, we encounter different types of uncertainties. Our classical mathematics does not accommodate any kind of uncertainty in its tools, except possibly in the case of probability, which can handle a particular kind of uncertainty called randomness. Therefore new techniques and modifications of classical tools are required to model such uncertain phenomena. In 1965, L. A. Zadeh [14] coined his remarkable theory of fuzzy sets, which deals with a kind of uncertainty known as "fuzziness", arising from the partial membership of an element in a set. Later this "fuzziness" concept led to the highly acclaimed theory of fuzzy logic. After the invention of fuzzy sets many other hybrid concepts began to develop. In 1983, K. Atanassov [1] introduced the idea of intuitionistic fuzzy sets, sets in which each member has a degree of belongingness as well as a degree of non-belongingness. This is again a generalization of fuzzy set theory. Although fuzzy set theory is very successful in handling uncertainties arising from vagueness or partial belongingness of an element in a set, it cannot model all sorts of uncertainties prevailing in different real physical problems, such as problems involving incomplete information. Hence further generalizations of fuzzy and intuitionistic fuzzy sets are required. Since then many theories have evolved which are successful in their respective domains.

Recently a new theory has been introduced, known as neutrosophic logic and sets. The term "neutrosophy" means "knowledge of neutral thought", and this 'neutral' represents the main distinction from 'fuzzy' and 'intuitionistic fuzzy' logic and sets. 'Neutrosophic logic' was introduced by Florentin Smarandache [10] in 1995. It is a logic in which each proposition is estimated to have a degree of truth (T), a degree of indeterminacy (I) and a degree of falsity (F). A neutrosophic set is a set in which each element of the universe has a degree of truth, indeterminacy and falsity respectively, each lying in ]⁻0, 1⁺[, the non-standard unit interval. Unlike in intuitionistic fuzzy sets, where the incorporated uncertainty is dependent on the degrees of belongingness and non-belongingness, here the uncertainty present, i.e. the indeterminacy factor, is independent of the truth and falsity values. In 2005, Wang et al. [12] introduced an instance of the neutrosophic set known as the single valued neutrosophic set (SVNS), which was motivated from a practical point of view and can be used in real scientific and engineering applications. The single valued neutrosophic set is a generalization of the classical set, fuzzy set, intuitionistic fuzzy set, paraconsistent set etc.


Again, fuzziness is a feature of imperfect information, arising from the partial belongingness of an element to a set. The term 'entropy' as a measure of fuzziness was first coined by Zadeh [15] in 1965. The entropy measure of fuzzy sets has many applications in areas like image processing, optimization etc. [4, 9]. In fact, the imperfectness of the information represented by a set is measured in terms of entropy. So measuring the entropy of single valued neutrosophic sets will be useful whenever uncertain situations are modelled through SVNSs. On the other hand, similarity is a key concept in a number of fields such as linguistics, psychology and computational intelligence. In several problems we often need to compare two sets. We are often interested to know whether two patterns or images are identical, approximately identical, or at least to what degree they are identical. Similar elements are regarded from different points of view by using resemblances, distances, closeness, proximity, dissimilarities etc. So it is natural and very useful to address the issue of similarity measures for this new class of sets, viz. single valued neutrosophic sets.

The rest of the paper is organized as follows. Some preliminary definitions and results are given in Section 2. In Section 3, different types of distances between two single valued neutrosophic sets are introduced. Different measures of similarity and their properties are discussed in Section 4. In Section 5, the notion of entropy of a single valued neutrosophic set is given. Section 6 briefly compares the methods described here with earlier available methods. Section 7 concludes the paper.

2. Preliminaries

In this section we recall some definitions, operations and properties regarding single valued neutrosophic sets (SVNS in short) from [12], which will be used in the rest of the paper. A single valued neutrosophic set has been defined in [12] as follows:

Definition 2.1. Let X be a universal set. A neutrosophic set A in X is characterized by a truth-membership function t_A, an indeterminacy-membership function i_A and a falsity-membership function f_A, where t_A, i_A, f_A : X → [0, 1] are functions, and ∀x ∈ X, x ≡ x(t_A(x), i_A(x), f_A(x)) ∈ A is a single valued neutrosophic element of A.

A single valued neutrosophic set A over a finite universe X = {x_1, x_2, x_3, ..., x_n} is represented as

A = Σ_{i=1}^{n} x_i / ⟨t_A(x_i), i_A(x_i), f_A(x_i)⟩   or   A = Σ_{i=1}^{n} ⟨t_A(x_i), i_A(x_i), f_A(x_i)⟩ / x_i.

In the case of a SVNS the truth-membership (T), indeterminacy-membership (I) and falsity-membership (F) values lie in [0, 1] instead of the non-standard unit interval ]⁻0, 1⁺[ as in the case of ordinary neutrosophic sets. An example of a SVNS is given below:

Example 2.2. Assume that X = {x_1, x_2, x_3}, where x_1 is the capacity, x_2 is the trustworthiness and x_3 is the price of a machine, is the universal set. The values of x_1, x_2, x_3 are in [0, 1]. They are obtained from a questionnaire given to some domain experts; their opinion could be a degree of "good service", a degree of indeterminacy and a degree of "poor service". A is a single valued neutrosophic set of X defined by

A = ⟨0.3, 0.4, 0.5⟩/x_1 + ⟨0.5, 0.2, 0.3⟩/x_2 + ⟨0.7, 0.2, 0.2⟩/x_3.

Next we state the definitions of complement and containment as follows:

Definition 2.3. The complement of a SVNS A is denoted by A^c and is defined by t_{A^c}(x) = f_A(x); i_{A^c}(x) = 1 − i_A(x) and f_{A^c}(x) = t_A(x) ∀x ∈ X.

Definition 2.4. A SVNS A is contained in another SVNS B, denoted A ⊂ B, if and only if t_A(x) ≤ t_B(x); i_A(x) ≤ i_B(x) and f_A(x) ≥ f_B(x) ∀x ∈ X. Two sets are equal, i.e. A = B, iff A ⊂ B and B ⊂ A.

Let us denote the collection of all SVNSs in X by N(X). Several operations like union and intersection have been defined on SVNSs, and they satisfy most of the common algebraic properties of ordinary sets.

Definition 2.5. The union of two SVNSs A and B is a SVNS C, written C = A ∪ B, which is defined as follows:

t_C(x) = max(t_A(x), t_B(x)); i_C(x) = max(i_A(x), i_B(x)) and f_C(x) = min(f_A(x), f_B(x)) ∀x ∈ X.
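The definitions above map directly onto a small data structure. The following sketch is ours, not part of [12]; the dictionary representation and the function names are illustrative assumptions. It stores a SVNS over a finite universe as a map from elements to (t, i, f) triples and implements Definitions 2.3–2.5; the intersection of Definition 2.6 below is obtained dually (min on t and i, max on f).

```python
# Illustrative sketch (not from the paper): a SVNS over a finite universe,
# stored as {element: (t, i, f)} with all values in [0, 1].

def complement(A):
    """Definition 2.3: swap t and f, and replace i by 1 - i."""
    return {x: (f, 1 - i, t) for x, (t, i, f) in A.items()}

def is_contained(A, B):
    """Definition 2.4: A is contained in B iff t_A <= t_B, i_A <= i_B and f_A >= f_B everywhere."""
    return all(A[x][0] <= B[x][0] and A[x][1] <= B[x][1] and A[x][2] >= B[x][2] for x in A)

def union(A, B):
    """Definition 2.5: max on t and i, min on f."""
    return {x: (max(A[x][0], B[x][0]), max(A[x][1], B[x][1]), min(A[x][2], B[x][2])) for x in A}

# The SVNS of Example 2.2 over X = {x1, x2, x3}.
A = {"x1": (0.3, 0.4, 0.5), "x2": (0.5, 0.2, 0.3), "x3": (0.7, 0.2, 0.2)}
print(complement(A))  # {'x1': (0.5, 0.6, 0.3), 'x2': (0.3, 0.8, 0.5), 'x3': (0.2, 0.8, 0.7)}
```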

Definition 2.6. The intersection of two SVNSs A and B is a SVNS C, written C = A ∩ B, which is defined as follows:

t_C(x) = min(t_A(x), t_B(x)); i_C(x) = min(i_A(x), i_B(x)) and f_C(x) = max(f_A(x), f_B(x)) ∀x ∈ X.

For practical purposes, throughout the rest of the paper we consider only SVNSs over a finite universe.

3. Distance between two neutrosophic sets

In this section we introduce the notion of distance between two single valued neutrosophic sets A and B in the universe X = {x_1, x_2, x_3, ..., x_n}.

Definition 3.1. Let

A = Σ_{i=1}^{n} x_i / ⟨t_A(x_i), i_A(x_i), f_A(x_i)⟩   and   B = Σ_{i=1}^{n} x_i / ⟨t_B(x_i), i_B(x_i), f_B(x_i)⟩

be two single valued neutrosophic sets in X = {x_1, x_2, x_3, ..., x_n}. Then

• The Hamming distance between A and B is defined as follows:

d_N(A, B) = Σ_{i=1}^{n} { |t_A(x_i) − t_B(x_i)| + |i_A(x_i) − i_B(x_i)| + |f_A(x_i) − f_B(x_i)| }   (1)

• The normalized Hamming distance between A and B is defined as follows:

l_N(A, B) = (1/3n) Σ_{i=1}^{n} { |t_A(x_i) − t_B(x_i)| + |i_A(x_i) − i_B(x_i)| + |f_A(x_i) − f_B(x_i)| }   (2)

• The Euclidean distance between A and B is defined as follows:

e_N(A, B) = √( Σ_{i=1}^{n} { (t_A(x_i) − t_B(x_i))² + (i_A(x_i) − i_B(x_i))² + (f_A(x_i) − f_B(x_i))² } )   (3)

• The normalized Euclidean distance between A and B is defined as follows:

q_N(A, B) = √( (1/3n) Σ_{i=1}^{n} { (t_A(x_i) − t_B(x_i))² + (i_A(x_i) − i_B(x_i))² + (f_A(x_i) − f_B(x_i))² } )   (4)

Now for Equations (1)–(4) the following bounds hold:

(i) 0 ≤ d_N(A, B) ≤ 3n   (5)
(ii) 0 ≤ l_N(A, B) ≤ 1   (6)
(iii) 0 ≤ e_N(A, B) ≤ √(3n)   (7)
(iv) 0 ≤ q_N(A, B) ≤ 1   (8)

Example 3.2. Let X = {a, b, c, d} be the universe and A and B be two single valued neutrosophic sets in X defined as follows:

A = { a/⟨0.5, 0.2, 0.9⟩, b/⟨0.8, 0.4, 0.2⟩, c/⟨0.3, 0.8, 0.7⟩, d/⟨0.6, 0.3, 0.5⟩ },
B = { a/⟨0.7, 0.4, 0.2⟩, b/⟨0.5, 0.5, 0.3⟩, c/⟨0.1, 0.2, 0.3⟩, d/⟨0.8, 0.1, 0.6⟩ }.

Then the Hamming distance between A and B is d_N(A, B) = 3.3. Similarly, the other three distances are l_N(A, B) = 3.3/12 ≈ 0.275, e_N(A, B) ≈ 1.15 and q_N(A, B) ≈ 0.33.

Then the following result can be easily proved.

Proposition 3.3. The distances d_N, l_N, e_N and q_N defined above are metrics.
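As a quick check of Example 3.2, the four distances of Definition 3.1 can be computed directly. The sketch below is ours (the dict-of-triples representation and the function names are illustrative, not from the paper); it reproduces the values 3.3, 0.275, ≈1.15 and ≈0.33 quoted above.

```python
from math import sqrt

# The two SVNSs of Example 3.2, stored as {element: (t, i, f)}.
A = {"a": (0.5, 0.2, 0.9), "b": (0.8, 0.4, 0.2), "c": (0.3, 0.8, 0.7), "d": (0.6, 0.3, 0.5)}
B = {"a": (0.7, 0.4, 0.2), "b": (0.5, 0.5, 0.3), "c": (0.1, 0.2, 0.3), "d": (0.8, 0.1, 0.6)}

def hamming(A, B):        # Eq. (1): sum of absolute componentwise differences
    return sum(abs(a - b) for x in A for a, b in zip(A[x], B[x]))

def norm_hamming(A, B):   # Eq. (2): Hamming distance divided by 3n
    return hamming(A, B) / (3 * len(A))

def euclidean(A, B):      # Eq. (3): square root of the sum of squared differences
    return sqrt(sum((a - b) ** 2 for x in A for a, b in zip(A[x], B[x])))

def norm_euclidean(A, B): # Eq. (4): Euclidean distance normalized by sqrt(3n)
    return euclidean(A, B) / sqrt(3 * len(A))

print(hamming(A, B), norm_hamming(A, B))      # 3.3    0.275
print(euclidean(A, B), norm_euclidean(A, B))  # ~1.153  ~0.333
```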

Definition 3.4. (Cardinality) The minimum (or sure) cardinality of a SVNS A is denoted by min count(A) or c_l and is defined as c_l = Σ_{i=1}^{n} t_A(x_i). The maximum cardinality of A is denoted by max count(A) or c_u and is defined as c_u = Σ_{i=1}^{n} { t_A(x_i) + (1 − i_A(x_i)) }. The cardinality of A is defined by the interval [c_l, c_u].

Example 3.5. For the SVNS A given in Example 3.2 we have the following:

c_l = Σ_{i=1}^{n} t_A(x_i) = 0.5 + 0.8 + 0.3 + 0.6 = 2.2 and
c_u = Σ_{i=1}^{n} { t_A(x_i) + (1 − i_A(x_i)) } = 1.3 + 1.4 + 0.5 + 1.3 = 4.5.

4. Similarity measure between two single valued neutrosophic sets

In this section we define the notion of similarity between two SVNSs. We have adopted various methods for calculating this similarity. The first method is based on the distances defined in the previous section, the second one is based on membership grades and the last one is based on a matching function. In general a similarity measure between two SVNSs is a function s : N(X)² → [0, 1] which satisfies the following properties:

(i) s(A, B) ∈ [0, 1]
(ii) s(A, B) = 1 ⇔ A = B
(iii) s(A, B) = s(B, A)
(iv) A ⊂ B ⊂ C ⇒ s(A, C) ≤ s(A, B) ∧ s(B, C)   (9)

But individual measures may satisfy more properties in addition to (9). Now similarity can be calculated using several techniques. Here we have adopted three common techniques, namely the distance based one, the membership grade based one and, lastly, the one based on a matching function.

4.1. Distance based similarity measure

We know that similarity is inversely proportional to the distance between two sets. Using the distances defined in Equations (1)–(4) we define a measure of similarity s_1 between two SVN sets A and B as follows:

s_1(A, B) = 1 / (1 + d(A, B))   (10)

For example, if we use the Hamming distance d_N, then the associated measure of similarity will be denoted by s_N^1 and is defined by:

s_N^1(A, B) = 1 / (1 + d_N(A, B))   (11)

The following example calculates this similarity measure between two SVNSs:

Example 4.1.1. The similarity measure between the two SVNSs defined in Example 3.2 will be

s_1(A, B) = 1 / (1 + 3.3) ≈ 0.233.
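Continuing the running example, Equation (10) is a one-line transformation of any of the distances above. A minimal sketch (ours; it repeats the hypothetical hamming helper from the earlier snippet so that it runs on its own):

```python
A = {"a": (0.5, 0.2, 0.9), "b": (0.8, 0.4, 0.2), "c": (0.3, 0.8, 0.7), "d": (0.6, 0.3, 0.5)}
B = {"a": (0.7, 0.4, 0.2), "b": (0.5, 0.5, 0.3), "c": (0.1, 0.2, 0.3), "d": (0.8, 0.1, 0.6)}

def hamming(A, B):  # Eq. (1)
    return sum(abs(a - b) for x in A for a, b in zip(A[x], B[x]))

def similarity_s1(A, B, dist=hamming):
    # Distance based similarity, Eqs. (10)-(11): s1 = 1 / (1 + d(A, B)).
    return 1.0 / (1.0 + dist(A, B))

print(similarity_s1(A, B))  # 1 / (1 + 3.3) ~ 0.233, as in Example 4.1.1
```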


Thus ⇒ {min{tA (xi ), tB (xi )}
x
d(A, B) ≤ d(A, C) ⇒ s1 (A, B) ≥ s1 (A, C) and
+ min{iA (xi ), iB (xi )}
d(B, C) ≤ d(A, C) ⇒ s1 (B, C) ≥ s1 (A, C)
+ min{fA (xi ), fB (xi )}}
⇒ s1 (A, C) ≤ s1 (A, B) 
= {max{tA (xi ), tB (xi )} + max{iA (xi ),
∧s1 (B, C). x
iB (xi )} + max{fA (xi ), fB (xi )}
This is true for all the distance functions defined in 
Equations (1) to (4). ⇒ {[min{tA (xi ), tB (xi )}−max{tA (xi ), tB (xi )}]
Hence the result. x
+ [min{iA (xi ), iB (xi )} − max{iA (xi ), iB (xi )}]
4.2. Similarity measure based on membership + min[{fA (xi ), fB (xi )}
degrees
− max{fA (xi ), fB (xi )}] = 0
Another measure of similarity s2 between two SVN Thus for each x,
sets A and B could be defined as follows:

n
{min{tA (xi ), tB (xi )} + min{iA (xi ), iB (xi )} + min{fA (xi ), fB (xi )}}
i=1
s2 (A, B) = (12)
n
{max{tA (xi ), tB (xi )} + max{iA (xi ), iB (xi )} + max{fA (xi ), fB (xi )}
i=1

Example 4.2.1. Here the similarity measure between min{tA (xi ), tB (xi )} − max{tA (xi ), tB (xi )} = 0,
the two SVN sets defined in example 3.2 will be
min{iA (xi ), iB (xi )} − max{iA (xi ), iB (xi )} = 0 and
3.8 ∼ min{fA (xi ), fB (xi )} − max{fA (xi ), fB (xi )} = 0 holds.
s2 (A, B) = = 0.535.
7.1
Thus tA (x) = tB (x), iA (x) = iB (x) &fA (x) = fB (x)
Proposition 4.2.2. The distance based similarity mea- → A = B.
sure s2 , between two SVNS A and B satisfies the
following properties: (iv) Now we prove the last result.

(i) 0 ≤ s2 (A, B) ≤ 1 Let A ⊂ B ⊂ C. Then we have


(ii) s2 (A, B) = 1 iff A = B
tA (x) ≤ tB (x) ≤ tc (x); iA (x) ≤ iB (x) ≤ ic (x) and
(iii) s2 (A, B) = s2 (B, A)
(iv) A ⊂ B ⊂ C ⇒ s3 (A, C) ≤ s3 (A, B) ∧ fA (x) ≥ fB (x) ≥ fc (x) ∀x ∈ U.
s3 (B, C).
Now
Proof. Properties (i) and (iii) follows readily from
definition. tA (x) + iA (x) + fB (x) ≥ tA (x) + iA (x) + fC (x) and
(ii) It is clear that if A = B ⇒ s2 (A, B) = 1. tB (x) + iB (x) + fA (x) ≤ tC (x) + iC (x) + fA (x)
Conversely, let


n
{min{tA (xi ), tB (xi )} + min{iA (xi ), iB (xi )} + min{fA (xi ), fB (xi )}}
i=1
s2 (A, B) = 1 ⇒ =1
n
{max{tA (xi ), tB (xi )} + max{iA (xi ), iB (xi )} + max{fA (xi ), fB (xi )}
i=1
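The ratio in Equation (12) is equally short to compute. The following sketch is ours, with the same hypothetical dict-of-triples representation used above; it reproduces the value of Example 4.2.1.

```python
A = {"a": (0.5, 0.2, 0.9), "b": (0.8, 0.4, 0.2), "c": (0.3, 0.8, 0.7), "d": (0.6, 0.3, 0.5)}
B = {"a": (0.7, 0.4, 0.2), "b": (0.5, 0.5, 0.3), "c": (0.1, 0.2, 0.3), "d": (0.8, 0.1, 0.6)}

def similarity_s2(A, B):
    # Membership-grade based similarity, Eq. (12):
    # summed componentwise minima divided by summed componentwise maxima.
    num = sum(min(a, b) for x in A for a, b in zip(A[x], B[x]))
    den = sum(max(a, b) for x in A for a, b in zip(A[x], B[x]))
    return num / den

print(similarity_s2(A, B))  # 3.8 / 7.1 ~ 0.535, as in Example 4.2.1
```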

Proposition 4.2.2. The similarity measure s_2 between two SVNSs A and B satisfies the following properties:

(i) 0 ≤ s_2(A, B) ≤ 1
(ii) s_2(A, B) = 1 iff A = B
(iii) s_2(A, B) = s_2(B, A)
(iv) A ⊂ B ⊂ C ⇒ s_2(A, C) ≤ s_2(A, B) ∧ s_2(B, C)

Proof. Properties (i) and (iii) follow readily from the definition.
(ii) It is clear that A = B ⇒ s_2(A, B) = 1. Conversely, let s_2(A, B) = 1. Then

Σ_{i=1}^{n} { min{t_A(x_i), t_B(x_i)} + min{i_A(x_i), i_B(x_i)} + min{f_A(x_i), f_B(x_i)} } = Σ_{i=1}^{n} { max{t_A(x_i), t_B(x_i)} + max{i_A(x_i), i_B(x_i)} + max{f_A(x_i), f_B(x_i)} }
⇒ Σ_{i=1}^{n} { [min{t_A(x_i), t_B(x_i)} − max{t_A(x_i), t_B(x_i)}] + [min{i_A(x_i), i_B(x_i)} − max{i_A(x_i), i_B(x_i)}] + [min{f_A(x_i), f_B(x_i)} − max{f_A(x_i), f_B(x_i)}] } = 0.

Since each bracketed term is non-positive, for each x_i

min{t_A(x_i), t_B(x_i)} − max{t_A(x_i), t_B(x_i)} = 0,
min{i_A(x_i), i_B(x_i)} − max{i_A(x_i), i_B(x_i)} = 0 and
min{f_A(x_i), f_B(x_i)} − max{f_A(x_i), f_B(x_i)} = 0 hold.

Thus t_A(x) = t_B(x), i_A(x) = i_B(x) and f_A(x) = f_B(x) ∀x ∈ X, i.e. A = B.

(iv) Now we prove the last result. Let A ⊂ B ⊂ C. Then we have

t_A(x) ≤ t_B(x) ≤ t_C(x); i_A(x) ≤ i_B(x) ≤ i_C(x) and f_A(x) ≥ f_B(x) ≥ f_C(x) ∀x ∈ X.

Now

t_A(x) + i_A(x) + f_B(x) ≥ t_A(x) + i_A(x) + f_C(x) and
t_B(x) + i_B(x) + f_A(x) ≤ t_C(x) + i_C(x) + f_A(x).

(Under A ⊂ B the elementwise minima in (12) are t_A, i_A, f_B and the maxima are t_B, i_B, f_A.) Therefore

s_2(A, B) = (t_A(x) + i_A(x) + f_B(x)) / (t_B(x) + i_B(x) + f_A(x)) ≥ (t_A(x) + i_A(x) + f_C(x)) / (t_C(x) + i_C(x) + f_A(x)) = s_2(A, C).

Again, similarly we have

t_B(x) + i_B(x) + f_C(x) ≥ t_A(x) + i_A(x) + f_C(x) and t_C(x) + i_C(x) + f_A(x) ≥ t_C(x) + i_C(x) + f_B(x),

∴ s_2(B, C) = (t_B(x) + i_B(x) + f_C(x)) / (t_C(x) + i_C(x) + f_B(x)) ≥ (t_A(x) + i_A(x) + f_C(x)) / (t_C(x) + i_C(x) + f_A(x)) = s_2(A, C),

⇒ s_2(A, C) ≤ s_2(A, B) ∧ s_2(B, C).

Hence the proof of this proposition is complete.

4.3. Similarity measure based on a matching function

Next consider a universe where each element x_i has a weight ω_i. Then we require a new measure of similarity, different from those discussed earlier. Often weights are associated with each element of a universe to give an order of importance among the elements. To illustrate the situation we give an example. Suppose that there is a system to detect a disease based on several symptoms associated with it. Each symptom is characterized by three things, namely a degree of truth, a degree of indeterminacy and a degree of falsity. So for each disease we can have a corresponding single valued neutrosophic set with the symptoms as its elements. These SVNSs will act as our knowledge base. Whenever a patient comes with some health problem the system will generate his corresponding SVNS. The measure of similarity, as discussed in the previous sections, between these SVNSs can detect the possible disease. But we know that many diseases have several symptoms in common, so for a specific disease not all symptoms are equally important. Here we can assign a weight to each symptom corresponding to a particular disease. So we have a universe of symptoms where each symptom has its corresponding weight. In this case we cannot use the similarity measures described in 4.1 and 4.2.

Again, a suitable function, called a matching function, is often used to measure the similarity between two sets. Chen [2, 3] first introduced the notion of a matching function. Here a weighted similarity measure s_w between A and B has been defined using a matching function as follows:

s_w(A, B) = Σ_{i=1}^{n} ω_i (t_A(x_i)·t_B(x_i) + i_A(x_i)·i_B(x_i) + f_A(x_i)·f_B(x_i))² / Σ_{i=1}^{n} ω_i { (t_A(x_i)² + i_A(x_i)² + f_A(x_i)²)·(t_B(x_i)² + i_B(x_i)² + f_B(x_i)²) } = Σ_{i=1}^{n} ω_i (P_A(x_i)·P_B(x_i))² / Σ_{i=1}^{n} ω_i (P_A(x_i)²·P_B(x_i)²),   (13)

where P_A(x_i) = (t_A(x_i), i_A(x_i), f_A(x_i)).

Example 4.3.1. Consider the two SVN sets in Example 3.2. Further let the elements a, b, c, d of the universe X have weights 0.1, 0.3, 0.5, 0.2 respectively. Then the weighted similarity measure between the two SVN sets will be

s_w(A, B) = (0.1 × 0.3721 + 0.3 × 0.4356 + 0.5 × 0.16 + 0.2 × 0.6561) / (0.1 × 0.759 + 0.3 × 0.4956 + 0.5 × 0.1708 + 0.2 × 0.707) = 0.37911 / 0.45138 ≈ 0.84.

Proposition 4.3.2. The weighted similarity measure s_w between two SVNSs A and B satisfies the following properties:

(i) 0 ≤ s_w(A, B) ≤ 1
(ii) s_w(A, B) = 1 if A = B
(iii) s_w(A, B) = s_w(B, A)

Proof. These follow trivially from the definition and the Cauchy-Schwarz inequality.

Note that here results (ii) and (iv) (i.e. the monotonicity law) of Equation (9) will not hold in general due to the effect of the weights.
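Equation (13) weighs, for each element, the squared dot product of the two (t, i, f) vectors against the product of their squared norms. The sketch below is ours (variable names and representation are illustrative); it reproduces the value of Example 4.3.1.

```python
A = {"a": (0.5, 0.2, 0.9), "b": (0.8, 0.4, 0.2), "c": (0.3, 0.8, 0.7), "d": (0.6, 0.3, 0.5)}
B = {"a": (0.7, 0.4, 0.2), "b": (0.5, 0.5, 0.3), "c": (0.1, 0.2, 0.3), "d": (0.8, 0.1, 0.6)}
w = {"a": 0.1, "b": 0.3, "c": 0.5, "d": 0.2}  # weights from Example 4.3.1

def similarity_sw(A, B, w):
    # Weighted matching-function similarity, Eq. (13).
    num = sum(w[x] * sum(a * b for a, b in zip(A[x], B[x])) ** 2 for x in A)
    den = sum(w[x] * sum(a * a for a in A[x]) * sum(b * b for b in B[x]) for x in A)
    return num / den

print(similarity_sw(A, B, w))  # 0.37911 / 0.45138 ~ 0.84, as in Example 4.3.1
```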

5. Entropy of a single valued neutrosophic set

Entropy can be considered as a measure of the uncertainty involved in a set, whether fuzzy, intuitionistic fuzzy or vague etc. Since SVNSs are also capable of handling uncertain data, as a natural consequence we are also interested in finding the entropy of a single valued neutrosophic set.

Entropy as a measure of fuzziness was first mentioned by Zadeh [15] in 1965. Later De Luca and Termini [5] axiomatized the non-probabilistic entropy. According to them the entropy E of a fuzzy set A should satisfy the following axioms:

(DT1) E(A) = 0 iff A ∈ 2^X
(DT2) E(A) = 1 iff µ_A(x) = 0.5 ∀x ∈ X
(DT3) E(A) ≤ E(B) iff A is less fuzzy than B, i.e. if µ_A(x) ≤ µ_B(x) ≤ 0.5 ∀x ∈ X or if µ_A(x) ≥ µ_B(x) ≥ 0.5 ∀x ∈ X
(DT4) E(A^c) = E(A)   (14)

Several other authors have investigated the notion of entropy. Kaufmann [6] proposed a distance based measure of soft entropy; Yager [13] gave another view of the degree of fuzziness of a fuzzy set in terms of the lack of distinction between the fuzzy set and its complement; Kosko [7] investigated fuzzy entropy in relation to a measure of subsethood; Szmidt and Kacprzyk [11] studied the entropy of intuitionistic fuzzy sets; Majumdar and Samanta [8] investigated the entropy of soft sets, etc.

Definition 5.1. Similarly, here in the case of SVNSs we introduce the entropy as a function E_N : N(X) → [0, 1] which satisfies the following axioms:

(i) E_N(A) = 0 if A is a crisp set
(ii) E_N(A) = 1 if (t_A(x), i_A(x), f_A(x)) = (0.5, 0.5, 0.5) ∀x ∈ X
(iii) E_N(A) ≥ E_N(B) if A is more uncertain than B, i.e. t_A(x) + f_A(x) ≤ t_B(x) + f_B(x) and |i_A(x) − i_{A^c}(x)| ≤ |i_B(x) − i_{B^c}(x)| ∀x ∈ X
(iv) E_N(A) = E_N(A^c) ∀A ∈ N(X)   (15)

Now notice that in a SVNS the presence of uncertainty is due to two factors: firstly the partial belongingness and partial non-belongingness, and secondly the indeterminacy factor. Considering these two factors we propose an entropy measure E_1 of a single valued neutrosophic set A as follows:

E_1(A) = 1 − (1/n) Σ_{x_i ∈ X} (t_A(x_i) + f_A(x_i)) · |i_A(x_i) − i_{A^c}(x_i)|   (16)

Proposition 5.2. E_1 satisfies all the axioms given in Definition 5.1.

Proof.
(i) For a crisp set A, t_A(x) + f_A(x) = 1 and i_A(x) = 0, so |i_A(x) − i_{A^c}(x)| = 1 ∀x ∈ X. Hence E_1(A) = 0 holds.
(ii) If A is such that (t_A(x), i_A(x), f_A(x)) = (0.5, 0.5, 0.5) ∀x ∈ X, then t_A(x) + f_A(x) = 1 and i_A(x) − i_{A^c}(x) = 0.5 − 0.5 = 0 ∀x ∈ X, whence E_1(A) = 1.
(iii) It holds from the definition.
(iv) E_1(A) = E_1(A^c) holds obviously from the definition.

Thus E_1 is an entropy function defined on N(X).

Example 5.3. Let X = {a, b, c, d} be the universe and A be a single valued neutrosophic set in X defined as follows:

A = { a/⟨0.5, 0.2, 0.9⟩, b/⟨0.8, 0.4, 0.2⟩, c/⟨0.3, 0.8, 0.7⟩, d/⟨0.6, 0.3, 0.5⟩ }.

Then the entropy of A will be E_1(A) = 1 − 0.52 = 0.48.
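Equation (16) is also easy to evaluate numerically. The following is a minimal sketch of ours (representation and names are illustrative) that reproduces Example 5.3, using i_{A^c} = 1 − i_A from Definition 2.3.

```python
A = {"a": (0.5, 0.2, 0.9), "b": (0.8, 0.4, 0.2), "c": (0.3, 0.8, 0.7), "d": (0.6, 0.3, 0.5)}

def entropy_e1(A):
    # Eq. (16): E1(A) = 1 - (1/n) * sum over x of (t + f) * |i - i_complement|,
    # where the complement of Definition 2.3 gives i_complement = 1 - i.
    n = len(A)
    return 1 - sum((t + f) * abs(i - (1 - i)) for (t, i, f) in A.values()) / n

print(entropy_e1(A))  # 1 - 0.52 = 0.48, as in Example 5.3
```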

6. Discussions

The techniques of similarity and entropy described here are new concepts for single valued neutrosophic sets. In the case of entropy, no earlier method of measuring entropy is able to determine the entropy of a SVNS, as here the sum of the degrees of truth, indeterminacy and falsity is not necessarily bounded. A fuzzy set F can be considered as a SVNS with degree of indeterminacy and degree of falsity zero; its entropy is then E_1(F) = 1 − (1/n) Σ_{x_i ∈ X} t_F(x_i). For a crisp set this value is 0, which conforms with earlier known results. The similarity measures described here are again new concepts for SVNSs. These measures can also be applied to measure the similarity of intuitionistic fuzzy sets (IFS), where the indeterminacy factor i should be replaced by π = 1 − t − f in the case of an IFS.

7. Conclusion

The recently proposed notion of neutrosophic sets is a general formal framework for studying uncertainties arising due to 'indeterminacy' factors. From the philosophical point of view, it has been shown that a neutrosophic set generalizes a classical set, fuzzy set, interval valued fuzzy set, intuitionistic fuzzy set etc. A single valued neutrosophic set is an instance of a neutrosophic set which can be used in real scientific and engineering applications. Therefore the study of single valued neutrosophic sets and their properties has considerable significance, both for applications and for understanding the fundamentals of uncertainty. In this paper we have introduced some measures of similarity and entropy of single valued neutrosophic sets for the first time. These measures are consistent with similar considerations for other sets, like fuzzy sets and intuitionistic fuzzy sets.

Acknowledgments

We sincerely thank the anonymous reviewers for their valuable comments which helped us to write the paper in its present form.

References

[1] K. Atanassov, Intuitionistic fuzzy sets, Fuzzy Sets and Systems 20 (1986), 87–96.
[2] S.M. Chen, A new approach to handling fuzzy decision making problems, IEEE Transactions on Systems, Man and Cybernetics 18 (1988), 1012–1016.
[3] S.M. Chen and P.H. Hsiao, A comparison of similarity measures of fuzzy values, Fuzzy Sets and Systems 72 (1995), 79–89.
[4] C.C. Cheng and K.H. Liao, Parameter optimization based on entropy weight and triangular fuzzy number, International Journal of Engineering and Industries 2(2) (2011), 62–75.
[5] A. De Luca and S. Termini, A definition of a non-probabilistic entropy in the setting of fuzzy sets theory, Information and Control 20 (1972), 301–312.
[6] A. Kaufmann, Introduction to the Theory of Fuzzy Subsets, Vol. 1: Fundamental Theoretical Elements, Academic Press, New York, 1975.
[7] B. Kosko, Fuzzy entropy and conditioning, Information Sciences 40(2) (1986), 165–174.
[8] P. Majumdar and S.K. Samanta, Softness of a soft set: Soft set entropy, Annals of Fuzzy Mathematics and Informatics, accepted, 2012.
[9] E. Pasha, Fuzzy entropy as cost function in image processing, in: Proc. of the 2nd IMT-GT Regional Conference on Mathematics, Statistics and Applications, Universiti Sains Malaysia, Penang, 2006.
[10] F. Smarandache, A Unifying Field in Logics. Neutrosophy: Neutrosophic Probability, Set and Logic, American Research Press, Rehoboth, 1999.
[11] E. Szmidt and J. Kacprzyk, Entropy for intuitionistic fuzzy sets, Fuzzy Sets and Systems 118 (2001), 467–477.
[12] H. Wang, et al., Single valued neutrosophic sets, in: Proc. of the 10th International Conference on Fuzzy Theory and Technology, Salt Lake City, Utah, 2005.
[13] R.R. Yager, On the measure of fuzziness and negation, Part I: Membership in the unit interval, International Journal of General Systems 5 (1979), 189–200.
[14] L.A. Zadeh, Fuzzy sets, Information and Control 8 (1965), 338–353.
[15] L.A. Zadeh, Fuzzy sets and systems, in: Proc. Symp. on Systems Theory, Polytechnic Institute of Brooklyn, New York, 1965.
