

https://doi.org/10.31449/inf.v42i4.1303    Informatica 42 (2018) 617–627

Entropy, Distance and Similarity Measures under Interval-Valued Intuitionistic Fuzzy Environment
Pratiksha Tiwari and Priti Gupta
Delhi Institute of Advanced Studies, Plot No. 6, Sector- 25, Rohini, Delhi, India
E-mail: parth12003@yahoo.co.in

Keywords: entropy measure, distance, similarity measure, interval valued intuitionistic fuzzy sets
Received: July 21, 2016

This paper presents new axiomatic definitions of entropy measures for interval-valued intuitionistic fuzzy sets (IvIFSs), using the concepts of probability and distance and taking the degree of hesitancy into account, consistent with the definition of entropy given by De Luca and Termini. We then propose several entropy measures and derive relations between distance, entropy and similarity measures for IvIFSs. Further, we check the performance of the proposed entropy and similarity measures against intuition and compare them with existing entropy and similarity measures using numerical examples. Lastly, the proposed similarity measures are used to solve problems in pattern recognition and medical diagnosis.
Povzetek: This paper presents new axiomatic definitions of an entropy measure for interval-valued intuitionistic fuzzy sets.

1 Introduction

Fuzzy set theory (Zadeh, 1965) is a tool that handles uncertainty and imprecision effortlessly. Interval-valued, intuitionistic and interval-valued intuitionistic fuzzy sets (Zadeh (1975), Atanassov (1986), Atanassov & Gargov (1989)), vague sets (Gau & Buehrer, 1993) and R-fuzzy sets (Yang & Hinde, 2010) are various generalizations of fuzzy sets (FSs). Among these generalizations, IvIFSs and intuitionistic fuzzy sets (IFSs) are two conventional extensions of FSs. IvIFSs are more practical and flexible than IFSs, as they are characterized by membership and non-membership degree ranges instead of single real numbers. This makes IvIFSs more useful in dealing with real-world complexities that arise from insufficient information, lack of data, imprecise knowledge and human nature, wherein a range is provided instead of a single real number.

Distance, entropy and similarity measures are central topics investigated by various researchers under the intuitionistic and interval-valued fuzzy environments (IFE and IvFE). These measures quantify the similarity or dissimilarity between two fuzzy sets. To date, many entropy, distance and similarity measures have been presented. Some of these research findings are as follows. Xu (2007a, b) introduced the concept of similarity between IvIFSs along with some distance measures. Zhang et al. (2009) defined entropy axiomatically for interval-valued fuzzy sets (IvFSs) and discussed the relation between entropy and similarity measures. Xu and Yager (2009) studied preference relations and defined similarity measures under the IvFE and the interval-valued intuitionistic fuzzy environment (IvIFE). Wei et al. (2011) derived a generalized measure of entropy for IvIFSs. Cosine similarity measures for IvIFSs were defined by both Ye (2012) and Singh (2012). Sun & Liu (2012), Hu & Li (2013) and Zhang et al. (2014) proposed entropy and similarity measures, along with their relationship, for IvIFSs. Applications of the aforesaid entropy, distance and similarity measures include pattern recognition, medical diagnosis, multi-criteria decision making and expert systems. However, most of these distance, similarity and entropy measures do not consider the hesitancy index of IvIFSs. The hesitancy index plays a very important role when the membership and non-membership degrees of two IvIFSs do not differ much but their hesitancy indices do. Some authors, e.g. Xu (2007), Xu & Xia (2011) and Wu et al. (2014), incorporated hesitancy into the distance, similarity and entropy measures they developed, since the hesitancy index also has a vital role in decision making: it outclasses the existing methods and handles the decision process in a better way. Dammak et al. (2016) studied some possibility measures for multi-criteria decision making under the IvIFE. Tiwari & Gupta (in press) proposed a generalized entropy and similarity measure for IvIFSs with application in decision making. Zhang et al. (2016) defined some operations on IvIFSs and proposed aggregation operators for IvIFSs w.r.t. the restricted interval Shapley function, with application in multi-criteria decision making. In this paper we develop distance, entropy and similarity measures that take all three degrees into account and apply them to pattern recognition and medical diagnosis under the IvIFE.

This work is organized as follows. Section 2 contains basic definitions and operations on IvIFSs. Section 3 presents the relationship between distance and entropy measures, along with examples checking the performance of the entropy measures against intuition. A relation between entropy and similarity measures is proposed in Section 4, where the new similarity measures are also compared with a few existing ones. Thereafter, in Section 5, the new similarity measures are applied to pattern recognition and medical diagnosis. Lastly, the conclusion is drawn in Section 6.
2 IvIFSs and their distance and similarity measures

This section gives the definitions and concepts used for IvIFSs. Throughout the paper, Ω = {x_1, …, x_n} denotes the universe of discourse; ℂ(Ω) and IvIFSs(Ω) denote the set of all crisp sets and the set of all IvIFSs in Ω, respectively.

Definition 1 (Atanassov & Gargov, 1989): An IvIFS A in the finite universe Ω is defined by a triplet ⟨x_i, MV_A(x_i), NV_A(x_i), HV_A(x_i)⟩ for x_i ∈ Ω, where MV_A(x_i) = [MV_AL(x_i), MV_AU(x_i)] is the membership value interval, NV_A(x_i) = [NV_AL(x_i), NV_AU(x_i)] is the non-membership value interval and HV_A(x_i) = [HV_AL(x_i), HV_AU(x_i)] is the hesitancy value interval, with HV_AL(x_i) = 1 − MV_AU(x_i) − NV_AU(x_i) and HV_AU(x_i) = 1 − MV_AL(x_i) − NV_AL(x_i), such that 0 ≤ MV_AU(x_i) + NV_AU(x_i) ≤ 1 for each x_i ∈ Ω.

Liu (1992) defined distance and similarity measures axiomatically; for IvIFSs they are given as follows.

Definition 2: For any two IvIFSs A and B, a real-valued function D: IvIFSs(Ω) × IvIFSs(Ω) → [0,1] is termed a distance measure for IvIFSs on Ω if it satisfies the following axioms:
1. For any crisp set A, D(A, Ā) = 1.
2. The distance between any two IvIFSs A and B is zero iff A = B.
3. The distance measure is symmetric with respect to any two IvIFSs A and B.
4. For any three IvIFSs A, B and C such that A ⊆ B ⊆ C, D(A, C) ≥ D(A, B) and D(A, C) ≥ D(B, C).

Distances between FSs were presented by Kacprzyk (1997). Atanassov (1999) proposed their extension as two-dimensional distances, whereas the third parameter, the hesitancy degree, was introduced into the distance by Szmidt and Kacprzyk (2000) for intuitionistic fuzzy sets. Yang & Chiclana (2012) proved the consistency of three-dimensional distances over two-dimensional ones. Grzegorzewski (2004) and Park et al. (2007) gave distance measures for IvFSs and IvIFSs, respectively. Here we extend the distance measures by also considering the hesitancy degree of IvIFSs. For any two IvIFSs A and B, we define the following distance measures:

1) Normalized Euclidean distance
D_1(A, B) = { (1/(12n)) Σ_{i=1}^{n} [ (MV_AL(x_i) − MV_BL(x_i))² + (MV_AU(x_i) − MV_BU(x_i))² + (NV_AL(x_i) − NV_BL(x_i))² + (NV_AU(x_i) − NV_BU(x_i))² + (HV_AL(x_i) − HV_BL(x_i))² + (HV_AU(x_i) − HV_BU(x_i))² ] }^(1/2)   …(1)

2) Normalized Hamming distance
D_2(A, B) = (1/(8n)) Σ_{i=1}^{n} [ |MV_AL(x_i) − MV_BL(x_i)| + |MV_AU(x_i) − MV_BU(x_i)| + |NV_AL(x_i) − NV_BL(x_i)| + |NV_AU(x_i) − NV_BU(x_i)| + |HV_AL(x_i) − HV_BL(x_i)| + |HV_AU(x_i) − HV_BU(x_i)| ]   …(2)

3) Normalized Hamming–Hausdorff distance
D_3(A, B) = (1/(4n)) Σ_{i=1}^{n} [ |MV_AL(x_i) − MV_BL(x_i)| ∨ |MV_AU(x_i) − MV_BU(x_i)| + |NV_AL(x_i) − NV_BL(x_i)| ∨ |NV_AU(x_i) − NV_BU(x_i)| + |HV_AL(x_i) − HV_BL(x_i)| ∨ |HV_AU(x_i) − HV_BU(x_i)| ]   …(3)

4) Hausdorff normalized Hamming distance
D_4(A, B) = (1/(2n)) Σ_{i=1}^{n} max{ (|MV_AL(x_i) − MV_BL(x_i)| + |MV_AU(x_i) − MV_BU(x_i)|)/2, (|NV_AL(x_i) − NV_BL(x_i)| + |NV_AU(x_i) − NV_BU(x_i)|)/2, (|HV_AL(x_i) − HV_BL(x_i)| + |HV_AU(x_i) − HV_BU(x_i)|)/2 }   …(4)

5) Averaged distance measure
D_5(A, B) = (1/(2n)) Σ_{i=1}^{n} { [ |MV_AL(x_i) − MV_BL(x_i)| + |MV_AU(x_i) − MV_BU(x_i)| + |NV_AL(x_i) − NV_BL(x_i)| + |NV_AU(x_i) − NV_BU(x_i)| + |HV_AL(x_i) − HV_BL(x_i)| + |HV_AU(x_i) − HV_BU(x_i)| ]/8 + max( |MV_AL(x_i) − MV_BL(x_i)| + |MV_AU(x_i) − MV_BU(x_i)|, |NV_AL(x_i) − NV_BL(x_i)| + |NV_AU(x_i) − NV_BU(x_i)|, |HV_AL(x_i) − HV_BL(x_i)| + |HV_AU(x_i) − HV_BU(x_i)| )/4 }   …(5)

6) Generalized distance measure, for p ≥ 2,
D_6(A, B) = { (1/(12n)) Σ_{i=1}^{n} [ (|MV_AL(x_i) − MV_BL(x_i)| ∨ |MV_AU(x_i) − MV_BU(x_i)|)^p + (|NV_AL(x_i) − NV_BL(x_i)| ∨ |NV_AU(x_i) − NV_BU(x_i)|)^p + (|HV_AL(x_i) − HV_BL(x_i)| ∨ |HV_AU(x_i) − HV_BU(x_i)|)^p ] }^(1/p)   …(6)

Definition 3: Let A and B be any two IvIFSs. A real-valued function S: IvIFSs(Ω) × IvIFSs(Ω) → [0,1] is termed a similarity measure for IvIFSs on Ω if it satisfies the following axioms:
1. For any crisp set A, S(A, Ā) = 0.
2. The similarity between any two IvIFSs A and B is 1 iff A = B.
3. The similarity measure is symmetric with respect to any two IvIFSs.
4. For any three IvIFSs A, B and C such that A ⊆ B ⊆ C, S(A, C) ≤ S(A, B) and S(A, C) ≤ S(B, C).
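As a purely illustrative aid (not part of the original paper), the following minimal Python sketch implements the distance measures (1)–(6) as reconstructed above, with the hesitancy interval derived from the membership and non-membership intervals as in Definition 1. The helper names are ours.

```python
# Sketch of the distance measures D1-D6 (equations (1)-(6)).
# An IvIFS over Omega = {x_1,...,x_n} is represented as a list of pairs
# ([MVL, MVU], [NVL, NVU]) of Python lists; HV is derived as in Definition 1.

def hesitancy(mv, nv):
    # HV_L = 1 - MV_U - NV_U,  HV_U = 1 - MV_L - NV_L
    return [1 - mv[1] - nv[1], 1 - mv[0] - nv[0]]

def _diffs(a, b):
    # six absolute differences, ordered (MV_L, MV_U, NV_L, NV_U, HV_L, HV_U)
    (mva, nva), (mvb, nvb) = a, b
    hva, hvb = hesitancy(mva, nva), hesitancy(mvb, nvb)
    return [abs(x - y) for x, y in zip(mva + nva + hva, mvb + nvb + hvb)]

def D1(A, B):                       # normalized Euclidean, eq. (1)
    s = sum(d * d for a, b in zip(A, B) for d in _diffs(a, b))
    return (s / (12 * len(A))) ** 0.5

def D2(A, B):                       # normalized Hamming, eq. (2)
    return sum(sum(_diffs(a, b)) for a, b in zip(A, B)) / (8 * len(A))

def D3(A, B):                       # normalized Hamming-Hausdorff, eq. (3)
    s = sum(max(d[0], d[1]) + max(d[2], d[3]) + max(d[4], d[5])
            for d in (_diffs(a, b) for a, b in zip(A, B)))
    return s / (4 * len(A))

def D4(A, B):                       # Hausdorff normalized Hamming, eq. (4)
    s = sum(max((d[0] + d[1]) / 2, (d[2] + d[3]) / 2, (d[4] + d[5]) / 2)
            for d in (_diffs(a, b) for a, b in zip(A, B)))
    return s / (2 * len(A))

def D5(A, B):                       # averaged distance measure, eq. (5)
    s = 0.0
    for a, b in zip(A, B):
        d = _diffs(a, b)
        s += sum(d) / 8 + max(d[0] + d[1], d[2] + d[3], d[4] + d[5]) / 4
    return s / (2 * len(A))

def D6(A, B, p=2):                  # generalized distance, eq. (6)
    s = sum(max(d[0], d[1]) ** p + max(d[2], d[3]) ** p + max(d[4], d[5]) ** p
            for d in (_diffs(a, b) for a, b in zip(A, B)))
    return (s / (12 * len(A))) ** (1 / p)
```

For two IvIFSs given as equal-length lists of such pairs, each D_j returns a number in [0, 1].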
From the axiomatic definitions of distance and similarity measures it is clear that S(A, B) = 1 − D(A, B), where A and B are IvIFSs and D and S are distance and similarity measures for IvIFSs, respectively.

2.1 Entropy measures for IvIFSs

In 1972, De Luca and Termini defined a measure of entropy for FSs. Hung & Yang (2006) extended the definition to IFSs, taking the hesitancy degree into account. The following definition of entropy extends the definition proposed by Hung & Yang (2006) to IvIFSs.

Definition 4: A real-valued function E: IvIFSs(Ω) → [0,1] is termed a measure of entropy under the IvIFE if the following axioms are satisfied:
1. E(A) = 0 for all A ∈ ℂ(Ω);
2. E(A) = 1 iff MV_A(x_i) = NV_A(x_i) = HV_A(x_i) = [1/3, 1/3] for all x_i ∈ Ω;
3. E(A) ≤ E(B) if A is less fuzzy than B;
4. E(A) = E(Ā), where Ā is the complement of A, for all A, B ∈ IvIFSs(Ω).

The above definition is consistent with the description of entropy given by De Luca & Termini (1972). A complete description of an IvIFS A on Ω involves three degrees, membership, non-membership and hesitancy, with MV_AU(x_i) + NV_AU(x_i) + HV_AL(x_i) = 1 and MV_AL(x_i) + NV_AL(x_i) + HV_AU(x_i) = 1, where all six bounds lie in [0, 1]. Taking all three degrees into consideration, we may treat them as a probability measure. The entropy is therefore maximal when all the variables are equal (i.e. MV_AU(x_i) = NV_AU(x_i) = HV_AL(x_i) = 1/3 and MV_AL(x_i) = NV_AL(x_i) = HV_AU(x_i) = 1/3) and zero (minimal) when only one variable is present, i.e. MV_AL(x_i) = MV_AU(x_i) = 1, NV_AL(x_i) = NV_AU(x_i) = 0, HV_AL(x_i) = HV_AU(x_i) = 0; or MV_AL(x_i) = MV_AU(x_i) = 0, NV_AL(x_i) = NV_AU(x_i) = 1, HV_AL(x_i) = HV_AU(x_i) = 0; or MV_AL(x_i) = MV_AU(x_i) = 0, NV_AL(x_i) = NV_AU(x_i) = 0, HV_AL(x_i) = HV_AU(x_i) = 1.

We further extend the distance-based definition of entropy given by Zhang et al. (2014) to IvIFSs. In the following definition the degree of hesitation, which is not considered by other definitions of entropy, is taken into account.

Definition 5: A real-valued function E: IvIFSs(Ω) → [0,1] is termed a measure of entropy under the IvIFE if the following axioms are satisfied:
1. E(A) = 0 for all A ∈ ℂ(Ω);
2. E(A) = 1 iff all three description intervals of the IvIFS satisfy MV_A(x_i) = NV_A(x_i) = HV_A(x_i) = [1/3, 1/3] for all x_i ∈ Ω;
3. If D(A, Θ) ≥ D(B, Θ), where D is a distance measure and Θ denotes the IvIFS whose value at every x_i ∈ Ω is ⟨[1/3,1/3],[1/3,1/3],[1/3,1/3]⟩, then E(A) ≤ E(B), for all A, B ∈ IvIFSs(Ω);
4. E(A) = E(Ā), where Ā is the complement of A, for all A ∈ IvIFSs(Ω).

In the next section we derive a relation between distance and entropy measures for IvIFSs which satisfies all the axioms of this definition.

3 Relation between measures of distance and entropy

Here, we develop a technique which yields entropy measures for IvIFSs satisfying the aforementioned properties.

Theorem 2: Let D_j, j = 1, …, 6, be the six distance measures (1)–(6) between IvIFSs. Then E_j(A) = 1 − 3 D_j(A, Θ), j = 1, …, 6, for any A ∈ IvIFSs(Ω), are entropy measures for IvIFSs.

Proof: We show that each E_j(A), j = 1, …, 6, satisfies the conditions of Definition 5.
Property 1): If A ∈ ℂ(Ω), then A(x_i) = ⟨[1,1],[0,0],[0,0]⟩ or A(x_i) = ⟨[0,0],[1,1],[0,0]⟩ for all x_i ∈ Ω, and then D_j(A, Θ) = 1/3 for j = 1, …, 6. Thus E_j(A) = 0.
Property 2): For all j = 1, …, 6, E_j(A) = 1 ⟺ 1 − 3 D_j(A, Θ) = 1 ⟺ D_j(A, Θ) = 0 ⟺ A = Θ.
Property 3): Let A and B be any two IvIFSs with D_j(A, Θ) ≥ D_j(B, Θ). Then 1 − 3 D_j(A, Θ) ≤ 1 − 3 D_j(B, Θ), i.e. E_j(A) ≤ E_j(B), for all j = 1, …, 6.
Property 4): Let A be any IvIFS; then Ā = {⟨x_i, [NV_AL(x_i), NV_AU(x_i)], [MV_AL(x_i), MV_AU(x_i)]⟩ / x_i ∈ Ω}, so D_j(A, Θ) = D_j(Ā, Θ). Thus E_j(A) = E_j(Ā) for all j = 1, …, 6. ∎

From Theorem 2 and the distance formulas (1)–(6), we obtain the corresponding entropy formulas as follows:
E_1(A) = 1 − 3 { (1/(12n)) Σ_{i=1}^{n} [ (MV_AL(x_i) − 1/3)² + (MV_AU(x_i) − 1/3)² + (NV_AL(x_i) − 1/3)² + (NV_AU(x_i) − 1/3)² + (HV_AL(x_i) − 1/3)² + (HV_AU(x_i) − 1/3)² ] }^(1/2)

E_2(A) = 1 − (3/(8n)) Σ_{i=1}^{n} [ |MV_AL(x_i) − 1/3| + |MV_AU(x_i) − 1/3| + |NV_AL(x_i) − 1/3| + |NV_AU(x_i) − 1/3| + |HV_AL(x_i) − 1/3| + |HV_AU(x_i) − 1/3| ]

E_3(A) = 1 − (3/(4n)) Σ_{i=1}^{n} [ |MV_AL(x_i) − 1/3| ∨ |MV_AU(x_i) − 1/3| + |NV_AL(x_i) − 1/3| ∨ |NV_AU(x_i) − 1/3| + |HV_AL(x_i) − 1/3| ∨ |HV_AU(x_i) − 1/3| ]

E_4(A) = 1 − (3/(2n)) Σ_{i=1}^{n} max{ (|MV_AL(x_i) − 1/3| + |MV_AU(x_i) − 1/3|)/2, (|NV_AL(x_i) − 1/3| + |NV_AU(x_i) − 1/3|)/2, (|HV_AL(x_i) − 1/3| + |HV_AU(x_i) − 1/3|)/2 }

E_5(A) = 1 − (3/(2n)) Σ_{i=1}^{n} { [ |MV_AL(x_i) − 1/3| + |MV_AU(x_i) − 1/3| + |NV_AL(x_i) − 1/3| + |NV_AU(x_i) − 1/3| + |HV_AL(x_i) − 1/3| + |HV_AU(x_i) − 1/3| ]/8 + max( |MV_AL(x_i) − 1/3| + |MV_AU(x_i) − 1/3|, |NV_AL(x_i) − 1/3| + |NV_AU(x_i) − 1/3|, |HV_AL(x_i) − 1/3| + |HV_AU(x_i) − 1/3| )/2 }

E_6(A) = 1 − 3 { (1/(12n)) Σ_{i=1}^{n} [ (|MV_AL(x_i) − 1/3| ∨ |MV_AU(x_i) − 1/3|)^p + (|NV_AL(x_i) − 1/3| ∨ |NV_AU(x_i) − 1/3|)^p + (|HV_AL(x_i) − 1/3| ∨ |HV_AU(x_i) − 1/3|)^p ] }^(1/p)

To check the consistency of the proposed entropy measures with intuition, we use the following example.

Example: Consider two IvIFSs A = {x, ⟨[0.2,0.2],[0.2,0.3],[0.5,0.6]⟩, x ∈ Ω} and B = {x, ⟨[0.2,0.3],[0.4,0.6],[0.1,0.4]⟩, x ∈ Ω}; clearly A is more fuzzy than B. The values E_j(A) and E_j(B) are given in Table 1:

Entropy    A        B
E_1        0.5570   0.65866
E_2        0.675    0.7
E_3        0.6      0.525
E_4        0.675    0.75
E_5        0.5125   0.6
E_6        0.7171   0.672
Table 1: Entropies.

Since E_3(A) > E_3(B) and E_6(A) > E_6(B), the measures E_3 and E_6 are consistent with intuition in this example.

3.1 Comparison of existing entropy measures with the proposed entropy measures

We compare the performance of existing entropy measures with the proposed measures with the help of an example. Let A be an IvIFS; then

E_ZJ(A) = (1/n) Σ_{i=1}^{n} [ min(MV_AL(x_i), NV_AL(x_i)) + min(MV_AU(x_i), NV_AU(x_i)) ] / [ max(MV_AL(x_i), NV_AL(x_i)) + max(MV_AU(x_i), NV_AU(x_i)) ],

E_WW(A) = (1/n) Σ_{i=1}^{n} [ min(MV_AL(x_i), NV_AL(x_i)) + min(MV_AU(x_i), NV_AU(x_i)) + HV_AL(x_i) + HV_AU(x_i) ] / [ max(MV_AL(x_i), NV_AL(x_i)) + max(MV_AU(x_i), NV_AU(x_i)) + HV_AL(x_i) + HV_AU(x_i) ],

E_ZM(A) = (1/n) Σ_{i=1}^{n} [ 1 − (MV′_A(x_i) + NV′_A(x_i)) e^(1 − (MV′_A(x_i) + NV′_A(x_i))) ],
where MV′_A(x_i) = MV_AL(x_i) + τ·ΔMV_A(x_i), NV′_A(x_i) = NV_AL(x_i) + τ·ΔNV_A(x_i), ΔMV_A(x_i) = MV_AU(x_i) − MV_AL(x_i), ΔNV_A(x_i) = NV_AU(x_i) − NV_AL(x_i) and τ ∈ [0,1],

are the entropy measures given by Zhang et al. (2010), Wei et al. (2011) and Zhang et al. (2011), respectively.

Example: Consider the example from Sun & Liu (2012) to review the entropies of the IvIFSs A_i, which are as follows:

“A_1 = {x, ⟨[0.1,0.2],[0.2,0.4],[0.4,0.7]⟩, x ∈ Ω},
A_2 = {x, ⟨[0.2,0.2],[0.3,0.5],[0.3,0.5]⟩, x ∈ Ω},
A_3 = {x, ⟨[0.2,0.4],[0,0],[0.6,0.8]⟩, x ∈ Ω},
A_4 = {x, ⟨[0.3,0.4],[0,0.142857],[0.4571,0.7]⟩, x ∈ Ω},
A_5 = {x, ⟨[0.1,0.1],[0,0.2],[0.6,0.9]⟩, x ∈ Ω},
A_6 = {x, ⟨[0,0.2],[0,0.2],[0.4,1.0]⟩, x ∈ Ω}”
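Before comparing with the existing measures in Table 2, the following minimal sketch (added here as an illustration, not part of the original computation) evaluates the proposed entropies E_1–E_6 in the closed forms given above, with E_5 taken as printed (halved maximum term) and the hesitancy interval derived from Definition 1. For A_1 it agrees with the first row of Table 2 below up to rounding.

```python
# Sketch of the proposed entropies E1-E6 in closed form.  An IvIFS value
# at one x_i is ([MVL, MVU], [NVL, NVU]); HV is derived as in Definition 1.

def _devs(mv, nv):
    # absolute deviations of the six bounds from 1/3
    hv = [1 - mv[1] - nv[1], 1 - mv[0] - nv[0]]
    return [abs(t - 1 / 3) for t in mv + nv + hv]

def entropies(A, p=2):
    n = len(A)
    e = [0.0] * 6
    for mv, nv in A:
        d = _devs(mv, nv)
        pairs = [(d[0], d[1]), (d[2], d[3]), (d[4], d[5])]
        e[0] += sum(t * t for t in d)                         # E1 accumulator
        e[1] += sum(d)                                        # E2
        e[2] += sum(max(a, b) for a, b in pairs)              # E3
        e[3] += max((a + b) / 2 for a, b in pairs)            # E4
        e[4] += sum(d) / 8 + max(a + b for a, b in pairs) / 2 # E5 (as printed)
        e[5] += sum(max(a, b) ** p for a, b in pairs)         # E6
    return [1 - 3 * (e[0] / (12 * n)) ** 0.5,
            1 - 3 * e[1] / (8 * n),
            1 - 3 * e[2] / (4 * n),
            1 - 3 * e[3] / (2 * n),
            1 - 3 * e[4] / (2 * n),
            1 - 3 * (e[5] / (12 * n)) ** (1 / p)]

A1 = [([0.1, 0.2], [0.2, 0.4])]     # single-element universe, n = 1
print([round(v, 4) for v in entropies(A1)])
# -> approximately [0.5817, 0.625, 0.45, 0.675, 0.4875, 0.6063]
```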
        E_1(A_i)   E_2(A_i)   E_3(A_i)   E_4(A_i)   E_5(A_i)   E_6(A_i)
A_1     0.58167    0.625      0.45       0.675      0.4875     0.6063
A_2     0.735425   0.75       0.65       0.8        0.675      0.765479
A_3     0.367544   0.4        0.3        0.45       0.15       0.490098
A_4     0.523512   0.582143   0.425      0.607143   0.398214   0.566987
A_5     0.27889    0.3        0.15       0.3        0.05       0.395848
A_6     0.271989   0.375      0.01       0.45       0.1375     0.292893
Table 2: Comparison of entropies.

The existing measures give E_ZJ(A_1) = 0.5 = E_ZJ(A_2), E_WW(A_3) = 0.7 = E_WW(A_4) and E_ZM(A_5) = 0.5 = E_ZM(A_6). Thus the entropies E_ZJ, E_WW and E_ZM cannot discriminate between these sets and are unreasonable here, whereas the proposed entropies E_j, j = 1, …, 6, can discriminate the fuzziness of all the IvIFSs A_i, i = 1, …, 6, and give the reasonable results shown in Table 2.

So far we have proposed entropy measures and evaluated their performance on the basis of intuition and by comparison with existing measures. In the next section we propose a relation between entropy and similarity measures under the IvIFE, define new similarity measures and evaluate their performance by comparing them with some existing measures.

4 Relations between entropy and similarity measures together with new similarity measures

This section contains the definition of new similarity measures under the IvIFE and establishes an important relation between entropy and similarity measures for IvIFSs, as follows.

Theorem 3: Let S_j, j = 1, …, 6, be the similarity measures of IvIFSs corresponding to the distance measures D_j, j = 1, …, 6 (i.e. S_j = 1 − D_j), and let A be any IvIFS. Then E_j(A) = 3 S_j(A, Θ) − 2, for j = 1, …, 6, are entropy measures for IvIFSs.

Proof: We show that each E_j(A), j = 1, …, 6, satisfies the conditions of Definition 5.
Property 1): If A ∈ ℂ(Ω), then A(x_i) = ⟨[1,1],[0,0],[0,0]⟩ or A(x_i) = ⟨[0,0],[1,1],[0,0]⟩ for all x_i ∈ Ω, and for j = 1, …, 6, S_j(A, Θ) = 1 − D_j(A, Θ) = 2/3. Thus E_j(A) = 0.
Property 2): For all j = 1, …, 6, E_j(A) = 1 ⟺ 3 S_j(A, Θ) − 2 = 1 ⟺ S_j(A, Θ) = 1 ⟺ A = Θ.
Property 3): Let A and B be any two IvIFSs with D_j(A, Θ) ≥ D_j(B, Θ). Then 1 − D_j(A, Θ) ≤ 1 − D_j(B, Θ), i.e. S_j(A, Θ) ≤ S_j(B, Θ), and hence E_j(A) ≤ E_j(B), for all j = 1, …, 6.
Property 4): Let A be any IvIFS; then Ā = {⟨x_i, [NV_AL(x_i), NV_AU(x_i)], [MV_AL(x_i), MV_AU(x_i)]⟩ / x_i ∈ Ω}, so D_j(A, Θ) = D_j(Ā, Θ) and S_j(A, Θ) = S_j(Ā, Θ). Thus E_j(A) = E_j(Ā) for all j = 1, …, 6. ∎

Next, we present a conversion technique to define similarity measures from entropy measures for IvIFSs.

Definition 6: For any two IvIFSs A and B in Ω, given by the triplets ⟨x_i, [MV_AL(x_i), MV_AU(x_i)], [NV_AL(x_i), NV_AU(x_i)]⟩ and ⟨x_i, [MV_BL(x_i), MV_BU(x_i)], [NV_BL(x_i), NV_BU(x_i)]⟩ respectively, write, for each x_i ∈ Ω,
Δ_M(x_i) = |MV_AL(x_i) − MV_BL(x_i)| ∨ |MV_AU(x_i) − MV_BU(x_i)|,
Δ_N(x_i) = |NV_AL(x_i) − NV_BL(x_i)| ∨ |NV_AU(x_i) − NV_BU(x_i)|,
Δ_H(x_i) = |HV_AL(x_i) − HV_BL(x_i)| ∨ |HV_AU(x_i) − HV_BU(x_i)|.
We define an IvIFS φ(A, B) using A and B as follows:
MV_φ(A,B)L(x_i) = (1/3){1 − [max(Δ_M(x_i), Δ_N(x_i), Δ_H(x_i))]^(1/2)};
MV_φ(A,B)U(x_i) = (1/3){1 − max(Δ_M(x_i), Δ_N(x_i), Δ_H(x_i))};
NV_φ(A,B)L(x_i) = (1/3){1 + [min(Δ_M(x_i), Δ_N(x_i), Δ_H(x_i))]²};
NV_φ(A,B)U(x_i) = (1/3){1 + min(Δ_M(x_i), Δ_N(x_i), Δ_H(x_i))}.
Theorem 4: Let A and B be IvIFSs and let E be an entropy measure. Then E(φ(A, B)) is a similarity measure for A and B.

Proof: We show that E(φ(A, B)) satisfies the properties given in Definition 3.
Property 1): If A ∈ ℂ(Ω), then A(x_i) = ⟨[1,1],[0,0],[0,0]⟩ or A(x_i) = ⟨[0,0],[1,1],[0,0]⟩ for any x_i ∈ Ω, and then MV_φ(A,Ā)L(x_i) = 0 = MV_φ(A,Ā)U(x_i) and NV_φ(A,Ā)L(x_i) = 1 = NV_φ(A,Ā)U(x_i). Thus φ(A, Ā) = {⟨x_i, [0,0],[1,1],[0,0]⟩ / x_i ∈ Ω}, so S(A, Ā) = E(φ(A, Ā)) = 0.
Property 2): Assume S(A, B) = 1, i.e. E(φ(A, B)) = 1 ⟺ MV_φ(A,B)(x_i) = NV_φ(A,B)(x_i) = HV_φ(A,B)(x_i) = [1/3, 1/3] ⟺ max(Δ_M(x_i), Δ_N(x_i), Δ_H(x_i)) = 0 and min(Δ_M(x_i), Δ_N(x_i), Δ_H(x_i)) = 0 ⟺ Δ_M(x_i) = Δ_N(x_i) = Δ_H(x_i) = 0 ⟺ MV_AL(x_i) = MV_BL(x_i), MV_AU(x_i) = MV_BU(x_i), NV_AL(x_i) = NV_BL(x_i), NV_AU(x_i) = NV_BU(x_i), HV_AL(x_i) = HV_BL(x_i) and HV_AU(x_i) = HV_BU(x_i) for every x_i ∈ Ω ⟺ A = B.
Property 3): φ(A, B) = φ(B, A) by the definition of MV_φ(A,B)L(x_i), MV_φ(A,B)U(x_i), NV_φ(A,B)L(x_i) and NV_φ(A,B)U(x_i) for any x_i ∈ Ω, so E(φ(A, B)) = E(φ(B, A)), i.e. S(A, B) = S(B, A).
Property 4): Let A, B and C be any three IvIFSs such that A ⊆ B ⊆ C. For any x_i ∈ Ω we have MV_A(x_i) ≤ MV_B(x_i) ≤ MV_C(x_i) and NV_A(x_i) ≥ NV_B(x_i) ≥ NV_C(x_i), i.e. MV_AL(x_i) ≤ MV_BL(x_i) ≤ MV_CL(x_i), NV_AL(x_i) ≥ NV_BL(x_i) ≥ NV_CL(x_i), MV_AU(x_i) ≤ MV_BU(x_i) ≤ MV_CU(x_i) and NV_AU(x_i) ≥ NV_BU(x_i) ≥ NV_CU(x_i). Hence
|MV_AL(x_i) − MV_CL(x_i)| ≥ |MV_AL(x_i) − MV_BL(x_i)|, |MV_AU(x_i) − MV_CU(x_i)| ≥ |MV_AU(x_i) − MV_BU(x_i)|;
|NV_AL(x_i) − NV_CL(x_i)| ≥ |NV_AL(x_i) − NV_BL(x_i)|, |NV_AU(x_i) − NV_CU(x_i)| ≥ |NV_AU(x_i) − NV_BU(x_i)|;
and |HV_AL(x_i) − HV_CL(x_i)| = |2(MV_CL(x_i) − MV_AL(x_i)) + 2(NV_CL(x_i) − NV_AL(x_i))| ≥ |2(MV_BL(x_i) − MV_AL(x_i)) + 2(NV_BL(x_i) − NV_AL(x_i))| = |HV_AL(x_i) − HV_BL(x_i)|, and similarly |HV_AU(x_i) − HV_CU(x_i)| ≥ |HV_AU(x_i) − HV_BU(x_i)|.
So, from Definition 6, MV_φ(A,C)(x_i) ≤ MV_φ(A,B)(x_i) ≤ [1/3, 1/3] and NV_φ(A,C)(x_i) ≥ NV_φ(A,B)(x_i) ≥ [1/3, 1/3] for any x_i ∈ Ω, i.e. φ(A, C) ⊆ φ(A, B) ⊆ Θ. Similarly, φ(A, C) ⊆ φ(B, C) ⊆ Θ. Thus D(φ(A, B), Θ) ≤ D(φ(A, C), Θ) and D(φ(B, C), Θ) ≤ D(φ(A, C), Θ), and from the definition of entropy corresponding to a distance function we get E(φ(A, C)) ≤ E(φ(A, B)) and E(φ(A, C)) ≤ E(φ(B, C)), i.e. S(A, C) ≤ S(A, B) and S(A, C) ≤ S(B, C). ∎

Corollary 1: Let E be an entropy measure for IvIFSs and let φ(A, B) be the IvIFS defined on two IvIFSs A and B according to Definition 6. Then E applied to the complement of φ(A, B) is also a similarity measure for A, B ∈ IvIFSs(Ω).
Proof: Follows from the definition of the complement of interval-valued intuitionistic fuzzy sets and Theorem 4. ∎

Definition 7: Let A, B ∈ IvIFSs(Ω) and let Δ_M(x_i), Δ_N(x_i), Δ_H(x_i) be as in Definition 6. We define an IvIFS η(A, B) using A and B as follows:
MV_η(A,B)L(x_i) = (1/3){1 + [min(Δ_M(x_i)^α, Δ_N(x_i)^α, Δ_H(x_i)^α)]²};
MV_η(A,B)U(x_i) = (1/3){1 + min(Δ_M(x_i)^α, Δ_N(x_i)^α, Δ_H(x_i)^α)};
NV_η(A,B)L(x_i) = (1/3){1 − [max(Δ_M(x_i)^α, Δ_N(x_i)^α, Δ_H(x_i)^α)]^(1/2)};
NV_η(A,B)U(x_i) = (1/3){1 − max(Δ_M(x_i)^α, Δ_N(x_i)^α, Δ_H(x_i)^α)},
where α ∈ [1, ∞[ and x_i ∈ Ω.
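As an illustration of the constructions in Definitions 6 and 7 (added here as a sketch, not part of the original paper), the snippet below builds φ(A, B) element-wise and plugs it into the entropy E_2 of Section 3, giving the similarity S_2(φ)(A, B) = E_2(φ(A, B)); any of E_1–E_6 can be substituted, and η(A, B) is built analogously with the roles of the max and min terms interchanged and the differences raised to the power α. For the pair (A, D) of the comparison example in Section 4.2 below, the sketch reproduces the S_2(φ) entry of Table 3 up to rounding; the helper names are ours.

```python
# Sketch of the entropy-to-similarity construction of Definition 6:
# build phi(A,B) element-wise and take S(A,B) = E(phi(A,B)).

def hesitancy(mv, nv):
    return [1 - mv[1] - nv[1], 1 - mv[0] - nv[0]]

def phi(a, b):
    # a, b are single-element values ([MVL, MVU], [NVL, NVU])
    (mva, nva), (mvb, nvb) = a, b
    hva, hvb = hesitancy(mva, nva), hesitancy(mvb, nvb)
    dm = max(abs(mva[0] - mvb[0]), abs(mva[1] - mvb[1]))
    dn = max(abs(nva[0] - nvb[0]), abs(nva[1] - nvb[1]))
    dh = max(abs(hva[0] - hvb[0]), abs(hva[1] - hvb[1]))
    mx, mn = max(dm, dn, dh), min(dm, dn, dh)
    mv = [(1 - mx ** 0.5) / 3, (1 - mx) / 3]    # MV_phi(A,B)
    nv = [(1 + mn ** 2) / 3, (1 + mn) / 3]      # NV_phi(A,B)
    return mv, nv

def E2(A):
    # entropy E2 of Section 3, with HV derived from MV and NV
    s = 0.0
    for mv, nv in A:
        hv = hesitancy(mv, nv)
        s += sum(abs(t - 1 / 3) for t in mv + nv + hv)
    return 1 - 3 * s / (8 * len(A))

def S2_phi(A, B):
    return E2([phi(a, b) for a, b in zip(A, B)])

A = [([0.5, 0.5], [0.5, 0.5])]
D = [([0.6, 0.6], [0.4, 0.4])]
print(round(S2_phi(A, D), 5))   # about 0.89594, the (A, D) entry for S2(phi) in Table 3
```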
Theorem 5: For any two IvIFSs A and B, E(η(A, B)) is a similarity measure, where E is an entropy measure.

Proof: We show that E(η(A, B)) satisfies the properties given in Definition 3.
Property 1): If A ∈ ℂ(Ω), then A(x_i) = ⟨[1,1],[0,0],[0,0]⟩ or A(x_i) = ⟨[0,0],[1,1],[0,0]⟩ for any x_i ∈ Ω, and then MV_η(A,Ā)L(x_i) = 1 = MV_η(A,Ā)U(x_i) and NV_η(A,Ā)L(x_i) = 0 = NV_η(A,Ā)U(x_i). Thus η(A, Ā) = {⟨x_i, [1,1],[0,0],[0,0]⟩ / x_i ∈ Ω}, so S(A, Ā) = E(η(A, Ā)) = 0.
Property 2): Assume S(A, B) = 1, i.e. E(η(A, B)) = 1 ⟺ MV_η(A,B)(x_i) = [1/3, 1/3] = NV_η(A,B)(x_i) = HV_η(A,B)(x_i) ⟺ min(Δ_M(x_i)^α, Δ_N(x_i)^α, Δ_H(x_i)^α) = 0 and max(Δ_M(x_i)^α, Δ_N(x_i)^α, Δ_H(x_i)^α) = 0 ⟺ Δ_M(x_i) = Δ_N(x_i) = Δ_H(x_i) = 0 ⟺ MV_AL(x_i) = MV_BL(x_i), MV_AU(x_i) = MV_BU(x_i), NV_AL(x_i) = NV_BL(x_i), NV_AU(x_i) = NV_BU(x_i), HV_AL(x_i) = HV_BL(x_i) and HV_AU(x_i) = HV_BU(x_i) for every x_i ∈ Ω ⟺ A = B.
Property 3): η(A, B) = η(B, A) by the definition of MV_η(A,B)L(x_i), MV_η(A,B)U(x_i), NV_η(A,B)L(x_i) and NV_η(A,B)U(x_i) for any x_i ∈ Ω, so E(η(A, B)) = E(η(B, A)), i.e. S(A, B) = S(B, A).
Property 4): Let A, B and C be any three IvIFSs such that A ⊆ B ⊆ C. As in the proof of Theorem 4, for any x_i ∈ Ω the absolute differences of the membership, non-membership and hesitancy bounds of A and C dominate those of A and B. So, from Definition 7, MV_η(A,C)(x_i) ≥ MV_η(A,B)(x_i) ≥ [1/3, 1/3] and NV_η(A,C)(x_i) ≤ NV_η(A,B)(x_i) ≤ [1/3, 1/3] for any x_i ∈ Ω, i.e. η(A, C) ⊇ η(A, B) ⊇ Θ. Similarly, η(A, C) ⊇ η(B, C) ⊇ Θ. Thus D(η(A, B), Θ) ≤ D(η(A, C), Θ) and D(η(B, C), Θ) ≤ D(η(A, C), Θ), and from the definition of entropy corresponding to a distance function we get E(η(A, C)) ≤ E(η(A, B)) and E(η(A, C)) ≤ E(η(B, C)), i.e. S(A, C) ≤ S(A, B) and S(A, C) ≤ S(B, C). ∎

Corollary 2: Let E be an entropy for IvIFSs and let η(A, B) be the IvIFS defined on A, B ∈ IvIFSs(Ω) as in Definition 7. Then E applied to the complement of η(A, B) is also a similarity measure for A, B ∈ IvIFSs(Ω).
Proof: Follows from the definition of the complement of IvIFSs and Theorem 5. ∎

4.1 Weighted similarity measure

Let w = (w_1, w_2, …, w_n)^T be the weights assigned to the elements x_i ∈ Ω, i = 1, 2, …, n, with w_i ≥ 0 and Σ_{i=1}^{n} w_i = 1. Then the weighted similarity measure based on any of the aforesaid similarity measures is defined as S(A, B) = Σ_{i=1}^{n} w_i S(A(x_i), B(x_i)).

4.2 Comparison with some existing similarity measures

Here, we compare the performance of the proposed similarity measures with some existing similarity measures. For any two IvIFSs A and B, some existing similarity measures are given as follows:

• S_W(A, B) = (1/n) Σ_{i=1}^{n} [ 4 − (MV_L(x_i) + MV_U(x_i) + NV_L(x_i) + NV_U(x_i)) + (HV_L(x_i) + HV_U(x_i)) ] / [ 4 + (MV_L(x_i) + MV_U(x_i) + NV_L(x_i) + NV_U(x_i)) + (HV_L(x_i) + HV_U(x_i)) ], where MV_L(x_i) = |MV_AL(x_i) − MV_BL(x_i)|, MV_U(x_i) = |MV_AU(x_i) − MV_BU(x_i)|, NV_L(x_i) = |NV_AL(x_i) − NV_BL(x_i)|, NV_U(x_i) = |NV_AU(x_i) − NV_BU(x_i)|, HV_L(x_i) = HV_AL(x_i) + HV_BL(x_i) and HV_U(x_i) = HV_AU(x_i) + HV_BU(x_i); proposed by Wu et al. (2014).
• S_HL(A, B) = 1 − (1/(4n)) Σ_{i=1}^{n} [ |MV_AL(x_i) − MV_BL(x_i)| + |MV_AU(x_i) − MV_BU(x_i)| + |NV_AL(x_i) − NV_BL(x_i)| + |NV_AU(x_i) − NV_BU(x_i)| ]; given by Hu and Li (2013).

• S_S(A, B) = (1/n) Σ_{i=1}^{n} [ (MV_AL(x_i) + MV_AU(x_i))(MV_BL(x_i) + MV_BU(x_i)) + (NV_AL(x_i) + NV_AU(x_i))(NV_BL(x_i) + NV_BU(x_i)) ] / [ √((MV_AL(x_i) + MV_AU(x_i))² + (NV_AL(x_i) + NV_AU(x_i))²) · √((MV_BL(x_i) + MV_BU(x_i))² + (NV_BL(x_i) + NV_BU(x_i))²) ]; introduced by Singh (2012).

• S_Su(A, B) = 1 − (1/n) Σ_{i=1}^{n} [ |MV_AL(x_i) − MV_BL(x_i)| ∨ |NV_AL(x_i) − NV_BL(x_i)| + |MV_AU(x_i) − MV_BU(x_i)| ∨ |NV_AU(x_i) − NV_BU(x_i)| ] / [ 3 − min{ |MV_AL(x_i) − MV_BL(x_i)| ∨ |NV_AL(x_i) − NV_BL(x_i)|, |MV_AU(x_i) − MV_BU(x_i)| ∨ |NV_AU(x_i) − NV_BU(x_i)| } ]; proposed by Sun & Liu (2012).

To review the performance of the similarity measures, consider the following IvIFSs:
A = {x_i, ⟨[0.5,0.5],[0.5,0.5],[0,0]⟩, x_i ∈ Ω},
B = {x_i, ⟨[0.3,0.4],[0.4,0.5],[0.1,0.3]⟩, x_i ∈ Ω},
C = {x_i, ⟨[0.3,0.3],[0.3,0.3],[0.4,0.4]⟩, x_i ∈ Ω},
D = {x_i, ⟨[0.6,0.6],[0.4,0.4],[0,0]⟩, x_i ∈ Ω}.
Intuitively, it is clear that A is more similar to D than to B and C. The results corresponding to the similarity measures are given in Table 3:

Measure         (A, B)    (A, C)    (A, D)
S_W             0.83333   0.69308   0.8181
S_HL            0.9       0.8       0.9
S_S             0.99227   0.9       0.98058
S_Su            0.89655   0.8571    0.9310
S_1(φ)          0.7450    0.73722   0.8646
S_2(φ)          0.78806   0.761886  0.89594
S_3(φ)          0.72614   0.68377   0.84188
S_4(φ)          0.78806   0.74188   0.89594
S_5(φ)          0.68210   0.62283   0.84391
S_6(φ), p = 2   0.77639   0.77141   0.87090
S_1(η)          0.93205   0.89426   0.98708
S_2(η)          0.95217   0.92075   0.99184
S_3(η)          0.91759   0.85853   0.98418
S_4(η)          0.95217   0.92075   0.99184
S_5(η)          0.92825   0.88113   0.98776
S_6(η), p = 2   0.93292   0.89590   0.98709
Table 3: Comparison of similarity measures.

From the similarity measures listed in Table 3 we can see that S_W, S_HL and S_S are inconsistent with intuition, whereas S_Su, S_j(φ) and S_j(η), j = 1, …, 6, are consistent with it.

In this section we have derived a relation between entropy and similarity measures, defined new similarity measures and compared their performance with existing similarity measures. In Section 5 we apply the proposed similarity measures to draw conclusions in pattern recognition and medical diagnosis.

5 Applications of the proposed similarity measures

Here the proposed similarity measures are applied to situations that deal with imperfect information.

5.1 Pattern recognition

We use an example of pattern recognition considered by Xu (2007) and adapted by Wei et al. (2011) and Wu et al. (2014) for the classification of building materials.

Example: There are four classes of building materials A_i, i = 1, 2, 3, 4, and an unknown building material B, characterized by IvIFSs defined on X = {x_1, x_2, …, x_12} with weight vector w = (0.1, 0.05, 0.08, 0.06, 0.03, 0.07, 0.09, 0.12, 0.15, 0.07, 0.13, 0.05)^T. The data, given by Xu (2007), are as follows:

“A_1 = {⟨x_1, [0.1,0.2],[0.5,0.6]⟩, ⟨x_2, [0.1,0.2],[0.7,0.8]⟩, ⟨x_3, [0.5,0.6],[0.3,0.4]⟩, ⟨x_4, [0.8,0.9],[0.0,0.1]⟩, ⟨x_5, [0.4,0.5],[0.3,0.4]⟩, ⟨x_6, [0.0,0.1],[0.8,0.9]⟩, ⟨x_7, [0.3,0.4],[0.5,0.6]⟩, ⟨x_8, [1.0,1.0],[0.0,0.0]⟩, ⟨x_9, [0.2,0.3],[0.6,0.7]⟩, ⟨x_10, [0.4,0.5],[0.4,0.5]⟩, ⟨x_11, [0.7,0.8],[0.1,0.2]⟩, ⟨x_12, [0.4,0.5],[0.4,0.5]⟩},

A_2 = {⟨x_1, [0.5,0.6],[0.3,0.4]⟩, ⟨x_2, [0.6,0.7],[0.1,0.2]⟩, ⟨x_3, [1.0,1.0],[0.0,0.0]⟩, ⟨x_4, [0.1,0.2],[0.6,0.7]⟩, ⟨x_5, [0.0,0.1],[0.8,0.9]⟩, ⟨x_6, [0.7,0.8],[0.1,0.2]⟩, ⟨x_7, [0.5,0.6],[0.3,0.4]⟩, ⟨x_8, [0.6,0.7],[0.2,0.3]⟩, ⟨x_9, [1.0,1.0],[0.0,0.0]⟩, ⟨x_10, [0.1,0.2],[0.7,0.8]⟩, ⟨x_11, [0.0,0.1],[0.8,0.9]⟩, ⟨x_12, [0.7,0.8],[0.1,0.2]⟩},

A_3 = {⟨x_1, [0.4,0.5],[0.3,0.4]⟩, ⟨x_2, [0.6,0.7],[0.2,0.3]⟩, ⟨x_3, [0.9,1.0],[0.0,0.0]⟩, ⟨x_4, [0.0,0.1],[0.8,0.9]⟩, ⟨x_5, [0.0,0.1],[0.8,0.9]⟩, ⟨x_6, [0.6,0.7],[0.2,0.3]⟩, ⟨x_7, [0.1,0.2],[0.7,0.8]⟩, ⟨x_8, [0.2,0.3],[0.6,0.7]⟩, ⟨x_9, [0.5,0.6],[0.2,0.4]⟩, ⟨x_10, [1.0,1.0],[0.0,0.0]⟩, ⟨x_11, [0.3,0.4],[0.4,0.5]⟩, ⟨x_12, [0.0,0.1],[0.8,0.9]⟩},

A_4 = {⟨x_1, [1.0,1.0],[0.0,0.0]⟩, ⟨x_2, [1.0,1.0],[0.0,0.0]⟩, ⟨x_3, [0.8,0.9],[0.0,0.1]⟩, ⟨x_4, [0.7,0.8],[0.1,0.2]⟩, ⟨x_5, [0.0,0.1],[0.7,0.9]⟩, ⟨x_6, [0.0,0.1],[0.8,0.9]⟩, ⟨x_7, [0.1,0.2],[0.7,0.8]⟩, ⟨x_8, [0.1,0.2],[0.7,0.8]⟩, ⟨x_9, [0.4,0.5],[0.3,0.4]⟩, ⟨x_10, [1.0,1.0],[0.0,0.0]⟩, ⟨x_11, [0.3,0.4],[0.4,0.5]⟩, ⟨x_12, [0.0,0.1],[0.8,0.9]⟩},

B = {⟨x_1, [0.9,1.0],[0.0,0.0]⟩, ⟨x_2, [0.9,1.0],[0.0,0.0]⟩, ⟨x_3, [0.7,0.8],[0.1,0.2]⟩, ⟨x_4, [0.6,0.7],[0.1,0.2]⟩, ⟨x_5, [0.0,0.1],[0.8,0.9]⟩, ⟨x_6, [0.1,0.2],[0.7,0.8]⟩, ⟨x_7, [0.1,0.2],[0.7,0.8]⟩, ⟨x_8, [0.1,0.2],[0.7,0.8]⟩, ⟨x_9, [0.4,0.5],[0.3,0.4]⟩, ⟨x_10, [1.0,1.0],[0.0,0.0]⟩, ⟨x_11, [0.3,0.4],[0.4,0.5]⟩, ⟨x_12, [0.0,0.1],[0.7,0.9]⟩}.”
We need to identify which pattern is most similar to B, using the maximum-degree principle for similarity measures between IvIFSs. Using the similarity measures defined in this paper, we get the results given in Table 4:

Measure         (A_1, B)  (A_2, B)  (A_3, B)  (A_4, B)
S_1(φ)          0.688569  0.803061  0.84671   0.936548
S_2(φ)          0.725141  0.854747  0.868809  0.95075
S_3(φ)          0.654357  0.742627  0.818343  0.9265
S_4(φ)          0.725141  0.861997  0.868809  0.95075
S_5(φ)          0.587711  0.789371  0.803214  0.926125
S_6(φ), p = 2   0.738413  0.80768   0.863582  0.939988
S_1(η)          0.780631  0.773645  0.768612  0.979413
S_2(η)          0.816725  0.811175  0.804125  0.98595
S_3(η)          0.767465  0.749444  0.76081   0.975
S_4(η)          0.816725  0.811175  0.804125  0.98595
S_5(η)          0.725088  0.716763  0.706188  0.978925
S_6(η), p = 2   0.814325  0.801194  0.810194  0.979588
Table 4: Application to pattern recognition.

From the above values it is clear that B is most similar to A_4, as every similarity measure attains its highest value for A_4. We can therefore conclude that the building material B belongs to the class A_4; this result is consistent with the results presented by Wu et al. (2014).

5.2 Medical diagnosis

Many authors, e.g. Wei et al. (2011), Wu et al. (2014) and Singh (2012), employed IvIFSs to perform medical diagnosis in their works. Here we use the data of Singh (2012) to perform medical diagnosis with the proposed similarity measures.

Example: Let A and B be the sets representing the diagnoses and the symptoms, respectively, given as A = {⟨A_1, Viral fever⟩, ⟨A_2, Malaria⟩, ⟨A_3, Typhoid⟩} and B = {⟨B_1, Temperature⟩, ⟨B_2, Headache⟩, ⟨B_3, Cough⟩}. Assume the patient is represented by
P = {⟨B_1, [0.6,0.8],[0.1,0.2]⟩, ⟨B_2, [0.3,0.7],[0.2,0.3]⟩, ⟨B_3, [0.6,0.8],[0.1,0.2]⟩},
the weights corresponding to the attributes are equal, and each diagnosis is given by the following IvIFSs:
A_1 = {⟨B_1, [0.4,0.5],[0.3,0.4]⟩, ⟨B_2, [0.4,0.6],[0.2,0.4]⟩, ⟨B_3, [0.4,0.8],[0.1,0.2]⟩},
A_2 = {⟨B_1, [0.3,0.6],[0.3,0.4]⟩, ⟨B_2, [0.5,0.6],[0.3,0.4]⟩, ⟨B_3, [0.4,0.5],[0.1,0.3]⟩},
A_3 = {⟨B_1, [0.7,0.8],[0.1,0.2]⟩, ⟨B_2, [0.6,0.7],[0.1,0.3]⟩, ⟨B_3, [0.3,0.4],[0.1,0.2]⟩}.
Using the proposed similarity measures we classify the patient P into one of the diagnoses A_1, A_2, A_3. The results are given in Table 5:

Measure         (A_1, P)  (A_2, P)  (A_3, P)
S_1(φ)          0.80666   0.753483  0.770859
S_2(φ)          0.840736  0.788069  0.808633
S_3(φ)          0.766473  0.703639  0.743099
S_4(φ)          0.840736  0.788069  0.808633
S_5(φ)          0.761105  0.682104  0.712949
S_6(φ), p = 2   0.821222  0.776552  0.796418
S_1(η)          0.916133  0.872675  0.885256
S_2(η)          0.938333  0.9025    0.911667
S_3(η)          0.89835   0.847525  0.865842
S_4(η)          0.938333  0.9025    0.911667
S_5(η)          0.9075    0.85375   0.8675
S_6(η), p = 2   0.918319  0.877512  0.891129
Table 5: Application to medical diagnosis.

From Table 5 it is clear that patient P can be diagnosed with viral fever.

6 Conclusion

Entropy, distance and similarity measures are significant research areas in fuzzy information theory, as they are efficient tools for dealing with uncertain and insufficient information. Here we have derived a new distance-based definition of entropy that takes the degree of hesitancy into account and derived relations between distance, entropy and similarity measures under the IvIFE. Further, we have compared the derived similarity measures with some existing similarity measures, and examples show that the derived measures are able to draw conclusions in cases where existing measures give the same result. Finally, the proposed similarity measures have been applied to pattern recognition and medical diagnosis.

References

[1] Atanassov, K. (1986). Intuitionistic fuzzy sets. Fuzzy Sets and Systems, 20(1), 87–96. https://doi.org/10.1016/S0165-0114(86)80034-3
[2] Atanassov, K. T. (1999). Intuitionistic Fuzzy Sets. Heidelberg, New York: Physica-Verlag. https://doi.org/10.1007/978-3-7908-1870-3
[3] Atanassov, K., & Gargov, G. (1989). Interval-valued intuitionistic fuzzy sets. Fuzzy Sets and Systems, 31(3), 343–349. https://doi.org/10.1016/0165-0114(89)90205-4
[4] Dammak, F., Baccour, L., & Alimi, A. M. (2016). An exhaustive study of possibility measures of interval-valued intuitionistic fuzzy sets and application to multicriteria decision making. Advances in Fuzzy Systems, Vol. 2016, Article ID 9185706, 10 pages. http://dx.doi.org/10.1155/2016/9185706
[5] De Luca, A., & Termini, S. (1972). A definition of a non-probabilistic entropy in the setting of fuzzy sets. Information and Control, 20, 301–312. https://doi.org/10.1016/S0019-9958(72)90199-4
[6] Gau, W. L., & Buehrer, D. J. (1993). Vague sets. IEEE Transactions on Systems, Man and Cybernetics, 23(2), 610–614. https://doi.org/10.1109/21.229476
[7] Grzegorzewski, P. (2004). Distances between intuitionistic fuzzy sets and/or interval-valued fuzzy sets based on the Hausdorff metric. Fuzzy Sets and Systems, 148, 319–328. https://doi.org/10.1016/j.fss.2003.08.005
[8] Hu, K., & Li, J. (2013). The entropy and similarity measure of interval valued intuitionistic fuzzy sets and their relationship. International Journal of Fuzzy Systems, 15(3), 279–288.
[9] Hung, W., & Yang, M. (2006). Fuzzy entropy on intuitionistic fuzzy sets. International Journal of Intelligent Systems, 21, 443–451. https://doi.org/10.1002/int.20131
[10] Kacprzyk, J. (1997). Multistage Fuzzy Control. Chichester: Wiley.
[11] Liu, X. C. (1992). Entropy, distance measure and similarity measure of fuzzy sets and their relations. Fuzzy Sets and Systems, 52, 305–318. https://doi.org/10.1016/0165-0114(92)90239-Z
[12] Park, J. H., Lim, K. M., Park, J. S., & Kwun, Y. C. (2007). Distances between interval-valued intuitionistic fuzzy sets. Journal of Physics: Conference Series, 96, 012089. https://doi.org/10.1088/1742-6596/96/1/012089
[13] Singh, P. (2012). A new method on measure of similarity between interval-valued intuitionistic fuzzy sets for pattern recognition. Journal of Applied & Computational Mathematics, 1(1), 1–5. https://doi.org/10.4172/2168-9679.1000101
[14] Sun, M., & Liu, J. (2012). New entropy and similarity measures for interval-valued intuitionistic fuzzy sets. Journal of Information & Computational Science, 9(18), 5799–5806.
[15] Szmidt, E., & Kacprzyk, J. (2000). Distances between intuitionistic fuzzy sets. Fuzzy Sets and Systems, 114(3), 505–518. https://doi.org/10.1016/S0165-0114(98)00244-9
[16] Tiwari, P., & Gupta, P. (in press). Generalized interval-valued intuitionistic fuzzy entropy with some similarity measures. International Journal of Computing Science and Mathematics.
[17] Wei, C. P., Wang, P., & Zhang, Y. Z. (2011). Entropy, similarity measures of interval-valued intuitionistic fuzzy sets and their applications. Information Sciences, 181, 4273–4286. https://doi.org/10.1016/j.ins.2011.06.001
[18] Wu, C., Luo, P., Li, Y., & Ren, X. (2014). A new similarity measure of interval-valued intuitionistic fuzzy sets considering its hesitancy degree and applications in expert systems. Mathematical Problems in Engineering. http://dx.doi.org/10.1155/2014/359214
[19] Xia, M., & Xu, Z. (2010). Some new similarity measures for intuitionistic fuzzy values and their application in group decision making. Journal of Systems Science and Systems Engineering, 19(4), 430–452. https://doi.org/10.1007/s11518-010-5151-9
[20] Xu, Z. (2007a). On similarity measures of interval-valued intuitionistic fuzzy sets and their application to pattern recognitions. Journal of Southeast University (English Edition), 23(1), 139–143.
[21] Xu, Z. (2007b). Some similarity measures of intuitionistic fuzzy sets and their application to multiple attribute decision making. Fuzzy Optimization and Decision Making, 6(2), 109–121. https://doi.org/10.1007/s10700-007-9004-z
[22] Xu, Z. S., & Yager, R. R. (2009). Intuitionistic and interval-valued intuitionistic fuzzy preference relations and their measures of similarity for the evaluation of agreement within a group. Fuzzy Optimization and Decision Making, 8(2), 123–139. https://doi.org/10.1007/s10700-009-9056-3
[23] Yang, Y., & Chiclana, F. (2012). Consistency of 2D and 3D distances of intuitionistic fuzzy sets. Expert Systems with Applications, 39(10), 8665–8670. https://doi.org/10.1016/j.eswa.2012.01.199
[24] Yang, Y. J., & Hinde, C. (2010). A new extension of fuzzy sets using rough sets: R-fuzzy sets. Information Sciences, 180, 354–365. https://doi.org/10.1016/j.ins.2009.10.004
[25] Ye, J. (2012). Multicriteria decision-making method using the Dice similarity measure based on the reduct intuitionistic fuzzy sets of interval-valued intuitionistic fuzzy sets. Applied Mathematical Modelling, 36(9), 4466–4472. https://doi.org/10.1016/j.apm.2011.11.075
[26] Zadeh, L. A. (1965). Fuzzy sets. Information and Control, 8(3), 338–356. https://doi.org/10.1016/S0019-9958(65)90241-X
[27] Zadeh, L. A. (1975). The concept of a linguistic variable and its application to approximate reasoning-I. Information Sciences, 8, 199–249. https://doi.org/10.1016/0020-0255(75)90036-5
[28] Zhang, Q. S., Jiang, S. Y., Jia, B. G., & Luo, S. H. (2010). Some information measures for interval-valued intuitionistic fuzzy sets. Information Sciences, 180, 5130–5145. https://doi.org/10.1016/j.ins.2010.08.038
[29] Zhang, Y. J., Ma, P. J., Su, H., & Zhang, C. P. (2011). Entropy on interval-valued intuitionistic fuzzy sets and its application in multi-attribute decision making. Proceedings of the 14th International Conference on Information Fusion (FUSION 2011), 1–7.
[30] Zhang, S., Li, X., & Meng, F. (2016). An approach to multi-criteria decision making under interval-valued intuitionistic fuzzy values and interval fuzzy measures. Journal of Industrial and Production Engineering, 33(4). https://doi.org/10.1080/21681015.2016.1146362
[31] Zhang, Q., Xing, H., Liu, F., Ye, J., & Tang, P. (2014). Some entropy measures for interval-valued intuitionistic fuzzy sets based on distances and their relationships with similarity and inclusion measures. Information Sciences, 283, 55–69. https://doi.org/10.1016/j.ins.2014.06.012
[32] Zhang, H., Zhang, W., & Mei, C. (2009). Entropy of interval-valued fuzzy sets based on distance and its relationship with similarity measure. Knowledge-Based Systems, 22, 449–454. https://doi.org/10.1016/j.knosys.2009.06.007