Assignment on Rough Set Theory

Submitted by
Rosemelyne Wartde
MTech IT, 2nd Semester
Roll No: 20MTechIT02
Thus, the information table represents input data gathered from any domain.

Case | Temperature | Headache | Nausea | Cough
-----|-------------|----------|--------|------
1    | High        | yes      | no     | yes
2    | Very_high   | yes      | yes    | no
3    | High        | no       | no     | no
4    | High        | yes      | yes    | yes
5    | Normal      | yes      | no     | no
6    | Normal      | no       | yes    | yes
An information system is a pair (U, A), where U is a non-empty finite set of objects and A is a non-empty finite set of attributes.
A decision table is an information table extended with a decision attribute (here, Flu):

Case | Temperature | Headache | Nausea | Cough | Flu
A way of reducing table size is to store only one representative object for every set of objects with the same features.
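As a sketch, the information table above can be encoded as a plain mapping and deduplicated by keeping one representative per group of indiscernible objects; the variable and attribute names below are illustrative assumptions, not part of the assignment.

```python
# Hypothetical encoding of the flu table above: each case mapped to its
# attribute values.
table = {
    1: {"Temperature": "High", "Headache": "yes", "Nausea": "no", "Cough": "yes"},
    2: {"Temperature": "Very_high", "Headache": "yes", "Nausea": "yes", "Cough": "no"},
    3: {"Temperature": "High", "Headache": "no", "Nausea": "no", "Cough": "no"},
    4: {"Temperature": "High", "Headache": "yes", "Nausea": "yes", "Cough": "yes"},
    5: {"Temperature": "Normal", "Headache": "yes", "Nausea": "no", "Cough": "no"},
    6: {"Temperature": "Normal", "Headache": "no", "Nausea": "yes", "Cough": "yes"},
}

def representatives(table, attrs):
    """Keep one representative object per group of objects that share
    the same values on `attrs` (one per indiscernibility class)."""
    seen = {}
    for obj, row in sorted(table.items()):
        key = tuple(row[a] for a in attrs)
        seen.setdefault(key, obj)  # first object stands for its group
    return sorted(seen.values())

# On Temperature alone, cases 1/3/4 collapse into one row and 5/6 into
# another, so three representatives remain.
print(representatives(table, ["Temperature"]))  # → [1, 2, 5]
```

On the full attribute set every case is distinct, so no reduction occurs there.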
The accuracy of approximation of a set X with respect to attribute set P is

αP(X) = |P̲X| / |P̄X|

where |X| denotes the cardinality of a set X, P̲X is the lower approximation of X, and P̄X is the (non-empty) upper approximation of X. Obviously, αP(X) lies in [0, 1]:
if αP(X) = 1, the upper and lower approximations are equal and X becomes a crisp set with respect to P.
if αP(X) < 1, X is rough with respect to P.
if αP(X) = 0, the lower approximation is empty (regardless of the size of the upper approximation).
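The lower and upper approximations, and the accuracy derived from them, can be sketched in a few lines; the table and the target set X = {2, 3, 6} below are assumptions chosen for illustration.

```python
# Hypothetical slice of the flu table above (two condition attributes).
table = {
    1: {"Temperature": "High", "Headache": "yes"},
    2: {"Temperature": "Very_high", "Headache": "yes"},
    3: {"Temperature": "High", "Headache": "no"},
    4: {"Temperature": "High", "Headache": "yes"},
    5: {"Temperature": "Normal", "Headache": "yes"},
    6: {"Temperature": "Normal", "Headache": "no"},
}

def ind_classes(table, attrs):
    """Equivalence classes of the indiscernibility relation IND(attrs)."""
    classes = {}
    for obj, row in table.items():
        classes.setdefault(tuple(row[a] for a in attrs), set()).add(obj)
    return list(classes.values())

def accuracy(table, attrs, X):
    """alpha_P(X) = |lower approximation| / |upper approximation|."""
    lower, upper = set(), set()
    for cls in ind_classes(table, attrs):
        if cls <= X:   # class entirely inside X -> part of the lower approx
            lower |= cls
        if cls & X:    # class touching X -> part of the upper approx
            upper |= cls
    return len(lower) / len(upper)  # upper approximation assumed non-empty

# With P = {Headache} and X = {2, 3, 6}: lower = {3, 6}, upper = U,
# so alpha = 2/6, i.e. X is rough with respect to P.
print(accuracy(table, ["Headache"], {2, 3, 6}))
```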
Attribute Dependency
Attribute dependency describes which variables are strongly related to which other variables. A set of attributes Q depends totally on a set of attributes P, denoted P ⇒ Q, if all values of attributes from Q are uniquely determined by values of attributes from P.
Let us take two disjoint sets of attributes, set P and set Q. Each attribute set induces
an indiscernibility or equivalence class structure. The equivalence classes induced by
P is given by [x]P and the equivalence classes induced by Q is given by [x]Q.
Let Qi be a given equivalence class from the equivalence-class structure induced by attribute set Q. The dependency of attribute set Q on attribute set P, k or γ(P, Q), is given by:

γ(P, Q) = |POSP(Q)| / |U| = (1/|U|) Σi |P̲Qi|

where P̲Qi is the lower approximation of Qi with respect to P, and POSP(Q) is the positive region: the set of objects that P classifies unambiguously into the classes of U/Q.
Note that:
If k = γ(P, Q) = 1, Q depends totally on P.
If k = γ(P, Q) < 1, Q depends partially (in a degree k) on P.
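The dependency degree can be sketched via the positive region; the table below and the choice P = {Temperature}, Q = {Headache} are illustrative assumptions.

```python
# Hypothetical slice of the flu table above.
table = {
    1: {"Temperature": "High", "Headache": "yes"},
    2: {"Temperature": "Very_high", "Headache": "yes"},
    3: {"Temperature": "High", "Headache": "no"},
    4: {"Temperature": "High", "Headache": "yes"},
    5: {"Temperature": "Normal", "Headache": "yes"},
    6: {"Temperature": "Normal", "Headache": "no"},
}

def ind_classes(table, attrs):
    """Equivalence classes of the indiscernibility relation IND(attrs)."""
    classes = {}
    for obj, row in table.items():
        classes.setdefault(tuple(row[a] for a in attrs), set()).add(obj)
    return list(classes.values())

def dependency(table, P, Q):
    """k = gamma(P, Q): fraction of objects whose P-class falls entirely
    inside a single Q-class (the positive region POS_P(Q))."""
    q_classes = ind_classes(table, Q)
    pos = set()
    for cls in ind_classes(table, P):
        if any(cls <= q for q in q_classes):
            pos |= cls
    return len(pos) / len(table)

# Only the Temperature class {2} (Very_high) lies wholly inside one
# Headache class, so k = 1/6: Q depends only partially on P.
print(dependency(table, ["Temperature"], ["Headache"]))
```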
Reduct
The same or indiscernible objects may be represented several times, and some of the attributes may be superfluous or redundant.
We should keep only those attributes that preserve the indiscernibility relation and, consequently, the set approximation.
There are usually several such subsets of attributes; those which are minimal are called reducts.
A reduct is a sufficient set of features which by itself can fully characterize the knowledge in the database.
Some of the important features of Reduct are –
->Produces the same equivalence-class structure as that expressed by the full attribute set, which can be expressed as [x]RED = [x]P.
->It is minimal.
->It is not unique.
Algorithm for Reduct Calculation
Input:
C, the set of all conditional features
D, the set of all decisional features
Output: R, a feature subset
1. T := { }, R := { }
2. repeat
3.   T := R
4.   ∀ x ∈ (C − R)
5.     if γR∪{x}(D) > γT(D)
6.       T := R ∪ {x}
7.   R := T
8. until γR(D) = γC(D)
9. return R
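The loop above can be sketched as a greedy search. The Flu decision values below are hypothetical, chosen for illustration, since the first information table in this assignment does not list them.

```python
# Flu table with an assumed decision column "Flu" (values hypothetical).
table = {
    1: {"Temperature": "High", "Headache": "yes", "Nausea": "no", "Cough": "yes", "Flu": "yes"},
    2: {"Temperature": "Very_high", "Headache": "yes", "Nausea": "yes", "Cough": "no", "Flu": "yes"},
    3: {"Temperature": "High", "Headache": "no", "Nausea": "no", "Cough": "no", "Flu": "no"},
    4: {"Temperature": "High", "Headache": "yes", "Nausea": "yes", "Cough": "yes", "Flu": "yes"},
    5: {"Temperature": "Normal", "Headache": "yes", "Nausea": "no", "Cough": "no", "Flu": "no"},
    6: {"Temperature": "Normal", "Headache": "no", "Nausea": "yes", "Cough": "yes", "Flu": "no"},
}

def ind_classes(table, attrs):
    """Equivalence classes of the indiscernibility relation IND(attrs)."""
    classes = {}
    for obj, row in table.items():
        classes.setdefault(tuple(row[a] for a in attrs), set()).add(obj)
    return list(classes.values())

def dependency(table, P, Q):
    """gamma(P, Q) = |POS_P(Q)| / |U|."""
    q_classes = ind_classes(table, Q)
    pos = set()
    for cls in ind_classes(table, P):
        if any(cls <= q for q in q_classes):
            pos |= cls
    return len(pos) / len(table)

def reduct(table, C, D):
    """Greedy reduct search, mirroring the pseudocode above: repeatedly
    add the attribute x in C - R that most increases gamma_R(D), until
    gamma_R(D) reaches gamma_C(D)."""
    gamma_C = dependency(table, C, D)
    R = []
    while dependency(table, R, D) < gamma_C:
        best, best_gamma = None, dependency(table, R, D)
        for x in C:
            if x not in R:
                g = dependency(table, R + [x], D)
                if g > best_gamma:
                    best, best_gamma = x, g
        if best is None:
            break  # no single attribute improves gamma; stop
        R.append(best)
    return R

print(reduct(table, ["Temperature", "Headache", "Nausea", "Cough"], ["Flu"]))
# With these assumed Flu values: ['Temperature', 'Headache']
```

Note that, like the pseudocode, this greedy search finds a small attribute subset with full dependency but is not guaranteed to find a minimal reduct in every case.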
Core
The core is the set of attributes common to all reducts, denoted by CORE(P) = ∩ RED(P).
Reduct Calculation:
The set {Muscle_pain, Temp.} is a reduct of the original set of attributes {Headache, Muscle_pain, Temp.}. So, Reduct1 = {Muscle_pain, Temp.}. A new information table based on this Reduct1 is represented as –

U      | Muscle_pain | Temp.     | Flu
-------|-------------|-----------|-----
U1, U4 | Yes         | Normal    | No
U2     | Yes         | High      | Yes
U3, U6 | Yes         | Very_High | Yes
U5     | No          | High      | No
An Example of Reducts & Core
The set {Headache, Temp.} is a reduct of the original set of attributes {Headache, Muscle_pain, Temp.}. So, Reduct2 = {Headache, Temp.}. A new information table based on this Reduct2 is represented as –