Data Mining - Knowledge Presentation 2

Prof. Sin-Min Lee

Overview

• Association rules are useful in that they suggest hypotheses for future research
• Association rules integrated into the generic/actual argument model can assist in identifying the most plausible claim from given data items in a forward inference way, or the likelihood of missing data values in a backward inference way

What is data mining? What is knowledge discovery from databases (KDD)?

• Knowledge discovery in databases (KDD) is the "non-trivial extraction of implicit, previously unknown, and potentially useful information from data"
• KDD encompasses a number of different technical approaches, such as clustering, data summarization, learning classification rules, finding dependency networks, analyzing changes, and detecting anomalies
• KDD has only recently emerged because we have only recently been gathering vast quantities of data

Examples of KDD studies

• Mangasarian et al (1997): breast cancer diagnosis. A sample from a breast lump mass is assessed by:
  - mammography (not sensitive: 68%-79%)
  - data mining from FNA test results and visual inspection (65%-98%)
  - surgery (100%, but invasive and expensive)
• Basket analysis: people who buy nappies also buy beer
• Bhandary et al (1997): NBA (National Basketball Association of America) player pattern profiles
• Credit card fraud detection
• Stranieri/Zeleznikow (1997): predict family law property outcomes
• Rissland and Friedman (1997): discovers a change in the concept of 'good faith' in US Bankruptcy cases
• Pannu (1995): discovers a prototypical case from a library of cases
• Wilkins and Pillaipakkamnatt (1997): predicts the time a case takes to be heard
• Veliev et al (1999): association rules for economic analysis

Overview of the process of knowledge discovery in databases

Raw data → [Select] → Target data → [Preprocess] → Preprocessed data → [Transform] → Transformed data → [Data mining] → Patterns → [Interpret patterns] → Knowledge

from Fayyad, Piatetsky-Shapiro, Smyth (1996)

Phase 4. Data mining

• Finding patterns in data or fitting models to data
• Categories of techniques:
  - Predictive (classification: neural networks, rule induction, linear and multiple regression)
  - Segmentation (clustering: k-means, k-median)
  - Summarisation (associations, visualisation)
  - Change detection/modelling

What Is Association Mining?

• Association rule mining: finding frequent patterns, associations, correlations, or causal structures among sets of items or objects in transaction databases, relational databases, and other information repositories.
• Applications: basket data analysis, cross-marketing, catalog design, loss-leader analysis, clustering, classification, etc.
• Rule form: "Body → Head [support, confidence]"
• Examples:
  - buys(x, "diapers") → buys(x, "beers") [0.5%, 60%]
  - major(x, "CS") ^ takes(x, "DB") → grade(x, "A") [1%, 75%]
• More examples:
  - age(X, "20..29") ^ income(X, "20..29K") → buys(X, "PC") [support = 2%, confidence = 60%]
  - contains(T, "computer") → contains(T, "software") [1%, 75%]

Association rules are a data mining technique

• An association rule tells us something about the association between two attributes
• Agrawal et al (1993) developed the first association rule algorithm, Apriori
• A famous (but unsubstantiated) association rule from a hypothetical supermarket transaction database is: if nappies then beer (80%). Read this as: nappies are bought implies beer is bought 80% of the time
• Association rules have only recently been applied to law, with promising results
• Association rules can automatically discover rules that may prompt an analyst to think of hypotheses they would not otherwise have considered

Rule Measures: Support and Confidence

• Support and confidence are two independent notions
• Find all the rules X & Y ⇒ Z with minimum confidence and support
  - support, s: probability that a transaction contains {X, Y, Z}
  - confidence, c: conditional probability that a transaction having {X, Y} also contains Z

Transaction ID | Items Bought
2000           | A, B, C
1000           | A, C
4000           | A, D
5000           | B, E, F

With minimum support 50% and minimum confidence 50%, we have:
  - A ⇒ C (50%, 66.6%)
  - C ⇒ A (50%, 100%)

Mining Association Rules - An Example

Transaction ID | Items Bought
2000           | A, B, C
1000           | A, C
4000           | A, D
5000           | B, E, F

Min. support 50%, min. confidence 50%

Frequent Itemset | Support
{A}              | 75%
{B}              | 50%
{C}              | 50%
{A,C}            | 50%

For rule A ⇒ C:
  support = support({A, C}) = 50%
  confidence = support({A, C}) / support({A}) = 66.6%

Two Step Association Rule Mining

Step 1: Frequent itemset generation - use support
Step 2: Rule generation - use confidence

{milk, bread} is a frequent itemset: folks buying milk also buy bread. Is the reverse also true: "folks buying bread also buy milk"?

Confidence and support of an association rule

• 80% is the confidence of the rule if nappies then beer (80%). This is calculated as n2/n1 where:
  - n1 = number of records where nappies were bought
  - n2 = number of records where nappies were bought and beer was also bought
• If there are 1000 transactions for nappies, and of those, 800 also had beer, then confidence is 80%.
• A rule may have a high confidence but not be interesting because it doesn't apply to many records in the database. Support = number of records where nappies were bought with beer / total records.
• Rules that may be interesting have a confidence level and a support level above a user-set threshold.


Association rule screen shot with A-Miner from Split Up data set

• In 73.4% of cases where the wife's needs are some to high, the husband's future needs are few to some.
• This prompts an analyst to posit plausible hypotheses, e.g. the rule may reflect the fact that more women than men remain custodial parents of the children following divorce; the women that have some to high needs may do so because of their obligation to children.

Mining Frequent Itemsets: the Key Step

• Find the frequent itemsets: the sets of items that have minimum support
  - A subset of a frequent itemset must also be a frequent itemset (the Apriori principle)
    - i.e., if {A, B} is a frequent itemset, both {A} and {B} must be frequent itemsets
  - Iteratively find frequent itemsets with cardinality from 1 to k (k-itemsets)
• Use the frequent itemsets to generate association rules.

The Apriori Algorithm

Ck: candidate itemsets of size k
Lk: frequent itemsets of size k

• Join step: Ck is generated by joining Lk-1 with itself
• Prune step: any (k-1)-itemset that is not frequent cannot be a subset of a frequent k-itemset
• Pseudo-code:

  L1 = {frequent items};
  for (k = 1; Lk != ∅; k++) do begin
      Ck+1 = candidates generated from Lk;
      for each transaction t in database do
          increment the count of all candidates in Ck+1 that are contained in t;
      Lk+1 = candidates in Ck+1 with min_support;
  end
  return ∪k Lk;
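The pseudo-code above can be sketched as a runnable function. This is an illustrative Python rendering, not the authors' implementation; `apriori` is a hypothetical name, and `min_support` is taken as an absolute count as in the worked example that follows:

```python
from itertools import combinations

def apriori(transactions, min_support):
    """Level-wise frequent-itemset mining following the slide's pseudo-code.
    `transactions`: list of sets; `min_support`: absolute support count.
    Returns {frozenset: count} for every frequent itemset."""
    # L1: count single items, keep those meeting min_support
    counts = {}
    for t in transactions:
        for item in t:
            key = frozenset([item])
            counts[key] = counts.get(key, 0) + 1
    Lk = {s: c for s, c in counts.items() if c >= min_support}
    frequent = dict(Lk)

    k = 1
    while Lk:
        # Join step: union pairs of frequent k-itemsets into (k+1)-candidates
        keys = list(Lk)
        candidates = set()
        for i in range(len(keys)):
            for j in range(i + 1, len(keys)):
                union = keys[i] | keys[j]
                # Prune step: every k-subset of a candidate must be frequent
                if len(union) == k + 1 and all(
                        frozenset(s) in Lk for s in combinations(union, k)):
                    candidates.add(union)
        # One scan of the database counts all surviving candidates
        counts = {c: 0 for c in candidates}
        for t in transactions:
            for c in counts:
                if c <= t:
                    counts[c] += 1
        Lk = {s: c for s, c in counts.items() if c >= min_support}
        frequent.update(Lk)
        k += 1
    return frequent

# Database D from the slides: TIDs 100, 200, 300, 400
D = [{1, 3, 4}, {2, 3, 5}, {1, 2, 3, 5}, {2, 5}]
freq = apriori(D, min_support=2)
print(freq[frozenset({2, 3, 5})])  # 2, matching L3 in the worked example
```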

Association rules in law

• Association rule generators are typically packaged with very expensive data mining suites. We developed A-Miner (available from the authors) for a PC platform.
• Typically, too many association rules are generated for feasible analysis, so our current research involves exploring metrics of interestingness to restrict the number of rules that might be interesting.
• In general, structured data is not collected in law as it is in other domains, so very large databases are rare.
• Our current research involves 380,000 records from a Legal Aid organization database that contains data on client features.
• ArgumentDeveloper is a shell that can be used by judges to structure their reasoning in a way that will facilitate data collection and reasoning.

The Apriori Algorithm - Example (min. support count = 2)

Database D:
TID | Items
100 | 1 3 4
200 | 2 3 5
300 | 1 2 3 5
400 | 2 5

Scan D for counts of candidate 1-itemsets:
C1: {1}: 2, {2}: 3, {3}: 3, {4}: 1, {5}: 3
L1: {1}: 2, {2}: 3, {3}: 3, {5}: 3    ({4} is infrequent)

Generate C2 from L1 and scan D:
C2: {1 2}: 1, {1 3}: 2, {1 5}: 1, {2 3}: 2, {2 5}: 3, {3 5}: 2
L2: {1 3}: 2, {2 3}: 2, {2 5}: 3, {3 5}: 2

Join Operation - Example

L2 join L2 (keeping only unions of three items):
{1 3} ⋈ {1 3} → null        {2 3} ⋈ {2 3} → null
{1 3} ⋈ {2 3} → {1 2 3}     {2 3} ⋈ {2 5} → {2 3 5}
{1 3} ⋈ {2 5} → null        {2 3} ⋈ {3 5} → {2 3 5}
{1 3} ⋈ {3 5} → {1 3 5}     {2 5} ⋈ {2 5} → null
                            {2 5} ⋈ {3 5} → {2 3 5}

{1 2 3} and {1 3 5} are pruned because their subsets {1 2} and {1 5} are infrequent.

C3: {2 3 5}

Scan D:
L3: {2 3 5}: 2

Anti-Monotone Property

If a set cannot pass a test, all of its supersets will fail the same test as well.

• If {2 3} does not have minimum support, nor will {1 2 3}, {2 3 5}, {1 2 3 5}, etc.
• If {2 3} occurs only 5 times, can {2 3 5} occur 8 times? No: every transaction containing {2 3 5} also contains {2 3}.
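A minimal sketch of why the property holds on the worked database D (illustrative Python; `count_occurrences` is a hypothetical helper): a transaction containing {2 3 5} necessarily contains {2 3}, so a superset's count can never exceed a subset's.

```python
def count_occurrences(itemset, transactions):
    """Number of transactions that contain every item in `itemset`."""
    return sum(1 for t in transactions if set(itemset) <= t)

# Database D from the worked Apriori example.
D = [{1, 3, 4}, {2, 3, 5}, {1, 2, 3, 5}, {2, 5}]

print(count_occurrences({2, 3}, D))     # 2
print(count_occurrences({2, 3, 5}, D))  # 2 -- at most the count of {2 3}
```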

How to Generate Candidates?

• Suppose the items in Lk-1 are listed in an order
• Step 1: self-joining Lk-1

  insert into Ck
  select p.item1, p.item2, …, p.itemk-1, q.itemk-1
  from Lk-1 p, Lk-1 q
  where p.item1 = q.item1, …, p.itemk-2 = q.itemk-2, p.itemk-1 < q.itemk-1

• Step 2: pruning

  forall itemsets c in Ck do
      forall (k-1)-subsets s of c do
          if (s is not in Lk-1) then delete c from Ck

Example of Generating Candidates

• L3 = {abc, abd, acd, ace, bcd}
• Self-joining: L3 * L3
  - abcd from abc and abd
  - acde from acd and ace
• Pruning:
  - acde is removed because ade is not in L3
• C4 = {abcd}
• Pruning avoids the problem of a pure generate-&-test heuristic
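The self-join and prune steps above can be sketched as one function (an illustrative Python version; `gen_candidates` is a hypothetical name, and itemsets are sorted tuples, mirroring the slide's assumption that items in Lk-1 are listed in an order):

```python
from itertools import combinations

def gen_candidates(Lk_minus_1):
    """Generate Ck from Lk-1: self-join on the first k-2 items, then prune."""
    prev = set(Lk_minus_1)
    k_minus_1 = len(Lk_minus_1[0])
    candidates = []
    for p in Lk_minus_1:
        for q in Lk_minus_1:
            # Join: equal on first k-2 items, p's last item < q's last item
            if p[:-1] == q[:-1] and p[-1] < q[-1]:
                c = p + (q[-1],)
                # Prune: every (k-1)-subset of c must be in Lk-1
                if all(s in prev for s in combinations(c, k_minus_1)):
                    candidates.append(c)
    return candidates

L3 = [("a", "b", "c"), ("a", "b", "d"), ("a", "c", "d"),
      ("a", "c", "e"), ("b", "c", "d")]
print(gen_candidates(L3))  # [('a', 'b', 'c', 'd')] -- acde pruned: ade not in L3
```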

[Figure: Association rules can be used for forward and backward inferences in the generic/actual argument model for sentencing armed robbery (19 May, 2001). The model's factors include: severity of the prior convictions constellation, serious offender status, offender's health, offender's age, seriousness of armed robbery as an offence relative to other offences, moral culpability of the offender, degree of remorse displayed by the offender, seriousness of the offence relative to other armed robberies, co-operation, the extent to which retribution, specific deterrence, general deterrence, rehabilitation and community protection are appropriate purposes, and the offender's plea. These factors lead to an offender-alone penalty drawn from: imprisonment, combined custody and treatment order, hospital security order, intensive correction order, suspended sentence, youth training centre detention, community based order, fine, adjournment on conditions, discharge offender, dismiss offence, or defer sentence.]


Forward inference: confidence

• In the sentence actual argument database, the following outcomes were noted for the inputs suggested:

  Imprisonment                           57%
  Combined custody and treatment order   0.1%
  Hospital security order                0%
  Intensive correction order             12%
  Suspended sentence                     2%
  Youth training centre detention        10%
  Community based order                  16%
  Fine                                   0%
  Adjournment on conditions              0%
  Discharge offender                     0%

Backward inference: constructing the strongest argument

If all the items you suggest, AND:

  If extremely serious pattern of priors then imprisonment   90%   2%
  If very serious pattern of priors then imprisonment        75%   7%
  If serious pattern of priors then imprisonment             68%   17%
  If not so serious pattern of priors then imprisonment      78%   17%
  If no prior convictions then imprisonment                  2%    3%

Conclusion

• Data mining, or knowledge discovery from databases, has not been appropriately exploited in law to date.
• Association rules are useful in that they suggest hypotheses for future research.
• Association rules integrated into the generic/actual argument model can assist in identifying the most plausible claim from given data items in a forward inference way, or the likelihood of missing data values in a backward inference way.

Generating Association Rules

• For each nonempty subset s of l, output the rule s ⇒ (l - s) if support_count(l) / support_count(s) >= min_conf, where min_conf is the minimum confidence threshold.
• For l = {2 3 5}, the nonempty proper subsets s of l are {2 3}, {3 5}, {2 5}, {2}, {3} and {5}.
• Candidate rules: {2 3} ⇒ {5}, {3 5} ⇒ {2}, {2 5} ⇒ {3}, {2} ⇒ {3 5}, {3} ⇒ {2 5}, {5} ⇒ {2 3}

Using the support counts from the worked Apriori example:

  itemset | sup.      itemset | sup      itemset | sup
  {1}     | 2         {1 2}   | 1        {2 3 5} | 2
  {2}     | 3         {1 3}   | 2
  {3}     | 3         {1 5}   | 1
  {4}     | 1         {2 3}   | 2
  {5}     | 3         {2 5}   | 3
                      {3 5}   | 2

  {2 3} ⇒ {5} : 2/2     {2} ⇒ {3 5} : 2/3
  {3 5} ⇒ {2} : 2/2     {3} ⇒ {2 5} : 2/3
  {2 5} ⇒ {3} : 2/3     {5} ⇒ {2 3} : 2/3

With min_conf = 75%, only {2 3} ⇒ {5} and {3 5} ⇒ {2} (2/2 each) are kept; the remaining four rules have confidence 2/3 and are discarded.
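The rule-generation step above can be sketched as follows (illustrative Python; `gen_rules` is a hypothetical name, and the support counts are those from the worked Apriori example):

```python
from itertools import combinations

def gen_rules(l, support_count, min_conf):
    """For each nonempty proper subset s of l, emit s => (l - s)
    when support_count[l] / support_count[s] >= min_conf."""
    l = frozenset(l)
    rules = []
    for r in range(1, len(l)):
        for s in combinations(l, r):
            s = frozenset(s)
            conf = support_count[l] / support_count[s]
            if conf >= min_conf:
                rules.append((set(s), set(l - s), conf))
    return rules

# Support counts taken from the worked Apriori example (database D).
support_count = {
    frozenset({2}): 3, frozenset({3}): 3, frozenset({5}): 3,
    frozenset({2, 3}): 2, frozenset({2, 5}): 3, frozenset({3, 5}): 2,
    frozenset({2, 3, 5}): 2,
}
rules = gen_rules({2, 3, 5}, support_count, min_conf=0.75)
for lhs, rhs, conf in rules:
    print(lhs, "=>", rhs, round(conf, 2))
# Only {2, 3} => {5} and {3, 5} => {2} reach 75% confidence (2/2 each)
```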

Presentation of Association Rules (Table Form)

Visualization of Association Rule Using Plane Graph

Visualization of Association Rule Using Rule Graph

A decision tree is a classifier in the form of a tree structure where each node is either:
• a leaf node, indicating a class of instances, or
• a decision node, which specifies some test to be carried out on a single attribute value, with one branch and sub-tree for each possible outcome of the test.

A decision tree can be used to classify an instance by starting at the root of the tree and moving through it until a leaf node is reached; the leaf provides the classification of the instance.

Example: Decision making in the London stock market

Suppose that the major factors affecting the London stock market are:
• what it did yesterday;
• what the New York market is doing today;
• bank interest rate;
• unemployment rate;
• England's prospect at cricket.

The process of predicting an instance by this decision tree can also be expressed by answering the questions in the following order:

Is unemployment high?
  YES: The London market will rise today.
  NO: Is the New York market rising today?
    YES: The London market will rise today.
    NO: The London market will not rise today.
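The two-question tree above can be written directly as nested tests (an illustrative Python sketch; `london_market_rises` and its parameter names are hypothetical, and the slide lists more candidate factors than the final tree actually uses):

```python
def london_market_rises(unemployment_high, new_york_rising):
    """Decision tree from the slide: each `if` is a decision node,
    each `return` a leaf node."""
    if unemployment_high:
        return True       # leaf: London market will rise today
    return new_york_rising  # leaf depends on the New York market

print(london_market_rises(True, False))   # True
print(london_market_rises(False, True))   # True
print(london_market_rises(False, False))  # False
```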

Decision tree induction is a typical inductive approach to learning classification knowledge. The key requirements for mining with decision trees are:
• Attribute-value description: the object or case must be expressible in terms of a fixed collection of properties or attributes.
• Predefined classes: the categories to which cases are to be assigned must have been established beforehand (supervised data).
• Discrete classes: a case does or does not belong to a particular class, and there must be far more cases than classes.
• Sufficient data: usually hundreds or even thousands of training cases.
• "Logical" classification model: a classifier that can be expressed only as decision trees or sets of production rules.

An appeal of market analysis comes from the clarity and utility of its results, which are in the form of association rules. There is an intuitive appeal to a market analysis because it expresses how tangible products and services relate to each other, how they tend to group together. A rule like, “if a customer purchases three way calling, then that customer will also purchase call waiting” is clear. Even better, it suggests a specific course of action, like bundling three-way calling with call waiting into a single service package. While association rules are easy to understand, they are not always useful.

The following three rules are examples of real rules generated from real data:
• On Thursdays, grocery store consumers often purchase diapers and beer together.
• Customers who purchase maintenance agreements are very likely to purchase large appliances.
• When a new hardware store opens, one of the most commonly sold items is toilet rings.
These three examples illustrate the three common types of rules produced by association rule analysis: the useful, the trivial, and the inexplicable.

OLAP (Summarization) Display Using MS/Excel 2000

Market-Basket-Analysis (Association)—Ball graph

Display of Association Rules in Rule Plane Form

Display of Decision Tree (Classification Results)

Display of Clustering (Segmentation) Results

3D Cube Browser