Models of Argumentation
Henry Prakken
Dundee (Scotland)
September 4th, 2014
What is argumentation?
In linguistics:
In Artificial Intelligence:
In Multi-Agent Systems:
Abstract argumentation
Argumentation as inference
Frameworks for structured argumentation
Argument schemes
Argumentation as dialogue
[Figure, built up over several slides: a running example argument graph]

An argument: "Lower taxes increase productivity" and "Increased productivity is good", so lower taxes are good.

Attack on conclusion: "Lower taxes increase inequality" and "Increased inequality is bad", so lower taxes are bad.

Attack on premise: "USA lowered taxes but productivity decreased", so "Lower taxes do not increase productivity". When the attacked premise is itself derived (e.g. from "Prof. P says that lower taxes increase productivity"), this often becomes an attack on an intermediate conclusion.

Attack on inference: "Prof. P has political ambitions" and "People with political ambitions are not objective", so "Prof. P is not objective", attacking the inference from Prof. P's statement.

Indirect defence: "Increased inequality stimulates competition" and "Competition is good", so "Increased inequality is good", attacking "Increased inequality is bad" and thereby indirectly defending the original argument.
Properties

[Figure: an example attack graph over arguments B, C and D, shown with two labellings]
Difference between grounded and preferred labellings
Justification status of arguments

A is justified if A is In in all labellings
A is overruled if A is Out in all labellings
A is defensible otherwise
Argument status in grounded and preferred semantics

Preferred semantics: A and B defensible, C overruled, D justified
Grounded semantics: all arguments defensible
Labellings and extensions

Grounded extension: iterate the characteristic function F of the AF, starting from the empty set:

F0 = ∅
F1 = F(F0) = {A}
F2 = F(F1) = {A,D}
F3 = F(F2) = F2

The fixpoint {A,D} is the grounded extension.
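A minimal computational sketch of this iteration, assuming an AF given as a set of arguments and a set of attack pairs (the graph below is illustrative, not the slide's exact example):

```python
def grounded_extension(arguments, attacks):
    """Iterate the characteristic function F from the empty set:
    F(S) = {a | every attacker of a is attacked by some member of S}."""
    def f(s):
        return {a for a in arguments
                if all(any((d, b) in attacks for d in s)
                       for (b, c) in attacks if c == a)}
    s = set()
    while f(s) != s:     # repeat until the least fixpoint is reached
        s = f(s)
    return s

# Example: A is unattacked, A attacks B, B attacks D, C attacks itself.
# F1 = {A}, F2 = {A, D}, F3 = F2, so the grounded extension is {A, D}.
print(grounded_extension({'A', 'B', 'C', 'D'},
                         {('A', 'B'), ('B', 'D'), ('C', 'C')}))
```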
Stable extensions

Dung (1995): S is a stable extension iff S is conflict-free and S attacks every argument outside S.
Recall: S is conflict-free iff no argument in S attacks an argument in S.

Preferred extensions

Dung (1995): S is a preferred extension iff S is a maximal (w.r.t. set inclusion) admissible set.
Recall: S is admissible iff S is conflict-free and defends all its members.
Admissible?

Preferred?

S is preferred if it is maximally admissible
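For small frameworks these notions can simply be checked by brute force; a sketch (function and variable names are ours):

```python
from itertools import chain, combinations

def extensions(arguments, attacks):
    """Brute-force enumeration of admissible, preferred and stable sets."""
    def conflict_free(s):
        return not any((a, b) in attacks for a in s for b in s)
    def defends(s, a):
        # every attacker of a is attacked by some member of s
        return all(any((d, b) in attacks for d in s)
                   for (b, c) in attacks if c == a)
    subsets = chain.from_iterable(
        combinations(sorted(arguments), r) for r in range(len(arguments) + 1))
    adm = [set(s) for s in subsets
           if conflict_free(s) and all(defends(set(s), a) for a in s)]
    preferred = [s for s in adm if not any(s < t for t in adm)]
    stable = [s for s in adm
              if all(any((a, b) in attacks for a in s) for b in arguments - s)]
    return adm, preferred, stable

# Mutual attack: two preferred (and stable) extensions, {'A'} and {'B'}.
adm, pref, stab = extensions({'A', 'B'}, {('A', 'B'), ('B', 'A')})
print(pref)   # [{'A'}, {'B'}]
```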
Strategies

[Figure: a dialogue tree with proponent moves P: A, P: D, P: E, P: H and opponent moves O: B, O: C, O: F, O: G]
An attack graph

[Figure: an attack graph over arguments A, B, C, D]
A game tree

[Figure, built up move by move over several slides: P: A is attacked by O: B and O: F; P: E replies to O: F; P: C and P: E reply to O: B; O: D replies to P: C]
Proponent's winning strategy

[Figure: the subtree in which P answers both O: F and O: B with P: E, leaving O without further moves]
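The winning-strategy idea can be sketched as a recursive two-player search. This follows the usual grounded-game rules (O must attack P's last move, P must attack O's last move and may not repeat his own arguments), but it is only a naive sketch, not an optimised proof procedure:

```python
def proponent_wins(move, attacks, used):
    """P has just moved `move`; P wins iff every opponent attack on it
    can be answered by a fresh proponent move that itself wins."""
    for o_move in {a for (a, b) in attacks if b == move}:
        replies = {a for (a, b) in attacks if b == o_move} - used
        if not any(proponent_wins(p, attacks, used | {p}) for p in replies):
            return False
    return True

# B attacks A, C attacks B: the dispute P: A, O: B, P: C leaves O stuck.
print(proponent_wins('A', {('B', 'A'), ('C', 'B')}, used={'A'}))  # True
```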
Exercise

[Figure: a game tree to complete, with proponent moves P: D, P: A, P: E and opponent replies O: B, O: C, O: F]
Research on abstract argumentation

New semantics
Algorithms
Complexity
Dynamics (adding or deleting arguments or attacks)
Addition of new elements to AFs:

Reasons to be sceptical:
Domain-specific
Defeasible and conflicting
Will it rain in Calcutta? (Modgil 2009)

[Figure, built up over four slides: the arguments "BBC says rain" and "CNN says sun" attack each other; argument S, "Stats say CNN better than BBC", attacks the attack (T) of the BBC argument on the CNN argument, i.e. an attack on an attack]
[Figure: the running tax example once more, now including argument E: "Increased inequality stimulates competition" and "Competition is good", so "Increased inequality is good"]

Arguments:
Conclusions:
Justification is nonmonotonic!
Two accounts of the fallibility of arguments
Nonmonotonic v. defeasible

Nonmonotonicity is a property of consequence notions
Defeasibility is a property of inference rules
Rationality postulates for structured argumentation

Extensions should be closed under subarguments
Their conclusion sets should be:
Consistent
Closed under deductive inference
Classical argumentation (Besnard & Hunter)
Classical argumentation formalised

An argument is a pair (S, p) with S ⊆ L and p ∈ L such that:
S ⊢ p
S is consistent
no S′ ⊊ S is such that S′ ⊢ p
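A brute-force sketch of this definition over a tiny tuple-encoded propositional language (the encoding and helper names are ours; entailment is decided by truth tables, so this only scales to toy examples):

```python
from itertools import product, combinations

def atoms(f):
    """Formulas are nested tuples: ('atom', x), ('not', f), ('imp', f, g)."""
    return {f[1]} if f[0] == 'atom' else set().union(*map(atoms, f[1:]))

def holds(f, v):
    if f[0] == 'atom':
        return v[f[1]]
    if f[0] == 'not':
        return not holds(f[1], v)
    return not holds(f[1], v) or holds(f[2], v)   # ('imp', f, g)

def valuations(formulas):
    names = sorted(set().union(*(atoms(f) for f in formulas)))
    for bits in product((True, False), repeat=len(names)):
        yield dict(zip(names, bits))

def entails(s, p):
    return all(holds(p, v) for v in valuations(s | {p})
               if all(holds(f, v) for f in s))

def consistent(s):
    return any(all(holds(f, v) for f in s) for v in valuations(s))

def is_argument(s, p):
    """(S, p) is an argument iff S |- p, S is consistent,
    and no proper subset of S entails p."""
    return (consistent(s) and entails(s, p) and
            not any(entails(set(t), p)
                    for r in range(len(s)) for t in combinations(s, r)))
```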
Modelling default reasoning in classical argumentation
A modelling in classical logic

Quaker → Pacifist
Republican → ¬Pacifist
Facts: Quaker, Republican

[Figure: the argument {Quaker, Quaker → Pacifist} for Pacifist and the argument {Republican, Republican → ¬Pacifist} for ¬Pacifist]
A modelling in classical logic (continued)

[Figure: the same two arguments; since the premises are jointly inconsistent, each argument can also attack a premise of the other, e.g. with the conclusion ¬(Quaker → Pacifist)]
A modelling in classical logic (with abnormality predicates)

Quaker ∧ ¬Ab1 → Pacifist
Republican ∧ ¬Ab2 → ¬Pacifist

[Figure: arguments for Pacifist and ¬Pacifist that additionally assume ¬Ab1, respectively ¬Ab2, so the conflict now targets the abnormality assumptions Ab1 and Ab2]
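Using the `is_argument` sketch from above, the two conflicting arguments of this example can be checked (encoding ours):

```python
Q, R, P = ('atom', 'Quaker'), ('atom', 'Republican'), ('atom', 'Pacifist')
A = {Q, ('imp', Q, P)}              # argument for Pacifist
B = {R, ('imp', R, ('not', P))}     # argument for not-Pacifist
print(is_argument(A, P))            # True
print(is_argument(B, ('not', P)))   # True
print(consistent(A | B))            # False: the two supports clash
```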
Extensions v. maximal consistent subsets
Cayrol (1995)
Amgoud & Besnard (2013)
The ASPIC+ framework

Arguments: trees where the links are applications of strict rules or defeasible rules
Attack:
on ordinary premises
on defeasible inferences (undercutting)
on conclusions of defeasible inferences (rebutting)

Example:
Rs: r,s → t
Rd: p,q ⇒ r
Kn = {p,q}
Kp = {s}

[Figure: the argument tree with premises p, q (necessary) and s (ordinary), the defeasible step p,q ⇒ r and the strict step r,s → t]
Rs: r,s → t
Rd: p,q ⇒ r
Kn = {p,q}
Kp = {s}

Attack:
Undermining: on ordinary premises
Rebutting: on conclusions of defeasible inferences
Undercutting: on defeasible inferences, expressed as n(φ1, ..., φn ⇒ φ) ∈ L

Attack + preferences = defeat
Consistency in ASPIC+ (with symmetric negation)

For any S ⊆ L
Rationality postulates for ASPIC+
Argument schemes
Domain-specific vs. general inference rules
Example:
Rd = { … }
Rs = all valid inference rules of propositional logic
K: Bird, Penguin, Penguin → Bird, Penguin → ¬Flies

[Figure: a defeasible argument for Flies from Bird, rebutted by a strict argument for ¬Flies via Penguin]
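A sketch of how such arguments can be built by forward chaining, assuming the usual reading Rd = {Bird ⇒ Flies} and, for simplicity, treating the material implications in K as strict rules (all names are illustrative):

```python
def chain_arguments(knowledge, strict, defeasible):
    """Exhaustively apply rules (body_set, head); keeps one argument
    per conclusion, which is enough to see the conflict."""
    args = {c: ('premise', c) for c in knowledge}
    changed = True
    while changed:
        changed = False
        for kind, rules in (('strict', strict), ('defeasible', defeasible)):
            for body, head in rules:
                if head not in args and body <= args.keys():
                    args[head] = (kind, head, tuple(args[b] for b in body))
                    changed = True
    return args

K = {'Bird', 'Penguin'}
Rs = [(frozenset({'Penguin'}), 'Bird'), (frozenset({'Penguin'}), '~Flies')]
Rd = [(frozenset({'Bird'}), 'Flies')]
args = chain_arguments(K, Rs, Rd)
print(args['Flies'][0], args['~Flies'][0])   # defeasible strict
# In ASPIC+ the strict argument for ~Flies rebuts the defeasible one for Flies.
```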
Preferred extensions do not always coincide with mcs

[Figure: the Quaker–Republican example with rules r1 and r2 for Pacifist and ¬Pacifist]
Preferred/stable extensions do not always coincide with mcs

[Figure: arguments A1 (Quaker), A2 (Pacifist), B1 (Republican), B2 (¬Pacifist); A2 and B2 attack each other]

E1 = {A1, A2, B1, …}
E2 = {A1, B1, B2, …}
Preferred extensions do not always coincide with mcs

Rd = { … }
S → p ∈ Rs iff S ⊢ p in propositional logic and S is finite
K: Quaker, Republican, Quaker → Pacifist, Republican → ¬Pacifist

[Figure: the resulting arguments for Pacifist and ¬Pacifist]
To classical argumentation?
To assumption-based
argumentation?
In both cases:
Default contraposition in classical argumentation

John is a rapist
Assume when possible that things are normal
What can we conclude about John's sex?
Default contraposition in classical argumentation

M ∧ ¬Ab → ¬R (men are normally not rapists)
¬Ab
R ∧ ¬Ab → ¬M (the contrapositive: rapists are normally not men)
Default contraposition in classical argumentation

Statisticians call these inferences base rate fallacies
Assumption-based argumentation (Dung, Mancarella & Toni 2007)

L is a logical language
R is a set of rules φ1, ..., φn → φ over L
Assumptions:
L consists of literals
No preferences
No rebuttals of undercutters

p1, …, pn ⇒ q becomes d_i, p1, …, pn, not-q → q
where:
d_i = n(p1, …, pn ⇒ q)
d_i and not-q are assumptions
the contrary of not-q is q; the contrary of d_i is ¬d_i

1-1 correspondence between complete extensions of ASPIC+ and ABA
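A sketch of the translation just described as code; the naming conventions (d_, not_, n_) are ours:

```python
def defeasible_to_aba(rule_id, premises, conclusion):
    """Translate p1, ..., pn => q into d_i, p1, ..., pn, not_q -> q."""
    d, not_q = f"d_{rule_id}", f"not_{conclusion}"
    aba_rule = ([d, *premises, not_q], conclusion)
    assumptions = {d, not_q}
    contraries = {not_q: conclusion,   # an argument for q attacks not_q (rebut)
                  d: f"n_{rule_id}"}   # attacking d undercuts the rule
    return aba_rule, assumptions, contraries

print(defeasible_to_aba('r1', ['Bird'], 'Flies'))
# ((['d_r1', 'Bird', 'not_Flies'], 'Flies'),
#  {'d_r1', 'not_Flies'}, {'not_Flies': 'Flies', 'd_r1': 'n_r1'})
```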
From defeasible to strict rules: example

[Figure: the defeasible rules r1: Quaker ⇒ Pacifist and r2: Republican ⇒ ¬Pacifist]

[Figure: the strict reformulation, using assumptions Appl(s1) and Appl(s2) and literals of the form notPacifist]
[Figure: the running tax example formalised as structured arguments labelled A, B, C and E, built from premises P1–P9]
Preferences in abstract argumentation

≤ is an ordering on args
A defeats B iff A attacks B and not A < B
Apply Dung's theory to (args, defeat)

Implicit assumptions:
All attacks are preference-dependent
All attacks are independent from each other
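As code this is a one-liner; `prefer(x, y)` below stands for the strict ordering x > y (names illustrative):

```python
def defeat_relation(attacks, prefer):
    """A defeats B iff A attacks B and not A < B (i.e. not B > A)."""
    return {(a, b) for (a, b) in attacks if not prefer(b, a)}

# Mutual attack where B > A: only B's attack survives as a defeat.
attacks = {('A', 'B'), ('B', 'A')}
print(defeat_relation(attacks, lambda x, y: (x, y) == ('B', 'A')))
# {('B', 'A')}
```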
[Figure, two slides: "John snores in the library" leads via rule R1 to "John misbehaves in the library", which leads via rule R3 to "John may be removed"]
[Figure: the same example as arguments A1 ("John snores in the library"), A2 (via R1: "John misbehaves in the library") and A3 (via R3: "John may be removed"), with counterarguments B1 and B2]
[Figure: the corresponding abstract attack graph over arguments A1, A2, A3 and B1, B2]
The defeat graph in PAFs

[Figure: the defeat graph over A1, A2, A3, B1, B2]
[Figure: the abstract attack graph again]

PAFs don't recognize that B2's attacks on A2 and A3 are the same
Argument(ation) schemes: general form

Premise 1, …, Premise n
Therefore (presumably), conclusion
Argument schemes in
ASPIC
Perception
P is observed
Therefore (presumably), P
Critical questions:
Inducing generalisations

Almost all observed Ps were Qs
Therefore (presumably), if P then usually Q

Example: in 16 of 17 tests the ballpoint shot with this bow caused this type of eye injury; therefore, a ballpoint shot with this type of bow will usually cause this type of eye injury.

Critical questions:
Expert testimony (Walton 1996)

E is expert on D
E says that P
P is within D
Therefore (presumably), P is the case

Critical questions:
Is E biased?
Is P consistent with what other experts say?
Is P consistent with known evidence?
[Figure: the eye-injury example formalised with the expert testimony scheme, using the premise "E is an expert on this type of injury"]
Action A causes G, G is good (bad)
Therefore (presumably), A should (not) be done

Critical questions:
Action A results in C1
…
Action A results in Cn
C1 is good
…
Cn is good
Therefore, Action A is good

Action A results in C1
…
Action A results in Cm
C1 is bad
…
Cm is bad
Therefore, Action A is bad
H. Prakken, Formalising a legal opinion on a legislative proposal in the ASPIC+ framework.
[Figure, shown under preferred labelling 1, preferred labelling 2 and the grounded labelling: the attack graph of the formalised opinion, with arguments P1, P2, C1, DMP, GC1, GC2, GC3, GC12, GC13, GC23, GC123 and BC12]

1. An argument is In iff all arguments that defeat it are Out.
2. An argument is Out iff some argument that defeats it is In.
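These two conditions can be checked mechanically; a sketch that verifies a candidate labelling (representation ours; 'Undec' stands for neither In nor Out):

```python
def is_complete_labelling(arguments, defeats, lab):
    """lab maps each argument to 'In', 'Out' or 'Undec'.
    1. In  iff all defeaters are Out.  2. Out iff some defeater is In."""
    for a in arguments:
        defeaters = {b for (b, c) in defeats if c == a}
        if (lab[a] == 'In') != all(lab[b] == 'Out' for b in defeaters):
            return False
        if (lab[a] == 'Out') != any(lab[b] == 'In' for b in defeaters):
            return False
    return True

args, defs = {'A', 'B'}, {('A', 'B'), ('B', 'A')}
print(is_complete_labelling(args, defs, {'A': 'In', 'B': 'Out'}))       # True
print(is_complete_labelling(args, defs, {'A': 'Undec', 'B': 'Undec'}))  # True
```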
Summary
Aggregation of arguments
Relation with probability theory
Interaction
Distributed information
Dynamics
Real players!
Richer communication languages
Example

P: Tell me all you know about recent trading in explosive materials (request)
O: No I won't (reject)
P: Why don't you want to tell me?
O: Since I am not allowed to tell you
P: Why aren't you allowed to tell me?
O: Since sharing such information could endanger an investigation
P: You may be right in general (concede) but in this case there is an exception since this is a matter of national importance
P: Since we have heard about a possible terrorist attack
Types of dialogues (Walton & Krabbe)

Dialogue type        Dialogue goal             Initial situation
Persuasion           resolution of conflict    conflict of opinion
Negotiation          making a deal             conflict of interest
Deliberation         reaching a decision       need for action
Information seeking  exchange of information   personal ignorance
Inquiry              growth of knowledge       general ignorance
Dialogue systems (according to Carlson 1983)

A dialogue system specifies the conditions under which utterances are appropriate. An utterance is appropriate if it promotes the goal of the dialogue in which it is made.
Dialogue game systems

A communication language:
Well-formed utterances
A protocol:
Effect rules
Turntaking rules
Termination + outcome rules
Agent design:
strategies for selecting from the allowed utterances
Effect rules

Specify commitments, used e.g. for:
Determining outcome
Enforcing dialogical consistency
... (a minimal sketch follows below)
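A minimal sketch of effect rules as commitment-store updates; the speech-act names (claim, concede, offer, retract) anticipate the examples below, and the specific rules chosen here are illustrative:

```python
def apply_effect(stores, speaker, act, content):
    """Update the speaker's commitment store after an utterance."""
    cs = stores.setdefault(speaker, set())
    if act in ('claim', 'concede', 'offer'):
        cs.add(content)       # asserting or granting commits the speaker
    elif act == 'retract':
        cs.discard(content)   # retracting removes the commitment
    return stores

stores = {}
apply_effect(stores, 'O', 'claim', 'Not allowed to tell')
apply_effect(stores, 'O', 'retract', 'Not allowed to tell')
print(stores)   # {'O': set()}
```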
Example 1

Knowledge bases:
Paul: r
Olga: s

Inference rules:
p → q
r → p
s → ¬r

P1: q since p
O1: why p?
P2: p since r
O2: ¬r since s
Example 2

Knowledge bases:
Paul: p, p → q
Olga: ¬q, ¬q → ¬p

Inference rules: modus ponens

P1: claim p
O1: concede p
P2: claim q
O2: ¬p since ¬q, ¬q → ¬p

Problem: how to ensure relevance?
Automated Support of Regulated Data Exchange: A Multi-Agent Systems Approach
PhD Thesis, Pieter Dijkstra (2012)
Faculty of Law, University of Groningen
The communication language

Speech act     Attack                                     Surrender
request(φ)     offer(φ′) (φ′ ≠ φ), reject(φ)              offer(φ)
offer(φ)       offer(φ′) (φ′ ≠ φ), reject(φ)              accept(φ)
reject(φ)      offer(φ′) (φ′ ≠ φ), why-reject(φ)          accept(φ)
why-reject(φ)  claim(φ′)                                  –
claim(φ)       why(φ)                                     concede(φ)
why(φ)         φ since S                                  retract(φ)
φ since S      why(ψ) (ψ ∈ S), φ′ since S′ (a defeater)   concede(ψ) (ψ ∈ S), concede(φ)
concede(φ)     –                                          –
retract(φ)     –                                          –
deny(φ)        –                                          –
The protocol
Termination:
Example dialogue
formalised
P: Request to tell
O: Reject to tell
Embedded
persuasion
...
Persuasion part formalised (built up over several slides)

O: Claim Not allowed to tell
O: Not allowed to tell since telling endangers investigation, and what endangers an investigation is not allowed
O: Why National importance?
O: Concede Exception to R1
O: Retract Not allowed to tell
Conclusion

Inference: semantics, strict vs. defeasible inferences, preferences
Dialogue: language + protocol, agent design
Reading (1)
Collections
Abstract argumentation
Reading (2)
ASPIC+
Reading (3)
Assumption-based argumentation
Dialogue