
Machine Learning

Chapter 11

Machine Learning
• What is learning?


Machine Learning
• What is learning?
• “That is what learning is. You suddenly understand something you've understood all your life, but in a new way.”
(Doris Lessing – 2007 Nobel Prize in Literature)


Machine Learning
• How to construct programs that automatically improve with experience.


Machine Learning
• How to construct programs that automatically improve with experience.

• Learning problem:

– Task T
– Performance measure P
– Training experience E


Machine Learning
• Chess game:
– Task T: playing chess games
– Performance measure P: percent of games won against opponents
– Training experience E: playing practice games against itself

Machine Learning
• Handwriting recognition:
– Task T: recognizing and classifying handwritten words
– Performance measure P: percent of words correctly classified
– Training experience E: handwritten words with given classifications

Designing a Learning System
• Choosing the training experience:
– Direct or indirect feedback
– Degree of learner's control
– Representative distribution of examples

Designing a Learning System
• Choosing the target function:
– Type of knowledge to be learned
– Function approximation

Designing a Learning System
• Choosing a representation for the target function:
– Expressive representation for a close function approximation
– Simple representation for simple training data and learning algorithms

Designing a Learning System
• Choosing a function approximation algorithm (learning algorithm)

Designing a Learning System
• Chess game:
– Task T: playing chess games
– Performance measure P: percent of games won against opponents
– Training experience E: playing practice games against itself
– Target function: V: Board → R

Designing a Learning System
• Chess game:
– Target function representation:
V^(b) = w0 + w1x1 + w2x2 + w3x3 + w4x4 + w5x5 + w6x6
x1: the number of black pieces on the board
x2: the number of red pieces on the board
x3: the number of black kings on the board
x4: the number of red kings on the board
x5: the number of black pieces threatened by red
x6: the number of red pieces threatened by black
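To make the representation concrete, here is a minimal Python sketch of the linear evaluation function V^(b); the weight values below are made-up placeholders for illustration, not values from the slides.

```python
# Linear board evaluation V^(b) = w0 + w1*x1 + ... + w6*x6.
# The features x1..x6 follow the slide; the weights are arbitrary placeholders.

def v_hat(features, weights):
    """Evaluate a board described by the features (x1, ..., x6)."""
    w0, *ws = weights
    return w0 + sum(w * x for w, x in zip(ws, features))

board = (3, 0, 1, 0, 0, 0)                        # (x1, ..., x6) for one board
weights = (0.0, 1.0, -1.0, 2.0, -2.0, -0.5, 0.5)  # (w0, w1, ..., w6), placeholders
print(v_hat(board, weights))                      # -> 5.0
```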

Designing a Learning System
• Chess game:
– Function approximation algorithm; a training example:
(<x1 = 3, x2 = 0, x3 = 1, x4 = 0, x5 = 0, x6 = 0>, 100)
x1: the number of black pieces on the board
x2: the number of red pieces on the board
x3: the number of black kings on the board
x4: the number of red kings on the board
x5: the number of black pieces threatened by red
x6: the number of red pieces threatened by black
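The slide gives one training example but does not spell the learning algorithm out. One common choice for this linear representation (e.g., the LMS weight-update rule used for this task in Mitchell's textbook) nudges each weight in proportion to the prediction error; the sketch below assumes that choice, with an illustrative learning rate and all-zero initial weights.

```python
# LMS-style update: wi <- wi + eta * (V_train(b) - V^(b)) * xi   (x0 = 1 for w0),
# applied to the single training example from the slide:
#   (<x1=3, x2=0, x3=1, x4=0, x5=0, x6=0>, V_train(b) = 100).
# eta and the initial weights are illustrative assumptions.

def lms_update(weights, features, target, eta=0.01):
    w0, *ws = weights
    prediction = w0 + sum(w * x for w, x in zip(ws, features))
    error = target - prediction
    new_w0 = w0 + eta * error
    new_ws = [w + eta * error * x for w, x in zip(ws, features)]
    return [new_w0] + new_ws

weights = [0.0] * 7                        # w0 .. w6
features, target = (3, 0, 1, 0, 0, 0), 100
weights = lms_update(weights, features, target)
print(weights)                             # [1.0, 3.0, 0.0, 1.0, 0.0, 0.0, 0.0]
```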

Designing a Learning System
• What is learning?

Designing a Learning System
• Learning is an (endless) generalization or induction process.

Designing a Learning System
• The learning system as four interacting modules (diagram):
– Experiment Generator: proposes a new problem (initial board)
– Performance System: solves it using the current hypothesis (V^), producing a solution trace (game history)
– Critic: turns the trace into training examples {(b1, V1), (b2, V2), ...}
– Generalizer: produces a new hypothesis (V^) from the training examples

Issues in Machine Learning
• What learning algorithms should be used?
• How much training data is sufficient?
• When and how can prior knowledge guide the learning process?
• What is the best strategy for choosing the next training experience?
• What is the best way to reduce the learning task to one or more function approximation problems?
• How can the learner automatically alter its representation to improve its learning ability?

Example
Experience:
Example  Sky    AirTemp  Humidity  Wind    Water  Forecast  EnjoySport
1        Sunny  Warm     Normal    Strong  Warm   Same      Yes
2        Sunny  Warm     High      Strong  Warm   Same      Yes
3        Rainy  Cold     High      Strong  Warm   Change    No
4        Sunny  Warm     High      Strong  Cool   Change    Yes
Prediction:
5        Rainy  Cold     High      Weak    Warm   Change    ?
6        Sunny  Warm     Low       Strong  Warm   Same      ?
7        Sunny  Warm     Normal    Strong  Cool   Same      ?

Example
• Learning problem:
– Task T: classifying days on which my friend enjoys water sport
– Performance measure P: percent of days correctly classified
– Training experience E: days with given attributes and classifications

Concept Learning
• Inferring a boolean-valued function from training examples of its input (instances) and output (classifications).

Concept Learning
• Learning problem:
– Target function: c: X → {0, 1}; here Sky × AirTemp × Humidity × Wind × Water × Forecast → {Yes, No}
– Target concept: a subset of the set of instances X (the characteristics of all instances of the concept to be learned)
– Hypothesis: h: X → {0, 1} (constraints on instance attributes)

Concept Learning
• Satisfaction: h(x) = 1 iff x satisfies all the constraints of h; h(x) = 0 otherwise
• Consistency: h(x) = c(x) for every instance x of the training examples
• Correctness: h(x) = c(x) for every instance x of X

Concept Learning
• How to represent a hypothesis function?

Concept Learning
• Hypothesis representation (constraints on instance attributes):
– ?: any value is acceptable
– single required value
– ∅: no value is acceptable
<Sky, AirTemp, Humidity, Wind, Water, Forecast>
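A minimal sketch of this representation and of the satisfaction test from the earlier slide, assuming each hypothesis is encoded as a 6-tuple with "?" for "any value" and Python's None standing in for ∅:

```python
# Hypothesis as a 6-tuple of constraints over
# (Sky, AirTemp, Humidity, Wind, Water, Forecast):
#   "?"      -> any value is acceptable
#   a value  -> that single value is required
#   None     -> no value is acceptable (standing in for the slide's ∅)

def satisfies(x, h):
    """Satisfaction test: h(x) = 1 iff x meets every constraint of h."""
    # A None constraint equals no attribute value, so it rejects every instance.
    return all(c == "?" or c == a for c, a in zip(h, x))

h = ("Sunny", "Warm", "?", "Strong", "?", "?")
x = ("Sunny", "Warm", "High", "Strong", "Warm", "Same")
print(satisfies(x, h))             # True
print(satisfies(x, (None,) * 6))   # False: the all-∅ hypothesis accepts nothing
```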

Concept Learning
• General-to-specific ordering of hypotheses:
hj ≥g hk iff ∀x∈X: hk(x) = 1 ⇒ hj(x) = 1
• This ordering is a lattice (partial order) on H, from specific to general, e.g.:
h1 = <Sunny, ?, ?, Strong, ?, ?>
h2 = <Sunny, ?, ?, ?, ?, ?>
h3 = <Sunny, ?, ?, ?, Cool, ?>
with h2 ≥g h1 and h2 ≥g h3
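A sketch of the ≥g test for this conjunctive representation, assuming the same tuple encoding; h1–h3 are the hypotheses from this slide.

```python
# Syntactic >=g test: hj >=g hk iff, attribute by attribute, hj's constraint
# is "?" or identical to hk's constraint.
# (A hypothesis containing ∅ accepts nothing, so everything is >=g it.)

def more_general_or_equal(hj, hk):
    if None in hk:
        return True
    return all(cj == "?" or cj == ck for cj, ck in zip(hj, hk))

h1 = ("Sunny", "?", "?", "Strong", "?", "?")
h2 = ("Sunny", "?", "?", "?", "?", "?")
h3 = ("Sunny", "?", "?", "?", "Cool", "?")
print(more_general_or_equal(h2, h1))   # True: h2 >=g h1
print(more_general_or_equal(h2, h3))   # True: h2 >=g h3
print(more_general_or_equal(h1, h3))   # False: h1 and h3 are not comparable
```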

FIND-S
Example  Sky    AirTemp  Humidity  Wind    Water  Forecast  EnjoySport
1        Sunny  Warm     Normal    Strong  Warm   Same      Yes
2        Sunny  Warm     High      Strong  Warm   Same      Yes
3        Rainy  Cold     High      Strong  Warm   Change    No
4        Sunny  Warm     High      Strong  Cool   Change    Yes
• FIND-S trace (h after each positive example):
h = <∅, ∅, ∅, ∅, ∅, ∅>
h = <Sunny, Warm, Normal, Strong, Warm, Same>
h = <Sunny, Warm, ?, Strong, Warm, Same>
h = <Sunny, Warm, ?, Strong, ?, ?>

FIND-S
• Initialize h to the most specific hypothesis in H
• For each positive training instance x:
– For each attribute constraint ai in h:
If the constraint is not satisfied by x
Then replace ai by the next more general constraint satisfied by x
• Output hypothesis h
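A sketch of FIND-S under the same tuple encoding (None standing in for ∅), run on the four EnjoySport training examples from the earlier table:

```python
# FIND-S: start from the most specific hypothesis (all ∅, encoded as None)
# and minimally generalize it on each positive example; negatives are ignored.

TRAINING_DATA = [   # the four EnjoySport examples from the earlier table
    (("Sunny", "Warm", "Normal", "Strong", "Warm", "Same"),   "Yes"),
    (("Sunny", "Warm", "High",   "Strong", "Warm", "Same"),   "Yes"),
    (("Rainy", "Cold", "High",   "Strong", "Warm", "Change"), "No"),
    (("Sunny", "Warm", "High",   "Strong", "Cool", "Change"), "Yes"),
]

def find_s(examples):
    h = [None] * len(examples[0][0])   # most specific hypothesis in H
    for attributes, label in examples:
        if label != "Yes":             # FIND-S ignores negative examples
            continue
        for i, value in enumerate(attributes):
            if h[i] is None:           # first positive example: adopt its value
                h[i] = value
            elif h[i] != value:        # conflicting values: generalize to "?"
                h[i] = "?"
    return tuple(h)

print(find_s(TRAINING_DATA))   # ('Sunny', 'Warm', '?', 'Strong', '?', '?')
```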

FIND-S
Example  Sky    AirTemp  Humidity  Wind    Water  Forecast  EnjoySport
1        Sunny  Warm     Normal    Strong  Warm   Same      Yes
2        Sunny  Warm     High      Strong  Warm   Same      Yes
3        Rainy  Cold     High      Strong  Warm   Change    No
4        Sunny  Warm     High      Strong  Cool   Change    Yes
h = <Sunny, Warm, ?, Strong, ?, ?>
Prediction:
5        Rainy  Cold     High      Weak    Warm   Change    No
6        Sunny  Warm     Low       Strong  Warm   Same      Yes
7        Sunny  Warm     Normal    Strong  Cool   Same      Yes

FIND-S
• The output hypothesis is the most specific one that satisfies all positive training examples.

FIND-S
• The result is consistent with the positive training examples.

FIND-S
• Is the result consistent with the negative training examples?

FIND-S
Example  Sky    AirTemp  Humidity  Wind    Water  Forecast  EnjoySport
1        Sunny  Warm     Normal    Strong  Warm   Same      Yes
2        Sunny  Warm     High      Strong  Warm   Same      Yes
3        Rainy  Cold     High      Strong  Warm   Change    No
4        Sunny  Warm     High      Strong  Cool   Change    Yes
5        Sunny  Warm     Normal    Strong  Cool   Change    No
h = <Sunny, Warm, ?, Strong, ?, ?>

FIND-S
• The result is consistent with the negative training examples if the target concept is contained in H (and the training examples are correct).

FIND-S
• The result is consistent with the negative training examples if the target concept is contained in H (and the training examples are correct).
• Sizes of the spaces:
– Size of the instance space: |X| = 3·2·2·2·2·2 = 96
– Size of the concept space: |C| = 2^|X| = 2^96
– Size of the hypothesis space: |H| = (4·3·3·3·3·3) + 1 = 973 << 2^96
⇒ The target concept (in C) may not be contained in H.

FIND-S
• Questions:
– Has the learner converged to the target concept, as there can be several consistent hypotheses (with both positive and negative training examples)?
– Why is the most specific hypothesis preferred?
– What if there are several maximally specific consistent hypotheses?
– What if the training examples are not correct?

List-then-Eliminate Algorithm
• Version space: the set of all hypotheses that are consistent with the training examples.
• Algorithm:
– Initial version space = set containing every hypothesis in H
– For each training example <x, c(x)>, remove from the version space any hypothesis h for which h(x) ≠ c(x)
– Output the hypotheses in the version space
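A sketch of List-then-Eliminate with the tuple encoding used above; enumerating H is feasible only because the conjunctive hypothesis space is tiny (973 semantically distinct hypotheses). The third Sky value, Cloudy, does not occur in the data and is an assumption borrowed from Mitchell's version of the task.

```python
from itertools import product

# List-then-Eliminate: enumerate every hypothesis, then drop any hypothesis
# that disagrees with a training example. The all-∅ hypothesis is omitted:
# it is inconsistent with any positive example anyway.

DOMAINS = [("Sunny", "Cloudy", "Rainy"), ("Warm", "Cold"), ("Normal", "High"),
           ("Strong", "Weak"), ("Warm", "Cool"), ("Same", "Change")]

def satisfies(x, h):
    return all(c == "?" or c == a for c, a in zip(h, x))

def list_then_eliminate(examples):
    version_space = list(product(*[values + ("?",) for values in DOMAINS]))
    for x, label in examples:
        wanted = (label == "Yes")
        version_space = [h for h in version_space if satisfies(x, h) == wanted]
    return version_space

data = [
    (("Sunny", "Warm", "Normal", "Strong", "Warm", "Same"),   "Yes"),
    (("Sunny", "Warm", "High",   "Strong", "Warm", "Same"),   "Yes"),
    (("Rainy", "Cold", "High",   "Strong", "Warm", "Change"), "No"),
    (("Sunny", "Warm", "High",   "Strong", "Cool", "Change"), "Yes"),
]
for h in list_then_eliminate(data):    # 6 consistent hypotheses remain
    print(h)
```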

List-then-Eliminate Algorithm
• Requires an exhaustive enumeration of all hypotheses in H

Compact Representation of Version Space
• G (the generic boundary): the set of the most generic hypotheses of H consistent with the training data D:
G = {g∈H | consistent(g, D) ∧ ¬∃g'∈H: g' >g g ∧ consistent(g', D)}
• S (the specific boundary): the set of the most specific hypotheses of H consistent with the training data D:
S = {s∈H | consistent(s, D) ∧ ¬∃s'∈H: s >g s' ∧ consistent(s', D)}

Compact Representation of Version Space
• Version space = <G, S> = {h∈H | ∃g∈G ∃s∈S: g ≥g h ≥g s}
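A sketch of the membership test this formula describes, using the S4 and G4 boundary sets from the example on the following slides:

```python
# Membership test for the compact representation <G, S>:
# h is in the version space iff some g in G is >=g h and h is >=g some s in S.

def more_general_or_equal(hj, hk):
    return all(cj == "?" or cj == ck for cj, ck in zip(hj, hk))

def in_version_space(h, G, S):
    return (any(more_general_or_equal(g, h) for g in G) and
            any(more_general_or_equal(h, s) for s in S))

G4 = [("Sunny", "?", "?", "?", "?", "?"), ("?", "Warm", "?", "?", "?", "?")]
S4 = [("Sunny", "Warm", "?", "Strong", "?", "?")]
print(in_version_space(("Sunny", "Warm", "?", "?", "?", "?"), G4, S4))  # True
print(in_version_space(("Sunny", "?", "?", "?", "Cool", "?"), G4, S4))  # False
```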

Candidate-Elimination Algorithm
Example  Sky    AirTemp  Humidity  Wind    Water  Forecast  EnjoySport
1        Sunny  Warm     Normal    Strong  Warm   Same      Yes
2        Sunny  Warm     High      Strong  Warm   Same      Yes
3        Rainy  Cold     High      Strong  Warm   Change    No
4        Sunny  Warm     High      Strong  Cool   Change    Yes

S0 = {<∅, ∅, ∅, ∅, ∅, ∅>}
S1 = {<Sunny, Warm, Normal, Strong, Warm, Same>}
S2 = {<Sunny, Warm, ?, Strong, Warm, Same>}
S3 = {<Sunny, Warm, ?, Strong, Warm, Same>}
S4 = {<Sunny, Warm, ?, Strong, ?, ?>}

G0 = {<?, ?, ?, ?, ?, ?>}
G1 = {<?, ?, ?, ?, ?, ?>}
G2 = {<?, ?, ?, ?, ?, ?>}
G3 = {<Sunny, ?, ?, ?, ?, ?>, <?, Warm, ?, ?, ?, ?>, <?, ?, ?, ?, ?, Same>}
G4 = {<Sunny, ?, ?, ?, ?, ?>, <?, Warm, ?, ?, ?, ?>}

Candidate-Elimination Algorithm
• The resulting version space, from the specific boundary S4 to the general boundary G4:
S4 = {<Sunny, Warm, ?, Strong, ?, ?>}
<Sunny, Warm, ?, ?, ?, ?>   <Sunny, ?, ?, Strong, ?, ?>   <?, Warm, ?, Strong, ?, ?>
G4 = {<Sunny, ?, ?, ?, ?, ?>, <?, Warm, ?, ?, ?, ?>}

Candidate-Elimination Algorithm
• Initialize G to the set of maximally general hypotheses in H
• Initialize S to the set of maximally specific hypotheses in H

Candidate-Elimination Algorithm
• For each positive example d:
– Remove from G any hypothesis inconsistent with d
– For each s in S that is inconsistent with d:
Remove s from S
Add to S all least generalizations h of s, such that h is consistent with d and some hypothesis in G is more general than h
– Remove from S any hypothesis that is more general than another hypothesis in S

Candidate-Elimination Algorithm
• For each negative example d:
– Remove from S any hypothesis inconsistent with d
– For each g in G that is inconsistent with d:
Remove g from G
Add to G all least specializations h of g, such that h is consistent with d and some hypothesis in S is more specific than h
– Remove from G any hypothesis that is more specific than another hypothesis in G
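Putting the last three slides together, here is a sketch of Candidate-Elimination for the conjunctive representation. The attribute domains are assumptions read off the examples (with Cloudy as Mitchell's third Sky value), and the ≥g check is the syntactic one used earlier.

```python
# Candidate-Elimination for the conjunctive representation
# ("?" = any value, a literal = required value, None = ∅).

DOMAINS = [("Sunny", "Cloudy", "Rainy"), ("Warm", "Cold"), ("Normal", "High"),
           ("Strong", "Weak"), ("Warm", "Cool"), ("Same", "Change")]

def satisfies(x, h):
    return all(c == "?" or c == a for c, a in zip(h, x))

def more_general(hj, hk):
    """Syntactic hj >=g hk test; a hypothesis containing ∅ accepts nothing."""
    if None in hk:
        return True
    return all(cj == "?" or cj == ck for cj, ck in zip(hj, hk))

def min_generalizations(s, x):
    """The least generalization of s that covers the positive example x."""
    return [tuple(xi if ci is None else (ci if ci == xi else "?")
                  for ci, xi in zip(s, x))]

def min_specializations(g, x):
    """All least specializations of g that exclude the negative example x."""
    return [g[:i] + (value,) + g[i + 1:]
            for i, ci in enumerate(g) if ci == "?"
            for value in DOMAINS[i] if value != x[i]]

def candidate_elimination(examples):
    S = [tuple([None] * len(DOMAINS))]   # maximally specific boundary
    G = [tuple(["?"] * len(DOMAINS))]    # maximally general boundary
    for x, label in examples:
        if label == "Yes":                                   # positive example
            G = [g for g in G if satisfies(x, g)]
            for s in [s for s in S if not satisfies(x, s)]:
                S.remove(s)
                S += [h for h in min_generalizations(s, x)
                      if any(more_general(g, h) for g in G)]
            S = [s for s in S
                 if not any(more_general(s, s2) and s != s2 for s2 in S)]
        else:                                                # negative example
            S = [s for s in S if not satisfies(x, s)]
            for g in [g for g in G if satisfies(x, g)]:
                G.remove(g)
                G += [h for h in min_specializations(g, x)
                      if any(more_general(h, s) for s in S)]
            G = [g for g in G
                 if not any(more_general(g2, g) and g != g2 for g2 in G)]
    return S, G

data = [
    (("Sunny", "Warm", "Normal", "Strong", "Warm", "Same"),   "Yes"),
    (("Sunny", "Warm", "High",   "Strong", "Warm", "Same"),   "Yes"),
    (("Rainy", "Cold", "High",   "Strong", "Warm", "Change"), "No"),
    (("Sunny", "Warm", "High",   "Strong", "Cool", "Change"), "Yes"),
]
S, G = candidate_elimination(data)
print(S)  # [('Sunny', 'Warm', '?', 'Strong', '?', '?')]
print(G)  # [('Sunny', '?', '?', '?', '?', '?'), ('?', 'Warm', '?', '?', '?', '?')]
```

Run on the four EnjoySport examples, this reproduces the S4 and G4 boundary sets shown on the example slides.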

Candidate-Elimination Algorithm
• The version space will converge toward the correct target concept if:
– H contains the correct target concept
– There are no errors in the training examples
• A training instance to be requested next should discriminate among the alternative hypotheses in the current version space.

Candidate-Elimination Algorithm
• A partially learned concept can be used to classify new instances using the majority rule.
S4 = {<Sunny, Warm, ?, Strong, ?, ?>}
<Sunny, Warm, ?, ?, ?, ?>   <Sunny, ?, ?, Strong, ?, ?>   <?, Warm, ?, Strong, ?, ?>
G4 = {<Sunny, ?, ?, ?, ?, ?>, <?, Warm, ?, ?, ?, ?>}

Example  Sky    AirTemp  Humidity  Wind    Water  Forecast  EnjoySport
5        Rainy  Warm     High      Strong  Cool   Same      ?
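A sketch of classifying a new instance from the boundary sets alone: it is positive throughout the version space iff it satisfies every member of S, negative throughout iff it satisfies no member of G, and otherwise the hypotheses disagree (the slide's majority rule would then count votes over the whole version space). S4, G4 and instance 5 are taken from this slide.

```python
# Classification with a partially learned concept, using only <S, G>.

def satisfies(x, h):
    return all(c == "?" or c == a for c, a in zip(h, x))

def classify(x, S, G):
    if all(satisfies(x, s) for s in S):      # accepted by the whole version space
        return "Yes"
    if not any(satisfies(x, g) for g in G):  # rejected by the whole version space
        return "No"
    return "?"                               # members of the version space disagree

S4 = [("Sunny", "Warm", "?", "Strong", "?", "?")]
G4 = [("Sunny", "?", "?", "?", "?", "?"), ("?", "Warm", "?", "?", "?", "?")]
instance5 = ("Rainy", "Warm", "High", "Strong", "Cool", "Same")
print(classify(instance5, S4, G4))   # '?': the version space is split on it
```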

Inductive Bias
• Size of the instance space: |X| = 3·2·2·2·2·2 = 96
• Number of possible concepts = 2^|X| = 2^96
• Size of H = (4·3·3·3·3·3) + 1 = 973 << 2^96

Inductive Bias
• Size of the instance space: |X| = 3·2·2·2·2·2 = 96
• Number of possible concepts = 2^|X| = 2^96
• Size of H = (4·3·3·3·3·3) + 1 = 973 << 2^96
⇒ a biased hypothesis space

Inductive Bias
• An unbiased hypothesis space H' that can represent every subset of the instance space X: propositional logic sentences
• Positive examples: x1, x2, x3; negative examples: x4, x5
h(x) ≡ (x = x1) ∨ (x = x2) ∨ (x = x3) ≡ x1 ∨ x2 ∨ x3
h'(x) ≡ (x ≠ x4) ∧ (x ≠ x5) ≡ ¬x4 ∧ ¬x5

Inductive Bias
x1 ∨ x2 ∨ x3    x1 ∨ x2 ∨ x3 ∨ x6    ¬x4 ∧ ¬x5
• Any new instance x is classified positive by half of the version space and negative by the other half ⇒ not classifiable

Inductive Bias
Example  Day  Actor     Price      EasyTicket
1        Mon  Famous    Moderate   Yes
2        Sat  Famous    Cheap      No
3        Sun  Infamous  Moderate   No
4        Sun  Infamous  Cheap      No
5        Thu  Famous    Cheap      Yes
6        Tue  Famous    Cheap      Yes
7        Sat  Famous    Expensive  Yes
8        Wed  Famous    Expensive  No
9        Sat  Infamous  Expensive  ?
10       Wed  Infamous  Expensive  ?

Inductive Bias
Example  Quality  Price  Buy
1        Good     Low    Yes
2        Bad      High   No
3        Good     High   ?
4        Bad      Low    ?

Inductive Bias
• A learner that makes no prior assumptions regarding the identity of the target concept cannot classify any unseen instances.

Homework
• Exercises 2-1 → 2-5 (Chapter 2, ML textbook)