
CS8082 – Machine Learning Techniques

Unit 5 – Advanced Learning

Learning Sets of Rules – Sequential Covering Algorithm
Unit Objectives

• Learn new approaches in machine learning

• Design appropriate machine learning algorithms for problem solving

Unit Outcomes

At the end of the course, the student should be able to:


• CO 1: Differentiate between supervised, unsupervised, and semi-supervised machine learning approaches
• CO 2: Apply a specific supervised or unsupervised machine learning algorithm to a particular problem
• CO 3: Analyze and suggest the appropriate machine learning approach for various types of problems
• CO 4: Design and make modifications to existing machine learning algorithms to suit an individual application
• CO 5: Provide useful case studies on advanced machine learning algorithms

Prerequisite

• Basic knowledge of how to build a decision tree classifier from a set of training data
• Knowledge of instance-based learning algorithms
Introduction – Rule induction
Sequential Covering Algorithm (see the sketch below)
General-to-Specific Search Space
Rule quality measures
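Below is a minimal Python sketch of the generic sequential covering loop for the propositional case. The helper names learn_one_rule (a general-to-specific search for one rule) and performance (a rule-quality measure) are assumptions made for illustration, not a fixed API.

def sequential_covering(target_class, attributes, examples, threshold,
                        learn_one_rule, performance):
    """Greedily learn a disjunctive set of rules, one rule at a time.

    Assumed helpers (illustrative only):
      learn_one_rule(target_class, attributes, examples) -> a rule object
      performance(rule, examples) -> quality score (e.g. accuracy on examples)
    A rule object is assumed to expose rule.covers(example) -> bool.
    """
    learned_rules = []
    remaining = list(examples)
    rule = learn_one_rule(target_class, attributes, remaining)
    while performance(rule, remaining) > threshold:
        learned_rules.append(rule)
        # Remove the examples covered by the accepted rule and learn the next rule
        remaining = [e for e in remaining if not rule.covers(e)]
        rule = learn_one_rule(target_class, attributes, remaining)
    # Sort the learned rules by quality on the full training set
    learned_rules.sort(key=lambda r: performance(r, examples), reverse=True)
    return learned_rules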
First-Order Rules

• Learning rules that contain variables
• They are much more expressive than propositional rules
• Inductive Logic Programming (ILP)
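For instance, a recursive pair of first-order Horn clauses such as the following (a standard illustrative example) has no propositional equivalent, because the variables x, y, z range over all individuals:

Ancestor(x, y) ← Parent(x, y)
Ancestor(x, z) ← Parent(x, y) ∧ Ancestor(y, z)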
Example
Definitions
Horn Clause
Learning sets of First-Order Rules: FOIL
Generating candidate specializations in FOIL
Guiding the search in FOIL
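The search in FOIL is guided by an information-based measure; the standard FOIL gain for evaluating the addition of a candidate literal L to a rule R (notation follows Mitchell's presentation) is

\[
\mathrm{FoilGain}(L, R) \;=\; t \left( \log_2 \frac{p_1}{p_1 + n_1} \;-\; \log_2 \frac{p_0}{p_0 + n_0} \right)
\]

where p0 and n0 are the numbers of positive and negative bindings of R, p1 and n1 are the corresponding counts for the extended rule, and t is the number of positive bindings of R that remain covered after adding L.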

Example:
INDUCTION AS INVERTED DEDUCTION

• Given some data D and some partial background knowledge B, learning can be described as generating a hypothesis h that, together with B, explains D.
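More formally, this "explains" requirement can be written as an entailment constraint over every training example, where x_i is an instance and f(x_i) its target value:

\[
(\forall \langle x_i, f(x_i) \rangle \in D)\quad (B \wedge h \wedge x_i) \vdash f(x_i)
\]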
Example

The process of augmenting the set of predicates, based on background knowledge, is often referred to as
constructive induction
Inverse entailment operators

• Learning is viewed as finding some general concept that matches a given set of training examples
• Allows a richer definition of when a hypothesis may be said to "fit" the data
• Methods that use this background knowledge to guide the search for h
Inverting resolution

• A general method for automated deduction is the resolution rule, introduced by Robinson (1965).
• The resolution rule is a sound and complete rule for deductive inference in first-order logic.
Propositional resolution operator
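Writing clauses as sets of literals, the propositional resolution operator combines two clauses that contain complementary literals; the resolvent of C1 and C2 over a literal L is, in a standard set-notation formulation,

\[
C \;=\; (C_1 - \{L\}) \cup (C_2 - \{\neg L\})
\]

For example, resolving C1 = P ∨ Q with C2 = ¬P ∨ R over P gives the resolvent Q ∨ R.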
Inverting resolution

The learning algorithm can use inverse entailment to construct hypotheses that, together with
the background information, entail the training data.
First-Order resolution
Resolution Operator
Inverse resolution
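In the propositional case, one common form of the inverse resolution operator takes the resolvent C and one parent clause C1 and constructs a possible second parent C2. The sketch below is illustrative only: clauses are represented as Python sets of string literals, and the function names are assumptions, not part of any library.

def negate(literal):
    """Negate a propositional literal written as a string, e.g. 'P' <-> '~P'."""
    return literal[1:] if literal.startswith("~") else "~" + literal

def inverse_resolve(C, C1):
    """Propositional inverse resolution (one common form):
    given resolvent C and parent clause C1, return a possible second parent C2.

    Assumes C1 contributes exactly one literal L that was resolved away,
    i.e. L appears in C1 but not in C.  Then C2 = (C - (C1 - {L})) U {~L}.
    """
    resolved_away = C1 - C
    if len(resolved_away) != 1:
        raise ValueError("expected exactly one literal of C1 to be absent from C")
    (L,) = resolved_away
    return (C - (C1 - {L})) | {negate(L)}

# Example: C = A v B and C1 = B v C give C2 = A v ~C,
# since resolving (B v C) with (A v ~C) over C reproduces A v B.
print(inverse_resolve({"A", "B"}, {"B", "C"}))   # {'A', '~C'}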
Summary

• Sets of first-order rules (i.e., rules containing variables) provide a highly expressive representation.
• The programming language PROLOG represents general programs using collections of first-order Horn clauses.
• The problem of learning first-order Horn clauses is therefore often referred to as the problem of inductive logic programming.
• One approach to learning sets of first-order rules is to extend the sequential covering algorithm of CN2 from propositional to first-order representations.
• This approach is exemplified by the FOIL program, which can learn sets of first-order rules, including simple recursive rule sets.
Summary

• A second approach to learning first-order rules is based on the observation that induction is the inverse of deduction.
• Following the view of induction as the inverse of deduction, some programs search for hypotheses by using operators that invert the well-known operators for deductive reasoning.
References

TEXT BOOKS:

1. Tom M. Mitchell, "Machine Learning", McGraw-Hill Education (India) Private Limited, 2013.

REFERENCES:

1. Ethem Alpaydin, "Introduction to Machine Learning" (Adaptive Computation and Machine Learning), The MIT Press, 2004.
2. Stephen Marsland, "Machine Learning: An Algorithmic Perspective", CRC Press, 2009.
