This document discusses classification rules and models for machine learning problems. It provides examples of classification rules learned from the weather and iris datasets. For the weather data, rules are provided to predict whether to "play" based on the outlook, humidity, and temperature attributes. For the iris data, rules are given to classify the type of iris based on the sepal width, petal length, and petal width attributes. The document also discusses decision trees as a way to visualize classification rules with node tests and leaf predictions.
The weather data is a small dataset that we will use repeatedly to illustrate machine learning methods. A set of rules learned from this data might be:
• If outlook = overcast then play = yes
• If humidity = normal then play = yes
• If outlook = sunny and humidity = high then play = no
A set of rules that is intended to be interpreted in sequence is called a decision list.
• In Table 1.3, two of the attributes, temperature and humidity, have numeric values.
• This means that any learning method must create inequalities involving these attributes rather than simple equality tests, as in the former case. This is called a numeric-attribute problem.
• Here it is, more precisely, a mixed-attribute problem, because not all of the attributes are numeric.
• With numeric humidity, the rule given earlier might take the following form: If outlook = sunny and humidity > 81 then play = no
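The rules above, read in order as a decision list, can be sketched directly in Python. This is a minimal illustration, not WEKA output; the dict-based instance representation and the default class returned when no rule fires are assumptions for the sketch.

```python
# Decision list for the weather data: rules are tried in order, and the
# first rule whose conditions hold determines the prediction.
# Representing an instance as a dict of nominal values is an assumption.

def classify_play(instance):
    if instance["outlook"] == "overcast":
        return "yes"
    if instance["humidity"] == "normal":
        return "yes"
    if instance["outlook"] == "sunny" and instance["humidity"] == "high":
        return "no"
    return "yes"  # assumed default class when no rule fires

# Example: a sunny, high-humidity day is caught by the third rule.
print(classify_play({"outlook": "sunny", "humidity": "high"}))  # no
```

Because the rules are interpreted in sequence, their order matters: a sunny day with normal humidity is caught by the second rule before the third is ever tested.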
• Exercise: write a few more rules.
• The rules we have seen so far are classification rules: they predict the class attribute, i.e., whether to play or not.
• Contact lenses: an idealized problem. Exercise: write a few classification rules for it.
Decision Trees
• Nodes in a decision tree involve testing a particular attribute.
• Usually, the test at a node compares an attribute value with a constant.
• However, some trees compare two attributes with each other, or use some function of one or more attributes.
• Leaf nodes give a classification that applies to all instances that reach the leaf.
• Exercise: write a few classification rules.
The following set of classification rules might be learned from the iris dataset:
• If petal length < 2.45 then Iris setosa
• If sepal width < 2.10 then Iris versicolor
• If sepal width < 2.45 and petal length < 4.55 then Iris versicolor
• If sepal width < 2.95 and petal width < 1.35 then Iris versicolor
• If petal length ≥ 5.15 then Iris virginica
• If petal width ≥ 1.85 then Iris virginica
• If petal width ≥ 1.75 and sepal width < 3.05 then Iris virginica
• If petal length ≥ 4.95 and petal width < 1.55 then Iris virginica
Problem: classify the attribute "Play" of the weather.nominal.arff dataset in WEKA.
• Open weather.nominal.arff.
• Choose one of the different types of classifier available in WEKA.
• Click Start to run the classifier.
• The output reports the number of instances and attributes, the number of leaves and the size of the tree, and the overall accuracy (here 57.14%).
• Right-click the result entry and select the "Visualize tree" option to display the decision tree.
• Exercise: write classification rules based on the decision tree; the rules describe the same decision tree.
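The iris rules given earlier can likewise be coded as an ordered decision list. This is a sketch under two assumptions: measurements are in centimetres, and the rules are applied first-match-wins; the function name and the None fallback are illustrative, not part of the original rule set.

```python
# Ordered decision list for the iris rules: the first rule whose
# conditions hold determines the class.

def classify_iris(sepal_width, petal_length, petal_width):
    if petal_length < 2.45:
        return "Iris setosa"
    if sepal_width < 2.10:
        return "Iris versicolor"
    if sepal_width < 2.45 and petal_length < 4.55:
        return "Iris versicolor"
    if sepal_width < 2.95 and petal_width < 1.35:
        return "Iris versicolor"
    if petal_length >= 5.15:
        return "Iris virginica"
    if petal_width >= 1.85:
        return "Iris virginica"
    if petal_width >= 1.75 and sepal_width < 3.05:
        return "Iris virginica"
    if petal_length >= 4.95 and petal_width < 1.55:
        return "Iris virginica"
    return None  # no rule fires; a real learner would supply a default

# Example: a short petal length triggers the first rule.
print(classify_iris(3.5, 1.4, 0.2))  # Iris setosa
```

Note that the versicolor rules must come before the virginica rules for this list to behave as intended, which again shows why the order of a decision list is part of the model.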