
DECISION TREES
A decision tree is a collection of decision nodes, connected by branches,
extending downward from the root node until terminating in leaf nodes.
Beginning at the root node, which by convention is placed at the top of the
decision tree diagram, attributes are tested at the decision nodes, with each
possible outcome resulting in a branch. Each branch then leads either to
another decision node or to a terminating leaf node.
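The structure described above can be sketched in a few lines of Python. The attribute names and the example tree here are hypothetical, chosen only to illustrate how a record is routed from the root to a leaf:

```python
# Minimal sketch: decision nodes test an attribute, each outcome is a branch,
# and leaf nodes hold a class label. The tree below is a made-up example.

class Leaf:
    def __init__(self, label):
        self.label = label

class DecisionNode:
    def __init__(self, attribute, branches):
        self.attribute = attribute  # attribute tested at this node
        self.branches = branches    # outcome value -> child node

def classify(node, record):
    """Follow branches from the root until a leaf is reached."""
    while isinstance(node, DecisionNode):
        node = node.branches[record[node.attribute]]
    return node.label

# Hypothetical tree: the root tests "outlook"; one branch tests "windy".
tree = DecisionNode("outlook", {
    "sunny": Leaf("play"),
    "rainy": DecisionNode("windy", {
        "yes": Leaf("stay home"),
        "no": Leaf("play"),
    }),
})

print(classify(tree, {"outlook": "rainy", "windy": "no"}))  # -> play
```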
CLASSIFICATION AND REGRESSION TREES (CART) ALGORITHM

The decision trees produced by CART are strictly binary, containing exactly two branches for
each decision node. CART recursively partitions the records in the training data set into subsets
of records with similar values for the target attribute.
C4.5 ALGORITHM
The C4.5 algorithm is Quinlan’s extension of his own ID3 algorithm for
generating decision trees. Just as with CART, the C4.5 algorithm recursively visits
each decision node, selecting the optimal split, until no further splits are possible.
The difference between CART and C4.5:
Unlike CART, the C4.5 algorithm is not restricted to binary splits: a
categorical attribute may be split into one branch per value. Whereas CART
always produces a binary tree, C4.5 produces a tree of more variable shape.
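C4.5 ranks candidate splits by gain ratio (information gain divided by split information), with a multiway split giving one branch per value of a categorical attribute. A minimal sketch, using a made-up weather data set:

```python
# Sketch of the C4.5 split criterion: gain ratio for a multiway split on a
# categorical attribute. The records below are hypothetical.

from math import log2
from collections import Counter

def entropy(labels):
    """Shannon entropy of a list of class labels."""
    n = len(labels)
    return -sum((c / n) * log2(c / n) for c in Counter(labels).values())

def gain_ratio(values, labels):
    """Gain ratio of splitting `labels` into one branch per attribute value."""
    n = len(labels)
    groups = {}
    for v, l in zip(values, labels):
        groups.setdefault(v, []).append(l)
    gain = entropy(labels) - sum(len(g) / n * entropy(g) for g in groups.values())
    split_info = -sum((len(g) / n) * log2(len(g) / n) for g in groups.values())
    return gain / split_info if split_info else 0.0

outlook = ["sunny", "sunny", "rain", "rain", "overcast", "overcast"]
play    = ["no",    "no",    "yes",  "no",   "yes",      "yes"]
print(round(gain_ratio(outlook, play), 3))
```

Dividing by the split information penalizes attributes with many values, which plain information gain (as used by ID3) tends to favor.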
