ID3 algorithm
In decision tree learning, ID3 (Iterative Dichotomiser 3) is an algorithm invented by Ross Quinlan[1] used to generate a decision tree from a dataset. ID3 is the precursor to the C4.5 algorithm, and is typically used in the machine learning and natural language processing domains.
Algorithm
The ID3 algorithm begins with the original set S as the root node. On each iteration, the algorithm iterates through every unused attribute of the set and calculates the entropy or the information gain of that attribute. It then selects the attribute with the smallest entropy (or, equivalently, the largest information gain), splits the set by that attribute to produce subsets of the data, and recurses on each subset, considering only attributes never selected before. Recursion on a subset may stop in one of these cases:
- every element in the subset belongs to the same class; in this case the node is turned into a leaf node and labelled with the class of the examples.
- there are no more attributes to be selected, but the examples still do not belong to the same class. In this case, the node is made a leaf node and labelled with the most common class of the examples in the subset.
- there are no examples in the subset, which happens when no example in the parent set matched a specific value of the selected attribute (for example, if there was no person in the population with age over 100 years). Then a leaf node is created and labelled with the most common class of the examples in the parent node's set.
Throughout the algorithm, the decision tree is constructed with each non-terminal node (internal node) representing the selected attribute on which the data was split, and each terminal node (leaf node) representing the class label of the final subset of that branch.
Summary
- Calculate the entropy of every attribute of the data set S.
- Partition ("split") the set S into subsets using the attribute for which the resulting entropy after splitting is minimized (or, equivalently, for which information gain is maximum).
- Make a decision tree node containing that attribute.
- Recurse on subsets using the remaining attributes.
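The four steps above can be sketched in Python. This is a minimal illustration under assumed conventions (each example is a dict of attribute values, with the label stored under a `class` key), not Quinlan's original implementation:

```python
from collections import Counter, defaultdict
from math import log2

def entropy(labels):
    """Shannon entropy (in bits) of a list of class labels."""
    total = len(labels)
    return sum(-(c / total) * log2(c / total) for c in Counter(labels).values())

def id3(examples, attributes, target="class"):
    """Build an ID3 tree from `examples` (a list of dicts).

    Returns a class label (leaf) or a nested dict {attribute: {value: subtree}}.
    """
    labels = [row[target] for row in examples]
    # Stop: every example belongs to the same class -> leaf with that class.
    if len(set(labels)) == 1:
        return labels[0]
    # Stop: no attributes left -> leaf with the most common class.
    if not attributes:
        return Counter(labels).most_common(1)[0][0]

    # Pick the attribute whose split minimizes the weighted entropy of the
    # resulting subsets (equivalently, maximizes information gain).
    def split_entropy(attr):
        groups = defaultdict(list)
        for row in examples:
            groups[row[attr]].append(row[target])
        return sum(len(g) / len(examples) * entropy(g) for g in groups.values())

    best = min(attributes, key=split_entropy)
    remaining = [a for a in attributes if a != best]
    tree = {best: {}}
    # Recurse on each subset induced by a value of the chosen attribute.
    for value in {row[best] for row in examples}:
        subset = [row for row in examples if row[best] == value]
        tree[best][value] = id3(subset, remaining, target)
    return tree
```

On a toy weather dataset, for instance, `id3(data, ["outlook", "windy"])` would split first on whichever of the two attributes yields the lowest post-split entropy.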
Properties
ID3 does not guarantee an optimal solution. It can converge upon local optima, because it greedily selects the locally best attribute to split the dataset on each iteration. The algorithm's optimality can be improved by using backtracking during the search for the optimal decision tree, at the cost of possibly taking longer.
ID3 can overfit the training data. To avoid overfitting, smaller decision trees should be preferred over larger ones. This algorithm usually produces small trees, but it does not always produce the smallest possible decision tree.
ID3 is harder to use on continuous data than on factored data (factored data has a discrete number of possible values, thus reducing the possible branch points). If the values of any given attribute are continuous, there are many more places to split the data on that attribute, and searching for the best split value can be time-consuming.
Usage
The ID3 algorithm is used by training on a data set to produce a decision tree, which is stored in memory. At runtime, this decision tree is used to classify new test cases (feature vectors) by traversing the tree using the features of the datum to arrive at a leaf node.
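The classification step can be sketched as follows, assuming the trained tree is stored as nested dicts of the form `{attribute: {value: subtree}}` with class labels at the leaves (a hypothetical representation, not mandated by ID3):

```python
def classify(tree, datum):
    """Traverse a decision tree (nested dicts) down to a leaf label.

    `tree` is either a class label (leaf) or {attribute: {value: subtree}};
    `datum` maps attribute names to this test case's values.
    """
    while isinstance(tree, dict):
        attribute = next(iter(tree))              # attribute tested at this node
        tree = tree[attribute][datum[attribute]]  # follow the matching branch
    return tree

# Hypothetical tree: split on "outlook", then on "windy" for rainy days.
tree = {"outlook": {"sunny": "no",
                    "overcast": "yes",
                    "rain": {"windy": {"true": "no", "false": "yes"}}}}
print(classify(tree, {"outlook": "rain", "windy": "false"}))  # yes
```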
The ID3 metrics
Entropy
Entropy H(S) is a measure of the amount of uncertainty in the (data) set S (i.e. entropy characterizes the (data) set S):

H(S) = Σ_{x∈X} −p(x) log₂ p(x)

Where,
- S – The current dataset for which entropy is being calculated. This changes at each step of the ID3 algorithm, either to a subset of the previous set in the case of splitting on an attribute, or to a "sibling" partition of the parent in case the recursion terminated previously.
- X – The set of classes in S
- p(x) – The proportion of the number of elements in class x to the number of elements in set S

When H(S) = 0, the set S is perfectly classified (i.e. all elements in S are of the same class).
In ID3, entropy is calculated for each remaining attribute. The attribute with the smallest entropy is used to split the set on this iteration. Entropy in information theory measures how much information is expected to be gained upon measuring a random variable; as such, it can also be used to quantify the extent to which the distribution of the quantity's values is unknown. A constant quantity has zero entropy, as its distribution is perfectly known. In contrast, a uniformly distributed random variable (discretely or continuously uniform) maximizes entropy. Therefore, the greater the entropy at a node, the less is known about the classification of data at this stage of the tree, and therefore the greater the potential to improve the classification here.
As such, ID3 is a greedy heuristic performing a best-first search for locally optimal entropy values.
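The entropy formula above translates directly into code; a minimal sketch in Python:

```python
from collections import Counter
from math import log2

def entropy(labels):
    """H(S) = sum over classes x of -p(x) * log2(p(x)), in bits."""
    total = len(labels)
    return sum(-(count / total) * log2(count / total)
               for count in Counter(labels).values())

print(entropy(["yes", "yes", "yes", "yes"]))  # 0.0 — perfectly classified set
print(entropy(["yes", "yes", "no", "no"]))    # 1.0 — maximal for two classes
```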
Information gain
Information gain IG(S, A) is the measure of the difference in entropy from before to after the set S is split on an attribute A; in other words, how much uncertainty in S was reduced after splitting set S on attribute A:

IG(S, A) = H(S) − Σ_{t∈T} p(t) H(t)

Where,
- H(S) – Entropy of set S
- T – The subsets created from splitting set S by attribute A, such that S = ⋃_{t∈T} t
- p(t) – The proportion of the number of elements in t to the number of elements in set S
- H(t) – Entropy of subset t
In ID3, information gain can be calculated (instead of entropy) for each remaining attribute. The attribute with the largest information gain is used to split the set on this iteration.
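A sketch of the information-gain computation, under the same assumed data layout (examples as dicts, with the label under a `class` key):

```python
from collections import Counter, defaultdict
from math import log2

def entropy(labels):
    """H(S) in bits for a list of class labels."""
    total = len(labels)
    return sum(-(c / total) * log2(c / total) for c in Counter(labels).values())

def information_gain(examples, attribute, target="class"):
    """IG(S, A) = H(S) - sum over subsets t of p(t) * H(t)."""
    # Partition the label list by the value of the chosen attribute.
    partitions = defaultdict(list)
    for row in examples:
        partitions[row[attribute]].append(row[target])
    total = len(examples)
    weighted = sum(len(t) / total * entropy(t) for t in partitions.values())
    return entropy([row[target] for row in examples]) - weighted

data = [{"windy": "true", "class": "no"},
        {"windy": "true", "class": "no"},
        {"windy": "false", "class": "yes"},
        {"windy": "false", "class": "yes"}]
print(information_gain(data, "windy"))  # 1.0: splitting on "windy" removes all uncertainty
```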
See also
- Classification and regression tree (CART)
- C4.5 algorithm
- Decision tree learning
References
Further reading
- Mitchell, Tom Michael (1997). Machine Learning. New York, NY: McGraw-Hill. pp. 55–58. OCLC 36417892.
- Grzymala-Busse, Jerzy W. (February 1993). "Selected Algorithms of Machine Learning from Examples" (PDF). Fundamenta Informaticae. 18 (2): 193–207 – via ResearchGate.
External links
- Seminars – http://www2.cs.uregina.ca/
- Description and examples – http://www.cise.ufl.edu/
- Description and examples – http://www.cis.temple.edu/
- Decision Trees and Political Party Classification