In a decision tree, entropy is used to find the splitting point at each node: it measures how mixed a column is. A decision tree is a supervised machine-learning algorithm that builds a tree-like structure during training. The question is how it builds that tree and splits the dataset at each node. This is where entropy comes in: it quantifies the impurity of a split and guides the construction of the tree. In this short article, we will learn how to calculate entropy in a decision tree and discuss the kinds of entropy calculations a decision tree uses.
Calculate the Entropy in Decision Tree
The entropy for a binary target can be calculated with the simple formula shown below:
Entropy = -(p(0) * log2(p(0)) + p(1) * log2(p(1)))
In general, for a target with any number of classes, the formula is:
Entropy = -Σ p(i) * log2(p(i)), summed over all classes i
Entropy is zero when the dataset is completely homogeneous, and for a binary target it reaches its maximum value of 1 when the sample is equally divided between the two classes.
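The formula above can be sketched in plain Python. The `entropy` helper below is an illustrative name, not part of any library:

```python
import math

def entropy(labels):
    """Shannon entropy (base 2) of a list of class labels."""
    n = len(labels)
    probs = [labels.count(c) / n for c in set(labels)]
    # sum of -p * log2(p) over every class present in the column
    return sum(-p * math.log2(p) for p in probs)

print(entropy([0, 0, 1, 1]))  # equally divided sample -> 1.0
print(entropy([1, 1, 1, 1]))  # completely homogeneous sample -> 0.0
```

The two prints confirm the boundary cases stated above: a 50/50 binary split gives entropy 1, and a single-class column gives entropy 0.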
How Is Entropy Used to Build a Decision Tree?
To build a decision tree, the model needs two types of entropy calculations:
- Entropy using the frequency table of one attribute
- Entropy using the frequency table of two attributes
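These two calculations can be sketched on a toy weather dataset. The column values and the helper names (`entropy_of`, `conditional_entropy`) below are illustrative assumptions, not from any library:

```python
import math

def entropy_of(labels):
    """Entropy from the frequency table of one attribute."""
    n = len(labels)
    counts = {c: labels.count(c) for c in set(labels)}
    return sum(-(k / n) * math.log2(k / n) for k in counts.values())

def conditional_entropy(feature, target):
    """Weighted entropy from the frequency table of two attributes:
    entropy of the target within each feature value, weighted by size."""
    n = len(target)
    total = 0.0
    for value in set(feature):
        subset = [t for f, t in zip(feature, target) if f == value]
        total += len(subset) / n * entropy_of(subset)
    return total

# toy data: does the outlook predict whether we play?
outlook = ["sunny", "sunny", "rain", "rain", "sunny"]
play    = ["yes",   "yes",   "no",   "no",   "yes"]

info_gain = entropy_of(play) - conditional_entropy(outlook, play)
print(round(entropy_of(play), 3))  # entropy of the target alone -> 0.971
print(info_gain)  # the split on outlook removes all impurity here
```

The tree picks the attribute whose split yields the largest drop in entropy (the information gain), which is exactly the difference between the two calculations.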
How to Visualize a Decision Tree with Entropy Values?
There are various ways to visualize a decision tree. Here we will use a simple method that shows the entropy values in the plot.
We will assume that you have already built your decision tree and want to visualize it to see the entropy values. (Go through the Decision trees post to learn how to train the model.)
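If you do not have a trained tree at hand, a minimal sketch looks like this; the iris dataset and the hyperparameter values are stand-in assumptions for illustration:

```python
from sklearn.datasets import load_iris
from sklearn.tree import DecisionTreeClassifier

X, y = load_iris(return_X_y=True)
# criterion="entropy" tells sklearn to use entropy (rather than Gini) at each split
classifier = DecisionTreeClassifier(criterion="entropy", max_depth=3, random_state=0)
classifier.fit(X, y)
# impurity of the root node: three equally frequent classes -> log2(3) ≈ 1.585
print(classifier.tree_.impurity[0])
```

The fitted `classifier` is then what gets passed to the plotting code.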
# importing the plotting tools
import matplotlib.pyplot as plt
from sklearn.tree import DecisionTreeClassifier, plot_tree
# setting the output size of the decision tree plot
plt.figure(figsize=(40, 20))
# plotting the decision tree
plot_tree(classifier, filled=True)
plt.title("Decision tree trained on the training dataset")
plt.show()
As you can see, the plot shows the entropy value at each splitting node.
A decision tree is a supervised machine learning algorithm that is very common and popular, as many boosting algorithms are built on decision trees. In this short post, we discussed how to calculate the entropy in decision trees and how to visualize it along with the tree itself.
Can the value of Entropy be Zero?
Yes, the value of entropy can be zero, which means the dataset is completely homogeneous (all samples belong to a single class).
Can the value of Entropy be one?
Yes, for a binary target, an entropy of one means the data is equally divided between the two classes.
What is the formula for Entropy in Decision Tree?
The formula for entropy in a decision tree (binary case) is: -(p(0) * log2(p(0)) + p(1) * log2(p(1)))