Explain entropy

Asked by NikitaGavde in Data Science on Nov 9, 2019
Answered by Nikita Gavde

Entropy is a measure of the impurity present in data, and the concept comes from information theory. Entropy is zero if the sample is completely homogeneous, and (for a two-class problem) it is one if the sample is equally divided between the classes. In decision trees, the predictor whose split gives the greatest reduction in entropy, i.e. produces the most homogeneous child nodes, is placed nearest the root, and splits are chosen in this greedy manner to classify the data into classes. Entropy is computed with the formula below.
Entropy = $-\sum_{i=1}^{n} p_i \log_2 p_i$
Here n is the number of classes and p_i is the proportion of samples belonging to class i. For two classes, entropy is maximum with a value of 1 when the classes are equally divided and minimum with a value of 0 when the sample is pure (all one class). A split that produces lower entropy segregates the classes better.
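As a minimal sketch (not part of the original answer), the formula can be checked numerically. The `entropy` helper below is a hypothetical name used only for illustration:

```python
import math

def entropy(proportions):
    """Shannon entropy (base 2) of a list of class proportions."""
    return -sum(p * math.log2(p) for p in proportions if p > 0)

# A completely homogeneous sample has entropy 0.
print(entropy([1.0]))        # 0.0

# A binary sample split 50/50 has the maximum entropy of 1.
print(entropy([0.5, 0.5]))   # 1.0

# Anything in between falls between 0 and 1 (for two classes).
print(entropy([0.8, 0.2]))   # ~0.722
```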

[The original answer included a plot showing how entropy varies with the class proportion: peaking in the middle and falling to 0 at the extremes.]
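To illustrate how a tree might use entropy to decide which predictor sits nearest the root, here is a rough sketch on assumed toy data. The weighted-entropy comparison mirrors the information-gain idea described above; the helper names (`split_entropy`) and the example values ("outlook", "humidity") are illustrative only, not from the original answer:

```python
import math
from collections import Counter

def entropy(labels):
    """Shannon entropy (base 2) of a sequence of class labels."""
    total = len(labels)
    return -sum((c / total) * math.log2(c / total) for c in Counter(labels).values())

def split_entropy(rows, labels, feature_index):
    """Weighted average entropy of the children produced by splitting on one feature."""
    total = len(labels)
    children = {}
    for row, label in zip(rows, labels):
        children.setdefault(row[feature_index], []).append(label)
    return sum(len(child) / total * entropy(child) for child in children.values())

# Toy data: two categorical predictors, one binary class label.
rows = [("sunny", "high"), ("sunny", "low"), ("rainy", "high"), ("rainy", "low")]
labels = ["no", "no", "yes", "yes"]

parent = entropy(labels)  # 1.0, since the classes are equally divided
for i, name in enumerate(["outlook", "humidity"]):
    gain = parent - split_entropy(rows, labels, i)
    print(name, round(gain, 3))

# "outlook" gives the larger reduction in entropy (gain 1.0 vs 0.0),
# so a greedy tree would place it at the root.
```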