Computer Science – Learning
Scientific paper
2007-11-23
8 pages, 8 figures, and 2 tables
This correspondence studies a basic problem in classification: how to evaluate different classifiers. Although conventional performance indexes, such as accuracy, are commonly used in classifier selection or evaluation, information-based criteria, such as mutual information, are becoming popular in feature and model selection. In this work, we propose to assess classifiers in terms of normalized mutual information (NI), which is novel and well defined within a compact range for classifier evaluation. We derive closed-form relations of normalized mutual information with respect to accuracy, precision, and recall in binary classification. By exploring the relations among them, we reveal that NI is actually a set of nonlinear functions, with a concordant power-exponent form, of each performance index. The relations can also be expressed with respect to precision and recall, or to false alarm and hitting rate (recall).
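The abstract's NI can be computed directly from a binary confusion matrix: form the joint distribution of true labels and predictions, take the mutual information, and normalize. The sketch below is illustrative only, assuming counts (tp, fn, fp, tn) and normalization by the entropy of the true labels H(T); the paper's exact normalizer may differ.

```python
import math

def normalized_mi(tp, fn, fp, tn):
    """Normalized mutual information NI between true labels T and
    predictions Y, computed from binary confusion-matrix counts.

    Assumption for illustration: NI = I(T; Y) / H(T), which keeps
    the index in the compact range [0, 1].
    """
    n = tp + fn + fp + tn
    # Joint distribution p(t, y) estimated from the counts.
    joint = [[tp / n, fn / n],   # t = positive: predicted pos, neg
             [fp / n, tn / n]]   # t = negative: predicted pos, neg
    pt = [sum(row) for row in joint]        # marginal p(t)
    py = [sum(col) for col in zip(*joint)]  # marginal p(y)
    # Mutual information I(T; Y) in bits, skipping zero cells.
    mi = sum(p * math.log2(p / (pt[i] * py[j]))
             for i, row in enumerate(joint)
             for j, p in enumerate(row) if p > 0)
    # Entropy of the true labels, H(T).
    ht = -sum(p * math.log2(p) for p in pt if p > 0)
    return mi / ht if ht > 0 else 0.0
```

Under this normalization, a perfect classifier on a balanced problem (e.g. `normalized_mi(50, 0, 0, 50)`) yields 1.0, while a classifier independent of the labels (e.g. `normalized_mi(25, 25, 25, 25)`) yields 0.0, consistent with NI being well defined in a compact range.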
Hu Bao-Gang
Wang Yong