How to Explain Individual Classification Decisions
Statistics – Machine Learning
Scientific paper
2009-12-06
31 pages, 14 figures
After building a classifier with modern machine-learning tools, we typically have a black box at hand that predicts well for unseen data. Thus, we get an answer to the question of what the most likely label of a given unseen data point is. However, most methods provide no answer to why the model predicted a particular label for a single instance, or which features were most influential for that particular instance. The only method currently able to provide such explanations is the decision tree. This paper proposes a procedure which (based on a set of assumptions) allows the decisions of any classification method to be explained.
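The abstract does not spell out the procedure itself, so the following is only a minimal sketch of the general idea of instance-level explanations for a black-box classifier: estimate the local gradient of the predicted class probability by finite differences, so that features with large absolute gradient entries are the ones most influential for that particular prediction. The function names, the choice of classifier, and the finite-difference approach are illustrative assumptions, not the paper's stated algorithm.

import numpy as np
from sklearn.datasets import make_classification
from sklearn.svm import SVC

def local_explanation(predict_proba, x, class_idx, eps=1e-3):
    """Central-difference estimate of d P(class_idx | x) / d x_j.

    Entries with large absolute value mark features whose small changes
    most affect the predicted probability for this single instance.
    (Illustrative sketch only; not the paper's published procedure.)"""
    x = np.asarray(x, dtype=float)
    grad = np.zeros_like(x)
    for j in range(x.size):
        x_plus, x_minus = x.copy(), x.copy()
        x_plus[j] += eps
        x_minus[j] -= eps
        p_plus = predict_proba(x_plus[None, :])[0, class_idx]
        p_minus = predict_proba(x_minus[None, :])[0, class_idx]
        grad[j] = (p_plus - p_minus) / (2.0 * eps)
    return grad

# Toy data and a black-box classifier with smooth probability outputs.
X, y = make_classification(n_samples=500, n_features=5, random_state=0)
clf = SVC(probability=True, gamma="scale", random_state=0).fit(X, y)

x0 = X[0]                                   # the single instance to explain
label = int(clf.predict(x0[None, :])[0])    # its predicted label
influence = local_explanation(clf.predict_proba, x0, class_idx=label)
print("predicted label:", label)
print("per-feature influence (signed):", np.round(influence, 4))

A smooth probability model is assumed here on purpose: for classifiers whose predicted probabilities are piecewise constant (e.g. a plain decision tree), such finite differences can be exactly zero, which is one reason instance-level explanations for arbitrary classifiers require additional assumptions.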
Baehrens David
Hansen Katja
Harmeling Stefan
Kawanabe Motoaki
Mueller Klaus-Robert