Teaching categories to human learners with visual explanations
We study the problem of computer-assisted teaching with explanations. Conventional approaches to machine teaching typically provide feedback only at the instance level, e.g., the category or label of the instance. However, it is intuitive that clear explanations from a knowledgeable teacher can significantly improve a student's ability to learn a new concept. To address these limitations, we propose a teaching framework that provides interpretable explanations as feedback and models how the learner incorporates this additional information. In the case of images, we show that we can automatically generate explanations that highlight the parts of the image that are responsible for the class label. Experiments on human learners show that, on average, participants achieve better test set performance on challenging categorization tasks when taught with our interpretable approach than with existing methods.
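To make the "highlight the parts of the image responsible for the class label" idea concrete, here is a minimal sketch of one common family of visual-explanation techniques: a class-activation-map (CAM) style heatmap that weights a convolutional feature map by a linear classifier's per-channel class weights. This is an illustrative stand-in, not necessarily the paper's own method; the feature map and weights below are random toy data.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy stand-ins: a 4-channel 7x7 feature map (as a CNN's final conv
# layer might output) and one class's channel weights from a linear head.
feature_map = rng.random((4, 7, 7))   # (channels, H, W)
class_weights = rng.random(4)         # importance of each channel for the class

# CAM-style explanation: weight each channel by its class importance,
# sum over channels, keep positive evidence, normalize to [0, 1].
cam = np.tensordot(class_weights, feature_map, axes=1)  # (H, W)
cam = np.maximum(cam, 0.0)
cam = (cam - cam.min()) / (cam.max() - cam.min() + 1e-8)

# The highest-valued cells mark the regions most responsible for the label;
# upsampled to image size, this becomes the highlighted explanation.
top = np.unravel_index(np.argmax(cam), cam.shape)
print(cam.shape, top)
```

In a real pipeline the normalized map would be upsampled to the input resolution and overlaid on the image to show the learner which regions drive the predicted category.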
© 2018 IEEE. We would like to thank Google for their gift to the Visipedia project, AWS Research Credits, Bloomberg, Northrop Grumman, and the Swiss NSF for their Early Mobility Postdoctoral Fellowship. Thanks also to Kareem Moussa for providing the OCT dataset and to Kun Ho Kim for helping generate crowd embeddings.