HRI 2018 presentation on explanatory biases + deep learning

My colleague, Esube Bekele, recently presented our research on integrating deep learning (specifically, a person re-identification network) with an explanatory bias known as the “inherence bias”. The work was featured in the “Explainable Robotics Systems” workshop at HRI 2018. Here’s the paper, and here’s the abstract:

Despite the remarkable progress in deep learning in recent years, a major challenge for present systems is to generate explanations compelling enough to serve as useful accounts of the system’s operations [1]. We argue that compelling explanations are those that exhibit human-like biases. For instance, humans prefer explanations that concern inherent properties instead of extrinsic influences. This bias is pervasive in that it affects the fitness of explanations across a broad swath of contexts [2], particularly those that concern conflicting or anomalous observations. We show how person re-identification (re-ID) networks can exhibit an inherence bias. Re-ID networks operate by computing similarity metrics between pairs of images to infer whether the images depict the same individual. State-of-the-art re-ID networks tend to output a description of a particular individual, a similarity metric, or a discriminative model [3], but no existing re-ID network provides an explanation of its operations. To address the deficit, we developed a multi-attribute residual network that treats a subset of its features as either inherent or extrinsic, and we trained the network on the ViPER dataset [4]. Unlike previous systems, the network reports a judgment paired with an explanation of that judgment in the form of a description. The descriptions concern inherent properties when the network detects dissimilarity and extrinsic properties when it detects similarity. We argue that such a system provides a blueprint for how to make the operations of deep learning techniques comprehensible to human operators.
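
To give a concrete sense of the idea, here is a minimal sketch (in PyTorch) of a pair-based re-ID model that outputs a similarity judgment together with per-image attribute predictions, and then phrases its explanation in terms of inherent attributes when it judges the pair dissimilar and extrinsic attributes when it judges the pair similar. This is not the authors' actual architecture or training code; the attribute vocabulary, network shapes, and threshold below are illustrative assumptions.

```python
# Minimal sketch of a re-ID model that pairs a similarity judgment with an
# attribute-based explanation, following the inherence-bias pattern described
# in the abstract. Attribute names, shapes, and thresholds are illustrative
# assumptions, not the authors' actual design.
import torch
import torch.nn as nn
import torch.nn.functional as F

# Hypothetical attribute vocabulary, split into inherent vs. extrinsic.
INHERENT = ["male", "long_hair", "adult"]                # properties of the person
EXTRINSIC = ["backpack", "dark_jacket", "carrying_bag"]  # properties of the situation

class ExplainableReID(nn.Module):
    def __init__(self, feat_dim=128):
        super().__init__()
        # Toy stand-in backbone; the paper's system uses a residual network.
        self.backbone = nn.Sequential(
            nn.Conv2d(3, 16, 3, stride=2, padding=1), nn.ReLU(),
            nn.Conv2d(16, 32, 3, stride=2, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1), nn.Flatten(),
            nn.Linear(32, feat_dim),
        )
        # One head per attribute group, predicting attribute probabilities.
        self.inherent_head = nn.Linear(feat_dim, len(INHERENT))
        self.extrinsic_head = nn.Linear(feat_dim, len(EXTRINSIC))

    def forward(self, img_a, img_b):
        fa, fb = self.backbone(img_a), self.backbone(img_b)
        similarity = F.cosine_similarity(fa, fb, dim=1)  # re-ID judgment
        attrs_a = {"inherent": torch.sigmoid(self.inherent_head(fa)),
                   "extrinsic": torch.sigmoid(self.extrinsic_head(fa))}
        attrs_b = {"inherent": torch.sigmoid(self.inherent_head(fb)),
                   "extrinsic": torch.sigmoid(self.extrinsic_head(fb))}
        return similarity, attrs_a, attrs_b

def explain(similarity, attrs_a, attrs_b, threshold=0.5):
    """Attach an explanation to the judgment: cite a shared extrinsic property
    when the pair is judged similar, and a differing inherent property when it
    is judged dissimilar (the inherence-bias pattern)."""
    if similarity.item() >= threshold:
        # Same person: explain via an extrinsic property both images share.
        shared = torch.minimum(attrs_a["extrinsic"], attrs_b["extrinsic"]).squeeze(0)
        name = EXTRINSIC[int(shared.argmax())]
        return f"same person: both images show the extrinsic property '{name}'"
    # Different people: explain via the inherent property that differs most.
    diff = (attrs_a["inherent"] - attrs_b["inherent"]).abs().squeeze(0)
    name = INHERENT[int(diff.argmax())]
    return f"different people: the images differ in the inherent property '{name}'"

if __name__ == "__main__":
    model = ExplainableReID()
    a, b = torch.rand(1, 3, 128, 64), torch.rand(1, 3, 128, 64)  # pedestrian crops
    sim, attrs_a, attrs_b = model(a, b)
    print(explain(sim, attrs_a, attrs_b))
```

In the actual system, the attribute heads would be trained multi-task style on annotated pedestrian attributes alongside the re-ID objective; the toy backbone and untrained weights here only stand in for the residual network described in the abstract.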


