
(2015) Shape understanding system, Dordrecht, Springer.

Understanding explanations

Zbigniew Les, Magdalena Les

pp. 229-245

In order to understand, a machine needs to mimic human understanding to some extent; for this reason, machine understanding is based on the assumption that the results of understanding by the machine (SUS) can be evaluated according to the rules applied to the evaluation of human understanding. An essential part of evaluating the machine's (SUS's) ability to understand is to formulate problems and to use them to test whether the machine (SUS) is able to solve them. However, while a machine's ability to solve a problem can to some extent prove that it can understand, there is also a need to prove this by testing the machine's ability to explain how to solve the problem, or to explain the causes, context, and consequences of given facts. Explanations are often confused with arguments: whereas arguments try to demonstrate that something is, will be, or should be the case, explanations attempt to reveal why or how something is or will be. Whereas an argument refers to knowledge and aims to enrich it, an explanation refers to understanding and contributes to it. Explanation is also often confused with justification: whereas a justification is the reason why a belief is properly held, an explanation is the reason why the belief is true, and statements that justify some action can take the form of arguments.

Publication details

Full citation:

Les, Z., Les, M. (2015). Understanding explanations, in Shape understanding system, Dordrecht, Springer, pp. 229-245.
