Automatic Inconsistency Treatment in Case-Based Reasoning Systems

**Vera Lucia M. Falquete, Julio C. Nievola, Celso A.A. Kaestner**

Inconsistency commonly arises when using Case-Based Reasoning (CBR) systems. As these systems aim to solve problems using previously recorded situations, real applications frequently present cases that, even starting from similar premises, lead to different conclusions. In this work we propose an automatic way of dealing with the inconsistency problem in CBR systems. We employ the Evidential Logic (EL) formalism, a special case of the Paraconsistent Logics, which admit inconsistent but non-trivial theories. In EL, belief and disbelief evidential factors are associated with formulas or, in the CBR context, with each case. We automatically compute these factors by applying the well-known Naive Bayes machine-learning algorithm to the case base. The overall proposal is tested in a CBR platform that employs the k-Nearest Neighbors (k-NN) algorithm: the solution of an unseen case is obtained from the "closest" case in the case base as given by k-NN. We test our proposal using six different rules for combining the belief and disbelief factors with the distances given by k-NN, on nine databases from the UCI Machine Learning repository. The obtained results indicate the utility of our proposal in practical applications.
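The abstract does not spell out how the evidential factors are computed or combined, so the following is only a minimal sketch of one plausible reading: for each stored case, belief is taken as the Naive Bayes posterior probability of the case's own class, disbelief as the highest posterior among the rival classes, and one hypothetical combining rule weights each k-NN neighbor's vote by `(belief - disbelief) / (1 + distance)`. All function names, the Hamming distance, and the combining rule are illustrative assumptions, not the paper's actual method.

```python
import math
from collections import defaultdict

def naive_bayes_posteriors(cases, labels, x, alpha=1.0):
    """Laplace-smoothed categorical Naive Bayes: P(class | x) for each class."""
    n = len(cases)
    log_post = {}
    for c in sorted(set(labels)):
        idx = [i for i, y in enumerate(labels) if y == c]
        lp = math.log(len(idx) / n)  # class prior
        for j, v in enumerate(x):
            count = sum(1 for i in idx if cases[i][j] == v)
            n_values = len(set(row[j] for row in cases))
            lp += math.log((count + alpha) / (len(idx) + alpha * n_values))
        log_post[c] = lp
    # normalize in a numerically stable way
    m = max(log_post.values())
    exps = {c: math.exp(lp - m) for c, lp in log_post.items()}
    z = sum(exps.values())
    return {c: e / z for c, e in exps.items()}

def evidential_factors(cases, labels):
    """Belief = posterior of the case's own class; disbelief = best rival posterior."""
    factors = []
    for x, y in zip(cases, labels):
        post = naive_bayes_posteriors(cases, labels, x)
        belief = post[y]
        disbelief = max(p for c, p in post.items() if c != y)
        factors.append((belief, disbelief))
    return factors

def knn_classify(cases, labels, factors, query, k=3):
    """k-NN vote where each neighbor weighs in with (belief - disbelief)/(1 + distance)."""
    def hamming(a, b):
        return sum(u != v for u, v in zip(a, b))
    nearest = sorted(range(len(cases)), key=lambda i: hamming(cases[i], query))[:k]
    votes = defaultdict(float)
    for i in nearest:
        b, d = factors[i]
        votes[labels[i]] += (b - d) / (1.0 + hamming(cases[i], query))
    return max(votes, key=votes.get)
```

On a toy categorical case base, e.g. `cases = [("sunny","hot"), ("sunny","mild"), ("rain","mild"), ("rain","cool")]` with `labels = ["no","no","yes","yes"]`, the factors down-weight cases whose own class the model doubts, which is the intended effect when inconsistent (similar-premise, different-conclusion) cases coexist in the base.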

http://www.lbd.dcc.ufmg.br:8080/colecoes/sbbd/2006/003.pdf
