Automatic Inconsistency Treatment in Case-Based Reasoning Systems

Vera Lucia M. Falquete, Julio C. Nievola, Celso A. A. Kaestner

Inconsistency normally arises when using Case-Based Reasoning (CBR) systems. As these systems aim to solve problems using previously recorded situations, real applications frequently present cases that, even starting from similar premises, lead to different conclusions. In this work we propose an automatic way of dealing with the inconsistency problem in CBR systems. We employ the Evidential Logic (EL) formalism, a special case of the Paraconsistent Logics, which admit inconsistent but non-trivial theories. In EL, belief and disbelief evidential factors are associated with formulas or, in the CBR context, with each case. We automatically calculate these factors by applying the well-known Naïve Bayes Machine Learning algorithm to the case base. The overall proposal is tested on a CBR platform that employs the k-Nearest Neighbors (k-NN) algorithm: the solution of an unseen case is obtained as the "closest" one in the case base given by k-NN. We test our proposal using six different rules for combining belief and disbelief factors with the distances given by k-NN, on nine databases from the UCI Machine Learning repository. The results obtained indicate the utility of our proposal in practical applications.
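To make the pipeline described in the abstract concrete, the following is a minimal sketch of the general idea, not the paper's actual method: Naïve Bayes posteriors stand in for the belief/disbelief evidential factors of each stored case, and one hypothetical combining rule (inverse distance times belief minus disbelief) weights the k-NN votes. The data, the specific combining rule, and all names here are illustrative assumptions; the paper evaluates six combining rules on UCI databases, which are not reproduced here.

```python
import math
from collections import defaultdict

# Toy case base: (features, outcome). Illustrative only; note the third case,
# which has premises similar to the first two but a different conclusion.
case_base = [
    ((1.0, 2.0), "A"), ((1.2, 1.8), "A"), ((1.1, 2.1), "B"),
    ((4.0, 5.0), "B"), ((4.2, 4.8), "B"), ((3.9, 5.1), "B"),
]

def gaussian_nb_posteriors(x, cases):
    """Gaussian Naive Bayes posteriors P(class | x) learned from the case base."""
    by_class = defaultdict(list)
    for feats, y in cases:
        by_class[y].append(feats)
    log_scores = {}
    n = len(cases)
    for y, rows in by_class.items():
        logp = math.log(len(rows) / n)  # class prior
        for d in range(len(x)):
            vals = [r[d] for r in rows]
            mu = sum(vals) / len(vals)
            var = sum((v - mu) ** 2 for v in vals) / len(vals) + 1e-6
            logp += -0.5 * math.log(2 * math.pi * var) - (x[d] - mu) ** 2 / (2 * var)
        log_scores[y] = logp
    z = max(log_scores.values())  # log-sum-exp normalization for stability
    total = sum(math.exp(s - z) for s in log_scores.values())
    return {y: math.exp(s - z) / total for y, s in log_scores.items()}

# Belief/disbelief per stored case: evidence for vs. against its own outcome.
factors = []
for feats, y in case_base:
    post = gaussian_nb_posteriors(feats, case_base)
    belief = post[y]
    disbelief = max((p for c, p in post.items() if c != y), default=0.0)
    factors.append((belief, disbelief))

def solve(query, k=3):
    """k-NN retrieval; hypothetical combining rule: each of the k nearest
    cases votes with weight (belief - disbelief) / distance."""
    dists = [(math.dist(query, feats), i) for i, (feats, _) in enumerate(case_base)]
    votes = defaultdict(float)
    for d, i in sorted(dists)[:k]:
        b, db = factors[i]
        votes[case_base[i][1]] += (b - db) / (d + 1e-6)
    return max(votes, key=votes.get)

print(solve((4.1, 5.0)))  # → B
```

In this sketch an inconsistent case (one whose Naïve Bayes posterior contradicts its recorded outcome) receives belief < disbelief, so its k-NN vote is penalized or even inverted rather than the case being discarded, which is the paraconsistent spirit of keeping inconsistent but non-trivial theories.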

