Title: Evaluating the performance of cost-based discretization versus entropy- and error-based discretization
Authors: JANSSENS, Davy
Issue Date: 2002
Citation: Belgian-Dutch Conference on Artificial Intelligence BNAIC'02. p. 163-170.
Abstract: Discretization is the process of dividing continuous numeric values into intervals of discrete categorical values. This article introduces cost-based discretization as a pre-processing step to the induction of a classifier, in order to obtain an optimal multi-interval splitting for each numeric attribute. A transparent description of the method and the steps involved in cost-based discretization is given. The aim of this paper is to present the method and to assess its potential benefits. Its performance is also examined against two other well-known methods, entropy-based and pure error-based discretization. To this end, experiments were carried out on 14 data sets taken from the UCI Machine Learning Repository. To compare the methods, the area under the Receiver Operating Characteristic (ROC) graph was used and tested for its level of significance. For most data sets, the results show that cost-based discretization achieves satisfactory results compared to entropy- and error-based discretization.
Type: Proceedings Paper
Appears in Collections: Research publications
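To illustrate the kind of pre-processing the abstract describes, the following is a minimal sketch of one of the baseline approaches it mentions, entropy-based discretization: a single binary cut point for a numeric attribute is chosen so as to minimize the weighted class entropy of the two resulting intervals. This is not the cost-based method the paper proposes, and the function names and data are hypothetical; it only shows the general idea of turning a continuous attribute into discrete intervals guided by class labels.

```python
import math

def entropy(labels):
    """Shannon entropy of a list of class labels."""
    n = len(labels)
    if n == 0:
        return 0.0
    ent = 0.0
    for c in set(labels):
        p = labels.count(c) / n
        ent -= p * math.log2(p)
    return ent

def best_entropy_split(values, labels):
    """Return the cut point that minimizes the size-weighted class
    entropy of the two intervals it creates (one binary split)."""
    pairs = sorted(zip(values, labels))
    n = len(pairs)
    best_cut, best_score = None, float("inf")
    for i in range(1, n):
        if pairs[i - 1][0] == pairs[i][0]:
            continue  # no cut point between equal attribute values
        cut = (pairs[i - 1][0] + pairs[i][0]) / 2
        left = [l for v, l in pairs if v <= cut]
        right = [l for v, l in pairs if v > cut]
        score = (len(left) * entropy(left) + len(right) * entropy(right)) / n
        if score < best_score:
            best_cut, best_score = cut, score
    return best_cut

# Toy attribute whose classes separate cleanly around 3.0
values = [1.0, 2.0, 2.5, 3.5, 4.0, 5.0]
labels = ["a", "a", "a", "b", "b", "b"]
print(best_entropy_split(values, labels))  # → 3.0
```

Multi-interval splittings, as studied in the paper, would apply such a criterion recursively within each interval; the cost-based variant replaces the entropy criterion with a misclassification-cost objective.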
Files in This Item:
There are no files associated with this item.
Items in DSpace are protected by copyright, with all rights reserved, unless otherwise indicated.