Feature selection has proven to be a valuable technique
in supervised learning for improving predictive
accuracy while reducing the number of
attributes considered in a task. We investigate
the potential for similar benefits in an unsupervised
learning task, conceptual clustering. The
issues raised in feature selection by the absence
We propose an exact method, based on Generalized Benders Decomposition, to select the best M features during induction. We provide details of the method and highlight some interesting parallels between the technique proposed here and some of those published in the literature. We also propose a relaxation of the problem where selecting too many features is penalized. The original method performs well on a variety of datasets. The relaxation, though competitive, is sensitive to the penalty parameter.