In this article, we described robust extensions of the nearest neighbor pattern classifier,
analyzed their capabilities, and examined their empirical behavior in incremental, supervised
learning applications. Our analysis showed that IBL algorithms can learn any concept
describable as a finite union of closed hypercurves of finite size in the instance space whose
instances are selected from any fixed and bounded continuous distribution. We then showed
that these algorithms' storage requirements can be sharply reduced with only small sacrifices
in classification accuracy. Finally, we demonstrated that this storage-reducing algorithm
can be improved by applying a selective utilization filter to the saved instances, which
increases the algorithm's tolerance of noisy instances.
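To make the two mechanisms summarized above concrete, the following is a minimal sketch, not the article's exact algorithms, of an incremental nearest-neighbor learner that (a) reduces storage by retaining only instances that the current memory misclassifies and (b) applies a utilization filter at prediction time, consulting only saved instances whose running classification record is acceptable. The class name `ReducedNNLearner`, the `min_accuracy` threshold of 0.5, and the record-keeping details are illustrative assumptions.

```python
# Sketch only: storage reduction plus a selective utilization filter
# for an incremental nearest-neighbor learner. Details are assumptions.
import math
from dataclasses import dataclass, field


@dataclass
class StoredInstance:
    features: tuple          # numeric attribute values
    label: str               # class label
    correct: int = 0         # times this instance's vote was correct
    attempts: int = 0        # times this instance was the nearest neighbor

    def accuracy(self) -> float:
        return self.correct / self.attempts if self.attempts else 1.0


@dataclass
class ReducedNNLearner:
    min_accuracy: float = 0.5            # utilization threshold (assumed value)
    memory: list = field(default_factory=list)

    def _nearest(self, x, candidates):
        return min(candidates,
                   key=lambda s: math.dist(s.features, x), default=None)

    def predict(self, x):
        # Selective utilization: consult only instances with good records.
        usable = [s for s in self.memory if s.accuracy() >= self.min_accuracy]
        nearest = self._nearest(x, usable or self.memory)
        return nearest.label if nearest else None

    def train(self, x, label):
        # Update the classification record of the nearest stored instance.
        nearest = self._nearest(x, self.memory)
        if nearest is not None:
            nearest.attempts += 1
            if nearest.label == label:
                nearest.correct += 1
        # Storage reduction: save the instance only if it was misclassified.
        if nearest is None or nearest.label != label:
            self.memory.append(StoredInstance(tuple(x), label))


# Example: incremental, supervised presentation of labeled instances.
learner = ReducedNNLearner()
stream = [((0.1, 0.2), "neg"), ((0.9, 0.8), "pos"),
          ((0.2, 0.1), "neg"), ((0.85, 0.9), "pos")]
for features, label in stream:
    learner.train(features, label)
print(learner.predict((0.15, 0.15)), len(learner.memory))
```

Under these assumptions, correctly classified instances are never stored, so memory grows only near decision boundaries or around noisy examples, and the utilization filter keeps instances with poor records from influencing predictions.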