We propose an approach for automatically identifying relevant change using a classifier trained on user-identified examples of relevant change. If a user views exemplar regions of a pair of multi-temporal images and assesses whether or not a relevant change occurred, then we should be able to train a system to classify other regions of the image. In previous work, we developed a query-by-example (QBE) system for content-based image retrieval (CBIR) that could identify imagery in a database matching a given query image.
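As a minimal sketch of this idea (not the system described here), the following assumes a user has labeled exemplar region pairs from two co-registered multi-temporal images as relevant change or not; a classifier is trained on simple hand-crafted per-region difference features and then applied to the remaining regions. The feature set and the random-forest model are illustrative assumptions only.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier

def region_features(region_t1, region_t2):
    """Simple hand-crafted features for a pair of co-registered regions (assumed feature set)."""
    diff = region_t2.astype(float) - region_t1.astype(float)
    return np.array([diff.mean(), diff.std(), np.abs(diff).mean(), np.abs(diff).max()])

def train_relevance_classifier(labeled_pairs):
    """labeled_pairs: iterable of (region_t1, region_t2, is_relevant_change)."""
    X = np.array([region_features(r1, r2) for r1, r2, _ in labeled_pairs])
    y = np.array([label for _, _, label in labeled_pairs])
    clf = RandomForestClassifier(n_estimators=100, random_state=0)
    clf.fit(X, y)
    return clf

def classify_regions(clf, region_pairs):
    """Apply the trained classifier to unlabeled region pairs."""
    X = np.array([region_features(r1, r2) for r1, r2 in region_pairs])
    return clf.predict(X)

if __name__ == "__main__":
    # Synthetic 32x32 exemplar regions standing in for user-labeled examples.
    rng = np.random.default_rng(0)
    labeled = []
    for _ in range(20):  # exemplars with no relevant change (sensor noise only)
        r = rng.random((32, 32))
        labeled.append((r, r + rng.normal(0, 0.01, r.shape), 0))
    for _ in range(20):  # exemplars with a relevant change (brightness shift)
        r = rng.random((32, 32))
        labeled.append((r, r + 0.5, 1))
    clf = train_relevance_classifier(labeled)
    unlabeled = [(rng.random((32, 32)), rng.random((32, 32)) + 0.5)]
    print(classify_regions(clf, unlabeled))  # expected to print [1]
```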
Barb and Kilicay-Ergin developed semantic models using genetic optimization of low-level image features. Other examples of applying data mining algorithms to remote sensing imagery include mining temporal-spatial information and using association rules to extract information from the gaze patterns of individuals viewing satellite imagery.