In addition, we present visual results that illustrate the performance difference between our system and Google image search. Fig. 17 shows such results for two example search goals: finding images containing “a red apple below green leaves” and finding images containing “a BMW on green grass”. Our results are obtained from the textual queries “apple” and “BMW” together with the color maps shown on the left of Figs. 17(c) and 17(d). The Google image search results are the best obtained over several trials using its similar-image and search-by-color features with the same textual queries, “apple” and “BMW”; the results in Figs. 17(a) and 17(b) use the first returned image as the seed for Google's similar-image feature. Furthermore, we present results produced by a distinctive feature of our system: the user can edit the selected image to express the desired appearance. These results are shown in Fig. 18. Figs. 18(a), 18(b), and 18(c) show, respectively, the results when the user expects a visual appearance similar to the selected image, a blue sky, and the color pink at the top of the image.
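To make the query form used in these examples concrete, the sketch below shows one way a textual keyword combined with a coarse spatial color map could be represented and scored against candidate images. This is a minimal illustration, not the system's actual implementation: the names (ColorMapQuery, color_layout, score_candidate), the grid resolution, and the weighting between text and color similarity are all assumptions.

```python
# Hypothetical sketch of a keyword-plus-color-map query; not the paper's actual method.
from dataclasses import dataclass
import numpy as np

@dataclass
class ColorMapQuery:
    keyword: str              # textual query, e.g. "apple" or "BMW"
    color_map: np.ndarray     # coarse RGB layout, shape (rows, cols, 3), values in [0, 1]

def color_layout(image: np.ndarray, rows: int = 4, cols: int = 4) -> np.ndarray:
    """Downsample an RGB image (H, W, 3) to a coarse rows x cols color layout."""
    h, w, _ = image.shape
    layout = np.zeros((rows, cols, 3))
    for r in range(rows):
        for c in range(cols):
            block = image[r * h // rows:(r + 1) * h // rows,
                          c * w // cols:(c + 1) * w // cols]
            layout[r, c] = block.reshape(-1, 3).mean(axis=0)
    return layout

def score_candidate(query: ColorMapQuery, image: np.ndarray, tags: set,
                    alpha: float = 0.5) -> float:
    """Combine keyword match with color-layout similarity (higher is better)."""
    text_score = 1.0 if query.keyword.lower() in tags else 0.0
    layout = color_layout(image, *query.color_map.shape[:2])
    color_score = 1.0 - np.abs(layout - query.color_map).mean()
    return alpha * text_score + (1.0 - alpha) * color_score
```

Under this representation, the “red apple below green leaves” goal would correspond to the keyword “apple” with a color map placing green in the upper cells and red beneath them, matching the layout shown in Fig. 17(c).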