The demonstration shows the flow of the mobile application. At the beginning, the LugaReco application queries the server for the places around the user, up to the horizon, and displays them on a map. Places that are not visible from the user's position are highlighted in red, whereas visible ones are shown in green. Whenever the user moves beyond a certain distance, the set of places is updated.
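The distance-triggered update could be sketched as follows. This is an illustrative client-side check, not the paper's implementation: the threshold value and function names are assumptions, and only the standard haversine formula is used.

```python
import math

# Assumed re-query distance, in metres (illustrative value, not from the paper).
REFRESH_THRESHOLD_M = 100.0

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance between two WGS84 points, in metres."""
    r = 6_371_000.0  # mean Earth radius
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

def needs_refresh(last_query_pos, current_pos):
    """True once the user has moved far enough to re-fetch nearby places."""
    return haversine_m(*last_query_pos, *current_pos) >= REFRESH_THRESHOLD_M
```

With a 100 m threshold, a move of roughly one hundredth of a degree of latitude (about 1.1 km) would trigger a refresh, while a few metres of GPS jitter would not.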
At any point, the users can ask for a recommendation. To do this, they take a picture of whatever interests them and rate the photo using a 5-star bar (see figure 4). This information is sent to the server, which tries to recognize the point of interest; if recognition succeeds, the rating is forwarded to our recommendation engine. Whether the place is recognized or not, the core subsystem then requests recommendations for the user, which are sorted by similarity to the supplied image using Isk-Daemon. Once this process has been carried out, the users are presented with a list of recommendations in which they can view the images associated with the recommended locations. They can then return to the map view, where the recommended places are now shown.
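The similarity-based ordering step can be sketched as below. The similarity scores would come from Isk-Daemon's image index; since the daemon's actual API is not detailed in the text, the scores are simply passed in, and all names here are hypothetical.

```python
# Illustrative sketch of ordering candidate recommendations by visual
# similarity to the user's photo (scores assumed to come from Isk-Daemon).

def rank_recommendations(candidates, similarity):
    """Sort candidate places by descending similarity to the query image.

    candidates: iterable of place identifiers
    similarity: mapping of place identifier -> similarity score in [0, 1]
    """
    # Unknown places default to 0.0, so they sort to the end of the list.
    return sorted(candidates, key=lambda p: similarity.get(p, 0.0), reverse=True)

places = ["cathedral", "roman_wall", "bridge"]
scores = {"cathedral": 0.42, "roman_wall": 0.91, "bridge": 0.15}
print(rank_recommendations(places, scores))  # -> ['roman_wall', 'cathedral', 'bridge']
```

The key design point is that ranking is independent of whether recognition succeeded: even an unrecognized photo can still be compared against the indexed images of candidate places.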
Following this, LugaReco queries Google for driving routes to the recommended places, which are drawn on the map. From the map view, the users can switch to the augmented-reality layer by rotating the smartphone to landscape mode.
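The route lookup could be issued against the public Google Directions web service, roughly as sketched below. This is a hedged sketch only: the paper does not specify how LugaReco calls Google, and the API key and coordinates are placeholders.

```python
from urllib.parse import urlencode

# Public Google Directions web-service endpoint; whether LugaReco uses
# this endpoint or a map-SDK call is an assumption, not stated in the text.
DIRECTIONS_ENDPOINT = "https://maps.googleapis.com/maps/api/directions/json"

def directions_url(origin, destination, api_key):
    """Build a Directions API request URL for a driving route.

    origin, destination: (latitude, longitude) pairs
    api_key: placeholder credential for the web service
    """
    params = {
        "origin": f"{origin[0]},{origin[1]}",
        "destination": f"{destination[0]},{destination[1]}",
        "mode": "driving",
        "key": api_key,
    }
    return f"{DIRECTIONS_ENDPOINT}?{urlencode(params)}"
```

The JSON response contains route legs whose polylines can then be painted onto the map view alongside the recommended places.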
In this demonstration, we will show how to register a point of interest with its associated images in our system. We will register enough suitable locations, at varying distances, to show how the visibility filtering works. We will then examine the results from the recommendation system and check whether or not the images have been recognized.