2. The System
To validate the concept, the prototype was developed on a netbook with a standard WiFi webcam. It was tested at the Institute of Engineering (ISE) of the University of the Algarve (ISE/UAlg). The user holds the camera in one hand, whereas in the SmartVision [2] and Blavigator [5] prototypes the camera is worn at the chest. However, these prototypes are also being implemented on a smartphone, so that its camera can easily be pointed in different directions. If the user's location is uncertain because visual landmarks are missing or ambiguous, the user can point the camera in different directions by rotating it 180° horizontally and about 45° vertically. Any detected landmarks are matched against those stored in the GIS for each space. By combining the positions of the detected landmarks with the traced route, the system informs the user of the current location through a speech module. Finally, since the GIS/landmark system must be integrated into the Blavigator prototype, all visual functions have been optimised for low CPU time and memory usage, and most of them are based on a few basic algorithms that are executed only once per video frame.
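The paper does not detail the matching step, but a minimal sketch of the idea is given below: spaces in the GIS are ranked by how many of their registered landmarks were detected during the camera sweep, with ties broken in favour of spaces on the traced route. All names and data structures here are hypothetical illustrations, not the prototype's actual code.

```python
# Minimal sketch of landmark-based localisation, assuming each GIS "space"
# carries a set of landmark identifiers. All names are hypothetical; the
# prototype's actual data structures are not described in the paper.

def locate_user(detected_landmarks, gis_spaces, route_spaces):
    """Rank candidate spaces by the number of detected landmarks they contain.

    detected_landmarks: set of landmark ids seen while sweeping the camera.
    gis_spaces: dict mapping space name -> set of landmark ids in the GIS.
    route_spaces: spaces along the traced route, used to break ties.
    """
    best_space, best_score = None, (-1, -1)
    for space, landmarks in gis_spaces.items():
        matches = len(detected_landmarks & landmarks)
        on_route = 1 if space in route_spaces else 0
        if (matches, on_route) > best_score:
            best_space, best_score = space, (matches, on_route)
    return best_space if best_score[0] > 0 else None  # None: still uncertain


# Example: two landmarks recognised during the horizontal camera sweep.
spaces = {"corridor_A": {"door_12", "sign_fire"}, "hall": {"sign_fire", "stairs"}}
print(locate_user({"door_12", "sign_fire"}, spaces, ["hall", "corridor_A"]))
# -> corridor_A (two matching landmarks, and it lies on the traced route)
```

The resulting space name would then be handed to the speech module as the current location estimate.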
The system component that handles spatial data was developed using the OGR Simple Features Library, which is part of GDAL, the Geospatial Data Abstraction Library [11]. For details about the GIS of ISE/UAlg (the testing site) see [5]. Figure 1 shows a map from the ISE/UAlg GIS (left) and the respective data table (middle), with the attributes of the selected region highlighted in yellow.
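As an illustration of how such spatial data can be accessed through OGR, the sketch below uses GDAL's standard Python bindings to open a shapefile and read the attributes of its landmark features. The file name and the field names "name" and "type" are assumptions for the example; the actual schema of the ISE/UAlg GIS is described in [5].

```python
# Sketch of reading landmark features with the OGR Simple Features Library
# (GDAL's Python bindings). The shapefile path and the "name" and "type"
# fields are hypothetical; the real ISE/UAlg GIS schema is given in [5].
from osgeo import ogr

ds = ogr.Open("ise_ualg_landmarks.shp")        # open the data source read-only
layer = ds.GetLayer(0)                         # first layer of the shapefile
layer.SetAttributeFilter("type = 'landmark'")  # keep only landmark features

for feature in layer:                          # iterate over filtered features
    geom = feature.GetGeometryRef()            # point geometry of the landmark
    print(feature.GetField("name"), geom.GetX(), geom.GetY())
```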