3.3. A case study of AR-QAS
An advertising flyer (see Fig. 6) for a Hakka museum was
designed for this study. The users pointed their cell phones at
the flyer and scanned its image. A virtual tour guide
then appeared on their phones. The users asked questions in natural
language, and the mobile virtual tour guide, which was also capable
of simple body movements and facial expressions,
used natural language to provide answers. The system automatically
performed voice recognition through the QAS. The
queries were sent through the web service client to the application
server, which processed them using rule-based or case-oriented models
and returned the results to the user. Thus, the users
experienced AR by asking questions of a virtual tour guide at a
museum and receiving immediate answers (see Figs. 7 and 8).
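To make the query round trip concrete, the following minimal sketch (in Python; the paper does not publish its implementation, so the function names, rule set, and case base here are invented for illustration) shows how an application server of this kind could first attempt rule-based matching on the recognized question text and then fall back to a case-oriented lookup over previously answered questions before returning a result to the client.

```python
# Illustrative sketch of the AR-QAS answer-selection step, not the
# authors' code. Rule patterns, case-base entries, and answers are
# placeholder assumptions.
import difflib
import re

# Rule-based model: regex pattern -> canned answer.
RULES = {
    r"\bwhat is lei cha\b": "Lei cha (pounded tea) is a traditional Hakka beverage.",
}

# Case-oriented model: previously answered questions and their answers.
CASE_BASE = {
    "how do hakka people prepare lei cha": "It is ground from tea leaves, nuts, and seeds.",
}

def answer(question: str) -> str:
    """Return an answer for the recognized question text."""
    q = question.lower().strip("?! .")
    # 1. Rule-based matching: fire the first rule whose pattern matches.
    for pattern, reply in RULES.items():
        if re.search(pattern, q):
            return reply
    # 2. Case-oriented matching: reuse the answer of the most similar past case.
    match = difflib.get_close_matches(q, list(CASE_BASE.keys()), n=1, cutoff=0.5)
    if match:
        return CASE_BASE[match[0]]
    return "Sorry, I do not have an answer to that question yet."

if __name__ == "__main__":
    print(answer("What is lei cha (pounded tea)?"))
```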
The Hakka museum was chosen as a topic of research because
the Hakka are a subgroup of the Han Chinese and possess a unique
culture. Users accessed the AR-QAS to receive relevant information
about Hakka culture. For example, if a user asks, “What is lei cha
(pounded tea)?”, the QAS searches for a corresponding answer
based on the user's speech and responds, “Lei cha is a Hakka