Abstract—Nutrition-related diseases have become a major threat to human health and pose great challenges to medical care. A crucial step toward addressing these problems is to monitor a person's daily food intake precisely and conveniently. For this purpose, we present AutoDietary, a wearable system that monitors and recognizes food intake in daily life. An embedded hardware prototype is developed to collect food intake sensor data; its key component is a high-fidelity microphone worn on the subject's neck, which records acoustic signals during eating precisely and noninvasively. The acoustic data are preprocessed and then sent to a smartphone via Bluetooth, where food types are recognized.
In particular, we use hidden Markov models to identify chewing and swallowing events, from which time/frequency-domain and nonlinear features are extracted. A lightweight decision-tree-based algorithm is then adopted to recognize the food type. We also develop a smartphone application that aggregates the food intake recognition results in a user-friendly way and provides suggestions for healthier eating, such as improved eating habits and better nutritional balance. Experiments show that AutoDietary recognizes food types with 84.9% accuracy and classifies liquid and solid food intake with accuracies of up to 97.6% and 99.7%, respectively. To evaluate real-life user experience, we conducted a survey that collected ratings from 53 participants on the wearing comfort and functionality of AutoDietary. The results show that the current design is acceptable to most users.
Index Terms—Food intake recognition, wearable sensor,
acoustic signal processing, embedded system.
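
To make the recognition pipeline summarized above concrete, the following is a minimal sketch, not the authors' implementation, of decision-tree classification over simple acoustic features of detected intake events; the sampling rate, feature choices, parameters, and synthetic data are all illustrative assumptions.

    # Minimal illustrative sketch (not the AutoDietary implementation): classify
    # food-intake acoustic segments with a decision tree over simple
    # time/frequency-domain features. All signals below are synthetic.
    import numpy as np
    from sklearn.model_selection import train_test_split
    from sklearn.tree import DecisionTreeClassifier

    rng = np.random.default_rng(0)
    FS = 8000  # assumed sampling rate in Hz

    def extract_features(segment):
        """Illustrative features for one detected chewing/swallowing segment."""
        energy = float(np.mean(segment ** 2))                         # short-time energy
        zcr = float(np.mean(np.abs(np.diff(np.sign(segment)))) / 2)   # zero-crossing rate
        spectrum = np.abs(np.fft.rfft(segment))
        freqs = np.fft.rfftfreq(segment.size, d=1.0 / FS)
        centroid = float(np.sum(freqs * spectrum) / (np.sum(spectrum) + 1e-12))  # spectral centroid
        return [energy, zcr, centroid]

    def synth_segment(label):
        """Synthetic 0.5 s segment; class 1 ('solid') is louder than class 0 ('liquid')."""
        n = FS // 2
        scale = 1.0 if label == 1 else 0.3
        return rng.normal(scale=scale, size=n) * np.hanning(n)

    labels = rng.integers(0, 2, size=400)
    X = np.array([extract_features(synth_segment(int(y))) for y in labels])
    X_tr, X_te, y_tr, y_te = train_test_split(X, labels, test_size=0.25, random_state=0)

    clf = DecisionTreeClassifier(max_depth=4, random_state=0).fit(X_tr, y_tr)
    print("held-out accuracy on synthetic data:", clf.score(X_te, y_te))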