Wearable sensor tracks calories by recognising chewing sounds

By Will Chu


AutoDietary is a necklace equipped with a microphone placed against the throat that listens to the sounds of eating. This data is sent to the app and matched with the correct food. (© University at Buffalo)
Scientists have developed a wearable system that monitors calorie intake by recognising the chewing and swallowing sounds of food, a research paper has revealed.  

The hands-free technology provides an automated way of continuously measuring daily calorie intake. Current solutions rely on users’ self-reports, which are neither convenient nor precise, since food intake varies widely and the energy content of different foods differs significantly.

AutoDietary is a wearable system that recognises food types by monitoring the eating process. The system is composed of two main parts – an embedded hardware system and a smartphone application.

An acoustic sensor worn around the neck detects the sound signals produced by chewing and swallowing. The data are then transmitted via Bluetooth to a smartphone, where food categories are recognised by cross-referencing an extensive database of sounds.
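The paper’s actual classifier is not reproduced in this article. As a purely illustrative Python sketch of the cross-referencing step described above, the snippet below matches a feature vector from one detected eating event against a labelled library of reference sounds – the feature values, the REFERENCE_DB contents and the nearest-neighbour rule are hypothetical stand-ins, not the authors’ method.

# Hypothetical sketch of the matching step described above. It assumes
# feature vectors have already been extracted from the audio received over
# Bluetooth; the real system's features and classifier are not detailed here.
import numpy as np

# Hypothetical reference database: one averaged feature vector per food type,
# built from previously labelled chewing and swallowing recordings.
REFERENCE_DB = {
    "apple":        np.array([0.82, 0.31, 0.55]),
    "carrot":       np.array([0.91, 0.28, 0.62]),
    "potato chips": np.array([0.64, 0.44, 0.38]),
    "water":        np.array([0.12, 0.05, 0.09]),
}

def recognise(sample_features):
    """Return the food whose reference vector is closest to the sample's."""
    return min(REFERENCE_DB,
               key=lambda food: np.linalg.norm(sample_features - REFERENCE_DB[food]))

print(recognise(np.array([0.85, 0.30, 0.58])))  # prints "apple"

In the published system this recognition runs on the smartphone after the Bluetooth transfer; the sketch simply shows the database look-up idea in miniature.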

Mobile mastication

The team from the University at Buffalo also developed a mobile application, which collects the food sounds and presents the information in a user-friendly manner, offering suggestions on healthier eating, such as chewing more slowly or drinking more often.

“Food types can be distinguished from chewing information, since the energy exerted in mastication basically depends on the structural and the textural properties of the food material and can be extracted from the chewing sound,” the study noted.
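The article does not spell out how that chewing energy is computed. One plausible reading – a short-time energy envelope of the recorded clip – is sketched below; the frame length and sample rate are assumed values, not figures from the study.

# Illustrative only: summarising a chewing clip by its short-time energy.
# Crisp foods such as carrots would be expected to produce higher, spikier
# values than softer foods; the study's real feature set is not given here.
import numpy as np

def short_time_energy(signal, frame_len=1024):
    """Return the per-frame energy of a mono audio signal."""
    signal = np.asarray(signal, dtype=float)
    n_frames = len(signal) // frame_len
    frames = signal[:n_frames * frame_len].reshape(n_frames, frame_len)
    return np.sum(frames ** 2, axis=1)

# Example with a random stand-in for one second of 16 kHz chewing audio
clip = np.random.randn(16000)
envelope = short_time_energy(clip)
print(f"frames: {len(envelope)}, peak energy: {envelope.max():.1f}")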

To evaluate the system’s effectiveness, the team gathered 12 subjects (five women and seven men, aged between 13 and 49) and asked them to eat seven different types of food: apples, carrots, cookies, potato chips, walnuts, peanuts and water.

In total, 171 samplings comprising 4,047 'events' – 54 bite events, 3,433 chewing events and 560 swallowing events – were collected.

Subjects were asked to eat seven different types of food, including apples, carrots, cookies, potato chips, walnuts and peanuts. (© iStock.com)

Results indicated that event detection accuracy (identifying chewing and swallowing events) for each of the 12 subjects fell within 80-90%, with the lowest observed accuracy at 81.7%. Overall accuracy was calculated at 86.6%.

When the system was tested for its accuracy in recognising food types, figures ranged from 81.5% to 90.1%. Interestingly, the researchers noted that subjects with a small body mass index (BMI) showed lower accuracy.

They believed this was because the throat microphone did not sit snugly against these subjects’ skin, compromising the quality of the sampled signals.

Each food was also picked up by the system’s microphone and recognised. The team recorded chewing events for apple (579), carrot (607), potato chips (407), cookies (592), peanut (221) and walnut (368), as well as swallowing events for water (221).

“Our method achieves an average recognition accuracy of 84.9%. This result confirms that the recognition algorithm performed sufficiently well to be used in the exhibition,” said the authors.

“Specifically, recognition precision of peanut is the lowest (0.755). This implies that sounds of chewing peanut are similar to chewing other solid food. A possible reason is, for solid food, varying physical properties directly impact the sounds of mastication, and consequently influence food type recognition.”

Emerging technologies

Recognising food types through auditory monitoring of food intake has attracted great interest in recent years.

Among the latest technologies to be tested are radio frequency identification (RFID) tags on food packages, which detect and distinguish how and what people eat.

One study used video analysis to record eating behaviour of elderly residents of a retirement home. (© iStock.com/Diego Cervo)

While this approach is still under development, several limitations have been noted, one of which is cost: it is expensive to deploy and requires extra effort to attach RFID tags to every food package available.

Video fluoroscopy and electromyography (EMG) are considered the gold standard in studies of food intake and ingestion.

The disadvantage of video fluoroscopy is the bulky equipment it requires. EMG is also invasive, as the placement of electrodes can interfere with the muscles of the neck.

One study used video analysis to record the eating behaviour of elderly residents of a retirement home. Typical food intake movements were tracked in video recordings and meal duration was logged for every resident. This solution was, however, restricted to the locations where the cameras were installed.

Source: IEEE Sensors Journal

Published online, DOI: 10.1109/JSEN.2015.2469095

“AutoDietary: A Wearable Acoustic Sensor System for Food Intake Recognition in Daily Life”

Authors: Yin Bi, Mingsong Lv, Chen Song, Wenyao Xu, Nan Guan and Wang Yi.
