Paper in ACM IUI15: “Inferring Meal Eating Activities in Real-World Settings from Ambient Sounds: A Feasibility Study”


Edison Thomaz, Cheng Zhang, Irfan Essa, Gregory Abowd

Best Paper Award

In: ACM Conference on Intelligent User Interfaces (IUI), 2015.

Tags: ACM, activity recognition, AI, awards, behavioral imaging, best paper award, computational health, IUI, machine learning



Dietary self-monitoring has been shown to be an effective method for weight loss, but it remains an onerous task despite recent advances in food journaling systems. Semi-automated food journaling can reduce the effort of logging, but it often requires that eating activities be detected automatically. In this work, we describe results from a feasibility study conducted in the wild, where eating activities were inferred from ambient sounds captured with a wrist-mounted device; twenty participants wore the device for one day, for an average of 5 hours, while performing normal everyday activities. Our system identified meal eating with an F-score of 79.8% in a person-dependent evaluation and with 86.6% accuracy in a person-independent evaluation. Our approach is intended to be practical, leveraging off-the-shelf devices with audio sensing capabilities, in contrast to systems for automated dietary assessment based on specialized sensors.
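For readers unfamiliar with the F-score metric reported in the abstract, a minimal sketch of its computation is below. The true/false positive and false negative counts are hypothetical, chosen only to illustrate the calculation; they are not taken from the study.

```python
def f_score(tp: int, fp: int, fn: int) -> float:
    """F1 score: harmonic mean of precision and recall.

    tp, fp, fn are counts of true positives, false positives,
    and false negatives from a binary detection task
    (here: meal-eating vs. non-eating segments).
    """
    precision = tp / (tp + fp)
    recall = tp / (tp + fn)
    return 2 * precision * recall / (precision + recall)

# Hypothetical counts for illustration only.
print(round(f_score(tp=80, fp=25, fn=15), 3))  # → 0.8
```

Note that the F-score balances precision and recall, which is why papers on activity detection often report it instead of (or alongside) raw accuracy: with imbalanced data, such as rare eating episodes in a full day of audio, accuracy alone can look high even for a trivial detector.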
