BodyScope: a wearable acoustic sensor for activity recognition

By Koji Yatani and Khai N. Truong

Abstract

Accurate activity recognition enables the development of a variety of ubiquitous computing applications, such as context-aware systems, lifelogging, and personal health systems. Wearable sensing technologies can be used to gather data for activity recognition without requiring sensors to be installed in the infrastructure. However, the user may need to wear multiple sensors for accurate recognition of a larger number of different activities. We developed a wearable acoustic sensor, called BodyScope, to record the sounds produced in the user’s throat area and classify them into user activities, such as eating, drinking, speaking, laughing, and coughing. The F-measure of the Support Vector Machine classification of 12 activities using only our BodyScope sensor was 79.5%. We also conducted a small-scale in-the-wild study, and found that BodyScope was able to identify four activities (eating, drinking, speaking, and laughing) at 71.5% accuracy.

Keywords: Activity recognition, wearable sensor, acoustic sensor
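The reported F-measure is the harmonic mean of precision and recall, F = 2PR / (P + R), averaged over the activity classes. As a rough illustration of the classification step only, the sketch below trains an SVM on simple spectral features of short acoustic segments; the feature set, window length, sampling rate, and placeholder data are assumptions made for illustration and are not the authors' pipeline.

    # A minimal sketch (not the authors' implementation) of SVM-based
    # classification of short acoustic segments. Feature set, 1-second window,
    # sampling rate, and placeholder data are illustrative assumptions.
    import numpy as np
    from sklearn.svm import SVC
    from sklearn.model_selection import train_test_split
    from sklearn.metrics import f1_score

    RATE = 16_000      # assumed sampling rate (Hz)
    SEGMENT = RATE     # assumed 1-second analysis window

    def spectral_features(segment: np.ndarray) -> np.ndarray:
        """Crude spectral summary of one audio segment (illustrative only)."""
        spectrum = np.abs(np.fft.rfft(segment))
        freqs = np.fft.rfftfreq(segment.size, d=1.0 / RATE)
        energy = spectrum.sum() + 1e-12
        centroid = (freqs * spectrum).sum() / energy                          # spectral centroid
        rolloff = freqs[np.searchsorted(np.cumsum(spectrum), 0.85 * energy)]  # 85% roll-off
        zcr = np.mean(np.signbit(segment[:-1]) != np.signbit(segment[1:]))    # zero-crossing rate
        return np.array([np.log(energy), centroid, rolloff, zcr])

    # Placeholder data standing in for labelled throat-microphone recordings
    # (e.g. eating, drinking, speaking, laughing); real recordings would replace this.
    rng = np.random.default_rng(0)
    labels = rng.integers(0, 4, size=200)
    segments = rng.normal(size=(200, SEGMENT)) * (1.0 + labels[:, None])

    X = np.vstack([spectral_features(s) for s in segments])
    X_train, X_test, y_train, y_test = train_test_split(X, labels, random_state=0)

    clf = SVC(kernel="rbf").fit(X_train, y_train)
    print("macro F1:", f1_score(y_test, clf.predict(X_test), average="macro"))

The macro-averaged F1 printed here corresponds to the kind of per-class F-measure summarized in the abstract, computed over whatever activity classes the labels represent.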

Topics: Human Factors (General Terms)
Year: 2013
OAI identifier: oai:CiteSeerX.psu:10.1.1.352.8100
Provided by: CiteSeerX
Full text available at the following location(s):
  • http://citeseerx.ist.psu.edu/v... (external link)
  • http://www.yatani.jp/paper/Ubi... (external link)

