
Multi-Level Sensory Interpretation and Adaptation in a Mobile Cube

By Kristof Van Laerhoven, Nicolas Villar and Hans Gellersen


Signals from sensors are often analyzed in a sequence of steps, starting with the raw sensor data and eventually ending with a classification or abstraction of those data. This paper gives a practical example of how the same information can be trained and used to produce multiple interpretations of the same data at different, application-oriented levels. Crucially, the focus is on expanding embedded analysis software rather than adding more powerful, but possibly resource-hungry, sensors. Our illustration of this approach is a tangible input device in the shape of a cube that relies exclusively on low-cost accelerometers. The cube supports calibration with user supervision; it can tell which of its sides is on top, estimate its orientation relative to the user, and recognize basic gestures.
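The lowest interpretation level described above — telling which side of the cube is on top — can be sketched from a single 3-axis accelerometer reading: at rest, gravity dominates exactly one axis. This is a minimal illustration only; the face labels, units (g), and threshold are assumptions, not the paper's actual implementation.

```python
# Illustrative sketch (not the paper's algorithm): infer the top face of a
# cube from one 3-axis accelerometer sample, given in units of g.

def top_face(ax: float, ay: float, az: float, threshold: float = 0.7) -> str:
    """Return the face pointing up, or 'unknown' if the cube is tilted/moving.

    At rest, the axis pointing upward measures roughly +1 g; the sign of the
    dominant axis therefore identifies the top face.
    """
    readings = {"+x": ax, "-x": -ax, "+y": ay, "-y": -ay, "+z": az, "-z": -az}
    face, value = max(readings.items(), key=lambda kv: kv[1])
    # If no single axis clearly dominates, the reading is ambiguous.
    return face if value > threshold else "unknown"

print(top_face(0.02, -0.05, 0.98))  # cube resting flat: '+z'
```

Higher levels (orientation relative to the user, gesture recognition) would build on streams of such samples rather than a single reading, with training data supplied under user supervision as the abstract describes.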

Year: 2003
OAI identifier:
Provided by: Lancaster E-Prints

