A novel hand-held interface supporting the self-management of Type 1 diabetes
The paper describes the interaction design of a hand-held interface
supporting the self-management of Type 1 diabetes. It addresses
well-established clinical and human-computer interaction requirements.
The design exploits three opportunities. The first is associated with visible
context, whether conspicuous or inconspicuous. The second arises from the design
freedom afforded by the user's anticipated focus of attention during
certain interactions. The third, an opportunity to provide valuable functionality,
arises from wearable sensors and machine learning algorithms. The resulting
interface permits "What if?" questions: it allows a user to dynamically and
manually explore predicted short-term (e.g., 2 hours) relationships between an
intended meal, blood glucose level, and recommended insulin dosage, and thereby
readily make informed food and exercise decisions. Design activity has been
informed throughout by focus groups comprising people with Type 1 diabetes in
addition to experts in diabetes, interaction design and machine learning. The
design is being implemented prior to a clinical trial.
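The abstract does not specify the paper's machine-learning predictor, so the following is only a minimal illustrative sketch of the kind of "What if?" calculation such an interface could support, using standard bolus-calculator arithmetic and a naive linear glucose model. The function names and all parameter values (carbohydrate ratio, insulin sensitivity factor, carbohydrate impact, target glucose) are hypothetical placeholders, not values or methods from the paper.

```python
def recommended_bolus(current_bg, meal_carbs_g, target_bg=110.0,
                      isf=50.0, carb_ratio=12.0):
    """Textbook bolus-calculator arithmetic (illustrative values only).

    carb bolus  = grams of carbohydrate / carb_ratio (g per unit)
    correction  = (current BG - target BG) / insulin sensitivity factor,
                  never negative (no "negative insulin")
    """
    carb_bolus = meal_carbs_g / carb_ratio
    correction = max(0.0, (current_bg - target_bg) / isf)
    return carb_bolus + correction

def what_if(current_bg, meal_carbs_g, insulin_units,
            isf=50.0, carb_impact=4.0):
    """Naive linear prediction of blood glucose (mg/dL) ~2 hours ahead.

    Each gram of carbohydrate is assumed to raise BG by `carb_impact`
    mg/dL; each unit of insulin lowers it by `isf` mg/dL.
    """
    return current_bg + carb_impact * meal_carbs_g - isf * insulin_units
```

A user exploring an intended 60 g meal from a reading of 160 mg/dL could vary the dose and watch the prediction respond, e.g. `recommended_bolus(160.0, 60.0)` gives 6.0 units, and `what_if(160.0, 60.0, 6.0)` predicts 100 mg/dL.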