Time management is an important aspect of a successful
professional life. To better understand where our time goes,
we propose a system that summarizes the user's daily activity
(e.g., sleeping, walking, working on the computer, talking)
using all-day multimodal data recordings. Two main novelties
are proposed:
• A system that combines physical and contextual awareness
in both hardware and software, recording synchronized audio,
video, body-sensor, GPS, and computer-monitoring data.
• A semi-supervised temporal clustering (SSTC) algorithm
that accurately and efficiently groups large amounts
of multimodal data into different activities.
The effectiveness and accuracy of our SSTC are demonstrated
on synthetic and real examples of activity segmentation from
multimodal data gathered over long periods of time.
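To illustrate the general idea of semi-supervised temporal clustering (not the paper's actual SSTC algorithm), the following minimal sketch groups a sequence of feature vectors into contiguous activity segments using a few labeled "seed" frames; the feature values, seed labels, and smoothing window are illustrative assumptions.

```python
# Toy sketch of semi-supervised temporal clustering: a few labeled
# seed frames guide the grouping of an unlabeled multimodal feature
# sequence, and temporal smoothing keeps the activities contiguous.
# This is a hypothetical illustration, not the paper's SSTC method.

def segment(features, seeds, window=3):
    """features: list of feature vectors (one per time frame).
    seeds: list of (frame_index, activity_label) pairs supplying
    the semi-supervision. Returns one label per frame."""
    def dist(a, b):
        # Squared Euclidean distance between two feature vectors.
        return sum((x - y) ** 2 for x, y in zip(a, b))

    # 1) Assign each frame the label of its nearest seed frame.
    labels = []
    for f in features:
        idx, lab = min(seeds, key=lambda s: dist(f, features[s[0]]))
        labels.append(lab)

    # 2) Temporal smoothing: majority vote in a sliding window,
    #    which suppresses one-frame activity switches.
    smoothed = []
    for i in range(len(labels)):
        lo, hi = max(0, i - window), min(len(labels), i + window + 1)
        win = labels[lo:hi]
        smoothed.append(max(set(win), key=win.count))
    return smoothed


# Example: two activities with one noisy frame in the middle.
features = [[0.0], [0.1], [0.2], [5.0], [5.1], [0.15], [5.2], [5.3]]
seeds = [(0, "sleep"), (3, "walk")]
print(segment(features, seeds, window=1))
```

On this toy input the noisy frame at index 5 is first mislabeled by nearest-seed assignment and then corrected by the temporal smoothing step, yielding two clean segments.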