2 research outputs found

    Enabling Context-Awareness in Mobile Systems via Multi-Modal Sensing

    The inclusion of rich sensors on modern smartphones has transformed mobile phones from simple communication devices into powerful human-centric sensing platforms. Similar trends are influencing other personal gadgets such as tablets, cameras, and wearable devices like Google Glass. Together, these sensors can provide a high-resolution view of the user's context, ranging from simple information such as location and activity to high-level inferences about the user's intentions, behavior, and social interactions. Understanding such context can help solve existing system-side challenges and eventually enable a new world of real-life applications.

    In this thesis, we propose to learn human behavior via multi-modal sensing. The intuition is that human behaviors leave footprints along different sensing dimensions: visual, acoustic, motion, and cyberspace. By collaboratively analyzing these footprints, the system can obtain valuable insights about the user. We show that the analysis results can lead to a series of applications, including capturing life-logging videos, tagging user-generated photos, and enabling new ways for human-object interaction. Moreover, the same intuition may potentially be applied to enhance existing system-side functionalities such as offloading, prefetching, and compression.

    Dissertation

    ConBrowse - contextual content browsing
