5 research outputs found

    Challenges and Opportunities for Designing Tactile Codecs from Audio Codecs

    Haptic communication allows physical interaction over long distances and greatly complements conventional means of communication, such as audio and video. However, whilst standardized codecs for video and audio are well established, there is a lack of standardized codecs for haptics. This causes vendor lock-in, which greatly limits scalability, increases cost, and prevents advanced usage scenarios with multiple sensors/actuators and multiple users. The aim of this paper is to introduce a new approach for understanding and encoding tactile signals, i.e., the sense of touch, in haptic interactions. Inspired by various audio codecs, we develop a similar methodology for tactile codecs. Notably, we demonstrate that tactile and audio signals are similar in both the time and frequency domains, allowing audio coding techniques to be adapted to tactile codecs with appropriate adjustments. We also present the differences between audio and tactile signals that should be considered in future designs. Moreover, to evaluate the performance of a tactile codec, we propose a potential direction for designing an objective quality metric that complements haptic mean opinion scores (h-MOS). This, we hope, will open the door to designing and assessing tactile codecs.
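    The adaptation of audio coding techniques to tactile signals can be illustrated with a minimal transform-coding sketch: frame the vibrotactile signal, apply a DCT (as many audio codecs do), and uniformly quantize the coefficients. This is an illustrative sketch only, not the paper's codec; the function names, frame length, and quantization step are assumptions.

```python
import numpy as np

def dct_ii(x):
    # DCT-II of one frame, computed directly from its definition
    N = len(x)
    n = np.arange(N)
    return np.array([np.sum(x * np.cos(np.pi / N * (n + 0.5) * k))
                     for k in range(N)])

def idct_ii(X):
    # Inverse transform (DCT-III), with the DC term scaled by 1/N
    N = len(X)
    k = np.arange(N)
    scale = np.where(k == 0, 1.0, 2.0) / N
    return np.array([np.sum(scale * X * np.cos(np.pi / N * (n + 0.5) * k))
                     for n in range(N)])

def encode_frame(frame, step=0.05):
    # Uniform scalar quantization of the transform coefficients
    return np.round(dct_ii(frame) / step).astype(int)

def decode_frame(codes, step=0.05):
    return idct_ii(codes * step)

# A 64-sample frame of a synthetic 250 Hz vibration at 8 kHz sampling:
# tactile signals concentrate at low frequencies, so a short transform
# frame captures them much as it would narrowband audio.
frame = np.sin(2 * np.pi * 250 * np.arange(64) / 8000)
reconstructed = decode_frame(encode_frame(frame))
```

As in audio coding, the quantization step controls the rate/distortion trade-off; a perceptual tactile codec would vary it per coefficient according to touch sensitivity rather than keep it fixed as here.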

    Browse-to-search

    This demonstration presents a novel interactive online shopping application based on visual search technologies. When users want to buy something on a shopping site, they usually need related information from other websites, so they must switch between the page being browsed and the sites that provide search results. The proposed application enables users to search naturally for products of interest while they browse a web page, so that even a casual purchase intent is easily satisfied. The interactive shopping experience is characterized by: 1) in session - users specify their purchase intent within the browsing session, instead of leaving the current page and navigating to other websites; 2) in context - the browsed web page provides implicit context that helps infer user purchase preferences; 3) in focus - users specify their search interest with gestures on touch devices and do not need to formulate queries in a search box; and 4) natural gesture input combined with visual search provides a natural shopping experience. The system is evaluated against a data set of several million commercial product images. © 2012 Authors
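    The visual search step behind such a system can be sketched as descriptor-based ranking: the gesture-selected image region is mapped to a feature vector and compared against an index of product-image descriptors. The cosine-similarity choice and all names below are assumptions for illustration, not the demonstrated system's actual pipeline.

```python
import numpy as np

def visual_search(query_vec, index, top_k=3):
    """Rank indexed product-image descriptors by cosine similarity
    to the descriptor of the gesture-selected query region."""
    q = query_vec / np.linalg.norm(query_vec)
    db = index / np.linalg.norm(index, axis=1, keepdims=True)
    sims = db @ q                      # cosine similarity per row
    return np.argsort(-sims)[:top_k]   # best matches first

# Toy index of three 2-D descriptors (hypothetical data)
index = np.array([[1.0, 0.0],
                  [0.0, 1.0],
                  [0.6, 0.8]])
ranking = visual_search(np.array([0.6, 0.8]), index)
```

At the scale of millions of images, the exhaustive dot product here would be replaced by an approximate nearest-neighbour index, but the ranking principle is the same.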

    Haptic data reduction through dynamic perceptual analysis and event-based communication

    This research presents an adjustable and flexible framework for haptic data compression and communication that can be used in a robotic teleoperation session. The framework comprises a customized event-driven transmission control protocol, several dynamically adaptive perceptual and prediction methods for haptic sample reduction, and, last but not least, an architecture for the data flow.
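    A widely used perceptual method for haptic sample reduction of the kind mentioned above is the Weber-law deadband: a sample is transmitted as an event only when it differs from the last transmitted value by more than a fixed fraction of that value. The sketch below is illustrative; the threshold k, the eps floor, and the function name are assumptions, not the framework's actual parameters.

```python
def deadband_reduce(samples, k=0.1, eps=1e-3):
    """Keep only samples whose change exceeds a Weber-style perceptual
    threshold (fraction k of the last transmitted magnitude).
    Returns a list of (index, value) events to send over the network."""
    events = []
    last = None
    for i, x in enumerate(samples):
        # Always send the first sample; eps avoids a zero threshold
        if last is None or abs(x - last) > k * max(abs(last), eps):
            events.append((i, x))
            last = x
    return events

# Small changes are suppressed; only perceptually relevant ones are sent
events = deadband_reduce([1.0, 1.01, 1.05, 1.2, 1.21])
```

The receiver holds (or predicts) the last received value between events, which is what makes the event-driven transmission protocol above viable at high haptic sampling rates.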