
    Real-time adaptation for a better quality of experience in augmented reality

    In the framework of mobile augmented reality, a video stream is sent to the user over a wireless communication link. To guarantee efficient transmission, the video stream rate is controlled by adapting the encoding parameters so as to track a given bandwidth. The rate can be reduced by lowering the frame rate and/or by choosing a higher compression factor for the video stream. These parameter modifications affect both the level of detail and the fluidity perceived by the user, and thus his/her subjective appreciation. The experience perceived by the user also depends on the context: during a rapid head motion, fluidity matters more than for a fixed head position. We propose an end-to-end adaptation scheme which adapts the encoding parameters so as to provide the best experience for the user given the dynamic context. For example, when the user moves his/her head quickly, each frame is compressed more so that the frame rate can be increased, yielding a better perception of the motion. The lack of a direct measurement of the subjective user experience is addressed by designing objective metrics and a generic model that predicts the user's quality of experience in real time. A rate control strategy based on a systems approach is deployed to manage the multiple encoding parameters that control the stream rate. The encoder is modelled abstractly as a single-input linear system, where content variation is treated as a perturbation. A stable and efficient controller is designed for this abstract model of the encoder. To implement the designed controller, the combinations of real encoder parameters corresponding to the single input of the abstract model must be determined; a new one-pass algorithm determines this correspondence in real time using a mapping method.
    The proposed contextual adaptation then selects the encoding parameter combination that maximizes the quality of experience according to an appropriate model. Finally, the global adaptation scheme combines the rate control, the mapping method and the contextual adaptation into a real-time implementation. Simulations and experiments illustrate the approach, and the global adaptation scheme is validated in different scenarios.
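The adaptation loop described in this abstract can be sketched in a few lines of Python. This is a toy illustration, not the thesis's actual controller: the PI gains, the candidate parameter set, the rate model and the QoE weighting are all invented for the example. It only shows the structure of the scheme, where a controller regulates the rate of an encoder seen as a single-input system, and a contextual step picks the (frame rate, quality) combination under the rate budget that maximizes a QoE model weighting fluidity more heavily during fast head motion.

```python
def pi_controller(target_rate, measured_rate, integral, kp=0.5, ki=0.1):
    """One step of a PI rate controller; returns (control, new_integral).
    Gains kp and ki are illustrative, not taken from the thesis."""
    error = target_rate - measured_rate
    integral += error
    return kp * error + ki * integral, integral

# Hypothetical candidate encoder settings: (frames per second, quality 0..1).
CANDIDATES = [(10, 0.9), (15, 0.7), (20, 0.5), (30, 0.3)]

def stream_rate(fps, quality, bits_per_frame=100_000):
    """Toy rate model: the stream rate grows with frame rate and quality."""
    return fps * quality * bits_per_frame

def qoe(fps, quality, head_speed):
    """Toy QoE model: fast head motion weights fluidity (fps) over detail."""
    w = min(head_speed, 1.0)          # 0 = head still, 1 = fast motion
    return w * (fps / 30) + (1 - w) * quality

def select_parameters(rate_budget, head_speed):
    """Contextual step: best-QoE candidate whose rate fits the budget."""
    feasible = [c for c in CANDIDATES if stream_rate(*c) <= rate_budget]
    return max(feasible, key=lambda c: qoe(*c, head_speed))

# During fast motion the scheme favours a high frame rate at low quality;
# with the head still, it favours detail at a low frame rate.
print(select_parameters(rate_budget=1_500_000, head_speed=1.0))
print(select_parameters(rate_budget=1_500_000, head_speed=0.0))
```

In the actual scheme, the controller output would be fed through the one-pass mapping algorithm to find the matching real-encoder parameter combination; here the exhaustive search over `CANDIDATES` stands in for that mapping.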

    Visual based finger interactions for mobile phones

    Vision-based technology such as motion detection has long been limited to powerful, processor-intensive systems such as desktop PCs and specialist hardware. With the advent of much faster mobile phone processors and larger memory, a plethora of feature-rich software and hardware is being deployed on the mobile platform, most notably on high-powered devices called smart phones. Interaction interfaces such as touchscreens improve usability but obscure the phone's screen. Since the majority of smart phones are equipped with cameras, it has become feasible to combine their powerful processors, large memory capacity and camera to support new ways of interacting with the phone that do not obscure the screen. However, it is not clear whether these processor-intensive visual interactions can run at an acceptable speed on current mobile handsets, or whether they offer the user a better experience than the number pad and direction keys present on the majority of mobile phones. A vision-based finger interaction technique is proposed which uses the back-of-device camera to track the user's finger. This allows the user to interact with the mobile phone through mouse-style movements, gestures and steering-based interactions. A simple colour thresholding algorithm was implemented in Java, Python and C++. Benchmarks and tests conducted on a Nokia N95 smart phone revealed that, on current hardware and with current programming environments, only native C++ yields speeds viable for real-time interaction (a key requirement for vision-based interactions). It is also shown that different lighting levels and background environments affect the accuracy of the system, with the contrast between background and finger playing a large role.
    Finally, a user study was conducted to compare overall user satisfaction between keypad interactions and the finger interaction techniques, concluding that the new finger interaction technique is well suited to steering-based interactions and, in time, mouse-style movements. Simple navigation remains better suited to the directional keypad.
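The "simple colour thresholding algorithm" mentioned above can be sketched as follows. This is a minimal illustration under assumed conditions, not the thesis's implementation: the skin-colour thresholds and the frame representation (nested lists of RGB tuples) are invented for the example. Pixels passing the threshold are segmented and their centroid is taken as the fingertip position.

```python
def is_skin(r, g, b):
    """Assumed skin-colour test: red-dominant pixels in RGB space.
    Thresholds are illustrative; real ones depend on lighting."""
    return (r > 95 and g > 40 and b > 20
            and r > g and r > b
            and (r - min(g, b)) > 15)

def track_finger(frame):
    """Return the (x, y) centroid of skin-coloured pixels, or None.

    `frame` is a list of rows, each row a list of (r, g, b) tuples,
    standing in for a camera frame.
    """
    xs, ys = [], []
    for y, row in enumerate(frame):
        for x, (r, g, b) in enumerate(row):
            if is_skin(r, g, b):
                xs.append(x)
                ys.append(y)
    if not xs:
        return None
    return sum(xs) / len(xs), sum(ys) / len(ys)

# A 3x3 test frame with one skin-coloured pixel in the centre:
bg = (30, 30, 30)
skin = (200, 120, 80)
frame = [[bg, bg, bg], [bg, skin, bg], [bg, bg, bg]]
print(track_finger(frame))  # centre of the skin-coloured region
```

Because the per-pixel test and centroid are O(width x height) with trivial arithmetic, this kind of loop is exactly the workload where, as the abstract reports, native C++ outran the Java and Python implementations on the N95; the finding about background/finger contrast corresponds to how easily `is_skin` separates the two colour distributions.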