1,343 research outputs found

    Visualising Bluetooth interactions: combining the Arc Diagram and DocuBurst techniques

    Within the Bluetooth mobile space, overwhelmingly large sets of interaction and encounter data can be accumulated very quickly. This presents a challenge to gaining an overview and understanding of the dataset as a whole. To overcome this problem, we have designed a visualisation which provides an informative overview of the dataset. The visualisation combines the existing Arc Diagram and DocuBurst techniques into a radial space-filling layout capable of conveying a rich understanding of Bluetooth interaction data, and clearly represents the social networks and relationships established among encountered devices. The end result enables a user to visually interpret the relative importance of individual devices encountered, the relationships established between them, and the usage of Bluetooth 'friendly names' (or device labels) within the data.
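
    A minimal sketch of the general idea, assuming invented device names, encounter counts and relationships (an illustration in Python/matplotlib, not the authors' implementation): wedges in a radial space-filling layout are sized by encounter frequency, and chords between related devices play the role of the arcs in an Arc Diagram.

```python
# Hypothetical illustration: radial space-filling layout of Bluetooth encounters.
# Device names, encounter counts and relationships are invented examples.
import math
import matplotlib.pyplot as plt
from matplotlib.patches import Wedge

encounters = {"phone-A": 40, "laptop-B": 25, "headset-C": 20, "phone-D": 15}
relationships = [("phone-A", "laptop-B"), ("phone-A", "headset-C"), ("laptop-B", "phone-D")]

total = sum(encounters.values())
fig, ax = plt.subplots(figsize=(6, 6))
angles = {}                                   # mid-angle (radians) of each device's wedge
start = 0.0
for name, count in encounters.items():
    extent = 360.0 * count / total            # wedge size ~ relative importance of the device
    ax.add_patch(Wedge((0, 0), 1.0, start, start + extent, width=0.25))
    angles[name] = math.radians(start + extent / 2)
    ax.text(0.85 * math.cos(angles[name]), 0.85 * math.sin(angles[name]),
            name, ha="center", va="center", fontsize=8)
    start += extent

# Chords between related devices: an Arc Diagram bent around the circle.
for a, b in relationships:
    ax.plot([0.75 * math.cos(angles[a]), 0.75 * math.cos(angles[b])],
            [0.75 * math.sin(angles[a]), 0.75 * math.sin(angles[b])], lw=1)

ax.set_xlim(-1.1, 1.1)
ax.set_ylim(-1.1, 1.1)
ax.set_aspect("equal")
ax.axis("off")
plt.show()
```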

    Deep Learning: Our Miraculous Year 1990-1991

    In 2020, we will celebrate that many of the basic ideas behind the deep learning revolution were published three decades ago, within fewer than 12 months, in our "Annus Mirabilis" or "Miraculous Year" 1990-1991 at TU Munich. Back then, few people were interested, but a quarter century later, neural networks based on these ideas were running on over 3 billion devices such as smartphones, and used many billions of times per day, consuming a significant fraction of the world's compute.
    Comment: 37 pages, 188 references, based on work of 4 Oct 201

    Visualising Java Coupling and Fault Proneness

    In this paper, a tool is described for visualising the Coupling Between Objects (CBO) metric for Java systems, decomposing it into coupling collaborators and using colour to denote the object-oriented mechanisms at work for each coupled pair. The resulting visualisation is also envisaged to be useful for general program comprehension and is integrated into Java development in the Eclipse IDE. Evidence is also given that the visualisation may help detect classes that tend to be less fault-prone than would be expected from inspection of their CBO values alone.
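
    As a minimal illustration of the decomposition described above (not the tool itself), the sketch below computes a CBO value and groups the coupling mechanisms behind each collaborator, which is the information a colour per mechanism could encode. The class names and coupling data are invented; a real tool would extract them from Java source or bytecode.

```python
# Hypothetical sketch: CBO decomposed into collaborators and coupling mechanisms.
from collections import defaultdict

# Invented map: class -> set of (collaborator class, coupling mechanism) pairs.
couplings = {
    "OrderService": {("OrderRepository", "field"), ("Invoice", "return type"),
                     ("Customer", "parameter"), ("Invoice", "method call")},
    "Invoice": {("Customer", "field")},
}

def cbo(cls: str) -> int:
    """CBO counts distinct collaborator classes, not individual references."""
    return len({collab for collab, _ in couplings.get(cls, set())})

def decompose(cls: str) -> dict:
    """Group the coupling mechanisms behind each collaborator
    (the breakdown a colour-coded visualisation could show)."""
    by_collab = defaultdict(set)
    for collab, mechanism in couplings.get(cls, set()):
        by_collab[collab].add(mechanism)
    return dict(by_collab)

print("CBO(OrderService) =", cbo("OrderService"))   # 3 distinct collaborators
print(decompose("OrderService"))
```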

    State of the Art About Remote Laboratories Paradigms - Foundations of Ongoing Mutations

    9 pages. Literature review, carried out in fall 2007, of existing Remote Laboratories approaches and technologies.
    In this paper, we provide a literature review of modern remote laboratories. Based on this state of the art, we explain why remote laboratories are at a technological crossroads, whereas they had been sluggish for a decade. From various observations based on our review, we try to identify possible evolutions for the next generation of remote laboratories.

    Time travelling animated program executions

    Visualizations of program executions are often generated on the fly. This has many advantages relative to off-line generation of animated video files. Video files, however, trivially support flexible viewing via controls that include reverse and fast forward. Here we report on an implementation of time travel that combines the best of both techniques. In ToonTalk, both the construction and execution of programs are animated. Time travel enables the user to move back in time and replay animated executions. The replay can be paused, and the user can skip forward or further back in time. The implementation of time travel is based on records of every input event and periodic snapshots of the state of the computation.
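
    A minimal sketch of the recording scheme the abstract describes, under assumed details (a toy counter as the program state, a snapshot every 100 events); it is not the ToonTalk implementation. Travelling to a point in time restores the nearest earlier snapshot and replays the logged events from there.

```python
# Hypothetical sketch: time travel via an event log plus periodic state snapshots.
import copy

class TimeTravel:
    SNAPSHOT_EVERY = 100                       # assumed snapshot period, in events

    def __init__(self, initial_state):
        self.state = initial_state
        self.events = []                       # full log of input events
        self.snapshots = {0: copy.deepcopy(initial_state)}

    def apply(self, state, event):
        # Toy semantics: each event adds its value to a counter.
        state["counter"] += event
        return state

    def record(self, event):
        self.events.append(event)
        self.state = self.apply(self.state, event)
        if len(self.events) % self.SNAPSHOT_EVERY == 0:
            self.snapshots[len(self.events)] = copy.deepcopy(self.state)

    def travel_to(self, t):
        """Restore the latest snapshot at or before event index t,
        then replay the remaining events to reach that exact moment."""
        base = max(i for i in self.snapshots if i <= t)
        state = copy.deepcopy(self.snapshots[base])
        for event in self.events[base:t]:
            state = self.apply(state, event)
        return state

tt = TimeTravel({"counter": 0})
for _ in range(250):
    tt.record(1)
print(tt.travel_to(150))                       # {'counter': 150}: jump back, then replay forward
```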

    Detecting dressing failures using temporal–relational visual grammars

    Evaluation of dressing activities is essential in the assessment of the performance of patients with psycho-motor impairments. However, the current practice of monitoring dressing activity (performed by the patients in front of the therapist) has a number of disadvantages, given the personal nature of dressing and the inconsistencies between the recorded performance of the activity and performance of the same activity carried out in the patients’ natural environment, such as their home. As such, a system that can evaluate dressing activities automatically and objectively would alleviate some of these issues. However, a number of challenges arise, including difficulties in correctly identifying garments, their position on the body (partially or fully worn) and their position in relation to other garments. To address these challenges, we have developed a novel method based on visual grammars to automatically detect dressing failures and explain the type of failure. Our method is based on the analysis of image sequences of dressing activities and only requires the availability of a video recording device. The analysis relies on a novel technique which we call temporal–relational visual grammar; it can reliably recognize temporal dressing failures, while also detecting spatial and relational failures. Our method achieves 91% precision in detecting dressing failures performed by 11 subjects. We explain these results and discuss the challenges encountered during this work.
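
    As a rough, hypothetical illustration of the kind of rules involved (invented rule set and detections, not the authors' temporal–relational visual grammar), the sketch below checks a temporal ordering rule and a simple relational rule over a sequence of detected garment states.

```python
# Hypothetical sketch: checking dressing rules over a sequence of garment detections.
from typing import List, Tuple

Detection = Tuple[int, str, str]               # (time step, garment, state)

detections: List[Detection] = [
    (1, "shirt", "partially worn"),
    (2, "shirt", "fully worn"),
    (3, "jacket", "fully worn"),
]

# Temporal rule: the first garment must be fully worn before the second is put on.
temporal_rules = [("shirt", "jacket")]

def first_time(garment: str, state: str) -> float:
    times = [t for t, g, s in detections if g == garment and s == state]
    return min(times) if times else float("inf")

def check_failures() -> List[str]:
    failures = []
    for before, after in temporal_rules:
        if first_time(before, "fully worn") > first_time(after, "fully worn"):
            failures.append(f"temporal failure: {after} worn before {before} was fully on")
    # Simple relational rule: every detected garment should end up fully worn.
    for garment in {g for _, g, _ in detections}:
        if first_time(garment, "fully worn") == float("inf"):
            failures.append(f"relational failure: {garment} never fully worn")
    return failures

print(check_failures() or "no dressing failures detected")
```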