    Computers in Support of Musical Expression


    Mapping, sensing and visualising the digital co-presence in the public arena

    This paper reports on work carried out within the Cityware project using mobile technologies to map, visualise and project digital co-presence in the city. It focuses on two pilot studies exploring the Bluetooth landscape of the city of Bath. We apply adapted and ‘digitally augmented’ methods for spatial observation and analysis, based on established methods used extensively in the space syntax approach to urban design. We map physical and digital flows at the macro level and observe static space use at the micro level. In addition, we look at social and mobile behaviour from the individual’s point of view, applying a method based on intervention through ‘sensing and projecting’ Bluetooth names and digital identity in the public arena. We present early findings on patterns of Bluetooth flow and presence, and outline initial observations about how people react to the projection of their Bluetooth names in public. In particular, we note the importance of constructing socially meaningful relations between people mediated by these technologies. We discuss initial results and outline the issues raised in detail before finally describing ongoing work.
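The abstract does not publish the Cityware analysis pipeline, but the macro-level mapping it describes can be illustrated with a minimal sketch. Assuming each scanning gateway logs hypothetical (minute, gateway, device) sightings, co-presence is the count of distinct devices seen at each gateway, and flows are gateway-to-gateway transitions of the same device; all names and data below are illustrative stand-ins, not the project's actual format.

```python
from collections import defaultdict

# Hypothetical discovery log: (minute, gateway, device_id) tuples, as a
# Bluetooth scanner at each observation point might record them.
SIGHTINGS = [
    (0, "Abbey", "aa:01"), (0, "Abbey", "ab:02"),
    (1, "Bridge", "aa:01"), (2, "Bridge", "ac:03"),
    (3, "Abbey", "aa:01"),
]

def presence_by_gateway(sightings):
    """Count distinct devices co-present at each gateway (static space use)."""
    seen = defaultdict(set)
    for _, gateway, device in sightings:
        seen[gateway].add(device)
    return {g: len(devs) for g, devs in seen.items()}

def flows(sightings):
    """Directed gateway-to-gateway transitions per device (movement flows)."""
    last = {}          # device -> gateway where it was last sighted
    moves = defaultdict(int)
    for _, gateway, device in sorted(sightings):
        if device in last and last[device] != gateway:
            moves[(last[device], gateway)] += 1
        last[device] = gateway
    return dict(moves)

print(presence_by_gateway(SIGHTINGS))  # {'Abbey': 2, 'Bridge': 2}
print(flows(SIGHTINGS))  # {('Abbey', 'Bridge'): 1, ('Bridge', 'Abbey'): 1}
```

Aggregating over anonymised device identifiers like this yields the kind of flow-and-presence patterns the pilot studies report, without tracking any individual by name.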

    The Evolution of First Person Vision Methods: A Survey

    The emergence of new wearable technologies such as action cameras and smart glasses has increased the interest of computer vision scientists in the First Person perspective. Nowadays, this field is attracting the attention and investment of companies aiming to develop commercial devices with First Person Vision recording capabilities. Due to this interest, an increasing demand for methods to process these videos, possibly in real time, is expected. Current approaches present particular combinations of image features and quantitative methods to accomplish specific objectives such as object detection, activity recognition and user-machine interaction. This paper summarizes the evolution of the state of the art in First Person Vision video analysis between 1997 and 2014, highlighting, among others, the most commonly used features, methods, challenges and opportunities within the field.
    Comment: First Person Vision, Egocentric Vision, Wearable Devices, Smart Glasses, Computer Vision, Video Analytics, Human-machine Interaction

    Virtual Borders: Accurate Definition of a Mobile Robot's Workspace Using Augmented Reality

    We address the problem of interactively controlling the workspace of a mobile robot to ensure human-aware navigation. This is especially relevant for non-expert users living in human-robot shared spaces, e.g. home environments, since they want to keep control of their mobile robots, such as vacuum-cleaning or companion robots. Therefore, we introduce virtual borders that are respected by a robot while performing its tasks. For this purpose, we employ an RGB-D Google Tango tablet as a human-robot interface in combination with an augmented reality application to flexibly define virtual borders. We evaluated our system with 15 non-expert users concerning accuracy, teaching time and correctness, and compared the results with other baseline methods based on visual markers and a laser pointer. The experimental results show that our method features an equally high accuracy while significantly reducing the teaching time compared to the baseline methods. This holds for different border lengths, shapes and variations in the teaching process. Finally, we demonstrated the correctness of the approach, i.e. the mobile robot changes its navigational behavior according to the user-defined virtual borders.
    Comment: Accepted at the 2018 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), supplementary video: https://youtu.be/oQO8sQ0JBR
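The paper's abstract does not spell out how the robot tests its position against a user-defined border. One common way to implement such a check, sketched here under the assumption that a border is stored as a 2D polygon in the map frame, is the standard ray-casting point-in-polygon test; the coordinates and the `keep_out` polygon below are purely hypothetical.

```python
def inside_border(point, border):
    """Ray-casting test: is the 2D point inside the polygon `border`?
    A planner could reject navigation goals for which this returns True
    when the border marks a user-defined keep-out zone."""
    x, y = point
    inside = False
    n = len(border)
    for i in range(n):
        x1, y1 = border[i]
        x2, y2 = border[(i + 1) % n]
        # Does this edge cross the horizontal ray extending right from the point?
        if (y1 > y) != (y2 > y):
            x_cross = x1 + (y - y1) * (x2 - x1) / (y2 - y1)
            if x < x_cross:
                inside = not inside
    return inside

# Hypothetical 1 m x 1 m keep-out square drawn by the user.
keep_out = [(0.0, 0.0), (1.0, 0.0), (1.0, 1.0), (0.0, 1.0)]
print(inside_border((0.5, 0.5), keep_out))  # True
print(inside_border((1.5, 0.5), keep_out))  # False
```

In practice a real system would mark such zones in the navigation costmap rather than test poses one by one, but the geometric check is the same.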

    Collective choreography of space: modelling digital co-Presence in a public arena

    In this paper we report on recent investigations within an ongoing research project, which aims at developing a better understanding of urban space augmented with digital space. We are looking at developing sensing environments that act as an interface facilitating interactions between people, and between people and their surroundings. Here we describe a preliminary study that aims at mapping and visualising the digital presence of people in the public arena. We outline initial observations about how people move and congregate, and illustrate the impact of spatial and syntactical properties on the type of shared interactions. We suggest that by altering the relation between consciousness of communication and the intention of interaction, technology can be appropriated to support an emergent choreography of space. This may help throw further light on the complex relationship between digital space and urban space in general, and on people’s relationship to each other and to the sensing environment. Finally, we discuss our initial results and briefly mention our ongoing work.