3,152 research outputs found

    Toward a model of computational attention based on expressive behavior: applications to cultural heritage scenarios

    Our project goals consisted of developing attention-based analysis of human expressive behavior and implementing real-time algorithms in EyesWeb XMI, in order to improve the naturalness of human-computer interaction and context-based monitoring of human behavior. To this aim, a perceptual model that mimics human attentional processes was developed for expressivity analysis and modeled by entropy. Museum scenarios were selected as an ecological test-bed for three experiments focusing on visitor profiling and visitor flow regulation
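    The abstract above models expressive variability by entropy. A minimal sketch (not the authors' implementation) of how Shannon entropy could quantify the variability of a stream of discretised movement labels — the label names here are hypothetical:

```python
import math
from collections import Counter

def shannon_entropy(symbols):
    """Shannon entropy (bits) of a sequence of discretised feature labels."""
    counts = Counter(symbols)
    total = len(symbols)
    # Each distinct label contributes -p * log2(p) to the total.
    return sum(-(c / total) * math.log2(c / total) for c in counts.values())

# A varied movement sequence carries more entropy than a repetitive one.
varied = ["slow", "fast", "pause", "slow", "fast", "pause"]
static = ["slow"] * 6
print(shannon_entropy(varied))  # ≈ 1.585 bits (log2 3)
print(shannon_entropy(static))  # 0 bits
```

    A real-time system would compute such a measure over a sliding window of motion features rather than a fixed list.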

    Wearable and mobile devices

    Information and Communication Technologies, known as ICT, have undergone dramatic changes in the last 25 years. The 1980s was the decade of the Personal Computer (PC), which brought computing into the home and, in an educational setting, into the classroom. The 1990s gave us the World Wide Web (the Web), building on the infrastructure of the Internet, which has revolutionized the availability and delivery of information. In the midst of this information revolution, we are now confronted with a third wave of novel technologies (i.e., mobile and wearable computing), where computing devices are already becoming small enough that we can carry them around at all times and, in addition, can interact with devices embedded in the environment. The development of wearable technology is perhaps a logical product of the convergence between the miniaturization of microchips (nanotechnology) and an increasing interest in pervasive computing, where mobility is the main objective. The miniaturization of computers is largely due to the decreasing size of semiconductors and switches; molecular manufacturing will allow for “not only molecular-scale switches but also nanoscale motors, pumps, pipes, machinery that could mimic skin” (Page, 2003, p. 2). This shift in the size of computers has obvious implications for human-computer interaction, introducing the next generation of interfaces. Neil Gershenfeld, the director of the Media Lab’s Physics and Media Group, argues, “The world is becoming the interface. Computers as distinguishable devices will disappear as the objects themselves become the means we use to interact with both the physical and the virtual worlds” (Page, 2003, p. 3). Ultimately, this will lead to a move away from desktop user interfaces and toward mobile interfaces and pervasive computing

    The Art of Engaging: Implications for Computer Music Systems

    The art of engaging with computer music systems is multifaceted. This paper provides an overview of the issues of the interface between musician and computer, cognitive aspects of engagement as involvement, and metaphysical understandings of engagement as proximity. Finally, the paper examines the implications for the design of computer music systems when these issues are taken into account

    The Smart Stage: Designing 3D interaction metaphors for immersive and ubiquitous music systems

    This conceptual paper describes work in progress on the design and implementation of the Smart Stage, an interactive music system prototype for collaborative musical creativity in immersive and ubiquitous environments. The system is intended to have a low entry barrier, making it more forgiving to users with less experience or knowledge of music, and it is designed with affordances to support intuitive progress in improvisational performance in a collaborative setting. We present a preliminary technical overview of the system and a first case study of a 3D interaction metaphor for granular synthesis, developed for this environment. This work was supported by the Innovation Agency (Agência de Inovação, ADI, Portugal) and Quadro de Referência Estratégico Nacional (QREN, Portugal): VisualYzARt: Visual programming framework for augmented reality and ubiquitous natural user interfaces (QREN-ADI ref: 23201) and COMPETE - Programa Operacional Factores de Competitividade (POFC)

    A Web Application for Audience Participation in Live Music Performance: The Open Symphony Use Case

    This paper presents a web-based application enabling audiences to collaboratively contribute to the creative process during live music performances. The system aims at enhancing audience engagement and creating new forms of live music experience. Interaction between audience and performers is made possible through a client/server architecture enabling bidirectional communication of creative data. Audience members can vote for pre-determined musical attributes using a smartphone-friendly and cross-platform web application. The system gathers audience members' votes and provides feedback through visualisations that can be tailored for specific needs. In order to support multiple performers and large audiences, automatic audience-to-performer groupings are handled by the application. The framework was applied to support live interactive musical improvisations where creative roles are shared amongst audience and performers (Open Symphony). Qualitative analyses of user surveys highlighted very positive feedback related to themes such as engagement and creativity, and also identified further design challenges around audience sense of control and latency
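    The abstract above mentions automatic audience-to-performer groupings and voting on musical attributes. A minimal sketch, under assumed data shapes (the IDs, attribute names, and round-robin policy are illustrative, not the paper's actual design):

```python
from collections import Counter

def assign_groups(audience_ids, n_performers):
    """Round-robin mapping of audience members to performer indices."""
    return {aid: i % n_performers for i, aid in enumerate(audience_ids)}

def tally_votes(votes, groups, performer):
    """Majority musical attribute among one performer's group, or None."""
    relevant = [attr for aid, attr in votes.items() if groups[aid] == performer]
    if not relevant:
        return None
    return Counter(relevant).most_common(1)[0][0]

groups = assign_groups(["a", "b", "c", "d"], 2)   # a, c -> 0; b, d -> 1
votes = {"a": "loud", "b": "soft", "c": "loud", "d": "sparse"}
print(tally_votes(votes, groups, 0))  # → "loud"
```

    A production server would recompute such tallies continuously and push the result to each performer's visualisation over the bidirectional channel.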

    Understanding concurrent earcons: applying auditory scene analysis principles to concurrent earcon recognition

    Two investigations into the identification of concurrently presented, structured sounds called earcons were carried out. The first experiment investigated how varying the number of concurrently presented earcons affected their identification. Varying the number had a significant effect on the proportion of earcons identified: reducing the number of concurrently presented earcons led to a general increase in the proportion successfully identified. The second experiment investigated how modifying the earcons and their presentation, using techniques influenced by auditory scene analysis, affected earcon identification. Both presenting each earcon with a unique timbre and introducing a 300 ms onset-to-onset delay between earcons were found to significantly increase identification. Guidelines were drawn from this work to assist future interface designers in incorporating concurrently presented earcons
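    The two presentation techniques above (unique timbre per earcon, 300 ms onset-to-onset staggering) can be sketched as a simple scheduling step. This is an illustrative sketch, not the study's software; the timbre and earcon names are hypothetical:

```python
def schedule_earcons(earcons, onset_gap_ms=300):
    """Give each concurrent earcon a unique timbre and a staggered onset."""
    timbres = ["piano", "marimba", "organ", "trumpet", "flute"]
    if len(earcons) > len(timbres):
        raise ValueError("not enough distinct timbres for unique assignment")
    return [
        {"earcon": name, "timbre": timbres[i], "onset_ms": i * onset_gap_ms}
        for i, name in enumerate(earcons)
    ]

plan = schedule_earcons(["new-mail", "battery-low", "download-done"])
print(plan[2])  # → {'earcon': 'download-done', 'timbre': 'organ', 'onset_ms': 600}
```

    The 300 ms default mirrors the onset-to-onset delay the experiment found helpful; an audio engine would then trigger each entry at its scheduled onset.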

    Open Symphony: Creative Participation for Audiences of Live Music Performances

    This work is partly supported by the FAST-IMPACt EPSRC project (EP/L019981/1), the Centre for Digital Music EPSRC Platform Grant (EP/E045235/1), the EU H2020 Audio Commons project (688382), QMUL's Centre for Public Engagement, the China Scholarship Council, and Arts Council England (Sound and Music Organisation Audience Labs)

    Ambient Gestures

    No full text
    We present Ambient Gestures, a novel gesture-based system designed to support ubiquitous ‘in the environment’ interactions with everyday computing technology. Hand gestures and audio feedback allow users to control computer applications without reliance on a graphical user interface, and without having to switch from the context of a non-computer task to the context of the computer. The Ambient Gestures system is composed of a vision recognition software application, a set of gestures to be processed by a scripting application and a navigation and selection application that is controlled by the gestures. This system allows us to explore gestures as the primary means of interaction within a multimodal, multimedia environment. In this paper we describe the Ambient Gestures system, define the gestures and the interactions that can be achieved in this environment and present a formative study of the system. We conclude with a discussion of our findings and future applications of Ambient Gestures in ubiquitous computing
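    The pipeline above routes recognised gestures through a scripting layer to application actions. A minimal sketch of such a dispatch layer, with hypothetical gesture labels and actions (the real system's gesture set is not specified here):

```python
def make_dispatcher(bindings):
    """Map recognised gesture labels to callable application actions."""
    def dispatch(gesture):
        action = bindings.get(gesture)
        # Unbound gestures are ignored rather than raising, so stray
        # recognitions do not interrupt the user's non-computer task.
        return action() if action else "unrecognised"
    return dispatch

dispatch = make_dispatcher({
    "swipe-left": lambda: "previous track",
    "swipe-right": lambda: "next track",
    "open-palm": lambda: "pause",
})
print(dispatch("open-palm"))  # → "pause"
```

    In the described architecture, the vision application would emit the gesture label and audio feedback would confirm the triggered action in place of a graphical interface.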