
    Sketching sonic interactions by imitation-driven sound synthesis

    Sketching is at the core of every design activity. In visual design, pencil and paper are the preferred tools for producing sketches because of their simplicity and immediacy. Analogue tools for sonic sketching do not exist yet, although voice and gesture are embodied abilities commonly exploited to communicate sound concepts. The EU project SkAT-VG aims to support vocal sketching with computer-aided technologies that can be easily accessed, understood, and controlled through vocal and gestural imitations. This imitation-driven sound synthesis approach is meant to overcome the ephemerality and timbral limitations of the human voice and gesture, allowing designers to produce more refined sonic sketches and to think about sound in a more designerly way. This paper presents two main outcomes of the project: the Sound Design Toolkit, a palette of basic sound synthesis models grounded in ecological perception and the physical description of sound-producing phenomena, and SkAT-Studio, a visual framework based on sound design workflows organized in stages of input, analysis, mapping, synthesis, and output. The integration of these two software packages provides an environment in which sound designers can go from concepts, through exploration and mock-ups, to prototypes in sonic interaction design, taking advantage of all the possibilities offered by vocal and gestural imitations in every step of the process.
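
    The five-stage workflow mentioned above (input, analysis, mapping, synthesis, output) can be pictured as a chain of processing steps applied to each block of vocal input. The Python sketch below is only an illustration of that idea under assumed names; the Pipeline class, the stage functions, and the parameter names are hypothetical and do not reflect the actual Sound Design Toolkit or SkAT-Studio APIs.

        # Hypothetical sketch of an input -> analysis -> mapping -> synthesis chain.
        # Names and mappings are illustrative only, not the SkAT-Studio API.
        from dataclasses import dataclass, field
        from typing import Callable, List

        Signal = List[float]  # a block of audio samples, simplified

        @dataclass
        class Pipeline:
            """Chains workflow stages into a single callable."""
            stages: List[Callable[[object], object]] = field(default_factory=list)

            def run(self, block: object) -> object:
                for stage in self.stages:
                    block = stage(block)
                return block

        def analysis(block: Signal) -> dict:
            """Extract a coarse loudness feature from the vocal input block."""
            energy = sum(x * x for x in block) / max(len(block), 1)
            return {"energy": energy}

        def mapping(features: dict) -> dict:
            """Map the analysed feature onto a synthesis parameter in 0..1."""
            return {"impact_force": min(1.0, features["energy"] * 10.0)}

        def synthesis(params: dict) -> Signal:
            """Stand-in for a physical model: a decaying burst scaled by force."""
            force = params["impact_force"]
            return [force * (0.99 ** n) for n in range(64)]

        pipeline = Pipeline([analysis, mapping, synthesis])
        print(pipeline.run([0.1, -0.2, 0.15, -0.05])[:4])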

    Machine Understanding of Human Behavior

    A widely accepted prediction is that computing will move to the background, weaving itself into the fabric of our everyday living spaces and projecting the human user into the foreground. If this prediction is to come true, then next-generation computing, which we will call human computing, should be about anticipatory user interfaces that are human-centered, built for humans based on human models. They should transcend the traditional keyboard and mouse to include natural, human-like interactive functions, including understanding and emulating certain human behaviors such as affective and social signaling. This article discusses a number of components of human behavior, how they might be integrated into computers, and how far we are from realizing the front end of human computing, that is, how far we are from enabling computers to understand human behavior.

    miMic: The microphone as a pencil

    miMic, a sonic analogue of paper and pencil, is proposed: an augmented microphone for vocal and gestural sonic sketching. Vocalizations are classified and interpreted as instances of sound models, which the user can play with through vocal and gestural control. The physical device is based on a modified microphone with embedded inertial sensors and buttons. Sound models can be selected by vocal imitations that are automatically classified, and each model is mapped to vocal and gestural features for real-time control. With miMic, the sound designer can explore a vast sonic space and quickly produce expressive sonic sketches, which may be turned into sound prototypes by further adjustment of model parameters.
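
    The flow described above, in which a vocal imitation first selects a sound model and continuous vocal and gestural features then drive that model in real time, can be sketched as a classify-then-map loop. The Python sketch below is a toy illustration under assumed names; the model names ("impact", "friction"), the features, and the mappings are hypothetical and are not the actual miMic implementation.

        # Hypothetical classify-then-map loop for an imitation-driven controller.
        # Model names, features and mappings are illustrative assumptions.
        from typing import Callable, Dict

        MODELS: Dict[str, Callable[[Dict[str, float]], Dict[str, float]]] = {
            # Each entry maps vocal/gestural features to model control parameters.
            "impact":   lambda f: {"force": f["loudness"], "stiffness": f["tilt"]},
            "friction": lambda f: {"pressure": f["loudness"], "speed": f["roll"]},
        }

        def classify_imitation(features: Dict[str, float]) -> str:
            """Toy stand-in for the vocal-imitation classifier: pick a model by a
            single spectral feature (a real system would use a trained classifier)."""
            return "friction" if features["noisiness"] > 0.5 else "impact"

        def control_frame(model: str, features: Dict[str, float]) -> Dict[str, float]:
            """One real-time control update: features -> model parameters."""
            return MODELS[model](features)

        frame = {"loudness": 0.7, "noisiness": 0.8, "tilt": 0.3, "roll": 0.6}
        model = classify_imitation(frame)
        print(model, control_frame(model, frame))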

    To “Sketch-a-Scratch”

    A surface can be harsh and raspy, or smooth and silky, and everything in between. We are used to sensing these features with our fingertips as well as with our eyes and ears: the exploration of a surface is a multisensory experience. Tools, too, are often employed in the interaction with surfaces, since they augment our manipulation capabilities. “Sketch-a-Scratch” is a tool for the multisensory exploration and sketching of surface textures. The user’s actions drive a physical sound model of real materials’ response to interactions such as scraping, rubbing or rolling. Moreover, different input signals can be converted into 2D visual surface profiles, enabling users to experience them visually, aurally and haptically.
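
    The conversion of arbitrary input signals into explorable surface profiles can be pictured as two steps: smooth the signal into a height profile, then derive an excitation from local height changes as a probe moves across it. The Python sketch below is a rough illustration under assumed names; the smoothing, the scrape excitation, and all parameter names are hypothetical and are not the paper's actual method.

        # Hypothetical signal-to-surface conversion and scrape excitation.
        def signal_to_profile(signal, smooth=3):
            """Turn an arbitrary input signal into surface heights via a simple
            moving average, so rough signals yield rough surfaces."""
            profile = []
            for i in range(len(signal)):
                window = signal[max(0, i - smooth):i + smooth + 1]
                profile.append(sum(window) / len(window))
            return profile

        def scrape(profile, speed=1.0, pressure=0.5):
            """Very rough excitation: force proportional to local height change,
            scaled by probe speed and contact pressure."""
            return [pressure * speed * abs(b - a)
                    for a, b in zip(profile, profile[1:])]

        surface = signal_to_profile([0.0, 0.4, -0.3, 0.8, -0.6, 0.2, 0.5])
        print(scrape(surface))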

    A first approach to understanding and measuring naturalness in driver-car interaction

    With technology changing the nature of the driving task, qualitative methods can help designers understand and measure the naturalness of driver-car interaction. Fifteen drivers were interviewed at length in their own parked cars using ethnographically-inspired questions probing issues of interaction salience, expectation, feelings, desires and meanings. Thematic analysis and content analysis found five distinct components relating to 'rich physical' aspects of natural-feeling interaction, typified by richer physical, analogue, tactile styles of interaction and control. Further components relate to humanlike, intelligent, assistive, socially-aware 'perceived behaviours' of the car. The advantages and challenges of a naturalness-based approach are discussed and ten cognitive component constructs of driver-car naturalness are proposed. These may eventually be applied as a checklist in automotive interaction design. This research was fully funded by a research grant from Jaguar Land Rover, and partially funded by project n.220050/F11 granted by the Research Council of Norway.

    Interfaces of the Agriculture 4.0

    The introduction of information technologies in the environmental field is impacting and changing even a traditional sector like agriculture. Nevertheless, Agriculture 4.0 and data-driven decisions should meet user needs and expectations. The paper presents a broad theoretical overview, discussing both the strategic role of design applied to Agri-tech and the issue of User Interface and Interaction as enabling tools in the field. In particular, the paper suggests rethinking the HCD approach, moving towards a Human-Decentered Design approach that brings together user, technology and environment, and highlights the role of calm technologies as a way to place the farmer not as a final target and passive spectator, but as an active part of the process, supporting the mitigation of and appropriation in the transition from traditional cultivation methods to 4.0 ones.