
    BNCI systems as a potential assistive technology: ethical issues and participatory research in the BrainAble project

    This paper highlights aspects of current research and thinking about ethical issues in Brain–Computer Interface (BCI) and Brain–Neuronal Computer Interface (BNCI) research, through the experience of one particular project, BrainAble, which is exploring and developing the potential of these technologies to enable people with complex disabilities to control computers. It describes how ethical practice has been developed both within the multidisciplinary research team and with participants. Results: The paper presents findings in which participants shared their views of the project prototypes, of the potential of BCI/BNCI systems as an assistive technology, and of their other possible applications. This draws attention to the importance of ethical practice in projects where high expectations of technologies, and representations of “ideal types” of disabled users, may reinforce stereotypes or drown out participant “voices”. Conclusions: Ethical frameworks for research and development in emergent areas such as BCI/BNCI systems should be based on broad notions of a “duty of care”, while being sufficiently flexible that researchers can adapt project procedures according to participant needs. They need to be revisited frequently, not only in the light of experience, but also to ensure they reflect new research findings and ever more complex and powerful technologies.

    Triggering social interactions: chimpanzees respond to imitation by a humanoid robot and request responses from it

    Even the most rudimentary social cues may evoke affiliative responses in humans and promote social communication and cohesion. The present work tested whether such cues of an agent may also promote communicative interactions in a nonhuman primate species, by examining interaction-promoting behaviours in chimpanzees. Here, chimpanzees were tested during interactions with an interactive humanoid robot, which showed simple bodily movements and sent out calls. The results revealed that chimpanzees exhibited two types of interaction-promoting behaviours during relaxed or playful contexts. First, the chimpanzees showed prolonged active interest when they were imitated by the robot. Second, the subjects requested ‘social’ responses from the robot, i.e. by showing play invitations and offering toys or other objects. This study thus provides evidence that even rudimentary cues of a robotic agent may promote social interactions in chimpanzees, as in humans. Such simple and frequent social interactions most likely provided a foundation for sophisticated forms of affiliative communication to emerge.

    Virtual world users evaluated according to environment design, task-based and affective attention measures

    This paper presents research that engages with users of virtual worlds for education to understand how these applications should be designed for their needs. An in-depth, multi-method investigation of 12 virtual world participants was undertaken in three stages: first, a small-scale within-subjects eye-tracking comparison was made between the role-playing game 'RuneScape' and the virtual social world 'Second Life'; second, an in-depth evaluation of eye-tracking data for Second Life tasks (i.e. avatar-, object- and world-based) was conducted; finally, a qualitative evaluation of Second Life tutorials in comparative 3D situations (i.e. environments ranging from realistic to surreal, enclosed to open, and formal to informal) was carried out. Initial findings identified increased user attention within comparable gaming and social world interactions. Further analysis identified that 3D world-focused interactions increased participants' attention more than object and avatar tasks. Finally, different 3D situation designs altered levels of task engagement and distraction through perceptions of comfort, fun and fear. Ultimately, goal-based and environment interaction tasks can increase attention and potentially immersion. However, affective perceptions of 3D situations can negatively impact attention. An objective discussion of the limitations and benefits of virtual world immersion for student learning is presented.

    Drivers' Ability to Engage in a Non-Driving Related Task While in Automated Driving Mode in Real Traffic

    Engaging in non-driving related tasks (NDRTs) while driving can be considered distracting and detrimental to safety. However, with the introduction of highly automated driving systems that relieve drivers from driving, more NDRTs will become feasible. In fact, many car manufacturers emphasize that one of the main advantages of automated cars is that they "free up time" for other activities while on the move. This paper investigates how well drivers are able to engage in an NDRT while in automated driving mode (i.e., SAE Level 4) in real traffic, via a Wizard of Oz platform. The NDRT was designed to be visually and cognitively demanding and to require manual interaction. The results show that the drivers' attention to a great extent shifted from the road ahead towards the NDRT. Participants could perform the NDRT as well as when in an office (e.g. correct answers, time to completion), showing that performance did not deteriorate in the automated vehicle. Yet, many participants indicated that they noted and reacted to environmental changes and sudden changes in vehicle motion. Participants were also surprised by their own ability to disconnect from driving with ease. The presented study extends previous research by identifying that drivers are to a high extent able to engage in an NDRT while in automated mode in real traffic. This is promising for future automated cars' ability to "free up time" and enable drivers to engage in non-driving related activities.

    Mindreading in the balance: adults' mediolateral leaning and anticipatory looking foretell others' action preparation in a false-belief interactive task

    Anticipatory looking on mindreading tasks can indicate our expectation of an agent's action. The challenge is that social situations are often more complex, involving instances where we need to track an agent's false belief to successfully identify the outcome to which an action is directed. If motor processes can guide how action goals are understood, it is conceivable, where that kind of goal ascription occurs in false-belief tasks, for motor representations to account for someone's belief-like state. Testing adults (N = 42) in a real-time interactive helping scenario, we discovered that participants' early mediolateral motor activity (leftwards–rightwards leaning on a balance board) foreshadowed the agent's belief-based action preparation. These results suggest fast belief-tracking can modulate motor representations generated in the course of one's interaction with an agent. While adults' leaning and anticipatory looking revealed the contribution of fast false-belief tracking, participants did not correct the agent's mistake in their final helping action. These discoveries suggest that adults may not necessarily use another's belief during overt social interaction, or find reflecting on another's belief normatively relevant to their own choice of action. Our interactive task design offers a promising way to investigate how motor and mindreading processes may be variously integrated.

    10081 Abstracts Collection -- Cognitive Robotics

    From 21.02. to 26.02.2010, the Dagstuhl Seminar 10081 "Cognitive Robotics" was held in Schloss Dagstuhl – Leibniz Center for Informatics. During the seminar, several participants presented their current research, and ongoing work and open problems were discussed. Abstracts of the presentations given during the seminar, as well as abstracts of seminar results and ideas, are put together in this paper. The first section describes the seminar topics and goals in general. Links to extended abstracts or full papers are provided, if available.

    Eyes that bind us: Gaze leading induces an implicit sense of agency

    Humans feel a sense of agency over the effects their motor system causes. This is the case for manual actions such as pushing buttons, kicking footballs, and all acts that affect the physical environment. We ask whether initiating joint attention – causing another person to follow our eye movement – can elicit an implicit sense of agency over this congruent gaze response. Eye movements themselves cannot directly affect the physical environment, but joint attention is an example of how eye movements can indirectly cause social outcomes. Here we show that leading the gaze of an on-screen face induces an underestimation of the temporal gap between action and consequence (Experiments 1 and 2). This underestimation effect, named 'temporal binding', is thought to be a measure of an implicit sense of agency. Experiment 3 asked whether merely making an eye movement in a non-agentic, non-social context might also affect temporal estimation; no reliable effects were detected, implying that inconsequential oculomotor acts do not reliably affect temporal estimations under these conditions. Together, these findings suggest that an implicit sense of agency is generated when initiating joint attention interactions. This is important for understanding how humans can efficiently detect and understand the social consequences of their actions.

    Vocal Interactivity in-and-between Humans, Animals, and Robots

    Almost all animals exploit vocal signals for a range of ecologically motivated purposes: detecting predators and prey, marking territory, expressing emotions, establishing social relations, and sharing information. Whether it is a bird raising an alarm, a whale calling to potential partners, a dog responding to human commands, a parent reading a story with a child, or a businessperson accessing stock prices using Siri, vocalization provides a valuable communication channel through which behavior may be coordinated and controlled, and information may be distributed and acquired. Indeed, the ubiquity of vocal interaction has led to research across an extremely diverse array of fields, from assessing animal welfare, to understanding the precursors of human language, to developing voice-based human–machine interaction. Opportunities for cross-fertilization between these fields abound; for example, using artificial cognitive agents to investigate contemporary theories of language grounding, using machine learning to analyze different habitats, or adding vocal expressivity to the next generation of language-enabled autonomous social agents. However, much of the research is conducted within well-defined disciplinary boundaries, and many fundamental issues remain. This paper attempts to redress the balance by presenting a comparative review of vocal interaction within and between humans, animals, and artificial agents (such as robots), and it identifies a rich set of open research questions that may benefit from an interdisciplinary analysis.
