779 research outputs found
The kinked demand curve and price rigidity: evidence from scanner data
This paper uses scanner data from a large euro area retailer. We extend Deaton and Muellbauer's Almost Ideal Demand System to estimate the price elasticity and curvature of demand for a wide range of products. Our results support the introduction of a kinked (concave) demand curve in general equilibrium macro models. We find that the price elasticity of demand is on average higher for price increases than for price decreases. However, the degree of curvature in demand is much lower than is currently imposed. Moreover, for a significant fraction of products we observe a convex demand curve. We find no correlation between the estimated price elasticity/curvature and the observed size or frequency of price adjustment in our data.
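As a rough illustration of the elasticity asymmetry described above (not the paper's Almost Ideal Demand System estimator), the sketch below fits a log-log demand regression in which price increases and price decreases receive separate coefficients; all data and coefficient values are synthetic assumptions.

```python
import numpy as np

# Minimal sketch: estimate separate demand elasticities for price
# increases vs. decreases with OLS on a log-log specification.
# This is NOT the paper's AIDS estimator; it only illustrates letting
# the elasticity differ by the sign of the price change.

rng = np.random.default_rng(0)
n = 500
dlog_p = rng.normal(0.0, 0.02, n)          # log price changes (synthetic)
up = np.where(dlog_p > 0, dlog_p, 0.0)     # price increases only
down = np.where(dlog_p <= 0, dlog_p, 0.0)  # price decreases only
# Synthetic demand: more elastic for increases (kinked/concave demand)
dlog_q = -4.0 * up - 2.0 * down + rng.normal(0.0, 0.05, n)

X = np.column_stack([np.ones(n), up, down])
beta, *_ = np.linalg.lstsq(X, dlog_q, rcond=None)
print(f"elasticity for price increases: {beta[1]:.2f}")  # ~ -4
print(f"elasticity for price decreases: {beta[2]:.2f}")  # ~ -2
```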
On combining the facial movements of a talking head
We present work on Obie, an embodied conversational agent framework. An embodied conversational agent, or talking head, consists of three main components. The graphical part consists of a face model and a facial muscle model. Besides the graphical part, we have implemented an emotion model and a mapping from emotions to facial expressions. The animation part of the framework focuses on temporally combining different facial movements. In this paper we propose a scheme for combining facial movements on a 3D talking head.
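The abstract does not spell out the combination scheme itself, so the following is only a generic sketch of how overlapping facial movements can be blended over time: each movement drives per-muscle activations through an intensity envelope, and simultaneous movements are summed and clamped. The muscle count, envelope shape, and function names are hypothetical, not the Obie framework's API.

```python
import numpy as np

# Hypothetical sketch of temporally combining facial movements: each
# movement contributes a time-varying activation per facial muscle, and
# overlapping movements are blended by a clamped weighted sum.

N_MUSCLES = 18  # assumed muscle count in the face model

def envelope(t, start, attack, sustain, release):
    """Trapezoidal intensity envelope for one movement."""
    if t < start or t > start + attack + sustain + release:
        return 0.0
    if t < start + attack:
        return (t - start) / attack
    if t < start + attack + sustain:
        return 1.0
    return 1.0 - (t - start - attack - sustain) / release

def combine(movements, t):
    """Blend overlapping movements into one muscle-activation vector."""
    total = np.zeros(N_MUSCLES)
    for target, start, attack, sustain, release in movements:
        total += envelope(t, start, attack, sustain, release) * target
    return np.clip(total, 0.0, 1.0)  # keep activations in a valid range

# A smile and a brow raise overlapping in time:
smile = np.zeros(N_MUSCLES); smile[3] = 0.8
brow = np.zeros(N_MUSCLES); brow[10] = 0.6
movements = [(smile, 0.0, 0.2, 1.0, 0.3), (brow, 0.5, 0.1, 0.4, 0.2)]
print(combine(movements, 0.7))
```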
Gaze Behavior, Believability, Likability and the iCat
The iCat is a user-interface robot with the ability to express a range of emotions through its facial features. This paper summarizes our research on whether we can increase the believability and likability of the iCat for its human partners through the application of gaze behaviour. Gaze behaviour serves several functions during social interaction, such as mediating conversation flow, communicating emotional information and avoiding distraction by restricting visual input. Several types of eye and head movements are necessary for realizing these functions. We designed and evaluated a gaze behaviour system for the iCat robot that implements realistic models of the major types of eye and head movements found in living beings: vergence, the vestibulo-ocular reflex, smooth pursuit movements and gaze shifts. We discuss how these models are integrated into the software environment of the iCat and how they can be used to create complex interaction scenarios. We report on several user tests and draw conclusions for future evaluation scenarios.
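As a hedged sketch of two of the listed movement types (not the iCat system's actual implementation), the loop below combines an idealized vestibulo-ocular reflex, which counter-rotates the eyes against head motion, with a smooth pursuit controller driven by retinal slip; the gains, single-axis geometry, and update rate are illustrative assumptions.

```python
import numpy as np

# Sketch of a vestibulo-ocular reflex (eye counter-rotates against head
# motion) plus smooth pursuit (eye driven by retinal slip), on one axis.

DT = 0.02          # 50 Hz control loop (assumed)
VOR_GAIN = 1.0     # ideal compensation of head rotation
PURSUIT_GAIN = 0.9 # fraction of retinal slip corrected per step

def step(eye, head_vel, target, head):
    """One control step; all angles in degrees, single axis."""
    gaze = head + eye                  # gaze direction in the world frame
    retinal_slip = target - gaze       # error seen by the pursuit system
    eye_vel = PURSUIT_GAIN * retinal_slip / DT - VOR_GAIN * head_vel
    return eye + eye_vel * DT

eye, head = 0.0, 0.0
for k in range(100):
    head_vel = 10.0 * np.sin(0.2 * k)  # the robot shakes its head
    head += head_vel * DT
    eye = step(eye, head_vel, target=5.0, head=head)
print(f"final gaze error: {5.0 - (head + eye):.3f} deg")
```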
An Action Selection Architecture for an Emotional Agent
An architecture for action selection is presented that links emotion, cognition and behavior. It defines the information and emotion processes of an agent. The architecture has been implemented and used in a prototype environment.
Embodied agents in virtual environments: The Aveiro project
We present current and envisaged work on the AVEIRO project of our research group, concerning virtual environments inhabited by autonomous embodied agents. These environments are being built for researching issues in human-computer interaction and intelligent agent applications. We describe the various strands of research and development that we are focusing on. The undertaking involves the collaborative effort of researchers from different disciplines.
Conceptual Frameworks for Multimodal Social Signal Processing
This special issue is about a research area which is developing rapidly. Pentland gave it a name which has become widely used, "Social Signal Processing" (SSP for short), and his phrase provides the title of a European project, SSPnet, which has a brief to consolidate the area. The challenge that Pentland highlighted was understanding the nonlinguistic signals that serve as the basis for "subconscious discussions between humans about relationships, resources, risks, and rewards". He identified it as an area where computational research had made interesting progress, and could usefully make more.
Modality-specific Affective Responses and their Implications for Affective BCI
Reliable applications of multimodal affective brain-computer interfaces (aBCI) require a detailed understanding of the processes involved in emotions. To explore the modality-specific nature of affective responses, we studied the neurophysiological responses of 24 subjects during visual, auditory, and audiovisual affect stimulation and obtained their subjective ratings. Consistent with the literature, we found modality-specific responses in the EEG: parietal alpha power decreases during visual stimulation and increases during auditory stimulation, whereas more anterior alpha power decreases during auditory stimulation and increases during visual stimulation. We discuss the implications of these results for multimodal aBCI.
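For readers unfamiliar with the feature behind the reported effect, the sketch below estimates alpha-band (8-12 Hz) power for a single "parietal" channel using Welch's method on a synthetic signal; the sampling rate, window length, and signal are assumptions, and the paper's actual analysis pipeline is not reproduced here.

```python
import numpy as np
from scipy.signal import welch

# Alpha-band (8-12 Hz) power of one EEG channel via Welch's method.

FS = 256                      # sampling rate in Hz (assumed)
t = np.arange(0, 10, 1 / FS)  # 10 s of data
# Synthetic "parietal" signal: a 10 Hz alpha rhythm plus noise
x = 2.0 * np.sin(2 * np.pi * 10 * t) + np.random.randn(t.size)

f, psd = welch(x, fs=FS, nperseg=2 * FS)        # 2 s windows
band = (f >= 8) & (f <= 12)
alpha_power = psd[band].sum() * (f[1] - f[0])   # integrate PSD over band
print(f"alpha-band power: {alpha_power:.2f}")
```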