The language void: the need for multimodality in primate communication research
Theories of language evolution often draw heavily on comparative evidence of the communicative abilities of extant nonhuman primates (hereafter, primates). Many theories have argued exclusively for a unimodal origin of language, usually gestural or vocal. Theories are often strengthened by research on primates that indicates the absence of certain linguistic precursors in the opposing communicative modality. However, a systematic review of the primate communication literature reveals that vocal, gestural and facial signals have attracted differing theoretical and methodological approaches, rendering cross-modal comparisons problematic. The validity of the theories based on such comparisons can therefore be questioned. We propose that these a priori biases, inherent in unimodal research, highlight the need for integrated multimodal research. By examining communicative signals in concert we can both avoid methodological discontinuities and better understand the phylogenetic precursors to human language as part of a multimodal system.
Development and application of CatFACS: are human cat adopters influenced by cat facial expressions?
The domestic cat (Felis silvestris catus) is quickly becoming the most popular animal companion in the world. The evolutionary processes that occur during domestication are known to have wide effects on the morphology, behaviour, cognition and communicative abilities of a species. Since facial expression is central to human communication, it is possible that cat facial expression has been subjected to selection during domestication. Standardised measurement techniques to study cat facial expression are, however, currently lacking. Here, as a first step to enable cat facial expression to be studied in an anatomically based and objective way, CatFACS (Cat Facial Action Coding System) was developed. Fifteen individual facial movements (Action Units), six miscellaneous movements (Action Descriptors) and seven Ear Action Descriptors were identified in the domestic cat. CatFACS was then applied to investigate the impact of cat facial expression on human preferences in an adoption shelter setting. Rehoming speed from cat shelters was used as a proxy for human selective pressure. The behaviour of 106 cats ready for adoption in three different shelters was recorded during a standardised encounter with an experimenter. This experimental setup aimed to mimic the first encounter of a cat with a potential adopter, i.e. an unfamiliar human. Each video was coded for proximity to the experimenter, body movements, tail movements and face movements. Cat facial movements were not related to rehoming speed, suggesting that cat facial expression may not have undergone significant selection. In contrast, rubbing frequency was positively related to rehoming speed. The findings suggest that humans are more influenced by overt prosocial behaviours than subtle facial expression in domestic cats.
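The reported link between rubbing frequency and rehoming speed is a monotonic association, which rank correlation captures well. The sketch below is purely illustrative: the cat records, variable names, and the choice of Spearman's rho are assumptions for demonstration, not the study's actual analysis.

```python
def rank(values):
    """Rank values from 1..n (assumes no ties, for simplicity)."""
    order = sorted(range(len(values)), key=lambda i: values[i])
    ranks = [0.0] * len(values)
    for r, i in enumerate(order, start=1):
        ranks[i] = float(r)
    return ranks

def spearman(x, y):
    """Spearman rank correlation via the classic 1 - 6*sum(d^2)/(n(n^2-1))
    formula (valid when there are no ties)."""
    n = len(x)
    d2 = sum((rx - ry) ** 2 for rx, ry in zip(rank(x), rank(y)))
    return 1 - 6 * d2 / (n * (n**2 - 1))

# Hypothetical shelter records: rubs per minute vs. days until adoption
rubs_per_min  = [0.2, 1.5, 0.8, 2.1, 0.1, 1.0]
days_to_adopt = [30,  7,   14,  5,   40,  10]
print(spearman(rubs_per_min, days_to_adopt))  # → -1.0 (in this toy data,
# more rubbing corresponds exactly to faster rehoming)
```

A negative coefficient here corresponds to the paper's positive relationship between rubbing and rehoming *speed*, since speed is the inverse of days to adoption.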
EquiFACS: the Equine Facial Action Coding System
Although previous studies of horses have investigated their facial expressions in specific contexts, e.g. pain, until now there has been no methodology available that documents all the possible facial movements of the horse and provides a way to record all potential facial configurations. This is essential for an objective description of horse facial expressions across a range of contexts that reflect different emotional states. Facial Action Coding Systems (FACS) provide a systematic methodology of identifying and coding facial expressions on the basis of underlying facial musculature and muscle movement. FACS are anatomically based and document all possible facial movements rather than a configuration of movements associated with a particular situation. Consequently, FACS can be applied as a tool for a wide range of research questions. We developed FACS for the domestic horse (Equus caballus) through anatomical investigation of the underlying musculature and subsequent analysis of naturally occurring behaviour captured on high quality video. Discrete facial movements were identified and described in terms of the underlying muscle contractions, in correspondence with previous FACS systems. The reliability of others to be able to learn this system (EquiFACS) and consistently code behavioural sequences was high, and this included people with no previous experience of horses. A wide range of facial movements were identified, including many that are also seen in primates and other domestic animals (dogs and cats). EquiFACS provides a method that can now be used to document the facial movements associated with different social contexts and thus to address questions relevant to understanding social cognition and comparative psychology, as well as informing current veterinary and animal welfare practices.
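Inter-coder reliability in FACS work is commonly summarised with an agreement index: twice the number of Action Units both coders scored, divided by the total number of codes each coder produced. A minimal sketch, where the coder outputs and the specific codes are hypothetical placeholders rather than data from the EquiFACS study:

```python
def agreement_index(coder_a, coder_b):
    """FACS-style agreement index for one coded event:
    2 x (codes scored by both coders) / (sum of codes scored by each)."""
    agreed = len(set(coder_a) & set(coder_b))
    return 2 * agreed / (len(coder_a) + len(coder_b))

# Hypothetical: two coders score the same video clip independently
clip_coder1 = ["AU101", "AU17", "AD38"]  # illustrative code labels
clip_coder2 = ["AU101", "AU17", "AU10"]
print(agreement_index(clip_coder1, clip_coder2))  # → 0.666... (2 of 3 agree)
```

Averaging this index over many clips gives the kind of reliability figure reported when certifying new coders, including those with no prior experience of the species.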
Evolution of facial muscle anatomy in dogs
Domestication shaped wolves into dogs and transformed both their behavior and their anatomy. Here we show that, in only 33,000 y, domestication transformed the facial muscle anatomy of dogs specifically for facial communication with humans. Based on dissections of dog and wolf heads, we show that the levator anguli oculi medialis, a muscle responsible for raising the inner eyebrow intensely, is uniformly present in dogs but not in wolves. Behavioral data, collected from dogs and wolves, show that dogs produce the eyebrow movement significantly more often and with higher intensity than wolves do, with highest-intensity movements produced exclusively by dogs. Interestingly, this movement increases paedomorphism and resembles an expression that humans produce when sad, so its production in dogs may trigger a nurturing response in humans. We hypothesize that dogs with expressive eyebrows had a selection advantage and that "puppy dog eyes" are the result of selection based on humans' preferences.
OrangFACS: a muscle-based facial movement coding system for orangutans (Pongo spp.)
Comparing homologous expressions between species can shed light on the phylogenetic and functional changes that have taken place during evolution. To assess homology across species we must approach primate facial expressions in an anatomical, systematic, and standardized way. The Facial Action Coding System (FACS), a widely used muscle-based tool for analyzing human facial expressions, has recently been adapted for chimpanzees (Pan troglodytes: ChimpFACS), rhesus macaques (Macaca mulatta: MaqFACS), and gibbons (GibbonFACS). Here, we present OrangFACS, a FACS adapted for orangutans (Pongo spp.). Orangutans are the most arboreal and the least social great ape, so their visual communication has been assumed to be less important than vocal communication and is little studied. We scrutinized the facial anatomy of orangutans and coded videos of spontaneous orangutan behavior to identify independent movements: Action Units (AUs) and Action Descriptors (ADs). We then compared these facial movements with movements of homologous muscles in humans, chimpanzees, macaques, and gibbons. We also noted differences related to sexual dimorphism and developmental stages in orangutan facial morphology. Our results show 17 AUs and 7 ADs in orangutans, indicating an overall facial mobility similar to that found in chimpanzees, macaques, and gibbons but smaller than that found in humans. This facial movement capacity in orangutans may be the result of several, nonmutually exclusive explanations, including the need for facial communication in specialized contexts, phylogenetic inertia, and allometric effects.
Social use of facial expressions in hylobatids
Non-human primates use various communicative means in interactions with others. While primate gestures are commonly considered to be intentionally and flexibly used signals, facial expressions are often referred to as inflexible, automatic expressions of affective internal states. To explore whether and how non-human primates use facial expressions in specific communicative interactions, we studied five species of small apes (gibbons) by employing a newly established Facial Action Coding System for hylobatid species (GibbonFACS). We found that, despite individuals often being in close proximity to each other, in social (as opposed to non-social) contexts the duration of facial expressions was significantly longer when gibbons were facing another individual compared to non-facing situations. Social contexts included grooming, agonistic interactions and play, whereas non-social contexts included resting and self-grooming. Additionally, gibbons used facial expressions while facing another individual more often in social contexts than in non-social contexts, where facial expressions were produced regardless of the attentional state of the partner. Also, facial expressions were more likely "responded to" by the partner's facial expressions when facing another individual than non-facing. Taken together, our results indicate that gibbons use their facial expressions differentially depending on the social context and are able to use them in a directed way in communicative interactions with conspecifics.
Familiar and unfamiliar face recognition in crested macaques (Macaca nigra).
Many species use facial features to identify conspecifics, which is necessary to navigate a complex social environment. The fundamental mechanisms underlying face processing are starting to be well understood in a variety of primate species. However, most studies focus on a limited subset of species tested with unfamiliar faces. As well as limiting our understanding of how widely distributed across species these skills are, this also limits our understanding of how primates process faces of individuals they know, and whether social factors (e.g. dominance and social bonds) influence how readily they recognize others. In this study, socially housed crested macaques voluntarily participated in a series of computerized matching-to-sample tasks investigating their ability to discriminate (i) unfamiliar individuals and (ii) members of their own social group. The macaques performed above chance on all tasks. Familiar faces were not easier to discriminate than unfamiliar faces. However, the subjects were better at discriminating higher ranking familiar individuals, but not unfamiliar ones. This suggests that our subjects applied their knowledge of their dominance hierarchies to the pictorial representation of their group mates. Faces of high-ranking individuals garner more social attention, and therefore might be more deeply encoded than other individuals. Our results extend the study of face recognition to a novel species, and consequently provide valuable data for future comparative studies.
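"Above chance" on a two-alternative matching-to-sample task is typically established with an exact binomial test against a 50% chance level. The sketch below assumes a hypothetical session size and score for illustration; the study's own trial counts and statistics may differ.

```python
from math import comb

def binom_p_above_chance(successes, trials, p_chance=0.5):
    """One-tailed exact binomial test: probability of scoring at least
    `successes` correct out of `trials` if the subject responds at chance."""
    return sum(comb(trials, k) * p_chance**k * (1 - p_chance)**(trials - k)
               for k in range(successes, trials + 1))

# Hypothetical session: 70 correct out of 100 two-choice trials
p = binom_p_above_chance(70, 100)
print(p)  # well below 0.05, so performance exceeds chance
```

With two response options, `p_chance=0.5` is the appropriate null; tasks with more distractors would lower it accordingly.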
Macaques can predict social outcomes from facial expressions
There is widespread acceptance that facial expressions are useful in social interactions, but empirical demonstration of their adaptive function has remained elusive. Here, we investigated whether macaques can use the facial expressions of others to predict the future outcomes of social interaction. Crested macaques (Macaca nigra) were shown an approach between two unknown individuals on a touchscreen and were required to choose between one of two potential social outcomes. The facial expressions of the actors were manipulated in the last frame of the video. One subject reached the experimental stage and accurately predicted different social outcomes depending on which facial expressions the actors displayed. The bared-teeth display (homologue of the human smile) was most strongly associated with predicted friendly outcomes. Contrary to our predictions, screams and threat faces were not associated more with conflict outcomes. Overall, therefore, the presence of any facial expression (compared to neutral) caused the subject to choose friendly outcomes more than negative outcomes. Facial expression in general, therefore, indicated a reduced likelihood of social conflict. The findings dispute traditional theories that view expressions only as indicators of present emotion and instead suggest that expressions form part of complex social interactions where individuals think beyond the present.
Rethinking primate facial expression: a predictive framework
Primate facial expression has long been studied within a framework of emotion that has heavily influenced both theoretical approaches and scientific methods. For example, our understanding of the adaptive function and cognition of facial expression is tied to the assumption that facial expression is accompanied by an emotional internal state, which is decipherable by others. Here, we challenge this view and instead support the alternative that facial expression should also be conceptualised as an indicator of future behaviour as opposed to current emotional state alone (Behavioural Ecology View, Fridlund, 1994). We also advocate the use of standardised, objective methodology, such as the Facial Action Coding System, to avoid making assumptions about the underlying emotional state of animals producing facial expressions. We argue that broadening our approach to facial expression in this way will open new avenues to explore the underlying neurobiology, cognition and evolution of facial communication in both human and non-human primates.
- …