
    Horses with sustained attention follow the pointing of a human who knows where food is hidden

    When interacting with humans, domesticated species may respond to communicative gestures, such as pointing. However, it is currently unknown, except in dogs, whether species comprehend the communicative nature of such cues. Here, we investigated whether horses could follow the pointing of a human informant by evaluating the credibility of the information about the food-hiding place provided by the pointing of two informants. Using an object-choice task, we manipulated the attentional state of the two informants during food-hiding events and differentiated their knowledge about the location of the hidden food. Furthermore, we investigated the horses’ visual attention levels towards human behaviour to evaluate the relationship between their motivation and their performance in the task. The results showed that horses that sustained high attention levels could evaluate the credibility of the information and followed the pointing of the informant who knew where the food was hidden (Z = −2.281, P = 0.002, n = 36). This suggests that horses are highly sensitive to the attentional state and pointing gestures of humans, and that they perceive pointing as a communicative cue. This study also indicates that motivation for the task should be taken into account when assessing the socio-cognitive abilities of animals.

    Unwilling or willing but unable: can horses interpret human actions as goal directed?

    Social animals can gain important benefits by inferring the goals behind the behavior of others. However, this ability has only been investigated in a handful of species outside of primates. In this study, we tested for the first time whether domestic horses can interpret human actions as goal directed. We used the classical "unwilling versus unable" paradigm: an experimenter performed three similar actions that had the same outcome, but the goal of the experimenter differed. In the unwilling condition, the experimenter had no intention of giving a piece of food to a horse and moved it out of reach when the horse tried to eat it. In the two unable conditions, the experimenter intended to give the food to the horse but was unable to do so, either because there was a physical barrier between them or because of the experimenter's clumsiness. The horses (n = 21) reacted differently in the three conditions: they showed more interest in the unable conditions, especially the unable clumsy condition, than in the unwilling condition. These results are similar to those found in primates with the same paradigm and suggest that horses might have taken the experimenter's goal, or even intentions, into account to adapt their behavior. Hence, our study offers further insight into horses' interspecific social cognition towards humans.

    Horses Categorize Human Emotions Cross-Modally Based on Facial Expression and Non-Verbal Vocalizations

    Simple Summary: Recently, an increasing number of studies have investigated the expression and perception of emotions by non-human animals. In particular, it is of interest to determine whether animals can link emotion stimuli of different modalities (e.g., visual and vocal) based on the emotions that are expressed (i.e., recognize emotions cross-modally). For domestic species that share a close relationship with humans, we might even wonder whether this ability extends to human emotions. Here, we investigated whether domestic horses recognize human emotions cross-modally. We simultaneously presented two animated pictures of human facial expressions, one typical of joy and the other of anger, while a speaker played a human non-verbal vocalization expressing joy or anger. Horses looked more at the picture that did not match the emotion of the vocalization, probably because they were intrigued by the paradoxical combination. Moreover, their behavior and heart rate differed depending on the vocalization: they reacted more negatively to the anger vocalization and more positively to the joy vocalization. These results suggest that horses can match visual and vocal cues for the same emotion and can perceive the emotional valence of human non-verbal vocalizations.

    Abstract: Over the last few years, an increasing number of studies have aimed to gain more insight into the field of animal emotions. In particular, it is of interest to determine whether animals can cross-modally categorize the emotions of others. For domestic animals that share a close relationship with humans, we might wonder whether this cross-modal recognition of emotions extends to humans as well. In this study, we tested whether horses could recognize human emotions and attribute the emotional valence of visual (facial expression) and vocal (non-verbal vocalization) stimuli to the same perceptual category. Two animated pictures of different facial expressions (anger and joy) were simultaneously presented to the horses, while a speaker played an emotional human non-verbal vocalization matching one of the two facial expressions. Horses looked more at the picture that was incongruent with the vocalization, probably because they were intrigued by the paradoxical combination. Moreover, horses reacted in accordance with the valence of the vocalization, both behaviorally and physiologically (heart rate). These results show that horses can cross-modally recognize human emotions and react emotionally to the emotional states of humans assessed through non-verbal vocalizations.

    Horses are sensitive to baby talk: pet-directed speech facilitates communication with humans in a pointing task and during grooming

    Pet-directed speech (PDS) is a type of speech that humans spontaneously use with their companion animals. It is very similar to the speech commonly used when talking to babies. A survey on social media showed that 92.7% of respondents used PDS with their horse, but only 44.4% thought that their horse was sensitive to it; the others did not know or doubted its efficacy. We therefore decided to test the impact of PDS in two tasks. During a grooming task, which consisted of the experimenter scratching the horse with their hand, the horses (n = 20) carried out significantly more mutual grooming gestures toward the experimenter, looked at the person more, and moved less when spoken to with PDS than with adult-directed speech (ADS). During a pointing task, in which the experimenter pointed at the location of a reward with their finger, horses that had been spoken to with PDS (n = 10) found the food significantly more often than chance, which was not the case when horses were spoken to with ADS (n = 10). These results indicate that horses, like certain non-human primates and dogs, are sensitive to PDS. PDS could thus foster communication between people and horses during everyday interactions.

    Horses Solve Visible but Not Invisible Displacement Tasks in an Object Permanence Paradigm

    A key question in the field of animal cognition is how animals comprehend their physical world. Object permanence is one of the fundamental features of physical cognition: it is the ability to reason about hidden objects and to mentally reconstruct their invisible displacements. This cognitive skill has been studied in a wide range of species but never directly in the horse (Equus caballus). In this study, we therefore assessed the understanding of visible and invisible displacements in adult Welsh mares in two complementary experiments, using different horses. In experiment 1, visible displacement was investigated using two tasks adapted from the Uzgiris and Hunt Scale 1. Invisible displacement was assessed using a transposition task, in which food was first hidden in one of two containers and the locations of the containers were then switched. In experiment 2, we further investigated horses' understanding of visible and invisible displacements using an easier procedure designed to avoid potentially confounding factors. In both experiments, horses successfully completed the tasks involving visible displacement with two or three possible hiding places. However, in both experiments, horses failed the transposition tasks, suggesting that they may not be able to track the displacement of an object that is not directly perceived (i.e., invisible displacement). These results bring new insights into object permanence in horses and how they represent their physical world.

    Female horses spontaneously identify a photograph of their keeper, last seen six months previously

    Horses are capable of identifying individual conspecifics based on olfactory, auditory or visual cues. However, this raises the question of whether they can also recognize human beings and, if so, on the basis of what cues. This study investigated whether horses could differentiate between a familiar and an unfamiliar human from photographs of faces. Eleven horses were trained on a discrimination task using a computer-controlled screen, on which two photographs were presented simultaneously (32 trials/session): touching one was rewarded (S+) and the other was not (S−). In the training phase, the S+ faces were those of four unfamiliar people, which gradually became familiar over the trials. The S− faces were novel for each trial. After the training phase, the faces of the horses’ keepers were presented opposite novel faces to test whether the horses could identify the former spontaneously. A reward was given whichever face was touched, to avoid any possible learning effect. Horses touched the faces of their keepers significantly more often than expected by chance, whether it was their current keeper or one they had not seen for six months (t = 3.65, p < 0.004 and t = 6.24, p < 0.0001). Overall, these results show that horses have advanced human face-recognition abilities and a long-term memory of human faces.

    Horses feel emotions when they watch positive and negative horse–human interactions in a video and transpose what they saw to real life

    Animals can indirectly gather meaningful information about other individuals by eavesdropping on their third-party interactions. In particular, eavesdropping can be used to indirectly attribute a negative or positive valence to an individual and to adjust one’s future behavior towards that individual. Few studies have focused on this ability in nonhuman animals, especially in nonprimate species. Here, we investigated this ability for the first time in domestic horses (Equus caballus) by showing them videos of positive and negative interactions between an unknown human experimenter (a “positive” experimenter or a “negative” experimenter) and an actor horse. The horses reacted emotionally while watching the videos, expressing behavioral (facial expressions and contact-seeking behavior) and physiological (heart rate) cues of positive emotions while watching the positive video and of negative emotions while watching the negative video. This result shows that the horses perceived the content of the videos and suggests emotional contagion between the actor horse and the subjects. After the videos were shown, the horses took a choice test, facing the positive and negative experimenters in real life. The horses successfully used the interactions seen in the videos to discriminate between the experimenters. They touched the negative experimenter significantly more, which seems counterintuitive but can be interpreted as an appeasement attempt, based on the existing literature. This result suggests that horses can indirectly attribute a valence to a human experimenter by eavesdropping on a previous third-party interaction with a conspecific.

    Horses prefer to solicit a person who previously observed a food-hiding process to access this food: A possible indication of attentional state attribution

    Inferring what others have witnessed provides important benefits in social contexts, but evidence for this ability remains scarce in nonhuman animals. We investigated it in domestic horses by testing whether they could discriminate between two experimenters who differed in what they had previously witnessed and, on the basis of that information, decide whom to solicit when confronted with an unreachable food source. First, horses saw food being hidden in a closed bucket (impossible for them to open) in the presence of two experimenters who behaved identically but differed in their attention to the baiting process: the “witness” experimenter faced the bucket, while the “non-witness” faced away. The horses were then released with both experimenters present, and their interest towards each (gazing and touching) was measured. They gazed at and touched the witness significantly more than the non-witness (n = 15; gaze: p = 0.004; touch: p = 0.003). These results might suggest that horses inferred the attentional state of the experimenters during the baiting process and used this information to adapt their later behavior. Although further study is needed before drawing firm conclusions, our study provides new insight into attentional state attribution in horses and might hint at the existence of precursors of a Theory of Mind in this species.