
    Understanding social responses to artificial agents: building blocks for persuasive technology


    When artificial social agents try to persuade people: the role of social agency in the occurrence of psychological reactance

    In the near future, robotic agents might employ persuasion to influence people’s behavior or attitudes, just as human agents do in many situations. People can comply with such requests, but they can also experience psychological reactance, which may lead to the complete opposite of the proposed behavior. In this study we are interested in the social nature of psychological reactance. Social agency theory proposes that more social cues lead to a more social interaction. We argue that this also holds for psychological reactance. We therefore expect a positive relationship between the level of social agency of the source of a persuasive message and the amount of psychological reactance the message arouses. In an online experiment, participants read advice on how to conserve energy when using a washing machine. The advice was provided either as text only, as text accompanied by a still picture of a robotic agent, or as text accompanied by a short film clip of the same robotic agent. Confirming our expectations, results indicated that participants experienced more psychological reactance when the advice was accompanied by the still picture or the short film clip than when it was provided as text only. This indicates that stronger social agency of the messenger can lead to more psychological reactance. Furthermore, our results confirmed earlier research on the effects of controlling language on psychological reactance. Implications are discussed.

    Does it make a difference who tells you what to do?: exploring the effect of social agency on psychological reactance

    No full text
    Nowadays, many advertising campaigns attempt to persuade people to perform a specific behavior. In response to such messages, people can comply and adapt their behavior in the proposed direction. However, people can also experience psychological reactance, which may lead to the complete opposite of the target behavior. In the present study, we were interested in the social nature of psychological reactance. According to Social Agency Theory [12], more social cues lead to more social interaction. We suggest that this also holds for psychological reactance. We argue that there is a positive relation between the level of social agency of the source of a message and the level of psychological reactance that the message can arouse. In an online study, participants received low-controlling or high-controlling advice about energy conservation. The advice was delivered either solely as text, as text with a still picture of a robotic agent, or as text with a brief film clip of the same robotic agent. Results showed that a high-controlling advisory message resulted in more reactance than a low-controlling advisory message. Confirming our expectations, stronger social agency of the messenger led to more psychological reactance. Implications are discussed.

    The dominant robot: threatening robots cause psychological reactance, especially when they have incongruent goals

    No full text
    Persuasive technology can take the form of a social agent that persuades people to change behavior or attitudes. However, like any persuasive technology, persuasive social agents might trigger psychological reactance, which can lead to restoration behavior. The current study investigated whether interacting with a persuasive robot can cause psychological reactance, and whether goal congruency plays a role. Participants programmed a washing machine while a robot gave threatening advice. Confirming expectations, participants experienced more psychological reactance when receiving high-threatening advice than when receiving low-threatening advice. Moreover, when the robot gave high-threatening advice and expressed an incongruent goal, participants reported the highest level of psychological reactance (on an anger measure). Finally, high-threatening advice led to more restoration behavior, and this relationship was partially mediated by psychological reactance. Overall, the results imply that under certain circumstances persuasive technology can trigger opposite effects, especially when people have incongruent goal intentions.

    Persuasive agents and the occurrence of reactance as a result of restricting communication

    No full text
    Nowadays, technologies are being developed to persuade people. Examples of such technologies are embodied conversational agents, robots, or even the personal computer. When people are the subject of persuasion, they are directed to perform a specific behavior. For example, they can be directed to stop smoking, start exercising, or increase energy conservation. In response to such messages, people can conform and act on the desired behavior. However, people can also experience psychological reactance and consequently perform the opposite of the intended behavior. But when does this happen, and why? And, more interestingly, does psychological reactance also occur when interacting with an agent rather than a human being? The theory of psychological reactance (Brehm, 1966) defines reactance as a state of arousal that occurs when a person experiences a perceived threat to freedom, which motivates the person to restore that threatened freedom. A model is proposed that consists of four elements: a perceived freedom, an experienced intrusion on autonomy, psychological reactance, and a desire to restore the threatened freedom. First, people perceive that they have the freedom to choose. Then, when they are persuaded to perform a specific behavior, they experience this as an intrusion on their autonomy. This results in a state of arousal called psychological reactance, which eventually leads them to desire to restore the threatened freedom. In line with previous studies, it is hypothesized that a high-threatening message will arouse more psychological reactance than a low-threatening or non-threatening message. Furthermore, it is expected that, as a result of this experience of psychological reactance, people will report higher intentions to restore the threatened freedom.

    In addition, we were interested in the social nature of psychological reactance. The authors of this article suggest that psychological reactance is a social phenomenon that arises from an interaction between human beings. We further suggest that this can also be true when interacting with a social agent. Evidence for this hypothesis can be found in The Media Equation (Reeves & Nass, 2002), which states that people react to computers as if they were reacting to other people. Furthermore, according to Social Agency Theory, the more social cues are available in an interaction, the more social the interaction becomes, which in turn leads people to try to better understand the relationship with the other actor (Mayer, Sobko, & Mautone, 2003). We hypothesize that this also holds for psychological reactance: the more social cues, the more psychological reactance people experience. Lastly, it is hypothesized that the most psychological reactance will occur when a high-threatening message is combined with a highly social agent. These hypotheses were tested using a 3 (Threat: No Threat vs. Low Threat vs. High Threat) x 3 (Social Agency: Text vs. Still picture vs. Brief moving film clip) between-subjects experimental design. Eighty-nine participants took part in an online study in which they read an advisory text about energy conservation when using the washing machine. After reading this text, participants were asked to perform a thought-listing task and answer several questionnaires. Participants were randomly assigned to a non-threatening, a low-threatening, or a high-threatening message. The advice was delivered either solely as text, as text accompanied by a still picture of a robotic agent, or as text accompanied by a brief film clip of the same robotic agent. Results show that more psychological reactance was reported in response to a high-threatening message than to a low-threatening or non-threatening message. Furthermore, the more social cues the agent possessed, the more psychological reactance participants experienced. However, we did not find support for an interaction between the amount of threat and the level of social agency that would have produced the strongest reactance when high threat co-occurred with high social agency.


    Talking Ally: Toward Persuasive Communication in Everyday Life

    No full text
