
    Opinion dynamics with backfire effect and biased assimilation

    The democratization of AI tools for content generation, combined with unrestricted access to mass media for all (e.g., through microblogging and social media), makes it increasingly hard for people to distinguish fact from fiction. This raises the question of how individual opinions evolve in such a networked environment without grounding in a known reality. The dominant approach to studying this problem uses simple models from the social sciences on how individuals change their opinions when exposed to their social neighborhood, and applies them to large social networks. We propose a novel model that incorporates two known social phenomena: (i) Biased Assimilation: the tendency of individuals to adopt others' opinions when those opinions are similar to their own; (ii) Backfire Effect: the fact that an opposing opinion may further entrench someone in their stance, making their opinion more extreme instead of moderating it. To the best of our knowledge, this is the first DeGroot-type opinion formation model that captures the Backfire Effect. A thorough theoretical and empirical analysis of the proposed model reveals intuitive conditions for polarization and consensus to exist, as well as the properties of the resulting opinions.
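
    The abstract does not give the exact update rule, so the following is only a minimal sketch of how a DeGroot-style update with biased assimilation and a backfire term could look; the similarity threshold `tau`, the `backfire` strength, and the clipping to [-1, 1] are illustrative assumptions, not the authors' model.

    ```python
    import numpy as np

    def update_opinions(x, W, tau=0.5, backfire=0.1):
        """One synchronous DeGroot-style update with biased assimilation
        and a backfire term. Opinions live in [-1, 1].

        x        : (n,) current opinions
        W        : (n, n) influence weights (W[i, j] > 0 iff j influences i)
        tau      : similarity threshold below which a neighbor is assimilated (assumed)
        backfire : strength of the repulsive push from distant opinions (assumed)
        """
        n = len(x)
        x_new = np.empty(n)
        for i in range(n):
            pull, push = 0.0, 0.0
            for j in range(n):
                if W[i, j] == 0:
                    continue
                d = abs(x[j] - x[i])
                if d <= tau:   # biased assimilation: similar opinions attract
                    pull += W[i, j] * (x[j] - x[i])
                else:          # backfire: opposing opinions push the agent further away
                    push += W[i, j] * np.sign(x[i] - x[j]) * backfire * d
            x_new[i] = np.clip(x[i] + pull + push, -1.0, 1.0)
        return x_new
    ```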

    Extremism propagation in social networks with hubs

    One aspect of opinion change that has been of academic interest is the impact of people with extreme opinions (extremists) on opinion dynamics. An agent-based model has been used to study the role of small-world social network topologies in general opinion change in the presence of extremists. It has been found that opinion convergence to a single extreme occurs only when the average number of network connections for each individual is extremely high. Here, we extend the model to examine the effect of positively skewed degree distributions, in addition to small-world structures, on the types of opinion convergence that occur in the presence of extremists. We also examine what happens when extremist opinions are located on the well-connected nodes (hubs) created by the positively skewed distribution. We find that a positively skewed network topology encourages opinion convergence on a single extreme under a wider range of conditions than topologies whose degree distributions are not skewed. The importance of social position for social influence is highlighted by the result that, when positive extremists are placed on hubs, the population always converges to the positive extreme, even when there are twice as many negative extremists. Thus, our results show the importance of considering a positively skewed degree distribution, and in particular network hubs and social position, when examining extremist transmission.
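
    The abstract specifies the setup (a positively skewed degree distribution with extremists seeded on hubs) but not the underlying agent-based dynamics, so the sketch below only reproduces the seeding step. The Barabási-Albert generator, the parameter values, and the initial opinion range are illustrative assumptions rather than the original study's configuration.

    ```python
    import random
    import networkx as nx

    def seed_extremists_on_hubs(n=1000, m=3, n_extremists=25, extreme=1.0, seed=0):
        """Build a scale-free (positively skewed degree) network and place
        positive extremists on its best-connected nodes (hubs)."""
        rng = random.Random(seed)
        G = nx.barabasi_albert_graph(n, m, seed=seed)

        # Moderate agents start with opinions spread over [-1, 1].
        opinion = {v: rng.uniform(-1.0, 1.0) for v in G}

        # Hubs = the n_extremists highest-degree nodes; pin them to the positive extreme.
        hubs = sorted(G.degree, key=lambda kv: kv[1], reverse=True)[:n_extremists]
        for v, _deg in hubs:
            opinion[v] = extreme

        return G, opinion, [v for v, _ in hubs]
    ```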

    Reality Inspired Voter Models: A Mini-Review

    This mini-review presents extensions of the voter model that incorporate various plausible features of real decision-making processes by individuals. Although these generalizations are not calibrated by empirical data, the resulting dynamics are suggestive of realistic collective social behaviors.
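
    For reference, the basic voter model that these extensions generalize fits in a few lines; the graph, the number of steps, and the binary opinion encoding below are my own illustrative choices.

    ```python
    import random
    import networkx as nx

    def voter_model(G, steps=10_000, seed=0):
        """Classic voter model: at each step a randomly chosen agent copies
        the opinion of a randomly chosen neighbor. Opinions are binary here."""
        rng = random.Random(seed)
        opinion = {v: rng.choice((0, 1)) for v in G}
        nodes = list(G)
        for _ in range(steps):
            i = rng.choice(nodes)
            neighbors = list(G[i])
            if neighbors:                      # isolated nodes never change
                opinion[i] = opinion[rng.choice(neighbors)]
        return opinion

    # Example: run the dynamics on a small random graph.
    result = voter_model(nx.erdos_renyi_graph(200, 0.05, seed=1))
    ```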

    Opinion Formation in Laggard Societies

    We introduce a statistical physics model for opinion dynamics on random networks where agents adopt the opinion held by the majority of their direct neighbors only if the fraction of these neighbors exceeds a certain threshold, p_u. We find a transition from total final consensus to a mixed phase where opinions coexist amongst the agents. The relevant parameters are the relative sizes in the initial opinion distribution within the population and the connectivity of the underlying network. As the order parameter we take the asymptotic state of opinions. In the phase diagram we find regions of total consensus and a mixed phase. As the 'laggard parameter' p_u increases, the regions of consensus shrink. In addition, we introduce rewiring of the underlying network during the opinion formation process and discuss the resulting consequences in the phase diagram.
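
    The update rule itself is stated in the abstract; the sketch below implements it directly. Synchronous updating, the Erdős-Rényi graph, and the initial 60/40 opinion split are assumptions made only for the example.

    ```python
    import random
    import networkx as nx

    def laggard_step(G, opinion, p_u):
        """One synchronous update: an agent adopts the opinion held by the
        majority of its neighbors only if that majority's share exceeds p_u;
        otherwise it keeps its current opinion."""
        new = dict(opinion)
        for v in G:
            nbrs = list(G[v])
            if not nbrs:
                continue
            ones = sum(opinion[u] for u in nbrs)
            zeros = len(nbrs) - ones
            if ones == zeros:
                continue                       # no majority, keep current opinion
            if max(ones, zeros) / len(nbrs) > p_u:
                new[v] = 1 if ones > zeros else 0
        return new

    # Example: random network with a biased initial opinion split.
    rng = random.Random(0)
    G = nx.erdos_renyi_graph(500, 0.02, seed=0)
    opinion = {v: 1 if rng.random() < 0.6 else 0 for v in G}
    for _ in range(50):
        opinion = laggard_step(G, opinion, p_u=0.7)
    ```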

    Modulating interaction times in an artificial society of robots

    In a collaborative society, sharing information is advantageous for each individual as well as for the whole community. Maximizing the number of agent-to-agent interactions per unit of time therefore becomes an appealing behavior, since fast information spreading maximizes the overall amount of shared information. However, if malicious agents are part of the society, then the risk of interacting with one of them increases with the number of interactions. In this paper, we investigate the roles of interaction rates and times (a.k.a. edge life) in artificial societies of simulated robot swarms. We adapt their social networks to form proper trust sub-networks and to contain attackers. Instead of sophisticated algorithms to build and administrate trust networks, we focus on simple control algorithms that locally adapt interaction times by changing only the robots' motion patterns. We successfully validate these algorithms in collective decision-making, showing improved time to convergence and energy-efficient motion patterns, in addition to impeding the spread of undesired opinions.
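
    The abstract does not spell out the control algorithms, so the toy simulation below only illustrates the trade-off the paper studies: a higher per-step interaction probability spreads opinions faster but also raises the chance of contacting a malicious agent. The voter-style copying rule, the parameter names, and the values are all assumptions for illustration.

    ```python
    import random

    def simulate(n=100, n_malicious=10, p_interact=0.5, steps=2000, seed=0):
        """Toy trade-off behind interaction-time modulation: each honest robot
        holds a binary opinion (1 = desired, 0 = undesired) and, when it
        interacts, copies a random peer. Malicious robots always transmit 0,
        so unchecked interaction lets the undesired opinion spread."""
        rng = random.Random(seed)
        honest = list(range(n))
        opinion = {i: rng.choice((0, 1)) for i in honest}
        all_ids = honest + [f"mal{k}" for k in range(n_malicious)]
        for _ in range(steps):
            for i in honest:
                if rng.random() < p_interact:      # interaction rate / edge-life knob
                    peer = rng.choice(all_ids)
                    opinion[i] = 0 if isinstance(peer, str) else opinion[peer]
        return sum(opinion.values()) / n           # fraction holding the desired opinion
    ```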