105 research outputs found

    The amplification of risk in experimental diffusion chains

    Understanding how people form and revise their perception of risk is central to designing efficient risk communication methods, eliciting risk awareness, and avoiding unnecessary anxiety among the public. However, public responses to hazardous events such as climate change, contagious outbreaks, and terrorist threats are complex and difficult to anticipate. Although many psychological factors influencing risk perception have been identified, it remains unclear how perceptions of risk change when propagated from one person to another, and what impact the repeated social transmission of perceived risk has at the population scale. Here, we study the social dynamics of risk perception by analyzing how messages detailing the benefits and harms of a controversial antibacterial agent change as they are passed from one person to the next in 10-subject experimental diffusion chains. Our analyses show that as messages propagate through the diffusion chains, they tend to become shorter, increasingly inaccurate, and increasingly dissimilar between chains. In contrast, the perception of risk is propagated with higher fidelity: participants manipulate messages to fit their preconceptions, thereby influencing the judgments of subsequent participants. Computer simulations implementing this simple influence mechanism show that small judgment biases tend to become more extreme, even when the injected message contradicts preconceived risk judgments. Our results provide quantitative insights into the social amplification of risk perception and can help policy makers better anticipate and manage the public response to emerging threats.
    Comment: Published online in PNAS Early Edition (open access): http://www.pnas.org/content/early/2015/04/14/142188311
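    The influence mechanism described above lends itself to a compact simulation. The sketch below is a hypothetical toy model, not the authors' published code: each participant's judgment blends the incoming message with a preconceived risk level, and that judgment becomes the message passed on. All parameter names and values are illustrative assumptions.

```python
import random

# Toy model (assumed, not the paper's implementation) of risk propagation:
# each participant's judgment is a weighted blend of the received message
# and their own preconception; the blended judgment is passed on.

CHAIN_LENGTH = 10          # 10-subject chains, as in the experiment
PRECONCEPTION_MEAN = 0.6   # assumed mean preconceived risk (0 = safe, 1 = risky)
OPENNESS = 0.3             # assumed weight given to the incoming message

def run_chain(initial_message: float) -> list[float]:
    """Propagate a risk signal through one diffusion chain."""
    message = initial_message
    judgments = []
    for _ in range(CHAIN_LENGTH):
        preconception = random.gauss(PRECONCEPTION_MEAN, 0.1)
        judgment = OPENNESS * message + (1 - OPENNESS) * preconception
        judgments.append(judgment)
        message = judgment  # the next person receives the distorted message
    return judgments

# Even a message contradicting the bias (a low risk signal of 0.1)
# drifts toward the preconceived judgment along the chain.
print([round(j, 2) for j in run_chain(initial_message=0.1)])
```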

    FFTrees: A toolbox to create, visualize, and evaluate fast-and-frugal decision trees

    Fast-and-frugal trees (FFTs) are simple algorithms that facilitate efficient and accurate decisions based on limited information. Despite their successful use in many applied domains, however, there has been no widely available toolbox that allows anyone to easily create, visualize, and evaluate FFTs. We fill this gap by introducing the R package FFTrees. In this paper, we explain how FFTs work, introduce a new class of algorithms called fan for constructing FFTs, and provide a tutorial for using the FFTrees package. We then conduct a simulation across ten real-world datasets to test how well FFTs created by FFTrees can predict data. The results show that FFTs created by FFTrees predict data as well as popular classification algorithms such as regression and random forests, while remaining simple enough for anyone to understand and use.
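    FFTrees itself is an R package, so as a language-neutral illustration of what a fast-and-frugal tree computes, here is a minimal hypothetical sketch in Python: cues are checked in a fixed order, every cue but the last offers an immediate exit, and the last cue decides both ways. The cues and thresholds are invented for illustration.

```python
from typing import Callable

# A cue is (test, exit_decision): if test(case) is True, exit with that
# decision; otherwise fall through to the next cue in the tree.
Cue = tuple[Callable[[dict], bool], bool]

def fft_classify(case: dict, cues: list[Cue], final_default: bool) -> bool:
    """Classify a case with a fast-and-frugal tree (sketch, not FFTrees)."""
    for test, exit_decision in cues:
        if test(case):
            return exit_decision
    return final_default  # second exit of the final cue

# Invented example tree (all cues and thresholds are assumptions):
cues = [
    (lambda c: c["chest_pain"], True),         # pain present -> high risk, exit
    (lambda c: c["age"] < 40, False),          # young -> low risk, exit
    (lambda c: c["cholesterol"] > 240, True),  # final cue: high -> high risk
]

case = {"chest_pain": False, "age": 55, "cholesterol": 250}
print(fft_classify(case, cues, final_default=False))  # -> True
```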

    The environment matters: Comparing individuals and dyads in their adaptive use of decision strategies

    Individuals have been shown to adaptively select decision strategies depending on the structure of the environment. Two experiments extended this research to the group level. Subjects (N = 240) worked either individually or in two-person groups (dyads) on a multi-attribute paired-comparison task. They were randomly assigned to one of two environments, each favoring one of two prototypical decision strategies: weighted additive or take-the-best (between-subjects design in Experiment 1, within-subject design in Experiment 2). Performance measures revealed that both individuals and dyads learned to adapt over time. Higher starting and overall performance in the environment favoring weighted additive suggests that weighted additive served as a default strategy. When this default had to be abandoned because the environment favored take-the-best, the superior adaptive capacity of dyads became apparent as a steeper learning rate. Analyses of nominal dyads indicate that real dyads performed at the level of the best individuals. Fine-grained analyses of information-search data are also presented. The results point to the strong moderating role of environment structure when comparing individual with group performance and are discussed within the framework of adaptive strategy selection.
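    For concreteness, the two prototypical strategies can be sketched in a few lines. The code below is a hypothetical illustration of a paired-comparison trial, not the study's materials; the cue weights and options are invented.

```python
# Options are lists of binary cue values, ordered by (assumed) cue validity.
CUE_WEIGHTS = [0.9, 0.7, 0.6, 0.55]  # invented validities/weights

def weighted_additive(a: list[int], b: list[int]) -> int:
    """WADD: weight and sum all cues; pick the higher total (0 = a, 1 = b)."""
    score_a = sum(w * v for w, v in zip(CUE_WEIGHTS, a))
    score_b = sum(w * v for w, v in zip(CUE_WEIGHTS, b))
    return 0 if score_a >= score_b else 1

def take_the_best(a: list[int], b: list[int]) -> int:
    """TTB: inspect cues in validity order; decide on the first that discriminates."""
    for va, vb in zip(a, b):
        if va != vb:
            return 0 if va > vb else 1
    return 0  # no cue discriminates: default to the first option (a guess)

option_a, option_b = [1, 0, 0, 1], [0, 1, 1, 1]
print(weighted_additive(option_a, option_b))  # 1: b wins on the weighted sum
print(take_the_best(option_a, option_b))      # 0: a wins on the first cue alone
```

    Note how the two strategies can disagree on the same pair, which is why the environment structure (how predictive the best cue is relative to the rest) determines which strategy performs better.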

    Reverse Engineering tools: development and experimentation of innovative methods for physical and geometrical data integration and post-processing

    In recent years, Reverse Engineering systems have attracted considerable interest across a wide range of applications, and many research activities focus on the accuracy and precision of the acquired data and on improving the post-processing phase. In this context, this PhD thesis defines two novel methods, one for data post-processing and one for data fusion between physical and geometrical information. First, a technique is defined to characterize the error in the 3D coordinates of points acquired by an optical triangulation laser scanner, identifying suitable correction arrays to apply under different acquisition parameters and operating conditions. The systematic error in the acquired data is thereby compensated, increasing accuracy. Second, the definition of a 3D thermogram is examined: the object's geometrical information and its thermal properties, obtained from a thermographic inspection, are combined so that each recognizable point carries a temperature value. The data acquired by the laser scanner are also used to normalize the temperature values, making the thermal data independent of the thermal camera's point of view.
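    The two methods can be summarized schematically. The sketch below is an assumed illustration of the ideas only; the correction model, array shapes, and function names are not from the thesis.

```python
import numpy as np

def apply_correction(points: np.ndarray, correction: np.ndarray) -> np.ndarray:
    """Compensate systematic acquisition error: both arguments are (N, 3)
    arrays; the correction array is assumed to be the one identified for the
    current acquisition parameters and operating conditions."""
    return points - correction

def fuse_thermogram(points: np.ndarray, pixel_idx: np.ndarray,
                    thermal_image: np.ndarray) -> np.ndarray:
    """Build a 3D thermogram: pixel_idx maps each 3D point to its (row, col)
    in the thermal image, yielding an (N, 4) array of [x, y, z, T]."""
    temps = thermal_image[pixel_idx[:, 0], pixel_idx[:, 1]]
    return np.column_stack([points, temps])
```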

    We favor formal models of heuristics rather than lists of loose dichotomies: a reply to Evans and Over

    In their comment on Marewski et al. (Good judgments do not require complex cognition, 2009), Evans and Over (Heuristic thinking and human intelligence: a commentary on Marewski, Gaissmaier and Gigerenzer, 2009) conjectured that heuristics can often lead to biases and are not error free. This is a most surprising critique. The computational models of heuristics we have tested allow for quantitative predictions of how many errors a given heuristic will make, and we and others have measured the amount of error by analysis, computer simulation, and experiment. This is clear progress over simply giving heuristics labels, such as availability, that do not allow for quantitative comparisons of errors. Evans and Over argue that the reason people rely on heuristics is the accuracy-effort trade-off. However, comparisons between heuristics and more effortful strategies, such as multiple regression, have shown that there are many situations in which a heuristic is more accurate with less effort. Finally, we do not see how the fast-and-frugal heuristics program could benefit from a dual-process framework unless that framework is made more precise. Instead, the dual-process framework could benefit if its two "black boxes" (Type 1 and Type 2 processes) were replaced with computational models of both heuristics and other processes.

    Opportunities and challenges of Web 2.0 for vaccination decisions.

    A growing number of people use the Internet to obtain health information, including information about vaccines. Websites that allow and promote interaction among users are an increasingly popular source of health information. Users of such so-called Web 2.0 applications (e.g. social media), while still in the minority, represent a growing proportion of online communicators, including vocal and active anti-vaccination groups as well as public health communicators. In this paper, the authors: define Web 2.0 and examine how it may influence vaccination decisions; discuss how anti-vaccination movements use Web 2.0 and the challenges Web 2.0 poses for public health communicators; describe the types of information used in these different settings; introduce the theoretical background that can be used to design effective vaccination communication in a Web 2.0 environment; make recommendations for practice; and pose open questions for future research. The authors conclude that, as a result of the Internet and Web 2.0, private and public concerns surrounding vaccinations can spread virally across the globe in a quick, efficient, and vivid manner. Web 2.0 may influence vaccination decisions by delivering information that alters the perceived personal risk of vaccine-preventable diseases or of vaccination side effects. It appears useful for public health officials to increase the effectiveness of existing communication by implementing interactive, customized communication. A key step toward successful public health communication is to identify those who are particularly vulnerable to finding and using unreliable and misleading information. It therefore appears worthwhile for public health websites to strive to be easy to find, easy to use, and attractive in their presentation, and to readily provide the information, support, and advice that searchers are looking for, especially when less knowledgeable individuals need reliable information about vaccination risks and benefits.

    Presenting quantitative information about decision outcomes: a risk communication primer for patient decision aid developers

    Background: Making evidence-based decisions often requires comparing two or more options. Research-based evidence may exist that quantifies how likely the outcomes are for each option. Understanding these numeric estimates improves patients' risk perception and leads to better-informed decision making. This paper summarises current "best practices" in the communication of evidence-based numeric outcomes for developers of patient decision aids (PtDAs) and other health communication tools. Method: An expert consensus group of fourteen researchers from North America, Europe, and Australasia identified eleven main issues in risk communication. Two experts for each issue wrote a "state of the art" summary of the best evidence, drawing on the PtDA, health, psychological, and broader scientific literature. In addition, commonly used terms were defined, and a set of guiding principles and key messages was derived from the results. Results: The eleven key components of risk communication were: 1) presenting the chance an event will occur; 2) presenting changes in numeric outcomes; 3) outcome estimates for test and screening decisions; 4) numeric estimates in context and with evaluative labels; 5) conveying uncertainty; 6) visual formats; 7) tailoring estimates; 8) formats for understanding outcomes over time; 9) narrative methods for conveying the chance of an event; 10) important skills for understanding numerical estimates; and 11) interactive web-based formats. The guiding principles derived from the evidence summaries advise that risk communication formats should reflect the task required of the user, should always define a relevant reference class (i.e., the denominator) over time, should aim to use a consistent format throughout documents, should avoid "1 in x" formats and variable denominators, should consider the magnitude of the numbers used and the possibility of format bias, and should take into account the numeracy and graph literacy of the audience. Conclusion: A substantial and rapidly expanding evidence base exists for risk communication. Developers of tools to facilitate evidence-based decision making should apply these principles to improve the quality of risk communication in practice.
    http://deepblue.lib.umich.edu/bitstream/2027.42/116070/1/12911_2013_Article_751.pd
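    As a small illustration of the "consistent denominator" advice, the hypothetical helper below (an assumption for illustration, not from the paper) re-expresses "1 in x" risks as counts out of one fixed reference class, so that two risks can be compared directly:

```python
from fractions import Fraction

REFERENCE_CLASS = 1000  # one fixed denominator for all statements

def as_frequency(risk: Fraction, denominator: int = REFERENCE_CLASS) -> str:
    """Convert a probability to an 'N out of D people' natural frequency."""
    events = risk * denominator
    return f"about {round(events)} out of {denominator} people"

# "1 in 8" vs "1 in 79" is hard to compare; a common denominator is not.
print(as_frequency(Fraction(1, 8)))   # about 125 out of 1000 people
print(as_frequency(Fraction(1, 79)))  # about 13 out of 1000 people
```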

    Risk-Adjusted Cancer Screening and Prevention (RiskAP): Complementing Screening for Early Disease Detection by a Learning Screening Based on Risk Factors

    Background: Risk-adjusted cancer screening and prevention is a promising and continuously emerging option for improving cancer prevention. It is driven by increasing knowledge of risk factors and by the ability to determine them for individual risk prediction. However, there is a knowledge gap between evidence of increased risk and evidence of the effectiveness and efficiency of clinical preventive interventions based on that increased risk. This gap is aggravated in particular by the wide availability of genetic risk-factor diagnostics, since the question of appropriate preventive measures arises immediately once an increased risk is identified. Yet collecting proof of effective preventive measures, ideally through prospective randomized prevention studies, typically takes very long, while knowledge of an increased risk creates an immediate and strong demand for action. Summary: We therefore propose a risk-adjusted prevention concept that makes needed and appropriate preventive measures available on the basis of the best current evidence, is constantly assessed through outcome evaluation, and is continuously improved based on these results. We further discuss the structural and procedural requirements as well as the legal and socioeconomic aspects relevant to implementing this concept.
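    The proposed loop (act on the best current risk evidence, evaluate outcomes, adjust) can be caricatured in code. The sketch below is purely an assumed illustration; the threshold, target yield, and update rule are invented, not part of the RiskAP proposal.

```python
def recommend_screening(predicted_risk: float, threshold: float = 0.02) -> bool:
    """Risk-adjusted rule: screen only those whose predicted absolute risk
    exceeds an action threshold (an assumed 2% here)."""
    return predicted_risk >= threshold

def update_threshold(threshold: float, detected: int, screened: int,
                     target_yield: float = 0.01) -> float:
    """Learning step mimicking continuous outcome evaluation: nudge the
    threshold so the observed detection yield moves toward a target."""
    observed_yield = detected / screened if screened else 0.0
    # Too few detections per screen -> raise the bar; too many -> lower it.
    return threshold * 1.1 if observed_yield < target_yield else threshold * 0.9
```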