    Interrupting the social amplification of risk process: a case study in collective emissions reduction

    One of the main approaches we have for studying the progressive divergence of understandings around a risk issue is that of social risk amplification. This article describes a case study of a particular environmental contaminant, a chemical flame retardant that could be interpreted as having produced a risk amplifying process. It describes in particular how a group of industrial organizations acted collectively to reduce emissions of this contaminant, in an apparent attempt to avert regulation and boycotts—that is, to intercept the social amplification process and avoid its secondary effects. The aim of the study was to investigate the constitutive qualities of this collective action: the qualities that defined it and made it effective in the eyes of those involved. These include institutionalisation and independence, the ability to confer individual as well as collective benefit, the capacity to attract (rather than avoid) criticism, and the ‘branding’ that helps communicate what would otherwise appear to be a set of unconnected, local actions. Although the risk amplification framework has been criticised for implying that there is some externally given risk level that is subsequently amplified, it does appear to capture the mentality of actors involved in issues of this kind. They talk and act as though they believe they are participants in a risk amplification process.

    An Evaluation of the Aesthetics of Junkyard Screening and Billboard Densities


    Considerations in the design of large space structures

    Several analytical studies of topics relevant to the design of large space structures are presented. Topics covered are: the types and quantitative evaluation of the disturbances to which large Earth-oriented microwave reflectors would be subjected and the resulting attitude errors of such spacecraft; the influence of errors in the structural geometry on the performance of radiofrequency antennas; the effect of creasing on the flatness of a tensioned reflector membrane surface; and an analysis of the statistics of damage to truss-type structures due to meteoroids.

    Null hypothesis testing ≠ scientific inference: a critique of the shaky premise at the heart of the science and values debate, and a defense of value‐neutral risk assessment

    Many philosophers and statisticians argue that risk assessors are morally obligated to evaluate the probabilities and consequences of methodological error, and to base their decisions about whether to adopt a given parameter value, model, or hypothesis on those considerations. This argument is couched within the rubric of null hypothesis testing, which I suggest is a poor descriptive and normative model for risk assessment. Risk regulation is not primarily concerned with evaluating the probability of data conditional upon the null hypothesis, but rather with measuring risks, estimating the consequences of available courses of action and inaction, formally characterizing uncertainty, and deciding what to do based upon explicit values and decision criteria. In turn, I defend an ideal of value‐neutrality, whereby the core inferential tasks of risk assessment—such as weighing evidence, estimating parameters, and model selection—should be guided by the aim of correspondence to reality. This is not to say that value judgments be damned, but rather that they should be accounted for within a structured approach to decision analysis, rather than embedded within risk assessment in an informal manner.
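
    As a rough sketch of the two-step structure this abstract advocates (inference aimed at correspondence to reality, followed by a separate decision step with explicit values), the example below estimates an exceedance probability and then chooses an action by expected loss. The monitoring data, prior, and loss values are invented for illustration and are not drawn from the paper.

```python
import numpy as np

rng = np.random.default_rng(1)

# Step 1 (inference, aimed at correspondence to reality): estimate the
# probability that a contaminant exceeds a regulatory limit, with uncertainty.
# The monitoring data and Beta(1, 1) prior are hypothetical.
exceedances, samples = 8, 50
posterior = rng.beta(1 + exceedances, 1 + samples - exceedances, size=100_000)
p_exceed = posterior.mean()

# Step 2 (decision, with explicit values): compare actions by expected loss
# under the posterior, keeping the value judgments in a declared loss table
# rather than embedded informally in the estimation step.
LOSS = {
    ("regulate",   "exceeds"): 1.0,   # cost of regulating
    ("regulate",   "safe"):    1.0,
    ("do_nothing", "exceeds"): 10.0,  # harm if the risk is realized
    ("do_nothing", "safe"):    0.0,
}
expected_loss = {
    action: LOSS[(action, "exceeds")] * p_exceed
            + LOSS[(action, "safe")] * (1 - p_exceed)
    for action in ("regulate", "do_nothing")
}
print(expected_loss, "->", min(expected_loss, key=expected_loss.get))
```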

    Fast and frugal crisis management: an analysis of rule-based judgment and choice during water contamination events

    Drawing on the fast and frugal research program, this paper describes a retrospective field study of decision making during water contamination events. It characterizes three heuristics employed in real-world decision making. The credibility heuristic discriminates between signals from targets and noise from distracters on the basis of the perceived trustworthiness of the message conveyor. With the precedent heuristic, the response to an unfolding event is determined by searching for past analogues (i.e., precedents) and, if found, treating the current event in the same fashion. By contrast, the facts-trump-speculation heuristic discriminates between conflicting explanations or claims according to how they rank on pre-determined hierarchies of evidence (orders of cue validities), neglecting utilities and avoiding the aggregation of competing lines of evidence. Rather than cataloguing the biases that these heuristics lead to, this paper focuses on the structural factors which shape each heuristic’s ecological rationality. In doing so, the study develops ideas about how particular infrastructure systems and forms of social organization structure the validity of cues, the accessibility of information, and the application of particular heuristics. The study also introduces the concept of safeguards to rule-based reasoning, and the idea that heuristics can be used to rationalize decisions, and deployed strategically to persuade other social actors. The over-arching claim is that the fast and frugal program provides a powerful framework for analyzing judgment and choice in organizations, and offers a bridge between psychological and political models of organizational behavior.
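
    As a rough sketch of the kind of lexicographic rule the facts-trump-speculation heuristic describes, a choice between conflicting explanations might look like the following. The evidence hierarchy and claims are hypothetical and not taken from the field study.

```python
# Minimal sketch of a "facts-trump-speculation" style lexicographic rule.
# The evidence hierarchy and example claims are invented for illustration.

# Pre-determined hierarchy of evidence: lower rank = more credible cue.
EVIDENCE_RANK = {
    "direct_measurement": 0,
    "historical_precedent": 1,
    "expert_judgment": 2,
    "speculation": 3,
}

def best_rank(claim):
    """Rank a claim by its single strongest line of evidence (no aggregation)."""
    return min(EVIDENCE_RANK[e] for e in claim["evidence"])

def facts_trump_speculation(claims):
    """Choose between conflicting claims on the evidence hierarchy alone,
    neglecting utilities and without weighing competing lines of evidence."""
    return min(claims, key=best_rank)

claims = [
    {"explanation": "treatment failure", "evidence": ["expert_judgment", "speculation"]},
    {"explanation": "sampling error",    "evidence": ["direct_measurement"]},
]
print(facts_trump_speculation(claims)["explanation"])  # -> "sampling error"
```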

    Handling uncertainty in models of seismic and postseismic hazards: toward robust methods and resilient societies

    Earthquakes, tsunamis, and landslides take a devastating toll on human lives, critical infrastructure, and ecosystems. Harnessing the predictive capacities of hazard models is key to transitioning from reactive approaches to disaster management toward building resilient societies, yet the knowledge that these models produce involves multiple uncertainties. The failure to properly account for these uncertainties has at times had important implications, from the flawed safety measures at the Fukushima power plant to the reliance on short‐term earthquake prediction models (reportedly at the expense of mitigation efforts) in modern China. This article provides an overview of methods for handling uncertainty in probabilistic seismic hazard assessment, tsunami hazard analysis, and debris flow modeling, considering best practices and areas for improvement. It covers sensitivity analysis, structured approaches to expert elicitation, methods for characterizing structural uncertainty (e.g., ensembles and logic trees), and the value of formal decision‐analytic frameworks even in situations of deep uncertainty.
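
    A minimal sketch of one such method, combining alternative hazard models through a weighted logic tree, is given below. The branch curves and weights are invented for illustration and do not come from the article.

```python
import numpy as np

# Hypothetical annual exceedance curves from three alternative ground-motion
# models (logic-tree branches); values and weights are invented for illustration.
pga = np.array([0.1, 0.2, 0.4, 0.8])                 # peak ground acceleration (g)
branches = {
    "model_A": np.array([2e-2, 8e-3, 2e-3, 3e-4]),   # annual exceedance rates
    "model_B": np.array([3e-2, 1e-2, 3e-3, 5e-4]),
    "model_C": np.array([1e-2, 5e-3, 1e-3, 1e-4]),
}
weights = {"model_A": 0.5, "model_B": 0.3, "model_C": 0.2}

# Weighted mean hazard curve over the logic tree ...
mean_curve = sum(weights[b] * branches[b] for b in branches)

# ... and the spread across branches as a crude indicator of structural uncertainty.
spread = np.ptp(np.vstack(list(branches.values())), axis=0)

for a, r, s in zip(pga, mean_curve, spread):
    print(f"PGA {a:.1f} g: mean rate {r:.2e}/yr, branch spread {s:.2e}/yr")
```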

    Heuristics structure and pervade formal risk assessment

    Lay perceptions of risk appear rooted more in heuristics than in reason. A major concern of the risk regulation literature is that such “error-strewn” perceptions may be replicated in policy, as governments respond to the (mis)fears of the citizenry. This has led many to advocate a relatively technocratic approach to regulating risk, characterized by high reliance on formal risk and cost-benefit analysis. However, through two studies of chemicals regulation, we show that the formal assessment of risk is pervaded by its own set of heuristics. These include rules to categorize potential threats, define what constitutes valid data, guide causal inference, and select and apply formal models. Some of these heuristics lay claim to theoretical or empirical justifications, others are more back-of-the-envelope calculations, while still others purport not to reflect some truth but simply to constrain discretion or perform a desk-clearing function. These heuristics can be understood as a way of authenticating or formalizing risk assessment as a scientific practice, representing a series of rules for bounding problems, collecting data, and interpreting evidence (a methodology). Heuristics are indispensable elements of induction, and so they are not problematic per se; they can become so when treated as laws rather than as contingent and provisional rules. Pitfalls include the potential for systematic error, masking uncertainties, strategic manipulation, and entrenchment. Our central claim is that by studying the rules of risk assessment qua rules, we develop a novel representation of the methods, conventions, and biases of the prior art.
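
    As a toy illustration of what it means to write such heuristics down as explicit, contingent rules, the sketch below encodes a data-validity rule and a categorization rule. The criteria, thresholds, and study data are hypothetical and are not drawn from the two chemicals-regulation studies.

```python
# Toy illustration of risk-assessment heuristics expressed as explicit,
# contingent rules; thresholds and categories are hypothetical, not taken
# from any real regulatory guideline.

def is_valid_study(study):
    """Data-validity rule: accept only studies meeting minimal design criteria."""
    return study["n"] >= 10 and study["has_control_group"]

def categorize(no_effect_level_mg_kg):
    """Categorization rule: band a substance by its no-effect level."""
    if no_effect_level_mg_kg < 1:
        return "high concern"
    if no_effect_level_mg_kg < 100:
        return "moderate concern"
    return "low concern"

studies = [
    {"id": "s1", "n": 25, "has_control_group": True,  "noel": 0.5},
    {"id": "s2", "n": 6,  "has_control_group": False, "noel": 200.0},
]
valid = [s for s in studies if is_valid_study(s)]
lowest_noel = min(s["noel"] for s in valid)  # rule: use the most sensitive valid study
print(categorize(lowest_noel))               # -> "high concern"
```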

    Anisotropic Distribution of SDSS Satellite Galaxies: Planar (not Polar) Alignment

    The distribution of satellite galaxies relative to isolated host galaxies in the Sloan Digital Sky Survey (SDSS) is investigated. Host-satellite systems are selected using three different methods, yielding samples of ~3300, ~1600, and ~950 satellites. In the plane of the sky, the distributions of all three samples show highly significant deviations from circular symmetry (> 99.99%, > 99.99%, and 99.79% confidence levels, respectively), and the degree of anisotropy is a strong function of the projected radius, r_p, at which the satellites are found. For r_p < 100 kpc, the SDSS satellites are aligned preferentially with the major axes of the hosts. This is in stark contrast to the Holmberg effect, in which satellites are aligned with the minor axes of host galaxies. The degree of anisotropy in the distribution of the SDSS satellites decreases with r_p and is consistent with an isotropic distribution at roughly the 1-sigma level for 250 kpc < r_p < 500 kpc.
    Comment: ApJ Letters (in press); Discussion section substantially revised, SDSS DR3 included in the analysis, no significant changes to the result.
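
    As a rough sketch of this kind of anisotropy test (the paper's actual statistical method may differ), the example below folds satellite position angles measured from the host major axis into [0, 90] degrees and tests them against the uniform distribution expected under circular symmetry. The angles are synthetic, not the SDSS samples.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)

# Synthetic position angles of satellites relative to the host major axis,
# folded into [0, 90] degrees; the data are invented for illustration.
theta = np.abs(rng.normal(loc=30.0, scale=25.0, size=3300)) % 90.0

# Under circular symmetry the folded position angles are uniform on [0, 90];
# a KS test against that uniform distribution flags anisotropy.
d_stat, p_value = stats.kstest(theta, "uniform", args=(0.0, 90.0))
mean_angle = theta.mean()  # < 45 deg suggests alignment with the major axis

print(f"KS D = {d_stat:.3f}, p = {p_value:.2e}, mean angle = {mean_angle:.1f} deg")
```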