
    Meta-informational cue inconsistency and judgment of information accuracy: spotlight on intelligence analysis

    Meta-information is information about information; it can serve as a cue to guide judgments and decisions. Three types of meta-information that are routinely used in intelligence analysis are source reliability, information credibility and classification level. The first two cues are intended to speak to information quality (in particular, the probability that the information is accurate), and classification level is intended to describe the information’s security sensitivity. Two experiments involving professional intelligence analysts (N = 25 and 27, respectively) manipulated meta-information in a 6 (source reliability) by 6 (information credibility) by 2 (classification) repeated-measures design. Ten additional items were retested to measure intra-individual reliability. Analysts judged the probability of information accuracy based on its meta-informational profile. In both experiments, the judged probability of information accuracy was sensitive to ordinal position on the scales and to the directionality of the linguistic terms used to anchor the levels of the two scales. Directionality led analysts to group the first three levels of each scale in a positive group and the fourth and fifth levels in a negative group, with the neutral term “cannot be judged” falling between these groups. Critically, as reliability and credibility cue inconsistency increased, there was a corresponding decrease in intra-analyst reliability, inter-analyst agreement, and effective cue utilization. Neither experiment found a significant effect of classification on probability judgments.
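    One way to quantify "effective cue utilization" in a design like this is to relate each analyst's probability judgments to the ordinal positions of the cues. The sketch below builds the full 6 x 6 x 2 factorial design and correlates simulated judgments with each cue; the judgment model and all numbers are illustrative assumptions, not the authors' data:

```python
import itertools
import numpy as np

rng = np.random.default_rng(42)

# Full 6 (source reliability) x 6 (information credibility) x 2
# (classification) repeated-measures design, each cue coded by its
# ordinal scale position (0 = most favourable level).
profiles = np.array(list(itertools.product(range(6), range(6), range(2))),
                    dtype=float)
reliability, credibility, classification = profiles.T

# Simulated analyst: judged probability of accuracy declines with worse
# reliability and credibility levels but ignores classification, plus
# noise (purely illustrative).
judgments = (0.9 - 0.07 * reliability - 0.07 * credibility
             + rng.normal(0.0, 0.05, len(profiles)))

# Cue utilization: correlation between each cue and the judgments.
use = [np.corrcoef(cue, judgments)[0, 1]
       for cue in (reliability, credibility, classification)]
print([round(u, 2) for u in use])
```

Under this toy model, the reliability and credibility correlations are substantial and negative, while the classification correlation hovers near zero, mirroring the null classification effect reported above.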

    Improving the Safety of Accidentally Damaged Reinforced Concrete Columns through Composite Action

    This paper first provides an overview of some general issues associated with the robustness of structures. Progressive collapse is discussed briefly, from its basic definition to the inherent difficulties of understanding, analysing and mitigating the phenomenon. Attention is also drawn to the potential sources of abnormal loads that should be examined when designing for progressive collapse performance. In addition, some of the design standards that have been developed, and methods for designing against progressive collapse hazards, are discussed. Finally, a numerical analysis of a four-storey reinforced concrete frame structure has been carried out, and the results concerning the assessment of a progressively damaged structure are presented.

    COP21 climate negotiators' responses to climate model forecasts

    Policymakers involved in climate change negotiations are key users of climate science. It is therefore vital to understand how to communicate scientific information most effectively to this group. We tested how a unique sample of policymakers and negotiators at the Paris COP21 conference update their beliefs about year-2100 global mean temperature increases in response to a statistical summary of climate models' forecasts. We randomized the way information was provided across participants using three different formats similar to those used in Intergovernmental Panel on Climate Change reports. Despite having received all available relevant scientific information, policymakers updated very conservatively, assigning the information less weight than their own prior beliefs. However, providing individual model estimates in addition to the statistical range was more effective in mitigating such inertia. The experiment was repeated with a population of European MBA students who, despite starting from similar priors, reported conditional probabilities closer to the provided models' forecasts than the policymakers did. There was also no effect of presentation format in the MBA sample. These results highlight the importance of testing visualization tools directly on the population of interest.
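    The conservative updating described above can be summarized with a simple linear belief-revision model in which each reported posterior is a weighted average of the participant's prior and the communicated forecast; the estimated weight on the prior indexes inertia. A minimal sketch, with entirely invented numbers standing in for the experimental data:

```python
import numpy as np

# Hypothetical data: each element is one participant's prior belief and
# reported posterior about year-2100 warming (deg C); the communicated
# central forecast is a single value. All numbers are invented.
forecast = 3.0
priors = np.array([4.5, 2.0, 5.0, 3.8, 2.5])
posteriors = np.array([4.1, 2.3, 4.6, 3.6, 2.7])

# Fit posterior = w * prior + (1 - w) * forecast by least squares,
# i.e. regress (posterior - forecast) on (prior - forecast) through
# the origin. w = 0 means full adoption of the forecast; w = 1 means
# the forecast is ignored.
x = priors - forecast
y = posteriors - forecast
w = (x @ y) / (x @ x)
print(round(w, 2))
```

A weight above 0.5, as in these made-up numbers, corresponds to the paper's finding that participants put more weight on their own priors than on the scientific information provided.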

    Cognitive and psychological science insights to improve climate change data visualization

    Visualization of climate data plays an integral role in the communication of climate change findings to both expert and non-expert audiences. The cognitive and psychological sciences can provide valuable insights into how to improve visualization of climate data based on knowledge of how the human brain processes visual and linguistic information. We review four key research areas to demonstrate their potential to make data more accessible to diverse audiences: directing visual attention, visual complexity, making inferences from visuals, and the mapping between visuals and language. We present evidence-informed guidelines to help climate scientists increase the accessibility of graphics to non-experts, and illustrate how the guidelines can work in practice in the context of Intergovernmental Panel on Climate Change graphics.

    Full-Scale Shaking Table Tests on a Substandard RC Building Repaired and Strengthened with Post-Tensioned Metal Straps

    The effectiveness of a novel Post-Tensioned Metal Strapping (PTMS) technique at enhancing the seismic behaviour of a substandard RC building was investigated through full-scale shake-table tests during the EU-funded project BANDIT. The building had inadequate reinforcement detailing in columns and joints to replicate old construction practices. After the bare building was significantly damaged in initial tests, it was repaired and strengthened with PTMS for additional seismic tests. The PTMS technique considerably improved the seismic performance of the tested building. Whilst the bare building experienced critical damage during a PGA = 0.15 g earthquake, the PTMS-strengthened building sustained a PGA = 0.35 g earthquake without compromising stability.

    Forecasting the duration of volcanic eruptions: an empirical probabilistic model

    The ability to forecast future volcanic eruption durations would greatly benefit emergency response planning prior to and during a volcanic crisis. This paper introduces a probabilistic model to forecast the duration of future and on-going eruptions. The model fits theoretical distributions to observed duration data and relies on past eruptions being a good indicator of future activity. A dataset of historical Mt. Etna flank eruptions is presented and used to demonstrate the model. The data have been compiled through critical examination of the existing literature, along with careful consideration of uncertainties on reported eruption start and end dates between the years 1300 AD and 2010; data following 1600 are considered to be reliable and free of reporting biases. The distribution of eruption durations between the years 1600 and 1670 is found to be statistically different from that following 1670 and represents the culminating phase of a century-scale cycle. The forecasting model is run on two datasets of Mt. Etna flank eruption durations: 1600-2010 and 1670-2010. Each dataset is modelled using a log-logistic distribution with parameter values found by maximum likelihood estimation. Survivor function statistics are applied to the model distributions to forecast (a) the probability of an eruption exceeding a given duration, (b) the probability of an eruption that has already lasted a particular number of days exceeding a given total duration, and (c) the duration with a given probability of being exceeded. Results show that excluding the 1600-1670 data has little effect on the forecasting model results, especially where short durations are involved. By assigning the terms 'likely' and 'unlikely' to probabilities of 66 % and 33 %, respectively, the forecasting model is used on the 1600-2010 dataset to indicate that a future flank eruption on Mt. Etna would be likely to exceed 20 days (± 7 days) but unlikely to exceed 68 days (± 29 days).
    This model can easily be adapted for use on other highly active, well-documented volcanoes, or for different duration data such as the duration of explosive episodes or the duration of repose periods between eruptions.
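    The three survivor-function forecasts (a)-(c) can be sketched with SciPy, whose `stats.fisk` distribution is the log-logistic. The durations below are simulated stand-ins, not the Etna catalogue, and the shape/scale values are arbitrary assumptions:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)

# Hypothetical eruption durations in days; scipy.stats.fisk is the
# log-logistic distribution the paper fits.
durations = stats.fisk.rvs(c=1.5, scale=25, size=60, random_state=rng)

# Maximum-likelihood fit, location fixed at zero for a pure
# two-parameter log-logistic model.
c, loc, scale = stats.fisk.fit(durations, floc=0)

# (a) Probability an eruption exceeds a given duration: S(t) = 1 - F(t).
p_exceed_20 = stats.fisk.sf(20, c, loc=loc, scale=scale)

# (b) Probability an eruption already lasting 20 days exceeds 68 days
#     in total: the conditional survivor function S(68) / S(20).
p_cond = (stats.fisk.sf(68, c, loc=loc, scale=scale)
          / stats.fisk.sf(20, c, loc=loc, scale=scale))

# (c) Duration exceeded with a given probability: inverse survivor
#     function, here at the paper's 'likely' (66 %) and 'unlikely'
#     (33 %) thresholds.
d_likely = stats.fisk.isf(0.66, c, loc=loc, scale=scale)
d_unlikely = stats.fisk.isf(0.33, c, loc=loc, scale=scale)

print(p_exceed_20, p_cond, d_likely, d_unlikely)
```

Because the inverse survivor function is decreasing in probability, the 66 % threshold always yields a shorter duration than the 33 % threshold, matching the "likely to exceed 20 days, unlikely to exceed 68 days" structure of the published result.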