Risk Game: Capturing impact of information quality on human belief assessment and decision making
This paper presents the Risk Game, a general methodology to elicit experts' knowledge and know-how: their ability to deal with information provided by different types of sources (sensors or humans) of variable quality, to take the information quality into account, and to reason about concurrent events. It is a contrived technique capturing data that express human reasoning features during a specific task of situation assessment. The information is abstracted by cards, and its quality, which varies along the three dimensions of uncertainty, imprecision and falseness, is randomly selected by dice roll. The game has been played by experts in maritime surveillance, mostly marine officers from several nations. The Risk Game is domain-independent and can be designed for any specific application involving reasoning with multiple sources. The preliminary results obtained are promising and support the validity of the elicitation method in capturing the link between information quality and human belief assessment. Besides the positive feedback collected from the players and their perceived effectiveness of the method, the data effectively capture the impact of some specific information quality dimensions on belief assessment. We highlight, for instance, that the relevance of information as perceived by the players may differ from its actual relevance, that a high ratio of false information increases the uncertainty of the player before a decision and may lead to wrong decisions, and that the context has a high impact on the decision made. Future extensions of the Risk Game are finally sketched.
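A minimal sketch of the kind of mechanism the abstract describes, assuming invented reliability values and a simplified discounted Bayesian update; the actual game is played with physical cards and dice by human experts, and none of the names or numbers below come from the paper:

```python
import random

# Illustrative sketch only: a die roll sets the reliability of each abstract
# "card", and unreliable cards move the player's belief less. The quality levels
# and the discounting scheme are simplified assumptions, not the published protocol.

QUALITY_LEVELS = {1: 0.2, 2: 0.4, 3: 0.5, 4: 0.6, 5: 0.8, 6: 0.9}  # die face -> reliability

def draw_card():
    """Draw an information card: it supports or contradicts the hypothesis
    'threat present', with a reliability set by a die roll."""
    supports = random.random() < 0.5
    reliability = QUALITY_LEVELS[random.randint(1, 6)]
    return supports, reliability

def update_belief(belief, supports, reliability):
    """Discounted Bayesian-style update of the belief in 'threat present'."""
    likelihood = reliability if supports else 1.0 - reliability
    return likelihood * belief / (likelihood * belief + (1.0 - likelihood) * (1.0 - belief))

belief = 0.5  # neutral prior
for _ in range(10):  # ten cards dealt during one game round
    supports, reliability = draw_card()
    belief = update_belief(belief, supports, reliability)
print(f"final belief that a threat is present: {belief:.2f}")
```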
A semi-empirical Bayesian chart to monitor Weibull percentiles
This paper develops a Bayesian control chart for the percentiles of the Weibull distribution when both its in-control and out-of-control parameters are unknown. The Bayesian approach enhances parameter estimates for the small sample sizes that occur when monitoring rare events, as in high-reliability applications or genetic mutations. The chart monitors the parameters of the Weibull distribution directly, instead of transforming the data as most Weibull-based charts do in order to comply with their normality assumption. The chart uses the whole accumulated knowledge, combining the likelihood of the current sample with the information given by both the initial prior knowledge and all the past samples. The chart is adaptive, since its control limits change (e.g. narrow) during Phase I. An example is presented and good Average Run Length properties are demonstrated. In addition, the paper gives insights into the nature of monitoring Weibull processes by highlighting the relationship between distribution and process parameters.
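As a rough illustration of monitoring a Weibull percentile with posterior knowledge that accumulates over Phase I samples, the sketch below uses a generic grid-based Bayesian update with a flat prior and synthetic data; it is not the paper's semi-empirical chart, and all priors, sample sizes and limits are assumptions:

```python
import numpy as np
from scipy import stats

# Generic illustration: grid-based Bayesian update of the Weibull shape/scale,
# from which the posterior of the 10th percentile is obtained and turned into
# credible-interval control limits that tend to narrow as samples accumulate.
# Prior ranges and sample data are invented for the example.

rng = np.random.default_rng(1)
shape_grid = np.linspace(0.5, 5.0, 120)
scale_grid = np.linspace(50.0, 400.0, 150)
B, S = np.meshgrid(shape_grid, scale_grid, indexing="ij")
log_post = np.zeros_like(B)                       # flat prior over the grid

p = 0.10                                          # monitored percentile
percentile = S * (-np.log(1.0 - p)) ** (1.0 / B)  # Weibull p-quantile on the grid

for t in range(8):                                # Phase I: eight small samples
    x = rng.weibull(2.0, size=3) * 200.0          # true shape = 2, scale = 200
    log_post += stats.weibull_min.logpdf(x[:, None, None], c=B, scale=S).sum(axis=0)
    post = np.exp(log_post - log_post.max())
    post /= post.sum()
    # Posterior distribution of the percentile -> 2.5% / 97.5% credible limits.
    order = np.argsort(percentile, axis=None)
    cdf = np.cumsum(post.ravel()[order])
    lcl = percentile.ravel()[order][np.searchsorted(cdf, 0.025)]
    ucl = percentile.ravel()[order][np.searchsorted(cdf, 0.975)]
    print(f"sample {t + 1}: limits for the 10th percentile = [{lcl:.1f}, {ucl:.1f}]")
```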
Causes of differences in model and satellite tropospheric warming rates
In the early twenty-first century, satellite-derived tropospheric warming trends were generally smaller than trends estimated from a large multi-model ensemble. Because observations and coupled model simulations do not have the same phasing of natural internal variability, such decadal differences in simulated and observed warming rates invariably occur. Here we analyse global-mean tropospheric temperatures from satellites and climate model simulations to examine whether warming rate differences over the satellite era can be explained by internal climate variability alone. We find that in the last two decades of the twentieth century, differences between modelled and observed tropospheric temperature trends are broadly consistent with internal variability. Over most of the early twenty-first century, however, model tropospheric warming is substantially larger than observed; warming rate differences are generally outside the range of trends arising from internal variability. The probability that multi-decadal internal variability fully explains the asymmetry between the late twentieth and early twenty-first century results is low (between zero and about 9%). It is also unlikely that this asymmetry is due to the combined effects of internal variability and a model error in climate sensitivity. We conclude that model overestimation of tropospheric warming in the early twenty-first century is partly due to systematic deficiencies in some of the post-2000 external forcings used in the model simulations.
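The comparison described above can be pictured with a toy calculation: least-squares trends of synthetic "observed" and "modelled" series are contrasted with the spread of trends produced by unforced noise alone. All series and numbers are invented and the procedure is only schematic, not the authors' analysis:

```python
import numpy as np

# Schematic only: compare an observed trend with modelled trends and judge the
# difference against the spread of trends produced by internal variability alone
# (mimicked here by trends of unforced, white-noise segments of the same length).

rng = np.random.default_rng(0)
years = np.arange(2000, 2018)

def lstsq_trend(series, t=years):
    """Least-squares linear trend in K per decade."""
    return np.polyfit(t, series, 1)[0] * 10.0

obs = 0.012 * (years - years[0]) + rng.normal(0, 0.08, years.size)      # ~0.12 K/decade
models = [0.025 * (years - years[0]) + rng.normal(0, 0.08, years.size)  # ~0.25 K/decade
          for _ in range(40)]

diff = np.mean([lstsq_trend(m) for m in models]) - lstsq_trend(obs)

# 'Internal variability' trends: trends of noise-only segments of the same length.
noise_trends = [lstsq_trend(rng.normal(0, 0.08, years.size)) for _ in range(2000)]
p_value = np.mean(np.abs(noise_trends) >= abs(diff))
print(f"model-minus-observed trend difference: {diff:.2f} K/decade, "
      f"fraction of unforced trends at least as large: {p_value:.3f}")
```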
Advances in Bayesian Charts for Reliability Control
The general objective of the research study underlying this thesis was to develop innovative charts able to support reliability control by monitoring over time a critical-to-quality characteristic of interest within a Bayesian framework.
The proposed control charts use Bayes' theorem to directly produce and effectively update information about the parameters of the reliability distributions involved in the problem. Owing to this feature, the charts fit into a recent stream of research in the area of Statistical Process Control (SPC) dealing with Bayesian Process Monitoring and Control.
However, the developed methodology differs from the available techniques, in which Bayesian inference is combined with traditional Shewhart control charts as a supplementary tool to compute the expected loss related to the state of the process. Only in the most recent literature can some interesting new approaches be found, where Bayesian inference is used as the working engine to sequentially estimate the probability of occurrence of an assignable cause responsible for a change in the process state. More specifically, the charts were designed by exploiting some Bayesian reliability estimators known as Practical Bayes Estimators (PBE), which proved very suitable for this specific application.
Thanks to the specific properties of the PBE, new charts to monitor Weibull percentiles are developed for cases in which only very small samples (of size 2 or 3, say) are available. The charts integrate the sample information with the prior technological knowledge in the estimation process.
The proposed charts can therefore support the analyst in critical scenarios where prompt decisions are needed on the basis of very few experimental data and both Weibull parameters are unknown. In practice, this environment is typical of short production runs (parts with a long cycle time, just-in-time production, prototype parts, destructive testing of parts) and of low-volume production.
The proposed methodology was investigated via numerical simulations, and two real examples are presented to show how to implement the proposed charts in practice in two specific application areas.
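The exact form of the Practical Bayes Estimators is not reproduced here; the following sketch only illustrates, under assumed priors and synthetic data, why an informative prior on the Weibull shape helps when a sample of size 3 must be turned into a percentile estimate:

```python
import numpy as np
from scipy import stats

# Not the thesis' Practical Bayes Estimators: a minimal illustration of combining
# prior technological knowledge (e.g., 'wear-out failures, shape around 2') with a
# 3-item sample, compared against the bare maximum-likelihood estimate.

rng = np.random.default_rng(7)
x = rng.weibull(2.0, size=3) * 150.0              # a very small sample (n = 3)

shape_grid = np.linspace(0.3, 6.0, 200)
scale_grid = np.linspace(20.0, 500.0, 200)
B, S = np.meshgrid(shape_grid, scale_grid, indexing="ij")

log_prior = stats.norm.logpdf(B, loc=2.0, scale=0.4)   # informative prior on the shape
log_like = stats.weibull_min.logpdf(x[:, None, None], c=B, scale=S).sum(axis=0)
post = np.exp(log_prior + log_like - (log_prior + log_like).max())
post /= post.sum()

p10 = S * (-np.log(0.9)) ** (1.0 / B)             # 10th percentile on the grid
bayes_estimate = float((post * p10).sum())        # posterior mean of the percentile

c_mle, _, scale_mle = stats.weibull_min.fit(x, floc=0)  # plain MLE for contrast
mle_estimate = scale_mle * (-np.log(0.9)) ** (1.0 / c_mle)

print(f"n=3 sample: {np.round(x, 1)}")
print(f"10th percentile, Bayes with informative shape prior: {bayes_estimate:.1f}")
print(f"10th percentile, maximum likelihood only:            {mle_estimate:.1f}")
```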
Vessel Pattern Knowledge Discovery from AIS Data: A Framework for Anomaly Detection and Route Prediction
Understanding maritime traffic patterns is key to Maritime Situational Awareness applications, in particular to classify and predict activities. Facilitated by the recent build-up of terrestrial networks and satellite constellations of Automatic Identification System (AIS) receivers, ship movement information is becoming increasingly available, both in coastal areas and in open waters. The resulting amount of information is increasingly overwhelming for human operators, requiring the aid of automatic processing to synthesize the behaviors of interest in a clear and effective way. Although AIS data are legally required only for larger vessels, their use is growing, and they can be effectively used to infer different levels of contextual information, from the characterization of ports and off-shore platforms to spatial and temporal distributions of routes. An unsupervised and incremental learning approach to the extraction of maritime movement patterns is presented here to convert raw data into information supporting decisions. This is a basis for automatically detecting anomalies and projecting current trajectories and patterns into the future. The proposed methodology, called TREAD (Traffic Route Extraction and Anomaly Detection), was developed for different levels of intermittency (i.e., sensor coverage and performance), persistence (i.e., time lag between subsequent observations) and data sources (i.e., ground-based and space-based receivers).
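A much-simplified sketch of one ingredient of such an unsupervised, incremental approach (not the published TREAD algorithm): waypoints are absorbed one at a time and either merged into the nearest existing cluster or used to open a new one. The merge radius and coordinates are invented:

```python
import math

# Simplified sketch: AIS waypoints (e.g., port entry/exit or turning points) are
# processed one at a time, merging into the nearest cluster or starting a new one.

MERGE_RADIUS_KM = 10.0

def haversine_km(p, q):
    """Great-circle distance between two (lat, lon) points in kilometres."""
    lat1, lon1, lat2, lon2 = map(math.radians, (*p, *q))
    a = math.sin((lat2 - lat1) / 2) ** 2 + \
        math.cos(lat1) * math.cos(lat2) * math.sin((lon2 - lon1) / 2) ** 2
    return 2 * 6371.0 * math.asin(math.sqrt(a))

clusters = []   # each cluster: {"centroid": (lat, lon), "count": n}

def absorb(point):
    """Incrementally add one waypoint: merge with the nearest cluster or create one."""
    best = min(clusters, key=lambda c: haversine_km(c["centroid"], point), default=None)
    if best is not None and haversine_km(best["centroid"], point) < MERGE_RADIUS_KM:
        n = best["count"]
        lat = (best["centroid"][0] * n + point[0]) / (n + 1)
        lon = (best["centroid"][1] * n + point[1]) / (n + 1)
        best["centroid"], best["count"] = (lat, lon), n + 1
    else:
        clusters.append({"centroid": point, "count": 1})

for waypoint in [(51.10, 1.55), (51.12, 1.57), (50.95, 1.85), (51.11, 1.56)]:
    absorb(waypoint)
print([(round(c["centroid"][0], 2), round(c["centroid"][1], 2), c["count"]) for c in clusters])
```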
Automatic Generation of Geographical Networks for Maritime Traffic Surveillance
In this paper, an algorithm is proposed to automatically produce hierarchical graph-based representations of maritime shipping lanes extrapolated from historical vessel positioning data. Each shipping lane is generated based on the detection of the vessels' behavioural changes and is represented as a compact synthetic route composed of network nodes and route segments. The outcome of the knowledge discovery process is a geographical maritime network that can be used in Maritime Situational Awareness (MSA) applications such as track reconstruction from missing information, situation/destination prediction, and detection of anomalous behaviour. Experimental results are presented, testing the algorithm in a specific scenario of interest, the Dover Strait.
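As a toy version of the node/segment idea, the sketch below marks the fixes where a track's course changes sharply as network nodes and joins consecutive nodes into route segments; the threshold, track and coordinates are invented and the published algorithm is considerably richer (hierarchical representation, node merging across many tracks, etc.):

```python
import math

# Illustrative only: behavioural-change points become network nodes and the legs
# between them become route segments of one synthetic route.

TURN_THRESHOLD_DEG = 20.0

def bearing_deg(p, q):
    """Initial bearing from point p to point q (lat, lon in degrees)."""
    lat1, lon1, lat2, lon2 = map(math.radians, (*p, *q))
    y = math.sin(lon2 - lon1) * math.cos(lat2)
    x = math.cos(lat1) * math.sin(lat2) - math.sin(lat1) * math.cos(lat2) * math.cos(lon2 - lon1)
    return math.degrees(math.atan2(y, x)) % 360.0

def turning_points(track):
    """Keep the first/last fixes and every fix where the course changes sharply."""
    nodes = [track[0]]
    for prev, cur, nxt in zip(track, track[1:], track[2:]):
        change = abs((bearing_deg(cur, nxt) - bearing_deg(prev, cur) + 180) % 360 - 180)
        if change > TURN_THRESHOLD_DEG:
            nodes.append(cur)
    nodes.append(track[-1])
    return nodes

# A toy track: the detected nodes and the segments joining them form one route.
track = [(50.90, 1.20), (50.95, 1.35), (51.00, 1.50), (51.10, 1.55), (51.20, 1.58)]
nodes = turning_points(track)
segments = list(zip(nodes, nodes[1:]))
print("nodes:", nodes)
print("route segments:", segments)
```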
Unsupervised Maritime Pattern Analysis to Enhance Contextual Awareness
Maritime Situational Awareness aims at monitoring maritime activities and ensuring safety and security, based on contextual knowledge. Maritime contextual information is difficult to access, resource-consuming to update and sometimes unavailable. Thus, data-driven approaches to derive contextual information are required to support maritime situational awareness systems. In this paper, a data-driven algorithm is proposed to extrapolate maritime traffic contextual information from real-time self-reporting data. The knowledge discovery process focuses on the detection and definition of maritime corridors, based on the construction of maritime traffic networks. The maritime traffic network provides maritime contextual knowledge to automatically update the Maritime Situational Picture, contributing to the evolution of Maritime Situational Awareness and risk management systems.
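One way such corridors could feed the situational picture is sketched below: each position report is checked against an assumed corridor centreline and flagged when it lies beyond the corridor half-width. Geometry, width and positions are all invented for illustration; a real system would use the corridors extracted from the maritime traffic network:

```python
import math

# Rough sketch: flag position reports that fall outside an assumed corridor.

CORRIDOR = [(50.90, 1.20), (51.00, 1.50), (51.20, 1.58)]   # centreline (lat, lon)
HALF_WIDTH_KM = 8.0

def to_xy(p, ref=CORRIDOR[0]):
    """Local flat-earth approximation (km) around a reference point."""
    return ((p[1] - ref[1]) * 111.32 * math.cos(math.radians(ref[0])),
            (p[0] - ref[0]) * 111.32)

def dist_to_segment(p, a, b):
    """Distance (km) from point p to the segment a-b in local coordinates."""
    px, py = to_xy(p)
    ax, ay = to_xy(a)
    bx, by = to_xy(b)
    t = max(0.0, min(1.0, ((px - ax) * (bx - ax) + (py - ay) * (by - ay)) /
                          ((bx - ax) ** 2 + (by - ay) ** 2)))
    return math.hypot(px - (ax + t * (bx - ax)), py - (ay + t * (by - ay)))

def off_corridor(position):
    """True if the report lies outside the corridor."""
    return min(dist_to_segment(position, a, b)
               for a, b in zip(CORRIDOR, CORRIDOR[1:])) > HALF_WIDTH_KM

for report in [(51.02, 1.48), (51.30, 1.10)]:
    print(report, "off corridor" if off_corridor(report) else "inside corridor")
```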