The Basic Principles of Uncertain Information Fusion. An organized review of merging rules in different representation frameworks
We propose and advocate basic principles for the fusion of incomplete or uncertain information items, which should apply regardless of the formalism adopted for representing the pieces of information coming from several sources. This formalism can be based on sets, logic, partial orders, possibility theory, belief functions or imprecise probabilities. We propose a general notion of information item representing incomplete or uncertain information about the values of an entity of interest. It is supposed to rank such values in terms of relative plausibility, and to explicitly point out impossible values. Basic issues affecting the results of the fusion process, such as the relative information content and consistency of information items, as well as their mutual consistency, are discussed. For each representation setting, we present fusion rules that obey our principles, and compare them to postulates specific to that representation proposed in the past. In the crudest (Boolean) representation setting (using a set of possible values), we show that whether the set is understood in terms of most plausible values or in terms of non-impossible ones matters for choosing a relevant fusion rule. In particular, in the latter case our principles justify the method of maximal consistent subsets, while the former is related to the fusion of logical bases. We then consider several formal settings for incomplete or uncertain information items where our postulates are instantiated: plausibility orderings, qualitative and quantitative possibility distributions, belief functions and convex sets of probabilities. The aim of this paper is to provide a unified picture of fusion rules across various uncertainty representation settings.
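The method of maximal consistent subsets mentioned in this abstract is easy to illustrate. Below is a minimal Python sketch, assuming each source reports a set of non-impossible values; the function names are ours, not the paper's, and this is the standard union-of-intersections form of the rule, not the authors' formal treatment:

```python
from itertools import combinations

def maximal_consistent_subsets(sources):
    """Return index sets of the maximal subfamilies of `sources`
    whose members have a non-empty common intersection."""
    n = len(sources)
    maximal = []
    # Enumerate subfamilies from largest to smallest, so any consistent
    # subfamily kept here cannot be contained in a later one.
    for k in range(n, 0, -1):
        for idx in combinations(range(n), k):
            family = set(idx)
            if any(family < kept for kept in maximal):
                continue  # already covered by a larger consistent subfamily
            if set.intersection(*(sources[i] for i in idx)):
                maximal.append(family)
    return maximal

def fuse(sources):
    """Fuse by taking the union of the intersections of the
    maximal consistent subfamilies of sources."""
    return set().union(*(set.intersection(*(sources[i] for i in fam))
                         for fam in maximal_consistent_subsets(sources)))
```

For example, with sources {1, 2}, {2, 3} and {4}, the maximal consistent subfamilies are {source 0, source 1} and {source 2}, so the fused result is {2, 4}: the conflicting third source is not discarded, but neither is it allowed to empty the result.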
Maritime data integration and analysis: Recent progress and research challenges
The correlated exploitation of heterogeneous data sources offering very large historical as well as streaming data is important for increasing the accuracy of computations when analysing and predicting future states of moving entities. This is particularly critical in the maritime domain, where online tracking, early recognition of events, and real-time forecasting of anticipated vessel trajectories are crucial to safety and operations at sea. The objective of this paper is to review current research challenges and trends tied to the integration, management, analysis, and visualization of objects moving at sea, and to offer a few suggestions for the successful development of maritime forecasting and decision-support systems.
Evidence Propagation and Consensus Formation in Noisy Environments
We study the effectiveness of consensus formation in multi-agent systems where there is both belief updating based on direct evidence and belief combination between agents. In particular, we consider the scenario in which a population of agents collaborates on the best-of-n problem, where the aim is to reach a consensus about which is the best (alternatively, true) state from amongst a set of states, each with a different quality value (or level of evidence). Agents' beliefs are represented within Dempster-Shafer theory by mass functions, and we investigate the macro-level properties of four well-known belief combination operators for this multi-agent consensus formation problem: Dempster's rule, Yager's rule, Dubois & Prade's operator and the averaging operator. The convergence properties of the operators are considered and simulation experiments are conducted for different evidence rates and noise levels. Results show that combining updating on direct evidence with belief combination between agents produces better consensus on the best state than evidence updating alone. We also find that in this framework the operators are robust to noise. Broadly, Yager's rule is shown to be the better operator across various parameter values, in terms of convergence to the best state, robustness to noise, and scalability.
Comment: 13th International Conference on Scalable Uncertainty Management
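Two of the combination operators compared in this abstract differ only in how they treat conflicting mass. A minimal Python sketch of the textbook forms of Dempster's and Yager's rules (not the paper's simulation code), assuming a mass function is a dict mapping frozensets of states to masses:

```python
from itertools import product

def dempster(m1, m2):
    """Dempster's rule: conjunctive combination, then renormalise
    by discarding the mass assigned to conflicting (empty) intersections."""
    combined, conflict = {}, 0.0
    for (a, x), (b, y) in product(m1.items(), m2.items()):
        inter = a & b
        if inter:
            combined[inter] = combined.get(inter, 0.0) + x * y
        else:
            conflict += x * y
    if conflict >= 1.0:
        raise ValueError("totally conflicting evidence: rule undefined")
    return {s: v / (1.0 - conflict) for s, v in combined.items()}

def yager(m1, m2, frame):
    """Yager's rule: same conjunctive combination, but conflicting mass
    is reassigned to the whole frame of discernment (total ignorance)."""
    combined, conflict = {}, 0.0
    for (a, x), (b, y) in product(m1.items(), m2.items()):
        inter = a & b
        if inter:
            combined[inter] = combined.get(inter, 0.0) + x * y
        else:
            conflict += x * y
    full = frozenset(frame)
    combined[full] = combined.get(full, 0.0) + conflict
    return combined
```

The design difference is visible directly: Dempster's rule amplifies agreement by renormalising away conflict, while Yager's rule converts conflict into ignorance, which is one reason it can behave more robustly under noisy evidence.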
Fusing Automatically Extracted Annotations for the Semantic Web
This research focuses on the problem of semantic data fusion. Although various solutions have been developed in the research communities focusing on databases and formal logic, the choice of an appropriate algorithm is non-trivial because the performance of each algorithm and its optimal configuration parameters depend on the type of data, to which the algorithm is applied. In order to be reusable, the fusion system must be able to select appropriate techniques and use them in combination.
Moreover, because of the varying reliability of data sources and algorithms performing fusion subtasks, uncertainty is an inherent feature of semantically annotated data and has to be taken into account by the fusion system. Finally, the issue of schema heterogeneity can have a negative impact on fusion performance. To address these issues, we propose KnoFuss: an architecture for Semantic Web data integration based on the principles of problem-solving methods. Algorithms dealing with different fusion subtasks are represented as components of a modular architecture, and their capabilities are described formally. This allows the architecture to select appropriate methods and configure them depending on the processed data. In order to handle uncertainty, we propose a novel algorithm based on Dempster-Shafer belief propagation. KnoFuss employs this algorithm to reason about uncertain data and method results in order to refine the fused knowledge base. Tests show that these solutions lead to improved fusion performance. Finally, we address the problem of data fusion in the presence of schema heterogeneity. We extended the KnoFuss framework to exploit the results of automatic schema alignment tools and proposed our own schema matching algorithm aimed at facilitating data fusion in the Linked Data environment. We conducted experiments with this approach and obtained a substantial improvement in performance in comparison with public data repositories.
Deliberative Democracy in the EU. Countering Populism with Participation and Debate. CEPS Paperback
Elections are the preferred way to freely transfer power from one term to the next and from one political party or coalition to another. They are an essential element of democracy. But if the process of power transfer is corrupted, democracy risks collapse. Reliance on voters, civil society organisations and neutral observers to fully exercise their freedoms as laid down in international human rights conventions is an integral part of holding democratic elections. Without free, fair and regular elections, liberal democracy is inconceivable.
Elections are no guarantee that democracy will take root and hold, however. If the history of political participation in Europe over the past 800 years is anything to go by, successful attempts at gaining voice have been patchy, while leaders' attempts to silence these voices and consolidate their own power have been almost constant (Blockmans, 2020).
Recent developments in certain EU member states have again shown us that democratically elected leaders will try to use majoritarian rule to curb freedoms, overstep the constitutional limits of their powers, protect the interests of their cronies and recycle themselves through seemingly free and fair elections. In their recent book How Democracies Die, two Harvard professors of politics write: "Since the end of the Cold War, most democratic breakdowns have been caused not by generals and soldiers but by elected governments themselves" (Levitsky and Ziblatt, 2018).
NATO Code of Best Practice for C2 Assessment
This major revision to the Code of Best Practice (COBP) for C2 Assessment is the product of a NATO Research and Technology Organisation (RTO) sponsored Research Group (SAS-026). It represents over a decade of work by many of the best analysts from the NATO countries. A symposium (SAS-039) was hosted by the NATO Consultation Command Control Agency (NC3A) that provided the venue for a rigorous peer review of the code. This new version of the COBP for C2 assessment builds upon the initial version of the COBP produced by SAS-002. The earlier version focused on the analysis of ground forces at a tactical echelon in mid- to high-intensity conflicts. In developing this new version of the COBP, SAS-026 focused on a changed geopolitical context characterized by a shift from preoccupation with a war involving NATO and the Warsaw Pact to concern for a broad range of smaller military conflicts and Operations Other Than War (OOTW). This version also takes into account the impact of significantly improved information-related capabilities and their implications for reducing the fog and friction traditionally associated with conflict. Significantly reduced levels of fog and friction offer an opportunity for the military to develop new concepts of operations, new organizational forms, and new approaches to C2, as well as to the processes that support it. In addition, SAS-026 was cognizant that NATO operations are likely to include coalitions of the willing that might involve Partnership for Peace (PfP) nations, other partners outside of NATO, international organizations, and NGOs. Cost analyses continue to be excluded because they differ among NATO members, so no single approach would be appropriate. Advances in technology are expected to continue at an increasing rate and spur both sustaining and disruptive innovation in military organizations. 
It is to be expected that this COBP will need to be periodically revisited in light of these developments.