Solving multiple-criteria R&D project selection problems with a data-driven evidential reasoning rule
In this paper, a likelihood based evidence acquisition approach is proposed
to acquire evidence from experts' assessments as recorded in historical
datasets. Then a data-driven evidential reasoning rule based model is
introduced to R&D project selection process by combining multiple pieces of
evidence with different weights and reliabilities. As a result, the total
belief degrees and the overall performance can be generated for ranking and
selecting projects. Finally, a case study on the R&D project selection for the
National Science Foundation of China is conducted to show the effectiveness of
the proposed model. The data-driven evidential reasoning rule based model for
project evaluation and selection (1) utilizes experimental data to represent
experts' assessments by using belief distributions over the set of final
funding outcomes, and through these historical statistics it helps experts and
applicants to understand the funding probability to a given assessment grade,
(2) implies the mapping relationships between the evaluation grades and the
final funding outcomes by using historical data, and (3) provides a way to make
fair decisions by taking experts' reliabilities into account. In the
data-driven evidential reasoning rule based model, experts play different roles
in accordance with their reliabilities which are determined by their previous
review track records, and the selection process is made interpretable and
fairer. The newly proposed model reduces the time-consuming panel review work
for both managers and experts, and significantly improves the efficiency and
quality of the project selection process. Although the model is demonstrated for
project selection in the NSFC, it can be generalized to other funding agencies
or industries.

Comment: 20 pages, forthcoming in International Journal of Project Management
(2019)
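The core mechanism described above, combining experts' belief distributions while accounting for their reliabilities, can be sketched with classical Dempster-Shafer tools: Shafer's reliability discounting followed by Dempster's rule. This is a simplification of the full ER rule, and the frame of funding outcomes, the expert masses, and the reliability values below are all hypothetical:

```python
# Frame of discernment: possible final funding outcomes (hypothetical).
FRAME = ("fund", "reject")

def discount(mass, reliability):
    """Shafer discounting: scale each focal mass by the expert's
    reliability and move the remainder to the whole frame."""
    theta = frozenset(FRAME)
    out = {a: reliability * v for a, v in mass.items()}
    out[theta] = out.get(theta, 0.0) + (1.0 - reliability)
    return out

def dempster_combine(m1, m2):
    """Dempster's rule: conjunctive combination of two mass
    functions, renormalised by the total conflict."""
    raw, conflict = {}, 0.0
    for a, va in m1.items():
        for b, vb in m2.items():
            inter = a & b
            if inter:
                raw[inter] = raw.get(inter, 0.0) + va * vb
            else:
                conflict += va * vb
    return {a: v / (1.0 - conflict) for a, v in raw.items()}

# Two experts' belief distributions over the outcomes (illustrative
# numbers, not taken from the paper).
expert1 = {frozenset({"fund"}): 0.7, frozenset({"reject"}): 0.2,
           frozenset(FRAME): 0.1}
expert2 = {frozenset({"fund"}): 0.4, frozenset({"reject"}): 0.5,
           frozenset(FRAME): 0.1}

# Discount each expert by an assumed reliability, then combine;
# the combined mass on {"fund"} supports ranking the project.
combined = dempster_combine(discount(expert1, 0.9),
                            discount(expert2, 0.6))
belief_fund = combined.get(frozenset({"fund"}), 0.0)
print(round(belief_fund, 3))
```

Discounting makes a less reliable expert's opinion count for less by shifting part of their mass to total ignorance (the whole frame), which is the same role reliabilities play in the ER rule.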
A method of classification for multisource data in remote sensing based on interval-valued probabilities
An axiomatic approach to interval-valued (IV) probabilities is presented, in which an IV probability is defined by a pair of set-theoretic functions that satisfy some pre-specified axioms. On the basis of this approach, the representation of statistical evidence and the combination of multiple bodies of evidence are emphasized. Although IV probabilities provide an innovative means for the representation and combination of evidential information, they make the decision process rather complicated, and so entail more intelligent decision-making strategies. The development of decision rules over IV probabilities is discussed from the viewpoint of statistical pattern recognition. The proposed method, the so-called evidential reasoning method, is applied to the ground-cover classification of a multisource data set consisting of Multispectral Scanner (MSS) data, Synthetic Aperture Radar (SAR) data, and digital terrain data such as elevation, slope, and aspect. By treating the data sources separately, the method is able to capture both parametric and nonparametric information and to combine them. The method is then applied to two separate cases of classifying multiband data obtained by a single sensor. In each case a set of multiple sources is obtained by dividing the dimensionally huge data into smaller and more manageable pieces based on global statistical correlation information. By this divide-and-combine process, the method is able to utilize more features than the conventional maximum likelihood method.
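The interval-valued support for a class in this setting is the pair of belief and plausibility values of a Dempster-Shafer mass function: Bel bounds the class probability from below, Pl from above. A minimal sketch, with hypothetical ground-cover classes and masses standing in for what one data source might produce:

```python
def belief(mass, hypothesis):
    """Bel(H): total mass committed to subsets of H (lower bound)."""
    return sum(v for a, v in mass.items() if a <= hypothesis)

def plausibility(mass, hypothesis):
    """Pl(H): total mass not contradicting H (upper bound)."""
    return sum(v for a, v in mass.items() if a & hypothesis)

# Hypothetical mass over ground-cover classes from one data source;
# mass on non-singleton sets encodes ambiguity between classes.
m = {frozenset({"forest"}): 0.5,
     frozenset({"water"}): 0.2,
     frozenset({"forest", "urban"}): 0.2,
     frozenset({"forest", "water", "urban"}): 0.1}

for cls in ("forest", "water", "urban"):
    h = frozenset({cls})
    print(cls, belief(m, h), plausibility(m, h))
```

A decision rule over IV probabilities must then compare intervals such as [0.5, 0.8] for "forest" against [0.2, 0.3] for "water", which is what makes the decision process more complicated than comparing point probabilities.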
Cluelessness
Decisions, whether moral or prudential, should be guided at least in part by considerations of the consequences that would result from the various available actions. For any given action, however, the majority of its consequences are unpredictable at the time of decision. Many have worried that this leaves us, in some important sense, clueless. In this paper, I distinguish between “simple” and “complex” possible sources of cluelessness. In terms of this taxonomy, the majority of the existing literature on cluelessness focusses on the simple sources. I argue, contra James Lenman in particular, that these would-be sources of cluelessness are unproblematic, on the grounds that indifference-based reasoning is far less problematic than Lenman (along with many others) supposes. However, there does seem to be a genuine phenomenon of cluelessness associated with the “complex” sources; here, indifference-based reasoning is inapplicable by anyone's lights. This “complex problem of cluelessness” is vivid and pressing, in particular, in the context of Effective Altruism. This motivates a more thorough examination of the precise nature of cluelessness, and of the precise source of the associated phenomenology of discomfort in forced-choice situations. The latter parts of the paper make some initial explorations in those directions.
Doxastic responsibility, guidance control, and ownership of belief
The contemporary debate over responsibility for belief is divided over the issue of whether such responsibility requires doxastic control, and whether this control must be voluntary in nature. It has recently become popular to hold that responsibility for belief does not require voluntary doxastic control, or perhaps even any form of doxastic “control” at all. However, Miriam McCormick has recently argued that doxastic responsibility does in fact require quasi-voluntary doxastic control: “guidance control,” a complex, compatibilist form of control. In this paper, I pursue a negative and a positive task. First, I argue that grounding doxastic responsibility in guidance control requires too much for agents to be the proper targets for attributions of doxastic responsibility. I focus my criticisms on three cases in which McCormick's account gives the intuitively wrong verdict. Second, I develop a modified conception of McCormick's notion of “ownership of belief,” which I call Weak Doxastic Ownership. I employ this conception to argue that responsibility for belief is possible even in the absence of guidance control. In doing so, I argue that the notion of doxastic ownership can do important normative work in grounding responsibility for belief without being subsumed under or analyzed in terms of the notion of doxastic control.
Complementary Lipschitz continuity results for the distribution of intersections or unions of independent random sets in finite discrete spaces
We prove that intersections and unions of independent random sets in finite
spaces achieve a form of Lipschitz continuity. More precisely, given the
distribution of a random set X, the function mapping any random set
distribution to the distribution of its intersection (under an independence
assumption) with X is Lipschitz continuous with unit Lipschitz constant if
the space of random set distributions is endowed with a metric defined as a
norm distance between inclusion functionals, also known as commonalities.
Moreover, the function mapping any random set distribution to the
distribution of its union (under an independence assumption) with X is
Lipschitz continuous with unit Lipschitz constant if the space of random
set distributions is endowed with a metric defined as a norm distance
between hitting functionals, also known as plausibilities.
Using the epistemic random set interpretation of belief functions, we also
discuss the ability of these distances to yield conflict measures. All the
proofs in this paper are derived in the framework of Dempster-Shafer belief
functions. Aside from the discussion on conflict measures, it is
straightforward to transcribe the proofs into general (not necessarily
epistemic) random set terminology.
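The unit Lipschitz constant for intersections is easy to check numerically in Dempster-Shafer terms: in commonality space the unnormalised conjunctive combination is a pointwise product, and multiplying by commonality values in [0, 1] cannot increase a norm distance. A sketch on a hypothetical two-element frame, using the L1 distance as the assumed norm:

```python
from itertools import chain, combinations

FRAME = ("a", "b")

def subsets(frame):
    """All subsets of the frame, including the empty set."""
    s = list(frame)
    return [frozenset(c) for c in chain.from_iterable(
        combinations(s, r) for r in range(len(s) + 1))]

def commonality(mass):
    """Q(A) = sum of m(B) over all B containing A."""
    return {a: sum(v for b, v in mass.items() if a <= b)
            for a in subsets(FRAME)}

def l1(q1, q2):
    """L1 distance between two commonality functions."""
    return sum(abs(q1[a] - q2[a]) for a in subsets(FRAME))

def conj(q1, q2):
    """Unnormalised conjunctive combination (random-set
    intersection under independence) is a pointwise product
    of commonalities."""
    return {a: q1[a] * q2[a] for a in subsets(FRAME)}

# Hypothetical mass functions; mx plays the role of the fixed
# random set being intersected with.
m1 = {frozenset({"a"}): 0.6, frozenset(FRAME): 0.4}
m2 = {frozenset({"b"}): 0.5, frozenset(FRAME): 0.5}
mx = {frozenset({"a"}): 0.3, frozenset(FRAME): 0.7}

q1, q2, qx = commonality(m1), commonality(m2), commonality(mx)
before = l1(q1, q2)
after = l1(conj(q1, qx), conj(q2, qx))
# Combining with the fixed qx never increases the distance, since
# |q1(A)*qx(A) - q2(A)*qx(A)| <= |q1(A) - q2(A)| when qx(A) in [0, 1].
print(after <= before)
```

The union case is symmetric: hitting functionals (plausibilities of complements' duals) multiply pointwise under the disjunctive rule, giving the same one-line contraction argument.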