Analysing Sensitivity Data from Probabilistic Networks
With the advance of efficient analytical methods for sensitivity analysis
of probabilistic networks, the interest in the sensitivities revealed by
real-life networks is rekindled. As the amount of data resulting from a
sensitivity analysis of even a moderately-sized network is already overwhelming,
methods for extracting relevant information are called for. One such method is
to study the derivative of the sensitivity functions yielded for a network's
parameters. We further propose to build upon the concept of admissible
deviation, that is, the extent to which a parameter can deviate from the true
value without inducing a change in the most likely outcome. We illustrate these
concepts by means of a sensitivity analysis of a real-life probabilistic
network in oncology.
Comment: Appears in Proceedings of the Seventeenth Conference on Uncertainty in Artificial Intelligence (UAI2001).
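It is well known that a one-way sensitivity function takes the form of a quotient of two linear functions in the parameter under study, so "studying the derivative" has a simple closed form. A minimal sketch, with invented coefficient values:

```python
# Sketch: a one-way sensitivity function f(x) = (a*x + b) / (c*x + d)
# and its derivative.  The coefficients below are invented for illustration.

def sensitivity(a, b, c, d):
    """Return f(x) and f'(x) for a quotient of two linear functions."""
    f = lambda x: (a * x + b) / (c * x + d)
    # quotient rule: f'(x) = (a*d - b*c) / (c*x + d)^2
    df = lambda x: (a * d - b * c) / (c * x + d) ** 2
    return f, df

f, df = sensitivity(a=0.8, b=0.05, c=0.5, d=0.4)
x0 = 0.3  # assessed parameter value (illustrative)
print(f(x0), df(x0))  # posterior of interest and its local slope at x0
```

A small derivative at the assessed value indicates a parameter whose local inaccuracy hardly affects the probability of interest.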
Sensitivity Analysis for Threshold Decision Making with Dynamic Networks
The effect of inaccuracies in the parameters of a dynamic Bayesian network
can be investigated by subjecting the network to a sensitivity analysis. Having
detailed the resulting sensitivity functions in our previous work, we now study
the effect of parameter inaccuracies on a recommended decision in view of a
threshold decision-making model. We detail the effect of varying a single and
multiple parameters from a conditional probability table and present a
computational procedure for establishing bounds between which assessments for
these parameters can be varied without inducing a change in the recommended
decision. We illustrate the various concepts involved by means of a real-life
dynamic network in the field of infectious disease.
Comment: Appears in Proceedings of the Twenty-Second Conference on Uncertainty in Artificial Intelligence (UAI2006).
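The kind of bound described above can be sketched as follows: assuming the posterior of interest follows a sensitivity function f(x) = (a·x + b)/(c·x + d) and the recommendation flips when f crosses a threshold t, the bound is obtained by solving f(x) = t. This is an illustrative sketch, not the paper's procedure, and all numbers are invented:

```python
# Sketch: the parameter value at which a sensitivity function
# f(x) = (a*x + b)/(c*x + d) crosses a decision threshold t, i.e. the
# bound beyond which the recommended decision would change.
# Coefficients and threshold are invented for illustration.

def threshold_crossing(a, b, c, d, t):
    denom = a - t * c
    if abs(denom) < 1e-12:
        return None  # f never crosses t (illustrative guard)
    x = (t * d - b) / denom
    # a parameter assessment must remain a probability
    return x if 0.0 <= x <= 1.0 else None

print(threshold_crossing(a=0.8, b=0.05, c=0.5, d=0.4, t=0.6))
```

If the crossing point falls outside the unit interval, the recommendation is robust to any variation of that parameter.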
Exploiting Evidence-dependent Sensitivity Bounds
Studying the effects of one-way variation of any number of parameters on any
number of output probabilities quickly becomes infeasible in practice,
especially if various evidence profiles are to be taken into consideration. To
provide for identifying the parameters that have a potentially large effect
prior to actually performing the analysis, we need properties of sensitivity
functions that are independent of the network under study, of the available
evidence, or of both. In this paper, we study properties that depend upon just
the probability of the entered evidence. We demonstrate that these properties
provide for establishing an upper bound on the sensitivity value for a
parameter; they further provide for establishing the region in which the vertex
of the sensitivity function resides, thereby serving to identify parameters
with a low sensitivity value that may still have a large impact on the
probability of interest for relatively small parameter variations.
Comment: Appears in Proceedings of the Twenty-First Conference on Uncertainty in Artificial Intelligence (UAI2005).
Learning Bayesian Network Parameters with Prior Knowledge about Context-Specific Qualitative Influences
We present a method for learning the parameters of a Bayesian network with
prior knowledge about the signs of influences between variables. Our method
accommodates not just the standard signs, but provides for context-specific
signs as well. We show how the various signs translate into order constraints
on the network parameters and how isotonic regression can be used to compute
order-constrained estimates from the available data. Our experimental results
show that taking prior knowledge about the signs of influences into account
leads to an improved fit of the true distribution, especially when only a small
sample of data is available. Moreover, the computed estimates are guaranteed to
be consistent with the specified signs, thereby resulting in a network that is
more likely to be accepted by experts in its domain of application.
Comment: Appears in Proceedings of the Twenty-First Conference on Uncertainty in Artificial Intelligence (UAI2005).
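Isotonic regression under a non-decreasing order constraint can be computed with the classical pool-adjacent-violators algorithm (PAVA). A minimal sketch, with invented unconstrained estimates and sample sizes:

```python
# Sketch: pool-adjacent-violators (PAVA) for weighted isotonic regression,
# one way to turn unconstrained parameter estimates into estimates that
# respect a non-decreasing order constraint.  Inputs are invented.

def pava(y, w):
    """Non-decreasing isotonic fit of values y with weights w."""
    blocks = [[yi, wi, 1] for yi, wi in zip(y, w)]  # [mean, weight, size]
    i = 0
    while i < len(blocks) - 1:
        if blocks[i][0] > blocks[i + 1][0]:  # order violation: pool blocks
            m0, w0, n0 = blocks[i]
            m1, w1, n1 = blocks[i + 1]
            blocks[i] = [(m0 * w0 + m1 * w1) / (w0 + w1), w0 + w1, n0 + n1]
            del blocks[i + 1]
            i = max(i - 1, 0)  # pooled block may now violate its predecessor
        else:
            i += 1
    out = []
    for m, _, n in blocks:  # expand pooled means back to original positions
        out.extend([m] * n)
    return out

# invented estimates (weights play the role of sample sizes)
estimates = pava([0.2, 0.5, 0.4, 0.7], [10, 20, 5, 10])
print(estimates)
```

The pooled result is the closest weighted fit that satisfies the order constraint, which is what makes the computed estimates consistent with the specified signs.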
Evidence-invariant Sensitivity Bounds
The sensitivities revealed by a sensitivity analysis of a probabilistic
network typically depend on the entered evidence. For a real-life network
therefore, the analysis is performed a number of times, with different
evidence. Although efficient algorithms for sensitivity analysis exist, a
complete analysis is often infeasible because of the large range of possible
combinations of observations. In this paper we present a method for studying
sensitivities that are invariant to the evidence entered. Our method builds
upon the idea of establishing bounds between which a parameter can be varied
without ever inducing a change in the most likely value of a variable of
interest.
Comment: Appears in Proceedings of the Twentieth Conference on Uncertainty in Artificial Intelligence (UAI2004).
From Qualitative to Quantitative Probabilistic Networks
Quantification is well known to be a major obstacle in the construction of a
probabilistic network, especially when relying on human experts for this
purpose. The construction of a qualitative probabilistic network has been
proposed as an initial step in a network's quantification, since the
qualitative network can be used to gain preliminary insight into the projected
network's reasoning behaviour. We build on this idea and present a new type of
network in which both signs and numbers are specified; we further present an
associated algorithm for probabilistic inference. Building upon these
semi-qualitative networks, a probabilistic network can be quantified and
studied in a stepwise manner. As a result, modelling inadequacies can be
detected and amended at an early stage in the quantification process.
Comment: Appears in Proceedings of the Eighteenth Conference on Uncertainty in Artificial Intelligence (UAI2002).
Making Sensitivity Analysis Computationally Efficient
To investigate the robustness of the output probabilities of a Bayesian
network, a sensitivity analysis can be performed. A one-way sensitivity
analysis establishes, for each of the probability parameters of a network, a
function expressing a posterior marginal probability of interest in terms of
the parameter. Current methods for computing the coefficients in such a
function rely on a large number of network evaluations. In this paper, we
present a method that requires just a single outward propagation in a junction
tree for establishing the coefficients in the functions for all possible
parameters; in addition, an inward propagation is required for processing
evidence. Conversely, the method requires a single outward propagation for
computing the coefficients in the functions expressing all possible posterior
marginals in terms of a single parameter. We extend these results to an n-way
sensitivity analysis in which sets of parameters are studied.
Comment: Appears in Proceedings of the Sixteenth Conference on Uncertainty in Artificial Intelligence (UAI2000).
Enhancing QPNs for Trade-off Resolution
Qualitative probabilistic networks have been introduced as qualitative
abstractions of Bayesian belief networks. One of the major drawbacks of these
qualitative networks is their coarse level of detail, which may lead to
unresolved trade-offs during inference. We present an enhanced formalism for
qualitative networks with a finer level of detail. An enhanced qualitative
probabilistic network differs from a regular qualitative network in that it
distinguishes between strong and weak influences. Enhanced qualitative
probabilistic networks are purely qualitative in nature, as regular qualitative
networks are, yet allow for efficiently resolving trade-offs during inference.
Comment: Appears in Proceedings of the Fifteenth Conference on Uncertainty in Artificial Intelligence (UAI1999).
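The trade-off problem can be seen in the standard QPN sign algebra, where combining opposing influences yields the uninformative sign '?'. A sketch of the basic (Wellman-style) sign operators only, not of the enhanced formalism itself:

```python
# Sketch of the basic QPN sign algebra: chaining influences multiplies
# signs, combining parallel influences adds them.  A '+' combined with a
# '-' yields the ambiguous '?', the unresolved trade-off that enhanced
# networks address by distinguishing strong from weak influences.

def sign_product(s1, s2):
    """Sign of a chained influence along a trail."""
    if s1 == '0' or s2 == '0':
        return '0'
    if s1 == '?' or s2 == '?':
        return '?'
    return '+' if s1 == s2 else '-'

def sign_sum(s1, s2):
    """Sign of parallel influences combined at a node."""
    if s1 == '0':
        return s2
    if s2 == '0':
        return s1
    if s1 == s2:
        return s1
    return '?'  # opposing or ambiguous influences: unresolved trade-off

print(sign_sum(sign_product('+', '-'), '+'))
```

In the enhanced formalism, knowing that one of the opposing influences is strictly stronger than the other lets the '?' be replaced by a definite sign.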
Stable Independence and Complexity of Representation
The representation of independence relations generally builds upon the
well-known semigraphoid axioms of independence. Recently, a representation has
been proposed that captures a set of dominant statements of an independence
relation from which any other statement can be generated by means of the
axioms; the cardinality of this set is taken to indicate the complexity of the
relation. Building upon the idea of dominance, we introduce the concept of
stability to provide for a more compact representation of independence. We give
an associated algorithm for establishing such a representation. We show that,
with our concept of stability, many independence relations are found to be of
lower complexity than with existing representations.
Comment: Appears in Proceedings of the Twentieth Conference on Uncertainty in Artificial Intelligence (UAI2004).
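For illustration, the semigraphoid axioms of symmetry, decomposition, and weak union can be applied mechanically to generate statements implied by a single dominant statement. A minimal sketch (contraction, which combines two statements, is omitted):

```python
# Sketch: statements derivable from one independence statement
# I(A, B | C), read "A independent of B given C", by the semigraphoid
# axioms of symmetry, decomposition and weak union.

from itertools import chain, combinations

def subsets(s):
    s = list(s)
    return chain.from_iterable(combinations(s, r) for r in range(len(s) + 1))

def derive(A, B, C):
    A, B, C = frozenset(A), frozenset(B), frozenset(C)
    out = set()
    for B1 in map(frozenset, subsets(B)):
        if not B1:
            continue
        W = B - B1
        out.add((A, B1, C))      # decomposition: drop W from the second set
        out.add((A, B1, C | W))  # weak union: move W into the conditioning set
        out.add((B1, A, C))      # symmetry applied to the statements above
        out.add((B1, A, C | W))
    return out

derived = derive({'a'}, {'b', 'c'}, set())
print(len(derived))
```

Storing only dominant statements and regenerating the rest on demand is exactly the compactness argument the complexity measure builds on.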
Elicitation of Probabilities for Belief Networks: Combining Qualitative and Quantitative Information
Although the usefulness of belief networks for reasoning under uncertainty is
widely accepted, obtaining the numerical probabilities they require is still
perceived as a major obstacle. Often not enough statistical data is available to
allow for reliable probability estimation. Available information may not be
directly amenable to encoding in the network. Finally, domain experts may be
reluctant to provide numerical probabilities. In this paper, we propose a
method for elicitation of probabilities from a domain expert that is
non-invasive and accommodates whatever probabilistic information the expert is
willing to state. We express all available information, whether qualitative or
quantitative in nature, in a canonical form consisting of (in)equalities
expressing constraints on the hyperspace of possible joint probability
distributions. We then use this canonical form to derive second-order
probability distributions over the desired probabilities.
Comment: Appears in Proceedings of the Eleventh Conference on Uncertainty in Artificial Intelligence (UAI1995).
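One simple way to realize a second-order distribution over a desired probability is rejection sampling against canonical-form constraints. This is an illustrative sketch with invented constraints, not the paper's derivation method:

```python
# Sketch (not the paper's algorithm): rejection sampling of probability
# values consistent with expert constraints, inducing an empirical
# second-order distribution over a desired probability p.
# The constraints are invented examples of canonical-form (in)equalities:
#     0.2 <= p <= 0.6    and    p >= q.

import random

random.seed(0)  # reproducible illustration
samples = []
for _ in range(20000):
    p, q = random.random(), random.random()
    if 0.2 <= p <= 0.6 and p >= q:  # keep only consistent draws
        samples.append(p)

mean_p = sum(samples) / len(samples)
print(len(samples), mean_p)
```

The retained samples approximate the distribution over p given everything the expert was willing to state; tighter constraints concentrate it further.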