Nonmonotonic Probabilistic Logics between Model-Theoretic Probabilistic Logic and Probabilistic Logic under Coherence
Recently, it has been shown that probabilistic entailment under coherence is
weaker than model-theoretic probabilistic entailment. Moreover, probabilistic
entailment under coherence is a generalization of default entailment in System
P. In this paper, we continue this line of research by presenting probabilistic
generalizations of more sophisticated notions of classical default entailment
that lie between model-theoretic probabilistic entailment and probabilistic
entailment under coherence. That is, the new formalisms properly generalize
their counterparts in classical default reasoning, they are weaker than
model-theoretic probabilistic entailment, and they are stronger than
probabilistic entailment under coherence. The new formalisms are useful
especially for handling probabilistic inconsistencies related to conditioning
on zero events. They can also be applied for probabilistic belief revision.
More generally, in the same spirit as a similar previous paper, this paper
sheds light on exciting new formalisms for probabilistic reasoning beyond the
well-known standard ones.

Comment: 10 pages; in Proceedings of the 9th International Workshop on Non-Monotonic Reasoning (NMR-2002), Special Session on Uncertainty Frameworks in Nonmonotonic Reasoning, pages 265-274, Toulouse, France, April 2002.
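The relative strength of the three notions can be summarized schematically. Writing $\models_{\mathrm{mt}}$ for model-theoretic probabilistic entailment, $\models_{\mathrm{new}}$ for one of the new intermediate formalisms, and $\models_{\mathrm{coh}}$ for probabilistic entailment under coherence (these symbols are introduced here for illustration, not taken from the paper), the stated results say that for a probabilistic knowledge base $KB$ and conditional constraint $\phi$:

```latex
\[
  KB \models_{\mathrm{mt}} \phi
  \;\Longrightarrow\;
  KB \models_{\mathrm{new}} \phi
  \;\Longrightarrow\;
  KB \models_{\mathrm{coh}} \phi .
\]
```

The converse implications fail in general, which is exactly the sense in which the new formalisms lie strictly between the two standard notions.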
Relevance and Conditionals: A Synopsis of Open Pragmatic and Semantic Issues
Recently several papers have reported relevance effects on the cognitive assessments of indicative conditionals, which pose an explanatory challenge to David Over's Suppositional Theory of conditionals, which is influential in the psychology of reasoning. Some of these results concern the “Equation” (P(if A, then C) = P(C|A)), others the de Finetti truth table, and yet others the uncertain and-to-if inference task. The purpose of this chapter is to take a bird's-eye view of the debate and investigate some of the open theoretical issues posed by the empirical results. Central among these is whether to count these effects as belonging to pragmatics or to semantics.
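The “Equation” can be made concrete with a small numerical sketch (the joint distribution below is illustrative, not taken from the chapter): on the suppositional reading, the probability of "if A, then C" is the conditional probability P(C|A), and the cases where A is false simply drop out of the assessment.

```python
# Sketch of the "Equation" P(if A, then C) = P(C|A) on a toy joint distribution
# over the four combinations of A and C (illustrative numbers, my own).
joint = {
    ("A", "C"): 0.30,      # A true, C true
    ("A", "not-C"): 0.10,  # A true, C false
    ("not-A", "C"): 0.20,
    ("not-A", "not-C"): 0.40,
}

p_A = joint[("A", "C")] + joint[("A", "not-C")]  # P(A) = 0.40
p_AC = joint[("A", "C")]                         # P(A and C) = 0.30
p_C_given_A = p_AC / p_A                         # P(C|A) = 0.30 / 0.40

# On the suppositional reading, P(if A, then C) = P(C|A) = 0.75; the not-A
# rows never enter the computation.
print(p_C_given_A)
```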
Probabilistic entailment and iterated conditionals
In this paper we exploit the notions of conjoined and iterated conditionals, which are defined in the setting of coherence by means of suitable conditional random quantities with values in the interval $[0,1]$. We examine the iterated conditional $(B|K)|(A|H)$, by showing that $A|H$ p-entails $B|K$ if and only if $(B|K)|(A|H)=1$. Then, we show that a p-consistent family $\mathcal{F}=\{E_1|H_1,E_2|H_2\}$ p-entails a conditional event $E_3|H_3$ if and only if $E_3|H_3=1$, or $(E_3|H_3)|QC(\mathcal{S})=1$ for some nonempty subset $\mathcal{S}$ of $\mathcal{F}$, where $QC(\mathcal{S})$ is the quasi conjunction of the conditional events in $\mathcal{S}$. Then, we examine the inference rules And, Cut, Cautious Monotonicity, and Or of System~P and other well known inference rules (Modus Ponens, Modus Tollens, Bayes). We also show that $QC(\mathcal{F})|\mathcal{C}(\mathcal{F})=1$, where $\mathcal{C}(\mathcal{F})$ is the conjunction of the conditional events in $\mathcal{F}$. We characterize p-entailment by showing that $\mathcal{F}$ p-entails $E_3|H_3$ if and only if $(E_3|H_3)|\mathcal{C}(\mathcal{F})=1$. Finally, we examine \emph{Denial of the antecedent} and \emph{Affirmation of the consequent}, where the p-entailment of $E_3|H_3$ from $\mathcal{F}$ does not hold, by showing that $(E_3|H_3)|\mathcal{C}(\mathcal{F})\neq1$.
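The quasi conjunction rests on the three-valued semantics of conditional events, which can be sketched directly (the representation below is mine, not the paper's): a conditional event A|H is true if A∧H, false if ¬A∧H, and void if ¬H, and the quasi conjunction of A|H and B|K is the conditional event ((A∧H)∨¬H) ∧ ((B∧K)∨¬K), given H∨K.

```python
# Three-valued evaluation of conditional events and their quasi conjunction,
# following the de Finetti-style semantics described above (toy sketch).
T, F, V = "true", "false", "void"

def conditional(a: bool, h: bool) -> str:
    """Evaluate A|H at a possible world: void when the antecedent H is false."""
    if not h:
        return V
    return T if a else F

def quasi_conjunction(a: bool, h: bool, b: bool, k: bool) -> str:
    """Evaluate QC(A|H, B|K) at the world (a, h, b, k)."""
    if not (h or k):
        return V                      # both conditionals void
    ok1 = (a and h) or not h          # A|H is not false
    ok2 = (b and k) or not k          # B|K is not false
    return T if (ok1 and ok2) else F

# Both conditionals true -> quasi conjunction true:
print(quasi_conjunction(True, True, True, True))
# One conditional false (its antecedent holds) -> false:
print(quasi_conjunction(False, True, True, True))
# One conditional void, the other true -> still true:
print(quasi_conjunction(True, True, True, False))
```

This last case is what distinguishes the quasi conjunction from an ordinary conjunction: a void conjunct does not block truth.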
A Probabilistic Defense of Proper De Jure Objections to Theism
A common view among nontheists combines the de jure objection that theism is epistemically unacceptable with agnosticism about the de facto objection that theism is false. Following Plantinga, we can call this a “proper” de jure objection—a de jure objection that does not depend on any de facto objection. In his Warranted Christian Belief, Plantinga has produced a general argument against all proper de jure objections. Here I first show that this argument is logically fallacious (it commits subtle probabilistic fallacies disguised by scope ambiguities), and then proceed to lay the groundwork for the construction of actual proper de jure objections.
Belief Revision with Uncertain Inputs in the Possibilistic Setting
This paper discusses belief revision under uncertain inputs in the framework
of possibility theory. Revision can be based on two possible definitions of the
conditioning operation, one based on min operator which requires a purely
ordinal scale only, and another based on product, for which a richer structure
is needed, and which is a particular case of Dempster's rule of conditioning.
Besides, revision under uncertain inputs can be understood in two different
ways depending on whether the input is viewed, or not, as a constraint to
enforce. Moreover, it is shown that M.A. Williams' transmutations, originally
defined in the setting of Spohn's functions, can be captured in this framework,
as well as Boutilier's natural revision.

Comment: Appears in Proceedings of the Twelfth Conference on Uncertainty in Artificial Intelligence (UAI1996).
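The two conditioning operations mentioned above can be sketched on a toy possibility distribution (the distribution and event are my own, for illustration): min-based conditioning promotes the most plausible worlds of the input event A to 1 and needs only an ordinal scale, while product-based conditioning renormalizes by the possibility of A, as in Dempster's rule.

```python
# Min-based vs product-based conditioning of a possibility distribution pi
# by an event A (a set of worlds). Toy sketch of the two definitions.

def possibility(pi: dict, event: set) -> float:
    """Pi(A) = max of pi over the worlds in A."""
    return max((pi[w] for w in event), default=0.0)

def condition_min(pi: dict, event: set) -> dict:
    """Min-based: most plausible worlds of A get 1, others in A keep their level."""
    top = possibility(pi, event)
    return {w: (0.0 if w not in event else 1.0 if pi[w] == top else pi[w])
            for w in pi}

def condition_product(pi: dict, event: set) -> dict:
    """Product-based: renormalize by Pi(A) inside A (needs a numerical scale)."""
    top = possibility(pi, event)
    return {w: (pi[w] / top if w in event else 0.0) for w in pi}

pi = {"w1": 1.0, "w2": 0.7, "w3": 0.4, "w4": 0.2}
A = {"w2", "w3"}

print(condition_min(pi, A))      # w2 -> 1.0, w3 -> 0.4, others -> 0.0
print(condition_product(pi, A))  # w2 -> 1.0, w3 -> 0.4/0.7, others -> 0.0
```

Note how the min-based result uses only the ordering of the levels, whereas the product-based result depends on their numerical ratios.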
A process model of the understanding of uncertain conditionals
To build a process model of the understanding of conditionals we extract a common core of three semantics of if-then sentences: (a) the conditional event interpretation in coherence-based probability logic, (b) the discourse processing theory of Hans Kamp, and (c) the game-theoretical approach of Jaakko Hintikka. The empirical part reports three experiments in which each participant assessed the probability of 52 if-then sentences in a truth table task. Each experiment included a second task: an n-back task relating the interpretation of conditionals to working memory, a Bayesian bookbag and poker chip task relating the interpretation of conditionals to probability updating, and a probabilistic modus ponens task relating the interpretation of conditionals to a classical inference task. Data analysis shows that the way in which the conditionals are interpreted correlates with each of the supplementary tasks. The results are discussed within the process model proposed in the introduction.
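A probabilistic modus ponens task of the kind mentioned can be sketched numerically: in coherence-based probability logic, given P(A) = x and P(C|A) = y, the coherent values of P(C) form the interval [xy, xy + (1 − x)] (the premise probabilities below are illustrative, not from the study).

```python
# Coherent bounds for probabilistic modus ponens: from P(A) and P(C|A),
# the conclusion probability P(C) is constrained to an interval, not a point.

def modus_ponens_bounds(p_a: float, p_c_given_a: float) -> tuple:
    lower = p_a * p_c_given_a          # C via A: P(A and C)
    upper = lower + (1.0 - p_a)        # the not-A cases may all make C true
    return lower, upper

# Example: P(A) = 0.8, P(C|A) = 0.9 gives P(C) in roughly [0.72, 0.92].
print(modus_ponens_bounds(0.8, 0.9))
```

Any assessment of P(C) inside the interval is coherent; a point value outside it would be probabilistically incoherent with the premises.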
Coherent frequentism
By representing the range of fair betting odds according to a pair of
confidence set estimators, dual probability measures on parameter space called
frequentist posteriors secure the coherence of subjective inference without any
prior distribution. The closure of the set of expected losses corresponding to
the dual frequentist posteriors constrains decisions without arbitrarily
forcing optimization under all circumstances. This decision theory reduces to
those that maximize expected utility when the pair of frequentist posteriors is
induced by an exact or approximate confidence set estimator or when an
automatic reduction rule is applied to the pair. In such cases, the resulting
frequentist posterior is coherent in the sense that, as a probability
distribution of the parameter of interest, it satisfies the axioms of the
decision-theoretic and logic-theoretic systems typically cited in support of
the Bayesian posterior. Unlike the p-value, the confidence level of an interval
hypothesis derived from such a measure is suitable as an estimator of the
indicator of hypothesis truth since it converges in sample-space probability to
1 if the hypothesis is true or to 0 otherwise under general conditions.

Comment: The confidence-measure theory of inference and decision is explicitly extended to vector parameters of interest. The derivation of upper and lower confidence levels from valid and nonconservative set estimators is formalized.
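The convergence claim can be illustrated with a simple sketch (not the paper's exact construction; the model and numbers are my own): for a normal mean with known sigma, take the confidence distribution of theta given the sample mean to be N(xbar, sigma^2/n), so the confidence level of the hypothesis "theta <= 0" is a normal tail probability, which tends to the indicator of hypothesis truth as n grows.

```python
import math
import random

def phi(z: float) -> float:
    """Standard normal CDF via the error function."""
    return 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))

def confidence_level_theta_le_0(xbar: float, sigma: float, n: int) -> float:
    """Confidence level assigned to the hypothesis theta <= 0."""
    return phi((0.0 - xbar) / (sigma / math.sqrt(n)))

random.seed(0)
theta, sigma = 0.5, 1.0   # true mean is positive, so "theta <= 0" is false
for n in (10, 1000, 100000):
    xbar = sum(random.gauss(theta, sigma) for _ in range(n)) / n
    print(n, confidence_level_theta_le_0(xbar, sigma, n))
# As n grows the confidence level tends to 0, estimating the indicator of
# hypothesis truth (here 0, since the hypothesis is false).
```

Under a true hypothesis (e.g. theta = -0.5) the same quantity would tend to 1, which is the sense in which the confidence level, unlike the p-value, estimates the indicator of hypothesis truth.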