Reasoning with Partial Knowledge
We investigate how sociological argumentation differs from classical first-order logic, focusing on theories about the age dependence of organizational mortality. The overall pattern of argument does not comply with the classical monotonicity principle, under which adding premises never overturns conclusions. The cause of the nonmonotonicity is the need to derive conclusions from partial knowledge. We identify meta-principles that appear to guide the observed sociological argumentation patterns, and we formalize a semantics to represent them. This semantics yields a new kind of logical consequence relation. We demonstrate that this new logic can reproduce the results of informal sociological theorizing and lead to new insights. It allows us to unify existing theory fragments and paves the way towards a complete classical theory.
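The failure of monotonicity described above can be made concrete in a small sketch. The following is a hypothetical illustration only, not the paper's semantics: a single default rule licenses a conclusion from partial knowledge, and adding one premise (an exception) retracts that conclusion. The rule and atom names are invented for the example.

```python
def consequences(premises):
    """Close a premise set under one default rule (hypothetical):
    'organization' normally yields 'declining_mortality', unless the
    premise 'obsolete' defeats the default."""
    derived = set(premises)
    if "organization" in derived and "obsolete" not in derived:
        derived.add("declining_mortality")  # default conclusion from partial knowledge
    return derived

# Classical monotonicity would require consequences(P) <= consequences(P | Q).
assert "declining_mortality" in consequences({"organization"})
# Adding a premise overturns the earlier conclusion: the relation is nonmonotonic.
assert "declining_mortality" not in consequences({"organization", "obsolete"})
```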
Limits of Preprocessing
We present a first theoretical analysis of the power of polynomial-time
preprocessing for important combinatorial problems from various areas in AI. We
consider problems from Constraint Satisfaction, Global Constraints,
Satisfiability, Nonmonotonic and Bayesian Reasoning. We show that, subject to a
complexity theoretic assumption, none of the considered problems can be reduced
by polynomial-time preprocessing to a problem kernel whose size is polynomial
in a structural problem parameter of the input, such as induced width or
backdoor size. Our results provide a firm theoretical boundary for the
performance of polynomial-time preprocessing algorithms for the considered
problems.
Comment: This is a slightly longer version of a paper that appeared in the proceedings of AAAI 201
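To see what "preprocessing to a problem kernel" means, here is a classical positive example not taken from the paper: Buss's kernelization for Vertex Cover, which shrinks any instance to one with at most k^2 edges or correctly reports a no-instance. The function name and representation are illustrative.

```python
def buss_kernel(edges, k):
    """Buss's kernel for Vertex Cover: a vertex of degree > k must be in any
    cover of size k. Returns (reduced_edges, remaining_budget), or None if the
    instance is provably a no-instance."""
    edges = {frozenset(e) for e in edges}
    while True:
        degree = {}
        for e in edges:
            for v in e:
                degree[v] = degree.get(v, 0) + 1
        high = [v for v, d in degree.items() if d > k]
        if not high:
            break
        v = high[0]
        edges = {e for e in edges if v not in e}  # v is forced into the cover
        k -= 1
        if k < 0:
            return None
    if len(edges) > k * k:  # a yes-instance has at most k^2 edges left
        return None
    return edges, k

# Star graph: the centre has degree 4 > 1, so it is forced into the cover.
assert buss_kernel([(0, 1), (0, 2), (0, 3), (0, 4)], 1) == (set(), 0)
# Any edge with budget 0 is a no-instance.
assert buss_kernel([(0, 1)], 0) is None
```

The lower-bound results in the abstract say that, for the listed reasoning problems, no analogous polynomial-size kernel exists under the stated complexity-theoretic assumption.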
Almost Ideal: Computational Epistemology and the Limits of Rationality for Finite Reasoners
The notion of an ideal reasoner has several uses in epistemology. Often, ideal reasoners are used as a parameter of (maximum) rationality for finite reasoners (e.g. humans). However, the notion of an ideal reasoner is normally construed with such a high degree of idealization (e.g. infinite/unbounded memory) that this use is ill-advised. In this dissertation, I investigate the conditions under which an ideal reasoner may be used as a parameter of rationality for finite reasoners. In addition, I present and justify the research program of computational epistemology, which investigates the parameter of maximum rationality for finite reasoners using computer simulations.
Housing prices and multiple employment nodes: is the relationship nonmonotonic?
Standard urban economic theory predicts that house prices will decline with distance from the central business district. Empirical results have been equivocal, however. Discrepancies between theory and empirics may be due to a nonmonotonic relationship between house prices and access to employment, arising from the negative externalities associated with proximity to multiple centres of employment. Based on data from Glasgow (Scotland), we use gravity-based measures of accessibility estimated with a flexible functional form that allows for nonmonotonicity. The results are thoroughly tested using recent advances in spatial econometrics. We find compelling evidence of a nonmonotonic effect in the accessibility measure and discuss the implications for planning and housing policy.
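The mechanism in this abstract can be sketched in a few lines. This is a hedged illustration, not the paper's specification or its Glasgow data: a standard gravity accessibility measure, plus a quadratic price effect whose negative second-order coefficient makes the relationship between accessibility and prices rise and then fall.

```python
import math

def gravity_accessibility(employment, distances, beta=0.5):
    """Gravity measure: employment at each node discounted by exponential
    distance decay (beta is an illustrative decay parameter)."""
    return sum(e * math.exp(-beta * d) for e, d in zip(employment, distances))

def price_effect(access, b1=2.0, b2=-0.1):
    """Flexible (quadratic) form: with b2 < 0 the effect of accessibility on
    house prices is nonmonotonic, peaking at access = -b1 / (2 * b2)."""
    return b1 * access + b2 * access ** 2

# Zero distance means no discounting.
assert gravity_accessibility([100.0], [0.0]) == 100.0
# With b1=2, b2=-0.1 the effect peaks at access = 10: rise, then fall.
assert price_effect(10.0) > price_effect(5.0)
assert price_effect(10.0) > price_effect(15.0)
```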
Change-Point Testing and Estimation for Risk Measures in Time Series
We investigate methods of change-point testing and confidence interval
construction for nonparametric estimators of expected shortfall and related
risk measures in weakly dependent time series. A key aspect of our work is the
ability to detect general multiple structural changes in the tails of time
series marginal distributions. Unlike extant approaches for detecting tail
structural changes using quantities such as tail index, our approach does not
require parametric modeling of the tail and detects more general changes in the
tail. Additionally, our methods are based on the recently introduced
self-normalization technique for time series, allowing for statistical analysis
without the issues of consistent standard error estimation. The theoretical
foundations for our methods are functional central limit theorems, which we
develop under weak assumptions. An empirical study of S&P 500 returns and US
30-Year Treasury bonds illustrates the practical use of our methods in
detecting and quantifying market instability via the tails of financial time
series during times of financial crisis.
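For concreteness, here is a minimal sketch of the nonparametric expected-shortfall estimator such methods build on. The change-point tests and the self-normalization machinery from the paper are not reproduced; this only shows the tail quantity being monitored.

```python
def expected_shortfall(returns, alpha=0.05):
    """Nonparametric ES at level alpha: the average of the returns at or
    below the empirical alpha-quantile (the worst alpha-fraction)."""
    losses = sorted(returns)              # ascending: worst outcomes first
    k = max(1, int(len(losses) * alpha))  # number of lower-tail observations
    return sum(losses[:k]) / k

# With returns 1..100 and alpha = 0.05, the tail is {1, 2, 3, 4, 5}.
assert expected_shortfall(list(range(1, 101)), 0.05) == 3.0
```

A change-point procedure would compare this statistic across candidate segmentations of the series.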
Guarantees and Limits of Preprocessing in Constraint Satisfaction and Reasoning
We present a first theoretical analysis of the power of polynomial-time
preprocessing for important combinatorial problems from various areas in AI. We
consider problems from Constraint Satisfaction, Global Constraints,
Satisfiability, Nonmonotonic and Bayesian Reasoning under structural
restrictions. All these problems involve two tasks: (i) identifying the
structure in the input as required by the restriction, and (ii) using the
identified structure to solve the reasoning task efficiently. We show that for
most of the considered problems, task (i) admits a polynomial-time
preprocessing to a problem kernel whose size is polynomial in a structural
problem parameter of the input, in contrast to task (ii) which does not admit
such a reduction to a problem kernel of polynomial size, subject to a
complexity theoretic assumption. As a notable exception we show that the
consistency problem for the AtMost-NValue constraint admits a polynomial kernel
consisting of a quadratic number of variables and domain values. Our results
provide firm worst-case guarantees and theoretical boundaries for the
performance of polynomial-time preprocessing algorithms for the considered
problems.
Comment: arXiv admin note: substantial text overlap with arXiv:1104.2541, arXiv:1104.556
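The AtMost-NValue constraint mentioned above requires that an assignment to the variables use at most a given number of distinct values. As a hedged illustration of the constraint itself (not of the paper's quadratic kernel), a brute-force consistency check can be written directly from the definition:

```python
from itertools import product

def atmost_nvalue_consistent(domains, n):
    """Exponential brute-force check, for illustration only: does some
    assignment drawn from the variable domains use at most n distinct
    values? Kernelization aims to shrink the instance before such search."""
    return any(len(set(assignment)) <= n
               for assignment in product(*domains))

# Consistent: assigning value 2 to both variables uses one distinct value.
assert atmost_nvalue_consistent([[1, 2], [2, 3]], 1) is True
# Inconsistent: the domains force two distinct values.
assert atmost_nvalue_consistent([[1], [2]], 1) is False
```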
A Nonlinear Random Coefficients Model for Degradation Testing
As an alternative to traditional life testing, degradation tests can be effective in assessing product reliability when measurements of degradation leading to failure can be observed. This article presents a degradation model for highly reliable light displays, such as plasma display panels and vacuum fluorescent displays (VFDs). Standard degradation models fail to capture the burn-in characteristics of VFDs, in which emitted light actually increases up to a certain point in time before decreasing (or degrading) continuously. Random coefficients are used to model this phenomenon in a nonlinear way, which allows for a nonmonotonic degradation path. In many situations, the relative efficiency of the lifetime estimate is improved over standard estimators based on transformed linear models.
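A path of the general shape described, rising during burn-in and then degrading, can be sketched with a simple nonlinear form and unit-level random coefficients. The functional form and parameter values below are illustrative assumptions, not those fitted in the article.

```python
import math
import random

def degradation_path(t, a, b):
    """Unit-specific path L(t) = a * t * exp(-b * t): increases to a peak at
    t = 1/b (the burn-in period), then decreases continuously, giving a
    nonmonotonic degradation path."""
    return a * t * math.exp(-b * t)

def draw_unit_coefficients(rng, mean_a=100.0, sd_a=10.0, mean_b=0.01, sd_b=0.002):
    """Random coefficients: each display draws its own (a, b), so paths vary
    across units while sharing the same nonlinear shape."""
    a = rng.gauss(mean_a, sd_a)
    b = max(1e-4, rng.gauss(mean_b, sd_b))  # keep the decay rate positive
    return a, b

# With b = 0.01 the path peaks at t = 1/b = 100: it rises before and falls after.
assert degradation_path(100, 1.0, 0.01) > degradation_path(10, 1.0, 0.01)
assert degradation_path(100, 1.0, 0.01) > degradation_path(500, 1.0, 0.01)
```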