Ultimate approximations in nonmonotonic knowledge representation systems
We study fixpoints of operators on lattices. To this end we introduce the
notion of an approximation of an operator. We order approximations by means of
a precision ordering. We show that each lattice operator O has a unique most
precise or ultimate approximation. We demonstrate that fixpoints of this
ultimate approximation provide useful insights into fixpoints of the operator
O.
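The kind of fixpoint computation studied here can be illustrated in a few lines (a sketch with made-up names, not code from the paper): Kleene iteration computes the least fixpoint of a monotone operator on a powerset lattice. The operator `T` below is a hypothetical one-step consequence operator for a tiny definite program.

```python
def lfp(op, bottom=frozenset()):
    """Kleene iteration: apply a monotone operator until a fixpoint is reached."""
    x = bottom
    while (y := op(x)) != x:
        x = y
    return x

# Hypothetical definite program:  a.   b :- a.   c :- a, b.
RULES = [("a", []), ("b", ["a"]), ("c", ["a", "b"])]

def T(interp):
    """One-step consequence operator: heads of rules whose bodies hold in interp."""
    return frozenset(h for h, body in RULES if all(b in interp for b in body))

model = lfp(T)  # the least model of the program: {'a', 'b', 'c'}
```

On a finite lattice the iteration always terminates, and for a monotone operator the result is the least fixpoint by the Knaster-Tarski theorem.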
We apply our theory to logic programming and introduce the ultimate
Kripke-Kleene, well-founded and stable semantics. We show that the ultimate
Kripke-Kleene and well-founded semantics are more precise than their standard
counterparts. We argue that ultimate semantics for logic programming have
attractive epistemological properties and that, while in general they are
computationally more complex than the standard semantics, for many classes of
theories, their complexity is no worse.
Comment: This paper was published in Principles of Knowledge Representation and Reasoning, Proceedings of the Eighth International Conference (KR2002).
The Integration of Connectionism and First-Order Knowledge Representation and Reasoning as a Challenge for Artificial Intelligence
Intelligent systems based on first-order logic on the one hand, and on
artificial neural networks (also called connectionist systems) on the other,
differ substantially. It would be very desirable to combine the robust neural
networking machinery with symbolic knowledge representation and reasoning
paradigms like logic programming in such a way that the strengths of either
paradigm will be retained. Current state-of-the-art research, however, fails by
far to achieve this ultimate goal. As one of the main obstacles to be overcome
we perceive the question of how symbolic knowledge can be encoded by means of
connectionist systems: satisfactory answers to this will naturally lead the way
to knowledge extraction algorithms and to integrated neural-symbolic systems.
Comment: In Proceedings of INFORMATION'2004, Tokyo, Japan, to appear. 12 pages.
An analysis of the equational properties of the well-founded fixed point
Well-founded fixed points have been used in several areas of knowledge
representation and reasoning and to give semantics to logic programs involving
negation. They are an important ingredient of approximation fixed point theory.
We study the logical properties of the (parametric) well-founded fixed point
operation. We show that the operation satisfies several, but not all of the
equational properties of fixed point operations described by the axioms of
iteration theories
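One standard construction of the well-founded fixed point, the alternating-fixpoint method, can be sketched as follows (a toy illustration with a made-up program, not the paper's formalism): iterate the Gelfond-Lifschitz operator Γ twice from the empty interpretation; atoms in the least fixpoint of Γ² are true, atoms outside Γ of that fixpoint are false, and the rest are undefined.

```python
def least_model(rules):
    """Least model of a definite program (pairs of head, positive body)."""
    m = set()
    changed = True
    while changed:
        changed = False
        for head, pos in rules:
            if head not in m and all(p in m for p in pos):
                m.add(head)
                changed = True
    return frozenset(m)

def gamma(program, interp):
    """Gelfond-Lifschitz operator: least model of the reduct w.r.t. interp."""
    reduct = [(h, pos) for h, pos, neg in program if not (set(neg) & interp)]
    return least_model(reduct)

# Toy program (hypothetical):  a :- not b.   b :- not a.   c :- not d.
PROGRAM = [("a", [], ["b"]), ("b", [], ["a"]), ("c", [], ["d"])]
ATOMS = {"a", "b", "c", "d"}

# Alternating fixpoint: iterate gamma twice from the empty interpretation.
w = frozenset()
while True:
    nxt = gamma(PROGRAM, gamma(PROGRAM, w))
    if nxt == w:
        break
    w = nxt

true_atoms = w                            # well-founded true atoms
false_atoms = ATOMS - gamma(PROGRAM, w)   # well-founded false atoms
unknown = gamma(PROGRAM, w) - w           # atoms left undefined
```

For this program the mutually negative pair a/b stays undefined, while c is true and d is false, matching the three-valued well-founded model.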
Conditions for a Monotonic Channel Capacity
Motivated by results in optical communications, where the performance can
degrade dramatically if the transmit power is sufficiently increased, the
channel capacity is characterized for various kinds of memoryless vector
channels. It is proved that for all static point-to-point channels, the channel
capacity is a nondecreasing function of power. As a consequence, maximizing the
mutual information over all input distributions with a certain power is for
such channels equivalent to maximizing it over the larger set of input
distributions with upper-bounded power. For interference channels such as
optical wavelength-division multiplexing systems, the primary channel capacity
is always nondecreasing with power if all interferers transmit with identical
distributions as the primary user. Also, if all input distributions in an
interference channel are optimized jointly, then the achievable sum-rate
capacity is again nondecreasing. The results generalize to the channel
capacity as a function of a wide class of costs, not only power.
Comment: This is an updated and expanded version of arXiv:1108.039
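The nesting argument, that enlarging the set of admissible input distributions can only raise the achievable mutual information, can be checked numerically on a toy binary memoryless channel; the transition matrix and cost values below are assumptions for illustration only.

```python
import math

def mutual_info(p1, W):
    """I(X;Y) in bits for a binary-input channel W (2x2 row-stochastic)."""
    px = [1.0 - p1, p1]
    py = [sum(px[i] * W[i][j] for i in range(2)) for j in range(2)]
    return sum(px[i] * W[i][j] * math.log2(W[i][j] / py[j])
               for i in range(2) for j in range(2)
               if px[i] > 0 and W[i][j] > 0)

W = [[0.9, 0.1], [0.2, 0.8]]  # assumed toy memoryless channel
COST = 1.0                    # sending symbol 1 costs 1 unit; symbol 0 is free

def capacity(P, steps=1000):
    """Max mutual information over input distributions with average cost <= P."""
    best = 0.0
    for k in range(steps + 1):
        p1 = k / steps
        if p1 * COST <= P:
            best = max(best, mutual_info(p1, W))
    return best

# The feasible sets nest as P grows, so the maxima can only increase.
caps = [capacity(P / 10) for P in range(11)]
```

This is exactly the point made above: maximizing over distributions with cost at most P is a maximum over a growing family of sets, hence nondecreasing in P.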
Big Entropy Fluctuations in Nonequilibrium Steady State: A Simple Model with Gauss Heat Bath
Large entropy fluctuations in a nonequilibrium steady state of classical
mechanics were studied in extensive numerical experiments on a simple 2-freedom
model with the so-called Gauss time-reversible thermostat. The local
fluctuations (on a set of fixed trajectory segments) from the average heat
entropy absorbed in the thermostat were found to be non-Gaussian. Approximately,
the fluctuations can be described by a two-Gaussian distribution with a
crossover independent of the segment length and the number of trajectories
('particles'). The distribution itself does depend on both, approaching the
single standard Gaussian distribution as any of those parameters increases. The
global time-dependent fluctuations turned out to be qualitatively different in
that they have a strict upper bound much less than the average entropy
production. Thus, unlike the equilibrium steady state, the recovery of the
initial low entropy becomes impossible, after a sufficiently long time, even in
the largest fluctuations. However, preliminary numerical experiments and the
theoretical estimates in the special case of the critical dynamics with
superdiffusion suggest the existence of infinitely many Poincar\'e recurrences
to the initial state and beyond. This is a new, interesting phenomenon to be
further studied together with some other open questions. Relation of this
particular example of nonequilibrium steady state to a long-standing persistent
controversy over statistical 'irreversibility', or the notorious 'time arrow',
is also discussed. In conclusion, an unsolved problem of the origin of the
causality 'principle' is touched upon.
Comment: 21 pages, 7 figures.
A Bayesian approach for statistical–physical bulk parameterization of rain microphysics. Part II: Idealized Markov chain Monte Carlo experiments
Observationally informed development of a new framework for bulk rain microphysics, the Bayesian Observationally Constrained Statistical–Physical Scheme (BOSS; described in Part I of this study), is demonstrated. This scheme’s development is motivated by large uncertainties in cloud and weather simulations associated with approximations and assumptions in existing microphysics schemes. Here, a proof-of-concept study is presented using a Markov chain Monte Carlo sampling algorithm with BOSS to probabilistically estimate microphysical process rates and parameters directly from a set of synthetically generated rain observations. The framework utilized is an idealized steady-state one-dimensional column rainshaft model with specified column-top rain properties and a fixed thermodynamic profile. Different configurations of BOSS (flexibility being a key feature of this approach) are constrained via synthetic observations generated from a traditional three-moment bulk microphysics scheme. The ability to retrieve correct parameter values when the true parameter values are known is illustrated. For cases when there is no set of true parameter values, the accuracy of configurations of BOSS that have different levels of complexity is compared. It is found that addition of the sixth moment as a prognostic variable improves prediction of the third moment (proportional to bulk rain mass) and rain rate. In contrast, increasing process rate formulation complexity by adding more power terms has little benefit, a result that is explained using further-idealized experiments. BOSS rainshaft simulations are shown to estimate the true process rates well from constraint by bulk rain observations, with the additional benefit of rigorously quantified uncertainty of these estimates.
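The estimation step described above, sampling a process-rate parameter conditioned on synthetic observations, can be sketched with a minimal random-walk Metropolis sampler. The linear toy "process", the parameter values, and the noise level below are hypothetical stand-ins, not the actual BOSS process-rate formulation.

```python
import math
import random

random.seed(0)
TRUE_RATE = 2.5   # hypothetical "true" process-rate parameter
SIGMA = 0.3       # known observation-noise standard deviation

# Synthetic observations from the toy process: obs = TRUE_RATE * x + noise
xs = [0.1 * i for i in range(1, 51)]
obs = [TRUE_RATE * x + random.gauss(0.0, SIGMA) for x in xs]

def log_likelihood(rate):
    """Gaussian log-likelihood of the observations given a candidate rate."""
    return sum(-0.5 * ((y - rate * x) / SIGMA) ** 2 for x, y in zip(xs, obs))

# Random-walk Metropolis sampling of the rate parameter (flat prior)
samples, rate = [], 1.0
ll = log_likelihood(rate)
for _ in range(20000):
    prop = rate + random.gauss(0.0, 0.05)
    ll_prop = log_likelihood(prop)
    if math.log(random.random()) < ll_prop - ll:   # accept/reject step
        rate, ll = prop, ll_prop
    samples.append(rate)

posterior = samples[5000:]          # discard burn-in
mean = sum(posterior) / len(posterior)
spread = (sum((s - mean) ** 2 for s in posterior) / len(posterior)) ** 0.5
```

The posterior mean recovers the generating parameter, and the sample spread gives the rigorously quantified uncertainty the abstract refers to, here for a deliberately trivial one-parameter model.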