Predicting globally-coherent temporal structures from texts via endpoint inference and graph decomposition
An elegant approach to learning temporal orderings from texts is to formulate this problem as a constraint optimization problem, which can then be given an exact solution using Integer Linear Programming. This works well for cases where the number of possible relations between temporal entities is restricted to the mere precedence relation [Bramsen et al., 2006; Chambers and Jurafsky, 2008], but becomes impractical when considering all possible interval relations. This paper proposes two innovations, inspired by work on temporal reasoning, that control this combinatorial blow-up, thereby rendering exact ILP inference viable in the general case. First, we translate our network of constraints from temporal intervals to their endpoints, so as to handle a drastically smaller set of constraints while preserving the same temporal information. Second, we show that additional efficiency is gained by enforcing coherence on particular subsets of the entire temporal graphs. We evaluate these innovations through various experiments on TimeBank 1.2, and compare our ILP formulations with various baselines and oracle systems.
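The interval-to-endpoint translation described in this abstract can be illustrated with a small sketch. Each interval contributes a start and an end point, and an Allen-style interval relation becomes a conjunction of orderings between points; consistency then reduces to detecting a cycle of strict precedences. The relation table, the toy network, and the naive closure-based check below are illustrative assumptions, not the paper's actual ILP formulation.

```python
# Sketch: translating Allen interval relations into endpoint (point-algebra)
# constraints, in the spirit of the endpoint inference described above.
from itertools import product

# Each interval x contributes two points: (x, 's') and (x, 'e'), with s < e.
# A few Allen relations (x REL y) expressed over endpoints:
ALLEN_TO_ENDPOINTS = {
    "before":   [(("x", "e"), "<", ("y", "s"))],
    "meets":    [(("x", "e"), "=", ("y", "s"))],
    "overlaps": [(("x", "s"), "<", ("y", "s")),
                 (("y", "s"), "<", ("x", "e")),
                 (("x", "e"), "<", ("y", "e"))],
}

def endpoint_constraints(relations):
    """Expand interval relations into point constraints, plus s < e per interval."""
    cons, intervals = [], set()
    for (i, rel, j) in relations:
        intervals.update([i, j])
        for (a, op, b) in ALLEN_TO_ENDPOINTS[rel]:
            cons.append(((i, a[1]) if a[0] == "x" else (j, a[1]), op,
                         (i, b[1]) if b[0] == "x" else (j, b[1])))
    for i in intervals:
        cons.append(((i, "s"), "<", (i, "e")))
    return cons

def consistent(cons):
    """Naive check: merge '=' points, close '<' transitively; a point strictly
    before itself means the network is inconsistent."""
    parent = {}
    def find(p):
        parent.setdefault(p, p)
        while parent[p] != p:
            p = parent[p]
        return p
    for (a, op, b) in cons:
        if op == "=":
            parent[find(a)] = find(b)
    edges = {(find(a), find(b)) for (a, op, b) in cons if op == "<"}
    changed = True
    while changed:
        changed = False
        for (a, b), (c, d) in product(list(edges), list(edges)):
            if b == c and (a, d) not in edges:
                edges.add((a, d)); changed = True
    return all(a != b for (a, b) in edges)

net = [("A", "before", "B"), ("B", "meets", "C")]
print(consistent(net_cons := endpoint_constraints(net)))                    # True
print(consistent(endpoint_constraints(net + [("C", "before", "A")])))       # False
```

Note how two intervals yield only four points and a handful of orderings, which is the source of the "drastically smaller" constraint set the abstract mentions; a real system would hand such constraints to an ILP solver rather than this toy closure.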
Agent-Based Models and Simulations in Economics and Social Sciences: from conceptual exploration to distinct ways of experimenting
Now that complex Agent-Based Models and computer simulations have spread across economics and the social sciences, as in most sciences of complex systems, epistemological puzzles (re)emerge. We introduce new epistemological tools to show to what precise extent each author is right in focusing on the empirical, instrumental, or conceptual significance of his or her model or simulation. By distinguishing between models and simulations, between types of models, between types of computer simulations, and between types of empiricity, section 2 gives conceptual tools to explain the rationale of the diverse epistemological positions presented in section 1. Finally, we claim that careful attention to the real multiplicity of denotational powers of the symbols at stake, and then to the implicit routes of reference operated by models and computer simulations, is necessary to determine, in each case, the proper epistemic status and credibility of a given model and/or simulation.
Ingarden's Combinatorial Analysis of The Realism-Idealism Controversy
The Controversy over the Existence of the World (henceforth Controversy) is the magnum opus of Polish philosopher Roman Ingarden. Despite the renewed interest in Ingarden's pioneering ontological work within analytic philosophy, little attention has been devoted to Controversy's main goal, clearly indicated by the very title of the book: finding a solution to the centuries-old philosophical controversy about the ontological status of the external world.
There are at least three reasons for this relative indifference. First, even at the time when the book was published, the Controversy was no longer seen as a serious polemical topic, whether because it was disqualified as an archaic metaphysical pseudo-problem or because it was taken to be the last remnant of an antiscientific approach to philosophy culminating in idealism and relativism. Second, Ingarden's reasoning on the matter is highly complex, at times misleading, and even occasionally faulty. Finally, his analysis is not only incomplete (Controversy being unfinished) but also arguably aporetic.
One may wonder, then, why it is still worth excavating this mammoth treatise to study an issue apparently no longer relevant to contemporary philosophy. Aside from historical and exegetical purposes, which are of course very interesting in their own right, Ingarden's treatment of the Controversy remains one of the most detailed and ambitious ontological undertakings of the twentieth century. Not only does it lay out a remarkably detailed map of possible solutions to the Controversy, but it also tries to show why the latter is a genuine and fundamental problem that owes its hasty disqualification to various oversimplifications over the course of the history of philosophy.
In this chapter, I first give an overview of Ingarden's method, which relies mainly on a combinatorial analysis. Then, I summarize his examination of possible solutions to the Controversy and determine which ones can be ruled out on ontological grounds. Finally, I explain why this ambitious project ultimately leads to a theoretical impasse, leaving Ingarden unable to come up with a definitive solution to the Controversy, regardless of the fact that the book is unfinished. I argue that his analysis of the problem nonetheless yields a more modest but valuable result.
A reusable iterative optimization software library to solve combinatorial problems with approximate reasoning
Real world combinatorial optimization problems such as scheduling are
typically too complex to solve with exact methods. Additionally, the problems
often have to observe vaguely specified constraints of different importance,
the available data may be uncertain, and compromises between antagonistic
criteria may be necessary. We present a combination of approximate reasoning
based constraints and iterative optimization based heuristics that help to
model and solve such problems in a framework of C++ software libraries called
StarFLIP++. While initially developed to schedule continuous caster units in
steel plants, we present in this paper results from reusing the library
components in a shift scheduling system for the workforce of an industrial
production plant.
Comment: 33 pages, 9 figures; for a project overview see
http://www.dbai.tuwien.ac.at/proj/StarFLIP
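The combination of vaguely specified constraints and iterative optimization that this abstract describes can be sketched in miniature: each soft constraint yields a satisfaction degree in [0, 1], degrees are aggregated by importance weights, and a simple local search improves the schedule. All names, numbers, and the hill-climbing loop below are hypothetical illustrations, not StarFLIP++ code.

```python
# Illustrative sketch (not StarFLIP++ itself): fuzzy constraints scored in
# [0, 1], combined by importance-weighted aggregation, improved by local search.
import random

def pref_close_to(target, tolerance):
    """Fuzzy constraint: satisfaction decays linearly with distance from target."""
    return lambda v: max(0.0, 1.0 - abs(v - target) / tolerance)

# e.g. weekly workload per worker: prefer ~40 hours (important),
# prefer ~5 shifts (less important)
constraints = [
    (3.0, pref_close_to(40, 10)),  # (importance, constraint on hours)
    (1.0, pref_close_to(5, 3)),    # (importance, constraint on shift count)
]

def score(hours, shifts):
    """Weighted average satisfaction of all fuzzy constraints."""
    degrees = [constraints[0][1](hours), constraints[1][1](shifts)]
    weights = [w for (w, _) in constraints]
    return sum(w * d for w, d in zip(weights, degrees)) / sum(weights)

def local_search(hours, shifts, steps=200, seed=0):
    """Greedy iterative improvement: keep a random move only if it scores better."""
    rng = random.Random(seed)
    best = (hours, shifts)
    for _ in range(steps):
        cand = (best[0] + rng.choice([-8, 8]), best[1] + rng.choice([-1, 1]))
        if score(*cand) > score(*best):
            best = cand
    return best

print(local_search(16, 2))  # moves toward the preferred region (~40 h, ~5 shifts)
```

The point of the sketch is the division of labor the abstract describes: approximate reasoning scores compromises between antagonistic criteria, while the iterative heuristic explores candidate schedules.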
Training neural networks to encode symbols enables combinatorial generalization
Combinatorial generalization - the ability to understand and produce novel
combinations of already familiar elements - is considered to be a core capacity
of the human mind and a major challenge to neural network models. A significant
body of research suggests that conventional neural networks cannot solve this
problem unless they are endowed with mechanisms specifically engineered for the
purpose of representing symbols. In this paper we introduce a novel way of
representing symbolic structures in connectionist terms - the vectors approach
to representing symbols (VARS), which allows training standard neural
architectures to encode symbolic knowledge explicitly at their output layers.
In two simulations, we show that neural networks not only can learn to produce
VARS representations, but in doing so they achieve combinatorial generalization
in their symbolic and non-symbolic output. This adds to other recent work that
has shown improved combinatorial generalization under specific training
conditions, and raises the question of whether specific mechanisms or training
routines are needed to support symbolic processing.
Reformulation in planning
Reformulation of a problem is intended to make the problem more amenable to efficient solution. This is equally true in the special case of reformulating a planning problem. This paper considers various ways in which reformulation can be exploited in planning.
Probabilistic Model Checking for Energy Analysis in Software Product Lines
In a software product line (SPL), a collection of software products is defined by their commonalities in terms of features rather than by explicitly specifying all products one by one. Several verification techniques have been adapted to establish temporal properties of SPLs. Symbolic and family-based model checking have proven successful at tackling the combinatorial blow-up that arises when reasoning about several feature combinations. However, most formal verification approaches for SPLs presented in the literature focus on static SPLs, where the features of a product are fixed and cannot be changed at runtime. This is in contrast to dynamic SPLs, which allow the feature combinations of a product to be adapted dynamically after deployment. The main contribution of the paper is a compositional modeling framework for dynamic SPLs which supports probabilistic and nondeterministic choices and allows for quantitative analysis. We specify the feature changes during runtime within an automata-based coordination component, enabling reasoning about strategies for triggering dynamic feature changes that optimize various quantitative objectives, e.g., energy or monetary costs and reliability. For our framework there is a natural and conceptually simple translation into the input language of the prominent probabilistic model checker PRISM. This facilitates the application of PRISM's powerful symbolic engine to the operational behavior of dynamic SPLs and their family-based analysis against various quantitative queries. We demonstrate the feasibility of our approach with a case study of an energy-aware bonding network device.
Comment: 14 pages, 11 figures
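The kind of quantitative query such a framework hands to a probabilistic model checker can be sketched on a toy model: the expected accumulated cost (e.g. energy) until a goal state is reached in a small Markov chain. The two-mode "device" model below is invented for illustration; PRISM computes such values symbolically and at far larger scale.

```python
# Minimal sketch of a quantitative (expected-cost) analysis via value iteration.
def expected_cost(transitions, cost, goal, iters=1000):
    """Expected cost-to-goal in an absorbing Markov chain.

    transitions[s] = list of (probability, successor); cost[s] = per-step cost.
    """
    states = set(transitions) | {goal}
    v = {s: 0.0 for s in states}
    for _ in range(iters):
        for s in transitions:
            if s == goal:
                continue
            v[s] = cost[s] + sum(p * v[t] for p, t in transitions[s])
    return v

# Hypothetical "network device": a cheap low-power mode occasionally wakes into
# an expensive active mode, which finishes its work half the time.
model = {
    "low":    [(0.9, "low"), (0.1, "active")],
    "active": [(0.5, "done"), (0.5, "low")],
}
energy = {"low": 1.0, "active": 5.0}
v = expected_cost(model, energy, goal="done")
print(round(v["low"], 2))  # 30.0, the exact solution of the two linear equations
```

Solving the balance equations by hand confirms the fixed point: v(active) = 5 + 0.5·v(low) and v(low) = 1 + 0.9·v(low) + 0.1·v(active) give v(active) = 20 and v(low) = 30, so a strategy question ("does switching policy reduce expected energy?") becomes a comparison of such values.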
Symmetries in planning problems
Symmetries arise in planning in a variety of ways. This paper describes the ways that symmetry arises most naturally in planning problems and reviews the approaches that have been applied to the exploitation of symmetry in order to reduce search for plans. It then introduces some extensions to the use of symmetry in planning before moving on to consider how the exploitation of symmetry in planning might be generalised to offer new approaches to exploiting symmetry in other combinatorial search problems.
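One standard symmetry-breaking idea alluded to above can be shown on a toy search problem: when objects (here, identical machines) are interchangeable, forcing them to be used in index order explores only one representative of each symmetric class of assignments. The problem and the canonical-ordering rule below are illustrative assumptions, not taken from the paper.

```python
# Sketch: pruning renaming symmetry by enumerating only "restricted growth"
# assignments, where machine index k may be used only after indices 0..k-1.
def assignments(n_jobs, n_machines, break_symmetry):
    """Enumerate assignments of jobs to identical machines."""
    results = []
    def search(job, assign):
        if job == n_jobs:
            results.append(tuple(assign))
            return
        used = max(assign, default=-1)
        limit = (used + 2) if break_symmetry else n_machines
        for m in range(min(n_machines, limit)):
            search(job + 1, assign + [m])
    search(0, [])
    return results

full = assignments(3, 3, break_symmetry=False)    # all 3^3 assignments
reduced = assignments(3, 3, break_symmetry=True)  # one per renaming class
print(len(full), len(reduced))  # 27 5
```

The reduced count (5 = the number of set partitions of three jobs) is exactly the number of genuinely distinct schedules, which is the saving symmetry exploitation aims at in plan search.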
Proving soundness of combinatorial Vickrey auctions and generating verified executable code
Using mechanised reasoning we prove that combinatorial Vickrey auctions are
soundly specified in that they associate a unique outcome (allocation and
transfers) to any valid input (bids). Having done so, we auto-generate verified
executable code from the formally defined auction. This removes a source of
error in implementing the auction design. We intend to use formal methods to
verify new auction designs. Here, our contribution is to introduce and
demonstrate the use of formal methods for auction verification in the familiar
setting of a well-known auction.
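The mechanism whose soundness the paper verifies, the combinatorial Vickrey (VCG) auction, can be sketched on a toy instance: brute-force winner determination plus the classic Vickrey payment rule (each winner pays the harm it imposes on the others). This is only an executable illustration under invented bids; the paper's contribution is formally verified code generated from a mechanized proof, not this sketch.

```python
# Hedged sketch of a combinatorial Vickrey (VCG) auction on a tiny instance.
def best_allocation(items, bids, exclude=None):
    """Maximize total value over assignments of items to bidders (brute force).

    bids[bidder] maps frozenset-of-items -> value (missing bundles count as 0).
    """
    bidders = [b for b in bids if b != exclude]
    best_val, best_assign = 0.0, {}
    def assign(rest, current):
        nonlocal best_val, best_assign
        if not rest:
            val = sum(bids[b].get(frozenset(s), 0.0) for b, s in current.items())
            if val > best_val:
                best_val, best_assign = val, {b: set(s) for b, s in current.items()}
            return
        item, rest2 = rest[0], rest[1:]
        for b in bidders + [None]:          # assign item to a bidder, or to nobody
            if b is not None:
                current.setdefault(b, set()).add(item)
            assign(rest2, current)
            if b is not None:
                current[b].discard(item)
    assign(list(items), {})
    return best_val, best_assign

def vcg(items, bids):
    """Unique outcome: efficient allocation plus Vickrey (externality) payments."""
    total, alloc = best_allocation(items, bids)
    payments = {}
    for b in bids:
        others_without_b, _ = best_allocation(items, bids, exclude=b)
        others_with_b = total - bids[b].get(frozenset(alloc.get(b, set())), 0.0)
        payments[b] = others_without_b - others_with_b
    return alloc, payments

items = {"x", "y"}
bids = {
    "A": {frozenset({"x", "y"}): 10.0},
    "B": {frozenset({"x"}): 6.0},
    "C": {frozenset({"y"}): 7.0},
}
alloc, pay = vcg(items, bids)
print(alloc, pay)  # B wins x, C wins y; each pays its externality
```

On this instance B and C together outbid A (6 + 7 > 10), and the payments (B pays 3, C pays 4, A pays 0) are exactly the allocation-and-transfers outcome whose uniqueness the verified specification guarantees.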