Decidability of the Monadic Shallow Linear First-Order Fragment with Straight Dismatching Constraints
The monadic shallow linear Horn fragment is well known to be decidable and
has many applications, e.g., in security protocol analysis, tree automata, and
abstraction refinement. It was a long-standing open problem how to extend the
fragment to the non-Horn case while preserving decidability, which would, e.g.,
make it possible to express non-determinism in protocols. We prove decidability of the
non-Horn monadic shallow linear fragment via ordered resolution further
extended with dismatching constraints and discuss some applications of the new
decidable fragment.
Comment: 29 pages, long version of CADE-26 paper
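To make the fragment concrete, the sketch below shows one Horn and one non-Horn clause of the shape the fragment admits: all predicates are unary, non-variable terms occur only at depth one, and no variable repeats inside such a term. The predicate and function names are chosen by us for illustration and are not taken from the paper.

```latex
% Illustrative clause shapes only (names are ours, not the paper's).
% Horn: an intruder can form an encryption if it knows both the key and the payload;
% the head term enc(x, y) is shallow (depth one) and linear (no repeated variables).
\[
  K(\mathit{enc}(x, y)) \;\leftarrow\; K(x) \wedge K(y)
\]
% Non-Horn: a non-deterministic protocol step, where a message x in state R
% may move to state A or to state B; several positive monadic literals are allowed.
\[
  A(x) \vee B(x) \;\leftarrow\; R(x)
\]
```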
Complexity of Equivalence and Learning for Multiplicity Tree Automata
We consider the complexity of equivalence and learning for multiplicity tree
automata, i.e., weighted tree automata over a field. We first show that the
equivalence problem is logspace equivalent to polynomial identity testing, the
complexity of which is a longstanding open problem. Secondly, we derive lower
bounds on the number of queries needed to learn multiplicity tree automata in
Angluin's exact learning model, over both arbitrary and fixed fields.
Habrard and Oncina (2006) give an exact learning algorithm for multiplicity
tree automata, in which the number of queries is proportional to the size of
the target automaton and the size of a largest counterexample, represented as a
tree, that is returned by the Teacher. However, the smallest
tree-counterexample may be exponential in the size of the target automaton.
Thus the above algorithm does not run in time polynomial in the size of the
target automaton, and has query complexity exponential in the lower bound.
Assuming a Teacher that returns minimal DAG representations of
counterexamples, we give a new exact learning algorithm whose query complexity
is quadratic in the target automaton size, almost matching the lower bound, and
improving on the best previously known algorithm by an exponential factor.
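For readers unfamiliar with the model, the following minimal Python sketch shows how a multiplicity tree automaton over a field assigns a value to a tree: each symbol of rank k carries a weight tensor that combines the children's state vectors into a new state vector, and a final weight vector turns the root's vector into a field element. The dictionary representation and the toy automaton are illustrative, not the paper's notation.

```python
from itertools import product

def evaluate(tree, mu, dim):
    """Bottom-up state vector of a tree under a multiplicity tree automaton.

    tree: (symbol, [children]); mu[symbol] maps a tuple of state indices
    (i_1, ..., i_k, j) to a field weight combining k child states into state j.
    """
    sym, children = tree
    child_vecs = [evaluate(c, mu, dim) for c in children]
    vec = [0.0] * dim
    for idx in product(range(dim), repeat=len(children) + 1):
        w = mu.get(sym, {}).get(idx, 0.0)
        for child_vec, i in zip(child_vecs, idx[:-1]):
            w *= child_vec[i]
        vec[idx[-1]] += w
    return vec

# Toy one-state automaton over the rationals: each leaf 'a' contributes weight 2,
# and the binary symbol 'f' multiplies the values of its two children.
mu = {'a': {(0,): 2.0}, 'f': {(0, 0, 0): 1.0}}
lam = [1.0]                                   # final weight vector
t = ('f', [('a', []), ('a', [])])
print(sum(l * v for l, v in zip(lam, evaluate(t, mu, 1))))   # 4.0
```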
Spreading the game: An experimental study on the link between children's overimitation and their adoption, transmission, and modification of conventional information
Overimitation is hypothesized to foster the spread of conventional information within populations. The current study tested this claim by assigning 5-year-old children (N = 64) to one of two study populations based on their overimitation (overimitators [OIs] vs. non-overimitators [non-OIs]). Children were presented with conventional information in the form of novel games lacking instrumental outcomes, and we observed children's adoption, transmission, and modification of this information across two study phases. Results reveal little variation across study populations in the number of game elements that were adopted and transmitted. However, OIs were more likely to use normative language than non-OIs when transmitting game information to their peers. Furthermore, non-OIs modified the games more frequently in the initial study phase, suggesting an inverse relationship between children's overimitation and their tendency to modify conventional information. These findings indicate subtle yet coherent links between children's overimitation and their tendency to transmit and modify conventional information.
Synthesis for Polynomial Lasso Programs
We present a method for the synthesis of polynomial lasso programs. These
programs consist of a program stem, a set of transitions, and an exit
condition, all in the form of algebraic assertions (conjunctions of polynomial
equalities). Central to this approach is the discovery of non-linear
(algebraic) loop invariants. We extend Sankaranarayanan, Sipma, and Manna's
template-based approach and prove a completeness criterion. We perform program
synthesis by generating a constraint whose solution is a synthesized program
together with a loop invariant that proves the program's correctness. This
constraint is non-linear and is passed to an SMT solver. Moreover, we can
enforce the termination of the synthesized program with the support of test
cases.
Comment: Paper at VMCAI'14, including appendix
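To give a flavour of template-based constraint generation of this kind, the sketch below asks an SMT solver for a template coefficient that makes a candidate algebraic invariant hold after the stem and remain inductive under the loop transition. The lasso program, the template, and the use of Z3 are our own illustration of the general technique, not the paper's synthesis procedure.

```python
from z3 import Real, Reals, Solver, ForAll, Implies, sat

# Hypothetical lasso program (illustrative, not from the paper):
#   stem:       x := 1; y := 2
#   transition: x := x + 1; y := y + 2
# Template for an algebraic loop invariant: y - c*x == 0, with unknown coefficient c.
c = Real('c')
x, y = Reals('x y')

s = Solver()
# (1) The invariant must hold after the stem (x = 1, y = 2).
s.add(2 - c * 1 == 0)
# (2) The invariant must be inductive under the loop transition.
s.add(ForAll([x, y], Implies(y - c * x == 0,
                             (y + 2) - c * (x + 1) == 0)))

if s.check() == sat:
    print("invariant: y -", s.model()[c], "* x == 0")   # expects c = 2
```

The existential choice of c wrapped around a universally quantified inductiveness condition is what makes the generated constraint non-linear, which is why such constraints are handed to an SMT solver rather than a plain LP solver.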
CHRONO: a parallel multi-physics library for rigid-body, flexible-body, and fluid dynamics
The last decade witnessed a manifest shift in the microprocessor industry towards chip designs that promote parallel computing. Until recently the privilege of a select group of large research centers, teraflop computing is becoming a commodity owing to inexpensive GPU cards and multi- to many-core x86 processors. This paradigm shift towards large-scale parallel computing has been leveraged in CHRONO, a freely available C++ multi-physics simulation package. CHRONO is made up of a collection of loosely coupled components that facilitate different aspects of multi-physics modeling, simulation, and visualization. This contribution provides an overview of CHRONO::Engine, CHRONO::Flex, CHRONO::Fluid, and CHRONO::Render, which are modules that can capitalize on the processing power of hundreds of parallel processors. Problems that can be tackled in CHRONO include but are not limited to granular material dynamics, tangled large flexible structures with self-contact, particulate flows, and tracked vehicle mobility. The paper presents an overview of each of these modules and illustrates through several examples the potential of this multi-physics library.
Absorption and photoluminescence spectroscopy on a single self-assembled charge-tunable quantum dot
We have performed detailed photoluminescence (PL) and absorption spectroscopy
on the same single self-assembled quantum dot in a charge-tunable device. The
transition from neutral to charged exciton in the PL occurs at a more negative
voltage than the corresponding transition in absorption. We have developed a
model of the Coulomb blockade to account for this observation. At large
negative bias, the absorption broadens as a result of electron and hole
tunneling. We observe resonant features in this regime whenever the quantum dot
hole level is resonant with two-dimensional hole states located at the capping
layer-blocking barrier interface in our structure.
Comment: 6 pages, 6 figures
Polyhedral Analysis using Parametric Objectives
The abstract domain of polyhedra lies at the heart of many program analysis techniques. However, its operations can be expensive, precluding their application to polyhedra that involve many variables. This paper describes a new approach to computing polyhedral domain operations. The core of this approach is an algorithm to calculate variable elimination (projection) based on parametric linear programming. The algorithm enumerates only the non-redundant inequalities of the projection space, and hence permits anytime approximation of the output.
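As a point of comparison for the redundancy issue mentioned above, the standard way to detect a redundant inequality is to solve one linear program per constraint, as in the sketch below. This is plain LP redundancy testing with scipy, not the paper's parametric-objective projection algorithm, and the example polyhedron is made up.

```python
import numpy as np
from scipy.optimize import linprog

def is_redundant(A, b, i):
    """Is the i-th inequality of A x <= b implied by the remaining ones?

    Maximise a_i . x subject to the other constraints; if the optimum is
    at most b_i, dropping the i-th row does not change the polyhedron.
    """
    keep = np.arange(len(b)) != i
    res = linprog(-A[i], A_ub=A[keep], b_ub=b[keep],
                  bounds=[(None, None)] * A.shape[1], method="highs")
    return res.status == 0 and -res.fun <= b[i] + 1e-9

# Example: x <= 1, y <= 1, x + y <= 3 -- the last inequality is redundant.
A = np.array([[1.0, 0.0], [0.0, 1.0], [1.0, 1.0]])
b = np.array([1.0, 1.0, 3.0])
print([is_redundant(A, b, i) for i in range(3)])   # [False, False, True]
```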
Seesaw Neutrino Masses with Large Mixings from Dimensional Deconstruction
We demonstrate a dynamical origin for the dimension-five seesaw operator in
dimensional deconstruction models. Light neutrino masses arise from the seesaw
scale which corresponds to the inverse lattice spacing. It is shown that the
deconstructing limit naturally prefers maximal leptonic mixing. Higher-order
corrections which are allowed by gauge invariance can transform the bi-maximal
into a bi-large mixing. These terms may appear to be non-renormalizable at
scales smaller than the deconstruction scale.
Comment: Revised version published in PR
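For context, the dimension-five seesaw operator referred to here is the standard Weinberg operator, shown schematically below together with the generic light neutrino mass scale it induces; this is textbook material rather than a result specific to the paper, which identifies the suppression scale with the inverse lattice spacing of the deconstruction.

```latex
% Schematic dimension-five (Weinberg) operator and the resulting neutrino mass scale.
\[
  \mathcal{O}_5 \;=\; \frac{c_{ij}}{\Lambda}\,(L_i H)(L_j H) + \text{h.c.}
  \qquad\Longrightarrow\qquad
  m_\nu \;\sim\; c\,\frac{v^2}{\Lambda},
\]
% where v is the electroweak vacuum expectation value and Lambda is the seesaw scale.
```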
Tree defence and bark beetles in a drying world: carbon partitioning, functioning and modelling
Drought has promoted large-scale, insect-induced tree mortality in recent years, with severe consequences for ecosystem function, atmospheric processes, sustainable resources and global biogeochemical cycles. However, the physiological linkages among drought, tree defences, and insect outbreaks are still uncertain, hindering our ability to accurately predict tree mortality under ongoing climate change. Here we propose an interdisciplinary research agenda for addressing these crucial knowledge gaps. Our framework includes field manipulations, laboratory experiments, and modelling of insect and vegetation dynamics, and focuses on how drought affects interactions between conifer trees and bark beetles. We build upon existing theory and examine several key assumptions: 1) there is a trade-off in tree carbon investment between primary and secondary metabolites (e.g. growth vs. defence); 2) secondary metabolites are one of the main components of tree defence against bark beetles and associated microbes; and 3) implementing conifer-bark beetle interactions in current models improves predictions of forest disturbance in a changing climate. Our framework provides guidance for addressing a major shortcoming in current implementations of large-scale vegetation models, the under-representation of insect-induced tree mortality.