Holistic debugging - enabling instruction set simulation for software quality assurance
We present holistic debugging, a novel method for observing execution of complex and distributed software. It builds on an instruction set simulator, which provides reproducible experiments and non-intrusive probing of state in a distributed system. Instruction set simulators, however, only provide low-level information, so a holistic debugger contains a translation framework that maps this information to higher-abstraction-level observation tools, such as source code debuggers. We have created Nornir, a proof-of-concept holistic debugger, built on the simulator Simics. For each observed process in the simulated system, Nornir creates an abstraction translation stack, with virtual machine translators that map machine-level storage contents (e.g. physical memory, registers) provided by Simics, to application-level data (e.g. virtual memory contents) by parsing the data structures of operating systems and virtual machines. Nornir includes a modified version of the GNU debugger (GDB), which supports non-intrusive symbolic debugging of distributed applications. Nornir's main interface is the debugger shepherd, a programmable component that controls multiple debuggers and allows users to coherently inspect the entire state of heterogeneous, distributed applications. It provides a robust observation platform for the construction of new observation tools.
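The layered translation the abstract describes can be pictured as a small sketch. All class and method names below are illustrative inventions, not Nornir's or Simics's real API: each translator maps lower-level state to a higher abstraction, reading simulator state without perturbing the simulated system.

```python
# Hypothetical sketch of an abstraction translation stack in the spirit of
# Nornir; names and structure are assumptions for illustration only.

class PhysicalMemory:
    """Lowest level: raw bytes, as a simulator like Simics would expose them."""
    def __init__(self, contents: dict[int, bytes]):
        self.contents = contents

    def read(self, phys_addr: int) -> bytes:
        return self.contents[phys_addr]

class VirtualMemoryTranslator:
    """Maps a process's virtual addresses to physical ones via a page table
    that would, in a real system, be parsed from OS data structures."""
    def __init__(self, phys: PhysicalMemory, page_table: dict[int, int]):
        self.phys = phys
        self.page_table = page_table  # virtual page -> physical page

    def read(self, virt_addr: int, page_size: int = 4096) -> bytes:
        page, offset = divmod(virt_addr, page_size)
        return self.phys.read(self.page_table[page] * page_size + offset)

class SymbolTranslator:
    """Top level: resolves source-level variable names to virtual addresses,
    as a debugger's symbol table would."""
    def __init__(self, vmem: VirtualMemoryTranslator, symbols: dict[str, int]):
        self.vmem = vmem
        self.symbols = symbols

    def read_var(self, name: str) -> bytes:
        return self.vmem.read(self.symbols[name])

# Non-intrusive observation: the stack only reads simulator state.
phys = PhysicalMemory({0x2000: b"\x2a"})
vmem = VirtualMemoryTranslator(phys, page_table={0x1: 0x2})  # vpage 1 -> ppage 2
dbg = SymbolTranslator(vmem, symbols={"answer": 0x1000})
assert dbg.read_var("answer") == b"\x2a"
```

Each layer knows only the layer directly below it, which is what lets a source-level debugger sit on top of raw simulator state.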
Learning Action Models: Qualitative Approach
In dynamic epistemic logic, actions are described using action models. In
this paper we introduce a framework for studying learnability of action models
from observations. We present first results concerning propositional action
models. First we check two basic learnability criteria: finite identifiability
(conclusively inferring the appropriate action model in finite time) and
identifiability in the limit (inconclusive convergence to the right action
model). We show that deterministic actions are finitely identifiable, while
non-deterministic actions require more learning power; they are identifiable in
the limit. We then move on to a particular learning method, which proceeds via
restriction of a space of events within a learning-specific action model. This
way of learning closely resembles the well-known update method from dynamic
epistemic logic. We introduce several different learning methods suited for
finite identifiability of particular types of deterministic actions.
Comment: 18 pages, accepted for LORI-V: The Fifth International Conference on
Logic, Rationality and Interaction, October 28-31, 2015, National Taiwan
University, Taipei, Taiwan
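Finite identifiability by elimination can be illustrated with a toy sketch (my own simplification, not the paper's formal framework): model a deterministic action as a function on a finite set of states, keep every hypothesis consistent with the observations so far, and announce conclusively once a single hypothesis survives.

```python
# Toy elimination learner for deterministic actions on a finite state set.
# This is an illustrative simplification, not the paper's formal setting.
from itertools import product

STATES = ("p", "q")  # two toy states

# Hypothesis space: all deterministic actions, i.e. all functions STATES -> STATES.
HYPOTHESES = [dict(zip(STATES, outs)) for outs in product(STATES, repeat=len(STATES))]

def eliminate(hypotheses, observation):
    """Drop every action model inconsistent with one (pre, post) observation."""
    pre, post = observation
    return [h for h in hypotheses if h[pre] == post]

def learn(stream):
    """Finite identification: announce as soon as exactly one hypothesis survives.
    For deterministic actions on a finite state set, the survivor must be the
    true model (assuming it lies in the hypothesis space)."""
    live = list(HYPOTHESES)
    for obs in stream:
        live = eliminate(live, obs)
        if len(live) == 1:
            return live[0]   # conclusive, in finite time
    return None              # stream ended before identification

# The true action swaps p and q; one observation per state suffices.
assert learn([("p", "q"), ("q", "p")]) == {"p": "q", "q": "p"}
```

With non-deterministic actions no finite observation set can rule out every alternative in this way, which is intuitively why the abstract reports only identifiability in the limit for them.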
Observability of Dark Matter Substructure with Pulsar Timing Correlations
Dark matter substructure on small scales is currently weakly constrained, and
its study may shed light on the nature of the dark matter. In this work we
study the gravitational effects of dark matter substructure on measured pulsar
phases in pulsar timing arrays (PTAs). Due to the stability of pulse phases
observed over several years, dark matter substructure around the Earth-pulsar
system can imprint discernible signatures in gravitational Doppler and Shapiro
delays. We compute pulsar phase correlations induced by general dark matter
substructure, and project constraints for a few models such as monochromatic
primordial black holes (PBHs), and Cold Dark Matter (CDM)-like NFW subhalos.
This work extends our previous analysis, which focused on static or single
transiting events, to a stochastic analysis of multiple transiting events. We
find that stochastic correlations, in a PTA similar to the Square Kilometer
Array (SKA), are uniquely powerful to constrain subhalos as light as , with concentrations as low as that predicted by standard
CDM.
Comment: 45 pages, 12 figures
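For orientation, the two effects named in the abstract have standard textbook forms (the conventions below are the usual general-relativistic ones and may differ from the paper's exact definitions):

```latex
% Doppler: a transiting subhalo accelerates the pulsar (or Earth), shifting
% the observed pulse frequency; the timing residual is the twice-integrated
% line-of-sight acceleration, with $\hat d$ the direction to the pulsar.
\delta t_{\rm D}(t) = \frac{\hat d}{c} \cdot \int_0^{t} dt_1 \int_0^{t_1} dt_2\, \vec a(t_2)

% Shapiro: the pulse accrues extra travel time through the subhalo's
% gravitational potential $\Phi$ along the photon path.
\delta t_{\rm S} = -\frac{2}{c^3} \int \Phi\big(\vec x(\ell)\big)\, d\ell
```

Both residuals grow with the subhalo mass and with proximity of the transit to the line of sight, which is why stacking many transits stochastically improves the reach.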
On the Observational Equivalence of Continuous-Time Deterministic and Indeterministic Descriptions
This paper presents and philosophically assesses three types of results on the observational equivalence of continuous-time measure-theoretic deterministic and indeterministic descriptions. The first results establish observational equivalence to abstract mathematical descriptions. The second results are stronger because they show observational equivalence between deterministic and indeterministic descriptions found in science. Here I also discuss Kolmogorov's contribution. For the third results I introduce two new meanings of 'observational equivalence at every observation level'. Then I show the even stronger result of observational equivalence at every (and not just some) observation level between deterministic and indeterministic descriptions found in science. These results imply the following. Suppose one wants to find out whether a phenomenon is best modeled as deterministic or indeterministic. Then one cannot appeal to differences in the probability distributions of deterministic and indeterministic descriptions found in science to argue that one of the descriptions is preferable, because there is no such difference. Finally, I criticise the extant claims of philosophers and mathematicians on observational equivalence.
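A classic concrete instance behind such equivalence results (my own illustration, not taken from the paper) is the deterministic doubling map T(x) = 2x mod 1: observed through the coarse-grained observable f(x) = floor(2x), a typical orbit produces a record statistically indistinguishable from a fair coin flip sequence, i.e. from a Bernoulli(1/2) stochastic process.

```python
# Deterministic dynamics whose coarse-grained observations look like an
# i.i.d. stochastic process. Exact rational arithmetic avoids float decay.
from math import isqrt
from fractions import Fraction

def orbit_observations(x0: Fraction, n: int) -> list[int]:
    """Iterate the doubling map exactly and record f(x) = floor(2x)."""
    x, out = x0, []
    for _ in range(n):
        out.append(int(2 * x))   # observation: which half of [0,1) we are in
        x = (2 * x) % 1          # deterministic update
    return out

# A "typical"-looking start: sqrt(2) - 1 truncated to N binary digits.
N = 2000
seed = Fraction(isqrt(2 * 4**N) - 2**N, 2**N)
obs = orbit_observations(seed, N - 1)

freq_ones = sum(obs) / len(obs)
assert 0.4 < freq_ones < 0.6   # the record looks like fair coin flips
```

No finite record of such observations can distinguish the deterministic map from the Bernoulli process, which is the flavor of equivalence the abstract's stronger results extend to descriptions found in science.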
Simulation of Quantum Computation: A deterministic event-based approach
We demonstrate that locally connected networks of machines that have
primitive learning capabilities can be used to perform a deterministic,
event-based simulation of quantum computation. We present simulation results
for basic quantum operations such as the Hadamard and the controlled-NOT gate,
and for seven-qubit quantum networks that implement Shor's number factoring
algorithm.
Comment: J. Comp. Theor. Nanoscience (in press); http://www.compphys.net/dl
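For reference, the two gates the abstract names have the following conventional state-vector simulation. Note this is the standard linear-algebra approach, shown only as a baseline; it is explicitly not the paper's deterministic event-based method, which avoids state vectors altogether.

```python
# Conventional state-vector simulation of the Hadamard and controlled-NOT
# gates (a baseline for comparison, NOT the paper's event-based approach).
from math import sqrt

def apply_hadamard(state, qubit, n):
    """Apply H to `qubit` (0 = least significant) of an n-qubit state vector."""
    h = 1 / sqrt(2)
    out = list(state)
    for i in range(2 ** n):
        if not (i >> qubit) & 1:        # pair basis states differing in `qubit`
            j = i | (1 << qubit)
            a, b = state[i], state[j]
            out[i] = h * (a + b)
            out[j] = h * (a - b)
    return out

def apply_cnot(state, control, target, n):
    """Swap amplitudes so `target` flips wherever `control` is 1."""
    out = list(state)
    for i in range(2 ** n):
        if (i >> control) & 1:
            out[i] = state[i ^ (1 << target)]
    return out

# |00> --H on qubit 0--> --CNOT(0 -> 1)--> Bell state (|00> + |11>)/sqrt(2)
state = [1.0, 0.0, 0.0, 0.0]
state = apply_hadamard(state, qubit=0, n=2)
state = apply_cnot(state, control=0, target=1, n=2)
assert abs(state[0] - 1 / sqrt(2)) < 1e-12
assert abs(state[3] - 1 / sqrt(2)) < 1e-12
```

The contrast is the point of the paper: this baseline manipulates an exponentially large amplitude vector, whereas the event-based approach processes one event at a time through locally connected learning machines.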
The Power of Non-Determinism in Higher-Order Implicit Complexity
We investigate the power of non-determinism in purely functional programming
languages with higher-order types. Specifically, we consider cons-free programs
of varying data orders, equipped with explicit non-deterministic choice.
Cons-freeness roughly means that data constructors cannot occur in function
bodies and all manipulation of storage space thus has to happen indirectly
using the call stack.
While cons-free programs have previously been used by several authors to
characterise complexity classes, the work on non-deterministic programs has
almost exclusively considered programs of data order 0. Previous work has shown
that adding explicit non-determinism to cons-free programs taking data of order
0 does not increase expressivity; we prove that this, dramatically, is not
the case for higher data orders: adding non-determinism to programs with data
order at least 1 allows for a characterisation of the entire class of
elementary-time decidable sets.
Finally we show how, even with non-deterministic choice, the original
hierarchy of characterisations is restored by imposing different restrictions.
Comment: pre-edition version of a paper accepted for publication at ESOP'17
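The cons-free discipline can be made concrete with an informal sketch (my own, in Python rather than the paper's higher-order functional setting): input data is a linked list of nested pairs that the program may only destruct, never rebuild, so all working storage lives on the call stack; explicit non-deterministic choice is then run angelically by trying both branches.

```python
# Illustrative sketch of cons-freeness and angelic non-determinism.
# Pair construction happens only OUTSIDE the cons-free functions.

def to_linked(xs):
    """Build the input as nested pairs (inputs may be constructed)."""
    node = None
    for x in reversed(xs):
        node = (x, node)
    return node

def member(x, xs):
    """Cons-free: only follows pointers into the input, allocates no pairs."""
    while xs is not None:
        head, tail = xs          # destructuring, not construction
        if head == x:
            return True
        xs = tail
    return False

def some_suffix(pred, xs):
    """Explicit non-deterministic choice, resolved angelically: at each node,
    'guess' whether the interesting suffix starts here or further on."""
    if pred(xs):
        return True                      # choice 1: accept here
    if xs is None:
        return False
    return some_suffix(pred, xs[1])      # choice 2: move along the input

data = to_linked([3, 1, 4, 1, 5])
assert member(4, data) and not member(9, data)
# Guess a suffix starting with 1 that has 5 somewhere after it, then verify.
assert some_suffix(lambda s: s is not None and s[0] == 1 and member(5, s[1]), data)
```

Here the data is order 0 (flat lists); the abstract's point is that once functions themselves become data (order 1 and up), such guess-and-verify power jumps to all elementary-time decidable sets.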