Counter Machines and Distributed Automata: A Story about Exchanging Space and Time
We prove the equivalence of two classes of counter machines and one class of
distributed automata. Our counter machines operate on finite words, which they
read from left to right while incrementing or decrementing a fixed number of
counters. The two classes differ in the extra features they offer: one allows
counter values to be copied, whereas the other allows copyless sums of counters
to be computed. Our distributed automata, on the other hand, operate on directed path
graphs that represent words. All nodes of a path synchronously execute the same
finite-state machine, whose state diagram must be acyclic except for
self-loops, and each node receives as input the state of its direct
predecessor. These devices form a subclass of linear-time one-way cellular
automata.
Comment: 15 pages (+ 13 pages of appendices), 5 figures; to appear in the proceedings of AUTOMATA 2018
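The distributed model can be made concrete with a small sketch. The following toy automaton (all names hypothetical, not the paper's construction) runs the same finite-state machine synchronously at every node of a path representing a word; each node reads only the state of its direct predecessor, and the state diagram is acyclic except for self-loops (a node moves from 'unseen' to 'seen' and never back):

```python
# Toy distributed automaton on a directed path representing a word.
# Every node synchronously runs the same finite-state machine and reads
# the current state of its direct predecessor. This example checks, at
# the last node, whether the letter 'a' occurs anywhere in the word.

def run_distributed_automaton(word):
    # 'seen' propagates rightward once an 'a' has been read locally.
    # Acyclic except for self-loops: 'unseen' -> 'seen' is one-way.
    states = ['seen' if c == 'a' else 'unseen' for c in word]
    for _ in range(len(word)):          # linear time, as in the abstract
        new = states[:]
        for i in range(1, len(word)):
            if states[i - 1] == 'seen':  # input: predecessor's state
                new[i] = 'seen'
        states = new
    return states[-1] == 'seen'          # accept at the last node

run_distributed_automaton("bbab")        # the 'a' reaches the last node
```

Because information flows one node per step from predecessor to successor, this is exactly the subclass of linear-time one-way cellular automata the abstract mentions.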
Probabilistic Computability and Choice
We study the computational power of randomized computations on infinite
objects, such as real numbers. In particular, we introduce the concept of a Las
Vegas computable multi-valued function, which is a function that can be
computed on a probabilistic Turing machine that receives a random binary
sequence as auxiliary input. The machine can take advantage of this random
sequence, but it always has to produce a correct result or to stop the
computation after finite time if the random advice is not successful. With
positive probability the random advice has to be successful. We characterize
the class of Las Vegas computable functions in the Weihrauch lattice with the
help of probabilistic choice principles and Weak Weak K\H{o}nig's Lemma. Among
other things we prove an Independent Choice Theorem that implies that Las Vegas
computable functions are closed under composition. In a case study we show that
Nash equilibria are Las Vegas computable, while zeros of continuous functions
with sign changes cannot be computed on Las Vegas machines. However, we show
that the latter problem admits randomized algorithms with weaker failure
recognition mechanisms. The latter results can be interpreted as saying that
the Intermediate Value Theorem is reducible to the jump of Weak Weak
K\H{o}nig's Lemma, but not to Weak Weak K\H{o}nig's Lemma itself. These
examples also demonstrate that Las Vegas computable functions form a proper
superclass of the class of computable functions and a proper subclass of the
class of non-deterministically computable functions. We also study the impact
of specific lower bounds on the success probabilities, which leads to a strict
hierarchy of classes. In particular, the classical technique of probability
amplification fails for computations on infinite objects. We also investigate
the dependency on the underlying probability space.
Comment: Information and Computation (accepted for publication)
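A finite toy analogue (hypothetical, not the paper's formal model on infinite objects) conveys the Las Vegas discipline: the machine consumes a random binary advice sequence, and either returns a certified-correct answer or explicitly recognizes that the advice failed; it never returns a wrong answer, and success has positive probability:

```python
import random

# Toy Las Vegas computation: random binary advice drives the probes.
# Either a provably correct answer is returned, or the failure of the
# advice is recognized and the computation stops. A wrong answer is
# never produced.

def las_vegas_find_one(bits, advice, max_probes=4):
    """Probe positions of `bits` (length <= 4) for a 1, guided by advice.

    Returns an index i with bits[i] == 1, or None ("advice failed")."""
    for k in range(max_probes):
        # interpret two advice bits as a probe position
        i = 2 * advice[2 * k] + advice[2 * k + 1]
        if i < len(bits) and bits[i] == 1:
            return i       # certified correct: we inspected bits[i]
    return None            # failure is recognized, never misreported

rng = random.Random(0)
bits = [0, 1, 0, 1]
advice = [rng.randint(0, 1) for _ in range(8)]
result = las_vegas_find_one(bits, advice)
assert result is None or bits[result] == 1
```

The interesting part of the paper is that on infinite objects this discipline is strictly weaker than non-deterministic computation and strictly stronger than plain computability, and that probability amplification fails there.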
The quantum speed up as advanced knowledge of the solution
With reference to a search in a database of size N, Grover states: "What is
the reason that one would expect that a quantum mechanical scheme could
accomplish the search in O(square root of N) steps? It would be insightful to
have a simple two line argument for this without having to describe the details
of the search algorithm". The answer provided in this work is: "because any
quantum algorithm takes the time taken by a classical algorithm that knows in
advance 50% of the information that specifies the solution of the problem".
This empirical fact, unnoticed so far, holds for both quadratic and exponential
speed ups and is theoretically justified in three steps: (i) once the physical
representation is extended to the production of the problem on the part of the
oracle and to the final measurement of the computer register, quantum
computation is reduction on the solution of the problem under a relation
representing problem-solution interdependence, (ii) the speed up is explained
by a simple consideration of time symmetry: it is the gain of information about
the solution due to backdating, to before running the algorithm, a
time-symmetric part of the reduction on the solution; this advanced knowledge
of the solution reduces the size of the solution space to be explored by the
algorithm, (iii) if I is the information acquired by measuring the content of
the computer register at the end of the algorithm, the quantum algorithm takes
the time taken by a classical algorithm that knows in advance 50% of I, which
brings us to the initial statement.
Comment: 23 pages, to be published in IJT
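The quadratic case of the claim reduces to simple counting (illustrative arithmetic, not the paper's derivation): if the database has N = 2**n items, the solution is specified by n bits; a classical searcher told 50% of those bits in advance is left with a space of 2**(n/2) = sqrt(N) candidates, matching Grover's O(sqrt(N)) query count:

```python
import math

# Knowing half the bits of an n-bit solution shrinks the search space
# from N = 2**n down to 2**(n//2) = sqrt(N) candidates.

def remaining_search_space(n_bits, known_fraction=0.5):
    unknown = n_bits - int(n_bits * known_fraction)
    return 2 ** unknown

n = 20                       # database of N = 2**20 items
N = 2 ** n
assert remaining_search_space(n) == math.isqrt(N)   # sqrt(N) = 1024
```

The exponential speed ups discussed in the paper follow the same pattern: advance knowledge of 50% of the information specifying the solution accounts for the gap.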
Sleep Analytics and Online Selective Anomaly Detection
We introduce a new problem, the Online Selective Anomaly Detection (OSAD), to
model a specific scenario emerging from research in sleep science. Scientists
have segmented sleep into several stages and stage two is characterized by two
patterns (or anomalies) in the EEG time series recorded on sleep subjects.
These two patterns are sleep spindle (SS) and K-complex. The OSAD problem was
introduced to design a residual system, where all anomalies (known and unknown)
are detected but the system only triggers an alarm when non-SS anomalies
appear. The solution of the OSAD problem required us to combine techniques from
both machine learning and control theory. Experiments on data from real
subjects attest to the effectiveness of our approach.
Comment: Submitted to 20th ACM SIGKDD Conference on Knowledge Discovery and Data Mining 201
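The "selective" part of OSAD can be sketched in a few lines (all names hypothetical; the paper's actual solution combines machine learning with control theory, which this sketch does not reproduce): a residual detector flags every anomaly, known or unknown, but the alarm fires only for anomalies that do not match the known sleep-spindle (SS) template:

```python
# Minimal residual-based selective alarm: flag indices where the
# prediction residual is large AND the segment does not match the
# known SS template. Known SS anomalies are detected but suppressed.

def selective_alarm(signal, predicted, is_ss, threshold=3.0):
    """Return indices that trigger an alarm: large residual and not SS."""
    alarms = []
    for i, (x, x_hat) in enumerate(zip(signal, predicted)):
        residual = abs(x - x_hat)
        if residual > threshold and not is_ss(signal, i):
            alarms.append(i)
    return alarms

# toy usage: one SS-like anomaly (suppressed) and one unknown anomaly
signal    = [0, 0, 10, 0, 0, -9, 0]
predicted = [0, 0,  0, 0, 0,  0, 0]
ss_marks  = {2}            # pretend index 2 matches the SS template
alarms = selective_alarm(signal, predicted, lambda s, i: i in ss_marks)
```

Here both spikes exceed the threshold, but only the non-SS one at index 5 raises an alarm.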
Neural activity classification with machine learning models trained on interspike interval series data
The flow of information through the brain is reflected by the activity
patterns of neural cells. Indeed, these firing patterns are widely used as
input data to predictive models that relate stimuli and animal behavior to the
activity of a population of neurons. However, relatively little attention has been
paid to single neuron spike trains as predictors of cell or network properties
in the brain. In this work, we introduce an approach to neuronal spike train
data mining which enables effective classification and clustering of neuron
types and network activity states based on single-cell spiking patterns. This
approach is centered around applying state-of-the-art time series
classification/clustering methods to sequences of interspike intervals recorded
from single neurons. We demonstrate good performance of these methods in tasks
involving classification of neuron type (e.g. excitatory vs. inhibitory cells)
and/or neural circuit activity state (e.g. awake vs. REM sleep vs. nonREM sleep
states) on an open-access cortical spiking activity dataset.
The Parameterized Complexity of Domination-type Problems and Application to Linear Codes
We study the parameterized complexity of domination-type problems.
(sigma,rho)-domination is a general and unifying framework introduced by Telle:
a set D of vertices of a graph G is (sigma,rho)-dominating if for any v in D,
|N(v)\cap D| in sigma, and for any v \notin D, |N(v)\cap D| in rho. We mainly
show that for any sigma and rho the problem of (sigma,rho)-domination is in W[2]
when parameterized by the size of the dominating set. This general statement is
optimal in the sense that several particular instances of
(sigma,rho)-domination are W[2]-complete (e.g. Dominating Set). We also prove
that (sigma,rho)-domination is in W[2] for the dual parameterization, i.e. when
parameterized by the size of the dominated set. We extend this result to a
class of domination-type problems which do not fall into the
(sigma,rho)-domination framework, including Connected Dominating Set. We also
consider problems of coding theory which are related to domination-type
problems with parity constraints. In particular, we prove that the problem of
the minimal distance of a linear code over Fq is in W[2] for both standard and
dual parameterizations, and W[1]-hard for the dual parameterization.
To prove W[2]-membership of the domination-type problems we extend the
Turing-way to parameterized complexity by introducing a new kind of
non-deterministic Turing machine with the ability to perform `blind' transitions,
i.e. transitions which do not depend on the content of the tapes. We prove that
the corresponding problem Short Blind Multi-Tape Non-Deterministic Turing
Machine is W[2]-complete. We believe that this new machine can be used to prove
W[2]-membership of other problems, not necessarily related to domination.
Comment: 19 pages, 2 figures
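The defining condition of Telle's framework, as stated in the abstract, is directly checkable. The sketch below (graphs as plain adjacency dicts, an illustrative encoding) verifies that a set D is (sigma,rho)-dominating: |N(v) ∩ D| must lie in sigma for every v in D and in rho for every v outside D:

```python
# Check the (sigma, rho)-domination condition from the abstract:
# D is (sigma, rho)-dominating iff |N(v) ∩ D| is in sigma for v in D
# and in rho for v not in D.

def is_sigma_rho_dominating(adj, D, sigma, rho):
    D = set(D)
    for v, neighbours in adj.items():
        k = len(set(neighbours) & D)      # |N(v) ∩ D|
        if (v in D and k not in sigma) or (v not in D and k not in rho):
            return False
    return True

# Classical Dominating Set as an instance: sigma unconstrained,
# rho = {1, 2, ...}, i.e. every vertex outside D has a neighbour in D.
path = {0: [1], 1: [0, 2], 2: [1, 3], 3: [2]}
any_count = set(range(len(path) + 1))     # sigma: no constraint
positive = any_count - {0}                # rho: at least one neighbour in D
is_sigma_rho_dominating(path, {1, 3}, any_count, positive)
```

Varying sigma and rho recovers the other particular instances (perfect codes, total domination, and so on); the parameterized-complexity results of the paper concern exactly this family.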