2,840 research outputs found
The extended analog computer and functions computable in a digital sense
In this paper we compare the computational power of the Extended Analog Computer (EAC) with that of the partial recursive functions. We first survey parts of computability theory over discrete and real spaces. In the last section we show that the EAC can generate any partial recursive function defined over N. Moreover, we conclude that the classical halting problem for partial recursive functions is equivalent to testing, by means of the EAC, whether sets are empty.
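The partial recursive functions mentioned in this abstract are built from basic functions by composition, primitive recursion, and unbounded minimization (the mu-operator); the last of these is the source of partiality, and hence of the halting problem. A minimal sketch in Python (the helper names `mu` and `isqrt` are ours, for illustration only):

```python
def mu(predicate):
    """Unbounded minimization: return the least natural number n with
    predicate(n) true. Loops forever when no such n exists -- this is
    exactly where partial recursive functions become partial."""
    n = 0
    while not predicate(n):
        n += 1
    return n

# Example: integer square root defined by minimization,
#   isqrt(x) = mu n . (n + 1)**2 > x
def isqrt(x):
    return mu(lambda n: (n + 1) ** 2 > x)
```

Deciding whether such a search ever terminates is the halting problem, which the abstract relates to emptiness testing by the EAC.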
A Survey on Continuous Time Computations
We provide an overview of theories of continuous time computation. These
theories allow us to understand both the hardness of questions related to
continuous time dynamical systems and the computational power of continuous
time analog models. We survey the existing models, summarize the main results, and
point to relevant references in the literature.
On the possible Computational Power of the Human Mind
The aim of this paper is to address the question: Can an artificial neural
network (ANN) model be used as a possible characterization of the power of the
human mind? We will discuss what might be the relationship between such a model
and its natural counterpart. A possible characterization of the different power
capabilities of the mind is suggested in terms of the information contained (in
its computational complexity) or achievable by it. Such characterization takes
advantage of recent results based on natural neural networks (NNN) and the
computational power of arbitrary artificial neural networks (ANN). The possible
acceptance of neural networks as the model of the human mind's operation makes
the question above quite relevant.
Comment: Complexity, Science and Society Conference, 2005, University of Liverpool, UK. 23 pages.
Discontinuities in recurrent neural networks
This paper studies the computational power of various discontinuous
real computational models that are based on the classical analog
recurrent neural network (ARNN). The ARNN consists of a finite number
of neurons; each neuron computes a polynomial net function and a
sigmoid-like continuous activation function.
The authors introduce …
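To make the model just described concrete, a single synchronous update of a classical ARNN in this style can be sketched as follows (the saturated-linear activation and the specific weights are our illustrative assumptions, not taken from the paper):

```python
def sigma(x):
    """Sigmoid-like continuous activation: saturated-linear,
    sigma(x) = min(1, max(0, x)), so each state stays in [0, 1]."""
    return min(1.0, max(0.0, x))

def arnn_step(state, inp, W, U, b):
    """One update of an analog recurrent neural network whose net
    function is affine (a degree-1 polynomial) in state and input."""
    n = len(state)
    return [sigma(sum(W[i][j] * state[j] for j in range(n))
                  + sum(U[i][k] * inp[k] for k in range(len(inp)))
                  + b[i])
            for i in range(n)]

# Tiny example: 2 neurons, 1 binary input line, hand-picked weights.
W = [[0.5, 0.25], [0.0, 0.5]]
U = [[1.0], [0.5]]
b = [0.0, 0.1]

x = [0.0, 0.0]
for bit in [1.0, 0.0, 1.0]:
    x = arnn_step(x, [bit], W, U, b)
# Final state: [1.0, 0.8]
```

Discontinuous variants, as studied in the paper, replace the continuous activation with one that has jumps.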
Polynomial Time corresponds to Solutions of Polynomial Ordinary Differential Equations of Polynomial Length
We provide an implicit characterization of polynomial time computation in
terms of ordinary differential equations: we characterize the class
of languages computable in polynomial time in terms of
differential equations with polynomial right-hand side.
This result gives a purely continuous (in both time and space), elegant and simple
characterization of the class P. This is the first time such classes
are characterized using only ordinary differential equations. Our
characterization extends to functions computable in polynomial time over the
reals in the sense of computable analysis. This extends to deterministic
complexity classes above polynomial time.
This may provide a new perspective on classical complexity, by giving a way
to define complexity classes, like P, in a very simple
way, without any reference to a notion of (discrete) machine. This may also
provide ways to state classical questions about computational complexity via
ordinary differential equations, i.e. by using the framework of mathematical analysis.
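Schematically, the objects used in such characterizations are polynomial initial value problems; a rough sketch of their shape (our paraphrase, not the paper's exact statement) is:

```latex
% A polynomial initial value problem (PIVP): every component of the
% right-hand side p is a polynomial, so the dynamics are purely continuous.
y'(t) = p\bigl(y(t)\bigr), \qquad y(0) = q(x),
% where q encodes the input x. "Polynomial length" refers to the arc
% length of the solution curve,
%   \ell(T) = \int_0^T \lVert y'(t) \rVert \, dt,
% which must stay polynomial in the input size up to the point where the
% answer is read off a distinguished component of y.
```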
On Buffon Machines and Numbers
The well-known needle experiment of Buffon can be regarded as an analog (i.e.,
continuous) device that stochastically "computes" the number 2/pi ~ 0.63661,
which is the experiment's probability of success. Generalizing the experiment
and simplifying the computational framework, we consider probability
distributions that can be produced perfectly from a discrete source of
unbiased coin flips. We describe and analyse a few simple Buffon machines that
generate geometric, Poisson, and logarithmic-series distributions. We provide
human-accessible Buffon machines, which require a dozen coin flips or less, on
average, and produce experiments whose probabilities of success are expressible
in terms of numbers such as exp(-1), log 2, sqrt(3), cos(1/4), and zeta(5).
Generally, we develop a collection of constructions based on simple
probabilistic mechanisms that enable one to design Buffon experiments involving
compositions of exponentials and logarithms, polylogarithms, direct and inverse
trigonometric functions, algebraic and hypergeometric functions, as well as
functions defined by integrals, such as the Gaussian error function.
Comment: Largely revised version with references and figures added. 12 pages.
In ACM-SIAM Symposium on Discrete Algorithms (SODA'2011).
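The needle experiment described in this abstract is easy to simulate with a discrete source of randomness; a minimal Monte Carlo sketch (our illustration, assuming unit needle length and unit line spacing) estimates the success probability 2/pi:

```python
import math
import random

def needle_crosses(rng):
    """One drop of a unit-length needle on a floor ruled with parallel
    lines one unit apart.
    x: distance from the needle's center to the nearest line, uniform on [0, 1/2].
    theta: acute angle between needle and lines, uniform on [0, pi/2].
    The needle crosses a line iff x <= (1/2) * sin(theta)."""
    x = rng.uniform(0.0, 0.5)
    theta = rng.uniform(0.0, math.pi / 2)
    return x <= 0.5 * math.sin(theta)

def estimate_two_over_pi(n, seed=0):
    """Fraction of n independent drops that cross a line."""
    rng = random.Random(seed)
    hits = sum(needle_crosses(rng) for _ in range(n))
    return hits / n

# With enough drops the estimate approaches 2/pi ~ 0.63661.
```

The Buffon machines of the paper go further: they produce such success probabilities perfectly from unbiased coin flips alone, without real-valued randomness.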