Universal discrete-time reservoir computers with stochastic inputs and linear readouts using non-homogeneous state-affine systems
A new class of non-homogeneous state-affine systems is introduced for use in reservoir computing. Sufficient conditions are identified that guarantee, first, that the associated reservoir computers with linear readouts are causal, time-invariant, and satisfy the fading memory property and, second, that a subset of this class is universal in the category of fading memory filters with stochastic almost surely uniformly bounded inputs. This means that any discrete-time filter that satisfies the fading memory property with random inputs of that type can be uniformly approximated by elements in the non-homogeneous state-affine family.
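The state-affine architecture the abstract describes can be sketched in a few lines. This is a minimal illustrative example, assuming the degree-one case x_{t+1} = p(z_t) x_t + q(z_t) with a trained linear readout; the reservoir dimension, the spectral-norm bound used to enforce fading memory, and the toy moving-average target are all choices made here, not taken from the paper.

```python
import numpy as np

# Minimal sketch of a non-homogeneous state-affine system (SAS) reservoir:
#   x_{t+1} = p(z_t) @ x_t + q(z_t),   y_t = w @ x_t  (linear readout)
# with degree-one polynomials p(z) = A0 + z*A1 and q(z) = b0 + z*b1.

rng = np.random.default_rng(0)
n = 20                                   # reservoir dimension (illustrative)

A0 = rng.normal(size=(n, n))
A0 *= 0.4 / np.linalg.norm(A0, 2)        # spectral norm 0.4
A1 = rng.normal(size=(n, n))
A1 *= 0.4 / np.linalg.norm(A1, 2)        # so ||A0 + z*A1|| <= 0.8 < 1 for |z| <= 1,
b0, b1 = rng.normal(size=n), rng.normal(size=n)  # a contraction: fading memory

def run_sas(z):
    """Drive the SAS with input sequence z; return the state trajectory."""
    x = np.zeros(n)
    states = []
    for zt in z:
        x = (A0 + zt * A1) @ x + (b0 + zt * b1)
        states.append(x.copy())
    return np.array(states)

# Fit the linear readout by least squares on a causal 3-step moving average
z = rng.uniform(-1.0, 1.0, size=500)     # a.s. uniformly bounded inputs
X = run_sas(z)
target = (z + np.roll(z, 1) + np.roll(z, 2)) / 3.0
w, *_ = np.linalg.lstsq(X, target, rcond=None)
pred = X @ w
```

The norm bound on A0 and A1 is what keeps the state map a contraction for bounded inputs, which is the mechanism behind the causal, time-invariant, fading-memory behaviour the abstract refers to.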
Pioneers on the air: BBC radio broadcasts on computers and A.I., 1946-56
Between 1946 and 1956, a number of BBC radio broadcasts were made by pioneers in the fields of computing, artificial intelligence and cybernetics. Although no sound recordings of the broadcasts survive, transcripts are held at the BBC's Written Archives Centre at Caversham in the UK. This paper is based on a study of these transcripts, which have received little attention from historians.
The paper surveys the range of computer-related broadcasts during 1946–1956 and discusses some recurring themes from the broadcasts, especially the relationship of 'artificial intelligence' to human intelligence. Additionally, it discusses the context of the broadcasts, both in relation to the BBC and to contemporary awareness of computers.
Algorithms on ensemble quantum computers.
In ensemble (or bulk) quantum computation, all computations are performed on an ensemble of computers rather than on a single computer. Measurements of qubits in an individual computer cannot be performed; instead, only expectation values (over the complete ensemble of computers) can be measured. As a result of this limitation on the model of computation, many algorithms cannot be run directly on such computers and must be modified, as the common strategy of delaying the measurements usually does not resolve this ensemble-measurement problem. Here we present several new strategies for resolving this problem. Based on these strategies, we provide new versions of some of the most important quantum algorithms, versions that are suitable for implementation on ensemble quantum computers, e.g., on liquid-state NMR quantum computers. These algorithms are Shor's factorization algorithm, Grover's search algorithm (with several marked items), and an algorithm for quantum fault-tolerant computation. The first two algorithms are modified using a randomizing strategy and a sorting strategy. For the last algorithm, we develop a classical-quantum hybrid strategy for removing measurements. We use it to present a novel quantum fault-tolerant scheme. More explicitly, we present schemes for fault-tolerant, measurement-free implementation of the Toffoli gate and of σ_z^(1/4), as these operations cannot be implemented "bitwise", and their standard fault-tolerant implementations require measurements.
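The ensemble-measurement restriction the abstract builds on is easy to illustrate classically. In the toy simulation below, every computer in the ensemble holds the same one-qubit state, individual outcomes exist but are assumed inaccessible, and only the ensemble average of σ_z is observable; the state amplitudes and ensemble size are illustrative choices, not values from the paper.

```python
import numpy as np

# Toy illustration of ensemble measurement: individual outcomes are hidden,
# only the ensemble expectation value <sigma_z> is physically available.

rng = np.random.default_rng(1)

# State |psi> = a|0> + b|1>, with P(measure 0) = |a|^2
a, b = np.sqrt(0.8), np.sqrt(0.2)
p0 = abs(a) ** 2

n_computers = 100_000
# Each computer "measures" its qubit; these per-computer results are
# inaccessible in the ensemble model (True -> outcome |0>):
outcomes = rng.random(n_computers) < p0

# The only accessible quantity: the ensemble average of sigma_z,
# mapping outcome |0> to +1 and |1> to -1.
sigma_z_expect = np.mean(np.where(outcomes, 1.0, -1.0))

# Theory predicts <sigma_z> = |a|^2 - |b|^2 = 0.6
```

This is why, as the abstract notes, algorithms whose answer is read from a single measurement record (e.g. Grover with several marked items) must be randomized or otherwise restructured before the ensemble average becomes informative.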
Making a community network sustainable: the future of the wired high rise
Much time and money has been committed by governments, private business and the third sector over the last five years in establishing opportunities for underserved populations to gain access to new forms of information and communication technologies, in an effort to overcome the so-called ‘digital divide’.
This paper traces the efforts made to establish a networked community at a single high-rise public housing estate in inner Melbourne, Australia, and considers some of the potential opportunities for, and barriers to, ensuring the continuity of this large, complex, costly and potentially fragile network into the future.
Stationary Algorithmic Probability
Kolmogorov complexity and algorithmic probability are defined only up to an additive and a multiplicative constant, respectively, since their actual values depend on the choice of the universal reference computer. In this paper, we analyze a natural approach to eliminating this machine-dependence.
Our method is to assign algorithmic probabilities to the different computers
themselves, based on the idea that "unnatural" computers should be hard to
emulate. Therefore, we study the Markov process of universal computers randomly
emulating each other. The corresponding stationary distribution, if it existed,
would give a natural and machine-independent probability measure on the
computers, and also on the binary strings.
Unfortunately, we show that no stationary distribution exists on the set of
all computers; thus, this method cannot eliminate machine-dependence. Moreover,
we show that the reason for failure has a clear and interesting physical
interpretation, suggesting that every other conceivable attempt to get rid of
those additive constants must fail in principle, too.
However, we show that restricting to some subclass of computers might help to
get rid of some amount of machine-dependence in some situations, and the
resulting stationary computer and string probabilities have beautiful properties.
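The Markov-process construction in the abstract can be made concrete on a finite subclass, where a stationary distribution does exist. The emulation matrix below is a made-up three-computer example, not from the paper; it only illustrates what "a stationary probability measure on computers" means when the restriction succeeds.

```python
import numpy as np

# Toy finite analogue of the construction: M[i, j] is the probability that
# computer i emulates computer j in one step.  The entries are illustrative;
# the paper shows no such stationary distribution exists over ALL universal
# computers, but a restricted subclass can admit one.

M = np.array([
    [0.1, 0.6, 0.3],
    [0.5, 0.3, 0.2],
    [0.4, 0.4, 0.2],
])

# The stationary distribution is the left eigenvector of M for eigenvalue 1
# (equivalently, the Perron eigenvector of M transposed).
evals, evecs = np.linalg.eig(M.T)
pi = np.real(evecs[:, np.argmax(np.real(evals))])
pi /= pi.sum()                 # normalize to a probability vector

# pi[i] plays the role of a machine-independent "probability of computer i"
# within this restricted subclass; it satisfies pi @ M == pi.
```

Because every entry of M is positive, the chain is irreducible and aperiodic, so the stationary vector is unique and strictly positive, which is exactly the property that fails on the unrestricted set of all universal computers.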
Revisiting the thermodynamics of hardening plasticity for unsaturated soils
A thermodynamically consistent extension of the constitutive equations of saturated soils to unsaturated conditions is often worked out through the use of a single 'effective' interstitial pressure, accounting equivalently for the pressures of the saturating fluids acting separately on the internal solid walls of the pore network. The natural candidate for this effective interstitial pressure is the space-averaged interstitial pressure. In contrast, experimental observations have revealed that at least a pair of stress state variables is needed for a suitable framework to describe the stress-strain-strength behaviour of unsaturated soils. The thermodynamic analysis presented here shows that the most general approach to the behaviour of unsaturated soils actually requires three stress state variables: the suction, which is required to describe the invasion of the soil by the liquid water phase through the retention curve, and two effective stresses, which are required to describe the soil deformation at constant water saturation. However, a simple assumption on the plastic flow rule leads to the final need of only a Bishop-like effective stress to formulate the stress-strain constitutive equation describing the soil deformation, while the retention properties still involve the suction and possibly the deformation. Commonly accepted models for unsaturated soils, namely the Barcelona Basic Model and any approach based on the use of an effective averaged interstitial pressure, appear as special extreme cases of the thermodynamic formulation proposed here.
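The Bishop-like effective stress mentioned in the abstract has the standard form σ' = σ - u_a + χ (u_a - u_w), where s = u_a - u_w is the matric suction and χ is the effective-stress parameter. A minimal sketch, assuming only this standard expression (the function name and numerical values are illustrative, and nothing here reproduces the paper's own derivation):

```python
def bishop_effective_stress(sigma, u_a, u_w, chi):
    """Bishop-like effective stress (all pressures/stresses in the same units).

    sigma : total stress
    u_a   : pore-air pressure
    u_w   : pore-water pressure
    chi   : effective-stress parameter, chi = 1 saturated, chi = 0 dry
    """
    suction = u_a - u_w               # matric suction s = u_a - u_w
    return sigma - u_a + chi * suction

# Saturated limit (chi = 1) recovers Terzaghi's sigma - u_w:
print(bishop_effective_stress(100.0, 20.0, 5.0, 1.0))   # 95.0
# Dry limit (chi = 0) reduces to the net stress sigma - u_a:
print(bishop_effective_stress(100.0, 20.0, 5.0, 0.0))   # 80.0
```

The two limits show why a single Bishop-like stress interpolates between the saturated and dry descriptions, while, as the abstract stresses, the retention behaviour still needs the suction as a separate variable.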
School use of learning platforms and associated technologies - case study: primary school 1
Study of the benefits and effective use of learning platforms in schools, based on 12 case studies.