Memristors for the Curious Outsiders
We present both an overview and a perspective of recent experimental advances
and proposed new approaches to performing computation using memristors. A
memristor is a 2-terminal passive component with a dynamic resistance depending
on an internal parameter. We provide a brief historical introduction, as well as an overview of the physical mechanisms that lead to memristive behavior.
This review is meant to guide nonpractitioners in the field of memristive
circuits and their connection to machine learning and neural computation. Comment: Perspective paper for MDPI Technologies; 43 pages
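To make the definition above concrete, here is a minimal sketch of a generic memristive element: a 2-terminal device whose resistance R(x) depends on an internal state x that drifts with the driving current. The linear-drift law, the simulate helper, and all parameter values are illustrative assumptions for this sketch, not the specific device models reviewed in the paper.

```python
import numpy as np

# Minimal sketch of a generic memristive element:
#   v(t) = R(x) * i(t),   dx/dt = k * i(t)   (linear drift, illustrative)
# R interpolates between R_ON and R_OFF through the internal state x in [0, 1].
# All parameter values are illustrative, not taken from the paper.

R_ON, R_OFF = 100.0, 16e3   # limiting resistances (ohms)
K = 1e4                     # state drift coefficient (1/(A*s)), illustrative
DT = 1e-4                   # Euler time step (s)

def simulate(v_drive, t_end, x0=0.5):
    """Integrate the state under a voltage drive v_drive(t); return t, v, i."""
    ts = np.arange(0.0, t_end, DT)
    x = x0
    vs, currents = [], []
    for t in ts:
        v = v_drive(t)
        r = R_ON * x + R_OFF * (1.0 - x)        # state-dependent resistance
        i = v / r
        x = min(max(x + K * i * DT, 0.0), 1.0)  # Euler step, state clipped to [0, 1]
        vs.append(v)
        currents.append(i)
    return ts, np.array(vs), np.array(currents)

# A 10 Hz sinusoidal drive traces the pinched hysteresis loop in the i-v plane,
# the usual experimental fingerprint of memristive behavior.
ts, vs, currents = simulate(lambda t: np.sin(2 * np.pi * 10 * t), t_end=0.2)
```

Raising the drive frequency shrinks the hysteresis loop toward a straight line, another standard hallmark of memristive dynamics.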
Optical signatures of the superconducting Goldstone mode in granular aluminum: experiments and theory
Recent advances in the experimental growth and control of disordered thin
films, heterostructures, and interfaces provide a fertile ground for the
observation and characterisation of the collective superconducting excitations
emerging below the critical temperature after the breaking of the gauge symmetry. Here we combine
THz experiments in a nano-structured granular Al thin film and theoretical
calculations to demonstrate the existence of optically-active phase modes,
which represent the Goldstone excitations of the broken gauge symmetry. By
measuring the complex transmission through the sample we identify a sizeable and
temperature-dependent optical sub-gap absorption, which cannot be ascribed to
quasiparticle excitations. Quantitative modelling of this material as a
disordered Josephson array of nano-grains allows us to determine, with no free
parameters, the structure of the spatial inhomogeneities induced by shell
effects. Besides being responsible for the enhancement of the critical
temperature with respect to bulk Al, already observed in the past, this spatial
inhomogeneity provides a mechanism for the optical visibility of the Goldstone
mode. By computing explicitly the optical spectrum of the superconducting phase
fluctuations we obtain a good quantitative description of the experimental
data. Our results demonstrate that nanograin arrays are a promising setting to study and control the collective superconducting excitations via optical means.
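For orientation, a disordered Josephson array of the kind invoked above is conventionally described by a phase-only Hamiltonian of the textbook form below; this names the generic model class, not necessarily the exact Hamiltonian used in the paper:

\[
H = -\sum_{\langle i j \rangle} J_{ij} \cos\left(\theta_i - \theta_j\right),
\]

where \(\theta_i\) is the superconducting phase on grain \(i\) and the couplings \(J_{ij}\) carry the spatial inhomogeneity induced by shell effects. Small fluctuations of the \(\theta_i\) around the ordered state constitute the Goldstone (phase) mode, and it is the inhomogeneity of the \(J_{ij}\) that allows this mode to acquire a finite optical weight.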
Fast algorithms for computing the Boltzmann collision operator
The development of accurate and fast numerical schemes for the five-fold Boltzmann collision integral represents a challenging problem in scientific
computing. For a particular class of interactions, including the so-called hard
spheres model in dimension three, we are able to derive spectral methods that
can be evaluated through fast algorithms. These algorithms are based on a
suitable representation and approximation of the collision operator. Explicit
expressions for the errors in the schemes are given and spectral accuracy is
proved. Parallelization properties and adaptivity of the algorithms are also
discussed. Comment: 22 pages
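For reference, the collision integral in question is the standard bilinear Boltzmann operator (textbook form):

\[
Q(f,f)(v) = \int_{\mathbb{R}^3} \int_{\mathbb{S}^2} B(|v - v_*|, \sigma)\,\bigl( f(v')\, f(v_*') - f(v)\, f(v_*) \bigr)\, d\sigma \, dv_*,
\]

with post-collisional velocities

\[
v' = \frac{v + v_*}{2} + \frac{|v - v_*|}{2}\,\sigma, \qquad v_*' = \frac{v + v_*}{2} - \frac{|v - v_*|}{2}\,\sigma .
\]

The integration over \(v_* \in \mathbb{R}^3\) and \(\sigma \in \mathbb{S}^2\) is the five-fold integral referred to above, and the hard-spheres model in dimension three corresponds to the kernel \(B = |v - v_*|\) up to a constant.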
Some Thoughts on Hypercomputation
Hypercomputation is a relatively new branch of computer science that emerged
from the idea that the Church-Turing Thesis, which is supposed to describe what is computable and what is noncomputable, cannot possibly be true. Because of its apparent validity, the Church-Turing Thesis has been used to
investigate the possible limits of intelligence of any imaginable life form,
and, consequently, the limits of information processing, since living beings
are, among other things, information processors. However, in the light of hypercomputation, which seems to be feasible in our universe, one cannot impose arbitrary limits on what intelligence can achieve unless there are specific
physical laws that prohibit the realization of something. In addition,
hypercomputation allows us to ponder aspects of communication between intelligent beings that have not been considered before.
Notes on the Mathematical Foundations of Analogue Computation
Digital computing has its mathematical foundations in (classical) recursion theory and constructive mathematics. The implicit working assumption of those who practice the noble art of analog computing may well be that the mathematical foundations of their subject are as sound as the foundations of real analysis. That, in turn, implies a reliance on the soundness of set theory plus the axiom of choice. This is, surely, seriously disturbing from a computational point of view. Therefore, in this paper, I seek to locate a foundation for analog computing by exhibiting some tentative dualities, with results analogous to those that are standard in computability theory. The main question, from the point of view of economics, is whether the Phillips Machine, as an analog computer, has universal computing properties. The conjectured answer is in the negative.
Knowledge Engineering from Data Perspective: Granular Computing Approach
Rough set theory is a mathematical approach to uncertainty and vagueness in data analysis, introduced by Zdzislaw Pawlak in the 1980s. Rough set theory assumes that the underlying structure of knowledge is a partition. We have extended Pawlak's concept of knowledge to coverings, taking a soft approach that regards any generalized subset as basic knowledge. We regard a covering as basic knowledge, from which the theory of knowledge approximations and learning, knowledge dependency, and reducts is developed.
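As a concrete illustration of approximation over a covering rather than a partition, the sketch below implements one common pair of covering-based operators: the lower approximation is the union of blocks contained in the target set, the upper approximation the union of blocks that intersect it. This is a standard textbook choice, not necessarily the exact operators developed in the paper.

```python
# Covering-based lower and upper approximations of a target set.
# A Pawlak partition is the special case of pairwise-disjoint blocks;
# the operators actually developed in the paper may differ from this choice.

def lower_approx(covering, target):
    """Union of covering blocks entirely contained in the target set."""
    result = set()
    for block in covering:
        if block <= target:
            result |= block
    return result

def upper_approx(covering, target):
    """Union of covering blocks that intersect the target set."""
    result = set()
    for block in covering:
        if block & target:
            result |= block
    return result

# Overlapping blocks, so this covering of U = {1, ..., 5} is not a partition.
covering = [{1, 2}, {2, 3}, {4}, {4, 5}]
X = {1, 2, 4}
print(lower_approx(covering, X))  # {1, 2, 4}
print(upper_approx(covering, X))  # {1, 2, 3, 4, 5}
```

The gap between the two approximations (here {3, 5}) is the boundary region, the part of the universe about which the given knowledge cannot decide membership in X.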
"Possible DeïŹnitions of an âA Prioriâ Granule\ud in General Rough Set Theory" by A. Mani
We introduce an abstract framework for general rough set theory from a mereological perspective and consider possible concepts of 'a priori' granules and granulation in the same. The framework is ideal for relaxing many of the relatively superfluous set-theoretic axioms and for improving the semantics of many relation-based, cover-based and dialectical rough set theories. This is a relatively simplified presentation of a section in three different recent research papers by the present author.
- âŠ