3,784 research outputs found
The future of computing beyond Moore's Law.
Moore's Law is a techno-economic model that has enabled the information technology industry to double the performance and functionality of digital electronics roughly every 2 years within a fixed cost, power and area. Advances in silicon lithography have enabled this exponential miniaturization of electronics but, as transistors reach the atomic scale and fabrication costs continue to rise, the classical technological driver that has underpinned Moore's Law for 50 years is failing and is anticipated to flatten by 2025. This article provides an updated view of what a post-exascale system will look like and of the challenges ahead, based on our most recent understanding of technology roadmaps. It also discusses the tapering of historical improvements and how it affects the options available for continued scaling of successors to the first exascale machine. Lastly, this article covers the many different opportunities and strategies available to continue computing performance improvements in the absence of historical technology drivers. This article is part of the discussion meeting issue 'Numerical algorithms for high-performance computational science'.
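The doubling-every-2-years rule the abstract describes compounds dramatically over the 50-year run it mentions; a minimal sketch (the function name and numbers are illustrative, not from the article):

```python
def moore_factor(years, doubling_period=2.0):
    """Performance multiple accumulated after `years` under a
    doubling-every-`doubling_period`-years exponential trend."""
    return 2.0 ** (years / doubling_period)

# 50 years of doubling every 2 years -> 2**25, roughly a 33.5-million-fold gain
print(f"{moore_factor(50):,.0f}")
```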
Noise-based information processing: Noise-based logic and computing: what do we have so far?
We briefly introduce noise-based logic. After describing the main motivations, we outline classical, instantaneous (squeezed and non-squeezed), continuum, spike and random-telegraph-signal-based schemes, with applications such as circuits that emulate brain functioning and string verification via a slow communication channel.
Comment: Invited talk at the 21st International Conference on Noise and Fluctuations, Toronto, Canada, June 12-16, 201
Memristors -- from In-memory computing, Deep Learning Acceleration, Spiking Neural Networks, to the Future of Neuromorphic and Bio-inspired Computing
Machine learning, particularly in the form of deep learning, has driven most
of the recent fundamental developments in artificial intelligence. Deep
learning is based on computational models that are, to a certain extent,
bio-inspired, as they rely on networks of connected simple computing units
operating in parallel. Deep learning has been successfully applied in areas
such as object/pattern recognition, speech and natural language processing,
self-driving vehicles, intelligent self-diagnostics tools, autonomous robots,
knowledgeable personal assistants, and monitoring. These successes have been
mostly supported by three factors: availability of vast amounts of data,
continuous growth in computing power, and algorithmic innovations. The
approaching demise of Moore's law, and the consequent expected modest
improvements in computing power that can be achieved by scaling, raise the
question of whether the described progress will be slowed or halted due to
hardware limitations. This paper reviews the case for a novel beyond-CMOS hardware technology, memristors, as a potential solution for the implementation of power-efficient in-memory computing, deep learning accelerators, and spiking neural networks. Central themes are the reliance on non-von-Neumann computing architectures and the need to develop tailored learning and inference algorithms. To argue that lessons from biology can be useful in providing directions for further progress in artificial intelligence, we briefly discuss an example based on reservoir computing. We conclude the review by speculating on the big-picture view of future neuromorphic and brain-inspired computing systems.
Comment: Keywords: memristor, neuromorphic, AI, deep learning, spiking neural networks, in-memory computing
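The core idea behind memristive in-memory computing can be illustrated with a short numerical sketch (all values are invented for illustration; this is not the paper's model): a crossbar of programmable conductances performs a matrix-vector multiply in a single analog step, because Ohm's and Kirchhoff's laws sum the per-cell currents along each row.

```python
import numpy as np

# Each crossbar cell's conductance G[i, j] encodes a weight (in siemens).
G = np.array([[0.1, 0.2, 0.3],
              [0.4, 0.5, 0.6],
              [0.7, 0.8, 0.9],
              [1.0, 0.0, 0.5]])

# Voltages applied to the columns act as the input vector.
V = np.array([1.0, 2.0, 3.0])

# Row currents realize the matrix-vector product I = G @ V "for free",
# without shuttling weights between a separate memory and processor.
I = G @ V
print(I)
```

This locality of weights and computation is what the abstract's "non-von-Neumann" theme refers to: the memory array is the compute unit.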
Is there a Moore's law for quantum computing?
There is a common wisdom according to which many technologies progress along some exponential law like the empirical Moore's law, which was validated for over half a century by the growth of the number of transistors in chips. As a technology still in the making, with many potential promises, quantum computing is supposed to follow the pack and grow inexorably to maturity. The Holy Grail in that domain is a large quantum computer with thousands of error-corrected logical qubits, themselves made of thousands, if not more, of physical qubits. These would enable molecular simulations as well as factoring 2048-bit RSA keys, among other use cases drawn from the book of classically intractable computing problems. How far are we from this? Less than 15 years according to many predictions. We will see in this paper that Moore's empirical law cannot easily be translated to an equivalent in quantum computing. Qubits have various figures of merit that will not progress magically thanks to some new manufacturing technique. However, some equivalents of Moore's law may be at play inside and outside the quantum realm, as with quantum computing enabling technologies such as cryogenics and control electronics. Algorithms, software tools and engineering also play a key role as enablers of quantum computing progress. While much of quantum computing's future outcome depends on qubit fidelities, these are progressing rather slowly, particularly at scale. We will finally see that other figures of merit will come into play and potentially change the landscape, such as the quality of computed results and the energetics of quantum computing. Although scientific and technological in nature, this inventory has broad business implications for investment, education and cybersecurity-related decision-making.
Comment: 32 pages, 24 figures
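The scale gap the abstract alludes to can be made concrete with a back-of-the-envelope sketch (the specific counts below are illustrative assumptions, not estimates from the paper): thousands of logical qubits, each built from thousands of physical qubits, multiply into millions of physical qubits.

```python
# Illustrative assumptions only, not figures from the paper.
logical_qubits = 4_000        # "thousands of error-corrected logical qubits"
physical_per_logical = 1_000  # each "made ... of thousands of physical qubits"

physical_qubits = logical_qubits * physical_per_logical
print(f"{physical_qubits:,}")  # millions of physical qubits in this sketch
```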