306 research outputs found

    Scaling up integrated photonic reservoirs towards low-power high-bandwidth computing

    No full text

    Urban hydroinformatics: past, present and future

    Get PDF
    This is the author accepted manuscript. Hydroinformatics, as an interdisciplinary domain that blurs the boundaries between water science, data science and computer science, is constantly evolving and reinventing itself. At the heart of this evolution lies a continuous process of critical (self-)appraisal of the discipline's past, present and potential for further evolution, which creates a positive feedback loop between legacy, reality and aspirations. The power of this process is attested by the success story of hydroinformatics thus far, which has arguably mobilized wide-ranging research and development and brought the water sector more in tune with the digital revolution of the past 30 years. In this context, this paper traces the evolution of the discipline, from its origins in computational hydraulics to its present focus on the complete socio-technical system, while providing a functional framework to improve understanding of, and highlight the links between, different strands of state-of-the-art hydroinformatic research and innovation. Building on this state-of-the-art landscape, the paper then provides an overview of key developments on the discipline's horizon, focusing on those relevant to urban water management, while highlighting important legal, ethical and technical challenges that must be addressed to ensure that the brightest aspects of this potential future are realized. Despite the obvious limitations of a single paper reporting on such a diverse and dynamic field, it is hoped that this work contributes to a better understanding of the current state of hydroinformatics and to a shared vision of the most exciting prospects for the future evolution of the discipline and the water sector it serves.

    White Paper on Digital and Complex Information

    Get PDF
    Information is one of the main traits of the contemporary era. Indeed, there are many perspectives from which to define the present times, such as the Digital Age, the Big Data era, the Fourth Industrial Revolution and the fourth paradigm of science, and in all of them information, gathered, stored, processed and transmitted, plays a key role. Technological developments in recent decades, such as powerful computers, cheaper and miniaturized solutions like smartphones, massive optical communication, or the Internet, to name a few, have enabled this shift to the Information Age. This shift has driven deep cultural and social changes in daily life, in work and personal activities, in access to knowledge and the spreading of information, altering interpersonal relations and the way we interact in the public and private spheres, in economy and politics, paving the way to globalization.

    BPLight-CNN: A Photonics-based Backpropagation Accelerator for Deep Learning

    Full text link
    Training deep learning networks involves continuous weight updates across the various layers of the deep network using the backpropagation (BP) algorithm. This results in expensive computation overheads during training. Consequently, most deep learning accelerators today employ pre-trained weights and focus only on improving the design of the inference phase. The recent trend is to build a complete deep learning accelerator by incorporating the training module. Such efforts require an ultra-fast chip architecture for executing the BP algorithm. In this article, we propose a novel photonics-based backpropagation accelerator for high-performance deep learning training. We present the design of a convolutional neural network, BPLight-CNN, which incorporates the silicon photonics-based backpropagation accelerator. BPLight-CNN is a first-of-its-kind photonic and memristor-based CNN architecture for end-to-end training and prediction. We evaluate BPLight-CNN using a photonic CAD framework (IPKISS) on deep learning benchmark models including LeNet and VGG-Net. Compared to state-of-the-art designs, the proposed design achieves (i) at least 34x speedup, 34x improvement in computational efficiency, and 38.5x energy savings during training; and (ii) 29x speedup, 31x improvement in computational efficiency, and 38.7x improvement in energy savings during inference. All these comparisons are made at 16-bit resolution, and BPLight-CNN achieves these improvements at the cost of approximately 6% lower accuracy compared to the state-of-the-art designs.
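The layer-by-layer weight updates the abstract refers to can be sketched in plain NumPy. This is the generic BP algorithm for a tiny one-hidden-layer network, not the BPLight-CNN photonic design; all names and sizes here are illustrative assumptions.

```python
import numpy as np

# Minimal backpropagation sketch: a one-hidden-layer network trained
# on a toy task. Illustrates the repeated forward/backward/update loop
# that a training accelerator must execute quickly.
rng = np.random.default_rng(0)
X = rng.normal(size=(8, 4))                        # 8 samples, 4 features
y = (X.sum(axis=1, keepdims=True) > 0).astype(float)

W1 = rng.normal(scale=0.5, size=(4, 6))            # input -> hidden weights
W2 = rng.normal(scale=0.5, size=(6, 1))            # hidden -> output weights
lr = 0.5

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

losses = []
for _ in range(500):
    # forward pass through both layers
    h = sigmoid(X @ W1)
    out = sigmoid(h @ W2)
    losses.append(float(np.mean((out - y) ** 2)))
    # backward pass: propagate the error gradient layer by layer
    d_out = (out - y) * out * (1 - out)
    d_h = (d_out @ W2.T) * h * (1 - h)
    # weight updates across the layers (the expensive training step)
    W2 -= lr * h.T @ d_out
    W1 -= lr * X.T @ d_h
```

The inner loop, repeated for every batch of every epoch, is why training is so much costlier than inference and why dedicating hardware to the BP step matters.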

    Leveraging Optical Communication Fiber and AI for Distributed Water Pipe Leak Detection

    Full text link
    Detecting leaks in water networks is a costly challenge. This article introduces a practical solution: integrating the optical network with the water network for efficient leak detection. Our approach uses a fiber-optic cable to measure vibrations, enabling accurate leak identification and localization by an intelligent algorithm. We also propose a method to assess leak severity for prioritized repairs. Our solution detects even small leaks with flow rates as low as 0.027 L/s. It offers a cost-effective way to improve leak detection, enhance water management, and increase operational efficiency.
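The idea of localizing a leak from fiber vibration measurements can be sketched as follows. This is a hypothetical toy model, not the paper's intelligent algorithm (which the abstract does not specify): the fiber yields a vibration trace per sensing position, and a leak appears as elevated vibration energy at the position nearest the leak.

```python
import numpy as np

# Hypothetical leak localization from distributed fiber vibration data.
# Each row is the vibration trace at one sensing position along the pipe.
rng = np.random.default_rng(1)
n_positions, n_samples = 200, 1024
traces = rng.normal(scale=0.1, size=(n_positions, n_samples))  # background noise

leak_pos = 137                                   # simulated leak location
traces[leak_pos] += 0.5 * np.sin(np.linspace(0, 60 * np.pi, n_samples))

# RMS vibration energy per position; flag positions well above baseline
rms = np.sqrt(np.mean(traces ** 2, axis=1))
baseline, spread = np.median(rms), rms.std()
candidates = np.flatnonzero(rms > baseline + 5 * spread)

print(int(rms.argmax()))                         # → 137
```

A real system would have to separate leak vibration from traffic and pump noise, which is presumably where the learned model comes in; the energy threshold here only illustrates the localization principle.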

    2022 roadmap on neuromorphic computing and engineering

    Full text link
    Modern computation based on von Neumann architecture is now a mature cutting-edge science. In the von Neumann architecture, processing and memory units are implemented as separate blocks interchanging data intensively and continuously. This data transfer is responsible for a large part of the power consumption. The next generation of computer technology is expected to solve problems at the exascale, with 10^18 calculations each second. Even though these future computers will be incredibly powerful, if they are based on von Neumann-type architectures they will consume between 20 and 30 megawatts of power and will not have intrinsic, physically built-in capabilities to learn or deal with complex data as our brain does. These needs can be addressed by neuromorphic computing systems, which are inspired by the biological concepts of the human brain. This new generation of computers has the potential to store and process large amounts of digital information with much lower power consumption than conventional processors. Among their potential future applications, an important niche is moving control from data centers to edge devices. The aim of this roadmap is to present a snapshot of the present state of neuromorphic technology and provide an opinion on the challenges and opportunities that the future holds in the major areas of neuromorphic technology, namely materials, devices, neuromorphic circuits, neuromorphic algorithms, applications, and ethics. The roadmap is a collection of perspectives in which leading researchers in the neuromorphic community provide their own views on the current state and future challenges of each research area. We hope that this roadmap will be a useful resource, providing a concise yet comprehensive introduction for readers outside this field and for those just entering it, as well as future perspectives for those well established in the neuromorphic computing community.