3,809 research outputs found

    Scaling up integrated photonic reservoirs towards low-power high-bandwidth computing

    No full text

    Machine Learning with Chaotic Strange Attractors

    Full text link
    Machine learning studies require enormous amounts of power to process massive datasets and train neural networks to high accuracy, a demand that has become increasingly unsustainable. Limited by the von Neumann bottleneck, current computing architectures and methods fuel this high power consumption. Here, we present an analog computing method that harnesses chaotic nonlinear attractors to perform machine learning tasks with low power consumption. Inspired by neuromorphic computing, our model is a programmable, versatile, and generalized platform for machine learning tasks. Our model provides exceptional performance in clustering by utilizing chaotic attractors' nonlinear mapping and sensitivity to initial conditions. When deployed as a simple analog device, it only requires milliwatt-scale power levels while being on par with current machine learning techniques. We demonstrate low errors and high accuracies with our model for regression- and classification-based learning tasks. Comment: Manuscript is 13 pages, 4 figures. Supplementary Material is 6 pages, 3 figures.
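
    A minimal sketch of the idea, under illustrative assumptions: the abstract does not specify the attractor, the input encoding, or the readout, so the code below uses a Lorenz attractor as the chaotic nonlinear map, encodes each input sample into the attractor's initial condition, and trains an ordinary ridge-regression readout on the resulting trajectories. All function names, parameters, and the toy dataset are hypothetical, not the authors' method.

```python
# Hypothetical sketch: a chaotic attractor as a nonlinear feature map plus a
# linear readout. Everything below is an illustrative assumption.
import numpy as np

def lorenz_trajectory(x0, n_steps=200, dt=0.01, sigma=10.0, rho=28.0, beta=8.0 / 3.0):
    """Integrate the Lorenz system from initial condition x0 with explicit Euler."""
    state = np.array(x0, dtype=float)
    traj = np.empty((n_steps, 3))
    for i in range(n_steps):
        x, y, z = state
        dx = sigma * (y - x)
        dy = x * (rho - z) - y
        dz = x * y - beta * z
        state = state + dt * np.array([dx, dy, dz])
        traj[i] = state
    return traj.ravel()  # flatten the trajectory into one feature vector

def chaotic_features(X):
    """Encode each 2-D sample into the attractor's initial condition."""
    return np.stack([lorenz_trajectory([1.0 + x[0], 1.0 + x[1], 25.0]) for x in X])

# Toy two-class problem: train a ridge-regression readout on the chaotic features.
rng = np.random.default_rng(0)
X = rng.normal(size=(100, 2))
y = (X[:, 0] + X[:, 1] > 0).astype(float)

Phi = chaotic_features(X)
ridge = 1e-3 * np.eye(Phi.shape[1])
w = np.linalg.solve(Phi.T @ Phi + ridge, Phi.T @ y)
accuracy = np.mean((Phi @ w > 0.5) == y)
print(f"readout accuracy on training data: {accuracy:.2f}")
```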

    An Overview on Application of Machine Learning Techniques in Optical Networks

    Get PDF
    Today's telecommunication networks have become sources of enormous amounts of widely heterogeneous data. This information can be retrieved from network traffic traces, network alarms, signal quality indicators, users' behavioral data, etc. Advanced mathematical tools are required to extract meaningful information from these data and to make decisions about the proper functioning of the networks. Among these tools, Machine Learning (ML) is regarded as one of the most promising methodological approaches for performing network-data analysis and enabling automated network self-configuration and fault management. The adoption of ML techniques in the field of optical communication networks is motivated by the unprecedented growth in network complexity that optical networks have faced in the last few years. This increase in complexity is due to the introduction of a huge number of adjustable and interdependent system parameters (e.g., routing configurations, modulation format, symbol rate, coding schemes) enabled by the use of coherent transmission/reception technologies, advanced digital signal processing, and compensation of nonlinear effects in optical fiber propagation. In this paper we provide an overview of the application of ML to optical communications and networking. We classify and survey the relevant literature on the topic, and we also provide an introductory tutorial on ML for researchers and practitioners interested in this field. Although a good number of research papers have recently appeared, the application of ML to optical networks is still in its infancy: to stimulate further work in this area, we conclude the paper by proposing possible new research directions.
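
    As a purely illustrative example of the kind of network-data analysis such a survey covers (not code from the paper), the sketch below trains a classifier to estimate whether a candidate lightpath configuration yields acceptable quality of transmission from a few system parameters. The features, the synthetic data-generation rule, and the SNR-margin threshold are all assumptions.

```python
# Illustrative sketch: ML-based quality-of-transmission (QoT) estimation for a
# candidate lightpath. Features, data, and threshold below are synthetic assumptions.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(42)
n = 2000
length_km = rng.uniform(50, 3000, n)          # total path length
spans = np.ceil(length_km / 80)               # number of amplified spans
mod_order = rng.choice([2, 4, 16, 64], n)     # modulation order (BPSK..64QAM)
symbol_rate = rng.choice([32, 64], n)         # symbol rate in GBaud

# Synthetic ground truth: longer paths and denser constellations degrade QoT.
snr_margin = (30 - 0.004 * length_km - 1.5 * np.log2(mod_order)
              - 0.02 * symbol_rate + rng.normal(0, 1.0, n))
acceptable = (snr_margin > 12).astype(int)

X = np.column_stack([length_km, spans, mod_order, symbol_rate])
X_tr, X_te, y_tr, y_te = train_test_split(X, acceptable, test_size=0.3, random_state=0)

clf = RandomForestClassifier(n_estimators=100, random_state=0).fit(X_tr, y_tr)
print(f"held-out accuracy: {clf.score(X_te, y_te):.2f}")
```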

    Chain-structure time-delay reservoir computing for synchronizing chaotic signal and an application to secure communication

    Get PDF
    In this work, chain-structure time-delay reservoir (CSTDR) computing, a new kind of machine-learning-based recurrent neural network, is proposed for synchronizing chaotic signals. Compared with a single time-delay reservoir, the proposed CSTDR computing shows excellent performance in synchronizing chaotic signals, achieving an order of magnitude higher accuracy. Noise considerations and optimal parameter settings of the model are discussed. Taking the CSTDR computing as the core, a novel secure communication scheme is further designed, in which the "smart" receiver differs from traditional receivers in that it can adaptively synchronize to the chaotic signal used for encryption. The scheme avoids issues of conventional settings, such as the design constraint of identical dynamical systems and the coupling between transmitter and receiver. To further demonstrate the practical significance of the scheme, a digital implementation on a field-programmable gate array is built and tested experimentally with real-world examples, including image and video transmission. The work sheds light on developing machine-learning-based signal processing and communication applications.
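
    The chain structure cascades several time-delay reservoirs; as a hedged, minimal sketch of the underlying building block only, the code below implements a single discrete-time time-delay reservoir driven by a noisy chaotic signal, with a ridge readout trained to recover the clean signal. The logistic-map drive, the masking scheme, and all parameters are illustrative assumptions, not the authors' CSTDR setup.

```python
# Minimal sketch of a single time-delay reservoir (TDR) used for chaotic-signal
# recovery. The chain-structure variant would cascade several such blocks.
import numpy as np

def tdr_states(u, n_virtual=50, gamma=0.9, eta=0.4, seed=0):
    """Reservoir states: one tanh node with delayed feedback, sampled at
    n_virtual 'virtual nodes' per input step via a random input mask."""
    rng = np.random.default_rng(seed)
    mask = rng.uniform(-1, 1, n_virtual)
    node = 0.0
    states = np.empty((len(u), n_virtual))
    for t, ut in enumerate(u):
        row = np.empty(n_virtual)
        for i in range(n_virtual):
            node = np.tanh(gamma * mask[i] * ut + eta * node)
            row[i] = node
        states[t] = row
    return states

# Chaotic drive signal (logistic map) plus channel noise.
rng = np.random.default_rng(1)
clean = np.empty(3000)
clean[0] = 0.4
for t in range(1, len(clean)):
    clean[t] = 3.9 * clean[t - 1] * (1 - clean[t - 1])
noisy = clean + rng.normal(0, 0.02, len(clean))

# Ridge readout trained on the first segment, synchronization error on the rest.
S = tdr_states(noisy)
train, test = slice(100, 2000), slice(2000, 3000)
reg = 1e-6 * np.eye(S.shape[1])
w = np.linalg.solve(S[train].T @ S[train] + reg, S[train].T @ clean[train])
nmse = np.mean((S[test] @ w - clean[test]) ** 2) / np.var(clean[test])
print(f"normalized synchronization error on test segment: {nmse:.3e}")
```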

    A Survey on Reservoir Computing and its Interdisciplinary Applications Beyond Traditional Machine Learning

    Full text link
    Reservoir computing (RC), first applied to temporal signal processing, is a recurrent neural network in which neurons are randomly connected. Once initialized, the connection strengths remain unchanged. Such a simple structure turns RC into a nonlinear dynamical system that maps low-dimensional inputs into a high-dimensional space. The model's rich dynamics, linear separability, and memory capacity then enable a simple linear readout to generate adequate responses for various applications. RC spans areas far beyond machine learning, since it has been shown that its complex dynamics can be realized in various physical hardware implementations and biological devices, which yields greater flexibility and shorter computation times. Moreover, the neuronal responses triggered by the model's dynamics shed light on brain mechanisms that exploit similar dynamical processes. While the literature on RC is vast and fragmented, here we conduct a unified review of RC's recent developments from machine learning to physics, biology, and neuroscience. We first review the early RC models, and then survey the state-of-the-art models and their applications. We further introduce studies on modeling the brain's mechanisms with RC. Finally, we offer new perspectives on RC development, including reservoir design, unification of coding frameworks, physical RC implementations, and the interaction between RC, cognitive neuroscience, and evolution. Comment: 51 pages, 19 figures, IEEE Access.
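
    A minimal sketch of the RC recipe described above: fixed random recurrent weights, a high-dimensional nonlinear state, and a trained linear readout. The task (a second-order nonlinear benchmark system), the reservoir size, and all hyperparameters are illustrative assumptions rather than any specific model from the survey.

```python
# Echo-state-network-style sketch: only the linear readout is ever trained.
import numpy as np

rng = np.random.default_rng(0)
n_res, n_steps = 200, 4000

# Input: random drive; target: a simple second-order nonlinear benchmark series.
u = rng.uniform(0, 0.5, n_steps)
y = np.zeros(n_steps)
for t in range(1, n_steps - 1):
    y[t + 1] = 0.4 * y[t] + 0.4 * y[t] * y[t - 1] + 0.6 * u[t] ** 3 + 0.1

# Fixed random reservoir, rescaled so the spectral radius stays below 1
# (a common echo-state-property heuristic).
W = rng.normal(size=(n_res, n_res))
W *= 0.9 / np.max(np.abs(np.linalg.eigvals(W)))
W_in = rng.uniform(-0.5, 0.5, n_res)

x = np.zeros(n_res)
states = np.empty((n_steps, n_res))
for t in range(n_steps):
    x = np.tanh(W @ x + W_in * u[t])   # recurrent weights stay untrained
    states[t] = x

# Ridge-regression readout, trained after a short washout period.
train, test = slice(200, 3000), slice(3000, n_steps)
reg = 1e-6 * np.eye(n_res)
w_out = np.linalg.solve(states[train].T @ states[train] + reg,
                        states[train].T @ y[train])
nmse = np.mean((states[test] @ w_out - y[test]) ** 2) / np.var(y[test])
print(f"test NMSE: {nmse:.3f}")
```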

    Improving the accuracy of weed species detection for robotic weed control in complex real-time environments

    Get PDF
    Alex Olsen applied deep learning and machine vision to improve the accuracy of weed species detection in complex real-time environments. His robotic weed control prototype, AutoWeed, provides a new, efficient tool for weed management in crops and pastures and has launched an agricultural technology startup company.