9 research outputs found

    Virtual Chimera States for Delayed-Feedback Systems

    No full text
    Time-delayed systems are found to display remarkable temporal patterns whose dynamics split into regular and chaotic components repeating at intervals of the delay. This novel long-term behavior of delay dynamics results from a strongly asymmetric nonlinear delayed feedback driving a highly damped harmonic oscillator. In the corresponding virtual space-time representation, the behavior develops as a chimera-like state, a new paradigmatic object from network theory characterized by the coexistence of synchronous and incoherent oscillations. Numerous virtual chimera states are obtained and analyzed through experiment, theory, and simulations.
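The virtual space-time representation mentioned above folds a scalar delay-system trajectory x(t) into rows x_n(σ) = x(nτ + σ), with σ playing the role of a virtual space variable and n a discrete time. A minimal sketch, assuming an illustrative first-order delay model eps·x'(t) = −x(t) + f(x(t−τ)) and made-up parameters (not the paper's exact setup):

```python
import numpy as np

def simulate_delay(f, tau=1.0, eps=0.01, dt=1e-3, n_delays=50, seed=0):
    """Euler integration of eps * x'(t) = -x(t) + f(x(t - tau)),
    starting from a random initial function on one delay interval."""
    steps_per_delay = int(round(tau / dt))
    rng = np.random.default_rng(seed)
    x = list(rng.uniform(-0.5, 0.5, steps_per_delay))  # initial function
    for _ in range(n_delays * steps_per_delay):
        x_delayed = x[-steps_per_delay]
        x.append(x[-1] + dt / eps * (-x[-1] + f(x_delayed)))
    return np.array(x[steps_per_delay:])

def space_time(x, steps_per_delay):
    """Fold the 1-D series into rows of length tau: row n is x(n*tau + sigma)."""
    n_rows = len(x) // steps_per_delay
    return x[:n_rows * steps_per_delay].reshape(n_rows, steps_per_delay)

f = lambda u: 1.5 * np.sin(u + 0.5) ** 2 - 0.7   # illustrative nonlinearity
series = simulate_delay(f)
grid = space_time(series, steps_per_delay=1000)
print(grid.shape)  # (50, 1000): discrete time n vs. virtual space sigma
```

Plotting `grid` as an image is what reveals chimera-like coexistence of coherent and incoherent regions along the virtual space axis.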

    Virtual space-time delay dynamics and their chimera states

    No full text

    Chimera in space-time representation of nonlinear delay dynamics

    No full text
    Delay dynamical systems are known to have a space-time interpretation. We propose to use this feature in the quest for experimental chimera states. A benchmark optoelectronic delay oscillator is considered as the physical setup intended to emulate the high-dimensional spatio-temporal dynamics of coupled virtual nodes. Theoretical analysis, as well as numerical simulations, is proposed to identify the temporal and amplitude parameter conditions under which chimeras can be obtained in an experimental delay dynamics.

    Chimera states in laser delay dynamics: experiment and modeling

    No full text

    Efficient design of hardware-enabled reservoir computing in FPGAs

    No full text
    In this work, we propose a new approach to the efficient optimization and implementation of reservoir computing hardware, reducing the required domain-expert knowledge and optimization effort. First, we introduce a reservoir input mask that self-adapts to the structure of the data via linear autoencoders, thereby incorporating the advantages of dimensionality reduction and dimensionality expansion achieved by the conventional, algorithmically efficient linear-algebra procedures of principal component analysis. Second, we employ evolutionary-inspired genetic-algorithm techniques, resulting in a highly efficient optimization of the reservoir dynamics with a dramatically reduced number of evaluations compared to exhaustive search. We illustrate the method on the so-called single-node reservoir computing architecture, which is especially suitable for implementation in ultrahigh-speed hardware. The combination of both methods, and the resulting reduction of the time required for performance optimization of a hardware system, establishes a strategy toward machine-learning hardware capable of self-adaptation to optimally solve specific problems. We confirm the validity of these principles by building reservoir computing hardware based on a field-programmable gate array.
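A hedged sketch of the two ideas in this abstract on toy data: (1) a data-derived input mask, which for a linear autoencoder reduces to PCA directions, and (2) a minimal genetic algorithm tuning two reservoir hyperparameters. All sizes, names, and the fitness objective are illustrative assumptions, not the paper's implementation:

```python
import numpy as np

rng = np.random.default_rng(1)
X = rng.normal(size=(200, 8)) @ rng.normal(size=(8, 8))  # correlated toy inputs

# (1) PCA-based input mask: the top principal directions of the data
# serve as the virtual-node input weights (linear-autoencoder solution).
_, _, Vt = np.linalg.svd(X - X.mean(0), full_matrices=False)
mask = Vt[:4]                       # 4 virtual-node weight rows per feature

def fitness(params, X):
    """Toy stand-in for real reservoir performance (illustrative only):
    push the node-state variance toward a target value."""
    gain, bias = params
    states = np.tanh(gain * X @ mask.T + bias)
    return -abs(np.var(states) - 0.25)

# (2) minimal genetic algorithm: truncation selection + Gaussian mutation
pop = rng.uniform(0.1, 2.0, size=(20, 2))
for _ in range(30):
    scores = np.array([fitness(p, X) for p in pop])
    parents = pop[np.argsort(scores)[-10:]]          # keep the best half
    children = parents + rng.normal(0, 0.05, parents.shape)
    pop = np.vstack([parents, children])
best = pop[np.argmax([fitness(p, X) for p in pop])]
print(best.shape)  # (2,): optimized (gain, bias)
```

The point of the design is that each fitness evaluation is cheap relative to an exhaustive grid scan of the parameter space, which is what makes the approach attractive for hardware-in-the-loop optimization.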

    "Chimera states" in the wavelength delay dynamics of a tunable laser diode

    No full text
    Chimera states are exotic solutions arising in complex dynamics, typically in networks of coupled oscillators. They are characterized by a cluster arrangement within the network, each cluster being identified by a particular behavior of all of its individual oscillators. Chimeras appear as stable neighboring groups of oscillators, the oscillator motion being uniform within one cluster but incongruent between clusters. We report the experimental, numerical, and theoretical observation of such virtual chimera states in a particular class of nonlinear dynamical systems modeled by a nonlinear delayed integro-differential equation. The corresponding experimental setup involves an optoelectronic delay oscillator in which the dynamical variable is the wavelength of a tunable laser diode.

    Coupled Nonlinear Delay Systems as Deep Convolutional Neural Networks

    No full text
    Neural networks are transforming the field of computer algorithms, yet their emulation on current computing substrates is highly inefficient. Reservoir computing has been successfully implemented on a large variety of substrates and has given new insight into overcoming this implementation bottleneck. Despite its success, the approach lags behind the state of the art in deep learning. We therefore extend time-delay reservoirs to deep networks and demonstrate that these conceptually correspond to deep convolutional neural networks. Convolution is intrinsically realized at the substrate level by the generic drive-response properties of dynamical systems. The resulting novelty is the avoidance of vector-matrix products between layers, which cause low efficiency in today's substrates. Compared to singleton time-delay reservoirs, our deep network achieves accuracy improvements of at least an order of magnitude in Mackey-Glass and Lorenz time-series prediction.
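The drive-response claim above can be illustrated with a toy model: a single leaky node driven by a signal convolves its input with an exponential kernel in the linear regime, so stacking such layers yields convolutional feature maps without any inter-layer weight-matrix product. A minimal sketch under assumed parameters and an assumed tanh nonlinearity (not the paper's model):

```python
import numpy as np

def delay_layer(u, leak=0.3, phi=np.tanh):
    """Leaky driven node: x[k] = (1-leak)*x[k-1] + leak*phi(u[k])."""
    x = np.zeros_like(u)
    for k in range(1, len(u)):
        x[k] = (1 - leak) * x[k - 1] + leak * phi(u[k])
    return x

signal = np.sin(np.linspace(0, 8 * np.pi, 400))
layer1 = delay_layer(signal)
layer2 = delay_layer(layer1)          # a deeper layer: no matrix product
print(layer2.shape)  # (400,)

# In the linear regime (phi = identity) the node's response equals a
# convolution of the drive with an exponential kernel:
kernel = 0.3 * (1 - 0.3) ** np.arange(400)
linear = delay_layer(signal, phi=lambda v: v)
same = np.allclose(linear, np.convolve(signal, kernel)[:400])
```

This is the sense in which convolution is "realized on a substrate level": the kernel is set by the node's physical response, not stored as weights.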