6,398 research outputs found

    Wave-based extreme deep learning based on non-linear time-Floquet entanglement

    Wave-based analog signal processing holds the promise of extremely fast, on-the-fly, power-efficient data processing, occurring as a wave propagates through an artificially engineered medium. Yet, due to the fundamentally weak non-linearities of traditional wave materials, such analog processors have been so far largely confined to simple linear projections such as image edge detection or matrix multiplications. Complex neuromorphic computing tasks, which inherently require strong non-linearities, have so far remained out of reach of wave-based solutions, with a few attempts that implemented non-linearities on the digital front, or used weak and inflexible non-linear sensors, restraining the learning performance. Here, we tackle this issue by demonstrating the relevance of Time-Floquet physics to induce a strong non-linear entanglement between signal inputs at different frequencies, enabling a power-efficient and versatile wave platform for analog extreme deep learning involving a single, uniformly modulated dielectric layer and a scattering medium. We prove the efficiency of the method for extreme learning machines and reservoir computing to solve a range of challenging learning tasks, from forecasting chaotic time series to the simultaneous classification of distinct datasets. Our results open the way for wave-based machine learning with high energy efficiency, speed, and scalability. Comment: 23 pages, 9 figures
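    The extreme-learning-machine recipe invoked here reduces training to a single linear solve: a fixed random nonlinear layer expands the input, and only a linear readout is fitted. The Python sketch below is a minimal numerical illustration under assumed stand-ins (a complex random matrix for the scattering medium, a squared modulus for intensity detection, a toy XOR-like task); it is not the authors' wave model.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy nonlinear task: the XOR-like sign of x0 * x1 (not linearly separable).
X = rng.normal(size=(1000, 2))
y = (X[:, 0] * X[:, 1] > 0).astype(float)

# Fixed random "scattering" layer: complex weights stand in for a random medium,
# and the squared modulus stands in for intensity detection (the nonlinearity).
n_hidden = 200
W = rng.normal(size=(2, n_hidden)) + 1j * rng.normal(size=(2, n_hidden))

def features(X):
    return np.abs(X @ W) ** 2

# Extreme learning machine: only the linear readout is trained (ridge regression).
H = features(X)
W_out = np.linalg.solve(H.T @ H + 1e-3 * np.eye(n_hidden), H.T @ y)

# Held-out evaluation.
X_test = rng.normal(size=(500, 2))
y_test = (X_test[:, 0] * X_test[:, 1] > 0).astype(float)
accuracy = np.mean((features(X_test) @ W_out > 0.5) == y_test)
print(f"test accuracy: {accuracy:.2f}")
```

    Only W_out is learned, which is the appeal of the wave implementation: the heavy nonlinear mixing happens during propagation, while training stays a cheap linear regression.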

    Excitability and optical pulse generation in semiconductor lasers driven by resonant tunneling diode photo-detectors

    We demonstrate, experimentally and theoretically, excitable nanosecond optical pulses in optoelectronic integrated circuits operating at telecommunication wavelengths (1550 nm) comprising a nanoscale double barrier quantum well resonant tunneling diode (RTD) photo-detector driving a laser diode (LD). When perturbed either electrically or optically by an input signal above a certain threshold, the optoelectronic circuit generates short electrical and optical excitable pulses mimicking the spiking behavior of biological neurons. Interestingly, the asymmetric nonlinear characteristic of the RTD-LD allows for two different regimes where one obtains either single pulses or a burst of multiple pulses. The high-speed excitable response capabilities are promising for neurally inspired information-processing applications in photonics. (C) 2013 Optical Society of America. Funding: FCT [PTDC/EEA-TEL/100755/2008]; FCT Portugal [SFRH/BPD/84466/2012]; Ramón y Cajal fellowship; project RANGER [TEC2012-38864-C03-01]; Direcció General de Recerca del Govern de les Illes Balears; EU FEDER funds; Ministry of Economy and Competitiveness of Spain [FIS2010-22322-C02-01]
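    The all-or-none pulse response described above is the defining signature of excitability. As a hedged illustration only (the FitzHugh-Nagumo model below is a generic excitable system, not the authors' RTD-LD circuit model, and the kick amplitudes and integration settings are arbitrary choices), a brief perturbation elicits a full pulse only once it exceeds a threshold:

```python
def count_pulses(kick, T=200.0, dt=0.01, eps=0.08, a=0.7, b=0.8):
    """Euler-integrate the FitzHugh-Nagumo model; 'kick' is a 1-time-unit input at t=50."""
    v, w = -1.2, -0.62                 # approximate resting state
    pulses, v_prev = 0, v
    for i in range(int(T / dt)):
        t = i * dt
        I = kick if 50.0 <= t < 51.0 else 0.0        # brief perturbation
        dv = v - v**3 / 3 - w + I                    # fast (voltage-like) variable
        dw = eps * (v + a - b * w)                   # slow recovery variable
        v, w = v + dt * dv, w + dt * dw
        if v_prev < 1.0 <= v:                        # upward crossing = one pulse
            pulses += 1
        v_prev = v
    return pulses

for kick in (0.1, 0.3, 0.8):                         # sub- vs supra-threshold inputs
    print(f"kick = {kick}: {count_pulses(kick)} pulse(s)")
```

    Sub-threshold kicks decay back to rest, while a supra-threshold kick fires one stereotyped pulse, mirroring the neuron-like spiking reported for the RTD-LD circuit.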

    Nanophotonic reservoir computing with photonic crystal cavities to generate periodic patterns

    Reservoir computing (RC) is a technique in machine learning inspired by neural systems. RC has been used successfully to solve complex problems such as signal classification and signal generation. These systems are mainly implemented in software, which limits their speed and power efficiency. Several optical and optoelectronic implementations have been demonstrated, in which the signals carry both amplitude and phase. These have been shown to enrich the dynamics of the system, which benefits performance. In this paper, we introduce a novel optical architecture based on nanophotonic crystal cavities. This allows us to integrate many neurons on one chip, which, compared with other photonic solutions, most closely resembles a classical neural network. Furthermore, the components are passive, which simplifies the design and reduces the power consumption. To assess the performance of this network, we train a photonic network to generate periodic patterns, using an alternative online learning rule called first-order reduced and controlled error (FORCE). For this, we first train a classical hyperbolic tangent reservoir, but then we vary some of the properties to incorporate typical aspects of a photonics reservoir, such as the use of continuous-time versus discrete-time signals and the use of complex-valued versus real-valued signals. Then, the nanophotonic reservoir is simulated and we explore the role of relevant parameters such as the topology, the phases between the resonators, the number of nodes that are biased and the delay between the resonators. It is important that these parameters are chosen such that no strong self-oscillations occur. Finally, our results show that for a signal generation task a complex-valued, continuous-time nanophotonic reservoir outperforms a classical (i.e., discrete-time, real-valued) leaky hyperbolic tangent reservoir (normalized root-mean-square error NRMSE = 0.030 versus NRMSE = 0.127).
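    For readers unfamiliar with the classical baseline, the sketch below trains a discrete-time, real-valued leaky hyperbolic tangent reservoir with output feedback to regenerate a periodic pattern and reports the NRMSE. It uses offline ridge regression rather than the FORCE rule, purely numerical states rather than simulated photonic crystal cavities, and all parameter values (reservoir size, leak rate, spectral radius, target signal) are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(1)

# Target: a simple periodic pattern (sum of two sines) to be generated.
T_train, T_test = 2000, 300
t = np.arange(T_train + T_test)
target = 0.5 * np.sin(2 * np.pi * t / 25) + 0.25 * np.sin(2 * np.pi * t / 50)

# Classical baseline: discrete-time, real-valued leaky-tanh reservoir.
N, leak, rho = 200, 0.3, 0.9
W = rng.normal(size=(N, N))
W *= rho / np.max(np.abs(np.linalg.eigvals(W)))   # rescale to spectral radius rho
W_fb = rng.uniform(-1, 1, size=N)                 # output-feedback weights

def step(x, y_prev):
    return (1 - leak) * x + leak * np.tanh(W @ x + W_fb * y_prev)

# Teacher forcing: drive the reservoir with the true signal and collect states.
x = np.zeros(N)
states = np.zeros((T_train, N))
for k in range(1, T_train):
    x = step(x, target[k - 1] + 1e-3 * rng.normal())   # tiny noise aids stability
    states[k] = x

washout = 200
H, Y = states[washout:], target[washout:T_train]
W_out = np.linalg.solve(H.T @ H + 1e-4 * np.eye(N), H.T @ Y)   # ridge readout

# Free run: feed the reservoir its own output and compare with the target.
y, preds = target[T_train - 1], np.zeros(T_test)
for k in range(T_test):
    x = step(x, y)
    y = x @ W_out
    preds[k] = y

true = target[T_train:]
print(f"NRMSE: {np.sqrt(np.mean((preds - true) ** 2)) / np.std(true):.3f}")
```

    The photonic reservoir studied in the paper replaces these tanh nodes with coupled cavities carrying complex-valued, continuous-time signals, which is where the reported performance gain comes from.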

    Extreme events generated in microcavity lasers and their predictions by reservoir computing

    Extreme events generated by complex systems have been intensively studied in many fields due to their great impact on scientific research and our daily lives. However, their prediction is still a challenge in spite of the tremendous progress that model-free machine learning has brought to the field. We experimentally generate, and theoretically model, extreme events in a current-modulated, single-mode microcavity laser operating on orthogonal polarizations, where their strongly differing thresholds (due to cavity birefringence) give rise to giant light pulses initiated by spontaneous emission. Applying reservoir-computing techniques, we identify in advance the emergence of an extreme event from a time series, in spite of coarse sampling and limited sample length. Performance is optimized through new hybrid configurations that we introduce in this paper. Advance warning times can reach 5 ns, i.e., approximately ten times the rise time of an individual extreme event.
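    The advance-warning problem can be phrased as supervised learning on the measured time series: label every instant by whether an extreme event occurs within the next h samples, then train a model-free predictor on recent samples. The toy sketch below uses a synthetic build-up-and-burst signal and a plain ridge readout on a short delay embedding as a stand-in for the reservoir; the signal model, horizon h, and embedding depth d are illustrative assumptions, not the paper's laser data or its hybrid reservoir configurations.

```python
import numpy as np

rng = np.random.default_rng(2)

# Toy "slow build-up, sudden burst" series: a hidden variable ramps up and,
# once it exceeds 1, releases a giant pulse (the extreme event) and resets.
T = 20000
x = np.zeros(T)
events = np.zeros(T, dtype=bool)
e = 0.0
for t in range(T):
    e += 0.02 + 0.01 * rng.normal()
    if e > 1.0:
        x[t], events[t], e = 5.0 + rng.normal(), True, 0.0   # extreme pulse
    else:
        x[t] = 0.3 * e + 0.1 * rng.normal()                  # weak, noisy precursor

# Advance-warning target: does an extreme event occur within the next h samples?
h, d = 10, 20                                # warning horizon and embedding depth
y = np.array([events[t + 1:t + 1 + h].any() for t in range(d, T - h)])
X = np.array([x[t - d:t] for t in range(d, T - h)])

# Ridge "readout" on the delay embedding (a linear stand-in for the reservoir).
n_train = len(X) // 2
A, b = X[:n_train], y[:n_train].astype(float)
w = np.linalg.solve(A.T @ A + 1e-2 * np.eye(d), A.T @ b)

# Evaluate warnings on the held-out half.
warn, truth = (X[n_train:] @ w) > 0.5, y[n_train:]
print(f"hit rate:         {(warn & truth).sum() / truth.sum():.2f}")
print(f"false-alarm rate: {(warn & ~truth).sum() / (~truth).sum():.2f}")
```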

    Novel linear and nonlinear optical signal processing for ultra-high bandwidth communications

    This thesis centres on ultra-wide-bandwidth, single-channel signals. It focuses on the two main topics of transmission and processing of information by techniques compatible with high baud rates. The processing schemes introduced combine new linear and nonlinear optical platforms such as Fourier-domain programmable optical processors and chalcogenide chip waveguides, as well as the concept of neural networks. Transmission of data is considered in the context of medium-distance links carrying Optical Time Division Multiplexed (OTDM) data subject to environmental fluctuations. We experimentally demonstrate simultaneous compensation of differential group delay and multiple orders of dispersion at symbol rates of 640 Gbaud and 1.28 Tbaud. Signal processing at high bandwidth is envisaged both in the case of elementary post-transmission analog error mitigation and in the broader field of optical computing for high-level operations (an “optical processor”). A key innovation is the introduction of a novel four-wave mixing scheme implementing a dot-product operation between wavelength-multiplexed channels. In particular, it is demonstrated for low-latency, hash-key-based all-optical error detection in links encoded with advanced modulation formats. Finally, the work presents groundbreaking concepts for compact implementation of an optical neural network as a programmable multi-purpose processor. The experimental architecture can implement neural networks with several nodes on a single optical nonlinear transfer function, performing operations such as analog-to-digital conversion. The distinguishing feature of the thesis is its new approaches to optical signal processing, which potentially enable high-level operations using simple optical hardware and limited cascading of components.
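    One processing primitive mentioned above, hash-key based error detection built on a dot product between wavelength-multiplexed channels, is easy to illustrate numerically. The sketch below is only a toy digital analogue under assumed details (a random ±1 key and a single flipped symbol as the error model); it is not the four-wave-mixing implementation developed in the thesis.

```python
import numpy as np

rng = np.random.default_rng(3)

# Shared hash key; in the thesis the dot product with such a key is computed
# all-optically by four-wave mixing across wavelength-multiplexed channels.
n_symbols = 64
key = rng.choice([-1.0, 1.0], size=n_symbols)

def hash_block(symbols):
    return symbols @ key                 # the whole hash is one dot product

# Transmitter: a block of QPSK-like symbols plus its hash.
tx = rng.choice([-1.0, 1.0], size=n_symbols) + 1j * rng.choice([-1.0, 1.0], size=n_symbols)
tx_hash = hash_block(tx)

# Receiver side: an intact copy passes the check ...
rx_clean = tx.copy()
print("clean block flagged?    ", not np.isclose(hash_block(rx_clean), tx_hash))

# ... while a single flipped symbol (a transmission error) is detected.
rx_bad = tx.copy()
rx_bad[rng.integers(n_symbols)] *= -1
print("corrupted block flagged?", not np.isclose(hash_block(rx_bad), tx_hash))
```

    Because the whole check collapses to one dot product, it maps onto a single analog operation, which is what keeps the optical version low-latency.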