Learning phase transitions from dynamics
We propose the use of recurrent neural networks for classifying phases of
matter based on the dynamics of experimentally accessible observables. We
demonstrate this approach by training recurrent networks on the magnetization
traces of two distinct models of one-dimensional disordered and interacting
spin chains. The obtained phase diagram for a well-studied model of the
many-body localization transition shows excellent agreement with previously
known results obtained from time-independent entanglement spectra. For a
periodically-driven model featuring an inherently dynamical time-crystalline
phase, the phase diagram that our network traces in a previously-unexplored
regime coincides with an order parameter for its expected phases.
Comment: 5 pages + 3 figures, appendix + 5 figures
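The approach described above, classifying a phase from the time trace of an observable with a recurrent network, can be sketched as a minimal forward pass. Everything here is an illustrative assumption, not the authors' actual architecture: a vanilla RNN with randomly initialized weights standing in for trained parameters, a scalar magnetization trace, and two candidate phases.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical setup: T time steps of a scalar magnetization trace,
# classified into one of two phases by a vanilla RNN. The random weights
# below stand in for parameters that would be learned from training data.
T, hidden, n_phases = 50, 16, 2

W_in = rng.normal(scale=0.1, size=(hidden,))
W_h = rng.normal(scale=0.1, size=(hidden, hidden))
b_h = np.zeros(hidden)
W_out = rng.normal(scale=0.1, size=(n_phases, hidden))
b_out = np.zeros(n_phases)

def classify_trace(trace):
    """Run the RNN over a magnetization time series; return phase probabilities."""
    h = np.zeros(hidden)
    for m_t in trace:
        # Recurrent update: mix the current observable into the hidden state.
        h = np.tanh(W_in * m_t + W_h @ h + b_h)
    # Softmax over the final hidden state gives per-phase probabilities.
    logits = W_out @ h + b_out
    p = np.exp(logits - logits.max())
    return p / p.sum()

# Example input: a decaying magnetization trace, as might be recorded
# experimentally deep in a thermalizing phase.
trace = np.exp(-0.1 * np.arange(T))
probs = classify_trace(trace)
```

In practice the network would be trained on many labeled traces drawn from the two sides of the transition; only the trained forward pass is sketched here.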
Photonic Delay Systems as Machine Learning Implementations
Nonlinear photonic delay systems present interesting implementation platforms
for machine learning models. They can be extremely fast, offer a high degree of
parallelism, and potentially consume far less power than digital processors. So
far they have been successfully employed for signal processing using the
Reservoir Computing paradigm. In this paper we show that their range of
applicability can be greatly extended if we use gradient descent with
backpropagation through time on a model of the system to optimize the input
encoding of such systems. We perform physical experiments that demonstrate that
the obtained input encodings work well in reality, and we show that optimized
systems perform significantly better than the common Reservoir Computing
approach. The results presented here demonstrate that common gradient descent
techniques from machine learning may well be applicable to physical
neuro-inspired analog computers.
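The optimization loop described above, gradient descent on the input encoding of a modeled delay system, can be sketched with a toy stand-in. All specifics here are assumptions for illustration: a ring of virtual nodes with a sin^2 nonlinearity approximates the delay reservoir, the task is reproducing a delayed copy of the input through a least-squares readout, and a finite-difference gradient replaces the analytic backpropagation-through-time gradient used in the paper.

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy stand-in for a photonic delay reservoir: N virtual nodes on a ring,
# driven by a scalar input that is spread over the nodes by an input mask.
N, steps = 20, 30

def reservoir_states(u, mask, feedback=0.8):
    """Simulate the delay-system model; u is the scalar input sequence."""
    x = np.zeros(N)
    states = []
    for t in range(steps):
        # Delayed feedback plus masked input, passed through a sin^2
        # nonlinearity (a common model of an intensity-modulated loop).
        x = np.sin(feedback * np.roll(x, 1) + mask * u[t]) ** 2
        states.append(x.copy())
    return np.array(states)

# Task: reconstruct a delayed copy of the input from a linear readout.
u = rng.uniform(-1, 1, steps)
target = np.roll(u, 2)

def loss(mask):
    S = reservoir_states(u, mask)
    w, *_ = np.linalg.lstsq(S, target, rcond=None)
    return float(np.mean((S @ w - target) ** 2))

# Gradient descent on the input mask. A central finite-difference gradient
# stands in for backpropagation through time; steps are only accepted when
# they lower the loss, halving the step size otherwise.
mask = rng.uniform(-1, 1, N)
loss_before = loss(mask)
lr = 0.5
for _ in range(20):
    grad = np.array([
        (loss(mask + 1e-4 * e) - loss(mask - 1e-4 * e)) / 2e-4
        for e in np.eye(N)
    ])
    candidate = mask - lr * grad
    if loss(candidate) < loss(mask):
        mask = candidate
    else:
        lr *= 0.5
loss_after = loss(mask)
```

The key design point from the abstract survives even in this toy version: the reservoir itself stays fixed, and only the input encoding (the mask) is optimized against a model of the system.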
Structure theory for the realization of finite state automata. Progress report, 1 Nov. 1966 - 30 Apr. 1967
- …