Machine learning imitates the basic features of biological neural networks to
efficiently perform tasks such as pattern recognition. This has been mostly
achieved at a software level, and a strong effort is currently being made to
mimic neurons and synapses with hardware components, an approach known as
neuromorphic computing. CMOS-based circuits have been used for this purpose,
but they rely on external capacitors to store the neuron state, which are
difficult to scale down, limiting the device density and motivating the
search for neuromorphic materials. While recent advances in resistive switching
have provided a path to emulate synapses at the 10 nm scale, a scalable neuron
analogue is yet to be found. Here, we show how heat transfer can be utilized to
mimic neuron functionalities in Mott nanodevices. We use the Joule heating
created by current spikes to trigger the insulator-to-metal transition in a
biased VO2 nanogap. We show that thermal dynamics allow the implementation of
the basic neuron functionalities: spiking activity, leaky integrate-and-fire,
volatility and rate coding. By using local temperature as the internal
variable, we avoid the need for external capacitors, which reduces neuristor
size by several orders of magnitude. This approach could enable neuromorphic
hardware to take full advantage of the rapid advances in memristive synapses,
allowing for much denser and more complex neural networks. More generally, we show
that heat dissipation is not always an undesirable effect: it can perform
computing tasks if properly engineered.
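
To make the thermal mechanism concrete, the following minimal sketch (in Python) models the nanogap temperature as the leaky integrate-and-fire variable: input current spikes deposit Joule heat, Newton cooling toward the substrate provides the leak, and crossing the VO2 insulator-to-metal transition fires the neuron. This is a toy model under assumed parameters, not the device physics of this work: the thermal capacitance, time constant, resistance and currents are illustrative, and the resistance change at the transition is reduced to a simple threshold-and-reset.

```python
# Minimal sketch of a thermally driven leaky integrate-and-fire neuron:
# current spikes deposit Joule heat in a VO2 nanogap, the local temperature T
# is the internal (membrane-like) variable, and the neuron "fires" when T
# crosses the insulator-to-metal transition. All parameter values below are
# illustrative assumptions, not figures from this work.

T_SUB  = 300.0    # substrate (ambient) temperature, K
T_IMT  = 340.0    # VO2 insulator-to-metal transition temperature, K
TAU_TH = 50e-9    # thermal relaxation time constant, s (assumed)
C_TH   = 1e-12    # effective thermal capacitance, J/K (assumed)
R_INS  = 1e5      # insulating-state resistance, ohm (assumed)
DT     = 1e-9     # integration time step, s

def run(input_current, steps):
    """Integrate the gap temperature and record output spike times (in steps)."""
    T = T_SUB
    spikes = []
    for k in range(steps):
        i = input_current(k)              # input current at step k, A
        heating = i**2 * R_INS / C_TH     # Joule heating rate, K/s
        cooling = (T - T_SUB) / TAU_TH    # leak toward the substrate, K/s
        T += DT * (heating - cooling)     # explicit Euler update
        if T >= T_IMT:                    # transition reached: gap turns metallic
            spikes.append(k)              # biased device emits an output spike
            T = T_SUB                     # post-spike cooling resets the neuron
    return spikes

# Rate coding: the same input spikes integrate thermally and fire when closely
# spaced, but leak away and remain subthreshold when spaced beyond ~TAU_TH.
fast = run(lambda k: 3.2e-4 if k % 10 == 0 else 0.0, 2000)   # spike every 10 ns
slow = run(lambda k: 3.2e-4 if k % 400 == 0 else 0.0, 2000)  # spike every 400 ns
print(f"fast input -> {len(fast)} spikes, slow input -> {len(slow)} spikes")
```

Running the sketch illustrates rate coding: the closely spaced input train accumulates heat and fires repeatedly, while the sparse train stays subthreshold because each deposited heat pulse leaks away before the next one arrives.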