Artificial neural networks usually consist of successive linear
multiply-accumulate operations and nonlinear activation functions. However,
most optical neural networks implement only the linear operations in the
optical domain, while the optical implementation of the activation function
remains challenging. Here we experimentally demonstrate an optical ReLU-like
activation function based on a semiconductor laser subject to optical
injection. The
ReLU-like function is achieved in a broad regime above the Hopf bifurcation of
the injection-locking diagram. In particular, the slope of the activation
function is reconfigurable by tuning the frequency difference between the
master laser and the slave laser.
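A minimal numerical sketch of the behavior described above: a ReLU-like response that is zero below a threshold (playing the role of the Hopf bifurcation point of the injection-locking diagram) and linear above it with a tunable slope (playing the role of the master-slave frequency detuning). The function name, threshold, and slope values here are illustrative assumptions, not measured device parameters.

```python
import numpy as np

def relu_like(x, slope=1.0, threshold=0.0):
    """Toy ReLU-like activation: zero below `threshold` (analogous to the
    Hopf bifurcation point), linear above it with a tunable `slope`
    (analogous to reconfiguring the frequency detuning)."""
    return slope * np.maximum(x - threshold, 0.0)

# Sweeping the slope parameter mimics reconfiguring the activation
# function by changing the master-slave frequency difference.
inputs = np.linspace(-1.0, 2.0, 7)
for slope in (0.5, 1.0, 2.0):
    print(f"slope={slope}:", relu_like(inputs, slope=slope, threshold=0.0))
```

This is only a functional abstraction of the measured input-output curve; the physical device realizes the nonlinearity through the laser dynamics, not through an explicit computation.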