A machine learning method for the prediction of Raman gain and noise spectra is
presented: it achieves high accuracy (RMSE < 0.4 dB) with low computational
complexity, making it suitable for real-time implementation in future optical
network controllers.
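The abstract does not disclose the model architecture, input features, or training data. The sketch below is only a minimal illustration of the kind of regression task described, assuming a small feed-forward network that maps Raman pump powers and wavelengths to a gain spectrum sampled on a fixed frequency grid; all dataset dimensions and the placeholder data are hypothetical.

```python
# Illustrative sketch only: the paper's actual model, features, and data are not specified.
# Hypothetical setup: predict a Raman on/off gain spectrum (in dB, on a fixed frequency
# grid) from pump powers and wavelengths with a small feed-forward regressor.
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)

n_samples, n_pumps, n_grid = 2000, 3, 40          # hypothetical dataset dimensions
X = rng.uniform(size=(n_samples, 2 * n_pumps))    # normalized pump powers + wavelengths
Y = rng.normal(size=(n_samples, n_grid))          # placeholder gain spectra in dB

X_tr, X_te, Y_tr, Y_te = train_test_split(X, Y, test_size=0.2, random_state=0)

# Small multilayer perceptron: cheap to evaluate, consistent with the low-complexity claim.
model = MLPRegressor(hidden_layer_sizes=(64, 64), max_iter=500, random_state=0)
model.fit(X_tr, Y_tr)

# RMSE in dB over the held-out spectra, the accuracy metric quoted in the abstract.
rmse = np.sqrt(np.mean((model.predict(X_te) - Y_te) ** 2))
print(f"RMSE = {rmse:.2f} dB")
```

At inference time such a model reduces to a few small matrix multiplications per spectrum, which is what makes this class of predictor attractive for real-time use inside a network controller.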