Artificial neural network emulators have been demonstrated to be a highly
computationally efficient method for rapidly generating galaxy spectral energy
distributions (SEDs), for parameter inference or otherwise. Using a highly
flexible and fast mathematical structure, they can learn the nontrivial
relationship between input galaxy parameters and output observables. However,
they do so imperfectly, and small errors in flux prediction can yield large
differences in recovered parameters. In this work, we investigate the
relationship between an emulator's execution time, uncertainties, correlated
errors, and ability to recover accurate posteriors. We show that emulators can
recover results consistent with traditional fits, with a precision of 25–40%
in posterior medians for stellar mass, stellar metallicity, star formation
rate, and stellar age. We find that emulation uncertainties scale with an
emulator's width N as ∝ N^{-1} while execution time scales as
∝ N^{2}, resulting in an inherent tradeoff between execution time and
emulation uncertainties. We also find that emulators with uncertainties smaller
than observational uncertainties are able to recover accurate posteriors for
most parameters without a significant increase in catastrophic outliers.
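The reported scalings imply that halving the emulation uncertainty costs roughly a factor of four in execution time, since uncertainty ∝ N^{-1} and time ∝ N^{2} together give time ∝ uncertainty^{-2}. A toy numerical sketch (not from the paper; the proportionality constants here are arbitrary placeholders):

```python
# Toy illustration of the stated scalings for an emulator of width N:
# emulation uncertainty sigma ~ N^-1, execution time t ~ N^2.
def sigma(N, c=1.0):
    # Uncertainty scales inversely with network width (c is an
    # arbitrary normalization, not a fitted value from the paper).
    return c / N

def exec_time(N, k=1.0):
    # Execution time scales quadratically with network width
    # (k is likewise an arbitrary normalization).
    return k * N**2

# Doubling the width halves the uncertainty but quadruples the time,
# i.e. t ~ sigma^-2.
N1, N2 = 128, 256
print(sigma(N2) / sigma(N1))          # -> 0.5
print(exec_time(N2) / exec_time(N1))  # -> 4.0
```

Eliminating this tradeoff is not possible within a fixed architecture family; the choice of width sets where an emulator sits on the time-versus-uncertainty curve.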
Furthermore, we demonstrate that small architectures can produce flux residuals
that have significant correlations, which can create dangerous systematic
errors in colors. Finally, we show that the distributions chosen for generating
training sets can have a large effect on emulators' ability to accurately fit
rare objects. Selecting the optimal architecture and training set for an
emulator will minimize the computational requirements for fitting near-future
large-scale galaxy surveys.

Comment: 26 pages, 15 figures. Submitted to the Astrophysical Journal.