We propose a stochastic first-order trust-region method with inexact function
and gradient evaluations for solving finite-sum minimization problems. At each
iteration, the function and the gradient are approximated by sampling. The
sample size used in gradient approximations is smaller than the one used in
function approximations, and the latter is governed by a deterministic rule
inspired by the inexact restoration method, which allows the sample size to
decrease at some iterations. The trust-region step is then either accepted
or rejected using a suitable merit function, which combines the function
estimate with a measure of accuracy in the evaluation. We show that the
proposed method eventually reaches full precision in evaluating the objective
function and we provide a worst-case complexity result on the number of
iterations required to achieve full precision. We validate the proposed
algorithm on nonconvex binary classification problems, showing good performance
in terms of both cost and accuracy, together with the important practical
feature that burdensome tuning of the parameters involved is not required.
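The iteration described above (subsampled function and gradient estimates, a smaller gradient sample, a merit-function-based accept/reject test, and growth of the sample size toward full precision) can be sketched as follows. This is a minimal illustration on a toy finite-sum problem, not the paper's algorithm: the doubling-on-rejection sample schedule, the accuracy measure `mu * (1 - n/N)`, and all parameter values are assumptions standing in for the inexact-restoration rule.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy finite-sum problem (illustrative only, not from the paper):
# f(x) = (1/N) * sum_i 0.5 * ||x - a_i||^2
N, d = 200, 5
A = rng.normal(size=(N, d))

def f_est(x, idx):
    """Subsampled function estimate over the index set idx."""
    return 0.5 * np.mean(np.sum((x - A[idx]) ** 2, axis=1))

def g_est(x, idx):
    """Subsampled gradient estimate over the index set idx."""
    return np.mean(x - A[idx], axis=0)

def stochastic_tr(x, n_f=40, delta=1.0, eta=0.1, mu=1.0, iters=100):
    """First-order trust-region loop with subsampled f and g.

    n_f is the function sample size; the gradient uses a smaller sample
    (here n_f // 2, an arbitrary choice).  The merit function adds
    mu * (1 - n_f / N) as a crude accuracy measure; within a single
    iteration this term cancels in the acceptance test, but it penalizes
    comparisons across different sample sizes.  The sample-size schedule
    below is a plain doubling-on-rejection heuristic, NOT the paper's
    deterministic inexact-restoration rule.
    """
    for _ in range(iters):
        idx_f = rng.choice(N, size=n_f, replace=False)
        idx_g = rng.choice(N, size=max(1, n_f // 2), replace=False)
        g = g_est(x, idx_g)
        gnorm = np.linalg.norm(g)
        if gnorm < 1e-8:
            break
        s = -delta * g / gnorm                  # first-order (Cauchy-like) step
        merit_old = f_est(x, idx_f) + mu * (1.0 - n_f / N)
        merit_new = f_est(x + s, idx_f) + mu * (1.0 - n_f / N)
        pred = delta * gnorm                    # predicted decrease (linear model)
        if merit_old - merit_new >= eta * pred:
            x = x + s                           # accept: take step, enlarge radius
            delta = min(2.0 * delta, 5.0)
        else:
            delta *= 0.5                        # reject: shrink radius and
            n_f = min(2 * n_f, N)               # move toward full precision
    return x, n_f
```

Under these assumptions, rejected steps both shrink the trust-region radius and enlarge the function sample, so the estimates become exact (n_f = N) if rejections accumulate, loosely mirroring the full-precision behavior established in the paper.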