We present a new accelerated stochastic second-order method that is robust to
both gradient and Hessian inexactness, which typically arises in machine
learning applications. We establish theoretical lower bounds and prove that our
algorithm achieves optimal convergence rates with respect to both gradient and
Hessian inexactness in this setting.
key setting. We further introduce a tensor generalization for stochastic
higher-order derivatives. When the oracles are non-stochastic, the proposed
tensor algorithm matches the global convergence rate of the Nesterov
Accelerated Tensor method. Both algorithms allow for approximate solutions of
their auxiliary subproblems, with verifiable conditions on the accuracy of the
solution.