Stochastic gradient descent (SGD) is a scalable and memory-efficient
optimization algorithm for large datasets and streaming data, and it has
attracted a great deal of attention. SGD-based estimators have also been
applied with great success to statistical inference tasks such as interval
estimation. However, most existing work assumes i.i.d. observations or Markov
chains; how to conduct valid statistical inference when the observations come
from a mixing time series remains unexplored. Indeed, the general correlation
among observations poses a challenge for interval estimation: existing methods
that ignore this correlation can produce invalid confidence intervals. In this
paper, we propose a mini-batch
SGD estimator for statistical inference when the data is $\phi$-mixing. The
confidence intervals are constructed using an associated mini-batch bootstrap
SGD procedure. Using the ``independent block'' technique from \cite{yu1994rates}, we
show that the proposed estimator is asymptotically normal, and its limiting
distribution can be effectively approximated by the bootstrap procedure. The
proposed method is memory-efficient and easy to implement in practice.
Simulation studies on synthetic data and an application to a real-world dataset
confirm our theory.
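As a minimal sketch of the type of recursion involved, under purely
illustrative notation (the symbols $\theta_k$, $\eta_k$, the mini-batch $B_k$
of size $m$, the loss $\ell$, and the multiplier weights $W_i$ are our
assumptions, not definitions from this paper), a mini-batch SGD iterate takes
the form
\[
\theta_{k+1} = \theta_k - \frac{\eta_k}{m} \sum_{i \in B_k} \nabla \ell(\theta_k; x_i),
\]
while a companion bootstrap iterate perturbs each gradient with i.i.d. random
weights $W_i$ of mean one,
\[
\theta^{*}_{k+1} = \theta^{*}_k - \frac{\eta_k}{m} \sum_{i \in B_k} W_i \, \nabla \ell(\theta^{*}_k; x_i),
\]
so that the empirical quantiles of the bootstrap iterates can approximate the
sampling distribution of the estimator and yield confidence intervals.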