In the era of big data, detecting outliers in time series data is crucial, particularly in fields such as finance and engineering. This article proposes a novel sequence outlier detection method based on the gated recurrent unit autoencoder with Gaussian mixture model (GRU-AE-GMM), which combines a gated recurrent unit (GRU), an autoencoder (AE), a Gaussian mixture model (GMM), and optimization algorithms. The GRU captures long-term dependencies within the sequence, while the AE measures sequence abnormality. Meanwhile, the GMM models the relationship between the original and reconstructed sequences, employing the Expectation–Maximization (EM) algorithm for parameter estimation to calculate the likelihood of each hidden variable belonging to each Gaussian mixture component. In this article, we first train the model with the mean-squared error loss (MSEL) and then enhance it by substituting the quantile loss (QL), composite quantile loss (CQL), and Huber loss (HL), respectively. Next, we validate the effectiveness and robustness of the proposed model through Monte Carlo experiments conducted under different error terms. Finally, the method is applied to Amazon stock data for 2022, demonstrating its significant potential for application in dynamic and unpredictable market environments.
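The four training losses named in the abstract (MSEL, QL, CQL, HL) have standard textbook definitions. The following is a minimal NumPy sketch of those definitions, not the paper's implementation; the function names, default quantile levels, and Huber delta are illustrative choices:

```python
import numpy as np

def mse_loss(y, y_hat):
    """Mean-squared error between original and reconstructed sequences."""
    return np.mean((y - y_hat) ** 2)

def quantile_loss(y, y_hat, tau=0.5):
    """Quantile (pinball) loss at level tau; penalizes under- and
    over-reconstruction asymmetrically."""
    e = y - y_hat
    return np.mean(np.maximum(tau * e, (tau - 1.0) * e))

def composite_quantile_loss(y, y_hat, taus=(0.1, 0.3, 0.5, 0.7, 0.9)):
    """Composite quantile loss: average of pinball losses over several
    levels (the levels here are an illustrative choice)."""
    return np.mean([quantile_loss(y, y_hat, t) for t in taus])

def huber_loss(y, y_hat, delta=1.0):
    """Huber loss: quadratic for small errors, linear in the tails,
    which makes it robust to outlying residuals."""
    e = np.abs(y - y_hat)
    clipped = np.minimum(e, delta)
    return np.mean(0.5 * clipped ** 2 + delta * (e - clipped))
```

At tau = 0.5 the quantile loss reduces to half the mean absolute error, which is why moving from MSEL to QL/CQL/HL tends to down-weight the influence of extreme residuals during training.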