5,101 research outputs found

    FC-GAGA: Fully Connected Gated Graph Architecture for Spatio-Temporal Traffic Forecasting

    Forecasting multivariate time series is an important problem with applications in traffic management, cellular network configuration, and quantitative finance. A special case of the problem arises when a graph capturing the relationships between the time series is available. In this paper, we propose a novel learning architecture that achieves performance competitive with or better than the best existing algorithms, without requiring knowledge of the graph. The key element of our proposed architecture is a learnable fully connected hard graph gating mechanism that enables the use of a state-of-the-art, highly computationally efficient fully connected time-series forecasting architecture in traffic forecasting applications. Experimental results on two public traffic network datasets illustrate the value of our approach, and ablation studies confirm the importance of each element of the architecture. The code is available here: https://github.com/boreshkinai/fc-gaga
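    The central idea described in the abstract, a learnable hard gate over an N x N node-mixing matrix feeding a fully connected forecaster, can be sketched roughly as below. This is an illustrative PyTorch toy under assumed tensor shapes, with hypothetical class names (HardGraphGate, GatedFCForecaster); it is not the authors' FC-GAGA implementation, which is available in the linked repository.

```python
# Illustrative sketch only (hypothetical names and shapes); see
# https://github.com/boreshkinai/fc-gaga for the actual FC-GAGA code.
import torch
import torch.nn as nn

class HardGraphGate(nn.Module):
    """Learnable N x N mixing matrix; ReLU zeroes (hard-gates) negative entries."""
    def __init__(self, num_nodes: int):
        super().__init__()
        self.weight = nn.Parameter(torch.randn(num_nodes, num_nodes) * 0.01)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, num_nodes, history) -- per-node history windows
        gate = torch.relu(self.weight)                 # exact zeros block node pairs
        return torch.einsum("ij,bjh->bih", gate, x)    # mix information across nodes

class GatedFCForecaster(nn.Module):
    """Graph-gated cross-node mixing followed by a fully connected forecaster."""
    def __init__(self, num_nodes: int, history: int, horizon: int):
        super().__init__()
        self.gate = HardGraphGate(num_nodes)
        self.mlp = nn.Sequential(
            nn.Linear(2 * history, 128), nn.ReLU(), nn.Linear(128, horizon)
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        mixed = self.gate(x)                                  # gated neighbour signal
        return self.mlp(torch.cat([x, mixed], dim=-1))        # per-node forecasts

model = GatedFCForecaster(num_nodes=207, history=12, horizon=12)
out = model(torch.randn(8, 207, 12))                          # -> (8, 207, 12)
```

    The point of the ReLU in the sketch is that it produces exact zeros in the learned mixing matrix, so uninformative node-to-node connections are switched off rather than merely down-weighted, which is the intuition behind a "hard" graph gate learned without a given graph.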

    Long Term Predictive Modeling on Big Spatio-Temporal Data

    In the era of massive data, one of the most promising research directions involves analyzing large-scale spatio-temporal databases to discover previously unknown but potentially useful patterns from data collected over time and space. A modeling process in this domain must take temporal and spatial correlations into account, but as the dimensionality of the time and space measurements increases, the number of elements potentially contributing to a target grows sharply, making the target's long-term behavior complex, chaotic, highly dynamic, and hard to predict. This work therefore addresses two considerations: how to identify the most relevant and meaningful features from the original spatio-temporal feature space, and how to model complex space-time dynamics with sensitive dependence on initial and boundary conditions.

    First, identifying strongly related features and removing irrelevant or less important features with respect to a target from large-scale spatio-temporal data sets is a critical and challenging issue in many fields, including tracing the evolutionary history of crime hot spots, uncovering weather patterns, predicting floods, earthquakes, and hurricanes, and determining global warming trends. The optimal sub-feature-set that contains all the valuable information is called the Markov boundary. Unfortunately, existing feature selection methods often focus on identifying a single Markov boundary, when real-world data may contain many feature subsets that are equally good boundaries. In our work, we design a new multiple-Markov-boundary-based predictive model, Galaxy, to identify the precursors to heavy precipitation event clusters and predict heavy rainfall with a long lead time. We applied Galaxy to an extremely high-dimensional meteorological data set and determined 15 Markov boundaries related to heavy rainfall events in the Des Moines River Basin in Iowa. Our model identified cold surges along the coast of Asia as an essential precursor to surface weather over the United States, a finding later corroborated by climate experts.

    Second, chaotic behavior exists in many nonlinear spatio-temporal systems, such as climate dynamics, weather prediction, and the space-time dynamics of virus spread. A reliable solution for these systems must handle their complex space-time dynamics and sensitive dependence on initial and boundary conditions. The hierarchical feature learning capabilities of deep neural networks in both the spatial and temporal domains are helpful for modeling nonlinear spatio-temporal dynamics; however, sensitive dependence on initial and boundary conditions remains challenging for theoretical research and for many critical applications. This study proposes a new recurrent architecture, error trajectory tracing, and an accompanying training regime, Horizon Forcing, for prediction in chaotic systems. These methods are validated on real-world spatio-temporal data sets, including one meteorological dataset, three classic chaotic systems, and four real-world time series prediction tasks with chaotic characteristics. Experimental results show that each proposed model outperforms current baseline approaches.
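    For readers unfamiliar with the Markov boundary concept mentioned above, the following is a minimal grow/shrink feature-selection sketch using a partial-correlation conditional-independence test under Gaussian/linear assumptions. The helper names (cond_independent, markov_boundary) are hypothetical, and this is a generic single-boundary illustration, not the dissertation's multiple-boundary Galaxy model or its Horizon Forcing training regime.

```python
# Minimal sketch of Markov-boundary-style feature selection (grow/shrink with a
# partial-correlation independence test). Generic illustration only, under
# Gaussian/linear assumptions; NOT the dissertation's multi-boundary Galaxy model.
import numpy as np
from scipy import stats

def _residual(y, Z):
    """Residual of y after regressing out the columns of Z (empty Z -> centered y)."""
    if Z.shape[1] == 0:
        return y - y.mean()
    design = np.c_[np.ones(len(Z)), Z]
    coef, *_ = np.linalg.lstsq(design, y, rcond=None)
    return y - design @ coef

def cond_independent(x, y, Z, alpha=0.05):
    """Test whether x is independent of y given Z via correlation of residuals."""
    r, p = stats.pearsonr(_residual(x, Z), _residual(y, Z))
    return p > alpha

def markov_boundary(X, y, alpha=0.05):
    selected = []
    # Grow phase: keep features that stay dependent on the target given the rest.
    for j in range(X.shape[1]):
        if not cond_independent(X[:, j], y, X[:, selected], alpha):
            selected.append(j)
    # Shrink phase: drop features made redundant by the others.
    for j in list(selected):
        rest = [k for k in selected if k != j]
        if cond_independent(X[:, j], y, X[:, rest], alpha):
            selected.remove(j)
    return selected

rng = np.random.default_rng(0)
X = rng.normal(size=(500, 10))
y = 2 * X[:, 1] - X[:, 4] + 0.1 * rng.normal(size=500)
print(markov_boundary(X, y))   # expected to recover columns 1 and 4
```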

    Data science for buildings, a multi-scale approach bridging occupants to smart-city energy planning
