
    Short-Term Forecasting of Passenger Demand under On-Demand Ride Services: A Spatio-Temporal Deep Learning Approach

    Short-term passenger demand forecasting is of great importance to on-demand ride service platforms, which can incentivize vacant cars to move from over-supplied regions to over-demanded regions. However, the spatial, temporal, and exogenous dependencies need to be considered simultaneously, which makes short-term passenger demand forecasting challenging. We propose a novel deep learning (DL) approach, named the fusion convolutional long short-term memory network (FCL-Net), to address these three dependencies within one end-to-end learning architecture. The model is stacked and fused from multiple convolutional long short-term memory (LSTM) layers, standard LSTM layers, and convolutional layers. The fusion of convolutional techniques and the LSTM network enables the proposed DL approach to better capture the spatio-temporal characteristics and correlations of the explanatory variables. A tailored spatially aggregated random forest is employed to rank the importance of the explanatory variables, and the ranking is then used for feature selection. The proposed DL approach is applied to short-term forecasting of passenger demand on an on-demand ride service platform in Hangzhou, China. Experimental results, validated on real-world data provided by DiDi Chuxing, show that the FCL-Net achieves better predictive performance than traditional approaches, including both classical time-series prediction models and neural-network-based algorithms (e.g., artificial neural networks and LSTM). This paper is one of the first DL studies to forecast the short-term passenger demand of an on-demand ride service platform by examining the spatio-temporal correlations. (39 pages, 10 figures)
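
    The sketch below illustrates the fusion idea described in the abstract: one ConvLSTM branch over a grid of past demand maps and one LSTM branch over exogenous time series, fused and mapped to the next-interval demand map. It is a minimal Keras sketch under assumed layer sizes, grid resolution, and number of exogenous variables, not the architecture or hyperparameters used in the paper.

```python
# Minimal Keras sketch of an FCL-Net-style fusion (all sizes are illustrative assumptions).
import tensorflow as tf
from tensorflow.keras import layers, Model

steps, H, W, n_exog = 8, 20, 20, 6

demand_in = layers.Input(shape=(steps, H, W, 1))          # past demand intensity maps
exog_in = layers.Input(shape=(steps, n_exog))             # exogenous time series (e.g., weather)

x = layers.ConvLSTM2D(16, (3, 3), padding="same", return_sequences=True)(demand_in)
x = layers.ConvLSTM2D(16, (3, 3), padding="same")(x)      # spatio-temporal features of demand
x = layers.Conv2D(8, (3, 3), padding="same", activation="relu")(x)

e = layers.LSTM(32)(exog_in)                               # temporal features of exogenous variables
e = layers.Dense(H * W)(e)
e = layers.Reshape((H, W, 1))(e)                           # broadcast back onto the spatial grid

fused = layers.Concatenate(axis=-1)([x, e])                # fuse the two branches
out = layers.Conv2D(1, (1, 1), activation="relu")(fused)   # next-interval demand map

model = Model([demand_in, exog_in], out)
model.compile(optimizer="adam", loss="mse")
model.summary()
```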

    A Study on an Attention-Based LSTM Model for Multivariate Time-Series Prediction

    Master's thesis, Seoul National University Graduate School, College of Engineering, Department of Computer Science and Engineering, August 2019 (advisor: Kang, U.). Given previous observations of a multivariate time series, how can we accurately predict its value several steps ahead? With the continuous development of sensor and computer systems, time-series prediction techniques are playing increasingly important roles in various fields, such as finance, energy, and traffic. Many models have been proposed for time-series prediction tasks, such as the autoregressive model, the vector autoregressive model, and recurrent neural networks (RNNs). However, these models still have limitations, such as failing to model non-linearity and long-term dependencies in time series. Among the proposed approaches, the Temporal Pattern Attention (TPA) model, an attention-based LSTM model, achieves state-of-the-art performance on several real-world multivariate time-series datasets. In this thesis, we study three factors that affect the prediction performance of the TPA model: the recurrent neural network (RNN) layer, the attention mechanism, and the convolutional neural network (CNN) for temporal pattern detection. For the recurrent layer, we implement bi-directional LSTMs that can extract information from the input sequence in both the forward and backward directions. In addition, we design two attention mechanisms, each of which assigns attention weights in a different direction, and study the effect of both on the TPA model. Finally, to validate the CNN for temporal pattern detection, we implement a TPA model without the CNN. We test all of these factors on several real-world time-series datasets from different fields. The experimental results indicate the validity of these factors.
    Contents: I. Introduction; II. Preliminaries (2.1 Long Short-Term Memory; 2.2 Typical Attention Mechanism; 2.3 Temporal Pattern Attention Model); III. Study on the Temporal Pattern Attention Model (3.1 Problem Definition; 3.2 Overview; 3.3 Recurrent Neural Network Layer; 3.4 Vertical vs. Horizontal Attention Mechanism; 3.5 Temporal Pattern Attention Model without CNN); IV. Experiments (4.1 Experimental Setup; 4.2 Performance Comparison (Q1); 4.3 Effects of Bi-directional LSTM (Q2); 4.4 Effects of CNN for Temporal Pattern Detection (Q3); 4.5 Which Attention Direction Is Better (Q4)); V. Related Works; VI. Conclusion; References.
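
    The following is a minimal Keras sketch of a TPA-style forecaster combining the three ingredients studied in the thesis: a bi-directional LSTM encoder, a Conv1D layer standing in for the temporal-pattern CNN, and attention between the last hidden state and the CNN features. The dot-product attention used here is a simplification and does not reproduce the thesis' vertical/horizontal attention variants; window size, units, and attention dimension are assumed values.

```python
# Minimal Keras sketch of a TPA-style model (dot-product attention is a simplification).
import numpy as np
import tensorflow as tf
from tensorflow.keras import layers, Model

window, n_series, units, att_dim = 24, 8, 32, 32

inp = layers.Input(shape=(window, n_series))                   # past observations of all series
h = layers.Bidirectional(layers.LSTM(units, return_sequences=True))(inp)
patterns = layers.Conv1D(att_dim, kernel_size=3, padding="causal",
                         activation="relu")(h)                 # temporal-pattern features
last = layers.Lambda(lambda t: t[:, -1:, :])(h)                # last hidden state
query = layers.Dense(att_dim)(last)                            # project query to attention space
context = layers.Attention()([query, patterns])                # attend over the time steps
merged = layers.Flatten()(layers.Concatenate(axis=-1)([query, context]))
out = layers.Dense(n_series)(merged)                           # one-step-ahead forecast
model = Model(inp, out)
model.compile(optimizer="adam", loss="mse")

# toy data only to check that shapes are consistent
X = np.random.rand(16, window, n_series).astype("float32")
y = np.random.rand(16, n_series).astype("float32")
model.fit(X, y, epochs=1, verbose=0)
```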

    Using Causality-Aware Graph Neural Networks to Predict Temporal Centralities in Dynamic Graphs

    Node centralities play a pivotal role in network science, social network analysis, and recommender systems. In temporal data, static path-based centralities like closeness or betweenness can give misleading results about the true importance of nodes in a temporal graph. To address this issue, temporal generalizations of betweenness and closeness have been defined that are based on the shortest time-respecting paths between pairs of nodes. However, a major issue of these generalizations is that the calculation of such paths is computationally expensive. Addressing this issue, we study the application of De Bruijn Graph Neural Networks (DBGNN), a causality-aware graph neural network architecture, to predict temporal path-based centralities in time series data. We experimentally evaluate our approach on 13 temporal graphs from biological and social systems and show that it considerably improves the prediction of both betweenness and closeness centrality compared to a static Graph Convolutional Neural Network.
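
    To make the target quantity concrete, the pure-Python sketch below computes a temporal closeness score from earliest-arrival, time-respecting paths; this is the kind of expensive path-based centrality the DBGNN is trained to predict, not the DBGNN model itself. The toy edge list, the fixed start time of 0, the harmonic-closeness definition, and the single time-ordered edge scan (valid when timestamps along a path strictly increase) are all simplifying assumptions; the paper's exact definitions may differ.

```python
# Sketch: temporal closeness from earliest-arrival time-respecting paths (toy data).
from collections import defaultdict
import math

edges = [("a", "b", 1), ("b", "c", 2), ("a", "c", 5),
         ("c", "d", 6), ("b", "d", 3), ("d", "a", 7)]      # (source, target, timestamp)
nodes = {u for u, v, t in edges} | {v for u, v, t in edges}

def earliest_arrival(source, edges):
    """Earliest arrival time at every node when starting at `source` at time 0.
    An edge (u, v, t) is usable only if we have reached u no later than t."""
    arrival = defaultdict(lambda: math.inf, {source: 0})
    for u, v, t in sorted(edges, key=lambda e: e[2]):      # scan edges in time order
        if arrival[u] <= t and t < arrival[v]:
            arrival[v] = t
    return arrival

def temporal_closeness(node, edges, nodes):
    """Harmonic closeness: sum of inverse earliest-arrival times to all other nodes."""
    arrival = earliest_arrival(node, edges)
    return sum(1.0 / arrival[v] for v in nodes
               if v != node and 0 < arrival[v] < math.inf)

for v in sorted(nodes):
    print(v, round(temporal_closeness(v, edges, nodes), 3))
```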

    Spatiotemporal convolutional network for time-series prediction and causal inference

    Making predictions in a robust way is not easy for nonlinear systems. In this work, a neural network computing framework, namely a spatiotemporal convolutional network (STCN), was developed to efficiently and accurately render multistep-ahead predictions of a time series by employing a spatial-temporal information (STI) transformation. The STCN combines the advantages of both the temporal convolutional network (TCN) and the STI equation, which maps high-dimensional/spatial data to the future temporal values of a target variable, thus naturally providing the prediction of the target variable. From the observed variables, the STCN also infers the causal factors of the target variable in the sense of Granger causality, which are in turn selected as effective spatial information to improve the prediction robustness. The STCN was successfully applied to both benchmark systems and real-world datasets, on all of which it showed superior and robust performance in multistep-ahead prediction, even when the data were perturbed by noise. From both theoretical and computational viewpoints, the STCN has great potential for practical applications in artificial intelligence (AI) and machine learning as a model-free method based only on observed data, and it also opens a new way to explore observed high-dimensional data in a dynamical manner for machine learning. (23 pages, 6 figures)
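
    The sketch below shows only the temporal-convolution part of such a predictor: a stack of dilated causal Conv1D layers mapping a window of observed variables to several future values of one target variable. It is a minimal Keras sketch under assumed window length, variable count, and layer sizes; the STI transformation and the Granger-causality-based variable selection described in the abstract are not implemented here.

```python
# Minimal Keras sketch of a dilated causal TCN for multistep-ahead prediction (sizes assumed).
import tensorflow as tf
from tensorflow.keras import layers, Model

window, n_vars, horizon = 64, 10, 12    # input length, observed variables, steps ahead

inp = layers.Input(shape=(window, n_vars))
x = inp
for d in (1, 2, 4, 8):                  # growing dilation = growing receptive field
    x = layers.Conv1D(32, kernel_size=3, padding="causal",
                      dilation_rate=d, activation="relu")(x)
x = layers.GlobalAveragePooling1D()(x)
out = layers.Dense(horizon)(x)          # multistep-ahead forecast of the target variable

model = Model(inp, out)
model.compile(optimizer="adam", loss="mse")
model.summary()
```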