5,711 research outputs found

    A MapReduce-based nearest neighbor approach for big-data-driven traffic flow prediction

    In big-data-driven traffic flow prediction systems, the robustness of prediction performance depends on both accuracy and timeliness. This paper presents a new MapReduce-based nearest neighbor (NN) approach for traffic flow prediction using correlation analysis (TFPC) on a Hadoop platform. In particular, we develop a real-time prediction system comprising two key modules, i.e., offline distributed training (ODT) and online parallel prediction (OPP). Moreover, we build a parallel k-nearest neighbor optimization classifier, which incorporates correlation information among traffic flows into the classification process. Finally, we propose a novel prediction calculation method that combines the current data observed in OPP with the classification results obtained from large-scale historical data in ODT to generate traffic flow predictions in real time. An empirical study on real-world traffic flow big data using leave-one-out cross validation shows that TFPC significantly outperforms four state-of-the-art prediction approaches, i.e., autoregressive integrated moving average, naïve Bayes, multilayer perceptron neural networks, and NN regression, in terms of accuracy, which improves by up to 90.07% in the best case, with an average mean absolute percent error of 5.53%. In addition, it displays excellent speedup, scaleup, and sizeup.
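    The distributed TFPC system itself runs on Hadoop, but the core idea of combining correlation analysis with nearest-neighbor matching can be sketched in a few lines. The following single-machine Python sketch is an illustration only; the window length, weighting scheme, and function names are assumptions, not the authors' implementation.

```python
# Illustrative correlation-weighted k-nearest-neighbor flow predictor.
# A simplified, single-machine sketch of the idea, not the distributed
# MapReduce implementation described in the paper.
import numpy as np

def knn_flow_predict(history, current, k=5):
    """history: (n_samples, w + 1) array; the first w columns are a window
    of past flow readings, the last column is the value that followed.
    current: (w,) array holding the most recent w readings."""
    windows, targets = history[:, :-1], history[:, -1]

    # Correlation between the current window and each historical window
    # mimics the correlation-analysis step that weights the neighbors.
    corr = np.array([np.corrcoef(current, w)[0, 1] for w in windows])
    corr = np.nan_to_num(corr)

    # Euclidean distance selects the k nearest historical windows.
    nearest = np.argsort(np.linalg.norm(windows - current, axis=1))[:k]

    # Average the neighbors' next values, weighted by non-negative correlation.
    weights = np.clip(corr[nearest], 0.0, None) + 1e-9
    return float(np.average(targets[nearest], weights=weights))

# Toy usage: 200 historical windows of length 6 plus the value that followed.
rng = np.random.default_rng(0)
hist = rng.random((200, 7)) * 100
print(knn_flow_predict(hist, hist[0, :-1], k=5))
```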

    Adaptive traffic lights based on traffic flow prediction using machine learning models

    Traffic congestion prediction is one of the essential components of intelligent transport systems (ITS). This is due to the rapid growth of population and, consequently, the high number of vehicles in cities. Nowadays, the problem of traffic congestion attracts more and more attention from researchers in the field of ITS. Traffic congestion can be predicted in advance by analyzing traffic flow data. In this article, we used machine learning algorithms such as linear regression, random forest regressor, decision tree regressor, gradient boosting regressor, and k-nearest neighbors regressor to predict traffic flow and reduce traffic congestion at intersections. We tested our models on the public UK national road traffic dataset. All machine learning algorithms obtained good performance metrics, indicating that they are suitable for implementation in smart traffic light systems. Next, we implemented an adaptive traffic light system based on a random forest regressor model, which adjusts the timing of green and red lights depending on the road width, traffic density, types of vehicles, and expected traffic. Simulations of the proposed system show a 30.8% reduction in traffic congestion, demonstrating its effectiveness and the value of deploying it to regulate signaling at intersections.
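    As a rough illustration of the pipeline described above, the sketch below trains a random forest regressor on synthetic road features and maps its flow prediction to a green-phase duration. The feature set, synthetic data, and timing rule are assumptions for illustration; they are not the authors' dataset or control logic.

```python
# Hedged sketch: random forest flow prediction feeding a simple
# (assumed) green-light timing rule. Uses synthetic data only.
import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.metrics import mean_absolute_error
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(42)
n = 1000
# Hypothetical features: road width (m), traffic density, share of heavy
# vehicles, hour of day.
X = np.column_stack([
    rng.uniform(5, 20, n),     # road width
    rng.uniform(0, 1, n),      # density
    rng.uniform(0, 0.3, n),    # heavy-vehicle share
    rng.integers(0, 24, n),    # hour of day
])
y = 50 + 5 * X[:, 0] + 30 * X[:, 1] + rng.normal(0, 5, n)  # synthetic flow

X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
model = RandomForestRegressor(n_estimators=100, random_state=0).fit(X_tr, y_tr)
print("MAE:", mean_absolute_error(y_te, model.predict(X_te)))

def green_duration(features, base=20.0, max_green=90.0):
    """Scale the green phase with predicted flow (an assumed control rule)."""
    flow = model.predict(np.asarray(features).reshape(1, -1))[0]
    return float(np.clip(base + 0.4 * flow, base, max_green))

print("Green phase (s):", green_duration(X_te[0]))
```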

    DeepTransport: Learning Spatial-Temporal Dependency for Traffic Condition Forecasting

    Predicting traffic conditions has recently been explored as a way to relieve traffic congestion. Several pioneering approaches have been proposed based on traffic observations of the target location as well as its adjacent regions, but they achieve somewhat limited accuracy because they do not mine the road topology. To address the effect attenuation problem, we propose to take into account the traffic of surrounding locations (a wider range than the adjacent one). We propose an end-to-end framework called DeepTransport, in which Convolutional Neural Networks (CNN) and Recurrent Neural Networks (RNN) are utilized to obtain spatial-temporal traffic information within a transport network topology. In addition, an attention mechanism is introduced to align spatial and temporal information. Moreover, we constructed and released a large real-world traffic condition dataset with a 5-minute resolution. Our experiments on this dataset demonstrate that our method captures the complex relationships in the temporal and spatial domains. It significantly outperforms traditional statistical methods and a state-of-the-art deep learning method.
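    To make the CNN-plus-RNN-plus-attention idea concrete, here is a compact PyTorch sketch of a model in that spirit: a 1-D convolution encodes readings from surrounding locations at each time step, an LSTM models the temporal sequence, and a simple attention layer pools its outputs. The layer sizes and names are assumptions; this is not the DeepTransport architecture or code.

```python
# Illustrative spatial-temporal model: CNN over locations, LSTM over time,
# attention pooling over time steps. Not the authors' implementation.
import torch
import torch.nn as nn

class SpatialTemporalNet(nn.Module):
    def __init__(self, n_locations, conv_channels=16, hidden=32):
        super().__init__()
        # Spatial encoder: 1-D convolution over the surrounding locations.
        self.conv = nn.Conv1d(1, conv_channels, kernel_size=3, padding=1)
        self.rnn = nn.LSTM(conv_channels * n_locations, hidden, batch_first=True)
        self.attn = nn.Linear(hidden, 1)   # scores each time step
        self.out = nn.Linear(hidden, 1)    # predicts the next condition

    def forward(self, x):
        # x: (batch, time, locations) traffic readings
        b, t, l = x.shape
        s = self.conv(x.reshape(b * t, 1, l)).reshape(b, t, -1)  # spatial features
        h, _ = self.rnn(s)                                       # temporal features
        w = torch.softmax(self.attn(h), dim=1)                   # attention weights
        context = (w * h).sum(dim=1)                             # weighted pooling
        return self.out(context).squeeze(-1)

# Toy forward pass: batch of 4 sequences, 12 time steps, 9 locations.
model = SpatialTemporalNet(n_locations=9)
print(model(torch.randn(4, 12, 9)).shape)   # torch.Size([4])
```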

    An Overview on Application of Machine Learning Techniques in Optical Networks

    Today's telecommunication networks have become sources of enormous amounts of widely heterogeneous data. This information can be retrieved from network traffic traces, network alarms, signal quality indicators, users' behavioral data, etc. Advanced mathematical tools are required to extract meaningful information from these data and to make decisions pertaining to the proper functioning of the networks. Among these mathematical tools, Machine Learning (ML) is regarded as one of the most promising methodological approaches to performing network-data analysis and enabling automated network self-configuration and fault management. The adoption of ML techniques in the field of optical communication networks is motivated by the unprecedented growth of network complexity faced by optical networks in the last few years. This increase in complexity is due to the introduction of a huge number of adjustable and interdependent system parameters (e.g., routing configurations, modulation format, symbol rate, coding schemes, etc.) enabled by the use of coherent transmission/reception technologies, advanced digital signal processing, and compensation of nonlinear effects in optical fiber propagation. In this paper, we provide an overview of the application of ML to optical communications and networking. We classify and survey the relevant literature on the topic, and we also provide an introductory tutorial on ML for researchers and practitioners interested in this field. Although a good number of research papers have recently appeared, the application of ML to optical networks is still in its infancy: to stimulate further work in this area, we conclude the paper by proposing possible new research directions.

    A survey of machine learning techniques applied to self organizing cellular networks

    In this paper, a survey of the literature of the past fifteen years involving Machine Learning (ML) algorithms applied to self-organizing cellular networks is performed. In order for future networks to overcome the limitations and address the issues of current cellular systems, it is clear that more intelligence needs to be deployed so that a fully autonomous and flexible network can be enabled. This paper focuses on the learning perspective of Self-Organizing Networks (SON) solutions and provides not only an overview of the most common ML techniques encountered in cellular networks, but also classifies each paper in terms of its learning solution, giving examples along the way. The authors also classify each paper in terms of its self-organizing use case and discuss how each proposed solution performed. In addition, a comparison of the most commonly found ML algorithms in terms of certain SON metrics is performed, and general guidelines on when to choose each ML algorithm for each SON function are proposed. Lastly, this work also outlines future research directions and new paradigms that the use of more robust and intelligent algorithms, together with data gathered by operators, can bring to the cellular networks domain, fully enabling the concept of SON in the near future.

    A Taxonomy of Traffic Forecasting Regression Problems From a Supervised Learning Perspective

    One contemporary policy for dealing with traffic congestion is the design and implementation of forecasting methods that allow users to plan ahead of time and decision makers to improve traffic management. Current data availability and growing computational capacities have increased the use of machine learning (ML) to address traffic prediction, which is mostly modeled as a supervised regression problem. Although some studies have presented taxonomies to sort the literature in this field, they are mostly oriented toward classifying the applied ML methods, and little effort has been directed to categorizing the traffic forecasting problems they approach. As far as we know, there is no comprehensive taxonomy that classifies these problems from the point of view of both traffic and ML. In this paper, we propose a taxonomy to categorize the aforementioned problems from both a traffic and a supervised regression learning perspective. The taxonomy aims to unify and consolidate categorization criteria related to traffic, and it introduces new criteria to classify the problems in terms of how they are modeled from a supervised regression approach. The traffic forecasting literature from 2000 to 2019 is categorized using this taxonomy to illustrate its descriptive power. From this categorization, several remarks are discussed regarding current gaps and trends in the traffic forecasting area.
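    The supervised-regression framing that the taxonomy builds on can be illustrated with a small sketch: a traffic series is windowed into lagged feature vectors (the inputs) and a value a few steps ahead (the target), after which any regression learner applies. The window length, horizon, and names below are illustrative assumptions.

```python
# Turning a traffic time series into a supervised regression dataset
# with lag features; an illustrative framing, not any paper's code.
import numpy as np

def to_supervised(series, lags=6, horizon=1):
    """Return (X, y) where each row of X holds `lags` past readings and
    y holds the reading `horizon` steps after that window."""
    X, y = [], []
    for i in range(len(series) - lags - horizon + 1):
        X.append(series[i:i + lags])
        y.append(series[i + lags + horizon - 1])
    return np.array(X), np.array(y)

flow = np.sin(np.linspace(0, 20, 300)) * 40 + 60   # synthetic flow series
X, y = to_supervised(flow, lags=6, horizon=3)
print(X.shape, y.shape)   # (292, 6) (292,)
```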

    Thirty Years of Machine Learning: The Road to Pareto-Optimal Wireless Networks

    Future wireless networks have substantial potential for supporting a broad range of complex and compelling applications in both military and civilian fields, where users are able to enjoy high-rate, low-latency, low-cost, and reliable information services. Achieving this ambitious goal requires new radio techniques for adaptive learning and intelligent decision making because of the complex heterogeneous nature of the network structures and wireless services. Machine learning (ML) algorithms have achieved great success in supporting big data analytics, efficient parameter estimation, and interactive decision making. Hence, in this article, we review the thirty-year history of ML by elaborating on supervised learning, unsupervised learning, reinforcement learning, and deep learning. Furthermore, we investigate their employment in compelling applications of wireless networks, including heterogeneous networks (HetNets), cognitive radios (CR), the Internet of things (IoT), machine-to-machine (M2M) networks, and so on. This article aims to assist readers in clarifying the motivation and methodology of the various ML algorithms, so as to invoke them for hitherto unexplored services as well as scenarios of future wireless networks.