11,717 research outputs found

    Assessing the Impact of Game Day Schedule and Opponents on Travel Patterns and Route Choice using Big Data Analytics

    Get PDF
    The transportation system is crucial for transferring people and goods from point A to point B. However, its reliability can be degraded by unanticipated congestion resulting from planned special events. For example, sporting events draw large crowds to specific venues on game days and disrupt normal traffic patterns. The goal of this study was to understand road traffic management during major sporting events by using widely available INRIX data to compare travel patterns and behaviors on game days against those on normal days. A comprehensive analysis was conducted of the impact of all Nebraska Cornhuskers football games over five years on traffic congestion along five major routes in Nebraska. We attempted to identify hotspots: unusually high-risk zones of traffic congestion in spatiotemporal space that occur on almost all game days. For hotspot detection, we utilized Multi-EigenSpot, a method able to detect multiple hotspots in a spatiotemporal space, and with this algorithm we detected traffic hotspot clusters on the five chosen routes in Nebraska. After detecting the hotspots, we identified the factors affecting hotspot sizes and other parameters. The start time of the game and the Cornhuskers’ opponent for a given game are two important factors affecting the number of people coming to Lincoln, Nebraska, on game days. Finally, a Dynamic Bayesian Network (DBN) approach was applied to forecast the start times and locations of hotspot clusters in 2018 with a weighted mean absolute percentage error (WMAPE) of 13.8%.
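    The abstract's hotspot notion (contiguous regions of congested cells in a time-by-segment grid) can be illustrated with a much simpler stand-in than Multi-EigenSpot, which is a spectral method. The sketch below only thresholds speeds against free flow and groups contiguous flagged cells by flood fill; the grid, free-flow speed, and threshold ratio are illustrative assumptions.

```python
# Simplified stand-in for spatiotemporal hotspot detection (NOT the paper's
# Multi-EigenSpot method): flag grid cells whose speed drops well below
# free-flow, then group contiguous flagged cells into clusters.

def find_hotspots(speed_grid, free_flow=65.0, ratio=0.5):
    """speed_grid[t][s]: mean speed at time bin t, road segment s.
    Returns clusters of congested (t, s) cells found by flood fill."""
    rows, cols = len(speed_grid), len(speed_grid[0])
    congested = {(t, s) for t in range(rows) for s in range(cols)
                 if speed_grid[t][s] < ratio * free_flow}
    clusters, seen = [], set()
    for cell in congested:
        if cell in seen:
            continue
        stack, cluster = [cell], []
        while stack:
            t, s = stack.pop()
            if (t, s) in seen or (t, s) not in congested:
                continue
            seen.add((t, s))
            cluster.append((t, s))
            stack += [(t + 1, s), (t - 1, s), (t, s + 1), (t, s - 1)]
        clusters.append(sorted(cluster))
    return clusters

grid = [[60, 62, 61],     # hypothetical speeds (mph), rows = time bins
        [30, 28, 60],
        [25, 27, 59],
        [58, 61, 63]]
print(find_hotspots(grid))  # → [[(1, 0), (1, 1), (2, 0), (2, 1)]]
```

    A real detector must also handle recurring daily slowdowns and noise, which is where the eigenanalysis in the paper's method comes in.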

    Leveraging Personal Navigation Assistant Systems Using Automated Social Media Traffic Reporting

    Full text link
    Modern urbanization demands smarter technologies to improve applications in intelligent transportation systems and relieve the growing volume of vehicular traffic congestion and incidents. Existing incident detection techniques are limited to sensors in the transportation network and depend on human input. Despite its data abundance, social media is not well exploited in this context. In this paper, we develop an automated traffic alert system based on Natural Language Processing (NLP) that filters this flood of information and extracts important traffic-related items. To this end, we fine-tune the Bidirectional Encoder Representations from Transformers (BERT) language embedding model to filter traffic-related information from social media. We then apply a question-answering model to extract the information characterizing the reported event, such as its exact location, occurrence time, and nature. We demonstrate that the adopted NLP approaches outperform existing approaches and, after training them effectively, we focus on real-world situations and show how the developed approach can, in real time, extract traffic-related information and automatically convert it into alerts for navigation assistance applications such as navigation apps. Comment: This paper is accepted for publication in IEEE Technology Engineering Management Society International Conference (TEMSCON'20), Metro Detroit, Michigan (USA).
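    The two-stage pipeline described (relevance filtering, then field extraction) can be sketched without the heavy models: below, a keyword score stands in for the fine-tuned BERT classifier and regex patterns stand in for the question-answering extractor. All keyword lists, patterns, and the example tweet are illustrative assumptions, not the paper's.

```python
import re

# Hypothetical stand-in for the paper's pipeline: stage 1 filters
# traffic-related posts, stage 2 extracts rough location and time fields.

TRAFFIC_TERMS = {"accident", "crash", "congestion", "closed", "blocked", "jam"}

def is_traffic_related(tweet, min_hits=1):
    """Stage 1: keyword overlap as a cheap proxy for a BERT classifier."""
    words = set(re.findall(r"[a-z]+", tweet.lower()))
    return len(words & TRAFFIC_TERMS) >= min_hits

def extract_alert(tweet):
    """Stage 2: pull a rough location and time from a relevant tweet."""
    loc = re.search(r"\bon ([A-Z][\w-]*(?: [A-Z][\w-]*)*)", tweet)
    when = re.search(r"\b(\d{1,2}:\d{2}\s?(?:am|pm)?)", tweet, re.I)
    return {"location": loc.group(1) if loc else None,
            "time": when.group(1) if when else None}

tweet = "Crash blocked two lanes on I-94 East near downtown at 8:15 am"
if is_traffic_related(tweet):
    print(extract_alert(tweet))  # → {'location': 'I-94 East', 'time': '8:15 am'}
```

    The real system replaces both stages with learned models precisely because keyword and regex heuristics break on paraphrase, sarcasm, and ambiguous place names.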

    AXTAR: Mission Design Concept

    Full text link
    The Advanced X-ray Timing Array (AXTAR) is a mission concept for X-ray timing of compact objects that combines very large collecting area, broadband spectral coverage, high time resolution, highly flexible scheduling, and an ability to respond promptly to time-critical targets of opportunity. It is optimized for submillisecond timing of bright Galactic X-ray sources in order to study phenomena at the natural time scales of neutron star surfaces and black hole event horizons, thus probing the physics of ultradense matter, strongly curved spacetimes, and intense magnetic fields. AXTAR's main instrument, the Large Area Timing Array (LATA), is a collimated instrument with 2-50 keV coverage and over 3 square meters of effective area. The LATA is made up of an array of supermodules that house 2-mm-thick silicon pixel detectors. AXTAR will provide a significant improvement in effective area (a factor of 7 at 4 keV and a factor of 36 at 30 keV) over the RXTE PCA. AXTAR will also carry a sensitive Sky Monitor (SM) that acts as a trigger for pointed observations of X-ray transients in addition to providing high duty cycle monitoring of the X-ray sky. We review the science goals and technical concept for AXTAR and present results from a preliminary mission design study. Comment: 19 pages, 10 figures, to be published in Space Telescopes and Instrumentation 2010: Ultraviolet to Gamma Ray, Proceedings of SPIE Volume 773

    A data driven method for congestion mining using big data analytics

    Get PDF
    Congestion detection is one of the key steps to reduce delays and associated costs in traffic management. With the increasing usage of GPS-based navigation, rich speed data is now available. This study utilizes such extensive historical probe data to detect spatiotemporal congestion by mining historical speed data. The detected congestion events were further classified as Recurrent and Non-Recurrent Congestion (RC, NRC). This paper presents a big-data-driven expert system for identifying both recurrent and non-recurrent congestion and analyzing the delay and cost associated with them. For this purpose, normal and anomalous days were first classified based on travel rate distribution. We then utilized Bayesian change point detection to segment the speed signal and detect temporal congestion. Finally, according to the type of congestion, summary statistics and performance measures (delays, delay cost, and congestion hours) were analyzed. In this study, a statistical big data mining methodology is developed, and its robustness is tested on probe data for the 2016 calendar year in the Des Moines region, Iowa, US. The proposed framework is self-adaptive because it does not rely on additional information for detecting spatiotemporal congestion; it therefore addresses the limits of prior work in NRC detection. The optimum value for the congestion percentage threshold is identified by the elbow cut-off method, and speed values were temporally denoised.
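    The study segments speed signals with Bayesian change point detection; the following sketch illustrates the underlying idea with a much simpler detector that finds the single split minimizing total squared deviation from the two segment means. The speed series is hypothetical, and a real signal would need multiple change points and noise handling.

```python
# Simplified single change-point detector (NOT the Bayesian method used in
# the study): choose the split index that minimizes the summed squared error
# of the two resulting segments, separating free-flow from congested regimes.

def best_split(speeds):
    def sse(xs):
        if not xs:
            return 0.0
        m = sum(xs) / len(xs)
        return sum((x - m) ** 2 for x in xs)
    costs = [(sse(speeds[:k]) + sse(speeds[k:]), k)
             for k in range(1, len(speeds))]
    return min(costs)[1]  # index where the regime changes

speeds = [64, 63, 65, 62, 31, 29, 33, 30]  # mph; the drop marks congestion onset
print(best_split(speeds))  # → 4
```

    Extending this to many change points by recursive splitting is the classical binary-segmentation approach; the Bayesian variant additionally yields a posterior over change-point locations.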

    Interactive, multi-purpose traffic prediction platform using connected vehicles dataset

    Get PDF
    Traffic congestion is a perennial issue because of increasing traffic demand and a limited budget for maintaining the current transportation infrastructure, let alone expanding it. Many congestion management techniques require timely and accurate traffic estimation and prediction. Examples of such techniques include incident management, real-time routing, and providing accurate trip information based on historical data. In this dissertation, a speech-powered traffic prediction platform is proposed, which deploys a new deep learning algorithm for traffic prediction using Connected Vehicles (CV) data. To speed up traffic forecasting, a Graph Convolution -- Gated Recurrent Unit (GC-GRU) architecture is proposed and its performance on tabular data is compared to state-of-the-art models. GC-GRU's Mean Absolute Percentage Error (MAPE) was very close to the Transformer's (3.16 vs 3.12) while achieving the fastest inference time and a six-fold faster training time than the Transformer, although Long Short-Term Memory (LSTM) was the fastest to train. Such improved performance in traffic prediction with a shorter inference time and competitive training time allows the proposed architecture to better cater to real-time applications. This is the first study to demonstrate the advantage of a multiscale approach that combines CV data with conventional sources such as Waze and probe data. CV data was better at detecting short-duration, jam, and stand-still incidents, and detected them earlier than probe data. CV data also excelled at detecting minor incidents, with a 90 percent detection rate versus 20 percent for probes, and detected them 3 minutes faster. To process the big CV data faster, a new algorithm is proposed to extract the spatial and temporal features from the CSV files into a Multiscale Data Analysis (MDA). The algorithm also leverages the Graphics Processing Unit (GPU) via the Nvidia RAPIDS framework and a Dask parallel cluster in Python.
The results show a seventy-fold speedup in the Extract, Transform, Load (ETL) of the CV data for the State of Missouri for an entire day of all unique CV journeys (reducing processing time from about 48 hours to 25 minutes). The processed data is then fed into a customized UNet model that learns high-level traffic features from network-level images to predict large-scale, multi-route speed and volume of CVs. The accuracy and robustness of the proposed model are evaluated across different road types, times of day, and image snippets, against comparable benchmarks. To visually analyze the historical traffic data and the prediction results, an interactive web application powered by speech queries was built to offer accurate and fast insights into traffic performance and thus allow better positioning of traffic control strategies. The product of this dissertation can be seamlessly deployed by transportation authorities to understand and manage congestion in a timely manner. Includes bibliographical references.
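    The model comparison above hinges on MAPE. For reference, a minimal implementation on illustrative numbers (the speed values below are hypothetical, not the dissertation's data):

```python
# Mean Absolute Percentage Error: the metric used to compare GC-GRU,
# Transformer, and LSTM in the abstract above.

def mape(actual, predicted):
    return 100.0 * sum(abs(a - p) / abs(a)
                       for a, p in zip(actual, predicted)) / len(actual)

actual    = [50.0, 40.0, 60.0, 55.0]   # observed speeds (mph), hypothetical
predicted = [48.0, 42.0, 57.0, 56.0]   # model output, hypothetical

print(round(mape(actual, predicted), 2))  # → 3.95
```

    Note that MAPE is undefined when an actual value is zero and penalizes over- and under-prediction asymmetrically, which is why weighted variants such as WMAPE are often preferred for traffic volumes.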

    Developing Sampling Strategies and Predicting Freeway Travel Time Using Bluetooth Data

    Get PDF
    Accurate, reliable, and timely travel time is critical to monitor transportation system performance and assist motorists with trip-making decisions. Travel time is estimated using data from various sources such as cellular technology and automatic vehicle identification (AVI) systems. Irrespective of source, the data have accuracy and reliability characteristics shaped by the sampling rate along with other factors. As a probe-based AVI technology, Bluetooth data is not immune to the sampling issue, which directly affects the accuracy and reliability of the information it provides. The sampling rate can be affected by the stochastic nature of the traffic state, which varies by time of day, and a single outlier may sharply affect the travel time. This study brings attention to several crucial issues - intervals with no samples, minimum sample size, and the stochastic property of travel time - that play a pivotal role in the accuracy and reliability of the information along with its time coverage. It also demonstrates novel approaches and thus represents a guideline for researchers and practitioners to flexibly select an appropriate interval for sample accumulation by setting thresholds guided by the nature of their individual problems and preferences. After selecting an appropriate interval for sample accumulation, the next step is to estimate travel time. Travel time can be estimated based on either the arrival time or the departure time of the corresponding vehicle; considering the estimation procedure, these are defined as arrival-time-based travel time (ATT) and departure-time-based travel time (DTT), respectively. A simple data processing algorithm, which processed more than a hundred million records reliably and efficiently, was introduced to ensure accurate estimation of travel time.
Since outlier filtering plays a pivotal role in estimation accuracy, a simplified technique was proposed to filter outliers after examining several well-established outlier-filtering algorithms. In general, time of arrival is utilized to estimate overall travel time; however, travel time based on departure time (DTT) is more accurate, and thus DTT should be treated as the true travel time. Accurate prediction is an integral component of calculating DTT, as real-time DTT is not available. The performance of the Kalman filter (KF) was compared to corresponding modeling techniques, both link- and corridor-based, and it was concluded that the KF method offers superior prediction accuracy in the link-based model. This research also examined the effect of different noise assumptions and found that the steady noise computed from the full dataset leads to the most accurate prediction. Travel time prediction had a mean absolute percentage error of 4.53% due to the effective application of the KF.
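    The Kalman filter step described above reduces, in its simplest form, to a scalar predict-update loop on link travel time. The sketch below assumes a random-walk state model; the process and measurement noise values and the travel-time series are illustrative assumptions, not the study's calibrated ones.

```python
# Minimal scalar Kalman filter for link travel time: random-walk state model
# with constant (steady) noise, echoing the study's finding that steady noise
# gave the most accurate prediction. All numbers here are hypothetical.

def kalman_filter(measurements, q=4.0, r=9.0):
    """q: process noise variance, r: measurement noise variance."""
    x, p = measurements[0], 1.0   # initial state estimate and its variance
    estimates = [x]
    for z in measurements[1:]:
        p += q                    # predict: state persists, uncertainty grows
        k = p / (p + r)           # Kalman gain
        x += k * (z - x)          # update with the new travel-time measurement
        p *= (1 - k)
        estimates.append(x)
    return estimates

times = [120, 118, 125, 140, 138, 135]  # link travel times (s), hypothetical
est = kalman_filter(times)
print([round(e, 1) for e in est])
```

    The filtered series lags sudden jumps (the 125 → 140 step is only partially absorbed at first), which is the usual trade-off between smoothing outliers and tracking genuine congestion onsets.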

    Using the Active Collimator and Shield Assembly of an EXIST-Type Mission as a Gamma-Ray Burst Spectrometer

    Full text link
    The Energetic X-ray Imaging Survey Telescope (EXIST) is a mission design concept that uses coded masks seen by Cadmium Zinc Telluride (CZT) detectors to register hard X-rays in the energy region from 10 keV to 600 keV. A partially active or fully active anti-coincidence shield/collimator with a total area of between 15 and 35 square meters will be used to define the field of view of the CZT detectors and to suppress the background of cosmic-ray-induced events. In this paper, we describe the use of a sodium-activated cesium iodide shield/collimator to detect gamma-ray bursts (GRBs) and to measure their energy spectra in the energy range from 100 keV up to 10 MeV. We use the code GEANT4 to simulate the interactions of photons and cosmic rays with the spacecraft and instrument, and the code DETECT2000 to simulate the optical properties of the scintillation detectors. The shield/collimator achieves a nu-F-nu sensitivity of 3 x 10^(-9) erg cm^(-2) s^(-1) and 2 x 10^(-8) erg cm^(-2) s^(-1) at 100 keV and 600 keV, respectively. The sensitivity is well matched to that of the coded mask telescope. The broad energy coverage of an EXIST-type mission with active shields will constrain the peak of the spectral energy distribution (SED) for a large number of GRBs. The measurement of the SED peak may be key for determining photometric GRB redshifts and for using GRBs as cosmological probes. Comment: 20 pages, 10 Figures, Accepted May 19, 2006 A&

    A framework for smart traffic management using heterogeneous data sources

    Get PDF
    A thesis submitted in partial fulfilment of the requirements of the University of Wolverhampton for the degree of Doctor of Philosophy. Traffic congestion constitutes a social, economic and environmental issue for modern cities, as it can negatively impact travel times, fuel consumption and carbon emissions. Traffic forecasting and incident detection systems are fundamental areas of Intelligent Transportation Systems (ITS) that have been widely researched in the last decade. These systems provide real-time information about traffic congestion and other unexpected incidents, which can help traffic management agencies activate strategies and notify users accordingly. However, existing techniques suffer from high false alarm rates and incorrect traffic measurements. In recent years, there has been increasing interest in integrating different types of data sources to achieve higher precision in traffic forecasting and incident detection techniques, and a considerable amount of literature has grown around the influence of integrating data from heterogeneous data sources into existing traffic management systems. This thesis presents a Smart Traffic Management framework for future cities. The proposed framework fuses different data sources and technologies to improve traffic prediction and incident detection systems. It is composed of two components: a social media component and a simulator component. The social media component consists of a text classification algorithm to identify traffic-related tweets. These traffic messages are then geolocated using Natural Language Processing (NLP) techniques. Finally, with the purpose of further analysing user emotions within the tweets, stress and relaxation strength detection is performed. The proposed text classification algorithm outperformed similar studies in the literature and proved more accurate than other machine learning algorithms on the same dataset.
Results from the stress and relaxation analysis detected a significant amount of stress in 40% of the tweets, while the remaining portion did not show any associated emotions. This information can potentially be used for transportation policy making, to understand users' perception of the transportation network. The simulator component proposes an optimisation procedure for determining missing roundabout and urban road flow distributions using constrained optimisation. Existing imputation methodologies have been developed on straight sections of highways, and their applicability to more complex networks has not been validated. This task presented a solution for the unavailability of roadway sensors in specific parts of the network and successfully predicted the missing values with very low percentage error. The proposed imputation methodology can serve as an aid for existing traffic forecasting and incident detection methodologies, as well as for the development of more realistic simulation networks.
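    The flow-imputation idea rests on flow conservation: at a roundabout or junction, total inflow must equal total outflow, so in the degenerate case of a single missing sensor the balance equation pins the value down exactly. The sketch below handles only that case; the four-leg counts are hypothetical, and the thesis's constrained optimisation generalises this to many missing values at once.

```python
# Flow-conservation imputation at a junction: sum(inflows) == sum(outflows),
# so one missing count is recoverable in closed form. A simplified version
# of the constrained-optimisation idea; the data below is hypothetical.

def impute_missing_flow(inflows, outflows):
    """inflows/outflows: dicts of leg -> vehicles/hour; exactly one value
    may be None, and conservation determines it. Returns (leg, value)."""
    total_in = sum(v for v in inflows.values() if v is not None)
    total_out = sum(v for v in outflows.values() if v is not None)
    for side, flows in (("in", inflows), ("out", outflows)):
        for leg, v in flows.items():
            if v is None:
                flows[leg] = (total_out - total_in if side == "in"
                              else total_in - total_out)
                return leg, flows[leg]
    return None

inflows = {"N": 400, "E": 250, "S": None, "W": 300}   # S sensor missing
outflows = {"N": 350, "E": 300, "S": 320, "W": 280}
print(impute_missing_flow(inflows, outflows))  # → ('S', 300)
```

    With several sensors missing the system becomes underdetermined, which is why the thesis formulates it as a constrained optimisation problem rather than direct substitution.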