
    Understanding Micro-Level Lane Change and Lane Keeping Driving Decisions: Harnessing Big Data Streams from Instrumented Vehicles

    It is important to gain a deeper understanding of instantaneous driving behaviors, especially aggressive and extreme behaviors such as hard acceleration, because they endanger traffic efficiency and safety by creating unstable flows and dangerous situations. The aim of the dissertation is to understand micro-level instantaneous driving decisions related to lateral movements, such as lane change or lane keeping events, on various roadway types. The impacts of these movements are fundamental to microscopic traffic flow and safety. Sufficient geo-referenced data collected from connected vehicles enable analysis of these driving decisions. The "Big Data" cover vehicle trajectories, reported at a 10 Hz frequency, and driving situations, which make it possible to establish an analysis framework.

    The dissertation conducts several key analyses by applying advanced statistical modeling and data mining techniques. First, it proposes an innovative methodology for identifying normal and extreme lane change events by analyzing lane-based vehicle positions (e.g., sharp changes in the distance of the vehicle centerline from the lane boundaries) and vehicle motions captured by the distributions of instantaneous lateral acceleration and speed. Second, since surrounding driving behavior influences instantaneous lane keeping behavior, the dissertation investigates correlations between different driving situations and lateral shifting volatility, which quantifies the variability in instantaneous lateral displacements. Third, the dissertation analyzes the "Gossip effect," which captures the peer influence of surrounding vehicles on the instantaneous driving decisions of subject vehicles at the micro-level. Lastly, the dissertation explores correlations between lane change crash propensity or injury severity and driving volatility, which quantifies the fluctuation in instantaneous driving decisions.

    The research findings contribute to the ongoing theoretical and policy debates regarding the effects of instantaneous driving movements. The main contributions of this dissertation are: 1) quantification of instantaneous driving decisions with regard to two aspects, vehicle motions (e.g., lateral and longitudinal acceleration and vehicle speed) and lateral displacement; 2) extraction of critical information embedded in large-scale trajectory data; and 3) an understanding of the correlations between lane change outcomes and instantaneous lateral driving decisions.
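
    As a rough illustration of the volatility idea described above, the sketch below computes a lateral shifting volatility measure and flags candidate extreme lane change samples from a 10 Hz trajectory. The coefficient-of-variation definition, the mean-plus-two-standard-deviations threshold, and the synthetic offsets are illustrative assumptions rather than the dissertation's exact formulations.

```python
import numpy as np

def lateral_volatility(lateral_offset: np.ndarray) -> float:
    """Coefficient of variation of instantaneous lateral displacements (per 0.1 s step)."""
    displacement = np.diff(lateral_offset)               # metres moved sideways per sample
    mean_abs = np.mean(np.abs(displacement))
    return float(np.std(displacement) / mean_abs) if mean_abs > 0 else 0.0

def flag_extreme_lateral_accel(lat_accel: np.ndarray, k: float = 2.0) -> np.ndarray:
    """Flag samples whose lateral acceleration exceeds mean + k * std (candidate extreme events)."""
    magnitude = np.abs(lat_accel)
    return magnitude > magnitude.mean() + k * magnitude.std()

# Synthetic 10 s trajectory at 10 Hz: lane-relative lateral offset in metres
rng = np.random.default_rng(0)
t = np.arange(0, 10, 0.1)
offset = 0.2 * np.sin(0.5 * t) + rng.normal(0, 0.02, t.size)
lat_accel = np.gradient(np.gradient(offset, 0.1), 0.1)    # numeric second derivative, m/s^2
print("lateral shifting volatility:", round(lateral_volatility(offset), 3))
print("flagged extreme samples:", int(flag_extreme_lateral_accel(lat_accel).sum()))
```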

    Exploring Statistical and Machine Learning-Based Missing Data Imputation Methods to Improve Crash Frequency Prediction Models for Highway-Rail Grade Crossings

    Highway-rail grade crossings (HRGCs) are critical locations for transportation safety because crashes at HRGCs are often catastrophic, potentially causing several injuries and fatalities. Every year in the United States, a significant number of crashes occur at these crossings, prompting local and state organizations to engage in safety analysis and to estimate crash frequency prediction models for resource allocation. These models provide valuable insights into safety and risk mitigation strategies for HRGCs. The estimation of these models is based on the inventory details of HRGCs, and the quality of those details is crucial for reliable crash predictions. However, many of these models exclude crossings with missing inventory details, which can adversely affect their precision. In this study, a random sample of inventory details for 2,000 HRGCs was taken from the Federal Railroad Administration's HRGCs inventory database. Data filters were applied to retain only those crossings that were at-grade, public, and operational (N = 1,096). Missing values were imputed using various statistical and machine learning methods, including Mean, Median and Mode (MMM) imputation, Last Observation Carried Forward (LOCF) imputation, K-Nearest Neighbors (KNN) imputation, Expectation-Maximization (EM) imputation, Support Vector Machine (SVM) imputation, and Random Forest (RF) imputation. The results indicated that crash frequency models based on machine learning imputation methods yielded better-fitting models (lower AIC and BIC values). The findings underscore the importance of obtaining complete inventory data through machine learning imputation methods when developing crash frequency models for HRGCs. This approach can substantially enhance the precision of these models, improving their predictive capabilities and ultimately saving valuable human lives.
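
    A minimal sketch of the impute-then-model workflow, assuming scikit-learn's KNNImputer and IterativeImputer (with a random forest estimator) as stand-ins for the paper's KNN and RF imputation, and a simple negative binomial specification compared by AIC; the inventory fields and the synthetic data are hypothetical.

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm
from sklearn.experimental import enable_iterative_imputer  # noqa: F401 (activates IterativeImputer)
from sklearn.impute import KNNImputer, IterativeImputer
from sklearn.ensemble import RandomForestRegressor

def impute_and_fit(X: pd.DataFrame, crashes: np.ndarray, imputer) -> float:
    """Impute missing inventory fields, fit a negative binomial crash model, return its AIC."""
    X_imp = pd.DataFrame(imputer.fit_transform(X), columns=X.columns, index=X.index)
    model = sm.GLM(crashes, sm.add_constant(X_imp),
                   family=sm.families.NegativeBinomial(alpha=1.0)).fit()
    return model.aic

# Hypothetical HRGC inventory features with injected missingness, plus crash counts
rng = np.random.default_rng(0)
X = pd.DataFrame({"log_aadt": rng.normal(8, 1, 300),
                  "train_speed": rng.uniform(10, 79, 300),
                  "tracks": rng.integers(1, 4, 300).astype(float)})
X.loc[rng.random(300) < 0.2, "train_speed"] = np.nan
y = rng.poisson(0.3, 300)

imputers = {"KNN": KNNImputer(n_neighbors=5),
            "RF": IterativeImputer(estimator=RandomForestRegressor(n_estimators=50), max_iter=5)}
for name, imp in imputers.items():
    print(name, "imputation, model AIC:", round(impute_and_fit(X, y, imp), 1))
```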

    Analyzing Severity of Vehicle Crashes at Highway-Rail Grade Crossings: Multinomial Logit Modeling

    The purpose of this paper is to develop a nominal response multinomial logit model (MNLM) to identify factors that influence injury severity and to explore the impact of such explanatory variables on three severity levels of vehicle-related crashes at highway-rail grade crossings (HRGCs) in the United States. Vehicle-rail and pedestrian-rail crash data from the USDOT highway-rail crossing inventory and public crossing sites from 2005 to 2012 are used in this study. A multinomial logit model is developed using the SAS PROC LOGISTIC procedure, and marginal effects are also calculated. The MNLM results indicate that when rail equipment traveling at high speed struck a vehicle, the chance of a resulting fatality increased. The study also reveals that pickup trucks and crossings with concrete or rubber surfaces were more likely to be involved in more severe crashes. On the other hand, truck-trailer vehicles in snowy and foggy weather conditions, development area types (residential, commercial, industrial, and institutional), and higher daily traffic volumes were more likely to be involved in less severe crashes. Educating and equipping drivers with good driving habits, together with short-term law enforcement actions, can potentially minimize the chance of severe vehicle crashes at HRGCs.
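
    The sketch below shows a nominal multinomial logit with average marginal effects using statsmodels' MNLogit, as one open-source analogue of the SAS PROC LOGISTIC workflow described above; the predictors and the synthetic severity outcomes are illustrative assumptions, not the paper's variables.

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm

rng = np.random.default_rng(1)
n = 500
df = pd.DataFrame({
    "train_speed": rng.uniform(5, 79, n),        # mph (hypothetical predictor)
    "log_aadt": rng.normal(8, 1, n),             # log of daily traffic volume
    "pickup_truck": rng.integers(0, 2, n),       # 1 if the struck vehicle is a pickup
})
# Severity levels: 0 = property damage only, 1 = injury, 2 = fatality (synthetic)
logits = np.column_stack([np.zeros(n),
                          0.02 * df["train_speed"] - 1.0,
                          0.04 * df["train_speed"] - 3.0])
probs = np.exp(logits) / np.exp(logits).sum(axis=1, keepdims=True)
severity = np.array([rng.choice(3, p=row) for row in probs])

X = sm.add_constant(df)
fit = sm.MNLogit(severity, X).fit(disp=False)
print(fit.summary())
print(fit.get_margeff().summary())   # average marginal effects per outcome level
```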

    3D Infrastructure Condition Assessment For Rail Highway Applications

    Highway roughness is a concern for both the motoring public and highway authorities, and it may even increase the risk of crashes. Rail-highway grade crossings are particularly problematic. Roughness may be due to deterioration or simply to the way the crossing was built to accommodate grade changes, local utilities, or rail elevation. With over 216,000 crossings in the US, maintenance is a vast undertaking. While methods are available to quantify highway roughness, no method exists to quantitatively assess the condition of rail crossings; conventional inspection relies on a labor-intensive process of qualitative judgment. A quantifiable, objective, and extensible procedure for rating and prioritizing improvement of crossings is thus desired. In this dissertation, a 3D infrastructure condition assessment model is developed for evaluating the condition and performance of rail-highway grade crossings. Various scanning techniques and devices are developed or used to obtain a 3D "point cloud" or surface as the first step toward quantifying crossing roughness. Next, a technique for repeatable field measurement of acceleration is presented and tested to provide a condition index. Acceleration-based metrics are developed that can be used to rate and compare crossings for improvement programs to mitigate potential vehicle damage and provide passenger comfort. A vehicle dynamic model is then customized to use surface models to estimate vertical accelerations, eliminating the need for field data collection. Crossing roughness and rideability are then estimated directly from 3D point clouds, which allows isolation of the acceleration components derived from the surface condition and from the original design profile. Finally, a practice-ready application of the 3D point cloud is developed and presented to address hump crossing safety. In conclusion, the dissertation presents several methods to assess the condition and performance of rail crossings. It provides quantitative metrics that can be used to evaluate designs and construction methods and to efficiently implement cost-effective improvement programs. The metrics provide a technique to measure and monitor system assets over time and can be extended to other infrastructure components such as pavements and bridges.
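
    As a simplified illustration of an acceleration-based roughness index, the sketch below treats the vehicle as a point that rigidly follows a longitudinal elevation profile extracted from a 3D point cloud and reports RMS and peak vertical acceleration. Ignoring suspension dynamics (which the dissertation's vehicle dynamic model would capture), the 0.1 m sample spacing, and the 40 km/h speed are all simplifying assumptions.

```python
import numpy as np

def acceleration_index(elevation_m: np.ndarray, spacing_m: float, speed_kmh: float):
    """Return RMS and peak vertical acceleration implied by a surface profile at a given speed."""
    v = speed_kmh / 3.6                                      # travel speed, m/s
    dt = spacing_m / v                                       # time between profile samples, s
    accel = np.gradient(np.gradient(elevation_m, dt), dt)    # d^2z/dt^2, m/s^2
    return float(np.sqrt(np.mean(accel ** 2))), float(np.max(np.abs(accel)))

# Synthetic 30 m crossing profile: a 5 cm hump near the rails plus measurement noise
rng = np.random.default_rng(0)
x = np.arange(0, 30, 0.1)                                    # stations every 0.1 m
profile = 0.05 * np.exp(-((x - 15.0) ** 2) / 2.0) + rng.normal(0, 0.002, x.size)
rms, peak = acceleration_index(profile, spacing_m=0.1, speed_kmh=40)
print(f"RMS vertical accel: {rms:.2f} m/s^2, peak: {peak:.2f} m/s^2")
```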

    The Road to School. The Barcelona case

    Mobility of children between 6 and 10 years old has been decreasing continuously over the last decades, causing health problems (obesity) and hindering the development of spatial skills and the sense of community. The paper focuses on the route between school and home and deals with a specific project called "Camino Escolar" (School Road), which supports parents in the decision to let their children walk to school on their own. The empirical case is developed in Barcelona, where 136 School Road projects exist, and analyzes two specific districts in detail. The methodology is divided into two phases. In the first phase, we conduct an exploratory study based on interviews with the different stakeholders of the education system and derive a list of barriers against the development of the School Road project. In the second phase, we ask parents to prioritize these barriers according to their degree of importance. The results show that the barriers can be classified into four clusters: physical insecurity, emotional insecurity, city infrastructure quality, and project management quality. These findings help public managers to better manage this kind of project in order to prepare future cities.
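
    A small sketch of how rated barriers might be grouped into four clusters, in the spirit of the paper's classification; the barrier labels, the synthetic rating matrix, and the choice of Ward hierarchical clustering are illustrative assumptions.

```python
import numpy as np
from sklearn.cluster import AgglomerativeClustering

rng = np.random.default_rng(2)
barriers = ["traffic danger", "stranger danger", "missing sidewalks", "poor crossings",
            "no adult supervision", "lack of coordination", "unclear responsibilities",
            "fear of bullying"]
# ratings[i, j]: importance (1-5) given by parent j to barrier i (synthetic)
ratings = rng.integers(1, 6, size=(len(barriers), 40)).astype(float)

labels = AgglomerativeClustering(n_clusters=4, linkage="ward").fit_predict(ratings)
for c in range(4):
    print(f"cluster {c}:", [b for b, lab in zip(barriers, labels) if lab == c])
```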

    TRA-946: COMPARING CRASH ESTIMATION TECHNIQUES FOR RANKING OF SITES IN A NETWORK SCREENING PROCESS

    Network screening, a process for effective and efficient management of road safety programs, relies on crash prediction techniques to quantify the relative risk of given sites. The two most commonly used statistical approaches are the cross-sectional model-based approach and the Empirical Bayesian (EB) approach, as both are known to reduce the regression-to-the-mean bias of a simple crash-history-based method. The EB approach is considered the more robust technique, as it accounts for a site-specific risk level while still incorporating the risk estimates obtained from a cross-sectional model. Both approaches are relatively convenient to apply and easy to interpret because a defined mathematical equation relates crashes to the potential explanatory variables. However, pre-specifying such relations is challenging because the true cause-and-effect relationships are not known. One approach is to use a trial-and-error process to select the final relation; nonetheless, potential misspecification may still remain, which could result in an inaccurate list of crash hotspots in a network screening process. As an alternative to the model-based approach, this study applies kernel regression (KR), a data-driven nonparametric method. In addition, the KR method is extended within a framework similar to the EB approach to account for site-specific risk levels. When applied in a case study using crash data from Highway 401 in Ontario, Canada, these techniques showed that some deviations exist between the methods, particularly in the ranking of sites in a network screening process.
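
    The sketch below illustrates the standard Empirical Bayes adjustment that underlies the EB approach: the expected crash count is a weighted average of a safety performance function (SPF) prediction and the observed count. The SPF form, its coefficients, and the dispersion parameter are illustrative assumptions, not values estimated from the Highway 401 data.

```python
import numpy as np

def eb_expected_crashes(observed: float, spf_pred: float, k: float) -> float:
    """EB estimate: weighted average of the SPF prediction and the observed crash count."""
    w = 1.0 / (1.0 + k * spf_pred)        # weight on the SPF prediction
    return w * spf_pred + (1.0 - w) * observed

def spf(aadt: float, length_km: float, b0: float = -7.5, b1: float = 0.85) -> float:
    """Hypothetical negative binomial SPF: crashes/yr = exp(b0) * AADT^b1 * length."""
    return float(np.exp(b0) * aadt ** b1 * length_km)

site = {"aadt": 42_000, "length_km": 1.2, "observed": 9}
mu = spf(site["aadt"], site["length_km"])
print("SPF prediction:", round(mu, 2),
      "EB estimate:", round(eb_expected_crashes(site["observed"], mu, k=0.6), 2))
```

    The weight on the SPF prediction shrinks as the dispersion parameter k or the predicted frequency grows, so high-variance or high-exposure sites lean more heavily on their own observed crash history.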

    Network-Wide Pedestrian and Bicycle Crash Analysis with Statistical and Machine Learning Models in Utah

    Recent crash trends indicate a dramatic increase in both the number and the share of pedestrian and bicyclist injuries and fatalities nationally and in many states. Crash frequency modeling was undertaken to identify crash-prone characteristics of segments and non-signalized intersections and to explore possible non-linear associations of explanatory variables with crashes. Crowdsourced "Strava" app data were used for bicycle volume, and pedestrian counts estimated from nearby signalized intersections were used as pedestrian volume. Multiple negative binomial models investigated crashes at different spatial scales to account for different levels of data availability and completeness. The models showed that high traffic volume, steeper vertical grades on roads, frequent bus and rail stations, greater driveway density, more legs at intersections, streets with high large-truck presence, greater residential and employment density, and a larger share of low-income households and non-white race/ethnicity groups are indicators of locations with more pedestrian and bicycle crashes. Crash severity model results showed that crashes occurring at mid-blocks and near vertical grades were more severe than crashes at intersections. High daily temperature, driving under the influence, and distracted driving also increase injury severity in crashes. This study suggests potential countermeasures, policy implications, and the scope of future research for improving pedestrian and bicycle safety at segments and non-signalized intersections.
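
    A minimal sketch of a segment-level negative binomial crash frequency model with a traffic-volume offset, in the spirit of the models described above; the predictor names (Strava-based bicycle trips, vertical grade, driveway density) and the synthetic data are assumptions for illustration.

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

rng = np.random.default_rng(3)
n = 400
seg = pd.DataFrame({
    "crashes": rng.poisson(1.2, n),                 # pedestrian/bicycle crashes per segment
    "aadt": rng.lognormal(9, 0.6, n),               # motor vehicle volume (exposure)
    "strava_bike_trips": rng.lognormal(4, 1.0, n),  # crowdsourced bicycle volume proxy
    "grade_pct": rng.uniform(0, 8, n),              # vertical grade, percent
    "driveways_per_km": rng.uniform(0, 40, n),
})
fit = smf.glm("crashes ~ np.log(strava_bike_trips) + grade_pct + driveways_per_km",
              data=seg, family=sm.families.NegativeBinomial(alpha=0.8),
              offset=np.log(seg["aadt"])).fit()
print(fit.summary())
```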

    Reliability Improvement On Feasibility Study For Selection Of Infrastructure Projects Using Data Mining And Machine Learning

    With the progressive development of infrastructure construction, conventional analytical methods such as correlation indices, factor quantification, and peer review are no longer satisfactory for supporting decisions on implementing an infrastructure project in the age of big data. This study proposes a mathematical model, the Fuzzy-Neural Comprehensive Evaluation Model (FNCEM), to improve the reliability of infrastructure feasibility studies using data mining and machine learning. Specifically, time-series data, including traffic videos (278 gigabytes) and historical weather data, were collected from transportation cameras and online searches, respectively. Meanwhile, a questionnaire was distributed to collect public opinions on the influencing factors of an infrastructure project. The model implements the backpropagation Artificial Neural Network (BP-ANN) algorithm to simulate traffic flows and generate outputs as partial quantitative references for evaluation. The traffic simulation outputs are used as partial inputs to the Analytic Hierarchy Process (AHP)-based fuzzy logic module of the system, which determines the minimum traffic flows that a construction scheme in the corresponding feasibility study should meet. The study is based on a real scenario of constructing a railway-crossing facility in a college town. The results indicated that the BP-ANN simulated 15-minute small-scale pedestrian and vehicle flows well, with minimum overall logarithmic mean squared errors (Log-MSE) of 3.80 and 5.09, respectively. In addition, the AHP-based fuzzy evaluation significantly decreased the subjectivity of selecting construction schemes, by 62.5%. The study concludes that the FNCEM has strong potential to enrich the methodology of conducting feasibility studies for infrastructure projects.
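
    The sketch below illustrates the AHP step that converts a pairwise comparison matrix of evaluation criteria into priority weights via the principal eigenvector, with a consistency check; the three example criteria and their pairwise judgments are hypothetical, not the study's questionnaire results.

```python
import numpy as np

def ahp_weights(pairwise: np.ndarray):
    """Return priority weights and the consistency ratio of a pairwise comparison matrix."""
    eigvals, eigvecs = np.linalg.eig(pairwise)
    k = np.argmax(eigvals.real)                    # principal eigenvalue
    w = np.abs(eigvecs[:, k].real)
    w /= w.sum()                                   # normalized priority weights
    n = pairwise.shape[0]
    ci = (eigvals[k].real - n) / (n - 1)           # consistency index
    ri = {3: 0.58, 4: 0.90, 5: 1.12}.get(n, 1.0)   # Saaty's random index
    return w, ci / ri

# Hypothetical criteria: traffic-flow capacity, construction cost, public acceptance
A = np.array([[1.0,  3.0, 5.0],
              [1/3., 1.0, 2.0],
              [1/5., 1/2., 1.0]])
weights, cr = ahp_weights(A)
print("weights:", np.round(weights, 3), "consistency ratio:", round(cr, 3))
```

    A consistency ratio below about 0.10 is conventionally taken as an acceptable level of judgment consistency.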
    • …