Measuring the Quality of Arterial Traffic Signal Timing – A Trajectory-based Methodology
Evaluating the benefits of traffic signal timing is of increasing interest to transportation policymakers, operators, and the public, as integrating performance measurement with agencies' daily signal timing management has become a top priority. This dissertation presents a trajectory-based methodology for evaluating the quality of arterial signal timing, a critical part of signal operations that promises reduced travel time and fewer vehicle stops along arterials as well as improved travelers' perception of transportation services. The proposed methodology could significantly contribute to performance-oriented signal timing practices by addressing challenges regarding which performance measures should be selected, how performance measurement can be carried out cost-effectively, and how to make performance measures accessible to people with limited knowledge of traffic engineering.
A review of the current state of practice and research was conducted first, indicating an urgent need for an arterial-level methodology for signal timing performance assessment, as established techniques are mostly based on by-link or by-movement metrics. The literature review also revealed deficiencies in existing performance measures pertaining to traffic signal timing. Accordingly, travel-run speed and stop characteristics, which can be extracted from vehicle GPS trajectories, were selected to measure the quality of arterial signal timing in this research. Two performance measures were then defined based on speed and stop characteristics: the attainability of ideal progression (AIP) and the attainability of user satisfaction (AUS). To determine AIP and AUS, a series of investigations and surveys was conducted to characterize the effects of non-signal-timing-related factors (e.g., arterial congestion level) on average travel speed as well as how stops may affect travelers' perceived quality of signal timing.
AIP was calculated considering the effects of non-signal-timing-related factors, and AUS accounted for changes in the perceived quality of signal timing under various stop circumstances. Based upon AIP and AUS, a grade-based performance measurement methodology was developed, comprising AIP scoring, AUS scoring, and two scoring adjustments. The scoring adjustments further improved the measurement results by considering factors such as cross-street delay, pedestrian delay, and arterial geometry. Furthermore, the research outlined the process for implementing the proposed methodology, including the necessary data collection and a preliminary examination of the applicable conditions. Case studies based on real-world signal re-timing projects were presented to demonstrate the effectiveness of the proposed methodology in enhancing agencies' capabilities of cost-effectively monitoring the quality of arterial signal timing, actively addressing signal timing issues, and reporting progress and outcomes in a concise and intuitive manner.
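Both measures build on speed and stop characteristics extracted from travel-run GPS trajectories. A minimal sketch of that extraction step, with an assumed stop-speed threshold and a simplified trajectory format (neither taken from the dissertation):

```python
# Sketch: extracting travel-run speed and stop characteristics from a GPS
# trajectory. The 3 mph stop threshold and the (time, position, speed)
# sample format are illustrative assumptions.
STOP_SPEED_MPH = 3.0  # below this, the vehicle is treated as stopped (assumed)

def travel_run_stats(trajectory):
    """trajectory: list of (time_s, position_ft, speed_mph) samples.
    Returns (average travel speed in mph, number of stops)."""
    if len(trajectory) < 2:
        raise ValueError("need at least two samples")
    t0, x0, _ = trajectory[0]
    t1, x1, _ = trajectory[-1]
    avg_speed_mph = ((x1 - x0) / (t1 - t0)) * 3600.0 / 5280.0
    # Count transitions into the stopped state to get the number of stops.
    stops = 0
    stopped = False
    for _, _, v in trajectory:
        if v < STOP_SPEED_MPH and not stopped:
            stops += 1
            stopped = True
        elif v >= STOP_SPEED_MPH:
            stopped = False
    return avg_speed_mph, stops
```

Quantities of this kind would then feed the AIP and AUS scoring described above.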
A data-driven methodology for prioritizing traffic signal retiming operations
Signal retiming is one of the chief responsibilities of municipal transportation agencies and is an important means of reducing congestion and improving transportation quality and reliability. Many agencies conduct signal retiming and adjustment in a schedule-based manner. However, a data-driven, need-based approach to prioritizing signal retiming operations could make better use of agency resources. Additionally, the growing availability of probe vehicle data has made it an increasingly popular tool for roadway performance measurement. This thesis presents a methodology for using segment-level probe-based speed data to rank the performance of traffic signal corridors for retiming purposes. The methodology is then demonstrated in an analysis of 79 traffic signal corridors maintained by the City of Austin, Texas. The analysis considers 15-minute speed records for all weekdays in September 2016 and September 2017 to compute metrics and rank corridors based on their relative performance across time periods. The results show that the ranking methodology compares corridors equitably despite differences in road length, functional class, and traffic signal density. Additionally, the results indicate that the corridors prioritized by the ranking methodology represent a much greater potential for improving travel time than the corridors selected under the schedule-based approach. The methodology is then packaged into a web-based tool for integration into agency decision-making. Finally, consideration is given to how this methodology might be used to identify candidate corridors for implementing adaptive signal control techniques.
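A minimal illustration of a length-neutral corridor ranking of this kind; the speed-ratio metric and data layout below are assumptions for illustration, not the thesis's exact metrics:

```python
# Sketch of a length-neutral ranking in the spirit of the thesis: score
# each corridor by its mean observed speed relative to a reference speed
# (e.g., off-peak), so corridors of different lengths and functional
# classes compare fairly. Data layout and reference choice are assumed.

def corridor_scores(records):
    """records: {corridor_name: [(observed_mph, reference_mph), ...]}.
    Returns corridor names sorted worst-first (lowest speed ratio first)."""
    scores = {}
    for corridor, pairs in records.items():
        ratios = [obs / ref for obs, ref in pairs if ref > 0]
        scores[corridor] = sum(ratios) / len(ratios)
    return sorted(scores, key=scores.get)
```

Corridors at the head of the returned list would be the strongest retiming candidates.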
Developing Emergency Preparedness Plans For Orlando International Airport (MCO) Using Microscopic Simulator WATSim
Emergency preparedness typically involves the preparation of detailed plans that can be implemented in response to a variety of possible emergencies or disruptions to the transportation system. One shortcoming of past response plans was that they were based on only rudimentary traffic analysis, or in many cases none at all. With the advances in traffic simulation during the last decade, it is now possible to model many traffic problems, such as emergency management, signal control, and testing of Intelligent Transportation System technologies. These problems are difficult to solve using traditional tools, which are based on analytical methods. Therefore, emergency preparedness planning can greatly benefit from the use of micro-simulation models to evaluate the impacts of natural and man-made incidents and assess the effectiveness of various responses. This simulation-based study assessed hypothetical emergency preparedness plans and identified the geometric and/or operational improvements needed in response to emergency incidents. A detailed framework is provided outlining model building, calibration, and validation using the microscopic traffic simulation model WATSim (academic version). The roadway network data consist of the geometric layout of the network, number of lanes, and intersection descriptions, including turning bays, signal timings, phasing sequences, and turning movement information. The network in and around the OIA region was coded into WATSim with three main signalized intersections, 180 nodes, and 235 links. The travel demand data include the vehicle counts on each link of the network, modeled as percentage turning movement counts. After the OIA network was coded into WATSim, the road network was calibrated and validated for the peak hour (mostly obtained from ADT with an 8% K factor) by comparing simulated and actual link counts at 15 key locations in the network, supplemented by visual verification.
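The peak-hour calibration volumes above are derived from ADT with an 8% K factor, i.e., the peak hour is taken to carry 8% of the average daily traffic:

```python
# Peak-hour volume from Average Daily Traffic (ADT) and a K factor,
# as used for the calibration targets in this study (K = 8%).
def peak_hour_volume(adt, k_factor=0.08):
    """Estimated peak-hour volume (veh/h) for a link."""
    return adt * k_factor
```

For example, a link carrying 25,000 veh/day yields a 2,000 veh/h peak-hour calibration target.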
A range of scenarios was tested, including a security checkpoint, route diversion in case of an incident in or near the airport, and increased demand on the network. Travel time, maximum queue length, and delay were used as measures of effectiveness, and the results were tabulated. This research demonstrates the potential benefits of using microscopic simulation models when developing emergency preparedness strategies. In all, four main events were modeled and analyzed. In Event 1, a 15-minute traffic incident on a section of the South Access Road was simulated and its impact on network operations studied. The average travel time to Side A during the incident was more than double the base case (29 minutes, more than a 100% increase), and that to Side B roughly two and a half times the base case (23 minutes, also more than a 100% increase). The overall network delay was 231.09 sec/veh against a baseline of 198.9 sec/veh. In Event 2, cases with and without traffic diversion were evaluated under a 15-minute traffic incident modeled at the same link and spot as in Event 1. It was assumed that information about the incident was disseminated upstream 2 minutes after it occurred. This scenario demonstrated that, on average, travel time savings of 17% (4 minutes) to 41% (12 minutes) per vehicle were achieved when real-time traffic information diverted 26% of drivers. The overall network delay for this event also improved significantly (166.92 sec/veh). These findings led to the conclusion that investment in ITS technologies that support dissemination of traffic information (such as changeable message signs, highway advisory radio, etc.) would provide a great advantage in traffic management under emergency situations and road diversion strategies. Event 3 simulated a security checkpoint.
It was observed that, on average, travel times to Sides A and B were 3 and 5 minutes longer, respectively, than the baseline. Average queue lengths of 650 feet and, in the worst case, 890 feet were observed. Event 4 determined when and where the network breaks down under increasing load. Among the 10 demand sets created, the network appeared to break down at a 30% increase based on network-wide delay and at a 15% increase based on level of service (LOS). The 90% increase had the greatest effect on the network, with total network-wide delay close to 620 seconds per vehicle, three and a half times the baseline. Conclusions and future scope are provided to ensure continued safe and efficient traffic operations in and around the Orlando International Airport region and to support efficient and informed decision making in emergency situations.
Organic traffic control
Modern cities cannot be imagined without traffic lights controlling the road network. To handle the network's changing demands efficiently, the signal plan specification needs to be shifted from the design time to the run-time of a signal system. The generic observer/controller architecture proposed for Organic Computing facilitates this shift. A two-levelled learning mechanism optimises signal plans on-line while a distributed coordination mechanism establishes green waves in the road network.
New Signal Priority Strategies to Improve Public Transit Operations
Rapid urbanization is causing severe congestion on road transport networks around the world. Improving service and attracting more travellers could be part of the solution. In urban areas, improving public transportation efficiency and reliability can reduce traffic congestion and improve transportation system performance. By facilitating public buses' movement through traffic signal-controlled intersections, a Transit Signal Priority (TSP) strategy can contribute to the reduction of queuing time at intersections. In the last decade, studies have focused on TSP systems to help public transportation organizations attract more travellers. However, the traditional TSP also has a significant downside; it is detrimental to non-prioritized movements and other transport modes.
This research proposes new TSP strategies that account for the number of passengers on board as well as the real-time adherence of buses to their schedules. Two methods are proposed. In the first, buses are prioritized based on their load and their adherence to their schedules, while in the second, person delay at the intersection is optimized. The optimization approach in the first method uses a specific priority for public transit, while additional parameters, such as residual queue and arrival rate at the intersection, are considered in the second method. One of this research's main contributions is providing insight into the benefits of these new TSP methods along a corridor and at an isolated signalized intersection. The proposed methods require real-time information on transit operations, traffic signal status, and vehicular flows. The lack of readily available infrastructure to provide all these data is compensated for by using a traffic simulator, VISSIM, for an isolated intersection and an arterial corridor. The simulation results for the study area indicated that the new TSP methods performed better than conventional TSP. For the investigated study area, the second method performed better at an isolated signalized intersection, while the first method reduced traffic and environmental indices when used on an arterial corridor.
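A hedged sketch of the two decision rules as described: the first method gates priority on passenger load and schedule adherence, the second evaluates person delay. Thresholds, weights, and data shapes below are illustrative assumptions, not the thesis's calibrated values:

```python
# Sketch of the two TSP criteria described in the abstract.
# All thresholds are hypothetical placeholders for illustration.

def grant_priority(passengers, schedule_deviation_s,
                   min_load=20, late_threshold_s=120):
    """Method 1 (sketch): grant signal priority only when the bus is both
    well loaded and sufficiently behind schedule."""
    return passengers >= min_load and schedule_deviation_s >= late_threshold_s

def person_delay(vehicle_delays_and_occupancies):
    """Method 2 objective (sketch): total person delay at the intersection,
    i.e., each vehicle's delay weighted by its occupancy."""
    return sum(d * occ for d, occ in vehicle_delays_and_occupancies)
```

Weighting a 35-passenger bus against single-occupant cars is what lets person-delay optimization favor transit without blanket priority.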
Future research could use macrosimulation to investigate the effects of the proposed TSP methods at the urban network level. Considering conflicting TSP requests within these methodologies is another area for further research.
Broadening Understanding of Roundabout Operation Analysis: Planning-Level Tools and Signal Application
In the United States, roundabouts have recently emerged as an effective and efficient alternative to conventional signalized intersections for the control of traffic at junctions. This thesis includes two investigations related to the operations of roundabouts.
The first investigation examines the ability of a planning-level tool (the critical sum method) to serve as an indicator variable for the results of the Highway Capacity Manual’s average delay per vehicle measure for a roundabout facility; to what extent do the results of one predict the results of the other? The critical sum method was found to accurately predict the HCM average delay per vehicle for low-volume conditions, approximately up to an average delay of 15 seconds per vehicle, but the tool was found to provide inaccurate predictions for higher volume conditions.
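For context, the planning-level critical sum check can be illustrated as follows; the 1,500 veh/h capacity threshold is a commonly cited planning value, assumed here rather than taken from the thesis:

```python
# Illustrative critical sum method: sum the critical (highest conflicting)
# movement volumes across phases and compare against a capacity threshold.
# The 1,500 veh/h threshold is an assumed planning-level value.

def critical_sum(phase_critical_volumes, threshold_vph=1500):
    """Return (critical sum in veh/h, v/c-style ratio against threshold)."""
    total = sum(phase_critical_volumes)
    return total, total / threshold_vph
```

The thesis's finding, that this coarse indicator tracks HCM average delay only up to about 15 s/veh, reflects delay growing nonlinearly as the ratio approaches 1.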
The second investigation looks at the potential of metering signals on a roundabout facility to transfer excess capacity from a low-volume approach to an adjacent higher-volume approach. The analysis indicated positive results for the theoretical benefits of the metering signal when only placing simulated traffic on two of the approaches, but the results were not duplicated when analyzing more-realistic volume scenarios with traffic on all four approaches.
Advisor: John Sangster
Recommended from our members
Dynamic traffic assignment-based modeling paradigms for sustainable transportation planning and urban development
Transportation planning and urban development in the United States have synchronously emerged over the past few decades to encompass goals associated with sustainability, improved connectivity, complete streets and mitigation of environmental impacts. These goals have evolved in tandem with some of the relatively more traditional objectives of supply-side improvements such as infrastructure and capacity expansion. Apart from the numerous federal regulations in the US transportation sector that reassert sustainability motivations, metropolitan planning organizations and civic societies face similar concerns in their decision-making and policy implementation. However, overall transportation planning to incorporate these wide-ranging objectives requires characterization of large-scale transportation systems and traffic flow through them, which is dynamic in nature, computationally intense and a non-trivial problem.
Thus, these contemporary questions lie at the interface of transportation planning, urban development, and sustainability planning. They have the potential to be effectively addressed through state-of-the-art transportation modeling tools, which is the main motivation and philosophy of this thesis. From a research standpoint, some of these issues have been addressed in the past, typically from urban design, built-environment, public health, and vehicle technology perspectives, and mostly qualitatively, but not as much from the traffic engineering and transportation systems perspective, a gap in the literature which this thesis aims to fill. Specifically, it makes use of simulation-based dynamic traffic assignment (DTA) to develop modeling paradigms and integrated frameworks that seamlessly incorporate these questions into the transportation planning process. Beyond simply incorporating them in the planning process, DTA-based paradigms can accommodate numerous spatial and temporal dynamics of system traffic that more traditional static models cannot. These features are critical in the context of the planning questions of this study.
Specifically, the systemic impacts of suburban and urban street pattern developments typically found in US cities in past decades of the 20th century have been investigated. While street connectivity and design evolution are mostly regulated through local codes and subdivision ordinances, their impacts on traffic and system congestion require modeling and quantitative evidence, which are explored in this thesis. On the environmental impact mitigation side, regional emission inventories from the traffic sector have also been quantified. Novel modeling approaches for the street connectivity-accessibility problem are proposed. An integrated framework using the Environmental Protection Agency's regulatory MOVES model has been developed, combining it with mesoscopic-level DTA simulation. Model demonstrations and applications on real and large study areas reveal that different levels of connectivity and accessibility have substantial impacts on system-wide traffic: as connectivity levels fall, traffic and congestion metrics show a gradually increasing trend. As regards emissions, incorporating dynamic features leads to more realistic emissions inventory generation than default databases and modules, owing to consideration of the added dynamics of system traffic and region-specific conditions. Inter-dependencies among these sustainability planning questions, through the common linkage of traffic dynamics, are also highlighted.
In summary, the modeling frameworks, analyses and findings in the thesis contribute to some ongoing debates in planning studies and practice regarding ideal urban designs, provisions of sustainability and complete streets. Furthermore, the integrated emissions modeling framework, in addition to sustainability-related contributions, provides important tools to aid MPOs and state agencies in preparation of state implementation plans for demonstrating conformity to national ambient air-quality standards in their regions and counties. This is a critical condition for them to receive federal transportation funding.
Interoperability of wireless communication technologies in hybrid networks: Evaluation of end-to-end interoperability issues and quality of service requirements
This thesis was submitted for the degree of Doctor of Philosophy and awarded by Brunel University. Hybrid Networks employing wireless communication technologies have nowadays brought closer the vision of communication “anywhere, any time with anyone”. Such communication technologies consist of various standards, protocols, architectures, characteristics, models, devices, modulation and coding techniques. All these different technologies naturally may share some common characteristics, but there are also many important differences. New advances in these technologies are emerging very rapidly, with the advent of new models, characteristics, protocols and architectures. This rapid evolution imposes many challenges and issues to be addressed, and of particular importance are the interoperability issues of the following wireless technologies: Wireless Fidelity (Wi-Fi) IEEE 802.11, Worldwide Interoperability for Microwave Access (WiMAX) IEEE 802.16, Single Channel per Carrier (SCPC), Digital Video Broadcasting of Satellite (DVB-S/DVB-S2), and Digital Video Broadcasting Return Channel through Satellite (DVB-RCS). Due to the differences amongst wireless technologies, these technologies do not generally interoperate easily with each other, because of various interoperability and Quality of Service (QoS) issues.
The aim of this study is to assess and investigate end-to-end interoperability issues and QoS requirements, such as bandwidth, delays, jitter, latency, packet loss, throughput, TCP performance, UDP performance, unicast and multicast services and availability, on hybrid wireless communication networks (employing both satellite broadband and terrestrial wireless technologies).
The thesis provides an introduction to wireless communication technologies followed by a review of previous research studies on Hybrid Networks (both satellite and terrestrial wireless technologies, particularly Wi-Fi, WiMAX, DVB-RCS, and SCPC). Previous studies have discussed Wi-Fi, WiMAX, DVB-RCS, SCPC and 3G technologies and their standards as well as their properties and characteristics, such as operating frequency, bandwidth, data rate, basic configuration, coverage, power, interference, social issues, security problems, physical and MAC layer design and development issues. Although some previous studies provide valuable contributions to this area of research, they are limited to link layer characteristics, TCP performance, delay, bandwidth, capacity, data rate, and throughput. None of the studies cover all aspects of end-to-end interoperability issues and QoS requirements; such as bandwidth, delay, jitter, latency, packet loss, link performance, TCP and UDP performance, unicast and multicast performance, at end-to-end level, on Hybrid wireless networks.
Interoperability issues are discussed in detail, and a comparison of the different technologies and protocols was carried out using appropriate testing tools, assessing various performance measures including bandwidth, delay, jitter, latency, packet loss, throughput, and availability. The standards, protocol suites/models, and architectures for Wi-Fi, WiMAX, DVB-RCS, and SCPC, alongside different platforms and applications, are discussed and compared. Using a robust approach, which includes a new testing methodology and a generic test plan, the testing was conducted using various realistic test scenarios on real networks, comprising variable numbers and types of nodes. The data, traces, packets, and files were captured from various live scenarios and sites. The test results were analysed in order to measure and compare the characteristics of wireless technologies, devices, protocols, and applications.
The motivation of this research is to study all the end-to-end interoperability issues and Quality of Service requirements for rapidly growing Hybrid Networks in a comprehensive and systematic way.
The significance of this research is that it is based on a comprehensive and systematic investigation of issues and facts, rather than hypothetical ideas, scenarios or simulations. This investigation informed the design of a test methodology for empirical data gathering through real network testing, suitable for measuring hybrid network single-link or end-to-end issues using proven test tools.
This systematic investigation of the issues encompasses an extensive series of tests measuring delay, jitter, packet loss, bandwidth, throughput, availability, performance of audio and video session, multicast and unicast performance, and stress testing. This testing covers most common test scenarios in hybrid networks and gives recommendations in achieving good end-to-end interoperability and QoS in hybrid networks.
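One of the quantities measured above, interarrival jitter, is commonly estimated with the RFC 3550 smoothed estimator, J ← J + (|D| − J)/16, where D is the change in packet transit time between consecutive packets. A minimal version:

```python
# RFC 3550 interarrival jitter estimator: an exponentially smoothed mean
# of the absolute change in one-way transit time between packets.

def rtp_jitter(transit_times_ms):
    """transit_times_ms: per-packet transit times in milliseconds.
    Returns the smoothed jitter estimate after the last packet."""
    j = 0.0
    for prev, cur in zip(transit_times_ms, transit_times_ms[1:]):
        d = abs(cur - prev)
        j += (d - j) / 16.0  # gain of 1/16 per RFC 3550
    return j
```

A perfectly regular stream yields zero jitter; each 16 ms swing in transit time nudges the estimate by 1 ms.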
Contributions of study include the identification of gaps in the research, a description of interoperability issues, a comparison of most common test tools, the development of a generic test plan, a new testing process and methodology, analysis and network design recommendations for end-to-end interoperability issues and QoS requirements. This covers the complete cycle of this research.
It is found that UDP is more suitable than TCP for hybrid wireless networks, particularly for the demanding applications considered, since TCP presents significant problems for multimedia and live traffic, which impose strict QoS requirements on delay, jitter, packet loss, and bandwidth. The main bottleneck for satellite communication is the delay of approximately 600 to 680 ms, due to the long distances (and the finite speed of light) involved in communicating over geostationary satellites.
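The quoted delay can be sanity-checked from geometry: a geostationary satellite sits about 35,786 km above the equator, so a single ground-satellite-ground traversal covers at least twice that distance at the speed of light:

```python
# Lower bound on one-way propagation delay over a geostationary satellite,
# ignoring ground-station offset from the sub-satellite point.
C_KM_S = 299_792.458        # speed of light in vacuum, km/s
GEO_ALTITUDE_KM = 35_786.0  # geostationary orbit altitude

def min_one_way_delay_ms():
    """Ground -> satellite -> ground propagation delay in milliseconds."""
    return 2 * GEO_ALTITUDE_KM / C_KM_S * 1000.0
```

This gives roughly 239 ms one way, so about 478 ms for a round trip; adding coding, queuing, and processing overheads brings the end-to-end figure into the measured 600 to 680 ms range.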
The delay and packet loss can be controlled using various methods, such as traffic classification, traffic prioritization, congestion control, buffer management, delay compensators, protocol compensators, automatic repeat request (ARQ) techniques, flow scheduling, and bandwidth allocation.
Dual-State Kalman Filter Forecasting and Control Theory Applications for Proactive Ramp Metering
Deterioration of freeway traffic flow due to bottlenecks can be ameliorated with ramp metering. A challenge in ramp metering is that data cannot be processed in real time and the output used directly in a control algorithm: by the time processing is completed and a control measure is applied, the traffic state will have changed. A solution to this problem is to forecast the traffic state and implement a control measure based on the forecast.
A dual-state Kalman filter was used to forecast traffic data at two locations on a freeway (I-84). A Kalman filter is an optimal recursive data processing algorithm; predictions are based only on the previous time-step's prediction, so all previous data need not be stored and reprocessed with each new measurement. A coordinated feedback ramp metering control logic was implemented. The closed-loop system seeks to control the traffic density on the mainline while minimizing on-ramp queues through weighting functions.
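A minimal scalar analogue of the filter described shows why no history needs to be stored: each cycle uses only the previous estimate and the newest measurement. The random-walk state model and noise variances here are illustrative assumptions, not the dual-state formulation of the thesis:

```python
# Minimal scalar Kalman filter (sketch). A random-walk state model is
# assumed: the state (e.g., traffic density) is carried forward unchanged
# and uncertainty grows by the process noise q each step.

def kalman_step(x_prev, p_prev, z, q=1.0, r=4.0):
    """One predict/update cycle.
    x_prev, p_prev: previous state estimate and its variance
    z: new measurement; q, r: process / measurement noise variances."""
    # Predict: carry state forward, inflate uncertainty
    x_pred, p_pred = x_prev, p_prev + q
    # Update: blend prediction and measurement by the Kalman gain
    k = p_pred / (p_pred + r)
    x_new = x_pred + k * (z - x_pred)
    p_new = (1.0 - k) * p_pred
    return x_new, p_new
```

Run in a loop, the estimate recursively improves with each successive measurement, exactly the property the closed-loop framework exploits.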
The integration of the Kalman filter with the ramp meter control logic produces a ramp metering algorithmic scheme that is proactive to changes in freeway conditions by controlling a forecasted state. In this closed-loop framework, real-time forecasts are produced with a continuously updated prediction that minimizes errors and recursively improves with each successive measurement. MATLAB was used to model the closed-loop control system as well as to modify the input/output constraints to evaluate and tune controller performance.
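The forecast-driven feedback idea can be illustrated with a classic ALINEA-style metering law, here fed a forecast density rather than a measured one; the gain and rate bounds are illustrative assumptions, not the thesis's coordinated controller:

```python
# ALINEA-style feedback metering law driven by a forecast state (sketch):
# r(k) = r(k-1) + K_R * (target - forecast), clamped to feasible rates.
# Gain and bounds are hypothetical values for illustration.

def metering_rate(r_prev, density_forecast, density_target,
                  k_r=40.0, r_min=200.0, r_max=1800.0):
    """Next on-ramp metering rate (veh/h) from the forecast mainline density."""
    r = r_prev + k_r * (density_target - density_forecast)
    return min(max(r, r_min), r_max)  # keep within signal-feasible rates
```

When the forecast density exceeds the target, the rate is reduced before congestion materializes, which is what makes the scheme proactive rather than reactive.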
An empirical delay model for application in unsignalized intersections in dynamic traffic assignment
Until recently, unsignalized nodes have been either ignored or inadequately represented in Dynamic Traffic Assignment (DTA) models, owing to the difficulty of incorporating internal node conflicts into dynamic flow models. It was assumed that these nodes had little impact on overall model results, but evidence from testing in the Visual Interactive System for Transportation Algorithms (VISTA), a DTA model, reveals that may not be the case. This paper explores recent attempts at characterizing stop sign effects within DTA flow models. Previous studies have found that incorporating these unsignalized and priority movements internal to the flow model requires large amounts of computational power, is challenging to make efficient, and leads to a multiple or infinite solution space. Based on these findings, a deterministic approach is both impractical and likely impossible in the existing framework of the Cell Transmission (CTM) and Link Transmission (LTM) models commonly used in DTA. Thus, a method utilizing empirical relationships based on information readily available in these models may be a more acceptable approach. Microsimulation is much better suited to modeling these types of interactions and is capable of producing results close to reality. For this reason, microsimulation was chosen as a viable method for developing empirical relationships for such complex interactions, to be used as inputs to the macroscopic flow models of DTA. This paper presents a model developed to calculate the delays expected by vehicles at stop approaches based on information that can be taken from a dynamic flow model such as the CTM or LTM. The model is validated against recorded video data analyzed for accuracy. Potential uses and probable implementations of the model are explored to appropriately incorporate unsignalized and priority movements into existing flow models.
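An empirical relationship of the kind described might look like the following; the exponential form and its coefficients are hypothetical placeholders, not the model fitted in this work:

```python
import math

# Sketch of how an empirical stop-approach delay relationship could be
# consumed inside a CTM/LTM node model: average delay per vehicle as a
# function of conflicting major-street flow, with coefficients imagined
# as fit to microsimulation output. Form and values are assumptions.

def stop_approach_delay_s(conflicting_vph, base_delay_s=4.0, k=0.0012):
    """Average delay (s/veh) at a stop-controlled approach, growing with
    the conflicting major-street flow (veh/h)."""
    return base_delay_s * math.exp(k * conflicting_vph)
```

A lookup of this shape gives the flow model a node delay from quantities it already tracks, avoiding the intractable deterministic conflict resolution discussed above.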