
    Design and Evaluate Coordinated Ramp Metering Strategies for Utah Freeways

    MPC-641: During the past few decades, ramp metering control has been widely implemented in many U.S. states, including Utah. Numerous studies and applications have demonstrated that ramp metering is an effective strategy for reducing overall freeway congestion by managing the amount of traffic entering the freeway. Ramp metering controllers can be implemented as coordinated or uncoordinated systems. Currently, Utah freeway on-ramps are operated in an uncoordinated way. Despite improving the operational efficiency of mainline flows, uncoordinated ramp metering inevitably creates additional delays for ramp flows. Therefore, this project aims to assist the Utah Department of Transportation (UDOT) in deploying coordinated ramp metering systems and evaluating their performance. First, we leverage a method to identify existing freeway bottlenecks using current UDOT datasets, including PeMS and ClearGuide. From these identified bottleneck locations, we select a site likely to benefit from coordinated ramp metering. A VISSIM model is then developed for the selected corridor and calibrated against collected traffic flow data. We apply the calibrated model in simulations to evaluate system performance under different freeway mainline congestion levels. Finally, the calibrated VISSIM model is used to evaluate the coordinated ramp metering strategy of the bottleneck algorithm from both operational and safety perspectives.
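    The abstract does not detail the bottleneck algorithm's rate computation, but its core idea can be sketched: when measured flow at a bottleneck exceeds capacity, the excess volume is distributed as rate reductions across upstream on-ramps in proportion to assigned weights, and each ramp takes the resulting coordinated rate subject to a minimum. The function name, the weighting scheme, and the 240 veh/h floor below are illustrative assumptions, not the project's implementation:

```python
def coordinated_metering_rates(local_rates, excess_volume, weights, min_rate=240.0):
    """Distribute a required mainline volume reduction across upstream on-ramps
    (bottleneck-style coordination; a minimal sketch, not UDOT's deployed logic).

    local_rates   -- per-ramp rates from local, uncoordinated control (veh/h)
    excess_volume -- flow above bottleneck capacity that must be removed (veh/h)
    weights       -- assumed influence of each ramp on the bottleneck
    min_rate      -- floor to keep ramp queues from growing without bound (veh/h)
    """
    total_weight = sum(weights)
    rates = []
    for rate, w in zip(local_rates, weights):
        reduction = excess_volume * w / total_weight  # this ramp's share of the cut
        rates.append(max(min_rate, rate - reduction))
    return rates

# Two ramps metering at 900 and 600 veh/h; 300 veh/h must be removed,
# with the nearer ramp weighted twice as heavily as the farther one.
print(coordinated_metering_rates([900.0, 600.0], 300.0, [2.0, 1.0]))  # [700.0, 500.0]
```

    In a simulation study such as this one, the coordinated rates would replace the local rates whenever a bottleneck is active, which is the mechanism that trades some ramp delay for mainline throughput.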

    A domain-driven method for creating self-adaptive application architecture

    With the increasing complexity of modern software systems, software engineers have introduced self-adaptation techniques from the field of control theory into software development. However, it remains difficult to construct self-adaptive software systems. Recognizing the importance of software architecture, this dissertation addresses the question of how to design a domain-specific self-adaptive software application architecture in a principled way. Specifically, methods are still lacking for helping software engineers generate software architectures that are consistent with domain knowledge. To achieve this research goal, this dissertation has: 1) investigated the existing definitions of software architecture; 2) proposed a framework for understanding self-adaptive software application architecture via appropriate architectural patterns; 3) proposed a novel high-level language, and supporting tools, to specify domain-specific uncertainty; 4) proposed an improved version of Grasp, and supporting tools, so that users can describe the dynamism of a self-adaptive application; 5) proposed a novel architectural pattern by selecting architectural patterns in a principled way; and 6) evaluated this work by applying these methods to a business project.

    Learning objectives for Department of the Navy entry-level budget analysts (Series GS-560)

    This thesis identifies learning objectives which Department of the Navy (DoN) entry-level budget analysts should master during their first year on the job in order to perform effectively and efficiently. It provides various demographics of budget analysts, including job requirements and locations of assignments. A discussion of the Department of Defense financial management environment focuses on current and future trends which are or will be affecting budget analysts. In addition, training courses and programs that are currently available to DoN entry-level budget analysts are examined. The primary conclusion of this research was that there is a dire need for quality training of DoN entry-level budget analysts. Recommendations are offered on how the learning objectives identified by this study can be utilized to assist in the development of quality training courses, materials and programs.
    http://archive.org/details/learningobjectiv00holf
    Lieutenant, United States Navy
    Approved for public release; distribution is unlimited.

    Building long-term scenarios for development: The methodological state of the art with an application to foreign direct investment in Africa

    "This study provides an introduction to scenario analysis as a tool for development policy planning. The study is divided into three parts. The first part of the study outlines the central characteristics of scenario analysis methods, distinguishes scenario analysis from other research approaches, and presents a general guide for building scenarios. Illustrations of applications of scenario analysis methods in fields related to global development complement the methodological discussions in this part of the study. A second part of the study develops an original illustration of how scenario methods can be applied to examine development policy issues by focusing on the question of how foreign direct investment flows could change the African development landscape toward the year 2030. This chapter culminates with the presentation of four fictional narratives charting how investment patterns and development outcomes could unfold over the next two decades. The third and final chapter of the study outlines several considerations that policymakers potentially interested in using scenario methods as a supplement to their existing planning tools should make in evaluating whether the application of these methods within their organizations is desirable." (excerpt)

    Evaluating Detours for a Major Construction Project in the Era of Real-time Route Guidance

    69A3551747104: Major road construction projects can be significant sources of traffic congestion and motorist delays. Maintaining agencies typically attempt to mitigate these impacts by designating detour routes and providing project information to motorists. This information can be conveyed through a variety of media, from traditional static and variable roadway signage placed in the field to electronic media including web sites, radio and television advertisements, call centers, text messaging, and navigation apps. In this era of real-time traffic information and in-vehicle route guidance, it is not clear to what extent this detour information is used and which messaging components are most effective. This study used the Interstate 59/20 reconstruction project in Birmingham, AL to evaluate the detour planning process and the effectiveness of the resulting detour and information strategies. The objective was to develop recommendations and best practices that can be applied to future construction projects and allow transportation agencies to allocate project resources to greatest effect. The evaluation included a review of the transportation modeling process used to project traffic diversions and designate detour routes, a survey of motorists to determine the sources of information they used to choose detour routes during construction, and a study of traffic patterns before, during, and after the project to understand if and how detour patterns changed over the course of the one-year project. The study resulted in recommendations for conducting planning studies for large roadway projects and found that factors such as transit usage assumptions, employer work policies, and roadway capacity assumptions can have significant impacts on model accuracy. The survey found that motorists used a wide variety of information sources when selecting detour routes and that they often modified those routes based on real-time data.
    The travel time and traffic count analysis found that detour patterns did vary over time as the transportation system reached equilibrium. It also found that actual traffic patterns during the reconstruction project did not always match responses to the motorist survey, suggesting that motorists used designated detour routes initially but adjusted them during the course of the project.

    Methods for the evaluation of biomarkers in patients with kidney and liver diseases: multicentre research programme including ELUCIDATE RCT

    Background: Protein biomarkers associated with the activity and outcomes of diseases are being identified by modern proteomic technologies. They may be simple, accessible, cheap and safe tests that can inform diagnosis, prognosis, treatment selection, and monitoring of disease activity and therapy, and may substitute for complex, invasive and expensive tests. However, their potential is not yet being realised.

    Design and methods: The study consisted of three workstreams to create a framework for research: workstream 1, methodology – to define current practice and explore methodology innovations for biomarkers for monitoring disease; workstream 2, clinical translation – to create a framework of research practice, high-quality samples and related clinical data to evaluate the validity and clinical utility of protein biomarkers; and workstream 3, the ELF to Uncover Cirrhosis as an Indication for Diagnosis and Action for Treatable Event (ELUCIDATE) randomised controlled trial (RCT) – an exemplar RCT of an established test, the ADVIA Centaur® Enhanced Liver Fibrosis (ELF) test (Siemens Healthcare Diagnostics Ltd, Camberley, UK) [consisting of a panel of three markers – (1) serum hyaluronic acid, (2) amino-terminal propeptide of type III procollagen and (3) tissue inhibitor of metalloproteinase 1], for liver cirrhosis, to determine its impact on diagnostic timing, the management of cirrhosis, the process of care and the improvement of outcomes.

    Results: The methodology workstream evaluated the quality of recommendations for using prostate-specific antigen to monitor patients, systematically reviewed RCTs of monitoring strategies and reviewed the monitoring biomarker literature and how monitoring can have an impact on outcomes. Simulation studies were conducted to evaluate monitoring strategies and their potential to improve health care. The monitoring biomarker literature is modest and robust conclusions are infrequent. We recommend improvements in research practice. Patients strongly endorsed the need for robust and conclusive research in this area. The clinical translation workstream focused on analytical and clinical validity. Cohorts were established for renal cell carcinoma (RCC) and renal transplantation (RT), with samples and patient data from multiple centres, as a rapid-access resource to evaluate the validity of biomarkers. Candidate biomarkers for RCC and RT were identified from the literature, their quality was evaluated, and selected biomarkers were prioritised. The duration of follow-up was a limitation, but biomarkers were identified that may be taken forward for clinical utility. In the third workstream, the ELUCIDATE trial registered 1303 patients and randomised 878 patients out of a target of 1000. The trial started late and recruited slowly initially but ultimately recruited with good statistical power to answer the key questions. ELF monitoring altered the patient process of care and may show benefits from the early introduction of interventions with further follow-up. The ELUCIDATE trial was an ‘exemplar’ trial that has demonstrated the challenges of evaluating biomarker strategies in ‘end-to-end’ RCTs and will inform future study designs.

    Conclusions: The limitations of the programme were principally that, during the collection and curation of the cohorts of patients with RCC and RT, the pace of discovery of new biomarkers in commercial and non-commercial research was slower than anticipated, and so conclusive evaluations using the cohorts are few; however, access to the cohorts will be sustained for future new biomarkers. The ELUCIDATE trial was slow to start and recruit to, with a late surge of recruitment, and so final conclusions about the impact of the ELF test on long-term outcomes await further follow-up. The findings from the three workstreams were used to synthesise a strategy and framework for future biomarker evaluations incorporating innovations in study design, health economics and health informatics.