26 research outputs found

    Development of safety performance functions and a GIS based spatial analysis of collision data for the City of Saskatoon

    The American Association of State Highway and Transportation Officials (AASHTO) produced the first edition of the Highway Safety Manual (HSM) in 2010. The HSM introduces a six-step safety management process which provides engineers with a systematic and scientific approach to identifying and managing safety concerns on a road network. Each step plays a vital role in improving the safety of target road networks, and the first step, network screening, is where safety issues are first identified. This is accomplished through the use of a series of Safety Performance Functions (SPFs). SPFs are mathematical equations that relate the collision frequency at a particular location to traffic volume and roadway characteristics. The purpose of this research is to develop a series of locally derived SPFs for the City of Saskatoon, allowing engineers to estimate the expected number of collisions when evaluating new roadway design alternatives and when screening the existing roadway network in terms of safety. Locally derived SPFs may yield better predictions of the expected number of collisions than the base model SPFs provided in the HSM. Developing the SPFs required integrating three separate databases, containing roadway characteristics, traffic volume, and collision records, into a single database. Using the statistical language R, SPFs were developed and validated for the City of Saskatoon. The developed SPFs were used to conduct a network screening of Saskatoon's roadways and intersections to identify locations with safety concerns. The results from the network screening were incorporated into ArcGIS to allow a visual analysis of the spatial collision patterns. Finally, the locally derived SPFs were compared with the HSM base model SPFs to determine whether other jurisdictions would benefit from developing their own locally derived SPFs for urban areas.
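    As a rough illustration of what fitting an SPF involves: the thesis used R, but the Python sketch below fits the common HSM functional form E[N] = exp(b0) * AADT^b1 with a negative binomial error structure. The data, parameter values, and the use of statsmodels are illustrative assumptions, not the thesis's actual data or code.

        # Minimal sketch: fitting an SPF of the common HSM form
        # E[N] = exp(b0) * AADT**b1 with negative binomial errors.
        # All data below are synthetic placeholders.
        import numpy as np
        import statsmodels.api as sm

        rng = np.random.default_rng(1)
        aadt = rng.uniform(1_000, 40_000, size=200)   # annual average daily traffic
        mu = np.exp(-6.0) * aadt**0.8                 # assumed underlying SPF
        crashes = rng.negative_binomial(n=2.0, p=2.0 / (2.0 + mu))

        # Log-linear design matrix: ln(E[N]) = b0 + b1 * ln(AADT)
        X = sm.add_constant(np.log(aadt))
        fit = sm.GLM(crashes, X, family=sm.families.NegativeBinomial(alpha=0.5)).fit()
        print(fit.params)  # fitted [b0, b1]; exp(b0) and b1 define the SPF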

    BOTTOM-UP NETWORK SCREENING TO IDENTIFY HIGH COLLISION LOCATIONS FOR THE CITY OF SASKATOON

    Safety network screening is used to identify roadway locations (e.g., intersections and roadway segments) for potential safety improvements. Currently, one of the most commonly used network screening methods in practice is the safety performance function (SPF) based method, which uses traffic volume data as an essential input for the screening process. However, the lack of traffic volume data for target roadway locations restricts the applicability of SPF-based network screening methods. The primary objective of this study is to screen Saskatoon's roadway networks using two existing network screening methods, the binomial test and the beta-binomial (BB) test, that do not require traffic volume as an input. Previous studies have applied the binomial test and/or the BB test without explicitly defining the circumstances under which each test is preferable. This study introduced a formal statistical test for overdispersion, the C(α) test, to determine which network screening method, the binomial test or the BB test, should be used to screen a given study dataset. The C(α) test was applied to a total of 36 study collision datasets, comprising 26 segment collision datasets and 10 intersection collision datasets. The C(α) test results showed that 15 of the 26 (58%) segment collision datasets and all 10 intersection collision datasets contained statistically significant overdispersion at the 95% confidence level (p-value < 0.05). The BB test was therefore selected as the appropriate network screening method for those 15 segment collision datasets and the 10 intersection collision datasets. The remaining 11 segment collision datasets, which did not contain statistically significant overdispersion (p-value ≥ 0.05), were screened using the binomial test. The network screening results for each study location (a segment or an intersection) in all 36 study datasets were presented in terms of the estimated probability obtained from either the binomial test or the BB test. The estimated probability values were used as a ranking measure to select the top 10 or top 30 riskiest locations for both roadway segments and intersections. The network screening results for each study segment or intersection in all 36 study collision datasets were then visually displayed in a set of 36 collision maps developed using ArcGIS. The developed GIS-based collision maps are expected to help engineers in the City of Saskatoon efficiently select potential locations for deploying safety countermeasures that reduce particular collision configurations at the screened locations. As a final component of this thesis, a diagnosis study was performed to identify the most dominant collision configurations at the top 30 riskiest signalized intersections (among a total of 154 signalized intersections) in Saskatoon. This study quantitatively compared the performance of two existing collision diagnosis methods, descriptive data analysis and the BB test, and the comparison revealed that the BB test is a more rigorous collision diagnosis method than descriptive data analysis.
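    To make the screening logic concrete, the sketch below computes the tail probability P(X >= x) of a location's observed count x of a target collision configuration, out of n total collisions, under both a binomial model and an overdispersed beta-binomial model; smaller values indicate riskier locations. The counts, network-wide proportion p0, and overdispersion rho are invented for illustration and are not the study's values.

        # Binomial vs beta-binomial screening of hypothetical locations.
        from scipy.stats import binom, betabinom

        p0 = 0.20    # network-wide share of the target collision configuration
        rho = 0.05   # assumed overdispersion (intra-class correlation)
        a = p0 * (1 - rho) / rho          # beta-binomial parameters with mean p0
        b = (1 - p0) * (1 - rho) / rho

        locations = [("A", 50, 18), ("B", 30, 7), ("C", 80, 25)]  # (name, n, x)
        for name, n, x in locations:
            p_bin = binom.sf(x - 1, n, p0)       # P(X >= x) under the binomial test
            p_bb = betabinom.sf(x - 1, n, a, b)  # P(X >= x) under the BB test
            print(f"{name}: binomial={p_bin:.4f}, beta-binomial={p_bb:.4f}")

    Ranking locations by these probabilities is the kind of step that produces a "top 10" or "top 30" riskiest-locations list as described above.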

    Stability of the skyrmion lattice near the critical temperature in cubic helimagnets

    The phase diagram of cubic helimagnets near the critical temperature is obtained from a Landau-Ginzburg model, including fluctuations to Gaussian level. The free energy is evaluated via a saddle point expansion around the local minima of the Landau-Ginzburg functional. The local minima are computed by solving the Euler-Lagrange equations with appropriate boundary conditions, manifestly preserving the full nonlinearity that is characteristic of skyrmion states. It is shown that the fluctuations stabilize the skyrmion lattice in a region of the phase diagram close to the critical temperature, where it becomes the equilibrium state. A comparison of this approach with previous computations based on a truncated Fourier expansion of magnetic states is given.
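    For context, the Landau-Ginzburg functional referred to above has, in the general helimagnet literature, the standard form below (written in generic notation; this paper's precise conventions may differ):

        % Standard Landau-Ginzburg free energy of a cubic helimagnet:
        % exchange, Dzyaloshinskii-Moriya, Landau, and Zeeman terms.
        F[\mathbf{M}] = \int \mathrm{d}^3x \, \Big[
            A\,(\nabla \mathbf{M})^2
          + D\,\mathbf{M}\cdot(\nabla\times\mathbf{M})
          + a\,(T - T_c)\,\mathbf{M}^2
          + b\,\mathbf{M}^4
          - \mathbf{B}\cdot\mathbf{M}
        \Big]

    The Euler-Lagrange equations mentioned above are the stationarity conditions of such a functional, and the Gaussian fluctuation correction comes from expanding the free energy to second order around each local minimum.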

    California Water Reallocation: Where'd You Get That?

    When thirsty, Californians often avoid going to the market for more water. Instead, they might borrow some from their rich neighbors, they might sue them or, more commonly, they simply take more from users without much of a voice (e.g., the fish or future generations). These alternatives are often superior to using markets. Within markets, a surprising detail emerges: it is uncommon for farmers to fallow fields in order to sell water to another user. Rather, many water transfers are structured so sellers can have their cake and eat it too. While some of these transfers rightly attract jealousy and criticism, they likely do facilitate efficient water use. In discussing these points, I provide a more holistic description of how water users reallocate water, as well as a richer understanding of how California's water market actually works.

    Family house with office part

    The subject of this bachelor's thesis is the preparation of detailed design documentation for the construction of a new family house with an office part. The building has a partial basement, a ground floor, and an attic, and is covered by a gabled roof. A garage adjoins the building. The vertical load-bearing structures use the Porotherm system, and the ceilings are made of prefabricated concrete panels. The roof structure is a timber truss.

    The development of fuzzy controllers using non-dedicated large-scale integrated circuits

    M.Ing. (Electrical & Electronic Engineering). Please refer to the full text to view the abstract.

    SequenceMatch: Revisiting the design of weak-strong augmentations for Semi-supervised learning

    Semi-supervised learning (SSL) has become popular in recent years because it allows the training of a model using a large amount of unlabeled data. However, one issue that many SSL methods face is confirmation bias, which occurs when the model overfits the small labeled training dataset and produces overconfident, incorrect predictions. To address this issue, we propose SequenceMatch, an efficient SSL method that utilizes multiple data augmentations. The key element of SequenceMatch is the inclusion of a medium augmentation for unlabeled data. By taking advantage of different augmentations and the consistency constraints between each pair of augmented examples, SequenceMatch helps reduce the divergence between the model's prediction distributions for weakly and strongly augmented examples. In addition, SequenceMatch defines two different consistency constraints for high- and low-confidence predictions. As a result, SequenceMatch is more data-efficient than ReMixMatch, and more time-efficient than both ReMixMatch (×4) and CoMatch (×2) while achieving higher accuracy. Despite its simplicity, SequenceMatch consistently outperforms prior methods on standard benchmarks such as CIFAR-10/100, SVHN, and STL-10. It also surpasses prior state-of-the-art methods by a large margin on large-scale datasets such as ImageNet, with a 38.46% error rate. Code is available at https://github.com/beandkay/SequenceMatch.
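    The weak/medium/strong consistency idea can be sketched briefly. The following PyTorch snippet is a hypothetical illustration of pseudo-labeling from the weak view with separate constraints for high- and low-confidence predictions; it is not the authors' released code (see the linked repository for that), and the function name and threshold value are invented.

        # Sketch: consistency over a weak/medium/strong augmentation chain.
        import torch
        import torch.nn.functional as F

        def chain_consistency_loss(model, x_weak, x_medium, x_strong, tau=0.95):
            with torch.no_grad():
                p_weak = torch.softmax(model(x_weak), dim=-1)  # pseudo-label source
                conf, pseudo = p_weak.max(dim=-1)
                mask = (conf >= tau).float()                   # high-confidence samples

            logits_med, logits_str = model(x_medium), model(x_strong)

            # Hard-label cross-entropy where the weak view is confident ...
            ce = F.cross_entropy(logits_med, pseudo, reduction="none") \
               + F.cross_entropy(logits_str, pseudo, reduction="none")
            # ... and a softer divergence to the weak distribution elsewhere.
            kl = F.kl_div(F.log_softmax(logits_med, dim=-1), p_weak,
                          reduction="none").sum(-1) \
               + F.kl_div(F.log_softmax(logits_str, dim=-1), p_weak,
                          reduction="none").sum(-1)
            return (mask * ce + (1 - mask) * kl).mean()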

    Mirror - Vol. 31, No. 14 - December 15, 2005

    The Mirror (sometimes called the Fairfield Mirror) is the official student newspaper of Fairfield University, and is published weekly during the academic year (September to May). It runs from 1977 to the present; current issues are available online.

    Simulation methods and error analysis for trawl processes and ambit fields

    Trawl processes are continuous-time, stationary and infinitely divisible processes which can describe a wide range of possible serial correlation patterns in data. In this paper, we introduce new simulation algorithms for trawl processes with monotonic trawl functions and establish their error bounds and convergence properties. We extensively analyse the computational complexity and practical implementation of these algorithms and discuss which one to use depending on the type of Lévy basis. We extend the above methodology to the simulation of kernel-weighted, volatility modulated trawl processes and develop a new simulation algorithm for ambit fields. Finally, we discuss how simulation schemes previously described in the literature can be combined with our methods for decreased computational cost.
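    For a feel of what simulating a trawl process involves, the sketch below uses a naive grid discretisation with an exponential (monotonic) trawl function g(s) = exp(lam * s), s <= 0, and a homogeneous Poisson Lévy basis: independent Poisson mass is placed on grid cells, and X(t) sums the mass inside the trawl set A_t. This is for intuition only, not one of the paper's algorithms, and every parameter value is an invented placeholder.

        # Naive grid simulation of a Poisson-basis trawl process.
        import numpy as np

        rng = np.random.default_rng(0)
        lam, nu = 1.0, 10.0      # trawl decay rate; intensity of the Poisson basis
        dt, dx, T = 0.01, 0.01, 10.0

        s = np.arange(-10.0, T, dt)       # time axis (with history before 0)
        x = np.arange(0.0, 1.0 + dx, dx)  # height axis; g(0) = 1
        cells = rng.poisson(nu * dt * dx, size=(s.size, x.size))  # basis mass

        def trawl_value(t):
            """X(t) = L(A_t) with A_t = {(u, h): u <= t, 0 <= h <= g(u - t)}."""
            past = s <= t
            boundary = np.exp(lam * (s[past] - t))    # g(u - t)
            inside = x[None, :] <= boundary[:, None]  # cells under the boundary
            return (cells[past] * inside).sum()

        path = np.array([trawl_value(t) for t in np.arange(0.0, T, 0.1)])
        print(path[:5])  # marginally ~ Poisson(nu / lam), up to truncation error

    Because the trawl sets at different times are translates of one another, the same cell contributes to many values of X(t), which produces the serial correlation described above; the paper's algorithms exploit this structure far more efficiently than this brute-force evaluation.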