
    Identifying relationship between skid resistance and road crashes using probability-based approach

    Road accidents are of great concern for road and transport departments around the world, causing tremendous loss and danger to the public. Reducing accident rates and crash severity are imperative goals that governments, road and transport authorities, and researchers aim to achieve. In Australia, road crash trauma costs the nation A$15 billion annually. Five people are killed, and 550 are injured every day. Each fatality costs the taxpayer A$1.7 million. Serious injury cases can cost the taxpayer many times the cost of a fatality. Crashes are in general uncontrolled events and depend on a number of interrelated factors such as driver behaviour, traffic conditions, travel speed, road geometry and condition, and vehicle characteristics (e.g. tyre type, pressure and condition, and suspension type and condition). Skid resistance is considered one of the most important surface characteristics, as it has a direct impact on traffic safety. Attempts have been made worldwide to study the relationship between skid resistance and road crashes. Most of these studies used statistical regression and correlation methods to analyse the relationships between skid resistance and road crashes. The outcomes of these studies were mixed and not conclusive. The objective of this paper is to present a probability-based method, from an ongoing study, for identifying the relationship between skid resistance and road crashes. Historical skid resistance and crash data for a road network located on the tropical east coast of Queensland were analysed using the probability-based method. The analysis methodology and results concerning the relationships between skid resistance, road characteristics and crashes are presented.
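
A minimal sketch of the probability-based idea in the abstract above, using entirely hypothetical data: crash outcomes are binned by skid resistance (SR), and an empirical crash probability is computed per bin rather than fitting a regression line.

```python
import random

random.seed(42)

# Hypothetical road-segment data: (skid resistance, crash occurred?).
# Lower skid resistance is assumed here to raise crash likelihood.
skid = [random.uniform(0.3, 0.8) for _ in range(1000)]
segments = [(sr, random.random() < (0.9 - sr)) for sr in skid]

# Empirical crash probability per skid-resistance bin (width 0.1).
bins = {}
for sr, crashed in segments:
    key = round(sr, 1)
    hits, total = bins.get(key, (0, 0))
    bins[key] = (hits + crashed, total + 1)

for key in sorted(bins):
    hits, total = bins[key]
    print(f"SR ~ {key:.1f}: P(crash | SR) = {hits / total:.2f}  (n={total})")
```

The per-bin conditional probabilities can then be compared across SR levels without assuming any particular functional form, which is the appeal of a probability-based analysis over regression.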

    Probability based method of reinforced concrete member approach

    This thesis deals with the assessment of a reinforced concrete element using a fully probabilistic approach and its comparison with the partial reliability factor method. This approach to ultimate limit state assessment is applied to determine the ultimate limit state of a reinforced concrete column loaded by a combination of "N + M", i.e. a normal force and a bending moment. Specifically, the thesis considers a circular reinforced concrete column with and without the effect of wrapping with a unidirectional carbon fibre fabric, and compares the theoretically determined values with values determined by an experiment carried out at the Institute of Concrete and Masonry Structures, Faculty of Civil Engineering, Brno University of Technology.
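
A rough Monte Carlo sketch of the contrast described above, with purely illustrative numbers (not the thesis's actual resistance model): the fully probabilistic check estimates a failure probability directly, while the partial-factor check applies fixed safety factors to characteristic values.

```python
import random

random.seed(1)

N_SIM = 100_000

def sample_resistance():
    # Illustrative: concrete contribution ~ N(2000 kN, 10%), steel ~ N(400 kN, 5%).
    return random.gauss(2000, 200) + random.gauss(400, 20)

def sample_load_effect():
    # Combined "N + M" demand expressed here as an equivalent axial force.
    return random.gauss(1200, 200)

# Fully probabilistic check: estimate P(resistance < load effect).
failures = sum(sample_resistance() < sample_load_effect() for _ in range(N_SIM))
pf = failures / N_SIM
print(f"Estimated probability of failure: {pf:.5f}")

# Partial-factor comparison with fixed factors (illustrative values).
R_d = (2000 / 1.5) + (400 / 1.15)   # gamma_c = 1.5, gamma_s = 1.15
E_d = 1200 * 1.35                   # gamma_G = 1.35
print(f"Partial-factor check passes: {R_d >= E_d}")
```

Comparing the simulated failure probability against a target reliability level is what distinguishes the fully probabilistic route from the deterministic pass/fail of the partial-factor method.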

    Risk assessment in life-cycle costing for road asset management

    The Queensland Department of Main Roads, Australia, spends approximately A$1 billion annually on road infrastructure asset management. To manage road infrastructure effectively, road agencies first need to optimise expenditure on data collection without jeopardising the reliability of using the optimised data to predict maintenance and rehabilitation costs. Secondly, road agencies need to predict the deterioration rates of infrastructure accurately, reflecting local conditions, so that budgets can be estimated accurately. Finally, predicted budgets for maintenance and rehabilitation must provide a known degree of reliability. This paper presents the results of case studies using the probability-based method in an integrated approach: assessing the optimal cost of pavement strength data collection; calibrating deterioration prediction models to suit local conditions; and assessing risk-adjusted budget estimates for road maintenance and rehabilitation over the life cycle. The probability concept opens the path to predicting life-cycle maintenance and rehabilitation budget estimates with a known probability of success (e.g. producing a budget estimate for a project's life-cycle cost with a 5% probability of being exceeded). The paper also presents a conceptual decision-making framework in the form of risk mapping, in which life-cycle budget/cost investment can be considered in conjunction with social, environmental and political issues.
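
The "5% probability of exceeding" idea above amounts to reporting the 95th percentile of a simulated cost distribution. A hypothetical sketch (all cost figures invented for illustration):

```python
import random
import statistics

random.seed(7)

# Simulate one life-cycle cost realisation (A$ million, illustrative).
def simulate_lifecycle_cost():
    routine = sum(random.gauss(2.0, 0.4) for _ in range(20))  # 20-year horizon
    rehab = random.lognormvariate(2.5, 0.3)                   # one major rehab
    return routine + rehab

costs = sorted(simulate_lifecycle_cost() for _ in range(10_000))
p95 = costs[int(0.95 * len(costs))]  # 95th percentile: 5% chance of exceeding
print(f"Mean cost:            A${statistics.mean(costs):.1f}M")
print(f"Risk-adjusted budget: A${p95:.1f}M (5% probability of exceeding)")
```

Budgeting at a chosen percentile rather than at the mean is what makes the estimate "risk-adjusted": the gap between the two quantifies the contingency implied by the uncertainty.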

    Integrating Industry and National Economic Accounts: First Steps and Future Improvements

    The integration of the annual I-O accounts with the GDP-by-industry accounts is the most recent in a series of improvements to the industry accounts provided by BEA in recent years. BEA prepares two sets of national industry accounts: the I-O accounts, which consist of the benchmark I-O accounts and the annual I-O accounts, and the GDP-by-industry accounts. Both the I-O accounts and the GDP-by-industry accounts present measures of gross output, intermediate inputs, and value added by industry. In the past, however, they were inconsistent because of differences in methodologies, classification frameworks, and source data. The integration of these accounts eliminated these inconsistencies and improved the accuracy of both sets of accounts. The integration of the annual industry accounts represents a major advance in the timeliness, accuracy, and consistency of these accounts, and is a result of significant improvements in BEA's estimating methods. The paper describes the new methodology and the future steps required to integrate the industry accounts with the NIPAs. The new methodology combines source data between the two industry accounts to improve accuracy; it prepares the newly integrated accounts within an I-O framework that balances and reconciles industry production with commodity usage. Moreover, the new methodology accelerates the release of the annual I-O accounts by two years and, for the first time, provides a consistent time series of annual I-O accounts. Three appendices are provided: a description of the probability-based method used to rank source data by quality; a description of the new balancing procedure used for producing the annual I-O accounts; and a description of the computation method used to estimate chain-type price and quantity indexes in the GDP-by-industry accounts.
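
One common way to "balance and reconcile" a matrix of flows against known row and column totals is biproportional (RAS-style) scaling. The sketch below is generic and hypothetical; it is not a description of BEA's actual balancing procedure.

```python
# RAS-style biproportional balancing: alternately rescale rows and columns of a
# seed matrix until its margins match the given control totals (which must sum
# to the same grand total for convergence).
def ras_balance(matrix, row_totals, col_totals, iters=100):
    m = [row[:] for row in matrix]
    for _ in range(iters):
        for i, target in enumerate(row_totals):        # row scaling step
            s = sum(m[i])
            if s:
                m[i] = [v * target / s for v in m[i]]
        for j, target in enumerate(col_totals):        # column scaling step
            s = sum(row[j] for row in m)
            if s:
                for row in m:
                    row[j] *= target / s
    return m

seed = [[10, 20], [30, 40]]
balanced = ras_balance(seed, row_totals=[35, 65], col_totals=[45, 55])
print([[round(v, 2) for v in row] for row in balanced])
```

The appeal of biproportional methods is that they preserve the structure of the seed matrix while enforcing consistency with independently estimated totals.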

    Approach of Solving Multi-objective Programming Problem by Means of Probability Theory and Uniform Experimental Design

    In this paper, an approach to the multi-objective programming problem is formulated by means of probability-based multi-objective optimization, discrete uniform experimental design, and a sequential algorithm for optimization. The probability-based method for multi-objective optimization converts the multi-objective optimization problem into a single-objective one from the viewpoint of probability theory. The discrete uniform experimental design supplies efficient sampling to simplify the conversion, and the sequential algorithm carries out further optimization. The corresponding treatments reveal the essence of multi-objective programming and rationally account for the simultaneous optimization of each objective. Two examples are given to illustrate the rationality of the approach.
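
A toy sketch of the conversion idea: each objective value is mapped to a "partial preference probability" over the candidates, and the product across objectives yields one total probability per candidate, turning the multi-objective problem into a single ranking criterion. The candidates, objective names, and the min-objective transformation below are illustrative assumptions, not the paper's worked examples.

```python
# Hypothetical candidates with two objectives: maximize strength, minimize cost.
candidates = {
    "A": {"strength": 520.0, "cost": 14.0},
    "B": {"strength": 480.0, "cost": 10.0},
    "C": {"strength": 610.0, "cost": 18.0},
}
beneficial = {"strength": True, "cost": False}

totals = {name: 1.0 for name in candidates}
for obj, is_beneficial in beneficial.items():
    values = {n: c[obj] for n, c in candidates.items()}
    if not is_beneficial:
        # Flip "smaller is better" so larger transformed values are preferred.
        hi, lo = max(values.values()), min(values.values())
        values = {n: hi + lo - v for n, v in values.items()}
    norm = sum(values.values())
    for n in candidates:
        totals[n] *= values[n] / norm   # partial preference probability

best = max(totals, key=totals.get)
print({n: round(p, 4) for n, p in totals.items()}, "-> best:", best)
```

Because every objective contributes a probability factor, a candidate must do reasonably well on all objectives simultaneously to rank highly, which is the "simultaneous optimization" the abstract refers to.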

    Two approaches for synthesizing scalable residential energy consumption data

    © 2019 Elsevier B.V. Many fields require scalable and detailed energy consumption data for different study purposes. However, due to privacy issues, it is often difficult to obtain sufficiently large datasets. This paper proposes two different methods for synthesizing fine-grained energy consumption data for residential households, namely a regression-based method and a probability-based method. Each uses a supervised machine learning approach, training models on a relatively small real-world dataset and then generating large-scale time series from those models. The paper describes the two methods in detail, including the data generation process, optimization techniques, and parallel data generation. It evaluates the performance of the two methods by comparing the resulting consumption profiles with real-world data, including patterns, statistics, and parallel data generation in a cluster. The results demonstrate the effectiveness of the proposed methods and their efficiency in generating large-scale datasets.
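
A minimal sketch of the probability-based direction, with synthetic stand-in data: learn a per-hour empirical distribution of consumption from a small "real" dataset, then sample from it to synthesize arbitrarily many household-days. The data model here is an illustrative assumption, not the paper's actual pipeline.

```python
import random

random.seed(3)

# Stand-in "real" dataset: 30 observed daily readings per hour of day (kWh),
# with daytime hours assumed to consume more on average.
real_data = {h: [round(random.gauss(0.8 + 0.5 * (7 <= h <= 22), 0.2), 3)
                 for _ in range(30)]
             for h in range(24)}

def synthesize_day():
    # Draw each hour's consumption from that hour's empirical distribution.
    return [max(0.0, random.choice(real_data[h])) for h in range(24)]

synthetic_week = [synthesize_day() for _ in range(7)]
print("Sample day (kWh/hour):", synthetic_week[0])
```

Because each synthetic day is assembled by resampling, the generated series preserves the hour-of-day patterns of the training data without exposing any real household's full profile, and scaling up is just drawing more samples.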

    Maximum Likelihood and Bayesian Estimation of Skeletal Age-at-Death from the Human Pubic Symphysis

    A number of methodological problems have recently plagued studies of adult skeletal age-at-death estimation. Over the last two decades, researchers have expended considerable effort to place age estimation studies on firmer statistical ground. However, many current methods can still be criticized because they make unjustifiable assumptions or use inappropriate statistical models. Much of the controversy surrounding age-at-death estimation has focused specifically on the question of applying age standards from a reference collection of known-age individuals to a target group of unknown age. The current study, involving a large sample (n = 739) of adult male pubic symphysis data, demonstrates a probability-based method for obtaining the full posterior distribution of age-at-death conditional on observed symphyseal phases, using both maximum likelihood and Bayesian estimators. With a maximum likelihood or Bayesian estimator (where the prior distribution for age is external to the reference sample), it is possible to produce age estimates that are independent of the reference sample's age-at-death distribution.
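
The Bayesian estimator described above combines a likelihood of observing a symphyseal phase given age with a prior on age that is external to the reference sample, giving a full posterior for age-at-death. The toy likelihood and prior below are invented for illustration; the actual study fits these from data.

```python
import math

ages = list(range(20, 80))

def likelihood(phase, age):
    # Toy model: each phase has a "center age" with Gaussian spread.
    centers = {1: 22, 2: 28, 3: 36, 4: 45, 5: 55, 6: 65}
    return math.exp(-((age - centers[phase]) / 8.0) ** 2)

def prior(age):
    # External prior on age, NOT the reference sample's age distribution.
    return math.exp(-((age - 40) / 15.0) ** 2)

def posterior(phase):
    # Bayes' rule on a discrete age grid: p(age | phase) ∝ p(phase | age) p(age).
    unnorm = [likelihood(phase, a) * prior(a) for a in ages]
    z = sum(unnorm)
    return [p / z for p in unnorm]

post = posterior(4)
mode_age = ages[post.index(max(post))]
print("Posterior mode for phase 4:", mode_age)
```

Reporting the whole posterior, rather than a single point estimate, is what lets the method convey how uncertain a phase-based age assessment really is.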

    PROBABILITY-BASED SIMULATION OF 2-D VELOCITY DISTRIBUTION AND DISCHARGE ESTIMATION IN OPEN CHANNEL FLOW

    A probability-based method is presented that can be used to simulate the 2-D velocity distribution in rectangular open channels and to estimate the flow discharge. The method is based on Chiu's velocity distribution equation. A technique for estimating a parameter of the 2-D velocity equation has been developed, by which the 2-D velocity distribution in rectangular open channels can be simulated using one or several velocity samples, or even without any velocity data. The study also developed an efficient method of discharge estimation in rivers that is applicable regardless of whether the flow is steady or unsteady; it requires only a quick velocity sampling. The relation between the surface velocity and the vertical mean velocity has been studied, and can be used to develop a non-contact method of discharge measurement. Under the same framework of analysis, a new slope-area method has been developed to determine the flow discharge. It can reduce errors due to uncertainties in Manning's n and the energy coefficient that exist in the widely used slope-area method.
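
For context, Chiu's entropy-based velocity distribution has the 1-D form u = (u_max / M) ln(1 + (e^M − 1) ξ), where M is the entropy parameter and ξ a dimensionless coordinate from 0 at the bed to 1 at the point of maximum velocity. A small sketch (the channel values are illustrative, and the 2-D extension in the paper is not reproduced here):

```python
import math

def chiu_velocity(u_max, M, xi):
    # Chiu's velocity distribution: u = (u_max / M) * ln(1 + (e^M - 1) * xi).
    return (u_max / M) * math.log(1 + (math.exp(M) - 1) * xi)

def mean_over_max(M):
    # Chiu's relation between mean and maximum velocity implied by M.
    return math.exp(M) / (math.exp(M) - 1) - 1 / M

u_max, M = 2.0, 6.0  # illustrative: max velocity 2.0 m/s, entropy parameter 6
profile = [round(chiu_velocity(u_max, M, xi / 10), 3) for xi in range(11)]
print("Velocity profile:", profile)
print("u_mean / u_max =", round(mean_over_max(M), 3))
```

The mean-to-maximum ratio is what makes quick discharge estimation possible: once M is known for a channel, a single (even surface) velocity observation can be converted to a cross-sectional mean.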