
    Implementing Snow Load Monitoring to Control Reliability of a Stadium Roof

    This contribution shows how monitoring can be used to control the reliability of a structure that does not comply with the requirements of the Eurocodes. A general methodology for obtaining cost-optimal decisions using limit state design, probabilistic reliability analysis and cost estimates is applied in a full-scale case study dealing with the roof of a stadium located in Northern Italy. The results demonstrate the potential of monitoring systems and probabilistic reliability analysis to support decisions regarding safety measures such as snow removal or temporary closure of the stadium.
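
    To make the kind of probabilistic reliability analysis mentioned above concrete, here is a minimal Monte Carlo sketch (not the case-study model): it assumes a simple limit state g = R - S with a hypothetical lognormal roof resistance R and Gumbel-distributed snow load S; all distribution parameters, the decision threshold and the measured load are illustrative placeholders.

    ```python
    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(0)
    n = 1_000_000

    # Hypothetical roof resistance R and annual maximum snow load S (kN/m^2);
    # the parameters are placeholders, not values from the stadium case study.
    R = rng.lognormal(mean=np.log(2.0), sigma=0.10, size=n)
    S = rng.gumbel(loc=0.8, scale=0.25, size=n)

    # Limit state g = R - S; failure when g < 0.
    pf = np.mean(R - S < 0.0)
    beta = -stats.norm.ppf(pf)          # generalized reliability index
    print(f"P_f ~ {pf:.2e}, beta ~ {beta:.2f}")

    # Monitoring sketch: a measured snow load above a decision threshold would
    # trigger a safety measure such as snow removal or temporary closure.
    threshold = np.quantile(S, 0.999)
    measured = 1.1                      # hypothetical monitored load, kN/m^2
    if measured > threshold:
        print("Trigger snow removal / temporary closure")
    ```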

    Probabilistic Modeling of Structural Forces

    Since forces acting on structures fluctuate widely in time and space during the lifetime of a structure, their variation should be described by probability distributions. The probabilistic definition of forces is expressed by random field variables with stochastic parameters. Structural forces are simulated using Normal and Gamma probability distribution functions. The basic model given by the JCSS (Joint Committee on Structural Safety) code principles is used to take these variations into account. In the simulation of live loads, composed of sustained and intermittent loads, arrival times are assumed to follow a Poisson process, so the time intervals between them are exponentially distributed. The simulated loads are evaluated in terms of percentiles, correlation effects, reduction factors and extreme values, and the results are compared with those of a deterministic model. The probabilistic model is found to be more realistic, and its results can be used to compute the specific fractiles needed in load and resistance factor design.
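
    As an illustration of the simulation scheme described above (a sketch under simplified assumptions, not the study's implementation): sustained and intermittent live-load components are renewed at exponentially distributed intervals (Poisson arrivals) with Gamma-distributed magnitudes, and the lifetime maximum is estimated by crude Monte Carlo. All rates and Gamma parameters are invented for illustration, and the simple max-plus-max combination is a crude upper bound rather than the JCSS combination rule.

    ```python
    import numpy as np

    rng = np.random.default_rng(1)

    def lifetime_max_live_load(T=50.0, lam_sus=0.2, lam_int=1.0,
                               k_sus=2.0, theta_sus=0.3,
                               k_int=1.5, theta_int=0.2):
        """Lifetime (T years) maximum of a live load with a sustained part
        (renewed on average every 1/lam_sus years) and intermittent spikes
        arriving as a Poisson process with rate lam_int per year.
        All parameters are illustrative placeholders."""
        n_sus = rng.poisson(lam_sus * T) + 1             # at least one sustained state
        q_sus = rng.gamma(k_sus, theta_sus, size=n_sus)  # Gamma magnitudes

        n_int = rng.poisson(lam_int * T)
        q_int = rng.gamma(k_int, theta_int, size=n_int) if n_int else np.zeros(1)

        # Crude combination: largest spike on top of the largest sustained load.
        return q_sus.max() + q_int.max()

    samples = np.array([lifetime_max_live_load() for _ in range(10_000)])
    print("95th percentile of the simulated lifetime maximum:",
          round(np.quantile(samples, 0.95), 3))
    ```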

    Samplers and Extractors for Unbounded Functions

    Blasiok (SODA'18) recently introduced the notion of a subgaussian sampler, defined as an averaging sampler for approximating the mean of functions f from {0,1}^m to the real numbers such that f(U_m) has subgaussian tails, and asked for explicit constructions. In this work, we give the first explicit constructions of subgaussian samplers (and in fact averaging samplers for the broader class of subexponential functions) that match the best known constructions of averaging samplers for [0,1]-bounded functions in the regime of parameters where the approximation error epsilon and failure probability delta are subconstant. Our constructions are established via an extension of the standard notion of randomness extractor (Nisan and Zuckerman, JCSS'96) where the error is measured by an arbitrary divergence rather than total variation distance, and a generalization of Zuckerman's equivalence (Random Struct. Alg.'97) between extractors and samplers. We believe that the framework we develop, and specifically the notion of an extractor for the Kullback-Leibler (KL) divergence, are of independent interest. In particular, KL-extractors are stronger than both standard extractors and subgaussian samplers, but we show that they exist with essentially the same parameters (constructively and non-constructively) as standard extractors.
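
    For orientation, one standard way to state the averaging-sampler guarantee referred to above (a paraphrase of the usual definition, not the paper's exact statement) is the following, where the bounded case takes f: {0,1}^m -> [0,1] and the subgaussian case only requires f(U_m) to have subgaussian tails:

    ```latex
    % (\epsilon,\delta) averaging sampler Samp : \{0,1\}^n \to (\{0,1\}^m)^D
    \Pr_{(z_1,\dots,z_D)\leftarrow\mathrm{Samp}(U_n)}
      \Bigl[\,\Bigl|\frac{1}{D}\sum_{i=1}^{D} f(z_i)-\mathbb{E}[f(U_m)]\Bigr|>\epsilon\Bigr]\le\delta
    % subgaussian tails of f(U_m), up to the choice of constants:
    \Pr\bigl[\,|f(U_m)-\mathbb{E}[f(U_m)]|\ge t\,\bigr]\le 2e^{-t^{2}/2}
    \quad\text{for all } t\ge 0.
    ```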

    Querying Schemas With Access Restrictions

    We study verification of systems whose transitions consist of accesses to a Web-based data source. An access is a lookup on a relation within a relational database that fixes values for a set of positions in the relation. For example, a transition can represent access to a Web form, where the user is restricted to filling in values for a particular set of fields. We look at verifying properties of a schema describing the possible accesses of such a system. We present a language in which one can describe the properties of an access path and also specify additional restrictions on accesses that are enforced by the schema. Our main property language, AccLTL, is based on a first-order extension of linear-time temporal logic, interpreting access paths as sequences of relational structures. We also present a lower-level automaton model, A-automata, into which AccLTL specifications can compile. We show that AccLTL and A-automata can express static analysis problems related to "querying with limited access patterns" that have been studied in the database literature, such as whether an access is relevant to answering a query, and whether two queries are equivalent in the accessible data they can return. We prove decidability and complexity results for several restrictions and variants of AccLTL, and explain which properties of paths can be expressed in each restriction. Comment: VLDB201
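
    To make the notion of an access concrete, here is a toy sketch (hypothetical relation and fields, not the paper's formal model): the relation Flights can only be looked up by binding the origin and dest positions, much like a web search form, and an access path is then a sequence of such lookups whose properties AccLTL is designed to express.

    ```python
    # Toy illustration of an access with a limited access pattern.
    FLIGHTS = [
        {"origin": "LIN", "dest": "CDG", "price": 120},
        {"origin": "LIN", "dest": "CDG", "price": 95},
        {"origin": "MXP", "dest": "JFK", "price": 540},
    ]

    def access_flights(origin: str, dest: str) -> list[dict]:
        """An access: a lookup that fixes values for a set of positions
        (here origin and dest) and returns all matching tuples."""
        return [t for t in FLIGHTS if t["origin"] == origin and t["dest"] == dest]

    # One step of an access path: the accessible data returned by this access.
    print(access_flights("LIN", "CDG"))
    ```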

    Quantification of the conditional value of SHM data for the fatigue safety evaluation of a road viaduct


    Fine-grained dichotomies for the Tutte plane and Boolean #CSP

    Jaeger, Vertigan, and Welsh [15] proved a dichotomy for the complexity of evaluating the Tutte polynomial at fixed points: the evaluation is #P-hard almost everywhere, and the remaining points admit polynomial-time algorithms. Dell, Husfeldt, and Wahlén [9] and Husfeldt and Taslaman [12], in combination with Curticapean [7], extended the #P-hardness results to tight lower bounds under the counting exponential time hypothesis #ETH, with the exception of the line y = 1, which was left open. We complete the dichotomy theorem for the Tutte polynomial under #ETH by proving that the number of all acyclic subgraphs of a given n-vertex graph cannot be determined in time exp(o(n)) unless #ETH fails. Another dichotomy theorem we strengthen is that of Creignou and Hermann [6] for counting the number of satisfying assignments to a constraint satisfaction problem instance over the Boolean domain. We prove that all #P-hard cases are also hard under #ETH. The main ingredient is to prove that the number of independent sets in bipartite graphs with n vertices cannot be computed in time exp(o(n)) unless #ETH fails. To prove our results, we use the block interpolation idea of Curticapean [7] and transfer it to systems of linear equations that might not directly correspond to interpolation. Comment: 16 pages, 1 figure
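
    For context, the standard definition of the Tutte polynomial (textbook material, not taken from the paper) shows why the number of acyclic subgraphs is exactly an evaluation on the previously open line y = 1:

    ```latex
    T(G;x,y)=\sum_{A\subseteq E}(x-1)^{\,r(E)-r(A)}\,(y-1)^{\,|A|-r(A)},
    \qquad r(A)=|V|-c(V,A),
    ```

    where c(V,A) is the number of connected components of the spanning subgraph (V,A). At y = 1 only edge sets with |A| = r(A), i.e. acyclic subgraphs, contribute, so T(G;2,1) counts exactly the acyclic subgraphs whose #ETH-hardness the paper establishes.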

    Durability Analysis of Concrete Bridge Deck Exposed to the Chloride Ions Using Direct Optimized Probabilistic Calculation

    The durability of reinforced concrete structures has been a widely discussed problem in recent years. Concrete structures in an external environment are very often affected by chloride ions from de-icing salt or sea water. Chloride ions penetrate through the concrete cover of the reinforcement and can eventually cause corrosion of the steel. However, when estimating the durability of a structure, it is sometimes not possible to express the parameters as constant values; probabilistic methods are therefore useful, and the variability of inputs and outputs can be expressed by histograms. Two probabilistic approaches were applied to this task: Monte Carlo simulation with the Simulation-Based Reliability Assessment method, which is widely used for this type of problem, and the Direct Optimized Probabilistic Calculation, which is still a relatively new approach. The result is a comparison of the two methods in terms of accuracy on a model of one-dimensional chloride penetration with a time-independent diffusion coefficient, using Fick's second law of diffusion.
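
    As a minimal sketch of the kind of probabilistic durability calculation described above (illustrative only; all input histograms are invented, and the method shown is plain Monte Carlo rather than the Direct Optimized Probabilistic Calculation): the one-dimensional solution of Fick's second law with a time-independent diffusion coefficient is C(x,t) = C_s (1 - erf(x / (2*sqrt(D*t)))), and corrosion is assumed to initiate once the concentration at the cover depth exceeds a critical value.

    ```python
    import numpy as np
    from scipy.special import erf

    rng = np.random.default_rng(2)
    n = 200_000

    # Hypothetical input histograms (illustrative values, not those of the study):
    a      = rng.normal(50.0, 5.0, n)              # concrete cover depth [mm]
    D      = rng.lognormal(np.log(30.0), 0.3, n)   # diffusion coefficient [mm^2/year]
    C_s    = rng.normal(0.60, 0.10, n)             # surface chloride concentration
    C_crit = rng.normal(0.40, 0.05, n)             # critical (threshold) concentration

    def chloride(x, t, D, C_s):
        """C(x,t) from Fick's second law with a constant surface concentration
        and a time-independent diffusion coefficient D."""
        return C_s * (1.0 - erf(x / (2.0 * np.sqrt(D * t))))

    t = 50.0  # years of exposure
    p_init = np.mean(chloride(a, t, D, C_s) >= C_crit)
    print(f"P(corrosion initiation within {t:.0f} years) ~ {p_init:.3f}")
    ```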

    The IceCube Realtime Alert System

    Although high-energy astrophysical neutrinos were first detected in 2013, their origin is still unknown. Aiming to identify an electromagnetic counterpart of a rapidly fading source, we have implemented a realtime analysis framework for the IceCube neutrino observatory. Several analyses selecting neutrinos of astrophysical origin are now operating in realtime at the detector site in Antarctica and are producing alerts to the community to enable rapid follow-up observations. The goal of these observations is to locate the astrophysical objects responsible for the neutrino signals. This paper highlights the infrastructure in place, both at the South Pole detector site and at IceCube facilities in the north, that has enabled this fast follow-up program to be developed. Additionally, this paper presents the first realtime analyses to be activated within this framework, highlights their sensitivities to astrophysical neutrinos and background event rates, and presents an outlook for future discoveries. Comment: 33 pages, 9 figures; published in Astroparticle Physics