
    Dynamic Complexity of Parity Exists Queries

    Given a graph whose nodes may be coloured red, the parity of the number of red nodes can easily be maintained with first-order update rules in the dynamic complexity framework DynFO of Patnaik and Immerman. Can this be generalised to other, or even all, queries that are definable in first-order logic extended by parity quantifiers? We consider the query that asks whether the number of nodes that have an edge to a red node is odd. Already this simple query, of quantifier structure parity-exists, is a major roadblock for dynamically capturing extensions of first-order logic. We show that this query cannot be maintained with quantifier-free first-order update rules, and that variants of it induce a hierarchy for such update rules with respect to the arity of the maintained auxiliary relations. Towards maintaining the query with full first-order update rules, it is shown that degree-restricted variants can be maintained.
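The static version of the parity-exists query is easy to state in code. The sketch below evaluates it on an undirected graph given as an edge list (the representation is my assumption, not the paper's; the paper studies *maintaining* the answer under updates, which this one-shot evaluation does not do):

```python
def parity_exists(edges, red):
    """Evaluate the parity-exists query: is the number of nodes
    that have an edge to a red node odd?

    edges -- iterable of (u, v) pairs, read as undirected edges
    red   -- set of red nodes
    """
    hit = set()  # nodes with at least one red neighbour
    for u, v in edges:
        if v in red:
            hit.add(u)
        if u in red:
            hit.add(v)
    return len(hit) % 2 == 1

# Nodes 1 and 3 touch the red node 2 -> count 2, even -> False.
print(parity_exists([(1, 2), (3, 2), (4, 5)], {2}))
```

A dynamic-complexity algorithm would instead keep auxiliary relations that let this answer be recomputed by first-order rules after each single edge or colour change, rather than rescanning the whole edge list.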

    Reliability-based design optimization of shells with uncertain geometry using adaptive Kriging metamodels

    Optimal design under uncertainty has gained much attention in the past ten years due to the ever-increasing need for manufacturers to build robust systems at the lowest cost. Reliability-based design optimization (RBDO) allows the analyst to minimize some cost function while ensuring some minimal performances, cast as admissible failure probabilities for a set of performance functions. In order to address real-world engineering problems in which the performance is assessed through computational models (e.g., finite element models in structural mechanics), metamodeling techniques have been developed in the past decade. This paper introduces adaptive Kriging surrogate models to solve the RBDO problem. The latter is cast in an augmented space that "sums up" the range of the design space and the aleatory uncertainty in the design parameters and the environmental conditions. The surrogate model is used (i) for evaluating robust estimates of the failure probabilities (and for enhancing the computational experimental design by adaptive sampling) in order to achieve the requested accuracy and (ii) for applying a gradient-based optimization algorithm to get optimal values of the design parameters. The approach is applied to the optimal design of ring-stiffened cylindrical shells used in submarine engineering under uncertain geometric imperfections. For this application the performance of the structure is related to buckling, which is addressed here by means of a finite element solution based on the asymptotic numerical method.

    Reliability-based assessment procedures for existing concrete structures

    A feasibility study of reliability theory as a tool for the assessment of present safety and residual service life of damaged concrete structures has been performed in order to find a transparent methodology for the assessment procedure. It is concluded that the current guidelines are open to interpretation and that the variation in the results obtained regarding the structural safety is too great to be acceptable. Interpretations by the engineer are also included when deterministic methods are used, but probabilistic methods are more sensitive to the assumptions made and the differences in the results will therefore be greater. In a literature survey it is concluded that residual service life predictions should not be expected to be valid for more than 10 to 15 years, due to the large variability of the variables involved in the analysis. Based on these conclusions, predictive models that are suitable for the inclusion of new data, and methods for the incorporation of new data, are proposed. Information from the fields of medical statistics and robotics suggests that linear regression models are well suited for this type of updated monitoring. Two test cases were studied, a concrete dam and a railway bridge. From the dam case, it was concluded that the safety philosophy in the deterministic dam-specific assessment guidelines needs further development. Probabilistic descriptions of important variables, such as ice loads and friction coefficients, are needed if reliability theory is to be used for assessment purposes. During the study of the railway bridge it became clear that model uncertainties for different failure mechanisms used in concrete design are lacking. If Bayesian updating is to be used as a tool for incorporation of test data regarding concrete strength into the reliability analysis, a priori information must be established.
A need for a probabilistic description of the hardening process of concrete was identified for the purpose of establishing a priori information. This description can also be used as a qualitative assessment of the concrete. If there is a large discrepancy between the predicted value and the measured value, the concrete should be investigated regarding deterioration due to, for example, internal frost or alkali-silica reactions. Reliability theory is well suited for the assessment process since features of the reliability theory, such as sensitivity analysis, give good decision support for matters concerning both safety and service life predictions.
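The Bayesian updating step the abstract argues for can be illustrated with the simplest conjugate case: a normal prior on mean concrete strength updated with test measurements of known variance. All numbers and the normal/known-variance assumptions below are mine, chosen only to show the mechanics, not taken from the thesis:

```python
def update_normal_mean(prior_mean, prior_var, data, noise_var):
    """Conjugate Bayesian update of a normal prior on a mean
    (e.g. mean concrete compressive strength, MPa) given test
    data with known measurement variance noise_var.

    Returns the posterior mean and variance.
    """
    n = len(data)
    post_var = 1.0 / (1.0 / prior_var + n / noise_var)
    post_mean = post_var * (prior_mean / prior_var + sum(data) / noise_var)
    return post_mean, post_var

# Assumed prior from the design class: mean 30 MPa, variance 25.
# Three core tests come back higher; the posterior shifts toward them
# and its variance shrinks below the prior's.
m, v = update_normal_mean(30.0, 25.0, [38.0, 40.0, 39.0], noise_var=4.0)
```

A large gap between the prior prediction and the measured strengths is exactly the discrepancy signal the abstract mentions: it would prompt investigation of deterioration rather than blind updating.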

    On the Complexity of Bounded Context Switching

    Bounded context switching (BCS) is an under-approximate method for finding violations of safety properties in shared-memory concurrent programs. Technically, BCS is a reachability problem that is known to be NP-complete. Our contribution is a parameterized analysis of BCS. The first result is an algorithm that solves BCS when parameterized by the number of context switches (cs) and the size of the memory (m) in O*(m^(cs)2^(cs)). This is achieved by creating instances of the easier problem Shuff, which we solve via fast subset convolution. We also present a lower bound for BCS of the form m^o(cs / log(cs)), based on the exponential time hypothesis. Interestingly, the gap is closely related to a conjecture that has been open since FOCS'07. Further, we prove that BCS admits no polynomial kernel. Next, we introduce a measure, called scheduling dimension, that captures the complexity of schedules. We study BCS parameterized by the scheduling dimension (sdim) and show that it can be solved in O*((2m)^(4sdim)4^t), where t is the number of threads. We consider variants of the problem for which we obtain (matching) upper and lower bounds.
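The operation behind the upper-bound proof, subset convolution, is easy to pin down even though the paper uses its fast ranked-Moebius variant. The naive O(3^n) loop below (over subsets encoded as bitmasks) is only meant to define the operation, not to reproduce the paper's fast algorithm:

```python
def subset_convolution(f, g, n):
    """Naive subset convolution over subsets of {0,...,n-1}:

        (f * g)(S) = sum over T subseteq S of f[T] * g[S \ T]

    f, g -- lists of length 2**n indexed by bitmask
    The fast version runs in O*(2**n); this enumeration of all
    submask pairs takes O(3**n) and serves only as a definition.
    """
    h = [0] * (1 << n)
    for s in range(1 << n):
        t = s
        while True:          # enumerate all submasks t of s
            h[s] += f[t] * g[s ^ t]
            if t == 0:
                break
            t = (t - 1) & s
    return h

# With f = g = all-ones and n = 2, h[S] counts the submasks of S,
# i.e. h = [1, 2, 2, 4].
h = subset_convolution([1, 1, 1, 1], [1, 1, 1, 1], 2)
```

In the paper's reduction, the convolved functions encode partial schedules of the Shuff instances, and the speed of this convolution is what yields the O*(m^(cs)2^(cs)) bound.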