
    Shingle 2.0: generalising self-consistent and automated domain discretisation for multi-scale geophysical models

    The approaches taken to describe and develop spatial discretisations of the domains required for geophysical simulation models are commonly ad hoc, model- or application-specific, and under-documented. This is particularly acute for simulation models that are flexible in their use of multi-scale, anisotropic, fully unstructured meshes, where a relatively large number of heterogeneous parameters are required to constrain their full description. As a consequence, it can be difficult to reproduce simulations, to ensure provenance in model data handling and initialisation, and to conduct model intercomparisons rigorously. This paper takes a novel approach to spatial discretisation, considering it much like a numerical simulation problem in its own right. It introduces a generalised, extensible, self-documenting approach to describe carefully, and necessarily fully, the constraints over the heterogeneous parameter space that determine how a domain is spatially discretised. This additionally provides a method to accurately record these constraints, using high-level abstractions based on natural language, that enables full accounts of provenance, sharing and distribution. Together with this description, a generalised, consistent approach to unstructured mesh generation for geophysical models is developed that is automated, robust, repeatable, quick to draft, rigorously verified and consistent with the source data throughout. This interprets the description above to execute a self-consistent spatial discretisation process, which is automatically validated against expected discrete characteristics and metrics. Comment: 18 pages, 10 figures, 1 table. Submitted for publication and under review.
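
    A minimal sketch (hypothetical, not the Shingle API) of what such a self-documenting, high-level domain description could look like in Python; all names, fields and example values here are illustrative assumptions:

        # Hypothetical sketch (not the Shingle API): a declarative description of a
        # domain discretisation, recorded as data so it can be versioned and shared.
        from dataclasses import dataclass, field

        @dataclass
        class DomainSpec:
            """High-level, self-documenting constraints on a geophysical domain."""
            name: str
            boundary_source: str            # provenance: where the shoreline data came from
            projection: str                 # e.g. a named map projection
            minimum_resolution_m: float     # coarsest permitted element size
            refinement_rules: list = field(default_factory=list)

            def add_refinement(self, region: str, resolution_m: float, reason: str):
                # Each constraint carries its own justification, so the full
                # parameter space that determines the mesh is documented.
                self.refinement_rules.append(
                    {"region": region, "resolution_m": resolution_m, "reason": reason}
                )

        spec = DomainSpec(
            name="amundsen_sea",
            boundary_source="GSHHG shoreline, full resolution",
            projection="polar_stereographic",
            minimum_resolution_m=10_000.0,
        )
        spec.add_refinement("ice_shelf_front", 500.0, "resolve grounding-line dynamics")
        # A mesh generator would then interpret `spec` deterministically, so the
        # same description always reproduces the same discretisation.

    Because the description is data rather than an ad hoc script, it can be diffed, archived and re-executed, which is what makes the resulting discretisation reproducible.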

    Modified low-cycle fatigue (LCF) test

    The fatigue test results obtained by the common low-cycle fatigue (LCF) test and by its modified counterpart (MLCF) are presented. A satisfactory agreement of results was achieved for the two selected materials. With the MLCF method it is possible to determine ten to twenty parameters using only a single sample; these parameters characterise the mechanical properties of the tested material under mechanical loading. The study also demonstrates the implementation of the modified low-cycle fatigue test in practice.

    Comparison of Chlorine-36 Exchange With K2ReCl6 and Cs2ReOCl5

    K2ReCl6 (IV), after precipitation from solutions of HCl36 with nitron, contained no significant radioactivity. Cs2ReOCl5 (V), insoluble in HCl, was radioactive after suspension in methanol containing HCl36 and filtration. An X-ray diffraction determination of the Cs2ReOCl5 unit cell parameters and a microscopic estimation of the surface of the samples made possible the calculation that 83% of the surface chlorine atoms had exchanged. The results agree with the generalisation that octahedral complexes with no vacant d orbitals are inert, whereas those with at least one vacant d orbital are labile. Cs2ReOCl5 is a yellow-brown crystalline solid, insoluble in dilute HCl. X-ray diffraction gave hexagonal unit cell parameters of a = 7.04 Å and c = 10.9 Å.
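
    To illustrate the kind of geometric estimate involved (hypothetical numbers, not the paper's data), the fraction of chlorine atoms that sit in a crystallite's surface layer can be approximated from the particle size alone:

        # Illustrative estimate with assumed values: for heterogeneous surface
        # exchange, only chlorine atoms in the outermost layer of each crystallite
        # can exchange, so the expected activity scales with the surface-to-volume
        # ratio of the particles.
        edge_um = 5.0            # assumed cubic crystallite edge length, micrometres
        layer_nm = 0.7           # assumed thickness of one Cl-bearing surface layer

        edge_nm = edge_um * 1_000.0
        surface_fraction = 1.0 - ((edge_nm - 2.0 * layer_nm) / edge_nm) ** 3

        print(f"fraction of Cl atoms in the surface layer: {surface_fraction:.2%}")
        # Comparing such a geometric estimate with the measured specific activity
        # is what lets one quote a figure like "83% of the surface chlorine
        # atoms exchanged".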

    Simulating Auxiliary Inputs, Revisited

    For any pair $(X,Z)$ of correlated random variables we can think of $Z$ as a randomized function of $X$. Provided that $Z$ is short, one can make this function computationally efficient by allowing it to be only approximately correct. In folklore this problem is known as simulating auxiliary inputs. This idea of simulating auxiliary information turns out to be a powerful tool in computer science, finding applications in complexity theory, cryptography, pseudorandomness and zero-knowledge. In this paper we revisit this problem, achieving the following results: (a) We discuss and compare the efficiency of known results, finding the flaw in the best known bound claimed in the TCC'14 paper "How to Fake Auxiliary Inputs". (b) We present a novel boosting algorithm for constructing the simulator; our technique essentially fixes the flaw. This boosting proof is of independent interest, as it shows how to handle "negative mass" issues when constructing probability measures in descent algorithms. (c) Our bounds are much better than the bounds known so far. To make the simulator $(s,\epsilon)$-indistinguishable we need complexity $O\left(s \cdot 2^{5\ell} \epsilon^{-2}\right)$ in time/circuit size, which is better by a factor $\epsilon^{-2}$ than previous bounds. In particular, with our technique we (finally) get meaningful provable security for the EUROCRYPT'09 leakage-resilient stream cipher instantiated with a standard 256-bit block cipher, like $\mathsf{AES256}$. Comment: Some typos present in the previous version have been corrected.
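
    A minimal sketch of the descent-and-reweight pattern behind such boosting constructions, assuming idealised access to the joint distribution; this is an illustration only, not the paper's exact algorithm:

        # Maintain a candidate conditional distribution h(z|x); while some
        # distinguisher has advantage > eps, shift mass in h against it.
        # Additive updates can drive entries negative -- the "negative mass"
        # issue -- which this toy version handles crudely by clipping and
        # renormalising.
        import numpy as np

        def boost_simulator(p_joint, distinguishers, eps=0.05, eta=0.1, rounds=1000):
            """p_joint: (nx, nz) true joint distribution of (X, Z);
            distinguishers: list of (nx, nz) 0/1 arrays representing D(x, z)."""
            nx, nz = p_joint.shape
            p_x = p_joint.sum(axis=1)            # marginal distribution of X
            h = np.full((nx, nz), 1.0 / nz)      # candidate simulator h(z|x)
            for _ in range(rounds):
                q = p_x[:, None] * h             # joint induced by the simulator
                advs = [float(((p_joint - q) * D).sum()) for D in distinguishers]
                i = int(np.argmax(np.abs(advs)))
                if abs(advs[i]) <= eps:          # indistinguishable for this class
                    break
                h = h + eta * np.sign(advs[i]) * distinguishers[i]  # descent step
                h = np.clip(h, 0.0, None)        # clip away negative mass
                row = h.sum(axis=1, keepdims=True)
                h = np.where(row > 0, h / np.maximum(row, 1e-12), 1.0 / nz)
            return h

    The rigorous version replaces the crude clip-and-renormalise step with a careful analysis, which is where the improved $\epsilon^{-2}$ factor in the complexity bound comes from.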

    Parameterized complexity of the MINCCA problem on graphs of bounded decomposability

    In an edge-colored graph, the cost incurred at a vertex on a path when two incident edges with different colors are traversed is called the reload or changeover cost. The "Minimum Changeover Cost Arborescence" (MINCCA) problem consists of finding an arborescence with a given root vertex such that the total changeover cost of the internal vertices is minimized. It has recently been proved by Gözüpek et al. [TCS 2016] that the problem is FPT when parameterized by the treewidth and the maximum degree of the input graph. In this article we present the following results for the MINCCA problem:
    - the problem is W[1]-hard parameterized by the treedepth of the input graph, even on graphs of average degree at most 8. In particular, it is W[1]-hard parameterized by the treewidth of the input graph, which answers the main open problem of Gözüpek et al. [TCS 2016];
    - it is W[1]-hard on multigraphs parameterized by the tree-cutwidth of the input multigraph;
    - it is FPT parameterized by the star tree-cutwidth of the input graph, which is a slightly restricted version of tree-cutwidth. This result strictly generalizes the FPT result of Gözüpek et al. [TCS 2016];
    - it remains NP-hard on planar graphs even when restricted to instances with at most 6 colors and 0/1 symmetric costs, or to instances with at most 8 colors, maximum degree bounded by 4, and 0/1 symmetric costs.
    Comment: 25 pages, 11 figures.
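
    The objective itself is easy to state in code. A small sketch (hypothetical helper, standard definitions) of the total changeover cost of an arborescence:

        # In an edge-coloured arborescence the changeover cost at an internal
        # vertex v is cost(c_in, c_out) for the colour c_in of the edge entering
        # v and the colour c_out of each edge leaving v; MINCCA minimises the
        # total over all internal vertices.
        def changeover_cost(arborescence, colour, cost, root):
            """arborescence: dict child -> parent; colour: dict (u, v) -> colour;
            cost: dict (c1, c2) -> reload cost."""
            total = 0
            for child, parent in arborescence.items():
                if parent == root:
                    continue                      # the root has no entering edge
                grand = arborescence[parent]
                c_in = colour[(grand, parent)]    # colour of the edge entering `parent`
                c_out = colour[(parent, child)]   # colour of the edge leaving `parent`
                total += cost[(c_in, c_out)]
            return total

        # Example: root r, path r -red-> a -blue-> b incurs cost(red, blue) at a.
        tree = {"a": "r", "b": "a"}
        col = {("r", "a"): "red", ("a", "b"): "blue"}
        c = {("red", "blue"): 1, ("red", "red"): 0,
             ("blue", "blue"): 0, ("blue", "red"): 1}
        print(changeover_cost(tree, col, c, "r"))   # -> 1

    The hardness results above say that minimising this quantity over all arborescences is intractable even under quite strong structural restrictions.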

    The chaining lemma and its application

    We present a new information-theoretic result which we call the Chaining Lemma. It considers a so-called "chain" of random variables, defined by a source distribution $X_0$ with high min-entropy and a number (say, $t$ in total) of arbitrary functions $T_1, \ldots, T_t$ which are applied in succession to that source to generate the chain $X_0, X_1, \ldots, X_t$, where $X_i = T_i(X_{i-1})$. Intuitively, the Chaining Lemma guarantees that, if the chain is not too long, then either (i) the entire chain is "highly random", in that every variable has high min-entropy; or (ii) it is possible to find a point $j$ ($1 \le j \le t$) in the chain such that, conditioned on the end of the chain, i.e. $X_j, \ldots, X_t$, the preceding part $X_0, \ldots, X_{j-1}$ remains highly random. We think this is an interesting information-theoretic result which is intuitive but nevertheless requires rigorous case analysis to prove. We believe that the above lemma will find applications in cryptography. We give an example of this: namely, we show an application of the lemma to protect essentially any cryptographic scheme against memory-tampering attacks. We allow several tampering requests; the tampering functions can be arbitrary, but they must be chosen from a bounded-size set of functions that is fixed a priori.
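
    A hedged sketch of the statement's shape in LaTeX, with thresholds left symbolic since the abstract does not state the exact parameters:

        % Shape of the Chaining Lemma (thresholds symbolic; exact values in the paper).
        % Chain: a source X_0 with H_inf(X_0) >= k, transformed in succession:
        \[
          X_0 \xrightarrow{T_1} X_1 \xrightarrow{T_2} \cdots \xrightarrow{T_t} X_t,
          \qquad X_i = T_i(X_{i-1}).
        \]
        % Conclusion: either every link retains min-entropy, or some prefix of
        % the chain does, conditioned on the rest:
        \[
          \bigl(\forall\, 0 \le i \le t:\ H_\infty(X_i) \ge \alpha\bigr)
          \;\lor\;
          \bigl(\exists\, 1 \le j \le t:\ \widetilde{H}_\infty(X_{j-1} \mid X_j) \ge \beta\bigr),
        \]
        % for thresholds alpha, beta depending on k and the chain length t.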

    The Complexity of Repairing, Adjusting, and Aggregating of Extensions in Abstract Argumentation

    We study the computational complexity of problems that arise in abstract argumentation in the context of dynamic argumentation, minimal change, and aggregation. In particular, we consider the following problems, where in each case an argumentation framework F and a small positive integer k are given.
    - The Repair problem asks whether a given set of arguments can be modified into an extension by at most k elementary changes (i.e., the extension is at distance k from the given set).
    - The Adjust problem asks whether a given extension can be modified by at most k elementary changes into an extension that contains a specified argument.
    - The Center problem asks whether, given two extensions at distance k, there is a "center" extension at distance at most (k-1) from both given extensions.
    We study these problems in the framework of parameterized complexity, taking the distance k as the parameter. Our results cover several different semantics, including the admissible, complete, preferred, semi-stable and stable semantics.
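
    A brute-force sketch of the Repair problem under admissible semantics (one of the semantics covered above; the helper names are assumptions, and distance is taken as symmetric difference):

        # Can a given set S be turned into an admissible extension by at most k
        # elementary changes (additions/removals of single arguments)?
        from itertools import combinations

        def is_admissible(S, args, attacks):
            """S is conflict-free and defends each of its members."""
            if any((a, b) in attacks for a in S for b in S):
                return False
            for a in S:
                for b in args:
                    if (b, a) in attacks and not any((c, b) in attacks for c in S):
                        return False
            return True

        def repair(S, args, attacks, k):
            # try every set within symmetric-difference distance at most k of S
            for d in range(k + 1):
                for flips in combinations(args, d):
                    T = set(S).symmetric_difference(flips)
                    if is_admissible(T, args, attacks):
                        return T
            return None

        # Example: a attacks b; {b} is not admissible, but one change repairs it.
        print(repair({"b"}, {"a", "b"}, {("a", "b")}, k=1))   # -> set() or {'a'}

    The exhaustive search over all sets at distance at most k is exponential in k, which is exactly why the parameterized-complexity analysis with k as the parameter is the natural framework here.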