
    A Refutation of Goodman's Type‐Token Theory of Notation

    In Languages of Art, Nelson Goodman presents a general theory of symbolic notation. However, I show that his theory cannot adequately explain possible cases of natural-language notational use, and argue that this outcome undermines not only Goodman's own theory but any broadly type-versus-token-based account of notational structure. Given this failure, an alternative representational theory is proposed, in which different visual or perceptual aspects of a given physical inscription each represent a different letter, word, or other notational item. Such a view is strongly supported by the completely conventional relation between inscriptions and notation, as shown by encryption techniques, etc.

    Multi-Modal Mean-Fields via Cardinality-Based Clamping

    Mean Field inference is central to statistical physics. It has attracted much interest in the Computer Vision community as an efficient way to solve problems expressible in terms of large Conditional Random Fields. However, since it models the posterior probability distribution as a product of marginal probabilities, it may fail to properly account for important dependencies between variables. We therefore replace the fully factorized distribution of Mean Field by a weighted mixture of such distributions, which similarly minimizes the KL-Divergence to the true posterior. By introducing two new ideas, namely conditioning on groups of variables instead of single ones, and using a parameter of the conditional random field potentials, which we identify with the temperature in the sense of statistical physics, to select such groups, we can perform this minimization efficiently. Our extension of the clamping method proposed in previous works allows us both to produce a more descriptive approximation of the true posterior and, inspired by the diverse-MAP paradigm, to fit a mixture of Mean Field approximations. We demonstrate that this positively impacts real-world algorithms that initially relied on Mean Field. Comment: Submitted for review to CVPR 201
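    A rough sense of the clamping idea can be conveyed with a toy implementation: run mean field once per state of a chosen "clamped" variable, then weight the resulting factorized distributions by their variational lower bounds to form a mixture. This is only a minimal sketch on a small pairwise CRF with explicit potential tables; all function and variable names are illustrative, and the paper's cardinality- and temperature-based selection of which variables to clamp is not reproduced here.

```python
# Minimal sketch: naive mean field on a pairwise CRF, optionally with one
# variable clamped, and a clamped mixture weighted by the variational bound.
# Assumed inputs: unary log-potentials theta_u[i, s] of shape (n, k) and a
# dict theta_p[(i, j)] of (k, k) pairwise log-potential tables.
import numpy as np

def mean_field(theta_u, theta_p, clamp=None, iters=50):
    """Fully factorized approximation q[i, s] ~ P(x_i = s)."""
    n, k = theta_u.shape
    q = np.full((n, k), 1.0 / k)
    if clamp is not None:                        # condition on a clamped variable
        i, s = clamp
        q[i] = np.eye(k)[s]
    for _ in range(iters):
        for i in range(n):
            if clamp is not None and i == clamp[0]:
                continue                         # clamped variable stays fixed
            logits = theta_u[i].copy()
            for (a, b), pot in theta_p.items():  # expected pairwise contributions
                if a == i:
                    logits += pot @ q[b]
                elif b == i:
                    logits += pot.T @ q[a]
            q[i] = np.exp(logits - logits.max())
            q[i] /= q[i].sum()
    return q

def elbo(q, theta_u, theta_p):
    """Variational lower bound of a factorized q (used as a log mixture weight)."""
    val = np.sum(q * theta_u)
    for (a, b), pot in theta_p.items():
        val += q[a] @ pot @ q[b]
    return val - np.sum(q * np.log(q + 1e-12))   # plus the entropy of q

def clamped_mixture(theta_u, theta_p, clamp_var):
    """One mean-field component per state of clamp_var, softmax-weighted by its bound."""
    k = theta_u.shape[1]
    comps = [mean_field(theta_u, theta_p, clamp=(clamp_var, s)) for s in range(k)]
    log_w = np.array([elbo(q, theta_u, theta_p) for q in comps])
    w = np.exp(log_w - log_w.max())
    return comps, w / w.sum()
```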

    Data refinement for true concurrency

    The majority of modern systems exhibit sophisticated concurrent behaviour, where several system components modify and observe the system state with fine-grained atomicity. Many systems (e.g., multi-core processors, real-time controllers) also exhibit truly concurrent behaviour, where multiple events can occur simultaneously. This paper presents data refinement defined in terms of an interval-based framework, which includes high-level operators that capture non-deterministic expression evaluation. By modifying the type of an interval, our theory may be specialised to cover data refinement of both discrete and continuous systems. We present an interval-based encoding of forward simulation, then prove that our forward simulation rule is sound with respect to our data refinement definition. A number of rules for decomposing forward simulation proofs over both sequential and parallel composition are developed.
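    To make the forward-simulation obligation concrete, the following sketch checks a candidate simulation relation between two finite, discrete transition systems: every concrete initial state must be related to some abstract initial state, and every concrete step from a related pair must be matched (possibly by stuttering) by an abstract step that re-establishes the relation. This is a deliberate simplification; the paper's interval-based semantics and truly concurrent behaviour are not modelled, and all names are illustrative.

```python
# Minimal sketch of a forward-simulation check for finite, discrete systems
# (the interval-based, truly concurrent setting of the paper is not modelled).
def forward_simulation(abs_init, abs_step, conc_init, conc_step, rel):
    """abs_step/conc_step map a state to an iterable of successor states;
    rel(a, c) is the candidate simulation relation. Returns True if rel
    is a forward simulation of the concrete system by the abstract one."""
    # Initialisation: every concrete initial state is matched by an abstract one.
    if not all(any(rel(a, c) for a in abs_init) for c in conc_init):
        return False
    worklist = [(a, c) for c in conc_init for a in abs_init if rel(a, c)]
    seen = set()
    while worklist:
        a, c = worklist.pop()
        if (a, c) in seen:
            continue
        seen.add((a, c))
        for c2 in conc_step(c):
            # Match the concrete step by an abstract step (or by stuttering).
            matches = [a2 for a2 in abs_step(a) if rel(a2, c2)]
            if rel(a, c2):
                matches.append(a)
            if not matches:
                return False
            worklist.extend((a2, c2) for a2 in matches)
    return True
```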

    Lower Bounds for Symbolic Computation on Graphs: Strongly Connected Components, Liveness, Safety, and Diameter

    A model of computation that is widely used in the formal analysis of reactive systems is that of symbolic algorithms. In this model, access to the input graph is restricted to symbolic operations, which are expensive in comparison to standard RAM operations. We give lower bounds on the number of symbolic operations for basic graph problems such as the computation of the strongly connected components and of the approximate diameter, as well as for fundamental problems in model checking such as safety, liveness, and co-liveness. Our lower bounds are linear in the number of vertices of the graph, even for constant-diameter graphs. No lower bounds on the number of symbolic operations were previously known for any of these problems. The lower bounds show an interesting separation of these problems from the reachability problem, which can be solved with $O(D)$ symbolic operations, where $D$ is the diameter of the graph. Additionally, we present an approximation algorithm for the graph diameter which requires $\tilde{O}(n \sqrt{D})$ symbolic steps to achieve a $(1+\epsilon)$-approximation for any constant $\epsilon > 0$. This compares to $O(n \cdot D)$ symbolic steps for the (naive) exact algorithm and $O(D)$ symbolic steps for a 2-approximation. Finally, we also give a refined analysis of the strongly connected components algorithm of Gentilini et al., showing that it uses an optimal number of symbolic steps, proportional to the sum of the diameters of the strongly connected components.
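    For readers unfamiliar with the symbolic model, the sketch below illustrates what a "symbolic operation" is: the algorithm never touches individual edges, only whole vertex sets via one-step image operations, so forward reachability needs about as many such operations as the diameter $D$. An ordinary Python set stands in for the implicit (e.g. BDD-based) set representation; names are illustrative.

```python
# Minimal sketch of symbolic graph access: the only way to look at the graph
# is through one-step image operations on vertex sets (here an explicit edge
# list emulates the implicit representation).
def post(edges, S):
    """One symbolic operation: successors of all vertices in the set S."""
    return {v for (u, v) in edges if u in S}

def symbolic_reachability(edges, source):
    """Vertices reachable from `source`; uses O(D) Post operations, where D is
    the diameter, since each iteration extends every shortest path by one step."""
    reached, frontier = {source}, {source}
    while frontier:
        frontier = post(edges, frontier) - reached
        reached |= frontier
    return reached

# Tiny example: a directed 4-cycle.
edges = [(0, 1), (1, 2), (2, 3), (3, 0)]
assert symbolic_reachability(edges, 0) == {0, 1, 2, 3}
```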

    Querying Schemas With Access Restrictions

    We study verification of systems whose transitions consist of accesses to a Web-based data source. An access is a lookup on a relation within a relational database, fixing values for a set of positions in the relation. For example, a transition can represent access to a Web form, where the user is restricted to filling in values for a particular set of fields. We look at verifying properties of a schema describing the possible accesses of such a system. We present a language in which one can describe the properties of an access path, and also specify additional restrictions on accesses that are enforced by the schema. Our main property language, AccLTL, is based on a first-order extension of linear-time temporal logic, interpreting access paths as sequences of relational structures. We also present a lower-level automaton model, A-automata, into which AccLTL specifications can be compiled. We show that AccLTL and A-automata can express static analysis problems related to "querying with limited access patterns" that have been studied in the database literature in the past, such as whether an access is relevant to answering a query, and whether two queries are equivalent in the accessible data they can return. We prove decidability and complexity results for several restrictions and variants of AccLTL, and explain which properties of paths can be expressed in each restriction. Comment: VLDB201
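    The core objects the abstract talks about can be pictured with a small sketch: an access fixes values for some positions of a relation, the schema restricts which sets of positions may be bound together, and an access path is a sequence of schema-conforming accesses over which temporal properties (as in AccLTL) are then stated. The data layout and names below are illustrative assumptions, not the paper's formalism.

```python
# Minimal sketch of accesses, binding-pattern restrictions, and access paths
# (illustrative only; not the AccLTL or A-automata machinery itself).
from dataclasses import dataclass
from typing import Dict, FrozenSet, List, Tuple

@dataclass(frozen=True)
class Access:
    relation: str
    bindings: Tuple[Tuple[int, str], ...]   # (position, value) pairs fixed by the access

# Schema: for each relation, the binding patterns (sets of positions) that a
# Web form, say, allows to be filled in together.
schema: Dict[str, List[FrozenSet[int]]] = {
    "Person": [frozenset({0})],             # Person may be looked up by position 0 only
    "Order":  [frozenset({0, 1})],          # Order requires positions 0 and 1 together
}

def allowed(access: Access) -> bool:
    """Does the access respect one of the schema's permitted binding patterns?"""
    bound = frozenset(pos for pos, _ in access.bindings)
    return bound in schema.get(access.relation, [])

# An access path is a sequence of allowed accesses; AccLTL-style properties
# are temporal statements over such sequences.
path = [Access("Person", ((0, "Alice"),)),
        Access("Order", ((0, "Alice"), (1, "42")))]
assert all(allowed(a) for a in path)
```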

    Some Ontological Principles for Designing Upper Level Lexical Resources

    The purpose of this paper is to explore some semantic problems related to the use of linguistic ontologies in information systems, and to suggest some organizing principles aimed at solving these problems. The taxonomic structure of current ontologies is unfortunately quite complicated and hard to understand, especially at the upper levels. I will focus here on the problem of ISA overloading, which I believe is chiefly responsible for these difficulties. To this end, I will carefully analyze the ontological nature of the categories used in current upper-level structures, considering whether they need to be split according to more subtle distinctions, or excluded altogether because of their limited organizational role. Comment: 8 pages - gzipped postscript file - A4 format
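    A small, hedged illustration of what "ISA overloading" refers to (my own toy example, not the paper's analysis): a single is-a link in a taxonomy ends up carrying several distinct relations, which organizing principles of the kind proposed here would pull apart.

```python
# Toy illustration (not from the paper): one "ISA" arc in an upper-level
# taxonomy silently standing for several different relations.
overloaded_isa = {
    ("Apple",  "Fruit"):   "subsumption",    # every apple is a fruit: genuine is-a
    ("Lion",   "Species"): "instantiation",  # Lion is an *instance* of the kind Species,
                                             # not a subclass of it
    ("Statue", "Marble"):  "constitution",   # a statue is made of marble,
                                             # not a kind of marble
}
# An upper level that collapses all of these into one ISA relation is overloaded;
# distinguishing them is one way of simplifying the upper taxonomy.
```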

    A robust sequential hypothesis testing method for brake squeal localisation

    This contribution deals with the in situ detection and localisation of brake squeal in an automobile. As brake squeal is emitted from regions known a priori, i.e., near the wheels, the localisation is treated as a hypothesis testing problem. Distributed microphone arrays, situated under the automobile, are used to capture the directional properties of the sound field generated by a squealing brake. The spatial characteristics of the sampled sound field are then used to formulate the hypothesis tests. However, in contrast to standard hypothesis testing approaches of this kind, the propagation environment is complex and time-varying. Coupled with inaccuracies in the knowledge of the sensor and source positions, as well as sensor gain mismatches, modelling the sound field is difficult and standard approaches fail in this case. A previously proposed approach implicitly tried to account for such incomplete system knowledge and was based on ad hoc likelihood formulations. The current paper builds upon this approach and proposes a second approach, based on more solid theoretical foundations, that can systematically account for the model uncertainties. Results from tests in a real setting show that the proposed approach is more consistent than the prior state of the art. In both approaches, the tasks of detection and localisation are decoupled for complexity reasons. The localisation (hypothesis testing) is performed only after a prior detection of brake squeal and identification of the squeal frequencies. The approaches used for the detection and identification of squeal frequencies are also presented. The paper further briefly addresses some practical issues related to array design and placement. (C) 2019 Author(s)
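    To give a flavour of how such a localisation hypothesis test can be set up, the sketch below scores each a-priori candidate region (one per wheel) against narrowband array snapshots at a detected squeal frequency using a simple free-field steering model, and picks the best-scoring hypothesis. This is a plain matched-field/beamforming stand-in under idealised assumptions, not the paper's uncertainty-aware formulation; names, geometry, and the propagation model are all illustrative.

```python
# Minimal sketch: pick the wheel-region hypothesis whose free-field steering
# model best explains the observed narrowband array snapshots.
import numpy as np

C = 343.0  # speed of sound [m/s]

def steering_vector(mic_pos, src_pos, freq):
    """Free-field steering vector for one candidate source position.
    mic_pos: (num_mics, 3) microphone coordinates [m]; src_pos: (3,)."""
    d = np.linalg.norm(mic_pos - src_pos, axis=1)      # mic-to-source distances [m]
    return np.exp(-2j * np.pi * freq * d / C) / d       # phase delay + spherical spreading

def localise(snapshots, mic_pos, candidate_regions, freq):
    """snapshots: (num_snapshots, num_mics) complex STFT bins at the squeal frequency.
    Returns the index of the candidate region (hypothesis) with the highest score."""
    R = snapshots.conj().T @ snapshots / len(snapshots)  # sample spatial covariance
    scores = []
    for src_pos in candidate_regions:
        a = steering_vector(mic_pos, np.asarray(src_pos, dtype=float), freq)
        a /= np.linalg.norm(a)
        scores.append(np.real(a.conj() @ R @ a))         # power captured by this hypothesis
    return int(np.argmax(scores)), scores
```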