
    A Spectrum of Applications of Automated Reasoning

    The likelihood that an automated reasoning program will be of substantial assistance across a wide spectrum of applications rests with the nature of the options and parameters it offers on which to base the needed strategies and methodologies. This article focuses on such a spectrum, featuring W. McCune's program OTTER, discussing widely varied successes in answering open questions, and touching on some of the strategies and methodologies that played a key role. The applications include finding a first proof, discovering single axioms, locating improved axiom systems, and simplifying existing proofs. The last application is directly pertinent to Hilbert's twenty-fourth problem, recently found by R. Thiele: a problem concerned with proof simplification, and one extremely amenable to attack with the appropriate automated reasoning program. The methodologies include those for seeking shorter proofs and for finding proofs that avoid unwanted lemmas or classes of terms, a specific option for seeking proofs with smaller equational or formula complexity, and a different option to address the variable richness of a proof. The type of proof one obtains with OTTER is Hilbert-style axiomatic, including details that sometimes permit one to gain new insights. We include questions still open and challenges that merit consideration. Comment: 13 pages
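
    OTTER itself works on first-order clauses with unification and a rich set of strategies, none of which is reproduced here. Purely as a toy illustration of what a Hilbert-style proof with explicit, numbered steps looks like, the following Python sketch (all names and the exact-match restriction are inventions of this example: axioms are pre-instantiated so that modus ponens needs no unification) forward-chains from axioms to a goal and prints the derivation:

        def forward_chain(axioms, goal, max_rounds=8):
            """Naive forward chaining with modus ponens on exact matches.

            Formulas are strings (atoms) or tuples ('->', antecedent,
            consequent).  Returns (known, how): all derived formulas, and
            for each derived formula the (minor, major) premise pair first
            used to obtain it.  Deriving every formula in the earliest
            possible round keeps proof depth small, a faint echo of the
            shorter-proof methodologies discussed above.
            """
            known, how = set(axioms), {}
            for _ in range(max_rounds):
                new = set()
                for f in known:
                    if isinstance(f, tuple) and f[0] == '->' and f[1] in known:
                        c = f[2]
                        if c not in known and c not in new:
                            new.add(c)
                            how[c] = (f[1], f)   # (minor premise, major premise)
                if not new:
                    break
                known |= new
                if goal in known:
                    break
            return known, how

        def print_proof(goal, how, axioms):
            """Linearize the derivation DAG into a numbered Hilbert-style proof."""
            order, seen = [], set()
            def visit(f):
                if f in seen:
                    return
                seen.add(f)
                for premise in how.get(f, ()):
                    visit(premise)
                order.append(f)
            visit(goal)
            for i, f in enumerate(order, 1):
                if f in axioms:
                    print(i, f, '[axiom]')
                else:
                    minor, major = how[f]
                    print(i, f, '[MP %d, %d]' %
                          (order.index(minor) + 1, order.index(major) + 1))

        # Derive r from p, p->q, q->r; prints a five-line numbered proof.
        axioms = {'p', ('->', 'p', 'q'), ('->', 'q', 'r')}
        known, how = forward_chain(axioms, 'r')
        if 'r' in known:
            print_proof('r', how, axioms)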

    Topological derivation of shape exponents for stretched exponential relaxation

    In homogeneous glasses, values of the important dimensionless stretched-exponential shape parameter beta are shown to be determined by magic (not adjusted) simple fractions derived from fractal configuration spaces of effective dimension d* by applying different topological axioms (rules) in the presence (absence) of a forcing electric field. The rules are based on a new central principle for defining glassy states: equal a priori distributions of fractal residual configurational entropy. Our approach and its beta estimates are fully supported by the results of relaxation measurements involving many different glassy materials and probe methods. The present unique topological predictions for beta typically agree with observed values to ~1% and indicate that, under field-forced conditions, beta should be constant over appreciable ranges of such exogenous variables as temperature and ionic concentration, as indeed observed using appropriate data analysis. The present approach can also be inverted and used to test sample homogeneity and quality. Comment: Original 13 pages lengthened to 21 pages (longer introduction, added references, and discussion of new experimental data published since the original submission)
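
    For background, and not as a claim about this paper's specific derivation: the stretched-exponential (Kohlrausch) relaxation function to which the shape parameter beta belongs, together with the "magic fraction" form associated with this topological line of work, is

        \phi(t) = \exp\!\left[ -\left( t/\tau \right)^{\beta} \right], \qquad \beta = \frac{d^{*}}{d^{*} + 2},

    where tau is the relaxation time and d* is the effective configuration-space dimension. With d* = d = 3 (short-range forces) this yields beta = 3/5, and with d* = d/2 = 3/2 (long-range Coulomb forces) it yields beta = 3/7; whether these are exactly the fractions derived under field-forced conditions here is an assumption drawn from the related literature.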

    The Computer Modelling of Mathematical Reasoning

    xv, 403 p.; 23 cm

    The Tensor Track, III

    We provide an informal, up-to-date review of the tensor track approach to quantum gravity. In a long introduction we describe in simple terms the motivations for this approach. The many recent advances are then summarized, with emphasis on some points (the Gromov-Hausdorff limit, the loop vertex expansion, Osterwalder-Schrader positivity...) which, while important for the tensor track program, are not detailed in the usual quantum gravity literature. We list open questions in the conclusion and provide a rather extended bibliography. Comment: 53 pages, 6 figures

    Efficient and scalable triangle centrality algorithms in the arkouda framework

    Graph data structures provide a unique challenge for both analysis and algorithm development. These data structures are irregular in that memory accesses are not known a priori, and accesses to them tend to lack locality. Despite these challenges, graphs are a natural way to represent relationships between entities and to expose distinctive features of those relationships. The network created from these relationships can contain local structures that describe the behavior of their members. Graphs can be analyzed in a number of different ways, including at a high level through community detection and at the node level through centrality. Both are difficult to define quantitatively because a "correct" answer is not readily apparent. The centrality of a node can be subjective: what does it mean to be central in an amorphous data structure? Further, even when centrality or community detection can be defined, there are typically trade-offs in detection and analysis. A fine-grained method may be more precise, but its run time may scale exponentially or worse. For small datasets this may not be a concern, but for large graphs it can make analysis prohibitive; consider a social media network with millions of people and millions of connections. Based on these two criteria, we implement several versions of a recently designed centrality measure called Triangle Centrality, a metric that considers both the connectivity of a node with other nodes and the connectivity of the node's neighbors. This connectivity is aptly measured through the triangles formed by nodes. There are two ways to implement triangle centrality: a graph-based approach, and an approach that uses linear algebra and matrix operations. Our implementation uses graph-based data structures, and to optimize it we implement several versions of triangle counting, based on prior research, in the high-performance computing framework Arkouda: an edge-list intersection method, a minimized search kernel method, a path-merge method, and a small set intersection method. For comparison, we include a naive method and a linear algebra implementation that uses the SuiteSparse GraphBLAS library. Arkouda is an open-source distributed platform for data scientists and developers; it hides complex parallel algorithms and dataset storage behind a back-end Chapel server that users access through an intuitive Pythonic interface. Our results demonstrate the scalability of the platform and are analyzed against different graph properties to see how these affect the implementation.
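
    As a rough serial illustration of the set-intersection idea behind one of the kernels named above (this is plain Python, not the distributed Chapel/Arkouda implementation, and it computes raw per-node triangle counts rather than the full triangle centrality score):

        from collections import defaultdict

        def triangle_counts(edges):
            """Per-node triangle counts via neighbor-set intersection.

            edges: iterable of (u, v) pairs of an undirected simple graph
            with comparable node labels.  Each triangle {u, v, w} is found
            exactly once by requiring u < v < w, then credited to all
            three of its nodes.
            """
            adj = defaultdict(set)
            for u, v in edges:
                adj[u].add(v)
                adj[v].add(u)
            counts = {v: 0 for v in adj}
            for u in adj:
                for v in adj[u]:
                    if u < v:
                        # Every common neighbor w closes a triangle on edge (u, v).
                        for w in adj[u] & adj[v]:
                            if v < w:
                                counts[u] += 1
                                counts[v] += 1
                                counts[w] += 1
            return counts

        # A 4-clique: each node participates in 3 of the 4 triangles.
        k4 = [(0, 1), (0, 2), (0, 3), (1, 2), (1, 3), (2, 3)]
        print(triangle_counts(k4))   # {0: 3, 1: 3, 2: 3, 3: 3}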

    Simulating mobile ad hoc networks: a quantitative evaluation of common MANET simulation models

    Because it is difficult and costly to conduct real-world mobile ad hoc network experiments, researchers commonly rely on computer simulation to evaluate their routing protocols. However, simulation is far from perfect. A growing number of studies indicate that simulated results can be dramatically affected by several sensitive simulation parameters. It is also commonly noted that most simulation models make simplifying assumptions about radio behavior. This situation casts doubt on the reliability and applicability of many ad hoc network simulation results. In this study, we begin with a large outdoor routing experiment testing the performance of four popular ad hoc algorithms (AODV, APRL, ODMRP, and STARA). We present a detailed comparative analysis of these four implementations. Then, using the outdoor results as a baseline of reality, we disprove a set of common assumptions used in simulation design, and quantify the impact of these assumptions on simulated results. We also validate a group of popular radio models against our real-world data, and explore how sensitive the accuracy of simulated results is to various simulation parameters. We close with a series of specific recommendations for simulation and ad hoc routing protocol designers.
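
    The abstract does not name the radio models that were validated; as background only, the simplest idealized model found in many ad hoc network simulators is free-space (Friis) propagation,

        P_r(d) = \frac{P_t \, G_t \, G_r \, \lambda^{2}}{(4\pi)^{2} \, d^{2} \, L},

    where P_t is the transmit power, G_t and G_r are the antenna gains, lambda is the wavelength, d is the transmitter-receiver separation, and L >= 1 is a system-loss factor. Smooth, symmetric, obstacle-free models of this kind embody exactly the sort of simplifying assumptions that outdoor measurements, such as those in this study, tend to contradict.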