58 research outputs found

    Centrality Heuristics for Exact Model Counting

    Model counting is the archetypical #P-complete problem: determining the number of satisfying truth assignments of a given propositional formula. In this short paper, we empirically investigate the potential of employing graph centrality measures as a basis for search heuristics in the context of exact model counting. In particular, we integrate centrality-based heuristics into the search-based exact model counter sharpSAT. Our experiments show that employing centrality information significantly improves the empirical performance of sharpSAT, and also allows for simpler search heuristics than the counter's current defaults. In particular, we show that the VSIDS heuristic, an integral search heuristic in essentially all state-of-the-art conflict-driven clause learning Boolean satisfiability solvers, appears to be of very limited use in the context of model counting.
    Peer reviewed
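
    To make the idea concrete, here is a minimal sketch of our own (not sharpSAT's actual implementation): build the primal graph of a CNF formula, score the variables with a centrality measure, and branch on the most central variable first. Betweenness centrality and the networkx library are illustrative choices, not the paper's:

        import networkx as nx

        def centrality_branching_order(clauses):
            """Rank CNF variables by betweenness centrality of the primal
            graph: one vertex per variable, an edge between every pair of
            variables that co-occur in some clause."""
            g = nx.Graph()
            for clause in clauses:
                vs = sorted({abs(lit) for lit in clause})
                g.add_nodes_from(vs)
                g.add_edges_from((u, v) for i, u in enumerate(vs) for v in vs[i + 1:])
            scores = nx.betweenness_centrality(g)
            return sorted(scores, key=scores.get, reverse=True)

        # Branch on the most central variable first.
        cnf = [[1, -2, 3], [2, 4], [-3, 4, 5], [1, 5]]
        print(centrality_branching_order(cnf))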

    The Surprising Power of Graph Neural Networks with Random Node Initialization

    Graph neural networks (GNNs) are effective models for representation learning on relational data. However, standard GNNs are limited in their expressive power, as they cannot distinguish graphs beyond the capability of the Weisfeiler-Leman graph isomorphism heuristic. In order to break this expressiveness barrier, GNNs have been enhanced with random node initialization (RNI), where the idea is to train and run the models with randomized initial node features. In this work, we analyze the expressive power of GNNs with RNI, and prove that these models are universal, a first such result for GNNs not relying on computationally demanding higher-order properties. This universality result holds even with partially randomized initial node features, and preserves the invariance properties of GNNs in expectation. We then empirically analyze the effect of RNI on GNNs, based on carefully constructed datasets. Our empirical findings support the superior performance of GNNs with RNI over standard GNNs.
    Comment: Proceedings of the Thirtieth International Joint Conference on Artificial Intelligence (IJCAI-21). Code and data available at http://www.github.com/ralphabb/GNN-RN
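
    A minimal sketch of the RNI idea (an illustration, not the paper's exact architecture): before each forward pass, append freshly drawn random features to every node's input, then run an ordinary message-passing layer. The layer design and names such as rni_dim are our assumptions:

        import torch

        class RNIGNNLayer(torch.nn.Module):
            """One mean-aggregation message-passing layer whose input is the
            node features concatenated with freshly sampled random features
            (the RNI trick: randomness breaks Weisfeiler-Leman symmetry)."""
            def __init__(self, in_dim, rni_dim, out_dim):
                super().__init__()
                self.rni_dim = rni_dim
                self.lin = torch.nn.Linear(in_dim + rni_dim, out_dim)

            def forward(self, x, adj):
                # x: (num_nodes, in_dim); adj: (num_nodes, num_nodes) 0/1 matrix
                r = torch.randn(x.size(0), self.rni_dim)  # resampled every call
                h = torch.cat([x, r], dim=-1)
                deg = adj.sum(dim=-1, keepdim=True).clamp(min=1)
                return torch.relu(self.lin((adj @ h) / deg))

        # Nodes with identical inputs now get distinct embeddings (in expectation).
        x = torch.ones(4, 3)
        adj = torch.tensor([[0., 1, 0, 1], [1, 0, 1, 0],
                            [0, 1, 0, 1], [1, 0, 1, 0]])
        layer = RNIGNNLayer(in_dim=3, rni_dim=4, out_dim=8)
        print(layer(x, adj))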

    Decidability of Querying First-Order Theories via Countermodels of Finite Width

    We propose a generic framework for establishing the decidability of a wide range of logical entailment problems (briefly called querying), based on the existence of countermodels that are structurally simple, gauged by certain types of width measures (with treewidth and cliquewidth as popular examples). As an important special case of our framework, we identify logics exhibiting width-finite finitely universal model sets, warranting decidable entailment for a wide range of homomorphism-closed queries, subsuming a diverse set of practically relevant query languages. As a particularly powerful width measure, we propose Blumensath's partitionwidth, which subsumes various other commonly considered width measures and exhibits highly favorable computational and structural properties. Focusing on the formalism of existential rules as a popular showcase, we explain how finite partitionwidth sets of rules subsume other known abstract decidable classes but -- leveraging existing notions of stratification -- also cover a wide range of new rulesets. We expose natural limitations for fitting the class of finite unification sets into our picture and provide several options for remedy.
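
    As a standalone illustration of the countermodel idea (our example, not taken from the paper): consider the single existential rule

        Person(x) → ∃y. hasParent(x, y) ∧ Person(y)

    Over the fact Person(a), its canonical model is an infinite hasParent-chain a → y1 → y2 → …, which has treewidth 1. A logic whose entailment problems always admit such structurally simple, finite-width countermodels is exactly the kind the framework certifies as having decidable entailment for homomorphism-closed queries.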

    Automatic generation of high speed elliptic curve cryptography code

    Apparently, trust is a rare commodity when power, money or life itself are at stake. History is full of examples. Julius Caesar did not trust his generals, so that: "If he had anything confidential to say, he wrote it in cipher, that is, by so changing the order of the letters of the alphabet, that not a word could be made out. If anyone wishes to decipher these, and get at their meaning, he must substitute the fourth letter of the alphabet, namely D, for A, and so with the others." And so the history of cryptography took its first steps. Nowadays, encryption is no longer an emperor's prerogative; it has become an everyday operation. Cryptography is pervasive, ubiquitous and, best of all, completely transparent to the unaware user. Each time we buy something on the Internet we use it. Each time we search for something on Google we use it. All without (almost) realizing that it silently protects our privacy and our secrets. Encryption is a very interesting instrument in the "toolbox of security" because it has very few side effects, at least on the user side. A particularly important one is the intrinsic slowdown that its use imposes on communications. High-speed cryptography is very important for the Internet, where busy servers proliferate. Being faster is a double advantage: more throughput and less server overhead. In this context, however, public-key algorithms start with a big handicap: their performance is very poor compared to their symmetric counterparts. For this reason their use is often reduced to the essential operations, most notably key exchanges and digital signatures. The high-speed public-key cryptography challenge is a very practical topic with serious repercussions in our technocentric world. Using weak algorithms with a reduced key length to increase the performance of a system can lead to catastrophic results. In 1985, Miller and Koblitz independently proposed to use the group of rational points of an elliptic curve over a finite field to build an asymmetric algorithm. Elliptic Curve Cryptography (ECC) is based on a problem known as the ECDLP (Elliptic Curve Discrete Logarithm Problem) and offers several advantages over more traditional encryption systems such as RSA and DSA. The main benefit is that it requires smaller keys to provide the same security level, since breaking the ECDLP is much harder. In addition, a good ECC implementation can be very efficient in both time and memory consumption, making it a good candidate for performing high-speed public-key cryptography. Moreover, some elliptic-curve-based techniques, such as SIDH (Supersingular Isogeny Diffie-Hellman), are believed to be resilient to quantum computing attacks. Traditional elliptic curve cryptography implementations are optimized by hand, taking into account the mathematical properties of the underlying algebraic structures, the target machine architecture and the compiler facilities. This process is time-consuming, requires a high degree of expertise and is, ultimately, error-prone. This dissertation's ultimate goal is to automate the whole optimization process of cryptographic code, with a special focus on ECC. The framework presented in this thesis is able to produce high-speed cryptographic code by automatically choosing the best algorithms and applying a number of code-improving techniques inspired by compiler theory. Its central component is a flexible and powerful compiler able to translate an algorithm written in a high-level language into highly optimized C code for a particular algebraic structure and hardware platform. The system is generic enough to accommodate a wide array of number-theoretic algorithms; however, this document focuses only on optimizing primitives based on elliptic curves defined over binary fields.
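
    To make the "binary fields" setting concrete, here is a minimal Python sketch of multiplication in GF(2^m), the core primitive such generated code optimizes. The choice of m = 163 and the pentanomial x^163 + x^7 + x^6 + x^3 + 1 (NIST B-163) is our illustrative assumption; an optimized C version would use word-level carry-less multiply instructions instead of this bit loop:

        # Field elements are Python ints whose bits are polynomial coefficients.
        M = 163                                                # extension degree
        F = (1 << 163) | (1 << 7) | (1 << 6) | (1 << 3) | 1    # reduction polynomial

        def gf2m_mul(a, b):
            """Multiply a and b in GF(2^M) modulo the polynomial F."""
            p = 0
            while b:                         # carry-less (XOR) schoolbook multiply
                if b & 1:
                    p ^= a
                a <<= 1
                b >>= 1
            while p.bit_length() > M:        # reduce: cancel the top bit with F
                p ^= F << (p.bit_length() - 1 - M)
            return p

        # Example: (x + 1)^2 = x^2 + 1 in characteristic 2.
        assert gf2m_mul(0b11, 0b11) == 0b101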

    Computer Science Logic 2018: CSL 2018, September 4-8, 2018, Birmingham, United Kingdom


    A primordial, mathematical, logical and computable demonstration (proof) of the family of conjectures known as Goldbach's

    Creative Commons Attribution-NonCommercial-NoDerivatives 4.0 International license. In this document, by means of a novel system model and a first-order topological, algebraic and geometrical context-free formal language (NT-FS&L), we first describe a new signature for a set of the natural numbers that is rooted in an intensional inductive de-embedding process of both the tensorial identities of the so-called "natural numbers" and the abstract framework of their locus-positional symbolic representations. Additionally, we describe that NT-FS&L is able to: i. embed De Morgan's laws and the axiomatics of first-order Peano arithmetic; ii. provide new points of view and perspectives on the successor, predecessor and addition operations and on their abstract, topological, algebraic, analytic-geometrical, computational and cognitive formal representations. Second, by means of the inductive apparatus of NT-FS&L, we prove that the family of conjectures known as Goldbach's holds entailment and truth when the reasoning starts from the consistent and finitary axiomatic system described herein.
    We wish to thank the Organic Chemistry Institute of the Spanish National Research Council (IQOG/CSIC) for its operative and technical support of the Pedro Noheda Research Group (PNRG). We also thank the Institute for Physical and Information Technologies (ITETI/CSIC) of the Spanish National Research Council for its hospitality. We also thank, for their long years of dedicated and kind support, Dr. Juan Martínez Armesto (VATC/CSIC), Belén Cabrero Suárez (IQOG/CSIC, Administration), Mar Caso Neira (IQOG/CENQUIOR/CSIC, Library) and David Herrero Ruíz (PNRG/IQOG/CSIC). We wish to thank the Bernabé-Pajares brothers, Dr. Manuel Bernabé-Pajares (IQOG/CSIC, Structural Chemistry & Biochemistry; Nuclear Magnetic Resonance) and Dr. Alberto Bernabé-Pajares (Greek Philology and Indo-European Linguistics/UCM), for their kind attention during numerous discussions about space, time, imaging and the representation of knowledge, language, transcription mistakes, myths and humans, discussions that have kept alive our shared illusion and passion for knowledge and intellectual progress. We wish to thank Dr. Carlos Cativiela Marín (ISQCH/UNIZAR) for his encouragement and for his kind listening and attention. We wish to thank Miguel Lorca Melton for his encouragement and his professional point of view as a patent attorney. Last but not least, our gratitude to Nati, María and Jaime for the time borrowed from a loving husband and father. Finally, we apologize to the many who have not been mentioned here, but to whom we are grateful, and we especially apologize to those mentioned herein for any possible misunderstanding regarding the sense and intension of their philosophical, scientific and/or technical hard work and milestone ideas; we hope that at least Goldbach, Euler and Feynman do not belong to this last group.
    Peer reviewed
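
    For reference, the conjecture targeted by the paper states that every even integer greater than 2 is the sum of two primes. A minimal Python sketch (illustrative only, and unrelated to the paper's formal apparatus) that checks the statement over a small range:

        def is_prime(n):
            """Trial-division primality test, adequate for small n."""
            if n < 2:
                return False
            if n % 2 == 0:
                return n == 2
            i = 3
            while i * i <= n:
                if n % i == 0:
                    return False
                i += 2
            return True

        def goldbach_pair(n):
            """Return (p, q) with p + q == n and both prime, or None."""
            for p in range(2, n // 2 + 1):
                if is_prime(p) and is_prime(n - p):
                    return (p, n - p)
            return None

        # Empirically verify the conjecture for small even numbers.
        assert all(goldbach_pair(n) for n in range(4, 10000, 2))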