31 research outputs found

    Hierarchical combination of intruder theories

    Recently, automated deduction tools have proved to be very effective for detecting attacks on cryptographic protocols. These analyses can be improved, for finding more subtle weaknesses, by a more accurate modelling of the operators employed by protocols. Several works have shown how to handle a single algebraic operator (associated with a fixed intruder theory) or how to combine several operators satisfying disjoint theories. However, several interesting equational theories, such as exponentiation with an abelian group law for exponents, remain out of the scope of these techniques. This has motivated us to introduce a new notion of hierarchical combination for non-disjoint intruder theories and to show decidability results for the deduction problem in these theories. We have also shown that, under natural hypotheses, hierarchical intruder constraints can be decided. This result applies to an exponentiation theory that appears to be more general than the one considered before.
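
    The deduction problem mentioned here asks whether the intruder can derive a given message from a finite set of observed messages using the operators of the theory. As a point of reference, the sketch below treats only the plain Dolev-Yao baseline (pairing and symmetric encryption, no algebraic operators); the term encoding and rule set are assumptions for illustration, not the paper's procedure.

```python
def analz(knowledge):
    """Close a finite set of ground terms under projection and decryption.
    Terms: ("pair", t1, t2), ("enc", msg, key); anything else is an atom.
    Keys are assumed atomic, so they only enter the set by decomposition."""
    known = set(knowledge)
    changed = True
    while changed:
        changed = False
        for t in list(known):
            if isinstance(t, tuple) and t[0] == "pair":
                for part in (t[1], t[2]):
                    if part not in known:
                        known.add(part)
                        changed = True
            if isinstance(t, tuple) and t[0] == "enc" and t[2] in known:
                if t[1] not in known:
                    known.add(t[1])
                    changed = True
    return known

def deducible(knowledge, goal):
    """A term is deducible if it appears in the analysed knowledge or can be
    re-composed (synthesised) from deducible subterms."""
    facts = analz(knowledge)
    def synth(t):
        if t in facts:
            return True
        if isinstance(t, tuple) and t[0] in ("pair", "enc"):
            return all(synth(sub) for sub in t[1:])
        return False
    return synth(goal)

# The key k lets the intruder open enc(m, k); without it, m stays secret.
assert deducible({("enc", "m", "k"), "k"}, "m")
assert not deducible({("enc", "m", "k")}, "m")
```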

    Logic for Programming, Artificial Intelligence, and Reasoning


    Hierarchical Combination of Intruder Theories

    Recently, automated deduction tools have proved to be very effective for detecting attacks on cryptographic protocols. These analyses can be improved, for finding more subtle weaknesses, by a more accurate modelling of the operators employed by protocols. Several works have shown how to handle a single algebraic operator (associated with a fixed intruder theory) or how to combine several operators satisfying disjoint theories. However, several interesting equational theories, such as exponentiation with an abelian group law for exponents, remain out of the scope of these techniques. This has motivated us to introduce a new notion of hierarchical combination for intruder theories and to show decidability results for the deduction problem in these theories. Under a simple hypothesis, we were able to simplify this deduction problem. This simplification is then applied to prove the decidability of constraint systems w.r.t. an intruder relying on an exponentiation theory.
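
    The exponentiation theory referred to above identifies terms such as exp(exp(g, a), b) and exp(exp(g, b), a), with the exponents forming an abelian group. Below is a minimal sketch of that identification only, using an assumed representation in which exponents are kept as a multiset with inverses; it illustrates the equational theory, not the combination or constraint-solving results.

```python
from collections import Counter

def normalize_exp(base, exponents):
    """Flatten nested exponentiations exp(exp(g, a), b) into exp(g, {a, b})
    and cancel each exponent against its inverse, written 'inv(x)'."""
    counts = Counter()
    for e in exponents:
        if e.startswith("inv(") and e.endswith(")"):
            counts[e[4:-1]] -= 1
        else:
            counts[e] += 1
    reduced = {a: n for a, n in counts.items() if n != 0}
    if not reduced:
        return base                      # exp(g, unit) = g
    return ("exp", base, tuple(sorted(reduced.items())))

# Two syntactically different Diffie-Hellman keys normalize to the same term:
assert normalize_exp("g", ["a", "b"]) == normalize_exp("g", ["b", "a"])
# An exponent cancels against its inverse:
assert normalize_exp("g", ["a", "inv(a)"]) == "g"
```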

    Approximation based tree regular model checking

    This paper addresses the following general problem of tree regular model checking: decide whether $\mathcal{R}^*(L) \cap L_p = \emptyset$, where $\mathcal{R}^*$ is the reflexive and transitive closure of a successor relation induced by a term rewriting system $\mathcal{R}$, and $L$ and $L_p$ are both regular tree languages. We develop an automatic approximation-based technique to handle this problem, which is undecidable in general, in most practical cases, extending a recent work by Feuillade, Genet and Viet Triem Tong. We also make this approach fully automatic for the practical validation of security protocols.
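
    A toy sketch of the underlying idea: iterate the successor relation from the initial language and check whether any term of the forbidden language $L_p$ becomes reachable. The actual technique represents $L$, $L_p$ and the approximation as tree automata and uses automata completion with widening to force termination; the finite term sets, root-only rewriting and iteration bound below are simplifying assumptions for illustration only.

```python
def one_step(terms, rules):
    """Apply each ground rewrite rule (lhs -> rhs) to every term it matches."""
    return {rhs for t in terms for lhs, rhs in rules if t == lhs}

def over_approx_reachable(seed, rules, bound=100):
    """Iterate the successor relation; stop at a fixpoint or at the bound."""
    reach = set(seed)
    for _ in range(bound):
        new = one_step(reach, rules)
        if new <= reach:
            return reach, True           # fixpoint reached: exact closure
        reach |= new
    return reach, False                  # bound hit: only an approximation

# R: a -> b, b -> c ; initial language L = {a}; forbidden language L_p = {c}
reach, exact = over_approx_reachable({"a"}, [("a", "b"), ("b", "c")])
print("reachable:", sorted(reach), "| intersects L_p:", bool(reach & {"c"}))
```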

    Reasoning about recognizability in security protocols

    Although verifying a message has long been recognized as an important concept, used explicitly or implicitly in security protocol analysis, there is no consensus on its exact meaning. Such a lack of formal treatment of the concept makes it extremely difficult to evaluate the vulnerability of security protocols. This dissertation offers a precise answer to the question: what is meant by saying that a message can be "verified"? The core technical innovation is a third notion of knowledge in security protocols, recognizability, which can be considered intermediate between deduction and static equivalence, the two classical knowledge notions in security protocols. We believe that the notion of recognizability sheds important light on the study of security protocols. More specifically, this thesis makes four contributions. First, we develop a knowledge model to capture an agent's cognitive ability to understand messages. Thanks to a clear distinction between de re and de dicto interpretations of a message, the knowledge model gracefully unifies the computational and symbolic views of cryptography. Second, we propose a new notion of knowledge in security protocols, recognizability, to fully capture one's ability or inability to cope with potentially ambiguous messages. A terminating procedure is given to decide recognizability under the standard Dolev-Yao model. Third, we establish a faithful view of the attacker based on recognizability. This yields new insights into protocol compilations and protocol implementations. Specifically, we identify two types of attacks that can be thwarted by adjusting the protocol implementation, and show that an ideal implementation corresponding to the intended protocol semantics does not always exist. Overall, the obtained attacker's view provides a path to more secure protocol designs and implementations. Fourth, we use recognizability to provide a new perspective on type-flaw attacks. Unlike most previous approaches, which have focused on heuristic schemes to detect or prevent type-flaw attacks, our approach exposes the enabling factors of such attacks. Similarly, we apply the notion of recognizability to analyze off-line guessing attacks. Without enumerating rules to determine whether a guess can be "verified", we derive a new definition based on recognizability to fully capture the attacker's guessing capabilities. This definition offers a general framework to reason about guessing attacks in a symbolic setting, independent of specific intruder models. We show how the framework can be used to analyze both passive and active guessing attacks.
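
    To make the off-line guessing use concrete, here is a hedged sketch of a guess check in a symbolic setting: a guess for a weak secret is treated as verifiable when decrypting an observed ciphertext with it yields a value the attacker already recognizes. The term encoding and the recognizability test (membership in a set of known values) are illustrative assumptions, not the dissertation's definition.

```python
def decrypt(ciphertext, key):
    """Symbolic symmetric decryption: ("enc", m, k) opens only with k.
    (In a computational model a wrong key yields garbage; recognizability is
    about whether the attacker can tell the difference.)"""
    if isinstance(ciphertext, tuple) and ciphertext[0] == "enc" and ciphertext[2] == key:
        return ciphertext[1]
    return None

def confirmed_guesses(observed, known_values, candidates):
    """Return the candidate passwords whose guesses the attacker can confirm
    off-line against a single observed ciphertext."""
    return [g for g in candidates
            if decrypt(observed, g) is not None
            and decrypt(observed, g) in known_values]

# The attacker saw enc(nonce, pw) and later learned nonce: the guess is checkable.
observed = ("enc", "nonce", "pw")
print(confirmed_guesses(observed, {"nonce"}, ["pw", "wrong1", "wrong2"]))  # ['pw']
```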

    A system for computational analysis and reconstruction of 3D comminuted bone fractures

    High-energy impacts at joint locations often generate highly fragmented, or comminuted, bone fractures. A leading current approach for treatment requires physicians to qualitatively classify the fracture into one of four possible fracture severity cases. Each case then has a sequence of best practices for obtaining the best possible prognosis for the patient. It has been observed that qualitative evaluation of fracture severity by physicians can vary significantly, which can lead to potential mis-classification and mis-treatment of these fracture cases. Major indicators of fracture severity are (i) fracture surface area, i.e., how much surface area was generated when the bone broke apart, and (ii) dispersion, i.e., how far the fragments have rotated and translated from their original anatomic positions. Work in this dissertation develops computational tools that solve the bone puzzle-solving problem automatically or semi-automatically and extract previously unavailable quantitative information for these indicators from each bone fragment, intended to assist physicians in making a more accurate and reliable fracture severity classification. The system applies novel three-dimensional (3D) puzzle-solving algorithms to identify the fracture fragments in the CT image data and piece them back together in a virtual environment. Doing so provides quantitative values for both fracture surface area and dispersion that reduce variability in fracture severity classifications and prevent mis-diagnosis of fracture cases that may be difficult to qualitatively classify using traditional approaches. This dissertation describes the system and the underlying algorithms, and demonstrates the virtual reconstruction results and quantitative analysis of comminuted bone fractures from six clinical cases.
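
    The two severity indicators can be made concrete with a short sketch: surface area summed over a fragment's triangle mesh, and dispersion as the mean distance fragment centroids have moved from their reconstructed anatomic positions. The input formats and the specific dispersion metric are assumptions for illustration, not the dissertation's definitions.

```python
import numpy as np

def surface_area(vertices, faces):
    """Total area of a triangle mesh: half the norm of each face's cross product."""
    v = np.asarray(vertices, dtype=float)
    f = np.asarray(faces, dtype=int)
    a, b, c = v[f[:, 0]], v[f[:, 1]], v[f[:, 2]]
    return 0.5 * np.linalg.norm(np.cross(b - a, c - a), axis=1).sum()

def dispersion(displaced_centroids, anatomic_centroids):
    """Mean distance each fragment centroid has travelled from its anatomic position."""
    d = np.asarray(displaced_centroids, float) - np.asarray(anatomic_centroids, float)
    return np.linalg.norm(d, axis=1).mean()

# One unit right triangle has area 0.5; a fragment displaced by (3, 4, 0) is 5 away.
print(surface_area([[0, 0, 0], [1, 0, 0], [0, 1, 0]], [[0, 1, 2]]))
print(dispersion([[3, 4, 0]], [[0, 0, 0]]))
```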

    A study on unification and disunification modulo

    Master's dissertation, Universidade de Brasília, Instituto de Ciências Exatas, Departamento de Ciência da Computação, 2020. Comparisons between asymmetric unification and disunification modulo equational theories with respect to their complexities, as developed by Ravishankar, Narendran and Gero, are studied. Asymmetric unification is a type of equational unification problem in which the solutions must give the right-hand sides of the input problem in normal form with respect to some rewriting system. Disunification problems require solving equations and "disequations" for a given equational theory: solutions are substitutions that make the two terms of each equation equal, but the two terms of each "disequation" different. These authors compared the complexity of the unification and disunification problems for two equational theories. The first is associative (A), commutative (C), with unity (U) and nilpotence (N), abbreviated ACUN; the second has the same properties but adds a homomorphism (h), for short ACUNh. For these equational theories, details of the proof that disunification can be solved in polynomial time while asymmetric unification is NP-hard have been studied. Besides, the approach for converting ACUN unifiers into asymmetric ones, with uninterpreted function symbols, using the inference rules introduced by Zhiqiang Liu in his Ph.D. dissertation, was studied. Narendran's proof of the undecidability of the unification problem modulo the associative-commutative theory with homomorphism (ACh) is studied. Also studied is the semi-decision algorithm recently introduced by Ajay Kumar Eeralla and Christopher Lynch, which presents a set of inference rules for solving a bounded version of ACh unification.
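
    To illustrate the distinction between equations and "disequations", the sketch below solves one of each in the free (syntactic) theory; handling ACUN or ACUNh, as in the works studied here, requires dedicated algorithms that this toy code does not attempt.

```python
# Variables are strings starting with '?'; compound terms are tuples
# ("f", arg1, ..., argn).  No occurs check: kept minimal for illustration.

def unify(s, t, subst=None):
    """Return a unifying substitution for s and t, or None if none exists."""
    subst = dict(subst or {})
    def walk(u):
        while isinstance(u, str) and u.startswith("?") and u in subst:
            u = subst[u]
        return u
    s, t = walk(s), walk(t)
    if s == t:
        return subst
    if isinstance(s, str) and s.startswith("?"):
        subst[s] = t
        return subst
    if isinstance(t, str) and t.startswith("?"):
        subst[t] = s
        return subst
    if isinstance(s, tuple) and isinstance(t, tuple) and len(s) == len(t) and s[0] == t[0]:
        for a, b in zip(s[1:], t[1:]):
            subst = unify(a, b, subst)
            if subst is None:
                return None
        return subst
    return None

def apply(subst, t):
    """Apply a substitution to a term, following variable chains."""
    if isinstance(t, str) and t.startswith("?"):
        return apply(subst, subst[t]) if t in subst else t
    if isinstance(t, tuple):
        return (t[0],) + tuple(apply(subst, a) for a in t[1:])
    return t

# Solve the equation f(?x, b) = f(a, ?y) while requiring the disequation ?x != ?y.
sigma = unify(("f", "?x", "b"), ("f", "a", "?y"))
print(sigma)                                        # {'?x': 'a', '?y': 'b'}
print(apply(sigma, "?x") != apply(sigma, "?y"))     # True: disequation satisfied
```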