5 research outputs found
Task-Structured Probabilistic I/O Automata
In the Probabilistic I/O Automata (PIOA) framework, nondeterministic choices are resolved using perfect-information schedulers, which are similar to history-dependent policies for Markov decision processes (MDPs). These schedulers are too powerful in the setting of security analysis, leading to unrealistic adversarial behaviors. Therefore, we introduce in this paper a novel mechanism of task partitions for PIOAs. This allows us to define partial-information adversaries in a systematic manner, namely, via sequences of tasks. The resulting task-PIOA framework comes with simple notions of external behavior and implementation, and supports simple compositionality results. A new type of simulation relation is defined and proven sound with respect to our notion of implementation. To illustrate the potential of this framework, we summarize our verification of an Oblivious Transfer protocol, where we combine formal and computational analyses. Finally, we present an extension with extra expressive power, using local schedulers of individual components.
Using Probabilistic I/O Automata to Analyze an Oblivious Transfer Protocol
We demonstrate how to carry out cryptographic security analysis of distributed protocols within the Probabilistic I/O Automata framework of Lynch, Segala, and Vaandrager. This framework provides tools for arguing rigorously about the concurrency and scheduling aspects of protocols, and about protocols presented at different levels of abstraction. Consequently, it can help in making cryptographic analysis more precise and less susceptible to errors. We concentrate on a relatively simple two-party Oblivious Transfer protocol, in the presence of a semi-honest adversary (essentially, an eavesdropper). For the underlying cryptographic notion of security, we use a version of Canetti's Universally Composable security. In spite of the relative simplicity of the example, the exercise is quite nontrivial. It requires taking many fundamental issues into account, including nondeterministic behavior, scheduling, resource-bounded computation, and computational hardness assumptions for cryptographic primitives.
Task-Structured Probabilistic I/O Automata
May 28, 2009. Modeling frameworks such as Probabilistic I/O Automata (PIOA) and Markov Decision Processes permit both probabilistic and nondeterministic choices. In order to use these frameworks to express claims about probabilities of events, one needs mechanisms for resolving nondeterministic choices. For PIOAs, nondeterministic choices have traditionally been resolved by schedulers that have perfect information about the past execution. However, these schedulers are too powerful for certain settings, such as cryptographic protocol analysis, where information must sometimes be hidden. Here, we propose a new, less powerful nondeterminism-resolution mechanism for PIOAs, consisting of tasks and local schedulers. Tasks are equivalence classes of system actions that are scheduled by oblivious, global task sequences. Local schedulers resolve nondeterminism within system components, based on local information only. The resulting task-PIOA framework yields simple notions of external behavior and implementation, and supports simple compositionality results. We also define a new kind of simulation relation, and show it to be sound for proving implementation. We illustrate the potential of the task-PIOA framework by outlining its use in verifying an Oblivious Transfer protocol.
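As a much-simplified sketch of the task-scheduling idea described above (not the paper's formal definitions; all names here are hypothetical), one can picture each action as belonging to a task, with an oblivious global task sequence resolving nondeterminism: in each state, a scheduled task fires its unique enabled action, and a task with no enabled action is skipped.

```python
# Illustrative sketch of task-based nondeterminism resolution,
# loosely following the task-PIOA idea: tasks are equivalence
# classes of actions, and a global, oblivious task sequence
# decides which task may fire next. All names are hypothetical.

class TaskAutomaton:
    def __init__(self, start, transitions, task_of):
        self.state = start
        self.transitions = transitions   # {(state, action): next_state}
        self.task_of = task_of           # {action: task name}

    def enabled(self, task):
        """Actions of `task` enabled in the current state."""
        return [a for (s, a) in self.transitions
                if s == self.state and self.task_of[a] == task]

    def run(self, task_sequence):
        trace = []
        for task in task_sequence:
            acts = self.enabled(task)
            if len(acts) == 1:           # task determinism: at most one
                a = acts[0]              # enabled action per task
                self.state = self.transitions[(self.state, a)]
                trace.append(a)
            # a task with no enabled action is simply skipped
        return trace

aut = TaskAutomaton(
    start="s0",
    transitions={("s0", "send"): "s1", ("s1", "ack"): "s2"},
    task_of={"send": "T_out", "ack": "T_in"},
)
print(aut.run(["T_in", "T_out", "T_in"]))  # → ['send', 'ack']
```

Note that the task sequence is fixed in advance, independently of the execution so far; this obliviousness is what makes the resulting adversaries weaker than perfect-information schedulers.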
Complexity Results for Reachability in Cooperating Systems and Approximated Reachability by Abstract Over-Approximations
This work deals with theoretic aspects of cooperating systems, i.e., systems that consist of cooperating subsystems. Our main focus lies on the complexity-theoretic classification of deciding the reachability problem and on efficiently establishing deadlock-freedom in models of cooperating systems. The formal verification of system properties is an active field of research, first attempts of which go back to the late 1960s. The behavior of cooperating systems suffers from the state space explosion problem and can become very large. That is, techniques that are based on an analysis of the reachable state space have a runtime exponential in the number of subsystems. The consequence is that even modern techniques that decide whether or not a system property holds can become infeasible.
We use interaction systems, introduced by Sifakis et al. in 2003, as a formalism to model cooperating systems. The reachability problem and deciding deadlock-freedom in interaction systems were proved to be PSPACE-complete. One approach to deal with this issue is to investigate subclasses of systems in which these problems can be treated efficiently. We show here that the reachability problem remains PSPACE-complete in subclasses of interaction systems with a restricted communication structure. We consider structures that form trees, stars, and linear arrangements of subsystems. Our result motivates research into techniques that treat the reachability problem in these subclasses based on sufficient conditions that exploit characteristics of the structural restrictions.
In the second part of this work we investigate an approach to efficiently establish the reachability of states and deadlock-freedom in general interaction systems. We introduce abstract over-approximations -- a concept of compact representations of over-approximations of the reachable behavior of interaction systems. Families of abstract over-approximations are the basis for our approach to establishing deadlock-freedom in interaction systems in time polynomial in the size of the underlying interaction system. We introduce an operator called Edge-Match for refining abstract over-approximations. The strength of our approach is illustrated on various parametrized instances of interaction systems. Furthermore, we establish a link between our refinement approach and the field of relational database theory and use this link to make a precision statement about our refinement approach.
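The general flavor of over-approximating reachability can be illustrated with a deliberately crude sketch (this is not the thesis's abstract over-approximations or its Edge-Match operator; components, actions, and names are hypothetical): take the Cartesian product of each component's locally reachable states as a coarse over-approximation of the reachable global states, and compare it against exact global reachability.

```python
from itertools import product

# Two hypothetical components that must synchronize on "sync";
# "tau" is a local step of comp_b. Local transitions are given
# as {state: {action: next_state}}.
comp_a = {"a0": {"sync": "a1"}, "a1": {}}
comp_b = {"b0": {"tau": "b1"}, "b1": {"sync": "b2"}, "b2": {}}

def local_reach(comp, start):
    """States reachable in one component, ignoring the other."""
    seen, todo = {start}, [start]
    while todo:
        s = todo.pop()
        for nxt in comp[s].values():
            if nxt not in seen:
                seen.add(nxt)
                todo.append(nxt)
    return seen

# Coarse over-approximation: product of locally reachable states.
over = set(product(local_reach(comp_a, "a0"), local_reach(comp_b, "b0")))

def global_reach(start):
    """Exact reachability: 'sync' fires jointly, 'tau' alone."""
    seen, todo = {start}, [start]
    while todo:
        a, b = todo.pop()
        succs = []
        if "tau" in comp_b[b]:
            succs.append((a, comp_b[b]["tau"]))
        if "sync" in comp_a[a] and "sync" in comp_b[b]:
            succs.append((comp_a[a]["sync"], comp_b[b]["sync"]))
        for s in succs:
            if s not in seen:
                seen.add(s)
                todo.append(s)
    return seen

exact = global_reach(("a0", "b0"))
assert exact <= over             # the over-approximation is sound
print(len(over - exact))         # → 3 spurious states
```

The product approximation is computable in polynomial time but admits spurious global states; refinement operators such as Edge-Match aim to prune exactly this kind of imprecision without ever building the exponential-size exact state space.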
Hierarchical and compositional verification of cryptographic protocols
Two important approaches to the verification of security protocols are
known under the general names of symbolic and computational, respectively.
In the symbolic approach messages are terms of an algebra and the cryptographic
primitives are ideally secure; in the computational approach messages
are bitstrings and the cryptographic primitives are secure with overwhelming
probability. This means, for example, that in the symbolic approach
only someone who knows the decryption key can decrypt a ciphertext, while in
the computational approach the probability of decrypting a ciphertext without
knowing the decryption key is negligible.
Usually, cryptographic protocols are the outcome of the interaction
of several components: some of them are based on cryptographic primitives,
other components on other principles. In general, the result is a complex
system that we would like to analyse in a modular way instead of studying
it as a single system.
A similar situation can be found in the context of distributed systems,
where there are several probabilistic components that interact with each
other implementing a distributed algorithm. In this context, the analysis
of the correctness of a complex system is very rigorous and it is based on
tools from information theory such as the simulation method that allows
us to decompose large problems into smaller problems and to verify systems
hierarchically and compositionally. The simulation method consists
of establishing relations between the states of two automata, called simulation
relations, and of verifying that such relations satisfy appropriate step
conditions: each transition of the simulated system can be matched by the
simulating system up to the given relation. Using a compositional approach
we can study the properties of each small problem independently of each
other, deriving the properties of the overall system. Furthermore, the
hierarchical verification allows us to build several intermediate refinements
between specification and implementation. Often hierarchical and compositional
verification is simpler and cleaner than direct one-step verification,
since each refinement may focus on specific homogeneous aspects of the implementation.
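A minimal finite-state sketch of the step condition just described may help fix intuitions (purely illustrative; the simulations introduced in this thesis are probabilistic and approximate, and all states and actions below are hypothetical): a relation R between implementation and specification states is a simulation when every implementation step from a related state can be matched by a specification step that stays inside R.

```python
# Transitions as {(state, action): {successor states}}.
# Hypothetical specification and implementation automata.
spec = {("t0", "req"): {"t1"}, ("t1", "resp"): {"t0"}}
impl = {("s0", "req"): {"s1"}, ("s1", "resp"): {"s2"},
        ("s2", "req"): {"s1"}}

def is_simulation(R, impl, spec):
    """Check the step condition: every impl step from a related
    state is matched by a spec step that stays inside R."""
    for (s, t) in R:
        for (src, act), succs in impl.items():
            if src != s:
                continue
            spec_succs = spec.get((t, act), set())
            for s2 in succs:
                if not any((s2, t2) in R for t2 in spec_succs):
                    return False
    return True

# impl states s0 and s2 both play the role of spec state t0.
R = {("s0", "t0"), ("s1", "t1"), ("s2", "t0")}
print(is_simulation(R, impl, spec))  # → True
```

In the exact setting this check is a simple fixpoint-free verification over the relation; the polynomially accurate variant additionally tolerates a step-matching error that must remain negligible over polynomially long computations.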
In this thesis we introduce a new simulation relation, which we call polynomially
accurate simulation, or approximated simulation; it is compositional
and allows us to adopt the hierarchical approach in our analyses.
The polynomially accurate simulations extend the simulation relations of
the distributed systems context in both strong and weak cases taking into
account the lengths of the computations and the computational properties
of the cryptographic primitives.
Besides the polynomially accurate simulations, we provide other tools
that can simplify the analysis of cryptographic protocols: the first one is the
concept of conditional automaton, which makes it possible to safely remove events that
occur with negligible probability. Starting from a machine that is attackable
with negligible probability, if we build an automaton that is conditional to
the absence of these attacks, then there exists a simulation between the two. This allows
us to work with the simulation relations all the time and in particular we can
also prove in a compositional way that the elimination of negligible events
from an automaton is safe. This property is justified by the conditional
automaton theorem that states that events are negligible if and only if the
identity relation is an approximated simulation from the automaton to
its conditional counterpart. Another tool is the execution correspondence
theorem, which extends its counterpart from the distributed systems context and
justifies the hierarchical approach. In fact, the theorem states that if we
have several automata and a chain of simulations between them, then with
overwhelming probability each execution of the first automaton is related
to an execution of the last automaton. In other words, we have that the
probability that the last automaton is not able to simulate an execution of
the first one is negligible.
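The exact (non-approximate) core of this chaining argument is ordinary relational composition, sketched here with hypothetical state names: simulations along a refinement chain compose into a relation between the first and the last automaton.

```python
# Illustrative sketch: a chain of simulation relations composes
# into a relation between the first and last automaton, which is
# the exact core of the execution correspondence theorem.

def compose(R, S):
    """Relational composition: (a, c) if some b links them."""
    return {(a, c) for (a, b1) in R for (b2, c) in S if b1 == b2}

# Hypothetical relations along a refinement chain A -> B -> C.
R_ab = {("a0", "b0"), ("a1", "b1")}
R_bc = {("b0", "c0"), ("b1", "c0")}

R_ac = compose(R_ab, R_bc)
print(sorted(R_ac))  # → [('a0', 'c0'), ('a1', 'c0')]
```

In the approximate setting each link of the chain contributes a negligible matching error, so over a chain of polynomially many refinements the total failure probability remains negligible.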
Finally, we use the polynomially accurate simulation framework to provide
families of automata that implement commonly used cryptographic
primitives and to prove that the symbolic approach is sound with respect to
the computational approach.