Model checking control communication of a FACTS device
This thesis concerns the design and verification of a real-time communication protocol for sensor data collection and processing between an embedded computer and a DSP. In such systems, a certain amount of data loss without recovery may be tolerated. The key issue is to design and verify correctness in the presence of these lost data frames under real-time constraints. This thesis describes a temporal verification showing that if the end processes do not detect that too many frames have been lost, as defined by comparison of error counters against given threshold values, then there will be a bounded delay between transmission of data frames and reception of control frames. This verification and others presented herein were performed with the model checkers SPIN and RT-SPIN. --Abstract, page iii
Model Checking Control Communication of a FACTS Device
This paper concerns the design and verification of a real-time communication protocol for sensor data collection and processing between an embedded computer and a DSP. In such systems, a certain amount of data loss without recovery may be tolerated. The key issue is to define and verify correctness in the presence of these lost data frames under real-time constraints. This paper describes a temporal verification showing that if the end processes do not detect that too many frames have been lost, as defined by comparison of error counters against given threshold values, then there will be a bounded delay between transmission of data frames and reception of control frames. This verification and others presented herein were performed with the model checkers SPIN and RT-SPIN.
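The bounded-delay guarantee above can be read as a bounded-response property. As a hedged sketch only, the proposition names (counters_ok for "neither error counter exceeds its threshold", data_sent, ctrl_received) and the delay bound Delta below are illustrative placeholders, not identifiers taken from the work:

\Box\,\bigl(\mathit{counters\_ok} \rightarrow (\mathit{data\_sent} \rightarrow \Diamond_{\le \Delta}\,\mathit{ctrl\_received})\bigr)

Plain SPIN can check the unbounded version of such a response property; the delay bound itself would be captured by the timing constraints that RT-SPIN adds to the model. The exact formalization used in the work is not reproduced here.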
Modular Verification of Interrupt-Driven Software
Interrupts have been widely used in safety-critical computer systems to
handle outside stimuli and interact with the hardware, but reasoning about
interrupt-driven software remains a difficult task. Although a number of static
verification techniques have been proposed for interrupt-driven software, they
often rely on constructing a monolithic verification model. Furthermore, they
do not precisely capture the complete execution semantics of interrupts, such as nested invocations of interrupt handlers. To overcome these limitations, we
propose an abstract interpretation framework for static verification of
interrupt-driven software that first analyzes each interrupt handler in
isolation as if it were a sequential program, and then propagates the result to
other interrupt handlers. This iterative process continues until results from
all interrupt handlers reach a fixed point. Since our method never constructs
the global model, it avoids the up-front blowup in model construction that
hampers existing non-modular verification techniques. We have evaluated our
method on 35 interrupt-driven applications with a total of 22,541 lines of
code. Our results show that the method analyzes the behavior of interrupts quickly and more accurately than existing techniques. Comment: preprint of the ASE 2017 paper
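To make the iteration concrete, here is a minimal, self-contained Python sketch of the modular fixed-point scheme described in the abstract. It is an assumption-laden toy, not the authors' tool: the abstract domain is simply the set of reachable values of one shared variable, and the two handlers are hypothetical.

def analyze_handler(handler, env):
    # Abstractly "run" one handler from every state in env (a set of
    # shared-variable valuations) and return the reachable valuations.
    out = set(env)
    for state in env:
        out.add(handler(state))
    return frozenset(out)

def modular_analysis(handlers, init_state):
    # One summary (set of reachable states) per handler, each analyzed in
    # isolation; results are propagated until nothing changes.
    summaries = {name: frozenset({init_state}) for name in handlers}
    changed = True
    while changed:
        changed = False
        for name, body in handlers.items():
            env = frozenset().union(*summaries.values())   # join of all summaries
            new = analyze_handler(body, env)
            if new != summaries[name]:
                summaries[name] = new
                changed = True
    return summaries

# Hypothetical interrupt handlers updating one shared counter (illustrative only).
handlers = {
    "timer_isr": lambda c: min(c + 1, 3),   # saturating increment
    "uart_isr": lambda c: 0,                # reset
}
print(modular_analysis(handlers, 0))        # fixed point: {0, 1, 2, 3} for both

The point of the sketch is that no product of all handlers is ever built; each handler is re-analyzed against a summary of the others until the summaries stabilize.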
Computation using Noise-based Logic: Efficient String Verification over a Slow Communication Channel
Utilizing the hyperspace of noise-based logic, we show two string
verification methods with low communication complexity. One of them is based on
continuum noise-based logic. The other utilizes noise-based logic with random telegraph signals, for which a mathematical analysis of the error probability is also given. The latter can also be interpreted as computing universal hash functions with noise-based logic and using them for string
comparison. To find out, with 10^-25 error probability, that two strings of arbitrary length are different (this value is similar to the error probability of an idealized gate in today's computers), Alice and Bob need to compare only 83 bits of the noise-based hyperspace. Comment: Accepted for publication in the European Physical Journal B (November 10, 2010).
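The 83-bit figure is consistent with a standard hashing argument, stated here only to illustrate the order of magnitude (the paper's own error analysis is not reproduced in the abstract): if each compared bit of the hyperspace matches with probability 1/2 when the strings actually differ, and the comparisons are independent, then k comparisons leave an undetected-difference probability of 2^-k, and 2^-83 ≈ 1.03 x 10^-25, matching the quoted 10^-25.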
Statistical Model Checking: An Overview
Quantitative properties of stochastic systems are usually specified in logics
that allow one to compare the measure of executions satisfying certain temporal
properties with thresholds. The model checking problem for stochastic systems
with respect to such logics is typically solved by a numerical approach that
iteratively computes (or approximates) the exact measure of paths satisfying
relevant subformulas; the algorithms themselves depend on the class of systems
being analyzed as well as the logic used for specifying the properties. Another
approach to solving the model checking problem is to simulate the system for finitely many runs, and to use hypothesis testing to infer whether the samples provide statistical evidence for the satisfaction or violation
of the specification. In this short paper, we survey the statistical approach,
and outline its main advantages in terms of efficiency, uniformity, and
simplicity.
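As an illustration of the simulate-and-test idea, here is a minimal Python sketch of Wald's sequential probability ratio test, a test commonly used in statistical model checking; the function names and parameters are assumptions for the example, not any particular tool's API. It decides between "the satisfaction probability is at least theta" and "it is below theta", up to an indifference region of width 2*delta.

import math
import random

def sprt(sample_run, theta, delta=0.05, alpha=0.01, beta=0.01):
    # sample_run() simulates one run of the system and returns True iff the
    # run satisfies the temporal property being checked.
    p0, p1 = theta + delta, theta - delta        # indifference region around theta
    accept_h0 = math.log(beta / (1 - alpha))     # lower boundary: conclude P >= theta
    accept_h1 = math.log((1 - beta) / alpha)     # upper boundary: conclude P < theta
    llr, runs = 0.0, 0                           # log likelihood ratio, sample count
    while accept_h0 < llr < accept_h1:
        runs += 1
        if sample_run():
            llr += math.log(p1 / p0)
        else:
            llr += math.log((1 - p1) / (1 - p0))
    return ("P >= theta" if llr <= accept_h0 else "P < theta"), runs

# Hypothetical system whose runs satisfy the property with probability 0.9;
# we test the threshold theta = 0.8.
verdict, n = sprt(lambda: random.random() < 0.9, theta=0.8)
print(verdict, "after", n, "simulated runs")

The attraction, as the survey notes, is that the number of simulated runs adapts to how far the true probability is from the threshold, and no numerical computation over the full state space is needed.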
The Audit Logic: Policy Compliance in Distributed Systems
We present a distributed framework where agents can share data along with usage policies. We use an expressive policy language that includes conditions, obligations, and delegation. Our framework also supports the refinement of policies. Policies are not enforced a priori; instead, policy compliance is checked using an a-posteriori auditing approach. Policy compliance is shown by a (logical) proof whose validity the authority can systematically check. Tools for automatically checking and generating proofs are also part of the framework.
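Purely to illustrate the a-posteriori flavor of such a framework, here is a toy Python sketch in which uses of data are logged first and audited later against the attached policy. It flattens the paper's proof-based compliance checking into a direct check, and every name in it is hypothetical.

from dataclasses import dataclass, field

@dataclass
class Policy:
    allowed_purposes: set       # purposes the data may be used for
    obligations: set            # actions the user must also have performed

@dataclass
class LoggedUse:
    agent: str
    purpose: str
    performed: set = field(default_factory=set)

def audit(policy, log):
    # Return the logged uses that cannot be justified against the policy;
    # in the paper these justifications are logical proofs, not boolean checks.
    return [use for use in log
            if use.purpose not in policy.allowed_purposes
            or not policy.obligations <= use.performed]

policy = Policy({"research"}, {"notify_owner"})
log = [LoggedUse("alice", "research", {"notify_owner"}),
       LoggedUse("bob", "marketing")]
print(audit(policy, log))       # only bob's use is flagged for the authority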
Ground terminal expert (GTEX). Part 2: Expert system diagnostics for a 30/20 Gigahertz satellite transponder
A research effort was undertaken to investigate how expert system technology could be applied to a satellite communications system. The focus of the expert system is the satellite earth station. A proof-of-concept expert system called the Ground Terminal Expert (GTEX) was developed at the University of Akron in collaboration with the NASA Lewis Research Center. With the increasing demand for satellite earth stations, maintenance is becoming a vital issue, and vendors of such systems will be looking for cost-effective means of maintaining them. The objective of GTEX is to aid in the diagnosis of faults occurring in the digital earth station. GTEX was developed on a personal computer using the Automated Reasoning Tool for Information Management (ART-IM) developed by the Inference Corporation. Developed for the Phase 2 digital earth station, GTEX is part of the Systems Integration Test and Evaluation (SITE) facility located at the NASA Lewis Research Center.
Applying Formal Methods to Networking: Theory, Techniques and Applications
Despite its great importance, modern network infrastructure is remarkable for
the lack of rigor in its engineering. The Internet, which began as a research experiment, was never designed to handle the users and applications it hosts today. The lack of formalization of the Internet architecture meant limited abstractions and modularity, especially for the control and management planes, thus requiring a new protocol built from scratch for every new need. This led to an unwieldy, ossified Internet architecture resistant to any attempt at formal verification, and an Internet culture where expediency and pragmatism
are favored over formal correctness. Fortunately, recent work in the space of
clean-slate Internet design, especially the software-defined networking (SDN) paradigm, offers the Internet community another chance to develop the right
kind of architecture and abstractions. This has also led to a great resurgence of interest in applying formal methods to the specification, verification, and synthesis of networking protocols and applications. In this paper, we present a self-contained tutorial on the formidable amount of work that has been done in formal methods, together with a survey of its applications to networking. Comment: 30 pages, submitted to IEEE Communications Surveys and Tutorials