Detecting Ontological Conflicts in Protocols between Semantic Web Services
The task of verifying the compatibility between interacting web services has
traditionally been limited to checking the compatibility of the interaction
protocol in terms of message sequences and the types of data being exchanged.
Since web services are developed largely in an uncoordinated way, different
services often use independently developed ontologies for the same domain
instead of adhering to a single standard ontology. In this work we investigate
approaches by which a server, given a published client ontology, can verify
whether executing a protocol with that client can reach a state with
semantically inconsistent results. Often a database is used to store the
actual data alongside the ontologies, rather than storing the data as part of
the ontology description itself. It is important to observe that, for the
current state of the database, the semantic conflict state may be unreachable
even when the server's verification indicates that a conflict state is
possible. A relational-algebra-based decision procedure is therefore also
developed to incorporate the current state of the client and server databases
into the overall verification procedure.
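As a toy illustration of the kind of conflict being checked for (not the paper's relational-algebra procedure), the Python sketch below flags an instance that two independently developed ontologies, merged for verification, classify under classes declared disjoint; the class names and axioms are invented for the example.

```python
# Minimal sketch (illustrative, not the paper's algorithm): detect a semantic
# conflict where one instance falls under two classes declared disjoint.

def superclasses(cls, subclass_of):
    """All transitive superclasses of cls, including cls itself."""
    seen, stack = set(), [cls]
    while stack:
        c = stack.pop()
        if c in seen:
            continue
        seen.add(c)
        stack.extend(subclass_of.get(c, ()))
    return seen

def has_conflict(assertions, subclass_of, disjoint):
    """True if some instance is (transitively) a member of two disjoint classes."""
    by_instance = {}
    for inst, cls in assertions:
        by_instance.setdefault(inst, set()).update(superclasses(cls, subclass_of))
    for inst, classes in by_instance.items():
        for a, b in disjoint:
            if a in classes and b in classes:
                return True, (inst, a, b)
    return False, None

# Hypothetical example: the server classifies a message payload as Invoice,
# the client ontology classifies the same payload as Receipt, and the merged
# view declares the two classes disjoint.
subclass_of = {"Invoice": ["Document"], "Receipt": ["Document"]}
disjoint = [("Invoice", "Receipt")]
assertions = [("msg42", "Invoice"), ("msg42", "Receipt")]  # from a protocol run
print(has_conflict(assertions, subclass_of, disjoint))
# (True, ('msg42', 'Invoice', 'Receipt'))
```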
A Multi-objective Perspective for Operator Scheduling using Fine-grained DVS Architecture
The stringent power budgets of fine-grained power-managed digital integrated
circuits have driven chip designers to optimize power at the cost of area and
delay, which were the traditional cost criteria for circuit optimization. This
emerging scenario motivates us to revisit the classical operator scheduling
problem given the availability of DVFS-enabled functional units that can trade
off cycles for power. We study the design space defined by this trade-off and
present a branch-and-bound (B&B) algorithm to explore the state space and
report the Pareto-optimal front with respect to area and power. The scheduling
also aims at maximal resource sharing and attains substantial area and power
gains on complex benchmarks when timing constraints are sufficiently relaxed.
Experimental results show that the algorithm solves the problem for most
available benchmarks without any user constraint (area/power), and that
imposing a power or area budget leads to significant performance gains.
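To make the cycles-versus-power trade-off concrete, here is a minimal branch-and-bound sketch; it keeps a Pareto front over (cycles, power) for a serial schedule under a deadline, a deliberate simplification of the paper's area/power formulation and resource-sharing model. The mode table, operation count, and deadline are made-up example data.

```python
# Illustrative B&B sketch: each operation picks a DVFS mode trading cycles
# for power; the serial schedule must meet a deadline; we keep the Pareto
# front over (total_cycles, total_power).
MODES = [(1, 4.0), (2, 2.5), (3, 1.5)]  # (cycles, power) per operation

def dominated(point, front):
    c, p = point
    return any(fc <= c and fp <= p for fc, fp in front)

def bnb(n_ops, deadline):
    front = []  # Pareto-optimal (total_cycles, total_power) points found so far

    def expand(i, cycles, power):
        # Prune: infeasible, or already dominated (completions only add cost).
        if cycles > deadline or dominated((cycles, power), front):
            return
        if i == n_ops:
            front[:] = [q for q in front if not (cycles <= q[0] and power <= q[1])]
            front.append((cycles, power))
            return
        for c, p in MODES:
            expand(i + 1, cycles + c, power + p)

    expand(0, 0, 0.0)
    return sorted(front)

print(bnb(n_ops=4, deadline=8))
# [(4, 16.0), (5, 14.5), (6, 13.0), (7, 11.5), (8, 10.0)]
```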
DietCNN: Multiplication-free Inference for Quantized CNNs
The rising demand for networked embedded systems with machine intelligence
has been a catalyst for sustained attempts by the research community to
implement Convolutional Neural Networks (CNN) based inferencing on embedded
resource-limited devices. Redesigning a CNN by removing costly multiplication
operations has already shown promising results in terms of reducing inference
energy usage. This paper proposes a new method that replaces the
multiplications in a CNN with table look-ups. Unlike existing methods that
completely modify the CNN operations, the proposed methodology preserves the
semantics of the major CNN operations. Conforming to the existing mechanism of
the CNN layer operations ensures that the reliability of a standard CNN is
preserved. It is shown that the proposed multiplication-free CNN, based on a
single activation codebook, achieves 4.7x, 5.6x, and 3.5x reductions in energy
per inference for FPGA implementations of MNIST-LeNet-5, CIFAR10-VGG-11, and
Tiny ImageNet-ResNet-18, respectively. Our results show that the DietCNN
approach significantly reduces the resource consumption and latency of deep
inference for the smaller models often used in embedded systems. Our code is
available at:
https://github.com/swadeykgp/DietCNN
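The core idea, replacing each multiply-accumulate with an index into a precomputed product table, can be sketched in a few lines of NumPy; the uniform codebooks and their sizes below are simplifying assumptions for illustration, not the paper's learned codebooks or its full layer-level scheme.

```python
# Sketch of multiplication-free dot products via table look-up: quantize
# values to small codebooks once, precompute all pairwise products, then
# compute with only indexing and addition.
import numpy as np

K, W = 16, 16  # assumed codebook sizes (illustrative)
act_codebook = np.linspace(-1.0, 1.0, K)
wgt_codebook = np.linspace(-0.5, 0.5, W)

# Precompute all K*W products once; inference then only indexes and adds.
product_lut = np.outer(act_codebook, wgt_codebook)  # shape (K, W)

def quantize(x, codebook):
    """Map each value to the index of its nearest codebook entry."""
    return np.abs(x[:, None] - codebook[None, :]).argmin(axis=1)

def lut_dot(a_idx, w_idx):
    """Dot product computed purely by table look-ups and additions."""
    return product_lut[a_idx, w_idx].sum()

rng = np.random.default_rng(0)
a = rng.uniform(-1, 1, 64)
w = rng.uniform(-0.5, 0.5, 64)
a_idx, w_idx = quantize(a, act_codebook), quantize(w, wgt_codebook)
print(lut_dot(a_idx, w_idx), "vs", float(a @ w))  # LUT result tracks the true dot product
```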
Complexity of compositional model checking of computation tree logic on simple structures
Temporal logic model checking is one of the most potent tools for the verification of finite state systems. Computation Tree Logic (CTL) has gained popularity because, unlike most other logics, CTL model checking of a single transition system can be achieved in polynomial time. However, in most real-life problems, especially in distributed and parallel systems, the system consists of a set of concurrent processes, and the verification problem translates to model checking the composition of the component processes. Since explicit composition leads to state explosion, verifying the system without actually composing the components is attractive, even for possibly restrictive classes of systems. We show that the problem of compositional CTL model checking is PSPACE-complete for the class of systems composed of components that are tree-like transition structures and do not interact among themselves. For the simplest forms of existential and universal CTL formulas, model checking turns out to be NP-complete and coNP-complete, respectively. The results hold for both synchronous and asynchronous composition.
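For context, the polynomial-time labelling idea for a single explicit transition system can be sketched as follows for the EF operator (backward reachability to the p-states); the states, edges, and atomic-proposition labelling are example data.

```python
# Labelling-style check of EF p on an explicit transition system: a state
# satisfies EF p iff some path from it reaches a state labelled p.

def sat_EF(states, edges, sat_p):
    """States from which some path reaches a p-state, via a backward fixpoint."""
    preds = {s: set() for s in states}
    for u, v in edges:
        preds[v].add(u)
    result, frontier = set(sat_p), list(sat_p)
    while frontier:
        v = frontier.pop()
        for u in preds[v]:
            if u not in result:  # u has a successor already satisfying EF p
                result.add(u)
                frontier.append(u)
    return result

states = {0, 1, 2, 3}
edges = [(0, 1), (1, 2), (2, 2), (0, 3), (3, 3)]
print(sorted(sat_EF(states, edges, sat_p={2})))  # [0, 1, 2]
```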
What lies between design intent coverage and model checking?
Practitioners of formal property verification often work around the capacity limitations of formal verification tools by breaking down properties into smaller properties that can be checked on the sub-modules of the parent module. To support this methodology, we previously developed a formal method for verifying whether such a decomposition is sound and complete, that is, whether verifying the smaller properties on the sub-modules actually guarantees the original property on the parent module. In practice, however, designers do not write properties for all modules, and our previous methodology was therefore applicable only in selected cases. In this paper we present new formal methods that allow us to handle RTL blocks in the analysis. We believe the new approach will significantly widen the scope of the methodology, enabling the validation engineer to handle much larger designs than existing formal verification tools admit.
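As a heavily simplified propositional analogue of this decomposition check (the paper works at the RTL level, not on toy Boolean formulas), the sketch below searches for a valuation where hypothetical sub-module properties hold but the parent property fails; finding none means the decomposition covers the parent property.

```python
# Propositional analogue: does (p1 AND p2) imply parent on all valuations?
# The three properties and the variable set are invented for illustration.
from itertools import product

VARS = ["req", "gnt", "busy"]

def p1(v):  # sub-property 1: a grant only occurs on a request
    return (not v["gnt"]) or v["req"]

def p2(v):  # sub-property 2: a grant implies the unit is busy
    return (not v["gnt"]) or v["busy"]

def parent(v):  # parent property: a grant implies request and busy
    return (not v["gnt"]) or (v["req"] and v["busy"])

def decomposition_complete():
    for bits in product([False, True], repeat=len(VARS)):
        v = dict(zip(VARS, bits))
        if p1(v) and p2(v) and not parent(v):
            return False, v  # counterexample: sub-properties hold, parent fails
    return True, None

print(decomposition_complete())  # (True, None): this decomposition suffices
```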