Honesty by typing
We propose a type system for a calculus of contracting processes. Processes may stipulate contracts and then either behave honestly, by keeping the promises made, or not. Type safety guarantees that a typeable process is honest, that is, the process abides by the contract it has stipulated in all possible contexts, even those containing dishonest adversaries.
Model checking usage policies
We study usage automata, a formal model for specifying policies on the usage of resources. Usage automata extend finite state automata with two additional features, parameters and guards, which improve their expressiveness. We show that usage automata are expressive enough to model policies of real-world applications. We discuss their expressive power, and we prove that the problem of telling whether a computation complies with a usage policy is decidable. The main contribution of this paper is a model checking technique for usage automata. The model is that of usages, i.e. basic processes that describe the possible patterns of resource access and creation. Although the model is infinite-state, because of recursion and resource creation, we devise a polynomial-time model checking technique for deciding when a usage complies with a usage policy.
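As a rough illustration of the idea, the sketch below encodes a usage automaton as a finite-state machine with guarded, parameterized edges and replays a trace of resource-access events against it. The event model, the determinism assumption, and the file-disposal policy are our own simplifications, not the paper's formal definitions.

```python
from dataclasses import dataclass
from typing import Callable

@dataclass
class Edge:
    src: str                        # source state
    action: str                     # e.g. "read", "write", "dispose"
    guard: Callable[[dict], bool]   # predicate over the event's parameters
    dst: str                        # target state

class UsageAutomaton:
    def __init__(self, start, offending, edges):
        self.start, self.offending, self.edges = start, offending, edges

    def complies(self, trace):
        """Replay a trace of (action, params) events; the trace violates the
        policy as soon as an offending state is reached. Events with no
        matching edge leave the state unchanged."""
        state = self.start
        for action, params in trace:
            for e in self.edges:
                if e.src == state and e.action == action and e.guard(params):
                    state = e.dst
                    break               # assumes a deterministic automaton
            if state in self.offending:
                return False
        return True

# Example policy: a resource must not be read after it has been disposed.
policy = UsageAutomaton(
    start="ok",
    offending={"fail"},
    edges=[Edge("ok", "dispose", lambda p: True, "disposed"),
           Edge("disposed", "read", lambda p: True, "fail")],
)
print(policy.complies([("read", {}), ("dispose", {}), ("read", {})]))  # False
```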
Formal Models of Bitcoin Contracts: A Survey
Although Bitcoin is mostly used as a decentralized application to transfer cryptocurrency, over the last 10 years there have been several studies on how to exploit Bitcoin to execute smart contracts. These are computer protocols that allow users to exchange bitcoins according to complex pre-agreed rules. Some of these studies introduce formal models of Bitcoin contracts, which specify their behavior in non-ambiguous terms, in some cases providing tools to automatically verify relevant contract properties. In this paper, we survey the formal models proposed in the scientific literature, comparing their expressiveness and applicability in the wild.
A Computationally Light Pruning Strategy for Single Layer Neural Networks based on Threshold Function
Embedded machine learning relies on inference functions that can fit resource-constrained, low-power computing devices. The literature proves that single layer neural networks using threshold functions can provide a suitable trade-off between classification accuracy and computational cost. In this regard, the number of neurons directly impacts both computational complexity and resource allocation. Thus, the present research aims at designing an efficient pruning technique that can take into account the peculiarities of the threshold function. The paper shows that feature selection criteria based on filter models can effectively be applied to neuron selection. In particular, valuable outcomes can be obtained by designing ad-hoc objective functions for the selection process. An extensive experimental campaign confirms that the proposed objective function compares favourably with state-of-the-art pruning techniques.
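To make the filter-based selection idea concrete, here is a hedged NumPy sketch that ranks the hidden neurons of a single layer of threshold units and keeps the top scorers. The correlation-based score is a stand-in for the paper's ad-hoc objective function, and the data are synthetic.

```python
import numpy as np

def threshold_layer(X, W, b):
    """Hidden layer of threshold (sign) units: h_j = sign(w_j . x + b_j)."""
    return np.sign(X @ W.T + b)

def prune_neurons(X, y, W, b, keep):
    """Keep the `keep` neurons whose binary outputs best track the labels y."""
    H = threshold_layer(X, W, b)                       # (n_samples, n_neurons)
    # Filter criterion: absolute correlation of each neuron's output with y
    # (nan_to_num guards against neurons with constant output).
    scores = np.nan_to_num(
        np.abs([np.corrcoef(H[:, j], y)[0, 1] for j in range(H.shape[1])]))
    selected = np.argsort(scores)[::-1][:keep]         # top-`keep` neurons
    return W[selected], b[selected], selected

# Toy usage on synthetic data.
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 10))
y = np.sign(X[:, 0] + 0.1 * rng.normal(size=200))      # labels in {-1, +1}
W, b = rng.normal(size=(32, 10)), rng.normal(size=32)
W_small, b_small, kept = prune_neurons(X, y, W, b, keep=8)
print(kept)
```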
Effects of grazing intensity and the use of veterinary medical products on dung beetle biodiversity in the sub-mountainous landscape of Central Italy
Grazing extensification and intensification are among the main problems affecting European grasslands. We analyze the impact of grazing intensity (low and moderate) and the use of veterinary medical products (VMPs) on the dung beetle community in the province of Pesaro-Urbino (Italy). Grazing intensity is a key factor in explaining the diversity of dung beetles. In the case of the alpha diversity component, sites with a low level of grazing activity (linked in a previous step to the subsequent abandonment of traditional farming) are characterized by a loss of species richness (q = 0) and a reduction in alpha diversity at the levels q = 1 and q = 2. In the case of beta diversity, sites with different grazing intensities show remarkable differences in the composition of their species assemblages. The use of VMPs is another important factor in explaining changes in dung beetle diversity. In sites with a traditional use of VMPs, a significant loss of species richness and biomass is observed, as is a notable effect on beta diversity. In addition, the absence of indicator species in sites with a historical use of VMPs corroborates the hypothesis that these substances have a ubiquitous effect on dung beetles. However, the interaction between grazing activity and VMPs in explaining changes in dung beetle diversity is less significant (or not significant) than the main effects (each factor separately) for alpha diversity, biomass and species composition. This may be explained if we consider that both factors affect the various species differently. In other words, the reduction in dung availability affects several larger species more than it does very small species, although this does not imply that the former are more susceptible to injury caused by the ingestion of dung contaminated with VMPs. Finally, in order to prevent negative consequences for dung beetle diversity, we propose the maintenance of a moderate grazing intensity and the rational use of VMPs. It is our view that organic management can prevent excessive extensification while providing an economic stimulus to the sector. Simultaneously, it can also prevent the abuse of VMPs. Financial support was partially provided by Project CGL2015-68207-R of the Secretaría de Estado de Investigación, Desarrollo e Innovación of the Ministerio de Economía y Competitividad of Spain. Mattia Tonelli benefited from an Italian ministerial PhD scholarship.
A Deep Learning approach to Reduced Order Modelling of Parameter Dependent Partial Differential Equations
Within the framework of parameter dependent PDEs, we develop a constructive approach based on Deep Neural Networks for the efficient approximation of the parameter-to-solution map. The research is motivated by the limitations and drawbacks of state-of-the-art algorithms, such as the Reduced Basis method, when addressing problems that show a slow decay in the Kolmogorov n-width. Our work is based on the use of deep autoencoders, which we employ for encoding and decoding a high fidelity approximation of the solution manifold. In order to fully exploit the approximation capabilities of neural networks, we consider a nonlinear version of the Kolmogorov n-width over which we base the concept of a minimal latent dimension. We show that this minimal dimension is intimately related to the topological properties of the solution manifold, and we provide some theoretical results with particular emphasis on second order elliptic PDEs. Finally, we report numerical experiments where we compare the proposed approach with classical POD-Galerkin reduced order models. In particular, we consider parametrized advection-diffusion PDEs, and we test the methodology in the presence of strong transport fields, singular terms and stochastic coefficients.
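As a rough sketch of this kind of pipeline, the PyTorch snippet below pairs an autoencoder over high-fidelity snapshots with a second network mapping parameters to latent codes. The architectures, the joint loss, and all dimensions are illustrative assumptions, not the paper's actual setup.

```python
import torch
import torch.nn as nn

N_h, n, p = 1024, 4, 2   # high-fidelity dimension, latent dimension, no. of parameters

encoder = nn.Sequential(nn.Linear(N_h, 256), nn.ReLU(), nn.Linear(256, n))
decoder = nn.Sequential(nn.Linear(n, 256), nn.ReLU(), nn.Linear(256, N_h))
phi = nn.Sequential(nn.Linear(p, 64), nn.ReLU(), nn.Linear(64, n))  # mu -> latent code

opt = torch.optim.Adam([*encoder.parameters(), *decoder.parameters(),
                        *phi.parameters()], lr=1e-3)
mse = nn.MSELoss()

def train_step(mu, u):
    """mu: (batch, p) parameters, u: (batch, N_h) high-fidelity snapshots."""
    z = encoder(u)
    loss = mse(decoder(z), u) + mse(phi(mu), z)  # reconstruction + latent regression
    opt.zero_grad(); loss.backward(); opt.step()
    return loss.item()

loss = train_step(torch.rand(8, p), torch.rand(8, N_h))  # one step on dummy data

# At query time the full-order solver is bypassed: u(mu) ~ decoder(phi(mu)).
with torch.no_grad():
    u_pred = decoder(phi(torch.rand(1, p)))
```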
Modelling and verifying contract-oriented systems in Maude
We address the problem of modelling and verifying contract-oriented systems, wherein distributed agents may advertise and stipulate contracts, but — differently from most other approaches to distributed agents — are not assumed to always behave “honestly”. We describe an executable specification in Maude of the semantics of CO2, a calculus for contract-oriented systems [6]. The honesty property [5] characterises those agents which always respect their contracts, in all possible execution contexts. Since there is an infinite number of such contexts, honesty cannot be directly verified by model-checking the state space of an agent (indeed, honesty is an undecidable property in general [5]). The main contribution of this paper is a sound verification technique for honesty. To this end, we safely over-approximate the honesty property by abstracting from the actual contexts a process may be engaged with. Then, we develop a model-checking technique for this abstraction, we describe an implementation in Maude, and we discuss some experiments with it.
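The soundness idea behind the abstraction can be illustrated independently of CO2: if no culpable state is reachable when the context is abstracted away (i.e. allowed to behave arbitrarily, an over-approximation of every concrete context), then none is reachable in any concrete context. The toy transition system below is our own invention, not the paper's Maude specification.

```python
def honest_in_abstraction(init, step, culpable):
    """Sound check: explore the agent's states under an abstract context
    that may behave arbitrarily. If no culpable state is reachable in this
    over-approximation, none is reachable in any concrete context."""
    seen, frontier = {init}, [init]
    while frontier:
        s = frontier.pop()
        if culpable(s):
            return False
        for t in step(s):
            if t not in seen:
                seen.add(t)
                frontier.append(t)
    return True

# Toy agent: whenever the abstract context activates a contract ("idle" ->
# "obliged"), the agent immediately keeps its promise, so the culpable
# state is unreachable and the check succeeds.
successors = {"idle": {"obliged"},        # context stipulates a contract
              "obliged": {"fulfilled"},   # agent performs the promised action
              "fulfilled": {"idle"}}
print(honest_in_abstraction("idle",
                            lambda s: successors.get(s, set()),
                            lambda s: s == "culpable"))   # True
```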
A survey on deep learning in image polarity detection: Balancing generalization performances and computational costs
Deep convolutional neural networks (CNNs) provide an effective tool to extract complex information from images. In the area of image polarity detection, CNNs are customarily utilized in combination with transfer learning techniques to tackle a major problem: the unavailability of large sets of labeled data. Thus, polarity predictors in general exploit a pre-trained CNN as the feature extractor that in turn feeds a classification unit. While the latter unit is trained from scratch, the pre-trained CNN is subject to fine-tuning. As a result, the specific CNN architecture employed as the feature extractor strongly affects the overall performance of the model. This paper analyses state-of-the-art literature on image polarity detection and identifies the most reliable CNN architectures. Moreover, the paper provides an experimental protocol for assessing the role played by the baseline architecture in the polarity detection task. Performance is evaluated in terms of both generalization abilities and computational complexity. The latter attribute becomes critical as polarity predictors, in the era of social networks, might need to be updated within hours or even minutes. In this regard, the paper gives practical hints on the advantages and disadvantages of the examined architectures both in terms of generalization and computational cost.
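As a concrete example of this standard setup, the sketch below wires a torchvision ResNet-18 (one of many possible backbones, chosen here purely for illustration) to a freshly initialized classification head, with a smaller learning rate on the pre-trained weights for fine-tuning.

```python
import torch
import torch.nn as nn
from torchvision import models

# Pre-trained backbone as feature extractor (512-d features for ResNet-18).
backbone = models.resnet18(weights=models.ResNet18_Weights.DEFAULT)
backbone.fc = nn.Identity()

# Classification unit, trained from scratch on the polarity task.
head = nn.Sequential(nn.Linear(512, 128), nn.ReLU(), nn.Dropout(0.5),
                     nn.Linear(128, 2))   # positive / negative polarity

# Fine-tuning: small learning rate on pre-trained weights, larger on the head.
opt = torch.optim.Adam([
    {"params": backbone.parameters(), "lr": 1e-5},
    {"params": head.parameters(), "lr": 1e-3},
])

def forward(images):                      # images: (batch, 3, 224, 224)
    return head(backbone(images))

logits = forward(torch.rand(4, 3, 224, 224))   # -> shape (4, 2)
```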
Vicious circles in contracts and in logic
Contracts are formal promises about the future interactions of participants, which describe the causal dependencies among their actions. An inherent feature of contracts is that such dependencies may be circular: for instance, a buyer promises to pay for an item if the seller promises to ship it, and vice versa. We establish a bridge between two formal models for contracts, one based on games over event structures, and the other one on Propositional Contract Logic. In particular, we show that winning strategies in the game-theoretic model correspond to proofs in the logic.
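For readers unfamiliar with Propositional Contract Logic, the hallmark of its contractual implication is that mutual, circular promises become provable, which fails for ordinary implication. Hedging on notation (see the paper for the exact rules), the buyer/seller handshake reads roughly as follows:

```latex
% Circular promises are vacuous under ordinary (intuitionistic) implication:
%   (pay -> ship) /\ (ship -> pay)   does not entail   pay /\ ship.
% With the contractual implication (written \twoheadrightarrow) the
% handshake goes through:
\[
  (\mathit{pay} \twoheadrightarrow \mathit{ship}) \;\land\;
  (\mathit{ship} \twoheadrightarrow \mathit{pay})
  \;\vdash\; \mathit{pay} \land \mathit{ship}
\]
```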