Formal Foundations for Provably Safe Web Components
Reusable software components are one of the cornerstones of modern software
development, enabling the creation of sophisticated software systems. The
fast-paced, business-driven web ecosystem in particular needs a robust and safe
way of reusing components. As it stands, however, the concepts and functions
needed to create web components are scattered, immature, and not clearly
defined, leaving much room for misunderstanding.
To improve the situation, we need to look at the core of web browsers: the
Document Object Model (DOM). It represents the state of a website with which
users and client-side code (JavaScript) interact. This central position makes
the DOM the most critical part of a web browser with respect to safety and
security, so we need to understand exactly what it does and which guarantees it
provides. A well-established approach for this kind of highly critical system
is to apply formal methods to mathematically prove certain properties.
In this thesis, we provide a formal analysis of web components based on shadow
roots, highlight their shortcomings by proving them unsafe in many
circumstances, and propose suggestions to provably improve their safety. In
more detail, we build a formalisation of the Core DOM in Isabelle/HOL into
which we introduce shadow roots. Then, we extract novel properties and
invariants that strengthen the often implicit assumptions of the standard. We
show that the model complies with the standard by symbolically evaluating all
relevant test cases from the official compliance suite successfully on our
model. We introduce novel definitions of web components and their safety and
classify the most important DOM APIs accordingly, by which we uncover
surprising behavior and shortcomings. Finally, we propose changes to the DOM
standard by altering our model and proving that the safety of many DOM API
methods improves while the API becomes less ambiguous.
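The encapsulation property at stake can be illustrated with a toy model (a
Python sketch with hypothetical names, not the thesis's Isabelle/HOL
formalisation): document-wide queries should not descend into a shadow root,
so a component's internals stay hidden from outside code.

```python
# Toy DOM-like tree in which tag queries traverse ordinary children only,
# mirroring the encapsulation a shadow root is meant to provide.
# All class and function names here are illustrative, not part of any real API.

class Node:
    def __init__(self, tag, children=None, shadow=None):
        self.tag = tag
        self.children = children or []
        self.shadow = shadow  # an attached shadow root (a Node), or None

def find_all(root, tag):
    """Collect nodes with a given tag via ordinary child traversal.

    Like a document-wide query, this deliberately never pierces a
    shadow root, so shadow-tree internals are invisible from outside.
    """
    found = [root] if root.tag == tag else []
    for child in root.children:
        found.extend(find_all(child, tag))
    return found

# A component whose internals live behind a shadow root.
internals = Node("div", [Node("input")])
widget = Node("my-widget", shadow=internals)
document = Node("html", [Node("body", [widget, Node("input")])])

# The document-wide query sees only the light-DOM input, not the shadow one.
print([n.tag for n in find_all(document, "input")])  # → ['input']
```

Proving which DOM API methods preserve (or break) exactly this kind of
invariant is what the classification in the thesis is about.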
Creative Thinking and Modelling for the Decision Support in Water Management
This paper reviews the state of the art in knowledge and preference elicitation techniques. The purpose of the study was to evaluate various cognitive mapping techniques in order to identify the optimal technique for the NetSyMod methodology. The Network Analysis – Creative System Modelling (NetSyMod) methodology has been designed to improve decision support systems (DSS) for environmental problems. The paper distinguishes between knowledge and preference elicitation methods for experts and for stakeholders. The suggested technique is very similar to the Nominal Group Technique (NGT), with external representation of the analysed problem by means of Hodgson Hexagons. The evolving methodology is undergoing tests within several EU-funded projects such as ITAES, IISIM, and NostrumDSS.
Keywords: Creative modelling, Cognitive mapping, Preference elicitation techniques, Decision support
Automatic Generation of Minimal Cut Sets
A cut set is a collection of component failure modes that could lead to a
system failure. Cut Set Analysis (CSA) is applied to critical systems to
identify and rank system vulnerabilities at design time. Model checking tools
have been used to automate the generation of minimal cut sets but are generally
based on checking reachability of system failure states. This paper describes a
new approach to CSA using a Linear Temporal Logic (LTL) model checker called BT
Analyser that supports the generation of multiple counterexamples. The approach
enables a broader class of system failures to be analysed, by generalising from
failure state formulae to failure behaviours expressed in LTL. The traditional
approach to CSA using model checking requires the model or system failure to be
modified, usually by hand, to eliminate already-discovered cut sets, and the
model checker to be rerun, at each step. By contrast, the new approach works
incrementally and fully automatically, thereby removing the tedious and
error-prone manual process and resulting in significantly reduced computation
time. This in turn enables larger models to be checked. Two different
strategies for using BT Analyser for CSA are presented. There is generally no
single best strategy for model checking: their relative efficiency depends on
the model and property being analysed. Comparative results are given for the
A320 hydraulics case study in the Behavior Tree modelling language.
Comment: In Proceedings ESSS 2015, arXiv:1506.0325
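The blocking idea behind incremental cut set generation can be sketched in a
few lines (a generic illustration, not BT Analyser's LTL-based algorithm):
each minimal cut set found rules out all of its supersets, so later search
rounds only return genuinely new minimal sets, with no manual model editing
between runs. The example system and component names below are made up.

```python
from itertools import combinations

def minimal_cut_sets(components, fails):
    """Enumerate minimal cut sets of a monotone failure predicate.

    `fails` takes a frozenset of failed components and returns True if the
    system fails. Searching in order of increasing cardinality and blocking
    every superset of an already-found cut set guarantees minimality.
    """
    found = []
    for k in range(1, len(components) + 1):
        for combo in combinations(components, k):
            cs = frozenset(combo)
            # Blocked: a previously found cut set is contained in this one.
            if any(prev <= cs for prev in found):
                continue
            if fails(cs):
                found.append(cs)
    return found

# Hypothetical system: fails if pumps P1 AND P2 fail, or if valve V fails alone.
def system_fails(failed):
    return "V" in failed or {"P1", "P2"} <= failed

cut_sets = minimal_cut_sets(["P1", "P2", "V"], system_fails)
print(sorted(sorted(cs) for cs in cut_sets))  # → [['P1', 'P2'], ['V']]
```

The approach described in the paper generalises the failure condition from a
state predicate like `system_fails` to an arbitrary LTL behaviour, with the
model checker supplying the counterexamples instead of exhaustive enumeration.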
Rethinking fuelwood: people, policy and the anatomy of a charcoal supply chain in a decentralizing Peru
In Peru, as in many developing countries, charcoal is an important source of fuel. We examine the commercial charcoal commodity chain from its production in Ucayali, in the Peruvian Amazon, to its sale in the national market. Using a mixed-methods approach, we look at the actors involved in the commodity chain and their relationships, including the distribution of benefits along the chain. We outline the obstacles and opportunities for a more equitable charcoal supply chain within a multi-level governance context. The results show that charcoal provides an important livelihood for most of the actors along the supply chain, including the rural poor and women. We find that the decentralisation process in Peru has implications for the formalisation of charcoal supply chains, a traditionally informal sector, particularly with respect to multi-level institutional obstacles to equitable commerce. This results in inequity in the supply chain, which penalises the poorest participants and favours the most powerful actors.
Wilsonian renormalization, differential equations and Hopf algebras
In this paper, we present an algebraic formalism inspired by Butcher's
B-series in numerical analysis and the Connes-Kreimer approach to perturbative
renormalization. We first define power series of nonlinear operators and
propose several applications, among which the perturbative solution of a fixed
point equation using the nonlinear geometric series. Then, following
Polchinski, we show how perturbative renormalization works for a nonlinear
perturbation of a linear differential equation that governs the flow of
effective actions. Then, we define a general Hopf algebra of Feynman diagrams
adapted to iterations of background field effective action computations. As a
simple combinatorial illustration, we show how these techniques can be used to
recover the universality of the Tutte polynomial and its relation to the
q-state Potts model. As a more sophisticated example, we use ordered diagrams
with decorations and external structures to solve Polchinski's exact
renormalization group equation. Finally, we work out an analogous construction
for the Schwinger-Dyson equations, which yields a bijection between planar
diagrams and a certain class of decorated rooted trees.
Comment: 42 pages, 26 figures in PDF format, extended version of a talk given
at the conference "Combinatorics and physics" held at the Max Planck Institut
fuer Mathematik in Bonn in March 2007; some misprints corrected
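The perturbative solution of a fixed-point equation mentioned above can be
made concrete with a toy instance (a Python sketch under my own choice of
equation, not the paper's Hopf-algebraic machinery): iterating x = 1 + ε x² on
power series truncated at a fixed order in ε. The resulting coefficients are
the Catalan numbers, which count rooted binary trees, echoing the tree
combinatorics that organises such expansions.

```python
def mul_trunc(a, b, order):
    """Multiply two power series (coefficient lists in eps), truncated at eps^order."""
    out = [0] * (order + 1)
    for i, ai in enumerate(a):
        for j, bj in enumerate(b):
            if i + j <= order:
                out[i + j] += ai * bj
    return out

def solve_fixed_point(order):
    """Solve x = 1 + eps * x^2 perturbatively, as a truncated series in eps."""
    x = [0] * (order + 1)
    for _ in range(order + 1):  # each pass fixes one more coefficient
        sq = mul_trunc(x, x, order)
        x = [1] + [sq[i] for i in range(order)]  # 1 + eps * x^2, shifted by one power
    return x

print(solve_fixed_point(6))  # → [1, 1, 2, 5, 14, 42, 132]  (Catalan numbers)
```

Each iteration of the map stabilises one further order of the expansion, which
is the elementary counterpart of summing the nonlinear geometric series.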