Process Algebras
Process Algebras are mathematically rigorous languages with well defined semantics that permit describing and verifying properties of concurrent communicating systems.
They can be seen as models of processes, regarded as agents that act and interact continuously with other similar agents and with their common environment. The agents may be real-world objects (even people), or they may be artifacts, embodied perhaps in computer hardware or software systems.
Many different approaches (operational, denotational, algebraic) have been taken to describe the meaning of processes; however, the operational approach is the reference one. By relying on the so-called Structural Operational Semantics (SOS), labelled transition systems are built and composed by using the operators of the many different process algebras. Behavioral equivalences are used to abstract from unwanted details and to identify those systems that react similarly to external experiments.
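As a toy illustration of these notions, the sketch below builds a small labelled transition system and checks strong bisimilarity by naive greatest-fixed-point refinement. The states, labels, and example machines are invented for illustration and are not taken from any particular process algebra.

```python
def bisimilar(states, trans, p, q):
    """Naive strong-bisimilarity check.  trans: set of (src, label, dst)."""
    def matched(a, b, rel):
        # Every move of a must be matched by an equally labelled move of b
        # into a related state.
        return all(any((a2, b2) in rel
                       for (b1, m, b2) in trans if b1 == b and m == l)
                   for (a1, l, a2) in trans if a1 == a)

    # Start from the full relation and strip failing pairs until stable:
    # on a finite LTS this converges to the greatest bisimulation.
    rel = {(a, b) for a in states for b in states}
    changed = True
    while changed:
        changed = False
        for (a, b) in list(rel):
            if not (matched(a, b, rel) and matched(b, a, rel)):
                rel.discard((a, b))
                changed = True
    return (p, q) in rel

# Two vending machines: one with a two-state coin/coffee loop, one that
# unfolds the loop into three states -- behaviourally equivalent.
states = {"p0", "p1", "q0", "q1", "q2"}
trans = {("p0", "coin", "p1"), ("p1", "coffee", "p0"),
         ("q0", "coin", "q1"), ("q1", "coffee", "q2"),
         ("q2", "coin", "q1")}
print(bisimilar(states, trans, "p0", "q0"))  # True
```

The fixed-point loop only ever removes pairs that fail the transfer condition, so the surviving relation is exactly the largest bisimulation; more efficient partition-refinement algorithms exist, but this version keeps the definition visible.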
Discovering, quantifying, and displaying attacks
In the design of software and cyber-physical systems, security is often
perceived as a qualitative need, but can only be attained quantitatively.
Especially when distributed components are involved, it is hard to predict and
confront all possible attacks. A main challenge in the development of complex
systems is therefore to discover attacks, quantify them to comprehend their
likelihood, and communicate them to non-experts for facilitating the decision
process. To address this three-sided challenge we propose a protection analysis
over the Quality Calculus that (i) computes all the sets of data required by an
attacker to reach a given location in a system, (ii) determines the cheapest
set of such attacks for a given notion of cost, and (iii) derives an attack
tree that displays the attacks graphically. The protection analysis is first
developed in a qualitative setting, and then extended to quantitative settings
following an approach applicable to a great many contexts. The quantitative
formulation is implemented as an optimisation problem encoded into
Satisfiability Modulo Theories, allowing us to deal with complex cost
structures. The usefulness of the framework is demonstrated on a national-scale
authentication system, studied through a Java implementation of the framework.
Comment: LMCS Special Issue FORTE 201
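As a toy illustration of step (ii), the sketch below computes the cheapest attack set from an AND/OR attack tree by brute-force recursion. The paper's actual analysis works over the Quality Calculus with an SMT encoding; the tree shape, attack names, and costs here are entirely hypothetical.

```python
def min_cost(tree):
    """tree: ('leaf', name, cost) | ('AND', children) | ('OR', children).
    Returns (cost, attack_set) for the cheapest way to satisfy the root."""
    kind = tree[0]
    if kind == 'leaf':
        _, name, cost = tree
        return cost, {name}
    results = [min_cost(child) for child in tree[1]]
    if kind == 'AND':  # the attacker must satisfy every child
        return (sum(c for c, _ in results),
                set().union(*(s for _, s in results)))
    return min(results, key=lambda r: r[0])  # OR: the cheapest child suffices

# Hypothetical tree: reaching the target needs credentials AND network
# access; credentials come via phishing (cost 3) or a keylogger (cost 5).
tree = ('AND', [
    ('OR', [('leaf', 'phishing', 3), ('leaf', 'keylogger', 5)]),
    ('leaf', 'network_access', 2),
])
cost, attacks = min_cost(tree)
print(cost)  # 5, via phishing plus network access
```

Brute force is exponential in general; encoding the same minimisation into Satisfiability Modulo Theories, as the abstract describes, is what makes complex cost structures tractable.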
The role of concurrency in an evolutionary view of programming abstractions
In this paper we examine how concurrency has been embodied in mainstream programming languages. In particular, we rely on evolutionary metaphors borrowed from biology to discuss major historical landmarks and crucial concepts that shaped the development of programming languages. We examine the general development process, occasionally delving into particular languages, trying to uncover evolutionary lineages related to specific programming traits. We mainly focus on concurrency, discussing the different abstraction levels involved in present-day concurrent programming and emphasizing the fact that they correspond to different levels of explanation. We then comment on the role of theoretical research in the quest for suitable programming abstractions, recalling the importance of changing the working framework and the point of view every so often. This paper is not meant to be a survey of modern mainstream programming languages: it would be very incomplete in that sense. It aims instead at making a number of remarks and connecting them under an evolutionary perspective, in order to grasp a unifying, but not simplistic, view of the development process of programming languages.
Location-based and contextual mobile learning. A STELLAR Small-Scale Study
This study starts from several inputs that the partners have collected from previous and currently running research projects and from a workshop organised at the STELLAR Alpine Rendezvous 2010. In the study, several steps have been taken: firstly, a literature review and analysis of existing systems; secondly, a concept mapping study involving mobile learning experts to identify the main challenges that can be solved via mobile learning; and thirdly, an identification of educational patterns based on these examples.
Out of this study the partners aim to develop an educational framework for contextual learning as a unifying approach in the field. Therefore one of our central research questions is: how can we investigate, theorise, model and support contextual learning?
Adaptive Resonance: An Emerging Neural Theory of Cognition
Adaptive resonance is a theory of cognitive information processing which has been realized as a family of neural network models. In recent years, these models have evolved to incorporate new capabilities in the cognitive, neural, computational, and technological domains. Minimal models provide a conceptual framework, for formulating questions about the nature of cognition; an architectural framework, for mapping cognitive functions to cortical regions; a semantic framework, for precisely defining terms; and a computational framework, for testing hypotheses. These systems are here exemplified by the distributed ART (dART) model, which generalizes localist ART systems to allow arbitrarily distributed code representations, while retaining basic capabilities such as stable fast learning and scalability. Since each component is placed in the context of a unified real-time system, analysis can move from the level of neural processes, including learning laws and rules of synaptic transmission, to cognitive processes, including attention and consciousness. Local design is driven by global functional constraints, with each network synthesizing a dynamic balance of opposing tendencies. The self-contained working ART and dART models can also be transferred to technology, in areas that include remote sensing, sensor fusion, and content-addressable information retrieval from large databases.
Office of Naval Research and the Defense Advanced Research Projects Agency (N00014-95-1-0409, N00014-1-95-0657); National Institutes of Health (20-316-4304-5
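A minimal sketch of a localist Fuzzy ART category layer may help fix ideas; the dART model in the abstract generalizes such localist systems to distributed codes. The parameter values and toy inputs below are illustrative assumptions, not taken from the paper.

```python
def fuzzy_and(x, y):
    # Component-wise minimum, the "fuzzy AND" used throughout Fuzzy ART.
    return [min(a, b) for a, b in zip(x, y)]

class FuzzyART:
    def __init__(self, dim, rho=0.8, alpha=0.001):
        self.dim = dim      # raw input dimension
        self.rho = rho      # vigilance: higher values force finer categories
        self.alpha = alpha  # small choice parameter breaking ties
        self.w = []         # one weight vector per committed category

    def learn(self, a):
        # Complement coding keeps the input norm constant (|I| = dim) and
        # prevents runaway weight erosion.
        I = list(a) + [1.0 - x for x in a]
        # Search committed categories in order of the choice function
        # T_j = |I ^ w_j| / (alpha + |w_j|).
        order = sorted(range(len(self.w)),
                       key=lambda j: -(sum(fuzzy_and(I, self.w[j]))
                                       / (self.alpha + sum(self.w[j]))))
        for j in order:
            # Vigilance (match) test: |I ^ w_j| / |I| >= rho.
            if sum(fuzzy_and(I, self.w[j])) / self.dim >= self.rho:
                self.w[j] = fuzzy_and(I, self.w[j])  # stable fast learning
                return j
        self.w.append(I)  # no committed category matches: commit a new one
        return len(self.w) - 1

net = FuzzyART(dim=2)
print(net.learn([0.1, 0.1]))   # 0: first input commits category 0
print(net.learn([0.9, 0.9]))   # 1: distant input fails vigilance, new category
print(net.learn([0.12, 0.1]))  # 0: nearby input matches category 0 again
```

Raising the vigilance rho towards 1 makes the network carve the input space into ever finer categories; the fast-learning rule keeps committed weights stable under repeated presentations of the same inputs.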
Issues about the Adoption of Formal Methods for Dependable Composition of Web Services
Web Services provide interoperable mechanisms for describing, locating and invoking services over the Internet; composition further enables building complex services out of simpler ones for complex B2B applications. While current studies on these topics are mostly focused - from the technical viewpoint - on standards and protocols, this paper investigates the adoption of formal methods, especially for composition. We logically classify and analyze three different (but interconnected) kinds of important issues towards this goal, namely foundations, verification and extensions. The aim of this work is to identify the proper questions on the adoption of formal methods for dependable composition of Web Services, not necessarily to find the optimal answers. Nevertheless, we still try to propose some tentative answers based on our proposal for a composition calculus, which we hope can animate a proper discussion.
Topological derivation of shape exponents for stretched exponential relaxation
In homogeneous glasses, values of the important dimensionless stretched-exponential shape parameter beta are shown to be determined by magic (not adjusted) simple fractions derived from fractal configuration spaces of effective dimension d* by applying different topological axioms (rules) in the presence (absence) of a forcing electric field. The rules are based on a new central principle for defining glassy states: equal a priori distributions of fractal residual configurational entropy. Our approach and its beta estimates are fully supported by the results of relaxation measurements involving many different glassy materials and probe methods. The present unique topological predictions for beta typically agree with observed values to ~1% and indicate that for field-forced conditions beta should be constant for appreciable ranges of such exogenous variables as temperature and ionic concentration, as indeed observed using appropriate data analysis. The present approach can also be inverted and used to test sample homogeneity and quality.
Comment: Original 13 pages lengthened to 21 pages (longer introduction, added references and discussion of new experimental data published since original submission)
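For readers unfamiliar with the shape parameter, the sketch below evaluates the stretched-exponential (Kohlrausch-Williams-Watts) relaxation function phi(t) = exp(-(t/tau)^beta) and shows the standard linearization ln(-ln phi) = beta * ln(t/tau) used to read beta off a decay curve. The particular values of tau and beta are synthetic illustrations, not results from the paper.

```python
import math

def kww(t, tau, beta):
    # Stretched-exponential (KWW) relaxation function.
    return math.exp(-((t / tau) ** beta))

tau, beta = 1.0, 3 / 5   # a simple fraction chosen only for illustration
t1, t2 = 0.5, 2.0
phi1, phi2 = kww(t1, tau, beta), kww(t2, tau, beta)

# Recover beta from two points on the curve via the linearization
# ln(-ln phi) = beta * (ln t - ln tau), i.e. the slope in these coordinates.
beta_est = ((math.log(-math.log(phi2)) - math.log(-math.log(phi1)))
            / (math.log(t2) - math.log(t1)))
print(round(beta_est, 6))  # 0.6
```

In practice the same slope extraction is done by fitting many noisy data points rather than two exact ones, which is how measured beta values are compared against predicted fractions.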
A Manifesto of Nodalism
This paper proposes the notion of Nodalism as a means of describing contemporary culture and of understanding my own creative practice in electronic music composition. It draws on theories and ideas from Kirby, Bauman, Bourriaud, Deleuze, Guattari, and Gochenour to demonstrate how networks of ideas, or connectionist neural models of cognitive behaviour, can be used to contextualize and understand contemporary electronic music, and can become a creative tool for its creation.