The xSAP Safety Analysis Platform
This paper describes the xSAP safety analysis platform. xSAP provides several
model-based safety analysis features for finite- and infinite-state synchronous
transition systems. In particular, it supports library-based definition of
fault modes, an automatic model extension facility, and the generation of safety
analysis artifacts such as Dynamic Fault Trees (DFTs) and Failure Mode and
Effects Analysis (FMEA) tables. Moreover, it supports probabilistic evaluation
of Fault Trees, failure propagation analysis using Timed Failure Propagation
Graphs (TFPGs), and Common Cause Analysis (CCA). xSAP has been used in several
industrial projects as a verification back-end, and is currently being evaluated
in a joint R&D Project involving FBK and The Boeing Company
Causality and Temporal Dependencies in the Design of Fault Management Systems
Reasoning about causes and effects naturally arises in the engineering of
safety-critical systems. A classical example is Fault Tree Analysis, a
deductive technique used for system safety assessment, whereby an undesired
state is reduced to the set of its immediate causes. The design of fault
management systems also requires reasoning on causality relationships. In
particular, a fail-operational system needs to ensure timely detection and
identification of faults, i.e. recognize the occurrence of run-time faults
through their observable effects on the system. Even more complex scenarios
arise when multiple faults are involved and may interact in subtle ways.
In this work, we propose a formal approach to fault management for complex
systems. We first introduce the notions of fault tree and minimal cut sets. We
then present a formal framework for the specification and analysis of
diagnosability, and for the design of fault detection and identification (FDI)
components. Finally, we review recent advances in fault propagation analysis,
based on the Timed Failure Propagation Graphs (TFPG) formalism.
Comment: In Proceedings CREST 2017, arXiv:1710.0277
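The fault-tree notions the abstract introduces can be made concrete with a small sketch. This is a generic illustration of minimal cut set computation for a static fault tree with AND/OR gates, not code from xSAP or the paper; the gate and event names are invented:

```python
from itertools import product

def cut_sets(node, gates):
    """Return the family of cut sets (sets of basic events) for `node`."""
    if node not in gates:                        # leaf: a basic event
        return [frozenset([node])]
    op, children = gates[node]
    child_sets = [cut_sets(c, gates) for c in children]
    if op == "OR":                               # union of the children's families
        return [s for fam in child_sets for s in fam]
    # AND: merge one cut set per child, in every combination
    return [frozenset().union(*combo) for combo in product(*child_sets)]

def minimize(family):
    """Keep only minimal cut sets (no proper subset also in the family)."""
    return {s for s in family if not any(t < s for t in family)}

# Top event occurs if (A and B) or C
gates = {"TOP": ("OR", ["G1", "C"]), "G1": ("AND", ["A", "B"])}
mcs = minimize(cut_sets("TOP", gates))
# mcs == {frozenset({'A', 'B'}), frozenset({'C'})}
```

Deduction proceeds exactly as the abstract describes: the undesired top event is reduced, gate by gate, to the sets of its immediate causes.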
Discussion to: ‘Guidelines on the use of inverse velocity method as a tool for setting alarm thresholds and forecasting landslides and structure collapses’ by T. Carlà, E. Intrieri, F. Di Traglia, T. Nolesini, G. Gigli and N. Casagli
The paper ‘Guidelines on the use of inverse velocity method as a tool for setting alarm thresholds and forecasting landslides and structure collapses’ by T. Carlà, E. Intrieri, F. Di Traglia, T. Nolesini, G. Gigli and N. Casagli deals with a sensitive topic for landslide risk management. Exploring the pre-failure behaviour of four different case histories, the authors proposed standard procedures for the application of the inverse velocity method (INV, Fukuzono 1985). Specifically, they suggested guidelines for the filtering of velocity data and an original, simple approach to automatically set the first and second alarm thresholds using the inverse velocity method. The present discussion addresses three topics: (1) data filter selection according to the features of the monitoring instrument; (2) the importance of data sampling frequency for the forecasting analysis; and (3) the influence of the starting point (SP in this discussion) for the application of the INV analysis. On this basis, a new method is also proposed to update the INV analysis on an ongoing basis
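The inverse velocity method under discussion can be sketched in a few lines. For many accelerating failures, 1/v decays approximately linearly in time, so a straight-line fit extrapolated to 1/v = 0 yields a failure-time forecast. This is a minimal illustration of Fukuzono's linear INV extrapolation on synthetic data, not the authors' filtering procedure or their alarm thresholds:

```python
import numpy as np

t = np.array([0., 1., 2., 3., 4., 5.])    # observation times (days)
t_f = 10.0                                 # "true" failure time of the synthetic slope
v = 1.0 / (t_f - t)                        # accelerating displacement rate
inv_v = 1.0 / v                            # inverse velocity, linear in t here

slope, intercept = np.polyfit(t, inv_v, 1) # least-squares line through (t, 1/v)
t_failure = -intercept / slope             # where the fit crosses 1/v = 0
# t_failure ≈ 10.0, recovering the imposed failure time
```

The discussion's point (3) maps directly onto this sketch: changing the starting point means fitting over a different subset of `t`, which can shift `t_failure` appreciably on real, noisy data.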
Digital image correlation (DIC) analysis of the 3 December 2013 Montescaglioso landslide (Basilicata, Southern Italy). Results from a multi-dataset investigation
Image correlation remote sensing monitoring techniques are becoming key tools for
providing effective qualitative and quantitative information suitable for natural hazard assessments,
specifically for landslide investigation and monitoring. In recent years, these techniques have
been successfully integrated and shown to be complementary and competitive with more standard
remote sensing techniques, such as satellite or terrestrial Synthetic Aperture Radar interferometry.
The objective of this article is to apply an in-depth calibration and validation analysis of
the Digital Image Correlation technique to measure landslide displacement.
The availability of a multi-dataset for the 3 December 2013 Montescaglioso landslide, characterized
by different types of imagery, such as LANDSAT 8 OLI (Operational Land Imager) and TIRS
(Thermal Infrared Sensor), high-resolution airborne optical orthophotos, Digital Terrain Models
and COSMO-SkyMed Synthetic Aperture Radar, allows for the retrieval of the actual landslide
displacement field at values ranging from a few meters (2–3 m in the north-eastern sector of the
landslide) to 20–21 m (local peaks on the central body of the landslide). Furthermore, comprehensive
sensitivity analyses and statistics-based processing approaches are used to identify the role of the
background noise that affects the whole dataset. This noise is directly proportional to
the geometric and temporal resolutions of the processed imagery. Moreover, the accuracy
of the environmental-instrumental background noise evaluation allowed the actual displacement
measurements to be correctly calibrated and validated, thereby leading to a better definition of
the threshold values of the maximum Digital Image Correlation sub-pixel accuracy and reliability
(ranging from 1/10 to 8/10 pixel) for each processed dataset
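The pixel-level core of Digital Image Correlation is template matching: a reference patch from the master image is located in the later (slave) image by maximising a correlation score. The sketch below uses zero-normalised cross-correlation (ZNCC) on synthetic data; it is a generic illustration, not the study's processing chain, which additionally performs sub-pixel refinement and the noise calibration described above:

```python
import numpy as np

rng = np.random.default_rng(0)
master = rng.random((60, 60))              # synthetic pre-event image
dy, dx = 7, 3                              # imposed "landslide" shift (pixels)
patch = master[20:30, 20:30]               # reference template
slave = np.roll(np.roll(master, dy, axis=0), dx, axis=1)  # shifted post-event image

best, best_score = None, -2.0
for i in range(slave.shape[0] - 10):
    for j in range(slave.shape[1] - 10):
        win = slave[i:i + 10, j:j + 10]
        a = patch - patch.mean()           # zero-mean template
        b = win - win.mean()               # zero-mean candidate window
        denom = np.sqrt((a * a).sum() * (b * b).sum())
        score = (a * b).sum() / denom if denom else -2.0
        if score > best_score:
            best, best_score = (i - 20, j - 20), score
# best == (7, 3): the recovered pixel-level displacement
```

Real DIC pipelines then interpolate the correlation surface around the peak, which is what makes the sub-pixel accuracies (1/10 to 8/10 pixel) quoted in the abstract meaningful.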
An Effective Fixpoint Semantics for Linear Logic Programs
In this paper we investigate the theoretical foundation of a new bottom-up
semantics for linear logic programs, and more precisely for the fragment of
LinLog that consists of the language LO enriched with the constant 1. We use
constraints to symbolically and finitely represent possibly infinite
collections of provable goals. We define a fixpoint semantics based on a new
operator in the style of Tp working over constraints. An application of the
fixpoint operator can be computed algorithmically. As a sufficient condition for
termination, we show that the fixpoint computation is guaranteed to converge
for propositional LO. To our knowledge, this is the first attempt to define an
effective fixpoint semantics for linear logic programs. As an application of
our framework, we also present a formal investigation of the relations between
LO and Disjunctive Logic Programming. Using an approach based on abstract
interpretation, we show that DLP fixpoint semantics can be viewed as an
abstraction of our semantics for LO. We prove that the resulting abstraction is
correct and complete for an interesting class of LO programs encoding Petri
Nets.
Comment: 39 pages, 5 figures. To appear in Theory and Practice of Logic
Programming
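The bottom-up style the abstract builds on can be illustrated with the classical propositional T_P operator, iterated from the empty interpretation to its least fixpoint. The paper's operator works over constraints for LO rather than over finite interpretations; this sketch shows only the classical analogue, with an invented example program:

```python
def tp(program, interp):
    """One application of T_P: heads whose bodies hold in `interp`.

    A program is a list of (head, body) clauses for a propositional
    definite logic program; an interpretation is a set of atoms.
    """
    return {head for head, body in program if set(body) <= interp}

def lfp(program):
    """Iterate T_P from the empty interpretation until convergence."""
    interp = set()
    while True:
        new = tp(program, interp) | interp
        if new == interp:
            return interp
        interp = new

program = [("p", []), ("q", ["p"]), ("r", ["q", "s"])]
# lfp(program) == {"p", "q"}: "r" is never derived, since no clause proves "s"
```

The paper's contribution is, in effect, to make this picture work for LO: the constraint representation keeps each iterate finite even when the set of provable goals is infinite.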
Model Checking Linear Logic Specifications
The overall goal of this paper is to investigate the theoretical foundations
of algorithmic verification techniques for first order linear logic
specifications. The fragment of linear logic we consider in this paper is based
on the linear logic programming language called LO enriched with universally
quantified goal formulas. Although LO was originally introduced as a
theoretical foundation for extensions of logic programming languages, it can
also be viewed as a very general language to specify a wide range of
infinite-state concurrent systems.
Our approach is based on the relation between backward reachability and
provability highlighted in our previous work on propositional LO programs.
Following this line of research, we define here a general framework for the
bottom-up evaluation of first order linear logic specifications. The evaluation
procedure is based on an effective fixpoint operator working on a symbolic
representation of infinite collections of first order linear logic formulas.
The theory of well quasi-orderings can be used to provide sufficient conditions
for the termination of the evaluation of non-trivial fragments of first order
linear logic.
Comment: 53 pages, 12 figures. "Under consideration for publication in Theory
and Practice of Logic Programming"
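The backward-reachability scheme the abstract describes can be illustrated on Petri nets, a target class the related LO work encodes. The sketch below is a generic backward reachability loop over upward-closed sets of markings, each represented by its finite set of minimal elements; termination follows because component-wise ≤ on markings is a well quasi-ordering. It is an illustration of the general technique, not the paper's symbolic LO procedure:

```python
def pred_basis(targets, transitions):
    """Minimal markings from which one transition reaches the target up-set.

    A transition is a (pre, post) pair of token vectors: firing from m
    yields m - pre + post, enabled when m >= pre component-wise.
    """
    new = set()
    for b in targets:
        for pre, post in transitions:
            # minimal m with m >= pre and m - pre + post >= b
            new.add(tuple(max(p, max(0, bi + p - q))
                          for bi, p, q in zip(b, pre, post)))
    return new

def backward_reach(bad, transitions):
    """Minimal elements of the set of markings that can reach `bad`."""
    basis = set(bad)
    while True:
        cand = basis | pred_basis(basis, transitions)
        minimal = {m for m in cand
                   if not any(all(x <= y for x, y in zip(n, m)) and n != m
                              for n in cand)}
        if minimal == basis:
            return basis
        basis = minimal

# Two places; one transition moves a token from place 1 to place 2.
# Bad markings: at least two tokens in place 2.
result = backward_reach({(0, 2)}, [((1, 0), (0, 1))])
# result == {(0, 2), (1, 1), (2, 0)}: any two tokens suffice, wherever they sit
```

The subsumption test in `backward_reach` is where the well quasi-ordering earns its keep: every freshly generated predecessor is eventually dominated by an existing minimal element, so the loop cannot run forever.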
School Cooperativism: Didactic Proposals in the Context of Cooperative Education
The Curricular Model of the Argentine Republic includes cooperative practices in Secondary Education among its objectives. This work develops a project to stimulate interpersonal skills through activities for the Mathematics classroom, corresponding to the stage of formalization of conceptual-procedural structures, grounded in the Pillars of Cooperativism and in a conception of Education for Freedom, Justice and Solidarity
Assessing Gender Inequality among Italian Regions: The Italian Gender Gap Index
This paper aims at exploring and evaluating the geographic
distribution of gender inequality across Italian regions. The aim of the
analysis is two-fold. First we build a composite indicator of gender
inequality at the regional level for Italy by applying the methodology
developed by the World Economic Forum for the Global Gender Gap
Index. Second, we compute the Italian Gender Gap Index for each region
in order to measure the within-country heterogeneity that characterizes
Italy. We complete the analysis by presenting the correlation between the
Italian Gender Gap Index and relevant socio-economic variables
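The World Economic Forum construction the paper adapts can be sketched in simplified, unweighted form: each indicator is a female-to-male ratio truncated at parity (1.0), indicators are averaged into subindexes, and the overall index averages the subindexes. The WEF additionally weights indicators within each subindex, which this sketch omits, and all figures below are illustrative, not regional data from the paper:

```python
def gap_ratio(female, male):
    """Female-to-male ratio, truncated at the parity benchmark of 1.0."""
    return min(female / male, 1.0)

def subindex(pairs):
    """Unweighted mean of the gap ratios for one thematic subindex."""
    return sum(gap_ratio(f, m) for f, m in pairs) / len(pairs)

# Hypothetical region: (female, male) values per indicator
region = {
    "economy":   [(0.55, 0.75), (0.80, 1.00)],  # e.g. participation, wages
    "education": [(0.99, 0.98)],                 # above parity, capped at 1.0
    "politics":  [(0.20, 0.80)],                 # e.g. share of seats held
}
index = sum(subindex(p) for p in region.values()) / len(region)
# index is about 0.67 here; 1.0 would mean parity on every subindex
```

Truncating at parity is the design choice that makes the index measure the gap rather than the direction of imbalance: a region where women outperform men on an indicator scores the same as one at exact parity.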
Activities Developed within the Framework of the Pedagogy of Cooperation in the Teaching of Geometry, as Prescribed by the van Hiele Theory of Levels
Faced with the challenge of sustaining classroom practices immersed in the framework provided by the pedagogy of cooperation, while ensuring that the expected learning of Mathematics takes place, several theories were explored. For the learning of Geometry, the van Hiele Theory offers a path that can be developed within a fully cooperative learning context. To that end, the proposal was carried out with students aged between 11 and 12. It yielded conclusions and reflections that serve as a guide for the teacher's decision-making