Detection of Security and Dependability Threats: A Belief Based Reasoning Approach
Monitoring the preservation of security and dependability (S&D) properties during the operation of systems at runtime is an important verification measure that can increase system resilience. However, it does not always provide sufficient scope for taking control actions against violations, as it only detects problems after they occur. In this paper, we describe a proactive monitoring approach that detects potential violations of S&D properties, called "threats", and discuss the results of an initial evaluation of it.
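One way to picture belief-based detection of a threat (not the paper's actual reasoning framework, whose rules and probabilities are not given here) is a simple Bayesian update: each observed warning event raises the belief that an S&D property is about to be violated, and a threat is reported once that belief crosses a threshold. All probabilities below are illustrative assumptions.

```python
# Hedged sketch: accumulate belief in an impending S&D violation via a
# simple Bayesian update over warning events. The likelihoods are
# illustrative assumptions, not values from the paper.

def update_belief(prior, p_evt_given_threat=0.8, p_evt_given_ok=0.2):
    """Bayes update of the belief in a threat after one warning event."""
    num = p_evt_given_threat * prior
    return num / (num + p_evt_given_ok * (1 - prior))

belief = 0.1                 # initial belief that a violation is imminent
for _ in range(3):           # three consecutive warning events observed
    belief = update_belief(belief)
print(belief > 0.5)          # belief has crossed a hypothetical reporting threshold
```

The proactive element is that the belief crosses the threshold before the violation itself occurs, leaving room for control actions.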
Predicting software service availability: Towards a runtime monitoring approach
This paper presents a prediction model for software service availability, measured by the mean-time-to-repair (MTTR) and mean-time-to-failure (MTTF) of a service. The prediction model is based on the experimental identification of probabilistic predictors for the variables that affect MTTR/MTTF, using monitoring data collected from the service at runtime.
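The standard relationship behind such a model is that steady-state availability equals MTTF / (MTTF + MTTR). A minimal sketch, assuming MTTF and MTTR are estimated as sample means of monitored up/down intervals (the function name and inputs are illustrative, not from the paper):

```python
# Minimal sketch: estimate MTTF/MTTR from monitored up/down intervals and
# derive steady-state availability = MTTF / (MTTF + MTTR).

def estimate_availability(uptimes, downtimes):
    """Return (MTTF, MTTR, availability) from observed intervals.

    uptimes   -- observed time-to-failure intervals (e.g. hours)
    downtimes -- observed time-to-repair intervals (same unit)
    """
    mttf = sum(uptimes) / len(uptimes)      # mean time to failure
    mttr = sum(downtimes) / len(downtimes)  # mean time to repair
    return mttf, mttr, mttf / (mttf + mttr)

mttf, mttr, avail = estimate_availability([95.0, 105.0], [1.0, 3.0])
print(mttf, mttr, round(avail, 3))  # MTTF 100 h, MTTR 2 h, availability ~0.98
```

The paper's contribution is in predicting the MTTR/MTTF inputs probabilistically at runtime; the availability formula itself is the textbook steady-state result.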
EVEREST+: Run-time SLA violations prediction
Monitoring the preservation of QoS properties during the operation of service-based systems at run-time is an important verification measure for checking whether the current service usage complies with agreed SLAs. Monitoring, however, does not always provide sufficient scope for taking control actions against violations, as it only detects violations after they occur. In this paper we describe a model-based prediction framework, EVEREST+, for both the development and the execution of QoS predictors. EVEREST+ was designed to make developing QoS predictors quick and easy: developers focus only on implementing the prediction algorithms, without having to care about how historical data is collected or retrieved, or how models are inferred from the collected data. It also provides a run-time environment for executing QoS predictors and storing their predictions.
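As an illustration of the kind of prediction algorithm such a framework hosts (not EVEREST+'s actual algorithm), a predictor might fit a linear trend to monitored response times and extrapolate one step ahead against the agreed SLA threshold:

```python
# Illustrative QoS predictor sketch: least-squares linear trend over a
# window of monitored values, extrapolated one step ahead and compared
# against the SLA threshold. Not EVEREST+'s actual prediction algorithm.

def predict_violation(history, sla_threshold):
    """Return True if the extrapolated next value exceeds the SLA threshold."""
    n = len(history)
    mean_x = (n - 1) / 2
    mean_y = sum(history) / n
    cov = sum((x - mean_x) * (y - mean_y) for x, y in enumerate(history))
    var = sum((x - mean_x) ** 2 for x in range(n))
    slope = cov / var
    forecast = mean_y + slope * (n - mean_x)   # extrapolate to step n
    return forecast > sla_threshold

# Response times trending 100, 110, 120, 130 ms extrapolate to 140 ms:
print(predict_violation([100, 110, 120, 130], 145))  # False: 140 < 145
print(predict_violation([100, 110, 120, 130], 135))  # True:  140 > 135
```

In the framework's terms, the data collection feeding `history` and the trend model itself would be provided by EVEREST+; only the comparison logic is predictor-specific.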
A unifying perspective on protocol mediation: interoperability in the Future Internet
Given the highly dynamic and extremely heterogeneous software systems composing the Future Internet, automatically achieving interoperability between software components —without modifying them— is more than simply desirable, it is quickly becoming a necessity. Although much work has been carried out on interoperability, existing solutions have not fully succeeded in keeping pace with the increasing complexity and heterogeneity of modern software, and meeting the demands of runtime support. On the one hand, solutions at the application layer target higher automation and loose coupling through the synthesis of intermediary entities, mediators, to compensate for the differences between the interfaces of components and coordinate their behaviours, while assuming the use of the same middleware solution. On the other hand, solutions to interoperability across heterogeneous middleware technologies do not reconcile the differences between components at the application layer. In this paper we propose a unified approach for achieving interoperability between heterogeneous software components with compatible functionalities across the application and middleware layers. First, we provide a solution to automatically generate cross-layer parsers and composers that abstract network messages into a uniform representation independent of the middleware used. Second, these generated parsers and composers are integrated within a mediation framework to support the deployment of the mediators synthesised at the application layer. More specifically, the generated parser analyses the network messages received from one component and transforms them into a representation that can be understood by the application-level mediator. Then, the application-level mediator performs the necessary data conversion and behavioural coordination. Finally, the composer transforms the representation produced by the application-level mediator into network messages that can be sent to the other component. 
The resulting unified mediation framework reconciles the differences between software components from the application down to the middleware layers. We validate our approach through a case study in the area of conference management.
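The parse, mediate, compose chain described above can be sketched as three composable stages. The message format and field names below are invented for illustration; in the approach itself, parsers and composers are generated automatically from middleware descriptions rather than hand-written:

```python
# Minimal sketch of the parse -> mediate -> compose pipeline. The
# "KEY=VAL;..." wire format and the field mapping are hypothetical.

def parse(raw):
    """Abstract a middleware-specific message into a uniform representation."""
    return dict(field.split("=", 1) for field in raw.strip(";").split(";"))

def mediate(msg):
    """Application-level mediation: convert fields to those the target expects."""
    mapping = {"op": "action", "arg": "payload"}
    return {mapping.get(k, k): v for k, v in msg.items()}

def compose(msg):
    """Re-serialise the uniform representation for the target middleware."""
    return ";".join(f"{k}={v}" for k, v in msg.items())

# One message crossing the mediator, conference-management flavoured:
print(compose(mediate(parse("op=register;arg=paper42"))))
# -> action=register;payload=paper42
```

A real mediator would also coordinate behaviour (message ordering, splitting, merging), not just rename fields, but the layering is the same: middleware differences are absorbed by `parse`/`compose`, application differences by `mediate`.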
Designing Multi-Layers Self-Adaptive Complex Applications
The impossibility of statically determining the behavior of complex systems that interact at runtime with heterogeneous devices and remote entities may lead to unexpected system failures and performance degradation. Recently, self-adaptive applications have been recognized as viable solutions for dealing with systems whose size and complexity grow beyond the ability of humans to manage. However, self-adaptive solutions have always been studied in isolation, involving only single layers of the system (e.g. operating system, middleware, firmware, hardware). In this paper we discuss our novel idea of multi-layer, deep adaptability of complex systems. We present an integrated approach for designing and coordinating applications with three layers of self-adaptation. The proposed solution is based on specialized sense-plan-act control loops that interact with each other to monitor both specific parts and the global behavior of the system. These control loops sense unexpected behaviors that can compromise the system, then diagnose the system layer that needs an adaptation action, and finally enact the proper adaptation strategy to overcome the revealed problems without user intervention. We validate the approach with a SOA-based application, the Virtual Tour Guide, designed with three-layer self-adaptive abilities in order to overcome functional/non-functional problems that can derive from the integration of heterogeneous and remote third-party services.
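One such sense-plan-act loop can be sketched schematically as three stages: sense a symptom from monitored metrics, plan by diagnosing the layer and selecting a strategy, and act by enacting it. The symptom, layer name, and strategy below are hypothetical, not taken from the Virtual Tour Guide:

```python
# Schematic sketch of one specialised sense-plan-act control loop.
# Metric names, layers, and adaptation strategies are illustrative.

def sense(metrics):
    """Detect an unexpected behaviour; here, a response-time spike."""
    return "slow" if metrics["response_ms"] > 500 else None

def plan(symptom):
    """Diagnose the layer needing adaptation and choose a strategy."""
    return {"slow": ("middleware", "reroute_to_replica")}.get(symptom)

def act(decision):
    """Enact the chosen adaptation (stubbed: a real loop would reconfigure)."""
    layer, strategy = decision
    return f"applied {strategy} at {layer} layer"

metrics = {"response_ms": 750}     # monitored sample exceeding the threshold
symptom = sense(metrics)
if symptom:
    print(act(plan(symptom)))      # applied reroute_to_replica at middleware layer
```

In the multi-layer design, several such loops run concurrently, each watching its own layer, with coordination among them covering the global behavior.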
A novel avalanche-confinement TEPC for microdosimetry at nanometric level
The tissue equivalent proportional counter (TEPC) is the most accurate device for measuring the microdosimetric properties of a particle beam, and it has been shown to properly assess the relative biological effectiveness by linking the physical parameters of the radiation with the corresponding biological response. Nevertheless, no detailed information on the track structure of the impinging particles can be obtained, since the lower operational limit of common TEPCs is about 0.3 μm. On the other hand, the pattern of particle interactions at the nanometre level, which has been shown to correlate strongly with radiation-induced damage to the DNA, is directly measured by only three different nanodosimeters worldwide: practical instruments are not yet available.
The gap between microdosimetry and track nanodosimetry can be partially filled by extending the TEPC response down to the nanometric region. A feasibility study of a novel TEPC designed to simulate biological sites in the nanometric domain was performed. The present paper describes the design, development and characterization of this avalanche-confinement TEPC. Irradiations with photons, fast neutrons and low-energy carbon ions demonstrated the capability of this TEPC to measure in the range from 0.3 μm down to 25 nm.