Detection of Security and Dependability Threats: A Belief Based Reasoning Approach
Monitoring the preservation of security and dependability (S&D) properties during the operation of systems at runtime is an important verification measure that can increase system resilience. However, it does not always provide sufficient scope for taking control actions against violations, as it only detects problems after they occur. In this paper, we describe a proactive monitoring approach that detects potential violations of S&D properties, called “threats”, and discuss the results of an initial evaluation of it.
Predicting software service availability: Towards a runtime monitoring approach
This paper presents a prediction model for software service availability, measured by the mean-time-to-repair (MTTR) and mean-time-to-failure (MTTF) of a service. The prediction model is based on the experimental identification of probabilistic predictors for the variables that affect MTTR/MTTF, using monitoring data collected at runtime.
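The link between the two quantities the model predicts and the availability figure they determine can be illustrated with a short sketch; the function name and sample figures below are illustrative, not taken from the paper:

```python
def availability(mttf_hours: float, mttr_hours: float) -> float:
    """Steady-state availability from mean-time-to-failure and mean-time-to-repair:
    the fraction of time the service is up over a long observation period."""
    return mttf_hours / (mttf_hours + mttr_hours)

# Example: a service that runs 500 h between failures and takes 2 h to repair
# has availability(500, 2) = 500/502, roughly 0.996.
```

Predicting MTTF and MTTR separately, as the paper proposes, is what allows this single availability figure to be derived at runtime.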
EVEREST+: Run-time SLA violations prediction
Monitoring the preservation of QoS properties during the operation of service-based systems at run-time is an important verification measure for checking whether the current service usage complies with agreed SLAs. Monitoring, however, does not always provide sufficient scope for taking control actions against violations, as it only detects violations after they occur. In this paper we describe a model-based prediction framework, EVEREST+, for both the development and the execution of QoS predictors. EVEREST+ was designed to make the development of QoS predictors easy and fast: predictor authors focus only on implementing their prediction algorithms, without having to care about how historical data is collected or retrieved, or how models are inferred from the collected data. It also provides a run-time environment for executing QoS predictors and storing their predictions.
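The separation of concerns described above, where the framework owns data collection and the author supplies only the prediction algorithm, can be sketched as a plugin interface. The class and method names below are hypothetical illustrations, not the actual EVEREST+ API:

```python
from abc import ABC, abstractmethod
from typing import Sequence


class QoSPredictor(ABC):
    """Hypothetical predictor plugin: authors implement only predict();
    the framework (not shown) collects and feeds the historical QoS data."""

    @abstractmethod
    def predict(self, history: Sequence[float]) -> float:
        """Return the predicted next value of the monitored QoS metric."""


class MovingAveragePredictor(QoSPredictor):
    """Toy algorithm: predict the next value as the mean of a sliding window."""

    def __init__(self, window: int = 3):
        self.window = window

    def predict(self, history: Sequence[float]) -> float:
        recent = history[-self.window:]
        return sum(recent) / len(recent)
```

A runtime environment like the one the abstract mentions would instantiate such plugins, pass them monitored history, and store the values `predict` returns.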
A telescope detection system for direct and high resolution spectrometry of intense neutron fields
A high energy- and spatial-resolution telescope detector was designed and constructed for neutron spectrometry of intense neutron fields. The detector consists of a plastic scintillator coupled to a monolithic silicon telescope (MST), in turn comprising a ΔE and an E stage. The scintillator behaves as an “active” recoil-proton converter, since it measures the energy deposited by the recoil protons generated within it. The MST measures the residual energy of the recoil protons downstream of the converter and also discriminates recoil protons from photons associated with the neutron field. The layout of the scintillator/MST system was optimized through an analytical model for selecting the angular range of the scattered protons. The use of unfolding techniques for reconstructing the neutron energy distribution was thus avoided, with reasonable uncertainty (about 1.6% in neutron energy) and efficiency (of the order of 10⁻⁶ counts per unit neutron fluence). A semi-empirical procedure was also developed to correct the non-linearity of the light emission from the organic scintillator. The spectrometer was characterized with quasi-monoenergetic and continuous neutron fields generated at the CN Van de Graaff accelerator of the INFN-Legnaro National Laboratory, Italy, showing satisfactory agreement with literature data.
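The reconstruction principle, in which the recoil-proton energy is the sum of the deposits in the active converter and the two telescope stages while photon-induced events leave essentially nothing in the thin ΔE stage, can be sketched as follows. The threshold value and function names are invented for illustration and are not taken from the paper:

```python
from typing import Optional


def reconstruct_proton_energy(e_scint: float, de_stage: float, e_stage: float,
                              de_threshold: float = 0.1) -> Optional[float]:
    """Sum the three energy deposits (MeV) for a recoil-proton event.

    Events depositing almost nothing in the thin ΔE stage lack a proton
    signature and are rejected as photon-like (illustrative cut)."""
    if de_stage < de_threshold:
        return None  # photon-like event: no proton track through ΔE
    return e_scint + de_stage + e_stage

# A proton event: deposits of 1.2, 0.4 and 2.9 MeV sum to about 4.5 MeV.
# A photon-like event (ΔE deposit 0.02 MeV) is rejected and returns None.
```

Because each event's energy is obtained directly by summation, no spectrum unfolding is needed, which is the point the abstract makes about avoiding unfolding techniques.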
Experimental investigation of silicon photomultipliers as compact light readout systems for gamma-ray spectroscopy applications in fusion plasmas
A matrix of silicon photomultipliers has been developed for light readout from a large-area 1 in. × 1 in. LaBr3 crystal. The system has been characterized in the laboratory and its performance compared to that of a conventional photomultiplier tube. A pulse duration of 100 ns was achieved, which opens up spectroscopy applications at high counting rates. The energy resolution measured using radioactive sources extrapolates to 3%-4% in the energy range Eγ = 3-5 MeV, enabling gamma-ray spectroscopy measurements with good energy resolution. The results reported here are of relevance for the development of compact gamma-ray detectors with spectroscopy capabilities, such as an enhanced gamma-ray camera for high-power fusion plasmas, where the use of photomultiplier tubes is impeded by space limitations and sensitivity to magnetic fields.
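Extrapolating resolution from calibration-source energies to the 3-5 MeV region typically assumes the photon-statistics scaling of scintillator resolution with 1/√E. The sketch below illustrates that common assumption only; it is not the authors' actual fitting procedure, and the sample numbers are invented:

```python
import math


def extrapolate_resolution(r_ref: float, e_ref: float, e_target: float) -> float:
    """Scale a measured relative energy resolution r_ref (in %) at energy e_ref
    to e_target, assuming resolution ∝ 1/sqrt(E) (photon-statistics limit)."""
    return r_ref * math.sqrt(e_ref / e_target)

# A hypothetical 6% resolution measured at 0.662 MeV (137Cs) would scale to
# roughly 2.4% at 4 MeV — the same order as the 3%-4% quoted in the abstract.
```

Real detectors also carry energy-independent contributions (light-collection non-uniformity, electronics noise), which is why measured resolutions sit somewhat above the pure statistical limit.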
A unifying perspective on protocol mediation: interoperability in the Future Internet
Given the highly dynamic and extremely heterogeneous software systems composing the Future Internet, automatically achieving interoperability between software components —without modifying them— is more than simply desirable, it is quickly becoming a necessity. Although much work has been carried out on interoperability, existing solutions have not fully succeeded in keeping pace with the increasing complexity and heterogeneity of modern software, and meeting the demands of runtime support. On the one hand, solutions at the application layer target higher automation and loose coupling through the synthesis of intermediary entities, mediators, to compensate for the differences between the interfaces of components and coordinate their behaviours, while assuming the use of the same middleware solution. On the other hand, solutions to interoperability across heterogeneous middleware technologies do not reconcile the differences between components at the application layer. In this paper we propose a unified approach for achieving interoperability between heterogeneous software components with compatible functionalities across the application and middleware layers. First, we provide a solution to automatically generate cross-layer parsers and composers that abstract network messages into a uniform representation independent of the middleware used. Second, these generated parsers and composers are integrated within a mediation framework to support the deployment of the mediators synthesised at the application layer. More specifically, the generated parser analyses the network messages received from one component and transforms them into a representation that can be understood by the application-level mediator. Then, the application-level mediator performs the necessary data conversion and behavioural coordination. Finally, the composer transforms the representation produced by the application-level mediator into network messages that can be sent to the other component. 
The resulting unified mediation framework reconciles the differences between software components from the application down to the middleware layers. We validate our approach through a case study in the area of conference management.
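The parser/mediator/composer pipeline described above can be sketched abstractly. Everything below, including the toy message format and field renaming, is invented for illustration and is not the framework's actual API:

```python
def parse(raw: bytes) -> dict:
    """Parser: abstract a network message into a middleware-independent
    representation. Toy wire format: 'key=value' pairs separated by ';'."""
    return dict(pair.split("=", 1) for pair in raw.decode().split(";"))


def mediate(message: dict) -> dict:
    """Application-level mediator: convert data between the two components'
    vocabularies. Toy conversion: rename the 'title' field."""
    return {("paperTitle" if k == "title" else k): v for k, v in message.items()}


def compose(message: dict) -> bytes:
    """Composer: serialise the uniform representation back into a network
    message for the receiving component."""
    return ";".join(f"{k}={v}" for k, v in message.items()).encode()

# compose(mediate(parse(b"title=Mediation;track=demo")))
# -> b"paperTitle=Mediation;track=demo"
```

The value of generating the parser and composer automatically, as the paper proposes, is that the mediator in the middle only ever sees the uniform dictionary representation, regardless of the middleware each component uses.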
Automatic test case generation for WS-Agreements using combinatorial testing
In the scope of applications developed under the service-based paradigm, Service Level Agreements (SLAs) are a standard mechanism used to flexibly specify the Quality of Service (QoS) that must be delivered. These agreements contain the conditions negotiated between the service provider and consumers, as well as the potential penalties derived from the violation of such conditions. In this context, it is important to ensure that the service-based application (SBA) behaves as expected, in order to avoid potential consequences such as penalties or dissatisfaction among the stakeholders that have negotiated and signed the SLA. In this article we address the testing of SLAs specified using the WS-Agreement standard by applying testing techniques such as the Classification Tree Method and combinatorial testing to generate test cases. From the content of the individual terms of the SLA, we identify situations that need to be tested. We also derive a set of constraints, based on the SLA specification and the behavior of the SBA, in order to guarantee the testability of the test cases. Furthermore, we define three different coverage strategies with the aim of grading the intensity of the tests. Finally, we have developed a tool named SLACT (SLA Combinatorial Testing) to automate the process, and we have applied the whole approach to an eHealth case study.
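The core idea of combining classes of SLA term values into test cases and filtering out untestable combinations via constraints can be sketched with exhaustive combination, a much blunter strategy than SLACT's classification-tree and combinatorial coverage. All term names, values, and the sample constraint below are invented:

```python
from itertools import product

# Hypothetical SLA terms and classes of values, as a classification tree
# might partition them (below / at / above an agreed limit, etc.).
sla_terms = {
    "response_time_ms": [50, 200, 1000],
    "availability_pct": [99.9, 95.0],
    "load": ["low", "peak"],
}


def generate_test_cases(terms, constraint=lambda tc: True):
    """Exhaustively combine term values, keeping only testable combinations."""
    keys = list(terms)
    for values in product(*(terms[k] for k in keys)):
        case = dict(zip(keys, values))
        if constraint(case):
            yield case


# Example constraint on SBA behaviour: a 1000 ms response time is not
# observable under low load, so such combinations are untestable.
cases = list(generate_test_cases(
    sla_terms,
    constraint=lambda tc: not (tc["response_time_ms"] == 1000
                               and tc["load"] == "low"),
))
# 3 * 2 * 2 = 12 combinations, minus the 2 excluded ones -> 10 test cases
```

Combinatorial techniques such as pairwise coverage, and the graded coverage strategies the article defines, would select a smaller subset of these combinations while still exercising the interactions between terms.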