
    Formal analysis techniques for gossiping protocols

    We give a survey of formal verification techniques that can be used to corroborate existing experimental results for gossiping protocols in a rigorous manner. We present properties of interest for gossiping protocols and discuss how various formal evaluation techniques can be employed to predict them.
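
    As a rough illustration of the kind of protocol these techniques are applied to, below is a minimal synchronous push-gossip round in Python. The network model, fanout parameter, and termination check are illustrative assumptions, not taken from the survey; formal tools would verify properties such as expected dissemination time over a model of this behaviour.

```python
import random

def push_gossip_round(informed, all_nodes, fanout=1):
    """One synchronous push round: every informed node forwards
    the rumor to `fanout` peers chosen uniformly at random."""
    newly_informed = set(informed)
    for node in informed:
        for peer in random.sample(list(all_nodes - {node}), fanout):
            newly_informed.add(peer)
    return newly_informed

# Push gossip informs all n nodes in O(log n) rounds with high probability.
nodes = set(range(100))
informed = {0}
rounds = 0
while informed != nodes:
    informed = push_gossip_round(informed, nodes)
    rounds += 1
print(f"all {len(nodes)} nodes informed after {rounds} rounds")
```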

    Power quality and electromagnetic compatibility: special report, session 2

    The scope of Session 2 (S2) has been defined as follows by the Session Advisory Group and the Technical Committee: power quality (PQ), together with the more general concept of electromagnetic compatibility (EMC) and some related safety problems in electricity distribution systems. Special focus is put on voltage continuity (supply reliability, the problem of outages) and voltage quality (voltage level, flicker, unbalance, harmonics). This session will also look at electromagnetic compatibility (mains frequency to 150 kHz), electromagnetic interference, and electric and magnetic field issues. Also addressed in this session are electrical safety and immunity concerns (lightning issues; step, touch, and transferred voltages). The aim of this special report is to present a synthesis of the present concerns in PQ&EMC, based on all selected papers of Session 2 and related papers from other sessions (152 papers in total). The report is divided into the following four blocks:
    - Block 1: Electric and Magnetic Fields, EMC, Earthing Systems
    - Block 2: Harmonics
    - Block 3: Voltage Variation
    - Block 4: Power Quality Monitoring
    Two round tables will be organised:
    - Power Quality and EMC in the Future Grid (CIGRE/CIRED WG C4.24, RT 13)
    - Reliability Benchmarking: why should we do it, and what should be done in the future? (RT 15)
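
    For readers less familiar with the harmonics block, total harmonic distortion (THD) is the standard figure of merit for harmonic content. A minimal sketch of the computation, assuming RMS harmonic magnitudes are already available (the voltage values below are invented for illustration):

```python
import math

def thd(fundamental_rms, harmonic_rms):
    """Total harmonic distortion: ratio of the combined RMS of all
    harmonic components to the RMS of the fundamental."""
    return math.sqrt(sum(h * h for h in harmonic_rms)) / fundamental_rms

# Illustrative 230 V supply with small 3rd/5th/7th harmonic components.
print(f"THD = {thd(230.0, [11.5, 6.9, 4.6]):.2%}")  # ~6.2%
```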

    An Intelligent QoS Identification for Untrustworthy Web Services Via Two-phase Neural Networks

    QoS identification for untrustworthy Web services is critical to QoS management in service computing, since the performance of untrustworthy Web services may result in QoS degradation. The key issue is to intelligently learn the characteristics of trustworthy Web services at different QoS levels, and then to identify the untrustworthy ones from the characteristics of their QoS metrics. Among intelligent identification approaches, deep neural networks have emerged as a powerful technique in recent years. In this paper, we propose a novel two-phase neural network model to identify untrustworthy Web services. In the first phase, Web services are collected from a published QoS dataset, and a feedforward neural network is designed to classify Web services into different QoS levels. In the second phase, a probabilistic neural network (PNN) model identifies the untrustworthy Web services within each class. The experimental results show that the proposed approach achieves a 90.5% identification ratio, far higher than that of competing approaches.
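
    A minimal sketch of the second phase: a probabilistic neural network is essentially a Parzen-window classifier with Gaussian kernels over the training samples of each class. The kernel width, feature choice, and toy QoS data below are assumptions for illustration; the paper's actual architecture and dataset are not reproduced here.

```python
import numpy as np

def pnn_classify(x, train_X, train_y, sigma=0.5):
    """Probabilistic neural network: per-class Parzen-window density
    estimate with Gaussian kernels; pick the class with highest density."""
    scores = {}
    for label in np.unique(train_y):
        diffs = train_X[train_y == label] - x      # pattern layer
        kernels = np.exp(-np.sum(diffs**2, axis=1) / (2 * sigma**2))
        scores[label] = kernels.mean()             # summation layer
    return max(scores, key=scores.get)             # decision layer

# Toy QoS feature vectors (response time, availability), two trust levels.
X = np.array([[0.2, 0.99], [0.3, 0.97], [2.5, 0.60], [3.0, 0.55]])
y = np.array(["trustworthy", "trustworthy", "untrustworthy", "untrustworthy"])
print(pnn_classify(np.array([0.25, 0.98]), X, y))  # -> trustworthy
```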

    Formal Probabilistic Analysis of a Wireless Sensor Network for Forest Fire Detection

    Wireless Sensor Networks (WSNs) have been widely explored for forest fire detection, a fatal threat throughout the world. Energy conservation of sensor nodes is one of the biggest challenges in this context, and random scheduling is frequently applied to address it. The performance of these random scheduling approaches is traditionally analyzed by paper-and-pencil proof methods or simulation. These traditional techniques cannot ascertain 100% accuracy and are thus unsuitable for analyzing a safety-critical application such as forest fire detection using WSNs. In this paper, we propose to overcome this limitation by applying formal probabilistic analysis using theorem proving to verify the scheduling performance of a real-world WSN for forest fire detection that uses a k-set randomized algorithm as an energy-saving mechanism. In particular, we formally verify the expected value of the coverage intensity, the upper bound on the total number of disjoint subsets for a given coverage intensity, and the lower bound on the total number of nodes.
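
    A Monte Carlo sketch of the k-set randomized scheduling idea, under the usual model assumptions (each sensor independently joins one of k disjoint subsets, and the subsets take turns being awake): a point is covered in a slot iff at least one of its covering sensors belongs to that slot's subset. The theorem-proving approach of the paper establishes such expectations exactly; this simulation only illustrates them, and the formula in the comment is the standard closed form for this model, not quoted from the paper.

```python
import random

def coverage_intensity(num_sensors, k, trials=10_000):
    """Average fraction of the k scheduling slots in which at least one
    of the sensors covering a point is awake, over random assignments."""
    total = 0.0
    for _ in range(trials):
        occupied = {random.randrange(k) for _ in range(num_sensors)}
        total += len(occupied) / k
    return total / trials

# With s covering sensors, E[coverage intensity] = 1 - (1 - 1/k)**s.
s, k = 6, 4
print(coverage_intensity(s, k), 1 - (1 - 1 / k) ** s)  # both ~0.822
```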

    Damage identification in structural health monitoring: a brief review from its implementation to the use of data-driven applications

    The damage identification process provides relevant information about the current state of a structure under inspection, and it can be approached from two different points of view. The first approach uses data-driven algorithms, which are usually associated with the collection of data using sensors; the data are subsequently processed and analyzed. The second approach uses models to analyze information about the structure. In the latter case, the overall performance of the approach depends on the accuracy of the model and the information used to define it. Although both approaches are widely used, data-driven algorithms are preferred in most cases because they afford the ability to analyze data acquired from sensors and to provide a real-time solution for decision making; however, they require high-performance processors due to their high computational cost. As a contribution to researchers working with data-driven algorithms and applications, this work presents a brief review of data-driven algorithms for damage identification in structural health-monitoring applications. This review covers damage detection, localization, classification, extension, and prognosis, as well as the development of smart structures. The literature is systematically reviewed according to the natural steps of a structural health-monitoring system. This review also includes information on the types of sensors used as well as on the development of data-driven algorithms for damage identification.
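
    As a concrete flavor of the data-driven approach, one common damage-detection baseline is novelty detection: learn a statistical model of the healthy structure's sensor features and flag observations that deviate from it. A minimal Mahalanobis-distance sketch follows; the feature choice, synthetic data, and absence of a threshold are assumptions for illustration, not taken from any specific reviewed work.

```python
import numpy as np

def fit_baseline(healthy_features):
    """Learn mean and inverse covariance of features measured on the
    undamaged structure (the healthy baseline)."""
    mu = healthy_features.mean(axis=0)
    cov_inv = np.linalg.inv(np.cov(healthy_features, rowvar=False))
    return mu, cov_inv

def damage_index(x, mu, cov_inv):
    """Squared Mahalanobis distance of a new observation from the
    healthy baseline; large values suggest damage."""
    d = x - mu
    return float(d @ cov_inv @ d)

rng = np.random.default_rng(0)
healthy = rng.normal(0.0, 1.0, size=(500, 3))  # e.g. modal features
mu, cov_inv = fit_baseline(healthy)
print(damage_index(np.array([0.1, -0.2, 0.05]), mu, cov_inv))  # small
print(damage_index(np.array([4.0, 3.5, -4.2]), mu, cov_inv))   # large
```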

    Big Data Analytics for QoS Prediction Through Probabilistic Model Checking

    As competitiveness increases, being able to guarantee the QoS of delivered services is key to business success. The ability to continuously monitor the workflow providing a service and to recognize breaches of the agreed QoS level in a timely manner is thus of paramount importance. The ideal condition would be the ability to anticipate, and thus predict, a breach and act to avoid it, or at least to mitigate its effects. In this paper we propose a model-checking-based approach to predict the QoS of a formally described process. Continuous model checking is enabled by the use of a parametrized model of the monitored system, where the actual values of the parameters are continuously evaluated and updated by means of big data tools. The paper also describes a prototype implementation of the approach and shows its usage in a case study.
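
    A toy version of the core idea: a workflow is modeled as a Markov chain whose failure parameter is re-estimated from monitoring data, and the updated model is then queried for the probability of an SLA breach. The two-state chain, step horizon, and failure counts below are invented for illustration; the paper uses a full probabilistic model checker rather than this hand-rolled computation.

```python
import numpy as np

def breach_probability(p_fail, steps=10):
    """Probability that a workflow modeled as a tiny Markov chain
    (OK -> OK/FAIL, FAIL absorbing) reaches FAIL within `steps` steps."""
    P = np.array([[1 - p_fail, p_fail],
                  [0.0,        1.0]])
    dist = np.array([1.0, 0.0])                    # start in OK
    dist = dist @ np.linalg.matrix_power(P, steps)
    return dist[1]

# Parameter continuously re-estimated from monitored failure counts.
failures, requests = 3, 200
p_hat = failures / requests
print(f"P(SLA breach within 10 steps) = {breach_probability(p_hat):.3f}")
```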

    Improving route discovery in on-demand routing protocols using local topology information in MANETs

    Most existing routing protocols proposed for MANETs use flooding as a broadcast technique for the propagation of network control packets; a particular example of this is the dissemination of route requests (RREQs), which facilitate route discovery. In flooding, each mobile node rebroadcasts received packets, which are thereby propagated network-wide with considerable overhead. This paper improves on the performance of existing routing protocols by reducing the communication overhead incurred during route discovery, implementing a new broadcast algorithm, called adjusted probabilistic flooding, on the Ad hoc On-Demand Distance Vector (AODV) protocol. AODV [3] is a well-known and widely studied algorithm which has been shown over the past few years to maintain an overall lower routing overhead than traditional proactive schemes, even though it uses flooding to propagate RREQs. Our results, as presented in this paper, reveal that equipping AODV with fixed and adjusted probabilistic flooding instead helps reduce the overhead of the route discovery process whilst maintaining performance levels, in terms of saved rebroadcasts and reachability, comparable to those of conventional AODV. Moreover, the results indicate that the adjusted probabilistic technique outperforms the fixed one on both of these metrics.
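
    A sketch of the rebroadcast decision at the heart of adjusted probabilistic flooding: instead of a fixed forwarding probability, the probability is lowered for nodes in dense neighborhoods (where redundant rebroadcasts are likely) and raised in sparse ones. The adjustment function and constants here are illustrative assumptions, not the paper's tuned values.

```python
import random

def should_rebroadcast(num_neighbors, avg_neighbors, p_base=0.6):
    """Adjusted probabilistic flooding: scale a fixed base rebroadcast
    probability by neighborhood density.  Nodes with many neighbors
    rebroadcast less often; nodes in sparse regions rebroadcast more."""
    if num_neighbors == 0:
        return False  # no one to reach
    p = min(1.0, p_base * avg_neighbors / num_neighbors)
    return random.random() < p

# A node with twice the average density forwards with probability ~0.3.
print(should_rebroadcast(num_neighbors=12, avg_neighbors=6))
```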