
    Logical Specification and Analysis of Fault Tolerant Systems through Partial Model Checking

    This paper presents a framework for a logical characterisation of fault tolerance and its formal analysis based on partial model checking techniques. The framework requires a fault tolerant system to be modelled using a formal calculus, here the CCS process algebra. To this aim we propose a uniform modelling scheme in which to specify a formal model of the system, its failing behaviour and possibly its fault-recovering procedures. Once a formal model is provided in our scheme, fault tolerance with respect to a given property can be formalised as an equational µ-calculus formula. This formula expresses, in a logical formalism, all the fault scenarios satisfying that fault tolerance property. Such a characterisation casts the analysis of fault tolerance as a form of analysis of open systems and, thanks to partial model checking strategies, it can be made independent of any particular fault assumption. Moreover, this logical characterisation makes it possible to express the fault-tolerance verification problem as a general µ-calculus validation problem, for which many theorem-proving techniques and tools are available. We present several analysis methods showing the flexibility of our approach.

    Service Security and Privacy as a Socio-Technical Problem: Literature review, analysis methodology and challenge domains

    Published online September 2015; accepted 15 September 2014. The security and privacy of the data that users transmit, more or less deliberately, to modern services is an open problem. It is not limited solely to the actual Internet traversal, a sub-problem vastly tackled by consolidated research in security protocol design and analysis. By contrast, it entails much broader dimensions pertaining to how users approach technology and understand the risks for the data they enter. For example, users may express cautious or distracted personas depending on the service and the point in time; further, pre-established paths of practice may lead them to neglect the intrusive privacy policy offered by one service, or the outdated protections adopted by another. The approach that sees the service security and privacy problem as a socio-technical one needs consolidation. With this motivation, the article makes a threefold contribution. It reviews the existing literature on service security and privacy, especially from the socio-technical standpoint. Further, it outlines a general research methodology aimed at layering the problem appropriately, at suggesting how to position existing findings, and ultimately at indicating where a transdisciplinary task force may fit in. The article concludes with a description of three challenge domains of services whose security and privacy we deem open socio-technical problems, not only due to their inherent facets but also due to their huge number of users.

    The Audit Logic: Policy Compliance in Distributed Systems

    We present a distributed framework where agents can share data along with usage policies. We use an expressive policy language including conditions, obligations and delegation. Our framework also supports the possibility to refine policies. Policies are not enforced a priori; instead, policy compliance is checked using an a posteriori auditing approach. Policy compliance is shown by a (logical) proof whose validity the authority can systematically check. Tools for automatically checking and generating proofs are also part of the framework.

    Optimal joint routing and link scheduling for real-time traffic in TDMA Wireless Mesh Networks

    We investigate the problem of joint routing and link scheduling in Time-Division Multiple Access (TDMA) Wireless Mesh Networks (WMNs) carrying real-time traffic. We propose a framework that always computes a feasible solution (i.e. a set of paths and link activations) if one exists, by optimally solving a mixed integer non-linear problem. Such a solution can be computed in minutes, or tens of minutes, for grids of up to 4x4 nodes. We also propose heuristics based on Lagrangian decomposition to compute suboptimal solutions considerably faster and/or for larger WMNs of up to about 50 nodes. We show that the heuristic solutions are near-optimal, and we exploit them to investigate the optimal placement of one or more gateways from a delay-bound perspective.

    Outlining the mission profile of agricultural tractors through CAN-BUS data analytics

    Tractor manufacturers need to know how farmers use their agricultural tractors in order to optimise machine design. Tractor usage is not easy to assess due to the large variability of field operations. However, modern tractors embed sensors integrated into the CAN-BUS network, and their data is accessible through the ISO 11783 protocol. Even though this technology has been available for a long time, the use of CAN-BUS data for outlining tractor usage is still limited, because a proper post-processing method is lacking. This study presents a novel classification scheme for CAN-BUS data that outlines tractor usage. A CAN-BUS data logger and a GNSS receiver were installed on a tractor, and real-world data were recorded for 579 h, so the data were obtained in the most realistic conditions. Tractor positions were classified using GIS layers, while operating conditions were classified according to the usage of the tractor's subsystems. The method was able to classify 97% of the logged data and showed that the tractor operated in the field in working, idle and moving duties for 65%, 18% and 16% of the time, respectively. The method allows a far more precise outline of tractor usage, opening opportunities to obtain large benefits from massively collected CAN-BUS data.
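    A classification scheme of this kind can be sketched as a simple rule-based post-processing step over the logged records. The signal names, thresholds and rules below are hypothetical illustrations, not the scheme actually used in the paper:

```python
# Hypothetical sketch of a rule-based duty classifier for logged CAN-BUS
# records. Signal names (ground_speed_kmh, pto_engaged, hitch_lowered,
# on_field) and thresholds are illustrative assumptions only.

def classify_duty(record):
    """Assign one duty label to a single logged record (a dict of signals)."""
    engine_on = record["engine_rpm"] > 0
    if not engine_on:
        return "off"
    # "working": on a field (per the GIS layer) with an active implement.
    if record["on_field"] and (record["pto_engaged"] or record["hitch_lowered"]):
        return "working"
    # "idle": engine running but the tractor is essentially stationary.
    if record["ground_speed_kmh"] < 0.5:
        return "idle"
    return "moving"

def duty_shares(records):
    """Fraction of records per duty, as in the 65%/18%/16% breakdown."""
    counts = {}
    for r in records:
        duty = classify_duty(r)
        counts[duty] = counts.get(duty, 0) + 1
    total = len(records)
    return {duty: n / total for duty, n in counts.items()}
```

    With fixed-rate logging, the share of records per duty approximates the share of time spent in that duty, which is how a 65/18/16% time breakdown can be read off the classified log.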

    Security Analysis of Parlay/OSA Framework

    This paper analyzes the security of the Trust and Security Management (TSM) protocol, an authentication protocol which is part of the Parlay/OSA Application Program Interfaces (APIs). Architectures based on Parlay/OSA APIs allow third-party service providers to develop new services that can access, in a controlled and secure way, the network capabilities offered by the network operator. The role of the TSM protocol, run by network gateways, is to authenticate the client applications trying to access and use the network capabilities on offer. For this reason, potential security flaws in its authentication strategy can lead to unauthorized use of the network, with evident damage to the operator and to the quality of the services. This paper shows how a rigorous formal analysis of TSM reveals serious weaknesses in the model describing its authentication procedure. This usually means that the original system (i.e., the TSM protocol itself) hides the same flaws. The paper describes the design of the formal model, the tool-aided verification performed and the security flaws discovered. This allows us to discuss how the security of the TSM protocol can be improved in general.

    A Formal Security Analysis of an OSA/Parlay Authentication Interface

    We report on an experience in analyzing the security of the Trust and Security Management (TSM) protocol, an authentication procedure within the OSA/Parlay Application Program Interfaces (APIs) of the Open Service Access and Parlay Group. The experience was conducted jointly by research institutes experienced in security and by industry experts in telecommunication networking. OSA/Parlay APIs are designed to enable the creation of telecommunication applications outside the traditional network space and business model. Network operators consider OSA/Parlay a promising architecture for stimulating the development of web service applications by third-party providers, which may not necessarily be experts in telecommunication and security. The TSM protocol is executed by the gateways to OSA/Parlay networks; its role is to authenticate client applications trying to access the interfaces of some object representing an offered network capability. For this reason, potential security flaws in the TSM authentication strategy can cause unauthorized use of the network, with evident damage to the operator and to the quality of services. We report a rigorous formal analysis of the TSM specification, which is originally given in UML. Furthermore, we illustrate our design choices to obtain the formal model, describe the tool-aided verification and finally expose the security flaws discovered.

    Coherent transfer of optical orbital angular momentum in multi-order Raman sideband generation

    Experimental results from the generation of Raman sidebands using optical vortices are presented. By generating two sets of sidebands originating from different locations in a Raman active crystal, one set containing optical orbital angular momentum and the other serving as a reference, a Young's double slit experiment was simultaneously realized for each sideband. The interference between the two sets of sidebands was used to determine the helicity and topological charge in each order. Topological charges in all orders were found to be discrete and follow selection rules predicted by a cascaded Raman process.

    Statistical design of personalized medicine interventions: The Clarification of Optimal Anticoagulation through Genetics (COAG) trial

    Background: There is currently much interest in pharmacogenetics: determining variation in genes that regulate drug effects, with a particular emphasis on improving drug safety and efficacy. The ability to determine such variation motivates the application of personalized drug therapies that utilize a patient's genetic makeup to determine a safe and effective drug at the correct dose. To ascertain whether a genotype-guided drug therapy improves patient care, a personalized medicine intervention may be evaluated within the framework of a randomized controlled trial. The statistical design of this type of personalized medicine intervention requires special considerations: the distribution of relevant allelic variants in the study population; and whether the pharmacogenetic intervention is equally effective across subpopulations defined by allelic variants. Methods: The statistical design of the Clarification of Optimal Anticoagulation through Genetics (COAG) trial serves as an illustrative example of a personalized medicine intervention that uses each subject's genotype information. The COAG trial is a multicenter, double blind, randomized clinical trial that will compare two approaches to initiation of warfarin therapy: genotype-guided dosing, the initiation of warfarin therapy based on algorithms using clinical information and genotypes for polymorphisms in CYP2C9 and VKORC1; and clinical-guided dosing, the initiation of warfarin therapy based on algorithms using only clinical information. Results: We determine an absolute minimum detectable difference of 5.49% based on an assumed 60% population prevalence of zero or multiple genetic variants in either CYP2C9 or VKORC1 and an assumed 15% relative effectiveness of genotype-guided warfarin initiation for those with zero or multiple genetic variants. Thus we calculate a sample size of 1238 to achieve a power level of 80% for the primary outcome. We show that reasonable departures from these assumptions may decrease statistical power to 65%. Conclusions: In a personalized medicine intervention, the minimum detectable difference used in sample size calculations is not a known quantity, but rather an unknown quantity that depends on the genetic makeup of the subjects enrolled. Given the possible sensitivity of sample size and power calculations to these key assumptions, we recommend that they be monitored during the conduct of a personalized medicine intervention. Trial Registration: clinicaltrials.gov NCT00839657.
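    The dependence of power on the design assumptions can be illustrated with a generic two-sample proportion power calculation. The COAG trial's actual primary outcome and design calculation differ from this sketch; the baseline rate and per-arm sample size below are purely hypothetical placeholders, and only the 5.49% absolute difference is taken from the abstract:

```python
# Generic power of a two-sided z-test comparing two proportions (normal
# approximation). Baseline rate p1 and n_per_arm are hypothetical
# placeholders; only the 5.49% absolute difference is from the abstract.
from math import sqrt
from statistics import NormalDist

def power_two_proportions(p1, p2, n_per_arm, alpha=0.05):
    """Approximate power of a two-sided test of H0: p1 == p2."""
    z_alpha = NormalDist().inv_cdf(1 - alpha / 2)
    p_bar = (p1 + p2) / 2                                     # pooled rate under H0
    se0 = sqrt(2 * p_bar * (1 - p_bar) / n_per_arm)           # SE under H0
    se1 = sqrt((p1 * (1 - p1) + p2 * (1 - p2)) / n_per_arm)   # SE under H1
    z = (abs(p2 - p1) - z_alpha * se0) / se1
    return NormalDist().cdf(z)

# Hypothetical: baseline 50%, detecting a 5.49% absolute difference.
power = power_two_proportions(0.50, 0.50 + 0.0549, n_per_arm=619)
```

    Because power rises with sample size and falls as the true detectable difference shrinks, a shift in the assumed prevalence of genetic variants moves the achievable power, which is why the abstract recommends monitoring these assumptions during the trial.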

    A multi-factorial analysis of response to warfarin in a UK prospective cohort

    Background: Warfarin is the most widely used oral anticoagulant worldwide, but it has a narrow therapeutic index which necessitates constant monitoring of anticoagulation response. Previous genome-wide studies have focused on identifying factors explaining variance in stable dose, but have not explored the initial patient response to warfarin, nor a wider range of clinical and biochemical factors affecting both initial and stable dosing with warfarin. Methods: A prospective cohort of 711 patients starting warfarin was followed up for 6 months with analyses focusing on both non-genetic and genetic factors. The outcome measures used were mean weekly warfarin dose (MWD), stable mean weekly dose (SMWD) and international normalised ratio (INR) > 4 during the first week. Samples were genotyped on the Illumina Human610-Quad chip. Statistical analyses were performed using Plink and R. Results: VKORC1 and CYP2C9 were the major genetic determinants of warfarin MWD and SMWD, with CYP4F2 having a smaller effect. Age, height, weight, cigarette smoking and interacting medications accounted for less than 20% of the variance. Our multifactorial analysis explained 57.89% and 56.97% of the variation for MWD and SMWD, respectively. Genotypes for VKORC1 and CYP2C9*3, age, height and weight, as well as other clinical factors such as alcohol consumption, loading dose and concomitant drugs were important for the initial INR response to warfarin. In a small subset of patients for whom data were available, levels of the coagulation factors VII and IX (highly correlated) also played a role. Conclusion: Our multifactorial analysis in a prospectively recruited cohort has shown that multiple factors, genetic and clinical, are important in determining the response to warfarin. VKORC1 and CYP2C9 genetic polymorphisms are the most important determinants of warfarin dosing, and it is highly unlikely that other common variants of clinical importance influencing warfarin dosage will be found. Both VKORC1 and CYP2C9*3 are important determinants of the initial INR response to warfarin. Other novel variants, which did not reach genome-wide significance, were identified for the different outcome measures, but need replication.
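    The "percentage of variation explained" reported in such analyses is the R² of a multifactorial linear model. A minimal sketch of how that figure is computed follows; the predictors, effect sizes and noise level are invented for illustration and are not the study's cohort values:

```python
# Sketch of "variance explained" (R^2) by a multifactorial linear model,
# fitted on synthetic data. All effect sizes and the noise level are
# invented for illustration; they are not the study's estimates.
import numpy as np

rng = np.random.default_rng(42)
n = 500
vkorc1 = rng.integers(0, 3, n).astype(float)   # hypothetical genotype dosage (0/1/2)
cyp2c9 = rng.integers(0, 3, n).astype(float)   # hypothetical genotype dosage (0/1/2)
age = rng.normal(60.0, 10.0, n)                # hypothetical clinical covariate
noise = rng.normal(0.0, 5.0, n)                # unexplained variation
dose = 35.0 - 8.0 * vkorc1 - 5.0 * cyp2c9 - 0.1 * age + noise

# Ordinary least squares fit and the resulting R^2.
X = np.column_stack([np.ones(n), vkorc1, cyp2c9, age])
beta, *_ = np.linalg.lstsq(X, dose, rcond=None)
residuals = dose - X @ beta
r_squared = 1.0 - residuals.var() / dose.var()
```

    Adding a predictor can only increase this quantity, which is one reason such studies report it for a fixed, pre-specified set of genetic and clinical factors.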