
    Classical BI: Its Semantics and Proof Theory

    We present Classical BI (CBI), a new addition to the family of bunched logics, which originates in O'Hearn and Pym's logic of bunched implications BI. CBI differs from existing bunched logics in that its multiplicative connectives behave classically rather than intuitionistically (including, in particular, a multiplicative version of classical negation). At the semantic level, CBI-formulas have the normal bunched-logic reading as declarative statements about resources, but its resource models necessarily feature more structure than those for other bunched logics; principally, they satisfy the requirement that every resource has a unique dual. At the proof-theoretic level, a very natural formalism for CBI is provided by a display calculus à la Belnap, which can be seen as a generalisation of the bunched sequent calculus for BI. In this paper we formulate the aforementioned model theory and proof theory for CBI, and prove some fundamental results about the logic, most notably completeness of the proof theory with respect to the semantics. Comment: 42 pages, 8 figures.
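    The following sketch shows, in our own notation (a relational monoid of resources with a dual operation, not necessarily the paper's official definitions), the shape such semantic clauses take:

```latex
% Sketch: satisfaction in a resource model (R, \circ, e) in which every
% resource w has a unique dual -w. Notation is illustrative only.
w \models A \ast B \iff \exists w_1, w_2.\;
    w \in w_1 \circ w_2 \text{ and } w_1 \models A \text{ and } w_2 \models B \\
w \models I \iff w = e \\
w \models {\sim}A \iff -w \not\models A
% the dual is what makes the multiplicative negation behave classically:
% ~~A is equivalent to A
```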

    Trust economics feasibility study

    We believe that enterprises and other organisations currently lack sophisticated methods and tools to determine if and how IT changes should be introduced in an organisation such that objective, measurable goals are met. This is especially true when dealing with security-related IT decisions. We report on a feasibility study, Trust Economics, conducted to demonstrate that such a methodology can be developed. Assuming a deep understanding of the IT involved, the main components of our trust economics approach are: (i) assess the economic or financial impact of IT security solutions; (ii) determine how humans interact with or respond to IT security solutions; (iii) based on the above, use probabilistic and stochastic modelling tools to analyse the consequences of IT security decisions. In the feasibility study we apply the trust economics methodology to address how enterprises should protect themselves against accidental or malicious misuse of USB memory sticks, an acute problem in many industries.
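    As a toy illustration of step (iii), a Monte Carlo sketch might compare expected annual losses from USB-stick incidents with and without a mitigating control; every name and number below is a hypothetical placeholder, not a figure from the study:

```python
import random

def simulate_annual_loss(p_incident, loss_per_incident, n_staff, mitigation=1.0):
    """Toy stochastic model: each employee independently causes a USB
    incident with probability p_incident * mitigation per year.
    All parameter values are hypothetical illustrations."""
    incidents = sum(random.random() < p_incident * mitigation
                    for _ in range(n_staff))
    return incidents * loss_per_incident

def expected_loss(runs=10_000, **kwargs):
    # Crude Monte Carlo estimate of the mean annual loss.
    return sum(simulate_annual_loss(**kwargs) for _ in range(runs)) / runs

baseline = expected_loss(p_incident=0.02, loss_per_incident=50_000, n_staff=500)
mitigated = expected_loss(p_incident=0.02, loss_per_incident=50_000,
                          n_staff=500, mitigation=0.25)  # assumed 75% risk cut
print(f"expected annual loss: £{baseline:,.0f} baseline, £{mitigated:,.0f} mitigated")
```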

    A semantics for reductive logic and proof-search


    Portfolio-based appraisal: superficial or useful?

    This paper outlines the growing role played by performance appraisal within medical regulation, supported by learning portfolios. It investigates whether these are superficial or useful tools. In doing so, it argues that caution must be exercised in promoting such tools to help modernise medical regulatory frameworks.

    Why Separation Logic Works

    One might poetically muse that computers have the essence both of logic and machines. Through the case of the history of Separation Logic, we explore how this assertion is more than idle poetry. Separation Logic works because it merges the software engineer's conceptual model of a program's manipulation of computer memory with the logical model that interprets what sentences in the logic are true, and because it has a proof theory which aids in the crucial problem of scaling the reasoning task. Scalability is a central problem, and some would even say the central problem, in applications of logic in computer science. Separation Logic is an interesting case because of its widespread success in verification tools. For these two senses of model (the engineering/conceptual and the logical) to merge in a genuine sense, each must maintain its norms of use from its home discipline. When this occurs, both the logic and the engineering benefit greatly. Seeking this intersection of two different senses of model provides a strategy for how computer scientists and logicians may be successful. Furthermore, the history of Separation Logic for analysing programs provides a novel case for philosophers of science of how software engineers and computer scientists develop models and the components of such models. We provide three contributions: an exploration of the extent of model merging that is necessary for success in computer science; an introduction to the technical details of Separation Logic, which can be used for reasoning about other exhaustible resources; and an introduction to (a subset of) the problems, processes, and results of computer scientists for those outside the field.
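    The proof-theoretic aid to scaling mentioned here is, in standard presentations of Separation Logic (not necessarily this paper's exact formulation), the frame rule: behaviour verified on the memory footprint a command actually touches extends unchanged to any disjoint resource:

```latex
% Frame rule: local reasoning about command C lifts to a larger heap,
% provided C does not modify variables free in the frame R.
\frac{\{P\}\; C\; \{Q\}}{\{P \ast R\}\; C\; \{Q \ast R\}}
```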

    The homeobox transcription factor Even-skipped regulates acquisition of electrical properties in Drosophila neurons.

    BACKGROUND: While developmental processes such as axon pathfinding and synapse formation have been characterized in detail, comparatively less is known of the intrinsic developmental mechanisms that regulate transcription of ion channel genes in embryonic neurons. Early decisions, including motoneuron axon targeting, are orchestrated by a cohort of transcription factors that act together in a combinatorial manner. These transcription factors include Even-skipped (Eve), islet and Lim3. The perdurance of these factors in late embryonic neurons is, however, indicative that they might also regulate additional aspects of neuron development, including the acquisition of electrical properties. RESULTS: To test the hypothesis that a combinatorial-code transcription factor is also able to influence the acquisition of electrical properties in embryonic neurons, we utilized the molecular genetics of Drosophila to manipulate the expression of Eve in identified motoneurons. We show that increasing expression of this transcription factor in two Eve-positive motoneurons (aCC and RP2) is indeed sufficient to affect the electrical properties of these neurons in early first-instar larvae. Specifically, we observed a decrease in both the fast K+ conductance (IKfast) and the amplitude of quantal cholinergic synaptic input. We used charybdotoxin to pharmacologically separate the individual components of IKfast and show that increased Eve specifically downregulates the Slowpoke (a BK Ca2+-gated potassium channel), but not the Shal, component of this current. Identification of target genes for Eve, using DNA adenine methyltransferase identification (DamID), revealed strong binding sites in slowpoke and nAcRalpha-96Aa (a nicotinic acetylcholine receptor subunit). Verification using real-time PCR shows that pan-neuronal expression of eve is sufficient to repress transcripts for both slo and nAcRalpha-96Aa. CONCLUSION: Taken together, our findings demonstrate, for the first time, that Eve is sufficient to regulate both voltage- and ligand-gated currents in motoneurons, extending its known repertoire of action beyond its already characterized role in axon guidance. Our data are also consistent with a common developmental program that utilizes a defined set of transcription factors to determine both morphological and functional neuronal properties.

    Resilience in Information Stewardship

    Information security is concerned with protecting the confidentiality, integrity, and availability of information systems. System managers deploy their resources with the aim of maintaining target levels of these attributes in the presence of reactive threats. Information stewardship is the challenge of maintaining the sustainability and resilience of the security attributes of (complex, interconnected, multi-agent) information ecosystems. In this paper we present, in the tradition of public economics, a model of stewardship which addresses directly the question of resilience. We model attacker-target-steward behaviour in a fully endogenous Nash equilibrium setting. We analyse the occurrence of externalities across targets and assess the steward's ability to internalize these externalities under varying informational assumptions. We apply and simulate this model in the case of a critical national infrastructure example.
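    As a toy illustration of the kind of equilibrium analysis described (the steward is omitted and all payoffs are hypothetical, so this is a crude stand-in for the paper's fully endogenous model, not a reproduction of it), iterated best response on a two-by-two attacker-target game finds a pure-strategy Nash equilibrium:

```python
import numpy as np

# Rows: defender invests {0: low, 1: high}; cols: attacker effort {0: low, 1: high}.
# All payoffs are hypothetical illustration values.
defender_payoff = np.array([[ 0, -10],
                            [-2,  -3]])
attacker_payoff = np.array([[0, 8],
                            [0, 1]])

d, a = 0, 0                                    # initial pure strategies
for _ in range(20):                            # iterate towards a fixed point
    d = int(np.argmax(defender_payoff[:, a]))  # defender best response
    a = int(np.argmax(attacker_payoff[d, :]))  # attacker best response
print(f"pure-strategy equilibrium candidate: defender={d}, attacker={a}")
# Here the iteration settles at (1, 1): the defender invests heavily and the
# attacker still attacks, since neither gains by deviating unilaterally.
```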

    A Rapid Drug Resistance Genotyping Workflow for Mycobacterium tuberculosis, Using Targeted Isothermal Amplification and Nanopore Sequencing

    Phenotypic drug susceptibility testing (DST) for tuberculosis (TB) requires weeks to yield results. Although molecular tests rapidly detect drug resistance-associated mutations (DRMs), they are not scalable to cover the full genome and the many DRMs that can predict resistance. Whole-genome sequencing (WGS) methods are scalable, but if conducted directly on sputum, typically require a target enrichment step, such as nucleic acid amplification. We developed a targeted isothermal amplification-nanopore sequencing workflow for rapid prediction of drug resistance of TB isolates. We used recombinase polymerase amplification (RPA) to perform targeted isothermal amplification (37°C for 90 min) of three regions within the Mycobacterium tuberculosis genome, followed by nanopore sequencing on the MinION. We tested 29 mycobacterial genomic DNA extracts from patients with drug-resistant (DR) TB and compared our results to those of WGS by Illumina and phenotypic DST to evaluate the accuracy of prediction of resistance to rifampin and isoniazid. Amplification by RPA showed fidelity equivalent to that of high-fidelity PCR (100% concordance). Nanopore sequencing generated DRM predictions identical to those of WGS, with considerably faster sequencing run times of minutes rather than days. The sensitivity and specificity of rifampin resistance prediction for our workflow were 96.3% (95% confidence interval [CI], 81.0 to 99.9%) and 100.0% (95% CI, 15.8 to 100.0%), respectively. For isoniazid resistance prediction, the sensitivity and specificity were 100.0% (95% CI, 86.3 to 100.0%) and 100.0% (95% CI, 39.8 to 100.0%), respectively. The workflow consumable costs per sample are less than £100. Our rapid and low-cost drug resistance genotyping workflow provides accurate prediction of rifampin and isoniazid resistance, making it appropriate for use in resource-limited settings.
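    The confidence intervals quoted above have the form of exact (Clopper-Pearson) binomial intervals. If, for illustration, the 96.3% sensitivity corresponds to 26 of 27 resistant isolates correctly called (our inference from the reported percentages, not a count quoted in the abstract), the 81.0 to 99.9% interval is reproduced as follows:

```python
from scipy.stats import beta

def exact_ci(successes, n, conf=0.95):
    """Clopper-Pearson exact binomial confidence interval."""
    alpha = 1 - conf
    lo = beta.ppf(alpha / 2, successes, n - successes + 1) if successes > 0 else 0.0
    hi = beta.ppf(1 - alpha / 2, successes + 1, n - successes) if successes < n else 1.0
    return lo, hi

# Hypothetical 26/27 split consistent with the reported 96.3% sensitivity:
print([f"{100 * x:.1f}%" for x in exact_ci(26, 27)])  # -> ['81.0%', '99.9%']
```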

    Contagion in cybersecurity attacks

    Systems security is essential for the efficient operation of all organizations. Indeed, most large firms employ a designated 'Chief Information Security Officer' to coordinate the operational aspects of the organization's information security. Part of this role is planning investment responses to information security threats against the firm's corporate network infrastructure. To this end, we develop and estimate a vector equation system of threats to 10 important IP services, using industry-standard SANS data on threats to various components of a firm's information system over the period January 2003 – February 2011. Our results reveal strong evidence of contagion between such attacks, with attacks on ssh and Secure Web Server indicating increased attack activity on other ports. Security managers who ignore such contagious inter-relationships may underestimate the underlying risk to their systems' security attributes, such as sensitivity and criticality, and thus delay appropriate information security investments.
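    A "vector equation system" of this kind is, in standard econometric practice, a vector autoregression (VAR). A minimal sketch of how such cross-port contagion can be estimated, run here on synthetic stand-in data rather than the SANS series (so, unlike in the paper, no contagion should be detected), might look like:

```python
import numpy as np
import pandas as pd
from statsmodels.tsa.api import VAR

# Synthetic stand-in for the monthly SANS attack-count series,
# Jan 2003 - Feb 2011 (98 months); port names are illustrative.
rng = np.random.default_rng(0)
idx = pd.date_range("2003-01", periods=98, freq="MS")
data = pd.DataFrame(rng.poisson(50, size=(98, 3)).astype(float),
                    index=idx, columns=["ssh", "https", "smtp"])

results = VAR(data).fit(maxlags=4, ic="aic")  # lag order chosen by AIC
# Granger-style contagion check: do past ssh attacks help predict https?
print(results.test_causality("https", ["ssh"]).summary())
```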

    A Focused Sequent Calculus Framework for Proof Search in Pure Type Systems

    Basic proof-search tactics in logic and type theory can be seen as the root-first applications of rules in an appropriate sequent calculus, preferably without the redundancies generated by permutation of rules. This paper addresses the issues of defining such sequent calculi for Pure Type Systems (PTS, which were originally presented in natural deduction style) and then organizing their rules for effective proof-search. We introduce the idea of a Pure Type Sequent Calculus with meta-variables (PTSCα), by enriching the syntax of a permutation-free sequent calculus for propositional logic due to Herbelin, which is strongly related to natural deduction and already well adapted to proof-search. The operational semantics is adapted from Herbelin's and is defined by a system of local rewrite rules as in cut-elimination, using explicit substitutions. We prove confluence for this system. Restricting our attention to PTSC, a type system for the ground terms of this system, we obtain the Subject Reduction property and show that each PTSC is logically equivalent to its corresponding PTS, and the former is strongly normalising iff the latter is. We show how to make the logical rules of PTSC into a syntax-directed system PS for proof-search, by incorporating the conversion rules as in syntax-directed presentations of the PTS rules for type-checking. Finally, we consider how to use the explicitly scoped meta-variables of PTSCα to represent partial proof-terms, and use them to analyse interactive proof construction. This sets up a framework PE in which we are able to study proof-search strategies, type inhabitant enumeration and (higher-order) unification.
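    As a rough illustration of root-first, permutation-free proof search of the kind the paper builds on (a toy propositional fragment only, nothing like the full PTSC machinery), a depth-bounded searcher for minimal implicational logic can alternate right rules with focused decomposition of one hypothesis:

```python
# Formulas: atoms are strings; ("->", a, b) is the implication a -> b.

def prove(ctx, goal, depth=12):
    """Root-first search: apply right rules until the goal is atomic,
    then pick a hypothesis to put in focus. Depth-bounded, so search
    always terminates (possibly answering False on provable goals)."""
    if depth == 0:
        return False
    if isinstance(goal, tuple):          # right rule for implication
        _, a, b = goal
        return prove(ctx | {a}, b, depth - 1)
    return any(focus(ctx, h, goal, depth - 1) for h in ctx)

def focus(ctx, h, goal, depth):
    """Decompose the focused hypothesis h along its spine."""
    if h == goal:                        # axiom closes the branch
        return True
    if isinstance(h, tuple) and depth > 0:
        _, a, b = h                      # left rule: prove a, keep focus on b
        return prove(ctx, a, depth - 1) and focus(ctx, b, goal, depth - 1)
    return False

imp = lambda a, b: ("->", a, b)
# Peirce's law ((p -> q) -> p) -> p is not provable in minimal logic:
print(prove(frozenset(), imp(imp(imp("p", "q"), "p"), "p")))    # False
# ...but p -> ((p -> q) -> q) is:
print(prove(frozenset(), imp("p", imp(imp("p", "q"), "q"))))    # True
```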