
    Changing a semantics: opportunism or courage?

    The generalized models for higher-order logics introduced by Leon Henkin, and their multiple offspring over the years, have become a standard tool in many areas of logic. Even so, discussion has persisted about their technical status, and perhaps even their conceptual legitimacy. This paper gives a systematic view of generalized model techniques, discusses what they mean in mathematical and philosophical terms, and presents a few technical themes and results about their role in algebraic representation, calibrating provability, lowering complexity, understanding fixed-point logics, and achieving set-theoretic absoluteness. We also show how thinking about Henkin's approach to the semantics of logical systems in this generality can yield new results, dispelling the impression of ad-hocness. This paper is dedicated to Leon Henkin, a deep logician who has changed the way we all work, while also being an always open, modest, and encouraging colleague and friend. (Comment: 27 pages. To appear in: The life and work of Leon Henkin: Essays on his contributions (Studies in Universal Logic), eds. Manzano, M., Sain, I., and Alonso, E., 201)
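
    For orientation, Henkin's generalized semantics can be sketched as follows (shown here for monadic second-order logic; this display is an editorial gloss, not notation from the paper): set quantifiers range over a designated family of subsets rather than the full power set,

        \mathfrak{M} = \langle D, \mathcal{R} \rangle, \quad \mathcal{R} \subseteq \mathcal{P}(D), \qquad
        \mathfrak{M} \models \exists X\,\varphi \iff \mathfrak{M} \models \varphi[X \mapsto R] \text{ for some } R \in \mathcal{R}.

    The standard semantics is the special case \mathcal{R} = \mathcal{P}(D); requiring only that \mathcal{R} be closed under comprehension yields the generalized models for which Henkin proved completeness.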

    New global stability estimates for monochromatic inverse acoustic scattering

    We give new global stability estimates for monochromatic inverse acoustic scattering. These estimates substantially improve the estimates of [P. Hähner, T. Hohage, SIAM J. Math. Anal., 33(3), 2001, 670-685] and can be considered a solution to an open problem formulated in the aforementioned work.
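
    Schematically (the precise norms, constants, and exponents are those of the paper; this display only shows the typical shape of such a bound), a global stability estimate controls the reconstruction error by a logarithmic modulus of the data error \delta:

        \| v_1 - v_2 \|_{L^\infty} \le C \, ( \ln( 3 + \delta^{-1} ) )^{-s}, \qquad s > 0,

    where v_1, v_2 are the unknown potentials and \delta measures the distance between their monochromatic scattering data; "improving" such an estimate means, for instance, establishing it with a larger admissible exponent s.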

    Modal Ω-Logic: Automata, Neo-Logicism, and Set-Theoretic Realism

    This essay examines the philosophical significance of Ω-logic in Zermelo-Fraenkel set theory with choice (ZFC). The duality between coalgebra and algebra permits Boolean-valued algebraic models of ZFC to be interpreted as coalgebras. The modal profile of Ω-logical validity can then be countenanced within a coalgebraic logic, and Ω-logical validity can be defined via deterministic automata. I argue that the philosophical significance of the foregoing is two-fold. First, because the epistemic and modal profiles of Ω-logical validity correspond to those of second-order logical consequence, Ω-logical validity is genuinely logical, and thus vindicates a neo-logicist conception of mathematical truth in the set-theoretic multiverse. Second, the foregoing provides a modal-computational account of the interpretation of mathematical vocabulary, adducing support for a realist conception of the cumulative hierarchy of sets.
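
    For reference, Ω-logical validity in Woodin's sense, which the essay presupposes, quantifies over set-sized Boolean-valued models:

        T \models_\Omega \varphi \quad \iff \quad \text{for every complete Boolean algebra } \mathbb{B} \text{ and every ordinal } \alpha: \ V_\alpha^{\mathbb{B}} \models T \ \Rightarrow \ V_\alpha^{\mathbb{B}} \models \varphi.

    These Boolean-valued structures V_\alpha^{\mathbb{B}} are exactly the models the essay proposes to read coalgebraically.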

    The Escherichia coli transcriptome mostly consists of independently regulated modules

    Underlying cellular responses is a transcriptional regulatory network (TRN) that modulates gene expression. A useful description of the TRN would decompose the transcriptome into targeted effects of individual transcriptional regulators. Here, we apply unsupervised machine learning to a diverse compendium of over 250 high-quality Escherichia coli RNA-seq datasets to identify 92 statistically independent signals that modulate the expression of specific gene sets. We show that 61 of these transcriptomic signals represent the effects of currently characterized transcriptional regulators. Condition-specific activation of signals is validated by exposure of E. coli to new environmental conditions. The resulting decomposition of the transcriptome provides: a mechanistic, systems-level, network-based explanation of responses to environmental and genetic perturbations; a guide to gene and regulator function discovery; and a basis for characterizing transcriptomic differences in multiple strains. Taken together, our results show that signal summation describes the composition of a model prokaryotic transcriptome.
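
    The abstract does not name the algorithm, but independent component analysis (ICA) is a canonical unsupervised method for extracting statistically independent signals from expression compendia; a minimal sketch under that assumption (dimensions are illustrative, not the paper's) could look like:

        # Minimal sketch, assuming ICA as the unsupervised method; the abstract
        # only says "unsupervised machine learning". Dimensions are made up.
        import numpy as np
        from sklearn.decomposition import FastICA

        rng = np.random.default_rng(0)
        X = rng.normal(size=(4000, 250))   # hypothetical: 4000 genes x 250 RNA-seq samples

        ica = FastICA(n_components=92, max_iter=1000, random_state=0)
        S = ica.fit_transform(X)           # genes x 92: gene weights per independent signal
        A = ica.mixing_                    # samples x 92: per-sample activity of each signal

        # Genes with large |weight| in a column of S define that signal's gene set;
        # the matching column of A traces condition-specific activation, and the
        # "signal summation" view is the reconstruction X ~ S @ A.T.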

    Global large carnivore conservation and international law

    International cooperation, including through international legal instruments, appears important for the conservation of large carnivores worldwide. This is due to, inter alia, the worrying conservation status and population trends of many large carnivore species; the importance of large carnivores for biodiversity conservation at large; their occurrence at low densities, with many populations extending across various countries; and the international nature of particular threats. For the 31 heaviest species in the order Carnivora, this study (i) documents to what extent existing international legal instruments contribute to large carnivore conservation, and (ii) identifies ways of optimizing their contribution in this regard. From this dual perspective, it reviews all global wildlife conservation treaties—Ramsar Wetlands Convention, World Heritage Convention, Convention on Trade in Endangered Species, Convention on Migratory Species (CMS), Convention on Biological Diversity (CBD)—and selected regional instruments, using standard international law research methodology. Results indicate that a substantial body of relevant international law already exists, whereas simultaneously there is clear potential for enhancing the contribution of international law to large carnivore conservation. Avenues for pursuing this include promotion of instruments’ effective implementation; clarification of their precise implications for large carnivore conservation; development of formal guidance; expansion of instruments’ scope in terms of species, sites and countries; and creation of new instruments. The CMS and CBD hold particular potential in some of these respects. The experiences being gained under European legal instruments constitute an interesting ‘laboratory’ regarding human coexistence with expanding large carnivore populations and transboundary cooperation at the (sub)population level.

    Radical anti-realism and substructural logics

    We first provide the outline of an argument in favour of a radical form of anti-realism premised on the need to comply with two principles, implicitness and immanence, when trying to frame assertability-conditions. It follows from the first principle that one ought to avoid explicit bounding of the length of computations, as is the case for some strict finitists, and look for structural weakening instead. In order to comply with the principle of immanence, one ought to take into account the difference between being able to recognize a proof when presented with one and being able to produce one, and thus avoid the idealization of our cognitive capacities that arises within Hilbert-style calculi. We then explore the possibility of weakening structural rules in order to comply with radical anti-realist strictures.
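
    Concretely, the structural rules at issue are, in standard sequent-calculus notation, weakening and contraction:

        \frac{\Gamma \vdash \Delta}{\Gamma, A \vdash \Delta} \ (\text{weakening})
        \qquad
        \frac{\Gamma, A, A \vdash \Delta}{\Gamma, A \vdash \Delta} \ (\text{contraction})

    Restricting or dropping such rules yields substructural logics (linear logic, for example, rejects both in their unrestricted form); contraction in particular is what allows a single premise to be reused arbitrarily often within a derivation.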

    Rapid changes in gene expression: DNA determinants of promoter regulation by the concentration of the transcription initiating NTP in Bacillus subtilis

    In bacteria, rapid changes in gene expression can be achieved by affecting the activity of RNA polymerase with small molecule effectors during transcription initiation. An important small molecule effector is the initiating nucleoside triphosphate (iNTP). At some promoters, an increasing iNTP concentration stimulates promoter activity, while a decreasing concentration has the opposite effect. Ribosomal RNA (rRNA) promoters from Gram-positive Bacillus subtilis are regulated by the concentration of their iNTP. Yet, the sequences of these promoters do not emulate the sequence characteristics of [iNTP]-regulated rRNA promoters of Gram-negative Escherichia coli. Here, we identified the 3′-promoter region, corresponding to the transcription bubble, as key for B. subtilis rRNA promoter regulation via the concentration of the iNTP. Within this region, the conserved −5T (3 bp downstream from the −10 hexamer) is required for this regulation. Moreover, we identified a second class of [iNTP]-regulated promoters in B. subtilis where the sequence determinants are not limited to the transcription bubble region. Overall, it seems that various sequence combinations can result in promoter regulation by [iNTP] in B. subtilis. Finally, this study demonstrates how the same type of regulation can be achieved with strikingly different promoter sequences in phylogenetically distant species.
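
    As a toy illustration of the coordinate convention (the helper below is hypothetical, not from the paper): promoter positions are numbered relative to the transcription start site (TSS), with the TSS at +1 and no position 0, so the determinant described above is a T at position −5:

        # Toy sketch (hypothetical helper, not from the paper).
        def has_minus5T(seq: str, tss_index: int) -> bool:
            """True if the base at promoter position -5 is T.

            seq:       promoter sequence, 5'->3'
            tss_index: 0-based index within seq of the +1 (TSS) base
            """
            # With +1 at tss_index and no position 0, -5 sits five bases upstream.
            return seq[tss_index - 5].upper() == "T"

        # -10 hexamer TATAAT at indices 10-15, the -5 T at index 18 (3 bp
        # downstream of the hexamer), and the TSS (+1) at index 23:
        promoter = "TTGACAAGCTTATAATGCTGCGTAGATG"
        print(has_minus5T(promoter, tss_index=23))   # True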

    Estimating the Impact of Adding C-Reactive Protein as a Criterion for Lipid Lowering Treatment in the United States

    BACKGROUND: There is growing interest in using C-reactive protein (CRP) levels to help select patients for lipid-lowering therapy, although this practice is not yet supported by evidence of benefit in a randomized trial. OBJECTIVE: To estimate the number of Americans potentially affected if a CRP criterion were adopted as an additional indication for lipid-lowering therapy. To provide context, we also determined how well current lipid-lowering guidelines are being implemented. METHODS: We analyzed nationally representative data to determine how many Americans age 35 and older meet current National Cholesterol Education Program (NCEP) treatment criteria (a combination of risk factors and their Framingham risk score). We then determined how many of the remaining individuals would meet criteria for treatment under 2 different CRP-based strategies: (1) narrow: treat individuals at intermediate risk (i.e., 2 or more risk factors and an estimated 10–20% risk of coronary artery disease over the next 10 years) with CRP > 3 mg/L and (2) broad: treat all individuals with CRP > 3 mg/L. DATA SOURCE: Analyses are based on the 2,778 individuals participating in the 1999–2002 National Health and Nutrition Examination Survey with complete data on cardiac risk factors, fasting lipid levels, CRP, and use of lipid-lowering agents. MAIN MEASURES: The estimated number and proportion of American adults meeting NCEP criteria who take lipid-lowering drugs, and the additional number who would become eligible based on CRP testing. RESULTS: About 53 of the 153 million Americans aged 35 and older meet current NCEP criteria (which do not involve CRP) for lipid-lowering treatment. Sixty-five percent of them, however, are not currently being treated; even among those at highest risk (i.e., patients with established heart disease or its risk equivalent), 62% are untreated. Adopting the narrow and broad CRP strategies would make an additional 2.1 and 25.3 million Americans eligible for treatment, respectively. The latter strategy would make over half the adults age 35 and older eligible for lipid-lowering therapy, with most of the additionally eligible (57%) coming from the lowest NCEP heart risk category (i.e., 0–1 risk factors). CONCLUSION: There is substantial underuse of lipid-lowering therapy among American adults at high risk for coronary disease. Rather than adopting CRP-based strategies, which would make millions more lower-risk patients eligible for treatment (and for whom treatment benefit has not yet been demonstrated in a randomized trial), we should ensure the treatment of currently defined high-risk patients for whom the benefit of therapy is established.
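
    The headline arithmetic can be checked directly from the figures quoted above (a back-of-the-envelope sketch; every number comes from the abstract and is in millions):

        # Back-of-the-envelope check of the abstract's figures (all in millions).
        adults_35_plus = 153.0
        ncep_eligible = 53.0     # meet current NCEP criteria (no CRP involved)
        untreated_share = 0.65   # fraction of the NCEP-eligible not treated
        added_narrow = 2.1       # newly eligible under the narrow CRP strategy
        added_broad = 25.3       # newly eligible under the broad CRP strategy

        print(f"NCEP-eligible but untreated: {ncep_eligible * untreated_share:.1f}M")
        # -> 34.5M already indicated for therapy yet going untreated

        print(f"Narrow CRP strategy: {ncep_eligible + added_narrow:.1f}M eligible")
        # -> 55.1M

        broad_total = ncep_eligible + added_broad
        print(f"Broad CRP strategy: {broad_total:.1f}M eligible "
              f"({broad_total / adults_35_plus:.0%} of adults 35+)")
        # -> 78.3M, i.e. 51% of 153M: the abstract's "over half"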