
    Bridging Computational Notions of Depth

    In this article, we study the relationship between notions of depth for sequences, namely Bennett's notions of strong and weak depth, and deep Π⁰₁ classes, introduced by the authors and motivated by previous work of Levin. For the first main result of the study, we show that every member of a deep Π⁰₁ class is order-deep, a property that implies strong depth. From this result, we obtain new examples of strongly deep sequences based on properties studied in computability theory and algorithmic randomness. We further show that not every strongly deep sequence is a member of a deep Π⁰₁ class. For the second main result, we show that the collection of strongly deep sequences is negligible, which is equivalent to the statement that the probability of computing a strongly deep sequence with some random oracle is 0, a property also shared by every deep Π⁰₁ class. Finally, we show that variants of strong depth, given in terms of a priori complexity and monotone complexity, are equivalent to weak depth.

    First-order rewritability and containment of conjunctive queries in Horn description logics

    We study FO-rewritability of conjunctive queries in the presence of ontologies formulated in a description logic between EL and Horn-SHIF, along with related query containment problems. Apart from providing characterizations, we establish complexity results ranging from EXPTIME via NEXPTIME to 2EXPTIME, pointing out several interesting effects. In particular, FO-rewriting is more complex for conjunctive queries than for atomic queries when inverse roles are present, but not otherwise.

    Constructive Dimension and Turing Degrees

    This paper examines the constructive Hausdorff and packing dimensions of Turing degrees. The main result is that every infinite sequence S with constructive Hausdorff dimension dim_H(S) and constructive packing dimension dim_P(S) is Turing equivalent to a sequence R with dim_H(R) >= (dim_H(S) / dim_P(S)) - epsilon, for arbitrary epsilon > 0. Furthermore, if dim_P(S) > 0, then dim_P(R) >= 1 - epsilon. The reduction thus serves as a *randomness extractor* that increases the algorithmic randomness of S, as measured by constructive dimension. A number of applications of this result shed new light on the constructive dimensions of Turing degrees. A lower bound of dim_H(S) / dim_P(S) is shown to hold for the constructive Hausdorff dimension of the Turing degree of any sequence S. A new proof is given of a previously known zero-one law for the constructive packing dimension of Turing degrees. It is also shown that, for any regular sequence S (that is, dim_H(S) = dim_P(S)) such that dim_H(S) > 0, the Turing degree of S has constructive Hausdorff and packing dimension equal to 1. Finally, it is shown that no single Turing reduction can be a universal constructive Hausdorff dimension extractor, and that bounded Turing reductions cannot extract constructive Hausdorff dimension. We also exhibit sequences on which weak truth-table and bounded Turing reductions differ in their ability to extract dimension.

    Comment: The version of this paper appearing in Theory of Computing Systems, 45(4):740-755, 2009, had an error in the proof of Theorem 2.4, due to insufficient care with the choice of delta. This version modifies that proof to fix the error.
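    The constructive dimensions in this abstract are defined via Kolmogorov complexity: dim_H(S) is the liminf, and dim_P(S) the limsup, of K(S↾n)/n over prefixes of S. Since K is uncomputable, the following is only an illustrative sketch: it uses zlib compression as a crude, computable stand-in for K to compare the information density of finite bit strings (the function name and the zlib substitution are assumptions of this sketch, not anything from the paper).

```python
import random
import zlib

def dimension_estimate(bits: str) -> float:
    """Crude analogue of one term K(S[:n]) / n of the dimension ratio,
    with zlib compressed length standing in for the (uncomputable)
    Kolmogorov complexity K."""
    data = bits.encode()
    return len(zlib.compress(data, 9)) / len(data)

# A highly regular sequence compresses extremely well ...
regular = "01" * 4000

# ... while a pseudorandom one resists compression much more.
random.seed(0)
noisy = "".join(random.choice("01") for _ in range(8000))

print(dimension_estimate(regular) < dimension_estimate(noisy))  # True
```

    Note that zlib works at the byte level, so even a random-looking bit string written as ASCII still compresses by roughly a factor of eight; only the relative comparison between strings is meaningful here, not the absolute values.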

    Cell-Type-Specific Recruitment of Amygdala Interneurons to Hippocampal Theta Rhythm and Noxious Stimuli In Vivo

    Neuronal synchrony in the basolateral amygdala (BLA) is critical for emotional behavior. Coordinated theta-frequency oscillations between the BLA and the hippocampus and precisely timed integration of salient sensory stimuli in the BLA are involved in fear conditioning. We characterized GABAergic interneuron types of the BLA and determined their contribution to shaping these network activities. Using in vivo recordings in rats combined with the anatomical identification of neurons, we found that the firing of BLA interneurons associated with network activities was cell type specific. The firing of calbindin-positive interneurons targeting dendrites was precisely theta-modulated, but other cell types were heterogeneously modulated, including parvalbumin-positive basket cells. Salient sensory stimuli selectively triggered firing of axo-axonic cells and inhibited firing of a distinct projecting interneuron type. Thus, GABA is released onto BLA principal neurons in a time-, domain-, and sensory-specific manner. These specific synaptic actions likely cooperate to promote amygdalo-hippocampal synchrony involved in emotional memory formation.

    Idiopathic hypertrophic pachymeningitis presenting with occipital neuralgia

    Background: Although occipital neuralgia is usually caused by degenerative arthropathy, nearly 20 other aetiologies may lead to this condition.

    Methods: We present the first case report of hypertrophic pachymeningitis revealed by isolated occipital neuralgia.

    Results and conclusions: Idiopathic hypertrophic pachymeningitis is a plausible cause of occipital neuralgia and may present without cranial-nerve palsy. There is no consensus on the treatment for idiopathic hypertrophic pachymeningitis, but the usual approach is to start corticotherapy and then to add immunosuppressants. When occipital neuralgia is not clinically isolated or when a first-line treatment fails, another diagnosis should be considered. However, the cost-effectiveness of extended investigations needs to be considered.

    Keywords: neuralgia/pathology, meningitis, neck pain/aetiology, review

    Computable randomness is about more than probabilities

    We introduce a notion of computable randomness for infinite sequences that generalises the classical version in two important ways. First, our definition of computable randomness is associated with imprecise probability models, in the sense that we consider lower expectations (or sets of probabilities) instead of classical 'precise' probabilities. Secondly, instead of binary sequences, we consider sequences whose elements take values in some finite sample space. Interestingly, we find that every sequence is computably random with respect to at least one lower expectation, and that lower expectations that are more informative have fewer computably random sequences. This leads to the intriguing question of whether every sequence is computably random with respect to a unique most informative lower expectation. We study this question in some detail and provide a partial answer.
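    The classical, 'precise' notion being generalised here is standardly characterised via betting strategies: a binary sequence is computably random for the fair-coin measure if and only if no computable martingale grows its capital unboundedly along the sequence. A minimal sketch of one such strategy, under that standard formulation (the function name and example sequences are illustrative, not taken from the paper):

```python
def martingale_capital(bits: str, bet_fraction: float = 0.5) -> float:
    """Capital of a simple computable betting strategy that always
    wagers a fixed fraction of its current capital on the next bit
    being '0'.  Fair-coin payoff: a winning stake doubles, a losing
    stake is lost, so capital is multiplied by 1.5 on a '0' and by
    0.5 on a '1' (with bet_fraction = 0.5)."""
    capital = 1.0
    for b in bits:
        stake = bet_fraction * capital
        capital = capital - stake + (2 * stake if b == "0" else 0.0)
    return capital

# On the all-zeros sequence the capital grows exponentially, witnessing
# that 000... is not computably random for the fair coin ...
print(martingale_capital("0" * 20))   # 1.5 ** 20

# ... while on an alternating sequence this particular strategy loses.
print(martingale_capital("01" * 10))  # (1.5 * 0.5) ** 10 < 1
```

    Under the imprecise-probability generalisation studied in the paper, fairness of such bets is judged against a lower expectation rather than a single probability measure, which is what lets every sequence be random with respect to at least one model.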