
    Turing jumps through provability

    Fixing some computably enumerable theory $T$, the Friedman-Goldfarb-Harrington (FGH) theorem says that over elementary arithmetic, each $\Sigma_1$ formula is equivalent to some formula of the form $\Box_T \varphi$, provided that $T$ is consistent. In this paper we give various generalizations of the FGH theorem. In particular, for $n>1$ we relate $\Sigma_n$ formulas to provability statements $[n]_T^{\sf True}\varphi$, which are a formalization of "provable in $T$ together with all true $\Sigma_{n+1}$ sentences". As a corollary we conclude that each $[n]_T^{\sf True}$ is $\Sigma_{n+1}$-complete. This observation leads us to consider a recursively defined hierarchy of provability predicates $[n+1]_T^\Box$ which look a lot like $[n+1]_T^{\sf True}$, except that where $[n+1]_T^{\sf True}$ calls upon the oracle of all true $\Sigma_{n+2}$ sentences, the $[n+1]_T^\Box$ recursively calls upon the oracle of all true sentences of the form $\langle n \rangle_T^\Box\phi$. As such we obtain a `syntax-light' characterization of $\Sigma_{n+1}$ definability, whence of Turing jumps, which is readily extended beyond the finite. Moreover, we observe that the corresponding provability predicates $[n+1]_T^\Box$ are well behaved in that together they provide a sound interpretation of the polymodal provability logic ${\sf GLP}_\omega$.
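
    To fix notation, the FGH theorem as summarized above can be written schematically as below. This is only a sketch of the statement as we read it from the abstract; the paper's precise formulation (choice of base theory, side conditions on $T$) may differ.

```latex
% Schematic FGH theorem (as summarized in the abstract): provided the
% computably enumerable theory T is consistent, every \Sigma_1 sentence
% \sigma is, over elementary arithmetic EA, equivalent to a T-provability
% statement for some sentence \varphi_\sigma.  (\Box needs amssymb.)
\[
  \mathrm{EA} + \mathrm{Con}(T) \;\vdash\;
    \sigma \;\leftrightarrow\; \Box_T\, \varphi_\sigma ,
  \qquad \sigma \in \Sigma_1 .
\]
% The paper's generalization replaces \Box_T by [n]_T^{True} ("provable in
% T together with all true \Sigma_{n+1} sentences"), which is accordingly
% \Sigma_{n+1}-complete.
```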

    The trauma film paradigm as an experimental psychopathology model of psychological trauma: intrusive memories and beyond

    A better understanding of psychological trauma is fundamental to clinical psychology. Following traumatic event(s), a clinically significant number of people develop symptoms, including those of Acute Stress Disorder and/or Post-Traumatic Stress Disorder. The trauma film paradigm offers an experimental psychopathology model to study both exposure and reactions to psychological trauma, including the hallmark symptom of intrusive memories. We reviewed 74 articles that have used this paradigm from the earliest review (Holmes & Bourne, 2008) until July 2014. Highlighting the different stages of trauma processing, i.e. pre-, peri- and post-trauma, the studies are divided according to manipulations before, during and after film viewing, for experimental as well as correlational designs. While the majority of studies focussed on the frequency of intrusive memories, other reactions to trauma were also modelled. We discuss the strengths and weaknesses of the trauma film paradigm as an experimental psychopathology model of trauma, consider ethical issues, and suggest future directions. By understanding the basic mechanisms underlying trauma symptom development, we can begin to translate findings from the laboratory to the clinic, test innovative science-driven interventions, and in the future reduce the debilitating effects of psychopathology following stressful and/or traumatic events.

    Maladaptation and the paradox of robustness in evolution

    Background. Organisms use a variety of mechanisms to protect themselves against perturbations. For example, repair mechanisms fix damage, feedback loops keep homeostatic systems at their setpoints, and biochemical filters distinguish signal from noise. Such buffering mechanisms are often discussed in terms of robustness, which may be measured by reduced sensitivity of performance to perturbations. Methodology/Principal Findings. I use a mathematical model to analyze the evolutionary dynamics of robustness in order to understand aspects of organismal design by natural selection. I focus on two characters: one character performs an adaptive task; the other character buffers the performance of the first character against perturbations. Increased perturbations favor enhanced buffering and robustness, which in turn decrease sensitivity and reduce the intensity of natural selection on the adaptive character. Reduced selective pressure on the adaptive character often leads to a less costly, lower performance trait. Conclusions/Significance. The paradox of robustness arises from evolutionary dynamics: enhanced robustness causes an evolutionary reduction in the adaptive performance of the target character, leading to a degree of maladaptation compared to what could be achieved by natural selection in the absence of robustness mechanisms. Over evolutionary time, buffering traits may become layered on top of each other, while the underlying adaptive traits become replaced by cheaper, lower performance components. The paradox of robustness has widespread implications for understanding organismal design.
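
    As a purely illustrative sketch (not the paper's actual model), the toy gradient dynamics below show the qualitative effect described above: as a buffering trait b rises, the marginal benefit of the adaptive trait p falls below its cost, and p settles at a cheaper, lower-performance value. All parameter names, values and functional forms here are assumptions chosen for illustration.

```python
# Toy caricature of the "paradox of robustness" (not the paper's model).
# Expected fitness: a performance shortfall (1 - p) plus a mean external
# perturbation eps both reduce fitness, but only in proportion to (1 - b),
# i.e. buffering b shields the organism. Both traits carry linear costs:
#   w(p, b) = -s*(1 - b)*((1 - p) + eps) - c_p*p - c_b*b
# Evolution is caricatured as bounded gradient ascent on w in both traits.

s, eps = 1.0, 0.5       # selection intensity, mean perturbation size (assumed)
c_p, c_b = 0.3, 0.1     # costs of performance and of buffering (assumed)
lr, steps = 0.01, 5000  # step size and number of iterations

p, b = 0.9, 0.0         # start well adapted (high p) and unbuffered (b = 0)
for _ in range(steps):
    dw_dp = s * (1.0 - b) - c_p            # marginal benefit of performance
    dw_db = s * ((1.0 - p) + eps) - c_b    # marginal benefit of buffering
    p = min(1.0, max(0.0, p + lr * dw_dp))
    b = min(1.0, max(0.0, b + lr * dw_db))

print(f"final performance p = {p:.2f}, final buffering b = {b:.2f}")
# Typical outcome: b climbs toward 1, the marginal benefit s*(1-b) drops
# below the cost c_p, and p decays from 0.9 -- enhanced robustness coexists
# with a lower-performance adaptive trait, as the abstract describes.
```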

    Realizability of the Lorentzian (n,1)-Simplex

    In a previous article [JHEP 1111 (2011) 072; arXiv:1108.4965] we have developed a Lorentzian version of the Quantum Regge Calculus in which the significant differences between simplices in Lorentzian signature and Euclidean signature are crucial. In this article we extend a central result used in the previous article, regarding the realizability of Lorentzian triangles, to arbitrary dimension. This technical step will be crucial for developing the Lorentzian model in the case of most physical interest: 3+1 dimensions. We first state (and derive in an appendix) the realizability conditions on the edge-lengths of a Lorentzian n-simplex in total dimension n=d+1, where d is the number of space-like dimensions. We then show that in any dimension there is a certain type of simplex which has all of its time-like edge lengths completely unconstrained by any sort of triangle inequality. This result is the d+1 dimensional analogue of the 1+1 dimensional case of the Lorentzian triangle. Comment: V1: 15 pages, 2 figures. V2: Minor clarifications added to Introduction and Discussion sections. 1 reference updated. This version accepted for publication in JHEP. V3: minor updates and clarifications; this version closely corresponds to the version published in JHEP.

    Wormhole Cosmic Censorship

    We analyze the properties of a Kerr-like wormhole supported by phantom matter, which is an exact solution of the Einstein-phantom field equations. It is shown that the solution has a naked ring singularity which is unreachable by null geodesics falling freely from the outside. Similarly to Roger Penrose's cosmic censorship conjecture, which states that all singularities in the Universe must be hidden behind event horizons, here we conjecture from our results that a naked singularity can also be fully protected by the intrinsic properties of a wormhole's throat.

    Autonomous decision-making against induced seismicity in deep fluid injections

    The rise in the frequency of anthropogenic earthquakes due to deep fluid injections is posing serious economic, societal, and legal challenges to geo-energy and waste-disposal projects. We propose an actuarial approach to mitigate this risk, first by defining an autonomous decision-making process based on an adaptive traffic light system (ATLS) to stop risky injections, and second by quantifying a "cost of public safety" based on the probability of an injection well being abandoned. The statistical model underlying the ATLS is first confirmed to be representative of injection-induced seismicity, with examples taken from past reservoir stimulation experiments (mostly from Enhanced Geothermal Systems, EGS). Then the decision strategy is formalized: being integrable, the model yields a closed-form ATLS solution that maps a risk-based safety standard or norm to an earthquake magnitude not to exceed during stimulation. Finally, the EGS levelized cost of electricity (LCOE) is reformulated in terms of null expectation, with the cost of an abandoned injection well included. We find that the price increase needed to mitigate the increased seismic risk in populated areas can counterbalance the heat credit. However, this "public safety cost" disappears if buildings are built to earthquake-resistant designs or if a more relaxed risk safety standard or norm is chosen. Comment: 8 pages, 4 figures, conference (International Symposium on Energy Geotechnics, 26-28 September 2018, Lausanne, Switzerland).
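
    As a hedged sketch of the kind of closed-form traffic-light rule described above, the snippet below maps a tolerated exceedance probability to a magnitude not to exceed during stimulation. It uses a common seismogenic-index / Gutenberg-Richter parameterization of injection-induced seismicity, which may differ from the paper's exact statistical model; the function name and all numerical values are illustrative assumptions, not figures from the paper.

```python
import math

def atls_magnitude_threshold(p_max, volume_m3, afb, b=1.0):
    """Magnitude not to exceed during stimulation, under an assumed
    seismogenic-index model: events of magnitude >= m follow a Poisson
    process with mean N(>=m) = V * 10**(afb - b*m), where V is the injected
    volume and afb the underground-feedback (seismogenic) activation.
    The safety norm requires P(at least one event >= m) <= p_max."""
    # P(>=1 event) = 1 - exp(-N)  =>  N <= -ln(1 - p_max)
    n_max = -math.log(1.0 - p_max)
    # Solve V * 10**(afb - b*m) = n_max for m.
    return (afb + math.log10(volume_m3) - math.log10(n_max)) / b

# Illustrative numbers only: a 30,000 m^3 stimulation, activation afb = -2.0,
# b-value 1.2, and a 1% tolerated chance of exceeding the resulting magnitude.
m_safe = atls_magnitude_threshold(p_max=0.01, volume_m3=3.0e4, afb=-2.0, b=1.2)
print(f"traffic-light threshold: stop if an event exceeds M{m_safe:.1f}")
```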

    A 'short walk' is longer before radiotherapy than afterwards: a qualitative study questioning the baseline and follow-up design

    Background. Numerous studies have indirectly demonstrated changes in the content of respondents' QoL appraisal process over time by revealing response-shift effects. This is the first known study to qualitatively examine the assumption of consistency in the content of the cognitive processes underlying QoL appraisal over time. Specific objectives are to examine whether the content of each distinct cognitive process underlying QoL appraisal is (dis)similar over time and whether patterns of (dis)similarity can be discerned across and within patients and/or items. Methods. We conducted cognitive think-aloud interviews with 50 cancer patients prior to and following radiotherapy to elicit the cognitive processes underlying the assessment of 7 EORTC QLQ-C30 items. Qualitative analysis of patients' responses at baseline and follow-up was carried out independently by 2 researchers by means of an analysis scheme based on the cognitive process models of Tourangeau et al. and Rapkin & Schwartz. Results. The interviews yielded 342 comparisons of baseline and follow-up responses, which were analyzed according to the five cognitive processes underlying QoL appraisal. The content of comprehension/frame of reference changed in 188 comparisons; retrieval/sampling strategy in 246; standards of comparison in 152; judgment/combinatory algorithm in 113; and reporting and response selection in 141 comparisons. Overall, in 322 comparisons of responses (94%) the content of at least one cognitive component changed over time. We could not discern patterns of (dis)similarity, since the content of each of the cognitive processes differed across and within patients and/or items. Additionally, differences found in the content of a cognitive process for one item were not found to influence dissimilarity in the content of that same cognitive process for the subsequent item. Conclusions. The assumption of consistency in the content of the cognitive processes underlying QoL appraisal over time was not supported by the cognitive processes described by the respondents. Additionally, we could not discern patterns of (dis)similarity across and within patients and/or items. In building on cognitive process models and the response-shift literature, this study contributes to a better understanding of patient-reported QoL appraisal over time.

    Quasi-normal frequencies: Key analytic results

    The study of exact quasi-normal modes [QNMs], and their associated quasi-normal frequencies [QNFs], has had a long and convoluted history - replete with many rediscoveries of previously known results. In this article we shall collect and survey a number of known analytic results, and develop several new analytic results - specifically we shall provide several new QNF results and estimates, in a form amenable for comparison with the extant literature. Apart from their intrinsic interest, these exact and approximate results serve as a backdrop and a consistency check on ongoing efforts to find general model-independent estimates for QNFs, and general model-independent bounds on transmission probabilities. Our calculations also provide yet another physics application of the Lambert W function. These ideas have relevance to fields as diverse as black hole physics (where they are related to the damped oscillations of astrophysical black holes, to greybody factors for the Hawking radiation, and to more speculative state-counting models for the Bekenstein entropy), to quantum field theory (where they are related to Casimir energies in unbounded systems), through to condensed matter physics (where one may literally be interested in an electron tunnelling through a physical barrier). Comment: V1: 29 pages; V2: Reformatted, 31 pages. Title changed to reflect major additions and revisions. Now describes exact QNFs for the double-delta potential in terms of the Lambert W function. V3: Minor edits for clarity. Four references added. No physics changes. Still 31 pages.
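
    Since the abstract highlights the Lambert W function (used there to express exact QNFs for the double-delta potential), the minimal sketch below shows the generic pattern of inverting a transcendental relation of the form z e^z = w with scipy. The specific right-hand side is an illustration of the tool only, not the paper's QNF formula.

```python
import numpy as np
from scipy.special import lambertw

# Generic pattern: equations of the form z * exp(z) = w are solved by the
# Lambert W function, z = W_k(w), one root per branch index k. Relations of
# this shape arise when imposing quasi-normal boundary conditions on simple
# potentials (e.g. delta-function barriers), which is why exact QNF results
# can sometimes be packaged in terms of W.
w = 0.3 + 0.1j                     # illustrative right-hand side (assumed)
for k in (0, -1, 1):               # a few branches of the multivalued W
    z = lambertw(w, k=k)
    residual = z * np.exp(z) - w   # check that z indeed solves z*exp(z) = w
    print(f"branch {k:+d}: z = {z:.6f}, |z*exp(z) - w| = {abs(residual):.2e}")
```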

    Tidal Forces in Reissner-Nordström Spacetimes

    We analyze the tidal forces produced in the spacetime of Reissner-Nordström black holes. We point out that the radial component of the tidal force changes sign just outside the event horizon if the charge-to-mass ratio is close to 1, unlike in the Schwarzschild spacetime of uncharged black holes, and that the angular component changes sign between the outer and inner horizons. We solve the geodesic deviation equations for bodies falling radially towards the charged black hole. We find, for example, that the radial component of the geodesic deviation vector starts decreasing inside the event horizon, unlike in the Schwarzschild case.
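
    For reference, the Reissner-Nordström tidal tensor components for a radially moving observer take the well-known form below (quoted from the standard literature rather than from the paper itself, up to sign and frame conventions); their zeros are consistent with the sign changes the abstract describes.

```latex
% Geodesic deviation (tidal) equations in the orthonormal frame of a
% radially infalling observer in Reissner-Nordstrom geometry, in
% geometrized units G = c = 1 (standard literature result).
\[
  \frac{D^2 \eta^{\hat r}}{D\tau^2}
    = \left( \frac{2M}{r^3} - \frac{3Q^2}{r^4} \right) \eta^{\hat r},
  \qquad
  \frac{D^2 \eta^{\hat\theta}}{D\tau^2}
    = -\left( \frac{M}{r^3} - \frac{Q^2}{r^4} \right) \eta^{\hat\theta},
\]
% and likewise for \eta^{\hat\varphi}. The radial component vanishes at
% r = 3Q^2/2M, which lies outside r_+ = M + \sqrt{M^2 - Q^2} precisely when
% Q/M > 2\sqrt{2}/3 \approx 0.94, i.e. for charge-to-mass ratios close to 1;
% the angular components vanish at r = Q^2/M, which lies between r_- and r_+.
```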

    Conformally rescaled spacetimes and Hawking radiation

    We study various derivations of Hawking radiation in conformally rescaled metrics. We focus on two important properties, the location of the horizon under a conformal transformation and its associated temperature. We find that the production of Hawking radiation cannot in all cases be associated with the trapping horizon, because its location is not invariant under a conformal transformation. We also find evidence that the temperature of the Hawking radiation should transform simply under a conformal transformation, being invariant for asymptotic observers in the limit that the conformal transformation factor is unity at their location. Comment: 22 pages, version submitted to journal.