
    Transcranial magnetic stimulation disrupts the perception and embodiment of facial expressions

    Copyright © 2008 Society for Neuroscience and the authors. The Journal of Neuroscience uses a Creative Commons Attribution-NonCommercial-ShareAlike licence: http://creativecommons.org/licenses/by-nc-sa/4.0/.
    Theories of embodied cognition propose that recognizing facial expressions requires visual processing followed by simulation of the somatovisceral responses associated with the perceived expression. To test this proposal, we targeted the right occipital face area (rOFA) and the face region of right somatosensory cortex (rSC) with repetitive transcranial magnetic stimulation (rTMS) while participants discriminated facial expressions. rTMS selectively impaired discrimination of facial expressions at both sites but had no effect on a matched face identity task. Site specificity within the rSC was demonstrated by targeting rTMS at the face and finger regions while participants performed the expression discrimination task. rTMS targeted at the face region impaired task performance relative to rTMS targeted at the finger region. To establish the temporal course of visual and somatosensory contributions to expression processing, double-pulse TMS was delivered at different times to rOFA and rSC during expression discrimination. Accuracy dropped when pulses were delivered at 60–100 ms at rOFA and at 100–140 and 130–170 ms at rSC. These sequential impairments at rOFA and rSC support embodied accounts of expression recognition as well as hierarchical models of face processing. The results also demonstrate that nonvisual cortical areas contribute during early stages of expression processing.
    Biotechnology and Biological Sciences Research Council

    Preliminary galaxy extraction from DENIS images

    The extragalactic applications of NIR surveys are summarized, with a focus on the ability to map the interstellar extinction of our Galaxy. A very preliminary extraction of galaxies from a set of 180 consecutive images is presented, and the results illustrate some of the pitfalls in attempting a homogeneous extraction of galaxies from these wide-angle, shallow surveys.
    Comment: Invited talk at "The Impact of Large-Scale Near-IR Sky Surveys", meeting held in Tenerife, Spain, April 1996. 10 pages LaTeX with style file and 4 PS files included

    Coherent states for compact Lie groups and their large-N limits

    The first two parts of this article survey results related to the heat-kernel coherent states for a compact Lie group K. I begin by reviewing the definition of the coherent states, their resolution of the identity, and the associated Segal-Bargmann transform. I then describe related results, including connections to geometric quantization and (1+1)-dimensional Yang-Mills theory, the associated coherent states on spheres, and applications to quantum gravity. The third part of this article summarizes recent work of mine with Driver and Kemp on the large-N limit of the Segal-Bargmann transform for the unitary group U(N). A key result is the identification of the leading-order large-N behavior of the Laplacian on "trace polynomials."
    Comment: Submitted to the proceedings of the CIRM conference "Coherent states and their applications: A contemporary panorama."
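    As a reminder of the construction the abstract refers to (a sketch in standard notation, not taken from the article itself): for a compact Lie group K with Laplacian Δ_K, the Segal-Bargmann transform applies the heat operator and then analytically continues to the complexification K_ℂ,

```latex
(B_t f)(g) \;=\; \left(e^{t\Delta_K/2} f\right)(g), \qquad g \in K_{\mathbb{C}},
```

    and Hall's theorem states that B_t is unitary from L^2(K, dx) onto \mathcal{H}L^2(K_{\mathbb{C}}, \mu_t), the space of holomorphic functions square-integrable with respect to the heat kernel measure μ_t on K_ℂ. The coherent states are the functions implementing this transform via the reproducing kernel.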

    The impact of the ATLAS zero-lepton, jets and missing momentum search on a CMSSM fit

    Recent ATLAS data significantly extend the exclusion limits for supersymmetric particles. We examine the impact of these data on global fits of the constrained minimal supersymmetric standard model (CMSSM) to indirect and cosmological data. We calculate the likelihood map of the ATLAS search, taking into account systematic errors on the signal and on the background. We validate our calculation against the ATLAS determination of the 95% confidence level exclusion contours. A previous CMSSM global fit is then re-weighted by the likelihood map, which takes a bite out of the high-probability-density region of the global fit, pushing scalar and gaugino masses up.
    Comment: 16 pages, 7 figures. v2 has bigger figures and fixed typos. v3 has a clarified explanation of our handling of signal systematics
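    The re-weighting step described in the abstract is standard Bayesian importance re-weighting: each posterior sample from the earlier fit is multiplied by the likelihood of the new search. A minimal sketch, with an entirely hypothetical one-parameter likelihood map standing in for the ATLAS calculation:

```python
# Sketch of posterior re-weighting by a new likelihood map.
# All parameter names and numbers are illustrative, not from the paper.
import math

def reweight(samples, new_loglike):
    """Multiply each sample's weight by the new likelihood and renormalise.

    samples: list of (params, weight) pairs from the previous global fit.
    new_loglike: function params -> log-likelihood under the new search data.
    """
    updated = [(p, w * math.exp(new_loglike(p))) for p, w in samples]
    total = sum(w for _, w in updated)
    return [(p, w / total) for p, w in updated]

# Toy example: two CMSSM-like points; the hypothetical likelihood map
# penalises light scalar masses, mimicking an exclusion that pushes masses up.
samples = [({"m0": 200.0}, 0.5), ({"m0": 1000.0}, 0.5)]
loglike = lambda p: -(600.0 / p["m0"]) ** 2
posterior = reweight(samples, loglike)
```

    The light-mass point loses almost all of its weight, which is the "bite" taken out of the previously favoured region.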

    Suppression of electron spin decoherence in a quantum dot

    The dominant source of decoherence for an electron spin in a quantum dot is the hyperfine interaction with the surrounding bath of nuclear spins. The decoherence process may be slowed down by subjecting the electron spin to suitable sequences of external control pulses. We investigate the performance of a variety of dynamical decoupling protocols using exact numerical simulation. Emphasis is given to realistic pulse delays and the long-time limit, beyond the domain where available analytical approaches are guaranteed to work. Our results show that both deterministic and randomized protocols are capable of significantly prolonging the electron coherence time, even when using control-pulse separations substantially larger than expected from the upper cutoff frequency of the coupling spectrum between the electron and the nuclear spins. In a realistic parameter range, the total width of this coupling spectrum appears to be the physically relevant frequency scale affecting the overall quality of the decoupling.
    Comment: 8 pages, 3 figures. Invited talk at the XXXVII Winter Colloquium on the Physics of Quantum Electronics, Snowbird, Jan 2007. Submitted to J. Mod. Opt.
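    The basic mechanism of dynamical decoupling can be illustrated with a much cruder model than the paper's exact simulation: a qubit dephasing under a quasi-static random field (a stand-in for the slow nuclear bath) keeps its coherence far longer when periodic π pulses reverse the sign of the accumulated phase. The model and all numbers below are illustrative assumptions, not the paper's method:

```python
# Toy dephasing model: each trial draws a static random field b, the qubit
# accumulates phase b*t, and instantaneous pi pulses flip the sign of the
# accumulation. Coherence is the bath-averaged <cos(phase)>.
import math
import random

def coherence(pulse_interval, total_time, dt=0.01, trials=200, seed=1):
    rng = random.Random(seed)
    acc = 0.0
    for _ in range(trials):
        b = rng.gauss(0.0, 1.0)          # quasi-static random field (bath proxy)
        phase, sign, t = 0.0, 1.0, 0.0
        next_pulse = pulse_interval
        while t < total_time:
            phase += sign * b * dt       # free precession between pulses
            t += dt
            if t >= next_pulse:          # instantaneous pi pulse: reverse dephasing
                sign = -sign
                next_pulse += pulse_interval
        acc += math.cos(phase)
    return acc / trials

free = coherence(pulse_interval=1e9, total_time=4.0)   # effectively no pulses
echo = coherence(pulse_interval=0.5, total_time=4.0)   # periodic decoupling
```

    For a static field the refocusing is essentially perfect; the paper's point is what survives of this picture for a genuinely dynamical nuclear bath at long times and realistic pulse delays.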

    Synthesis of Recursive ADT Transformations from Reusable Templates

    Recent work has proposed a promising approach to improving the scalability of program synthesis by allowing the user to supply a syntactic template that constrains the space of potential programs. Unfortunately, creating templates often requires nontrivial effort from the user, which impedes the usability of the synthesizer. We present a solution to this problem in the context of recursive transformations on algebraic data types. Our approach relies on polymorphic synthesis constructs: a small but powerful extension to the language of syntactic templates, which makes it possible to define a program space in a concise and highly reusable manner while retaining the scalability benefits of conventional templates. This approach enables end users to reuse predefined templates from a library for a wide variety of problems with little effort. The paper also describes a novel optimization that further improves the performance and scalability of the system. We evaluated the approach on a set of benchmarks that most notably includes desugaring functions for lambda calculus, which force the synthesizer to discover Church encodings for pairs and boolean operations.
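    For reference, the Church encodings mentioned in the last sentence can be written down directly (here as Python lambdas, a hand-written illustration rather than the synthesizer's output): booleans are two-argument selectors, and a pair is a function that awaits a selector.

```python
# Church encodings of booleans and pairs as plain lambdas.
TRUE = lambda a: lambda b: a                   # selects its first argument
FALSE = lambda a: lambda b: b                  # selects its second argument
AND = lambda p: lambda q: p(q)(p)              # if p then q else p

PAIR = lambda x: lambda y: lambda f: f(x)(y)   # pair awaiting a selector
FST = lambda p: p(TRUE)                        # project the first component
SND = lambda p: p(FALSE)                       # project the second component

p = PAIR(1)(2)
```

    Discovering such encodings automatically is hard precisely because nothing in the desugaring specification names them; the synthesizer has to invent the selector discipline from the template alone.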

    Cytotoxic activity of Treponema denticola


    Adjusting for multiple prognostic factors in the analysis of randomised trials

    Background: When multiple prognostic factors are adjusted for in the analysis of a randomised trial, it is unclear (1) whether it is necessary to account for each of the strata formed by all combinations of the prognostic factors (stratified analysis) when randomisation has been balanced within each stratum (stratified randomisation), or whether adjusting for the main effects alone will suffice, and (2) what the best method of adjustment is in terms of type I error rate and power, irrespective of the randomisation method.
    Methods: We used simulation (1) to determine whether a stratified analysis is necessary after stratified randomisation, and (2) to compare different methods of adjustment in terms of power and type I error rate. We considered the following methods of analysis: adjusting for covariates in a regression model; adjusting for each stratum using either fixed or random effects; and the Mantel-Haenszel method or a stratified Cox model, depending on the outcome.
    Results: A stratified analysis is required after stratified randomisation to maintain correct type I error rates when (a) there are strong interactions between prognostic factors, and (b) there are approximately equal numbers of patients in each stratum. However, simulations based on real trial data found that type I error rates were unaffected by the method of analysis (stratified vs unstratified), indicating these conditions were not met in real datasets. Comparison of different analysis methods found that with small sample sizes and a binary or time-to-event outcome, most analysis methods lead to either inflated type I error rates or a reduction in power; the lone exception was a stratified analysis using random effects for strata, which gave nominal type I error rates and adequate power.
    Conclusions: It is unlikely that a stratified analysis is necessary after stratified randomisation except in extreme scenarios. Therefore, the method of analysis (accounting for the strata, or adjusting only for the covariates) will not generally need to depend on the method of randomisation used. Most methods of analysis work well with large sample sizes; however, treating strata as random effects should be the analysis method of choice with binary or time-to-event outcomes and a small sample size.
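    One of the stratum-accounting analyses named in the Methods, the Mantel-Haenszel estimator for a binary outcome, is simple enough to sketch directly. The 2x2 tables below are invented for illustration; both strata share a true within-stratum odds ratio of 2.25:

```python
# Mantel-Haenszel common odds ratio pooled over strata.
# Each stratum is a 2x2 table (a, b, c, d):
#   a = treated events,  b = treated non-events,
#   c = control events,  d = control non-events.

def mantel_haenszel_or(strata):
    num = sum(a * d / (a + b + c + d) for a, b, c, d in strata)
    den = sum(b * c / (a + b + c + d) for a, b, c, d in strata)
    return num / den

# Two hypothetical strata, each with within-stratum odds ratio
# (a*d)/(b*c) = 2.25; the pooled estimate recovers it.
tables = [(30, 20, 20, 30), (15, 10, 10, 15)]
or_mh = mantel_haenszel_or(tables)
```

    Because each stratum contributes its own table, the estimator accounts for the strata exactly in the sense the abstract contrasts with covariate-only adjustment.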

    Berezin-Type Operators on the Cotangent Bundle of a Nilpotent Group

    We define and study coherent states, a Berezin-Toeplitz quantization, and covariant symbols on the product of a connected, simply connected nilpotent Lie group and the dual of its Lie algebra. The starting point is a Weyl system codifying the natural canonical commutation relations of the system. The formalism is meant to complement the quantization of the cotangent bundle by pseudo-differential operators, to which it is connected in an explicit way. Some extensions are indicated, concerning τ-quantizations and variable magnetic fields.
    Comment: 21 pages