
    A Probabilistic Model for Estimating the Depth and Threshold Temperature of C-fiber Nociceptors

    The subjective experience of thermal pain follows the detection and encoding of noxious stimuli by primary afferent neurons called nociceptors. However, nociceptor morphology has been hard to access and the mechanisms of signal transduction remain unresolved. In order to understand how heat transducers in nociceptors are activated in vivo, it is important to estimate the temperatures that directly activate the skin-embedded nociceptor membrane. Hence, the nociceptor’s temperature threshold must be estimated, which in turn depends on the depth at which transduction happens in the skin. Since the temperature at the receptor cannot be accessed experimentally, such an estimation can currently only be achieved through modeling. However, the current state-of-the-art model for estimating the temperature at the receptor cannot account for the natural stochastic variability of neuronal responses. We improve this model using a probabilistic approach that accounts for uncertainties and potential noise in the system. Using a data set of 24 C-fibers recorded in vitro, we show that, even without detailed knowledge of the bio-thermal properties of the system, the probabilistic model we propose here is capable of providing estimates of threshold and depth in cases where the classical method fails.
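    To make the estimation idea concrete, here is a minimal sketch of a grid-based probabilistic estimate of depth and threshold from response latencies at different surface heating rates. It is not the authors' model: the heat-transfer approximation (a fixed propagation speed of the thermal front), the noise level, the flat priors, and all numbers below are illustrative assumptions.

```python
# Minimal sketch (illustrative assumptions, not the article's model):
# Bayesian grid estimation of nociceptor depth and temperature threshold
# from response latencies measured at several surface heating rates.
import numpy as np

T0 = 32.0          # baseline skin temperature (deg C), assumed
V_THERMAL = 0.5    # assumed propagation speed of heating into the skin (mm/s)
SIGMA = 0.15       # assumed Gaussian noise on response latencies (s)

def predicted_latency(depth_mm, threshold_c, ramp_rate):
    """Time until the membrane at `depth_mm` reaches `threshold_c`
    when the surface is heated at `ramp_rate` deg C per second."""
    return depth_mm / V_THERMAL + (threshold_c - T0) / ramp_rate

def posterior(latencies, ramp_rates, depths, thresholds):
    """Unnormalised posterior over a (depth, threshold) grid, flat priors."""
    post = np.ones((len(depths), len(thresholds)))
    for lat, rate in zip(latencies, ramp_rates):
        pred = predicted_latency(depths[:, None], thresholds[None, :], rate)
        post *= np.exp(-0.5 * ((lat - pred) / SIGMA) ** 2)
    return post / post.sum()

# Illustrative use: latencies from repeated ramps at two heating rates.
depths = np.linspace(0.05, 0.6, 100)        # mm
thresholds = np.linspace(38.0, 48.0, 100)   # deg C
p = posterior([8.1, 8.3, 4.2, 4.4], [1.5, 1.5, 3.0, 3.0], depths, thresholds)
d_map, t_map = np.unravel_index(p.argmax(), p.shape)
print("MAP depth %.2f mm, threshold %.1f deg C" % (depths[d_map], thresholds[t_map]))
```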

    From perception to action: phase-locked gamma oscillations correlate with reaction times in a speeded response task

    Background: Phase-locked gamma oscillations have so far mainly been described in relation to perceptual processes such as sensation, attention, or memory matching. Due to their very short latency (≈90 ms), such oscillations are a plausible candidate for very rapid integration of sensory and motor processes. Results: We measured EEG in 13 healthy participants in a speeded reaction task. Participants had to press a button as fast as possible whenever a visual stimulus was presented. The stimulus was always identical and did not have to be discriminated from other possible stimuli. In trials in which the participants showed a fast response, we observed a slow negative potential over central electrodes starting approximately 800 ms before the response and highly phase-locked gamma oscillations over central and posterior electrodes between 90 and 140 ms after the stimulus. In trials in which the participants showed a slow response, no slow negative potential was observed and phase-locked gamma oscillations were significantly reduced. Furthermore, in slow response trials the phase-locked gamma oscillations were significantly delayed relative to fast response trials. Conclusion: These results indicate the relevance of phase-locked gamma oscillations for very fast (not necessarily detailed) integration processes.
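    Phase-locked ("evoked") gamma activity of this kind is typically quantified as the consistency of oscillatory phase across trials. The sketch below shows one common way to do this, via a Morlet-wavelet transform and the inter-trial phase-locking factor; it is not the authors' pipeline, and the frequency, cycle count, and random data are assumptions for demonstration only.

```python
# Minimal sketch (illustrative, not the article's analysis): inter-trial
# phase-locking factor of gamma-band EEG from a Morlet-wavelet transform.
import numpy as np

def morlet(freq, sfreq, n_cycles=7):
    """Complex Morlet wavelet at `freq` Hz, sampled at `sfreq` Hz."""
    sigma_t = n_cycles / (2.0 * np.pi * freq)
    t = np.arange(-5 * sigma_t, 5 * sigma_t, 1.0 / sfreq)
    return np.exp(2j * np.pi * freq * t) * np.exp(-t**2 / (2 * sigma_t**2))

def phase_locking(trials, freq, sfreq):
    """Phase-locking factor over trials (n_trials x n_samples) at one frequency:
    the magnitude of the across-trial mean of unit-length phase vectors."""
    w = morlet(freq, sfreq)
    analytic = np.array([np.convolve(tr, w, mode='same') for tr in trials])
    phases = analytic / np.abs(analytic)
    return np.abs(phases.mean(axis=0))          # 1.0 = perfectly phase-locked

# Illustrative use with random data standing in for stimulus-locked epochs.
sfreq, n_trials, n_samples = 500.0, 60, 400      # 0.8 s epochs
trials = np.random.randn(n_trials, n_samples)
plf_40hz = phase_locking(trials, freq=40.0, sfreq=sfreq)
print(plf_40hz.max())
```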

    PyMVPA: A Unifying Approach to the Analysis of Neuroscientific Data

    The Python programming language is steadily increasing in popularity as the language of choice for scientific computing. The ability of this scripting environment to access a huge code base in various languages, combined with its syntactical simplicity, makes it an ideal tool for implementing and sharing ideas among scientists from numerous fields and with heterogeneous methodological backgrounds. The recent rise of reciprocal interest between the machine learning (ML) and neuroscience communities is an example of the desire for an inter-disciplinary transfer of computational methods that can benefit from a Python-based framework. For many years, a large fraction of both research communities has addressed, almost independently, very high-dimensional problems with almost completely non-overlapping methods. However, a number of recently published studies that applied ML methods to neuroscience research questions attracted a lot of attention from researchers in both fields, as well as the general public, and showed that this approach can provide novel and fruitful insights into the functioning of the brain. In this article we show how PyMVPA, a specialized Python framework for machine-learning-based data analysis, can help to facilitate this inter-disciplinary technology transfer by providing a single interface to a wide array of machine learning libraries and neural data-processing methods. We demonstrate the general applicability and power of PyMVPA via analyses of a number of neural data modalities, including fMRI, EEG, MEG, and extracellular recordings.
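    The sketch below gives a flavor of the kind of cross-validated classification analysis PyMVPA streamlines. It is written against the PyMVPA 2.x interface (module and class names may differ in other versions), and the file names, labels, and design sizes are placeholders rather than data from the article.

```python
# Minimal sketch of a leave-one-run-out classification analysis with PyMVPA 2.x.
# 'bold.nii.gz' and 'mask.nii.gz' are placeholder file names; the experimental
# design below is invented purely for illustration.
import numpy as np
from mvpa2.datasets.mri import fmri_dataset
from mvpa2.clfs.svm import LinearCSVMC
from mvpa2.generators.partition import NFoldPartitioner
from mvpa2.measures.base import CrossValidation

n_volumes_per_run, n_runs = 100, 8                      # placeholder design
targets = ['face', 'house'] * (n_volumes_per_run // 2)  # one label per volume
runs = np.repeat(np.arange(n_runs), n_volumes_per_run)  # run index per volume

# Load the 4D time series, attach condition labels and run ("chunk") info.
ds = fmri_dataset('bold.nii.gz',
                  targets=targets * n_runs,
                  chunks=runs,
                  mask='mask.nii.gz')

# Leave-one-run-out cross-validation of a linear SVM; the same CrossValidation
# object accepts classifiers wrapped from other ML libraries as well.
cv = CrossValidation(LinearCSVMC(), NFoldPartitioner())
error = cv(ds)                        # default measure: mismatch rate per fold
print(1.0 - np.mean(error.samples))   # mean cross-validated accuracy
```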

    Time Pressure Modulates Electrophysiological Correlates of Early Visual Processing

    BACKGROUND: Reactions to sensory events sometimes require quick responses, whereas at other times they require a high degree of accuracy, usually resulting in slower responses. It is important to understand whether visual processing under different response speed requirements employs different neural mechanisms. METHODOLOGY/PRINCIPAL FINDINGS: We asked participants to classify visual patterns with different levels of detail as real-world or non-sense objects. In one condition, participants were to respond immediately, whereas in the other they responded after a delay of 1 second. As expected, participants performed more accurately in delayed response trials. This effect was pronounced for stimuli with a high level of detail. These behavioral effects were accompanied by modulations of stimulus-related EEG gamma oscillations, which are an electrophysiological correlate of early visual processing. In trials requiring speeded responses, early stimulus-locked oscillations discriminated real-world and non-sense objects irrespective of the level of detail. For stimuli with a higher level of detail, oscillatory power in a later time window discriminated real-world and non-sense objects irrespective of response speed requirements. CONCLUSIONS/SIGNIFICANCE: Thus, it seems plausible to assume that different response speed requirements trigger different dynamics of processing.
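    Gamma-band power in distinct post-stimulus windows, as contrasted in the findings above, can be obtained with standard time-frequency tools. The sketch below uses band-pass filtering plus the Hilbert envelope; band limits, window boundaries, and the random data are assumptions for demonstration, not the authors' parameters.

```python
# Minimal sketch (illustrative): gamma-band power in an early and a later
# post-stimulus window from band-pass filtering and the Hilbert envelope.
import numpy as np
from scipy.signal import butter, filtfilt, hilbert

sfreq = 500.0                                 # sampling rate (Hz), assumed
epochs = np.random.randn(60, 500)             # 60 trials, 1 s, stimulus at t = 0

b, a = butter(4, [30.0, 80.0], btype='bandpass', fs=sfreq)
gamma = filtfilt(b, a, epochs, axis=1)        # gamma-band signal per trial
power = np.abs(hilbert(gamma, axis=1)) ** 2   # instantaneous gamma power

t = np.arange(epochs.shape[1]) / sfreq        # time from stimulus onset (s)
early = power[:, (t >= 0.09) & (t < 0.14)].mean()   # roughly a 90-140 ms window
late = power[:, (t >= 0.20) & (t < 0.40)].mean()    # an assumed later window
print(early, late)
```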

    Joint Bayesian inference reveals model properties shared between multiple experimental conditions.

    Statistical modeling produces compressed and often more easily interpretable descriptions of experimental data in the form of model parameters. When experimental manipulations target selected parameters, it is necessary for their interpretation that other model components remain constant. For example, psychophysicists use dose rate models to describe how behavior changes as a function of a single stimulus variable. The main interest is in shifts of this function induced by experimental manipulation, assuming invariance in other aspects of the function. Combining several experimental conditions in a joint analysis that takes such invariance constraints into account can result in a complex model for which no robust standard procedures are available. We formulate a solution for the joint analysis through repeated applications of standard procedures by allowing an additional assumption. This way, experimental conditions can be analyzed separately such that all conditions are implicitly taken into account. We investigate the validity of the supplementary assumption through simulations. Furthermore, we present a natural way to check whether a joint treatment is appropriate. We illustrate the method for the specific case of the psychometric function; however, the procedure applies to other models that encompass multiple experimental conditions.
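    To illustrate the kind of invariance constraint involved, the sketch below fits psychometric functions for two conditions jointly, letting the threshold shift between conditions while the width (slope) is shared. It is a plain joint maximum-likelihood fit, not the Bayesian procedure of the article; the logistic form, guess and lapse rates, and the data values are illustrative assumptions.

```python
# Minimal sketch: joint ML fit of two psychometric functions with a shared
# width and condition-specific thresholds (illustrative, not the article's method).
import numpy as np
from scipy.optimize import minimize
from scipy.special import expit

def psychometric(x, threshold, width, guess=0.5, lapse=0.02):
    """Logistic psychometric function between the guess rate and 1 - lapse."""
    return guess + (1.0 - guess - lapse) * expit((x - threshold) / width)

def neg_log_lik(params, data):
    """Joint binomial likelihood; params = (threshold_1, threshold_2, shared width)."""
    th1, th2, width = params
    if width <= 0:
        return np.inf
    nll = 0.0
    for (levels, n_correct, n_trials), th in zip(data, (th1, th2)):
        p = np.clip(psychometric(levels, th, width), 1e-6, 1 - 1e-6)
        nll -= np.sum(n_correct * np.log(p) + (n_trials - n_correct) * np.log(1 - p))
    return nll

# Illustrative data: stimulus levels, correct counts, and trials per condition.
x = np.array([1, 2, 3, 4, 5, 6], float)
cond1 = (x, np.array([21, 24, 28, 33, 37, 39]), np.full(6, 40))
cond2 = (x, np.array([20, 21, 25, 30, 35, 38]), np.full(6, 40))
fit = minimize(neg_log_lik, x0=[3.0, 4.0, 1.0], args=([cond1, cond2],),
               method='Nelder-Mead')
print(fit.x)   # one threshold per condition plus the shared width
```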

    Overlap as a function of parameter correlations for the isolated models (left) and joint model (right).

    The color of each data point corresponds to the overlap of the posterior marginal distributions of the parameter. Despite the fact that the generating parameters of the data sets were the same, the inferred parameter distributions can show rather low overlap in the isolated model approach. The cardinal axes denote the correlation between the generating parameters in the first and second data set.
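    One common way to compute such an overlap from posterior samples is the shared area under the two marginal histograms, shown in the sketch below; this is an assumed definition and may differ from the one used in the article.

```python
# Minimal sketch (assumed definition): overlap of two posterior marginals
# estimated from samples as the shared area under their normalised histograms
# (1 = identical distributions, 0 = disjoint).
import numpy as np

def posterior_overlap(samples_a, samples_b, n_bins=50):
    lo = min(samples_a.min(), samples_b.min())
    hi = max(samples_a.max(), samples_b.max())
    bins = np.linspace(lo, hi, n_bins + 1)
    pa, _ = np.histogram(samples_a, bins=bins)
    pb, _ = np.histogram(samples_b, bins=bins)
    pa = pa / pa.sum()
    pb = pb / pb.sum()
    return np.minimum(pa, pb).sum()

# Illustrative use with two Gaussian-shaped posteriors.
a = np.random.normal(0.0, 1.0, 5000)
b = np.random.normal(0.5, 1.0, 5000)
print(posterior_overlap(a, b))
```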

    The procedure applied to different data sets.

    This figure is constructed analogously to Figure 3 (http://www.plosone.org/article/info:doi/10.1371/journal.pone.0091710#pone-0091710-g003), but uses different data sets. Panels A and D show an experimental condition without a contrast mask. Panels B and E contain the same data sets as Figure 3 A and D. In this example, the isolated inference procedure yields markedly different marginal parameter posterior distributions (Panel C), which are forced to overlap through joint inference (Panel F).

    Deviance as a function of parameter correlations for the isolated models (left) and joint model (right).

    The color of each data point corresponds to the combined deviance obtained through psychometric function fits to two artificially generated data sets with the same generating parameters. The cardinal axes denote the correlation between the generating parameters in the first and second data set.
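    For reference, the deviance of a binomial psychometric-function fit is commonly defined as twice the log-likelihood ratio between the saturated model and the fitted model; a standard form (the article may use an equivalent expression, and the symbols below are the usual notation rather than the article's) is:

```latex
D = 2 \sum_{i=1}^{K} \left[ k_i \log\frac{k_i}{n_i \hat{p}_i}
      + (n_i - k_i) \log\frac{n_i - k_i}{n_i (1 - \hat{p}_i)} \right]
```

    Here k_i is the number of correct responses out of n_i trials at stimulus level i, and \hat{p}_i is the fitted psychometric function value at that level; a combined deviance for two data sets would then simply sum the two individual deviances.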

    Histograms of the deviance and overlap data shown in Figure 4 and Figure 6.

    The dark histograms correspond to the joint fits and the light histograms to the isolated fits.