
    Frequency of sexual dysfunction in outpatients with severe mental illness in Greece

    INTRODUCTION: Patients with psychosis can develop sexual dysfunction, which may be related to the disease itself, psychosocial factors, somatic comorbidities, and the use of psychotropic medication. OBJECTIVE: We aimed to investigate the type and frequency of sexual dysfunction in patients diagnosed with schizophrenia or bipolar disorder in order to assess the side effects of antipsychotics on sexual function. METHODS: This was a multicenter, cross-sectional study involving patients diagnosed with schizophrenia (79.3%) or bipolar disorder (20.7%) treated in the Department of Psychiatry and Community Mental Health Centers from November 2018 to December 2019. Patients were enrolled in the study after providing signed informed consent. Demographic and clinical data were collected through a semi-structured interview. The Antipsychotics and Sexual Functioning Questionnaire (ASFQ) was administered to assess sexual function. RESULTS: A total of 87 outpatients on antipsychotics were recruited. The mean age was 43.6 years, and the mean duration of illness was 16.9 years. Overall, only 9.1% of patients spontaneously reported sexual dysfunction. Patients treated with oral first-generation antipsychotics had more difficulty achieving orgasm and decreased erection capacity, whereas patients treated with oral second-generation antipsychotics had decreased ejaculation capacity. Antipsychotic combination therapy was associated with higher rates of sexual anhedonia. DISCUSSION: These results suggest that sexual dysfunction is a side effect of antipsychotic treatment that patients rarely report spontaneously. It therefore seems essential to obtain a psychosexual clinical history before initiating antipsychotic treatment, so that subsequent changes can be evaluated and an individualized strategy adopted to manage antipsychotic-induced sexual dysfunction.

    Translation and Validation of the Greek Version of the Antipsychotics and Sexual Functioning Questionnaire (ASFQ)

    Introduction Sexual dysfunction in patients with psychoses may be associated with the psychiatric illness itself (negative symptoms such as apathy and avolition), somatic comorbidities, psychosocial factors (stigmatization, discrimination), and the use of psychotropic drugs. In Greece, research on antipsychotic-induced sexual dysfunction is scarce. Aim This study was conducted to translate and validate the Greek version of the Antipsychotics and Sexual Functioning Questionnaire (ASFQ) in a sample of patients receiving antipsychotic treatment. Methods A “forward-backward translation” method was applied. A pilot study was conducted with 15 outpatients with schizophrenia or bipolar disorder receiving antipsychotic treatment. Patients also completed the “Subjects’ Response to Antipsychotics (SRA)” questionnaire in order to assess the validity of the ASFQ. The ASFQ and the SRA questionnaire were completed twice within 2 weeks. Main outcome measures Reliability (internal consistency and test-retest) and validity were assessed. Results The Greek translation of the ASFQ was reliable, with excellent internal consistency (Cronbach's α = 0.90 for men and 0.95 for women in both measurements). In addition, the Spearman correlation coefficient was 1 (P < .001) for all Likert-type questions in both assessments. Finally, Spearman correlation coefficients between the ASFQ and the SRA were moderately to strongly positive (between 0.25 and 1) in both assessments, demonstrating moderate to high validity. Conclusions The Greek version of the ASFQ has proved to be a reliable and valid clinical instrument and can therefore be used in further studies in the Greek population. Angelaki M, Galanis P, Igoumenou A, et al. Translation and Validation of the Greek Version of the Antipsychotics and Sexual Functioning Questionnaire (ASFQ). J Sex Med 2021;9:100334.
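
    A minimal sketch of the two reliability statistics reported above (Cronbach's alpha for internal consistency and a Spearman correlation for test-retest agreement). The item scores below are simulated placeholders, not ASFQ data, and the computation is generic rather than the authors' exact procedure.

        # Sketch: internal consistency (Cronbach's alpha) and test-retest
        # correlation for a Likert-type questionnaire. Data are hypothetical.
        import numpy as np
        from scipy.stats import spearmanr

        def cronbach_alpha(items):
            """items: (n_subjects, n_items) array of Likert scores."""
            items = np.asarray(items, dtype=float)
            k = items.shape[1]
            sum_item_vars = items.var(axis=0, ddof=1).sum()
            total_var = items.sum(axis=1).var(ddof=1)
            return (k / (k - 1)) * (1 - sum_item_vars / total_var)

        # Hypothetical scores: 15 respondents x 6 items, two administrations
        rng = np.random.default_rng(0)
        t1 = rng.integers(1, 6, size=(15, 6))
        t2 = np.clip(t1 + rng.integers(-1, 2, size=t1.shape), 1, 5)

        print("Cronbach's alpha (time 1):", round(cronbach_alpha(t1), 2))
        rho, p = spearmanr(t1.sum(axis=1), t2.sum(axis=1))
        print("Test-retest Spearman rho:", round(rho, 2), "p =", round(p, 4))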

    Binocular disparity tuning and visual–vestibular congruency of multisensory neurons in macaque parietal cortex

    Many neurons in the dorsal medial superior temporal (MSTd) and ventral intraparietal (VIP) areas of the macaque brain are multisensory, responding to both optic flow and vestibular cues to self-motion. The heading tuning of visual and vestibular responses can be either congruent or opposite, but only congruent cells have been implicated in cue integration for heading perception. Because of the geometric properties of motion parallax, however, both congruent and opposite cells could be involved in coding self-motion when observers fixate a world-fixed target during translation, if congruent cells prefer near disparities and opposite cells prefer far disparities. We characterized the binocular disparity selectivity and heading tuning of MSTd and VIP cells using random-dot stimuli. Most (70%) MSTd neurons were disparity-selective with monotonic tuning, and there was no consistent relationship between depth preference and congruency of visual and vestibular heading tuning. One-third of disparity-selective MSTd cells reversed their depth preference for opposite directions of motion (direction-dependent disparity tuning, DDD), but most of these cells were unisensory, with no tuning for vestibular stimuli. In contrast to previous reports, the direction preferences of most DDD neurons did not reverse with disparity. Compared with MSTd, VIP contained fewer disparity-selective neurons (41%) and very few DDD cells. On average, VIP neurons also preferred higher speeds and nearer disparities than MSTd cells. Our findings are inconsistent with the hypothesis that visual/vestibular congruency is linked to depth preference, and they also suggest that DDD cells are not involved in multisensory integration for heading perception.
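
    To illustrate the congruent/opposite distinction used above, the sketch below labels a multisensory cell by the angular difference between its visual and vestibular preferred headings. The 90-degree cutoff and the example values are illustrative assumptions, not the authors' classification procedure.

        # Sketch: label a multisensory cell "congruent" or "opposite" from the
        # difference between its visual and vestibular preferred headings (deg).
        # The 90-degree cutoff is an illustrative convention, not the paper's.
        import numpy as np

        def heading_congruency(pref_visual_deg, pref_vestibular_deg):
            delta = np.abs((pref_visual_deg - pref_vestibular_deg + 180) % 360 - 180)
            return "congruent" if delta < 90 else "opposite"

        print(heading_congruency(10, 350))    # 20 deg apart -> congruent
        print(heading_congruency(15, 200))    # 175 deg apart -> opposite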

    Purkinje Cells in Posterior Cerebellar Vermis Encode Motion in an Inertial Reference Frame

    The ability to orient and navigate through the terrestrial environment represents a computational challenge common to all vertebrates. It arises because the motion sensors in the inner ear (the otolith organs and the semicircular canals) transduce self-motion in an egocentric reference frame. As a result, vestibular afferent information reaching the brain is inappropriate for coding our own motion and orientation relative to the outside world. Here we show that cerebellar cortical neuron activity in vermal lobules 9 and 10 reflects the critical computations of transforming head-centered vestibular afferent information into earth-referenced self-motion and spatial orientation signals. Unlike vestibular and deep cerebellar nuclei neurons, where a mixture of responses was observed, Purkinje cells represent a homogeneous population that encodes inertial motion. They carry the earth-horizontal component of a spatially transformed and temporally integrated rotation signal from the semicircular canals, which is critical for computing head attitude, thus isolating inertial linear accelerations during navigation.
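
    The transformation described above is often formalized as tracking the gravity vector in head coordinates using canal-derived angular velocity and subtracting it from the otolith (gravito-inertial) signal to isolate inertial linear acceleration. The sketch below is a simplified Euler-integration illustration of that idea; the sign conventions and example values are assumptions, not the authors' model.

        # Sketch: estimate inertial (translational) acceleration by tracking
        # gravity with canal angular velocity and removing it from the otolith
        # signal. Sign conventions and the simple Euler step are assumptions.
        import numpy as np

        def track_inertial_acceleration(omega, gia, g0, dt):
            """omega: (T,3) angular velocity in head coords (rad/s)
            gia:   (T,3) gravito-inertial acceleration sensed by otoliths
            g0:    (3,) initial gravity vector in head coords
            Returns (T,3) estimated inertial linear acceleration."""
            g = np.array(g0, dtype=float)
            a_est = np.zeros_like(gia)
            for t in range(len(omega)):
                # A world-fixed vector expressed in head coordinates rotates as
                # dg/dt = g x omega when the head turns with angular velocity omega.
                g = g + np.cross(g, omega[t]) * dt
                a_est[t] = gia[t] - g          # assumed convention: gia = a + g
            return a_est

        # Hypothetical example: head stationary and upright, brief forward push
        T, dt = 100, 0.01
        omega = np.zeros((T, 3))
        gia = np.tile([0.0, 0.0, 9.81], (T, 1))
        gia[40:60, 0] += 0.5                   # 0.5 m/s^2 forward acceleration
        a = track_inertial_acceleration(omega, gia, [0, 0, 9.81], dt)
        print(a[50])                           # ~[0.5, 0, 0]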

    Spatiotemporal properties of optic flow and vestibular tuning in the cerebellar nodulus and uvula

    Convergence of visual motion and vestibular information is essential for accurate spatial navigation. Such multisensory integration has been shown in cortex, e.g., the dorsal medial superior temporal (MSTd) and ventral intraparietal (VIP) areas, but not in the parieto-insular vestibular cortex (PIVC). Whether similar convergence occurs subcortically remains unknown. Many Purkinje cells in vermal lobules 10 (nodulus) and 9 (uvula) of the macaque cerebellum are tuned to vestibular translation stimuli, yet little is known about their visual motion responsiveness. Here we show the existence of translational optic flow-tuned Purkinje cells, found exclusively in the anterior part of the nodulus and ventral uvula, near the midline. Vestibular responses of Purkinje cells showed a remarkable similarity to those in MSTd (but not PIVC or VIP) neurons, in terms of both response latency and relative contributions of velocity, acceleration, and position components. In contrast, the spatiotemporal properties of optic flow responses differed from those in MSTd, and matched the vestibular properties of these neurons. Compared with MSTd, optic flow responses of Purkinje cells showed smaller velocity contributions and larger visual motion acceleration responses. The remarkable similarity in vestibular translation responsiveness between the nodulus/uvula and MSTd suggests a functional coupling between the two areas for vestibular processing of self-motion information.
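
    The "relative contributions of velocity, acceleration, and position components" mentioned above can be estimated, in a generic way, by regressing a firing-rate profile onto the stimulus velocity, acceleration, and position waveforms. The sketch below uses synthetic data and ordinary least squares; it is not the fitting model used in the study.

        # Sketch: decompose a response into velocity, acceleration and position
        # components by linear regression. Synthetic data; the fitting model in
        # the actual study may differ (e.g., include a response latency term).
        import numpy as np

        t = np.linspace(0, 2, 400)
        pos = np.exp(-((t - 1) ** 2) / 0.1)    # Gaussian displacement profile
        vel = np.gradient(pos, t)
        acc = np.gradient(vel, t)

        # Hypothetical firing rate dominated by velocity, with some acceleration
        rate = 20 + 3.0 * vel + 0.5 * acc + np.random.default_rng(1).normal(0, 0.5, t.size)

        X = np.column_stack([np.ones_like(t), vel, acc, pos])
        coef, *_ = np.linalg.lstsq(X, rate, rcond=None)
        print("baseline, w_vel, w_acc, w_pos =", np.round(coef, 2))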

    Vestibular Facilitation of Optic Flow Parsing

    Simultaneous object motion and self-motion give rise to complex patterns of retinal image motion. In order to estimate object motion accurately, the brain must parse this complex retinal motion into self-motion and object motion components. Although this computational problem can be solved, in principle, through purely visual mechanisms, extra-retinal information that arises from the vestibular system during self-motion may also play an important role. Here we investigate whether combining vestibular and visual self-motion information improves the precision of object motion estimates. Subjects were asked to discriminate the direction of object motion in the presence of simultaneous self-motion, depicted either by visual cues alone (i.e., optic flow) or by combined visual/vestibular stimuli. We report a small but significant improvement in object motion discrimination thresholds with the addition of vestibular cues. This improvement was greatest for eccentric heading directions and negligible for forward movement, a finding that could reflect increased relative reliability of vestibular versus visual cues for eccentric heading directions. Overall, these results are consistent with the hypothesis that vestibular inputs can help parse retinal image motion into self-motion and object motion components.
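
    Discrimination thresholds of this kind are commonly estimated by fitting a cumulative Gaussian psychometric function to the proportion of one response alternative at each object-motion direction. The sketch below shows one generic way to do this with simulated choice data; the stimulus levels and fitting details are assumptions, not those of the study.

        # Sketch: estimate a direction-discrimination threshold by fitting a
        # cumulative Gaussian to simulated choice data. Parameters are made up.
        import numpy as np
        from scipy.optimize import curve_fit
        from scipy.stats import norm

        def psychometric(x, mu, sigma):
            return norm.cdf(x, loc=mu, scale=sigma)

        directions = np.array([-16, -8, -4, -2, 0, 2, 4, 8, 16], dtype=float)  # deg
        true_sigma = 5.0
        rng = np.random.default_rng(2)
        n_trials = 50
        p_right = norm.cdf(directions, 0, true_sigma)
        n_right = rng.binomial(n_trials, p_right)

        popt, _ = curve_fit(psychometric, directions, n_right / n_trials,
                            p0=[0.0, 5.0])
        print("bias (deg):", round(popt[0], 2), "threshold sigma (deg):", round(popt[1], 2))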

    Multisensory causal inference in the brain

    At any given moment, our brain processes multiple inputs from its different sensory modalities (vision, hearing, touch, etc.). In deciphering this array of sensory information, the brain has to solve two problems: (1) which of the inputs originate from the same object and should be integrated, and (2) for the sensations originating from the same object, how best to integrate them. Recent behavioural studies suggest that the human brain solves these problems using optimal probabilistic inference, known as Bayesian causal inference. However, how and where the underlying computations are carried out in the brain has remained unknown. By combining neuroimaging-based decoding techniques and computational modelling of behavioural data, a new study now sheds light on how multisensory causal inference maps onto specific brain areas. The results suggest that the complexity of neural computations increases along the visual hierarchy and link specific components of the causal inference process with specific visual and parietal regions.
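
    For reference, the sketch below implements the standard two-hypothesis Bayesian causal inference model for two noisy Gaussian cues (common cause versus independent causes), with model averaging of the resulting estimates. The priors and noise levels are illustrative, and the neuroimaging analysis summarized above is not reproduced here.

        # Sketch: Bayesian causal inference for two cues (e.g., a visual and an
        # auditory location cue). Standard Gaussian model; all numbers illustrative.
        import numpy as np

        def causal_inference(x1, x2, s1, s2, sp, p_common):
            """x1, x2: cue measurements; s1, s2: cue noise SDs;
            sp: prior SD on the source (zero-mean prior); p_common: prior P(common cause)."""
            # Likelihood of the two measurements under each causal structure
            var_c = s1**2 * s2**2 + s1**2 * sp**2 + s2**2 * sp**2
            like_c = np.exp(-0.5 * ((x1 - x2)**2 * sp**2 + x1**2 * s2**2 + x2**2 * s1**2)
                            / var_c) / (2 * np.pi * np.sqrt(var_c))
            like_i = (np.exp(-0.5 * (x1**2 / (s1**2 + sp**2) + x2**2 / (s2**2 + sp**2)))
                      / (2 * np.pi * np.sqrt((s1**2 + sp**2) * (s2**2 + sp**2))))
            post_c = like_c * p_common / (like_c * p_common + like_i * (1 - p_common))

            # Optimal estimates under each structure, then model averaging
            est_c = (x1 / s1**2 + x2 / s2**2) / (1 / s1**2 + 1 / s2**2 + 1 / sp**2)
            est_1 = (x1 / s1**2) / (1 / s1**2 + 1 / sp**2)
            return post_c, post_c * est_c + (1 - post_c) * est_1

        p, est = causal_inference(x1=5.0, x2=12.0, s1=2.0, s2=4.0, sp=20.0, p_common=0.5)
        print("P(common cause):", round(p, 3), "cue-1 estimate:", round(est, 2))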

    Investigating human audio-visual object perception with a combination of hypothesis-generating and hypothesis-testing fMRI analysis tools

    Primate multisensory object perception involves distributed brain regions. To investigate the network character of these regions of the human brain, we applied data-driven group spatial independent component analysis (ICA) to a functional magnetic resonance imaging (fMRI) data set acquired during a passive audio-visual (AV) experiment with common object stimuli. We labeled three group-level independent component (IC) maps as auditory (A), visual (V), and AV, based on their spatial layouts and activation time courses. The overlap between these IC maps served as the definition of a distributed network of multisensory candidate regions, including superior temporal, ventral occipito-temporal, posterior parietal and prefrontal regions. During an independent second fMRI experiment, we explicitly tested their involvement in AV integration. Activations in nine out of these twelve regions met the max-criterion (A < AV > V) for multisensory integration. Comparison of this approach with a general linear model-based region-of-interest definition revealed its complementary value for multisensory neuroimaging. In conclusion, we estimated functional networks of uni- and multisensory functional connectivity from one dataset and validated their functional roles in an independent dataset. These findings demonstrate the particular value of ICA for multisensory neuroimaging research and of using independent datasets to test hypotheses generated from a data-driven analysis.
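
    The max-criterion mentioned above requires the bimodal (AV) response to exceed the stronger of the two unimodal (A, V) responses. A minimal sketch, with hypothetical region names and beta estimates; in practice the comparison would be tested statistically across subjects.

        # Sketch: max-criterion test for multisensory integration.
        # Region names and beta values are hypothetical placeholders.
        betas = {
            "STS": {"A": 0.8, "V": 0.6, "AV": 1.1},
            "vOT": {"A": 0.1, "V": 0.9, "AV": 0.8},
        }
        for region, b in betas.items():
            meets = b["AV"] > max(b["A"], b["V"])      # A < AV > V
            print(f"{region}: max-criterion {'met' if meets else 'not met'}")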

    Probabilistic identification of cerebellar cortical neurones across species.

    Despite our fine-grained anatomical knowledge of the cerebellar cortex, electrophysiological studies of circuit information processing over the last fifty years have been hampered by the difficulty of reliably assigning signals to identified cell types. We approached this problem by assessing the spontaneous activity signatures of identified cerebellar cortical neurones. A range of statistics describing firing frequency and irregularity were then used, individually and in combination, to build Gaussian Process Classifiers (GPC), leading to a probabilistic classification of each neurone type and the computation of equi-probable decision boundaries between cell classes. Firing frequency statistics were useful for separating Purkinje cells from granular layer units, whilst firing irregularity measures proved most useful for distinguishing cells within granular layer cell classes. Using single statistics, we achieved classification accuracies of 72.5% and 92.7% for granular layer and molecular layer units, respectively. Combining statistics to form twin-variate GPC models substantially improved classification accuracies, with the combination of mean spike frequency and log-interval entropy offering classification accuracies of 92.7% and 99.2% for our molecular and granular layer models, respectively. A cross-species comparison was performed, using data drawn from anaesthetised mice and decerebrate cats, where our models offered 80% and 100% classification accuracy. We then used our models to assess non-identified data from awake monkeys and rabbits in order to highlight subsets of neurones with the greatest degree of similarity to identified cell classes. In this way, our GPC-based approach for tentatively identifying neurones from their spontaneous activity signatures, in the absence of an established ground-truth, nonetheless affords the experimenter a statistically robust means of grouping cells with properties matching known cell classes. Our approach therefore may have broad application to a variety of future cerebellar cortical investigations, particularly in awake animals where opportunities for definitive cell identification are limited.
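
    A minimal sketch of the classification approach described above, using scikit-learn's Gaussian process classifier on two spontaneous-activity statistics (mean firing rate and a log-interval entropy-like measure). The feature values are simulated and the kernel choice is an assumption, not the authors' exact configuration.

        # Sketch: two-feature Gaussian Process Classifier for cell-type
        # identification from spontaneous activity. Data are simulated; the
        # kernel and features only loosely follow the description above.
        import numpy as np
        from sklearn.gaussian_process import GaussianProcessClassifier
        from sklearn.gaussian_process.kernels import RBF

        rng = np.random.default_rng(3)
        # Hypothetical clusters: [mean rate (Hz), log-interval entropy]
        purkinje = rng.normal([40.0, 2.0], [10.0, 0.3], size=(30, 2))
        golgi    = rng.normal([8.0, 1.2],  [3.0, 0.3],  size=(30, 2))
        X = np.vstack([purkinje, golgi])
        y = np.array([0] * 30 + [1] * 30)          # 0 = Purkinje, 1 = Golgi

        gpc = GaussianProcessClassifier(kernel=1.0 * RBF(length_scale=1.0)).fit(X, y)
        probs = gpc.predict_proba([[35.0, 1.9], [10.0, 1.1]])
        print(np.round(probs, 2))                  # per-cell class probabilities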

    A retinal code for motion along the gravitational and body axes

    Self-motion triggers complementary visual and vestibular reflexes supporting image stabilization and balance. Translation through space produces one global pattern of retinal image motion (optic flow), rotation another. We examined the direction preferences of direction-sensitive ganglion cells (DSGCs) in flattened mouse retinas in vitro. Here we show that for each subtype of DSGC, direction preference varies topographically so as to align with specific translatory optic flow fields, creating a neural ensemble tuned for a specific direction of motion through space. Four cardinal translatory directions are represented, aligned with two axes of high adaptive relevance: the body and gravitational axes. One subtype maximizes its output when the mouse advances, others when it retreats, rises or falls. Two classes of DSGCs, namely, ON-DSGCs and ON-OFF-DSGCs, share the same spatial geometry but weight the four channels differently. Each subtype ensemble is also tuned for rotation. The relative activation of DSGC channels uniquely encodes every translation and rotation. Although retinal and vestibular systems both encode translatory and rotatory self-motion, their coordinate systems differ.
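
    As a rough illustration of how a small number of direction channels can jointly encode translation, the sketch below reads out a direction as a population vector over four hypothetical cardinal channels. The preferred directions, channel labels, and responses are assumptions for illustration, not the encoding scheme established in the paper.

        # Sketch: population-vector readout of translation direction from four
        # hypothetical DSGC channels with cardinal preferred directions.
        import numpy as np

        pref_dirs = np.deg2rad([0, 90, 180, 270])   # advance, rise, retreat, fall (assumed)

        def decode_direction(channel_rates):
            x = np.sum(channel_rates * np.cos(pref_dirs))
            y = np.sum(channel_rates * np.sin(pref_dirs))
            return np.rad2deg(np.arctan2(y, x)) % 360

        # Hypothetical activation pattern: mostly "advance" with a little "rise"
        print(round(decode_direction(np.array([1.0, 0.4, 0.1, 0.1])), 1))  # ~18 deg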