
    Immune-mediated competition in rodent malaria is most likely caused by induced changes in innate immune clearance of merozoites

    Malarial infections are often genetically diverse, leading to competitive interactions between parasites. A quantitative understanding of the competition between strains is essential to understand a wide range of issues, including the evolution of virulence and drug resistance. In this study, we use dynamical-model-based Bayesian inference to investigate the cause of competitive suppression of an avirulent clone of Plasmodium chabaudi (AS) by a virulent clone (AJ) in immuno-deficient and immuno-competent mice. We test whether competitive suppression is caused by clone-specific differences in one or more of the following processes: adaptive immune clearance of merozoites and parasitised red blood cells (RBCs), background loss of merozoites and parasitised RBCs, RBC age preference, RBC infection rate, burst size, and within-RBC interference. These processes were parameterised in dynamical mathematical models and fitted to experimental data. We found that just one parameter, μ, the ratio of the background loss rate of merozoites to the invasion rate of mature RBCs, needed to be clone-specific to predict the data. Interestingly, μ was found to be the same for both clones in single-clone infections, but different between the clones in mixed infections. The size of this difference was largest in immuno-competent mice and smallest in immuno-deficient mice, which explains why competitive suppression was alleviated in immuno-deficient mice. We found that competitive suppression acts early in infection, even before the day of peak parasitaemia. These results lead us to argue that the innate immune response clearing merozoites is the most likely, but not necessarily the only, mediator of competitive interactions between virulent and avirulent clones. Moreover, in mixed infections we predict an interaction between the clones and the innate immune response that induces changes in the strength of its clearance of merozoites. What this interaction is remains unknown, but future refinement of the model, challenged with other datasets, may lead to its discovery.
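
    To make the role of μ concrete, the following is a minimal Python sketch of a within-host model of the general form described above (uninfected RBCs, parasitised RBCs, free merozoites). The equations, variable names, and parameter values are illustrative assumptions rather than the fitted model or estimates from this study; μ enters as the ratio of the merozoite background loss rate to the RBC invasion rate.

```python
# Illustrative within-host dynamics: R = uninfected RBCs, I = parasitised RBCs,
# M = free merozoites. All parameter values are hypothetical, for illustration only.
import numpy as np
from scipy.integrate import solve_ivp

def within_host(t, y, beta, mu, alpha, burst, lam, d):
    R, I, M = y
    invasion = beta * M * R                   # merozoites invading uninfected RBCs
    dR = lam - d * R - invasion               # RBC supply and turnover minus infection
    dI = invasion - alpha * I                 # parasitised RBCs burst at rate alpha
    dM = burst * alpha * I - mu * beta * M - invasion   # release, background loss (mu*beta), invasion
    return [dR, dI, dM]

# Hypothetical parameters; mu is the clone-specific ratio highlighted in the abstract.
params = (2e-6, 1.5e5, 1.0, 8.0, 3.7e5, 0.025)   # beta, mu, alpha, burst, lam, d
sol = solve_ivp(within_host, (0.0, 20.0), [5e6, 10.0, 0.0],
                args=params, t_eval=np.linspace(0.0, 20.0, 201))
print("peak parasitised-RBC density:", sol.y[1].max())
```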

    Processing of spatial-frequency altered faces in schizophrenia: Effects of illness phase and duration

    Low spatial frequency (SF) processing has been shown to be impaired in people with schizophrenia, but it is not clear how this varies with clinical state or illness chronicity. We compared schizophrenia patients (SCZ, n = 34), first episode psychosis patients (FEP, n = 22), and healthy controls (CON, n = 35) on a gender/facial discrimination task. Images were either unaltered (broadband spatial frequency, BSF), or had high or low SF information removed (LSF and HSF conditions, respectively). The task was performed at hospital admission and discharge for patients, and at corresponding time points for controls. Groups were matched on visual acuity. At admission, compared to their BSF performance, each group was significantly worse with low SF stimuli, and most impaired with high SF stimuli. The level of impairment at each SF did not depend on group. At discharge, the SCZ group performed more poorly in the LSF condition than the other groups, and showed the greatest degree of performance decline collapsed over HSF and LSF conditions, although the latter finding was not significant when controlling for visual acuity. Performance did not change significantly over time for any group. HSF processing was strongly related to visual acuity at both time points for all groups. We conclude the following: 1) SF processing abilities in schizophrenia are relatively stable across clinical state; 2) face processing abnormalities in SCZ are not secondary to problems processing specific SFs, but are due to other known difficulties constructing visual representations from degraded information; and 3) the relationship between HSF processing and visual acuity, along with known SCZ- and medication-related acuity reductions, and the elimination of a SCZ-related impairment after controlling for visual acuity in this study, all raise the possibility that some prior findings of impaired perception in SCZ may be secondary to acuity reductions.
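
    The LSF and HSF conditions described above amount to low-pass and high-pass spatial filtering of the face images. One common way to construct such stimuli is Gaussian filtering, sketched below; the sigma cutoff and the placeholder image are assumptions for illustration, not the filtering procedure actually used in the study.

```python
# Sketch: build broadband (BSF), low-SF (LSF) and high-SF (HSF) versions of a
# grayscale image via Gaussian filtering. Sigma is an arbitrary illustrative cutoff.
import numpy as np
from scipy.ndimage import gaussian_filter

def sf_conditions(image, sigma=4.0):
    bsf = image.astype(float)                 # broadband: unaltered image
    lsf = gaussian_filter(bsf, sigma)         # low-pass: high SF content removed
    hsf = bsf - lsf + bsf.mean()              # high-pass: low SF removed, mean luminance restored
    return bsf, lsf, hsf

# Placeholder "face image": random noise stands in for a real photograph.
rng = np.random.default_rng(0)
bsf, lsf, hsf = sf_conditions(rng.random((256, 256)))
```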

    Effects of a soft robotic exosuit on the quality and speed of overground walking depend on walking ability after stroke

    Background: Soft robotic exosuits can provide partial dorsiflexor and plantarflexor support in parallel with paretic muscles to improve poststroke walking capacity. Previous results indicate that baseline walking ability may impact a user’s ability to leverage the exosuit assistance, while the effects on continuous walking, walking stability, and muscle slacking have not been evaluated. Here we evaluated the effects of a portable ankle exosuit during continuous comfortable overground walking in 19 individuals with chronic hemiparesis. We also compared two speed-based subgroups (threshold: 0.93 m/s) to address poststroke heterogeneity. Methods: We refined a previously developed portable lightweight soft exosuit to support continuous overground walking. We compared five minutes of continuous walking in a laboratory with the exosuit to walking without the exosuit in terms of ground clearance, foot landing and propulsion, as well as the energy cost of transport, walking stability and plantarflexor muscle slacking. Results: Exosuit assistance was associated with improvements in the targeted gait impairments: a 22% increase in ground clearance during swing, a 5° increase in foot-to-floor angle at initial contact, and a 22% increase in center-of-mass propulsion during push-off. The improvements in propulsion and foot landing contributed to a 6.7% (0.04 m/s) increase in walking speed (R² = 0.82). This enhancement in gait function was achieved without deterioration in muscle effort, stability or cost of transport. Subgroup analyses revealed that all individuals profited from ground clearance support, but slower individuals leveraged plantarflexor assistance to improve propulsion by 35% to walk 13% faster, while faster individuals did not change either. Conclusions: The immediate restorative benefits of the exosuit presented here underline its promise for rehabilitative gait training in poststroke individuals.

    Genetic variants in novel pathways influence blood pressure and cardiovascular disease risk.

    Blood pressure is a heritable trait influenced by several biological pathways and responsive to environmental stimuli. Over one billion people worldwide have hypertension (≥140 mm Hg systolic blood pressure or ≥90 mm Hg diastolic blood pressure). Even small increments in blood pressure are associated with an increased risk of cardiovascular events. This genome-wide association study of systolic and diastolic blood pressure, which used a multi-stage design in 200,000 individuals of European descent, identified sixteen novel loci: six of these loci contain genes previously known or suspected to regulate blood pressure (GUCY1A3-GUCY1B3, NPR3-C5orf23, ADM, FURIN-FES, GOSR2, GNAS-EDN3); the other ten provide new clues to blood pressure physiology. A genetic risk score based on 29 genome-wide significant variants was associated with hypertension, left ventricular wall thickness, stroke and coronary artery disease, but not kidney disease or kidney function. We also observed associations with blood pressure in East Asian, South Asian and African ancestry individuals. Our findings provide new insights into the genetics and biology of blood pressure, and suggest potential novel therapeutic pathways for cardiovascular disease prevention.
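
    The genetic risk score referred to above is, in its usual form, a weighted sum of risk-allele dosages across the significant variants. A minimal sketch follows; the effect sizes and genotype dosages below are invented for illustration and are not the study's 29-variant score.

```python
# Sketch of a weighted genetic risk score: for each individual, sum over variants of
# (per-allele effect size) x (risk-allele dosage, 0/1/2). Values below are invented.
import numpy as np

effect_sizes = np.array([0.39, 0.24, 0.31])    # e.g. mm Hg per risk allele (hypothetical)
dosages = np.array([[2, 1, 0],                 # one row per individual, one column per variant
                    [1, 1, 2],
                    [0, 2, 1]])

grs = dosages @ effect_sizes                   # genetic risk score per individual
print(grs)                                     # [1.02 1.25 0.79]
```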

    Genome-wide analysis identifies 12 loci influencing human reproductive behavior.

    The genetic architecture of human reproductive behavior, namely age at first birth (AFB) and number of children ever born (NEB), has a strong relationship with fitness, human development, infertility and risk of neuropsychiatric disorders. However, very few genetic loci have been identified, and the underlying mechanisms of AFB and NEB are poorly understood. We report a large genome-wide association study of both sexes including 251,151 individuals for AFB and 343,072 individuals for NEB. We identified 12 independent loci that are significantly associated with AFB and/or NEB in a SNP-based genome-wide association study and 4 additional loci associated in a gene-based effort. These loci harbor genes that are likely to have a role, either directly or by affecting non-local gene expression, in human reproduction and infertility, thereby increasing understanding of these complex traits.

    Apelin Deficiency Accelerates the Progression of Amyotrophic Lateral Sclerosis

    Amyotrophic lateral sclerosis (ALS) is a neurodegenerative disease characterized by the selective loss of motor neurons. Recent studies have indicated that chronic hypoxia and insufficient vascular endothelial growth factor (VEGF)-dependent neuroprotection may lead to the degeneration of motor neurons in ALS. Expression of apelin, an endogenous ligand for the G protein-coupled receptor APJ, is regulated by hypoxia. In addition, recent reports suggest that apelin protects neurons against glutamate-induced excitotoxicity. Here, we examined whether apelin is an endogenous neuroprotective factor using the SOD1G93A mouse model of ALS. In mouse CNS tissues, the highest expression of both apelin and APJ mRNAs was detected in the spinal cord. APJ immunoreactivity was observed in neuronal cell bodies located in the gray matter of the spinal cord. Although apelin mRNA expression in the spinal cord of wild-type mice did not change from 4 to 18 weeks of age, that of SOD1G93A mice declined as the paralytic phenotype developed. In addition, apelin-deficient SOD1G93A double-mutant mice displayed the disease phenotypes earlier than their SOD1G93A littermates. Immunohistochemical observation revealed that the number of motor neurons was decreased and microglia were activated in the spinal cord of the double-mutant mice, indicating that apelin deficiency pathologically accelerated the progression of ALS. Furthermore, we showed that apelin enhanced the protective effect of VEGF on H2O2-induced neuronal death in primary neurons. These results suggest that the apelin/APJ system in the spinal cord has a neuroprotective effect against the pathogenesis of ALS.

    Mindfulness-based cognitive therapy improves frontal control in bipolar disorder: a pilot EEG study

    Background: Cognitive processing in bipolar disorder (BD) is characterized by a number of attentional abnormalities. Mindfulness-based cognitive therapy (MBCT) combines mindfulness meditation, a form of attentional training, with aspects of cognitive therapy, and may improve attentional dysfunction in bipolar disorder patients. Methods: Twelve euthymic BD patients and 9 control participants underwent electroencephalography (EEG) recording during resting states (eyes open, eyes closed; band-frequency analysis) and during completion of a continuous performance task (A-X version; event-related potential (ERP) wave-component analysis). The individuals with BD completed an 8-week MBCT intervention, after which the EEG recording was repeated. Results: (1) Brain activity: individuals with BD showed significantly decreased theta band power, increased beta band power, and decreased theta/beta ratios during the eyes-closed resting state over frontal and cingulate cortices. Following the MBCT intervention, improvement over the right frontal cortex was seen in the individuals with BD, as beta band power decreased. (2) Brain activation: individuals with BD showed a significant P300-like waveform over the frontal cortex during the cue. Following the MBCT intervention, this P300-like waveform was significantly attenuated over the frontal cortex. Conclusions: Individuals with BD show decreased attentional readiness and activation of non-relevant information processing during attentional processes. These data are the first to show that MBCT in BD improved attentional readiness and attenuated activation of non-relevant information processing during attentional processes.
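
    The band-power measures reported above (theta and beta power and the theta/beta ratio) are typically derived from the EEG power spectral density. The sketch below uses Welch's method on a synthetic one-channel signal; the sampling rate, band edges, and signal are assumptions for illustration, not the study's recording or analysis parameters.

```python
# Sketch: theta (4-8 Hz) and beta (13-30 Hz) band power and their ratio from one
# EEG channel, via Welch's PSD. Synthetic signal and parameters are illustrative only.
import numpy as np
from scipy.signal import welch

fs = 250.0                                          # sampling rate in Hz (hypothetical)
t = np.arange(0, 60, 1 / fs)
eeg = (np.sin(2 * np.pi * 6 * t)                    # theta-range component
       + 0.5 * np.sin(2 * np.pi * 20 * t)           # beta-range component
       + 0.2 * np.random.default_rng(0).standard_normal(t.size))

freqs, psd = welch(eeg, fs=fs, nperseg=int(4 * fs))

def band_power(lo, hi):
    mask = (freqs >= lo) & (freqs < hi)
    return psd[mask].sum() * (freqs[1] - freqs[0])  # approximate integral over the band

theta, beta = band_power(4, 8), band_power(13, 30)
print("theta/beta ratio:", theta / beta)
```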

    The management of non-valvular atrial fibrillation (NVAF) in Australian general practice: bridging the evidence-practice gap. A national, representative postal survey

    Background: General practitioners (GPs) are ideally placed to bridge the widely noted evidence-practice gap between current management of NVAF and the need to increase anticoagulant use to reduce the risk of fatal and disabling stroke in NVAF. We aimed to identify gaps in current care, and asked GPs to identify potentially useful strategies to overcome barriers to best practice. Methods: We obtained contact details for a random sample of 1000 GPs from a national commercial database. Randomly selected GPs were mailed a questionnaire after an advance letter. Standardised reminders were administered to enhance response rates. As part of a larger survey assessing GP management of NVAF, we included questions to explore GPs' risk assessment, estimates of stroke risk and GPs' perceptions of the risks and benefits of anticoagulation with warfarin. In addition, we explored GPs' perceived barriers to the wider uptake of anticoagulation, quality control of anticoagulation and their assessment of strategies to assist in managing NVAF. Results: 596 of 924 eligible GPs responded (64.4% response rate). The majority of GPs recognised that the benefits of warfarin outweighed the risks for three case scenarios in which warfarin is recommended according to Australian guidelines. In response to a hypothetical case scenario describing a patient with a supratherapeutic INR level of 5, 41.4% of the 596 GPs (n = 247) and 22.0% (n = 131) would be "highly likely" or "likely", respectively, to cease warfarin therapy and resume at a lower dose once INR levels returned to the therapeutic range. Only 27.9% (n = 166/596) would reassess the patient's INR levels within one day of recording the supratherapeutic INR. Patient contraindications to warfarin were reported to "usually" or "always" apply to the patients of 40.6% (n = 242/596) of GPs when considering whether or not to prescribe warfarin. Patient refusal to take warfarin "usually" or "always" applied to the patients of 22.3% (n = 133/596) of GPs. When asked to indicate the usefulness of strategies to assist in managing NVAF, the majority of GPs (89.1%, n = 531/596) reported that they would find patient educational resources outlining the benefits and risks of available treatments "quite useful" or "very useful". Just under two-thirds (65.2%; n = 389/596) reported that they would find point-of-care INR testing "quite" or "very" useful. An outreach specialist service and training to enable GPs to practise stroke medicine as a special interest were also considered "quite" or "very" useful by 61.9% (n = 369/596) of GPs. Conclusion: This survey identified gaps, based on GP self-report, in the current care of NVAF. GPs themselves have provided guidance on the selection of implementation strategies to bridge these gaps. These results may inform future initiatives designed to reduce the risk of fatal and disabling stroke in NVAF.

    Adjuvant breast cancer chemotherapy during late-trimester pregnancy: not quite a standard of care

    Background: Diagnosis of breast cancer during pregnancy was formerly considered an indication for abortion. The pendulum has since swung to the other extreme, with most reviews now rejecting termination while endorsing immediate anthracycline-based therapy for any pregnant patient beyond the first trimester. To assess the evidence for this radical change in thinking, a review of relevant studies in the fields of breast cancer chemotherapy, pregnancy, and drug safety was conducted. Discussion: Accumulating evidence for the short-term safety of anthracycline-based chemotherapy during late-trimester pregnancy represents a clear advance over the traditional norm of therapeutic abortion. Nonetheless, the emerging orthodoxy favoring routine chemotherapy during gestation should continue to be questioned on several grounds: (1) the assumed difference in maternal survival accruing from chemotherapy administered earlier (i.e., during pregnancy rather than after delivery) has not been quantified; (2) the added survival benefit of adjuvant cytotoxic therapy prescribed within the hormone-rich milieu of pregnancy remains presumptive, particularly for ER-positive disease; (3) the maternal survival benefit associated with modified adjuvant regimens (e.g., weekly schedules, omission of taxanes, etc.) has not been proven equivalent to standard (e.g., post-delivery) regimens; and (4) the long-term transplacental and transgenerational hazards of late-trimester chemotherapy are unknown. Summary: Although an incrementally increased risk of cancer-specific mortality is impossible to exclude, mothers who place a high priority on the lifelong well-being of their progeny may be informed that deferring optimal chemotherapy until after delivery is still an option to consider, especially in ER-positive, node-negative and/or last-trimester disease.