
    Lower cognitive baseline scores predict cognitive training success after 6 months in healthy older adults: Results of an online RCT

    Background: Identifying predictors of general cognitive training (GCT) success in healthy older adults has many potential uses, including aiding intervention and improving individual dementia risk prediction, both of high importance in health care. However, the factors that predict training improvements and the temporal course of predictors (e.g., do the same prognostic factors predict training success after a short training period, such as 6 weeks, as well as after a longer training period, such as 6 months?) are largely unknown. Methods: Data (N = 4,184 healthy older individuals) from two arms (GCT vs. control) of a three-arm randomized controlled trial were reanalyzed to investigate predictors of GCT success in five cognitive tasks (grammatical reasoning, spatial working memory, digit vigilance, paired association learning, and verbal learning) at three time points (after 6 weeks, 3 months, and 6 months of training). The investigated predictors were sociodemographic variables, depressive symptoms, number of training sessions, cognitive baseline values, and all interaction terms (group*predictor). Results: Being female was predictive of improvement in grammatical reasoning at 6 weeks in the GCT group, and lower cognitive baseline scores were predictive of improvement in spatial working memory and verbal learning at 6 months. Conclusion: Our data indicate that predictors change over time; remarkably, lower baseline performance at study entry is a significant predictor only after 6 months of training. Possible reasons for these results are discussed in relation to the compensation hypothesis. J Am Geriatr Soc 68, 2020.

    Evolved ensemble of detectors for gross error detection.

    In this study, we evolve an ensemble of detectors to check for the presence of gross systematic errors in measurement data. We use Fisher's method to combine the outputs of the different detectors and then test the hypothesis that gross errors are present based on the combined value. We further develop a detector selection approach in which a subset of detectors is selected for each sample. The selection is conducted by comparing the output of each detector to its associated selection threshold. The thresholds are obtained by minimizing the 0-1 loss function on training data using the Particle Swarm Optimization method. Experiments conducted on a simulated system confirm the advantages of the ensemble and evolved ensemble approaches.
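    The combination step described above can be sketched in a few lines. This is a minimal stdlib-only illustration, not the paper's implementation: the detector p-values, the 0.05 significance level, and the closed-form chi-squared survival function (exact only for even degrees of freedom, which Fisher's method always produces) are assumptions made for the example.

```python
import math

def chi2_sf_even_df(x, df):
    """Survival function of the chi-squared distribution for even df.

    For df = 2k there is a closed form:
    P(X > x) = exp(-x/2) * sum_{i=0}^{k-1} (x/2)^i / i!
    """
    k = df // 2
    half = x / 2.0
    return math.exp(-half) * sum(half ** i / math.factorial(i) for i in range(k))

def fisher_combine(p_values):
    """Combine independent detector p-values with Fisher's method.

    The statistic -2 * sum(ln p_i) follows a chi-squared distribution
    with 2k degrees of freedom under the null hypothesis (no gross error),
    where k is the number of detectors.
    """
    stat = -2.0 * sum(math.log(p) for p in p_values)
    return stat, chi2_sf_even_df(stat, 2 * len(p_values))

# Three hypothetical detectors report p-values for one measurement sample;
# flag a gross error when the combined p-value falls below the 0.05 level.
stat, p = fisher_combine([0.04, 0.10, 0.30])
print(p < 0.05)  # no single detector is decisive, but the combination is
```

    Note how the combined test can reject the null even though two of the three individual p-values are above 0.05, which is the motivation for combining detectors in the first place.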

    Inhibition of insulin-degrading enzyme in human neurons promotes amyloid-β deposition

    Alzheimer’s disease (AD) is characterised by the aggregation and deposition of amyloid-β (Aβ) peptides in the human brain. In age-related late-onset AD, deficient degradation and clearance, rather than enhanced production, of Aβ contributes to disease pathology. In the present study, we assessed the contribution of the two key Aβ-degrading zinc metalloproteases, insulin-degrading enzyme (IDE) and neprilysin (NEP), to Aβ degradation in human induced pluripotent stem cell (iPSC)-derived cortical neurons. Using an Aβ fluorescence polarisation assay, inhibition of IDE, but not of NEP, blocked the degradation of Aβ by human neurons. When the neurons were grown in a 3D extracellular matrix to visualise Aβ deposition, inhibition of IDE, but not NEP, increased the number of Aβ deposits. The resulting Aβ deposits were stained with the conformation-dependent, anti-amyloid antibodies A11 and OC that recognise Aβ aggregates in the human AD brain. Inhibition of the Aβ-forming β-secretase prevented the formation of the IDE-inhibited Aβ deposits. These data indicate that inhibition of IDE in live human neurons grown in a 3D matrix increased the deposition of Aβ derived from the proteolytic cleavage of the amyloid precursor protein. This work has implications for strategies aimed at enhancing IDE activity to promote Aβ degradation in AD.

    A weighted ensemble of regression methods for gross error identification problem.

    In this study, we propose a new ensemble method to predict the magnitude of gross errors (GEs) on measurement data obtained from the hydrocarbon and stream processing industries. Our proposed model consists of an ensemble of regressors (EoR) obtained by training different regression algorithms on the training data of measurements and their associated GEs. The predictions of the regressors are aggregated using a weighted combining method to obtain the final GE magnitude prediction. In order to search for optimal combining weights, we modelled the search as an optimisation problem that minimises the difference between the GE predictions and the corresponding ground truths, and used a Genetic Algorithm (GA) to find the optimal weight associated with each regressor. The experiments were conducted on synthetic measurement data generated from 4 popular systems from the literature. We first compared the performance of the proposed ensemble when using GA versus Particle Swarm Optimisation (PSO), two nature-inspired optimisation algorithms, to search for the combining weights, showing the better performance of the ensemble with GA. We then compared the performance of the proposed ensemble to those of two well-known weighted ensemble methods (Least Square and BEM) and two ensemble methods for regression problems (Random Forest and Gradient Boosting). The experimental results showed that although the proposed ensemble required more computational time for training than the benchmark algorithms, it performed better than them on all experimental datasets.
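    The weighted-combination idea above can be sketched as follows. The paper searches for the weights with a Genetic Algorithm; in this stdlib-only sketch a plain random search stands in for the GA, and all prediction and ground-truth values are invented for illustration.

```python
import random

def combine(weights, preds):
    """Weighted sum of per-regressor predictions for one sample."""
    return sum(w * p for w, p in zip(weights, preds))

def mse(weights, all_preds, truths):
    """Mean squared error of the combined GE-magnitude predictions."""
    return sum((combine(weights, p) - t) ** 2
               for p, t in zip(all_preds, truths)) / len(truths)

def search_weights(all_preds, truths, iters=2000, seed=0):
    """Search weight vectors summing to 1 (a stand-in for the paper's GA)."""
    rng = random.Random(seed)
    n = len(all_preds[0])
    best_w = [1.0 / n] * n          # start from the equal-weight ensemble
    best_e = mse(best_w, all_preds, truths)
    for _ in range(iters):
        raw = [rng.random() for _ in range(n)]
        s = sum(raw)
        w = [r / s for r in raw]    # normalise so the weights sum to 1
        e = mse(w, all_preds, truths)
        if e < best_e:
            best_w, best_e = w, e
    return best_w, best_e

# Three hypothetical regressors' predictions for four samples, plus the
# true gross-error magnitudes for those samples.
preds = [(2.0, 3.0, 2.5), (5.0, 4.0, 4.5), (1.0, 1.5, 1.2), (3.0, 3.5, 3.1)]
truths = [2.6, 4.4, 1.3, 3.2]
weights, err = search_weights(preds, truths)
print(err <= mse([1/3, 1/3, 1/3], preds, truths))  # never worse than equal weights
```

    Because the search starts from the equal-weight ensemble and only accepts improvements, the learned combination is guaranteed to be no worse than simple averaging on the training data, which mirrors the motivation for optimising the weights at all.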

    A comparative study of anomaly detection methods for gross error detection problems.

    The chemical industry requires highly accurate and reliable measurements to ensure smooth operation and effective monitoring of processing facilities. However, measured data inevitably contains errors from various sources. Traditionally in flow systems, data reconciliation through mass balancing is applied to reduce error by estimating balanced flows. However, this approach can only handle random errors. For non-random errors (called gross errors, GEs), which are caused by measurement bias, instrument failures, or process leaks, among others, this approach returns incorrect results. In recent years, many gross error detection (GED) methods have been proposed by the research community. It is recognised that the basic principle of GED is a special case of the detection of outliers (or anomalies) in data analytics. With the developments of Machine Learning (ML) research, patterns in the data can be discovered to provide effective detection of anomalous instances. In this paper, we present a comprehensive study of the application of ML-based Anomaly Detection methods (ADMs) to the GED context on a number of synthetic datasets and compare the results with several established GED approaches. We also perform data transformation on the measurement data and compare the associated results to the original results, as well as investigate the effect of training size on detection performance. One-class Support Vector Machine outperformed the other ADMs and five selected statistical tests for GED on Accuracy, F1 Score, and Overall Power, while the Interquartile Range (IQR) method obtained the best selectivity outcome among the top 6 ADMs and the five statistical tests. The results indicate that ADMs can potentially be applied to GED problems.
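    As a concrete illustration of the simplest detector family mentioned above, here is a stdlib-only sketch of the Interquartile Range (IQR) rule applied to flow measurements. The readings are invented, and the quartiles are computed by linear interpolation between closest ranks, which is one of several common conventions, not necessarily the one used in the paper.

```python
def iqr_outliers(values, k=1.5):
    """Return indices of values outside [Q1 - k*IQR, Q3 + k*IQR]."""
    s = sorted(values)
    n = len(s)

    def quartile(q):
        # Linear interpolation between the two closest ranks.
        pos = q * (n - 1)
        lo = int(pos)
        frac = pos - lo
        hi = min(lo + 1, n - 1)
        return s[lo] + frac * (s[hi] - s[lo])

    q1, q3 = quartile(0.25), quartile(0.75)
    iqr = q3 - q1
    low, high = q1 - k * iqr, q3 + k * iqr
    return [i for i, v in enumerate(values) if v < low or v > high]

# Simulated flow measurements; the reading at index 4 carries a gross
# (bias) error, while the rest fluctuate randomly around 10.
readings = [10.1, 9.9, 10.0, 10.2, 15.7, 9.8, 10.1, 10.0]
print(iqr_outliers(readings))  # → [4]
```

    Unlike the one-class SVM, the IQR rule needs no training phase, which is one reason simple statistical detectors remain competitive baselines in the GED comparisons described above.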

    Appropriate Timing of Fluoxetine and Statin Delivery Reduces the Risk of Secondary Bleeding in Ischemic Stroke Rats

    Background: Ongoing clinical trials are testing the effect of fluoxetine delivered post-stroke, where a majority of patients are taking statins. This study determined the influence of the timing of administration of fluoxetine and statin on the final infarct volume and the risk of secondary bleeding in an animal model of ischemic stroke. Methods and findings: Ischemic strokes were induced by endothelin-1 injection into two cortical sites of 10-12-month-old female rats, targeting the forelimb motor cortex. Combined medications (5 mg/kg fluoxetine and 1 mg/kg simvastatin) were orally administered beginning either 6-12 hours or 20-26 hours after stroke induction and continued daily for 90 days. Infarct volumes were assessed at poststroke day 91 using Nissl-stained coronal brain sections. Control animals typically had 5-13 mm3 infarct volumes following endothelin-1-induced stroke. Animals that received fluoxetine and simvastatin (FS) beginning 20-26 hours after stroke induction showed a strong trend of reduced infarct volume (3±0.3447 mm3 SEM, P=0.0563). Earlier drug delivery (6-12 hours after stroke) resulted in significantly larger infarct volumes (15.4±4.260 mm3 SEM, P=0.0157) when the drug groups were directly compared. Examination of the infarcts showed that earlier drug delivery induced secondary hemorrhagic infarcts, while later delivery did not (P=0.0427; Fisher’s exact test). Conclusion: There is a danger of secondary bleeding if fluoxetine and simvastatin are combined within 6-12 hours of ischemic stroke induction in rats, resulting in larger infarct volumes. Delaying fluoxetine and simvastatin delivery to 20-26 hours after stroke induction in rats, however, reduces infarct volume and significantly lowers the risk of secondary hemorrhagic infarcts.

    Epigenetic suppression of hippocampal calbindin-D28k by ΔFosB drives seizure-related cognitive deficits.

    The calcium-binding protein calbindin-D28k is critical for hippocampal function and cognition, but its expression is markedly decreased in various neurological disorders associated with epileptiform activity and seizures. In Alzheimer's disease (AD) and epilepsy, both of which are accompanied by recurrent seizures, the severity of cognitive deficits reflects the degree of calbindin reduction in the hippocampal dentate gyrus (DG). However, despite the importance of calbindin in both neuronal physiology and pathology, the regulatory mechanisms that control its expression in the hippocampus are poorly understood. Here we report an epigenetic mechanism through which seizures chronically suppress hippocampal calbindin expression and impair cognition. We demonstrate that ΔFosB, a highly stable transcription factor, is induced in the hippocampus in mouse models of AD and seizures, in which it binds and triggers histone deacetylation at the promoter of the calbindin gene (Calb1) and downregulates Calb1 transcription. Notably, increasing DG calbindin levels, either by direct virus-mediated expression or inhibition of ΔFosB signaling, improves spatial memory in a mouse model of AD. Moreover, levels of ΔFosB and calbindin expression are inversely related in the DG of individuals with temporal lobe epilepsy (TLE) or AD and correlate with performance on the Mini-Mental State Examination (MMSE). We propose that chronic suppression of calbindin by ΔFosB is one mechanism through which intermittent seizures drive persistent cognitive deficits in conditions accompanied by recurrent seizures.

    The Freshman, vol. 6, no. 8

    The Freshman was a weekly student newsletter issued on Mondays throughout the academic year. The newsletter included calendar notices and coverage of campus social events, lectures, and athletic teams. The intent of the publication was to create unity, a sense of community, and class spirit among first-year students.