Artificial intelligence for diagnostic and prognostic neuroimaging in dementia: a systematic review
Introduction: Artificial intelligence (AI) and neuroimaging offer new opportunities for the diagnosis and prognosis of dementia. Methods: We systematically reviewed studies reporting AI for neuroimaging in diagnosis and/or prognosis of cognitive neurodegenerative diseases. Results: A total of 255 studies were identified. Most studies relied on the Alzheimer's Disease Neuroimaging Initiative (ADNI) dataset. Algorithmic classifiers were the most commonly used AI method (48%), and discriminative models performed best for differentiating Alzheimer's disease from controls. The accuracy of algorithms varied with the patient cohort, imaging modalities, and stratifiers used. Few studies performed validation in an independent cohort. Discussion: The literature has several methodological limitations, including a lack of sufficient descriptions of algorithm development and of standard definitions. We make recommendations to improve model validation, including addressing key clinical questions, providing sufficient description of AI methods, and validating findings in independent datasets. Collaborative approaches between experts in AI and medicine will help achieve the promising potential of AI tools in practice. Highlights: There has been a rapid expansion in the use of machine learning for diagnosis and prognosis in neurodegenerative disease. Most studies (71%) relied on the Alzheimer's Disease Neuroimaging Initiative (ADNI) dataset, with no other individual dataset used more than five times. There has been a recent rise in the use of more complex discriminative models (e.g., neural networks), which performed better than other classifiers for classification of AD versus healthy controls. We make recommendations addressing methodological considerations, key clinical questions, and validation. We also make recommendations for the field more broadly: standardize outcome measures, address gaps in the literature, and monitor sources of bias
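As a rough illustration of the workflow the review evaluates, the sketch below trains a discriminative classifier on tabular imaging features and then checks it against a held-out cohort, the validation step few reviewed studies performed. It is a minimal sketch, not any reviewed study's method; the data are synthetic placeholders and the feature layout is assumed.

```python
# Minimal sketch (not from the review): a discriminative AD-vs-control
# classifier on tabular imaging features, with cross-validated development
# and validation in a separate cohort, as the authors recommend.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)
# Hypothetical data: rows = participants, columns = imaging features
X_dev, y_dev = rng.normal(size=(200, 50)), rng.integers(0, 2, 200)
X_ext, y_ext = rng.normal(size=(80, 50)), rng.integers(0, 2, 80)

model = make_pipeline(StandardScaler(), LogisticRegression(max_iter=1000))
cv_auc = cross_val_score(model, X_dev, y_dev, cv=5, scoring="roc_auc")
print(f"development AUC: {cv_auc.mean():.2f} +/- {cv_auc.std():.2f}")

# External validation in an independent cohort (accuracy on held-out data)
model.fit(X_dev, y_dev)
print(f"external accuracy: {model.score(X_ext, y_ext):.2f}")
```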
Pitfalls and artifacts using the D-SPECT dedicated cardiac camera
Myocardial perfusion imaging is a well-established and widely used imaging technique for the assessment of patients with known or suspected coronary artery disease. Pitfalls and artifacts associated with conventional gamma cameras are well known, and ways to avoid and correct them have been described. In recent years, dedicated cardiac cameras with solid-state detectors have been introduced and have been shown to offer improved accuracy in addition to new imaging protocols and novel applications. The purpose of this manuscript is to familiarize readers with the causes and effects of technical, patient-related, and operator-related pitfalls and artifacts associated with the D-SPECT dedicated cardiac camera with solid-state detectors. The manuscript offers guidance on how to avoid these factors, how to detect them, and how best to correct for them, thereby providing high-quality diagnostic images
Physical principles of radiation detection in sample counters
Chapter 1 outlines the physics of radiation detection, focusing on the properties of sodium iodide detectors. The chapter closes with brief descriptions of liquid scintillators and solid-state detectors
Dosimetry in Lu-177-DOTATATE peptide receptor radionuclide therapy: a systematic review
Purpose: 177Lu- labelled somatostatin analog DOTATATE is an excellent vector for systemic radiation therapy in NETs. However, this treatment can affect organ functions or impact the quality of life of the patient, due to collateral irradiation of normal body organs. Here we conducted a comprehensive systematic review on organ and tumour dosimetry in 177Lu-DOTATATE therapy.Design: in this review, published peer-reviewed articles on organ dosimetry in patients following PRRT using 177Lu-DOTATATE have been included. All the articles were screened for inclusion based on the title and abstract of the study. PubMed, Publons and DOAJ were used as search engines to conduct a systematic search in the database. Articles were categorized into three groups: (1) Clinical studies describing the technical parameter and method of dosimetry in 177Lu-DOTATATE therapy or (2) Organ dosimetry in 177Lu-DOTATATE treatment or (3) Tumour dosimetry in 177Lu- DOTATATE treatment.Result: in total, 694 studies were retrieved from database searching on NET and PRRT and 43 original articles on 177Lu-DOTATATE dosimetry were included in this review. The median absorbed dose per unit of administered activity for kidneys, spleen, liver, bone marrow and tumour were 0.64 (0.47–0.90 Gy/GBq), 1.23 (0.53–1.59 Gy/GBq), 0.54 (0.23–0.62 Gy/GBq), 0.04 (0.02–0.06 Gy/GBq) and 4.6 (3.09–9.47 Gy/GBq), respectively.Conclusion: according to the present dosimetric review, 177Lu-DOTATATE PRRT appears to be a safe and reliable treatment option for advanced GEP-NETs. From the dosimetric point of view, kidneys are theoretically the major organs at risk in 177Lu-DOTATATE treatment. The optimization of the number of treatment cycles beyond the prescribed limit of four and the maximum administered activity in each cycle must be determined by individual patient dosimetry in order to reduce the risk of organ toxicities whilst maximizing therapeutic efficacy
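To make the Gy/GBq figures concrete, the sketch below computes an organ absorbed dose per unit administered activity in the usual MIRD style: fit a mono-exponential to serial activity measurements, integrate it analytically, and multiply by an S-value. All numbers (time points, activities, S-value) are hypothetical placeholders, not values from the reviewed studies.

```python
# Minimal sketch (illustrative values, not the reviewed protocols): organ
# absorbed dose per unit administered activity via a mono-exponential fit
# and the MIRD scheme D = A_tilde * S.
import numpy as np
from scipy.optimize import curve_fit

def mono_exp(t, a0, lam):
    """Mono-exponential clearance model A(t) = A0 * exp(-lam * t)."""
    return a0 * np.exp(-lam * t)

t_h = np.array([4.0, 24.0, 96.0, 168.0])        # imaging time points [h]
a_mbq = np.array([900.0, 700.0, 300.0, 130.0])  # hypothetical kidney activity [MBq]
admin_gbq = 7.4                                 # administered activity [GBq]

(a0, lam), _ = curve_fit(mono_exp, t_h, a_mbq, p0=(1000.0, 0.01))

# Time-integrated activity: integral of A0*exp(-lam*t) from 0 to inf = A0/lam
a_tilde = a0 / lam                              # [MBq*h]

# Hypothetical S-value [Gy per MBq*h]; real values come from phantom-based
# tables (e.g., OLINDA/EXM) and depend on organ mass and radionuclide.
s_value = 6.0e-5
dose_gy = a_tilde * s_value
print(f"kidney absorbed dose: {dose_gy / admin_gbq:.2f} Gy/GBq")
```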
A novel videography method for generating crack-extension resistance curves in small bone samples
Assessment of bone quality is an emerging solution for quantifying the effects of bone pathology or treatment. Perhaps one of the most important parameters characterising bone quality is its toughness behaviour; in particular, fracture toughness is becoming a popular means of evaluating bone quality. The field is moving from a single-value approach that models bone as a linear-elastic material (using the stress intensity factor, K) towards full crack-extension resistance curves (R-curves) using a non-linear model (the strain energy release rate in J-R curves). However, for explanted human bone or small animal bones, measuring crack-extension resistance curves is difficult due to size constraints at the millimetre and sub-millimetre scale. This research proposes a novel "whitening front tracking" method that uses videography to generate full fracture resistance curves in small bone samples where crack propagation cannot typically be observed. Here we present this method on sharp-edge-notched samples (<1 mm × 1 mm × length) prepared from four human femora and tested in three-point bending. Each sample was loaded in a mechanical tester, with crack propagation recorded using videography and analysed using an algorithm to track the whitening (damage) zone. Using the "whitening front tracking" method, full R-curves and J-R curves could be generated for these samples. The curves for this antiplane longitudinal orientation were similar to those found in the literature, lying between the published longitudinal and transverse orientations. The proposed technique shows the ability to generate full crack-extension resistance curves by tracking the whitening front propagation, overcoming the small-size limitations and the single-value approach
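The abstract does not give its analysis equations, so the sketch below shows one common route from load and crack-length data to a J value for a single-edge-notch bend (SENB) specimen: an elastic term from K plus a plastic term from the area under the load-displacement curve, in the style of ASTM E1820. The elastic constants, geometry, eta factor and measurement values are all assumed placeholders, not the paper's pipeline.

```python
# Minimal sketch (assumed ASTM E1820-style SENB formulas, hypothetical
# constants): one point of a J-R curve from load P, crack length a (e.g.,
# recovered by whitening-front tracking), and plastic area A_pl.
import numpy as np

E, nu = 20e9, 0.3           # assumed bone elastic constants [Pa], [-]
B, W, S = 1e-3, 1e-3, 4e-3  # thickness, width, span of the SENB sample [m]

def k_senb(P, a):
    """Stress intensity for single-edge-notch bending (standard polynomial)."""
    x = a / W
    f = (3 * (S / W) * np.sqrt(x) * (1.99 - x * (1 - x) *
         (2.15 - 3.93 * x + 2.7 * x**2))) / (2 * (1 + 2 * x) * (1 - x)**1.5)
    return P / (B * np.sqrt(W)) * f

def j_total(P, a, A_pl):
    """J = elastic term from K plus plastic term from the plastic area."""
    j_el = k_senb(P, a)**2 * (1 - nu**2) / E
    j_pl = 1.9 * A_pl / (B * (W - a))   # eta ~ 1.9 assumed for deep-notch SENB
    return j_el + j_pl

# Hypothetical measurements: load [N], crack length [m], plastic area [J]
print(f"J = {j_total(1.2, 0.45e-3, 2e-5):.1f} J/m^2")
```

Repeating this at each video frame, with a(t) from the tracked whitening front, yields the full J-R curve.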
ADCOMS sensitivity versus baseline diagnosis and progression phenotypes
Background: The Alzheimer's Disease COMposite Score (ADCOMS) is more sensitive in clinical trials than conventional measures when assessing pre-dementia. This study compares ADCOMS trajectories using clustered progression characteristics to better understand different patterns of decline. Methods: Post-baseline ADCOMS values were analyzed for sensitivity using the mean-to-standard-deviation ratio (MSDR), partitioned by baseline diagnosis and compared with the original scales upon which ADCOMS is based. Because baseline diagnosis was not a particularly reliable predictor of progression, individuals were also grouped into similar ADCOMS progression trajectories using clustering methods, and the MSDR was compared for each progression group. Results: ADCOMS demonstrated increased sensitivity for clinically important progression groups. ADCOMS did not show statistically significant sensitivity or clinical relevance for the less-severe baseline diagnoses and marginal progression groups. Conclusions: This analysis complements and extends previous work validating the sensitivity of ADCOMS. The large data set permitted a novel evaluation by clustered progression group.
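The sketch below shows the two computations the abstract names, under assumed definitions: k-means clustering of score trajectories into progression groups, then the MSDR (mean divided by standard deviation of change from baseline) within each group. The trajectory data are simulated placeholders, and the cluster count is an arbitrary choice, not the paper's.

```python
# Minimal sketch (assumed definitions, not the paper's code): cluster ADCOMS
# trajectories, then compute the mean-to-standard-deviation ratio (MSDR) of
# change from baseline within each progression group.
import numpy as np
from sklearn.cluster import KMeans

rng = np.random.default_rng(1)
# Hypothetical data: rows = participants, columns = visits (ADCOMS scores)
scores = np.cumsum(rng.normal(0.02, 0.05, size=(300, 5)), axis=1)

groups = KMeans(n_clusters=3, n_init=10, random_state=0).fit_predict(scores)

for g in range(3):
    change = scores[groups == g, -1] - scores[groups == g, 0]
    msdr = change.mean() / change.std(ddof=1)
    print(f"cluster {g}: n={np.sum(groups == g)}, MSDR={msdr:.2f}")
```

A larger MSDR means a given true decline is easier to detect against between-subject variability, which is why it serves as the sensitivity measure here.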
Discovery molecular imaging digital ready PET/CT performance evaluation according to the NEMA NU2-2012 standard
Objectives: The aim of this study was to evaluate and benchmark the performance characteristics of the General Electric (GE) Discovery Molecular Imaging (MI) Digital Ready (DR) PET/CT. Materials and methods: Performance evaluation against the National Electrical Manufacturers Association (NEMA) NU 2-2012 standard was performed on three GE Discovery MI DR PET/CT systems installed across different UK centres. The Discovery MI DR performance was compared with the Siemens Biograph mCT Flow, Philips Ingenuity TF and GE Discovery 690 fully analogue PET/CT systems. In addition, as the Discovery MI DR is upgradable to the Digital MI with silicon photomultipliers, performance characteristics between the analogue and digital systems were compared to assess the potential benefits of a system upgrade. Results: The average NEMA results across the three Discovery MI DR scanners were: sensitivity 7.3 cps/kBq; spatial resolution (full width at half maximum) 5.5 mm radial, 4.5 mm tangential and 6 mm axial at 10 cm from the centre of the field of view; peak noise equivalent count rate 142 kcps; scatter fraction 37.1%; and contrast recovery coefficients from the International Electrotechnical Commission phantom ranging from 52% to 87% for 10–37-mm diameter spheres. Conclusion: All three Discovery MI DR systems tested in this study exceeded the manufacturer's NEMA specification, yet variability between scanners was noted. The Discovery MI DR showed similar performance to the Discovery 690 and Ingenuity TF, but lower sensitivity and spatial resolution than the Biograph mCT Flow. The Discovery MI DR showed lower spatial resolution and contrast recovery than the 20-cm field-of-view Digital MI
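Two of the reported metrics have simple standard definitions, sketched below: the noise-equivalent count rate NECR = T²/(T + S + R) and the scatter fraction SF = S/(S + T). The count rates used are hypothetical, chosen only so the printed scatter fraction lands near the abstract's 37.1%; they are not the study's measurements.

```python
# Minimal sketch (standard NEMA NU 2 definitions, not the study's software):
# noise-equivalent count rate and scatter fraction from the true (T),
# scattered (S) and random (R) coincidence rates at one activity level.
def necr(trues, scatter, randoms):
    """NECR = T^2 / (T + S + R), all rates in counts per second."""
    return trues**2 / (trues + scatter + randoms)

def scatter_fraction(trues, scatter):
    """SF = S / (S + T), evaluated where random coincidences are negligible."""
    return scatter / (scatter + trues)

# Hypothetical count rates [cps] for one frame of the NEMA scatter phantom
T, S, R = 200e3, 118e3, 60e3
print(f"NECR = {necr(T, S, R) / 1e3:.0f} kcps, SF = {scatter_fraction(T, S):.1%}")
```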
Segmenting degenerated lumbar intervertebral discs from MR images
Magnetic Resonance Imaging is the modality of reference for diagnosing intervertebral disc degeneration, a condition related to chronic back pain. Segmentation of intervertebral discs is a prerequisite for computer-aided diagnosis, and could also serve in computer-based surgery planning. A small number of studies report on disc segmentation methods applied to normal discs, while segmentation of degenerated discs remains an open issue. In the present study, a testing sample of 26 normal and 49 degenerated discs from T2-weighted midsagittal MR images of the lumbar spine was utilized to investigate the performance of two different segmentation methods. The first, based on the Fuzzy C-Means (FCM) algorithm with three tissue classes (disc, bone and cerebrospinal fluid), suffered from severe leakage of the disc border due to overlapping grey-level values between the disc and surrounding tissues. To overcome this problem a combined method was developed, utilizing a probabilistic atlas of the disc along with the FCM algorithm. The probabilistic atlas was designed on the basis of an additional sample of 40 manually segmented normal intervertebral discs and was rigidly registered to images of the testing sample by means of a landmark-based registration technique. The combined method reduced border leakage and achieved statistically significantly improved performance compared with the FCM method for all metrics tested (p < 0.01). Specifically, sample overlap accuracies (mean ± standard deviation) were 0.81±0.06 for normal and 0.77±0.07 for degenerated discs with the combined method, versus 0.59±0.19 and 0.63±0.15 respectively with the FCM method. Sample root mean square border distances of the combined method were 1.53 and 1.91 pixels for normal and degenerated discs. In conclusion, incorporation of prior anatomical knowledge into the FCM method resulted in significantly enhanced performance, demonstrating increased potential in degenerated disc segmentation.
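The sketch below shows one plausible way to combine FCM with an atlas prior: run the standard fuzzy c-means membership and centroid updates, but multiply the memberships by atlas probabilities and renormalize each iteration. This is a generic formulation on 1-D intensities, with assumed parameters, not the paper's exact method; a flat prior, as used in the toy example, reduces it to plain FCM.

```python
# Minimal sketch (generic atlas-weighted FCM, assumptions noted): standard
# fuzzy c-means updates with memberships reweighted by an atlas prior.
import numpy as np

def fcm_with_atlas(x, prior, c=3, m=2.0, iters=50):
    """x: (N,) intensities; prior: (N, c) atlas probabilities per class."""
    rng = np.random.default_rng(0)
    centers = rng.choice(x, size=c)                         # random init
    for _ in range(iters):
        d = np.abs(x[:, None] - centers[None, :]) + 1e-9    # (N, c) distances
        # Standard FCM membership: u_ik = 1 / sum_j (d_ik / d_jk)^(2/(m-1))
        u = 1.0 / np.sum((d[:, :, None] / d[:, None, :]) ** (2 / (m - 1)), axis=2)
        u = u * prior                                       # atlas weighting
        u = u / u.sum(axis=1, keepdims=True)                # renormalize
        um = u ** m
        centers = (um * x[:, None]).sum(axis=0) / um.sum(axis=0)
    return u, centers

# Hypothetical 1-D example: three intensity classes, flat (uninformative) prior
rng = np.random.default_rng(1)
x = np.concatenate([rng.normal(mu, 5.0, 200) for mu in (40.0, 100.0, 160.0)])
u, centers = fcm_with_atlas(x, prior=np.full((x.size, 3), 1 / 3))
print(np.sort(centers))
```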
Biomarkers of Inflammation Increase with Tau and Neurodegeneration but not with Amyloid-β in a Heterogeneous Clinical Cohort
BACKGROUND: Neuroinflammation is an integral part of Alzheimer’s disease (AD) pathology. Inflammatory mediators can exacerbate the production of amyloid-β (Aβ), the propagation of tau pathology, and neuronal loss. OBJECTIVE: To evaluate the relationship between inflammation markers and established markers of AD in a mixed memory clinic cohort. METHODS: 105 cerebrospinal fluid (CSF) samples from a clinical cohort under investigation for cognitive complaints were analyzed. Levels of Aβ42, total tau, and phosphorylated tau were measured as part of the clinical pathway. Analysis of inflammation markers in CSF samples was performed using multiplex immune assays. Participants were grouped according to their Aβ, tau, and neurodegeneration status, and the Paris-Lille-Montpellier (PLM) scale was used to assess the likelihood of AD. RESULTS: Of 102 inflammatory markers analyzed, 19 and 23 markers were significantly associated with CSF total tau and phosphorylated tau levels, respectively (p < 0.001), while none were associated with Aβ42. The CSF concentrations of 4 inflammation markers were markedly elevated with increasing PLM class, indicating increased likelihood of AD (p < 0.001). Adenosine deaminase, an enzyme involved in sleep homeostasis, was the single best predictor of a high likelihood of AD (AUROC 0.788). Functional pathway analysis demonstrated a widespread role for inflammation in neurodegeneration, with certain pathways explaining over 30% of the variability in tau values. CONCLUSION: CSF inflammation markers increase significantly with tau and neurodegeneration, but not with Aβ, in this mixed memory clinic cohort. Thus, such markers could become useful for the clinical diagnosis of neurodegenerative disorders alongside the established Aβ and tau measures
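The headline AUROC for a single continuous marker is straightforward to reproduce in form, as sketched below with simulated concentrations; the data, group shift and cohort labels are placeholders, not the study's measurements.

```python
# Minimal sketch (simulated values, not the study's data): AUROC of a single
# CSF marker concentration as a predictor of a binary high-likelihood-of-AD
# label, the statistic reported for adenosine deaminase (AUROC 0.788).
import numpy as np
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(2)
n = 105
label = rng.integers(0, 2, n)                   # 1 = high likelihood of AD
# Hypothetical concentrations, shifted upward in the positive group
marker = rng.normal(loc=10 + 3 * label, scale=4, size=n)

print(f"AUROC = {roc_auc_score(label, marker):.3f}")
```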