A Headset Method for Measuring the Visual Temporal Discrimination Threshold in Cervical Dystonia
Background: The visual temporal discrimination threshold (TDT) is the shortest time interval at which one can determine two stimuli to be asynchronous; it meets criteria for a valid endophenotype in adult-onset idiopathic focal dystonia, a poorly penetrant disorder. Temporal discrimination is assessed in the hospital laboratory; for unaffected relatives of multiplex adult-onset dystonia patients, distance from the hospital is a barrier to data acquisition. We devised a portable headset method for visual temporal discrimination determination, and our aim was to validate this portable tool against the traditional laboratory-based method in a group of patients and in a large cohort of healthy controls.
Methods: Visual TDTs were examined in two groups: 1) 96 healthy control participants, stratified by age and gender, and 2) 33 cervical dystonia patients, using two methods of data acquisition: the traditional table-top laboratory-based system and the novel portable headset method. The order of assessment was randomized in the control group. The results obtained by each technique were compared.
Results: Visual temporal discrimination in healthy control participants demonstrated similar age and gender effects by the headset method as found by the table-top examination. There were no significant differences between visual TDTs obtained using the two methods, both for the control participants and for the cervical dystonia patients. Bland–Altman testing showed good concordance between the two methods in both patients and in controls.
Discussion: The portable headset device is a reliable and accurate method for visual temporal discrimination testing outside the laboratory and will facilitate increased TDT data collection beyond the hospital setting. This is of particular importance in multiplex families, where data collection from all available members of the pedigree is important for exome sequencing studies.
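The agreement analysis reported above can be sketched briefly. A Bland–Altman comparison plots the difference between each pair of measurements (headset minus table-top) against their mean, and summarises agreement as the bias (mean difference) and the 95% limits of agreement. A minimal sketch with synthetic, illustrative TDT values (not data from the study):

```python
import random
import statistics

# Hypothetical paired visual TDT measurements in milliseconds;
# these numbers are illustrative only, not taken from the study.
random.seed(0)
tabletop = [random.gauss(30, 5) for _ in range(20)]          # table-top TDTs
headset = [t + random.gauss(0, 1.5) for t in tabletop]       # headset TDTs

# Bland–Altman statistics: bias (mean difference) and 95% limits of agreement.
diffs = [h - t for h, t in zip(headset, tabletop)]
bias = statistics.mean(diffs)
sd = statistics.stdev(diffs)
loa_low, loa_high = bias - 1.96 * sd, bias + 1.96 * sd

print(f"bias = {bias:.2f} ms, limits of agreement = [{loa_low:.2f}, {loa_high:.2f}] ms")
```

Good concordance corresponds to a bias near zero and limits of agreement narrow enough to be clinically unimportant for TDT measurement.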
Development and implementation of an ultralow-dose CT protocol for the assessment of cerebrospinal shunts in adult hydrocephalus
Background: Cerebrospinal fluid shunts in the treatment of hydrocephalus, although associated with clinical benefit, have a high failure rate, with repeat computed tomography (CT) imaging resulting in a substantial cumulative radiation dose. We therefore sought to develop a whole-body ultralow-dose (ULD) CT protocol for the investigation of shunt malfunction and compare it with the reference standard, the plain radiographic shunt series (PRSS).
Methods: Following ethical approval, and using an anthropomorphic phantom and a human cadaveric ventriculoperitoneal shunt model, a whole-body ULD-CT protocol incorporating two iterative reconstruction (IR) algorithms, pure IR and hybrid IR (60% filtered back projection and 40% IR), was evaluated in 18 adult patients after new shunt implantation or where shunt malfunction was suspected. Effective dose (ED) and image quality were analysed.
Results: ULD-CT permitted a 36% radiation dose reduction (median ED 0.16 mSv, range 0.07–0.17 mSv, versus 0.25 mSv, range 0.06–1.69 mSv, for PRSS; p = 0.002). Shunt visualisation in the thoracic and abdominal cavities was improved with ULD-CT with pure IR (p = 0.004 and p = 0.031, respectively) and, in contrast to PRSS, ULD-CT permitted visualisation of the entire shunt course (p < 0.001), the distal shunt entry point, and the location of the shunt tip in all cases. For shunt complications, ULD-CT had perfect specificity; false positives (3/22, 13.6%) were observed with PRSS.
Conclusions: At a significantly reduced radiation dose, whole-body ULD-CT with pure IR demonstrated diagnostic superiority over PRSS in the evaluation of cerebrospinal fluid shunt malfunction.
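The reported 36% dose reduction follows directly from the two median effective doses. A quick arithmetic check, using only the medians quoted in the abstract:

```python
# Worked check of the reported relative dose reduction, using the
# median effective doses quoted in the abstract.
uld_ct_msv = 0.16   # median ED for whole-body ULD-CT
prss_msv = 0.25     # median ED for the plain radiographic shunt series

reduction = 1 - uld_ct_msv / prss_msv
print(f"relative dose reduction: {reduction:.0%}")  # prints "relative dose reduction: 36%"
```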
Dietary-Induced Bacterial Metabolites Reduce Inflammation and Inflammation-Associated Cancer via Vitamin D Pathway
Environmental factors, including westernised diets and alterations to the gut microbiota, are considered risk factors for inflammatory bowel diseases (IBD). The mechanisms underpinning diet-microbiota-host interactions are poorly understood in IBD. We present evidence that feeding a lard-based high-fat (HF) diet can protect mice from developing DSS-induced acute and chronic colitis and colitis-associated cancer (CAC) by significantly reducing tumour burden/incidence, immune cell infiltration, cytokine production, and cell proliferation. We show that HF protection was associated with increased gut microbial diversity, a significant reduction in Proteobacteria, and an increase in Firmicutes and Clostridium cluster XIVa abundance. Microbial functionality was modulated in terms of signalling fatty acids and bile acids (BA). Faecal secondary BAs were significantly induced to include moieties that can activate the vitamin D receptor (VDR), a nuclear receptor richly represented in the intestine and colon. Indeed, colonic VDR downstream target genes were upregulated in HF-fed mice and in combinatorial lipid-BA-treated intestinal HT29 epithelial cells. Collectively, our data indicate that a HF diet protects against colitis and CAC risk through gut microbiota and BA metabolites modulating vitamin D-targeting pathways. Our data highlight the complex relationship between dietary fat-induced alterations of microbiota-host interactions in IBD/CAC pathophysiology.
Thrombotic and hemorrhagic complications of COVID-19 in adults hospitalized in high-income countries compared with those in adults hospitalized in low- and middle-income countries in an international registry
Background: COVID-19 has been associated with a broad range of thromboembolic, ischemic, and hemorrhagic complications (coagulopathy complications). Most studies have focused on patients with severe disease from high-income countries (HICs). Objectives: The main aims were to compare the frequency of coagulopathy complications in developing countries (low- and middle-income countries [LMICs]) with those in HICs, delineate the frequency across a range of treatment levels, and determine associations with in-hospital mortality. Methods: Adult patients enrolled in an observational, multinational registry, the International Severe Acute Respiratory and Emerging Infections COVID-19 study, between January 1, 2020, and September 15, 2021, met inclusion criteria, including admission to a hospital for laboratory-confirmed, acute COVID-19 and data on complications and survival. The advanced-treatment cohort received care, such as admission to the intensive care unit, mechanical ventilation, or inotropes or vasopressors; the basic-treatment cohort did not receive any of these interventions. Results: The study population included 495,682 patients from 52 countries, with 63% from LMICs and 85% in the basic treatment cohort. The frequency of coagulopathy complications was higher in HICs (0.76%-3.4%) than in LMICs (0.09%-1.22%). Complications were more frequent in the advanced-treatment cohort than in the basic-treatment cohort. Coagulopathy complications were associated with increased in-hospital mortality (odds ratio, 1.58; 95% CI, 1.52-1.64). The increased mortality associated with these complications was higher in LMICs (58.5%) than in HICs (35.4%). After controlling for coagulopathy complications, treatment intensity, and multiple other factors, the mortality was higher among patients in LMICs than among patients in HICs (odds ratio, 1.45; 95% CI, 1.39-1.51). 
Conclusion: In a large, international registry of patients hospitalized for COVID-19, coagulopathy complications were more frequent in HICs than in LMICs (developing countries). The increased mortality associated with coagulopathy complications was of a greater magnitude among patients in LMICs. Additional research is needed regarding timely diagnosis of and intervention for coagulation derangements associated with COVID-19, particularly in limited-resource settings.
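The odds ratios reported above summarise the association between an exposure (e.g. a coagulopathy complication) and in-hospital death from a 2×2 contingency table, with a Wald confidence interval computed on the log scale. A minimal sketch with hypothetical counts (the cell values below are illustrative and not taken from the registry):

```python
import math

# Hypothetical 2x2 table; rows: coagulopathy complication yes/no,
# columns: died / survived in hospital. Counts are illustrative only.
a, b = 120, 380     # complication: died, survived
c, d = 400, 2000    # no complication: died, survived

# Odds ratio and Wald 95% confidence interval on the log scale.
odds_ratio = (a * d) / (b * c)
se_log_or = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)
ci_low = math.exp(math.log(odds_ratio) - 1.96 * se_log_or)
ci_high = math.exp(math.log(odds_ratio) + 1.96 * se_log_or)

print(f"OR = {odds_ratio:.2f}, 95% CI [{ci_low:.2f}, {ci_high:.2f}]")
```

The registry analysis additionally adjusted for treatment intensity and other covariates (typically via logistic regression), so its reported odds ratios are adjusted rather than crude estimates like the one above.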