Perspective: Vitamin D deficiency and COVID-19 severity – plausibly linked by latitude, ethnicity, impacts on cytokines, ACE2 and thrombosis
Background
SARS-CoV-2 coronavirus infection ranges from asymptomatic through to fatal COVID-19, characterized by a "cytokine storm" and lung failure. Vitamin D deficiency has been postulated as a determinant of severity.
Objectives
To review the evidence relevant to vitamin D and COVID-19.
Methods
Narrative review.
Results
Regression modelling shows that more northerly countries in the Northern Hemisphere are currently (May 2020) showing relatively high COVID-19 mortality, with an estimated 4.4% increase in mortality for each 1 degree latitude north of 28 degrees North (P = 0.031) after adjustment for age of population. This supports a role for ultraviolet B acting via vitamin D synthesis. Factors associated with worse COVID-19 prognosis include old age, ethnicity, male sex, obesity, diabetes and hypertension, and these also associate with deficiency of vitamin D or its response. Vitamin D deficiency is also linked to severity of childhood respiratory illness. Experimentally, vitamin D increases the ratio of angiotensin-converting enzyme 2 (ACE2) to ACE, thus increasing angiotensin II hydrolysis and reducing the subsequent inflammatory cytokine response to pathogens and lung injury.
Conclusions
Substantial evidence supports a link between vitamin D deficiency and COVID-19 severity, but it is all indirect. Community-based placebo-controlled trials of vitamin D supplementation may be difficult. Further evidence could come from study of COVID-19 outcomes in large cohorts with information on prescribing data for vitamin D supplementation or assay of serum unbound 25(OH) vitamin D levels. Meanwhile, vitamin D supplementation should be strongly advised for people likely to be deficient.
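The latitude effect reported above (a 4.4% rise in mortality per degree north of 28°N) is a log-linear relationship; a minimal sketch, using entirely made-up baseline mortality figures, shows how such a slope is recovered by regression on the log scale:

```python
import numpy as np

# A 4.4% rise in mortality per degree of latitude north of 28°N corresponds
# to a slope of log(1.044) on the log-mortality scale.
slope = np.log(1.044)

latitudes = np.array([30.0, 40.0, 50.0, 60.0])
baseline = 10.0  # deaths per 100,000 at 28°N (hypothetical figure)
mortality = baseline * np.exp(slope * (latitudes - 28))

# Recover the slope from the synthetic data by linear regression on log(y)
fit = np.polyfit(latitudes - 28, np.log(mortality), 1)
pct_per_degree = (np.exp(fit[0]) - 1) * 100
print(round(pct_per_degree, 1))  # → 4.4
```

The published analysis additionally adjusted for the age of each country's population; the sketch omits covariates to keep the exponential interpretation of the coefficient visible.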
MARDy: Mycology Antifungal Resistance Database
Summary: The increase in antifungal drug resistance is a major global human health concern and threatens agriculture and food security; to tackle these concerns, it is important to understand the mechanisms that cause antifungal resistance. The curated Mycology Antifungal Resistance Database (MARDy) is a web service of antifungal drug resistance mechanisms, including amino acid substitutions, tandem repeat sequences and genome ploidy. MARDy is implemented on a Linux, Apache, MySQL and PHP web development platform and includes a local installation of BLASTn against the database of curated genes. Funders: Antimicrobial Research Collaborative (ARC); Natural Environment Research Council (NERC).
Association of comorbidity and health service usage among patients with dementia in the UK: a population-based study
The majority of people with dementia have other long-term diseases, the presence of which may affect the progression and management of dementia. This study aimed to identify subgroups with higher healthcare needs by analysing how primary care consultations, number of prescriptions and hospital admissions by people with dementia vary with having additional long-term diseases (comorbidity).
A retrospective cohort study based on health data from the Clinical Practice Research Datalink (CPRD) was conducted. Incident cases of dementia diagnosed in the year starting 1/3/2008 were selected and followed for up to 5 years. The number of comorbidities was obtained from a set of 34 chronic health conditions. Service usage (primary care consultations, hospitalisations and prescriptions) and time-to-death were determined during follow-up. Multilevel negative binomial regression and Cox regression, adjusted for age and gender, were used to model differences in service usage and death between differing numbers of comorbidities.
Data from 4999 people (14,866 person-years of follow-up) were analysed. Overall, 91.7% of people had one or more additional comorbidities. Compared with those with 2 or 3 comorbidities, people with ≥6 comorbidities had higher rates of primary care consultations (rate ratio (RR) 1.31, 95% CI 1.25 to 1.36), prescriptions (RR 1.68, 95% CI 1.57 to 1.81) and hospitalisation (RR 1.62, 95% CI 1.44 to 1.83), and a higher risk of death (HR 1.56, 95% CI 1.37 to 1.78).
In the UK, people with dementia who have higher numbers of comorbidities die earlier and have considerably higher health service usage in terms of primary care consultations, hospital admissions and prescribing. This study provides strong evidence that comorbidity is a key factor that should be considered when allocating resources and planning care for people with dementia.
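The rate ratios above come from negative binomial models, but the quantity itself is simply a ratio of event rates between groups; a minimal sketch with invented consultation counts illustrates the arithmetic behind an RR of 1.31:

```python
# Illustrative (made-up) consultation counts and person-years of follow-up
# for the two comparison groups in the abstract.
ref_consults, ref_py = 2620, 1000.0    # 2 or 3 comorbidities (reference)
high_consults, high_py = 3430, 1000.0  # ≥6 comorbidities

# Rate ratio: consultations per person-year, high group relative to reference
rr = (high_consults / high_py) / (ref_consults / ref_py)
print(round(rr, 2))  # → 1.31
```

The regression model in the study additionally adjusts for age and gender and accounts for overdispersion in the counts, which a raw ratio does not.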
Increased Adiposity, Dysregulated Glucose Metabolism and Systemic Inflammation in Galectin-3 KO Mice
PMCID: PMC3579848
A prospective study to evaluate the accuracy of pulse power analysis to monitor cardiac output in critically ill patients
<p>Abstract</p> <p>Background</p> <p>Intermittent measurement of cardiac output may be performed using a lithium dilution technique (LiDCO). This can then be used to calibrate a pulse power algorithm of the arterial waveform, which provides a continuous estimate of this variable. The purpose of this study was to examine the duration of accuracy of the pulse power algorithm in critically ill patients with respect to time, compared with measurements of cardiac output by an independent technique.</p> <p>Methods</p> <p>Pulse power analysis was performed on critically ill patients using a proprietary commercial monitor (PulseCO). All measurements were made using an in-dwelling radial artery line and according to the manufacturer's instructions. Intermittent measurements of cardiac output were made with LiDCO in order to validate the pulse power measurements. These were made at baseline and then after 1, 2, 4 and 8 hours. The LiDCO measurement was considered the reference for comparison in this study. The two methods of measuring cardiac output were then compared by linear regression and a Bland-Altman analysis. An error rate for the limits of agreement (LOA) between the two techniques of less than 30% was defined as acceptable for this study.</p> <p>Results</p> <p>14 critically ill medical and surgical patients were enrolled over a three-month period. At baseline patients showed a wide range of cardiac output (median 7.5 L/min, IQR 5.1-9.0 L/min). The bias and limits of agreement between the two techniques were deemed acceptable for the first four hours of the study, with percentage errors of 29%, 22% and 28% respectively. The percentage error at eight hours following calibration increased to 36%. The ability of the PulseCO to detect changes in cardiac output was assessed with a similar analysis. The PulseCO tracked the changes in cardiac output with adequate accuracy for the first four hours, with percentage errors of 20%, 24% and 25%.
However, at eight hours the error had increased to 43%.</p> <p>Conclusion</p> <p>The agreement between lithium dilution cardiac output and the pulse power algorithm in the PulseCO monitor remains acceptable for up to four hours in critically ill patients.</p>
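The percentage-error criterion used in this study (limits of agreement relative to mean cardiac output, with less than 30% deemed acceptable) can be sketched as follows; the paired readings below are hypothetical, not the study's data:

```python
import numpy as np

# Bland-Altman percentage error: half-width of the limits of agreement
# (±1.96 SD of the between-method differences) relative to mean cardiac
# output. Readings are illustrative, in L/min.
lidco   = np.array([7.5, 5.1, 9.0, 6.2, 8.4])  # reference (lithium dilution)
pulseco = np.array([7.9, 4.8, 9.6, 5.9, 8.9])  # continuous pulse power

diffs = pulseco - lidco
bias = diffs.mean()                       # systematic offset between methods
loa_half_width = 1.96 * diffs.std(ddof=1)
pct_error = 100 * loa_half_width / np.mean((lidco + pulseco) / 2)

acceptable = pct_error < 30  # the study's acceptance threshold
print(acceptable)  # → True
```

The study applied this calculation at each time point after calibration, which is how the drift in agreement beyond four hours was detected.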
Improving survey methods in sero-epidemiological studies of injecting drug users: a case example of two cross sectional surveys in Serbia and Montenegro
BACKGROUND: Little is known about the prevalence of HIV or HCV in injecting drug users (IDUs) in Serbia and Montenegro. We measured prevalence of antibodies to HIV (anti-HIV) and hepatitis C virus (anti-HCV), and risk factors for anti-HCV, in community-recruited IDUs in Belgrade and Podgorica, and determined the performance of a parallel rapid HIV testing algorithm.
METHODS: Respondent driven sampling and audio-computer assisted survey interviewing (ACASI) methods were employed. Dried blood spots were collected for unlinked anonymous antibody testing. Belgrade IDUs were offered voluntary confidential rapid HIV testing using a parallel testing algorithm, the performance of which was compared with standard laboratory tests. Predictors of anti-HCV positivity and the diagnostic accuracy of the rapid HIV test algorithm were calculated.
RESULTS: Overall population prevalence of anti-HIV and anti-HCV in IDUs was 3% and 63% respectively in Belgrade (n = 433) and 0% and 22% in Podgorica (n = 328). Around a quarter of IDUs in each city had injected with used needles and syringes in the last four weeks. In both cities anti-HCV positivity was associated with increasing number of years injecting (e.g. Belgrade adjusted odds ratio (AOR) 5.6 (95% CI 3.2-9.7) and Podgorica AOR 2.5 (1.3-5.1) for ≥10 years vs 0-4 years), daily injecting (Belgrade AOR 1.6 (1.0-2.7), Podgorica AOR 2.1 (1.3-5.1)) and having ever shared used needles/syringes (Belgrade AOR 2.3 (1.0-5.4), Podgorica AOR 1.9 (1.4-2.6)). Nearly half (47%) of Belgrade participants accepted rapid HIV testing, and there was complete concordance between rapid test results and subsequent confirmatory laboratory tests (sensitivity 100% (95% CI 59%-100%), specificity 100% (95% CI 98%-100%)).
CONCLUSION: The combination of community recruitment, ACASI, rapid testing and a linked diagnostic accuracy study provides enhanced methods for conducting blood-borne virus sero-prevalence studies in IDUs. The relatively high uptake of rapid testing suggests that introducing this method in community settings could increase the number of people tested in high-risk populations. The high prevalence of HCV and relatively high prevalence of injecting risk behaviour indicate that further HIV transmission is likely in IDUs in both cities. Urgent scale-up of HIV prevention interventions is needed.
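The reported sensitivity of 100% with a 95% CI of 59%-100% is characteristic of an exact (Clopper-Pearson) interval when every one of a small number of positives is detected. Assuming n = 7 confirmed positives (an inferred figure used only for illustration, not stated in the abstract), the lower bound follows from a one-line formula:

```python
# Exact (Clopper-Pearson) 95% lower confidence bound for a proportion when
# all n trials succeed: (alpha/2) ** (1/n) with alpha = 0.05.
# n = 7 is an assumption chosen to match the reported 59%-100% interval.
n = 7
lower = 0.025 ** (1 / n)
print(round(lower * 100))  # → 59
```

The same formula applied to the specificity denominator (a much larger n) gives a lower bound near 98%, matching the abstract's 98%-100% interval.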
Preventing vitamin D deficiency during the COVID-19 pandemic: UK definitions of vitamin D sufficiency and recommended supplement dose are set too low.
There is growing evidence linking vitamin D deficiency with risk of COVID-19. It is therefore distressing that there is major disagreement about the optimal serum level of 25-hydroxyvitamin D (25(OH)D) and the appropriate supplement dose. The UK Scientific Advisory Committee on Nutrition has set the lowest level for defining sufficiency (10 ng/ml or 25 nmol/L) of any national advisory body or scientific society, and consequently recommends supplementation with 10 micrograms (400 IU) per day. We have searched for published evidence to support this but have not found it. There is considerable evidence to support the higher level for sufficiency (20 ng/ml or 50 nmol/L) recommended by the European Food Safety Authority and the US Institute of Medicine, and hence greater supplementation (20 micrograms or 800 IU per day). Serum 25(OH)D concentrations in the UK typically fall by around 50% through winter. We believe that governments should urgently recommend supplementation with 20-25 micrograms (800-1,000 IU) per day.
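The two unit systems quoted for 25(OH)D (ng/ml and nmol/L) are related by a factor of roughly 2.5, which is how 10 ng/ml corresponds to 25 nmol/L and 20 ng/ml to 50 nmol/L; a trivial helper makes the conversion explicit:

```python
# 25(OH)D unit conversion: 1 ng/ml ≈ 2.496 nmol/L, conventionally rounded
# to 2.5 as in the thresholds quoted in the abstract.
def ng_ml_to_nmol_l(ng_ml, factor=2.5):
    """Convert a serum 25(OH)D concentration from ng/ml to nmol/L."""
    return ng_ml * factor

print(ng_ml_to_nmol_l(10))  # → 25.0 (UK sufficiency threshold)
print(ng_ml_to_nmol_l(20))  # → 50.0 (EFSA / IOM threshold)
```

Keeping the conversion explicit matters when comparing guidelines, since bodies on either side of the Atlantic routinely report the same threshold in different units.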
The identification of informative genes from multiple datasets with increasing complexity
Background
In microarray data analysis, factors such as data quality, biological variation and the increasingly multi-layered nature of more complex biological systems complicate the modelling of regulatory networks that can represent and capture the interactions among genes. We believe that the use of multiple datasets derived from related biological systems leads to more robust models. Therefore, we developed a novel framework for modelling regulatory networks that involves training and evaluation on independent datasets. Our approach includes the following steps: (1) ordering the datasets based on their level of noise and informativeness; (2) selecting a Bayesian classifier with an appropriate level of complexity by evaluating predictive performance on independent datasets; (3) comparing the different gene selections and the influence of increasing model complexity; (4) functional analysis of the informative genes.
Results
In this paper, we identify the most appropriate model complexity using cross-validation and independent test set validation for predicting gene expression in three published datasets related to myogenesis and muscle differentiation. Furthermore, we demonstrate that models trained on simpler datasets can be used to identify interactions among genes and to select the most informative ones. We also show that these models can explain the myogenesis-related genes (genes of interest) significantly better than others (P < 0.004), since the improvement in their rankings is much more pronounced. Finally, after further evaluating our results on synthetic datasets, we show that our approach outperforms a concordance method by Lai et al. in identifying informative genes from multiple datasets with increasing complexity, whilst additionally modelling the interactions between genes.
Conclusions
We show that Bayesian networks derived from simpler controlled systems perform better than those trained on datasets from more complex biological systems. Further, we show that highly predictive and consistent genes, drawn from the pool of differentially expressed genes across independent datasets, are more likely to be fundamentally involved in the biological process under study. We conclude that networks trained on simpler controlled systems, such as in vitro experiments, can be used to model and capture interactions among genes in more complex datasets, such as in vivo experiments, where these interactions would otherwise be concealed by a multitude of other ongoing events.
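Step (2) of the framework selects model complexity by evaluating predictive performance under cross-validation; a minimal, generic sketch of k-fold index generation (not the authors' code) looks like:

```python
import numpy as np

def kfold_indices(n, k, seed=0):
    """Yield (train, test) index arrays for k-fold cross-validation."""
    rng = np.random.default_rng(seed)
    idx = rng.permutation(n)           # shuffle sample indices once
    folds = np.array_split(idx, k)     # k roughly equal folds
    for i in range(k):
        test = folds[i]
        train = np.concatenate([folds[j] for j in range(k) if j != i])
        yield train, test

# 12 samples split into 3 folds: each model is trained on 8 and tested on 4
splits = list(kfold_indices(12, 3))
print(len(splits))        # → 3
print(len(splits[0][1]))  # → 4
```

In the paper's setting, each candidate classifier complexity would be scored across such folds, and the complexity with the best held-out performance carried forward to the independent test datasets.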