The NF-κB multidimer system model: a knowledge base to explore diverse biological contexts
The nuclear factor κB (NF-κB) system is critical for various biological functions in numerous cell types, including the inflammatory response, cell proliferation, survival, differentiation, and pathogenic responses. Each cell type is characterized by a subset of 15 NF-κB dimers whose activity is regulated in a stimulus-responsive manner. Numerous studies have produced different mathematical models that account for cell type–specific NF-κB activities. However, whereas the concentrations or abundances of NF-κB subunits may differ between cell types, the biochemical interactions that constitute the NF-κB signaling system do not. Here, we synthesized a consensus mathematical model of the NF-κB multidimer system, which could account for the cell type–specific repertoires of NF-κB dimers and their cell type–specific activation and cross-talk. Our review demonstrates that these distinct cell type–specific properties of NF-κB signaling can be explained largely as emergent effects of the cell type–specific expression of NF-κB monomers. The consensus systems model represents a knowledge base that may be used to gain insights into the control and function of NF-κB in diverse physiological and pathological scenarios and that describes a path for generating similar regulatory knowledge bases for other pleiotropic signaling systems.
Cost analysis of noninvasive blood-based microRNA testing versus CT scans for follow-up in patients with testicular germ-cell tumors
BACKGROUND: Our group has developed a noninvasive blood-based microRNA (miRNA) test for improving diagnosis, disease monitoring, and relapse detection in malignant testicular germ-cell tumors (TGCTs). Performance analysis suggests the test is likely to have sensitivity and specificity in detecting TGCT comparable to those of computed tomography (CT), thus reducing the need for serial CT scans for follow-up monitoring, with associated reductions in cumulative radiation burden and second cancer risk. To facilitate clinical adoption, we undertook a cost analysis to identify the budget impact of replacing CT scans with miRNA testing within health care systems. METHODS: The TGCT aftercare pathway was mapped out using National Comprehensive Cancer Network guidelines. A Markov model was built to simulate the impact of the miRNA test on TGCT aftercare costs. Incidence, treatment probabilities, relapse rate, and death rate data were collected from published studies to populate the model. RESULTS: Applying our model to the US health care system, the miRNA test has the potential to save up to $69 million per year in aftercare expenses related to TGCT treatment, with exact savings depending on the adoption rate and test price. CONCLUSION: This analysis demonstrates the potential positive budget impact of adopting miRNA testing in place of CT scans in the clinical management of TGCTs.
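The general shape of a Markov cohort cost simulation like the one described in the METHODS can be sketched as follows. This is an illustration of the technique only, not the authors' model: the states, transition probabilities, and per-state costs below are all hypothetical placeholders, not values from the study.

```python
import numpy as np

# Hypothetical three-state Markov cohort model of TGCT aftercare.
# All numbers are illustrative placeholders, not study data.
states = ["remission", "relapse", "death"]
P = np.array([
    [0.97, 0.02, 0.01],   # annual transitions from remission
    [0.00, 0.90, 0.10],   # annual transitions from relapse
    [0.00, 0.00, 1.00],   # death is an absorbing state
])
# Annual per-patient aftercare cost in each state (placeholder USD):
cost_ct    = np.array([2000.0, 15000.0, 0.0])   # CT-based follow-up
cost_mirna = np.array([1200.0, 15000.0, 0.0])   # miRNA-based follow-up

def cohort_cost(costs, years=5, n_patients=1000):
    """Expected cumulative cohort cost over a fixed time horizon."""
    occupancy = np.array([1.0, 0.0, 0.0])  # all patients start in remission
    total = 0.0
    for _ in range(years):
        total += n_patients * occupancy @ costs   # cost accrued this year
        occupancy = occupancy @ P                 # advance one Markov cycle
    return total

saving = cohort_cost(cost_ct) - cohort_cost(cost_mirna)
```

In a real budget-impact analysis the transition probabilities would be populated from published incidence, relapse, and mortality data, and the state space expanded to match the mapped aftercare pathway.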
Investigation of humans' individual differences as predictors of their animal interaction styles, focused on the domestic cat
Humans' individual differences, including their demographics, personality, attitudes and experiences, are often associated with important outcomes for the animals they interact with. This is pertinent to companion animals such as cats and dogs, given their social and emotional importance to humans and their degree of integration into human society. However, the mechanistic underpinnings and causal relationships that characterise links between human individual differences and companion animal behaviour and wellbeing are not well understood. In this exploratory investigation, we first quantified the underlying structure of, and variation in, humans' styles of behaviour during typical human-cat interactions (HCI), focusing on aspects of handling and interaction known to be preferred by cats (i.e. 'best practice'). We then explored the potential significance of various human individual differences as predictors of these HCI styles. Seven separate HCI styles were identified via Principal Component Analysis (PCA) from averaged observations for 119 participants interacting with sociable domestic cats in a rehoming context. Using General Linear Models (GLMs) and an Information Theoretic (IT) approach, we found that these HCI styles were weakly to strongly predicted by factors including cat-ownership history, participant personality (measured via the Big Five Inventory, or BFI), age, work experience with animals and participants' subjective ratings of their cat behaviour knowledge. Paradoxically, greater cat-ownership experience and self-assessed cat knowledge were not positively associated with 'best practice' styles of HCI, but were instead generally predictive of HCI styles known to be less preferred by cats, as were greater participant age and Neuroticism.
These findings have important implications regarding the quality of human-companion animal relationships and dyadic compatibility, in addition to the role of educational interventions and their targeting for optimal efficacy. In the context of animal adoption, these results strengthen the (limited) evidence base for decision making associated with cat-adopter screening and matching. In particular, our results suggest that greater cat ownership experiences and self-reports of cat knowledge might not necessarily convey advantages for cats in the context of HCI.
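The two-step analysis described above (PCA to extract interaction-style components, then linear models relating those components to individual differences) can be sketched in minimal form. Everything below is synthetic and hypothetical: the data are randomly generated stand-ins for the study's observations, ordinary least squares stands in for the GLM/IT model-selection procedure, and the single predictor is an illustrative placeholder.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic stand-in for the study's data: 119 participants scored on
# 20 observed interaction behaviours (values randomly generated).
n_participants, n_behaviours = 119, 20
X = rng.normal(size=(n_participants, n_behaviours))

# Step 1: PCA via SVD of the centred data -- project onto the first
# seven principal components, mirroring the paper's seven HCI styles.
Xc = X - X.mean(axis=0)
U, s, Vt = np.linalg.svd(Xc, full_matrices=False)
styles = Xc @ Vt[:7].T          # component scores, shape (119, 7)

# Step 2: ordinary least squares of one style score on a single
# hypothetical individual-difference predictor (e.g. ownership years).
ownership_years = rng.integers(0, 30, size=n_participants).astype(float)
design = np.column_stack([np.ones(n_participants), ownership_years])
beta, *_ = np.linalg.lstsq(design, styles[:, 0], rcond=None)
# beta[1] is the estimated slope of the style score on the predictor
```

The actual study would fit each of the seven component scores against the full set of candidate predictors and compare models within the Information Theoretic framework rather than fitting a single regression.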
The effect of baseline cognition and delirium on long-term cognitive impairment and mortality: a prospective population-based study
BACKGROUND: There is an unmet public health need to understand better the relationship between baseline cognitive function, the occurrence and severity of delirium, and subsequent cognitive decline. Our aim was to quantify the relationship between baseline cognition and delirium and follow-up cognitive impairment. METHODS: We did a prospective longitudinal study in a stable representative community sample of adults aged 70 years or older who were registered with a general practitioner in the London Borough of Camden (London, UK). Participants were recruited by invitation letters from general practice lists or by direct recruitment of patients from memory clinics or patients recently discharged from secondary care. We quantified baseline cognitive function with the modified Telephone Interview for Cognitive Status. In patients who were admitted to hospital, we undertook daily assessments of delirium using the Memorial Delirium Assessment Scale (MDAS). We estimated the association of pre-admission baseline cognitive function with delirium prevalence, severity, and duration. We assessed subsequent cognitive function 2 years after baseline recruitment using the Telephone Interview for Cognitive Status. Regression models were adjusted for age, sex, education, illness severity, and frailty. FINDINGS: We recruited 1510 participants (median age 77 years [IQR 73–82], 57% women) between March, 2017, and October, 2018. 209 participants were admitted to hospital across 371 episodes (1999 person-days of assessment). Better baseline cognition was associated with a lower risk of delirium (odds ratio 0·63, 95% CI 0·45 to 0·89) and with less severe delirium (–1·6 MDAS points, 95% CI –2·6 to –0·7). Individuals with high baseline cognition (baseline Z score +2·0 SD) had demonstrable decline even without delirium (follow-up Z score +1·2 SD). However, those with a high delirium burden had an even larger absolute decline of 2·2 SD in Z score (follow-up Z score –0·2).
Once individuals had more than 2 days of moderate delirium, rates of death over 2 years were similar regardless of baseline cognition; better baseline cognition no longer conferred any mortality benefit. INTERPRETATION: Higher baseline cognitive function is associated with a good prognosis with regard to the likelihood and severity of delirium. However, those with high baseline cognition who developed delirium had the greatest cognitive decline, a change similar to the decline observed in individuals with a high amyloid burden in other cohorts. Older people with healthy baseline cognitive function who develop delirium stand to lose the most after delirium. This group could benefit from targeted cognitive rehabilitation interventions after delirium.
Editorial
To err is human, to correct is public health: a systematic review examining poor quality testing and misdiagnosis of HIV status.
INTRODUCTION: In accordance with global testing and treatment targets, many countries are seeking ways to reach the "90-90-90" goals, starting with diagnosing 90% of all people with HIV. Quality HIV testing services are needed to enable people with HIV to be diagnosed and linked to treatment as early as possible. It is essential that opportunities to reach people with undiagnosed HIV are not missed, diagnoses are correct and HIV-negative individuals are not inadvertently initiated on life-long treatment. We conducted this systematic review to assess the magnitude of misdiagnosis and to describe poor HIV testing practices using rapid diagnostic tests. METHODS: We systematically searched peer-reviewed articles, abstracts and grey literature published from 1 January 1990 to 19 April 2017. Studies were included if they used at least two rapid diagnostic tests and reported on HIV misdiagnosis, factors related to potential misdiagnosis or described quality issues and errors related to HIV testing. RESULTS: Sixty-four studies were included in this review. A small proportion of false positive (median 3.1%, interquartile range (IQR): 0.4-5.2%) and false negative (median: 0.4%, IQR: 0-3.9%) diagnoses were identified. Suboptimal testing strategies were the most common factor in studies reporting misdiagnoses, particularly false positive diagnoses due to using a "tiebreaker" test to resolve discrepant test results. A substantial proportion of false negative diagnoses were related to retesting among people on antiretroviral therapy. CONCLUSIONS: HIV testing errors and poor practices, particularly those resulting in false positive or false negative diagnoses, do occur but are preventable.
Efforts to accelerate HIV diagnosis and linkage to treatment should be complemented by efforts to improve the quality of HIV testing services and strengthen quality management systems, particularly the use of validated testing algorithms and strategies, retesting people diagnosed with HIV before initiating treatment and providing clear messages to people with HIV on treatment on the risk of a "false negative" test result.
Current Challenges for the Early Detection of Alzheimer's Disease: Brain Imaging and CSF Studies
The development of prevention therapies for Alzheimer's disease (AD) would greatly benefit from biomarkers that are sensitive to the subtle brain changes that occur in the preclinical stage of the disease. Reductions in the cerebral metabolic rate of glucose (CMRglc), a measure of neuronal function, have proven to be a promising tool in the early diagnosis of AD. In vivo brain 2-[18F]fluoro-2-Deoxy-D-glucose-positron emission tomography (FDG-PET) imaging demonstrates consistent and progressive CMRglc reductions in AD patients, the extent and topography of which correlate with symptom severity. There is increasing evidence that hypometabolism appears during the preclinical stages of AD and can predict decline years before the onset of symptoms. This review will give an overview of FDG-PET results in individuals at risk for developing dementia, including: presymptomatic individuals carrying mutations responsible for early-onset familial AD; patients with Mild Cognitive Impairment (MCI), often a prodrome to late-onset sporadic AD; non-demented carriers of the Apolipoprotein E (ApoE) ε4 allele, a strong genetic risk factor for late-onset AD; cognitively normal subjects with a family history of AD; subjects with subjective memory complaints; and normal elderly followed longitudinally until they expressed the clinical symptoms and received post-mortem confirmation of AD. Finally, we will discuss the potential to combine different PET tracers and CSF markers of pathology to improve the early detection of AD.
Rapid epidemic expansion of the SARS-CoV-2 Omicron variant in southern Africa
The severe acute respiratory syndrome coronavirus 2 (SARS-CoV-2) epidemic in southern Africa has been characterised by three distinct waves. The first was associated with a mix of SARS-CoV-2 lineages, whilst the second and third waves were driven by the Beta and Delta variants, respectively [1-3]. In November 2021, genomic surveillance teams in South Africa and Botswana detected a new SARS-CoV-2 variant associated with a rapid resurgence of infections in Gauteng Province, South Africa. Within three days of the first genome being uploaded, it was designated a variant of concern (Omicron) by the World Health Organization and, within three weeks, had been identified in 87 countries. The Omicron variant is exceptional for carrying over 30 mutations in the spike glycoprotein, predicted to influence antibody neutralization and spike function [4]. Here, we describe the genomic profile and early transmission dynamics of Omicron, highlighting the rapid spread in regions with high levels of population immunity.
C-Terminal Region of EBNA-2 Determines the Superior Transforming Ability of Type 1 Epstein-Barr Virus by Enhanced Gene Regulation of LMP-1 and CXCR7
Type 1 Epstein-Barr virus (EBV) strains immortalize B lymphocytes in vitro much more efficiently than type 2 EBV, a difference previously mapped to the EBNA-2 locus. Here we demonstrate that the greater transforming activity of type 1 EBV correlates with a stronger and more rapid induction of the viral oncogene LMP-1 and the cell gene CXCR7 (which are both required for proliferation of EBV-LCLs) during infection of primary B cells with recombinant viruses. Surprisingly, although the major sequence differences between type 1 and type 2 EBNA-2 lie in N-terminal parts of the protein, the superior ability of type 1 EBNA-2 to induce proliferation of EBV-infected lymphoblasts is mostly determined by the C-terminus of EBNA-2. Substitution of the C-terminus of type 1 EBNA-2 into the type 2 protein is sufficient to confer a type 1 growth phenotype and type 1 expression levels of LMP-1 and CXCR7 in an EREB2.5 cell growth assay. Within this region, the RG, CR7 and TAD domains are the minimum type 1 sequences required. Sequencing the C-terminus of EBNA-2 from additional EBV isolates showed high sequence identity within type 1 isolates or within type 2 isolates, indicating that the functional differences mapped are typical of EBV type sequences. The results indicate that the C-terminus of EBNA-2 accounts for the greater ability of type 1 EBV to promote B cell proliferation, through mechanisms that include higher induction of genes (LMP-1 and CXCR7) required for proliferation and survival of EBV-LCLs.
Erratum to: Methods for evaluating medical tests and biomarkers
[This corrects the article DOI: 10.1186/s41512-016-0001-y.]