417 research outputs found

    Development of a novel renal activity index of lupus nephritis in children & young adults

    Get PDF
    BACKGROUND: Noninvasive estimation of the degree of inflammation seen on kidney biopsy with lupus nephritis (LN) remains difficult. The objective of this study was to develop a Renal Activity Index for Lupus (RAIL) that, based solely on laboratory measures, accurately reflects histological LN activity. METHODS: We assayed traditional LN laboratory tests and 16 urine biomarkers (UBMs) in children (n = 47) at the time of kidney biopsy. Histological LN activity was measured by the NIH Activity Index (NIH-AI) and the Tubulointerstitial Activity Index (TIAI). High LN-activity status (vs. moderate/low) was defined as NIH-AI scores >10 (vs. ≤10) or TIAI scores >5 (vs. ≤5). RESULTS: The differential excretion of six UBMs (neutrophil gelatinase-associated lipocalin, monocyte chemotactic protein 1, ceruloplasmin, adiponectin, hemopexin, and kidney injury molecule 1), standardized by urine creatinine, predicted LN-activity (NIH-AI) status with >92% accuracy and LN-activity (TIAI) status with >80% accuracy. RAIL accuracy was minimally influenced by concomitant LN damage. Accuracies between 71% and 85% were achieved without standardization of the UBMs. The strength of these UBMs to reflect LN-activity status was confirmed by principal component and linear discriminant analyses. CONCLUSION: The RAIL is a robust and highly accurate noninvasive measure of LN activity. The measurement properties of the RAIL, which reflect the degree of inflammatory changes as seen on kidney biopsy, will require independent validation.

    Early-life glucocorticoids programme behaviour and metabolism in adulthood in zebrafish

    Get PDF
    Glucocorticoids (GCs) in utero influence embryonic development with consequent programmed effects on adult physiology and pathophysiology and altered susceptibility to cardiovascular disease. However, in viviparous species, studies of these processes are compromised by secondary maternal influences. The zebrafish, being fertilised externally, avoids this problem and has been used here to investigate the effects of transient alterations in GC activity during early development. Embryonic fish were treated either with dexamethasone (a synthetic GC), an antisense GC receptor (GR) morpholino (GR Mo), or hypoxia for the first 120h post fertilisation (hpf); responses were measured during embryonic treatment or later, post treatment, in adults. All treatments reduced cortisol levels in embryonic fish to similar levels. However, morpholino- and hypoxia-treated embryos showed delayed physical development (slower hatching and straightening of head–trunk angle, shorter body length), less locomotor activity, reduced tactile responses and anxiogenic activity. In contrast, dexamethasone-treated embryos showed advanced development and thigmotaxis but no change in locomotor activity or tactile responses. Gene expression changes were consistent with increased (dexamethasone) and decreased (hypoxia, GR Mo) GC activity. In adults, stressed cortisol values were increased with dexamethasone and decreased by GR Mo and hypoxia pre-treatments. Other responses were similarly differentially affected. In three separate tests of behaviour, dexamethasone-programmed fish appeared ‘bolder’ than matched controls, whereas Mo and hypoxia pre-treated fish were unaffected or more reserved. Similarly, the dexamethasone group but not the Mo or hypoxia groups were heavier, longer and had a greater girth than controls. Hyperglycaemia and expression of GC responsive gene (pepck) were also increased in the dexamethasone group. 
We conclude that GC activity controls many aspects of early-life growth and development in the zebrafish and that, as in other species, manipulating GC status pharmacologically, physiologically or genetically in early life leads to programmed metabolic and behavioural traits in adulthood.

    The potential for quality assurance systems to save costs and lives:the case of early infant diagnosis of HIV

    Get PDF
    OBJECTIVES: Scaling up of point-of-care testing (POCT) for early infant diagnosis of HIV (EID) could reduce the large gap in infant testing. However, suboptimal POCT EID could have limited impact and potentially high avoidable costs. This study models the cost-effectiveness of a quality assurance system to address testing performance and screening interruptions, due to, for example, supply stockouts, in Kenya, Senegal, South Africa, Uganda and Zimbabwe, with varying HIV epidemics and different health systems. METHODS: We modelled a quality assurance system raising EID quality from suboptimal levels, that is, from misdiagnosis rates of 5%, 10% and 20% and EID testing interruptions (in months), to uninterrupted optimal performance (98.5% sensitivity, 99.9% specificity). For each country, we estimated the 1-year impact and cost-effectiveness (US$/DALY averted) of improved scenarios in averting missed HIV infections and unneeded HIV treatment costs for false-positive diagnoses. RESULTS: The modelled 1-year costs of a national POCT quality assurance system range from US$ 69 359 in South Africa to US$ 334 341 in Zimbabwe. At the country level, quality assurance systems could potentially avert between 36 and 711 missed infections (i.e. false negatives) per year and unneeded treatment costs between US$ 5808 and US$ 739 030. CONCLUSIONS: The model estimates that adding effective quality assurance systems is cost-saving in four of the five countries within the first year.
Starting EQA (external quality assessment) requires an initial investment but will provide a positive return on investment within five years by averting the costs of misdiagnoses, and would be even more efficient if implemented across multiple applications of POCT.
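
The cost-effectiveness arithmetic behind the US$/DALY-averted figures can be sketched as below; this is a minimal illustration, and the input figures are stand-ins of roughly the right order of magnitude, not any country's modelled result.

```python
def cost_per_daly_averted(system_cost, averted_treatment_cost, dalys_averted):
    """Net cost of the QA system per DALY averted; a negative value
    means the system is cost-saving (averted costs exceed its cost)."""
    net_cost = system_cost - averted_treatment_cost
    return net_cost / dalys_averted

# Illustrative figures only (not the study's country-level estimates).
ratio = cost_per_daly_averted(system_cost=150_000,
                              averted_treatment_cost=400_000,
                              dalys_averted=1_000)
print(f"US$ {ratio:.0f} per DALY averted")  # negative => cost-saving
```

A negative ratio corresponds to the "cost-saving within the first year" finding: averted false-positive treatment costs more than offset the quality assurance system's cost.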

    Alert but less alarmed: a pooled analysis of terrorism threat perception in Australia

    Get PDF
    BACKGROUND: Previous Australian research has highlighted disparities in community perceptions of the threat posed by terrorism. A study with a large sample size is needed to examine reported concerns and anticipated responses of community sub-groups and to determine their consistency with existing Australian and international findings. METHODS: Representative samples of New South Wales (NSW) adults completed terrorism perception questions as part of computer-assisted telephone interviews (CATI) in 2007 (N = 2081) and 2010 (N = 2038). Responses were weighted against the NSW population. Data sets from the two surveys were pooled, and multivariate multilevel analyses were conducted to identify health and socio-demographic factors associated with higher perceived risk of terrorism and evacuation response intentions, and to examine changes over time. RESULTS: In comparison with 2007, Australians in 2010 were significantly more likely to believe that a terrorist attack would occur in Australia (adjusted odds ratio (AOR) = 1.24, 95% CI: 1.06-1.45) but felt less concerned that they would be directly affected by such an incident (AOR = 0.65, 95% CI: 0.55-0.75). Higher perceived risk of terrorism and related changes in living were associated with middle age, female gender, lower education and higher reported psychological distress. Australians of migrant background reported a significantly lower perceived likelihood of terrorism (AOR = 0.52, 95% CI: 0.39-0.70) but significantly higher concern that they would be personally affected by such an incident (AOR = 1.57, 95% CI: 1.21-2.04), and were more likely to have made changes in the way they live due to this threat (AOR = 2.47, 95% CI: 1.88-3.25).
Willingness to evacuate homes and public places in response to potential incidents increased significantly between 2007 and 2010 (AOR = 1.53, 95% CI: 1.33-1.76). CONCLUSION: While an increased proportion of Australians believe that the national threat of terrorism remains high, concern about being personally affected has moderated and may reflect habituation to this threat. Key sub-groups remain disproportionately concerned, notably those with lower education and migrant groups. The dissonance observed in findings relating to Australians of migrant background appears to reflect wider socio-cultural concerns associated with this issue. Disparities in community concerns regarding terrorism-related threat require active policy consideration and specific initiatives to reduce the vulnerabilities of known risk groups, particularly in the aftermath of future incidents.
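
The adjusted odds ratios reported above are exponentiated logistic-regression coefficients with Wald confidence intervals. A minimal sketch of that conversion follows; the coefficient and standard error are illustrative values back-solved to produce an estimate of the same shape, not the study's model output.

```python
import math

def odds_ratio_ci(beta, se, z=1.96):
    """Exponentiate a logistic-regression coefficient and its
    Wald 95% confidence bounds into an odds ratio with CI."""
    return (math.exp(beta),
            math.exp(beta - z * se),
            math.exp(beta + z * se))

# Illustrative beta/se only, chosen to yield an AOR of ~1.24.
or_, lo, hi = odds_ratio_ci(beta=0.215, se=0.080)
print(f"AOR = {or_:.2f}, 95% CI: {lo:.2f}-{hi:.2f}")
```

A CI that excludes 1.0 (as here) corresponds to the "significantly more likely" language used in the abstract.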

    Development of a Novel Renal Activity Index of Lupus Nephritis in Children and Young Adults

    Get PDF
    OBJECTIVE: Noninvasive estimation of the degree of inflammation seen on kidney biopsy with lupus nephritis (LN) remains difficult. The objective of this study was to develop a Renal Activity Index for Lupus (RAIL) that, based solely on laboratory measures, accurately reflects histologic LN activity. METHODS: We assayed traditional LN laboratory tests and 16 urine biomarkers (UBMs) in children (n = 47) at the time of kidney biopsy. Histologic LN activity was measured by the National Institutes of Health activity index (NIH-AI) and the tubulointerstitial activity index (TIAI). High LN-activity status (versus moderate/low) was defined as NIH-AI scores >10 (versus ≤10) or TIAI scores >5 (versus ≤5). RAIL algorithms that predicted LN-activity status for both NIH-AI and TIAI were derived by stepwise multivariate logistic regression, considering traditional biomarkers and UBMs as candidate components. The accuracy of the RAIL for discriminating by LN-activity status was determined. RESULTS: The differential excretion of 6 UBMs (neutrophil gelatinase-associated lipocalin, monocyte chemotactic protein 1, ceruloplasmin, adiponectin, hemopexin, and kidney injury molecule 1) standardized by urine creatinine was considered in the RAIL. These UBMs predicted LN-activity (NIH-AI) status with >92% accuracy and LN-activity (TIAI) status with >80% accuracy. RAIL accuracy was minimally influenced by concomitant LN damage. Accuracies between 71% and 85% were achieved without standardization of the UBMs. The strength of these UBMs to reflect LN-activity status was confirmed by principal component and linear discriminant analyses. CONCLUSION: The RAIL is a robust and highly accurate noninvasive measure of LN activity. The measurement properties of the RAIL, which reflect the degree of inflammatory changes as seen on kidney biopsy, will require independent validation
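
A derivation of the RAIL kind (stepwise multivariate logistic regression over candidate biomarkers, then accuracy assessment) can be sketched as follows. This is a minimal illustration on synthetic data: the greedy forward selection, 5-fold cross-validation, and all values are assumptions, not the study's algorithm or coefficients.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)

# Synthetic stand-ins for 6 creatinine-standardised urine biomarkers
# measured in n = 47 children (values are illustrative, not real data).
n = 47
X = rng.lognormal(mean=0.0, sigma=1.0, size=(n, 6))
# Synthetic "high LN-activity" label loosely driven by two biomarkers.
logit = 1.5 * np.log(X[:, 0]) + 1.0 * np.log(X[:, 1]) - 0.5
y = (logit + rng.normal(0, 1, n) > 0).astype(int)

def forward_select(X, y, k_max=3):
    """Greedy stepwise (forward) selection maximising CV accuracy."""
    chosen, remaining, best_score = [], list(range(X.shape[1])), 0.0
    while remaining and len(chosen) < k_max:
        scores = []
        for j in remaining:
            acc = cross_val_score(LogisticRegression(max_iter=1000),
                                  X[:, chosen + [j]], y, cv=5).mean()
            scores.append((acc, j))
        acc, j = max(scores)
        if acc <= best_score:      # stop when no candidate improves the fit
            break
        best_score, chosen = acc, chosen + [j]
        remaining.remove(j)
    return chosen, best_score

selected, acc = forward_select(np.log(X), y)
print(f"selected biomarker columns: {selected}, CV accuracy: {acc:.2f}")
```

Log-transforming the skewed biomarker concentrations before regression mirrors common practice for urinary analytes; the study's actual candidate-selection procedure may have differed in its stopping rule and validation scheme.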

    Discovery and Follow-up of ASASSN-23bd (AT 2023clx): The Lowest Redshift and Least Luminous Tidal Disruption Event To Date

    Full text link
    We report the All-Sky Automated Survey for SuperNovae discovery of the tidal disruption event (TDE) ASASSN-23bd (AT 2023clx) in NGC 3799, a LINER galaxy with no evidence of strong AGN activity over the past decade. With a redshift of z = 0.01107 and a peak UV/optical luminosity of (5.4 ± 0.4) × 10^42 erg s^-1, ASASSN-23bd is the lowest-redshift and least-luminous TDE discovered to date. Spectroscopically, ASASSN-23bd shows Hα and He I emission throughout its spectral time series, and the UV spectrum shows nitrogen lines without the strong carbon and magnesium lines typically seen for AGN. Fits to the rising ASAS-SN light curve show that ASASSN-23bd started to brighten on MJD 59988 (+1/-1), ~9 days before discovery, with a nearly linear rise in flux, peaking in the g band on MJD 60000 (+3/-3). Scaling relations and TDE light-curve modelling find a black hole mass of ~10^6 M_sun, which is on the lower end of supermassive black hole masses. ASASSN-23bd is a dim X-ray source, with an upper limit of L(0.3-10 keV) < 1.0 × 10^40 erg s^-1 from stacking all Swift observations prior to MJD 60061, but with soft (~0.1 keV) thermal emission with a luminosity of L(0.3-2 keV) ~ 4 × 10^39 erg s^-1 in XMM-Newton observations on MJD 60095. The rapid (t < 15 days) light-curve rise, low UV/optical luminosity, and a luminosity decline over 40 days of ΔL_40 ≈ -0.7 make ASASSN-23bd one of the dimmest TDEs to date and a member of the growing "Low Luminosity and Fast" class of TDEs. Comment: 17 pages, 13 figures, submitted to MNRAS
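
The ΔL_40 decline-rate statistic quoted above is the base-10 log of the luminosity 40 days after peak over the peak luminosity. A minimal sketch, where the 40-day luminosity is back-solved from the reported decline rather than taken from photometry:

```python
import math

def delta_L40(L_peak, L_40d):
    """Decline rate Delta L_40 = log10(L(t_peak + 40 d) / L_peak);
    more negative values mean a faster-fading transient."""
    return math.log10(L_40d / L_peak)

# Peak UV/optical luminosity from the abstract (5.4e42 erg/s); the
# 40-day value is constructed to reproduce the reported ~-0.7 decline.
L_peak = 5.4e42
L_40d = L_peak * 10 ** -0.7
print(f"Delta L_40 = {delta_L40(L_peak, L_40d):.1f}")
```

A ΔL_40 near -0.7 means the transient faded by a factor of ~5 in 40 days, which is what places ASASSN-23bd in the fast-fading class.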

    Self-monitoring of blood pressure in hypertension: A systematic review and individual patient data meta-analysis.

    Get PDF
    BACKGROUND: Self-monitoring of blood pressure (BP) appears to reduce BP in hypertension, but important questions remain regarding effective implementation and which groups may benefit most. This individual patient data (IPD) meta-analysis was performed to better understand the effectiveness of BP self-monitoring to lower BP and control hypertension. METHODS AND FINDINGS: Medline, Embase, and the Cochrane Library were searched for randomised trials comparing self-monitoring to no self-monitoring in hypertensive patients (June 2016). Two reviewers independently assessed articles for eligibility and the authors of eligible trials were approached requesting IPD. Of 2,846 articles in the initial search, 36 were eligible. IPD were provided from 25 trials, including 1 unpublished study. Data for the primary outcomes (change in mean clinic or ambulatory BP and proportion controlled below target at 12 months) were available from 15/19 possible studies (7,138/8,292 [86%] of randomised participants). Overall, self-monitoring was associated with reduced clinic systolic blood pressure (sBP) compared to usual care at 12 months (-3.2 mmHg [95% CI -4.9, -1.6 mmHg]). However, this effect was strongly influenced by the intensity of co-intervention, ranging from no effect with self-monitoring alone (-1.0 mmHg [-3.3, 1.2]) to a 6.1 mmHg (-9.0, -3.2) reduction when monitoring was combined with intensive support. Self-monitoring was most effective in those with fewer antihypertensive medications and higher baseline sBP up to 170 mmHg. No differences in efficacy were seen by sex or by most comorbidities. Ambulatory BP data at 12 months were available from 4 trials (1,478 patients), which assessed self-monitoring with little or no co-intervention. There was no association between self-monitoring and either lower clinic or ambulatory sBP in this group (clinic -0.2 mmHg [-2.2, 1.8]; ambulatory 1.1 mmHg [-0.3, 2.5]). Results for diastolic blood pressure (dBP) were similar.
The main limitation of this work was that significant heterogeneity remained, at least in part due to different inclusion criteria, self-monitoring regimes, and target BPs in included studies. CONCLUSIONS: Self-monitoring alone is not associated with lower BP or better control, but in conjunction with co-interventions (including systematic medication titration by doctors, pharmacists, or patients; education; or lifestyle counselling) it leads to clinically significant BP reduction that persists for at least 12 months. The implementation of self-monitoring in hypertension should be accompanied by such co-interventions.
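
An IPD meta-analysis of this kind ultimately pools per-trial treatment effects; a minimal inverse-variance (fixed-effect) sketch is below. The per-trial mean sBP differences and standard errors are synthetic, and the study's actual modelling (which had to handle the heterogeneity noted above) would have been more elaborate, e.g. random-effects or one-stage models.

```python
import math

def pool_fixed_effect(estimates, ses):
    """Inverse-variance (fixed-effect) pooled estimate and Wald 95% CI
    from per-trial mean differences and their standard errors."""
    weights = [1.0 / se ** 2 for se in ses]
    pooled = sum(w * e for w, e in zip(weights, estimates)) / sum(weights)
    se_pooled = math.sqrt(1.0 / sum(weights))
    return pooled, pooled - 1.96 * se_pooled, pooled + 1.96 * se_pooled

# Synthetic per-trial sBP differences (mmHg, self-monitoring minus
# usual care) and their SEs; not the real included trials.
diffs = [-4.1, -2.5, -3.8, -1.9]
ses = [1.2, 0.9, 1.5, 1.1]
est, lo, hi = pool_fixed_effect(diffs, ses)
print(f"pooled: {est:.1f} mmHg (95% CI {lo:.1f}, {hi:.1f})")
```

More precise trials (smaller SEs) receive larger weights, which is why the pooled estimate sits closer to the tighter trials than a simple average would.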

    A study of soft tissue sarcomas after childhood cancer in Britain

    Get PDF
    Among 16 541 3-year survivors of childhood cancer in Britain, 39 soft tissue sarcomas (STSs) occurred and 1.1 sarcomas were expected, yielding a standardised incidence ratio (SIR) of 16.1. When retinoblastomas were excluded from the cohort, the SIR for STSs was 15.9, and the cumulative risk of developing a soft tissue tumour after childhood cancer within 20 years of 3-year survival was 0.23%. In the case–control study, there was a significant excess of STSs in those patients exposed to both radiotherapy (RT) and chemotherapy, which was five times that observed among those not exposed (P=0.02). On the basis of individual radiation dosimetry, there was evidence of a strong dose–response effect with a significant increase in the risk of STS with increasing dose of RT (P<0.001). This effect remained significant in a multivariate model. The adjusted risk in patients exposed to RT doses of over 3000 cGy was over 50 times the risk in the unexposed. There was evidence of a dose–response effect with exposure to alkylating agents, the risk increasing substantially with increasing cumulative dose (P=0.05). This effect remained after adjusting for the effect of radiation exposure
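
The standardised incidence ratio used above is simply observed second cancers over the number expected from population rates; a minimal sketch with an exact Poisson (Garwood) confidence interval follows. The counts are illustrative, not the cohort's figures.

```python
from scipy.stats import chi2

def sir_with_ci(observed, expected, alpha=0.05):
    """Standardised incidence ratio (observed/expected) with an exact
    Poisson (Garwood) confidence interval on the observed count."""
    sir = observed / expected
    lo = chi2.ppf(alpha / 2, 2 * observed) / (2 * expected)
    hi = chi2.ppf(1 - alpha / 2, 2 * (observed + 1)) / (2 * expected)
    return sir, lo, hi

# Illustrative counts only: 32 observed vs 2.0 expected gives SIR = 16.
sir, lo, hi = sir_with_ci(observed=32, expected=2.0)
print(f"SIR = {sir:.1f} (95% CI {lo:.1f}, {hi:.1f})")
```

A lower confidence bound well above 1 is what licenses the conclusion of a genuine excess of soft tissue sarcomas in the cohort.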

    Long-term effects of cranial irradiation and intrathecal chemotherapy in treatment of childhood leukemia: a MEG study of power spectrum and correlated cognitive dysfunction

    Get PDF
    BACKGROUND: Prophylaxis to prevent relapses in the central nervous system after childhood acute lymphoblastic leukemia (ALL) used to consist of both intrathecal chemotherapy (CT) and cranial irradiation (CRT). CRT was mostly abolished in the eighties because of its neurotoxicity, and replaced with more intensive intrathecal CT. In this study, a group of survivors treated with CRT before 1983 and another group treated without CRT thereafter are investigated 20–25 years later, giving a much stronger perspective on long-term quality of life than previous studies. The outcomes will help to better understand these groups' current needs and will aid in anticipating late effects of prophylactic CRT that is currently applied for other diseases. This study evaluates oscillatory neuronal activity in these long-term survivors. Power spectrum deviations are hypothesized to correlate with cognitive dysfunction. METHODS: Resting-state eyes-closed magnetoencephalography (MEG) recordings were obtained from 14 ALL survivors treated with CT + CRT, 18 treated with CT alone and 35 controls. Relative spectral power was calculated in the δ, θ, α1, α2, β and γ frequency bands. The Amsterdam Neuropsychological Tasks (ANT) program was used to assess cognition in the executive functions domain. MEG data and ANT scores were correlated. RESULTS: In the CT + CRT group, relative θ power was slightly increased (p = 0.069) and α2 power was significantly decreased (p = 0.006). The CT + CRT group performed worse on various cognitive tests. A deficiency in visuomotor accuracy, especially of the right hand, could be clearly associated with the deviating regional θ and α2 powers (0.471 < r < 0.697). A significant association between decreased regional α2 power and fewer attentional fluctuations was found for CT + CRT patients as well as controls (0.078 < r < 0.666).
Patients treated with CT alone displayed a power spectrum similar to controls, except for a significantly increased level of left frontal α2 power (p = 0.030). CONCLUSIONS: The tendency towards global slowing of brain oscillatory activity, together with the fact that dementia has been reported as a late effect of CRT and the neuropsychological deficiencies currently present, suggests that the irradiated brain might be aging faster and could be at risk for early-onset dementia. The CT group showed no signs of early aging.
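
Relative spectral power per frequency band, as analysed here, is typically computed from a Welch power spectral density estimate. A minimal sketch on a synthetic trace follows; the band edges are conventional values assumed for illustration, since the abstract does not give the study's exact limits.

```python
import numpy as np
from scipy.signal import welch

# Conventional band edges in Hz (assumed; the study's limits may differ).
BANDS = {"delta": (0.5, 4), "theta": (4, 8), "alpha1": (8, 10),
         "alpha2": (10, 13), "beta": (13, 30), "gamma": (30, 45)}

def relative_band_power(signal, fs):
    """Relative power per band: band PSD mass over total 0.5-45 Hz mass."""
    freqs, psd = welch(signal, fs=fs, nperseg=2 * fs)
    total = psd[(freqs >= 0.5) & (freqs < 45)].sum()
    return {name: psd[(freqs >= lo) & (freqs < hi)].sum() / total
            for name, (lo, hi) in BANDS.items()}

# Synthetic sensor trace dominated by a 10 Hz (alpha2) rhythm plus noise.
fs = 256
t = np.arange(0, 10, 1 / fs)
rng = np.random.default_rng(1)
x = np.sin(2 * np.pi * 10 * t) + 0.1 * rng.standard_normal(t.size)
powers = relative_band_power(x, fs)
print(max(powers, key=powers.get))  # dominant band
```

Because the bands partition 0.5-45 Hz without overlap, the relative powers sum to one, so a decrease in one band (e.g. α2 in the CT + CRT group) necessarily shifts mass to others (e.g. θ).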