
    Long-range forecasting of intermittent streamflow

    Long-range forecasting of intermittent streamflow in semi-arid Australia poses several major challenges, one of which is modelling zero-inflated, skewed, non-stationary and non-linear data. To address this, a statistical model to forecast streamflow up to 12 months ahead is applied to five semi-arid catchments in South Western Queensland. The model uses logistic regression through Generalised Additive Models for Location, Scale and Shape (GAMLSS) to determine the probability of flow occurring in any of the systems. We then use the same regression framework in combination with a right-skewed distribution, the Box-Cox t distribution, to model the intensity (depth) of the non-zero streamflows. Time, seasonality and climate indices describing Pacific and Indian Ocean sea surface temperatures are tested as covariates in the GAMLSS model to make probabilistic 6- and 12-month forecasts of the occurrence and intensity of streamflow. The output reveals that, in the study region, the occurrence and variability of flow is driven by sea surface temperatures, and forecasts can therefore be made with some skill.
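    The two-part structure described above can be sketched in a few lines of code. The study fits GAMLSS models with a Box-Cox t distribution in R; the hedged Python sketch below substitutes statsmodels GLMs (logistic for occurrence, Gamma for depth, since statsmodels has no Box-Cox t), and all data and column names (nino34, dmi) are illustrative assumptions, not the study's dataset.

```python
# Two-part (hurdle) model sketch for intermittent streamflow.
# The study uses GAMLSS with a Box-Cox t distribution in R; here a
# logistic GLM + Gamma GLM stand in. All columns are hypothetical.
import numpy as np
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n = 240  # 20 years of monthly records (synthetic)
df = pd.DataFrame({
    "month": np.tile(np.arange(1, 13), n // 12),
    "nino34": rng.normal(size=n),   # Pacific SST index (assumed covariate)
    "dmi": rng.normal(size=n),      # Indian Ocean Dipole index (assumed)
})
# Synthetic flows: zero in dry months, right-skewed depths otherwise
p = 1 / (1 + np.exp(-(-0.5 + 0.8 * df["nino34"])))
occurs = rng.random(n) < p
df["flow"] = np.where(occurs, rng.gamma(2.0, 5.0, size=n), 0.0)

# Part 1: probability that any flow occurs (logistic regression)
df["wet"] = (df["flow"] > 0).astype(int)
occ = smf.glm("wet ~ nino34 + dmi + C(month)", data=df,
              family=sm.families.Binomial()).fit()

# Part 2: depth of flow, fitted on the non-zero months only
dep = smf.glm("flow ~ nino34 + dmi + C(month)", data=df[df["wet"] == 1],
              family=sm.families.Gamma(link=sm.families.links.Log())).fit()

# Forecast = P(flow) * E[depth | flow] for a new month
new = pd.DataFrame({"month": [1], "nino34": [1.2], "dmi": [-0.3]})
print(occ.predict(new) * dep.predict(new))
```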

    Detecting the impact of land cover change on observed rainfall

    Analysis of observational data to pinpoint the impact of land cover change on local rainfall is difficult because multiple environmental factors cannot be strictly controlled. In this study we use a statistical approach to identify the relationship between removal of tree cover and rainfall, with data from the best available sources for two large areas in Australia. Gridded rainfall data for 1979 to 2015 were used for the areas, while large-scale (exogenous) effects were represented by mean rainfall across a much larger area and by climatic indicators such as the Southern Oscillation Index and the Indian Ocean Dipole. Both generalised additive modelling and step trend tests were used for the analysis. For a region in south central Queensland, the reported change in tree clearing between 2002 and 2005 did not result in strong, statistically significant precipitation changes. On the other hand, results from a bushfire-affected region on the border of New South Wales and Victoria suggest significant changes in rainfall due to changes in tree cover, indicating that the method works better when an abrupt change in the data can be clearly identified. The step trend test also mainly identified a positive relationship between tree cover and rainfall at p < 0.1 in the NSW/Victoria region. High rainfall variability and possible regrowth could have affected the results in the Queensland region.
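    As a rough illustration of the step-trend idea, the sketch below first regresses local rainfall on the exogenous drivers named above and then applies a rank-sum test to the residuals either side of a known change date. This is a simplified stand-in for the study's generalised additive models and step trend tests; the synthetic data, the change index and the use of a Mann-Whitney U test are assumptions.

```python
# Step-trend sketch: regress out exogenous drivers, then rank-test the
# residuals before vs. after a known land-cover change. Illustrative only.
import numpy as np
import statsmodels.api as sm
from scipy.stats import mannwhitneyu

rng = np.random.default_rng(1)
n = 444  # monthly data, 1979-2015 (synthetic)
soi = rng.normal(size=n)        # Southern Oscillation Index (assumed)
dmi = rng.normal(size=n)        # Indian Ocean Dipole index (assumed)
regional = rng.normal(size=n)   # mean rainfall over a much larger area
rain = 50 + 8 * soi - 4 * dmi + 6 * regional + rng.normal(0, 5, size=n)

# Regress local rainfall on the exogenous drivers
X = sm.add_constant(np.column_stack([soi, dmi, regional]))
resid = sm.OLS(rain, X).fit().resid

# Split residuals at the (known) tree-clearing date and rank-test the step
change = 300  # index of the land-cover change (hypothetical)
stat, pval = mannwhitneyu(resid[:change], resid[change:])
print(f"step-trend p-value: {pval:.3f}")  # p < 0.1 would echo the NSW/Vic result
```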

    Presence of tobramycin in blood and urine during selective decontamination of the digestive tract in critically ill patients, a prospective cohort study

    Tobramycin is one of the components used for selective decontamination of the digestive tract (SDD), applied to prevent colonization and subsequent infections in critically ill patients. Tobramycin is administered in the oropharynx and gastrointestinal tract and is normally not absorbed. However, critical illness may cause gut barrier failure. The aim of the study was to assess the prevalence and amount of tobramycin leakage from the gut into the blood, to quantify tobramycin excretion in urine, and to determine the association of tobramycin leakage with markers of circulation, kidney function and other organ failure. This was a prospective observational cohort study. The setting was the 20-bed closed-format mixed ICU of a teaching hospital. The study population was critically ill patients with an expected stay of more than two days, receiving SDD with tobramycin, polymyxin E and amphotericin B four times daily in the oropharynx and stomach. Tobramycin concentration was measured in serum (sensitive high-performance liquid chromatography-tandem mass spectrometry (HPLC-MS/MS) assay) and in 24-hour urine (conventional immunoassay), in 34 patients 24 hours after ICU admission, and in 71 patients once daily for 7 days. Tobramycin leakage was defined as tobramycin detected in serum at least once (> 0.05 mg/L). Ototoxicity was not monitored. Of the 100 patients with available blood samples, 83 had tobramycin leakage. The median highest serum concentration per patient was 0.12 mg/L; 99% of the patients had at least one positive urinary sample (> 0.5 mg/L), and 49% had a urinary concentration ≥ 1 mg/L. The highest tobramycin serum concentration was significantly associated with vasopressor support, renal and hepatic dysfunction, and C-reactive protein. In binary logistic regression analysis, high dopamine dose and low urinary output on Day 1 were the significant predictors of tobramycin leakage. Nephrotoxicity could not be shown. The majority of acutely critically ill patients treated with enteral tobramycin as a component of SDD had traces of tobramycin in the blood, especially those with severe shock, inflammation and subsequent acute kidney injury, suggesting loss of gut barrier integrity and decreased renal removal. Unexpectedly, urinary tobramycin was above the therapeutic trough level in half of the patients. Nephrotoxicity could not be demonstrated.
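    The binary logistic regression reported in the results can be sketched as follows; the data frame, its columns (dopamine_dose, urine_output) and the simulated effect sizes are hypothetical stand-ins, not the study's data.

```python
# Binary logistic regression of tobramycin leakage (detected in serum at
# least once, > 0.05 mg/L) on Day-1 predictors. Columns are hypothetical.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(2)
n = 100
df = pd.DataFrame({
    "dopamine_dose": rng.gamma(2.0, 3.0, size=n),  # µg/kg/min on Day 1
    "urine_output": rng.normal(1.5, 0.5, size=n),  # litres on Day 1
})
# Simulate leakage so that high dopamine and low urine output predict it
logit_p = -1.0 + 0.4 * df["dopamine_dose"] - 0.8 * df["urine_output"]
df["leakage"] = (rng.random(n) < 1 / (1 + np.exp(-logit_p))).astype(int)

model = smf.logit("leakage ~ dopamine_dose + urine_output", data=df).fit()
print(model.summary())  # positive dopamine and negative urine-output terms
```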

    Experiences of patients and health care professionals on the quality of telephone follow-up care during the COVID-19 pandemic: a large qualitative study in a multidisciplinary academic setting

    OBJECTIVE: To evaluate the perceived quality of follow-up telephone consultations (TCs) from the perspective of patients and healthcare professionals (HCPs) across multiple medical disciplines during the COVID-19 pandemic. DESIGN: A qualitative study using semi-structured interviews and reflexive thematic analysis. SETTING: Seven medical disciplines (general dermatology, dermato-oncology, head and neck oncology, internal medicine, medical oncology, gynaecological oncology and surgical oncology) at a large university hospital in the Netherlands. PARTICIPANTS: Patients who received, and HCPs who provided, TCs as a substitute for outpatient follow-up appointments during the COVID-19 pandemic. RESULTS: Eighty-two patients and 58 HCPs were interviewed. Patients and HCPs were predominantly satisfied with the quality of care delivered by TCs. They regarded TCs as efficient, accessible and of acceptable quality, provided there was an established patient-HCP relationship, medical complaints were absent and physical examination was not indicated. However, most patients were worried about the accuracy of their health assessment in the absence of physical examination and non-verbal communication. Both patients and HCPs wish to use TCs in the future, alternating with face-to-face consultations. CONCLUSION: This study concludes that TCs are a valuable contribution to follow-up care and could partially replace face-to-face consultations. TCs can be used for stable, chronic patients with whom a doctor-patient relationship has already been established. Face-to-face consultations are considered more appropriate for new patients, for challenging or emotionally charged consultations, and when clinically relevant physical examination is indicated. Because the experiences of patients and HCPs are context-dependent, TCs should be offered through an individually customised approach based on patient and disease specifics, in which shared decision-making plays an extensive role. Before major implementation is considered, sufficient data on safety, particularly regarding missed diagnoses or cancer recurrences, should first be assembled.

    Neutropenia induced in outbred mice by a simplified low-dose cyclophosphamide regimen: characterization and applicability to diverse experimental models of infectious diseases

    BACKGROUND: Because of its low cost and ease of handling, the mouse remains the preferred experimental animal for preclinical tests. To avoid interference from the animal's immune system, in vivo antibiotic pharmacodynamic studies often employ cyclophosphamide (CPM) to induce neutropenia. Although high doses (350–450 mg/kg) are still used and their effects on mouse leukocytes have been described, a lower dose (250 mg/kg) is widely preferred today; however, the characteristics and applicability of this approach in outbred mice have not been determined. METHODS: Fifteen female ICR mice were injected intraperitoneally with 150 and 100 mg/kg of CPM on days 1 and 4, respectively. Blood samples (~160 μL) were drawn from the retro-orbital sinus of each mouse on days 1, 4, 5, 6, 7 and 11. Leukocytes were counted manually, and granulocyte numbers were determined by microscopic examination of Wright-stained smears. The impact of neutropenia induced by this method was then determined with a variety of pathogens in three different murine models of human infections: pneumonia (Klebsiella pneumoniae, Streptococcus pneumoniae, Staphylococcus aureus), meningoencephalitis (S. pneumoniae), and the thigh model (S. aureus, Escherichia coli, Bacteroides fragilis). RESULTS: The basal leukocyte count was within the normal range for outbred mice. On day 4, there was an 84% reduction in total white blood cells, and by day 5 the leukopenia reached its nadir (370 ± 84 cells/mm³). Profound neutropenia (≤10 neutrophils/mm³) was demonstrated at day 4 and persisted through days 5 and 6. Lymphocytes and monocytes declined by 92% and 96%, respectively, between days 1 and 5. Leukocytes recovered completely by day 11. Mice immunosuppressed under this protocol displayed clinical and microbiological patterns of progressive and lethal infectious disease after inoculation in different organs with diverse human pathogens. CONCLUSION: A CPM total dose of 250 mg/kg is sufficient to induce profound and sustained neutropenia (<10 neutrophils/mm³) for at least 3 days in outbred mice, is simpler than previously described methods, and allows successful induction of infection in a variety of experimental models.
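    The leukocyte kinetics above are straightforward percentage declines from baseline; a small sketch of that arithmetic is below, where the baseline count is an assumed typical value and only the day-5 nadir and the quoted percentages come from the abstract.

```python
# Percentage-decline arithmetic behind the reported leukocyte kinetics.
# The baseline count is an assumed typical value for outbred mice; the
# day-5 nadir (370 cells/mm^3) and the 84% figure come from the abstract.
def decline(baseline: float, value: float) -> float:
    """Percent reduction from baseline."""
    return 100.0 * (baseline - value) / baseline

baseline_wbc = 8_000                    # assumed baseline WBC, cells/mm^3
day4_wbc = baseline_wbc * (1 - 0.84)    # 84% reduction reported on day 4
nadir_wbc = 370                         # reported day-5 nadir

print(f"day 4: {day4_wbc:.0f} cells/mm^3 "
      f"({decline(baseline_wbc, day4_wbc):.0f}% down)")
print(f"day 5 nadir: {decline(baseline_wbc, nadir_wbc):.1f}% down")
```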

    Molecular characteristics of carbapenemase-producing Enterobacterales in the Netherlands; results of the 2014–2018 national laboratory surveillance

    Objectives: Carbapenem resistance mediated by mobile genetic elements has emerged worldwide and has become a major public health threat. To gain insight into the molecular epidemiology of carbapenem resistance in the Netherlands, Dutch medical microbiology laboratories are requested to submit suspected carbapenemase-producing Enterobacterales (CPE) to the National Institute for Public Health and the Environment as part of a national surveillance system. Methods: Meropenem MICs and species identification were confirmed by E-test and MALDI-TOF, and carbapenemase production was assessed by the Carbapenem Inactivation Method. Of all submitted CPE, one species/carbapenemase gene combination per person per year was subjected to next-generation sequencing (NGS). Results: In total, 1,838 unique isolates were received between 2014 and 2018, of which 892 were unique CPE isolates with NGS data available. The predominant CPE species were Klebsiella pneumoniae (n = 388, 43%), Escherichia coli (n = 264, 30%) and Enterobacter cloacae complex (n = 116, 13%). Different alleles of the same carbapenemase gene resulted in different susceptibilities to meropenem, and this effect varied between species. Analyses of NGS data showed variation in the prevalence of carbapenemase alleles over time, with blaOXA-48 being predominant (38%, 336/892), followed by blaNDM-1 (16%, 145/892). blaOXA-181, blaOXA-232 and blaVIM-4 were detected in the Netherlands for the first time. The genetic background of the K. pneumoniae and E. coli isolates was highly diverse. Conclusions: The CPE population in the Netherlands is diverse, suggesting multiple introductions. The predominant carbapenemase alleles are blaOXA-48 and blaNDM-1. There was a clear association between species, carbapenemase allele and susceptibility to meropenem.
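    The deduplication and prevalence figures above amount to a simple tabulation: keep one species/carbapenemase gene combination per person per year, then count allele frequencies. The sketch below illustrates this with a toy table; all column names and rows are hypothetical.

```python
# Deduplicate to one species/carbapenemase-gene combination per person per
# year, then tabulate allele prevalence. Columns and rows are hypothetical.
import pandas as pd

isolates = pd.DataFrame({
    "person_id": [1, 1, 2, 3, 3],
    "year": [2014, 2014, 2015, 2016, 2016],
    "species": ["K. pneumoniae", "K. pneumoniae", "E. coli",
                "E. cloacae", "E. cloacae"],
    "gene": ["blaOXA-48", "blaOXA-48", "blaNDM-1", "blaOXA-48", "blaVIM-4"],
})

# Keep one species/gene combination per person per year
unique = isolates.drop_duplicates(
    subset=["person_id", "year", "species", "gene"])

# Fraction of unique isolates carrying each allele
prevalence = unique["gene"].value_counts(normalize=True)
print(prevalence)  # e.g. blaOXA-48 was 336/892 (38%) in the real data
```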

    National laboratory-based surveillance system for antimicrobial resistance: a successful tool to support the control of antimicrobial resistance in the Netherlands

    An important cornerstone in the control of antimicrobial resistance (AMR) is a well-designed quantitative system for the surveillance of spread and temporal trends in AMR. Since 2008, the Dutch national AMR surveillance system, based on routine data from medical microbiological laboratories (MMLs), has developed into a successful tool to support the control of AMR in the Netherlands. It provides background information for policy making in public health and healthcare services, supports the development of empirical antibiotic therapy guidelines and facilitates in-depth research. In addition, participation of the MMLs in the national AMR surveillance network has contributed to knowledge sharing and quality improvement. A future improvement will be the implementation of a new semantic standard together with standardised data transfer, which will reduce errors in data handling and enable more real-time surveillance. Furthermore, the …

    Development and Validation of a Risk Score for Chronic Kidney Disease in HIV Infection Using Prospective Cohort Data from the D:A:D Study

    Ristola M. is a member of the following working groups: DAD Study Grp; Royal Free Hosp Clin Cohort; INSIGHT Study Grp; SMART Study Grp; ESPRIT Study Grp. Background: Chronic kidney disease (CKD) is a major health issue for HIV-positive individuals, associated with increased morbidity and mortality. Development and implementation of a risk score model for CKD would allow comparison of the risks and benefits of adding potentially nephrotoxic antiretrovirals to a treatment regimen and would identify those at greatest risk of CKD. The aims of this study were to develop a simple, externally validated, and widely applicable long-term risk score model for CKD in HIV-positive individuals that can guide decision making in clinical practice. Methods and Findings: A total of 17,954 HIV-positive individuals from the Data Collection on Adverse Events of Anti-HIV Drugs (D:A:D) study with >= 3 estimated glomerular filtration rate (eGFR) values after 1 January 2004 were included. Baseline was defined as the first eGFR > 60 ml/min/1.73 m2 after 1 January 2004; individuals with exposure to tenofovir, atazanavir, atazanavir/ritonavir, lopinavir/ritonavir, or other boosted protease inhibitors before baseline were excluded. CKD was defined as confirmed (>3 mo apart) eGFR <= 60 ml/min/1.73 m2. In the D:A:D study, 641 individuals developed CKD during 103,185 person-years of follow-up (PYFU; incidence 6.2/1,000 PYFU, 95% CI 5.7-6.7; median follow-up 6.1 y, range 0.3-9.1 y). Older age, intravenous drug use, hepatitis C coinfection, lower baseline eGFR, female gender, lower CD4 count nadir, hypertension, diabetes, and cardiovascular disease (CVD) predicted CKD. The adjusted incidence rate ratios of these nine categorical variables were scaled and summed to create the risk score. The median risk score at baseline was -2 (interquartile range -4 to 2). There was a 1:393 chance of developing CKD in the next 5 y in the low risk group, rising in the medium and high risk groups (risk score >= 5, 505 events). Number needed to harm (NNTH) at 5 y when starting unboosted atazanavir or lopinavir/ritonavir among those with a low risk score was 1,702 (95% CI 1,166-3,367); NNTH was 202 (95% CI 159-278) and 21 (95% CI 19-23), respectively, for those with a medium and high risk score. NNTH was 739 (95% CI 506-1,462), 88 (95% CI 69-121), and 9 (95% CI 8-10) for those with a low, medium, and high risk score, respectively, starting tenofovir, atazanavir/ritonavir, or another boosted protease inhibitor. The Royal Free Hospital Clinic Cohort included 2,548 individuals, 94 of whom (3.7%) developed CKD during 18,376 PYFU (median follow-up 7.4 y, range 0.3-12.7 y). Of 2,013 individuals included from the SMART/ESPRIT control arms, 32 (1.6%) developed CKD during 8,452 PYFU (median follow-up 4.1 y, range 0.6-8.1 y). External validation showed that the risk score predicted well in these cohorts. Limitations of this study included limited data on race and no information on proteinuria. Conclusions: Both traditional and HIV-related risk factors were predictive of CKD. These factors were used to develop an externally validated risk score for CKD in HIV infection that has direct clinical relevance for patients and clinicians to weigh the benefits of certain antiretrovirals against the risk of CKD and to identify those at greatest risk of CKD.
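    Two pieces of arithmetic in this abstract lend themselves to a short sketch: the points-based risk score (scaled adjusted incidence rate ratios summed per patient) and the number needed to harm, which is the reciprocal of the absolute 5-year risk increase. The point values and risks below are placeholders, not the published D:A:D scores.

```python
# Points-based risk score + number needed to harm (NNTH) sketch.
# Point values and risks are placeholders, not the published D:A:D scores.
def risk_score(points: dict[str, int], patient: set[str]) -> int:
    """Sum the points for every risk factor the patient has."""
    return sum(p for factor, p in points.items() if factor in patient)

def nnth(risk_exposed: float, risk_unexposed: float) -> float:
    """NNTH = 1 / absolute risk increase over the follow-up window."""
    return 1.0 / (risk_exposed - risk_unexposed)

POINTS = {  # hypothetical scaled incidence-rate-ratio points
    "age>50": 4, "ivdu": 2, "hcv": 1, "low_egfr": 4, "female": 1,
    "low_cd4_nadir": 1, "hypertension": 2, "diabetes": 2, "cvd": 2,
}
patient = {"age>50", "hypertension"}
print("score:", risk_score(POINTS, patient))

# e.g. if a nephrotoxic drug raised 5-year CKD risk from 0.25% to 0.31%:
print("NNTH:", round(nnth(0.0031, 0.0025)))  # ~1,667, a low-risk-scale value
```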

    Ecological roles of zoosporic parasites in blue carbon ecosystems

    Pathosystems describe the relationships between parasites, hosts and the environment. Generally, these systems remain in a dynamic equilibrium over time. In this review we examine some of the evidence for the potential impacts of changes in this dynamic equilibrium in blue carbon ecosystems, and their relationship to the amount of stored carbon. Blue carbon ecosystems are marine and estuarine ecosystems along the coasts. Virulent pathogens can be introduced into ecosystems along with non-native hosts. Alteration of environmental conditions, such as temperature, pH and salinity, may allow parasites to dominate these pathosystems, resulting in significant decreases in the productivity and population sizes of producer hosts and in changes in the overall species composition and function of these ecosystems. Such changes in blue carbon ecosystems may result in the accelerated release of carbon dioxide back into the ocean and atmosphere, which could in turn drive further changes in the global climate. The resilience of these ecosystems is not known; however, recent evidence suggests that significant proportions of blue carbon ecosystems have already disappeared. © 2013 Elsevier Ltd and The British Mycological Society