
    Multi-walled carbon nanotube physicochemical properties predict pulmonary inflammation and genotoxicity

    Lung deposition of multi-walled carbon nanotubes (MWCNT) induces pulmonary toxicity. Commercial MWCNT vary greatly in physicochemical properties and consequently in biological effects. To identify determinants of MWCNT-induced toxicity, we analyzed the effects of pulmonary exposure to 10 commercial MWCNT (supplied in three groups of different dimensions, with one pristine and two or three surface-modified in each group). We characterized morphology, chemical composition, surface area and functionalization levels. MWCNT were deposited in the lungs of female C57BL/6J mice by intratracheal instillation of 0, 6, 18 or 54 μg/mouse. Pulmonary inflammation (neutrophil influx in bronchoalveolar lavage (BAL)) and genotoxicity were determined on day 1, 28 or 92. Histopathology of the lungs was performed on days 28 and 92. All MWCNT induced similar histological changes. Lymphocytic aggregates were detected for all MWCNT on days 28 and 92. Using adjusted, multiple regression analyses, inflammation and genotoxicity were related to dose, time and physicochemical properties. The specific surface area (BET) was identified as a positive predictor of pulmonary inflammation on all post-exposure days. In addition, length significantly predicted pulmonary inflammation, whereas surface oxidation (–OH and –COOH) was a predictor of lowered inflammation on day 28. BET surface area, and therefore diameter, significantly predicted genotoxicity in BAL fluid cells and lung tissue, such that lower BET surface area, or correspondingly larger diameter, was associated with increased genotoxicity. This study provides information on possible toxicity-driving physicochemical properties of MWCNT. The results may contribute to safe-by-design manufacturing of MWCNT, thereby minimizing adverse effects.
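    The adjusted multiple-regression step described above can be sketched as follows. This is a minimal, hypothetical illustration (the predictor names, value ranges and coefficients are invented, not the study's data) of how ordinary least squares relates an inflammation measure to dose and physicochemical properties:

```python
import numpy as np

# Hypothetical sketch of the adjusted multiple-regression idea: model
# neutrophil influx as a linear function of dose, BET surface area and
# length. All values and coefficients here are illustrative.
rng = np.random.default_rng(0)
n = 40
dose = rng.choice([6.0, 18.0, 54.0], size=n)   # instilled dose, µg/mouse
bet = rng.uniform(50, 300, size=n)             # BET surface area, m²/g (assumed range)
length = rng.uniform(0.5, 5.0, size=n)         # nanotube length, µm (assumed range)

# Simulated response with known coefficients (noise-free for clarity)
X = np.column_stack([np.ones(n), dose, bet, length])
true_beta = np.array([2.0, 0.5, 0.03, 1.2])
y = X @ true_beta

# Ordinary least squares recovers the coefficients
beta, *_ = np.linalg.lstsq(X, y, rcond=None)
print(np.round(beta, 3))
```

In the noise-free case the fitted coefficients match the true ones; with real data, the sign and significance of each coefficient is what identifies a property (e.g. BET surface area) as a positive or negative predictor.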

    Procalcitonin to Guide Initiation and Duration of Antibiotic Treatment in Acute Respiratory Infections: An Individual Patient Data Meta-Analysis

    This individual patient data meta-analysis of clinical trials investigating procalcitonin algorithms for antibiotic decision making found no increased risk of death or setting-specific treatment failure, but did find significantly lower antibiotic exposure across different acute respiratory infections and clinical settings.

    Association of kidney function with effectiveness of procalcitonin-guided antibiotic treatment: A patient-level meta-analysis from randomized controlled trials

    Patients with impaired kidney function have a significantly slower decrease of procalcitonin (PCT) levels during infection. Our aim was to study PCT-guided antibiotic stewardship and clinical outcomes in patients with impaired kidney function, as assessed by creatinine levels measured upon hospital admission. We pooled and analyzed individual data from 15 randomized controlled trials in which patients were randomly assigned to receive antibiotic therapy based on a PCT algorithm or based on standard of care. We stratified patients on the initial glomerular filtration rate (GFR, ml/min/1.73 m2) into three groups (GFR >90 [chronic kidney disease; CKD 1], GFR 15-89 [CKD 2-4] and GFR <15 [CKD 5]). This individual patient data meta-analysis confirms that the use of PCT in patients with impaired kidney function, as assessed by admission creatinine levels, is associated with shorter antibiotic courses and lower mortality rates.
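    The GFR stratification described above can be made concrete with a short helper. The function name and the boundary handling at exactly 90 ml/min/1.73 m2 are assumptions, and the third group (GFR <15, CKD 5) is inferred as the complement of the two groups the abstract names explicitly:

```python
def ckd_group(gfr):
    """Stratify by admission eGFR (ml/min/1.73 m2) into the three groups
    used in the meta-analysis. Illustrative only; boundary at 90 is assumed."""
    if gfr > 90:
        return "CKD 1"
    elif gfr >= 15:
        return "CKD 2-4"
    else:
        return "CKD 5"

print(ckd_group(95), ckd_group(50), ckd_group(10))  # CKD 1 CKD 2-4 CKD 5
```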

    Acute mucosal pathogenesis of feline immunodeficiency virus is independent of viral dose in vaginally infected cats

    Background: The mucosal pathogenesis of HIV has been shown to be an important feature of infection and disease progression. HIV-1 infection causes depletion of intestinal lamina propria CD4+ T cells (LPL); therefore, intestinal CD4+ T cell preservation may be a useful correlate of protection in evaluating vaccine candidates. Vaccine studies employing the cat/FIV and macaque/SIV models frequently use high doses of parenterally administered challenge virus to ensure high plasma viremia in control animals. However, it is unclear if loss of mucosal T cells would occur regardless of initial viral inoculum dose. The objective of this study was to determine the acute effect of viral dose on mucosal leukocytes and associated innate and adaptive immune responses. Results: Cats were vaginally inoculated with a high, middle or low dose of cell-associated and cell-free FIV. PBMC, serum and plasma were assessed every two weeks, with tissues assessed eight weeks following infection. We found that irrespective of mucosally administered viral dose, FIV infection was induced in all cats. However, viremia was present in only half of the cats, and viral dose was unrelated to the development of viremia. Importantly, regardless of viral dose, all cats experienced significant losses of intestinal CD4+ LPL and CD8+ intraepithelial lymphocytes (IEL). Innate immune responses by CD56+CD3- NK cells correlated with aviremia and apparent occult infection but did not protect mucosal T cells. CD4+ and CD8+ T cells in viremic cats were more likely to produce cytokines in response to Gag stimulation, whereas T cells in aviremic cats tended to produce cytokines in response to Env stimulation. However, while cell-mediated immune responses in aviremic cats may have helped reduce viral replication, they could not be correlated to the levels of viremia. Robust production of anti-FIV antibodies was positively correlated with the magnitude of viremia. Conclusions: Our results indicate that mucosal immune pathogenesis could be used as a rapid indicator of vaccine success or failure when combined with a physiologically relevant low-dose mucosal challenge. We also show that innate immune responses may play an important role in controlling viral replication following acute mucosal infection, which has not been previously identified.

    Economic evaluation of procalcitonin-guided antibiotic therapy in acute respiratory infections: a US health system perspective

    Background: Whether antibiotic stewardship protocols based on procalcitonin levels result in cost savings remains unclear. Herein, our objective was to assess the economic impact of adopting procalcitonin testing among patients with suspected acute respiratory tract infection (ARI) from the perspective of a typical US integrated delivery network (IDN) with a 1,000,000-member catchment area or enrollment. Methods: To conduct an economic evaluation of procalcitonin testing versus usual care, we built a cost-impact model based on patient-level meta-analysis data of randomized trials. The meta-analytic data were adapted to the US setting by applying the meta-analytic results to US lengths of stay, costs, and practice patterns. We estimated the annual ARI visit rate for the one million member cohort, by setting (inpatient, ICU, outpatient) and ARI diagnosis. Results: In the inpatient setting, the cost of procalcitonin-guided care for the one million member cohort was $2,083,545, compared to $2,780,322 for usual care, resulting in net savings of nearly $700,000 to the IDN for 2014. In the ICU and outpatient settings, savings were $73,326 and $5,329,824, respectively, summing up to overall net savings of $6,099,927 for the cohort. Results were robust for all ARI diagnoses. For the whole US insured population, procalcitonin-guided care would result in $1.6 billion in savings annually. Conclusions: Our results show substantial savings associated with procalcitonin protocols for ARI across common US treatment settings, mainly by direct reduction in unnecessary antibiotic utilization. These results are robust to changes in key parameters, and the savings can be achieved without any negative impact on treatment outcomes.
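    The headline savings can be re-derived directly from the dollar figures reported in the abstract; this small check simply reproduces the arithmetic (all amounts come from the text above):

```python
# Re-deriving the reported net savings from the abstract's figures.
usual_care_inpatient = 2_780_322   # usual care, inpatient setting ($)
pct_inpatient = 2_083_545          # procalcitonin-guided care, inpatient ($)
inpatient_savings = usual_care_inpatient - pct_inpatient  # "nearly $700,000"

icu_savings = 73_326
outpatient_savings = 5_329_824
total_savings = inpatient_savings + icu_savings + outpatient_savings

print(inpatient_savings, total_savings)  # 696777 6099927
```

The sum matches the reported overall net savings of $6,099,927, confirming the three setting-level figures are internally consistent.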

    Large expert-curated database for benchmarking document similarity detection in biomedical literature search

    Document recommendation systems for locating relevant literature have mostly relied on methods developed a decade ago. This is largely due to the lack of a large offline gold-standard benchmark of relevant documents that cover a variety of research fields such that newly developed literature search techniques can be compared, improved and translated into practice. To overcome this bottleneck, we have established the RElevant LIterature SearcH consortium consisting of more than 1500 scientists from 84 countries, who have collectively annotated the relevance of over 180 000 PubMed-listed articles with regard to their respective seed (input) article/s. The majority of annotations were contributed by highly experienced, original authors of the seed articles. The collected data cover 76% of all unique PubMed Medical Subject Headings descriptors. No systematic biases were observed across different experience levels, research fields or time spent on annotations. More importantly, annotations of the same document pairs contributed by different scientists were highly concordant. We further show that the three representative baseline methods used to generate recommended articles for evaluation (Okapi Best Matching 25, Term Frequency-Inverse Document Frequency and PubMed Related Articles) had similar overall performances. Additionally, we found that these methods each tend to produce distinct collections of recommended articles, suggesting that a hybrid method may be required to completely capture all relevant articles. The established database server located at https://relishdb.ict.griffith.edu.au is freely available for the downloading of annotation data and the blind testing of new methods. We expect that this benchmark will be useful for stimulating the development of new powerful techniques for title and title/abstract-based search engines for relevant articles in biomedical research.
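    One of the baseline families named above, Term Frequency-Inverse Document Frequency with cosine ranking, can be sketched in a few lines. The toy corpus and whitespace tokenization below are invented and far simpler than what the benchmark actually evaluates:

```python
import math
from collections import Counter

# Toy corpus: a seed abstract and two candidates (illustrative strings only)
docs = {
    "seed": "ebola virus outbreak genome sequencing",
    "a": "ebola virus genome evolution sequencing analysis",
    "b": "procalcitonin antibiotic stewardship trial",
}

def tfidf(corpus):
    """Weight each term by raw count times log(N / document frequency)."""
    n = len(corpus)
    df = Counter(t for text in corpus.values() for t in set(text.split()))
    vecs = {}
    for name, text in corpus.items():
        tf = Counter(text.split())
        vecs[name] = {t: c * math.log(n / df[t]) for t, c in tf.items()}
    return vecs

def cosine(u, v):
    dot = sum(u[t] * v[t] for t in u if t in v)
    nu = math.sqrt(sum(x * x for x in u.values()))
    nv = math.sqrt(sum(x * x for x in v.values()))
    return dot / (nu * nv) if nu and nv else 0.0

vecs = tfidf(docs)
# The topically related candidate ranks above the unrelated one
print(cosine(vecs["seed"], vecs["a"]) > cosine(vecs["seed"], vecs["b"]))  # True
```

Ranking candidates by this similarity against a seed article is the essence of the TF-IDF baseline; BM25 replaces the term weighting with a saturating, length-normalized variant.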

    Temporal and spatial analysis of the 2014-2015 Ebola virus outbreak in West Africa

    West Africa is currently witnessing the most extensive Ebola virus (EBOV) outbreak so far recorded. Until now, there have been 27,013 reported cases and 11,134 deaths. The origin of the virus is thought to have been a zoonotic transmission from a bat to a two-year-old boy in December 2013 (ref. 2). From this index case the virus was spread by human-to-human contact throughout Guinea, Sierra Leone and Liberia. However, the origin of the particular virus in each country and the time of transmission are not known and currently rely on epidemiological analysis, which may be unreliable owing to the difficulties of obtaining patient information. Here we trace the genetic evolution of EBOV in the current outbreak, which has resulted in multiple lineages. Deep sequencing of 179 patient samples processed by the European Mobile Laboratory, the first diagnostics unit to be deployed to the epicentre of the outbreak in Guinea, reveals an epidemiological and evolutionary history of the epidemic from March 2014 to January 2015. Analysis of EBOV genome evolution has also benefited from a similar sequencing effort of patient samples from Sierra Leone. Our results confirm that the EBOV from Guinea moved into Sierra Leone, most likely in April or early May. The viruses of the Guinea/Sierra Leone lineage mixed around June/July 2014. Viral sequences covering August, September and October 2014 indicate that this lineage evolved independently within Guinea. These data can be used in conjunction with epidemiological information to test retrospectively the effectiveness of control measures, and provide an unprecedented window into the evolution of an ongoing viral haemorrhagic fever outbreak.