Rationale and design of an independent randomised controlled trial evaluating the effectiveness of aripiprazole or haloperidol in combination with clozapine for treatment-resistant schizophrenia
Background: One third to two thirds of people with schizophrenia have persistent psychotic symptoms despite clozapine treatment. Under real-world circumstances, the need to provide effective therapeutic interventions to patients who do not have an optimal response to clozapine has been cited as the most common reason for simultaneously prescribing a second antipsychotic drug in combination treatment strategies. In a clinical area where the pressing need to provide therapeutic answers has progressively increased the use of antipsychotic polypharmacy, despite the lack of robust evidence of its efficacy, we sought to implement a pre-planned protocol in which two alternative therapeutic options are systematically provided and evaluated within a pragmatic, multicentre, independent randomised study. Methods/Design: The principal clinical question to be answered by the present project is the relative efficacy and tolerability of combination treatment with clozapine plus aripiprazole compared with clozapine plus haloperidol in patients with an incomplete response to treatment with clozapine over an appropriate period of time. The project is a prospective, multicentre, randomised, parallel-group, superiority trial that follows patients over a period of 12 months. Withdrawal from allocated treatment within 3 months is the primary outcome. Discussion: The implementation of the protocol presented here shows that it is possible to create a network of community psychiatric services willing to use their everyday clinical practice to produce randomised evidence. This pragmatic approach allowed us to randomly allocate more than 100 individuals, making this the largest antipsychotic combination trial conducted so far in Western countries. We expect that the current project, by generating evidence on whether it is clinically useful to combine clozapine with aripiprazole rather than with haloperidol, will provide physicians with a solid evidence base that can be applied directly in the routine care of patients with schizophrenia. Trial Registration: ClinicalTrials.gov identifier NCT00395915.
Urinary proteome and metabolome in dogs (Canis lupus familiaris): The effect of chronic kidney disease
Chronic kidney disease (CKD) is a progressive and irreversible disease. Although urine is an ideal biological sample for proteomics and metabolomics studies, sensitive and specific biomarkers are currently lacking in dogs. This study characterised the dog urine proteome and metabolome, aiming to identify and possibly quantify putative biomarkers of CKD in dogs. Twenty-two healthy dogs and 28 dogs with spontaneous CKD were selected, and urine samples were collected. The urinary proteome was separated by SDS-PAGE and analysed by mass spectrometry, while the urinary metabolome was analysed in protein-depleted samples by 1D 1H NMR spectroscopy. The most abundant proteins in urine samples from healthy dogs were uromodulin, albumin and, in entire male dogs, arginine esterase. In urine samples from CKD dogs, the concentrations of uromodulin and albumin were significantly lower and higher, respectively, than in healthy dogs. In addition, these samples were characterised by a more complex protein pattern indicating mixed glomerular (protein bands ≥65 kDa) and tubular (protein bands <65 kDa) proteinuria. Urine spectra acquired by NMR allowed the identification of 86 metabolites in healthy dogs, belonging to 49 different pathways mainly involved in amino acid metabolism, purine and aminoacyl-tRNA biosynthesis, or the tricarboxylic acid cycle. Seventeen metabolites showed significantly different concentrations when comparing healthy and CKD dogs. In particular, carnosine, trigonelline and cis-aconitate might be suggested as putative biomarkers of CKD in dogs.
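The ≥65 kDa versus <65 kDa band classification above lends itself to a simple rule. The following is a minimal illustrative sketch, assuming band molecular weights estimated from SDS-PAGE are available as numbers; the helper name and example values are hypothetical, not from the study.

```python
# Minimal sketch: classifying a urinary SDS-PAGE pattern by the 65 kDa threshold
# described above (bands >=65 kDa suggesting glomerular, <65 kDa tubular
# proteinuria). Band weights and the helper are illustrative, not from the study.

GLOMERULAR_THRESHOLD_KDA = 65.0

def classify_proteinuria(band_weights_kda):
    """Label a urine sample's SDS-PAGE pattern from its band molecular weights."""
    glomerular = any(w >= GLOMERULAR_THRESHOLD_KDA for w in band_weights_kda)
    tubular = any(w < GLOMERULAR_THRESHOLD_KDA for w in band_weights_kda)
    if glomerular and tubular:
        return "mixed glomerular and tubular"
    if glomerular:
        return "glomerular"
    if tubular:
        return "tubular"
    return "no proteinuria detected"

# Example: a CKD-like pattern with bands above and below 65 kDa
print(classify_proteinuria([90.0, 69.0, 40.0, 25.0]))  # mixed glomerular and tubular
```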
Hyperoxemia and excess oxygen use in early acute respiratory distress syndrome: Insights from the LUNG SAFE study
Background: Concerns exist regarding the prevalence and impact of unnecessary oxygen use in patients with acute respiratory distress syndrome (ARDS). We examined this issue in patients with ARDS enrolled in the Large observational study to UNderstand the Global impact of Severe Acute respiratory FailurE (LUNG SAFE) study. Methods: In this secondary analysis of the LUNG SAFE study, we aimed to determine the prevalence of, and the outcomes associated with, hyperoxemia on day 1, sustained hyperoxemia, and excessive oxygen use in patients with early ARDS. Patients who fulfilled criteria for ARDS on day 1 and day 2 of acute hypoxemic respiratory failure were categorised based on the presence of hyperoxemia (PaO2 > 100 mmHg) on day 1, sustained (i.e., present on day 1 and day 2) hyperoxemia, or excessive oxygen use (FIO2 ≥ 0.60 during hyperoxemia). Results: Of the 2005 patients who met the inclusion criteria, 131 (6.5%) were hypoxemic (PaO2 < 55 mmHg), 607 (30%) had hyperoxemia on day 1, and 250 (12%) had sustained hyperoxemia. Excess FIO2 use occurred in 400 (66%) of the 607 patients with hyperoxemia. Excess FIO2 use decreased from day 1 to day 2 of ARDS, with most hyperoxemic patients on day 2 receiving relatively low FIO2. Multivariate analyses found no independent relationship between day 1 hyperoxemia, sustained hyperoxemia, or excess FIO2 use and adverse clinical outcomes. Mortality was 42% in patients with excess FIO2 use, compared to 39% in a propensity-matched sample of normoxemic (PaO2 55-100 mmHg) patients (P = 0.47). Conclusions: Hyperoxemia and excess oxygen use are both prevalent in early ARDS but are most often non-sustained. No relationship was found between hyperoxemia or excessive oxygen use and patient outcome in this cohort. Trial registration: LUNG SAFE is registered with ClinicalTrials.gov, NCT02010073.
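The categorisation above reduces to a few threshold checks. The snippet below is a minimal sketch of those definitions, assuming per-patient PaO2 and FIO2 values are available for days 1 and 2; the function and field names are illustrative, not the LUNG SAFE data schema.

```python
# Minimal sketch of the oxygenation categories described above; the function and
# example values are illustrative, not taken from the LUNG SAFE dataset.

HYPEROXEMIA_PAO2_MMHG = 100.0   # hyperoxemia: PaO2 > 100 mmHg
HYPOXEMIA_PAO2_MMHG = 55.0      # hypoxemia: PaO2 < 55 mmHg
EXCESS_FIO2 = 0.60              # excess oxygen use: FiO2 >= 0.60 during hyperoxemia

def categorize(day1_pao2, day1_fio2, day2_pao2):
    hyperox_d1 = day1_pao2 > HYPEROXEMIA_PAO2_MMHG
    return {
        "hypoxemia_day1": day1_pao2 < HYPOXEMIA_PAO2_MMHG,
        "hyperoxemia_day1": hyperox_d1,
        "sustained_hyperoxemia": hyperox_d1 and day2_pao2 > HYPEROXEMIA_PAO2_MMHG,
        "excess_oxygen_use": hyperox_d1 and day1_fio2 >= EXCESS_FIO2,
    }

# Example: hyperoxemic on day 1 with high FiO2, normoxemic by day 2
print(categorize(day1_pao2=130.0, day1_fio2=0.7, day2_pao2=95.0))
```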
Development and Validation of a Risk Score for Chronic Kidney Disease in HIV Infection Using Prospective Cohort Data from the D:A:D Study
Ristola M. is a member of the following working groups: D:A:D Study Group, Royal Free Hospital Clinic Cohort, INSIGHT Study Group, SMART Study Group, and ESPRIT Study Group. Background: Chronic kidney disease (CKD) is a major health issue for HIV-positive individuals, associated with increased morbidity and mortality. Development and implementation of a risk score model for CKD would allow comparison of the risks and benefits of adding potentially nephrotoxic antiretrovirals to a treatment regimen and would identify those at greatest risk of CKD. The aims of this study were to develop a simple, externally validated, and widely applicable long-term risk score model for CKD in HIV-positive individuals that can guide decision making in clinical practice. Methods and Findings: A total of 17,954 HIV-positive individuals from the Data Collection on Adverse Events of Anti-HIV Drugs (D:A:D) study with ≥3 estimated glomerular filtration rate (eGFR) values after 1 January 2004 were included. Baseline was defined as the first eGFR > 60 ml/min/1.73 m2 after 1 January 2004; individuals with exposure to tenofovir, atazanavir, atazanavir/ritonavir, lopinavir/ritonavir, or other boosted protease inhibitors before baseline were excluded. CKD was defined as confirmed (>3 mo apart) eGFR ≤ 60 ml/min/1.73 m2. In the D:A:D study, 641 individuals developed CKD during 103,185 person-years of follow-up (PYFU; incidence 6.2/1,000 PYFU, 95% CI 5.7-6.7; median follow-up 6.1 y, range 0.3-9.1 y). Older age, intravenous drug use, hepatitis C coinfection, lower baseline eGFR, female gender, lower CD4 count nadir, hypertension, diabetes, and cardiovascular disease (CVD) predicted CKD. The adjusted incidence rate ratios of these nine categorical variables were scaled and summed to create the risk score. The median risk score at baseline was -2 (interquartile range -4 to 2). There was a 1:393 chance of developing CKD in the next 5 y in the low risk group, with substantially higher risks in the medium and high risk groups (risk score ≥ 5, 505 events). The number needed to harm (NNTH) at 5 y when starting unboosted atazanavir or lopinavir/ritonavir among those with a low risk score was 1,702 (95% CI 1,166-3,367); NNTH was 202 (95% CI 159-278) and 21 (95% CI 19-23), respectively, for those with a medium and high risk score. NNTH was 739 (95% CI 506-1,462), 88 (95% CI 69-121), and 9 (95% CI 8-10) for those with a low, medium, and high risk score, respectively, starting tenofovir, atazanavir/ritonavir, or another boosted protease inhibitor. The Royal Free Hospital Clinic Cohort included 2,548 individuals, of whom 94 (3.7%) developed CKD during 18,376 PYFU (median follow-up 7.4 y, range 0.3-12.7 y). Of 2,013 individuals included from the SMART/ESPRIT control arms, 32 (1.6%) developed CKD during 8,452 PYFU (median follow-up 4.1 y, range 0.6-8.1 y). External validation showed that the risk score predicted well in these cohorts. Limitations of this study included limited data on race and no information on proteinuria. Conclusions: Both traditional and HIV-related risk factors were predictive of CKD. These factors were used to develop an externally validated risk score for CKD in HIV infection that has direct clinical relevance for patients and clinicians to weigh the benefits of certain antiretrovirals against the risk of CKD and to identify those at greatest risk of CKD.
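The NNTH figures above follow from the standard relation NNTH = 1 / (absolute risk increase). A minimal sketch, assuming a baseline 5-year CKD risk per risk-score group and a hypothetical drug-associated rate ratio; the example numbers are illustrative, not the D:A:D estimates.

```python
# Minimal sketch of the number-needed-to-harm (NNTH) logic described above:
# NNTH = 1 / (absolute 5-year risk on drug - absolute 5-year risk off drug).
# The baseline risks and rate ratio below are illustrative, not D:A:D estimates.

def nnth(baseline_risk, rate_ratio):
    """Number needed to harm given a baseline event risk and a drug rate ratio."""
    risk_on_drug = min(baseline_risk * rate_ratio, 1.0)
    absolute_risk_increase = risk_on_drug - baseline_risk
    return 1.0 / absolute_risk_increase

# A low-risk group (e.g., 1:393 = ~0.25% 5-year risk) versus a high-risk group,
# for a hypothetical drug that doubles the CKD incidence rate:
for label, risk in [("low", 1 / 393), ("high", 0.15)]:
    print(f"{label} risk group: NNTH ~ {nnth(risk, rate_ratio=2.0):.0f}")
```

This also shows why NNTH is orders of magnitude larger in low-risk than in high-risk patients, which is the clinical point of stratifying by the risk score.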
Role of the environmental microbiome on preterm newborns colonization: a pilot study by NGS
Introduction: Extremely preterm infants, due to their characteristics, represent a group at extremely high risk of contracting healthcare-associated infections (HAIs). Late-onset sepsis is a major cause of morbidity and mortality in newborns admitted to neonatal intensive care units (NICUs). In this study, we characterized the colonization of preterm infants by the environmental microbiome, analyzing nasal swabs collected at birth and during the stay in the NICU. Materials and Methods: A total of 55 nasal swabs were collected from 30 newborns admitted to the NICU, independently of their clinical conditions. The sample-collection time course included 30 swabs at the time of birth (group N), 18 after 9 days (group S), and 7 after 13 days (group SA). At the same time points, environmental samples were collected from sinks, bed footboards, and floors. Microbiome analyses were performed in parallel by NGS and by a custom real-time PCR (qPCR) microarray kit. Results: In nasal swabs, the bacteria most frequently detected by NGS were Corynebacterium spp. (N: 40%, S: 56%, SA: 57%), Staphylococcus spp. (N: 53%, S: 94%, SA: 100%), Streptococcus spp. (N: 53%, S: 61%, SA: 71%) and Escherichia-Shigella spp. (N: 33%, S: 61%, SA: 100%). Other frequently detected bacteria were Acinetobacter spp., Pseudomonas spp., Klebsiella spp., Enterobacter spp., Cutibacterium spp. and Rothia spp. The qPCR analysis allowed identification of bacteria to species level. Of note, within the genus Staphylococcus the main species were S. aureus (N: 3%, S: 33%, SA: 43%) and S. epidermidis (N: 27%, S: 89%, SA: 100%), while within the genus Streptococcus the most frequently detected species were S. pneumoniae, S. infantis, S. oralis and S. salivarius, with higher identification rates in the SA group. This method also allowed identification of the fungus Candida albicans in groups S (6%) and SA (43%). Environmental NGS analysis of the NICU showed the presence mainly of Staphylococcus spp., Streptococcus spp., Corynebacterium spp., Cutibacterium spp., Acinetobacter spp., Escherichia-Shigella spp. and Pseudomonas spp. Interestingly, the similarity between the nasal-swab microbiome and the environmental one increased with the newborns' length of stay in the ward. Discussion and Conclusions: Based on molecular analyses, our study highlights the importance of routine screening to assess the rate and type of colonization of fragile newborns by the environmental microbiome. The introduction of this new approach may be important for environmental monitoring and for the consequent clinical management of newborns admitted to the NICU.
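One simple way to express the increasing nasal-environment similarity noted above is a set-overlap measure such as the Jaccard index over detected genera; the abstract does not state which metric was used, and the genus sets below are illustrative.

```python
# Illustrative sketch: quantifying how similar a nasal-swab microbiome is to the
# environmental one using the Jaccard index over detected genera. The metric
# choice and the genus sets are assumptions, not taken from the study.

def jaccard(a: set, b: set) -> float:
    """Share of genera detected in either sample that appear in both."""
    return len(a & b) / len(a | b) if (a or b) else 0.0

environment = {"Staphylococcus", "Streptococcus", "Corynebacterium",
               "Cutibacterium", "Acinetobacter", "Escherichia-Shigella", "Pseudomonas"}
swab_birth = {"Corynebacterium", "Streptococcus", "Cutibacterium"}
swab_day13 = {"Staphylococcus", "Streptococcus", "Corynebacterium",
              "Escherichia-Shigella", "Acinetobacter"}

print(f"birth vs environment:  {jaccard(swab_birth, environment):.2f}")   # lower
print(f"day 13 vs environment: {jaccard(swab_day13, environment):.2f}")  # higher
```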
Candida infections in paediatrics: Results from a prospective single-centre study in a tertiary care children's hospital
To describe the epidemiology of invasive Candida infection (ICI) in a tertiary care paediatric hospital, we conducted a prospective single-centre survey of all Candida strains isolated from normally sterile fluids and urine in the period 2005-2015. A total of 299 ICI were documented in 262 patients. Urinary tract infection represented the most frequent diagnosis (62%), followed by fungaemia (34%) and peritonitis (4%). Fungaemia was most frequent in children with cancer (59%) or in low birth weight neonates (61%), while urinary tract infections were more frequent in patients with urinary tract malformations. C. albicans was the most frequently isolated species (60%) compared with C. non-albicans species, but differences were present according to the site of isolation and underlying conditions. Overall 90-day mortality was 7%: 13% in fungaemia, 8% in peritonitis and 2% in urinary tract infections. The rate of ICI increased during the study period. Invasive Candida infection is diagnosed with increasing frequency in children. Site of isolation and aetiology are frequently related to the presence of underlying predisposing conditions. Mortality was not negligible, especially in more invasive infections and in the presence of specific underlying conditions.
Performance of 1,3-β-D-glucan for diagnosing invasive fungal diseases in children
Plasma 1,3-β-D-glucan (BDG) is indicated as a tool for early diagnosis of invasive fungal diseases (IFD). However, data on its diagnostic value in children are scarce; a definition of BDG test performance in paediatrics is therefore needed. BDG was evaluated in children admitted to the "Istituto Giannina Gaslini", Genoa, Italy, who developed clinical conditions at risk for IFD. Results were analysed for sensitivity, specificity, predictive values, likelihood ratios, accuracy, informedness, and the probability of missing one case by a negative test. A total of 1577 BDG determinations were performed on 255 patients (49% males, median age 5.4 years). Overall, 46 IFD were diagnosed, 72% proven/probable. Test performance was evaluated at the 80 pg/mL, 120 pg/mL, 200 pg/mL, 350 pg/mL and 400 pg/mL cut-offs. Sensitivity was always <0.80, and specificity was >0.90 only for cut-offs ≥200 pg/mL. Negative predictive value was ≥0.90 for all the cut-offs evaluated, while positive predictive value was barely 0.50 (at 8% IFD prevalence). Accuracy was never >0.90, and informedness was at best 0.50. The risk of missing one IFD by a negative result was <10%. Analyses in haemato-oncological or newborn patients did not show major differences. Detection of serum BDG does not appear to be a valuable adjunctive diagnostic tool for IFD in paediatrics.
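The metrics reported above all derive from a 2×2 confusion matrix at a given cut-off. The following is a minimal sketch with hypothetical counts chosen to roughly match the reported pattern (~8% prevalence, sensitivity <0.80, NPV ≥0.90); the counts are not the study's data.

```python
# Minimal sketch of the test-performance metrics reported above, computed from a
# 2x2 confusion matrix at a single BDG cut-off. Counts are illustrative only.

def test_performance(tp, fp, fn, tn):
    sens = tp / (tp + fn)
    spec = tn / (tn + fp)
    return {
        "sensitivity": sens,
        "specificity": spec,
        "ppv": tp / (tp + fp),                 # positive predictive value
        "npv": tn / (tn + fn),                 # negative predictive value
        "lr_positive": sens / (1 - spec),      # positive likelihood ratio
        "lr_negative": (1 - sens) / spec,      # negative likelihood ratio
        "accuracy": (tp + tn) / (tp + fp + fn + tn),
        "informedness": sens + spec - 1,       # Youden's J
    }

# Hypothetical counts at one cut-off, with ~8% disease prevalence (46/576):
for metric, value in test_performance(tp=28, fp=40, fn=18, tn=490).items():
    print(f"{metric}: {value:.2f}")
```

Note how, at low prevalence, NPV stays high even with modest sensitivity while PPV stays low, which is exactly the pattern the study reports.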
Virtualized Security at the Network Edge: A User-centric Approach
The current device-centric protection model against security threats has serious limitations.
On the one hand, the proliferation of user terminals such as smartphones, tablets, notebooks, smart TVs, game consoles, and desktop computers makes it extremely difficult to achieve the same level of protection regardless of the device used.
On the other hand, when several users share devices (e.g., parents and children sharing devices at home), setting up distinct security profiles, policies, and protection rules for the different users of a terminal is far from trivial.
In light of these problems, this article advocates for a paradigm shift in user protection.
In our model, protection is decoupled from users' terminals and is provided by the access network through a trusted virtual domain.
Each trusted virtual domain provides unified and homogeneous security for a single user irrespective of the terminal employed.
We describe a user-centric model where non-technically-savvy users can define their own profiles and protection rules in an intuitive way (see the sketch below).
We show that our model can harness the virtualization power offered by next-generation access networks, especially network functions virtualization in the points of presence at the edge of telecom operators' networks.
We also analyze the distinctive features of our model and the challenges faced, based on the experience gained in the development of a proof of concept.
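To make the user-centric idea concrete, here is a hypothetical sketch of a per-user protection profile enforced by a trusted virtual domain at the network edge, independently of the terminal in use. All names (ProtectionRule, TrustedVirtualDomain, the rule categories) are illustrative assumptions, not the paper's actual design or API.

```python
# Hypothetical sketch of the user-centric model above: a per-user protection
# profile, defined once and enforced by the user's trusted virtual domain in the
# access network, regardless of the device in use. All names are illustrative.

from dataclasses import dataclass, field

@dataclass
class ProtectionRule:
    name: str
    action: str    # e.g., "block" or "allow"
    category: str  # e.g., "malware", "adult-content", "social-media"

@dataclass
class UserProfile:
    user: str
    rules: list[ProtectionRule] = field(default_factory=list)

@dataclass
class TrustedVirtualDomain:
    """One per-user security domain instantiated by the access network."""
    profile: UserProfile

    def filter(self, traffic_category: str) -> str:
        for rule in self.profile.rules:
            if rule.category == traffic_category:
                return rule.action
        return "allow"  # default policy

# A child's profile applies identically on any device they pick up:
kid = UserProfile("kid", [ProtectionRule("no-malware", "block", "malware"),
                          ProtectionRule("no-adult", "block", "adult-content")])
tvd = TrustedVirtualDomain(kid)
print(tvd.filter("adult-content"))  # block, on smartphone, console, or TV alike
```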