
    Risk factors for occurrence of cephalosporin-resistant Escherichia coli in Norwegian broiler flocks

    A longitudinal study of 27 broiler farms comprising 182 broiler flocks was performed to determine risk factors for the occurrence of cephalosporin-resistant Escherichia coli in Norwegian broiler flocks. Information on possible risk factors was collected through an online questionnaire and from samples obtained from broiler and parent flocks during the study period; additional information was provided by the broiler production company. The prevalence of cephalosporin-resistant E. coli was estimated for the parent and broiler flocks sampled in the study: it was detected in 13.8% of the parent flocks and 22.5% of the broiler flocks. A multivariable generalized linear model was used to estimate risk factors. The risk of occurrence of cephalosporin-resistant E. coli was associated with the status of the previous flock in the broiler house (odds ratio = 12.7), the number of parent flocks supplying the broiler flock with day-old chickens (odds ratio = 6.3), routines for disinfection of the floor between production cycles (odds ratio = 0.1), and transport personnel entering the room where the broilers are raised (odds ratio = 9.3). Our findings highlight that a high level of biosecurity, with a minimal number of people entering the broiler house during production cycles and rigorous cleaning and disinfection routines between production cycles, will contribute to a decrease in the occurrence of cephalosporin-resistant E. coli in broiler flocks, provided that there is no selection pressure from antimicrobial use in broiler production.
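
    The risk-factor analysis above rests on a binomial generalized linear model whose coefficients are exponentiated into odds ratios. The sketch below illustrates that step only; the column names and the simulated flock data are assumptions for illustration, not the study's dataset or code.

```python
# Minimal sketch of a binomial GLM for flock-level risk factors (illustrative
# only; variable names and data are assumed, not taken from the study).
import numpy as np
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n = 182  # number of broiler flocks, as in the study
flocks = pd.DataFrame({
    "esbl_positive": rng.integers(0, 2, n),             # cephalosporin-resistant E. coli detected
    "previous_flock_positive": rng.integers(0, 2, n),   # status of previous flock in the house
    "n_parent_flocks": rng.integers(1, 4, n),            # parent flocks supplying day-old chickens
    "floor_disinfected": rng.integers(0, 2, n),          # floor disinfection between cycles
    "transport_personnel_enter": rng.integers(0, 2, n),
})

model = smf.glm(
    "esbl_positive ~ previous_flock_positive + n_parent_flocks"
    " + floor_disinfected + transport_personnel_enter",
    data=flocks,
    family=sm.families.Binomial(),  # logit link: coefficients are log-odds
).fit()

# Exponentiated coefficients are odds ratios; OR > 1 indicates increased risk.
print(pd.concat([np.exp(model.params), np.exp(model.conf_int())], axis=1))
```

    A full reanalysis would also need to account for repeated flocks within the same farm (for example via a farm-level random effect), which this sketch omits.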

    Long-term effects of fecal microbiota transplantation (FMT) in patients with irritable bowel syndrome

    Background: We recently found fecal microbiota transplantation (FMT) to be an effective and safe treatment for irritable bowel syndrome (IBS) at 3 months after treatment. The present follow-up study investigated the efficacy and safety of FMT at 1 year after treatment. Methods: This study included 77 of the 91 IBS patients who had responded to FMT in our previous study. Patients provided a fecal sample and completed five questionnaires assessing their symptoms and quality of life at 1 year after FMT. The dysbiosis index (DI) and fecal bacterial profile were analyzed using 16S rRNA gene-based DNA probe hybridization. The levels of fecal short-chain fatty acids (SCFAs) were determined by gas chromatography. Results: There was a persistent response to FMT at 1 year after treatment in 32 (86.5%) and 35 (87.5%) patients who had received 30-g and 60-g FMT, respectively. In the 30-g FMT group, 12 (32.4%) and 8 (21.6%) patients showed complete remission at 1 year and 3 months, respectively; the corresponding numbers in the 60-g FMT group were 18 (45%) and 11 (27.5%). Abdominal symptoms and quality of life were further improved at 1 year compared with 3 months. These findings were accompanied by comprehensive changes in the fecal bacterial profile and SCFAs. Conclusions: Most of the IBS patients maintained a response at 1 year after FMT, and the improvements in symptoms and quality of life increased over time. Changes in the DI, fecal bacterial profile and SCFAs were more comprehensive at 1 year than at 3 months. Trial registration: www.clinicaltrials.gov (NCT03822299).
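
    A quick arithmetic check of the figures quoted above: with 77 responders split into 37 patients in the 30-g group and 40 in the 60-g group (group sizes inferred from the percentages, not stated explicitly in the abstract), the reported proportions are reproduced exactly.

```python
# Reproducing the reported percentages from the quoted counts; the group sizes
# (37 and 40) are inferred from the abstract, not stated in it.
groups = {"30-g FMT": (32, 12, 37), "60-g FMT": (35, 18, 40)}
for name, (response_1y, remission_1y, n) in groups.items():
    print(f"{name}: response {100 * response_1y / n:.1f}%, "
          f"complete remission {100 * remission_1y / n:.1f}%")
# -> 30-g FMT: response 86.5%, complete remission 32.4%
# -> 60-g FMT: response 87.5%, complete remission 45.0%
```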

    Characterization of unknown genetic modifications using high throughput sequencing and computational subtraction

    Background: When generating a genetically modified organism (GMO), the primary goal is to give a target organism one or several novel traits by using biotechnology techniques. A GMO will differ from its parental strain in that its pool of transcripts will be altered. Currently, there are no methods that can reliably determine whether an organism has been genetically altered if the nature of the modification is unknown. Results: We show that the concept of computational subtraction can be used to identify transgenic cDNA sequences from genetically modified plants. Our datasets include 454-type sequences from a transgenic line of Arabidopsis thaliana and published EST datasets from commercially relevant species (rice and papaya). Conclusion: We believe that computational subtraction represents a powerful new strategy for determining whether an organism has been genetically modified, as well as for defining the nature of the modification. Fewer assumptions have to be made compared to methods currently in use, which is an advantage particularly when working with unknown GMOs.
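
    To make the idea of computational subtraction concrete, here is a deliberately simplified toy in Python: reads that can be explained by the parental genome are discarded, and whatever remains is a candidate foreign insert. Exact substring matching stands in for a real aligner (e.g. BLAST or a short-read mapper), and all sequences below are invented for illustration.

```python
# Toy computational subtraction (illustrative only; real pipelines align reads
# against the parental genome with BLAST or a short-read mapper).
parental_genome = "ATGGCGTACGTTAGCCTAGGATCCAATGCAACGT" * 20  # stand-in host sequence

reads = [
    "GCGTACGTTAGCCTAGGATCC",     # host-derived read: present in the parental genome
    "TTGACGTAAGGGATGACGCACAAT",  # arbitrary made-up read representing a foreign insert
]

def subtract(reads, reference):
    """Return reads that cannot be explained by the reference sequence."""
    return [read for read in reads if read not in reference]

candidate_transgenic = subtract(reads, parental_genome)
print(candidate_transgenic)  # only the read absent from the parental genome remains
```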

    Length of hospital stay and risk of intensive care admission and in-hospital death among COVID-19 patients in Norway: a register-based cohort study comparing patients fully vaccinated with an mRNA vaccine to unvaccinated patients

    Objectives: We estimated the length of stay (LoS) in hospital and in the intensive care unit (ICU), and the risk of ICU admission and in-hospital death, among COVID-19 patients ≥18 years in Norway who had been fully vaccinated with an mRNA vaccine (at least two doses, or one dose and previous SARS-CoV-2 infection), compared to unvaccinated patients. Methods: Using national registry data, we analyzed SARS-CoV-2–positive patients hospitalized in Norway between 1 February and 30 November 2021 with COVID-19 as the main cause of hospitalization. We ran Cox proportional hazards models adjusting for vaccination status, age, sex, county of residence, regional health authority, date of admission, country of birth, virus variant, and underlying risk factors. Results: We included 716 fully vaccinated patients (crude overall median LoS: 5.2 days; admitted to ICU: 103 (14%); in-hospital death: 86 (13%)) and 2487 unvaccinated patients (crude overall median LoS: 5.0 days; admitted to ICU: 480 (19%); in-hospital death: 102 (4%)). In adjusted models, fully vaccinated patients had a shorter overall LoS in hospital (adjusted hazard ratio (aHR) for discharge: 1.61, 95% CI: 1.24–2.08), a shorter LoS without ICU admission (aHR: 1.27, 95% CI: 1.07–1.52), and a lower risk of ICU admission (aHR: 0.50, 95% CI: 0.37–0.69) than unvaccinated patients. We observed no difference in LoS in the ICU or in the risk of in-hospital death between fully vaccinated and unvaccinated patients. Discussion: Fully vaccinated patients hospitalized with COVID-19 in Norway have a shorter LoS and a lower risk of ICU admission than unvaccinated patients. These findings can support patient management and ongoing capacity planning in hospitals.
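
    The analysis above compares time-to-event outcomes (discharge, ICU admission, death) between vaccination groups with Cox regression. A minimal Cox proportional hazards sketch in Python using the lifelines package is shown below; the column names and simulated data are assumptions for illustration, not the registry analysis itself.

```python
# Minimal Cox proportional hazards sketch with lifelines (illustrative only;
# variables and data are assumed, not the Norwegian registry data).
import numpy as np
import pandas as pd
from lifelines import CoxPHFitter

rng = np.random.default_rng(1)
n = 500
df = pd.DataFrame({
    "los_days": rng.exponential(5.0, n) + 0.1,  # length of stay until discharge/censoring
    "discharged": rng.integers(0, 2, n),        # 1 = discharged alive, 0 = censored
    "fully_vaccinated": rng.integers(0, 2, n),
    "age": rng.integers(18, 95, n),
    "male": rng.integers(0, 2, n),
})

cph = CoxPHFitter()
cph.fit(df, duration_col="los_days", event_col="discharged")
cph.print_summary()  # the exp(coef) column gives adjusted hazard ratios (aHR) for discharge
```

    Note that an aHR for discharge above 1 means patients are discharged sooner, i.e. a shorter length of stay, which is how the 1.61 estimate above should be read.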

    Reduced risk of hospitalisation among reported COVID-19 cases infected with the SARS-CoV-2 Omicron BA.1 variant compared with the Delta variant, Norway, December 2021 to January 2022

    We included 39,524 COVID-19 Omicron and 51,481 Delta cases reported in Norway from December 2021 to January 2022. We estimated a 73% reduced risk of hospitalisation (adjusted hazard ratio: 0.27; 95% confidence interval: 0.20–0.36) for Omicron compared with Delta. Compared with the respective unvaccinated groups, Omicron cases who had completed primary two-dose vaccination 7–179 days before diagnosis had a smaller risk reduction than Delta cases (66% vs 93%). People vaccinated with three doses had a similar risk reduction for both variants (86% vs 88%).
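
    The percentages quoted above follow directly from the hazard ratios: a risk reduction is one minus the hazard ratio, expressed as a percentage. A short illustration using the reported point estimate and confidence bounds:

```python
# Risk reduction from a hazard ratio: (1 - HR) * 100. Values are those quoted above.
def reduction_pct(hr: float) -> int:
    return round((1 - hr) * 100)

print(reduction_pct(0.27))                       # 73% reduced risk of hospitalisation
print(reduction_pct(0.36), reduction_pct(0.20))  # 64% to 80%, derived from the 95% CI bounds
```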

    Modeling geographic vaccination strategies for COVID-19 in Norway.

    Vaccination was a key intervention in controlling the COVID-19 pandemic globally. In early 2021, Norway faced significant regional variation in COVID-19 incidence and prevalence, with large differences in population density, necessitating efficient vaccine allocation to reduce infections and severe outcomes. This study explored alternative vaccination strategies to minimize health outcomes (infections, hospitalizations, ICU admissions, deaths) by varying the regions prioritized, the extra doses allocated to them, and the implementation start time. Using two models (individual-based and meta-population), we simulated COVID-19 transmission during the primary vaccination period in Norway, covering the first seven months of 2021. We investigated alternative strategies that allocate more vaccine doses to regions with a higher force of infection, examined the robustness of our results, and highlighted potential structural differences between the two models. Our findings suggest that early vaccine prioritization could have reduced COVID-19-related health outcomes by 8% to 20% compared to a baseline strategy without geographic prioritization. For minimizing infections, hospitalizations, or ICU admissions, the best strategy was to initially allocate all available vaccine doses to fewer high-risk municipalities, comprising approximately one quarter of the population. For minimizing deaths, a moderate level of geographic prioritization, with approximately one third of the population receiving doubled doses, gave the best outcomes by balancing the trade-off between vaccinating younger people in high-risk areas and older people in low-risk areas. The strategy actually implemented in Norway was a two-step approach with a moderate level of geographic prioritization, aimed at maintaining this balance while addressing ethical considerations and public trust; however, it did not offer significant advantages over the baseline strategy without geographic prioritization. Earlier implementation of geographic prioritization could have addressed the main wave of infections more effectively, substantially reducing the national burden of the pandemic.
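
    As a hedged illustration of the allocation idea only (not the individual-based or meta-population models used in the study), the sketch below redistributes a fixed national supply of doses towards regions with a higher assumed force of infection; all region sizes and weights are invented.

```python
# Illustrative dose allocation: baseline pro-rata by population vs. geographic
# prioritisation weighted by an assumed regional force of infection (FOI).
# All numbers are invented for illustration.
import numpy as np

population = np.array([700_000, 250_000, 120_000, 80_000])  # hypothetical regions
force_of_infection = np.array([0.8, 0.3, 0.1, 0.05])        # assumed relative FOI per region
total_doses = 300_000

baseline = total_doses * population / population.sum()
weights = population * force_of_infection
prioritised = total_doses * weights / weights.sum()

for name, alloc in (("baseline", baseline), ("geographically prioritised", prioritised)):
    print(f"{name}: {np.round(alloc).astype(int)}")
```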

    An Adjusted Likelihood Ratio Approach Analysing Distribution of Food Products to Assist the Investigation of Foodborne Outbreaks.

    To facilitate foodborne outbreak investigations, better methods are needed for identifying the food products that should be sampled for laboratory analysis. The aim of this study was to examine whether a likelihood ratio approach previously developed on simulated data is applicable to real outbreak data. We used human case and food product distribution data from the Norwegian enterohaemorrhagic Escherichia coli outbreak in 2006. The approach was adjusted to include time and spatial smoothing and to handle missing or misclassified information. The performance of the adjusted likelihood ratio approach on the data from the HUS outbreak and on control data indicates that the adjusted approach is promising and could be a useful tool to assist and facilitate the investigation of foodborne outbreaks in the future, provided that good traceability is available and implemented in the distribution chain. However, the approach needs to be validated further on other outbreak data, including food products other than meat products, before a more general conclusion about its applicability can be drawn.
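
    A stripped-down version of the likelihood ratio idea (not the published adjusted method, which also handles time, smoothing and misclassification): a product whose regional distribution better explains where the cases occurred receives a higher likelihood ratio. All distributions and case counts below are invented.

```python
# Toy likelihood ratio for ranking food products against the spatial pattern of
# cases (illustrative only; distributions and case counts are invented).
import numpy as np

product_distribution = {                          # share of each product sold per region
    "product_A": np.array([0.70, 0.20, 0.10]),
    "product_B": np.array([0.33, 0.33, 0.34]),
}
population_share = np.array([0.40, 0.35, 0.25])   # baseline exposure model
cases_per_region = np.array([8, 2, 1])

def log_likelihood_ratio(p_product, p_baseline, cases):
    """Multinomial log-likelihood of the case counts under the product's
    distribution minus the log-likelihood under the baseline distribution."""
    return float(np.sum(cases * (np.log(p_product) - np.log(p_baseline))))

ranked = sorted(
    product_distribution.items(),
    key=lambda kv: log_likelihood_ratio(kv[1], population_share, cases_per_region),
    reverse=True,
)
for name, dist in ranked:  # products most consistent with the case pattern come first
    print(name, round(log_likelihood_ratio(dist, population_share, cases_per_region), 2))
```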