Whole-genome sequencing for national surveillance of Shiga toxin–producing Escherichia coli O157
Background. National surveillance of gastrointestinal pathogens, such as Shiga toxin–producing Escherichia coli O157 (STEC O157), is key to rapidly identifying linked cases in the distributed food network to facilitate public health interventions. In this study, we used whole-genome sequencing (WGS) as a tool to inform national surveillance of STEC O157, in terms of identifying linked cases and clusters and guiding epidemiological investigation. Methods. We retrospectively analyzed 334 isolates randomly sampled from 1002 strains of STEC O157 received by the Gastrointestinal Bacteria Reference Unit at Public Health England, Colindale, in 2012. The genetic distance between isolates, as estimated by WGS, was calculated, and phylogenetic methods were used to place strains in an evolutionary context. Results. Estimates of linked clusters representing STEC O157 outbreaks in England and Wales increased 2-fold when WGS was used instead of traditional typing techniques. The previously unidentified clusters were often widely geographically distributed and small in size. Phylogenetic analysis facilitated the identification of temporally distinct cases sharing common exposures and the delineation of those that shared epidemiological and temporal links. Comparison with multilocus variable-number tandem-repeat analysis (MLVA) showed that although MLVA is as sensitive as WGS, WGS provides a more timely resolution to outbreak clustering. Conclusions. WGS has come of age as a molecular typing tool to inform national surveillance of STEC O157; it can be used in real time to provide the highest strain-level resolution for outbreak investigation. WGS allows linked cases to be identified with unprecedented specificity and sensitivity, which will facilitate targeted and appropriate public health investigations.
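The clustering step described in the Methods, pairwise genetic distances followed by grouping of closely related isolates, can be illustrated with a minimal sketch. The function names, toy sequences, and SNP threshold below are hypothetical and do not reproduce the study's actual pipeline (which placed core-genome variants in a phylogenetic context); the sketch only shows the general idea of single-linkage clustering on pairwise SNP distances.

```python
from itertools import combinations

def snp_distance(seq_a: str, seq_b: str) -> int:
    """Count positions at which two aligned core-genome sequences differ."""
    return sum(1 for a, b in zip(seq_a, seq_b) if a != b)

def single_linkage_clusters(isolates: dict[str, str], threshold: int = 5) -> list[set[str]]:
    """Group isolates whose pairwise SNP distance is <= threshold.

    The threshold is an illustrative cut-off, not a value reported in the study.
    """
    # Start with each isolate in its own cluster, then merge linked pairs.
    clusters = [{name} for name in isolates]
    for a, b in combinations(isolates, 2):
        if snp_distance(isolates[a], isolates[b]) <= threshold:
            cluster_a = next(c for c in clusters if a in c)
            cluster_b = next(c for c in clusters if b in c)
            if cluster_a is not cluster_b:
                cluster_a |= cluster_b
                clusters.remove(cluster_b)
    return clusters

# Toy aligned core-genome fragments (synthetic data, not study isolates).
example = {
    "isolate_1": "ACGTACGTAC",
    "isolate_2": "ACGTACGTAT",  # 1 SNP from isolate_1
    "isolate_3": "TTTTACGTAC",  # 3 SNPs from isolate_1
}
print(single_linkage_clusters(example, threshold=2))
# -> isolate_1 and isolate_2 cluster together; isolate_3 remains separate
```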
The Business Model as a technique for problem identification and scoping: a case study of Brazilian drinking water quality assessment sector
In this case study, the Business Model Canvas (BMC) was used as a technique for problem identification and scoping for the introduction of a new technology or methodology for water quality assessment. To this end, information about the Brazilian water supply sector was used to build a BMC around technological innovations for coliform analysis; the innovations proposed in the study include faster results, internet connectivity, and portability. To populate the model, drinking water quality data for Brazil were drawn from public databases and reports. In addition, a group of representatives responsible for diverse water supply systems and water quality laboratories reported their experience with the new coliform analysis and their perception of its technological improvement. The major gaps identified in this study were the need for simpler procedures and faster results, which may be addressed by technological improvements such as portability and internet connectivity. The segment was found to be diverse, and the BMC highlighted that the value proposition may differ across niches. The results emphasise that the BMC can be more than a business tool: it can also be used by developers or scientists to understand and improve both technology concepts and their applications.
HIGHLIGHTS
A Business Model was applied to structure coliform analysis in drinking water.
Characteristics of water supply systems influence value proposition priorities.
Legal requirements exert influence at different levels and in different applications.
Technological innovation and advantages alone may not fit customer needs.
The information gathered reveals a technological gap in coliform analysis.
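To make the canvas structure concrete, the sketch below represents the nine standard BMC blocks as a simple Python data structure and seeds a few of them with points taken from the abstract (faster results, portability, internet connectivity). The class and its entries are illustrative assumptions, not the study's actual canvas.

```python
from dataclasses import dataclass, field

@dataclass
class BusinessModelCanvas:
    """Standard nine-block canvas; the entries used below are illustrative only."""
    customer_segments: list[str] = field(default_factory=list)
    value_propositions: list[str] = field(default_factory=list)
    channels: list[str] = field(default_factory=list)
    customer_relationships: list[str] = field(default_factory=list)
    revenue_streams: list[str] = field(default_factory=list)
    key_resources: list[str] = field(default_factory=list)
    key_activities: list[str] = field(default_factory=list)
    key_partners: list[str] = field(default_factory=list)
    cost_structure: list[str] = field(default_factory=list)

# Hypothetical canvas seeded with points mentioned in the abstract.
coliform_bmc = BusinessModelCanvas(
    customer_segments=["water supply systems", "water quality laboratories"],
    value_propositions=["faster coliform results", "portability", "internet connectivity"],
    key_activities=["coliform analysis", "regulatory reporting"],
)
print(coliform_bmc.value_propositions)
```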
Nanotechnology Solutions for Global Water Challenges
The lack of clean and safe drinking water is responsible for more deaths than war, terrorism and weapons of mass destruction combined. This suggests contaminated water poses a significant threat to human health and welfare. In addition, standard water disinfection approaches such as sedimentation, filtration, and chemical or biological degradation are not fully capable of destroying emerging contaminants (e.g. pesticides, pharmaceutical waste products) or certain types of bacteria (e.g. Cryptosporidium parvum). Nanomaterials and nanotechnology-based devices can potentially be employed to solve the challenges posed by various contaminants and microorganisms. Nanomaterials of different shapes, namely nanoparticles, nanotubes, nanowires and fibers, have the ability to function as adsorbents and catalysts. These possess an expansive array of physicochemical characteristics, making them highly attractive for the production of reactive media for water membrane filtration, a vital step in the production of potable water. As a result of their exceptional adsorptive capacity for water contaminants, graphene-based nanomaterials have emerged as an area of significant importance in membrane filtration and water treatment. In addition, advanced oxidation processes (AOPs), with or without light irradiation or ultrasound, have been found to be promising alternatives for water treatment at near ambient temperature and pressure. Furthermore, the use of visible-light-active titanium dioxide photocatalysts and photo-Fenton processes has shown significant potential for water purification. A wide variety of nanomaterial-based sensors for the monitoring of water quality have also been reviewed in detail. In conclusion, the rapid and continued growth in the area of nanomaterial-based devices offers significant hope for addressing future water quality challenges.
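As background for how photocatalytic AOP performance is commonly quantified (this is standard kinetics, not a result of the review above), degradation of a dilute contaminant under TiO2 photocatalysis is often approximated by a pseudo-first-order law, C(t) = C0·exp(-k·t). The sketch below fits the apparent rate constant k from synthetic concentration-time data; the numbers are invented for illustration.

```python
import numpy as np

# Synthetic concentration-time data (mg/L vs minutes), illustrative only.
t = np.array([0.0, 10.0, 20.0, 30.0, 60.0])
c = np.array([10.0, 7.4, 5.5, 4.1, 1.7])

# Pseudo-first-order model: ln(C0/C) = k * t, so k is the slope of a
# zero-intercept linear fit of ln(C0/C) against t.
y = np.log(c[0] / c)
k = float(np.linalg.lstsq(t.reshape(-1, 1), y, rcond=None)[0][0])

half_life = np.log(2) / k
print(f"apparent rate constant k ~ {k:.3f} per min, half-life ~ {half_life:.1f} min")
```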
The assessment and impact of sarcopenia in lung cancer: a systematic literature review
Objectives There is growing awareness of the relationship between sarcopenia (loss of muscle mass and function), and outcomes in cancer, making it a potential target for future therapies. In order to inform future research and practice, we undertook a systematic review of factors associated with loss of muscle mass, and the relationship between muscle function and muscle mass in lung cancer, a common condition associated with poor outcomes.
Design We conducted a computerised systematic literature search on five databases. Studies were included if they explored muscle mass as an outcome measure in patients with lung cancer, and were published in English.
Setting Secondary care.
Participants Patients with lung cancer.
Primary outcome Factors associated with loss of muscle mass and muscle function, or sarcopenia, and the clinical impact thereof in patients with lung cancer.
Results We reviewed 5726 citations, and 35 articles were selected for analysis. Sarcopenia, as defined by reduced muscle mass alone, was found to be very prevalent in patients with lung cancer, regardless of body mass index, and where present was associated with poorer functional status and overall survival. There were diverse studies exploring molecular and metabolic factors in the development of loss of muscle mass; however, the precise mechanisms that contribute to sarcopenia and cachexia remain uncertain. Studies of the effect of nutritional supplements and ATP infusions on muscle mass showed conflicting results. There are very limited data on the correlation between the degree of sarcopenia and muscle function, a relationship that is non-linear in older non-cancer populations.
Conclusions Loss of muscle mass is a significant contributor to morbidity in patients with lung cancer. Loss of muscle mass and function may predate clinically overt cachexia, underlining the importance of evaluating sarcopenia, rather than weight loss alone. Understanding this relationship and its associated factors will provide opportunities for focused intervention to improve clinical outcomes.
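Muscle mass in studies of this kind is typically estimated from a single-slice CT at the third lumbar vertebra and normalised for height to give a skeletal muscle index (SMI, cm²/m²). The sketch below applies commonly cited sex-specific CT cut-offs; these thresholds and the example values are assumptions for illustration, not figures reported by this review.

```python
def skeletal_muscle_index(muscle_area_cm2: float, height_m: float) -> float:
    """L3 skeletal muscle cross-sectional area normalised by height squared (cm^2/m^2)."""
    return muscle_area_cm2 / height_m ** 2

def is_sarcopenic(smi: float, sex: str) -> bool:
    """Apply widely used CT-based cut-offs (~52.4 cm^2/m^2 for men, ~38.5 for women).

    These thresholds are illustrative assumptions, not values from the review.
    """
    cutoff = 52.4 if sex == "male" else 38.5
    return smi < cutoff

smi = skeletal_muscle_index(muscle_area_cm2=150.0, height_m=1.75)
print(f"SMI = {smi:.1f} cm^2/m^2, sarcopenic: {is_sarcopenic(smi, 'male')}")
```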
Palliative radiotherapy after oesophageal cancer stenting (ROCS): a multicentre, open-label, phase 3 randomised controlled trial
Background: Patients with advanced oesophageal cancer have a median survival of 3-6 months, and most require intervention for dysphagia. Self-expanding metal stent (SEMS) insertion is the most typical form of palliation in these patients, but dysphagia deterioration and re-intervention are common. This study examined the efficacy of adjuvant external beam radiotherapy (EBRT) compared with usual care alone in preventing dysphagia deterioration and reducing service use after SEMS insertion. Methods: This was a multicentre, open-label, phase 3 randomised controlled trial based at cancer centres and acute care hospitals in England, Scotland, and Wales. Patients (aged ≥16 years) with incurable oesophageal carcinoma receiving stent insertion for primary management of dysphagia were randomly assigned (1:1) to receive usual care alone or EBRT (20 Gy in five fractions or 30 Gy in ten fractions) plus usual care after stent insertion. Usual care was implemented according to need as identified by the local multidisciplinary team (MDT). Randomisation was via the method of minimisation stratified by treating centre, stage at diagnosis (I-III vs IV), histology (squamous or non-squamous), and MDT intent to give chemotherapy (yes vs no). The primary outcome was the difference in proportions of participants with dysphagia deterioration (a decrease of >11 points on the patient-reported European Organisation for Research and Treatment of Cancer quality-of-life questionnaire oesophagogastric module [QLQ-OG25], or a dysphagia-related event consistent with such a deterioration) or death by 12 weeks in a modified intention-to-treat (ITT) population, which excluded patients who did not have a stent inserted and those without a baseline QLQ-OG25 assessment. Secondary outcomes included survival, quality of life (QoL), morbidities (including time to first bleeding event or hospital admission for a bleeding event, and time to first dysphagia-related stent complication or re-intervention), and cost-effectiveness. Safety analysis was undertaken in the modified ITT population. The study is registered with the International Standard Randomised Controlled Trial registry, ISRCTN12376468, and ClinicalTrials.gov, NCT01915693, and is completed. Findings: 220 patients were randomly assigned between Dec 16, 2013, and Aug 24, 2018, from 23 UK centres. The modified ITT population (n=199) comprised 102 patients in the usual care group and 97 patients in the EBRT group. Radiotherapy did not reduce dysphagia deterioration, which was reported in 36 (49%) of 74 patients receiving usual care versus 34 (45%) of 75 receiving EBRT (adjusted odds ratio 0·82 [95% CI 0·40-1·68], p=0·59) in those with complete data for the primary endpoint. No significant difference was observed in overall survival: median overall survival was 19·7 weeks (95% CI 14·4-27·7) with usual care and 18·9 weeks (14·7-25·6) with EBRT (adjusted hazard ratio 1·06 [95% CI 0·78-1·45], p=0·70; n=199). Median time to first bleeding event or hospital admission for a bleeding event was 49·0 weeks (95% CI 33·3-not reached) with usual care versus 65·9 weeks (52·7-not reached) with EBRT (adjusted subhazard ratio 0·52 [95% CI 0·28-0·97], p=0·038; n=199). No time versus treatment interaction was observed for prespecified QoL outcomes. We found no evidence of differences between trial groups in time to first stent complication or re-intervention event. The most common (grade 3-4) adverse event was fatigue, reported in 19 (19%) of 102 patients receiving usual care alone and 22 (23%) of 97 receiving EBRT.
On cost-utility analysis, EBRT was more expensive and less efficacious than usual care. Interpretation: Patients with advanced oesophageal cancer having SEMS insertion for the primary management of their dysphagia did not gain additional benefit from concurrent palliative radiotherapy, and it should not be routinely offered. For a minority of patients clinically considered to be at high risk of tumour bleeding, concurrent palliative radiotherapy might reduce bleeding risk and the need for associated interventions. Funding: National Institute for Health Research Health Technology Assessment Programme.
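The primary-endpoint comparison can be approximately reproduced from the counts quoted above (36 of 74 usual-care vs 34 of 75 EBRT participants with complete data). The sketch below computes the unadjusted odds ratio with a Wald 95% CI; it will differ slightly from the adjusted estimate the trial reports, since the published analysis adjusted for the minimisation factors.

```python
import math

# Counts with dysphagia deterioration, taken from the abstract.
usual_events, usual_n = 36, 74
ebrt_events, ebrt_n = 34, 75

a, b = ebrt_events, ebrt_n - ebrt_events      # EBRT: events, non-events
c, d = usual_events, usual_n - usual_events   # usual care: events, non-events

odds_ratio = (a * d) / (b * c)
se_log_or = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)
ci_low = math.exp(math.log(odds_ratio) - 1.96 * se_log_or)
ci_high = math.exp(math.log(odds_ratio) + 1.96 * se_log_or)
print(f"unadjusted OR = {odds_ratio:.2f} (95% CI {ci_low:.2f}-{ci_high:.2f})")
# -> roughly 0.88 (0.46-1.67), consistent with the adjusted 0.82 (0.40-1.68)
```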