10 research outputs found

    Towards a novel small animal proton irradiation platform: the SIRMIO project

    Background: Precision small animal radiotherapy research is a young, emerging field that aims to provide new experimental insights into tumor and normal tissue models in different microenvironments, to unravel the complex mechanisms of radiation damage in target and non-target tissues, and to assess the efficacy of novel therapeutic strategies. For photon therapy, modern small animal radiotherapy research platforms have been developed over recent years and are now commercially available. Conversely, for proton therapy, which holds the potential for an even better outcome than photon therapy, no commercial system exists yet. Material and methods: The SIRMIO project (Small Animal Proton Irradiator for Research in Molecular Image-guided Radiation-Oncology) aims to realize and demonstrate an innovative portable prototype system for precision image-guided small animal proton irradiation, suitable for installation at existing clinical treatment facilities. The proposed design combines precise dose application with in-situ multi-modal anatomical image guidance and in-vivo verification of the actual treatment delivery. Results and conclusions: This manuscript describes the status of the different components under development, featuring a dedicated beamline for degradation and focusing of clinical proton beams, along with novel detector systems for in-situ imaging and range verification. The foreseen workflow includes pre-treatment proton transmission imaging, complemented by ultrasonic tumor localization, for treatment planning and position verification, followed by image-guided delivery with on-site range verification by means of ionoacoustics (for pulsed beams) and positron emission tomography (PET, for continuous beams).
    The proposed compact and cost-effective system promises to open a new era in small animal proton therapy research, contributing to the basic understanding of in-vivo radiation action and identifying areas of potential breakthroughs for future translation into innovative clinical strategies.
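    A rough feel for why clinical proton beams must be degraded for small-animal irradiation comes from the Bragg-Kleeman rule for proton range in water, R ≈ αE^p. The sketch below uses textbook approximations for water (α ≈ 0.0022 cm·MeV⁻ᵖ, p ≈ 1.77) and illustrative energies; these are generic physics constants, not SIRMIO design parameters.

```python
# Bragg-Kleeman estimate of proton range in water: R = alpha * E**p.
# Constants are common empirical values for water, used here only to
# illustrate the clinical-to-preclinical energy gap.

ALPHA = 0.0022  # cm / MeV^p, empirical constant for water
P = 1.77        # empirical exponent for water

def proton_range_cm(energy_mev: float) -> float:
    """Approximate CSDA range in water for a proton of the given energy."""
    return ALPHA * energy_mev ** P

for e in (150.0, 70.0, 20.0):
    print(f"{e:6.1f} MeV -> {proton_range_cm(e):6.2f} cm")
```

    A 150 MeV clinical beam stops after roughly 15 cm of water, far beyond a mouse; only after degradation to a few tens of MeV does the range drop below 1 cm, which is why the dedicated beamline for degradation and focusing is central to the design.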

    Implications of Storing Urinary DNA from Different Populations for Molecular Analyses

    Molecular diagnosis using urine is established for many sexually transmitted diseases and is increasingly used to diagnose tumours and other infectious diseases. Storage of urine prior to analysis, whether due to home collection or bio-banking, is increasingly advocated, yet no best practice has emerged. Here, we examined the stability of DNA in stored urine in two populations over 28 days. Urine from 40 (20 male) healthy volunteers from two populations, Italy and Zambia, was stored at four different temperatures (room temperature, 4 °C, −20 °C and −80 °C) with and without EDTA preservative solution. Urine samples were extracted at days 0, 1, 3, 7 and 28 after storage. Human DNA content was measured using multi-copy (ALU J) and single-copy (TLR2) targets by quantitative real-time PCR. Zambian and Italian samples contained comparable DNA quantities at time zero. Generally, two trends were observed during storage: no degradation, or rapid degradation from days 0 to 7 followed by little further degradation up to 28 days. The biphasic degradation was always observed in Zambia regardless of storage conditions, but only twice in Italy. Site-specific differences in urine composition significantly affect the stability of DNA during storage. Assessing the quality of stored urine for molecular analysis, using the type of strategy described here, is paramount before these samples are used for molecular prognostic monitoring, genetic analyses and disease diagnosis.
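    The degradation trend the abstract describes can be read directly from qPCR threshold-cycle (Ct) values: assuming ideal (100% efficient) amplification, each extra cycle to threshold implies half as much starting template. The sketch below computes the remaining DNA fraction relative to day 0 from invented Ct values; the numbers are illustrative only, not the study's data.

```python
# Relative qPCR quantification under the idealized assumption of one
# doubling per cycle: quantity ratio = 2 ** (Ct_day0 - Ct_stored).

def fraction_remaining(ct_day0: float, ct_stored: float) -> float:
    """Each extra cycle to threshold implies half as much starting template."""
    return 2.0 ** (ct_day0 - ct_stored)

# Hypothetical Ct values for one sample over the study's storage days,
# shaped like the "biphasic" trend (fast loss to day 7, then a plateau):
cts = {0: 22.0, 1: 23.1, 3: 24.5, 7: 25.9, 28: 26.1}
for day, ct in cts.items():
    print(f"day {day:2d}: {100 * fraction_remaining(cts[0], ct):5.1f}% of day-0 DNA")
```

    Real assays correct for amplification efficiency below 100%, but the same log-scale reasoning applies.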

    Oral ketamine: A four-year experience in a tumour clinic in Lusaka, Zambia

    Pain management is an important component of cancer care. The administration of painful injections to children in an oncology clinic can create difficulties. This study was undertaken to determine the role of oral ketamine in modifying the response to pain. Between 1996 and 1999, 6324 patients attended a tumour clinic in a developing-country teaching hospital. Forty-eight children required cytotoxic injections on 103 occasions. These children were subdivided into 3 groups according to the year of attendance: 1996, 1997 and 1998/9. Each group was premedicated differently: the first group received ketamine 4.5 mg/kg; the second received ketamine 6 mg/kg; and the third had ketamine 6 mg/kg with diazepam 0.1 mg/kg. The response to pain in each group was evaluated using an observer-based scoring system; the visual analogue scale was not used. The study showed that oral ketamine is an effective and safe drug for use in a clinic setting. However, its action was not always predictable due to a number of confounding factors. A phenothiazine should be routinely used in these children to enhance the effectiveness of ketamine and to diminish the likelihood of its well-known side effects. Further studies using less costly, lower doses of ketamine are recommended. Key words: Oral Ketamine, Premedication and Oncology

    Local perceptions of cholera and anticipated vaccine acceptance in Katanga province, Democratic Republic of Congo

    ABSTRACT: BACKGROUND: In regions where access to clean water and the provision of a sanitary infrastructure have not been sustainable, cholera continues to pose an important public health burden. Although oral cholera vaccines (OCV) are an effective means to complement classical cholera control efforts, still relatively little is known about their acceptability in targeted communities. Clarification of vaccine acceptability prior to the introduction of a new vaccine provides important information for future policy and planning. METHODS: In a cross-sectional study in Katanga province, Democratic Republic of Congo (DRC), local perceptions of cholera and anticipated acceptance of an OCV were investigated. A random sample of 360 unaffected adults from a rural town and a remote fishing island was interviewed in 2010. In-depth interviews with a purposive sample of key informants and focus-group discussions provided contextual information. Socio-cultural determinants of anticipated OCV acceptance were assessed with logistic regression. RESULTS: Most respondents perceived contaminated water (63%) and food (61%) as the main causes of cholera. Vaccines (28%), health education (18%) and the provision of clean water (15%) were considered the most effective measures of cholera control. Anticipated acceptance reached 97% if an OCV were provided free of charge. Cholera-specific knowledge of hygiene and self-help in the form of praying for healing were positively associated with anticipated OCV acceptance if costs of USD 5 were assumed. Conversely, respondents who feared negative social implications of cholera were less likely to anticipate acceptance of OCVs. These fears were especially prominent among respondents who generated their income through fishing. With an increase of assumed costs to USD 10.5, fear of financial constraints was negatively associated as well. CONCLUSIONS: Results suggest a high motivation to use an OCV as long as it seems affordable.
    The needs of socially marginalized groups such as fishermen may have to be explicitly addressed when preparing for a mass vaccination campaign.
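    For a single binary predictor, the kind of association the abstract reports (e.g. fear of negative social implications vs. anticipated OCV acceptance) reduces to an odds ratio from a 2×2 table; the study itself used full logistic regression with multiple determinants. The sketch below uses invented counts purely for illustration, with a Woolf standard error for the confidence interval.

```python
# Odds ratio and 95% CI from a 2x2 exposure-by-outcome table.
import math

def odds_ratio(a: int, b: int, c: int, d: int) -> float:
    """OR for table [[exposed & accept = a, exposed & refuse = b],
                     [unexposed & accept = c, unexposed & refuse = d]]."""
    return (a * d) / (b * c)

# Hypothetical counts: fear of social implications vs. OCV acceptance.
a, b, c, d = 40, 60, 180, 80
or_ = odds_ratio(a, b, c, d)
se = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)  # Woolf SE of log(OR)
lo = math.exp(math.log(or_) - 1.96 * se)
hi = math.exp(math.log(or_) + 1.96 * se)
print(f"OR = {or_:.2f}, 95% CI = ({lo:.2f}, {hi:.2f})")
```

    An OR below 1 with a CI excluding 1 would correspond to the negative association described for respondents fearing social consequences.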

    A Review of Classification Algorithms for EEG-based Brain-Computer Interfaces: A 10-year Update

    Objective: Most current Electroencephalography (EEG)-based Brain-Computer Interfaces (BCIs) are based on machine learning algorithms. There is a large diversity of classifier types used in this field, as described in our 2007 review paper. Now, approximately 10 years after that review was published, many new algorithms have been developed and tested to classify EEG signals in BCIs. The time is therefore ripe for an updated review of EEG classification algorithms for BCIs. Approach: We surveyed the BCI and machine learning literature from 2007 to 2017 to identify the new classification approaches that have been investigated to design BCIs. We synthesize these studies in order to present such algorithms, to report how they were used for BCIs and what the outcomes were, and to identify their pros and cons. Main results: We found that the recently designed classification algorithms for EEG-based BCIs can be divided into four main categories: adaptive classifiers, matrix and tensor classifiers, transfer learning and deep learning, plus a few other miscellaneous classifiers. Among these, adaptive classifiers were demonstrated to be generally superior to static ones, even with unsupervised adaptation. Transfer learning can also prove useful, although its benefits remain unpredictable. Riemannian geometry-based methods have reached state-of-the-art performance on multiple BCI problems and deserve to be explored more thoroughly, along with tensor-based methods. Shrinkage linear discriminant analysis and random forests also appear particularly useful for small training sample settings. On the other hand, deep learning methods have not yet shown convincing improvement over state-of-the-art BCI methods.
    Significance: This paper provides a comprehensive overview of the modern classification algorithms used in EEG-based BCIs, presents the principles of these methods, and gives guidelines on when and how to use them. It also identifies a number of challenges to further advance EEG classification in BCIs.
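    Shrinkage linear discriminant analysis, which the review singles out for small training sample settings, regularizes the empirical covariance toward a scaled identity before solving for the discriminant weights. The sketch below is a minimal generic implementation with a fixed shrinkage parameter and synthetic Gaussian "trials"; the feature dimension, shrinkage value and data are illustrative assumptions, not taken from the review.

```python
# Shrinkage LDA: replace the pooled covariance S with
# (1 - lam) * S + lam * (tr(S) / d) * I before inverting.
import numpy as np

def shrinkage_lda(X0, X1, lam=0.5):
    """Return (w, b) for the decision rule: class 1 if w @ x + b > 0."""
    mu0, mu1 = X0.mean(axis=0), X1.mean(axis=0)
    centered = np.vstack([X0 - mu0, X1 - mu1])
    cov = centered.T @ centered / (len(centered) - 1)  # pooled covariance
    d = cov.shape[0]
    target = np.trace(cov) / d * np.eye(d)             # scaled identity
    cov_s = (1 - lam) * cov + lam * target             # shrunk covariance
    w = np.linalg.solve(cov_s, mu1 - mu0)
    b = -w @ (mu0 + mu1) / 2
    return w, b

rng = np.random.default_rng(0)
X0 = rng.normal(0.0, 1.0, size=(10, 8))  # 10 trials, 8 "channels", class 0
X1 = rng.normal(1.0, 1.0, size=(10, 8))  # class 1, shifted mean
w, b = shrinkage_lda(X0, X1)
acc = np.mean([(w @ x + b > 0) == y
               for X, y in ((X0, False), (X1, True)) for x in X])
print(f"training accuracy: {acc:.2f}")
```

    With only 10 trials per class and 8 features, the unregularized covariance is poorly conditioned; the shrinkage term keeps the solve stable, which is precisely the small-sample advantage the review attributes to this classifier family.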