
    Emerging risks from ballast water treatment: The run-up to the International Ballast Water Management Convention

    Abstract Uptake and discharge of ballast water by ocean-going ships contribute to the worldwide spread of aquatic invasive species, with negative impacts on the environment, economies, and public health. The International Ballast Water Management Convention aims to provide a global response. The agreed standards for ballast water discharge will require ballast water treatment. Systems based on various physical and/or chemical methods have been developed for on-board installation and approved by the International Maritime Organization. Most common are combinations of high-performance filters with oxidizing chemicals or UV radiation. A well-known problem of oxidative water treatment is the formation of disinfection by-products, many of which show genotoxicity, carcinogenicity, or other long-term toxicity. In natural biota, genetic damage can affect reproductive success and ultimately impact biodiversity. Future exposure to chemicals from ballast water treatment can only be estimated from land-based testing of treatment systems, mathematical models, and exposure scenarios. Systematic studies on the chemistry of oxidants in seawater are lacking, as are data on the background levels of disinfection by-products in the oceans and strategies for monitoring future developments. The international approval procedure for ballast water treatment systems compares the estimated exposure levels of individual substances with their experimental toxicity. While well established in many substance regulations, this approach is also criticised for its simplification, which may disregard critical aspects such as multiple exposures and long-term sub-lethal effects. Moreover, a truly holistic sustainability assessment would need to take into account factors beyond chemical hazards, e.g. energy consumption, air pollution, or waste generation.
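The approval procedure described above, which compares estimated exposure with experimental toxicity, is conceptually a risk-quotient screen of the kind used in many substance regulations (PEC/PNEC comparison). A minimal sketch, with all substance values hypothetical and not taken from the abstract:

```python
# Hedged sketch of a risk-quotient screen: the predicted environmental
# concentration (PEC) is compared with a predicted no-effect concentration
# (PNEC), here derived from the lowest available toxicity endpoint divided
# by an assessment factor. All numbers below are hypothetical illustrations.

def pnec(lowest_endpoint_ug_l: float, assessment_factor: float) -> float:
    """PNEC = lowest toxicity endpoint / assessment factor."""
    return lowest_endpoint_ug_l / assessment_factor

def risk_quotient(pec_ug_l: float, pnec_ug_l: float) -> float:
    """RQ = PEC / PNEC; RQ >= 1 flags a potential risk."""
    return pec_ug_l / pnec_ug_l

# Hypothetical disinfection by-product:
pec = 0.5                                 # ug/L, modelled discharge concentration
p = pnec(10.0, assessment_factor=100.0)   # lowest endpoint 10 ug/L
rq = risk_quotient(pec, p)
print(f"PNEC = {p} ug/L, RQ = {rq}",
      "-> potential risk" if rq >= 1 else "-> no immediate concern")
```

The criticism quoted in the abstract applies directly to this scheme: a single-substance quotient cannot capture mixture effects or long-term sub-lethal endpoints.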

    Measuring cognitive impairment and monitoring cognitive decline in Huntington's disease: a comparison of assessment instruments

    Background Progressive cognitive decline is an inevitable feature of Huntington’s disease (HD), but specific criteria and instruments are still insufficiently developed to reliably classify patients into categories of cognitive severity and to monitor the progression of cognitive impairment. Methods We collected data from a cohort of 180 positive gene-carriers: 33 with premanifest HD and 147 with manifest HD. Using a specifically developed gold standard for cognitive status, we classified participants into those with normal cognition, those with mild cognitive impairment (MCI), and those with dementia. We administered the Parkinson’s Disease-Cognitive Rating Scale (PD-CRS), the MMSE, and the UHDRS cogscore at baseline and at 6-month and 12-month follow-up visits. Cutoff scores discriminating between the three cognitive categories were calculated for each instrument. For each cognitive group and instrument we assessed cognitive progression, sensitivity to change, and the minimal clinically important difference corresponding to conversion from one category to another. Results The PD-CRS cutoff scores for MCI and dementia showed excellent sensitivity and specificity that were not achieved with the other instruments. Throughout follow-up, in all cognitive groups, the PD-CRS captured the rate of conversion from one cognitive category to another as well as the different patterns of cognitive trajectories. Conclusion The PD-CRS is a valid and reliable instrument for capturing MCI and dementia syndromes in HD. It captures the different trajectories of cognitive progression as a function of cognitive status and shows sensitivity to change in MCI and dementia.
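Cutoff scores that discriminate between cognitive categories are typically chosen from a ROC-style analysis, for instance by maximising Youden's J (sensitivity + specificity − 1). A minimal sketch of that idea, with hypothetical scores and labels rather than the study's data:

```python
# Hedged sketch: choosing a screening cutoff by maximising Youden's J
# (sensitivity + specificity - 1). Scores and labels below are invented
# for illustration; lower scores indicate impairment in this example.

def youden_cutoff(scores, impaired):
    """Return (best_cutoff, J), classifying 'impaired' when score <= cutoff."""
    best = None
    for c in sorted(set(scores)):
        tp = sum(1 for s, y in zip(scores, impaired) if y and s <= c)
        fn = sum(1 for s, y in zip(scores, impaired) if y and s > c)
        tn = sum(1 for s, y in zip(scores, impaired) if not y and s > c)
        fp = sum(1 for s, y in zip(scores, impaired) if not y and s <= c)
        sens = tp / (tp + fn)   # true-positive rate among impaired
        spec = tn / (tn + fp)   # true-negative rate among unimpaired
        j = sens + spec - 1
        if best is None or j > best[1]:
            best = (c, j)
    return best

scores   = [70, 82, 91, 95, 64, 55, 88, 97, 60, 93]   # hypothetical totals
impaired = [1,  0,  0,  0,  1,  1,  0,  0,  1,  0]    # 1 = impaired
cutoff, j = youden_cutoff(scores, impaired)
print(cutoff, round(j, 2))
```

In practice the study's gold-standard classification would supply the labels, and sensitivity/specificity at the chosen cutoff would be reported alongside it.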

    Anticipatory Regulation of Action Control in a Simon Task: Behavioral, Electrophysiological, and fMRI Correlates

    In the present study we investigated cue-induced preparation in a Simon task and recorded electroencephalography (EEG) and functional magnetic resonance imaging (fMRI) data in two within-subjects sessions. Cues either informed about (1) the upcoming spatial stimulus-response compatibility (rule cues) or (2) the upcoming stimulus location (position cues), or (3) were non-informative. Only rule cues allowed anticipation of the upcoming compatibility condition; position cues allowed anticipation of the upcoming location of the Simon stimulus but not its compatibility condition. Rule cues elicited the fastest and most accurate performance for both compatible and incompatible trials. The contingent negative variation (CNV) in the event-related potential (ERP) of the cue-target interval, an index of anticipatory preparation, was magnified after rule cues. The N2 in the post-target ERP, a measure of online action control, was reduced in Simon trials after rule cues. Although compatible trials were faster than incompatible trials in all cue conditions, only non-informative cues revealed a compatibility effect in additional indicators of Simon task conflict such as accuracy and the N2. We thus conclude that rule cues induced an anticipatory re-coding of the Simon task that no longer involved cognitive conflict. fMRI revealed that rule cues yielded more activation of the left rostral, dorsal, and ventral prefrontal cortex as well as the pre-SMA, compared with position and non-informative cues. Pre-SMA and ventrolateral prefrontal activation after rule cues correlated with the effective use of rule cues in behavioral performance. Position cues induced a smaller CNV effect and exhibited fewer prefrontal and pre-SMA contributions in fMRI. Our data point to the importance of disentangling different anticipatory adjustments, which may also include the prevention of upcoming conflict via task re-coding.

    Evaluation of the DBP formation potential of biocides and identification of knowledge gaps in environmental risk assessment

    Abstract Disinfectants and preservatives used as biocides may contain or release active substances (a.s.) that can form by-products with the surrounding matrices during their application, which may then be released into the environment. Over the past 40 years, several hundred of these so-called disinfection by-products (DBPs) have been detected after applications of biocides used for disinfection. Owing to intensive research and advances in analytical capabilities, many new DBP classes, such as iodinated DBPs (I-DBPs), halonitromethanes (HNMs), haloacetamides (HaAms), and halomethanesulfonic acids, have been detected worldwide in various matrices and applications. Because of the possible hazards and risks for humans and the environment, frequently occurring DBP classes, such as trihalomethanes (THMs), haloacetic acids (HAAs), and nitrosamines (e.g. NDMA), have already been included in many legislations and assigned limit values. In the European Union, biocides are assessed under the Biocidal Products Regulation 528/2012 (BPR) regarding their efficacy, potential hazards, and risks to human health and the environment. However, the available guidance for the environmental risk assessment (ERA) of DBPs remains vague. To identify knowledge gaps and to further develop the assessment scheme for the ERA of DBPs, a literature search on the multiple uses of biocides and their DBP formation potential was performed, and the existing ERA process was evaluated. The results reveal knowledge gaps regarding DBP formation in non-aqueous systems and DBP formation by non-halogen-based biocidal active substances. Based on the literature research on biocides, a proposal for grouping active substances according to their DBP formation potential is presented to simplify future ERAs; this, too, requires further research. Until then, a pragmatic approach that considers the DBP formation potential of the active substances and the identified knowledge gaps needs to be established for the environmental risk assessment of DBPs in the EU.

    Specificity of fast perceptual learning in shape localisation tasks based on detection versus form discrimination

    Abstract Perceptual learning is defined as a long-lasting improvement of perception as a result of experience. Here we examined the role of the task in fast perceptual learning for shape localisation, based either on simple detection or on form discrimination in different visual submodalities, using identical stimulus positions and stimulus types for both tasks. Thresholds for each submodality were identified by four-alternative forced-choice tasks. Fast perceptual learning occurred for shape detection based on luminance, motion, and color differences, but not for texture differences. In contrast, fast perceptual learning was not evident in shape localisation based on discrimination. Thresholds of all submodalities were stable across days. Fast perceptual learning thus seems to differ not only between different visual submodalities, but also across different tasks within the same visual submodality.
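In a four-alternative forced-choice (4AFC) task, chance performance is 25%, so a threshold is commonly defined as the stimulus level at which accuracy crosses a criterion midway between chance and ceiling (62.5%). A minimal sketch of that interpolation, with invented stimulus levels and accuracies rather than the study's data:

```python
# Hedged sketch: estimating a 4AFC threshold by linear interpolation at
# the 62.5% criterion (midway between 25% chance and 100% ceiling).
# Stimulus levels and accuracies below are invented for illustration.

CHANCE = 0.25
CRITERION = (CHANCE + 1.0) / 2   # 0.625 for 4AFC

def threshold(levels, accuracy, criterion=CRITERION):
    """Interpolate the stimulus level where accuracy first crosses the
    criterion. Assumes accuracy increases monotonically with level."""
    for i in range(1, len(levels)):
        a0, a1 = accuracy[i - 1], accuracy[i]
        if a0 < criterion <= a1:
            l0, l1 = levels[i - 1], levels[i]
            return l0 + (criterion - a0) * (l1 - l0) / (a1 - a0)
    raise ValueError("criterion not crossed")

levels   = [0.02, 0.04, 0.08, 0.16, 0.32]   # e.g. luminance contrast
accuracy = [0.26, 0.40, 0.55, 0.70, 0.95]   # proportion correct
print(round(threshold(levels, accuracy), 3))
```

Fast learning would then show up as this threshold (or accuracy at a fixed level) improving within a single session, while stable thresholds across days indicate no slow consolidation.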

    Natural history of Krabbe disease - a nationwide study in Germany using clinical and MRI data

    Abstract Background Krabbe disease, or globoid cell leukodystrophy, is a severe neurodegenerative disorder caused by a defect in the GALC gene leading to a deficiency of the enzyme β-galactocerebrosidase. The aim of this work was to describe the natural disease course covering the whole spectrum of the disease. Methods Natural history data were collected with a standardized questionnaire, supplemented by medical record data. We defined different forms of the disease according to Abdelhalim et al. (2014). Developmental and disease trajectories were described based on the acquisition and loss of milestones as well as the time of first clearly identifiable symptoms and needs, such as spasticity, seizures, and tube feeding. MRI was assessed using the scoring system of Loes et al. (1999) and, in addition, a pattern recognition approach based on Abdelhalim et al. (2014). Results Thirty-eight patients were identified; 40 MRIs were available for 27 of them. Thirty patients (79%) had an infantile onset, showing first symptoms in their first year of life, almost all (27 out of 30) starting in the first six months. A later onset, after the first year of life, was observed in 8 patients (21%; range 18 months to 60 years). Irritability, abnormalities in movement pattern, and general developmental regression were the first symptoms in the infantile group; the disease course was severe with rapid progression, e.g. loss of visual fixation, need for tube feeding, and early death. Gait disorders were the first symptoms in all patients of the later-onset groups; progression was variable. The different forms of the disease were characterized by different MRI patterns (infantile: diffuse white matter involvement with cerebellar structures specifically affected; later onset: parieto-occipital white matter and splenium affected; adult: motor tracts specifically affected). Conclusion This is the first description of the natural history of Krabbe disease in a larger European cohort using developmental, clinical, and MRI data. We would like to highlight the very different clinical and MRI characteristics of the later-onset forms. These data are important for counselling affected patients and families and may serve as a basis for future treatment trials.