
    "This town can't be that harmful": risk perception of lead exposure

    Introduction: Much of the focus of lead risk health campaigns has been on vulnerable populations such as children and pregnant women, leaving the risk of exposure for other adults poorly communicated. This is a particular issue for adults who are long-term residents of communities near lead mines, as they can be at increased risk of exposure to lead in their environment. This study therefore investigated the perceived risk of exposure to lead among residents of a lead mining community. Methods: Semi-structured interviews were held with 20 residents (3 male, 17 female) of a community in close proximity to a lead mine, recruited through community media and local organizations. Common themes were identified through an interpretative phenomenological analytical framework, providing an in-depth examination of participants' lived experiences. Results: The majority of participants did not perceive a health risk from exposure to lead. Those who reported a specific concern about their exposure to lead had lived in the community for less than five years. However, it was commonly noted that the behaviors to control residents' exposure to lead were easily performed and low cost. Conclusions: These results suggest that residents of a community chronically exposed to lead become complacent about their risk of poor health outcomes the longer they live in the community. These findings have implications for communicating the risk of lead exposure to adults who are chronically exposed.

    Identification of major factors influencing ELISpot-based monitoring of cellular responses to antigens from Mycobacterium tuberculosis

    A number of different interferon-γ ELISpot protocols are in use in laboratories studying antigen-specific immune responses. It is therefore unclear how results from different assays compare, and which factors most significantly influence assay outcome. One such difference is that some laboratories use a short in vitro stimulation period before cells are transferred to the ELISpot plate; this is commonly done with frozen cells in order to enhance assay sensitivity. Other potentially significant differences include the antibody coating of plates, the use of media with or without serum, the serum source and the number of cells added to the wells. The aim of this paper was to identify which components of the different ELISpot protocols influenced assay sensitivity and inter-laboratory variation. Four laboratories provided protocols for quantifying the number of interferon-γ spot-forming cells in human peripheral blood mononuclear cells stimulated with Mycobacterium tuberculosis-derived antigens. The differences in the protocols were compared directly. We found that several sources of variation in assay protocols can be eliminated, for example by avoiding serum supplementation and using AIM-V serum-free medium. In addition, the number of cells added to ELISpot wells should be standardised. Importantly, delays in peripheral blood mononuclear cell processing before stimulation had a marked effect on the number of detectable spot-forming cells; processing delay should therefore be minimised as well as standardised. Finally, a pre-stimulation culture period improved the sensitivity of the assay; however, this effect may be both antigen and donor dependent. In conclusion, small differences in ELISpot protocols in routine use can affect the results obtained, and care should be given to the conditions selected for use in a given study. A pre-stimulation step may improve the sensitivity of the assay, particularly when cells have been previously frozen.

    Mapping Learning and Game Mechanics for Serious Games Analysis in Engineering Education

    In a world where students are increasingly digitally tethered to powerful, ‘always on’ mobile devices, new models of engagement and approaches to teaching and learning are required from educators. Serious Games (SG) have proven instructional potential, but there is still a lack of methodologies and tools not only for their design but also to support game analysis and assessment. This paper explores the use of SG to increase student engagement and retention. The development phase of the Circuit Warz game is presented to demonstrate how electronic engineering education can be radically reimagined to create immersive, highly engaging learning experiences that are problem-centered and pedagogically sound. The Learning Mechanics–Game Mechanics (LM-GM) framework for SG analysis is introduced, and its practical use in an educational game design scenario is shown as a case study.

    Low knowledge and awareness of monoclonal gammopathy of undetermined significance (MGUS) among general practitioners

    Acknowledgements: The authors would like to take this opportunity to thank the organisers of the WONCA Europe 2017 conference and the General Practitioners/Trainees for participating in this study. Funding: At the time of writing, Dr Charlene McShane was in receipt of a Cancer Research UK Population Science Postdoctoral Research Fellowship (C51094/A18267). Peer reviewed.

    Criteria for the use of omics-based predictors in clinical trials.

    The US National Cancer Institute (NCI), in collaboration with scientists representing multiple areas of expertise relevant to 'omics'-based test development, has developed a checklist of criteria that can be used to determine the readiness of omics-based tests for guiding patient care in clinical trials. The checklist criteria cover issues relating to specimens, assays, mathematical modelling, clinical trial design, and ethical, legal and regulatory aspects. Funding bodies and journals are encouraged to consider the checklist, which they may find useful for assessing study quality and evidence strength. The checklist will be used to evaluate proposals for NCI-sponsored clinical trials in which omics tests will be used to guide therapy.

    Age of second language acquisition affects nonverbal conflict processing in children : an fMRI study

    Background: In their daily communication, bilinguals switch between two languages, a process that involves selecting a target language and minimizing interference from the nontarget language. Previous studies have uncovered the neural structures in bilinguals and the activation patterns associated with performing verbal conflict tasks. One question that remains, however, is whether this extra verbal switching affects brain function during nonverbal conflict tasks. Methods: In this study, we used fMRI to investigate the impact of bilingualism in children performing two nonverbal tasks involving stimulus-stimulus and stimulus-response conflicts. Three groups of 8-11-year-old children - bilinguals from birth (2L1), second language learners (L2L), and a control group of monolinguals (1L1) - were scanned while performing a color Simon and a numerical Stroop task. Reaction times and accuracy were logged. Results: Compared to monolingual controls, bilingual children showed a higher behavioral congruency effect in these tasks, matched by the recruitment of brain regions generally used for cognitive control and language processing, or to resolve language conflict situations in bilinguals (caudate nucleus, posterior cingulate gyrus, STG, precuneus). Further, the activation of these areas was higher in 2L1 than in L2L children. Conclusion: The coupling of longer reaction times with the recruitment of extra language-related brain areas supports the hypothesis that, when dealing with language conflicts, the specialization of bilinguals hampers the way they process nonverbal conflicts, at least at early stages in life.