
    Salt enhanced solvent relaxation and particle surface area determination via rapid spin-lattice NMR

    This paper demonstrates the influence of surface charge chemistry on the application of nuclear magnetic relaxation measurements (NMR relaxometry) for the in situ determination of particle surface area in the presence of high electrolyte concentration. Specifically, dispersions of titania, calcite and silica with and without 1 M KCl were investigated. The addition of salt showed no significant change in relaxation measurements for titanium dioxide; however, a significant rate enhancement was observed for both the calcite and silica systems. These differences were attributed to counterion layers forming as a result of the particles' surface charge, leading to an increase in the relaxation rate of bound surface-layer water. The changes were more pronounced in the silica systems, owing to their greater surface charge. No enhancement was observed for titania, which was assumed to be at its isoelectric point, with no resulting counterion layer formation. Solvent relaxation was further used to successfully determine the surface area of particles in a dispersion using a silica standard reference material, with results compared to Brunauer-Emmett-Teller (BET) and spherical-equivalent estimations. Two dispersions of titanium dioxide, of different crystal phases, gave NMR surface area measurements in good agreement with BET, showing that the technique can measure changes in surface area when surface chemistry remains relatively similar, the reference silica material also being an oxide. In contrast, the NMR technique appeared to overestimate the calcite surface areas relative to BET, which was attributed both to better dispersion of nanocrystallites in the liquid state and to potential ion enhancement from the solubility of the calcite.
These results highlight the potential of this technique as a fast, non-destructive and non-invasive method for dispersion analysis, but also show the competition between surface area and surface chemistry interactions in determining measured relaxation rates.
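The surface-area determination described above can be sketched with the commonly used specific relaxation rate, R_sp = R_dispersion / R_bulk - 1, which scales with the wetted particle surface area per unit liquid; calibrating against a reference of known surface area (here, the silica standard) then yields the unknown area. A minimal sketch, assuming equal surface relaxivity between sample and reference; all numbers are hypothetical, not values from the paper:

```python
def specific_relaxation_rate(r_dispersion, r_bulk):
    """Specific relaxation rate R_sp = R_disp / R_bulk - 1 (dimensionless)."""
    return r_dispersion / r_bulk - 1.0

def surface_area_from_nmr(r_sp_sample, phi_sample, r_sp_ref, phi_ref, s_ref):
    """Estimate the specific surface area (m^2/g) of a dispersed powder by
    ratio to a reference material of known surface area s_ref, assuming
    equal surface relaxivity.  phi_* are particle-to-liquid mass ratios."""
    return s_ref * (r_sp_sample / phi_sample) / (r_sp_ref / phi_ref)

# Hypothetical example: silica reference with a known BET area of 200 m^2/g
s = surface_area_from_nmr(r_sp_sample=0.8, phi_sample=0.05,
                          r_sp_ref=1.2, phi_ref=0.04, s_ref=200.0)
```

The equal-relaxivity assumption built into this ratio is exactly the caveat the abstract raises: a chemically dissimilar sample such as calcite, measured against an oxide reference, can be over- or under-estimated.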

    Switching middle school teachers onto STEM using a pedagogical framework for technology integration: The case for High Possibility Classrooms in Australia

    Education in STEM (Science, Technology, Engineering and Mathematics) is a significant issue for governments and organizations across the world as concerns are expressed about students’ lack of progress in these areas. In Australia, middle school teachers’ capacity and confidence in teaching the STEM disciplines have been identified as wanting. The paper draws on findings from a study that used a pedagogical framework for technology-enhanced learning to develop integrated STEM units of work. Analysis of the findings illustrates that the High Possibility Classrooms framework builds teacher agency in STEM and that being involved in professional development conducted as a research experience is beneficial. The paper argues for greater resourcing of teacher professional development in schools to make STEM an education priority, and it concludes by recommending that more middle school teachers consider pedagogical scaffolds to integrate curriculum and enhance their professional knowledge in STEM.

    Case study: Technology-enhanced learning in High Possibility Classrooms in Australian schools

    Research conducted in the classrooms of exemplary teachers in Australian schools is published as a collection of case studies in a new book on technology-enhanced learning. Understanding what makes an effective case study for practitioners to reflect upon to change classroom teaching is important. In the doctoral research that inspired the assemblage of case studies in the book, an additional process of cross-case analysis was used to bring participants together for a deeper understanding of the study phenomena. An all-day workshop held at the conclusion of the data-gathering period allowed participants not only to meet each other for the first time but also to discuss, interpret, and analyze case summaries prepared by the researcher prior to writing the final case narratives. Carefully prepared case summaries add another layer of understanding to research findings, and this is necessary in organizing published exemplar case studies of teachers' pedagogical practices in schools. In this moment, participants in a study who had often worked in isolation within their own contexts reflected on and drew comfort from understanding how other “tech-savvy teachers” worked in both similar and different ways when they finally came together in the workshop. This case study attends to the usual processes in case study methods, but also demonstrates how ensuring validity and reliability in analysis, through member checks, software for staged coding, and a “collective member check” in the format of a day-long workshop, supports building a rich picture of the phenomenon studied.

    Infrared composition of the Large Magellanic Cloud

    The evolution of galaxies and the history of star formation in the Universe are among the most important topics in today's astrophysics. In particular, the role of small, irregular galaxies in the star-formation history of the Universe is not yet clear. Using data from the AKARI IRC survey of the Large Magellanic Cloud at 3.2, 7, 11, 15, and 24 μm wavelengths, i.e., in the near- and mid-infrared, we have constructed a multiwavelength catalog containing data from a cross-correlation with a number of other databases at different wavelengths. We present the separation of different classes of stars in the LMC in color-color and color-magnitude diagrams, and analyze their contribution to the total LMC flux related to point sources at different infrared wavelengths.

    Wide variation in susceptibility of transmitted/founder HIV-1 subtype C Isolates to protease inhibitors and association with in vitro replication efficiency

    © 2016 The Author(s). The gag gene is highly polymorphic across HIV-1 subtypes and contributes to susceptibility to protease inhibitors (PIs), a critical class of antiretrovirals that will be used in up to 2 million individuals as second-line therapy in sub-Saharan Africa by 2020. Given that subtype C represents around half of all HIV-1 infections globally, we examined PI susceptibility in subtype C viruses from treatment-naïve individuals. PI susceptibility was measured in a single-round infection assay of full-length, replication-competent MJ4/gag chimeric viruses, encoding the gag gene and 142 nucleotides of pro derived from viruses in 20 patients in the Zambia-Emory HIV Research Project acute infection cohort. Ten-fold variation in susceptibility to the PIs atazanavir and lopinavir was observed across the 20 viruses, with EC50s ranging from 0.71 to 6.95 nM for atazanavir and from 0.64 to 8.54 nM for lopinavir. Ten amino acid residues in Gag correlated with lopinavir EC50 (p < 0.01), of which 380K and 389I showed modest impacts on in vitro drug susceptibility. Finally, a significant relationship between drug susceptibility and replication capacity was observed for atazanavir and lopinavir but not darunavir. Our findings demonstrate large variation in susceptibility of PI-naïve subtype C viruses that appears to correlate with replication efficiency and could impact clinical outcomes.
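EC50 values such as those above are typically extracted by fitting a sigmoidal dose-response (Hill) curve to inhibition measured across a drug dilution series. A minimal sketch in pure Python using a coarse grid search; the dilution series and parameter values are hypothetical, and a real analysis would use nonlinear least squares with confidence intervals:

```python
def hill(dose, ec50, slope):
    """Fraction of infection inhibited at a given dose (Hill model)."""
    return dose ** slope / (ec50 ** slope + dose ** slope)

def fit_ec50(doses, inhibition, slopes=(0.5, 1.0, 1.5, 2.0)):
    """Coarse grid search for the (ec50, slope) minimising squared error."""
    candidates = [10 ** (e / 10.0) for e in range(-20, 21)]  # 0.01-100 nM
    best = None
    for ec50 in candidates:
        for s in slopes:
            sse = sum((hill(d, ec50, s) - y) ** 2
                      for d, y in zip(doses, inhibition))
            if best is None or sse < best[0]:
                best = (sse, ec50, s)
    return best[1], best[2]

# Hypothetical dilution series generated from a lopinavir-like response
doses = [0.1, 0.5, 1.0, 2.0, 5.0, 10.0, 50.0]
inhib = [hill(d, 2.0, 1.0) for d in doses]
ec50, slope = fit_ec50(doses, inhib)  # recovers an EC50 near 2 nM
```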

    Understanding innovators' experiences of barriers and facilitators in implementation and diffusion of healthcare service innovations: A qualitative study

    This article is made available through the Brunel Open Access Publishing Fund - Copyright @ 2011 Barnett et al. Background: Healthcare service innovations are considered to play a pivotal role in improving organisational efficiency and responding effectively to healthcare needs. Nevertheless, healthcare organisations encounter major difficulties in sustaining and diffusing innovations, especially those which concern the organisation and delivery of healthcare services. The purpose of the present study was to explore how healthcare innovators of process-based initiatives perceived and made sense of factors that either facilitated or obstructed innovation implementation and diffusion. Methods: A qualitative study was designed. Fifteen primary and secondary healthcare organisations in the UK, which had received health service awards for successfully generating and implementing service innovations, were studied. In-depth, semi-structured interviews were conducted with the organisational representatives who conceived and led the development process. The data were recorded, transcribed and thematically analysed. Results: Four main themes were identified in the analysis of the data: the role of evidence, the function of inter-organisational partnerships, the influence of human-based resources, and the impact of contextual factors. "Hard" evidence operated as a proof of effectiveness, a means of dissemination and a prerequisite for the initiation of innovation. Inter-organisational partnerships and people-based resources, such as champions, were considered an integral part of the process of developing, establishing and diffusing the innovations. Finally, contextual influences, both intra-organisational and extra-organisational, were seen as critical in either impeding or facilitating innovators' efforts.
Conclusions: Innovators pointed to a range of factors, in different combinations and co-occurrences, as they reflected on their experiences of implementing, stabilising and diffusing novel service initiatives. Even though the innovations studied varied in content and originated from diverse organisational contexts, innovators' accounts converged on the significant role of an evidential base of success, inter-personal and inter-organisational networks, and the inner and outer context. The innovators, who often acted as important champions themselves and were frequently willing to lead constructive implementation efforts in different contexts, can contribute significantly to the promulgation and spread of the novelties. This research was supported financially by the Multidisciplinary Assessment of Technology Centre for Healthcare (MATCH).

    A meta-analytic review of stand-alone interventions to improve body image

    Objective: Numerous stand-alone interventions to improve body image have been developed. The present review used meta-analysis to estimate the effectiveness of such interventions and to identify the specific change techniques that lead to improvement in body image. Methods: The inclusion criteria were that (a) the intervention was stand-alone (i.e., solely focused on improving body image), (b) a control group was used, (c) participants were randomly assigned to conditions, and (d) at least one pretest and one posttest measure of body image were taken. Effect sizes were meta-analysed and moderator analyses were conducted. A taxonomy of 48 change techniques used in interventions targeted at body image was developed; all interventions were coded using this taxonomy. Results: The literature search identified 62 tests of interventions (N = 3,846). Interventions produced a small-to-medium improvement in body image (d+ = 0.38), a small-to-medium reduction in beauty-ideal internalisation (d+ = -0.37), and a large reduction in social comparison tendencies (d+ = -0.72). However, the effect size for body image was inflated by bias both within and across studies, and was reliable but of small magnitude once corrections for bias were applied. Effect sizes for the other outcomes were no longer reliable once corrections for bias were applied. Several features of the sample, intervention, and methodology moderated intervention effects. Twelve change techniques were associated with improvements in body image, and three techniques were contra-indicated. Conclusions: The findings show that interventions engender only small improvements in body image, and underline the need for large-scale, high-quality trials in this area. The review identifies effective techniques that could be deployed in future interventions.
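Pooled effect sizes like the d+ values reported above are conventionally obtained by inverse-variance weighting of per-study standardised mean differences. A minimal fixed-effect sketch (the review itself may use a random-effects model; the study values below are hypothetical):

```python
def pooled_effect(effects, variances):
    """Fixed-effect meta-analytic estimate: inverse-variance weighted mean
    of per-study effect sizes, with its standard error."""
    weights = [1.0 / v for v in variances]
    d_plus = sum(w * d for w, d in zip(weights, effects)) / sum(weights)
    se = (1.0 / sum(weights)) ** 0.5
    return d_plus, se

# Hypothetical per-study Cohen's d values and their sampling variances
effects = [0.25, 0.50, 0.40, 0.30]
variances = [0.02, 0.05, 0.03, 0.04]
d_plus, se = pooled_effect(effects, variances)
```

Precise studies (small variances) dominate the weighted mean, which is why a few large trials can pull d+ well away from the unweighted average.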

    Long term time variability of cosmic rays and possible relevance to the development of life on Earth

    An analysis is made of the manner in which the cosmic ray intensity at Earth has varied over its existence, and of its possible relevance to both the origin and the evolution of life. Much of the analysis relates to 'high energy' cosmic rays (E > 10^14 eV = 0.1 PeV) and their variability due to the changing proximity of the solar system to supernova remnants, which are generally believed to be responsible for most cosmic rays up to PeV energies. It is pointed out that, on a statistical basis, there will have been considerable variations in the likely 100 My between the Earth's biosphere reaching reasonable stability and the onset of very elementary life. Interestingly, there is an increasingly strong possibility that PeV cosmic rays are responsible for the initiation of terrestrial lightning strokes, and the possibility arises of considerable increases in the frequency of lightning and thereby in the formation of some of the complex molecules which are the 'building blocks of life'. Attention is also given to the well-known generation by lightning strokes of oxides of nitrogen, which are poisonous to animal life but helpful to plant growth; here, too, violent swings of cosmic ray intensity may have had relevance to evolutionary changes. A particular variant of the cosmic ray acceleration model, put forward by us, predicts an increase in the lightning rate in the past, and this has been sought in Korean historical records. Finally, the time dependence of the overall cosmic ray intensity, which manifests itself mainly at sub-10 GeV energies, has been examined. The relevance of cosmic rays to the 'global electrical circuit' points to the importance of this concept. Comment: 18 pages, 5 figures, accepted by 'Surveys in Geophysics'.
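The "considerable variations ... on a statistical basis" follow from Poisson statistics of supernovae occurring close enough to boost the local PeV flux. A hedged sketch; the encounter rate and time window are illustrative round numbers, not values taken from the paper:

```python
import math

def prob_at_least_one(rate_per_my, interval_my):
    """Poisson probability of at least one nearby supernova in the interval."""
    mean = rate_per_my * interval_my  # expected number of encounters
    return 1.0 - math.exp(-mean)

# Illustrative: if supernovae near enough to matter occur about once per
# 50 My, then over a 100 My window the chance of at least one encounter is
p = prob_at_least_one(rate_per_my=1.0 / 50.0, interval_my=100.0)
```

Equally, the same Poisson spread means some 100 My windows see no encounter at all, which is the source of the large intensity variations the abstract invokes.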

    Stem cell differentiation increases membrane-actin adhesion regulating cell blebability, migration and mechanics

    This work is licensed under a Creative Commons Attribution 4.0 International License. The images or other third party material in this article are included in the article’s Creative Commons license, unless indicated otherwise in the credit line; if the material is not included under the Creative Commons license, users will need to obtain permission from the license holder in order to reproduce the material. To view a copy of this license, visit http://creativecommons.org/licenses/by/4.0/. K. S. is funded by an EPSRC PhD studentship. S. T. is funded by an EU Marie Curie Intra European Fellowship (GENOMICDIFF).

    Diffuse Gamma Rays: Galactic and Extragalactic Diffuse Emission

    "Diffuse" gamma rays consist of several components: truly diffuse emission from the interstellar medium, the extragalactic background, whose origin is not firmly established yet, and the contribution from unresolved and faint Galactic point sources. One approach to unravel these components is to study the diffuse emission from the interstellar medium, which traces the interactions of high energy particles with interstellar gas and radiation fields. Because of its origin such emission is potentially able to reveal much about the sources and propagation of cosmic rays. The extragalactic background, if reliably determined, can be used in cosmological and blazar studies. Studying the derived "average" spectrum of faint Galactic sources may be able to give a clue to the nature of the emitting objects.Comment: 32 pages, 28 figures, kapproc.cls. Chapter to the book "Cosmic Gamma-Ray Sources," to be published by Kluwer ASSL Series, Edited by K. S. Cheng and G. E. Romero. More details can be found at http://www.gamma.mpe-garching.mpg.de/~aws/aws.htm