
    Use of strategies to improve retention in primary care randomised trials: a qualitative study with in-depth interviews

    Objective To explore the strategies used to improve retention in primary care randomised trials.
    Design Qualitative in-depth interviews and thematic analysis.
    Participants 29 UK primary care chief and principal investigators, trial managers and research nurses.
    Methods In-depth face-to-face interviews.
    Results Primary care researchers used incentive and communication strategies to improve retention in trials, but were unsure of their effect. Small monetary incentives were used to increase response to postal questionnaires. Non-monetary incentives were also used, although there was scepticism about their impact on retention. Nurses routinely used telephone communication to encourage participants to return for trial follow-up. Trial managers used first class post, shorter questionnaires and improved questionnaire designs with the aim of improving questionnaire response. Interviewees thought an open trial design could lead to biased results and were negative about using behavioural strategies to improve retention. There was consensus among the interviewees that effective communication and rapport with participants, participant altruism, respect for participants' time, flexibility of trial personnel and appointment schedules, and trial information improve retention. Interviewees noted particular challenges with retention in mental health trials and those involving teenagers.
    Conclusions The findings of this qualitative study have allowed us to reflect on research practice around retention and highlight a gap between such practice and current evidence. Interviewees described acting from experience without evidence from the literature, which supports the use of small monetary incentives to improve questionnaire response. No such evidence exists for non-monetary incentives or first class post, the use of which may need reconsideration. An exploration of barriers and facilitators to retention in other research contexts may be justified.

    Meta-analytical methods to identify who benefits most from treatments: daft, deluded, or deft approach?

    Identifying which individuals benefit most from particular treatments or other interventions underpins so-called personalised or stratified medicine. However, single trials are typically underpowered for exploring whether participant characteristics, such as age or disease severity, determine an individual's response to treatment. A meta-analysis of multiple trials, particularly one where individual participant data (IPD) are available, provides greater power to investigate interactions between participant characteristics (covariates) and treatment effects. We use a published IPD meta-analysis to illustrate three broad approaches used for testing such interactions. Based on another systematic review of recently published IPD meta-analyses, we also show that all three approaches can be applied to aggregate data as well as IPD. We also summarise which methods of analysing and presenting interactions are in current use, and describe their advantages and disadvantages. We recommend that testing for interactions using within-trials information alone (the deft approach) becomes standard practice, alongside graphical presentation that directly visualises this information.
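    The within-trials ("deft") approach recommended above can be sketched numerically: each trial contributes its own treatment-covariate interaction estimate, and these are pooled with an inverse-variance weighted (fixed-effect) meta-analysis so that no across-trial information contaminates the interaction. The estimates and standard errors below are hypothetical, purely for illustration.

```python
import numpy as np

# Hypothetical within-trial treatment-covariate interaction estimates
# (e.g. on a log odds ratio scale) and their standard errors, five trials.
interactions = np.array([0.12, -0.05, 0.20, 0.08, 0.15])
ses = np.array([0.10, 0.15, 0.12, 0.09, 0.20])

# Fixed-effect (inverse-variance) pooling of the within-trial interactions.
w = 1.0 / ses**2
pooled = np.sum(w * interactions) / np.sum(w)
pooled_se = np.sqrt(1.0 / np.sum(w))
z = pooled / pooled_se  # test statistic for the pooled interaction
```

    Because only within-trial contrasts enter the pooled estimate, this test is protected from ecological (across-trial) confounding, which is the point of the deft approach.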

    Bayesian Parameter Estimation for Latent Markov Random Fields and Social Networks

    Undirected graphical models are widely used in statistics, physics and machine vision. However, Bayesian parameter estimation for undirected models is extremely challenging, since evaluation of the posterior typically involves the calculation of an intractable normalising constant. This problem has received much attention, but very little of it has focussed on the important practical case where the data consist of noisy or incomplete observations of the underlying hidden structure. This paper specifically addresses this problem, comparing two alternative methodologies. In the first of these approaches, particle Markov chain Monte Carlo (Andrieu et al., 2010) is used to efficiently explore the parameter space, combined with the exchange algorithm (Murray et al., 2006) to avoid the calculation of the intractable normalising constant (a proof showing that this combination targets the correct distribution is found in a supplementary appendix online). This approach is compared with approximate Bayesian computation (Pritchard et al., 1999). Applications to estimating the parameters of Ising models and exponential random graphs from noisy data are presented. Each algorithm used in the paper targets an approximation to the true posterior, due to the use of MCMC to simulate from the latent graphical model in lieu of being able to do this exactly in general. The supplementary appendix also describes the nature of the resulting approximation. Comment: 26 pages, 2 figures; accepted in Journal of Computational and Graphical Statistics (http://www.amstat.org/publications/jcgs.cfm).
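    The exchange algorithm mentioned above admits a compact illustration on a toy Ising model. The sketch below is not the paper's implementation: it replaces the inner simulation with exact sampling by brute-force enumeration (feasible only for a tiny lattice, and exactly the step the paper approximates with MCMC), assumes a single interaction parameter with a flat prior, and all variable names are hypothetical.

```python
import itertools
import numpy as np

rng = np.random.default_rng(0)
n = 9  # 3x3 Ising lattice, small enough to enumerate all 2**9 states

# nearest-neighbour edges on a 3x3 grid (horizontal and vertical pairs)
edges = [(i, j) for i in range(n) for j in range(n)
         if (j == i + 1 and j % 3 != 0) or j == i + 3]

states = np.array(list(itertools.product([-1, 1], repeat=n)))

def suff_stat(x):
    """Sufficient statistic: sum of s_i * s_j over neighbouring pairs."""
    return sum(x[i] * x[j] for i, j in edges)

stats = np.array([suff_stat(x) for x in states])

def sample_exact(theta):
    """Exact draw from the Ising model by enumeration (toy-sized only)."""
    p = np.exp(theta * stats)
    p /= p.sum()
    return states[rng.choice(len(states), p=p)]

def exchange_step(theta, y, step=0.3):
    """One exchange move (Murray et al., 2006) targeting p(theta | y):
    the intractable constants Z(theta) cancel in the acceptance ratio."""
    theta_prop = theta + step * rng.normal()
    w = sample_exact(theta_prop)  # auxiliary data drawn at the proposal
    log_alpha = (theta_prop - theta) * (suff_stat(y) - suff_stat(w))
    if np.log(rng.uniform()) < log_alpha:
        return theta_prop
    return theta

# simulate "observed" data at a known theta and run a short chain
theta_true = 0.4
y = sample_exact(theta_true)
theta = 0.0
chain = []
for _ in range(2000):
    theta = exchange_step(theta, y)
    chain.append(theta)
```

    The acceptance ratio uses only unnormalised likelihoods evaluated at the observed and auxiliary data, which is why no normalising constant is ever computed; the paper's contribution is making this workable when y is itself a noisy observation of a hidden field.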

    Controlling the Production of Acid Catalyzed Products of Furfural Hydrogenation by Pd/TiO2

    We demonstrate a modified sol-immobilization procedure using (MeOH)x/(H2O)1-x solvent mixtures to prepare Pd/TiO2 catalysts that are able to reduce the formation of acid catalyzed products, e.g. ethers, in the hydrogenation of furfural. Transmission electron microscopy found a significant increase in polyvinyl alcohol (PVA) deposition at the metal-support interface, and temperature programmed reduction found a reduced uptake of hydrogen, compared to an established Pd/TiO2 preparation. We propose that the additional PVA hinders hydrogen spillover onto the TiO2 support and limits the formation of Brønsted acid sites, which are required to produce ethers. Elsewhere, the new preparation route was able to successfully anchor colloidal Pd to the TiO2 surface without the need for acidification. This work demonstrates the potential for minimizing process steps as well as optimizing catalyst selectivity – both important objectives for sustainable chemistry.

    Performance Assessments of Nuclear Waste Repositories: A Dialogue on Their Value and Limitations

    Performance Assessment (PA) is the use of mathematical models to simulate the long-term behavior of engineered and geologic barriers in a nuclear waste repository; methods of uncertainty analysis are used to assess effects of parametric and conceptual uncertainties associated with the model system upon the uncertainty in outcomes of the simulation. PA is required by the U.S. Environmental Protection Agency as part of its certification process for geologic repositories for nuclear waste. This paper is a dialogue to explore the value and limitations of PA. Two “skeptics” acknowledge the utility of PA in organizing the scientific investigations that are necessary for confident siting and licensing of a repository; however, they maintain that the PA process, at least as it is currently implemented, is an essentially unscientific process with shortcomings that may provide results of limited use in evaluating actual effects on public health and safety. Conceptual uncertainties in a PA analysis can be so great that results can be confidently applied only over short time ranges, the antithesis of the purpose behind long-term, geologic disposal. Two “proponents” of PA agree that performance assessment is unscientific, but only in the sense that PA is an engineering analysis that uses existing scientific knowledge to support public policy decisions, rather than an investigation intended to increase fundamental knowledge of nature; PA has different goals and constraints than a typical scientific study. The “proponents” describe an ideal, six-step process for conducting generalized PA, here called probabilistic systems analysis (PSA); they note that virtually all scientific content of a PA is introduced during the model-building steps of a PSA; they contend that a PA based on simple but scientifically acceptable mathematical models can provide useful and objective input to regulatory decision makers. 
The value of the results of any PA must lie between these two views and will depend on the level of knowledge of the site, the degree to which models capture actual physical and chemical processes, the time over which extrapolations are made, and the proper evaluation of health risks attending implementation of the repository. The challenge is in evaluating whether the quality of the PA matches the needs of decision makers charged with protecting the health and safety of the public. Peer reviewed.
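    The uncertainty-propagation core of a performance assessment can be illustrated with a minimal Monte Carlo sketch: sample the uncertain parameters, run a simplified barrier model for each realisation, and report the distribution of the outcome to decision makers. The one-dimensional transport model, parameter ranges, and radionuclide half-life below are entirely hypothetical, chosen only to show the mechanics.

```python
import numpy as np

rng = np.random.default_rng(42)
n = 10_000  # Monte Carlo realisations

# Hypothetical parametric uncertainties for a single geologic barrier:
# groundwater velocity (m/yr) and chemical retardation factor.
velocity = rng.lognormal(mean=np.log(0.1), sigma=0.5, size=n)
retardation = rng.uniform(50.0, 500.0, size=n)
path_length = 500.0   # m, assumed fixed
half_life = 2.1e5     # yr, hypothetical radionuclide

# Simplified model: retarded advective travel time, then radioactive
# decay over that time gives the fraction surviving transit.
travel_time = path_length * retardation / velocity
decay = np.log(2.0) / half_life
release_fraction = np.exp(-decay * travel_time)

# Summarise the outcome distribution, as a PA would for regulators.
p50, p95 = np.percentile(release_fraction, [50, 95])
```

    Even this toy version shows the "proponents'" point: the scientific content lives in the model and parameter distributions, while the propagation itself is mechanical, and the conceptual-uncertainty critique is that no sampling scheme can repair a model whose structure is wrong.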

    National Survey of Sea Lice (Lepeophtheirus salmonis Krøyer and Caligus elongatus Nordmann) on Fish Farms in Ireland – 2004

    This bulletin reports on the National Sea Lice Monitoring Programme carried out by the Marine Institute in 2004. Results presented in this report are mean ovigerous sea lice levels and mean mobile sea lice levels for Lepeophtheirus salmonis and Caligus elongatus. Funder: Marine Institute.

    Academic freedom in Europe: time for a Magna Charta?

    This paper is a preliminary attempt to establish a working definition of academic freedom for the European Union states. The paper details why such a definition is required for the European Union and then examines some of the difficulties of defining academic freedom. By drawing upon experience of the legal difficulties besetting the concept in the USA, and building on previous analyses of constitutional and legislative protection for academic freedom and of legal regulations concerning institutional governance and academic tenure, a working definition of academic freedom is then derived. The resultant definition which, it is suggested, could form the basis for a European Magna Charta Libertatis Academicae, goes beyond traditional discussions of academic freedom by specifying not only the rights inherent in the concept but also its accompanying duties, necessary limitations and safeguards. The paper concludes with proposals for how the definition might be tested and carried forward.

    National Survey of the Sea Lice (Lepeophtheirus salmonis Krøyer and Caligus elongatus Nordmann) on Fish Farms in Ireland - 2003

    Sea lice are regarded as having the most commercially damaging effect on cultured salmon in the world, causing major economic losses to the fish farming community each year (Bristow and Berland, 1991; Jackson and Costello, 1991). They affect salmon in a variety of ways: by reducing fish growth; by causing loss of scales, which leaves the fish open to secondary infections (Wootten et al., 1982); and by damaging the fish, which reduces its marketability. The two species of sea lice found on cultured salmonids in Ireland are Caligus elongatus Nordmann, a parasite that infests over 80 different species of marine fish, and Lepeophtheirus salmonis Krøyer, which infests only salmon and other salmonids. L. salmonis is regarded as the more serious parasite of the two species and has been found to occur most frequently on farmed salmon (Jackson and Minchin, 1992). Most of the damage caused by these parasites is thought to be mechanical, inflicted during the course of attachment and feeding (Kabata, 1974; Brandal et al., 1976; Jones et al., 1990). Inflammation and hyperplasia (enlargement caused by an abnormal increase in the number of cells in an organ or tissue) have been recorded in Atlantic salmon in response to infections with L. salmonis (Jones et al., 1990; Jonsdottir et al., 1992; Nolan et al., 2000). Increases in stress hormones caused by sea lice infestations have been suggested to increase the susceptibility of fish to infectious diseases (MacKinnon, 1998). Severe erosion around the head caused by heavy infestations of L. salmonis has been recorded previously (Pike, 1989; Berland, 1993). This is thought to occur because of the rich supply of mucus secreted by mucous cell-lined ducts in that region (Nolan et al., 1999). In experimental and field investigations carried out in Norway, heavy infestation was found to cause fish mortalities (Finstad et al., 2000).

    The electronic structure, surface properties, and in situ N2O decomposition of mechanochemically synthesised LaMnO3

    The use of mechanochemistry to prepare catalytic materials is of significant interest; it offers an environmentally beneficial, solvent-free route and produces highly complex structures of mixed amorphous and crystalline phases. This study reports on the effect of milling atmosphere, either air or argon, on mechanochemically prepared LaMnO3 and its catalytic performance towards N2O decomposition (deN2O). In this work, high energy resolution fluorescence detection (HERFD), X-ray absorption near edge structure (XANES), X-ray emission, and X-ray photoelectron spectroscopy (XPS) have been used to probe the electronic structural properties of the mechanochemically prepared materials. Moreover, in situ studies using near ambient pressure (NAP)-XPS, to follow the materials during catalysis, and high pressure energy dispersive EXAFS studies, to mimic the preparation conditions, have also been performed. The studies show that there are clear differences between the air and argon milled samples, with the most pronounced changes observed using NAP-XPS. The XPS results find increased levels of active adsorbed oxygen species, linked to the presence of surface oxide vacancies, for the sample prepared in argon. Furthermore, the argon milled LaMnO3 shows improved catalytic activity towards deN2O at lower temperatures compared to the air milled and sol-gel synthesised LaMnO3. Assessing this improved catalytic behaviour of argon milled LaMnO3 during deN2O by in situ NAP-XPS suggests increased interaction of N2O at room temperature within the O 1s region. This study further demonstrates the complexity of mechanochemically prepared materials and shows how, through careful choice of characterisation methods, their properties can be understood.
