    From theory to 'measurement' in complex interventions: methodological lessons from the development of an e-health normalisation instrument

    Background: Although empirical and theoretical understanding of processes of implementation in health care is advancing, the translation of theory into structured measures that capture the complex interplay between interventions, individuals, and context remains limited. This paper aimed to (1) describe the process and outcome of a project to develop a theory-based instrument for measuring implementation processes relating to e-health interventions; and (2) identify key issues and methodological challenges for advancing work in this field.

    Methods: A 30-item instrument, the Technology Adoption Readiness Scale (TARS), for measuring normalisation processes in the context of e-health service interventions was developed on the basis of Normalization Process Theory (NPT). NPT focuses on how new practices become routinely embedded within social contexts. The instrument was pre-tested in two health care settings in which e-health (electronic facilitation of healthcare decision-making and practice) was used by health care professionals.

    Results: The developed instrument was pre-tested in two professional samples (N = 46; N = 231). Ratings of items representing normalisation 'processes' were significantly related to staff members' perceptions of whether or not e-health had become 'routine'. Key methodological challenges are discussed in relation to: translating multi-component theoretical constructs into simple questions; developing and choosing appropriate outcome measures; conducting multiple-stakeholder assessments; instrument and question framing; and more general issues for instrument development in practice contexts.

    Conclusions: To develop theory-derived measures of implementation process for progressing research in this field, four key recommendations are made relating to (1) greater attention to underlying theoretical assumptions and the extent of translation work required; (2) the need for appropriate but flexible approaches to outcome measurement; (3) representation of the multiple perspectives and collaborative nature of the work; and (4) an emphasis on generic measurement approaches that can be flexibly tailored to particular contexts of study.
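
    The kind of pre-test analysis described here, relating scores on normalisation 'process' items to staff perceptions of routine use, can be sketched as follows. This is a minimal illustration only: the simulated data, variable names, and the choice of a point-biserial correlation are assumptions, not the authors' actual analysis.

```python
# Minimal sketch: relate ratings on normalisation 'process' items to a
# binary 'routine use' outcome. The simulated data and the choice of a
# point-biserial correlation are illustrative assumptions only.
import numpy as np
from scipy.stats import pointbiserialr

rng = np.random.default_rng(0)

n_staff, n_items = 46, 30          # first pre-test sample; 30-item instrument
ratings = rng.integers(1, 6, size=(n_staff, n_items))   # 1-5 Likert responses
routine = rng.integers(0, 2, size=n_staff)              # 1 = e-health 'routine'

# Summarise each respondent's normalisation 'process' score ...
process_score = ratings.mean(axis=1)

# ... and test whether it relates to perceived routinisation.
r, p = pointbiserialr(routine, process_score)
print(f"point-biserial r = {r:.2f}, p = {p:.3f}")
```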

    Evaluation of a Method of Estimating Agricultural Chemical Use

    This research evaluated the validity of an economic-based measure of agricultural chemical use on specific crop types. Estimated chemical use measures, reported in a budget planning document prepared collaboratively with input from farmers, vendors, researchers, and representatives from numerous agricultural agencies, were compared to chemical use measures collected through face-to-face interviews with local farmers regarding their actual chemical application practices over the past growing season. A rural agricultural-based county in Mississippi, USA, was the study area for this project. The measures of comparison were the estimated and actual ounces of individual fungicides, herbicides, and insecticides used per acre on corn, rice, soybean, wheat, and cotton fields, and the estimated and actual total chemical load, which is the sum of all fungicides, herbicides, and insecticides used on the various crops. To obtain information regarding crop type and area of cultivated land, contemporary satellite images, overlaid with property maps, were plotted and provided for the farmers to identify their crop types and delineate their crop boundaries. The crop boundaries were digitized, and a GIS database was developed containing data for crop types, amounts of cultivated land, and chemical types and quantities used. Outcomes of this research could assist in studies requiring agricultural chemical data by using estimates generated by the USDA and other agricultural agencies as an alternative to primary data collection.
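
    The comparison described here, estimated versus actual ounces per acre of each chemical, plus the 'total chemical load' summed across chemical types, can be sketched as below. The numbers and column names are hypothetical, purely for illustration.

```python
# Illustrative sketch (hypothetical numbers) of the comparison described
# above: estimated vs. actual ounces per acre of each chemical, and the
# 'total chemical load' as the sum across fungicides, herbicides, and
# insecticides for a crop.
import pandas as pd

records = pd.DataFrame({
    "crop":      ["soybean", "soybean", "soybean"],
    "chemical":  ["fungicide A", "herbicide B", "insecticide C"],
    "estimated_oz_per_acre": [4.0, 12.0, 2.5],   # from the budget planning document
    "actual_oz_per_acre":    [3.5, 14.0, 2.0],   # from farmer interviews
})

# Per-chemical difference between the economic estimate and reported use.
records["difference"] = (records["estimated_oz_per_acre"]
                         - records["actual_oz_per_acre"])

# Total chemical load per crop: sum over all chemical types.
load = records.groupby("crop")[["estimated_oz_per_acre",
                                "actual_oz_per_acre"]].sum()
print(records)
print(load)
```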

    Chandra Observations of the Radio Galaxy 3C 445 and the Hotspot X-ray Emission Mechanism

    We present new Chandra observations of the radio galaxy 3C 445, centered on its southern radio hotspot. Our observations detect X-ray emission displaced upstream and to the west of the radio-optical hotspot. Attempts to reproduce both the observed spectral energy distribution (SED) and the displacement exclude all one-zone models. Modeling of the radio-optical hotspot spectrum suggests that the electron distribution has a low-energy cutoff or break approximately at the proton rest mass energy. The X-rays could be due to external Compton scattering of the cosmic microwave background (EC/CMB) coming from the fast (Lorentz factor Γ ≈ 4) part of a decelerating flow, but this requires a small angle between the jet velocity and the observer's line of sight (θ ≈ 14°). Alternatively, the X-ray emission can be synchrotron radiation from a separate population of electrons. This last interpretation does not require the X-ray emission to be beamed.
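
    For context, the beaming requirement in the EC/CMB interpretation follows from the relativistic Doppler factor. The relation below is the standard textbook form, not an expression quoted from the paper.

```latex
% Relativistic Doppler factor for a flow with bulk Lorentz factor \Gamma
% viewed at angle \theta (standard relation, not quoted from the paper):
\[
  \delta = \frac{1}{\Gamma \left( 1 - \beta \cos\theta \right)},
  \qquad \beta = \sqrt{1 - 1/\Gamma^{2}} .
\]
% For \Gamma \approx 4 and \theta \approx 14^{\circ} this gives
% \delta \approx 4; since the beamed EC/CMB flux scales steeply with
% \delta, the interpretation requires a small viewing angle.
```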

    Learning machines for health and beyond

    Machine learning techniques are effective for building predictive models because they are good at identifying patterns in large datasets. However, development of a model for complex real-life problems often stops at the point of publication, proof of concept, or deployment, and a model in the medical domain risks becoming obsolete as soon as patient demographics change. Because models are trained to look for patterns in the data available at a given time, their performance will not peak and remain fixed at the point of publication or even the point of deployment. Rather, data change over time, and they also change when models are transported to new settings and used with new populations. The maintenance and monitoring of predictive models post-publication is therefore crucial to guarantee their safe and effective long-term use.
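
    One way to act on this is routine post-deployment monitoring: score each new batch of labelled cases and flag degradation. The sketch below is a minimal illustration under assumed synthetic data and an arbitrary alert threshold; it is not a method proposed in the paper.

```python
# Minimal sketch of post-deployment monitoring: track a model's rolling
# performance on newly labelled cases and flag possible dataset shift.
# The model, synthetic data, and alert threshold are all assumptions.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(0)

# Train on historical data where feature 0 drives the outcome.
X_hist = rng.normal(size=(1000, 5))
y_hist = (X_hist[:, 0] + rng.normal(scale=0.5, size=1000) > 0).astype(int)
model = LogisticRegression().fit(X_hist, y_hist)

# Monitor monthly batches in which the feature-outcome relationship drifts.
for month in range(1, 7):
    angle = 0.25 * month                      # gradual concept drift
    X_new = rng.normal(size=(200, 5))
    signal = np.cos(angle) * X_new[:, 0] + np.sin(angle) * X_new[:, 1]
    y_new = (signal + rng.normal(scale=0.5, size=200) > 0).astype(int)
    auc = roc_auc_score(y_new, model.predict_proba(X_new)[:, 1])
    flag = "  <- investigate / retrain?" if auc < 0.85 else ""
    print(f"month {month}: AUC = {auc:.2f}{flag}")
```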

    Self-organising Thermoregulatory Huddling in a Model of Soft Deformable Littermates

    Thermoregulatory huddling behaviours dominate the early experiences of developing rodents, and constrain the patterns of sensory and motor input that drive neural plasticity. Huddling is a complex emergent group behaviour, thought to provide an early template for the development of adult social systems, and to constrain natural selection on metabolic physiology. However, huddling behaviours are governed by simple rules of interaction between individuals, which can be described in terms of the thermodynamics of heat exchange, and can be easily controlled by manipulation of the environment temperature. Thermoregulatory huddling thus provides an opportunity to investigate the effects of early experience on brain development in a social, developmental, and evolutionary context, through controlled experimentation. This paper demonstrates that thermoregulatory huddling behaviours can self-organise in a simulation of rodent littermates modelled as soft deformable bodies that exchange heat during contact. The paper presents a novel methodology, based on techniques in computer animation, for simulating the early sensory and motor experiences of the developing rodent.
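
    A minimal caricature of the heat-exchange rule that such models build on might look as follows. The paper's actual model uses soft deformable bodies in two dimensions; this rigid-agent, one-dimensional sketch is an illustrative simplification only.

```python
# Illustrative caricature of thermoregulatory huddling: pups exchange heat
# on contact, lose heat to the environment, and cold pups move toward the
# group. The paper's model uses soft deformable 2D bodies; this 1D sketch
# only shows the underlying thermodynamic rule.
import numpy as np

rng = np.random.default_rng(1)
n, steps = 8, 500
pos = rng.uniform(0, 10, n)       # pup positions on a line
temp = np.full(n, 37.0)           # body temperatures (deg C)
T_env, k_loss, k_contact = 10.0, 0.01, 0.05

for _ in range(steps):
    # Heat loss to a cold environment.
    temp += k_loss * (T_env - temp)
    # Conductive exchange between pups in contact (within one body width).
    for i in range(n):
        for j in range(i + 1, n):
            if abs(pos[i] - pos[j]) < 1.0:
                flow = k_contact * (temp[j] - temp[i])
                temp[i] += flow
                temp[j] -= flow
    # Simple taxis: the colder a pup, the more strongly it moves toward
    # the group centre (a crude proxy for a warmth gradient).
    centre = pos.mean()
    pos += 0.02 * (37.0 - temp) * np.sign(centre - pos)

print("final spread:", round(pos.max() - pos.min(), 2))
print("mean temperature:", round(temp.mean(), 1))
```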

    How Gaussian competition leads to lumpy or uniform species distributions

    A central model in theoretical ecology considers the competition of a range of species for a broad spectrum of resources. Recent studies have shown that essentially two different outcomes are possible. Either the species surviving competition are more or less uniformly distributed over the resource spectrum, or their distribution is 'lumped' (or 'clumped'), consisting of clusters of species with similar resource use that are separated by gaps in resource space. Which of these outcomes will occur crucially depends on the competition kernel, which reflects the shape of the resource utilization pattern of the competing species. Most models considered in the literature assume a Gaussian competition kernel. This is unfortunate, since predictions based on such a Gaussian assumption are not robust. In fact, Gaussian kernels are a border case scenario, and slight deviations from this function can lead to either uniform or lumped species distributions. Here we illustrate the non-robustness of the Gaussian assumption by simulating different implementations of the standard competition model with constant carrying capacity. In this scenario, lumped species distributions can come about by secondary ecological or evolutionary mechanisms or by details of the numerical implementation of the model. We analyze the origin of this sensitivity and discuss it in the context of recent applications of the model.
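
    A sketch of this kind of simulation is given below. The dynamics follow the standard Lotka-Volterra competition model with constant carrying capacity; the specific kernels, grid, and parameters are illustrative assumptions, not the paper's implementations.

```python
# Sketch of the standard competition model with constant carrying capacity,
#   dN_i/dt = N_i (1 - sum_j K(x_i - x_j) N_j dx),
# on a periodic niche axis. The kernel is the crux: a Gaussian is the
# border case, while e.g. a quartic-exponential kernel exp(-(d/sigma)^4)
# can yield lumped distributions. All parameters are illustrative.
import numpy as np

L = 2.0
x = np.linspace(0.0, L, 200, endpoint=False)   # niche positions
dx = x[1] - x[0]
d = x[:, None] - x[None, :]
d = (d + L / 2) % L - L / 2                    # periodic distances (no edge artifacts)
sigma = 0.15

def run(kernel, steps=20000, dt=0.1):
    K = kernel(d)
    N = 1.0 + 0.01 * np.random.default_rng(0).normal(size=x.size)
    for _ in range(steps):
        N += dt * N * (1.0 - (K @ N) * dx)     # explicit Euler step
        N = np.maximum(N, 0.0)
    return N

N_gauss = run(lambda r: np.exp(-(r / sigma) ** 2))   # Gaussian kernel
N_quart = run(lambda r: np.exp(-(r / sigma) ** 4))   # platykurtic kernel

# Lumped distributions show strong oscillations of N across niche space:
for name, N in [("gaussian", N_gauss), ("quartic", N_quart)]:
    print(f"{name}: coefficient of variation = {N.std() / N.mean():.2f}")
```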

    Gravitational waves in dynamical spacetimes with matter content in the Fully Constrained Formulation

    The Fully Constrained Formulation (FCF) of General Relativity is a novel framework introduced as an alternative to the hyperbolic formulations traditionally used in numerical relativity. The FCF equations form a hybrid elliptic-hyperbolic system of equations that includes the constraints explicitly. We present an implicit-explicit numerical algorithm to solve the hyperbolic part, whereas the elliptic sector shares its form and properties with the well-known Conformally Flat Condition (CFC) approximation. We show the stability and convergence properties of the numerical scheme with numerical simulations of vacuum solutions. We have performed the first numerical evolutions of the coupled system of hydrodynamics and Einstein equations within FCF. As a proof of principle of the viability of the formalism, we present 2D axisymmetric simulations of an oscillating neutron star. In order to simplify the analysis we have neglected the back-reaction of the gravitational waves on the dynamics, which is small (<2%) for the system considered in this work. We use spherical coordinate grids, which are well adapted for simulations of stars and allow for extended grids that marginally reach the wave zone. We have extracted the gravitational wave signature and compared it to the Newtonian quadrupole and hexadecapole formulae. Both extraction methods show agreement within the numerical errors and the approximations used (~30%).
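
    For reference, the Newtonian quadrupole formula used as a comparison point for wave extraction is, in its standard textbook form (not quoted from the paper):

```latex
% Standard Newtonian quadrupole formula: the transverse-traceless strain
% at distance D from a source with trace-free mass quadrupole Q_{ij}
% (textbook form, not quoted from the paper).
\[
  h^{\mathrm{TT}}_{ij} \simeq \frac{2G}{c^{4} D}\, \ddot{Q}_{ij}(t - D/c),
  \qquad
  Q_{ij} = \int \rho \left( x_i x_j - \tfrac{1}{3} \delta_{ij} r^{2} \right) \mathrm{d}^{3}x .
\]
```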

    Quasiparticle Spectrum of d-wave Superconductors in the Mixed State

    The quasiparticle spectrum of a two-dimensional d-wave superconductor in the mixed state, H_{c1} << H << H_{c2}, is studied both analytically and numerically using the linearized Bogoliubov-de Gennes equation. We consider various values of the "anisotropy ratio" v_F/v_Delta for the quasiparticle velocities at the Dirac points, and we examine the implications of symmetry. For a Bravais lattice of vortices, we find there is always an isolated energy-zero (Dirac point) at the center of the Brillouin zone, but for a non-Bravais lattice with two vortices per unit cell there is generally an energy gap. In both of these cases, the density of states should vanish at zero energy, in contrast with the semiclassical prediction of a constant density of states, though the latter may hold down to very low energies for large anisotropy ratios. This result is closely related to the particle-hole symmetry of the band structures in lattices with two vortices per unit cell. More complicated non-Bravais vortex lattice configurations with at least four vortices per unit cell can break the particle-hole symmetry of the linearized energy spectrum and lead to a finite density of states at zero energy.
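
    The "anisotropy ratio" enters through the linearized nodal dispersion. In standard notation (not quoted from the paper), a quasiparticle near a gap node has

```latex
% Linearized Bogoliubov-de Gennes dispersion near a d-wave gap node
% (standard zero-field form, not quoted from the paper). v_F is the Fermi
% velocity and v_Delta the gap velocity; their ratio sets the anisotropy
% of the Dirac cone that the vortex lattice then perturbs.
\[
  E(\mathbf{k}) = \pm \sqrt{ (v_F k_1)^{2} + (v_\Delta k_2)^{2} } ,
\]
% with k_1 measured from the node perpendicular to the Fermi surface and
% k_2 along it.
```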

    In Vitro Intracellular Trafficking of Virulence Antigen during Infection by Yersinia pestis

    Yersinia pestis, the causative agent of plague, encodes several essential virulence factors on a 70 kb plasmid, including the Yersinia outer proteins (Yops) and a multifunctional virulence antigen (V). V is uniquely able to inhibit the host immune response; aid in the expression, secretion, and injection of the cytotoxic Yops via a type III secretion system (T3SS)-dependent mechanism; be secreted extracellularly; and enter the host cell by a T3SS-independent mechanism, where its activity is unknown. To elucidate the intracellular trafficking and target(s) of V, time-course experiments were performed with macrophages (MΦs) infected with Y. pestis or Y. pseudotuberculosis at intervals from 5 min to 6 h. The trafficking pattern was discerned from the results of parallel microscopy, immunoblotting, and flow cytometry experiments. The MΦs were incubated with fluorescent- or gold-conjugated primary or secondary anti-V antibodies (Abs) in conjunction with organelle-associated Abs or dyes. The samples were observed for co-localization by immunofluorescence and electron microscopy. For fractionation studies, uninfected and infected MΦs were lysed and subjected to density gradient centrifugation coupled with immunoblotting with Abs to V or to organelles. Samples were also analyzed by flow cytometry after lysis and dual-staining with anti-V and anti-organelle Abs. Our findings indicate a co-localization of V with (1) endosomal proteins between 10–45 min of infection, (2) lysosomal protein(s) between 1–2 h of infection, (3) mitochondrial proteins between 2.5–3 h of infection, and (4) Golgi protein(s) between 4–6 h of infection. Further studies are being performed to determine the specific intracellular interactions and the role in pathogenesis of intracellularly localized V.