    Evidence for engraftment of human bone marrow cells in non-lethally irradiated baboons

    Background. Prior to organ harvesting, an attempt was made to modulate the donor's immune responses against prospective xenogeneic recipients by infusion of 'recipient-type' bone marrow. Methods. For this purpose, baboons conditioned with total lymphoid irradiation were given 6×10^8 unmodified human bone marrow cells/kg body weight with no subsequent treatment. Results. Animals survived until they were euthanized at 18 months. Using primers specific for the human chorionic gonadotrophin gene, the presence of human DNA was confirmed by polymerase chain reaction in the blood of one animal for up to 18 months after cell transplantation; in the other animal, xenogeneic chimerism became undetectable in the blood at 6 months after bone marrow infusion. However, tissue samples obtained from both animals at the time they were euthanized had evidence of donor (human) DNA. Additionally, the presence of donor DNA in individually harvested colonies of erythroid and myeloid lineages suggested that infused human bone marrow cells had engrafted across the xenogeneic barrier in both baboons. Conclusions. Bone marrow transplantation from human to baboon leads to establishment of chimerism and modulation of donor-specific immune reactivity, which suggests that this strategy could be reproducibly employed to create 'surrogate' tolerogenesis in prospective donors for subsequent organ transplantation across xenogeneic barriers.

    The Nature of Surface Oxides on Corrosion-Resistant Nickel Alloy Covered by Alkaline Water

    A nickel alloy with high chromium and molybdenum content was found to form a highly resistive, passive oxide layer. The donor density and the mobility of ions in the oxide layer have been determined as a function of the electrical potential when alkaline water layers are on the alloy surface, in order to account for the relative inertness of the nickel alloy in corrosive environments.
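
    The abstract does not name the measurement technique, but donor densities of passive oxide films are commonly estimated from capacitance-potential (Mott-Schottky) data. The sketch below, using hypothetical capacitance values and an assumed film permittivity, illustrates how a donor density could be extracted from the slope of 1/C² versus potential; it is a sketch of the general approach, not the paper's own analysis.

```python
# Hypothetical Mott-Schottky sketch: estimate the donor density N_D of an n-type
# passive oxide from the slope of 1/C^2 versus applied potential.
# Illustrative only -- the capacitance data and relative permittivity below
# are assumptions, not values from the paper.
import numpy as np

E_CHARGE = 1.602e-19        # elementary charge, C
EPS_0 = 8.854e-12           # vacuum permittivity, F/m
EPS_R = 12.0                # assumed relative permittivity of the oxide film

# Assumed capacitance measurements (F/m^2) at a set of electrode potentials (V)
potential = np.array([0.0, 0.1, 0.2, 0.3, 0.4, 0.5])
capacitance = np.array([9.0e-2, 7.8e-2, 7.0e-2, 6.4e-2, 6.0e-2, 5.6e-2])

# Mott-Schottky relation: 1/C^2 = 2/(e*eps_r*eps_0*N_D) * (E - E_fb - kT/e),
# so the donor density follows from the slope of 1/C^2 against E.
inv_c2 = 1.0 / capacitance**2
slope, intercept = np.polyfit(potential, inv_c2, 1)

n_donor = 2.0 / (E_CHARGE * EPS_R * EPS_0 * slope)   # donors per m^3
print(f"Estimated donor density: {n_donor:.2e} m^-3")
```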

    Improving the normalization of complex interventions: measure development based on normalization process theory (NoMAD): study protocol

    Background: Understanding implementation processes is key to ensuring that complex interventions in healthcare are taken up in practice and thus maximize intended benefits for service provision and (ultimately) care to patients. Normalization Process Theory (NPT) provides a framework for understanding how a new intervention becomes part of normal practice. This study aims to develop and validate simple generic tools derived from NPT, to be used to improve the implementation of complex healthcare interventions. Objectives: The objectives of this study are to: develop a set of NPT-based measures and formatively evaluate their use for identifying implementation problems and monitoring progress; conduct preliminary evaluation of these measures across a range of interventions and contexts, and identify factors that affect this process; explore the utility of these measures for predicting outcomes; and develop an online users' manual for the measures. Methods: A combination of qualitative (workshops, item development, user feedback, cognitive interviews) and quantitative (survey) methods will be used to develop NPT measures and to test the utility of the measures in six healthcare intervention settings. Discussion: The measures developed in the study will be available for use by those involved in planning, implementing, and evaluating complex interventions in healthcare and have the potential to enhance the chances of their implementation, leading to sustained changes in working practices.

    The politicisation of evaluation: constructing and contesting EU policy performance

    Although systematic policy evaluation has been conducted for decades and has been growing strongly within the European Union (EU) institutions and in the member states, it remains largely underexplored in the political science literature. Extant work in political science and public policy typically focuses on elements such as agenda setting, policy shaping, decision making, or implementation rather than evaluation. Although individual pieces of research on evaluation in the EU have started to emerge, most often regarding policy "effectiveness" (one criterion among many in evaluation), a more structured approach is currently missing. This special issue aims to address this gap in political science by focusing on four focal points: evaluation institutions (including rules and cultures), evaluation actors and interests (including competencies, power, roles and tasks), evaluation design (including research methods and theories, and their impact on policy design and legislation), and finally, evaluation purpose and use (including the relationships between discourse and scientific evidence, political attitudes and strategic use). The special issue considers how each of these elements contributes to an evolving governance system in the EU, where evaluation is playing an increasingly important role in decision making.

    From theory to 'measurement' in complex interventions: methodological lessons from the development of an e-health normalisation instrument

    Background: Although empirical and theoretical understanding of processes of implementation in health care is advancing, translation of theory into structured measures that capture the complex interplay between interventions, individuals and context remains limited. This paper aimed to (1) describe the process and outcome of a project to develop a theory-based instrument for measuring implementation processes relating to e-health interventions; and (2) identify key issues and methodological challenges for advancing work in this field. Methods: A 30-item instrument (Technology Adoption Readiness Scale (TARS)) for measuring normalisation processes in the context of e-health service interventions was developed on the basis of Normalization Process Theory (NPT). NPT focuses on how new practices become routinely embedded within social contexts. The instrument was pre-tested in two health care settings in which e-health (electronic facilitation of healthcare decision-making and practice) was used by health care professionals. Results: The developed instrument was pre-tested in two professional samples (N = 46; N = 231). Ratings of items representing normalisation 'processes' were significantly related to staff members' perceptions of whether or not e-health had become 'routine'. Key methodological challenges are discussed in relation to: translating multi-component theoretical constructs into simple questions; developing and choosing appropriate outcome measures; conducting multiple-stakeholder assessments; instrument and question framing; and more general issues for instrument development in practice contexts. Conclusions: To develop theory-derived measures of implementation process for progressing research in this field, four key recommendations are made relating to (1) greater attention to underlying theoretical assumptions and the extent of translation work required; (2) the need for appropriate but flexible approaches to outcome measurement; (3) representation of multiple perspectives and the collaborative nature of work; and (4) emphasis on generic measurement approaches that can be flexibly tailored to particular contexts of study.
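
    The abstract reports that process-item ratings were significantly related to staff perceptions of e-health being 'routine' without detailing the analysis. The sketch below shows one plausible way such an association could be checked, using simulated scores and a point-biserial correlation; the data, variable names, and choice of test are assumptions, not the study's methods.

```python
# Minimal, hypothetical sketch: relate a summed process score (e.g. 30 TARS items)
# to a binary staff rating of whether e-health has become 'routine'.
# The simulated data and the choice of test are assumptions, not the paper's analysis.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)

# Simulated respondents: total score on the items and a yes/no 'routine' rating
n_staff = 231
total_score = rng.normal(loc=100, scale=15, size=n_staff)
is_routine = (total_score + rng.normal(scale=20, size=n_staff)) > 105

# Point-biserial correlation between the continuous score and the binary rating
r, p_value = stats.pointbiserialr(is_routine.astype(int), total_score)
print(f"point-biserial r = {r:.2f}, p = {p_value:.3g}")
```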

    Effects of gestational age at birth on cognitive performance: a function of cognitive workload demands

    Objective: Cognitive deficits have been inconsistently described for late or moderately preterm children but are consistently found in very preterm children. This study investigates the association between cognitive workload demands of tasks and cognitive performance in relation to gestational age at birth. Methods: Data were collected as part of a prospective, geographically defined whole-population study of neonatal at-risk children in Southern Bavaria. At 8;5 years, n = 1326 children (gestation range: 23–41 weeks) were assessed with the K-ABC and a Mathematics Test. Results: Cognitive scores of preterm children decreased as cognitive workload demands of tasks increased. The relationship between gestation and task performance was curvilinear and more pronounced the higher the cognitive workload: GA² (quadratic term) on low cognitive workload: R² = .02, p < 0.001; moderate cognitive workload: R² = .09, p < 0.001; and high cognitive workload tasks: R² = .14, p < 0.001. Specifically, disproportionately lower scores were found for very (<32 weeks gestation) and moderately (32–33 weeks gestation) preterm children the higher the cognitive workload of the tasks. Early biological factors such as gestation and neonatal complications explained more of the variance in high (12.5%) compared with moderate (8.1%) and low cognitive workload tasks (1.7%). Conclusions: The cognitive workload model may help to explain variations in findings on the relationship of gestational age with cognitive performance in the literature. The findings have implications for routine cognitive follow-up, educational intervention, and basic research into neuroplasticity and brain reorganization after preterm birth.
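
    The R² values above come from regressions that include a quadratic gestational-age term (GA²). As a rough illustration of that modelling step, the sketch below fits an ordinary least squares model with linear and quadratic gestational-age terms to simulated scores; the data, variable names, and coefficients are assumptions, not the cohort's results.

```python
# Illustrative sketch: regress a cognitive score on gestational age plus a
# quadratic term (GA^2), as in the workload analyses above. Data are simulated;
# the effect sizes and variable names are assumptions, not the study's values.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(1)

gestational_age = rng.uniform(23, 41, size=1326)        # weeks
ga_centered = gestational_age - gestational_age.mean()

# Simulated task score with a curvilinear gestation effect plus noise
score = 100 + 1.5 * ga_centered - 0.15 * ga_centered**2 + rng.normal(0, 12, 1326)

# Design matrix: intercept, linear GA term, quadratic GA^2 term
X = sm.add_constant(np.column_stack([ga_centered, ga_centered**2]))
model = sm.OLS(score, X).fit()

print(model.params)                   # intercept, linear, and quadratic coefficients
print(f"R^2 = {model.rsquared:.3f}")  # variance explained by the GA terms
```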

    A Landscape and Climate Data Logistic Model of Tsetse Distribution in Kenya

    Trypanosomiases, biologically transmitted by the tsetse fly in Africa, are a major cause of illness resulting in both high morbidity and mortality among humans, cattle, wild ungulates, and other species. However, tsetse fly distributions change rapidly due to environmental changes, and fine-scale distribution maps are few. Due to data scarcity, most presence/absence estimates in Kenya prior to 2000 are a combination of local reports, entomological knowledge, and topographic information. The availability of tsetse fly abundance data is limited, or at least such data have not been collected into aggregate, publicly available national datasets. Despite this limitation, other avenues exist for estimating tsetse distributions, including remotely sensed data, climate information, and statistical tools. Here we present a logistic regression model of tsetse abundance. The goal of this model is to estimate the distribution of tsetse flies in Kenya in the year 2000, and to provide a method by which to anticipate their future distribution. Multiple predictor variables were tested for significance and for predictive power; ultimately, a parsimonious subset of variables was identified and used to construct the regression model with the 1973 tsetse map. These data were validated against year 2000 Food and Agriculture Organization (FAO) estimates. Mapcurves goodness-of-fit scores were used to evaluate the modeled fly distribution against FAO estimates and against 1973 presence/absence data, each driven by appropriate climate data. Logistic regression can be effectively used to produce a model that projects fly abundance under elevated greenhouse gas scenarios. This model identifies potential areas for tsetse abandonment and expansion.
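
    As a rough illustration of the modelling approach described above, the sketch below fits a presence/absence logistic regression to simulated grid-cell data with a few climate and landscape predictors; the predictor names, data, and hold-out check are assumptions rather than the paper's actual variables, 1973 map, or FAO validation procedure.

```python
# Minimal sketch of a presence/absence logistic regression like the one described
# above. The predictors, simulated data, and train/test split are illustrative
# assumptions -- not the paper's variables, 1973 map, or FAO validation data.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score

rng = np.random.default_rng(42)
n_cells = 2000   # hypothetical map grid cells

# Assumed climate/landscape predictors per grid cell
temperature = rng.normal(24, 4, n_cells)       # deg C
rainfall = rng.normal(800, 250, n_cells)       # mm/year
ndvi = rng.uniform(0.1, 0.8, n_cells)          # vegetation index

# Simulated tsetse presence: more likely in warm, wet, vegetated cells
logit = 0.2 * (temperature - 24) + 0.004 * (rainfall - 800) + 4 * (ndvi - 0.45)
presence = rng.random(n_cells) < 1 / (1 + np.exp(-logit))

X = np.column_stack([temperature, rainfall, ndvi])
X_train, X_test, y_train, y_test = train_test_split(X, presence, random_state=0)

model = LogisticRegression(max_iter=1000).fit(X_train, y_train)
print("coefficients:", model.coef_)
print("held-out accuracy:", accuracy_score(y_test, model.predict(X_test)))
```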