
    Acquired immunologic tolerance: with particular reference to transplantation

    The first unequivocally successful bone marrow cell transplantation in humans was recorded in 1968 by the University of Minnesota team of Robert A. Good (Gatti et al. Lancet 2: 1366–1369, 1968). This achievement was a direct extension of mouse models of acquired immunologic tolerance that were established 15 years earlier. In contrast, organ (i.e. kidney) transplantation was accomplished precociously in humans (in 1959) before demonstrating its feasibility in any experimental model and in the absence of a defensible immunologic rationale. Because of the striking differences between the outcomes with the two kinds of procedure, the mechanisms of organ engraftment were long thought to differ from the leukocyte chimerism-associated mechanisms of bone marrow transplantation. This and other concepts of alloengraftment and acquired tolerance have changed over time. Current concepts and their clinical implications can be understood and discussed best from the perspective provided by the life and times of Bob Good.

    Use of Heated Humidified Gases for Early Stabilization of Preterm Infants: A Meta-Analysis

    Background: Large observational studies in preterm infants have shown an increase in mortality and morbidity when admission temperature is below 36.5°C. Recent randomized controlled studies have shown a reduction in admission hypothermia and an increase in the number of infants admitted with normal temperature (36.5–37.5°C) when heated humidified gases were used for initial stabilization of preterm infants. Objective: The goal of this study was to perform a meta-analysis of published randomized trials using heated humidified gas compared to cold dry gas in preterm infants immediately after birth and during transport to the neonatal unit. Specific research aims were to determine the magnitude of the reduction in hypothermia and to examine neonatal outcomes including mortality. Methods: A literature search was conducted in accordance with the standard methods of the Cochrane Neonatal Work Group. Randomized trials were identified and data entered into RevMan5. A fixed effects statistical model was used. Risk of bias was assessed for included studies, and the GRADE approach was used to determine the quality of evidence. The primary outcome was admission hypothermia (< 36.5°C). Secondary outcomes included admission temperature in the normothermic range (36.5–37.5°C) and neonatal outcomes including mortality. Results: Two studies met inclusion criteria, and a total of 476 preterm infants were enrolled, all of whom were < 32 weeks gestation. Studies were not blinded, but the overall risk of bias was low. Admission hypothermia was reduced by 36% (CI 17–50%), while admission normothermia was significantly increased. GRADE quality of evidence was high for these outcomes. The number of infants with more severe hypothermia (< 35.5°C) was significantly reduced (RR 0.32, CI 0.14–0.73). In addition, preterm infants < 28 weeks had significantly less admission hypothermia (RR 0.61, CI 0.42–0.90). Mortality and measures of respiratory outcome were not significantly different (studies were not powered for these outcomes), though there was a trend to improvement in all respiratory measures assessed. There were no significant adverse events and no increase in admission hyperthermia (> 37.5°C). Conclusions: Heating and humidification of inspired gases immediately after birth and during transport to the neonatal unit improves admission temperature in preterm infants. Consideration should be given to incorporating this technique into other strategies (e.g., use of plastic wrap) designed to keep preterm infants warm on admission to the neonatal unit.
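    The fixed effects pooling described in the Methods can be sketched as inverse-variance weighting of log risk ratios (the standard Mantel-Haenszel/inverse-variance approach RevMan5 implements). The two-study counts below are purely illustrative, not the trial data:

```python
import math

# Hypothetical two-study example: (events_treated, n_treated, events_control, n_control)
studies = [(30, 100, 50, 100), (24, 138, 40, 138)]

weights, log_rrs = [], []
for a, n1, c, n2 in studies:
    log_rr = math.log((a / n1) / (c / n2))    # log risk ratio for this study
    var = 1/a - 1/n1 + 1/c - 1/n2             # approximate variance of the log RR
    log_rrs.append(log_rr)
    weights.append(1 / var)                   # inverse-variance weight

# Fixed-effect pooled estimate and 95% confidence interval
pooled = sum(w * l for w, l in zip(weights, log_rrs)) / sum(weights)
se = math.sqrt(1 / sum(weights))
rr = math.exp(pooled)
ci = (math.exp(pooled - 1.96 * se), math.exp(pooled + 1.96 * se))
print(f"Pooled RR = {rr:.2f}, 95% CI {ci[0]:.2f}-{ci[1]:.2f}")
```

A pooled RR below 1 with a CI excluding 1, as reported for admission hypothermia, indicates a significant reduction under the heated-gas intervention.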

    History of clinical transplantation

    The emergence of transplantation has seen the development of increasingly potent immunosuppressive agents, progressively better methods of tissue and organ preservation, refinements in histocompatibility matching, and numerous innovations in surgical techniques. Such efforts in combination ultimately made it possible to successfully engraft all of the organs and bone marrow cells in humans. At a more fundamental level, however, the transplantation enterprise hinged on two seminal turning points. The first was the recognition by Billingham, Brent, and Medawar in 1953 that it was possible to induce chimerism-associated neonatal tolerance deliberately. This discovery escalated over the next 15 years to the first successful bone marrow transplantations in humans in 1968. The second turning point was the demonstration during the early 1960s that canine and human organ allografts could self-induce tolerance with the aid of immunosuppression. By the end of 1962, however, it had been incorrectly concluded that turning points one and two involved different immune mechanisms. The error was not corrected until well into the 1990s. In this historical account, the vast literature that sprang up during the intervening 30 years has been summarized. Although admirably documenting empiric progress in clinical transplantation, its failure to explain organ allograft acceptance predestined organ recipients to lifetime immunosuppression and precluded fundamental changes in treatment policies. After it was discovered in 1992 that long-surviving organ transplant recipients had persistent microchimerism, it was possible to see the mechanistic commonality of organ and bone marrow transplantation. A clarifying central principle of immunology could then be synthesized with which to guide efforts to induce tolerance systematically to human tissues and perhaps ultimately to xenografts.

    Can sacrificial feeding areas protect aquatic plants from herbivore grazing? Using behavioural ecology to inform wildlife management

    Effective wildlife management is needed for conservation, economic and human well-being objectives. However, traditional population control methods are frequently ineffective, unpopular with stakeholders, may affect non-target species, and can be both expensive and impractical to implement. New methods which address these issues and offer effective wildlife management are required. We used an individual-based model to predict the efficacy of a sacrificial feeding area in preventing grazing damage by mute swans (Cygnus olor) to adjacent river vegetation of high conservation and economic value. The accuracy of model predictions was assessed by comparison with observed field data, whilst prediction robustness was evaluated using a sensitivity analysis. We used repeated simulations to evaluate how the efficacy of the sacrificial feeding area was regulated by (i) food quantity, (ii) food quality, and (iii) the functional response of the forager. Our model gave accurate predictions of aquatic plant biomass, carrying capacity, swan mortality, swan foraging effort, and river use. Our model predicted that increased sacrificial feeding area food quantity and quality would prevent the depletion of aquatic plant biomass by swans. When the functional response for vegetation in the sacrificial feeding area was increased, the food quantity and quality in the sacrificial feeding area required to protect adjacent aquatic plants were reduced. Our study demonstrates how the insights of behavioural ecology can be used to inform wildlife management. The principles that underpin our model predictions are likely to be valid across a range of different resource-consumer interactions, emphasising the generality of our approach to the evaluation of strategies for resolving wildlife management problems.
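    The "functional response" manipulated in the simulations describes how a forager's intake rate rises and saturates with food density. The abstract does not give the model's parameterisation, so the sketch below uses a generic Holling type II form with illustrative values:

```python
# Holling type II functional response: intake saturates as biomass rises,
# because handling time limits how fast a forager can consume food.
# attack_rate and handling_time values here are illustrative assumptions.
def intake_rate(biomass, attack_rate=0.2, handling_time=0.05):
    """Food intake per forager per unit time at a given plant biomass."""
    return attack_rate * biomass / (1 + attack_rate * handling_time * biomass)

# A steeper functional response lets foragers meet their energy needs at
# lower food densities, which is why increasing it reduced the food quantity
# needed in the sacrificial feeding area to hold swans off the river.
for b in (10, 50, 200):
    print(b, round(intake_rate(b), 2))
```

Intake rises with biomass but never exceeds 1/handling_time, the asymptotic maximum.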

    SARS-CoV-2 viability on sports equipment is limited, and dependent on material composition

    The control of the COVID-19 pandemic in the UK has necessitated restrictions on amateur and professional sports due to the perceived infection risk to competitors, via direct person-to-person transmission, or possibly via the surfaces of sports equipment. The sharing of sports equipment such as tennis balls was therefore banned by some sports' governing bodies. We sought to investigate the potential of sporting equipment as transmission vectors of SARS-CoV-2. Ten different types of sporting equipment, including balls from common sports, were inoculated with 40 μl droplets containing clinically relevant concentrations of live SARS-CoV-2 virus. Materials were then swabbed at time points relevant to sports (1, 5, 15, 30, 90 min). The amount of live SARS-CoV-2 recovered at each time point was enumerated using viral plaque assays, and viral decay and half-life were estimated by fitting linear models to log-transformed data from each material. At one minute, SARS-CoV-2 virus was recovered from only seven of the ten types of equipment with the low dose inoculum, one at five minutes and none at 15 min. Retrievable virus dropped significantly for all materials tested using the high dose inoculum, with mean recovery of virus falling to 0.74% at 1 min, 0.39% at 15 min and 0.003% at 90 min. Viral recovery, predicted decay, and half-life varied between materials, with porous surfaces limiting virus transmission. This study shows that there is an exponential reduction in SARS-CoV-2 recoverable from a range of sports equipment after a short time period, and that virus is less transferrable from materials such as a tennis ball, red cricket ball and cricket glove. Given this rapid loss of viral load, and the fact that transmission requires a significant inoculum to be transferred from equipment to the mucous membranes of another individual, it seems unlikely that sports equipment is a major cause of transmission of SARS-CoV-2. These findings have important policy implications in the context of the pandemic: they may support other infection control measures in sports to reduce the risk of SARS-CoV-2 transmission, and may prompt sports equipment manufacturers to identify surfaces that are more or less likely to retain transferable virus.
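    The decay estimation described in the Methods (linear models fitted to log-transformed recoveries) amounts to an exponential-decay fit, from which a half-life follows directly. The time points below match the sampling schedule, but the recovery values are hypothetical:

```python
import math

# Sampling times (min) from the study design; PFU recoveries are illustrative.
times = [1, 5, 15, 30, 90]
pfu   = [200.0, 120.0, 40.0, 10.0, 0.5]

# Fit log(pfu) = b0 + b1 * t by ordinary least squares; a linear trend in
# log space corresponds to exponential decay in the original scale.
logs = [math.log(p) for p in pfu]
n = len(times)
t_bar = sum(times) / n
y_bar = sum(logs) / n
b1 = sum((t - t_bar) * (y - y_bar) for t, y in zip(times, logs)) / \
     sum((t - t_bar) ** 2 for t in times)

# Half-life: time for recoverable virus to halve, from the decay slope.
half_life = math.log(2) / -b1
print(f"decay rate = {b1:.4f} per min, half-life = {half_life:.1f} min")
```

A more negative slope (faster decay) on a material such as a tennis ball would yield a shorter half-life, consistent with porous surfaces limiting transferable virus.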

    Low pH immobilizes and kills human leukocytes and prevents transmission of cell-associated HIV in a mouse model

    BACKGROUND: Both cell-associated and cell-free HIV virions are present in semen and cervical secretions of HIV-infected individuals. Thus, topical microbicides may need to inactivate both cell-associated and cell-free HIV to prevent sexual transmission of HIV/AIDS. To determine if the mild acidity of the healthy vagina and acid buffering microbicides would prevent transmission by HIV-infected leukocytes, we measured the effect of pH on leukocyte motility, viability and intracellular pH and tested the ability of an acidic buffering microbicide (BufferGel(®)) to prevent the transmission of cell-associated HIV in a HuPBL-SCID mouse model. METHODS: Human lymphocyte, monocyte, and macrophage motilities were measured as a function of time and pH using various acidifying agents. Lymphocyte and macrophage motilities were measured using video microscopy. Monocyte motility was measured using video microscopy and chemotactic chambers. Peripheral blood mononuclear cell (PBMC) viability and intracellular pH were determined as a function of time and pH using fluorescent dyes. HuPBL-SCID mice were pretreated with BufferGel, saline, or a control gel and challenged with HIV-1-infected human PBMCs. RESULTS: Progressive motility was completely abolished in all cell types between pH 5.5 and 6.0. Concomitantly, at and below pH 5.5, the intracellular pH of PBMCs dropped precipitously to match the extracellular medium and did not recover. After acidification with hydrochloric acid to pH 4.5 for 60 min, although completely immotile, 58% of PBMCs excluded ethidium homodimer-1 (dead-cell dye). In contrast, when acidified to this pH with BufferGel, a microbicide designed to maintain vaginal acidity in the presence of semen, only 4% excluded dye at 10 min and none excluded dye after 30 min. BufferGel significantly reduced transmission of HIV-1 in HuPBL-SCID mice (1 of 12 infected) compared to saline (12 of 12 infected) and a control gel (5 of 7 infected). 
    CONCLUSION: These results suggest that physiologic or microbicide-induced acid immobilization and killing of infected white blood cells may be effective in preventing sexual transmission of cell-associated HIV.

    A hypothetico-deductive approach to assessing the social function of chemical signalling in a non-territorial solitary carnivore

    The function of chemical signalling in non-territorial solitary carnivores is still relatively unclear. Studies on territorial solitary and social carnivores have highlighted odour capability and utility; however, the social function of chemical signalling in wild carnivore populations operating dominance hierarchy social systems has received little attention. We monitored scent marking and investigatory behaviour of wild brown bears (Ursus arctos) to test multiple hypotheses relating to the social function of chemical signalling. Camera traps were stationed facing bear 'marking trees' to document behaviour by different age and sex classes in different seasons. We found evidence to support the hypothesis that adult males utilise chemical signalling to communicate dominance to other males throughout the non-denning period. Adult females did not appear to utilise marking trees to advertise oestrous state during the breeding season. The function of marking by subadult bears is somewhat unclear, but may be related to the behaviour of adult males. Subadults investigated trees more often than they scent marked during the breeding season, which could be a result of an increased risk from adult males. Females with young showed an increase in marking and investigation of trees outside of the breeding season. We propose the hypothesis that females engage their dependent young with marking trees from a young age, at a relatively 'safe' time of year. Memory, experience, and learning at a young age may all contribute towards odour capabilities in adult bears.