
    Multiscreen serum analysis of highly sensitized renal dialysis patients for antibodies toward public and private class I HLA determinants: Implications for computer-predicted acceptable and unacceptable donor mismatches in kidney transplantation

    A multiscreen serum analysis program has been developed that permits a determination of antibody specificity for the vast majority of highly sensitized patients awaiting transplantation. This program is based on a 2 x 2 table analysis of correlations between serum reactivity against an HLA-typed cell panel and the HLA antigens carried by the panel cells, and it incorporates two modifications. One implements the concept of public HLA determinants based on the serologic crossreactivity among class I HLA antigens. The other derives from the premise that most highly sensitized patients maintain the same PRA and antibody profiles over many months and even years; monthly screening results for patients with persistent PRA values can therefore be combined for analysis. For 132 of 150 highly sensitized patients with >50% PRA, this multiscreen serum analysis program yielded information about antibody specificity toward public and private class I HLA determinants. The vast majority of patients (108 of 112) with PRA values between 50 and 89% showed antibody specificity generally toward one, two, or three public markers and/or the more common private HLA-A, B antigens. For 24 of 38 patients with >90% PRA, it was possible to define one or a few HLA-specific antibodies. The primary objective of the multiscreen program was to develop an algorithm for the computer prediction of acceptable and unacceptable donor HLA-A, B antigens for patients with preformed antibodies. A retrospective analysis of kidney transplants into 89 highly sensitized patients demonstrated that allografts with unacceptable HLA-A, B mismatches had significantly lower actuarial survival rates than those with acceptable mismatches (P = 0.01). This was shown both for 32 primary transplants (44% vs. 67% after 1 year) and for 60 retransplants (50% vs. 68%). Also, serum creatinine levels were significantly higher in patients with unacceptable class I mismatches (3.0 vs. 8.4 mg% [P = 0.007] after 2 weeks; 3.9 vs. 9.1 mg% [P = 0.014] after 4 weeks). Histopathologic analysis of allograft tissue specimens from 47 transplant recipients revealed a significantly higher incidence of humoral rejection (P = 0.02), but not cellular rejection, in the unacceptable mismatch group. These results suggest that the multiscreen program can establish which donor HLA-A, B mismatches must be avoided in kidney transplantation for most highly sensitized patients. For 18 of 150 high-PRA renal dialysis patients, the multiscreen program could not define HLA-specific antibody. Most of these patients had >90% PRA, and many of their sera appeared to contain IgM-type nonspecific lymphocytotoxins that could be inactivated by dithioerythritol (DTE). Preliminary studies have shown that this treatment enabled the detection of HLA-specific antibodies upon subsequent screening on many occasions. These data suggest that non-HLA-specific reactivity revealed by multiscreen analysis can often be removed by DTE treatment. Multiscreen analysis offers an attractive approach to regional organ-sharing programs for highly sensitized renal transplant candidates. It enables the development of an efficient strategy for donor selection based on the computer assignment of acceptable HLA-A, B mismatches for each patient. © 1990 by Williams and Wilkins
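
    The abstract does not spell out the statistic behind the 2 x 2 table analysis, so the following Python sketch is only an illustration of the general idea: for each candidate HLA antigen, serum reactivity across the typed cell panel is cross-tabulated against antigen presence, and a 2 x 2 association measure (here the phi coefficient, chosen for illustration) flags antigens whose panel distribution tracks the serum's reactions.

        from math import sqrt

        def two_by_two_phi(reactions, antigen_present):
            # Phi correlation between serum reactivity and one HLA antigen
            # across a typed cell panel. Inputs are parallel lists of booleans:
            # did the serum react with panel cell i, and does cell i carry the
            # antigen. (Sketch only; the paper's exact statistic and cut-offs
            # are not stated in the abstract.)
            a = sum(r and p for r, p in zip(reactions, antigen_present))          # reactive, antigen present
            b = sum(r and not p for r, p in zip(reactions, antigen_present))      # reactive, antigen absent
            c = sum(not r and p for r, p in zip(reactions, antigen_present))      # non-reactive, antigen present
            d = sum(not r and not p for r, p in zip(reactions, antigen_present))  # non-reactive, antigen absent
            denom = sqrt((a + b) * (c + d) * (a + c) * (b + d))
            return (a * d - b * c) / denom if denom else 0.0

        # Hypothetical panel: the serum reacts with exactly the HLA-A2-positive cells.
        reactions   = [True, True, False, True, False, False]
        a2_on_panel = [True, True, False, True, False, False]
        print(two_by_two_phi(reactions, a2_on_panel))  # 1.0 -> consistent with anti-A2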

    The influence of HLA matching on cytomegalovirus hepatitis and chronic rejection after liver transplantation

    Previous findings in liver transplant patients have raised the concept that HLA plays a dualistic role: HLA matching will reduce rejection but may augment MHC-restricted cellular immune mechanisms of liver allograft injury. To evaluate this concept, we studied CMV hepatitis in 399 FK506-treated liver transplant patients, including 355 cases for which complete HLA-A, B, DR, DQ typing information was available. CMV hepatitis developed in 25 patients, and 17 of them (68%) showed a one- or two-antigen HLA-DR match with the donor. In contrast, HLA-DR matches were found in only 35% of 330 patients without CMV hepatitis (P=0.005). No significant associations were seen for HLA-A, HLA-B, and HLA-DQ antigens. In pretransplant CMV-seronegative patients with seropositive grafts (n=39), the frequency of CMV hepatitis was 44% for HLA-DR-matched livers but 14% for HLA-DR-unmatched livers. In seropositive recipients (n=187), these frequencies were 12% and 2% for HLA-DR-matched and unmatched liver grafts, respectively. Chronic rejection developed in 29 patients (8%) during a follow-up of 10 to 24 months after transplantation. Its incidence was higher in the CMV hepatitis group (24% vs. 6%; P=0.007). Although no associations were found between HLA matching and the incidence of chronic rejection, there was an earlier onset of chronic rejection in HLA-DR-matched livers irrespective of CMV hepatitis. These findings suggest that an HLA-DR match between donor and recipient increases the incidence of CMV hepatitis in both primary and secondary CMV infections. Although HLA compatibility leads to less acute cellular rejection, it is suggested that DR matching may accelerate chronic rejection of liver transplants, perhaps through HLA-DR-restricted immunological mechanisms directed toward viral antigens, including CMV. © 1993 by Williams and Wilkins

    The Summer 2019-2020 Wildfires in East Coast Australia and Their Impacts on Air Quality and Health in New South Wales, Australia.

    The 2019–2020 summer wildfire event on the east coast of Australia was a series of major wildfires occurring from November 2019 to the end of January 2020 across the states of Queensland, New South Wales (NSW), Victoria, and South Australia. The wildfires were unprecedented in scope, and their extensive character caused smoke pollutants to be transported not only to New Zealand but also across the Pacific Ocean to South America. At the peak of the wildfires, smoke plumes were injected into the stratosphere at heights of up to 25 km and hence transported across the globe. The meteorological and air quality Weather Research and Forecasting with Chemistry (WRF-Chem) model is used together with air quality monitoring data collected during the bushfire period and remote sensing data from the Moderate Resolution Imaging Spectroradiometer (MODIS) and Cloud-Aerosol Lidar and Infrared Pathfinder Satellite Observation (CALIPSO) satellites to determine the extent of the wildfires, the pollutant transport, and their impacts on air quality and the health of the exposed population in NSW. The results showed that the WRF-Chem model, using the Fire INventory from NCAR (FINN) of the National Center for Atmospheric Research to simulate the dispersion and transport of pollutants from wildfires, predicted daily PM2.5 concentrations with a correlation (R2) of 0.6 to 0.75 and an index of agreement (IOA) of 0.61 to 0.86 when compared with ground-based data. The impact on health endpoints such as mortality and hospitalizations for respiratory and cardiovascular diseases across the modelling domain was then estimated. The estimated health impact on each of the Australian Bureau of Statistics (ABS) census districts (SA4) of New South Wales was calculated based on epidemiological assumptions of the impact function and incidence-rate data from the 2016 ABS census and NSW Department of Health statistical health records. Summing all SA4 census district results over NSW, we estimated that there were 247 (CI: 89, 409) premature deaths, 437 (CI: 81, 984) cardiovascular disease hospitalizations, and 1535 (CI: 493, 2087) respiratory disease hospitalizations in NSW over the period from 1 November 2019 to 8 January 2020. The results are comparable with a previous study based only on observation data, but this study provides much more spatially and temporally detailed data on the health impact of the summer 2019–2020 wildfires.
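
    The abstract does not give the exact impact function, so the sketch below uses the log-linear concentration-response form commonly applied in smoke health impact assessments; the coefficient, baseline rate, and district population are placeholders rather than values from the study.

        import math

        def attributable_cases(delta_pm25, beta, baseline_rate, population):
            # Log-linear concentration-response function: the attributable
            # fraction 1 - exp(-beta * dC) scales the baseline case count.
            # delta_pm25    -- wildfire-attributable PM2.5 increment (ug/m3)
            # beta          -- concentration-response coefficient per ug/m3
            # baseline_rate -- baseline incidence (cases per person per period)
            # population    -- exposed population of the SA4 district
            attributable_fraction = 1.0 - math.exp(-beta * delta_pm25)
            return baseline_rate * population * attributable_fraction

        # Hypothetical SA4 district with placeholder numbers only:
        print(attributable_cases(delta_pm25=15.0, beta=0.0006,
                                 baseline_rate=0.006, population=400_000))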

    Structure of the hDmc1-ssDNA filament reveals the principles of its architecture

    In eukaryotes, meiotic recombination is a major source of genetic diversity, but its defects in humans lead to abnormalities such as Down's, Klinefelter's, and other syndromes. Human Dmc1 (hDmc1), a RecA/Rad51 homologue, is a recombinase that plays a crucial role in faithful chromosome segregation during meiosis. The initial step of homologous recombination occurs when hDmc1 forms a filament on single-stranded (ss) DNA. However, the structure of this presynaptic complex filament for hDmc1 remains unknown. To compare hDmc1-ssDNA complexes to those known for the RecA/Rad51 family, we have obtained electron microscopy (EM) structures of hDmc1-ssDNA nucleoprotein filaments using a single-particle approach. The EM maps were analysed by docking crystal structures of Dmc1, Rad51, RadA, RecA, and DNA. To fully characterise hDmc1-DNA complexes, we have analysed their organisation in the presence of Ca2+, Mg2+, ATP, AMP-PNP, ssDNA, and dsDNA. The 3D EM structures of the hDmc1-ssDNA filaments allowed us to elucidate the principles of their internal architecture. Similar to the RecA/Rad51 family, hDmc1 forms helical filaments on ssDNA in two states: extended (active) and compressed (inactive). However, in contrast to the RecA/Rad51 family, and to the recently reported structure of hDmc1-double-stranded (ds) DNA nucleoprotein filaments, the extended (active) state of the hDmc1 filament formed on ssDNA has nine protomers per helical turn instead of the conventional six, resulting in one protomer covering two nucleotides instead of three. A control reconstruction of the hDmc1-dsDNA filament revealed 6.4 protein subunits per helical turn, indicating that the filament organisation varies depending on the DNA template. Our structural analysis also revealed that the N-terminal domain of hDmc1 accomplishes its important role in complex formation through domain swapping between adjacent protomers, thus providing a mechanistic basis for the coordinated action of hDmc1 protomers during meiotic recombination.

    Attention deficit hyperactivity disorder symptomatology and pediatric obesity: Psychopathology or sleep deprivation?

    The relationship between attention deficit hyperactivity disorder (ADHD) and obesity in children has received considerable attention in recent years. However, the literature currently overlooks the potential causal and maintaining role that sleep problems may play in this relationship. Using a biopsychosocial framework, this article highlights how sleep problems affect the biological, psychological, and social aspects of both ADHD symptomatology and obesity. An in-depth examination of this model illustrates the imperative need for future research and clinical practice to recognize and explore the role sleep plays in the link between obesity and ADHD symptomatology.

    Guillain-Barré syndrome: a century of progress

    In 1916, Guillain, Barré and Strohl reported on two cases of acute flaccid paralysis with high cerebrospinal fluid protein levels and normal cell counts: novel findings that identified the disease we now know as Guillain–Barré syndrome (GBS). One hundred years on, we have made great progress in the clinical and pathological characterization of GBS. Early clinicopathological and animal studies indicated that GBS was an immune-mediated demyelinating disorder and that severe GBS could result in secondary axonal injury; the current treatments of plasma exchange and intravenous immunoglobulin, which were developed in the 1980s, are based on this premise. Subsequent work has, however, shown that primary axonal injury can underlie the disease. The association of axonal GBS with particular Campylobacter jejuni strains has led to confirmation that anti-ganglioside antibodies are pathogenic and that axonal GBS involves antibody- and complement-mediated disruption of nodes of Ranvier, neuromuscular junctions, and other neuronal and glial membranes. Now, ongoing clinical trials of the complement inhibitor eculizumab represent the first targeted immunotherapy for GBS.

    Dispelling urban myths about default uncertainty factors in chemical risk assessment - Sufficient protection against mixture effects?

    Assessing the detrimental health effects of chemicals requires the extrapolation of experimental data in animals to human populations. This is achieved by applying a default uncertainty factor of 100 to doses not found to be associated with observable effects in laboratory animals. It is commonly assumed that the toxicokinetic and toxicodynamic sub-components of this default uncertainty factor represent worst-case scenarios and that the multiplication of those components yields conservative estimates of safe levels for humans. It is sometimes claimed that this conservatism also offers adequate protection from mixture effects. By analysing the evolution of uncertainty factors from a historical perspective, we show that the default factor and its sub-components are intended to represent adequate rather than worst-case scenarios. The intention of using assessment factors for mixture effects was abandoned thirty years ago. It is also often ignored that the conservatism (or otherwise) of uncertainty factors can only be considered in relation to a defined level of protection. A protection equivalent to an effect magnitude of 0.001–0.0001% over background incidence is generally considered acceptable. However, it is impossible to say whether this level of protection is in fact realised with the tolerable doses that are derived by employing uncertainty factors. Accordingly, it is difficult to assess whether uncertainty factors overestimate or underestimate the sensitivity differences in human populations. It is also often not appreciated that the outcome of probabilistic approaches to the multiplication of sub-factors depends on the choice of probability distributions. Therefore, the idea that default uncertainty factors are overly conservative worst-case scenarios which can both account for the lack of statistical power in animal experiments and protect against potential mixture effects is ill-founded. We contend that precautionary regulation should provide an incentive to generate better data and recommend adopting a pragmatic, but scientifically better-founded, approach to mixture risk assessment. © 2013 Martin et al.; licensee BioMed Central Ltd.
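
    The sensitivity to distributional choice is easy to demonstrate with a minimal Monte Carlo sketch (illustrative numbers only, not taken from the article): two sets of four sub-factors share roughly the same central tendency, yet the upper percentiles of the composite factor differ markedly.

        import numpy as np

        rng = np.random.default_rng(0)
        N = 100_000
        nominal = 100 ** 0.25  # four sub-factors whose deterministic product is 100

        # Choice 1: lognormal sub-factors (median = nominal, geometric SD = 2)
        logn = rng.lognormal(np.log(nominal), np.log(2), size=(4, N)).prod(axis=0)

        # Choice 2: uniform sub-factors with the same arithmetic mean
        unif = rng.uniform(1.0, 2 * nominal - 1.0, size=(4, N)).prod(axis=0)

        for name, product in (("lognormal", logn), ("uniform", unif)):
            print(f"{name:9s} median={np.median(product):8.1f}  "
                  f"95th percentile={np.percentile(product, 95):8.1f}")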

    Influences on gum feeding in primates

    This chapter reviews the factors that may affect patterns of gum feeding by primates. These are then examined for mixed-species troops of saddleback (Saguinus fuscicollis) and mustached (Saguinus mystax) tamarins. An important distinction is made between gums produced by tree trunks and branches as a result of damage and those produced by seed pods as part of a dispersal strategy, as these may be expected to differ in their biochemistry. Feeding on fruit and Parkia seed pod exudates was more prevalent in the morning, whereas other exudates were eaten in the afternoon. This schedule may represent a deliberate strategy to retain trunk gums in the gut overnight, thus maximising the potential for microbial fermentation of their β-linked oligosaccharides. Both types of exudates were eaten more in the dry season than in the wet season. Consumption was linked to seasonal changes in resource availability and not to the tamarins' reproductive status, providing no support for the suggestion that gums are eaten as a primary calcium source in the later stages of gestation and lactation. The role of availability in determining patterns of consumption is further supported by the finding that dietary overlap for the trunk gums eaten was greater between species within mixed-species troops within years than it was within species between years. These data, and those for pygmy marmosets (Cebuella pygmaea), suggest that patterns of primate gummivory may reflect the interaction of preference and availability, both for species able to stimulate gum production and for those that cannot.

    Alcohol-related blackouts among college students: impact of low level of response to alcohol, ethnicity, sex, and environmental characteristics

    Objective: To explore how a genetically influenced characteristic (the level of response to alcohol [LR]), ethnicity, and sex relate to environmental and attitudinal characteristics (peer drinking [PEER], drinking to cope [COPE], and alcohol expectancies [EXPECT]) in predicting future alcohol-related blackouts (ARBs). Methods: Structural equation models (SEMs) were used to evaluate how baseline variables related to ARB patterns in 462 college students over 55 weeks. Data were extracted from a longitudinal study of heavy drinking and its consequences at a U.S. university. Results: In the SEM analysis, female sex and Asian ethnicity directly predicted future ARBs (beta weights 0.10 and -0.11, respectively), while all other variables had indirect impacts on ARBs through alcohol quantities (beta weights ~0.23 for European American ethnicity and low LR, ~0.21 for cannabis use and COPE, and ~0.44 for PEER). Alcohol quantities in turn related to ARBs with beta = 0.44. The SEM explained 23% of the variance. Conclusion: These data may be useful in identifying college students who are more likely to experience ARBs over a 1-year period. They enhance our understanding of whether the relationships of predictors to ARBs are direct or mediated through baseline drinking patterns, information that may be useful in prevention strategies for ARBs.
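
    As a sketch of how the mediated effects are read off such a model, the standard product-of-paths rule multiplies the coefficient from predictor to mediator by the coefficient from mediator to outcome. The snippet below applies that rule to the beta weights reported above; treating each weight as a single predictor-to-quantity path is an illustrative simplification of the full SEM.

        # Product-of-paths rule: indirect effect = (predictor -> mediator)
        # times (mediator -> outcome), here with alcohol quantity as mediator.
        paths_to_quantity = {
            "European American ethnicity": 0.23,  # approximate betas from the abstract
            "low level of response (LR)":  0.23,
            "cannabis use":                0.21,
            "drinking to cope (COPE)":     0.21,
            "peer drinking (PEER)":        0.44,
        }
        beta_quantity_to_arb = 0.44  # alcohol quantity -> blackouts

        for predictor, beta in paths_to_quantity.items():
            indirect = beta * beta_quantity_to_arb
            print(f"{predictor:30s} indirect effect on ARBs ~ {indirect:.2f}")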