Effects of donor/recipient human leukocyte antigen mismatch on human cytomegalovirus replication following liver transplantation.
Background
Natural immunity against cytomegalovirus (CMV) can control virus replication after solid organ transplantation; however, it is not known which components of the adaptive immune system mediate this protection. We investigated whether this protection requires human leukocyte antigen (HLA) matching between donor and recipient by exploiting the fact that, unlike transplantation of other solid organs, liver transplantation does not require HLA matching, but some donor and recipient pairs may nevertheless be matched by chance.
Methods
To further investigate this immune control, we determined whether chance HLA matching between donor (D) and recipient (R) in liver transplants affected a range of viral replication parameters.
Results
In total, 274 liver transplant recipients were stratified according to matches at the HLA A, HLA B, and HLA DR loci. The incidence of CMV viremia, kinetics of replication, and peak viral load were similar between the HLA matched and mismatched patients in the D+/R+ and D−/R+ transplant groups. D+/R− transplants with 1 or 2 mismatches at the HLA DR locus had a higher incidence of CMV viremia >3000 genomes/mL blood compared to patients matched at this locus (78% vs. 17%; P = 0.01). There was evidence that matching at the HLA A locus had a small effect on peak viral loads in D+/R− patients, with median peak loads of 3540 and 14,706 genomes/mL in the matched (0 mismatch) and combined (1 and 2) mismatch groups, respectively (P = 0.03).
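To make these comparisons concrete, a minimal Python sketch of this kind of incidence and viral-load analysis is given below; the group counts and load values are invented to approximate the reported percentages, not taken from the study data.

```python
# Illustrative sketch only: counts and loads below are assumptions chosen
# to approximate the reported percentages, not the study data.
from scipy.stats import fisher_exact, mannwhitneyu

# D+/R- recipients, CMV viremia >3000 genomes/mL by HLA DR match status.
table = [[14, 4],  # 1-2 mismatches: 14/18 viremic (~78%)
         [1, 5]]   # fully matched:   1/6 viremic (~17%)
odds_ratio, p_value = fisher_exact(table)
print(f"Fisher's exact test: OR = {odds_ratio:.1f}, p = {p_value:.3f}")

# Peak viral loads (genomes/mL) would typically be compared between
# HLA A matched and mismatched groups with a rank-based test.
matched = [1200, 3540, 5100]        # invented values
mismatched = [9800, 14706, 22000]   # invented values
u_stat, p = mannwhitneyu(matched, mismatched)
print(f"Mann-Whitney U = {u_stat:.0f}, p = {p:.3f}")
```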
Conclusion
Overall, our data indicate that, in the setting of liver transplantation, prevention of CMV infection and control of CMV replication by adaptive immunity are minimally influenced by HLA matching of the donor and recipient. Our data raise questions about immune control of CMV in the liver and about the cells in which the virus is amplified to give rise to CMV viremia.
Cytomegalovirus replication kinetics in solid organ transplant recipients managed by preemptive therapy.
After allotransplantation, cytomegalovirus (CMV) may be transmitted from the donor organ, giving rise to primary infection in a CMV-negative recipient or reinfection in one who is CMV-positive. In addition, latent CMV may reactivate in a CMV-positive recipient. In this study, serial blood samples from 689 kidney or liver transplant recipients were tested for CMV DNA by quantitative PCR. CMV was managed using preemptive antiviral therapy, and no patient received antiviral prophylaxis. Dynamic and quantitative measures of viremia and treatment were assessed. Median peak viral load, duration of viremia and duration of treatment were highest during primary infection, followed by reinfection and then reactivation. In patients who experienced a second episode of viremia, the viral replication rate was significantly slower than in the first episode. Our data provide a clear demonstration of the immune control of CMV in immunosuppressed patients and emphasize the effectiveness of the preemptive approach for prevention of CMV syndrome and end-organ disease. Overall, our findings provide quantitative biomarkers which can be used in pharmacodynamic assessments of the ability of novel CMV vaccines or antiviral drugs to reduce or even interrupt such transmission.
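The replication-rate comparison above rests on a simple growth-kinetics calculation. The sketch below shows one common formulation, assuming exponential growth between two serial samples; all sample values are invented for illustration.

```python
import math

def doubling_time_days(load1, load2, interval_days):
    """Viral doubling time, assuming exponential growth between two
    serial quantitative PCR measurements (genomes/mL)."""
    rate = (math.log10(load2) - math.log10(load1)) / interval_days  # log10/day
    return math.log10(2) / rate

# Invented serial loads from a hypothetical first episode of viremia:
print(f"{doubling_time_days(1_000, 32_000, 7):.1f} days")  # ~1.4 days
```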
Rare gallbladder adenomyomatosis presenting as atypical cholecystitis: case report
Background
Gallbladder adenomyomatosis is a benign condition characterized by hyperplastic change in the gallbladder wall and overgrowth of the mucosa, of unknown cause. Patients with gallbladder adenomyomatosis usually present with abdominal pain. However, we herein describe a patient with gallbladder adenomyomatosis who presented not with abdominal pain but with fever alone.
Case presentation
A 34-year-old man presented to our hospital with fever. No abdominal discomfort was reported. His physical examination showed no abnormalities. Ultrasound of the abdomen revealed thickening of the gallbladder wall, and acute cholecystitis was diagnosed. The fever persisted even after 1 week of antibiotic therapy. Magnetic resonance imaging of the abdomen showed gallbladder adenomyomatosis with intramural Rokitansky-Aschoff sinuses. Exploratory laparotomy with cholecystectomy was performed. The fever resolved, and no residual symptoms were reported at the 3-year follow-up.
Conclusions
Gallbladder adenomyomatosis can present with fever as the only symptom. Although the association between gallbladder adenomyomatosis and malignancy has yet to be elucidated, previous reports have shown a strong association between gallbladder carcinoma and a subtype of gallbladder adenomyomatosis. Surgical intervention remains the first-choice treatment for patients with gallbladder adenomyomatosis.
Quantification of Rapid Myosin Regulatory Light Chain Phosphorylation Using High-Throughput In-Cell Western Assays: Comparison to Western Immunoblots
Quantification of phospho-proteins (PPs) is crucial when studying cellular signaling pathways. Western immunoblotting (WB) is commonly used for the measurement of relative levels of signaling intermediates in experimental samples. However, WB is in general a labour-intensive and low-throughput technique. Because of variability in protein yield and phospho-signal preservation during protein harvesting, and potential loss of antigen during protein transfer, WB provides only semi-quantitative data. By comparison, the "in-cell western" (ICW) technique has high-throughput capacity and requires less extensive sample preparation. Thus, we compared the ICW technique to WB for measuring phosphorylated myosin regulatory light chain (pMLC20) in primary cultures of uterine myocytes to assess their relative specificity, sensitivity, precision, and quantification of biologically relevant responses.

ICWs are cell-based microplate assays for quantification of protein targets in their cellular context. ICWs utilize a two-channel infrared (IR) scanner (Odyssey®) to quantify signals arising from near-infrared (NIR) fluorophores conjugated to secondary antibodies. One channel is dedicated to measuring the protein of interest and the second is used for data normalization of the signal in each well of the microplate. Using uterine myocytes, we assessed oxytocin (OT)-stimulated MLC20 phosphorylation measured by ICW and WB, both using NIR fluorescence. ICW and WB data were comparable regarding signal linearity, signal specificity, and time course of the phosphorylation response to OT.

ICW and WB yield comparable biological data. The advantages of ICW over WB are its high-throughput capacity, improved precision, and reduced sample preparation requirements. ICW might provide better sensitivity and precision with low-quantity samples or for protocols requiring large numbers of samples. These features make the ICW technique an excellent tool for the study of phosphorylation endpoints. However, the drawbacks of ICW include the need for a cell culture format and the lack of utility where protein purification, concentration or stoichiometric analyses are required.
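A minimal sketch of the two-channel normalisation step described above: the target-channel signal in each well is divided by the normalisation-channel signal before fold-changes are computed. All intensity values below are assumptions for illustration.

```python
import numpy as np

# Invented per-well intensities: one channel for the target (e.g. the
# phospho-protein) and one for the normaliser (e.g. a total-cell stain).
target_channel = np.array([1520.0, 2980.0, 4410.0, 1480.0])
norm_channel = np.array([510.0, 495.0, 530.0, 505.0])

normalised = target_channel / norm_channel  # per-well normalisation
fold_change = normalised / normalised[0]    # relative to the control well
print(fold_change.round(2))
```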
Identifying and Prioritizing Greater Sage-Grouse Nesting and Brood-Rearing Habitat for Conservation in Human-Modified Landscapes
BACKGROUND: Balancing animal conservation and human use of the landscape is an ongoing scientific and practical challenge throughout the world. We investigated reproductive success in female greater sage-grouse (Centrocercus urophasianus) relative to seasonal patterns of resource selection, with the larger goal of developing a spatially-explicit framework for managing human activity and sage-grouse conservation at the landscape level. METHODOLOGY/PRINCIPAL FINDINGS: We integrated field observation, Global Positioning System (GPS) telemetry, and statistical modeling to quantify the spatial pattern of occurrence and risk during nesting and brood-rearing. We linked occurrence and risk models to provide spatially-explicit indices of habitat-performance relationships. As part of the analysis, we offer novel biological information on resource selection during egg-laying, incubation, and night. The spatial pattern of occurrence during all reproductive phases was driven largely by selection or avoidance of terrain features and vegetation, with little variation explained by anthropogenic features. Specifically, sage-grouse consistently avoided rough terrain, selected for moderate shrub cover at the patch level (within 90 m²), and selected for mesic habitat in the mid and late brood-rearing phases. In contrast, risk of nest and brood failure was structured by proximity to anthropogenic features, including natural gas wells and human-created mesic areas, as well as vegetation features such as shrub cover. CONCLUSIONS/SIGNIFICANCE: Risk in this and perhaps other human-modified landscapes is a top-down (i.e., human-mediated) process that would most effectively be minimized by developing a better understanding of the specific mechanisms (e.g., predator subsidization) driving observed patterns, and by using habitat-performance indices such as those developed herein for spatially-explicit guidance of conservation intervention. Working under the hypothesis that industrial activity structures risk by enhancing predator abundance or effectiveness, we offer specific recommendations for maintaining high-performance habitat and reducing low-performance habitat, particularly relative to the nesting phase, by managing key high-risk anthropogenic features such as industrial infrastructure and water developments.
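The occurrence-risk linkage described above can be pictured as a simple raster calculation. The combination rule in the sketch below (occurrence probability weighted by survival, i.e. 1 − risk) is an assumption chosen for illustration, not the authors' published index; all grid values are invented.

```python
import numpy as np

# Invented per-cell outputs from separate occurrence and risk models:
occurrence = np.array([[0.8, 0.6],
                       [0.2, 0.9]])  # P(use for nesting/brood-rearing)
risk = np.array([[0.7, 0.2],
                 [0.1, 0.4]])        # P(nest or brood failure)

# One plausible combination rule: occurrence weighted by survival.
performance = occurrence * (1.0 - risk)

# Cells that attract birds but carry high risk (potential "sinks")
# are candidates for targeted management.
low_performance = (occurrence > 0.5) & (risk > 0.5)
print(performance.round(2))
print(low_performance)
```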
Trends in, and factors associated with, HIV infection amongst tuberculosis patients in the era of anti-retroviral therapy: a retrospective study in England, Wales and Northern Ireland
Background
HIV increases the progression of latent tuberculosis (TB) infection to active disease and contributed to increased TB incidence in the UK until 2004. We describe temporal trends in HIV infection amongst patients with TB and identify factors associated with HIV infection.
Methods
We used national surveillance data on all TB cases reported in England, Wales and Northern Ireland from 2000 to 2014 and determined HIV status through record linkage to national HIV surveillance. We used logistic regression to identify associations between HIV and demographic, clinical and social factors.
Results
There were 106,829 cases of TB in adults (≥ 15 years) reported from 2000 to 2014. The number and proportion of TB patients infected with HIV decreased from 543/6782 (8.0%) in 2004 to 205/6461 (3.2%) in 2014. The proportion of patients diagnosed with HIV > 91 days prior to their TB diagnosis increased from 33.5% in 2000 to 60.2% in 2013. HIV infection was most common in people of black African ethnicity from countries with high HIV prevalence (32.3%), patients who misused drugs (8.1%) and patients with miliary or meningeal TB (17.2%).
Conclusions
There has been an overall decrease in TB-HIV co-infection and a decline in the proportion of patients diagnosed simultaneously with both infections. However, high rates of HIV remain in some sub-populations of patients with TB, particularly black Africans born in countries with high HIV prevalence and people with a history of drug misuse. Whilst the current policy of testing all patients diagnosed with TB for HIV infection is important in ensuring appropriate management of TB patients, many of these TB cases would be preventable if HIV could be diagnosed before TB develops. Improving screening for both latent TB and HIV and ensuring early treatment of HIV in these populations could help prevent these TB cases. British HIV Association guidelines on latent TB testing for people with HIV from sub-Saharan Africa remain relevant, and latent TB screening for people with HIV with a history of drug misuse, homelessness or imprisonment should also be considered.
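As a hedged sketch of the kind of logistic regression described in the Methods, the example below fits a model to a fabricated toy dataset standing in for the surveillance records; the predictor names and effect sizes are assumptions for illustration only.

```python
import numpy as np
import statsmodels.api as sm

# Fabricated stand-in for the surveillance records: outcome = HIV
# co-infection; predictors = drug misuse and miliary/meningeal TB.
rng = np.random.default_rng(0)
n = 5000
drug_misuse = rng.integers(0, 2, n)
miliary_tb = rng.integers(0, 2, n)
logit = -3.0 + 1.2 * drug_misuse + 1.8 * miliary_tb
hiv = (rng.random(n) < 1 / (1 + np.exp(-logit))).astype(float)

X = sm.add_constant(np.column_stack([drug_misuse, miliary_tb]))
fit = sm.Logit(hiv, X).fit(disp=False)
# Exponentiated coefficients: baseline odds, then the odds ratios
# for drug misuse and miliary/meningeal TB.
print(np.exp(fit.params))
```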
Learning environment, attitudes and anxiety across the transition from primary to secondary school mathematics
Past research has revealed that, relative to primary-school students, high-school students have less-positive attitudes to mathematics and perceive their classroom environments and teacher–student relationships less favourably. This study examined the transition experience of 541 students in 47 classes in 15 primary (Year 7) and secondary (Year 8) government and Catholic schools in metropolitan and regional South Australia. Scales were adapted from three established instruments, namely, the What Is Happening In this Class?, the Test of Mathematics Related Attitudes and the Revised Mathematics Anxiety Ratings Scale, to identify changes across the transition from primary to secondary school in terms of the classroom learning environment and students' attitude/anxiety towards mathematics. Relative to Year 7 students, Year 8 students reported less Involvement, less positive Attitude to Mathematical Inquiry, less Enjoyment of Mathematics and greater Mathematics Anxiety. Differences between students in Years 7 and 8 were very similar for male and female students, although the magnitude of sex differences in attitudes was slightly different in Years 7 and 8.
Utilisation of an operative difficulty grading scale for laparoscopic cholecystectomy
Background
A reliable system for grading operative difficulty of laparoscopic cholecystectomy would standardise description of findings and reporting of outcomes. The aim of this study was to validate a difficulty grading system (Nassar scale), testing its applicability and consistency in two large prospective datasets.
Methods
Patient and disease-related variables and 30-day outcomes were identified in two prospective cholecystectomy databases: the multi-centre prospective cohort of 8820 patients from the recent CholeS Study and a single-surgeon series of 4089 patients. Operative data and patient outcomes were correlated with the Nassar operative difficulty scale, using Kendall's tau for dichotomous variables and Jonckheere–Terpstra tests for continuous variables. A ROC curve analysis was performed to quantify the predictive accuracy of the scale for each outcome, with continuous outcomes dichotomised prior to analysis.
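For orientation, a minimal sketch of two of these analyses on invented data follows (Python, using SciPy and scikit-learn; the Jonckheere–Terpstra test is omitted because neither library implements it). The grades and outcomes below are assumptions, not study data.

```python
from scipy.stats import kendalltau
from sklearn.metrics import roc_auc_score

# Invented example: difficulty grade (1-5) against a dichotomous outcome
# such as conversion to open surgery (1 = converted).
grades = [1, 1, 2, 2, 3, 3, 4, 4, 5, 5]
converted = [0, 0, 0, 0, 0, 1, 0, 1, 1, 1]

tau, p = kendalltau(grades, converted)
auroc = roc_auc_score(converted, grades)  # grade used as the score
print(f"Kendall's tau = {tau:.2f} (p = {p:.3f}), AUROC = {auroc:.2f}")
```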
Results
A higher operative difficulty grade was consistently associated with worse patient outcomes in both the reference and CholeS cohorts. The median length of stay increased from 0 to 4 days, and the 30-day complication rate from 7.6% to 24.4%, as the difficulty grade increased from 1 to 4/5 (both p < 0.001). In the CholeS cohort, a higher difficulty grade was most strongly associated with conversion to open surgery and 30-day mortality (AUROC = 0.903 and 0.822, respectively). On multivariable analysis, the Nassar operative difficulty scale was a significant independent predictor of operative duration, conversion to open surgery, 30-day complications and 30-day reintervention (all p < 0.001).
Conclusion
We have shown that an operative difficulty scale can standardise the description of operative findings by surgeons of different grades, facilitating audit, training assessment and research. It provides a tool for reporting operative findings, disease severity and technical difficulty, and can be utilised in future research to reliably compare outcomes according to case mix and intra-operative difficulty.
Regulatory Response to Carbon Starvation in Caulobacter crescentus
Bacteria adapt to shifts from rapid to slow growth and have developed strategies for long-term survival during prolonged starvation and stress. We report the regulatory response of C. crescentus to carbon starvation, based on combined high-throughput proteome and transcriptome analyses. Our results identify cell cycle changes in gene expression in response to carbon starvation, with prominent roles for the FixK FNR/CAP-family transcription factor and the CtrA cell cycle regulator. Notably, the SigT ECF sigma factor mediates the carbon starvation-induced degradation of CtrA, while activating a core set of general starvation-stress genes that respond to carbon starvation, osmotic stress, and exposure to heavy metals. Comparison of the responses of swarmer cells and stalked cells to carbon starvation revealed four groups of genes with distinct expression profiles. In addition, cell pole morphogenesis and the initiation of chromosome replication, which normally occur at the swarmer-to-stalked cell transition, are uncoupled in carbon-starved cells.