Devastating Brain Injuries: Assessment and Management Part I: Overview of Brain Death
“To the world you may be one person, but to one person you may be the world.”
Protease Activity Increases in Plasma, Peritoneal Fluid, and Vital Organs after Hemorrhagic Shock in Rats
Hemorrhagic shock (HS) is associated with high mortality. A severe decrease in blood pressure causes the intestine, a major site of digestive enzymes, to become permeable, possibly releasing those enzymes into the circulation and peritoneal space, where they may in turn activate other enzymes, e.g., matrix metalloproteinases (MMPs). If uncontrolled, these enzymes may cause pathophysiologic cleavage of receptors or plasma proteins. Our first objective was to determine protease activities and select protease concentrations in compartments outside the intestine (plasma, peritoneal fluid, brain, heart, liver, and lung) after hemorrhagic shock (2 hours ischemia, 2 hours reperfusion). Our second objective was to determine whether inhibition of proteases in the intestinal lumen with a serine protease inhibitor (ANGD), a treatment that improves survival after shock in rats, reduces protease activities distant from the intestine. To determine protease activity, plasma and peritoneal fluid were incubated with small peptide substrates for trypsin-, chymotrypsin-, and elastase-like activities, or with casein, a substrate cleaved by multiple proteases. Gelatinase activities were determined by gelatin gel zymography and a specific MMP-9 substrate. Immunoblotting was used to confirm elevated pancreatic trypsin in plasma, peritoneal fluid, and lung, and elevated MMP-9 concentrations in all samples after hemorrhagic shock. Caseinolytic, trypsin-, chymotrypsin-, elastase-like, and MMP-9 activities were all significantly (p<0.05) upregulated after hemorrhagic shock regardless of enteral pretreatment with ANGD. Pancreatic trypsin was detected by immunoblot in the plasma, peritoneal space, and lungs after hemorrhagic shock. MMP-9 concentrations and activities were significantly upregulated after hemorrhagic shock in plasma, peritoneal fluid, heart, liver, and lung.
These results indicate that protease activities, including that of trypsin, increase at sites distant from the intestine after hemorrhagic shock. Proteases, including pancreatic proteases, may thus be mediators of shock and potential targets for therapy.
Cell Type Mediated Resistance of Vesicular Stomatitis Virus and Sendai Virus to Ribavirin
Ribavirin (RBV) is a synthetic nucleoside analog with broad-spectrum antiviral activity. Although RBV is approved for the treatment of hepatitis C virus, respiratory syncytial virus, and Lassa fever virus infections, its mechanism of action and therapeutic efficacy remain highly controversial. Recent reports show that the development of cell-based resistance after continuous RBV treatment, via decreased RBV uptake, can greatly limit its efficacy. Here, we examined whether certain cell types are naturally resistant to RBV even without prior drug exposure. Seven different cell lines from various host species were compared for RBV antiviral activity against two nonsegmented negative-strand RNA viruses, vesicular stomatitis virus (VSV, a rhabdovirus) and Sendai virus (SeV, a paramyxovirus). Our results show striking differences between cell types in their response to RBV, ranging from virtually no antiviral effect to very effective inhibition of viral replication. Despite differences in viral replication kinetics for VSV and SeV in the seven cell lines, the observed pattern of RBV resistance was very similar for both viruses, suggesting that cellular rather than viral determinants play a major role in this resistance. While none of the tested cell lines was defective in RBV uptake, dramatic variations were observed in the long-term accumulation of RBV in different cell types, and this accumulation correlated with the antiviral efficacy of RBV. While addition of guanosine neutralized RBV only in cells already highly resistant to RBV, actinomycin D almost completely reversed the RBV effect (but not uptake) in all cell lines. Together, our data suggest that RBV may inhibit the same virus via different mechanisms in different cell types, depending on the intracellular RBV metabolism. Our results underscore the importance of using multiple cell lines of different origin when antiviral efficacy and potency are examined in vitro, for new as well as established drugs.
Machine Learning Prediction of Liver Allograft Utilization From Deceased Organ Donors Using the National Donor Management Goals Registry.
Early prediction of whether a liver allograft will be utilized for transplantation may allow better resource deployment during donor management and improve organ allocation. The national donor management goals (DMG) registry contains critical care data collected during donor management. We developed a machine learning model to predict transplantation of a liver graft based on data from the DMG registry. Methods: Several machine learning classifiers were trained to predict transplantation of a liver graft. We utilized 127 variables available in the DMG dataset. We included data from potential deceased organ donors between April 2012 and January 2019. The outcome was defined as liver recovery for transplantation in the operating room. The prediction was made based on data available 12-18 h after the time of authorization for transplantation. The data were randomly separated into training (60%), validation (20%), and test (20%) sets. We compared the performance of our models to the Liver Discard Risk Index. Results: Of 13,629 donors in the dataset, 9255 (68%) livers were recovered and transplanted, 1519 were recovered but used for research or discarded, and 2855 were not recovered. The optimized gradient boosting machine classifier achieved an area under the receiver operating characteristic curve of 0.84 on the test set, outperforming all other classifiers. Conclusions: This model predicts successful liver recovery for transplantation in the operating room, using data available early during donor management. It performs favorably when compared to existing models. It may provide real-time decision support during organ donor management and transplant logistics.
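The pipeline this abstract describes (a gradient boosting classifier trained on registry variables, a 60/20/20 train/validation/test split, and evaluation by area under the ROC curve) can be sketched as follows. This is a minimal illustration, not the authors' code: the DMG registry is not public, so synthetic features stand in for the 127 registry variables, and scikit-learn's `GradientBoostingClassifier` stands in for whatever gradient boosting implementation was used.

```python
# Hedged sketch of the abstract's modeling pipeline on synthetic data.
# Assumptions: scikit-learn's GradientBoostingClassifier as the model;
# random features standing in for the (non-public) DMG registry variables.
import numpy as np
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
n, p = 2000, 20                      # stand-ins for 13,629 donors / 127 variables
X = rng.normal(size=(n, p))
signal = X[:, :5].sum(axis=1)        # a few informative features drive the outcome
y = (signal + rng.normal(size=n) > 0).astype(int)

# 60/20/20 train/validation/test split, as described in the abstract
X_train, X_tmp, y_train, y_tmp = train_test_split(X, y, test_size=0.4, random_state=0)
X_val, X_test, y_val, y_test = train_test_split(X_tmp, y_tmp, test_size=0.5, random_state=0)

model = GradientBoostingClassifier(random_state=0).fit(X_train, y_train)

# Evaluate on the held-out test set by ROC AUC, the metric reported in the abstract
auc = roc_auc_score(y_test, model.predict_proba(X_test)[:, 1])
print(f"test ROC AUC: {auc:.2f}")
```

In practice the validation split would be used to tune hyperparameters (tree depth, learning rate, number of estimators) before the single final evaluation on the test set.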
Risk Factors for Traumatic Injury Findings on Thoracic Computed Tomography Among Patients With Blunt Trauma Having a Normal Chest Radiograph
Hypothesis: We sought to identify risk factors that might predict acute traumatic injury findings on thoracic computed tomography (TCT) among patients having a normal initial chest radiograph (CR). Design: In this retrospective analysis, Abbreviated Injury Score cutoffs were chosen to correspond with obvious physical examination findings. Multivariate logistic regression analysis was performed to identify risk factors predicting acute traumatic injury findings. Setting: Urban level I trauma center. Patients: All patients with blunt trauma having both CR and TCT between July 1, 2005, and June 30, 2007. Patients with abnormalities on their CR were excluded. Main Outcome Measure: Finding of any acute traumatic abnormality on TCT, despite a normal CR. Results: A total of 2435 patients with blunt trauma were identified; 1744 (71.6%) had a normal initial CR, and 394 (22.6%) of these had acute traumatic findings on TCT. Multivariate logistic regression demonstrated that an abdominal Abbreviated Injury Score of 3 or higher (P=.001; odds ratio, 2.6), a pelvic or extremity Abbreviated Injury Score of 2 or higher (P<.001; odds ratio, 2.0), age older than 30 years (P=.004; odds ratio, 1.4), and male sex (P=.04; odds ratio, 1.3) were significantly associated with traumatic findings on TCT. No aortic injuries were diagnosed in patients with a normal CR. Limiting TCT to patients with 1 or more risk factors predicting acute traumatic injury findings would have resulted in reduced radiation exposure and in a cost savings of almost $250,000 over the 2-year period. Limiting TCT to this degree would not have missed any clinically significant vertebral fractures or vascular injuries. Conclusion: Among patients with a normal screening CR, reserving TCT for older male patients with abdominal or extremity blunt trauma seems safe and cost-effective.
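The analysis step this abstract reports, multivariate logistic regression with odds ratios recovered as the exponentiated coefficients, can be sketched as below. This is not the study's code: the trauma registry is not public, so synthetic binary risk factors stand in for the four predictors, and the effect sizes are seeded with the odds ratios quoted in the abstract purely for illustration.

```python
# Hedged sketch: multivariate logistic regression on synthetic binary risk
# factors, recovering odds ratios as exp(coefficients). Assumptions: sklearn's
# LogisticRegression (large C to approximate an unpenalized fit); simulated
# data seeded with the abstract's reported odds ratios.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(1)
n = 1744  # patients with a normal initial chest radiograph

# columns: abdominal AIS >= 3, pelvic/extremity AIS >= 2, age > 30, male sex
X = rng.integers(0, 2, size=(n, 4)).astype(float)

# simulate outcomes using the reported odds ratios as the true effects
true_log_or = np.log([2.6, 2.0, 1.4, 1.3])
p_pos = 1.0 / (1.0 + np.exp(-(X @ true_log_or - 1.5)))
y = rng.binomial(1, p_pos)

model = LogisticRegression(C=1e6).fit(X, y)   # large C ~ unpenalized
odds_ratios = np.exp(model.coef_[0])           # OR_i = exp(beta_i)
print(odds_ratios.round(2))
```

With a sample of this size the fitted odds ratios should land near the seeded values, though sampling noise keeps them from matching exactly; in the actual study, confidence intervals and P values would accompany each odds ratio.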