
    Investigating the variability in pressure–volume relationships during hemorrhage and aortic occlusion

    Introduction: The pressure–volume (P-V) relationships of the left ventricle are the classical benchmark for studying cardiac mechanics and pumping function. Perturbations in the P-V relationship (or P-V loop) can be informative and guide the management of heart failure, hypovolemia, and aortic occlusion. Traditionally, P-V loop analyses have been limited to a single-beat P-V loop or an average of consecutive P-V loops (e.g., 10 cardiac cycles). While several algorithms exist to obtain single-beat estimations of the end-systolic and end-diastolic pressure–volume relations (ESPVR and EDPVR, respectively), there remains a need to better evaluate variations in P-V relationships longitudinally over time. This is particularly important when studying acute and transient hemodynamic and cardiac events, such as active hemorrhage or aortic occlusion. In this study, we investigate the variability in P-V relationships during hemorrhagic shock and aortic occlusion by leveraging a previously published porcine hemorrhage model.
    Methods: Briefly, swine were instrumented with a P-V catheter in the left ventricle and underwent a 25% total blood volume hemorrhage over 30 min, followed by 45 min of either Zone 1 complete aortic occlusion (REBOA), Zone 1 endovascular variable aortic control (EVAC), or no occlusion as a control. Preload-independent metrics of cardiac performance were obtained at predetermined time points by performing inferior vena cava occlusion during a ventilatory pause. Continuous P-V loop data and other hemodynamic flow and pressure measurements were collected in real time using a multi-channel data acquisition system.
    Results: We developed a custom algorithm to quantify the time-dependent variance in both load-dependent and load-independent cardiac parameters from each P-V loop. As expected, all pigs displayed a significant decrease in end-systolic pressures and volumes (ESP, ESV) after hemorrhage. The variability in response to hemorrhage was consistent across all three groups. However, upon introduction of REBOA, we observed significantly higher variability in both load-dependent and load-independent cardiac metrics such as ESP, ESV, and the slope of the ESPVR (Ees). For instance, pigs receiving REBOA experienced a 342% increase in ESP from hemorrhage, while pigs receiving EVAC experienced only a 188% increase. The variability within the EVAC group was consistently lower than that of the REBOA group, which suggests that EVAC may be more supportive of maintaining healthy cardiac performance than complete occlusion with REBOA.
    Discussion: In conclusion, we developed a novel algorithm to reliably quantify single-beat and longitudinal P-V relations during hemorrhage and aortic occlusion. As expected, hemorrhage resulted in smaller P-V loops, reflective of decreased preload and afterload conditions; cardiac output and heart rate, however, were preserved. The use of REBOA and EVAC for 45 min restored baseline afterload and preload conditions, but REBOA often drove pressures above baseline to an alarming level. The variability in response to REBOA was significant and could potentially be associated with cardiac injury. By quantifying each P-V loop, we were able to capture the variability in all P-V loops, including those irregular in shape, and believe this can help identify critical time points associated with declining cardiac performance during hemorrhage and REBOA use.
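The per-loop quantification described above can be sketched in a few lines. This is a simplified illustration under stated assumptions, not the authors' algorithm: `beat_metrics` approximates ESP/ESV from one beat's samples by taking the pressure maximum as a stand-in for true end-systole, and `rolling_variance` tracks beat-to-beat variance of a metric over a sliding window. Function names and the window size are hypothetical.

```python
import statistics

def beat_metrics(pressures, volumes):
    """Approximate end-systolic metrics for one cardiac cycle.

    ESP is taken as the maximum pressure in the beat and ESV as the
    volume at that sample -- a simplification of true end-systole.
    """
    i = max(range(len(pressures)), key=lambda k: pressures[k])
    return {"ESP": pressures[i], "ESV": volumes[i]}

def rolling_variance(values, window=10):
    """Variance of each sliding window of a per-beat series (e.g., ESP),
    giving a time-resolved view of beat-to-beat variability."""
    return [statistics.pvariance(values[i:i + window])
            for i in range(len(values) - window + 1)]
```

Applied to a per-beat ESP series, windows spanning the onset of occlusion would show the variance spikes the study describes.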

    Damage control operations in non-trauma patients: defining criteria for the staged rapid source control laparotomy in emergency general surgery

    Background: The staged laparotomy in the operative management of emergency general surgery (EGS) patients is an extension of trauma surgeons' practice to this population. Indications for its application, however, are not well defined and are currently based on the lethal triad used in physiologically decompensated trauma patients. This study sought to determine the acute indications for the staged, rapid source control laparotomy (RSCL) in EGS patients.
    Methods: All EGS patients undergoing emergent staged RSCL and non-RSCL over 3 years were studied. Demographics, physiologic parameters, perioperative variables, outcomes, and survival were compared. Logistic regression models determined the influence of physiologic parameters on mortality and postoperative complications. EGS-RSCL indications were defined.
    Results: 215 EGS patients underwent emergent laparotomy; 53 (25%) were staged RSCL. In the 53 patients who underwent a staged RSCL based on the lethal triad, adjusted multivariable regression analysis showed that, when used alone, no component of the lethal triad independently improved survival. Staged RSCL may decrease mortality in patients with preoperative severe sepsis/septic shock, elevated lactate (≥3), acidosis (pH ≤7.25), age ≥70 years, male gender, and multiple comorbidities (≥3). Of the 162 non-RSCL emergent laparotomies, 27 (17%) required unplanned re-explorations; of these, 17 (63%) had sepsis preoperatively and 9 (33%) died.
    Conclusions: The acute physiologic indicators that help guide operative decisions in trauma may not confer a similar survival advantage in EGS. To replace the lethal triad, criteria for application of the staged RSCL in EGS need to be defined. Based on these results, the indications should include severe sepsis/septic shock, lactate, acidosis, gender, age, and pre-existing comorbidities. When correctly applied, the staged RSCL may help improve survival in decompensated EGS patients.
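The mortality associations above come from multivariable logistic regression on registry data. As a minimal illustration of the underlying effect measure only, the sketch below computes an unadjusted odds ratio with a 95% Wald confidence interval from a 2x2 table; the counts are hypothetical and this does not reproduce the study's adjusted models.

```python
import math

def odds_ratio(a, b, c, d):
    """Unadjusted odds ratio and 95% Wald CI from a 2x2 table.

    a: exposed & died, b: exposed & survived,
    c: unexposed & died, d: unexposed & survived.
    """
    or_ = (a * d) / (b * c)
    se = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)  # SE of log(OR)
    lo = math.exp(math.log(or_) - 1.96 * se)
    hi = math.exp(math.log(or_) + 1.96 * se)
    return or_, lo, hi
```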

    Cost-Effectiveness Analysis of Diagnostic Options for Pneumocystis Pneumonia (PCP)

    Diagnosis of Pneumocystis jirovecii pneumonia (PCP) is challenging, particularly in developing countries. Highly sensitive diagnostic methods are costly, while less expensive methods often lack sensitivity or specificity. Cost-effectiveness comparisons of the various diagnostic options have not been presented. We compared cost-effectiveness, as measured by cost per life-year gained and the proportion of patients successfully diagnosed and treated, of 33 PCP diagnostic options, involving combinations of specimen collection methods [oral washes, induced and expectorated sputum, and bronchoalveolar lavage (BAL)] and laboratory diagnostic procedures [various staining procedures or polymerase chain reactions (PCR)], or clinical diagnosis with chest x-ray alone. Our analyses were conducted from the perspective of the government payer among ambulatory, HIV-infected patients with symptoms of pneumonia presenting to HIV clinics and hospitals in South Africa. Costing data were obtained from the National Institute for Communicable Diseases in South Africa. At 50% disease prevalence, diagnostic procedures involving expectorated sputum with any PCR method, or induced sputum with nested or real-time PCR, were all highly cost-effective, successfully treating 77-90% of patients at $26-51 per life-year gained. Procedures using BAL specimens were significantly more expensive without added benefit, successfully treating 68-90% of patients at costs of $189-232 per life-year gained. A relatively cost-effective diagnostic procedure that did not require PCR was Toluidine Blue O staining of induced sputum ($25 per life-year gained, successfully treating 68% of patients). Diagnosis using chest x-rays alone resulted in successful treatment of 77% of patients, though cost-effectiveness was reduced ($109 per life-year gained) compared with several molecular diagnostic options. For diagnosis of PCP, use of PCR technologies combined with less-invasive patient specimens such as expectorated or induced sputum represents a more cost-effective option than any diagnostic procedure using BAL, or than chest x-ray alone.
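The comparisons above reduce to cost-effectiveness ratios (cost per life-year gained) and, between strategies, incremental ratios. A minimal sketch of both quantities, with illustrative numbers rather than the study's figures:

```python
def cer(total_cost, life_years_gained):
    """Cost-effectiveness ratio: cost per life-year gained."""
    return total_cost / life_years_gained

def icer(cost_new, effect_new, cost_old, effect_old):
    """Incremental cost-effectiveness ratio: extra cost per extra
    life-year when switching from the old strategy to the new one."""
    return (cost_new - cost_old) / (effect_new - effect_old)
```

A strategy can have an attractive CER yet a poor ICER against a cheaper alternative, which is why the study weighs BAL-based options against sputum-based ones rather than judging each in isolation.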

    The Princeton Protein Orthology Database (P-POD): A Comparative Genomics Analysis Tool for Biologists

    Many biological databases that provide comparative genomics information and tools are now available on the internet. While these are certainly useful, to our knowledge none of the existing databases combines results from multiple comparative genomics methods with manually curated information from the literature. Here we describe the Princeton Protein Orthology Database (P-POD, http://ortholog.princeton.edu), a user-friendly database system that allows users to find and visualize the phylogenetic relationships among predicted orthologs (based on the OrthoMCL method) of a query gene from any of eight eukaryotic organisms, and to see the orthologs in a wider evolutionary context (based on the Jaccard clustering method). In addition to the phylogenetic information, the database contains experimental results manually collected from the literature that can be compared to the computational analyses, as well as links to relevant human disease and gene information via OMIM, model organism, and sequence databases. Our aim is for the P-POD resource to be useful to experimental biologists wanting to learn more about the evolutionary context of their favorite genes. P-POD is based on the commonly used Generic Model Organism Database (GMOD) schema and can be downloaded in its entirety for installation on one's own system. Thus, bioinformaticians and software developers may also find P-POD useful, since they can build on the P-POD database infrastructure when developing their own comparative genomics resources and database tools.
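Jaccard clustering, which P-POD uses for the wider evolutionary view, groups items by set overlap. As a minimal illustration of the underlying similarity measure only (not P-POD's implementation), applied here to hypothetical sets of gene identifiers:

```python
def jaccard(a, b):
    """Jaccard similarity between two collections of identifiers:
    |intersection| / |union|, in [0, 1]."""
    a, b = set(a), set(b)
    return len(a & b) / len(a | b) if (a or b) else 0.0
```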

    Marine Tar Residues: a Review

    Marine tar residues originate from natural and anthropogenic oil releases into the ocean environment and are formed after liquid petroleum is transformed by weathering, sedimentation, and other processes. Tar balls, tar mats, and tar patties are common examples of marine tar residues and can range in size from millimeters in diameter (tar balls) to several meters in length and width (tar mats). These residues can remain in the ocean environment indefinitely, decomposing or becoming buried in the sea floor. In many cases, however, they are transported ashore via currents and waves, where they pose a concern to coastal recreation activities and the seafood industry and may have negative effects on wildlife. This review summarizes the current state of knowledge on marine tar residue formation, transport, degradation, and distribution. Methods of detection and removal of marine tar residues and their possible ecological effects are discussed, in addition to topics of marine tar research that warrant further investigation. Emphasis is placed on benthic tar residues, with a focus on the remnants of the Deepwater Horizon oil spill in particular, which are still affecting the northern Gulf of Mexico shores years after the leaking submarine well was capped.

    The evolving SARS-CoV-2 epidemic in Africa: Insights from rapidly expanding genomic surveillance

    INTRODUCTION Investment in Africa over the past year with regard to severe acute respiratory syndrome coronavirus 2 (SARS-CoV-2) sequencing has led to a massive increase in the number of sequences, which, to date, exceeds 100,000 sequences generated to track the pandemic on the continent. These sequences have profoundly affected how public health officials in Africa have navigated the COVID-19 pandemic.
    RATIONALE We demonstrate how the first 100,000 SARS-CoV-2 sequences from Africa have helped monitor the epidemic on the continent, how genomic surveillance expanded over the course of the pandemic, and how we adapted our sequencing methods to deal with an evolving virus. Finally, we also examine how viral lineages have spread across the continent in a phylogeographic framework to gain insights into the underlying temporal and spatial transmission dynamics for several variants of concern (VOCs).
    RESULTS Our results indicate that the number of countries in Africa that can sequence the virus within their own borders is growing and that this is coupled with a shorter turnaround time from the time of sampling to sequence submission. Ongoing evolution necessitated the continual updating of primer sets, and, as a result, eight primer sets were designed in tandem with viral evolution and used to ensure effective sequencing of the virus. The pandemic unfolded through multiple waves of infection that were each driven by distinct genetic lineages, with B.1-like ancestral strains associated with the first pandemic wave of infections in 2020. Successive waves on the continent were fueled by different VOCs, with Alpha and Beta cocirculating in distinct spatial patterns during the second wave and Delta and Omicron affecting the whole continent during the third and fourth waves, respectively. Phylogeographic reconstruction points toward distinct differences in viral importation and exportation patterns associated with the Alpha, Beta, Delta, and Omicron variants and subvariants, when considering both Africa versus the rest of the world and viral dissemination within the continent. Our epidemiological and phylogenetic inferences therefore underscore the heterogeneous nature of the pandemic on the continent and highlight key insights and challenges, for instance, recognizing the limitations of low testing proportions. We also highlight the early warning capacity that genomic surveillance in Africa has had for the rest of the world with the detection of new lineages and variants, the most recent being the characterization of various Omicron subvariants.
    CONCLUSION Sustained investment for diagnostics and genomic surveillance in Africa is needed as the virus continues to evolve. This is important not only to help combat SARS-CoV-2 on the continent but also because it can be used as a platform to help address the many emerging and reemerging infectious disease threats in Africa. In particular, capacity building for local sequencing within countries or within the continent should be prioritized because this is generally associated with shorter turnaround times, providing the most benefit to local public health authorities tasked with pandemic response and mitigation and allowing for the fastest reaction to localized outbreaks. These investments are crucial for pandemic preparedness and response and will serve the health of the continent well into the 21st century.

    Hardware and software implementation of POCT1-A for integration of point of care testing in research

    Point of care testing (POCT) is increasingly utilized in clinical medicine. Small, portable testing devices can now deliver reliable and accurate diagnostic results during a patient encounter. With these increases in POCT, the issue of data and results management quickly emerges. Results need to be cataloged accurately and efficiently while providers and support staff are simultaneously managing patient encounters. The integration of electronic medical records (EMR) as data repositories requires that point of care testing data import automatically into the EMR. POCT1-A was developed as a standard communication language for POCT device manufacturers to streamline automatic data import integration. While all modern POCT devices are built with this connectivity, the systems that provide the integration layer are often proprietary and require a fee for service. In the research environment, there is not enough throughput to justify the practical investment in these data management architectures. Moreover, researchers' needs differ from those addressed by clinical data management systems. To meet this need, we developed a novel hardware and software connectivity solution using commercially available components to automate data management from a point-of-care blood biochemical analyzer during a critical care study in the preclinical research environment.
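POCT1-A defines XML-based device messages, so the core task of any integration layer is extracting observations from those messages. The sketch below parses a simplified, hypothetical observation payload; real POCT1-A messages follow the CLSI schema and carry considerably more structure (device, patient, and operator identifiers, timestamps, status codes), so the element names here are illustrative only.

```python
import xml.etree.ElementTree as ET

# Hypothetical, simplified payload -- not the actual POCT1-A schema.
SAMPLE = """
<Message>
  <Observation>
    <Analyte>Lactate</Analyte>
    <Value units="mmol/L">2.4</Value>
  </Observation>
  <Observation>
    <Analyte>pH</Analyte>
    <Value units="">7.31</Value>
  </Observation>
</Message>
"""

def parse_observations(xml_text):
    """Extract (analyte, numeric value, units) tuples from a message."""
    root = ET.fromstring(xml_text)
    results = []
    for obs in root.findall("Observation"):
        analyte = obs.findtext("Analyte")
        value_el = obs.find("Value")
        results.append((analyte, float(value_el.text),
                        value_el.get("units", "")))
    return results
```

In a research pipeline like the one described, the extracted tuples would then be appended to a study database or flat file rather than an EMR.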

    Severely Elevated Blood Pressure and Early Mortality in Children with Traumatic Brain Injuries: The Neglected End of the Spectrum

    Introduction: In adults with traumatic brain injuries (TBI), hypotension and hypertension at presentation are associated with mortality. The effect of age-adjusted blood pressure in children with TBI has been insufficiently studied. We sought to determine if age-adjusted hypertension in children with severe TBI is associated with mortality. Methods: This was a retrospective analysis of the Department of Defense Trauma Registry (DoDTR) between 2001 and 2013. We included children with TBI for analysis; hypotension was defined as a systolic blood pressure < 90 mmHg for children > 10 years or < 70 mmHg + (2 × age) for children ≤ 10 years. We performed multivariable logistic regression and Cox regression to determine if blood pressure categories were associated with mortality. Results: Of 4,990 children included in the DoDTR, 740 met criteria for analysis. Fifty patients (6.8%) were hypotensive upon arrival to the ED, 385 (52.0%) were normotensive, 115 (15.5%) had moderate hypertension, and 190 (25.7%) had severe hypertension. When compared to normotensive patients, moderate and severe hypertension patients had similar Injury Severity Scores, similar AIS head scores, and similar frequencies of neurosurgical procedures. Multivariable logistic regression demonstrated that hypotension (odds ratio [OR] 2.85, 95% confidence interval [CI] 1.26–6.47) and severe hypertension (OR 2.58, 95% CI 1.32–5.03) were associated with increased 24-hour mortality. Neither hypotension (hazard ratio [HR] 1.52, 95% CI 0.74–3.11) nor severe hypertension (HR 1.65, 95% CI 0.65–2.30) was associated with time to mortality. Conclusion: Pediatric age-adjusted hypertension is frequent after severe TBI. Severe hypertension is strongly associated with 24-hour mortality. Pediatric age-adjusted blood pressure needs to be further evaluated as a critical marker of early mortality.
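The age-adjusted hypotension formula referenced above (70 mmHg + 2 × age for children ≤10 years) can be expressed directly. The 90 mmHg floor used here for older children is the conventional pediatric cutoff and is an assumption of this sketch, not a figure taken from the abstract:

```python
def hypotension_threshold(age_years):
    """Systolic BP cutoff (mmHg) below which a child is hypotensive:
    70 + 2 * age for ages <= 10; 90 mmHg assumed for older children."""
    return 70 + 2 * age_years if age_years <= 10 else 90
```

Note the two branches agree at age 10 (70 + 20 = 90), so the cutoff is continuous across the boundary.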