62 research outputs found

    Targeted Temperature Management Using Esophageal Cooling

    Although specific temperature targets are debated, targeted temperature management (TTM) is a common treatment for post-cardiac arrest patients. However, consistently implementing a TTM protocol is challenging, especially in a community hospital. Often, the protocols described in the literature include labor- and cost-intensive methods that are not feasible or sustainable in many health care settings. Esophageal temperature management (ETM) is a TTM method that can be easily utilized alone or combined with surface methods. We sought to evaluate ETM in a cohort of patients treated with TTM after cardiac arrest. Chart reviews were conducted of all patients treated with ETM after cardiac arrest at our community medical center. Initial patient temperature, time to target, supplemental methods (water blankets, chest wraps, or head wraps), and patient survival were extracted for analysis. A total of 54 patients were treated from August 2016 to November 2018; 30 received ETM only, 22 received supplemental cooling, and 2 had treatment discontinued before reaching target due to recovery. Target temperatures ranged from 32°C to 36°C, depending on provider preference. The median time to target temperature for the entire cohort was 219 minutes (interquartile range [IQR] 81-415). For the cohorts without and with supplemental cooling, the median time to attain target temperature was 128 minutes (IQR 71-334) and 285 minutes (IQR 204-660), respectively. Survival to intensive care unit discharge was 51.9% for the entire cohort. Survivors exhibited longer times to achieve goal temperature (median 180 minutes in nonsurvivors vs. 255 minutes in survivors). ETM attains target temperature at a rate consistent with current guidelines and with similar performance to alternative modalities. As in other studies, surviving patients required longer times to reach target temperature.
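The cohort statistics above (median time to target with an interquartile range) can be reproduced from raw timing data with a short helper; the times below are illustrative values, not the study's data.

```python
import statistics

def median_iqr(minutes):
    """Return (median, (q1, q3)) for a list of time-to-target values in minutes."""
    q = statistics.quantiles(sorted(minutes), n=4)  # quartiles q[0]=Q1, q[2]=Q3
    return statistics.median(minutes), (q[0], q[2])

# Hypothetical times for a small cohort (illustrative only).
times = [71, 81, 128, 219, 285, 334, 415, 660]
med, (q1, q3) = median_iqr(times)
print(f"median {med:.0f} min (IQR {q1:.0f}-{q3:.0f})")
```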

    Thermally-aware composite run-time CPU power models

    No full text
    Accurate and stable CPU power modelling is fundamental in modern system-on-chips (SoCs) for two main reasons: 1) it enables significant online energy savings by providing a run-time manager with reliable power consumption data for controlling CPU energy-saving techniques; 2) the models can be used as accurate and trusted references for system design and exploration. We begin by showing the limitations in typical performance monitoring counter (PMC) based power modelling approaches and illustrate how an improved model formulation results in a more stable model that efficiently captures relationships between the input variables and the power consumption. Using this as a solid foundation, we present a methodology for adding thermal-awareness and analytically decomposing the power into its constituent parts. We develop and validate our methodology using data recorded from a quad-core ARM Cortex-A15 mobile CPU and achieve an average prediction error of 3.7% across 39 diverse workloads, 8 Dynamic Voltage-Frequency Scaling (DVFS) levels, and CPU temperatures ranging from 31°C to 91°C. Moreover, we measure the effect of switching cores offline and decompose the existing power model to estimate the static power of each CPU and the L2 cache, the dynamic power due to constant background (BG) switching, and the dynamic power caused by the activity of each CPU individually. Finally, we provide our model equations and software tools for implementation in a run-time manager or for use with an architectural simulator, such as gem5.
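The general shape of such a composite model can be sketched as a linear regression over physically motivated basis terms: a temperature-dependent static term plus voltage/frequency-scaled dynamic terms driven by PMC event rates. This is a minimal illustration of the modelling approach, assuming an exponential leakage term and two PMC inputs; it is not the paper's exact formulation or coefficients.

```python
import numpy as np

# Sketch of a thermally-aware composite power model (assumed form):
#   P = a0 * V * exp(k*T)  +  a1 * V^2 * f  +  sum_i a_i * PMC_i * V^2 * f
# i.e. static leakage (temperature-dependent), background switching,
# and per-event-type dynamic power.

rng = np.random.default_rng(0)
n = 200
V = rng.uniform(0.9, 1.2, n)          # supply voltage (V)
f = rng.uniform(0.6, 2.0, n)          # clock frequency (GHz)
T = rng.uniform(31.0, 91.0, n)        # die temperature (degrees C)
pmc = rng.uniform(0.0, 1.0, (n, 2))   # normalised PMC event rates

# Synthetic "measured" power generated from known coefficients.
P = 0.3 * V * np.exp(0.02 * T) + 0.5 * V**2 * f + (pmc @ [0.8, 0.4]) * V**2 * f

# Fit the coefficients by linear least squares over the basis terms.
X = np.column_stack([V * np.exp(0.02 * T), V**2 * f,
                     pmc[:, 0] * V**2 * f, pmc[:, 1] * V**2 * f])
coef, *_ = np.linalg.lstsq(X, P, rcond=None)
print(coef)  # recovers ~[0.3, 0.5, 0.8, 0.4]
```

Because each fitted coefficient multiplies a distinct physical term, the model can be decomposed after fitting: the first term estimates static leakage power, the second the constant background switching power, and the rest the per-event dynamic power, mirroring the decomposition described above.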

    Elevated Intraocular Pressure After Intravitreal Steroid Injection in Diabetic Macular Edema: Monitoring and Management

    INTRODUCTION: With the increasing use of intravitreal administration of corticosteroids in macular edema, steroid-induced intraocular pressure (IOP) rise is becoming an emergent issue. However, for patients in whom intravitreal steroids are indicated, there are no specific recommendations for IOP monitoring and management after intravitreal administration of corticosteroids. METHOD: An expert panel of European ophthalmologists reviewed evidence on corticosteroid-induced IOP elevation. The objective of the panel was to propose an algorithm, based on the available literature and their own experience, for the monitoring and management of corticosteroid-induced IOP elevation, with a focus on diabetic patients. RESULTS: Data from trials including diabetic patients with a rise of IOP after intravitreal steroid administration indicate that IOP-lowering medical treatment is sufficient for a large majority of patients; only a small percentage underwent laser trabeculoplasty or filtration surgery. A 2-step algorithm is proposed that is based on the basal value of IOP and evidence for glaucoma. The first step is risk stratification before treatment. Patients normotensive at baseline (IOP ≤ 21 mmHg) do not require additional baseline diagnostic tests. However, patients with baseline ocular hypertension (OHT) (IOP > 21 mmHg) should undergo baseline imaging and visual field testing. The second step describes monitoring and treatment after steroid administration. During follow-up, patients developing OHT should have baseline and periodic imaging and visual field testing; IOP-lowering treatment is proposed only if IOP is >25 mmHg or if diagnostic tests suggest developing glaucoma. CONCLUSION: The management and follow-up of OHT following intravitreal corticosteroid injection is similar to that of primary OHT. If OHT develops, IOP is controlled in a large proportion of patients with standard IOP treatments. The present algorithm was developed to assist ophthalmologists with guiding principles in the management of corticosteroid-induced IOP elevation. FUNDING: Alimera Sciences Limited.
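The two-step logic described in the abstract can be expressed as a small decision routine. This is a simplified sketch using only the thresholds stated above (21 mmHg for OHT, 25 mmHg for treatment); it is illustrative, not a complete clinical protocol, and the function name and return format are invented for this example.

```python
def iop_plan(baseline_iop, followup_iop, glaucoma_signs=False):
    """Sketch of the two-step monitoring algorithm (thresholds from the
    abstract; simplified and not intended as clinical guidance)."""
    steps = []
    # Step 1: risk stratification before steroid injection.
    if baseline_iop > 21:  # baseline ocular hypertension (OHT)
        steps.append("baseline imaging and visual field testing (OHT)")
    # Step 2: monitoring and treatment after steroid administration.
    if followup_iop > 21:  # OHT developing during follow-up
        steps.append("periodic imaging and visual field testing")
    if followup_iop > 25 or glaucoma_signs:
        steps.append("start IOP-lowering treatment")
    return steps

# A normotensive patient at baseline whose IOP rises to 27 mmHg after injection:
print(iop_plan(18, 27))
```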

    Receptor Density-Dependent Motility of Influenza Virus Particles on Surface Gradients

    Influenza viruses can move across the surface of host cells while interacting with their glycocalyx. This motility may assist in finding or forming locations for cell entry and thereby promote cellular uptake. Because the binding to and cleavage of cell surface receptors forms the driving force for the process, the surface-bound motility of influenza is expected to be dependent on the receptor density. Surface gradients with gradually varying receptor densities are thus a valuable tool to study binding and motility processes of influenza and can function as a mimic for local receptor density variations at the glycocalyx that may steer the directionality of a virus particle in finding the proper site of uptake. We have tracked individual influenza virus particles moving over surfaces with receptor density gradients. We analyzed the extracted virus tracks first at a general level to verify neuraminidase activity and subsequently with increasing detail to quantify the receptor density-dependent behavior on the level of individual virus particles. While a directional bias was not observed, most likely due to limitations of the steepness of the surface gradient, the surface mobility and the probability of sticking were found to be significantly dependent on receptor density. A combination of high surface mobility and high dissociation probability of influenza was observed at low receptor densities, while the opposite occurred at higher receptor densities. These properties result in an effective mechanism for finding high-receptor density patches, which are believed to be a key feature of potential locations for cell entry.
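Quantifying "surface mobility" from single-particle tracks is commonly done via the mean squared displacement (MSD) as a function of time lag. The snippet below is a generic single-particle-tracking sketch, not the authors' analysis pipeline, using a hypothetical drifting track for illustration.

```python
import numpy as np

def msd(track):
    """Mean squared displacement of a 2-D particle track.

    track: sequence of (x, y) positions, one per frame.
    Returns an array of MSD values for time lags 1..len(track)-1.
    """
    track = np.asarray(track, dtype=float)
    lags = range(1, len(track))
    return np.array([np.mean(np.sum((track[lag:] - track[:-lag]) ** 2, axis=1))
                     for lag in lags])

# Hypothetical track: a particle drifting one unit per frame along x.
track = [(i, 0.0) for i in range(6)]
print(msd(track))  # quadratic growth [1, 4, 9, 16, 25], characteristic of drift
```

For a diffusing particle the MSD grows linearly with lag, while directed (gradient-steered) motion shows the quadratic growth seen here; comparing MSD curves at different positions along a receptor gradient is one way to test for density-dependent mobility.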


    Performing meta-analysis with incomplete statistical information in clinical trials

    Get PDF
    BACKGROUND: Results from clinical trials are usually summarized in the form of sampling distributions. When full information (mean, SEM) about these distributions is given, performing meta-analysis is straightforward. However, when some of the sampling distributions only have mean values, a challenging issue is to decide how to use such distributions in meta-analysis. Currently, the most common approaches are either ignoring such trials or, for each trial with a missing SEM, finding a similar trial and taking its SEM value as the missing SEM. Both approaches have drawbacks. As an alternative, this paper develops and tests two new methods, the prognostic method and the interval method, to estimate any missing SEMs from a set of sampling distributions with full information. A merging method is also proposed to handle clinical trials with partial information to simulate meta-analysis. METHODS: Both of our methods use the assumption that the samples for which the sampling distributions will be merged are randomly selected from the same population. In the prognostic method, we predict the missing SEMs from the given SEMs. In the interval method, we define intervals that we believe will contain the missing SEMs and then use these intervals in the merging process. RESULTS: Two sets of clinical trials are used to verify our methods. One family of trials compares different drugs for reduction of low density lipoprotein cholesterol (LDL) in Type-2 diabetes, and the other concerns the effectiveness of drugs for lowering intraocular pressure (IOP). Both methods are shown to be useful for approximating the conventional meta-analysis including trials with incomplete information. For example, the meta-analysis result of Latanoprost versus Timolol on IOP reduction for six months provided in [1] was 5.05 ± 1.15 (Mean ± SEM) with full information. If the last trial in this study is assumed to have only partial information, the traditional analysis method that ignores this trial would give 6.49 ± 1.36, while our prognostic method gives 5.02 ± 1.15, and our interval method provides two intervals, Mean ∈ [4.25, 5.63] and SEM ∈ [1.01, 1.24]. CONCLUSION: Both the prognostic and the interval methods are useful alternatives for dealing with missing data in meta-analysis. We recommend that clinicians use the prognostic method to predict the missing SEMs in order to perform meta-analysis, and the interval method for obtaining a more cautious result.
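For context, the baseline merging step that both methods build on is a standard fixed-effect, inverse-variance pooling of (mean, SEM) pairs. The sketch below shows that conventional step only, not the prognostic or interval methods themselves, and the trial numbers are hypothetical, not the LDL or IOP data.

```python
import math

def merge(means, sems):
    """Fixed-effect inverse-variance meta-analysis of (mean, SEM) pairs.

    Each trial is weighted by 1/SEM^2; the pooled SEM is sqrt(1/sum(weights)).
    """
    w = [1.0 / s ** 2 for s in sems]
    pooled_mean = sum(wi * m for wi, m in zip(w, means)) / sum(w)
    pooled_sem = math.sqrt(1.0 / sum(w))
    return pooled_mean, pooled_sem

# Hypothetical trials (illustrative numbers only).
print(merge([5.0, 5.5, 4.8], [1.2, 0.9, 1.5]))
```

A trial with a missing SEM simply cannot be weighted here, which is exactly the gap the prognostic method (predicting the missing SEM) and the interval method (bounding it) are designed to fill.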

    High-Speed Single-Event Current Transient Measurements in SiGe HBTs

    Time-resolved ion beam induced charge measurements reveal the heavy-ion response of the IBM 5AM SiGe HBT: 1) position correlation of the transients; 2) distinct responses for different bias schemes; 3) similarities to two-photon absorption (TPA) pulsed-laser data. Heavy-ion broad-beam transients provide a more realistic device response: 1) feedback using microbeam data; 2) overcoming the microbeam's existing limitations in LET and ion range. Both micro- and broad-beam data sets yield valuable input for TCAD simulations, uncovering detailed mechanisms for SiGe HBTs and other devices fabricated on lightly doped substrates.