74 research outputs found

    Localization Algorithm for Mobile Sensor Nodes Using 3D Space in Wireless Sensor Network

    Get PDF
    For the problem of wireless sensor network localization, few true three-dimensional (3D) methods have been developed that satisfy practical needs. In this work we propose a range-based 3D localization algorithm that is accurate, anchor-free, scalable, and yields physical positions. A novel combination of distance and direction measurement techniques is introduced to estimate ranges between neighbours. From this information, local coordinate systems are constructed and then merged to form a global, network-wide coordinate system, which finally yields the nodes' absolute positions. Simulation results show that our algorithm achieves a good trade-off between localization percentage and precision.
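
    A minimal sketch of the core geometric step such a method relies on: placing a neighbour in a node's local 3D frame from one measured range and one measured direction. The function and angle conventions below are illustrative assumptions, not taken from the paper.

        import math

        def relative_position(r, azimuth, elevation):
            """Locate a neighbour in this node's local 3D frame from a
            measured range r and a measured direction (azimuth, elevation),
            by converting spherical measurements to Cartesian coordinates."""
            x = r * math.cos(elevation) * math.cos(azimuth)
            y = r * math.cos(elevation) * math.sin(azimuth)
            z = r * math.sin(elevation)
            return (x, y, z)

        # Example: a neighbour 10 m away at 30 deg azimuth and 10 deg elevation.
        print(relative_position(10.0, math.radians(30), math.radians(10)))

    Repeating this for every neighbour gives each node a local coordinate system; aligning overlapping local systems (rotation plus translation) is what produces the global one.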

    Early events of Bacillus anthracis germination identified by time-course quantitative proteomics

    Full text link
    Germination of Bacillus anthracis spores involves rehydration of the spore interior and rapid degradation of several of the protective layers, including the spore coat. Here, we examine the temporal changes that occur during B. anthracis spore germination using an isobaric tagging system. Over the course of 17 min from the onset of germination, the levels of at least 19 spore proteins significantly decrease. Included are acid-soluble proteins, several known and predicted coat proteins, and proteins of unknown function. Over half of these proteins are small (less than 100 amino acids) and would have been undetectable by conventional gel-based analysis. We also identified 20 proteins whose levels modestly increased at the later time points, when metabolism has likely resumed. Taken together, our data show that isobaric labeling of complex mixtures is particularly effective for temporal studies. Furthermore, we describe a rigorous statistical approach to defining relevant changes that takes into account the nature of data obtained from multidimensional protein identification technology coupled with the use of isobaric tags. This study provides an expanded list of the proteins that may be involved in germination of the B. anthracis spore and their relative levels during germination.
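
    The abstract does not spell out the statistical method, so the following is only a rough, generic sketch of how proteins changing over a time course might be flagged from isobaric reporter-ion data: a per-protein test on replicate log ratios with a Benjamini-Hochberg style correction. All protein names and numbers here are invented.

        import numpy as np
        from scipy import stats

        # Hypothetical log2 reporter-ion ratios (later time point / t = 0)
        # from replicate measurements of each protein.
        log2_ratios = {
            "SASP-alpha": [-2.1, -1.8, -2.4],  # acid-soluble protein, degraded
            "CoatX":      [-1.2, -0.9, -1.5],  # putative coat protein
            "MetE":       [ 0.4,  0.6,  0.3],  # rises as metabolism resumes
        }

        results = []
        for protein, ratios in log2_ratios.items():
            # One-sample t-test of the replicate log ratios against 0 (no change).
            t, p = stats.ttest_1samp(ratios, popmean=0.0)
            results.append((protein, np.mean(ratios), p))

        # Simple Benjamini-Hochberg scaling across all tested proteins
        # (without the usual monotonicity step, for brevity).
        results.sort(key=lambda r: r[2])
        m = len(results)
        for rank, (protein, mean_lfc, p) in enumerate(results, start=1):
            q = p * m / rank
            print(f"{protein}: mean log2FC={mean_lfc:+.2f} p={p:.4g} BH q={q:.4g}")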

    Life's Simple 7 approach to atrial fibrillation prevention

    Get PDF
    Atrial fibrillation (AF) is the most commonly encountered arrhythmia in clinical practice. It constitutes a major public health problem, with total related annual expenses estimated at $6.65 billion. The American Heart Association (AHA) developed Life's Simple 7 (LS7) to define and monitor ideal cardiovascular health (CVH). In this review, we examine the role of the individual components of LS7 to provide further insight into the potential influence of achieving the AHA's strategic goal on AF prevention. While significant advances have been made in the secondary prevention of AF, little progress has been made in preventing the first occurrence of this arrhythmia in at-risk patients. Improvement of overall cardiovascular health as defined by LS7 may substantially reduce AF risk.
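
    For context, LS7 grades each of seven metrics (smoking, body mass index, physical activity, diet, total cholesterol, blood pressure, fasting glucose) as poor, intermediate, or ideal, and a common operationalisation sums these into a 0-14 point score. The sketch below illustrates that scoring convention; the category assignments are assumed examples, not taken from this review.

        # Points per LS7 metric category: 0 = poor, 1 = intermediate, 2 = ideal.
        POINTS = {"poor": 0, "intermediate": 1, "ideal": 2}

        LS7_METRICS = ("smoking", "bmi", "physical_activity", "diet",
                       "cholesterol", "blood_pressure", "glucose")

        def ls7_score(categories):
            """Sum points over the seven metrics; 0 (all poor) to 14 (all ideal)."""
            return sum(POINTS[categories[m]] for m in LS7_METRICS)

        patient = {
            "smoking": "ideal", "bmi": "intermediate", "physical_activity": "poor",
            "diet": "intermediate", "cholesterol": "ideal",
            "blood_pressure": "intermediate", "glucose": "ideal",
        }
        print(ls7_score(patient))  # prints 9 of a possible 14 points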

    COVID-19 Associated Mucormycosis: A Review of an Emergent Epidemic Fungal Infection in the Era of the COVID-19 Pandemic

    Get PDF
    At a time when the second wave of COVID-19 is still picking up in countries like India, a number of reports describe a potential association with a rise in the number of cases of mucormycosis, commonly known as the black fungus. This fungal infection has been around for centuries and affects people whose immunity has been compromised by severe health conditions. In this article, we provide a detailed overview of mucormycosis and discuss how COVID-19 could have caused a sudden spike in an otherwise rare disease in countries like India. The article discusses the various symptoms of the disease, the groups of people most vulnerable to this infection, preventive measures to avoid the disease, and the various treatments that exist in clinical practice and research to manage the disease.

    Use of Intravascular Imaging During Chronic Total Occlusion Percutaneous Coronary Intervention: Insights From a Contemporary Multicenter Registry

    Get PDF
    Background: Intravascular imaging can facilitate chronic total occlusion (CTO) percutaneous coronary intervention. Methods and Results: We examined the frequency of use and outcomes of intravascular imaging among 619 CTO percutaneous coronary interventions performed between 2012 and 2015 at 7 US centers. Mean age was 65.4±10 years and 85% of the patients were men. Intravascular imaging was used in 38%: intravascular ultrasound in 36%, optical coherence tomography in 3%, and both in 1.45%. Intravascular imaging was used for stent sizing (26.3%), stent optimization (38.0%), and CTO crossing (35.7%; antegrade in 27.9% and retrograde in 7.8%). Intravascular imaging to facilitate crossing was used more frequently in lesions with proximal cap ambiguity (49% versus 26%, P<0.0001) and in retrograde as compared with antegrade‐only cases (67% versus 31%, P<0.0001). Despite higher complexity (Japanese CTO score: 2.86±1.19 versus 2.43±1.19, P=0.001), cases in which imaging was used for crossing had similar technical and procedural success (92.8% versus 89.6%, P=0.302 and 90.1% versus 88.3%, P=0.588, respectively) and a similar incidence of major adverse cardiac events (2.7% versus 3.2%, P=0.772). Use of intravascular imaging was associated with longer procedure (192 minutes [interquartile range 130, 255] versus 131 minutes [90, 192], P<0.0001) and fluoroscopy (71 minutes [44, 93] versus 39 minutes [25, 69], P<0.0001) times. Conclusions: Intravascular imaging is frequently performed during CTO percutaneous coronary intervention, both for crossing and for stent selection/optimization. Despite its use in more complex lesion subsets, intravascular imaging was associated with similar rates of technical and procedural success for CTO percutaneous coronary intervention. Clinical Trial Registration URL: http://www.clinicaltrials.gov. Unique identifier: NCT02061436.

    Subsequent Event Risk in Individuals with Established Coronary Heart Disease: Design and Rationale of the GENIUS-CHD Consortium

    Get PDF
    BACKGROUND: The "GENetIcs of sUbSequent Coronary Heart Disease" (GENIUS-CHD) consortium was established to facilitate discovery and validation of genetic variants and biomarkers for risk of subsequent CHD events, in individuals with established CHD. METHODS: The consortium currently includes 57 studies from 18 countries, recruiting 185,614 participants with either acute coronary syndrome, stable CHD or a mixture of both at baseline. All studies collected biological samples and followed-up study participants prospectively for subsequent events. RESULTS: Enrollment into the individual studies took place between 1985 to present day with duration of follow up ranging from 9 months to 15 years. Within each study, participants with CHD are predominantly of self-reported European descent (38%-100%), mostly male (44%-91%) with mean ages at recruitment ranging from 40 to 75 years. Initial feasibility analyses, using a federated analysis approach, yielded expected associations between age (HR 1.15 95% CI 1.14-1.16) per 5-year increase, male sex (HR 1.17, 95% CI 1.13-1.21) and smoking (HR 1.43, 95% CI 1.35-1.51) with risk of subsequent CHD death or myocardial infarction, and differing associations with other individual and composite cardiovascular endpoints. CONCLUSIONS: GENIUS-CHD is a global collaboration seeking to elucidate genetic and non-genetic determinants of subsequent event risk in individuals with established CHD, in order to improve residual risk prediction and identify novel drug targets for secondary prevention. Initial analyses demonstrate the feasibility and reliability of a federated analysis approach. The consortium now plans to initiate and test novel hypotheses as well as supporting replication and validation analyses for other investigators

    The development and validation of a scoring tool to predict the operative duration of elective laparoscopic cholecystectomy

    Get PDF
    Background: The ability to accurately predict operative duration has the potential to optimise theatre efficiency and utilisation, thus reducing costs and increasing staff and patient satisfaction. With laparoscopic cholecystectomy being one of the most commonly performed procedures worldwide, a tool to predict operative duration could be extremely beneficial to healthcare organisations. Methods: Data collected from the CholeS study on patients undergoing cholecystectomy in UK and Irish hospitals between 04/2014 and 05/2014 were used to study operative duration. A multivariable binary logistic regression model was produced in order to identify significant independent predictors of long (> 90 min) operations. The resulting model was converted to a risk score, which was subsequently validated on a second cohort of patients using ROC curves. Results: After exclusions, data were available for 7227 patients in the derivation (CholeS) cohort. The median operative duration was 60 min (interquartile range 45–85), with 17.7% of operations lasting longer than 90 min. Ten factors were found to be significant independent predictors of operative durations > 90 min, including ASA grade, age, previous surgical admissions, BMI, gallbladder wall thickness, and CBD diameter. A risk score was then produced from these factors and applied to a cohort of 2405 patients from a tertiary centre for external validation. This returned an area under the ROC curve of 0.708 (SE = 0.013, p < 0.001), with the proportion of operations lasting > 90 min increasing more than eightfold, from 5.1% to 41.8%, between the extremes of the score. Conclusion: The scoring tool produced in this study was found to be significantly predictive of long operative durations on validation in an external cohort. As such, the tool may have the potential to enable organisations to better organise theatre lists and deliver greater efficiencies in care.
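
    The modelling pipeline described (multivariable logistic regression, conversion of coefficients to a points score, external validation by ROC curve) can be sketched as follows. The synthetic data, predictor effects, and points heuristic below are illustrative assumptions, not the published tool.

        import numpy as np
        from sklearn.linear_model import LogisticRegression
        from sklearn.metrics import roc_auc_score

        rng = np.random.default_rng(0)
        true_beta = np.array([0.5, 0.4, 0.3, 0.2, 0.2, 0.1])

        # Synthetic derivation cohort; columns stand in for predictors such as
        # BMI, gallbladder wall thickness, CBD diameter, etc.
        X_train = rng.normal(size=(7227, 6))
        y_train = (X_train @ true_beta + rng.normal(size=7227)) > 1.0  # "> 90 min"

        model = LogisticRegression(max_iter=1000).fit(X_train, y_train)

        # Convert coefficients to integer points (a common scoring heuristic:
        # divide by the smallest absolute coefficient and round).
        points = np.round(model.coef_[0] / np.abs(model.coef_[0]).min()).astype(int)

        # External validation on a second, independent cohort.
        X_val = rng.normal(size=(2405, 6))
        y_val = (X_val @ true_beta + rng.normal(size=2405)) > 1.0
        risk_score = X_val @ points
        print("validation AUC:", round(roc_auc_score(y_val, risk_score), 3))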

    Global age-sex-specific fertility, mortality, healthy life expectancy (HALE), and population estimates in 204 countries and territories, 1950-2019 : a comprehensive demographic analysis for the Global Burden of Disease Study 2019

    Get PDF
    Background: Accurate and up-to-date assessment of demographic metrics is crucial for understanding a wide range of social, economic, and public health issues that affect populations worldwide. The Global Burden of Diseases, Injuries, and Risk Factors Study (GBD) 2019 produced updated and comprehensive demographic assessments of the key indicators of fertility, mortality, migration, and population for 204 countries and territories and selected subnational locations from 1950 to 2019. Methods: 8078 country-years of vital registration and sample registration data, 938 surveys, 349 censuses, and 238 other sources were identified and used to estimate age-specific fertility. Spatiotemporal Gaussian process regression (ST-GPR) was used to generate age-specific fertility rates for 5-year age groups between ages 15 and 49 years. With extensions to age groups 10–14 and 50–54 years, the total fertility rate (TFR) was then aggregated using the estimated age-specific fertility between ages 10 and 54 years. 7417 sources were used for under-5 mortality estimation and 7355 for adult mortality. ST-GPR was used to synthesise data sources after correction for known biases. Adult mortality was measured as the probability of death between ages 15 and 60 years based on vital registration, sample registration, and sibling histories, and was also estimated using ST-GPR. HIV-free life tables were then estimated using estimates of under-5 and adult mortality rates using a relational model life table system created for GBD, which closely tracks observed age-specific mortality rates from complete vital registration when available. Independent estimates of HIV-specific mortality generated by an epidemiological analysis of HIV prevalence surveys and antenatal clinic serosurveillance and other sources were incorporated into the estimates in countries with large epidemics. Annual and single-year age estimates of net migration and population for each country and territory were generated using a Bayesian hierarchical cohort component model that analysed estimated age-specific fertility and mortality rates along with 1250 censuses and 747 population registry years. We classified location-years into seven categories on the basis of the natural rate of increase in population (calculated by subtracting the crude death rate from the crude birth rate) and the net migration rate. We computed healthy life expectancy (HALE) using years lived with disability (YLDs) per capita, life tables, and standard demographic methods. Uncertainty was propagated throughout the demographic estimation process, including fertility, mortality, and population, with 1000 draw-level estimates produced for each metric. Findings: The global TFR decreased from 2·72 (95% uncertainty interval [UI] 2·66–2·79) in 2000 to 2·31 (2·17–2·46) in 2019. Global annual livebirths increased from 134·5 million (131·5–137·8) in 2000 to a peak of 139·6 million (133·0–146·9) in 2016. Global livebirths then declined to 135·3 million (127·2–144·1) in 2019. Of the 204 countries and territories included in this study, in 2019, 102 had a TFR lower than 2·1, which is considered a good approximation of replacement-level fertility. All countries in sub-Saharan Africa had TFRs above replacement level in 2019 and accounted for 27·1% (95% UI 26·4–27·8) of global livebirths. Global life expectancy at birth increased from 67·2 years (95% UI 66·8–67·6) in 2000 to 73·5 years (72·8–74·3) in 2019. 
The total number of deaths increased from 50·7 million (49·5–51·9) in 2000 to 56·5 million (53·7–59·2) in 2019. Under-5 deaths declined from 9·6 million (9·1–10·3) in 2000 to 5·0 million (4·3–6·0) in 2019. Global population increased by 25·7%, from 6·2 billion (6·0–6·3) in 2000 to 7·7 billion (7·5–8·0) in 2019. In 2019, 34 countries had negative natural rates of increase; in 17 of these, the population declined because immigration was not sufficient to counteract the negative natural rate of increase. Globally, HALE increased from 58·6 years (56·1–60·8) in 2000 to 63·5 years (60·8–66·1) in 2019. HALE increased in 202 of 204 countries and territories between 2000 and 2019.
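
    As a small illustration of one step of the method: the total fertility rate is the sum of age-specific fertility rates weighted by the width of each age group, and draw-level uncertainty propagates by simply repeating the calculation per draw. The rates below are invented, not GBD estimates.

        import numpy as np

        rng = np.random.default_rng(42)

        # Hypothetical age-specific fertility rates (births per woman per year)
        # for 5-year age groups from 10-14 up to 50-54.
        mean_asfr = np.array([0.001, 0.040, 0.110, 0.120, 0.090,
                              0.045, 0.012, 0.002, 0.0002])

        # 1000 draws per age group, mimicking draw-level uncertainty propagation.
        draws = rng.normal(mean_asfr, mean_asfr * 0.05, size=(1000, mean_asfr.size))

        # TFR per draw: each 5-year age group contributes 5 * ASFR.
        tfr_draws = 5.0 * draws.sum(axis=1)

        mean = tfr_draws.mean()
        lo, hi = np.percentile(tfr_draws, [2.5, 97.5])  # 95% uncertainty interval
        print(f"TFR {mean:.2f} (95% UI {lo:.2f}-{hi:.2f})")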

    Comfort and patient-centred care without excessive sedation: the eCASH concept

    Get PDF
    We propose an integrated and adaptable approach to improve patient care and clinical outcomes through analgesia and light sedation, initiated early during an episode of critical illness and as a priority of care. This strategy, which may be regarded as an evolution of the Pain, Agitation and Delirium guidelines, is conveyed in the mnemonic eCASH: early Comfort using Analgesia, minimal Sedatives and maximal Humane care. eCASH aims to establish optimal patient comfort with minimal sedation as the default presumption for intensive care unit (ICU) patients, in the absence of recognised medical requirements for deeper sedation. Effective pain relief is the first priority for implementation of eCASH: we advocate flexible multimodal analgesia designed to minimise the use of opioids. Sedation is secondary to pain relief and, where possible, should be based on agents that can be titrated to a prespecified target level that is subject to regular review and adjustment; routine use of benzodiazepines should be minimised. From the outset, the objective of the sedation strategy is to eliminate the use of sedatives at the earliest medically justifiable opportunity. Effective analgesia and minimal sedation contribute to the larger aims of eCASH by facilitating the promotion of sleep, early mobilisation strategies, and improved communication between patients, staff, and relatives, all of which may be expected to assist rehabilitation and avoid the isolation, confusion, and possible long-term psychological complications of an ICU stay. eCASH represents a new paradigm for patient-centred care in the ICU. Some organisational challenges to the implementation of eCASH are identified.