7,413 research outputs found

    A systematic review of air pollution and incidence of out-of-hospital cardiac arrest

    Introduction: Studies have linked air pollution with the incidence of acute coronary artery events and cardiovascular mortality, but the association with out-of-hospital cardiac arrest (OHCA) is less clear. Aim: To examine the association of air pollution with the occurrence of OHCA. Methods: Electronic bibliographic databases (until February 2013) were searched. Search terms included common air pollutants and OHCA. Studies of patients with implantable cardioverter defibrillators and OHCA not attended by paramedics were excluded. Two independent reviewers (THKT and TAW) identified potential studies. Methodological quality was assessed by the Newcastle-Ottawa Scale. Results: Of 849 studies, 8 met the selection criteria. Significant associations between particulate matter (PM) exposure (especially PM2.5) and OHCA were found in 5 studies. The increase in OHCA risk ranged from 2.4% to 7% per interquartile increase in average PM exposure on the same day and up to 4 days prior to the event. A large study found that ozone increased the risk of OHCA within 3 h prior to the event; the strongest association, a 3.8–4.6% risk increase per 20 parts per billion rise in the average ozone level, was within 2 h prior to the event. Similarly, another study found an increased risk of 18% within 2 days prior to the event. Conclusions: Larger studies suggest an increased risk of OHCA with exposure to PM2.5 and ozone.
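
    An aside on the per-increment figures quoted above: pollutant odds ratios are conventionally modelled log-linearly, so a risk reported per one exposure increment can be rescaled to another. The sketch below illustrates that standard rescaling; the function name and the log-linear assumption are ours, not the review's.

```python
import math

def scale_or(or_ref: float, ref_inc: float, new_inc: float) -> float:
    """Rescale an odds ratio reported per ref_inc of exposure to a new
    increment, assuming a log-linear exposure-response relationship."""
    return math.exp(math.log(or_ref) / ref_inc * new_inc)

# A 4.6% risk increase per 20 ppb ozone (OR = 1.046) rescaled to 10 ppb:
print(round(scale_or(1.046, 20.0, 10.0), 4))  # about 1.023
```

    Note that under this model the excess risk is not simply halved for half the increment, although for small odds ratios the difference is negligible.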

    Spin relaxation due to random Rashba spin-orbit coupling in GaAs (110) quantum wells

    We investigate the spin relaxation due to the random Rashba spin-orbit coupling in symmetric GaAs (110) quantum wells from the fully microscopic kinetic spin Bloch equation approach. All relevant scatterings, such as the electron-impurity, electron–longitudinal-optical-phonon, electron–acoustic-phonon, as well as electron-electron Coulomb scatterings, are explicitly included. It is shown that our calculation reproduces the experimental data by Müller et al. [Phys. Rev. Lett. 101, 206601 (2008)] for a reasonable choice of parameter values. We also predict that the temperature dependence of the spin relaxation time presents a peak in the case of low impurity density, which originates from the electron-electron Coulomb scattering. Comment: 5 pages, 2 figures, EPL in press

    A hybrid global image orientation method for simultaneously estimating global rotations and global translations

    In recent years, the determination of global image orientation, i.e. global SfM, has gained a lot of attention from researchers, mainly due to its time efficiency. Most global methods take relative rotations and translations as input for a two-step strategy comprised of global rotation averaging and global translation averaging. This paper, by contrast, presents a hybrid approach that aims to solve global rotations and translations simultaneously, but hierarchically. We first extract an optimal minimum cover connected image triplet set (OMCTS), which includes all available images with a minimum number of triplets, each with its three relative orientations compatible with one another. For non-collinear triplets in the OMCTS, we introduce basic characterizations of the corresponding essential matrices and solve for the image pose parameters by averaging the constrained essential matrices. For collinear triplets, on the other hand, the image pose parameters are estimated by relative orientation using the depth of object points from individual local spatial intersection. Finally, all image orientations are estimated in a common coordinate frame by traversing every solved triplet using a similarity transformation. We show results of our method on different benchmarks and demonstrate the performance and capability of the proposed approach by comparing it with other global SfM methods. © 2020 Copernicus GmbH. All rights reserved.
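
    For context, the rotation-averaging ingredient mentioned above is often implemented as a chordal L2 mean followed by an SVD projection onto SO(3). The sketch below is a generic illustration of that standard step, not the paper's actual algorithm; the function names are ours.

```python
import numpy as np

def project_to_so3(m: np.ndarray) -> np.ndarray:
    """Project a 3x3 matrix onto the nearest rotation in the Frobenius norm."""
    u, _, vt = np.linalg.svd(m)
    r = u @ vt
    if np.linalg.det(r) < 0:  # guard against a reflection
        r = u @ np.diag([1.0, 1.0, -1.0]) @ vt
    return r

def chordal_mean(rotations):
    """Single-shot chordal L2 mean: sum the matrices, project back to SO(3)."""
    return project_to_so3(sum(rotations))

def rz(t: float) -> np.ndarray:
    """Rotation about the z-axis by angle t (radians)."""
    c, s = np.cos(t), np.sin(t)
    return np.array([[c, -s, 0.0], [s, c, 0.0], [0.0, 0.0, 1.0]])

# Toy check: the mean of symmetric perturbations recovers the middle rotation.
mean = chordal_mean([rz(0.08), rz(0.10), rz(0.12)])
```

    The SVD projection is what keeps the averaged matrix a valid rotation; naive element-wise averaging of rotation matrices generally leaves SO(3).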

    Effects of hydroxyapatite and PDGF concentrations on osteoblast growth in a nanohydroxyapatite-polylactic acid composite for guided tissue regeneration

    The technique of guided tissue regeneration (GTR) has evolved over recent years in an attempt to achieve periodontal tissue regeneration by the use of a barrier membrane. However, there are significant limitations in the currently available membranes, and overall outcomes may be limited. A degradable composite material was investigated as a potential GTR membrane material. A polylactic acid (PLA) and nanohydroxyapatite (nHA) composite was analysed, and its bioactive potential and suitability as a carrier system for growth factors were assessed. The effects of nHA concentration and the addition of platelet-derived growth factor (PDGF) on osteoblast proliferation and differentiation were investigated. The bioactivity was dependent on the nHA concentration in the films, with more apatite deposited on films containing higher nHA content. Osteoblasts proliferated well on samples containing low nHA content and differentiated on films with higher nHA content. The composite films were able to deliver PDGF, and cell proliferation increased on samples that were pre-absorbed with the growth factor. nHA–PLA composite films are thus able to deliver active PDGF. In addition, bioactivity and cell differentiation were higher on films containing more nHA. A nHA–PLA composite material containing a high concentration of nHA may be a useful GTR membrane material, as it will not only act as a barrier but may also enhance bone regeneration by delivery of biologically active molecules.

    Data production models for the CDF experiment

    The data production for the CDF experiment is conducted on a large Linux PC farm designed to meet the needs of data collection at a maximum rate of 40 MByte/sec. We present two data production models that exploit advances in computing and communication technology. The first production farm is a centralized system that has achieved a stable data processing rate of approximately 2 TByte per day. The recently upgraded farm was migrated to the SAM (Sequential Access to data via Metadata) data handling system. The software and hardware of the CDF production farms have been successful in providing large computing and data throughput capacity to the experiment. Comment: 8 pages, 9 figures; presented at HPC Asia2005, Beijing, China, Nov 30 - Dec 3, 2005

    Search for Intrinsic Excitations in 152Sm

    The 685 keV excitation energy of the first excited 0+ state in 152Sm makes it an attractive candidate to explore expected two-phonon excitations at low energy. Multiple-step Coulomb excitation and inelastic neutron scattering studies of 152Sm are used to probe the E2 collectivity of excited 0+ states in this "soft" nucleus and the results are compared with model predictions. No candidates for two-phonon K=0+ quadrupole vibrational states are found. A 2+, K=2 state with strong E2 decay to the first excited K=0+ band and a probable 3+ band member are established. Comment: 4 pages, 6 figures, accepted for publication as a Rapid Communication in Physical Review

    Data processing model for the CDF experiment

    The data processing model for the CDF experiment is described. Data processing reconstructs events from parallel data streams taken with different combinations of physics event triggers and further splits the events into specialized physics datasets. The design of the processing control system faces strict requirements on bookkeeping records, which trace the status of data files and event contents during processing and storage. The computing architecture was updated to meet the mass data flow of the Run II data collection, recently upgraded to a maximum rate of 40 MByte/sec. The data processing facility consists of a large cluster of Linux computers, with data movement managed by the CDF data handling system to a multi-petaByte Enstore tape library. The latest processing cycle has achieved a stable speed of 35 MByte/sec (3 TByte/day). It can be readily scaled by increasing CPU and data-handling capacity as required. Comment: 12 pages, 10 figures, submitted to IEEE-TN
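
    As a sanity check on the quoted rates, MByte/sec converts to TByte/day by simple arithmetic. A quick sketch (decimal units assumed, which is consistent with the abstract's 35 MByte/sec ≈ 3 TByte/day):

```python
SECONDS_PER_DAY = 24 * 60 * 60  # 86,400 s

def mbyte_s_to_tbyte_day(rate: float) -> float:
    """Convert a sustained rate in MByte/sec to TByte/day (1 TByte = 1e6 MByte)."""
    return rate * SECONDS_PER_DAY / 1e6

print(mbyte_s_to_tbyte_day(35.0))  # 3.024 TByte/day, matching the quoted figure
print(mbyte_s_to_tbyte_day(40.0))  # 3.456 TByte/day at the 40 MByte/sec peak
```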

    Managing healthcare budgets in times of austerity: the role of program budgeting and marginal analysis

    Given limited resources, priority setting or choice making will remain a reality at all levels of publicly funded healthcare across countries for many years to come. The pressures may well be even more acute as the impact of the economic crisis of 2008 continues to play out, but even as economies begin to turn around, resources within healthcare will be limited, and thus some form of rationing will be required. Over the last few decades, research on healthcare priority setting has focused on methods of implementation as well as on the development of approaches related to fairness and legitimacy and on more technical aspects of decision making, including the use of multi-criteria decision analysis. Recently, research has led to a better understanding of how to evaluate priority setting activity, including defining 'success' and articulating key elements of high performance. This body of research, however, often goes untapped by those charged with making challenging decisions, and as such, in line with prevailing public sector incentives, decisions often rely on historical allocation patterns and/or political negotiation. These archaic and ineffective approaches not only lead to poor decisions in terms of value for money but also fail to reflect basic ethical conditions that can lead to fairness in the decision-making process. The purpose of this paper is to outline a comprehensive approach to priority setting and resource allocation that has been used in different contexts across countries. This will provide decision makers with a single point of access for a basic understanding of relevant tools when faced with having to make difficult decisions about what healthcare services to fund and what not to fund. The paper also addresses several key issues related to priority setting, including how health technology assessments can be used, how performance can be improved at a practical level, and what ongoing resource management practice should look like.
In terms of future research, one of the most important areas of priority setting that needs further attention is how best to engage members of the public.