
    Uncertainty analysis of the plasma impedance probe

    A plasma impedance probe (PIP) is a type of in-situ, radio-frequency (RF) probe traditionally used to measure plasma properties (e.g. density) in low-density environments such as the Earth's ionosphere. We believe that PIPs are underrepresented in laboratory settings, in part because PIP operation and analysis have not been optimized for signal-to-noise ratio (SNR), which reduces the probe's accuracy, upper density limit, and acquisition rate. This work presents our efforts in streamlining and simplifying the PIP design, model, calibration, and analysis for unmagnetized laboratory plasmas, in both continuous and pulsed PIP operation. The focus of this work is a Monte Carlo uncertainty analysis, which identifies operational and analysis procedures that improve SNR by multiple orders of magnitude. Additionally, this analysis provides evidence that the sheath resonance (and not the plasma frequency, as previously believed) sets the PIP's upper density limit, a finding that suggests an additional method for extending that limit.
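
    The Monte Carlo analysis referred to above can be illustrated with a short sketch: a synthetic resonance spectrum is repeatedly perturbed with noise at a chosen SNR, refit, and the spread of the fitted resonance frequency is taken as its uncertainty. The toy Lorentzian model, parameter values, and function names below are illustrative assumptions, not the authors' PIP model.

```python
# Minimal Monte Carlo uncertainty sketch for a resonance-type probe spectrum.
# The toy Lorentzian model and all numbers are illustrative assumptions,
# not the authors' PIP circuit model.
import numpy as np
from scipy.optimize import curve_fit

def resonance_mag(f, f0, gamma, a):
    """Toy damped-resonance magnitude response (frequencies in MHz)."""
    return a / np.sqrt((f**2 - f0**2)**2 + (gamma * f)**2)

rng = np.random.default_rng(0)
f = np.linspace(50.0, 500.0, 400)             # swept probe frequencies [MHz]
truth = dict(f0=280.0, gamma=40.0, a=1.0)     # "true" resonance parameters
clean = resonance_mag(f, **truth)

snr = 20.0                                    # amplitude signal-to-noise ratio
fits = []
for _ in range(1000):                         # Monte Carlo trials
    noisy = clean + rng.normal(0.0, clean.max() / snr, size=f.size)
    try:
        popt, _ = curve_fit(resonance_mag, f, noisy,
                            p0=(250.0, 30.0, 1.0), maxfev=5000)
        fits.append(popt[0])
    except RuntimeError:
        pass                                  # skip trials that fail to converge

fits = np.asarray(fits)
print(f"fitted f0 = {fits.mean():.1f} +/- {fits.std():.2f} MHz "
      f"({fits.size} converged trials)")
```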

    High-speed plasma measurements with a plasma impedance probe

    Plasma impedance probes (PIPs) are RF probes that primarily measure electron density. This work introduces two advancements: a streamlined analytical model for interpreting PIP-monopole measurements and techniques for achieving ≥1 MHz time-resolved PIP measurements. The model's improvements include introducing sheath thickness as a measurement and providing a more accurate method for measuring electron density and damping. The model is validated by a quasi-static numerical simulation; the comparison between simulation and measurements identifies sources of error and provides probe design criteria for minimizing uncertainty. The improved time resolution is achieved by introducing higher-frequency hardware, updated analysis algorithms, and a more rigorous approach to RF calibration. Finally, the new model and high-speed techniques are applied to two datasets: a 4 kHz plasma density oscillation resolved at 100 kHz, with densities ranging from 2×10¹⁴ to 3×10¹⁵ m⁻³, and a 150 kHz oscillation resolved at 4 MHz, with densities ranging from 4×10¹⁴ to 6×10¹⁴ m⁻³.
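
    Both PIP abstracts rest on the same underlying relation: the probe's measured resonance maps to electron density through the electron plasma frequency, f_pe = (1/2π)·sqrt(n_e·e²/(ε₀·m_e)). A minimal sketch of that textbook conversion (not the authors' full model, which also extracts sheath thickness and damping):

```python
# Convert between electron plasma frequency and electron density using
# f_pe = (1/(2*pi)) * sqrt(n_e * e^2 / (eps0 * m_e)). This is the textbook
# relation only; the full PIP model in the papers also accounts for the sheath.
import numpy as np
from scipy.constants import e, epsilon_0, m_e

def density_from_fpe(f_pe_hz: float) -> float:
    """Electron density [m^-3] from electron plasma frequency [Hz]."""
    omega = 2.0 * np.pi * f_pe_hz
    return epsilon_0 * m_e * omega**2 / e**2

def fpe_from_density(n_e: float) -> float:
    """Electron plasma frequency [Hz] from electron density [m^-3]."""
    return np.sqrt(n_e * e**2 / (epsilon_0 * m_e)) / (2.0 * np.pi)

# Densities quoted in the abstract span roughly 2e14 to 3e15 m^-3,
# i.e. plasma frequencies of order 100-500 MHz:
for n in (2e14, 3e15):
    print(f"n_e = {n:.1e} m^-3  ->  f_pe = {fpe_from_density(n)/1e6:.0f} MHz")
```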

    Trends in match injury risk in professional male rugby union: a 16-season review of 10 851 match injuries in the English Premiership (2002-2019): The Professional Rugby Injury Surveillance Project

    Objectives: The Professional Rugby Injury Surveillance Project is the largest and longest-running rugby union injury surveillance project globally and focuses on the highest level of rugby in England. Methods: We examined match injuries in professional men's rugby over the period 2002/2003 to 2018/2019 and described trends in injuries over this time. Results: Over the period 2002/2003–2018/2019, 10 851 injuries occurred in 124 952 hours of match play, equating to a mean of 57 injuries per club per season and one injury per team per match. The mean incidence, severity (days absence) and burden (days absence/1000 hours) of injury were 87/1000 hours (95% CI 82 to 92), 25 days (95% CI 22 to 28) and 2178 days/1000 hours (95% CI 1872 to 2484), respectively. The tackle accounted for 43% of injuries, with running the second most common activity at the time of injury (12%). The most common injury location was the head/face, with an incidence of 11.3/1000 hours, while the location with the highest overall burden was the knee (11.1 days/1000 hours). Long-term trends demonstrated stable injury incidence and proportion of injured players, but an increase in the mean and median severity of injuries. Concussion incidence, severity and burden increased from the 2009/2010 season onwards, and from 2011 to 2019 concussion was the most common injury. Conclusion: The rises in overall injury severity and concussion incidence are the most significant findings of this work and demonstrate the need for continued efforts to reduce concussion risk, as well as a greater understanding of changes in injury severity over time.
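
    The headline figures above follow from the standard surveillance definitions: incidence is injuries per 1000 player-hours of match exposure, and burden is incidence multiplied by mean severity. A quick check in Python using only the totals quoted in the abstract (small differences from the published values reflect rounding and season-by-season averaging):

```python
# Reproduce the headline surveillance metrics from the totals in the abstract.
# Definitions follow the standard approach: incidence per 1000 player-hours
# and burden = incidence * mean severity (days absence).
injuries = 10_851
match_hours = 124_952
mean_severity_days = 25          # mean days absence per injury (from abstract)

incidence = injuries / match_hours * 1000          # injuries per 1000 h
burden = incidence * mean_severity_days            # days absence per 1000 h

print(f"incidence = {incidence:.0f} injuries/1000 h")   # ~87, as reported
print(f"burden    = {burden:.0f} days/1000 h")          # ~2171 vs reported 2178
```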

    Validation of the Saint George's Respiratory Questionnaire in Uganda.

    INTRODUCTION: Chronic obstructive pulmonary disease (COPD) will soon be the third leading global cause of death and is increasing rapidly in low/middle-income countries. There is a need for local validation of the Saint George's Respiratory Questionnaire (SGRQ), which can be used to identify those experiencing lifestyle impairment due to their breathing. METHODS: The SGRQ was professionally translated into Luganda and reviewed by our field staff and a local pulmonologist. Participants included a COPD-confirmed clinic sample and COPD-positive and COPD-negative members of the community who were enrolled in the Lung Function in Nakaseke and Kampala (LiNK) Study. SGRQs were collected from all participants, while demographic and spirometry data were additionally collected from LiNK participants. RESULTS: In total, 103 questionnaires were included in the analysis: 49 with COPD from clinic, 34 community COPD-negative and 20 community COPD-positive. SGRQ score varied by group: 53.5 for clinic, 34.4 for community COPD-positive and 4.1 for community COPD-negative (p<0.001). The cross-validated c-statistic for SGRQ total score predicting COPD was 0.87 (95% CI 0.75 to 1.00). SGRQ total score was associated with COPD severity (forced expiratory volume in 1 s, per cent of predicted), with an r coefficient of -0.60 (-0.75, -0.39). SGRQ score was associated with dyspnoea (OR 1.05/point; 1.01, 1.09) and cough (1.07; 1.03, 1.11). CONCLUSION: Our Luganda-language SGRQ accurately distinguishes between COPD-positive and COPD-negative community members in rural Uganda. Scores were correlated with COPD severity and were associated with the odds of dyspnoea and cough. We find that it can be successfully used as a respiratory questionnaire for adults with airflow obstruction in Uganda.
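
    The cross-validated c-statistic quoted above is the area under the ROC curve for SGRQ total score as a predictor of COPD status, estimated on held-out folds. A minimal sketch of that calculation; the placeholder scores, fold count, and use of scikit-learn are assumptions for illustration, not the study's code or data:

```python
# Cross-validated c-statistic (ROC AUC) for a single continuous predictor
# (SGRQ total score) of a binary outcome (spirometry-confirmed COPD).
# The scores below are placeholders; the study's real data are not reproduced.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import StratifiedKFold, cross_val_predict

rng = np.random.default_rng(1)
copd = np.repeat([1, 0], [69, 34])                      # 69 COPD+, 34 COPD-
sgrq = np.where(copd == 1,
                rng.normal(50, 20, copd.size),          # hypothetical scores
                rng.normal(10, 10, copd.size)).clip(0, 100)

model = LogisticRegression()
cv = StratifiedKFold(n_splits=5, shuffle=True, random_state=0)
prob = cross_val_predict(model, sgrq.reshape(-1, 1), copd,
                         cv=cv, method="predict_proba")[:, 1]
print(f"cross-validated c-statistic = {roc_auc_score(copd, prob):.2f}")
```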

    Patterns of training volume and injury risk in elite rugby union: an analysis of 1.5 million hours of training exposure over eleven seasons

    Rugby union is a popular team sport that demands high levels of physical fitness and skill. The study aim was to examine trends in training volume and its impact on injury incidence, severity and burden over an 11-season period in English professional rugby. Data were recorded from 2007/08 through 2017/18, capturing 1,501,606 h of training exposure and 3,782 training injuries. Players completed, on average, 6 h 48 min of weekly training (95% CI: 6 h 30 min to 7 h 6 min); this value remained stable over the 11 seasons. The mean incidence of training-related injuries was 2.6/1000 player-hours (95% CI: 2.4 to 2.8), with mean severity rising from 17 days in 2007/08 to 37 days in 2017/18 (change/season = 1.773, P < 0.01). The rate of change in severity was dependent on training type, with conditioning (non-gym-based) responsible for the greatest increase (2.4 days/injury/season). As a result of increasing severity, injury burden rose from 51 days' absence/1000 player-hours in 2007/08 to 106 days' absence/1000 player-hours in 2017/18. Despite the low incidence of injury in training compared with match play, training accounted for 34% of all injuries. Future assessments of training intensity may lead to a greater understanding of the rise in injury severity.
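
    As with the match-injury data above, training burden combines an exposure-normalised incidence with mean severity. The sketch below uses only the totals and end-of-period severities quoted in the abstract; it reproduces the reported burden figures only approximately, since the published values use season-specific incidences rather than the pooled mean:

```python
# Training-injury metrics from the abstract's totals. Burden is incidence
# times mean severity; the published per-season figures differ slightly
# because incidence varies a little from season to season.
injuries = 3_782
exposure_hours = 1_501_606

incidence = injuries / exposure_hours * 1000      # ~2.5 injuries/1000 player-h
print(f"mean training incidence ~ {incidence:.1f}/1000 player-hours")

for season, severity_days in [("2007/08", 17), ("2017/18", 37)]:
    burden = incidence * severity_days
    print(f"{season}: implied burden ~ {burden:.0f} days/1000 player-hours")
# Reported burdens were 51 and 106 days/1000 player-hours: with incidence
# roughly stable, the rise in burden is driven by the increase in severity,
# as the abstract notes.
```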

    Cost-Effectiveness of HIV Screening in STD Clinics, Emergency Departments, and Inpatient Units: A Model-Based Analysis

    Identifying and treating persons with human immunodeficiency virus (HIV) infection early in their disease stage is considered an effective means of reducing the impact of the disease. We compared the cost-effectiveness of HIV screening in three settings: sexually transmitted disease (STD) clinics serving men who have sex with men and hospital emergency departments (EDs), where patients are likely to be diagnosed early, and inpatient diagnosis based on clinical manifestations. We developed the Progression and Transmission of HIV/AIDS model, a health state transition model that tracks index patients and their infected partners from HIV infection to death. We used program characteristics for each setting to compare the incremental cost per quality-adjusted life year (QALY) gained from early versus late diagnosis and treatment. We ran the model for 10,000 index patients for each setting, examining alternative scenarios, excluding and including transmission to partners, and assuming highly active antiretroviral therapy (HAART) was initiated at a CD4 count of either 350 or 500 cells/µL. Screening in STD clinics and EDs was cost-effective compared with diagnosing inpatients, even when including only the benefits to the index patients. Screening patients in STD clinics, who have less-advanced disease, was cost-effective compared with ED screening when treatment with HAART was initiated at a CD4 count of 500 cells/µL. When the benefits of reduced transmission to partners from early diagnosis were included, screening in settings with less-advanced disease stages was cost-saving compared with screening later in the course of infection. The study was limited by a small number of observations on CD4 count at diagnosis and by including transmission only to first-generation partners of the index patients. HIV prevention efforts can be advanced by screening in settings where patients present with less-advanced stages of HIV infection and by initiating treatment with HAART earlier in the course of infection.
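
    The comparison above rests on the incremental cost-effectiveness ratio (ICER): the extra lifetime cost of earlier diagnosis divided by the QALYs gained. A minimal sketch of that calculation; the per-patient costs and QALYs below are placeholders and are not the paper's results:

```python
# Incremental cost-effectiveness ratio (ICER) for early vs late HIV diagnosis.
# All monetary and QALY figures below are placeholders for illustration only.
def icer(cost_new: float, qaly_new: float,
         cost_old: float, qaly_old: float) -> float:
    """Incremental cost per QALY gained for the 'new' vs 'old' strategy."""
    return (cost_new - cost_old) / (qaly_new - qaly_old)

# Hypothetical discounted per-patient lifetime costs and QALYs:
early = {"cost": 310_000.0, "qaly": 11.2}   # screened and diagnosed early
late  = {"cost": 290_000.0, "qaly": 10.4}   # diagnosed on clinical grounds

ratio = icer(early["cost"], early["qaly"], late["cost"], late["qaly"])
print(f"ICER = ${ratio:,.0f} per QALY gained")
# A strategy is "cost-saving" when it both costs less and yields more QALYs,
# i.e. the numerator is negative while the denominator is positive.
```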

    Acute thrombus formation on phosphorylcholine surface modified flow diverters

    PURPOSE: Thromboembolic complications remain a limitation of flow-diverting stents. We hypothesized that phosphorylcholine surface-modified flow diverters (Pipeline Flex with Shield Technology, sPED) would have less acute thrombus formation on the device surface than the classic Pipeline Embolization Device (cPED). METHODS: Elastase-induced aneurysms were created in 40 rabbits and randomly assigned to receive cPED or sPED devices with and without dual antiplatelet therapy (DAPT) (four groups, n=10/group). Angioplasty was performed to enhance apposition and create intimal injury for a pro-thrombotic environment. Both before and after angioplasty, the flow diverter was imaged with intravascular optical coherence tomography. The outcome measure was the number of predefined segments along the implant, relative to the location of the aneurysm, with thrombus, ranging from 0 (no clot formation) to 3 (all segments with thrombus). Clot formation over the device at the ostia of branch arteries was assessed as either present or absent. RESULTS: Following angioplasty, the number of flow diverter segments with clots was significantly associated with the flow diverter type (p < 0.0001), but not with DAPT (p=0.3872) or aneurysm neck size (p=0.8555). The incidence rate of clots with cPED was 1.72 times that with sPED. Clot formation on the flow diverter at locations corresponding to side-branch ostia was significantly lower with sPED than with cPED (OR 0.180; 95% CI 0.044 to 0.734; p=0.0168), but was not associated with DAPT (p=0.3198). CONCLUSION: In the rabbit model, phosphorylcholine surface-modified flow diverters are associated with less thrombus formation on the surface of the device.
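
    The ostia result above is reported as an odds ratio with a 95% confidence interval. The sketch below shows a standard way to compute such an estimate from a 2×2 table with a Wald-type interval; the counts are hypothetical placeholders, and the study itself may have used a regression model that accounts for clustering within animals:

```python
# Odds ratio and Wald 95% CI for clot formation at branch ostia,
# sPED vs cPED, from a 2x2 table. Counts are hypothetical placeholders.
import math

# rows: device; columns: ostia with clot / without clot (hypothetical counts)
sped_clot, sped_clear = 4, 36
cped_clot, cped_clear = 15, 25

odds_ratio = (sped_clot * cped_clear) / (sped_clear * cped_clot)
se_log_or = math.sqrt(1/sped_clot + 1/sped_clear + 1/cped_clot + 1/cped_clear)
lo = math.exp(math.log(odds_ratio) - 1.96 * se_log_or)
hi = math.exp(math.log(odds_ratio) + 1.96 * se_log_or)

print(f"OR = {odds_ratio:.3f} (95% CI {lo:.3f} to {hi:.3f})")
```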