
    The motion damping characteristics of wind energy devices

    Wind Assisted Ship Propulsion (WASP) uses wind energy devices such as wingsails, boundary layer control systems, wind turbines and others to augment the thrust provided by the ship's propeller. Ship motions in a seaway, particularly roll motions, can be reduced, with consequent reductions in added hydrodynamic resistance, to yield fuel savings exceeding those predicted on the basis of thrust augmentation alone. Some wind assist devices compare favourably with the responses of conventional roll stabilisers. Attention is restricted to devices which are exterior to the hull and which apply direct aerodynamic forces or moments to the hull. A theoretical analysis has been developed to examine wingsails for roll, pitch and yaw derivatives (both damping and inertia) due to harmonic excitation in roll at different headings and frequencies. Unsteady lifting surface theory has been used, linearised to first order in motion velocity. The theory has been augmented by experiments in the wind tunnel. Some measurements from a fishing boat with and without sails are also included. Other devices - wind turbines, Flettner rotors, Cousteau Turbosails and conventional roll stabilisers - have been examined theoretically in roll only, using a quasi-steady version of the above theory, as a comparison. Published data on steady lift and drag curves have been used, but these do not account for the motion of the separation point on the surface of the rotor and cylinder in unsteady motion. Although offshore wind turbines are stationary, which makes resistance and propulsion irrelevant, they suffer from wave-induced excitations unknown on land-based wind turbines, and also from shallow water effects and interactions with the aerodynamics which are unknown to deep-water offshore installations. The aerodynamic damping provided by the wind turbine will reduce the induced motions of the supporting structure. The emphasis has been on the aerodynamics, rather than the hydrodynamics, because the hydrodynamic components of roll damping have been extensively researched elsewhere, and have proved impossible to evaluate in any generally applicable manner.
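
    To make the quasi-steady linearisation concrete, the sketch below estimates the aerodynamic roll damping a single device contributes: a device of area A at height h above the roll axis sees its apparent-wind angle change by ph/U when the hull rolls at rate p, and linearising the steady lift curve turns the resulting lift change into a moment opposing the roll. This is a minimal illustration of the general principle, not the thesis's method; the function name and all parameter values are invented for the example.

```python
import math

RHO_AIR = 1.225  # air density, kg/m^3

def quasi_steady_roll_damping(U, A, h, dCL_dalpha):
    """Linearised aerodynamic roll damping coefficient B (N*m*s/rad).

    U           apparent wind speed (m/s)
    A           device reference area (m^2)
    h           height of the aerodynamic centre above the roll axis (m)
    dCL_dalpha  steady lift-curve slope (per radian), from published data
    """
    # Roll rate p gives a transverse velocity p*h at the device, i.e. an
    # angle-of-attack change p*h/U. The extra lift 0.5*rho*U^2*A*dCLa*(p*h/U)
    # acts at height h, so the opposing moment is B*p with:
    return 0.5 * RHO_AIR * U * A * h**2 * dCL_dalpha

# Hypothetical 100 m^2 wingsail, centre of effort 15 m above the roll axis,
# 10 m/s apparent wind, thin-aerofoil lift-curve slope of about 2*pi:
B = quasi_steady_roll_damping(U=10.0, A=100.0, h=15.0, dCL_dalpha=2 * math.pi)
print(f"aerodynamic roll damping ~ {B:,.0f} N m s/rad")
```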

    Static Socio-Ecological COVID-19 Vulnerability Index and Vaccine Hesitancy Index for England

    Background: Population characteristics can be used to infer the vulnerability of communities to COVID-19, or the likelihood of high levels of vaccine hesitancy. Communities harder hit by the virus, or at risk of being so, stand to benefit from greater resource allocation than their population size alone would suggest. This study reports a simple but efficacious method of ranking small areas of England by relative characteristics that are linked with COVID-19 vulnerability and vaccine hesitancy. Methods: Publicly available data on a range of characteristics previously linked with either poor COVID-19 outcomes or vaccine hesitancy were collated for all Middle Super Output Areas of England (MSOA, n=6790, excluding the Isles of Scilly), scaled and combined into two numeric indices. Multivariable linear regression was used to build a parsimonious model of vulnerability (static socio-ecological vulnerability index, SEVI) in 60% of MSOAs, and retained variables were used to construct two simple indices. Assuming a monotonic relationship between indices and outcomes, Spearman correlation coefficients were calculated between the SEVI and cumulative COVID-19 case rates at MSOA level in the remaining 40% of MSOAs, over periods both during and outwith national lockdowns. Similarly, a novel vaccine hesitancy index (VHI) was constructed using population characteristics aligned with factors identified by an Office for National Statistics (ONS) survey analysis. The relationship between the VHI and vaccine coverage in people aged 12+ years (as of 2021-06-24) was determined using Spearman correlation. The indices were split into quintiles, and MSOAs within the highest vulnerability and vaccine hesitancy quintiles were mapped. Findings: The SEVI showed a moderate to strong relationship with case rates in the validation dataset across the whole study period, and for every intervening period studied except early in the pandemic, when testing was highly selective. The SEVI was more strongly correlated with case rates than any of its domains (rs 0.59, 95% CI 0.57 to 0.62) and outperformed an existing MSOA-level vulnerability index. The VHI was significantly negatively correlated with COVID-19 vaccine coverage in the validation data at the time of writing (rs -0.43, 95% CI -0.46 to -0.41). London had the largest number and proportion of MSOAs in quintile 5 (most vulnerable/hesitant) of the SEVI and VHI concurrently. Interpretation: The indices presented offer an efficacious way of identifying geographical disparities in COVID-19 risk, thus helping focus resources according to need. Funding: Integrated Covid Hub North East (award number: n/a; grant recipient: Fiona Matthew)
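
    The index-building pipeline described above (scale area-level characteristics, combine them into one score, validate with a rank correlation, then cut quintiles) can be sketched in a few lines. This is a simplified stand-in, not the paper's actual variable set or regression-based selection; the file path and column names are hypothetical placeholders.

```python
import pandas as pd
from scipy.stats import spearmanr

# Hypothetical input: one row per MSOA with the characteristics used
# and the validation outcome (cumulative case rate).
msoa = pd.read_csv("msoa_characteristics.csv")              # placeholder path
features = ["deprivation", "overcrowding", "prop_over_80"]  # placeholder names

# z-score each characteristic so they share a common scale, then sum
# into a single numeric index (here standing in for the SEVI).
scaled = (msoa[features] - msoa[features].mean()) / msoa[features].std()
msoa["sevi"] = scaled.sum(axis=1)

# Validate with a Spearman correlation, which assumes only monotonicity.
rs, p = spearmanr(msoa["sevi"], msoa["case_rate"])
print(f"Spearman rs = {rs:.2f} (p = {p:.2g})")

# Split into quintiles; quintile 5 holds the most vulnerable areas.
msoa["sevi_quintile"] = pd.qcut(msoa["sevi"], 5, labels=[1, 2, 3, 4, 5])
```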

    Front loading the curriculum: early placement experiences enhance career awareness and motivation for students with diverse career options

    Deciding which career path is right can be challenging for undergraduate students, and positive outcomes are linked to early work placements. The aim of the current study was to explore the student experience following the introduction of early career-based awareness-raising and reflective learning opportunities for first-year sport and exercise science students. Students met with the first-year coordinator to discuss career progression and career aspirations, and from this meeting each student was allocated a placement. Following the placement visit, students submitted a reflective piece on their experiences at the placement site, from which six themes were identified: 1) positive experience; 2) degree selection; 3) exposure to and reinforcement of practices; 4) career awareness; 5) supervisor impact; and 6) negative experience. Providing early placements in an observational capacity appears to benefit the first-year experience by helping students consolidate their choice of degree.

    Exposure of monocytes to heat shock does not increase class II expression but modulates antigen-dependent T cell responses

    Expression of heat shock (HS) proteins (HSP) increases after exposure to elevated temperatures or other types of injury, such as oxidative injury. Because of their function as 'molecular chaperones', HSP are suggested to participate in antigen processing and presentation. We have previously reported that HS modulates antigen presentation in a human EBV-transformed B cell line. Here we investigated the effects of HS on MHC class II expression and on antigen processing and presentation by human monocytes. Monocytes were isolated from peripheral blood of normal human volunteers, purified by adherence, then exposed to temperatures ranging from 37 to 45°C for 20 min, allowed to recover for 2 h at 37°C and used for immunofluorescence or as antigen-presenting cells in autologous and heterologous lymphocyte proliferation assays. No increase in class II expression was detected as assessed by flow cytometry. Monocytes (3 × 10^4) and lymphocytes (1 × 10^5) were co-cultured for 5 days in the presence of several antigens [diphtheria toxoid, tetanus toxoid or purified protein derivative (PPD)] and labeled with 1 μCi [3H]thymidine for 16 h. Pre-exposure to HS (44°C) significantly (P < 0.001) increased T cell responses to diphtheria toxoid, whereas the effects on responses to the other antigens (tetanus toxoid and PPD) were not significant. HS increased neither heterologous T cell responses nor T cell proliferation induced by non-processed superantigens such as staphylococcal enterotoxin B. The effect of HS was inhibited by actinomycin B and thus appeared dependent upon HSP synthesis. HSP-mediated increases in antigen processing may potentiate the ongoing immune response at inflammatory sites.
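
    Proliferation readouts of the kind described above are commonly summarised as a stimulation index: mean [3H]thymidine counts per minute (cpm) in antigen-stimulated wells divided by the unstimulated background, compared between heat-shocked and control monocytes. The sketch below shows that calculation only; the triplicate counts are invented for illustration and are not the paper's data.

```python
from statistics import mean

def stimulation_index(antigen_cpm, background_cpm):
    """Mean cpm of antigen-stimulated wells over unstimulated background."""
    return mean(antigen_cpm) / mean(background_cpm)

# Invented triplicate cpm values for control vs heat-shocked monocytes:
si_control = stimulation_index([4200, 3900, 4500], [600, 550, 620])
si_heat_shock = stimulation_index([9800, 10400, 9500], [640, 580, 610])
print(f"SI control: {si_control:.1f}, SI after 44°C HS: {si_heat_shock:.1f}")
```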

    Can vaccination roll-out be more equitable if population risk is taken into account?

    Background: COVID-19 vaccination in many countries, including England, has been prioritised primarily by age. However, people of the same age can have very different health statuses. Frailty is a commonly used metric of health and has been found to be more strongly associated with mortality than age among COVID-19 inpatients. Methods: We compared the number of first vaccine doses administered across the 135 NHS Clinical Commissioning Groups (CCGs) of England to both the over-50 population and the estimated frail population in each area. Area-based frailty estimates were generated using the English Longitudinal Study of Ageing (ELSA), a national survey of older people. We also compared the number of doses to the number of people with other risk factors associated with COVID-19: atrial fibrillation, chronic kidney disease, diabetes, learning disabilities, obesity and smoking. Results: We estimate that after 79 days of the vaccine programme, across all Clinical Commissioning Group areas, the number of people who received a first vaccine per frail person ranged from 4.4 (95% CI 4.0-4.8) to 20.1 (95% CI 18.3-21.9). The prevalence of other risk factors was also poorly associated with the prevalence of vaccination across England. Conclusions: Vaccination with age-based priority created area-based inequities in the number of doses administered relative to the number of people who are frail or have other risk factors associated with COVID-19. As frailty has previously been found to be more strongly associated with mortality than age for COVID-19 inpatients, an age-based priority system may increase the risk of mortality in some areas during the vaccine roll-out period. Authorities planning COVID-19 vaccination programmes should consider the disadvantages of an age-based priority system.
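
    The core comparison above is a simple per-area ratio: first doses administered divided by the estimated frail population, with the spread across areas indicating inequity. A minimal sketch follows; the three areas and all figures are invented examples, not NHS or ELSA data.

```python
import pandas as pd

# Invented example values for three areas; real figures would come from
# NHS dose counts and ELSA-derived frailty estimates.
ccg = pd.DataFrame({
    "ccg": ["A", "B", "C"],
    "first_doses": [120_000, 80_000, 95_000],
    "frail_population": [9_000, 18_000, 6_500],
})

ccg["doses_per_frail_person"] = ccg["first_doses"] / ccg["frail_population"]
print(ccg)
print("range:", round(ccg["doses_per_frail_person"].min(), 1),
      "to", round(ccg["doses_per_frail_person"].max(), 1))
```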

    Interventions to improve water quality for preventing diarrhoea

    Background: Diarrhoea is a major cause of death and disease, especially among young children in low-income countries. In these settings, many infectious agents associated with diarrhoea are spread through water contaminated with faeces. In remote and low-income settings, source-based water quality improvement includes providing protected groundwater (springs, wells, and bore holes), or harvested rainwater as an alternative to surface sources (rivers and lakes). Point-of-use water quality improvement interventions include boiling, chlorination, flocculation, filtration, or solar disinfection, mainly conducted at home. Objectives: To assess the effectiveness of interventions to improve water quality for preventing diarrhoea. Search methods: We searched the Cochrane Infectious Diseases Group Specialized Register (11 November 2014), CENTRAL (the Cochrane Library, 7 November 2014), MEDLINE (1966 to 10 November 2014), EMBASE (1974 to 10 November 2014), and LILACS (1982 to 7 November 2014). We also handsearched relevant conference proceedings, contacted researchers and organizations working in the field, and checked references from identified studies through 11 November 2014. Selection criteria: Randomized controlled trials (RCTs), quasi-RCTs, and controlled before-and-after studies (CBA) comparing interventions aimed at improving the microbiological quality of drinking water with no intervention in children and adults. Data collection and analysis: Two review authors independently assessed trial quality and extracted data. We used meta-analyses to estimate pooled measures of effect, where appropriate, and investigated potential sources of heterogeneity using subgroup analyses. We assessed the quality of evidence using the GRADE approach. Main results: Forty-five cluster-RCTs, two quasi-RCTs, and eight CBA studies, including over 84,000 participants, met the inclusion criteria. Most included studies were conducted in low- or middle-income countries (LMICs) (50 studies) with unimproved water sources (30 studies) and unimproved or unclear sanitation (34 studies). The primary outcome in most studies was self-reported diarrhoea, which is at high risk of bias due to the lack of blinding in over 80% of the included studies. Source-based water quality improvements: There is currently insufficient evidence to know if source-based improvements such as protected wells, communal tap stands, or chlorination/filtration of community sources consistently reduce diarrhoea (one cluster-RCT, five CBA studies, very low quality evidence). We found no studies evaluating reliable piped-in water supplies delivered to households. Point-of-use water quality interventions: On average, distributing water disinfection products for use at the household level may reduce diarrhoea by around one quarter (home chlorination products: RR 0.77, 95% CI 0.65 to 0.91; 14 trials, 30,746 participants, low quality evidence; flocculation and disinfection sachets: RR 0.69, 95% CI 0.58 to 0.82; four trials, 11,788 participants, moderate quality evidence). However, there was substantial heterogeneity in the size of the effect estimates between individual studies. Point-of-use filtration systems probably reduce diarrhoea by around a half (RR 0.48, 95% CI 0.38 to 0.59; 18 trials, 15,582 participants, moderate quality evidence).
Important reductions in diarrhoea episodes were shown with ceramic filters, biosand systems and LifeStraw® filters (ceramic: RR 0.39, 95% CI 0.28 to 0.53; eight trials, 5763 participants, moderate quality evidence; biosand: RR 0.47, 95% CI 0.39 to 0.57; four trials, 5504 participants, moderate quality evidence; LifeStraw®: RR 0.69, 95% CI 0.51 to 0.93; three trials, 3259 participants, low quality evidence). Plumbed-in filters have only been evaluated in high-income settings (RR 0.81, 95% CI 0.71 to 0.94; three trials, 1056 participants, fixed effects model). In low-income settings, solar water disinfection (SODIS) by distribution of plastic bottles with instructions to leave filled bottles in direct sunlight for at least six hours before drinking probably reduces diarrhoea by around a third (RR 0.62, 95% CI 0.42 to 0.94; four trials, 3460 participants, moderate quality evidence). In subgroup analyses, larger effects were seen in trials with higher adherence and in trials that provided a safe storage container. In most cases, the reduction in diarrhoea shown in the studies was evident in settings with improved and unimproved water sources and sanitation. Authors' conclusions: Interventions that address the microbial contamination of water at the point of use may be important interim measures to improve drinking water quality until homes can be reached with safe, reliable, piped-in water connections. The average estimates of effect for each individual point-of-use intervention generally show important effects. Comparisons between these estimates do not provide evidence of superiority of one intervention over another, as such comparisons are confounded by the study setting, design, and population. Further studies assessing the effects of household connections and chlorination at the point of delivery will help improve our knowledge base. As evidence suggests effectiveness improves with adherence, studies assessing programmatic approaches to optimising coverage and long-term utilisation of these interventions among vulnerable populations could also help strategies to improve health outcomes.
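
    Pooled risk ratios like those reported above come from weighting each trial's log risk ratio by the inverse of its variance. The sketch below shows the fixed-effect version of that calculation (the review mostly uses random-effects models, which add a between-trial variance term); the three trials and their values are invented for illustration.

```python
import math

# (log risk ratio, standard error of the log RR) for three invented trials:
trials = [(math.log(0.45), 0.15), (math.log(0.52), 0.12), (math.log(0.50), 0.20)]

# Inverse-variance weights: precise trials count for more.
weights = [1 / se ** 2 for _, se in trials]
pooled_log_rr = sum(w * lrr for (lrr, _), w in zip(trials, weights)) / sum(weights)
pooled_se = math.sqrt(1 / sum(weights))

# Back-transform to the risk-ratio scale with a 95% confidence interval.
lo = math.exp(pooled_log_rr - 1.96 * pooled_se)
hi = math.exp(pooled_log_rr + 1.96 * pooled_se)
print(f"pooled RR {math.exp(pooled_log_rr):.2f} (95% CI {lo:.2f} to {hi:.2f})")
```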

    The association between frailty, care receipt and unmet need for care with the risk of hospital admissions

    Background: Frailty is characterised by a decline in physical, cognitive, energy, and health reserves and is linked to greater functional dependency and higher social care utilisation. However, the relationship between receiving care, or receiving insufficient care, among older people of different frailty status and the risk of unplanned all-cause hospital admission, or of falls and fractures, remains unclear. Methods and findings: This study used information from 7,656 adults aged 60 and older participating in waves 6-8 of the English Longitudinal Study of Ageing (ELSA). Care status was assessed through received care and self-reported unmet care needs, while frailty was measured using a frailty index. Competing-risk regression analysis was used (with death as a competing risk), adjusted for demographic and socioeconomic confounders. Around a quarter of the participants received care, of whom approximately 60% received low levels of care and the rest high levels. Older people who received either low or high levels of care had a higher risk of unplanned admission, independent of frailty status. Unmet need for care was not significantly associated with an increased risk of unplanned admission compared to receiving no care. Older people in receipt of care had an increased risk of hospitalisation due to falls, but not fractures, compared to those who received no care, after adjustment for covariates including frailty status. Conclusions: Care receipt substantially increases the risk of hospitalisation, suggesting this is a group worthy of focus for preventive interventions.
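
    A frailty index of the deficit-accumulation kind commonly used with ELSA data is simply the proportion of measured health deficits an individual has, each item scored in [0, 1]. The sketch below shows that calculation; the number of items and the cut-point are illustrative conventions, not taken from this study.

```python
def frailty_index(deficits):
    """Proportion of measured deficits present.

    deficits: iterable of scores in [0, 1], one per measured health item
    (symptoms, conditions, functional limitations, etc.).
    """
    deficits = list(deficits)
    if not deficits:
        raise ValueError("no deficits measured")
    return sum(deficits) / len(deficits)

# e.g. 12 of 40 binary items scored as present (1) rather than absent (0):
fi = frailty_index([1] * 12 + [0] * 28)
print(f"FI = {fi:.2f}")  # 0.30; FI >= 0.25 is a commonly used 'frail' cut-point
```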