
    Polarization due to rotational distortion in the bright star Regulus

    This is the full published article (retrieved from the version posted on arXiv 6 months after publication), including the Methods and Supplementary Information sections: 33 pages, 10 figures, 8 tables.
    Polarization in stars was first predicted by Chandrasekhar [1], who calculated a substantial linear polarization at the stellar limb for a pure electron-scattering atmosphere. This polarization will average to zero when integrated over a spherical star but could be detected if the symmetry is broken, for example by the eclipse of a binary companion. Nearly 50 years ago, Harrington and Collins [2] modeled another way of breaking the symmetry and producing net polarization: the distortion of a rapidly rotating hot star. Here we report the first detection of this effect. Observations of the linear polarization of Regulus, with two different high-precision polarimeters, range from +42 parts-per-million (ppm) at a wavelength of 741 nm to -22 ppm at 395 nm. The reversal from red to blue is a distinctive feature of rotation-induced polarization. Using a new set of models for the polarization of rapidly rotating stars, we find that Regulus is rotating at 96.5(+0.6/-0.8)% of its critical angular velocity for break-up and has an inclination greater than 76.5 degrees. The rotation axis of the star is at a position angle of 79.5+/-0.7 degrees. The conclusions are independent of, but in good agreement with, the results of previously published interferometric observations of Regulus [3]. The accurate measurement of rotation in early-type stars is important for understanding their stellar environments [4] and the course of their evolution [5].
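    The quoted ppm-level signals and the rotation-axis position angle are tied to the normalised Stokes parameters q = Q/I and u = U/I by the standard relations for linear polarization; the short Python sketch below illustrates that conversion with hypothetical values, not the published Regulus measurements or the authors' modelling code.

        import math

        def polarization_from_stokes(q_ppm, u_ppm):
            """Degree of linear polarization (ppm) and position angle (degrees)
            from normalised Stokes q and u given in parts per million.
            Standard relations: p = sqrt(q^2 + u^2), theta = 0.5 * atan2(u, q)."""
            p = math.hypot(q_ppm, u_ppm)
            theta = (0.5 * math.degrees(math.atan2(u_ppm, q_ppm))) % 180.0
            return p, theta

        # Illustrative input only (not measured values for Regulus):
        print(polarization_from_stokes(q_ppm=-8.0, u_ppm=40.0))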

    A review of tennis racket performance parameters

    The application of advanced engineering to tennis racket design has influenced the nature of the sport. As a result, the International Tennis Federation has established rules to limit performance, with the aim of protecting the nature of the game. This paper illustrates how changes to the racket affect the player-racket system. The review integrates engineering and biomechanical issues related to tennis racket performance, covering the biomechanical characteristics of tennis strokes, tennis racket performance, the effect of racket parameters on ball rebound, and biomechanical interactions. Racket properties influence the rebound of the ball. Ball rebound speed increases with frame stiffness and as string tension decreases. Reducing inter-string contacting forces increases rebound topspin. Historical trends and predictive modelling indicate swingweights of around 0.030–0.035 kg·m² are best for high ball speed and accuracy. To fully understand the effect of their design changes, engineers should use impact conditions in their experiments, or models, which reflect those of actual tennis strokes. Sports engineers, therefore, benefit from working closely with biomechanists to ensure realistic impact conditions.
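    One common way to express how racket properties translate into rebound ball speed is the apparent coefficient of restitution (ACOR) at the impact point; the sketch below shows that standard relation for a collinear impact, using illustrative numbers rather than values taken from the review.

        def rebound_speed(ball_speed, racket_speed, acor):
            """Outbound ball speed (m/s) for a collinear impact, given the incoming
            ball speed, the racket-head speed at the impact point, and the apparent
            coefficient of restitution (ACOR) at that point."""
            return acor * ball_speed + (1.0 + acor) * racket_speed

        # Hypothetical groundstroke: 15 m/s incoming ball, 25 m/s racket head, ACOR 0.4
        print(rebound_speed(15.0, 25.0, 0.4))  # 41.0 m/s outbound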

    Characterisation of ball degradation events in professional tennis

    This is an Open Access article published by Springer under the Creative Commons Attribution 4.0 International Licence (CC BY); full details of this licence are available at http://creativecommons.org/licenses/by/4.0/
    Tennis balls are acknowledged to degrade with use and are replaced at regular intervals during professional matches to maintain consistency and uniformity in performance, such that the game is not adversely affected. Balls are subject to the International Tennis Federation's (ITF) ball approval process, which includes a degradation test to ensure a minimum standard of performance. The aim of this investigation was to establish whether the ITF degradation test can assess ball longevity and rate of degradation, and to determine whether there is a need for a new degradation test that is more representative of in-play conditions. Ball tracking data from four professional events, spanning the three major court surfaces and including both men's and women's matches, were analysed. The frequencies of first serves, second serves, racket impacts and surface impacts were assessed, and the corresponding distribution of ball speed and (for surface impacts) impact angle was determined. Comparison of ball impact frequency and conditions between in-play data and the ITF degradation test indicated that the development of a new test, more representative of in-play data, would be advantageous in determining ball longevity and rate of degradation with use. Assessment of data from the different surfaces highlighted that grass courts subjected the ball to fewer racket and surface impacts than hard courts or clay, which in turn appears to influence the distribution of ball speed on impact with the surface or racket, suggesting a surface-specific degradation test may be beneficial. As a result of these findings, a new test protocol has been proposed, using the in-play data to define the frequency of impacts and impact conditions that equate to nine games of professional tennis across the different surfaces.
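    As a rough illustration of the kind of summary such ball-tracking data supports (hypothetical record layout and values, not the authors' dataset or pipeline), impact frequencies and speed distributions can be tallied per event type:

        from collections import Counter

        # Hypothetical per-ball event records: (event_type, speed in m/s)
        events = [
            ("first_serve", 50.2),
            ("racket_impact", 28.4),
            ("surface_impact", 22.1),
            ("surface_impact", 19.8),
            ("second_serve", 38.6),
        ]

        # Frequency of each impact type
        impact_counts = Counter(event_type for event_type, _ in events)

        # Distribution of surface-impact speeds in 5 m/s bins
        surface_speed_bins = Counter(
            int(speed // 5) * 5 for event_type, speed in events if event_type == "surface_impact"
        )

        print(impact_counts)
        print(surface_speed_bins)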

    Functional Genomics Unique to Week 20 Post Wounding in the Deep Cone/Fat Dome of the Duroc/Yorkshire Porcine Model of Fibroproliferative Scarring

    Background: Hypertrophic scar was first described over 100 years ago, and PubMed has more than 1,000 references on the topic. Nevertheless, prevention and treatment remain poor because 1) there has been no validated animal model; 2) human scar tissue, which is impossible to obtain in a controlled manner, has been the only source for study; 3) tissues typically have been homogenized, mixing cell populations; and 4) gene-by-gene studies are incomplete.
    Methodology/Principal Findings: We have assembled a system that overcomes these barriers and permits the study of genome-wide gene expression in microanatomical locations, in shallow and deep partial-thickness wounds, and in pigmented and non-pigmented skin, using the Duroc (pigmented, fibroproliferative)/Yorkshire (non-pigmented, non-fibroproliferative) porcine model. We used this system to obtain the differential transcriptome at 1, 2, 3, 12 and 20 weeks post wounding. It is not clear when fibroproliferation begins, but it is fully developed in humans and the Duroc breed at 20 weeks; we therefore obtained the derivative functional genomics unique to 20 weeks post wounding. We also obtained long-term, forty-six-week follow-up with the model.
    Conclusions/Significance: 1) The scars are still thick at forty-six weeks post wounding, further validating the model. 2) The differential transcriptome provides new insights into the fibroproliferative process, as several genes thought fundamental to fibroproliferation are absent and others that are differentially expressed are newly implicated. 3) The findings in the derivative functional genomics support old concepts, which further validates the model, and suggest new avenues for reductionist exploration. In the future, these findings will be searched for directed networks likely involved in cutaneous fibroproliferation. These clues may lead to a better understanding of the systems biology of cutaneous fibroproliferation, and ultimately to the prevention and treatment of hypertrophic scarring.
    Funding: National Institute on Disability and Rehabilitation Research (H133G050022); National Institutes of Health (1R21GM074673, 5U54GM062119-09); Washington State Council of Fire Fighters Burn Foundation; Northwest Burn Foundation.
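    A minimal sketch of the set logic behind isolating expression changes unique to week 20 (hypothetical gene identifiers and sets; the published analysis works from full genome-wide differential-expression statistics, not this shortcut):

        # Hypothetical sets of differentially expressed genes at each time point (weeks)
        deg_by_week = {
            1:  {"COL1A1", "TGFB1", "MMP2"},
            2:  {"COL1A1", "TGFB1"},
            3:  {"COL3A1", "TGFB1"},
            12: {"COL3A1", "ACTA2"},
            20: {"COL3A1", "ACTA2", "LOXL2", "POSTN"},
        }

        # Genes differentially expressed at week 20 but at no earlier time point
        earlier_weeks = set().union(*(deg_by_week[w] for w in (1, 2, 3, 12)))
        unique_to_week_20 = deg_by_week[20] - earlier_weeks
        print(sorted(unique_to_week_20))  # ['LOXL2', 'POSTN']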

    Pooled analysis of WHO Surgical Safety Checklist use and mortality after emergency laparotomy

    Background: The World Health Organization (WHO) Surgical Safety Checklist has fostered safe practice for 10 years, yet its place in emergency surgery has not been assessed on a global scale. The aim of this study was to evaluate reported checklist use in emergency settings and examine the relationship with perioperative mortality in patients who had emergency laparotomy.
    Methods: In two multinational cohort studies, adults undergoing emergency laparotomy were compared with those having elective gastrointestinal surgery. Relationships between reported checklist use and mortality were determined using multivariable logistic regression and bootstrapped simulation.
    Results: Of 12 296 patients included from 76 countries, 4843 underwent emergency laparotomy. After adjusting for patient and disease factors, checklist use before emergency laparotomy was more common in countries with a high Human Development Index (HDI) (2455 of 2741, 89.6 per cent) compared with that in countries with a middle (753 of 1242, 60.6 per cent; odds ratio (OR) 0.17, 95 per cent c.i. 0.14 to 0.21, P < 0.001) or low (363 of 860, 42.2 per cent; OR 0.08, 0.07 to 0.10, P < 0.001) HDI. Checklist use was less common in elective surgery than for emergency laparotomy in high-HDI countries (risk difference -9.4 (95 per cent c.i. -11.9 to -6.9) per cent; P < 0.001), but the relationship was reversed in low-HDI countries (+12.1 (+7.0 to +17.3) per cent; P < 0.001). In multivariable models, checklist use was associated with a lower 30-day perioperative mortality (OR 0.60, 0.50 to 0.73; P < 0.001). The greatest absolute benefit was seen for emergency surgery in low- and middle-HDI countries.
    Conclusion: Checklist use in emergency laparotomy was associated with a significantly lower perioperative mortality rate. Checklist use in low-HDI countries was half that in high-HDI countries.
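    A minimal sketch of the type of adjusted model and bootstrapped confidence interval described in the Methods (illustrative column names and covariates, not the study dataset or the authors' exact specification):

        import numpy as np
        import statsmodels.formula.api as smf

        # df is assumed to be a pandas DataFrame with one row per patient and columns:
        # died_30d (0/1), checklist_used (0/1), age, asa_grade, hdi_tier
        def adjusted_odds_ratio(df):
            model = smf.logit(
                "died_30d ~ checklist_used + age + C(asa_grade) + C(hdi_tier)", data=df
            ).fit(disp=0)
            return float(np.exp(model.params["checklist_used"]))

        def bootstrap_ci(df, n_boot=1000, seed=0):
            """Percentile bootstrap confidence interval for the adjusted odds ratio."""
            rng = np.random.default_rng(seed)
            ors = [
                adjusted_odds_ratio(df.sample(frac=1.0, replace=True,
                                              random_state=int(rng.integers(10**9))))
                for _ in range(n_boot)
            ]
            return np.percentile(ors, [2.5, 97.5])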

    Review of the anatase to rutile phase transformation


    Global variation in anastomosis and end colostomy formation following left-sided colorectal resection

    Background: End colostomy rates following colorectal resection vary across institutions in high-income settings, being influenced by patient, disease, surgeon and system factors. This study aimed to assess global variation in end colostomy rates after left-sided colorectal resection.
    Methods: This study comprised an analysis of the GlobalSurg-1 and -2 international, prospective, observational cohort studies (2014, 2016), including consecutive adult patients undergoing elective or emergency left-sided colorectal resection within discrete 2-week windows. Countries were grouped into high-, middle- and low-income tertiles according to the United Nations Human Development Index (HDI). Factors associated with colostomy formation versus primary anastomosis were explored using a multilevel, multivariable logistic regression model.
    Results: In total, 1635 patients from 242 hospitals in 57 countries undergoing left-sided colorectal resection were included: 113 (6·9 per cent) from low-HDI, 254 (15·5 per cent) from middle-HDI and 1268 (77·6 per cent) from high-HDI countries. There was a higher proportion of patients with perforated disease (57·5, 40·9 and 35·4 per cent; P < 0·001) and subsequent use of end colostomy (52·2, 24·8 and 18·9 per cent; P < 0·001) in low- compared with middle- and high-HDI settings. The association with colostomy use in low-HDI settings persisted (odds ratio (OR) 3·20, 95 per cent c.i. 1·35 to 7·57; P = 0·008) after risk adjustment for malignant disease (OR 2·34, 1·65 to 3·32; P < 0·001), emergency surgery (OR 4·08, 2·73 to 6·10; P < 0·001), time to operation at least 48 h (OR 1·99, 1·28 to 3·09; P = 0·002) and disease perforation (OR 4·00, 2·81 to 5·69; P < 0·001).
    Conclusion: Global differences existed in the proportion of patients receiving end stomas after left-sided colorectal resection based on income, which went beyond case mix alone.
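    A minimal sketch of a multilevel logistic model of that general form, with a random intercept for hospital (illustrative variable names, not the GlobalSurg dataset or the authors' exact model):

        from statsmodels.genmod.bayes_mixed_glm import BinomialBayesMixedGLM

        def fit_colostomy_model(df):
            """df: one row per patient with columns end_colostomy (0/1), hdi_tier,
            malignant, emergency, perforation, delay_48h and hospital_id."""
            model = BinomialBayesMixedGLM.from_formula(
                "end_colostomy ~ C(hdi_tier) + malignant + emergency + perforation + delay_48h",
                vc_formulas={"hospital": "0 + C(hospital_id)"},  # random intercept per hospital
                data=df,
            )
            return model.fit_vb()  # approximate (variational Bayes) fit; fixed effects are log-odds ratios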