3,280 research outputs found

    Expanding the Fraction of the Universe In Which We Can Observe Supernova Made Gravitational Waves

    Gravitational waves are a prediction of Einstein's Theory of General Relativity. These waves are produced by high-energy events such as collapsing supernovae, high-mass binary systems, and more. They travel through spacetime unimpeded by obstacles such as dust, and thereby present a unique opportunity to study phenomena that are otherwise obscured by dust and other materials. However, gravitational waves are hard to detect, since they exert only a small effect on spacetime. Like light waves, gravitational waves come in polarizations. We plan to prove that two polarizations coming from the same source can be separated by at most one fourth of the period of the waves. We will do this by deriving the relationship through the quadrupole expansion, and by checking numerically whether simulated waves obey this relationship.
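
    The quarter-period bound lends itself to a simple numerical sanity check. The sketch below is a hypothetical illustration (not the authors' derivation or code): it assumes two idealised sinusoidal polarizations from the same source, offset by exactly a quarter period, and recovers that offset from their normalised inner product.

```python
import numpy as np

# Toy check (assumed waveforms, not simulated supernova data): two polarizations
# of the same source, modelled as sinusoids a quarter period apart.
T = 0.01                                  # assumed wave period in seconds
fs = 100_000                              # sampling rate in Hz
t = np.arange(0, 50 * T, 1 / fs)          # 50 periods of signal

h_plus = np.cos(2 * np.pi * t / T)        # "plus" polarization
h_cross = np.sin(2 * np.pi * t / T)       # "cross" polarization, T/4 behind

# Relative phase from the normalised inner product, then converted to a time offset.
inner = np.dot(h_plus, h_cross) / (np.linalg.norm(h_plus) * np.linalg.norm(h_cross))
phase = np.arccos(np.clip(inner, -1.0, 1.0))
offset = phase / (2 * np.pi) * T

print(f"measured offset: {offset:.6f} s, quarter period: {T / 4:.6f} s")
print("within the T/4 bound:", offset <= T / 4 + 1 / fs)
```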

    Self-adaptive Multiprecision Preconditioners on Multicore and Manycore Architectures

    Based on the premise that preconditioners needed for scientific computing are not only required to be robust in the numerical sense, but also scalable to thousands of light-weight cores, we argue that this twofold goal is achieved for the recently developed self-adaptive multi-elimination preconditioner. For this purpose, we revise the underlying idea and analyze the performance of implementations realized in the PARALUTION and MAGMA open-source software libraries on GPU architectures (using either CUDA or OpenCL), Intel's Many Integrated Core Architecture, and Intel's Sandy Bridge processor. The comparison with other well-established preconditioners such as multi-colored Gauss-Seidel, ILU(0) and multi-colored ILU(0) shows that the twofold goal of a numerically stable, cross-platform, performant algorithm is achieved.
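
    The self-adaptive multi-elimination preconditioner itself is implemented in PARALUTION and MAGMA and is not reproduced here. As a stand-in, the minimal sketch below shows the general pattern such preconditioners accelerate: a Krylov solve with and without an incomplete-LU preconditioner, on an assumed 2D Poisson test matrix (Python/SciPy, not the paper's code).

```python
import numpy as np
import scipy.sparse as sp
import scipy.sparse.linalg as spla

# Assumed test problem: 2D Poisson matrix from a 5-point stencil on an n x n grid.
n = 64
I = sp.identity(n)
T = sp.diags([-1, 2, -1], [-1, 0, 1], shape=(n, n))
A = (sp.kron(I, T) + sp.kron(T, I)).tocsc()
b = np.ones(A.shape[0])

# Incomplete LU factorization used as the preconditioner M ~= A^{-1}.
ilu = spla.spilu(A, drop_tol=1e-4, fill_factor=10)
M = spla.LinearOperator(A.shape, matvec=ilu.solve)

x_plain, info_plain = spla.gmres(A, b, restart=30, maxiter=200)
x_prec, info_prec = spla.gmres(A, b, M=M, restart=30, maxiter=200)

# info == 0 means the solver converged within the iteration budget.
print("no preconditioner :", info_plain, np.linalg.norm(b - A @ x_plain))
print("ILU preconditioner:", info_prec, np.linalg.norm(b - A @ x_prec))
```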

    Exploring the performance reserve: Effect of different magnitudes of power output deception on 4,000 m cycling time-trial performance

    Purpose: The aim of the present study was to investigate whether a magnitude of deception of 5% in power output would lead to a greater reduction in the time taken to complete a 4000 m cycling time trial (TT) than a magnitude of deception of 2% in power output, which we have previously shown can lead to a small change in 4000 m cycling TT performance. Methods: Ten trained male cyclists completed four 4000 m cycling TTs. The first served as a habituation and the second as a baseline for subsequent trials. During trials three and four, participants raced against a pacer that was set, in a randomized order, at a mean power output 2% (+2% TT) or 5% (+5% TT) higher than their baseline performance. However, participants were misled into believing that the power output of the pacer was an accurate representation of their baseline performance on both occasions. Cardiorespiratory responses were recorded throughout each TT and used to estimate the energy contribution from aerobic and anaerobic metabolism. Results: Participants finished the +2% TT in a significantly shorter time than at baseline (p = 0.01), with the difference in performance likely attributable to a greater anaerobic contribution to total power output (p = 0.06). There was no difference in performance between the +5% TT and either the +2% TT or the baseline trial. Conclusions: The results suggest that a performance reserve, involving anaerobic energy contribution, is conserved and can be utilised given a belief that the exercise will be sustainable; however, there is an upper limit to how much deception can be tolerated. These findings have implications for performance enhancement in athletes and for our understanding of the nature of fatigue during high-intensity exercise.
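
    As a rough illustration of the deception arithmetic and of one common way to split aerobic and anaerobic contributions (an assumption for illustration, not necessarily the authors' exact method), consider the sketch below; all numeric values are invented.

```python
# Hypothetical values, not data from the study.
baseline_mean_power_w = 300.0                      # baseline 4000 m TT mean power

pacer_plus_2pct = baseline_mean_power_w * 1.02     # "+2% TT" pacer setting
pacer_plus_5pct = baseline_mean_power_w * 1.05     # "+5% TT" pacer setting
print(f"+2% pacer: {pacer_plus_2pct:.1f} W, +5% pacer: {pacer_plus_5pct:.1f} W")

# Assumed split: aerobic power from oxygen uptake and an assumed gross efficiency,
# anaerobic power as the remainder of the external power output.
vo2_l_per_min = 4.2                                # assumed oxygen uptake during the TT
energy_equiv_kj_per_l_o2 = 20.9                    # approximate energy equivalent of O2
gross_efficiency = 0.20                            # assumed gross efficiency

aerobic_power_w = vo2_l_per_min * energy_equiv_kj_per_l_o2 * 1000 / 60 * gross_efficiency
anaerobic_power_w = max(0.0, pacer_plus_2pct - aerobic_power_w)
print(f"aerobic ~{aerobic_power_w:.0f} W, anaerobic ~{anaerobic_power_w:.0f} W")
```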

    Astronomical Spectroscopy

    Spectroscopy is one of the most important tools that an astronomer has for studying the universe. This chapter begins by discussing the basics, including the different types of optical spectrographs, with extension to the ultraviolet and the near-infrared. Emphasis is given to the fundamentals of how spectrographs are used and the trade-offs involved in designing an observational experiment. It then covers observing and reduction techniques, noting that some of the standard practices of flat-fielding often actually degrade the quality of the data rather than improve it. Although the focus is on point sources, spatially resolved spectroscopy of extended sources is also briefly discussed. Discussion of differential extinction, the impact of crowding, multi-object techniques, optimal extractions, flat-fielding considerations, and determining radial velocities and velocity dispersions provides the spectroscopist with the fundamentals needed to obtain the best data. Finally, the chapter combines the previous material by providing some examples of real-life observing experiences with several typical instruments. Comment: An abridged version of a chapter to appear in Planets, Stars and Stellar Systems, to be published in 2011 by Springer. Slightly revised.
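
    As a small worked example of one topic the chapter covers, the sketch below estimates a radial velocity from the Doppler shift of a single spectral line using the non-relativistic approximation v = c * (lambda_obs - lambda_rest) / lambda_rest; the observed line centre is an assumed value, not data from the chapter.

```python
# Hypothetical measurement: a radial velocity from a shifted H-alpha line.
C_KM_S = 299_792.458             # speed of light in km/s

lambda_rest = 6562.79            # H-alpha rest wavelength, Angstroms
lambda_observed = 6564.61        # assumed measured line centre, Angstroms

radial_velocity = C_KM_S * (lambda_observed - lambda_rest) / lambda_rest
print(f"radial velocity ~ {radial_velocity:+.1f} km/s")   # positive means receding
```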

    Survey of the quality of experimental design, statistical analysis and reporting of research using animals

    For scientific, ethical and economic reasons, experiments involving animals should be appropriately designed, correctly analysed and transparently reported. This increases the scientific validity of the results and maximises the knowledge gained from each experiment. A minimum amount of relevant information must be included in scientific publications to ensure that the methods and results of a study can be reviewed, analysed and repeated; omitting essential information can raise scientific and ethical concerns. We report the findings of a systematic survey of reporting, experimental design and statistical analysis in published biomedical research using laboratory animals. Medline and EMBASE were searched for studies reporting research on live rats, mice and non-human primates carried out in UK and US publicly funded research establishments. Detailed information was collected from 271 publications about the objective or hypothesis of the study; the number, sex, age and/or weight of the animals used; and the experimental and statistical methods. Only 59% of the studies stated the hypothesis or objective of the study and the number and characteristics of the animals used. Appropriate and efficient experimental design is a critical component of high-quality science, yet most of the papers surveyed did not use randomisation (87%) or blinding (86%) to reduce bias in animal selection and outcome assessment. Only 70% of the publications that used statistical methods described their methods and presented the results with a measure of error or variability. This survey has identified a number of issues that need to be addressed in order to improve experimental design and reporting in publications describing research using animals. Scientific publication is a powerful and important source of information; the authors of scientific publications therefore have a responsibility to describe their methods and results comprehensively, accurately and transparently, and peer reviewers and journal editors share the responsibility to ensure that published studies fulfil these criteria.
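
    As an illustration of one of the bias-reduction practices the survey assessed, the sketch below shows a simple seeded random allocation of hypothetical animals to two groups; it is not taken from any of the surveyed studies.

```python
import random

# Hypothetical animals and groups, for illustration only.
animal_ids = [f"animal_{i:02d}" for i in range(1, 21)]
groups = ["control", "treatment"]

rng = random.Random(42)                 # fixed seed so the allocation is reproducible
rng.shuffle(animal_ids)

# Deal the shuffled animals out to the groups round-robin.
allocation = {group: animal_ids[i::len(groups)] for i, group in enumerate(groups)}
for group, members in allocation.items():
    print(group, members)
```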

    Reliability of electromyography during 2000 m rowing ergometry

    Purpose: This study aimed to investigate the reliability of surface electromyography (EMG) assessed at seven muscles during three repeated 2000 m rowing ergometer sessions. Methods: Twelve well-trained male rowers participated in a repeated-measures design, performing three 2000 m rowing ergometer sessions interspersed by 3–7 days (S1, S2, S3). Surface electrodes were attached to the gastrocnemius, biceps femoris, gluteus maximus, erector spinae, vastus medialis, rectus abdominis and latissimus dorsi for EMG analysis. Results: No differences existed between 2000 m sessions in EMG amplitude for any of the seven muscles (p = 0.146–0.979). The mean coefficient of variation of EMG for six of the seven muscles was 'acceptable' (12.3–18.6%), although it was classed as 'weak' for the gastrocnemius (28.6%). Mean intra-class correlation coefficient values across muscles ranged from 'moderate' to 'very large' (0.31–0.89). Within-session EMG activation rates of the vastus medialis were greater during the 0–500 m and 1500–2000 m segments, compared with the 500–1000 m and 1000–1500 m segments (p < 0.05). Conclusion: The reliability of EMG values over repeated 2000 m sessions was generally 'acceptable'. However, EMG was seemingly not sensitive enough to detect potential changes in neural activation between sessions with respect to changes in pacing strategy.
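
    As an illustration of the main reliability statistic reported, the sketch below computes within-rower coefficients of variation across the three sessions for one muscle; the EMG amplitudes are invented values, not the study's data.

```python
import numpy as np

# Rows = rowers, columns = sessions S1-S3 (normalised EMG amplitude for one muscle).
emg = np.array([
    [0.52, 0.55, 0.49],
    [0.61, 0.58, 0.64],
    [0.47, 0.51, 0.50],
    [0.70, 0.66, 0.73],
])

# Within-rower coefficient of variation, then the mean CV across rowers.
cv_per_rower = emg.std(axis=1, ddof=1) / emg.mean(axis=1) * 100
mean_cv = cv_per_rower.mean()
print(f"per-rower CV (%): {np.round(cv_per_rower, 1)}, mean CV: {mean_cv:.1f}%")
```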

    Delayed-onset heparin-induced thrombocytopenia presenting with multiple arteriovenous thromboses: case report

    Background: Delayed-onset heparin-induced thrombocytopenia with thrombosis, albeit rare, is a severe side effect of heparin exposure. It can occur within one month after coronary artery bypass grafting (CABG), with manifestation of different thrombotic events. Case presentation: A 59-year-old man presented with weakness, malaise, bilateral lower-limb pitting edema and a suspected diagnosis of deep vein thrombosis 18 days after CABG. A heparin infusion was administered as an anticoagulant. Clinical and paraclinical work-up revealed multiple thrombotic events (stroke, renal failure, deep vein thrombosis, large clots in the heart chambers) and a platelet count of 48 × 10³/μl, whereupon heparin-induced thrombocytopenia was suspected. Heparin was discontinued immediately and an alternative anticoagulant agent was administered, as a result of which the platelet count recovered. Heparin-induced thrombocytopenia, which causes thrombosis, is a serious side effect of heparin therapy. It is worthy of note that no case of delayed-onset heparin-induced thrombocytopenia with thrombosis associated with cardiopulmonary bypass surgery has thus far been reported in Iran. Conclusion: Delayed-onset heparin-induced thrombocytopenia should be suspected in any patient presenting with arterial or venous thromboembolic disorders after recent heparin therapy, even when the heparin exposure dates back to more than a week prior to presentation, and it should be ruled out before the initiation of heparin therapy.