
    The effects of running, cycling, and duathlon exercise performance on cardiac function, haemodynamics and regulation

    This thesis examined the effects of prolonged exercise, specifically Olympic Distance (OD) duathlon, upon ultrasound-derived indices of cardiac function, cardiac autonomic regulation measured via heart rate variability (HRV), and high-sensitivity cardiac troponin T (hs-cTnT) release. The primary aims were to (1) ascertain the influence of OD duathlon performance on cardiac function; (2) investigate potential relationships between autonomic regulation, hs-cTnT release, and cardiac function; and (3) investigate the effect of the individual legs of an OD duathlon on post-exercise cardiac function and quantify the potential performance reserve of highly trained endurance athletes when completing standalone legs of the duathlon. In a systematic review and meta-analysis (Chapter 1) of research that performed serial echocardiographic and troponin measurements before and after exercise, exercise intensity predicted changes in post-exercise cardiac troponin release and diastolic function. The findings agreed with previous meta-analyses while drawing on a more recent sample of studies; however, future studies are recommended to incorporate advanced cardiac imaging techniques, such as myocardial speckle tracking, into their data collection to provide a more sensitive measure of post-exercise cardiac function. Whilst a large degree of heterogeneity existed in the results, this was in part explained by exercise heart rate, participant age, and the prevalence of cardiac troponin release above the clinical detection threshold. The study performed in Chapter 3 was the first to investigate the effects of OD duathlon exercise on cardiac function immediately and 24 hours post-exercise. Additionally, a second OD duathlon was performed by participants with intra-leg measurements of cardiac function. 
In a highly trained cohort, there was evidence of transient post-exercise reductions in cardiac function and elevation of serum hs-cTnT above the clinical reference value, which largely resolved within 24h of recovery. This study also demonstrated the reliability of lab-based duathlon exercise in a highly trained cohort and identified the pacing features of experienced multi-sport athletes that partially explained the different findings between the running and cycling legs of the duathlon. By investigating each leg of the duathlon individually (10k run, 5k run, 40k cycle), both at duathlon race pace (DM) and at maximal (Max) intensity on separate occasions, the performance reserve of the highly trained cohort was quantified and further explored. The studies presented in Chapters 4 and 5 revealed that experienced duathletes were able to improve their speed across each leg by 5-15% in a laboratory setting compared with the duathlon effort. Additionally, the maximal-effort 10k run leg provoked the most persistent changes to cardiac function, which were still present at 6h of recovery. Changes in cardiac function post DM 10k confirmed the finding of Chapter 3 that the greatest cardiac perturbations occur following the initial 10k run leg. Aside from the Max 10k run and 40k cycle trials, all perturbations had resolved within 6h of recovery after each bout of exercise, highlighting the importance of recovery following maximal-intensity efforts. The lack of 6h recovery data in Chapter 4, and of 24h recovery data in Chapters 5 and 6, is a shortcoming of these findings and therefore limits interpretation in the context of providing athletic guidance. Future research in this area should endeavour to include 6h and 24h recovery measures as standard, as multi-sport athletes typically perform multiple daily training sessions. 
The implications of substantial cardiac fatigue accumulation over many years of endurance training are still unclear, and athletes may benefit from preventing its occurrence.

    An investigation into mild traumatic brain injury identification, management, and mitigation

    Concussion is classified as a mild traumatic brain injury induced by biomechanical forces, such as a physical impact to the head or body, that result in a transient neurological disturbance without obvious structural brain damage. The tools immediately available to identify, diagnose, and manage concussion are wide-ranging and can lack consistency in application. It is well documented that concussion occurs frequently across amateur and professional sport, particularly in popular contact sports such as rugby union. A primary aim of this thesis was to establish the current modalities of ‘pitch side’ concussion identification, diagnosis, and management across amateur and professional sporting populations. Furthermore, the research sought to understand existing concussion management and concussion experiences by recording the experiences and perceptions of retired professional rugby union players. These qualitative studies sought to gain insights into concussion experiences, the language used to discuss concussion, and the duty of care which medical staff, coaching personnel, and club owners have towards professional rugby players in their employment. In addition, possible interventions to reduce the incidence of concussion in amateur and professional sports were investigated. These included a ‘proof of concept’ using inertial measurement units and a smartphone application: a tackle-technique coaching app for amateur sports. Other research investigating the use of neurological function data and neuromuscular fatigue in current professional rugby players as a novel means of monitoring injury risk was also included in this research theme. The findings of these studies suggest that there is an established head injury assessment process in professional sports. However, in amateur sport settings this is not existing practice, which may expose amateur players to an increased risk of post-concussion syndrome or early retirement. 
Many past professional rugby union players stated that they did not know the effects of cumulative repetitive head impacts. They discussed how they minimised and ignored repeated concussions due to peer pressure, pressure from coaches, or their own internal pressure of maintaining a livelihood. These data suggest that players believed that strong-willed medical staff, resistant to pressure from coaching staff or even from athletes themselves, were essential for player welfare, and that club owners have a long-term duty of care to retired professional rugby union players. There are, however, anecdotal methods suggested to reduce concussion incidence, for example, neck-strengthening techniques to mitigate collision impacts. There is no longitudinal evidence that neck strength can reduce the impacts of concussion in adult populations. Additionally, other factors, such as lowering the tackle height in the professional and amateur game, are currently being investigated as a means of reducing head injury risk. The final theme of the thesis investigated possible methods to reduce injury incidence in amateur and professional athletes. The novel tackle-technique platform could assist inexperienced amateur coaches in coaching effective tackle technique to youth players. The findings from the neurological function data suggest that this may be an alternative way for coaches to assess and gather fatigue data on professional rugby union players, alongside additional subjective measures and neuromuscular function data. Recently, the awareness of concussion as an injury and the recognition of concussion in many sports settings have improved. These incremental improvements have led to increased discussion regarding possible measures to mitigate the effects of concussion. There are many additional procedures to be implemented before comprehensive concussion management is universally available, particularly in amateur and community sports. 
These necessary processes could include technological advances (e.g., using smartphone technology) to assist parents and amateur coaches in the early identification of concussion, or evidence-based concussion reduction strategies.

    Analog Photonics Computing for Information Processing, Inference and Optimisation

    This review presents an overview of the current state of the art in photonics computing, which leverages photons, photons coupled with matter, and optics-related technologies for effective and efficient computational purposes. It covers the history and development of photonics computing and modern analogue computing platforms and architectures, focusing on optimization tasks and neural network implementations. The authors examine special-purpose optimizers, mathematical descriptions of photonics optimizers, and their various interconnections. Disparate applications are discussed, including direct encoding, logistics, finance, phase retrieval, machine learning, neural networks, probabilistic graphical models, and image processing, among many others. The main directions of technological advancement and associated challenges in photonics computing are explored, along with an assessment of its efficiency. Finally, the paper discusses prospects and the field of optical quantum computing, providing insights into the potential applications of this technology. Comment: Invited submission by Journal of Advanced Quantum Technologies; accepted version 5/06/202

    Scalable Hierarchical Instruction Cache for Ultra-Low-Power Processors Clusters

    High performance and energy efficiency are critical requirements for Internet of Things (IoT) end-nodes. Exploiting tightly-coupled clusters of programmable processors (CMPs) has recently emerged as a suitable solution to address this challenge. One of the main bottlenecks limiting the performance and energy efficiency of these systems is the instruction cache architecture due to its criticality in terms of timing (i.e., maximum operating frequency), bandwidth, and power. We propose a hierarchical instruction cache tailored to ultra-low-power tightly-coupled processor clusters where a relatively large cache (L1.5) is shared by L1 private caches through a two-cycle latency interconnect. To address the performance loss caused by the L1 capacity misses, we introduce a next-line prefetcher with cache probe filtering (CPF) from L1 to L1.5. We optimize the core instruction fetch (IF) stage by removing the critical core-to-L1 combinational path. We present a detailed comparison of instruction cache architectures' performance and energy efficiency for parallel ultra-low-power (ULP) clusters. Focusing on the implementation, our two-level instruction cache provides better scalability than existing shared caches, delivering up to 20\% higher operating frequency. On average, the proposed two-level cache improves maximum performance by up to 17\% compared to the state-of-the-art while delivering similar energy efficiency for most relevant applications. Comment: 14 pages
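    The shared-L1.5 organisation and the probe-filtered next-line prefetcher described above can be illustrated with a small behavioural model. This is a sketch under assumed parameters (a 4-line private L1, a 64-line shared L1.5, LRU replacement); the names `CacheLevel` and `TwoLevelICache` are invented for illustration and are not taken from the paper:

```python
# Behavioural sketch of a two-level instruction cache: L1 misses fall
# through to a shared L1.5, and a next-line prefetcher issues a fill only
# after a cache probe confirms the next line is not already resident in L1.
class CacheLevel:
    def __init__(self, lines):
        self.lines = lines          # capacity in cache lines
        self.store = {}             # line address -> True; dict order models LRU

    def probe(self, addr):
        return addr in self.store   # lookup without changing LRU state

    def access(self, addr):
        if addr in self.store:
            self.store.pop(addr)    # refresh LRU position on a hit
            self.store[addr] = True
            return True
        if len(self.store) >= self.lines:
            self.store.pop(next(iter(self.store)))  # evict least recently used
        self.store[addr] = True     # fill the line
        return False

class TwoLevelICache:
    def __init__(self, l1_lines=4, l15_lines=64):
        self.l1 = CacheLevel(l1_lines)
        self.l15 = CacheLevel(l15_lines)
        self.l15_accesses = 0       # traffic on the shared level

    def fetch(self, line_addr):
        hit = self.l1.access(line_addr)
        if not hit:
            self.l15_accesses += 1
            self.l15.access(line_addr)  # refill from the shared L1.5
        # Next-line prefetch with cache probe filtering: touch the shared
        # level only when L1 does not already hold the next line.
        nxt = line_addr + 1
        if not self.l1.probe(nxt):
            self.l15_accesses += 1
            self.l15.access(nxt)
            self.l1.access(nxt)
        return hit
```

    In this model a sequential fetch stream misses only on the first line; every subsequent fetch hits in L1 because the prefetcher has already pulled the next line in, while the probe filter keeps redundant requests off the shared interconnect.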

    2023 - The Twenty-seventh Annual Symposium of Student Scholars

    The full program book from the Twenty-seventh Annual Symposium of Student Scholars, held on April 18-21, 2023. Includes abstracts from the presentations and posters.

    Energy Concerns with HPC Systems and Applications

    For various reasons, including those related to climate change, {\em energy} has become a critical concern in all relevant activities and technical designs. For the specific case of computing, the problem is exacerbated by the emergence and pervasiveness of the so-called {\em intelligent devices}. From the application side, we point out the special topic of {\em Artificial Intelligence}, which clearly needs efficient computing support in order to succeed in its purpose of being a {\em ubiquitous assistant}. There are mainly two contexts where {\em energy} is a top-priority concern: {\em embedded computing} and {\em supercomputing}. For the former, power consumption is critical because the amount of energy available to the devices is limited. For the latter, the heat dissipated is a serious source of failure, and the financial cost of energy is likely to be a significant part of the maintenance budget. On a single computer, the problem is commonly considered through electrical power consumption. In this paper, written in the form of a survey, we depict the landscape of energy concerns in computer activities, from both the hardware and the software standpoints. Comment: 20 pages

    A Review of Bayesian Methods in Electronic Design Automation

    The utilization of Bayesian methods has been widely acknowledged as a viable solution for tackling various challenges in electronic integrated circuit (IC) design under stochastic process variation, including circuit performance modeling, yield/failure rate estimation, and circuit optimization. As the post-Moore era brings about new technologies (such as silicon photonics and quantum circuits), many of the associated issues are similar to those encountered in electronic IC design and can be addressed using Bayesian methods. Motivated by this observation, we present a comprehensive review of Bayesian methods in electronic design automation (EDA). By doing so, we hope to equip researchers and designers with the ability to apply Bayesian methods in solving stochastic problems in electronic circuits and beyond. Comment: 24 pages, a draft version. We welcome comments and feedback, which can be sent to [email protected]
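    To give a flavour of the yield-estimation problem under process variation that the review covers, the following is a minimal Monte Carlo sketch with a Bayesian (Beta-posterior) yield estimate. The performance model, the spec threshold of 1.25, and the function names are hypothetical illustrations, not taken from the review:

```python
# Sketch: estimate circuit yield under Gaussian process variation.
# Each Monte Carlo sample draws a parameter vector, checks a pass/fail
# spec, and a uniform Beta(1, 1) prior over the yield is updated with the
# pass/fail counts, giving a posterior mean and a simple 2-sigma interval.
import math
import random

def performance(params):
    # Hypothetical performance model: a "delay" that grows with variation.
    return 1.0 + 0.1 * params[0] + 0.05 * params[1] ** 2

def estimate_yield(n_samples=2000, spec=1.25, seed=0):
    rng = random.Random(seed)
    passes = 0
    for _ in range(n_samples):
        params = [rng.gauss(0.0, 1.0) for _ in range(2)]
        if performance(params) <= spec:   # sample meets the timing spec
            passes += 1
    # Beta(1 + passes, 1 + fails) posterior from a uniform prior.
    a, b = 1 + passes, 1 + (n_samples - passes)
    mean = a / (a + b)
    std = math.sqrt(a * b / ((a + b) ** 2 * (a + b + 1)))
    return mean, (mean - 2 * std, mean + 2 * std)
```

    The posterior interval shrinks as samples accumulate, which is one reason Bayesian treatments are attractive here: they quantify how much the (expensive) simulation budget has actually constrained the yield estimate.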

    2023-2024 Undergraduate Catalog

    2023-2024 undergraduate catalog for Morehead State University

    Lessons from Formally Verified Deployed Software Systems (Extended version)

    The technology of formal software verification has made spectacular advances, but how much does it actually benefit the development of practical software? Considerable disagreement remains about the practicality of building systems with mechanically-checked proofs of correctness. Is this prospect confined to a few expensive, life-critical projects, or can the idea be applied to a wide segment of the software industry? To help answer this question, the present survey examines a range of projects, in various application areas, that have produced formally verified systems and deployed them for actual use. It considers the technologies used, the form of verification applied, the results obtained, and the lessons that can be drawn for the software industry at large and its ability to benefit from formal verification techniques and tools. Note: a short version of this paper is also available, covering in detail only a subset of the considered systems. The present version is intended for full reference. Comment: arXiv admin note: text overlap with arXiv:1211.6186 by other authors