
    Understanding local neuromuscular mechanisms that explain the efficacy of interventions for patellofemoral pain

    Patellofemoral pain (PFP) is a common and persistent knee pain complaint across all age ranges, especially among highly active people. Multiple approaches have been used to understand symptom persistence, including identifying a mechanism that explains intervention benefits (i.e. changes in specific deficits in groups whose symptoms improve). Research has been conducted to identify the characteristics associated with PFP, but uncertainty regarding local neuromuscular characteristics remains evident. The thesis aimed to a) identify the local neuromuscular characteristics associated with PFP, b) develop an evidence-informed laboratory protocol to detect those characteristics, c) establish protocol reliability and feasibility, and d) identify interventions that can target these neuromuscular characteristics. A systematic review with meta-analysis was completed to identify the neuromuscular characteristics of all muscles that cross the knee in people with PFP compared with uninjured groups. Ten deficits within three neuromuscular domains were found. Within the electromyography (EMG) domain, a delay in vastus medialis (VM) excitation onset relative to vastus lateralis (VL), a high biceps femoris (BF) mean excitation amplitude, and a lower Hoffmann-reflex amplitude of VM were identified. Within the muscle performance domain, lower isometric, concentric, and eccentric extensor peak torque and total work, lower concentric flexor peak torque, and lower rate of torque development (RTD) to reach 30%, 60% and 90% of extensor peak torque were identified. Hamstring tightness was identified within the muscle flexibility domain. The systematic review was published and its results were used to inform testing protocol development. A second systematic review with meta-analysis was conducted to identify interventions that can target the local deficits associated with PFP. The results indicate that an intervention that effectively modifies EMG characteristics cannot currently be identified. Predominantly, exercise interventions have effects on strength and flexibility in PFP. Specifically, hip- and knee-targeted exercises were found to have a potential mechanism of benefit through both characteristic categories. A unique approach was introduced within the thesis to develop a deficit-detection protocol based on the systematic review results. This approach provided a comprehensive analysis of the protocols from the studies included in the meta-analysis. A battery of tests was developed that included: a) VM-VL excitation onset timing in a step-up task, b) BF mean excitation amplitude in a single-leg triple-hop test, c) isometric, d) concentric and e) eccentric extensor peak torque, f) RTD to 30%, 60% and 90% of isometric peak torque, and g) hamstrings flexibility. Reliability testing of the deficit-detection protocol was conducted with both uninjured participants and participants with PFP over two phases. Phase one evaluated the original protocols adapted from the review. Phase two was performed on the EMG and RTD domains to explore the effects of signal-processing parameters on reliability, such as modified onset-detection thresholds, unnormalised signals, and the addition of absolute RTD. For the PFP group, reliable results were demonstrated for concentric and eccentric extensor peak torque; RTD of the quadriceps at 25 ms, 50 ms and 90% of peak torque; and hamstrings flexibility. The uninjured group showed reliable results for unnormalised BF mean excitation amplitude; all three peak torque tests; RTD to 30% of peak torque and at 150 and 175 milliseconds; and hamstrings flexibility. To establish participant recruitment rate and retention, in addition to the acceptability of the test protocol, a preliminary feasibility study of the deficit-detection protocol was conducted. A sample of 14 participants with PFP was recruited and tested at the Mile End campus of QMUL before and after a six-week period. Feasibility results indicate that 25.5% were willing to participate following an online screening process (n=17/55) and 82% met the eligibility criteria following face-to-face assessment (n=14/17). The recruitment rate was 0.5 participants per week and the drop-out rate was 35.2% (n=11/17). The results indicate that the protocol did not meet all a priori feasibility criteria, but they can inform future research planning. The thesis successfully identified local deficits associated with PFP, developed a test protocol that demonstrates reliability in evaluating these deficits, and assessed the feasibility of the protocol in individuals with PFP. Interventions that can change these local deficits have been identified, with gap maps demonstrating where further research is required to better align the mechanisms of treatment effects with the specific deficits associated with PFP.
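    The muscle-performance tests above centre on peak torque and rate of torque development. As a rough illustration of how relative RTD (to 30%, 60% and 90% of peak torque) and absolute RTD (over fixed windows such as 25 ms and 50 ms) can be computed from a sampled torque-time curve, here is a minimal Python sketch; the function names, the 2.5% onset threshold and the sampling details are our own assumptions, not the thesis protocol.

```python
import numpy as np

def detect_onset(torque, threshold=0.025):
    """Index of the first sample exceeding a fraction of peak torque.
    The 2.5% threshold is illustrative; the thesis explores alternative
    onset-detection thresholds as a signal-processing parameter."""
    return int(np.argmax(torque > threshold * torque.max()))

def relative_rtd(torque, fs, fractions=(0.3, 0.6, 0.9)):
    """Mean slope (Nm/s) from onset until torque first reaches each
    fraction of peak torque, e.g. RTD to 30%, 60% and 90% of peak."""
    onset, peak = detect_onset(torque), torque.max()
    out = {}
    for f in fractions:
        idx = onset + int(np.argmax(torque[onset:] >= f * peak))
        dt = (idx - onset) / fs
        out[f] = (torque[idx] - torque[onset]) / dt if dt > 0 else float("nan")
    return out

def absolute_rtd(torque, fs, windows_ms=(25, 50, 150, 175)):
    """Mean slope (Nm/s) over fixed windows after onset (absolute RTD).
    Assumes the recording extends past the longest window."""
    onset = detect_onset(torque)
    return {w: (torque[onset + int(w * fs / 1000)] - torque[onset]) * 1000.0 / w
            for w in windows_ms}
```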

    Multidisciplinary perspectives on Artificial Intelligence and the law

    This open access book presents an interdisciplinary, multi-authored, edited collection of chapters on Artificial Intelligence (‘AI’) and the Law. AI technology has come to play a central role in the modern data economy. Through a combination of increased computing power, the growing availability of data and the advancement of algorithms, AI has now become an umbrella term for some of the most transformational technological breakthroughs of this age. The importance of AI stems from both the opportunities that it offers and the challenges that it entails. While AI applications hold the promise of economic growth and efficiency gains, they also create significant risks and uncertainty. The potential and perils of AI have thus come to dominate modern discussions of technology and ethics – and although AI was initially allowed to largely develop without guidelines or rules, few would deny that the law is set to play a fundamental role in shaping the future of AI. As the debate over AI is far from over, the need for rigorous analysis has never been greater. This book thus brings together contributors from different fields and backgrounds to explore how the law might provide answers to some of the most pressing questions raised by AI. An outcome of the Católica Research Centre for the Future of Law and its interdisciplinary working group on Law and Artificial Intelligence, it includes contributions by leading scholars in the fields of technology, ethics and the law.

    Learning and Control of Dynamical Systems

    Despite the remarkable success of machine learning in various domains in recent years, our understanding of its fundamental limitations remains incomplete. This knowledge gap poses a grand challenge when deploying machine learning methods in critical decision-making tasks, where incorrect decisions can have catastrophic consequences. To effectively utilize these learning-based methods in such contexts, it is crucial to explicitly characterize their performance. Over the years, significant research efforts have been dedicated to the learning and control of dynamical systems whose underlying dynamics are unknown or only partially known a priori and must be inferred from collected data. However, many of these classical results have focused on asymptotic guarantees, providing limited insight into the amount of data required to achieve desired control performance while satisfying operational constraints such as safety and stability, especially in the presence of statistical noise. In this thesis, we study the statistical complexity of learning and control of unknown dynamical systems. By utilizing recent advances in statistical learning theory, high-dimensional statistics, and control-theoretic tools, we aim to establish a fundamental understanding of the number of samples required to achieve desired (i) accuracy in learning the unknown dynamics, (ii) performance in the control of the underlying system, and (iii) satisfaction of operational constraints such as safety and stability. We provide finite-sample guarantees for these objectives and propose efficient learning and control algorithms that achieve the desired performance at these statistical limits in various dynamical systems. Our investigation covers a broad range of dynamical systems, starting from fully observable linear dynamical systems, moving to partially observable linear dynamical systems, and ultimately nonlinear systems. We deploy our learning and control algorithms in various adaptive control tasks in real-world control systems and demonstrate their strong empirical performance along with their learning, robustness, and stability guarantees. In particular, we implement one of our proposed methods, Fourier Adaptive Learning and Control (FALCON), on an experimental aerodynamic testbed under extreme turbulent flow dynamics in a wind tunnel. The results show that FALCON achieves state-of-the-art stabilization performance and consistently outperforms conventional and other learning-based methods by at least 37%, despite using 8 times less data. The superior performance of FALCON arises from its physically and theoretically accurate modeling of the underlying nonlinear turbulent dynamics, which yields rigorous finite-sample learning and performance guarantees. These findings underscore the importance of characterizing the statistical complexity of learning and control of unknown dynamical systems.
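    The abstract does not spell out the internals of the proposed algorithms, but the classical starting point for learning unknown dynamics from data, on which finite-sample analyses in this line of work typically build, is least-squares identification of a linear system from trajectory data. Below is a minimal sketch under that assumption; the toy system and all names are illustrative, not the thesis's method.

```python
import numpy as np

def estimate_linear_dynamics(X, U):
    """Least-squares estimate of (A, B) in x_{t+1} = A x_t + B u_t + w_t.

    X : (T+1, n) array of states, U : (T, m) array of inputs.
    Returns A_hat (n, n) and B_hat (n, m).
    """
    Z = np.hstack([X[:-1], U])                      # regressors [x_t, u_t]
    Theta, *_ = np.linalg.lstsq(Z, X[1:], rcond=None)
    n = X.shape[1]
    return Theta[:n].T, Theta[n:].T                 # A_hat, B_hat

# Example: identify a stable 2-state system from one noisy trajectory.
rng = np.random.default_rng(0)
A = np.array([[0.9, 0.2], [0.0, 0.8]])
B = np.array([[0.0], [1.0]])
T, x = 500, np.zeros(2)
X, U = [x], rng.standard_normal((T, 1))             # random exciting inputs
for t in range(T):
    x = A @ x + B @ U[t] + 0.01 * rng.standard_normal(2)
    X.append(x)
A_hat, B_hat = estimate_linear_dynamics(np.array(X), U)
```

    Finite-sample results of the kind the thesis studies quantify how the estimation error of such a procedure shrinks with the trajectory length T in the presence of the noise w_t.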

    Less is More: Restricted Representations for Better Interpretability and Generalizability

    Deep neural networks are prevalent in supervised learning for a wide range of tasks such as image classification, machine translation and even scientific discovery. Their success often comes at the expense of interpretability and generalizability. The increasing complexity of models and the involvement of pre-training make this lack of explainability more pressing. Outstanding performance when labeled data are abundant, combined with a tendency to overfit when labeled data are limited, demonstrates the difficulty deep neural networks have in generalizing to different datasets. This thesis aims to improve interpretability and generalizability by restricting representations. We approach interpretability through attribution analysis, to understand which features contribute to BERT's predictions, and generalizability through effective methods in a low-data regime. We consider two strategies for restricting representations: (1) adding a bottleneck, and (2) introducing compression. Given input x, suppose we want to learn y with the latent representation z (i.e. x→z→y); adding a bottleneck means adding a function R such that L(R(z)) < L(z), and introducing compression means adding a function R such that L(R(y)) < L(y), where L refers to the number of bits. In other words, the restriction is added either in the middle of the pipeline or at the end of it. We first introduce how adding an information bottleneck can help attribution analysis and apply it to investigate BERT's behavior on text classification in Chapter 3. We then extend this attribution method to analyze passage reranking in Chapter 4, where we conduct a detailed analysis of cross-layer and cross-passage behavior. Adding a bottleneck can not only provide insight into deep neural networks but can also be used to increase generalizability. In Chapter 5, we demonstrate the equivalence between adding a bottleneck and performing neural compression. We then leverage this finding in a framework called Non-Parametric learning by Compression with Latent Variables (NPC-LV), and show how optimizing neural compressors can be used for non-parametric image classification with few labeled data. To further investigate how compression alone helps non-parametric learning without latent variables (NPC), we carry out experiments with the universal compressor gzip on text classification in Chapter 6. In Chapter 7, we elucidate methods that adopt the perspective of compression without the actual process of compression, using T5. Using experimental results in passage reranking, we show that our method is highly effective in a low-data regime where only one thousand query-passage pairs are available. In addition to the weakly supervised scenario, we also extend our method to large language models like GPT under almost no supervision, in one-shot and zero-shot settings. The experiments, presented in Chapter 8, show that without extra parameters or in-context learning, GPT can be used for semantic similarity, text classification, and text ranking, outperforming strong baselines. The thesis proposes to tackle two major challenges in machine learning, "interpretability" and "generalizability", by restricting representations. We provide both theoretical derivations and empirical results to show the effectiveness of information-theoretic approaches. We not only design new algorithms but also provide numerous insights into why and how "compression" is so important for understanding deep neural networks and improving generalizability.
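    As a concrete illustration of the Chapter 6 idea of parameter-free classification with a universal compressor, here is a minimal sketch of gzip-based nearest-neighbour text classification using the normalized compression distance; the exact formulation in the thesis may differ, and the toy data are our own.

```python
import gzip
from collections import Counter

def ncd(x: str, y: str) -> float:
    """Normalized compression distance with gzip as compressor C:
    NCD(x, y) = (C(xy) - min(C(x), C(y))) / max(C(x), C(y))."""
    cx = len(gzip.compress(x.encode()))
    cy = len(gzip.compress(y.encode()))
    cxy = len(gzip.compress((x + " " + y).encode()))
    return (cxy - min(cx, cy)) / max(cx, cy)

def knn_classify(query, train, k=3):
    """Label a text by majority vote among its k nearest training texts,
    where 'near' means small NCD. train is a list of (text, label) pairs."""
    nearest = sorted(train, key=lambda tl: ncd(query, tl[0]))[:k]
    return Counter(label for _, label in nearest).most_common(1)[0][0]

train = [("the match ended two nil", "sport"),
         ("stocks rallied on strong earnings", "finance"),
         ("the striker scored a late goal", "sport"),
         ("the central bank raised interest rates", "finance")]
print(knn_classify("a goal in stoppage time won the match", train))
```

    The appeal of this approach in the low-data regime is that the compressor has no trainable parameters, so nothing can overfit to a small labeled set.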

    Effectiveness of an amygdala and insula retraining program combined with mindfulness training to improve the quality of life in patients with long COVID: a randomized controlled trial protocol

    Background: There has been growing clinical awareness in recent years of the long-term physical and psychological consequences of the SARS-CoV-2 virus, known as Long COVID. The prevalence of Long COVID is approximately 10% of those infected by the virus. Long COVID is associated with physical and neuropsychological symptoms, including those related to mental health, psychological wellbeing, and cognition. However, research on psychological interventions is still in its early stages, which means that available results are still limited. The main objective of this study is to evaluate the effects of a program based on amygdala and insula retraining (AIR) combined with mindfulness training (AIR + Mindfulness) on the improvement of quality of life, psychological well-being, and cognition in patients with Long COVID. Methods: This study protocol presents a single-blind randomized controlled trial (RCT) that encompasses baseline, post-treatment, and six-month follow-up assessment time points. A total of 100 patients diagnosed with Long COVID by the Spanish National Health Service will be randomly assigned to either AIR + Mindfulness (n = 50) or a relaxation intervention (n = 50), the latter serving as the control group. The primary outcome will be quality of life assessed using the Short Form-36 Health Survey (SF-36). Additional outcomes such as fatigue, pain, anxiety, memory, and sleep quality will also be evaluated. Mixed effects regression models will be used to estimate the effectiveness of the program, and effect sizes will be calculated. Discussion: Long COVID syndrome is a clinical condition characterized by the persistence of symptoms for at least 12 weeks after the onset of COVID-19 that significantly affects people’s quality of life. This will be the first RCT conducted in Spain to apply a psychotherapy program for the management of symptoms derived from Long COVID. Positive results from this RCT may have a significant impact on the clinical context by confirming the beneficial effect of the intervention program being evaluated on improving the symptoms of Long COVID syndrome and aiding the development of better action strategies for these patients. Trial registration: ClinicalTrials.gov NCT05956405
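    The protocol specifies mixed effects regression models for the effectiveness analysis. As a rough sketch of what such a model could look like for repeated SF-36 measurements, here is one possible formulation in Python with statsmodels; the file and column names are hypothetical, and the trial's actual statistical analysis plan may differ.

```python
import pandas as pd
import statsmodels.formula.api as smf

# Hypothetical long-format trial data: one row per participant per
# assessment point (baseline, post-treatment, six-month follow-up).
df = pd.read_csv("trial_outcomes.csv")  # assumed columns: sf36, group, time, id

# Random intercept per participant; the group-by-time interaction captures
# the effect of AIR + Mindfulness relative to the relaxation control.
model = smf.mixedlm("sf36 ~ group * time", df, groups=df["id"])
result = model.fit()
print(result.summary())
```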

    Measuring the impact of COVID-19 on hospital care pathways

    Hospitals around the world reported significant disruption to care pathways during the recent COVID-19 pandemic, but measuring the actual impact is more problematic. Process mining can be useful for hospital management to measure the conformance of real-life care to what might be considered normal operations. In this study, we aim to demonstrate that process mining can be used to investigate process changes associated with complex disruptive events. We studied perturbations to accident and emergency (A&E) and maternity pathways in a UK public hospital during the COVID-19 pandemic. Coincidentally, the hospital had implemented a Command Centre approach for patient-flow management, affording an opportunity to study both the planned improvement and the disruption due to the pandemic. Our study proposes and demonstrates a method for measuring and investigating the impact of such planned and unplanned disruptions affecting hospital care pathways. We found that during the pandemic, both A&E and maternity pathways showed measurable reductions in mean length of stay and a measurable drop in the percentage of pathways conforming to normative models. There were no distinctive patterns in monthly mean length of stay or conformance throughout the phases of the installation of the hospital's new Command Centre approach. Due to a deficit in the available A&E data, the findings for A&E pathways could not be interpreted.
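    One way to compute the conformance measure the study describes, the share of traces that fit a normative model, is with an open-source process-mining library such as pm4py. The sketch below discovers a model from pre-pandemic traces and replays pandemic-era traces against it; the file name, date ranges and model-discovery choices are placeholder assumptions, not the study's actual pipeline.

```python
import pm4py

# Event log of care-pathway activities (placeholder file name).
log = pm4py.read_xes("ae_pathways.xes")

# Normative model discovered from pre-pandemic traces only.
pre = pm4py.filter_time_range(log, "2019-01-01 00:00:00",
                              "2020-02-29 23:59:59", mode="traces_contained")
net, im, fm = pm4py.discover_petri_net_inductive(pre)

# Conformance of pandemic-era traces against the normative model,
# via token-based replay fitness.
pandemic = pm4py.filter_time_range(log, "2020-03-01 00:00:00",
                                   "2021-12-31 23:59:59", mode="traces_contained")
fitness = pm4py.fitness_token_based_replay(pandemic, net, im, fm)
print(fitness)  # includes the fraction of perfectly fitting traces
```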

    Constitutions of Value

    Gathering an interdisciplinary range of cutting-edge scholars, this book addresses legal constitutions of value. Global value production and transnational value practices that rely on exploitation and extraction have left us with toxic commons and a damaged planet. Against this situation, the book examines law’s fundamental role in institutions of value production and valuation. Utilising pathbreaking theoretical approaches, it problematizes mainstream efforts to redeem institutions of value production by recoupling them with progressive values. Aiming beyond radical critique, the book opens up the possibility of imagining and enacting new and different value practices. This wide-ranging and accessible book will appeal to international lawyers, socio-legal scholars, those working at the intersections of law and economy and others, in politics, economics, environmental studies and elsewhere, who are concerned with rethinking our current ideas of what has value, what does not, and whether and how value may be revalued.

    Deep learning for accelerated magnetic resonance imaging

    Medical imaging has underpinned some of the biggest advances in medicine over the last century. Whilst X-ray, CT, PET and ultrasound are forms of imaging that can be useful in particular scenarios, they each have disadvantages in cost, image quality, ease of use or ionising radiation. MRI is a slow imaging protocol, which contributes to its high running cost. However, MRI is a very versatile imaging protocol, allowing images of varying contrast to be easily generated without the use of ionising radiation. If MRI can be made more efficient and smarter, the effective cost of running MRI may become more affordable, and the modality more accessible. The focus of this thesis is decreasing the acquisition time involved in MRI whilst maintaining the quality of the generated images and thus the diagnosis. In particular, we focus on data-driven deep learning approaches that aid the image reconstruction process and streamline the diagnostic process. We focus on three particular aspects of MR acquisition. Firstly, we investigate the use of motion estimation in the cine reconstruction process. Motion allows us to combine an abundance of imaging data in a learnt reconstruction model, allowing acquisitions to be sped up by up to 50 times in extreme scenarios. Secondly, we investigate the possibility of using under-acquired MR data to generate smart diagnoses in the form of automated text reports. In particular, we investigate the possibility of skipping the image reconstruction phase altogether at inference time and instead directly generating radiological text reports for diffusion-weighted brain images, in an effort to streamline the diagnostic process. Finally, we investigate the use of probabilistic modelling for MRI reconstruction without the use of fully-acquired data. In particular, we note that acquiring fully-acquired reference images in MRI can be difficult, and such references may nonetheless contain undesired artefacts that degrade the dataset and thus the training process. We investigate the possibility of performing reconstruction without fully-acquired references and, furthermore, discuss the possibility of generating higher quality outputs than those of the fully-acquired references.
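    For readers unfamiliar with accelerated MRI: acquisition is shortened by sampling only a fraction of k-space, and the zero-filled inverse Fourier transform of that data is the standard aliased baseline that learned reconstruction models improve on. Below is a minimal NumPy sketch of retrospective Cartesian undersampling; the sampling pattern, acceleration factor and fully sampled centre width are illustrative assumptions, not the thesis's setup.

```python
import numpy as np

def undersample_and_reconstruct(image, accel=4):
    """Retrospectively undersample Cartesian k-space and return the
    zero-filled reconstruction: the aliased input that a learned model
    would be trained to de-alias."""
    kspace = np.fft.fftshift(np.fft.fft2(image))
    mask = np.zeros(kspace.shape, dtype=bool)
    mask[:, ::accel] = True            # keep every accel-th phase-encode line
    ny = kspace.shape[1]
    mask[:, ny // 2 - ny // 32: ny // 2 + ny // 32] = True  # fully sample centre
    zero_filled = np.fft.ifft2(np.fft.ifftshift(kspace * mask))
    return np.abs(zero_filled), mask

# Example on a synthetic phantom-like image.
img = np.zeros((128, 128))
img[32:96, 48:80] = 1.0
recon, mask = undersample_and_reconstruct(img, accel=4)
```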

    Habitat of grace : biology, religion and the global environmental crisis

    The Fifth Mission Statement urges the Anglican Church to be, or become, involved in the world-wide effort by all thinking people, of any faith or none, to find workable ways to alleviate the global environmental crisis. As it stands, however, the Statement is an incompatible mixture of contemporary scientific and religious environmental concern set against a Biblical background that had no such concern. Therefore, public exhortations based on the Fifth Mission Statement taken at face value are unlikely to succeed, especially if addressed to secular audiences. The environmental crisis is a moral issue, because it concerns the process of reaching communal decisions about the allocation between competing groups of common resources in short supply, such as finance for conservation, and access to forests, fisheries, clean water, clean air, etc. The relevant context for understanding the moral dimensions of environmental protection must include contemporary biological and philosophical knowledge, because we need to understand what decisions are required, the origin and nature of the ethical context of those decisions, and the reasons why so many people ignore the interests of the environment on which we all depend. In this thesis I have explored some ways in which the insights of secular science can be incorporated into the Fifth Mission Statement, which will help Christians make a constructive contribution to the secular debate. From economics we can learn why the current free-market model is so subversive and why management of environmental common goods is so difficult; from game theory, why the personal restraint for which green activists plead is often not rational, except within the context of stable community life; from primatology, what the evolutionary and social bases of morality and intelligence are; and from anthropology, how the combination of intelligence and socially-mediated morality as a conditional strategy has coaxed our primate and tribal human ancestors over time from rampant xenophobia, through cautious trading of goods and ideas, to the philosophical analysis of true human ethics. The biological account of the origin and general operation of morality is very different from the theological and philosophical one, but it is backed by a large and growing body of empirical evidence. It must be considered by any moral exhortation intended, like the Fifth Mission Statement, to be credible to non-Christians. The Christian understanding of true altruism (charity) remains a matter that goes beyond biology and into the realms of grace. An updated Christian theology of creation, and further development of the Fifth Mission Statement along these lines, will arm the Church to play a leading role in the environmental debate. Christian theologians should be among the very first to respond to E. O. Wilson's call for consilience between all branches of learning (Wilson 1998), since the unity of all knowledge is an ancient belief of the Church. Rational, passionate and updated Christianity could make a real contribution to developing some solution to the environmental crisis, to the extent that any solution is possible; otherwise, it will remain, as in the past, part of the problem.