2,647 research outputs found

    Temporal and spatial patterns of cortical activation during assisted lower limb movement

    Get PDF
    Human gait is a complex process that results from the interplay of various mechanisms in the central nervous system, including different cortical and subcortical structures. In the present study, we investigated cortical activity during lower limb movement using EEG. Assisted by a dynamic tilt table, all subjects performed standardized stepping movements in an upright position. Source localization of the movement-related potential relative to spontaneous EEG showed activity in brain regions classically associated with human gait, such as the primary motor cortex, the premotor cortex, the supplementary motor cortex, the cingulate cortex, the primary somatosensory cortex and the somatosensory association cortex. Furthermore, we observed a task-related power decrease in the alpha and beta frequency bands at electrodes overlying the leg motor area. The temporal activation and deactivation of the involved brain regions, as well as the chronological sequence of the movement-related potential, could be mapped to specific phases of the gait-like leg movement. We showed that the greatest cortical capacity is required when changing direction between the flexion and extension phases. An enhanced understanding of human gait will provide a basis for improving applications in the fields of neurorehabilitation and brain-computer interfaces.
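
    The task-related alpha/beta power decrease described above is usually quantified as event-related desynchronization (ERD): band power during movement expressed relative to a resting baseline. The following is a minimal illustrative sketch of that calculation, not the authors' analysis pipeline; the sampling rate, frequency bands and synthetic single-channel data are assumptions.

    ```python
    # Minimal sketch of a task-related band-power decrease (ERD); illustrative only.
    import numpy as np
    from scipy.signal import welch

    FS = 500  # assumed sampling rate in Hz

    def band_power(segment, fs, fmin, fmax):
        """Average power of one EEG channel within [fmin, fmax] Hz (Welch's method)."""
        freqs, psd = welch(segment, fs=fs, nperseg=fs)
        mask = (freqs >= fmin) & (freqs <= fmax)
        return psd[mask].mean()

    def erd_percent(baseline, movement, fs=FS, band=(8.0, 12.0)):
        """ERD in percent: negative values indicate a power decrease during movement."""
        p_base = band_power(baseline, fs, *band)
        p_move = band_power(movement, fs, *band)
        return 100.0 * (p_move - p_base) / p_base

    # Synthetic data standing in for one electrode over the leg motor area.
    rng = np.random.default_rng(0)
    baseline = rng.standard_normal(5 * FS)
    movement = rng.standard_normal(5 * FS)
    print("alpha ERD: %.1f %%" % erd_percent(baseline, movement, band=(8, 12)))
    print("beta  ERD: %.1f %%" % erd_percent(baseline, movement, band=(13, 30)))
    ```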

    Integrating out the heaviest quark in N--flavour ChPT

    Full text link
    We extend a known method for integrating out the strange quark in three-flavour chiral perturbation theory to the case of an arbitrary number of flavours. As an application, we present the explicit formulae, to one-loop accuracy, for the heavy-quark-mass dependence of the low-energy constants after reducing the number of flavours by one, i.e. after integrating out the heaviest quark in N-flavour chiral perturbation theory. Comment: 18 pages, 1 figure. Text and references added. To appear in EPJ.
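
    At one loop, such decoupling relations generically express each low-energy constant of the reduced (N-1)-flavour theory in terms of the constants of the N-flavour theory plus a chiral logarithm of the heavy-meson mass scale. The LaTeX below is only a schematic of that generic structure, with placeholder coefficients c_i and d_i; it is not the explicit formulae derived in the paper.

    ```latex
    % Schematic one-loop matching of low-energy constants when the heaviest quark
    % is integrated out; c_i and d_i are placeholders, not the paper's results.
    \begin{equation}
      L_i^{(N-1)}(\mu) \;=\; L_i^{(N)}(\mu)
      \;+\; \frac{c_i}{32\pi^2}\,\ln\frac{\bar{M}^2}{\mu^2}
      \;+\; d_i \;+\; \mathcal{O}\!\left(\frac{1}{\bar{M}^2}\right),
    \end{equation}
    % \bar{M} is the mass scale of the mesons containing the heaviest quark and
    % \mu is the chiral renormalization scale.
    ```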

    Exploring the Use of Cost-Benefit Analysis to Compare Pharmaceutical Treatments for Menorrhagia

    Get PDF
    Background: The extra-welfarist theoretical framework tends to focus on health-related quality of life, whilst the welfarist framework captures a wider notion of well-being. EQ-5D and SF-6D are commonly used to value outcomes in chronic conditions with episodic symptoms, such as heavy menstrual bleeding (clinically termed menorrhagia). Because of their narrow health focus and the condition's periodic nature, these measures may be unsuitable. A viable alternative measure is willingness to pay (WTP), drawn from the welfarist framework. Objective: We explore the use of WTP in a preliminary cost-benefit analysis comparing pharmaceutical treatments for menorrhagia. Methods: A cost-benefit analysis was carried out based on an outcome of WTP. The analysis is set in the UK primary care setting over a 24-month time period, with a partial societal perspective. Ninety-nine women completed a WTP exercise from the ex-ante (pre-treatment/condition) perspective. Maximum average WTP values were elicited for two pharmaceutical treatments, the levonorgestrel-releasing intrauterine system (LNG-IUS) and oral treatment. Cost data were offset against WTP and the net present value derived for each treatment. Qualitative information explaining the WTP values was also collected. Results: Oral treatment was indicated to be the most cost-beneficial intervention, costing £107 less than LNG-IUS and generating £7 more benefit. The mean incremental net present value for oral treatment compared with LNG-IUS was £113. The WTP approach was acceptable, as very few protests and non-responses were observed. Conclusion: The preliminary cost-benefit analysis results recommend oral treatment as the first-line treatment for menorrhagia. The WTP approach is a feasible alternative to the conventional EQ-5D/SF-6D approaches and offers advantages by capturing benefits beyond health, which is particularly relevant in menorrhagia.
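
    In a cost-benefit analysis of this kind, the net present value of a treatment is its discounted benefit (here, the willingness to pay) minus its discounted cost, and treatments are compared through the incremental net present value. The sketch below only illustrates that arithmetic; the discount rate, time profile and monetary values are invented for demonstration and are not the study's data.

    ```python
    # Illustrative net-present-value comparison for a WTP-based cost-benefit analysis;
    # all figures are invented for demonstration, not taken from the study.
    def present_value(amounts_per_year, annual_discount_rate=0.035):
        """Discount a list of yearly amounts back to year 0."""
        return sum(a / (1 + annual_discount_rate) ** t
                   for t, a in enumerate(amounts_per_year))

    def net_present_value(wtp_per_year, cost_per_year, rate=0.035):
        """NPV = discounted willingness to pay minus discounted costs."""
        return present_value(wtp_per_year, rate) - present_value(cost_per_year, rate)

    # Hypothetical 24-month (two-year) horizon for two treatments.
    npv_oral = net_present_value(wtp_per_year=[400, 400], cost_per_year=[150, 150])
    npv_lng = net_present_value(wtp_per_year=[395, 395], cost_per_year=[200, 200])
    print("Incremental NPV (oral vs LNG-IUS): £%.0f" % (npv_oral - npv_lng))
    ```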

    Accurate deep neural network inference using computational phase-change memory

    Get PDF
    In-memory computing is a promising non-von Neumann approach for building energy-efficient deep learning inference hardware. Crossbar arrays of resistive memory devices can be used to encode the network weights and perform efficient analog matrix-vector multiplications without intermediate movement of data. However, due to device variability and noise, the network needs to be trained in a specific way so that transferring the digitally trained weights to the analog resistive memory devices does not result in a significant loss of accuracy. Here, we introduce a methodology to train ResNet-type convolutional neural networks that results in no appreciable accuracy loss when the weights are transferred to in-memory computing hardware based on phase-change memory (PCM). We also propose a compensation technique that exploits the batch normalization parameters to improve the accuracy retention over time. We achieve a classification accuracy of 93.7% on the CIFAR-10 dataset and a top-1 accuracy of 71.6% on the ImageNet benchmark after mapping the trained weights to PCM. Our hardware results on CIFAR-10 with ResNet-32 demonstrate an accuracy above 93.5% retained over a one-day period, where each of the 361,722 synaptic weights of the network is programmed on just two PCM devices organized in a differential configuration. Comment: This is a pre-print of an article accepted for publication in Nature Communications.
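
    Transferring digitally trained weights to resistive devices effectively perturbs each weight with programming noise, so a common strategy is to make the network tolerant to such perturbations and to simulate the noisy analog matrix-vector product when evaluating accuracy. The numpy sketch below illustrates that general idea only; the Gaussian noise model, its magnitude and the simplified device mapping are assumptions, not the paper's exact methodology.

    ```python
    # Generic sketch of noisy weight transfer to an analog crossbar and the
    # resulting matrix-vector product; noise model and magnitude are assumed.
    import numpy as np

    rng = np.random.default_rng(42)

    def program_to_devices(weights, rel_noise=0.05):
        """Simulate programming trained weights onto analog memory devices:
        each weight is perturbed by noise proportional to the weight range."""
        noise = rng.normal(0.0, rel_noise * np.abs(weights).max(), size=weights.shape)
        return weights + noise

    def analog_matvec(programmed_weights, x):
        """The crossbar computes the matrix-vector product in a single analog step."""
        return programmed_weights @ x

    # Toy layer: 4 outputs, 8 inputs.
    W_digital = rng.standard_normal((4, 8))
    W_analog = program_to_devices(W_digital)
    x = rng.standard_normal(8)
    print("digital output:", W_digital @ x)
    print("analog  output:", analog_matvec(W_analog, x))
    ```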

    Performance of the LHCb Vertex Detector Alignment Algorithm determined with Beam Test Data

    Full text link
    LHCb is the dedicated heavy-flavour experiment at the Large Hadron Collider at CERN. The partially assembled silicon vertex locator (VELO) of the LHCb experiment has been tested in a beam test. The data from this beam test have been used to determine the performance of the VELO alignment algorithm. The relative alignment of the two silicon sensors in a module and the relative alignment of the modules have been extracted. This alignment is shown to be accurate at a level of approximately 2 micron and 0.1 mrad for translations and rotations, respectively, in the plane of the sensors. A single-hit precision at normal track incidence of about 10 micron is obtained for the sensors. The alignment of the system is shown to be stable at better than the 10 micron level under air-to-vacuum pressure changes and mechanical movements of the assembled system. Comment: accepted for publication in NIM.
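
    Alignment constants of this kind are typically extracted by minimising track-hit residuals with respect to small in-plane translations and a rotation. The sketch below is a generic linearised least-squares illustration of that idea on synthetic data; it is not the VELO alignment algorithm itself, and the geometry, misalignments and resolution are assumptions.

    ```python
    # Generic residual-minimisation sketch for in-plane sensor alignment
    # (dx, dy, dphi); synthetic data, not the LHCb VELO algorithm.
    import numpy as np

    rng = np.random.default_rng(1)
    true_dx, true_dy, true_dphi = 0.002, -0.001, 1e-4  # mm, mm, rad (assumed)

    # Hit positions predicted from tracks (mm) ...
    nominal = rng.uniform(-20, 20, size=(500, 2))
    # ... and measured positions on the misaligned sensor (rotation + shift + noise).
    c, s = np.cos(true_dphi), np.sin(true_dphi)
    measured = nominal @ np.array([[c, -s], [s, c]]).T + [true_dx, true_dy]
    measured += rng.normal(0, 0.010, size=measured.shape)  # ~10 micron hit resolution

    # Linearised model: res_x ~ dx - y*dphi, res_y ~ dy + x*dphi.
    res = (measured - nominal).ravel()
    A = np.zeros((res.size, 3))
    A[0::2, 0] = 1.0              # dx
    A[1::2, 1] = 1.0              # dy
    A[0::2, 2] = -nominal[:, 1]   # -y * dphi
    A[1::2, 2] = nominal[:, 0]    #  x * dphi
    dx, dy, dphi = np.linalg.lstsq(A, res, rcond=None)[0]
    print("fitted dx=%.4f mm, dy=%.4f mm, dphi=%.2e rad" % (dx, dy, dphi))
    ```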

    Two-loop representations of low-energy pion form factors and pi-pi scattering phases in the presence of isospin breaking

    Full text link
    Dispersive representations of the pi-pi scattering amplitudes and pion form factors, valid at two-loop accuracy in the low-energy expansion, are constructed in the presence of isospin-breaking effects induced by the difference between the charged and neutral pion masses. Analytical expressions for the corresponding phases of the scalar and vector pion form factors are computed. It is shown that each of these phases consists of the sum of a "universal" part and a form-factor-dependent contribution. The first is entirely determined by the pi-pi scattering amplitudes alone and reduces to the phase satisfying Watson's theorem in the isospin limit. The second can be sizeable, although it vanishes in the same limit. The dependence of these isospin corrections on the parameters of the subthreshold expansion of the pi-pi amplitude is studied, and an equivalent representation in terms of the S-wave scattering lengths is also briefly presented and discussed. In addition, partially analytical expressions for the two-loop form factors and pi-pi scattering amplitudes in the presence of isospin breaking are provided. Comment: 57 pages, 12 figures.
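
    The decomposition of the form-factor phases described above can be written schematically as follows; this is only a notational illustration of the stated structure, not the analytical expressions derived in the paper.

    ```latex
    % Schematic decomposition of a pion form-factor phase in the presence of
    % isospin breaking; notation illustrative only.
    \begin{equation}
      \phi_{F}(s) \;=\; \phi_{F}^{\mathrm{univ}}(s) \;+\; \delta\phi_{F}(s),
    \end{equation}
    % \phi_F^{univ}(s) is fixed by the pi-pi scattering amplitudes alone and reduces,
    % in the isospin limit, to the elastic pi-pi phase required by Watson's theorem,
    % while the form-factor-dependent piece \delta\phi_F(s) vanishes in that limit.
    ```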

    Assessing the perspective of well-being of older patients with multiple morbidities by using the LAVA tool-a person-centered approach

    Get PDF
    BACKGROUND: Older patients with multiple morbidities are a particularly vulnerable population that is likely to face complex medical decisions at some point in their lives. Patient-centered medical care fosters the inclusion of the patients' perspectives, priorities, and complaints in clinical decision making. METHODS: This article presents a short and non-normative assessment tool to capture the priorities and problems of older patients. The LAVA ("Life and Vitality Assessment") tool was developed for practical use with seniors in the general population and with residents of nursing homes, in order to gain more knowledge about the patients themselves as well as to facilitate access to them. The LAVA tool conceptualizes well-being from the perspectives of older individuals themselves rather than from the perspectives of outside observers. RESULTS: The LAVA tool is presented graphically and the assessment is explained in detail. As examples, the outcomes of LAVA assessments of three multimorbid older patients are presented and discussed. In each case, the assessment identified resources as well as at least one problem area rated as very important by the patient. CONCLUSIONS: The LAVA tool is a short, non-normative, and useful approach that encapsulates the perspectives of well-being of multimorbid patients and gives insights into their resources and problem areas. SUPPLEMENTARY INFORMATION: The online version contains supplementary material available at 10.1186/s12877-021-02342-3.

    Pilot study to test the feasibility of a trial design and complex intervention on PRIoritising MUltimedication in Multimorbidity in general practices (PRIMUM pilot).

    Get PDF
    OBJECTIVE: To improve medication appropriateness and adherence in elderly patients with multimorbidity, we developed a complex intervention involving general practitioners (GPs) and their healthcare assistants (HCAs). In accordance with the Medical Research Council guidance on developing and evaluating complex interventions, we prepared for the main study by testing the feasibility of the intervention and study design in a cluster randomised pilot study. SETTING: 20 general practices in Hesse, Germany. PARTICIPANTS: 100 cognitively intact patients ≥65 years with ≥3 chronic conditions and ≥5 chronic prescriptions, capable of participating in telephone interviews; 94 patients completed the study. INTERVENTION: The HCA conducted a checklist-based interview with patients on medication-related problems and reconciled their medications. Assisted by a computerised decision-support system (CDSS), the GPs discussed medication intake with patients and adjusted their medication regimens. The control group continued with usual care. OUTCOME MEASURES: The feasibility of the intervention and the time it required were assessed for GPs, HCAs and patients using mixed methods (questionnaires, interviews and case vignettes after completion of the study). The feasibility of the study was assessed in terms of success in achieving recruitment targets, balancing cluster sizes and minimising drop-out rates. Exploratory outcomes included the medication appropriateness index (MAI), quality of life, functional status and adherence-related measures. The MAI was evaluated blinded to group assignment, and intra-rater/inter-rater reliability was assessed for a subsample of prescriptions. RESULTS: 10 practices were randomised and analysed per group. GPs/HCAs were satisfied with the intervention despite the time required (35/45 min per patient). In the case vignettes, GPs/HCAs needed help using the CDSS. No patients reported feeling uneasy because of the study. Intra-rater/inter-rater reliability for the MAI was excellent. The inclusion criteria were challenging and potentially inadequate, and should therefore be adjusted. Outcome measures on pain, functionality and self-reported adherence proved unfeasible owing to frequent missing values, an incorrect manual or potentially invalid results. CONCLUSIONS: The intervention and trial design were feasible. The pilot study revealed important limitations that influenced the design and conduct of the main study, thus highlighting the value of piloting complex interventions. TRIAL REGISTRATION NUMBER: ISRCTN99691973; Results. Funding was provided by the German Federal Ministry of Education and Research (BMBF), grant number 01GK0702.

    Electromagnetic corrections in eta --> 3 pi decays

    Full text link
    We re-evaluate the electromagnetic corrections to eta --> 3 pi decays at next-to-leading order in the chiral expansion, arguing that effects of order e^2(m_u-m_d) disregarded so far are not negligible compared to other contributions of order e^2 times a light quark mass. Despite the appearance of the Coulomb pole in eta --> pi+ pi- pi0 and of cusps in eta --> 3 pi0, the overall corrections remain small. Comment: 21 pages, 11 figures; references updated, version published in EPJ.
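
    The power counting referred to above can be written out schematically; the decomposition below merely labels the orders mentioned in the abstract (the leading strong isospin-breaking piece, terms of order e^2 times a light quark mass, and the e^2(m_u-m_d) terms argued to be non-negligible) and is not the paper's amplitude representation.

    ```latex
    % Schematic organisation of isospin-breaking contributions to the
    % eta -> 3 pi amplitude; purely illustrative of the power counting.
    \begin{equation}
      \mathcal{A}(\eta \to 3\pi) \;=\;
      (m_u - m_d)\,\mathcal{A}_1 \;+\; e^2\,\hat{m}\,\mathcal{A}_2
      \;+\; e^2 (m_u - m_d)\,\mathcal{A}_3 \;+\; \dots ,
    \end{equation}
    % \hat{m} denotes the average light-quark mass; the e^2(m_u-m_d) terms are
    % those argued not to be negligible at next-to-leading order.
    ```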

    Generic and Layered Framework Components for the Control of a Large Scale Data Acquisition System

    Get PDF
    The complexity of today's experiments in High Energy Physics results in a large number of readout channels, which can reach a million and above. The experiments generally consist of various subsystems, which themselves comprise a large number of detectors requiring sophisticated DAQ and readout electronics. We report here on the structured software layers used to control such a data acquisition system in the case of LHCb, one of the four experiments at the LHC. Additional focus is given to the protocols in use as well as the required hardware. An abstraction layer was implemented to allow access to the different and distinct hardware types in a coherent and generic manner. The hierarchical structure, which allows commands to be propagated down to the subsystems, is explained. Via finite state machines, an expert system with auto-recovery abilities can be modeled.
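
    A hierarchical control tree of finite state machines, in which a command issued to a node is propagated to its children and the node's own state summarises theirs, can be sketched as below. This is a generic illustration of the pattern described above, not the LHCb control framework itself; all class, state and command names are assumptions.

    ```python
    # Generic sketch of a hierarchical finite-state-machine control tree:
    # commands propagate down, states are summarised upward. Illustrative only.
    class ControlNode:
        TRANSITIONS = {
            ("NOT_READY", "configure"): "READY",
            ("READY", "start"): "RUNNING",
            ("RUNNING", "stop"): "READY",
            ("ERROR", "recover"): "NOT_READY",  # auto-recovery path
        }

        def __init__(self, name, children=()):
            self.name = name
            self.children = list(children)
            self.state = "NOT_READY"

        def command(self, cmd):
            """Propagate a command down the hierarchy, then update the local state."""
            for child in self.children:
                child.command(cmd)
            self.state = self.TRANSITIONS.get((self.state, cmd), self.state)
            if self.children and any(c.state == "ERROR" for c in self.children):
                self.state = "ERROR"  # a parent is only as healthy as its children

    # Tiny control tree: one top-level node steering two subsystem nodes.
    daq = ControlNode("DAQ", [ControlNode("Subsystem_A"), ControlNode("Subsystem_B")])
    daq.command("configure")
    daq.command("start")
    print(daq.name, daq.state)  # -> DAQ RUNNING
    ```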