
    Generic guide concepts for the European Spallation Source

    The construction of the European Spallation Source (ESS) faces many challenges from the neutron beam transport point of view: the spallation source is specified as being driven by a 5 MW beam of protons, each with 2 GeV energy, and yet the requirements on instrument background suppression relative to the measured signal vary between 10^-6 and 10^-8. The energetic particles, particularly those above 20 MeV, which are expected to be produced in abundance in the target, have to be filtered out in order to make the beamlines safe and operational and to provide good-quality measurements with low background. We present generic neutron guides for short- and medium-length instruments which are optimized for good performance at minimal cost. Direct line of sight to the source is avoided twice, with either the first point out of line of sight or both being inside the bunker (20 m) to minimize shielding costs. These guide geometries are regarded as a baseline to define standards for instruments to be constructed at ESS. They are used to find commonalities and to develop principles and solutions for common problems. Lastly, we report the impact of employing the over-illumination concept to mitigate losses from random misalignment passively, and find that over-illumination should be used sparingly, in key locations, to be effective. For more widespread alignment issues, a more direct, active approach is likely to be needed.
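    The line-of-sight argument above can be made concrete with a standard estimate: a uniformly curved guide of width w and radius of curvature R loses direct sight of the source after roughly L_los = sqrt(8wR). The sketch below uses invented placeholder numbers, not parameters from the paper.

```python
import math

def line_of_sight_length(radius_m: float, width_m: float) -> float:
    """Standard estimate of the distance after which a uniformly
    curved guide loses direct line of sight: L_los = sqrt(8 * w * R)."""
    return math.sqrt(8.0 * width_m * radius_m)

# Illustrative numbers (not from the paper): a 3 cm wide guide
# curved with a 1500 m radius of curvature.
L_los = line_of_sight_length(1500.0, 0.03)
print(f"line of sight lost after {L_los:.1f} m")
print("inside 20 m bunker:", L_los <= 20.0)
```

    With these assumed values the first loss of line of sight falls just inside a 20 m bunker, which is the kind of constraint the baseline geometries are designed around.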

    Battery health determination by subspace parameter estimation and sliding mode control for an all-electric Personal Rapid Transit vehicle — the ULTra

    The paper describes a real-time adaptive battery modelling methodology for use in an all-electric personal rapid transit (PRT) vehicle. Through use of a sliding-mode observer and online subspace parameter estimation, the voltages associated with monitoring the state of charge (SoC) of the battery system are shown to be accurately estimated, even with erroneous initial conditions in both the model and parameters. In this way, problems such as self-discharge during storage of the cells and SoC drift (as usually incurred by coulomb-counting methods due to overcharging or ambient temperature fluctuations) are overcome. Moreover, through online monitoring of the degradation of the estimated parameters, battery ageing (state of health) can be monitored and, in the case of safety-critical systems, cell failure may be predicted in time to avoid inconvenience to passenger networks. Due to the adaptive nature of the proposed methodology, this system can be implemented over a wide range of operating environments, applications and battery topologies, by adjustment of the underlying state-space model.
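    The sliding-mode idea can be sketched on a toy model (illustrative only, not the paper's model: the linear OCV curve, capacity, gain and currents below are all invented). The observer integrates the same coulomb-counting dynamics as the plant but adds a switching correction driven by the sign of the terminal-voltage error, so it converges despite an erroneous initial SoC:

```python
import math

def ocv(soc: float) -> float:
    """Hypothetical open-circuit-voltage curve (not from the paper)."""
    return 3.0 + 1.2 * soc

def simulate(q_as=3600.0, r=0.05, k=0.02, dt=1.0, steps=2000):
    """Toy sliding-mode SoC observer: plant and observer share the
    coulomb-counting dynamics; the observer adds a sign-of-error
    correction on the measured terminal voltage."""
    soc_true, soc_hat = 0.8, 0.3          # deliberately wrong initial estimate
    current = 1.0                          # constant 1 A discharge (assumed)
    for _ in range(steps):
        v_meas = ocv(soc_true) - current * r   # measured terminal voltage
        v_hat = ocv(soc_hat) - current * r     # observer's predicted voltage
        soc_true += -current / q_as * dt
        # switching correction drives soc_hat toward soc_true
        soc_hat += (-current / q_as
                    + k * math.copysign(1.0, v_meas - v_hat)) * dt
    return soc_true, soc_hat

true_soc, est_soc = simulate()
print(f"true SoC {true_soc:.3f}, estimated SoC {est_soc:.3f}")
```

    After convergence the estimate chatters within roughly ±k·dt of the true state, which is the characteristic trade-off of a fixed-gain sliding-mode correction.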

    Evaluation of phylogenetic reconstruction methods using bacterial whole genomes: a simulation based study

    Background: Phylogenetic reconstruction is a necessary first step in many analyses which use whole genome sequence data from bacterial populations. There are many available methods to infer phylogenies, and these have various advantages and disadvantages, but few unbiased comparisons of the range of approaches have been made. Methods: We simulated data from a defined "true tree" using a realistic evolutionary model. We built phylogenies from these data using a range of methods, and compared reconstructed trees to the true tree using two measures, noting the computational time needed for different phylogenetic reconstructions. We also used real data from Streptococcus pneumoniae alignments to compare individual core gene trees to a core genome tree. Results: We found that, as expected, maximum likelihood trees from good quality alignments were the most accurate, but also the most computationally intensive. Using less accurate phylogenetic reconstruction methods, we were able to obtain results of comparable accuracy; we found that approximate results can rapidly be obtained using genetic-distance-based methods. In real data we found that highly conserved core genes, such as those involved in translation, gave an inaccurate tree topology, whereas genes involved in recombination events gave inaccurate branch lengths. We also show a tree-of-trees, relating the results of different phylogenetic reconstructions to each other. Conclusions: We recommend three approaches, depending on requirements for accuracy and computational time. Quicker approaches that do not perform full maximum likelihood optimisation may be useful for many analyses requiring a phylogeny, as generating a high quality input alignment is likely to be the major limiting factor of accurate tree topology. We have publicly released our simulated data and code to enable further comparisons.
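    A minimal sketch of the genetic-distance-based family of methods (illustrative only, not the paper's pipeline; the toy sequences are invented): per-site Hamming distances between aligned sequences feed an average-linkage (UPGMA-style) clustering that returns a nested-tuple tree.

```python
from itertools import combinations

def hamming(a: str, b: str) -> float:
    """Per-site genetic distance between two equal-length aligned sequences."""
    return sum(x != y for x, y in zip(a, b)) / len(a)

def upgma(seqs: dict) -> tuple:
    """Average-linkage clustering over a distance matrix; returns
    the tree as nested tuples of leaf clusters."""
    size = {(n,): 1 for n in seqs}
    dist = {frozenset({(a,), (b,)}): hamming(seqs[a], seqs[b])
            for a, b in combinations(seqs, 2)}
    while len(size) > 1:
        pair = min(dist, key=dist.get)        # closest pair of clusters
        a, b = tuple(pair)
        merged = (a, b)
        # size-weighted average distance from the new cluster to the rest
        for c in size:
            if c in (a, b):
                continue
            dist[frozenset({merged, c})] = (
                size[a] * dist[frozenset({a, c})]
                + size[b] * dist[frozenset({b, c})]) / (size[a] + size[b])
        size[merged] = size[a] + size[b]
        del size[a], size[b]
        dist = {k: v for k, v in dist.items() if a not in k and b not in k}
    return next(iter(size))

tree = upgma({"s1": "AAAA", "s2": "AAAT", "s3": "TTTT"})
print(tree)  # groups s1 with s2, whose distance (0.25) is smallest
```

    Methods like this avoid likelihood optimisation entirely, which is why they are fast but only approximate.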

    Sensitivity studies for r-process nucleosynthesis in three astrophysical scenarios

    In rapid neutron capture, or r-process, nucleosynthesis, heavy elements are built up via a sequence of neutron captures and beta decays that involves thousands of nuclei far from stability. Though we understand the basics of how the r-process proceeds, its astrophysical site is still not conclusively known. The nuclear network simulations we use to test potential astrophysical scenarios require nuclear physics data (masses, beta decay lifetimes, neutron capture rates, fission probabilities) for all of the nuclei on the neutron-rich side of the nuclear chart, from the valley of stability to the neutron drip line. Here we discuss recent sensitivity studies that aim to determine which individual pieces of nuclear data are the most crucial for r-process calculations. We consider three types of astrophysical scenarios: a traditional hot r-process, a cold r-process in which the temperature and density drop rapidly, and a neutron star merger trajectory.
    Comment: 8 pages, 4 figures, submitted to the Proceedings of the International Nuclear Physics Conference (INPC) 201
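    The sensitivity-study logic can be sketched on a toy network (purely illustrative: real r-process networks couple thousands of nuclei with temperature-dependent rates, and the rates below are invented). One rate is perturbed, the network is re-integrated, and the total change in final abundances is collapsed into a single sensitivity measure:

```python
def evolve(rates, y0=(1.0, 0.0, 0.0), dt=1e-3, steps=5000):
    """Euler integration of a toy 3-species chain 0 -> 1 -> 2
    (e.g. capture then decay); abundances sum to the initial total."""
    y = list(y0)
    lam01, lam12 = rates
    for _ in range(steps):
        f0 = -lam01 * y[0]
        f1 = lam01 * y[0] - lam12 * y[1]
        f2 = lam12 * y[1]
        y = [y[0] + f0 * dt, y[1] + f1 * dt, y[2] + f2 * dt]
    return y

def sensitivity(rates, i, factor=1.1):
    """Summed absolute change in final abundances when rate i is
    scaled by `factor` (a common style of sensitivity measure)."""
    base = evolve(rates)
    mod = list(rates)
    mod[i] *= factor
    pert = evolve(tuple(mod))
    return 100.0 * sum(abs(a - b) for a, b in zip(pert, base))

print(sensitivity((1.0, 0.5), 0), sensitivity((1.0, 0.5), 1))
```

    Repeating this for every rate in the network ranks the nuclear inputs whose uncertainties matter most, which is the essence of the studies discussed in the abstract.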

    Isospin-breaking interactions studied through mirror energy differences

    Background: Information on charge-dependent (i.e., isospin-non-conserving) interactions is extracted from excited states of mirror nuclei. Purpose: Specifically, the purpose of the study is to extract effective isovector (Vpp − Vnn) interactions which, in general, can be of either Coulomb or nuclear origin. Methods: A comprehensive shell-model description of isospin-breaking effects is used to fit data on mirror energy differences in the A = 42–54 region. The angular-momentum dependence of the isospin-breaking interactions was determined from a systematic study of mirror energy differences. Results: The results reveal a significant isovector term, with a very strong spin dependence, beyond that expected of a two-body Coulomb interaction. Conclusions: The isospin-breaking terms that are extracted have a J dependence that is not consistent with the known charge-symmetry-breaking (CSB) properties of the bare nucleon-nucleon interaction.
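    The underlying observable is simple to state: for a mirror pair, the mirror energy difference at each spin J is MED(J) = Ex(J, Tz = −1) − Ex(J, Tz = +1). A small sketch (the excitation energies below are invented placeholders, not data from the study):

```python
def mirror_energy_differences(ex_proton_rich: dict, ex_neutron_rich: dict) -> dict:
    """MED(J) = Ex(J, Tz=-1) - Ex(J, Tz=+1), in keV, for every spin J
    present in both level schemes."""
    return {j: ex_proton_rich[j] - ex_neutron_rich[j]
            for j in ex_proton_rich if j in ex_neutron_rich}

# Hypothetical excitation energies (keV) for a mirror pair; the values
# are placeholders, not data from the paper.
ex_tz_m1 = {2: 1500.0, 4: 2750.0, 6: 3400.0}   # proton-rich member
ex_tz_p1 = {2: 1525.0, 4: 2700.0, 6: 3310.0}   # neutron-rich member
print(mirror_energy_differences(ex_tz_m1, ex_tz_p1))
# {2: -25.0, 4: 50.0, 6: 90.0}
```

    It is the J dependence of curves like this, fitted across the A = 42–54 region, that constrains the isospin-breaking terms.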

    The effectiveness of Lee Silverman Voice Treatment therapy issued interactively through an iPad device: a non-inferiority study.

    Introduction This study compared the differences in recorded speech variables between people treated with conventional 'in person' Lee Silverman Voice Treatment (LSVT) and those treated remotely via iPad-based FaceTime. Method Eight participants were selected for the iPad LSVT group, and 21 similarly matched subjects were selected from existing data to form the 'in person' group. Participants in both groups had diagnosed idiopathic Parkinson's disease and moderate hypokinetic dysarthria. Eighteen sessions of prescribed LSVT, comprising a pre-treatment assessment, 16 treatment sessions, and a six-month post-treatment assessment, were administered for each person. In both groups, pre- and post-treatment assessments were conducted face-to-face. Performance measures were recorded during assessment and treatment. Average measures were determined for all tasks at all time points and a summary outcome variable was composed from across-task performance. Results Non-inferiority testing confirmed that iPad LSVT was non-inferior in treating all LSVT task 3 variables except generating words, with the 90% upper confidence intervals (CI) lying between the non-inferiority margin of ±2.25 and zero. The iPad was superior in treating the task 3 rainbow reading passage and describing motor task variables, with upper and lower 90% CI values being negative. The improvement in the summary outcome variable score was also superior in the iPad group. Discussion Non-inferiority testing implies that the iPad LSVT is non-inferior in treating task 3 variables when compared to traditional LSVT. The study supports further development of remote delivery solutions involving the Apple iPad and FaceTime as a means of improving access to services and the participant's experience.
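    The non-inferiority logic can be sketched as follows. The study's margin of ±2.25 is reused, but the means, standard error, and sign convention below are invented placeholders, and with small samples a t critical value would replace the normal one:

```python
def noninferiority_90ci(mean_new: float, mean_ref: float,
                        se_diff: float, margin: float = 2.25):
    """Two-sided 90% CI for the (reference - new) difference, assuming
    a positive difference means the new treatment is worse. The new
    treatment is non-inferior if the upper limit stays below `margin`."""
    z = 1.6449                       # normal 95th percentile (two-sided 90% CI)
    diff = mean_ref - mean_new
    lo, hi = diff - z * se_diff, diff + z * se_diff
    return lo, hi, hi < margin

lo, hi, ok = noninferiority_90ci(mean_new=71.0, mean_ref=71.5, se_diff=0.8)
print(f"90% CI: ({lo:.2f}, {hi:.2f}); non-inferior: {ok}")
```

    An entirely negative CI, as reported for the rainbow reading passage, would additionally indicate superiority of the new delivery mode, not just non-inferiority.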

    Measurements and Monte-Carlo simulations of the particle self-shielding effect of B4C grains in neutron shielding concrete

    A combined measurement and Monte-Carlo simulation study was carried out in order to characterize the particle self-shielding effect of B4C grains in neutron shielding concrete. Several batches of a specialized neutron shielding concrete, with varying B4C grain sizes, were exposed to a 2 Å neutron beam at the R2D2 test beamline at the Institute for Energy Technology in Kjeller, Norway. The direct and scattered neutrons were detected with a neutron detector placed behind the concrete blocks, and the results were compared to Geant4 simulations. The particle self-shielding effect was included in the Geant4 simulations by calculating effective neutron cross-sections during the Monte-Carlo simulation process. It is shown that this method reproduces the measured results well. Our results show that shielding calculations for low-energy neutrons using such materials would lead to an underestimate of the shielding required for a given design scenario if the particle self-shielding effect is not included in the calculations.
    Comment: This manuscript version is made available under the CC-BY-NC-ND 4.0 license http://creativecommons.org/licenses/by-nc-nd/4.0
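    The grain-size dependence can be illustrated with a standard flux-averaging approximation for a slab-like grain (a sketch of the general effect, not necessarily the paper's exact effective-cross-section calculation; the macroscopic cross-section value is an invented placeholder):

```python
import math

def self_shielding_factor(sigma_macro: float, grain_size: float) -> float:
    """Flux-averaged self-shielding factor for a slab-like absorbing
    grain of thickness d: f = (1 - exp(-Sigma*d)) / (Sigma*d).
    f -> 1 for small grains (no self-shielding), f -> 0 for large ones;
    the effective cross-section is then f * Sigma."""
    tau = sigma_macro * grain_size
    if tau < 1e-12:
        return 1.0
    return (1.0 - math.exp(-tau)) / tau

# B4C has a very large thermal absorption cross-section; the macroscopic
# value here is an assumed placeholder, not a measured input.
sigma = 80.0          # cm^-1, assumed macroscopic absorption cross-section
for d_um in (10, 100, 1000):
    d_cm = d_um * 1e-4
    print(f"{d_um:5d} um grain: f = {self_shielding_factor(sigma, d_cm):.3f}")
```

    Coarser grains absorb far fewer neutrons per unit mass of B4C than the homogeneous-mixture assumption predicts, which is why neglecting the effect underestimates the shielding required.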

    Exploring capability and accountability outcomes of open development for the poor and marginalized: An analysis of select literature

    Open development concerns the application of digitally-enabled openness to radically change human capability and governance contexts (Davies & Edwards, 2012; Smith & Reilly, 2013; Smith, Elder, & Emdon, 2011). However, what openness means, and how it contributes to development outcomes, is contested (Buskens, 2013; Singh & Gurumurthy, 2013). Furthermore, the potential of open development to support positive social transformation has not yet materialized, particularly for marginalized populations (Bentley & Chib, 2016), partly because relatively little is known about how transformation is enacted in the field. Likewise, two promising outcomes – the expansion of human capabilities and accountability – have not been explored in detail. This research interrogates the influence of digitally-enabled openness on transformation processes and outcomes. A purposeful sample of the literature was taken to evaluate outcomes and transformation processes according to our theoretical framework, which defines seven essential cross-cutting dimensions. We argue that these dimensions explain the links between the structures, processes, and outcomes of open development. Understanding these links is essential in Community Informatics because it enables researchers and practitioners to support the effective use of openness by and for poor and marginalized communities in pursuit of their own objectives.