
    Quantum widening of CDT universe

    The physical phase of Causal Dynamical Triangulations (CDT) is known to be described by an effective, one-dimensional action in which the three-volumes of the underlying foliation of the full CDT play the role of the sole degrees of freedom. Here we map this effective description onto a statistical-physics model of particles distributed on a 1d lattice, with site occupation numbers corresponding to the three-volumes. We identify the emergence of the quantum de Sitter universe observed in CDT with the condensation transition known from similar statistical models. Our model correctly reproduces the shape of the quantum universe and allows us to analytically determine quantum corrections to the size of the universe. We also investigate the phase structure of the model and show that it reproduces all three phases observed in computer simulations of CDT. In addition, we predict that two other phases may exist, depending on the exact form of the discretised effective action and on the boundary conditions. We calculate various quantities, such as the distribution of three-volumes in our model, and discuss how they can be compared with CDT. Comment: 19 pages, 13 figures
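    The condensation mechanism alluded to above can be illustrated with a minimal balls-in-boxes style Monte Carlo sketch, in which site occupation numbers stand in for the three-volumes; the one-site weight and all parameters below are illustrative assumptions, not the paper's discretised effective action.

```python
import math
import random

# Illustrative sketch: a balls-in-boxes style model on a periodic 1d lattice.
# Site occupation numbers n_i stand in for CDT three-volumes; the one-site
# weight g(n) ~ (n + 1)^(-b) is an assumed form, not the paper's action.
L, N = 100, 2000          # lattice sites, total "volume" (number of particles)
b = 3.0                   # assumed exponent of the one-site weight
sweeps = 20000

def log_g(n):
    # Logarithm of the one-site weight g(n).
    return -b * math.log(n + 1.0)

n = [N // L] * L          # uniform initial configuration

for _ in range(sweeps):
    for _ in range(L):
        i = random.randrange(L)
        if n[i] == 0:
            continue
        j = (i + random.choice((-1, 1))) % L   # nearest neighbour on the ring
        # Metropolis move: transfer one particle from site i to site j.
        dlog = (log_g(n[i] - 1) + log_g(n[j] + 1)) - (log_g(n[i]) + log_g(n[j]))
        if dlog >= 0 or random.random() < math.exp(dlog):
            n[i] -= 1
            n[j] += 1

# For b > 2 and high density a single site typically carries an extensive
# "condensate", the analogue of the emergent quantum-universe blob.
print("max occupation:", max(n), " mean occupation:", sum(n) / L)
```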

    Methods of Calculation of MSW Structures

    The paper reviews the existing methods for the solution of structures supporting the propagation of magnetostatic waves. Since these are mostly multilayered structures, the numerical techniques most commonly used for their calculation are the surface permeability method, the finite element method and the boundary element method. Because each of them is more or less suitable in particular cases, the advantages of each are discussed and pointed out in the paper. The general magnetic anisotropy formulation has been introduced into the boundary element method.

    Zero-range process with long-range interactions at a T-junction

    A generalized zero-range process with a limited number of long-range interactions is studied as an example of a transport process in which particles at a T-junction choose which branch to take based on the traffic levels on each branch. The system is analysed with a self-consistent mean-field approximation, which allows phase diagrams to be constructed. Agreement between the analysis and simulations is found to be very good. Comment: 21 pages, 6 figures
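    As a rough illustration of the basic ingredient, the sketch below simulates particles hopping along an input chain towards a junction and choosing the outgoing branch with less traffic in its first few sites; the geometry, hop rates and choice rule are assumptions made for illustration and do not reproduce the paper's long-range interaction.

```python
import random

# Illustrative sketch only: a zero-range-type process on an input chain feeding
# a T-junction with two outgoing branches. A particle leaving the junction
# inspects the first K sites of each branch and hops onto the less loaded one.
M, K, alpha, steps = 50, 5, 0.3, 200_000
inp = [0] * M                       # occupations of the input chain
branches = [[0] * M, [0] * M]       # occupations of the two outgoing branches

for _ in range(steps):
    if random.random() < alpha:     # injection at the closed end of the input chain
        inp[0] += 1
    i = random.randrange(M - 1)     # random-sequential hop towards the junction
    if inp[i] > 0:
        inp[i] -= 1
        inp[i + 1] += 1
    if inp[-1] > 0:                 # junction site: branch choice by local traffic
        load = [sum(b[:K]) for b in branches]
        pick = random.randrange(2) if load[0] == load[1] else load.index(min(load))
        inp[-1] -= 1
        branches[pick][0] += 1
    for b in branches:              # hops along the branches; exit at the open end
        j = random.randrange(M)
        if b[j] > 0:
            b[j] -= 1
            if j < M - 1:
                b[j + 1] += 1

print("particles per branch:", [sum(b) for b in branches])
```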

    Pair-factorized steady states on arbitrary graphs

    Stochastic mass transport models are usually described by specifying the hopping rates of particles between sites of a given lattice, and the goal is to predict the existence and properties of the steady state. Here we ask the reverse question: given a stationary state that factorizes over links (pairs of sites) of an arbitrary connected graph, what are the possible hopping rates that converge to this state? We define a class of hopping functions which lead to the same steady state and guarantee current conservation but may differ in the induced current strength. For the special case of anisotropic hopping in two dimensions we discuss some aspects of the phase structure. We also show how this case can be traced back to an effective zero-range process in one dimension, which is solvable for a large class of hopping functions. Comment: IOP style, 9 pages, 1 figure
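    For orientation, the standard pair-factorized ansatz and a hopping rate commonly quoted as realising it on a homogeneous graph take the form below; this is a textbook-style summary rather than the paper's general construction.

```latex
% Pair-factorized steady state on a connected graph G=(V,E) with fixed total mass M:
P(\{m_i\}) \;\propto\; \prod_{(i,j)\in E} g(m_i, m_j)\,
           \delta\!\Big(\sum_{i\in V} m_i - M\Big),
% and a hopping rate out of site i that leaves this state stationary:
u_i(\{m\}) \;=\; \prod_{j:\,(i,j)\in E} \frac{g(m_i-1,\,m_j)}{g(m_i,\,m_j)} .
```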

    Speed and Accuracy of Static Image Discrimination by Rats

    When discriminating dynamic noisy sensory signals, human and primate subjects achieve higher accuracy when they take more time to decide, an effect attributed to the accumulation of evidence over time to overcome neural noise. We measured the speed and accuracy of twelve freely behaving rats discriminating static, high-contrast photographs of real-world objects for water reward in a self-paced task. Response latency was longer in correct trials than in error trials. Discrimination accuracy increased with response latency over the range of 500-1200 ms. We used morphs between previously learned images to vary the image similarity parametrically, and thereby modulate task difficulty from ceiling to chance. Over this range we find that rats take more time before responding in trials with more similar stimuli. We conclude that rats' perceptual decisions improve with time even in the absence of temporal information in the stimulus, and that rats modulate their speed in response to discrimination difficulty so as to balance speed and accuracy.
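    As a purely illustrative companion to the evidence-accumulation account mentioned above (not the analysis carried out in the paper), a simple drift-diffusion walk with arbitrary parameters shows how accuracy grows with decision time when noisy evidence is integrated to a bound.

```python
import random

# Illustrative only: a drift-diffusion walk in which noisy evidence is
# accumulated until it hits a bound. Drift, noise and bounds are arbitrary
# assumptions, not values fitted to the rat data.
def trial(drift=0.02, noise=1.0, bound=20.0):
    x, t = 0.0, 0
    while abs(x) < bound:
        x += drift + noise * random.gauss(0.0, 1.0)
        t += 1
    return x > 0, t          # (correct decision?, decision time in steps)

for bound in (5.0, 10.0, 20.0, 40.0):
    results = [trial(bound=bound) for _ in range(5000)]
    acc = sum(c for c, _ in results) / len(results)
    rt = sum(t for _, t in results) / len(results)
    # Larger bounds -> longer decision times and higher accuracy.
    print(f"bound={bound:5.1f}  accuracy={acc:.2f}  mean steps={rt:8.1f}")
```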

    The NuMAX Long Baseline Neutrino Factory Concept

    A Neutrino Factory, in which neutrinos of all species are produced in equal quantities by muon decay, is described as an intensity-frontier facility of exquisite precision, providing ideal conditions for ultimate neutrino studies and an ideal complement to long-baseline facilities such as LBNF at Fermilab. It is foreseen to be built in stages of progressively increasing complexity and performance, taking advantage of existing or proposed facilities at an existing laboratory such as Fermilab. A tentative layout based on a recirculating linac, offering opportunities for considerable savings, is discussed, as well as its possible evolution toward a muon collider if and when warranted by physics. Tentative parameters of the various stages are presented, together with the R&D necessary to address the technological issues and demonstrate their feasibility. Comment: JINST Special Issue on Muon Accelerators. arXiv admin note: text overlap with arXiv:1308.0494, arXiv:1502.0164

    Predicting the effects of deep brain stimulation using a reduced coupled oscillator model

    This is the final version, available on open access from the Public Library of Science via the DOI in this record. Data availability: the data analysed in this manuscript are available from the MRC BNDU data sharing platform at https://data.mrc.ox.ac.uk/data-set/tremor-data-measured-essential-tremor-patients-subjected-phase-locked-deep-brain (DOI: 10.5287/bodleian:xq24eN2Km). Deep brain stimulation (DBS) is known to be an effective treatment for a variety of neurological disorders, including Parkinson’s disease and essential tremor (ET). At present, it involves administering a train of pulses with constant frequency via electrodes implanted into the brain. New ‘closed-loop’ approaches involve delivering stimulation according to the ongoing symptoms or brain activity and have the potential to provide improvements in terms of efficiency, efficacy and reduction of side effects. The success of closed-loop DBS depends on being able to devise a stimulation strategy that minimizes oscillations in neural activity associated with symptoms of motor disorders. A useful stepping stone towards this is to construct a mathematical model which can describe how the brain oscillations should change when stimulation is applied at a particular state of the system. Our work focuses on the use of coupled oscillators to represent neurons in areas generating pathological oscillations. Using a reduced form of the Kuramoto model, we analyse how a patient should respond to stimulation when neural oscillations have a given phase and amplitude, provided a number of conditions are satisfied. For such patients, we predict that the best stimulation strategy should be phase specific, but also that stimulation should have a greater effect if applied when the amplitude of brain oscillations is lower. We compare this surprising prediction with data obtained from ET patients. In light of our predictions, we also propose a new hybrid strategy which effectively combines two of the closed-loop strategies found in the literature, namely phase-locked and adaptive DBS.
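    A minimal sketch, assuming a plain Kuramoto population with an occasional phase-dependent kick (the parameter values and the form of the stimulation term are illustrative, not taken from the paper's reduced model), shows how one can probe whether the effect of a pulse depends on the collective amplitude at the moment it is delivered.

```python
import cmath
import math
import random

# Illustrative Kuramoto sketch: the order parameter r*exp(i*psi) stands in for
# the amplitude and phase of the pathological oscillation. Every so often a
# brief phase-dependent kick is applied and the change in r is recorded.
N, K, dt, eps = 200, 1.2, 0.01, 0.3
omega = [random.gauss(2 * math.pi * 5.0, 2.0) for _ in range(N)]   # ~5 Hz rhythm
theta = [random.uniform(0, 2 * math.pi) for _ in range(N)]

def order_parameter(th):
    z = sum(cmath.exp(1j * t) for t in th) / len(th)
    return abs(z), cmath.phase(z)

responses = []
for step in range(20_000):
    r, psi = order_parameter(theta)
    # Standard mean-field Kuramoto update (Euler step).
    theta = [t + dt * (w + K * r * math.sin(psi - t)) for t, w in zip(theta, omega)]
    if step % 400 == 200:            # occasional stimulation pulse
        r0, psi0 = order_parameter(theta)
        theta = [t + eps * math.sin(psi0 - t) for t in theta]   # phase-dependent kick
        r1, _ = order_parameter(theta)
        responses.append((r0, r1 - r0))

low = [d for r0, d in responses if r0 < 0.5]
high = [d for r0, d in responses if r0 >= 0.5]
print("mean kick effect at low amplitude :", sum(low) / max(len(low), 1))
print("mean kick effect at high amplitude:", sum(high) / max(len(high), 1))
```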

    Optimal learning rules for familiarity detection

    It has been suggested that the mammalian memory system has both familiarity and recollection components. Recently, a high-capacity network to store familiarity has been proposed. Here we derive analytically the optimal learning rule for such a familiarity memory using a signal-to-noise ratio analysis. We find that in the limit of large networks the covariance rule, known to be the optimal local, linear learning rule for pattern association, is also the optimal learning rule for familiarity discrimination. The capacity is independent of the sparseness of the patterns, as long as the patterns have a fixed number of bits set. The corresponding information capacity is 0.057 bits per synapse, less than is typically found for associative networks.
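    A small sketch of the idea (network size, coding level and number of patterns are arbitrary assumptions, and the paper's analytical signal-to-noise treatment is not reproduced): store binary patterns with the covariance rule and score familiarity by a quadratic form in the mean-subtracted activity, so that stored patterns tend to score higher than novel ones.

```python
import random

# Illustrative familiarity discrimination with the covariance learning rule.
N, P, f = 200, 50, 0.5          # neurons, stored patterns, coding level (assumed)
patterns = [[1 if random.random() < f else 0 for _ in range(N)] for _ in range(P)]

# Covariance rule: W_ij = sum_mu (xi_i - f)(xi_j - f), with no self-connections.
W = [[0.0] * N for _ in range(N)]
for xi in patterns:
    for i in range(N):
        for j in range(N):
            if i != j:
                W[i][j] += (xi[i] - f) * (xi[j] - f)

def familiarity(x):
    # Quadratic familiarity score on mean-subtracted activities.
    d = [xi - f for xi in x]
    return sum(d[i] * W[i][j] * d[j] for i in range(N) for j in range(N))

novel = [[1 if random.random() < f else 0 for _ in range(N)] for _ in range(P)]
stored_scores = [familiarity(x) for x in patterns]
novel_scores = [familiarity(x) for x in novel]
print("mean score, stored patterns:", sum(stored_scores) / P)
print("mean score, novel patterns :", sum(novel_scores) / P)
```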