19,755 research outputs found

    Confining the Electroweak Model to a Brane

    We introduce a simple scenario where, starting from a five-dimensional SU(3) gauge theory, we end up with several 4-D parallel branes carrying localized fermions and gauge fields. As in the split-fermion scenario, the confinement of fermions is generated by a nontrivial topological solution of an SU(3) scalar field. The 4-D fermions are found to be chiral and to have interesting properties inherited from their 5-D group representation structure. The gauge fields, on the other hand, are localized by loop corrections, generated by the fermions, taking place at the branes. We show that these two confining mechanisms can be combined to reproduce the basic structure of the electroweak model for both leptons and quarks. Among the main results: gauge and Higgs fields are unified at the 5-D level, and new fields are predicted, namely one left-handed neutrino with zero hypercharge and one massive vector field coupling the new neutrino to the other left-handed leptons. The hierarchy problem is also addressed. Comment: 9 pages, 8 figures; references added; version published in PR
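    For context, the split-fermion mechanism referred to above localizes chiral zero modes on a scalar domain wall. The sketch below assumes the textbook kink profile and Yukawa coupling, not the paper's specific SU(3) topological solution, which the abstract does not spell out.

```latex
% Generic split-fermion localization on a scalar kink (illustrative only).
\begin{align}
  \phi(y) &= v \tanh(\mu y), \\
  S &= \int d^4x\, dy\; \bar\Psi \left( i\Gamma^M \partial_M - g\,\phi(y) \right)\Psi, \\
  \psi_L^{(0)}(y) &\propto \exp\!\left(-g \int_0^y \phi(y')\, dy'\right)
     = \left[\cosh(\mu y)\right]^{-gv/\mu}.
\end{align}
% The left-handed zero mode is exponentially peaked at the wall y = 0, while
% the would-be right-handed zero mode is not normalizable, so the effective
% 4-D spectrum is chiral.
```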

    Meron-cluster simulation of the quantum antiferromagnetic Heisenberg model in a magnetic field in one and two dimensions

    Motivated by the numerical simulation of systems which display quantum phase transitions, we present a novel application of the meron-cluster algorithm to simulate the quantum antiferromagnetic Heisenberg model coupled to an external uniform magnetic field, both in one and in two dimensions. In the infinite-volume limit and at zero temperature we found numerical evidence that supports a quantum phase transition very close to the critical values $B_{c}=2$ and $B_{c}=4$ for the system in one and two dimensions, respectively. For the one-dimensional system, we have compared the numerical data with analytical predictions for the magnetization density as a function of the external field, obtained by scaling-behaviour analysis and Bethe Ansatz techniques. Since there is no analytical solution for the two-dimensional case, we have compared our results with the magnetization density obtained by scaling relations for small lattice sizes and with the zero-temperature thermodynamic limit estimated from scaling relations. Moreover, we have compared the numerical data with other numerical simulations performed using different algorithms in one and two dimensions, such as the directed loop method. The numerical data obtained are in perfect agreement with all these previous results, which confirms that the meron-cluster algorithm is reliable for quantum Monte Carlo simulations and applicable both in one and two dimensions. Finally, we have computed the integrated autocorrelation time to measure the efficiency of the meron-cluster algorithm in one dimension. Comment: 18 pages, 11 figure
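    For reference, the model being simulated is the antiferromagnetic Heisenberg Hamiltonian in a uniform field; the abstract does not write it out, so the form below assumes the usual conventions with the exchange coupling $J$ set to 1.

```latex
% Antiferromagnetic Heisenberg model in a uniform magnetic field:
\begin{equation}
  H = J \sum_{\langle i,j \rangle} \vec S_i \cdot \vec S_j \;-\; B \sum_i S_i^z ,
  \qquad J > 0 .
\end{equation}
% For spin-1/2 the saturation field is B_c = zJ, with z the coordination
% number: B_c = 2J for the chain (z = 2) and B_c = 4J for the square lattice
% (z = 4), consistent with the critical values B_c = 2 and B_c = 4 quoted
% above in units J = 1.
```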

    The string swampland constraints require multi-field inflation

    An important unsolved problem that affects practically all attempts to connect string theory to cosmology and phenomenology is how to distinguish effective field theories belonging to the string landscape from those that are not consistent with a quantum theory of gravity at high energies (the "string swampland"). It was recently proposed that potentials of the string landscape must satisfy at least two conditions, the "swampland criteria", that severely restrict the types of cosmological dynamics they can sustain. The first criterion states that the (multi-field) effective field theory description is only valid over a field displacement $\Delta \phi \leq \Delta \sim \mathcal{O}(1)$ (in units where the Planck mass is 1), measured as a distance in the target-space geometry. A second, more recent, criterion asserts that, whenever the potential $V$ is positive, its slope must be bounded from below, and suggests $|\nabla V| / V \geq c \sim \mathcal{O}(1)$. A recent analysis concluded that these two conditions taken together practically rule out slow-roll models of inflation. In this note we show that the two conditions rule out inflationary backgrounds that follow geodesic trajectories in field space, but not those following curved, non-geodesic trajectories (which are parametrized by a non-vanishing bending rate $\Omega$ of the multi-field trajectory). We derive a universal lower bound on $\Omega$ (relative to the Hubble parameter $H$) as a function of $\Delta$, $c$ and the number of e-folds $N_e$, assumed to be at least of order 60. If later studies confirm $c$ and $\Delta$ to be strictly $\mathcal{O}(1)$, the bound implies strong turns with $\Omega / H \geq 3 N_e \sim 180$. Slow-roll inflation in the landscape is not ruled out, but it is strongly multi-field. Comment: v1: 15 pages; v2: 16 pages, references added, improved discussions, version accepted for publication in JCA
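    Collecting the quantities named in the abstract, the two swampland criteria and the quoted turning-rate bound can be summarized as follows (Planck units; the derivation of the bound is in the paper and is not reproduced here).

```latex
% Swampland criteria (Planck units, M_Pl = 1):
\begin{align}
  \Delta\phi &\;\leq\; \Delta \sim \mathcal{O}(1)
      && \text{(field-range criterion)} \\
  \frac{|\nabla V|}{V} &\;\geq\; c \sim \mathcal{O}(1), \quad V > 0
      && \text{(de Sitter criterion)}
\end{align}
% Quoted lower bound on the bending rate of the inflationary trajectory,
% for c and Delta strictly of order one:
\begin{equation}
  \frac{\Omega}{H} \;\geq\; 3 N_e
  \quad\xrightarrow{\;N_e \simeq 60\;}\quad
  \frac{\Omega}{H} \gtrsim 180 .
\end{equation}
% Geodesic (effectively single-field) trajectories have Omega = 0 and are
% therefore excluded in this regime, which is the sense in which slow-roll
% inflation in the landscape must be strongly multi-field.
```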

    Sensitivity-based multistep MPC for embedded systems

    In model predictive control (MPC), an optimization problem is solved at every sampling instant to determine an optimal control input for a physical system. We aim to accelerate this procedure for applications with fast dynamics and to address the challenge of implementing the resulting MPC scheme on an embedded system with limited computing power. We present sensitivity-based multistep MPC, a strategy which considerably reduces the computing requirements in terms of floating point operations (FLOPs) compared to a standard MPC formulation, while fulfilling closed-loop performance expectations. We illustrate the method by applying it to a DC-DC converter model and show how a designer can optimally trade off closed-loop performance against computing requirements in order to fit the controller into a resource-constrained embedded system.
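    To make the multistep idea concrete, below is a minimal Python sketch on an unconstrained linear-quadratic toy problem: the full optimization is carried out only once every M samples, and in between the stored open-loop inputs are corrected using first-order sensitivities of the optimal solution, which in this unconstrained LQ setting coincide with the finite-horizon Riccati feedback gains. The plant, weights, horizon, and disturbance level are illustrative assumptions, not taken from the paper, and the DC-DC converter case study is not reproduced here.

```python
# Minimal sketch (not the paper's implementation) of sensitivity-based
# multistep MPC on an unconstrained linear-quadratic toy problem.
import numpy as np

A = np.array([[1.0, 0.1], [0.0, 1.0]])   # hypothetical double-integrator plant
B = np.array([[0.005], [0.1]])
Q, R = np.eye(2), 0.1 * np.eye(1)        # stage cost weights (assumed)
N, M = 20, 5                             # horizon / re-optimization period

def riccati_gains(A, B, Q, R, N):
    """Backward Riccati recursion; gains[j] maps a state deviation at stage j
    of the horizon to the corresponding optimal input correction."""
    P, gains = Q.copy(), []
    for _ in range(N):
        K = np.linalg.solve(R + B.T @ P @ B, B.T @ P @ A)
        P = Q + A.T @ P @ (A - B @ K)
        gains.append(K)
    return gains[::-1]                   # gains[0] is the stage-0 gain

K = riccati_gains(A, B, Q, R, N)

def plan(x0):
    """'Full' optimization: optimal open-loop inputs and predicted states.
    In the real scheme this is the expensive step (a constrained QP/NLP);
    here the LQ toy makes its solution explicit."""
    xs, us, x = [x0], [], x0
    for j in range(N):
        u = -K[j] @ x
        x = A @ x + B @ u
        us.append(u)
        xs.append(x)
    return us, xs

rng = np.random.default_rng(0)
x = np.array([1.0, 0.0])
for t in range(60):
    j = t % M
    if j == 0:                           # expensive step, every M samples only
        us, xs = plan(x)
    # cheap step: stored input plus sensitivity correction for the deviation
    # between the measured and the predicted state at this stage
    u = us[j] - K[j] @ (x - xs[j])
    x = A @ x + B @ u + 0.01 * rng.standard_normal(2)   # disturbed plant
print("final state:", x)
```

    The FLOP trade-off the paper quantifies shows up here in caricature: the full optimization runs once every M samples, while each in-between update costs only a small matrix-vector product.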

    Towards an Ontology Metadata Standard

    In this poster, we present (i) a proposal for a metadata standard, the Ontology Metadata Vocabulary (OMV), which is based on discussions in the EU IST thematic network of excellence Knowledge Web, and (ii) two complementary reference implementations which show the benefit of such a standard in decentralized and centralized scenarios, namely the Oyster P2P system and the Onthology metadata portal.

    Measurement of uncertainty costs with dynamic traffic simulations

    Non-recurrent congestion in transportation networks occurs as a consequence of stochastic factors affecting demand and supply. Intelligent Transportation Systems such as Advanced Traveler Information Systems (ATIS) and Advanced Traffic Management Systems (ATMS) are designed to reduce the impacts of non-recurrent congestion by providing information to a fraction of users or by controlling the variability of traffic flows. For these reasons, the design of ATIS and ATMS requires reliable forecasts of non-recurrent congestion. This paper proposes a new method to measure the impacts of non-recurrent congestion on travel costs by taking risk aversion into account. The traffic model is based on the dynamic traffic simulation model METROPOLIS. Incidents are generated randomly by reducing the capacity of the network. Users can instantaneously adapt to the unexpected travel conditions or can change their behavior via a day-to-day adjustment process. Comparisons with incident-free simulations provide a benchmark for the potential travel time savings that could be achieved by a state-of-the-art information system. We measure the impact of variable travel conditions through the willingness to pay to avoid risky or unreliable journeys. Indeed, for risk-averse drivers, any uncertainty corresponds to a utility loss. This utility loss is computed for several levels of network disruption. The main result of the paper is that the utility loss due to uncertainty is of the same order of magnitude as the total travel costs.
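    As a toy illustration of the "uncertainty cost" being measured (not the METROPOLIS model itself), the sketch below compares the cost of an uncertain travel-time distribution with that of its incident-free counterpart for a risk-averse traveller. The exponential (CARA) disutility, the value of time, and the incident distribution are all assumptions made purely for illustration.

```python
# Toy illustration of the utility loss due to travel-time uncertainty for a
# risk-averse traveller (all numbers and the CARA disutility are assumptions).
import numpy as np

rng = np.random.default_rng(1)
alpha = 0.6            # value of travel time (cost units per minute), assumed
r = 0.05               # risk-aversion coefficient, assumed
n = 10_000             # simulated days

# Travel times: incident-free days vs. days with random incidents
t_free = rng.normal(30, 1, n)                         # minutes
incident = rng.random(n) < 0.2                        # 20% of days disrupted
t_uncertain = np.where(incident,
                       rng.normal(55, 10, n),         # disrupted days
                       rng.normal(30, 1, n))          # normal days

def certainty_equivalent(t):
    """Travel time a CARA traveller would accept for sure in place of the
    random travel time t; always >= the mean for a risk-averse traveller."""
    return np.log(np.mean(np.exp(r * t))) / r

cost_mean = alpha * (t_uncertain.mean() - t_free.mean())          # risk-neutral
cost_ce = alpha * (certainty_equivalent(t_uncertain)
                   - certainty_equivalent(t_free))                # risk-averse
print(f"extra expected travel cost          : {cost_mean:6.2f}")
print(f"extra cost including risk premium   : {cost_ce:6.2f}")
print(f"utility loss from uncertainty alone : {cost_ce - cost_mean:6.2f}")
```

    The gap between the risk-averse and risk-neutral figures plays the role of the utility loss due to uncertainty that the paper finds to be of the same order of magnitude as the total travel costs.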