
    Cosmologies with a time dependent vacuum

    The idea that the cosmological term, Lambda, should be a time-dependent quantity in cosmology is a most natural one. It is difficult to conceive of an expanding universe with a strictly constant vacuum energy density, namely one that has remained immutable since the origin of time. A smoothly evolving vacuum energy density that inherits its time dependence from cosmological functions, such as the Hubble rate or the scale factor, is not only a qualitatively more plausible and intuitive idea, but is also suggested by fundamental physics, in particular by quantum field theory (QFT) in curved space-time. To implement this notion, it is not strictly necessary to resort to ad hoc scalar fields, as is usually done in the literature (e.g. in quintessence formulations and the like). A "running" Lambda term can be expected on very similar grounds as one expects (and observes) the running of couplings and masses with a physical energy scale in QFT. Furthermore, the experimental evidence that the equation of state of the dark energy could be evolving with time/redshift (including the possibility that it might currently behave phantom-like) suggests that a time-variable Lambda term (possibly accompanied by a variable Newton's gravitational coupling G=G(t)) could account in a natural way for all these features. Remarkably enough, a class of these models (the "new cosmon") could even be the clue to solving the old cosmological constant problem, including the coincidence problem. Comment: LaTeX, 15 pages, 4 figures
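The running-Lambda idea described above admits a simple quantitative illustration. The sketch below assumes the commonly used ansatz rho_Lambda(H) = rho_Lambda0 + (3 nu / 8 pi G)(H^2 - H0^2), for which the Friedmann equation has a closed-form solution; the parameter values are illustrative assumptions, not numbers taken from the abstract.

```python
# Hedged sketch of a "running" vacuum cosmology, assuming the ansatz
# rho_Lambda(H) = rho_Lambda0 + (3*nu/(8*pi*G)) * (H^2 - H0^2).
# Under this ansatz the Friedmann equation admits the closed form below.
# Parameter values (Om0, nu) are illustrative, not from the abstract.

def E_squared(a, Om0=0.3, nu=0.01):
    """Normalized expansion rate E^2 = H^2/H0^2 at scale factor a.

    For nu = 0 this reduces to flat LCDM: E^2 = Om0*a**-3 + (1 - Om0).
    """
    return 1.0 + (Om0 / (1.0 - nu)) * (a ** (-3.0 * (1.0 - nu)) - 1.0)

# By construction the model matches H = H0 today (a = 1) ...
print(E_squared(1.0))           # 1.0
# ... and for small nu deviates only mildly from LCDM in the past.
print(E_squared(0.5, nu=0.0))   # LCDM limit: 1 + 0.3*(8 - 1) = 3.1
```

Setting nu = 0 switches the running off entirely, which makes the LCDM limit a convenient sanity check on the formula.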

    Perturbations in the relaxation mechanism for a large cosmological constant

    Recently, a mechanism for relaxing a large cosmological constant (CC) has been proposed [arXiv:0902.2215], which permits solutions with low Hubble rates at late times without fine-tuning. The setup is implemented in the LXCDM framework, and we find a reasonable cosmological background evolution similar to that of the LCDM model with a fine-tuned CC. In this work we analyse analytically the perturbations in this relaxation model, and we show that their evolution is also similar to that of the LCDM model, especially in the matter era. Some tracking properties of the vacuum energy are discussed, too. Comment: 18 pages, LaTeX; discussion improved, accepted by CQ

    The cosmological constant and the relaxed universe

    We study the role of the cosmological constant (CC) as a component of dark energy (DE). It is argued that the cosmological term is in general unavoidable and should not be ignored even when dynamical DE sources are considered. From the theoretical point of view, quantum zero-point energy and phase transitions suggest a CC of large magnitude, in contrast to its tiny observed value. Simply relieving this discrepancy with a counterterm requires extreme fine-tuning, which is referred to as the old CC problem. To avoid it, we discuss some recent approaches for neutralising a large CC dynamically without adding a fine-tuned counterterm. This can be realised by an effective DE component which relaxes the cosmic expansion by counteracting the effect of the large CC. Alternatively, a CC filter is constructed by modifying gravity to make it insensitive to vacuum energy. Comment: 6 pages, no figures, based on a talk presented at PASCOS 201

    What is there in the black box of dark energy: variable cosmological parameters or multiple (interacting) components?

    The coincidence problems and other dynamical features of dark energy are studied in cosmological models with variable cosmological parameters and in models with composite dark energy. It is found that many of the problems usually considered to be cosmological coincidences can be explained or significantly alleviated in the aforementioned models. Comment: 6 pages, 1 figure, talk given at IRGAC2006 (Barcelona, July 11-15, 2006), to appear in J. Phys.

    Hubble expansion and structure formation in the "running FLRW model" of the cosmic evolution

    A new class of FLRW cosmological models with time-evolving fundamental parameters should emerge naturally from a description of the expansion of the universe based on the first principles of quantum field theory and string theory. Within this general paradigm, one expects that both the gravitational Newton's coupling, G, and the cosmological term, Lambda, should not be strictly constant but appear rather as smooth functions of the Hubble rate. This scenario ("running FLRW model") predicts, in a natural way, the existence of dynamical dark energy without invoking the participation of extraneous scalar fields. In this paper, we perform a detailed study of these models in the light of the latest cosmological data, which serves to illustrate the phenomenological viability of the new dark energy paradigm as a serious alternative to the traditional scalar field approaches. By performing a joint likelihood analysis of the recent SNIa data, the CMB shift parameter, and the BAOs traced by the Sloan Digital Sky Survey, we put tight constraints on the main cosmological parameters. Furthermore, we derive the theoretically predicted dark-matter halo mass function and the corresponding redshift distribution of cluster-size halos for the "running" models studied. Despite the fact that these models closely reproduce the standard LCDM Hubble expansion, the normalization of their perturbation power spectrum varies, imposing, in many cases, a significantly different cluster-size halo redshift distribution. This fact indicates that it should be relatively easy to distinguish between the "running" models and the LCDM cosmology using realistic future X-ray and Sunyaev-Zeldovich cluster surveys. Comment: Version published in JCAP 08 (2011) 007: 1+41 pages, 6 Figures, 1 Table. Typos corrected. Extended discussion on the computation of the linearly extrapolated density threshold above which structures collapse in time-varying vacuum models. One appendix, a few references and one figure added
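The joint likelihood analysis mentioned in the abstract rests on a simple statistical fact: for independent probes, log-likelihoods (and hence Gaussian chi-square terms) add. The sketch below illustrates only that bookkeeping; the data values and model predictions are toy placeholders, not the paper's actual measurements.

```python
import numpy as np

# Hedged sketch of combining independent cosmological probes in one chi^2.
# All numbers below are toy placeholders, not values from the paper.

def chi2_gaussian(model, data, sigma):
    """Gaussian chi^2 for uncorrelated measurements."""
    model, data, sigma = (np.asarray(x, dtype=float) for x in (model, data, sigma))
    return float(np.sum(((model - data) / sigma) ** 2))

def joint_chi2(sn, cmb, bao):
    """Total chi^2 for independent probes: chi^2 contributions simply add.

    Each argument is a (model, data, sigma) triple for one probe, e.g.
    SNIa distance moduli, the CMB shift parameter R, and a BAO observable.
    """
    return sum(chi2_gaussian(*probe) for probe in (sn, cmb, bao))

# Toy example: two supernovae plus one CMB and one BAO measurement.
total = joint_chi2(
    sn=([42.1, 43.0], [42.0, 43.2], [0.15, 0.15]),
    cmb=([1.71], [1.70], [0.02]),
    bao=([0.47], [0.469], [0.017]),
)
print(total)
```

In a real analysis the model values would be functions of the cosmological parameters, and minimizing the total chi-square over those parameters yields the joint constraints.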

    Structural characterization of high temperature composites

    Glass, ceramic, and carbon matrix composite materials have emerged in recent years with potential properties and temperature resistance that make them attractive for high-temperature applications such as gas turbine engines. At the outset of this study, only flexural tests were available to evaluate brittle matrix composites at temperatures in the 600 to 1000 C range. The results of an ongoing effort to develop appropriate tensile, compression, and shear test methods for high-temperature use are described. A tensile test for unidirectional composites was developed and used to evaluate the properties and behavior of ceramic-fiber-reinforced glass and glass-ceramic matrix composites in air at temperatures up to 1000 C. The results indicate generally efficient fiber reinforcement and a tolerance to matrix cracking similar to that of polymer matrix composites. Limiting properties of these materials may be an inherently very low transverse strain to failure and high-temperature embrittlement due to fiber/matrix interface oxidation.

    Anticipating food price crises by reservoir computing

    Anticipating price crises in the market of agri-commodities is critical both to guarantee the sustainability of the food system and to ensure food security. However, this is not an easy task, since the problem involves analyzing small and very volatile time series that are highly influenced by external factors. In this paper, we show that suitable reservoir computing algorithms can be developed that outperform traditional approaches by reducing the Mean Absolute Error and, more importantly, increasing the Market Direction Accuracy. For this purpose, the applicability of five variants of this method to forecasting this market is explored, and their performance is evaluated by comparing the results with those obtained with standard LSTM and SARIMA benchmarks. We conclude that decomposing the time series and modeling each component with a separate RC is essential to successfully anticipate price trends, and that this method works even in the complex, changing temporal scenario of the Covid-19 pandemic, during which part of the data were collected. The project that gave rise to these results received the support of a fellowship from ‘‘la Caixa’’ Foundation (ID 100010434). The fellowship code is LCF/BQ/DR20/11790028. This work has also been partially supported by the Spanish Ministry of Science, Innovation and Universities, Gobierno de España, under Contract No. PID2021-122711NB-C21, and by the DG of Research and Technological Innovation of the Community of Madrid (Spain) under Contract No. IND2022/TIC-2371.
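Reservoir computing, the method family the abstract compares against LSTM and SARIMA, can be illustrated with a minimal echo state network: a fixed random recurrent "reservoir" driven by the input, with only a linear readout trained by ridge regression. Everything below (reservoir size, spectral radius, ridge penalty, the toy sine series standing in for a price series) is an illustrative assumption, not the paper's configuration.

```python
import numpy as np

# Minimal echo-state-network sketch (one reservoir-computing variant).
# Reservoir size, spectral radius, ridge penalty and the toy series are
# illustrative assumptions, not the paper's actual setup.

rng = np.random.default_rng(0)
n_in, n_res = 1, 100

W_in = rng.uniform(-0.5, 0.5, (n_res, n_in))
W = rng.uniform(-0.5, 0.5, (n_res, n_res))
W *= 0.9 / np.max(np.abs(np.linalg.eigvals(W)))  # spectral radius < 1 (echo-state property)

def run_reservoir(u):
    """Drive the fixed random reservoir with input series u; collect states."""
    x = np.zeros(n_res)
    states = []
    for u_t in u:
        x = np.tanh(W_in @ np.atleast_1d(u_t) + W @ x)
        states.append(x.copy())
    return np.array(states)

# One-step-ahead prediction on a toy series: only the readout is trained.
series = np.sin(np.linspace(0, 20 * np.pi, 1000))
X = run_reservoir(series[:-1])   # reservoir states
Y = series[1:]                   # next-step targets
ridge = 1e-6
W_out = np.linalg.solve(X.T @ X + ridge * np.eye(n_res), X.T @ Y)
pred = X @ W_out

mae = np.mean(np.abs(pred[200:] - Y[200:]))  # MAE after a washout period
print(mae)
```

The decomposition strategy the authors describe would apply one such reservoir per extracted component (trend, seasonal, residual) and sum the component forecasts.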

    Dynamically avoiding fine-tuning the cosmological constant: the "Relaxed Universe"

    We demonstrate that there exists a large class of action functionals of the scalar curvature and of the Gauss-Bonnet invariant which are able to relax dynamically a large cosmological constant (CC), whatever its starting value in the early universe. Hence, it is possible to understand, without fine-tuning, the very small current value of the CC as compared to its theoretically expected large value in quantum field theory and string theory. In our framework, this relaxation appears as a pure gravitational effect, where no ad hoc scalar fields are needed. The action involves a positive power of a characteristic mass parameter, M, whose value can be, interestingly enough, of the order of a typical particle physics mass of the Standard Model of the strong and electroweak interactions or extensions thereof, including the neutrino mass. The model universe emerging from this scenario (the "Relaxed Universe") falls within the class of the so-called LXCDM models of the cosmic evolution. Therefore, there is a "cosmon" entity X (represented by an effective object, not a field), which in this case is generated by the effective functional and is responsible for the dynamical adjustment of the cosmological constant. This model universe successfully mimics the essential past epochs of the standard (or "concordance") cosmological model (LCDM). Furthermore, it provides interesting clues to the coincidence problem and may even connect naturally with primordial inflation. Comment: LaTeX, 63 pp, 8 figures. Extended discussion. Version accepted in JCA

    The J_1-J_2 antiferromagnet with Dzyaloshinskii-Moriya interaction on the square lattice: An exact diagonalization study

    We examine the influence of an anisotropic interaction term of Dzyaloshinskii-Moriya (DM) type on the ground-state ordering of the J_1-J_2 spin-1/2 Heisenberg antiferromagnet on the square lattice. For the DM term we consider several symmetries corresponding to different crystal structures. For the pure J_1-J_2 model there are strong indications of a quantum spin liquid in the region 0.4 < J_2/J_1 < 0.65. We find that a DM interaction influences the breakdown of the conventional antiferromagnetic order by (i) shifting the spin-liquid region, (ii) changing the isotropic character of the ground state towards anisotropic correlations, and (iii) creating, for certain symmetries, a net ferromagnetic moment. Comment: 7 pages, RevTeX, 6 ps-figures, to appear in J. Phys.: Cond. Ma
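The exact diagonalization technique named in the title can be demonstrated on a system small enough to check by hand: a four-site spin-1/2 Heisenberg ring. For brevity this sketch omits the J_2 and DM terms the paper studies (a DM bond would add D · (S_i x S_j) to each bond); the ground-state energy of the plain ring is known exactly to be E0 = -2J, which serves as a check on the construction.

```python
import numpy as np

# Hedged exact-diagonalization sketch: 4-site spin-1/2 Heisenberg ring only.
# The paper's J_2 and Dzyaloshinskii-Moriya terms are omitted for brevity.

J = 1.0
# Spin-1/2 operators (hbar = 1).
sx = np.array([[0, 1], [1, 0]], dtype=complex) / 2
sy = np.array([[0, -1j], [1j, 0]]) / 2
sz = np.array([[1, 0], [0, -1]], dtype=complex) / 2
I2 = np.eye(2, dtype=complex)

def site_op(op, i, n):
    """Embed a single-site operator at site i of an n-site system via Kronecker products."""
    out = np.array([[1.0 + 0j]])
    for k in range(n):
        out = np.kron(out, op if k == i else I2)
    return out

n = 4
H = np.zeros((2 ** n, 2 ** n), dtype=complex)
for i in range(n):
    j = (i + 1) % n                        # periodic nearest-neighbour bonds
    for s in (sx, sy, sz):
        H += J * site_op(s, i, n) @ site_op(s, j, n)

E0 = np.linalg.eigvalsh(H)[0]
print(E0)   # -2.0: exact singlet ground-state energy of the 4-site ring
```

The same construction scales (with sparse matrices and symmetry sectors) to the lattice sizes used in square-lattice studies; adding J_2 bonds or DM vectors only changes which operator products enter the Hamiltonian sum.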