
    The Vector Curvaton

    We analyze a massive vector field with a non-canonical kinetic term in the action, minimally coupled to gravity, in which the mass and the kinetic function of the vector field vary with time during inflation. The vector field is introduced following the same idea as the scalar curvaton: it must not affect the inflationary dynamics, since its energy density during inflation is negligible compared to the total energy density of the Universe. Under this hypothesis, the vector curvaton is solely responsible for generating the primordial curvature perturbation \zeta. We find that the spectra of the vector field perturbations are scale-invariant on superhorizon scales for a suitable choice of the time dependence of the kinetic function and of the effective mass during inflation. The preferred direction generated by the vector field makes the spectrum of \zeta depend on the direction of the wavevector, i.e. there is statistical anisotropy in \zeta. We discuss principally the case in which the mass of the vector field increases with time during inflation, so that the field can be heavy (M >> H) at the end of inflation, making the particle production practically isotropic; the longitudinal and transverse spectra are then of nearly the same order, which keeps the statistical anisotropy generated by the vector field within the observational bounds. Comment: LaTeX file in aipproc style, 6 pages, no figures. Prepared for the conference proceedings of the IX Mexican School of the DGFM-SMF: Cosmology for the XXIst Century. This work is entirely based on Refs. [23-26] and is the result of Andres A. Navarro's MSc thesis.
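    The abstract does not reproduce the model itself. For orientation, vector curvaton scenarios of this type are usually built on an action of the following form, with a time-dependent kinetic function f and mass m; this is a sketch of the standard parametrization, not necessarily the authors' exact conventions:

```latex
% Standard non-canonical vector curvaton action (sketch):
% f(t) is the kinetic function, m(t) the mass, both varying during
% inflation; F_{\mu\nu} is the field strength of the vector A_\mu.
S = \int \mathrm{d}^4x \, \sqrt{-g}
    \left[ -\frac{1}{4}\, f(t)\, F_{\mu\nu} F^{\mu\nu}
           + \frac{1}{2}\, m^2(t)\, A_\mu A^\mu \right],
\qquad
F_{\mu\nu} = \partial_\mu A_\nu - \partial_\nu A_\mu .
```

    Scale invariance of the perturbation spectra then constrains how f and m may evolve with the scale factor during inflation, which is the tuning the abstract refers to.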

    On the Credibility of the Irish Pound in the EMS

    This paper assesses the degree of credibility of the Irish Pound in the European Monetary System between 1983 and 1997. Different credibility indicators proposed in the literature are used to measure agents’ perceptions of the credibility of the ERM commitment in an attempt to distinguish between events stemming from problems in the ERM itself and those that appear to have been exclusive to Ireland.
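    The abstract does not name the credibility indicators used. As an illustration only, one indicator common in this literature is Svensson's "simplest test": under uncovered interest parity, the domestic-foreign interest differential implies an expected future spot rate, and if that rate falls outside the ERM fluctuation band, the band cannot be fully credible. A minimal sketch with invented numbers (the parity, band width, and rates below are hypothetical, not the paper's data):

```python
# Svensson's "simplest test" of target-zone credibility (sketch).
# Under uncovered interest parity, the interest differential implies
# an expected future spot rate; a rate outside the ERM band signals
# that the band is not fully credible.

def simplest_test(spot, lower, upper, i_dom, i_for, horizon_years):
    """Return True if the implied future spot rate stays in the band."""
    implied = spot * ((1 + i_dom) / (1 + i_for)) ** horizon_years
    return lower <= implied <= upper

# Hypothetical numbers: a central parity with a +/-2.25% band and a
# 3-point Irish-German interest differential at a one-year horizon.
central = 2.6789
print(simplest_test(
    spot=central,
    lower=central * (1 - 0.0225),
    upper=central * (1 + 0.0225),
    i_dom=0.10,
    i_for=0.07,
    horizon_years=1.0,
))  # -> False: the differential is too wide for the band to be credible
```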

    Numerical study of electrostatically-defined quantum dots in bilayer graphene

    Official Master's degree in Quantum Science and Technology (Màster Oficial de Ciùncia i Tecnologia Quàntiques), Facultat de Física, Universitat de Barcelona. Academic year: 2022-2023. Tutor: Iacopo Torre. Interacting quantum many-body systems are so challenging to study that even simplified models, such as the Hubbard model, cannot be solved exactly. For this reason, it is interesting to engineer controllable quantum systems, called quantum simulators, that can emulate the behavior of these models, making them a promising platform for studying the Hubbard model. Quantum simulators can be implemented, for example, using interacting arrays of quantum dots realized in semiconducting materials. The capability to tune the bands of bilayer graphene with patterned gate electrodes provides an innovative platform for studying such a model, as it would be the first realization of the Hubbard model with quantum dots in a two-dimensional material. Moreover, this platform opens a wide range of possibilities for studying the different parameters of the model. In this work, we study theoretically and numerically realistic models of electrostatically defined quantum dots in bilayer graphene. By solving the Poisson equation, we calculate the potential and band-gap landscape that the proposed device induces in bilayer graphene. The result is then fed into a low-energy model to calculate the bound states of the quantum dots. This allows us to calculate the parameters of the corresponding Hubbard model, including tunneling amplitudes and on-site interactions. Our results can be directly used to design quantum-simulation devices based on quantum dots realized electrostatically in bilayer graphene.
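    The abstract describes a pipeline: solve the Poisson equation for the gate stack, translate the result into a band-gap landscape, and feed that into a low-energy model. A heavily simplified sketch of the first two steps (1D electrostatics, uniform dielectric, no screening by the bilayer; geometry and voltages are invented, and the gap formula is the standard two-band result for gapped bilayer graphene):

```python
# Minimal sketch of the electrostatics-to-gap step, under strong
# simplifying assumptions; device parameters are hypothetical.
import numpy as np

# --- 1) Poisson (here Laplace) equation between two parallel gates ---
# With no free charge and a uniform dielectric, V(z) is linear; a real
# patterned-gate device needs a numerical 2D/3D Poisson solve instead.
L = 100e-9                     # gate-to-gate distance (m), assumed
z = np.linspace(0.0, L, 1001)
V_top, V_bottom = 1.0, -1.0    # gate voltages (V), assumed
V = V_bottom + (V_top - V_bottom) * z / L
E_field = (V_top - V_bottom) / L          # vertical field (V/m)

# --- 2) Field -> interlayer asymmetry U of bilayer graphene ---
d = 0.33e-9                    # interlayer distance (m)
U = E_field * d                # e*E*d, in eV per unit charge

# --- 3) Band gap of gapped bilayer graphene (two-band relation) ---
gamma1 = 0.38                  # interlayer hopping (eV)
E_gap = abs(U) * gamma1 / np.sqrt(gamma1**2 + U**2)
print(f"asymmetry U = {U*1e3:.2f} meV, gap = {E_gap*1e3:.2f} meV")
```

    In the work described above, the resulting band-gap landscape would then define the confinement potential whose bound states yield the Hubbard parameters.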

    Technological research in the EU is less efficient than the world average. EU research policy risks Europeans' future

    We have studied the efficiency of research in the EU by a percentile-based citation approach that analyzes the distribution of a country's papers among the world's papers. Going up the citation scale, the frequency of papers from efficient countries increases while the frequency from inefficient countries decreases. In the percentile-based approach this trend, which persists at every citation level, is measured by the ep index, defined as the Ptop 1%/Ptop 10% ratio. Using the ep index, we demonstrate that EU research on fast-evolving technological topics is less efficient than the world average and that the EU is far from being able to compete with the most advanced countries. The ep index also shows that the USA is well ahead of the EU in both fast- and slow-evolving technologies, which suggests that the advantage of the USA over the EU in innovation is due to low research efficiency in the EU. In accord with some previous studies, our results show that the European Commission's ongoing claims about the excellence of EU research are based on an incorrect diagnosis. The EU must focus its research policy on improving its inefficient research. Otherwise, the future of Europeans is at risk. Comment: 30 pages, 3 figures, 7 tables, in one single file. Version accepted in Journal of Informetrics.
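    As a concrete illustration of the ep index, a sketch on synthetic lognormally distributed citation counts (all parameters invented; by construction the world as a whole has ep = 0.1):

```python
# Sketch of the ep index: the ratio of a country's papers in the
# world top 1% to its papers in the world top 10% by citations.
# Citation counts here are synthetic (lognormal), not real data.
import numpy as np

rng = np.random.default_rng(0)
world = rng.lognormal(mean=1.0, sigma=1.1, size=200_000)
country = rng.lognormal(mean=1.2, sigma=1.1, size=10_000)

top10_cut = np.quantile(world, 0.90)  # citation cut for world top 10%
top1_cut = np.quantile(world, 0.99)   # citation cut for world top 1%

p_top10 = np.sum(country >= top10_cut)
p_top1 = np.sum(country >= top1_cut)
ep = p_top1 / p_top10                 # ep = Ptop 1% / Ptop 10%
print(f"Ptop10%={p_top10}, Ptop1%={p_top1}, ep={ep:.3f}")
```

    A country with ep above 0.1 is over-represented at the top of the citation scale relative to the world, which is what the abstract means by efficient.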

    Common bibliometric approaches fail to assess correctly the number of important scientific advances for most countries and institutions

    Although not explicitly declared, most research rankings of countries and institutions are supposed to reveal their contribution to the advancement of knowledge. However, such advances are based on very highly cited publications that occur with very low frequency and can only very exceptionally be counted with statistical reliability. Percentile indicators enable calculations of the probability or frequency of such rare publications from counts of much more frequent publications; the general rule is that rankings based on the number of top 10% or top 1% cited publications (Ptop 10%, Ptop 1%) will also be valid for the rare publications that push the boundaries of knowledge. Japan and its universities are exceptions, as their frequent Nobel Prizes contradict their low Ptop 10% and Ptop 1%. We explain that this occurs because, in single research fields, the singularity of percentile indicators holds only for research groups that are homogeneous in their aims and efficiency. Correct calculations for ranking countries and institutions should add up the results of their homogeneous groups instead of treating all publications as a single set. Although based on Japan, our findings have a general character: common predictions of scientific advances based on Ptop 10% might be severalfold lower than correct calculations. Comment: 30 pages, tables and figures embedded in a single PDF file.
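    The aggregation point can be made concrete with synthetic data: for a "country" made of two homogeneous groups of very different efficiency, predicting its very highly cited papers from a single lognormal fitted to the pooled set falls severalfold below the correct per-group sum. All parameters below are invented for illustration:

```python
# Per-group vs pooled prediction of rare, very highly cited papers,
# on synthetic lognormal citation data (invented parameters).
import math
import numpy as np

rng = np.random.default_rng(2)
SIGMA = 1.1
strong = rng.lognormal(2.4, SIGMA, 5_000)    # small, efficient group
weak = rng.lognormal(0.0, SIGMA, 195_000)    # large, average group
pooled = np.concatenate([strong, weak])

# World top-0.01% citation threshold, world assumed lognormal(1, 1.1).
z_rare = 3.719                               # 99.99th pct of N(0, 1)
log_cut = 1.0 + SIGMA * z_rare

def expected_above(n, mu, sigma, log_cut):
    """Expected papers above the threshold for a lognormal group."""
    z = (log_cut - mu) / sigma
    return n * 0.5 * math.erfc(z / math.sqrt(2))

per_group = (expected_above(5_000, 2.4, SIGMA, log_cut)
             + expected_above(195_000, 0.0, SIGMA, log_cut))
mu_fit, sigma_fit = np.log(pooled).mean(), np.log(pooled).std()
single_set = expected_above(len(pooled), mu_fit, sigma_fit, log_cut)
actual = np.sum(np.log(pooled) >= log_cut)

print(f"actual={actual}, per-group sum={per_group:.1f}, "
      f"single-set fit={single_set:.1f}")
```

    The single-set fit misses the rare tail because it is dominated by the bulk of average papers, which is the mechanism the abstract invokes for Japan.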

    Research assessment by percentile-based double rank analysis

    In the double rank analysis of research publications, the local rank position of a country's or institution's publication is expressed as a function of its world rank position. Excluding some very highly or very lowly cited publications, the double rank plot fits a power law well, which can be explained because citations to local and world publications follow lognormal distributions. We report here that the distribution of the number of a country's or institution's publications across world percentiles is a double rank distribution that can be fitted to a power law. Only the data points in high percentiles deviate from it, when the local and world \mu parameters of the lognormal distributions are very different. The likelihood of publishing very highly cited papers can be calculated from the power law fitted either to the upper tail of the citation distribution or to the percentile-based double rank distribution. The great advantage of the latter method is its universal applicability, because it is based on all publications and not just on highly cited ones. Furthermore, this method extends the application of the well-established percentile approach to very low percentiles, where breakthroughs are reported but paper counts cannot be performed. Comment: A pdf file containing text, 9 figures and 4 tables. Accepted in Journal of Informetrics.
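    A sketch of the double rank construction on synthetic lognormal citations (parameters invented): each of a country's papers is paired with its world rank, and the local-versus-world rank relation is fitted to a power law in log-log space, excluding the extremes as the abstract describes:

```python
# Double rank analysis on synthetic citation data: local rank as a
# power-law function of world rank (invented lognormal parameters).
import numpy as np

rng = np.random.default_rng(3)
world_only = rng.lognormal(1.0, 1.1, 100_000)
country = rng.lognormal(1.4, 1.1, 5_000)
world = np.concatenate([world_only, country])  # country is part of the world

# World rank of every paper (1 = most cited in the world).
order = np.argsort(world)[::-1]
world_rank_of = np.empty(len(world), dtype=int)
world_rank_of[order] = np.arange(1, len(world) + 1)
country_world_ranks = np.sort(world_rank_of[-len(country):])
local_rank = np.arange(1, len(country) + 1)

# Power-law fit r_local ~ A * r_world**b, excluding extreme tails.
keep = slice(10, 4_000)
b, logA = np.polyfit(np.log(country_world_ranks[keep]),
                     np.log(local_rank[keep]), 1)
print(f"fitted power-law exponent b = {b:.3f}")
```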

    Rank analysis of most cited publications, a new approach for research assessments

    Citation metrics are the best tools for research assessments. However, current metrics are misleading in research systems that simultaneously pursue different goals, such as the advance of science and incremental innovations, because their publications have different citation distributions. We estimate the contribution to the progress of knowledge by studying only a limited number of the most cited papers, which are dominated by publications pursuing this progress. To field-normalize the metrics, we substitute the rank position of a country's papers in the global list of papers for the number of citations. Using synthetic series of lognormally distributed numbers, we developed the Rk-index, which is calculated from the global ranks of the 10 highest numbers in each series, and demonstrate its equivalence to the number of papers in top percentiles, Ptop 0.1% and Ptop 0.01%. In real cases, the Rk-index is simple and easy to calculate, and it evaluates the contribution to the progress of knowledge much better than commonly used metrics. Although further research is needed, rank analysis of the most cited papers is a promising approach for research evaluation. We also demonstrate that, for this purpose, domestic and collaborative papers should be studied independently. Comment: One PDF file, including figures and tables (31 pages).
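    A sketch of the rank analysis described above, on synthetic lognormal series. The abstract says the Rk-index is computed from the global ranks of a country's 10 most cited papers but does not give the formula, so the aggregation used here (a sum of inverse global ranks) is only an illustrative stand-in, not the authors' definition:

```python
# Global ranks of a country's 10 most cited papers, with an assumed
# aggregation into a single index (synthetic lognormal data).
import numpy as np

rng = np.random.default_rng(4)
rest_of_world = rng.lognormal(1.0, 1.1, 100_000)
country = rng.lognormal(1.5, 1.1, 5_000)

# Global ranks (1 = most cited) of the country's 10 best papers.
world = np.concatenate([rest_of_world, country])
order = np.argsort(world)[::-1]
rank_of = np.empty(len(world), dtype=int)
rank_of[order] = np.arange(1, len(world) + 1)
top10_ranks = np.sort(rank_of[-len(country):])[:10]

rk = np.sum(1.0 / top10_ranks)  # assumed aggregation, see note above
print("global ranks of the 10 most cited papers:", top10_ranks)
print(f"illustrative Rk-index: {rk:.4f}")
```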
    • 

    corecore