
    A cosmic vector for dark energy

    In this work we show that the presence of a vector field on cosmological scales could explain the present phase of accelerated expansion of the universe. The proposed theory contains neither dimensional parameters nor potential terms and does not require unnatural initial conditions in the early universe, thus avoiding the so-called cosmic coincidence problem. In addition, it fits the data from high-redshift supernovae with excellent precision, making definite predictions for cosmological parameters. Upcoming observations will be able to clearly discriminate this model from standard cosmology with a cosmological constant.

    Comment: 5 pages, 3 figures, 1 table. New comments and references included. Final version to appear in Phys. Rev.

    A reflective characterisation of the occasional user

    This work revisits established user classifications and aims to characterise a historically unspecified user category: the occasional user (OU). Three user categories, novice, intermediate and expert, have dominated the work of user interface (UI) designers, researchers and educators for decades. These categories were created around the 1980s to conceptualise users' needs, strategies and goals. Since then, UI paradigm shifts, such as direct manipulation and touch, along with other advances in technology, have given access to people with little computer knowledge. This has produced a diversification of the existing user categories that is not reflected in the literature on traditional user classifications. The findings of this work include a new characterisation of the occasional user, distinguished by the user's uncertainty about repeated use of an interface and little knowledge of its functioning. In addition, the specification of the OU, together with principles and recommendations, will help the UI community to design for users without presuming prospective use or previous knowledge of the UI. The OU is an essential user type for a user-centred design approach that treats interaction with technology as universal, accessible and transparent for the user, independently of accumulated experience and of the technological era the user lives in.

    On the Numerical Accuracy of Spreadsheets

    This paper discusses the numerical precision of five spreadsheets (Calc, Excel, Gnumeric, NeoOffice and Oleo) running on two hardware platforms (i386 and amd64) and on three operating systems (Windows Vista, Ubuntu Intrepid and Mac OS Leopard). The methodology consists of checking the number of correct significant digits returned by each spreadsheet when computing the sample mean, standard deviation, first-order autocorrelation, F statistic in ANOVA tests, linear and nonlinear regression and distribution functions. A discussion about the algorithms for pseudorandom number generation provided by these platforms is also conducted. We conclude that there is no safe choice among the spreadsheets here assessed: they all fail in nonlinear regression and they are not suited for Monte Carlo experiments.
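    The paper's methodology, counting correct significant digits against reference results, can be sketched with the log relative error (LRE) metric. The sketch below is illustrative, not the authors' test suite: the naive one-pass variance formula stands in for the kind of unstable algorithm that makes spreadsheets lose digits, and the two-pass formula serves as the stable reference.

```python
import math

def correct_digits(value, certified):
    """Log relative error (LRE): an estimate of the number of correct
    significant digits in `value` against a certified result."""
    if value == certified:
        return 15.0  # cap at the double-precision limit
    if certified == 0.0:
        lre = -math.log10(abs(value))
    else:
        lre = -math.log10(abs(value - certified) / abs(certified))
    return max(0.0, min(lre, 15.0))

def naive_std(xs):
    """One-pass standard deviation: suffers catastrophic cancellation
    when the mean is large relative to the spread."""
    n = len(xs)
    s = sum(xs)
    s2 = sum(x * x for x in xs)
    var = (s2 - s * s / n) / (n - 1)
    return math.sqrt(max(var, 0.0))  # cancellation can drive var below 0

def twopass_std(xs):
    """Stable two-pass standard deviation used as the reference."""
    n = len(xs)
    m = sum(xs) / n
    return math.sqrt(sum((x - m) ** 2 for x in xs) / (n - 1))

# Data with a large mean and small spread, the classic stress case.
data = [1e9 + x for x in (1.0, 2.0, 3.0, 4.0)]
print(correct_digits(naive_std(data), twopass_std(data)))
```

The same LRE comparison, applied against certified datasets, is how the number of correct significant digits reported in such assessments is typically obtained.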

    Viability of vector-tensor theories of gravity

    We present a detailed study of the viability of general vector-tensor theories of gravity in the presence of an arbitrary temporal background vector field. We find that there are six different classes of theories which are indistinguishable from General Relativity by means of local gravity experiments. We study the propagation speeds of scalar, vector and tensor perturbations and obtain the conditions for classical stability of those models. We compute the energy density of the different modes and find the conditions for the absence of ghosts in the quantum theory. We conclude that the only theories which can pass all the viability conditions for arbitrary values of the background vector field are those of the pure Maxwell type, together with Maxwell theories supplemented with a (Lorentz-type) gauge-fixing term.

    Comment: 13 pages, 2 figures, 1 table. Final version to appear in JCA

    A stigmergy-based analysis of city hotspots to discover trends and anomalies in urban transportation usage

    A key aspect of a sustainable urban transportation system is the effectiveness of transportation policies. To be effective, a policy has to consider a broad range of elements, such as pollution emissions, traffic flow, and human mobility. Because of the complexity and variability of these elements in the urban area, producing effective policies remains a very challenging task. With the introduction of the smart city paradigm, a vast amount of data can be generated in urban spaces. Such data can be a fundamental source of knowledge for improving policies, because they reflect the sustainability issues underlying the city. In this context, we propose an approach to exploit urban positioning data based on stigmergy, a bio-inspired mechanism providing scalar and temporal aggregation of samples. With stigmergy, samples in proximity to each other are aggregated into a functional structure called a trail. The trail summarizes the relevant dynamics in the data and allows matching them, providing a measure of their similarity. Moreover, this mechanism can be specialized to unfold specific dynamics. Specifically, we identify high-density urban areas (i.e., hotspots), analyze their activity over time, and unfold anomalies. By matching activity patterns, a continuous measure of the dissimilarity with respect to the typical activity pattern is also provided. This measure can be used by policy makers to evaluate the effect of policies and change them dynamically. As a case study, we analyze taxi trip data gathered in Manhattan from 2013 to 2015.

    Comment: Preprint
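    The stigmergic aggregation described above can be sketched minimally in Python: each positioning sample deposits "pheromone" in a grid cell, all deposits evaporate over time, and cells with sustained activity accumulate a strong trail. The cell size, deposit amount, and evaporation rate below are illustrative assumptions, not the values used by the authors.

```python
from collections import defaultdict

def stigmergic_trail(samples, evaporation=0.1, deposit=1.0, cell=0.01):
    """Aggregate positioning samples into a pheromone 'trail'.

    samples: iterable of (t, lat, lon), ordered by time t.
    Each sample deposits pheromone in its grid cell; between time
    steps every cell's intensity decays by the evaporation rate,
    so only areas with sustained activity keep a strong trail.
    """
    trail = defaultdict(float)
    prev_t = None
    for t, lat, lon in samples:
        if prev_t is not None:
            decay = (1.0 - evaporation) ** (t - prev_t)
            for k in list(trail):
                trail[k] *= decay  # evaporation of every deposit
        prev_t = t
        key = (round(lat / cell), round(lon / cell))  # spatial binning
        trail[key] += deposit
    return dict(trail)

def hotspots(trail, threshold):
    """Cells whose trail intensity exceeds the threshold."""
    return {k for k, v in trail.items() if v >= threshold}
```

A cell visited repeatedly ends up well above the threshold, while an isolated sample evaporates away, which is exactly the property that lets the trail separate persistent hotspots from noise.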

    SP-Sephadex equilibrium chromatography of bradykinin and related peptides: Application to trypsin-treated human plasma

    An analytical method is described for the separation of bradykinin, Lys-bradykinin, and Met-Lys-bradykinin by equilibrium chromatography on SP-Sephadex C-25 eluted in 0.02 Tris-HCl buffer, pH 8.10, 0.12 NaCl. A second elution buffer, 0.02 Tris-HCl buffer, pH 7.70, 0.06 NaCl, serves as a second parameter for the identification of bradykinin and also separates the hormone from plasma bradykinin-potentiating peptides. Ten to one hundred nanomoles of each peptide can be recovered in high yield, identified by elution position, and measured by bioassay with the isolated guinea pig ileum. The identification of bradykinin as the peptide released by trypsin acting on acid-denatured plasma is documented as an illustration of the method.

    Anderson transition in systems with chiral symmetry

    Anderson localization is a universal quantum feature caused by destructive interference. On the other hand, chiral symmetry is a key ingredient in different problems of theoretical physics: from nonperturbative QCD to highly doped semiconductors. We investigate the interplay of these two phenomena in the context of a three-dimensional disordered system. We show that chiral symmetry induces an Anderson transition (AT) in the region close to the band center. Typical properties at the AT, such as multifractality and critical statistics, are quantitatively affected by this additional symmetry. The origin of the AT has been traced back to the power-law decay of the eigenstates; this feature may also be relevant in systems without chiral symmetry.

    Comment: RevTex4, 4 two-column pages, 3 .eps figures, updated references, final version as published in Phys. Rev.

    Stigmergy-based modeling to discover urban activity patterns from positioning data

    Positioning data offer a remarkable source of information for analyzing the urban dynamics of crowds. However, discovering urban activity patterns from the emergent behavior of crowds involves complex system modeling. An alternative approach is to adopt computational techniques belonging to the emergent paradigm, which enable self-organization of the data and allow adaptive analysis. Specifically, our approach is based on stigmergy: each sample position is associated with a digital pheromone deposit, which progressively evaporates and aggregates with other deposits according to their spatiotemporal proximity. Based on this principle, we exploit positioning data to identify high-density areas (hotspots) and characterize their activity over time. This characterization allows the comparison of dynamics occurring on different days, providing a similarity measure exploitable by clustering techniques. Thus, we cluster days according to their activity behavior, discovering unexpected urban activity patterns. As a case study, we analyze taxi traces in New York City during 2015.
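    The day-by-day comparison described above can be illustrated with a simple sketch: each day's activity is reduced to an hourly profile, profiles are compared by cosine similarity, and days are grouped when their profiles match. The 24-bin representation, the similarity threshold, and the greedy grouping scheme are illustrative assumptions, not the authors' method.

```python
import math

def hourly_profile(event_hours):
    """24-bin activity profile for one day from event hours (0-23)."""
    bins = [0.0] * 24
    for h in event_hours:
        bins[h] += 1.0
    return bins

def cosine_similarity(a, b):
    """Cosine similarity between two activity profiles."""
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(x * x for x in b))
    return dot / (na * nb) if na and nb else 0.0

def cluster_days(profiles, threshold=0.95):
    """Greedy single-pass clustering: assign each day to the first
    cluster whose representative profile is similar enough, else
    start a new cluster. Returns lists of day indices."""
    clusters = []  # list of (representative_profile, [day indices])
    for i, p in enumerate(profiles):
        for rep, members in clusters:
            if cosine_similarity(rep, p) >= threshold:
                members.append(i)
                break
        else:
            clusters.append((p, [i]))
    return [members for _, members in clusters]
```

Days with morning/evening commuter peaks end up in one cluster while nightlife-dominated days form another, which is the kind of separation that surfaces atypical days as candidates for anomalies.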