    Non-Newtonian Gravity, Fluctuative Hypothesis and the Sizes of Astrophysical Structures

    We show that the characteristic sizes of astrophysical and cosmological structures, where gravity is the only overall relevant interaction assembling the system, have a phenomenological relation to the microscopic scales, whose order of magnitude is essentially set by the Compton wavelength of the proton. This result agrees with the absence of screening mechanisms for the gravitational interaction and could be connected to the presence of Yukawa correction terms in the Newtonian potential, which introduce typical interaction lengths. Furthermore, we are able to justify, in a straightforward way, the mass of the vector boson postulated by Sanders in order to obtain the characteristic sizes of galaxies.
    Comment: 11 pages. To appear in Mod. Phys. Lett.
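    For context, a Yukawa correction of the kind mentioned above is commonly parametrized as follows; this is the standard phenomenological form, not necessarily the exact expression used in the paper:

        V(r) = -\frac{G M m}{r} \left( 1 + \alpha \, e^{-r/\lambda} \right)

    Here \alpha sets the strength of the correction relative to the Newtonian term and \lambda is the typical interaction length; for a correction mediated by a boson of mass m_b, \lambda = \hbar/(m_b c), i.e., its Compton wavelength.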

    Constraining f(R) gravity by the Large Scale Structure

    Over the past decades, General Relativity and the concordance ΛCDM model have been successfully tested using several different astrophysical and cosmological probes based on large datasets (precision cosmology). Despite their successes, some shortcomings emerge, due to the fact that General Relativity should be revised at the infrared and ultraviolet limits and that the fundamental nature of Dark Matter and Dark Energy is still a puzzle to be solved. In this perspective, f(R) gravity has been extensively investigated, being the most straightforward way to modify General Relativity and to overcome some of the above shortcomings. In this paper, we review various aspects of f(R) gravity at extragalactic and cosmological levels. In particular, we consider clusters of galaxies, cosmological perturbations, and N-body simulations, focusing on those models that satisfy both cosmological and local gravity constraints. The perspective is that some classes of f(R) models can be consistently constrained by the Large Scale Structure.
    Comment: 37 pages, 3 tables, 6 figures. Invited review for the Special Issue "Modified Gravity Cosmology: From Inflation to Dark Energy". The manuscript matches the accepted version. References updated.
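    For reference, f(R) gravity replaces the Ricci scalar R in the Einstein-Hilbert action by a generic function f(R); in the standard metric formulation the action and field equations read

        S = \frac{1}{2\kappa^2} \int d^4x \, \sqrt{-g} \, f(R) + S_m

        f'(R) R_{\mu\nu} - \frac{1}{2} f(R) g_{\mu\nu} - \left[ \nabla_\mu \nabla_\nu - g_{\mu\nu} \Box \right] f'(R) = \kappa^2 T_{\mu\nu}

    which reduce to General Relativity for f(R) = R. Viable models are those whose extra terms act on cosmological scales while remaining screened in local gravity tests, which is the class of models the review focuses on.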

    Inference of Planck action constant by a classical fluctuative postulate holding for stable microscopic and macroscopic dynamical systems

    We discuss the possibility of inferring or simulating some aspects of quantum dynamics by adding classical statistical fluctuations to classical mechanics. We introduce a general principle of mechanical stability and derive a necessary condition for classical chaotic fluctuations to affect confined dynamical systems on any scale, ranging from microscopic to macroscopic domains. As a consequence we obtain, both for microscopic and macroscopic aggregates, dimensional relations defining the minimum unit of action of individual constituents, yielding in all cases the Planck action constant.
    Comment: 14 pages, no figures
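    As an illustrative dimensional check (not the paper's derivation), the action scale of a constituent of mass m, with typical velocity v, confined over a length L, is A ~ m v L. For the electron in the hydrogen ground state, v = \alpha c and L = a_0 = \hbar/(m_e \alpha c), so

        A \sim m_e (\alpha c) \, a_0 = m_e (\alpha c) \, \frac{\hbar}{m_e \alpha c} = \hbar

    i.e., the minimum unit of action of the individual constituent comes out at Planck's constant, the scale the abstract claims for stable systems in all domains.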

    Theory of controlled quantum dynamics

    We introduce a general formalism, based on the stochastic formulation of quantum mechanics, to obtain localized quasi-classical wave packets as dynamically controlled systems, for arbitrary anharmonic potentials. The control is in general linear, and it amounts to introducing additional quadratic and linear time-dependent terms in the given potential. In this way one can construct, for general systems, either coherent packets moving with constant dispersion, or dynamically squeezed packets whose spreading remains bounded for all times. In the standard operatorial framework our scheme corresponds to a suitable generalization of the displacement and scaling operators that generate the coherent and squeezed states of the harmonic oscillator.
    Comment: LaTeX, A4wide, 28 pages, no figures. To appear in J. Phys. A: Math. Gen., April 199
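    In the notation suggested by the abstract, the controlled potential would take the form below, with b(t) and c(t) fixed by the control scheme (the coefficient names are ours, not the paper's):

        V_c(x, t) = V(x) + b(t) \, x + c(t) \, x^2

    For the harmonic oscillator itself, the corresponding generators are the standard displacement and squeezing operators,

        D(\alpha) = \exp(\alpha a^\dagger - \alpha^* a), \qquad S(\zeta) = \exp\left( \tfrac{1}{2} \zeta^* a^2 - \tfrac{1}{2} \zeta (a^\dagger)^2 \right)

    whose generalization to arbitrary anharmonic potentials is what the scheme provides.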

    Perspectives of a web-based software to improve crash data quality and reliability in Italy

    Real-world crash data play a vital part in the development of safer transport, since information on crash data is essential as a means of understanding where and why crashes occurred in the past and how the occurrence of similar events may be prevented in the future. Crash databases provide the basic information for effective highway safety management, but several existing databases show significant drawbacks which hinder their effective use for safety analysis and improvement. In Italy, the national crash database is maintained by the National Institute of Statistics (ISTAT) and presents major issues related to the crash report form, the crash classification, the crash location, and the crash severity. Moreover, almost all police departments use an out-of-date paper form that is not in line with national and international needs. Modern technologies offer potential for significant improvements of existing methods and procedures for crash data collection, processing, and analysis. To address these issues, in this paper we present the development and evaluation of a web-based, platform-independent software for crash data collection, processing, and analysis, named ReGIS. The software is designed for mobile and desktop electronic devices and enables a guided and automated drafting of the crash report, assisting police officers both on-site and in the office. The software development was based both on a detailed critical review of existing Australasian, EU, and U.S. crash databases and software and on continuous consultation with the stakeholders. The evaluation was carried out by comparing the completeness, timeliness, and accuracy of crash data before and after the use of the software in the city of Vico Equense, in southern Italy, and showed significant advantages. The amount of collected information increased from 82 variables to 268 variables, i.e., a 227% increase. The time saving was more than one hour per crash, i.e., a 36% reduction. The on-site data collection did not produce time savings; however, this is a temporary drawback that is expected to disappear once officers become more acquainted with the software. The phase of evaluation, processing, and analysis carried out in the office was dramatically shortened, i.e., a 69% reduction. Another benefit was the standardization, which allowed fast and consistent data analysis and evaluation. While all these benefits are remarkable, the most valuable benefit of the new procedure was the reduction of police officers' mistakes during the manual operations of survey and data evaluation. Because of these benefits, the satisfaction questionnaires administered to the police officers after the testing phase showed very good acceptance of the procedure.
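    The abstract describes guided, validated drafting of crash reports but does not specify the underlying data model. Purely as an illustration of the kind of checks such a tool performs, here is a minimal Python sketch; every class, field, and category name in it is hypothetical, not taken from ReGIS:

        # Minimal illustrative sketch: a crash-report record with the kind of
        # field validation a guided form performs before submission.
        # All names, fields, and categories are hypothetical, not ReGIS's own.
        from dataclasses import dataclass
        from datetime import datetime

        SEVERITY_LEVELS = {"property damage only", "injury", "fatality"}

        @dataclass
        class CrashReport:
            crash_id: str
            timestamp: datetime
            latitude: float
            longitude: float
            severity: str
            vehicles_involved: int
            notes: str = ""

            def validate(self) -> list[str]:
                """Return validation errors; an empty list means the report is complete."""
                errors = []
                if self.severity not in SEVERITY_LEVELS:
                    errors.append(f"unknown severity: {self.severity!r}")
                if not -90.0 <= self.latitude <= 90.0:
                    errors.append("latitude out of range")
                if not -180.0 <= self.longitude <= 180.0:
                    errors.append("longitude out of range")
                if self.vehicles_involved < 1:
                    errors.append("at least one vehicle must be involved")
                return errors

        # A well-formed report passes validation with no errors; an incomplete
        # one is rejected on-site instead of reaching the office stage.
        report = CrashReport(
            crash_id="VE-2024-0042",
            timestamp=datetime(2024, 5, 17, 8, 30),
            latitude=40.66,
            longitude=14.43,
            severity="injury",
            vehicles_involved=2,
        )
        assert report.validate() == []

    Catching such errors at entry time, rather than during office-stage review, is consistent with the reported reduction in manual survey and evaluation mistakes.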