
    Constraining spatial variations of the fine structure constant using clusters of galaxies and Planck data

    We propose an improved methodology to constrain spatial variations of the fine structure constant using clusters of galaxies. We use the Planck 2013 data to measure the thermal Sunyaev-Zeldovich effect at the location of 618 X-ray selected clusters. We then use a Monte Carlo Markov Chain algorithm to obtain the temperature of the Cosmic Microwave Background at the location of each galaxy cluster. When fitting three different phenomenological parameterizations allowing for monopole and dipole amplitudes in the value of the fine structure constant, we improve on the results of earlier analyses involving clusters and the CMB power spectrum, and we also find that the best-fit direction of a hypothetical dipole is compatible with the direction of other known anomalies. Although the constraining power of our current datasets does not allow us to test the indications of a fine-structure constant dipole obtained through high-resolution optical/UV spectroscopy, our results do highlight that clusters of galaxies will be a very powerful tool to probe fundamental physics at low redshift. Comment: 11 pages, 5 figures and 3 tables. Accepted for publication in Physical Review
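
    As a rough illustration of the kind of fit described above (a minimal sketch, not the paper's actual pipeline), the code below fits a monopole-plus-dipole model to per-cluster estimates of the fine-structure-constant shift by weighted linear least squares. The cluster directions, measurement errors, and "true" dipole are synthetic placeholders.
```python
"""Minimal sketch (not the paper's pipeline): fit a monopole + dipole model
to per-cluster estimates of Delta(alpha)/alpha, assuming each cluster i has
a sky direction n_i (unit vector) and a measurement y_i with error sigma_i.
Model: y_i = m + D . n_i; the dipole amplitude is |D|, its direction D/|D|.
All inputs below are synthetic placeholders."""
import numpy as np

rng = np.random.default_rng(0)

# Synthetic cluster directions (isotropic on the sphere) and fake measurements.
n_clusters = 618
v = rng.normal(size=(n_clusters, 3))
n_hat = v / np.linalg.norm(v, axis=1, keepdims=True)

true_m, true_D = 0.0, np.array([2e-3, 0.0, -1e-3])   # assumed toy values
sigma = np.full(n_clusters, 5e-3)                    # assumed per-cluster errors
y = true_m + n_hat @ true_D + rng.normal(0.0, sigma)

# Weighted linear least squares for (m, Dx, Dy, Dz).
A = np.hstack([np.ones((n_clusters, 1)), n_hat])     # design matrix
W = 1.0 / sigma**2
ATA = A.T @ (A * W[:, None])
ATy = A.T @ (W * y)
params = np.linalg.solve(ATA, ATy)
cov = np.linalg.inv(ATA)

m_fit, D_fit = params[0], params[1:]
amp = np.linalg.norm(D_fit)
direction = D_fit / amp
print(f"monopole = {m_fit:.2e} +/- {np.sqrt(cov[0, 0]):.2e}")
print(f"dipole amplitude = {amp:.2e}, direction:", np.round(direction, 3))
```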

    The solution space of metabolic networks: producibility, robustness and fluctuations

    Flux analysis is a class of constraint-based approaches to the study of biochemical reaction networks, based on determining the reaction flux configurations compatible with given stoichiometric and thermodynamic constraints. One of its main areas of application is the study of cellular metabolic networks. We briefly and selectively review the main approaches to this problem and then, building on recent work, we provide a characterization of the productive capabilities of the metabolic network of the bacterium E. coli in a specified growth medium in terms of the producible biochemical species. While a robust and physiologically meaningful production profile clearly emerges (including biomass components, biomass products, waste, etc.), the underlying constraints still allow for significant fluctuations even in key metabolites like ATP and, as a consequence, apparently lay the groundwork for very different growth scenarios. Comment: 10 pages, prepared for the Proceedings of the International Workshop on Statistical-Mechanical Informatics, March 7-10, 2010, Kyoto, Japan
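
    For readers unfamiliar with constraint-based flux analysis, here is a minimal sketch of the underlying linear-programming step on an invented three-metabolite toy network; it is not the E. coli reconstruction analysed in the paper, and the reaction names and bounds are purely illustrative.
```python
"""Minimal flux-balance sketch on a toy network (not the E. coli model):
maximise a 'biomass' flux subject to the steady-state constraint S v = 0
and flux bounds."""
import numpy as np
from scipy.optimize import linprog

# Rows = metabolites (A, B, C), columns = reactions:
# R0: uptake -> A, R1: A -> B, R2: A -> C, R3: B + C -> biomass, R4: C export
S = np.array([
    [ 1, -1, -1,  0,  0],   # A
    [ 0,  1,  0, -1,  0],   # B
    [ 0,  0,  1, -1, -1],   # C
])

bounds = [(0, 10)] * S.shape[1]            # irreversible fluxes with an upper cap
c = np.zeros(S.shape[1]); c[3] = -1.0      # linprog minimises, so maximise R3

res = linprog(c, A_eq=S, b_eq=np.zeros(S.shape[0]), bounds=bounds, method="highs")
print("optimal biomass flux:", -res.fun)
print("flux vector:", np.round(res.x, 3))
```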

    A cross impact methodology for the assessment of US telecommunications system with application to fiber optics development: Executive summary

    A cross impact model of the U.S. telecommunications system was developed. For this model, it was necessary to prepare forecasts of the major segments of the telecommunications system, such as satellites, telephone, TV, CATV, radio broadcasting, etc. In addition, forecasts were prepared of the traffic generated by a variety of new or expanded services, such as electronic check clearing and point-of-sale electronic funds transfer. Finally, the interactions among the forecasts were estimated (the cross impacts). Both the forecasts and the cross impacts were used as inputs to the cross impact model, which could then be used to simulate the future growth of the entire U.S. telecommunications system. By varying the inputs, technology changes or policy decisions with regard to any segment of the system could be evaluated in the context of the remainder of the system. To illustrate the operation of the model, a specific study was made of the deployment of fiber optics throughout the telecommunications system.
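
    A generic cross-impact calculation (not the report's model or data) can be sketched as a Monte Carlo over event occurrences whose probabilities are coupled through assumed impact factors; the event names, baseline probabilities, and factors below are invented for illustration only.
```python
"""Generic cross-impact Monte Carlo sketch: each 'event' has an assumed
baseline probability of occurring in the planning horizon, and the
occurrence of one event multiplies the odds of the others by a
cross-impact factor. All numbers are illustrative."""
import numpy as np

rng = np.random.default_rng(1)

events = ["fiber_optics", "satellite_expansion", "electronic_funds_transfer"]
p0 = np.array([0.5, 0.6, 0.4])            # assumed baseline probabilities

# impact[i, j]: multiplier applied to the odds of event j if event i occurs.
impact = np.array([
    [1.0, 1.2, 1.5],
    [0.8, 1.0, 1.1],
    [1.3, 1.0, 1.0],
])

def run_once(p0, impact, rng):
    p = p0.copy()
    occurred = np.zeros(len(p0), dtype=bool)
    for i in rng.permutation(len(p0)):     # evaluate events in random order
        if rng.random() < p[i]:
            occurred[i] = True
            odds = p / (1 - p)             # adjust the remaining events' odds
            odds *= impact[i]
            p = odds / (1 + odds)
    return occurred

n_runs = 20000
counts = sum(run_once(p0, impact, rng) for _ in range(n_runs))
for name, k in zip(events, counts):
    print(f"{name}: cross-impact-adjusted probability ~ {k / n_runs:.3f}")
```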

    A cross impact methodology for the assessment of US telecommunications system with application to fiber optics development, volume 2

    The appendices for the cross impact methodology are presented. These include: a user's guide, telecommunication events, cross impacts, projection of historical trends, and projection of trends in satellite communications.

    A cross impact methodology for the assessment of US telecommunications system with application to fiber optics development, volume 1

    A cross impact model of the U.S. telecommunications system was developed. It was necessary to prepare forecasts of the major segments of the telecommunications system, such as satellites, telephone, TV, CATV, radio broadcasting, etc. In addition, forecasts were prepared of the traffic generated by a variety of new or expanded services, such as electronic check clearing and point-of-sale electronic funds transfer. Finally, the interactions among the forecasts were estimated (the cross impacts). Both the forecasts and the cross impacts were used as inputs to the cross impact model, which could then be used to simulate the future growth of the entire U.S. telecommunications system. By varying the inputs, technology changes or policy decisions with regard to any segment of the system could be evaluated in the context of the remainder of the system. To illustrate the operation of the model, a specific study was made of the deployment of fiber optics throughout the telecommunications system.

    LoCuSS: The Near-Infrared Luminosity and Weak-Lensing Mass Scaling Relation of Galaxy Clusters

    We present the first scaling relation between weak-lensing galaxy cluster mass, $M_{WL}$, and near-infrared luminosity, $L_K$. Our results are based on 17 clusters observed with wide-field instruments on Subaru, the United Kingdom Infrared Telescope, the Mayall Telescope, and the MMT. We concentrate on the relation between projected 2D weak-lensing mass and spectroscopically confirmed luminosity within 1 Mpc, modelled as $M_{WL} \propto L_{K}^{b}$, obtaining a power-law slope of $b=0.83^{+0.27}_{-0.24}$ and an intrinsic scatter of $\sigma_{\ln M_{WL}|L_{K}}=10^{+8}_{-5}\%$. Intrinsic scatter of ~10% is a consistent feature of our results regardless of how we modify our approach to measuring the relationship between mass and light. For example, deprojecting the mass and measuring both quantities within $r_{500}$, which is itself obtained from the lensing analysis, yields $\sigma_{\ln M_{WL}|L_{K}}=10^{+7}_{-5}\%$ and $b=0.97^{+0.17}_{-0.17}$. We also find that selecting members based on their (J-K) colours instead of spectroscopic redshifts neither increases the scatter nor modifies the slope. Overall our results indicate that near-infrared luminosity measured on scales comparable with $r_{500}$ (typically 1 Mpc for our sample) is a low-scatter and relatively inexpensive proxy for weak-lensing mass. Near-infrared luminosity may therefore be a useful mass proxy for cluster cosmology experiments. Comment: 9 pages, 5 figures, 3 tables. Submitted to MNRAS
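
    The kind of regression summarised above can be illustrated with a minimal maximum-likelihood fit of a power law with intrinsic scatter to synthetic data. This is a generic sketch, not the paper's fitting method; the sample size, "true" parameters, and error bars are placeholders.
```python
"""Minimal sketch: fit ln M = a + b ln L with intrinsic scatter by maximum
likelihood, treating measurement errors on ln M only for simplicity.
All sample values are synthetic."""
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(2)

# Synthetic sample of 17 clusters (placeholder values).
n = 17
lnL = rng.uniform(np.log(3e12), np.log(3e13), size=n)   # K-band luminosity
a_true, b_true, scatter_true = 2.0, 0.9, 0.10
err_lnM = np.full(n, 0.15)                               # assumed lensing mass errors
lnM = (a_true + b_true * (lnL - lnL.mean())
       + rng.normal(0, scatter_true, n) + rng.normal(0, err_lnM))

def neg_log_like(theta):
    a, b, ln_sig = theta
    sig_int = np.exp(ln_sig)
    var = err_lnM**2 + sig_int**2                        # total variance per cluster
    resid = lnM - (a + b * (lnL - lnL.mean()))
    return 0.5 * np.sum(resid**2 / var + np.log(2 * np.pi * var))

res = minimize(neg_log_like, x0=[2.0, 1.0, np.log(0.1)], method="Nelder-Mead")
a_fit, b_fit, sig_fit = res.x[0], res.x[1], np.exp(res.x[2])
print(f"slope b = {b_fit:.2f}, intrinsic scatter = {100 * sig_fit:.0f}%")
```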

    Constraining the redshift evolution of the Cosmic Microwave Background black-body temperature with PLANCK data

    We constrain the deviation from adiabatic evolution of the Universe using the data on the Cosmic Microwave Background (CMB) temperature anisotropies measured by the Planck satellite and a sample of 481 X-ray selected clusters with spectroscopically measured redshifts. To avoid antenna beam effects, we bring all the maps to the same resolution. We use a CMB template to subtract the cosmological signal while preserving the Thermal Sunyaev-Zeldovich (TSZ) anisotropies; next, we remove galactic foreground emissions around each cluster and we mask out all known point sources. If the CMB black-body temperature scales with redshift as $T(z)=T_0(1+z)^{1-\alpha}$, we constrain deviations from adiabatic evolution to be $\alpha=-0.007\pm 0.013$, consistent with the temperature-redshift relation of the standard cosmological model. This result could suffer from a potential bias $\delta\alpha$ associated with the CMB template, which we quantify to be $|\delta\alpha|\le 0.02$ and of the same sign as the measured value of $\alpha$, but it is free from the biases associated with using TSZ-selected clusters; it represents the best constraint to date on the temperature-redshift relation of the Big Bang model using only CMB data, confirming previous results. Comment: ApJ, in press. Manuscript matches the accepted version: 10 pages, 7 figures, 3 tables
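
    A bare-bones version of such a constraint can be written as a one-parameter fit of alpha in T(z) = T0 (1+z)^(1-alpha). The sketch below uses synthetic cluster redshifts, temperatures, and errors rather than the Planck measurements analysed in the paper.
```python
"""Minimal sketch of constraining alpha in T(z) = T0 (1 + z)^(1 - alpha)
from per-cluster CMB temperature estimates. All data below are synthetic
placeholders, not the Planck cluster measurements."""
import numpy as np
from scipy.optimize import curve_fit

rng = np.random.default_rng(3)
T0 = 2.7255                                    # K, present-day CMB temperature

# Synthetic sample: 481 clusters at 0.05 < z < 1 with noisy T(z) estimates.
z = rng.uniform(0.05, 1.0, size=481)
sigma_T = 0.05 * (1 + z)                       # assumed per-cluster errors (K)
T_obs = T0 * (1 + z) + rng.normal(0, sigma_T)  # adiabatic case, alpha = 0

def T_model(z, alpha):
    return T0 * (1 + z) ** (1 - alpha)

popt, pcov = curve_fit(T_model, z, T_obs, p0=[0.0], sigma=sigma_T,
                       absolute_sigma=True)
print(f"alpha = {popt[0]:.4f} +/- {np.sqrt(pcov[0, 0]):.4f}")
```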

    Statistical mechanics of the mixed majority-minority game with random external information

    We study the asymptotic macroscopic properties of the mixed majority-minority game, modeling a population in which two types of heterogeneous adaptive agents, namely "fundamentalists" driven by differentiation and "trend-followers" driven by imitation, interact. The presence of a fraction $f$ of trend-followers is shown to induce (a) a significant loss of informational efficiency with respect to a pure minority game (in particular, an efficient, unpredictable phase exists only for $f<1/2$), and (b) a catastrophic increase of global fluctuations for $f>1/2$. We solve the model by means of an approximate static (replica) theory and by a direct dynamical (generating functional) technique. The two approaches coincide and match numerical results convincingly. Comment: 19 pages, 3 figures
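
    For orientation, a standard numerical setup for this class of models (distinct from the replica and generating-functional solutions used in the paper) simulates agents with random strategy tables and measures the global fluctuations; the parameter values below are illustrative assumptions.
```python
"""Minimal simulation sketch of a mixed majority-minority game with random
external information: a fraction f of agents scores its strategies as
trend-followers (majority payoff), the rest as fundamentalists (minority
payoff). Parameters are illustrative; ties in argmax are broken
deterministically for brevity."""
import numpy as np

rng = np.random.default_rng(4)

N, alpha, f = 301, 0.5, 0.3            # agents, P/N ratio, trend-follower fraction
P = max(1, int(alpha * N))             # number of information patterns
S = 2                                  # strategies per agent
T = 20000                              # time steps

strategies = rng.choice([-1, 1], size=(N, S, P))   # fixed random lookup tables
scores = np.zeros((N, S))
is_majority = np.zeros(N, dtype=bool)
is_majority[: int(f * N)] = True       # first f*N agents are trend-followers

A_hist = []
for t in range(T):
    mu = rng.integers(P)                           # random external information
    best = scores.argmax(axis=1)                   # each agent's best strategy
    actions = strategies[np.arange(N), best, mu]
    A = actions.sum()                              # aggregate action
    # Minority players reward strategies anti-aligned with A,
    # majority players reward strategies aligned with A.
    sign = np.where(is_majority, 1.0, -1.0)[:, None]
    scores += sign * strategies[:, :, mu] * A / N
    if t > 2000:                                   # discard a short transient
        A_hist.append(A)

A_hist = np.array(A_hist)
print("volatility sigma^2/N =", A_hist.var() / N)
```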

    Typical properties of optimal growth in the Von Neumann expanding model for large random economies

    We calculate the optimal solutions of the fully heterogeneous Von Neumann expansion problem with $N$ processes and $P$ goods in the limit $N\to\infty$. This model provides an elementary description of the growth of a production economy in the long run. The system turns from a contracting to an expanding phase as $N$ increases beyond $P$. The solution is characterized by a universal behavior, independent of the parameters of the disorder statistics. Associating technological innovation with an increase of $N$, we find that while such an increase has a large positive impact on long-term growth when $N\ll P$, its effect on technologically advanced economies ($N\gg P$) is very weak. Comment: 8 pages, 1 figure
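
    The optimisation underlying this model can be sketched as a bisection on the growth factor with a linear-programming feasibility check. The sketch below is an assumption-laden illustration: the random input/output matrices, the sizes of N and P, and the upper bracket on the growth factor are all placeholders, and it is not the analytical large-N solution of the paper.
```python
"""Minimal sketch of the optimal expansion rate in a random Von Neumann
model: find the largest rho such that some nonnegative intensity vector s
of the N processes satisfies output >= rho * input for every good, via
bisection on rho with an LP feasibility check."""
import numpy as np
from scipy.optimize import linprog

rng = np.random.default_rng(5)

N, P = 40, 20                          # processes, goods (N > P: expanding regime)
A_in = rng.random((N, P))              # A_in[i, j]: amount of good j used by process i
B_out = rng.random((N, P))             # B_out[i, j]: amount of good j produced

def feasible(rho):
    """Is there s >= 0, sum(s) = 1, with s @ (B_out - rho * A_in) >= 0 ?"""
    M = B_out - rho * A_in
    res = linprog(
        c=np.zeros(N),
        A_ub=-M.T, b_ub=np.zeros(P),           # -(s @ M) <= 0 for every good
        A_eq=np.ones((1, N)), b_eq=[1.0],      # normalisation, excludes s = 0
        bounds=[(0, None)] * N, method="highs",
    )
    return res.status == 0

lo, hi = 0.0, 10.0                     # assumed bracket for the growth factor
for _ in range(60):                    # bisection on rho
    mid = 0.5 * (lo + hi)
    lo, hi = (mid, hi) if feasible(mid) else (lo, mid)
print("optimal growth factor rho* ~", round(lo, 4))
```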