    Accretion History of Subhalo Population now and then

    In the standard model of structure formation, galaxies reside in virialized dark matter haloes which extend well beyond the observational radius of the central system. The halo formation process is hierarchical: small systems collapse at high redshift and then merge, forming larger ones. In this work we study the mass assembly history of host haloes at different observation redshifts and the mass function of accreted satellites (haloes that merge directly onto the main halo progenitor). We show that the satellite mass function is universal, independent of both the host halo mass and the observation redshift. The satellite mass function also turns out to be universal when only satellites accreted before or after the host halo formation redshift (the time at which the main halo progenitor assembles half of its final mass) are considered. We show that the normalizations of these distributions are directly related to the main halo progenitor mass distribution before and after its formation, while their slope and exponential high-mass cut-off remain unchanged.
    Comment: To appear in the proceedings of the "Invisible Universe International Conference 2009" (6 pages, 3 figures)
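    The formation redshift used above is defined operationally: the time at which the main progenitor first assembles half of its final mass. A minimal sketch of how one might locate it on a tabulated mass accretion history (the toy history and grid below are illustrative, not the paper's data):

    ```python
    import numpy as np

    def formation_redshift(z, m_progenitor):
        """Return the redshift at which the main-progenitor mass first drops
        below half the final mass, scanning from late to early times.
        `z` is sorted increasing (z[0] = observation redshift);
        `m_progenitor` is the main-progenitor mass at each z."""
        m_half = 0.5 * m_progenitor[0]           # half of the final (observed) mass
        below = np.where(m_progenitor < m_half)[0]
        if len(below) == 0:
            return None                          # progenitor never drops below half mass
        i = below[0]
        # linear interpolation in z between the two bracketing snapshots
        z0, z1 = z[i - 1], z[i]
        m0, m1 = m_progenitor[i - 1], m_progenitor[i]
        return z0 + (m_half - m0) * (z1 - z0) / (m1 - m0)

    # toy accretion history: mass halves for every unit of redshift
    z = np.linspace(0.0, 4.0, 41)
    m = 1e14 * 0.5 ** z
    print(round(formation_redshift(z, m), 2))    # -> 1.0
    ```

    For the toy history the half-mass point sits exactly at z = 1 by construction, which makes the interpolation easy to check by hand.
    
    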

    Games judges don't play: predatory pricing and strategic reasoning in US antitrust

    The paper analyzes the last three decades of debates on predatory pricing in US antitrust law, starting from the literature which followed Areeda & Turner 1975 and ending with the early years of the new century, after the Brooke decision. Special emphasis is given to the game-theoretic approach to predation and to the reasons why this approach has never gained traction in courtrooms. It is argued that, despite their mathematical rigor, the sophisticated stories told by strategic models to demonstrate the actual viability of predatory behavior fail to satisfy the criteria which guide the decisions of antitrust courts, in particular their preference for easy-to-apply rules. Therefore predation cases are still governed by a peculiar alliance between Chicago-style price theory, which, contrary to game theory, considers predatory behavior almost always irrational, and a Harvard-style attention to the operational side of antitrust enforcement.
    Keywords: antitrust law; predatory pricing; Chicago School; Harvard; game theory

    From Wald to Savage: homo economicus becomes a Bayesian statistician

    Bayesian rationality is the paradigm of rational behavior in neoclassical economics. A rational agent in an economic model is one who maximizes her subjective expected utility and consistently revises her beliefs according to Bayes's rule. The paper raises the question of how, when and why this characterization of rationality came to be endorsed by mainstream economists. Though no definitive answer is provided, it is argued that the question is far from trivial and of great historiographic importance. The story begins with Abraham Wald's behaviorist approach to statistics and culminates with Leonard J. Savage's elaboration of subjective expected utility theory in his 1954 classic The Foundations of Statistics. It is the latter's acknowledged failure to achieve its intended goal, the reinterpretation of traditional inferential techniques along subjectivist and behaviorist lines, which raises the puzzle of how a failed project in statistics could turn into such a tremendous hit in economics. A couple of tentative answers are offered, involving the role of the consistency requirement in neoclassical analysis and the impact of the postwar transformation of US business schools.
    Keywords: Savage; Wald; rational behavior; Bayesian decision theory; subjective probability; minimax rule; statistical decision functions; neoclassical economics

    Reaction curves

    A reaction curve (RC), also called a reaction function or best-reply function, is the locus of optimal, i.e. profit-maximizing, actions that a firm may undertake for any given action chosen by a rival firm. The RC diagram is the standard tool for the graphical analysis of duopoly: in the diagram, the market equilibrium is at the intersection of the RCs, one for each firm. The commonest case of RC diagram is that of the Cournot duopoly model.
    Keywords: reaction curves; duopoly; Cournot
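    The intersection of reaction curves described above can be computed directly for the textbook Cournot case. A minimal sketch, assuming a symmetric duopoly with linear demand P = a - b*(q1 + q2) and constant marginal cost c (all parameter values illustrative):

    ```python
    # Best reply for a Cournot duopolist facing linear demand P = a - b*(q1+q2)
    # and constant marginal cost c: maximizing (P - c)*q gives
    # q* = (a - c - b*q_rival) / (2b), truncated at zero.
    def best_reply(q_rival, a=100.0, b=1.0, c=10.0):
        return max((a - c - b * q_rival) / (2 * b), 0.0)

    # The market equilibrium is at the intersection of the two reaction curves;
    # for this linear model, iterating best replies converges to it.
    q1 = q2 = 0.0
    for _ in range(100):
        q1 = best_reply(q2)
        q2 = best_reply(q1)

    print(round(q1, 3), round(q2, 3))  # converges to (a - c) / (3b) = 30.0
    ```

    The iteration converges because each best reply halves the distance to the fixed point; solving the two linear reaction equations simultaneously gives the same answer in closed form.
    
    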

    Mathematics as the role model for neoclassical economics (Blanqui Lecture)

    Born out of the conscious effort to imitate mechanical physics, neoclassical economics ended up, by the mid-20th century, embracing a purely mathematical notion of rigor as embodied by the axiomatic method. This lecture tries to explain how this could happen, that is, why and when the economists' role model became the mathematician rather than the physicist. According to the standard interpretation, the triumph of axiomatics in modern neoclassical economics can be explained in terms of the discipline's increasing awareness of its lack of good experimental and observational data, and thus of its intrinsic inability to fully abide by the paradigm of mechanics. Yet this story fails to properly account for the transformation that the word "rigor" itself underwent, first and foremost in mathematics, as well as for the existence of a specific motivation behind the economists' decision to pursue the axiomatic route. While the full argument is developed in Giocoli 2003, these pages offer a taste of a (partially) alternative story which begins with the so-called formalist revolution in mathematics, then crosses the economists' almost innate urge to bring their discipline to the highest possible level of generality and conceptual integrity, and ends with the advent and consolidation of the core set of methods, tools and ideas that constitute the contemporary image of economics.
    Keywords: axiomatic method; formalism; rationality; neoclassical economics

    Three alternative (?) stories on the late 20th-century rise of game theory

    The paper presents three different reconstructions of the 1980s boom of game theory and its rise to its present status as an indispensable toolbox for modern economics. The first story focuses on the Nash refinements literature and the development of Bayesian games. The second emphasizes the role of antitrust case law, and in particular the rehabilitation, via game theory, of some traditional antitrust prohibitions and limitations which had been challenged by the Chicago approach. The third story centers on the wealth of issues classifiable under the general heading of "mechanism design" and on the game-theoretic tools and methods which have been applied to tackle them. The bottom lines are, first, that the three stories need not be viewed as conflicting, but rather as complementary, and, second, that in all three a central role has been played by John Harsanyi and Bayesian decision theory.
    Keywords: game theory; mechanism design; refinements of Nash equilibrium; antitrust law; John Harsanyi

    Weak Lensing Light-Cones in Modified Gravity simulations with and without Massive Neutrinos

    We present a novel suite of cosmological N-body simulations called the DUSTGRAIN-pathfinder, implementing simultaneously the effects of an extension to General Relativity in the form of f(R) gravity and of a non-negligible fraction of massive neutrinos. We describe the generation of simulated weak lensing and cluster counts observables within a past light-cone extracted from these simulations. The simulations have been performed by means of a combination of the MG-GADGET code and a particle-based implementation of massive neutrinos, while the light-cones have been generated using the MapSim pipeline, allowing us to compute weak lensing maps through a ray-tracing algorithm for different values of the source plane redshift. The mock observables extracted from our simulations will be employed for a series of papers focussed on understanding and possibly breaking the well-known observational degeneracy between f(R) gravity and massive neutrinos, i.e. the fact that some specific combinations of the characteristic parameters for these two phenomena (the f_R0 scalar amplitude and the total neutrino mass Σm_ν) may be indistinguishable from the standard ΛCDM cosmology through several standard observational probes. In particular, in the present work we show how a tomographic approach to weak lensing statistics could allow us, especially for the next generation of wide-field surveys, to disentangle some of the models that appear statistically indistinguishable through a standard single-redshift weak lensing probe.
    Comment: accepted for publication in MNRAS; added theoretical comparisons to the simulation measurements

    The mass-concentration relation in lensing clusters: the role of statistical biases and selection effects

    The relation between the mass and concentration of galaxy clusters traces their formation and evolution. Massive lensing clusters were observed to be over-concentrated and to follow a steep scaling, in tension with predictions from the concordance ΛCDM paradigm. We critically revise the relation in the CLASH, SGAS, LOCUSS, and high-redshift samples of weak lensing clusters. Measurements of mass and concentration are anti-correlated, which can bias the observed relation towards steeper values. We corrected for this bias and compared the measured relation to theoretical predictions accounting for halo triaxiality, adiabatic contraction of the halo, the presence of a dominant BCG and, above all, selection effects in the observed sample. The normalisation, slope and scatter of the expected relation are strongly sample-dependent. For the considered samples, the predicted slope is much steeper than that of the underlying relation characterising dark-matter-only clusters. We find that correcting for statistical and selection biases in observed relations mostly resolves the tension with the ΛCDM model.
    Comment: 13 pages, 3 figures; v2: 14 pages, minor changes, in press on MNRAS
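    The steepening effect of anti-correlated mass and concentration errors mentioned above can be demonstrated with a toy Monte Carlo. This is a hypothetical sketch, not the paper's analysis: the underlying c-M slope, scatter, and error covariance below are illustrative numbers chosen to make the bias visible.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)
    n = 2000
    log_m = rng.uniform(14.0, 15.0, n)        # true log10 masses (illustrative)
    log_c_true = 0.8 - 0.1 * (log_m - 14.0)   # shallow underlying c-M relation

    # Anti-correlated measurement errors in (log M, log c), as for lensing fits
    cov = np.array([[0.04, -0.015],
                    [-0.015, 0.04]])
    err = rng.multivariate_normal([0.0, 0.0], cov, size=n)
    log_m_obs = log_m + err[:, 0]
    log_c_obs = log_c_true + err[:, 1]

    # Naive least-squares slope on true vs. observed quantities
    slope_true = np.polyfit(log_m, log_c_true, 1)[0]
    slope_obs = np.polyfit(log_m_obs, log_c_obs, 1)[0]
    print(slope_true, slope_obs)  # observed slope is steeper (more negative)
    ```

    The recovered slope is biased steep because the error anti-correlation adds a negative term to the mass-concentration covariance, exactly the direction of the bias the abstract corrects for.
    
    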

    Can giant radio halos probe the merging rate of galaxy clusters?

    Radio and X-ray observations of galaxy clusters probe a direct link between cluster mergers and giant radio halos (RH), suggesting that these sources can be used as probes of the cluster merging rate with cosmic time. In this paper we carry out an explorative study that combines the observed fractions of merging clusters (f_m) and of RH (f_RH) with the merging rate predicted by cosmological simulations, and attempt to infer constraints on the merger properties of clusters that appear disturbed in X-rays and of clusters with RH. We use morphological parameters to identify merging systems and analyze the currently largest sample of clusters with radio and X-ray data (M_500 > 6x10^14 Msun and 0.2 < z < 0.33, from the Planck SZ cluster catalogue). We find that in this sample f_m ~ 62-67% while f_RH ~ 44-51%. The comparison of the theoretical f_m with the observed one allows us to constrain the combination (xi_m, tau_m), where xi_m and tau_m are the minimum merger mass ratio and the timescale of merger-induced disturbance. Assuming tau_m ~ 2-3 Gyr, as constrained by simulations, we find that the observed f_m matches the theoretical one for xi_m ~ 0.1-0.18. This is consistent with optical and near-IR observations of clusters in the sample (xi_m ~ 0.14-0.16). The fact that RH are found only in a fraction of merging clusters may suggest that merger events generating RH are characterized by a larger mass ratio; this seems supported by optical/near-IR observations of RH clusters in the sample (xi_m ~ 0.2-0.25). Alternatively, RH may be generated in all mergers, but their lifetime is shorter than tau_m (by ~ f_RH/f_m). This is an explorative study; however, it suggests that follow-up studies using the forthcoming radio surveys and adequate numerical simulations have the potential to derive quantitative constraints on the link between the cluster merging rate and RH at different cosmic epochs and for different cluster masses.
    Comment: 10 pages, 3 figures, accepted for publication in A&A
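    The lifetime argument at the end of the abstract is simple arithmetic: if RH occur in all mergers but live for a shorter time than the merger disturbance, the halo lifetime scales as tau_RH ~ tau_m * (f_RH / f_m). A back-of-the-envelope evaluation using the central values of the ranges quoted above (the specific central values are my choice, not the paper's):

    ```python
    # Lifetime estimate if radio halos are generated in all mergers but fade
    # before the merger-induced disturbance does: tau_RH ~ tau_m * (f_RH / f_m).
    tau_m = 2.5      # Gyr, merger disturbance timescale (simulations: ~2-3 Gyr)
    f_m = 0.65       # observed merging fraction (~62-67%)
    f_rh = 0.475     # observed radio-halo fraction (~44-51%)

    tau_rh = tau_m * f_rh / f_m
    print(round(tau_rh, 2))  # ~1.8 Gyr for these central values
    ```

    Sliding the inputs across the quoted ranges gives a rough spread rather than a single number, which is consistent with the abstract framing this as an explorative constraint.
    
    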

    GLAMER Part II: Multiple Plane Gravitational Lensing

    We present an extension to multiple planes of the gravitational lensing code GLAMER. The method entails projecting the mass in the observed light-cone onto a discrete number of lens planes and inverse ray-shooting from the image to the source plane. The mass on each plane can be represented as halos, simulation particles, a projected mass map extracted from a numerical simulation, or any combination of these. The image finding is done in a source-oriented fashion, where only regions of interest are iteratively refined on an initially coarse image-plane grid. The calculations are performed in parallel on shared-memory machines. The code is able to handle different types of analytic halos (NFW, NSIE, power-law, etc.), haloes extracted from numerical simulations, and clusters constructed from semi-analytic models (MOKA). Likewise, there are several different options for modeling the source(s), which can be distributed throughout the light-cone. The distribution of matter in the light-cone can be taken from a pre-existing N-body numerical simulation, from halo catalogs, or generated from an analytic mass function. We present several tests of the code and demonstrate some of its applications, such as generating mock images of galaxy and galaxy cluster lenses.
    Comment: 14 pages, 10 figures, submitted to MNRAS
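    The multiple-plane inverse ray-shooting described above can be sketched in a few lines: a ray starts at an image-plane position, is deflected at each lens plane in turn, and its accumulated weighted deflections give its source-plane position. This is a heavily simplified toy, not GLAMER's implementation: the single-point-mass deflection per plane and the fixed distance-ratio weights are illustrative placeholders.

    ```python
    import numpy as np

    def point_mass_deflection(x, center, strength):
        """Toy deflection field: alpha = strength * (x - center) / |x - center|^2."""
        d = x - center
        return strength * d / np.dot(d, d)

    def shoot_ray(theta, planes, weights):
        """Propagate an image-plane position `theta` through a stack of lens
        planes. `planes` is a list of (center, strength) toy deflectors;
        `weights[i]` plays the role of the distance ratio weighting the i-th
        plane's deflection at the source (a simplification of the full
        multi-plane recursion)."""
        x = np.asarray(theta, dtype=float)
        beta = np.asarray(theta, dtype=float)
        for (center, strength), w in zip(planes, weights):
            alpha = point_mass_deflection(x, np.asarray(center), strength)
            beta = beta - w * alpha    # accumulate weighted deflections
            x = x - alpha              # position hitting the next plane (simplified)
        return beta                    # source-plane position

    planes = [((0.1, 0.0), 0.02), ((-0.05, 0.05), 0.01)]
    weights = [0.8, 0.5]
    print(shoot_ray((1.0, 0.0), planes, weights))
    ```

    In an actual code the deflections would come from the tabulated mass maps or halo profiles on each plane, and image finding would invert this mapping by refining the image-plane grid where rays land near the source, as the abstract describes.
    
    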