
    Collisional Formation and Modeling of Asteroid Families

    In the last decade, thanks to the development of sophisticated numerical codes, major breakthroughs have been achieved in our understanding of the formation of asteroid families by catastrophic disruption of large parent bodies. In this review, we describe numerical simulations of asteroid collisions that reproduced the main properties of families, accounting for both the fragmentation of an asteroid at the time of impact and the subsequent gravitational interactions of the generated fragments. The simulations demonstrate that the catastrophic disruption of bodies larger than a few hundred meters in diameter leads to the formation of large aggregates due to gravitational reaccumulation of smaller fragments, which helps explain the presence of large members within asteroid families. Thus, for the first time, numerical simulations successfully reproduced the sizes and ejection velocities of members of representative families. Moreover, the simulations provide constraints on the family dynamical histories and on the possible internal structure of family members and their parent bodies. Comment: Chapter to appear in the (University of Arizona Press) Space Science Series Book: Asteroids I
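    The reaccumulation mechanism described above lends itself to a compact illustration. Below is a minimal toy sketch, not the sophisticated codes the review surveys (which couple impact fragmentation to parallel N-body gravity): point-mass fragments with placeholder masses, positions, and ejection velocities evolve under mutual gravity with a leapfrog integrator, and bodies that touch merge by perfect sticking, so large aggregates grow out of the fragment cloud.

```python
# Toy sketch of gravitational reaccumulation (all numbers are placeholders).
import numpy as np

G = 6.674e-11        # gravitational constant [m^3 kg^-1 s^-2]
DT = 20.0            # timestep [s]
RHO = 2000.0         # assumed fragment density [kg/m^3]

rng = np.random.default_rng(0)
n = 50
mass = rng.uniform(1e9, 1e10, n)            # fragment masses [kg]
pos = rng.normal(0.0, 5e3, (n, 3))          # positions [m]
vel = rng.normal(0.0, 1.0, (n, 3))          # ejection velocities [m/s]

def radius(m):
    """Radius of a sphere of mass m and density RHO."""
    return (3.0 * m / (4.0 * np.pi * RHO)) ** (1.0 / 3.0)

def accel(pos, mass):
    """Pairwise Newtonian gravity; d[i, j] points from body i to body j."""
    d = pos[None, :, :] - pos[:, None, :]
    r2 = (d ** 2).sum(-1)
    np.fill_diagonal(r2, np.inf)            # no self-force
    return (G * mass[None, :, None] * d / r2[..., None] ** 1.5).sum(axis=1)

for _ in range(5000):
    vel += 0.5 * DT * accel(pos, mass)      # leapfrog: kick
    pos += DT * vel                         # drift
    vel += 0.5 * DT * accel(pos, mass)      # kick
    merged = True
    while merged:                           # perfect sticking on contact
        merged = False
        for i in range(len(mass) - 1):
            for j in range(i + 1, len(mass)):
                if np.linalg.norm(pos[i] - pos[j]) < radius(mass[i]) + radius(mass[j]):
                    m = mass[i] + mass[j]   # momentum-conserving merger
                    pos[i] = (mass[i] * pos[i] + mass[j] * pos[j]) / m
                    vel[i] = (mass[i] * vel[i] + mass[j] * vel[j]) / m
                    mass[i] = m
                    mass = np.delete(mass, j)
                    pos = np.delete(pos, j, axis=0)
                    vel = np.delete(vel, j, axis=0)
                    merged = True
                    break
            if merged:
                break

print(f"{len(mass)} bodies remain; largest aggregate: {mass.max():.2e} kg")
```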

    The Science of Galaxy Formation

    Our knowledge of the Universe remains discovery-led: in the absence of adequate physics-based theory, interpretation of new results requires a scientific methodology. Commonly, scientific progress in astrophysics is motivated by the empirical success of the "Copernican Principle", that the simplest and most objective analysis of observation leads to progress. A complementary approach tests the predictions of models against observation. In practice, astrophysics has few real theories and little control over what we can observe. Compromise is unavoidable. Advances in understanding complex non-linear situations, such as galaxy formation, require that models attempt to isolate key physical properties rather than trying to reproduce complexity. A specific example is discussed, where substantial progress in fundamental physics could be made with an ambitious approach to modelling: simulating the spectrum of perturbations on small scales. Comment: paper at IAU256, The Galaxy Disk in Cosmological Context, Copenhagen, 2008, eds J. Andersen, J. Bland-Hawthorn & B. Nordstro

    Efficient simulation scheme for a class of quantum optics experiments with non-negative Wigner representation

    We provide a scheme for efficient simulation of a broad class of quantum optics experiments. Our efficient simulation extends the continuous-variable Gottesman-Knill theorem to a large class of non-Gaussian mixed states, thereby showing that these non-Gaussian states are not an enabling resource for exponential quantum speed-up. Our results also provide an operationally motivated interpretation of negativity as non-classicality. We apply our scheme to the case of noisy single-photon-added thermal states to show that this class admits states with a positive Wigner function but negative P-function that are not useful resource states for quantum computation. Comment: 14 pages, 1 figure
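    The efficiency claim rests on treating a non-negative Wigner function as a classical probability density. The sketch below illustrates that sampling idea under simplifying assumptions (a Gaussian thermal input, a placeholder phase shift and displacement); it is my illustration, not the paper's construction. Phase-space samples are pushed through the affine maps of the Gaussian circuit elements, and a homodyne measurement reduces to reading off a coordinate.

```python
# Monte Carlo sketch: simulate Gaussian evolution of a state with a
# non-negative Wigner function by sampling phase space (hbar = 1).
import numpy as np

rng = np.random.default_rng(1)
nbar = 2.0                                 # mean thermal photon number
sigma2 = nbar + 0.5                        # thermal-state Wigner variance

# 1) sample the (Gaussian, hence non-negative) Wigner function
pts = rng.normal(0.0, np.sqrt(sigma2), size=(100_000, 2))   # columns: x, p

# 2) Gaussian elements act as affine maps on phase space
theta = np.pi / 5                          # phase-shifter angle (placeholder)
R = np.array([[np.cos(theta), np.sin(theta)],
              [-np.sin(theta), np.cos(theta)]])
shift = np.array([1.0, 0.0])               # displacement (placeholder)
pts = pts @ R.T + shift

# 3) homodyne x-measurement: the sampled x-coordinates already carry the
# correct statistics; check the first two moments against the analytic ones
print("sampled  <x>, var(x):", pts[:, 0].mean(), pts[:, 0].var())
print("analytic <x>, var(x):", shift[0], sigma2)
```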

    Reconciling long-term cultural diversity and short-term collective social behavior

    An outstanding open problem is whether collective social phenomena occurring over short timescales can systematically reduce cultural heterogeneity in the long run, and whether offline and online human interactions contribute differently to the process. Theoretical models suggest that short-term collective behavior and long-term cultural diversity are mutually exclusive, since they require very different levels of social influence. The latter jointly depends on two factors: the topology of the underlying social network and the overlap between individuals in multidimensional cultural space. However, while the empirical properties of social networks are well understood, little is known about the large-scale organization of real societies in cultural space, so that random input specifications are necessarily used in models. Here we use a large dataset to perform a high-dimensional analysis of the scientific beliefs of thousands of Europeans. We find that inter-opinion correlations determine a nontrivial ultrametric hierarchy of individuals in cultural space, a result inaccessible to one-dimensional analyses and in striking contrast with random assumptions. When empirical data are used as inputs in models, we find that ultrametricity has strong and counterintuitive effects, especially in the extreme case of long-range online-like interactions bypassing social ties. On short timescales, it strongly facilitates a symmetry-breaking phase transition triggering coordinated social behavior. On long timescales, it severely suppresses cultural convergence by restricting it within disjoint groups. We therefore find that, remarkably, the empirical distribution of individuals in cultural space appears to optimize the coexistence of short-term collective behavior and long-term cultural diversity, which can be realized simultaneously for the same moderate level of mutual influence.
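    For readers unfamiliar with the model class, a minimal sketch of standard Axelrod-type dynamics (an assumption on my part; the abstract does not name a specific model) shows how the two ingredients interact: influence between tied agents is gated by their cultural overlap, so network topology and positions in cultural space jointly shape convergence. The ring network and parameter values are placeholders.

```python
# Axelrod-style cultural dynamics on a placeholder ring network.
import numpy as np

rng = np.random.default_rng(2)
N, F, Q = 100, 5, 10                    # agents, cultural features, traits
culture = rng.integers(0, Q, size=(N, F))
neighbors = {i: [(i - 1) % N, (i + 1) % N] for i in range(N)}

for _ in range(200_000):
    i = rng.integers(N)
    j = rng.choice(neighbors[i])
    overlap = (culture[i] == culture[j]).mean()     # cultural similarity
    if 0 < overlap < 1 and rng.random() < overlap:  # influence gated by overlap
        f = rng.choice(np.flatnonzero(culture[i] != culture[j]))
        culture[i, f] = culture[j, f]               # copy one differing feature

# long-run diversity: number of distinct surviving cultures
print("distinct cultures:", len({tuple(c) for c in culture}))
```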

    Thermodynamics of firms' growth

    The distribution of firms' growth and firms' sizes is a topic under intense scrutiny. In this paper we show that a thermodynamic model based on the Maximum Entropy Principle, with dynamical prior information, can be constructed that adequately describes the dynamics and distribution of firms' growth. Our theoretical framework is tested against a comprehensive database of Spanish firms, which covers Spain's economic activity to a very large extent, with a total of 1,155,142 firms evolving over a full decade. We show that the empirical exponent of Pareto's law, a rule often observed in the rank distribution of large-size firms, is explained by the capacity of the economic system for creating/destroying firms, and can be used to measure the health of a capitalist-based economy. Indeed, our model predicts that when the exponent is larger than 1, creation of firms is favored; when it is smaller than 1, destruction of firms is favored instead; and when it equals 1 (matching Zipf's law), the system is in full macroeconomic equilibrium, entailing "free" creation and/or destruction of firms. For medium and smaller firm sizes the dynamical regime changes; the whole distribution can no longer be fitted to a single simple analytic form, and numerical prediction is required. Our model constitutes the basis of a full predictive framework for the economic evolution of an ensemble of firms that can potentially be used to develop simulations and test hypothetical scenarios, such as an economic crisis or the response to specific policy measures.
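    The diagnostic role of the Pareto exponent can be made concrete with a short, self-contained sketch (synthetic data, not the Spanish firm database; the Hill maximum-likelihood estimator is my choice of fitting method): estimate the tail exponent of a firm-size sample and read it against the paper's creation/destruction criterion.

```python
# Hill (maximum-likelihood) estimate of the Pareto tail exponent,
# interpreted with the paper's criterion (synthetic data).
import numpy as np

rng = np.random.default_rng(3)
# synthetic "firm sizes" with a Pareto tail of true exponent 1.0 (Zipf)
sizes = rng.pareto(1.0, 50_000) + 1.0

s_min = np.quantile(sizes, 0.95)        # fit the large-firm tail only
tail = sizes[sizes >= s_min]
alpha = len(tail) / np.log(tail / s_min).sum()

print(f"tail exponent ~ {alpha:.2f}")
if alpha > 1:
    print("exponent > 1: creation of firms favored")
elif alpha < 1:
    print("exponent < 1: destruction of firms favored")
else:
    print("exponent = 1 (Zipf): macroeconomic equilibrium")
```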

    Range separation: The divide between local structures and field theories

    This work presents parallel histories of the development of two modern theories of condensed matter: the theory of electron structure in quantum mechanics, and the theory of liquid structure in statistical mechanics. Comparison shows that key revelations in both are not only remarkably similar, but even follow a common thread of controversy that marks progress from antiquity through to the present. This theme appears as a creative tension between two competing philosophies: that of short-range structure (atomistic models) on the one hand, and long-range structure (continuum or density functional models) on the other. The timeline and technical content are designed to build up a set of key relations as guideposts for using density functional theories together with atomistic simulation. Comment: Expanded version of a 30-minute talk delivered at the 2018 TSRC workshop on Ions in Solution, to appear in the March 2019 issue of Substantia (https://riviste.fupress.net/index.php/subs/index)
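    Although the abstract is historical, the "divide" of the title has a standard mathematical expression worth recalling: the Coulomb kernel splits exactly into a rapidly decaying short-range part (the domain of atomistic, local-structure models) and a smooth long-range part (the domain of continuum and field theories). The snippet below checks the identity numerically; the value of the separation parameter kappa is an arbitrary choice.

```python
# Range separation of the Coulomb kernel:
#     1/r = erfc(kappa * r) / r  +  erf(kappa * r) / r
import numpy as np
from scipy.special import erf, erfc

kappa = 0.8                              # inverse separation length (arbitrary)
r = np.linspace(0.1, 10.0, 5)
short_range = erfc(kappa * r) / r        # decays rapidly: local structure
long_range = erf(kappa * r) / r          # smooth: continuum / field theory
assert np.allclose(short_range + long_range, 1.0 / r)    # split is exact
print(np.column_stack([r, short_range, long_range]))
```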

    Evaluating the role of quantitative modeling in language evolution

    Models are a flourishing and indispensable area of research in language evolution. Here we highlight critical issues in using and interpreting models, and suggest viable approaches. First, contrasting models can explain the same data, and similar modelling techniques can lead to diverging conclusions. This should serve as a reminder to exploit the extreme malleability of modelling parsimoniously when interpreting results. Second, quantitative techniques similar to those used in modelling language evolution have proven inadequate in other disciplines. Cross-disciplinary fertilization is crucial to avoid mistakes that have previously occurred in other areas. Finally, experimental validation is necessary both to sharpen models' hypotheses and to support their conclusions. Our belief is that models should be interpreted as quantitative demonstrations of logical possibilities, rather than as direct sources of evidence. Only an integration of theoretical principles, quantitative proofs and empirical validation can allow research in the evolution of language to progress.