
    A method for dense packing discovery

    The problem of packing a system of particles as densely as possible is foundational in the field of discrete geometry and is a powerful model in the material and biological sciences. As packing problems retreat from the reach of analytic constructions, an efficient numerical method for conducting \textit{de novo} (from-scratch) searches for dense packings becomes crucial. In this paper, we use the \textit{divide and concur} framework to develop a general search method for the solution of periodic constraint problems, and we apply it to the discovery of dense periodic packings. An important feature of the method is the integration of the unit cell parameters with the other packing variables in the definition of the configuration space. The method we present led to improvements in the densest-known tetrahedron packing which are reported in [arXiv:0910.5226]. Here, we use the method to reproduce the densest known lattice sphere packings and the best known lattice kissing arrangements in up to 14 and 11 dimensions respectively (the first such numerical evidence for their optimality in some of these dimensions). For non-spherical particles, we report a new dense packing of regular four-dimensional simplices with density $\phi = 128/219 \approx 0.5845$ and with a similar structure to the densest known tetrahedron packing. Comment: 15 pages, 5 figures
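
    A rough illustration of the divide-and-concur idea described above: the packing variables are replicated so that each pairwise non-overlap constraint can be enforced in isolation (the "divide" projection), and the replicas of each particle are then averaged back into agreement (the "concur" projection), with the two projections combined in a difference-map style iteration. The sketch below, in Python, treats only equal disks in a fixed periodic unit square; the paper's method additionally evolves the unit cell parameters and handles general particle shapes, and every name and parameter here (N, radius, n_iter) is an illustrative assumption rather than the authors' code.

```python
# Minimal divide-and-concur sketch: equal disks in a fixed periodic unit square.
# Illustrative only; parameters (N, radius, n_iter) are assumptions, not the paper's.
import numpy as np

rng = np.random.default_rng(0)
N, radius, n_iter = 8, 0.15, 3000
pairs = [(i, j) for i in range(N) for j in range(i + 1, N)]

def min_image(d):
    # wrap displacements into [-0.5, 0.5): periodic (toroidal) unit cell
    return d - np.round(d)

def concur_mean(x):
    # average all replicas of each disk (the "concur" estimate of its position)
    sums, counts = np.zeros((N, 2)), np.zeros(N)
    for k, (i, j) in enumerate(pairs):
        sums[i] += x[k, 0]; counts[i] += 1
        sums[j] += x[k, 1]; counts[j] += 1
    return sums / counts[:, None]

def p_concur(x):
    # "concur" projection: force all replicas of the same disk to agree
    mean = concur_mean(x)
    return np.array([[mean[i], mean[j]] for i, j in pairs])

def p_divide(x):
    # "divide" projection: resolve each pairwise overlap independently
    y = x.copy()
    for k in range(len(pairs)):
        d = min_image(y[k, 1] - y[k, 0])
        dist = np.linalg.norm(d)
        if dist < 2 * radius:
            shift = 0.5 * (2 * radius - dist) * d / (dist + 1e-12)
            y[k, 0] -= shift
            y[k, 1] += shift
    return y

# one replica of each disk's position per pair constraint it participates in
pos = rng.random((N, 2))
x = np.array([[pos[i], pos[j]] for i, j in pairs])

for _ in range(n_iter):
    # Douglas-Rachford style update, the usual engine behind divide and concur
    pc = p_concur(x)
    x = x + p_divide(2 * pc - x) - pc

final = concur_mean(x) % 1.0
gaps = [np.linalg.norm(min_image(final[j] - final[i])) for i, j in pairs]
print(f"packing fraction {N * np.pi * radius**2:.3f}, "
      f"min centre distance {min(gaps):.3f} (need >= {2 * radius:.3f})")
```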

    Remarks on the notion of quantum integrability

    We discuss the notion of integrability in quantum mechanics. Starting from a review of some definitions commonly used in the literature, we propose a different set of criteria, leading to a classification of models in terms of different integrability classes. We end by highlighting some of the expected physical properties associated with models fulfilling the proposed criteria. Comment: 22 pages, no figures, Proceedings of Statphys 2

    Ecological equivalence: a realistic assumption for niche theory as a testable alternative to neutral theory

    Hubbell's 2001 neutral theory unifies biodiversity and biogeography by modelling steady-state distributions of species richness and abundances across spatio-temporal scales. Accurate predictions have followed from its core premise that all species have identical vital rates. Yet no ecologist believes that species are identical in reality. Here I explain this paradox in terms of the ecological equivalence that species must achieve at their coexistence equilibrium, defined by zero net fitness for all regardless of intrinsic differences between them. I show that the distinction of realised from intrinsic vital rates is crucial to evaluating community resilience. An analysis of competitive interactions reveals how zero-sum patterns of abundance emerge for species with contrasting life-history traits just as they do for identical species. I develop a stochastic model to simulate community assembly from a random drift of invasions sustaining the dynamics of recruitment following deaths and extinctions. Species are allocated identical intrinsic vital rates for neutral dynamics, or random intrinsic vital rates and competitive abilities for niche dynamics, either on a continuous scale or between dominant-fugitive extremes. Resulting communities have steady-state distributions of the same type whether species are weakly differentiated, strongly differentiated, or identical. All produce negatively skewed log-normal distributions of species abundance, zero-sum relationships of total abundance to area, and Arrhenius relationships of species to area. Intrinsically identical species nevertheless support fewer total individuals, because their densities impact as strongly on each other as on themselves. Truly neutral communities have measurably lower abundance/area and higher species/abundance ratios. Neutral scenarios can be parameterized as null hypotheses for testing competitive release, which is a sure signal of niche dynamics. Ignoring the true strength of interactions between and within species risks a substantial misrepresentation of community resilience to habitat loss
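
    The stochastic assembly model itself is not spelled out in the abstract; as a hedged illustration of the kind of zero-sum neutral drift that such niche scenarios can be tested against, here is a minimal Hubbell-style toy in Python. It is a generic sketch, not the author's model: community size J, immigration probability m, species-pool size S_pool and the number of death-replacement steps are all assumptions.

```python
# Minimal zero-sum neutral community drift (Hubbell-style null model, illustrative only):
# J individuals; each death is replaced by a local birth or, with probability m, an immigrant.
import numpy as np

rng = np.random.default_rng(1)
J, m, S_pool, n_steps = 1000, 0.05, 200, 200_000

community = rng.integers(0, S_pool, size=J)   # species identity of each individual

for _ in range(n_steps):
    dead = rng.integers(J)                    # one random individual dies
    if rng.random() < m:
        community[dead] = rng.integers(S_pool)         # immigration from the species pool
    else:
        community[dead] = community[rng.integers(J)]   # local recruitment (zero-sum)

species, counts = np.unique(community, return_counts=True)
print(f"{len(species)} species persist; "
      f"log-abundances span {np.log(counts.min()):.1f} to {np.log(counts.max()):.1f}")
```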

    Surface stresses on a thin shell surrounding a traversable wormhole

    We match an interior solution of a spherically symmetric traversable wormhole to a unique exterior vacuum solution, with a generic cosmological constant, at a junction interface, and deduce the surface stresses on the thin shell. In the spirit of minimizing the use of exotic matter, we determine regions in which the weak and null energy conditions are satisfied on the junction surface. The characteristics and several physical properties of the surface stresses are explored; in particular, the regions where the tangential surface pressure is positive and where it is negative (a surface tension) are determined. This is done by expressing the tangential surface pressure as a function of several parameters, namely the matching radius, the redshift parameter, the surface energy density and the generic cosmological constant. An equation governing the behavior of the radial pressure across the junction surface is also deduced. Comment: 24 pages, 11 figures, LaTeX2e, IOP style files. Accepted for publication in Classical and Quantum Gravity. V2: Four references added, now 25 pages
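
    For orientation, the surface stresses on such a shell follow from the jump in extrinsic curvature across the junction via the standard Darmois-Israel (Lanczos) formalism. The relations below are written in generic notation with G = c = 1 and are not necessarily in the paper's conventions or sign choices.

```latex
% Lanczos (Darmois--Israel) junction relations for a thin shell; generic notation, G = c = 1.
% The paper's own conventions may differ.
S^{i}{}_{j} = -\frac{1}{8\pi}\left(\left[K^{i}{}_{j}\right] - \delta^{i}{}_{j}\left[K\right]\right),
\qquad
\left[K^{i}{}_{j}\right] \equiv K^{i}{}_{j}\big|^{+} - K^{i}{}_{j}\big|^{-}.
% For a static spherically symmetric shell, S^{i}{}_{j} = \mathrm{diag}(-\sigma,\,\mathcal{P},\,\mathcal{P}), so
\sigma = -\frac{1}{4\pi}\left[K^{\theta}{}_{\theta}\right],
\qquad
\mathcal{P} = \frac{1}{8\pi}\left(\left[K^{\tau}{}_{\tau}\right] + \left[K^{\theta}{}_{\theta}\right]\right).
```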

    Arduous implementation: Does the Normalisation Process Model explain why it's so difficult to embed decision support technologies for patients in routine clinical practice?

    Background: Decision support technologies (DSTs, also known as decision aids) help patients and professionals take part in collaborative decision-making processes. Trials have shown favorable impacts on patient knowledge, satisfaction, decisional conflict and confidence. However, they have not become routinely embedded in health care settings. Few studies have approached this issue using a theoretical framework. We explained problems of implementing DSTs using the Normalization Process Model, a conceptual model that focuses attention on how complex interventions become routinely embedded in practice. Methods: The Normalization Process Model was used as the basis of conceptual analysis of the outcomes of previous primary research and reviews. Using a virtual working environment we applied the model and its main concepts to examine: the 'workability' of DSTs in professional-patient interactions; how DSTs affect knowledge relations between their users; how DSTs impact on users' skills and performance; and the impact of DSTs on the allocation of organizational resources. Results: Conceptual analysis using the Normalization Process Model provided insight into implementation problems for DSTs in routine settings. Current research focuses mainly on the interactional workability of these technologies, but factors related to divisions of labor in health care, and the organizational contexts in which DSTs are used, are poorly described and understood. Conclusion: The model successfully provided a framework for identifying factors that promote and inhibit the implementation of DSTs in healthcare, and gave us insights into factors influencing the introduction of new technologies into contexts where negotiations are characterized by asymmetries of power and knowledge. Future research and development on the deployment of DSTs needs to take a more holistic approach and give emphasis to the structural conditions and social norms in which these technologies are enacted

    Large-scale synchrony of gap dynamics and the distribution of understory tree species in maple-beech forests

    Large-scale synchronous variations in community dynamics are well documented for a vast array of organisms, but are considerably less understood for forest trees. Because of temporal variations in canopy gap dynamics, forest communities, even old-growth ones, are never at equilibrium at the stand scale. This lack of equilibrium may also hold at the regional scale. Our objectives were to determine (1) whether nonequilibrium dynamics caused by temporal variations in the formation of canopy gaps are regionally synchronized, and (2) whether spatiotemporal variations in canopy gap formation affect the relative abundance of tree species in the understory. We examined these questions by analyzing variations in the suppression and release history of Acer saccharum Marsh. and Fagus grandifolia Ehrh. from 481 growth series of understory saplings taken from 34 mature stands. We observed that (1) the proportion of stems in release as a function of time exhibited a U-shaped pattern over the last 35 years, with the lowest levels occurring during 1975–1985, and that (2) the compositional response was that A. saccharum became more abundant at sites that had the highest proportion of stems in release during 1975–1985. We concluded that understory dynamics, typically thought of as a stand-scale process, may be regionally synchronized
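
    The abstract does not state the release-detection criterion used on the growth series; as a hedged illustration only, the sketch below implements a percent-growth-change rule of the kind commonly used in dendroecology (compare running mean ring widths before and after each year). The window length and threshold are illustrative assumptions, not the study's parameters.

```python
# Generic percent-growth-change release detection (illustrative; not the study's exact method).
import numpy as np

def release_years(ring_widths, window=10, threshold=0.5):
    """Return indices where mean growth over the next `window` years exceeds
    the mean over the previous `window` years by at least `threshold` (e.g. 50%)."""
    w = np.asarray(ring_widths, dtype=float)
    years = []
    for t in range(window, len(w) - window):
        before = w[t - window:t].mean()
        after = w[t:t + window].mean()
        if before > 0 and (after - before) / before >= threshold:
            years.append(t)
    return years

# toy series: suppressed growth followed by a canopy-gap release
series = [0.4] * 25 + [1.1] * 20
print(release_years(series))   # indices clustered around the growth jump
```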

    The influence of perfusion solution on renal graft viability assessment

    BACKGROUND: Kidneys from donors after cardiac or circulatory death are exposed to extended periods of both warm ischemia and intra-arterial cooling before organ recovery. Marshall's hypertonic citrate (HOC) and Bretschneider's histidine-tryptophan-ketoglutarate (HTK) preservation solutions are cheap, low-viscosity preservation solutions used clinically for organ flushing. The aim of the present study was to evaluate the effects of these two solutions both on parameters used in clinical practice to assess organ viability prior to transplantation and on histological evidence of ischemic injury after reperfusion. METHODS: Rodent kidneys were exposed to post-mortem warm ischemia, extended intra-arterial cooling (IAC) (up to 2 h) with preservation solution, and reperfusion with either Krebs-Henseleit solution or whole blood in a transplant model. Control kidneys were either reperfused directly after retrieval or stored in 0.9% saline. Biochemical, immunological and histological parameters were assessed using glutathione-S-transferase (GST) enzymatic assays, polymerase chain reaction and mitochondrial electron microscopy, respectively. Vascular function was assessed by supplementing the Krebs-Henseleit perfusion solution with phenylephrine to stimulate smooth muscle contraction, followed by acetylcholine to trigger endothelium-dependent relaxation. RESULTS: When compared with kidneys reperfused directly post mortem, 2 h of IAC significantly reduced smooth muscle contractile function and endothelial function and upregulated vascular cell adhesion molecule 1 (VCAM-1), independent of the preservation solution. However, GST release, vascular resistance, weight gain and histological mitochondrial injury were dependent on the preservation solution used. CONCLUSIONS: We conclude that initial machine perfusion viability tests, including ischemic vascular resistance and GST, are dependent on the perfusion solution used during in situ cooling. HTK-perfused kidneys will be heavier and have higher GST readings, yet reduced mitochondrial ischemic injury, when compared with HOC-perfused kidneys. Clinicians should be aware of this when deciding which kidneys to transplant or discard

    Process evaluation for complex interventions in primary care: understanding trials using the normalization process model

    Background: The Normalization Process Model is a conceptual tool intended to assist in understanding the factors that affect implementation processes in clinical trials and other evaluations of complex interventions. It focuses on the ways that the implementation of complex interventions is shaped by problems of workability and integration. Method: In this paper the model is applied to two different complex trials: (i) the delivery of problem-solving therapies for psychosocial distress, and (ii) the delivery of nurse-led clinics for heart failure treatment in primary care. Results: Application of the model shows how process evaluations need to focus on more than the immediate contexts in which trial outcomes are generated; problems relating to intervention workability and integration also need to be understood. The model may be used effectively to explain the implementation process in trials of complex interventions. Conclusion: The model invites evaluators to attend equally to how a complex intervention interacts with existing patterns of service organization, professional practice, and professional-patient interaction. The justification for this may be found in the abundance of reports of clinical effectiveness for interventions that have little hope of being implemented in real healthcare settings