
    The Influence of the Degree of Heterogeneity on the Elastic Properties of Random Sphere Packings

    The macroscopic mechanical properties of colloidal particle gels strongly depend on the local arrangement of the powder particles. Experiments have shown that more heterogeneous microstructures exhibit up to one order of magnitude higher elastic properties than their more homogeneous counterparts at equal volume fraction. In this paper, packings of spherical particles are used as model structures to computationally investigate the elastic properties of coagulated particle gels as a function of their degree of heterogeneity. The discrete element model comprises a linear elastic contact law, particle bonding and damping. The simulation parameters were calibrated using a homogeneous and a heterogeneous microstructure originating from earlier Brownian dynamics simulations. A systematic study of the elastic properties as a function of the degree of heterogeneity was performed using two sets of microstructures obtained from Brownian dynamics simulation and from the void expansion method. Both sets cover a broad and to a large extent overlapping range of degrees of heterogeneity. The simulations have shown that the elastic properties as a function of the degree of heterogeneity are independent of the structure generation algorithm and that the relation between the shear modulus and the degree of heterogeneity can be well described by a power law. This suggests the presence of a critical degree of heterogeneity and, therefore, a phase transition between a phase with finite and one with zero elastic properties.
    Comment: 8 pages, 6 figures; Granular Matter (published online: 11 February 2012)
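The power-law relation between shear modulus and degree of heterogeneity described above can be sketched numerically. The snippet below is a minimal illustration only: the critical heterogeneity h_c, prefactor, and exponent are invented values, not results from the paper, and the "data" are synthetic.

```python
# Hypothetical sketch: fitting a power law G ~ (h - h_c)^f for the shear
# modulus G above an assumed critical degree of heterogeneity h_c.
# All numbers here are illustrative, not the paper's data.
import numpy as np

h_c = 0.30                           # assumed critical heterogeneity
h = np.linspace(0.35, 0.90, 12)      # synthetic degrees of heterogeneity
G = 2.5 * (h - h_c) ** 1.8           # synthetic shear moduli

# A power law is a straight line in log-log space once h_c is subtracted:
# log G = log A + f * log(h - h_c)
slope, intercept = np.polyfit(np.log(h - h_c), np.log(G), 1)
print(f"fitted exponent f = {slope:.2f}, prefactor A = {np.exp(intercept):.2f}")
```

On clean synthetic data the fit recovers the exponent exactly; on real simulation data one would additionally estimate h_c itself, e.g. by scanning it for the best log-log linearity.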

    Keck Spectroscopy of Faint 3<z<8 Lyman Break Galaxies: Evidence for a Declining Fraction of Emission Line Sources in the Redshift Range 6<z<8

    Using deep Keck spectroscopy of Lyman break galaxies selected from infrared imaging data taken with WFC3/IR onboard the Hubble Space Telescope, we present new evidence for a reversal in the redshift-dependent fraction of star forming galaxies with detectable Lyman alpha emission in the redshift range 6.3 < z < 8.8. Our earlier surveys with the DEIMOS spectrograph demonstrated a significant increase with redshift in the fraction of line emitting galaxies over the interval 4 < z < 6, particularly for intrinsically faint systems which dominate the luminosity density. Using the longer wavelength sensitivities of LRIS and NIRSPEC, we have targeted 19 Lyman break galaxies selected using recent WFC3/IR data whose photometric redshifts are in the range 6.3 < z < 8.8 and which span a wide range of intrinsic luminosities. Our spectroscopic exposures typically reach a 5-sigma sensitivity of < 50 A for the rest-frame equivalent width (EW) of Lyman alpha emission. Despite the high fraction of emitters seen only a few hundred million years later, we find only 2 convincing and 1 possible line emitter in our more distant sample. Combining with published data on a further 7 sources obtained using FORS2 on the ESO VLT, and assuming continuity in the trends found at lower redshift, we discuss the significance of this apparent reversal in the redshift-dependent Lyman alpha fraction in the context of our range in continuum luminosity. Assuming all the targeted sources are at their photometric redshift and our assumptions about the Lyman alpha EW distribution are correct, we would expect to find so few emitters in less than 1% of the realizations drawn from our lower redshift samples. Our new results provide further support for the suggestion that, at the redshifts now being probed spectroscopically, we are entering the era where the intergalactic medium is partially neutral.
    Comment: 8 pages, 5 figures, Accepted to ApJ 10/1/1
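The statistical argument above (how rarely a random realization drawn from the lower-redshift samples would yield so few detections) reduces, in its simplest form, to a binomial calculation. The sketch below uses an assumed per-galaxy detection probability p and detection count k that are placeholders, not the paper's fitted numbers; only the total of 26 targets (19 Keck + 7 VLT) is taken from the abstract.

```python
# Hedged sketch: probability of observing at most k Ly-alpha emitters among
# n targeted galaxies if each is detected independently with probability p.
# p and k below are illustrative stand-ins for the paper's actual inputs.
from math import comb

def prob_at_most(k, n, p):
    """Binomial CDF: P(X <= k) for X ~ Binomial(n, p)."""
    return sum(comb(n, i) * p**i * (1 - p)**(n - i) for i in range(k + 1))

n, p, k = 26, 0.5, 3   # 19 Keck + 7 VLT targets; assumed p and observed k
print(f"P(<= {k} emitters out of {n}) = {prob_at_most(k, n, p):.2e}")
```

In the paper the per-source probability varies with each galaxy's continuum luminosity and the achieved EW sensitivity, so the real test is a sum over heterogeneous Bernoulli trials rather than a single binomial, but the logic is the same.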

    Hamiltonian structure for dispersive and dissipative dynamical systems

    We develop a Hamiltonian theory of a time dispersive and dissipative inhomogeneous medium, as described by a linear response equation respecting causality and power dissipation. The proposed Hamiltonian couples the given system to auxiliary fields, in the universal form of a so-called canonical heat bath. After integrating out the heat bath the original dissipative evolution is exactly reproduced. Furthermore, we show that the dynamics associated with a minimal Hamiltonian are essentially unique, up to a natural class of isomorphisms. Using this formalism, we obtain closed form expressions for the energy density, energy flux, momentum density, and stress tensor involving the auxiliary fields, from which we derive an approximate, "Brillouin-type," formula for the time averaged energy density and stress tensor associated with an almost monochromatic wave.
    Comment: 68 pages, 1 figure; introduction revised, typos corrected

    Scaling in a continuous time model for biological aging

    In this paper we consider a generalization to the asexual version of the Penna model for biological aging, where we take a continuous time limit. The genotype associated to each individual is an interval of real numbers over which Dirac δ-functions are defined, representing genetically programmed diseases to be switched on at defined ages of the individual's life. We discuss two different continuous limits for the evolution equation and two different mutation protocols, to be implemented during reproduction. Exact stationary solutions are obtained and scaling properties are discussed.
    Comment: 10 pages, 6 figures
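For readers unfamiliar with the discrete model being generalized, the standard bit-string (asexual) Penna model can be sketched as follows: each genome is a bit string, bit a switches on a disease at age a, and an individual dies once the active diseases reach a threshold T. All parameter values below are illustrative, and the Verhulst factor is the usual finite-resources death term.

```python
# Minimal sketch of the discrete asexual Penna model (the starting point of
# the continuous-time generalization). Parameters are illustrative only.
import random

L_GENOME, T, R, M = 32, 3, 8, 1   # genome bits, death threshold, repro age, mutations/birth
random.seed(0)

def step(population, capacity=1000):
    """One synchronous time step: aging, deaths, and reproduction."""
    survivors = []
    n = len(population)
    for genome, age in population:
        age += 1
        if age > L_GENOME:
            continue                      # genome exhausted: dies of old age
        # diseases (set bits) switched on at ages up to the current one
        active = bin(genome & ((1 << age) - 1)).count("1")
        if active >= T:
            continue                      # dies of accumulated genetic diseases
        if random.random() < n / capacity:
            continue                      # Verhulst (finite-resources) death
        survivors.append((genome, age))
        if age >= R:                      # past reproduction age: one offspring
            child = genome
            for _ in range(M):
                child |= 1 << random.randrange(L_GENOME)  # deleterious mutation
            survivors.append((child, 0))
    return survivors

pop = [(0, 0)] * 200                      # start from 200 disease-free newborns
for _ in range(50):
    pop = step(pop)
print(f"population after 50 steps: {len(pop)}")
```

The paper's continuous-time limit replaces the bit string with an interval of real numbers carrying δ-functions, so "bit a activates at age a" becomes "a disease fires at a real-valued age".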

    Reionization after Planck: the derived growth of the cosmic ionizing emissivity now matches the growth of the galaxy UV luminosity density

    Thomson optical depth tau measurements from Planck provide new insights into the reionization of the universe. In pursuit of model-independent constraints on the properties of the ionizing sources, we determine the empirical evolution of the cosmic ionizing emissivity. We use a simple two-parameter model to map out the evolution in the emissivity at z>~6 from the new Planck optical depth tau measurements, from the constraints provided by quasar absorption spectra, and from the prevalence of Ly-alpha emission in z~7-8 galaxies. We find the redshift evolution in the emissivity dot{N}_{ion}(z) required by the observations to be d(log dot{N}_{ion})/dz = -0.15(-0.11)(+0.08), largely independent of the assumed clumping factor C_{HII} and entirely independent of the nature of the ionizing sources. The trend in dot{N}_{ion}(z) is well matched by the evolution of the galaxy UV luminosity density (d(log_{10} rho_UV)/dz = -0.11+/-0.04) to a magnitude limit >~-13 mag, suggesting that galaxies are the sources that drive the reionization of the universe. The role of galaxies is further strengthened by the conversion from the UV luminosity density rho_UV to dot{N}_{ion}(z) being possible for physically plausible values of the escape fraction f_{esc}, the Lyman-continuum photon production efficiency xi_{ion}, and the faint-end cut-off M_{lim} to the luminosity function. Quasars/AGN appear to match neither the redshift evolution nor the normalization of the ionizing emissivity. Based on the inferred evolution in the ionizing emissivity, we estimate that the z~10 UV luminosity density is 8(-4)(+15)x lower than at z~6, consistent with the observations. The present approach of contrasting the inferred evolution of the ionizing emissivity with that of the galaxy UV luminosity density adds to the growing observational evidence that faint, star-forming galaxies drive the reionization of the universe.
    Comment: 20 pages, 12 figures, 5 tables, Astrophysical Journal, updated to match version in press; Figure 6 shows the main result of the paper
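The conversion described above is, schematically, dot{N}_{ion} = f_{esc} * xi_{ion} * rho_UV, most conveniently evaluated in log space. The numerical values below are round illustrative choices in the physically plausible range mentioned in the abstract, not the paper's fitted numbers.

```python
# Sketch of the UV luminosity density -> ionizing emissivity conversion,
# dot{N}_ion = f_esc * xi_ion * rho_UV, with illustrative round values.
import math

f_esc = 0.1          # assumed Lyman-continuum escape fraction
log_xi_ion = 25.3    # log10 photon production efficiency [Hz erg^-1] (assumed)
log_rho_UV = 26.0    # log10 UV luminosity density [erg s^-1 Hz^-1 Mpc^-3] (assumed)

log_N_ion = math.log10(f_esc) + log_xi_ion + log_rho_UV
print(f"log10 dot(N)_ion = {log_N_ion:.2f} [photons s^-1 Mpc^-3]")
```

Because the relation is a product of three factors, the derived emissivity constrains only the combination f_{esc} * xi_{ion} for a given luminosity-function cut-off, which is why the paper frames the result as "possible for physically plausible values" rather than a unique solution.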

    Distribution of particulate matter and tissue remodeling in the human lung.

    We examined the relationship between intrapulmonary particle distribution of carbonaceous and mineral dusts and remodeling of the airways along anatomically distinct airway paths in the lungs of Hispanic males from the central valley of California. Lung autopsy specimens from the Fresno County Coroner's Office were prepared by intratracheal instillation of 2% glutaraldehyde at 30 cm H2O pressure. Two distinct airway paths into the apico-posterior and apico-anterior portions of the left upper lung lobe were followed. Tissue samples for histologic analysis were generally taken from the intrapulmonary second, fourth, sixth, and ninth airway generations. Parenchymal tissues beyond the 12th airway generation of each airway path were also analyzed. There was little evidence of visible particle accumulation in the larger conducting airways (generations 2-6), except in bronchial-associated lymphoid tissues and within peribronchial connective tissue. In contrast, terminal and respiratory bronchioles arising from each pathway revealed varying degrees of wall thickening and remodeling. Walls with marked thickening contained moderate to heavy amounts of carbonaceous and mineral dusts. Wall thickening was associated with increases in collagen and interstitial inflammatory cells, including dust-laden macrophages. These changes were significantly greater in first-generation respiratory bronchioles compared to second- and third-generation respiratory bronchioles. These findings suggest that accumulation of carbonaceous and mineral dust in the lungs is significantly affected by lung anatomy with the greatest retention in centers of lung acini. Furthermore, there is significant remodeling of this transitional zone in humans exposed to ambient particulate matter.

    Memetic Perspectives on the Evolution of Tonal Systems

    Cohn (1996) and Taruskin (1985) consider the increasing prominence during the nineteenth century of harmonic progressions derived from the hexatonic and octatonic pitch collections respectively. This development is clearly evident in music of the third quarter of the century onwards and is a consequence of forces towards non-diatonic organization latent in earlier music. This article conceptualizes such forces as memetic — drawing a distinction between memetic processes in music itself and those in the realm of music theory — and interprets the gradualistic evolution of tonal systems as one of their most significant consequences. After outlining hypotheses for the mechanisms driving such evolution, it identifies a number of ‘musemes’ implicated in hexatonic and octatonic organization in a passage from Mahler’s Symphony no. 10. Pople’s (2002) Tonalities music-analysis software is used to explore the tonal organization of the passage, which is considered in relation to the musemes hypothesized to generate and underpin it.

    Entanglement-free Heisenberg-limited phase estimation

    Measurement underpins all quantitative science. A key example is the measurement of optical phase, used in length metrology and many other applications. Advances in precision measurement have consistently led to important scientific discoveries. At the fundamental level, measurement precision is limited by the number N of quantum resources (such as photons) that are used. Standard measurement schemes, using each resource independently, lead to a phase uncertainty that scales as 1/sqrt(N) - known as the standard quantum limit. However, it has long been conjectured that it should be possible to achieve a precision limited only by the Heisenberg uncertainty principle, dramatically improving the scaling to 1/N. It is commonly thought that achieving this improvement requires the use of exotic quantum entangled states, such as the NOON state. These states are extremely difficult to generate. Measurement schemes with counted photons or ions have been performed with N <= 6, but few have surpassed the standard quantum limit and none have shown Heisenberg-limited scaling. Here we demonstrate experimentally a Heisenberg-limited phase estimation procedure. We replace entangled input states with multiple applications of the phase shift on unentangled single-photon states. We generalize Kitaev's phase estimation algorithm using adaptive measurement theory to achieve a standard deviation scaling at the Heisenberg limit. For the largest number of resources used (N = 378), we estimate an unknown phase with a variance more than 10 dB below the standard quantum limit; achieving this variance would require more than 4,000 resources using standard interferometry. Our results represent a drastic reduction in the complexity of achieving quantum-enhanced measurement precision.
    Comment: Published in Nature. This is the final version
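The gap between the two scalings quoted above can be made concrete: with unit prefactors, the standard quantum limit gives a phase variance ~ 1/N while Heisenberg-limited estimation gives ~ 1/N^2, so the idealized improvement in decibels is 10 log10(N). This sketch ignores the experimental overheads that reduce the real gain (the abstract quotes "more than 10 dB" at N = 378, not the idealized value).

```python
# Idealized comparison of phase-estimation scalings, unit prefactors assumed.
import math

def sql_variance(n):         # standard quantum limit: std ~ 1/sqrt(N), var ~ 1/N
    return 1.0 / n

def heisenberg_variance(n):  # Heisenberg limit: std ~ 1/N, var ~ 1/N^2
    return 1.0 / n**2

N = 378                      # resource count quoted in the abstract
gain_db = 10 * math.log10(sql_variance(N) / heisenberg_variance(N))
print(f"N={N}: idealized Heisenberg variance is {gain_db:.1f} dB below the SQL")
```

The same arithmetic explains the "4,000 resources" comparison: matching a Heisenberg-limited variance of 1/N^2 with SQL scaling 1/N' requires N' ~ N^2 resources, up to the prefactors of the actual experiment.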

    A Memetic Analysis of a Phrase by Beethoven: Calvinian Perspectives on Similarity and Lexicon-Abstraction

    This article discusses some general issues arising from the study of similarity in music, both human-conducted and computer-aided, and then progresses to a consideration of similarity relationships between patterns in a phrase by Beethoven, from the first movement of the Piano Sonata in A flat major op. 110 (1821), and various potential memetic precursors. This analysis is followed by a consideration of how the kinds of similarity identified in the Beethoven phrase might be understood in psychological/conceptual and then neurobiological terms, the latter by means of William Calvin’s Hexagonal Cloning Theory. This theory offers a mechanism for the operation of David Cope’s concept of the lexicon, conceived here as a museme allele-class. I conclude by attempting to correlate and map the various spaces within which memetic replication occurs.