    Attitudes, Ideological Associations and the Left–Right Divide in Latin America

    Do Latin American citizens share a common conception of the ideological left–right distinction? And if so, is this conception linked to individuals’ ideological self-placement? Selecting questions from the 2006 Latinobarómetro survey based on a core definition of the left–right divide rooted in political theory and philosophy, this paper addresses these questions. We apply joint correspondence analysis to explore whether citizens who relate to the same ideological identification also share similar and coherent convictions and beliefs that reflect the ideological content of the left–right distinction. Our analysis indicates that theoretical conceptions about the roots of, and responsibility for, inequality in society, together with the translation of these beliefs into attitudes regarding the state-versus-market divide, distinguish those who self-identify with the left from those who self-identify with the right.

    Slow, Continuous Beams of Large Gas Phase Molecules

    Cold, continuous, high-flux beams of benzonitrile, fluorobenzene, and anisole have been created. Buffer-gas cooling with a cryogenic gas provides the cooling and slow forward beam velocities. The beam of benzonitrile was measured to have a forward velocity peaked at 67 ± 5 m s^-1 and a continuous flux of 10^15 molecules s^-1. These beams provide a continuous source for high-resolution spectroscopy, and an attractive starting point for further spatial manipulation of such molecules, including eventual trapping.

    Contrasting Views of Complexity and Their Implications For Network-Centric Infrastructures

    There exists a widely recognized need to better understand and manage complex “systems of systems,” ranging from biology, ecology, and medicine to network-centric technologies. This is motivating the search for universal laws of highly evolved systems and driving demand for new mathematics and methods that are consistent, integrative, and predictive. However, the theoretical frameworks available today are not merely fragmented but sometimes contradictory and incompatible. We argue that complexity arises in highly evolved biological and technological systems primarily to provide mechanisms to create robustness. However, this complexity itself can be a source of new fragility, leading to “robust yet fragile” tradeoffs in system design. We focus on the role of robustness and architecture in networked infrastructures, and we highlight recent advances in the theory of distributed control driven by network technologies. This view of complexity in highly organized technological and biological systems is fundamentally different from the dominant perspective in the mainstream sciences, which downplays function, constraints, and tradeoffs, and tends to minimize the role of organization and design.

    Quantum Uncertainty Considerations for Gravitational Lens Interferometry

    The measurement of the gravitational lens delay time between light paths has relied, to date, on the source having sufficient variability to allow photometric variations from each path to be compared. However, the delay times of many gravitational lenses cannot be measured because the intrinsic source amplitude variations are too small to be detectable. At the fundamental quantum mechanical level, such photometric time stamps allow which-path knowledge, removing the ability to obtain an interference pattern. However, if the two paths can be made equal (zero time delay), then interference can occur. We describe an interferometric approach to measuring gravitational lens delay times using a quantum-eraser/restorer approach, whereby the light travel times along the two paths may be rendered measurably equal. Energy and time being non-commuting observables, constraints on the photon energy in the energy-time uncertainty principle, via adjustments of the width of the radio bandpass, dictate the uncertainty of the time delay and therefore whether the path taken along one or the other gravitational lens geodesic is knowable. If one starts with interference, for example, which-path information returns when the bandpass is broadened (constraints on the energy are relaxed) to the point where the uncertainty principle allows a knowledge of the arrival time to better than the gravitational lens delay time itself, at which point the interference will disappear. We discuss the near-term feasibility of such measurements in light of current narrow-band radio detectors and known short time-delay gravitational lenses.
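    The bandpass criterion described in the abstract can be illustrated with a back-of-the-envelope calculation. The function name and the delay values below are illustrative assumptions, not figures from the paper; the sketch only encodes the order-of-magnitude uncertainty relation that a bandpass of width Δν localizes photon arrival times no better than ~1/Δν.

    ```python
    # Sketch, not the authors' method: for a radio bandpass of width dnu (Hz),
    # a photon's arrival time can only be localized to about 1/dnu seconds
    # (Fourier / energy-time uncertainty limit). Which-path information stays
    # hidden -- and interference remains possible -- only while 1/dnu exceeds
    # the lens time delay tau.

    def max_bandpass_hz(tau_delay_s: float) -> float:
        """Widest bandpass (Hz) that still hides arrival-time (which-path)
        information for a lens delay tau, via the order-of-magnitude
        criterion dnu * tau <~ 1."""
        return 1.0 / tau_delay_s

    # Illustrative (hypothetical) delays, not measured lens values:
    for tau in (1e-3, 1.0, 86400.0):  # 1 ms, 1 s, 1 day
        print(f"tau = {tau:g} s  ->  dnu <~ {max_bandpass_hz(tau):.3g} Hz")
    ```

    The trend is the point: the shorter the lens delay, the wider the bandpass that still permits interference, which is why known short time-delay lenses are the attractive first targets.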

    Design degrees of freedom and mechanisms for complexity

    We develop a discrete spectrum of percolation forest fire models characterized by increasing design degrees of freedom (DDOF’s). The DDOF’s are tuned to optimize the yield of trees after a single spark. In the limit of a single DDOF, the model is tuned to the critical density. Additional DDOF’s allow for increasingly refined spatial patterns, associated with the cellular structures seen in highly optimized tolerance (HOT). The spectrum of models provides a clear illustration of the contrast between criticality and HOT, as well as a concrete quantitative example of how a sequence of robustness tradeoffs naturally arises when increasingly complex systems are developed through additional layers of design. Such tradeoffs are familiar in engineering and biology and are a central aspect of the complex systems that can be characterized as HOT.
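    A minimal sketch of the single-DDOF case (density tuning only) conveys the basic tradeoff. This is a toy model of our own construction, not the paper's code: all parameters (lattice size, trial count, density grid) are chosen for illustration. It plants trees at density p on a lattice, burns the connected cluster hit by one random spark, and sweeps p for the best expected yield, which for a random spark lands in the vicinity of the 2-D site percolation threshold (~0.59).

    ```python
    import random
    from collections import deque

    def yield_after_spark(p, n=40, trials=20, seed=0):
        """Expected fraction of the n x n lattice still covered by trees
        after one uniformly random spark, at planting density p.
        Fire spreads to 4-connected neighboring trees."""
        rng = random.Random(seed)
        total = 0.0
        for _ in range(trials):
            grid = [[rng.random() < p for _ in range(n)] for _ in range(n)]
            trees = sum(cell for row in grid for cell in row)
            i, j = rng.randrange(n), rng.randrange(n)
            burned = 0
            if grid[i][j]:  # spark only ignites if it lands on a tree
                q = deque([(i, j)])
                grid[i][j] = False
                burned = 1
                while q:  # breadth-first burn of the connected cluster
                    x, y = q.popleft()
                    for dx, dy in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                        u, v = x + dx, y + dy
                        if 0 <= u < n and 0 <= v < n and grid[u][v]:
                            grid[u][v] = False
                            burned += 1
                            q.append((u, v))
            total += (trees - burned) / (n * n)
        return total / trials

    # Sweep density: yield first rises with p, then collapses once
    # clusters span the lattice and a single spark burns a large fraction.
    best_p = max((p / 100 for p in range(5, 100, 5)), key=yield_after_spark)
    print(best_p)
    ```

    Adding further DDOF's, in the paper's terms, would correspond to designing firebreak patterns rather than tuning a single global density.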

    Mathematics and the Internet: A Source of Enormous Confusion and Great Potential

    Graph theory models the Internet mathematically, and a number of plausible mathematical network models for the Internet have been developed and studied. Simultaneously, Internet researchers have developed methodology to use real data to validate, or invalidate, proposed Internet models. The authors look at these parallel developments, particularly as they apply to scale-free network models of the preferential attachment type.
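    To make the "preferential attachment type" concrete, the following self-contained sketch (function name and parameters are ours, not the authors') grows a graph in which each new node attaches to existing nodes with probability proportional to their current degree, producing the heavy-tailed degree distribution that scale-free models predict.

    ```python
    import random

    def preferential_attachment(n, m, seed=0):
        """Grow an n-node graph Barabasi-Albert style: each new node adds
        m edges to distinct existing nodes chosen with probability
        proportional to current degree. Returns the edge list."""
        rng = random.Random(seed)
        # Start from a complete seed graph on m + 1 nodes.
        edges = [(i, j) for i in range(m + 1) for j in range(i)]
        # Degree-weighted pool: each node appears once per incident edge,
        # so uniform choice from it is degree-proportional choice.
        targets = [v for e in edges for v in e]
        for new in range(m + 1, n):
            chosen = set()
            while len(chosen) < m:
                chosen.add(rng.choice(targets))
            for t in chosen:
                edges.append((new, t))
                targets.extend((new, t))
        return edges

    edges = preferential_attachment(1000, 2)
    degree = {}
    for u, v in edges:
        degree[u] = degree.get(u, 0) + 1
        degree[v] = degree.get(v, 0) + 1
    # Heavy tail: the maximum degree far exceeds the mean (about 4 here).
    print(max(degree.values()), sum(degree.values()) / len(degree))
    ```

    The validation question the authors raise is precisely whether graphs like this one, despite matching a measured degree distribution, resemble the real router-level Internet in any structurally meaningful way.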

    The magnitude distribution of earthquakes near Southern California faults

    We investigate seismicity near faults in the Southern California Earthquake Center Community Fault Model. We search for anomalously large events that might be signs of a characteristic earthquake distribution. We find that seismicity near major fault zones in Southern California is well modeled by a Gutenberg-Richter distribution, with no evidence of characteristic earthquakes within the resolution limits of the modern instrumental catalog. However, the b value of the locally observed magnitude distribution is found to depend on distance to the nearest mapped fault segment, which suggests that earthquakes nucleating near major faults are likely to have larger magnitudes than earthquakes nucleating far from major faults.
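    The b value discussed above can be estimated from a catalog with the standard Aki maximum-likelihood formula, b = log10(e) / (mean(M) - M_min). The sketch below (function name and parameters are illustrative, not from the paper) applies it to a synthetic Gutenberg-Richter catalog.

    ```python
    import math
    import random

    def b_value_mle(magnitudes, m_min):
        """Aki (1965) maximum-likelihood estimate of the Gutenberg-Richter
        b value, using only events at or above the completeness magnitude
        m_min: b = log10(e) / (mean(M) - m_min)."""
        mags = [m for m in magnitudes if m >= m_min]
        mean_excess = sum(m - m_min for m in mags) / len(mags)
        return math.log10(math.e) / mean_excess

    # Synthetic catalog with b = 1.0: under Gutenberg-Richter,
    # P(M >= m) is proportional to 10^(-b (m - m_min)), i.e. M - m_min
    # is exponential with rate b * ln(10).
    random.seed(0)
    b_true, m_min = 1.0, 2.0
    catalog = [m_min + random.expovariate(b_true * math.log(10))
               for _ in range(50_000)]

    b_hat = b_value_mle(catalog, m_min)
    print(b_hat)  # recovers a value close to b_true = 1.0
    ```

    A lower b value near faults means relatively more large events there, which is the sense in which earthquakes nucleating near major faults tend toward larger magnitudes.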