
    Slow, Continuous Beams of Large Gas Phase Molecules

    Cold, continuous, high-flux beams of benzonitrile, fluorobenzene, and anisole have been created. Buffer-gas cooling with a cryogenic gas provides the cooling and slow forward beam velocities. The beam of benzonitrile was measured to have a forward velocity peaked at 67 ± 5 m s^{-1} and a continuous flux of 10^{15} molecules s^{-1}. These beams provide a continuous source for high-resolution spectroscopy and an attractive starting point for further spatial manipulation of such molecules, including eventual trapping.
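    For orientation only (the abstract does not state the cell temperature or the buffer gas, so a cell at roughly 5 K and benzonitrile's mass of about 103 u are assumed here), a back-of-the-envelope thermal-speed estimate shows why a forward velocity of 67 m s^{-1} counts as slow for a molecule this heavy:

    \[
    v_p = \sqrt{\frac{2 k_B T}{m}} \approx \sqrt{\frac{2\,(1.38\times10^{-23}\ \mathrm{J\,K^{-1}})\,(5\ \mathrm{K})}{1.7\times10^{-25}\ \mathrm{kg}}} \approx 28\ \mathrm{m\,s^{-1}},
    \]

    compared with roughly 220 m s^{-1} at room temperature. The measured forward velocity typically sits above the in-cell thermal speed because the extracted beam is boosted part way toward the flow velocity of the lighter, faster buffer gas.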

    Quantum Uncertainty Considerations for Gravitational Lens Interferometry

    The measurement of the gravitational lens delay time between light paths has relied, to date, on the source having sufficient variability to allow photometric variations from each path to be compared. However, the delay times of many gravitational lenses cannot be measured because the intrinsic source amplitude variations are too small to be detectable. At the fundamental quantum mechanical level, such photometric time stamps allow which-path knowledge, removing the ability to obtain an interference pattern. However, if the two paths can be made equal (zero time delay) then interference can occur. We describe an interferometric approach to measuring gravitational lens delay times using a quantum-eraser/restorer approach, whereby the travel times along the two paths may be rendered measurably equal. Energy and time being non-commuting observables, constraints on the photon energy in the energy-time uncertainty principle, via adjustments of the width of the radio bandpass, dictate the uncertainty of the time delay and therefore whether the path taken along one or the other gravitational lens geodesic is knowable. If one starts with interference, for example, which-path information returns when the bandpass is broadened (constraints on the energy are relaxed) to the point where the uncertainty principle allows a knowledge of the arrival time to better than the gravitational lens delay time itself, at which point the interference will disappear. We discuss the near-term feasibility of such measurements in light of current narrow-band radio detectors and known short time-delay gravitational lenses.
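    A sketch of the scaling the abstract invokes, in notation that is not the paper's: with photon energy E = hν, the energy-time uncertainty relation ties the bandpass width Δν to the arrival-time uncertainty Δt,

    \[
    \Delta E\,\Delta t \gtrsim \frac{\hbar}{2}, \qquad \Delta E = h\,\Delta\nu \;\Rightarrow\; \Delta\nu\,\Delta t \gtrsim \frac{1}{4\pi},
    \]

    so a bandpass narrow enough that Δt exceeds the lens delay τ forbids which-path knowledge and permits interference, while broadening the bandpass until Δt ≪ τ restores which-path information and erases the fringes. As an illustrative number (not one from the paper), τ = 1 ms would require Δν ≲ 1/(4πτ) ≈ 80 Hz.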

    Contrasting Views of Complexity and Their Implications For Network-Centric Infrastructures

    There exists a widely recognized need to better understand and manage complex “systems of systems,” ranging from biology, ecology, and medicine to network-centric technologies. This is motivating the search for universal laws of highly evolved systems and driving demand for new mathematics and methods that are consistent, integrative, and predictive. However, the theoretical frameworks available today are not merely fragmented but sometimes contradictory and incompatible. We argue that complexity arises in highly evolved biological and technological systems primarily to provide mechanisms to create robustness. Yet this complexity itself can be a source of new fragility, leading to “robust yet fragile” tradeoffs in system design. We focus on the role of robustness and architecture in networked infrastructures, and we highlight recent advances in the theory of distributed control driven by network technologies. This view of complexity in highly organized technological and biological systems is fundamentally different from the dominant perspective in the mainstream sciences, which downplays function, constraints, and tradeoffs, and tends to minimize the role of organization and design.

    Design degrees of freedom and mechanisms for complexity

    We develop a discrete spectrum of percolation forest fire models characterized by increasing design degrees of freedom (DDOFs). The DDOFs are tuned to optimize the yield of trees after a single spark. In the limit of a single DDOF, the model is tuned to the critical density. Additional DDOFs allow for increasingly refined spatial patterns, associated with the cellular structures seen in highly optimized tolerance (HOT). The spectrum of models provides a clear illustration of the contrast between criticality and HOT, as well as a concrete quantitative example of how a sequence of robustness tradeoffs naturally arises when increasingly complex systems are developed through additional layers of design. Such tradeoffs are familiar in engineering and biology and are a central aspect of the complex systems that can be characterized as HOT.
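    A minimal sketch of the single-DDOF case described in the abstract, under assumptions that are mine rather than the paper's (a square lattice, i.i.d. planting at density rho, one uniformly random spark, and toy lattice sizes and trial counts): the spark burns the connected cluster it lands in, and the only design knob is the planting density.

```python
import numpy as np
from scipy.ndimage import label

def expected_yield(rho, n=64, trials=200, rng=None):
    """Average surviving-tree density after one random spark on an n x n lattice."""
    rng = np.random.default_rng() if rng is None else rng
    total = 0.0
    for _ in range(trials):
        forest = rng.random((n, n)) < rho            # plant trees i.i.d. at density rho
        clusters, _ = label(forest)                  # 4-connected clusters of trees
        i, j = rng.integers(n), rng.integers(n)      # spark hits one random site
        burned = np.count_nonzero(clusters == clusters[i, j]) if forest[i, j] else 0
        total += (forest.sum() - burned) / n**2
    return total / trials

# Single design degree of freedom: sweep the planting density and keep the best one.
densities = np.linspace(0.1, 0.9, 17)
yields = [expected_yield(rho) for rho in densities]
print(f"best density ~ {densities[int(np.argmax(yields))]:.2f}")
```

    In this baseline the optimum sits near the critical density, as the abstract notes for the single-DDOF limit; the additional DDOFs and the resulting cellular firebreak patterns of the HOT models are not reproduced in this sketch.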

    Attitudes, Ideological Associations and the Left–Right Divide in Latin America

    Do Latin American citizens share a common conception of the ideological left–right distinction? And if so, is this conception linked to individuals’ ideological self-placement? Selecting questions from the 2006 Latinobarómetro survey based on a core definition of the left–right divide rooted in political theory and philosophy, this paper addresses these questions. We apply joint correspondence analysis to explore whether citizens who relate to the same ideological identification also share similar and coherent convictions and beliefs that reflect the ideological content of the left–right distinction. Our analysis indicates that theoretical conceptions about the roots of, and responsibility for, inequality in society, together with the translation of these beliefs into attitudes regarding the state versus market divide, distinguish those who self-identify with the left from those who self-identify with the right.
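    For readers unfamiliar with the method family, a minimal sketch of plain (simple) correspondence analysis via SVD on a hypothetical self-placement-by-attitude contingency table; the joint correspondence analysis applied in the paper to multiple Latinobarómetro items is a refinement of this idea, and none of the numbers below come from the survey.

```python
import numpy as np

def correspondence_analysis(table):
    """Row/column principal coordinates of a contingency table (plain CA via SVD)."""
    P = table / table.sum()                      # correspondence matrix
    r = P.sum(axis=1, keepdims=True)             # row masses
    c = P.sum(axis=0, keepdims=True)             # column masses
    S = (P - r @ c) / np.sqrt(r @ c)             # standardized residuals
    U, s, Vt = np.linalg.svd(S, full_matrices=False)
    rows = U * s / np.sqrt(r)                    # row principal coordinates
    cols = Vt.T * s / np.sqrt(c).T               # column principal coordinates
    return rows, cols, s**2                      # coordinates and principal inertias

# Hypothetical counts: left/centre/right self-placement x state-vs-market attitude.
table = np.array([[120, 60, 20],
                  [ 70, 90, 40],
                  [ 25, 55, 110]], dtype=float)
rows, cols, inertia = correspondence_analysis(table)
print(inertia / inertia.sum())                   # share of association per dimension
```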

    More "normal" than normal: scaling distributions and complex systems

    One feature of many naturally occurring or engineered complex systems is tremendous variability in event sizes. To account for it, the behavior of these systems is often described using power law relationships or scaling distributions, which tend to be viewed as "exotic" because of their unusual properties (e.g., infinite moments). An alternate view is based on mathematical, statistical, and data-analytic arguments and suggests that scaling distributions should be viewed as "more normal than normal". In support of this latter view, which Mandelbrot has advocated for the last 40 years, we review in this paper some relevant results from probability theory and illustrate a powerful statistical approach for deciding whether the variability associated with observed event sizes is consistent with an underlying Gaussian-type (finite variance) or scaling-type (infinite variance) distribution. We contrast this approach with traditional model fitting techniques and discuss its implications for future modeling of complex systems.
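    One common tail diagnostic for the finite- versus infinite-variance question is the Hill estimator sketched below; it is offered only as an illustration of the distinction, not as the specific statistical approach the paper advocates. An estimated tail index below 2 is consistent with a scaling-type (infinite-variance) distribution, whereas light-tailed data yield larger, k-dependent estimates.

```python
import numpy as np

def hill_tail_index(x, k):
    """Hill estimator of the tail index alpha from the k largest observations."""
    xs = np.sort(np.asarray(x, dtype=float))[::-1]
    return k / np.sum(np.log(xs[:k] / xs[k]))

rng = np.random.default_rng(0)
pareto = rng.pareto(1.5, 10_000) + 1.0           # scaling-type tail, alpha = 1.5 (< 2)
lognorm = rng.lognormal(0.0, 1.0, 10_000)        # finite-variance comparison sample

k = 500                                          # number of upper order statistics used
print("pareto  alpha-hat:", hill_tail_index(pareto, k))   # expect ~1.5
print("lognorm alpha-hat:", hill_tail_index(lognorm, k))  # larger (above 2), drifts with k
```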

    Understanding Internet topology: principles, models, and validation

    Building on a recent effort that combines a first-principles approach to modeling router-level connectivity with a more pragmatic use of statistics and graph theory, we show in this paper that an improved understanding of the Internet's physical infrastructure is possible by viewing the physical connectivity as an annotated graph that delivers raw connectivity and bandwidth to the upper layers in the TCP/IP protocol stack, subject to practical constraints (e.g., router technology) and economic considerations (e.g., link costs). More importantly, by relying on data from Abilene, a Tier-1 ISP, and the Rocketfuel project, we provide empirical evidence in support of the proposed approach and its consistency with networking reality. To illustrate its utility, we: 1) show that our approach provides insight into the origin of high variability in measured or inferred router-level maps; 2) demonstrate that it easily accommodates the incorporation of additional objectives of network design (e.g., robustness to router failure); and 3) discuss how it complements ongoing community efforts to reverse-engineer the Internet.
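    A minimal sketch of the "annotated graph" view, assuming networkx and made-up routers, link bandwidths, and per-router capacity limits (none of these values come from the paper or from the Abilene/Rocketfuel data): each edge carries a bandwidth annotation, and a simple feasibility check stands in for a router-technology constraint.

```python
import networkx as nx

# Toy router-level topology: edges annotated with link bandwidth (Gb/s).
G = nx.Graph()
G.add_edge("edge1", "core1", bandwidth=1)
G.add_edge("edge2", "core1", bandwidth=1)
G.add_edge("core1", "core2", bandwidth=10)
G.add_edge("edge3", "core2", bandwidth=1)

# Hypothetical router-technology constraint: total attached bandwidth per router.
capacity = {"edge1": 2, "edge2": 2, "edge3": 2, "core1": 20, "core2": 20}

def feasible(graph, cap):
    """Check that each router's attached link bandwidth fits its switching capacity."""
    for node in graph.nodes:
        load = sum(data["bandwidth"] for _, _, data in graph.edges(node, data=True))
        if load > cap[node]:
            return False
    return True

print(feasible(G, capacity))   # True for this toy configuration
```

    Validation in the sense of the paper would replace these toy values with measured Abilene/Rocketfuel topologies and realistic router capacity and link-cost figures.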