
    Snowmelt-triggered debris flows in seasonal snowpacks

    Snowmelt-triggered debris flows commonly occur in mountains. On 14 June 2019, a debris flow occurred on a steep, east-facing slope composed of unconsolidated glacial and periglacial sediments in Yosemite National Park. Originating as a shallow landslide, ~1,300 m³ of ripe snow was instantaneously entrained into the debris flow, carrying boulders, trees, and soil downslope. The forested area at the toe of the slope strained out debris, leaving a muddy slurry to issue across Highway 120 during dewatering. We document this mass movement and assess its initiation using local snowpack and meteorological data as well as a regional atmospheric reanalysis to examine synoptic conditions. A multiday warming trend and ripening of the snowpack occurred prior to the event as a 500 hPa ridge amplified over western North America, leading to record warm 700 hPa temperatures. Anomalous temperatures and cloud cover prevented refreezing of the snowpack and accelerated its ripening, with meltwater contributing to soil saturation. Similar conditions occurred during the catastrophic 1983 Slide Mountain debris flow, also hypothesized to be snowmelt-initiated. With projected increases in heat waves, our findings can support natural hazard early warning systems in snow-dominated environments.

    Quantum memories at finite temperature

    To use quantum systems for technological applications, one first needs to preserve their coherence for macroscopic time scales, even at finite temperature. Quantum error correction has made it possible to actively correct errors that affect a quantum memory. An attractive scenario is the construction of passive storage of quantum information with minimal active support. Indeed, passive protection is the basis of robust and scalable classical technology, physically realized in the form of the transistor and the ferromagnetic hard disk. The discovery of an analogous quantum system is a challenging open problem, plagued by a variety of no-go theorems. Several approaches have been devised to overcome these theorems by taking advantage of their loopholes. The state-of-the-art developments in this field are reviewed in an informative and pedagogical way. The main principles of self-correcting quantum memories are given, and several milestone examples from the literature of two-, three- and higher-dimensional quantum memories are analyzed.
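
    As a concrete point of reference for the milestone examples mentioned above, the 2D toric code is the standard stabilizer model in this literature; the form below uses the usual notation and is given here only as an illustration, not as a quotation from the review:

        H = -J \sum_v A_v - J \sum_p B_p, \qquad A_v = \prod_{e \in \mathrm{star}(v)} \sigma^x_e, \qquad B_p = \prod_{e \in \partial p} \sigma^z_e

    with one qubit per edge e of a square lattice. All stabilizers A_v and B_p commute, and on a torus the ground space encodes two logical qubits; because the energy barrier for logical errors does not grow with system size, the 2D model is not self-correcting at finite temperature, which is what motivates the three- and higher-dimensional constructions analyzed in this literature.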

    Quantum computing with antiferromagnetic spin clusters

    We show that a wide range of spin clusters with antiferromagnetic intracluster exchange interaction allows one to define a qubit. For these spin cluster qubits, initialization, quantum gate operation, and readout are possible using the same techniques as for single spins. Quantum gate operation for the spin cluster qubit does not require control over the intracluster exchange interaction. Electric and magnetic fields necessary to effect quantum gates need only be controlled on the length scale of the spin cluster rather than that of a single spin. Here, we calculate the energy gap separating the logical qubit states from the next excited state and the matrix elements that determine quantum gate operation times. We discuss spin cluster qubits formed by one- and two-dimensional arrays of s=1/2 spins as well as clusters formed by spins s>1/2. We illustrate the advantages of spin cluster qubits for various suggested implementations of spin qubits and analyze the scaling of decoherence time with spin cluster size.
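
    As a minimal numerical illustration of the gap calculation described above (not the paper's calculation; the chain length and uniform coupling J = 1 are assumptions), one can exactly diagonalize a short antiferromagnetic Heisenberg chain with an odd number of s = 1/2 spins and check that the ground state is a twofold-degenerate doublet, the candidate logical qubit subspace, separated by a finite gap from the next excited state:

        import numpy as np
        from functools import reduce

        # Spin-1/2 operators (Pauli matrices divided by 2)
        sx = np.array([[0, 1], [1, 0]], dtype=complex) / 2
        sy = np.array([[0, -1j], [1j, 0]]) / 2
        sz = np.array([[1, 0], [0, -1]], dtype=complex) / 2
        id2 = np.eye(2)

        def site_op(op, i, n):
            """Embed a single-site operator at site i of an n-spin cluster."""
            ops = [id2] * n
            ops[i] = op
            return reduce(np.kron, ops)

        def heisenberg_chain(n, J=1.0):
            """Open chain, H = J * sum_i S_i . S_{i+1}, antiferromagnetic for J > 0."""
            dim = 2 ** n
            H = np.zeros((dim, dim), dtype=complex)
            for i in range(n - 1):
                for op in (sx, sy, sz):
                    H += J * site_op(op, i, n) @ site_op(op, i + 1, n)
            return H

        # Odd-length chain: the two degenerate ground states form the logical qubit.
        evals = np.linalg.eigvalsh(heisenberg_chain(5))
        print("lowest levels:", np.round(evals[:4], 4))
        print("gap to next excited state:", round(float(evals[2] - evals[0]), 4))

    For five spins with J = 1 this yields a doublet ground state and a gap of order J, consistent with the qualitative picture of a protected two-level logical subspace.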

    Weak lensing, dark matter and dark energy

    Weak gravitational lensing is rapidly becoming one of the principal probes of dark matter and dark energy in the universe. In this brief review we outline how weak lensing helps determine the structure of dark matter halos, measure the expansion rate of the universe, and distinguish between modified gravity and dark energy explanations for the acceleration of the universe. We also discuss requirements on the control of systematic errors, so that the systematics do not appreciably degrade the power of weak lensing as a cosmological probe. (Invited review article for the GRG special issue on gravitational lensing, eds. P. Jetzer, Y. Mellier and V. Perlick.)
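
    One standard way to phrase the lensing test of modified gravity mentioned above (generic notation, not drawn from this review): in the conformal Newtonian gauge the perturbed metric is

        ds^2 = a^2(\tau) \left[ -(1 + 2\Psi)\, d\tau^2 + (1 - 2\Phi)\, d\mathbf{x}^2 \right],

    and weak lensing responds to the combination (\Phi + \Psi)/2, whereas the motions of non-relativistic tracers respond to \Psi alone. In general relativity with negligible anisotropic stress \Phi = \Psi, so a mismatch between lensing masses and dynamical masses would point to modified gravity rather than to a dark-energy component within GR.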

    Cosmology with Gravitational Lensing

    In these lectures I give an overview of gravitational lensing, concentrating on theoretical aspects, including derivations of some of the important results. Topics covered include the determination of surface mass densities of intervening lenses, as well as the statistical analysis of distortions of galaxy images by general inhomogeneities (cosmic shear), both in 2D projection on the sky and in 3D where source distance information is available. 3D mass reconstruction and the shear ratio test are also considered, and the sensitivity of observables to Dark Energy is used to show how its equation of state may be determined using weak lensing. Finally, the article considers the prospect of testing Einstein's General Relativity with weak lensing, exploiting the differences in growth rates of perturbations in different models. (Lectures given at the Como Summer School 2007; published in 'Dark Matter and Dark Energy', 2011, ASSL 370, eds. Matarrese, Colpi, Gorini, Moschella.)
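
    For orientation, the surface-density language used above rests on two standard definitions (generic lensing notation, not a quotation from the lectures):

        \kappa(\boldsymbol{\theta}) = \frac{\Sigma(\boldsymbol{\theta})}{\Sigma_{\rm cr}}, \qquad \Sigma_{\rm cr} = \frac{c^2}{4\pi G} \, \frac{D_s}{D_l D_{ls}},

    where D_l, D_s and D_{ls} are angular-diameter distances to the lens, to the source, and from lens to source. Measuring the dimensionless convergence \kappa from image distortions therefore yields the projected mass density \Sigma of the intervening lens, and the shear ratio test exploits the fact that, for a fixed lens, the signal from sources at different redshifts differs only through these distance ratios, which depend on geometry and hence on the dark-energy equation of state.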

    Getting nowhere fast: a teleological conception of socio-technical acceleration

    It has been frequently recognized that the perceived acceleration of life experienced from the Industrial Revolution onward is engendered, at least in part, by an understanding of speed as an end in itself. There is no equilibrium to be reached – no perfect speed – and as such, social processes are increasingly driven not by rational ends, but by an indeterminate demand for acceleration that both defines and restricts the decisional possibilities of actors. In Aristotelian terms, this is a final cause – i.e. a teleology – of speed: it is not a defined end-point, but rather a purposive aim that predicates the emergence of possibilities. By tracing this notion of telos from its beginnings in ancient Greece, through the ur-empiricism of Francis Bacon, and then to our present epoch, this paper tentatively examines the way in which such a teleology can be theoretically divorced from the idea of historical progress, arguing that the former is premised upon an untenable ontological privileging of becoming.

    Critical thinking for 21st-century education: A cyber-tooth curriculum?

    It is often assumed that the advent of digital technologies requires fundamental change to the curriculum and to the teaching and learning approaches used in schools around the world to educate this generation of “digital natives” or the “net generation”. This article analyses the concepts of 21st-century skills and critical thinking to understand how these aspects of learning might contribute to a 21st-century education. The author argues that, although both critical thinking and 21st-century skills are indeed necessary in a curriculum for a 21st-century education, they are not sufficient, even in combination. The role of knowledge and an understanding of differing cultural perspectives and values indicate that education should also fit local contexts in a global world and meet the specific needs of students in diverse cultures. It should also fit the particular technical and historical demands of the 21st century in relation to digital skills.

    Model Cortical Association Fields Account for the Time Course and Dependence on Target Complexity of Human Contour Perception

    Can lateral connectivity in the primary visual cortex account for the time dependence and intrinsic task difficulty of human contour detection? To answer this question, we created a synthetic image set that prevents sole reliance on either low-level visual features or high-level context for the detection of target objects. Rendered images consist of smoothly varying, globally aligned contour fragments (amoebas) distributed among groups of randomly rotated fragments (clutter). The time course and accuracy of amoeba detection by humans were measured using a two-alternative forced choice protocol with self-reported confidence and variable image presentation time (20-200 ms), followed by an image mask optimized so as to interrupt visual processing. Measured psychometric functions were well fit by sigmoidal functions with exponential time constants of 30-91 ms, depending on amoeba complexity. Key aspects of the psychophysical experiments were accounted for by a computational network model, in which simulated responses across retinotopic arrays of orientation-selective elements were modulated by cortical association fields, represented as multiplicative kernels computed from the differences in pairwise edge statistics between target and distractor images. Comparing the experimental and the computational results suggests that each iteration of the lateral interactions takes at least ms of cortical processing time. Our results provide evidence that cortical association fields between orientation-selective elements in early visual areas can account for important temporal and task-dependent aspects of the psychometric curves characterizing human contour perception, with the remaining discrepancies postulated to arise from the influence of higher cortical areas.
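
    The sort of fit reported above can be sketched with a saturating-exponential psychometric curve; the functional form, onset parameter, and data points below are illustrative assumptions, not the paper's data:

        import numpy as np
        from scipy.optimize import curve_fit

        # Hypothetical 2AFC accuracy vs. presentation time (chance level = 0.5).
        t_ms = np.array([20, 40, 60, 100, 150, 200], dtype=float)
        acc = np.array([0.55, 0.68, 0.78, 0.88, 0.92, 0.94])

        def psychometric(t, tau, t0, ceiling):
            """Rise from chance toward a ceiling with exponential time constant tau."""
            return 0.5 + (ceiling - 0.5) * (1.0 - np.exp(-np.maximum(t - t0, 0.0) / tau))

        (tau, t0, ceiling), _ = curve_fit(psychometric, t_ms, acc, p0=[50.0, 10.0, 0.95])
        print(f"time constant tau = {tau:.1f} ms, onset = {t0:.1f} ms, ceiling = {ceiling:.2f}")

    Fitting curves of this kind separately for each amoeba complexity is one way to obtain complexity-dependent time constants comparable to the 30-91 ms range quoted above.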

    Measurement-induced entanglement and teleportation on a noisy quantum processor

    Measurement has a special role in quantum theory: by collapsing the wavefunction it can enable phenomena such as teleportation and thereby alter the "arrow of time" that constrains unitary evolution. When integrated into many-body dynamics, measurements can lead to emergent patterns of quantum information in space-time that go beyond established paradigms for characterizing phases, either in or out of equilibrium. On present-day NISQ processors, the experimental realization of this physics is challenging due to noise, hardware limitations, and the stochastic nature of quantum measurement. Here we address each of these experimental challenges and investigate measurement-induced quantum information phases on up to 70 superconducting qubits. By leveraging the interchangeability of space and time, we use a duality mapping to avoid mid-circuit measurement and access different manifestations of the underlying phases -- from entanglement scaling to measurement-induced teleportation -- in a unified way. We obtain finite-size signatures of a phase transition with a decoding protocol that correlates the experimental measurement record with classical simulation data. The phases display sharply different sensitivity to noise, which we exploit to turn an inherent hardware limitation into a useful diagnostic. Our work demonstrates an approach to realizing measurement-induced physics at scales that are at the limits of current NISQ processors.
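
    The role that measurement plays in enabling teleportation can be seen already in the textbook single-qubit protocol; the NumPy sketch below is a conceptual illustration only (nothing here is taken from the 70-qubit experiment or its duality mapping):

        import numpy as np

        rng = np.random.default_rng(0)
        I2 = np.eye(2, dtype=complex)
        X = np.array([[0, 1], [1, 0]], dtype=complex)
        Z = np.array([[1, 0], [0, -1]], dtype=complex)
        H = np.array([[1, 1], [1, -1]], dtype=complex) / np.sqrt(2)

        def kron(*ops):
            out = np.array([[1.0 + 0j]])
            for op in ops:
                out = np.kron(out, op)
            return out

        def cnot(c, t, n=3):
            """Permutation matrix for a CNOT with control c and target t (qubit 0 = MSB)."""
            U = np.zeros((2 ** n, 2 ** n), dtype=complex)
            for b in range(2 ** n):
                bits = [(b >> (n - 1 - k)) & 1 for k in range(n)]
                if bits[c]:
                    bits[t] ^= 1
                U[int("".join(map(str, bits)), 2), b] = 1.0
            return U

        # Random state to teleport on qubit 0; qubits 1 and 2 start in |0>.
        psi = rng.normal(size=2) + 1j * rng.normal(size=2)
        psi /= np.linalg.norm(psi)
        e0 = np.array([[1.0], [0.0]], dtype=complex)
        state = kron(psi.reshape(2, 1), e0, e0).ravel()

        state = cnot(1, 2) @ kron(I2, H, I2) @ state   # Bell pair on qubits 1, 2
        state = kron(H, I2, I2) @ cnot(0, 1) @ state   # Alice's basis change

        # Projective measurement of qubits 0 and 1, sampled from the Born rule.
        outcome = rng.choice(8, p=np.abs(state) ** 2)
        m0, m1 = (outcome >> 2) & 1, (outcome >> 1) & 1
        keep = [((b >> 2) & 1) == m0 and ((b >> 1) & 1) == m1 for b in range(8)]
        post = np.where(keep, state, 0)
        post /= np.linalg.norm(post)

        # Bob's classically conditioned correction recovers the input state on qubit 2.
        post = kron(I2, I2, np.linalg.matrix_power(Z, m0) @ np.linalg.matrix_power(X, m1)) @ post
        bob = post.reshape(2, 2, 2)[m0, m1, :]
        print("teleportation fidelity:", round(abs(np.vdot(psi, bob)) ** 2, 6))

    The collapse step is what makes the correlations usable: without the classical record (m0, m1), Bob's qubit is maximally mixed, and it is this interplay of measurement, classical information, and entanglement that the experiment probes at scale.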