Measurement and Particle Statistics in the Szilard Engine
A Szilard Engine is a hypothetical device which is able to extract work from
a single thermal reservoir by measuring the position of particles within the
engine. We derive the amount of work that can be extracted from such a device
in the low temperature limit. Interestingly, we show this work is determined by
the information gain of the initial measurement rather than by the number and
type of particles which constitute the working substance. Our work provides
another clear connection between information gain and extractable work in
thermodynamic processes.
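The information–work connection this abstract invokes can be illustrated with the textbook single-particle case. As a sketch (the standard result, not necessarily the paper's derivation): a binary position measurement yields ln 2 nats of information, and the generalized second law bounds extractable work by the measurement's mutual information.

```latex
% One-particle Szilard engine at temperature T: locating the particle
% in one half of the box yields I = \ln 2 nats of information, and the
% subsequent isothermal expansion extracts at most
W_{\max} = k_B T \ln 2 .
% More generally, for a measurement with mutual information I between
% memory and engine, the extractable work is bounded by
W_{\mathrm{ext}} \le k_B T \, I .
```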
Classifying Alarms: Seeking Durability, Credibility, Consistency, and Simplicity
Alongside the development and testing of new audible alarms intended to support International Electrotechnical Commission 60601-1-8, a global standard concerned with alarm safety, the categories of risk that the standard denotes require further thought and possible updating. In this article, we revisit the origins of the categories covered by the standard. These categories were based on the ways that tissue damage can be caused. We consider these categories from the varied professional perspectives of the authors: human factors, semiotics, clinical practice, and the patient or family (layperson). We conclude that while the categories possess many clinically applicable and defensible features from our range of perspectives, the advances in alarm design now available may allow a more flexible approach. We present a three-tier system with superordinate, basic, and subordinate levels that fits both within the thinking embodied in the current standard and within possible new developments.
Complete quantum teleportation using nuclear magnetic resonance
Quantum mechanics provides spectacular new information processing abilities
(Bennett 1995, Preskill 1998). One of the most unexpected is a procedure called
quantum teleportation (Bennett et al 1993) that allows the quantum state of a
system to be transported from one location to another, without moving through
the intervening space. Partial implementations of teleportation (Bouwmeester et
al 1997, Boschi et al 1998) over macroscopic distances have been achieved using
optical systems, but omit the final stage of the teleportation procedure. Here
we report an experimental implementation of the full quantum teleportation
operation over inter-atomic distances using liquid state nuclear magnetic
resonance (NMR). The inclusion of the final stage enables for the first time a
teleportation implementation which may be used as a subroutine in larger
quantum computations, or for quantum communication. Our experiment also
demonstrates the use of quantum process tomography, a procedure to completely
characterize the dynamics of a quantum system. Finally, we demonstrate a
controlled exploitation of decoherence as a tool to assist in the performance
of an experiment.
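The "full quantum teleportation operation" described above can be sketched numerically. The following is a minimal state-vector simulation of the textbook protocol with measurements deferred to controlled corrections; it illustrates the circuit only and is not the NMR pulse sequence reported in the paper.

```python
import numpy as np

def apply_gate(state, gate, targets, n=3):
    """Apply a 2^k x 2^k unitary `gate` to the qubits listed in `targets`."""
    k = len(targets)
    psi = state.reshape([2] * n)
    psi = np.moveaxis(psi, targets, range(k))
    psi = np.tensordot(gate.reshape([2] * (2 * k)), psi,
                       axes=(list(range(k, 2 * k)), list(range(k))))
    psi = np.moveaxis(psi, range(k), targets)
    return psi.reshape(-1)

H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)
CNOT = np.array([[1, 0, 0, 0], [0, 1, 0, 0],
                 [0, 0, 0, 1], [0, 0, 1, 0]], float)
CZ = np.diag([1.0, 1.0, 1.0, -1.0])

def teleport(psi_in):
    """Teleport the 1-qubit state psi_in from qubit 0 to qubit 2."""
    e0 = np.array([1.0, 0.0])
    state = np.kron(psi_in, np.kron(e0, e0))   # qubits (0, 1, 2)
    state = apply_gate(state, H, [1])          # Bell pair on qubits 1, 2
    state = apply_gate(state, CNOT, [1, 2])
    state = apply_gate(state, CNOT, [0, 1])    # rotate into Bell basis
    state = apply_gate(state, H, [0])
    state = apply_gate(state, CNOT, [1, 2])    # deferred X correction
    state = apply_gate(state, CZ, [0, 2])      # deferred Z correction
    # qubit 2 now carries psi_in; qubits 0 and 1 are each left in |+>
    return state.reshape(2, 2, 2)[0, 0, :] * 2
```

Deferring the measurements (replacing classically controlled X and Z corrections with controlled gates) keeps the simulation deterministic while producing the same output state on qubit 2.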
River channel width controls blocking by slow-moving landslides in California's Franciscan mélange
This is the final version. Available from European Geosciences Union via the DOI in this record. To explore the sensitivity of rivers to blocking from landslide debris, we exploit two similar geomorphic settings in California's Franciscan mélange where slow-moving landslides, often referred to as earthflows, impinge on river channels with drainage areas that differ by a factor of 30. Analysis of valley widths and river long profiles over ∼19 km of Alameda Creek (185 km2 drainage area) and Arroyo Hondo (200 km2 drainage area) in central California shows a very consistent picture in which earthflows that intersect these channels force tens of meters of gravel aggradation for kilometers upstream, leading to apparently long-lived sediment storage and channel burial at these sites. In contrast, over a ∼30 km section of the Eel River (5547 km2 drainage area), there are no knickpoints or aggradation upstream of locations where earthflows impinge on its channel. Hydraulic and hydrologic data from United States Geological Survey (USGS) gages on Arroyo Hondo and the Eel River, combined with measured size distributions of boulders input by landslides for both locations, suggest that landslide-derived boulders are not mobile at either site during the largest floods (>2-year recurrence) with field-measured flow depths. We therefore argue that boulder transport capacity is an unlikely explanation for the observed difference in sensitivity to landslide inputs. At the same time, we find that earthflow fluxes per unit channel width are nearly identical for Oak Ridge earthflow on Arroyo Hondo, where evidence for blocking is clear, and for the Boulder Creek earthflow on the Eel River, where evidence for blocking is absent. These observations suggest that boulder supply is also an unlikely explanation for the observed morphological differences along the two rivers.
Instead, we argue that the dramatically different sensitivity of the two locations to landslide blocking is related to differences in channel width relative to typical seasonal displacements of earthflows. A synthesis of seasonal earthflow displacements in the Franciscan mélange shows that the channel width of the Eel River is ∼5 times larger than the largest annual seasonal displacement. In contrast, during wet winters, earthflows are capable of crossing the entire channel width of Arroyo Hondo and Alameda Creek. In support of this interpretation, satellite imagery shows that immobile earthflow-derived boulders are generally confined to the edges of the channel on the Eel River. By contrast, immobile earthflow-derived boulders jam the entire channel on Arroyo Hondo. Our results imply that lower-drainage-area reaches of earthflow-dominated catchments may be particularly prone to blocking. By inhibiting the upstream propagation of base-level signals, valley-blocking earthflows may therefore promote the formation of so-called “relict topography”.
Funding: National Science Foundation
Quantum Computing with Very Noisy Devices
In theory, quantum computers can efficiently simulate quantum physics, factor
large numbers and estimate integrals, thus solving otherwise intractable
computational problems. In practice, quantum computers must operate with noisy
devices called "gates" that tend to destroy the fragile quantum states needed
for computation. The goal of fault-tolerant quantum computing is to compute
accurately even when gates have a high probability of error each time they are
used. Here we give evidence that accurate quantum computing is possible with
error probabilities above 3% per gate, which is significantly higher than what
was previously thought possible. However, the resources required for computing
at such high error probabilities are excessive. Fortunately, they decrease
rapidly with decreasing error probabilities. If we had quantum resources
comparable to the considerable resources available in today's digital
computers, we could implement non-trivial quantum computations at error
probabilities as high as 1% per gate.
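Why per-gate error rates matter so much can be seen with a back-of-envelope sketch. The two functions below illustrate the general mechanism only: the constant `c` (which implies a 10% threshold) is an assumption chosen for illustration, not a number taken from the paper's analysis.

```python
# Without error correction, a circuit of n gates, each failing
# independently with probability p, succeeds with probability (1 - p)^n.
def bare_success(p: float, n: int) -> float:
    return (1.0 - p) ** n

# A standard caricature of concatenated error correction: one level of
# encoding maps a physical error rate p to an effective logical rate
# c * p**2, where c is a code- and architecture-dependent constant
# (assumed here). Below the threshold p < 1/c, each added level of
# concatenation suppresses the logical error rate super-exponentially.
def concatenated_rate(p: float, c: float, levels: int) -> float:
    for _ in range(levels):
        p = c * p * p
    return p
```

At a 1% per-gate error rate, a million-gate circuit run bare succeeds with essentially zero probability, while three levels of concatenation (with `c = 10`) already push the logical error rate below one in a billion; this is the trade of qubit and gate overhead for accuracy that the abstract describes.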
The Multiple Process Model of Goal-Directed Reaching Revisited
Recently, our group put forward a model of speed-accuracy relations in goal-directed reaching. A fundamental feature of our multiple process model was the distinction between two types of online regulation: impulse control and limb-target control. Impulse control begins during the initial stages of the movement trajectory and involves a comparison of actual limb velocity and direction to an internal representation of expectations about the limb trajectory. Limb-target control involves discrete error reduction based on the relative positions of the limb and the target late in the movement. Our model also considers the role of eye movements, practice, energy optimization, and strategic behavior in limb control. Here, we review recent work conducted to test specific aspects of our model. We also consider research not fully incorporated into our earlier contribution. We conclude that a slightly modified and expanded version of our model, one that includes crosstalk between the two forms of online regulation, does an excellent job of explaining speed, accuracy, and energy optimization in goal-directed reaching.
Single particle detection of protein molecules using dark-field microscopy to avoid signals from nonspecific adsorption
A massively parallel single-particle sensing method based on core-satellite formation of Au nanoparticles was introduced for the detection of interleukin 6 (IL-6). This method exploits the fact that the localized surface plasmon resonance (LSPR) of plasmonic nanoparticles changes as a result of core-satellite formation, producing a change in the observed color. In this method, the hue (color) value of thousands of 67 nm Au nanoparticles immobilized on a glass coverslip surface is analyzed by Matlab code before and after the addition of reporter nanoparticles carrying IL-6 as the target protein. The average hue shift resulting from core-satellite formation is used as the basis for detecting small amounts of protein. This method has two major advantages. First, it can analyze the hue values of thousands of nanoparticles in parallel in less than a minute. Second, it circumvents the effect of nonspecific adsorption, a major issue in the field of biosensing.
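The hue-shift readout can be sketched in a few lines. The original analysis was done in Matlab; the Python below is a hypothetical illustration of the same idea, assuming each tracked particle is summarized by its mean (R, G, B) color before and after the reporter nanoparticles are added.

```python
import colorsys

def hue(rgb):
    """Hue in [0, 1) from an (R, G, B) triple with channels in [0, 1]."""
    r, g, b = rgb
    return colorsys.rgb_to_hsv(r, g, b)[0]

def mean_hue_shift(before, after):
    """Average per-particle hue change between two lists of RGB triples."""
    shifts = []
    for rgb0, rgb1 in zip(before, after):
        d = hue(rgb1) - hue(rgb0)
        # wrap to (-0.5, 0.5] so a shift across the 0/1 hue boundary
        # is counted as a small change, not a near-full rotation
        d = (d + 0.5) % 1.0 - 0.5
        shifts.append(d)
    return sum(shifts) / len(shifts)
```

Because the signal is an average over thousands of particles, a few particles that bind reporters nonspecifically contribute little to the mean shift, which is the robustness to nonspecific adsorption claimed above.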
The Influence of Visual Feedback and Prior Knowledge About Feedback on Vertical Aiming Strategies
Two experiments were conducted to examine time and energy optimization strategies for movements made with and against gravity. In Experiment 1, we manipulated concurrent visual feedback and knowledge about feedback. When vision was eliminated upon movement initiation, participants exhibited greater undershooting, both with their primary submovement and their final endpoint, than when vision was available. When aiming downward, participants were more likely to terminate their aiming following the primary submovement or complete a lower amplitude corrective submovement. This strategy reduced the frequency of energy-consuming corrections against gravity. In Experiment 2, we eliminated vision of the hand and the target at the end of the movement. This procedure was expected to have its greatest impact under no-vision conditions, where no visual feedback was available for subsequent planning. As anticipated, direction and concurrent visual feedback had a profound impact on endpoint bias. Participants exhibited pronounced undershooting when aiming downward and without vision. Differences in undershooting between vision and no vision were greater under blocked feedback conditions. When performers were uncertain about the impending feedback, they planned their movements for the worst-case scenario. Thus, movement planning considers the variability in execution and avoids outcomes that require time and energy to correct.
Video enhancement using adaptive spatio-temporal connective filter and piecewise mapping
This paper presents a novel video enhancement system based on an adaptive spatio-temporal connective (ASTC) noise filter and an adaptive piecewise mapping function (APMF). For ill-exposed videos or those with heavy noise, we first introduce a novel local image statistic to identify impulse-noise pixels, and then incorporate it into the classical bilateral filter to form ASTC, aiming to reduce a mixture of the two most common types of noise, Gaussian and impulse noise, in the spatial and temporal directions. After noise removal, we enhance the video contrast with APMF based on the statistical information of frame segmentation results. The experimental results demonstrate that, for diverse low-quality videos corrupted by mixed noise, underexposure, overexposure, or any mixture of the above, the proposed system can automatically produce satisfactory results.
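The detect-then-filter idea can be made concrete with a small sketch. This is a schematic of the two-stage approach only, not the paper's exact ASTC statistic: impulse pixels are flagged by their deviation from the 3x3 neighbourhood median, then each flagged pixel is replaced by the mean of its unflagged neighbours, so impulses do not pollute the smoothing.

```python
import numpy as np

def detect_impulses(img, thresh=0.3):
    """Boolean mask of pixels far from their 3x3 neighbourhood median."""
    h, w = img.shape
    pad = np.pad(img, 1, mode="edge")
    med = np.empty_like(img)
    for i in range(h):
        for j in range(w):
            med[i, j] = np.median(pad[i:i + 3, j:j + 3])
    return np.abs(img - med) > thresh

def clean(img, thresh=0.3):
    """Remove detected impulses; leave all other pixels untouched."""
    mask = detect_impulses(img, thresh)
    pad = np.pad(img, 1, mode="edge")
    pad_mask = np.pad(mask, 1, mode="edge")
    out = img.copy()
    for i, j in zip(*np.nonzero(mask)):
        win = pad[i:i + 3, j:j + 3]
        ok = ~pad_mask[i:i + 3, j:j + 3]
        # average only over neighbours not flagged as impulses
        out[i, j] = win[ok].mean() if ok.any() else np.median(win)
    return out
```

The paper's filter additionally weights neighbours by spatial, temporal, and intensity proximity (bilateral-style) across frames; the pixel loops above are written for clarity rather than speed.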
The experience of enchantment in human-computer interaction
Improving user experience is becoming something of a rallying call in human–computer interaction but experience is not a unitary thing. There are varieties of experiences, good and bad, and we need to characterise these varieties if we are to improve user experience. In this paper we argue that enchantment is a useful concept to facilitate closer relationships between people and technology. But enchantment is a complex concept in need of some clarification. So we explore how enchantment has been used in the discussions of technology and examine experiences of film and cell phones to see how enchantment with technology is possible. Based on these cases, we identify the sensibilities that help designers design for enchantment, including the specific sensuousness of a thing, senses of play, paradox and openness, and the potential for transformation. We use these to analyse digital jewellery in order to suggest how it can be made more enchanting. We conclude by relating enchantment to varieties of experience.