
    Salience-based selection: attentional capture by distractors less salient than the target

    Current accounts of attentional capture predict that the most salient stimulus is invariably selected first. However, existing salience and visual search models assume noise in the map computation or selection process. Consequently, they predict the first selection to be stochastically dependent on salience, implying that attention could even be captured first by the second most salient (instead of the most salient) stimulus in the field. Yet capture by less salient distractors has not been reported, and salience-based selection accounts claim that a distractor has to be more salient than the target in order to capture attention. We tested this prediction using an empirical and a modeling approach based on the visual search distractor paradigm. In the empirical part, we manipulated the salience of target and distractor parametrically and measured reaction time interference when a distractor was present compared to when it was absent. Reaction time interference was strongly correlated with distractor salience relative to the target. Moreover, even distractors less salient than the target captured attention, as measured by reaction time interference and oculomotor capture. In the modeling part, we simulated the first selection in the distractor paradigm using behavioral measures of salience and taking into account the time course of selection, including noise. We were able to replicate the result pattern obtained in the empirical part. We conclude that each salience value follows a specific selection time distribution and that attentional capture occurs when the selection time distributions of target and distractor overlap. Hence, selection is stochastic in nature, and attentional capture occurs with a certain probability that depends on relative salience.
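    The core claim, that capture probability falls out of overlapping noisy selection-time distributions, can be illustrated with a minimal simulation. The sketch below assumes a simple, hypothetical model in which mean selection time decreases with salience and trial-to-trial variability is Gaussian; the function names and parameter values are illustrative and are not the authors' actual model.

```python
import numpy as np

rng = np.random.default_rng(0)

def selection_time(salience, n, noise_sd=20.0):
    """Hypothetical selection-time model: the mean time falls with salience,
    and Gaussian noise is added on every trial (values are illustrative)."""
    mean = 200.0 / salience            # more salient -> faster selection
    return mean + rng.normal(0.0, noise_sd, size=n)

def capture_probability(target_salience, distractor_salience, n=100_000):
    """Probability that the distractor is selected before the target,
    i.e. that its noisy selection time is the shorter of the two."""
    t_target = selection_time(target_salience, n)
    t_distractor = selection_time(distractor_salience, n)
    return np.mean(t_distractor < t_target)

# Even a distractor less salient than the target captures attention on a
# fraction of trials, because the two selection-time distributions overlap.
print(capture_probability(target_salience=1.2, distractor_salience=1.0))
```

    Raising distractor salience relative to the target shifts its distribution toward shorter times and increases the capture probability, consistent with the reported correlation between relative salience and reaction time interference.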

    Combined Error Correction Techniques for Quantum Computing Architectures

    Proposals for quantum computing devices are many and varied. They each have unique noise processes that make none of them fully reliable at this time. There are several error correction/avoidance techniques which are valuable for reducing or eliminating errors, but no single one will serve as a panacea. One must therefore take advantage of the strengths of each of these techniques so that we may extend the coherence times of the quantum systems and create more reliable computing devices. To this end we give a general strategy for using dynamical decoupling operations on encoded subspaces. These encodings may be of any form; of particular importance are decoherence-free subspaces and quantum error correction codes. We then give means for empirically determining an appropriate set of dynamical decoupling operations for a given experiment. Using these techniques, we then propose a comprehensive encoding solution to many of the problems of quantum computing proposals which use exchange-type interactions. This uses a decoherence-free subspace and an efficient set of dynamical decoupling operations. It also addresses the problems of controllability in solid state quantum dot devices. Comment: Contribution to Proceedings of the 2002 Physics of Quantum Electronics Conference, to be published in J. Mod. Optics. This paper provides a summary and review of quant-ph/0205156 and quant-ph/0112054, and some new results.
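    The intuition behind dynamical decoupling, independent of the specific encodings proposed in the paper, can be seen in a toy simulation: a qubit prepared along +x dephases under a random but quasi-static frequency offset, while a single refocusing pulse at the midpoint (a Hahn echo, the simplest decoupling sequence) cancels the accumulated phase. The sketch below is an assumption-laden illustration of that general idea, not the encoded-subspace construction from the paper.

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy model (illustrative only): a qubit prepared along +x accumulates a
# phase delta * T under a random, quasi-static detuning delta, so the
# ensemble-averaged coherence <sigma_x> = <cos(phase)> decays. A pi-pulse
# about x at T/2 flips the sign of the phase accumulated in the second
# half, so the two halves cancel and coherence is recovered.
T = 1.0
deltas = rng.normal(0.0, 5.0, size=100_000)      # quasi-static noise samples

phase_free = deltas * T                              # no decoupling pulse
phase_echo = deltas * (T / 2) - deltas * (T / 2)     # sign flipped after pulse

print(f"free evolution coherence: {np.mean(np.cos(phase_free)):.3f}")   # ~0
print(f"echo sequence coherence:  {np.mean(np.cos(phase_echo)):.3f}")   # 1.0
```

    Longer sequences of pulses about different axes extend the same cancellation to noise that varies during the evolution.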

    Testing the adaptability of people's use of attribute frame information

    The informational leakage account of attribute framing effects proposes that a communicator's choice of frame provides informational value, such that different frames are not informationally equivalent. Across five studies communicating food risks, we investigated the adaptability of communication recipients' (our participants') use of frame information by manipulating the degree to which the communicator ostensibly had a choice over how the information was framed. Within-participants framing effects were observed across all conditions of all studies. Only in Study 4 (the only study in which communicator choice was manipulated within-participants) was there any evidence for an attenuation of framing effects where the communicator was not responsible for how the information was framed. Overall, regardless of whether or not framing effects are driven by the informational value contained in a communicator's choice of frame, people show little sensitivity to situations where that choice is removed.

    Lymphatic vasculature mediates macrophage reverse cholesterol transport in mice

    Reverse cholesterol transport (RCT) refers to the mobilization of cholesterol on HDL particles (HDL-C) from extravascular tissues to plasma, ultimately for fecal excretion. Little is known about how HDL-C leaves peripheral tissues to reach plasma. We first used 2 models of disrupted lymphatic drainage from skin — 1 surgical and the other genetic — to quantitatively track RCT following injection of [3H]-cholesterol–loaded macrophages upstream of blocked or absent lymphatic vessels. Macrophage RCT was markedly impaired in both models, even at sites with a leaky vasculature. Inhibited RCT was downstream of cholesterol efflux from macrophages, since macrophage efflux of a fluorescent cholesterol analog (BODIPY-cholesterol) was not altered by impaired lymphatic drainage. We next addressed whether RCT was mediated by lymphatic vessels from the aortic wall by loading the aortae of donor atherosclerotic Apoe-deficient mice with [2H]6-labeled cholesterol and surgically transplanting these aortae into recipient Apoe-deficient mice that were treated with anti-VEGFR3 antibody to block lymphatic regrowth or with control antibody to allow such regrowth. [2H]-Cholesterol was retained in aortae of anti–VEGFR3-treated mice. Thus, the lymphatic vessel route is critical for RCT from multiple tissues, including the aortic wall. These results suggest that supporting lymphatic transport function may facilitate cholesterol clearance in therapies aimed at reversing atherosclerosis.

    Keck Interferometer Nuller Data Reduction and On-Sky Performance

    We describe the Keck Interferometer nuller theory of operation, data reduction, and on-sky performance, particularly as it applies to the nuller exozodiacal dust key science program that was carried out between 2008 February and 2009 January. We review the nuller implementation, including the detailed phasor processing involved in implementing the null-peak mode used for science data and the sequencing used for science observing. We then describe the Level 1 reduction to convert the instrument telemetry streams to raw null leakages, and the Level 2 reduction to provide calibrated null leakages. The Level 1 reduction uses conservative, primarily linear processing, implemented consistently for science and calibrator stars. The Level 2 processing is more flexible, and uses diameters for the calibrator stars measured contemporaneously with the interferometer’s K-band cophasing system in order to provide the requisite accuracy. Using the key science data set of 462 total scans, we assess the instrument performance for sensitivity and systematic error. At 2.0 Jy we achieve a photometrically-limited null leakage uncertainty of 0.25% rms per 10 minutes of integration time in our broadband channel. From analysis of the Level 2 reductions, we estimate a systematic noise floor for bright stars of ~0.2% rms null leakage uncertainty per observing cluster in the broadband channel. A similar analysis is performed for the narrowband channels. We also provide additional information needed for science reduction, including details on the instrument beam pattern and the basic astrophysical response of the system, and references to the data reduction and modeling tools.
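    As a rough illustration of the calibration step described above, the expected geometric null leakage of a calibrator star of known angular diameter can be estimated and removed to isolate the instrumental leakage. The sketch below uses the standard uniform-disk approximation for a two-element nuller, L ≈ (π B θ / λ)² / 16, with an assumed 85 m baseline and 10 μm observing wavelength; it is a simplified stand-in for the actual Level 2 pipeline, and all numbers are hypothetical.

```python
import numpy as np

def geometric_leakage(theta_mas, baseline_m, wavelength_um):
    """Approximate null leakage of a uniform stellar disk for a two-element
    nuller: L ~ (pi * B * theta / lambda)^2 / 16 (standard approximation;
    the actual instrument pipeline may use a more detailed expression)."""
    theta_rad = theta_mas * np.pi / (180 * 3600 * 1000)   # mas -> radians
    wavelength_m = wavelength_um * 1e-6
    return (np.pi * baseline_m * theta_rad / wavelength_m) ** 2 / 16

def calibrated_leakage(raw_science, raw_calibrator, theta_cal_mas,
                       baseline_m=85.0, wavelength_um=10.0):
    """Illustrative Level-2-style calibration: the calibrator's excess over
    its expected geometric leakage estimates the instrumental leakage,
    which is then removed from the science measurement (values hypothetical)."""
    instrumental = raw_calibrator - geometric_leakage(theta_cal_mas,
                                                      baseline_m, wavelength_um)
    return raw_science - instrumental

# Made-up example: a 2 mas calibrator observed at 10 um on an 85 m baseline.
print(geometric_leakage(2.0, 85.0, 10.0))           # expected stellar leakage
print(calibrated_leakage(raw_science=0.010,
                         raw_calibrator=0.006,
                         theta_cal_mas=2.0))        # science leakage after calibration
```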

    CacheZoom: How SGX Amplifies The Power of Cache Attacks

    In modern computing environments, hardware resources are commonly shared, and parallel computation is widely used. Parallel tasks can cause privacy and security problems if proper isolation is not enforced. Intel proposed SGX to create a trusted execution environment within the processor. SGX relies on the hardware, and claims runtime protection even if the OS and other software components are malicious. However, SGX disregards side-channel attacks. We introduce a powerful cache side-channel attack that provides system adversaries with a high-resolution channel. Our attack tool, named CacheZoom, is able to track virtually all memory accesses of SGX enclaves with high spatial and temporal precision. As proof of concept, we demonstrate AES key recovery attacks on commonly used implementations, including those that were believed to be resistant in previous scenarios. Our results show that SGX cannot protect critical, data-sensitive computations, and that efficient AES key recovery is possible in a practical environment. In contrast to previous works, which require hundreds of measurements, this is the first cache side-channel attack on a real system that can recover AES keys with a minimal number of measurements. We can successfully recover AES keys from T-table based implementations with as few as ten measurements. Comment: Accepted at the Conference on Cryptographic Hardware and Embedded Systems (CHES '17).
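    To see why a high-resolution cache trace defeats a T-table AES implementation, consider the classic first-round analysis: the table index for byte i is plaintext[i] XOR key[i], so the cache line touched during a known-plaintext encryption constrains the high bits of that key byte, and intersecting the candidate sets from a handful of traces pins them down. The sketch below is a self-contained illustration of that principle with simulated observations; it is not CacheZoom's actual tooling, and the layout (16 four-byte entries per 64-byte line) is the usual one rather than something taken from the paper.

```python
from functools import reduce

LINE_ENTRIES = 16          # 64-byte cache line / 4-byte T-table entry

def candidates_from_trace(plaintext_byte, observed_line):
    """First-round AES T-table leak: the accessed index is p ^ k, so an
    observed cache line limits the index to a 16-entry window. Returns
    the set of key bytes consistent with one measurement."""
    return {plaintext_byte ^ idx
            for idx in range(observed_line * LINE_ENTRIES,
                             (observed_line + 1) * LINE_ENTRIES)}

def recover_high_bits(measurements):
    """Intersect candidate sets over (plaintext_byte, observed_line) pairs;
    after a few measurements only key bytes sharing the true high bits
    remain (the low bits stay ambiguous in a first-round-only attack)."""
    return reduce(set.intersection,
                  (candidates_from_trace(p, line) for p, line in measurements))

# Simulated measurements for a true key byte of 0x4f (high bits 0x40):
true_key = 0x4F
plaintexts = [0x13, 0xA7, 0x5C, 0xE2]
measurements = [(p, (p ^ true_key) // LINE_ENTRIES) for p in plaintexts]
print(sorted(hex(k) for k in recover_high_bits(measurements)))
```

    Each observed line eliminates 240 of the 256 candidates for a key byte; the remaining low bits come from later-round analysis or brute force.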

    Quantum Computing with Very Noisy Devices

    In theory, quantum computers can efficiently simulate quantum physics, factor large numbers and estimate integrals, thus solving otherwise intractable computational problems. In practice, quantum computers must operate with noisy devices called "gates" that tend to destroy the fragile quantum states needed for computation. The goal of fault-tolerant quantum computing is to compute accurately even when gates have a high probability of error each time they are used. Here we give evidence that accurate quantum computing is possible with error probabilities above 3% per gate, which is significantly higher than what was previously thought possible. However, the resources required for computing at such high error probabilities are excessive. Fortunately, they decrease rapidly with decreasing error probabilities. If we had quantum resources comparable to the considerable resources available in today's digital computers, we could implement non-trivial quantum computations at error probabilities as high as 1% per gate. Comment: 47 pages.
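    The trade-off between gate error rate and resource overhead can be made concrete with the textbook concatenation heuristic: below a threshold p_th, each level of concatenation roughly squares the ratio p / p_th, so the logical error rate falls doubly exponentially in the number of levels while the circuit size grows only exponentially. The sketch below applies that generic heuristic to the error rates quoted in the abstract; it is not the paper's own resource analysis, and the 3% threshold is used here only as an assumed reference value.

```python
def logical_error_rate(p, p_th, levels):
    """Textbook concatenation heuristic (illustrative, not the paper's
    detailed analysis): p_L ~ p_th * (p / p_th) ** (2 ** levels)."""
    return p_th * (p / p_th) ** (2 ** levels)

# With a 1% physical error rate and an assumed 3% threshold, every added
# level of concatenation squares the (already sub-unity) ratio p / p_th,
# driving the logical error rate down very quickly.
for k in range(1, 5):
    print(f"{k} level(s): p_L ~ {logical_error_rate(p=0.01, p_th=0.03, levels=k):.2e}")
```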