
    How behavioral constraints may determine optimal sensory representations

    The sensory-triggered activity of a neuron is typically characterized in terms of a tuning curve, which describes the neuron's average response as a function of a parameter that characterizes a physical stimulus. What determines the shapes of tuning curves in a neuronal population? Previous theoretical studies and related experiments suggest that many response characteristics of sensory neurons are optimal for encoding stimulus-related information. This notion, however, does not explain the two general types of tuning profiles that are commonly observed: unimodal and monotonic. Here, I quantify the efficacy of a set of tuning curves according to the possible downstream motor responses that can be constructed from them. Curves that are optimal in this sense may have monotonic or non-monotonic profiles, where the proportion of monotonic curves and the optimal tuning curve width depend on the general properties of the target downstream functions. This dependence explains intriguing features of visual cells that are sensitive to binocular disparity and of neurons tuned to echo delay in bats. The numerical results suggest that optimal sensory tuning curves are shaped not only by stimulus statistics and signal-to-noise properties, but also according to their impact on downstream neural circuits and, ultimately, on behavior.
    Comment: 24 pages, 9 figures (main text + supporting information).
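    The readout criterion can be illustrated with a toy computation. The sketch below is not the paper's actual model: the Gaussian and sigmoid tuning profiles, the noise level, the quadratic target function and the linear least-squares readout are all assumed for illustration; it only shows how one could score a population of curves by how well an assumed downstream function can be constructed from its noisy responses.

```python
# Minimal sketch (not the paper's exact model): score a population of tuning
# curves by how well a downstream target function can be read out linearly.
import numpy as np

s = np.linspace(-1.0, 1.0, 200)                    # stimulus parameter

# Hypothetical population: unimodal (Gaussian) and monotonic (sigmoid) curves.
centers = np.linspace(-1.0, 1.0, 8)
unimodal  = np.exp(-(s[:, None] - centers) ** 2 / (2 * 0.15 ** 2))
monotonic = 1.0 / (1.0 + np.exp(-(s[:, None] - centers) / 0.15))

def readout_error(responses, target, noise=0.05, seed=0):
    """Mean-squared error of the best linear readout of `target`
    from noisy population responses (illustrative criterion only)."""
    rng = np.random.default_rng(seed)
    noisy = responses + noise * rng.standard_normal(responses.shape)
    weights, *_ = np.linalg.lstsq(noisy, target, rcond=None)
    return np.mean((noisy @ weights - target) ** 2)

target = s ** 2                                     # an assumed downstream motor function
print("unimodal basis :", readout_error(unimodal, target))
print("monotonic basis:", readout_error(monotonic, target))
```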

    Quantum entanglement

    All our former experience with the application of quantum theory seems to say: what is predicted by the quantum formalism must occur in the laboratory. But the essence of the quantum formalism - entanglement, recognized by Einstein, Podolsky, Rosen and Schrödinger - waited over 70 years to enter laboratories as a new resource as real as energy. This holistic property of compound quantum systems, which involves nonclassical correlations between subsystems, is a resource for many quantum processes, including "canonical" ones: quantum cryptography, quantum teleportation and dense coding. However, it has turned out that this new resource is very complex and difficult to detect. Though usually fragile to the environment, it is robust against the conceptual and mathematical tools whose task is to decipher its rich structure. This article reviews basic aspects of entanglement, including its characterization, detection, distillation and quantification. In particular, the authors discuss various manifestations of entanglement via Bell inequalities, entropic inequalities, entanglement witnesses and quantum cryptography, and point out some interrelations. They also discuss the basic role of entanglement in quantum communication within the distant-labs paradigm and stress some peculiarities, such as the irreversibility of entanglement manipulations, including its extremal form - the bound entanglement phenomenon. The basic role of entanglement witnesses in the detection of entanglement is emphasized.
    Comment: 110 pages, 3 figures, ReVTeX4. Improved (slightly extended) presentation, updated references, minor changes; submitted to Rev. Mod. Phys.
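    As a concrete illustration of entanglement detection, the sketch below applies the positive-partial-transpose (Peres-Horodecki) criterion, which for two-qubit states is a necessary and sufficient test; the Werner-state family and the numerical tolerance are illustrative choices, not taken from the review.

```python
# Minimal sketch: detect two-qubit entanglement with the Peres-Horodecki
# (positive partial transpose) criterion.
import numpy as np

def partial_transpose(rho):
    """Partial transpose over the second qubit of a 4x4 density matrix."""
    r = rho.reshape(2, 2, 2, 2)            # indices (i, j; k, l) = (A, B; A', B')
    return r.transpose(0, 3, 2, 1).reshape(4, 4)

def is_entangled(rho):
    """For 2x2 systems: entangled iff the partial transpose has a negative eigenvalue."""
    return np.min(np.linalg.eigvalsh(partial_transpose(rho))) < -1e-12

phi_plus = np.array([1, 0, 0, 1]) / np.sqrt(2)      # Bell state |Phi+>
bell = np.outer(phi_plus, phi_plus)

for p in (0.2, 0.5, 1.0):                           # Werner states: entangled iff p > 1/3
    rho = p * bell + (1 - p) * np.eye(4) / 4
    print(f"p = {p}: entangled = {is_entangled(rho)}")
```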

    Information in Mechanism Design

    We survey the recent literature on the role of information in mechanism design. We specifically consider the endogeneity of, and robustness to, private information in mechanism design. We view information acquisition and robustness to private information as two distinct but related aspects of information management that are important in many design settings. We review the existing literature and point out directions for future work.
    Keywords: Mechanism Design, Information Acquisition, Ex Post Equilibrium, Robust Mechanism Design, Interdependent Values, Information Management
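    A standard textbook illustration of the robustness theme (not an example from the survey itself) is the second-price auction: truthful bidding is optimal for every realization of the rivals' bids, so it does not rely on beliefs about their private information. The brute-force check below, over a small assumed grid of values and bids, verifies this ex post property numerically.

```python
# Illustrative sketch: in a second-price auction, bidding one's true value is an
# ex post equilibrium -- optimal for every realization of the other bids.
import itertools

def payoff(value, own_bid, other_bids):
    """Winner pays the highest competing bid; ties are resolved against the bidder."""
    top_other = max(other_bids)
    return value - top_other if own_bid > top_other else 0.0

values = [0.0, 0.25, 0.5, 0.75, 1.0]               # small grid of possible values/bids
ex_post_ic = all(
    payoff(v, v, others) >= payoff(v, b, others)        # truth-telling beats any deviation b
    for v in values
    for b in values
    for others in itertools.product(values, repeat=2)   # every realization of two rivals' bids
)
print("truthful bidding is ex post incentive compatible on this grid:", ex_post_ic)
```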

    Performance asymmetries in visual search demonstrate failure of independent-processing models

    We report psychophysical data from orientation-popout experiments that are inconsistent with a rather general decision model. Stimuli consisted of 121 line segments arranged on an 11×11 grid. There were two tasks: in the 1-Singleton Task, all lines except one had the same orientation and observers had to report which quadrant contained the singleton; in the 3-Singleton Task, three quadrants contained orientation singletons and observers had to identify the quadrant without a singleton. These tasks can be viewed as asymmetric search tasks in which either a singleton quadrant has to be found among three homogeneous quadrants, or a homogeneous quadrant has to be found among three singleton quadrants. Using tools from signal-detection theory, we show that the large performance asymmetries between the 1-Singleton and 3-Singleton Tasks are inconsistent with any model that makes two very basic and common assumptions: (1) independent processing of the four quadrants and (2) an ideal-observer decision. We conclude that at least one of the two assumptions is inadequate. As a plausible reason for the model failure we suggest a global competition between salient elements that reduces popout strength when more than one singleton is present.
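    The flavour of the rejected model class can be conveyed with a small simulation. The sketch below is an assumed instantiation, not the paper's analysis: each quadrant yields an independent Gaussian evidence value, singletons shift it by an arbitrary d', and an ideal observer picks the maximum (1-Singleton) or minimum (3-Singleton). Under these assumptions the two accuracies coincide by symmetry, which is why a large empirical asymmetry speaks against the whole model class.

```python
# Minimal sketch of the kind of model the paper rules out: independent quadrant
# processing plus an ideal-observer decision. The d' value and Gaussian noise
# are assumptions made only for illustration.
import numpy as np

rng = np.random.default_rng(0)
d_prime, n_trials = 1.5, 200_000

# 1-Singleton Task: one quadrant carries the singleton signal, three do not.
evidence = rng.standard_normal((n_trials, 4))
evidence[:, 0] += d_prime
p_correct_1 = np.mean(np.argmax(evidence, axis=1) == 0)

# 3-Singleton Task: three quadrants carry singletons; find the one without.
evidence = rng.standard_normal((n_trials, 4))
evidence[:, 1:] += d_prime
p_correct_3 = np.mean(np.argmin(evidence, axis=1) == 0)

# Under independence + ideal observer the two accuracies coincide (by symmetry).
print(f"1-Singleton: {p_correct_1:.3f}   3-Singleton: {p_correct_3:.3f}")
```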

    Charged-Higgs phenomenology in the Aligned two-Higgs-doublet model

    The alignment in flavour space of the Yukawa matrices of a general two-Higgs-doublet model results in the absence of tree-level flavour-changing neutral currents. In addition to the usual fermion masses and mixings, the aligned Yukawa structure only contains three complex parameters, which are potential new sources of CP violation. For particular values of these three parameters all known specific implementations of the model based on discrete Z_2 symmetries are recovered. One of the most distinctive features of the two-Higgs-doublet model is the presence of a charged scalar. In this work, we discuss its main phenomenological consequences in flavour-changing processes at low energies and derive the corresponding constraints on the parameters of the aligned two-Higgs-doublet model.
    Comment: 46 pages, 19 figures. Version accepted for publication in JHEP. References added. Discussion slightly extended. Conclusions unchanged.
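    For orientation, the charged-scalar Yukawa interaction of the aligned two-Higgs-doublet model is usually written as in the sketch below (conventions and signs vary between papers); the three alignment parameters ς_u, ς_d, ς_l are the complex quantities referred to in the abstract.

```latex
\mathcal{L}_{H^\pm} \;=\; -\frac{\sqrt{2}}{v}\, H^{+}
\Big\{ \bar{u}\,\big[\,\varsigma_d\, V M_d\, \mathcal{P}_R
      \;-\; \varsigma_u\, M_u^{\dagger} V\, \mathcal{P}_L \,\big]\, d
\;+\; \varsigma_l\, \bar{\nu}\, M_l\, \mathcal{P}_R\, \ell \Big\} \;+\; \mathrm{h.c.}
```

    Here V is the CKM matrix, M_{u,d,l} are the diagonal fermion mass matrices and P_{L,R} are the chirality projectors; choosing the ς_f to be the appropriate combinations of cot β and -tan β reproduces the usual Z_2-symmetric types.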

    Conditional Transformation Models

    The ultimate goal of regression analysis is to obtain information about the conditional distribution of a response given a set of explanatory variables. This goal is, however, seldom achieved because most established regression models only estimate the conditional mean as a function of the explanatory variables and assume that higher moments are not affected by the regressors. The underlying reason for such a restriction is the assumption of additivity of signal and noise. We propose to relax this common assumption in the framework of transformation models. The novel class of semiparametric regression models proposed herein allows transformation functions to depend on explanatory variables. These transformation functions are estimated by regularised optimisation of scoring rules for probabilistic forecasts, e.g. the continuous ranked probability score. The corresponding estimated conditional distribution functions are consistent. Conditional transformation models are potentially useful for describing possible heteroscedasticity, comparing spatially varying distributions, identifying extreme events, deriving prediction intervals and selecting variables beyond mean regression effects. An empirical investigation based on a heteroscedastic varying coefficient simulation model demonstrates that semiparametric estimation of conditional distribution functions can be more beneficial than kernel-based non-parametric approaches or parametric generalised additive models for location, scale and shape.
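    The scoring rule mentioned above can be made concrete with a small example. The sketch below evaluates the continuous ranked probability score in its closed form for a Gaussian predictive distribution; the Gaussian forecast is only a convenient stand-in, since the paper's transformation models produce more general conditional distributions.

```python
# Minimal sketch: the continuous ranked probability score (CRPS), evaluated in
# its closed form for a Gaussian predictive distribution; lower is better.
import numpy as np
from scipy.stats import norm

def crps_gaussian(y, mu, sigma):
    """CRPS of the forecast N(mu, sigma^2) for an observation y."""
    z = (y - mu) / sigma
    return sigma * (z * (2 * norm.cdf(z) - 1) + 2 * norm.pdf(z) - 1 / np.sqrt(np.pi))

# A sharp, well-calibrated forecast scores better than a vague one.
print(crps_gaussian(y=1.0, mu=1.1, sigma=0.5))   # ~0.12
print(crps_gaussian(y=1.0, mu=1.1, sigma=2.0))   # ~0.47
```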