306,364 research outputs found

    Sensory Systems as Cybernetic Systems that Require Awareness of Alternatives to Interact with the World: Analysis of the Brain-Receptor Loop in Norwich's Entropy Theory of Perception

    Get PDF
    Introduction & Objectives: Norwich’s Entropy Theory of Perception (1975 [1]–present) stands alone. It explains many firing-rate behaviors and psychophysical laws from bare theory. To do so, it demands a unique sort of interaction between receptor and brain, one that Norwich never substantiated. Can it now be confirmed, given the accumulation of empirical sensory neuroscience? Background: Norwich conjoined sensation and a mathematical model of communication, Shannon’s Information Theory, as follows: “In the entropic view of sensation, magnitude of sensation is regarded as a measure of the entropy or uncertainty of the stimulus signal” [2]. “To be uncertain about the outcome of an event, one must first be aware of a set of alternative outcomes” [3]. “The entropy-establishing process begins with the generation of an [internal] sensory signal by the stimulus generator. This is followed by receipt of the [external] stimulus by the sensory receptor, transmission of action potentials by the sensory neurons, and finally recapture of the [response to the internal] signal by the generator” [4]. The latter “recapture” differentiates external from internal stimuli. The hypothetical “stimulus generators” are internal emitters that generate photons in vision, audible sounds in audition (to Norwich, the spontaneous otoacoustic emissions [SOAEs]), “temperatures in excess of local skin temperature” in skin temperature sensation [4], etc. Method (1): Several decades of the empirical sensory physiology literature were scrutinized for internal “stimulus generators”. Results (1): Spontaneous photopigment isomerization (“dark light”) does not involve visible light. SOAEs are electromechanical basilar-membrane artefacts that rarely produce audible tones. The skin’s temperature sensors do not raise skin temperature, etc. Method (2): The putative action of the brain-and-sensory-receptor loop was carefully reexamined. Results (2): The sensory receptor allegedly “perceives”, experiences “awareness”, possesses “memory”, and has a “mind”. But those traits describe the whole human. The receptor, thus anthropomorphized, must therefore contain its own perceptual loop, containing a receptor, containing a perceptual loop, and so on. Summary & Conclusions: The Entropy Theory demands sensory awareness of alternatives, through an imagined brain-and-sensory-receptor loop containing internal “stimulus generators”. But (1) no internal “stimulus generators” seem to exist, and (2) the loop would be the outermost of an infinite nesting of identical loops.
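
    To make the information-theoretic vocabulary concrete, the short sketch below computes the Shannon entropy of a discrete set of alternative stimulus outcomes, which is the quantity the entropic view equates with magnitude of sensation; the outcome set and probabilities are illustrative assumptions, not values taken from the theory.

```python
import math

def shannon_entropy(probabilities):
    """Shannon entropy H = -sum(p * log2(p)), in bits, of a discrete distribution.

    In the entropic view of sensation, the perceiver must first be aware of a
    set of alternative outcomes; H quantifies the uncertainty over that set.
    """
    return -sum(p * math.log2(p) for p in probabilities if p > 0)

# Hypothetical example: four equally likely stimulus intensities carry 2 bits
# of uncertainty, while a nearly certain outcome carries almost none.
print(shannon_entropy([0.25, 0.25, 0.25, 0.25]))   # 2.00 bits
print(shannon_entropy([0.97, 0.01, 0.01, 0.01]))   # ~0.24 bits
```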

    Perception-aware Path Planning

    Full text link
    In this paper, we give a double twist to the problem of planning under uncertainty. State-of-the-art planners seek to minimize the localization uncertainty by only considering the geometric structure of the scene. In contrast, we argue that motion planning for vision-controlled robots should be perception aware, in that the robot should also favor texture-rich areas to minimize the localization uncertainty during a goal-reaching task. Thus, we describe how to optimally incorporate the photometric information (i.e., texture) of the scene, in addition to the geometric one, to compute the uncertainty of vision-based localization during path planning. To avoid the caveats of feature-based localization systems (i.e., dependence on feature type and user-defined thresholds), we use dense, direct methods. This allows us to compute the localization uncertainty directly from the intensity values of every pixel in the image. We also describe how to compute trajectories online, including scenarios with no prior knowledge of the map. The proposed framework is general and can easily be adapted to different robotic platforms and scenarios. The effectiveness of our approach is demonstrated with extensive experiments in both simulated and real-world environments using a vision-controlled micro aerial vehicle. Comment: 16 pages, 20 figures, revised version. Conditionally accepted for IEEE Transactions on Robotics.
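
    As a rough illustration of the idea (not the authors’ dense, direct formulation), the sketch below scores candidate trajectories by combining path length with a texture-based proxy for localization uncertainty; the gradient-energy measure, the inverse-information uncertainty model, and the weighting parameter are all assumptions made for this example.

```python
import numpy as np

def texture_information(patch):
    """Proxy for the photometric information in an image patch: the sum of
    squared intensity gradients. Texture-rich views yield large values and
    hence (in this toy model) low localization uncertainty."""
    gy, gx = np.gradient(patch.astype(float))
    return float(np.sum(gx**2 + gy**2))

def perception_aware_cost(waypoints, patches, w_uncertainty=1.0):
    """Combine geometric path length with a localization-uncertainty penalty.

    waypoints : (N, 2) array of planned positions
    patches   : list of N image patches expected to be observed en route
    """
    length = np.sum(np.linalg.norm(np.diff(waypoints, axis=0), axis=1))
    uncertainty = sum(1.0 / (1e-6 + texture_information(p)) for p in patches)
    return length + w_uncertainty * uncertainty

# Usage: among candidate trajectories to the same goal, pick the one with the
# lowest combined cost; a slightly longer path through texture-rich views can
# beat a shorter path over textureless ground.
```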

    No accounting for risk

    Get PDF
    At the present time, the relation between accounting praxis and risk is not well understood. Accounting praxis does not appear to regard the risk it identifies with its activities as being different from 'objective risk' - the concept of risk found in positive financial and accounting research. Instead, accounting praxis (as reflected in case studies, surveys and other empirical studies) reveals a collection of different, sometimes contradictory, conceptions and 'taken for granted' understandings of risk that are invoked and applied on an ad hoc, case-by-case basis. The aim of this paper is to demonstrate that the conceptual disarray in accounting for risk is both costly and unnecessary. Taking an interdisciplinary approach to risk research, the authors review developments in risk thinking at the end of the 20th century and highlight a way forward for accounting through New Paradigm Risk (NPR). Various illustrations and case study examples are drawn upon to reflect the relevance of NPR to accounting praxis.

    Defining the hundred year flood: a Bayesian approach for using historic data to reduce uncertainty in flood frequency estimates

    Get PDF
    This paper describes a Bayesian statistical model for estimating flood frequency by combining uncertain annual maximum (AMAX) data from a river gauge with estimates of flood peak discharge from various historic sources that predate the period of instrument records. Such historic flood records promise to expand the time series data needed for reducing the uncertainty in return period estimates for extreme events, but the heterogeneity and uncertainty of historic records make them difficult to use alongside Flood Estimation Handbook and other standard methods for generating flood frequency curves from gauge data. Using the flow of the River Eden in Carlisle, Cumbria, UK as a case study, this paper develops a Bayesian model for combining historic flood estimates since 1800 with gauge data since 1967 to estimate the probability of low-frequency flood events for the area, taking account of uncertainty in the discharge estimates. Results show a reduction in 95% confidence intervals of roughly 50% for annual exceedance probabilities of less than 0.0133 (return periods over 75 years) compared to standard flood frequency estimation methods using solely systematic data. Sensitivity analysis shows the model is sensitive to two model parameters, both of which concern the historic (pre-systematic) period of the time series. This highlights the importance of adequate consideration of historic channel and floodplain changes, and of possible bias in estimates of historic flood discharges. The next steps required to roll out this Bayesian approach for operational flood frequency estimation at other sites are also discussed.
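
    A minimal sketch of the general approach, under assumptions that differ from the paper’s actual model: systematic AMAX gauge data are combined with uncertain historic peak estimates under a Gumbel flood frequency curve, and the posterior is evaluated on a parameter grid. The data values, the Gumbel choice, the flat priors, and the Gaussian treatment of historic uncertainty are all illustrative.

```python
import numpy as np
from scipy import stats
from scipy.integrate import trapezoid

# Hypothetical data (m^3/s): systematic AMAX series plus two historic peak
# estimates carrying larger (assumed Gaussian) measurement uncertainty.
gauge_amax = np.array([450., 520., 610., 380., 700., 560., 490., 830.])
historic_peaks = np.array([900., 1100.])
historic_sd = 150.

# Gumbel parameter grid (location mu, scale beta) with flat priors.
mu_grid = np.linspace(300., 800., 80)
beta_grid = np.linspace(50., 400., 80)
log_post = np.zeros((mu_grid.size, beta_grid.size))

for i, mu in enumerate(mu_grid):
    for j, beta in enumerate(beta_grid):
        # Exact Gumbel log-likelihood of the systematic gauge record.
        ll = stats.gumbel_r.logpdf(gauge_amax, loc=mu, scale=beta).sum()
        # Historic peaks: integrate the Gumbel density against the Gaussian
        # observation error by simple quadrature (illustrative treatment).
        q = np.linspace(historic_peaks - 4 * historic_sd,
                        historic_peaks + 4 * historic_sd, 200)
        dens = (stats.gumbel_r.pdf(q, loc=mu, scale=beta)
                * stats.norm.pdf(q, loc=historic_peaks, scale=historic_sd))
        ll += np.log(trapezoid(dens, q, axis=0)).sum()
        log_post[i, j] = ll

post = np.exp(log_post - log_post.max())
post /= post.sum()

# Posterior mean of the 100-year flood (annual exceedance probability 0.01).
q100 = stats.gumbel_r.ppf(0.99, loc=mu_grid[:, None], scale=beta_grid[None, :])
print(f"Posterior mean 100-year flood: {np.sum(post * q100):.0f} m^3/s")
```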

    Project management under uncertainty

    Get PDF
    Morris' (1986) analysis of the factors affecting project success and failure is considered in relation to the psychology of judgement under uncertainty. A model is proposed whereby project managers may identify the specific circumstances in which human decision-making is prone to systematic error, and hence may apply a number of de-biasing techniques.