
    Synchronizing to the Environment: Information Theoretic Constraints on Agent Learning

    We show that the way in which the Shannon entropy of sequences produced by an information source converges to the source's entropy rate can be used to monitor how an intelligent agent builds and effectively uses a predictive model of its environment. We introduce natural measures of the environment's apparent memory and the amounts of information that must be (i) extracted from observations for an agent to synchronize to the environment and (ii) stored by an agent for optimal prediction. If structural properties are ignored, the missed regularities are converted to apparent randomness. Conversely, using representations that assume too much memory results in false predictability. Comment: 6 pages, 5 figures, Santa Fe Institute Working Paper 01-03-020, http://www.santafe.edu/projects/CompMech/papers/stte.htm
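
    As a rough illustration of the convergence statistic described above, the sketch below estimates block entropies H(L) of a binary sequence and watches the per-symbol increments H(L) - H(L-1) approach the source's entropy rate; the golden-mean example source and all names are illustrative assumptions, not the authors' code.

```python
# Minimal sketch (not the authors' code): watch block-entropy increments
# converge toward the entropy rate of a simple binary source.
import numpy as np
from collections import Counter

def golden_mean_sequence(n, rng):
    """Binary source with no two consecutive 1s (illustrative example)."""
    seq, prev = [], 0
    for _ in range(n):
        bit = 0 if prev == 1 else int(rng.integers(0, 2))
        seq.append(bit)
        prev = bit
    return seq

def block_entropy(seq, length):
    """Shannon entropy (bits) of length-L blocks, estimated from counts."""
    blocks = [tuple(seq[i:i + length]) for i in range(len(seq) - length + 1)]
    counts = np.array(list(Counter(blocks).values()), dtype=float)
    p = counts / counts.sum()
    return float(-(p * np.log2(p)).sum())

seq = golden_mean_sequence(200_000, np.random.default_rng(0))
H = [0.0] + [block_entropy(seq, L) for L in range(1, 9)]
for L in range(1, 9):
    # h_mu(L) = H(L) - H(L-1); how quickly it levels off reflects the
    # environment's apparent memory and the cost of synchronizing to it.
    print(L, round(H[L] - H[L - 1], 4))
```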

    Statistical Complexity of Simple 1D Spin Systems

    We present exact results for two complementary measures of spatial structure generated by 1D spin systems with finite-range interactions. The first, excess entropy, measures the apparent spatial memory stored in configurations. The second, statistical complexity, measures the amount of memory needed to optimally predict the chain of spin values. These statistics capture distinct properties and are different from existing thermodynamic quantities. Comment: 4 pages with 2 eps Figures. Uses RevTeX macros. Also available at http://www.santafe.edu/projects/CompMech/papers/CompMechCommun.htm
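
    For reference, the two measures named in the abstract are conventionally defined as follows in the computational-mechanics literature (standard definitions; the notation is chosen here rather than copied from the paper):

```latex
% Excess entropy: the total apparent memory in configurations, measured by
% how the block entropy H(L) converges to its linear asymptote h_mu L.
E = \lim_{L \to \infty} \bigl[ H(L) - h_\mu L \bigr]
  = \sum_{L=1}^{\infty} \bigl[ h_\mu(L) - h_\mu \bigr],
\qquad h_\mu(L) \equiv H(L) - H(L-1).

% Statistical complexity: the Shannon entropy of the causal-state
% distribution, i.e. the memory needed for optimal prediction.
C_\mu = -\sum_{\sigma \in \mathcal{S}} \Pr(\sigma) \log_2 \Pr(\sigma)
```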

    Statistical Measures of Complexity: Why?

    We review several statistical complexity measures proposed over the last decade and a half as general indicators of structure or correlation. Recently, Lopez-Ruiz, Mancini, and Calbet [Phys. Lett. A 209 (1995) 321] introduced another measure of statistical complexity C_{LMC} that, like others, satisfies the "boundary conditions" of vanishing in the extreme ordered and disordered limits. We examine some properties of C_{LMC} and find that it is neither an intensive nor an extensive thermodynamic variable and that it vanishes exponentially in the thermodynamic limit for all one-dimensional finite-range spin systems. We propose a simple alteration of C_{LMC} that renders it extensive. However, this remedy results in a quantity that is a trivial function of the entropy density and hence of no use as a measure of structure or memory. We conclude by suggesting that a useful "statistical complexity" must not only obey the ordered-random boundary conditions of vanishing, it must also be defined in a setting that gives a clear interpretation to what structures are quantified. Comment: 7 pages with 2 eps Figures. Uses RevTeX macros. Also available at http://www.santafe.edu/projects/CompMech/papers/CompMechCommun.html. Submitted to Phys. Lett.
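
    For concreteness, the measure under discussion is conventionally written as the product of a Shannon entropy and a "disequilibrium" term (standard form of the Lopez-Ruiz-Mancini-Calbet definition; the notation here is ours, not the paper's):

```latex
% LMC statistical complexity for a distribution P = (p_1, ..., p_N):
% entropy H[P] times the distance D[P] of P from the uniform distribution.
C_{LMC}[P] = H[P] \cdot D[P], \qquad
H[P] = -\sum_{i=1}^{N} p_i \log p_i, \qquad
D[P] = \sum_{i=1}^{N} \Bigl( p_i - \frac{1}{N} \Bigr)^{2}
```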

    The Temperature and Density Structure of the Solar Corona. I. Observations of the Quiet Sun with the EUV Imaging Spectrometer (EIS) on Hinode

    Measurements of the temperature and density structure of the solar corona provide critical constraints on theories of coronal heating. Unfortunately, the complexity of the solar atmosphere, observational uncertainties, and the limitations of current atomic calculations, particularly those for Fe, all conspire to make this task very difficult. A critical assessment of plasma diagnostics in the corona is essential to making progress on the coronal heating problem. In this paper we present an analysis of temperature and density measurements above the limb in the quiet corona using new observations from the EUV Imaging Spectrometer (EIS) on Hinode. By comparing the Si and Fe emission observed with EIS we are able to identify emission lines that yield consistent emission measure distributions. With these data we find that the distribution of temperatures in the quiet corona above the limb is strongly peaked near 1 MK, consistent with previous studies. We also find, however, that there is a tail in the emission measure distribution that extends to higher temperatures. EIS density measurements from several density sensitive line ratios are found to be generally consistent with each other and with previous measurements in the quiet corona. Our analysis, however, also indicates that a significant fraction of the weaker emission lines observed in the EIS wavelength ranges cannot be understood with current atomic data. Comment: Submitted to Ap
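
    To make the density-diagnostic step concrete, the sketch below inverts a density-sensitive line ratio against a theoretical ratio-versus-density curve; the curve here is a made-up placeholder (in practice it would come from atomic calculations, e.g. a database such as CHIANTI, for a specific ion), and this is not the authors' pipeline.

```python
# Minimal sketch: infer log10(n_e) by interpolating an observed intensity
# ratio onto an assumed theoretical ratio-vs-density curve (placeholder).
import numpy as np

log_ne_grid = np.linspace(7.0, 11.0, 41)                      # log10 n_e [cm^-3]
ratio_theory = 0.2 + 1.6 / (1.0 + 10 ** (9.2 - log_ne_grid))  # placeholder curve

def density_from_ratio(ratio_obs):
    """Interpolate the observed ratio onto the (monotonic) theoretical curve."""
    return float(np.interp(ratio_obs, ratio_theory, log_ne_grid))

print("log10 n_e ~", round(density_from_ratio(0.9), 2))
```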

    Testing the Unitarity of the CKM Matrix with a Space-Based Neutron Decay Experiment

    If the Standard Model is correct, and fundamental fermions exist only in the three generations, then the CKM matrix should be unitary. However, an open question about unitarity remains, arising from the measured value of the neutron lifetime. We discuss a simple space-based experiment that, at an orbit height of 500 km above Earth, would measure the flux spectrum, in kinetic energy and solid angle, of gravitationally bound neutrons (kinetic energy K<0.606 eV at this altitude). The difference between the energy spectrum of neutrons that come up from the Earth's atmosphere and that of the undecayed neutrons that return to the Earth would yield a measurement of the neutron lifetime. This measurement would be free of the systematics of laboratory experiments. A package of mass <25 kg could provide a 10^{-3} precision in two years. Comment: 10 pages, 4 figures. Revised and updated for publication
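
    The quoted energy threshold follows from requiring that a neutron's kinetic energy be below the local gravitational escape energy; a quick back-of-envelope check (ours, not the paper's) is shown below.

```python
# Check the bound-state threshold at 500 km altitude: K_max = mu_E * m_n / r,
# where mu_E = G*M_earth. Constants are standard values, not from the paper.
MU_EARTH = 3.986004e14     # G*M_earth, m^3 s^-2
R_EARTH = 6.371e6          # mean Earth radius, m
M_NEUTRON = 1.67493e-27    # neutron mass, kg
EV = 1.602177e-19          # J per eV

r = R_EARTH + 500e3        # orbital radius at 500 km altitude
k_max_ev = MU_EARTH * M_NEUTRON / r / EV
print(f"K_max = {k_max_ev:.3f} eV")   # about 0.606 eV, as quoted in the abstract
```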

    Rhythmogenic neuronal networks, pacemakers, and k-cores

    Neuronal networks are controlled by a combination of the dynamics of individual neurons and the connectivity of the network that links them together. We study a minimal model of the preBotzinger complex, a small neuronal network that controls the breathing rhythm of mammals through periodic firing bursts. We show that the properties of such a randomly connected network of identical excitatory neurons are fundamentally different from those of uniformly connected neuronal networks as described by mean-field theory. We show that (i) the connectivity properties of the network determine the location of emergent pacemakers that trigger the firing bursts and (ii) the collective desensitization that terminates the firing bursts is again determined by the network connectivity, through k-core clusters of neurons. Comment: 4+ pages, 4 figures, submitted to Phys. Rev. Lett.
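
    As a toy illustration of the k-core idea invoked above, the sketch below extracts the k-core of a sparse random graph; the network parameters are assumptions and this is not the paper's model of the preBotzinger complex.

```python
# Minimal sketch: find the k-core (maximal subgraph in which every node has
# degree >= k) of a sparse random network, a proxy for the clusters that the
# abstract links to emergent pacemakers and burst termination.
import networkx as nx

N_NEURONS = 300                      # illustrative network size
AVG_DEGREE = 6.0                     # illustrative mean connectivity
K = 3                                # core order of interest

G = nx.gnp_random_graph(N_NEURONS, AVG_DEGREE / (N_NEURONS - 1), seed=1)
core = nx.k_core(G, k=K)
core_numbers = nx.core_number(G)     # core order of each node

print(f"{core.number_of_nodes()} of {N_NEURONS} nodes lie in the {K}-core")
print("largest core order present:", max(core_numbers.values()))
```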

    Optimal Weighting of Preclinical Alzheimer’s Cognitive Composite (PACC) Scales to Improve their Performance as Outcome Measures for Alzheimer’s Disease Clinical Trials

    Introduction: Cognitive composite scales constructed by combining existing neuropsychometric tests are seeing wide application as endpoints for clinical trials and cohort studies of Alzheimer’s disease (AD) predementia conditions. Preclinical Alzheimer’s Cognitive Composite (PACC) scales are composite scores calculated as the sum of the component test scores weighted by the reciprocal of their standard deviations at the baseline visit. Reciprocal standard deviation is an arbitrary weighting in this context, and may be an inefficient use of the data contained in the component measures. Mathematically derived optimal composite weighting is a promising alternative. Methods: Sample size projections using standard power calculation formulas were used to describe the relative performance of component measures and their composites when used as endpoints for clinical trials. Power calculations were informed by data from amnestic mild cognitive impairment participants (n = 1,333) in the National Alzheimer’s Coordinating Center (NACC) Uniform Data Set. Results: A composite constructed using PACC reciprocal standard deviation weighting was both less sensitive to change than one of its component measures and less sensitive to change than its optimally weighted counterpart. In standard sample size calculations informed by NACC data, a clinical trial using the PACC weighting would require 38% more subjects than one using a composite calculated with optimal weighting. Discussion: These findings illustrate how reciprocal standard deviation weighting can result in inefficient cognitive composites, and underscore the importance of component weights to the performance of composite scales. In the future, optimal weighting parameters informed by accumulating clinical trial data may improve the efficiency of clinical trials in AD.
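
    To illustrate the weighting comparison described above, the sketch below contrasts a reciprocal-baseline-SD composite with one whose weights maximize the composite's mean-to-SD ratio of change, then converts each into a two-arm sample size using the standard normal-approximation formula; the component declines, covariance, and baseline SDs are made-up placeholders, not NACC estimates, and this is not the study's code.

```python
# Illustrative sketch: PACC-style 1/SD weights versus weights that maximize
# the composite's mean-to-SD ratio of annual change (MSDR).
import numpy as np
from scipy.stats import norm

delta = np.array([0.30, 0.25, 0.20, 0.15])     # placeholder annual declines
sigma_change = np.array([                      # placeholder covariance of change
    [1.00, 0.30, 0.20, 0.10],
    [0.30, 1.20, 0.25, 0.15],
    [0.20, 0.25, 0.90, 0.20],
    [0.10, 0.15, 0.20, 1.10],
])
baseline_sd = np.array([1.1, 0.9, 1.3, 1.0])   # placeholder baseline SDs

def n_per_arm(w, effect_fraction=0.25, alpha=0.05, power=0.80):
    """Two-arm sample size to detect a slowing of decline by effect_fraction."""
    msdr = (w @ delta) / np.sqrt(w @ sigma_change @ w)
    z = norm.ppf(1 - alpha / 2) + norm.ppf(power)
    return 2 * (z / (effect_fraction * msdr)) ** 2

w_pacc = 1.0 / baseline_sd                      # reciprocal-SD (PACC-style) weights
w_opt = np.linalg.solve(sigma_change, delta)    # MSDR-maximizing weights

print("n per arm, PACC-style weights:", round(n_per_arm(w_pacc)))
print("n per arm, optimal weights:   ", round(n_per_arm(w_opt)))
```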