
    Lempel-Ziv Complexity Reference

    The aim of this note is to provide some reference facts for LZW, mostly from Cover and Thomas \cite{Cover:2006aa}, and to provide a reference for some metrics that can be derived from it. LZW is an algorithm to compute a Kolmogorov complexity estimate derived from a limited programming language that allows only copying and insertion in strings (not a Turing-complete set). Despite its delightful simplicity, it is rather powerful and fast. We then focus on definitions of LZW-derived complexity metrics consistent with the notion of description length, and discuss different normalizations, which result in a set of metrics we call $\rho_0$, $\rho_1$ and $\rho_2$, in addition to the Description Length $l_{LZW}$ and the Entropy Rate. Comment: For the Luminous Project (FET Open); Zip file includes Python code and Jupyter notebook
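    As a flavor of the kind of metric involved, here is a minimal sketch of a Lempel-Ziv 1976-style parsing count and one descriptive-length normalization (illustrative only: the note works with LZW, and its exact definitions of $\rho_0$, $\rho_1$, $\rho_2$ and $l_{LZW}$ may differ; the function names here are assumptions):

```python
import math

def lz76_complexity(s: str) -> int:
    """Number of phrases in a Lempel-Ziv 1976 parsing of s.

    Each new phrase is the shortest extension that is not contained
    as a substring of what has been parsed so far (simple O(n^2) scan).
    """
    i, c, n = 0, 0, len(s)
    while i < n:
        k = 1
        # grow the candidate phrase while it is still a repeat
        while i + k <= n and s[i:i + k] in s[:i + k - 1]:
            k += 1
        c += 1
        i += k
    return c

def rho0(s: str) -> float:
    """One descriptive-length-style normalization, c(n) * log2(n) / n
    (illustrative; the note's rho_0 may be defined differently)."""
    n = len(s)
    return lz76_complexity(s) * math.log2(n) / n
```

    For an i.i.d. fair-coin binary string this normalized quantity tends to 1 bit per symbol, while highly repetitive strings score much lower.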

    Application of the reciprocity theorem to EEG inversion and optimization of EEG-driven transcranial current stimulation (tCS, including tDCS, tACS, tRNS)

    Multichannel transcranial current stimulation (tCS) systems offer the possibility of EEG-guided, optimized, non-invasive brain stimulation. In this brief technical note I explain how it is possible to use a realistic tCS electric field brain model to create a forward "lead-field" matrix and, from that, an EEG inverter for cortical mapping. Starting from EEG data, I show how to generate 2D cortical surface dipole fields that could produce the observed EEG electrode voltages. The main tool is the reciprocity theorem derived by Helmholtz. The application of reciprocity to the generation of a forward mapping matrix (lead-field matrix, as it is sometimes known) is well known [Rush and Driscoll, 1969], but here we use it in combination with the realistic head models of [Miranda et al., 2013] to provide cortical mapping solutions compatible with realistic head model tCS optimization. I also provide a generalization of the reciprocity theorem [Helmholtz, 1853] to the case of multiple electrode contact points and dipole sources, and discuss its uses in non-invasive brain stimulation based on EEG. This, as far as I know, is a novel result. Applications are discussed. Comment: 11 pages, 4 figures
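    The forward/inverse pair described above can be sketched numerically, assuming a precomputed lead-field matrix K (electrodes x dipoles); the Tikhonov-regularized minimum-norm inverse shown here is one standard choice for illustration, not necessarily the note's method, and all names are assumptions:

```python
import numpy as np

def forward_eeg(K: np.ndarray, j: np.ndarray) -> np.ndarray:
    """Electrode voltages from cortical dipole strengths: v = K j."""
    return K @ j

def min_norm_inverse(K: np.ndarray, v: np.ndarray, lam: float = 1e-3) -> np.ndarray:
    """Tikhonov-regularized minimum-norm estimate of the dipole field:
    j = K^T (K K^T + lam I)^{-1} v   (one of many inversion choices)."""
    n = K.shape[0]
    return K.T @ np.linalg.solve(K @ K.T + lam * np.eye(n), v)
```

    With a small regularizer, the reconstructed dipole field reproduces the measured electrode voltages while selecting the minimum-norm solution among the (underdetermined) alternatives.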

    Four approaches to quantization of the relativistic particle

    The connection between four different approaches to quantization of the relativistic particle is studied: reduced phase space quantization, Dirac quantization, BRST quantization, and (BRST)-Fock quantization are each carried out. The connection to the BFV path integral in phase space is provided. In particular, it is concluded that the full range of the lapse should be used in such path integrals. The relationship between all these approaches is established. Comment: 27 pages
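    For orientation, the shared classical starting point of these quantizations is standard textbook material (a sketch, not reproduced from the paper):

```latex
% First-order action of the relativistic particle with lapse N,
% metric signature (-,+,+,+):
S[x,p,N] \;=\; \int d\tau \left( p_\mu \dot{x}^\mu
          \;-\; \frac{N}{2}\left( p^\mu p_\mu + m^2 \right) \right).

% Varying N yields the first-class mass-shell constraint
\mathcal{H} \;=\; \tfrac{1}{2}\left( p^\mu p_\mu + m^2 \right) \;\approx\; 0,

% which Dirac quantization imposes on physical states
% (the Klein--Gordon equation):
\left( \Box - m^2 \right) \psi(x) \;=\; 0 .
```

    The four approaches differ in whether this constraint is solved before quantization, imposed on states, or traded for BRST cohomology; the treatment of the lapse N in the path integral is where they must be compared.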

    Ionospheric (H-atom) Tomography: a Feasibility Study using GNSS Reflections

    In this report we analyze the feasibility of ionospheric monitoring using GNSS technology. The focus is on the use of LEO GNSS data, exploiting GNSS Reflection, Navigation and Occultation TEC measurements. To address this question, we have simulated GNSS ionospheric TEC data as it would be measured from a polar LEO (exploiting Navigation, Occultation and Reflection TEC data) and from IGS ground stations, through the use of a climatic ionospheric model (we have explored both NeQuick and PIM). We have then developed a new tomographic approach inspired by the physics of the hydrogen atom, which has been compared to previous successful but somewhat awkward methods (using a voxel representation) and employed to retrieve the electron density field from the simulated TEC data. These tomographic inversion results using simulated data demonstrate the significant impact of GNSS-R and GNSS-NO data: 3D ionospheric electron density fields are retrieved over the oceans quite accurately, even though, in the spirit of this initial study, the simulation and inversion approaches avoided intensive computation and sophisticated algorithmic elements (e.g., spatio-temporal smoothing). We conclude that GNSS-R data can contribute significantly to the GIOS (Global/GNSS Ionospheric Observation System). Comment: Abridged Starlab ESA report from ESTEC/ESA Contract No. Starlab/CO/0001/02, Courtesy of ESA and Starlab
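    The retrieval step is, schematically, a linear inverse problem: once the electron density is expanded in basis functions, each TEC measurement (a line integral of density along a ray) is linear in the coefficients. A hedged sketch of the generic regularized least-squares solve (the matrix G of basis functions integrated along the rays is assumed precomputed; this illustrates the linear-inversion structure, not the hydrogen-atom-inspired basis itself):

```python
import numpy as np

def retrieve_density(G: np.ndarray, tec: np.ndarray, lam: float = 1e-2) -> np.ndarray:
    """Regularized least-squares retrieval of basis-function
    coefficients a from TEC line integrals:
        minimize ||G a - tec||^2 + lam * ||a||^2,
    where G[i, k] is basis function k integrated along ray i."""
    n = G.shape[1]
    return np.linalg.solve(G.T @ G + lam * np.eye(n), G.T @ tec)
```

    The choice of basis (voxels vs. hydrogen-atom-like functions) changes G and the conditioning of the problem, but not this overall structure.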

    PARIS Altimetry with L1 Frequency Data from the Bridge 2 Experiment

    A portion of 20 minutes of the GPS signals collected during the Bridge 2 experimental campaign, performed by ESA, has been processed. An innovative algorithm called Parfait, developed by Starlab and implemented within Starlab's GNSS-R software package STARLIGHT (STARLab Interferometric Gnss Toolkit), has been successfully used with this set of data. A comparison with independently collected tide values and with differential GPS processed data has been performed. We report a successful PARIS phase-altimetric measurement of the Zeeland Brug height over the sea surface with a rapidly changing tide, with a precision better than 2 cm. Comment: Abridged Starlab ESA/ESTEC Technical Report from the Paris Alpha CCN3, Courtesy of ESA/ESTEC and Starlab
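    The basic geometry behind such phase altimetry can be sketched with a flat-surface specular model (illustrative only, not the Parfait algorithm; the function and constant names are assumptions):

```python
import math

L1_WAVELENGTH_M = 0.1903  # GPS L1 carrier wavelength, c/f1 (~19.03 cm)

def height_from_phase_delay(delay_cycles: float, elevation_deg: float) -> float:
    """Receiver height over the reflecting surface from the
    reflected-minus-direct carrier-phase delay, using the
    flat-surface specular relation: path difference = 2 h sin(elev)."""
    path_diff_m = delay_cycles * L1_WAVELENGTH_M
    return path_diff_m / (2.0 * math.sin(math.radians(elevation_deg)))
```

    Because one L1 carrier cycle is about 19 cm of path, tracking the interferometric phase to a small fraction of a cycle is what makes cm-level altimetric precision plausible.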

    Stationary Phase in Coherent State Path Integrals

    In applying the stationary phase approximation to coherent state path integrals a difficulty occurs: there are no classical paths that satisfy the boundary conditions of the path integral. Others have gotten around this problem by reevaluating the action. In this work it is shown that reevaluating the action is not necessary, because the stationary phase approximation is applicable as long as the path about which the expansion is performed satisfies the associated Lagrange equations of motion. It is not necessary for this path to satisfy the boundary conditions in order to apply the stationary phase approximation. Comment: 10 pages, RevTeX
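    Schematically, the claim is that the Gaussian expansion may be performed about any path obeying the equations of motion (a notational sketch, not the paper's derivation):

```latex
% Stationary-phase expansion of a coherent-state path integral about
% a path z_s(t) satisfying the Lagrange equations of motion,
%   \delta S[z_s] = 0,
% which need NOT satisfy the boundary conditions z(0)=z_i, z(T)=z_f:
\langle z_f |\, e^{-iHT/\hbar} \,| z_i \rangle
  \;\approx\; \mathcal{N}\; e^{\,iS[z_s]/\hbar}
  \int \mathcal{D}\eta\;
  \exp\!\left( \frac{i}{2\hbar}\, \delta^2 S[z_s]\,[\eta,\eta] \right),
```

    where the Gaussian integral over the fluctuation $\eta$ supplies the prefactor $\mathcal{N}$, and the boundary data enter through the endpoints of the expansion rather than through $z_s$ itself.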

    Spherical Harmonics Interpolation, Computation of Laplacians and Gauge Theory

    The aim of this note is to define an algorithm for minimal-curvature spherical harmonics interpolation, which is then used to calculate the Laplacian for multi-electrode EEG data analysis. The approach taken is to respect the data: we impose a minimal-curvature condition on the interpolating surface subject to the constraints determined from the multi-electrode data, and implement this approach using spherical harmonics interpolation. In this elegant example we show that the minimization requirement and the constraints complement each other to fix all degrees of freedom automatically, as occurs in gauge theories. That is, the constraints are respected, while the minimization is enforced only on the orthogonal subspace. As an example, we discuss the application to interpolating control data and calculating the temporal sequence of Laplacians from an EEG Mismatch Negativity (MMN) experiment (using an implementation of the algorithm in IDL). Comment: 14 pages, 3 figures. This is an internal Starlab Knowledge Nugget, with public status
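    One payoff of the spherical harmonics representation is that the surface Laplacian is diagonal in coefficient space: each $Y_{lm}$ component is simply scaled by $-l(l+1)/R^2$. A minimal sketch of that final step (assuming the interpolation coefficients and their degrees have already been fitted; names are illustrative):

```python
import numpy as np

def surface_laplacian_coeffs(coeffs, degrees, radius=1.0):
    """Surface Laplacian in spherical-harmonic coefficient space.

    If V = sum_lm a_lm Y_lm on a sphere of radius R, then
    Lap V = sum_lm [-l(l+1)/R^2] a_lm Y_lm, so the operator reduces
    to a per-coefficient scaling by the harmonic's degree l."""
    l = np.asarray(degrees, dtype=float)
    return -l * (l + 1.0) / radius**2 * np.asarray(coeffs, dtype=float)
```

    The hard part of the algorithm is fitting the coefficients under the minimal-curvature condition; once fitted, the Laplacian at each time sample is this cheap rescaling followed by re-synthesis on the scalp.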

    Reality as Simplicity

    The aim of this paper is to study the relevance of simplicity, and of its formal representation as Kolmogorov or algorithmic complexity, in the cognitive sciences. The discussion is based on two premises: 1) all human experience is generated in the brain; 2) the brain has access only to information. Taken together, these two premises lead us to conclude that all the elements of what we call `reality' are derived mental constructs based on information and compression, i.e., algorithmic models derived from the search for simplicity in data. Naturally, these premises apply to humans in real or virtual environments as well as to robots or other cognitive systems. Based on this, it is further hypothesized that there is a hierarchy of processing levels where simplicity and compression play a major role. As applications, I first illustrate the relevance of compression and simplicity in fundamental neuroscience with an analysis of the Mismatch Negativity paradigm. Then I discuss the applicability to Presence research, which studies how to produce real-feeling experiences in mediated interaction, and use Bayesian modeling to define in a formal way different aspects of the illusion of Presence. The idea is put forth that, given alternative models (interpretations) for a given mediated interaction, a brain will select the simplest one it can construct, weighted by prior models. In the final section the universality of these ideas and their applications in robotics, machine learning, biology and education are discussed. I emphasize that there is a common conceptual thread based on the idea of simplicity, which suggests a common study approach. Comment: Submitted to Brain Research Bulletin (special edition on VR, brain research and robotics). 42 pages and 3 figures
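    The selection rule sketched above, choosing the simplest interpretation weighted by priors, can be phrased in description-length terms. A hedged toy sketch, not the paper's formalism (all names are assumptions):

```python
import math

def select_simplest(models, log_likelihood, complexity_bits):
    """Pick the interpretation with the shortest total description:
    model bits (a simplicity prior) plus data bits given the model
    (-log2 likelihood).  Equivalent to a MAP choice under a prior
    probability proportional to 2**(-complexity_bits)."""
    def total_bits(m):
        return complexity_bits[m] - log_likelihood[m] / math.log(2)
    return min(models, key=total_bits)
```

    When two interpretations explain the sensory stream equally well, the shorter (simpler) model wins; a more complex model is selected only if it buys enough extra likelihood to pay for its own description.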

    Information, complexity, brains and reality (Kolmogorov Manifesto)

    I discuss several aspects of information theory and its relationship to physics and neuroscience. The unifying thread of this somewhat chaotic essay is the concept of Kolmogorov or algorithmic complexity (Kolmogorov Complexity, for short). I argue that it is natural to interpret cognition as the art of finding algorithms that approach the Solomonoff-Kolmogorov-Chaitin (algorithmic) Complexity limit with appropriate tradeoffs. In addition, I claim that what we call the universe is an interpreted abstraction, a mental construct, based on the observed coherence between multiple sensory input streams and our own interactions. Hence, the notion of Universe is itself a model. Comment: This is a live essay, kind of a mental log book on a series of topics under the theme of information and compression

    Analysis of Water Vapor spatio-temporal structure over the Madrid Area using GPS data

    We have analyzed Zenith Wet Delay (ZWD) time series from an experiment over the Madrid (Spain) area, obtained from 5 GPS receivers, using two different techniques. In the first case, a delay correlation analysis of the ZWD time series has been carried out. We show that for this small network (with a spatial scale of less than 100 km) the correlation between the time series is very strong, and that using windowing techniques a reliable correlation delay time series can be produced for each pair of sites (10 such pairs are available). We use these delay time series together with a frozen-flow model to estimate the velocity of a passing front, and compare the results to meteorological data and Numerical Weather Prediction output, showing good agreement. In the second approach, the data are analyzed using Empirical Orthogonal Functions. We demonstrate that the temporally demeaned and normalized analysis yields information about the passing of fronts, while the spatially demeaned data yield orographic information. A common second mode highlights the underlying wave behavior. Comment: 19 pages, 8 figures
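    Both analysis techniques can be sketched generically (illustrative only; the paper's windowing, frozen-flow model, and normalization conventions are not reproduced, and all names are assumptions):

```python
import numpy as np

def xcorr_delay(x, y):
    """Lag (in samples) at which series y best matches series x,
    found as the peak of the full cross-correlation of the
    demeaned series; positive lag means y is delayed relative to x."""
    x = np.asarray(x, float) - np.mean(x)
    y = np.asarray(y, float) - np.mean(y)
    c = np.correlate(y, x, mode="full")
    return int(np.argmax(c)) - (len(x) - 1)

def eofs(data):
    """Empirical Orthogonal Functions of a (time x station) matrix:
    the spatial patterns are the right singular vectors of the
    temporally demeaned data."""
    d = data - data.mean(axis=0)
    _, _, vt = np.linalg.svd(d, full_matrices=False)
    return vt.T  # columns are EOFs, ordered by explained variance
```

    With pairwise delays in hand, a frozen-flow assumption turns each pair's delay and baseline vector into one linear constraint on the front velocity; the EOF columns separate the dominant large-scale mode from orographic and wave-like structure.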