
    Maximum Entropy and Bayesian Data Analysis: Entropic Priors

    The problem of assigning probability distributions that objectively reflect the prior information available about experiments is one of the major stumbling blocks in the use of Bayesian methods of data analysis. In this paper the method of Maximum (relative) Entropy (ME) is used to translate the information contained in the known form of the likelihood into a prior distribution for Bayesian inference. The argument is inspired and guided by intuition gained from the successful use of ME methods in statistical mechanics. For experiments that cannot be repeated the resulting "entropic prior" is formally identical with the Einstein fluctuation formula. For repeatable experiments, however, the expected value of the entropy of the likelihood turns out to be relevant information that must be included in the analysis. The important case of a Gaussian likelihood is treated in detail. Comment: 23 pages, 2 figures
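
The Gaussian case turns on the entropy of the likelihood itself. As a minimal sketch (the function name and values are illustrative, not taken from the paper), the differential entropy of a Gaussian likelihood with standard deviation sigma has the closed form H = ½ ln(2πeσ²):

```python
import math

def gaussian_entropy(sigma):
    """Differential entropy of a Gaussian likelihood N(mu, sigma^2):
    H = 0.5 * ln(2 * pi * e * sigma^2)."""
    return 0.5 * math.log(2 * math.pi * math.e * sigma ** 2)

# A broader likelihood carries less information, so its entropy is larger:
print(gaussian_entropy(1.0))  # ~1.419
print(gaussian_entropy(2.0))  # ~2.112
```

Because this entropy depends only on sigma, its expected value is easy to include as a constraint in the repeatable-experiment setting the abstract describes.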

    Semi-supervised Multi-sensor Classification via Consensus-based Multi-View Maximum Entropy Discrimination

    In this paper, we consider multi-sensor classification when there is a large number of unlabeled samples. The problem is formulated under the multi-view learning framework and a Consensus-based Multi-View Maximum Entropy Discrimination (CMV-MED) algorithm is proposed. By iteratively maximizing the stochastic agreement between multiple classifiers on the unlabeled dataset, the algorithm simultaneously learns multiple high-accuracy classifiers. We demonstrate that our proposed method can yield improved performance over previous multi-view learning approaches by comparing performance on three real multi-sensor data sets. Comment: 5 pages, 4 figures. Accepted at the 40th IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP 15).
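
One way to make the "stochastic agreement between multiple classifiers" concrete is to measure how far two views' class posteriors diverge on the unlabeled pool. The sketch below uses a symmetrized KL divergence as the disagreement to drive toward zero; this is an assumed illustration of the consensus idea, not the actual CMV-MED objective:

```python
import math

def kl(p, q, eps=1e-12):
    """Discrete KL divergence KL(p || q) between two class posteriors."""
    return sum(pi * math.log((pi + eps) / (qi + eps)) for pi, qi in zip(p, q))

def pairwise_disagreement(view_a, view_b):
    """Average symmetrized KL between two views' posteriors on the
    unlabeled samples; enforcing consensus means minimizing this."""
    n = len(view_a)
    return sum(kl(a, b) + kl(b, a) for a, b in zip(view_a, view_b)) / n
```

Identical posteriors give zero disagreement, so a consensus term like this rewards classifiers that agree on unlabeled data while each is trained on its own sensor view.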

    Does psychedelic therapy have a transdiagnostic action and prophylactic potential?

    Addressing global mental health is a major twenty-first century challenge. Current treatments have recognised limitations; in this context, new ones that are prophylactic and effective across diagnostic boundaries would represent a major advance. The view that there exists a core of transdiagnostic overlap between psychiatric disorders has re-emerged in recent years, and evidence that psychedelic therapy holds promise for a range of psychiatric disorders supports the position that it may be transdiagnostically effective. Here we propose that psychedelic therapy's core, transdiagnostically relevant action lies in its ability to increase neuronal and mental plasticity, thus enhancing the potential for change, which we consider to be a key to its therapeutic benefits. Moreover, we suggest that enhanced plasticity via psychedelics, combined with a psychotherapeutic approach, can aid healthy adaptability and resilience, protective factors for long-term well-being. We present candidate neurological and psychological markers of this plasticity and link them with a predictive processing model of the action of psychedelics. We propose that a model of psychedelic-induced plasticity combined with an adequate therapeutic context has prophylactic and transdiagnostic potential, implying that it could have a broad, positive impact on public health.

    Questions, relevance and relative entropy

    What is a question? According to Cox a question can be identified with the set of assertions that constitute possible answers. In this paper we propose a different approach that combines the notion that questions are requests for information with the notion that probability distributions represent uncertainties resulting from lack of information. This suggests that to each probability distribution one can naturally associate that particular question which requests the information that is missing, and vice versa. We propose to represent questions q by probability distributions. Next we consider how questions relate to each other: to what extent is finding the answer to one question relevant to answering another? A natural measure of relevance is derived by requiring that it satisfy three desirable features (three axioms). We find that the relevance of a question q to another question Q turns out to be the relative entropy S[q,Q] of the corresponding distributions. An application to statistical physics is briefly considered. Comment: Presented at MaxEnt 2004, the 24th International Workshop on Bayesian Inference and Maximum Entropy Methods (July 25-30, 2004, Garching bei Munchen, Germany).
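
Since questions are represented by distributions, the relevance S[q,Q] reduces to an ordinary relative entropy between them. A minimal sketch for the discrete case (the sign and normalization conventions are assumed, not quoted from the paper):

```python
import math

def relevance(q, Q, eps=1e-12):
    """Relative entropy S[q, Q] = sum_i q_i * log(q_i / Q_i) between the
    distributions representing questions q and Q, used here as the
    relevance of question q to question Q (conventions assumed)."""
    return sum(qi * math.log((qi + eps) / (Qi + eps)) for qi, Qi in zip(q, Q))
```

A question is maximally relevant to itself (S[q,q] = 0), and relevance falls off as the two distributions, i.e. the two requests for missing information, diverge.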

    Multi-observation PET image analysis for patient follow-up quantitation and therapy assessment

    In positron emission tomography (PET) imaging, an early therapeutic response is usually characterized by variations of semi-quantitative parameters restricted to the maximum SUV measured in PET scans during the treatment. Such measurements do not reflect overall tumor volume and radiotracer uptake variations. The proposed approach is based on multi-observation image analysis for merging several PET acquisitions to assess tumor metabolic volume and uptake variations. The fusion algorithm is based on iterative estimation using a stochastic expectation maximization (SEM) algorithm. The proposed method was applied to simulated and clinical follow-up PET images. We compared the multi-observation fusion performance to threshold-based methods proposed for the assessment of the therapeutic response based on functional volumes. On simulated datasets the adaptive threshold applied independently on both images led to higher errors than the ASEM fusion, and on clinical datasets it failed to provide coherent measurements for four patients out of seven due to aberrant delineations. The ASEM method demonstrated improved and more robust estimation, leading to more pertinent measurements. Future work will consist of extending the methodology and applying it to clinical multi-tracer datasets in order to evaluate its potential impact on the biological tumor volume definition for radiotherapy applications.
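
The core iterative machinery, stochastic expectation maximization, alternates a probabilistic labeling step with a sampling step and closed-form parameter updates. The toy below runs generic SEM on a two-component 1-D Gaussian mixture; it is a sketch of the SEM principle only, not the paper's ASEM fusion method or its PET-specific model:

```python
import math
import random

def sem_two_gaussians(data, iters=50, seed=0):
    """Minimal Stochastic EM for a 2-component 1-D Gaussian mixture.
    E-step: posterior responsibilities per sample;
    S-step: draw a hard label from those responsibilities;
    M-step: closed-form mean/std/weight updates from the sampled partition."""
    rng = random.Random(seed)
    mu = [min(data), max(data)]   # crude initialization at the extremes
    sd = [1.0, 1.0]
    w = [0.5, 0.5]
    for _ in range(iters):
        labels = []
        for x in data:
            p = [w[k] * math.exp(-0.5 * ((x - mu[k]) / sd[k]) ** 2) / sd[k]
                 for k in (0, 1)]
            z = 0 if rng.random() < p[0] / (p[0] + p[1]) else 1
            labels.append(z)
        for k in (0, 1):
            xs = [x for x, z in zip(data, labels) if z == k]
            if xs:
                mu[k] = sum(xs) / len(xs)
                var = sum((x - mu[k]) ** 2 for x in xs) / len(xs)
                sd[k] = max(math.sqrt(var), 1e-3)
                w[k] = len(xs) / len(data)
    return mu, sd, w
```

The stochastic labeling step is what distinguishes SEM from plain EM: sampling hard assignments instead of averaging over them helps the iteration escape poor local configurations.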

    Pair diffusion, hydrodynamic interactions, and available volume in dense fluids

    We calculate the pair diffusion coefficient D(r) as a function of the distance r between two hard-sphere particles in a dense monodisperse suspension. The distance-dependent pair diffusion coefficient describes the hydrodynamic interactions between particles in a fluid that are central to theories of polymer and colloid dynamics. We determine D(r) from the propagators (Green's functions) of particle pairs obtained from discontinuous molecular dynamics simulations. At distances exceeding 3 molecular diameters, the calculated pair diffusion coefficients are in excellent agreement with predictions from exact macroscopic hydrodynamic theory for large Brownian particles suspended in a solvent bath, as well as the Oseen approximation. However, the asymptotic 1/r distance dependence of D(r) associated with hydrodynamic effects emerges only after the pair distance dynamics has been followed for relatively long times, indicating non-negligible memory effects in the pair diffusion at short times. Deviations of the calculated D(r) from the hydrodynamic models at short distances r reflect the underlying many-body fluid structure, and are found to be correlated to differences in the local available volume. The procedure used here to determine the pair diffusion coefficients can also be used for single-particle diffusion in confinement with spherical symmetry. Comment: 6 pages, 5 figures
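
The 1/r asymptote mentioned above is the signature of the Oseen-level hydrodynamic coupling. One standard far-field textbook form for the longitudinal relative pair diffusion coefficient of two spheres of radius a with single-particle diffusivity D0 is sketched below; this is a generic hedged illustration, not the specific expression or data from the paper:

```python
def pair_diffusion_oseen(r, a, D0):
    """Far-field (Oseen-level) longitudinal relative pair diffusion
    coefficient for two spheres of radius a at center separation r:
        D(r) ~ 2 * D0 * (1 - 3*a / (2*r))
    The correction decays as 1/r, the asymptote discussed in the text."""
    return 2.0 * D0 * (1.0 - 1.5 * a / r)
```

At large r the coefficient approaches the free-particle value 2*D0, while the hydrodynamic drag between the pair suppresses it at smaller separations.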

    Component separation methods for the Planck mission

    The Planck satellite will map the full sky at nine frequencies from 30 to 857 GHz. The CMB intensity and polarization that are its prime targets are contaminated by foreground emission. The goal of this paper is to compare proposed methods for separating CMB from foregrounds based on their different spectral and spatial characteristics, and to separate the foregrounds into components of different physical origin. A component separation challenge has been organized, based on a set of realistically complex simulations of sky emission. Several methods, including those based on internal template subtraction, the maximum entropy method, parametric methods, spatial and harmonic cross-correlation methods, and independent component analysis, have been tested. Different methods proved to be effective in cleaning the CMB maps from foreground contamination, in reconstructing maps of diffuse Galactic emissions, and in detecting point sources and thermal Sunyaev-Zeldovich signals. The power spectrum of the residuals is, on the largest scales, four orders of magnitude lower than that of the input Galaxy power spectrum at the foreground minimum. The CMB power spectrum was accurately recovered up to the sixth acoustic peak. The point source detection limit reaches 100 mJy, and about 2300 clusters are detected via the thermal SZ effect on two thirds of the sky. We have found that no single method performs best for all scientific objectives. We foresee that the final component separation pipeline for Planck will involve a combination of methods and iterations between processing steps targeted at different objectives such as diffuse component separation, spectral estimation and compact source extraction. Comment: Matches version accepted by A&A. A version with high resolution figures is available at http://people.sissa.it/~leach/compsepcomp.pd
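
Internal template subtraction, the simplest of the methods listed, fits a single amplitude that scales a foreground template to the observed map and removes it. A minimal sketch with a closed-form least-squares fit (function and variable names are ours; real pipelines work on pixelized or harmonic-space maps with noise weighting):

```python
def template_subtract(sky, template):
    """Fit alpha minimizing |sky - alpha * template|^2 (ordinary least
    squares, closed form) and subtract the scaled template.
    Returns the cleaned map and the fitted amplitude."""
    num = sum(s * t for s, t in zip(sky, template))
    den = sum(t * t for t in template)
    alpha = num / den
    return [s - alpha * t for s, t in zip(sky, template)], alpha
```

When the CMB is uncorrelated with the template, the fitted amplitude recovers the true foreground scaling and the residual is the CMB map; correlated chance alignments are one reason no single method wins for every objective.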