
    A Multiple Attribute Decision Making Approach Based on New Similarity Measures of Interval-valued Hesitant Fuzzy Sets

    Hesitant fuzzy sets, an extension of fuzzy sets for dealing with uncertainty, have attracted much attention since their introduction, in both theory and applications. The present work focuses on interval-valued hesitant fuzzy sets (IVHFSs), which manage an additional layer of uncertainty. Since distance and similarity are essential information measures and important numerical indexes in fuzzy set theory and all of its extensions, this work investigates distance and similarity measures for IVHFSs and then employs them in a multiple attribute decision making application. First, a Type-II generalized interval-valued hesitant fuzzy distance is introduced for IVHFSs, along with its properties and its relationships with the traditional Hamming and Euclidean distances. Next, an interval-valued hesitant fuzzy Lp distance based on the Lp metric is proposed and its relationship with the Hausdorff distance is discussed. In addition, unlike most similarity measures, which depend on the corresponding distances, a new set-theoretic similarity measure for IVHFSs is introduced and its properties are discussed; in particular, a relative similarity measure is proposed based on the positive ideal IVHFS and the negative ideal IVHFS. Finally, we describe how the IVHFS and its relative similarity measure can be applied to multiple attribute decision making. A numerical example is then provided to illustrate the effectiveness of the proposed method.
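
    The paper defines its own Type-II and Lp distances; purely as a rough illustration of how such interval-valued hesitant fuzzy measures are computed, the Python sketch below implements one generalized distance that is common in the IVHFS literature (lam = 1 gives a Hamming-type distance, lam = 2 a Euclidean-type one) together with a simple distance-based similarity. The interval representation, the function names, and the 1 - d similarity are illustrative assumptions, not the paper's Type-II or set-theoretic definitions.

```python
def generalized_ivhf_distance(h1, h2, lam=2):
    """Generalized distance between two interval-valued hesitant fuzzy
    elements, each given as a list of (lower, upper) membership intervals.

    lam = 1 reduces to a Hamming-type distance, lam = 2 to a Euclidean-type
    distance.  Both elements are assumed to have been extended to the same
    length, as is customary in the hesitant fuzzy literature.
    """
    if len(h1) != len(h2):
        raise ValueError("extend the shorter element before comparing")
    n = len(h1)
    total = sum(abs(a_lo - b_lo) ** lam + abs(a_up - b_up) ** lam
                for (a_lo, a_up), (b_lo, b_up) in zip(sorted(h1), sorted(h2)))
    return (total / (2 * n)) ** (1 / lam)


def ivhf_similarity(h1, h2, lam=2):
    """Distance-based similarity s = 1 - d (the paper's relative similarity
    measure, built on ideal IVHFSs, is more elaborate)."""
    return 1.0 - generalized_ivhf_distance(h1, h2, lam)


# Two hypothetical assessments of one alternative under one attribute.
h_a = [(0.2, 0.4), (0.5, 0.7)]
h_b = [(0.3, 0.6), (0.4, 0.8)]
print(ivhf_similarity(h_a, h_b, lam=1))  # Hamming-based similarity, ~0.875
print(ivhf_similarity(h_a, h_b, lam=2))  # Euclidean-based similarity, ~0.868
```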

    Photons uncertainty solves Einstein-Podolsky-Rosen paradox

    Einstein, Podolsky and Rosen (EPR) pointed out that the quantum-mechanical description of "physical reality" implied an unphysical, instantaneous action between distant measurements. To avoid such action at a distance, EPR concluded that Quantum Mechanics had to be incomplete. However, its extensions involving additional "hidden variables", which would allow the recovery of determinism and locality, have been disproved experimentally (Bell's theorem). Here, I present an opposite resolution of the paradox based on the greater indeterminism of the modern Quantum Field Theory (QFT) description of Particle Physics, which prevents the preparation of any state with a definite number of particles. The resulting uncertainty in photon radiation has interesting consequences in Quantum Information Theory (e.g. cryptography and teleportation). Moreover, since it allows for fewer elements of EPR physical reality than the old non-relativistic Quantum Mechanics, QFT satisfies the EPR condition of completeness without the need for hidden variables. The residual physical reality never violates locality; thus the only objective proof of "quantum nonlocality" is removed in an interpretation-independent way. The supposed nonlocality of the EPR correlations instead turns out to be a problem of the interpretation of the theory. If we do not rely on hidden variables or new physics beyond QFT, the only viable interpretation is a minimal statistical one that preserves locality and Lorentz symmetry.
    Comment: Published version, with updated references

    Information Gains from Cosmological Probes

    In light of the growing number of cosmological observations, it is important to develop versatile tools to quantify the constraining power and consistency of cosmological probes. Originally motivated by information theory, we use the relative entropy to compute the information gained by Bayesian updates in units of bits. This measure quantifies both the improvement in precision and the 'surprise', i.e. the tension arising from shifts in central values. Our starting point is a WMAP9 prior, which we update with observations of the distance ladder, supernovae (SNe), baryon acoustic oscillations (BAO), and weak lensing, as well as the 2015 Planck release. We consider the parameters of the flat ΛCDM concordance model and some of its extensions, which include curvature and the Dark Energy equation-of-state parameter w. We find that, relative to WMAP9 and within these model spaces, the probes that have provided the greatest gains are Planck (10 bits), followed by BAO surveys (5.1 bits) and SNe experiments (3.1 bits). The other cosmological probes, including weak lensing (1.7 bits) and H_0 measurements (1.7 bits), have contributed information but at a lower level. Furthermore, we do not find any significant surprise when updating the constraints of WMAP9 with any of the other experiments, meaning that they are consistent with WMAP9. However, when we choose Planck15 as the prior, we find that, accounting for the full multi-dimensionality of the parameter space, the weak lensing measurements of CFHTLenS produce a large surprise of 4.4 bits, which is statistically significant at the 8σ level. We discuss how the relative entropy provides a versatile and robust framework to compare cosmological probes in the context of current and future surveys.
    Comment: 26 pages, 5 figures
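
    As a minimal, self-contained illustration of the information-gain statistic used here (not the paper's pipeline, which works with the full multi-dimensional posteriors), the sketch below computes the relative entropy D_KL(posterior || prior) in bits for a one-dimensional Gaussian prior and posterior. The two terms of the Gaussian formula separate the precision gain from the 'surprise' due to a shift of the central value; all numbers in the example are made up.

```python
import numpy as np

def relative_entropy_bits(mu_post, sigma_post, mu_prior, sigma_prior):
    """D_KL(posterior || prior) for 1-D Gaussians, converted from nats to bits.

    The log-ratio term rewards shrinking the error bar (precision gain), while
    the (mu_post - mu_prior)**2 term captures the 'surprise' from a shift of
    the central value.
    """
    nats = (np.log(sigma_prior / sigma_post)
            + (sigma_post**2 + (mu_post - mu_prior)**2) / (2 * sigma_prior**2)
            - 0.5)
    return nats / np.log(2)

# Toy numbers: an error bar that shrinks by a factor of 4 with no shift...
print(relative_entropy_bits(0.0, 0.25, 0.0, 1.0))  # ~1.3 bits (pure precision gain)
# ...versus the same shrinkage plus a 2-sigma shift of the best-fit value.
print(relative_entropy_bits(2.0, 0.25, 0.0, 1.0))  # ~4.2 bits (precision + surprise)
```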

    Information Theoretical Estimators Toolbox

    We present ITE (information theoretical estimators), a free and open-source, multi-platform Matlab/Octave toolbox capable of estimating many different variants of entropy, mutual information, divergence, association measures, cross quantities, and kernels on distributions. Thanks to its highly modular design, ITE additionally supports (i) combinations of the estimation techniques, (ii) the easy construction and embedding of novel information theoretical estimators, and (iii) their immediate application in information theoretical optimization problems. ITE also includes a prototype application in a central problem class of signal processing: independent subspace analysis and its extensions.
    Comment: 5 pages; ITE toolbox: https://bitbucket.org/szzoli/ite
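
    ITE itself is a Matlab/Octave toolbox with its own initialization/estimation calling convention, which is not reproduced here. Purely as a standalone illustration of the kind of nonparametric estimator such a toolbox provides, the Python sketch below implements the classic Kozachenko-Leonenko k-nearest-neighbour estimator of differential Shannon entropy; the function name and defaults are this sketch's own assumptions, not ITE's API.

```python
import numpy as np
from scipy.spatial import cKDTree
from scipy.special import digamma, gamma

def knn_shannon_entropy(x, k=3):
    """Kozachenko-Leonenko k-nearest-neighbour estimate of differential
    Shannon entropy (in nats) for samples x of shape (n_samples, dim)."""
    x = np.asarray(x, dtype=float)
    n, d = x.shape
    # Distance from each point to its k-th nearest neighbour (self excluded).
    eps = cKDTree(x).query(x, k=k + 1)[0][:, -1]
    # Log-volume of the d-dimensional unit ball.
    log_unit_ball = (d / 2) * np.log(np.pi) - np.log(gamma(d / 2 + 1))
    return digamma(n) - digamma(k) + log_unit_ball + d * np.mean(np.log(eps))

# Sanity check against the analytic entropy of a standard 2-D Gaussian,
# H = (d/2) * ln(2*pi*e) ≈ 2.84 nats.
rng = np.random.default_rng(0)
print(knn_shannon_entropy(rng.standard_normal((5000, 2)), k=3))
```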