64 research outputs found

    Bounds on Two Parametric New Generalized Fuzzy Entropy

    Get PDF
    In this paper we define a new two-parametric generalized fuzzy average code-word length... Keywords: Fuzzy set, Membership function, Shannon's entropy, Fuzzy entropy, Code-word length, Kraft inequality, Coding theorem, Hölder's inequality, Optimal code length. More details can be found in the full paper.
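
    As background for the abstract above, here is a minimal Python sketch (not from the paper) of the two classical ingredients it names: the De Luca–Termini fuzzy entropy, which two-parametric measures of this kind generalize, and the Kraft inequality on which such coding theorems rest. Function names are illustrative.

    ```python
    import math

    def fuzzy_entropy(mu):
        """De Luca-Termini fuzzy entropy of a fuzzy set given by its
        membership values mu_i in [0, 1] (with 0*log 0 taken as 0)."""
        def h(x):
            if x in (0.0, 1.0):
                return 0.0
            return -x * math.log(x) - (1 - x) * math.log(1 - x)
        return sum(h(x) for x in mu)

    def kraft_holds(lengths, D=2):
        """Kraft inequality: a D-ary prefix code with these code-word
        lengths exists iff sum(D**-l) <= 1."""
        return sum(D ** -l for l in lengths) <= 1

    print(fuzzy_entropy([0.2, 0.5, 0.9]))  # largest contribution at mu = 0.5
    print(kraft_holds([1, 2, 3, 3]))       # 1/2 + 1/4 + 1/8 + 1/8 = 1 -> True
    ```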

    Generalization of Renyi’s Entropy and its Application in Source Coding

    Get PDF
    In this paper, we introduce a new generalization of Rényi's entropy, R_α^β(P). The most important feature of this generalized entropy is that it recovers, as special cases, several well-known entropies that are influential in information theory and applied mathematics. Some significant properties of R_α^β(P) are studied in this article. In addition, we introduce a new generalized exponentiated mean codeword length L_α^β(P) and then establish how R_α^β(P) and L_α^β(P) are related through a source coding theorem.
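
    For reference, a minimal sketch of the classical Rényi entropy that R_α^β(P) generalizes; it shows numerically that Shannon entropy is recovered in the limit α → 1. The two-parameter measure itself is not stated in the abstract, so only this base case is illustrated.

    ```python
    import math

    def renyi_entropy(p, alpha):
        """Renyi entropy H_alpha(P) = log(sum p_i**alpha) / (1 - alpha),
        defined for alpha > 0, alpha != 1."""
        return math.log(sum(pi ** alpha for pi in p)) / (1 - alpha)

    def shannon_entropy(p):
        return -sum(pi * math.log(pi) for pi in p if pi > 0)

    p = [0.5, 0.25, 0.25]
    print(shannon_entropy(p))        # ~1.0397 nats
    print(renyi_entropy(p, 1.0001))  # approaches the Shannon value as alpha -> 1
    print(renyi_entropy(p, 2.0))     # collision entropy, -log(sum p_i**2)
    ```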

    A New Two-Parametric ‘Useful’ Fuzzy Information Measure and its Properties

    Get PDF
    A ‘useful’ fuzzy information measure of order α and type β is developed, and its validity is established with a numerical example.
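
    The abstract does not reproduce the measure, but 'useful' information measures in this literature typically build on the Belis–Guiaşu utility-weighted entropy. The sketch below shows only that classical unparametrized form; it is an assumption for illustration, not the paper's order-α, type-β measure.

    ```python
    import math

    def useful_entropy(p, u):
        """Belis-Guiasu style 'useful' entropy (assumed classical form):
        H(P;U) = -sum(u_i * p_i * log p_i) / sum(u_i * p_i),
        i.e. Shannon's summand weighted by the utility of each outcome."""
        num = -sum(ui * pi * math.log(pi) for pi, ui in zip(p, u) if pi > 0)
        den = sum(ui * pi for pi, ui in zip(p, u))
        return num / den

    print(useful_entropy([0.5, 0.3, 0.2], [1, 1, 1]))    # equal utilities: plain Shannon entropy
    print(useful_entropy([0.5, 0.3, 0.2], [3, 1, 0.5]))  # weighted toward the most useful outcome
    ```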

    Some generalization of Fuzzy Entropy measure and its applications.

    Get PDF
    This thesis represents only a small selection of the different issues and topics I have been involved with since 2010, yet it shows one outcome of my engagements over the past few years. During this time I had the opportunity to read and witness the rise and fall of the important theories and results cited in this study, and my involvement in these topics became a part of my whole academic life. Fuzzy Set Theory has come a long way since it was formally introduced by L.A. Zadeh in his classic paper entitled ‘Fuzzy Sets’, published in the journal ‘Information and Control’ in 1965. Since that time the subject has been applied to every branch of knowledge, and mathematicians, scientists, social scientists, computer and management scientists, and engineers all over the world have investigated its theory and applications. Applications of fuzzy logic and fuzzy set theory in decision-making, pattern recognition, image processing, control systems, neural networks, genetic algorithms and many other areas have yielded significant results.
    Digital copy of thesis, University of Kashmir.

    Logical Information Theory: New Logical Foundations for Information Theory

    Get PDF
    There is a new theory of information based on logic. The definition of Shannon entropy as well as the notions of joint, conditional, and mutual entropy as defined by Shannon can all be derived by a uniform transformation from the corresponding formulas of logical information theory. Information is first defined in terms of sets of distinctions without using any probability measure. When a probability measure is introduced, the logical entropies are simply the values of the (product) probability measure on the sets of distinctions. The compound notions of joint, conditional, and mutual entropy are obtained as the values of the measure, respectively, on the union, difference, and intersection of the sets of distinctions. These compound notions of logical entropy satisfy the usual Venn diagram relationships (e.g., inclusion-exclusion formulas) since they are values of a measure (in the sense of measure theory). The uniform transformation into the formulas for Shannon entropy is linear, which explains the long-noted fact that the Shannon formulas satisfy the Venn diagram relations, as an analogy or mnemonic, since Shannon entropy is not a measure (in the sense of measure theory) on a given set.

    What is the logic that gives rise to logical information theory? Partitions are dual (in a category-theoretic sense) to subsets, and the logic of partitions was recently developed in a dual/parallel relationship to the Boolean logic of subsets (the latter being usually mis-specified as the special case of "propositional logic"). Boole developed logical probability theory as the normalized counting measure on subsets. Similarly, the normalized counting measure on partitions is logical entropy, when the partitions are represented as the set of distinctions that is the complement of the equivalence relation for the partition. In this manner, logical information theory provides the set-theoretic and measure-theoretic foundations for information theory. The Shannon theory is then derived by the transformation that replaces the counting of distinctions with the number of binary partitions (bits) it takes, on average, to make the same distinctions by uniquely encoding the distinct elements, which is why the Shannon theory dovetails so well into coding and communications theory.
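
    A small numerical illustration of the central quantity described above (a sketch, with illustrative names): for a probability distribution p, the logical entropy is h(p) = 1 − Σ p_i², the probability that two independent draws are distinct, while Shannon entropy counts the average number of binary distinctions (bits).

    ```python
    import math

    def logical_entropy(p):
        """Logical entropy h(p) = 1 - sum(p_i**2): the probability that two
        independent draws from p make a distinction (distinct outcomes)."""
        return 1 - sum(pi ** 2 for pi in p)

    def shannon_entropy(p):
        """Shannon entropy in bits: average binary distinctions per draw."""
        return -sum(pi * math.log2(pi) for pi in p if pi > 0)

    p = [0.5, 0.25, 0.25]
    print(logical_entropy(p))  # 0.625
    print(shannon_entropy(p))  # 1.5 bits

    # Two-draw interpretation, checked directly:
    distinct = sum(pi * pj for i, pi in enumerate(p)
                   for j, pj in enumerate(p) if i != j)
    print(distinct)            # 0.625 again
    ```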

    On the Efficiency of Classical and Quantum Secure Function Evaluation

    Full text link
    We provide bounds on the efficiency of secure one-sided-output two-party computation of arbitrary finite functions from trusted distributed randomness in the statistical case. From these results we derive bounds on the efficiency of protocols that use different variants of oblivious transfer (OT) as a black box. When applied to implementations of OT, these bounds generalize most known results to the statistical case. Our results hold in particular for transformations between a finite number of primitives and for any error. In the second part we study the efficiency of quantum protocols implementing OT. While most classical lower bounds for perfectly secure reductions of OT to distributed randomness still hold in the quantum setting, we present a statistically secure protocol that violates these bounds by an arbitrarily large factor. We then prove a weaker lower bound that does hold in the statistical quantum setting and implies that even quantum protocols cannot extend OT. Finally, we present two lower bounds for reductions of OT to commitments, and a protocol based on string commitments that is optimal with respect to both of these bounds.
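
    To make "OT from trusted distributed randomness" concrete, here is a minimal sketch of the well-known Beaver-style precomputed oblivious transfer, a standard construction rather than any protocol from the paper. For brevity the dealer, sender, and receiver are collapsed into one process; in an actual protocol they are separate parties exchanging e, f0, f1.

    ```python
    import secrets

    def trusted_dealer():
        """Distributed randomness: the sender gets bits (r0, r1); the
        receiver gets a random bit d together with r_d."""
        r0, r1, d = secrets.randbits(1), secrets.randbits(1), secrets.randbits(1)
        return (r0, r1), (d, r1 if d else r0)

    def ot(m0, m1, c):
        """1-out-of-2 OT of bits (m0, m1) with choice bit c, consuming one
        unit of precomputed randomness."""
        (r0, r1), (d, rd) = trusted_dealer()
        e = c ^ d                       # receiver masks its choice bit
        f0 = m0 ^ (r1 if e else r0)     # sender: f0 = m0 xor r_e
        f1 = m1 ^ (r0 if e else r1)     # sender: f1 = m1 xor r_(1-e)
        return (f1 if c else f0) ^ rd   # receiver unmasks the chosen message

    # Correctness over all inputs (randomness is fresh on every call):
    assert all(ot(m0, m1, c) == (m1 if c else m0)
               for m0 in (0, 1) for m1 in (0, 1) for c in (0, 1))
    print("OT from precomputed randomness: correct")
    ```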