
    Axiomatic relation between thermodynamic and information-theoretic entropies

    Thermodynamic entropy, as defined by Clausius, characterizes macroscopic observations of a system based on phenomenological quantities such as temperature and heat. In contrast, information-theoretic entropy, introduced by Shannon, is a measure of uncertainty. In this Letter, we connect these two notions of entropy, using an axiomatic framework for thermodynamics [Lieb, Yngvason, Proc. Roy. Soc. (2013)]. In particular, we obtain a direct relation between the Clausius entropy and the Shannon entropy, or its generalisation to quantum systems, the von Neumann entropy. More generally, we find that entropy measures relevant in non-equilibrium thermodynamics correspond to entropies used in one-shot information theory.
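    The two information-theoretic quantities the abstract relates can be illustrated concretely; a minimal sketch (function names are my own, assuming NumPy) computing the Shannon entropy of a probability vector and the von Neumann entropy of a density matrix via its eigenvalue spectrum:

```python
import numpy as np

def shannon_entropy(p):
    """Shannon entropy H(p) = -sum_i p_i log2 p_i of a probability vector, in bits."""
    p = np.asarray(p, dtype=float)
    p = p[p > 0]  # convention: 0 * log 0 = 0
    return float(-np.sum(p * np.log2(p)))

def von_neumann_entropy(rho):
    """Von Neumann entropy S(rho) = -Tr(rho log2 rho).

    For a Hermitian density matrix this reduces to the Shannon entropy
    of its eigenvalue spectrum, which is how the two notions connect.
    """
    eigenvalues = np.linalg.eigvalsh(rho)
    return shannon_entropy(eigenvalues)

# A maximally mixed qubit carries exactly one bit of entropy.
rho = np.eye(2) / 2
```

    The reduction of the von Neumann entropy to the Shannon entropy of the eigenvalues is the standard diagonalisation argument; the abstract's axiomatic connection to the Clausius entropy is, of course, beyond this sketch.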

    Enhancing Automated Test Selection in Probabilistic Networks

    In diagnostic decision-support systems, test selection amounts to selecting, in a sequential manner, the test that is expected to yield the largest decrease in the uncertainty about a patient’s diagnosis. For capturing this uncertainty, often an information measure is used. In this paper, we study the Shannon entropy, the Gini index, and the misclassification error for this purpose. We argue that the Gini index can be regarded as an approximation of the Shannon entropy and that the misclassification error can be looked upon as an approximation of the Gini index. We further argue that the differences between the first derivatives of the three functions can explain different test sequences in practice. Experimental results from using the measures with a real-life probabilistic network in oncology support our observations.
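    The three uncertainty measures the paper compares have simple closed forms over a probability vector of diagnoses; a minimal sketch (NumPy assumed, function names my own):

```python
import numpy as np

def shannon(p):
    """Shannon entropy -sum_i p_i log2 p_i, in bits."""
    p = np.asarray(p, dtype=float)
    p = p[p > 0]  # convention: 0 * log 0 = 0
    return float(-np.sum(p * np.log2(p)))

def gini(p):
    """Gini index 1 - sum_i p_i^2."""
    p = np.asarray(p, dtype=float)
    return float(1.0 - np.sum(p ** 2))

def misclassification(p):
    """Misclassification error 1 - max_i p_i."""
    return float(1.0 - np.max(np.asarray(p, dtype=float)))

# All three vanish when one diagnosis is certain and peak at the uniform
# distribution, which is why the coarser measures can act as
# approximations of the finer ones.
```

    Comparing the three on the same distribution (e.g. `[0.6, 0.4]`) shows they rank uncertainty consistently while differing in curvature, which is the property the paper's first-derivative argument exploits.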

    Analysis and Design: Assessing Actual and Desired Course Content

    A survey concerning the topics taught in the systems analysis and design course, how much time an instructor devoted to each topic, and the perceived importance of the topics was assembled from responses received from a posting to the ISWorld list and the Information Systems Education Conference list of past participants. Using a consensus or agreement measure based on the Shannon entropy, the results are tabulated and ranked in order of entropy. Not all topics present in the standard textbooks are viewed as equally important, and some topics, like the creation of data flow diagrams and data modeling, while viewed as definitely important by IS educators, have a modest amount of time devoted to them by those same educators. Most topics could be grouped based on the agreed importance given that topic by IS educators and evaluated by the entropy measure. No agreement could be reached with regard to object-oriented technology.
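    One simple way such an entropy-based agreement measure might be computed (this is an illustrative form of my own; the survey's exact definition may differ) is to normalise the Shannon entropy of the response distribution and subtract it from one, so that 1 means full consensus and 0 means responses spread uniformly:

```python
import math
from collections import Counter

def agreement(responses, n_categories):
    """Entropy-based agreement: 1 - H / H_max, where H is the Shannon
    entropy of the observed response distribution and H_max = log2(k)
    for k possible response categories.

    Illustrative form only; the paper's consensus measure may differ.
    """
    counts = Counter(responses)
    n = len(responses)
    h = -sum((c / n) * math.log2(c / n) for c in counts.values())
    h_max = math.log2(n_categories)
    return 1.0 - h / h_max if h_max > 0 else 1.0
```

    Ranking topics by this quantity reproduces the kind of ordering the abstract describes: unanimous ratings score 1, while evenly split ratings (as for object-oriented technology) score near 0.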

    Shannon Entropy in Stochastic Analysis of Some MEMS

    This work is focused on the numerical determination of Shannon probabilistic entropy for MEMS devices exhibiting some uncertainty in their structural response. This entropy is a universal measure of statistical or stochastic disorder in static deformation or dynamic vibrations of engineering systems and is available for both continuous and discrete distribution functions of structural parameters. An interval algorithm using Monte Carlo simulation and polynomial structural response recovery has been implemented to demonstrate uncertainty propagation of the forced vibrations in some small MEMS devices. A computational example includes stochastic nonlinear vibrations described by the Duffing equation calibrated for some micro-resonators, whose damping is modelled as an input uncertainty source with Gaussian, uniform, and triangular distributions.
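    The entropy-estimation step can be sketched independently of the MEMS model: given Monte Carlo samples of a structural response, a histogram plug-in estimate of the differential Shannon entropy can stand in for the paper's interval algorithm (a simplification; names and the estimator choice are mine):

```python
import numpy as np

def entropy_from_samples(samples, bins=50):
    """Histogram plug-in estimate of the differential Shannon entropy
    H = -integral p(x) ln p(x) dx from Monte Carlo samples (in nats)."""
    hist, edges = np.histogram(samples, bins=bins, density=True)
    widths = np.diff(edges)
    mask = hist > 0
    return float(-np.sum(hist[mask] * np.log(hist[mask]) * widths[mask]))

# Sanity check against a known case: a standard Gaussian response has
# exact differential entropy 0.5 * ln(2 * pi * e) ~ 1.4189 nats.
rng = np.random.default_rng(0)
samples = rng.normal(0.0, 1.0, 100_000)
```

    In the paper's setting, `samples` would come from solving the Duffing equation repeatedly with damping drawn from the Gaussian, uniform, or triangular input distribution; the entropy then quantifies how much disorder each uncertainty source propagates into the vibration response.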

    Recurrence entropy and financial crises

    Entropy is one of the most frequently and effectively used measures of the complexity of systems of various natures. While Shannon's canonical entropy is primarily a measure of the randomness of a system, the approximate, sample, permutation, and other recently introduced entropies that exploit the Shannon form allow the complexity of the systems in question to be quantified using fast and efficient algorithms. For the first time, a new type of recurrence entropy is used to analyze the dynamics of financial time series under crash conditions. It is shown that recurrence entropy can be used as an indicator-precursor of financial crashes.
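    For orientation, one common recurrence-based entropy is the Shannon entropy of the diagonal line-length distribution of a recurrence plot (the ENTR measure of recurrence quantification analysis). The abstract's "recurrence entropy" may well be defined differently; the sketch below (NumPy assumed, names mine) shows only this standard variant:

```python
import numpy as np
from collections import Counter

def recurrence_entropy(x, eps, l_min=2):
    """Shannon entropy (nats) of diagonal line lengths in the recurrence
    plot R[i, j] = 1 iff |x_i - x_j| < eps (RQA's ENTR measure).

    Long, uniform diagonals (regular dynamics) give low entropy; a broad
    mix of line lengths (complex dynamics) gives high entropy.
    """
    x = np.asarray(x, dtype=float)
    n = len(x)
    R = np.abs(x[:, None] - x[None, :]) < eps
    lengths = []
    for k in range(1, n):               # diagonals above the main one
        run = 0
        for recurrent in np.diagonal(R, offset=k):
            if recurrent:
                run += 1
            else:
                if run >= l_min:
                    lengths.append(run)
                run = 0
        if run >= l_min:
            lengths.append(run)
    if not lengths:
        return 0.0
    counts = Counter(lengths)
    total = sum(counts.values())
    return float(-sum((c / total) * np.log(c / total)
                      for c in counts.values()))
```

    Applied to a sliding window over a price series, a measure of this kind is what would serve as the indicator-precursor the abstract describes, with characteristic changes ahead of crash episodes.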

    Performance of Shannon-entropy compacted N-electron wave functions for configuration interaction methods

    The coefficients of full configuration interaction wave functions (FCI) for N-electron systems expanded in N-electron Slater determinants depend on the orthonormal one-particle basis chosen although the total energy remains invariant. Some bases result in more compact wave functions, i.e. result in fewer determinants with significant expansion coefficients. In this work, the Shannon entropy, as a measure of information content, is evaluated for such wave functions to examine whether there is a relationship between the FCI Shannon entropy of a given basis and the performance of that basis in truncated CI approaches. The results obtained for a set of randomly picked bases are compared to those obtained using the traditional canonical molecular orbitals, natural orbitals, seniority minimising orbitals and a basis that derives from direct minimisation of the Shannon entropy. FCI calculations for selected atomic and molecular systems clearly reflect the influence of the chosen basis. However, it is found that there is no direct relationship between the entropy computed for each basis and truncated CI energies.
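    The compactness measure involved here is the Shannon entropy over the determinant weights |c_I|^2 of a normalised CI expansion; a minimal sketch (NumPy assumed, the function name is mine):

```python
import numpy as np

def ci_shannon_entropy(coeffs):
    """Shannon entropy (nats) of the determinant weights |c_I|^2 of a
    CI expansion. A single dominant determinant gives entropy near 0
    (compact wave function); weights spread over many determinants give
    entropy up to ln(n_determinants)."""
    w = np.abs(np.asarray(coeffs, dtype=complex)) ** 2
    w = w / w.sum()          # enforce normalisation of the expansion
    w = w[w > 0]             # convention: 0 * log 0 = 0
    return float(-np.sum(w * np.log(w)))
```

    Since the FCI coefficients transform with the one-particle basis while the energy does not, this entropy varies from basis to basis, which is exactly the quantity the paper evaluates and then compares (without finding a direct relationship) against truncated CI energies.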
