
    Quantum Entropy

    Quantum physics, despite the intrinsically probabilistic nature of its observables, does not assign an entropy to them. We propose a quantum entropy that quantifies the randomness of a pure quantum state via a conjugate pair of observables forming the quantum phase space. The entropy is dimensionless; it is a relativistic scalar; it is invariant under coordinate transformations of position and momentum that preserve the conjugation property, and under CPT transformations; and its minimum is positive due to the uncertainty principle. We extend the entropy to mixed states and show that the proposed entropy is always larger than von Neumann's entropy. We conjecture an entropy law whereby the entropy of a closed system never decreases, implying an arrow of time for particle physics.
    Comment: arXiv admin note: text overlap with […]. A follow-up on this paper is arXiv:2106.15378; a related paper is arXiv:2111.11605. Early versions of the ideas in this paper are in arXiv:1906.11712 and arXiv:2103.0799.
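    The abstract does not spell out the construction, but one well-known phase-space entropy sharing several of these properties is the Wehrl entropy built from the Husimi distribution over coherent states; the sketch below is an illustrative example under that assumption, not necessarily the authors' definition.

        % Illustrative sketch only: a Wehrl-type phase-space entropy,
        % assuming Q_psi is the Husimi distribution over coherent states |x,p>.
        S[\psi] = -\int \frac{dx\,dp}{2\pi\hbar}\,
                  Q_\psi(x,p)\,\ln Q_\psi(x,p),
        \qquad
        Q_\psi(x,p) = \bigl|\langle x,p \mid \psi \rangle\bigr|^2 .

    With this normalization the measure dx dp/(2πħ) is dimensionless, and Lieb's theorem gives S[ψ] ≥ 1 with equality exactly for coherent states: a positive minimum enforced by the uncertainty principle, matching the property claimed above.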

    An Optimization of Thermodynamic Efficiency vs. Capacity for Communications Systems

    This work provides a fundamental view of the mechanisms that affect the power efficiency of communications processes, along with a method for efficiency enhancement. Shannon's work is the definitive source for analyzing the information capacity of a communications system, but his formulation does not predict an efficiency relationship suitable for calculating the power consumption of a system, particularly for practical signals which may only approach the capacity limit. This work leverages Shannon's results while providing additional insight through physical models which enable the calculation and improvement of efficiency for the encoding of signals. The proliferation of mobile communications platforms is straining network capacity, largely because of the ever-increasing data rate at each node. This places significant power-management demands on personal computing devices as well as cellular and WLAN terminals. The increased data throughput translates to a shorter mean time between battery-charging cycles and an increased thermal footprint. Solutions are developed herein to counter this trend. Hardware was constructed to measure the efficiency of a prototypical Gaussian signal prior to efficiency enhancement. After an optimization was performed, the efficiency of the encoding apparatus increased from 3.125% to greater than 86% for a manageable investment of resources. Several telecommunications-standards-based waveforms were also tested on the same hardware. The results reveal that the developed physical theories extrapolate accurately to an electronics application, predicting the efficiency of single-ended and differential encoding circuits before and after optimization.
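    For intuition on why a raw Gaussian signal starts near 3.125%: a linear class-A-style encoder has a peak efficiency of at most 1/(2·PAPR), and a Gaussian waveform clipped at a 12 dB peak-to-average power ratio then caps efficiency near 0.5/16 ≈ 3.1%. The sketch below illustrates that bound alongside the Shannon capacity formula; the class-A bound and the 12 dB clipping level are textbook assumptions for illustration, not parameters taken from this work.

        import math

        def shannon_capacity_bps(bandwidth_hz: float, snr_linear: float) -> float:
            """Shannon capacity C = B * log2(1 + SNR), in bits per second."""
            return bandwidth_hz * math.log2(1.0 + snr_linear)

        def class_a_efficiency(papr_linear: float) -> float:
            """Peak efficiency bound of an ideal class-A stage: 1 / (2 * PAPR)."""
            return 0.5 / papr_linear

        # A Gaussian signal clipped at a 12 dB PAPR (a common engineering choice).
        papr_db = 12.0
        papr = 10.0 ** (papr_db / 10.0)  # ~15.85x peak-to-average ratio
        print(f"class-A efficiency bound: {class_a_efficiency(papr):.3%}")  # ~3.2%

        # Capacity of a 20 MHz channel at 20 dB SNR, for scale.
        print(f"capacity: {shannon_capacity_bps(20e6, 100.0) / 1e6:.1f} Mbit/s")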

    Social economy as social science and practice : historical perspectives on France

    This article investigates the multiple meanings of "économie sociale" ("social economy"), a term which first appeared in France at the founding moment of modern capitalism, both as a concept in the project of creating a social science in close relation with the tradition of classical, Christian and socialist economists, and as a label for an ensemble of social practices and institutions. A historical perspective shows the close yet ambivalent relationship between these two principal connotations. Building on this, the conclusion presents some new research orientations towards social economy as a social science and a social practice.
    Keywords: social economy; social science; France

    Payment scale economies, competition, and pricing

    Payment scale economies affect banking costs, competition in payment services, and pricing. Our scale measure relates operating cost to physical measures of European banking "output" and finds large economies. This differs from the conventional approach of relating total cost to the value of balance-sheet assets. Interest expenses are excluded, since differences there are primarily due to mix, not scale. Also, since standard indicators of competition can give inconsistent results, a revenue-based frontier measure is developed and applied to European banks, with little difference evident across countries. Existing differences in bank prices (EC report) are associated with small differences in competition.
    JEL Classification: E41, C53. Keywords: bank competition; European banks; frontier analysis; payment scale economies
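    The abstract does not give the functional form of the scale measure; a standard way to estimate scale economies from operating cost and a physical output measure is a translog cost function, with scale economies defined as the inverse of the cost elasticity. The sketch below, using made-up data and a hypothetical single-output translog, illustrates that computation only; it is not the paper's specification.

        import numpy as np

        # Hypothetical data: physical payment volumes q and operating costs c.
        rng = np.random.default_rng(0)
        q = rng.lognormal(mean=10.0, sigma=1.0, size=200)          # transactions
        c = 50.0 * q ** 0.8 * rng.lognormal(sigma=0.1, size=200)   # true elasticity 0.8

        # Single-output translog cost function: ln c = a + b ln q + 0.5 g (ln q)^2.
        lq = np.log(q)
        X = np.column_stack([np.ones_like(lq), lq, 0.5 * lq ** 2])
        a, b, g = np.linalg.lstsq(X, np.log(c), rcond=None)[0]

        # Cost elasticity at the sample mean; scale economies = 1 / elasticity.
        elasticity = b + g * lq.mean()
        print(f"scale economies: {1.0 / elasticity:.2f}")  # > 1 means economies of scale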

    Planning and managing water resources at the river-basin level: emergence and evolution of a concept

    Keywords: River basin development; Legislation; Environmental effects; Water resource management; Watersheds

    The Place of Fantasy in a Critical Political Economy: The Case of Market Boundaries


    Haar Wavelets-Based Methods for Credit Risk Portfolio Modeling

    In this dissertation we investigate the measurement of the credit risk of a portfolio by means of wavelet theory. Under the Basel Accords, banks became subject to regulatory capital requirements and to the supervisory review of capital adequacy, that is, economic capital. Concentration risk in credit portfolios arises from an unequal distribution of loans to single borrowers (name concentration) or to different industry or regional sectors (sector concentration), and it may lead banks to face bankruptcy.

    The Merton model, the basis of the Basel II approach, is a Gaussian one-factor model in which default events are driven by a latent common factor assumed to follow a Gaussian distribution. Under this model, a loss occurs only when an obligor defaults within a fixed time horizon. Under certain homogeneity conditions, this one-factor model leads to a simple analytical asymptotic approximation of the loss distribution function and the VaR. The VaR at a high confidence level is the measure chosen in Basel II to calculate regulatory capital. This approximation, usually called the Asymptotic Single Risk Factor (ASRF) model, works well for a large number of small exposures but can underestimate risk in the presence of exposure concentrations; the ASRF model therefore does not provide an appropriate quantitative framework for the computation of economic capital. Monte Carlo simulation is the standard method for measuring credit portfolio risk when concentration risk matters, but it becomes very time consuming as the size of the portfolio increases, making the computation unworkable in many situations. In short, credit risk managers need to quantify concentration risk quickly and to compute the contributions of individual transactions to the total risk.

    Since the loss variable can take only a finite number of discrete values, its cumulative distribution function (CDF) is discontinuous, and Haar wavelets are particularly well suited to such step-shaped functions. We have therefore developed a new method for numerically inverting the Laplace transform of the density function, once the CDF has been approximated by a finite sum of Haar wavelet basis functions. Wavelets are used in mathematical analysis to denote a kind of orthonormal basis with remarkable approximation properties. The difference between the usual sine wave and a wavelet lies in localization: while the sine wave is localized in the frequency domain but not in the time domain, a wavelet is localized in both. Once the CDF has been computed, we can calculate the VaR at a high loss level. We also compute the Expected Shortfall (ES), since VaR is not a coherent risk measure: it is not sub-additive. We show that, in a wide variety of portfolios, these measures are computed quickly and accurately, with a relative error below 1% compared with Monte Carlo. We have also extended this methodology to the estimation of the risk contributions to the VaR and the ES, by taking partial derivatives with respect to the exposures, again obtaining high accuracy. Some technical improvements have also been implemented in the computation of the Gauss-Hermite integration formula used to obtain the coefficients of the approximation, making the method faster while preserving its accuracy.
    Finally, we have extended the wavelet approximation method to the multi-factor setting by means of Monte Carlo and quasi-Monte Carlo methods.
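    To make the Haar step concrete: a CDF on [0, 1] can be projected onto the Haar scaling functions at resolution 2^J, which reproduces a step function well once its jumps align with the dyadic grid. The sketch below fits an empirical loss CDF this way and reads off VaR and ES; it is a minimal illustration of the Haar projection under a toy loss distribution, not the dissertation's Laplace-transform inversion method.

        import numpy as np

        def haar_cdf_approx(losses: np.ndarray, J: int):
            """Project the empirical CDF of `losses` (values in [0,1]) onto Haar
            scaling functions at resolution 2**J. The coefficient of each scaling
            function is the cell average of the CDF, approximated here by the
            empirical CDF evaluated at the cell midpoint."""
            x = np.sort(losses)
            grid = (np.arange(2 ** J) + 0.5) / 2 ** J      # midpoints of dyadic cells
            heights = np.searchsorted(x, grid, side="right") / len(x)
            return grid, heights

        rng = np.random.default_rng(1)
        losses = rng.beta(2, 8, size=100_000)              # toy loss distribution on [0,1]

        grid, cdf = haar_cdf_approx(losses, J=8)           # 256-cell Haar approximation
        var99_haar = grid[np.argmax(cdf >= 0.99)]          # VaR read off the Haar steps
        var99 = np.quantile(losses, 0.99)                  # exact sample quantile
        es99 = losses[losses >= var99].mean()              # Expected Shortfall beyond VaR
        print(f"VaR(99%): haar={var99_haar:.4f} exact={var99:.4f}, ES(99%)={es99:.4f}")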