
    Quasilinear eigenvalues

    In this work, we review and extend some well-known results for the eigenvalues of the Dirichlet p-Laplace operator to a more general class of monotone quasilinear elliptic operators. As an application we obtain some homogenization results for nonlinear eigenvalues.
    Comment: 23 pages, Rev. UMA, to appear
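    For reference, the Dirichlet p-Laplace eigenvalue problem mentioned in the abstract reads, in its standard formulation (not quoted from the paper itself):

        \[ -\Delta_p u := -\operatorname{div}\!\big(|\nabla u|^{p-2}\nabla u\big) = \lambda\,|u|^{p-2}u \ \text{in } \Omega, \qquad u = 0 \ \text{on } \partial\Omega, \]

    where 1 < p < \infty; for p = 2 this reduces to the usual Dirichlet eigenvalue problem for the Laplacian.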

    Information and Entropy

    What is information? Is it physical? We argue that in a Bayesian theory the notion of information must be defined in terms of its effects on the beliefs of rational agents. Information is whatever constrains rational beliefs and therefore it is the force that induces us to change our minds. This problem of updating from a prior to a posterior probability distribution is tackled through an eliminative induction process that singles out the logarithmic relative entropy as the unique tool for inference. The resulting method of Maximum relative Entropy (ME), which is designed for updating from arbitrary priors given information in the form of arbitrary constraints, includes as special cases both MaxEnt (which allows arbitrary constraints) and Bayes' rule (which allows arbitrary priors). Thus, ME unifies the two themes of these workshops -- the Maximum Entropy and the Bayesian methods -- into a single general inference scheme that allows us to handle problems that lie beyond the reach of either of the two methods separately. I conclude with a couple of simple illustrative examples.
    Comment: Presented at MaxEnt 2007, the 27th International Workshop on Bayesian Inference and Maximum Entropy Methods (July 8-13, 2007, Saratoga Springs, New York, USA)
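    For orientation, the ME updating rule can be stated compactly; the notation below is standard in this literature rather than quoted from the abstract. To update a prior q(x), one maximizes the relative entropy

        \[ S[p, q] = -\int dx\, p(x)\, \log\frac{p(x)}{q(x)}, \]

    and for a single expectation constraint \langle f \rangle = F the maximizer takes the canonical form

        \[ p(x) = \frac{q(x)\, e^{\lambda f(x)}}{Z(\lambda)}, \qquad Z(\lambda) = \int dx\, q(x)\, e^{\lambda f(x)}, \]

    with the multiplier \lambda chosen so that the constraint is satisfied.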

    Neural networks can detect model-free static arbitrage strategies

    In this paper we demonstrate, both theoretically and numerically, that neural networks can detect model-free static arbitrage opportunities whenever the market admits some. Due to the use of neural networks, our method can be applied to financial markets with a high number of traded securities and ensures almost immediate execution of the corresponding trading strategies. To demonstrate its tractability, effectiveness, and robustness we provide examples using real financial data. From a technical point of view, we prove that a single neural network can approximately solve a class of convex semi-infinite programs, which is the key result needed to show that neural networks can detect model-free static arbitrage strategies whenever the financial market admits such opportunities.
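    The paper's construction trains neural networks to solve convex semi-infinite programs over a continuum of market scenarios. As a toy illustration only, and not the authors' method: if the scenario space is discretized to a finite grid, static-arbitrage detection reduces to a small linear program. The instruments and all numbers below are invented for the example.

        import numpy as np
        from scipy.optimize import linprog

        def detect_static_arbitrage(prices, payoffs):
            # prices: (n,) current prices; payoffs: (m, n) payoff of each of
            # the n instruments in each of m discretized scenarios.
            # Solve: max t  s.t.  payoffs @ w >= t,  prices @ w <= 0,  -1 <= w <= 1.
            # An optimum t > 0 certifies a model-free static arbitrage: a portfolio
            # with non-positive setup cost and strictly positive payoff on the grid.
            m, n = payoffs.shape
            c_obj = np.zeros(n + 1)                 # variables x = (w, t)
            c_obj[-1] = -1.0                        # linprog minimizes, so use -t
            A_ub = np.hstack([-payoffs, np.ones((m, 1))])    # -payoffs @ w + t <= 0
            b_ub = np.zeros(m)
            A_ub = np.vstack([A_ub, np.append(prices, 0.0)]) # prices @ w <= 0
            b_ub = np.append(b_ub, 0.0)
            bounds = [(-1.0, 1.0)] * n + [(None, 1.0)]  # cap t so the LP stays bounded
            res = linprog(c_obj, A_ub=A_ub, b_ub=b_ub, bounds=bounds)
            t = -res.fun
            return t > 1e-9, res.x[:n], t

        # Toy market: the 0.9-strike call is cheaper than the 1.0-strike call
        # whose payoff it dominates, so a static arbitrage exists.
        # Columns: bond, call K=1.0, call K=0.9.
        prices = np.array([1.0, 0.25, 0.20])
        payoffs = np.array([[1.0, 0.0, 0.0],    # terminal price S = 0.8
                            [1.0, 0.0, 0.1],    # S = 1.0
                            [1.0, 0.2, 0.3]])   # S = 1.2
        found, w, margin = detect_static_arbitrage(prices, payoffs)
        print(found, w.round(3), margin)

    The LP's certificate (t > 0) is a sufficient condition on the chosen grid; the point of the paper is precisely to handle the semi-infinite case, where the scenario constraints range over a continuum, which the toy above sidesteps.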

    From Information Geometry to Newtonian Dynamics

    Newtonian dynamics is derived from prior information codified into an appropriate statistical model. The basic assumption is that there is an irreducible uncertainty in the location of particles, so that the state of a particle is defined by a probability distribution. The corresponding configuration space is a statistical manifold whose geometry is defined by the information metric. The trajectory follows from a principle of inference, the method of Maximum Entropy. No additional "physical" postulates are needed: neither an equation of motion nor an action principle, neither the concepts of momentum and phase space, nor even the notion of time. The resulting entropic dynamics reproduces the Newtonian dynamics of any number of particles interacting among themselves and with external fields. Both the mass of the particles and their interactions are explained as a consequence of the underlying statistical manifold.
    Comment: Presented at MaxEnt 2007, the 27th International Workshop on Bayesian Inference and Maximum Entropy Methods (July 8-13, 2007, Saratoga Springs, New York, USA)
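    The information metric referred to here is the Fisher-Rao metric; as standard background (not quoted from the paper), on a statistical manifold with coordinates \theta it reads

        \[ g_{ij}(\theta) = \int dx\, p(x|\theta)\, \frac{\partial \log p(x|\theta)}{\partial \theta^i}\, \frac{\partial \log p(x|\theta)}{\partial \theta^j}. \]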

    Updating Probabilities with Data and Moments

    We use the method of Maximum (relative) Entropy to process information in the form of observed data and moment constraints. The generic "canonical" form of the posterior distribution for the problem of simultaneous updating with data and moments is obtained. We discuss the general problem of non-commuting constraints: when they should be processed sequentially and when simultaneously. As an illustration, the multinomial example of die tosses is solved in detail for two superficially similar but actually very different problems.
    Comment: Presented at the 27th International Workshop on Bayesian Inference and Maximum Entropy Methods in Science and Engineering, Saratoga Springs, NY, July 8-13, 2007. 10 pages, 1 figure. V2 fixes a small typo at the end of the appendix: a_j = m_j + 1 is now a_j = m_(k-j) + 1.
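    Schematically, and in notation of my own choosing rather than the paper's, the canonical form for simultaneously updating with observed data x' and a moment constraint \langle f \rangle = F is

        \[ P_{\text{new}}(\theta) \propto P_{\text{old}}(\theta)\, P(x'|\theta)\, e^{\beta f(\theta)}, \]

    where \beta is a Lagrange multiplier fixed by the moment constraint; dropping the exponential factor recovers Bayes' rule, while dropping the likelihood factor recovers a MaxEnt-style update.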