5,379 research outputs found

    Orthogonal testing families and holomorphic extension from the sphere to the ball

    Let $\mathbb{B}^2$ denote the open unit ball in $\mathbb{C}^2$, and let $p \in \mathbb{C}^2 \setminus \overline{\mathbb{B}^2}$. We prove that if $f$ is an analytic function on the sphere $\partial\mathbb{B}^2$ that extends holomorphically in each variable separately and along each complex line through $p$, then $f$ is the trace of a holomorphic function in the ball. Comment: 9 pages, 2 figures. Final version to appear in Math.
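
    The following is our own schematic LaTeX transcription of the statement; the precise regularity hypotheses are as in the paper, and the notation $\ell$ for a complex line through $p$ is ours:

```latex
% Schematic restatement of the abstract's theorem (our transcription).
Let $\mathbb{B}^2$ be the open unit ball in $\mathbb{C}^2$ and let
$p \in \mathbb{C}^2 \setminus \overline{\mathbb{B}^2}$. If
$f \colon \partial\mathbb{B}^2 \to \mathbb{C}$ is analytic,
extends holomorphically in each variable separately, and, for every
complex line $\ell$ through $p$, the restriction
$f|_{\ell \cap \partial\mathbb{B}^2}$ extends holomorphically to
$\ell \cap \mathbb{B}^2$, then $f = F|_{\partial\mathbb{B}^2}$
for some $F$ holomorphic in $\mathbb{B}^2$.
```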

    Fully Adaptive Gaussian Mixture Metropolis-Hastings Algorithm

    Markov Chain Monte Carlo methods are widely used in signal processing and communications for statistical inference and stochastic optimization. In this work, we introduce an efficient adaptive Metropolis-Hastings algorithm to draw samples from generic multi-modal and multi-dimensional target distributions. The proposal density is a mixture of Gaussian densities whose parameters (weights, mean vectors and covariance matrices) are all updated from the previously generated samples via simple recursive rules. Numerical results for the one- and two-dimensional cases are provided.
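
    The abstract specifies the ingredients but not the exact recursions. The sketch below is our own simplified one-dimensional illustration of the idea (an adaptive independent MH sampler whose Gaussian-mixture proposal is updated by assigning each state to its most responsible component and refreshing that component's weight, mean and variance online); it is not the authors' algorithm, and all targets and constants are our choices.

```python
import numpy as np

rng = np.random.default_rng(0)

def log_target(x):
    # Illustrative bimodal target: equal mixture of N(-3, 1) and N(3, 1).
    return np.logaddexp(-0.5 * (x + 3.0) ** 2, -0.5 * (x - 3.0) ** 2)

# Gaussian-mixture proposal: weights, means, variances, per-component counts.
K = 2
w = np.full(K, 1.0 / K)
mu = np.array([-1.0, 1.0])
var = np.ones(K)
counts = np.ones(K)

def log_proposal(x):
    comp = -0.5 * (x - mu) ** 2 / var - 0.5 * np.log(2 * np.pi * var)
    return np.logaddexp.reduce(np.log(w) + comp)

def sample_proposal():
    k = rng.choice(K, p=w)
    return mu[k] + np.sqrt(var[k]) * rng.standard_normal()

x, chain = 0.0, []
for t in range(5000):
    y = sample_proposal()
    # Independent-MH acceptance ratio (proposal does not depend on x).
    log_a = (log_target(y) - log_target(x)) + (log_proposal(x) - log_proposal(y))
    if np.log(rng.random()) < log_a:
        x = y
    chain.append(x)
    # Simple recursive update: pick the most responsible component and
    # move its mean/variance towards the new sample with a vanishing step,
    # so the adaptation diminishes over time.
    k = int(np.argmax(np.log(w) - 0.5 * (x - mu) ** 2 / var - 0.5 * np.log(var)))
    counts[k] += 1
    eta = 1.0 / counts[k]
    mu[k] += eta * (x - mu[k])
    var[k] = max((1 - eta) * var[k] + eta * (x - mu[k]) ** 2, 1e-3)
    w = counts / counts.sum()

print(np.mean(chain[1000:]))  # roughly 0 for the symmetric bimodal target
```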

    Two adaptive rejection sampling schemes for probability density functions with log-convex tails

    Monte Carlo methods are often necessary for the implementation of optimal Bayesian estimators. A fundamental technique that can be used to generate samples from virtually any target probability distribution is the so-called rejection sampling method, which generates candidate samples from a proposal distribution and then accepts or rejects them by testing the ratio of the target and proposal densities. The class of adaptive rejection sampling (ARS) algorithms is particularly interesting because it can achieve high acceptance rates. However, the standard ARS method can only be used with log-concave target densities. For this reason, many generalizations have been proposed. In this work, we investigate two different adaptive schemes that can be used to draw exactly from a large family of univariate probability density functions (pdfs), not necessarily log-concave, possibly multimodal and with tails of arbitrary concavity. These techniques are adaptive in the sense that every time a candidate sample is rejected, the acceptance rate is improved. The two proposed algorithms work properly even when the target pdf is multimodal, when its first and second derivatives are analytically intractable, and when the tails are log-convex on an infinite domain. Therefore, they can be applied in a number of scenarios in which the other generalizations of the standard ARS fail. Two illustrative numerical examples are shown.
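
    As a concrete (and deliberately crude) illustration of the adaptive principle, here is a self-contained sketch of our own: a rejection sampler on a bounded interval whose piecewise-constant envelope comes from a Lipschitz bound, and which adds every rejected candidate as a new support point so the envelope tightens. The papers' schemes are more sophisticated, in particular handling unbounded supports with log-convex tails; the constant `L` and the target below are our choices.

```python
import bisect
import numpy as np

rng = np.random.default_rng(1)

def target(x):
    # Unnormalized, bimodal, not log-concave.
    return np.exp(-0.5 * (x - 2.0) ** 2) + 0.7 * np.exp(-0.5 * (x + 2.0) ** 2)

L = 1.5               # a valid (crude) Lipschitz constant for `target`
knots = [-6.0, 6.0]   # support points partitioning the domain

def envelope():
    ks = np.array(knots)
    a, b = ks[:-1], ks[1:]
    # If f is L-Lipschitz, then sup over [a, b] <= max(f(a), f(b)) + L*(b-a)/2,
    # so these heights give a valid piecewise-constant upper envelope.
    h = np.maximum(target(a), target(b)) + 0.5 * L * (b - a)
    return a, b, h

def draw():
    while True:
        a, b, h = envelope()
        areas = h * (b - a)
        i = rng.choice(len(a), p=areas / areas.sum())
        x = rng.uniform(a[i], b[i])
        if rng.random() * h[i] <= target(x):
            return x
        # Rejected: insert x as a support point; the envelope only tightens,
        # so the acceptance rate improves with every rejection.
        bisect.insort(knots, x)

samples = np.array([draw() for _ in range(2000)])
print(samples.mean(), len(knots))
```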

    On the Ekeland-Hofer symplectic capacities of the real bidisc

    In $\mathbb{C}^2$ with the standard symplectic structure we consider the bidisc $D^2 \times D^2$ constructed as the product of two open real discs of radius $1$. We compute explicit values for the first, second and third Ekeland-Hofer symplectic capacities of $D^2 \times D^2$. We discuss some applications to questions of symplectic rigidity. Comment: v3: Final version, to appear in "Pacific J. Math.", 20 pages
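
    For context only (classical background, not the new values computed in the paper): the Ekeland-Hofer capacities of the round ball and of the symplectic cylinder of radius $1$ in $(\mathbb{R}^{2n}, \omega_{\mathrm{std}})$ are

```latex
c_k\bigl(B^{2n}(1)\bigr) = \left\lceil \tfrac{k}{n} \right\rceil \pi,
\qquad
c_k\bigl(B^{2}(1) \times \mathbb{R}^{2n-2}\bigr) = k\pi,
\qquad k \ge 1,
```

    and computations such as the paper's are read against these normalizations.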

    Improved Adaptive Rejection Metropolis Sampling Algorithms

    Markov Chain Monte Carlo (MCMC) methods, such as the Metropolis-Hastings (MH) algorithm, are widely used for Bayesian inference. One of the most important issues for any MCMC method is the convergence of the Markov chain, which depends crucially on a suitable choice of the proposal density. Adaptive Rejection Metropolis Sampling (ARMS) is a well-known MH scheme that generates samples from one-dimensional target densities using adaptive piecewise proposals constructed from support points taken from rejected samples. In this work we pinpoint a crucial drawback of the adaptive procedure in ARMS: support points might never be added inside regions where the proposal is below the target. When this happens in many regions, ARMS performs poorly and the proposal never converges to the target. To overcome this limitation we propose two improved adaptive schemes for constructing the proposal. The first is a direct modification of the ARMS procedure that incorporates support points inside regions where the proposal is below the target, while satisfying the diminishing adaptation property, one of the conditions required to ensure the convergence of the Markov chain. The second is an adaptive independent MH algorithm that learns from all previous samples except the current state of the chain, which also guarantees convergence to the invariant density. These two new schemes improve on the adaptive strategy of ARMS while simplifying the construction of the proposals. Numerical results show that the new techniques outperform the standard ARMS. Comment: Matlab code provided at http://a2rms.sourceforge.net
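
    As a minimal illustration of the second scheme's key idea (adaptation that excludes the current state), here is a sketch of our own: an adaptive independent MH sampler whose single-Gaussian proposal, rather than the paper's piecewise construction, is periodically refitted from the chain history minus the current state. All targets and constants below are our choices.

```python
import numpy as np

rng = np.random.default_rng(2)

def log_target(x):
    # Illustrative bimodal target: equal mixture of N(-4, 1) and N(4, 1).
    return np.logaddexp(-0.5 * (x + 4.0) ** 2, -0.5 * (x - 4.0) ** 2)

def log_gauss(z, m, v):
    return -0.5 * (z - m) ** 2 / v - 0.5 * np.log(2.0 * np.pi * v)

mean, var = 0.0, 25.0   # deliberately wide initial proposal
x, chain = 0.0, [0.0]
for t in range(10000):
    y = mean + np.sqrt(var) * rng.standard_normal()
    # Independent-MH acceptance ratio.
    log_a = (log_target(y) - log_target(x)) \
          + (log_gauss(x, mean, var) - log_gauss(y, mean, var))
    if np.log(rng.random()) < log_a:
        x = y
    chain.append(x)
    # Refit the proposal from all past states *except the current one*,
    # mimicking the convergence-preserving adaptation described above.
    if t % 100 == 0 and len(chain) > 50:
        past = np.asarray(chain[:-1])
        mean, var = past.mean(), max(past.var(), 1e-2)

print(np.mean(chain[2000:]))  # roughly 0 for the symmetric target
```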

    An extension theorem for regular functions of two quaternionic variables

    For functions of two quaternionic variables that are regular in the sense of Fueter, we establish a result similar in spirit to the Hanges and Trèves theorem. Namely, we show that a ball contained in the boundary of a domain is a propagator of regular extendability across the boundary. Comment: v3: Final version, to appear in "Journal of Mathematical Analysis and Applications", 10 pages, 1 figure

    Rethinking the Effective Sample Size

    The effective sample size (ESS) is widely used in sample-based simulation methods for assessing the quality of a Monte Carlo approximation of a given distribution and of related integrals. In this paper, we revisit and complete the approximation of the ESS in the specific context of importance sampling (IS). The derivation of this approximation, which we denote $\widehat{\text{ESS}}$, is only partially available in Kong [1992]. This approximation has been widely used over the last 25 years, due to its simplicity, as a practical rule of thumb in a wide variety of importance sampling methods. However, we show that the multiple assumptions and approximations in the derivation of $\widehat{\text{ESS}}$ make it difficult to consider it even a reasonable approximation of the ESS. We extend the discussion of the ESS to the multiple importance sampling (MIS) setting, and we display numerical examples. This paper does not cover the use of the ESS for MCMC algorithms.
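
    The rule of thumb under discussion is the familiar $\widehat{\text{ESS}} = 1 / \sum_{n=1}^{N} \bar{w}_n^2$, where $\bar{w}_n$ are the normalized importance weights. A numerically stable implementation, with a Gaussian target/proposal pair of our own choosing for illustration:

```python
import numpy as np

rng = np.random.default_rng(3)

def ess_hat(log_w):
    # Kong's rule of thumb: 1 / sum of squared *normalized* weights,
    # computed in log space for numerical stability.
    log_w = log_w - np.max(log_w)
    w = np.exp(log_w)
    w /= w.sum()
    return 1.0 / np.sum(w ** 2)

# Importance sampling: target N(0, 1), proposal N(1, 2^2).
N = 10000
x = 1.0 + 2.0 * rng.standard_normal(N)
log_w = -0.5 * x ** 2 - (-0.5 * ((x - 1.0) / 2.0) ** 2 - np.log(2.0))
print(ess_hat(log_w))  # noticeably below N: weight imbalance lowers ESS_hat
```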