
    Subelliptic Li-Yau estimates on three dimensional model spaces

    We describe three elementary models in three-dimensional subelliptic geometry which correspond to the three model spaces of Riemannian geometry (spheres, Euclidean spaces and hyperbolic spaces): respectively the SU(2), Heisenberg and SL(2) groups. On these models, we prove parabolic Li-Yau inequalities for positive solutions of the heat equation, adapting the $\Gamma_2$ techniques to these elementary model spaces. The important feature developed here is that although the usual notion of Ricci curvature is meaningless (or, more precisely, leads to bounds of the form $-\infty$ for the Ricci curvature), we describe a parameter $\rho$ which plays the same role as a lower bound on the Ricci curvature, and from which one deduces the same kind of results as in Riemannian geometry, such as heat kernel upper bounds, Sobolev inequalities and diameter estimates.
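
    For orientation, the classical Riemannian statement that these subelliptic estimates parallel is the Li-Yau inequality: on an $n$-dimensional manifold with nonnegative Ricci curvature, every positive solution $u$ of the heat equation $\partial_t u = \Delta u$ satisfies
    \[
    |\nabla \log u|^2 - \partial_t \log u \;\le\; \frac{n}{2t}, \qquad t > 0,
    \]
    which integrates to Harnack inequalities and heat kernel upper bounds. This is quoted only as the standard baseline; in the paper the Ricci lower bound is replaced by the parameter $\rho$ adapted to the subelliptic setting.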

    Testing surface area with arbitrary accuracy

    Recently, Kothari et al. gave an algorithm for testing the surface area of an arbitrary set $A \subset [0,1]^n$. Specifically, they gave a randomized algorithm such that if $A$'s surface area is less than $S$ then the algorithm will accept with high probability, and if the algorithm accepts with high probability then there is some perturbation of $A$ with surface area at most $\kappa_n S$. Here, $\kappa_n$ is a dimension-dependent constant which is strictly larger than 1 if $n \ge 2$, and grows to $4/\pi$ as $n \to \infty$. We give an improved analysis of Kothari et al.'s algorithm. In doing so, we replace the constant $\kappa_n$ with $1 + \eta$ for $\eta > 0$ arbitrary. We also extend the algorithm to more general measures on Riemannian manifolds. Comment: 5 pages
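
    A heuristic behind testers of this kind is a Buffon's-needle-type crossing statistic: if $A \subset [0,1]^n$ has a reasonably smooth boundary of surface area $S$, if $x$ is uniform in the cube and $v$ is a uniform unit direction, then for a short step length $\delta$,
    \[
    \Pr\big[\mathbf{1}_A(x) \neq \mathbf{1}_A(x + \delta v)\big] \;\approx\; c_n\, \delta\, S \qquad (\delta \to 0),
    \]
    with $c_n$ a dimension-dependent constant. This is only a schematic of the quantity such algorithms estimate; the precise sampling scheme, the role of perturbations of $A$, and the loss captured by $\kappa_n$ (respectively $1+\eta$) are as in the paper.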

    Log-Harnack Inequality for Stochastic Differential Equations in Hilbert Spaces and its Consequences

    A logarithmic-type Harnack inequality is established for the semigroup of solutions to a stochastic differential equation in Hilbert spaces with non-additive noise. As applications, the strong Feller property as well as the entropy-cost inequality for the semigroup are derived with respect to the corresponding distance (cost function).
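
    To fix the shape of the statement, a log-Harnack inequality for a Markov semigroup $(P_t)$ typically reads: for measurable $f \ge 1$,
    \[
    P_t(\log f)(x) \;\le\; \log P_t f(y) + c(t)\,\rho(x,y)^2, \qquad t > 0,
    \]
    where $\rho$ is the relevant distance (cost function) and $c(t)$ is typically of order $1/t$ for small $t$. This generic form is quoted only for orientation; the constants and the distance in the Hilbert-space, non-additive-noise setting are those established in the paper. Such an inequality yields the strong Feller property and, applied to densities, an entropy-cost estimate between the transition kernels.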

    Dimension dependent hypercontractivity for Gaussian kernels

    We derive sharp, local and dimension-dependent hypercontractive bounds on the Markov kernel of a large class of diffusion semigroups. Unlike the dimension-free ones, they capture refined properties of Markov kernels, such as trace estimates. They imply classical bounds on the Ornstein-Uhlenbeck semigroup and a dimensional and refined (transportation) Talagrand inequality when applied to the Hamilton-Jacobi equation. Hypercontractive bounds on the Ornstein-Uhlenbeck semigroup driven by a non-diffusive Lévy semigroup are also investigated. Curvature-dimension criteria are the main tool in the analysis. Comment: 24 pages
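
    The dimension-free baseline that these bounds refine is Nelson's hypercontractivity theorem for the Ornstein-Uhlenbeck semigroup $(P_t)$ with respect to the Gaussian measure $\gamma$:
    \[
    \|P_t f\|_{L^q(\gamma)} \;\le\; \|f\|_{L^p(\gamma)} \qquad \text{whenever } 1 < p \le q \text{ and } e^{2t} \ge \frac{q-1}{p-1}.
    \]
    The dimension-dependent estimates of the paper sharpen this type of bound locally, which is what makes consequences such as trace estimates accessible; the precise statements rely on the curvature-dimension criteria mentioned in the abstract.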

    String effects and the distribution of the glue in mesons at finite temperature

    The distribution of the gluon action density in mesonic systems is investigated at finite temperature. The simulations are performed in quenched QCD for two temperatures below the deconfinement phase. Unlike the gluonic profiles displayed at T=0, the action-density iso-surfaces display a prolate-spheroid-like shape. The curved width profile of the flux tube is found to be consistent with the prediction of the free bosonic string model at large distances. Comment: 14 pages, 10 figures
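
    For context, the free bosonic string picture invoked here predicts, at zero temperature, a logarithmic broadening of the flux-tube width with the quark-antiquark separation $R$,
    \[
    w^2(R) \;\propto\; \frac{1}{\sigma}\,\ln\frac{R}{R_0},
    \]
    with $\sigma$ the string tension, $R_0$ a short-distance scale, and a dimension-dependent prefactor. This is quoted only as the classic baseline; the comparison carried out in the paper is against the corresponding finite-temperature free-string prediction for the width profile.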

    Sharp estimates on the first eigenvalue of the p-Laplacian with negative Ricci lower bound

    We complete the picture of sharp eigenvalue estimates for the p-Laplacian on a compact manifold by providing sharp estimates on the first nonzero eigenvalue of the nonlinear operator $\Delta_p$ when the Ricci curvature is bounded from below by a negative constant. We assume that the boundary of the manifold is convex, and put Neumann boundary conditions on it. The proof is based on a refined gradient comparison technique and a careful analysis of the underlying model spaces. Comment: Sign mistake fixed in the proof of the gradient comparison theorem (Theorem 5.1, page 10), and some minor improvements.
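
    For readers outside the area, the operator and eigenvalue problem in question are the standard ones: on a compact manifold $M$ with convex boundary,
    \[
    \Delta_p u := \operatorname{div}\!\big(|\nabla u|^{p-2}\nabla u\big), \qquad \Delta_p u + \lambda\,|u|^{p-2}u = 0 \ \text{in } M, \qquad \partial_\nu u = 0 \ \text{on } \partial M,
    \]
    and $\lambda_{1,p}$ denotes the first nonzero Neumann eigenvalue; for $p = 2$ this reduces to the usual Laplace-Beltrami eigenvalue problem. These definitions are recalled only to fix notation; the sharp lower bound under a negative Ricci lower bound is the content of the paper.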