
    Fast optimization algorithms and the cosmological constant

    Denef and Douglas have observed that in certain landscape models the problem of finding small values of the cosmological constant is a large instance of an NP-hard problem. The number of elementary operations (quantum gates) needed to solve this problem by brute force search exceeds the estimated computational capacity of the observable universe. Here we describe a way out of this puzzling circumstance: despite being NP-hard, the problem of finding a small cosmological constant can be attacked by more sophisticated algorithms whose performance vastly exceeds brute force search. In fact, in some parameter regimes the average-case complexity is polynomial. We demonstrate this by explicitly finding a cosmological constant of order 10^{-120} in a randomly generated 10^9-dimensional ADK landscape.
    Comment: 19 pages, 5 figures
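
    In toy form, the landscape search described above is a subset-sum-type problem: each of many fields contributes one of a few discrete energies, and one asks for a configuration whose total is extremely close to zero. The paper's algorithms are more sophisticated, but the following minimal Python sketch (the function names and the two-valued toy model are illustrative assumptions, not taken from the paper) shows the basic point that a structured algorithm such as meet-in-the-middle beats brute-force enumeration, scaling as roughly 2^(n/2) rather than 2^n.

        import bisect
        import random

        def min_abs_lambda_brute(contribs, lam0):
            """Brute force: enumerate all 2^n field configurations (small n only)."""
            best = float("inf")
            for mask in range(1 << len(contribs)):
                total = lam0
                for i, (lo, hi) in enumerate(contribs):
                    total += hi if (mask >> i) & 1 else lo
                best = min(best, abs(total))
            return best

        def min_abs_lambda_mitm(contribs, lam0):
            """Meet-in-the-middle: ~2^(n/2) work instead of 2^n."""
            half = len(contribs) // 2
            left, right = contribs[:half], contribs[half:]

            def partial_sums(part):
                # Enumerate every choice of lo/hi energy for the fields in `part`.
                sums = [0.0]
                for lo, hi in part:
                    sums = [s + lo for s in sums] + [s + hi for s in sums]
                return sums

            left_sums = partial_sums(left)
            right_sums = sorted(partial_sums(right))
            best = float("inf")
            for s in left_sums:
                # Find the right-half sum closest to cancelling lam0 + s.
                target = -(lam0 + s)
                j = bisect.bisect_left(right_sums, target)
                for k in (j - 1, j):
                    if 0 <= k < len(right_sums):
                        best = min(best, abs(lam0 + s + right_sums[k]))
            return best

        # Toy instance: 24 fields, each contributing one of two random energies.
        random.seed(1)
        contribs = [(random.uniform(-1, 0), random.uniform(0, 1)) for _ in range(24)]
        print(min_abs_lambda_mitm(contribs, lam0=0.5))

    Only the meet-in-the-middle version is run here; the brute-force enumerator is included for comparison and becomes infeasible long before the landscape sizes the abstract mentions, which is the kind of gap the paper exploits with far more sophisticated methods.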

    Prescribing Gauss curvature of surfaces in 3-dimensional spacetimes: application to the Minkowski problem in the Minkowski space

    We study the existence of surfaces with constant or prescribed Gauss curvature in certain Lorentzian spacetimes. We prove in particular that every (non-elementary) 3-dimensional maximal globally hyperbolic spatially compact spacetime with constant non-negative curvature is foliated by compact spacelike surfaces with constant Gauss curvature. In the constant negative curvature case, such a foliation exists outside the convex core. The existence of these foliations, together with a theorem of C. Gerhardt, yields several corollaries. For example, they allow one to solve the Minkowski problem in the 3-dimensional Minkowski space for data that are invariant under the action of a co-compact Fuchsian group.
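
    For orientation, one standard formulation (not specific to this paper) of the prescribed Gauss curvature condition for a spacelike graph in the 3-dimensional Minkowski space R^{2,1} is the Monge-Ampère equation

        x_3 = u(x_1, x_2), \qquad |Du| < 1, \qquad
        K \;=\; \frac{\det D^2 u}{\left(1 - |Du|^2\right)^{2}} .

    The Minkowski problem then asks for a convex spacelike surface whose Gauss curvature, viewed as a function of its Gauss map, equals a prescribed positive function; invariance of the data under a co-compact Fuchsian group roughly plays the role of the compactness assumption in the classical Euclidean version of the problem.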

    Simultaneous sparse approximation via greedy pursuit

    A simple sparse approximation problem requests an approximation of a given input signal as a linear combination of T elementary signals drawn from a large, linearly dependent collection. An important generalization is simultaneous sparse approximation. Now one must approximate several input signals at once using different linear combinations of the same T elementary signals. This formulation appears, for example, when analyzing multiple observations of a sparse signal that have been contaminated with noise. A new approach to this problem is presented here: a greedy pursuit algorithm called simultaneous orthogonal matching pursuit. The paper proves that the algorithm calculates simultaneous approximations whose error is within a constant factor of the optimal simultaneous approximation error. This result requires that the collection of elementary signals be weakly correlated, a property that is also known as incoherence. Numerical experiments demonstrate that the algorithm often succeeds, even when the inputs do not meet the hypotheses of the proof.
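
    As a rough illustration of the greedy pursuit described above, here is a minimal NumPy sketch of simultaneous orthogonal matching pursuit (the function name and the toy dictionary are illustrative, not the authors' code): at each iteration it selects the atom whose summed correlation with all current residuals is largest, then refits every signal on the selected atoms by least squares.

        import numpy as np

        def simultaneous_omp(Phi, Y, T):
            """Greedy simultaneous sparse approximation (sketch).

            Phi : (d, N) dictionary of N unit-norm elementary signals (columns).
            Y   : (d, K) matrix whose columns are the K input signals.
            T   : number of atoms to select.
            """
            residual = Y.copy()
            selected = []
            for _ in range(T):
                # Score each atom by its summed correlation with all residuals.
                scores = np.sum(np.abs(Phi.T @ residual), axis=1)
                scores[selected] = -np.inf          # do not reselect an atom
                selected.append(int(np.argmax(scores)))
                # Least-squares fit of all signals on the chosen atoms at once.
                sub = Phi[:, selected]
                coeffs, *_ = np.linalg.lstsq(sub, Y, rcond=None)
                residual = Y - sub @ coeffs
            return selected, coeffs

        # Toy usage: two noisy observations sharing the same 3-sparse support.
        rng = np.random.default_rng(0)
        Phi = rng.standard_normal((64, 256))
        Phi /= np.linalg.norm(Phi, axis=0)
        support = rng.choice(256, size=3, replace=False)
        X = np.zeros((256, 2))
        X[support] = rng.standard_normal((3, 2))
        Y = Phi @ X + 0.01 * rng.standard_normal((64, 2))
        idx, _ = simultaneous_omp(Phi, Y, T=3)
        print(sorted(idx), sorted(support.tolist()))

    The incoherence hypothesis in the abstract corresponds to the dictionary columns being only weakly correlated, which is what lets this greedy atom selection come provably close to the optimal simultaneous approximation.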
