    Complexity of the Sherrington-Kirkpatrick Model in the Annealed Approximation

    A careful critical analysis of the complexity of the Sherrington-Kirkpatrick model, at the annealed level, has been performed. The complexity functional is proved to be always invariant under the Becchi-Rouet-Stora-Tyutin supersymmetry, irrespective of the formulation used to define it. We consider two different saddle points of this functional, one satisfying the supersymmetry [A. Cavagna et al., J. Phys. A 36 (2003) 1175] and the other breaking it [A.J. Bray and M.A. Moore, J. Phys. C 13 (1980) L469]. We review the previous studies on the subject, linking different perspectives and pointing out some inadequacies and even inconsistencies in both solutions. Comment: 20 pages, 4 figures
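    The two notions of complexity at play here and in the quenched paper below can be stated compactly; this is a minimal sketch in the usual TAP-state-counting conventions (the symbols N(f) and the overbar notation are assumptions, not taken from the paper itself):

```latex
% N(f): number of TAP solutions at free-energy density f; the overbar denotes
% the average over the quenched couplings (notation assumed, not the paper's).
\Sigma_a(f) = \lim_{N\to\infty} \frac{1}{N} \ln \overline{\mathcal{N}(f)}
\quad\text{(annealed)},
\qquad
\Sigma_q(f) = \lim_{N\to\infty} \frac{1}{N} \overline{\ln \mathcal{N}(f)}
\quad\text{(quenched)}.
```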

    PURRS: Towards Computer Algebra Support for Fully Automatic Worst-Case Complexity Analysis

    Fully automatic worst-case complexity analysis has a number of applications in computer-assisted program manipulation. A classical and powerful approach to complexity analysis consists in formally deriving, from the program syntax, a set of constraints expressing bounds on the resources required by the program, which are then solved, possibly applying safe approximations. In several interesting cases, these constraints take the form of recurrence relations. While techniques for solving recurrences are known and implemented in several computer algebra systems, these do not completely fulfill the needs of fully automatic complexity analysis: they only deal with a somewhat restricted class of recurrence relations, sometimes require user intervention, or are restricted to the computation of exact solutions that are often so complex as to be unmanageable, and thus useless in practice. In this paper we briefly describe PURRS, a system and software library aimed at providing all the computer algebra services needed by applications performing or exploiting the results of worst-case complexity analyses. The capabilities of the system are illustrated by means of examples derived from the analysis of programs written in a domain-specific functional programming language for real-time embedded systems. Comment: 6 pages
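    A minimal sketch of the recurrence-solving service this describes, using SymPy's rsolve as a stand-in (this snippet does not use PURRS, and the example recurrence is hypothetical):

```python
# Solving a cost recurrence extracted from a recursive program, using
# SymPy's rsolve in place of PURRS (illustrative stand-in only).
from sympy import Function, rsolve
from sympy.abc import n

T = Function('T')

# Hypothetical cost recurrence: T(n+1) = 2*T(n) + 1 with T(0) = 0
# (the Tower-of-Hanoi cost pattern).
recurrence = T(n + 1) - 2 * T(n) - 1
closed_form = rsolve(recurrence, T(n), {T(0): 0})
print(closed_form)  # 2**n - 1: an exact worst-case bound

# Note: divide-and-conquer recurrences such as T(n) = 2*T(n/2) + n fall
# outside rsolve's class (it handles shifts n+1, not scalings n/2), which
# illustrates the "restricted class" limitation the abstract points out.
```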

    Quenched Computation of the Complexity of the Sherrington-Kirkpatrick Model

    The quenched computation of the complexity in the Sherrington-Kirkpatrick model is presented. A modified Full Replica Symmetry Breaking Ansatz is introduced in order to study the dependence of the complexity on the free energy. Such an Ansatz amounts to requiring Becchi-Rouet-Stora-Tyutin supersymmetry. The complexity computed this way is the Legendre transform of the free energy averaged over the quenched disorder. The stability analysis shows that this complexity is inconsistent at any free energy level but the equilibrium one. The further problem of building a physically well-defined solution that is not invariant under supersymmetry and predicts an extensive number of metastable states is also discussed. Comment: 19 pages, 13 figures. Some formulas added and corrected, changes in discussion and conclusion, one figure added
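    The Legendre structure mentioned in the abstract can be sketched in the standard "clone" notation; the symbols Φ and m follow Monasson-style conventions, which are an assumption here and not necessarily the paper's own:

```latex
% Phi(m, beta): free energy of m coupled clones of the system, averaged over
% the quenched disorder. The complexity Sigma(f) follows by Legendre transform
% (Monasson-style conventions, assumed):
\beta m \, \Phi(m,\beta) = \beta m f - \Sigma(f),
\qquad
f = \frac{\partial \, [\, m \, \Phi(m,\beta) \,]}{\partial m},
\qquad
\Sigma(f) = \beta m^{2} \, \frac{\partial \Phi(m,\beta)}{\partial m}.
```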

    Average case complexity of linear multivariate problems

    We study the average case complexity of a linear multivariate problem (LMP) defined on functions of d variables. We consider two classes of information: the first, Λ^std, consists of function values; the second, Λ^all, of all continuous linear functionals. Tractability of an LMP means that the average case complexity is O((1/ε)^p) with p independent of d. We prove that tractability of an LMP in Λ^std is equivalent to tractability in Λ^all, although the proof is not constructive. We provide a simple condition to check tractability in Λ^all. We also address the optimal design problem for an LMP by using a relation to the worst case setting. We find the order of the average case complexity and optimal sample points for multivariate function approximation. The theoretical results are illustrated for the folded Wiener sheet measure. Comment: 7 pages
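    Schematically, in standard information-based-complexity notation (comp^avg denotes the average-case complexity function; the notation is assumed, not quoted from the paper), the main tractability statement reads:

```latex
% Tractability: cost polynomial in 1/epsilon, uniformly in the dimension d,
% and the paper's equivalence between the two information classes.
\mathrm{comp}^{\mathrm{avg}}(\varepsilon, d) = O\!\big((1/\varepsilon)^{p}\big),
\quad p \text{ independent of } d;
\qquad
\text{tractability in } \Lambda^{\mathrm{std}}
\;\Longleftrightarrow\;
\text{tractability in } \Lambda^{\mathrm{all}}.
```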

    Query processing of spatial objects: Complexity versus Redundancy

    The management of complex spatial objects in applications such as geography and cartography imposes stringent new requirements on spatial database systems, in particular on efficient query processing. As shown in previous work, the performance of spatial query processing can be improved by decomposing complex spatial objects into simple components. Up to now, only decomposition techniques generating a linear number of very simple components, e.g. triangles or trapezoids, have been considered. In this paper, we investigate the natural trade-off between the complexity of the components and the redundancy, i.e. the number of components, with respect to its effect on efficient query processing. In particular, we present two new decomposition methods that achieve a better balance between the complexity and the number of components than previously known techniques. We compare these new decomposition methods to the traditional undecomposed representation as well as to the well-known decomposition into convex polygons with respect to their performance in spatial query processing. This comparison shows that, for a wide range of query selectivities, the new decomposition techniques clearly outperform both the undecomposed representation and the convex decomposition method. More important than the absolute performance gain of up to an order of magnitude is the robust performance of our new decomposition techniques over the whole range of query selectivities.
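    A toy illustration of the complexity-versus-redundancy trade-off at one extreme (the fan-triangulation choice and all names below are illustrative, not the paper's decomposition methods):

```python
# Fan triangulation: maximum redundancy (n - 2 components), minimum
# per-component complexity (3 vertices each). The paper's methods sit
# between this extreme and the undecomposed representation.
from typing import List, Tuple

Point = Tuple[float, float]

def fan_triangulate(polygon: List[Point]) -> List[List[Point]]:
    """Decompose a convex polygon into triangles fanned from vertex 0."""
    return [[polygon[0], polygon[i], polygon[i + 1]]
            for i in range(1, len(polygon) - 1)]

# A convex hexagon: 1 component with 6 vertices, or 4 triangles.
hexagon = [(0, 0), (2, 0), (3, 1), (2, 2), (0, 2), (-1, 1)]
triangles = fan_triangulate(hexagon)
print(len(triangles), "components,",
      {len(t) for t in triangles}, "vertices each")  # 4 components, {3}
```

    A coarser decomposition (e.g. into trapezoids or convex parts) trades fewer components for more vertices per component; tuning that balance is exactly the trade-off the paper studies.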