
    Inapproximability of the Standard Pebble Game and Hard to Pebble Graphs

    Pebble games are single-player games on DAGs in which pebbles are placed on and moved between nodes of the graph according to a fixed set of rules. The goal is to pebble a set of target nodes using a minimum number of pebbles. In this paper, we present a possibly simpler proof of the result in [CLNV15] and strengthen it to show that it is PSPACE-hard to determine the minimum number of pebbles up to an additive $n^{1/3-\epsilon}$ term for all $\epsilon > 0$, which improves upon the previously known additive-constant hardness of approximation [CLNV15] for the standard pebble game. We also introduce a family of explicit, constant-indegree graphs on $n$ nodes containing a graph for which pebbling with a constant number $k$ of pebbles requires $\Omega(n^k)$ moves in both the standard and black-white pebble games. This independently answers an open question summarized in [Nor15] of whether there exists a family of DAGs that meets the upper bound of $O(n^k)$ moves using constant $k$ pebbles, with a construction different from that presented in [AdRNV17]. Comment: Preliminary version in WADS 201
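
    The rules of the standard (black) pebble game can be made concrete with a short simulation. The sketch below assumes a DAG given as a predecessor map and a strategy given as a list of place/remove moves (the names and encoding are illustrative, not from the paper); it replays a strategy and reports the pebbles and moves it uses.

    ```python
    # Minimal sketch of the standard (black) pebble game described above.
    # Assumptions for illustration: the DAG is a dict mapping each node to its
    # list of predecessors, and a strategy is a list of ('place', v) or
    # ('remove', v) moves.

    def run_black_pebbling(preds, moves, targets):
        """Replay a strategy; return (max pebbles held at once, number of moves).

        Rules: a pebble may be placed on v only if every predecessor of v is
        currently pebbled (sources can always be pebbled); a pebble may be
        removed from any node at any time.
        """
        pebbled, max_pebbles = set(), 0
        for op, v in moves:
            if op == "place":
                if any(p not in pebbled for p in preds.get(v, [])):
                    raise ValueError(f"illegal placement on {v}")
                pebbled.add(v)
            elif op == "remove":
                pebbled.discard(v)
            max_pebbles = max(max_pebbles, len(pebbled))
        if not all(t in pebbled for t in targets):
            raise ValueError("not every target node is pebbled at the end")
        return max_pebbles, len(moves)

    # Toy usage: the path a -> b -> c is pebbled with 2 pebbles and 4 moves.
    preds = {"a": [], "b": ["a"], "c": ["b"]}
    moves = [("place", "a"), ("place", "b"), ("remove", "a"), ("place", "c")]
    print(run_black_pebbling(preds, moves, targets={"c"}))  # (2, 4)
    ```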

    On the Hardness of Red-Blue Pebble Games

    Red-blue pebble games model the computation cost of a two-level memory hierarchy. We present various hardness results in different red-blue pebbling variants, with a focus on the oneshot model. We first study the relationship between previously introduced red-blue pebble models (base, oneshot, nodel). We also analyze a new variant (compcost) to obtain a more realistic model of computation. We then prove that red-blue pebbling is NP-hard in all of these model variants. Furthermore, we show that in the oneshot model, a $\delta$-approximation algorithm for $\delta < 2$ is only possible if the Unique Games Conjecture is false. Finally, we show that greedy algorithms are not good candidates for approximation, since they can return significantly worse solutions than the optimum.
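
    As a concrete reference point, the base red-blue pebble game that these variants refine can be simulated directly. The sketch below is a minimal illustration under common Hong-Kung-style rules (red pebbles model a fast memory of capacity r, blue pebbles model slow memory, cost counts loads and stores); the move encoding and names are assumptions for illustration and do not reproduce the paper's oneshot, nodel, or compcost definitions.

    ```python
    # Illustrative simulator for the base red-blue pebble game that the
    # variants above build on (not the paper's exact formalization).
    # Assumptions: inputs start blue, fast memory holds at most r red pebbles,
    # and the cost is the number of load/store operations.

    def red_blue_io_cost(preds, inputs, outputs, moves, r):
        red, blue, io_cost = set(), set(inputs), 0
        for op, v in moves:
            if op == "compute":          # red pebble v if all predecessors are red
                assert all(p in red for p in preds.get(v, [])), f"cannot compute {v}"
                red.add(v)
            elif op == "load":           # blue -> red, costs one I/O
                assert v in blue, f"{v} is not in slow memory"
                red.add(v)
                io_cost += 1
            elif op == "store":          # red -> blue, costs one I/O
                assert v in red, f"{v} is not in fast memory"
                blue.add(v)
                io_cost += 1
            elif op == "delete_red":     # evict from fast memory, free of charge
                red.discard(v)
            assert len(red) <= r, "fast-memory capacity exceeded"
        assert all(v in blue for v in outputs), "outputs not written to slow memory"
        return io_cost

    # Toy usage: z depends on inputs x and y; with r = 3 the cost is 3 I/Os.
    preds = {"z": ["x", "y"]}
    moves = [("load", "x"), ("load", "y"), ("compute", "z"), ("store", "z")]
    print(red_blue_io_cost(preds, {"x", "y"}, {"z"}, moves, r=3))  # 3
    ```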

    Definable inapproximability: New challenges for duplicator

    We consider the hardness of approximation of optimization problems from the point of view of definability. For many $\textrm{NP}$-hard optimization problems it is known that, unless $\textrm{P} = \textrm{NP}$, no polynomial-time algorithm can give an approximate solution guaranteed to be within a fixed constant factor of the optimum. We show, in several such instances and without any complexity-theoretic assumption, that no algorithm that is expressible in fixed-point logic with counting (FPC) can compute an approximate solution. Since important algorithmic techniques for approximation algorithms (such as linear or semidefinite programming) are expressible in FPC, this yields lower bounds on what can be achieved by such methods. The results are established by showing lower bounds on the number of variables required in first-order logic with counting to separate instances with a high optimum from those with a low optimum for fixed-size instances.

    Approximating Cumulative Pebbling Cost Is Unique Games Hard

    The cumulative pebbling complexity of a directed acyclic graph $G$ is defined as $\mathsf{cc}(G) = \min_P \sum_i |P_i|$, where the minimum is taken over all legal (parallel) black pebblings of $G$ and $|P_i|$ denotes the number of pebbles on the graph during round $i$. Intuitively, $\mathsf{cc}(G)$ captures the amortized Space-Time complexity of pebbling $m$ copies of $G$ in parallel. The cumulative pebbling complexity of a graph $G$ is of particular interest in the field of cryptography, as $\mathsf{cc}(G)$ is tightly related to the amortized Area-Time complexity of the Data-Independent Memory-Hard Function (iMHF) $f_{G,H}$ [AS15] defined using a constant-indegree directed acyclic graph (DAG) $G$ and a random oracle $H(\cdot)$. A secure iMHF should have amortized Space-Time complexity as high as possible, e.g., to deter a brute-force password attacker who wants to find $x$ such that $f_{G,H}(x) = h$. Thus, to analyze the (in)security of a candidate iMHF $f_{G,H}$, it is crucial to estimate the value $\mathsf{cc}(G)$, but currently, upper and lower bounds for leading iMHF candidates differ by several orders of magnitude. Blocki and Zhou recently showed that it is $\mathsf{NP}$-hard to compute $\mathsf{cc}(G)$, but their techniques do not even rule out an efficient $(1+\varepsilon)$-approximation algorithm for any constant $\varepsilon > 0$. We show that for any constant $c > 0$, it is Unique Games hard to approximate $\mathsf{cc}(G)$ to within a factor of $c$. (See the paper for the full abstract.) Comment: 28 pages, updated figures and corrected typo
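
    The definition of $\mathsf{cc}(G)$ translates directly into code once a candidate pebbling is fixed: check legality round by round and sum the configuration sizes. The sketch below assumes a predecessor-map representation of the DAG and a pebbling given as a list of configurations (these conventions are illustrative, not from the paper); minimizing this quantity over all pebblings is, per this paper, hard to approximate.

    ```python
    # Sketch of the cumulative cost sum_i |P_i| of one legal parallel black
    # pebbling, following the definition of cc(G) above.  Assumptions for
    # illustration: the DAG is a predecessor map and the pebbling is a list of
    # configurations P_1, ..., P_t (sets of nodes), starting from empty.

    def cumulative_cost(preds, pebbling, sinks):
        prev = set()
        for P in pebbling:
            # new pebbles may only appear on nodes all of whose predecessors
            # carried pebbles in the previous round
            for v in set(P) - prev:
                assert all(p in prev for p in preds.get(v, [])), f"illegal pebble on {v}"
            prev = set(P)
        assert all(any(s in P for P in pebbling) for s in sinks), "a sink was never pebbled"
        return sum(len(P) for P in pebbling)

    # Toy usage on the path a -> b -> c: this pebbling has cumulative cost
    # 1 + 2 + 1 + 2 = 6; cc(G) is the minimum of this quantity over all pebblings.
    preds = {"a": [], "b": ["a"], "c": ["b"]}
    pebbling = [{"a"}, {"a", "b"}, {"b"}, {"b", "c"}]
    print(cumulative_cost(preds, pebbling, sinks={"c"}))  # 6
    ```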

    Hardness of Approximation in PSPACE and Separation Results for Pebble Games

    We consider the pebble game on DAGs with bounded fan-in introduced in [Paterson and Hewitt '70] and the reversible version of this game in [Bennett '89], and study the question of how hard it is to decide exactly or approximately the number of pebbles needed for a given DAG in these games. We prove that the problem of deciding whether $s$ pebbles suffice to reversibly pebble a DAG $G$ is PSPACE-complete, as was previously shown for the standard pebble game in [Gilbert, Lengauer and Tarjan '80]. Via two different graph product constructions we then strengthen these results to establish that both standard and reversible pebbling space are PSPACE-hard to approximate to within any additive constant. To the best of our knowledge, these are the first hardness of approximation results for pebble games in an unrestricted setting (even for polynomial time). Also, since [Chan '13] proved that reversible pebbling is equivalent to the games in [Dymond and Tompa '85] and [Raz and McKenzie '99], our results apply to the Dymond--Tompa and Raz--McKenzie games as well, and from the same paper it follows that resolution depth is PSPACE-hard to determine up to any additive constant. We also obtain a multiplicative logarithmic separation between reversible and standard pebbling space. This improves on the additive logarithmic separation previously known and could plausibly be tight, although we are not able to prove this. We leave as an interesting open problem whether our additive hardness of approximation result could be strengthened to a multiplicative bound if the computational resources are decreased from polynomial space to the more common setting of polynomial time.
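
    Reversible pebbling differs from the standard game in that removals are also constrained: a pebble may be placed on or removed from a node only while all of its predecessors are pebbled. The sketch below, assuming the same predecessor-map encoding as before and a strategy given as a list of toggled nodes (an illustrative encoding, not the paper's), replays a reversible strategy and reports the pebbling space it uses.

    ```python
    # Sketch of Bennett's reversible pebble game discussed above.  Assumptions
    # for illustration: the DAG is a predecessor map and a strategy is a list
    # of nodes, each entry toggling the pebble on that node.  A pebble may be
    # placed on or removed from v only while all predecessors of v are pebbled.

    def reversible_pebbling_space(preds, toggles, target):
        pebbled, space = set(), 0
        for v in toggles:
            assert all(p in pebbled for p in preds.get(v, [])), f"illegal toggle on {v}"
            pebbled.symmetric_difference_update({v})
            space = max(space, len(pebbled))
        assert target in pebbled, "target is not pebbled at the end"
        return space

    # Toy usage on the path a -> b -> c: this particular strategy keeps all
    # three pebbles on the graph, so it uses space 3.
    preds = {"a": [], "b": ["a"], "c": ["b"]}
    print(reversible_pebbling_space(preds, ["a", "b", "c"], target="c"))  # 3
    ```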

    Constructing Hard Examples for Graph Isomorphism.

    We describe a method for generating graphs that provide difficult examples for practical Graph Isomorphism testers. We first give the theoretical construction, showing that we can have a family of graphs without any non-trivial automorphisms which also have high Weisfeiler-Leman dimension. The construction is based on properties of random 3XOR-formulas. We describe how to convert such a formula into a graph which has the desired properties with high probability. We validate the method by experimental implementations. We construct random formulas and validate them with a SAT solver to filter through suitable ones, and then convert them into graphs. Experimental results demonstrate that the resulting graphs do provide hard examples that match the hardest known benchmarks for graph isomorphism.
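
    The first stage of the pipeline, sampling random 3XOR formulas and filtering them for satisfiability, can be sketched as follows. This is an illustrative reconstruction: Gaussian elimination over GF(2) stands in for the SAT solver mentioned in the abstract, the sampling parameters are arbitrary, and the subsequent formula-to-graph conversion is omitted.

    ```python
    # Hedged sketch of the filtering stage described above: sample a random
    # 3XOR formula and test its satisfiability by elimination over GF(2).
    # The formula-to-graph conversion from the paper is not reproduced here.

    import random

    def random_3xor(n_vars, n_clauses, seed=0):
        """Each clause (i, j, k, b) encodes x_i XOR x_j XOR x_k = b."""
        rng = random.Random(seed)
        return [(*rng.sample(range(n_vars), 3), rng.randint(0, 1))
                for _ in range(n_clauses)]

    def xor_satisfiable(clauses):
        """Decide satisfiability of an XOR system by Gaussian elimination over GF(2)."""
        pivots = {}  # leading-bit position -> (row bitmask, right-hand side bit)
        for i, j, k, b in clauses:
            row = (1 << i) | (1 << j) | (1 << k)
            for p in sorted(pivots, reverse=True):   # reduce by known pivot rows
                if row >> p & 1:
                    prow, pb = pivots[p]
                    row ^= prow
                    b ^= pb
            if row == 0:
                if b == 1:
                    return False                     # derived the contradiction 0 = 1
            else:
                pivots[row.bit_length() - 1] = (row, b)
        return True

    clauses = random_3xor(n_vars=20, n_clauses=40)
    print(xor_satisfiable(clauses))
    ```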

    Static-Memory-Hard Functions, and Modeling the Cost of Space vs. Time

    A series of recent research starting with (Alwen and Serbinenko, STOC 2015) has deepened our understanding of the notion of memory-hardness in cryptography — a useful property of hash functions for deterring large-scale password-cracking attacks — and has shown memory-hardness to have intricate connections with the theory of graph pebbling. Definitions of memory-hardness are not yet unified in this somewhat nascent field, however, and the guarantees proven to date are with respect to a range of proposed definitions. In this paper, we observe two significant and practical considerations that are not analyzed by existing models of memory-hardness and propose new models to capture them, accompanied by constructions based on new hard-to-pebble graphs. Our contribution is two-fold, as follows.

    First, existing measures of memory-hardness only account for dynamic memory usage (i.e., memory read/written at runtime) and do not consider static memory usage (e.g., memory on disk). Among other things, this means that memory requirements considered by prior models are inherently upper-bounded by a hash function’s runtime; in contrast, counting static memory would potentially allow quantification of much larger memory requirements, decoupled from runtime. We propose a new definition of static-memory-hard function (SHF) which takes static memory into account: we model static memory usage by oracle access to a large preprocessed string, which may be considered part of the hash function description. Static memory requirements are complementary to dynamic memory requirements: neither can replace the other, and to deter large-scale password-cracking attacks, a hash function will benefit from being both dynamic-memory-hard and static-memory-hard. We give two SHF constructions based on pebbling. To prove static-memory-hardness, we define a new pebble game (“black-magic pebble game”) and new graph constructions with optimal complexity under our proposed measure. Moreover, we provide a prototype implementation of our first SHF construction (which is based on pebbling of a simple “cylinder” graph), providing an initial demonstration of practical feasibility for a limited range of parameter settings.

    Second, existing memory-hardness models implicitly assume that the costs of space and time are more or less on par: they consider only linear ratios between the costs of time and space. We propose a new model to capture nonlinear time-space trade-offs: e.g., how is the adversary impacted when space is quadratically more expensive than time? We prove that nonlinear trade-offs can in fact cause adversaries to employ different strategies from linear trade-offs.

    Finally, as an additional contribution of independent interest, we present an asymptotically tight graph construction that achieves the best possible space complexity up to $\log \log n$ factors for an existing memory-hardness measure called cumulative complexity in the sequential pebbling model.
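
    The nonlinear trade-off question can be illustrated with a toy cost functional: price the space held in each pebbling round as $|P_i|^\alpha$ rather than $|P_i|$. The functional, the round sizes, and the two strategies below are all assumptions made purely for illustration (they are not the paper's model or constructions); they show how the cheaper strategy can change when space is priced quadratically instead of linearly.

    ```python
    # Toy illustration of the nonlinear space-time trade-off question above:
    # price the space held in round i as |P_i|**alpha instead of |P_i|.  The
    # cost functional and both strategies are assumptions for illustration
    # only, not the paper's model or constructions.

    def nonlinear_cost(round_sizes, alpha):
        """round_sizes[i] = number of pebbles held during round i."""
        return sum(s ** alpha for s in round_sizes)

    # Two hypothetical strategies for the same task: A finishes quickly but is
    # space-heavy, B runs longer but keeps fewer pebbles around.
    strategy_a = [3, 3]          # 2 rounds, 3 pebbles in each
    strategy_b = [1, 2, 2, 2]    # 4 rounds, at most 2 pebbles

    for alpha in (1, 2):
        print(f"alpha={alpha}:",
              "A =", nonlinear_cost(strategy_a, alpha),
              "B =", nonlinear_cost(strategy_b, alpha))
    # alpha=1: A = 6  B = 7   -> A is cheaper when space is priced linearly
    # alpha=2: A = 18 B = 13  -> B wins when space is quadratically costlier
    ```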