
    PPP-Completeness with Connections to Cryptography

    Polynomial Pigeonhole Principle (PPP) is an important subclass of TFNP with profound connections to the complexity of the fundamental cryptographic primitives: collision-resistant hash functions and one-way permutations. In contrast to most of the other subclasses of TFNP, no complete problem is known for PPP. Our work identifies the first PPP-complete problem without any circuit or Turing machine given explicitly in the input, and thus we answer a longstanding open question from [Papadimitriou1994]. Specifically, we show that constrained-SIS (cSIS), a generalized version of the well-known Short Integer Solution problem (SIS) from lattice-based cryptography, is PPP-complete. To give intuition behind our reduction for constrained-SIS, we identify another PPP-complete problem with a circuit in the input but closely related to lattice problems. We call this problem BLICHFELDT; it is the computational problem associated with Blichfeldt's fundamental theorem in the theory of lattices. Building on the inherent connection of PPP with collision-resistant hash functions, we use our completeness result to construct the first natural hash function family that captures the hardness of all collision-resistant hash functions in a worst-case sense, i.e., it is natural and universal in the worst case. The close resemblance of our hash function family to SIS leads us to the first candidate collision-resistant hash function that is both natural and universal in an average-case sense. Finally, our results enrich our understanding of the connections between PPP, lattice problems and other concrete cryptographic assumptions, such as the discrete logarithm problem over general groups.
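
    As background for the SIS connection, the following is a minimal sketch of the standard SIS-based hash family (h_A(x) = A x mod q), not the paper's cSIS construction; the parameter values and function names are illustrative assumptions.

        import numpy as np

        # Sketch of the standard SIS hash family h_A(x) = A x mod q.
        # Parameter choices are illustrative only.
        def keygen(n=32, m=512, q=3329, seed=0):
            rng = np.random.default_rng(seed)
            A = rng.integers(0, q, size=(n, m))  # public random matrix over Z_q
            return A, q

        def hash_sis(A, q, x):
            # x: binary vector of length m; the output lies in Z_q^n.
            x = np.asarray(x)
            assert x.shape == (A.shape[1],) and set(np.unique(x)) <= {0, 1}
            return (A @ x) % q

        # Any collision x != x' with hash_sis(A, q, x) == hash_sis(A, q, x')
        # yields a short nonzero z = x - x' with A z = 0 (mod q), i.e. a
        # solution to the Short Integer Solution problem.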

    Moving Toward Non-transcription Based Discourse Analysis in Stable and Progressive Aphasia

    Measurement of communication ability at the discourse level holds promise for predicting how well persons with stable (e.g., stroke-induced) or progressive aphasia navigate everyday communicative interactions. However, barriers to the clinical utilization of discourse measures have persisted. Recent advancements in the standardization of elicitation protocols and the existence of large databases for the development of normative references have begun to address some of these barriers. Still, time remains a barrier consistently reported by clinicians. Non-transcription-based discourse measurement would reduce the time required for discourse analysis, making clinical utilization a reality. The purpose of this article is to present evidence regarding discourse measures (main concept analysis, core lexicon, and derived efficiency scores) that are well suited to non-transcription-based analysis. Combined with previous research, our results suggest that these measures are sensitive to changes following stroke or neurodegenerative disease. Given the evidence, further research specifically assessing the reliability of these measures in clinical implementation is warranted.

    On the Optimal Linear Convergence Rate of a Generalized Proximal Point Algorithm

    The proximal point algorithm (PPA) has been well studied in the literature. In particular, its linear convergence rate was studied by Rockafellar in 1976 under a certain condition. We consider a generalized PPA in the generic setting of finding a zero point of a maximal monotone operator, and show that the condition proposed by Rockafellar also suffices to ensure the linear convergence rate for this generalized PPA. Indeed, we show that these linear convergence rates are optimal. Both the exact and inexact versions of this generalized PPA are discussed. The motivation to consider this generalized PPA is that it includes as special cases the relaxed versions of some splitting methods that originate from PPA. Thus, linear convergence results of this generalized PPA can be used to better understand the convergence of some widely used algorithms in the literature. We focus on the particular convex minimization context and specify Rockafellar's condition to see how to ensure the linear convergence rate for some efficient numerical schemes, including the classical augmented Lagrangian method proposed by Hestenes and Powell in 1969 and its relaxed version, and the original alternating direction method of multipliers (ADMM) by Glowinski and Marrocco in 1975 and its relaxed version (i.e., the generalized ADMM by Eckstein and Bertsekas in 1992). Some refined conditions weaker than existing ones are proposed in these particular contexts. (Comment: 22 pages, 1 figure)
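
    For orientation, the classical proximal point step and a commonly studied relaxed (generalized) variant are sketched below in LaTeX; the relaxation factor \gamma and the exact form of the generalization analyzed in the paper are assumptions here, not taken from it.

        % Classical PPA for finding x with 0 \in T(x), T maximal monotone:
        \[
          x^{k+1} = (I + \lambda_k T)^{-1} x^{k}, \qquad \lambda_k > 0.
        \]
        % A commonly studied relaxed variant with relaxation factor \gamma \in (0, 2):
        \[
          \tilde{x}^{k} = (I + \lambda_k T)^{-1} x^{k}, \qquad
          x^{k+1} = x^{k} - \gamma\,\bigl(x^{k} - \tilde{x}^{k}\bigr).
        \]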

    Using a novel source-localized phase regressor technique for evaluation of the vascular contribution to semantic category area localization in BOLD fMRI.

    Numerous studies have shown that gradient-echo blood oxygen level dependent (BOLD) fMRI is biased toward large draining veins. However, the impact of this large-vein bias on the localization and characterization of semantic category areas has not been examined. Here we address this issue by comparing standard magnitude measures of BOLD activity in the Fusiform Face Area (FFA) and Parahippocampal Place Area (PPA) to those obtained using a novel method that suppresses the contribution of large draining veins: source-localized phase regressor (sPR). Unlike previous suppression methods that utilize the phase component of the BOLD signal, sPR yields robust and unbiased suppression of large draining veins even in voxels with no task-related phase changes. This is confirmed in ideal simulated data as well as in FFA/PPA localization data from four subjects. It was found that approximately 38% of right PPA, 14% of left PPA, 16% of right FFA, and 6% of left FFA voxels predominantly reflect signal from large draining veins. Surprisingly, with the contributions from large veins suppressed, semantic category representation in PPA actually tends to be lateralized to the left rather than the right hemisphere. Furthermore, semantic category areas larger in volume and higher in fSNR were found to have more contributions from large veins. These results suggest that previous studies using gradient-echo BOLD fMRI were biased toward semantic category areas that receive relatively greater contributions from large veins.
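
    For context, a generic (non-source-localized) phase-regression sketch is shown below: each voxel's magnitude time series is regressed on its own phase time series and the phase-coupled component is subtracted. The function name and the per-voxel linear fit are illustrative assumptions; the paper's sPR method additionally localizes the vein source and is not reproduced here.

        import numpy as np

        def phase_regress(magnitude, phase):
            """Generic per-voxel phase regression (illustrative sketch).

            magnitude, phase: arrays of shape (n_voxels, n_timepoints).
            Returns the magnitude data with the phase-coupled (large-vein-
            weighted) component removed voxel by voxel.
            """
            cleaned = np.empty_like(magnitude, dtype=float)
            for v in range(magnitude.shape[0]):
                y = magnitude[v].astype(float)
                p = phase[v].astype(float)
                X = np.column_stack([np.ones_like(p), p])    # intercept + phase
                beta, *_ = np.linalg.lstsq(X, y, rcond=None) # least-squares fit
                cleaned[v] = y - p * beta[1]                 # drop phase-coupled part
            return cleaned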

    GR@PPA 2.8: initial-state jet matching for weak boson production processes at hadron collisions

    The initial-state jet matching method introduced in our previous studies has been applied to the event generation of single $W$ and $Z$ production processes and diboson ($W^{+}W^{-}$, $WZ$ and $ZZ$) production processes at hadron collisions in the framework of the GR@PPA event generator. The generated events reproduce the transverse momentum spectra of weak bosons continuously in the entire kinematical region. The matrix elements (ME) for hard interactions are still at the tree level. As in previous versions, the decays of weak bosons are included in the matrix elements. Therefore, spin correlations and phase-space effects in the decay of weak bosons are exact at the tree level. The program package includes custom-made parton shower programs as well as ME-based hard interaction generators in order to achieve self-consistent jet matching. The generated events can be passed to general-purpose event generators to make the simulation proceed down to the hadron level. (Comment: 29 pages, 14 figures; minor changes to clarify the discussions, and corrections of typos)