
    Quantum algorithms for highly non-linear Boolean functions

    Attempts to separate the power of classical and quantum models of computation have a long history. The ultimate goal is to find exponential separations for computational problems. However, such separations are hard to come by: while there were some early successes in the form of hidden subgroup problems for abelian groups (which generalize Shor's factoring algorithm perhaps most faithfully), efficient quantum algorithms have been found for only a handful of non-abelian groups. Recently, problems that seek to identify hidden sub-structures of combinatorial and algebraic objects other than groups have received increased attention. In this paper we provide new examples of exponential separations by considering hidden shift problems defined for several classes of highly non-linear Boolean functions. These so-called bent functions arise in cryptography, where their property of having a perfectly flat Fourier spectrum on the Boolean hypercube gives them resilience against certain types of attack. We present new quantum algorithms that solve the hidden shift problems for several well-known classes of bent functions in polynomial time and with a constant number of queries, while the classical query complexity is shown to be exponential. Our approach uses a technique that exploits the duality between bent functions and their Fourier transforms.
    Comment: 15 pages, 1 figure, to appear in Proceedings of the 21st Annual ACM-SIAM Symposium on Discrete Algorithms (SODA'10). This updated version of the paper contains a new exponential separation between classical and quantum query complexity.
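    The defining property mentioned in the abstract, a perfectly flat Fourier (Walsh-Hadamard) spectrum, is easy to check numerically. The sketch below is an illustration, not code from the paper: it evaluates the classic inner-product bent function f(x, y) = x·y and verifies that every Walsh coefficient has absolute value 2^(m/2); the function names and the choice n = 2 are assumptions made for the example.

```python
# Illustrative sketch: verify the flat Walsh-Hadamard spectrum of a bent function.
# Not from the paper; the inner-product function below is a textbook bent function.
import itertools

def inner_product_bent(bits, n):
    """f(x, y) = x1*y1 XOR ... XOR xn*yn on 2n Boolean variables."""
    x, y = bits[:n], bits[n:]
    return sum(a & b for a, b in zip(x, y)) % 2

def walsh_spectrum(f, m):
    """Walsh-Hadamard coefficients W_f(w) = sum_x (-1)^(f(x) XOR w.x)."""
    spectrum = []
    for w in itertools.product((0, 1), repeat=m):
        total = 0
        for x in itertools.product((0, 1), repeat=m):
            dot = sum(wi & xi for wi, xi in zip(w, x)) % 2
            total += (-1) ** (f(x) ^ dot)
        spectrum.append(total)
    return spectrum

if __name__ == "__main__":
    n = 2                     # assumed example size: f acts on 2n = 4 variables
    m = 2 * n
    spec = walsh_spectrum(lambda bits: inner_product_bent(bits, n), m)
    # Bentness means |W_f(w)| = 2^(m/2) for every w, i.e. a perfectly flat spectrum.
    assert all(abs(c) == 2 ** (m // 2) for c in spec)
    print("flat spectrum:", spec)
```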

    Learning from Multi-View Multi-Way Data via Structural Factorization Machines

    Real-world relations among entities can often be observed and characterized from different perspectives/views. For example, the decision made by a user on whether to adopt an item relies on multiple aspects such as the contextual information of the decision, the item's attributes, the user's profile, and the reviews given by other users. Different views may exhibit multi-way interactions among entities and provide complementary information. In this paper, we introduce a multi-tensor-based approach that can preserve the underlying structure of multi-view data in a generic predictive model. Specifically, we propose structural factorization machines (SFMs) that learn the common latent spaces shared by multi-view tensors and automatically adjust the importance of each view in the predictive model. Furthermore, the complexity of SFMs is linear in the number of parameters, which makes SFMs suitable for large-scale problems. Extensive experiments on real-world datasets demonstrate that the proposed SFMs outperform several state-of-the-art methods in terms of prediction accuracy and computational cost.
    Comment: 10 pages
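    For context on the "linear in the number of parameters" claim, the sketch below shows a plain second-order factorization machine predictor, which SFMs generalize to multi-view tensor data. This is not the authors' SFM model; the latent rank k, the feature dimension d, and the random weights are placeholders chosen for the example.

```python
# Illustrative sketch of a standard second-order factorization machine (FM),
# not the paper's SFM. The pairwise term uses the usual O(k*d) identity,
# which is what keeps evaluation linear in the number of parameters.
import numpy as np

def fm_predict(x, w0, w, V):
    """y(x) = w0 + <w, x> + sum_{i<j} <V_i, V_j> x_i x_j."""
    linear = w0 + w @ x
    Vx = V.T @ x                      # shape (k,)
    V2x2 = (V ** 2).T @ (x ** 2)      # shape (k,)
    pairwise = 0.5 * np.sum(Vx ** 2 - V2x2)
    return linear + pairwise

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    d, k = 8, 3                       # assumed feature dimension and latent rank
    x = rng.random(d)
    w0, w, V = 0.1, rng.normal(size=d), rng.normal(size=(d, k))
    print(fm_predict(x, w0, w, V))
```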