
    Faster polynomial multiplication over finite fields

    Let p be a prime, and let M_p(n) denote the bit complexity of multiplying two polynomials in F_p[X] of degree less than n. For n large compared to p, we establish the bound M_p(n) = O(n log n 8^(log^* n) log p), where log^* is the iterated logarithm. This is the first known Fürer-type complexity bound for F_p[X], and improves on the previously best known bound M_p(n) = O(n log n log log n log p).
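    For context, the quantity M_p(n) bounds the cost of the basic task sketched below. This is a minimal schoolbook baseline (O(n^2) coefficient operations), not the paper's Fürer-type algorithm; the function name is illustrative.

```python
def poly_mul_mod_p(a, b, p):
    """Schoolbook multiplication of polynomials over F_p.

    a, b are coefficient lists, lowest degree first; runs in O(n^2)
    coefficient operations, the baseline that fast algorithms improve on."""
    result = [0] * (len(a) + len(b) - 1)
    for i, ai in enumerate(a):
        for j, bj in enumerate(b):
            result[i + j] = (result[i + j] + ai * bj) % p
    return result

# (1 + 2X)(3 + X + 4X^2) over F_5 = 3 + 2X + X^2 + 3X^3
print(poly_mul_mod_p([1, 2], [3, 1, 4], 5))   # [3, 2, 1, 3]
```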

    Progressive refinement rendering of implicit surfaces

    The visualisation of implicit surfaces can be an inefficient task when such surfaces are complex and highly detailed. Visualising a surface by first converting it to a polygon mesh may lead to an excessive polygon count. Visualising a surface by direct ray casting is often a slow procedure. In this paper we present a progressive refinement renderer for implicit surfaces that are Lipschitz continuous. The renderer first displays a low resolution estimate of what the final image is going to be and, as the computation progresses, increases the quality of this estimate at an interactive frame rate. This renderer provides a quick previewing facility that significantly reduces the design cycle of a new and complex implicit surface. The renderer is also capable of completing an image faster than a conventional implicit surface rendering algorithm based on ray casting.
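    The Lipschitz-continuity requirement matters because a Lipschitz bound gives a safe marching step when ray casting an implicit surface (sphere tracing). The sketch below illustrates only that idea, not the paper's progressive-refinement renderer; the scene function and constants are illustrative.

```python
import math

def sphere(p):
    """Signed distance to a unit sphere at the origin (Lipschitz constant 1)."""
    x, y, z = p
    return math.sqrt(x * x + y * y + z * z) - 1.0

def sphere_trace(f, origin, direction, lipschitz=1.0, max_steps=128, eps=1e-4):
    """March along a ray: |f(p)| / lipschitz is a step that cannot overshoot
    the zero set of a Lipschitz-continuous implicit function f."""
    t = 0.0
    for _ in range(max_steps):
        p = tuple(o + t * d for o, d in zip(origin, direction))
        d = f(p)
        if abs(d) < eps:
            return t            # hit: distance along the (unit-length) ray
        t += d / lipschitz
    return None                 # no intersection found within max_steps

# Ray from z = -3 straight towards the sphere: hits at t = 2
print(sphere_trace(sphere, (0.0, 0.0, -3.0), (0.0, 0.0, 1.0)))
```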

    Ratios of W and Z cross sections at large boson p_T as a constraint on PDFs and background to new physics

    We motivate a measurement of various ratios of W and Z cross sections at the Large Hadron Collider (LHC) at large values of the boson transverse momentum (p_T ≳ M_{W,Z}). We study the dependence of predictions for these cross-section ratios on the multiplicity of associated jets, the boson p_T and the LHC centre-of-mass energy. We present the flavour decomposition of the initial-state partons and an evaluation of the theoretical uncertainties. We show that the W^+/W^- ratio is sensitive to the up-quark to down-quark ratio of parton distribution functions (PDFs), while other theoretical uncertainties are negligible, meaning that a precise measurement of the W^+/W^- ratio at large boson p_T values could constrain the PDFs at larger momentum fractions x than the usual inclusive W charge asymmetry. The W^±/Z ratio is insensitive to PDFs and most other theoretical uncertainties, other than possibly electroweak corrections, and a precise measurement will therefore be useful in validating theoretical predictions needed in data-driven methods, such as using W(→ℓν)+jets events to estimate the Z(→νν̄)+jets background in searches for new physics at the LHC. The differential W and Z cross sections themselves, dσ/dp_T, have the potential to constrain the gluon distribution, provided that theoretical uncertainties from higher-order QCD and electroweak corrections are brought under control, such as by inclusion of anticipated next-to-next-to-leading order QCD corrections. Comment: 33 pages, 13 figures. v2: expanded version published in JHEP

    Top-Quark Physics at the LHC

    The top quark is the heaviest of all known elementary particles. It was discovered in 1995 by the CDF and D0 experiments at the Tevatron. With the start of the LHC in 2009, an unprecedented wealth of measurements of the top quark's production mechanisms and properties has been performed by the ATLAS and CMS collaborations, most of them resulting in smaller uncertainties than those achieved previously. At the same time, huge progress was made on the theoretical side, yielding significantly improved predictions up to next-to-next-to-leading order in perturbative QCD. Due to the vast number of events containing top quarks, a variety of new measurements became feasible and opened a new window to precision tests of the Standard Model and to contributions of new physics. In this review, originally written for a recent book on the results of LHC Run 1, top-quark measurements obtained so far from LHC Run 1 are summarised and put in context with the current understanding of the Standard Model. Comment: 35 pages, 25 figures. To appear in "The Large Hadron Collider -- Harvest of Run 1", Thomas Schörner-Sadenius (ed.), Springer, 2015 (532 pages, 253 figures; ISBN 978-3-319-15000-0; eBook ISBN 978-3-319-15001-7; for more details, see http://www.springer.com/de/book/9783319150000)

    Parallel Construction of Irreducible Polynomials

    Let arithmetic pseudo-NC^k denote the problems that can be solved by log-space uniform arithmetic circuits over the finite prime field GF(p) of depth O(log^k (n + p)) and size polynomial in (n + p). We show that the problem of constructing an irreducible polynomial of specified degree over GF(p) belongs to pseudo-NC^2.5. We prove that the problem of constructing an irreducible polynomial of specified degree over GF(p) whose roots are guaranteed to form a normal basis for the corresponding field extension pseudo-NC^2-reduces to the problem of factor refinement. We show that factor refinement of polynomials is in arithmetic NC^3. Our algorithm works over any field and, compared to other known algorithms, it does not assume the ability to take p-th roots when the field has characteristic p.
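    For small parameters, the object being constructed can be illustrated by brute force: a monic polynomial of degree 2 or 3 over GF(p) is irreducible exactly when it has no root in GF(p). The sketch below is a sequential illustration of that special case, not the paper's pseudo-NC algorithm; names are illustrative.

```python
import itertools

def has_root(coeffs, p):
    """coeffs are c_0..c_d of a polynomial c_0 + c_1 X + ... + c_d X^d over GF(p)."""
    return any(sum(c * pow(x, i, p) for i, c in enumerate(coeffs)) % p == 0
               for x in range(p))

def irreducible_of_degree(d, p):
    """Brute-force search for a monic irreducible polynomial of degree d over GF(p).
    For d = 2 or 3, irreducibility is equivalent to having no root in GF(p)."""
    assert d in (2, 3), "the root test suffices only for degrees 2 and 3"
    for lower in itertools.product(range(p), repeat=d):
        coeffs = list(lower) + [1]          # monic: leading coefficient 1
        if not has_root(coeffs, p):
            return coeffs
    return None

# X^2 + 1 is irreducible over GF(7), since -1 is not a square mod 7
print(irreducible_of_degree(2, 7))          # [1, 0, 1]
```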

    Quantum machine learning: a classical perspective

    Recently, increased computational power and data availability, as well as algorithmic advances, have led machine learning techniques to impressive results in regression, classification, data-generation and reinforcement learning tasks. Despite these successes, the proximity to the physical limits of chip fabrication alongside the increasing size of datasets is motivating a growing number of researchers to explore the possibility of harnessing the power of quantum computation to speed up classical machine learning algorithms. Here we review the literature in quantum machine learning and discuss perspectives for a mixed readership of classical machine learning and quantum computation experts. Particular emphasis will be placed on clarifying the limitations of quantum algorithms, how they compare with their best classical counterparts and why quantum resources are expected to provide advantages for learning problems. Learning in the presence of noise and certain computationally hard problems in machine learning are identified as promising directions for the field. Practical questions, like how to upload classical data into quantum form, will also be addressed. Comment: v3: 33 pages; typos corrected and references added

    A measurement study of peer-to-peer bootstrapping and implementations of delay-based cryptography

    This thesis researches two distinct areas of study: peer-to-peer networking for modern cryptocurrencies and implementations of delay-based cryptography.

    The first part of the thesis researches elements of peer-to-peer network mechanisms, with a specific focus on the dependencies on centralised infrastructure required for initial participation in such networks. Cryptocurrencies rely on decentralised peer-to-peer networks, yet the method by which new peers initially join these networks, known as bootstrapping, presents a significant challenge. Our original research consists of a measurement study of 74 cryptocurrencies. Our study reveals a prevalent reliance on centralised infrastructure, which leads to censorship-prone bootstrapping techniques and leaves networks vulnerable to censorship and manipulation. In response, we explore alternative bootstrapping methods, seeking solutions less susceptible to censorship. However, our research demonstrates operational challenges and limitations which hinder their effectiveness, highlighting the complexity of achieving censorship resistance in practice. Furthermore, our global measurement study uncovers the details of cryptocurrency peer-to-peer networks, revealing instances of outages and intentional protocol manipulation impacting bootstrapping operations. Through a volunteer network of probes deployed across 42 countries, we analyse network topology, exposing centralisation tendencies and unintentional peer exposure. Our research also highlights the pervasive inheritance of legacy bootstrapping methods, perpetuating security vulnerabilities and censorship risks within cryptocurrency systems. These findings illuminate broader concerns surrounding decentralisation and censorship resistance in distributed systems. In conclusion, our study offers valuable insights into cryptocurrency bootstrapping techniques and their susceptibility to censorship, paving the way for future research and interventions to enhance the resilience and autonomy of peer-to-peer networks.

    In the second part of the thesis, attention shifts towards delay-based cryptography, where the focus lies on the creation and practical implementation of timed-release encryption schemes. Drawing from historical delay-based cryptographic protocols, this thesis presents two original research contributions. The first is the creation of a new timed-release encryption scheme with a property termed implicit authentication. The second contribution is the development of a practical construction called TIDE (TIme Delayed Encryption) tailored for use in sealed-bid auctions.

    Timed-Release Encryption with Implicit Authentication (TRE-IA) is a cryptographic primitive which introduces a new property named implicit authentication (IA). This property ensures that only authorised parties, such as whistleblowers, can generate meaningful ciphertexts. By incorporating IA techniques into the encryption process, TRE-IA augments standard timed-release encryption schemes with a new feature: only the party with the encryption key can create meaningful ciphertexts, which guarantees the authenticity of the party behind the sensitive data disclosure. Specifically, IA enables the encryption process to authenticate the identity of the whistleblower through the ciphertext, preventing malicious parties from generating ciphertexts that do not originate from legitimate sources. This ensures the integrity and authenticity of the encrypted data, safeguarding against potential leaks of information not vetted by the party performing the encryption.

    TIDE introduces a new method for timed-release encryption in the context of sealed-bid auctions by creatively using classic number-theoretic techniques. By integrating RSA-OAEP public-key encryption and the Rivest-Shamir-Wagner time-lock assumption with classic number theory principles, TIDE offers a solution that is both conceptually straightforward and efficient to implement. Our contributions in TIDE address the complexities and performance challenges inherent in current instantiations of timed-release encryption schemes. Our research output is a practical timed-release encryption implementation on consumer-grade hardware which can facilitate real-world applications such as sealed-bid auctions, with clear steps for implementation.

    Finally, our thesis concludes with a review of the prospects of delay-based cryptography, where we consider potential applications such as leveraging TIDE for a public randomness beacon.
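    The Rivest-Shamir-Wagner time-lock assumption referenced above is that repeated modular squaring has no known shortcut without the factorisation of the modulus. The sketch below is a toy illustration of that assumption only, not the TIDE construction; the parameters are deliberately small, and a real deployment would use a large RSA modulus and pair the derived key with authenticated encryption (e.g. RSA-OAEP, as in the thesis).

```python
import hashlib
import math
import secrets

def create_puzzle(p, q, T):
    """The puzzle creator knows the factorisation of N = p*q, so Euler's theorem
    lets it shortcut the T sequential squarings."""
    N = p * q
    phi = (p - 1) * (q - 1)
    while True:
        a = secrets.randbelow(N - 2) + 2       # random base coprime to N
        if math.gcd(a, N) == 1:
            break
    e = pow(2, T, phi)                          # reduce the exponent 2^T modulo phi(N)
    b = pow(a, e, N)                            # a^(2^T) mod N, computed quickly
    key = hashlib.sha256(str(b).encode()).digest()
    return (N, a, T), key                       # publish the puzzle, keep the key

def solve_puzzle(puzzle):
    """Anyone without the factorisation must perform T sequential modular squarings."""
    N, a, T = puzzle
    b = a
    for _ in range(T):
        b = pow(b, 2, N)
    return hashlib.sha256(str(b).encode()).digest()

# Toy example: small primes and a small delay parameter T
puzzle, key = create_puzzle(1009, 1013, T=50000)
assert solve_puzzle(puzzle) == key
```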