
    The representation of integers by positive ternary quadratic polynomials

    An integral quadratic polynomial is called regular if it represents every integer that is represented by the polynomial itself over the reals and over the p-adic integers for every prime p. It is called complete if it is of the form Q(x + v), where Q is an integral quadratic form in the variables x = (x₁, …, xₙ) and v is a vector in ℚⁿ. Its conductor is defined to be the smallest positive integer c such that cv ∈ ℤⁿ. We prove that for a fixed positive integer c, there are only finitely many equivalence classes of positive primitive ternary regular complete quadratic polynomials with conductor c. This generalizes the analogous finiteness results for positive definite regular ternary quadratic forms by Watson and for ternary triangular forms by Chan and Oh.
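    As a hedged illustration of the definitions above (the example is ours, not the abstract's): sums of triangular numbers give a complete ternary quadratic polynomial of conductor 2.

```latex
% Illustrative example (not taken from the abstract): take the integral
% quadratic form Q(\mathbf{x}) = x_1^2 + x_2^2 + x_3^2 and the shift
% vector \mathbf{v} = (1/2, 1/2, 1/2). Then
\[
  Q(\mathbf{x} + \mathbf{v})
  = \sum_{i=1}^{3}\Bigl(x_i + \tfrac{1}{2}\Bigr)^{2}
  = 2\sum_{i=1}^{3}\frac{x_i(x_i+1)}{2} + \tfrac{3}{4}.
\]
% The smallest positive integer c with c\mathbf{v} \in \mathbb{Z}^3 is
% c = 2, so this complete polynomial has conductor 2; it encodes sums of
% three triangular numbers x_i(x_i+1)/2, the ternary triangular forms of
% Chan and Oh mentioned in the abstract.
```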

    Axial Vector Z′Z' and Anomaly Cancellation

    Whilst the prospect of new Z′ gauge bosons with only axial couplings to the Standard Model (SM) fermions is widely discussed, examples of anomaly-free renormalisable models are lacking in the literature. We look to remedy this by constructing several motivated examples. Specifically, we consider axial vectors which couple universally to all SM fermions, as well as those which are generation-specific, leptophilic, and leptophobic. Anomaly cancellation typically requires the presence of new coloured and charged chiral fermions, and we argue that in a large class of models the masses of these new states are expected to be comparable to that of the axial vector. Finally, an axial vector mediator could provide a portal between SM and hidden sector states, and we also consider the possibility that the axial vector couples to dark matter. If the dark matter relic density is set by freeze-out via the axial vector, this strongly constrains the parameter space.
    Comment: 28 pages, 8 figures. v2: published version.

    Looking for a light Higgs boson in the overlooked channel

    The final state obtained when a Higgs boson decays to a photon and a Z boson has been mostly overlooked in current searches for a light Higgs boson. However, when the Z boson decays leptonically, all final-state particles in this channel can be measured, allowing for accurate reconstruction of the Higgs mass and angular correlations. We determine the sensitivity of the Large Hadron Collider (LHC), running at center-of-mass energies of 8 and 14 TeV, to Standard Model (SM) Higgs bosons with masses in the 120-130 GeV range. For the 8 TeV LHC, sensitivity to several times the SM cross section times branching ratio may be obtained with 20 inverse femtobarns of integrated luminosity, while for the 14 TeV LHC, the SM rate is probed with about 100 inverse femtobarns of integrated luminosity.
    Comment: 4 pages, 4 figures. Improves on version 1 in that 8 and 14 TeV LHC running is considered, the case of a 125 GeV Higgs is treated specifically, and the effect of an additional jet in the final state has been taken into account in studying experimental sensitivity.

    Integrative genomic mining for enzyme function to enable engineering of a non-natural biosynthetic pathway.

    The ability to biosynthetically produce chemicals beyond what is commonly found in Nature requires the discovery of novel enzyme function. Here we utilize two approaches to discover enzymes that enable specific production of longer-chain (C5-C8) alcohols from sugar. The first approach combines bioinformatics and molecular modelling to mine sequence databases, resulting in a diverse panel of enzymes capable of catalysing the targeted reaction. The median catalytic efficiency of the computationally selected enzymes is 75-fold greater than a panel of naively selected homologues. This integrative genomic mining approach establishes a unique avenue for enzyme function discovery in the rapidly expanding sequence databases. The second approach uses computational enzyme design to reprogramme specificity. Both approaches result in enzymes with >100-fold increase in specificity for the targeted reaction. When enzymes from either approach are integrated in vivo, longer-chain alcohol production increases over 10-fold and represents >95% of the total alcohol products.

    Snatch trajectory of elite level girevoy (Kettlebell) sport athletes and its implications to strength and conditioning coaching

    Girevoy sport (GS) has developed only recently in the West, resulting in a paucity of English-language scientific literature. The aim was to document the kettlebell trajectory of GS athletes performing the kettlebell snatch. Four elite GS athletes (age = 29-47 years, body mass = 68.3-108.1 kg, height = 1.72-1.89 m) completed one set of 16 repetitions with a 32.1 kg kettlebell. Trajectory was captured with the VICON motion analysis system (250 Hz) and analysed with VICON Nexus (1.7.1). The kettlebell followed a ‘C’-shaped trajectory in the sagittal plane. Mean peak velocity in the upwards phase was 4.03 ± 0.20 m·s⁻¹, compared to 3.70 ± 0.30 m·s⁻¹ during the downwards phase, and mean radial error across the sagittal and frontal planes was 0.022 ± 0.006 m. Low error in the movement suggests a consistent trajectory is important to reduce extraneous movement and improve efficiency. While the kettlebell snatch and swing both require large anterior-posterior motion, the snatch requires the kettlebell to be held stationary overhead. Therefore, a different coaching application is required from that of a barbell snatch.

    Synthetic boundary conditions for image deblurring

    In this paper we introduce a new boundary condition that can be used when reconstructing an image from observed blurred and noisy data. Our approach uses information from the observed image to enforce boundary conditions that continue image features such as edges and texture across the boundary. Because of its similarity to methods used in texture synthesis, we call our approach synthetic boundary conditions. We provide an efficient algorithm for implementing the new boundary condition, and provide a linear algebraic framework for the approach that puts it in the context of more classical and well-known image boundary conditions, including zero, periodic, reflective, and anti-reflective. Extensive numerical experiments show that our new synthetic boundary conditions provide a more accurate approximation of the true image scene outside the image boundary, and thus allow for better reconstructions of the unknown, true image scene.
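    As a hedged side note (not part of the abstract): the classical boundary conditions named above can be pictured as different ways of padding the observed image. A minimal NumPy sketch, with illustrative values:

```python
import numpy as np

# A 1-D "image" row; a deblurring model needs values beyond the boundary.
row = np.array([1.0, 2.0, 3.0, 4.0])

# Classical choices for extending the scene outside the observed data:
zero = np.pad(row, 2, mode="constant")         # zero BC:      0 0 1 2 3 4 0 0
periodic = np.pad(row, 2, mode="wrap")         # periodic BC:  3 4 1 2 3 4 1 2
reflective = np.pad(row, 2, mode="symmetric")  # reflective:   2 1 1 2 3 4 4 3
```

    NumPy has no built-in anti-reflective mode, and the paper's synthetic boundary condition is likewise not a simple padding rule; both extend the scene by more elaborate constructions than the sketch above.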

    Secondary frequencies in the wake of a circular cylinder with vortex shedding

    A detailed numerical study of two-dimensional flow past a circular cylinder at moderately low Reynolds numbers was conducted using three different numerical algorithms for solving the time-dependent compressible Navier-Stokes equations. It was found that if the algorithm and associated boundary conditions were consistent and stable, then the major features of the unsteady wake were well predicted. It was also found, however, that even stable and consistent boundary conditions could introduce additional periodic phenomena reminiscent of the type seen in previous wind-tunnel experiments. These additional frequencies were eliminated by formulating the boundary conditions in terms of the characteristic variables. An analysis based on a simplified model provides an explanation for this behavior.
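    For a hedged illustration of what "characteristic variables" means in a 1-D compressible setting (the paper's exact formulation is not reproduced here; all names below are ours), the acoustic Riemann invariants of the 1-D Euler equations are u ± 2c/(γ−1):

```python
# Sketch: characteristic (Riemann) variables for the 1-D Euler equations,
# the kind of quantities used to build non-reflecting far-field boundary
# conditions. Illustrative only; not code from the paper.

GAMMA = 1.4  # ratio of specific heats for air

def riemann_invariants(u, p, rho):
    """Return the two acoustic Riemann invariants u +/- 2c/(gamma-1)."""
    c = (GAMMA * p / rho) ** 0.5           # local speed of sound
    r_plus = u + 2.0 * c / (GAMMA - 1.0)   # carried along characteristics u + c
    r_minus = u - 2.0 * c / (GAMMA - 1.0)  # carried along characteristics u - c
    return r_plus, r_minus

def recover_state(r_plus, r_minus):
    """Invert the invariants back to velocity and sound speed."""
    u = 0.5 * (r_plus + r_minus)
    c = 0.25 * (GAMMA - 1.0) * (r_plus - r_minus)
    return u, c
```

    At a subsonic boundary one typically fixes the invariant entering the domain from the far-field state and extrapolates the outgoing one from the interior, then recovers (u, c) as above; imposing boundary data this way is what suppresses spurious acoustic reflections.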

    Open Standard, Open Source and Peer to Peer Methods for Collaborative Product Development and Knowledge Management

    Tools such as product data management (PDM) and its offspring, product lifecycle management (PLM), enable collaboration within and between enterprises. Large enterprises have invariably been the target of software vendors for the development of such tools, resulting in large centralized applications. These are beyond the means of small to medium enterprises (SMEs). Even with these tools in place, large enterprises face numerous difficulties with PLM. Firstly, enterprises evolve, and an evolving enterprise needs an evolving data management system; with large applications, such configuration changes have to be made at the server level by dedicated staff. The second problem arises when enterprises wish to collaborate with a large number of suppliers and original equipment manufacturer (OEM) customers. Current applications enable collaboration using business-to-business (B2B) protocols. However, these do not take into account that disparate enterprises do not have unitary data models or workflows. This is a strong factor in reducing the ability of large enterprises to participate in collaborative projects.