
    Normalizers of Irreducible Subfactors

    We consider normalizers of an irreducible inclusion $N \subseteq M$ of $\mathrm{II}_1$ factors. In the infinite index setting an inclusion $uNu^* \subseteq N$ can be strict, forcing us to also investigate the semigroup of one-sided normalizers. We relate these normalizers of $N$ in $M$ to projections in the basic construction and show that every trace one projection in the relative commutant $N' \cap \langle M, e_N \rangle$ is of the form $u^* e_N u$ for some unitary $u \in M$ with $uNu^* \subseteq N$. This enables us to identify the normalizers and the algebras they generate in several situations. In particular, each normalizer of a tensor product of irreducible subfactors is a tensor product of normalizers modulo a unitary. We also examine normalizers of irreducible subfactors arising from subgroup--group inclusions $H \subseteq G$. Here the normalizers are the normalizing group elements modulo a unitary from $L(H)$. We are also able to identify the finite trace $L(H)$-bimodules in $\ell^2(G)$ as double cosets which are also finite unions of left cosets.
    Comment: 33 pages
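A brief reminder of the standard identities behind the basic construction referenced above (textbook facts about the Jones projection $e_N$, not claims made in this abstract):

```latex
% E_N : M -> N is the trace-preserving conditional expectation;
% e_N is the Jones projection of the basic construction <M, e_N>.
e_N x e_N = E_N(x)\, e_N \qquad (x \in M),
\qquad
\operatorname{tr}(e_N) = [M:N]^{-1} \quad \text{(finite index, normalized trace)}.
```

In the infinite index case the canonical trace on $\langle M, e_N \rangle$ is only semifinite, but under the usual convention $\operatorname{Tr}(x e_N y) = \operatorname{tr}(xy)$ one has $\operatorname{Tr}(e_N) = 1$, which is what makes "trace one projections in $N' \cap \langle M, e_N \rangle$" a meaningful class.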

    Eliminating the Hadronic Uncertainty

    The Standard Model Lagrangian requires the values of the fermion masses, the Higgs mass and three other experimentally well-measured quantities as input in order to become predictive. These are typically taken to be $\alpha$, $G_\mu$ and $M_Z$. Using the first of these, however, introduces a hadronic contribution that leads to a significant error. If a quantity could be found that was measured at high energy with sufficient precision, it could be used to replace $\alpha$ as input. The level of precision required for this to happen is given for a number of precisely-measured observables: the $W$ boson mass must be measured with an error of $\pm 13$ MeV, $\Gamma_Z$ to $0.7$ MeV and the polarization asymmetry, $A_{LR}$, to $\pm 0.002$; the last of these would seem to be the most promising candidate. The rôle of renormalized parameters in perturbative calculations is reviewed and the value for the electromagnetic coupling constant in the $\overline{\rm MS}$ renormalization scheme that is consistent with all experimental data is obtained to be $\alpha^{-1}_{\overline{\rm MS}}(M_Z^2) = 128.17$.
    Comment: 8 pages, LaTeX2e
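As a rough numerical illustration of how the three inputs $\alpha$, $G_\mu$ and $M_Z$ fix the remaining electroweak parameters, here is the tree-level relation (a standard textbook sketch, not the paper's calculation; the abstract's point is precisely that loop corrections, with their hadronic uncertainty in $\alpha$, shift this result):

```python
import math

# Tree-level relation: sin^2(theta_W) cos^2(theta_W) = pi*alpha / (sqrt(2) * G_mu * M_Z^2)
alpha = 1 / 137.035999   # fine-structure constant
G_mu  = 1.1663787e-5     # Fermi constant from muon decay, GeV^-2
M_Z   = 91.1876          # Z boson mass, GeV

A = math.pi * alpha / (math.sqrt(2) * G_mu * M_Z**2)
sin2 = (1 - math.sqrt(1 - 4 * A)) / 2   # sin^2(theta_W), smaller root
M_W = M_Z * math.sqrt(1 - sin2)         # tree-level W mass

print(f"sin^2(theta_W) = {sin2:.4f}")   # ~0.212
print(f"M_W = {M_W:.2f} GeV")           # ~80.9 GeV at tree level; radiative
# corrections lower this toward the measured value, and their dominant
# uncertainty is the hadronic contribution the abstract discusses.
```

This makes concrete why a $\pm 13$ MeV measurement of $M_W$ would be competitive: the shift induced by the hadronic uncertainty in $\alpha$ is of comparable size.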

    Kadison-Kastler stable factors

    A conjecture of Kadison and Kastler from 1972 asks whether sufficiently close operator algebras in a natural uniform sense must be small unitary perturbations of one another. For $n \geq 3$ and a free, ergodic, probability measure-preserving action of $SL_n(\mathbb{Z})$ on a standard nonatomic probability space $(X,\mu)$, write $M = (L^\infty(X,\mu) \rtimes SL_n(\mathbb{Z})) \,\bar{\otimes}\, R$, where $R$ is the hyperfinite $\mathrm{II}_1$ factor. We show that whenever $M$ is represented as a von Neumann algebra on some Hilbert space $H$ and $N \subseteq B(H)$ is sufficiently close to $M$, then there is a unitary $u$ on $H$ close to the identity operator with $uMu^* = N$. This provides the first nonamenable class of von Neumann algebras satisfying Kadison and Kastler's conjecture. We also obtain stability results for crossed products $L^\infty(X,\mu) \rtimes \Gamma$ whenever the comparison map from the bounded to usual group cohomology vanishes in degree 2 for the module $L^2(X,\mu)$. In this case, any von Neumann algebra sufficiently close to such a crossed product is necessarily isomorphic to it. In particular, this result applies when $\Gamma$ is a free group.

    Maximum likelihood estimates of pairwise rearrangement distances

    Accurate estimation of evolutionary distances between taxa is important for many phylogenetic reconstruction methods. In the case of bacteria, distances can be estimated using a range of different evolutionary models, from single nucleotide polymorphisms to large-scale genome rearrangements. In the case of sequence evolution, models (such as the Jukes-Cantor model and associated metric) have been used to correct pairwise distances. Similar correction methods for genome rearrangement processes are required to improve inference. Current attempts at correction fall into three categories: empirical computational studies, Bayesian/MCMC approaches, and combinatorial approaches. Here we introduce a maximum likelihood estimator for the inversion distance between a pair of genomes, using the group-theoretic approach to modelling inversions introduced recently. This MLE functions as a corrected distance: in particular, we show that because of the way sequences of inversions interact with each other, it is quite possible for minimal distance and MLE distance to differently order the distances of two genomes from a third. This has obvious implications for the use of minimal distance in phylogeny reconstruction. The work also tackles the above problem allowing free rotation of the genome. Generally a frame of reference is locked, and all computation made accordingly. This work incorporates the action of the dihedral group so that distance estimates are free from any a priori frame of reference.
    Comment: 21 pages, 7 figures. To appear in the Journal of Theoretical Biology
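The Jukes-Cantor correction mentioned above has a closed form for sequence data; a minimal sketch of the standard formula (for contrast with the rearrangement setting, where no such closed form exists and an MLE must be computed; this is not the paper's inversion-distance estimator):

```python
import math

def jukes_cantor_distance(p: float) -> float:
    """Correct an observed proportion p of differing sites into an
    expected number of substitutions per site under the Jukes-Cantor
    model: d = -(3/4) ln(1 - 4p/3). Only valid for p < 0.75."""
    if not 0 <= p < 0.75:
        raise ValueError("observed difference must lie in [0, 0.75)")
    return -0.75 * math.log(1 - 4 * p / 3)

# Small observed differences are barely corrected; larger ones grow
# rapidly, mirroring how minimal rearrangement distance increasingly
# underestimates the true number of events.
print(jukes_cantor_distance(0.10))  # ≈ 0.107
print(jukes_cantor_distance(0.50))  # ≈ 0.824
```

Because the correction is nonlinear, two pairs of genomes with the same ordering of raw (minimal) distances can have their corrected distances reordered, which is the phenomenon the abstract reports for the inversion MLE.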

    Jointly selecting for fibre diameter and fleece weight: A market-level assessment of the QPLU$ Merino breeding project

    The QPLU$ Merino breeding project began in the early 1990s. The aim of the project was to demonstrate the efficiency of using a selection index to achieve breeding objectives. A number of selection lines were created from three strains of Merino sheep. During the ten-year course of the project, selection of each line was undertaken using an index based on measurements of fleece weight and fibre diameter. Different emphases were placed on each trait in each selected line. This paper estimates the potential aggregate returns of the project to the Australian sheep and wool industries using an equilibrium displacement model.
    Keywords: Australian sheep and wool industries, equilibrium displacement model, cross-commodity relationships, R&D evaluation, Livestock Production/Industries, Research and Development/Tech Change/Emerging Technologies, Research Methods/Statistical Methods

    Development of a low-maintenance measurement approach to continuously estimate methane emissions: a case study

    The chemical breakdown of organic matter in landfills represents a significant source of methane gas (CH₄). Current estimates suggest that landfills are responsible for between 3% and 19% of global anthropogenic emissions. The net CH₄ emissions resulting from biogeochemical processes and their modulation by microbes in landfills are poorly constrained by imprecise knowledge of environmental constraints. The uncertainty in absolute CH₄ emissions from landfills is therefore considerable. This study investigates a new method to estimate the temporal variability of CH₄ emissions using meteorological and CH₄ concentration measurements downwind of a landfill site in Suffolk, UK from July to September 2014, taking advantage of the statistics that such a measurement approach offers versus shorter-term, but more complex and instantaneously accurate, flux snapshots. Methane emissions were calculated from CH₄ concentrations measured 700 m from the perimeter of the landfill, with observed concentrations ranging from background to 46.4 ppm. Using an atmospheric dispersion model, we estimate a mean emission flux of 709 μg m⁻² s⁻¹ over this period, with a maximum value of 6.21 mg m⁻² s⁻¹, reflecting the wide natural variability in biogeochemical and other environmental controls on net site emission. The emissions calculated suggest that meteorological conditions have an influence on the magnitude of CH₄ emissions. We also investigate the factors responsible for the large variability observed in the estimated CH₄ emissions, and suggest that the largest component arises from uncertainty in the spatial distribution of CH₄ emissions within the landfill area. The results determined using the low-maintenance approach discussed in this paper suggest that a network of cheaper, less precise CH₄ sensors could be used to measure a continuous CH₄ emission time series from a landfill site, something that is not practical using far-field approaches such as tracer release methods.
    Even though there are limitations to the approach described here, this easy, low-maintenance, low-cost method could be used by landfill operators to estimate time-averaged CH₄ emissions and their impact downwind by simultaneously monitoring plume advection and CH₄ concentrations.
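The inversion step described above (downwind concentration plus meteorology, inverted through a dispersion model to give an emission flux) can be illustrated with a textbook Gaussian plume. This is a hypothetical sketch with made-up receptor geometry and dispersion parameters, not the model or the site configuration used in the study:

```python
import math

def gaussian_plume_conc(Q, u, y, z, H, sigma_y, sigma_z):
    """Concentration (g m^-3) at crosswind offset y and height z from
    a point source of strength Q (g s^-1), wind speed u (m s^-1), and
    effective release height H (m). sigma_y, sigma_z (m) are the
    dispersion parameters evaluated at the receptor's downwind distance."""
    lateral = math.exp(-y**2 / (2 * sigma_y**2))
    vertical = (math.exp(-(z - H)**2 / (2 * sigma_z**2))
                + math.exp(-(z + H)**2 / (2 * sigma_z**2)))  # ground reflection
    return Q / (2 * math.pi * u * sigma_y * sigma_z) * lateral * vertical

# The concentration is linear in Q, so an observed excess above
# background at a known receptor inverts directly:
#   Q_est = C_obs / C(Q = 1)
C_unit = gaussian_plume_conc(1.0, u=3.0, y=0.0, z=2.0,
                             H=5.0, sigma_y=50.0, sigma_z=25.0)
C_obs = 2.0e-5   # hypothetical measured excess, g m^-3
Q_est = C_obs / C_unit   # estimated source strength, g s^-1
```

The linearity in Q is what makes a continuous, low-maintenance time series feasible: each concentration sample, paired with the meteorology of the moment, yields an emission estimate without any iterative fitting.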