
    Maturation of molybdoenzymes and its influence on the pathogenesis of non-typeable Haemophilus influenzae

    © 2015 Dhouib, Pg Othman, Essilfie, Hansbro, Hanson, McEwan and Kappler. Mononuclear molybdenum enzymes of the dimethylsulfoxide (DMSO) reductase family occur exclusively in prokaryotes, and a loss of some of these enzymes has been linked to a loss of bacterial virulence in several cases. The MobA protein catalyzes the final step in the synthesis of the molybdenum guanine dinucleotide (MGD) cofactor that is exclusive to enzymes of the DMSO reductase family. MobA has been proposed as a potential target for control of virulence, since its inhibition would affect the activities of all molybdoenzymes dependent upon MGD. Here, we have studied the phenotype of a mobA mutant of the host-adapted human pathogen Haemophilus influenzae. H. influenzae causes and contributes to a variety of acute and chronic diseases of the respiratory tract, and several enzymes of the DMSO reductase family are conserved and highly expressed in this bacterium. The mobA mutation caused a significant decrease in the activities of all Mo-enzymes present, and also resulted in a small defect in anaerobic growth. However, compared to the wild-type, we did not detect a defect in in vitro biofilm formation or in invasion of and adherence to human epithelial cells in tissue culture. In a murine in vivo model, the mobA mutant showed only a mild attenuation compared to the wild-type. In summary, our data show that MobA is essential for the activities of molybdenum enzymes, but does not appear to affect the fitness of H. influenzae. These results suggest that MobA is unlikely to be a useful target for antimicrobials, at least for the purpose of treating H. influenzae infections.

    How many crowdsourced workers should a requester hire?

    Recent years have seen an increased interest in crowdsourcing as a way of obtaining information from a potentially large group of workers at a reduced cost. The crowdsourcing process, as we consider it in this paper, is as follows: a requester hires a number of workers to work on a set of similar tasks. After completing the tasks, each worker reports back outputs. The requester then aggregates the reported outputs to obtain aggregate outputs. A crucial question that arises during this process is: how many crowd workers should a requester hire? In this paper, we investigate from an empirical perspective the optimal number of workers a requester should hire when crowdsourcing tasks, with a particular focus on the crowdsourcing platform Amazon Mechanical Turk. Specifically, we report the results of three studies involving different tasks and payment schemes. We find that both the expected error in the aggregate outputs and the risk of a poor combination of workers decrease as the number of workers increases. Surprisingly, we find that the optimal number of workers a requester should hire for each task is around 10 to 11, regardless of the underlying task and payment scheme. To derive this result, we employ a principled analysis based on bootstrapping and segmented linear regression. Besides this result, we also find that overall top-performing workers are more consistent across multiple tasks than other workers. Our results thus contribute to a better understanding of, and provide new insights into, how to design more effective crowdsourcing processes.
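
    The sketch below is a minimal illustration (not the authors' code) of the kind of bootstrap analysis the abstract describes: synthetic workers with an assumed 65% accuracy answer binary tasks, answers are aggregated by majority vote (one possible aggregation rule; the paper's three studies use different tasks and schemes), and the expected aggregation error is estimated as a function of crowd size. All parameters are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)
truth = rng.integers(0, 2, size=200)                 # hypothetical ground truth for 200 binary tasks
correct = rng.random((50, 200)) < 0.65               # 50 simulated workers, each ~65% accurate (assumed)
responses = np.where(correct, truth, 1 - truth)      # each worker's reported label per task

def bootstrap_error(n_workers, trials=1000):
    """Expected majority-vote error when hiring n_workers drawn with replacement."""
    errors = []
    for _ in range(trials):
        idx = rng.choice(responses.shape[0], size=n_workers, replace=True)
        votes = responses[idx].mean(axis=0) >= 0.5   # majority vote per task
        errors.append(np.mean(votes != truth))
    return np.mean(errors)

for n in (1, 3, 5, 7, 9, 11, 15, 21):
    print(f"{n:2d} workers -> expected error {bootstrap_error(n):.3f}")
```

    In this toy setting the error curve flattens quickly with crowd size, which is the qualitative diminishing-returns behaviour that the segmented linear regression in the paper formalises when locating the breakpoint around 10 to 11 workers.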

    Metabolic flexibility as a major predictor of spatial distribution in microbial communities

    A better understanding of the ecology of microbes and their role in the global ecosystem could be achieved if traditional ecological theories could be applied to microbes. In ecology, organisms are defined as specialists or generalists according to the breadth of their niche. Spatial distribution is often used as a proxy measure of niche breadth; generalists have broad niches and a wide spatial distribution, and specialists a narrow niche and spatial distribution. Previous studies suggest that microbial distribution patterns are contrary to this idea: a microbial generalist genus (Desulfobulbus) has a limited spatial distribution while a specialist genus (Methanosaeta) has a cosmopolitan distribution. Therefore, we hypothesise that this counter-intuitive distribution within generalist and specialist microbial genera is a common microbial characteristic. Using molecular fingerprinting, the distributions of four microbial genera, two generalists (Desulfobulbus and the methanogenic archaeon Methanosarcina) and two specialists (Methanosaeta and the sulfate-reducing bacterium Desulfobacter), were analysed in sediment samples taken along a UK estuary. Detected genotypes of both generalist genera showed a distinct spatial distribution, significantly correlated with geographic distance between sites. Genotypes of both specialist genera showed no significant differential spatial distribution. These data support the hypothesis that the spatial distribution of specialist and generalist microbes does not match that seen with specialist and generalist large organisms. It may be that generalist microbes, while having a wider potential niche, are constrained, possibly by intrageneric competition, to exploit only a small part of that potential niche, while specialists, with far fewer constraints on their niche, are capable of filling their potential niche more effectively, perhaps by avoiding intrageneric competition. We suggest that these counter-intuitive distribution patterns may be a common feature of microbes in general and represent a distinct microbial principle in ecology, which poses a real challenge if we are to develop a truly inclusive ecology.
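
    As a hedged illustration of testing whether genotype profiles become more dissimilar with geographic distance along an estuary, the sketch below runs a Mantel-style permutation test on synthetic fingerprint data. The abstract does not name the statistics actually used, so this only stands in for the general idea; the site positions, fingerprint profiles and spatial gradient are all invented.

```python
import numpy as np

rng = np.random.default_rng(3)
n_sites = 10
positions = np.sort(rng.uniform(0.0, 20.0, n_sites))        # km along a hypothetical estuary
# synthetic genotype fingerprints with a built-in spatial gradient, for illustration only
profiles = rng.random((n_sites, 30)) + 0.05 * positions[:, None]

geo = np.abs(positions[:, None] - positions[None, :])        # pairwise geographic distances
comm = np.linalg.norm(profiles[:, None, :] - profiles[None, :, :], axis=2)  # pairwise dissimilarity

iu = np.triu_indices(n_sites, k=1)                           # upper-triangle site pairs
observed_r = np.corrcoef(geo[iu], comm[iu])[0, 1]

permuted_r = []
for _ in range(5000):                                        # permute site labels of one matrix
    p = rng.permutation(n_sites)
    permuted_r.append(np.corrcoef(geo[p][:, p][iu], comm[iu])[0, 1])
p_value = np.mean(np.abs(permuted_r) >= abs(observed_r))
print(f"Mantel-style r = {observed_r:.2f}, permutation p = {p_value:.3f}")
```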

    Gauge symmetry and W-algebra in higher derivative systems

    The problem of gauge symmetry in higher derivative Lagrangian systems is discussed from a Hamiltonian point of view. The number of independent gauge parameters is shown to be, in general, less than the number of independent primary first class constraints, thereby distinguishing such systems from conventional first order systems. Different models have been considered as illustrative examples. In particular, we show a direct connection between the gauge symmetry and the W-algebra for the rigid relativistic particle. Comment: 1+22 pages, 1 figure, LaTeX, v2; title changed, considerably expanded version with new results, to appear in JHEP

    Neural Network Parameterizations of Electromagnetic Nucleon Form Factors

    The electromagnetic nucleon form-factor data are studied with artificial feed-forward neural networks. As a result, unbiased, model-independent form-factor parametrizations are evaluated together with their uncertainties. The Bayesian approach to neural networks is adapted to a chi2-like error function and applied to the data analysis. A sequence of feed-forward neural networks with one hidden layer of units is considered; each network represents a particular form-factor parametrization. The so-called evidence (the measure of how much the data favor a given statistical model) is computed within the Bayesian framework and used to determine the best form-factor parametrization. Comment: The revised version is divided into 4 sections. The discussion of the prior assumptions is added. The manuscript contains 4 new figures and 2 new tables (32 pages, 15 figures, 2 tables).
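
    The following is a minimal sketch, under stated assumptions, of the core idea: a feed-forward network with one hidden layer used as a form-factor parametrization and fitted by minimising a chi2-like error. The pseudo-data (a dipole-like form factor on a synthetic Q² grid), the network size, and the use of scipy's generic Nelder-Mead minimiser in place of the paper's Bayesian training and evidence comparison are all illustrative choices, not taken from the paper.

```python
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(1)
Q2 = np.linspace(0.1, 10.0, 40)                      # synthetic Q^2 grid (GeV^2), illustrative
sigma = np.full_like(Q2, 0.02)                       # assumed uniform uncertainties
G_data = (1.0 + Q2 / 0.71) ** -2 + rng.normal(0.0, sigma)   # dipole-like pseudo-data

H = 4                                                # one hidden layer with H units

def net(params, x):
    """Feed-forward network with one tanh hidden layer and a linear output."""
    w1, b1, w2, b2 = np.split(params, [H, 2 * H, 3 * H])
    return np.tanh(np.outer(x, w1) + b1) @ w2 + b2[0]

def chi2(params):
    """chi2-like error between the network parametrization and the data."""
    return np.sum(((net(params, Q2) - G_data) / sigma) ** 2)

start = rng.normal(0.0, 0.5, 3 * H + 1)
res = minimize(chi2, start, method="Nelder-Mead",
               options={"maxiter": 20000, "maxfev": 20000})
print("chi2 per degree of freedom:", res.fun / (len(Q2) - len(res.x)))
```

    In the paper, competing network sizes are compared through the Bayesian evidence; only the chi2 fit of a single parametrization is sketched here.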

    Systematic design for trait introgression projects

    We demonstrate an innovative approach for designing Trait Introgression (TI) projects based on optimization principles from Operations Research. If the designs of TI projects are based on clear and measurable objectives, they can be expressed as mathematical models with decision variables and constraints, which in turn yield Pareto optimality plots for any selection strategy. The Pareto plots can be used to make rational decisions concerning the trade-offs between maximizing the probability of success while minimizing costs and time. The systematic rigor associated with a cost, time and probability of success (CTP) framework is well suited to designing TI projects that require dynamic decision making. The CTP framework also revealed that previously identified 'best' strategies can be improved to be at least twice as effective without increasing time or expenses.
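
    As a hedged illustration of the Pareto idea only (not the paper's model), the sketch below filters a set of hypothetical TI strategies, each scored by cost, time and probability of success, down to the Pareto-optimal ones. All numbers are synthetic.

```python
import numpy as np

rng = np.random.default_rng(2)
# Hypothetical strategies; columns: cost (currency units), time (generations), P(success)
strategies = np.column_stack([
    rng.uniform(1e4, 1e5, 200),
    rng.integers(4, 13, 200).astype(float),
    rng.uniform(0.2, 0.99, 200),
])

def pareto_front(points):
    """Return the points not dominated on (lower cost, lower time, higher P(success))."""
    keep = []
    for i, p in enumerate(points):
        at_least_as_good = (points[:, 0] <= p[0]) & (points[:, 1] <= p[1]) & (points[:, 2] >= p[2])
        strictly_different = np.any(points != p, axis=1)
        if not np.any(at_least_as_good & strictly_different):
            keep.append(i)
    return points[keep]

front = pareto_front(strategies)
print(f"{len(front)} Pareto-optimal strategies out of {len(strategies)}")
```

    Decision making then reduces to trading off among the frontier points, which is the comparison the CTP framework makes explicit.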

    Measurement of triple gauge boson couplings from W⁺W⁻ production at LEP energies up to 189 GeV

    A measurement of triple gauge boson couplings is presented, based on W-pair data recorded by the OPAL detector at LEP during 1998 at a centre-of-mass energy of 189 GeV with an integrated luminosity of 183 pb⁻¹. After combining with our previous measurements at centre-of-mass energies of 161–183 GeV, we obtain κ = 0.97_{-0.16}^{+0.20}, g_{1}^{z} = 0.991_{-0.057}^{+0.060} and λ = -0.110_{-0.055}^{+0.058}, where the errors include both statistical and systematic uncertainties and each coupling is determined by setting the other two couplings to their Standard Model values. These results are consistent with the Standard Model expectations.

    Photonic quantum technologies

    The first quantum technology, which harnesses uniquely quantum mechanical effects for its core operation, has arrived in the form of commercially available quantum key distribution systems that achieve enhanced security by encoding information in photons such that information gained by an eavesdropper can be detected. Anticipated future quantum technologies include large-scale secure networks, enhanced measurement and lithography, and quantum information processors, promising exponentially greater computation power for particular tasks. Photonics is destined for a central role in such technologies owing to the need for high-speed transmission and the outstanding low-noise properties of photons. These technologies may use single photons or quantum states of bright laser beams, or both, and will undoubtedly apply and drive state-of-the-art developments in photonics.

    A Measurement of Rb using a Double Tagging Method

    The fraction of Z → bb̄ events in hadronic Z decays has been measured by the OPAL experiment using the data collected at LEP between 1992 and 1995. The Z → bb̄ decays were tagged using displaced secondary vertices, and high momentum electrons and muons. Systematic uncertainties were reduced by measuring the b-tagging efficiency using a double tagging technique. Efficiency correlations between opposite hemispheres of an event are small, and are well understood through comparisons between real and simulated data samples. A value of Rb = 0.2178 ± 0.0011 ± 0.0013 was obtained, where the first error is statistical and the second systematic. The uncertainty on Rc, the fraction of Z → cc̄ events in hadronic Z decays, is not included in the errors. The dependence on Rc is ΔRb/Rb = -0.056 × ΔRc/Rc, where ΔRc is the deviation of Rc from the value 0.172 predicted by the Standard Model. The result for Rb agrees with the value of 0.2155 ± 0.0003 predicted by the Standard Model. Comment: 42 pages, LaTeX, 14 eps figures included, submitted to European Physical Journal
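
    Schematically, and in generic notation not copied from the paper, the double-tagging idea is that the single-tag and double-tag rates give two relations in the two main unknowns, Rb and the b-tagging efficiency ε_b, so that ε_b can be extracted from the data itself rather than from simulation:

```latex
% Schematic double-tag relations; tags from lighter flavours and hemisphere
% correlations (which the analysis does treat) are omitted in this sketch.
\frac{N_t}{2N_{\mathrm{had}}} \simeq \varepsilon_b R_b , \qquad
\frac{N_{tt}}{N_{\mathrm{had}}} \simeq \varepsilon_b^{2} R_b
\quad\Longrightarrow\quad
\varepsilon_b \simeq \frac{2 N_{tt}}{N_t}, \qquad
R_b \simeq \frac{N_t^{2}}{4\, N_{tt}\, N_{\mathrm{had}}} .
```

    Here N_t is the number of tagged hemispheres, N_tt the number of events with both hemispheres tagged, and N_had the number of hadronic Z decays; the quoted systematic uncertainty then depends mainly on the small residual backgrounds and hemisphere correlations.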

    First observations of separated atmospheric ν_μ and ν̄_μ events in the MINOS detector

    The complete 5.4 kton MINOS far detector has been taking data since the beginning of August 2003 at a depth of 2070 meters water-equivalent in the Soudan mine, Minnesota. This paper presents the first MINOS observations of ν_μ and ν̄_μ charged-current atmospheric neutrino interactions, based on an exposure of 418 days. The ratio of upward- to downward-going events in the data is compared to the Monte Carlo expectation in the absence of neutrino oscillations, giving R_{up/down}^{data}/R_{up/down}^{MC} = 0.62_{-0.14}^{+0.19}(stat.) ± 0.02(sys.). An extended maximum likelihood analysis of the observed L/E distributions excludes the null hypothesis of no neutrino oscillations at the 98% confidence level. Using the curvature of the observed muons in the 1.3 T MINOS magnetic field, ν_μ and ν̄_μ interactions are separated. The ratio of ν̄_μ to ν_μ events in the data is compared to the Monte Carlo expectation assuming neutrinos and antineutrinos oscillate in the same manner, giving R_{ν̄_μ/ν_μ}^{data}/R_{ν̄_μ/ν_μ}^{MC} = 0.96_{-0.27}^{+0.38}(stat.) ± 0.15(sys.), where the errors are the statistical and systematic uncertainties. Although the statistics are limited, this is the first direct observation of atmospheric neutrino interactions separately for ν_μ and ν̄_μ.