
    Can LHC observe an anomaly in ttZ production?

    The cross section for ttZ production at the 7 TeV LHC has been measured. For the first time it therefore becomes possible to measure Z couplings to top quarks. Interpreting the notorious LEP1 anomaly on Z couplings to b quarks in terms of an extra-dimension model, one expects an excess, as large as a factor of 2, in the ttZ rate, which may already become significant with the 8 TeV LHC data. Other schemes which also predict deviations of these couplings are reviewed. The size of the effect and a dramatic increase in the number of top quarks with the right chirality should confirm our interpretation of the underlying mechanism. Complementary signals observable at the LHC are also briefly reviewed.
    Comment: 7 pages, 2 figures

    A Z-prime interpretation of Bd->K*mu+mu- data and consequences for high energy colliders

    In this note, I examine the possible consequences for high energy colliders of a Z-prime interpretation of the LHCb anomaly observed in the K*mu+mu- final state. Two examples are elaborated in the framework of the so-called 331 model. In the first one it is shown that LEP2 provides the tightest lower mass limit for the Z-prime boson, above 8 TeV, while in the second one the lower mass limit is set by ATLAS/CMS to about 3 TeV. It is then shown that precision measurements at the 500 GeV ILC can fully explore the underlying structure of the model by measuring the fermion final states separately: leptons, charm, beauty and top. Z-prime-Z mixing can also be substantial, leading to effects almost observable at LEP1 which could be precisely measured at GigaZ. Discovery prospects for heavy bosons and heavy fermions at the LHC are also discussed.

    Variations along the Fuchsian locus

    The main result is an explicit expression for the Pressure Metric on the Hitchin component of surface group representations into PSL(n,R) along the Fuchsian locus. The expression is in terms of a parametrization of the tangent space by holomorphic differentials, and it gives a precise relationship with the Petersson pairing. Along the way, variational formulas are established that generalize results from classical Teichmueller theory, such as Gardiner's formula, the relationship between length functions and Fenchel-Nielsen deformations, and variations of cross ratios.
    Comment: 58 pages, 1 figure

    Matched subspace detection with hypothesis dependent noise power

    We consider the problem of detecting a subspace signal in white Gaussian noise when the noise power may differ between the null hypothesis, where it is assumed to be known, and the alternative hypothesis. This situation occurs when the presence of the signal of interest (SOI) triggers an increase in the noise power. It may also be relevant when there is a mismatch between the actual SOI subspace and its presumed value, i.e., a modelling error. We derive the generalized likelihood ratio test (GLRT) for the problem at hand and contrast it with the GLRT that assumes known and equal noise power under the two hypotheses. A performance analysis is carried out and the distributions of the two test statistics are derived. From this analysis, we discuss the differences between the two detectors and explain the improved performance of the new detector. Numerical simulations attest to the validity of the analysis.
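    The baseline detector that the abstract contrasts against (known, equal noise power under both hypotheses) reduces to comparing the energy of the observation inside the presumed SOI subspace against a threshold. A minimal NumPy sketch of that classical statistic, with illustrative dimensions and a hypothetical function name (the paper's new hypothesis-dependent detector is not reproduced here):

    ```python
    import numpy as np

    def matched_subspace_glrt(y, H, sigma2):
        """Classical matched subspace statistic, assuming the noise power
        sigma2 is known and identical under both hypotheses (the baseline
        the abstract's new detector is compared against)."""
        # Orthogonal projector onto the column space of H (presumed SOI subspace)
        P = H @ np.linalg.pinv(H)
        # Energy of y inside the subspace, normalized by the noise power;
        # under H0 this is chi-squared with rank(H) degrees of freedom
        return float(y @ P @ y) / sigma2

    rng = np.random.default_rng(0)
    N, r, sigma2 = 64, 3, 1.0
    H = rng.standard_normal((N, r))                    # basis of the presumed subspace
    y0 = rng.standard_normal(N)                        # H0: white noise only
    y1 = H @ np.array([1.0, -0.5, 0.8]) + rng.standard_normal(N)  # H1: signal + noise
    print(matched_subspace_glrt(y0, H, sigma2), matched_subspace_glrt(y1, H, sigma2))
    ```

    The statistic is much larger under H1, since the projection keeps the full signal energy while discarding most of the noise.
    
    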

    The Multinomial Multiperiod Probit Model: Identification and Efficient Estimation

    In this paper we discuss parameter identification and likelihood evaluation for multinomial multiperiod Probit models. It is shown in particular that the standard autoregressive specification used in the literature can be interpreted as a latent common factor model. However, this specification is not invariant with respect to the selection of the baseline category. Hence, we propose an alternative specification which is invariant with respect to such a selection and identifies coefficients characterizing the stationary covariance matrix which are not identified in the standard approach. For likelihood evaluation, which requires high-dimensional truncated integration, we propose to use a generic procedure known as Efficient Importance Sampling (EIS). A special case of our proposed EIS algorithm is the standard GHK probability simulator. To illustrate the relative performance of both procedures we perform a set of Monte-Carlo experiments. Our results indicate substantial numerical efficiency gains of the ML estimates based on GHK-EIS relative to ML estimates obtained by using GHK.
    Keywords: Discrete choice, Importance sampling, Monte-Carlo integration, Panel data, Parameter identification, Simulated maximum likelihood
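    The GHK probability simulator that EIS generalizes can be sketched in a few lines: it Cholesky-factors the covariance and sequentially draws truncated standard normals, averaging the product of conditional probabilities to estimate a multivariate normal rectangle probability. A minimal Python sketch under standard textbook conventions (function name, draw count and the zero-mean, upper-bound-only setup are illustrative, not taken from the paper):

    ```python
    import numpy as np
    from statistics import NormalDist

    _nd = NormalDist()  # standard normal cdf / inverse cdf from the stdlib

    def ghk_probability(b, Sigma, n_draws=2000, seed=0):
        """GHK simulator for P(x_1 < b_1, ..., x_d < b_d) with x ~ N(0, Sigma)."""
        rng = np.random.default_rng(seed)
        L = np.linalg.cholesky(Sigma)   # lower-triangular factor, Sigma = L L'
        d = len(b)
        total = 0.0
        for _ in range(n_draws):
            e = np.zeros(d)
            weight = 1.0
            for j in range(d):
                # conditional truncation point given the previous draws
                t = (b[j] - L[j, :j] @ e[:j]) / L[j, j]
                p = _nd.cdf(t)          # prob. the j-th component stays below b_j
                weight *= p
                # draw e_j from a standard normal truncated above at t
                u = rng.uniform()
                e[j] = _nd.inv_cdf(max(min(u * p, 1.0 - 1e-12), 1e-12))
            total += weight
        return total / n_draws

    # Independent case: the probability factorizes, so GHK recovers Phi(0)^2
    est = ghk_probability([0.0, 0.0], np.eye(2))
    print(est)  # 0.25
    ```

    In the independent case every draw carries the same weight, so the estimator is exact; correlation is where the sequential conditioning, and hence the variance reduction that EIS further improves, matters.
    
    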

    Classical and Bayesian Analysis of Univariate and Multivariate Stochastic Volatility Models

    In this paper Efficient Importance Sampling (EIS) is used to perform a classical and Bayesian analysis of univariate and multivariate Stochastic Volatility (SV) models for financial return series. EIS provides a highly generic and very accurate procedure for the Monte Carlo (MC) evaluation of high-dimensional interdependent integrals. It can be used to carry out ML-estimation of SV models as well as simulation smoothing where the latent volatilities are sampled at once. Based on this EIS simulation smoother a Bayesian Markov Chain Monte Carlo (MCMC) posterior analysis of the parameters of SV models can be performed.
    Keywords: Dynamic Latent Variables, Markov Chain Monte Carlo, Maximum likelihood, Simulation Smoother
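    For concreteness, the canonical univariate SV model whose likelihood such methods evaluate can be simulated directly; the latent log-volatilities h_t are exactly what EIS integrates out. A minimal sketch with illustrative parameter values (the function name and defaults are not from the paper):

    ```python
    import numpy as np

    def simulate_sv(T, mu=-1.0, phi=0.95, sigma_eta=0.2, seed=0):
        """Simulate the canonical univariate SV model
            h_t = mu + phi*(h_{t-1} - mu) + sigma_eta*eta_t,  y_t = exp(h_t/2)*eps_t,
        with eta_t, eps_t iid N(0, 1); h_t is the latent log-volatility."""
        rng = np.random.default_rng(seed)
        h = np.empty(T)
        # initialize from the stationary distribution of the AR(1) log-volatility
        h[0] = mu + sigma_eta / np.sqrt(1.0 - phi**2) * rng.standard_normal()
        for t in range(1, T):
            h[t] = mu + phi * (h[t - 1] - mu) + sigma_eta * rng.standard_normal()
        # returns are Gaussian conditional on h, heavy-tailed unconditionally
        y = np.exp(h / 2.0) * rng.standard_normal(T)
        return y, h

    y, h = simulate_sv(1000)
    print(y[:3])
    ```

    Evaluating the likelihood of such a series requires integrating over all T latent h_t jointly, which is the high-dimensional interdependent integral that EIS is designed to approximate accurately.
    
    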
