    A test for the absence of aliasing or white noise in two-dimensional locally stationary wavelet processes

    Either intentionally or unintentionally, sub-sampling is a common occurrence in image processing and can lead to aliasing if the highest frequency in the underlying process exceeds the Nyquist frequency. Several techniques have been suggested to prevent aliasing from occurring (for example, applying anti-aliasing filters); however, there is little work describing methods to detect it. Recently, Eckley and Nason (Biometrika 105(4), 833–848, 2018) developed a test for the absence of aliasing and/or white noise in locally stationary wavelet processes. Following Eckley and Nason (Biometrika 105(4), 833–848, 2018), we derive the corresponding theoretical consequences of sub-sampling a two-dimensional locally stationary wavelet process and develop a procedure to test for the absence of aliasing and/or white noise confounding at a fixed point, demonstrating its effectiveness and use through appropriate simulation studies and an example. In addition, we outline some possibilities for extending these methods further, from images to videos.
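    The Nyquist mechanism behind the test is easy to see in one dimension. Below is a minimal sketch (not the paper's wavelet-based procedure): a 40 Hz tone sampled at 100 Hz is sub-sampled by a factor of two, so the new Nyquist frequency is 25 Hz and the retained samples become indistinguishable from a 10 Hz tone. All rates and frequencies here are illustrative choices.

```python
import numpy as np

fs = 100.0                          # original sampling rate (Hz)
t = np.arange(0, 1, 1 / fs)
x = np.sin(2 * np.pi * 40.0 * t)    # 40 Hz tone, below the original Nyquist of 50 Hz

x_sub = x[::2]                      # sub-sample by 2: new rate 50 Hz, new Nyquist 25 Hz
t_sub = t[::2]

# 40 Hz sampled at 50 Hz folds to |40 - 50| = 10 Hz:
# sin(2*pi*40*n/50) = -sin(2*pi*10*n/50) at every retained sample.
alias = -np.sin(2 * np.pi * 10.0 * t_sub)
print(np.allclose(x_sub, alias))    # the sub-sampled data match the 10 Hz alias exactly
```

    Once the fold has happened, no amount of post-hoc filtering can separate the two interpretations, which is why detection (rather than only prevention) matters.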

    Bayesian Wavelet Shrinkage of the Haar-Fisz Transformed Wavelet Periodogram.

    It is increasingly being realised that many real-world time series are not stationary and exhibit evolving second-order autocovariance or spectral structure. This article introduces a Bayesian approach for modelling the evolving wavelet spectrum of a locally stationary wavelet time series. Our new method works by combining the advantages of a Haar-Fisz transformed spectrum with a simple, but powerful, Bayesian wavelet shrinkage method. Our new method produces excellent and stable spectral estimates, as demonstrated on simulated data and on differenced infant electrocardiogram data. A major additional benefit of the Bayesian paradigm is that we obtain rigorous and useful credible intervals for the evolving spectral structure. We show how the Bayesian credible intervals provide extra insight into the infant electrocardiogram data.
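    The Haar-Fisz step referred to above can be sketched in a few lines. This is a minimal illustration of the classical Haar-Fisz variance-stabilising transform, not the authors' full Bayesian procedure: take Haar smooth/detail pairs level by level, replace each detail d by the Fisz ratio d/sqrt(s) (with s the corresponding smooth coefficient), then invert the Haar transform with the modified details. The function name and the power-of-two length restriction are assumptions of this sketch; the input is assumed non-negative, as a raw wavelet periodogram would be.

```python
import numpy as np

def haar_fisz(x):
    """Sketch of the Haar-Fisz variance-stabilising transform.

    Assumes len(x) is a power of two and x is non-negative
    (e.g. a raw periodogram).
    """
    x = np.asarray(x, dtype=float)
    n_levels = int(np.log2(x.size))
    s = x.copy()
    details = []
    for _ in range(n_levels):
        sm = (s[0::2] + s[1::2]) / 2.0          # Haar smooth coefficients
        d = (s[0::2] - s[1::2]) / 2.0           # Haar detail coefficients
        # Fisz ratio d / sqrt(sm); define it as 0 where the smooth is 0
        f = np.divide(d, np.sqrt(sm), out=np.zeros_like(d), where=sm > 0)
        details.append(f)
        s = sm
    # invert the Haar transform using the Fisz-modified details
    for f in reversed(details):
        out = np.empty(2 * s.size)
        out[0::2] = s + f
        out[1::2] = s - f
        s = out
    return s
```

    The ratio brings the coefficients close to constant variance, so standard Gaussian wavelet shrinkage (here, the Bayesian variant) can then be applied to the transformed spectrum.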

    NLL+NNLO predictions for jet-veto efficiencies in Higgs-boson and Drell-Yan production

    Using the technology of the CAESAR approach to resummation, we examine the jet-veto efficiency in Higgs-boson and Drell-Yan production at hadron colliders and show that at next-to-leading logarithmic (NLL) accuracy the resummation reduces to just a Sudakov form factor. Matching with NNLO calculations results in stable predictions for the case of Drell-Yan production, but reveals substantial uncertainties in gluon-fusion Higgs production, connected in part with the poor behaviour of the perturbative series for the total cross section. We compare our results to those from POWHEG with and without reweighting by HqT, as used experimentally, and observe acceptable agreement. In an appendix we derive the part of the NNLL resummation corrections associated with the radius dependence of the jet algorithm. Comment: 30 pages, 8 figures; v2 as published in JHEP.

    Strong interface-induced spin-orbit coupling in graphene on WS2

    Interfacial interactions allow the electronic properties of graphene to be modified, as recently demonstrated by the appearance of satellite Dirac cones in the band structure of graphene on hexagonal boron nitride (hBN) substrates. Ongoing research strives to explore interfacial interactions in a broader class of materials in order to engineer targeted electronic properties. Here we show that at an interface with a tungsten disulfide (WS2) substrate, the strength of the spin-orbit interaction (SOI) in graphene is very strongly enhanced. The induced SOI leads to a pronounced low-temperature weak anti-localization (WAL) effect, from which we determine the spin-relaxation time. We find that the spin-relaxation time in graphene is two to three orders of magnitude smaller on WS2 than on SiO2 or hBN, and that it is comparable to the intervalley scattering time. To interpret our findings we have performed first-principles electronic structure calculations, which both confirm that carriers in graphene-on-WS2 experience a strong SOI and allow us to extract a spin-dependent low-energy effective Hamiltonian. Our analysis further shows that the use of WS2 substrates opens a possible new route to access topological states of matter in graphene-based systems. Comment: originally submitted version in compliance with editorial guidelines; final version with expanded discussion of the relation between theory and experiments to be published in Nature Communications.

    Normalization in MALDI-TOF imaging datasets of proteins: practical considerations

    Normalization is critically important for the proper interpretation of matrix-assisted laser desorption/ionization (MALDI) imaging datasets. The effects of the commonly used normalization techniques based on total ion count (TIC) or vector norm normalization are significant, and they are frequently beneficial. In certain cases, however, these normalization algorithms may produce misleading results and possibly lead to wrong conclusions, e.g. regarding potential biomarker distributions. This is typical for tissues in which signals of prominent abundance are present in confined areas, such as insulin in the pancreas or β-amyloid peptides in the brain. In this work, we investigated whether normalization can be improved if dominant signals are excluded from the calculation. Because manual interaction with the data (e.g., defining the abundant signals) is not desired for routine analysis, we investigated two alternatives: normalization to the noise level of the spectrum or to the median of the signal intensities in the spectrum. Normalization on the median and the noise level was found to be significantly more robust against artifact generation than normalization on the TIC. Therefore, we propose to include these normalization methods in the standard “toolbox” of MALDI imaging for reliable results under conditions of automation.
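    The failure mode described above is easy to reproduce on toy data. In this sketch (illustrative numbers only, not the authors' pipeline), two spectra share an identical background signal, but one also contains a single very abundant peak, standing in for insulin in pancreatic tissue. TIC normalisation suppresses the shared background in the spectrum with the dominant peak, while median normalisation leaves it comparable.

```python
import numpy as np

spec_plain = np.full(100, 10.0)     # spectrum without the dominant signal
spec_dominant = spec_plain.copy()
spec_dominant[0] = 10000.0          # one very abundant signal in a confined area

def tic_normalise(s):
    """Divide by the total ion count (sum of all intensities)."""
    return s / s.sum()

def median_normalise(s):
    """Divide by the median intensity -- robust to a few huge peaks."""
    return s / np.median(s)

# A background channel (index 1, intensity 10) should stay comparable
# between the two spectra after normalisation:
print(tic_normalise(spec_plain)[1], tic_normalise(spec_dominant)[1])
print(median_normalise(spec_plain)[1], median_normalise(spec_dominant)[1])
```

    Under TIC normalisation the dominant peak inflates the denominator roughly elevenfold, so the same background signal comes out an order of magnitude weaker in one spectrum than the other; the median is unmoved by a single outlying channel, which is the robustness the abstract exploits.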

    Single bottom quark production in kT-factorisation

    We present a study within the kT-factorisation scheme of single bottom quark production at the LHC. In particular, we calculate the rapidity and transverse momentum differential distributions for single bottom quark/anti-quark production. In our setup, the unintegrated gluon density is obtained from the NLx BFKL Green function, whereas we include mass effects in the Lx heavy quark jet vertex. We compare our results to the corresponding distributions predicted by the usual collinear factorisation scheme. The latter were produced with Pythia 8.1.