
    Developments in Rare Kaon Decay Physics

    We review the current status of the field of rare kaon decays. The study of rare kaon decays has played a key role in the development of the standard model, and the field continues to have significant impact. The two areas of greatest import are the search for physics beyond the standard model and the determination of fundamental standard-model parameters. Due to the exquisite sensitivity of rare kaon decay experiments, searches for new physics can probe very high mass scales. Studies of the K -> pi nu nubar modes in particular, where the first event has recently been seen, will permit tests of the standard-model picture of quark mixing and CP violation. Comment: One major revision to the text is the branching ratio of KL->ppg, based on a new result from KTeV. Several references were updated, with minor modifications to the text. A total of 48 pages, with 28 figures, in LaTeX; to be published in the Annual Review of Nuclear and Particle Science, Vol. 50, December 2000

    Efficacy and safety of ablation for people with non-paroxysmal atrial fibrillation.

    BACKGROUND: The optimal rhythm management strategy for people with non-paroxysmal (persistent or long-standing persistent) atrial fibrillation is currently not well defined. Antiarrhythmic drugs have been the mainstay of therapy, but recently, in people who have not responded to antiarrhythmic drugs, ablation (catheter and surgical) has emerged as an alternative for maintaining sinus rhythm and avoiding long-term atrial fibrillation complications. However, evidence from randomised trials about the efficacy and safety of ablation in non-paroxysmal atrial fibrillation is limited. OBJECTIVES: To determine the efficacy and safety of ablation (catheter and surgical) in people with non-paroxysmal (persistent or long-standing persistent) atrial fibrillation compared to antiarrhythmic drugs. SEARCH METHODS: We searched the Cochrane Central Register of Controlled Trials (CENTRAL), MEDLINE Ovid, Embase Ovid, conference abstracts, clinical trial registries, and the Health Technology Assessment Database, from their inception to 1 April 2016, with no language restrictions. SELECTION CRITERIA: We included randomised trials evaluating the effect of radiofrequency catheter ablation (RFCA) or surgical ablation compared with antiarrhythmic drugs in adults with non-paroxysmal atrial fibrillation, regardless of any concomitant underlying heart disease, with at least 12 months of follow-up. DATA COLLECTION AND ANALYSIS: Two review authors independently selected studies and extracted data. We evaluated risk of bias using the Cochrane 'Risk of bias' tool. We calculated risk ratios (RRs) for dichotomous data with 95% confidence intervals (CIs), using a fixed-effect model when heterogeneity was low (I² ≤ 40%) and a random-effects model when heterogeneity was moderate or substantial (I² > 40%). Using the GRADE approach, we evaluated the quality of the evidence and used the GRADE profiler (GRADEpro) to import data from Review Manager 5 to create 'Summary of findings' tables. MAIN RESULTS: We included three randomised trials with 261 participants (mean age: 60 years) comparing RFCA (159 participants) to antiarrhythmic drugs (102 participants) for non-paroxysmal atrial fibrillation. We generally assessed the included studies as having low or unclear risk of bias across multiple domains, with reported outcomes generally lacking precision due to low event rates. Evidence showed that RFCA was superior to antiarrhythmic drugs in achieving freedom from atrial arrhythmias (RR 1.84, 95% CI 1.17 to 2.88; 3 studies, 261 participants; low-quality evidence), reducing the need for cardioversion (RR 0.62, 95% CI 0.47 to 0.82; 3 studies, 261 participants; moderate-quality evidence), and reducing cardiac-related hospitalisation (RR 0.27, 95% CI 0.10 to 0.72; 2 studies, 216 participants; low-quality evidence) at 12 months of follow-up. There was substantial uncertainty surrounding the effect of RFCA on significant bradycardia (or need for a pacemaker) (RR 0.20, 95% CI 0.02 to 1.63; 3 studies, 261 participants; low-quality evidence), periprocedural complications, and other safety outcomes (RR 0.94, 95% CI 0.16 to 5.68; 3 studies, 261 participants; very low-quality evidence). AUTHORS' CONCLUSIONS: In people with non-paroxysmal atrial fibrillation, evidence suggests that RFCA is superior to antiarrhythmic drugs in achieving freedom from atrial arrhythmias, reducing the need for cardioversion, and reducing cardiac-related hospitalisations. There was uncertainty surrounding the effect of RFCA on significant bradycardia (or need for a pacemaker), periprocedural complications, and other safety outcomes. Evidence should be interpreted with caution, as event rates were low and the quality of evidence ranged from moderate to very low.
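    As a rough illustration of the effect measure quoted above, the sketch below computes a per-trial risk ratio with its 95% confidence interval and pools trials with a generic inverse-variance fixed-effect model. It is only a minimal Python sketch with made-up counts, not the Mantel-Haenszel computation performed in Review Manager 5, and all numbers are illustrative.

    import numpy as np

    def risk_ratio(events_t, n_t, events_c, n_c):
        """Risk ratio and 95% CI for a single trial (treatment vs control arm)."""
        rr = (events_t / n_t) / (events_c / n_c)
        se_log = np.sqrt(1/events_t - 1/n_t + 1/events_c - 1/n_c)
        lo, hi = np.exp(np.log(rr) + np.array([-1.96, 1.96]) * se_log)
        return rr, lo, hi

    def pooled_rr_fixed_effect(trials):
        """Inverse-variance fixed-effect pooling of per-trial log risk ratios."""
        log_rrs, weights = [], []
        for a, n1, c, n2 in trials:
            log_rrs.append(np.log((a / n1) / (c / n2)))
            se = np.sqrt(1/a - 1/n1 + 1/c - 1/n2)
            weights.append(1.0 / se**2)
        log_rrs, weights = np.array(log_rrs), np.array(weights)
        pooled = np.sum(weights * log_rrs) / np.sum(weights)
        se_pooled = 1.0 / np.sqrt(np.sum(weights))
        return tuple(np.exp([pooled, pooled - 1.96 * se_pooled, pooled + 1.96 * se_pooled]))

    # Hypothetical counts only: (events_ablation, n_ablation, events_drugs, n_drugs) per trial
    print(risk_ratio(40, 80, 20, 75))
    print(pooled_rr_fixed_effect([(40, 80, 20, 75), (35, 50, 18, 47), (20, 29, 10, 27)]))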

    The s ---> d gamma decay in and beyond the Standard Model

    The New Physics sensitivity of the s ---> d gamma transition and its accessibility through hadronic processes are thoroughly investigated. Firstly, the Standard Model predictions for the direct CP-violating observables in radiative K decays are systematically improved. In addition, the magnetic contribution to epsilon prime is estimated and found subleading, even in the presence of New Physics, and a new strategy to resolve its electroweak versus QCD penguin fraction is identified. Secondly, the signatures of a series of New Physics scenarios, characterized as model-independently as possible in terms of their underlying dynamics, are investigated by combining the information from all the FCNC transitions in the s ---> d sector. Comment: 54 pages, 14 eps figures

    Spatiotemporal PET reconstruction using ML-EM with learned diffeomorphic deformation

    Patient movement in emission tomography degrades reconstruction quality because of motion blur. Gating the data improves the situation somewhat: each gate contains a movement phase which is approximately stationary. A standard method is to use only the data from a few gates, with little movement between them. However, the corresponding loss of data entails an increase of noise. Motion correction algorithms have been implemented to take into account all the gated data, but they do not scale well, especially not in 3D. We propose a novel motion correction algorithm which addresses the scalability issue. Our approach is to combine an enhanced ML-EM algorithm with deep-learning-based movement registration. The training is unsupervised and uses artificial data. We expect this approach to scale very well to higher resolutions and to 3D, as the overall cost of our algorithm is only marginally greater than that of a standard ML-EM algorithm. We show that we can significantly decrease the noise corresponding to a limited number of gates.
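    For reference, the multiplicative ML-EM update that the proposed method builds on can be written in a few lines. The sketch below is a plain (non-gated, non-motion-corrected) Python version with a random matrix standing in for the scanner's projector, so the names and sizes are illustrative only, not the paper's enhanced algorithm.

    import numpy as np

    def ml_em(A, y, n_iter=50, eps=1e-12):
        """Plain ML-EM reconstruction: A is the (bins x voxels) system matrix, y the measured counts."""
        x = np.ones(A.shape[1])              # uniform initial image
        sens = A.sum(axis=0) + eps           # sensitivity image A^T 1
        for _ in range(n_iter):
            proj = A @ x + eps               # forward projection of the current estimate
            x *= (A.T @ (y / proj)) / sens   # multiplicative EM update
        return x

    # Toy usage with Poisson-distributed counts
    rng = np.random.default_rng(0)
    A = rng.random((200, 64))
    y = rng.poisson(A @ rng.random(64))
    x_rec = ml_em(A, y)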

    On single and double soft behaviors in NLSM

    In this paper, we study the single and double soft behaviors of tree-level off-shell currents and on-shell amplitudes in the nonlinear sigma model (NLSM). We first propose and prove the leading soft behavior of the tree-level currents with a single soft particle. In the on-shell limit, this single soft emission becomes Adler's zero. Then we establish the leading and sub-leading soft behaviors of tree-level currents with two adjacent soft particles. With a careful analysis of the on-shell limit, we obtain the double soft behaviors of on-shell amplitudes where the two soft particles are adjacent to each other. By applying the Kleiss-Kuijf (KK) relation, we further obtain the leading and sub-leading behaviors of amplitudes with two nonadjacent soft particles. Comment: 41 pages, 6 tables, 9 figures; minor revisions, more content on the nonadjacent double soft limit, updated references
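    Adler's zero mentioned above can be stated compactly; the LaTeX snippet below is a schematic statement of the single soft limit of an on-shell tree amplitude, with tau scaling the soft momentum (the notation is chosen here for illustration, not taken from the paper).

    % Adler's zero: the on-shell NLSM tree amplitude vanishes when one momentum is taken soft
    \[
      \lim_{\tau \to 0} M_n(p_1, \dots, \tau p_i, \dots, p_n) = 0,
    \]
    % i.e. the amplitude is of order \tau as the i-th Goldstone momentum becomes soft.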

    Towards the development of an EIT-based stretchable sensor for multi-touch industrial human-computer interaction systems

    In human-computer interaction studies, an interaction is often considered as information, or as discrete internal states of an individual, that can be transmitted in a loss-free manner from people to computing interfaces (or robotic interfaces) and vice versa. This project aims to investigate processes capable of communicating and cooperating by adjusting their schedules to match the evolving execution circumstances, in a way that maximises the quality of their joint activities. By enabling human-computer interactions, the process will emerge as a framework based on the concept of expectancy, demand, and need of the human and computer together, for understanding the interplay between people and computers. The idea of this work is to utilise touch feedback from humans as a channel for communication thanks to an artificial sensitive skin made of a thin, flexible, and stretchable material acting as a transducer. As a proof of concept, we demonstrate that the first prototype of our artificial sensitive skin can detect surface contacts and show their locations with an image reconstructing the internal electrical conductivity of the sensor.
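    The abstract does not specify the reconstruction algorithm used for the conductivity image. As a minimal sketch, one common choice is a one-step Tikhonov-regularised linearised EIT solve of boundary voltage changes, shown below in Python with a random matrix standing in for the sensor's sensitivity (Jacobian) matrix; all sizes and names are illustrative assumptions.

    import numpy as np

    def linearized_eit(J, dv, lam=1e-2):
        """One-step Tikhonov-regularised estimate of the conductivity change.

        J  : (n_measurements, n_pixels) sensitivity matrix of the electrode model
        dv : (n_measurements,) boundary voltage change relative to a touch-free baseline
        """
        n = J.shape[1]
        return np.linalg.solve(J.T @ J + lam * np.eye(n), J.T @ dv)

    # Toy usage: a single contact on a 400-pixel mesh, 208 boundary measurements
    rng = np.random.default_rng(1)
    J = rng.standard_normal((208, 400))
    true_change = np.zeros(400)
    true_change[150] = 1.0
    dv = J @ true_change + 0.01 * rng.standard_normal(208)
    sigma_change = linearized_eit(J, dv)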

    The persistence landscape and some of its properties

    Persistence landscapes map persistence diagrams into a function space, which may often be taken to be a Banach space or even a Hilbert space. In the latter case, it is a feature map and there is an associated kernel. The main advantage of this summary is that it allows one to apply tools from statistics and machine learning. Furthermore, the mapping from persistence diagrams to persistence landscapes is stable and invertible. We introduce a weighted version of the persistence landscape and define a one-parameter family of Poisson-weighted persistence landscape kernels that may be useful for learning. We also demonstrate some additional properties of the persistence landscape. First, the persistence landscape may be viewed as a tropical rational function. Second, in many cases it is possible to exactly reconstruct all of the component persistence diagrams from an average persistence landscape. It follows that the persistence landscape kernel is characteristic for certain generic empirical measures. Finally, the persistence landscape distance may be arbitrarily small compared to the interleaving distance. Comment: 18 pages, to appear in the Proceedings of the 2018 Abel Symposium
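    A short sketch of the standard (unweighted) persistence landscape may help fix notation: each diagram point (b, d) contributes a tent function max(0, min(t - b, d - t)), and the k-th landscape function is the k-th largest tent value at each t. The Python below evaluates the first few landscape functions on a grid; the weighted and Poisson-weighted kernels introduced in the paper are not shown.

    import numpy as np

    def persistence_landscape(diagram, ts, k_max=3):
        """Evaluate landscape functions lambda_1..lambda_k_max on the grid ts."""
        diagram = np.asarray(diagram, dtype=float)
        b = diagram[:, 0][:, None]
        d = diagram[:, 1][:, None]
        # Tent function of each diagram point, evaluated on the grid
        tents = np.maximum(0.0, np.minimum(ts[None, :] - b, d - ts[None, :]))
        tents_sorted = -np.sort(-tents, axis=0)    # k-th largest across diagram points
        out = np.zeros((k_max, len(ts)))
        k = min(k_max, tents.shape[0])
        out[:k] = tents_sorted[:k]
        return out

    ts = np.linspace(0.0, 5.0, 101)
    pl = persistence_landscape([(0.0, 4.0), (1.0, 3.0), (2.0, 2.5)], ts)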

    Determinants of a healthy lifestyle and use of preventive screening in Canada

    BACKGROUND: This study explores the associations of individual characteristics such as income and education with health behaviours and utilization of preventive screening. METHODS: Data from the Canadian National Population Health Survey (NPHS) 1998–9 were used. Independent variables were income, education, age, sex, marital status, body mass index, urban/rural residence and access to a regular physician. Dependent variables included smoking, excessive alcohol use, physical activity, blood pressure checks, mammography in the past year and Pap smear in the past 3 years. Logistic regression models were developed for each dependent variable. RESULTS: 13,756 persons 20 years of age and older completed the health portion of the NPHS. In general, higher levels of income were associated with healthier behaviours, as were higher levels of education, although there were exceptions to both. The results for age and gender also varied depending on the outcome. The presence of a regular medical doctor was associated with increased rates of all preventive screening and reduced rates of smoking. CONCLUSION: These results expand upon previous data suggesting that socioeconomic disparities in healthy behaviours and health promotion continue to exist despite equal access to medical screening within the Canadian healthcare context. Knowledge, resources and the presence of a regular medical doctor are important factors associated with identified differences.
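    The per-outcome logistic regression described in METHODS can be sketched as below. The variable names, coding, and toy data are illustrative assumptions rather than the actual NPHS variables; exponentiated coefficients are read as odds ratios.

    import numpy as np
    import pandas as pd
    import statsmodels.formula.api as smf

    # Toy data standing in for NPHS respondents; one model is fitted per dependent variable
    rng = np.random.default_rng(2)
    n = 1000
    df = pd.DataFrame({
        "smoker": rng.integers(0, 2, n),          # example dependent variable
        "income_q": rng.integers(1, 6, n),        # income quintile
        "educ_years": rng.integers(8, 20, n),
        "age": rng.integers(20, 85, n),
        "female": rng.integers(0, 2, n),
        "has_regular_md": rng.integers(0, 2, n),
    })

    model = smf.logit("smoker ~ income_q + educ_years + age + female + has_regular_md",
                      data=df).fit(disp=0)
    print(np.exp(model.params))                   # odds ratios for each predictor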

    Testing the Nambu-Goldstone Hypothesis for Quarks and Leptons at the LHC

    The hierarchy of the Yukawa couplings is an outstanding problem of the standard model. We present a class of models in which the first and second generation fermions are SUSY partners of pseudo-Nambu-Goldstone bosons that parameterize a non-compact Kahler manifold, explaining the small values of these fermion masses relative to those of the third generation. We also provide an example of such a model. We find that various regions of the parameter space in this scenario can give the correct dark matter abundance, and that nearly all of these regions evade other phenomenological constraints. We show that for gluino mass ~700 GeV, model points from these regions can be easily distinguished from other mSUGRA points at the LHC with only 7 fb^(-1) of integrated luminosity at 14 TeV. The most striking signatures are a dearth of b- and tau-jets, a great number of multi-lepton events, and either an "inverted" slepton mass hierarchy, a narrowed slepton mass hierarchy, or a characteristic small-mu spectrum. Comment: Corresponds to published version