Two-loop electroweak top corrections: are they under control?
The assumption that two-loop top corrections are well approximated by the
contribution is investigated. It is shown that, in the case of the ratio of
neutral-to-charged current amplitudes at zero momentum transfer, the terms are
numerically comparable to the contribution for realistic values of the top
mass. An estimate of the theoretical error due to unknown two-loop top effects
is presented for a few observables of LEP interest.
Comment: 13 pages, LaTeX using equations, doublespace, cite macros. Hard
copies of the paper, including one figure, are available from
[email protected]
Towards precise predictions for Higgs-boson production in the MSSM
We study the production of scalar and pseudoscalar Higgs bosons via gluon
fusion and bottom-quark annihilation in the MSSM. Relying on the NNLO-QCD
calculation implemented in the public code SusHi, we provide precise
predictions for the Higgs-production cross section in six benchmark scenarios
compatible with the LHC searches. We also provide a detailed discussion of the
sources of theoretical uncertainty in our calculation. We examine the
dependence of the cross section on the renormalization and factorization
scales, on the precise definition of the Higgs-bottom coupling and on the
choice of PDFs, as well as the uncertainties associated with our incomplete
knowledge of the SUSY contributions through NNLO. In particular, a potentially
large uncertainty originates from uncomputed higher-order QCD corrections to
the bottom-quark contributions to gluon fusion.
Comment: 62 pages, 24 pdf figures; v2: minor clarifications, improved plot
quality, matches published version
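The scale-dependence study described above can be illustrated with a toy sketch. This is not SusHi's actual procedure; the `toy_sigma` function and its coefficients are invented stand-ins, shown only to make the idea of a renormalization/factorization scale-variation envelope concrete.

```python
import math

def scale_envelope(sigma, central=1.0, factors=(0.5, 1.0, 2.0)):
    """Evaluate sigma on a grid of (mu_R, mu_F) choices around the central
    scale and return the spread as a crude uncertainty band."""
    vals = [sigma(f * central, g * central) for f in factors for g in factors]
    return min(vals), max(vals)

# Toy stand-in for a cross-section evaluation (NOT a real calculation):
# mild logarithmic dependence on the two unphysical scales.
toy_sigma = lambda mu_r, mu_f: 10.0 * (1 + 0.1 * math.log(mu_r)
                                       - 0.05 * math.log(mu_f))

lo, hi = scale_envelope(toy_sigma)  # band bracketing the central value 10.0
```

The residual scale dependence of a truncated perturbative series is commonly read off from such an envelope; a genuinely scale-independent all-orders result would give lo == hi.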
Nonparametric Bayesian Mixed-effect Model: a Sparse Gaussian Process Approach
Multi-task learning models using Gaussian processes (GP) have been developed
and successfully applied in various applications. The main difficulty with this
approach is the computational cost of inference using the union of examples
from all tasks. Therefore sparse solutions, which avoid using the entire data
directly and instead use a set of informative "representatives", are desirable.
The paper investigates this problem for the grouped mixed-effect GP model where
each individual response is given by a fixed-effect, taken from one of a set of
unknown groups, plus a random individual effect function that captures
variations among individuals. Such models have been widely used in previous
work but no sparse solutions have been developed. The paper presents the first
sparse solution for such problems, showing how the sparse approximation can be
obtained by maximizing a variational lower bound on the marginal likelihood,
generalizing ideas from single-task Gaussian processes to handle the
mixed-effect model as well as grouping. Experiments using artificial and real
data validate the approach showing that it can recover the performance of
inference with the full sample, that it outperforms baseline methods, and that
it outperforms state of the art sparse solutions for other multi-task GP
formulations.
Comment: Preliminary version appeared in ECML201
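The inducing-point idea behind such sparse approximations can be sketched minimally. This is an illustrative low-rank (Nystrom-style) construction, not the paper's grouped mixed-effect model; the kernel, data, and inducing locations are all invented for the example.

```python
import numpy as np

def rbf(X, Y, ls=1.0):
    """Squared-exponential kernel matrix between two point sets."""
    d2 = ((X[:, None, :] - Y[None, :, :]) ** 2).sum(-1)
    return np.exp(-0.5 * d2 / ls ** 2)

rng = np.random.default_rng(0)
X = rng.uniform(-3, 3, size=(500, 1))   # full training inputs (n = 500)
Z = np.linspace(-3, 3, 20)[:, None]     # m = 20 inducing "representatives"

Knn = rbf(X, X)
Kmm = rbf(Z, Z)
Knm = rbf(X, Z)

# Rank-m surrogate Q_nn = K_nm K_mm^{-1} K_mn; sparse variational methods
# work with this instead of the full K_nn, cutting the dominant inference
# cost from O(n^3) to O(n m^2).
Qnn = Knm @ np.linalg.solve(Kmm + 1e-8 * np.eye(len(Z)), Knm.T)

# Variational lower bounds of the Titsias type penalize the residual trace
# tr(K_nn - Q_nn), which shrinks as the representatives cover the data well.
residual = np.trace(Knn) - np.trace(Qnn)
```

Maximizing the variational lower bound over the representatives (and kernel hyperparameters) is what lets the sparse model recover the performance of full-sample inference.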
The natural evolution of endoscopic approaches in skull base surgery: robotic-assisted surgery?
The current surgical trend is to expand the variety of minimally invasive approaches and, in particular, the possible applications of robotic systems in head and neck surgery. This is particularly intriguing in skull base regions. In this paper, we review the current literature and propose personal considerations on the role of robotic techniques in this field. A brief description of our preclinical experience with skull base robotic dissection represents the basis for further considerations. We are convinced that the advantages of robotic surgery applied to the posterior cranial fossa are similar to those already clinically experienced in other areas (oropharynx, tongue base) in terms of tremor-free, bimanual, precise dissection: the implementation of instruments for bony work and the resolution of current drawbacks will definitely increase the applicability of such a system in forthcoming years.
Theory Building as Integrated Reflection: Understanding Physician Reflection Through Human Communication Research, Medical Education, and Ethics
Grounded in a presupposition that a single explanatory framework cannot fully account for the expansive learning processes that occur during medical residency, the article examines developing physicians' reflective writing from three disciplinary lenses. The goal is to understand how the multi-dimensional nature of medical residency translates into assembling educational experiences and constructing meaning that cannot be fully explained through a single discipline. An interdisciplinary research team across medical education, communication, and ethics qualitatively analyzed reflective entries (N=756) completed by family medicine residents (N=33) across an academic year. Results provide evidence for moving toward an integrated thematic explanation across disciplines. The authors suggest that the integration of disciplinary explanations allows for comprehensive understanding of reflection as a cornerstone in the broader formation of the physician. Examples provide evidence for an integrated understanding of a fuller human experience by considering the three thematic explanations as co-occurring, reciprocal processes.
The Multitude of Molecular Hydrogen Knots in the Helix Nebula
We present HST/NICMOS imaging of the H_2 2.12 \mu m emission in 5 fields in
the Helix Nebula ranging in radial distance from 250-450" from the central
star. The images reveal arcuate structures with their apexes pointing towards
the central star. Comparison of these images with ground-based images of
comparable resolution reveals that the molecular gas is more highly clumped than the
ionized gas line tracers. From our images, we determine an average number
density of knots in the molecular gas ranging from 162 knots/arcmin^2 in the
denser regions to 18 knots/arcmin^2 in the lower density outer regions. Using
this new number density, we estimate the total number of knots in the
Helix to be ~23,000, a factor of 6.5 larger than previous estimates.
The total neutral gas mass in the Helix is 0.35 M_\odot, assuming a mass of
~1.5x10^{-5} M_\odot for the individual knots. The H_2 intensity, 5-9x10^{-5}
erg s^{-1} cm^{-2} sr^{-1}, remains relatively constant with projected distance
from the central star suggesting a heating mechanism for the molecular gas that
is distributed almost uniformly in the knots throughout the nebula. The
temperature and H_2 2.12 \mu m intensity of the knots can be approximately
explained by photodissociation regions (PDRs) in the individual knots; however,
theoretical PDR models of PN under-predict the intensities of some knots by a
factor of 10.
Comment: 26 pages, 3 tables, 10 figures; AJ accepted
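The quoted mass total follows directly from the knot count and the assumed per-knot mass, as a quick consistency check shows:

```python
# Consistency check of the totals quoted in the abstract.
n_knots = 23_000      # estimated total number of knots in the Helix
m_knot = 1.5e-5       # assumed mass per knot, in solar masses
total_mass = n_knots * m_knot  # 0.345 M_sun, matching the quoted ~0.35 M_sun
```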
APENet: LQCD clusters a la APE
Developed by the APE group, APENet is a new high speed, low latency,
3-dimensional interconnect architecture optimized for PC clusters running
LQCD-like numerical applications. The hardware implementation is based on a
single PCI-X 133MHz network interface card hosting six independent
bi-directional channels with a peak bandwidth of 676 MB/s in each direction. We
discuss preliminary benchmark results showing performance similar to or better
than that of high-end commercial network systems.
Comment: Lattice2004(machines), 3 pages, 4 figures
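The per-channel figure implies a simple back-of-envelope aggregate for the card (a derived number, not one stated in the abstract):

```python
# Aggregate peak bandwidth implied by the figures in the text.
channels = 6          # independent bi-directional channels on the card
per_channel = 676     # MB/s peak, each direction, per channel
aggregate = channels * per_channel  # 4056 MB/s per direction across the card
```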
Optimization Under Uncertainty Using the Generalized Inverse Distribution Function
A framework for robust optimization under uncertainty, based on the use of the
generalized inverse distribution function (GIDF), also called the quantile
function, is proposed here. Compared to more classical approaches that rely on
the use of statistical moments as deterministic attributes that define the
objectives of the optimization process, the inverse cumulative distribution
function allows for the use of all the possible information available in the
probabilistic domain. Furthermore, the use of a quantile-based approach leads
naturally to a multi-objective methodology that allows an a posteriori
selection of the candidate design based on risk/opportunity criteria defined by
the designer. Finally, the error in the estimation of the objectives due to the
resolution of the GIDF will be shown to be quantifiable.
Comment: 20 pages, 25 figures
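A minimal sketch of the quantile-based idea, with invented data: two hypothetical designs share the same mean cost, so a mean-based (moment) comparison cannot separate them, while quantiles of the GIDF expose their different risk/opportunity profiles.

```python
import numpy as np

def gidf(samples, p):
    """Empirical generalized inverse distribution (quantile) function."""
    return np.quantile(np.asarray(samples), p)

rng = np.random.default_rng(1)
# Two hypothetical designs: identical mean cost, different spread.
design_a = rng.normal(10.0, 1.0, 10_000)
design_b = rng.normal(10.0, 3.0, 10_000)

# Risk criterion (cost to minimize): the 95th percentile.
# The wider design is riskier even though the means coincide.
risk_a, risk_b = gidf(design_a, 0.95), gidf(design_b, 0.95)

# Opportunity criterion: the 5th percentile, i.e. the chance of an
# unusually low cost; here the wider design offers more upside.
opp_a, opp_b = gidf(design_a, 0.05), gidf(design_b, 0.05)
```

Treating several such quantiles as separate objectives yields the multi-objective formulation described above, with the final risk/opportunity trade-off left to the designer a posteriori.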
Ethical considerations of telehealth: Access, inequity, trust, and overuse
In the U.S. healthcare system, telehealth is increasingly present and demands ethical assessment. On the one hand, telehealth increases access to healthcare services for some at-risk populations (e.g., people suffering from mental illness and addictions) and in specific contexts (e.g., rural). On the other hand, telehealth widens the digital divide and can lead to overuse of services. Furthermore, because it is still unclear how telehealth influences trust between patients and primary care clinicians, connecting relationship science and human communication research can inform critical reasoning. Finally, healthcare policy is advancing toward the wide adoption of telehealth. Hence, it is urgent to address these ethical issues and invest in further research