Soft Tissue to Hard Tissue Advancement Ratios for Mandibular Elongation Using Distraction Osteogenesis in Children
Distraction osteogenesis is used extensively to elongate hypoplastic mandibles in children, yet the soft tissue profile response to this treatment is not well understood. The pre- and posttreatment lateral cephalometric radiographs of 27 pediatric patients who underwent bilateral mandibular elongation using distraction osteogenesis were analyzed retrospectively to correlate horizontal soft tissue advancement with horizontal advancement of the underlying bone at B point and pogonion. Horizontal advancement (in millimeters) of bone and overlying soft tissue at these points was measured on each patient's radiographs, and linear regression analysis was performed to determine the relationship of hard to soft tissue horizontal advancement at these points. A 1:0.90 mean ratio of bone to soft tissue advancement was observed both at B point/labiomental sulcus and at pogonion/soft tissue pogonion (linear regression analysis demonstrated slopes [β1 values] of 0.94 and 0.92, respectively). These ratios were consistent throughout the sample population and are highly predictive of the soft tissue response that can be anticipated. Magnitude of advancement, age, and sex of the patient had no effect on these ratios in our population. This study advances our understanding of the soft tissue response that accompanies bony elongation during distraction osteogenesis, allowing more effective treatment planning of the orthodontic and surgical interventions that optimize the patient's functional and esthetic outcome.
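The ratio analysis described above is an ordinary least-squares fit of soft tissue advancement against bony advancement. A minimal sketch of that kind of computation, using hypothetical paired measurements rather than the study's data, might look like:

```python
import numpy as np

# Hypothetical paired measurements (mm): horizontal bony advancement at
# pogonion and the corresponding soft tissue pogonion advancement.
# These numbers are illustrative only, not the study's data.
bone = np.array([8.0, 10.5, 12.0, 15.0, 9.5, 11.0, 14.0, 13.5])
soft = np.array([7.3, 9.6, 10.9, 13.7, 8.8, 10.1, 12.8, 12.3])

# The least-squares slope (the beta_1 of the regression) estimates the
# bone-to-soft-tissue advancement ratio.
slope, intercept = np.polyfit(bone, soft, 1)
print(f"estimated soft/bone advancement ratio: {slope:.2f}")
```

The slope of the fitted line plays the same role as the β1 values reported in the abstract.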
Microscopic calculation of 6Li elastic and transition form factors
Variational Monte Carlo wave functions, obtained from a realistic Hamiltonian
consisting of the Argonne v18 two-nucleon and Urbana-IX three-nucleon
interactions, are used to calculate the 6Li ground-state longitudinal and
transverse form factors as well as transition form factors to the first four
excited states. The charge and current operators include one- and two-body
components, leading terms of which are constructed consistently with the
two-nucleon interaction. The calculated form factors and radiative widths are
in good agreement with available experimental data.

Comment: 9 pages, 2 figures, REVTeX, submitted to Physical Review Letters, with updated introduction and references
The Plausibility of a String Quartet Performance in Virtual Reality
We describe an experiment that explores the contribution of auditory and other features to the illusion of plausibility in a
virtual environment that depicts the performance of a string quartet. ‘Plausibility’ refers to the component of presence that is the
illusion that the perceived events in the virtual environment are really happening. The features studied were: Gaze (the musicians
ignored the participant, the musicians sometimes looked towards and followed the participant’s movements), Sound Spatialization
(Mono, Stereo, Spatial), Auralization (no sound reflections, reflections corresponding to a room larger than the one perceived,
reflections that exactly matched the virtual room), and Environment (no sound from outside of the room, birdsong and wind
corresponding to the outside scene). We adopted a methodology based on color matching theory, in which 20 participants first
assessed their feeling of plausibility in the environment with each of the four features at its highest setting. Then, five times,
participants started from a low setting on all features and made transitions from one system configuration to another until
they matched their original feeling of plausibility. From these transitions a Markov transition matrix was constructed, along with
probabilities of a match conditional on feature configuration. The results show that Environment and Gaze were individually the most
important factors influencing the level of plausibility. The highest probability transitions were to improve Environment and Gaze, and
then Auralization and Spatialization. We present this work both as a contribution to the methodology of assessing presence without
questionnaires and as a demonstration of how various aspects of a musical performance can influence plausibility.
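The Markov analysis described above reduces to counting observed configuration-to-configuration transitions and normalising each row. A toy sketch of that step, with hypothetical states and transitions rather than the study's data:

```python
import numpy as np

# Hypothetical observed transitions between system configurations,
# encoded as (from_state, to_state) index pairs. Illustrative only.
transitions = [(0, 1), (0, 1), (0, 2), (1, 3), (1, 3), (2, 3), (3, 3)]
n_states = 4

# Count transitions, then normalise each row to obtain the Markov
# transition matrix P, where P[i, j] = Pr(next = j | current = i).
counts = np.zeros((n_states, n_states))
for i, j in transitions:
    counts[i, j] += 1
row_sums = counts.sum(axis=1, keepdims=True)
P = np.divide(counts, row_sums, out=np.zeros_like(counts), where=row_sums > 0)
print(P)
```

Each row of `P` is a probability distribution over the next configuration, which is the object from which the highest-probability transitions reported above can be read off.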
Single-Brane Cosmological Solutions with a Stable Compact Extra Dimension
We consider 5-dimensional cosmological solutions of a single brane. The
correct cosmology on the brane, i.e., governed by the standard 4-dimensional
Friedmann equation, and stable compactification of the extra dimension is
guaranteed by the existence of a non-vanishing \hat{T}^5_5 which is
proportional to the 4-dimensional trace of the energy-momentum tensor. We show
that this component of the energy-momentum tensor arises from the backreaction
of the dilaton coupling to the brane. The same positive features are exhibited
in solutions found in the presence of non-vanishing cosmological constants both
on the brane (\Lambda_{br}) and in the bulk (\Lambda_B). Moreover, the
restoration of the Friedmann equation, with the correct sign, takes place for
both signs of \Lambda_{br} so long as the sign of \hat{T}^5_5 is opposite,
in order to cancel the energy densities of the two cosmological
constants. We further extend our single-brane thin-wall solution to allow a
brane with finite thickness.

Comment: 25 pages, LaTeX file, no figures, comments added, references updated, final version to appear in Physical Review
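For reference, the standard 4-dimensional Friedmann equation that a viable brane cosmology must reproduce on the brane is (in standard notation, with $a$ the scale factor, $\rho$ the energy density, $k$ the spatial curvature, and $\Lambda$ the effective 4-dimensional cosmological constant):

```latex
H^2 \equiv \left(\frac{\dot{a}}{a}\right)^2
  = \frac{8\pi G}{3}\,\rho - \frac{k}{a^2} + \frac{\Lambda}{3}
```

The abstract's requirement of "correct cosmology on the brane" means the brane-world construction must recover this linear dependence of $H^2$ on $\rho$, rather than the $\rho^2$ behaviour that naive brane setups produce.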
Statistical coverage for supersymmetric parameter estimation: a case study with direct detection of dark matter
Models of weak-scale supersymmetry offer viable dark matter (DM) candidates.
Their parameter spaces are however rather large and complex, such that pinning
down the actual parameter values from experimental data can depend strongly on
the employed statistical framework and scanning algorithm. In frequentist
parameter estimation, a central requirement for properly constructed confidence
intervals is that they cover true parameter values, preferably at exactly the
stated confidence level when experiments are repeated infinitely many times.
Since most widely-used scanning techniques are optimised for Bayesian
statistics, one needs to assess their abilities in providing correct confidence
intervals in terms of the statistical coverage. Here we investigate this for
the Constrained Minimal Supersymmetric Standard Model (CMSSM) when only
constrained by data from direct searches for dark matter. We construct
confidence intervals from one-dimensional profile likelihoods and study the
coverage by generating several pseudo-experiments for a few benchmark sets of
pseudo-true parameters. We use nested sampling to scan the parameter space and
evaluate the coverage for the benchmarks when either flat or logarithmic priors
are imposed on gaugino and scalar mass parameters. The sampling algorithm has
been used in the configuration usually adopted for exploration of the Bayesian
posterior. We observe both under- and over-coverage, which in some cases vary
quite dramatically when benchmarks or priors are modified. We show how most of
the variation can be explained as the impact of explicit priors as well as
sampling effects, where the latter are indirectly imposed by physicality
conditions. For comparison, we also evaluate the coverage for Bayesian credible
intervals, and observe significant under-coverage in those cases.

Comment: 30 pages, 5 figures; v2 includes major updates in response to referee's comments; extra scans and tables added, discussion expanded, typos corrected; matches published version
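The coverage test described above amounts to repeating a pseudo-experiment many times and counting how often the constructed interval contains the pseudo-true value. A toy sketch with a Gaussian measurement in place of the CMSSM machinery (all numbers illustrative):

```python
import numpy as np

rng = np.random.default_rng(0)
true_mu = 5.0     # pseudo-true parameter value
sigma = 1.0       # known measurement error
n_pseudo = 2000   # number of pseudo-experiments

# For each pseudo-experiment, construct the usual 68.3% interval
# mu_hat +/- sigma and record whether it covers the true value.
covered = 0
for _ in range(n_pseudo):
    mu_hat = rng.normal(true_mu, sigma)
    if mu_hat - sigma <= true_mu <= mu_hat + sigma:
        covered += 1

coverage = covered / n_pseudo
print(f"empirical coverage: {coverage:.3f}  (nominal 0.683)")
```

In this idealised Gaussian case the empirical coverage matches the nominal level; the point of the study above is that profile-likelihood intervals in the CMSSM, built from Bayesian-optimised scans, can deviate substantially from this ideal in both directions.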
Gamma Lines without a Continuum: Thermal Models for the Fermi-LAT 130 GeV Gamma Line
Recent claims of a line in the Fermi-LAT photon spectrum at 130 GeV are
suggestive of dark matter annihilation in the galactic center and other dark
matter-dominated regions. If the Fermi feature is indeed due to dark matter
annihilation, the best-fit line cross-section, together with the lack of any
corresponding excess in continuum photons, poses an interesting puzzle for
models of thermal dark matter: the line cross-section is too large to be
generated radiatively from open Standard Model annihilation modes, and too
small to provide efficient dark matter annihilation in the early universe. We
discuss two mechanisms to solve this puzzle and illustrate each with a simple
reference model in which the dominant dark matter annihilation channel is
photonic final states. The first mechanism we employ is resonant annihilation,
which enhances the annihilation cross-section during freezeout and allows for a
sufficiently large present-day annihilation cross section. Second, we consider
cascade annihilation, with a hierarchy between p-wave and s-wave processes.
Both mechanisms require mass near-degeneracies and predict states with masses
closely related to the dark matter mass; resonant freezeout in addition
requires new charged particles at the TeV scale.

Comment: 17 pages, 8 figures
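The resonant mechanism can be illustrated numerically: a Breit-Wigner propagator enhances the annihilation cross-section when the centre-of-mass energy sits near the resonance mass, which happens during freezeout (when the dark matter is mildly relativistic) but not today (when it is essentially at rest). The masses, width, and velocity estimate below are illustrative choices, not the reference model's parameters:

```python
import numpy as np

m_dm = 130.0         # dark matter mass (GeV); illustrative
m_res = 2.02 * m_dm  # resonance slightly above 2*m_dm (the near-degeneracy)
width = 1.0          # resonance width (GeV); illustrative

def bw_factor(s):
    """Breit-Wigner factor |1/(s - m_res^2 + i*m_res*Gamma)|^2, arbitrary norm."""
    return 1.0 / ((s - m_res**2) ** 2 + (m_res * width) ** 2)

# Today the DM annihilates nearly at rest, so s sits at threshold 4*m_dm^2;
# at freezeout the thermal velocity pushes s up towards the resonance.
s_today = 4 * m_dm**2
v_fo = 0.3                                     # typical freezeout velocity (c = 1)
s_freezeout = 4 * m_dm**2 / (1 - v_fo**2 / 4)  # crude estimate of s at freezeout

print("freezeout/today enhancement:", bw_factor(s_freezeout) / bw_factor(s_today))
```

The asymmetry between the two epochs is what lets the model reconcile a large line cross-section today with efficient (but not identical) annihilation in the early universe.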
Algorithm for normal random numbers
We propose a simple algorithm for generating normally distributed pseudo
random numbers. The algorithm simulates N molecules that exchange energy among
themselves following a simple stochastic rule. We prove that the system is
ergodic, and that a Maxwell-like distribution that may be used as a source of
normally distributed random deviates emerges when N tends to infinity. The
algorithm passes various performance tests, including Monte Carlo simulation of
a finite 2D Ising model using Wolff's algorithm. It only requires four simple
lines of computer code, and is approximately ten times faster than the
Box-Muller algorithm.

Comment: 5 pages, 3 encapsulated PostScript figures. Submitted to Phys. Rev. Letters. For related work, see http://pipe.unizar.es/~jf
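One way an energy-exchange rule of this kind can be realised is to pair up "molecules" at random and redistribute each pair's energy with a random angle, which conserves the pair's total energy exactly. The sketch below follows that idea; it is an illustration of the mechanism described in the abstract, not necessarily the authors' exact four-line update:

```python
import numpy as np

rng = np.random.default_rng(42)

def energy_exchange_normals(n=100_000, sweeps=20):
    """Approximately normal deviates from pairwise energy exchange.

    A sketch of the energy-exchange mechanism described in the abstract,
    not necessarily the authors' exact rule.
    """
    x = np.ones(n)  # start every molecule with unit 'velocity'
    for _ in range(sweeps):
        perm = rng.permutation(n)
        i, j = perm[: n // 2], perm[n // 2 :]
        # Redistribute each pair's energy with a random angle; this
        # conserves x_i^2 + x_j^2 exactly, so total energy is fixed.
        e = np.sqrt(x[i] ** 2 + x[j] ** 2)
        theta = rng.uniform(0.0, 2.0 * np.pi, size=n // 2)
        x[i], x[j] = e * np.cos(theta), e * np.sin(theta)
    return x

sample = energy_exchange_normals()
print("mean:", np.mean(sample), "std:", np.std(sample))
```

Because total energy is conserved at exactly n, the second moment of the sample is pinned to 1, while repeated exchanges drive the marginal distribution towards the Gaussian (Maxwell-like) fixed point.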
The Higgs - photon - Z boson coupling revisited
We analyze the coupling of CP-even and CP-odd Higgs bosons to a photon and a
Z boson in extensions of the Standard Model. In particular, we study in detail
the effect of charged Higgs bosons in two-Higgs doublet models, and the
contribution of SUSY particle loops in the minimal supersymmetric extension of
the Standard Model. The Higgs-photon-Z coupling can be measured in the decay
Z -> photon + Higgs at e+e- colliders running on the Z resonance, or in
the reverse process Higgs -> Z + photon with the Higgs boson produced at the LHC.
We show that a measurement of this coupling with a precision at the percent
level, which could be the case at future e+e- colliders, would allow one to
distinguish between the lightest SUSY and standard Higgs bosons in large areas
of the parameter space.

Comment: 18 pages LaTeX + 7 figures (ps). Typo corrected in eq.(5)
Viable tax constitutions
Taxation is sustainable only if the general public complies with it. This observation is uncontroversial among tax practitioners but has been ignored by the public finance tradition, which has interpreted tax constitutions as binding contracts by which the power to tax is irretrievably conferred by individuals on government, which can then levy any tax it chooses. However, in the absence of an outside party enforcing contracts between members of a group, no arrangement within the group can be considered a binding contract, and therefore the power to tax must be sanctioned by individuals on an ongoing basis. In this paper we offer, for the first time, a theoretical analysis of this fundamental compliance problem associated with taxation, obtaining predictions that in some cases point to a re-interpretation of the theoretical constructions of the public finance tradition, while in others they call those constructions into question.
Slightly Non-Minimal Dark Matter in PAMELA and ATIC
We present a simple model in which dark matter couples to the standard model
through a light scalar intermediary that is itself unstable. We find this model
has several notable features, and allows a natural explanation for a surplus of
positrons, but no surplus of anti-protons, as has been suggested by early data
from PAMELA and ATIC. Moreover, this model yields a very small nucleon
coupling, well below the direct detection limits. In this paper we explore the
effect of this model in both the early universe and in the galaxy.

Comment: 7 pages, 6 figures, v3: updated for new data, added discussion of Fermi
