Randomised prospective study for the effect of therapy on residual beta cell function in type-1 diabetes mellitus [ISRCTN70703138]
BACKGROUND: Newly diagnosed insulin-dependent diabetes mellitus is characterised by a temporary recovery of endogenous insulin secretion ("remission") after the beginning of medical treatment with subcutaneous insulin injections. Although most diabetologists think that preserved insulin reserve is related to a reduced occurrence of diabetic long-term complications, such as eye, nerve and kidney disease, only one prospective controlled clinical study (the DCCT) has addressed this question, and only as a secondary hypothesis. METHODS/DESIGN: We therefore designed a trial consisting of two cohorts, with two therapeutic options within each cohort (conventional versus intensive therapy) and a three-year follow-up. In one cohort, patients are randomly assigned to the treatment regimens to test the alternative hypothesis that variable insulin dosage is superior to fixed insulin injections in preserving insulin reserve, measured by C-peptide in serum. The other cohort includes patients who prefer one of the two therapies and decline randomisation but consent to follow-up. Apart from insulin reserve as a biological parameter, a second primary endpoint was defined as 'therapeutic failure' according to the criteria of the European Association for the Study of Diabetes. Patients complete a training programme to help them self-manage their diabetes. A standardised protocol has been set up to minimise centre effects and bias from health care providers. Potential patient-dependent bias will be investigated by questionnaires measuring psychological coping processes of people with diabetes. Visit scheduling is managed directly by the database, and automated visit reminders are mailed to patients and caregivers to maximise the number of visits kept on schedule. Data quality is monitored regularly, and centres are informed of the results of continuous data management.
An explicit SU(12) family and flavor unification model with natural fermion masses and mixings
We present an SU(12) unification model with three light chiral families,
avoiding any external flavor symmetries. The hierarchy of quark and lepton
masses and mixings is explained by higher dimensional Yukawa interactions
involving Higgs bosons that contain SU(5) singlet fields with VEVs about 50
times smaller than the SU(12) unification scale. The presented model has been
analyzed in detail and found to be in very good agreement with the observed
quark and lepton masses and mixings.
Comment: 11 pages, 4 tables
Semiquantum Chaos in the Double-Well
The new phenomenon of semiquantum chaos is analyzed in a classically regular
double-well oscillator model. Here it arises from a doubling of the number of
effectively classical degrees of freedom, which are nonlinearly coupled in a
Gaussian variational approximation (TDHF) to full quantum mechanics. The
resulting first-order nondissipative autonomous flow system shows energy
dependent transitions between regular behavior and semiquantum chaos, which we
monitor by Poincaré sections and a suitable frequency correlation function
related to the density matrix. We discuss the general importance of this new
form of deterministic chaos and point out the necessity to study open
(dissipative) quantum systems, in order to observe it experimentally.
Comment: LaTeX, 25 pages plus 7 postscript figures. Replaced figure 3 with a non-bitmapped version.
A path integral approach to the dynamics of a random chain with rigid constraints
In this work the dynamics of a freely jointed random chain which fluctuates
at constant temperature in some viscous medium is studied. The chain is
regarded as a system of small particles which perform Brownian motion and are
subjected to rigid constraints which forbid the breaking of the chain. For
simplicity, all interactions among the particles have been switched off and the
number of dimensions has been limited to two. The problem of describing the
fluctuations of the chain in the limit in which it becomes a continuous system
is solved using a path integral approach, in which the constraints are imposed
with the insertion in the path integral of suitable Dirac delta functions. It
is shown that the probability distribution of the possible conformations in
which the fluctuating chain can be found during its evolution in time coincides
with the partition function of a field theory which is a generalization of the
nonlinear sigma model in two dimensions. Both the probability distribution and
the generating functional of the correlation functions of the positions of the
beads are computed explicitly in a semiclassical approximation for a
ring-shaped chain.
Comment: 36 pages, 2 figures, LaTeX + REVTeX4 + graphicx; minor changes in the text, reference added.
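The constraint-insertion step described in this abstract can be sketched schematically (the symbols below — bead trajectories x_i(t), bond length a, and a Brownian action S_B — are illustrative notation, not the paper's own):

```latex
% Schematic sketch: rigid links imposed via Dirac deltas in the path integral.
% x_i(t) are bead trajectories, a is the fixed bond length, S_B is a Brownian
% (Onsager--Machlup) action; all notation here is illustrative.
\Psi[\text{conformations}] \;\propto\;
  \int \prod_i \mathcal{D}x_i \,
  \prod_i \delta\!\left( |x_{i+1}(t) - x_i(t)|^2 - a^2 \right)
  e^{-S_B[x]},
\qquad
S_B[x] = \frac{\gamma}{4 k_B T} \int_0^{t_f}\! dt \, \sum_i \dot{x}_i^{\,2}.
```

In the continuum limit the product of deltas becomes a functional constraint on the local bond length, which is what turns the measure into that of a generalized two-dimensional nonlinear sigma model, as stated in the abstract.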
Triplet Leptogenesis in Left-Right Symmetric Seesaw Models
We discuss scalar triplet leptogenesis in a specific left-right symmetric
seesaw model. We show that the Majorana phases that are present in the model
can be effectively used to saturate the existing upper limit on the
CP-asymmetry of the triplets. We solve the relevant Boltzmann equations and
analyze the viability of triplet leptogenesis. It is known for this kind of
scenario that the efficiency of leptogenesis is maximal if there exists a
hierarchy between the branching ratios of the triplet decays into leptons and
Higgs particles. We show that triplet leptogenesis typically favors branching
ratios with not too strong hierarchies, since maximal efficiency can only be
obtained at the expense of suppressed CP-asymmetries.
Comment: 16 pages, 5 figures, published version.
Lepton Mixing and Cancellation of the Dirac Mass Hierarchy in SO(10) GUTs with Flavor Symmetries T7 and Sigma(81)
In SO(10) grand unified theories (GUTs) the hierarchy which is present in the
Dirac mass term of the neutrinos is generically as strong as the one in the
up-type quark mass term. We propose a mechanism to partially or completely
cancel this hierarchy in the light neutrino mass matrix in the seesaw context.
The two main ingredients of the cancellation mechanism are the existence of
three fermionic gauge singlets and of a discrete flavor symmetry G_f which is
broken at a higher scale than SO(10). Two realizations of the cancellation
mechanism are presented. The realization based on the Frobenius group T7 = Z7 x
Z3 leads to a partial cancellation of the hierarchy and relates maximal 2-3
lepton mixing with the geometric hierarchy of the up-quark masses. In the
realization with the group Sigma(81) the cancellation is complete and
tri-bimaximal lepton mixing is reproduced at the lowest order. In both cases,
to fully accommodate the leptonic data we take into account additional effects
such as effects of higher-dimensional operators involving more than one flavon.
The heavy neutral fermion mass spectra are considered. For both realizations we
analyze the flavon potential at the renormalizable level as well as ways to
generate the Cabibbo angle.
Comment: 31 pages
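As a reminder of the mechanism being invoked (a standard double-seesaw sketch, not the paper's specific matrices): in SO(10) the neutrino Dirac matrix m_D is tied to the up-quark mass matrix, so the ordinary type-I seesaw m_ν = −m_D M_R^{-1} m_D^T inherits the up-quark hierarchy squared. With three fermionic gauge singlets S, the neutral-fermion mass matrix in the basis (ν, N^c, S) takes the double-seesaw form, whose light-neutrino block cancels the Dirac hierarchy whenever the flavor symmetry aligns M_S with m_D:

```latex
% Standard double-seesaw sketch (illustrative notation):
\mathcal{M} =
\begin{pmatrix}
 0      & m_D   & 0    \\
 m_D^T  & 0     & M_S  \\
 0      & M_S^T & \mu
\end{pmatrix}
\;\Longrightarrow\;
m_\nu \simeq m_D \,(M_S^T)^{-1}\, \mu \, M_S^{-1}\, m_D^T .
% If G_f enforces M_S \propto m_D (up to O(1) structure), the hierarchy
% in m_D cancels and m_\nu is governed by \mu.
```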
Precise numerical results for limit cycles in the quantum three-body problem
The study of the three-body problem with short-range attractive two-body
forces has a rich history going back to the 1930s. Recent applications of
effective field theory methods to atomic and nuclear physics have produced a
much improved understanding of this problem, and we elucidate some of the
issues using renormalization group ideas applied to precise nonperturbative
calculations. These calculations provide 11-12 digits of precision for the
binding energies in the infinite cutoff limit. The method starts with this
limit as an approximation to an effective theory and allows cutoff dependence
to be systematically computed as an expansion in powers of inverse cutoffs and
logarithms of the cutoff. Renormalization of three-body bound states requires a
short range three-body interaction, with a coupling that is governed by a
precisely mapped limit cycle of the renormalization group. Additional
three-body irrelevant interactions must be determined to control subleading
dependence on the cutoff and this control is essential for an effective field
theory since the continuum limit is not likely to match physical systems (e.g., few-nucleon bound and scattering states at low energy). Leading order
calculations precise to 11-12 digits allow clear identification of subleading
corrections, but these corrections have not been computed.Comment: 37 pages, 8 figures, LaTeX, uses graphic
Stability and leptogenesis in the left-right symmetric seesaw mechanism
We analyze the left-right symmetric type I+II seesaw mechanism, where an
eight-fold degeneracy among the mass matrices of heavy right-handed neutrinos
M_R is known to exist. Using the stability property of the solutions and their
ability to lead to successful baryogenesis via leptogenesis as additional
criteria, we discriminate among these eight solutions and partially lift their
eight-fold degeneracy. In particular, we find that viable leptogenesis is
generically possible for four out of the eight solutions.
Comment: 25 pages, 11 figures, LaTeX; minor changes, published version.
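The eight-fold degeneracy mentioned above has a simple schematic origin (standard left-right type I+II relations; notation illustrative): both the left- and right-handed Majorana mass terms involve the same triplet Yukawa coupling f, so for fixed m_ν and m_D the seesaw formula is a quadratic matrix equation in f, with 2^3 = 8 solutions for three generations:

```latex
% Left-right symmetric type I+II seesaw (schematic):
m_\nu \;=\; m_L \;-\; m_D\, M_R^{-1}\, m_D^T,
\qquad m_L = v_L f, \quad M_R = v_R f .
% Quadratic in f: each of the three generations admits two branches,
% giving 2^3 = 8 solutions for f.
```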
The Profiling Potential of Computer Vision and the Challenge of Computational Empiricism
Computer vision and other biometrics data science applications have commenced
a new project of profiling people. Rather than using 'transaction generated
information', these systems measure the 'real world' and produce an assessment
of the 'world state' - in this case an assessment of some individual trait.
Instead of using proxies or scores to evaluate people, they increasingly deploy
a logic of revealing the truth about reality and the people within it. While
these profiling knowledge claims are sometimes tentative, they increasingly
suggest that only through computation can these excesses of reality be captured
and understood. This article explores the bases of those claims in the systems
of measurement, representation, and classification deployed in computer vision.
It asks if there is something new in this type of knowledge claim, sketches an
account of a new form of computational empiricism being operationalised, and
questions what kind of human subject is being constructed by these
technological systems and practices. Finally, the article explores legal
mechanisms for contesting the emergence of computational empiricism as the
dominant knowledge platform for understanding the world and the people within
it.