The role of dissipation in biasing the vacuum selection in quantum field theory at finite temperature
We study the symmetry breaking pattern of an O(4) symmetric model of scalar
fields, with both charged and neutral fields, interacting with a photon bath.
Nagasawa and Brandenberger argued that in favourable circumstances the vacuum
manifold would be reduced from S^3 to S^1. Here it is shown that a selective
condensation of the neutral fields, that are not directly coupled to photons,
can be achieved in the presence of a minimal "external" dissipation, i.e. not
related to interactions with a bath. This should be relevant in the early
universe or in heavy-ion collisions, where dissipation occurs due to expansion.
Comment: Final version to appear in Phys. Rev. D; 2 figures added, 2 new sub-sections
Predicting the critical density of topological defects in O(N) scalar field theories
O(N) symmetric field theories describe many critical
phenomena in the laboratory and in the early Universe. Given N and D,
the dimension of space, these models exhibit topological defect classical
solutions that in some cases fully determine their critical behavior. For N=2,
D=3 it has been observed that the defect density is seemingly a universal
quantity at T_c. We prove this conjecture and show how to predict its value
based on the universal critical exponents of the field theory. Analogously, for
general N and D we predict the universal critical densities of domain walls and
monopoles, for which no detailed thermodynamic study exists. This procedure can
also be inverted, producing an algorithm for generating typical defect networks
at criticality, in contrast to the canonical procedure, which applies only in
the unphysical limit of infinite temperature.
Comment: 4 pages, 3 figures, uses RevTeX; typos in Eq. (11) and (14) corrected
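The "canonical procedure" at infinite temperature mentioned above can be sketched for the simplest case (N=2 in two dimensions): assign independent, uniformly random phases to lattice sites and count the net phase winding around each plaquette. This is only a toy sketch of defect counting, not the authors' algorithm for networks at criticality; for uniformly random phases the defect density per plaquette comes out close to 1/3.

```python
import math
import random

random.seed(0)
L = 64  # lattice size (periodic boundaries)

# "Infinite-temperature" O(2) configuration: independent uniform phases.
theta = [[random.uniform(-math.pi, math.pi) for _ in range(L)] for _ in range(L)]

def wrap(d):
    """Map a phase difference into the interval (-pi, pi]."""
    return (d + math.pi) % (2 * math.pi) - math.pi

def winding(x, y):
    """Net winding number (in units of 2*pi) around the plaquette at (x, y)."""
    corners = [
        theta[y][x],
        theta[y][(x + 1) % L],
        theta[(y + 1) % L][(x + 1) % L],
        theta[(y + 1) % L][x],
    ]
    total = sum(wrap(corners[(i + 1) % 4] - corners[i]) for i in range(4))
    return round(total / (2 * math.pi))

charges = [winding(x, y) for y in range(L) for x in range(L)]
density = sum(abs(c) for c in charges) / len(charges)  # about 1/3 for random phases
```

On a periodic lattice every edge is shared by two plaquettes with opposite orientation, so the total topological charge sums to zero by construction.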
The social fabric of jeans: assessing the social by coupling social simulation and assessment methods
The culture and manufacturing of the cotton fabric used to make your jeans may have involved the use of fertilizers or pesticides that pollute a water basin, may have led to the relocation of people, and may even have relied on child labour at different stages of fabrication. As a consumer you probably did not take all these consequences into account (most of the information is not available to you, or you may simply feel unconcerned) and preferred to buy the cheapest pair or to follow the fashion trend. Basically, every economic or public activity has repercussions, directly or through a chain of consequences, on the environment and on society. In order to measure those impacts, or to value one choice (jeans L) compared to another (jeans P&J), several assessment methods have been developed and are frequently used. Assessment methods are, self-evidently, instruments used to evaluate something: this may mean measuring performance in a specific case or, for policies and strategies, evaluating their potential impacts. The latter is impact assessment, in which past (already implemented actions) or future (ex-ante analysis) performances are studied.
Pre-existing diabetes mellitus does not worsen the prognosis in the first year after heart transplantation
BACKGROUND AND AIMS: Heart transplantation remains the gold standard treatment for selected patients with end-stage heart failure. However, transplantation in diabetic patients remains controversial. The hyperglycemic effect of immunosuppressant therapy further complicates post-transplantation management of diabetes and, although this is still unproven, could be responsible for a higher incidence of post-transplantation infection, rejection and mortality. In this study, we aimed to compare one-year outcomes of survival and morbidity after cardiac transplantation among recipients with and without diabetes mellitus.
METHODS: This was a prospective observational study of 114 patients who underwent first heart transplantation between November 2003 and January 2008, with 1-year follow-up. They were divided into two groups according to whether they had pre-transplantation diabetes (group 1) or not (group 2). Baseline variables and complications were recorded. Logistic regression analysis was used to identify independent predictors of 1-year mortality.
RESULTS: Of the 114 patients, 33% were diabetic before transplantation. Diabetic patients were older (57.0 +/- 7.4 vs. 51.2 +/- 12.9 years, p = 0.013), and had a higher prevalence of hypertension (63.6% vs. 16.7%, p = 0.002), lower creatinine clearance (53.5 +/- 16.2 vs. 63.0 +/- 21.8 ml/min, p = 0.020) and higher C-reactive protein levels (1.8 +/- 2.4 vs. 0.9 +/- 1.3 mg/l, p = 0.029) than non-diabetics. They tended to have more peripheral arterial disease (20.8% vs. 14.8%, p = NS) and carotid disease (25.8% vs. 14.3%, p = NS). In diabetic patients, fasting glucose levels were significantly lower at one year than before heart transplantation (134.2 +/- 45.3 vs. 158.4 +/- 71.2 mg/dl, p = 0.039). There were no significant differences between diabetic and non-diabetic patients in rejection (16.2% vs. 23.4%, p = 0.467), infection (27.0% vs. 33.8%, p = 0.524) or mortality (16.2% vs. 6.5%, p = 0.171) at 1-year follow-up. On logistic regression analysis, the only predictor of 1-year mortality was baseline creatinine > 1.4 mg/dl (OR: 6.36, 95% CI: 1.12-36.04). Diabetes and impaired fasting glucose before heart transplantation were not independent predictors of 1-year mortality.
CONCLUSIONS: These data suggest that diabetes is not associated with worse 1-year survival or higher morbidity in heart transplant patients, as long as good blood glucose control is maintained.
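The logistic-regression step used to identify predictors of 1-year mortality can be sketched on a synthetic cohort (illustrative only, not the study's data): a single 0/1 predictor mimicking "baseline creatinine > 1.4 mg/dl", fitted by gradient ascent on the log-likelihood, from which the odds ratio is read off as exp(b1). All group sizes and risk levels below are assumptions.

```python
import math
import random

random.seed(42)

# Synthetic cohort (illustrative only; not the study's data). The predictor
# mimics "baseline creatinine > 1.4 mg/dl" as a 0/1 flag, with a higher
# assumed 1-year mortality risk in the elevated group (25% vs. 5%).
n = 1000
high_cr = [1 if random.random() < 0.3 else 0 for _ in range(n)]
died = [1 if random.random() < (0.25 if h else 0.05) else 0 for h in high_cr]

def fit_logistic(x, y, lr=1.0, iters=2000):
    """Fit P(y=1) = sigmoid(b0 + b1*x) by gradient ascent on the log-likelihood."""
    b0 = b1 = 0.0
    m = len(x)
    for _ in range(iters):
        g0 = g1 = 0.0
        for xi, yi in zip(x, y):
            p = 1.0 / (1.0 + math.exp(-(b0 + b1 * xi)))
            g0 += yi - p
            g1 += (yi - p) * xi
        b0 += lr * g0 / m
        b1 += lr * g1 / m
    return b0, b1

b0, b1 = fit_logistic(high_cr, died)
odds_ratio = math.exp(b1)  # odds ratio for the binary predictor
```

With a binary predictor the fitted odds ratio approaches the ratio of odds in the two groups, which is how a single cut-off variable such as elevated creatinine ends up reported as an OR with a confidence interval.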
Numerical simulation of stochastic vortex tangles
We present the results of simulation of the chaotic dynamics of quantized
vortices in the bulk of superfluid He II.
Evolution of vortex lines is calculated on the base of the Biot-Savart law.
The dissipative effects appeared from the interaction with the normal
component, or/and from relaxation of the order parameter are taken into
account. Chaotic dynamics appears in the system via a random forcing, e.i. we
use the Langevin approach to the problem. In the present paper we require the
correlator of the random force to satisfy the fluctuation-disspation relation,
which implies that thermodynamic equilibrium should be reached. In the paper we
describe the numerical methods for integration of stochastic differential
equation (including a new algorithm for reconnection processes), and we present
the results of calculation of some characteristics of a vortex tangle such as
the total length, distribution of loops in the space of their length, and the
energy spectrum.Comment: 8 pages, 5 figure
Nucleation of vortices by rapid thermal quench
We show that vortex nucleation in superfluid He by rapid thermal quench
in the presence of superflow is dominated by a transverse instability of the
moving normal-superfluid interface. Exact expressions for the instability
threshold as a function of supercurrent density and the front velocity are
found. The results are verified by numerical solution of the Ginzburg-Landau
equation.
Comment: 4 pages, 4 figures; submitted to Phys. Rev. Lett.
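As a hedged illustration of the kind of Ginzburg-Landau dynamics solved numerically in such studies (a one-dimensional relaxational toy model, not the authors' two-dimensional interface computation), one can integrate the time-dependent GL equation d(psi)/dt = psi - psi^3 + d2(psi)/dx2 from a small random field and watch the order parameter grow and saturate at the vacuum value |psi| = 1:

```python
import random

random.seed(2)

# 1D time-dependent Ginzburg-Landau relaxation:
#   d(psi)/dt = psi - psi**3 + d2(psi)/dx2
# starting from a small random field; psi saturates at the vacuum value +-1,
# forming domains separated by kinks.
N, dx, dt = 100, 0.5, 0.05   # dt/dx**2 = 0.2 < 0.5, so explicit stepping is stable
psi = [0.1 * random.uniform(-1.0, 1.0) for _ in range(N)]

for _ in range(4000):
    new = []
    for i in range(N):  # periodic boundaries
        lap = (psi[(i + 1) % N] - 2.0 * psi[i] + psi[(i - 1) % N]) / dx**2
        new.append(psi[i] + dt * (psi[i] - psi[i] ** 3 + lap))
    psi = new
```

The linear instability of the psi = 0 state drives the initial growth; the cubic term then saturates it, which is the same competition that controls front propagation in the full problem.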
Algorithmic statistics revisited
The mission of statistics is to provide adequate statistical hypotheses
(models) for observed data. But what is an "adequate" model? To answer this
question, one needs to use the notions of algorithmic information theory. It
turns out that for every data string one can naturally define a
"stochasticity profile", a curve that represents a trade-off between the
complexity of a model and its adequacy. This curve has four different
equivalent definitions, in terms of (1) randomness deficiency, (2) minimal
description length, (3) position in the lists of simple strings, and
(4) Kolmogorov complexity with decompression time bounded by the busy beaver
function. We present a survey of the corresponding definitions and of results
relating them to each other.
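The minimal-description-length view (definition (2) above) can be made concrete with a toy two-part code over a deliberately restricted model class. This is only a sketch of the complexity/adequacy trade-off, since true Kolmogorov complexity is uncomputable: the model part names the set "length-n binary strings with exactly k ones", and the data part is the index of the string within that set.

```python
import math

def two_part_length(x):
    """Two-part description length in bits for a restricted model class:
    model = 'the set of length-n binary strings with exactly k ones',
    data  = index of x within that set (log2 of the set's size)."""
    n, k = len(x), x.count("1")
    model_bits = 2.0 * math.log2(n + 1)      # crude cost of stating n and k
    data_bits = math.log2(math.comb(n, k))   # log2 |set|
    return model_bits + data_bits

balanced   = "0110100110001011" * 8   # 64 ones out of 128: near-maximal data part
structured = "0000000100000000" * 8   # 8 ones out of 128: short two-part code
```

For the sparse string the two-part code is far shorter than the 128-bit literal encoding, while for the balanced string the restricted model buys almost nothing; richer model classes trace out the rest of the stochasticity profile.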