Thermal photon production in heavy ion collisions
Using a three-dimensional hydrodynamic simulation of the collision and an
equation of state containing a first order phase transition to the quark-gluon
plasma, we study thermal photon production for collisions at
AGeV and for collisions at AGeV. We obtain
surprisingly high rates of thermal photons even at the lower energy, suggesting
that, contrary to what was expected so far, photon production may be an
interesting topic for experimental search also at the Alternating Gradient
Synchrotron. When applied to the reaction at AGeV, our model can
reproduce preliminary data obtained by the WA80 Collaboration without having to
postulate the existence of an extremely long-lived mixed phase as was recently
proposed. Comment: 9 pages, figures are uuencoded, compressed and tarred
Fire and the relative roles of weather, climate and landscape characteristics in the Great Lakes-St. Lawrence forest of Canada
Question: In deciduous-dominated forest landscapes, what are the relative roles of fire weather, climate, and human and biophysical landscape characteristics in explaining variation in large-fire occurrence and area burned?
Location: The Great Lakes-St. Lawrence forest of Canada.
Methods: We characterized the recent (1959-1999) regime of large (≥ 200 ha) fires in 26 deciduous-dominated landscapes and analysed these data in an information-theoretic framework to compare six hypotheses relating fire occurrence and area burned to fire weather severity, climate normals, population and road densities, and enduring landscape characteristics such as surficial deposits and large lakes.
Results: 392 large fires burned 833 698 ha during the study period, annually burning on average 0.07% ± 0.42% of the forested area in each landscape. Fire activity was strongly seasonal, with most fires and area burned occurring in May and June. A combination of antecedent-winter precipitation, fire-season precipitation deficit/surplus and percent of landscape covered by well-drained surficial deposits best explained fire occurrence and area burned. Fire occurrence varied only as a function of fire weather and climate variables, whereas area burned was also explained by percent cover of aspen and pine stands, human population density and two enduring characteristics: percent cover of large water bodies and glaciofluvial deposits.
Conclusion: Understanding the relative role of these variables may help design adaptation strategies for forecasted increases in fire weather severity by allowing (1) prioritization of landscapes according to enduring characteristics and (2) management of their composition so that substantially increased fire activity would be necessary to transform landscape structure and composition.
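The information-theoretic model comparison described in the Methods can be sketched as a toy AIC ranking of two competing single-predictor regressions. Everything below is an illustrative assumption — synthetic data, hypothetical variable names (`deficit`, `road_density`), and a two-model candidate set rather than the study's six hypotheses:

```python
# Toy AIC-based hypothesis comparison (illustration only, not the study's data).
import math
import random

random.seed(1)
n = 200
# Synthetic landscape-year records: precipitation deficit drives "area burned".
deficit = [random.uniform(-1, 1) for _ in range(n)]
noise = [random.gauss(0, 0.5) for _ in range(n)]
area = [1.0 + 2.0 * d + e for d, e in zip(deficit, noise)]
road_density = [random.uniform(0, 1) for _ in range(n)]  # unrelated predictor

def ols_rss(y, x):
    """Residual sum of squares of a one-predictor least-squares fit."""
    mx, my = sum(x) / len(x), sum(y) / len(y)
    slope = sum((a - mx) * (b - my) for a, b in zip(x, y)) / \
            sum((a - mx) ** 2 for a in x)
    return sum((b - my - slope * (a - mx)) ** 2 for a, b in zip(x, y))

def aic(rss, k):
    # Gaussian log-likelihood up to an additive constant; k = parameter count.
    return n * math.log(rss / n) + 2 * k

models = {
    "weather": aic(ols_rss(area, deficit), 3),       # slope, intercept, sigma
    "human":   aic(ols_rss(area, road_density), 3),
}
best = min(models, key=models.get)   # lowest AIC wins
```

In the real analysis each "hypothesis" would bundle several predictors and the comparison would typically use AICc weights, but the ranking logic — penalized likelihood, lowest score preferred — is the same.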
Crises and collective socio-economic phenomena: simple models and challenges
Financial and economic history is strewn with bubbles and crashes, booms and
busts, crises and upheavals of all sorts. Understanding the origin of these
events is arguably one of the most important problems in economic theory. In
this paper, we review recent efforts to include heterogeneities and
interactions in models of decision. We argue that the Random Field Ising model
(RFIM) indeed provides a unifying framework to account for many collective
socio-economic phenomena that lead to sudden ruptures and crises. We discuss
different models that can capture potentially destabilising self-referential
feedback loops, induced either by herding, i.e. reference to peers, or
trending, i.e. reference to the past, and account for some of the phenomenology
missing in the standard models. We discuss some empirically testable
predictions of these models, for example robust signatures of RFIM-like herding
effects, or the logarithmic decay of spatial correlations of voting patterns.
One of the most striking results, inspired by statistical physics methods, is
that Adam Smith's invisible hand can badly fail at solving simple coordination
problems. We also stress the issue of time-scales, which can be extremely
long in some cases and prevent socially optimal equilibria from being reached. As a
theoretical challenge, the study of so-called "detailed-balance" violating
decision rules is needed to decide whether conclusions based on current models
(that all assume detailed balance) are indeed robust and generic. Comment: Review paper accepted for a special issue of J Stat Phys; several
minor improvements following reviewers' comments
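The RFIM-style herding mechanism reviewed above can be sketched numerically: binary agents align with the sign of an idiosyncratic preference plus a common signal plus an imitation term, and sweeping the common signal produces a sudden collective rupture rather than a smooth response. This is a minimal illustration of the mechanism, not the authors' model; all parameter values (`N`, `J`, `SIGMA`) are assumptions.

```python
# Minimal Random-Field-Ising-type herding sketch (illustration only).
# Agent i holds opinion s_i = +/-1 and aligns with the sign of
#     h_i = f_i + F + J * m,
# where f_i is an idiosyncratic preference, F a common external signal,
# and J * m the imitation (herding) term, m being the average opinion.
import random

random.seed(0)
N = 10_000
J = 2.0       # imitation strength -- illustrative assumption
SIGMA = 0.5   # dispersion of idiosyncratic preferences -- assumption
fields = [random.gauss(0.0, SIGMA) for _ in range(N)]

def relax(state, F):
    """Iterate best-response updates until no agent wants to flip."""
    while True:
        m = sum(state) / N
        new = [1 if f + F + J * m >= 0 else -1 for f in fields]
        if new == state:
            return new
        state = new

# Sweep the common signal F upward, keeping each previous state (hysteresis).
state = [-1] * N
curve = []
for step in range(-15, 16):
    F = step / 5.0                      # F runs from -3.0 to +3.0
    state = relax(state, F)
    curve.append((F, sum(state) / N))

# With J large relative to SIGMA, the average opinion m jumps discontinuously
# at a critical F: a sudden collective rupture, not a smooth drift.
```

Lowering J below roughly SIGMA * sqrt(2 * pi) should restore a smooth, continuous response curve — the disorder-dominated regime of the RFIM, where aggregate behaviour tracks the external signal gradually.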
Enhanced Observations with Borehole Seismographic Networks. The Parkfield, California Experiment
The data acquired in the Parkfield, California experiment are unique and they are producing results that force a new look at some conventional concepts and models for earthquake occurrence and fault-zone dynamics. No fault-zone drilling project can afford to neglect installation of such a network early enough in advance of the fault-zone penetration to have a well-defined picture of the seismicity details (probably at least 1000 microearthquakes--an easy 2-3 year goal given the M<0 detection capability of a borehole network). Analyses of nine years of Parkfield monitoring data have revealed significant and unambiguous departures from stationarity both in the seismicity characteristics and in wave propagation details within the S-wave coda for paths within the presumed M6 nucleation zone, where we also have found a high Vp/Vs anomaly at depth, and where the three recent M4.7-5.0 sequences have occurred. Synchronous changes well above noise levels have also been seen among several independent parameters, including seismicity rate, average focal depth, S-wave coda velocities, characteristic sequence recurrence intervals, fault creep and water levels in monitoring wells. The significance of these findings lies in their apparent coupling and inter-relationships, from which models for fault-zone processes can be fabricated and tested with time. The more general significance of the project is its production of a truly unique continuous baseline, at very high resolution, of both the microearthquake pathology and the subtle changes in wave propagation.
Muscle histology vs MRI in Duchenne muscular dystrophy
Objective: There are currently no effective treatments to halt the muscle breakdown in Duchenne muscular dystrophy (DMD), although genetic-based clinical trials are being piloted. Most of these trials have as an endpoint the restoration of dystrophin in muscle fibers, hence requiring sufficiently well-preserved muscle in recruited patients. The choice of the muscles to be studied and the role of noninvasive methods to assess muscle preservation therefore require further evaluation.
Methods: We studied the degree of muscle involvement in the lower leg muscles of 34 patients with DMD older than 8 years, using muscle MRI. In a subgroup of 15 patients we correlated the muscle MRI findings with the histology of open extensor digitorum brevis (EDB) muscle biopsies. Muscle MRI involvement was graded on a 0-4 scale (normal to severe).
Results: In all patients we documented a gradient of involvement of the lower leg muscles: the posterior compartment (gastrocnemius > soleus) was most severely affected; the anterior compartment (tibialis anterior/posterior, popliteus, extensor digitorum longus) was least affected. Muscle MRI showed EDB involvement that correlated with the patient's age (p = 0.055). We show a correlation between the MRI and EDB histopathologic changes, with MRI grades 3-4 associated with more severe fibro-adipose tissue replacement. The EDB was sufficiently preserved in bulk and signal intensity in 18/22 wheelchair users aged 10-16.6 years.
Conclusion: This study provides a detailed correlation between muscle histology and MRI changes in DMD and demonstrates the value of this imaging technique as a reliable tool for the selection of muscles in patients recruited into clinical trials. Neurology® 2011;76:346-35
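The MRI-grade vs histology association reported above pairs an ordinal grade with a continuous tissue measure, the natural setting for a rank correlation. The sketch below computes a Spearman correlation on synthetic, illustrative values (hypothetical patients, not the study's measurements):

```python
# Spearman rank correlation between an ordinal MRI grade (0-4) and a
# continuous histology measure. Data are synthetic, for illustration only.
import math

def ranks(values):
    """Average ranks (1-based); tied values share their mean rank."""
    order = sorted(range(len(values)), key=lambda i: values[i])
    out = [0.0] * len(values)
    i = 0
    while i < len(values):
        j = i
        while j + 1 < len(values) and values[order[j + 1]] == values[order[i]]:
            j += 1
        mean_rank = (i + j) / 2 + 1
        for k in range(i, j + 1):
            out[order[k]] = mean_rank
        i = j + 1
    return out

def spearman(x, y):
    """Spearman rho = Pearson correlation of the ranks."""
    rx, ry = ranks(x), ranks(y)
    mx, my = sum(rx) / len(rx), sum(ry) / len(ry)
    cov = sum((a - mx) * (b - my) for a, b in zip(rx, ry))
    sx = math.sqrt(sum((a - mx) ** 2 for a in rx))
    sy = math.sqrt(sum((b - my) ** 2 for b in ry))
    return cov / (sx * sy)

# Hypothetical patients: MRI grade vs percent fibro-adipose replacement.
mri_grade = [0, 1, 1, 2, 2, 3, 3, 4, 4, 4]
fibrosis_pct = [5, 10, 12, 20, 18, 40, 45, 60, 65, 70]
rho = spearman(mri_grade, fibrosis_pct)
```

A rho near +1 here reflects a strongly monotone grade-to-fibrosis relationship; the tie-handling in `ranks` matters because an ordinal 0-4 scale produces many tied grades.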