Role of a parallel magnetic field in two dimensional disordered clusters containing a few correlated electrons
An ensemble of 2D disordered clusters with a few electrons is studied as a
function of the Coulomb energy to kinetic energy ratio r_s. Between the Fermi
system (small r_s) and the Wigner molecule (large r_s), an interaction-induced
delocalization of the ground state takes place, which is suppressed when the
spins are aligned by a parallel magnetic field. Our results confirm the
existence of an intermediate regime where the Wigner antiferromagnetism
disfavors the Stoner ferromagnetism and where the enhancement of the Landé g
factor observed in dilute electron systems is reproduced. Comment: 4 pages, 3 figures
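For orientation (a standard definition stated here by us; the paper's exact convention may differ), in a two-dimensional electron system with sheet density n_s and effective Bohr radius a_B the interaction parameter is

    r_s = \frac{1}{a_B \sqrt{\pi n_s}},

i.e. the mean inter-electron distance measured in units of a_B, which scales as the ratio of the typical Coulomb energy to the kinetic (Fermi) energy; small r_s is the weakly interacting Fermi limit and large r_s the Wigner-molecule limit.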
Bargmann-Michel-Telegdi equation and one-particle relativistic approach
A reexamination of the semiclassical approach to the relativistic electron
indicates a possible variation of its helicity, proportional to the anomalous
part of the magnetic moment, when static electric and magnetic fields are
applied along its global motion, due to zitterbewegung effects. Comment: 10 pages, LATEX2E, uses amsb
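For reference (the standard textbook form of the equation named in the title, in units with c = 1, not quoted from the abstract), the Bargmann-Michel-Telegdi equation for the spin four-vector S^\mu of a particle of charge e, mass m, gyromagnetic factor g and four-velocity u^\mu in a homogeneous external field F^{\mu\nu} reads

    \frac{dS^\mu}{d\tau} = \frac{e}{m}\left[\frac{g}{2} F^{\mu\nu} S_\nu + \left(\frac{g}{2}-1\right) u^\mu \left(S_\nu F^{\nu\lambda} u_\lambda\right)\right],

where the term proportional to (g/2 - 1), i.e. to the anomalous part of the magnetic moment, is the one relevant to the helicity variation discussed above.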
From regional to local SPTHA: efficient computation of probabilistic tsunami inundation maps addressing near-field sources
Site-specific seismic probabilistic tsunami hazard analysis (SPTHA) is a
computationally demanding task, as it requires, in principle, a huge number of
high-resolution numerical simulations for producing probabilistic inundation
maps. We implemented an efficient and robust methodology using a filtering
procedure to reduce the number of numerical simulations needed while still
allowing for a full treatment of aleatory and epistemic uncertainty. Moreover, to
avoid biases in tsunami hazard assessment, we developed a strategy to
identify and separately treat tsunamis generated by near-field earthquakes.
Indeed, the coseismic deformation produced by local earthquakes necessarily
affects tsunami intensity, depending on the scenario size, mechanism and
position, as coastal uplift or subsidence tends to diminish or increase the
tsunami hazard, respectively. Therefore, we proposed two parallel filtering
schemes in the far- and the near-field, based on the similarity of offshore
tsunamis and hazard curves and on the similarity of the coseismic fields,
respectively. This becomes mandatory as offshore tsunami amplitudes cannot
serve as a proxy for the coastal inundation in the case of near-field sources.
We applied the method to an illustrative use case at the Milazzo oil refinery
(Sicily, Italy). We demonstrate that a blind filtering procedure cannot
properly account for local sources and would lead to a nonrepresentative
selection of important scenarios. For the specific source–target
configuration, this results in an overestimation of the tsunami hazard,
which turns out to be correlated to dominant coastal uplift. Different
settings could produce either the opposite or a mixed behavior along the
coastline. However, we show that the effects of the coseismic deformation due
to local sources cannot be neglected and a suitable correction has to be
employed when assessing local-scale SPTHA, irrespective of the specific
signs of coastal displacement.
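As a purely illustrative sketch of the far-field filtering idea described above (our own minimal example with invented names, not the authors' implementation), scenarios can be grouped by the similarity of their offshore tsunami signals so that only one representative per group requires a high-resolution inundation simulation, with the annual rates of the discarded members transferred to that representative:

    import numpy as np
    from scipy.cluster.hierarchy import linkage, fcluster

    def select_representative_scenarios(offshore_amplitudes, annual_rates, cut=0.1):
        # offshore_amplitudes: (n_scenarios, n_offshore_points) modelled amplitudes
        # annual_rates: (n_scenarios,) annual rates of occurrence of the scenarios
        tree = linkage(offshore_amplitudes, method="average", metric="euclidean")
        labels = fcluster(tree, t=cut, criterion="distance")
        representatives = []
        for c in np.unique(labels):
            members = np.flatnonzero(labels == c)
            rep = members[np.argmax(annual_rates[members])]  # most likely member
            representatives.append((rep, annual_rates[members].sum()))
        return representatives

The near-field scheme described in the abstract would analogously group scenarios on the similarity of their coseismic deformation fields rather than on the offshore amplitudes.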
Delocalizing effect of the Hubbard repulsion for electrons on a two-dimensional disordered lattice
We study numerically the ground-state properties of the repulsive Hubbard
model for spin-1/2 electrons on two-dimensional lattices with disordered
on-site energies. The projector quantum Monte Carlo method is used to obtain
very accurate values of the ground-state charge density distributions with N
and N+1 particles. The difference in these charge densities allows us
to study the localization properties of an added particle. The results obtained
at quarter-filling on finite clusters show that the Hubbard repulsion has a
strong delocalizing effect on the electrons in disordered 2D lattices. However,
numerical restrictions do not allow us to reach a definite conclusion about the
existence of a metal-insulator transition in the thermodynamic limit in
two dimensions. Comment: revtex, 7 pages, 7 figures
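As a schematic illustration of how the localization of the added particle can be quantified from such charge densities (a minimal sketch with our own naming, not the authors' code), one can form the density difference between the (N+1)- and N-particle ground states and evaluate its inverse participation ratio:

    import numpy as np

    def added_particle_ipr(rho_n_plus_1, rho_n):
        # Site-resolved ground-state charge densities for N+1 and N particles.
        delta = rho_n_plus_1 - rho_n       # charge distribution of the added particle
        delta = np.clip(delta, 0.0, None)  # guard against small negative differences
        delta = delta / delta.sum()        # normalize to one added particle
        # IPR is ~ 1/(number of sites) for a delocalized particle and -> 1 if localized
        return float(np.sum(delta ** 2))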
Effect of using reporting guidelines during peer review on quality of final manuscripts submitted to a biomedical journal: masked randomised trial
Objective: To investigate the effect of an additional review based on reporting guidelines such as STROBE and CONSORT on the quality of manuscripts
Polyfunctional antibodies: a path towards precision vaccines for vulnerable populations
Vaccine efficacy determined within the controlled environment of a clinical trial is usually substantially greater than real-world vaccine effectiveness. Typically, this results from reduced protection of immunologically vulnerable populations, such as children, elderly individuals and people with chronic comorbidities. Consequently, these high-risk groups are frequently recommended tailored immunisation schedules to boost responses. In addition, diverse groups of healthy adults may also be variably protected by the same vaccine regimen. Current population-based vaccination strategies that consider basic clinical parameters offer a glimpse into what may be achievable if more nuanced aspects of the immune response are considered in vaccine design. To date, vaccine development has been largely empirical. However, next-generation approaches require more rational strategies. We foresee a generation of precision vaccines that consider the mechanistic basis of vaccine response variations associated with both immunogenetic and baseline health differences. Recent efforts have highlighted the importance of balanced and diverse extra-neutralising antibody functions for vaccine-induced protection. However, in immunologically vulnerable populations, significant modulation of polyfunctional antibody responses that mediate both neutralisation and effector functions has been observed. Here, we review the current understanding of key genetic and inflammatory modulators of antibody polyfunctionality that affect vaccination outcomes and consider how this knowledge may be harnessed to tailor vaccine design for improved public health
Operations of and Future Plans for the Pierre Auger Observatory
Technical reports on operations and features of the Pierre Auger Observatory,
including ongoing and planned enhancements and the status of the future
northern hemisphere portion of the Observatory. Contributions to the 31st
International Cosmic Ray Conference, Lodz, Poland, July 2009. Comment: Contributions to the 31st ICRC, Lodz, Poland, July 2009
Measurement of the Depth of Maximum of Extensive Air Showers above 10^18 eV
We describe the measurement of the depth of maximum, Xmax, of the
longitudinal development of air showers induced by cosmic rays. Almost four
thousand events above 10^18 eV observed by the fluorescence detector of the
Pierre Auger Observatory in coincidence with at least one surface detector
station are selected for the analysis. The average shower maximum was found to
evolve with energy at a rate of (106 +35/-21) g/cm^2/decade below 10^(18.24 +/-
0.05) eV and (24 +/- 3) g/cm^2/decade above this energy. The measured
shower-to-shower fluctuations decrease from about 55 to 26 g/cm^2. The
interpretation of these results in terms of the cosmic ray mass composition is
briefly discussed. Comment: Accepted for publication by PR
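Stated as a formula (our paraphrase of the numbers quoted above, not an equation taken from the paper), the quoted elongation rates amount to a broken-line description of the mean depth of maximum,

    \langle X_\mathrm{max}\rangle(E) \simeq \langle X_\mathrm{max}\rangle(E_0) + D \, \log_{10}(E/E_0),

with D about 106 (+35/-21) g/cm^2 per decade below E_0 = 10^(18.24 +/- 0.05) eV and about (24 +/- 3) g/cm^2 per decade above it.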
Assessing long-term tephra fallout hazard in southern Italy from Neapolitan volcanoes
Nowadays, modeling of tephra fallout hazard is coupled with probabilistic analysis that takes into account the natural variability of the volcanic phenomena in terms of eruption probability, eruption sizes, vent position, and meteorological conditions. In this framework, we present a prototypal methodology to carry out the long-term tephra fallout hazard assessment in southern Italy from the active Neapolitan volcanoes: Somma–Vesuvius, Campi Flegrei, and Ischia.
The FALL3D model (v.8.0) has been used to run thousands of numerical simulations (1500 per eruption size class), considering the ECMWF ERA5 meteorological dataset over the last 30 years. The output in terms of tephra ground load has been processed within a new workflow for large-scale, high-resolution volcanic hazard assessment, relying on a Bayesian procedure, in order to provide the mean annual frequency with which the tephra load at the ground exceeds given critical thresholds at a target site within a 50-year exposure time. Our results are expressed in terms of absolute mean hazard maps considering different levels of aggregation, from the impact of each volcanic source and eruption size class to the quantification of the total hazard. This work provides, for the first time, a multi-volcano probabilistic assessment of the hazard posed by tephra fallout, comparable with those used for seismic phenomena and other natural disasters. This methodology can be applied to other volcanic areas or over different exposure times, allowing researchers to account for the eruptive history of the target volcanoes, which, when available, could include the occurrence of less frequent large eruptions, representing critical elements for risk evaluations.
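For clarity (a standard homogeneous-Poisson relation stated here by us, not quoted from the paper): if \Lambda denotes the mean annual frequency with which the tephra load exceeds a critical threshold at a site, the probability of at least one exceedance within a T = 50-year exposure time is

    P_T = 1 - e^{-\Lambda T},

where \Lambda aggregates the contributions of the individual volcanic sources and eruption size classes weighted by their estimated annual rates.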