    Equilibrium Statistical Mechanics of Fermion Lattice Systems

    We study the equilibrium statistical mechanics of Fermion lattice systems, which require a treatment different from that of spin lattice systems owing to the non-commutativity of local algebras for disjoint regions. Our major result is the equivalence of the KMS condition and the variational principle under a minimal assumption on the dynamics and without any explicit assumption on the potential. The result also holds for spin lattice systems, yielding a vast improvement over known results. All formulations are in terms of a C*-dynamical system for the Fermion (CAR) algebra, with all or a part of the following assumptions: (I) the interaction is even with respect to the Fermion number (automatically satisfied when (IV) below is assumed); (II) all strictly local elements of the algebra have a first time derivative; (III) the time derivatives in (II) determine the dynamics; (IV) the interaction is lattice-translation invariant. A major technical tool is the conditional expectation from the total algebra onto the local subalgebra for any finite subset of the lattice, which induces a system of commuting squares. This technique overcomes the lack of a tensor product structure for Fermion systems and even simplifies many known arguments for spin lattice systems. Comment: 103 pages, no figures. Section 13 has become simpler and a problem in 14.1 is settled thanks to a referee; the format has been revised according to the suggestions of this and the other referee
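    The KMS condition itself is easy to state concretely in finite dimensions. The following is an illustrative sketch (a finite matrix algebra, not the C*-algebraic setting of the paper): for a Gibbs state ω = Tr(ρ ·) with ρ = e^{-βH}/Z, the Heisenberg dynamics continued to imaginary time t = iβ satisfies the KMS boundary condition ω(A α_{iβ}(B)) = ω(B A) for arbitrary observables A, B.

```python
import numpy as np
from scipy.linalg import expm

rng = np.random.default_rng(0)

def rand_matrix(n):
    return rng.standard_normal((n, n)) + 1j * rng.standard_normal((n, n))

n, beta = 4, 0.7
H = rand_matrix(n)
H = (H + H.conj().T) / 2                 # random Hermitian "Hamiltonian"
A, B = rand_matrix(n), rand_matrix(n)    # arbitrary observables

rho = expm(-beta * H)
rho /= np.trace(rho)                     # Gibbs state at inverse temperature beta
omega = lambda X: np.trace(rho @ X)

# Heisenberg evolution alpha_t(B) = e^{itH} B e^{-itH}, continued to t = i*beta
B_ibeta = expm(-beta * H) @ B @ expm(beta * H)

lhs = omega(A @ B_ibeta)   # omega(A alpha_{i beta}(B))
rhs = omega(B @ A)         # omega(B A)
print(abs(lhs - rhs))      # ~0 : the KMS boundary condition holds
```

    In infinite volume no density matrix exists, which is why the paper works with the KMS condition directly; the finite-dimensional identity above is the template it generalizes.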

    Magnetic flares in the protoplanetary nebula and the origin of meteorite chondrules

    This study proposes and analyzes a model for the chondrule-forming heating events based on magnetohydrodynamic flares in the corona of the protoplanetary nebula, which precipitate energy in the form of energetic plasma along magnetic field lines down toward the face of the nebula. It is found that flare energy release rates sufficient to melt the prechondrular matter, leading to the formation of chondrules, can occur in the tenuous corona of a protostellar disk. Energy release rates sufficient to achieve melting require that the ambient magnetic field strength be in the range inferred independently from meteorite remanent magnetization studies.
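    For scale, the energy needed to melt a single chondrule follows from basic calorimetry. A back-of-the-envelope sketch, in which every material parameter (size, density, specific heat, temperature rise, latent heat) is a rough assumed value rather than a figure from the study:

```python
import math

# assumed order-of-magnitude properties of a millimetre-scale silicate precursor
radius = 0.5e-3   # m, chondrule radius
rho    = 3300.0   # kg/m^3, silicate density
cp     = 1200.0   # J/(kg K), specific heat
dT     = 1500.0   # K, heating from ~300 K to a ~1800 K melting point
L_fus  = 4.0e5    # J/kg, latent heat of fusion

mass = rho * (4.0 / 3.0) * math.pi * radius**3
E_melt = mass * (cp * dT + L_fus)      # sensible heat + latent heat
print(f"mass ~ {mass * 1e6:.2f} mg, melt energy ~ {E_melt:.1f} J")
```

    A few joules per particle is the target any proposed heating mechanism, flares included, must deliver on a timescale short enough to outrun radiative cooling.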

    Multiresolution approximation of the vector fields on T^3

    Multiresolution approximation (MRA) of vector fields on T^3 is studied. We introduce in Fourier space a triad of vector fields called helical vectors, which are derived from the spherical coordinate system basis. Using the helical vectors, we prove an orthogonal decomposition of L^2(T^3) that is a synthesis of the Hodge decomposition of differential 1- or 2-forms on T^3 and the Beltrami decomposition, which decomposes the space of solenoidal vector fields into the eigenspaces of the curl operator. In the course of the proof, a general procedure for constructing a divergence-free orthonormal complete basis from a basis of the scalar function space is presented. Applying this procedure to an MRA of L^2(T^3), we discuss the MRA of vector fields on T^3 and the analyticity and regularity of vector wavelets. It is conjectured that a solenoidal wavelet basis must break the r-regular condition, i.e. some wavelet functions cannot be rapidly decreasing, because of the inevitable singularities of the helical vectors. The localization property and spatial structure of solenoidal wavelets derived from a Littlewood-Paley type MRA (Meyer's wavelet) are also investigated numerically. Comment: LaTeX, 33 pages, 3 figures. Submitted to J. Math. Phys.
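    The defining properties of helical vectors can be checked numerically. In this sketch (our own choice of orthonormal frame; the paper's precise conventions may differ), the helical vectors h± attached to a wavevector k satisfy k·h± = 0, so the modes h± e^{ik·x} are divergence-free, and ik × h± = ±|k| h±, so they are curl eigenvectors:

```python
import numpy as np

def helical_vectors(k):
    """Helical unit vectors h+, h- for a nonzero wavevector k."""
    k = np.asarray(k, dtype=float)
    khat = k / np.linalg.norm(k)
    # pick any fixed axis not (nearly) parallel to khat ...
    a = np.array([0.0, 1.0, 0.0]) if abs(khat[0]) > 0.9 else np.array([1.0, 0.0, 0.0])
    # ... and build a right-handed orthonormal frame (e1, e2, khat)
    e1 = np.cross(khat, a)
    e1 /= np.linalg.norm(e1)
    e2 = np.cross(khat, e1)
    hp = (e1 + 1j * e2) / np.sqrt(2)
    hm = (e1 - 1j * e2) / np.sqrt(2)
    return hp, hm

k = np.array([1.0, 2.0, 3.0])
hp, hm = helical_vectors(k)
curl_p = np.cross(1j * k, hp)          # Fourier symbol of curl: ik x .
print(np.abs(k @ hp))                                    # ~0 : divergence-free
print(np.linalg.norm(curl_p - np.linalg.norm(k) * hp))   # ~0 : curl eigenvector, +|k|
```

    The frame (e1, e2) is singular where khat aligns with the chosen axis; these unavoidable singularities on the sphere of directions are the source of the conjectured loss of regularity mentioned above.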

    Bayesian value-of-information analysis: an application to a policy model of Alzheimer's disease

    A framework is presented that distinguishes the conceptually separate decision of which treatment strategy is optimal from the question of whether more information is required to inform this choice in the future. The authors argue that the choice of treatment strategy should be based on expected utility, and that the only valid reason to characterize the uncertainty surrounding outcomes of interest is to establish the value of acquiring additional information. A Bayesian decision-theoretic approach is demonstrated through a probabilistic analysis of a published policy model of Alzheimer's disease. The expected value of perfect information is estimated for the decision to adopt a new pharmaceutical for the population of patients with Alzheimer's disease in the United States. This provides an upper bound on the value of additional research. The value of information is also estimated for each of the model inputs. This analysis can focus future research by identifying those parameters where more precise estimates would be most valuable and by indicating whether an experimental design would be required. We also discuss how this type of analysis can be used to design experimental research efficiently (identifying the optimal sample size and sample allocation) based on the marginal cost and marginal benefit of sample information. Value-of-information analysis can provide a measure of the expected payoff from proposed research, which can be used to set priorities in research and development. It can also inform an efficient regulatory framework for new healthcare technologies: an analysis of the value of information would define when a claim for a new technology should be deemed substantiated and when evidence should be considered competent and reliable, namely when it is not cost-effective to gather any more information.
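    The core expected-value-of-perfect-information (EVPI) calculation is simple to sketch by Monte Carlo. In this hypothetical two-strategy example (all distributions and the willingness-to-pay threshold are invented for illustration and have nothing to do with the Alzheimer's model), EVPI is the expected net benefit of choosing after uncertainty is resolved minus the expected net benefit of choosing now:

```python
import numpy as np

rng = np.random.default_rng(42)
n = 100_000   # Monte Carlo samples from the joint parameter distribution

# hypothetical uncertain inputs: effectiveness (QALYs) and cost per patient
eff_new  = rng.normal(1.10, 0.20, n)    # new drug
eff_old  = rng.normal(1.00, 0.10, n)    # usual care
cost_new = rng.normal(12_000, 2_000, n)
cost_old = rng.normal(5_000, 1_000, n)
wtp = 50_000                            # willingness to pay per QALY

nb = np.column_stack([wtp * eff_old - cost_old,    # net benefit, usual care
                      wtp * eff_new - cost_new])   # net benefit, new drug

# decide now: adopt the strategy with the highest *expected* net benefit ...
ev_current_info = nb.mean(axis=0).max()
# ... versus deciding once uncertainty is resolved (perfect information)
ev_perfect_info = nb.max(axis=1).mean()

evpi_per_patient = ev_perfect_info - ev_current_info
print(f"EVPI per patient: {evpi_per_patient:.0f}")
```

    Multiplying the per-patient EVPI by the effective (discounted) patient population gives the population EVPI, the upper bound on what any further research on these parameters could be worth.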

    Edge modes in band topological insulators

    We characterize gapless edge modes in translation invariant topological insulators. We show that the edge mode spectrum is a continuous deformation of the spectrum of a certain gluing function defining the occupied state bundle over the Brillouin zone (BZ). Topologically non-trivial gluing functions, corresponding to non-trivial bundles, then yield edge modes exhibiting spectral flow. We illustrate our results for the case of chiral edge states in two-dimensional Chern insulators, as well as helical edges in quantum spin Hall states. Comment: 4 pages, 2 figures. v4: minor changes
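    The spectral-flow statement can be checked in a concrete lattice model. The sketch below uses the standard two-band Qi-Wu-Zhang (QWZ) Chern insulator (our choice of illustration, not a model taken from the paper) on a strip that is periodic in x and open in y: in the topological phase the minimum of |E| over kx is essentially zero because a chiral edge mode flows across the gap, while in the trivial phase a finite gap survives.

```python
import numpy as np

sx = np.array([[0, 1], [1, 0]], dtype=complex)
sy = np.array([[0, -1j], [1j, 0]])
sz = np.array([[1, 0], [0, -1]], dtype=complex)

def strip_spectrum(m, kx, Ny=30):
    """Energies of the QWZ model h(k) = sin kx sx + sin ky sy + (m + cos kx + cos ky) sz
    on a strip: momentum kx along x, Ny open lattice sites along y."""
    onsite = np.sin(kx) * sx + (m + np.cos(kx)) * sz
    hop = (sz + 1j * sy) / 2                 # y -> y+1 hopping block
    H = np.zeros((2 * Ny, 2 * Ny), dtype=complex)
    for y in range(Ny):
        H[2*y:2*y+2, 2*y:2*y+2] = onsite
        if y + 1 < Ny:
            H[2*(y+1):2*(y+1)+2, 2*y:2*y+2] = hop
            H[2*y:2*y+2, 2*(y+1):2*(y+1)+2] = hop.conj().T
    return np.linalg.eigvalsh(H)

ks = np.linspace(-np.pi, np.pi, 201)
gap = lambda m: min(np.abs(strip_spectrum(m, k)).min() for k in ks)

g_top, g_triv = gap(-1.0), gap(-3.0)
print(f"min|E|, topological m=-1: {g_top:.3f}")   # ~0 : chiral edge mode crosses the gap
print(f"min|E|, trivial     m=-3: {g_triv:.3f}")  # finite gap, no spectral flow
```

    The QWZ model has Chern number ±1 for 0 < |m| < 2 and is trivial for |m| > 2, which is why m = -1 and m = -3 are chosen as the two representatives.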

    Estimating terrestrial uranium and thorium by antineutrino flux measurements

    Uranium and thorium within the Earth produce a major portion of terrestrial heat, along with a measurable flux of electron antineutrinos. These elements are key components in geophysical and geochemical models: their quantity and distribution drive the dynamics, define the thermal history, and are a consequence of the differentiation of the Earth. Knowledge of uranium and thorium concentrations in geological reservoirs relies largely on geochemical model calculations. This report describes the methods and criteria for experimentally determining average concentrations of uranium and thorium in the continental crust and in the mantle using site-specific measurements of the terrestrial antineutrino flux. Optimal, model-independent determinations involve significant exposures of antineutrino detectors remote from nuclear reactors, at both a mid-continental and a mid-oceanic site. This would require major new antineutrino detection projects, whose results could yield a greatly improved understanding of the deep interior of the Earth. Comment: 15 pages, 2 figures
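    The link between concentrations and heat is simple arithmetic. In the order-of-magnitude sketch below, the heat-production constants are standard textbook values for natural uranium and thorium, but the mass and the ~20 ppb U / ~80 ppb Th concentrations are illustrative bulk-silicate-Earth model inputs, not results from this report:

```python
# radiogenic heat from U and Th -- illustrative, model-dependent numbers
H_U   = 9.81e-5    # W/kg, heat production of natural uranium
H_TH  = 2.64e-5    # W/kg, heat production of natural thorium
M_BSE = 4.0e24     # kg, approximate mass of the bulk silicate Earth

c_U, c_Th = 20e-9, 80e-9   # ~20 ppb U, ~80 ppb Th (typical model values)

P_U  = M_BSE * c_U  * H_U
P_Th = M_BSE * c_Th * H_TH
print(f"U : {P_U / 1e12:.1f} TW")
print(f"Th: {P_Th / 1e12:.1f} TW")   # U + Th together ~16 TW of radiogenic heat
```

    Because the antineutrino flux from each element scales with the same concentration-times-mass products, measuring the flux constrains exactly the terms in this calculation.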

    Evolution of Non-linear Fluctuations in Preheating after Inflation

    We investigate the evolution of non-linear long-wavelength fluctuations during preheating after inflation. Using the separate universe approach, the temporal evolution of the power spectrum of the scalar fields and the curvature variable is obtained numerically. We find that the amplitude of the large-scale fluctuations is suppressed after non-linear evolution during preheating. Comment: To be published in Class. Quantum Grav.
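    The instability that drives preheating in the first place is parametric resonance: field modes obey a Mathieu-type equation and grow exponentially inside resonance bands. The toy sketch below (parameters chosen by us to sit inside the first instability band; it illustrates the linear mechanism only, not the paper's non-linear separate-universe computation) integrates one such mode:

```python
import numpy as np
from scipy.integrate import solve_ivp

# Mathieu-type mode equation  chi'' + (a - 2 q cos 2t) chi = 0 :
# for (a, q) inside a resonance band the mode grows exponentially,
# mimicking explosive particle production during preheating.
a, q = 1.0, 0.5   # inside the first instability band (roughly |a - 1| < q)

def rhs(t, y):
    chi, dchi = y
    return [dchi, -(a - 2 * q * np.cos(2 * t)) * chi]

sol = solve_ivp(rhs, (0, 50), [1.0, 0.0], rtol=1e-9, atol=1e-12)
growth = np.abs(sol.y[0, -1])
print(f"|chi(50)| / |chi(0)| ~ {growth:.3g}")   # exponential amplification
```

    Once the amplified fluctuations become comparable to the background, this linear picture fails, which is the regime the separate universe approach is designed to follow.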

    Instantons in N=1/2 Super Yang-Mills Theory via Deformed Super ADHM Construction

    We study an extension of the ADHM construction that gives deformed anti-self-dual (ASD) instantons in N=1/2 super Yang-Mills theory with U(n) gauge group. First, we extend the exterior algebra on superspace to non(anti)commutative superspace and show that N=1/2 super Yang-Mills theory can be reformulated in a geometrical way. Using this exterior algebra, we formulate a non(anti)commutative version of the super ADHM construction and show that the curvature two-form superfields obtained by our construction satisfy the deformed ASD equations, thus establishing the deformed super ADHM construction. We also show that the known deformed U(2) one-instanton solution is obtained by this construction. Comment: 32 pages, LaTeX. v2: typos corrected, references added

    The Measurement Process in Local Quantum Theory and the EPR Paradox

    We describe in a qualitative way a possible picture of the measurement process in Quantum Mechanics which takes into account: 1. the finite, nonzero duration T of the interaction between the observed system and the microscopic part of the measurement apparatus; 2. the finite spatial size R of that apparatus; 3. the fact that the macroscopic part of the measurement apparatus, whose role is to amplify the effect of that interaction to a macroscopic scale, is composed of a very large but finite number N of particles. The conventional picture of measurement, as an instantaneous action turning a pure state into a mixture, arises only in the limit in which N and R tend to infinity and T tends to 0. We sketch a proposed scheme, which still ought to be made mathematically precise in order to analyse its implications and to test it in specific models, and we argue that in Quantum Field Theory this picture should apply to the unique time evolution expressing the dynamics of a given theory and should comply with the Principle of Locality. We comment on the Einstein-Podolsky-Rosen thought experiment (partly modifying the discussion on this point in an earlier version of this note), reformulated here only in terms of local observables (rather than global ones, such as one-particle or polarisation observables). The local picture of the measurement process helps to make it clear that there is no conflict with the Principle of Locality. Comment: 18 pages