A simple parameter-free one-center model potential for an effective one-electron description of molecular hydrogen
For the description of the H2 molecule, an effective one-electron model
potential is proposed which is fully determined by the exact ionization
potential of the H2 molecule. To test the model potential and examine its
properties, it is employed to determine excitation energies, transition
moments, and oscillator strengths over a range of internuclear distances, 0.8
< R < 2.5 a.u. In addition, it is used as a description of an H2 target in
calculations of the cross sections for photoionization and for partial
excitation in collisions with singly charged ions. Comparison of the results
obtained with the model potential against literature data for H2 molecules
yields good agreement and therefore encourages extended use of the potential
in various other applications, or as a means of assessing the importance of
two-electron and anisotropy effects.
Comment: 8 pages, 6 figures
Equations of state of elements based on the generalized Fermi-Thomas theory
The Fermi-Thomas model has been used to derive the equation of state of matter at high pressures and at various temperatures. Calculations have been carried out both without and with the exchange terms. Discussion of similarity transformations leads to the virial theorem and to correlation of solutions for different Z values.
Electric Current Perturbation Calculations for Half-Penny Cracks
The electric current perturbation (ECP) method [1-4] consists of inducing or injecting an electric current flow in the material to be examined and then detecting localized perturbations of the magnetic flux associated with current flow around material defects such as cracks or inclusions. Empirically, ECP data have shown strong correlations between certain signal features and crack size characteristics, and the method thus promises to be useful for quantitative NDE. To aid in the further development of the method, the objectives of the work reported in this paper are (1) to develop a mathematical model of the ECP flux distribution for a half-penny crack, (2) to determine the degree of validity of the model through comparisons with experimental data, and (3) to develop a detailed theory of sizing relationships for half-penny cracks.
Space-based geoengineering: challenges and requirements
The prospect of engineering the Earth's climate (geoengineering) raises a multitude of issues associated with climatology, engineering on macroscopic scales, and indeed the ethics of such ventures. Depending on personal views, such large-scale engineering is either an obvious necessity for the deep future, or yet another example of human conceit. In this article a simple climate model will be used to estimate requirements for engineering the Earth's climate, principally using space-based geoengineering. Active cooling of the climate to mitigate anthropogenic climate change due to a doubling of the carbon dioxide concentration in the Earth's atmosphere is considered. This representative scenario will allow the scale of the engineering challenge to be determined. It will be argued that simple occulting discs at the interior Lagrange point may represent a less complex solution than concepts for highly engineered refracting discs proposed recently. While engineering on macroscopic scales can appear formidable, emerging capabilities may allow such ventures to be seriously considered in the long term. This article is not an exhaustive review of geoengineering, but aims to provide a foretaste of the future opportunities, challenges, and requirements for space-based geoengineering ventures
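The scale of the shading task can be illustrated with a standard back-of-envelope estimate (not taken from the article; the numerical values for the solar constant, planetary albedo, and CO2-doubling forcing are assumptions): the occulting discs must reduce incoming sunlight by a fraction f such that the absorbed-flux reduction f(S/4)(1-albedo) offsets the radiative forcing of doubled CO2.

```python
# Back-of-envelope estimate of the sunlight fraction an occulting disc at
# the interior Lagrange point would need to block to offset doubled CO2.
# All numerical values below are assumed, commonly quoted figures.
S = 1361.0        # solar constant, W/m^2 (assumed)
ALBEDO = 0.3      # planetary albedo (assumed)
F_2XCO2 = 3.7     # radiative forcing of CO2 doubling, W/m^2 (assumed)

def shade_fraction(forcing=F_2XCO2, s=S, albedo=ALBEDO):
    """Fractional reduction f of incoming sunlight such that
    f * (s / 4) * (1 - albedo) equals the forcing to be offset."""
    return 4.0 * forcing / (s * (1.0 - albedo))

print(f"Required shading fraction: {shade_fraction():.2%}")
```

With these assumed inputs the required reduction comes out to roughly 1.6% of total insolation, which conveys why even "simple" occulting discs imply structures of continental cross-section.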
Heat kernel of integrable billiards in a magnetic field
We present analytical methods to calculate the magnetic response of
non-interacting electrons confined to a domain with boundaries and subjected
to a uniform magnetic field. Two different methods of calculation are
considered: one involving the large-energy asymptotic expansion of the
resolvent (Stewartson-Waechter method), applicable to the case of separable
systems, and another based on the small-time asymptotic behaviour of the heat
kernel (Balian-Bloch method). The two methods agree with each other but
differ from the result obtained previously by Robnik. Finally, the Balian-Bloch
multiple scattering expansion is studied and the extension of our results to
other geometries is discussed.
Comment: 13 pages, RevTeX
Landscape quality and brownfield regeneration: a community investigation approach inspired by landscape preference studies
Renormalization: the observable-state model
The usual mathematical formalism of quantum field theory is non-rigorous
because it contains divergences that can only be renormalized by non-rigorous
mathematical methods. The purpose of this paper is to present a method for
subtracting these divergences using the formalism of decoherence. This is
achieved by replacing the standard renormalization method with a projection
onto a well-defined Hilbert subspace. In this way a list of problems of the
standard formalism disappears, while the physical results of QFT remain valid.
By its very nature, this formalism can also be used in non-renormalizable
theories.
Comment: 23 pages
Weyl’s gauge argument
The standard U(1) “gauge principle” or “gauge argument” produces an exact potential A=dλ and a vanishing field F=ddλ=0. Weyl has his own gauge argument, which is sketchy, archaic, and hard to follow; but at least it produces an inexact potential A and a nonvanishing field F=dA≠0. I attempt a reconstruction.
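The contrast turns on a basic identity of the exterior derivative; as a minimal sketch in standard exterior calculus (not drawn from Weyl's own derivation):

```latex
% Standard gauge argument: the potential is exact, so the field vanishes,
% because the exterior derivative satisfies d^2 = 0:
A = d\lambda \quad\Longrightarrow\quad F = dA = d(d\lambda) = 0.
% Weyl's argument instead yields an inexact potential A
% (A \neq d\lambda for any \lambda), so the field need not vanish:
F = dA \neq 0.
```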
Developmental changes in the balance of disparity, blur and looming/proximity cues to drive ocular alignment and focus
Accurate co-ordination of accommodation and convergence is necessary to view near objects and develop fine motor co-ordination. We used a remote haploscopic videorefraction paradigm to measure longitudinal changes in simultaneous ocular accommodation and vergence to targets at different depths, and to all combinations of blur, binocular disparity, and change-in-size (“proximity”) cues. Infants were followed longitudinally and compared to older children and young adults, with the prediction that sensitivity to different cues would change during development. Mean infant responses to the most naturalistic condition were similar to those of adults from 6-7 weeks (accommodation) and 8-9 weeks (vergence). Proximity cues influenced responses most in infants less than 14 weeks of age, but sensitivity declined thereafter. Between 12-28 weeks of age infants were equally responsive to all three cues, while in older children and adults manipulation of disparity resulted in the greatest changes in response. Despite rapid development of visual acuity (and thus increasing availability of blur cues), responses to blur were stable throughout development. Our results suggest that during much of infancy, vergence and accommodation responses are not dependent on the development of specific depth cues, but make use of any cues available to drive appropriate changes in response.