Orthogonality relations for triple modes at dielectric boundary surfaces
We work out the orthogonality relations for the set of Carniglia-Mandel
triple modes which provide a set of normal modes for the source-free
electromagnetic field in a background consisting of a passive dielectric
half-space and the vacuum. Due to the inherent computational
complexity of the problem, an efficient strategy is desirable; we present
one in this paper. Furthermore, we provide all main
steps for the various proofs pertaining to different combinations of triple
modes in the orthogonality integral.
Comment: 15 pages
Signatures of few-body resonances in finite volume
We study systems of bosons and fermions in finite periodic boxes and show how
the existence and properties of few-body resonances can be extracted from
studying the volume dependence of the calculated energy spectra. Using a
plane-wave-based discrete variable representation to conveniently implement
periodic boundary conditions, we establish that avoided level crossings occur
in the spectra of up to four particles and can be linked to the existence of
multi-body resonances. To benchmark our method we use two-body calculations,
where resonance properties can be determined with other methods, as well as a
three-boson model interaction known to generate a three-boson resonance state.
Finding good agreement for these cases, we then predict three-body and
four-body resonances for models using a shifted Gaussian potential. Our results
establish few-body finite-volume calculations as a new tool to study few-body
resonances. In particular, the approach can be used to study few-neutron
systems, where such states have been conjectured to exist.
Comment: 13 pages, 10 figures, 2 tables, published version
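The avoided-level-crossing signature described above can be illustrated with a toy two-level model (all parameter values here are invented for illustration, not taken from the paper): a discretized "scattering" level falls off as 1/L^2 with box size L, a "resonance" level stays roughly constant, and a small coupling g mixes them, producing an avoided crossing whose minimum gap is set by 2g.

```python
import numpy as np

def levels(L, E_res=2.0, c=50.0, g=0.15):
    """Eigenvalues of a toy two-level Hamiltonian in a box of size L."""
    E_scatt = c / L**2               # discretized continuum level in the box
    H = np.array([[E_scatt, g],
                  [g, E_res]])
    return np.linalg.eigvalsh(H)

Ls = np.linspace(3.0, 8.0, 200)
spec = np.array([levels(L) for L in Ls])
gaps = spec[:, 1] - spec[:, 0]

# The gap is sqrt((E_scatt - E_res)**2 + 4*g**2), so its minimum over L
# approaches 2*g = 0.3 at the crossing point (here near L = 5).
print(min(gaps))
```

Tracking where such minimum gaps occur as the volume is varied is, in essence, how the energy spectra are linked to resonance positions in the approach the abstract describes.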
Is a Trineutron Resonance Lower in Energy than a Tetraneutron Resonance?
We present quantum Monte Carlo calculations of few-neutron systems confined
in external potentials based on local chiral interactions at
next-to-next-to-leading order in chiral effective field theory. The energy and
radial densities for these systems are calculated in different external
Woods-Saxon potentials. We assume that their extrapolation to zero
external-potential depth provides a quantitative estimate of three- and
four-neutron resonances. The validity of this assumption is demonstrated by
benchmarking with an exact diagonalization in the two-body case. We find that
the extrapolated trineutron resonance energy, as well as the energy at shallow
well depths, is lower than the tetraneutron resonance energy. This suggests that a
three-neutron resonance exists below a four-neutron resonance in nature and is
potentially measurable. To confirm that the relative ordering of three- and
four-neutron resonances is not an artifact of the external confinement, we test
that the odd-even staggering in the helium isotopic chain is reproduced within
this approach. Finally, we discuss similarities between our results and
ultracold Fermi gases.
Comment: 6 pages, 5 figures, version compatible with published letter
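The extrapolation step described above can be sketched schematically (with synthetic numbers, not the paper's quantum Monte Carlo results): energies computed at several external-potential depths V0 are fit with a low-order polynomial and extrapolated to zero depth.

```python
import numpy as np

# Fake computed energies (MeV) at assumed Woods-Saxon well depths (MeV);
# both arrays are invented purely to illustrate the extrapolation.
V0 = np.array([2.0, 4.0, 6.0, 8.0])
E  = np.array([1.9, 0.9, -0.4, -1.8])

coeffs = np.polyfit(V0, E, deg=2)        # quadratic fit in the well depth
E_resonance = np.polyval(coeffs, 0.0)    # extrapolate to V0 -> 0
print(E_resonance)                       # 2.75 for this fake data
```

The benchmark mentioned in the abstract plays the role of validating exactly this step: in the two-body case the extrapolated value can be checked against an exact diagonalization.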
Few-body physics in effective field theory
Effective Field Theory (EFT) provides a powerful framework that exploits a
separation of scales in physical systems to perform systematically improvable,
model-independent calculations. Particularly interesting are few-body systems
with short-range interactions and large two-body scattering length. Such
systems display remarkable universal features. In systems with more than two
particles, a three-body force with limit cycle behavior is required for
consistent renormalization already at leading order. We will review this EFT
and some of its applications in the physics of cold atoms and nuclear physics.
In particular, we will discuss the possibility of an infrared limit cycle in
QCD. Recent extensions of the EFT approach to the four-body system and N-boson
droplets in two spatial dimensions will also be addressed.
Comment: 10 pages, 5 figures, Proceedings of the INT Workshop on "Nuclear
Forces and the Quantum Many-Body Problem", Oct. 200
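The limit-cycle behavior of the three-body force mentioned above is tied to Efimov's discrete scale invariance: successive three-body states are related by a universal scaling factor exp(pi/s0), where s0 ≈ 1.00624 for identical bosons (the standard value from the Efimov-effect literature). A one-line check:

```python
import math

s0 = 1.00624                       # Efimov parameter for identical bosons
scale = math.exp(math.pi / s0)     # ratio of successive binding momenta
print(scale)                       # ~22.7; binding energies scale by scale**2
```

The famous factor of ~22.7 (and ~515 in energy) is what makes the geometric tower of Efimov states, and the associated renormalization-group limit cycle, experimentally and numerically recognizable.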
DEVELOPMENT OF A STOCHASTIC MODEL TO EVALUATE PLANT GROWERS' ENTERPRISE BUDGETS
Increased domestic concentration and international competition in the floricultural industry are forcing growers to improve resource management efficiency. Cost management and cost accounting methods are becoming key tools as growers attempt to reduce costs. These tools allow growers to allocate costs for each crop, increasing their greenhouse planning abilities. Growers face a relatively high degree of risk due to potential crop and market failure, and individual growers have different tolerances for risk and different risk-bearing capacities. Growers need a cost accounting system that incorporates production and market risk, a system that allows them to make informed business decisions. The research reported in this paper developed a greenhouse budgeting model that incorporates risk to allow growers to compare production costs for flowers with different genetics and production technologies. This enables greenhouse growers to make production management decisions that account for production and market risk. The model gives growers the option of inputting their own production data to evaluate how various yield and price assumptions influence income and expense projections and, ultimately, profit. The model allows growers to compare total production cost and revenue while varying grower type, production time, geographical location, operation size, and cost structure. The model evaluates budgets for growers who market through mass-market retail operations or through wholesale intermediaries who sell to merchandisers or flower shops. The model was demonstrated with sample data to illustrate how incorporating risk analysis into a grower's greenhouse budget model affects resource allocation and production decisions as compared to a budget model that does not incorporate risk. Deterministic and stochastic models were used to demonstrate differences in production decisions under various assumptions; the stochastic model introduced variability in prices and flowering characteristics.
The @Risk software was used to generate the random-number simulation for the stochastic model, and stochastic dominance analysis was used to rank the alternatives. The results for both the deterministic and stochastic models identified the same cultivar as most profitable. However, there were differences in crop profit levels and in the rankings of subsequent cultivars that could influence growers' production choices. The grower's risk aversion level influenced his/her choice of the most profitable cultivars in the stochastic model. The model summarizes the sources of variability that affect cost and revenue and enables the grower to measure the effects that changes in productivity might have on profit. Growers can identify items in their budget that have a greater effect on profitability and make adjustments. The model can also be used to allocate cost across activities, so the grower can measure the economic impact of an item on the budget.
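The stochastic-budget idea described above can be sketched with plain NumPy sampling in place of the @Risk add-in. The cultivar names, price and yield distributions, and cost figures below are invented for illustration only:

```python
import numpy as np

rng = np.random.default_rng(42)
N = 10_000                          # Monte Carlo draws per cultivar

cultivars = {
    # name: (mean price $/stem, price sd, mean marketable stems, stems sd)
    "cultivar_A": (0.55, 0.08, 9000, 600),
    "cultivar_B": (0.60, 0.12, 8500, 900),
}
FIXED_COST = 3200.0                 # assumed per-crop overhead, $
VAR_COST = 0.18                     # assumed variable cost per stem, $

results = {}
for name, (p_mu, p_sd, y_mu, y_sd) in cultivars.items():
    price = rng.normal(p_mu, p_sd, N)     # simulated market price
    stems = rng.normal(y_mu, y_sd, N)     # simulated marketable yield
    profit = (price - VAR_COST) * stems - FIXED_COST
    results[name] = profit
    print(f"{name}: mean ${profit.mean():,.0f}, sd ${profit.std():,.0f}")
```

With these made-up numbers cultivar_B has the higher expected profit but also the wider spread, so a sufficiently risk-averse grower ranking the profit distributions by stochastic dominance may still prefer cultivar_A; that is exactly the kind of risk-driven reversal the abstract describes.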
Galaxy disks do not need to survive in the ΛCDM paradigm: the galaxy merger rate out to z~1.5 from morpho-kinematic data
About two-thirds of present-day, large galaxies are spirals such as the Milky
Way or Andromeda, but the way their thin rotating disks formed remains
uncertain. Observations have revealed that half of their progenitors, six
billion years ago, had peculiar morphologies and/or kinematics, which exclude
them from the Hubble sequence. Major mergers, i.e., fusions between galaxies of
similar mass, are found to be the likeliest driver for such strong
peculiarities. However, thin disks are fragile and easily destroyed by such
violent collisions, which creates a critical tension between the observed
fraction of thin disks and their survival within the ΛCDM paradigm. Here we
show that the observed high occurrence of mergers amongst their progenitors is
only apparent and is resolved when using morpho-kinematic observations which
are sensitive to all the phases of the merging process. This provides an
original way of narrowing down observational estimates of the galaxy merger
rate and leads to a perfect match with predictions by state-of-the-art ΛCDM
semi-empirical models with no particular fine-tuning needed. These results
imply that half of local thin disks do not survive but are actually rebuilt
after a gas-rich major merger occurring in the past nine billion years, i.e.,
two-thirds of the lifetime of the Universe. This emphasizes the need to study
how thin disks can form in halos with a more active merger history than
previously considered, and to investigate what is the origin of the gas
reservoir from which local disks would reform.
Comment: 19 pages, 7 figures, 2 tables. Accepted in ApJ. V2 to match proof
corrections and added references
Generalized Swiss-cheese cosmologies: Mass scales
We generalize the Swiss-cheese cosmologies so as to include nonzero linear
momenta of the associated boundary surfaces. The evolution of mass scales in
these generalized cosmologies is studied for a variety of models for the
background without having to specify any details within the local
inhomogeneities. We find that the final effective gravitational mass and size
of the evolving inhomogeneities depend on their linear momenta, but these
properties are essentially unaffected by the details of the background model.
Comment: 10 pages, 14 figures, 1 table, revtex4, Published form (with minor
corrections)
Symbolic Manipulators Affect Mathematical Mindsets
Symbolic calculators like Mathematica are becoming more commonplace among
upper level physics students. The presence of such a powerful calculator can
couple strongly to the type of mathematical reasoning students employ. It does
not merely offer a convenient way to perform the computations students would
have otherwise wanted to do by hand. This paper presents examples from the work
of upper level physics majors where Mathematica plays an active role in
focusing and sustaining their thought around calculation. These students still
engage in powerful mathematical reasoning while they calculate but struggle
because of the narrowed breadth of their thinking. Their reasoning is drawn
into local attractors where they look to calculation schemes to resolve
questions instead of, for example, mapping the mathematics to the physical
system at hand. We model the influence of Mathematica as an integral part of
the constant feedback that occurs in how students frame, and hence focus, their
work.
Beyond deficit-based models of learners' cognition: Interpreting engineering students' difficulties with sense-making in terms of fine-grained epistemological and conceptual dynamics
Researchers have argued against deficit-based explanations of students'
troubles with mathematical sense-making, pointing instead to factors such as
epistemology: students' beliefs about knowledge and learning can hinder them
from activating and integrating productive knowledge they have. In this case
study of an engineering major solving problems (about content from his
introductory physics course) during a clinical interview, we show that "Jim"
has all the mathematical and conceptual knowledge he would need to solve a
hydrostatic pressure problem that we posed to him. But he reaches and sticks
with an incorrect answer that violates common sense. We argue that his lack of
mathematical sense-making, specifically translating and reconciling between
mathematical and everyday/common-sense reasoning, stems in part from his
epistemological views, i.e., his views about the nature of knowledge and
learning. He regards mathematical equations as much more trustworthy than
everyday reasoning, and he does not view mathematical equations as expressing
meaning that tractably connects to common sense. For these reasons, he does not
view reconciling between common sense and mathematical formalism as either
necessary or plausible to accomplish. We, however, avoid a potential "deficit
trap" (substituting an epistemological deficit for a concepts/skills deficit) by
incorporating multiple, context-dependent epistemological stances into Jim's
cognitive dynamics. We argue that Jim's epistemological stance contains
productive seeds that instructors could build upon to support Jim's
mathematical sense-making: He does see common sense as connected to formalism
(though not always tractably so), and in some circumstances this connection is
both salient and valued.
Comment: Submitted to the Journal of Engineering Education
Adlayer core-level shifts of random metal overlayers on transition-metal substrates
We calculate the difference of the ionization energies of a core electron of
a surface alloy, i.e., a B atom in an A_(1-x)B_x overlayer on an
fcc-B(001) substrate, and a core electron of the clean fcc-B(001) surface using
density functional theory. We analyze the initial-state contributions and the
screening effects induced by the core hole, and study the influence of the
alloy composition for a number of noble-metal/transition-metal systems. Data
are presented for Cu_(1-x)Pd_x/Pd(001), Ag_(1-x)Pd_x/Pd(001),
Pd_(1-x)Cu_x/Cu(001), and Pd_(1-x)Ag_x/Ag(001), varying x from 0 to 100%. Our
analysis clearly indicates the importance of final-state screening effects for
the interpretation of measured core-level shifts. Calculated deviations from
the initial-state trends are explained in terms of the change of inter- and
intra-atomic screening upon alloying. A possible role of alloying on the
chemical reactivity of metal surfaces is discussed.
Comment: 4 pages, 2 figures, Phys. Rev. Letters, to appear in Feb. 199