Why Don't Country Elevators Pay Less for Low Quality Wheat? Information, Producer Preferences and Prospect Theory
Previous research found that country elevators that are the first in their area to grade wheat and pay quality-adjusted prices would receive above-normal profits at the expense of their competitors. Because of spatial monopsony, these early-adopting elevators would pass on to producers only 70% of the quality-based price differentials received from next-in-line buyers. If competing elevators also adopted these practices, profits for all elevators would return to near normal, and elevators would pass on to producers nearly all price differentials received from next-in-line buyers. However, that research could not explain why more elevators were not becoming "early adopters" by paying quality-adjusted prices. More recent research found that producers' risk aversion and lack of information about the quality of their wheat could explain more of the failure of country elevators to pass on premiums and discounts. If producers are risk averse, an elevator that imposes discounts for lower quality wheat, even while paying a higher price for high quality wheat, risks losing business if producers believe that a competing elevator may be more likely to pay them a higher price net of discounts. However, even more important is the level of information producers have about the quality of their wheat before selling it to an elevator. Still, these explanations account for only part of elevators' apparent reluctance to pay quality-adjusted prices. Since inconsistencies have been observed between expected utility theory and individuals' observed behavior, this research considers the case where producers' preferences can be more appropriately modeled by prospect theory, and whether such preferences can explain more of elevators' reluctance to pay quality-adjusted prices.
A simulation model is used to measure the effects of risk-averse producers (in both expected utility and prospect theory frameworks) and limited quality information on the profits that can be earned by an elevator that pays quality-adjusted prices. Results indicate that prospect theory helps to explain part, but not all, of elevators' reluctance to pay quality-adjusted prices.
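The prospect-theory preferences referred to above are commonly modeled with the Tversky-Kahneman value function. A minimal sketch of that functional form follows; the parameter values are the standard 1992 estimates, not those used in this study:

```python
# Tversky-Kahneman (1992) value function: concave over gains,
# convex and loss-averse over losses relative to a reference point.
def prospect_value(x, alpha=0.88, beta=0.88, lam=2.25):
    """Subjective value of a gain/loss x relative to the reference (x=0)."""
    if x >= 0:
        return x ** alpha            # diminishing sensitivity to gains
    return -lam * (-x) ** beta       # losses loom larger by factor lam

# A $0.10/bu quality discount hurts more than an equal premium helps:
gain = prospect_value(0.10)
loss = prospect_value(-0.10)
```

Under such preferences the disutility of a quality discount outweighs the utility of an equal premium by roughly the loss-aversion factor `lam`, which is one mechanism by which discounts could drive producers away from an early-adopting elevator.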
Cosmic Shear Results from the Deep Lens Survey - II: Full Cosmological Parameter Constraints from Tomography
We present a tomographic cosmic shear study from the Deep Lens Survey (DLS),
which, with a limiting magnitude of r_{lim}~27 (5 sigma), is designed as a
precursor Large Synoptic Survey Telescope (LSST) survey with an emphasis on
depth. Using five tomographic redshift bins, we study their auto- and
cross-correlations to constrain cosmological parameters. We use a
luminosity-dependent nonlinear model to account for the astrophysical
systematics originating from intrinsic alignments of galaxy shapes. We find
that the cosmological leverage of the DLS is among the highest of existing
>10 sq. deg cosmic shear surveys. Combining the DLS tomography with the 9-year
results of the Wilkinson Microwave Anisotropy Probe (WMAP9) gives
Omega_m=0.293_{-0.014}^{+0.012}, sigma_8=0.833_{-0.018}^{+0.011},
H_0=68.6_{-1.2}^{+1.4} km/s/Mpc, and Omega_b=0.0475+/-0.0012 for LCDM, reducing
the uncertainties of the WMAP9-only constraints by ~50%. When we do not assume
flatness for LCDM, we obtain the curvature constraint
Omega_k=-0.010_{-0.015}^{+0.013} from the DLS+WMAP9 combination, which however
is not well constrained when WMAP9 is used alone. The dark energy equation of
state parameter w is tightly constrained when Baryonic Acoustic Oscillation
(BAO) data are added, yielding w=-1.02_{-0.09}^{+0.10} with the DLS+WMAP9+BAO
joint probe. The addition of supernova constraints further tightens the
parameter to w=-1.03+/-0.03. Our joint constraints are fully consistent with the
final Planck results and also the predictions of a LCDM universe.
Comment: Accepted for publication in Ap
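The quoted ~50% shrinkage of the WMAP9-only uncertainties is the usual effect of combining independent, approximately Gaussian constraints on the same parameter. A toy inverse-variance sketch with invented numbers (not the actual DLS or WMAP9 posteriors):

```python
# Inverse-variance combination of two independent Gaussian measurements
# of the same parameter (e.g. Omega_m from two probes).
def combine(mu1, sig1, mu2, sig2):
    w1, w2 = 1.0 / sig1 ** 2, 1.0 / sig2 ** 2   # inverse-variance weights
    mu = (w1 * mu1 + w2 * mu2) / (w1 + w2)      # weighted mean
    sig = (w1 + w2) ** -0.5                     # combined uncertainty
    return mu, sig

# Two probes of comparable precision shrink the error bar by ~1/sqrt(2):
mu, sig = combine(0.29, 0.03, 0.30, 0.03)
```

When the two probes break each other's parameter degeneracies, as weak lensing and the CMB do, the real improvement can exceed this independent-Gaussian baseline.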
Scaling Relations and Overabundance of Massive Clusters at z>~1 from Weak-Lensing Studies with HST
We present weak gravitational lensing analysis of 22 high-redshift (z >~1)
clusters based on Hubble Space Telescope images. Most clusters in our sample
provide significant lensing signals and are well detected in their
reconstructed two-dimensional mass maps. Combining the current results and our
previous weak-lensing studies of five other high-z clusters, we compare
gravitational lensing masses of these clusters with other observables. We
revisit the question whether the presence of the most massive clusters in our
sample is in tension with the current LambdaCDM structure formation paradigm.
We find that the lensing masses are tightly correlated with the gas
temperatures and establish, for the first time, the lensing mass-temperature
relation at z >~ 1. For the power law slope of the M-TX relation (M propto
T^{\alpha}), we obtain \alpha=1.54 +/- 0.23. This is consistent with the
theoretical self-similar prediction \alpha=3/2 and with the results previously
reported in the literature for much lower redshift samples. However, our
normalization is lower than the previous results by 20-30%, indicating that the
normalization in the M-TX relation might evolve. After correcting for Eddington
bias and updating the discovery area with a more conservative choice, we find
that the existence of the most massive clusters in our sample still provides a
tension with the current Lambda CDM model. The combined probability of finding
the four most massive clusters in this sample after marginalization over
current cosmological parameters is less than 1%.
Comment: ApJ in press. See http://www.supernova.lbl.gov for additional
information pertaining to the HST Cluster SN Survey
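The slope of a scaling relation such as M propto T^{\alpha} is typically measured by a linear fit in log-log space. A self-contained sketch on synthetic data (the exponent 1.5 used to generate the mock sample is the self-similar value, not a fit to the paper's clusters):

```python
import math
import random

random.seed(0)
# Mock clusters scattered around a self-similar relation M = A * T^1.5
# with lognormal scatter; units are arbitrary.
T = [2.0 + 8.0 * random.random() for _ in range(50)]
M = [3.0 * t ** 1.5 * math.exp(random.gauss(0.0, 0.1)) for t in T]

# Least-squares slope of log M versus log T recovers alpha.
x = [math.log(t) for t in T]
y = [math.log(m) for m in M]
n = len(x)
xbar, ybar = sum(x) / n, sum(y) / n
alpha = sum((xi - xbar) * (yi - ybar) for xi, yi in zip(x, y)) \
        / sum((xi - xbar) ** 2 for xi in x)
```

Real M-T_X fits must additionally handle measurement errors on both axes, intrinsic scatter, and selection effects (such as the Eddington bias corrected for in the abstract), which simple least squares ignores.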
Rings of Dark Matter in Collisions Between Clusters of Galaxies
Several lines of evidence suggest that the galaxy cluster Cl0024+17, an
apparently relaxed system, is actually a collision of two clusters, the
interaction occurring along our line of sight. Recent lensing observations
suggest the presence of a ring-like dark matter structure, which has been
interpreted as the result of such a collision. In this paper we present
N-body simulations of cluster collisions along the line of sight to
investigate the detectability of such features. We use realistic dark matter
density profiles as determined from cosmological simulations. Our simulations
show a "shoulder" in the dark matter distribution after the collision, but no
ring feature even when the initial particle velocity distribution is highly
tangentially anisotropic. Only when the initial
particle velocity distribution is circular do our simulations show such a
feature. Even modestly anisotropic velocity distributions are inconsistent with
the halo velocity distributions seen in cosmological simulations, and would
require highly fine-tuned initial conditions. Our investigation leaves us
without an explanation for the dark matter ring-like feature in Cl 0024+17
suggested by lensing observations.
Comment: 7 pages (emulateapj), 9 figures. Expanded figures and text to match
accepted version
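The degree of tangential anisotropy mentioned above is conventionally quantified by the Binney anisotropy parameter beta = 1 - sigma_t^2 / (2 sigma_r^2), which is 0 for isotropic orbits, approaches 1 for purely radial orbits, and goes to -infinity for purely circular ones. A minimal sketch of that convention (an illustration of the standard definition, not the paper's code):

```python
def anisotropy_beta(sigma_r, sigma_theta, sigma_phi):
    """Binney anisotropy parameter from the radial and the two
    tangential velocity dispersions: 0 isotropic, -> 1 radially
    biased, negative for tangentially biased orbits."""
    sigma_t2 = sigma_theta ** 2 + sigma_phi ** 2
    return 1.0 - sigma_t2 / (2.0 * sigma_r ** 2)

beta_iso = anisotropy_beta(1.0, 1.0, 1.0)   # isotropic orbits
beta_tan = anisotropy_beta(0.5, 1.0, 1.0)   # tangentially biased orbits
```

In this language, the abstract's finding is that only the circular-orbit limit (beta -> -infinity) produces a ring, while the moderately negative beta values seen in cosmological halos do not.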
Faster Born probability estimation via gate merging and frame optimisation
Outcome probability estimation via classical methods is an important task for validating quantum computing devices. Outcome probabilities of any quantum circuit can be estimated using Monte Carlo sampling, where the amount of negativity present in the circuit frame representation quantifies the overhead on the number of samples required to achieve a certain precision. In this paper, we propose two classical sub-routines: circuit gate merging and frame optimisation, which optimise the circuit representation to reduce the sampling overhead. We show that the runtimes of both sub-routines scale polynomially in circuit size and gate depth. Our methods are applicable to general circuits, regardless of generating gate sets, qudit dimensions and the chosen frame representations for the circuit components. We numerically demonstrate that our methods provide improved scaling in the negativity overhead for all tested cases of random circuits with Clifford+T and Haar-random gates, and that the performance of our methods compares favourably with prior quasi-probability simulators as the number of non-Clifford gates increases.
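The negativity overhead referred to above arises from the standard quasi-probability Monte Carlo estimator: write the target quantity as sum_i q_i * f_i with real (possibly negative) weights q_i, sample indices with probability |q_i| / ||q||_1, and rescale by the 1-norm; the sample count needed for fixed precision then grows as ||q||_1^2. A generic sketch of that estimator (not the paper's simulator):

```python
import random

def quasi_prob_estimate(q, f, n_samples, seed=1):
    """Unbiased Monte Carlo estimate of sum_i q[i]*f[i] for signed
    weights q. The estimator's variance grows with norm1**2, which is
    the 'negativity' sampling overhead."""
    rng = random.Random(seed)
    norm1 = sum(abs(qi) for qi in q)
    weights = [abs(qi) for qi in q]
    total = 0.0
    for _ in range(n_samples):
        # sample index i with probability |q_i| / ||q||_1
        i = rng.choices(range(len(q)), weights=weights)[0]
        sign = 1.0 if q[i] >= 0 else -1.0
        total += sign * norm1 * f[i]
    return total / n_samples

q = [0.5, -0.25, 0.75]   # signed quasi-probability weights, ||q||_1 = 1.5
f = [1.0, 2.0, 0.4]      # outcome values; exact sum_i q_i*f_i is 0.3
est = quasi_prob_estimate(q, f, 200000)
```

Gate merging and frame optimisation, as described in the abstract, act before this sampling step: both rewrite the circuit representation to shrink ||q||_1 and hence the number of samples required.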