Cosmological test using the high-redshift detection rate of FSRQs with the Square Kilometer Array
We present a phenomenological method for predicting the number of Flat
Spectrum Radio Quasars (FSRQs) that should be detected by the upcoming
Square Kilometer Array (SKA) surveys SKA1-MID Wide Band 1 and Medium-Deep
Band 2. We
use the Fermi Blazar Sequence and mass estimates of Fermi FSRQs, and gamma-ray
emitting Narrow Line Seyfert 1 galaxies, to model the radio emission of FSRQs
as a function of mass alone, assuming a near-Eddington accretion rate, which is
suggested by current quasar surveys at z > 6. This is used to determine the
smallest visible black hole mass as a function of redshift in two competing
cosmologies we compare in this paper: the standard LCDM model and the R_h=ct
universe. We then apply lockstep growth to the observed black-hole mass
function in order to devolve that population to higher redshifts and
determine the number of FSRQs detectable by the SKA surveys as a function of z.
We find that at the redshifts for which this method is most valid, LCDM
predicts ~30 times more FSRQs than R_h=ct for the Wide survey, and ~100 times
more in the Medium-Deep survey. These stark differences will allow the SKA
surveys to strongly differentiate between these two models, possibly rejecting
one in comparison with the other at a high level of confidence.
Comment: 8 pages, 5 figures, 3 tables. Accepted for publication in MNRAS
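To make the distance dependence of this test concrete, the sketch below
compares the luminosity distance in flat LCDM and in R_h=ct. For a fixed
survey flux limit S_lim, a source is visible when L / (4 pi d_L^2) > S_lim,
so under the near-Eddington assumption (L ~ M) the smallest visible
black-hole mass scales as d_L^2 in each cosmology. This is a minimal
illustration, not the paper's pipeline; the H0 and Omega_m values are
assumed placeholders.

    # Minimal sketch: luminosity distances in flat LCDM vs R_h = ct,
    # and the resulting ratio of minimum visible black-hole masses
    # (assuming L ~ M, near-Eddington; parameter values are assumed).
    import numpy as np
    from scipy.integrate import quad

    C_KMS = 299792.458   # speed of light [km/s]
    H0    = 67.8         # assumed Hubble constant [km/s/Mpc]
    OM    = 0.31         # assumed matter density for flat LCDM

    def dl_lcdm(z):
        """Luminosity distance [Mpc] in flat LCDM."""
        integrand = lambda zp: 1.0 / np.sqrt(OM * (1 + zp)**3 + (1 - OM))
        comoving, _ = quad(integrand, 0.0, z)
        return (C_KMS / H0) * (1 + z) * comoving

    def dl_rhct(z):
        """Luminosity distance [Mpc] in the R_h = ct universe."""
        return (C_KMS / H0) * (1 + z) * np.log(1 + z)

    for z in (2.0, 4.0, 6.0):
        ratio = (dl_lcdm(z) / dl_rhct(z))**2
        print(f"z = {z}: M_min(LCDM) / M_min(R_h=ct) ~ {ratio:.2f}")

The full calculation in the paper combines this detection threshold with
the devolved black-hole mass function to obtain the predicted counts.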
Analyzing H(z) Data using Two-point Diagnostics
Measurements of the Hubble parameter H(z) are increasingly being used to test
the expansion rate predicted by various cosmological models. But the recent
application of 2-point diagnostics, such as Om(z_i,z_j) and Omh^2(z_i,z_j), has
produced considerable tension between LCDM's predictions and several
observations, with other models faring even worse. Part of this problem is
attributable to the continued mixing of truly model-independent measurements
using the cosmic-chronometer approach, and model-dependent data extracted from
BAOs. In this paper, we advance the use of 2-point diagnostics beyond their
current status, and introduce new variations, which we call Delta h(z_i,z_j),
that are more useful for model comparisons. But we restrict our analysis
exclusively to cosmic-chronometer data, which are truly model independent. Even
for these measurements, however, we confirm the conclusions drawn by earlier
workers that the data have strongly non-Gaussian uncertainties, requiring the
use of both "median" and "mean" statistical approaches. Our results reveal that
previous analyses using 2-point diagnostics greatly underestimated the errors,
thereby misinterpreting the level of tension between theoretical predictions
and H(z) data. Instead, we demonstrate that as of today, only Einstein-de
Sitter is ruled out by the 2-point diagnostics at a level of significance
exceeding ~ 3 sigma. The R_h=ct universe is slightly favoured over the
remaining models, including LCDM and Chevallier-Polarski-Linder, though all of
them (other than Einstein-de Sitter) are consistent to within 1 sigma with the
measured mean of the Delta h(z_i,z_j) diagnostics.
Comment: 17 pages, 6 figures. Accepted for publication in MNRAS
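As a concrete example of how a two-point diagnostic works, the sketch below
evaluates the standard Om(z_i,z_j) statistic over all pairs of H(z)
measurements; for flat LCDM it should equal Omega_m for every pair, so its
scatter tests the model. The H(z) values and H0 are illustrative
placeholders, and the paper's Delta h(z_i,z_j) variant is defined there
rather than reproduced here.

    # Minimal sketch of a two-point diagnostic with illustrative data.
    # Om(z_i,z_j) = (E_i^2 - E_j^2) / ((1+z_i)^3 - (1+z_j)^3),
    # with E(z) = H(z)/H0; flat LCDM predicts Om(z_i,z_j) = Omega_m.
    import numpy as np
    from itertools import combinations

    H0 = 67.8                                  # assumed [km/s/Mpc]
    z  = np.array([0.07, 0.4, 0.9, 1.3])       # illustrative redshifts
    Hz = np.array([69.0, 95.0, 117.0, 168.0])  # illustrative H(z)

    def om_two_point(zi, hi, zj, hj):
        """Om(z_i,z_j) for one pair of H(z) measurements."""
        ei2, ej2 = (hi / H0)**2, (hj / H0)**2
        return (ei2 - ej2) / ((1 + zi)**3 - (1 + zj)**3)

    pairs = [om_two_point(z[i], Hz[i], z[j], Hz[j])
             for i, j in combinations(range(len(z)), 2)]

    # Quoting both statistics mirrors the paper's handling of the
    # strongly non-Gaussian uncertainties.
    print("mean   Om(z_i,z_j):", np.mean(pairs))
    print("median Om(z_i,z_j):", np.median(pairs))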
A Two-point Diagnostic for the HII Galaxy Hubble Diagram
A previous analysis of starburst-dominated HII Galaxies and HII regions has
demonstrated a statistically significant preference for the
Friedmann-Robertson-Walker cosmology with zero active mass, known as the R_h=ct
universe, over LCDM and its related dark-matter parametrizations. In this
paper, we employ a 2-point diagnostic with these data to present a
complementary statistical comparison of R_h=ct with Planck LCDM. Our 2-point
diagnostic compares, in a pairwise fashion, the difference between the
distance modulus measured at two redshifts with that predicted by each
cosmology. Our results support the conclusion drawn by a previous comparative
analysis demonstrating that R_h=ct is statistically preferred over Planck LCDM.
But we also find that the reported errors in the HII measurements may not be
purely Gaussian, perhaps due to a partial contamination by non-Gaussian
systematic effects. The use of HII Galaxies and HII regions as standard candles
may be improved even further with a better handling of the systematics in these
sources.
Comment: 7 pages, 6 figures, 2 tables. Accepted for publication in MNRAS
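A minimal sketch of this pairwise construction follows, assuming
Planck-like parameters as placeholders. It exploits the fact that the
model prediction for mu(z_i) - mu(z_j) equals 5 log10[d_L(z_i)/d_L(z_j)],
so the constant c/H0 cancels and only the distance-redshift shape of each
cosmology is tested.

    # Minimal sketch: model values of mu(z_i) - mu(z_j) in the two
    # cosmologies (parameters assumed; H0 cancels in the ratio).
    import numpy as np
    from scipy.integrate import quad

    OM = 0.31   # assumed matter density for flat (Planck-like) LCDM

    def dl_shape_lcdm(z):
        """Luminosity distance in flat LCDM, up to the constant c/H0."""
        integrand = lambda zp: 1.0 / np.sqrt(OM * (1 + zp)**3 + (1 - OM))
        comoving, _ = quad(integrand, 0.0, z)
        return (1 + z) * comoving

    def dl_shape_rhct(z):
        """Luminosity distance in R_h = ct, up to the constant c/H0."""
        return (1 + z) * np.log(1 + z)

    def delta_mu(dl_shape, zi, zj):
        """Model prediction for mu(z_i) - mu(z_j)."""
        return 5.0 * np.log10(dl_shape(zi) / dl_shape(zj))

    zi, zj = 2.0, 0.5
    print("LCDM   :", delta_mu(dl_shape_lcdm, zi, zj))
    print("R_h=ct :", delta_mu(dl_shape_rhct, zi, zj))

Comparing these model values with the measured mu(z_i) - mu(z_j) for every
pair of HII-galaxy distance moduli yields the two-point statistic.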
Applications of satellite snow cover in computerized short-term streamflow forecasting
A procedure is described whereby the correlation between (1) satellite-derived snow-cover depletion and (2) residual snowpack water equivalent can be used to update computerized residual-flow forecasts for the Conejos River in southern Colorado.
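A minimal sketch of such a correlation-based update, assuming a simple
linear relation; the numbers are illustrative placeholders, not Conejos
River data.

    # Minimal sketch: fit the historical snow-cover-depletion vs
    # residual-water-equivalent correlation, then use a new satellite
    # observation to update the forecast (all values assumed).
    import numpy as np

    depletion    = np.array([0.2, 0.4, 0.6, 0.8])  # fractional depletion
    residual_swe = np.array([9.0, 6.5, 4.2, 1.8])  # water equiv. [arb.]

    # Ordinary least squares, degree 1: the historical correlation.
    slope, intercept = np.polyfit(depletion, residual_swe, 1)

    # A new satellite-derived depletion value updates the estimate.
    current_depletion = 0.55
    updated_swe = slope * current_depletion + intercept
    print(f"updated residual water equivalent ~ {updated_swe:.2f}")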
Energetics of ion competition in the DEKA selectivity filter of neuronal sodium channels
We study the energetics of ionic selectivity in neuronal sodium channels
using a simple model of the channel's selectivity filter. The selectivity
filter of this channel type contains aspartate (D),
glutamate (E), lysine (K), and alanine (A) residues (the DEKA locus). We use
Grand Canonical Monte Carlo simulations to compute equilibrium binding
selectivity in the selectivity filter and to obtain various terms of the excess
chemical potential from a particle insertion procedure based on Widom's method.
We show that K^+ ions in competition with Na^+ are efficiently excluded
from the selectivity filter due to entropic hard-sphere exclusion. The
dielectric constant of the protein has no effect on this selectivity. Ca^{2+}
ions, on the other hand, are excluded from the filter due to a free-energy
penalty that is enhanced by the low dielectric constant of the protein.
Comment: 14 pages, 7 figures
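As an illustration of the Widom step, the sketch below estimates the
hard-sphere part of the excess chemical potential by trial insertions. A
random toy configuration stands in for the equilibrated Grand Canonical
Monte Carlo ensemble, and the electrostatic terms of the actual DEKA
filter model are omitted; all parameter values are assumed.

    # Minimal sketch of Widom particle insertion for hard spheres.
    # For hard interactions the Boltzmann factor of a trial insertion
    # is 0 (overlap) or 1, so mu_ex / kT = -ln(acceptance fraction).
    import numpy as np

    rng = np.random.default_rng(0)
    L, sigma, n_part, n_trials = 10.0, 1.0, 80, 20000

    # Toy configuration standing in for the equilibrium ensemble.
    coords = rng.uniform(0.0, L, size=(n_part, 3))

    def overlaps(ghost, coords):
        """True if a ghost sphere overlaps any particle (periodic box)."""
        d = np.abs(coords - ghost)
        d = np.minimum(d, L - d)            # minimum-image convention
        return np.any(np.sum(d * d, axis=1) < sigma**2)

    accepted = sum(not overlaps(rng.uniform(0.0, L, 3), coords)
                   for _ in range(n_trials))
    mu_ex_over_kT = -np.log(accepted / n_trials)
    print(f"hard-sphere mu_ex / kT ~ {mu_ex_over_kT:.3f}")

In the full model, the same insertion procedure also yields the
electrostatic contributions to the excess chemical potential, which is
where the protein's dielectric constant enters.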