Biases in velocity reconstruction: investigating the effects on growth rate and expansion measurements in the local universe
The local galaxy peculiar velocity field can be reconstructed from the
surrounding distribution of large-scale structure and plays an important role
in calibrating cosmic growth and expansion measurements. In this paper, we
investigate the effect of the stochasticity of these velocity reconstructions
on the statistical and systematic errors in cosmological inferences. By
introducing a simple statistical model between the measured and theoretical
velocities, whose terms we calibrate from linear theory, we derive the bias in
the model velocity. We then use lognormal realisations to explore the potential
impact of this bias when using a cosmic flow model to measure the growth rate
of structure, and to sharpen expansion rate measurements from host galaxies for
gravitational wave standard sirens with electromagnetic counterparts. Although
our illustrative study does not contain fully realistic observational effects,
we demonstrate that in some scenarios these corrections are significant and
result in a measurable improvement in determinations of the Hubble constant
compared to standard forecasts.
Comment: 10 pages, 10 figures, 1 table, 1 appendix. Submitted to MNRAS. Comments welcome.
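To make the flavour of such a model concrete, here is a generic sketch (an illustration, not the paper's actual formulation): suppose the reconstructed velocity \hat{v} traces the true peculiar velocity v linearly, with uncorrelated noise,

    \hat{v} = b\,v + \epsilon , \qquad \langle \epsilon \rangle = 0 , \qquad \langle \epsilon^{2} \rangle = \sigma_{\epsilon}^{2} .

For jointly Gaussian fields, the expectation of the true velocity given the reconstruction is then the Wiener-type shrinkage

    \langle v \mid \hat{v} \rangle = \frac{b\,\sigma_{v}^{2}}{b^{2}\sigma_{v}^{2} + \sigma_{\epsilon}^{2}} \, \hat{v} ,

so feeding \hat{v} into a flow model unmodified (implicitly taking b = 1 and \sigma_{\epsilon} = 0) biases the inferred growth rate and Hubble constant, while calibrating b and \sigma_{\epsilon} from linear theory supplies the correction.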
On the correlations of galaxy peculiar velocities and their covariance
Measurements of the peculiar velocities of large samples of galaxies enable
new tests of the standard cosmological model, including determination of the
growth rate of cosmic structure that encodes gravitational physics. With the
size of such samples now approaching hundreds of thousands of galaxies, complex
statistical analysis techniques and models are required to extract cosmological
information. In this paper we summarise how correlation functions between
galaxy velocities, and with the surrounding large-scale structure, may be
utilised to test cosmological models. We present new determinations of the
analytical covariance between such correlation functions, which may be useful
for cosmological likelihood analyses. The statistical model we use to determine
these covariances includes the sample selection functions, observational noise,
curved-sky effects and redshift-space distortions. By comparing these
covariance determinations with corresponding estimates from large suites of
cosmological simulations, we demonstrate that these analytical models recover
the key features of the covariance between different statistics and
separations, and produce similar measurements of the growth rate of structure.
Comment: 20 pages, 9 figures, version accepted for publication by MNRAS.
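As a toy illustration of the statistic involved (not the authors' pipeline, whose model also includes selection functions, observational noise, curved-sky effects and redshift-space distortions), a brute-force pair estimator of the velocity correlation function psi(r) = <u_i u_j> from line-of-sight velocities u could be written as:

    import numpy as np

    def velocity_correlation(pos, u, r_edges):
        """Toy estimator of psi(r) = <u_i u_j> in separation bins.
        pos: (N, 3) galaxy positions; u: (N,) line-of-sight velocities."""
        d = np.linalg.norm(pos[:, None, :] - pos[None, :, :], axis=-1)
        prod = u[:, None] * u[None, :]
        i, j = np.triu_indices(len(u), k=1)            # unique pairs only
        sums, _ = np.histogram(d[i, j], bins=r_edges, weights=prod[i, j])
        counts, _ = np.histogram(d[i, j], bins=r_edges)
        return sums / np.maximum(counts, 1)            # mean u_i * u_j per bin

The analytical covariance derived in the paper describes how noisy such binned estimates are and how the bins correlate with one another, which is precisely what a cosmological likelihood analysis requires.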
Predicting College Student Gambling Frequency Using the Theory of Planned Behavior: Does the Theory Work Differently for Disordered and Non-Disordered Gamblers?
We examined whether disordered gambling moderates the prediction of gambling behavior via the theory of planned behavior (TPB; i.e., intentions, subjective norms, perceived behavioral control, and attitudes) among college students. A convenience sample of undergraduate students (N = 377) at a large, Southeastern university who gambled in the past year completed a classroom-based survey. Approximately half of the participants were male (n = 205; 54.4%), and the majority were Caucasian (n = 310; 83.8%). Gambling frequency, gambling problems, and gambling-specific TPB constructs were assessed via a cross-sectional survey. A series of regression analyses tested the utility of the TPB model for predicting gambling behavior (i.e., frequency) among (1) non-disordered gamblers (n = 342) and (2) disordered gamblers (n = 35). Moderation analyses indicated that disordered gamblers might not proceed through the thought processes that guide gambling in non-disordered gamblers. However, findings should be interpreted cautiously, as our study was limited by a small number of lifetime disordered gamblers.
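The moderation test described here amounts to a regression with group-by-predictor interaction terms; a minimal sketch (hypothetical file and column names, not the study's materials) in Python with statsmodels:

    import pandas as pd
    import statsmodels.formula.api as smf

    # Hypothetical survey file with one row per participant
    df = pd.read_csv("gambling_survey.csv")

    # Main effects of the four TPB constructs plus their interactions with
    # disordered-gambler status (0/1); significant interaction terms indicate
    # that the TPB predicts frequency differently across the two groups.
    model = smf.ols(
        "frequency ~ (intentions + norms + control + attitudes) * disordered",
        data=df,
    ).fit()
    print(model.summary())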
The pPSU Plasmids for Generating DNA Molecular Weight Markers
Visualizing nucleic acids by gel electrophoresis is one of the most common techniques in molecular biology, and reference molecular weight markers or ladders are commonly used for size estimation. We have created the pPSU1 and pPSU2 pair of molecular weight marker plasmids, which produce both 100 bp and 1 kb DNA ladders when digested with two common restriction enzymes. The 100 bp ladder fragments have been optimized to migrate appropriately on both agarose and native polyacrylamide gels, unlike many currently available DNA ladders. A 100 ml E. coli culture of either plasmid yields sufficient DNA to produce 100 bp or 1 kb ladders for 1000 gels. As such, the pPSU1 and pPSU2 plasmids provide reference fragments from 50 to 10,000 bp at a fraction of the cost of commercial DNA ladders. The pPSU1 and pPSU2 plasmids are available without licensing restrictions to nonprofit academic users, affording freely available, high-quality, low-cost molecular weight standards for molecular biology applications.
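The ladder arithmetic itself is simple: on a circular plasmid, each pair of adjacent restriction sites bounds one fragment, and one fragment wraps around the origin. A sketch with hypothetical cut positions (not the actual pPSU1/pPSU2 maps):

    def circular_fragments(cut_sites, plasmid_len):
        """Fragment sizes (bp) from restriction cut positions on a
        circular plasmid; the last fragment wraps past the origin."""
        sites = sorted(cut_sites)
        frags = [b - a for a, b in zip(sites, sites[1:])]
        frags.append(plasmid_len - sites[-1] + sites[0])  # wrap-around piece
        return sorted(frags)

    # Hypothetical 3 kb plasmid with four cut sites:
    print(circular_fragments([120, 220, 420, 920], 3000))  # [100, 200, 500, 2200]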
Optimization and assessment of phytoplankton size class algorithms for ocean color data on the Northeast U.S. continental shelf
The size structure of phytoplankton communities influences important ecological and biogeochemical processes, including the transfer of energy through marine food webs. A variety of algorithms have been developed to estimate phytoplankton size classes (PSCs) from satellite ocean color data. However, many of these algorithms were developed for application to the global ocean, and their performance in more productive, optically complex coastal and continental shelf regions warrants evaluation. In this study, several existing PSC models were applied in the Northeast U.S. continental shelf (NES) region and compared with in situ PSC estimates derived from a local HPLC pigment data set. The effect of regional re-parameterization and of incorporating sea surface temperature (SST) into existing abundance-based model frameworks was investigated, and model performance was assessed using an independent data set. Abundance-based model re-parameterization alone did not significantly improve model performance relative to the other models. However, the inclusion of SST led to a consistent reduction in model error for all size classes. Of the two absorption-based algorithms tested, the better-performing approach displayed performance metrics similar to those of the regional SST-dependent abundance-based model. The SST-dependent model and the absorption-based method were applied to monthly composites of the NES region for April and September 2019 and qualitatively compared. The results highlight the benefit of considering SST in abundance-based models and the applicability of absorption-based PSC methods in optically complex regions.
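For context, abundance-based PSC models of the kind re-parameterized here commonly follow the three-component formulation of Brewin et al. (2010), in which the chlorophyll-a held in the smaller size classes saturates as total chlorophyll-a rises. A minimal sketch with illustrative parameter values (not the regionally tuned NES ones); the SST extension would replace the fixed parameters with functions of SST:

    import numpy as np

    def psc_fractions(chl, cm_pn=1.06, s_pn=0.85, cm_p=0.11, s_p=6.8):
        """Three-component abundance-based model: the Chl-a held in
        pico+nano and in pico cells saturates with total Chl-a
        (parameter values illustrative only)."""
        c_pn = cm_pn * (1.0 - np.exp(-s_pn * chl))   # pico + nano Chl-a
        c_p = cm_p * (1.0 - np.exp(-s_p * chl))      # pico Chl-a
        f_pico = c_p / chl
        f_nano = (c_pn - c_p) / chl
        f_micro = 1.0 - c_pn / chl                   # remainder is micro
        return f_micro, f_nano, f_pico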
The effects of an experimental programme to support students’ autonomy on the overt behaviours of physical education teachers
Although the benefits of autonomy-supportive behaviours are now well established in the literature, very few studies have attempted to train teachers to offer greater autonomy support to their students, and none of these studies has been carried out in physical education (PE). The purpose of this study was to test the effects of autonomy-supportive training on the overt teaching behaviours of PE teachers. The experimental group comprised two PE teachers who were first educated on the benefits of an autonomy-supportive style and then followed an individualised guidance programme during the eight lessons of a teaching cycle. Their behaviours were observed and rated in three categories (autonomy-supportive, neutral, and controlling) and were subsequently compared with those of three teachers who formed the control condition. The results showed that teachers in the experimental group used more autonomy-supportive and neutral behaviours than those in the control group, but no difference emerged in relation to controlling behaviours. We discuss the implications of our findings for schools.
A framework for automated anomaly detection in high frequency water-quality data from in situ sensors
River water-quality monitoring is increasingly conducted using automated in
situ sensors, enabling timelier identification of unexpected values. However,
anomalies caused by technical issues confound these data, while the volume and
velocity of data prevent manual detection. We present a framework for automated
anomaly detection in high-frequency water-quality data from in situ sensors,
using turbidity, conductivity and river level data. After identifying end-user
needs and defining anomalies, we ranked their importance and selected suitable
detection methods. High-priority anomalies included sudden isolated spikes and
level shifts, most of which were classified correctly by regression-based
methods such as autoregressive integrated moving average (ARIMA) models. However, using
other water-quality variables as covariates reduced performance due to complex
relationships among variables. Classification of drift and of periods of
anomalously low or high variability improved when we replaced anomalous
measurements with forecasts, but this inflated false positive rates.
Feature-based methods also performed well on high-priority anomalies, but were
less proficient at detecting lower-priority anomalies, resulting in high
false negative rates. Unlike regression-based methods, all feature-based
methods produced low false positive rates and did not require training or
optimization. Rule-based methods successfully detected impossible values and
missing observations. Thus, we recommend using a combination of methods to
improve anomaly detection performance, whilst minimizing false detection rates.
Furthermore, our framework emphasizes the importance of communication between
end-users and analysts for optimal outcomes with respect to both detection
performance and end-user needs. Our framework is applicable to other types of
high-frequency time-series data and to other anomaly detection applications.
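As a minimal sketch of the regression-based ingredient (file, column name, model order and threshold are all illustrative assumptions, not the paper's tuned settings), an ARIMA model can flag one-step-ahead residual outliers and, optionally, replace them with forecasts:

    import numpy as np
    import pandas as pd
    from statsmodels.tsa.arima.model import ARIMA

    def arima_anomalies(y, order=(1, 1, 1), z=4.0):
        """Flag points whose one-step-ahead ARIMA residual exceeds
        z standard errors, and return a cleaned series in which the
        flagged values are replaced by the model forecasts."""
        fit = ARIMA(y, order=order).fit()
        pred = fit.get_prediction()                 # in-sample, one-step-ahead
        resid = y - pred.predicted_mean
        flags = np.abs(resid) > z * pred.se_mean    # spike / shift candidates
        cleaned = y.where(~flags, pred.predicted_mean)
        return flags, cleaned

    # Hypothetical high-frequency sensor record:
    turbidity = pd.read_csv("sensor.csv", index_col=0,
                            parse_dates=True)["turbidity"]
    flags, cleaned = arima_anomalies(turbidity)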