7,983 research outputs found
Hierarchical confirmatory factor analysis of the flow state scale in exercise
In this study, we examined the factor structure and internal consistency of the Flow State Scale using responses of exercise participants. This self-report questionnaire consists of nine subscales designed to assess flow in sport and physical activity. It was administered to 1231 aerobic dance exercise participants. Confirmatory factor analyses were used to test three competing measurement models of the flow construct: a single-factor model, a nine-factor model and a hierarchical model positing a higher-order flow factor to explain the intercorrelations between the nine first-order factors. The single-factor model showed a poor fit to the data. The nine-factor model and the hierarchical model did not show an adequate fit to the data. All subscales of the Flow State Scale displayed acceptable internal consistency (alpha > 0.70), with the exception of transformation of time (alpha = 0.65). Collectively, the present results do not support the tenability of the single-factor, nine-factor or hierarchical measurement models in an exercise setting.
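As a point of reference for the internal-consistency figures quoted above (alpha > 0.70 for most subscales), the sketch below shows how Cronbach's alpha is typically computed from an item-score matrix. The data and item count are purely hypothetical and not taken from the study.

```python
import numpy as np

def cronbach_alpha(items: np.ndarray) -> float:
    """Cronbach's alpha for an (n_respondents, n_items) score matrix."""
    items = np.asarray(items, dtype=float)
    k = items.shape[1]                          # number of items in the subscale
    item_vars = items.var(axis=0, ddof=1)       # variance of each item
    total_var = items.sum(axis=1).var(ddof=1)   # variance of the summed scale
    return (k / (k - 1)) * (1 - item_vars.sum() / total_var)

# Hypothetical example: 4 items answered by 6 respondents on a 1-5 scale
scores = np.array([
    [4, 5, 4, 4],
    [3, 3, 4, 3],
    [5, 5, 5, 4],
    [2, 3, 2, 3],
    [4, 4, 5, 4],
    [3, 2, 3, 3],
])
print(round(cronbach_alpha(scores), 2))
```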
A default prior for regression coefficients
When the sample size is not too small, M-estimators of regression coefficients are approximately normal and unbiased. This leads to the familiar frequentist inference in terms of normality-based confidence intervals and p-values. From a Bayesian perspective, use of the (improper) uniform prior yields matching results in the sense that posterior quantiles agree with one-sided confidence bounds. For this, and various other reasons, the uniform prior is often considered objective or non-informative. In spite of this, we argue that the uniform prior is not suitable as a default prior for inference about a regression coefficient in the context of the biomedical and social sciences. We propose that a more suitable default choice is the normal distribution with mean zero and standard deviation equal to the standard error of the M-estimator. We base this recommendation on two arguments. First, we show that this prior is non-informative for inference about the sign of the regression coefficient. Second, we show that this prior agrees well with a meta-analysis of 50 articles from the MEDLINE database.
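To illustrate the proposal, the sketch below applies the normal-normal update implied by the abstract: with the approximate likelihood N(beta_hat, se^2) and the suggested default prior N(0, se^2), the posterior is normal with mean beta_hat/2 and standard deviation se/sqrt(2). The numbers and function name are hypothetical; this is only the conjugate calculation, not the authors' full analysis.

```python
import numpy as np
from scipy import stats

def shrunk_posterior(beta_hat: float, se: float):
    """Posterior for a coefficient under the approximate likelihood
    N(beta_hat, se^2) and the proposed default prior N(0, se^2).
    Equal precisions, so the posterior mean averages 0 and beta_hat."""
    post_mean = beta_hat / 2.0      # (0 + beta_hat) / 2
    post_sd = se / np.sqrt(2.0)     # combined precision is 2 / se^2
    return post_mean, post_sd

# Hypothetical estimate: beta_hat = 0.8 with standard error 0.5
mean, sd = shrunk_posterior(0.8, 0.5)
print(f"posterior mean {mean:.3f}, sd {sd:.3f}")
print("P(beta > 0) =", round(1 - stats.norm.cdf(0, loc=mean, scale=sd), 3))
```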
A Permutation-based Combination of Sign Tests for Assessing Habitat Selection
The analysis of habitat use in radio-tagged animals is approached by comparing the proportions of use vs the proportions of availability observed for each habitat type. Since the data are linearly dependent with singular variance-covariance matrices, standard multivariate statistical tests cannot be applied. To overcome the problem, compositional data analysis is customarily performed via a log-ratio transform of the sample observations. This procedure is criticized in this paper, emphasizing the many drawbacks which may arise from the use of compositional analysis. An alternative nonparametric solution is proposed in the framework of multiple testing. Habitat use is assessed separately for each habitat type by means of the sign test performed on the original observations. The resulting p-values are combined in an overall test statistic whose significance is determined by permuting the sample observations. The theoretical findings of the paper are checked by simulation studies. Applications to some case studies are considered. Keywords: compositional data analysis, Johnson's second-order selection, Johnson's third-order selection, Monte Carlo studies, multiple testing, random habitat use.
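The sketch below illustrates one way such a procedure could look: a sign test per habitat type on use-minus-availability differences, Fisher's combining function, and a reference distribution obtained by randomly flipping each animal's differences. The combining function, permutation scheme and data are assumptions for illustration, not the paper's exact algorithm.

```python
import numpy as np
from scipy.stats import binomtest

rng = np.random.default_rng(0)

def sign_test_p(diffs: np.ndarray) -> float:
    """Two-sided sign test p-value for paired differences (zeros dropped)."""
    d = diffs[diffs != 0]
    return binomtest(int((d > 0).sum()), d.size, 0.5).pvalue

def combined_sign_test(use: np.ndarray, avail: np.ndarray, n_perm: int = 2000) -> float:
    """Combine per-habitat sign tests with Fisher's statistic; assess the
    combination by randomly flipping the sign of each animal's differences."""
    diffs = use - avail                              # (animals, habitat types)
    obs = -2 * sum(np.log(sign_test_p(diffs[:, j])) for j in range(diffs.shape[1]))
    count = 0
    for _ in range(n_perm):
        perm = diffs * rng.choice([-1.0, 1.0], size=(diffs.shape[0], 1))
        stat = -2 * sum(np.log(sign_test_p(perm[:, j])) for j in range(perm.shape[1]))
        count += stat >= obs
    return (count + 1) / (n_perm + 1)

# Hypothetical data: 8 animals, 3 habitat types, use and availability proportions
use = rng.dirichlet([2, 1, 1], size=8)
avail = rng.dirichlet([1, 1, 1], size=8)
print(combined_sign_test(use, avail))
```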
Financial Risk Measurement for Financial Risk Management
Current practice largely follows restrictive approaches to market risk measurement, such as historical simulation or RiskMetrics. In contrast, we propose flexible methods that exploit recent developments in financial econometrics and are likely to produce more accurate risk assessments, treating both portfolio-level and asset-level analysis. Asset-level analysis is particularly challenging because the demands of real-world risk management in financial institutions - in particular, real-time risk tracking in very high-dimensional situations - impose strict limits on model complexity. Hence we stress powerful yet parsimonious models that are easily estimated. In addition, we emphasize the need for a deeper understanding of the links between market risk and macroeconomic fundamentals, focusing primarily on links among equity return volatilities, real growth, and real growth volatilities. Throughout, we strive not only to deepen our scientific understanding of market risk, but also to cross-fertilize the academic and practitioner communities, promoting improved market risk measurement technologies that draw on the best of both. Keywords: market risk, volatility, GARCH.
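As a reminder of the baseline the authors call restrictive, the sketch below implements the standard RiskMetrics exponentially weighted moving average (EWMA) variance recursion, sigma2_t = lambda * sigma2_{t-1} + (1 - lambda) * r_{t-1}^2 with lambda = 0.94, and the implied one-day Value-at-Risk under conditional normality; the returns and initialization are hypothetical.

```python
import numpy as np
from scipy.stats import norm

def riskmetrics_var(returns: np.ndarray, lam: float = 0.94, alpha: float = 0.01) -> float:
    """One-day Value-at-Risk from the RiskMetrics EWMA variance recursion."""
    sigma2 = returns[:20].var()                 # initialize with an early-sample variance
    for r in returns:
        sigma2 = lam * sigma2 + (1 - lam) * r ** 2
    return -norm.ppf(alpha) * np.sqrt(sigma2)   # loss threshold exceeded with prob. alpha

# Hypothetical daily portfolio returns
rng = np.random.default_rng(1)
rets = rng.normal(0.0, 0.01, size=500)
print(f"99% one-day VaR: {riskmetrics_var(rets):.4f}")
```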
Assessing the Magnitude of the Concentration Parameter in a Simultaneous Equations Model
Poskitt and Skeels (2003) provide a new approximation to the sampling distribution of the IV estimator in a simultaneous equations model. This approximation is appropriate when the concentration parameter associated with the reduced form model is small, and a basic purpose of this paper is to provide the practitioner with a method of ascertaining when the concentration parameter is small, and hence when the use of the Poskitt and Skeels (2003) approximation is appropriate. Existing procedures tend to focus on the notion of correlation and hypothesis testing. Approaching the problem from a different perspective leads us to advocate a different statistic for use in this problem. We provide exact and approximate distribution theory for the proposed statistic and show that it satisfies various optimality criteria not satisfied by some of its competitors. Rather than adopting a testing approach, we suggest the use of p-values as a calibration device. Keywords: concentration parameter, simultaneous equations model, alienation coefficient, Wilks-lambda distribution, admissible invariant test.
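For orientation, the sketch below computes a plug-in estimate of the textbook concentration parameter, mu^2 = pi' Z'Z pi / sigma_vv, from a first-stage regression with a single endogenous regressor. The data are hypothetical, and this is the standard definition rather than the alternative statistic advocated in the paper.

```python
import numpy as np

def concentration_parameter(Z: np.ndarray, x: np.ndarray) -> float:
    """Plug-in estimate of mu^2 = pi' Z'Z pi / sigma_vv for a single
    endogenous regressor x with instrument matrix Z (textbook definition)."""
    pi_hat, *_ = np.linalg.lstsq(Z, x, rcond=None)     # first-stage / reduced-form fit
    resid = x - Z @ pi_hat
    sigma_vv = resid @ resid / (len(x) - Z.shape[1])   # reduced-form error variance
    fitted = Z @ pi_hat
    return float(fitted @ fitted / sigma_vv)

# Hypothetical weak-instrument setting: 3 instruments, modest first-stage signal
rng = np.random.default_rng(2)
n = 200
Z = rng.normal(size=(n, 3))
x = Z @ np.array([0.2, 0.1, 0.0]) + rng.normal(size=n)
print(f"estimated concentration parameter: {concentration_parameter(Z, x):.1f}")
```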
