Untenable nonstationarity: An assessment of the fitness for purpose of trend tests in hydrology
The detection and attribution of long-term patterns in hydrological time series have been important research topics for decades. A significant portion of the literature regards such patterns as ‘deterministic components’ or ‘trends’, even though the complexity of hydrological systems does not allow easy deterministic explanations and attributions. Consequently, trend estimation techniques have been developed to make and justify statements about tendencies in the historical data, which are often used to predict future events. Testing trend hypotheses on observed time series is widespread in the hydro-meteorological literature, mainly owing to the interest in detecting consequences of human activities on the hydrological cycle. This analysis usually relies on the application of null hypothesis significance tests (NHSTs) for slowly varying and/or abrupt changes, such as Mann-Kendall, Pettitt, or similar, to summary statistics of hydrological time series (e.g., annual averages, maxima, or minima). However, the reliability of this application has seldom been explored in detail. This paper discusses the misuse, misinterpretation, and logical flaws of NHSTs for trends in the analysis of hydrological data from three points of view: historic-logical, semantic-epistemological, and practical. Based on a review of the NHST rationale and basic statistical definitions of stationarity, nonstationarity, and ergodicity, we show that even though the empirical estimation of trends in hydrological time series is always feasible from a numerical point of view, it is uninformative and does not allow the inference of nonstationarity without a priori assuming additional information on the underlying stochastic process, according to deductive reasoning. This prevents the use of trend NHST outcomes to support nonstationary frequency analysis and modeling.
We also show that the correlation structures characterizing hydrological time series can easily be underestimated, further compromising attempts to draw conclusions about trends spanning the period of record. Moreover, even though adjustment procedures accounting for correlation have been developed, some are insufficient or applied only to certain tests, while others are theoretically flawed yet still widely applied. In particular, using 250 unimpacted stream flow time series across the conterminous United States (CONUS), we show that the test results can change dramatically if the sequences of annual values are recomputed from daily stream flow records, whose larger sample sizes enable a more reliable assessment of the correlation structures.
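The abstract's central practical point, that serial correlation inflates the false-positive rate of trend tests, can be illustrated with a minimal Monte Carlo sketch. This is not the authors' analysis; the tie-free Mann-Kendall statistic, the AR(1) correlation model, and parameters such as `phi` and `trials` are illustrative assumptions.

```python
import numpy as np

def mann_kendall_z(x):
    """Mann-Kendall Z statistic for a monotonic trend (no tie correction)."""
    n = len(x)
    s = 0.0
    for i in range(n - 1):
        s += np.sum(np.sign(x[i + 1:] - x[i]))
    var_s = n * (n - 1) * (2 * n + 5) / 18.0
    if s > 0:
        return (s - 1) / np.sqrt(var_s)
    if s < 0:
        return (s + 1) / np.sqrt(var_s)
    return 0.0

rng = np.random.default_rng(0)
n, trials, z_crit = 50, 500, 1.96  # two-sided 5% test on 50 "annual" values

def rejection_rate(phi):
    """Fraction of trend-free AR(1) series (lag-1 correlation phi) the test rejects."""
    rejections = 0
    for _ in range(trials):
        e = rng.standard_normal(n)
        x = np.empty(n)
        x[0] = e[0]
        for t in range(1, n):
            x[t] = phi * x[t - 1] + e[t]  # stationary, no deterministic trend
        if abs(mann_kendall_z(x)) > z_crit:
            rejections += 1
    return rejections / trials

print(f"white noise:   {rejection_rate(0.0):.2f}")  # near the nominal 0.05
print(f"AR(1), phi=.6: {rejection_rate(0.6):.2f}")  # well above the nominal level
```

Both simulated processes are stationary, so every rejection is a false detection of "nonstationarity"; the correlated case rejects far more often, which is why underestimating the correlation structure compromises trend inference.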
Testing the isotropy of high energy cosmic rays using spherical needlets
For many decades, ultrahigh energy charged particles of unknown origin that
can be observed from the ground have been a puzzle for particle physicists and
astrophysicists. As an attempt to discriminate among several possible
production scenarios, astrophysicists try to test the statistical isotropy of
the directions of arrival of these cosmic rays. At the highest energies, they
are supposed to point toward their sources with good accuracy. However, the
observations are so rare that testing the distribution of such samples of
directional data on the sphere is nontrivial. In this paper, we choose a
nonparametric framework that makes weak hypotheses on the alternative
distributions and allows in turn to detect various and possibly unexpected
forms of anisotropy. We explore two particular procedures. Both are derived
from fitting the empirical distribution with wavelet expansions of densities.
We use the wavelet frame introduced by [SIAM J. Math. Anal. 38 (2006b) 574-594
(electronic)], the so-called needlets. The expansions are truncated at scale
indices no larger than some maximum scale, and the distances between
those estimates and the null density are computed. One family of tests (called
Multiple) is based on the idea of testing the distance from the null for each
choice of the truncation scale, whereas the so-called PlugIn approach is
based on the single full expansion, but with thresholded wavelet
coefficients. We describe the practical implementation of these two procedures
and compare them to other methods in the literature. As alternatives to
isotropy, we consider both very simple toy models and more realistic
nonisotropic models based on Physics-inspired simulations. The Monte Carlo
study shows good performance of the Multiple test, even at moderate sample
size, for a wide sample of alternative hypotheses and for different choices of
the truncation parameter. On the 69 most energetic events published by the
Pierre Auger Collaboration, the needlet-based procedures suggest statistical
evidence for anisotropy. Using several values for the parameters of the
methods, our procedures yield p-values below 1%, but with uncontrolled
multiplicity issues. The flexibility of this method and the possibility to
modify it to take into account a large variety of extensions of the problem
make it an interesting option for future investigation of the origin of
ultrahigh energy cosmic rays.
Comment: Published at http://dx.doi.org/10.1214/12-AOAS619 in the Annals of Applied Statistics (http://www.imstat.org/aoas/) by the Institute of Mathematical Statistics (http://www.imstat.org).
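The needlet machinery itself is far beyond a short sketch, but the Monte Carlo logic of testing isotropy of directional data on the sphere can be illustrated with the much simpler classical Rayleigh statistic (sensitive mainly to dipole-like departures). Everything here is an illustrative assumption, not the paper's procedure; the sample size 69 merely echoes the size of the Auger data set.

```python
import numpy as np

rng = np.random.default_rng(1)

def random_directions(n, rng):
    """n unit vectors drawn uniformly on the sphere (the isotropic null)."""
    v = rng.standard_normal((n, 3))
    return v / np.linalg.norm(v, axis=1, keepdims=True)

def rayleigh_stat(dirs):
    """3|R|^2/n for resultant vector R; ~chi-square(3) under isotropy (large n)."""
    r = np.linalg.norm(dirs.sum(axis=0))
    return 3.0 * r**2 / len(dirs)

def mc_pvalue(dirs, n_mc=2000):
    """Monte Carlo p-value: share of isotropic resamples with a larger statistic."""
    t_obs = rayleigh_stat(dirs)
    t_null = [rayleigh_stat(random_directions(len(dirs), rng)) for _ in range(n_mc)]
    return (1 + sum(t >= t_obs for t in t_null)) / (1 + n_mc)

# Isotropic sample: the p-value should be unremarkable.
iso = random_directions(69, rng)
print("isotropic  p =", mc_pvalue(iso))

# Anisotropic sample: directions artificially pulled toward the +z axis.
clustered = random_directions(69, rng)
clustered[:, 2] = np.abs(clustered[:, 2]) + 0.8
clustered /= np.linalg.norm(clustered, axis=1, keepdims=True)
print("clustered  p =", mc_pvalue(clustered))
```

The needlet-based Multiple and PlugIn tests of the paper play the same role as `rayleigh_stat` here, but their wavelet expansions detect far more general, localized forms of anisotropy than a single dipole statistic can.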
Key issues on partial least squares (PLS) in operations management research: A guide to submissions
Purpose: This work aims to systematise the use of PLS as an analysis tool through a usage guide with recommendations that help researchers avoid errors when applying this tool.
Design/methodology/approach: A recent literature review of PLS and discussions with experts in the methodology.
Findings: This article considers the current situation of PLS after intense academic debate in recent years,
and summarises recommendations to properly conduct and report a research work that uses this
methodology in its analyses. We particularly focus on how to: choose the construct type; choose the
estimation technique (PLS or CB-SEM); evaluate and report the measurement model; evaluate and report
the structural model; analyse statistical power.
Research limitations: Some relevant aspects could not be covered in detail herein: a guided example that follows all the reporting recommendations presented here, to act as a practical guide for authors; whether the specification or evaluation of the measurement model differs between first-order and second-order constructs; how the outcomes of constructs are interpreted when their indicators are measured at nominal measurement levels; and whether the Confirmatory Composite Analysis approach is compatible with recent proposals about Confirmatory Tetrad Analysis (CTA). These themes will be the object of later publications.
Originality/value: We provide a checklist of the information elements that any article using PLS must contain. Our intention is for this article to act as a guide for researchers and prospective authors who submit works to the JIEM (Journal of Industrial and Engineering Management). The guide could also be used by editors and reviewers of JIEM, or of other journals in this area, to evaluate and reduce the risk of bias (Losilla, Oliveras, Marin-Garcia & Vives, 2018) in works that use PLS as an analysis procedure.
False discovery rate analysis of brain diffusion direction maps
Diffusion tensor imaging (DTI) is a novel modality of magnetic resonance
imaging that allows noninvasive mapping of the brain's white matter. A
particular map derived from DTI measurements is a map of water principal
diffusion directions, which are proxies for neural fiber directions. We
consider a study in which diffusion direction maps were acquired for two groups
of subjects. The objective of the analysis is to find regions of the brain in
which the corresponding diffusion directions differ between the groups. This is
attained by first computing a test statistic for the difference in direction at
every brain location using a Watson model for directional data. Interesting
locations are subsequently selected with control of the false discovery rate.
More accurate modeling of the null distribution is obtained using an empirical
null density based on the empirical distribution of the test statistics across
the brain. Further, substantial improvements in power are achieved by local
spatial averaging of the test statistic map. Although the focus is on one
particular study and imaging technology, the proposed inference methods can be
applied to other large scale simultaneous hypothesis testing problems with a
continuous underlying spatial structure.
Comment: Published at http://dx.doi.org/10.1214/07-AOAS133 in the Annals of Applied Statistics (http://www.imstat.org/aoas/) by the Institute of Mathematical Statistics (http://www.imstat.org).
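The false discovery rate control step can be sketched with the standard Benjamini-Hochberg step-up procedure on simulated z-scores. This is a generic illustration, not the paper's pipeline: the paper uses Watson-model test statistics and an empirical null fitted across the brain, whereas this sketch assumes the theoretical N(0,1) null and made-up counts of null and non-null locations.

```python
import math
import numpy as np

def benjamini_hochberg(pvals, q=0.05):
    """Boolean mask of discoveries controlling FDR at level q (BH step-up)."""
    p = np.asarray(pvals)
    m = len(p)
    order = np.argsort(p)
    # Largest k with p_(k) <= q*k/m; reject the k smallest p-values.
    passed = p[order] <= q * np.arange(1, m + 1) / m
    k = int(np.max(np.nonzero(passed)[0])) + 1 if passed.any() else 0
    mask = np.zeros(m, dtype=bool)
    mask[order[:k]] = True
    return mask

rng = np.random.default_rng(2)
m_null, m_alt = 900, 100
z = np.concatenate([rng.standard_normal(m_null),        # true-null locations
                    rng.standard_normal(m_alt) + 3.5])  # locations with a real effect
# One-sided p-values from the theoretical N(0,1) null.
p = np.array([0.5 * math.erfc(v / math.sqrt(2)) for v in z])
discoveries = benjamini_hochberg(p, q=0.05)
print("discoveries:", int(discoveries.sum()))
print("false discoveries among them:", int(discoveries[:m_null].sum()))
```

When the histogram of observed test statistics is wider than the theoretical null, as often happens in imaging data, replacing the N(0,1) null with an empirical null fitted to that histogram (as the paper does) keeps the realized false discovery proportion near the nominal level.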
Mimicry and automatic imitation are not correlated
It is widely known that individuals have a tendency to imitate each other. However, different psychological disciplines assess imitation in different manners. While social psychologists assess mimicry by means of action observation, cognitive psychologists assess automatic imitation with reaction-time-based measures on a trial-by-trial basis. Although these methods differ in crucial methodological respects, both phenomena are assumed to rely on similar underlying mechanisms. This raises the fundamental question of whether mimicry and automatic imitation are actually correlated. In the present research we assessed both phenomena and did not find a meaningful correlation. Moreover, personality traits such as empathy, autism traits, and traits related to self- versus other-focus did not correlate with mimicry or automatic imitation either. Theoretical implications are discussed.
Multipath Multiplexing for Capacity Enhancement in SIMO Wireless Systems
This paper proposes a novel and simple orthogonal faster than Nyquist (OFTN)
data transmission and detection approach for a single input multiple output
(SIMO) system. It is assumed that a signal of given bandwidth is
transmitted through a wireless channel with multiple multipath components. Under
this assumption, the current paper provides a novel and simple OFTN
transmission and symbol-by-symbol detection approach that exploits the
multiplexing gain obtained by the multipath characteristic of wideband wireless
channels. It is shown that the proposed design can achieve a higher
transmission rate than the existing one (i.e., orthogonal frequency division
multiplexing (OFDM)). Furthermore, the achievable rate gap between the proposed
approach and that of the OFDM increases as the number of receiver antennas
increases with the remaining system parameters held fixed. This implies that the performance gain of
the proposed approach can be very significant for a large-scale multi-antenna
wireless system. The superiority of the proposed approach is shown
theoretically and confirmed via numerical simulations. Specifically, we found
upper-bound average rates of 15 bps/Hz and 28 bps/Hz with the OFDM and
proposed approaches, respectively, in a Rayleigh fading channel with 32 receive
antennas and signal to noise ratio (SNR) of 15.3 dB. The extension of the
proposed approach for different system setups and associated research problems
is also discussed.
Comment: IEEE Transactions on Wireless Communications
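For context on why multiplexing over multipath matters, a minimal sketch of the conventional baseline: the average Shannon rate of a single-stream 1xN Rayleigh SIMO link with maximal ratio combining. This is not the proposed OFTN scheme; the SNR value simply reuses the 15.3 dB figure from the abstract, and the fading model and trial count are assumptions. With array gain alone, the rate grows only logarithmically in the antenna count, which is the gap a multipath-multiplexing design aims to exploit.

```python
import numpy as np

rng = np.random.default_rng(3)
snr_db = 15.3
snr = 10 ** (snr_db / 10)  # linear SNR

def avg_mrc_rate(n_rx, trials=5000):
    """Average Shannon rate (bps/Hz) of a 1xN Rayleigh SIMO link with MRC."""
    # i.i.d. CN(0,1) channel gains; MRC collects the full receive array gain.
    h = (rng.standard_normal((trials, n_rx))
         + 1j * rng.standard_normal((trials, n_rx))) / np.sqrt(2)
    gain = np.sum(np.abs(h) ** 2, axis=1)  # ||h||^2, grows linearly in n_rx
    return float(np.mean(np.log2(1 + snr * gain)))

for n in (1, 8, 32):
    print(f"N = {n:2d}: {avg_mrc_rate(n):5.2f} bps/Hz")
```

The single-stream rate saturates to roughly one extra bps/Hz per doubling of the array, whereas a scheme that turns resolvable multipath components into parallel streams can scale the rate much faster, consistent with the abstract's reported gap over OFDM at 32 receive antennas.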