Optimizing Lossy Compression Rate-Distortion from Automatic Online Selection between SZ and ZFP
With ever-increasing volumes of scientific data produced by HPC applications,
significantly reducing data size is critical because of the limited capacity
of storage space and potential I/O or network bottlenecks when writing,
reading, or transferring data. SZ and ZFP are the two leading lossy compressors
available to compress scientific data sets. However, their performance is not
consistent across different data sets and across different fields of some data
sets: for some fields SZ provides better compression performance, while other
fields are better compressed with ZFP. This situation calls for an automatic
online (i.e., during compression) selection between SZ and ZFP with minimal
overhead. In this paper, the automatic selection optimizes the rate-distortion,
an important statistical quality metric based on the signal-to-noise ratio. To
optimize for rate-distortion, we investigate the design principles of SZ and
ZFP. We then propose an efficient, low-overhead online selection algorithm that
accurately predicts the compression quality of the two compressors in the early
processing stages and selects the best-fit compressor for each data field. We
implement the selection algorithm in an open-source library, and we evaluate
the effectiveness of our proposed solution against
plain SZ and ZFP in a parallel environment with 1,024 cores. Evaluation results
on three data sets representing about 100 fields show that our selection
algorithm improves the compression ratio by up to 70% at the same level of
data distortion, owing to highly accurate selection (around 99%) of the
best-fit compressor, with little overhead (less than 7% in our experiments).
Comment: 14 pages, 9 figures, first revision
Characterizing and Modeling the Dynamics of Activity and Popularity
Social media, which can be regarded as two-layer networks consisting of users
and items, have become the most important channels for accessing massive
amounts of information in the era of Web 2.0. The dynamics of human activity
and item popularity is a
crucial issue in social media networks. In this paper, by analyzing the growth
of user activity and item popularity in four empirical social media networks,
i.e., Amazon, Flickr, Delicious and Wikipedia, it is found that cross links
between users and items are more likely to be created by active users and to be
acquired by popular items, where user activity and item popularity are measured
by the number of cross links associated with them. This indicates that,
overall, users tend to trace popular items. However, we also find that
inactive users trace popular items more strongly than active users do.
Inspired by the empirical analysis, we propose an evolving model for such
networks in which the evolution is driven solely by two-step random walks.
Numerical experiments verify that the model qualitatively reproduces the
distributions of user activity and item popularity observed in the empirical
networks. These results may shed light on the microscopic dynamics of activity
and popularity in social media networks.
Comment: 13 pages, 6 figures, 2 tables
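The abstract only states that the evolution is driven by two-step random walks;
one possible reading, sketched below under that assumption, grows a bipartite
user-item network in which the source user of each new cross link is reached by
a user-item-user walk and the target item by an item-user-item walk (so active
users and popular items are favoured). The growth rates p_new_user and
p_new_item are illustrative, not the paper's parameters.

    import random
    from collections import defaultdict

    def grow_bipartite(n_links, seed_edges, p_new_user=0.1, p_new_item=0.1):
        """Grow a user-item network by adding cross links one at a time.
        Two-step random walks (user->item->user and item->user->item) pick the
        endpoints; this reading of the model and the p_new_* rates are
        assumptions for illustration, not the paper's exact specification.
        Assumes integer ids and a non-empty seed_edges list.
        """
        user_items = defaultdict(list)   # user id -> linked item ids
        item_users = defaultdict(list)   # item id -> linked user ids
        for u, i in seed_edges:
            user_items[u].append(i)
            item_users[i].append(u)

        for _ in range(n_links):
            # Source user: brand-new with prob p_new_user, otherwise a
            # user -> item -> user walk, which favours active users.
            if random.random() < p_new_user:
                user = max(user_items) + 1
            else:
                start = random.choice(list(user_items))
                via = random.choice(user_items[start])
                user = random.choice(item_users[via])
            # Target item: brand-new with prob p_new_item, otherwise an
            # item -> user -> item walk, which favours popular items.
            if random.random() < p_new_item:
                item = max(item_users) + 1
            else:
                start = random.choice(list(item_users))
                via = random.choice(item_users[start])
                item = random.choice(user_items[via])
            user_items[user].append(item)
            item_users[item].append(user)
        return user_items, item_users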
Nociceptive-Evoked Potentials Are Sensitive to Behaviorally Relevant Stimulus Displacements in Egocentric Coordinates.
Feature selection has been extensively studied in the context of goal-directed behavior, where it is heavily driven by top-down factors. A more primitive version of this function is the detection of bottom-up changes in stimulus features in the environment. Indeed, the nervous system is tuned to detect fast-rising, intense stimuli that are likely to reflect threats, such as nociceptive somatosensory stimuli. These stimuli elicit large brain potentials maximal at the scalp vertex. When elicited by nociceptive laser stimuli, these responses are labeled laser-evoked potentials (LEPs). Although it has been shown that changes in stimulus modality and increases in stimulus intensity evoke large LEPs, it has yet to be determined whether stimulus displacements affect the amplitude of the main LEP waves (N1, N2, and P2). Here, in three experiments, we identified a set of rules that the human nervous system obeys to identify changes in the spatial location of a nociceptive stimulus. We showed that the N2 wave is sensitive to: (1) large displacements between consecutive stimuli in egocentric, but not somatotopic, coordinates; and (2) displacements that entail a behaviorally relevant change in the stimulus location. These findings indicate that nociceptive-evoked vertex potentials are sensitive to behaviorally relevant changes in the location of a nociceptive stimulus with respect to the body, and that the hand is a site of particular behavioral importance.
Study on the space-time structure of Higgs boson decay using the HBT correlation method in $e^+e^-$ collisions at $\sqrt{s}=250$ GeV
The space-time structure of the Higgs boson decay is carefully studied with
the HBT correlation method using $e^+e^-$ collision events produced by the
Monte Carlo generator PYTHIA 8.2 at $\sqrt{s}=250$ GeV. The Higgs boson jets
(Higgs-jets) are identified by H-tag tracing. The Higgs boson radius and decay
lifetime are derived from the HBT correlation of the final-state pions inside
Higgs-jets in the $e^+e^-$ collision events, with upper bounds of fm and fs,
respectively. This result is consistent with CMS data.
Comment: 7 pages, 3 figures
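For reference, HBT (Bose-Einstein) correlation analyses of identical pions are
commonly based on a Gaussian parameterization of the two-particle correlation
function; the form below is the standard one-dimensional invariant-Q version,
not necessarily the exact fit function used in this study:

    C_2(Q) \simeq N \left[ 1 + \lambda \, e^{-R^2 Q^2} \right],
    \qquad Q^2 = -(p_1 - p_2)^2 ,

where $p_1$ and $p_2$ are the pion four-momenta, $\lambda$ is the chaoticity
parameter, and $R$ is the Gaussian source radius; a decay time scale is
presumably obtained analogously from the temporal part of the source
parameterization, which would yield the quoted upper bounds in fm and fs.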
Likelihood Ratio Testing for Admixture Models with Application to Genetic Linkage Analysis
We consider likelihood ratio tests (LRT) and their modifications for homogeneity in admixture models. The admixture model is a special case of a two-component mixture model, where one component is indexed by an unknown parameter while the parameter value for the other component is known. It has been widely used in genetic linkage analysis under heterogeneity, in which the kernel distribution is binomial. For such models, it has long been recognized that testing for homogeneity is nonstandard and the LRT statistic does not converge to a conventional $\chi^2$ distribution. In this paper, we investigate the asymptotic behavior of the LRT for general admixture models and show that its limiting distribution is equivalent to the supremum of a squared Gaussian process. We also provide insights into the connection and comparison between the LRT and alternative approaches in the literature, mostly modifications of the LRT and score tests, including the modified or penalized LRT (Fu et al., 2006). The LRT is an omnibus test that is powerful against general alternative hypotheses. In contrast, alternative approaches may be slightly more powerful against certain types of alternatives, but much less powerful against others. Our results are illustrated by simulation studies and an application to a genetic linkage study of schizophrenia.
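In generic notation (the symbols below are not necessarily those of the paper),
the admixture model and the homogeneity LRT discussed in the abstract can be
written as follows, with $f_0$ the fully specified component and $\theta$
indexing the unknown component:

    f(x; \gamma, \theta) = (1 - \gamma)\, f_0(x) + \gamma\, f(x; \theta),
    \qquad 0 \le \gamma \le 1, \qquad H_0: \gamma = 0,

    \mathrm{LRT}_n = 2 \left[ \sup_{\gamma \in [0,1],\, \theta}
        \sum_{i=1}^{n} \log\bigl\{ (1-\gamma) f_0(x_i) + \gamma f(x_i; \theta) \bigr\}
        \;-\; \sum_{i=1}^{n} \log f_0(x_i) \right].

Because $\gamma$ sits on the boundary of its range and $\theta$ is not
identifiable under $H_0$, the usual $\chi^2$ asymptotics fail; the result stated
in the abstract is that the limiting null distribution is instead the supremum
over $\theta$ of a squared Gaussian process.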
