4,737 research outputs found
Biased Information Search in Homogeneous Groups: Confidence as a Moderator for the Effect of Anticipated Task Requirements
When searching for information, groups that are homogeneous regarding their members’ prediscussion decision preferences show a strong bias for information that supports rather than conflicts with the prevailing opinion (confirmation bias). The present research examined whether homogeneous groups blindly search for information confirming their beliefs irrespective of the anticipated task or whether they are sensitive to the usefulness of new information for this forthcoming task. Results of three experiments show that task sensitivity depends on the groups’ confidence in the correctness of their decision: Moderately confident groups displayed a strong confirmation bias when they anticipated having to give reasons for their decision but showed a balanced information search or even a disconfirmation bias (i.e., predominantly seeking conflicting information) when they anticipated having to refute counterarguments. In contrast, highly confident groups demonstrated a strong confirmation bias independent of the anticipated task requirements.
Language Barriers in Health Care Settings: An Annotated Bibliography of Research Literature
Provides an overview of resources related to the prevalence, role, and effects of language barriers and access in health care.
Learning Mixtures of Gaussians in High Dimensions
Efficiently learning mixtures of Gaussians is a fundamental problem in
statistics and learning theory. Given samples coming from a random one out of k
Gaussian distributions in Rn, the learning problem asks to estimate the means
and the covariance matrices of these Gaussians. This learning problem arises in
many areas ranging from the natural sciences to the social sciences, and has
also found many machine learning applications. Unfortunately, learning mixtures
of Gaussians is an information-theoretically hard problem: in order to learn
the parameters up to a reasonable accuracy, the number of samples required is
exponential in the number of Gaussian components in the worst case. In this
work, we show that provided we are in high enough dimensions, the class of
Gaussian mixtures is learnable in its most general form under a smoothed
analysis framework, where the parameters are randomly perturbed from an
adversarial starting point. In particular, given samples from a mixture of
Gaussians with randomly perturbed parameters, when n > Ω(k^2), we give
an algorithm that learns the parameters with polynomial running time and a
polynomial number of samples. The central algorithmic ideas consist of new ways
to decompose the moment tensor of the Gaussian mixture by exploiting its
structural properties. The symmetries of this tensor are derived from the
combinatorial structure of higher order moments of Gaussian distributions
(sometimes referred to as Isserlis' theorem or Wick's theorem). We also develop
new tools for bounding smallest singular values of structured random matrices,
which could be useful in other smoothed analysis settings.
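The paper's smoothed-analysis, moment-tensor algorithm is specialized; as a toy point of reference, the basic estimation problem can be illustrated with a plain EM fit of a two-component one-dimensional mixture. This is a standard baseline sketch, not the method described in the abstract, and the function name `em_gmm_1d` and all constants are illustrative:

```python
# Minimal EM for a two-component 1D Gaussian mixture (a standard baseline;
# the abstract's method-of-moments / tensor approach is not shown here).
import numpy as np

rng = np.random.default_rng(0)

def em_gmm_1d(x, iters=200):
    # Crude initialization: place the two means at the data quartiles.
    mu = np.array([np.percentile(x, 25), np.percentile(x, 75)])
    var = np.array([x.var(), x.var()])
    pi = np.array([0.5, 0.5])
    for _ in range(iters):
        # E-step: responsibility of each component for each point.
        dens = (pi / np.sqrt(2 * np.pi * var)) * \
               np.exp(-(x[:, None] - mu) ** 2 / (2 * var))
        r = dens / dens.sum(axis=1, keepdims=True)
        # M-step: weighted moment updates of mean, variance, and weight.
        nk = r.sum(axis=0)
        mu = (r * x[:, None]).sum(axis=0) / nk
        var = (r * (x[:, None] - mu) ** 2).sum(axis=0) / nk
        pi = nk / len(x)
    return mu, var, pi

# Samples from an equal mixture of N(-3, 1) and N(3, 1).
x = np.concatenate([rng.normal(-3, 1, 5000), rng.normal(3, 1, 5000)])
mu, var, pi = em_gmm_1d(x)
```

EM like this can get stuck in bad local optima in the worst case, which is exactly the kind of failure the moment-based, smoothed-analysis approach is designed to avoid.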
Momentum and Mass Fluxes in a Gas Confined between Periodically Structured Surfaces at Different Temperatures
It is well known that in a gas-filled duct or channel along which a
temperature gradient is applied, a thermal creep flow is created. Here we show
that a mass and momentum flux can also be induced in a gas confined between two
parallel structured surfaces at different temperatures, i.e.
orthogonal to the temperature gradient. We use both analytical and
numerical methods to compute the resulting fluxes. The momentum flux assumes
its maximum value in the free-molecular flow regime, while the (normalized)
mass flux peaks in the transition flow regime. The discovered phenomena could find applications
in novel methods for energy conversion and thermal pumping of gases.
Private Multiplicative Weights Beyond Linear Queries
A wide variety of fundamental data analyses in machine learning, such as
linear and logistic regression, require minimizing a convex function defined by
the data. Since the data may contain sensitive information about individuals,
and these analyses can leak that sensitive information, it is important to be
able to solve convex minimization in a privacy-preserving way.
  A series of recent results show how to accurately solve a single convex
minimization problem in a differentially private manner. However, the same data
is often analyzed repeatedly, and little is known about solving multiple convex
minimization problems with differential privacy. For simpler data analyses,
such as linear queries, there are remarkable differentially private algorithms
such as the private multiplicative weights mechanism (Hardt and Rothblum, FOCS
2010) that accurately answer exponentially many distinct queries. In this work,
we extend these results to the case of convex minimization and show how to give
accurate and differentially private solutions to *exponentially many* convex
minimization problems on a sensitive dataset.
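The mechanism in the abstract is more sophisticated, but the basic pattern of private convex minimization can be sketched with clipped, noised gradient descent on a single least-squares problem. This is a generic illustration under assumed constants, not the paper's algorithm; `dp_mean` and `noise_scale` are hypothetical names:

```python
# Hedged sketch: minimize f(theta) = mean((theta - x_i)^2) with per-step
# gradient clipping and Gaussian noise, a common differential-privacy pattern.
import numpy as np

rng = np.random.default_rng(1)

def dp_mean(data, steps=100, lr=0.5, noise_scale=0.01, clip=1.0):
    # noise_scale is an illustrative constant; a real mechanism would
    # calibrate it to an (epsilon, delta) privacy budget.
    theta = 0.0
    n = len(data)
    for _ in range(steps):
        # Clip each record's gradient to bound its influence on the update,
        # then add noise scaled to that bounded sensitivity.
        g = np.clip(2 * (theta - data), -clip, clip).mean()
        theta -= lr * (g + rng.normal(0.0, noise_scale / n))
    return theta

data = rng.normal(5.0, 1.0, size=1000)
theta = dp_mean(data)
```

Answering exponentially many such minimization problems on the same dataset, with the noise per query kept small enough to be useful, is the step that requires the multiplicative-weights machinery discussed above.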
Embedding Principal Component Analysis for Data Reduction in Structural Health Monitoring on Low-Cost IoT Gateways
Principal component analysis (PCA) is a powerful data reduction method for
Structural Health Monitoring. However, its computational cost and data memory
footprint pose a significant challenge when PCA has to run on limited-capability
embedded platforms in low-cost IoT gateways. This paper presents a
memory-efficient parallel implementation of the streaming History PCA
algorithm. On our dataset, it achieves a 10x compression factor and 59x
memory reduction with less than 0.15 dB degradation in the reconstructed
signal-to-noise ratio (RSNR) compared to standard PCA. Moreover,
the algorithm benefits from parallelization on multiple cores, achieving a
maximum speedup of 4.8x on a Samsung ARTIK 710.
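The compression-versus-RSNR trade-off the abstract reports can be illustrated with ordinary batch PCA on synthetic multichannel data. This is a minimal sketch only: the paper's streaming History PCA is an online, memory-bounded variant, and the data, function names, and dimensions here are all illustrative:

```python
# Hedged sketch: batch PCA compression of multichannel "sensor" data and
# the reconstruction SNR it yields (not the streaming History PCA algorithm).
import numpy as np

rng = np.random.default_rng(2)

def pca_compress(X, k):
    """Project rows of X onto the top-k principal components."""
    mu = X.mean(axis=0)
    Xc = X - mu
    # SVD of the centered data: rows of Vt are principal directions.
    U, S, Vt = np.linalg.svd(Xc, full_matrices=False)
    scores = Xc @ Vt[:k].T           # compressed representation (n x k)
    X_hat = scores @ Vt[:k] + mu     # reconstruction from k components
    return scores, X_hat

def rsnr_db(X, X_hat):
    # Reconstructed signal-to-noise ratio in decibels.
    return 10 * np.log10((X ** 2).sum() / ((X - X_hat) ** 2).sum())

# Synthetic data: 3 latent modes observed on 30 channels, plus small noise,
# so keeping k=3 components gives roughly a 10x compression of the samples.
latent = rng.normal(size=(1000, 3))
mixing = rng.normal(size=(3, 30))
X = latent @ mixing + 0.01 * rng.normal(size=(1000, 30))
scores, X_hat = pca_compress(X, k=3)
```

The batch SVD above needs the whole data matrix in memory, which is precisely the cost that a streaming formulation avoids on a constrained gateway.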
Foundation and empire : a critique of Hardt and Negri
In this article, Thompson complements recent critiques of Hardt and Negri's Empire (see Finn Bowring in Capital and Class, no. 83) using the tools of labour process theory to critique the political economy of Empire, and to note its unfortunate similarities to conventional theories of the knowledge economy.
