Local entropic effects of polymers grafted to soft interfaces
In this paper, we study the equilibrium properties of polymer chains
end-tethered to a fluid membrane. The loss of conformational entropy of the
polymer results in an inhomogeneous pressure field that we calculate for
Gaussian chains. We estimate the effects of excluded volume through a relation
between pressure and concentration. Under the polymer pressure, a soft surface
will deform. We calculate the deformation profile for a fluid membrane and show
that close to the grafting point, this profile assumes a cone-like shape,
independently of the boundary conditions. Interactions between different
polymers are also mediated by the membrane deformation. This pair-additive
potential is attractive for chains grafted on the same side of the membrane and
repulsive otherwise.
Comment: 10 pages, 9 figures
Depletion forces near a soft surface
We investigate excluded-volume effects in a bidisperse colloidal suspension
near a flexible interface. Inspired by a recent experiment by Dinsmore et al.
(Phys. Rev. Lett. 80, 409 (1998)), we study the adsorption of a mesoscopic bead
on the surface and show that depletion forces could in principle lead to
particle encapsulation. We then consider the effect of surface fluctuations on
the depletion potential itself and construct the density profile of a polymer
solution near a soft interface. Surprisingly we find that the chains accumulate
at the wall, whereas the density displays a deficit of particles at distances
larger than the surface roughness. This non-monotonic behavior demonstrates
that surface fluctuations can have major repercussions on the properties of a
colloidal solution. On average, the additional contribution to the Gibbs
adsorbance is negative. The amplitude of the depletion potential between a
mesoscopic bead and the surface increases accordingly.
Comment: 10 pages, 5 figures
The Bayesian Analysis of Complex, High-Dimensional Models: Can It Be CODA?
We consider the Bayesian analysis of a few complex, high-dimensional models
and show that intuitive priors, which are not tailored to the fine details of
the model and the estimated parameters, produce estimators which perform poorly
in situations in which good, simple frequentist estimators exist. The models we
consider are: stratified sampling, the partial linear model, linear and
quadratic functionals of white noise and estimation with stopping times. We
present a strong version of Doob's consistency theorem which demonstrates that
the existence of a uniformly √n-consistent estimator ensures that the
Bayes posterior is √n-consistent for values of the parameter in subsets
of prior probability 1. We also demonstrate that it is, at least in principle,
possible to construct Bayes priors giving both global and local minimax rates,
using a suitable combination of loss functions. We argue that there is no
contradiction in these apparently conflicting findings.
Comment: Published in Statistical Science (http://www.imstat.org/sts/) at
http://dx.doi.org/10.1214/14-STS483 by the Institute of Mathematical
Statistics (http://www.imstat.org)
Event Weighted Tests for Detecting Periodicity in Photon Arrival Times
This paper treats the problem of detecting periodicity in a sequence of
photon arrival times, which occurs, for example, in attempting to detect
gamma-ray pulsars. A particular focus is on how auxiliary information,
typically source intensity, background intensity, and incidence angles and
energies associated with each photon arrival should be used to maximize the
detection power. We construct a class of likelihood-based tests, score tests,
which give rise to event weighting in a principled and natural way, and derive
expressions quantifying the power of the tests. These results can be used to
compare the efficacies of different weight functions, including cuts in energy
and incidence angle. The test is targeted toward a template for the periodic
lightcurve, and we quantify how deviation from that template affects the power
of detection.
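As a concrete illustration of the event-weighting idea, a weighted Rayleigh-type statistic is one simple member of such a family. The sketch below is ours, not the paper's construction: the weights standing in for energy- and angle-dependent information are hypothetical, and the uniform-phase null replaces the paper's template-targeted score test.

```python
import numpy as np

def weighted_rayleigh(times, weights, freq):
    """Weighted Rayleigh-type statistic for periodicity at frequency `freq`.

    The weights (hypothetical here) would encode auxiliary information such
    as energy- or angle-dependent signal probabilities.  Under the null of
    no periodicity, the statistic is approximately chi-squared with 2
    degrees of freedom."""
    t = np.asarray(times, dtype=float)
    w = np.asarray(weights, dtype=float)
    phases = 2.0 * np.pi * freq * t          # photon phases at the trial frequency
    c = np.sum(w * np.cos(phases))           # weighted cosine moment
    s = np.sum(w * np.sin(phases))           # weighted sine moment
    # Normalize by sum of squared weights so the null variance is 1 per component.
    return 2.0 * (c * c + s * s) / np.sum(w * w)
```

Scanning this statistic over candidate frequencies and comparing against the chi-squared(2) tail gives a simple detection test; the score tests in the paper additionally fold in a lightcurve template, which this special case omits.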
MS 013 Guide to Laura C. Bickel, MD Papers (1938-1967)
The Laura C. Bickel, MD papers contain article reprints, correspondence, case studies, photographs, and x-rays related to her research into the rubella virus and congenital defects and her career in pediatrics. See more at MS 013
Plausibility functions and exact frequentist inference
In the frequentist program, inferential methods with exact control on error
rates are a primary focus. The standard approach, however, is to rely on
asymptotic approximations, which may not be suitable. This paper presents a
general framework for the construction of exact frequentist procedures based on
plausibility functions. It is shown that the plausibility function-based tests
and confidence regions have the desired frequentist properties in finite
samples---no large-sample justification needed. An extension of the proposed
method is also given for problems involving nuisance parameters. Examples
demonstrate that the plausibility function-based method is both exact and
efficient in a wide variety of problems.
Comment: 21 pages, 5 figures, 3 tables
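The general recipe behind such constructions, comparing an observed relative-likelihood statistic with its Monte Carlo null distribution and reading off a plausibility, can be sketched for a toy Poisson-mean problem. The construction and naming below are ours, not the paper's notation.

```python
import numpy as np

def plausibility(theta, x, n_mc=2000, seed=0):
    """Toy plausibility function for a Poisson mean `theta`, given data `x`.

    Compares the observed relative-likelihood statistic with its Monte Carlo
    distribution under theta; small values make theta implausible."""
    rng = np.random.default_rng(seed)
    x = np.asarray(x)

    def t_stat(sample, th):
        # -2 log relative likelihood for a Poisson mean; the MLE is the sample mean.
        mle = max(sample.mean(), 1e-12)
        return 2.0 * sample.size * (th - mle + mle * np.log(mle / th))

    t_obs = t_stat(x, theta)
    sims = np.array([t_stat(rng.poisson(theta, x.size), theta)
                     for _ in range(n_mc)])
    return np.mean(sims >= t_obs)            # plausibility in [0, 1]

# A 100(1-a)% confidence region is {theta : plausibility(theta, x) > a},
# with finite-sample validity up to Monte Carlo error.
```

By construction the plausibility equals 1 at the maximum likelihood estimate and decays as theta moves away from the data, which is what makes threshold sets behave like confidence regions.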
GeneFishing to reconstruct context specific portraits of biological processes.
Rapid advances in genomic technologies have led to a wealth of diverse data, from which novel discoveries can be gleaned through the application of robust statistical and computational methods. Here, we describe GeneFishing, a semi-supervised computational approach to reconstruct context-specific portraits of biological processes by leveraging gene-gene coexpression information. GeneFishing incorporates multiple high-dimensional statistical ideas, including dimensionality reduction, clustering, subsampling, and results aggregation, to produce robust results. To illustrate the power of our method, we applied it using 21 genes involved in cholesterol metabolism as "bait" to "fish out" (or identify) genes not previously identified as being connected to cholesterol metabolism. Using simulation and real datasets, we found that the results obtained through GeneFishing were more interesting for our study than those provided by related gene prioritization methods. In particular, application of GeneFishing to the GTEx liver RNA sequencing (RNAseq) data not only re-identified many known cholesterol-related genes, but also pointed to glyoxalase I (GLO1) as a gene implicated in cholesterol metabolism. In a follow-up experiment, we found that GLO1 knockdown in human hepatoma cell lines increased levels of cellular cholesterol ester, validating a role for GLO1 in cholesterol metabolism. In addition, we performed pan-tissue analysis by applying GeneFishing on various tissues and identified many potential tissue-specific cholesterol metabolism-related genes. GeneFishing appears to be a powerful tool for identifying related components of complex biological systems and may be used across a wide range of applications.
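The subsample-cluster-aggregate loop can be sketched as follows. This is a minimal toy version with our own choices (a mean-correlation-to-bait feature and a crude 1-D two-means clustering); the published pipeline differs in its feature construction and clustering, and all names here are ours.

```python
import numpy as np

def two_means_1d(x, iters=50):
    """Tiny 1-D Lloyd's k-means with k=2; returns True for the high cluster."""
    c_lo, c_hi = float(x.min()), float(x.max())
    for _ in range(iters):
        hi = np.abs(x - c_hi) < np.abs(x - c_lo)
        if hi.all() or (~hi).all():
            break
        c_lo, c_hi = x[~hi].mean(), x[hi].mean()
    return hi

def gene_fishing(expr, bait, candidates, n_rounds=50, pool_size=20, seed=0):
    """Toy GeneFishing-style loop (our sketch, not the authors' code).

    `expr` is a genes-by-samples matrix.  Repeatedly subsample candidate
    genes, cluster them together with the bait on a co-expression feature,
    and record how often each candidate lands in the bait-dominated
    cluster (its capture frequency)."""
    rng = np.random.default_rng(seed)
    corr = np.corrcoef(expr)                      # gene-gene co-expression
    hits = {g: 0 for g in candidates}
    tries = {g: 0 for g in candidates}
    for _ in range(n_rounds):
        pool = rng.choice(candidates, size=min(pool_size, len(candidates)),
                          replace=False)
        genes = np.concatenate([bait, pool])
        feat = corr[np.ix_(genes, bait)].mean(axis=1)   # similarity to bait
        hi = two_means_1d(feat)
        bait_side = hi[:len(bait)].mean() > 0.5         # cluster holding most bait
        for g, side in zip(pool, hi[len(bait):]):
            tries[g] += 1
            hits[g] += int(side == bait_side)
    return {g: hits[g] / tries[g] for g in candidates if tries[g]}
```

Candidates whose capture frequency stays near 1 across rounds are the ones "fished out"; the subsampling makes the score robust to which other candidates happen to share the pool.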
Surface-mediated attraction between colloids
We investigate the equilibrium properties of a colloidal solution in contact
with a soft interface. As a result of symmetry breaking, surface effects are
generally prevailing in confined colloidal systems. In this Letter, particular
emphasis is given to surface fluctuations and their consequences on the local
(re)organization of the suspension. It is shown that particles experience a
significant effective interaction in the vicinity of the interface. This
potential of mean force is always attractive, with range controlled by the
surface correlation length. We suggest that, under some circumstances,
surface-induced attraction may have a strong influence on the local particle
distribution.
Triangulating Abuse Liability Assessment for Flavoured Cigar Products Using Physiological, Behavioural Economic and Subjective Assessments: A Within-subjects Clinical Laboratory Protocol
Introduction In the USA, Food and Drug Administration regulations prohibit the sale of flavoured cigarettes, with menthol being the exception. However, the manufacture, advertisement and sale of flavoured cigar products are permitted. Such flavourings influence positive perceptions of tobacco products and are linked to increased use. Flavourings may mask the taste of tobacco and enhance smoke inhalation, influencing toxicant exposure and abuse liability among novice tobacco users. Using clinical laboratory methods, this study investigates how flavour availability affects measures of abuse liability in young adult cigarette smokers. The specific aims are to evaluate the effect of cigar flavours on nicotine exposure, and behavioural and subjective measures of abuse liability.
Methods and analyses Participants (projected n=25) are healthy smokers of five or more cigarettes per day over the past 3 months, 18–25 years old, naive to cigar use (lifetime use of 50 or fewer cigar products and no more than 10 cigars smoked in the past 30 days) and without a desire to quit cigarette smoking in the next 30 days. Participants complete five laboratory sessions in a Latin square design with either their own brand cigarette or a session-specific Black & Mild cigar differing in flavour (apple, cream, original and wine). Participants are single-blinded to cigar flavours. Each session consists of two 10-puff smoking bouts (30 s interpuff interval) separated by 1 hour. Primary outcomes include saliva nicotine concentration, behavioural economic task performance and response to various questionnaire items assessing subjective effects predictive of abuse liability. Differences in outcomes across own brand cigarette and flavoured cigar conditions will be tested using linear mixed models.
Data-driven efficient score tests for deconvolution problems
We consider testing statistical hypotheses about densities of signals in
deconvolution models. A new approach to this problem is proposed. We
construct score tests for deconvolution with a known noise density and
efficient score tests for the case of an unknown noise density. The tests are
combined with model selection rules that choose reasonable model dimensions
automatically from the data. Consistency of the tests is proved.