283 research outputs found
An Empirical Process Central Limit Theorem for Multidimensional Dependent Data
We consider the empirical process associated with a multidimensional stationary
process. We give general conditions, involving only the underlying process over a
restricted class of functions, under which weak convergence of the empirical
process can be proved. This is particularly useful when dealing with data arising
from dynamical systems or functionals of Markov chains. This result improves those of
[DDV09] and [DD11], where the technique was first introduced, and provides new
applications. Comment: to appear in Journal of Theoretical Probability.
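The formulas in this abstract were stripped during extraction; for orientation, a generic definition of the empirical process in this setting (the notation here is assumed, not taken from the paper) is:

```latex
Z_n(f) \;=\; \frac{1}{\sqrt{n}} \sum_{i=1}^{n} \Big( f(X_i) - \mathbb{E}\, f(X_0) \Big),
\qquad f \in \mathcal{F},
```

where $(X_i)$ is an $\mathbb{R}^d$-valued stationary process and $\mathcal{F}$ is the restricted class of functions mentioned above; weak convergence of $\{Z_n(f) : f \in \mathcal{F}\}$ is then the object of study.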
An extended quantitative model for super-resolution optical fluctuation imaging (SOFI)
Super-resolution optical fluctuation imaging (SOFI) provides super-resolution (SR) fluorescence imaging by analyzing fluctuations in the fluorophore emission. The technique has been used both to acquire quantitative SR images and to provide SR biosensing by monitoring changes in fluorophore blinking dynamics. Proper analysis of such data relies on a fully quantitative model of the imaging. However, previous SOFI imaging models made several assumptions that cannot be realized in practice. In this work we address these limitations by developing and verifying a fully quantitative model that better approximates real-world imaging conditions. Our model (i) shows that SOFI images are free of bias, or can be made so, provided the signal is stationary and the fluorophores blink independently, (ii) allows a fully quantitative description of the link between SOFI imaging and probe dynamics, and (iii) paves the way for more advanced SOFI image reconstruction by offering a computationally fast way to calculate SOFI images for arbitrary probe, sample and instrumental properties.
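The core SOFI computation described above, forming an image from temporal cumulants of per-pixel intensity fluctuations, can be sketched as follows (a minimal second-order illustration on synthetic data; the function name and the blinking model are our assumptions, not the paper's):

```python
import numpy as np

def sofi2(stack, lag=1):
    """Second-order SOFI image from a (T, H, W) intensity stack.

    Per pixel, computes the lag-`lag` autocorrelation of the intensity
    fluctuations delta I(t) = I(t) - <I>_t; correlated blinking survives,
    uncorrelated shot noise averages away at lag >= 1.
    """
    fluct = stack - stack.mean(axis=0, keepdims=True)
    T = stack.shape[0]
    return (fluct[:T - lag] * fluct[lag:]).mean(axis=0)

# Synthetic data: one blinking emitter pixel, one constant-background pixel.
rng = np.random.default_rng(0)
T = 2000
states = rng.random(T // 10) < 0.3
on = np.repeat(states, 10)                    # emitter stays on/off for 10 frames
stack = np.zeros((T, 1, 2))
stack[:, 0, 0] = 100 * on + rng.normal(0, 1, T)   # blinking emitter + noise
stack[:, 0, 1] = 30 + rng.normal(0, 1, T)          # constant background + noise
img = sofi2(stack)
```

The blinking pixel yields a large cumulant value while the constant pixel is suppressed toward zero, which is the contrast mechanism behind SOFI.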
Modifiers of Notch transcriptional activity identified by genome-wide RNAi
Background: The Notch signaling pathway regulates a diverse array of developmental processes, and aberrant Notch signaling can lead to diseases, including cancer. To obtain a more comprehensive understanding of the genetic network that integrates into Notch signaling, we performed a genome-wide RNAi screen in Drosophila cell culture to identify genes that modify Notch-dependent transcription. Results: Employing complementary data analyses, we found 399 putative modifiers: 189 promoting and 210 antagonizing Notch-activated transcription. These modifiers included several known Notch interactors, validating the robustness of the assay. Many novel modifiers were also identified, covering a range of cellular localizations from the extracellular matrix to the nucleus, as well as a large number of proteins with unknown function. Chromatin-modifying proteins represent a major class of genes identified, including histone deacetylase and demethylase complex components and other chromatin modifying, remodeling and replacement factors. A protein-protein interaction map of the Notch-dependent transcription modifiers revealed that a large number of the identified proteins interact physically with these core chromatin components. Conclusions: The genome-wide RNAi screen identified many genes that can modulate Notch transcriptional output. A protein interaction map of the identified genes highlighted a network of chromatin-modifying enzymes and remodelers that regulate Notch transcription. Our results open new avenues to explore the mechanisms of Notch signal regulation and the integration of this pathway into diverse cellular processes.
Transport collapse in dynamically evolving networks
Transport in complex networks can describe a variety of natural and
human-engineered processes including biological, societal and technological
ones. However, how the properties of the source and drain nodes affect
transport subject to random failures, attacks or maintenance optimization in
the network remains unknown. In this paper, the effects of both the distance
between the source and drain nodes and of the degree of the source node on the
time of transport collapse are studied in scale-free and lattice-based
transport networks. These effects are numerically evaluated for two strategies,
which employ either transport-based or random link removal. Scale-free networks
with small distances are found to result in larger times of collapse. In
lattice-based networks, both the dimension and boundary conditions are shown to
have a major effect on the time of collapse. We also show that adding a direct
link between the source and the drain increases the robustness of scale-free
networks when subject to random link removals. Interestingly, the distribution
of the times of collapse is then similar to that of lattice-based networks.
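The random-link-removal protocol studied here, deleting links until the source and drain disconnect, can be sketched for a small lattice-based network (a minimal stdlib illustration; the lattice size, corner placement of source and drain, and helper names are our assumptions):

```python
import random
from collections import deque

def lattice_edges(L):
    """Undirected edges of an L x L square lattice with open boundaries."""
    edges = []
    for x in range(L):
        for y in range(L):
            if x + 1 < L:
                edges.append(((x, y), (x + 1, y)))
            if y + 1 < L:
                edges.append(((x, y), (x, y + 1)))
    return edges

def connected(edges, s, d):
    """BFS reachability of drain d from source s over the given edge list."""
    adj = {}
    for a, b in edges:
        adj.setdefault(a, set()).add(b)
        adj.setdefault(b, set()).add(a)
    seen, queue = {s}, deque([s])
    while queue:
        u = queue.popleft()
        if u == d:
            return True
        for v in adj.get(u, ()):
            if v not in seen:
                seen.add(v)
                queue.append(v)
    return False

def time_to_collapse(edges, s, d, rng):
    """Number of random link removals until s and d disconnect."""
    remaining = list(edges)
    rng.shuffle(remaining)
    for t in range(len(remaining)):
        remaining.pop()                 # remove one random link
        if not connected(remaining, s, d):
            return t + 1
    return len(edges)

rng = random.Random(0)
L = 8
edges = lattice_edges(L)
src, drn = (0, 0), (L - 1, L - 1)
times = [time_to_collapse(edges, src, drn, rng) for _ in range(20)]
mean_collapse = sum(times) / len(times)
```

The same harness applies unchanged to a scale-free edge list, which is how the two topologies can be compared under identical removal strategies.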
A Study of Concurrency Bugs and Advanced Development Support for Actor-based Programs
The actor model is an attractive foundation for developing concurrent
applications because actors are isolated concurrent entities that communicate
through asynchronous messages and do not share state. Thereby, they avoid
concurrency bugs such as data races, but are not immune to concurrency bugs in
general. This study taxonomizes concurrency bugs in actor-based programs
reported in the literature. Furthermore, it analyzes the bugs to identify the
patterns causing them as well as their observable behavior. Based on this
taxonomy, we further analyze the literature and find that current approaches to
static analysis and testing focus on communication deadlocks and message
protocol violations. However, they do not provide solutions to identify
livelocks and behavioral deadlocks. The insights obtained in this study can be
used to improve debugging support for actor-based programs with new debugging
techniques to identify the root cause of complex concurrency bugs. Comment:
Submitted for review. Removed section 6, "Research Roadmap for Debuggers"; its
content was summarized in the Future Work section. Added references for section
1, section 3, section 4.3 and section 5.1. Updated citation.
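A behavioral deadlock of the kind this taxonomy covers can be illustrated with plain threads and mailboxes (a minimal sketch, not one of the surveyed actor frameworks): each of two actors replies only after receiving a message that the other will send only after receiving a reply, so both starve without holding any lock.

```python
import queue
import threading

def actor(name, inbox, peer_inbox, reply, log):
    """Minimal actor: wait for a triggering message, then reply to the peer."""
    try:
        msg = inbox.get(timeout=0.3)   # blocks until a message arrives
        log.append((name, msg))
        peer_inbox.put(reply)
    except queue.Empty:
        # Behavioral deadlock observed: the actor starves because the
        # message it waits for is only sent in response to its own reply.
        log.append((name, "starved"))

a_in, b_in = queue.Queue(), queue.Queue()
log = []
# A sends "pong" only after receiving "ping"; B sends "ping" only after
# receiving "pong". No lock is held and no data race occurs, yet the
# system makes no progress: a behavioral deadlock.
ta = threading.Thread(target=actor, args=("A", a_in, b_in, "pong", log))
tb = threading.Thread(target=actor, args=("B", b_in, a_in, "ping", log))
ta.start(); tb.start()
ta.join(); tb.join()
```

Because no thread blocks on a lock, race detectors and lock-order analyses see nothing wrong, which is why such bugs evade the tools discussed above.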
Time series prediction via aggregation: an oracle bound including numerical cost
We address the problem of forecasting a time series satisfying the Causal
Bernoulli Shift model, using a parametric set of predictors. The aggregation
technique provides a predictor with well-established and quite satisfactory
theoretical properties expressed by an oracle inequality for the prediction
risk. The numerical computation of the aggregated predictor usually relies on a
Markov chain Monte Carlo method whose convergence should be evaluated. In
particular, it is crucial to bound the number of simulations needed to achieve
a numerical precision of the same order as the prediction risk. In this
direction we present a fairly general result which can be seen as an oracle
inequality including the numerical cost of the predictor computation. The
numerical cost appears by letting the oracle inequality depend on the number of
simulations required in the Monte Carlo approximation. Some numerical
experiments are then carried out to support our findings.
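The aggregation idea can be sketched with the standard exponentially weighted average forecaster (a generic illustration of Gibbs-type aggregation weights; the paper's actual predictor is computed by MCMC, which this sketch does not reproduce):

```python
import numpy as np

def ewa_forecast(preds, y, lam=1.0):
    """Exponentially weighted aggregation of expert forecasts.

    At each time t, expert j receives weight proportional to
    exp(-lam * cumulative squared loss up to t-1), and the forecast is
    the weighted average. preds: (T, J) expert forecasts; y: (T,) targets.
    """
    T, J = preds.shape
    cum_loss = np.zeros(J)
    out = np.empty(T)
    for t in range(T):
        w = np.exp(-lam * (cum_loss - cum_loss.min()))  # shift for stability
        w /= w.sum()
        out[t] = w @ preds[t]
        cum_loss += (preds[t] - y[t]) ** 2
    return out

# Two experts: one accurate, one biased; aggregation tracks the good one.
rng = np.random.default_rng(0)
T = 300
y = np.sin(np.arange(T) / 10.0)
preds = np.stack([y + rng.normal(0, 0.1, T),   # good expert
                  y + 2.0], axis=1)            # biased expert
agg = ewa_forecast(preds, y, lam=2.0)
err_agg = np.mean((agg - y) ** 2)
err_bad = np.mean((preds[:, 1] - y) ** 2)
```

The oracle inequality in the abstract bounds the risk of such an aggregate against the best expert in the set, here visibly attained since the aggregate's error stays far below the biased expert's.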
Super-Resolution Imaging Strategies for Cell Biologists Using a Spinning Disk Microscope
In this study we use a spinning disk confocal microscope (SD) to generate super-resolution images of multiple cellular features from any plane in the cell. We obtain super-resolution images by using stochastic intensity fluctuations of biological probes, combining Photoactivation Light Microscopy (PALM)/Stochastic Optical Reconstruction Microscopy (STORM) methodologies. We compared different image analysis algorithms for processing super-resolution data to identify the most suitable for analysis of particular cell structures. SOFI was chosen for X and Y and was able to achieve a resolution of ca. 80 nm; however, higher resolution (>30 nm) was possible, dependent on the super-resolution image analysis algorithm used. Our method uses low laser power and fluorescent probes which are available either commercially or through the scientific community, and is therefore gentle enough for biological imaging. Through comparative studies with structured illumination microscopy (SIM) and widefield epifluorescence imaging we found that our methodology was advantageous for imaging cellular structures which are not immediately at the cell-substrate interface, including the nuclear architecture and mitochondria. We have shown that it is possible to obtain two-colour images, which highlights the potential this technique has for high-content screening, imaging of multiple epitopes and live cell imaging.
Black Holes as Effective Geometries
Gravitational entropy arises in string theory via coarse graining over an
underlying space of microstates. In this review we would like to address the
question of how the classical black hole geometry itself arises as an effective
or approximate description of a pure state, in a closed string theory, which
semiclassical observers are unable to distinguish from the "naive" geometry. In
cases with enough supersymmetry it has been possible to explicitly construct
these microstates in spacetime, and understand how coarse-graining of
non-singular, horizon-free objects can lead to an effective description as an
extremal black hole. We discuss how these results arise for examples in Type II
string theory on AdS_5 x S^5 and on AdS_3 x S^3 x T^4 that preserve 16 and 8
supercharges respectively. For such a picture of black holes as effective
geometries to extend to cases with finite horizon area, the scale of quantum
effects in gravity would have to extend well beyond the vicinity of the
singularities in the effective theory. By studying examples in M-theory on
AdS_3 x S^2 x CY that preserve 4 supersymmetries we show how this can happen.
Comment: Review based on lectures of JdB at CERN RTN Winter School and of VB at
PIMS Summer School. 68 pages. Added reference
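The review's opening claim, that gravitational entropy arises from coarse graining over an underlying space of microstates, is the Boltzmann relation, which for the supersymmetric examples discussed is known to reproduce the Bekenstein-Hawking area law (standard formulas, units k_B = c = 1):

```latex
S \;=\; \ln N_{\text{micro}} \;=\; \frac{A_{\text{horizon}}}{4 G \hbar}.
```

Matching the microstate count $N_{\text{micro}}$ to the horizon area is precisely what the explicit constructions in the supersymmetric cases verify.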