Testing of random matrices
Let $n$ be a positive integer and let $\mathcal{X} = [x_{ij}]$ be an $n \times n$ matrix of independent random variables having joint uniform distribution $\Pr\{x_{ij} = k\} = \frac{1}{n}$ for $1 \leq k \leq n$ $(1 \leq i, j \leq n)$. A realization of $\mathcal{X}$ is called \textit{good} if each of its rows and each of its columns contains a permutation of the numbers $1, 2, \ldots, n$. We present and analyse four typical algorithms which decide whether a given realization is good.
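By this definition, a good realization is exactly an order-$n$ Latin square on the symbols $1, \ldots, n$. A minimal sketch of the naive decision procedure (illustrative Python; not one of the paper's four analysed algorithms):

```python
import random

def is_good(matrix):
    """Check whether each row and each column of an n-by-n matrix
    with entries in 1..n contains a permutation of 1..n."""
    n = len(matrix)
    target = set(range(1, n + 1))
    rows_ok = all(set(row) == target for row in matrix)
    cols_ok = all(set(col) == target for col in zip(*matrix))
    return rows_ok and cols_ok

# Draw a random realization: each entry uniform on {1, ..., n}.
n = 4
realization = [[random.randint(1, n) for _ in range(n)] for _ in range(n)]
print(is_good(realization))  # True only when the realization is a Latin square
```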
Coordination through De Bruijn sequences
Let $\mu$ be a rational distribution over a finite alphabet, and let $(x_t)$ be an $n$-periodic sequence whose first $n$ elements are drawn i.i.d. according to $\mu$. We consider automata of bounded size that input $x_t$ and output $y_t$ at stage $t$. We prove the existence of a constant $C$ such that, whenever $m$ and $n$ satisfy the corresponding condition involving $C$, with probability close to 1 there exists an automaton of size $m$ such that the empirical frequency of stages $t$ at which $y_t = x_t$ is close to 1. In particular, an explicit choice of $C$ is given.

Keywords: coordination, complexity, De Bruijn sequences, automata
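As background, a de Bruijn sequence of order $n$ over a $k$-letter alphabet contains every word of length $n$ exactly once per period, which is what makes such sequences useful for coordination. A sketch of the standard Lyndon-word (FKM) construction, not taken from the paper:

```python
def de_bruijn(k, n):
    """Return one period of a de Bruijn sequence B(k, n) over the alphabet
    {0, ..., k-1}, via the classic Lyndon-word (FKM) construction."""
    a = [0] * (k * n)
    seq = []

    def db(t, p):
        if t > n:
            if n % p == 0:  # a[1..p] is a Lyndon word whose length divides n
                seq.extend(a[1:p + 1])
        else:
            a[t] = a[t - p]
            db(t + 1, p)
            for j in range(a[t - p] + 1, k):
                a[t] = j
                db(t + 1, t)

    db(1, 1)
    return seq

print(de_bruijn(2, 3))  # [0, 0, 0, 1, 0, 1, 1, 1]: every binary triple occurs once per cycle
```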
Extreme Scale De Novo Metagenome Assembly
Metagenome assembly is the process of transforming a set of short,
overlapping, and potentially erroneous DNA segments from environmental samples
into an accurate representation of the underlying microbiomes' genomes.
State-of-the-art tools require big shared-memory machines and cannot handle
contemporary metagenome datasets that exceed terabytes in size. In this paper,
we introduce the MetaHipMer pipeline, a high-quality and high-performance
metagenome assembler that employs an iterative de Bruijn graph approach.
MetaHipMer leverages a specialized scaffolding algorithm that produces long
scaffolds and accommodates the idiosyncrasies of metagenomes. MetaHipMer is
end-to-end parallelized using the Unified Parallel C language and therefore can
run seamlessly on shared- and distributed-memory systems. Experimental results
show that MetaHipMer matches or outperforms state-of-the-art tools in terms
of accuracy. Moreover, MetaHipMer scales efficiently to large concurrencies and
is able to assemble previously intractable grand-challenge metagenomes. We
demonstrate the unprecedented capability of MetaHipMer by computing the first
full assembly of the Twitchell Wetlands dataset, consisting of 7.5 billion
reads (2.6 TB in size).

Comment: Accepted to SC1
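The pipeline itself is not reproduced here, but the core data structure behind any de Bruijn graph assembler can be sketched in a few lines: cut reads into overlapping k-mers and connect each k-mer's prefix to its suffix. The reads and the value of k below are toy inputs.

```python
from collections import defaultdict

def build_de_bruijn_graph(reads, k):
    """Build a de Bruijn graph: nodes are (k-1)-mers, and every k-mer in a
    read contributes an edge prefix -> suffix, weighted by multiplicity."""
    graph = defaultdict(lambda: defaultdict(int))
    for read in reads:
        for i in range(len(read) - k + 1):
            kmer = read[i:i + k]
            graph[kmer[:-1]][kmer[1:]] += 1
    return graph

reads = ["ACGTACG", "CGTACGT"]  # toy reads; real inputs are billions of short reads
for prefix, edges in build_de_bruijn_graph(reads, 4).items():
    print(prefix, dict(edges))
```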
The asymptotical error of broadcast gossip averaging algorithms
In problems of estimation and control which involve a network, efficient
distributed computation of averages is a key issue. This paper presents
theoretical and simulation results about the accumulation of errors during the
computation of averages by means of iterative "broadcast gossip" algorithms.
Using martingale theory, we prove that the expectation of the accumulated error
can be bounded from above by a quantity which only depends on the mixing
parameter of the algorithm and on a few properties of the network: its size, its
maximum degree and its spectral gap. Both analytical results and computer
simulations show that in several network topologies of applicative interest the
accumulated error goes to zero as the size of the network grows large.

Comment: 10 pages, 3 figures. Based on a draft submitted to IFACWC201
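As an illustration of the setting (not the paper's exact model), the following sketch simulates one common broadcast gossip rule on a small graph: a uniformly random node broadcasts its value and each neighbor moves toward it according to a mixing parameter gamma. The final values concentrate near, but not exactly at, the initial average; that residual is the accumulated error studied above.

```python
import random

def broadcast_gossip(neighbors, x, gamma, steps):
    """Broadcast gossip averaging: at each step a random node i broadcasts
    x[i]; every neighbor j updates x[j] <- gamma * x[j] + (1 - gamma) * x[i]."""
    x = list(x)
    for _ in range(steps):
        i = random.randrange(len(x))
        for j in neighbors[i]:
            x[j] = gamma * x[j] + (1 - gamma) * x[i]
    return x

# Cycle on 6 nodes; the states converge near (not exactly to) the mean.
n = 6
neighbors = {i: [(i - 1) % n, (i + 1) % n] for i in range(n)}
x0 = [float(i) for i in range(n)]
xT = broadcast_gossip(neighbors, x0, gamma=0.5, steps=2000)
print(sum(x0) / n, xT)
```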
Algorithmic complexity theory detects decreases in the relative efficiency of stock markets in the aftermath of the 2008 financial crisis
The relative efficiency of financial markets can be evaluated using algorithmic complexity theory. Using this approach, we detect decreases in the efficiency rates of the major stocks listed on the São Paulo Stock Exchange in the aftermath of the 2008 financial crisis.

Keywords: market efficiency, stock markets, econophysics
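One common compression-based proxy for algorithmic complexity in this literature (a sketch of the general idea, not necessarily the paper's exact estimator) binarizes returns by sign and measures how much a universal compressor shrinks them; the less compressible the series, the closer it is to an efficient, random-like market.

```python
import random
import zlib

def efficiency_proxy(returns):
    """Binarize returns by sign and measure the zlib compression ratio of the
    resulting bit string; a higher ratio means a less predictable series."""
    bits = "".join("1" if r > 0 else "0" for r in returns).encode()
    return len(zlib.compress(bits, 9)) / len(bits)

random_like = [random.gauss(0, 1) for _ in range(10000)]
trending = [1.0] * 5000 + [-1.0] * 5000  # highly predictable, very compressible
print(efficiency_proxy(random_like) > efficiency_proxy(trending))  # True
```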
Decreasing Diagrams for Confluence and Commutation
Like termination, confluence is a central property of rewrite systems. Unlike
termination, however, confluence has no known complexity hierarchy. In this
paper we investigate whether the decreasing diagrams
technique can be used to obtain such a hierarchy. The decreasing diagrams
technique is one of the strongest and most versatile methods for proving
confluence of abstract rewrite systems. It is complete for countable systems,
and it has many well-known confluence criteria as corollaries.
So what makes decreasing diagrams so powerful? In contrast to other
confluence techniques, decreasing diagrams employ a labelling of the steps with
labels from a well-founded order in order to conclude confluence of the
underlying unlabelled relation. Hence it is natural to ask how the size of the
label set influences the strength of the technique. In particular, what class
of abstract rewrite systems can be proven confluent using decreasing diagrams
restricted to 1 label, 2 labels, 3 labels, and so on? Surprisingly, we find
that two labels suffice for proving confluence for every abstract rewrite
system having the cofinality property, thus in particular for every confluent,
countable system.
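For orientation, the usual local condition (a standard formulation from the decreasing diagrams literature, not quoted from this paper) is that every peak $t \leftarrow_a s \rightarrow_b u$ can be joined using only steps with labels strictly below $a$ or $b$, plus at most one "facing" step of each label:

```latex
% Local decreasingness: a peak  t <-_a s ->_b u  is decreasing when
%   t ->*_{<a} . ->=_b . ->*_{<ab}  v  <-*_{<ab} . <-=_a . <-*_{<b}  u,
% where <a is the set of labels strictly below a, and <ab the set of labels
% strictly below a or strictly below b, in the given well-founded order.
\[
  t \;\to^{*}_{<a}\; \cdot \;\to^{=}_{b}\; \cdot \;\to^{*}_{<ab}\; v
    \;\leftarrow^{*}_{<ab}\; \cdot \;\leftarrow^{=}_{a}\; \cdot \;\leftarrow^{*}_{<b}\; u
\]
```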
Secondly, we show that this result stands in sharp contrast to the situation
for commutation of rewrite relations, where the hierarchy does not collapse.
Thirdly, investigating the possibility of a confluence hierarchy, we
determine the first-order (non-)definability of the notion of confluence and
related properties, using techniques from finite model theory. We find that in
particular Hanf's theorem is fruitful for elegant proofs of undefinability of
properties of abstract rewrite systems.
On a Problem of Steinhaus
Let $n$ be a positive integer. A sequence $x_1, \ldots, x_n$ of points
in the unit interval $[0,1)$ is piercing if $\{x_1, \ldots, x_k\} \cap \left[\frac{i}{k}, \frac{i+1}{k}\right) \neq \emptyset$ holds for every
$k \leq n$ and every $0 \leq i < k$. In 1958 Steinhaus asked whether
piercing sequences can be arbitrarily long. A negative answer was provided by
Schinzel, who proved that any such sequence may have at most 74 elements.
This was later improved to the best possible value of 17 by Warmus, and
independently by Berlekamp and Graham.

In this paper we study a more general variant of piercing sequences. Let
$S = (s_1, s_2, \ldots)$ be an infinite nondecreasing sequence of positive integers. A
sequence $x_1, \ldots, x_n$ is $S$-piercing if $\{x_1, \ldots, x_k\} \cap \left[\frac{i}{s_k}, \frac{i+1}{s_k}\right) \neq \emptyset$
holds for every $k \leq n$ and every $0 \leq i < s_k$.

A special case, with $s_k = k + d$ for a fixed nonnegative integer $d$, was studied
by Berlekamp and Graham. They noticed that for each $d$, the maximum
length of any such piercing sequence is finite. Expressing this maximum
length as a function of $d$, they obtained an exponential upper bound on that
function, which was later improved by Graham and Levy. Recently,
Konyagin proved a sharper bound that holds for all sufficiently large $d$.

Using a different technique based on the Farey fractions and stick-breaking
games, we prove here new upper and lower bounds on this function. We also
characterize the sequences $S$ for which an infinite $S$-piercing sequence
exists.

Comment: 16 pages
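The piercing property of a finite sequence can be checked directly from the definition above; a small illustrative sketch (not from the paper):

```python
def is_piercing(xs):
    """Check the Steinhaus piercing property: for every k <= len(xs), the
    prefix x_1..x_k must hit each interval [i/k, (i+1)/k) for 0 <= i < k."""
    for k in range(1, len(xs) + 1):
        hit = {int(x * k) for x in xs[:k] if 0 <= x < 1}
        if hit != set(range(k)):
            return False
    return True

print(is_piercing([0.5]))        # True
print(is_piercing([0.4, 0.8]))   # True: the length-2 prefix hits both halves
print(is_piercing([0.1, 0.2]))   # False: [1/2, 1) is missed at k = 2
```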
New developments and applications in quantitative electron spectroscopic imaging of iron in human liver biopsies
Reliable iron concentration data can be obtained by quantitative analysis of image sequences acquired by electron spectroscopic imaging. A number of requirements are formulated for the successful application of this recently developed in situ quantitative type of analysis, and a demonstration of the procedures is given. By applying the technique, it is established that there are no significant differences in the average iron loading of structures analysed in liver parenchymal cells of a patient with an iron storage disease before and after phlebotomy. This supports the hypothesis that the process of iron unloading is organelle-specific. Measurement of the binary morphology, represented by the area and contour ratio of the iron-containing objects, revealed no information about differences between the objects. This finding contradicts the visual suggestion that ferritin clusters are more irregularly shaped than the other iron objects. Nor could any differences in this sense be found between the situations before and after phlebotomy. With respect to density appearance, objects with an inhomogeneous iron loading contain, on average, more iron. This observation corresponds well with the visual impression that more heavily loaded structures appear increasingly irregular.