I/O-optimal algorithms on grid graphs
Given a graph of which the n vertices form a regular two-dimensional grid,
and in which each (possibly weighted and/or directed) edge connects a vertex to
one of its eight neighbours, the following can be done in O(scan(n)) I/Os,
provided M = Omega(B^2): computation of shortest paths with non-negative edge
weights from a single source, breadth-first traversal, computation of a minimum
spanning tree, topological sorting, time-forward processing (if the input is a
plane graph), and an Euler tour (if the input graph is a tree). The
minimum-spanning tree algorithm is cache-oblivious. The best previously
published algorithms for these problems need Theta(sort(n)) I/Os. Estimates of
the actual I/O volume show that the new algorithms may often be very efficient
in practice.
Comment: 12 pages' extended abstract plus 12 pages' appendix with details, proofs and calculations. Has not been published in, and is currently not under review at, any conference or journal.
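For reference, single-source shortest paths on such an eight-neighbour grid can be written as a textbook in-memory Dijkstra; the paper's contribution is achieving the same result with only O(scan(n)) I/Os in the external-memory model, which the sketch below does not attempt. The function name and the convention that an edge's weight is the cost of stepping onto its target cell are illustrative assumptions, not the paper's formulation:

```python
import heapq

def grid_dijkstra(weights, source):
    """Single-source shortest paths on an eight-neighbour grid.

    weights[r][c] is the non-negative cost of stepping onto cell (r, c).
    Returns a dict mapping each reachable cell to its distance from source.
    """
    rows, cols = len(weights), len(weights[0])
    dist = {source: 0}
    heap = [(0, source)]
    while heap:
        d, (r, c) = heapq.heappop(heap)
        if d > dist.get((r, c), float("inf")):
            continue  # stale heap entry, a shorter path was already found
        for dr in (-1, 0, 1):
            for dc in (-1, 0, 1):
                if dr == 0 and dc == 0:
                    continue
                nr, nc = r + dr, c + dc
                if 0 <= nr < rows and 0 <= nc < cols:
                    nd = d + weights[nr][nc]
                    if nd < dist.get((nr, nc), float("inf")):
                        dist[(nr, nc)] = nd
                        heapq.heappush(heap, (nd, (nr, nc)))
    return dist
```

On an n-cell grid this costs O(n log n) time but makes no attempt to control I/O; the point of the paper is that the grid's regular structure lets the same problems be solved with a linear number of sequential block transfers.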
Biologically informed ecological niche models for an example pelagic, highly mobile species
Background: Although pelagic seabirds are broadly recognised as indicators of the health of marine systems, numerous gaps exist in knowledge of their at-sea distributions at the species level. These gaps have profound negative impacts on the robustness of marine conservation policies. Correlative modelling techniques have provided some information, but few studies have explored model development for non-breeding pelagic seabirds. Here, I present a first phase in developing robust niche models for highly mobile species as a baseline for further development.
Methodology: Using observational data from a 12-year period, 217 unique model parameterisations across three correlative modelling algorithms (boosted regression trees, Maxent and minimum volume ellipsoids) were tested in a time-averaged approach for their ability to recreate the at-sea distribution of non-breeding Wandering Albatrosses (Diomedea exulans).
Principal Findings/Results: Overall, minimum volume ellipsoids outperformed both boosted regression trees and Maxent. However, whilst the latter two algorithms generally overfit the data, minimum volume ellipsoids tended to underfit it.
Conclusions: The results of this exercise suggest a necessary evolution in how correlative modelling for highly mobile species such as pelagic seabirds should be approached. These insights are crucial for understanding seabird-environment interactions at macroscales, which can facilitate the ability to address population declines and inform effective marine conservation policy in the wake of rapid global change.
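Ellipsoid-based envelope models of this kind classify an environmental cell as suitable when it falls inside an ellipsoid fitted to the presence points in environmental space. The sketch below illustrates the general idea with a simple mean-and-covariance ellipsoid and a Mahalanobis-distance threshold; a true minimum volume ellipsoid estimator is more robust than this, and the function names, the two environmental axes, and the threshold value are all illustrative assumptions rather than the paper's method:

```python
def fit_ellipsoid(points):
    """Fit a simple ellipsoidal envelope (mean + 2x2 covariance) to 2-D
    presence points in environmental space, e.g. (SST, chlorophyll)."""
    n = len(points)
    mx = sum(p[0] for p in points) / n
    my = sum(p[1] for p in points) / n
    sxx = sum((p[0] - mx) ** 2 for p in points) / n
    syy = sum((p[1] - my) ** 2 for p in points) / n
    sxy = sum((p[0] - mx) * (p[1] - my) for p in points) / n
    return (mx, my), (sxx, sxy, syy)

def mahalanobis_sq(point, mean, cov):
    """Squared Mahalanobis distance of a point from the ellipsoid centre."""
    (mx, my), (sxx, sxy, syy) = mean, cov
    det = sxx * syy - sxy * sxy  # determinant of the 2x2 covariance
    dx, dy = point[0] - mx, point[1] - my
    # apply the inverse covariance to the offset vector
    return (syy * dx * dx - 2 * sxy * dx * dy + sxx * dy * dy) / det

def inside_envelope(point, mean, cov, threshold=6.0):
    """Classify a cell as 'suitable' if it lies within the threshold."""
    return mahalanobis_sq(point, mean, cov) <= threshold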
Impact of large cutoff-effects on algorithms for improved Wilson fermions
As a feasibility study for a scaling test we investigate the behavior of
algorithms for dynamical fermions in the N_f=2 Schroedinger functional at an
intermediate volume of 1 fm^4. Simulations were performed using HMC with two
pseudo-fermions and PHMC at lattice spacings of approximately 0.1 and 0.07 fm.
We show that some algorithmic problems are due to large cutoff-effects in the
spectrum of the improved Wilson-Dirac operator and disappear at the smaller
lattice spacing. The problems discussed here are not expected to be specific to
the Schroedinger functional.
Comment: 19 pages, 12 figures, Sec. 2 extended, a few references added. Accepted for publication in Comp. Phys. Comm.
Multi-Grid Monte Carlo via Embedding. II. Two-Dimensional Principal Chiral Model
We carry out a high-precision simulation of the two-dimensional
principal chiral model at correlation lengths up to ,
using a multi-grid Monte Carlo (MGMC) algorithm and approximately one year of
Cray C-90 CPU time. We extrapolate the finite-volume Monte Carlo data to
infinite volume using finite-size-scaling theory, and we discuss carefully the
systematic and statistical errors in this extrapolation. We then compare the
extrapolated data to the renormalization-group predictions. The deviation from
asymptotic scaling, which is at , decreases to
at . We also analyze the dynamic critical
behavior of the MGMC algorithm using lattices up to , finding
the dynamic critical exponent
(subjective 68% confidence interval). Thus, for this asymptotically free model,
critical slowing-down is greatly reduced compared to local algorithms, but not
completely eliminated.
Comment: self-unpacking archive including .tex, .sty and .ps files; 126 pages including all figures.
An Empirical Study of Stochastic Variational Algorithms for the Beta Bernoulli Process
Stochastic variational inference (SVI) is emerging as the most promising
candidate for scaling inference in Bayesian probabilistic models to large
datasets. However, the performance of these methods has been assessed primarily
in the context of Bayesian topic models, particularly latent Dirichlet
allocation (LDA). Deriving several new algorithms, and using synthetic, image
and genomic datasets, we investigate whether the understanding gleaned from LDA
applies in the setting of sparse latent factor models, specifically beta
process factor analysis (BPFA). We demonstrate that the big picture is
consistent: using Gibbs sampling within SVI to maintain certain posterior
dependencies is extremely effective. However, we find that different posterior
dependencies are important in BPFA relative to LDA. Particularly,
approximations able to model intra-local variable dependence perform best.
Comment: ICML, 12 pages. Volume 37: Proceedings of The 32nd International Conference on Machine Learning, 2015.
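The core SVI mechanic the abstract builds on is a stochastic natural-gradient update of global variational parameters from minibatches. The sketch below shows that mechanic on a toy conjugate Beta-Bernoulli model rather than beta process factor analysis; the model choice, function name, and step-size constants are illustrative assumptions, not the paper's algorithms:

```python
import random

def svi_beta_bernoulli(data, n_steps=2000, batch=10, tau=1.0, kappa=0.7, seed=0):
    """Stochastic variational inference for a Beta-Bernoulli model.

    The global variational posterior over the Bernoulli parameter is
    Beta(a, b). Each step draws a minibatch, forms the 'intermediate'
    parameters the full dataset would imply, and blends them in with a
    decaying Robbins-Monro step size, as in standard SVI.
    """
    rng = random.Random(seed)
    n = len(data)
    a, b = 1.0, 1.0  # Beta(1, 1) prior
    for t in range(1, n_steps + 1):
        mb = [rng.choice(data) for _ in range(batch)]
        heads = sum(mb)
        # intermediate natural parameters: rescale minibatch counts to n
        a_hat = 1.0 + n * heads / batch
        b_hat = 1.0 + n * (batch - heads) / batch
        rho = (t + tau) ** (-kappa)  # decaying step size
        a = (1 - rho) * a + rho * a_hat
        b = (1 - rho) * b + rho * b_hat
    return a / (a + b)  # posterior-mean estimate of the coin bias
```

In BPFA the global parameters are the beta process atoms rather than a single Beta distribution, and the paper's finding is that replacing the fully factorised local update with Gibbs sampling over dependent local variables markedly improves this basic scheme.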
Validation of infarct size and location from the ECG by inverse body surface mapping
This paper describes the incorporation of body surface mapping algorithms to detect the position and size of acute myocardial infarctions using standard 12-lead ECG recordings. The results are compared with results from cardiac MRI scan analysis. When patient-specific volume conductor models were used, the position of the infarction could be determined accurately. When generalized patient volume conductor models were used, the estimate of the infarct position became significantly less accurate. The calculation of infarct size needs further improvement.