An Algorithm to Recover Shredded Random Matrices
Given a binary matrix, suppose we are presented with the collection of
its rows and columns in independent arbitrary orderings. From this information,
are we able to recover the unique original orderings and matrix? We present an
algorithm that identifies whether there is a unique ordering associated with a
set of rows and columns, and outputs either the unique correct orderings for
the rows and columns or the full collection of all valid orderings and valid
matrices. We show that there is a constant such that, for random binary
matrices with i.i.d.\ Bernoulli entries, the algorithm terminates within the
associated time bound with high probability and in expectation.
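The brute-force baseline behind this question can be sketched as follows. This is an illustrative exponential-time search, not the paper's algorithm: a matrix is a valid reassembly exactly when its row multiset and column multiset match the shredded input.

```python
from itertools import permutations

def shred(M):
    """Return the rows and columns of M, each in arbitrary (here: sorted) order."""
    rows = sorted(tuple(r) for r in M)
    cols = sorted(tuple(M[i][j] for i in range(len(M))) for j in range(len(M[0])))
    return rows, cols

def reassemble(rows, cols):
    """Return every matrix whose row multiset is `rows` and column multiset is `cols`."""
    target = sorted(cols)
    solutions = []
    for perm in permutations(rows):
        M = [list(r) for r in perm]
        got = sorted(tuple(M[i][j] for i in range(len(M))) for j in range(len(M[0])))
        if got == target:
            solutions.append(M)
    return solutions

M = [[1, 0, 1], [0, 1, 1], [0, 0, 1]]
rows, cols = shred(M)
sols = reassemble(rows, cols)
```

The original matrix is always among the valid reassemblies; when `sols` contains more than one matrix, the orderings are not uniquely recoverable. The paper's contribution is deciding uniqueness and recovering the ordering far faster than this exhaustive search.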
Statistical Machine Learning Methods for the Large Scale Analysis of Neural Data
Modern neurotechnologies enable the recording of neural activity at the scale of entire brains and with single-cell resolution. However, the lack of principled approaches to extract structure from these massive data streams prevents us from fully exploiting the potential of these technologies. This thesis, divided into three parts, introduces new statistical machine learning methods to enable the large-scale analysis of some of these complex neural datasets. In the first part, I present a method that leverages Gaussian quadrature to accelerate inference of neural encoding models from a certain type of observed neural point process --- spike trains --- resulting in substantial improvements over existing methods.
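As an illustration of the quadrature idea (not the thesis's actual implementation), Gauss--Hermite quadrature turns a Gaussian expectation --- the kind that appears inside Poisson encoding-model likelihoods --- into a short weighted sum:

```python
import numpy as np

def gauss_hermite_expectation(f, mu, sigma, n=20):
    # Nodes/weights for the physicists' Hermite weight exp(-x^2).
    x, w = np.polynomial.hermite.hermgauss(n)
    # Change of variables x -> mu + sqrt(2)*sigma*x turns the sum
    # into an approximation of E[f(X)] for X ~ N(mu, sigma^2).
    return float(np.sum(w * f(mu + np.sqrt(2) * sigma * x)) / np.sqrt(np.pi))

# For f = exp, the exact value is the log-normal mean exp(mu + sigma^2 / 2),
# so we can check the approximation directly.
approx = gauss_hermite_expectation(np.exp, mu=0.5, sigma=0.3)
exact = float(np.exp(0.5 + 0.3 ** 2 / 2))
```

With 20 nodes the quadrature is accurate to many digits for smooth integrands, replacing a numerical integral with a 20-term sum inside every likelihood evaluation.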
The second part focuses on the simultaneous electrical stimulation and recording of neurons using large electrode arrays. There, identification of neural activity is hindered by stimulation artifacts that are much larger than spikes and overlap temporally with them. To surmount this challenge, I develop an algorithm to infer and cancel this artifact, enabling inference of the neural signal of interest. This algorithm is based on a Bayesian generative model for the recordings, in which a structured Gaussian process represents prior knowledge of the artifact. The algorithm achieves near-perfect accuracy and enables the analysis of data hundreds of times faster than previous approaches.
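The core inference step in such models is Gaussian-process regression: the artifact estimate is a GP posterior mean given the raw recording. A minimal sketch, with an assumed squared-exponential kernel and synthetic data rather than the thesis's structured prior:

```python
import numpy as np

def rbf_kernel(a, b, lengthscale=0.2, variance=1.0):
    # Squared-exponential kernel matrix between 1-D point sets a and b.
    d = a[:, None] - b[None, :]
    return variance * np.exp(-0.5 * (d / lengthscale) ** 2)

def gp_posterior_mean(x_train, y_train, x_test, noise=1e-4):
    # Standard GP regression posterior mean: K_*^T (K + sigma^2 I)^{-1} y.
    K = rbf_kernel(x_train, x_train) + noise * np.eye(len(x_train))
    K_star = rbf_kernel(x_train, x_test)
    return K_star.T @ np.linalg.solve(K, y_train)

# Toy "artifact": a smooth waveform observed at a few time points.
x = np.linspace(0.0, 1.0, 8)
y = np.sin(2 * np.pi * x)
mean_at_train = gp_posterior_mean(x, y, x)
```

Subtracting the posterior-mean artifact from the recording leaves a residual in which spikes can be detected; the structured prior in the thesis exploits regularities of the artifact across electrodes and stimulation amplitudes to make this tractable at scale.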
The third part is motivated by the problem of inferring neural dynamics in the worm C. elegans: when taking a data-driven approach to this question, e.g., when using whole-brain calcium imaging data, one is faced with the need to match neural recordings to canonical neural identities, a task in practice resolved by tedious human labor. Alternatively, in a Bayesian setup this problem may be cast as posterior inference of a latent permutation. I introduce methods that enable gradient-based approximate posterior inference of permutations, overcoming the difficulties imposed by the combinatorial and discrete nature of this object. The results suggest the feasibility of automating neural identification, and demonstrate that variational inference over permutations is a sensible alternative to MCMC.
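A standard ingredient for making permutations amenable to gradients is the Sinkhorn operator, which maps an arbitrary score matrix to a (near) doubly-stochastic one --- a continuous relaxation of a permutation matrix. A minimal NumPy sketch, illustrative rather than the thesis code:

```python
import numpy as np

def sinkhorn(log_alpha, n_iters=50):
    # Alternate row and column normalization in log space; the result
    # converges to a doubly stochastic matrix, which relaxes the discrete
    # set of permutation matrices into a differentiable object.
    for _ in range(n_iters):
        log_alpha = log_alpha - np.logaddexp.reduce(log_alpha, axis=1, keepdims=True)
        log_alpha = log_alpha - np.logaddexp.reduce(log_alpha, axis=0, keepdims=True)
    return np.exp(log_alpha)

rng = np.random.default_rng(0)
P = sinkhorn(rng.normal(size=(4, 4)))
```

Because every operation above is differentiable, a score matrix parameterized by a neural network can be trained end to end, and a hard permutation can be recovered afterwards, e.g. by the Hungarian algorithm.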
Experimental and theoretical models of cultural evolution
This thesis contributes to the field of cultural evolution by presenting two
experimental and two theoretical models of cultural evolution. Prior to presenting these, I survey existing experimental and theoretical models of cultural evolution. In the first experiment, I test the hypothesis that increasing group size speeds up cultural accumulation, using a novel puzzle-solving task within a transmission chain design. I find support for this hypothesis, in contrast with previous experiments. In the second experiment, also using a transmission chain design, I examine perceptual errors in recreating Acheulean handaxes and ask whether such errors can account for the variability of Acheulean technology over time. Using the accumulated copying error model to compare the experimental data to archaeological records, I conclude that perceptual errors alone were likely not the driving force behind Acheulean evolution. In the first theoretical chapter, I present models of cultural differences between populations and of cumulative culture, which build on existing models and accord with empirical data. I then show that the models, when combined, have two qualitative regimes which may correspond to human and nonhuman culture. In the second theoretical chapter, I present a ‘fundamental theorem of cultural selection’, an equivalent of Fisher’s Fundamental Theorem of Natural Selection for cultural evolution. I discuss how this theorem formalizes and sheds light on cultural evolutionary theory. Finally, I conclude and discuss future research directions.
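For orientation, the classical machinery that such a cultural theorem parallels can be written via the Price equation (a standard identity, stated here for context rather than as the thesis's own theorem). With $w_i$ the fitness and $z_i$ the trait value of type $i$, and $\bar{w}$, $\bar{z}$ population means:

```latex
\bar{w}\,\Delta\bar{z} \;=\; \operatorname{Cov}(w_i, z_i) \;+\; \mathbb{E}\!\left[w_i\,\Delta z_i\right]
```

The covariance term captures selection and the expectation term transmission effects; Fisher's theorem specializes the selection component to $\Delta\bar{w}_{\mathrm{sel}} = \operatorname{Var}(w)/\bar{w}$, and a cultural analogue must account for the biased, many-to-many transmission that distinguishes culture from genes.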
Shotgun assembly of random graphs
In the graph shotgun assembly problem, we are given the ball of some fixed radius
around each vertex of a graph and asked to reconstruct the graph. We study the
shotgun assembly of the Erd\H{o}s-R\'enyi random graph for a
wide range of edge probabilities. We determine the threshold for reconstructibility
for each radius, extending and improving substantially on results of Mossel
and Ross. For other radii, we give upper and lower bounds that improve on
results of Gaudio and Mossel by polynomial factors. We also give a sharpening
of a result of Huang and Tikhomirov.

Comment: 36 pages, 3 figures
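The input to the problem --- the radius-r ball around a vertex --- can be computed by breadth-first search. A small illustrative sketch (adjacency-list representation assumed):

```python
from collections import deque

def ball(adj, v, r):
    """Vertices within distance r of v, plus the edges among them (BFS)."""
    dist = {v: 0}
    queue = deque([v])
    while queue:
        u = queue.popleft()
        if dist[u] == r:
            continue  # do not expand beyond radius r
        for w in adj[u]:
            if w not in dist:
                dist[w] = dist[u] + 1
                queue.append(w)
    verts = set(dist)
    edges = {frozenset((u, w)) for u in verts for w in adj[u] if w in verts}
    return verts, edges

# Path graph 0 - 1 - 2 - 3
adj = {0: [1], 1: [0, 2], 2: [1, 3], 3: [2]}
verts, edges = ball(adj, 1, 1)
```

In the shotgun assembly problem one receives only this collection of local views, with vertex names removed, and must decide whether they glue together into a unique graph.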
An optimizational approach for an algorithmic reassembly of fragmented objects
In Cambodia, close to the Thai border, lies the Angkor-style temple of Banteay Chhmar. Like many nearly forgotten temples in remote places, it has crumbled with age; today most of it is only a heap of stones. Manually reconstructing such temples is both complex and challenging: the conservation team is confronted with a pile of stones whose original positions are generally unknown. This reassembly task resembles a large-scale 3D puzzle. Usually, it is solved by a team of specialists who analyze each stone using their experience and knowledge of Khmer culture. Possible solutions are tried and retried, and the stones are placed in different locations until the correct one is found. This technique has three major drawbacks: first, the stones are damaged further by being moved continuously; second, handling such heavy weights threatens the safety of the workers; and third, the high complexity and labour intensity of the work mean that solving even a small part of the puzzle takes several months to several years.
These risks and conditions motivated the development of a virtual approach to reassembling the stones, as computer algorithms are in principle capable of enumerating all potential solutions in less time, thereby drastically reducing the amount of physical stone handling required. Furthermore, the virtual approach has the potential to reduce the on-site costs of in-situ analysis. The basis for this virtual puzzle algorithm is a set of high-resolution 3D models of more than one hundred stones. The stones can be viewed as polytopes of approximately cuboidal form, although some of them contain additional indentations. Exploiting these and related geometric features, together with a priori knowledge of the orientation of each stone, speeds up the process of matching the stones.
The aim of the current thesis is to solve this complex large-scale virtual 3D puzzle. To achieve this, a general workflow is developed which involves 1) simplifying the high-resolution models to their most characteristic features, 2) applying an advanced similarity analysis, 3) matching the best combinations, and 4) validating the results.
The simplification step is necessary to be able to quickly match potential side-surfaces. It introduces the new concept of a minimal volume box (MVB), designed to resemble Khmer stones closely and storage-efficiently. Additionally, this reduced edge-based model is used to segment the high-resolution data according to each side-surface. The second step presents a novel technique for conducting a similarity analysis of virtual temple stones. It is based on several geometric distance functions which determine the relatedness of a potential match, and it is capable of sorting out unlikely ones. The third step employs graph-theoretical methods to combine the similarity values into a correct solution of this large-scale 3D puzzle. The validation demonstrates the high quality and robustness of this newly constructed puzzle workflow.
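The similarity-then-matching pattern of steps 2 and 3 can be sketched in miniature. The face descriptors, distance function, and threshold below are hypothetical stand-ins for the thesis's geometric distance functions, and a greedy assignment stands in for its graph-theoretical combination step:

```python
import itertools

def face_distance(f1, f2):
    # Illustrative geometric distance: mismatch in (width, height) of two
    # side-surfaces extracted from their minimal volume boxes.
    return abs(f1[0] - f2[0]) + abs(f1[1] - f2[1])

def match_faces(faces_a, faces_b, threshold=0.5):
    # Score all candidate pairs, sort out unlikely ones via the threshold,
    # then greedily keep the best remaining pair until every face is used
    # at most once.
    scored = sorted(
        (face_distance(fa, fb), i, j)
        for (i, fa), (j, fb) in itertools.product(enumerate(faces_a), enumerate(faces_b))
    )
    used_a, used_b, matches = set(), set(), []
    for d, i, j in scored:
        if d <= threshold and i not in used_a and j not in used_b:
            used_a.add(i)
            used_b.add(j)
            matches.append((i, j, d))
    return matches

faces_a = [(1.0, 2.0), (0.5, 0.5)]
faces_b = [(0.52, 0.5), (1.01, 2.0)]
matches = match_faces(faces_a, faces_b)
```

The real workflow operates on many stones with several faces each, so the pairwise scores form a weighted graph and the final assignment is found with graph-theoretical methods rather than this greedy pass.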
The workflow presented in this thesis virtually puzzles together digitized stones of fallen straight Khmer temple walls. It is able to virtually and correctly reassemble up to 42 digitized stones while requiring a minimum of user interaction.