A computational model of the integration of landmarks and motion in the insect central complex.
The insect central complex (CX) is an enigmatic structure whose computational function has evaded inquiry, but has been implicated in a wide range of behaviours. Recent experimental evidence from the fruit fly (Drosophila melanogaster) and the cockroach (Blaberus discoidalis) has demonstrated the existence of neural activity corresponding to the animal's orientation within a virtual arena (a neural 'compass'), and this provides an insight into one component of the CX structure. There are two key features of the compass activity: an offset between the angle represented by the compass and the true angular position of visual features in the arena, and the remapping of the 270° visual arena onto an entire circle of neurons in the compass. Here we present a computational model which can reproduce this experimental evidence in detail, and predicts the computational mechanisms that underlie the data. We predict that both the offset and remapping of the fly's orientation onto the neural compass can be explained by plasticity in the synaptic weights between segments of the visual field and the neurons representing orientation. Furthermore, we predict that this learning is reliant on the existence of neural pathways that detect rotational motion across the whole visual field and use this rotation signal to drive the rotation of activity in a neural ring attractor. Our model also reproduces the 'transitioning' between visual landmarks seen when rotationally symmetric landmarks are presented. This model can provide the basis for further investigation into the role of the central complex, which promises to be a key structure for understanding insect behaviour, as well as suggesting approaches towards creating fully autonomous robotic agents.
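The ring-attractor mechanism described in this abstract, a ring of heading neurons whose activity bump is rotated by a whole-field rotation signal, can be illustrated with a minimal sketch. This is not the paper's model; the 36-neuron ring, the Gaussian bump, and all function names are illustrative assumptions.

```python
import numpy as np

N = 36  # assumed ring size: one compass neuron per 10 degrees of heading

def bump(center_idx, width=4.0):
    """Gaussian bump of activity centred on one neuron of the ring
    (a stand-in for the localized 'compass' activity)."""
    idx = np.arange(N)
    # circular distance on the ring
    d = np.minimum(np.abs(idx - center_idx), N - np.abs(idx - center_idx))
    return np.exp(-(d / width) ** 2)

def rotate_bump(activity, rotation_signal):
    """Shift the bump by a whole-field rotational-motion estimate,
    expressed here in neuron units (10 degrees per neuron)."""
    return np.roll(activity, int(round(rotation_signal)))

a = bump(0)
a = rotate_bump(a, 9)      # 90 degrees of visual rotation -> shift of 9 neurons
print(int(np.argmax(a)))   # -> 9
```

In the full model the visual-field-to-compass mapping is learned, which is how the offset and the 270°-to-360° remapping arise; the sketch above only captures the rotation-driven bump dynamics.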
Spiral Galaxies as Chiral Objects?
Spiral galaxies show axial symmetry and an intrinsic 2D-chirality.
Environmental effects can influence the chirality of originally isolated
stellar systems and a progressive loss of chirality can be recognised in the
Hubble sequence. We point out a preferential modality for genetic galaxies, as
in microscopic systems like amino acids, sugars or neutrinos. This feature could
be the remnant of a primordial symmetry breaking characterizing systems at all
scales.
They are young, and they are many: dating freshwater lineages in unicellular dinophytes
Dinophytes are one of the few protist groups that have an extensive fossil record and are therefore appropriate for time estimations. However, insufficient sequence data and strong rate heterogeneity have until now hindered placing dinophyte evolution into a time frame. Marine‐to‐freshwater transitions within this group are considered geologically old and evolutionarily exceptional due to strong physiological constraints that prevent such processes. Phylogenies based on concatenated rRNA sequences (including 19 new GenBank entries) of two major dinophyte lineages, Gymnodiniaceae and Peridiniales, were carried out using an uncorrelated molecular clock and five calibration points based on fossils. Contrary to previous assumptions, marine‐to‐freshwater transitions are more frequent in dinophytes (i.e. five marine–freshwater transitions in Gymnodiniaceae, and up to ten in Peridiniales, seven of them strongly supported), and none of them occurred as early as 140 MYA. Furthermore, most marine‐to‐freshwater transitions, and the subsequent diversification, took place after the Cretaceous–Paleogene boundary. Not older than 40 MYA, the youngest transitions within Gymnodiniaceae and Peridiniales occurred under the influence of the Eocene climate shift. Our evolutionary scenario indicates a gradual diversification of dinophytes without noticeable impact of catastrophic events; their freshwater lineages have originated several times independently at different points in time.
Low Complexity Regularization of Linear Inverse Problems
Inverse problems and regularization theory form a central theme in contemporary
signal processing, where the goal is to reconstruct an unknown signal from
partial, indirect, and possibly noisy measurements of it. A now standard method
for recovering the unknown signal is to solve a convex optimization problem
that enforces some prior knowledge about its structure. This has proved
efficient in many problems routinely encountered in imaging sciences,
statistics and machine learning. This chapter delivers a review of recent
advances in the field where the regularization prior promotes solutions
conforming to some notion of simplicity/low-complexity. These priors encompass
as popular examples sparsity and group sparsity (to capture the compressibility
of natural signals and images), total variation and analysis sparsity (to
promote piecewise regularity), and low-rank (as natural extension of sparsity
to matrix-valued data). Our aim is to provide a unified treatment of all these
regularizations under a single umbrella, namely the theory of partial
smoothness. This framework is very general and accommodates all low-complexity
regularizers just mentioned, as well as many others. Partial smoothness turns
out to be the canonical way to encode low-dimensional models that can be linear
spaces or more general smooth manifolds. This review is intended to serve as a
one stop shop toward the understanding of the theoretical properties of the
so-regularized solutions. It covers a large spectrum including: (i) recovery
guarantees and stability to noise, both in terms of ℓ²-stability and
model (manifold) identification; (ii) sensitivity analysis to perturbations of
the parameters involved (in particular the observations), with applications to
unbiased risk estimation; (iii) convergence properties of the forward-backward
proximal splitting scheme, which is particularly well suited to solving the
corresponding large-scale regularized optimization problem.
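For an ℓ¹ (sparsity) prior, the forward-backward proximal splitting scheme mentioned in point (iii) reduces to the well-known ISTA iteration: a gradient step on the data-fidelity term followed by soft-thresholding. The sketch below is a generic illustration of that scheme, not the chapter's code; the test problem, parameter values, and function names are all assumptions.

```python
import numpy as np

def soft_threshold(x, t):
    """Proximal operator of t * ||.||_1 (componentwise soft-thresholding)."""
    return np.sign(x) * np.maximum(np.abs(x) - t, 0.0)

def forward_backward(A, y, lam, n_iter=2000):
    """ISTA for min_x 0.5 * ||A x - y||^2 + lam * ||x||_1:
    forward (gradient) step, then backward (proximal) step."""
    L = np.linalg.norm(A, 2) ** 2      # Lipschitz constant of the gradient
    x = np.zeros(A.shape[1])
    for _ in range(n_iter):
        grad = A.T @ (A @ x - y)       # forward step direction
        x = soft_threshold(x - grad / L, lam / L)  # backward (prox) step
    return x

# tiny sanity check: recover a 1-sparse vector from noiseless measurements
rng = np.random.default_rng(0)
A = rng.standard_normal((20, 10))
x_true = np.zeros(10)
x_true[3] = 1.0
x_hat = forward_backward(A, A @ x_true, lam=0.01)
print(int(np.argmax(np.abs(x_hat))))  # largest coefficient at index 3
```

The same forward-backward template covers the other regularizers surveyed here (group sparsity, total variation, nuclear norm) by swapping in the corresponding proximal operator.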
Multi-year interlaboratory exercises for the analysis of illicit drugs and metabolites in wastewater: development of a quality control system
Thirty-seven laboratories from 25 countries present the development of an inter-laboratory testing scheme for the analysis of seven illicit drug residues in standard solutions, tap- and wastewater. Almost 10 000 concentration values were evaluated: triplicates of up to five samples and 26 laboratories per year. The setup was substantially improved with experiences gained across the six repetitions (e.g. matrix type, sample conditions, spiking levels). From this, (pre-)analytical issues (e.g. pH adjustment, filtration) were revealed for specific analytes, which resulted in the formulation of best-practice protocols for inter-laboratory setup and analytical procedures. The results illustrate the effectiveness of the inter-laboratory setup to assess laboratory performance in the framework of wastewater-based epidemiology. The exercise proved that measurements of laboratories were of high quality (>80% satisfactory results for six out of seven analytes) and that analytical follow-up is important to assist laboratories in improving robustness of wastewater-based epidemiology results.
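Laboratory performance in proficiency exercises of this kind is commonly scored with z-scores against an assigned value. The abstract does not state which metric this scheme used, so the following is a generic sketch under that assumption; the cut-offs follow the usual proficiency-testing convention and the concentration values are made up.

```python
import numpy as np

def z_scores(reported, assigned, sigma_pt):
    """Proficiency z-score: (x - assigned) / sigma_pt.
    Conventionally |z| <= 2 is satisfactory, 2 < |z| < 3 questionable,
    |z| >= 3 unsatisfactory."""
    return (np.asarray(reported, dtype=float) - assigned) / sigma_pt

def satisfactory_fraction(reported, assigned, sigma_pt):
    """Fraction of reported values with |z| <= 2."""
    z = z_scores(reported, assigned, sigma_pt)
    return float(np.mean(np.abs(z) <= 2))

# hypothetical reported concentrations (ng/L) for one analyte in one sample
labs = [98.0, 105.0, 92.0, 140.0, 101.0]
print(satisfactory_fraction(labs, assigned=100.0, sigma_pt=10.0))  # -> 0.8
```

Here the outlier at 140 ng/L scores z = 4 and would trigger the kind of analytical follow-up the exercise describes.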
Measurement of atmospheric tau neutrino appearance with IceCube DeepCore
We present a measurement of atmospheric tau neutrino appearance from oscillations with three years of data from the DeepCore subarray of the IceCube Neutrino Observatory. This analysis uses atmospheric neutrinos from the full sky with reconstructed energies between 5.6 and 56 GeV to search for a statistical excess of cascade-like neutrino events which are the signature of ντ interactions. For CC+NC (CC-only) interactions, we measure the tau neutrino normalization to be 0.73 +0.30/−0.24 (0.57 +0.36/−0.30) and exclude the absence of tau neutrino oscillations at a significance of 3.2σ (2.0σ). These results are consistent with, and of similar precision to, a confirmatory IceCube analysis also presented, as well as measurements performed by other experiments.
M.G. Aartsen … G.C. Hill … A. Kyriacou … A. Wallace … B.J. Whelan … et al. (The IceCube Collaboration)