Efficient Constellation-Based Map-Merging for Semantic SLAM
Data association in SLAM is fundamentally challenging, and handling ambiguity
well is crucial to achieve robust operation in real-world environments. When
ambiguous measurements arise, conservatism often mandates that the measurement
be discarded or a new landmark initialized rather than risking an incorrect
association. To address the resulting `duplicate' landmarks, we
present an efficient map-merging framework to detect duplicate constellations
of landmarks, providing a high-confidence loop-closure mechanism well-suited
for object-level SLAM. This approach uses an incrementally-computable
approximation of landmark uncertainty that only depends on local information in
the SLAM graph, avoiding expensive recovery of the full system covariance
matrix. This enables a search based on geometric consistency (GC) (rather than
full joint compatibility (JC)) that inexpensively reduces the search space to a
handful of `best' hypotheses. Furthermore, we reformulate the commonly-used
interpretation tree to allow for more efficient integration of clique-based
pairwise compatibility, accelerating the branch-and-bound max-cardinality
search. Our method is demonstrated to match the performance of full JC methods
at significantly-reduced computational cost, facilitating robust object-based
loop-closure over large SLAM problems.

Comment: Accepted to IEEE International Conference on Robotics and Automation (ICRA) 201
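The core idea of the hypothesis search described above can be sketched compactly: score candidate landmark-landmark matches by pairwise geometric consistency (preserved inter-landmark distances), then keep the largest mutually consistent set with a branch-and-bound max-cardinality (max-clique) search. This is a minimal illustrative sketch, not the paper's implementation; all function names, the distance tolerance, and the data are assumptions.

```python
# Illustrative sketch of pairwise geometric consistency + branch-and-bound
# max-clique search over match hypotheses. Names and thresholds are invented.
import itertools
import numpy as np

def consistency_matrix(map_a, map_b, matches, tol=0.1):
    """matches[k] = (i, j) pairs landmark i of map_a with landmark j of map_b.
    Two matches are geometrically consistent if they preserve the
    inter-landmark distance to within tol."""
    n = len(matches)
    C = np.zeros((n, n), dtype=bool)
    for k, l in itertools.combinations(range(n), 2):
        (i1, j1), (i2, j2) = matches[k], matches[l]
        da = np.linalg.norm(map_a[i1] - map_a[i2])
        db = np.linalg.norm(map_b[j1] - map_b[j2])
        C[k, l] = C[l, k] = abs(da - db) < tol
    return C

def max_clique(C):
    """Branch-and-bound search for the largest mutually consistent match set."""
    best = []
    def expand(clique, cands):
        nonlocal best
        if len(clique) + len(cands) <= len(best):   # bound: cannot beat best
            return
        if not cands:
            best = clique[:]
            return
        for idx, v in enumerate(cands):
            expand(clique + [v], [u for u in cands[idx + 1:] if C[v, u]])
    expand([], list(range(len(C))))
    return best
```

On two copies of the same constellation plus one spurious match, the search recovers exactly the true matches, since the spurious one violates pairwise distance consistency with every true match.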
Complexity Analysis and Efficient Measurement Selection Primitives for High-Rate Graph SLAM
Sparsity has been widely recognized as crucial for efficient optimization in
graph-based SLAM. Because the sparsity and structure of the SLAM graph reflect
the set of incorporated measurements, many methods for sparsification have been
proposed in hopes of reducing computation. These methods often focus narrowly
on reducing edge count without regard for structure at a global level. Such
structurally-naive techniques can fail to produce significant computational
savings, even after aggressive pruning. In contrast, simple heuristics such as
measurement decimation and keyframing are known empirically to produce
significant computation reductions. To demonstrate why, we propose a
quantitative metric called elimination complexity (EC) that bridges the
existing analytic gap between graph structure and computation. EC quantifies
the complexity of the primary computational bottleneck: the factorization step
of a Gauss-Newton iteration. Using this metric, we show rigorously that
decimation and keyframing impose favorable global structures and therefore
achieve computation reductions on the order of and , respectively,
where is the pruning rate. We additionally present numerical results
showing EC provides a good approximation of computation in both batch and
incremental (iSAM2) optimization and demonstrate that pruning methods promoting
globally-efficient structure outperform those that do not.

Comment: Pre-print accepted to ICRA 201
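The factorization bottleneck that an EC-style metric captures can be illustrated with a toy elimination game: removing a variable from the graph densifies the clique of its neighbours (fill-in), at a cost growing with its remaining degree. The cost model below (degree squared per elimination) and the graphs are assumptions for illustration only, not the paper's exact EC definition.

```python
# Toy proxy for factorization cost: simulate node elimination on an undirected
# graph, charging ~d^2 per node (d = remaining degree) and adding fill-in edges.
# This is NOT the paper's EC formula, only an illustration of the mechanism.
def elimination_cost(adj, order):
    """adj: dict node -> set of neighbours. Returns total elimination cost."""
    adj = {v: set(nbrs) for v, nbrs in adj.items()}   # work on a copy
    total = 0
    for v in order:
        nbrs = adj.pop(v)
        d = len(nbrs)
        total += d * d
        for u in nbrs:
            adj[u].discard(v)
            adj[u] |= (nbrs - {u})   # fill-in: remaining neighbours form a clique
    return total
```

Even on a four-node star this shows why ordering and structure matter: eliminating the hub first costs 14 units (it creates a triangle of fill-in), while eliminating the leaves first costs only 3, the same as a simple chain.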
Nonaffine rubber elasticity for stiff polymer networks
We present a theory for the elasticity of cross-linked stiff polymer
networks. Stiff polymers, unlike their flexible counterparts, are highly
anisotropic elastic objects. Similar to mechanical beams stiff polymers easily
deform in bending, while they are much stiffer with respect to tensile forces
(``stretching''). Unlike in previous approaches, where network elasticity is
derived from the stretching mode, our theory properly accounts for the soft
bending response. A self-consistent effective medium approach is used to
calculate the macroscopic elastic moduli starting from a microscopic
characterization of the deformation field in terms of ``floppy modes'' --
low-energy bending excitations that retain a high degree of non-affinity. The
length-scale characterizing the emergent non-affinity is given by the ``fiber
length'' , defined as the scale over which the polymers remain straight.
The calculated scaling properties for the shear modulus are in excellent
agreement with the results of recent simulations obtained in two-dimensional
model networks. Furthermore, our theory can be applied to rationalize bulk
rheological data in reconstituted actin networks.

Comment: 12 pages, 10 figures, revised Section II
The Carrington event not observed in most ice core nitrate records
The Carrington Event of 1859 is considered to be among the largest space weather events of the last 150 years. We show that only one out of 14 well-resolved ice core records from Greenland and Antarctica has a nitrate spike dated to 1859. No sharp spikes are observed in the Antarctic cores studied here. In Greenland, numerous spikes are observed in the 40 years surrounding 1859, but where other chemistry was measured, all large spikes carry the unequivocal signal of biomass burning plumes, including co-located spikes in ammonium, formate, black carbon and vanillic acid. It seems certain that most spikes in an earlier core, including that claimed for 1859, are also due to biomass burning plumes and not to solar energetic particle (SEP) events. We conclude that an event as large as the Carrington Event did not leave an observable, widespread imprint in nitrate in polar ice. Nitrate spikes cannot be used to derive the statistics of SEPs.
Comment on “Low time resolution analysis of ice cores cannot detect impulsive nitrate events” by D. F. Smart et al.
Smart et al. (2014) suggested that the detection of nitrate spikes in polar ice cores from solar energetic particle (SEP) events could be achieved if an analytical system with sufficiently high resolution was used. Here we show that the spikes they associate with SEP events are not reliably recorded in cores from the same location, even when the resolution is clearly adequate. We explain the processes that limit the effective resolution of ice cores. Liquid conductivity data suggest that the observed spikes are associated with sodium or another nonacidic cation, making it likely that they result from deposition of sea salt or similar aerosol that has scavenged nitrate, rather than from a primary input of nitrate in the troposphere. We consider that there is no evidence at present to support the identification of any spikes in nitrate as representing SEP events. Although such events undoubtedly create nitrate in the atmosphere, we see no plausible route to using nitrate spikes to document the statistics of such events.
Stiff Polymers, Foams and Fiber Networks
We study the elasticity of fibrous materials composed of generalized stiff
polymers. It is shown that in contrast to cellular foam-like structures affine
strain fields are generically unstable. Instead, a subtle interplay between the
architecture of the network and the elastic properties of its building blocks
leads to intriguing mechanical properties with intermediate asymptotic scaling
regimes. We present exhaustive numerical studies based on a finite element
method complemented by scaling arguments.

Comment: 4 pages, 5 figures
Atmospheric nitrogen oxides (NO and NO2) at Dome C, East Antarctica, during the OPALE campaign
Mixing ratios of the atmospheric nitrogen oxides NO and NO2 were measured as part of the OPALE (Oxidant Production in Antarctic Lands & Export) campaign at Dome C, East Antarctica (75.1 degrees S, 123.3 degrees E, 3233 m), from December 2011 to January 2012. Profiles of NOx mixing ratios in the lowest 100 m of the atmosphere confirm that, in contrast to the South Pole, air chemistry at Dome C is strongly influenced by large diurnal cycles in solar irradiance and a sudden collapse of the atmospheric boundary layer in the early evening. Depth profiles of mixing ratios in firn air suggest that the upper snowpack at Dome C holds a significant reservoir of photolytically produced NO2 and is a sink of gas-phase ozone (O3). First-time observations of bromine oxide (BrO) at Dome C show that mixing ratios of BrO near the ground are low, certainly less than 5 pptv, with higher levels in the free troposphere. Assuming steady state, the observed mixing ratios of BrO and RO2 radicals are too low to explain the large NO2 : NO ratios found in ambient air, possibly indicating an unknown process contributing to the atmospheric chemistry of reactive nitrogen above the Antarctic Plateau. During 2011-2012, NOx mixing ratios and flux were larger than in 2009-2010, consistent with the larger surface O3 mixing ratios resulting from increased net O3 production. Large NOx mixing ratios at Dome C arise from a combination of continuous sunlight, a shallow mixing height and significant NOx emissions by surface snow (F-NOx). During 23 December 2011 to 12 January 2012, the median F-NOx was twice that of the same period in 2009-2010, due to significantly larger atmospheric turbulence and a slightly stronger snowpack source.
A tripling of F-NOx in December 2011 was largely due to changes in snowpack source strength, caused primarily by changes in NO3- concentrations in the snow skin layer and only secondarily by the decrease of total column O3 and the associated increase in NO3- photolysis rates. A source of uncertainty in model estimates of F-NOx is the quantum yield of NO3- photolysis in natural snow, which may change over time as the snow ages.
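The steady-state argument invoked above can be made concrete with the standard photostationary-state relation: NO2 photolysis balances oxidation of NO by O3, RO2 and BrO, so [NO2]/[NO] = (k_O3[O3] + k_RO2[RO2] + k_BrO[BrO]) / j(NO2). The sketch below uses rough literature-style rate constants and invented inputs purely to show the calculation; it does not use any campaign data.

```python
# Photostationary-state NO2:NO ratio. Rate constants are approximate
# room-temperature literature values; all inputs here are illustrative.
def no2_no_ratio(j_no2, o3, ro2, bro,
                 k_o3=1.9e-14, k_ro2=8.0e-12, k_bro=2.1e-11):
    """Return [NO2]/[NO] at photochemical steady state.
    Number densities in molecules cm^-3, rate constants in cm^3 s^-1,
    j_no2 (NO2 photolysis frequency) in s^-1."""
    return (k_o3 * o3 + k_ro2 * ro2 + k_bro * bro) / j_no2
```

With this relation, low observed BrO and RO2 directly imply a small predicted ratio, which is the quantitative sense in which the measured NO2 : NO ratios are "too large" to explain.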
Current reversal and exclusion processes with history-dependent random walks
A class of exclusion processes in which particles perform history-dependent
random walks is introduced, stimulated by dynamic phenomena in some biological
and artificial systems. The particles locally interact with the underlying
substrate by breaking and reforming lattice bonds. We determine the
steady-state current on a ring, and find current-reversal as a function of
particle density. This phenomenon is attributed to the non-local interaction
between the walkers through their trails, which originates from strong
correlations between the dynamics of the particles and the lattice. We
rationalize our findings within an effective description in terms of
quasi-particles which we call front barriers. Our analytical results are
complemented by stochastic simulations.

Comment: 5 pages, 6 figures
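The simulation structure behind such models can be sketched generically: random sequential updates of hard-core walkers on a ring, with a mutable substrate state (the "trail") that feeds back on hop rates. The specific break/reform rules, rates, and the mechanism producing current reversal in the paper are not reproduced here; everything below is an assumed toy variant for illustration only.

```python
# Heavily-hedged Monte Carlo skeleton for an exclusion process on a ring whose
# walkers modify the substrate they cross. Rules and rates are invented; this
# shows only the generic structure, not the paper's model.
import random

def simulate(L=50, N=15, steps=20000, p_intact=1.0, p_broken=0.3,
             p_repair=0.05, seed=0):
    rng = random.Random(seed)
    occupied = [False] * L
    for i in rng.sample(range(L), N):
        occupied[i] = True
    broken = [False] * L      # bond i connects site i and i+1 (mod L)
    current = 0               # net number of successful rightward hops
    for _ in range(steps):
        i = rng.randrange(L)
        j = (i + 1) % L
        if broken[i] and rng.random() < p_repair:
            broken[i] = False                 # substrate slowly re-forms
        if occupied[i] and not occupied[j]:   # exclusion: target must be empty
            rate = p_broken if broken[i] else p_intact
            if rng.random() < rate:
                occupied[i], occupied[j] = False, True
                broken[i] = True              # the walker leaves a trail
                current += 1
    return current / steps
```

The returned value is the steady-state current per attempted update; studying it as a function of the density N/L is the kind of measurement from which a current reversal would be read off.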
Evolution of a beam dynamics model for the transport lines in a proton therapy facility
Although first-order beam dynamics models provide only an approximate
evaluation of beam properties, their contribution is essential during the
conceptual design of an accelerator or beamline. During commissioning, however,
some of their limitations become apparent when compared against measurements,
and an extension of the linear model to higher-order effects is therefore
required. In this paper, the effects of particle-matter interaction
have been included in the model of the transport lines in the proton therapy
facility at the Paul Scherrer Institut (PSI) in Switzerland. To improve the
performance of the facility, a more precise model was required and has been
developed with the multi-particle open source beam dynamics code called OPAL
(Object oriented Particle Accelerator Library). In OPAL, the Monte Carlo
simulations of Coulomb scattering and energy loss are performed seamlessly
alongside the particle tracking. Besides the linear optics, the influence of the passive
elements (e.g. degrader, collimators, scattering foils and air gaps) on the
beam emittance and energy spread can be analysed in the new model. This allows
for a significantly improved precision in the prediction of beam transmission
and beam properties. The accuracy of the OPAL model has been confirmed by
numerous measurements.

Comment: 17 pages, 19 figures
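One standard ingredient that any such particle-matter model needs is the Highland parameterization (as tabulated by the Particle Data Group) for the RMS multiple Coulomb scattering angle of a charged particle traversing material of thickness x and radiation length X0. The sketch below shows this well-known formula, not OPAL's internal implementation, and the usage numbers are illustrative only.

```python
# Highland formula for the RMS projected multiple-scattering angle.
# Standard PDG parameterization; inputs in the usage example are illustrative.
import math

def highland_theta0(p_mev, beta, x_over_X0, z=1):
    """RMS projected scattering angle (radians) after traversing x/X0
    radiation lengths; p in MeV/c, beta = v/c, z = particle charge."""
    return (13.6 / (beta * p_mev)) * z * math.sqrt(x_over_X0) * \
           (1.0 + 0.038 * math.log(x_over_X0))
```

For a proton at roughly therapy-beam momentum, a degrader of a tenth of a radiation length already gives a scattering angle of several milliradians, which is why passive elements dominate the emittance growth the model has to capture.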