
    Efficient Constellation-Based Map-Merging for Semantic SLAM

    Data association in SLAM is fundamentally challenging, and handling ambiguity well is crucial for robust operation in real-world environments. When ambiguous measurements arise, conservatism often mandates that the measurement be discarded or a new landmark initialized rather than risking an incorrect association. To address the inevitable 'duplicate' landmarks that arise, we present an efficient map-merging framework to detect duplicate constellations of landmarks, providing a high-confidence loop-closure mechanism well suited to object-level SLAM. This approach uses an incrementally computable approximation of landmark uncertainty that depends only on local information in the SLAM graph, avoiding expensive recovery of the full system covariance matrix. This enables a search based on geometric consistency (GC), rather than full joint compatibility (JC), that inexpensively reduces the search space to a handful of 'best' hypotheses. Furthermore, we reformulate the commonly used interpretation tree to allow more efficient integration of clique-based pairwise compatibility, accelerating the branch-and-bound max-cardinality search. Our method is demonstrated to match the performance of full JC methods at significantly reduced computational cost, facilitating robust object-based loop closure over large SLAM problems. Comment: Accepted to IEEE International Conference on Robotics and Automation (ICRA) 201
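The clique-based pairwise-compatibility search the abstract describes can be illustrated with a generic branch-and-bound maximum-clique sketch over a compatibility graph. This is a simplification of the paper's interpretation-tree formulation, not the authors' code; the function name and the toy compatibility test are illustrative:

```python
def max_consistent_set(hypotheses, compatible):
    """Branch-and-bound search for the largest set of pairwise-compatible
    association hypotheses, i.e. a maximum clique in the graph whose edges
    are given by the predicate `compatible(a, b)`."""
    best = []

    def expand(clique, candidates):
        nonlocal best
        if not candidates:
            if len(clique) > len(best):
                best = list(clique)
            return
        # Bound: even taking every remaining candidate cannot beat the incumbent.
        if len(clique) + len(candidates) <= len(best):
            return
        for i, v in enumerate(candidates):
            # Branch on v; keep only later candidates still compatible with v.
            rest = [u for u in candidates[i + 1:] if compatible(v, u)]
            expand(clique + [v], rest)

    expand([], list(hypotheses))
    return best


# Toy example: hypotheses 0-4 with a hand-built compatibility relation.
edges = {frozenset(e) for e in [(0, 1), (0, 2), (1, 2), (2, 3)]}
is_compat = lambda a, b: frozenset((a, b)) in edges
print(sorted(max_consistent_set(range(5), is_compat)))  # the clique {0, 1, 2}
```

In the paper's setting the compatibility predicate would be a geometric-consistency check between landmark pairs; the bound above is the simplest admissible one, and tighter bounds accelerate the search further.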

    Complexity Analysis and Efficient Measurement Selection Primitives for High-Rate Graph SLAM

    Sparsity has been widely recognized as crucial for efficient optimization in graph-based SLAM. Because the sparsity and structure of the SLAM graph reflect the set of incorporated measurements, many methods for sparsification have been proposed in hopes of reducing computation. These methods often focus narrowly on reducing edge count without regard for structure at a global level. Such structurally-naive techniques can fail to produce significant computational savings, even after aggressive pruning. In contrast, simple heuristics such as measurement decimation and keyframing are known empirically to produce significant computation reductions. To demonstrate why, we propose a quantitative metric called elimination complexity (EC) that bridges the existing analytic gap between graph structure and computation. EC quantifies the complexity of the primary computational bottleneck: the factorization step of a Gauss-Newton iteration. Using this metric, we show rigorously that decimation and keyframing impose favorable global structures and therefore achieve computation reductions on the order of r^2/9 and r^3, respectively, where r is the pruning rate. We additionally present numerical results showing that EC provides a good approximation of computation in both batch and incremental (iSAM2) optimization, and demonstrate that pruning methods promoting globally efficient structure outperform those that do not. Comment: Pre-print accepted to ICRA 201
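The factorization bottleneck the abstract points at can be made concrete with a standard flop-count proxy for sparse elimination (this is the classical quantity EC builds on, not the paper's exact metric; graph and orderings below are toy examples):

```python
def elimination_cost(adjacency, order):
    """Proxy for sparse-factorization cost: eliminate variables in `order`,
    adding fill-in edges among each eliminated node's surviving neighbours
    and accumulating d_k^2, where d_k is the neighbour count at elimination.
    This mirrors how elimination ordering drives Cholesky/QR work."""
    adj = {v: set(nbrs) for v, nbrs in adjacency.items()}
    cost = 0
    for v in order:
        nbrs = adj.pop(v)
        cost += len(nbrs) ** 2
        for u in nbrs:
            adj[u].discard(v)
            adj[u] |= (nbrs - {u})  # fill-in: neighbours become a clique
    return cost


# A 5-node chain is cheap under the natural ordering ...
chain = {0: {1}, 1: {0, 2}, 2: {1, 3}, 3: {2, 4}, 4: {3}}
print(elimination_cost(chain, [0, 1, 2, 3, 4]))  # 4

# ... while a star shows how badly a poor ordering (hub first) can do.
star = {0: {1, 2, 3, 4}, 1: {0}, 2: {0}, 3: {0}, 4: {0}}
print(elimination_cost(star, [1, 2, 3, 4, 0]))  # 4  (leaves first)
print(elimination_cost(star, [0, 1, 2, 3, 4]))  # 30 (hub first: dense fill-in)
```

The same mechanism explains the abstract's claim: decimation and keyframing shape the *global* sparsity pattern so that every reasonable elimination ordering stays cheap, whereas structurally-naive pruning can leave a pattern whose factorization still fills in heavily.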

    Interventional closure of persistent foetal cardiac shunts including PDA and PFO: study of outcome, complications and novel methods

    Background: Persistent foramen ovale (PFO) and persistent ductus arteriosus (PDA) are two of the most common congenital heart defects (CHD). The incidence of PFO is reported to be between 10% and 35% (1, 2). In term infants, a PDA is seen in around one in 2000 births, accounting for 5% to 10% of all congenital heart disease (3, 4). Transcatheter closure of these lesions has become standard procedure in children and adults and has largely replaced surgery for these congenital cardiac defects. Cardiac catheterisation techniques require ionising radiation and are generally classified as high-radiation dose procedures according to the European Directive 2013/59/Euratom (5). New methods and materials have improved outcomes and safety, and have shortened periprocedural hospital stays for patients undergoing catheterisation. Aim: This thesis aimed to study outcome and complications of new methods in patients undergoing heart catheterisation. The included studies aimed to evaluate new techniques, starting with vascular access and preclosure devices in the accessed vessel, to improve bleeding control and facilitate same-day discharge (SDD). A new method for surveillance of cancer risk during catheterisation procedures was introduced and can be used to alert the operator when the radiation has exceeded a certain cancer risk level. Lastly, several next-generation PDA devices were studied. Methods: Four retrospective studies were conducted. In Study I, data from 238 paediatric patients were collected to estimate the risk of radiation-induced cancer death. Study II reviewed all patients undergoing PDA closure with an Amplatzer device over a fourteen-year period in a large centre in Tel Aviv. Study III investigated same-day discharge (SDD) of adult patients undergoing PFO closure. All patients who underwent transcatheter closure of a PFO at the Karolinska University Hospital in Stockholm between March 2017 and June 2020 were included. 
Study IV included all patients who underwent transcatheter PDA closure with a 5/7 Occlutech® duct occluder in three European centres in the UK, France and Sweden. Results: More than 90% of the retrospective study cohort in Study I was within the range of very low (1–10 in 100,000) or low cancer risk level (1–10 in 10,000). No patient exceeded the high cancer risk level (> 1 in 100). In addition, a new concept of age- and gender-specific risk reference values (RRVs) related to population cancer risk was introduced. The results showed that the RRV for males was a factor of 2–3 higher than that for females. In Study II, all Amplatzer devices demonstrated very good closure rates (> 99.5%) with a low rate of complications, such as device embolisation or left pulmonary artery (LPA) stenosis. A tendency toward less LPA stenosis with the Piccolo™ device was noted, and no aortic flow disturbance occurred in this study. The majority of complications (device embolisation and LPA stenosis) occurred in patients with a bodyweight < 15 kg. Study III focused on SDD of patients undergoing percutaneous closure of PFO. A total of 246 of 262 patients (94%) had SDD. In 166 (63%) patients, a Perclose ProGlide™ system was used for femoral vein access closure. Post-interventional arrhythmias were noted in 17 (6%) of the patients, and vascular complications in nine patients (3%). There was no difference in SDD between patients who received ProGlide™ (n=159, 96%) and patients who did not (n=87, 91%; p=0.10). Eighteen paediatric patients with heart failure were included retrospectively at three study sites in Study IV. Eleven of them had a bodyweight below 12 kg, and pulmonary hypertension was noted in seven of the 18 patients. All patients underwent successful PDA closure with a 5/7 Occlutech® duct occluder, with no complications. 
Conclusions: Various aspects of cardiac catheterisation, from radiation risks and device choice to SDD and access-site closure, have been studied in this thesis. Transcatheter closure of persistent foetal cardiac shunts is the standard treatment in full-term children and adults, with low morbidity and mortality rates. New device development has improved PDA closure outcomes, even in small children with large PDAs. PFO closure has increased in the last five years because several randomised controlled trials have reported a lower risk of recurrent ischemic stroke after PFO closure compared with medical therapy. Cardiac catheterisations to treat CHD can be carried out with reasonably low radiation levels. Radiation-reducing tactics and the risk of radiation-induced cancer death must be taken into consideration, especially when treating younger patients.

    Categorification of persistent homology

    We redevelop persistent homology (topological persistence) from a categorical point of view. The main objects of study are diagrams, indexed by the poset of real numbers, in some target category. The set of such diagrams has an interleaving distance, which we show generalizes the previously-studied bottleneck distance. To illustrate the utility of this approach, we greatly generalize previous stability results for persistence, extended persistence, and kernel, image and cokernel persistence. We give a natural construction of a category of interleavings of these diagrams, and show that if the target category is abelian, so is this category of interleavings. Comment: 27 pages, v3: minor changes, to appear in Discrete & Computational Geometry
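For reference, the interleaving distance the abstract refers to can be stated in its now-standard form (a terse version of the categorical treatment; notation here is generic):

```latex
Given diagrams $F, G \colon (\mathbb{R}, \le) \to \mathcal{C}$ and
$\varepsilon \ge 0$, an $\varepsilon$-interleaving is a pair of families of
morphisms $\varphi_t \colon F(t) \to G(t+\varepsilon)$ and
$\psi_t \colon G(t) \to F(t+\varepsilon)$, natural in $t$, satisfying
\[
  \psi_{t+\varepsilon} \circ \varphi_t = F(t \le t + 2\varepsilon),
  \qquad
  \varphi_{t+\varepsilon} \circ \psi_t = G(t \le t + 2\varepsilon).
\]
The interleaving distance is then
\[
  d(F, G) \;=\; \inf \{\, \varepsilon \ge 0 \mid
  \text{an $\varepsilon$-interleaving between $F$ and $G$ exists} \,\}.
\]
```

For persistence modules (the case $\mathcal{C} = \mathrm{Vect}$), this distance agrees with the bottleneck distance between the corresponding persistence diagrams, which is the generalization the abstract mentions.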

    Collision Probabilities for Continuous-Time Systems Without Sampling [with Appendices]

    Demand for high-performance, robust, and safe autonomous systems has grown substantially in recent years. Fulfillment of these objectives requires accurate and efficient risk estimation that can be embedded in core decision-making tasks such as motion planning. On one hand, Monte-Carlo (MC) and other sampling-based techniques can provide accurate solutions for a wide variety of motion models but are cumbersome to apply in the context of continuous optimization. On the other hand, "direct" approximations aim to compute (or upper-bound) the failure probability as a smooth function of the decision variables, and thus are widely applicable. However, existing approaches fundamentally assume discrete-time dynamics and can perform unpredictably when applied to continuous-time systems operating in the real world, often manifesting as severe conservatism. State-of-the-art attempts to address this within a conventional discrete-time framework require additional Gaussianity approximations that ultimately produce inconsistency of their own. In this paper we take a fundamentally different approach, deriving a risk approximation framework directly in continuous time and producing a lightweight estimate that actually improves as the discretization is refined. Our approximation is shown to significantly outperform state-of-the-art techniques in replicating the MC estimate while maintaining the functional and computational benefits of a direct method. This enables robust, risk-aware, continuous motion-planning for a broad class of nonlinear, partially-observable systems. Comment: To appear at RSS 202
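The discrete-versus-continuous gap the abstract describes is easy to see with a generic MC baseline of the kind such methods are compared against. The sketch below (a 1-D Brownian model with illustrative parameters, not the paper's framework) estimates a barrier-crossing probability two ways: checking only at sample times, and adding the well-known Brownian-bridge crossing probability between samples:

```python
import math
import random


def mc_collision_prob(x0, drift, sigma, barrier, horizon, dt,
                      n_paths, bridge=True, seed=0):
    """Monte-Carlo estimate of P(x crosses `barrier` before `horizon`) for
    the 1-D diffusion dx = drift*dt + sigma*dW, Euler-Maruyama sampled.
    With bridge=True, the crossing probability of the Brownian bridge
    between consecutive samples is also accounted for."""
    rng = random.Random(seed)
    steps = int(round(horizon / dt))
    hits = 0
    for _ in range(n_paths):
        x = x0
        for _ in range(steps):
            xn = x + drift * dt + sigma * math.sqrt(dt) * rng.gauss(0.0, 1.0)
            if x >= barrier or xn >= barrier:
                hits += 1
                break
            if bridge:
                # Probability the continuous path crossed between samples,
                # even though both endpoints are below the barrier.
                p_cross = math.exp(-2.0 * (barrier - x) * (barrier - xn)
                                   / (sigma * sigma * dt))
                if rng.random() < p_cross:
                    hits += 1
                    break
            x = xn
    return hits / n_paths


# Coarse sampling (dt = 0.25) misses many crossings between samples:
naive = mc_collision_prob(0.0, 0.0, 1.0, 1.0, 1.0, 0.25, 4000, bridge=False)
corrected = mc_collision_prob(0.0, 0.0, 1.0, 1.0, 1.0, 0.25, 4000, bridge=True)
print(naive, corrected)  # corrected is markedly larger at this coarse dt
```

This illustrates the failure mode the abstract targets: a purely discrete-time risk estimate depends strongly on the sampling interval, whereas a formulation derived in continuous time should converge as the discretization is refined.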

    The evolution of GX 339-4 in the low-hard state as seen by NuSTAR and Swift

    We analyze eleven NuSTAR and Swift observations of the black hole X-ray binary GX 339-4 in the hard state, six of which were taken during the end of the 2015 outburst and five during a failed outburst in 2013. These observations cover luminosities from 0.5% to 5% of the Eddington luminosity. Implementing the most recent version of the reflection model relxillCp, we perform simultaneous spectral fits on both datasets to track the evolution of the properties of the accretion disk, including the inner edge radius, the ionization, and the temperature of the thermal emission. We also constrain the photon index and electron temperature of the primary source (the "corona"). We find the disk becomes more truncated as the luminosity decreases, and observe a maximum truncation radius of 37 R_g. We also explore a self-consistent model under the framework of coronal Comptonization, and find consistent results regarding the disk truncation in the 2015 data, providing a more physical preferred fit for the 2013 observations. Comment: 15 pages, 8 figures, 6 tables, accepted for publication in The Astrophysical Journal

    Basin-scale estimates of pelagic and coral reef calcification in the Red Sea and Western Indian Ocean.

    Basin-scale calcification rates are highly important in assessments of the global oceanic carbon cycle. Traditionally, such estimates were based on rates of sedimentation measured with sediment traps or in deep-sea cores. Here we estimated CaCO3 precipitation rates in the surface water of the Red Sea from total alkalinity depletion along its axial flow, using the water flux in the straits of Bab el Mandeb. The relative contributions of coral reefs and open-sea plankton were calculated by fitting a Rayleigh distillation model to the increase in the strontium-to-calcium ratio. We estimate the net amount of CaCO3 precipitated in the Red Sea to be 7.3 ± 0.4 × 10^10 kg·y^-1, of which 80 ± 5% is by pelagic calcareous plankton and 20 ± 5% is by the flourishing coastal coral reefs. This estimate of the pelagic calcification rate is up to 40% higher than published sedimentary CaCO3 accumulation rates for the region. The calcification rate of the Gulf of Aden was estimated by the Rayleigh model to be ∼1/2 that of the Red Sea, and in the northwestern Indian Ocean it was smaller than our detection limit. The results of this study suggest that variations of major ions on a basin scale may potentially help in assessing long-term effects of ocean acidification on carbonate deposition by marine organisms.
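The two calculation steps the abstract outlines reduce to short, standard formulas. The sketch below states them generically; the function names and the example numbers are illustrative, not the study's fitted values:

```python
def caco3_precipitation(ta_in, ta_out, water_flux):
    """Net CaCO3 precipitation inferred from total-alkalinity (TA) depletion
    between inflow and outflow: each mole of CaCO3 precipitated removes two
    moles of alkalinity, so the rate is half the TA deficit times the flux.
    Units are whatever the inputs carry (e.g. mol/kg and kg/y -> mol/y)."""
    return 0.5 * (ta_in - ta_out) * water_flux


def rayleigh_sr_ca(r0, f, alpha):
    """Sr/Ca ratio of residual seawater after a fraction (1 - f) of the Ca
    has precipitated, under Rayleigh distillation with effective distribution
    coefficient alpha. For alpha < 1 the mineral excludes Sr relative to Ca,
    so the seawater Sr/Ca ratio rises as precipitation proceeds."""
    return r0 * f ** (alpha - 1.0)


# Illustrative numbers only: a 100 umol/kg TA drawdown over a unit water flux,
# and the Sr/Ca enrichment after 10% of the Ca is removed with alpha = 0.2.
print(caco3_precipitation(2400.0, 2300.0, 1.0))  # 50.0
print(rayleigh_sr_ca(8.6, 0.9, 0.2) > 8.6)       # True: ratio increases
```

Because pelagic calcifiers and reef carbonates have different effective alpha values, fitting the observed Sr/Ca increase against the Rayleigh form is what lets the study apportion the total alkalinity-derived precipitation between the two sources.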