202 research outputs found

    Accurate 3D maps from depth images and motion sensors via nonlinear Kalman filtering

    Full text link
    This paper investigates the use of depth images as localization sensors for 3D map building. The localization information is derived from the 3D data by means of the ICP (Iterative Closest Point) algorithm. The covariance of the ICP, and thus of the localization error, is analysed and described by a Fisher Information Matrix. It is argued that this error can be much reduced if the data are fused with measurements from other motion sensors, or even with prior knowledge of the motion. The data fusion is performed by a recently introduced variant of the extended Kalman filter, the so-called Invariant EKF, and is based directly on the estimated covariance of the ICP. The resulting filter is very natural and is proved to possess strong properties. Experiments with a Kinect sensor and a three-axis gyroscope demonstrate a clear improvement in the accuracy of the localization, and thus in the accuracy of the built 3D map. Comment: Submitted to IROS 2012, 8 pages.
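
    A minimal sketch of the fusion idea described above, under simplifying assumptions: a single yaw angle is propagated with gyroscope readings and corrected with an ICP-derived yaw whose variance stands in for the ICP covariance analysed in the paper. This is a plain scalar Kalman filter, not the paper's Invariant EKF, and all names and numbers are illustrative.

        # Scalar Kalman filter sketch: gyroscope prediction + ICP-derived yaw correction.
        # The variance r_icp stands in for the covariance obtained from the Fisher
        # Information Matrix analysis; all values are illustrative.

        def predict(yaw, P, gyro_rate, dt, q=1e-4):
            # Propagate yaw with the gyroscope rate; inflate covariance by process noise q.
            return yaw + gyro_rate * dt, P + q

        def update(yaw, P, yaw_icp, r_icp):
            # Correct the prediction with an ICP-derived yaw of variance r_icp.
            k = P / (P + r_icp)                      # Kalman gain
            return yaw + k * (yaw_icp - yaw), (1.0 - k) * P

        yaw, P = 0.0, 1e-2
        for t in range(100):
            yaw, P = predict(yaw, P, gyro_rate=0.05, dt=0.1)
            if t % 10 == 0:                          # ICP pose available only at keyframes
                yaw, P = update(yaw, P, yaw_icp=0.05 * 0.1 * (t + 1), r_icp=5e-3)
        print(yaw, P)

    The same pattern extends to the full 3D pose; the key point retained from the abstract is that the update is weighted by the estimated ICP covariance rather than by a fixed tuning constant.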

    Enhancing reuse of data and biological material in medical research : from FAIR to FAIR-Health

    Get PDF
    The known challenge of underutilization of data and biological material from biorepositories as potential resources for medical research has been the focus of discussion for over a decade. Recently developed guidelines for improved data availability and reusability, the FAIR Principles (Findability, Accessibility, Interoperability, and Reusability), are likely to address only parts of the problem. In this article, we argue that biological material and data should be viewed as a unified resource. This approach would facilitate access to complete provenance information, which is a prerequisite for reproducibility and meaningful integration of the data. A unified view also allows for optimization of long-term storage strategies, as demonstrated in the case of biobanks. We propose an extension of the FAIR Principles to include the following additional components: (1) quality aspects related to research reproducibility and meaningful reuse of the data, (2) incentives to stimulate effective enrichment of data sets and biological material collections and their reuse at all levels, and (3) privacy-respecting approaches for working with human material and data. These FAIR-Health principles should then be applied to both the biological material and the data. We also propose the development of common guidelines for cloud architectures, due to the unprecedented growth of the volume and breadth of medical data generation and the associated need to process the data efficiently. Peer reviewed.

    Systematizing Decentralization and Privacy: Lessons from 15 Years of Research and Deployments

    Get PDF
    Decentralized systems are a subset of distributed systems where multiple authorities control different components and no authority is fully trusted by all. This implies that any component in a decentralized system is potentially adversarial. We review fifteen years of research on decentralization and privacy, and provide an overview of key systems, as well as key insights for designers of future systems. We show that decentralized designs can enhance privacy, integrity, and availability, but also require careful trade-offs in terms of system complexity, properties provided, and degree of decentralization. These trade-offs need to be understood and navigated by designers. We argue that a combination of insights from cryptography, distributed systems, and mechanism design, aligned with the development of adequate incentives, is necessary to build scalable and successful privacy-preserving decentralized systems.

    Consequences of hot gas in the broad line region of active galactic nuclei

    Get PDF
    Models for hot gas in the broad line region of active galactic nuclei are discussed. The results of two-phase equilibrium models for confinement of broad line clouds by Compton-heated gas are used to show that high-luminosity quasars are expected to show Fe XXVI Lyα line absorption, which should be observable with spectrometers such as those planned for future X-ray spectroscopy experiments. Two-phase equilibrium models also predict that the gas in the broad line clouds and the confining medium may be Compton thick. It is shown that the combined effects of Comptonization and photoabsorption can suppress both the broad emission lines and the X-rays in the Einstein and HEAO-1 energy bands. The observed properties of such Compton-thick active galaxies are expected to be similar to those of Seyfert 2 nuclei. The implications for polarization and variability are also discussed.
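
    For orientation on the "Compton thick" terminology used above (a standard definition, not a value taken from this abstract), a gas column is called Compton thick when its Thomson optical depth reaches unity:

        \[
          \tau_{\mathrm{T}} = N_{\mathrm{H}}\,\sigma_{\mathrm{T}} \gtrsim 1
          \quad\Longrightarrow\quad
          N_{\mathrm{H}} \gtrsim \sigma_{\mathrm{T}}^{-1} \approx 1.5\times10^{24}\ \mathrm{cm^{-2}},
        \]

    which is the regime in which the Comptonization and photoabsorption effects discussed in the abstract become important.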

    Supernova Remnants and Plerions in the Compton Gamma-Ray Observatory Era

    Get PDF
    Due to observations made by the Compton Gamma-Ray Observatory over the last six years, it appears that a number of galactic supernova remnants may be candidates for sources of cosmic gamma-rays. These include shell-type remnants such as IC443 and γ Cygni, which have no known parent pulsars but have significant associations with unidentified EGRET sources, and others that appear to be composite, where a pulsar is embedded in a shell (e.g. W44 and Vela), or are purely pulsar-driven, such as the Crab Nebula. This review discusses our present understanding of gamma-ray production in plerionic and non-plerionic supernova remnants, and explores the relationship between such emission and that in other wavebands. Focuses include models of the Crab and Vela nebulae, the composite nature of W44, the relationship of shell-type remnants to cosmic ray production, the relative importance of shock-accelerated protons and electrons, constraints on models placed by TeV, X-ray and radio observations, and the role of electrons injected directly into the remnants by parent pulsars. Comment: 21 pages, including 4 EPS figures, invited review, to appear in Proc. 4th Compton Symposium (1997), ed. Dermer, C. D. & Kurfess, J. D. (AIP, New York).
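
    As background to the proton-versus-electron question raised above (standard physics, not a result of the review): shock-accelerated protons produce gamma rays mainly through neutral-pion decay, whereas electrons contribute via bremsstrahlung and inverse-Compton scattering. The hadronic channel is

        \[
          p + p \rightarrow p + p + \pi^{0}, \qquad
          \pi^{0} \rightarrow \gamma + \gamma ,
        \]

    with each photon carrying $m_{\pi^{0}}c^{2}/2 \approx 67.5$ MeV in the pion rest frame, giving rise to the characteristic spectral bump that helps distinguish hadronic from leptonic emission.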

    Observability, Identifiability and Sensitivity of Vision-Aided Navigation

    Full text link
    We analyze the observability of motion estimates from the fusion of visual and inertial sensors. Because the model contains unknown parameters, such as sensor biases, the problem is usually cast as one of mixed identification and filtering, and the resulting observability analysis provides a necessary condition for any algorithm to converge to a unique point estimate. Unfortunately, most models treat sensor bias rates as noise, independent of other states including the biases themselves, an assumption that is patently violated in practice. When this assumption is lifted, the resulting model is not observable, and therefore past analyses cannot be used to conclude that the set of states that are indistinguishable from the measurements is a singleton. We therefore re-cast the analysis as one of sensitivity: rather than attempting to prove that the indistinguishable set is a singleton, which is not the case, we derive bounds on its volume as a function of characteristics of the input and its sufficient excitation. This provides an explicit characterization of the indistinguishable set that can be used for analysis and validation purposes.
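
    A schematic way to state the modeling issue described above (a generic sketch, not the paper's equations): most analyses model a sensor bias $b$ as a random walk driven by noise that is independent of the state $x$, whereas in practice the bias dynamics may depend on the state itself,

        \[
          \dot{b} = w_{b} \ \ (\text{standard assumption, } w_{b} \text{ white and independent of } x, b)
          \qquad\text{vs.}\qquad
          \dot{b} = f(x, b) + w_{b} .
        \]

    Under the first model, observability arguments can single out a unique state; once the independence assumption is dropped, the indistinguishable set is no longer a singleton, which motivates bounding its volume instead.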

    Information Leakage as a Model for Quality of Anonymity Networks

    Get PDF
    Measures for anonymity in systems must on the one hand be simple and concise, and on the other hand reflect the realities of real systems. Such systems are heterogeneous, as are the ways they are used, the deployed anonymity measures, and the possible attack methods. Implementation quality and the topologies of the anonymity measures must be considered as well. We therefore propose a new measure for the degree of anonymity that takes these various factors into account. We model the effectiveness of single mixes or of mix networks in terms of information leakage, and we measure it in terms of covert channel capacity. The relationship between the anonymity degree and information leakage is described, and an example is given.
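
    A minimal sketch of measuring leakage as mutual information, under made-up numbers (the prior and channel matrix below are hypothetical, and this is not the paper's metric definition): the adversary's observation Y is modeled as the output of a noisy channel whose input X is the sender identity, and the leakage is I(X;Y) in bits; the channel capacity is the maximum of this quantity over priors.

        # Toy leakage computation for a mix modeled as a noisy channel p(y|x).
        import numpy as np

        def mutual_information(p_x, p_y_given_x):
            # I(X;Y) in bits for prior p_x and channel rows p_y_given_x[x, y].
            p_xy = p_x[:, None] * p_y_given_x          # joint distribution
            p_y = p_xy.sum(axis=0)                     # marginal of the observation
            with np.errstate(divide="ignore", invalid="ignore"):
                ratio = np.where(p_xy > 0, p_xy / (p_x[:, None] * p_y[None, :]), 1.0)
            return float((p_xy * np.log2(ratio)).sum())

        # Two senders; a perfect mix would make the rows identical (zero leakage).
        p_x = np.array([0.5, 0.5])
        channel = np.array([[0.9, 0.1],
                            [0.2, 0.8]])
        print(f"leakage = {mutual_information(p_x, channel):.3f} bits")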

    Controlled Cloud Based SaaS Service Scheme to Denial the Hotspot Locating Attacks in Wireless Sensor Networks

    Get PDF
    In wireless sensor networks, an opponent can make use of traffic information to locate monitored objects in a Software as a Service (SaaS) setting, e.g., to locate the monitored soldiers. In this paper, we first define a hotspot phenomenon through SaaS: it causes an obvious inconsistency in the network traffic pattern due to the large volume of packets originating from a small area in a partially controlled cloud-based scheme. Second, we develop a realistic opponent model, assuming that the opponent can monitor the network traffic in multiple areas, rather than the entire network or only one area. Using this model, we define the Hotspot-Locating attack, in which the opponent uses traffic analysis techniques to locate hotspots. Finally, we propose a controlled cloud-based SaaS scheme for efficiently protecting against the Hotspot-Locating attack by creating a controlled cloud with an irregular shape of fake traffic, introducing inconsistency in the traffic pattern and camouflaging the source node among the nodes forming the controlled cloud. To reduce the energy cost, controlled clouds are active only during data transmission, and the intersection of controlled clouds creates a larger merged controlled cloud, which reduces the number of fake packets and also boosts privacy preservation. Simulation and analytical results demonstrate that our scheme can provide stronger protection than routing-based schemes and requires much less energy than global-adversary schemes.
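
    A toy illustration of the cloud-merging idea described above, under simplifying assumptions (one fake packet per cloud node per round; node ids and cloud memberships are invented, and this is not the paper's protocol): merging overlapping controlled clouds sends fake traffic from the union of their nodes, which reduces the total number of fake packets per round.

        # Greedily merge fake-traffic clouds (sets of node ids) that share nodes,
        # and compare the fake-packet cost per round before and after merging.
        def merge_clouds(clouds):
            merged = []
            for cloud in clouds:
                cloud = set(cloud)
                overlapping = [m for m in merged if m & cloud]
                for m in overlapping:
                    cloud |= m
                    merged.remove(m)
                merged.append(cloud)
            return merged

        clouds = [{1, 2, 3, 4}, {3, 4, 5, 6}, {9, 10}]
        separate_cost = sum(len(c) for c in clouds)           # 10 fake packets
        merged_cost = sum(len(c) for c in merge_clouds(clouds))  # 8 fake packets
        print(separate_cost, merged_cost)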

    Statistical relational learning of semantic models and grammar rules for 3D building reconstruction from 3D point clouds

    Get PDF
    Formal grammars are well suited for the estimation of models with an a-priori unknown number of parameters, such as buildings, and have proven their worth for 3D modeling and reconstruction of cities. However, the generation and design of the corresponding grammar rules is a laborious task that relies on expert knowledge. This thesis presents novel approaches for reducing this effort using advanced machine learning methods, resulting in automatically learned, sophisticated grammar rules. Learning a wide range of sophisticated rules that reflect the variety and complexity of buildings is a challenging task, especially if building structures and their underlying aggregation hierarchies are to be learned simultaneously with the building parameters and the constraints among them for a semantic interpretation. This thesis therefore follows an incremental approach that separates the structure learning from the learning of the parameter distributions of building parts. Moreover, procedural approaches with formal grammars have so far been more suitable for the generation of virtual city models than for the reconstruction of existing buildings. To address this, Inductive Logic Programming (ILP) techniques are transferred and applied for the first time to the field of 3D building modeling. This enables the automatic learning of declarative logic programs, which are equivalent in expressiveness to attribute grammars and separate the representation of buildings and their parts from the reconstruction task. A stepwise bottom-up learning procedure, starting from the smallest atomic features of a building part together with the semantic, topological and geometric constraints, is the key to successfully learning a whole building part. Only a few examples are sufficient to learn from precise as well as noisy observations. Learning from uncertain data is realized using probability density functions, decision trees and uncertain projective geometry, which enables the handling and modeling of uncertain topology and geometric reasoning under noise. The uncertainty of the models themselves is also considered. To this end, a novel method is developed for learning a Weighted Attribute Context-Free Grammar (WACFG). On the one hand, the structure learning of façades – the context-free part of the grammar – is performed on annotated derivation trees using specific Support Vector Machines (SVMs), which are able to derive probabilistic models from structured data and to predict the most likely tree for given observations. On the other hand, to the best of my knowledge, Statistical Relational Learning (SRL), in particular Markov Logic Networks (MLNs), is applied for the first time to learn building part parameters (shape and location) as well as the constraints among these parts. The use of SRL makes it possible to profit from elegant relational logical descriptions while benefiting from the efficiency of statistical inference methods. In order to model latent prior knowledge and exploit the architectural regularities of buildings, a novel method is developed for the automatic identification of translational as well as axial symmetries. For symmetry identification, a supervised machine learning approach based on an SVM classifier is followed. Building upon the classification results, algorithms are designed for the representation of symmetries using context-free grammars derived from authoritative building footprints.
In all steps, the machine learning is performed on real-world data such as 3D point clouds and building footprints, and the handling of uncertainty and occlusions is ensured. The presented methods have been successfully applied to real data, and the corresponding classification and reconstruction results are shown.
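
    To make the grammar objects discussed above concrete, the following is a minimal, hypothetical sketch of the kind of structure a Weighted Attribute Context-Free Grammar describes (only the weighted context-free part; the attributes, constraints and learned weights of the thesis are omitted, and all rules and numbers are invented for illustration).

        # Toy weighted CFG for a facade: each nonterminal maps to (weight, right-hand side)
        # alternatives; the most likely derivation is read off by always taking the
        # highest-weight rule.
        WACFG = {
            "Facade": [(0.7, ["GroundFloor", "UpperFloors"]),
                       (0.3, ["UpperFloors"])],
            "UpperFloors": [(0.6, ["Floor"]),
                            (0.4, ["Floor", "UpperFloors"])],
            "Floor": [(0.8, ["Window", "Window", "Window"]),
                      (0.2, ["Window", "Door", "Window"])],
        }

        def most_likely(symbol):
            # Expand a symbol greedily by its highest-weight rule; terminals are leaves.
            if symbol not in WACFG:
                return [symbol], 1.0
            weight, rhs = max(WACFG[symbol])
            leaves, prob = [], weight
            for child in rhs:
                child_leaves, child_prob = most_likely(child)
                leaves += child_leaves
                prob *= child_prob
            return leaves, prob

        print(most_likely("Facade"))   # (['GroundFloor', 'Window', 'Window', 'Window'], 0.336)

    In the thesis the weights and structure of such rules are learned from annotated derivation trees, and attributes with constraints (dimensions, positions) are attached to the symbols; the sketch only shows the weighted derivation mechanism.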