6,674 research outputs found

    Affine Registration of label maps in Label Space

    Two key aspects of coupled multi-object shape analysis and atlas generation are the choice of representation and the subsequent registration methods used to align the sample set. For example, a typical brain image can be labeled into three structures: grey matter, white matter, and cerebrospinal fluid. Many manipulations such as interpolation, transformation, smoothing, or registration need to be performed on these images before they can be used in further analysis. Current techniques for such analysis tend to trade off performance between the two tasks, performing well for one task but developing problems when used for the other.

    This article proposes a representation that is flexible and well suited for both tasks. We propose to map object labels to the vertices of a regular simplex, e.g., the unit interval for two labels, a triangle for three labels, a tetrahedron for four labels, etc. This representation, which is routinely used in fuzzy classification, is ideally suited for representing and registering multiple shapes. On closer examination, this representation reveals several desirable properties: algebraic operations may be done directly, label uncertainty is expressed as a weighted mixture of labels (a probabilistic interpretation), interpolation is unbiased toward any label or the background, and registration may be performed directly.

    We demonstrate these properties by using label space in a gradient-descent-based registration scheme to obtain a probabilistic atlas. While straightforward, this iterative method is very slow, can get stuck in local minima, and depends heavily on the initial conditions. To address these issues, two fast methods are proposed that serve as coarse registration schemes, after which the iterative descent method can be used to refine the results. Further, we derive an analytical formulation for direct computation of the "group mean" from the parameters of pairwise registration of all the images in the sample set. We show results on richly labeled 2D and 3D data sets.
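
    As a minimal illustration (not the authors' code), the label-space idea can be realized with the standard simplex, whose vertices are one-hot vectors: interpolating two label maps then yields a probabilistic mixture of labels rather than a meaningless intermediate label value. The function names below and the use of NumPy are assumptions made for this sketch.

        import numpy as np

        def labels_to_simplex(label_map, num_labels):
            """Map an integer label image of shape (H, W) to label space (H, W, num_labels)."""
            # Rows of the identity matrix are the vertices of the standard simplex.
            return np.eye(num_labels)[label_map]

        def simplex_to_labels(label_space):
            """Project a label-space image back to discrete labels (nearest vertex)."""
            return np.argmax(label_space, axis=-1)

        # Averaging two label maps in label space gives an unbiased label mixture:
        a = np.array([[0, 1], [2, 2]])
        b = np.array([[1, 1], [2, 0]])
        mix = 0.5 * labels_to_simplex(a, 3) + 0.5 * labels_to_simplex(b, 3)
        print(mix[0, 0])               # [0.5 0.5 0. ] -- half label 0, half label 1
        print(simplex_to_labels(mix))  # collapse back to discrete labels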

    Laplacian Mixture Modeling for Network Analysis and Unsupervised Learning on Graphs

    Laplacian mixture models identify overlapping regions of influence in unlabeled graph and network data in a scalable and computationally efficient way, yielding useful low-dimensional representations. By combining Laplacian eigenspace and finite mixture modeling methods, they provide probabilistic or fuzzy dimensionality reductions or domain decompositions for a variety of input data types, including mixture distributions, feature vectors, and graphs or networks. Provably optimal recovery using the algorithm is shown analytically for a nontrivial class of cluster graphs. Heuristic approximations for scalable high-performance implementations are described and empirically tested. Connections to PageRank and community detection in network analysis demonstrate the wide applicability of this approach. The origins of fuzzy spectral methods, beginning with generalized heat or diffusion equations in physics, are reviewed and summarized. Comparisons to other dimensionality reduction and clustering methods for challenging unsupervised machine learning problems are also discussed.
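
    As a rough sketch of the general approach (one plausible instantiation, not the paper's algorithm or its scalability heuristics): embed the nodes with the leading eigenvectors of a normalized graph Laplacian and fit a finite mixture model in that eigenspace, so each node receives soft, probabilistic memberships. SciPy and scikit-learn are assumed to be available.

        import numpy as np
        from scipy.linalg import eigh
        from scipy.sparse.csgraph import laplacian
        from sklearn.mixture import GaussianMixture

        def fuzzy_graph_memberships(adjacency, n_components):
            """Soft node-to-component memberships from a Laplacian eigenspace embedding."""
            L = laplacian(adjacency, normed=True)   # normalized graph Laplacian
            _, eigvecs = eigh(L)                    # eigenvectors, ascending eigenvalues
            embedding = eigvecs[:, :n_components]   # low-frequency (community) structure
            gmm = GaussianMixture(n_components=n_components, covariance_type="spherical")
            gmm.fit(embedding)
            return gmm.predict_proba(embedding)     # fuzzy membership per node

        # Toy graph: two triangles joined by a single bridge edge.
        A = np.zeros((6, 6))
        for block in ([0, 1, 2], [3, 4, 5]):
            for i in block:
                for j in block:
                    if i != j:
                        A[i, j] = 1.0
        A[2, 3] = A[3, 2] = 1.0
        print(fuzzy_graph_memberships(A, 2).round(2))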

    A Philosophical Foundation of Non-Additive Measure and Probability

    In this paper, non-additivity of a set function is interpreted as a method to express relations between sets that are not modeled in a set-theoretic way. Drawing upon a concept called "quasi-analysis" by the philosopher Rudolf Carnap, we introduce a transform for sets, functions, and set functions to formalize this idea. Any image set under this transform can be interpreted as a class of (quasi-)components or (quasi-)properties representing the original set. We show that non-additive set functions can be represented as signed σ-additive measures defined on sets of quasi-components. We then use this interpretation to justify the use of non-additive set functions in various applications such as multi-criteria decision making and cooperative game theory. Additionally, using independence, conditioning, and products as examples, we show how concepts from classical measure and probability theory can be transferred to the non-additive theory via the transform.
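
    For reference, one standard special case of a non-additive set function (not taken from the paper) is a capacity: it drops additivity but keeps monotonicity, so that, for instance, μ({1,2}) may exceed μ({1}) + μ({2}) to express synergy between criteria or players.

        \[
          \mu : 2^{\Omega} \to \mathbb{R}_{\ge 0}, \qquad \mu(\emptyset) = 0, \qquad
          A \subseteq B \;\Rightarrow\; \mu(A) \le \mu(B),
        \]
        \[
          \text{whereas a } \sigma\text{-additive measure } \nu \text{ satisfies }\;
          \nu\Bigl(\textstyle\bigcup_{i \ge 1} A_i\Bigr) = \sum_{i \ge 1} \nu(A_i)
          \;\text{ for pairwise disjoint } A_i .
        \]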

    Interpretation of Natural-language Robot Instructions: Probabilistic Knowledge Representation, Learning, and Reasoning

    A robot that can simply be told in natural language what to do -- this has been one of the ultimate long-standing goals of both Artificial Intelligence and Robotics research. In near-future applications, robotic assistants and companions will have to understand and perform commands such as "set the table for dinner", "make pancakes for breakfast", or "cut the pizza into 8 pieces". Although such instructions are only vaguely formulated, complex sequences of sophisticated and accurate manipulation activities need to be carried out in order to accomplish the respective tasks. The acquisition of knowledge about how to perform these activities from huge collections of natural-language instructions from the Internet has garnered a lot of attention within the last decade. However, natural language is typically highly underspecified, incomplete, ambiguous, and vague, and thus requires powerful means of interpretation. This work presents PRAC -- Probabilistic Action Cores -- an interpreter for natural-language instructions that is able to resolve vagueness and ambiguity in natural language and infer the missing pieces of information required to render an instruction executable by a robot. To this end, PRAC formulates the problem of instruction interpretation as a reasoning problem in first-order probabilistic knowledge bases. In particular, the system uses Markov logic networks as a carrier formalism for encoding uncertain knowledge. A novel framework for reasoning about unmodeled symbolic concepts is introduced, which incorporates ontological knowledge from taxonomies and exploits semantically similar relational structures in a domain of discourse. The resulting reasoning framework thus enables more compact representations of knowledge and exhibits strong generalization performance when learned from very sparse data. Furthermore, a novel approach for completing directives is presented, which applies semantic analogical reasoning to transfer knowledge collected from thousands of natural-language instruction sheets to new situations. In addition, a cohesive processing pipeline is described that transforms vague and incomplete task formulations into sequences of formally specified robot plans. The system is connected to a plan executive that is able to execute the computed plans in a simulator. Experiments conducted in a publicly accessible, browser-based web interface showcase that PRAC is capable of closing the loop from natural-language instructions to their execution by a robot.
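
    For reference, the standard Markov logic network distribution that such first-order probabilistic knowledge bases build on (the general definition, not anything PRAC-specific): a set of weighted first-order formulas (F_i, w_i) defines a log-linear distribution over possible worlds x,

        \[
          P(X = x) \;=\; \frac{1}{Z}\,\exp\!\Bigl(\sum_i w_i\, n_i(x)\Bigr),
          \qquad
          Z \;=\; \sum_{x'} \exp\!\Bigl(\sum_i w_i\, n_i(x')\Bigr),
        \]

    where n_i(x) is the number of true groundings of formula F_i in world x, so each satisfied grounding of a positively weighted formula makes a world proportionally more probable.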