
    Hierarchical Object Parsing from Structured Noisy Point Clouds

    Object parsing and segmentation from point clouds are challenging tasks because the relevant data is available only as thin structures along object boundaries or other features, and is corrupted by large amounts of noise. To handle this kind of data, flexible shape models are desired that can accurately follow the object boundaries. Popular models such as Active Shape and Active Appearance Models lack the necessary flexibility for this task, while recent approaches such as Recursive Compositional Models make model simplifications in order to obtain computational guarantees. This paper investigates a hierarchical Bayesian model of shape and appearance in a generative setting. The input data is explained by an object parsing layer, which is a deformation of a hidden PCA shape model with a Gaussian prior. The paper also introduces a novel, efficient inference algorithm that uses informed, data-driven proposals to initialize local searches for the hidden variables. Applied to the problem of object parsing from structured point clouds such as edge detection images, the proposed approach obtains state-of-the-art parsing errors on two standard datasets without using any intensity information. (Comment: 13 pages, 16 figures)
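
    A minimal sketch, not the authors' implementation, of how shape hypotheses can be drawn from a PCA shape model with a Gaussian prior on the coefficients, as assumed by the parsing layer described above. The landmark layout and parameter values are illustrative.

    import numpy as np

    def sample_shape(mean_shape, components, eigenvalues, rng=None):
        """mean_shape: (2N,) flattened landmark coordinates.
        components: (K, 2N) principal modes; eigenvalues: (K,) mode variances."""
        rng = np.random.default_rng() if rng is None else rng
        # Gaussian prior: each coefficient b_k ~ N(0, lambda_k)
        b = rng.normal(0.0, np.sqrt(eigenvalues))
        return mean_shape + components.T @ b

    # Illustrative use with a toy 4-landmark model (two hypothetical modes)
    mean = np.zeros(8)
    modes = np.eye(2, 8)
    lams = np.array([4.0, 1.0])
    print(sample_shape(mean, modes, lams))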

    Towards Intelligent Databases

    This article is a presentation of the objectives and techniques of deductive databases. The deductive approach to databases aims at extending other database paradigms, which describe applications extensionally, with intensional definitions. We first show how constructive specifications can be expressed with deduction rules, and how normative conditions can be defined using integrity constraints. We outline the principles of bottom-up and top-down query answering procedures and present the techniques used for integrity checking. We then argue that it is often desirable to use a database system to manage not only database applications, but also specifications of system components. We present such meta-level specifications and discuss their advantages over conventional approaches.
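
    A hedged sketch of bottom-up query answering over deduction rules, in the spirit of the techniques outlined above. The relation names (parent/ancestor) and the naive fixpoint strategy are illustrative choices, not taken from the article.

    facts = {("parent", "ann", "bob"), ("parent", "bob", "cia")}

    def apply_rules(db):
        """Naive fixpoint for: ancestor(X,Y) :- parent(X,Y).
                               ancestor(X,Z) :- parent(X,Y), ancestor(Y,Z)."""
        derived = set(db)
        changed = True
        while changed:
            changed = False
            new = {("ancestor", x, y) for (r, x, y) in derived if r == "parent"}
            new |= {("ancestor", x, z)
                    for (r1, x, y) in derived if r1 == "parent"
                    for (r2, y2, z) in derived if r2 == "ancestor" and y2 == y}
            if not new <= derived:
                derived |= new
                changed = True
        return derived

    print(sorted(t for t in apply_rules(facts) if t[0] == "ancestor"))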

    MENU: multicast emulation using netlets and unicast

    High-end networking applications such as Internet TV and software distribution have generated a demand for multicast protocols as an integral part of the network. This would allow such applications to support data dissemination to large groups of users in a scalable and reliable manner. Existing IP multicast protocols lack these features and also require state storage in the core of the network, which is costly to implement. In this paper, we present a new multicast protocol referred to as MENU. It realises a scalable and reliable multicast protocol model by pushing the tree-building complexity to the edges of the network, thereby eliminating processing and state storage in the core of the network. The MENU protocol builds multicast support in the network using mobile agent based active network services, Netlets, and unicast addresses. The multicast delivery tree in MENU is a two-level hierarchical structure where users are partitioned into client communities based on geographical proximity. Each client community in the network is treated as a single virtual destination for traffic from the server. Netlet-based services referred to as hot spot delegates (HSDs) are deployed by servers at "hot spots" close to each client community. They function as virtual traffic destinations for the traffic from the server and also act as virtual source nodes for all users in the community. The source node feeds data to these distributed HSDs, which in turn forward data to all downstream users through a locally constructed traffic delivery tree. It is shown through simulations that the resulting system provides an efficient means to incrementally build a source-customisable, secure multicast protocol which is both scalable and reliable. Furthermore, results show that MENU employs minimal processing and reduced state information in networks when compared to existing IP multicast protocols.
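
    A minimal sketch, not the MENU implementation, of the two-level delivery structure described above: the source unicasts to one hot spot delegate (HSD) per client community, and each HSD forwards to its local members. All names (Community, forward, multicast) are illustrative assumptions.

    from dataclasses import dataclass, field

    @dataclass
    class Community:
        hsd: str                      # node hosting the Netlet-based delegate
        members: list = field(default_factory=list)

        def forward(self, packet):
            # second level: the HSD acts as a local source for its community
            return [(self.hsd, member, packet) for member in self.members]

    def multicast(source, communities, packet):
        deliveries = []
        for c in communities:
            deliveries.append((source, c.hsd, packet))   # first level: unicast to HSD
            deliveries.extend(c.forward(packet))         # local delivery tree
        return deliveries

    groups = [Community("hsd-eu", ["u1", "u2"]), Community("hsd-us", ["u3"])]
    for hop in multicast("server", groups, "frame-0"):
        print(hop)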

    Diversity, Stability, Recursivity, and Rule Generation in Biological System: Intra-inter Dynamics Approach

    Basic problems in constructing a scenario for Life are discussed. To study these problems in terms of dynamical systems theory, a scheme of intra-inter dynamics is presented. It consists of the internal dynamics of a unit, interactions among the units, and dynamics that change the dynamics itself, for example through replication (and death) of units according to their internal states. Applying this scheme to cell differentiation, the isologous diversification theory is proposed. According to it, orbital instability first leads to diversified cell behaviors. At the next stage, several cell types are formed, first triggered by clustering of oscillations, and then as attracting states of internal dynamics stabilized by cell-to-cell interaction. At the third stage, the differentiation is determined as a recursive state by cell division. At the last stage, hierarchical differentiation proceeds, with the emergence of a stochastic rule for differentiation into sub-groups, where regulation of the probability of differentiation provides the diversity and stability of the cell society. Relevance of the theory to cell biology is discussed. (Comment: 19 pages, Int. J. Mod. Phys. B, in press)
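
    A hedged sketch of the intra-inter dynamics scheme described above: each unit has its own internal dynamics, and units interact through a global mean field. The logistic map and the parameter values are illustrative choices, not taken from the paper.

    import numpy as np

    def step(x, a=3.9, eps=0.1):
        """One update: internal dynamics f(x) = a*x*(1-x), then mean-field coupling."""
        fx = a * x * (1.0 - x)
        return (1.0 - eps) * fx + eps * fx.mean()

    rng = np.random.default_rng(0)
    units = rng.uniform(0.0, 1.0, size=8)   # internal states of 8 units
    for _ in range(100):
        units = step(units)
    print(np.round(units, 3))               # clustered states can emerge for some (a, eps)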

    Automated detection of brain abnormalities in neonatal hypoxia ischemic injury from MR images.

    We compared the efficacy of three automated brain injury detection methods, namely symmetry-integrated region growing (SIRG), hierarchical region splitting (HRS) and modified watershed segmentation (MWS), in human and animal magnetic resonance imaging (MRI) datasets for the detection of hypoxic ischemic injuries (HIIs). Diffusion weighted imaging (DWI, 1.5T) data from neonatal arterial ischemic stroke (AIS) patients, as well as T2-weighted imaging (T2WI, 11.7T, 4.7T) at seven different time-points (1, 4, 7, 10, 17, 24 and 31 days post HII) in a rat-pup model of hypoxic ischemic injury, were used to assess the temporal efficacy of our computational approaches. Sensitivity, specificity, and similarity were used as performance metrics, based on manual ('gold standard') injury detection, to quantify comparisons. When compared to the manual gold standard, automated injury location results from SIRG performed best in 62% of the data, compared with 29% for HRS and 9% for MWS. For injury severity detection, SIRG performed best in 67% of cases, compared with 33% for HRS. Prior information is required by HRS and MWS, but not by SIRG. However, SIRG is sensitive to parameter tuning, while HRS and MWS are not. Among these methods, SIRG performs best in detecting lesion volumes; HRS is the most robust, while MWS lags behind in both respects.
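
    A minimal sketch of the performance metrics named above (sensitivity, specificity, and similarity) computed between an automated binary lesion mask and the manual 'gold standard'. Interpreting "similarity" as the Dice coefficient is an assumption; the toy masks are illustrative.

    import numpy as np

    def metrics(auto_mask, gold_mask):
        auto, gold = auto_mask.astype(bool), gold_mask.astype(bool)
        tp = np.sum(auto & gold)    # injured voxels found by the method
        tn = np.sum(~auto & ~gold)  # healthy voxels correctly left out
        fp = np.sum(auto & ~gold)
        fn = np.sum(~auto & gold)
        sensitivity = tp / (tp + fn)
        specificity = tn / (tn + fp)
        similarity = 2 * tp / (2 * tp + fp + fn)   # Dice overlap (assumed)
        return sensitivity, specificity, similarity

    gold = np.zeros((8, 8), dtype=int); gold[2:5, 2:5] = 1
    auto = np.zeros((8, 8), dtype=int); auto[3:6, 2:5] = 1
    print(metrics(auto, gold))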

    The Ouroboros Model

    At the core of the Ouroboros Model lies a self-referential recursive process with alternating phases of data acquisition and evaluation. Memory entries are organized in schemata. Activation of part of a schema at a given time biases the whole structure and, in particular, its missing features, thus triggering expectations. An iterative, recursive monitor process termed 'consumption analysis' then checks how well such expectations fit with successive activations. A measure for the goodness of fit, "emotion", provides feedback as a (self-)monitoring signal. Contradictions between anticipations based on previous experience and actual current data are highlighted, as are minor gaps and deficits. The basic algorithm can be applied to goal-directed movements as well as to abstract rational reasoning when weighing evidence for and against some remote theories. A sketch is provided of how the Ouroboros Model can shed light on rather different characteristics of human behavior, including learning and meta-learning. Partial implementations proved effective in dedicated safety systems.
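
    A hedged sketch of the 'consumption analysis' loop described above: an activated schema raises expectations for its features, and a goodness-of-fit score over incoming observations serves as the monitoring ("emotion") signal. The names, the scoring rule, and the teapot example are illustrative assumptions.

    def consumption_analysis(schema, observations):
        expected = set(schema)               # activated schema biases missing slots
        matched, contradicted = set(), set()
        for feature in observations:
            if feature in expected:
                matched.add(feature)
            else:
                contradicted.add(feature)    # contradiction with expectations, highlighted
        fit = len(matched) / len(expected)   # goodness-of-fit feedback signal
        gaps = expected - matched            # minor gaps and deficits
        return fit, gaps, contradicted

    schema = {"handle", "spout", "lid"}      # hypothetical 'teapot' schema
    print(consumption_analysis(schema, ["handle", "spout", "crack"]))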