
    Terminological reasoning with constraint handling rules

    Constraint handling rules (CHRs) are a flexible means to implement 'user-defined' constraints on top of existing host languages (like Prolog and Lisp). Recently, M. Schmidt-Schauß and G. Smolka proposed a new methodology for constructing sound and complete inference algorithms for terminological knowledge representation formalisms in the tradition of KL-ONE. We propose CHRs as a flexible implementation language for the consistency test of assertions, which is the basis for all terminological reasoning services. The implementation results in a natural combination of three layers: (i) a constraint layer that reasons in well-understood domains such as rationals or finite domains, (ii) a terminological layer providing a tailored, validated vocabulary on which (iii) the application layer can rely. The flexibility of the approach is illustrated by extending the formalism, its implementation, and an application example (solving configuration problems) with attributes, a new quantifier, and concrete domains.
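    The rule-based flavour of CHR can be conveyed with the classic less-or-equal example. The sketch below is in Python rather than a CHR host language, with our own naming, and is only an illustration of the execution model, not the paper's implementation: a propagation rule (transitivity) adds constraints to the store, and a simplification rule (antisymmetry) replaces a pair of constraints with an equality.

    ```python
    # Illustrative CHR-style rules over a store of leq(X, Y) constraints:
    #   transitivity:  leq(X,Y), leq(Y,Z) ==> leq(X,Z)       (propagation)
    #   antisymmetry:  leq(X,Y), leq(Y,X) <=> X = Y           (simplification)

    def solve_leq(store):
        """Apply the rules to fixpoint; return (remaining store, equalities)."""
        store = set(store)
        equalities = set()
        changed = True
        while changed:
            changed = False
            # Propagation: transitivity adds leq(X,Z) without removing anything.
            for (x, y) in list(store):
                for (y2, z) in list(store):
                    if y == y2 and x != z and (x, z) not in store:
                        store.add((x, z))
                        changed = True
            # Simplification: antisymmetry removes both constraints
            # and records the derived equality X = Y.
            for (x, y) in list(store):
                if x != y and (y, x) in store:
                    store.discard((x, y))
                    store.discard((y, x))
                    equalities.add(frozenset((x, y)))
                    changed = True
        return store, equalities

    # A cyclic chain a <= b <= c <= a collapses into equalities.
    store, eqs = solve_leq({("a", "b"), ("b", "c"), ("c", "a")})
    ```

    Here the cycle forces all three variables to be equal, so the constraint store empties and three equalities remain, mirroring how a CHR solver simplifies its store.
    
    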

    CHR Grammars

    A grammar formalism based upon CHR is proposed, analogously to the way Definite Clause Grammars are defined and implemented on top of Prolog. These grammars execute as robust bottom-up parsers with an inherent treatment of ambiguity and high flexibility to model various linguistic phenomena. The formalism extends previous logic-programming-based grammars with a form of context-sensitive rules and the possibility to include extra-grammatical hypotheses in both head and body of grammar rules. Among the applications are straightforward implementations of Assumption Grammars and abduction under integrity constraints for language analysis. CHR grammars appear as a powerful tool for the specification and implementation of language processors and may be proposed as a new standard for bottom-up grammars in logic programming. To appear in Theory and Practice of Logic Programming (TPLP), 2005. (36 pp.)
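    The bottom-up execution model can be illustrated with a toy recognizer. The following Python sketch is our own simplification, not the CHRG syntax, and restricts rules to one or two right-hand-side symbols: tokens become position-indexed constraints, and grammar rules combine adjacent spans until a fixpoint is reached.

    ```python
    # Bottom-up recognition in the spirit of CHR grammars: each fact is a
    # span (category, from, to); rules fire on adjacent spans until fixpoint.

    def recognize(tokens, rules, start="s"):
        """Return True if `start` can be derived over the whole input."""
        # Each token becomes a position-indexed constraint.
        facts = {(tok, i, i + 1) for i, tok in enumerate(tokens)}
        changed = True
        while changed:
            changed = False
            new = set()
            for lhs, rhs in rules:
                if len(rhs) == 1:
                    # Unary rule: relabel a span.
                    new |= {(lhs, i, j) for (c, i, j) in facts if c == rhs[0]}
                else:
                    # Binary rule: combine two adjacent spans.
                    new |= {(lhs, i, k)
                            for (c1, i, j) in facts if c1 == rhs[0]
                            for (c2, j2, k) in facts if c2 == rhs[1] and j2 == j}
            fresh = new - facts
            if fresh:
                facts |= fresh
                changed = True
        # Recognized if the start symbol spans the entire sentence.
        return (start, 0, len(tokens)) in facts

    # A toy grammar (hypothetical example, not from the paper).
    rules = [("det", ["the"]), ("noun", ["cat"]), ("noun", ["mat"]),
             ("verb", ["saw"]),
             ("np", ["det", "noun"]), ("vp", ["verb", "np"]),
             ("s", ["np", "vp"])]
    ```

    With these rules, `recognize(["the", "cat", "saw", "the", "mat"], rules)` succeeds while an ungrammatical token sequence does not; real CHR grammars additionally offer context-sensitive rules and extra-grammatical hypotheses, which this sketch omits.
    
    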

    Non Equilibrium Physics of Single-Cell Genomics

    The self-organisation of cells into complex tissues relies on the tight regulation of molecular processes governing their behaviour. Understanding these processes is a central question in cell biology. In recent years, technological breakthroughs in single-cell sequencing experiments have enabled us to probe these processes with unprecedented molecular detail. However, biological function relies on collective processes on the mesoscopic and macroscopic scale, which do not necessarily obey the rules that govern behaviour on the microscopic scale. Insights from these experiments into how collective processes determine cellular behaviour consequently remain severely limited. Methods from nonequilibrium statistical physics provide a rigorous framework to connect microscopic measurements to their mesoscopic or macroscopic consequences. In this thesis, by combining for the first time the possibilities of single-cell technologies and tools from nonequilibrium statistical physics, we develop theoretical frameworks that overcome these conceptual limitations. In particular, we derive a theory that maps measurements along the linear sequence of the DNA to mesoscopic processes in space and time in the cell nucleus. We demonstrate this approach in the context of the establishment of chemical modifications of the DNA (DNA methylation) during early embryonic development. Drawing on sequencing experiments both in vitro and in vivo, we find that the embryonic DNA methylome is established through the interplay between DNA methylation and 30-40 nm dynamic chromatin condensates. This interplay gives rise to hallmark scaling behaviour with an exponent of 5/2 in the time evolution of embryonic DNA methylation and time-dependent, scale-free connected correlation functions, both of which are predicted by our theory. Using this theory, we successfully identify regions of the DNA that carry DNA methylation patterns anticipating cellular symmetry breaking in vivo.
The primary layer determining cell identity is gene expression. However, read-outs of gene-expression profiling experiments are dominated by systematic technical noise, and they do not provide "stoichiometric" measurements that would allow experimental data to be predicted by theories. Here, by developing effective spin glass methods, we show that the macroscopic propagation of fluctuations in the concentration of mRNA molecules gives direct information on the physical mechanisms governing cell states, independent of technical bias. We find that gene expression fluctuations may exhibit glassy behaviour such that they are long-lived and carry biological information. We demonstrate the biological relevance of glassy fluctuations by analysing single-cell RNA sequencing experiments of mouse neurogenesis. Taken together, we overcome important conceptual limitations of emerging technologies in biology and pioneer the application of methods from stochastic processes, spin glasses, field theory and renormalization group theory to single-cell genomics.

    Frequency modulated pulse for ultrasonic B-scan imaging in attenuating medium

    A rigorous study of a new technique for ultrasonic B-scan imaging was performed. This technique made use of a frequency modulated (FM) pulse, as opposed to the conventional short pulse, for imaging. The simulation studies offered sufficient support for this method, which was then implemented in the laboratory. Experiments were performed on phantoms that mimicked the attenuating medium. Due to the more flexible nature of this FM pulse, changes in the point spread function were studied as a function of bandwidth and depth of the scatterer. The backscattered signal was digitized, post-processed, and then displayed as a gray-scale B-scan image. The beam profile and the propagation of the pulse in the attenuating medium were carefully studied. Applications for tissue characterization were explored through simulation studies.
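    The principle behind FM-pulse imaging, pulse compression by matched filtering, can be sketched in a few lines. The pure-Python demo below uses arbitrary parameters chosen for illustration, not values from the study: it cross-correlates a noiseless echo with the transmitted linear chirp and recovers a sharp peak at the scatterer's delay.

    ```python
    import math

    FS = 100_000.0               # sample rate in Hz (arbitrary demo value)
    T = 0.01                     # pulse duration in s
    F0, F1 = 5_000.0, 20_000.0   # chirp start/stop frequencies in Hz

    def linear_chirp():
        """Samples of a linear FM pulse sweeping F0 -> F1 over T seconds."""
        n = int(T * FS)
        k = (F1 - F0) / T        # sweep rate in Hz/s
        return [math.cos(2.0 * math.pi * (F0 * t + 0.5 * k * t * t))
                for t in (i / FS for i in range(n))]

    def matched_filter(echo, pulse):
        """Cross-correlate the echo with the transmitted pulse, one value per lag."""
        return [sum(p * echo[lag + i] for i, p in enumerate(pulse))
                for lag in range(len(echo) - len(pulse) + 1)]

    pulse = linear_chirp()
    delay = 300                                  # scatterer delay in samples
    echo = [0.0] * delay + pulse + [0.0] * 200   # one noiseless echo
    corr = matched_filter(echo, pulse)
    peak_lag = max(range(len(corr)), key=lambda i: corr[i])
    # Matched filtering compresses the long pulse: corr peaks sharply at `delay`.
    ```

    The long, low-amplitude chirp thus trades transmit peak power for bandwidth, and the matched filter restores range resolution, which is the motivation for replacing the conventional short pulse in an attenuating medium.
    
    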