Terminological reasoning with constraint handling rules
Constraint handling rules (CHRs) are a flexible means to implement 'user-defined' constraints on top of existing host languages (such as Prolog and Lisp). Recently, M. Schmidt-Schauß and G. Smolka proposed a new methodology for constructing sound and complete inference algorithms for terminological knowledge representation formalisms in the tradition of KL-ONE. We propose CHRs as a flexible implementation language for the consistency test of assertions, which is the basis for all terminological reasoning services.
The implementation results in a natural combination of three layers: (i) a constraint layer that reasons in well-understood domains such as rationals or finite domains, (ii) a terminological layer providing a tailored, validated vocabulary, on which (iii) the application layer can rely. The flexibility of the approach will be illustrated by extending the formalism, its implementation and an application example (solving configuration problems) with attributes, a new quantifier and concrete domains.
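To hint at the rule style CHRs support, the classic two-rule CHR program for greatest common divisors (gcd(0) <=> true, and gcd(N) \ gcd(M) <=> M >= N | gcd(M mod N)) can be mimicked by a small fixed-point interpreter. This Python sketch is purely illustrative and is not the paper's implementation; real CHR executes inside a host language such as Prolog:

```python
def chr_gcd(store):
    """Run two CHR-style rules to a fixed point over a multiset of
    gcd/1 constraints (hypothetical Python analogue of CHR):
      gcd(0)           <=> true.                  # simplification
      gcd(N) \\ gcd(M) <=> M >= N | gcd(M mod N)  # simpagation
    """
    store = list(store)
    changed = True
    while changed:
        changed = False
        # rule 1: a gcd(0) constraint is simplified away
        if 0 in store and len(store) > 1:
            store.remove(0)
            changed = True
            continue
        # rule 2: the larger of two constraints is replaced by M mod N
        store.sort()
        for i in range(len(store) - 1):
            n, m = store[i], store[i + 1]
            if n > 0:  # guard M >= N holds after sorting
                store[i + 1] = m % n
                changed = True
                break
    return store

# e.g. the store {gcd(12), gcd(18), gcd(30)} reduces to {gcd(6)}
result = chr_gcd([12, 18, 30])
```

The interpreter repeatedly fires whichever rule matches until the constraint store is stable, which is exactly the committed-choice, fixed-point execution model of CHR.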
CHR Grammars
A grammar formalism based upon CHR is proposed analogously to the way
Definite Clause Grammars are defined and implemented on top of Prolog. These
grammars execute as robust bottom-up parsers with an inherent treatment of
ambiguity and a high flexibility to model various linguistic phenomena. The
formalism extends previous logic programming based grammars with a form of
context-sensitive rules and the possibility to include extra-grammatical
hypotheses in both head and body of grammar rules. Among the applications are
straightforward implementations of Assumption Grammars and abduction under
integrity constraints for language analysis. CHR grammars appear as a powerful
tool for specification and implementation of language processors and may be
proposed as a new standard for bottom-up grammars in logic programming.
To appear in Theory and Practice of Logic Programming (TPLP), 2005. 36 pp.
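The bottom-up execution model can be hinted at with a toy sketch: tokens become span-annotated items, and grammar rules propagate new nonterminal items over adjacent spans until a fixed point, keeping every derivation and thereby treating ambiguity inherently. This Python analogue is illustrative only; actual CHR grammars compile to CHR rules in the host Prolog system:

```python
def chrg_parse(tokens, rules):
    """Bottom-up, CHR-grammar-style parsing sketch (hypothetical
    Python analogue). Each item is (symbol, start, end); a binary
    rule (lhs, (a, b)) fires whenever items for a and b cover
    adjacent spans, propagating a new lhs item. All derivable
    items are kept, so ambiguous inputs yield all analyses."""
    items = {(sym, i, i + 1) for i, sym in enumerate(tokens)}
    changed = True
    while changed:
        changed = False
        for lhs, (a, b) in rules:
            for (s1, i, j) in list(items):
                if s1 != a:
                    continue
                for (s2, j2, k) in list(items):
                    if s2 == b and j2 == j and (lhs, i, k) not in items:
                        items.add((lhs, i, k))
                        changed = True
    return items

# toy grammar: S -> NP VP, NP -> Det N, VP -> V NP
rules = [("S", ("NP", "VP")), ("NP", ("Det", "N")), ("VP", ("V", "NP"))]
items = chrg_parse(["Det", "N", "V", "Det", "N"], rules)
# the sentence parses iff an S item spanning positions 0..5 is derived
```

The extra-grammatical hypotheses mentioned in the abstract would simply be further constraints sitting in the same store alongside these items.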
Intercomparison of PERSIANN-CDR and TRMM-3B42V7 precipitation estimates at monthly and daily time scales
In the first part of this paper, monthly precipitation data from Precipitation Estimation from Remotely Sensed Information using Artificial Neural Networks–Climate Data Record (PERSIANN-CDR) and Tropical Rainfall Measuring Mission 3B42 algorithm Version 7 (TRMM-3B42V7) are evaluated over Iran using the Generalized Three-Cornered Hat (GTCH) method, which requires no reference data as input. The Climatic Research Unit (CRU) dataset is added to the GTCH evaluations as an independent gauge-based dataset; thus the minimum requirement of three datasets for the method is satisfied. To ensure consistency of all datasets, the two satellite products were aggregated to 0.5° spatial resolution, the coarsest resolution among them, matching CRU. The results show that PERSIANN-CDR has a higher Signal-to-Noise Ratio (SNR) than TRMM-3B42V7 for monthly rainfall estimation, especially in the northern half of the country. All datasets showed low SNR in the mountainous area of southwestern Iran, as well as in the arid parts of the southeastern region of the country. Additionally, to evaluate the efficacy of PERSIANN-CDR and TRMM-3B42V7 in capturing extreme daily-precipitation amounts, an in-situ rain-gauge dataset collected by the Islamic Republic of Iran Meteorological Organization (IRIMO) was employed. Given the sparsity of the rain gauges, only 0.25° pixels containing three or more gauges were used for this evaluation; there were 228 such pixels where daily and extreme rainfall from PERSIANN-CDR and TRMM-3B42V7 could be compared. TRMM-3B42V7 overestimates most of the intensity indices (correlation coefficient R between 0.7648 and 0.8311; Root Mean Square Error, RMSE, between 3.29 mm/day and 21.2 mm/5day), whereas PERSIANN-CDR underestimates these extremes (R between 0.6349 and 0.7791; RMSE between 3.59 mm/day and 30.56 mm/5day). Both satellite products show higher correlation coefficients and lower RMSEs for the annual mean of consecutive dry spells than for wet spells.
The results show that TRMM-3B42V7 captures the annual mean of the absolute indices (the number of wet days with daily precipitation >10 mm and >20 mm) better than PERSIANN-CDR. The daily evaluations show that the similarity between the Empirical Cumulative Distribution Functions (ECDFs) of satellite-product and IRIMO gauge daily precipitation, as well as of dry spells with different thresholds, is significant in selected pixels (those including at least five gauges). The results also indicate that the ECDF agreement becomes more significant as the threshold increases. In terms of regional analyses, the higher SNR of the products in the monthly (GTCH-based) and daily evaluations (significant ECDFs) is mostly consistent.
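The classical three-cornered hat estimate underlying GTCH can be sketched briefly: with three datasets sharing a common signal but carrying independent errors, each dataset's noise variance follows from the variances of the pairwise differences alone, with no reference data. The demo values below are hypothetical, and the paper's generalized (GTCH) method additionally handles correlated errors:

```python
import numpy as np

def three_cornered_hat(x1, x2, x3):
    """Estimate each series' noise variance from pairwise difference
    variances, assuming mutually independent errors (classical
    three-cornered hat; illustrative sketch, not the paper's GTCH code).
    Since var(xi - xj) = si + sj, the three equations solve to:
      si = (v_ij + v_ik - v_jk) / 2
    """
    v12 = np.var(x1 - x2)
    v13 = np.var(x1 - x3)
    v23 = np.var(x2 - x3)
    s1 = (v12 + v13 - v23) / 2.0
    s2 = (v12 + v23 - v13) / 2.0
    s3 = (v13 + v23 - v12) / 2.0
    return s1, s2, s3

# hypothetical demo: one common signal observed with three noise levels
rng = np.random.default_rng(0)
truth = rng.normal(size=10000)
a = truth + rng.normal(scale=0.5, size=10000)  # noise variance 0.25
b = truth + rng.normal(scale=1.0, size=10000)  # noise variance 1.00
c = truth + rng.normal(scale=0.2, size=10000)  # noise variance 0.04
est = three_cornered_hat(a, b, c)
```

An SNR as reported in the paper then follows as the ratio of signal variance to each estimated noise variance.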
Non Equilibrium Physics of Single-Cell Genomics
The self-organisation of cells into complex tissues relies on the tight regulation of molecular processes governing their behaviour. Understanding these processes is a central question in cell biology. In recent years, technological breakthroughs in single-cell sequencing experiments have enabled us to probe these processes with unprecedented molecular detail. However, biological function relies on collective processes at the mesoscopic and macroscopic scales, which do not necessarily obey the rules that govern behaviour at the microscopic scale. Insights from these experiments into how collective processes determine cellular behaviour consequently remain severely limited. Methods from nonequilibrium statistical physics provide a rigorous framework to connect microscopic measurements to their mesoscopic or macroscopic consequences.
In this thesis, by combining for the first time the possibilities of single-cell technologies and tools from nonequilibrium statistical physics, we develop theoretical frameworks that overcome these conceptual limitations. In particular, we derive a theory that maps measurements along the linear sequence of the DNA to mesoscopic processes in space and time in the cell nucleus. We demonstrate this approach in the context of the establishment of chemical modifications of the DNA (DNA methylation) during early embryonic development. Drawing on sequencing experiments both in vitro and in vivo, we find that the embryonic DNA methylome is established through the interplay between DNA methylation and 30-40 nm dynamic chromatin condensates. This interplay gives rise to hallmark scaling behaviour with an exponent of 5/2 in the time evolution of embryonic DNA methylation and to time-dependent, scale-free connected correlation functions, both of which are predicted by our theory. Using this theory, we successfully identify regions of the DNA that carry DNA methylation patterns anticipating cellular symmetry breaking in vivo.
The primary layer determining cell identity is gene expression. However, read-outs of gene-expression profiling experiments are dominated by systematic technical noise and do not provide "stoichiometric" measurements that allow experimental data to be predicted by theories. Here, by developing effective spin glass methods, we show that the macroscopic propagation of fluctuations in the concentration of mRNA molecules gives direct information on the physical mechanisms governing cell states, independent of technical bias. We find that gene expression fluctuations may exhibit glassy behaviour, such that they are long-lived and carry biological information. We demonstrate the biological relevance of glassy fluctuations by analysing single-cell RNA sequencing experiments of mouse neurogenesis.
Taken together, we overcome important conceptual limitations of emerging technologies in biology and pioneer the application of methods from stochastic processes, spin glasses, and field and renormalization group theories to single-cell genomics.
Frequency modulated pulse for ultrasonic B-scan imaging in attenuating medium
A rigorous study of a new technique for ultrasonic B-scan imaging was performed. This technique made use of a Frequency Modulated (FM) pulse as opposed to the conventional short pulse for imaging. The simulation studies offered sufficient support for this method, which was then implemented in the laboratory. Experiments were performed on phantoms which mimicked the attenuating medium. Owing to the more flexible nature of the FM pulse, changes in the point spread function were studied as a function of bandwidth and depth of the scatterer. The backscattered signal was digitized, post-processed, and then displayed as a gray-scale B-scan image. The beam profile and the propagation of the pulse in the attenuating medium were carefully studied. Applications to tissue characterization were explored through simulation studies.
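The core idea behind FM-pulse imaging, compressing a long chirp into a narrow peak by matched filtering so that a low-peak-power transmission still yields short-pulse resolution, can be sketched as follows. All parameters here are illustrative and not taken from the paper:

```python
import numpy as np

# Matched-filter pulse compression of a linear FM (chirp) pulse:
# a minimal sketch with hypothetical parameters.
fs = 50e6          # sampling rate, Hz
T = 10e-6          # pulse duration, s (long compared to a short pulse)
f0, f1 = 2e6, 8e6  # chirp start/stop frequencies, Hz
t = np.arange(0, T, 1 / fs)
k = (f1 - f0) / T  # chirp rate, Hz/s
pulse = np.sin(2 * np.pi * (f0 * t + 0.5 * k * t**2))

# simulated echo from a single point scatterer: a delayed,
# attenuated copy of the transmitted chirp
delay = 400        # samples
echo = np.zeros(4096)
echo[delay:delay + pulse.size] = 0.1 * pulse

# correlating with the transmitted chirp compresses the energy of
# the long pulse into a narrow peak at the scatterer's delay
compressed = np.correlate(echo, pulse, mode="valid")
peak = int(np.argmax(np.abs(compressed)))
```

In a B-scan, each digitized A-line would be compressed this way before envelope detection and gray-scale display; frequency-dependent attenuation in tissue distorts the received chirp, which is why the paper studies the point spread function versus bandwidth and depth.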