    A Data-Oriented Approach to Semantic Interpretation

    In Data-Oriented Parsing (DOP), an annotated language corpus is used as a stochastic grammar. The most probable analysis of a new input sentence is constructed by combining sub-analyses from the corpus in the most probable way. This approach has been successfully used for syntactic analysis, using corpora with syntactic annotations such as the Penn Treebank. If a corpus of semantically annotated sentences is used, the same approach can also generate the most probable semantic interpretation of an input sentence. The present paper explains this semantic interpretation method and summarizes the results of a preliminary experiment. Semantic annotations were added to the syntactic annotations of most of the sentences of the ATIS corpus. A data-oriented semantic interpretation algorithm was successfully tested on this semantically enriched corpus. Comment: 10 pages, PostScript; to appear in Proceedings of the Workshop on Corpus-Oriented Semantic Analysis, ECAI-96, Budapest
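
    As a concrete illustration of the model described above, here is a minimal sketch of DOP-style relative-frequency estimation. All names are hypothetical and this is not the authors' implementation; in full DOP the probability of an analysis is moreover the sum over all derivations yielding it, whereas this sketch scores a single derivation.

```python
# Minimal sketch of the DOP probability model (hypothetical names).
# A derivation combines corpus fragments; its probability is the product
# of each fragment's relative frequency among same-rooted fragments.
from collections import Counter

fragment_counts = Counter()  # serialized fragment -> corpus count
root_totals = Counter()      # root label -> count of fragments with that root

def add_fragment(fragment: str, root_label: str) -> None:
    """Register one corpus fragment, e.g. a subtree serialized as a string."""
    fragment_counts[fragment] += 1
    root_totals[root_label] += 1

def fragment_probability(fragment: str, root_label: str) -> float:
    """Relative frequency of a fragment among fragments with the same root."""
    return fragment_counts[fragment] / root_totals[root_label]

def derivation_probability(fragments_with_roots) -> float:
    """Probability of one derivation: the product over its fragments."""
    p = 1.0
    for fragment, root in fragments_with_roots:
        p *= fragment_probability(fragment, root)
    return p
```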

    Membrane formation by immersion precipitation : the role of a polymeric additive

    In this thesis the immersion precipitation process is studied for systems in which two polymers are present. In its basic form, immersion precipitation is carried out by immersing a thin film of a concentrated polymer solution into a bath of nonsolvent. Through the exchange of solvent (out of the polymer film) and nonsolvent (in from the coagulation bath), the polymer solution becomes unstable. Liquid-liquid phase separation results in a polymer-lean phase and a polymer-rich phase. The polymer-lean phase forms pores inside a matrix created by the polymer-rich phase, which forms the membrane. The objective of this thesis is to investigate the effects of adding a second polymer to the polymer solution. The use of a second polymer (polymeric additive) that is miscible with the nonsolvent can result in more open, porous (co-continuous) structures and a better-defined porosity

    Data-Oriented Language Processing. An Overview

    During the last few years, a new approach to language processing has started to emerge, which has become known under various labels such as "data-oriented parsing", "corpus-based interpretation", and "tree-bank grammar" (cf. van den Berg et al. 1994; Bod 1992-96; Bod et al. 1996a/b; Bonnema 1996; Charniak 1996a/b; Goodman 1996; Kaplan 1996; Rajman 1995a/b; Scha 1990-92; Sekine & Grishman 1995; Sima'an et al. 1994; Sima'an 1995-96; Tugwell 1995). This approach, which we will call "data-oriented processing" or "DOP", embodies the assumption that human language perception and production work with representations of concrete past language experiences, rather than with abstract linguistic rules. The models that instantiate this approach therefore maintain large corpora of linguistic representations of previously occurring utterances. When processing a new input utterance, analyses of this utterance are constructed by combining fragments from the corpus; the occurrence frequencies of the fragments are used to estimate which analysis is the most probable one. In this paper we give an in-depth discussion of a data-oriented processing model which employs a corpus of labelled phrase-structure trees. We then review some other models that instantiate the DOP approach. Many of these models also employ labelled phrase-structure trees, but use different criteria for extracting fragments from the corpus or employ different disambiguation strategies (Bod 1996b; Charniak 1996a/b; Goodman 1996; Rajman 1995a/b; Sekine & Grishman 1995; Sima'an 1995-96); other models use richer formalisms for their corpus annotations (van den Berg et al. 1994; Bod et al. 1996a/b; Bonnema 1996; Kaplan 1996; Tugwell 1995). Comment: 34 pages, PostScript
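
    To see why the criteria for extracting fragments matter in practice, consider how quickly the fragment space grows. The following Goodman-style count of fragments rooted at each tree node is a hypothetical illustration, not any of the cited implementations: each child of a node can either be cut off (left as a frontier nonterminal) or expanded by any of its own fragments.

```python
# Sketch of fragment counting for a labelled phrase-structure tree.
# The number of fragments rooted at node n with children c_1..c_k is
#   a(n) = prod_i (1 + a(c_i)),
# since each child is either cut off or expanded by one of its fragments.

class Node:
    def __init__(self, label, children=()):
        self.label = label
        self.children = list(children)

def fragments_rooted_at(node: "Node") -> int:
    if not node.children:        # terminal word: never a cut point
        return 0
    count = 1
    for child in node.children:
        count *= 1 + fragments_rooted_at(child)
    return count

# Example: S -> NP VP over "she left" yields 4 fragments rooted at S.
tree = Node("S", [Node("NP", [Node("she")]), Node("VP", [Node("left")])])
print(fragments_rooted_at(tree))  # 4
```

    This multiplicative growth is what makes depth bounds, lexicalization requirements, and similar extraction criteria practically important.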

    A Delta Debugger for ILP Query Execution

    Because query execution is the most crucial part of Inductive Logic Programming (ILP) algorithms, a lot of effort is invested in developing faster execution mechanisms. These execution mechanisms typically have a low-level implementation, making them hard to debug. Moreover, other factors such as the complexity of the problems handled by ILP algorithms and the size of the code base of ILP data mining systems make debugging at this level a very difficult job. In this work, we present the trace-based debugging approach currently used in the development of new execution mechanisms in hipP, the engine underlying the ACE Data Mining system. This debugger uses the delta debugging algorithm to automatically reduce the total time needed to expose bugs in ILP execution, thus making the manual debugging step much lighter. Comment: Paper presented at the 16th Workshop on Logic-based Methods in Programming Environments (WLPE2006)
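
    For readers unfamiliar with the underlying algorithm, here is a simplified, complement-only sketch of Zeller-style delta debugging (hypothetical names; the trace-based debugger in hipP is more elaborate). It repeatedly removes chunks of a failing input as long as the failure persists, ending at a small failure-inducing subset.

```python
def ddmin(items, fails):
    """Reduce `items` to a smaller list on which `fails` still returns True."""
    n = 2                                   # current number of chunks
    while len(items) >= 2:
        chunk_size = len(items) // n
        reduced = False
        for i in range(0, len(items), chunk_size):
            complement = items[:i] + items[i + chunk_size:]
            if fails(complement):           # complement still triggers the bug
                items = complement          # keep the smaller failing input
                n = max(n - 1, 2)
                reduced = True
                break
        if not reduced:
            if n >= len(items):             # already at finest granularity
                break
            n = min(n * 2, len(items))      # try finer granularity
    return items

# Example: the "bug" needs items 3 and 6 to be present together.
# ddmin(list(range(8)), lambda xs: 3 in xs and 6 in xs)  ->  [3, 6]
```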

    Thermoforming of foam sheet

    Thermoforming is a widely used process for the manufacture of foam sheet products. Polystyrene foam food trays, for instance, can be produced by first heating the thermoplastic foam sheet, causing the contained gas to build up pressure and expand, after which a vacuum pressure is applied to draw the sheet into the required shape on the mould. This production method proves very sensitive to, for example, the sheet temperature, the applied pressures and the cooling time. More problems can be foreseen when, for environmental reasons, the blowing agent is adapted (for instance, replaced by a gas with a lower molecular weight). To gain more insight into the occurring phenomena, the large deformations of a foam structure have been analysed using finite element modelling. To this end a constitutive model has to be defined. Starting from the basic theory given by Gibson & Ashby [1], the behaviour of a closed cubic cell has been elaborated for large strains. The total stiffness is then the sum of the contributions of the edges and faces of the cell and the gas contained in it. The large deformations cause anisotropy of the cells [2], which influences their tangential stiffness. The constitutive model developed here includes the effects of internal gas pressure and the evolving anisotropy.
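
    For orientation, a standard small-strain closed-cell relation from Gibson & Ashby separates exactly the three contributions named above: edge bending, face stretching, and the enclosed gas. The thesis generalizes this to large strains and evolving anisotropy; the relation below is only the familiar starting point.

```latex
% Small-strain closed-cell stiffness after Gibson & Ashby:
% edge-bending, face-stretching, and gas-pressure contributions.
\frac{E^{*}}{E_{s}} \;\approx\;
  \phi^{2}\left(\frac{\rho^{*}}{\rho_{s}}\right)^{2}
  \;+\; (1-\phi)\,\frac{\rho^{*}}{\rho_{s}}
  \;+\; \frac{p_{0}\,(1-2\nu^{*})}{E_{s}\,\bigl(1-\rho^{*}/\rho_{s}\bigr)}
```

    Here \phi is the fraction of solid in the cell edges, \rho^{*}/\rho_{s} the relative density of the foam, p_{0} the initial gas pressure, and \nu^{*} the Poisson ratio of the foam.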

    Galileons as the Scalar Analogue of General Relativity

    We establish a correspondence between general relativity with diffeomorphism invariance and scalar field theories with Galilean invariance: notions such as the Levi-Civita connection and the Riemann tensor have Galilean counterparts. This suggests Galilean theories as the unique nontrivial alternative to gauge theories (including general relativity). Moreover, it is shown that the requirement of a first-order Palatini formalism uniquely determines the Galileon models with second-order field equations, similar to the Lovelock gravity theories. Possible extensions are discussed. Comment: 6 pages, v2: version appeared in Phys. Rev.
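
    For context, the Galilean invariance in question is the shift symmetry below; the cubic Galileon is the simplest nontrivial interaction invariant under it (up to a total derivative) while retaining second-order field equations. This is standard background rather than the paper's construction.

```latex
% Galilean shift symmetry of the scalar:
\pi(x) \;\to\; \pi(x) + c + b_{\mu} x^{\mu}
% The cubic Galileon, invariant up to a total derivative,
% with strictly second-order field equations:
\mathcal{L}_{3} \;=\; \partial_{\mu}\pi\,\partial^{\mu}\pi\;\Box\pi
```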

    Higher Derivative Field Theories: Degeneracy Conditions and Classes

    We provide a full analysis of ghost-free higher derivative field theories with coupled degrees of freedom. Assuming the absence of gauge symmetries, we derive the degeneracy conditions required to evade the Ostrogradsky ghosts, and analyze which (non)trivial classes of solutions this allows for. It is shown explicitly how Lorentz invariance avoids the propagation of "half" degrees of freedom. Moreover, for a large class of theories, we construct the field redefinitions and/or (extended) contact transformations that put the theory in a manifestly first-order form. Finally, we identify which class of theories cannot be brought to first-order form by such transformations. Comment: 26 pages, 1 figure. v2: minor changes, references added, matches version published in JHEP
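
    To recall where the Ostrogradsky ghost comes from, consider the textbook single-variable case (standard background, not the paper's coupled multi-field analysis):

```latex
% For L = L(q, \dot q, \ddot q) with \partial^{2}L/\partial\ddot q^{2} \neq 0,
% Ostrogradsky's canonical variables are
Q_{1} = q, \qquad Q_{2} = \dot q, \qquad
P_{1} = \frac{\partial L}{\partial \dot q}
        - \frac{d}{dt}\frac{\partial L}{\partial \ddot q}, \qquad
P_{2} = \frac{\partial L}{\partial \ddot q}.
% Solving \ddot q = f(Q_{1}, Q_{2}, P_{2}) gives the Hamiltonian
H = P_{1} Q_{2} + P_{2}\, f(Q_{1}, Q_{2}, P_{2})
    - L\bigl(Q_{1}, Q_{2}, f(Q_{1}, Q_{2}, P_{2})\bigr),
% which is linear in P_{1}, hence unbounded below: the Ostrogradsky ghost.
% Evading it requires degeneracy, \partial^{2}L/\partial\ddot q^{2} = 0
% (a degenerate kinetic Hessian in the coupled multi-field case).
```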

    Sketching is more than making correct drawings

    Sketching in the context of a design process is not a goal in itself, but can be considered a tool for making better designs. Sketching as a design tool has several useful effects: ordering your thoughts, better understanding of difficult shapes, functioning as a communication tool, and providing an iterative way of developing shapes. In our bachelor curriculum Industrial Design Engineering we developed a series of courses that addresses these effects in particular. The courses are Sketching and Concept Drawing (SCT), Product Presentation Drawing (PPT) and Applied Sketching Skills (TTV). This line of courses is built on three pillars:
    - Learning to sketch: theory, speed, and control of the materials.
    - Learning from sketching: developing better insight into complex 3D shapes (Figure 1).
    - Sketching as a design tool: communication, ordering your thoughts, working iteratively.
    As a result we see that students who have finished the courses instinctively start sketching in an iterative manner, use sketching as a source of inspiration, and learn that the whole process of iterative sketching helps in structuring, developing and communicating the design process. In this way the students become better sketchers and better designers