
    Acquiring Correct Knowledge for Natural Language Generation

    Natural language generation (NLG) systems are computer software systems that produce texts in English and other human languages, often from non-linguistic input data. NLG systems, like most AI systems, need substantial amounts of knowledge. However, our experience in two NLG projects suggests that it is difficult to acquire correct knowledge for NLG systems; indeed, every knowledge acquisition (KA) technique we tried had significant problems. In general terms, these problems were due to the complexity, novelty, and poorly understood nature of the tasks our systems attempted, and were worsened by the fact that people write so differently. This meant in particular that corpus-based KA approaches suffered because it was impossible to assemble a sizable corpus of high-quality, consistent, manually written texts in our domains; and structured expert-oriented KA techniques suffered because experts disagreed and because we could not get enough information about special and unusual cases to build robust systems. We believe that such problems are likely to affect many other NLG systems as well. In the long term, we hope that new KA techniques may emerge to help NLG system builders. In the shorter term, we believe that understanding how individual KA techniques can fail, and using a mixture of different KA techniques with different strengths and weaknesses, can help developers acquire NLG knowledge that is mostly correct.

    Soil sustainability in organic agricultural production

    Traditionally, the assessment of soil sustainability and the potential impact of cultivation are based upon the application of chemical procedures. In the absence of a biological context, these measurements offer little in understanding long-term changes in soil husbandry. Detailed microcosm investigations were applied as a predictive tool for management change. The microcosms were designed with homogenised soils treated with organic amendments. Key soil functional relationships were quantified using stable isotope techniques, biochemical measurements and traditional approaches.

    The Mandelstam-Leibbrandt Prescription in Light-Cone Quantized Gauge Theories

    Quantization of gauge theories on characteristic surfaces and in the light-cone gauge is discussed. Implementation of the Mandelstam-Leibbrandt prescription for the spurious singularity is shown to require two distinct null planes, with independent degrees of freedom initialized on each. The relation of this theory to the usual light-cone formulation of gauge field theory, using a single null plane, is described. A connection is established between this formalism and a recently given operator solution to the Schwinger model in the light-cone gauge.

    Comment: RevTeX, 14 pages. One PostScript figure (requires psfig). A brief discussion of necessary restrictions on the light-cone current operators has been added, and two references. Final version to appear in Z. Phys.
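
    For reference, the Mandelstam-Leibbrandt prescription regularizes the spurious light-cone-gauge pole by displacing it off the real axis using the conjugate momentum component; a standard statement of the prescription (the general form, not this paper's two-null-plane construction) is:

    ```latex
    % Mandelstam-Leibbrandt treatment of the spurious 1/k^+ pole in the
    % light-cone gauge: the pole is shifted using the sign of k^-.
    \[
      \frac{1}{k^{+}} \;\longrightarrow\;
      \frac{k^{-}}{k^{+}k^{-} + i\epsilon}
      \;=\; \frac{1}{k^{+} + i\epsilon\,\operatorname{sgn}(k^{-})}
    \]
    ```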

    On the wake of a Darrieus turbine

    The theory and experimental measurements on the aerodynamic decay of the wake from a high-performance vertical-axis wind turbine are discussed. In the initial experimental study, the wake downstream of a model Darrieus rotor, 28 cm in diameter and 45.5 cm in height, was measured in a Boundary Layer Wind Tunnel. The wind turbine was run at the design tip speed ratio of 5.5. It was found that the wake decayed at a slower rate with distance downstream of the turbine than the wake from a screen with a similar troposkein shape and drag force characteristics as the Darrieus rotor. The initial wind tunnel results indicated that vertical-axis wind turbines should be spaced at least forty diameters apart to avoid mutual power depreciation greater than ten per cent.
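
    For context, the tip speed ratio quoted above is the ratio of blade equatorial speed to free-stream wind speed. A minimal sketch of the arithmetic, using the 28 cm rotor from the study; the rotor speed and wind speed are assumed example values, not figures from the paper:

    ```python
    import math

    def tip_speed_ratio(rpm: float, diameter_m: float, wind_speed_ms: float) -> float:
        """Tip speed ratio lambda = omega * R / V."""
        omega = rpm * 2.0 * math.pi / 60.0   # rotor angular speed, rad/s
        return omega * (diameter_m / 2.0) / wind_speed_ms

    # 28 cm model rotor; rpm and wind speed are illustrative only.
    print(f"lambda = {tip_speed_ratio(rpm=3750, diameter_m=0.28, wind_speed_ms=10.0):.2f}")
    # -> lambda = 5.50, the design operating point quoted in the abstract
    ```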

    Squeezing out the last 1 nanometer of water: A detailed nanomechanical study

    In this study, we present a detailed analysis of the squeeze-out dynamics of water nanoconfined between two hydrophilic surfaces, measured by small-amplitude dynamic atomic force microscopy (AFM). Explicitly considering the instantaneous tip-surface separation during squeeze-out, we confirm the existence of an adsorbed molecular water layer on mica and at least two hydration layers. We also confirm the previous observation of a sharp transition in the viscoelastic response of the nanoconfined water as the compression rate is increased beyond a critical value (previously determined to be about 0.8 nm/s). We find that below the critical value, the tip passes smoothly through the molecular layers of the film, while above the critical speed, the tip encounters "pinning" at separations where the film is able to temporarily order. Pre-ordering of the film is accompanied by increased force fluctuations, which lead to increased damping preceding a peak in the film stiffness once ordering is completed. We analyze the data using both Kelvin-Voigt and Maxwell viscoelastic models. This provides a complementary picture of the viscoelastic response of the confined water film.
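
    For context, the two viscoelastic models named above differ in how they couple the elastic and viscous elements; their standard one-dimensional constitutive relations (the general textbook forms, not the paper's fitted parameters) are:

    ```latex
    % Kelvin-Voigt: spring (E) and dashpot (eta) in parallel --
    % elastic and viscous stresses add.
    \[
      \sigma(t) = E\,\varepsilon(t) + \eta\,\dot{\varepsilon}(t)
    \]
    % Maxwell: spring and dashpot in series --
    % strain rates add, so stress relaxes under constant strain.
    \[
      \dot{\varepsilon}(t) = \frac{\dot{\sigma}(t)}{E} + \frac{\sigma(t)}{\eta}
    \]
    ```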

    Non-recursive max* operator with reduced implementation complexity for turbo decoding

    In this study, the authors deal with the problem of how to effectively approximate the max* operator over n > 2 input values, with the aim of reducing the implementation complexity of conventional Log-MAP turbo decoders. They show that, contrary to previous approaches, it is not necessary to apply the max* operator recursively over pairs of values. Instead, a simple yet effective solution for the max* operator is revealed, having the advantage of being in non-recursive form and thus requiring less computational effort. Hardware synthesis results for practical turbo decoders have shown implementation savings for the proposed method against the most recently published efficient turbo decoding algorithms, while providing near-optimal bit error rate (BER) performance.
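
    For reference, the exact max* (Jacobian logarithm) over n values equals the log-sum-exp of those values, and conventional Log-MAP decoders evaluate it by folding the two-input form pairwise. A minimal sketch of both, to show what the paper avoids recomputing; the paper's specific non-recursive approximation is not reproduced here:

    ```python
    import math

    def max_star_2(a: float, b: float) -> float:
        """Exact two-input max*: max(a, b) plus the Jacobian correction term."""
        return max(a, b) + math.log1p(math.exp(-abs(a - b)))

    def max_star_recursive(values: list[float]) -> float:
        """Conventional Log-MAP evaluation: fold max*_2 pairwise over the inputs."""
        acc = values[0]
        for v in values[1:]:
            acc = max_star_2(acc, v)
        return acc

    def max_star_exact(values: list[float]) -> float:
        """max* over n values is exactly log-sum-exp, stabilized by the maximum."""
        m = max(values)
        return m + math.log(sum(math.exp(v - m) for v in values))

    llrs = [0.3, -1.2, 2.5, 0.9]   # example log-likelihood ratios
    assert abs(max_star_recursive(llrs) - max_star_exact(llrs)) < 1e-12
    ```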

    Probabilistic models of information retrieval based on measuring the divergence from randomness

    We introduce and create a framework for deriving probabilistic models of Information Retrieval. The models are nonparametric models of IR obtained in the language model approach. We derive term-weighting models by measuring the divergence of the actual term distribution from that obtained under a random process. Among the random processes, we study the binomial distribution and Bose-Einstein statistics. We define two types of term frequency normalization for tuning term weights in the document-query matching process. The first normalization assumes that documents have the same length and measures the information gain of the observed term once it has been accepted as a good descriptor of the observed document. The second normalization is related to the document length and to other statistics. These two normalization methods are applied to the basic models in succession to obtain weighting formulae. Results show that our framework produces different nonparametric models forming baseline alternatives to the standard tf-idf model.
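
    To make the composition of randomness model and the two normalizations concrete, here is a minimal sketch of one member of the divergence-from-randomness family (the geometric Bose-Einstein model with Laplace first normalization and length-based second normalization, commonly labelled GL2). The formulas follow the standard DFR framework; the function and parameter names are ours, and this is an illustration rather than the paper's reference implementation:

    ```python
    import math

    def dfr_gl2_weight(tf: float, doc_len: float, avg_doc_len: float,
                       term_coll_freq: float, n_docs: float, c: float = 1.0) -> float:
        """DFR term weight: inf1 (divergence from randomness) * inf2 (aftereffect)."""
        # Second normalization: rescale raw tf by document length.
        tfn = tf * math.log2(1.0 + c * avg_doc_len / doc_len)
        # Randomness model: geometric approximation to Bose-Einstein statistics.
        lam = term_coll_freq / n_docs
        inf1 = math.log2(1.0 + lam) + tfn * math.log2((1.0 + lam) / lam)
        # First normalization (Laplace after-effect of sampling).
        inf2 = 1.0 / (tfn + 1.0)
        return inf1 * inf2

    # Illustrative statistics, not taken from any collection.
    print(dfr_gl2_weight(tf=3, doc_len=120, avg_doc_len=300,
                         term_coll_freq=900, n_docs=100_000))
    ```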

    On the Impact of Entity Linking in Microblog Real-Time Filtering

    Microblogging is a model of content sharing in which the temporal locality of posts with respect to important events, either of foreseeable or unforeseeable nature, makes applications of real-time filtering of great practical interest. We propose the use of Entity Linking (EL) to improve retrieval effectiveness by enriching the representation of microblog posts and filtering queries. EL is the process of recognizing in an unstructured text the mention of relevant entities described in a knowledge base. EL of short pieces of text is a difficult task, but it is also a scenario in which the information EL adds to the text can have a substantial impact on the retrieval process. We implement a state-of-the-art filtering method, based on the best systems from the TREC Microblog track real-time ad hoc retrieval and filtering tasks, and extend it with a Wikipedia-based EL method. Results show that the use of EL significantly improves over non-EL-based versions of the filtering methods.

    Comment: 6 pages, 1 figure, 1 table. SAC 2015, Salamanca, Spain, April 13-17, 2015.
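
    A minimal sketch of the enrichment idea: posts and filtering queries are augmented with identifiers of the entities an EL system recognizes in their text, so matching can happen in both the term space and the entity space. Everything here (the `link_entities` stub, the gazetteer, the score combination) is hypothetical, not the paper's implementation:

    ```python
    # `link_entities` stands in for any Wikipedia-based entity linker;
    # it is an assumed interface, stubbed with a toy gazetteer.

    def link_entities(text: str) -> set[str]:
        """Return knowledge-base identifiers for entity mentions found in text."""
        gazetteer = {"salamanca": "Q15695", "trec": "Q7833659"}
        return {kb_id for name, kb_id in gazetteer.items() if name in text.lower()}

    def enriched_score(query: str, post: str, entity_weight: float = 0.5) -> float:
        """Combine simple term overlap with overlap of linked entities."""
        q_terms, p_terms = set(query.lower().split()), set(post.lower().split())
        term_overlap = len(q_terms & p_terms) / max(len(q_terms), 1)
        q_ents, p_ents = link_entities(query), link_entities(post)
        ent_overlap = len(q_ents & p_ents) / max(len(q_ents), 1) if q_ents else 0.0
        return term_overlap + entity_weight * ent_overlap

    print(enriched_score("TREC filtering task", "New results on the TREC Microblog track"))
    ```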

    Pupil remapping for high contrast astronomy: results from an optical testbed

    The direct imaging and characterization of Earth-like planets is among the most sought-after prizes in contemporary astrophysics; however, current optical instrumentation delivers insufficient dynamic range to overcome the vast contrast differential between the planet and its host star. New opportunities are offered by coherent single-mode fibers, whose technological development has been motivated by the needs of the telecom industry in the near infrared. This paper presents a new vision for an instrument using coherent waveguides to remap the pupil geometry of the telescope. It would (i) inject the full pupil of the telescope into an array of single-mode fibers, (ii) rearrange the pupil so fringes can be accurately measured, and (iii) permit image reconstruction so that atmospheric blurring can be totally removed. Here we present a laboratory experiment whose goal was to validate the theoretical concepts underpinning our proposed method. We successfully confirmed that we can retrieve the image of a simulated astrophysical object (in this case a binary star) through a pupil remapping instrument using single-mode fibers.

    Comment: Accepted in Optics Express.