1,348 research outputs found

    A Single Nucleotide Polymorphism in the Bovine Kit Oncogene (Hardy-Zuckerman 4 Feline Sarcoma Viral (v-kit) Oncogene Homolog)

    Source/description; PCR conditions and SNP analysis; Polymorphism; Inheritance; Chromosomal location.

    Prosody Modelling in Concept-to-Speech Generation: Methodological Issues

    We explore three issues in the development of concept-to-speech (CTS) systems. We identify information available in a language-generation system that has the potential to impact prosody; investigate the role played by different corpora in CTS prosody modelling; and explore different methodologies for learning how linguistic features impact prosody. Our major focus is a comparison of two machine-learning methodologies: generalized rule induction and memory-based learning. We describe this work in the context of MAGIC (Multimedia Abstract Generation of Intensive Care data), a system that produces multimedia briefings on the status of patients who have just undergone a bypass operation.
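    Memory-based learning here amounts to storing training instances and classifying new cases by their nearest neighbours. The following is a minimal sketch of that idea for pitch-accent prediction; the features (part of speech, givenness, phrase position) and the tiny data set are illustrative assumptions, not the MAGIC feature set.

```python
# Minimal sketch of memory-based learning for prosody prediction.
# Feature names and data are illustrative, not the MAGIC feature set.
from sklearn.feature_extraction import DictVectorizer
from sklearn.neighbors import KNeighborsClassifier
from sklearn.pipeline import make_pipeline

# Each training instance: linguistic features of a word -> pitch-accent label.
train = [
    ({"pos": "NN", "given": False, "phrase_initial": True},  "accent"),
    ({"pos": "DT", "given": True,  "phrase_initial": True},  "none"),
    ({"pos": "VB", "given": False, "phrase_initial": False}, "accent"),
    ({"pos": "NN", "given": True,  "phrase_initial": False}, "none"),
]
X, y = zip(*train)

# Memory-based learning = keep all instances, classify by nearest neighbour.
model = make_pipeline(DictVectorizer(), KNeighborsClassifier(n_neighbors=1))
model.fit(list(X), list(y))

print(model.predict([{"pos": "NN", "given": False, "phrase_initial": False}]))
```

    Rule induction, by contrast, would compress the same instances into symbolic if-then rules; the comparison in the paper turns on which of the two generalizes better from corpus features to prosodic decisions.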

    Validating soil denitrification models based on laboratory N2 and N2O fluxes and underlying processes derived by stable isotope approaches: concept, methods and regulation of measured fluxes

    Robust denitrification data suitable to validate soil N2 fluxes in denitrification models are scarce due to methodological limitations and the extreme spatio-temporal heterogeneity of denitrification in soils. Numerical models have become essential tools to predict denitrification at different scales. Model performance can be tested for total gaseous flux (NO + N2O + N2), for individual denitrification products (e.g. N2O and/or NO), or for the effect of denitrification controls (e.g. C availability, respiration, diffusivity, anaerobic volume). While there are numerous examples of validating N2O fluxes, there are neither robust field data on N2 fluxes nor sufficiently resolved measurements of the control factors used as state variables in the models. Here we present the concept, methods and first results of collecting model validation data, as part of the coordinated research unit “Denitrification in Agricultural Soils: Integrated Control and Modelling at Various Scales” (DASIM). Novel approaches are used, including analysis of stable isotopes, microbial communities, pore structure and organic matter fractions, to provide denitrification data sets with as much detail on activity and regulation as possible. These data will be the basis for validating existing denitrification models, and for calibrating new ones, applied and/or developed by DASIM subprojects. To allow model testing over a wide range of conditions, denitrification control factors are varied in the initial settings (pore volume, plant residues, mineral N, pH) and also over time, with moisture, temperature and mineral N manipulated according to typical field time patterns. This is realized by including precipitation events, fertilization (via irrigation), drainage (via water potential) and temperature changes in the course of the incubations. Moreover, oxygen concentration is varied to simulate anaerobic events. The 15N gas-flux method is employed to quantify N2 and N2O emissions from the various pools and processes.
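    The core arithmetic behind the 15N gas-flux method can be sketched with the standard two-pool equations (binomial isotopologue distribution for N2 evolved from a single, uniformly labelled pool). The function and symbol names below are our own shorthand, not the DASIM analysis code.

```python
# Sketch of the standard 15N gas-flux calculation (two-pool model,
# binomial isotopologue distribution); not the DASIM implementation.

def pool_enrichment(r30_29: float) -> float:
    """15N fraction a_p of the N2-emitting pool from the 30/29 ion ratio.
    For a single pool, 30R/29R = a / (2(1 - a)), so a = 2R / (1 + 2R)."""
    return 2 * r30_29 / (1 + 2 * r30_29)

def fraction_pool_derived(a_m: float, a_p: float, a_bgd: float = 0.003663) -> float:
    """Fraction of total N2 stemming from the 15N-labelled pool.
    a_m: measured 15N enrichment of headspace N2,
    a_bgd: natural-abundance background (0.3663 atom% 15N)."""
    return (a_m - a_bgd) / (a_p - a_bgd)

# Example: ion-current ratio 30/29 = 0.25 -> pool enrichment ~33.3 atom%.
a_p = pool_enrichment(0.25)
fp = fraction_pool_derived(a_m=0.02, a_p=a_p)
print(f"a_p = {a_p:.3f}, fraction of N2 from labelled pool = {fp:.3f}")
```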

    Probabilistic approaches for modeling text structure and their application to text-to-text generation

    Since the early days of generation research, it has been acknowledged that modeling the global structure of a document is crucial for producing coherent, readable output. However, traditional knowledge-intensive approaches have been of limited utility in addressing this problem, since they cannot be effectively scaled to domain-independent, large-scale applications. Due to this difficulty, existing text-to-text generation systems rarely rely on such structural information when producing an output text. Consequently, texts generated by these methods do not match the quality of those written by humans – they are often fraught with severe coherence violations and disfluencies. In this chapter, I present probabilistic models of document structure that can be effectively learned from raw document collections. This feature distinguishes these new models from the traditional knowledge-intensive approaches used in symbolic concept-to-text generation. Our results demonstrate that these probabilistic models can be directly applied to content organization, and suggest that they can prove useful in an even broader range of text-to-text applications than we have considered here.
    National Science Foundation (U.S.) (CAREER grant IIS-0448168); Microsoft Research New Faculty Fellowship.
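    One concrete instance of such a probabilistic structure model is an HMM whose hidden states are topics and whose transitions capture conventional ordering, in the spirit of content models; such a model can rank candidate sentence orderings. The toy sketch below assumes hand-given topic labels and an invented mini-corpus; a real content model would induce the topics by clustering.

```python
# Toy sketch of an HMM-style content model: hidden states are topics,
# transitions capture conventional ordering; used here to rank orderings.
from collections import Counter
from itertools import permutations
import math

# Training documents as topic-label sequences (illustrative).
docs = [
    ["intro", "method", "result", "conclusion"],
    ["intro", "method", "method", "result", "conclusion"],
    ["intro", "result", "conclusion"],
]
topics = sorted({t for d in docs for t in d})

# Estimate transition probabilities with add-one smoothing.
bigrams = Counter((a, b) for d in docs for a, b in zip(d, d[1:]))
unigrams = Counter(t for d in docs for t in d[:-1])

def p(b, a):
    return (bigrams[(a, b)] + 1) / (unigrams[a] + len(topics))

def score(seq):
    return sum(math.log(p(b, a)) for a, b in zip(seq, seq[1:]))

# Content organization: pick the ordering with the highest model probability.
sents = ["result", "intro", "conclusion", "method"]
best = max(permutations(sents), key=score)
print(best)   # ('intro', 'method', 'result', 'conclusion')
```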

    A Procedure for Assessing Heavy Mineral Resources Potential

    Supplies of placer heavy minerals, such as ilmenite, rutile, zircon, and monazite, are anticipated to run short early in the next century. The depletion of conventional onshore deposits, coupled with the declaration of the Exclusive Economic Zone in 1983, has provided the impetus to assess the resource potential of heavy-mineral concentrations in U.S. Continental Shelf sediments as future sources of these mineral commodities. Assessments of placer resources based on concentrates from small-volume samples are mineralogically imprecise because of the particle-sparsity effect. The overall low grade of heavy minerals in Atlantic Continental Shelf sediments therefore requires the analysis of mineral concentrates from large volumes of bulk sample. A set of procedures to extract and analyze heavy minerals from large-volume samples is presented.
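    The particle-sparsity effect can be made concrete with counting statistics: if grains of a target mineral turn up in a sample roughly according to Poisson statistics, the relative standard error of a grade estimate is about 1/sqrt(n) for n counted grains, so low grades force large sample volumes. The numbers below are illustrative assumptions, not figures from the report.

```python
# Back-of-the-envelope for the particle-sparsity effect: under Poisson
# counting statistics the relative standard error of a grade estimate
# is ~ 1/sqrt(n) for n counted grains of the target mineral.

def sample_volume_needed(grade: float, grains_per_cm3: float,
                         rel_error: float) -> float:
    """Sediment volume (cm^3) needed to count enough target grains.
    grade: fraction of countable grains that are the target mineral,
    grains_per_cm3: countable heavy-mineral grains per cm^3 of sediment,
    rel_error: target relative standard error. All figures illustrative."""
    n_required = (1 / rel_error) ** 2              # grains for target precision
    target_grains_per_cm3 = grade * grains_per_cm3
    return n_required / target_grains_per_cm3

# A 0.1% monazite grade, ~1000 countable grains per cm^3, 10% error target:
v = sample_volume_needed(grade=0.001, grains_per_cm3=1000, rel_error=0.10)
print(f"~{v:.0f} cm^3 of sediment needed")         # ~100 cm^3
```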

    Validating soil denitrification models based on laboratory N2 and N2O fluxes and underlying processes: evaluation of DailyDayCent and COUP models

    Denitrification is a key anaerobic microbial process in which NO3− is reduced stepwise and emitted from the soil as NO, N2O and finally N2. Accurate knowledge of denitrification dynamics is important because incomplete reduction of N2O to N2 makes denitrification the main emission source of this greenhouse gas from agricultural soils. Hence, our understanding of, and ability to quantify, soil denitrification is crucial for mitigating nitrogen fertilizer losses as well as for reducing N2O emissions. Models can be an important tool to predict mitigation effects and to help develop climate-smart mitigation strategies. Ideally, commonly used biogeochemical models would provide adequate predictions of denitrification in agricultural soils, but simplified process descriptions and inadequate model parameters often prevent them from simulating realistic N2 and N2O fluxes at field scale. Model development and parametrization often suffer from the limited availability of empirical data describing denitrification processes in agricultural soils. While many studies use N2O emissions to develop and train models, detailed measurements of NO, N2O and N2 fluxes and concentrations, and of the related soil conditions, are necessary to develop and test adequate model algorithms. To address this issue, the coordinated research unit “Denitrification in Agricultural Soils: Integrated Control and Modelling at Various Scales” (DASIM) was initiated to investigate N fluxes caused by denitrification in response to environmental effects, soil properties and microbial communities. Here, we present how we will use these data to evaluate common biogeochemical process models (DailyDayCent, COUP) with respect to modelled NO, N2O and N2 fluxes from denitrification. The models are run with different settings: the first approximation uses the basic “factory” settings; in the next step, appropriate parameters are adjusted to the measured values and the resulting precision is compared with the “factory” results. This calibration, together with the well-controlled measured input and output parameters, should expose the likely shortcomings of the tested models and provide a basis for future model improvement.
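    The two-stage procedure described above can be sketched as a small calibration loop: score the factory-default run against measured fluxes, then let an optimizer adjust selected parameters. `run_model` below is a hypothetical stand-in for a wrapper that would write a parameter file, launch DailyDayCent or COUP, and read back the simulated fluxes; the data and parameter names are invented for illustration.

```python
# Sketch of the two-stage evaluation: "factory" run, then calibration
# against measured N2O fluxes. `run_model` is a hypothetical placeholder;
# the real DailyDayCent/COUP runs happen outside Python.
import numpy as np
from scipy.optimize import minimize

measured_n2o = np.array([1.2, 3.4, 2.1, 0.8, 1.5])   # illustrative fluxes

def run_model(params: np.ndarray) -> np.ndarray:
    """Hypothetical model wrapper returning simulated N2O fluxes."""
    k_denit, q10 = params
    return k_denit * measured_n2o ** 0.9 + 0.1 * q10  # placeholder dynamics

def rmse(params: np.ndarray) -> float:
    return float(np.sqrt(np.mean((run_model(params) - measured_n2o) ** 2)))

factory = np.array([1.0, 2.0])                 # "factory" defaults
print("factory RMSE:", rmse(factory))

fit = minimize(rmse, factory, method="Nelder-Mead")
print("calibrated RMSE:", fit.fun, "params:", fit.x)
```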

    A computational approach to implicit entities and events in text and discourse

    In this paper we focus on the notion of “implicit”, or lexically unexpressed, linguistic elements that are nonetheless necessary for a complete semantic interpretation of a text. We refer to “entities” and “events” because the recovery of the implicit material may affect all the modules of a system for semantic processing, from the grammatically guided components to the inferential and reasoning ones. Reference to the system GETARUNS offers one possible implementation of the algorithms and procedures needed to cope with the problem, and enables us to deal with the full spectrum of phenomena. The paper first addresses three types of “implicit” entities and events: the grammatical ones, as suggested by linguistic theories like LFG or similar generative theories; the semantic ones suggested in the FrameNet project, i.e. CNI, DNI and INI; and the pragmatic ones, for which we present a theory and an implementation for the recovery of implicit entities and events of (non-)standard implicatures. In particular, we show how the use of commonsense knowledge may fruitfully contribute to finding relevant implied meanings. The last type of implicit entity, only touched on here for lack of space, is the subject of point of view, which is computed by Semantic Informational Structure and contributes the intended entity from whose point of view a given subjective statement is expressed.
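    As a toy illustration of the FrameNet-style cases (CNI, DNI, INI), unfilled core roles of an evoked frame can be flagged as null-instantiated. The mini-lexicon and role labels below are illustrative assumptions and are not drawn from GETARUNS or the FrameNet release.

```python
# Toy sketch of flagging FrameNet-style null instantiations (CNI, DNI, INI):
# core roles of a frame not overtly filled in a clause are marked implicit.
# The mini-lexicon is illustrative, not GETARUNS or the FrameNet data.

FRAME_LEXICON = {
    # frame: {core role: default null-instantiation type if unfilled}
    "Commerce_sell": {"Seller": "DNI", "Goods": "DNI", "Buyer": "INI"},
    "Arriving":      {"Theme": "DNI", "Goal": "DNI"},
}

def find_null_instantiations(frame: str, filled_roles: set[str]) -> dict:
    """Return unfilled core roles mapped to a null-instantiation type.
    CNI (constructional, e.g. passive-suppressed agents) would be assigned
    from syntax; here only the lexical DNI/INI defaults are distinguished."""
    core = FRAME_LEXICON[frame]
    return {role: ni for role, ni in core.items() if role not in filled_roles}

# "The painting was sold yesterday." -> Goods filled, Seller/Buyer implicit.
print(find_null_instantiations("Commerce_sell", {"Goods"}))
# {'Seller': 'DNI', 'Buyer': 'INI'}
```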

    Temporal drag: transdisciplinarity and the 'case' of psychosocial studies

    Psychosocial studies is a putatively ‘new’ or emerging field concerned with the irreducible relation between psychic and social life. Genealogically, it attempts to re-suture a tentative relation between mind and social world, individual and mass, internality and externality, norm and subject, and the human and non-human, through gathering up and re-animating largely forgotten debates that have played out across a range of other disciplinary spaces. If, as I argue, the central tenets, concepts and questions of psychosocial studies emerge out of a re-appropriation of what have become anachronistic or ‘useless’ concepts in other fields – ‘the unconscious’, for instance, in the discipline of psychology – then we need to think about transdisciplinarity not just in spatial terms (that is, in terms of movement across disciplinary borders) but also in temporal terms. This may involve engaging with theoretical ‘embarrassments’, one of which – the notion of ‘psychic reality’ – I explore here.

    Relating imperatives to action

    The aim of this chapter is to provide an analysis of the use of logically complex imperatives, in particular imperatives of the form “Do A1 or A2” and “Do A, if B”. We argue for an analysis of imperatives in terms of classical logic which takes into account the influence of background information on imperatives. We show that by doing so one can avoid some counter-intuitive results that have been associated with analyses of imperatives in terms of classical logic; in particular, we address Hamblin's observations concerning rule-like imperatives and Ross's Paradox. The analysis is carried out within an agent-based logical framework, and explicates what it means for an agent to have a successful policy for action with respect to satisfying his or her commitments, where some of those commitments have been introduced by imperative language use.
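    Ross's Paradox can be stated in two lines. The rendering below is schematic rather than the chapter's own formal system: “!” marks an imperative operator, and the fragment assumes amsmath.

```latex
% Ross's Paradox, schematically ("!" marks an imperative): disjunction
% introduction is classically valid for the propositional content, but
% transferring it to imperatives licenses an odd command.
\[
  \frac{!\,p}{!\,(p \lor q)}
  \qquad \text{e.g.} \qquad
  \frac{\text{Post the letter!}}{\text{Post the letter or burn it!}}
\]
% A background-information analysis of the kind argued for above can block
% the step: the disjunctive imperative is appropriate only when both
% disjuncts are live options relative to the agent's commitments.
```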
