
    Acquiring Correct Knowledge for Natural Language Generation

    Natural language generation (NLG) systems are computer software systems that produce texts in English and other human languages, often from non-linguistic input data. NLG systems, like most AI systems, need substantial amounts of knowledge. However, our experience in two NLG projects suggests that it is difficult to acquire correct knowledge for NLG systems; indeed, every knowledge acquisition (KA) technique we tried had significant problems. In general terms, these problems were due to the complexity, novelty, and poorly understood nature of the tasks our systems attempted, and were worsened by the fact that people write so differently. This meant in particular that corpus-based KA approaches suffered because it was impossible to assemble a sizable corpus of high-quality, consistent, manually written texts in our domains; and structured expert-oriented KA techniques suffered because experts disagreed and because we could not get enough information about special and unusual cases to build robust systems. We believe that such problems are likely to affect many other NLG systems as well. In the long term, we hope that new KA techniques may emerge to help NLG system builders. In the shorter term, we believe that understanding how individual KA techniques can fail, and using a mixture of different KA techniques with different strengths and weaknesses, can help developers acquire NLG knowledge that is mostly correct.

    Research study on materials processing in space experiment M512

    A study program was conducted to clarify the role of gravity in the fluid mechanics of certain molten metal processes of potential significance to manufacturing in space. In particular, analyses were performed of the M551 Metals Melting Experiment and the M553 Sphere Forming Experiment to be conducted in the M512 Facility onboard Skylab. The M551 experiment consisted of a study of electron beam welding of various metals, and the M553 experiment studied the formation of molten metal spheres by free-floating in a near zero-gravity environment. The analyses of these experiments and a comparison with ground-based and KC-135 experimental results are presented.

    Probabilistic models of information retrieval based on measuring the divergence from randomness

    We introduce a framework for deriving probabilistic models of Information Retrieval (IR). The models are nonparametric models of IR obtained in the language model approach. We derive term-weighting models by measuring the divergence of the actual term distribution from that obtained under a random process. Among the random processes, we study the binomial distribution and Bose--Einstein statistics. We define two types of term frequency normalization for tuning term weights in the document--query matching process. The first normalization assumes that documents have the same length and measures the information gain from the observed term once it has been accepted as a good descriptor of the observed document. The second normalization is related to the document length and to other statistics. These two normalization methods are applied to the basic models in succession to obtain weighting formulae. Results show that our framework produces different nonparametric models forming baseline alternatives to the standard tf-idf model.
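    To illustrate the structure of such a weighting model, the following is a minimal sketch combining a geometric (Bose--Einstein) randomness model with a Laplace-style first normalization and a logarithmic document-length second normalization. The function name, the parameter c, and this particular combination are assumptions made for the example, not the paper's exact formulation.

        import math

        def dfr_weight(tf, doc_len, avg_doc_len, term_coll_freq, n_docs, c=1.0):
            """Sketch of a divergence-from-randomness term weight (assumed form)."""
            # Second normalization: rescale tf to a standard document length.
            tfn = tf * math.log2(1.0 + c * avg_doc_len / doc_len)

            # Randomness model: geometric approximation of Bose-Einstein statistics,
            # with lam the mean term frequency per document under randomness.
            lam = term_coll_freq / n_docs
            inf1 = math.log2(1.0 + lam) + tfn * math.log2((1.0 + lam) / lam)

            # First normalization (Laplace-style after-effect): information gain from
            # accepting the term as a good descriptor of the document.
            inf2 = 1.0 / (tfn + 1.0)

            return inf1 * inf2

        # A query-document score would then sum this weight over the matching terms,
        # typically multiplied by the query term frequency.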

    Squeezing out the last 1 nanometer of water: A detailed nanomechanical study

    In this study, we present a detailed analysis of the squeeze-out dynamics of water nanoconfined between two hydrophilic surfaces, measured by small-amplitude dynamic atomic force microscopy (AFM). Explicitly considering the instantaneous tip-surface separation during squeeze-out, we confirm the existence of an adsorbed molecular water layer on mica and at least two hydration layers. We also confirm the previous observation of a sharp transition in the viscoelastic response of the nanoconfined water as the compression rate is increased beyond a critical value (previously determined to be about 0.8 nm/s). We find that below the critical rate the tip passes smoothly through the molecular layers of the film, while above it the tip encounters "pinning" at separations where the film is able to temporarily order. Pre-ordering of the film is accompanied by increased force fluctuations, which lead to increased damping preceding a peak in the film stiffness once ordering is completed. We analyze the data using both Kelvin-Voigt and Maxwell viscoelastic models; this provides a complementary picture of the viscoelastic response of the confined water film.
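    For reference, the two lumped-element models named in the abstract take the following standard forms for the complex dynamic stiffness probed in small-amplitude dynamic AFM. The sketch below only evaluates these textbook expressions; the parameter values are placeholders, not values taken from the study.

        import numpy as np

        def kelvin_voigt_stiffness(k, c, omega):
            """Spring k in parallel with dashpot c: K*(w) = k + i*w*c."""
            return k + 1j * omega * c

        def maxwell_stiffness(k, c, omega):
            """Spring k in series with dashpot c: K*(w) = k*i*w*tau / (1 + i*w*tau)."""
            tau = c / k                      # relaxation time
            return k * (1j * omega * tau) / (1.0 + 1j * omega * tau)

        omega = 2 * np.pi * 1e3              # drive frequency in rad/s (placeholder)
        k, c = 0.5, 1e-4                     # stiffness [N/m], damping [N s/m] (placeholders)
        for name, model in [("Kelvin-Voigt", kelvin_voigt_stiffness),
                            ("Maxwell", maxwell_stiffness)]:
            K = model(k, c, omega)
            # Real part: elastic (storage) stiffness; imaginary part: viscous (loss) response.
            print(f"{name}: storage = {K.real:.3e} N/m, loss = {K.imag:.3e} N/m")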

    Quantum Information Paradox: Real or Fictitious?

    One of the outstanding puzzles of theoretical physics is whether quantum information indeed gets lost in the course of Black Hole (BH) evaporation or accretion. Recall that Quantum Mechanics (QM) demands an upper limit on the acceleration of a test particle. On the other hand, it is pointed out here that, if a Schwarzschild BH existed, the acceleration of a test particle would blow up at the event horizon, in violation of QM. Thus the concept of an exact BH contradicts QM and quantum gravity (QG). It is also recalled that the mass of a BH actually appears as an integration constant of the Einstein equations, and it has been shown that the value of this integration constant is actually zero. Thus even classically there cannot be finite-mass BHs, though a zero-mass BH is allowed. It has further been shown that during continued gravitational collapse, radiation emanating from the contracting object gets trapped within it by the runaway gravitational field. As a consequence, the contracting body attains a quasi-static state in which the outward trapped radiation pressure is balanced by the inward gravitational pull, and the ideal classical BH state is never formed in a finite proper time. In other words, continued gravitational collapse results in an "Eternally Collapsing Object", a ball of hot plasma that asymptotically approaches the true BH state with M=0 after radiating away its entire mass energy. If we include QM, this contraction must halt at the radius suggested by the maximum QM acceleration. In any case no event horizon (EH) is ever formed and, in reality, there is no quantum information paradox. Comment: 8 pages in Pramana style, 6 in RevTeX style.
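    The claim that the acceleration "would blow up at the event horizon" refers to the standard Schwarzschild-geometry expression for the proper acceleration of a static test particle, included here only to ground that step (it is a textbook result, not derived in the abstract itself):

        a(r) = \frac{GM}{r^{2}}\left(1 - \frac{2GM}{c^{2}r}\right)^{-1/2}
        \;\longrightarrow\; \infty
        \qquad \text{as } r \to r_{s} = \frac{2GM}{c^{2}} .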

    Are there any good digraph width measures?

    Several different measures for digraph width have appeared in the last few years. However, none of them shares all the "nice" properties of treewidth: first, being \emph{algorithmically useful}, i.e., admitting polynomial-time algorithms for all MSO1-definable problems on digraphs of bounded width; and, second, having nice \emph{structural properties}, i.e., being monotone under taking subdigraphs and some form of arc contractions. As for the former, (undirected) MSO1 seems to be the least common denominator of all reasonably expressive logical languages on digraphs that can speak about the edge/arc relation on the vertex set. The latter property is a necessary condition for a width measure to be characterizable by some version of the cops-and-robber game characterizing ordinary treewidth. Our main result is that \emph{any reasonable} algorithmically useful and structurally nice digraph width measure cannot be substantially different from the treewidth of the underlying undirected graph. Moreover, we introduce \emph{directed topological minors} and argue that they are the weakest useful notion of minors for digraphs.
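    Since the main result ties any such digraph width measure to the treewidth of the underlying undirected graph, a hedged sketch of computing that reference quantity may help: the example digraph below is invented for illustration, and networkx's heuristic gives only an upper bound on the treewidth.

        import networkx as nx
        from networkx.algorithms.approximation import treewidth_min_degree

        # Example digraph (arcs chosen arbitrarily for illustration).
        D = nx.DiGraph([(1, 2), (2, 3), (3, 1), (3, 4), (4, 5), (5, 3)])

        # Forget arc directions to obtain the underlying undirected graph.
        U = D.to_undirected()

        # Heuristic upper bound on its treewidth (min-degree elimination ordering).
        width, _decomposition = treewidth_min_degree(U)
        print("treewidth upper bound of underlying undirected graph:", width)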

    Inferring effective interactions from the local density of states: application to STM data from Bi$_2$Sr$_2$CaCu$_2$O$_{8+\delta}$

    While the influence of impurities on the local density of states (LDOS) in a metal is notoriously non-local due to interference effects, low-order moments of the LDOS can in general be shown to depend only on the local structure of the Hamiltonian. Specifically, we show that an analysis of the spatial variations of these moments permits one to ``work backwards'' from scanning tunneling microscopy (STM) data to infer the local structure of the underlying effective Hamiltonian. Applying this analysis to STM data from the high temperature superconductor Bi$_2$Sr$_2$CaCu$_2$O$_{8+\delta}$, we find that the variations of the electro-chemical potential are remarkably small (i.e., the disorder is, in a sense, weak) but that there are large variations in the local magnitude of the d-wave gap parameter. Comment: 7 pages, 7 figures.
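    The statement that low-order LDOS moments depend only on the local structure of the Hamiltonian reflects the identity that the n-th moment of the LDOS at site i equals <i|H^n|i>. The sketch below checks this for a simple disordered tight-binding chain; the model and its parameters are illustrative assumptions, not the effective Hamiltonian inferred in the paper.

        import numpy as np

        # Illustrative 1D tight-binding chain with on-site disorder.
        rng = np.random.default_rng(0)
        n_sites, t = 50, 1.0
        eps = 0.3 * rng.standard_normal(n_sites)        # disordered on-site energies

        H = np.diag(eps)
        H += np.diag(-t * np.ones(n_sites - 1), 1)
        H += np.diag(-t * np.ones(n_sites - 1), -1)

        def ldos_moment(H, i, n):
            """n-th moment of the LDOS at site i, i.e. <i|H^n|i>."""
            return np.linalg.matrix_power(H, n)[i, i]

        i = n_sites // 2
        print(ldos_moment(H, i, 1), eps[i])              # first moment: local on-site energy
        print(ldos_moment(H, i, 2), eps[i]**2 + 2*t**2)  # second moment: eps_i^2 + sum of hoppings^2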