
    Entropy and Graph Based Modelling of Document Coherence using Discourse Entities: An Application

    We present two novel models of document coherence and their application to information retrieval (IR). Both models approximate document coherence using discourse entities, e.g. the subject or object of a sentence. Our first model views text as a Markov process generating sequences of discourse entities (entity n-grams); we use the entropy of these entity n-grams to approximate the rate at which new information appears in text, reasoning that as more new words appear, the topic increasingly drifts and text coherence decreases. Our second model extends the work of Guinaudeau & Strube [28] that represents text as a graph of discourse entities, linked by different relations, such as their distance or adjacency in text. We use several graph topology metrics to approximate different aspects of the discourse flow that can indicate coherence, such as the average clustering or betweenness of discourse entities in text. Experiments with several instantiations of these models show that: (i) our models perform on a par with two other well-known models of text coherence even without any parameter tuning, and (ii) reranking retrieval results according to their coherence scores gives notable performance gains, confirming a relation between document coherence and relevance. This work contributes two novel models of document coherence, the application of which to IR complements recent work in the integration of document cohesiveness or comprehensibility into ranking [5, 56].
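    As a rough illustration of the first model (not the paper's exact estimator), the sketch below computes a maximum-likelihood entropy of entity bigrams for a toy sequence of discourse entities; the example entities, the bigram order, and the estimator are illustrative assumptions.

        from collections import Counter
        from math import log2

        def entity_ngram_entropy(entities, n=2):
            """Shannon entropy (bits) of entity n-grams under a simple
            maximum-likelihood estimate; higher entropy suggests faster
            topic drift and hence lower coherence."""
            ngrams = [tuple(entities[i:i + n]) for i in range(len(entities) - n + 1)]
            counts = Counter(ngrams)
            total = sum(counts.values())
            return -sum((c / total) * log2(c / total) for c in counts.values())

        # Toy sequence of discourse entities (subjects/objects, one per sentence).
        doc = ["model", "coherence", "model", "retrieval", "coherence", "model"]
        print(entity_ngram_entropy(doc, n=2))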

    The genealogy of extremal particles of Branching Brownian Motion

    Branching Brownian Motion describes a system of particles which diffuse in space and split into offspring according to a certain random mechanism. By virtue of the groundbreaking work by M. Bramson on the convergence of solutions of the Fisher-KPP equation to traveling waves, the law of the rightmost particle in the limit of large times is rather well understood. In this work, we address the full statistics of the extremal particles (first-, second-, third-, etc. largest). In particular, we prove that in the large-$t$ limit, such particles descend with overwhelming probability from ancestors having split either within a distance of order one from time 0, or within a distance of order one from time $t$. The approach relies on characterizing, up to a certain level of precision, the paths of the extremal particles. As a byproduct, a heuristic picture of Branching Brownian Motion "at the edge" emerges, which sheds light on the still unknown limiting extremal process. Comment: 27 pages, 5 figures; final version accepted for publication in CPA
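    For orientation, the classical one-point result alluded to above can be written as follows (a textbook statement under the standard normalisation of binary branching at rate one and standard Brownian motion, not quoted from the paper): if $x_k(t)$, $k \le n(t)$, denote the particle positions at time $t$, Bramson's work gives

        $$ \mathbb{P}\Big(\max_{k \le n(t)} x_k(t) - m(t) \le x\Big) \xrightarrow[t\to\infty]{} \omega(x), \qquad m(t) = \sqrt{2}\,t - \frac{3}{2\sqrt{2}}\log t, $$

    where $\omega$ is a travelling-wave solution of the Fisher-KPP equation; the present work refines this one-point description to the joint statistics and genealogy of the leading particles.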

    Stein's method, logarithmic Sobolev and transport inequalities

    52 pages. To appear in GAFA. We develop connections between Stein's approximation method, logarithmic Sobolev and transport inequalities by introducing a new class of functional inequalities involving the relative entropy, the Stein kernel, the relative Fisher information and the Wasserstein distance with respect to a given reference distribution on $\mathbb{R}^d$. For the Gaussian model, the results improve upon the classical logarithmic Sobolev inequality and the Talagrand quadratic transportation cost inequality. Further examples of illustration include multidimensional gamma distributions, beta distributions, as well as families of log-concave densities. As a by-product, the new inequalities are shown to be relevant towards convergence to equilibrium, concentration inequalities and entropic convergence expressed in terms of the Stein kernel. The tools rely on semigroup interpolation and bounds, in particular by means of the iterated gradients of the Markov generator whose invariant measure is the distribution under consideration. In a second part, motivated by the recent investigation by Nourdin, Peccati and Swan on Wiener chaoses, we address the issue of entropic bounds on multidimensional functionals $F$ with the Stein kernel via a set of data on $F$ and its gradients rather than on the Fisher information of the density. A natural framework for this investigation is given by the Markov triple structure $(E, \mu, \Gamma)$ in which abstract Malliavin-type arguments may be developed and extend the Wiener chaos setting.
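    For reference, the two classical Gaussian inequalities mentioned above read as follows (a standard statement for the standard Gaussian measure $\gamma$ on $\mathbb{R}^d$, included only for context): for every probability measure $\nu$ absolutely continuous with respect to $\gamma$,

        $$ \mathrm{H}(\nu \mid \gamma) \le \tfrac{1}{2}\,\mathrm{I}(\nu \mid \gamma) \quad \text{(logarithmic Sobolev)}, \qquad \mathrm{W}_2^2(\nu, \gamma) \le 2\,\mathrm{H}(\nu \mid \gamma) \quad \text{(Talagrand)}, $$

    with $\mathrm{H}$ the relative entropy, $\mathrm{I}$ the relative Fisher information and $\mathrm{W}_2$ the quadratic Wasserstein distance; the paper's new inequalities, involving the Stein kernel, improve upon these in the Gaussian case.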

    A semi-implicit compressible model for atmospheric flows with seamless access to soundproof and hydrostatic dynamics

    We introduce a second-order numerical scheme for compressible atmospheric motions at small to planetary scales. The collocated finite volume method treats the advection of mass, momentum, and mass-weighted potential temperature in conservation form while relying on Exner pressure for the pressure gradient term. It discretises the rotating compressible equations by evolving full variables rather than perturbations around a background state, and operates with time steps constrained by the advection speed only. Perturbation variables are only used as auxiliary quantities in the formulation of the elliptic problem. Borrowing ideas from forward-in-time differencing, the algorithm reframes the authors' previously proposed schemes into a sequence of implicit midpoint, advection, and implicit trapezoidal steps that allows for a time integration unconstrained by the internal gravity wave speed. Compared with existing approaches, results on a range of benchmarks of nonhydrostatic- and hydrostatic-scale dynamics are competitive. The test suite includes a new planetary-scale inertia-gravity wave test highlighting the properties of the scheme and its large-time-step capabilities. In the hydrostatic-scale cases the model is run in pseudo-incompressible and hydrostatic mode with simple switching within a uniform discretization framework. The differences relative to the compressible runs have the expected magnitudes. By providing seamless access to soundproof and hydrostatic dynamics, the developments represent a necessary step towards an all-scale blended multimodel solver.
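    Purely as a structural illustration (not the paper's solver), the toy sketch below mimics the sub-step sequencing named above (implicit midpoint, advection, implicit trapezoidal) on a scalar split ODE with a stiff term, showing how the implicit sub-steps keep the update stable even though the time step resolves only the slow "advective" rate; all coefficients are made up.

        # Toy split ODE du/dt = -a*u (slow, "advective") - k*u (fast, stiff).
        # This only mimics the time-stepping structure described in the abstract.
        a, k = 1.0, 200.0          # slow and fast rates (illustrative values)
        dt, nsteps = 0.05, 40      # time step chosen by the slow rate only
        u = 1.0

        for _ in range(nsteps):
            # implicit midpoint over dt/2 for the fast term:
            # u* = u - (dt/2)*k*(u + u*)/2
            u = u * (1 - 0.25 * dt * k) / (1 + 0.25 * dt * k)
            # explicit advection step over dt for the slow term
            u = u - dt * a * u
            # implicit trapezoidal over dt/2 for the fast term:
            # u* = u - (dt/4)*k*(u + u*)
            u = u * (1 - 0.25 * dt * k) / (1 + 0.25 * dt * k)

        print(u)  # stays bounded even though dt*k >> 1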

    Choice-Driven Counterfactuals


    High accuracy binary black hole simulations with an extended wave zone

    We present results from a new code for binary black hole evolutions using the moving-puncture approach, implementing finite differences in generalised coordinates, and allowing the spacetime to be covered with multiple communicating non-singular coordinate patches. Here we consider a regular Cartesian near zone, with adapted spherical grids covering the wave zone. The efficiencies resulting from the use of adapted coordinates allow us to maintain sufficient grid resolution to an artificial outer boundary location which is causally disconnected from the measurement. For the well-studied test case of the inspiral of an equal-mass non-spinning binary (evolved for more than 8 orbits before merger), we determine the phase and amplitude to numerical accuracies better than 0.010% and 0.090%, respectively, during inspiral, and 0.003% and 0.153% during merger. The waveforms, including the resolved higher harmonics, are convergent and can be consistently extrapolated to $r\to\infty$ throughout the simulation, including the merger and ringdown. Ringdown frequencies for these modes (to $(\ell,m)=(6,6)$) match perturbative calculations to within 0.01%, providing a strong confirmation that the remnant settles to a Kerr black hole with irreducible mass $M_{\rm irr} = 0.884355 \pm 20\times10^{-6}$ and spin $S_f/M_f^2 = 0.686923 \pm 10\times10^{-6}$.
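    As a small aside on the $r\to\infty$ extrapolation mentioned above, a common way to realise it (sketched here with invented numbers, not data from the paper) is to fit a waveform quantity measured at several finite extraction radii with a low-order polynomial in $1/r$ and keep the constant term:

        import numpy as np

        # Hypothetical extraction radii (in M) and samples of r*|h22| at a fixed
        # retarded time; the values below are made up for illustration.
        radii = np.array([100.0, 150.0, 200.0, 300.0, 500.0])
        amps = np.array([0.3050, 0.3033, 0.3025, 0.3017, 0.3010])

        order = 2                           # model A(r) ~ A_inf + a1/r + a2/r^2
        coeffs = np.polyfit(1.0 / radii, amps, order)
        A_inf = coeffs[-1]                  # constant term = extrapolated value
        print(A_inf)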

    Fragmentation Methods: A Route to Accurate Calculations on Large Systems

    Theoretical chemists have always strived to perform quantum mechanics (QM) calculations on larger and larger molecules and molecular systems, as well as condensed-phase species, that are frequently much larger than the current state-of-the-art would suggest is possible. The desire to study species (with acceptable accuracy) that are larger than appears to be feasible has naturally led to the development of novel methods, including semiempirical approaches, reduced scaling methods, and fragmentation methods. The focus of the present review is on fragmentation methods, in which a large molecule or molecular system is made more computationally tractable by explicitly considering only one part (fragment) of the whole in any particular calculation. If one can divide a species of interest into fragments, employ some level of ab initio QM to calculate the wave function, energy, and properties of each fragment, and then combine the results from the fragment calculations to predict the same properties for the whole, the possibility exists that the accuracy of the outcome can approach that which would be obtained from a full (nonfragmented) calculation. It is this goal that drives the development of fragmentation methods.
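    As a schematic example of the "divide, compute, combine" recipe described above, the sketch below assembles a total energy from monomer and dimer calculations via a truncated (two-body) many-body expansion, one simple member of the fragmentation family; fragment_energy is a hypothetical placeholder for a real ab initio QM calculation.

        from itertools import combinations

        def fragment_energy(fragment):
            """Hypothetical placeholder for the ab initio QM energy of one
            fragment (or of two fragments treated together); a real method
            would run an electronic-structure calculation here."""
            n = len(fragment)
            return -1.0 * n - 0.1 * n ** 2  # made-up, mildly nonadditive energetics

        def two_body_expansion(fragments):
            """E_total ~ sum_i E_i + sum_{i<j} (E_ij - E_i - E_j)."""
            mono = [fragment_energy(f) for f in fragments]
            total = sum(mono)
            for i, j in combinations(range(len(fragments)), 2):
                total += fragment_energy(fragments[i] + fragments[j]) - mono[i] - mono[j]
            return total

        # Toy system split into three fragments (lists of atom labels).
        frags = [["O", "H", "H"], ["O", "H", "H"], ["C", "H", "H", "H", "H"]]
        print(two_body_expansion(frags))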

    The relation of Christian education to pastoral theology: with reference to the function-centered theology of Seward Hiltner

    Thesis (Ph.D.)--Boston University

    1. Problem.--The problem of this dissertation is to examine the relation of Christian education to pastoral theology, utilizing the function-centered theology of Seward Hiltner as the frame of reference, together with the theories of three Christian educators. There will be an attempt to clarify the nature and structure of pastoral theology and to arrive at some implications for its reformulation as a coordinate theology of the functions of ministry.

    2. Methodology.--The study is set in a historical and contemporary perspective by a brief historical survey of practical and pastoral theology, including the rise of specialization within the functional ministry and a survey of recent attempts to reformulate pastoral theology. A descriptive analysis of Seward Hiltner's function-centered theology provides the frame of reference for describing and analyzing the general theory of pastoral theology and/or ministry and the relation of Christian education in the writings of Reuel L. Howe, Lewis J. Sherrill, and Ross Snyder. Inferences as to the relation of Christian education to pastoral theology and implications for a reformulation of pastoral theology are made [TRUNCATED]

    The Tensor Networks Anthology: Simulation techniques for many-body quantum lattice systems

    We present a compendium of numerical simulation techniques, based on tensor network methods, aiming to address problems of many-body quantum mechanics on a classical computer. The core setting of this anthology is lattice problems in low spatial dimension at finite size, a physical scenario where tensor network methods, both Density Matrix Renormalization Group and beyond, have long proven to be winning strategies. Here we explore in detail the numerical frameworks and methods employed to deal with low-dimensional physical setups, from a computational physics perspective. We focus on symmetries and closed-system simulations in arbitrary boundary conditions, while discussing the numerical data structures and linear algebra manipulation routines involved, which form the core libraries of any tensor network code. At a higher level, we put the spotlight on loop-free network geometries, discussing their advantages, and presenting in detail algorithms to simulate low-energy equilibrium states. Accompanied by discussions of data structures, numerical techniques and performance, this anthology serves as a programmer's companion, as well as a self-contained introduction and review of the basic and selected advanced concepts in tensor networks, including examples of their applications. Comment: 115 pages, 56 figures
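    To give a minimal flavour of the low-level tensor manipulations such a library is built on (this is not code from the anthology), the sketch below builds a random open-boundary matrix product state and contracts it with its conjugate, site by site, to obtain the norm $\langle\psi|\psi\rangle$; the site count and bond dimension are arbitrary choices.

        import numpy as np

        rng = np.random.default_rng(0)
        L, d, chi = 6, 2, 4   # sites, physical dimension, bond dimension

        # Random open-boundary MPS: tensors A[i] of shape (left, physical, right).
        dims = [1] + [chi] * (L - 1) + [1]
        mps = [rng.standard_normal((dims[i], d, dims[i + 1])) for i in range(L)]

        # Contract <psi|psi> by sweeping a left "transfer" environment E.
        E = np.ones((1, 1))
        for A in mps:
            # E_{a,b} * A_{a,s,c} * conj(A)_{b,s,e} -> E'_{c,e}
            E = np.einsum("ab,asc,bse->ce", E, A, np.conj(A))
        norm_sq = E[0, 0]
        print(norm_sq)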