
    Optimizing Talbot's Contours for the Inversion of the Laplace Transform

    Talbot's method for the numerical inversion of the Laplace transform consists of numerically integrating the Bromwich integral on a special contour by means of the trapezoidal or midpoint rule. In this paper we address the issue of how to choose the parameters that define the contour, for the particular situation when parabolic PDEs are solved. In the process the well-known subgeometric convergence rate O(e^{-c√N}) of this method is improved to the geometric rate O(e^{-cN}), with N the number of nodes in the integration rule. The value of the maximum decay rate c is explicitly determined. Numerical results involving two versions of the heat equation are presented. With the choice of parameters derived here, the rule of thumb is that to achieve an accuracy of 10^{-ℓ} at any given time t, the associated elliptic problem has to be solved no more than ℓ times. Supported by the National Research Foundation in South Africa under grant NRF528
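The contour-deformation idea behind this abstract can be sketched in a few lines. The function below is an illustration only: it uses a parabolic contour s(θ) = μ(iθ + 1)² rather than Talbot's cotangent contour, the parameter choices h = 3/N and μ = πN/(12t) are one workable choice rather than the optimized values derived in the paper, and the function name is invented here.

```python
import cmath
import math

def invert_laplace_parabola(F, t, N=40):
    """Approximate f(t) = (1/2πi) ∫ e^{st} F(s) ds by the trapezoidal rule
    on the parabolic contour s(θ) = μ(iθ + 1)², truncated to |θ| ≤ Nh.

    Illustrative (non-optimized) parameters: h = 3/N, μ = πN/(12t).
    """
    h = 3.0 / N
    mu = math.pi * N / (12.0 * t)
    total = 0j
    for k in range(-N, N + 1):
        theta = k * h
        s = mu * (1j * theta + 1) ** 2       # point on the contour
        ds = 2j * mu * (1j * theta + 1)      # s'(θ), the contour derivative
        total += cmath.exp(s * t) * F(s) * ds
    # For real f(t) the imaginary parts cancel by conjugate symmetry.
    return (h / (2j * math.pi) * total).real
```

For example, F(s) = 1/(s + 1) has inverse transform f(t) = e^{-t}, and already modest N recovers it to near machine precision, consistent with the geometric convergence the abstract describes.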

    A Numerical Methodology for the Painlevé Equations

    The six Painlevé transcendents PI – PVI have both applications and analytic properties that make them stand out from most other classes of special functions. Although they have been the subject of extensive theoretical investigations for about a century, they still have a reputation for being numerically challenging. In particular, their extensive pole fields in the complex plane have often been perceived as ‘numerical mine fields’. In the present work, we note that the Painlevé property in fact provides the opportunity for very fast and accurate numerical solutions throughout such fields. When combining a Taylor/Padé-based ODE initial value solver for the pole fields with a boundary value solver for smooth regions, numerical solutions become available across the full complex plane. We focus here on the numerical methodology, and illustrate it for the PI equation. In later studies, we will concentrate on mathematical aspects of both the PI and the higher Painlevé transcendents.
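The Taylor-series half of the solver described above can be sketched for PI, y'' = 6y² + x: expanding y about x₀ gives the coefficient recurrence (n+2)(n+1)a_{n+2} = 6·Σ_k a_k a_{n-k} + x₀·[n=0] + [n=1]. The sketch below implements only this Taylor stepping on a pole-free interval (no Padé stage, no boundary value solver); all function names, the degree, and the step sizes are choices made here for illustration.

```python
def pi_taylor_step(x0, y0, dy0, h, deg=20):
    """One Taylor-series step for Painlevé I, y'' = 6y² + x.

    Builds coefficients a_n of y about x0 from the recurrence
    (n+2)(n+1) a_{n+2} = 6·sum_k a_k a_{n-k} + x0·[n=0] + [n=1],
    then evaluates y and y' at x0 + h.
    """
    a = [y0, dy0]
    for n in range(deg - 1):
        conv = sum(a[k] * a[n - k] for k in range(n + 1))  # Cauchy product for y²
        rhs = 6.0 * conv + (x0 if n == 0 else 0.0) + (1.0 if n == 1 else 0.0)
        a.append(rhs / ((n + 2) * (n + 1)))
    y = sum(an * h ** n for n, an in enumerate(a))
    dy = sum(n * an * h ** (n - 1) for n, an in enumerate(a) if n >= 1)
    return y, dy

def pi_solve(x_end, steps, y0=0.0, dy0=0.0, deg=20):
    """March the Taylor stepper from x = 0 to x_end in equal steps."""
    x, y, dy = 0.0, y0, dy0
    h = x_end / steps
    for _ in range(steps):
        y, dy = pi_taylor_step(x, y, dy, h, deg)
        x += h
    return y
```

With y(0) = y'(0) = 0 the solution starts as y ≈ x³/6, which gives a quick sanity check; step halving provides a simple accuracy check on a pole-free interval. Near poles, the paper's Padé stage would take over from the raw Taylor series.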

    The exponentially convergent trapezoidal rule

    It is well known that the trapezoidal rule converges geometrically when applied to analytic functions on periodic intervals or the real line. The mathematics and history of this phenomenon are reviewed and it is shown that far from being a curiosity, it is linked with computational methods all across scientific computing, including algorithms related to inverse Laplace transforms, special functions, complex analysis, rational approximation, integral equations, and the computation of functions and eigenvalues of matrices and operators.
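The phenomenon is easy to reproduce: for a periodic analytic integrand, the N-point trapezoidal rule (which coincides with the rectangle rule over a full period) converges geometrically in N. A minimal sketch, using the example ∫₀^{2π} dθ/(2 + cos θ) = 2π/√3 (an example chosen here, not taken from the paper):

```python
import math

def trap_periodic(f, N):
    """N-point trapezoidal rule on [0, 2π) for a 2π-periodic integrand.

    Over a full period the endpoint values coincide, so this equals
    the equal-weight rectangle rule.
    """
    h = 2 * math.pi / N
    return h * sum(f(k * h) for k in range(N))

exact = 2 * math.pi / math.sqrt(3)   # ∫₀^{2π} dθ/(2 + cos θ)
approx = trap_periodic(lambda t: 1 / (2 + math.cos(t)), 32)
```

Doubling N roughly squares the error until machine precision is reached, the geometric convergence the abstract refers to; the rate is set by how far the integrand can be continued analytically into the complex strip around the real axis.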

    Parabolic and Hyperbolic Contours for Computing the Bromwich Integral

    Some of the most effective methods for the numerical inversion of the Laplace transform are based on the approximation of the Bromwich contour integral. The accuracy of these methods often hinges on a good choice of contour, and several such contours have been proposed in the literature. Here we analyze two recently proposed contours, namely a parabola and a hyperbola. Using a representative model problem, we determine estimates for the optimal parameters that define these contours. An application to a fractional diffusion equation is presented. JACW was supported by the National Research Foundation in South Africa under grant FA200503230001
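The hyperbolic contour mentioned here can be sketched along the same lines as the parabolic one. The function below is a hedged illustration: the parametrization s(θ) = μ(1 + sin(iθ − α)) is one common form of such a contour, but the parameters used (α = π/4, μ = N/(2t), h = 2/N) are ad hoc choices made here, not the optimal estimates the paper derives, and the function name is invented.

```python
import cmath
import math

def invert_laplace_hyperbola(F, t, N=40, alpha=math.pi / 4):
    """Trapezoidal rule for f(t) = (1/2πi) ∫ e^{st} F(s) ds on the
    hyperbolic contour s(θ) = μ(1 + sin(iθ - α)), truncated to |θ| ≤ Nh.

    Illustrative (non-optimized) parameters: h = 2/N, μ = N/(2t).
    """
    h = 2.0 / N
    mu = N / (2.0 * t)
    total = 0j
    for k in range(-N, N + 1):
        theta = k * h
        s = mu * (1 + cmath.sin(1j * theta - alpha))   # point on the contour
        ds = 1j * mu * cmath.cos(1j * theta - alpha)   # s'(θ)
        total += cmath.exp(s * t) * F(s) * ds
    return (h / (2j * math.pi) * total).real
```

As with the parabola, Re s(θ) → −∞ as |θ| grows, so e^{st} drives the integrand to zero rapidly and the truncated trapezoidal sum converges geometrically; the paper's contribution is choosing α, μ, and h to make that rate as fast as possible.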

    The idea of lingual economy

    A number of philosophical concepts in linguistics may be conceptualised as primitives or founding concepts. Many of these are historically significant; cf. the concepts lingual system, lingual position and sequence, and lingual constancy. Less obvious primitives are ideas of spheres of discourse, text type and acceptability. Generally, such foundational notions may be characterised either as constitutive concepts or as regulative ideas. This article will discuss one such regulative linguistic idea, viz. lingual economy, especially as this was articulated in the work of the ethnomethodologists on turn-taking. Like many other linguistic primitives, this idea constitutes a significant advance in our understanding of things lingual. The analyses referred to below give insight into the normative dimensions of our communicative ability to function as lingual subjects within the material lingual sphere of conversation. These analyses constitute an advance on earlier analyses of conversation, where the overall impression is that it is “random”, forever edging towards indeterminacy and chaos. We may currently build upon the remarkable explanations, first given by ethnomethodology, for lingual distribution, equality, lingually scarce resources, and so forth. The article will argue that these relate to significant regulative ideas that disclose the structure of the lingual dimension of reality.

    The Kink Phenomenon in Fejér and Clenshaw-Curtis Quadrature

    The Fejér and Clenshaw-Curtis rules for numerical integration exhibit a curious phenomenon when applied to certain analytic functions. When N (the number of points in the integration rule) increases, the error does not decay to zero evenly but does so in two distinct stages. For N less than a critical value, the error behaves like O(ϱ^{-2N}), where ϱ is a constant greater than 1. For these values of N the accuracy of both the Fejér and Clenshaw-Curtis rules is almost indistinguishable from that of the more celebrated Gauss-Legendre quadrature rule. For larger N, however, the error decreases at the rate O(ϱ^{-N}), i.e., only half as fast as before. Convergence curves typically display a kink where the convergence rate cuts in half. In this paper we derive explicit as well as asymptotic error formulas that provide a complete description of this phenomenon. This work was supported by the Royal Society of the UK and the National Research Foundation of South Africa under the South Africa-UK Science Network Scheme. The first author also acknowledges grant FA2005032300018 of the NRF
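A Clenshaw-Curtis rule that can be used to observe this convergence behaviour is straightforward to build: sample f at the Chebyshev points x_k = cos(kπ/N) and combine with explicit cosine-sum weights. The sketch below follows the standard weight formula for even N (a stdlib-only version written here for illustration; the helper name is not from the paper).

```python
import math

def clenshaw_curtis(f, N):
    """Clenshaw-Curtis quadrature of f on [-1, 1] with N+1 Chebyshev
    points x_k = cos(kπ/N); N must be even.

    Weights come from the explicit cosine-sum formula for even N.
    """
    theta = [math.pi * k / N for k in range(N + 1)]
    x = [math.cos(t) for t in theta]
    w = [0.0] * (N + 1)
    w[0] = w[N] = 1.0 / (N * N - 1)          # endpoint weights
    for k in range(1, N):
        v = 1.0
        for j in range(1, N // 2):
            v -= 2.0 * math.cos(2 * j * theta[k]) / (4 * j * j - 1)
        v -= math.cos(N * theta[k]) / (N * N - 1)   # halved top term, N even
        w[k] = 2.0 * v / N
    return sum(wk * f(xk) for wk, xk in zip(w, x))
```

For instance, ∫_{-1}^{1} dx/(1 + x²) = π/2 is reproduced to high accuracy at modest N. Plotting the error against N for a function with complex singularities close to [-1, 1] (where ϱ is close to 1, so both stages are visible before rounding error) reveals the kink the abstract describes.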

    Academic literacy interventions: What are we not yet doing, or not yet doing right?

    We now much more readily accept a skills-neutral rather than a skills-based definition of academic literacy, changing our conceptualisations of what academic literacy is. Yet two issues have evaded scrutiny: first, there is the uncritical acceptance that academic writing is what should be taught, and institutionalised. Second, there is a tendency to accept discipline-specific academic literacy courses as necessarily superior to generic ones. There is a third, foundational-level omission in our work: there is little reciprocity in what we learn from applied linguistic artefacts in the realms of language testing, language course design, and language policy making. Why do we not check whether the design of a course should be done as responsibly as that of a test? What can test designers learn from course developers about specificity? There are many useful questions that are right before us, but that we never seem to ask. Key words: academic literacy, academic writing, discipline specific, academic literacy test, curriculum, course, design

    Ensuring coherence: two solutions to organising poetic language

    The organisation of poetic language as discourse type and as text is worth considering in its own right. What do poets bring to expression through their organisation of language, and how do they do it, if they employ language skilfully in order to support the main discursive threads of their work? This contribution demonstrates that poets may choose to organise language around discursive threads in order to ensure the integrity or wholeness of their texts. We might also label this wholeness the aesthetic coherence of the poetry that is produced. The article discusses two examples of how poets ensure coherence by organising their language in highly specific and inventive ways.

    Academic literacy tests: design, development, piloting and refinement

    Though there are many conditions for drafting language tests responsibly, this contribution focuses first on how to operationalise a set of three critically important design principles for such tests. For the last thirty years or so, developers of language tests have agreed that the most important design principle emanates from our ability to give a theoretical justification for what it is that we are measuring. Without this, we eventually have very little ground for a responsible interpretation of test results, which is a second, though not secondary, principle for language test design. There is a third principle involved, which is that the measuring instrument must be consistent and stable. The paper investigates how a blueprint for an academic literacy test may be conceptualised, how that could be operationalised, and demonstrates how pilot tests are analysed with a view to refining them. Finally, that leads to a consideration of how to arrive at a final draft test, and how valid and appropriate interpretations of its results may be made. Since the three conditions for language tests focussed on here are not the only design principles for such applied linguistic instruments, the discussion is placed in a broader philosophical framework for designing language tests that also includes a consideration of some of the remaining design principles for language testing.