
    Image processing applied to gravity and topography data covering the continental United States

    The applicability of fairly standard image processing techniques to processing and analyzing large geologic data sets is addressed. Image filtering techniques were used to interpolate between gravity station locations to produce a regularly spaced data array that preserves detail in areas with good coverage and that produces a continuous-tone image rather than a contour map. Standard image processing techniques were used to digitally register and overlay topographic and gravity data, and the data were displayed in ways that emphasize subtle but pervasive structural features. The potential of the methods is illustrated through a discussion of linear structures that appear in the processed data between the midcontinent gravity high and the Appalachians.
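
    To make the gridding step concrete, the following is a minimal sketch, not the paper's actual filtering procedure: it interpolates hypothetical, irregularly spaced gravity stations onto a regular array with SciPy's griddata and displays the result as a continuous-tone image rather than a contour map. All station coordinates and anomaly values below are synthetic placeholders.

        # Sketch only: synthetic stations stand in for the real Bouguer measurements;
        # scipy's griddata stands in for the paper's image-filtering interpolation.
        import numpy as np
        from scipy.interpolate import griddata
        import matplotlib.pyplot as plt

        rng = np.random.default_rng(0)
        lon = rng.uniform(-105.0, -80.0, 600)                  # station longitudes (deg)
        lat = rng.uniform(35.0, 45.0, 600)                     # station latitudes (deg)
        bouguer = (-30.0 * np.exp(-((lon + 93.0) ** 2) / 4.0)
                   + rng.normal(0.0, 2.0, 600))                # anomaly values (mGal)

        # Regularly spaced output grid
        grid_lon, grid_lat = np.meshgrid(np.linspace(-105, -80, 500),
                                         np.linspace(35, 45, 200))

        # Interpolate between stations; cubic preserves detail where coverage is dense
        grid = griddata((lon, lat), bouguer, (grid_lon, grid_lat), method="cubic")

        plt.imshow(grid, origin="lower", extent=(-105, -80, 35, 45), cmap="gray")
        plt.colorbar(label="Bouguer anomaly (mGal)")
        plt.show()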

    PowerAqua: fishing the semantic web

    The Semantic Web (SW) offers an opportunity to develop novel, sophisticated forms of question answering (QA). Specifically, the availability of distributed semantic markup on a large scale opens the way to QA systems which can make use of such semantic information to provide precise, formally derived answers to questions. At the same time, the distributed, heterogeneous, large-scale nature of the semantic information introduces significant challenges. In this paper we describe the design of a QA system, PowerAqua, designed to exploit semantic markup on the web to provide answers to questions posed in natural language. PowerAqua does not assume that the user has any prior information about the semantic resources. The system takes a natural language query as input and translates it into a set of logical queries, which are then answered by consulting and aggregating information derived from multiple heterogeneous semantic sources.
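
    As an illustration of the kind of pipeline described, the toy sketch below translates a question into a triple pattern and aggregates matches from several small "sources". The function names, triple format, and example data are our assumptions, not PowerAqua's actual components or API.

        # Toy question-answering pipeline in the spirit of the description above.
        from typing import Iterable

        def translate_to_queries(question: str) -> list[tuple[str, str, str]]:
            """Stand-in for linguistic analysis: return subject-relation-object patterns."""
            # A real system would parse the question; here one template is hard-coded.
            return [("?x", "located_in", "Spain")]

        def query_source(source: dict, pattern: tuple[str, str, str]) -> set[str]:
            """Match a triple pattern against one (toy) semantic source."""
            _, rel, obj = pattern
            return {s for (s, r, o) in source["triples"] if r == rel and o == obj}

        def answer(question: str, sources: Iterable[dict]) -> set[str]:
            """Aggregate partial answers drawn from multiple heterogeneous sources."""
            results: set[str] = set()
            for pattern in translate_to_queries(question):
                for src in sources:
                    results |= query_source(src, pattern)
            return results

        sources = [
            {"name": "geo-ontology", "triples": {("Madrid", "located_in", "Spain")}},
            {"name": "travel-kb", "triples": {("Barcelona", "located_in", "Spain")}},
        ]
        print(answer("Which cities are located in Spain?", sources))   # {'Madrid', 'Barcelona'}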

    Structure of the midcontinent basement: Topography, gravity, seismic, and remote sensing

    Some 600,000 discrete Bouguer gravity estimates of the continental United States were spatially filtered to produce a continuous-tone image. The filtered data were also digitally painted in color-coded form onto a shaded relief map. The resultant image is a colored shaded relief map in which the hue and saturation of a given image element are controlled by the value of the Bouguer anomaly. Major structural features (e.g., the midcontinent gravity high) are readily discernible in these data, as are a number of subtle and previously unrecognized features. A linear gravity low approximately 120 to 150 km wide extends from southeastern Nebraska, at a break in the midcontinent gravity high, through the Ozark Plateau, and across the Mississippi embayment. The low is also aligned with the Lewis and Clark lineament (Montana to Washington), forming a linear feature approximately 2800 km in length. In southeastern Missouri the gravity low has an amplitude of 30 milligals, a value too large to be explained by simple valley fill with sedimentary rocks.
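
    The display technique, hue and saturation taken from the gravity anomaly and draped over shaded relief, can be sketched with matplotlib's LightSource; the elevation and anomaly grids below are synthetic stand-ins, not the filtered data described in the abstract.

        # Sketch: color-code a (synthetic) Bouguer anomaly and blend it with a
        # hillshade computed from (synthetic) topography.
        import numpy as np
        import matplotlib.pyplot as plt
        from matplotlib import cm
        from matplotlib.colors import LightSource

        y, x = np.mgrid[0:200, 0:300]
        elevation = 100.0 * np.sin(x / 25.0) * np.cos(y / 20.0)     # stand-in topography (m)
        bouguer = -30.0 * np.exp(-((x - 150.0) ** 2) / 2000.0)      # stand-in anomaly (mGal)

        ls = LightSource(azdeg=315, altdeg=45)
        rgb = cm.viridis((bouguer - bouguer.min()) / np.ptp(bouguer))[..., :3]  # hue from anomaly
        shaded = ls.shade_rgb(rgb, elevation, blend_mode="soft")                # relief from topography

        plt.imshow(shaded, origin="lower")
        plt.title("Bouguer anomaly draped on shaded relief (synthetic data)")
        plt.show()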

    The Historical Context of the Gender Gap in Mathematics

    This chapter is based on the talk that I gave in August 2018 at the ICM in Rio de Janeiro at the panel on "The Gender Gap in Mathematical and Natural Sciences from a Historical Perspective". It provides some examples of the challenges and prejudices faced by women mathematicians during the last two hundred and fifty years. I make no claim to completeness but hope that the examples will help to shed light on some of the problems many women mathematicians still face today.

    The whole and its parts: why and how to disentangle plant communities and synusiae in vegetation classification

    Most plant communities consist of different structural and ecological subsets, ranging from cryptogams to different tree layers. The completeness and approach with which these subsets are sampled have implications for vegetation classification. Non‐vascular plants are often omitted or sometimes treated separately, referring to their assemblages as "synusiae" (e.g. epiphytes on bark, saxicolous species on rocks). The distinction of complete plant communities (phytocoenoses or holocoenoses) from their parts (synusiae or merocoenoses) is crucial to avoid logical problems and inconsistencies in the resulting classification systems. We here describe theoretical differences between the phytocoenosis as a whole and its parts, and outline consequences of this distinction for practice and terminology in vegetation classification. To implement a clearer separation, we call for modifications of the International Code of Phytosociological Nomenclature and the EuroVegChecklist. We believe that these steps will make vegetation classification systems more readily applicable and raise recognition of the importance of non‐vascular plants in vegetation, as well as of their interplay with vascular plants.

    How Ordinary Elimination Became Gaussian Elimination

    Newton, in notes that he would rather not have seen published, described a process for solving simultaneous equations that later authors applied specifically to linear equations. This method, which Euler did not recommend, which Legendre called "ordinary," and which Gauss called "common," is now named after Gauss: "Gaussian" elimination. Gauss's name became associated with elimination through the adoption, by professional computers, of a specialized notation that Gauss devised for his own least squares calculations. The notation allowed elimination to be viewed as a sequence of arithmetic operations that were repeatedly optimized for hand computing and eventually were described by matrices.
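
    For readers unfamiliar with the procedure itself, the short sketch below is a generic textbook formulation of elimination with partial pivoting and back substitution, not Gauss's bracket notation or any historical variant discussed in the paper.

        import numpy as np

        def gaussian_elimination(A, b):
            """Solve A x = b by forward elimination and back substitution."""
            A = A.astype(float).copy()
            b = b.astype(float).copy()
            n = len(b)
            for k in range(n - 1):
                p = k + np.argmax(np.abs(A[k:, k]))   # partial pivoting for stability
                A[[k, p]], b[[k, p]] = A[[p, k]], b[[p, k]]
                for i in range(k + 1, n):
                    m = A[i, k] / A[k, k]             # multiplier that eliminates column k
                    A[i, k:] -= m * A[k, k:]
                    b[i] -= m * b[k]
            x = np.zeros(n)
            for i in range(n - 1, -1, -1):            # back substitution
                x[i] = (b[i] - A[i, i + 1:] @ x[i + 1:]) / A[i, i]
            return x

        A = np.array([[2.0, 1.0, -1.0], [-3.0, -1.0, 2.0], [-2.0, 1.0, 2.0]])
        b = np.array([8.0, -11.0, -3.0])
        print(gaussian_elimination(A, b))   # [ 2.  3. -1.]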

    Leibniz's Infinitesimals: Their Fictionality, Their Modern Implementations, And Their Foes From Berkeley To Russell And Beyond

    Many historians of the calculus deny significant continuity between the infinitesimal calculus of the 17th century and 20th-century developments such as Robinson's theory. Robinson's hyperreals, while providing a consistent theory of infinitesimals, require the resources of modern logic; thus many commentators are comfortable denying a historical continuity. A notable exception is Robinson himself, whose identification with the Leibnizian tradition inspired Lakatos, Laugwitz, and others to consider the history of the infinitesimal in a more favorable light. In spite of his Leibnizian sympathies, Robinson regards Berkeley's criticisms of the infinitesimal calculus as aptly demonstrating the inconsistency of reasoning with historical infinitesimal magnitudes. We argue that Robinson, among others, overestimates the force of Berkeley's criticisms by underestimating the mathematical and philosophical resources available to Leibniz. Leibniz's infinitesimals are fictions, not logical fictions, as Ishiguro proposed, but rather pure fictions, like imaginaries, which are not eliminable by some syncategorematic paraphrase. We argue that Leibniz's defense of infinitesimals is more firmly grounded than Berkeley's criticism thereof. We show, moreover, that Leibniz's system for differential calculus was free of logical fallacies. Our argument strengthens the conception of modern infinitesimals as a development of Leibniz's strategy of relating inassignable to assignable quantities by means of his transcendental law of homogeneity.
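
    A standard one-line illustration of how the transcendental law of homogeneity operates (our example, not taken from the paper) is Leibniz's derivation of the product rule, in which the higher-order term du dv is discarded as incomparably smaller than the assignable terms:

        % Illustrative only: product rule via the transcendental law of homogeneity
        \[
          d(uv) = (u + du)(v + dv) - uv = u\,dv + v\,du + du\,dv \;\doteq\; u\,dv + v\,du,
        \]
        % where \doteq marks equality up to a term negligible relative to the assignable part.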