
    GTE. A new software for gravitational terrain effect computation: theory and performances

    The computation of the vertical attraction due to the topographic masses, the so-called terrain correction, is a fundamental step in geodetic and geophysical applications: it is required in high-precision geoid estimation by means of the remove–restore technique, and it is used to isolate the gravitational effect of anomalous masses in geophysical exploration. The increasing resolution of recently developed digital terrain models, the growing number of observation points due to the extensive use of airborne gravimetry in geophysical exploration, and the increasing accuracy of gravity data are nowadays major issues for terrain correction computation. Classical methods such as prism or point-mass approximations are indeed too slow, while Fourier-based techniques are usually too approximate for the required accuracy. In this work a new software package, called Gravity Terrain Effects (GTE), developed to guarantee both high accuracy and fast computation of terrain corrections, is presented. GTE has been designed expressly for geophysical applications, allowing the computation not only of the effect of topographic and bathymetric masses but also of those due to sedimentary layers or to the Earth's crust–mantle discontinuity (the so-called Moho). In the present contribution, after recalling the main classical algorithms for the computation of the terrain correction, we summarize the basic theory of the software and its practical implementation. Tests proving its performance are also described, showing GTE's capability to compute highly accurate terrain corrections in a very short time: run times obtained for a real airborne survey with GTE range from a few minutes to a few hours, according to the GTE profile used, with differences with respect to both planar and spherical computations (performed with prisms and tesseroids, respectively) of the order of 0.02 mGal even when using the fastest profiles.
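    For context, the classical prism approximation that GTE is benchmarked against admits a closed-form solution (Nagy, 1966). Below is a minimal Python sketch of the vertical attraction of a single homogeneous rectangular prism; the function name, the axis convention (z positive downward, station at the origin) and the example geometry are illustrative assumptions, not GTE's actual interface.

    ```python
    import numpy as np

    G = 6.674e-11   # gravitational constant [m^3 kg^-1 s^-2]

    def prism_gz(x, y, z, rho):
        """Vertical attraction (in mGal) at the origin of a homogeneous
        rectangular prism [x0,x1] x [y0,y1] x [z0,z1] of density rho,
        coordinates relative to the station, z positive downward
        (closed form after Nagy, 1966). Assumes the station lies
        outside the prism and off its edges."""
        gz = 0.0
        for i in (0, 1):
            for j in (0, 1):
                for k in (0, 1):
                    r = np.sqrt(x[i]**2 + y[j]**2 + z[k]**2)
                    sign = (-1.0) ** (i + j + k)      # limit-evaluation signs
                    gz += sign * (x[i] * np.log(y[j] + r)
                                  + y[j] * np.log(x[i] + r)
                                  - z[k] * np.arctan2(x[i] * y[j], z[k] * r))
        return G * rho * gz * 1e5                     # m/s^2 -> mGal

    # toy example: a 100 m x 100 m x 50 m rock prism (2670 kg/m^3),
    # centred 200 m north of the station with its top 10 m below it
    print(prism_gz((-50.0, 50.0), (150.0, 250.0), (10.0, 60.0), 2670.0))
    ```

    Summing this kernel over every cell of a digital terrain model is exactly what makes the classical approach slow, which is the bottleneck the paper addresses.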

    European sessions and datum definitions

    In this work we present some preliminary studies to assess a method for investigating the effect of the selection of different datums on the adjustment of a geodetic network on a continental scale. In particular, we have considered European VLBI sessions. All the experiments from 1990 until the end of 2010 have been processed. The adjustments, session by session, have been performed twice, under the same analysis options but fixing two slightly different datums. An analysis of the estimated 3D coordinates, making use of the maximum eigenvalue calculated for each station and each experiment, was carried out. The stability of the results and the influence of different datum choices on the goodness of the estimated coordinates have been investigated.
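    The maximum-eigenvalue statistic mentioned above can be read as the variance of a station's coordinate estimate along its worst-determined direction. A minimal sketch, assuming each session adjustment yields a 3x3 XYZ covariance matrix per station (the helper name and the example matrices are hypothetical):

    ```python
    import numpy as np

    def max_eigenvalue(cov_xyz):
        """Largest eigenvalue of a station's 3x3 XYZ covariance matrix,
        i.e. the variance along the worst-determined direction."""
        return np.linalg.eigvalsh(cov_xyz)[-1]   # eigvalsh sorts ascending

    # hypothetical covariances of the same station under two datum choices
    cov_datum_a = np.diag([4e-6, 6e-6, 9e-6])    # m^2
    cov_datum_b = np.diag([5e-6, 6e-6, 1.3e-5])  # m^2
    print(max_eigenvalue(cov_datum_a), max_eigenvalue(cov_datum_b))
    ```

    Comparing this scalar session by session gives a compact, direction-free measure of how much the datum choice degrades or improves the estimated coordinates.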

    GIS Techniques For Digital Surface Models Outlier Detection

    In the present work, strategies for outlier detection are defined both for datasets stored on 2D lattices and for 1D profiles. Assuming that (Hawkins, 1980) "an outlier is an observation which deviates so much from other observations as to arouse suspicions that it was generated by a different mechanism", we studied the problem of outlier detection in digital surface models as a first preprocessing and validation step. The methods proposed and the tools implemented have been applied to digital terrain models (DTMs) and gridded geophysical data (gridded borehole depths, seismic velocities, amplitudes and phases, magnetic data, gravity data), but their use can be extended to data from different fields, as long as they represent surface models described by grid-stored information. We implemented the blunder detection software by adding dedicated tools to GRASS (Geographical Resources Analysis Support System). The validation techniques share a common localization procedure: we examine the entire dataset by considering only a small subset at a time. Our basic hypothesis is that the values in the moving window (the mask) are observations affected by normally distributed white noise. An interpolating surface (the a priori model) is computed from the points surrounding the centre of the moving mask (the suspected blunder). The choice of the model determines the residual between the observation and the surface at the mask's central point P0, and therefore the capability to detect the possible outlier. The surface model can be obtained in the following ways: polynomial interpolation, robust estimation using the median, or collocation (kriging). For each of them we define an associated test in order to decide whether the point P0 is a blunder or not. The paper focuses on the first approach.
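    As a concrete illustration of the first (polynomial) approach, here is a minimal Python sketch of the moving-mask test on a 2D grid: a plane is fitted by least squares to the window around each cell with the centre excluded, and the centre is flagged when its residual exceeds a multiple of the assumed noise standard deviation. The function name, window size and thresholds are illustrative assumptions; the actual tools were implemented inside GRASS.

    ```python
    import numpy as np

    def detect_blunders(dem, half=2, sigma=0.5, k=3.0):
        """Flag suspected blunders in a gridded surface model.

        For each interior cell, a degree-1 polynomial (plane) is fitted
        to the surrounding (2*half+1)^2 window, excluding the centre
        itself; the centre is flagged when its residual exceeds
        k * sigma, sigma being the assumed white-noise std of the grid.
        """
        rows, cols = dem.shape
        flags = np.zeros_like(dem, dtype=bool)
        dy, dx = np.mgrid[-half:half + 1, -half:half + 1]
        mask = ~((dy == 0) & (dx == 0))        # leave the centre out of the fit
        A = np.column_stack([np.ones(mask.sum()), dx[mask], dy[mask]])
        for r in range(half, rows - half):
            for c in range(half, cols - half):
                z = dem[r - half:r + half + 1, c - half:c + half + 1][mask]
                coef, *_ = np.linalg.lstsq(A, z, rcond=None)
                predicted = coef[0]            # plane value at the centre (dx=dy=0)
                if abs(dem[r, c] - predicted) > k * sigma:
                    flags[r, c] = True
        return flags
    ```

    Swapping the plane fit for a windowed median or for collocation/kriging yields the other two variants listed above, with the corresponding test statistic.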

    Multimodal navigation within multilayer-modeled Gregorian Chant information

    Information entities can often be described from many different points of view. In the digital domain, a multimedia environment is suited to describing, just as a multimodal framework is suited to enjoying, those information entities which are rich in heterogeneous facets. The proposed approach can be applied to all fields where symbolic, textual, graphical, audio and video contents are involved. This paper aims at deepening the concept of multimodal navigation for multilayer-modeled information. Addressing the specific case study of music, our work first presents a suitable format for encoding heterogeneous information within a single document. Then it proposes a computer-based approach for analyzing recordings whose musical scores are known, automatically identifying their semantic relationships. Finally, the method is tested on a set of Gregorian chants in order to produce a multimodal Web application.
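    The abstract does not spell out the alignment technique; a standard way to link an audio recording to a known score is dynamic time warping (DTW) over frame-wise features (e.g., chroma vectors). The sketch below is purely illustrative of that family of methods and is not claimed to be the paper's algorithm.

    ```python
    import numpy as np

    def dtw_align(X, Y):
        """Align two feature sequences X (n, d) and Y (m, d) with dynamic
        time warping; returns the optimal warping path as (i, j) pairs
        mapping frames of X onto frames of Y."""
        n, m = len(X), len(Y)
        # pairwise Euclidean distances between all frame pairs
        cost = np.linalg.norm(X[:, None, :] - Y[None, :, :], axis=2)
        D = np.full((n + 1, m + 1), np.inf)
        D[0, 0] = 0.0
        for i in range(1, n + 1):
            for j in range(1, m + 1):
                D[i, j] = cost[i - 1, j - 1] + min(D[i - 1, j],
                                                   D[i, j - 1],
                                                   D[i - 1, j - 1])
        # backtrack from the end to recover the path
        i, j, path = n, m, [(n - 1, m - 1)]
        while (i, j) != (1, 1):
            step = int(np.argmin([D[i - 1, j - 1], D[i - 1, j], D[i, j - 1]]))
            if step == 0:
                i, j = i - 1, j - 1
            elif step == 1:
                i -= 1
            else:
                j -= 1
            path.append((i - 1, j - 1))
        return path[::-1]

    # toy usage: align a 'recording' with a time-stretched 'score' rendition
    x = np.sin(np.linspace(0, 6, 80))[:, None]
    y = np.sin(np.linspace(0, 6, 50))[:, None]
    print(dtw_align(x, y)[:5])
    ```

    A path of this kind is exactly the machinery a multimodal player needs: following it, a click on a score position can seek the recording, and playback can highlight the current neume.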

    Clustering Spatio-Temporal Data Streams

    A spatio-temporal data stream is a sequence of time-stamped, geo-referenced data elements which arrive at consecutive time points. In addition to the spatial and temporal dimensions, which are information-bearing, the stream setting poses further challenges to data mining: avoiding multiple scans of the entire dataset, optimizing memory usage, and mining only the most recent patterns. In this paper, we address the challenge of mining spatio-temporal data streams for a new class of space-time patterns, called trend-clusters. These patterns combine spatial clustering and trend discovery in stream environments. In particular, we propose a novel algorithm, called TRUST, which retrieves groups of spatially contiguous geo-referenced data whose values vary according to a close trend polyline over the recent window of the stream. Experiments demonstrate the effectiveness of the proposed algorithm.
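    To make the trend-cluster notion concrete, here is a minimal sketch of one way to group sensors over a single window: a flood fill over a neighbour graph that joins two sensors only when they are spatially close and their recent series stay within a tolerance of each other at every time point. This is a toy criterion for "close trend polylines", not the actual TRUST algorithm; names and thresholds are assumptions.

    ```python
    import numpy as np

    def trend_clusters(positions, window, eps_space=1.0, eps_trend=0.5):
        """Toy trend-cluster discovery over one window.

        positions : (n, 2) sensor coordinates
        window    : (n, w) last w readings of each sensor
        Two sensors join the same cluster when they lie within eps_space
        of each other and their recent series differ by at most eps_trend
        at every time point. Returns a cluster label per sensor."""
        n = len(positions)
        labels = np.full(n, -1)
        current = 0
        for seed in range(n):
            if labels[seed] != -1:
                continue
            labels[seed] = current
            stack = [seed]
            while stack:                      # flood-fill the neighbour graph
                i = stack.pop()
                for j in range(n):
                    if (labels[j] == -1
                            and np.linalg.norm(positions[i] - positions[j]) <= eps_space
                            and np.max(np.abs(window[i] - window[j])) <= eps_trend):
                        labels[j] = current
                        stack.append(j)
            current += 1
        return labels
    ```

    A streaming implementation such as TRUST must additionally maintain the window incrementally and in bounded memory as new readings arrive, rather than reclustering from scratch as this sketch does.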

    Monitoring the State of Conservation of the “Cortile del Richini” in Milan: A Case Study for the Definition of Guidelines for a Planned Conservation Project

    The famous “Cortile del Richini” (in the main building of the University of Milan) has been the subject of an extensive study aimed at documenting and monitoring the state of conservation of both its surfaces and its materials. A large number of researchers have been engaged in an interdisciplinary effort to collect new data through in situ and laboratory investigations. The updated results have been correlated with those acquired during the last conservation campaign, carried out in 1990-93. All data have been managed with the latest release of a web-based GIS software package called SICaR. Close observation of the sculpted surfaces decorating the courtyard revealed a very worrying situation, including severe damage such as new crusts, detachments, scaling, fissuring and cracks, as well as heavy deposition of particulate matter. Guidelines for planning ordinary and extraordinary maintenance works will be defined and shared with the management board of the University of Milan.