
    Generalized reliability theory for multiple outliers: presentation, discussion and comparison with the conventional theory

    After the adjustment of observations by the least-squares method (LSM) has been carried out, non-random errors in the observations can be detected and identified by means of statistical tests. Reliability theory uses suitable measures to quantify the smallest detectable error in an observation, and its influence on the adjusted parameters when it goes undetected. Conventional reliability theory was developed for conventional testing procedures, such as data snooping, which assume that only one observation at a time is contaminated by gross errors. Recently, generalized reliability measures have been developed for statistical tests that assume the simultaneous presence of multiple erroneous observations (outliers). The goal of this work is to present, apply and discuss the generalized reliability theory for multiple outliers. Besides the theoretical formulation, this article also presents experiments carried out on a GPS (Global Positioning System) network, in which intentional errors were inserted into some observations, and reliability measures and statistical tests were computed using the multiple-outlier approach. Comparisons with the conventional reliability theory are also made. Finally, the discussions and conclusions drawn from these experiments are presented.
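    The conventional (single-outlier) measures discussed above can be illustrated numerically: the redundancy numbers and minimum detectable biases (MDBs) of a small hypothetical network. The design matrix, standard deviations and noncentrality parameter below are illustrative assumptions, not values from the article:

```python
import numpy as np

# Hypothetical 4-observation, 2-parameter network; all numbers are
# illustrative values only.
A = np.array([[1.0,  0.0],
              [0.0,  1.0],
              [1.0,  1.0],
              [1.0, -1.0]])
sigma = np.array([0.002, 0.002, 0.003, 0.003])  # observation std. devs (m)
P = np.diag(1.0 / sigma**2)                     # weight matrix

# Redundancy matrix R = I - A (A^T P A)^{-1} A^T P; its diagonal holds the
# redundancy numbers r_i of conventional reliability theory.
N = A.T @ P @ A
R = np.eye(len(sigma)) - A @ np.linalg.solve(N, A.T @ P)
r = np.diag(R)

# Minimum detectable bias: MDB_i = sigma_i * sqrt(lambda0 / r_i), with
# lambda0 the noncentrality parameter (here alpha0 = 0.1%, power 80%).
lambda0 = 17.07
mdb = sigma * np.sqrt(lambda0 / r)
print(np.round(r, 3))    # redundancy numbers; they sum to n - m = 2
print(np.round(mdb, 4))  # smallest detectable outlier per observation (m)
```

    The generalized measures replace the scalar per-observation redundancy with quantities derived for several simultaneous outlier parameters.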

    Low-cost, high-precision, single-frequency GPS–BDS RTK positioning

    The integration of the Chinese BDS with other systems, such as the American GPS, makes precise RTK positioning possible with low-cost receivers. We investigate the performance of low-cost u-blox receivers, which cost a few hundred US dollars, while making use of L1 GPS + B1 BDS data in Dunedin, New Zealand. Comparisons will be made to L1 + L2 GPS and survey-grade receivers, which cost several thousand US dollars. The least-squares variance component estimation procedure is used to determine the code and phase variances and covariances of the receivers and thus to formulate a realistic stochastic model; otherwise, the ambiguity resolution and hence positioning performance would deteriorate. For the same reason, the existence of receiver-induced time correlation is also investigated. The low-cost RTK performance is then evaluated by formal and empirical ambiguity success rates and positioning precisions. It will be shown that the code and phase precision of the low-cost receivers can be significantly improved by using survey-grade antennas, since they have better signal reception and multipath suppression abilities in comparison with low-cost patch antennas. It will also be demonstrated that the low-cost receivers can achieve ambiguity resolution and positioning performance competitive with survey-grade dual-frequency GPS receivers.
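    The least-squares variance component estimation (LS-VCE) step mentioned above can be sketched for a toy model with two variance components, standing in for the code and phase observation groups. The functional model, cofactor matrices and numbers are illustrative assumptions, not the paper's actual GNSS setup:

```python
import numpy as np

rng = np.random.default_rng(42)

# Two variance components: E{e e^T} = s1*Q1 + s2*Q2. Group 1 plays the
# role of "code", group 2 of "phase"; true variances are 4 and 1.
n = 200
A = np.ones((n, 1))                       # trivial functional model: common mean
mask = np.arange(n) < 100
Q1 = np.diag(mask.astype(float))          # cofactor matrix of group 1
Q2 = np.diag((~mask).astype(float))       # cofactor matrix of group 2
y = 10.0 + rng.normal(0.0, np.sqrt(np.where(mask, 4.0, 1.0)))

s = np.array([1.0, 1.0])                  # initial variance components
for _ in range(20):
    Qyy = s[0] * Q1 + s[1] * Q2
    W = np.linalg.inv(Qyy)
    # orthogonal projector onto the residual space
    Pperp = np.eye(n) - A @ np.linalg.solve(A.T @ W @ A, A.T @ W)
    e = Pperp @ y
    Qs = (Q1, Q2)
    # LS-VCE normal equations N s = l, iterated to convergence
    Nm = np.array([[0.5 * np.trace(Qk @ W @ Pperp @ Ql @ W @ Pperp)
                    for Ql in Qs] for Qk in Qs])
    l = np.array([0.5 * e @ W @ Qk @ W @ e for Qk in Qs])
    s_new = np.linalg.solve(Nm, l)
    if np.allclose(s_new, s, rtol=1e-8):
        s = s_new
        break
    s = s_new

print(np.round(s, 2))  # estimates should be near the true values [4, 1]
```

    In the GNSS case the cofactor matrices additionally encode satellite elevation dependence and cross-correlations, but the estimation principle is the same.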

    Theory of carrier phase ambiguity resolution

    Carrier phase ambiguity resolution is the key to high precision Global Navigation Satellite System (GNSS) positioning and navigation. It applies to a great variety of current and future models of GPS, modernized GPS and Galileo. A proper handling of carrier phase ambiguity resolution requires a proper understanding of the underlying theory of integer inference. In this contribution a brief review is given of the probabilistic theory of integer ambiguity estimation. We describe the concept of ambiguity pull-in regions, introduce the class of admissible integer estimators, determine their probability mass functions and show how their variability affects the uncertainty in the so-called ‘fixed’ baseline solution. The theory is worked out in more detail for integer least-squares and integer bootstrapping. It is shown that the integer least-squares principle maximizes the probability of correct integer estimation. Sharp and easy-to-compute bounds are given for both the ambiguity success rate and the baseline’s probability of concentration. Finally the probability density function of the ambiguity residuals is determined. This allows one for the first time to formulate rigorous tests for the integerness of the parameters.
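    The bootstrapped success rate mentioned above has a closed form, P_s = prod_i (2*Phi(1/(2*sigma_i|I)) - 1), where the sigma_i|I are the conditional standard deviations from the LtDL factorisation of the ambiguity variance matrix. A minimal sketch (the example matrix is hypothetical):

```python
import numpy as np
from math import erf, sqrt

def Phi(x):
    """Standard normal CDF."""
    return 0.5 * (1.0 + erf(x / sqrt(2.0)))

def bootstrap_success_rate(Qa):
    """P_s = prod_i (2*Phi(1/(2*sigma_{i|I})) - 1); the conditional variances
    are obtained by sequentially conditioning the ambiguity vc-matrix, which
    yields the same diagonal as its LtDL factorisation."""
    Q = np.array(Qa, dtype=float)
    n = Q.shape[0]
    d = np.zeros(n)
    for i in range(n - 1, -1, -1):
        d[i] = Q[i, i]
        if i > 0:
            Q[:i, :i] -= np.outer(Q[:i, i], Q[:i, i]) / Q[i, i]
    return float(np.prod([2.0 * Phi(1.0 / (2.0 * sqrt(di))) - 1.0 for di in d]))

Qa = np.array([[4.0, 1.0],
               [1.0, 2.0]])   # hypothetical ambiguity vc-matrix (cycles^2)
print(bootstrap_success_rate(Qa))
```

    In practice the success rate is evaluated after decorrelating the ambiguities (e.g. with the LAMBDA Z-transformation), which raises this bound toward the integer least-squares success rate.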

    Design of geodetic networks based on outlier identification criteria: an example applied to the leveling network

    We present a numerical simulation method for designing geodetic networks. The quality criterion considered is based on the power of the test of the data-snooping procedure, i.e., the probability that data snooping correctly identifies an outlier. In general, the power of the test is defined theoretically; with fast computers and large data storage systems, however, it can be estimated by numerical simulation. Here, the number of experiments in which the data-snooping procedure correctly identifies the outlier is counted using Monte Carlo simulation. If the network configuration does not meet the reliability criterion in some part, it can be improved by adding the required observations to the surveying plan. The method does not use real observations; it depends only on the geometric configuration of the network, the uncertainty of the observations, and the size of the outlier. The proposed method is demonstrated by a practical application to a simulated leveling network. The results showed the need for five additional observations between adjacent stations; adding these new observations improved the internal reliability by approximately 18%. The final designed network should therefore be able to identify and resist undetected outliers at the stated probability levels.
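    The Monte Carlo estimate of the identification power described above can be sketched as follows; the network, outlier size and critical value are illustrative assumptions, not the article's simulated leveling network:

```python
import numpy as np

rng = np.random.default_rng(1)

# Count how often data snooping flags the contaminated observation as the
# largest (and significant) normalized residual; all numbers illustrative.
A = np.array([[1.0,  0.0],
              [0.0,  1.0],
              [1.0, -1.0],
              [1.0,  1.0]])      # design matrix of a tiny network
sigma = 0.002                    # equal observation std. dev. (m)
n = A.shape[0]
Qe = np.eye(n) - A @ np.linalg.solve(A.T @ A, A.T)   # residual projector (P = I)
outlier_idx, outlier_size = 2, 0.01                  # 1 cm blunder in obs 3
k = 3.29                                             # critical value (alpha0 ~ 0.1%)

trials, hits = 2000, 0
for _ in range(trials):
    e = rng.normal(0.0, sigma, n)
    e[outlier_idx] += outlier_size
    v = Qe @ e                                       # least-squares residuals
    w = v / (sigma * np.sqrt(np.diag(Qe)))           # normalized residuals (w-tests)
    i = int(np.argmax(np.abs(w)))
    if abs(w[i]) > k and i == outlier_idx:           # detected AND correctly located
        hits += 1

print(f"empirical identification power ~ {hits / trials:.2f}")
```

    Observations whose empirical power falls below the chosen threshold indicate where the plan needs additional measurements.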

    Non-stationary covariance function modelling in 2D least-squares collocation

    Standard least-squares collocation (LSC) assumes 2D stationarity and 3D isotropy, and relies on a covariance function to account for spatial dependence in the observed data. However, the assumption that the spatial dependence is constant throughout the region of interest may sometimes be violated. Assuming a stationary covariance structure can result in over-smoothing of, e.g., the gravity field in mountains and under-smoothing in great plains. We introduce the kernel convolution method from spatial statistics for non-stationary covariance structures, and demonstrate its advantage for dealing with non-stationarity in geodetic data. We then compare stationary and non-stationary covariance functions in 2D LSC on the empirical example of gravity anomaly interpolation near the Darling Fault, Western Australia, where the field is anisotropic and non-stationary. The results with non-stationary covariance functions are better than standard LSC in terms of formal errors and cross-validation against data not used in the interpolation, demonstrating that the use of non-stationary covariance functions can improve upon standard (stationary) LSC.

    Long-range chemical sensitivity in the sulfur K-edge X-ray absorption spectra of substituted thiophenes

    © 2014 American Chemical Society. Thiophenes are the simplest aromatic sulfur-containing compounds and are stable and widespread in fossil fuels. Regulation of sulfur levels in fuels and emissions has become and continues to be ever more stringent as part of governments' efforts to address negative environmental impacts of sulfur dioxide. In turn, more effective removal methods are continually being sought. In a chemical sense, thiophenes are somewhat obdurate and hence their removal from fossil fuels poses problems for the industrial chemist. Sulfur K-edge X-ray absorption spectroscopy provides key information on thiophenic components in fuels. Here we present a systematic study of the spectroscopic sensitivity to chemical modifications of the thiophene system. We conclude that while the utility of sulfur K-edge X-ray absorption spectra in understanding the chemical composition of sulfur-containing fossil fuels has already been demonstrated, care must be exercised in interpreting these spectra because the assumption of an invariant spectrum for thiophenic forms may not always be valid

    MODES - Modelling of Energy Systems

    The MODES software system is a program tool for planning and optimizing energy supply systems with respect to design and energy management. It enables heat and electricity to be considered in an integrated way, including renewable energy sources. Since not only fixed values but also uncertainty ranges can be entered for the input parameters, a stochastic treatment of the results is possible. In addition to the technical simulation, the economic simulation that is indispensable in practice is also carried out. MODES thus enables a numerical, multi-criteria technical and economic assessment of energy supply systems. (orig.)