13 research outputs found

    Nonparametric Fixed-Interval Smoothing with Vector Splines

    A computationally efficient algorithm for nonparametric smoothing of vector signals with general measurement covariances is presented. This algorithm provides an alternative to the optimal smoothing algorithms that hinge on (possibly inaccurate) parametric state-space models. Automatic procedures that use the measurements to determine how much to smooth are developed and compared. This adaptation allows the data to speak for itself without imposing a Gauss-Markov model structure. A nonparametric approach to covariance estimation for the case of independently identically distributed (i.i.d.) measurement errors is presented. Monte Carlo simulations demonstrate the performance of the algorithm.
    Peer Reviewed
    http://deepblue.lib.umich.edu/bitstream/2027.42/85838/1/Fessler113.pd
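    The core idea, penalized weighted least squares over a fixed interval, can be sketched in a few lines. This is an illustrative discrete (Whittaker-style) smoother with a second-difference roughness penalty, not the paper's exact algorithm; the function name and penalty choice are assumptions.

```python
import numpy as np

def smooth_vector_signal(y, weights, lam):
    """Discrete fixed-interval smoother (illustrative sketch).

    Minimizes sum_i w_i * ||y_i - x_i||^2 + lam * ||D2 @ x||^2,
    where D2 is the second-difference operator.  y is (n, d): n
    samples of a d-dimensional signal; weights (n,) are, e.g.,
    inverse measurement variances.
    """
    n = y.shape[0]
    D2 = np.diff(np.eye(n), n=2, axis=0)   # (n-2, n) second differences
    W = np.diag(weights)
    return np.linalg.solve(W + lam * D2.T @ D2, W @ y)

# Noisy 2-D signal sampled on a fixed interval
rng = np.random.default_rng(0)
t = np.linspace(0.0, 1.0, 200)
clean = np.column_stack([np.sin(2 * np.pi * t), np.cos(2 * np.pi * t)])
noisy = clean + 0.2 * rng.standard_normal(clean.shape)
smoothed = smooth_vector_signal(noisy, np.ones(len(t)), lam=50.0)
```

    Larger `lam` trades fidelity to the measurements for smoothness; the paper's automatic procedures choose this trade-off from the data.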

    Spline Estimator in Multi-Response Nonparametric Regression Model

    In many applications two or more dependent variables are observed at several values of the independent variables, such as at time points. The statistical problems are to estimate functions that model their dependence on the independent variables and to investigate relationships between these functions. Nonparametric regression models, especially smoothing splines, provide powerful tools to model the functions that capture the associations among these variables. Penalized weighted least squares is used to jointly estimate the nonparametric functions from contemporaneously correlated data. In this paper we formulate the multi-response nonparametric regression model and give a theoretical method both for obtaining the distribution of the response and for estimating the nonparametric functions in the model. We also estimate the smoothing parameters, the weighting parameters and the correlation parameter simultaneously by applying three methods: generalized maximum likelihood (GML), generalized cross-validation (GCV) and leave-out-one-pair cross-validation (CV).
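    Of the three criteria mentioned, GCV is the simplest to sketch for a single response: it rescales the residual sum of squares by the effective residual degrees of freedom of the smoother's hat matrix. The code below is a minimal illustration assuming an unweighted discrete second-difference penalty; the abstract's multi-response, correlated-errors setting is more involved.

```python
import numpy as np

def gcv_score(y, lam):
    """GCV score for a discrete smoothing spline with hat matrix
    A = (I + lam * D2.T @ D2)^{-1} (illustrative sketch)."""
    n = len(y)
    D2 = np.diff(np.eye(n), n=2, axis=0)
    A = np.linalg.inv(np.eye(n) + lam * D2.T @ D2)
    resid = y - A @ y
    return np.mean(resid ** 2) / (1.0 - np.trace(A) / n) ** 2

rng = np.random.default_rng(1)
t = np.linspace(0.0, 1.0, 100)
y = np.sin(2 * np.pi * t) + 0.3 * rng.standard_normal(100)
lams = 10.0 ** np.arange(-3, 4)   # candidate smoothing parameters
best = min(lams, key=lambda lam: gcv_score(y, lam))
```

    In practice the minimization would run over a fine grid or use a 1-D optimizer rather than seven decade-spaced candidates.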

    Nonparametric Fixed-Interval Smoothing of Nonlinear Vector-Valued Measurements

    The problem of estimating a smooth vector-valued function given noisy nonlinear vector-valued measurements of that function is addressed. A nonparametric optimality criterion for this estimation problem is presented, and a computationally efficient iterative algorithm for its solution is developed. The criterion is the natural generalization of previously published work on vector splines with linear measurement models. The algorithm provides an alternative to the extended Kalman filter, as it does not require a parametric state-space model. An automatic procedure that uses the measurements to determine how much to smooth is presented. The algorithm's subpixel estimation accuracy is demonstrated on the estimation of a curved edge in a noisy image and on a biomedical image-processing application.
    Peer Reviewed
    http://deepblue.lib.umich.edu/bitstream/2027.42/85839/1/Fessler112.pd
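    The iterative algorithm is not spelled out in the abstract; one standard way to handle a nonlinear measurement model under a roughness penalty is Gauss-Newton iteration, relinearizing the measurement function at each pass. The sketch below assumes scalar measurements, a second-difference penalty, and a user-supplied initial estimate; the interface is hypothetical, not the paper's.

```python
import numpy as np

def iterated_smoother(z, h, h_jac, x0, lam, iters=10):
    """Gauss-Newton sketch for z_i = h(x_i) + noise with a
    second-difference roughness penalty (hypothetical interface).

    h and h_jac act elementwise; h_jac returns the (n,) diagonal
    of the measurement Jacobian.
    """
    n = len(z)
    D2 = np.diff(np.eye(n), n=2, axis=0)
    P = lam * D2.T @ D2
    x = np.asarray(x0, dtype=float).copy()
    for _ in range(iters):
        J = h_jac(x)
        r = z - h(x)
        # Gauss-Newton step for the penalized least-squares objective
        step = np.linalg.solve(np.diag(J ** 2) + P, J * r - P @ x)
        x = x + step
    return x

# Example: cubic measurement nonlinearity
rng = np.random.default_rng(2)
t = np.linspace(0.0, 1.0, 80)
truth = 1.0 + 0.5 * np.sin(2 * np.pi * t)
z = truth ** 3 + 0.05 * rng.standard_normal(80)
est = iterated_smoother(z, lambda x: x ** 3, lambda x: 3 * x ** 2,
                        np.cbrt(z), lam=10.0)
```

    Unlike an extended Kalman filter, no state-space model is assumed; only the measurement function, its derivative, and a smoothness penalty enter.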

    Tomographic Reconstruction Using Information-Weighted Spline Smoothing

    The conventional method for tomographic image reconstruction, convolution backprojection (CBP), attempts to reduce the effects of measurement noise by radial smoothing with a spatially-invariant filter. Spatially-invariant smoothing is suboptimal when the measurement statistics are nonstationary, and often leads to a choice between oversmoothing or streak artifacts. In this paper, we describe a nonstationary sinogram smoothing method that accounts for the relative variances between different detector measurements and for the finite width of tomographic detectors. The method is based on an information-weighted smoothing spline, where the weights are determined from the calibration factors and from the measurements themselves. This weighting diminishes the influence of high variance measurements, such as detectors with relatively poor efficiency, which is shown to reduce streak artifacts. Simulations of emission and transmission tomography applications demonstrate qualitatively improved image noise structure and quantitative improvements in the tradeoffs between bias and variance.
    Peer Reviewed
    http://deepblue.lib.umich.edu/bitstream/2027.42/85946/1/Fessler109.pd
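    For Poisson-distributed counts the variance roughly equals the mean, so an information weight of one over the count is a natural choice. A minimal sketch of such weighting applied to a single sinogram row with a discrete roughness penalty (the paper's exact weighting from calibration factors is not reproduced here):

```python
import numpy as np

def information_weighted_smooth(counts, lam):
    """Sketch: smooth one sinogram row with weights inversely
    proportional to the (approximate) measurement variance.  For
    Poisson counts, variance ~ mean, so weight ~ 1 / count, which
    diminishes the influence of high-variance measurements."""
    n = len(counts)
    w = 1.0 / np.maximum(counts, 1.0)
    D2 = np.diff(np.eye(n), n=2, axis=0)
    return np.linalg.solve(np.diag(w) + lam * D2.T @ D2, w * counts)

rng = np.random.default_rng(3)
true_row = 100.0 + 50.0 * np.sin(np.linspace(0.0, np.pi, 120))
counts = rng.poisson(true_row).astype(float)
smoothed = information_weighted_smooth(counts, lam=5.0)
```

    Because low-count (high-variance) bins receive small weights, they pull the fitted row less, which is the mechanism the abstract credits for reduced streak artifacts.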

    Fast Filtering and Smoothing for Multivariate State Space Models

    This paper gives a new approach to diffuse filtering and smoothing for multivariate state space models. The standard approach treats the observations as vectors, while our approach treats each element of the observation vector individually. This strategy leads to computationally efficient methods for multivariate filtering and smoothing. Also, the treatment of the diffuse initial state vector in multivariate models is much simpler than in existing methods. The paper presents details of the relevant algorithms for filtering, prediction and smoothing. Proofs are provided. Three examples of multivariate models in statistics and economics are presented for which the new approach is particularly relevant.
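    The element-by-element treatment can be illustrated with the measurement-update step: when the measurement covariance is diagonal, applying scalar updates sequentially is algebraically equivalent to the usual vector update but needs no matrix inversion. A minimal sketch with hypothetical function names:

```python
import numpy as np

def scalar_update(a, P, z, c, r):
    """One Kalman measurement update for a single scalar observation
    z = c @ x + noise with variance r; no matrix inversion needed."""
    f = c @ P @ c + r          # innovation variance (scalar)
    k = P @ c / f              # Kalman gain
    a = a + k * (z - c @ a)
    P = P - np.outer(k, c @ P)
    return a, P

def elementwise_update(a, P, z, C, r_diag):
    """Treat each element of the observation vector individually;
    for a diagonal measurement covariance this matches the usual
    vector update."""
    for i in range(len(z)):
        a, P = scalar_update(a, P, z[i], C[i], r_diag[i])
    return a, P

# Compare against the standard vector update
a0, P0 = np.zeros(2), np.eye(2)
C = np.array([[1.0, 0.0], [1.0, 1.0]])
R = np.diag([0.5, 0.25])
z = np.array([1.0, 2.0])

a_seq, P_seq = elementwise_update(a0, P0, z, C, np.diag(R))
S = C @ P0 @ C.T + R
K = P0 @ C.T @ np.linalg.inv(S)
a_vec = a0 + K @ (z - C @ a0)
P_vec = P0 - K @ C @ P0
```

    Replacing one p-by-p inverse with p scalar divisions per time step is the source of the computational savings the abstract describes; correlated measurement noise can first be decorrelated by a Cholesky transform.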

    Evaluation and comparison of the processing methods of airborne gravimetry concerning the errors effects on downward continuation results: Case studies in Louisiana (USA) and the Tibetan Plateau (China)

    Gravity data gaps in mountainous areas are nowadays often filled in with data from airborne gravity surveys. Errors caused by the airborne gravimeter sensors and by rough flight conditions cannot be completely eliminated; the precision of the gravity disturbances generated by airborne gravimetry is around 3–5 mGal. A major obstacle in using airborne gravimetry is the error introduced by downward continuation. To improve the results, external high-accuracy gravity information, e.g., from surface data, can be used for high-frequency correction, while satellite information can be applied for low-frequency correction. Surface data may be used to reduce the systematic errors, while regularization methods can reduce the random errors in downward continuation. Airborne gravity surveys are sometimes conducted in mountainous areas, and the most extreme area of the world for this type of survey is the Tibetan Plateau. Since there are no high-accuracy surface gravity data available for this area, the above error-minimization method involving external gravity data cannot be used. We propose a semi-parametric downward continuation method combined with regularization to suppress the systematic and random error effects in the Tibetan Plateau, i.e., without the use of external high-accuracy gravity data. We use a Louisiana airborne gravity dataset from the USA National Oceanic and Atmospheric Administration (NOAA) to demonstrate that the new method works effectively. For the Tibetan Plateau, a numerical experiment is also successfully conducted using synthetic Earth Gravitational Model 2008 (EGM08)-derived gravity data contaminated with synthetic errors. The estimated systematic errors generated by the method are close to the simulated values. In addition, we study the relationship between the downward continuation altitude and the error effect. The analysis results show that the proposed semi-parametric method combined with regularization effectively addresses such modelling problems.
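    The combination of a parametric (systematic) part with a regularized solve can be sketched as joint penalized least squares. The operator, the linear drift model, and the regularization below are illustrative assumptions, not the paper's actual downward-continuation formulation.

```python
import numpy as np

def semiparametric_tikhonov(K, y, T, alpha):
    """Sketch: jointly estimate a regularized field x and systematic
    (drift) coefficients beta from y = K @ x + T @ beta + noise.
    Only the ill-posed part x is Tikhonov-regularized."""
    m, p = K.shape[1], T.shape[1]
    A = np.block([[K.T @ K + alpha * np.eye(m), K.T @ T],
                  [T.T @ K, T.T @ T]])
    b = np.concatenate([K.T @ y, T.T @ y])
    sol = np.linalg.solve(A, b)
    return sol[:m], sol[m:]

rng = np.random.default_rng(4)
n = 60
s = np.linspace(0.0, 1.0, n)
# Mildly ill-conditioned smoothing operator (illustrative only)
K = np.exp(-5.0 * np.abs(s[:, None] - s[None, :]))
T = np.column_stack([np.ones(n), s])   # constant + linear drift
x_true = np.sin(2 * np.pi * s)
beta_true = np.array([2.0, -1.0])
y = K @ x_true + T @ beta_true + 0.01 * rng.standard_normal(n)
x_hat, beta_hat = semiparametric_tikhonov(K, y, T, alpha=1e-3)
```

    The drift term absorbs systematic error while the Tikhonov penalty damps the noise amplification of the ill-posed inversion, mirroring the division of labor described in the abstract.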

    Improvement of downward continuation values of airborne gravity data in Taiwan

    An airborne gravity survey was carried out to fill gaps in the gravity data for the mountainous areas of Taiwan. However, the downward continuation error of airborne gravity data is a major issue, especially in regions with complex terrain, such as Taiwan. The root mean square (RMS) of the difference between the downward continuation values and land gravity was approximately 20 mGal. To improve the results of downward continuation, we investigated the inverse Poisson's integral, the semi-parametric method combined with regularization (SPR), and least-squares collocation (LSC). Numerically simulated experiments were conducted in the Tibetan Plateau, which is also a mountainous area. The results show that, as a valuable supplement to the inverse Poisson's integral, the SPR is a useful approach to estimate systematic errors and to suppress random errors, while the LSC approach generates the best results in the Tibetan Plateau in terms of the RMS of the downward continuation errors. Thus, the LSC approach with a terrain correction (TC) is applied to the downward continuation of real airborne gravity data in Taiwan. The RMS of the differences between the downward continuation values and land gravity data is reduced to 11.7 mGal, an improvement of 40%.
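    Least-squares collocation predicts the signal at new points from the cross- and auto-covariances of the observations. The sketch below assumes a simple Gaussian covariance model and a 1-D geometry; the covariance function and its parameters are illustrative, not those used in the study.

```python
import numpy as np

def lsc_predict(x_obs, l, x_pred, c0, corr_len, noise_var):
    """Least-squares collocation sketch: predict the signal at x_pred
    from noisy observations l at x_obs, using the Gaussian covariance
    model C(d) = c0 * exp(-(d / corr_len)^2)."""
    def cov(a, b):
        d = a[:, None] - b[None, :]
        return c0 * np.exp(-((d / corr_len) ** 2))
    # Signal-plus-noise covariance of the observations
    C_ll = cov(x_obs, x_obs) + noise_var * np.eye(len(x_obs))
    # Cross-covariance between prediction points and observations
    C_pl = cov(x_pred, x_obs)
    return C_pl @ np.linalg.solve(C_ll, l)

rng = np.random.default_rng(5)
x_obs = np.linspace(0.0, 10.0, 50)
l = np.sin(x_obs) + 0.1 * rng.standard_normal(50)
x_pred = np.linspace(0.0, 10.0, 101)
s_hat = lsc_predict(x_obs, l, x_pred, c0=1.0, corr_len=1.5, noise_var=0.01)
```

    In the downward-continuation setting the covariance model would relate gravity values at flight altitude to values on the terrain, but the estimator has this same signal-covariance-times-inverse-data-covariance form.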

    Modelling of future extreme storm surges at the NW Mediterranean coast (Spain)

    Storm surges are one of the main drivers of extreme flooding in coastal areas. Such events can be characterized by the maximum level during an extreme storm surge event (the surge peak), as well as by the duration of the event. Surge projections come from a barotropic model for the 1950–2100 period, under a severe climate change scenario (RCP 8.5), at the northeastern Spanish coast. The relationship of extreme storm surges to three large-scale climate patterns was assessed: the North Atlantic Oscillation (NAO), the East Atlantic Pattern (EAWR), and the Scandinavian Pattern (SC). The statistical model was built using two different strategies. In Strategy #1, the joint probability density was characterized by a moving-average series of stationary Archimedean copulas, whereas in Strategy #2 it was characterized by a non-stationary probit copula. The parameters of the marginal distributions and of the copula were defined with generalized additive models. The analysis showed that the mean values of surge peak and event duration were constant and independent of the proposed climate patterns. However, the values of NAO and SC influenced the threshold and the storminess of extreme events. According to Strategy #1, the variance of the surge peak and of the event duration increased with a fast shift toward negative SC and with positive NAO, respectively. Alternatively, Strategy #2 showed that the variance of the surge peak increased with positive EAWR. Both strategies coincided in that the joint dependence of the maximum surge level and the duration of extreme surges ranged from low to medium degree. Its mean value was stationary, and its variability was linked to the geographical location. Finally, Strategy #2 helped determine that this dependence increased with negative NAO.
    Peer Reviewed
    Postprint (published version)
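    As an illustration of the Archimedean-copula ingredient, the sketch below uses the Clayton family, one common Archimedean member and not necessarily the one used in the paper: Kendall's tau maps to the Clayton parameter via theta = 2*tau/(1 - tau), and joint samples are drawn by conditional inversion.

```python
import numpy as np

def clayton_theta_from_tau(tau):
    """Clayton copula parameter from Kendall's tau."""
    return 2.0 * tau / (1.0 - tau)

def sample_clayton(theta, n, rng):
    """Draw (u, v) from a Clayton copula by conditional inversion:
    v = (u^-theta * (w^(-theta/(1+theta)) - 1) + 1)^(-1/theta)."""
    u = rng.uniform(size=n)
    w = rng.uniform(size=n)
    v = (u ** -theta * (w ** (-theta / (1.0 + theta)) - 1.0) + 1.0) ** (-1.0 / theta)
    return u, v

# Illustrative low-to-medium dependence, in the range reported for
# surge peak and event duration
rng = np.random.default_rng(6)
theta = clayton_theta_from_tau(0.3)
u, v = sample_clayton(theta, 5000, rng)
```

    Mapping `u` and `v` through the inverse marginal distributions of surge peak and duration would then yield joint samples on the physical scales; non-stationarity enters by letting tau or the marginal parameters depend on covariates.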

    Multivariate hybrid modelling of future wave-storms at the northwestern Black Sea

    The characterization of future wave-storms and their relationship to large-scale climate can provide useful information for environmental or urban planning in coastal areas. A hybrid methodology (process-based and statistical) was used to characterize the extreme wave-climate at the northwestern Black Sea. The Simulating WAves Nearshore (SWAN) spectral wave model was employed to produce wave-climate projections, forced with wind-field projections for two climate change scenarios: Representative Concentration Pathways (RCPs) 4.5 and 8.5. A non-stationary multivariate statistical model was built, considering significant wave-height and peak-wave-period at the peak of the wave-storm, as well as storm total energy and storm-duration. The climate indices of the North Atlantic Oscillation, East Atlantic Pattern, and Scandinavian Pattern were used as covariates to link storminess, wave-storm threshold, and wave-storm components in the statistical model. The results show that, first, under both RCP scenarios, the mean values of significant wave-height and peak-wave-period at the peak of the wave-storm remain fairly constant over the 21st century. Second, the mean value of storm total energy increases more markedly in the RCP4.5 scenario than in the RCP8.5 scenario. Third, the mean value of storm-duration increases in the RCP4.5 scenario, as opposed to the constant trend in the RCP8.5 scenario. The variance of each wave-storm component increases when the corresponding mean value increases under both RCP scenarios. During the 21st century, the East Atlantic Pattern and changes in its pattern have a special influence on wave-storm conditions. Apart from the individual characteristics of each wave-storm component, wave-storms with both extreme energy and duration can be expected in the 21st century.
The dependence between all the wave-storm components is moderate but grows with time; in general, the severe emission scenario RCP8.5 presents less dependence between storm total energy and storm-duration, and among wave-storm components.
    Peer Reviewed
    Postprint (published version)
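    A minimal sketch of one ingredient of such a model: a non-stationary marginal whose location parameter depends linearly on a climate-index covariate, fit by maximum likelihood. The Gumbel margin, the linear link, and all parameter values are illustrative assumptions, not the paper's actual GAM-based formulation.

```python
import numpy as np
from scipy.optimize import minimize
from scipy.stats import gumbel_r

def fit_nonstationary_gumbel(y, c):
    """Fit y ~ Gumbel(loc = a + b * c, scale) by maximum likelihood,
    a minimal stand-in for a covariate-dependent marginal."""
    def nll(p):
        a, b, log_s = p
        return -np.sum(gumbel_r.logpdf(y, loc=a + b * c, scale=np.exp(log_s)))
    res = minimize(nll, x0=[np.mean(y), 0.0, np.log(np.std(y))],
                   method="Nelder-Mead", options={"maxiter": 2000})
    a, b, log_s = res.x
    return a, b, np.exp(log_s)

# Synthetic check: recover known parameters
rng = np.random.default_rng(7)
c = rng.standard_normal(2000)    # stand-in climate index values
y = gumbel_r.rvs(loc=1.0 + 0.5 * c, scale=0.8, size=2000, random_state=rng)
a_hat, b_hat, scale_hat = fit_nonstationary_gumbel(y, c)
```

    A fitted slope near zero would indicate a stationary margin; in the full model, smooth (GAM-style) links and copula parameters would be estimated alongside the margins.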