Accuracy of remotely sensed data: Sampling and analysis procedures
A review and update of the discrete multivariate analysis techniques used for accuracy assessment is given, together with a listing of the computer program written to implement these techniques. New work on evaluating accuracy assessment using Monte Carlo simulation with different sampling schemes is presented, as are the matrices resulting from the mapping effort of the San Juan National Forest. A method for estimating the sample size requirements for implementing the accuracy assessment procedures is described, along with a proposed method for determining the reliability of change detection between two maps of the same area produced at different times.
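The abstract does not name the statistic, but discrete multivariate accuracy assessment in this literature is typically built around an error (confusion) matrix, with Cohen's kappa as the standard agreement measure. A minimal sketch, with a made-up 3-class matrix for illustration:

```python
import numpy as np

def kappa(error_matrix):
    """Cohen's kappa from a classification error (confusion) matrix."""
    m = np.asarray(error_matrix, dtype=float)
    n = m.sum()
    po = np.trace(m) / n                          # observed agreement
    pe = (m.sum(axis=0) @ m.sum(axis=1)) / n**2   # chance agreement
    return (po - pe) / (1 - pe)

# Hypothetical 3-class error matrix (rows = map classes, cols = reference)
em = [[45, 4, 1],
      [6, 30, 4],
      [2, 3, 25]]
print(round(kappa(em), 3))   # → 0.744
```

Kappa discounts the agreement expected by chance, which is why it is preferred over raw percent-correct for map accuracy reporting.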
Bayesian econometrics: conjugate analysis and rejection sampling using Mathematica
Mathematica is a powerful "system for doing mathematics by computer" which runs on personal computers (Macs and MS-DOS machines), workstations and mainframes. Here we show how Bayesian methods can be implemented in Mathematica. One of the drawbacks of Bayesian techniques is that they are computation-intensive, and every computation is a little different. Since Mathematica is so flexible, it can easily be adapted to solving a number of different Bayesian estimation problems. We illustrate the use of Mathematica functions (i) in a traditional conjugate analysis of the linear regression model and (ii) in a completely nonstandard model, where rejection sampling is used to sample from the posterior.
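The abstract does not give the nonstandard model, but rejection sampling from an unnormalized posterior can be sketched in a few lines (here in Python rather than Mathematica; the Beta-shaped kernel is a stand-in target, not the paper's model):

```python
import numpy as np

rng = np.random.default_rng(0)

def target(theta):
    # Unnormalized posterior kernel on (0, 1); a Beta(2, 5) kernel
    # serves as a stand-in for a nonstandard model.
    return theta * (1 - theta) ** 4

M = target(0.2) * 1.01          # envelope bound (kernel's mode is at 0.2)
draws = []
while len(draws) < 10_000:
    theta = rng.uniform()       # proposal: Uniform(0, 1)
    if rng.uniform() * M <= target(theta):
        draws.append(theta)     # accept with probability target(theta) / M

print(f"posterior mean ≈ {np.mean(draws):.3f}")   # true mean is 2/7 ≈ 0.286
```

The only requirement is a bound M on the kernel over the proposal's support; acceptance probability then equals the ratio of the kernel to the envelope, so accepted draws follow the normalized posterior.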
A soil sampling program for the Netherlands
Soil data users in The Netherlands were inventoried for current and future data needs. Prioritized data needs were used to design the Netherlands Soil Sampling Program (NSSP) as a framework containing 3 groups of related projects: map upgrading, map updating and upgrading of pedotransfer functions. For each group, the sampling design, performance criteria and optimal sample size were defined. This paper focuses on the upgrading of the existing soil map of The Netherlands at scale 1:50,000, and extensively treats the user inventory and the sampling strategy. The sampling design, performance criteria of the sampling and associated optimal sample size were obtained by statistical analysis of soil data available before the sampling. The Phosphate Sorption Capacity (PSC) was chosen as the target variable to optimize sampling, because it dominated total cost per sample. A prior analysis of a performance criterion related to the sampling error of PSC resulted in a cost saving of 13% relative to the total cost determined earlier by expert judgment. A posterior analysis showed that the set quality criterion was met or exceeded in 6 out of 7 cases. The NSSP resulted in a database with soil data from 2524 sample points selected by stratified random sampling, and a collection of 5764 aliquots taken at these points. The NSSP has demonstrated its potential for various kinds of environmental studies and could be a sound future basis for a national-scale monitoring program.
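The abstract does not state the allocation rule, but for a stratified random design with known stratum sizes and prior standard deviations of the target variable, Neyman allocation is the classical way to split a fixed sample budget (the stratum figures below are hypothetical, not from the NSSP):

```python
import numpy as np

# Hypothetical strata (e.g., soil map units) with sizes and prior
# standard deviations of the target variable from pre-sampling data.
N_h = np.array([4000, 2500, 1500])   # stratum sizes
S_h = np.array([8.0, 15.0, 25.0])    # prior std. dev. per stratum
n = 300                              # total sample budget

# Neyman allocation: n_h proportional to N_h * S_h minimizes the
# variance of the stratified mean for a fixed total sample size.
n_h = np.round(n * N_h * S_h / (N_h * S_h).sum()).astype(int)
print(n_h)   # → [ 90 105 105]
```

More variable strata get proportionally more samples, which is how a prior variance analysis can cut cost relative to a uniform, expert-judgment allocation.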
Segmented compressed sampling for analog-to-information conversion: Method and performance analysis
A new segmented compressed sampling method for analog-to-information conversion (AIC) is proposed. An analog signal, measured by a number of parallel branches of mixers and integrators (BMIs), each characterized by a specific random sampling waveform, is first segmented in time. The sub-samples collected on different segments and different BMIs are then reused so that a larger number of samples than the number of BMIs is collected. This technique is shown to be equivalent to extending the measurement matrix, which consists of the BMI sampling waveforms, by adding new rows without actually increasing the number of BMIs. We prove that the extended measurement matrix satisfies the restricted isometry property with overwhelming probability if the original measurement matrix of BMI sampling waveforms satisfies it. We also show that the signal recovery performance can be improved significantly if our segmented AIC is used for sampling instead of the conventional AIC. Simulation results verify the effectiveness of the proposed segmented compressed sampling method and the validity of our theoretical studies.
Comment: 32 pages, 5 figures, submitted to the IEEE Transactions on Signal Processing in April 201
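The row-extension idea can be sketched numerically: each extra sample is a sum of sub-samples already collected on different (BMI, segment) pairs, which is equivalent to a new measurement-matrix row assembled from segments of different BMI waveforms. The cyclic reassignment below is one simple choice for illustration, not necessarily the paper's construction:

```python
import numpy as np

rng = np.random.default_rng(2)

J, L, N = 4, 4, 32               # BMIs, time segments, signal length (toy sizes)
seg = N // L
Phi = rng.choice([-1.0, 1.0], size=(J, N)) / np.sqrt(N)   # BMI waveforms

# Sub-samples: integrate each BMI's output over each time segment.
x = rng.standard_normal(N)
sub = np.array([[Phi[j, l*seg:(l+1)*seg] @ x[l*seg:(l+1)*seg]
                 for l in range(L)] for j in range(J)])

def extended_rows(shift):
    # Assemble J new rows whose segment l comes from BMI (j + shift*l) mod J,
    # so each new sample is a sum of already-collected sub-samples.
    rows = np.zeros((J, N))
    for j in range(J):
        for l in range(L):
            rows[j, l*seg:(l+1)*seg] = Phi[(j + shift*l) % J, l*seg:(l+1)*seg]
    return rows

Phi_ext = np.vstack([extended_rows(s) for s in range(L)])  # shift 0 = original
y_ext = Phi_ext @ x
# The first J extended samples are exactly the conventional AIC samples:
assert np.allclose(y_ext[:J], sub.sum(axis=1))
print(Phi_ext.shape)   # → (16, 32): 4x the rows from the same 4 BMIs
```

The extended matrix has J·L rows from only J physical branches, which is the claimed measurement-matrix extension without extra hardware.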
Tangent space estimation for smooth embeddings of Riemannian manifolds
Numerous dimensionality reduction problems in data analysis involve the recovery of low-dimensional models or the learning of manifolds underlying sets of data. Many manifold learning methods require the estimation of the tangent space of the manifold at a point from locally available data samples. Local sampling conditions such as (i) the size of the neighborhood (sampling width) and (ii) the number of samples in the neighborhood (sampling density) affect the performance of learning algorithms. In this work, we propose a theoretical analysis of local sampling conditions for the estimation of the tangent space at a point P lying on an m-dimensional Riemannian manifold S in R^n. Assuming a smooth embedding of S in R^n, we estimate the tangent space T_P S by performing a Principal Component Analysis (PCA) on points sampled from the neighborhood of P on S. Our analysis explicitly takes into account the second order properties of the manifold at P, namely the principal curvatures as well as the higher order terms. We consider a random sampling framework and leverage recent results from random matrix theory to derive conditions on the sampling width and the local sampling density for an accurate estimation of tangent subspaces. We measure the estimation accuracy by the angle between the estimated tangent space and the true tangent space T_P S and we give conditions for this angle to be bounded with high probability. In particular, we observe that the local sampling conditions are highly dependent on the correlation between the components in the second-order local approximation of the manifold. We finally provide numerical simulations to validate our theoretical findings.
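A toy version of the local-PCA estimator is easy to sketch: sample a neighborhood of P on a curved surface, run PCA, and measure the principal angle between the estimated and true tangent planes. The paraboloid and the width/density values below are illustrative choices, not the paper's:

```python
import numpy as np

rng = np.random.default_rng(3)

# Sample near P = (0, 0, 0) on the paraboloid z = x^2 + y^2 in R^3
# (m = 2, n = 3); the true tangent plane T_P S is the xy-plane.
width = 0.1                                  # sampling width (neighborhood size)
K = 500                                      # sampling density (sample count)
uv = rng.uniform(-width, width, size=(K, 2))
pts = np.column_stack([uv, (uv ** 2).sum(axis=1)])

# Local PCA: top-m principal directions of the centered neighborhood.
centered = pts - pts.mean(axis=0)
_, _, Vt = np.linalg.svd(centered, full_matrices=False)
T_hat = Vt[:2]                               # estimated tangent basis (m = 2)

# Accuracy: largest principal angle to the true tangent space (xy-plane).
T_true = np.eye(3)[:2]
s = np.linalg.svd(T_hat @ T_true.T, compute_uv=False)
angle = np.arccos(np.clip(s.min(), -1.0, 1.0))
print(f"largest principal angle: {angle:.4f} rad")
```

Shrinking the width reduces the curvature-induced bias while shrinking the sample count increases the noise, which is exactly the trade-off the paper's sampling conditions quantify.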
Analysis and optimization of weighted ensemble sampling
We give a mathematical framework for weighted ensemble (WE) sampling, a binning and resampling technique for efficiently computing probabilities in molecular dynamics. We prove that WE sampling is unbiased in a very general setting that includes adaptive binning. We show that when WE is used for stationary calculations in tandem with a coarse model, the coarse model can be used to optimize the allocation of replicas in the bins.
Comment: 22 pages, 3 figures
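A minimal WE loop makes the binning-and-resampling idea concrete: walkers carry statistical weights, and after each dynamics step every occupied bin is resampled back to a fixed replica count with the bin's total weight split equally, so total weight is conserved exactly. The 1D random walk and bin layout are illustrative stand-ins for molecular dynamics:

```python
import numpy as np

rng = np.random.default_rng(4)

n_bins, reps_per_bin, steps = 10, 20, 50
# Unbounded outer bins so every walker always lands in some bin.
edges = np.concatenate([[-np.inf], np.linspace(-0.8, 0.8, n_bins - 1), [np.inf]])

# Walkers: positions and statistical weights (weights sum to 1).
pos = np.zeros(reps_per_bin)
wts = np.full(reps_per_bin, 1.0 / reps_per_bin)

for _ in range(steps):
    pos = pos + 0.05 * rng.standard_normal(len(pos))   # "dynamics" step
    new_pos, new_wts = [], []
    for b in range(n_bins):
        in_bin = (pos >= edges[b]) & (pos < edges[b + 1])
        w = wts[in_bin].sum()
        if w == 0:
            continue                                   # empty bin
        # Resample to a fixed replica count, weighting picks by walker
        # weight and splitting the bin's total weight equally: unbiased.
        picks = rng.choice(pos[in_bin], size=reps_per_bin,
                           p=wts[in_bin] / w)
        new_pos.append(picks)
        new_wts.append(np.full(reps_per_bin, w / reps_per_bin))
    pos, wts = np.concatenate(new_pos), np.concatenate(new_wts)

print(round(wts.sum(), 6))   # total weight is conserved: prints 1.0
```

Splitting enriches rare bins with replicas while their small weights keep averages unbiased; optimizing how replicas are allocated across bins is exactly what the paper's coarse-model result addresses.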
