
    Wavelet-based estimation with multiple sampling rates

    We suggest an adaptive sampling rule for obtaining information from noisy signals using wavelet methods. The technique involves increasing the sampling rate when relatively high-frequency terms are incorporated into the wavelet estimator, and decreasing it when, again using thresholded terms as an empirical guide, signal complexity is judged to have decreased. By sampling in this way the algorithm is able to accurately recover relatively complex signals without increasing the long-run average cost of sampling. It achieves this level of performance by exploiting the opportunities for near-real-time sampling that are available if one uses a relatively high primary resolution level when constructing the basic wavelet estimator. In the practical problems that motivate the work, where the signal-to-noise ratio is particularly high and the long-run average sampling rate may be several hundred thousand operations per second, high primary resolution levels are quite feasible.

    Comment: Published at http://dx.doi.org/10.1214/009053604000000751 in the Annals of Statistics (http://www.imstat.org/aos/) by the Institute of Mathematical Statistics (http://www.imstat.org).
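    The abstract describes a rule of this general shape: raise the rate when thresholded fine-scale wavelet coefficients survive, lower it when they do not. The following is a minimal sketch of that idea, not the paper's algorithm; the function name, rate bounds, and doubling/halving policy are illustrative assumptions, and it uses the PyWavelets library for the wavelet decomposition.

```python
# Illustrative sketch only: adjust a sampling rate based on which wavelet
# detail levels survive universal thresholding (rule structure assumed).
import numpy as np
import pywt

def next_sampling_rate(block, rate, rate_min=1e3, rate_max=1e6, wavelet="db4"):
    """Return an updated sampling rate for the next block of samples.

    If coefficients survive thresholding at the finest detail level, the
    signal is judged locally complex and the rate is doubled; if the two
    finest levels are thresholded away entirely, the rate is halved.
    """
    coeffs = pywt.wavedec(block, wavelet)               # [cA_n, cD_n, ..., cD_1]
    sigma = np.median(np.abs(coeffs[-1])) / 0.6745      # robust noise estimate
    thresh = sigma * np.sqrt(2.0 * np.log(len(block)))  # universal threshold
    survives = [np.any(np.abs(d) > thresh) for d in coeffs[1:]]
    if survives[-1]:                       # energy at the finest scale
        return min(rate * 2.0, rate_max)
    if not any(survives[-2:]):             # nothing at the two finest scales
        return max(rate / 2.0, rate_min)
    return rate
```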

    Effect of control sampling rates on model-based manipulator control schemes

    The effect of changing the control sampling period on the performance of the computed-torque and independent joint control schemes is discussed. While the former uses the complete dynamics model of the manipulator, the latter assumes a decoupled, linear model of the manipulator dynamics. The researchers discuss the design of controller gains for both the computed-torque and the independent joint control schemes and establish a framework for comparing their trajectory tracking performance. Experiments show that, within each scheme, trajectory tracking accuracy varies only slightly with changes in the sampling rate. However, at low sampling rates the computed-torque scheme outperforms the independent joint control scheme. Based on the experimental results, the researchers also conclusively establish the importance of high sampling rates, since they result in increased stiffness of the system.
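    For context, a minimal sketch of the two control laws being compared; this is the textbook form, not the paper's implementation. The dynamics terms M, C, and g are assumed user-supplied callables, and the gain matrices Kp and Kd are placeholders that would be tuned per sampling period as the paper discusses.

```python
# Textbook forms of the two schemes compared in the abstract (sketch).
import numpy as np

def computed_torque(q, qd, q_des, qd_des, qdd_des, M, C, g, Kp, Kd):
    """Full-model law: tau = M(q)(qdd_des + Kd*ed + Kp*e) + C(q, qd)qd + g(q)."""
    e, ed = q_des - q, qd_des - qd
    return M(q) @ (qdd_des + Kd @ ed + Kp @ e) + C(q, qd) @ qd + g(q)

def independent_joint(q, qd, q_des, qd_des, Kp, Kd):
    """Decoupled linear law: each joint is treated as an isolated PD loop."""
    return Kp @ (q_des - q) + Kd @ (qd_des - qd)
```

    The comparison in the paper follows directly from this structure: the computed-torque law cancels the coupled nonlinear dynamics explicitly, which is why it degrades more gracefully at low sampling rates than the per-joint PD approximation.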

    Approximation with Error Bounds in Spark

    We introduce a sampling framework to support approximate computing with estimated error bounds in Spark. Our framework allows sampling to be performed at the beginning of a sequence of multiple transformations ending in an aggregation operation. The framework constructs a data provenance tree as the computation proceeds, then combines the tree with multi-stage sampling and population estimation theories to compute error bounds for the aggregation. When information about output keys is available early, the framework can also use adaptive stratified reservoir sampling to avoid (or reduce) key losses in the final output and to achieve more consistent error bounds across popular and rare keys. Finally, the framework includes an algorithm for dynamically choosing sampling rates to meet user-specified constraints on the CDF of the error bounds in the outputs. We have implemented a prototype of our framework, called ApproxSpark, and used it to implement five approximate applications from different domains. Evaluation results show that ApproxSpark can (a) significantly reduce execution time if users can tolerate small amounts of uncertainty and, in many cases, the loss of rare keys, and (b) automatically find sampling rates that meet user-specified constraints on error bounds. We also extensively explore and discuss the trade-offs between sampling rate, execution time, accuracy, and key loss.
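    To make the sampling-rate/error-bound trade-off concrete, here is a single-stage sketch of sampling-based aggregation with a confidence-interval error bound. ApproxSpark combines this basic idea with provenance trees and multi-stage sampling theory, which this sketch deliberately does not reproduce; the function name and Bernoulli-sampling design are assumptions for illustration.

```python
# Single-stage illustration: estimate a population sum from a Bernoulli
# sample and attach a CLT-style error bound (not ApproxSpark's algorithm).
import math
import random

def approx_sum(population, rate, z=1.96):
    """Estimate sum(population) by sampling each element with prob. `rate`;
    returns (estimate, half_width) of an approximate 95% confidence interval."""
    sample = [x for x in population if random.random() < rate]
    n, N = len(sample), len(population)
    if n < 2:
        return 0.0, float("inf")          # sample too small to estimate
    mean = sum(sample) / n
    var = sum((x - mean) ** 2 for x in sample) / (n - 1)
    est = N * mean
    half_width = z * N * math.sqrt(var / n * (1 - n / N))  # finite-pop. correction
    return est, half_width
```

    Raising `rate` shrinks `half_width` at the cost of execution time, which is exactly the CDF-of-error-bounds constraint the framework's rate-selection algorithm navigates.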

    Error evaluation for difference approximations to ordinary differential equations

    The method involves relationships between the errors introduced by using finite sampling rates and the parameters describing the specific numerical method used. The procedure is used in the design and analysis of digital filters and simulators.
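    A small illustration of the kind of error/sampling-rate relationship the abstract alludes to, under the assumption of a standard method: the global error of forward Euler on y' = -y, y(0) = 1, scales linearly with the step h, i.e. inversely with the sampling rate.

```python
# Global error of forward Euler on y' = -y vs. step size (illustrative).
import math

def euler_error(h, t_end=1.0):
    y, t = 1.0, 0.0
    while t < t_end - 1e-12:
        y += h * (-y)          # one forward Euler step for y' = -y
        t += h
    return abs(y - math.exp(-t_end))

for h in (0.1, 0.05, 0.025):
    print(f"h={h:<6} error={euler_error(h):.5f}")  # error roughly halves with h
```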

    Calibration of polyurethane foam (PUF) disk passive air samplers for quantitative measurement of polychlorinated biphenyls (PCBs) and polybrominated diphenyl ethers (PBDEs): Factors influencing sampling rates.

    PUF disk passive air samplers are increasingly employed for the monitoring of POPs in ambient air. In order to use them as quantitative sampling devices, a calibration experiment was conducted. Time-integrated indoor air concentrations of PCBs and PBDEs were obtained from a low-volume air sampler operated over a 50-day period alongside the PUF disk samplers in the same office microenvironment. Passive sampling rates for the fully sheltered sampler design employed in our research were determined for the 51 PCB and 7 PBDE congeners detected in all calibration samples. These values varied from 0.57 to 1.55 m³ d⁻¹ for individual PCBs and from 1.1 to 1.9 m³ d⁻¹ for PBDEs. They are appreciably lower than those reported elsewhere for different PUF disk sampler designs (e.g. partially sheltered) employed under different conditions (e.g. in outdoor air) and derived using different calibration experiment configurations. This suggests that sampling rates derived for a specific sampler configuration deployed under specific environmental conditions should not be extrapolated to different sampler configurations. Furthermore, our observation of variable congener-specific sampling rates (consistent with other studies) implies that more research is required in order to fully understand the factors that influence sampling rates. Analysis of wipe samples taken from the inside of the sampler housing revealed evidence that the housing surface scavenges particle-bound PBDEs.
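    A calibrated sampling rate enables the standard back-calculation from mass collected on the disk to an air concentration, C = m / (R × t). The sketch below shows that arithmetic; the congener names and rate values in the lookup table are hypothetical placeholders within the paper's reported ranges, not its congener-specific results.

```python
# Back-calculating air concentration from a PUF disk using a calibrated
# passive sampling rate: C (ng/m3) = mass (ng) / (rate (m3/day) * days).
SAMPLING_RATE_M3_PER_DAY = {"PCB-28": 1.0, "BDE-47": 1.5}  # hypothetical values

def air_concentration(mass_ng, congener, days):
    """Return the time-averaged air concentration in ng per cubic metre."""
    return mass_ng / (SAMPLING_RATE_M3_PER_DAY[congener] * days)

print(air_concentration(25.0, "PCB-28", 50))  # ng/m3 over a 50-day deployment
```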