9,569 research outputs found

    Successive Coordinate Search and Component-by-Component Construction of Rank-1 Lattice Rules

    The (fast) component-by-component (CBC) algorithm is an efficient tool for the construction of generating vectors for quasi-Monte Carlo rank-1 lattice rules in weighted reproducing kernel Hilbert spaces. We consider product weights, which assign a weight to each dimension. These weights encode the influence of each variable (and, via the product of the individual weights, of each group of variables); smaller weights indicate less importance. Kuo (2003) proved that the CBC algorithm achieves the optimal rate of convergence in the respective function spaces, but this does not imply that the algorithm finds the generating vector with the smallest worst-case error; in fact it does not. We investigate a generalization of the component-by-component construction that allows for a general successive coordinate search (SCS), based on an initial generating vector, with the aim of getting closer to the smallest worst-case error. The proposed method admits the same type of worst-case error bounds as the CBC algorithm, independent of the choice of the initial vector. Under the same summability conditions on the weights as in [Kuo, 2003], the error bound of the algorithm can be made independent of the dimension d, and we achieve the same optimal order of convergence for the function spaces from [Kuo, 2003]. Moreover, a fast version of our method, based on the fast CBC algorithm by Nuyens and Cools, is available, reducing the computational cost of the algorithm to O(d n log(n)) operations, where n denotes the number of function evaluations. Numerical experiments seeded by a Korobov-type generating vector show that the new SCS algorithm finds better generating vectors than the CBC algorithm, and the improvement is larger when the weights decay more slowly.
    Comment: 13 pages, 1 figure, MCQMC2016 conference (Stanford)
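    The abstract describes the construction only at a high level; the sketch below shows the mechanics under clearly stated assumptions. It uses the textbook squared worst-case error for a weighted Korobov space of smoothness alpha = 2 with product weights (the exact function spaces, weight choices, and the fast O(d n log(n)) implementation of the paper are not reproduced here), and contrasts a plain CBC search with an SCS-style sweep that re-optimises each coordinate of a Korobov-type initial vector while all other coordinates are held fixed.

```python
# Minimal sketch: CBC vs. successive coordinate search (SCS) for rank-1
# lattice generating vectors.  The error criterion below is the standard
# squared worst-case error in a weighted Korobov space with smoothness
# alpha = 2 and product weights gamma; it is an illustrative stand-in for
# the weighted spaces treated in the paper.
from math import gcd
import numpy as np

def b2(x):
    """Bernoulli polynomial B_2 evaluated at the fractional part of x."""
    x = x - np.floor(x)
    return x * x - x + 1.0 / 6.0

def sq_worst_case_error(z, n, gamma):
    """Squared worst-case error of the n-point rank-1 lattice rule with
    generating vector z (Korobov space, alpha = 2, product weights gamma)."""
    k = np.arange(n)
    prod = np.ones(n)
    for zj, gj in zip(z, gamma):
        prod *= 1.0 + gj * 2.0 * np.pi ** 2 * b2(k * zj / n)
    return -1.0 + prod.mean()

def cbc(n, d, gamma):
    """Greedy CBC construction: earlier components stay fixed, each new
    component minimises the error of the current lower-dimensional rule."""
    candidates = [c for c in range(1, n) if gcd(c, n) == 1]
    z = []
    for j in range(d):
        z.append(min(candidates,
                     key=lambda c: sq_worst_case_error(z + [c], n, gamma[:j + 1])))
    return z

def scs(n, z_init, gamma, sweeps=1):
    """Successive coordinate search: starting from an initial generating
    vector, re-optimise one coordinate at a time while every other
    coordinate is held fixed at its current value."""
    candidates = [c for c in range(1, n) if gcd(c, n) == 1]
    z = list(z_init)
    for _ in range(sweeps):
        for j in range(len(z)):
            z[j] = min(candidates,
                       key=lambda c: sq_worst_case_error(z[:j] + [c] + z[j + 1:], n, gamma))
    return z

if __name__ == "__main__":
    n, d = 127, 5                                # prime n keeps the candidate set simple
    gamma = [0.9 ** j for j in range(1, d + 1)]  # slowly decaying product weights
    z_kor = [pow(3, j, n) for j in range(d)]     # Korobov-type initial vector (1, 3, 9, ...)
    z_cbc = cbc(n, d, gamma)
    z_scs = scs(n, z_kor, gamma)
    print("CBC:", z_cbc, sq_worst_case_error(z_cbc, n, gamma))
    print("SCS:", z_scs, sq_worst_case_error(z_scs, n, gamma))
```

    The brute-force error evaluation makes this sketch cost O(d^2 n^2); the fast CBC machinery of Nuyens and Cools, which the paper reuses, is what brings the cost down to the quoted O(d n log(n)).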

    Fractal image compression: A resolution independent representation for imagery

    A deterministic fractal is an image which has low information content and no inherent scale. Because of their low information content, deterministic fractals can be described with small data sets. They can be displayed at high resolution since they are not bound by an inherent scale. A remarkable consequence follows: fractal images can be encoded at very high compression ratios. This fern, for example, is encoded in fewer than 50 bytes and yet can be displayed at ever higher resolutions, with increasing levels of detail appearing. The Fractal Transform was discovered in 1988 by Michael F. Barnsley. It is the basis for a new image compression scheme which was initially developed by Michael Barnsley and myself at Iterated Systems. The Fractal Transform effectively solves the problem of finding a fractal which approximates a digital 'real world image'.
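    To illustrate the claim of low information content and no inherent scale under stated assumptions: the sketch below renders the well-known Barnsley fern with the "chaos game" from four affine maps, i.e. a few dozen coefficients amounting to a few tens of bytes, and the same maps can be rasterised at any chosen resolution. It demonstrates the iterated-function-system idea only; it is not the Fractal Transform encoder developed at Iterated Systems.

```python
# Chaos-game rendering of a deterministic fractal (the Barnsley fern).
# The entire "image" is the four affine maps below; the raster can be
# regenerated at any width/height, with more detail appearing as the
# resolution and iteration count grow.
import random

# Each map: (a, b, c, d, e, f, probability), applied as
# (x, y) -> (a*x + b*y + e, c*x + d*y + f).
FERN_MAPS = [
    ( 0.00,  0.00,  0.00, 0.16, 0.0, 0.00, 0.01),
    ( 0.85,  0.04, -0.04, 0.85, 0.0, 1.60, 0.85),
    ( 0.20, -0.26,  0.23, 0.22, 0.0, 1.60, 0.07),
    (-0.15,  0.28,  0.26, 0.24, 0.0, 0.44, 0.07),
]

def render(width=60, height=40, iterations=50_000):
    """Run the chaos game and rasterise the attractor at the requested size."""
    grid = [[" "] * width for _ in range(height)]
    x, y = 0.0, 0.0
    weights = [m[6] for m in FERN_MAPS]
    for _ in range(iterations):
        a, b, c, d, e, f, _p = random.choices(FERN_MAPS, weights=weights)[0]
        x, y = a * x + b * y + e, c * x + d * y + f
        col = int((x + 2.75) / 5.5 * (width - 1))   # attractor lies roughly in [-2.75, 2.75] x [0, 10]
        row = int((1.0 - y / 10.0) * (height - 1))
        if 0 <= col < width and 0 <= row < height:
            grid[row][col] = "#"
    return "\n".join("".join(r) for r in grid)

if __name__ == "__main__":
    print(render())
```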

    IRS-TR 12001: Spectral Pointing-Induced Throughput Error and Spectral Shape in Short-Low Order 1

    We investigate how the shape of a spectrum in the Short-Low module on the IRS varies with its overall throughput, which depends on how well centered a source is in the spectroscopic slit. Using flux ratios to quantify the overall slope or color of the spectrum and plotting them vs. the overall throughput reveals a double-valued function, which arises from asymmetries in the point spread function. We use this plot as a means of determining which individual spectra are valid for calibrating the IRS.
    Comment: 9 pages, 3 figures
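    A minimal sketch of the diagnostic described above, under illustrative assumptions: each spectrum's colour is summarised by a red/blue flux ratio and plotted against its overall throughput. The wavelength windows and the synthetic "pointing offset" model are invented for the demonstration (offsets of either sign give the same throughput but different colours, mimicking the double-valued behaviour) and are not the actual IRS Short-Low data, windows, or calibration values.

```python
# Sketch: quantify spectral slope with a red/blue flux ratio and plot it
# against overall throughput.  All inputs here are synthetic stand-ins.
import numpy as np
import matplotlib.pyplot as plt

def flux_ratio(wavelength, flux, blue=(6.0, 8.0), red=(12.0, 14.0)):
    """Red/blue flux ratio as a simple measure of spectral slope
    (windows in microns are illustrative, not calibration values)."""
    b = flux[(wavelength >= blue[0]) & (wavelength < blue[1])].sum()
    r = flux[(wavelength >= red[0]) & (wavelength < red[1])].sum()
    return r / b

def throughput(flux, reference_flux):
    """Overall throughput relative to a well-centred reference spectrum."""
    return flux.sum() / reference_flux.sum()

def slope_vs_throughput(spectra, reference):
    """Scatter plot of colour vs. throughput for a set of (wavelength, flux) spectra."""
    ratios = [flux_ratio(w, f) for w, f in spectra]
    thr = [throughput(f, reference[1]) for _, f in spectra]
    plt.scatter(thr, ratios)
    plt.xlabel("overall throughput")
    plt.ylabel("red/blue flux ratio")
    plt.show()

if __name__ == "__main__":
    wl = np.linspace(5.2, 14.5, 120)              # roughly the Short-Low range, in microns
    reference = (wl, np.exp(-wl / 10.0))          # stand-in for a well-centred spectrum
    spectra = []
    for offset in np.linspace(-1.5, 1.5, 25):     # toy pointing offsets across the slit
        scale = np.exp(-offset ** 2)              # lower throughput when off-centre
        tilt = 1.0 + 0.05 * offset * (wl - wl.mean())   # offset-sign-dependent colour change
        spectra.append((wl, scale * tilt * reference[1]))
    slope_vs_throughput(spectra, reference)
```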

    Fragmentation in Mental Health Benefits and Services: A Preliminary Examination into Consumption and Outcomes

    In this chapter, we examine consumption patterns and health outcomes within a health insurance system in which mental health benefits are administered under a carved-out insurance plan. Using a comprehensive dataset of health claims, including insurance claims for both mental and physical health services, we examine both heterogeneity of consumption and variation in outcomes. Consumption variation addresses the regularly overlooked question of how equal insurance and access do not translate into equitable consumption. Outcomes variation yields insights into the potential harms of disparate consumption and of uncoordinated care. We find that even when insurance and access are held constant, consumption of mental health services varies dramatically across race and class. We are unable, however, to find any evidence that higher levels of consumption correspond with improved health when health status is controlled for. We also find some evidence of the costs of fragmentation, such as uncoordinated care, low adherence rates, and variation in sources of care. These findings have important implications for both the delivery of health services and the administration of health insurance benefits.

    Fractal image compression

    Fractals are geometric or data structures which do not simplify under magnification. Fractal Image Compression is a technique which associates a fractal to an image. On the one hand, the fractal can be described in terms of a few succinct rules, while on the other, the fractal contains much or all of the image information. Since the rules are described with fewer bits of data than the image, compression results. Data compression with fractals is an approach to reaching high compression ratios for large data streams related to images. The high compression ratios are attained at the cost of large amounts of computation. Both lossless and lossy modes are supported by the technique. The technique is stable in that small errors in codes lead to small errors in image data. Applications to the NASA mission are discussed.
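    To make the "few succinct rules" concrete, the sketch below implements the block-matching step found in most textbook fractal image compression schemes (a partitioned IFS in the style of Jacquin and Barnsley): each small range block of the image is approximated by a contracted, contrast-scaled and brightness-shifted copy of a larger domain block taken from the same image, and only a (domain index, scale, offset) triple is stored per range block. It is a generic illustration of the technique, not the specific codec or NASA application discussed in the abstract.

```python
# Toy partitioned-IFS (fractal) encoder/decoder: range blocks are matched to
# downsampled domain blocks under an affine grey-level map r ~ s*dom + o.
# The stored codes are tiny compared with the raw pixels, which is where the
# compression comes from; decoding iterates the maps from any starting image.
import numpy as np

def downsample(block):
    """Average 2x2 pixel groups so an 8x8 domain block matches a 4x4 range block."""
    return block.reshape(block.shape[0] // 2, 2, block.shape[1] // 2, 2).mean(axis=(1, 3))

def encode(image, range_size=4):
    """For every range block, find the domain block and grey-level map (s, o)
    minimising the squared error."""
    h, w = image.shape
    d = range_size * 2
    domains = [downsample(image[y:y + d, x:x + d])
               for y in range(0, h - d + 1, d) for x in range(0, w - d + 1, d)]
    codes = []
    for y in range(0, h, range_size):
        for x in range(0, w, range_size):
            r = image[y:y + range_size, x:x + range_size].astype(float)
            best = None
            for idx, dom in enumerate(domains):
                s, o = np.polyfit(dom.ravel(), r.ravel(), 1)   # least-squares fit r ~ s*dom + o
                err = np.sum((s * dom + o - r) ** 2)
                if best is None or err < best[0]:
                    best = (err, idx, s, o)
            codes.append(((y, x), best[1], best[2], best[3]))
    return codes

def decode(codes, shape, range_size=4, iterations=8):
    """Iterate the stored maps from a blank image; contractivity of the maps
    drives the iteration towards an approximation of the original."""
    img = np.zeros(shape)
    d = range_size * 2
    for _ in range(iterations):
        domains = [downsample(img[y:y + d, x:x + d])
                   for y in range(0, shape[0] - d + 1, d) for x in range(0, shape[1] - d + 1, d)]
        new = np.zeros(shape)
        for (y, x), idx, s, o in codes:
            new[y:y + range_size, x:x + range_size] = s * domains[idx] + o
        img = new
    return img

if __name__ == "__main__":
    test = np.add.outer(np.arange(16), np.arange(16)) / 30.0   # smooth 16x16 test image
    codes = encode(test)
    approx = decode(codes, test.shape)
    print("codes stored:", len(codes), " RMS error:", np.sqrt(np.mean((approx - test) ** 2)))
```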