314 research outputs found

    Marginal Release Under Local Differential Privacy

    Full text link
    Many analysis and machine learning tasks require the availability of marginal statistics on multidimensional datasets while providing strong privacy guarantees for the data subjects. Applications for these statistics range from finding correlations in the data to fitting sophisticated prediction models. In this paper, we provide a set of algorithms for materializing marginal statistics under the strong model of local differential privacy. We prove the first tight theoretical bounds on the accuracy of marginals compiled under each approach, perform empirical evaluation to confirm these bounds, and evaluate them for tasks such as modeling and correlation testing. Our results show that releasing information based on (local) Fourier transformations of the input is preferable to alternatives based directly on (local) marginals.
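    The sketch below only illustrates the local-privacy setting this abstract works in, assuming a single binary attribute per user and plain randomized response with a hypothetical privacy parameter epsilon; the paper's own algorithms target multidimensional marginals via (local) Fourier transforms, which this baseline does not implement.

```python
# Hedged illustration of local differential privacy for a one-way marginal:
# randomized response on a single bit, followed by debiasing of the aggregate.
# This is a simple baseline, not the paper's Fourier-transform-based mechanism.
import math
import random

def perturb(bit, epsilon):
    """Each user flips their bit with probability 1 / (1 + e^epsilon) before reporting."""
    p_flip = 1.0 / (1.0 + math.exp(epsilon))
    return bit ^ int(random.random() < p_flip)

def estimate_marginal(reports, epsilon):
    """Debias the observed frequency of 1s to recover an unbiased estimate of the true marginal."""
    p_flip = 1.0 / (1.0 + math.exp(epsilon))
    observed = sum(reports) / len(reports)
    return (observed - p_flip) / (1.0 - 2.0 * p_flip)

if __name__ == "__main__":
    random.seed(0)
    true_bits = [1] * 3000 + [0] * 7000                     # true marginal = 0.30
    reports = [perturb(b, epsilon=1.0) for b in true_bits]  # each user reports locally
    print(estimate_marginal(reports, epsilon=1.0))          # close to 0.30, up to sampling noise
```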

    Washington University Magazine and Alumni News, Summer 1992

    Get PDF
    https://digitalcommons.wustl.edu/ad_wumag/1119/thumbnail.jp

    CHARACTERIZATION OF IN-SITU STRESS AND PERMEABILITY IN FRACTURED RESERVOIRS

    Full text link

    Biometric Systems

    Get PDF
    Because of the accelerating progress in biometrics research and the latest nation-state threats to security, this book's publication is not only timely but also much needed. This volume contains seventeen peer-reviewed chapters reporting the state of the art in biometrics research: security issues, signature verification, fingerprint identification, wrist vascular biometrics, ear detection, face detection and identification (including a new survey of face recognition), person re-identification, electrocardiogram (ECG) recognition, and several multi-modal systems. This book will be a valuable resource for graduate students, engineers, and researchers interested in understanding and investigating this important field of study.

    Topics in Massive Data Summarization.

    Full text link
    We consider three problems in this thesis. First, we want to construct a nearly workload-optimal histogram: given B, we want to find a near-optimal B-bucket histogram under an associated workload w, within a 1 + epsilon error tolerance. In the cash-register model, where the data is streamed as a series of updates, we can build such a histogram using polylogarithmic space, polylogarithmic time to process each item, and polylogarithmic post-processing time to build the histogram. All of these results require the workload to be stored explicitly, since we show that if the workload is summarized lossily in small space, algorithmic results such as the above do not exist.

    Second, we consider the problem of private computation of approximate heavy hitters. Alice and Bob each hold a vector and want to find the B largest values in the vector sum, along with their indices. We show how to solve the problem privately with polylogarithmic communication, polynomial work, and a constant number of rounds, in the sense that nothing is learned by Alice and Bob beyond what is implied by their inputs, the ideal top-B output, and the goodness of approximation (equivalently, the Euclidean norm of the vector sum). We give lower bounds showing that the Euclidean norm must be leaked by any efficient algorithm.

    In the third problem, we want to build a near-optimal histogram on probabilistic data streams. Given B, we want to find a near-optimal B-bucket histogram on probabilistic data streams under both the L1 and the L2 measure. We give deterministic algorithms without sampling, building histograms using polylogarithmic space, polylogarithmic time to process each item, and polylogarithmic post-processing time to build the histogram. The result under the L2 measure is within a 1 + epsilon error tolerance, while the result under the L1 measure is heuristic; we also give a direction for providing guarantees for the heuristic.

    Ph.D. Computer Science & Engineering, University of Michigan, Horace H. Rackham School of Graduate Studies
    http://deepblue.lib.umich.edu/bitstream/2027.42/60841/1/xuanzh_1.pd
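    As a point of reference for the histogram objective above, the sketch below computes an exact V-optimal B-bucket histogram by dynamic programming over a small in-memory array. It is only an offline baseline for the L2 objective, under assumed toy inputs, and does not reflect the thesis's streaming, workload-aware, or privacy-preserving algorithms.

```python
# Exact V-optimal B-bucket histogram via dynamic programming (offline baseline).
# Partition x into B contiguous buckets, each summarized by its mean, minimizing
# total squared (L2) error. Runs in O(n^2 * B) time, O(n * B) space.
import math

def v_optimal_histogram(x, B):
    n = len(x)
    # Prefix sums of values and squared values for O(1) bucket-cost queries.
    ps = [0.0] * (n + 1)
    ps2 = [0.0] * (n + 1)
    for i, v in enumerate(x):
        ps[i + 1] = ps[i] + v
        ps2[i + 1] = ps2[i] + v * v

    def sse(i, j):
        # Squared error of the bucket covering x[i:j] when represented by its mean.
        s, s2, m = ps[j] - ps[i], ps2[j] - ps2[i], j - i
        return s2 - s * s / m

    INF = math.inf
    # dp[b][j]: minimal error covering x[:j] with b buckets; cut[b][j] remembers the last split.
    dp = [[INF] * (n + 1) for _ in range(B + 1)]
    cut = [[0] * (n + 1) for _ in range(B + 1)]
    dp[0][0] = 0.0
    for b in range(1, B + 1):
        for j in range(b, n + 1):
            for i in range(b - 1, j):
                c = dp[b - 1][i] + sse(i, j)
                if c < dp[b][j]:
                    dp[b][j], cut[b][j] = c, i

    # Recover the bucket boundaries by walking the stored cuts backwards.
    bounds, j = [], n
    for b in range(B, 0, -1):
        i = cut[b][j]
        bounds.append((i, j))
        j = i
    return dp[B][n], list(reversed(bounds))

if __name__ == "__main__":
    err, buckets = v_optimal_histogram([1, 1, 2, 9, 9, 10, 4, 4], B=3)
    print(err, buckets)  # total L2 error and the three bucket ranges
```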

    The value of reprocessing legacy data: a case study of Bois d'Arc, a Mississippi play in northeastern Oklahoma

    Get PDF
    Exploration companies have been producing the Mississippi Lime and the overlying Redfork for almost 100 years, such that legacy 3D surveys cover significant parts of northern Oklahoma. Early 3D seismic surveys were acquired and processed to image conventional structural and stratigraphic plays that would be drilled by vertical wells. The modern adoption of horizontal drilling, acidizing, and hydraulic fracturing has resulted in the Mississippi Limestone of Northern Oklahoma/Southern Kansas moving from marginal production to becoming one of the newest "unconventional" plays. With advances in processing algorithms and workflows, improved computing power, the desire not only to map structure but also to map rock properties such as density, porosity, lithology, P-wave velocity, and S-wave velocity, and the need to accurately land and guide horizontal wells, seismic data once thought to be sufficiently processed need to be re-examined.

    I illustrate the value of reprocessing a legacy seismic data volume acquired in 1999 in Kay County, OK, by applying a modern workflow including surface-consistent gain recovery and balancing, advanced phase match filtering of merged datasets, 3D FKK filtering for ground-roll attenuation, and wavelet processing of the vibroseis data to convert it to minimum phase for Wiener-Levinson spiking deconvolution. Final steps include Kirchhoff Prestack Time Migration followed by modern spectral enhancement. Each step adds incremental improvements to vertical and lateral resolution. I use both geometric attributes and impedance inversion to quantify the interpretational impact of reprocessing and find an improvement in vertical resolution from 20 m to 15 m. Coherence and curvature techniques show more detailed faulting and folding, while ties to blind impedance wells increase from R=0.6 to R=0.7. Prestack acoustic impedance inversion indicates lateral changes in density and impedance that are consistent with tripolitic chert sweet spots.
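    The reported resolution gain can be sanity-checked with the standard quarter-wavelength criterion, resolvable thickness ≈ v / (4·f_dominant). The short sketch below does that arithmetic with a placeholder interval velocity that is not taken from the study, so only the implied dominant-frequency ratio is meaningful.

```python
# Back-of-envelope check of the 20 m -> 15 m vertical-resolution improvement using
# the quarter-wavelength (lambda/4) criterion: resolvable thickness ~ v / (4 * f_dom).
# The velocity below is a placeholder, NOT a value from the study; the implied
# frequency ratio (20/15 ~ 1.33) is independent of the velocity chosen.

v = 4000.0                            # m/s, hypothetical carbonate interval velocity
res_before, res_after = 20.0, 15.0    # m, vertical resolution before/after reprocessing

f_before = v / (4.0 * res_before)     # ~50 Hz with the placeholder velocity
f_after = v / (4.0 * res_after)       # ~67 Hz with the placeholder velocity
print(f_before, f_after, f_after / f_before)  # ratio ~1.33, i.e. ~33% higher dominant frequency
```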

    Rock physics changes due to CO2 injection: the CO2CRC Otway Project

    Get PDF
    The CO2CRC Otway Project aims to demonstrate that CO2 can be safely stored in a depleted gas field and that an appropriate monitoring strategy can be deployed to verify its containment. The project commenced in 2005, with the baseline 3D seismic collected early in January 2008. CO2 was injected into the depleted gas reservoir known as Waarre-C at the Naylor field in April 2008. The first monitor survey was recorded in January 2009, shortly after the injection of 35,000 tonnes of CO2. Early predictions in the program suggested that the resulting time-lapse seismic effect would be very subtle because of the reservoir depth, small area, and complexity, the small amount of CO2/CH4 (in an 80/20 ratio) injected, and, most of all, the partial saturation of the reservoir sand. The key challenge then presented to this research was: exactly how subtle is the effect going to be? To answer that question I had to develop a workflow that would produce a very accurate prediction of the elastic property changes in the reservoir caused by CO2 injection. The sensitivity of the time-lapse seismic methodology in detecting subtle changes in the reservoir is then investigated.

    The rock physics model I propose uses the “effective” grain bulk modulus (Kgrain) to represent the average mineralogy of the grains. The validity of this approach is confirmed by the good agreement between Vpsat measured on core and Vpsat computed from the log data using the “effective” modulus, and the use of the “effective” Kgrain was further justified by petrographic analysis. This increased the modelling precision and changed the predicted time-lapse effect due to CO2 injection from the previously computed average of 3% over the reservoir sequence to nearly 6%. The significance is that a 6% change could be detected with high-precision monitoring methodologies. The in-situ saturation type is homogeneous, according to the analysis path assumed in this thesis; if some patchiness exists in the reservoir, it would be away from the wells and would further elevate the CO2-related seismic effect.

    The time-lapse seismic methodology at the Otway site utilised a very high survey density in order to increase sensitivity. On the negative side, weak sources and the change of source type between the surveys resulted in non-repeatability greater than, or of a similar order to, the expected time-lapse signal. Hence the interpretation of the time-lapse P-wave seismic data took a somewhat different path. I used model-based post-stack acoustic inversion in a similar way to how history matching is used in reservoir simulation studies: I performed successive fluid substitutions, followed by well ties and inversions, with the objective of examining the inversion error. The modelled fluid saturation case that results in the minimal inversion error then reflects the most likely state of the reservoir. Modelling 35,000 tonnes of injected CO2/CH4 mix, with 35% water saturation and 65% CO2/CH4 mix, produced the smallest error when reinstating the logs to the 2009 reservoir state.

    The time-lapse anomaly observed in the data exceeds the predictions derived through the rock physics model, seismic modelling, and simulation models. This is likely to be the case in general, as the effect of CO2 on a reservoir is difficult to predict, and a “conservative” approach may result in an under-prediction of time-lapse seismic effects. Consequently, the predicted and measured seismic effects can be used as the lower and upper bounds of the time-lapse effects at the Naylor field, respectively. The method presented here for the analysis of a subtle time-lapse signal could be applied to cases with similar challenges elsewhere.
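    The fluid-substitution step described above can be sketched with Gassmann's equation, using an “effective” grain bulk modulus as the abstract describes and a Reuss (uniform-saturation) fluid mix. All numeric rock and fluid inputs below are placeholders for illustration, not the Waarre-C / Otway values.

```python
# Minimal sketch of Gassmann fluid substitution with an "effective" grain bulk modulus.
# All rock and fluid properties are placeholders, NOT the Waarre-C / Otway values.
import math

def gassmann_ksat(k_dry, k_grain, k_fluid, phi):
    """Saturated-rock bulk modulus from Gassmann's equation (moduli in GPa, phi fractional)."""
    b = 1.0 - k_dry / k_grain
    return k_dry + b * b / (phi / k_fluid + (1.0 - phi) / k_grain - k_dry / k_grain ** 2)

def reuss_fluid(k_brine, k_gas, sw):
    """Uniform (homogeneous) saturation -> Reuss/Wood average of the fluid moduli."""
    return 1.0 / (sw / k_brine + (1.0 - sw) / k_gas)

def vp_sat(k_dry, mu, k_grain, phi, k_fluid, rho_bulk):
    """P-wave velocity (m/s) of the saturated rock; the shear modulus mu is unaffected by fluid."""
    k_sat = gassmann_ksat(k_dry, k_grain, k_fluid, phi) * 1e9   # GPa -> Pa
    return math.sqrt((k_sat + 4.0 / 3.0 * mu * 1e9) / rho_bulk)

# Placeholder sandstone and fluid properties (moduli in GPa, densities in kg/m^3):
k_dry, mu, k_eff_grain, phi = 12.0, 10.0, 38.0, 0.20
k_brine, k_gas = 2.7, 0.08            # brine vs. CO2/CH4 mix bulk moduli
rho_grain, rho_brine, rho_gas = 2650.0, 1030.0, 300.0

for sw in (1.00, 0.35):               # fully brine-saturated vs. 35% water / 65% gas mix
    k_fl = reuss_fluid(k_brine, k_gas, sw)
    rho = rho_grain * (1.0 - phi) + phi * (sw * rho_brine + (1.0 - sw) * rho_gas)
    print(f"Sw={sw:.2f}: Vp_sat ~ {vp_sat(k_dry, mu, k_eff_grain, phi, k_fl, rho):.0f} m/s")
```

    With these placeholder inputs the velocity contrast between the two saturation states is on the order of several per cent, illustrating the scale of change that the thesis argues requires high-precision monitoring.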
    • …