VerdictDB: Universalizing Approximate Query Processing
Despite 25 years of research in academia, approximate query processing (AQP)
has had little industrial adoption. One of the major causes of this slow
adoption is the reluctance of traditional vendors to make radical changes to
their legacy codebases, and the preoccupation of newer vendors (e.g.,
SQL-on-Hadoop products) with implementing standard features. Additionally, the
few AQP engines that are available are each tied to a specific platform and
require users to completely abandon their existing databases---an unrealistic
expectation given the infancy of the AQP technology. Therefore, we argue that a
universal solution is needed: a database-agnostic approximation engine that
will widen the reach of this emerging technology across various platforms.
Our proposal, called VerdictDB, uses a middleware architecture that requires
no changes to the backend database, and thus, can work with all off-the-shelf
engines. Operating at the driver-level, VerdictDB intercepts analytical queries
issued to the database and rewrites them into another query that, if executed
by any standard relational engine, will yield sufficient information for
computing an approximate answer. VerdictDB uses the returned result set to
compute an approximate answer and error estimates, which are then passed on to
the user or application. However, lack of access to the query execution layer
introduces significant challenges in terms of generality, correctness, and
efficiency. This paper shows how VerdictDB overcomes these challenges and
delivers up to a 171× speedup (18.45× on average) for a variety of
existing engines, such as Impala, Spark SQL, and Amazon Redshift, while
incurring less than 2.6% relative error. VerdictDB is open-sourced under Apache
License.
Comment: Extended technical report of the paper that appeared in Proceedings of the 2018 International Conference on Management of Data, pp. 1461-1476. ACM, 2018.
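The driver-level rewriting described above can be sketched in a highly simplified form. The snippet below is an illustrative toy, not VerdictDB's actual implementation: it intercepts an AVG query, redirects it to a pre-built uniform sample table, and uses the sample mean, second moment, and count to form an approximate answer with a 95% confidence interval. All table and column names (sales, sales_sample, price) are hypothetical, and SQLite stands in for the backend engine.

```python
# Toy sketch of AQP middleware rewriting: answer AVG over a big table
# by querying a small uniform sample and attaching an error estimate.
import sqlite3
import math
import random

random.seed(0)
conn = sqlite3.connect(":memory:")
cur = conn.cursor()

# Full table: 100,000 rows an exact engine would have to scan.
cur.execute("CREATE TABLE sales (price REAL)")
rows = [(random.uniform(1.0, 100.0),) for _ in range(100_000)]
cur.executemany("INSERT INTO sales VALUES (?)", rows)

# ~1% uniform sample, materialized ahead of time by the middleware.
cur.execute("CREATE TABLE sales_sample AS "
            "SELECT price FROM sales WHERE RANDOM() % 100 = 0")

def rewrite_and_execute(query):
    """Intercept an AVG query and rewrite it against the sample table.

    The rewritten query asks the backend for the sample mean, the mean
    of squares, and the row count -- enough information, obtainable with
    standard SQL, to compute an approximate answer and a 95% confidence
    interval in the driver.
    """
    assert query.strip().upper() == "SELECT AVG(PRICE) FROM SALES"
    rewritten = ("SELECT AVG(price), AVG(price * price), COUNT(*) "
                 "FROM sales_sample")
    mean, mean_sq, n = cur.execute(rewritten).fetchone()
    variance = mean_sq - mean * mean      # sample variance (population form)
    stderr = math.sqrt(variance / n)      # standard error of the mean
    return mean, 1.96 * stderr            # estimate, 95% CI half-width

estimate, ci = rewrite_and_execute("SELECT AVG(price) FROM sales")
exact = cur.execute("SELECT AVG(price) FROM sales").fetchone()[0]
print(f"approx = {estimate:.2f} +/- {ci:.2f}, exact = {exact:.2f}")
```

Because only the rewritten SQL reaches the backend, any engine that can run standard aggregates can serve as the execution layer, which is the point of the middleware architecture; the real system must additionally handle joins, group-bys, and correct error propagation, which this sketch omits.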
Radiative Muon Capture on Hydrogen and the Induced Pseudoscalar Coupling
The first measurement of the elementary process of radiative muon capture on hydrogen, μ⁻ + p → n + ν_μ + γ, is reported. A photon pair spectrometer was used to measure the partial branching ratio for photons of k > 60 MeV. The value of the weak pseudoscalar coupling constant g_p determined from the partial branching ratio carries two uncertainties: the first is the quadrature sum of statistical and systematic uncertainties, and the second is due to the uncertainty in the decay rate of the ortho to para muonic molecule. This value of g_p is 1.5 times the prediction of PCAC and pion-pole dominance.
Comment: 13 pages, RevTeX type, 3 figures (encapsulated postscript), submitted to Phys. Rev. Lett.
Jamming and Fluctuations in Granular Drag
We investigate the dynamic evolution of jamming in granular media through
fluctuations in the granular drag force. The successive collapse and formation
of jammed states give a stick-slip nature to the fluctuations which is
independent of the contact surface between the grains and the dragged object --
thus implying that the stress-induced collapse is nucleated in the bulk of the
granular sample. We also find that while the fluctuations are periodic at small
depths, they become "stepped" at large depths, a transition which we interpret
as a consequence of the long-range nature of the force chains.
Comment: 7 pages, 4 figures, RevTeX
Parity Violation in Elastic Electron-Proton Scattering and the Proton's Strange Magnetic Form Factor
We report a new measurement of the parity-violating asymmetry in elastic electron scattering from the proton at backward scattering angles. This asymmetry is sensitive to the strange magnetic form factor of the proton as well as electroweak axial radiative corrections. The new measurement of A = -4.92±0.61±0.73 ppm provides a significant constraint on these quantities. The implications for the strange magnetic form factor are discussed in the context of theoretical estimates for the axial corrections.
Parity-violating Electron Deuteron Scattering and the Proton's Neutral Weak Axial Vector Form Factor
We report on a new measurement of the parity-violating asymmetry in quasielastic electron scattering from the deuteron at backward angles at Q² = 0.038 (GeV/c)². This quantity provides a determination of the neutral weak axial vector form factor of the nucleon, which can potentially receive large electroweak corrections. The measured asymmetry A = -3.51 ± 0.57 (stat) ± 0.58 (sys) ppm is consistent with theoretical predictions. We also report on updated results of the previous experiment at Q² = 0.091 (GeV/c)², which are also consistent with theoretical predictions.
Comment: 4 pages, 2 figures, submitted to Phys. Rev. Lett.
A Comprehensive Case Study of Macrosegregation in a Steel Ingot
This is the author accepted manuscript. The final version is available from Springer via http://dx.doi.org/10.1007/s11663-015-0386-y
A case study is presented that examines the macrosegregation and grain structure present in a 12-tonne steel ingot, which was cast for experimental purposes. Details of the casting procedure were well documented and the resulting ingot was characterized using a number of techniques that measured chemical segregation, shrinkage, and porosity. The formation of the porosity and segregation patterns is discussed in reference to the particular grain structure observed in the ingot. It is hoped that this case study can be used as a tool for the validation of future macromodels.
This work was undertaken as part of a project sponsored by Rolls-Royce Power Nuclear plc in collaboration with Sheffield Forgemasters International.
Illness Mapping: A time and cost effective method to estimate healthcare data needed to establish community-based health insurance
Background: Most healthcare spending in developing countries is private out-of-pocket. One explanation for the low penetration of health insurance is that poorer individuals doubt their ability to enforce insurance contracts. Community-based health insurance schemes (CBHI) are a solution, but launching CBHI requires obtaining accurate local data on morbidity, healthcare utilization and other details to inform package design and pricing. We developed the "Illness Mapping" method (IM) for data collection (faster and cheaper than household surveys). Methods: IM is a modification of two non-interactive consensus group methods (Delphi and Nominal Group Technique) to operate as interactive methods. We elicited estimates from "Experts" in the target community on morbidity and healthcare utilization. Interaction between facilitator and experts became essential to bridge literacy constraints and to reach consensus. The study was conducted in Gaya District, Bihar (India) during April-June 2010. The intervention included the IM and a household survey (HHS). IM included 18 women's and 17 men's groups. The HHS was conducted in 50 villages with 1,000 randomly selected households (6,656 individuals). Results: We found good agreement between the two methods on overall prevalence of illness (IM: 25.9% ±3.6; HHS: 31.4%) and on prevalence of acute (IM: 76.9%; HHS: 69.2%) and chronic illnesses (IM: 20.1%; HHS: 16.6%). We also found good agreement on incidence of deliveries (IM: 3.9% ±0.4; HHS: 3.9%) and on hospital deliveries (IM: 61.0% ±5.4; HHS: 51.4%). For hospitalizations, we obtained a lower estimate from the IM (1.1%) than from the HHS (2.6%). The IM required less time and less person-power than a household survey, which translates into reduced costs. Conclusions: We have shown that our Illness Mapping method can be carried out at lower financial and human cost for sourcing essential local data, at acceptably accurate levels. In view of the good fit of the results obtained, we assume that the method could work elsewhere as well.
Some Consequences of Thermosolutal Convection: The Grain Structure of Castings
The essential principles of thermosolutal convection are outlined, and how convection provides a transport mechanism between the mushy region of a casting and the open bulk liquid is illustrated. The convective flow patterns which develop assist in heat exchange and macroscopic solute segregation during solidification; they also provide a mechanism for the transport of dendritic fragments from the mushy region into the bulk liquid. Surviving fragments become nuclei for equiaxed grains and so lead to blocking of the parental columnar, dendritic growth front from which they originated. The physical steps in such a sequence are considered and some experimental data are provided to support the argument.
Synthetic biology: Building the language for a new science brick by metaphorical brick
Changes in the biosciences and their relations to society over the last decades provide a unique opportunity to examine whether or not such changes leave traces in the language we use to talk about them. In this article we examine metaphors used in English-speaking press coverage to conceptualize a new type of (interdisciplinary) bioscience: synthetic biology. Findings show that three central metaphors were used between 2008 and May 2010. They exploit social and cultural knowledge about books, computers and engines and are linked to knowledge of three revolutions in science and society (the printing, information and industrial revolutions). These three central metaphors are connected to each other through the concepts of reading/writing, designing and mass production and they focus on science as a revolutionary process rather than on the end results or products of science. Overall, we observed the use of a complex bricolage of mixed metaphors and chains of metaphors that root synthetic biology in historical events and achievements, while at the same time extolling its promises for the future.