Challenges for Efficient Query Evaluation on Structured Probabilistic Data
Query answering over probabilistic data is an important task but is generally
intractable. However, a new approach for this problem has recently been
proposed, based on structural decompositions of input databases, following,
e.g., tree decompositions. This paper presents a vision for a database
management system for probabilistic data built following this structural
approach. We review our existing and ongoing work on this topic and highlight
many theoretical and practical challenges that remain to be addressed.
Comment: 9 pages, 1 figure, 23 references. Accepted for publication at SUM 201
Supporting User-Defined Functions on Uncertain Data
Uncertain data management has become crucial in many sensing and scientific applications. As user-defined functions (UDFs) become widely used in these applications, an important task is to capture result uncertainty for queries that evaluate UDFs on uncertain data. In this work, we provide a general framework for supporting UDFs on uncertain data. Specifically, we propose a learning approach based on Gaussian processes (GPs) to compute approximate output distributions of a UDF when evaluated on uncertain input, with guaranteed error bounds. We also devise an online algorithm to compute such output distributions, which employs a suite of optimizations to improve accuracy and performance. Our evaluation using both real-world and synthetic functions shows that our proposed GP approach can outperform the state-of-the-art sampling approach with up to two orders of magnitude improvement for a variety of UDFs.
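As a rough illustration of the surrogate idea, the sketch below fits a Gaussian-process regressor to a toy UDF and propagates an uncertain (Gaussian) input through it by Monte-Carlo sampling. This is a minimal sketch assuming scikit-learn's GaussianProcessRegressor, not the framework or online algorithm proposed in the paper, and it provides no error-bound guarantee.

```python
# Minimal sketch: approximate a UDF's output distribution under uncertain input
# using a Gaussian-process surrogate (illustrative only; not the paper's algorithm).
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, WhiteKernel

def expensive_udf(x):
    # Stand-in for an arbitrary user-defined function on a scalar attribute.
    return np.sin(3.0 * x) + 0.5 * x**2

# 1. Fit the surrogate on a small set of UDF evaluations.
X_train = np.linspace(-2.0, 2.0, 25).reshape(-1, 1)
y_train = expensive_udf(X_train).ravel()
gp = GaussianProcessRegressor(kernel=RBF() + WhiteKernel(), normalize_y=True)
gp.fit(X_train, y_train)

# 2. Propagate an uncertain input (here Gaussian) through the surrogate by sampling.
rng = np.random.default_rng(0)
x_samples = rng.normal(loc=0.5, scale=0.2, size=(2000, 1))  # uncertain attribute value
y_mean, y_std = gp.predict(x_samples, return_std=True)

# 3. Monte-Carlo summary of the UDF output distribution: the variance combines the
#    GP's predictive variance with the spread induced by the uncertain input.
out_mean = y_mean.mean()
out_var = (y_std**2).mean() + y_mean.var()
print(f"approx. output distribution: mean={out_mean:.3f}, var={out_var:.3f}")
```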
Parasitism capacity of Trichogramma pretiosum on eggs of Trichoplusia ni at different temperatures.
ABSTRACT: Trichogramma spp. are egg parasitoids of various pest species of Lepidoptera, including Trichoplusia ni, an important pest of plants in the genus Brassica. Of the climatic conditions that can impair the parasitism capacity of Trichogramma spp., temperature is critical. Thus, the objective of this research was to evaluate the parasitism capacity of Trichogramma pretiosum on eggs of T. ni at 18, 21, 24, 27, 30, and 33°C; 70±10% RH; and a 12/12-hour photophase (L/D). Fresh eggs of the host moth were offered to T. pretiosum daily. During the first 24 hours, the parasitism rate varied between 8 and 11.4 eggs/female across the temperatures evaluated. The highest number of parasitized eggs per female occurred at 24°C (53.0 parasitized eggs/female). The period of parasitism and the mean longevity of females were inversely related to temperature. Temperature heavily influences the parasitism rate of T. pretiosum on eggs of T. ni, and the best overall performance of the parasitoid occurs from 24 to 27°C.
Randomisation and Derandomisation in Descriptive Complexity Theory
We study probabilistic complexity classes and questions of derandomisation
from a logical point of view. For each logic L we introduce a new logic BPL,
bounded error probabilistic L, which is defined from L in a similar way as the
complexity class BPP, bounded error probabilistic polynomial time, is defined
from PTIME. Our main focus lies on questions of derandomisation, and we prove
that there is a query which is definable in BPFO, the probabilistic version of
first-order logic, but not in C^ω_∞ω, finite-variable infinitary logic with
counting. This implies that many of the standard logics of finite model theory,
like transitive closure logic and fixed-point logic, both with and without
counting, cannot be derandomised. Similarly, we present a query on ordered
structures which is definable in BPFO but not in monadic second-order logic,
and a query on additive structures which is definable in BPFO but not in FO.
The latter of these queries shows that certain uniform variants of AC^0
(bounded-depth, polynomial-size circuits) cannot be derandomised. These results
are in contrast to the general belief that most standard complexity classes can
be derandomised. Finally, we note that BPIFP+C, the probabilistic version of
fixed-point logic with counting, captures the complexity class BPP, even on
unordered structures.
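The definition of BPL from a logic L mirrors the definition of BPP from PTIME; the sketch below shows the kind of bounded-error acceptance condition involved. This is an assumed formulation (the constants 2/3 and 1/3 and the uniform choice of the random relations are modelled on BPP), not a quotation of the paper's definition.

```latex
% Assumed sketch of a BPP-style acceptance condition for BP\mathcal{L}:
% a query Q on \tau-structures belongs to BP\mathcal{L} if there is an
% \mathcal{L}[\tau \cup \rho]-sentence \varphi (with \rho extra "random" relations)
% such that, for every finite \tau-structure A and a uniformly random
% interpretation R of \rho over the universe of A,
\begin{align*}
  A \in Q    &\;\Longrightarrow\; \Pr_{R}\bigl[(A,R) \models \varphi\bigr] \ge \tfrac{2}{3}, \\
  A \notin Q &\;\Longrightarrow\; \Pr_{R}\bigl[(A,R) \models \varphi\bigr] \le \tfrac{1}{3}.
\end{align*}
```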
Risk-Averse Matchings over Uncertain Graph Databases
A large number of applications, such as querying sensor networks and
analyzing protein-protein interaction (PPI) networks, rely on mining uncertain
graph and hypergraph databases. In this work we study the following problem:
given an uncertain, weighted (hyper)graph, how can we efficiently find a
(hyper)matching with high expected reward and low risk?
This problem naturally arises in the context of several important
applications, such as online dating, kidney exchanges, and team formation. We
introduce a novel formulation for finding matchings with maximum expected
reward and bounded risk under a general model of uncertain weighted
(hyper)graphs that we introduce in this work. Our model generalizes
probabilistic models used in prior work, and captures both continuous and
discrete probability distributions, thus allowing us to handle privacy-related
applications that inject appropriately distributed noise into (hyper)edge
weights. Given that our optimization problem is NP-hard, we turn our attention
to designing efficient approximation algorithms. For the case of uncertain
weighted graphs, we provide two approximation algorithms, one of them with
near-optimal run time. For the case of uncertain weighted hypergraphs, we
provide an approximation algorithm, with a guarantee depending on the rank of
the hypergraph (i.e., the maximum number of nodes any hyperedge may include),
that runs in almost linear time (modulo log factors).
We complement our theoretical results by testing our approximation algorithms
on a wide variety of synthetic experiments, where we observe, in a controlled
setting, interesting findings on the trade-off between reward and risk. We also
apply our formulation to recommending teams that are likely to collaborate and
have high impact.
Comment: 25 pages
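To illustrate the reward/risk trade-off that the formulation targets, the sketch below assumes each edge's uncertain weight is summarized by an expected reward and a variance (used as a risk proxy) and greedily builds a matching under a risk budget. This is only a hypothetical heuristic for intuition, not one of the paper's bounded-risk approximation algorithms, and it carries no approximation guarantee.

```python
# Minimal sketch: greedy matching under a risk budget (illustrative heuristic only).
from typing import Dict, List, Tuple

Edge = Tuple[str, str]

def greedy_risk_bounded_matching(
    edges: Dict[Edge, Tuple[float, float]],  # edge -> (expected reward, variance)
    risk_budget: float,
) -> List[Edge]:
    """Pick a matching greedily by expected reward while total variance stays in budget."""
    matched_nodes: set = set()
    matching: List[Edge] = []
    total_risk = 0.0
    # Consider edges in decreasing order of expected reward.
    for (u, v), (reward, variance) in sorted(
        edges.items(), key=lambda kv: kv[1][0], reverse=True
    ):
        if u in matched_nodes or v in matched_nodes:
            continue                      # would violate the matching constraint
        if total_risk + variance > risk_budget:
            continue                      # would violate the risk budget
        matching.append((u, v))
        matched_nodes.update((u, v))
        total_risk += variance
    return matching

# Example: uncertain edge weights summarized by (expected reward, variance).
uncertain_edges = {
    ("a", "b"): (5.0, 4.0),
    ("b", "c"): (4.0, 0.5),
    ("c", "d"): (3.0, 0.2),
}
print(greedy_risk_bounded_matching(uncertain_edges, risk_budget=1.0))
```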
Modelled and observed changes in aerosols and surface solar radiation over Europe between 1960 and 2009
Substantial changes in anthropogenic aerosols and precursor gas emissions have occurred over recent decades due to the implementation of air pollution control legislation and economic growth. The response of atmospheric aerosols to these changes and the impact on climate are poorly constrained, particularly in studies using detailed aerosol chemistry–climate models. Here we compare the HadGEM3-UKCA (Hadley Centre Global Environment Model-United Kingdom Chemistry and Aerosols) coupled chemistry–climate model for the period 1960–2009 against extensive ground-based observations of sulfate aerosol mass (1978–2009), total suspended particle matter (SPM, 1978–1998), PM10 (1997–2009), aerosol optical depth (AOD, 2000–2009), aerosol size distributions (2008–2009) and surface solar radiation (SSR, 1960–2009) over Europe. The model underestimates observed sulfate aerosol mass (normalised mean bias factor (NMBF) = −0.4), SPM (NMBF = −0.9), PM10 (NMBF = −0.2), aerosol number concentrations (N30 NMBF = −0.85; N50 NMBF = −0.65; and N100 NMBF = −0.96) and AOD (NMBF = −0.01) but slightly overpredicts SSR (NMBF = 0.02). Trends in aerosol over the observational period are well simulated by the model, with observed (simulated) changes in sulfate of −68 % (−78 %), SPM of −42 % (−20 %), PM10 of −9 % (−8 %) and AOD of −11 % (−14 %). Discrepancies in the magnitude of simulated aerosol mass do not affect the ability of the model to reproduce the observed SSR trends. The positive change in observed European SSR (5 %) during 1990–2009 ("brightening") is better reproduced by the model when aerosol radiative effects (ARE) are included (3 %), compared to simulations where ARE are excluded (0.2 %). The simulated top-of-the-atmosphere aerosol radiative forcing over Europe under all-sky conditions increased by > 3.0 W m⁻² during the period 1970–2009 in response to changes in anthropogenic emissions and aerosol concentrations.
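For reference, the sketch below computes a normalised mean bias factor (NMBF) like the ones quoted above. The abstract does not define NMBF, so the symmetric form used here (the smaller of the model and observed means goes in the denominator) is an assumption about the metric.

```python
# Sketch of a normalised mean bias factor (NMBF) for model-observation comparison.
# The symmetric form below is an assumption; the source does not define the metric.
import numpy as np

def nmbf(model: np.ndarray, obs: np.ndarray) -> float:
    m_bar, o_bar = float(np.mean(model)), float(np.mean(obs))
    if m_bar >= o_bar:
        return m_bar / o_bar - 1.0   # model overestimates: NMBF >= 0
    return 1.0 - o_bar / m_bar       # model underestimates: NMBF <= 0

# Example: a model low by roughly a factor of two gives a strongly negative NMBF.
obs = np.array([2.0, 4.0, 6.0])
model = 0.55 * obs
print(round(nmbf(model, obs), 2))    # -> -0.82
```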
Earthquake Induced Liquefaction Using Shake Table Test
Loose, saturated cohesionless soils may undergo liquefaction during strong ground motion. Such liquefaction causes significant damage to structures resting on these soils. The extent of damage depends primarily on soil properties, earthquake intensity and the type of structure. Various analytical models have been developed to estimate the likelihood of liquefaction at a particular site based on field performance. However, if it is possible to identify the sites that are likely to liquefy under a specific earthquake intensity, the damage that would otherwise occur can be reduced. One such analytical model, developed by one of the authors of this paper, has been found to satisfactorily demarcate ‘Yes’ and ‘No’ zones of liquefaction for a number of earthquakes. Earlier research also shows that laboratory tests can be conducted to study the liquefaction behavior of soil under specific conditions. The present study deals mainly with an attempt to conduct shake table tests in the laboratory by simulating earthquake conditions at a site. The results obtained from the trial tests have been compared with actual field cases and with laboratory tests conducted on such soils by other researchers. The criterion for the occurrence of liquefaction in the laboratory model is observed to be in close agreement with actual field data, and the shake table test is found to be effective in simulating strong ground motion during an earthquake.
The Met Office Unified Model Global Atmosphere 7.0/7.1 and JULES Global Land 7.0 configurations
We describe Global Atmosphere 7.0 and Global Land 7.0 (GA7.0/GL7.0), the latest science configurations of the Met Office Unified Model (UM) and the Joint UK Land Environment Simulator (JULES) land surface model developed for use across weather and climate timescales. GA7.0 and GL7.0 include incremental developments and targeted improvements that, between them, address four critical errors identified in previous configurations: excessive precipitation biases over India, warm and moist biases in the tropical tropopause layer (TTL), a source of energy non-conservation in the advection scheme and excessive surface radiation biases over the Southern Ocean. They also include two new parametrisations, namely the UK Chemistry and Aerosols (UKCA) GLOMAP-mode (Global Model of Aerosol Processes) aerosol scheme and the JULES multi-layer snow scheme, which improve the fidelity of the simulation and were required for inclusion in the Global Atmosphere/Global Land configurations ahead of the 6th Coupled Model Intercomparison Project (CMIP6). In addition, we describe the GA7.1 branch configuration, which reduces an overly negative anthropogenic aerosol effective radiative forcing (ERF) in GA7.0 whilst maintaining the quality of simulations of the present-day climate. GA7.1/GL7.0 will form the physical atmosphere/land component in the HadGEM3–GC3.1 and UKESM1 climate model submissions to CMIP6.
A SIMPLE AND EFFICIENT METHOD FOR DNA EXTRACTION FROM RABI SORGHUM [Sorghum bicolor (L.) MOENCH]
Sorghum [Sorghum bicolor (L.) Moench] is an important source of bio-energy and makes significant contributions to agriculture. Accordingly, plant breeders are pursuing different strategies for genetic improvement, i.e. breeding for higher yield, improved grain quality, and biotic and abiotic stress tolerance in sorghum. Four major biotechnological tools are now emerging for genetic improvement: molecular markers; gene identification and cloning; genetic engineering and gene transfer technology to integrate desirable traits into the sorghum genome; and genomics and germplasm databases. Genomic DNA extraction is the basic prerequisite for all of these tools. We made several modifications to the available CTAB method to isolate genomic DNA from sweet sorghum leaf tissues. A higher concentration of NaCl (5 M) in the CTAB extraction buffer improved DNA yield and quality by preventing the sample from becoming viscous during grinding. The yield of DNA ranged from 432 to 569 ng from 200 mg of leaf tissue. Proteinase K and RNase A treatments removed protein and RNA from the DNA. An A260/A280 absorbance ratio of 1.8 indicates that the DNA is free from RNA and protein contamination. Three washes with chloroform:isoamyl alcohol (24:1 v/v) resulted in good-quality DNA. Pre-chilled ethanol is useful for DNA precipitation due to its volatile nature and is ideal for small- and medium-scale DNA extraction. PCR analysis using SSR primers showed consistent and reliable amplification.
Intercomparison and evaluation of global aerosol microphysical properties among AeroCom models of a range of complexity
