Context for Ubiquitous Data Management
In response to the advance of ubiquitous computing technologies, we believe that for computer systems to be ubiquitous, they must be context-aware. In this paper, we address the impact of context-awareness on ubiquitous data management. To do this, we survey the different characteristics of context in order to develop a clear understanding of context, as well as its implications and requirements for context-aware data management. References to recent research activities and applicable techniques are also provided.
Balancing smartness and privacy for the Ambient Intelligence
Ambient Intelligence (AmI) will introduce large privacy risks. Stored context histories are vulnerable to unauthorized disclosure, so unlimited storage of privacy-sensitive context data is undesirable from a privacy viewpoint. However, high quality and quantity of data enable smartness in AmI, while less and coarser data benefit privacy. This raises an important problem for AmI: how to balance the smartness and privacy requirements in an ambient world. In this article, we propose giving donors control over the life cycle of their context data, so that users themselves can balance their needs and wishes in terms of smartness and privacy.
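As a minimal illustration of donor-controlled life cycles, the Python sketch below shows one way the smartness/privacy trade-off could look in practice: fine-grained context is kept only briefly, coarse context longer. All names and the retention/precision model here are hypothetical, invented for illustration, not the paper's actual design.

```python
from dataclasses import dataclass
from datetime import datetime, timedelta

@dataclass
class ContextRecord:
    donor: str
    attribute: str          # e.g. "location"
    value: str
    collected_at: datetime
    retention: timedelta    # donor-chosen lifetime for this record
    precision: str          # donor-chosen granularity: "fine" or "coarse"

def visible_records(store, now):
    """Return only records still inside their donor-chosen lifetime."""
    return [r for r in store if now - r.collected_at <= r.retention]

# A donor trading smartness for privacy: short-lived fine data,
# long-lived coarse data.
now = datetime(2024, 1, 10)
store = [
    ContextRecord("alice", "location", "room 42", now - timedelta(days=2),
                  timedelta(days=1), "fine"),    # already expired
    ContextRecord("alice", "location", "building A", now - timedelta(days=2),
                  timedelta(days=30), "coarse"), # still visible
]
print([r.value for r in visible_records(store, now)])  # ['building A']
```

Applications then query only `visible_records`, so expired fine-grained data can no longer leak even if the store is compromised later.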
Implanting Life-Cycle Privacy Policies in a Context Database
Ambient intelligence (AmI) environments continuously monitor surrounding individuals' context (e.g., location, activity, etc.) to make existing applications smarter, i.e., able to make decisions without requiring user interaction. Such AmI smartness is tightly coupled to the quantity and quality of the available (past and present) context. However, context is often linked to an individual (e.g., the location of a given person) and as such falls under privacy directives. The goal of this paper is to enable the difficult wedding of privacy (automatically fulfilling users' privacy wishes) and smartness in AmI. Interestingly, privacy requirements in AmI differ from those in traditional environments, where systems usually manage durable data (e.g., medical or banking information), collected and updated trustfully either by the donor herself, her doctor, or an employee of her bank. In such traditional settings, proper information disclosure to third parties constitutes the major privacy concern.
An Answer Explanation Model for Probabilistic Database Queries
Following the availability in recent years of huge amounts of uncertain data, coming from applications as diverse as sensors, machine learning and mining approaches, and information extraction and integration, we have seen a revival of interest in probabilistic databases. Queries over these databases result in probabilistic answers. As the process of arriving at these answers is based on the underlying stored uncertain data, we argue that, from the standpoint of an end user, it is helpful for such a system to explain how it arrives at an answer and on which uncertainty assumptions the derived answer is based. In this way, users can combine the explanation with their own knowledge to decide how much confidence to place in a probabilistic answer.
The aim of this paper is to design such an answer explanation model for probabilistic database queries. We report our design principles and show methods to compute the answer explanations. One of the main contributions of our model is that it fills the gap between giving only the answer probability and giving the full derivation. Furthermore, we show how to balance the verifiability and influence of explanation components through the concept of verifiable views. The behavior of the model and its computational efficiency are demonstrated through an extensive performance study.
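As a toy illustration of an explanation that sits between "probability only" and "full derivation", the Python sketch below reports, for each source tuple in an answer's lineage, how much the answer probability would drop without it. The lineage model (independent tuples combined disjunctively) and all function names are invented for illustration and are not the paper's actual model.

```python
# Each answer tuple's lineage: (source_tuple_id, probability) pairs whose
# disjunction derives the answer; source tuples assumed independent.
lineage = [("t1", 0.6), ("t2", 0.5)]

def answer_probability(lineage):
    """P(answer) = 1 - prod(1 - p_i) for independent disjunctive lineage."""
    q = 1.0
    for _, p in lineage:
        q *= 1.0 - p
    return 1.0 - q

def explanation(lineage):
    """Per-tuple influence: how much the answer probability would drop
    if that source tuple alone were removed from the lineage."""
    total = answer_probability(lineage)
    return {tid: round(total - answer_probability(
                [(t, p) for t, p in lineage if t != tid]), 3)
            for tid, _ in lineage}

print(answer_probability(lineage))  # 0.8
print(explanation(lineage))         # {'t1': 0.3, 't2': 0.2}
```

Such an influence summary conveys which uncertain inputs matter most without exposing the full derivation tree.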
An adaptive finite element method for the infinity Laplacian
We construct a finite element method (FEM) for the infinity Laplacian. Solutions of this problem are well known to be singular in nature, so we have taken the opportunity to conduct an a posteriori analysis of the method, deriving residual-based estimators to drive an adaptive algorithm. It is numerically shown that optimal convergence rates are regained using the adaptive procedure.
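For reference, the operator in question and the generic shape of a residual-based estimator can be written as follows; the estimator shown is the standard residual form for second-order problems, not necessarily the exact estimator derived in the paper:

```latex
% The infinity Laplacian is the degenerate quasilinear operator
\Delta_\infty u \;=\; \sum_{i,j=1}^{d} \partial_{x_i} u \,\partial_{x_j} u \,\partial^2_{x_i x_j} u \;=\; 0,
% and a residual-based a posteriori estimator for a discrete solution u_h
% typically combines element residuals and gradient jumps across faces F:
\eta^2 \;=\; \sum_{K \in \mathcal{T}_h} h_K^2 \,\| R(u_h) \|_{L^2(K)}^2
        \;+\; \sum_{F} h_F \,\| [\![ \nabla u_h ]\!] \|_{L^2(F)}^2 .
```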
String Organization of Field Theories: Duality and Gauge Invariance
String theories should reduce to ordinary four-dimensional field theories at
low energies. Yet the formulation of the two are so different that such a
connection, if it exists, is not immediately obvious. With the Schwinger
proper-time representation, and the spinor helicity technique, it has been
shown that field theories can indeed be written in a string-like manner, thus
resulting in simplifications in practical calculations, and providing novel
insights into gauge and gravitational theories. This paper continues the study
of string organization of field theories by focusing on the question of local
duality. It is shown that a single expression for the sum of many diagrams can
indeed be written for QED, thereby simulating the duality property in strings.
The relation between a single diagram and the dual sum is somewhat analogous to
the relation between an old-fashioned perturbation diagram and a Feynman
diagram. Dual expressions are particularly significant for gauge theories
because they are gauge invariant while expressions for single diagrams are not.
Comment: 20 pages in LaTeX, including seven figures in PostScript.
Quark initiated coherent diffractive production of muon pair and W boson at hadron colliders
The large transverse momentum muon pair and W boson productions in the quark
initiated coherent diffractive processes at hadron colliders are discussed
under the framework of the two-gluon exchange parametrization of the Pomeron
model. In this approach, the production cross sections are related to the
small-x off-diagonal gluon distribution and the large-x quark distribution in
the proton (antiproton). By approximating the off-diagonal gluon distribution
by the usual gluon distribution function, we estimate the production rates of
these processes at the Fermilab Tevatron.
Comment: 11 pages, 6 PS figures, to appear in PR.
Variational Multiscale Stabilization and the Exponential Decay of Fine-scale Correctors
This paper addresses the variational multiscale stabilization of standard
finite element methods for linear partial differential equations that exhibit
multiscale features. The stabilization is of Petrov-Galerkin type with a
standard finite element trial space and a problem-dependent test space based on
pre-computed fine-scale correctors. The exponential decay of these correctors
and their localisation to local cell problems is rigorously justified. The
stabilization eliminates scale-dependent pre-asymptotic effects as they appear
for standard finite element discretizations of highly oscillatory problems,
e.g., the poor approximation in homogenization problems or the pollution
effect in high-frequency acoustic scattering.
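The decay property claimed in the abstract can be stated schematically as follows; the constants and norms here are indicative, not the paper's precise statement:

```latex
% The corrector \phi_T attached to an element T, when localized to an
% \ell-layer patch around T (giving \phi_T^\ell), incurs an exponentially
% small truncation error,
\| \nabla ( \phi_T - \phi_T^{\ell} ) \|_{L^2(\Omega)}
  \;\lesssim\; e^{-c\,\ell}\, \| \nabla \phi_T \|_{L^2(\Omega)},
% which justifies replacing global fine-scale problems by local cell problems.
```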
Diffractive light quark jet production at hadron colliders in the two-gluon exchange model
Massless quark and antiquark jet production at large transverse momentum in
the coherent diffractive processes at hadron colliders is calculated in the
two-gluon exchange parametrization of the Pomeron model. We use the helicity
amplitude method to calculate the cross section formula. We find that for the
light quark jet production the diffractive process is related to the
differential off-diagonal gluon distribution function in the proton. We
estimate the production rate for this process at the Fermilab Tevatron by
approximating the off-diagonal gluon distribution function by the usual
diagonal gluon distribution in the proton. We find that the cross sections
for diffractive light quark jet production and charm quark jet
production are of the same order of magnitude. We also use the helicity
amplitude method to calculate the diffractive charm jet production at hadron
colliders, by which we reproduce the leading logarithmic approximation result
of this process that we previously calculated.
Comment: 15 pages, 4 PS figures, RevTeX.
Can black holes be torn up by phantom dark energy in cyclic cosmology?
Infinitely cyclic cosmology is often frustrated by the black hole problem. It
has been speculated that this obstacle in cyclic cosmology can be removed by
taking into account a peculiar cyclic model derived from loop quantum cosmology
or the braneworld scenario, in which phantom dark energy plays a crucial role.
In this peculiar cyclic model, the mechanism of solving the black hole problem
is through tearing up black holes by phantom. However, using the theory of
fluid accretion onto black holes, we show in this paper that there exists
another possibility: that black holes cannot be torn up by phantom in this
cyclic model. We discuss this possibility and show that the masses of black
holes might first decrease and then increase, through phantom accretion onto
black holes in the expanding stage of the cyclic universe.
Comment: 6 pages, 2 figures; discussions added.
- …