Effect preservation in transaction processing in rule triggering systems
Rules provide an expressive means for implementing database behavior: they cope with changes and their ramifications. Rules are commonly used for integrity enforcement, i.e., for repairing database actions in a way that integrity constraints are maintained. Yet, rule triggering systems fall short in enforcing effect preservation, i.e., guaranteeing that repairing events do not undo each other, and in particular do not undo the original triggering event. A method is suggested for enforcing effect preservation on updates in general rule triggering systems. The method derives transactions from rules and then splits the work between compile time and run time. At compile time, a data structure is constructed that analyzes the execution sequences of a transaction and computes minimal conditions for effect preservation. The transaction code is augmented with instructions that navigate along the data structure and test the computed minimal conditions. This method produces minimal effect-preserving transactions and, under certain conditions, provides a meaningful improvement over the quadratic overhead of pure run-time procedures. For transactions without loops, the run-time overhead is linear in the size of the transaction; for general transactions, it depends linearly on the length of the execution sequence and the number of loop repetitions. The method is currently being implemented within a traditional database system.
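The run-time side of such a check can be sketched minimally. Under the simplifying assumption that updates are (key, old value, new value) triples, an execution sequence preserves the triggering event's effect as long as no later repair restores the value the trigger changed. The function below is an illustrative sketch, not the paper's compile-time construction.

```python
# Illustrative run-time effect-preservation check (a sketch, not the
# paper's compile-time construction). Updates are modeled as
# (key, old_value, new_value) triples.

def preserves_effect(trigger, repairs):
    """True iff no repair in the sequence undoes the triggering update."""
    key, old, new = trigger
    current = new
    for r_key, _r_old, r_new in repairs:
        if r_key == key:
            current = r_new
    # The effect is preserved iff the trigger's change was not rolled back.
    return current != old

# The repairs change an unrelated key and then adjust the triggered key
# further, but never restore the original value, so the effect survives.
ok = preserves_effect(("balance", 100, 50),
                      [("log", None, "debit"), ("balance", 50, 40)])
# Here the repair restores the pre-trigger value, undoing the effect.
undone = preserves_effect(("balance", 100, 50),
                          [("balance", 50, 100)])
```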
Error-Bounded and Feature Preserving Surface Remeshing with Minimal Angle Improvement
The typical goal of surface remeshing consists in finding a mesh that is (1)
geometrically faithful to the original geometry, (2) as coarse as possible to
obtain a low-complexity representation and (3) free of bad elements that would
hamper the desired application. In this paper, we design an algorithm to
address all three optimization goals simultaneously. The user specifies desired
bounds on approximation error {\delta}, minimal interior angle {\theta} and
maximum mesh complexity N (number of vertices). Since such a desired mesh might
not even exist, our optimization framework treats only the approximation error
bound {\delta} as a hard constraint and the other two criteria as optimization
goals. More specifically, we iteratively apply carefully prioritized local
operators whenever they improve the mesh without violating the approximation
error bound. In this way, our optimization framework greedily
searches for the coarsest mesh with minimal interior angle above {\theta} and
approximation error bounded by {\delta}. Fast runtime is enabled by a local
approximation error estimation, while implicit feature preservation is obtained
by specifically designed vertex relocation operators. Experiments show that our
approach delivers high-quality meshes with implicitly preserved features and
better balances between geometric fidelity, mesh complexity and element quality
than the state-of-the-art.
Comment: 14 pages, 20 figures. Submitted to IEEE Transactions on Visualization and Computer Graphics.
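The prioritized-greedy loop described above can be sketched as follows; the mesh representation (an integer standing in for vertex count), the operator set, and the scoring functions are stand-in assumptions, not the authors' implementation. The error bound delta acts as a hard constraint, while quality improvements are taken greedily in priority order.

```python
# Skeleton of a prioritized-greedy remeshing loop under a hard error
# bound; mesh, operators, and scoring functions are illustrative.
import heapq

def greedy_remesh(mesh, delta, operators, approx_error, quality):
    """Apply the best legal local operator until none improves the mesh."""
    improved = True
    while improved:
        improved = False
        candidates = []
        for op in operators:
            cand = op(mesh)          # e.g. collapse, flip, relocate
            if cand is None:
                continue
            gain = quality(cand) - quality(mesh)
            heapq.heappush(candidates, (-gain, id(cand), cand))
        while candidates:
            neg_gain, _, cand = heapq.heappop(candidates)
            # delta is a hard constraint: reject any candidate over it.
            if -neg_gain > 0 and approx_error(cand) <= delta:
                mesh, improved = cand, True
                break
    return mesh

# Toy run: coarsen from 10 "vertices" while the error stays within 3.
result = greedy_remesh(
    10, 3,
    [lambda m: m - 1 if m > 5 else None],
    approx_error=lambda m: 10 - m,
    quality=lambda m: -m,
)
```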
BPGrad: Towards Global Optimality in Deep Learning via Branch and Pruning
Understanding the global optimality in deep learning (DL) has been attracting
more and more attention recently. Conventional DL solvers, however, were not
intentionally designed to seek such global optimality. In this paper
we propose a novel approximation algorithm, BPGrad, towards optimizing deep
models globally via branch and pruning. Our BPGrad algorithm is based on the
assumption of Lipschitz continuity in DL, and as a result it can adaptively
determine the step size for current gradient given the history of previous
updates, wherein theoretically no smaller step size can achieve the global
optimum. We prove that, by repeating such a branch-and-pruning procedure, we
can locate the global optimum within finitely many iterations. Empirically, an
efficient solver based on BPGrad for DL is proposed as well, and it outperforms
conventional DL solvers such as Adagrad, Adadelta, RMSProp, and Adam in the
tasks of object recognition, detection, and segmentation.
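The Lipschitz-based step-size idea can be illustrated in one dimension: if |f(x) - f(y)| <= L|x - y| and f_star lower-bounds the global minimum, then the ball of radius (f(x) - f_star) / L around x contains no global minimizer, so a gradient step of that length cannot overshoot it. The sketch below assumes L, f_star, and the 1-D setting for illustration; it is not the BPGrad solver itself.

```python
# 1-D sketch of the Lipschitz-based step-size rule (illustrative, not
# the BPGrad solver): the ball of radius (f(x) - f_star) / L around x
# contains no global minimizer, so we may step at least that far.

def bpgrad_descent(f, grad_f, x, f_star, L, iters=100):
    for _ in range(iters):
        g = grad_f(x)
        if abs(g) < 1e-12 or f(x) - f_star < 1e-9:
            break                                  # (near-)optimal
        step = (f(x) - f_star) / L                 # Lipschitz-justified length
        x = x - step * (g / abs(g))                # move against the gradient
    return x

# f(x) = |x - 2| is 1-Lipschitz with global minimum value 0 at x = 2.
x_min = bpgrad_descent(
    lambda x: abs(x - 2),
    lambda x: 1.0 if x > 2 else (-1.0 if x < 2 else 0.0),
    x=10.0, f_star=0.0, L=1.0,
)
```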
A Systematic Approach to Constructing Incremental Topology Control Algorithms Using Graph Transformation
Communication networks form the backbone of our society. Topology control
algorithms optimize the topology of such communication networks. Due to the
importance of communication networks, a topology control algorithm should
guarantee certain required consistency properties (e.g., connectivity of the
topology), while achieving desired optimization properties (e.g., a bounded
number of neighbors). Real-world topologies are dynamic (e.g., because nodes
join, leave, or move within the network), which requires topology control
algorithms to operate in an incremental way, i.e., based on the recently
introduced modifications of a topology. Visual programming and specification
languages are a proven means for specifying the structure as well as
consistency and optimization properties of topologies. In this paper, we
present a novel methodology, based on a visual graph transformation and graph
constraint language, for developing incremental topology control algorithms
that are guaranteed to fulfill a set of specified consistency and optimization
constraints. More specifically, we model the possible modifications of a
topology control algorithm and the environment using graph transformation
rules, and we describe consistency and optimization properties using graph
constraints. On this basis, we apply and extend a well-known constructive
approach to derive refined graph transformation rules that preserve these graph
constraints. We apply our methodology to re-engineer an established topology
control algorithm, kTC, and evaluate it in a network simulation study to show
the practical applicability of our approach.
Comment: This document corresponds to the accepted manuscript of the referenced journal article.
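The core idea of constraint-preserving rule application can be illustrated with a toy example (not kTC itself): a transformation rule, here edge removal, fires only if a consistency constraint, here connectivity, still holds on the resulting topology.

```python
# Toy constraint-preserving rule application (illustrative, not kTC):
# the rule removes an edge; the constraint is connectivity; the rule
# is applied only when the constraint survives.

def connected(nodes, edges):
    """Depth-first reachability check over an undirected edge set."""
    if not nodes:
        return True
    seen, stack = set(), [next(iter(nodes))]
    while stack:
        n = stack.pop()
        if n in seen:
            continue
        seen.add(n)
        stack.extend(v for u, v in edges if u == n)
        stack.extend(u for u, v in edges if v == n)
    return seen == set(nodes)

def apply_rule_if_consistent(nodes, edges, edge):
    """Remove `edge` only when connectivity is preserved."""
    candidate = edges - {edge}
    return candidate if connected(nodes, candidate) else edges

nodes = {1, 2, 3}
triangle = {(1, 2), (2, 3), (1, 3)}
pruned = apply_rule_if_consistent(nodes, triangle, (1, 3))  # rule fires
kept = apply_rule_if_consistent(nodes, pruned, (1, 2))      # rule blocked
```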
Automatic generation of simplified weakest preconditions for integrity constraint verification
Given a constraint C assumed to hold on a database D and an update u to be
performed on D, we address the following question: will C still hold after u
is performed? When D is a relational database, we define a confluent
terminating rewriting system which, starting from C and u, automatically
derives a simplified weakest precondition W such that, whenever D satisfies
W, the updated database will satisfy C; moreover, W is simplified in the
sense that its computation depends only upon the instances of C that may be
modified by the update. We then extend the definition of a simplified W to
the case of deductive databases, and we prove its correctness using fixpoint
induction.
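The notion of a simplified weakest precondition can be illustrated with a toy relational example (names, constraint, and update are assumptions, not the paper's rewriting system): for the constraint "every salary is at most CAP" and the update "raise employee e's salary by r", the simplified precondition inspects only the tuple the update can touch, rather than rescanning the whole table.

```python
# Toy simplified weakest precondition (illustrative names, not the
# paper's rewriting system). Constraint C: every salary is at most
# CAP. Update u: raise employee e's salary by r. The simplified
# precondition W tests only the tuple that u can modify.

CAP = 100

def constraint(db):                       # C, checked over the whole table
    return all(s <= CAP for s in db.values())

def simplified_wp(db, e, r):              # W, checked on one tuple only
    return db.get(e, 0) + r <= CAP

def update(db, e, r):                     # u
    db2 = dict(db)
    db2[e] = db2.get(e, 0) + r
    return db2

db = {"ann": 90, "bob": 50}
if simplified_wp(db, "bob", 30):          # W holds, so u preserves C
    db = update(db, "bob", 30)
```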
Mechanized semantics
The goal of this lecture is to show how modern theorem provers---in this
case, the Coq proof assistant---can be used to mechanize the specification of
programming languages and their semantics, and to reason over individual
programs and over generic program transformations, as typically found in
compilers. The topics covered include: operational semantics (small-step,
big-step, definitional interpreters); a simple form of denotational semantics;
axiomatic semantics and Hoare logic; generation of verification conditions,
with application to program proof; compilation to virtual machine code and its
proof of correctness; an example of an optimizing program transformation (dead
code elimination) and its proof of correctness.
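The lecture itself works in Coq; as a purely illustrative analogue, the "definitional interpreter" style it covers can be sketched in Python as a big-step evaluator for a tiny expression language.

```python
# Big-step "definitional interpreter" for a tiny expression language,
# sketched in Python as an analogue of the Coq development in the
# lecture (illustrative only).
from dataclasses import dataclass

@dataclass
class Const:
    n: int

@dataclass
class Add:
    left: object
    right: object

@dataclass
class Mul:
    left: object
    right: object

def eval_expr(e):
    """Big-step semantics: map each expression directly to its value."""
    if isinstance(e, Const):
        return e.n
    if isinstance(e, Add):
        return eval_expr(e.left) + eval_expr(e.right)
    if isinstance(e, Mul):
        return eval_expr(e.left) * eval_expr(e.right)
    raise TypeError("unknown expression form")

value = eval_expr(Add(Const(2), Mul(Const(3), Const(4))))  # 2 + 3 * 4
```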