Electrical control of the linear optical properties of particulate composite materials
The Bruggeman formalism for the homogenization of particulate composite
materials is used to predict the effective permittivity dyadic of a
two-constituent composite material with one constituent having the ability to
display the Pockels effect. Scenarios wherein the constituent particles are
randomly oriented, oriented spheres, and oriented spheroids are numerically
explored. Thereby, homogenized composite materials (HCMs) are envisaged whose
constitutive parameters may be continuously varied through the application of a
low-frequency (dc) electric field. The greatest degree of control over the HCM
constitutive parameters is achievable when the constituents comprise oriented
and highly aspherical particles and have high electro-optic coefficients.
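For orientation, in the simplest case of an isotropic two-constituent composite with spherical particles, the Bruggeman condition takes the familiar textbook form (a standard statement of the formalism, supplied here for illustration and not quoted from the paper; $f_{a}$, $f_{b}$ are volume fractions and $\varepsilon_{\mathrm{HCM}}$ is the effective permittivity):

```latex
f_a \,\frac{\varepsilon_a - \varepsilon_{\mathrm{HCM}}}{\varepsilon_a + 2\,\varepsilon_{\mathrm{HCM}}}
\;+\;
f_b \,\frac{\varepsilon_b - \varepsilon_{\mathrm{HCM}}}{\varepsilon_b + 2\,\varepsilon_{\mathrm{HCM}}}
\;=\; 0,
\qquad f_a + f_b = 1.
```

The dyadic version studied in the paper generalizes this scalar equation to anisotropic constituents and oriented spheroidal particle shapes.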
Object Grammars: Compositional & Bidirectional Mapping Between Text and Graphs
Abstract: Object Grammars define mappings between text and object graphs. Parsing recognizes syntactic features and creates the corresponding object structure. In the reverse direction, formatting recognizes object graph features and generates an appropriate textual presentation. The key to Object Grammars is the expressive power of the mapping, which decouples the syntactic structure from the graph structure. To handle graphs, Object Grammars support declarative annotations for resolving textual names that refer to arbitrary objects in the graph structure. Predicates on the semantic structure provide additional control over the mapping. Furthermore, Object Grammars are compositional so that languages may be defined in a modular fashion. We have implemented our approach to Object Grammars as one of the foundations of the Ensō system and illustrate the utility of our approach by showing how it enables definition and composition of domain-specific languages (DSLs).
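The core idea, one declaration driving both directions of the text-object mapping, can be sketched minimally as follows. This is an illustrative toy, not the Ensō/Object Grammar API; the rule class, pattern, and `Point` type are all of our own making:

```python
import re

class Point:
    def __init__(self, x, y):
        self.x, self.y = x, y

# One grammar "rule" bundles the parse and format directions, so the
# textual syntax and the object structure cannot drift apart.
class PointRule:
    pattern = re.compile(r"point\s+(-?\d+)\s+(-?\d+)")

    def parse(self, text):
        # Text -> object: recognize syntactic features, build the object.
        m = self.pattern.fullmatch(text.strip())
        if m is None:
            raise ValueError("no match")
        return Point(int(m.group(1)), int(m.group(2)))

    def format(self, obj):
        # Object -> text: regenerate a presentation from graph features.
        return f"point {obj.x} {obj.y}"

rule = PointRule()
p = rule.parse("point 3 -4")
assert rule.format(p) == "point 3 -4"   # round-trips
```

A real Object Grammar additionally handles cross-references into the object graph and predicates, which this flat sketch omits.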
Managed Data: Modular Strategies for Data Abstraction
Managed Data is a two-level approach to data abstraction in which programmers first define data description and manipulation mechanisms, and then use these mechanisms to define specific kinds of data. Managed Data allows programmers to take control of many important aspects of data, including persistence, access/change control, reactivity, logging, bidirectional relationships, resource management, invariants and validation. These features are implemented once as reusable strategies that can apply to many different data types. Managed Data is a general concept that can be implemented in several ways, including reflection, metaclasses, and macros. In this paper we argue for the importance of Managed Data and present a novel implementation of Managed Data based on interpretation of data models. We show how to inherit and compose interpreters to implement the features described above. Our approach allows Managed Data to be used in object-oriented languages that support reflection over field access (overriding the "dot" operator) or dynamic method creation. We also show how self-describing data models are useful for bootstrapping, allowing Managed Data to be used in the definition of Data Managers themselves. As a case study, we used Managed Data in a web development framework from the Ensō project to reuse database management and access control mechanisms across different data definitions.
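Since Python supports reflection over field access, the interpretation idea can be sketched there: a schema (the "data description") is interpreted by a manager that intercepts the "dot" operator, here composing a validation strategy with a logging strategy. This is a minimal illustration, not the Ensō implementation; `ManagedRecord` and the schema shape are our own:

```python
class ManagedRecord:
    """Interprets a schema dict {field_name: type} as a managed data type."""

    def __init__(self, schema):
        # Bypass our own __setattr__ while wiring up internal state.
        object.__setattr__(self, "_schema", schema)
        object.__setattr__(self, "_fields", {})
        object.__setattr__(self, "log", [])

    def __setattr__(self, name, value):
        # Validation strategy: reject unknown fields and wrong types.
        if name not in self._schema:
            raise AttributeError(f"unknown field: {name}")
        if not isinstance(value, self._schema[name]):
            raise TypeError(f"{name} expects {self._schema[name].__name__}")
        # Logging strategy, composed on top of validation.
        self.log.append(("set", name, value))
        self._fields[name] = value

    def __getattr__(self, name):
        # Called only when normal lookup fails: serve managed fields.
        fields = object.__getattribute__(self, "_fields")
        if name in fields:
            return fields[name]
        raise AttributeError(name)

point_schema = {"x": int, "y": int}
p = ManagedRecord(point_schema)
p.x = 3
p.y = 4
assert (p.x, p.y) == (3, 4)
assert p.log[0] == ("set", "x", 3)   # every write was logged
```

The same `ManagedRecord` interpreter serves any schema, which is the point: persistence, access control, and similar strategies are written once against the data model rather than per data type.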
Ranking Templates for Linear Loops
We present a new method for the constraint-based synthesis of termination
arguments for linear loop programs based on linear ranking templates. Linear
ranking templates are parametrized, well-founded relations such that an
assignment to the parameters gives rise to a ranking function. This approach
generalizes existing methods and enables us to use templates for many different
ranking functions with affine-linear components. We discuss templates for
multiphase, piecewise, and lexicographic ranking functions. Because these
ranking templates require both strict and non-strict inequalities, we use
Motzkin's Transposition Theorem instead of Farkas' Lemma to transform the
generated ∃∀-constraint into an ∃-constraint.
Comment: TACAS 2014
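As a concrete instance of the idea, the simplest affine ranking template has the following standard shape (shown for illustration, not quoted from the paper); the parameters are $\vec{r}$ and $r_0$, and $T(x, x')$ is the loop's transition relation:

```latex
f(x) = \vec{r}^{\,\mathsf{T}} x + r_0,
\qquad
\forall x, x' :\; T(x, x') \;\Longrightarrow\; f(x) \ge 0 \;\wedge\; f(x') \le f(x) - 1 .
```

The bound $f(x) \ge 0$ is a non-strict inequality while the decrease condition enforces a strict drop, so the synthesized constraint mixes both kinds, which is exactly why Motzkin's Transposition Theorem is needed where Farkas' Lemma (non-strict inequalities only) would not suffice.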
SCC: A Service Centered Calculus
We seek a small set of primitives that might serve as a basis for formalising and programming service-oriented applications over global computers. As an outcome of this study we introduce here SCC, a process calculus that features explicit notions of service definition, service invocation and session handling. Our proposal has been influenced by Orc, a programming model for structured orchestration of services, but SCC's session-handling mechanism allows for the definition of structured interaction protocols, more complex than the basic request-response provided by Orc. We present the syntax and operational semantics of SCC and a number of simple but nontrivial programming examples that demonstrate the flexibility of the chosen set of primitives. A few encodings are also provided to relate our proposal with existing ones.
Unrestricted Termination and Non-Termination Arguments for Bit-Vector Programs
Proving program termination is typically done by finding a well-founded
ranking function for the program states. Existing termination provers typically
find ranking functions using either linear algebra or templates. As such they
are often restricted to finding linear ranking functions over mathematical
integers. This class of functions is insufficient for proving termination of
many terminating programs, and furthermore a termination argument for a program
operating on mathematical integers does not always lead to a termination
argument for the same program operating on fixed-width machine integers. We
propose a termination analysis able to generate nonlinear, lexicographic
ranking functions and nonlinear recurrence sets that are correct for
fixed-width machine arithmetic and floating-point arithmetic. Our technique is
based on a reduction from program \emph{termination} to second-order
\emph{satisfaction}. We provide formulations for termination and
non-termination in a fragment of second-order logic with restricted
quantification, which is decidable over finite domains. The resulting technique
is a sound and complete analysis for the termination of finite-state programs
with fixed-width integers and IEEE floating-point arithmetic.
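The gap between mathematical and machine integers that motivates this work is easy to exhibit. The toy simulation below (our own illustration, not the paper's analysis; `wrap32` and `steps_until_exit` are invented names) shows a loop that diverges over unbounded integers yet terminates under two's-complement 32-bit wrapping:

```python
def wrap32(v):
    """Reduce v to the signed two's-complement 32-bit range."""
    v &= 0xFFFFFFFF
    return v - 0x100000000 if v >= 0x80000000 else v

def steps_until_exit(x):
    # Over mathematical integers, `while x > 0: x += 1` never terminates
    # for positive x. Over wrapping 32-bit integers it always does:
    # x climbs to INT_MAX and then wraps around to INT_MIN.
    n = 0
    while x > 0:
        x = wrap32(x + 1)
        n += 1
    return n

assert wrap32(2**31) == -2**31            # INT_MAX + 1 wraps to INT_MIN
assert steps_until_exit(2**31 - 3) == 3   # exits after crossing INT_MAX
```

A linear ranking argument valid over the mathematical integers says nothing about this behavior, which is why the paper insists on ranking functions and recurrence sets that are correct for the fixed-width semantics itself.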
Alternating runtime and size complexity analysis of integer programs
We present a modular approach to automatic complexity analysis. Based on a novel alternation between finding symbolic time bounds for program parts and using these to infer size bounds on program variables, we can restrict each analysis step to a small part of the program while maintaining a high level of precision. Extensive experiments with the implementation of our method demonstrate its performance and power in comparison with other tools.
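The alternation can be seen on a toy two-loop program (our own illustration, not taken from the paper or its tool): a time bound for the first loop yields a size bound on `i`, and that size bound in turn yields a time bound for the second loop.

```python
def program(n):
    steps = 0
    i = 0
    while i < n:        # time bound: at most n iterations
        i += 2          # hence size bound afterwards: i <= n + 1
        steps += 1
    j = i
    while j > 0:        # time bound: at most n + 1 iterations,
        j -= 1          # obtained from the size bound on i
        steps += 1
    return steps

# Total runtime stays within the composed symbolic bound n + (n + 1).
assert program(10) == 15
assert program(10) <= 10 + 11
```

Each bound was derived by looking at one loop in isolation, which is the modularity the abstract claims: neither loop had to be analyzed together with the other.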
- …