Two-dimensional representation of the Dirac equation in a non-associative algebra
In this note a simple extension of the complex algebra to higher dimensions is proposed. Using the postulated algebra, a two-dimensional Dirac equation is formulated and its solution is calculated. It is found that there is a sub-algebra in which the associative nature can be recovered.
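For reference, the standard two-dimensional (1+1-dimensional) Dirac equation that any such algebra must reproduce can be written with a Pauli-matrix representation of the gamma matrices (one common choice; conventions vary):

```latex
% 1+1-dimensional Dirac equation, metric signature (+,-)
i\gamma^{\mu}\partial_{\mu}\psi = m\psi, \qquad
\gamma^{0} = \sigma^{3}, \quad \gamma^{1} = i\sigma^{2}, \qquad
\{\gamma^{\mu}, \gamma^{\nu}\} = 2\eta^{\mu\nu}\mathbb{1}, \quad \eta = \mathrm{diag}(1,-1)
```

Here the Clifford relation is satisfied by 2x2 matrices, which is why a two-component spinor suffices in two dimensions; the note's contribution concerns the underlying algebra, not this standard representation.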
Boxy Types: Inference for Higher-Rank Types and Impredicativity
Languages with rich type systems are beginning to employ a blend of type inference and type checking, so that the type inference engine is guided by programmer-supplied type annotations. In this paper we show, for the first time, how to combine the virtues of two well-established ideas: unification-based inference, and bidirectional propagation of type annotations. The result is a type system that conservatively extends Hindley-Milner, and yet supports both higher-rank types and impredicativity
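The blend described here (unification-based inference guided by pushed-in, programmer-supplied annotations) has a loose everyday analogue in Java's target typing for lambdas. The sketch below (hypothetical `TargetTyping`/`apply` names; the paper's setting is a Haskell-like language, not Java) shows the two directions of information flow:

```java
import java.util.function.Function;

public class TargetTyping {
    // A rank-1 polymorphic helper; A is fixed by unification with the arguments.
    static <A> A apply(Function<A, A> f, A x) { return f.apply(x); }

    public static void main(String[] args) {
        // "Checking" direction: the annotation Function<Integer, Integer> is
        // pushed into the lambda, so `x -> x + 1` is elaborated at Integer.
        Function<Integer, Integer> inc = x -> x + 1;

        // "Inference" direction: no annotation at the call site; unifying with
        // the argument "hello" instantiates A = String, which then types the lambda.
        String s = apply(f -> f, "hello");

        System.out.println(inc.apply(41) + " " + s); // 42 hello
    }
}
```

Boxy Types goes well beyond this analogy: it handles higher-rank types (function arguments that are themselves polymorphic) and impredicative instantiation, neither of which Java's inference supports; the point here is only the interplay of the two directions.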
Family Genericity
Type abstraction in object-oriented languages embodies two techniques, each with its own strengths and weaknesses. The first technique is extension, yielding abstraction mechanisms with good support for gradual specification. The prime example is inheritance. The second technique is functional abstraction, yielding more precise knowledge about the outcome. The prime example is type-parameterized classes. This paper argues that these techniques should be clearly separated to work optimally, and also that current languages fail to do this. We have applied this design philosophy to a language based on an extension mechanism, namely virtual classes. As a result, some elements based on functional abstraction have been introduced, but they are simple and used only for the things at which they excel; conversely, the virtual classes have become more flexible, because their role is now better defined. We designate the result as family genericity. The presented language design has been implemented.
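The trade-off between the two techniques can be illustrated in plain Java, which has inheritance and generics but not virtual classes (all names below are hypothetical, and this is only a contrast of the two mechanisms, not a rendering of the paper's language):

```java
// Extension (inheritance): supports gradual specification, but a client
// typed at the base class only sees the base signature.
class Node {
    Node self() { return this; }
}

class ColoredNode extends Node {
    String color = "red";
    @Override ColoredNode self() { return this; } // covariant return refines gradually
}

// Functional abstraction (a type-parameterized class): the outcome type is
// known precisely up front, but the abstraction is not refined by subclassing.
class Box<N> {
    private final N contents;
    Box(N contents) { this.contents = contents; }
    N get() { return contents; }
}

public class FamilyDemo {
    public static void main(String[] args) {
        Node viaExtension = new ColoredNode();
        // viaExtension.self().color;        // does not compile: static type is Node

        Box<ColoredNode> viaGenerics = new Box<>(new ColoredNode());
        System.out.println(viaGenerics.get().color); // precise: get() returns ColoredNode
    }
}
```

The paper's thesis is that each mechanism should be kept to the role where it excels, rather than stretching one to cover both.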
Universally coupled massive gravity, III: dRGT-Maheshwari pure spin-2, Ogievetsky-Polubarinov and arbitrary mass terms
Einstein's equations were derived for a free massless spin-2 field using universal coupling in the 1950s-70s by various authors; the total stress-energy, including gravity's, served as a source for the linear free field equations. A
massive variant was likewise derived in the late 1960s by Freund, Maheshwari
and Schonberg, and thought to be unique. How broad is universal coupling? In the last decade four -parameter families of massive spin-2 theories (contravariant, covariant, tetrad, and cotetrad of almost any density weights) have been derived using universal coupling. The (co)tetrad derivations included 2 of the 3 pure spin-2 theories due to de Rham, Gabadadze, and Tolley; those two theories first appeared in the -parameter Ogievetsky-Polubarinov family (1965), which developed the symmetric square root of the metric as a nonlinear group realization. One of the two theories was identified as pure spin-2 by Maheshwari in 1971-2, thus evading the Boulware-Deser-Tyutin-Fradkin ghost by
the time it was announced. Unlike the previous 4 families, this paper permits
_nonlinear_ field redefinitions to build the effective metric. By not insisting
in advance on knowing the observable significance of the graviton potential to
all orders, one finds that an _arbitrary_ graviton mass term can be derived
using universal coupling. The arbitrariness of a universally coupled
mass/self-interaction term contrasts sharply with the uniqueness of the
Einstein kinetic term. One might have hoped to use universal coupling as a
tie-breaking criterion for choosing among theories that are equally
satisfactory on more crucial grounds (such as lacking ghosts and having a
smooth massless limit). But the ubiquity of universal coupling implies that the
criterion doesn't favor any particular theories among those with the Einstein
kinetic term.

I thank Stanley Deser, Andrew Tolley and Fawad Hassan for discussing Maheshwari's work, Alex Blum for acquainting me with the Bronstein paper and the Seligman (DeWitt) dissertation, and Jeremy Butterfield for discussion. This work was funded by the John Templeton Foundation, grant #38761. This is the author accepted manuscript. The final version is available from Elsevier via http://dx.doi.org/10.1016/j.aop.2015.12.00
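As standard background for the abstract above (not a result of this paper): the linearized one-parameter family of graviton mass terms can be written, up to normalization conventions, as

```latex
% h_{\mu\nu} = g_{\mu\nu} - \eta_{\mu\nu}, \qquad h = \eta^{\mu\nu} h_{\mu\nu}
\mathcal{L}_{\mathrm{mass}} = -\frac{m^{2}}{4}\left( h_{\mu\nu} h^{\mu\nu} - a\, h^{2} \right)
```

Only the Fierz-Pauli tuning a = 1 describes a pure spin-2 field at linear order; other values of a additionally propagate a scalar ghost, which is the linear-order version of the ghost problem the abstract mentions.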
Subtyping with Generics: A Unified Approach
Reusable software increases programmers' productivity and reduces repetitive code and software bugs. Variance is a key programming language mechanism for writing reusable software. Variance is concerned with the interplay of parametric polymorphism (i.e., templates, generics) and subtype (inclusion) polymorphism. Parametric polymorphism enables programmers to write abstract types and is known to enhance the readability, maintainability, and reliability of programs. Subtyping promotes software reuse by allowing code to be applied to a larger set of terms. Integrating parametric and subtype polymorphism while maintaining type safety is a difficult problem. Existing variance mechanisms enable greater subtyping between parametric types, but they suffer from severe deficiencies: they are unable to express several common type abstractions; they can cause a proliferation of types and redundant code; and they are difficult for programmers to use due to their inherent complexity. This dissertation aims to improve variance mechanisms in programming languages supporting parametric polymorphism. To address the shortcomings of current mechanisms, I will combine two popular approaches, definition-site variance and use-site variance, in a single programming language. I have developed formal languages, or calculi, for reasoning about variance. The calculi are example languages supporting both notions of definition-site and use-site variance. They enable stating precise properties that can be proved rigorously. The VarLang calculus demonstrates fundamental issues in variance from a language-neutral perspective. The VarJ calculus illustrates realistic complications by modeling a mainstream programming language, Java. VarJ not only supports both notions of use-site and definition-site variance but also language features with complex interactions with variance, such as F-bounded polymorphism and wildcard capture.
A mapping from Java to VarLang was implemented in software that infers definition-site variance for Java. Large, standard Java libraries (e.g., Oracle's JDK 1.6) were analyzed using the software to compute metrics measuring the benefits of adding definition-site variance to Java, which only supports use-site variance. Applying this technique to six Java generic libraries shows that 21-47% (depending on the library) of generic definitions are inferred to have single variance; 7-29% of method signatures can be relaxed through this inference, and up to 100% of existing wildcard annotations are unnecessary and can be elided. Although the VarJ calculus proposes how to extend Java with definition-site variance, no mainstream language currently supports both definition-site and use-site variance. To assist programmers with utilizing both notions with existing technology, I developed a refactoring tool that refactors Java code by inferring definition-site variance and adding wildcard annotations. This tool is practical and immediately applicable: it assumes no changes to the Java type system, while taking into account all its intricacies. The system allows users to select declarations (variables, method parameters, return types, etc.) to generalize and considers declarations not declared in available source code. I evaluated the technique on six Java generic libraries. I found that 34% of available declarations of variant type signatures can be generalized, i.e., relaxed with more general wildcard types. On average, 146 other declarations need to be updated when a declaration is generalized, showing that this refactoring would be too tedious and error-prone to perform manually. The result of applying this refactoring is a more general interface that supports greater software reuse.
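The kind of signature generalization the refactoring performs can be shown with Java's use-site variance (wildcards); the `sum` method below is a hypothetical example, not taken from the dissertation's corpus:

```java
import java.util.List;

public class WildcardDemo {
    // Before generalization: invariant parameter type; only an exact
    // List<Number> is accepted, so List<Integer> callers are rejected.
    static double sumExact(List<Number> xs) {
        double total = 0;
        for (Number n : xs) total += n.doubleValue();
        return total;
    }

    // After generalization: `? extends Number` is a use-site covariance
    // annotation, so any List<T> with T a subtype of Number is accepted.
    // This is safe because the method only reads from the list.
    static double sum(List<? extends Number> xs) {
        double total = 0;
        for (Number n : xs) total += n.doubleValue();
        return total;
    }

    public static void main(String[] args) {
        List<Integer> ints = List.of(1, 2, 3);
        // sumExact(ints);            // does not compile: List<Integer> is not List<Number>
        System.out.println(sum(ints)); // 6.0
    }
}
```

Definition-site variance inference automates exactly this judgment: a type parameter used only in "producer" (output) positions can be treated as covariant, so the wildcard becomes redundant and, as the abstract notes, can often be elided.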
A framework for substructural type systems
Mechanisation of programming language research is of growing interest, and the act of mechanising type systems and their metatheory is generally becoming easier as new techniques are invented. However, state-of-the-art techniques mostly rely on structurality of the type system --- that weakening, contraction, and exchange are admissible and variables can be used unrestrictedly once assumed. Linear logic, and many related subsequent systems, provide motivations for breaking some of these assumptions. We present a framework for mechanising the metatheory of certain substructural type systems, in a style resembling mechanised metatheory of structural type systems. The framework covers a wide range of simply typed syntaxes with semiring usage annotations, via a metasyntax of typing rules. The metasyntax for the premises of a typing rule is related to bunched logic, featuring both sharing and separating conjunction, roughly corresponding to the additive and multiplicative features of linear logic. We use the uniformity of syntaxes to derive type system-generic renaming, substitution, and a form of linearity checking
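A common concrete instance of such semiring usage annotations is the {0, 1, ω} linearity semiring: addition combines the usages of a variable across shared premises, and multiplication scales usages under substitution. The sketch below (hypothetical Java names; the framework itself is a mechanised metatheory, not a Java library) implements that semiring:

```java
// The {0, 1, ω} usage semiring often used for linearity checking:
// 0 = unused, 1 = used exactly once, MANY (ω) = used unrestrictedly.
enum Usage {
    ZERO, ONE, MANY;

    // Addition: combines usages of one variable across sharing premises.
    Usage plus(Usage other) {
        if (this == ZERO) return other;
        if (other == ZERO) return this;
        return MANY; // 1 + 1 = ω, and ω absorbs everything else
    }

    // Multiplication: scales a usage when substituting under a binder.
    Usage times(Usage other) {
        if (this == ZERO || other == ZERO) return ZERO; // 0 annihilates
        if (this == ONE) return other;                  // 1 is the unit
        if (other == ONE) return this;
        return MANY;
    }
}

public class UsageDemo {
    public static void main(String[] args) {
        // A linear (usage-1) variable consumed by both halves of a sharing
        // conjunction ends up at ω, flagging a linearity violation.
        System.out.println(Usage.ONE.plus(Usage.ONE));   // MANY
        System.out.println(Usage.MANY.times(Usage.ZERO)); // ZERO
    }
}
```

The framework's generic renaming and substitution lemmas are parameterized over such a semiring, which is what lets one development cover linear, affine, and graded systems uniformly.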
Grothendieck groups and a categorification of additive invariants
A topologically invariant and additive homology class is in general not a natural transformation as it stands. In this paper we discuss turning such a homology class into a natural transformation, i.e., a "categorification" of it. In a general
categorical set-up we introduce a generalized relative Grothendieck group from
a cospan of functors of categories and also consider a categorification of
additive invariants on objects. As an example, we obtain a general theory of
characteristic homology classes of singular varieties.

Comment: 27 pages, to appear in International J. Mathematics