
    Applications and extensions of context-sensitive rewriting

    [EN] Context-sensitive rewriting is a restriction of term rewriting which is obtained by imposing replacement restrictions on the arguments of function symbols. It has proven useful to analyze computational properties of programs written in sophisticated rewriting-based programming languages such as CafeOBJ, Haskell, Maude, OBJ*, etc. Also, a number of extensions (e.g., to conditional rewriting or constrained equational systems) and generalizations (e.g., controlled rewriting or forbidden patterns) of context-sensitive rewriting have been proposed. In this paper, we provide an overview of these applications and related issues. (C) 2021 Elsevier Inc. All rights reserved. Partially supported by the EU (FEDER) and projects RTI2018-094403-B-C32 and PROMETEO/2019/098. Lucas Alba, S. (2021). Applications and extensions of context-sensitive rewriting. Journal of Logical and Algebraic Methods in Programming 121:1-33. https://doi.org/10.1016/j.jlamp.2021.100680
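
    The core idea lends itself to a compact illustration. Below is a minimal sketch in Python (rather than any of the languages named above): terms are nested tuples, and a replacement map forbids rewriting inside the second argument of cons, the classic lazy-list example. The rules and map are illustrative inventions, not examples from the paper.

        # Minimal sketch of context-sensitive rewriting (CSR). Terms are
        # ("symbol", arg1, ...); MU maps each symbol to the argument
        # positions (0-based) where rewriting is allowed. MU freezes the
        # second argument of cons, so the infinite list is never forced.
        MU = {"cons": [0], "from": [0], "s": [0], "head": [0]}

        def step_root(t):
            # from(x) -> cons(x, from(s(x)))
            if t[0] == "from":
                return ("cons", t[1], ("from", ("s", t[1])))
            # head(cons(x, xs)) -> x
            if t[0] == "head" and t[1][0] == "cons":
                return t[1][1]
            return None

        def csr_step(t):
            """One CSR step: try the root, then only MU-allowed arguments."""
            r = step_root(t)
            if r is not None:
                return r
            for i in MU.get(t[0], []):
                r = csr_step(t[i + 1])
                if r is not None:
                    return t[:i + 1] + (r,) + t[i + 2:]
            return None

        # head(from(0)) reaches a normal form under CSR, although the
        # unrestricted rewrite relation admits an infinite reduction.
        t = ("head", ("from", ("0",)))
        while t is not None:
            print(t)
            t = csr_step(t)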

    Analysing Parallel Complexity of Term Rewriting

    We revisit parallel-innermost term rewriting as a model of parallel computation on inductive data structures and provide a corresponding notion of runtime complexity parametric in the size of the start term. We propose automatic techniques to derive both upper and lower bounds on the parallel complexity of rewriting that enable a direct reuse of existing techniques for sequential complexity. The applicability and the precision of the method are demonstrated by the relatively light effort in extending the program analysis tool AProVE and by experiments on numerous benchmarks from the literature. Comment: Extended authors' accepted manuscript for a paper accepted for publication in the Proceedings of the 32nd International Symposium on Logic-based Program Synthesis and Transformation (LOPSTR 2022). 27 pages
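
    To make the model concrete: in one parallel-innermost step, every innermost redex (a redex with no redex strictly below it) is contracted simultaneously, and parallel runtime counts such steps. The Python sketch below runs a standard tree-size system, a toy chosen by us rather than a benchmark from the paper, and reports the number of parallel steps to normal form.

        # Parallel-innermost rewriting: contract ALL innermost redexes at
        # once. Terms are nested tuples; the TRS (tree size with Peano
        # addition) is a textbook example, not taken from the paper.
        def contract(t):
            """Root step of the example TRS, or None if no rule applies."""
            if t[0] == "size":
                if t[1][0] == "leaf":          # size(leaf) -> 0
                    return ("0",)
                if t[1][0] == "node":          # size(node(l,r)) -> s(plus(size(l),size(r)))
                    return ("s", ("plus", ("size", t[1][1]), ("size", t[1][2])))
            if t[0] == "plus":
                if t[1][0] == "0":             # plus(0,y) -> y
                    return t[2]
                if t[1][0] == "s":             # plus(s(x),y) -> s(plus(x,y))
                    return ("s", ("plus", t[1][1], t[2]))
            return None

        def par_step(t):
            """One parallel step; returns (new term, anything changed?)."""
            args, changed = [], False
            for a in t[1:]:
                a2, c = par_step(a)
                args.append(a2)
                changed |= c
            if not changed:                    # arguments are normal forms,
                r = contract((t[0], *args))    # so a root redex is innermost
                if r is not None:
                    return r, True
            return (t[0], *args), changed

        t, steps = ("size", ("node", ("node", ("leaf",), ("leaf",)), ("leaf",))), 0
        while True:
            t, changed = par_step(t)
            if not changed:
                break
            steps += 1
        print(steps, t)                        # 6 parallel steps, s(s(0))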

    On Complexity Bounds and Confluence of Parallel Term Rewriting

    We revisit parallel-innermost term rewriting as a model of parallel computation on inductive data structures and provide a corresponding notion of runtime complexity parametric in the size of the start term. We propose automatic techniques to derive both upper and lower bounds on the parallel complexity of rewriting that enable a direct reuse of existing techniques for sequential complexity. Our approach to finding lower bounds requires confluence of the parallel-innermost rewrite relation, so we also provide effective sufficient criteria for proving confluence. The applicability and the precision of the method are demonstrated by the relatively light effort in extending the program analysis tool AProVE and by experiments on numerous benchmarks from the literature. Comment: Under submission to Fundamenta Informaticae. arXiv admin note: substantial text overlap with arXiv:2208.0100
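
    The paper's confluence criteria target the parallel-innermost relation specifically; purely as an illustration of the style of syntactic criterion involved, the Python sketch below checks orthogonality (left-linearity plus absence of overlaps between left-hand sides), a classic sufficient condition for confluence of a term rewrite system. The term encoding and example rules are ours, and the occurs check is omitted for brevity.

        # Orthogonality check: every rule left-linear, no two left-hand
        # sides overlap. Variables are lowercase strings; function terms
        # are tuples ("f", arg1, ...).
        def vars_of(t):
            return [t] if isinstance(t, str) else [v for a in t[1:] for v in vars_of(a)]

        def left_linear(lhs):
            vs = vars_of(lhs)
            return len(vs) == len(set(vs))

        def rename(t, suffix):
            """Rename variables apart so two rules never share a variable."""
            if isinstance(t, str):
                return t + suffix
            return (t[0], *(rename(a, suffix) for a in t[1:]))

        def unify(s, t, sub):
            """Syntactic unification (occurs check omitted); dict or None."""
            while isinstance(s, str) and s in sub: s = sub[s]
            while isinstance(t, str) and t in sub: t = sub[t]
            if s == t:
                return sub
            if isinstance(s, str):
                sub[s] = t; return sub
            if isinstance(t, str):
                sub[t] = s; return sub
            if s[0] != t[0] or len(s) != len(t):
                return None
            for a, b in zip(s[1:], t[1:]):
                sub = unify(a, b, sub)
                if sub is None:
                    return None
            return sub

        def subterms(t):
            """Non-variable subterms, flagged with whether each is the root."""
            yield t, True
            stack = list(t[1:])
            while stack:
                s = stack.pop()
                if not isinstance(s, str):
                    yield s, False
                    stack.extend(s[1:])

        def orthogonal(rules):
            if not all(left_linear(l) for l, _ in rules):
                return False
            for i, (l1, _) in enumerate(rules):
                for j, (l2, _) in enumerate(rules):
                    l2 = rename(l2, "'")       # variable-disjoint copy
                    for s, at_root in subterms(l1):
                        if i == j and at_root:
                            continue           # a rule trivially overlaps itself
                        if unify(s, l2, {}) is not None:
                            return False       # overlap => critical pair
            return True

        # plus(0,y) -> y ; plus(s(x),y) -> s(plus(x,y))
        rules = [(("plus", ("0",), "y"), "y"),
                 (("plus", ("s", "x"), "y"), ("s", ("plus", "x", "y")))]
        print(orthogonal(rules))               # True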

    Reasoning about correctness properties of a coordination programming language

    Safety-critical systems place additional requirements on the programming language used to implement them compared with traditional environments. Examples of features that influence the suitability of a programming language in such environments include complexity of definitions, expressive power, bounded space and time, and verifiability. Hume is a novel programming language whose design targets the first three of these, in some ways contradictory, features: fully expressive languages cannot guarantee bounds on time and space, while low-level languages that can guarantee space and time bounds are often complex and thus error-prone. In Hume, this contradiction is resolved by a two-layered architecture: a high-level, fully expressive language is built on top of a low-level coordination language which can guarantee space and time bounds. This thesis explores the verification of Hume programs. It targets safety properties, which are the most important type of correctness properties, of the low-level coordination language, which is believed to be the most error-prone. Deductive verification in Lamport's temporal logic of actions (TLA) is utilised, in turn validated through algorithmic experiments. This deductive verification is mechanised by first embedding TLA in the Isabelle theorem prover, and then embedding Hume on top of this. Verification of temporal invariants is explored in this setting. In Hume, program transformation is a key feature, often required to guarantee space and time bounds of high-level constructs. Verification of transformations is thus an integral part of this thesis. The work with both invariant verification and, in particular, transformation verification has pinpointed several weaknesses of the Hume language. Motivated and influenced by this, an extension to Hume, called Hierarchical Hume, is developed and embedded in TLA. Several case studies of transformation and invariant verification of Hierarchical Hume in Isabelle are conducted, and an approach towards a calculus for transformations is examined. James Watt Scholarship; Engineering and Physical Sciences Research Council (EPSRC) Platform grant GR/SO177
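
    For readers unfamiliar with TLA, a safety property there has the shape Init /\ [][Next]_v => []Inv: every state reachable from an initial state via the next-state relation satisfies the invariant. The thesis discharges such obligations deductively in Isabelle; purely to illustrate what the obligation asserts, the Python sketch below checks an invariant by exploring the reachable states of a hypothetical producer/consumer system, a stand-in for a Hume box with a bounded buffer.

        # A TLA-style safety check, Init /\ [][Next] => []Inv, done by
        # explicit state exploration rather than the deductive Isabelle
        # proofs used in the thesis. The system and bound are hypothetical.
        from collections import deque

        BOUND = 3

        def init():
            return {(0, 0)}                    # state: (produced, consumed)

        def next_states(s):
            p, c = s
            if p - c < BOUND:                  # produce only if buffer not full
                yield (p + 1, c)
            if c < p:                          # consume only if buffer not empty
                yield (p, c + 1)

        def inv(s):
            p, c = s
            return 0 <= p - c <= BOUND         # occupancy stays within bounds

        def check(limit=10_000):
            """Return a counterexample state, or None if none is found
            among the first `limit` reachable states."""
            seen, todo = set(init()), deque(init())
            while todo and len(seen) < limit:
                s = todo.popleft()
                if not inv(s):
                    return s
                for t in next_states(s):
                    if t not in seen:
                        seen.add(t); todo.append(t)
            return None

        print(check())                         # None: no violation found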

    Proof-theoretic Semantics for Intuitionistic Multiplicative Linear Logic

    This work is the first exploration of proof-theoretic semantics for a substructural logic. It focuses on the base-extension semantics (B-eS) for intuitionistic multiplicative linear logic (IMLL). The starting point is a review of Sandqvist’s B-eS for intuitionistic propositional logic (IPL), for which we propose an alternative treatment of conjunction that takes the form of the generalized elimination rule for the connective. The resulting semantics is shown to be sound and complete. This motivates our main contribution, a B-eS for IMLL, in which the definitions of the logical constants all take the form of their elimination rule and for which soundness and completeness are established.
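
    To indicate the shape of such a definition: in base-extension semantics, support is relative to a base of atomic rules, and a clause in generalized-elimination form quantifies over extensions of the base and atomic conclusions. The clause below for IPL conjunction is reconstructed from the abstract's description alone; the notation is ours and may differ from the paper.

        % Conjunction clause in generalized-elimination form (a
        % reconstruction). \Vdash_X is support relative to a base X of
        % atomic rules; Y ranges over bases, p over atomic propositions.
        \Vdash_{X} A \wedge B
          \quad\text{iff}\quad
          \text{for all } Y \supseteq X \text{ and all atoms } p:\;
          \text{if } A, B \Vdash_{Y} p \text{, then } \Vdash_{Y} p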

    Laser-induced forward transfer (LIFT) of water soluble polyvinyl alcohol (PVA) polymers for use as support material for 3D-printed structures

    The additive microfabrication method of laser-induced forward transfer (LIFT) permits the creation of functional microstructures with feature sizes down to below a micrometre [1]. Compared to other additive manufacturing techniques, LIFT can be used to deposit a broad range of materials in a contactless fashion. LIFT offers the possibility of building out-of-plane features, but is currently limited to 2D or 2.5D structures [2–4]. That is because printing of 3D structures requires sophisticated printing strategies, such as mechanical support structures and post-processing, as the material to be printed is in the liquid phase. Therefore, we propose the use of water-soluble materials as a support (and sacrificial) material, which can easily be removed after printing by submerging the printed structure in water, without exposing the sample to more aggressive solvents or sintering treatments. Here, we present studies on LIFT printing of polyvinyl alcohol (PVA) polymer thin films via a picosecond pulsed laser source. Glass carriers are coated with a solution of PVA (donor) and, once dried, brought into proximity to a receiver substrate (glass, silicon). Focussing a laser pulse with a beam radius of 2 ”m at the interface of carrier and donor leads to the ejection of a small volume of PVA that is deposited on the receiver substrate. The effects of laser pulse fluence, donor film thickness and receiver material on the morphology (shape and size) of the deposits are studied. Adhesion of the deposits on the receiver is verified via deposition on various receiver materials and via a tape test. The solubility of PVA after laser irradiation is confirmed via dissolution in de-ionised water. Our study demonstrates the feasibility of printing PVA with the help of LIFT. The transfer process maintains the water solubility of the deposits, allowing their use as support material in LIFT printing of complex 3D structures. Future studies will investigate the compatibility (i.e. adhesion) of PVA with relevant donor materials, such as metals and functional polymers. References: [1] A. Piqué and P. Serra (2018) Laser Printing of Functional Materials. Weinheim, Germany: Wiley-VCH Verlag GmbH & Co. KGaA. [2] R. C. Y. Auyeung, H. Kim, A. J. Birnbaum, M. Zalalutdinov, S. A. Mathews, and A. Piqué (2009) Laser decal transfer of freestanding microcantilevers and microbridges, Appl. Phys. A, vol. 97, no. 3, pp. 513–519. [3] C. W. Visser, R. Pohl, C. Sun, G.-W. Römer, B. Huis in ’t Veld, and D. Lohse (2015) Toward 3D Printing of Pure Metals by Laser-Induced Forward Transfer, Adv. Mater., vol. 27, no. 27, pp. 4087–4092. [4] J. Luo et al. (2017) Printing Functional 3D Microdevices by Laser-Induced Forward Transfer, Small, vol. 13, no. 9, p. 1602553.
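
    The stated 2 ”m beam radius fixes the scale of the fluences involved. The quick Python calculation below estimates the peak fluence of a Gaussian beam via F = 2E/(πw²); the pulse energy is a hypothetical value chosen for illustration, not a figure from the abstract.

        # Peak fluence of a Gaussian beam: F = 2 * E_p / (pi * w^2).
        import math

        w = 2e-6            # 1/e^2 beam radius at focus, m (from the abstract)
        E_p = 100e-9        # pulse energy, J (hypothetical: 100 nJ)

        F_peak = 2 * E_p / (math.pi * w ** 2)  # J/m^2
        print(f"{F_peak / 1e4:.2f} J/cm^2")    # ~1.59 J/cm^2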

    Influential factors in the design and implementation of electronic assessment at a research-led university

    Of the many challenges faced by those in higher education (HE), the use of technology to support teaching, learning and assessment activities has been particularly prominent in recent years. Adopting a critical realist perspective, and drawing upon institutional ethnography and social practice theory, this thesis set out to examine how academic and professional services staff at a pre-1992, research-led university engaged with electronic assessment (e-assessment), and the extent to which structures and formal policies impacted on design and implementation. Data were collected through a review of institutional documentation and face-to-face interviews with 23 participants, and were then analysed using the Framework method. The findings suggest that institutional priorities have a direct impact not only on the culture of the institution, but also on the visibility and importance of e-assessment. In turn, those who are engaged with e-assessment actively employ workarounds in their practice as they negotiate institutional structures and formal policies. Whilst external factors have contributed to the visibility of and desire for adopting e-assessment, successful engagement by academics is largely dependent on the reliability of the institution’s technological infrastructure and on leadership at the local level, to support the operational aspects of delivery and, more crucially, the enactment of institutional policies in disciplinary contexts. The findings also indicate that whilst institutional communities of practice are a valuable resource for sharing best practice, there is still a disconnect between academics and professional services staff with regard to what e-assessment entails: whether it is a process, a method, or a tool. Future efforts should focus on developing a shared vocabulary and on recognising that e-assessment practice exhibits characteristics of the “third space” between academics and professional services staff, encouraging reflection on the key roles that each plays in the design and implementation of e-assessment.

    Internet and Smartphone Use-Related Addiction Health Problems: Treatment, Education and Research

    This Special Issue presents some of the main emerging research on technological topics of health and education approaches to Internet use-related problems, before and during the beginning of coronavirus disease 2019 (COVID-19). The objective is to provide an overview that facilitates a comprehensive and practical approach to these new trends and promotes research, interventions, education, and prevention. It contains 40 papers: four reviews, thirty-five empirical papers, and an editorial introducing everything in a rapid review format. Overall, the empirical papers are relational in type, associating specific behavioral addictive problems with individual factors, and a few with contextual factors, generally in adult populations. Many have adapted scales to measure these problems, and a few report experiments or mixed-methods studies. The reviews tend to address the concepts and measures of these problems, intervention options, and prevention. In summary, these problems appear to reflect a global cultural trend impacting the health and educational domains. Internet use-related addiction problems have emerged in almost all societies, and strategies to cope with them are under development to offer solutions to these contemporary challenges, especially during the pandemic, which has highlighted the global health problems that we face and the need to tackle them holistically.

    The physics of the B Factories

    “The Physics of the B Factories” describes a decade-long effort of physicists in the quest for the precise determination of the asymmetry (broken symmetry) between particles and anti-particles. We now recognize that the matter we see around us is the residue (one part in a billion) of the matter and antimatter that existed in the early universe, most of which annihilated into the cosmic background radiation that bathes us. But the question remains: how did the baryonic matter-antimatter asymmetry arise? This book describes the work done by some 1000 physicists and engineers from around the globe on two experimental facilities built to test our understanding of this phenomenon, one at the SLAC National Accelerator Laboratory in California, USA, and a second at the KEK Laboratory, Tsukuba, Japan, and what we have learned from them in broadening our understanding of nature. Why is our universe dominated by the matter of which we are made rather than equal parts of matter and antimatter? This question has puzzled physicists for decades. However, this was not the question we addressed when we wrote the paper on CP violation in 1972. Our question was whether we could explain the CP violation observed in K meson decay within the framework of renormalizable gauge theory. At that time, Sakharov’s seminal paper was already published, but it did not attract our attention. If we had been aware of the paper, we might have been misled into seeking a model satisfying Sakharov’s conditions, and our paper might not have appeared. In our paper, we argued that new particles are needed in order to accommodate CP violation into the renormalizable electroweak theory, and proposed the six-quark scheme as one of the possible ways of introducing new particles. We thought that the six-quark scheme was very interesting, but it was just a possibility. The situation changed when the tau-lepton was found, followed by the discovery of the Upsilon particle. The existence of the third generation became reality. However, it was still uncertain whether the mixing of the six quarks is the real origin of the observed CP violation. Theoretical calculation of CP asymmetries in the neutral K meson system contains uncertainty from strong interaction effects. What settled this problem were the B Factories built at SLAC and KEK. These B Factories are extraordinary in many ways. In order to fulfill the requirements of these special experiments, the beam energies of the colliding electron and positron are asymmetric, and the luminosity is unprecedentedly high. It is also remarkable that severe competition between the two laboratories boosted their performance. One of us (M. Kobayashi) watched the development at KEK very closely as the director of the Institute of Particle and Nuclear Studies of KEK for a period of time. As witnesses, we appreciate the amazing achievement of those who participated in these projects at both laboratories. The B Factories have contributed a great deal to our understanding of particle physics, as documented in this book. In particular, thanks to the high luminosity far exceeding the design value, experimental groups measured mixing angles precisely and verified that the dominant source of CP violation observed in laboratory experiments is flavor mixing among the three generations of quarks. Obviously we owe our Nobel Prize to this result. Now we are awaiting the operation of the next generation Super B Factories. In spite of its great success, the Standard Model is not an ultimate theory. For example, the Standard Model is not thought to be able to explain the matter dominance of the universe. This means that there must still be unknown particles and unknown interactions. We have a lot of theoretical speculations, but experimental means are rather limited. There are great expectations for the Super B Factories to reveal a clue to the world beyond the Standard Model.
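
    For context, the flavor mixing credited above is encoded in the Cabibbo-Kobayashi-Maskawa (CKM) matrix. After rephasing the quark fields, a three-generation unitary mixing matrix retains three mixing angles and one irreducible complex phase, and that phase is the Kobayashi-Maskawa source of CP violation; with only two generations no such phase survives, which is why a six-quark scheme was needed.

        % Three-generation quark mixing: the unitary CKM matrix relating
        % weak and mass eigenstates. Unitarity leaves three mixing angles
        % and one CP-violating phase delta; CP violation requires
        % delta != 0, which cannot occur with only two generations.
        V_{\mathrm{CKM}} =
        \begin{pmatrix}
          V_{ud} & V_{us} & V_{ub} \\
          V_{cd} & V_{cs} & V_{cb} \\
          V_{td} & V_{ts} & V_{tb}
        \end{pmatrix}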