
    Towards a metric for open source software quality

    Software quality is more than conformance to a set of requirements: it encompasses many interrelated attributes of a piece of software. An important part of this measure depends on the underlying processes and methodologies used in the engineering of the software. We present an early exposition towards a quality model for open source software (OSS). We describe some basic notions of quality for OSS and present a basic model in which these notions consist of various factors that influence quality. The purpose of this effort is ultimately to develop a quantitative metric for software quality

    Generalisation of a Waiting-Time Relation

    A generalisation of a waiting-time relation is developed using Laplace transform theory. The generalisation produces an infinite series, and it is demonstrated how this series may be summed by representation in closed form. Extensions and examples of the waiting-time relation are given

    Letter from the Editors

    We are proud to present this year’s twenty-second edition of The Gettysburg Historical Journal. Having finally overcome the Covid-19 pandemic, the editors of the journal have had the opportunity to work in person with one another and with professors, an opportunity we did not have in the past two years. Coming out of the pandemic invigorated and ready to work, The Gettysburg Historical Journal received a plethora of submissions from Gettysburg College students and from students around the country. The works accepted this semester offer a wide range of research, spanning topics from Revolutionary America to postcolonial efforts in Vietnam

    Data consistency in transactional storage systems: a centralised approach.

    We introduce an interleaving operational semantics for describing the client-observable behaviour of atomic transactions on distributed key-value stores. Our semantics builds on abstract states comprising centralised, global key-value stores and partial client views. We provide operational definitions of consistency models for our key-value stores, which are shown to be equivalent to the well-known declarative definitions of consistency models over execution graphs. We explore two immediate applications of our semantics: specific protocols of geo-replicated databases (e.g. COPS) and partitioned databases (e.g. Clock-SI) can be shown correct for a specific consistency model by embedding them in our centralised semantics; and programs can be shown directly to enjoy invariant properties such as robustness against a weak consistency model
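    The "global store + partial client view" idea in this abstract can be pictured with a toy sketch. This is an illustration only, not the paper's operational semantics; the names `Store`, `view`, and `commit` are ours:

    ```python
    # Toy model of a centralised key-value store with partial client views.
    # Clients read through a view taken at some version; commits apply
    # atomically and abort if the view has become stale.

    class Store:
        def __init__(self):
            self.kv = {}        # centralised, global key-value store
            self.version = 0    # coarse global version counter

        def view(self, keys):
            """Hand a client a partial view over a subset of the keys."""
            return {k: self.kv.get(k) for k in keys}, self.version

        def commit(self, writes, seen_version):
            """Atomically apply a client's writes; abort if the view is stale."""
            if seen_version != self.version:
                return False    # a concurrent commit invalidated this view
            self.kv.update(writes)
            self.version += 1
            return True

    store = Store()
    view, v = store.view({"x"})
    ok = store.commit({"x": 1}, v)       # succeeds: view was current
    stale = store.commit({"x": 2}, v)    # aborts: version has moved on
    ```

    Real consistency models in the paper are far more permissive than this single-version counter; the sketch only shows the shape of the abstract states.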

    Sparse learning with concave regularization: relaxation of the irrepresentable condition

    Learning sparse models from data is an important task in all those frameworks where relevant information should be identified within a large dataset. This can be achieved by formulating and solving suitable sparsity-promoting optimization problems. For linear regression models, Lasso is the most popular convex approach, based on an L1-norm regularization. In this paper, by contrast, we analyse a concave regularized approach, and we prove that it relaxes the irrepresentable condition, which is sufficient and essentially necessary for Lasso to select the right significant parameters. In practice, this has the benefit of reducing the number of measurements required with respect to Lasso. Since the proposed problem is nonconvex, we also discuss different algorithms to solve it, and we illustrate the obtained enhancement via numerical experiments
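    A standard way to attack such concave regularized problems is iteratively reweighted L1 minimization, where each round solves a weighted Lasso and small coefficients get penalized more. The sketch below is our own construction using plain proximal gradient (ISTA) steps, not necessarily the algorithm the paper analyses:

    ```python
    import numpy as np

    def soft_threshold(z, t):
        """Prox of the (weighted) L1 norm."""
        return np.sign(z) * np.maximum(np.abs(z) - t, 0.0)

    def ista(A, y, weights, lam, steps=500):
        """Proximal gradient for (1/2)||Ax - y||^2 + lam * sum_i w_i |x_i|."""
        L = np.linalg.norm(A, 2) ** 2       # Lipschitz constant of the gradient
        x = np.zeros(A.shape[1])
        for _ in range(steps):
            g = A.T @ (A @ x - y)
            x = soft_threshold(x - g / L, lam * weights / L)
        return x

    def reweighted_l1(A, y, lam, rounds=5, eps=1e-3):
        """Concave (log-sum style) regularization via reweighted L1 rounds."""
        w = np.ones(A.shape[1])
        x = np.zeros(A.shape[1])
        for _ in range(rounds):
            x = ista(A, y, w, lam)
            w = 1.0 / (np.abs(x) + eps)     # small coefficients get heavier weights
        return x
    ```

    With `w = 1` the first round is ordinary Lasso; subsequent rounds sharpen the support, which is the intuition behind relaxing the conditions Lasso needs for exact support recovery.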

    Specifying and Verifying Concurrent Algorithms with Histories and Subjectivity

    We present a lightweight approach to Hoare-style specifications for fine-grained concurrency, based on a notion of time-stamped histories that abstractly capture atomic changes in the program state. Our key observation is that histories form a partial commutative monoid, a structure fundamental to the representation of concurrent resources. This insight provides us with a unifying mechanism that allows us to treat histories just like heaps in separation logic. For example, both are subject to the same assertion logic and inference rules (e.g., the frame rule). Moreover, the notion of ownership transfer, which usually applies to heaps, has an equivalent in histories. It can be used to formally represent helping---an important design pattern for concurrent algorithms whereby one thread can execute code on behalf of another. Specifications in terms of histories naturally abstract granularity, in the sense that sophisticated fine-grained algorithms can be given the same specifications as their simplified coarse-grained counterparts, making them equally convenient for client-side reasoning. We illustrate our approach on a number of examples and validate all of them in Coq
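    The observation that histories form a partial commutative monoid can be illustrated with a toy encoding (ours, not the paper's Coq development): a history is a finite map from timestamps to atomic actions, and join is union, defined only on disjoint timestamp domains.

    ```python
    # Toy PCM of time-stamped histories. The empty history is the unit;
    # join(h1, h2) is undefined (None) when timestamp domains overlap,
    # mirroring disjoint heap union in separation logic.

    def join(h1, h2):
        if h1.keys() & h2.keys():
            return None                               # undefined: timestamps clash
        return {**h1, **h2}

    unit = {}
    mine = {1: ("write", "x", 5)}
    yours = {2: ("write", "x", 7)}

    assert join(mine, unit) == mine                   # unit law
    assert join(mine, yours) == join(yours, mine)     # commutativity
    assert join(mine, join(yours, {3: "skip"})) == \
           join(join(mine, yours), {3: "skip"})       # associativity
    assert join(mine, mine) is None                   # overlap: undefined
    ```

    Ownership transfer then corresponds to moving timestamped entries from one thread's history to another's, which is how the paper models helping.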

    Design and Characterization of a Textile Electrode System for the Detection of High-Density sEMG

    Muscle activity monitoring in dynamic conditions is a crucial need in different scenarios, ranging from sport to rehabilitation science and applied physiology. The acquisition of surface electromyographic (sEMG) signals by means of grids of electrodes (High-Density sEMG, HD-sEMG) allows obtaining relevant information on muscle function and recruitment strategies. During dynamic conditions, this possibility demands both a wearable, miniaturized acquisition system and an easy-to-wear electrode system that assures a stable electrode-skin interface. While recent advancements have been made on the former issue, detection systems specifically designed for dynamic conditions are at best incipient. The aim of this work is to design, characterize, and test a wearable HD-sEMG detection system based on textile technology. A 32-electrode textile grid with 15 mm inter-electrode distance was designed and prototyped. The electrical properties of the material constituting the detection system and of the electrode-skin interface were characterized. The quality of sEMG signals was assessed in both static and dynamic contractions. The performance of the textile detection system was comparable to that of conventional systems in terms of stability of the traces, properties of the electrode-skin interface, and quality of the collected sEMG signals during quasi-isometric and highly dynamic tasks

    On the Semantics of Snapshot Isolation

    Snapshot isolation (SI) is a standard transactional consistency model used in databases, distributed systems and software transactional memory (STM). Its semantics is formally defined both declaratively as an acyclicity axiom, and operationally as a concurrent algorithm with memory bearing timestamps. We develop two simpler equivalent operational definitions of SI as lock-based reference implementations that do not use timestamps. Our first locking implementation is prescient in that it requires a priori knowledge of the data accessed by a transaction and carries out transactional writes eagerly (in-place). Our second implementation is non-prescient and performs transactional writes lazily by recording them in a local log and propagating them to memory at commit time. Whilst our first implementation is simpler and may be better suited for developing a program logic for SI transactions, our second implementation is more practical due to its non-prescience. We show that both implementations are sound and complete against the declarative SI specification and thus yield equivalent operational definitions for SI. We further consider, for the first time formally, the use of SI in a context with racy non-transactional accesses, as can arise in STM implementations of SI. We introduce robust snapshot isolation (RSI), an adaptation of SI with similar semantics and guarantees in this mixed setting. We present a declarative specification of RSI as an acyclicity axiom and analogously develop two operational models as lock-based reference implementations (one eager, one lazy). We show that these operational models are both sound and complete against the declarative RSI model
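    The lazy, non-prescient style of implementation described here — writes buffered in a local log and propagated to memory under a lock at commit, with a first-committer-wins conflict check — can be caricatured in a few lines. This is our own simplification for intuition, not the paper's reference implementation:

    ```python
    import threading

    class SIStore:
        """Shared state: the store, per-key last-writer versions, one lock."""
        def __init__(self):
            self.kv = {}
            self.last_write = {}          # key -> version of its last commit
            self.version = 0
            self.lock = threading.Lock()

    class Txn:
        def __init__(self, store):
            self.store = store
            with store.lock:
                self.snapshot = dict(store.kv)   # reads see a fixed snapshot
                self.start = store.version
            self.log = {}                        # local write log (lazy writes)

        def read(self, k):
            return self.log.get(k, self.snapshot.get(k))

        def write(self, k, v):
            self.log[k] = v                      # buffered, not yet visible

        def commit(self):
            s = self.store
            with s.lock:                         # propagate the log atomically
                for k in self.log:               # first-committer-wins check
                    if s.last_write.get(k, 0) > self.start:
                        return False             # write-write conflict: abort
                s.version += 1
                for k, v in self.log.items():
                    s.kv[k] = v
                    s.last_write[k] = s.version
                return True
    ```

    Two concurrent transactions writing the same key illustrate the check: the first commit succeeds, the second observes a newer `last_write` than its start version and aborts, which is the classic write-write conflict rule of SI.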

    Dynamic surface electromyography using stretchable screen-printed textile electrodes

    Objective. Wearable devices have created new opportunities in healthcare and sport sciences by unobtrusively monitoring physiological signals. Textile polymer-based electrodes proved to be effective in detecting electrophysiological potentials but suffer from mechanical fragility and low stretch resistance. The goal of this research is to develop cost-effective, easily manufacturable electrodes with adequate robustness and signal quality, and to validate them in dynamic conditions. Methods. We here propose an optimized screen printing technique for the fabrication of PEDOT:PSS-based textile electrodes directly into finished stretchable garments for surface electromyography (sEMG) applications. A sensorised stretchable leg sleeve was developed, targeting five muscles of interest in rehabilitation and sport science. An experimental validation was performed to assess the accuracy of signal detection during dynamic exercises, including sit-to-stand, leg extension, calf raise, walking, and cycling. Results. The electrodes can withstand up to 500 stretch cycles. Tests on five subjects revealed excellent contact impedance, and cross-correlation between sEMG envelopes simultaneously detected from the leg muscles by the textile and Ag/AgCl electrodes was generally greater than 0.9, which proves that it is possible to obtain good quality signals with performance comparable with disposable electrodes. Conclusions. An effective technique to embed polymer-based electrodes in stretchable smart garments was presented, revealing good performance for dynamic sEMG detections. Significance. The achieved results pave the way to the integration of unobtrusive electrodes, obtained by screen printing of conductive polymers, into technical fabrics for rehabilitation and sport monitoring, and in general wherever the detection of sEMG in dynamic conditions is necessary
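    The envelope cross-correlation figure of merit quoted above can be reproduced on synthetic data with a few lines of NumPy. The rectification and window-length choices here are ours, for illustration; the paper's processing pipeline may differ:

    ```python
    import numpy as np

    def envelope(emg, win=100):
        """sEMG envelope: demean, rectify, smooth with a moving average."""
        rect = np.abs(emg - np.mean(emg))
        kernel = np.ones(win) / win
        return np.convolve(rect, kernel, mode="same")

    def xcorr_peak(a, b):
        """Normalized zero-lag cross-correlation (Pearson) of two envelopes."""
        a = (a - a.mean()) / a.std()
        b = (b - b.mean()) / b.std()
        return float(np.dot(a, b) / len(a))
    ```

    Two noisy recordings of the same burst pattern yield envelopes whose correlation is close to 1, which is the sense in which a value above 0.9 indicates the textile and Ag/AgCl electrodes captured the same muscle activity.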

    Certification of open-source software : a role for formal methods?

    Despite its huge success and increasing incorporation in complex, industrial-strength applications, open source software, by the very nature of its open, unconventional, distributed development model, is hard to assess and certify in an effective, sound and independent way. This makes its use and integration within safety- or security-critical systems a risk, and, simultaneously, an opportunity and a challenge for rigorous, mathematically based methods which aim at pushing software analysis and development to the level of a mature engineering discipline. This paper discusses such a challenge and proposes a number of ways in which open source development may benefit from the whole patrimony of formal methods. L. S. Barbosa's research was partially supported by the CROSS project, under contract PTDC/EIA-CCO/108995/2008