7,731 research outputs found

    UMSL Bulletin 2023-2024

    Get PDF
    The 2023-2024 Bulletin and Course Catalog for the University of Missouri-St. Louis.

    Towards A Practical High-Assurance Systems Programming Language

    Full text link
    Writing correct and performant low-level systems code is notoriously demanding, even for experienced developers. To make matters worse, formally reasoning about its correctness properties adds yet another level of complexity, requiring considerable expertise in both systems programming and formal verification. Without appropriate tools that provide abstraction and automation, development can be extremely costly due to the sheer complexity of these systems and the nuances in them. Cogent is designed to alleviate the burden on developers when writing and verifying systems code. It is a high-level functional language with a certifying compiler, which automatically proves the correctness of the compiled code and also provides a purely functional abstraction of the low-level program to the developer. Equational reasoning techniques can then be used to prove functional correctness properties of the program on top of this abstract semantics, which is notably less laborious than directly verifying the C code. To make Cogent a more approachable and effective tool for developing real-world systems, we further strengthen the framework by extending the core language and its ecosystem. Specifically, we enrich the language to allow users to control the memory representation of algebraic data types, while retaining the automatic proof via a data layout refinement calculus. We repurpose existing tools in a novel way and develop an intuitive foreign function interface, which gives users a seamless experience when using Cogent in conjunction with native C. We augment the Cogent ecosystem with a property-based testing framework, which helps developers better understand the impact formal verification has on their programs and enables a progressive approach to producing high-assurance systems. Finally, we explore refinement type systems, which we plan to incorporate into Cogent for more expressiveness and better integration of systems programmers into the verification process.

    Operatic Pasticcios in 18th-Century Europe

    Get PDF
    In Early Modern times, techniques of assembling, compiling and arranging pre-existing material were part of the established working methods in many arts. In the world of 18th-century opera, such practices ensured that operas could become a commercial success, because substituting or compiling arias to fit a singer's abilities proved the best recipe for fulfilling audiences' expectations. Known as »pasticcios« since the 18th century, these operas have long been considered inferior patchwork. The volume collects essays that reconsider the pasticcio, contextualize it, define its preconditions, examine its material aspects and uncover its aesthetic principles.

    A two-scale solver for linear elasticity problems in the context of parallel message passing

    Full text link
    This paper pushes further the intrinsic capabilities of the GFEM^{gl} global-local approach introduced in [1]. We develop a distributed computing approach using MPI (Message Passing Interface) for both the global and local problems. For the local problems, a specific scheduling strategy is introduced. Then, to measure the convergence of the iterative process correctly, we introduce a reference solution that revisits the product of classical and enriched functions. As a consequence, we are able to propose a purely matrix-based implementation of the global-local problem. The distributed approach is then compared to other parallel solvers, either direct or iterative with domain decomposition. The comparison addresses scalability as well as elapsed time. Numerical examples deal with linear elastic problems: a problem with a polynomial exact solution, a complex micro-structure, and, finally, a pull-out test (with different crack extents). [1]: C. A. Duarte, D.-J. Kim, and I. Babuška. A global-local approach for the construction of enrichment functions for the generalized FEM and its application to three-dimensional cracks. In Advances in Meshfree Techniques, Dordrecht, 2007. Springer.
    Comment: To be published in Computer Methods in Applied Mechanics and Engineering. Revision: mainly introduction, conclusion and Fig. 3 update.
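    The scheduling of independent local problems can be pictured with a small worker-pool sketch. This is only an analogy: the paper distributes local problems over MPI ranks, whereas the goroutines, the `solveLocals` name, and the toy solve function below are illustrative assumptions.

```go
package main

import (
	"fmt"
	"sync"
)

// solveLocals distributes independent local problems across a fixed
// pool of workers, in the spirit of scheduling local solves in
// parallel. Each worker pulls the next job as soon as it is free,
// which balances load when local problems have uneven cost.
func solveLocals(localIDs []int, workers int, solve func(int) float64) []float64 {
	results := make([]float64, len(localIDs))
	jobs := make(chan int)
	var wg sync.WaitGroup
	for w := 0; w < workers; w++ {
		wg.Add(1)
		go func() {
			defer wg.Done()
			for i := range jobs {
				// Each index is written by exactly one worker.
				results[i] = solve(localIDs[i])
			}
		}()
	}
	for i := range localIDs {
		jobs <- i
	}
	close(jobs)
	wg.Wait()
	return results
}

func main() {
	res := solveLocals([]int{10, 11, 12}, 2,
		func(id int) float64 { return float64(id) * 2 }) // toy "solve"
	fmt.Println(res)
}
```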

    An Empirical Study of Concurrent Feature Usage in Go

    Get PDF
    The Go language includes support for running functions or methods concurrently as goroutines, lightweight threads managed directly by the Go runtime. Go is probably best known for its channel-based, message-passing concurrency mechanism, based on Hoare's Communicating Sequential Processes (CSP), for inter-thread communication. However, Go also includes support for traditional concurrency features, such as mutexes and condition variables, that are commonly used in other languages. In this paper, we analyze the use of these traditional concurrency features, using a corpus of Go programs used in earlier work to study the use of message-passing concurrency features in Go. The goal of this work is to better support developers in using traditional concurrency features, or a combination of traditional and message-passing features, in Go.
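    The two families of features the study contrasts can be seen side by side in a minimal sketch: the same shared count implemented first with a traditional mutex, then with CSP-style message passing over a channel. The counter example is illustrative, not taken from the studied corpus.

```go
package main

import (
	"fmt"
	"sync"
)

// mutexCount increments a shared counter from n goroutines using a
// traditional mutex, the style of feature the study examines.
func mutexCount(n int) int {
	var mu sync.Mutex
	var wg sync.WaitGroup
	count := 0
	for i := 0; i < n; i++ {
		wg.Add(1)
		go func() {
			defer wg.Done()
			mu.Lock()
			count++
			mu.Unlock()
		}()
	}
	wg.Wait()
	return count
}

// channelCount does the same with CSP-style message passing: each
// goroutine sends on a channel, and a single receiver tallies, so no
// memory is shared between goroutines.
func channelCount(n int) int {
	ch := make(chan int)
	for i := 0; i < n; i++ {
		go func() { ch <- 1 }()
	}
	count := 0
	for i := 0; i < n; i++ {
		count += <-ch
	}
	return count
}

func main() {
	fmt.Println(mutexCount(100), channelCount(100))
}
```

    Both produce the same result; the paper's question is which style developers actually reach for, and when they mix the two.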

    Performance, memory efficiency and programmability: the ambitious triptych of combining vertex-centricity with HPC

    Get PDF
    The field of graph processing has grown significantly due to the flexibility and wide applicability of the graph data structure. In the meantime, so has interest from the community in developing new approaches to graph processing applications. In 2010, Google introduced the vertex-centric programming model through their framework Pregel. This consists of expressing computation from the perspective of a vertex, whilst inter-vertex communications are achieved via data exchanges along incoming and outgoing edges, using the message-passing abstraction provided. Pregel's high-level programming interface, designed around a set of simple functions, provides ease of programmability to the user. The aim is to enable the development of graph processing applications without requiring expertise in optimisation or parallel programming; such challenges are instead abstracted from the user and offloaded to the underlying framework. However, fine-grained synchronisation, unpredictable memory access patterns and multiple sources of load imbalance make it difficult to implement the vertex-centric model efficiently on high-performance computing platforms without sacrificing programmability. This research focuses on combining vertex-centricity and High-Performance Computing (HPC), resulting in the development of a shared-memory framework, iPregel, which demonstrates that performance and memory efficiency similar to those of non-vertex-centric approaches can be achieved while preserving the programmability benefits of vertex-centricity. Non-volatile memory is then explored to extend single-node capabilities, with multiple versions of iPregel implemented to experiment with various data movement strategies. Then, distributed-memory parallelism is investigated to overcome the resource limitations of single-node processing. A second framework, named DiP, ports iPregel's applicable optimisations to distributed memory and prioritises performance at high scalability.
    This research has resulted in a set of techniques and optimisations illustrated through a shared-memory framework, iPregel, and a distributed-memory framework, DiP. The former closes a gap of several orders of magnitude in both performance and memory efficiency, and is even able to process a graph of 750 billion edges using non-volatile memory. The latter has proved that this competitiveness can be scaled beyond a single node, enabling the processing of the largest graph generated in this research, comprising 1.6 trillion edges. Most importantly, both frameworks achieved these performance and capability gains whilst preserving programmability, the cornerstone of the vertex-centric programming model. This research therefore demonstrates that by combining vertex-centricity and HPC, it is possible to maintain performance, memory efficiency and programmability.
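    The vertex-centric model can be sketched in a few lines: every vertex repeatedly applies the same rule using only information arriving over its edges, and the computation halts when no vertex changes, mirroring Pregel's vote-to-halt convergence. The toy label-propagation kernel below is a simplified variant that applies updates in place rather than in strict supersteps; it is not iPregel's or Pregel's actual API.

```go
package main

import "fmt"

// Graph holds an undirected adjacency list and a per-vertex label.
// Vertices propagate the minimum vertex ID they have seen, so
// converged labels identify connected components.
type Graph struct {
	edges [][]int
	label []int
}

func NewGraph(n int, edges [][2]int) *Graph {
	g := &Graph{edges: make([][]int, n), label: make([]int, n)}
	for v := range g.label {
		g.label[v] = v // each vertex starts knowing only its own ID
	}
	for _, e := range edges {
		g.edges[e[0]] = append(g.edges[e[0]], e[1])
		g.edges[e[1]] = append(g.edges[e[1]], e[0])
	}
	return g
}

// Run iterates the per-vertex rule until no label changes,
// mimicking Pregel's convergence by "vote to halt".
func (g *Graph) Run() {
	for active := true; active; {
		active = false
		for v, nbrs := range g.edges {
			for _, u := range nbrs {
				if g.label[u] < g.label[v] { // value arriving over an edge
					g.label[v] = g.label[u]
					active = true
				}
			}
		}
	}
}

func main() {
	g := NewGraph(5, [][2]int{{0, 1}, {1, 2}, {3, 4}})
	g.Run()
	fmt.Println(g.label) // two components: {0,1,2} and {3,4}
}
```

    The programmability appeal is visible here: the user writes only the per-vertex rule, while scheduling, synchronisation and data movement are the framework's problem, which is exactly where the HPC difficulties described above arise.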

    A prototype of the data quality pipeline of the Online Observation Quality System of ASTRI-Mini Array telescope system

    Get PDF
    Gamma-ray astronomy investigates the physics of the universe and the characteristics of celestial objects through gamma rays, the most energetic part of the electromagnetic spectrum, emitted in some of the brightest events in the universe, such as pulsars, quasars, and supernova remnants. Gamma rays can be observed with satellites or ground-based telescopes; the latter detect gamma rays in the very-high-energy range with the indirect Cherenkov technique. When highly energetic photons enter Earth's atmosphere, they generate air showers, cascades of particles whose fast motion produces elusive flashes of blue Cherenkov light in the sky. This thesis discusses research conducted at the Astrophysics and Space Science Observatory of Bologna in collaboration with ASTRI Mini-Array, the international project for ground-based gamma-ray astrophysics led by INAF. The focus is on the Online Observation Quality System (OOQS), which conducts a quick-look analysis during telescope observation. The Cherenkov Camera Data Quality Checker is the OOQS component that performs real-time quality checks on the data acquired from the nine Cherenkov Cameras at high frequency, up to 1000 Hz, and with a total bandwidth of 148 MB/s. The thesis presents the implementation of the OOQS-Pipeline, a software prototype that receives scientific packets from a Cherenkov Camera, performs quality analysis, and stores the results. The pipeline consists of three main applications: Kafka-Consumer, DQ-Analysis, and DQ-Aggregator. The pipeline was tested on a server with performance similar to that of the servers at the Array Observing Site, and results indicate that it can acquire the maximum data flow produced by the cameras. Overall, the thesis presents an important contribution to the ASTRI Mini-Array project: the development of the first version of the OOQS-Pipeline, which will maximize observation time with quality data passing the verification thresholds.
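    The three-stage shape of such a pipeline (consumer feeding analysis feeding an aggregator) can be sketched with channels. The packet fields, the threshold, and the stage functions below are illustrative assumptions, not the actual OOQS-Pipeline code or its Kafka interface.

```go
package main

import "fmt"

type Packet struct{ CameraID, Counts int }
type Result struct {
	CameraID int
	OK       bool
}

// consume emits packets downstream, standing in for the
// Kafka-Consumer stage.
func consume(packets []Packet) <-chan Packet {
	out := make(chan Packet)
	go func() {
		defer close(out)
		for _, p := range packets {
			out <- p
		}
	}()
	return out
}

// analyse applies a simple per-packet quality check, standing in for
// the DQ-Analysis stage.
func analyse(in <-chan Packet, threshold int) <-chan Result {
	out := make(chan Result)
	go func() {
		defer close(out)
		for p := range in {
			out <- Result{p.CameraID, p.Counts >= threshold}
		}
	}()
	return out
}

// aggregate tallies packets passing the check per camera, standing in
// for the DQ-Aggregator stage.
func aggregate(in <-chan Result) map[int]int {
	passed := map[int]int{}
	for r := range in {
		if r.OK {
			passed[r.CameraID]++
		}
	}
	return passed
}

func main() {
	pkts := []Packet{{1, 120}, {1, 40}, {2, 300}}
	fmt.Println(aggregate(analyse(consume(pkts), 100)))
}
```

    Because each stage runs concurrently and hands off through a channel, a slow stage back-pressures the producer rather than dropping data, which matters at the quoted 1000 Hz input rate.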

    Identification of Micro- and Submicron (Nano) Plastics in Water Sources and the Impact of COVID-19 on Plastic Pollution

    Get PDF
    Plastics may be one of the most significant environmental issues our society deals with this century. The world's water bodies, as well as land and air, are becoming increasingly contaminated by plastic due to the ongoing and expanding manufacture of these synthetic materials and the lack of an effective strategy for managing plastic waste. The fact that plastics break down into smaller particles (micro- and nanoplastics) through environmental physical and chemical reactions, and do not biodegrade in a reasonable time, is a cause for concern, as plastics are believed to cause harm to animals, plants and humans. To identify the types of plastics prevalent in aquatic habitats, a number of procedures have been developed, from sampling to identification. After a water body has been sampled using nets, pumps, or other tools, it is usually necessary, depending on the type of sample taken, to treat the samples for separation and purification. The next stage is to employ analytical techniques to identify the synthetic contaminants; the most common approaches are microscopy, spectroscopy, and thermal analysis. This thesis gives an overview of where in the environment microplastics (MPs) and nanoplastics (NPs) can be found and summarizes the most important technologies applied to analyse the importance of plastics as a contaminant in water bodies. The development of standardised analytical procedures is still necessary, as most existing ones are not suitable for identifying particles below 50 μm due to resolution limitations. The preparation and analysis of samples are usually time-consuming factors that must be considered. Particularly for MP and NP analysis in aqueous samples, thermal analysis methods based on sample degradation are generally not considered the most effective approach.
    Nevertheless, Pyrolysis-Gas Chromatography Time-of-Flight Mass Spectrometry (Py-GC-ToF-MS) is used in this thesis to propose a novel approach: thanks to its unique detection abilities, and combined with a novel filtration methodology for collection, it enables the identification of tiny particles (>0.1 μm) in water samples. PTFE membranes were selected to filter the liquid samples using a glass filtration system; the synthetic particles are thereby deposited on the membranes, allowing the study and analysis of the precipitated material. PTFE is a readily available, reasonably priced, and adaptable product that makes sample preparation quick and simple. The three plastics under study, polypropylene (PP), polystyrene (PS), and polyvinyl chloride (PVC), can be identified from complex samples at trace levels thanks to the use of these widely available membranes and the identification of various specific (marker) ions. The technique was examined against a range of standard samples containing predetermined concentrations of MPs and NPs. Detection levels were then determined for PVC and PS and were found to be below 50 μg/L, with repeatable data showing good precision (RSD <20 %). The examination of a complex matrix sample taken from a nearby river helped further validate this innovative methodology; the results indicated the presence of PS with a semi-quantifiable result of 250.23 μg/L. Py-GC-ToF-MS therefore appears to be a method appropriate for identifying MPs and NPs in complex mixtures. This thesis also focuses on the environmental challenge posed by disposable plastic face masks (DPFMs), which has been made significantly worse by the COVID-19 pandemic. By the time this thesis was written, the production of disposable plastic face masks had reached approximately 200 million a day, in a global effort to tackle the spread of the new SARS-CoV-2 virus.
    This thesis investigates the emissions of pollutants from several different DPFM brands (medical and non-medical) that were submerged in water to replicate environmental conditions after these DPFMs have been discarded. The DPFM leachates were filtered using inorganic membranes and characterized using Fourier transform infrared spectroscopy (FTIR), scanning electron microscopy coupled with energy-dispersive X-ray spectroscopy (SEM-EDS), light/optical microscopy (LM/OM), inductively coupled plasma mass spectrometry (ICP-MS) and liquid chromatography-mass spectrometry (LC-MS). Micro- and nano-scale polymeric fibres, particles, siliceous fragments and leachable inorganic and organic chemicals were observed from all of the tested DPFMs. For non-medical DPFMs, traces of concerning heavy metals were detected in association with silicon-containing fragments (i.e. lead up to 6.79 μg/L). ICP-MS also confirmed the presence of other leachable metals such as cadmium (up to 1.92 μg/L), antimony (up to 3.93 μg/L) and copper (up to 4.17 μg/L). LC-MS analysis identified organic species related to plastic additives; polyamide-66 monomer and oligomers (nylon-66 synthesis), surfactant molecules, and dye-like molecules were all tentatively identified in the leachate. The toxicity of some of the chemicals discovered raises the question of whether DPFMs are safe to use daily and what implications may be anticipated after their disposal into the environment. The previous approach is extended to medical DPFMs with the use of a field emission gun scanning electron microscope (FEG-SEM) to obtain high-resolution images of the micro- and nanoparticles deposited on the membranes.
    The use of 0.02 μm pore-size inorganic membranes is also incorporated to better identify the released nanoparticles. Separate aqueous samples were also obtained by submerging medical DPFMs for 24 hours, to be analysed using ICP-MS and LC-MS. Both particles and fibres at the micro and nano scale were found in all six DPFM brands in this study. EDS analysis revealed the presence of particles containing different heavy metals, including lead, mercury, and arsenic. ICP-MS results confirmed traces of heavy metals (antimony up to 2.41 μg/L and copper up to 4.68 μg/L). LC-MS results identified organic species related to plastic additives and contaminants; polyamide-66 monomer and oligomers (nylon-66 synthesis), surfactant molecules, and polyethylene glycol were all tentatively identified in the leachate. The toxicity of some of the chemicals found raises the question of whether DPFMs are safe for daily use and what consequences are to be expected after their disposal into the environment.

    Casting a Line to the Land: Narratologies of Embodied Rituals and Connectivity to Place

    Get PDF
    This research argues for a re-conceptualisation and extension of place within architectural praxis that reflects broader discussions of meaning and environmental experience. This study examines the alignment between place and ritual theory as a way to better understand the phenomena that inform a greater connectivity between embodied and meaningful experiences with the environment. This is explored through the act of fly fishing, which is relevant as the focus of the study owing in part to the author's own experience as a fly angler. The relationship between the act of fly fishing, being part of nature and being in the world is meaningful. By conceptualising fly fishing as a form of ritual and a process of ritualization, this study looks at how ritual affords fly anglers a connective relationship with place. This ritualized connection extends beyond mere doing and evokes the cultivation of a deeper sensibility of and towards place. Present here is a recognition of the role the body and ritual play in enabling the formation of this sensibility of place. This research is advanced through narratological methods of autoethnography, storytelling, cognitive mapping and film. This way of working builds on the power of narrative and image-making to expose individual meaning-making. Employing these methods forms a holistic representation of experience, and so overcomes potential disconnects that can arise between meaning-making efforts and actual experience. This research recognises that we are embodied beings, and that our embodied experiences are central to our relationships with place. Through this study, the researcher aims to develop a greater understanding of what place means, notably how we generate a connection to place through our bodies, and to draw upon this insight to expand architectural discourse with a better understanding of connective and meaningful phenomena.

    Bionic Lid Implant for Natural Closure (BLINC)

    Get PDF
    Facial nerve palsy (FNP) leads to an inability to blink. The exposed eye is at risk of developing corneal keratopathy, and there is currently no solution for active eye closure that is immediate and reliable. Bionic Lid Implant for Natural Closure (BLINC) proposes the use of an implantable actuator combined with the effects of an eyelid sling for dynamic eye closure. The aims of this thesis are to 1) explore the clinical need for BLINC, 2) describe the BLINC technology, and 3) present the results of its application in cadaveric and live models.
    Methods: The aims of this project are addressed in three parts. In part one, the current therapies addressing key clinical end points in FNP from an ocular perspective, and the setting where BLINC may first be used, are explored. In part two, the science behind BLINC is outlined. Finally, in part three, the application of BLINC in cadaveric and live models is studied, followed by a discussion of future steps preceding a pilot study in humans.
    Results: Patients with FNP consistently identify issues related to the eye as a primary concern. Current reanimation strategies offer the possibility of dynamic eye closure, but the results are delayed and often unpredictable. BLINC reliably achieves active eye closure in cadaveric models by means of a wirelessly powered, implantable electromagnetic actuator in conjunction with an eyelid sling. BLINC closes the eye in a similar fashion to natural closure, for a symmetrical blink in FNP. Successful application of an inactive device in its complete form is achieved in a live animal without significant morbidity.
    Conclusion: BLINC offers the possibility of restoring active eye closure with the use of an implantable actuator. The concept has been successfully demonstrated in cadaveric models, with successful device implantation in a live model. Future live trials are needed to address the remaining biocompatibility issues in preparation for human application.