1,530 research outputs found

    An Abstract Interpretation Framework for the Round-Off Error Analysis of Floating-Point Programs

    Get PDF
    This paper presents an abstract interpretation framework for the round-off error analysis of floating-point programs. This framework defines a parametric abstract analysis that computes, for each combination of ideal and floating-point execution paths of the program, a sound over-approximation of the accumulated floating-point round-off error that may occur. In addition, a Boolean expression that characterizes the input values leading to the computed error approximation is also computed. An abstraction on the control flow of the program is proposed to mitigate the explosion of the number of elements generated by the analysis. Additionally, a widening operator is defined to ensure the convergence of the analysis on recursive functions and loops. An instantiation of this framework is implemented in the prototype tool PRECiSA, which generates formal proof certificates stating the correctness of the computed round-off errors.
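    The phenomenon the framework bounds statically can be observed dynamically. The sketch below (an illustration only, not PRECiSA's algorithm) measures the round-off error a naive floating-point summation accumulates, by comparing it against exact rational arithmetic over the same float operands:

```python
from fractions import Fraction

def accumulated_roundoff(values):
    """Absolute round-off error accumulated by a naive floating-point
    summation, measured against exact rational arithmetic over the
    same (already rounded) float operands."""
    fp_sum = 0.0
    exact_sum = Fraction(0)
    for v in values:
        fp_sum += v               # each addition rounds to nearest double
        exact_sum += Fraction(v)  # exact arithmetic, no rounding
    return abs(Fraction(fp_sum) - exact_sum)

# Summing 0.1 ten times accumulates rounding at every addition;
# summing 0.5 (exactly representable, exact partial sums) does not.
err = accumulated_roundoff([0.1] * 10)
```

    A static analyzer such as PRECiSA over-approximates this quantity symbolically for every execution path, without running the program.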

    Cancer: Investigating the impact of the implementation platform on machine learning models

    Get PDF
    In the context of global cancer prevalence and the imperative need to improve diagnostic efficiency, scientists have turned to machine learning (ML) techniques to expedite diagnosis processes. Although previous research has shown promising results in developing predictive models for faster cancer diagnosis, discrepancies in outcomes have emerged, even when employing the same dataset. This study addresses a critical question: does the choice of development platform for ML models impact their performance in cancer diagnosis? The publicly available Wisconsin Diagnostic Breast Cancer (WDBC) dataset from the University of California, Irvine (UCI) was used to train four ML algorithms on two distinct platforms: Python SciKit-Learn and Knime Analytics. The algorithms' performance was rigorously assessed and compared, with both platforms operating under their default configurations. The findings of this study underscore an impact of platform selection on ML model performance, emphasizing the need for thoughtful consideration when choosing a platform for predictive models' development. Such a decision bears significant implications for model efficacy and, ultimately, patient outcomes in the healthcare industry. The source code (Python and Knime) and data for this study are made fully available through a public GitHub repository.
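    A minimal sketch of the scikit-learn side of such a setup: train classifiers on the WDBC dataset under default configurations and compare held-out accuracy. The four algorithms chosen here are assumptions for illustration; the study's exact selection is in its public repository. (The WDBC dataset ships with scikit-learn as `load_breast_cancer`.)

```python
from sklearn.datasets import load_breast_cancer
from sklearn.model_selection import train_test_split
from sklearn.linear_model import LogisticRegression
from sklearn.tree import DecisionTreeClassifier
from sklearn.ensemble import RandomForestClassifier
from sklearn.svm import SVC

# WDBC (UCI) as bundled with scikit-learn
X, y = load_breast_cancer(return_X_y=True)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=42)

# Default configurations throughout; max_iter is raised only so the
# default solver converges on the unscaled features.
models = {
    "logreg": LogisticRegression(max_iter=5000),
    "tree": DecisionTreeClassifier(random_state=42),
    "forest": RandomForestClassifier(random_state=42),
    "svm": SVC(),
}
scores = {name: m.fit(X_tr, y_tr).score(X_te, y_te) for name, m in models.items()}
```

    Even within one platform, small differences in defaults (solvers, regularization, split strategy) move these scores, which is the kind of discrepancy the study investigates across platforms.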

    Design and construction of a display symbology tester for general aviation

    Get PDF
    The objective of this research project was to design and construct a system to evaluate the effectiveness of diverse types of flight instrument symbology in transmitting information to the pilot. A structured systems engineering approach was used to select system components given the requirements to minimize cost and modifications to the test aircraft and maximize display flexibility while providing enough information to execute instrument flight tasks. The system is composed of three major elements: flight data collection, data processing, and the display. Each of these elements was analyzed individually to establish requirements and implementation options. The options were compared using a weighted-array analysis to select the most appropriate solution. The final system configuration combined a three-axis inertial measurement unit from Watson Industries, a Motorola pressure sensor, a Pentium III laptop computer running custom software written in Microsoft Visual Basic, and an Earth Technologies 8.4-inch color liquid crystal display. The software was developed using an iterative process to design, test, and refine instrument appearance and function in a simulated flight environment. During early testing, significant IMU deficiencies were noted and partially compensated for by the addition of a pressure sensor for direct altitude measurement and a variable velocity input circuit. Despite these enhancements, IMU performance remained poor and resulted in a significant number of aborted test points. Three display layouts were used to conduct a limited evaluation of the potential of the system to meet the requirements. The first display was designed to closely replicate the standard GA aircraft 'instrument-T'. The second consisted of a gyroscope the size of the entire display with digital readouts of other flight parameters superimposed upon it. The third display was designed to look similar to an F-16 head-up display.
Three pilots were chosen from diverse backgrounds and were tasked with performing a simulated precision instrument approach while their performance was recorded. Error analysis was then conducted using commercial data analysis and plotting tools. Pilot performance varied widely between displays, with the instrument-T display producing the worst average performance. This proof-of-principle evaluation was successful, but the basic system architecture should be refined one more time before conducting further symbology testing. The most significant recommendation is to replace the IMU with a more reliable data collection system, preferably one based upon the Global Positioning System. Also, before a rigorous symbology evaluation can be conducted, more detail needs to be added to the instrument types already implemented, and new instruments, such as a horizontal situation indicator and turn-and-slip instrument, must be added. Then a sponsor should be sought to finance continuing display research projects.

    Plasma needle: exploring biomedical applications of non-thermal plasmas

    Get PDF
    The plasma needle is a novel design of a radio-frequency discharge in helium/air mixtures at atmospheric pressure. The discharge contains neutral, excited and ionized particles, and emits ultraviolet (UV) light. It operates at low electric power and close to ambient temperature; it combines chemical activity with a non-destructive character. Therefore it is expected that the plasma needle will be used in the future in (micro)surgery, e.g. in wound healing and in controlled tissue removal through cell detachment or apoptosis, avoiding necrosis and inflammation reactions. The focus of this study is both on optimization of the needle design and on assessment of the effects of plasma activity on living cells. This work is a pioneering study of the effects of non-thermal plasma on biological samples. The design of the plasma needle was adjusted in such a way that instead of operating in a closed reactor, the treatments could now be performed in open air. Thus, larger samples could be treated and handling times were reduced. Then, a characterization of the needle was performed using electrical as well as optical diagnostics (Chapter 3). It was found that the needle operated at voltages of 140 Vrms and higher. A model was made to determine the resistance of the plasma, and from this an estimation of the electron density could be made. The latter can be regarded as an indirect measure of plasma reactivity. Results from optical emission spectroscopy showed that reactive oxygen species, such as O· and OH·, were produced in the plasma. Furthermore, UV emission was detected. Both the radicals and the UV are known to interact with cells and tissues. For applications, the amount of radicals that reach the sample or that are generated in the sample is important. For this reason, radicals were detected in liquid that was treated with plasma using a chemical technique (Chapter 4).
It involved a fluorescent probe: the probe was dissolved in liquid and after reaction with specific radicals it became fluorescent. Radical density in the liquid depended on plasma conditions, treatment time, and amount of liquid used, but it was always in the micromolar range. These concentrations were found to be comparable with physiological concentrations that were stated in literature. Basic cell reactions after plasma treatment were determined by experiments on cultured Chinese hamster ovarian (CHO K1) cells (Chapter 5). One of these reactions was cell detachment: cells detached from their environment but remained alive after treatment. Other reactions included a small percentage of apoptosis and, when high plasma powers were used, necrosis. A comparison with the effect of UV light from UV lamps was made (Chapter 7). The main effect of UV treatment was necrosis, but only above a certain threshold value. For mammalian cells, this threshold was reasonably high. Thus, the effects of plasma treatment could not be explained by the action of the UV light from the plasma. Quantitative experiments were performed on cultured bovine aortic endothelial cells (BAEC) and rat smooth muscle cells (A7r5) (Chapter 6). These two cell types constitute walls of blood vessels. It was shown that treatment times of less than one minute cause detachment of the cells if the layer thickness of the liquid that covered the cells was low (around 0.1 mm). This suggests that at short treatment times, the penetration depth of the plasma into the sample is limited. The percentage of necrotic cells was low after treatment. No difference was found in the detachment behavior of both cell types. Finally, pilot experiments were performed on carotid arteries of C57BL/6 and Swiss mice ex vivo (Chapter 8). They were studied using a two-photon laser scanning microscope (TPLSM). Cell nuclei, elastin bands, and collagen could be visualized.
Preliminary results indicate that induced changes are not strongly dependent on applied energy if no heating effects are induced. Apparent effects were limited to the adventitia, probably due to a low penetration depth of active plasma species. In conclusion, we can state that the plasma needle is a non-destructive tool that can be applied with precision. It has a superficial action and causes little damage to the tissue. The level of damage can be controlled to achieve a desired therapeutic effect. Both on cultured cells and on ex vivo arteries, interesting effects were found that confirm the hypothesis that the plasma needle will have a future in surgery.

    DEsignBench: Exploring and Benchmarking DALL-E 3 for Imagining Visual Design

    Full text link
    We introduce DEsignBench, a text-to-image (T2I) generation benchmark tailored for visual design scenarios. Recent T2I models like DALL-E 3 and others have demonstrated remarkable capabilities in generating photorealistic images that align closely with textual inputs. While the allure of creating visually captivating images is undeniable, our emphasis extends beyond mere aesthetic pleasure. We aim to investigate the potential of using these powerful models in authentic design contexts. In pursuit of this goal, we develop DEsignBench, which incorporates test samples designed to assess T2I models on both "design technical capability" and "design application scenario." Each of these two dimensions is supported by a diverse set of specific design categories. We explore DALL-E 3 together with other leading T2I models on DEsignBench, resulting in a comprehensive visual gallery for side-by-side comparisons. For DEsignBench benchmarking, we perform human evaluations on generated images in the DEsignBench gallery, against the criteria of image-text alignment, visual aesthetics, and design creativity. Our evaluation also considers other specialized design capabilities, including text rendering, layout composition, color harmony, 3D design, and medium style. In addition to human evaluations, we introduce the first automatic image generation evaluator powered by GPT-4V. This evaluator provides ratings that align well with human judgments, while being easily replicable and cost-efficient. A high-resolution version is available at https://github.com/design-bench/design-bench.github.io/raw/main/designbench.pdf?download=
    Comment: Project page at https://design-bench.github.io

    Force-induced acoustic phonon transport across single-digit nanometre vacuum gaps

    Full text link
    Heat transfer between bodies separated by nanoscale vacuum gap distances has been extensively studied for potential applications in thermal management, energy conversion and data storage. For vacuum gap distances down to 20 nm, state-of-the-art experiments demonstrated that heat transport is mediated by near-field thermal radiation, which can exceed Planck's blackbody limit due to the tunneling of evanescent electromagnetic waves. However, at sub-10-nm vacuum gap distances, current measurements are in disagreement on the mechanisms driving thermal transport. While it has been hypothesized that acoustic phonon transport across single-digit nanometre vacuum gaps (or acoustic phonon tunneling) can dominate heat transfer, the underlying physics of this phenomenon and its experimental demonstration are still unexplored. Here, we use a custom-built high-vacuum shear force microscope (HV-SFM) to measure heat transfer between a silicon (Si) tip and a feedback-controlled platinum (Pt) nanoheater in the near-contact, asperity-contact, and bulk-contact regimes. We demonstrate that in the near-contact regime (i.e., single-digit nanometre or smaller vacuum gaps before making asperity contact), heat transfer between Si and Pt surfaces is dominated by force-induced acoustic phonon transport that exceeds near-field thermal radiation predictions by up to three orders of magnitude. The measured thermal conductance shows a gap dependence of d^{-5.7±1.1} in the near-contact regime, which is consistent with acoustic phonon transport modelling based on the atomistic Green's function (AGF) framework.
Our work suggests the possibility of engineering heat transfer across single-digit nanometre vacuum gaps with external force stimuli, which can make transformative impacts to the development of emerging thermal management technologies.
Comment: 9 pages with 4 figures (Main text), 13 pages with 7 figures (Methods), and 13 pages with 6 figures and 1 table (Supplementary Information)
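    A power-law gap dependence like d^{-5.7±1.1} is typically extracted as the slope of a log-log fit of conductance versus gap. The sketch below illustrates this on synthetic data (an assumption for demonstration; it is not the authors' analysis pipeline):

```python
import numpy as np

def power_law_exponent(d, G):
    """Fit G ~ d^n on a log-log scale and return the exponent n.
    A linear fit of log(G) against log(d) has slope n."""
    n, _intercept = np.polyfit(np.log(d), np.log(G), 1)
    return n

# Synthetic conductance data following d^-5.7 over single-digit nm gaps
d = np.linspace(0.5e-9, 5e-9, 20)   # gap distances in metres
G = 1e-30 * d ** -5.7               # conductance, arbitrary prefactor
n_est = power_law_exponent(d, G)    # recovers ~-5.7
```

    On real measurements the fit's covariance supplies the uncertainty on the exponent (the ±1.1 in the abstract).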

    Safety-oriented discrete event model for airport A-SMGCS reliability assessment

    Get PDF
    A detailed analysis of State-of-the-Art Technologies and Procedures in Airport Advanced-Surface Movement Guidance and Control Systems has been provided in this thesis, together with a review of Statistical Monte Carlo Analysis, Reliability Assessment and Petri Net theories. This practical and theoretical background has led the author to the conclusion that there is a lack of linkage between these fields. At the same time, the rapid increase of Air Traffic all over the world has highlighted the urgent need for practical instruments able to identify and quantify the risks connected with Aircraft operations on the ground, since the Airport has proven to be the actual 'bottleneck' of the entire Air Transport System. Therefore, the only winning approach to such a critical matter has to be multi-disciplinary, stitching together apparently different subjects coming from the most disparate areas of interest and trying to fill the gap. This thesis work culminated in the development of a Timed Coloured Petri Net (TCPN) model of a 'sample' Airport A-SMGCS, which is capable of taking into account different orders of questions that have arisen in recent years and tries to give them some good answers.
    The A-SMGCS Airport model is, in the end, a parametric tool relying on Discrete Event System theory, able to perform a Reliability Analysis of the system itself, that:
    • uses a Monte Carlo Analysis applied to a Timed Coloured Petri Net, whose purpose is to evaluate the Safety Level of Surface Movements along an Airport;
    • lets the user analyse the impact of Procedures and Reliability Indexes of Systems such as Surface Movement Radars, Automatic Dependent Surveillance-Broadcast, Airport Lighting Systems, Microwave Sensors, and so on, on the Safety Level of the Airport Aircraft Transport System;
    • is a valid instrument not only in the Design Phase, but also in Certifying Activities and in monitoring the Safety Level of the above-mentioned System with respect to changes in Technologies and Procedures.
    This TCPN model has been verified against qualitative engineering expectations by using simulation experiments and occupancy time schedules generated a priori. Simulation times are good, and since the model has been written in the Simulink/Stateflow programming language, it can be compiled to run in real time in C (Real-Time Workshop and Stateflow Coder), thus relying on portable code able to run on virtually any platform, giving even better performance in terms of execution time. One of the most interesting applications of this work is the estimate, for an Airport, of the A-SMGCS level of implementation needed (technical/economical convenience evaluation). As a matter of fact, starting from the Traffic Volume and choosing the kind of Ground Equipment to be installed, one can make predictions about the Safety Level of the System: if the value is compliant with the TLS required by ICAO, the A-SMGCS level of implementation is adequate. Nevertheless, even if the Level of Safety is satisfied, delays due to reduced or simplified performance of some of the equipment (e.g. with reference to False Alarm Rates) can lead to previously unexpected economic consequences, thus requiring more accurate systems to be installed in order to meet Airport economic constraints as well. Work in progress includes the analysis of the effect of weather conditions and of re-sequencing a given schedule. The effect of re-sequencing a given schedule is not yet realistic enough, since the model does not apply inter-arrival and departure separations. However, the model might show some effect of different sequences based on runway occupancy times. A further developed model containing wake turbulence separation conditions would be more sensitive in this case. Hence, further work will be directed towards:
    • the development of On-Line Re-Scheduling based on the available actual runway/taxiway configuration and weather conditions;
    • the Engineering Safety Assessment of some small Italian Airport A-SMGCSs (model validation with real data);
    • the application of Stochastic Differential Equation systems in order to evaluate the collision risk on the ground inside a single Place of the Petri Net, in the event of a Short Term Conflict Alert (STCA), by adopting the Reich Collision Risk Model;
    • Optimal Air Traffic Control Algorithm Synthesis (adaptive look-ahead Optimization) by Dynamically Timed Coloured Petri Nets, together with the implementation of Error-Recovery Strategies and Diagnosis Functions.
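    The core idea of the Monte Carlo step can be sketched in a few lines: repeatedly simulate the system with randomized component failures and estimate the probability of an unsafe state from the frequency of unsafe runs. The toy below assumes two independent surveillance layers with hypothetical failure probabilities; the thesis itself drives a full Timed Coloured Petri Net model, not this simplification.

```python
import random

def monte_carlo_unsafe_prob(p_fail_radar, p_fail_ads_b, trials=100_000, seed=1):
    """Toy Monte Carlo estimate of the probability that both surveillance
    layers fail during one surface movement (illustrative only; the layer
    names and probabilities are assumptions, not values from the thesis)."""
    rng = random.Random(seed)
    unsafe = sum(
        1 for _ in range(trials)
        if rng.random() < p_fail_radar and rng.random() < p_fail_ads_b
    )
    return unsafe / trials

# With independent 1% failure rates per layer, the estimate clusters
# around 1e-4; in the thesis this figure would be compared against
# the ICAO Target Level of Safety (TLS).
p_unsafe = monte_carlo_unsafe_prob(1e-2, 1e-2)
```

    The Petri net formulation generalizes this scheme: each trial is a timed simulation of token movements (aircraft, clearances, equipment states) rather than two independent coin flips.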

    Novel Development of a Low-Cost, Micrometer-Scale Tip-Enhanced Raman Spectroscopy System

    Get PDF
    Modern scientific instruments are significant capital investments for universities. These investments can be outside of the funding capabilities of some smaller universities or departments and can be a significant barrier in the pursuit of scientific breakthroughs. This project aims to provide a template for universities or research groups to upgrade, at a reasonable price, an existing Raman spectroscopy system to a Tip-Enhanced Raman Spectroscopy (TERS) system. This system can serve as a permanent upgrade to an existing system or as a bridge necessary to prove the viability of a research path before significant capital investment in a commercial TERS system. This project explains, in detail, all required components of a TERS system and the rationale of each designed component. The enhancement factor demonstrated using this homebuilt TERS system shows the potential of this system as an alternative to much more expensive commercial systems.

    WPI Research, 2017

    Get PDF
    https://digitalcommons.wpi.edu/wpiresearch-all/1003/thumbnail.jp