
    Model Checking Graph Transformations: A Comparison of Two Approaches


    Verificare: a platform for composable verification with application to SDN-Enabled systems

    Software-Defined Networking (SDN) has become increasingly prevalent in both the academic and industrial communities. A new class of systems built on SDNs, which we refer to as SDN-Enabled, provides programmatic interfaces between the SDN controller and the larger distributed system. Existing tools for SDN verification and analysis are insufficiently expressive to capture this composition of a network and a larger distributed system. Generic verification systems are an infeasible solution, due to their monolithic approach to modeling and rapid state-space explosion. In this thesis we present a new compositional approach to system modeling and verification that is particularly appropriate for SDN-Enabled systems. Compositional models may have sub-components (such as switches and end-hosts) modified, added, or removed with only minimal, isolated changes. Furthermore, invariants may be defined over the composed system that restrict its behavior, allowing assumptions to be added or removed and components to be abstracted away into the service guarantees that they provide (such as guaranteed packet arrival). Finally, compositional modeling can minimize the size of the state space to be verified by taking advantage of known model structure. We also present the Verificare platform, a tool chain for building compositional models in our modeling language and automatically compiling them to multiple off-the-shelf verification tools. The compiler outputs a minimal, calculus-oblivious formalism, which is accessed by plugins via a translation API. This enables a wide variety of requirements to be verified. As new tools become available, the translator can easily be extended with plugins to support them.
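
    A minimal sketch of the plugin-based translation API described above, assuming hypothetical class and method names; the intermediate representation, the Promela-like output, and the plugin interface are illustrative and not Verificare's actual code.

        # Hypothetical sketch of a plugin-style translation API; names are illustrative.
        from abc import ABC, abstractmethod

        class IntermediateModel:
            """Calculus-oblivious representation emitted by the compiler front end."""
            def __init__(self, components, invariants):
                self.components = components   # e.g. switches, end-hosts, controller
                self.invariants = invariants   # behavioral restrictions / service guarantees

        class BackendPlugin(ABC):
            """Each off-the-shelf verifier is wrapped by one plugin."""
            @abstractmethod
            def translate(self, model: IntermediateModel) -> str:
                """Emit the verifier's input language for the given model."""

        class SpinLikePlugin(BackendPlugin):
            def translate(self, model: IntermediateModel) -> str:
                # Emit Promela-like text, one process per component (sketch only).
                procs = "\n".join(f"active proctype {c}() {{ /* ... */ }}"
                                  for c in model.components)
                claims = "\n".join(f"ltl inv{i} {{ {p} }}"
                                   for i, p in enumerate(model.invariants))
                return procs + "\n" + claims

        def compile_for(model: IntermediateModel, plugin: BackendPlugin) -> str:
            # Supporting a new verifier means registering one more plugin.
            return plugin.translate(model)

        model = IntermediateModel(components=["controller", "switch", "host"],
                                  invariants=["[] (sent -> <> delivered)"])
        print(compile_for(model, SpinLikePlugin()))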

    Versatile, low-cost, computer-controlled, sample positioning system for vacuum applications

    A versatile, low-cost, easy-to-implement, microprocessor-based motorized positioning system (MPS) suitable for accurate sample manipulation in a Secondary Ion Mass Spectrometry (SIMS) system, and for other ultra-high vacuum (UHV) applications, was designed and built at NASA LeRC. The system can be operated manually or under computer control. In the latter case, local as well as remote operation is possible via the IEEE-488 bus. The position of the sample can be controlled in three orthogonal linear coordinates and one angular coordinate.
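
    A brief sketch of how a host computer might drive such a four-axis stage over the IEEE-488 (GPIB) bus, here using the PyVISA library; the bus address and the ASCII command strings are invented for illustration, since the MPS command set is not given in the abstract.

        # Hedged sketch: commanding a 3-linear-axis + 1-rotary-axis stage over GPIB.
        # The "MOVE <axis> <value>" syntax and the GPIB address are assumptions.
        import pyvisa

        rm = pyvisa.ResourceManager()
        stage = rm.open_resource("GPIB0::12::INSTR")   # hypothetical bus address

        def move_to(x_mm: float, y_mm: float, z_mm: float, theta_deg: float) -> None:
            # Three orthogonal linear axes plus one angular axis.
            for axis, value in (("X", x_mm), ("Y", y_mm), ("Z", z_mm), ("R", theta_deg)):
                stage.write(f"MOVE {axis} {value:.3f}")  # hypothetical command syntax
                stage.query("*OPC?")                     # wait until motion completes

        move_to(10.0, 5.0, 0.0, 45.0)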

    Zero-knowledge Proof Meets Machine Learning in Verifiability: A Survey

    With the rapid advancement of artificial intelligence technology, the use of machine learning models is gradually becoming part of our daily lives. High-quality models rely not only on efficient optimization algorithms but also on the training and learning processes built upon vast amounts of data and computational power. However, in practice, due to various challenges such as limited computational resources and data privacy concerns, users in need of models often cannot train machine learning models locally. This has led them to explore alternative approaches such as outsourced learning and federated learning. While these methods address the feasibility of model training effectively, they introduce concerns about the trustworthiness of the training process since computations are not performed locally. Similarly, there are trustworthiness issues associated with outsourced model inference. These two problems can be summarized as the trustworthiness problem of model computations: How can one verify that the results computed by other participants are derived according to the specified algorithm, model, and input data? To address this challenge, verifiable machine learning (VML) has emerged. This paper presents a comprehensive survey of zero-knowledge proof-based verifiable machine learning (ZKP-VML) technology. We first analyze the potential verifiability issues that may exist in different machine learning scenarios. Subsequently, we provide a formal definition of ZKP-VML. We then conduct a detailed analysis and classification of existing works based on their technical approaches. Finally, we discuss the key challenges and future directions in the field of ZKP-based VML.
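
    A conceptual sketch of the prover/verifier interface that ZKP-VML provides, assuming placeholder names; the proof object below is a stand-in, and a real system would generate and check an actual zero-knowledge proof (e.g., a zk-SNARK) rather than comparing statements.

        # Skeleton of the verifiable-inference workflow: the prover runs the model and
        # attaches a proof; the verifier checks the claim without re-running the model.
        # The prove()/verify() bodies are placeholders, not real cryptography.
        from dataclasses import dataclass
        from typing import Any, Callable, Tuple

        @dataclass
        class Proof:
            statement: tuple       # (model commitment, input, claimed output)
            proof_bytes: bytes     # stands in for the zero-knowledge proof itself

        def prove(model: Callable[[Any], Any], model_commitment: str, x: Any) -> Tuple[Any, Proof]:
            y = model(x)                                    # outsourced computation
            pi = Proof(statement=(model_commitment, x, y),  # what the verifier learns
                       proof_bytes=b"<zk proof placeholder>")
            return y, pi

        def verify(model_commitment: str, x: Any, y: Any, pi: Proof) -> bool:
            # A real verifier checks pi cryptographically against (commitment, x, y);
            # this placeholder only checks that the claimed statement matches.
            return pi.statement == (model_commitment, x, y)

        y, pi = prove(lambda x: 2 * x, model_commitment="sha256:placeholder", x=21)
        print(verify("sha256:placeholder", 21, y, pi))   # True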

    Estimating memory locality for virtual machines on NUMA systems

    Thesis: M. Eng., Massachusetts Institute of Technology, Department of Electrical Engineering and Computer Science, 2013. Cataloged from PDF version of thesis. Includes bibliographical references (pages 59-61).
    The multicore revolution sparked another, similar movement towards scalable memory architectures. With most machines nowadays exhibiting non-uniform memory access (NUMA) properties, software and operating systems have seen the necessity to optimize their memory management to take full advantage of such architectures. Type 1 (native) hypervisors, in particular, are required to extract maximum performance from the underlying hardware, as they often run dozens of virtual machines (VMs) on a single system and provide clients with performance guarantees that must be met. While VM memory demand is often satisfied by CPU caches, memory-intensive workloads may induce a higher rate of last-level cache misses, requiring more accesses to RAM. On today's typical NUMA systems, accessing local RAM is approximately 50% faster than remote RAM. We discovered that current-generation processors from major manufacturers do not provide inexpensive ways to characterize the memory locality achieved by VMs and their constituents. Instead, we present in this thesis a series of techniques based on statistical sampling of memory that produce powerful estimates for NUMA locality and related metrics. Our estimates offer tremendous insight on inefficient placement of VMs and memory, and can be a solid basis for algorithms aiming at dynamic reorganization for improvements in locality, as well as NUMA-aware CPU scheduling algorithms.
    by Alexandre Milouchev, M. Eng.
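
    A minimal sketch of the sampling idea described above: estimate a VM's NUMA locality from a random subset of its pages rather than an exhaustive scan. The page list and the lookup hooks are synthetic stand-ins for information a hypervisor would supply.

        # Estimate the fraction of a VM's memory that is local to the node running its
        # vCPUs, using statistical sampling of pages (synthetic data; hypothetical hooks).
        import random

        def estimate_locality(vm_pages, vcpu_node, home_node_of_page, sample_size=1024):
            """Return the estimated fraction of sampled pages backed by the vCPU's node."""
            sample = random.sample(vm_pages, min(sample_size, len(vm_pages)))
            local = sum(1 for page in sample if home_node_of_page(page) == vcpu_node)
            return local / len(sample)

        # Synthetic example: 10,000 guest pages, 60% backed by node 0, vCPUs on node 0.
        pages = list(range(10_000))
        backing = {p: (0 if p < 6_000 else 1) for p in pages}
        print(estimate_locality(pages, vcpu_node=0, home_node_of_page=backing.__getitem__))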

    Pushouts in software architecture design

    A classical approach to program derivation is to progressively extend a simple specification and then incrementally refine it to an implementation. We claim this approach is hard or impractical when reverse engineering legacy software architectures. We present a case study that shows optimizations and pushouts--in addition to refinements and extensions--are essential for practical stepwise development of complex software architectures.
    Funding: NSF CCF 0724979, NSF CNS 0509338, NSF CCF 0917167, NSF DGE-1110007, FCT SFRH/BD/47800/2008, FCT UTAustin/CA/0056/200
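
    For context, the categorical construction named in the title is recalled below in its standard textbook form; this is the general definition, not the specific architecture category used in the case study.

        % Standard pushout: two extensions f : A -> B and g : A -> C of a common
        % specification A are merged into a single object P (requires amsmath for \xrightarrow).
        \[
        \begin{array}{ccc}
        A & \xrightarrow{\,f\,} & B \\[2pt]
        {\scriptstyle g}\big\downarrow & & \big\downarrow{\scriptstyle \iota_B} \\[2pt]
        C & \xrightarrow{\,\iota_C\,} & P
        \end{array}
        \qquad \iota_B \circ f = \iota_C \circ g
        \]
        % Universal property: for any Q with j_B : B -> Q and j_C : C -> Q such that
        % j_B \circ f = j_C \circ g, there is a unique u : P -> Q with
        % u \circ \iota_B = j_B and u \circ \iota_C = j_C.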

    FULLY AUTONOMOUS SELF-POWERED INTELLIGENT WIRELESS SENSOR FOR REAL-TIME TRAFFIC SURVEILLANCE IN SMART CITIES

    Reliable, real-time traffic surveillance is an integral and crucial function of the 21st century intelligent transportation systems (ITS) network. This technology facilitates instantaneous decision-making, improves roadway efficiency, and maximizes existing transportation infrastructure capacity, making transportation systems safer, more efficient, and more reliable. Given the rapidly approaching era of smart cities, the work detailed in this dissertation is timely in that it reports on the design, development, and implementation of a novel, fully autonomous, self-powered intelligent wireless sensor for real-time traffic surveillance. The innovative, multi-disciplinary integration of state-of-the-art, ultra-low-power embedded systems, smart physical sensors, and a wireless sensor network, all driven by intelligent algorithms, forms the basis of the developed Intelligent Vehicle Counting and Classification Sensor (iVCCS) platform. The sensor incorporates an energy-harvesting subsystem that extracts energy from multiple sources, enabling the node to power itself for a potentially indefinite lifetime. A wireless power receiver was also integrated to remotely charge the sensor's primary battery. Reliable and computationally efficient intelligent algorithms for vehicle detection, speed and length estimation, vehicle classification, vehicle re-identification, travel-time estimation, time synchronization, and drift compensation were fully developed, integrated, and evaluated. Several length-based vehicle classification schemes particular to the state of Oklahoma were developed, implemented, and evaluated using machine learning algorithms and probabilistic modeling of vehicle magnetic length. A feature-extraction study employing different techniques was conducted to determine suitable and efficient features for magnetic-signature-based vehicle re-identification. Additionally, two vehicle re-identification models based on matching vehicle magnetic signatures from a single magnetometer were developed. Comprehensive system evaluation and extensive data analyses were performed to fine-tune and validate the sensor, ensuring reliable and robust operation. Several field studies were conducted under various scenarios and traffic conditions on a number of highways and urban roads and resulted in 99.98% detection accuracy, 97.4782% speed estimation accuracy, and a 97.6951% classification rate when binning vehicles into four groups based on their magnetic length. Threshold-based re-identification results revealed a 65.25% to 100% identification rate for windows of 25 to 500 vehicles. Voting-based re-identification evaluation resulted in a 90% to 100% identification rate for windows of 25 to 500 vehicles. The developed platform is portable and cost-effective: a single sensor node costs only $30 and can be installed for short-term use (e.g., work-zone safety, traffic flow studies, roadway and bridge design, traffic management in atypical situations), as well as long-term use (e.g., collision avoidance at intersections, traffic monitoring) on highways, roadways, or roadside surfaces. The power-consumption assessment showed that the sensor can operate for several years. The iVCCS platform is expected to significantly supplement other data collection methods used for traffic monitoring throughout the United States, and the technology is poised to play a vital role in tomorrow's smart cities.
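
    A minimal sketch of length-based classification as described above: estimate a vehicle's magnetic length from its detection duration and estimated speed, then bin it into one of four classes. The bin boundaries and the detection-zone correction below are illustrative placeholders, not the Oklahoma-specific thresholds developed in the dissertation.

        # Length-based vehicle binning from a single magnetometer (illustrative values).
        def magnetic_length_m(speed_mps: float, detection_duration_s: float,
                              sensor_zone_m: float = 0.0) -> float:
            # Length swept past the sensor while the signature exceeds the detection
            # threshold, minus a fixed detection-zone correction (assumed model).
            return speed_mps * detection_duration_s - sensor_zone_m

        def classify(length_m: float) -> str:
            # Four length bins with placeholder boundaries.
            if length_m < 5.0:
                return "passenger car"
            if length_m < 9.0:
                return "van / light truck"
            if length_m < 15.0:
                return "single-unit truck / bus"
            return "combination truck"

        print(classify(magnetic_length_m(speed_mps=25.0, detection_duration_s=0.3)))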

    KP-LAB Knowledge Practices Laboratory -- Specification of end-user applications

    The present deliverable provides a high-level view of the new specifications of end-user applications defined in WPII during the M37-M46 period of the KP-Lab project. It is the last in a series of four deliverables that cover all the tools developed in the project, the previous ones being D6.1, D6.4 and D6.6. This deliverable presents specifications for the new functionalities supporting the dedicated research studies defined in the latest revision of the KP-Lab research strategy. The tools addressed are: the analytic tools (Data export, Time-line-based analyser, Visual analyser), Clipboard, Search, Versioning of uploadable content items, Visual Model Editor (VME) and Visual Modeling Language Editor (VMLE). The main part of the deliverable provides the summary of tool specifications and the description of the Knowledge Practices Environment architecture, as well as an overview of the revised technical design process, of the tools' relationship with the research studies, and of the driving objectives and the high-level requirements relevant for the present specifications. The full specifications of the tools are provided in Annexes 1-9.