1,192 research outputs found

    An automated wrapper-based approach to the design of dependable software

    The design of dependable software systems invariably comprises two main activities: (i) the design of dependability mechanisms, and (ii) the location of dependability mechanisms. It has been shown that these activities are intrinsically difficult. In this paper, we propose an automated wrapper-based methodology to circumvent the problems associated with the design and location of dependability mechanisms. To achieve this, we replicate important variables so that they can be used as part of standard, efficient dependability mechanisms. These well-understood mechanisms are then deployed in all relevant locations. To validate the proposed methodology, we apply it to three complex software systems, evaluating the dependability enhancement and execution overhead in each case. The results demonstrate that the system failure rate of a wrapped software system can be several orders of magnitude lower than that of an unwrapped equivalent.
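
    The abstract leaves the mechanism itself abstract; as a concrete illustration, the sketch below wraps a single important variable with triplication and majority voting. It is only an assumption about the flavour of mechanism meant, not the paper's implementation, and all names are illustrative.

```python
# Minimal sketch (not the paper's implementation): a wrapper that replicates
# an "important variable" and applies a standard dependability mechanism
# (majority voting) on every read. Triplication is an assumed policy.
from collections import Counter


class ReplicatedVariable:
    """Holds N copies of a value; reads return the majority-voted value."""

    def __init__(self, value, copies=3):
        self._copies = [value] * copies

    def write(self, value):
        # A write refreshes every replica, masking earlier corruption.
        self._copies = [value] * len(self._copies)

    def read(self):
        # Majority vote across replicas; a single corrupted copy is outvoted.
        value, votes = Counter(self._copies).most_common(1)[0]
        if votes <= len(self._copies) // 2:
            raise RuntimeError("no majority: replicas disagree beyond repair")
        self._copies = [value] * len(self._copies)  # repair minority copies
        return value


# Usage: wrap a critical state variable instead of storing it directly.
speed = ReplicatedVariable(0.0)
speed.write(42.0)
speed._copies[1] = 999.0       # simulate a memory fault in one replica
assert speed.read() == 42.0    # the fault is masked and repaired
```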

    PyCARL: A PyNN Interface for Hardware-Software Co-Simulation of Spiking Neural Networks

    We present PyCARL, a PyNN-based common Python programming interface for hardware-software co-simulation of spiking neural networks (SNNs). Through PyCARL, we make the following two key contributions. First, we provide an interface from PyNN to CARLsim, a computationally efficient, GPU-accelerated, and biophysically detailed SNN simulator. PyCARL facilitates joint development of machine learning models and code sharing between CARLsim and PyNN users, promoting an integrated and larger neuromorphic community. Second, we integrate cycle-accurate models of state-of-the-art neuromorphic hardware such as TrueNorth, Loihi, and DynapSE into PyCARL, to accurately model hardware latencies that delay spikes between communicating neurons and degrade performance. PyCARL allows users to analyze and optimize the performance difference between software-only simulation and hardware-software co-simulation of their machine learning models. We show that system designers can also use PyCARL to perform design-space exploration early in the product development stage, facilitating faster time-to-deployment of neuromorphic products. We evaluate the memory usage and simulation time of PyCARL using functionality tests, synthetic SNNs, and realistic applications. Our results demonstrate that for large SNNs, PyCARL does not introduce any significant overhead compared to CARLsim. We also use PyCARL to analyze these SNNs on state-of-the-art neuromorphic hardware and demonstrate a significant performance deviation from software-only simulations. PyCARL allows such differences to be evaluated and minimized early during model development. Comment: 10 pages, 25 figures. Accepted for publication at the International Joint Conference on Neural Networks (IJCNN) 2020.
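
    Because PyCARL keeps the standard PyNN front end, a model written once against PyNN should need only a backend change to run on CARLsim. The sketch below is plain PyNN 0.9 code using the NEST backend; the comment marks where a CARLsim-backed module would be imported instead (its exact name is not taken from the paper), so it illustrates the programming model rather than PyCARL itself.

```python
# Minimal PyNN sketch: the same network description runs on any PyNN backend.
import pyNN.nest as sim      # a CARLsim-backed module would replace this import

sim.setup(timestep=1.0)      # ms

# 100 Poisson sources driving 10 leaky integrate-and-fire neurons.
stim = sim.Population(100, sim.SpikeSourcePoisson(rate=20.0))
neurons = sim.Population(10, sim.IF_curr_exp())

sim.Projection(stim, neurons, sim.AllToAllConnector(),
               synapse_type=sim.StaticSynapse(weight=0.05, delay=1.0))

neurons.record("spikes")
sim.run(1000.0)              # simulate 1 s of biological time

spiketrains = neurons.get_data().segments[0].spiketrains
print("mean rate:", sum(len(st) for st in spiketrains) / len(spiketrains), "Hz")

sim.end()
```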

    Beta-Delayed Neutron Spectroscopy of Fission Fragments Using the Versatile Array of Neutron Detectors at Low Energy

    This work details the study of nuclear decay in the region of doubly magic 78Ni using the Versatile Array of Neutron Detectors at Low Energy (VANDLE). This detector system uses the time-of-flight technique to measure the energy of beta-delayed neutrons. VANDLE uses a fully digital data acquisition system equipped with timing algorithms developed as part of the experimental work. The experiment examined nearly 30 beta-delayed neutron precursors produced at the Holifield Radioactive Ion Beam Facility at Oak Ridge National Laboratory. This work discusses three of these nuclei: 77,78Cu and 84Ga. Results from the experiment provide details of the Gamow-Teller decay in neutron-rich nuclei near 78Ni.
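
    For reference, the time-of-flight technique mentioned above reduces to E = 1/2 m_n (L/t)^2 in the non-relativistic limit appropriate for beta-delayed neutrons. The short sketch below evaluates this relation; the 100 cm flight path in the example call is an assumed figure, not the experiment's geometry.

```python
# Time-of-flight energy reconstruction in the non-relativistic limit:
# E = 1/2 m_n (L/t)^2. Constants in MeV and cm/ns; flight path is assumed.
NEUTRON_MASS_MEV = 939.565   # m_n c^2 in MeV
C_CM_PER_NS = 29.9792        # speed of light in cm/ns


def neutron_energy_mev(flight_path_cm, tof_ns):
    """Neutron kinetic energy (MeV) from flight path (cm) and time of flight (ns)."""
    beta = flight_path_cm / (tof_ns * C_CM_PER_NS)   # v/c
    return 0.5 * NEUTRON_MASS_MEV * beta ** 2


# An MeV-scale neutron over an assumed 100 cm flight path:
print(round(neutron_energy_mev(100.0, 50.0), 2), "MeV")   # ~2.09 MeV
```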

    The IceCube Neutrino Observatory: Instrumentation and Online Systems

    The IceCube Neutrino Observatory is a cubic-kilometer-scale high-energy neutrino detector built into the ice at the South Pole. Construction of IceCube, the largest neutrino detector built to date, was completed in 2011 and enabled the discovery of high-energy astrophysical neutrinos. We describe here the design, production, and calibration of the IceCube digital optical module (DOM), the cable systems, computing hardware, and our methodology for drilling and deployment. We also describe the online triggering and data filtering systems that select candidate neutrino and cosmic-ray events for analysis. Due to a rigorous pre-deployment protocol, 98.4% of the DOMs in the deep ice are operating and collecting data. IceCube routinely achieves a detector uptime of 99% by emphasizing software stability and monitoring. Detector operations have been stable since construction was completed, and the detector is expected to operate at least until the end of the next decade. Comment: 83 pages, 50 figures; updated with minor changes from journal review and proofing.
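
    As an illustration of the online triggering mentioned above, the sketch below implements a generic simple multiplicity trigger: fire when at least N DOM hits fall inside a sliding time window. The N = 8 and 5 microsecond values are illustrative defaults, not parameters quoted from the paper.

```python
# Sketch of a simple multiplicity trigger: require n_hits DOM hits within a
# sliding time window. Parameter values below are illustrative defaults.
def simple_multiplicity_trigger(hit_times_ns, n_hits=8, window_ns=5000):
    """Return True if any window of `window_ns` contains at least n_hits hits."""
    times = sorted(hit_times_ns)
    left = 0
    for right, t in enumerate(times):
        while t - times[left] > window_ns:
            left += 1            # slide the window past hits that are too old
        if right - left + 1 >= n_hits:
            return True
    return False


# Usage: 8 hits bunched within 2 microseconds fire the trigger; sparse noise does not.
burst = [100 + 250 * i for i in range(8)]
noise = [100_000 * i for i in range(8)]
print(simple_multiplicity_trigger(burst))   # True
print(simple_multiplicity_trigger(noise))   # False
```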

    Playing for Data: Ground Truth from Computer Games

    Recent progress in computer vision has been driven by high-capacity models trained on large datasets. Unfortunately, creating large datasets with pixel-level labels has been extremely costly due to the amount of human effort required. In this paper, we present an approach to rapidly creating pixel-accurate semantic label maps for images extracted from modern computer games. Although the source code and the internal operation of commercial games are inaccessible, we show that associations between image patches can be reconstructed from the communication between the game and the graphics hardware. This enables rapid propagation of semantic labels within and across images synthesized by the game, with no access to the source code or the content. We validate the presented approach by producing dense pixel-level semantic annotations for 25 thousand images synthesized by a photorealistic open-world computer game. Experiments on semantic segmentation datasets show that using the acquired data to supplement real-world images significantly increases accuracy, and that the acquired data enables reducing the amount of hand-labeled real-world data: models trained with game data and just 1/3 of the CamVid training set outperform models trained on the complete CamVid training set. Comment: Accepted to the 14th European Conference on Computer Vision (ECCV 2016).
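
    The label-propagation idea can be pictured as grouping patches by the rendering resources (mesh, texture, shader) recorded from the intercepted game-to-GPU communication, so that one manual label per resource signature covers every patch that shares it. The sketch below is a toy illustration with made-up data structures, not the paper's pipeline.

```python
# Toy sketch of label propagation via shared rendering-resource signatures.
# All identifiers and data are illustrative placeholders.
from collections import defaultdict

# Each patch carries the resource signature observed in the intercepted draw calls.
patches = [
    {"image": "frame_0001", "bbox": (0, 0, 32, 32), "signature": ("mesh_car_07", "tex_car_red", "shader_pbr")},
    {"image": "frame_0001", "bbox": (32, 0, 64, 32), "signature": ("mesh_road_01", "tex_asphalt", "shader_pbr")},
    {"image": "frame_0412", "bbox": (8, 8, 40, 40), "signature": ("mesh_car_07", "tex_car_red", "shader_pbr")},
]

# One human annotation per signature, not per pixel or per image.
annotations = {
    ("mesh_car_07", "tex_car_red", "shader_pbr"): "car",
    ("mesh_road_01", "tex_asphalt", "shader_pbr"): "road",
}

# Propagate labels within and across frames that reuse the same resources.
labels_per_image = defaultdict(list)
for patch in patches:
    label = annotations.get(patch["signature"], "unlabeled")
    labels_per_image[patch["image"]].append((patch["bbox"], label))

print(dict(labels_per_image))
```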

    Enterprise manager tool

    Text in English; abstract in English and Turkish. Includes bibliographical references (leaves 48). vii, 42 leaves. In this thesis we develop a new Enterprise Manager Tool with the capability to transfer data to Excel sheets. It is a user-friendly tool that allows users to access databases and database objects such as tables, views, and stored procedures, and to manage their databases from wherever they are. With the transfer-to-Excel capability, the data can be viewed more conveniently. The tool is built with .NET technology and MSSQL Server. Turkish abstract: Enterprise Manager Tool is a Windows application built so that users can access and manage their databases and database objects (tables, stored procedures, views); it includes a data-transfer-to-Excel feature. Thanks to this feature, users can easily work with their data in Excel. The tool was developed with .NET as the technology and MSSQL Server as the database server.
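
    The thesis implements the tool in .NET against MSSQL Server; purely as an illustration of the described workflow (query a table, transfer the result to Excel), here is a minimal Python sketch. The connection string, table, and file names are placeholders, and this is not the thesis's implementation.

```python
# Minimal sketch of the "query a SQL Server table and transfer it to Excel"
# workflow. Connection string, table, and file names are placeholders.
import pandas as pd
import pyodbc


def export_table_to_excel(conn_str, table, xlsx_path):
    """Read one table from SQL Server and write it to an Excel sheet."""
    with pyodbc.connect(conn_str) as conn:
        frame = pd.read_sql(f"SELECT * FROM {table}", conn)
    frame.to_excel(xlsx_path, sheet_name=table, index=False)


export_table_to_excel(
    "DRIVER={ODBC Driver 17 for SQL Server};SERVER=localhost;"
    "DATABASE=ExampleDb;Trusted_Connection=yes;",
    "Customers",
    "Customers.xlsx",
)
```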

    A study of the security implications involved with the use of executable World Wide Web content

    Malicious executable code is nothing new. While many consider that the concept of malicious code began in the 1980s when the first PC viruses began to emerge, the concept does in fact date back even earlier. Throughout the history of malicious code, methods of hostile code delivery have mirrored prevailing patterns of code distribution. In the 1980s, file-infecting and boot-sector viruses were common, mirroring the fact that during this time executable code was commonly transferred via floppy disks. Since the 1990s, email has been a major vector for malicious code attacks. Again, this mirrors the fact that during this period email has been a common means of sharing code and documents. This thesis examines another model of executable code distribution. It considers the security risks involved with the use of executable code embedded in or attached to World Wide Web pages. In particular, two technologies are examined. Sun Microsystems' Java Programming Language and Microsoft's ActiveX Control Architecture are both technologies that can be used to connect executable program code to World Wide Web pages. This thesis examines the architectures on which these technologies are based, as well as the security and trust models that they implement. In doing so, this thesis aims to assess the level of risk posed by such technologies and to highlight similar risks that might occur with similar future technologies.
    • 

    corecore