
    Special Session on Industry 4.0

    No abstract available

    Analysis on the Possibility of RISC-V Adoption

    As the interface between hardware and software, Instruction Set Architectures (ISAs) play a key role in the operation of computers. While both hardware and software have continued to evolve rapidly over time, ISAs have undergone minimal change. Since its release in 2010, RISC-V has begun to erode the industry's aversion to ISA innovation. Built on the principles of the Reduced Instruction Set Computer (RISC), and released as an open-source ISA, RISC-V offers many benefits over popular ISAs such as Intel's x86 and Arm Holdings' Advanced RISC Machine (ARM). In this literature review I evaluate the literature addressing two questions: (1) What makes changing Instruction Set Architectures difficult? (2) Why might the industry choose to implement RISC-V? When researching this topic I searched the IEEE (Institute of Electrical and Electronics Engineers), INSPEC (Engineering Village), and ACM (Association for Computing Machinery) Digital Library databases, using the search terms "RISC-V", "Instruction Set Architecture", "RISC-V" AND "x86", and "RISC-V" AND "Instruction Set Architecture". This literature review evaluates 10 papers on the implementation of RISC-V. As this paper is intended to cover recent developments in the field, publication dates were limited to 2015 to the present.

    Invest to Save: Report and Recommendations of the NSF-DELOS Working Group on Digital Archiving and Preservation

    Digital archiving and preservation are important areas for research and development, but there is no agreed upon set of priorities or coherent plan for research in this area. Research projects in this area tend to be small and driven by particular institutional problems or concerns. As a consequence, proposed solutions from experimental projects and prototypes tend not to scale to millions of digital objects, nor do the results from disparate projects readily build on each other. It is also unclear whether it is worthwhile to seek general solutions or whether different strategies are needed for different types of digital objects and collections. The lack of coordination in both research and development means that there are some areas where researchers are reinventing the wheel while other areas are neglected. Digital archiving and preservation is an area that will benefit from an exercise in analysis, priority setting, and planning for future research. The WG aims to survey current research activities, identify gaps, and develop a white paper proposing future research directions in the area of digital preservation. Some of the potential areas for research include repository architectures and inter-operability among digital archives; automated tools for capture, ingest, and normalization of digital objects; and harmonization of preservation formats and metadata. There can also be opportunities for development of commercial products in the areas of mass storage systems, repositories and repository management systems, and data management software and tools.

    Emulating dynamic non-linear simulators using Gaussian processes

    The dynamic emulation of non-linear deterministic computer codes whose output is a time series, possibly multivariate, is examined. Such computer models simulate the evolution of some real-world phenomenon over time, for example models of the climate or the functioning of the human brain. The models we are interested in are highly non-linear and exhibit tipping points, bifurcations, and chaotic behaviour. However, each simulation run may be too time-consuming to perform analyses that require many runs, including quantifying the variation in model output with respect to changes in the inputs. Therefore, Gaussian process emulators are used to approximate the output of the code. To do this, the flow map of the system under study is emulated over a short time period and then applied iteratively to predict the whole time series. A number of ways are proposed to take into account the uncertainty of the inputs to the emulators, given fixed initial conditions, and the correlation between them through the time series. The methodology is illustrated with two examples: the highly non-linear dynamical systems described by the Lorenz and Van der Pol equations. In both cases, the predictive performance is relatively high, and the measure of uncertainty provided by the method reflects the extent of predictability in each system.
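    The flow-map approach described above lends itself to a compact sketch: fit an emulator to pairs (state, state after a short step dt), then iterate the emulator from an initial condition. The following is a minimal illustration under stated assumptions, using scikit-learn's GaussianProcessRegressor on the Lorenz system; the kernel choice, training design, and step size here are placeholders, not the paper's actual settings.

        import numpy as np
        from scipy.integrate import solve_ivp
        from sklearn.gaussian_process import GaussianProcessRegressor
        from sklearn.gaussian_process.kernels import RBF, ConstantKernel

        def lorenz(t, s, sigma=10.0, rho=28.0, beta=8.0 / 3.0):
            x, y, z = s
            return [sigma * (y - x), x * (rho - z), y - beta * z]

        dt = 0.05  # short flow-map step
        rng = np.random.default_rng(0)

        # Training design: pairs (state, state after dt) sampled over the attractor region.
        X_train = rng.uniform([-20.0, -25.0, 5.0], [20.0, 25.0, 45.0], size=(400, 3))
        Y_train = np.array([solve_ivp(lorenz, (0.0, dt), x, t_eval=[dt]).y[:, -1]
                            for x in X_train])

        # One independent GP per output dimension emulates the flow map.
        kernel = ConstantKernel(1.0) * RBF(length_scale=[5.0, 5.0, 5.0])
        gps = [GaussianProcessRegressor(kernel=kernel, normalize_y=True)
               .fit(X_train, Y_train[:, d]) for d in range(3)]

        # Iterate the emulator from a fixed initial condition to predict a trajectory.
        state = np.array([1.0, 1.0, 20.0])
        trajectory = [state]
        for _ in range(100):
            state = np.array([gp.predict(state.reshape(1, -1))[0] for gp in gps])
            trajectory.append(state)
        trajectory = np.array(trajectory)  # predicted mean path over 100 * dt time units

    Propagating the predictive variance through the iteration, rather than only the mean as here, is what yields the uncertainty measure discussed in the abstract.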

    "Innovation Versus Diffusion: Determinants of Productivity Growth Among Japanese Firms"

    This paper presents a model of firm-level productivity growth that distinguishes between innovation and technology diffusion, and then applies the model to a large-scale data set of Japanese manufacturing and non-manufacturing firms between 1994 and 2000. We find that both innovation and diffusion are important factors in firm-level productivity growth. The results also suggest that innovation comes not only directly from R&D activities but also indirectly from patent purchases and imports, which had previously been regarded as sources of technology diffusion rather than innovation. In fact, we find patent purchases are more effective in this regard than R&D expenditure.
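    As an illustration only (the abstract does not give the paper's exact specification), models in this literature commonly decompose a firm's productivity growth into an own-innovation term and a catch-up (diffusion) term, along the lines of

        \[
        \Delta \ln A_{it} = \alpha + \beta \, RD_{it} + \gamma \left( \ln A_{Ft} - \ln A_{it} \right) + \varepsilon_{it},
        \]

    where $A_{it}$ is firm $i$'s productivity, $RD_{it}$ proxies its own innovative effort, and the gap to the frontier productivity $A_{Ft}$ captures technology diffusion; these symbols are placeholders rather than the paper's notation.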

    Enterprise model verification and validation : an approach

    This article presents a verification and validation approach intended to complement the classical toolbox the industrial user may employ in the enterprise modeling and integration domain. The approach, which has been defined independently of any application domain, is based on several formal concepts and tools presented in this paper: property concepts, a property reference matrix, property graphs, an enterprise modeling domain ontology, conceptual graphs, and formal reasoning mechanisms.

    MiniCPS: A toolkit for security research on CPS Networks

    In recent years, tremendous effort has been spent on modernizing communication infrastructure in Cyber-Physical Systems (CPS) such as Industrial Control Systems (ICS) and related Supervisory Control and Data Acquisition (SCADA) systems. While a great amount of research has been conducted on the network security of office and home networks, the security of CPS and related systems has only recently gained comparable attention. Unfortunately, real-world CPS are often not open to security researchers, and as a result very few reference systems and topologies are available. In this work, we present MiniCPS, a CPS simulation toolbox intended to alleviate this problem. The goal of MiniCPS is to create an extensible, reproducible research environment targeted at communications and physical-layer interactions in CPS. MiniCPS builds on Mininet to provide lightweight real-time network emulation, and extends Mininet with tools to simulate typical CPS components such as programmable logic controllers, which use industrial protocols (Ethernet/IP, Modbus/TCP). In addition, MiniCPS defines a simple API to enable physical-layer interaction simulation. We demonstrate applications of MiniCPS in two example scenarios, and show how MiniCPS can be used to develop attacks and defenses that are directly applicable to real systems.
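    To make the Mininet foundation concrete, here is a minimal sketch of the kind of topology such a toolbox emulates: two hosts standing in for PLCs and one for an HMI on a single switch. It uses only the standard Mininet Python API; MiniCPS's own device classes and physical-layer API are not reproduced here, and the host names are illustrative.

        from mininet.net import Mininet
        from mininet.topo import Topo
        from mininet.cli import CLI

        class SmallICSTopo(Topo):
            """Two emulated PLCs and an HMI on one switch, as in a small ICS segment."""
            def build(self):
                switch = self.addSwitch("s1")
                for name in ("plc1", "plc2", "hmi"):
                    self.addHost(name)
                    self.addLink(name, switch)

        if __name__ == "__main__":
            net = Mininet(topo=SmallICSTopo())
            net.start()
            # Industrial-protocol servers (e.g. Modbus/TCP) would be launched
            # on the PLC hosts at this point; MiniCPS supplies tooling for that step.
            CLI(net)
            net.stop()

    Run as root with Mininet installed; the CLI then allows traffic between the emulated devices to be generated and inspected.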

    Comparison of different classification algorithms for fault detection and fault isolation in complex systems

    Owing to the lack of sufficient results in the literature, feature extraction and classification for hydraulic systems appear to be somewhat challenging. This paper compares the performance of three classifiers (namely the linear support vector machine (SVM), distance-weighted k-nearest neighbor (WKNN), and decision tree (DT)) using data from optimized and non-optimized sensor set solutions. The algorithms are trained with known data and then tested with unknown data for different scenarios characterizing faults of different degrees of severity. This investigation is based solely on a data-driven approach and relies on data sets taken from experiments on the fuel system. The system used throughout this study is a typical fuel delivery system consisting of standard components such as a filter, pump, valve, nozzle, pipes, and two tanks. Running representative tests on a fuel system is problematic because of the time, cost, and reproduction constraints involved in capturing any significant degradation: simulating significant degradation requires running over a considerable period, which cannot be reproduced quickly and is costly.
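    The comparison protocol (train on known data, test on unknown scenarios) is easy to sketch. The following is a minimal illustration using scikit-learn with synthetic data standing in for the extracted fuel-system features; the feature set, class labels, and hyperparameters are placeholders, not those of the study.

        import numpy as np
        from sklearn.datasets import make_classification
        from sklearn.model_selection import train_test_split
        from sklearn.svm import SVC
        from sklearn.neighbors import KNeighborsClassifier
        from sklearn.tree import DecisionTreeClassifier
        from sklearn.metrics import accuracy_score

        # Synthetic stand-in: classes represent fault scenarios of differing severity.
        X, y = make_classification(n_samples=1000, n_features=20, n_informative=10,
                                   n_classes=4, random_state=0)
        X_train, X_test, y_train, y_test = train_test_split(
            X, y, test_size=0.3, random_state=0)

        classifiers = {
            "linear SVM": SVC(kernel="linear"),
            "distance-weighted KNN": KNeighborsClassifier(n_neighbors=5,
                                                          weights="distance"),
            "decision tree": DecisionTreeClassifier(random_state=0),
        }

        # Train on known data, then evaluate on held-out (unknown) data.
        for name, clf in classifiers.items():
            clf.fit(X_train, y_train)
            acc = accuracy_score(y_test, clf.predict(X_test))
            print(f"{name}: accuracy = {acc:.3f}")

    The same loop can be repeated for the optimized and non-optimized sensor sets by swapping in the corresponding feature matrices.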

    The designer of the 90's: A live demonstration

    A survey of design tools to be used by the aircraft designer is given. Structural reliability, maintainability, cost and predictability, and acoustics expert systems are discussed, as well as scheduling, drawing, engineering systems, sizing functions, and standard parts and materials databases.