
    Correlative visualization techniques for multidimensional data

    Critical to the understanding of data is the ability to provide pictorial or visual representations of those data, particularly in support of correlative data analysis. Despite the advancement of visualization techniques for scientific data over the last several years, there are still significant problems in bringing today's hardware and software technology into the hands of the typical scientist. For example, computer science domains outside of computer graphics, such as data management, are required to make visualization effective. Well-defined, flexible mechanisms for data access and management must be combined with rendering algorithms, data transformations, and related components to form a generic visualization pipeline. A generalized approach to data visualization is critical for the correlative analysis of distinct, complex, multidimensional data sets in the space and Earth sciences. Different classes of data representation techniques must be used within such a framework, ranging from simple, static two- and three-dimensional line plots to animation, surface rendering, and volumetric imaging. Static examples of actual data analyses illustrate the importance of an effective pipeline in data visualization systems.
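    To make the pipeline idea concrete, the following is a minimal Python sketch, not taken from the paper, of how data-access, transformation, and rendering stages might be composed behind one generic interface; all class and function names here are illustrative assumptions.

        # Minimal sketch of a generic visualization pipeline: pluggable stages
        # for data access, transformation, and rendering behind one interface.
        from typing import Callable, List, Sequence

        class Pipeline:
            """Chains stages; each stage consumes the previous stage's output."""

            def __init__(self) -> None:
                self.stages: List[Callable] = []

            def add_stage(self, stage: Callable) -> "Pipeline":
                self.stages.append(stage)
                return self

            def run(self, source):
                data = source
                for stage in self.stages:
                    data = stage(data)
                return data

        def load(values: Sequence[float]) -> list:    # data access/management
            return list(values)

        def normalize(values: list) -> list:          # data transformation
            lo, hi = min(values), max(values)
            return [(v - lo) / (hi - lo) for v in values]

        def render_ascii(values: list) -> str:        # simplest static "line plot"
            return "\n".join("#" * int(1 + 20 * v) for v in values)

        pipeline = Pipeline().add_stage(load).add_stage(normalize).add_stage(render_ascii)
        print(pipeline.run([3.0, 7.5, 1.2, 9.8]))

    Richer representations (animation, surface rendering, volumetric imaging) would slot in as alternative rendering stages against the same interface.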

    Simultaneous Interconnection and Damping Assignment Passivity-Based Control: Two Practical Examples

    Passivity-based control (PBC) is a generic name given to a family of controller design techniques that achieve system stabilization via the route of passivation, that is, rendering the closed-loop system passive with a desired storage function (which usually qualifies as a Lyapunov function for the stability analysis). If the passivity property turns out to be output strict, with an output signal with respect to which the system is detectable, then asymptotic stability is ensured.
    Peer Reviewed
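    In conventional PBC notation (the symbols H_d, u, y, and delta below are standard textbook usage, not taken from this paper), the stability argument can be restated as:

        % H_d: desired storage function, u: input, y: output.
        \dot{H}_d(x) \le u^{\top} y
            \qquad \text{(closed loop rendered passive with storage } H_d\text{)}
        \dot{H}_d(x) \le u^{\top} y - \delta \lVert y \rVert^{2}, \quad \delta > 0
            \qquad \text{(output strict passivity)}
        % With u = 0 this gives \dot{H}_d \le -\delta \lVert y \rVert^2 \le 0, so
        % H_d serves as a Lyapunov function; detectability with respect to y then
        % upgrades stability to asymptotic stability via a LaSalle-type argument.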

    DeepGauge: Multi-Granularity Testing Criteria for Deep Learning Systems

    Deep learning (DL) defines a new data-driven programming paradigm that constructs the internal system logic of a crafted neural network from a set of training data. We have seen wide adoption of DL in many safety-critical scenarios. However, a plethora of studies have shown that state-of-the-art DL systems suffer from various vulnerabilities which can lead to severe consequences when applied to real-world applications. Currently, the testing adequacy of a DL system is usually measured by the accuracy of test data. Considering the limited availability of high-quality test data, good accuracy on test data can hardly provide confidence in the testing adequacy and generality of DL systems. Unlike traditional software systems, which have clear and controllable logic and functionality, the lack of interpretability in a DL system makes system analysis and defect detection difficult, which could potentially hinder real-world deployment. In this paper, we propose DeepGauge, a set of multi-granularity testing criteria for DL systems, which aims at rendering a multi-faceted portrayal of the testbed. The in-depth evaluation of our proposed testing criteria is demonstrated on two well-known datasets, five DL systems, and four state-of-the-art adversarial attack techniques against DL. The potential usefulness of DeepGauge sheds light on the construction of more generic and robust DL systems.
    Comment: The 33rd IEEE/ACM International Conference on Automated Software Engineering (ASE 2018)
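    As a hedged illustration of what a multi-granularity criterion can look like, here is a minimal Python sketch of k-multisection neuron coverage, one of the neuron-level criteria DeepGauge proposes; the flat 2-D activations layout is an assumption made for brevity.

        # k-multisection neuron coverage: each neuron's activation range,
        # profiled on training data, is split into k equal sections; coverage
        # is the fraction of (neuron, section) pairs hit by some test input.
        import numpy as np

        def k_multisection_coverage(train_acts: np.ndarray,
                                    test_acts: np.ndarray,
                                    k: int = 10) -> float:
            """Both arrays have shape (num_inputs, num_neurons)."""
            low = train_acts.min(axis=0)       # per-neuron lower bound
            high = train_acts.max(axis=0)      # per-neuron upper bound
            width = (high - low) / k
            covered = np.zeros((train_acts.shape[1], k), dtype=bool)
            for acts in test_acts:             # one test input at a time
                ok = (acts >= low) & (acts <= high) & (width > 0)
                section = np.clip(((acts - low) // np.where(width > 0, width, 1.0)
                                   ).astype(int), 0, k - 1)
                covered[np.arange(acts.size)[ok], section[ok]] = True
            return covered.sum() / covered.size

        # Toy usage: random activations stand in for a real network's outputs.
        rng = np.random.default_rng(0)
        train, test = rng.normal(size=(1000, 32)), rng.normal(size=(50, 32))
        print(f"k-multisection coverage: {k_multisection_coverage(train, test):.2%}")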

    Dynamic modelling of meat plant energy systems : a thesis presented in partial fulfilment of the requirements for the degree of Master of Technology at Massey University

    The objective of this study was to develop dynamic mathematical models of the major energy use and recovery operations within the New Zealand meat industry. Ordinary differential equation based models were developed for the five most common rendering systems; for hot water use, generation, and storage; and for the refrigeration system. Together these cover about 90% of process heat use and about two-thirds of electricity demand. Each model was constructed so that it could ultimately be linked to the others to form an integrated energy supply and demand model. Strong linkages to product flow were developed for the rendering models; those for hot water and refrigeration are less developed, although there is no technological impediment to extending them. In developing the rendering models it was assumed that cookers and dryers are perfectly mixed vessels and that time delays in materials transport are negligible. Model predictions could be improved by removing these assumptions, but given the likely extent of data uncertainties, the present accuracy may be adequate for the overall meat plant energy model. A major consequence of developing the hot water demand model was the identification of areas of low efficiency; attention to equipment designs for hand-tool sterilisers and cleanup systems could yield substantial heat savings. Although not tested, both the heat recovery model and the hot water storage and supply model are expected to be accurate, as few major assumptions were required in their development. The main novel feature of the refrigeration model is that it treats refrigeration applications in abstract terms rather than performing a room-by-room analysis. As a consequence, data demands are lower than for refrigeration models that use a room-based approach, and the data needed are more easily obtained. In spite of the lower data requirements, good accuracy was demonstrated. The models developed will be of major benefit to the NZ meat industry, initially as stand-alone entities and later as an integrated package to help reduce energy use.
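    The "perfectly mixed vessel" assumption behind the rendering-system models reduces each cooker or dryer to a single energy-balance ODE, because the outlet stream leaves at the bulk temperature. A minimal Python sketch follows; every parameter value is hypothetical, chosen only so the example runs, and is not taken from the thesis.

        # Energy balance for one perfectly mixed vessel at temperature T:
        # heat in from a steam jacket, heat carried away by the product stream.
        from scipy.integrate import solve_ivp

        M_CP = 5.0e5      # thermal mass of contents, J/K (assumed)
        UA = 2.0e3        # jacket heat-transfer coefficient x area, W/K (assumed)
        T_STEAM = 140.0   # steam temperature, deg C (assumed)
        F_CP = 8.0e2      # product mass flow x specific heat, W/K (assumed)
        T_IN = 20.0       # inlet product temperature, deg C (assumed)

        def energy_balance(t, T):
            """dT/dt: perfect mixing means the outlet leaves at the bulk T."""
            q_steam = UA * (T_STEAM - T[0])    # steam jacket input
            q_flow = F_CP * (T_IN - T[0])      # enthalpy of throughput
            return [(q_steam + q_flow) / M_CP]

        sol = solve_ivp(energy_balance, (0.0, 3600.0), [20.0], max_step=10.0)
        print(f"vessel temperature after 1 h: {sol.y[0, -1]:.1f} deg C")

    Removing the perfect-mixing assumption would replace this lumped state with a spatial model (or a cascade of such vessels), at the cost of the heavier data requirements the abstract notes.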