
    Requirement verification in simulation-based automation testing

    The emergence of the Industrial Internet results in an increasing number of complicated temporal interdependencies between automation systems and the processes to be controlled. There is a need for verification methods that scale better than formal verification methods and which are more exact than testing. Simulation-based runtime verification is proposed as such a method, and an application of metric temporal logic is presented as a contribution. The practical scalability of the proposed approach is validated against a production process designed by an industrial partner, resulting in the discovery of requirement violations. Comment: 4 pages, 2 figures. Added IEEE copyright notice.
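The abstract does not give the paper's formulas, but the core idea of checking a metric-temporal-logic property over a simulation trace can be sketched. The following is an illustrative check of a bounded-response property G(request → F[0,b] grant); the trace format, event names, and bound are assumptions, not taken from the paper.

```python
# Illustrative sketch (not the paper's implementation): verify the MTL
# bounded-response property "every request is answered by a grant within
# `bound` time units" over a recorded, time-stamped simulation trace.

def holds_bounded_response(trace, trigger, response, bound):
    """trace: list of (timestamp, event) pairs sorted by timestamp."""
    for t, ev in trace:
        if ev == trigger:
            # the response must occur somewhere in the window [t, t + bound]
            if not any(ev2 == response and t <= t2 <= t + bound
                       for t2, ev2 in trace):
                return False  # requirement violation found
    return True

trace = [(0.0, "request"), (3.2, "grant"), (10.0, "request")]
print(holds_bounded_response(trace, "request", "grant", 5.0))  # False: second request unanswered
```

A runtime verifier in this style only needs the simulator to emit a timed event log, which is what makes the approach scale better than exhaustive formal verification.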

    Modelling Probabilistic Wireless Networks

    We propose a process calculus to model high-level wireless systems, where the topology of a network is described by a digraph. The calculus enjoys features characteristic of wireless networks, namely broadcast communication and probabilistic behaviour. We first focus on the problem of composing wireless networks, then we present a compositional theory based on a probabilistic generalisation of the well-known may-testing and must-testing preorders. Also, we define an extensional semantics for our calculus, which will be used to define both simulation and deadlock simulation preorders for wireless networks. We prove that our simulation preorder is sound with respect to the may-testing preorder; similarly, the deadlock simulation preorder is sound with respect to the must-testing preorder, for a large class of networks. We also provide a counterexample showing that completeness of the simulation preorder, with respect to the may-testing one, does not hold. We conclude the paper with an application of our theory to probabilistic routing protocols.
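The two ingredients the abstract names, broadcast over a digraph and probabilistic behaviour, can be illustrated with a minimal executable model. Everything below (the dictionary encoding of the digraph, the per-edge delivery probabilities) is an assumption for illustration and not the paper's calculus.

```python
# Illustrative sketch: one probabilistic broadcast step in a wireless
# network modelled as a digraph. Edges carry delivery probabilities;
# a broadcast reaches each out-neighbour independently.

import random

def broadcast(graph, sender, rng):
    """graph: {node: {neighbour: delivery_probability}}.
    Returns the set of neighbours that receive the message."""
    return {n for n, p in graph.get(sender, {}).items() if rng.random() < p}

graph = {"a": {"b": 0.9, "c": 0.5}, "b": {"c": 1.0}}
rng = random.Random(42)
received = broadcast(graph, "a", rng)  # some subset of {"b", "c"}
```

In the calculus, preorders relate networks by the observable behaviours such steps can produce; this toy step function only shows why the semantics must be both broadcast-based and probabilistic.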

    Validation of highly reliable, real-time knowledge-based systems

    Knowledge-based systems have the potential to greatly increase the capabilities of future aircraft and spacecraft and to significantly reduce support manpower needed for the space station and other space missions. However, a credible validation methodology must be developed before knowledge-based systems can be used for life- or mission-critical applications. Experience with conventional software has shown that the use of good software engineering techniques and static analysis tools can greatly reduce the time needed for testing and simulation of a system. Since exhaustive testing is infeasible, reliability must be built into the software during the design and implementation phases. Unfortunately, many of the software engineering techniques and tools used for conventional software are of little use in the development of knowledge-based systems. Therefore, research at Langley is focused on developing a set of guidelines, methods, and prototype validation tools for building highly reliable, knowledge-based systems. The use of a comprehensive methodology for building highly reliable, knowledge-based systems should significantly decrease the time needed for testing and simulation. A proven record of delivering reliable systems at the beginning of the highly visible testing and simulation phases is crucial to the acceptance of knowledge-based systems in critical applications.

    Simulation-based stress testing of banks’ regulatory capital adequacy

    Banks’ holding of reasonable capital buffers in excess of minimum requirements could alleviate the procyclicality problem potentially exacerbated by the rating-sensitive capital charges of Basel II. Determining the required buffer size is an important risk management issue for banks, which the Basle Committee (2002) suggests should be approached via stress testing. We present here a simulation-based approach to stress testing of capital adequacy where rating transitions are conditioned on business-cycle phase and business-cycle dynamics are taken into account. Our approach is an extension of the standard credit portfolio analysis in that we simulate actual bank capital and minimum capital requirements simultaneously. Actual bank capital (absent mark-to-market accounting) is driven by bank income and default losses, whereas capital requirements within the Basel II framework are driven by rating transitions. The joint dynamics of these determine the necessary capital buffers, given bank management’s specified confidence level for capital adequacy. We provide a tentative calibration of this confidence level to data on actual bank capital ratios, which enables a ceteris-paribus extrapolation of bank capital under the current regime to bank capital under Basel II.
    Keywords: Basel II; Pillar 2; bank capital; stress tests; procyclicality
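The joint simulation the abstract describes, actual capital driven by income and default losses while the minimum requirement is driven by rating transitions, can be sketched in a few lines. The rating scale, transition probabilities, capital charges, and loss-given-default below are invented for illustration; the paper's calibration is not reproduced here.

```python
# Illustrative one-year step of the joint simulation: rating transitions
# conditioned on the business-cycle phase drive both default losses
# (hitting actual capital) and the Basel II-style minimum requirement.
# All numbers are made up for the sketch.

import random

RATINGS = ["A", "B", "D"]  # D = default
TRANS = {  # transition probabilities, conditioned on cycle phase
    "expansion": {"A": [0.95, 0.04, 0.01], "B": [0.10, 0.85, 0.05]},
    "recession": {"A": [0.85, 0.12, 0.03], "B": [0.05, 0.80, 0.15]},
}
CHARGE = {"A": 0.04, "B": 0.08}  # capital charge per unit exposure
LGD = 0.45                        # loss given default, per unit exposure

def simulate_year(portfolio, phase, capital, income, rng):
    """portfolio: {rating: number of unit exposures}.
    Returns (new portfolio, actual capital, minimum requirement)."""
    new = {r: 0 for r in RATINGS}
    losses = 0.0
    for rating, count in portfolio.items():
        for _ in range(int(count)):
            new_r = rng.choices(RATINGS, TRANS[phase][rating])[0]
            new[new_r] += 1
            if new_r == "D":
                losses += LGD
    capital = capital + income - losses
    requirement = sum(CHARGE[r] * n for r, n in new.items() if r != "D")
    return new, capital, requirement
```

Repeating this step over simulated business-cycle paths, and recording how often `capital` falls below `requirement`, is what lets one size the buffer to a chosen confidence level.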

    Cloud based testing of business applications and web services

    This paper deals with testing of applications based on the principles of cloud computing. It aims to describe options for testing business software in clouds (cloud testing). It identifies the needs for cloud testing tools, including multi-layer testing, service level agreement (SLA) based testing, large-scale simulation, and on-demand test environments. In a cloud-based model, ICT services are distributed and accessed over networks such as an intranet or the internet; large data centers deliver resources on demand as a service, eliminating the need for investments in specific hardware, software, or data center infrastructure. Businesses can apply these new technologies in the context of intellectual capital management to lower costs and increase competitiveness and earnings. Based on a comparison of testing tools and techniques, the paper further investigates future trends in cloud-based testing tool research and development. It is also worth noting that this comparison and classification of testing tools describes a new area and has not been done before.
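Of the needs the abstract lists, SLA-based testing is the most concrete, and its essence fits in a short sketch: assert that a measured latency distribution meets an agreed percentile bound. The percentile method, threshold, and sample data below are assumptions for illustration, not from the paper.

```python
# Illustrative SLA-based test: check that the 95th-percentile response
# time of a service stays under an agreed bound. Data and limit are
# invented for the sketch (nearest-rank percentile).

def percentile(samples, q):
    s = sorted(samples)
    idx = min(len(s) - 1, int(round(q * (len(s) - 1))))
    return s[idx]

def sla_met(latencies_ms, p95_limit_ms=200.0):
    return percentile(latencies_ms, 0.95) <= p95_limit_ms

print(sla_met([120, 90, 340, 150, 110, 95, 180, 130, 105, 160]))  # False: the 340 ms outlier breaks the bound
```

In a cloud setting, the same assertion would run against latencies gathered from an on-demand, large-scale load simulation rather than a fixed in-house test rig.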

    Non-Nested Models and the Likelihood Ratio Statistic: A Comparison of Simulation and Bootstrap Based Tests

    We consider an alternative use of simulation in the context of using the Likelihood-Ratio statistic to test non-nested models. To date, simulation has been used to estimate the Kullback-Leibler measure of closeness between two densities, which in turn 'mean adjusts' the Likelihood-Ratio statistic. Given that this adjustment is still based upon asymptotic arguments, an alternative procedure is to utilise bootstrap procedures to construct the empirical density. To our knowledge this study represents the first comparison of the properties of bootstrap and simulation-based tests applied to non-nested tests. More specifically, the design of experiments allows us to comment on the relative performance of these two testing frameworks across models with varying degrees of nonlinearity. In this respect, although the primary focus of the paper is upon the relative evaluation of simulation and bootstrap-based non-nested procedures in testing across a class of nonlinear threshold models, the inclusion of a similar analysis of the more standard linear/log-linear models provides a point of comparison.
    Keywords: Non-nested tests, Simulation-based inference, Bootstrap tests, Nonlinear threshold models
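The bootstrap alternative the abstract describes, building an empirical null distribution of the LR statistic instead of relying on an asymptotic mean adjustment, can be sketched with a deliberately simple non-nested pair (a normal versus a Laplace location-scale fit). The model pair, estimators, and resample count are assumptions for illustration, not the paper's threshold models.

```python
# Illustrative parametric-bootstrap p-value for a likelihood-ratio
# statistic comparing two NON-NESTED models. The models (normal vs.
# Laplace) are toy choices standing in for the paper's threshold models.

import math
import random
import statistics

def loglik_normal(x, mu, sigma):
    return sum(-0.5 * math.log(2 * math.pi * sigma ** 2)
               - (xi - mu) ** 2 / (2 * sigma ** 2) for xi in x)

def loglik_laplace(x, mu, b):
    return sum(-math.log(2 * b) - abs(xi - mu) / b for xi in x)

def lr_stat(x):
    """LR = fitted normal log-likelihood minus fitted Laplace log-likelihood."""
    mu = statistics.fmean(x)
    sigma = statistics.pstdev(x) or 1e-9
    b = statistics.fmean(abs(xi - mu) for xi in x) or 1e-9
    return loglik_normal(x, mu, sigma) - loglik_laplace(x, mu, b)

def bootstrap_pvalue(x, n_boot=200, rng=None):
    """Resample under the null (normal) model; p-value is the share of
    bootstrap LR statistics at least as unfavourable as the observed one."""
    rng = rng or random.Random(0)
    mu, sigma = statistics.fmean(x), statistics.pstdev(x)
    observed = lr_stat(x)
    exceed = sum(lr_stat([rng.gauss(mu, sigma) for _ in x]) <= observed
                 for _ in range(n_boot))
    return exceed / n_boot
```

The contrast with the mean-adjustment approach is that no asymptotic argument enters: the null distribution of the statistic is built directly from resamples of the fitted null model.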

    Oersted Medal Lecture 2007: Interactive simulations for teaching physics: What works, what doesn't, and why

    We give an overview of the Physics Educational Technology (PhET) project to research and develop web-based interactive simulations for teaching and learning physics. The design philosophy, simulation development and testing process, and range of available simulations are described. The highlights of PhET research on simulation design and effectiveness in a variety of educational settings are provided. This work has shown that a well-designed interactive simulation can be an engaging and effective tool for learning physics.