
    The Market Process and the Firm: Toward a Dynamic Property Rights Perspective

    We discuss the relations between alternative conceptualizations of the market process - neoclassical, Austrian, and radical subjectivist/evolutionary - and alternative approaches to economic organization, for example, nexus of contract theory, Williamsonian transaction cost economics, and the dynamic transaction cost approach of Langlois and Robertson. We argue that there is a distinct need to ground theories of economic organization more firmly in theories of the market process, and that key ideas of the more dynamic conceptualizations of the market are likely to substantially enrich the theory of economic organization. Keywords: the market process, the theory of the firm

    Integrating modes of policy analysis and strategic management practice: requisite elements and dilemmas

    There is a need to bring methods to bear on public problems that are inclusive, analytic, and quick. This paper describes the efforts of three pairs of academics working from three different though complementary theoretical foundations and intervention backgrounds (i.e., ways of working) who set out together to meet this challenge. Each of the three pairs had conducted dozens of interventions that client groups had regarded as successful or very successful in dealing with complex policy and strategic problems. One approach focused on leadership issues and stakeholders, another on negotiating competitive strategic intent with attention to stakeholder responses, and the third on analysis of feedback ramifications in developing policies. This paper describes the 10-year longitudinal research project designed to address the above challenge. The important outcomes are reported: the requisite elements of a general integrated approach and the enduring puzzles and tensions that arose from seeking to design a wide-ranging multi-method approach.

    The Future of X-ray Time Domain Surveys

    Modern X-ray observatories yield unique insight into the astrophysical time domain. Each X-ray photon can be assigned an arrival time, an energy, and a sky position, yielding sensitive, energy-dependent light curves and enabling time-resolved spectra down to millisecond time-scales. Combined with multiple views of the same patch of sky (e.g., in the Chandra and XMM-Newton deep fields) that extend variability studies over longer baselines, the spectral timing capacity of X-ray observatories stretches over 10 orders of magnitude at spatial resolutions of arcseconds, and 13 orders of magnitude at spatial resolutions of a degree. A wealth of high-energy time-domain data already exists, and indicates variability on timescales ranging from microseconds to years in a wide variety of objects, including numerous classes of AGN, high-energy phenomena at the Galactic centre, Galactic and extra-Galactic X-ray binaries, supernovae, gamma-ray bursts, stellar flares, tidal disruption flares, and as-yet unknown X-ray variables. This workshop explored the potential of strategic X-ray surveys to probe a broad range of astrophysical sources and phenomena. Here we present the highlights, with an emphasis on the science topics and mission designs that will drive future discovery in the X-ray time domain. Comment: 8 pages, 1 figure, Conference proceedings for IAU Symposium 285, "New Horizons in Time Domain Astronomy," Oxford, UK, Sep 19-23, 2011. To be published by IA
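
    To illustrate the event-list representation described in this abstract (each photon tagged with an arrival time, energy, and sky position), the following is a minimal sketch of binning such events into an energy-selected light curve; the function and its arguments are hypothetical, not taken from any mission pipeline.

        import numpy as np

        def light_curve(times, energies, bin_width, e_lo, e_hi):
            """Bin photon arrival times within an energy band into a count-rate light curve."""
            sel = (energies >= e_lo) & (energies < e_hi)   # select the energy band
            t = np.sort(times[sel])
            edges = np.arange(t[0], t[-1] + bin_width, bin_width)
            counts, _ = np.histogram(t, bins=edges)
            centers = 0.5 * (edges[:-1] + edges[1:])
            return centers, counts / bin_width             # bin centres, count rate

        # e.g. light_curve(evt_time, evt_energy, 100.0, 0.5, 2.0) for a 0.5-2 keV band in 100 s bins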

    Proposing the Use of Hazard Analysis for Machine Learning Data Sets

    There is no debating the importance of data for artificial intelligence. The behavior of data-driven machine learning models is determined by the data set, or as the old adage states: “garbage in, garbage out (GIGO).” While the machine learning community is still debating which techniques are necessary and sufficient to assess the adequacy of data sets, it agrees that some techniques are necessary. In general, most of the techniques being considered focus on evaluating the volumes of attributes: attributes are evaluated with respect to anticipated counts, without considering the safety concerns associated with those attributes. This paper explores those techniques to identify instances of too little data and incorrect attributes. Those techniques are important; however, for safety-critical applications, the assurance analyst also needs to understand the safety impact of specific attributes being absent from the machine learning data sets. To provide that information, this paper proposes a new technique the authors call data hazard analysis. Data hazard analysis provides an approach to qualitatively analyze the training data set and reduce the risk associated with GIGO.
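
    To make the attribute-count style of check concrete, the sketch below compares observed attribute values in a training set against anticipated counts and flags gaps; it is a simplified illustration under assumed data structures, not the data hazard analysis procedure proposed in the paper.

        from collections import Counter

        def attribute_adequacy(records, expected_counts):
            """Flag under-represented attribute values and values that were never anticipated."""
            findings = []
            for attr, expected in expected_counts.items():
                observed = Counter(r.get(attr) for r in records)
                for value, wanted in expected.items():
                    have = observed.get(value, 0)
                    if have < wanted:
                        findings.append((attr, value, f"only {have} of {wanted} expected examples"))
                for value in observed:
                    if value not in expected:
                        findings.append((attr, value, "unanticipated value"))
            return findings

        # e.g. attribute_adequacy(samples, {"weather": {"clear": 500, "rain": 500, "fog": 200}})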

    XRound: A reversible template language and its application in model-based security analysis

    Successful analysis of the models used in Model-Driven Development requires the ability to synthesise the results of analysis and automatically integrate these results with the models themselves. This paper presents a reversible template language called XRound which supports round-trip transformations between models and the logic used to encode system properties. A template processor that supports the language is described, and the use of the template language is illustrated by its application in an analysis workbench designed to support analysis of security properties of UML and MOF-based models. As a result of using reversible templates, it is possible to seamlessly and automatically integrate the results of a security analysis with a model.
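
    To convey the round-trip idea in miniature, here is a toy bidirectional template in Python: the same template string both renders model data to text and parses that text back into model data. This is only a conceptual sketch and does not reflect XRound's actual syntax or its UML/MOF tooling.

        import re

        class ReversibleTemplate:
            """Toy two-way template: render a model to text, or parse the text back."""

            def __init__(self, template):
                self.template = template
                # Escape the literal parts, then turn each {name} into a named regex group.
                pattern = re.sub(r"\\\{(\w+)\\\}", r"(?P<\1>.+?)", re.escape(template))
                self.pattern = re.compile(pattern + r"$")

            def render(self, model):
                return self.template.format(**model)

            def parse(self, text):
                match = self.pattern.match(text)
                return match.groupdict() if match else None

        # t = ReversibleTemplate("assert secrecy of {attribute} in {classifier}")
        # t.render({"attribute": "pin", "classifier": "Account"})  -> rendered text
        # t.parse("assert secrecy of pin in Account")              -> back to a dict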

    Fraud Prevention and Detection System in Nigeria Banking Industries

    Fraud is on the rise as a result of the advent of modern technology and the global superhighways of banking transactions, resulting in billions of dollars in losses worldwide each year. Although fraud prevention technologies are the most effective method of combating fraud, fraudsters are flexible and will usually find a way around them over time. We need fraud detection approaches if we are to catch fraudsters after fraud prevention has failed. Statistics and machine learning are effective fraud detection technologies that have been used to detect money laundering, e-commerce credit card fraud, telecommunications fraud, and computer intrusion, to name a few. The program is simple to use, and anyone with permission can use it. The importance of computer technology has expanded as it has advanced into all areas of human endeavor. Keywords: Fraud Detection, Fraud Prevention, Banking Industries, Telecommunications
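
    As a minimal illustration of the statistical side of fraud detection mentioned above, the sketch below flags a transaction whose amount deviates sharply from an account's history using a z-score threshold; it is a generic outlier check for illustration, not the detection system described in the paper.

        import statistics

        def flag_suspicious(history, new_amount, threshold=3.0):
            """Return True if the new amount sits far outside the account's history."""
            mean = statistics.mean(history)
            spread = statistics.pstdev(history) or 1.0  # guard against zero spread
            return abs(new_amount - mean) / spread > threshold

        # e.g. flag_suspicious([120, 80, 95, 110, 130], 5000) -> True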

    Process of designing robust, dependable, safe and secure software for medical devices: Point of care testing device as a case study

    This paper presents a holistic methodology for the design of medical device software, which encompasses a new way of eliciting requirements, a system design process, a security design guideline, cloud architecture design, a combinatorial testing process, and agile project management. The paper uses point of care diagnostics as a case study where the software and hardware must be robust and reliable in order to provide accurate diagnosis of diseases. As software and software-intensive systems become increasingly complex, the impact of failures can lead to significant property damage or damage to the environment. Within the medical diagnostic device software domain such failures can result in misdiagnosis leading to clinical complications and in some cases death. Software faults can arise due to the interaction among the software, the hardware, third-party software and the operating environment. Unanticipated environmental changes and latent coding errors lead to operation faults despite the fact that a significant effort has usually been expended on the design, verification and validation of the software system. It is becoming increasingly apparent that one needs to adopt different approaches, which will guarantee that a complex software system meets all safety, security, and reliability requirements, in addition to complying with standards such as IEC 62304. There are many initiatives taken to develop safety- and security-critical systems, at different development phases and in different contexts, ranging from infrastructure design to device design. Different approaches are implemented to design error-free software for safety-critical systems. By adopting the strategies and processes presented in this paper one can overcome the challenges in developing error-free software for medical devices (or safety-critical systems).
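
    The combinatorial testing process mentioned in the abstract can be illustrated with a small greedy pairwise generator: given device configuration parameters, it builds a reduced test set that still covers every pair of parameter values. This is a generic sketch of pairwise testing with made-up parameters, not the process or tooling used in the paper.

        from itertools import combinations, product

        def pairwise_tests(params):
            """Greedy (non-optimal) construction of a pairwise covering test set."""
            names = list(params)
            # Every pair of values, for every pair of parameters, that must be exercised.
            uncovered = {
                (a, b): set(product(params[a], params[b]))
                for a, b in combinations(names, 2)
            }
            tests = []
            while any(uncovered.values()):
                best, best_gain = None, -1
                # Pick the full assignment that covers the most still-uncovered pairs.
                for values in product(*(params[n] for n in names)):
                    case = dict(zip(names, values))
                    gain = sum(
                        1 for (a, b), pairs in uncovered.items()
                        if (case[a], case[b]) in pairs
                    )
                    if gain > best_gain:
                        best, best_gain = case, gain
                tests.append(best)
                for (a, b), pairs in uncovered.items():
                    pairs.discard((best[a], best[b]))
            return tests

        # e.g. pairwise_tests({"assay": ["glucose", "lactate"],
        #                      "connectivity": ["usb", "bluetooth", "cloud"],
        #                      "power": ["battery", "mains"]})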

    Statistical Tools for the Rapid Development & Evaluation of High-Reliability Products

    Today’s manufacturers face increasingly intense global competition. To remain profitable, they are challenged to design, develop, test, and manufacture high-reliability products in ever-shorter product-cycle times and, at the same time, remain within stringent cost constraints. Design, manufacturing, and reliability engineers have developed an impressive array of tools for producing reliable products. These tools will continue to be important. However, due to changes in the way that new product concepts are being developed and brought to market, there is a need for changes in the usual methods used for design-for-reliability and for reliability testing, assessment, and improvement programs. This tutorial uses a conceptual degradation-based reliability model to describe the role of, and need for, integration of reliability data sources. These sources include accelerated degradation testing, accelerated life testing (for materials and components), accelerated multifactor robust-design experiments and over-stress prototype testing (for subsystems and systems), and the use of field data (especially early-production data) to produce a robust, high-reliability product and to provide a process for continuing improvement of the reliability of existing and future products. Manufacturers need to develop economical and timely methods of obtaining, at each step of the product design and development process, the information needed to meet overall reliability goals. We emphasize the need for intensive, effective upstream testing of product materials, components, and design concepts.
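
    As one concrete example of the accelerated life testing this tutorial discusses, the sketch below computes an Arrhenius acceleration factor for a thermally accelerated test; the Arrhenius model and the activation energy value are standard illustrative choices, not parameters taken from the tutorial.

        import math

        BOLTZMANN_EV_PER_K = 8.617e-5  # Boltzmann constant in eV/K

        def arrhenius_acceleration_factor(activation_energy_ev, use_temp_c, stress_temp_c):
            """How much faster a thermally driven failure mechanism runs at the stress temperature."""
            t_use = use_temp_c + 273.15       # convert to kelvin
            t_stress = stress_temp_c + 273.15
            return math.exp(
                (activation_energy_ev / BOLTZMANN_EV_PER_K) * (1.0 / t_use - 1.0 / t_stress)
            )

        # e.g. arrhenius_acceleration_factor(0.7, 40.0, 125.0) -> about 250x acceleration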
