
    Invasive Species Forecasting System: A Decision Support Tool for the U.S. Geological Survey: FY 2005 Benchmarking Report v.1.6

    The National Institute of Invasive Species Science (NIISS), through collaboration with NASA's Goddard Space Flight Center (GSFC), recently began incorporating NASA observations and predictive modeling tools to fulfill its mission. These enhancements, collectively labeled the Invasive Species Forecasting System (ISFS), are now in place at the NIISS in their initial state (V1.0). The ISFS is the primary decision support tool of the NIISS for the management and control of invasive species on Department of the Interior and adjacent lands. The ISFS is the backbone of a unique information-services line of business for the NIISS, and it provides the means for delivering advanced decision support capabilities to a wide range of management applications. This report describes the operational characteristics of the ISFS, a decision support tool of the United States Geological Survey (USGS). Recent enhancements to the performance of the ISFS, attained through the integration of observations, models, and systems engineering from NASA, are benchmarked; i.e., described quantitatively and evaluated relative to the performance of the USGS system before incorporation of the NASA enhancements. This report benchmarks Version 1.0 of the ISFS.

    The robust optimization of non-linear requirements models

    Solutions to non-linear requirements engineering problems may be brittle; i.e., small changes may dramatically alter solution effectiveness. Hence, it is not enough to just generate solutions to requirements problems; we must also assess solution robustness. This thesis aims to address two concerns: (a) Is demonstrating robustness a time-consuming task? and (b) Must solution quality be traded off against solution robustness? Using a Bayesian ranking heuristic, the KEYS2 algorithm fixes a small number of important variables, rapidly pushing the search into a stable, optimal plateau. By design, KEYS2 generates decision ordering diagrams (in time experimentally shown to be O(N^2)). Once generated, these diagrams can confirm solution robustness in linear time. When assessed in terms of reducing inference times, increasing solution quality, and decreasing the variance of the generated solutions, KEYS2 outperforms other search algorithms (simulated annealing, A*, MaxWalkSat).
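
    As a rough illustration of the Bayesian-ranking idea behind KEYS2, the sketch below samples candidate solutions, splits them into "best" and "rest", scores each variable setting by how strongly it is associated with the best samples, and greedily fixes one setting per round. The variables/evaluate interface, the 10% cut, and the scoring formula are assumptions made for illustration, not the published algorithm.

```python
import random

def keys_style_search(variables, evaluate, samples=100, seed=0):
    """Illustrative KEYS-style search: sample candidates, rank variable
    settings best-versus-rest, and greedily fix the top-ranked setting."""
    rng = random.Random(seed)
    fixed = {}                                    # settings decided so far, in decision order
    while len(fixed) < len(variables):
        free = [v for v in variables if v not in fixed]
        pool = []
        for _ in range(samples):
            cand = dict(fixed)
            cand.update({v: rng.choice(variables[v]) for v in free})
            pool.append((evaluate(cand), cand))
        pool.sort(key=lambda sc: sc[0])           # assume lower evaluate() scores are better
        cut = max(len(pool) // 10, 1)             # top 10% are the "best" examples
        best, rest = pool[:cut], pool[cut:]
        scored = []
        for v in free:
            for val in variables[v]:
                b = sum(1 for _, c in best if c[v] == val) / len(best)
                r = sum(1 for _, c in rest if c[v] == val) / max(len(rest), 1)
                scored.append((b * b / (b + r + 1e-9), v, val))   # best-vs-rest support
        _, v, val = max(scored, key=lambda t: t[0])
        fixed[v] = val                            # the fixing order sketches a decision ordering
    return fixed

# Hypothetical usage: three binary decisions scored by a toy cost function.
space = {"x": [0, 1], "y": [0, 1], "z": [0, 1]}
print(keys_style_search(space, lambda c: (c["x"] - 1) ** 2 + c["y"] + 2 * c["z"]))
```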

    Deception in Game Theory: A Survey and Multiobjective Model

    Game theory is the study of mathematical models of conflict. It provides tools for analyzing dynamic interactions between multiple agents and, in some cases, across repeated interactions. This thesis contains two scholarly articles. The first article is a survey of game-theoretic models of deception. The survey describes the ways researchers use game theory to measure the practicality of deception, model the mechanisms for performing deception, analyze the outcomes of deception, and respond to, or mitigate the effects of, deception. The survey highlights several gaps in the literature. One important gap concerns the benefit-cost-risk trade-off made during deception planning. To address this research gap, the second article introduces a novel approach for modeling these trade-offs. The approach uses a game-theoretic model of deception to define a new multiobjective optimization problem called the deception design problem (DDP). Solutions to the DDP provide courses of deceptive action that are efficient in terms of their benefit, cost, and risk to the deceiver. A case study based on the output of an air-to-air combat simulator demonstrates the DDP in a 7 x 7 normal-form game. This approach is the first to evaluate benefit, cost, and risk in a single game-theoretic model of deception.
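
    The abstract does not spell out the DDP's formulation, but the benefit-cost-risk trade-off it describes can be made concrete with a simple Pareto filter over candidate deceptive courses of action. The action names, scores, and dominance rule below are hypothetical illustrations, not data from the thesis.

```python
from typing import Dict, Tuple

def pareto_efficient(actions: Dict[str, Tuple[float, float, float]]):
    """Return the courses of action no other action dominates, where each is
    scored as (benefit, cost, risk): maximize benefit, minimize cost and risk."""
    def dominates(a, b):
        (ba, ca, ra), (bb, cb, rb) = actions[a], actions[b]
        return ba >= bb and ca <= cb and ra <= rb and (ba > bb or ca < cb or ra < rb)
    return {a for a in actions
            if not any(dominates(b, a) for b in actions if b != a)}

# Hypothetical (benefit, cost, risk) scores for deceptive options in a normal-form game.
courses = {"feint":        (0.8, 0.3, 0.5),
           "decoy":        (0.6, 0.1, 0.2),
           "no_deception": (0.2, 0.0, 0.0)}
print(pareto_efficient(courses))   # all three are efficient for these scores
```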

    Requirements Prioritization Based on Benefit and Cost Prediction: An Agenda for Future Research

    In early phases of the software life cycle, requirements prioritization necessarily relies on the specified requirements and on predictions of the benefit and cost of individual requirements. This paper presents the results of a systematic literature review investigating how existing methods approach the problem of requirements prioritization based on benefit and cost. From this review, it derives a set of under-researched issues that warrant future effort and sketches an agenda for future research in this area.
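
    As a minimal baseline for what prioritization based on benefit and cost looks like in practice (the paper itself surveys methods rather than prescribing one), the sketch below ranks requirements by their predicted benefit-to-cost ratio. The field names and the 1-9 estimation scale are assumptions for illustration.

```python
def prioritize(requirements):
    """Rank requirements by predicted benefit-to-cost ratio (highest first)."""
    return sorted(requirements,
                  key=lambda r: r["benefit"] / max(r["cost"], 1e-9),
                  reverse=True)

# Hypothetical early-phase predictions on a 1-9 scale.
reqs = [{"id": "R1", "benefit": 8, "cost": 3},
        {"id": "R2", "benefit": 5, "cost": 5},
        {"id": "R3", "benefit": 9, "cost": 8}]
print([r["id"] for r in prioritize(reqs)])   # ['R1', 'R3', 'R2']
```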

    Software for Probabilistic Risk Reduction

    A computer program implements a methodology, denoted probabilistic risk reduction, that is intended to aid in planning the development of complex software and/or hardware systems. This methodology integrates two complementary prior methodologies: (1) probabilistic risk assessment and (2) a risk-based planning methodology, implemented in a prior computer program known as Defect Detection and Prevention (DDP), in which multiple requirements and the beneficial effects of risk-mitigation actions are taken into account. The present methodology and software accommodate both process knowledge (notably, of the efficacy of development practices) and product knowledge (notably, of the logical structure of the system whose development one seeks to plan). Estimates of the costs and benefits of a planned development can be derived. Functional and non-functional aspects of software can be taken into account, and trades can be made among them. It becomes possible to optimize the planning process in the sense of selecting the suite of process steps and design choices that maximizes the expectation of success while remaining within budget.
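
    As a worked illustration of the kind of arithmetic such a tool performs, the sketch below scores a selected set of mitigations by the expected attainment of weighted requirements, with each mitigation multiplicatively reducing the likelihood of the risks it addresses. The data layout and the reduction rule are assumptions made for this example, not the actual DDP or probabilistic-risk-reduction model.

```python
def expected_attainment(req_weights, risk_impacts, risk_likelihood, selected, effects):
    """Expected weighted requirement attainment for a chosen set of mitigations
    (illustrative model: mitigations scale down risk likelihoods; residual risks
    subtract value from the requirements they impact)."""
    residual = {}
    for risk, p in risk_likelihood.items():
        for m in selected:
            p *= 1.0 - effects.get((m, risk), 0.0)   # each mitigation removes a fraction of the risk
        residual[risk] = p
    total = 0.0
    for req, weight in req_weights.items():
        loss = sum(residual[r] * risk_impacts.get((r, req), 0.0) for r in residual)
        total += weight * max(0.0, 1.0 - loss)
    return total

# Toy example: one requirement, two risks, one mitigation that halves risk "r1".
score = expected_attainment(
    req_weights={"req_a": 10.0},
    risk_impacts={("r1", "req_a"): 0.6, ("r2", "req_a"): 0.2},
    risk_likelihood={"r1": 0.5, "r2": 0.3},
    selected=["m1"],
    effects={("m1", "r1"): 0.5})
print(round(score, 3))   # 10 * (1 - (0.25*0.6 + 0.3*0.2)) = 7.9
```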

    The TechSat 21 Autonomous Sciencecraft Experiment

    Software has been developed to perform a number of functions essential to autonomous operation in the Autonomous Sciencecraft Experiment (ASE), which is scheduled to be demonstrated aboard a constellation of three spacecraft, denoted TechSat 21, to be launched by the Air Force into orbit around the Earth in January 2006. A prior version of this software was reported in "Software for an Autonomous Constellation of Satellites" (NPO-30355), NASA Tech Briefs, Vol. 26, No. 11 (November 2002), page 44. The software includes the following components: algorithms that analyze image data, generate scientific data products, and detect conditions, features, and events of potential scientific interest; a program that uses component-based computational models of hardware to analyze anomalous situations and to generate novel command sequences, including (when possible) commands to repair components diagnosed as faulty; a robust-execution-management component that uses the Spacecraft Command Language (SCL) software to enable event-driven processing and low-level autonomy; and the Continuous Activity Scheduling, Planning, Execution, and Replanning (CASPER) program for replanning activities, including downlink sessions, on the basis of scientific observations performed during previous orbit cycles.
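
    The sketch below is only a schematic of the control loop these components imply: onboard analysis detects an event of interest, which triggers replanning of the remaining activities. The function names and toy data are placeholders, not the flight software's actual interfaces.

```python
import random

def onboard_autonomy_cycle(take_image, detect_event, replan, execute):
    """Schematic sense-detect-replan loop: execute planned activities, analyze
    new observations, and replan when something interesting is detected."""
    plan = ["observe", "downlink"]
    while plan:
        activity = plan.pop(0)
        execute(activity)
        if activity == "observe":
            image = take_image()
            if detect_event(image):               # e.g., onboard change detection
                plan = replan(plan, priority="new_science_event")
    return "cycle complete"

# Toy stand-ins for the onboard components.
print(onboard_autonomy_cycle(
    take_image=lambda: {"change_fraction": random.random()},
    detect_event=lambda img: img["change_fraction"] > 0.8,
    replan=lambda plan, priority: ["observe_target", "downlink"],
    execute=lambda activity: print("executing", activity)))
```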

    Optimizing the Design of Spacecraft Systems Using Risk as Currency

    Treating risk as a "currency" has proven to be key to systematically optimizing the design of spacecraft systems. This idea has been applied in the design of individual components of spacecraft systems and in the end-to-end design of such systems. The process, called "Defect Detection and Prevention" (DDP), its tool support, and its applications have been described previously. We are now extending this process to include consideration of architectural alternatives, qualification of components, fabrication and assembly, integration and test, and mission operation. The results of applying this extended process in the pre-formulation, formulation, and implementation phases of various NASA and other government agency missions are discussed. This paper also discusses the results of developing optimized technology development and qualification plans.

    Software for Optimizing Quality Assurance of Other Software

    Software assurance is the planned and systematic set of activities that ensures that software processes and products conform to requirements, standards, and procedures. Examples of such activities include code inspections, unit tests, design reviews, performance analyses, and the construction of traceability matrices. In practice, software development projects have only limited resources (e.g., schedule, budget, and availability of personnel) to cover the entire development effort, of which assurance is but a part. Projects must therefore select judiciously from among the possible assurance activities. At its heart, this can be viewed as an optimization problem: determine the allocation of limited resources (time, money, and personnel) that minimizes risk or, alternatively, that minimizes the resources needed to reduce risk to an acceptable level. The end result of the work reported here is a means to optimize the quality-assurance processes used in developing software.
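
    Read as an optimization problem, this amounts to choosing the subset of assurance activities that buys the most risk reduction within a resource cap. The brute-force sketch below makes that concrete; the activity catalog, additive risk-reduction scores, and exhaustive search are illustrative assumptions, not the tool's actual models or search method.

```python
from itertools import combinations

def best_assurance_plan(activities, budget):
    """Pick the subset of assurance activities that maximizes (assumed additive)
    risk reduction without exceeding the budget, by exhaustive search."""
    best, best_reduction = (), 0.0
    for k in range(len(activities) + 1):
        for subset in combinations(activities, k):
            cost = sum(a["cost"] for a in subset)
            reduction = sum(a["risk_reduction"] for a in subset)
            if cost <= budget and reduction > best_reduction:
                best, best_reduction = subset, reduction
    return [a["name"] for a in best], best_reduction

# Hypothetical catalog: costs in staff-days, risk-reduction scores on a 0-1 scale.
catalog = [{"name": "code inspection",      "cost": 5, "risk_reduction": 0.30},
           {"name": "unit tests",           "cost": 8, "risk_reduction": 0.40},
           {"name": "design review",        "cost": 3, "risk_reduction": 0.15},
           {"name": "performance analysis", "cost": 6, "risk_reduction": 0.20}]
print(best_assurance_plan(catalog, budget=12))   # -> (['unit tests', 'design review'], ~0.55)
```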

    Software for Analyzing Laminar-to-Turbulent Flow Transitions
