
    Developing a global risk engine

    Risk analysis is a critical link in reducing casualties and damage due to earthquakes. Recognition of this has led to rapidly rising demand for accurate, reliable and flexible risk assessment software. However, there is a significant disparity between the high-quality scientific data developed by researchers and the availability of versatile, open and user-friendly risk analysis tools that meet the demands of end users. In the past few years several open-source software packages that play an important role in seismic research, such as OpenSHA and OpenSEES, have been developed. There is, however, still a gap when it comes to open-source risk assessment tools and software. To fill this gap, the Global Earthquake Model (GEM) has been created. GEM is an internationally sanctioned program initiated by the OECD that aims to build independent, open standards to calculate and communicate earthquake risk around the world. This initiative started with a one-year pilot project named GEM1, during which a number of existing risk software packages were evaluated. After a critical review of the results it was concluded that none of them met GEM's requirements, and a new object-oriented tool was therefore to be developed. This paper presents a summary of some of the best-known applications used in risk analysis, highlighting the main aspects that were considered for the development of this risk platform. The research carried out to gather the information needed to build this tool spanned four areas: the information technology approach, seismic hazard resources, vulnerability assessment methodologies and sources of exposure data. The main aspects and findings for each of these areas are presented, as well as how these features were incorporated into the current risk engine.
    Currently, the risk engine can predict human or economic losses worldwide for both deterministic and probabilistic events, using vulnerability curves. A first version of GEM will become available at the end of 2013. Until then, the risk engine will continue to be developed by a growing community of developers on a dedicated open-source platform.
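    As a rough illustration of the vulnerability-curve loss calculation the abstract describes, the sketch below interpolates a mean loss ratio from a curve and applies it to exposed asset values. The curve points, asset values and intensity measure are hypothetical, and this is a minimal sketch of the general technique, not the GEM engine's actual implementation.

```python
import bisect

def interpolate(curve, iml):
    """Linearly interpolate a mean loss ratio from a vulnerability curve.
    curve is a sorted list of (intensity, mean_loss_ratio) points;
    iml is the intensity measure level at the site."""
    xs = [p[0] for p in curve]
    ys = [p[1] for p in curve]
    if iml <= xs[0]:
        return ys[0]
    if iml >= xs[-1]:
        return ys[-1]
    i = bisect.bisect_left(xs, iml)
    t = (iml - xs[i - 1]) / (xs[i] - xs[i - 1])
    return ys[i - 1] + t * (ys[i] - ys[i - 1])

def scenario_loss(assets, curve, iml):
    """Deterministic-event loss: sum of value * loss ratio over exposed assets."""
    return sum(value * interpolate(curve, iml) for value in assets)

# Hypothetical curve: peak ground acceleration (g) vs. mean loss ratio.
curve = [(0.1, 0.02), (0.3, 0.15), (0.6, 0.45), (1.0, 0.80)]
assets = [250_000, 400_000, 120_000]  # replacement values in USD
loss = scenario_loss(assets, curve, 0.45)
```

    A probabilistic calculation would repeat this over many hazard realizations; the deterministic case above applies a single intensity level to the whole exposure.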

    A method for interference mitigation in space communications scheduling

    Increases in the number of user spacecraft and in the data rates supported by NASA's Tracking and Data Relay Satellite System (TDRSS) in the S and Ku bands could result in communications conflicts due to mutual interference. A method was developed to mitigate interference while minimizing unnecessary scheduling restrictions on both TDRSS network and user resources, based on consideration of all relevant communications parameters. The steps of this method calculate required separation angles at the TDRS and produce interference intervals, which can be used to build schedules free of unacceptable interference. The method can also serve as a basis for analysis, evaluation and optimization of user schedules with respect to communications performance. Described here are the proposed method and its potential application to scheduling in space communications. Test cases for planned missions, including the Earth Observing System, the Space Station Manned Base and the Space Shuttle, are discussed.
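    The core geometric step the abstract names, checking the separation angle between two user spacecraft as seen from the TDRS and turning the conflicting samples into interference intervals, can be sketched as follows. The line-of-sight vectors, time grid and required separation threshold are all hypothetical inputs; this is an illustration of the idea, not NASA's published algorithm.

```python
import math

def angular_separation(u, v):
    """Angle (radians) between two unit line-of-sight vectors from the TDRS."""
    dot = max(-1.0, min(1.0, sum(a * b for a, b in zip(u, v))))
    return math.acos(dot)

def interference_intervals(times, los_a, los_b, required_sep):
    """Return (start, end) time pairs where the separation angle at the TDRS
    falls below the required separation -- candidate conflict intervals to
    exclude when building a schedule."""
    intervals, start = [], None
    for i, (u, v) in enumerate(zip(los_a, los_b)):
        conflict = angular_separation(u, v) < required_sep
        if conflict and start is None:
            start = i
        elif not conflict and start is not None:
            intervals.append((times[start], times[i - 1]))
            start = None
    if start is not None:
        intervals.append((times[start], times[-1]))
    return intervals

# Hypothetical example: user B drifts toward user A's line of sight and away again.
times = [0, 60, 120, 180, 240]  # seconds
los_a = [(1.0, 0.0, 0.0)] * 5
angles_deg = [10, 3, 2, 4, 12]
los_b = [(math.cos(math.radians(d)), math.sin(math.radians(d)), 0.0)
         for d in angles_deg]
conflicts = interference_intervals(times, los_a, los_b, math.radians(5))
```

    In a real scheduler the required separation would itself be computed from link parameters (antenna patterns, data rates, margins) rather than fixed, which is the part of the method the abstract emphasizes.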

    Optimization Testbed Cometboards Extended into Stochastic Domain

    COMparative Evaluation Testbed of Optimization and Analysis Routines for the Design of Structures (CometBoards) is a multidisciplinary design optimization software package. Originally developed for deterministic calculation, it has now been extended into the stochastic domain for structural design problems. For deterministic problems, CometBoards is introduced through its subproblem solution strategy and its approximation concept in optimization. In the stochastic domain, a design is formulated as a function of risk, or reliability. The optimum solution, including the weight of the structure, is likewise obtained as a function of reliability. Weight versus reliability traces out an inverted-S-shaped graph. The center of the graph corresponds to 50 percent probability of success, or one failure in two samples. A heavy design with weight approaching infinity results for a near-zero rate of failure, which corresponds to a reliability of unity. Weight can be reduced to a small value for the most failure-prone design, with reliability compromised toward zero. The stochastic design optimization (SDO) capability for an industrial problem was obtained by combining three codes: MSC/Nastran as the deterministic analysis tool, the fast probabilistic integrator (the FPI module of the NESSUS software) as the probabilistic calculator, and CometBoards as the optimizer. The SDO capability requires a finite element structural model, a material model, a load model and a design model. The stochastic optimization concept is illustrated with an academic example and a real-life airframe component made of metallic and composite materials.
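    The inverted-S weight-versus-reliability trend the abstract describes can be reproduced with a toy model in which weight grows with the standard-normal quantile of the reliability, so weight is finite at 50 percent reliability and diverges as reliability approaches unity. The functional form and parameters here are hypothetical illustrations, not CometBoards output.

```python
from statistics import NormalDist

def weight_for_reliability(r, w50=100.0, scale=15.0):
    """Toy structural weight as a function of reliability r in (0, 1).
    w50 is the weight at 50% reliability (one failure in two samples);
    the standard-normal quantile produces the inverted-S shape:
    weight -> infinity as r -> 1, and a small weight as r -> 0."""
    return w50 + scale * NormalDist().inv_cdf(r)

# Tracing the curve at a few reliability levels (values are illustrative).
trace = {r: weight_for_reliability(r) for r in (0.1, 0.5, 0.9, 0.999)}
```

    The monotone but sharply steepening tail of this curve is what makes near-zero failure rates so expensive in weight, which is the trade-off the SDO capability is built to explore.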

    Benefits of pollution monitoring technology for greenhouse gas offset markets

    Environmental economists have shown that tradable emission permit markets can reduce the costs to society of pollution reduction. However, when emissions are difficult to monitor and verify, offset credits from pollution reductions may be subject to price discounts that reduce social welfare. In this paper, we estimate the extent to which social welfare could be improved by using new technology to increase the accuracy with which pollution flows from agricultural fields can be monitored. We use a hypothetical case study in which farmers can reduce nitrous oxide (N2O) emissions from Midwest agricultural land parcels and sell the resulting offset permits in a greenhouse gas tradable permit market. We simulate market outcomes with and without an inexpensive technology that increases the accuracy of emission estimates, reduces the discount to which agricultural offset permits are subject, and improves the performance of the tradable permit system. We find that the benefits from such technology range as high as $138 for a 100-acre field if N2O emissions are an exponential function of nitrogen application rates. However, variation in the benefits to farmers of eliminating price discounts may mean that efficient technology adoption is not uniform across space.
    Keywords: tradable permit, greenhouse gases, uncertainty, technology
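    The structure of the benefit calculation, exponential emissions in the nitrogen rate, credits from the reduction, and a gain equal to the price discount that better monitoring removes, can be sketched as below. All coefficients, prices and rates are hypothetical placeholders; this is an illustration of the setup, not the paper's calibrated simulation.

```python
import math

def n2o_emissions(n_rate, a=0.5, b=0.01):
    """Per-acre N2O emissions (tCO2e) as an exponential function of the
    nitrogen application rate (lbs/acre). Coefficients a and b are
    hypothetical, chosen only to illustrate the functional form."""
    return a * math.exp(b * n_rate)

def monitoring_benefit(acres, n_before, n_after, price, discount):
    """Farmer's gain from technology that removes the price discount
    applied to hard-to-verify offset credits: the discount share of
    revenue on the credits generated by cutting nitrogen use."""
    credits = acres * (n2o_emissions(n_before) - n2o_emissions(n_after))
    return price * credits * discount

# Hypothetical 100-acre field cutting nitrogen from 150 to 100 lbs/acre,
# with a $15/tCO2e permit price and a 20% verification discount.
gain = monitoring_benefit(100, 150, 100, price=15.0, discount=0.2)
```

    Because the emissions function is convex, the same nitrogen cut yields more credits, and hence a larger monitoring benefit, on heavily fertilized fields, which is one source of the spatial variation in adoption incentives the abstract notes.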