    Cost-effectiveness of [18F] fluoroethyl-L-tyrosine for temozolomide therapy assessment in patients with glioblastoma

    Background and Purpose: Glioblastomas are the most aggressive of all gliomas. The prognosis of these gliomas, which are classified as grade IV tumors by the World Health Organization (WHO), is poor. Combination therapy, including surgery, radiotherapy, and chemotherapy, has variable outcomes and is expensive. In light of rising healthcare costs, there are societal demands for the justification of medical expenses. Therefore, we calculated the cost-effectiveness of follow-up [F-18] fluoroethyl-L-tyrosine ([F-18] FET) positron emission tomography (PET) scans performed on patients with glioblastoma after surgery and before commencing temozolomide maintenance treatment. Materials and Methods: To determine the cost-effectiveness of follow-up [F-18] FET PET procedures, we examined published clinical data and calculated the associated costs in the context of Belgian healthcare. We subsequently performed one-way deterministic sensitivity analysis and Monte Carlo analysis on the calculated ratios. Results: The decision tree based on overall survival rates showed that the number of non-responders identified using PET was 57.14% higher than the number identified using conventional MRI. The decision tree based on progression-free survival rates revealed a comparable increase of 57.50% in identified non-responders. The calculated cost of the two required PET scans per patient during the follow-up treatment phase was 780.50 euros. Two cost-effectiveness ratios were determined, one for overall survival and one for progression-free survival. Both calculations yielded very similar results: incremental cost-effectiveness ratios of 1,365.86 and 1,357.38 euros, respectively, per identified non-responder. The findings of the sensitivity analysis supported the calculated results, confirming that the obtained data are robust. Conclusion: Our comparative study of conventional MRI and [F-18] FET PET revealed that the latter is a valuable and cost-effective tool for predicting the response of patients with glioblastoma to temozolomide maintenance treatment. [F-18] FET PET scans thus enable clinical outcomes to be predicted accurately and at low cost. Moreover, given the robustness of the data in the sensitivity analyses, the level of certainty of this outcome is acceptable.
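    The incremental cost-effectiveness ratios above follow directly from the reported figures: the added cost is the 780.50 euros for the two PET scans, and the added effect is the fraction of an additional non-responder identified per patient. A minimal sketch of the arithmetic (the variable names are ours, not the paper's):

```python
# Added cost per patient: the two follow-up [F-18] FET PET scans (euros).
pet_cost_per_patient = 780.50

# Added effect per patient: additional non-responders identified with PET
# relative to conventional MRI (57.14% and 57.50% from the two decision trees).
extra_nonresponders_os = 0.5714    # overall-survival decision tree
extra_nonresponders_pfs = 0.5750   # progression-free-survival decision tree

def icer(delta_cost, delta_effect):
    """Incremental cost-effectiveness ratio: extra euros spent per
    additional non-responder identified."""
    return delta_cost / delta_effect

print(f"ICER (overall survival):   {icer(pet_cost_per_patient, extra_nonresponders_os):,.2f} euros")
print(f"ICER (progression-free):   {icer(pet_cost_per_patient, extra_nonresponders_pfs):,.2f} euros")
# ~1,365.9 and ~1,357.4 euros per identified non-responder, matching the
# reported 1,365.86 and 1,357.38 up to rounding of the percentages.
```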

    Automatic Target Recognition Classification System Evaluation Methodology

    This dissertation research makes contributions towards the evaluation of developing Automatic Target Recognition (ATR) technologies through the application of decision analysis (DA) techniques. ATR technology development decisions should rely not only on the measures of performance (MOPs) associated with a given ATR classification system (CS), but also on the expected measures of effectiveness (MOEs). The purpose of this research is to improve decision-making in ATR technology development. A decision analysis framework is developed that allows decision-makers in the ATR community to synthesize the performance measures, costs, and characteristics of each ATR system with the preferences and values of both the evaluators and the warfighters. The inclusion of the warfighter's perspective is important, since it has been shown that basing ATR CS comparisons solely upon performance characteristics does not ensure superior operational effectiveness. The methodology also captures the relationship between MOPs and MOEs via a combat model. An example scenario demonstrates how ATR CSs may be compared. Sensitivity analysis is performed to demonstrate the robustness of the MOP-to-value-score and MOP-to-MOE translations. A multinomial selection procedure is introduced to account for the random nature of the MOP estimates.
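    To make the synthesis step concrete, here is a minimal sketch of a weighted additive value model of the kind such a decision analysis framework rests on. The attributes, weights, single-dimensional value functions, and candidate scores are all invented for illustration; the dissertation's actual model is considerably richer:

```python
# Hypothetical additive value model for comparing ATR classification systems.
# Weights would be elicited from evaluators and warfighters; these are made up.
weights = {"prob_correct_id": 0.4, "false_alarm_rate": 0.3, "cost_musd": 0.3}

# Single-dimensional value functions mapping each raw measure onto [0, 1],
# with higher values preferred.
value_fns = {
    "prob_correct_id": lambda p: p,                    # already on [0, 1]
    "false_alarm_rate": lambda f: 1.0 - f,             # lower is better
    "cost_musd": lambda c: max(0.0, 1.0 - c / 10.0),   # $10M maps to value 0
}

def overall_value(system):
    """Weighted additive value score for one candidate ATR system."""
    return sum(w * value_fns[attr](system[attr]) for attr, w in weights.items())

candidates = {
    "ATR-A": {"prob_correct_id": 0.85, "false_alarm_rate": 0.10, "cost_musd": 6.0},
    "ATR-B": {"prob_correct_id": 0.78, "false_alarm_rate": 0.05, "cost_musd": 3.5},
}
for name, attrs in sorted(candidates.items(), key=lambda kv: -overall_value(kv[1])):
    print(f"{name}: overall value = {overall_value(attrs):.3f}")
```

    A swing-weight sensitivity check of the kind the dissertation describes would then perturb the weights and re-rank the candidates.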

    Seismic risk of infrastructure systems with treatment of and sensitivity to epistemic uncertainty

    Modern society’s very existence is tied to the proper and reliable functioning of its Critical Infrastructure (CI) systems. In the seismic risk assessment of an infrastructure, taking into account all the relevant uncertainties affecting the problem is crucial. While both aleatory and epistemic uncertainties affect the estimate of seismic risk to an infrastructure and should be considered, the focus herein is on the latter. After providing an up-to-date literature review on the treatment of and sensitivity to epistemic uncertainty, this paper presents a comprehensive framework for the seismic risk assessment of interdependent, spatially distributed infrastructure systems that accounts for both aleatory and epistemic uncertainties, provides a measure of confidence in the estimate, and quantifies the sensitivity of the uncertainty in the output to the components of epistemic uncertainty in the input. The logic tree approach is used for the treatment of epistemic uncertainty and for the sensitivity analysis, whose results are presented through tornado diagrams. Sensitivity is also evaluated by analyzing the logic tree results with weighted ANOVA. The formulation is general and can be applied to risk assessment problems involving not only infrastructural but also structural systems. The presented methodology was implemented in the open-source software OOFIMS and applied to a synthetic city composed of buildings and a gas network and subjected to seismic hazard. The gas system’s performance is assessed through a flow-based analysis. The seismic hazard, the vulnerability assessment, and the evaluation of the gas system’s operational state are addressed with a simulation-based approach. The presence of two systems (buildings and gas network) demonstrates the capability to handle system interdependencies and highlights that uncertainty in the models and parameters of one system can affect the uncertainty in outputs related to dependent systems.
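    As a toy version of the logic-tree treatment and the tornado-diagram sensitivity described above, the sketch below enumerates weighted branches for two epistemic inputs and reports, for each input, the swing in the loss estimate when that input alone is varied across its branches. The models, weights, and multipliers are illustrative and are not taken from OOFIMS:

```python
from itertools import product

# Weighted branches for two epistemic inputs (names and numbers invented).
# Each branch: (label, weight, multiplier on a reference mean loss).
gmpe_branches = [("GMPE-1", 0.6, 1.00), ("GMPE-2", 0.4, 1.30)]
fragility_branches = [("Frag-A", 0.5, 0.90), ("Frag-B", 0.5, 1.20)]

base_loss = 100.0  # reference mean loss, e.g. in millions of euros

def branch_loss(gmpe_mult, frag_mult):
    """Loss estimate for one end branch of the logic tree."""
    return base_loss * gmpe_mult * frag_mult

# Weighted mean over all end branches (branch weights multiply along the tree).
mean_loss = sum(
    wg * wf * branch_loss(mg, mf)
    for (_, wg, mg), (_, wf, mf) in product(gmpe_branches, fragility_branches)
)
print(f"logic-tree mean loss: {mean_loss:.1f}")

# Tornado diagram: swing per input, holding the other input at its
# weighted-mean multiplier.
def swing(branches, other_mean_mult):
    losses = [branch_loss(mult, other_mean_mult) for (_, _, mult) in branches]
    return max(losses) - min(losses)

gmpe_mean = sum(w * m for _, w, m in gmpe_branches)
frag_mean = sum(w * m for _, w, m in fragility_branches)
print(f"swing from GMPE choice:      {swing(gmpe_branches, frag_mean):.1f}")
print(f"swing from fragility choice: {swing(fragility_branches, gmpe_mean):.1f}")
```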

    Earthquake forecasting and its verification

    No proven method is currently available for the reliable short-term prediction of earthquakes (minutes to months). However, it is possible to make probabilistic hazard assessments for earthquake risk. In this paper we discuss a new approach to earthquake forecasting based on a pattern informatics (PI) method which quantifies temporal variations in seismicity. The output, which is based on an association of small earthquakes with future large earthquakes, is a map of areas in a seismogenic region ("hotspots") where earthquakes are forecast to occur in a future 10-year time span. This approach has been successfully applied to California, to Japan, and on a worldwide basis. Because a sharp decision threshold is used, these forecasts are binary: an earthquake is forecast either to occur or not to occur. The standard approach to the evaluation of a binary forecast is the relative (or receiver) operating characteristic (ROC) diagram, which is a more restrictive test and less subject to bias than maximum likelihood tests. To test our PI method, we made two types of retrospective forecasts for California. The first uses the PI method and the second is a relative intensity (RI) forecast based on the hypothesis that future large earthquakes will occur where most smaller earthquakes have occurred in the recent past. While both retrospective forecasts are for the ten-year period 1 January 2000 to 31 December 2009, we performed an interim analysis 5 years into the forecast. The PI method outperforms the RI method under most circumstances.
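    The ROC evaluation of a binary forecast reduces to sweeping the decision threshold over the forecast scores and plotting hit rate against false alarm rate. A minimal sketch, assuming a gridded study region where each cell carries a forecast score (PI or RI value) and a flag recording whether a target earthquake actually occurred there; all names are ours, not the authors' code:

```python
import numpy as np

def roc_points(scores, occurred):
    """ROC diagram for a binary forecast on a gridded region.

    scores   -- per-cell forecast score (e.g. PI or RI value)
    occurred -- per-cell bool, True if a target earthquake occurred
    Returns (false_alarm_rate, hit_rate) arrays, one point per threshold.
    """
    occurred = np.asarray(occurred, dtype=bool)
    order = np.argsort(scores)[::-1]           # sweep threshold high -> low
    occ = occurred[order]
    hit_rate = np.cumsum(occ) / occ.sum()
    false_alarm_rate = np.cumsum(~occ) / (~occ).sum()
    return false_alarm_rate, hit_rate

# Toy check: a forecast whose scores are higher in event cells should
# arc above the no-skill diagonal.
rng = np.random.default_rng(0)
occurred = rng.random(1000) < 0.05
scores = 0.5 * occurred + rng.random(1000)     # weakly informative forecast
far, hr = roc_points(scores, occurred)
auc = np.sum(np.diff(far) * (hr[1:] + hr[:-1]) / 2)  # trapezoidal area
print(f"area under the ROC curve: {auc:.2f}")
```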

    Uncertainty Analysis in Population-Based Disease Microsimulation Models

    Get PDF
    Objective. Uncertainty analysis (UA) is an important part of simulation model validation. However, the literature is imprecise as to how UA should be performed in the context of population-based microsimulation (PMS) models. In this expository paper, we discuss a practical approach to UA for such models. Methods. By adapting common concepts from published UA guidelines, we developed a comprehensive, step-by-step approach to UA in PMS models, including a sample size calculation to reduce the computational time. As an illustration, we performed UA for POHEM-OA, a microsimulation model of osteoarthritis (OA) in Canada. Results. The resulting sample size of the simulated population was 500,000 and the number of Monte Carlo (MC) runs was 785 for a computation time of 12 hours. The estimated 95% uncertainty intervals for the prevalence of OA in Canada in 2021 were 0.09 to 0.18 for men and 0.15 to 0.23 for women. The uncertainty surrounding the sex-specific prevalence of OA increased over time. Conclusion. The proposed approach to UA addresses the challenges specific to PMS models, such as the selection of parameters and the calculation of the number of MC runs and the population size needed to limit the computational burden. Our example shows that the proposed approach is feasible. Estimation of uncertainty intervals should become standard practice in the reporting of results from PMS models.
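    The uncertainty intervals above are read off the spread of repeated Monte Carlo runs. A minimal sketch of that final step, assuming each run draws its parameters from their uncertainty distributions, simulates a finite population, and returns a prevalence estimate; the toy model standing in for POHEM-OA, and its parameter values, are ours:

```python
import numpy as np

rng = np.random.default_rng(42)

def run_model(population_size, rng):
    """Stand-in for one microsimulation run: sample an uncertain
    prevalence parameter, then simulate a finite population with it."""
    prev_param = max(rng.normal(0.19, 0.02), 0.0)   # parameter uncertainty
    cases = rng.binomial(population_size, prev_param)
    return cases / population_size

n_runs, population = 785, 500_000   # values used for POHEM-OA in the paper
estimates = np.array([run_model(population, rng) for _ in range(n_runs)])

# 95% uncertainty interval from the empirical 2.5th/97.5th percentiles.
lo, hi = np.percentile(estimates, [2.5, 97.5])
print(f"95% uncertainty interval for prevalence: {lo:.2f} to {hi:.2f}")
```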