
    Dynamic Agent Based Modeling Using Bayesian Framework for Addressing Intelligence Adaptive Nuclear Nonproliferation Analysis

    Realistically, no two nuclear proliferating or defensive entities are exactly identical; Agent Based Modeling (ABM) is a computational methodology that captures the uniqueness of those facilitating or preventing nuclear proliferation. The modular Bayesian ABM Nonproliferation Enterprise (BANE) tool has been developed at Texas A&M University for nuclear nonproliferation analysis. Entities engaged in nuclear proliferation cover a range of activities and fall into proliferating, defensive, and neutral agent classes. In BANE, proliferating agents pursue nuclear weapons, or at least a latent nuclear weapons capability. Defensive nonproliferation agents seek to uncover, hinder, reverse, or dismantle any proliferation networks they discover. The vast majority of agents are neutral, and only a small subset of these can significantly enable proliferation. BANE facilitates intelligent agent actions by employing entropy and mutual information for proliferation pathway determinations. Agents seeking optimal proliferation postures assess factors including technical success, resource expenditures, and detection probabilities. Coupling ABM with Bayesian analysis is powerful because no agent has omniscient knowledge of the others. Bayesian analysis links crucial knowledge and technology requirements into relationship networks for each proliferation category. With a Bayesian network, gaining information on proliferator actions in one category informs defensive agents where to expend their limited counter-proliferation capabilities. Correlating incomplete evidence for pattern recognition in BANE using Bayesian inference draws upon technical supply-side proliferation linkages grounded in physics. A potential or current proliferator's security situation, economic trajectory, and other factors modify the demand drivers for undertaking proliferation. Using Bayesian inference, the coupled demand- and supply-side proliferation drivers are connected to create feedback interactions.
Verification and some validation of BANE is performed using scenarios and historical case studies. Restrictive export controls, swings in global soft power affinity, and past proliferation program assessments for entities ranging from the Soviet Union to Iraq demonstrate BANE's flexibility and applicability. As a newly developed tool, BANE has room for future contributions from computer scientists, engineers, and social scientists. Through BANE, the framework exists for expanding detailed nonproliferation analysis into broader weapons of mass effect analysis, since nuclear proliferation is but one option for addressing international security concerns.
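The inference step described in this abstract, where evidence in one category sharpens a defensive agent's belief about proliferation activity, can be sketched as a single Bayesian update. This is a minimal illustration, not BANE's actual model; the hypothesis, the prior, and both likelihoods are assumed values chosen for demonstration only.

```python
# Hedged sketch of a single Bayesian update, in the spirit of BANE's
# defensive-agent inference. All numbers below are illustrative assumptions,
# not values drawn from the BANE tool itself.

def bayes_update(prior, likelihood_given_h, likelihood_given_not_h):
    """Posterior P(H | evidence) via Bayes' rule."""
    numerator = likelihood_given_h * prior
    evidence = numerator + likelihood_given_not_h * (1.0 - prior)
    return numerator / evidence

# Hypothetical belief that an entity runs a covert enrichment program.
prior = 0.05

# Evidence: suspicious procurement observed in one technology category.
# Likelihoods of that observation under each hypothesis (assumed values).
posterior = bayes_update(prior, likelihood_given_h=0.7, likelihood_given_not_h=0.1)
print(round(posterior, 3))  # 0.269
```

Even with a weak prior, one well-characterized observation roughly quintuples the posterior belief, which is the sense in which information gained in one category tells defensive agents where to concentrate limited counter-proliferation resources.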

    Development of a Prognostic Method for the Production of Undeclared Enriched Uranium

    As global demand for nuclear energy and threats to nuclear security increase, the need for verification of the peaceful application of nuclear materials and technology also rises. In accordance with the Nuclear Nonproliferation Treaty, the International Atomic Energy Agency is tasked with verification of the declared enrichment activities of member states. Due to the increased cost of inspection and verification of a globally growing nuclear energy industry, remote process monitoring has been proposed as part of a next-generation, information-driven safeguards program. To further enhance this safeguards approach, it is proposed that process monitoring data may be used not only to verify past activities but also to anticipate future ones via prognostic analysis. While prognostic methods exist for health monitoring of physical processes, the literature contains no methods to predict the outcome of decision-based events, such as the production of undeclared enriched uranium. This dissertation introduces a method to predict the time at which a significant quantity of unaccounted material is expected to be diverted during an enrichment process. The method utilizes a particle filter to model the data and provide a Type III (degradation-based) prognostic estimate of the time to diversion of a significant quantity. Measurement noise for the particle filter is estimated from historical data and may be updated with Bayesian estimates from the analyzed data. Dynamic noise estimates are updated based on observed changes in the process data. The reliability of the prognostic model for a given range of data is validated via information complexity scores and goodness-of-fit statistics. The developed prognostic method is tested using data produced from the Oak Ridge Mock Feed and Withdrawal Facility, a 1:100 scale test platform for developing gas centrifuge remote monitoring techniques. Four case studies are considered: no diversion, slow diversion, fast diversion, and intermittent diversion.
All intervals of diversion and non-diversion were correctly identified, and the significant-quantity diversion time was accurately estimated. A diversion of 0.8 kg over 85 minutes was detected after 10 minutes; after 46 minutes and 40 seconds of data, the time to diversion was predicted to be 84 minutes and 10 seconds, with an uncertainty of 2 minutes and 52 seconds.
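The core idea of the abstract, a particle filter that tracks a diversion trend and extrapolates the time until a significant quantity is reached, can be sketched in highly simplified form. This is not the dissertation's method: the threshold, noise level, simulated diversion rate, and linear-drift model below are all illustrative assumptions, and the real work adds dynamic noise estimation and model-reliability checks.

```python
import math
import random

# Hedged sketch of a Type III (degradation-style) prognostic. Each particle
# carries a hypothesized diversion rate (kg/min); weights come from a Gaussian
# likelihood on the observed unaccounted material. Values are illustrative and
# not taken from the Mock Feed and Withdrawal Facility data.

random.seed(1)
SQ = 0.8            # assumed significant-quantity threshold (kg)
NOISE = 0.01        # assumed measurement noise std (kg)
TRUE_RATE = 0.01    # simulated diversion rate (kg/min), illustration only

# Initial particle cloud: uniform prior over candidate diversion rates.
particles = [random.uniform(0.0, 0.05) for _ in range(500)]

for t in range(1, 31):  # thirty one-minute observations
    observed = TRUE_RATE * t + random.gauss(0.0, NOISE)  # simulated reading
    # Weight each rate hypothesis by the likelihood of the observation.
    weights = [math.exp(-((observed - r * t) ** 2) / (2 * NOISE ** 2))
               for r in particles]
    total = sum(weights) or 1e-300
    weights = [w / total for w in weights]
    # Resample so the cloud concentrates on rates consistent with the data.
    particles = random.choices(particles, weights=weights, k=len(particles))

rate_hat = sum(particles) / len(particles)
time_to_sq = SQ / rate_hat  # extrapolated minutes until one SQ is diverted
print(f"estimated rate {rate_hat:.4f} kg/min, time to SQ ~{time_to_sq:.0f} min")
```

The spread of the resampled particle cloud also yields an uncertainty band on the predicted diversion time, analogous to the 2-minute-52-second uncertainty reported for the fast-diversion case study.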

    Institutional plan FY 1999--FY 2004
