Nuclear Fuel Cycle Reasoner: PNNL FY13 Report
In Fiscal Year 2012 (FY12), PNNL implemented a formal reasoning framework and applied it to a specific challenge in nuclear nonproliferation. The Semantic Nonproliferation Analysis Platform (SNAP) was developed as a preliminary graphical user interface to demonstrate the potential power of the underlying semantic technologies to analyze and explore facts and relationships relating to the nuclear fuel cycle (NFC). In Fiscal Year 2013 (FY13), the SNAP demonstration was enhanced to address query and navigation usability issues.
Nuclear Fuel Cycle Reasoner: PNNL FY12 Report
Building on previous internal investments and leveraging ongoing advancements in semantic technologies, PNNL implemented a formal reasoning framework and applied it to a specific challenge in nuclear nonproliferation. The Semantic Nonproliferation Analysis Platform (SNAP) was developed as a preliminary graphical user interface to demonstrate the potential power of the underlying semantic technologies to analyze and explore facts and relationships relating to the nuclear fuel cycle (NFC). In developing this proof-of-concept prototype, the utility and relevance of semantic technologies to the Office of Defense Nuclear Nonproliferation Research and Development (DNN R&D) have become better understood.
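An illustrative sketch of the kind of fact-and-relationship exploration described above: a minimal in-memory triple store with pattern-matching queries. All entity and relation names here are hypothetical stand-ins, not drawn from SNAP itself.

```python
# Illustrative sketch only: a tiny (subject, predicate, object) triple store,
# showing the style of query a semantic platform might answer.
# The facility names and predicates below are hypothetical.
triples = {
    ("FacilityA", "performs", "UraniumEnrichment"),
    ("UraniumEnrichment", "stageOf", "NuclearFuelCycle"),
    ("FacilityA", "locatedIn", "CountryX"),
    ("FacilityB", "performs", "FuelFabrication"),
    ("FuelFabrication", "stageOf", "NuclearFuelCycle"),
}

def query(subject=None, predicate=None, obj=None):
    """Return all triples matching the given pattern (None = wildcard)."""
    return [
        (s, p, o)
        for (s, p, o) in triples
        if (subject is None or s == subject)
        and (predicate is None or p == predicate)
        and (obj is None or o == obj)
    ]

# Which facilities perform a fuel-cycle stage?
stages = {s for (s, _, _) in query(predicate="stageOf", obj="NuclearFuelCycle")}
facilities = sorted(s for (s, _, o) in query(predicate="performs") if o in stages)
print(facilities)  # ['FacilityA', 'FacilityB']
```

A production system would use an RDF store and a query language such as SPARQL; the sketch only conveys the pattern-matching idea.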
Recommended from our members
Ontological Annotation with WordNet
Semantic Web applications require robust and accurate annotation tools that are capable of automating the assignment of ontological classes to words in naturally occurring text (ontological annotation). Most current ontologies do not include rich lexical databases and are therefore not easily integrated with the word sense disambiguation algorithms that are needed to automate ontological annotation. WordNet provides a potentially ideal solution to this problem, as it offers a highly structured lexical conceptual representation that has been extensively used to develop word sense disambiguation algorithms. However, WordNet was not designed as an ontology, and while it can easily be turned into one, the result would present users with serious practical limitations due to the great number of concepts (synonym sets) it contains. Moreover, mapping WordNet to an existing ontology may be difficult and requires substantial labor. We propose to overcome these limitations by developing an analytical platform that (1) provides a WordNet-based ontology offering a manageable and yet comprehensive set of concept classes, (2) leverages the lexical richness of WordNet to give an extensive characterization of each concept class in terms of lexical instances, and (3) integrates a class recognition algorithm that automates the assignment of concept classes to words in naturally occurring text. The ensuing framework makes available an ontological annotation platform that can be effectively integrated with intelligence analysis systems to facilitate evidence marshaling and sustain the creation and validation of inference models.
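The class-assignment idea described in point (3) can be sketched as a hypernym climb: walk upward from a word through its parent concepts until one of the designated top-level classes is reached. The miniature hierarchy and class set below are hypothetical stand-ins for WordNet's synset graph and the platform's reduced ontology, not the actual algorithm.

```python
# Illustrative sketch only: assigning top-level concept classes to words by
# walking a hypernym hierarchy, as a WordNet-based annotator might.
# The hierarchy and class set are hypothetical miniatures.

CONCEPT_CLASSES = {"artifact", "organization", "substance"}

# word -> immediate hypernym (parent concept)
HYPERNYMS = {
    "centrifuge": "machine",
    "machine": "device",
    "device": "artifact",
    "uranium": "metal",
    "metal": "substance",
    "agency": "organization",
}

def concept_class(word):
    """Climb hypernyms until a designated concept class is reached."""
    seen = set()
    node = word
    while node in HYPERNYMS and node not in seen:
        seen.add(node)
        node = HYPERNYMS[node]
        if node in CONCEPT_CLASSES:
            return node
    return node if node in CONCEPT_CLASSES else None

print(concept_class("centrifuge"))  # artifact
print(concept_class("uranium"))     # substance
```

In the real setting, word sense disambiguation would first pick a synset for the word in context; the climb then runs over WordNet's hypernym relations instead of a hand-coded dictionary.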
Nuclear Nonproliferation Ontology Assessment Team Final Report
Final Report for the NA22 Simulations, Algorithms and Modeling (SAM) Ontology Assessment Team's efforts from FY09-FY11. The Ontology Assessment Team began in May 2009 and concluded in September 2011. During this period, the Ontology Assessment Team had two objectives: (1) assessing the utility of knowledge representation and semantic technologies for addressing nuclear nonproliferation challenges; and (2) developing ontological support tools that would provide a framework for integrating across the Simulations, Algorithms and Modeling (SAM) program. The SAM Program was going through a large assessment and strategic planning effort during this time, and as a result, the relative importance of these two objectives changed, altering the focus of the Ontology Assessment Team. In the end, the team conducted an assessment of the state of the art, created an annotated bibliography, and developed a series of ontological support tools, demonstrations, and presentations. A total of more than 35 individuals from 12 different research institutions participated in the Ontology Assessment Team. These included subject matter experts in several nuclear nonproliferation-related domains as well as experts in semantic technologies. Despite the diverse backgrounds and perspectives, the Ontology Assessment Team functioned very well together, and aspects of the effort could serve as a model for future inter-laboratory collaborations and working groups. While the team encountered several challenges and learned many lessons along the way, the Ontology Assessment effort was ultimately a success that led to several multi-lab research projects and opened up a new area of scientific exploration within the Office of Nuclear Nonproliferation and Verification.
Modeling Human Behavior to Anticipate Insider Attacks
The insider threat ranks among the most pressing cyber-security challenges that threaten government and industry information infrastructures. To date, no systematic methods have been developed that provide a complete and effective approach to prevent data leakage, espionage, and sabotage. Current practice is forensic in nature, relegating to the analyst the bulk of the responsibility to monitor, analyze, and correlate an overwhelming amount of data. We describe a predictive modeling framework that integrates a diverse set of data sources from the cyber domain, as well as inferred psychological/motivational factors that may underlie malicious insider exploits. This comprehensive threat assessment approach provides automated support for the detection of high-risk behavioral triggers to help focus the analyst's attention and inform the analysis. Designed to be domain-independent, the system may be applied to many different threat and warning analysis/sense-making problems.
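The integration of cyber observables with inferred psychosocial factors can be sketched, in much-simplified form, as a weighted-indicator score. The indicators and weights below are hypothetical illustrations, not the framework's actual model.

```python
# Illustrative sketch only: a weighted sum over observed indicators combining
# cyber observables with an inferred psychosocial factor, in the spirit of the
# framework described above. Indicator names and weights are hypothetical.

INDICATOR_WEIGHTS = {
    "after_hours_logins": 0.2,
    "bulk_file_downloads": 0.35,
    "policy_violations": 0.15,
    "disgruntlement_signals": 0.3,  # inferred psychological/motivational factor
}

def risk_score(observations):
    """Weighted sum of observed indicators, clipped to [0, 1] and rounded."""
    score = sum(
        INDICATOR_WEIGHTS[name]
        for name, present in observations.items()
        if present and name in INDICATOR_WEIGHTS
    )
    return round(min(score, 1.0), 2)

obs = {"after_hours_logins": True, "bulk_file_downloads": True,
       "policy_violations": False, "disgruntlement_signals": True}
print(risk_score(obs))  # 0.85
```

A real system would replace the static weights with a learned or Bayesian model and feed the score into an alerting workflow for the analyst.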
Deposition Velocities of Newtonian and Non-Newtonian Slurries in Pipelines
The WTP pipe plugging issue, as stated by the External Flowsheet Review Team (EFRT) Executive Summary, is as follows: "Piping that transports slurries will plug unless it is properly designed to minimize this risk. This design approach has not been followed consistently, which will lead to frequent shutdowns due to line plugging." A strategy was employed to perform critical-velocity tests on several physical simulants. Critical velocity is defined as the point at which a stationary bed of particles deposits on the bottom of a straight horizontal pipe during slurry transport operations. Results from the critical-velocity testing provide an indication of slurry stability as a function of fluid rheological properties and transport conditions. The experimental results are compared to the WTP design guide on slurry transport velocity in an effort to confirm the minimum waste velocity and flushing velocity requirements established by calculations and critical line velocity correlations in the design guide. The major findings of this testing are discussed below. Experimental results indicate that the use of the Oroskar and Turian (1980) correlation in the design guide is conservative: slurry viscosity has a greater effect on particles with a large surface-area-to-mass ratio. The increased viscous forces on these particles result in a decrease in predicted critical velocities relative to traditional, industry-derived equations that focus on particles larger than 100 µm in size. Since the Hanford slurry particles generally have large surface-area-to-mass ratios, the reliance on such equations in the Hall (2006) design guide is conservative. Additionally, the use of the 95th-percentile particle size as an input to this equation is conservative. However, test results indicate that the use of an average particle density as an input to the equation is not conservative: particle density has a large influence on the overall result returned by the correlation.
Lastly, the viscosity correlation used in the WTP design guide has been shown to be inaccurate for Hanford waste feed materials. The use of the Thomas (1979) correlation in the design guide is not conservative: in cases where 100% of the particles are smaller than 74 µm, or where the particles are considered homogeneous because yield-stress forces suspend them, the homogeneous fraction of the slurry can be set to 100%. In such cases, the predicted critical velocity based on the conservative Oroskar and Turian (1980) correlation is reduced to zero, and the design guide returns a value from the Thomas (1979) correlation. The measured data in this report show that the Thomas (1979) correlation predictions often fall below the measured experimental values. A non-Newtonian deposition-velocity design guide should be developed for the WTP: since the existing design guide is limited to Newtonian fluids and the WTP expects to process large quantities of non-Newtonian materials, the design guide should be modified to address such systems. A central experimental finding of this testing is that the flow velocity required to reach turbulent flow increases with slurry rheological properties, because viscous forces damp the formation of turbulent eddies. The flow becomes dominated by viscous forces rather than turbulent eddies. Since the turbulent eddies necessary for particle transport are not present, the particles settle when crossing this boundary, called the transitional deposition boundary. This deposition mechanism should be expected and designed for in the WTP.
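For orientation, a minimal sketch of one commonly quoted form of the Oroskar and Turian (1980) critical-velocity correlation discussed above. The exponents and constant should be checked against the original paper or the design guide before relying on the numbers, and the slurry inputs below are hypothetical.

```python
# Illustrative sketch only: one commonly quoted form of the Oroskar and
# Turian (1980) critical deposition velocity correlation. Verify constants
# and exponents against the original source; inputs are hypothetical.
import math

def oroskar_turian_vc(d, D, C, rho_s, rho_f, mu, x=0.96, g=9.81):
    """Critical deposition velocity (m/s).

    d: particle diameter (m); D: pipe diameter (m); C: solids volume fraction;
    rho_s, rho_f: solid and fluid densities (kg/m^3); mu: fluid viscosity (Pa*s);
    x: fraction of eddies with velocity exceeding the particle settling velocity.
    """
    s = rho_s / rho_f                       # specific gravity ratio
    base = math.sqrt(g * d * (s - 1.0))     # characteristic settling velocity scale
    Re = D * rho_f * base / mu              # correlation Reynolds number
    return (base * 1.85 * C**0.1536 * (1.0 - C)**0.3564
            * (D / d)**0.378 * Re**0.09 * x**0.3)

# Hypothetical slurry: 100 um particles, 3-inch pipe, 10 vol% solids in water.
vc = oroskar_turian_vc(d=100e-6, D=0.0762, C=0.10,
                       rho_s=2500.0, rho_f=1000.0, mu=1.0e-3)
print(round(vc, 2))
```

Note how the report's findings map onto the inputs: the particle diameter `d` (the 95th-percentile size in the design guide) and the particle density `rho_s` (an average value, found non-conservative) both enter the correlation directly.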