PHASER User's Manual Version 2.10
PHASER (Probabilistic Hybrid Analytical System Evaluation Routine) is a computer code for computing the top event probability of a system fault tree. It allows individual basic event probabilities to migrate easily from a zero-"scale"-factor (completely subjective) state to one in which the analyst has total knowledge (completely stochastic) about each basic event. The code implements a fuzzy algebra solution for subjective data, a probabilistic solution for stochastic data, and a hybrid mathematics solution for data that are partly subjective and partly stochastic. Events that are neither completely subjective nor completely stochastic are hybrid events and are handled internally as such. The stochastic and fuzzy ranges of uncertainty in the top event probability are also computed for the analyst. These are provided in the form of a fuzzy function for the subjective uncertainty, a probability density function (PDF) for the stochastic variability, and overall "confidence" factors for the two constituents of uncertainty, giving a complete hybrid result. PHASER interfaces with other Sandia codes (SABLE, LHS, and LHSPOST) to assist the user in determining cutsets and to compute probability density functions.
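As a rough illustration of the hybrid idea, the sketch below propagates a small fault tree whose top event is A OR (B AND C): A is a stochastic basic event sampled by Monte Carlo, while B and C are subjective events represented only by expert-judgment intervals (one alpha-cut of a fuzzy number). The tree, the distribution, and the intervals are invented for the example; this is not the PHASER algorithm itself.

```python
# Minimal sketch of a hybrid fault-tree evaluation, assuming top event T = A OR (B AND C).
# A is stochastic (Monte Carlo samples); B and C are subjective (interval-valued).
import random

N = 20_000

# Stochastic basic event A: lognormal failure probability, truncated to [0, 1].
a_samples = [min(1.0, random.lognormvariate(-6.0, 0.8)) for _ in range(N)]

# Subjective basic events B and C: expert-judgment intervals (a single alpha-cut).
b_interval = (1e-4, 5e-3)
c_interval = (2e-4, 1e-2)

def and_gate(p, q):
    """AND of independent events: product of probabilities."""
    return p * q

def or_gate(p, q):
    """OR of independent events: 1 - (1 - p) * (1 - q)."""
    return 1.0 - (1.0 - p) * (1.0 - q)

# Both gates are monotone in their arguments, so propagating the interval endpoints
# through them bounds the top event for every stochastic sample of A. The result is
# a band of distributions: a stochastic PDF whose location carries subjective uncertainty.
top_lo = [or_gate(a, and_gate(b_interval[0], c_interval[0])) for a in a_samples]
top_hi = [or_gate(a, and_gate(b_interval[1], c_interval[1])) for a in a_samples]

mean_lo = sum(top_lo) / N
mean_hi = sum(top_hi) / N
print(f"mean top-event probability lies roughly in [{mean_lo:.3e}, {mean_hi:.3e}]")
```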
Team 2: Situation Awareness of an Infantry Unit in a Chemical Environment
From Scythe: Proceedings and Bulletin of the International Data Farming Community, Issue 2, Workshop 14. The German Federal Office of Defense Technology and Procurement has been analyzing the influence of networked sensors and effectors on military capabilities. The background of our overall scenario is peace support operations (PSO) in an urban environment. The background for the actual technical evaluations of sensors, effectors, and the connecting network is the following scenario vignette: Convoy Protection.
The human GCOM1 complex gene interacts with the NMDA receptor and internexin-alpha
The known functions of the human GCOM1 complex hub gene include transcription elongation and the intercalated disk of cardiac myocytes. However, in all likelihood, the gene's most interesting, and thus far least understood, roles will be found in the central nervous system. To investigate the functions of the GCOM1 gene in the CNS, we have cloned human and rat brain cDNAs encoding novel, 105 kDa GCOM1 combined (Gcom) proteins, designated Gcom15, and identified a new group of GCOM1 interacting genes, termed Gints, from yeast two-hybrid (Y2H) screens. We showed that Gcom15 interacts with the NR1 subunit of the NMDA receptor by co-expression in heterologous cells, in which we observed bi-directional co-immunoprecipitation of human Gcom15 and murine NR1. Our Y2H screens revealed 27 novel GCOM1 interacting genes, many of which are synaptic proteins and/or play roles in neurologic diseases. Finally, we showed, using rat brain protein preparations, that the Gint internexin-alpha (INA), a known interactor of the NMDAR, co-IPs with GCOM1 proteins, suggesting a GCOM1-GRIN1-INA interaction and a novel pathway that may be relevant to neuroprotection.
ETPRE User's Manual Version 3.00
ETPRE is a preprocessor for the Event Progression Analysis Code EVNTRE. It reads an input file of event definitions and writes the lengthy EVNTRE code input files. ETPRE's advantage is that it eliminates the error-prone task of manually creating or revising these files, since their formats are quite elaborate. The user-friendly format of ETPRE differs from the EVNTRE code format in that questions, branch references, and other event tree components are defined symbolically instead of numerically. When ETPRE is executed, these symbols are converted to their numeric equivalents and written to the output files using formats defined in the EVNTRE Reference Manual. Revisions to event tree models are simplified by allowing the user to edit the symbolic format and rerun the preprocessor, since questions, branch references, and other symbols are automatically resequenced to their new values with each execution. ETPRE and EVNTRE have both been incorporated into the SETAC event tree analysis package.
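The sketch below illustrates the kind of symbolic-to-numeric translation and automatic resequencing that a preprocessor like ETPRE performs. The event-tree contents, question names, and output layout are invented for the example; the real EVNTRE input formats are defined in the EVNTRE Reference Manual and are considerably more elaborate.

```python
# Toy illustration of symbolic event-tree definitions being translated to numeric form.
# All question names, branches, and references below are invented for the example.
questions = [
    {"name": "PUMP-FAILS",   "branches": ["YES", "NO"]},
    {"name": "VALVE-OPENS",  "branches": ["YES", "NO"]},
    {"name": "TANK-RUPTURE", "branches": ["YES", "NO"]},
]

# Resequencing: numeric indices are assigned in definition order, so inserting or
# deleting a question and rerunning the preprocessor renumbers every reference.
index_of = {q["name"]: i + 1 for i, q in enumerate(questions)}

# Symbolic branch references become (question number, branch number) pairs.
symbolic_refs = [("VALVE-OPENS", "NO"), ("PUMP-FAILS", "YES")]

def to_numeric(question_name, branch_name):
    q = index_of[question_name]
    b = questions[q - 1]["branches"].index(branch_name) + 1
    return q, b

for ref in symbolic_refs:
    q, b = to_numeric(*ref)
    print(f"{ref[0]:>12s}/{ref[1]:<3s} -> question {q}, branch {b}")
```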
Hybrid Processing of Measurable and Subjective Data
Conventional system surety analysis is essentially restricted to measurable or physical-model-derived data. However, most analyses, including high-consequence system surety analysis, must also use subjective information. To address this need, there has been considerable effort to incorporate engineering judgment analytically. For example, Dempster-Shafer theory establishes a framework of which frequentist probability and Bayesian incorporation of new data are subsets. Although Bayesian and Dempster-Shafer methodologies both allow judgment, neither yields results that indicate the relative amounts of subjective judgment and measurable data they contain. The methodology described in this report addresses these problems through a hybrid-mathematics-based process that tracks the degree of subjective information in the output, thereby providing more informative (as well as more appropriate) results. In addition, most high-consequence systems present difficult-to-analyze situations. For example, in the Sandia National Laboratories nuclear weapons program, the probability that a weapon responds safely when exposed to an abnormal environment (e.g., lightning, crush, metal-melting temperatures) must be assured to meet a specific requirement. There are also non-probabilistic DOE and DoD requirements (e.g., for determining the adequacy of positive measures). The type of processing required for these and similar situations transcends conventional probabilistic and human-factors methodology. The results described herein address these situations by efficiently using subjective and objective information in a hybrid mathematical structure that applies directly to the surety assessment of high-consequence systems; they can also improve the quality of the information currently provided to decision-makers. To this end, objective inputs are processed in a conventional manner, while subjective inputs are derived from the combined engineering judgment of experts in the appropriate disciplines. In addition to providing output constituents (including a portrayal of uncertainty) corresponding to the combination of these input types, the individual contributions of each input type to the resultant uncertainty are determined and reported as part of the output. Finally, the safety assessment is complemented by a latent-effects analysis, facilitated by soft-aggregation accumulation of observed operational constituents.
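As a toy illustration of keeping the two uncertainty constituents separate, the sketch below combines one measured (objective) input propagated by Monte Carlo with one expert-judgment (subjective) interval, and reports the spread each contributes to the output rather than collapsing them into a single number. The inputs and the spread-ratio attribution are assumptions made for the example, not the hybrid mathematics defined in the report.

```python
# Sketch: report subjective and objective contributions to output uncertainty separately.
import random
import statistics

N = 10_000

# Objective input: measured failure rate, propagated by Monte Carlo sampling.
measured = [random.gauss(3.0e-4, 5.0e-5) for _ in range(N)]

# Subjective input: expert-judgment interval for a conditional probability with no test data.
expert_lo, expert_hi = 0.05, 0.20

# Output = measured rate times conditional probability; monotone, so endpoints bound it.
out_lo = [m * expert_lo for m in measured]
out_hi = [m * expert_hi for m in measured]

# Stochastic spread: width of the 5th-95th percentile band of the Monte Carlo result,
# evaluated at the midpoint of the expert interval so the two spreads are comparable.
mid = 0.5 * (expert_lo + expert_hi)
out_mid = sorted(m * mid for m in measured)
stochastic_spread = out_mid[int(0.95 * N)] - out_mid[int(0.05 * N)]

# Subjective spread: width of the interval the expert judgment induces at the mean input.
subjective_spread = statistics.mean(out_hi) - statistics.mean(out_lo)

total = stochastic_spread + subjective_spread
print(f"stochastic share of uncertainty: {stochastic_spread / total:.0%}")
print(f"subjective share of uncertainty: {subjective_spread / total:.0%}")
```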