
    A Tailored Systems Engineering Framework for Science and Technology Projects

    As government and industry become subject to a wider range of technology initiatives, science and technology (S&T) research project leadership recognizes the need to incorporate more systems engineering (SE) rigor into their projects. The objective of this research is to develop a tailorable systems engineering framework for S&T project planning, execution, assessment, and transition. The key deliverable is an Excel-based tool instantiating the SE framework for a wide range of S&T projects in technology development organizations; it includes a report with tailored methods based on programmatic discriminants. To develop this framework, a comprehensive understanding of SE principles is applied to several case studies across government and supporting industry-sponsored S&T activities. This research followed a six-step approach: (1) Literature Review; (2) Formulate Taxonomy; (3) Prepare Data Gathering Approach; (4) Review Case Studies; (5) Develop Tailorable SE Framework for Technology Development and Transition; and (6) Validate Framework. The framework allows S&T project leaders and engineers to customize a recommended set of SE processes, methods, and tools for their specific project type, size, maturity, budget, and integration level, as sketched below. Recommendations for SE methods are made at a summary level, with additional details available for desired activities. References to established SE documentation are also included for further investigation of appropriate SE techniques.
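
    A minimal sketch of how such discriminant-based tailoring might work, written in Python for illustration; the discriminant names, values, and recommended methods below are hypothetical placeholders, since the actual Excel tool's rule set is not given in the abstract:

        from dataclasses import dataclass

        @dataclass
        class Project:
            size: str         # hypothetical discriminant: "small" or "large"
            maturity: int     # hypothetical discriminant: technology readiness level 1-9
            integration: str  # hypothetical discriminant: "standalone" or "integrated"

        def recommend_se_methods(p: Project) -> list[str]:
            """Map programmatic discriminants to a tailored set of SE methods.
            The rules are illustrative placeholders, not the tool's actual rules."""
            methods = ["stakeholder needs capture"]           # assumed baseline for any project
            if p.maturity >= 4:
                methods.append("requirements traceability")   # more rigor as maturity grows
            if p.integration == "integrated":
                methods.append("interface control documents")
            if p.size == "large":
                methods.append("formal design reviews")
            return methods

        print(recommend_se_methods(Project(size="large", maturity=5, integration="integrated")))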

    A computationally efficient method for probabilistic parameter threshold analysis for health economic evaluations

    Background. Threshold analysis is used to determine the threshold value of an input parameter at which a health care strategy becomes cost-effective. Typically, it is performed in a deterministic manner, in which inputs are varied one at a time while the remaining inputs are each fixed at their mean value. This approach will result in incorrect threshold values if the cost-effectiveness model is nonlinear or if inputs are correlated. Objective. To propose a probabilistic method for performing threshold analysis, which accounts for the joint uncertainty in all input parameters and makes no assumption about the linearity of the cost-effectiveness model. Methods. Three methods are compared: 1) deterministic threshold analysis (DTA); 2) a 2-level Monte Carlo approach, which is considered the gold standard; and 3) a regression-based method using a generalized additive model (GAM), which identifies threshold values directly from a probabilistic sensitivity analysis sample. Results. We applied the 3 methods to estimate the minimum probability of hospitalization for typhoid fever at which 3 different vaccination strategies become cost-effective in Uganda. The threshold probability of hospitalization at which routine vaccination at 9 months with a catch-up campaign to 5 years becomes cost-effective is estimated to be 0.060 and 0.061 (95% confidence interval [CI], 0.058–0.064), respectively, for the 2-level and GAM approaches. According to DTA, routine vaccination at 9 months with a catch-up campaign to 5 years would never become cost-effective. The threshold probability at which routine vaccination at 9 months with a catch-up campaign to 15 years becomes cost-effective is estimated to be 0.092 (DTA), 0.074 (2-level), and 0.072 (95% CI, 0.069–0.075) (GAM). GAM is 430 times faster than the 2-level approach. Conclusions. When the cost-effectiveness model is nonlinear, GAM provides threshold values similar to the 2-level Monte Carlo approach and is computationally more efficient. DTA provides incorrect results and should not be used.
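
    A minimal sketch of the regression-based idea on synthetic data, assuming a probabilistic sensitivity analysis (PSA) sample with one column for the parameter of interest and one for incremental net monetary benefit (INMB); a smoothing spline stands in for the GAM here, and all numbers and variable names are illustrative:

        import numpy as np
        from scipy.interpolate import UnivariateSpline
        from scipy.optimize import brentq

        rng = np.random.default_rng(0)

        # Synthetic PSA sample: the strategy becomes cost-effective (INMB > 0)
        # once the hospitalization probability exceeds an unknown threshold (~0.07 here).
        p_hosp = rng.uniform(0.02, 0.12, 2000)
        inmb = 40_000 * (p_hosp - 0.07) + rng.normal(0, 500, p_hosp.size)

        # Smooth regression of INMB on the parameter (spline as a GAM stand-in),
        # fitted directly to the PSA sample, with no linearity assumption.
        order = np.argsort(p_hosp)
        fit = UnivariateSpline(p_hosp[order], inmb[order], s=p_hosp.size * 500**2)

        # The threshold is where the fitted expected INMB crosses zero.
        threshold = brentq(fit, p_hosp.min(), p_hosp.max())
        print(f"estimated threshold probability: {threshold:.3f}")  # ~0.070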

    EU Development of High Heat Flux Components

    The development of plasma facing components for next-step fusion devices in Europe is strongly focused on ITER. Here a wide spectrum of different design options for the divertor target and the first wall has been investigated with tungsten, CFC, and beryllium armor. Electron beam simulation experiments have been used to determine the performance of high heat flux components under ITER-specific thermal loads. Besides thermal fatigue loads with power density levels up to $20\,\mathrm{MW\,m^{-2}}$, off-normal events are a serious concern for the lifetime of plasma facing components. These phenomena are expected to occur on a time scale of a few milliseconds (plasma disruptions) or several hundred milliseconds (vertical displacement events) and have been identified as a major source for the production of neutron-activated metallic or tritium-enriched carbon dust, which is of serious importance from a safety point of view. The irradiation-induced material degradation is another critical concern for future D-T-burning fusion devices. In ITER the integrated neutron fluence to the first wall and the divertor armor will remain on the order of 1 dpa and 0.7 dpa, respectively. These values are low compared to future commercial fusion reactors; nevertheless, a non-negligible degradation of the materials has been detected, both for mechanical and thermal properties, in particular for the thermal conductivity of carbon-based materials. Besides the degradation of individual material properties, the high heat flux performance of actively cooled plasma facing components has been investigated under ITER-specific thermal and neutron loads.
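
    For scale, a one-dimensional steady-state conduction estimate (illustrative numbers, not from the text: a $d = 5\,\mathrm{mm}$ CFC armor layer with thermal conductivity $k \approx 200\,\mathrm{W\,m^{-1}\,K^{-1}}$, neglecting interface resistances) shows why $20\,\mathrm{MW\,m^{-2}}$ is demanding:

        \Delta T \approx \frac{q\,d}{k}
                 = \frac{20\times10^{6}\,\mathrm{W\,m^{-2}} \times 5\times10^{-3}\,\mathrm{m}}
                        {200\,\mathrm{W\,m^{-1}\,K^{-1}}}
                 = 500\,\mathrm{K}

    across the armor alone, before any allowance for the cooling-channel interface.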

    Long-range interactions and non-extensivity in ferromagnetic spin models

    The Ising model with ferromagnetic interactions that decay as $1/r^{\alpha}$ is analyzed in the non-extensive regime $0 \leq \alpha \leq d$, where the thermodynamic limit is not defined. In order to study the asymptotic properties of the model in the $N \rightarrow \infty$ limit ($N$ being the number of spins) we propose a generalization of the Curie-Weiss model, for which the $N \rightarrow \infty$ limit is well defined for all $\alpha \geq 0$. We conjecture that mean field theory is exact in the latter model for all $0 \leq \alpha \leq d$. This conjecture is supported by Monte Carlo heat bath simulations in the $d=1$ case. Moreover, we confirm a recently conjectured scaling (Tsallis) which allows for a unification of the extensive ($\alpha > d$) and non-extensive ($0 \leq \alpha \leq d$) regimes.
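
    The scaling in question is commonly written via the rescaling variable below (a standard result in the literature on non-extensive lattice models, reproduced from general knowledge rather than from this paper):

        N^{*} \equiv \frac{N^{1-\alpha/d} - 1}{1-\alpha/d}
        \;\xrightarrow{\;N\to\infty\;}\;
        \begin{cases}
          1/(\alpha/d - 1), & \alpha > d \quad (\text{finite: extensive regime}) \\
          \ln N, & \alpha = d \\
          N^{1-\alpha/d}/(1-\alpha/d), & 0 \le \alpha < d \quad (\text{divergent: non-extensive regime})
        \end{cases}

    so that quantities such as the free energy per spin, divided by $N^{*}$, remain finite in both regimes.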

    Interatomic-Coulombic-decay-induced recapture of photoelectrons in helium dimers

    We investigate the onset of photoionization-shakeup-induced interatomic Coulombic decay (ICD) in He$_2$ at the He$^{+*}(n=2)$ threshold by detecting two He$^+$ ions in coincidence. We find this threshold to be shifted towards higher energies compared to the same threshold in the monomer. The shifted onset of ion pairs created by ICD is attributed to recapture of the threshold photoelectron after the emission of the faster ICD electron.
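
    Schematically (a simple energy-balance picture consistent with the abstract, not a quantitative model from the paper), near threshold the photoelectron carries vanishing kinetic energy,

        \varepsilon_{e} = h\nu - E\big(\mathrm{He}^{+*}(n=2)\big) \;\rightarrow\; 0^{+},

    so once the faster ICD electron has departed, the slow photoelectron can be recaptured into a bound Rydberg state of the remaining ion; ion pairs then appear only when $h\nu$ exceeds the monomer threshold by a finite margin.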

    Vibrationally Resolved Decay Width of Interatomic Coulombic Decay in HeNe

    We investigate the ionization of HeNe from below the He 1s3p excitation up to the He ionization threshold. We observe HeNe$^+$ ions with a yield enhanced by more than a factor of 60 when the He side couples resonantly to the radiation field. These ions are experimental proof of a two-center resonant photoionization mechanism predicted by Najjari et al. [Phys. Rev. Lett. 105, 153002 (2010)]. Furthermore, our data provide electronic- and vibrational-state-resolved decay widths of interatomic Coulombic decay (ICD) in HeNe dimers. We find that the ICD lifetime strongly increases with increasing vibrational state.
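
    The connection between the measured decay widths and the quoted lifetimes is the standard time-energy relation (general physics, not specific to this paper):

        \tau = \frac{\hbar}{\Gamma}, \qquad \hbar \approx 6.58\times10^{-16}\,\mathrm{eV\,s},

    so a width of $\Gamma = 1\,\mathrm{meV}$ corresponds to $\tau \approx 0.66\,\mathrm{ps}$; a lifetime that grows with vibrational state thus shows up as a narrowing width.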

    A measurement of the evolution of Interatomic Coulombic Decay in the time domain

    During the last 15 years a novel decay mechanism of excited atoms has been discovered and investigated. This so-called "Interatomic Coulombic Decay" (ICD) involves the chemical environment of the electronically excited atom: the excitation energy is transferred (in many cases over long distances) to a neighbor of the initially excited particle, usually ionizing that neighbor. It turned out that ICD is a very common decay route in nature, as it occurs across van der Waals and hydrogen bonds. The time evolution of ICD is predicted to be highly complex, as its efficiency strongly depends on the distance between the atoms involved, and this distance typically changes during the decay. Here we present the first direct measurement of the temporal evolution of ICD using a novel experimental approach.
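
    The distance dependence at issue is usually characterized, in the large-separation (virtual-photon) limit familiar from the ICD literature generally, by the asymptotic dipole-dipole rate

        \Gamma_{\mathrm{ICD}}(R) \;\propto\; \frac{1}{R^{6}},

    so the decay rate changes rapidly as the nuclei move during the decay, which is what makes the time evolution nontrivial; at short range, orbital overlap speeds the decay beyond this asymptotic law.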

    Criticality in strongly correlated fluids

    In this brief review I discuss criticality in strongly correlated fluids. Unlike simple fluids, whose molecules interact through short-ranged isotropic potentials, particles of strongly correlated fluids usually interact through long-ranged forces of Coulomb or dipolar form. While for simple fluids the mechanism of phase separation into liquid and gas was elucidated by van der Waals more than a century ago, the universality class of strongly correlated fluids, or in some cases even the existence of liquid-gas phase separation, remains uncertain.
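
    The distinction can be made concrete with textbook interaction forms (not taken from the review itself): the Lennard-Jones attraction typifying simple fluids falls off much faster than the Coulomb and dipolar interactions governing strongly correlated fluids,

        u_{\mathrm{LJ}}(r) \sim -\frac{1}{r^{6}}, \qquad
        u_{\mathrm{Coul}}(r) = \pm\frac{q^{2}}{\epsilon r}, \qquad
        u_{\mathrm{dip}}(r) \sim \frac{1}{r^{3}},

    and it is the slow decay of the latter two that complicates the van der Waals picture of liquid-gas separation.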

    Two-particle interference of electron pairs on a molecular level

    We investigate the photo-double-ionization of H$_2$ molecules with 400 eV photons. We find that the emitted electrons, considered separately, do not show any sign of two-center interference fringes in their angular emission distributions. In contrast, the quasi-particle consisting of both electrons (i.e., the "dielectron") does. The work highlights the fact that non-local effects are embedded everywhere in nature where many-particle processes are involved.
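
    A schematic way to see this is a double-slit (Cohen-Fano-type) modulation written with the electron-pair sum momentum; this is an illustration consistent with the abstract, not the paper's exact formula. For two emission centers separated by the internuclear vector $\mathbf{R}$,

        I(\mathbf{k}) \;\propto\; 1 + \cos(\mathbf{k}\cdot\mathbf{R}), \qquad \mathbf{k} = \mathbf{k}_{1} + \mathbf{k}_{2},

    fringes appear in the distribution of the summed momentum $\mathbf{k}$ of the dielectron, while each single-electron momentum $\mathbf{k}_{1}$ or $\mathbf{k}_{2}$ alone shows none.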