Managing the Uncertainty Associated with Hydrogen Gas Hazards and Operability Issues in Nuclear Chemical Plants
The complex and diverse nature of reprocessing and decommissioning operations in existing nuclear chemical plants within the UK presents a variety of challenges. These challenges relate to the quantified risk from hydrogen explosions and how best to manage the associated uncertainties.
Several knowledge gaps in the Quantified Risk Assessment (QRA) of hydrogen hazards have been identified in this research work. These include radiolytic hydrogen explosions in sealed process pipes, the failure of ventilation systems used to dilute radiolytic hydrogen in process vessels, the decision uncertainty in installing additional hydrogen purge systems and the uncertainty associated with hold-up of hydrogen in radioactive sludges. The effect of a subsequent sudden release of the held-up hydrogen gas into a vessel ullage space presents a further knowledge gap. Nuclear decommissioning and reprocessing operations also present operational risk knowledge gaps, including the mixing behaviour of radioactive sludges, the performance of robotics for nuclear waste characterisation and the control of nuclear fission products associated with solid wastes.
Bayesian Belief Networks (BBNs) and Monte Carlo Simulation (MC) techniques have been deployed in this research work to address the identified knowledge gaps. These techniques provide a powerful means of uncertainty analysis of complex systems involving multiple interdependent variables such as those affecting nuclear decommissioning and reprocessing.
Through the application of BBN and MC Simulation methodologies to a series of nuclear chemical plant case studies, new knowledge in decommissioning and reprocessing operations has been generated. This new knowledge relates to establishing a realistic quantified risk from hydrogen explosions and to nuclear plant operability issues. New knowledge has also been gained on the key sensitivities affecting the quantified risk of hydrogen explosions and operability in nuclear environments, as well as the optimum improvements necessary to mitigate such risks.
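As a rough illustration of how Monte Carlo simulation propagates uncertainty through an event-frequency estimate of this kind, consider the following sketch. Every distribution and numerical value below is invented for illustration and is not taken from the study.

```python
import random

# Hypothetical Monte Carlo sketch: propagate uncertainty through a
# hydrogen explosion frequency estimate. All distributions and parameter
# values are illustrative only, not taken from any plant assessment.

random.seed(42)
N = 100_000

def sample_event_frequency():
    # Hydrogen-generating demand rate (events/year), lognormal spread
    generation = random.lognormvariate(mu=-2.0, sigma=0.5)
    # Probability the dilution/ventilation system fails on demand
    vent_failure = random.betavariate(2, 200)
    # Probability of ignition given a flammable atmosphere forms
    ignition = random.betavariate(1, 50)
    return generation * vent_failure * ignition

samples = sorted(sample_event_frequency() for _ in range(N))
median = samples[N // 2]
p95 = samples[int(0.95 * N)]
print(f"median: {median:.2e}/yr, 95th percentile: {p95:.2e}/yr")
```

Reporting a percentile band rather than a single point value is the main benefit of this approach: the spread between the median and the 95th percentile makes the residual uncertainty in the frequency estimate explicit.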
Addressing the Attainment Gap through Early Intervention: Assessment of the First Year of a Multi-Year Project
This paper follows on from a previous paper in which we identified issues with the way the attainment gap is calculated. We identified that addressing this gap solely using academic research is unlikely to be effective and that schools also need to conduct their own research to develop approaches that best meet the needs of their students. Furthermore, based on research from the Education Endowment Foundation and the Department for Education, we identified that the most effective approaches are ones that begin early, continue over the long term and meet the specific needs of students. Finally, based on research conducted within our school, we identified that our students' challenges centred primarily on issues relating to beliefs, self-study, health and support. As a result, we developed a multi-year mentoring project aimed at addressing these challenges. This paper discusses the initial findings after the first year of this multi-year project.
Is the Attainment Gap Fundamentally Flawed? Challenges and Opportunities
Inequality of outcome has become one of the most pressing issues in education. Nowhere is this more apparent than in the performance of disadvantaged students. However, despite increased support, no school in England or Wales has managed to consistently close the attainment gap between disadvantaged students and their peers. This raises many questions, none more important than: is there an error in how we measure the performance of disadvantaged students? Furthermore, what are the implications of such a potential error? This paper argues that the attainment gap as it is currently calculated is ineffective in identifying the locus of underperformance, the specific needs of disadvantaged students and the support needed to improve outcomes. Finally, this paper attempts to address this by discussing a pilot project focused on identifying and addressing disadvantaged students’ needs, and the challenges and opportunities this raises.
Quantified Risk and Uncertainty Analysis
The legal requirement in the UK for the duty holder of
a chemical process plant to demonstrate that risk is
as low as reasonably practicable (ALARP) means that
quantified risk assessments (QRAs) must be accurate and
robust and that identified risks are adequately mitigated. The Bayesian belief network (BBN) is an emerging technique which can be used to determine the likelihood of an event in support of the QRA process. It is a statistical method that estimates the probability distribution for a given hypothesis. The most interesting features which distinguish this QRA technique from the others are:
• it can analyse complex systems with any given number of
variables and their dependencies within a single analysis;
• it can analyse parameters over a range of probability
values for any given set of conditions, providing a better
understanding in terms of sensitivity analysis;
• it engages expert judgement and learning from previous
events to update the probability distribution, thus
improving QRA accuracy; and
• it is not just restricted to fault analysis and can be used
to support plant operational decision making using a
quantified approach.
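The listed features can be illustrated with a minimal sketch of exact inference in a tiny network. The two-parent structure and every probability below are invented for illustration and do not come from any plant assessment.

```python
# Minimal Bayesian-network sketch (illustrative numbers only): an
# "explosion" node E conditioned on two parents, "ventilation failed" V
# and "ignition source present" I. Inference is done by exact enumeration
# over the parent states.

p_vent_fail = 0.01          # P(V=1), assumed prior
p_ignition = 0.05           # P(I=1), assumed prior

# Conditional probability table P(E=1 | V, I), assumed values
cpt = {(0, 0): 1e-6, (0, 1): 1e-4, (1, 0): 1e-3, (1, 1): 0.2}

def prob_explosion():
    # Marginalize over all parent combinations: P(E=1) = sum P(V)P(I)P(E|V,I)
    total = 0.0
    for v in (0, 1):
        for i in (0, 1):
            p_vi = (p_vent_fail if v else 1 - p_vent_fail) * \
                   (p_ignition if i else 1 - p_ignition)
            total += p_vi * cpt[(v, i)]
    return total

def posterior_vent_given_explosion():
    # Bayes' rule: evidence of an event updates belief about its causes,
    # which is the basis of the sensitivity analysis described above.
    joint = 0.0
    for i in (0, 1):
        p_i = p_ignition if i else 1 - p_ignition
        joint += p_vent_fail * p_i * cpt[(1, i)]
    return joint / prob_explosion()

print(f"P(explosion) = {prob_explosion():.3e}")
print(f"P(vent failed | explosion) = {posterior_vent_given_explosion():.3f}")
```

The posterior is far larger than the 1% prior on ventilation failure, showing how the updating feature surfaces the variables an observed outcome is most sensitive to. Real BBN tools replace this hand-coded enumeration with general-purpose inference over arbitrary network structures.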
Application of Bayesian Belief Networks to assess hydrogen gas retention hazards and equipment reliability in nuclear chemical plants
Many nuclear waste reprocessing and storage plant processes result in the generation of hydrogen gas. Radiolysis of radioactive liquors and corrosion of metallic magnesium waste are the main mechanisms for generating hydrogen in such facilities. Corrosion products such as magnesium hydroxide sludge are also formed, which require storage in transportable vessels. Demonstration of sufficient reliability of systems such as purge air and ventilation extract is therefore required to protect against releases of hydrogen. Factors affecting hydrogen ignition and removal in nuclear environments, as well as the identification of appropriate hazard management strategies, have been the key areas of research for decommissioning and reprocessing plants. However, a knowledge gap has been identified in terms of assessing the likelihood of hydrogen retention within the sludge and waste matrix resulting in a sudden release of the gas into a vessel ullage. Hydrogen gas retention and the potential for a sudden release are affected by numerous factors, such as faults leading to adverse waste disturbance. As such, an appropriate technique must be applied to analyse the uncertainty from this gas behaviour. The Bayesian Belief Network (BBN) is an emerging statistical technique which allows uncertainty and dependencies between multiple variables to be taken into account in a quantified risk assessment. A BBN analysis has been undertaken to determine the key factors that would lead to disturbance of the sludge waste and the subsequent sudden release of hydrogen into the ullage space of a process vessel. The results show that the key sensitivity is adverse disturbance of the vessel sludge waste caused by faults leading to uncontrolled movements and clashes of the vessel. The benefits of applying the BBN technique to assess the reliability of the purge and ventilation extract systems against radiolytic hydrogen release have also been explored.
The BBN model has been shown to be particularly advantageous, as it has allowed input of probability distributions for the key variables, instead of single point values, thus providing an enhanced understanding of uncertainty. Furthermore, the BBN technique has allowed updating of the probability of a known variable given a particular condition of the other variables. This updating function has enabled the key sensitivities to be determined.
A parallel algorithm for Hamiltonian matrix construction in electron-molecule collision calculations: MPI-SCATCI
Construction and diagonalization of the Hamiltonian matrix is the
rate-limiting step in most low-energy electron–molecule collision
calculations. Tennyson (J Phys B, 29 (1996) 1817) implemented a novel algorithm
for Hamiltonian construction which took advantage of the structure of the
wavefunction in such calculations. This algorithm is re-engineered to make use
of modern computer architectures and the use of appropriate diagonalizers is
considered. Test calculations demonstrate that significant speed-ups can be
gained using multiple CPUs. This opens the way to calculations which consider
higher collision energies, larger molecules and/or more target states. The
methodology, which is implemented as part of the UK molecular R-matrix codes
(UKRMol and UKRMol+), can also be used for studies of bound molecular Rydberg
states, photoionisation and positron-molecule collisions.
Comment: Write-up of the computer program MPI-SCATCI; Computer Physics Communications, in press.
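The core idea of parallel Hamiltonian construction, distributing matrix rows over workers and exploiting the matrix's symmetry, can be sketched as follows. This is a simplified thread-pool analogue written for illustration, not the actual MPI-SCATCI algorithm, and the toy `element` function merely stands in for real integral evaluation.

```python
from concurrent.futures import ThreadPoolExecutor

# Simplified sketch of parallel construction of a symmetric Hamiltonian:
# rows are distributed cyclically over workers, each worker computes only
# the upper triangle of its rows, and the blocks are assembled afterwards.
# `element` is a placeholder; real codes evaluate molecular integrals here.

N = 200  # basis size, illustrative

def element(i, j):
    # Toy symmetric matrix element, element(i, j) == element(j, i)
    return 1.0 / (1.0 + abs(i - j))

def row_block(rows):
    # Compute H[i, j] for each assigned row i and j >= i (upper triangle)
    return [(i, j, element(i, j)) for i in rows for j in range(i, N)]

chunks = [range(r, N, 4) for r in range(4)]  # cyclic row distribution
with ThreadPoolExecutor(max_workers=4) as pool:
    blocks = list(pool.map(row_block, chunks))

H = [[0.0] * N for _ in range(N)]
for block in blocks:
    for i, j, v in block:
        H[i][j] = H[j][i] = v  # fill both triangles by symmetry

print("H[0][0] =", H[0][0], "H[3][7] =", H[3][7])
```

A cyclic (round-robin) row distribution keeps the per-worker load roughly even despite the triangular work pattern; an MPI implementation would follow the same partitioning but exchange the computed blocks between processes instead of sharing memory.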
Vortex-scalar element calculations of a diffusion flame stabilized on a plane mixing layer
The vortex-scalar element method, a scheme which utilizes vortex elements to discretize the region of high vorticity and scalar elements to represent species or temperature fields, is utilized in the numerical simulations of a two-dimensional reacting mixing layer. Computations are performed for a diffusion flame at high Reynolds and Peclet numbers without resorting to turbulence models. In the nonreacting flow, the mean and fluctuation profiles of a conserved scalar show good agreement with experimental measurements. Results for the reacting flow indicate that for temperature-independent kinetics, the chemical reaction begins immediately downstream of the splitter plate where mixing starts. Results for the reacting flow with Arrhenius kinetics show an ignition delay, which depends on reactant temperature, before significant chemical reaction occurs. Harmonic forcing changes the structure of the layer, and concomitantly the rates of mixing and reaction, in accordance with experimental results. Strong stretch within the braids in the nonequilibrium kinetics case causes local flame quenching due to the temperature drop associated with the large convective fluxes.
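The velocity evaluation underlying vortex element methods can be sketched with a textbook 2D point-vortex discretization; this generic sketch is not the paper's vortex-scalar scheme, and the vortex positions and strengths below are invented for illustration.

```python
import math

# Minimal 2D point-vortex sketch: each vortex of circulation gamma at
# (xk, yk) induces a tangential velocity of magnitude gamma/(2*pi*r).
# A small core radius delta regularizes the singularity at the vortex
# centre (a common desingularization, not the paper's specific choice).

def induced_velocity(x, y, vortices, delta=1e-2):
    u = v = 0.0
    for (xk, yk, gamma) in vortices:
        dx, dy = x - xk, y - yk
        r2 = dx * dx + dy * dy + delta * delta  # regularized squared distance
        u += -gamma * dy / (2 * math.pi * r2)
        v += gamma * dx / (2 * math.pi * r2)
    return u, v

# Two counter-rotating vortices, loosely resembling a braid in a mixing layer
vortices = [(-0.5, 0.0, 1.0), (0.5, 0.0, -1.0)]
u, v = induced_velocity(0.0, 0.0, vortices)
print(f"velocity at origin: ({u:.3f}, {v:.3f})")
```

In a full vortex method this velocity field advects both the vortex elements themselves and, in the vortex-scalar variant, the scalar elements carrying species or temperature.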
On Multiobjective Evolution Model
Self-Organized Criticality (SOC) phenomena could have a significant effect on
the dynamics of ecosystems. The Bak-Sneppen (BS) model is a simple and robust
model of biological evolution that exhibits punctuated equilibrium behavior.
Here we introduce a random version of the BS model and generalize the single-objective BS model to a multiobjective one.
Comment: 6 pages, 5 figures
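For reference, the standard single-objective Bak-Sneppen dynamics (not the random or multiobjective variants this abstract introduces) can be sketched in a few lines; the system size and step count below are arbitrary.

```python
import random

# Standard Bak-Sneppen model sketch: N species on a ring each carry a
# fitness in [0, 1); at every step the least-fit species and its two
# neighbours are replaced with fresh random fitnesses. After many steps
# most fitnesses accumulate above a critical threshold (~0.667 in 1D),
# with avalanches of activity below it: punctuated equilibrium.

random.seed(0)
N, steps = 100, 10_000
fitness = [random.random() for _ in range(N)]

for _ in range(steps):
    i = min(range(N), key=fitness.__getitem__)   # least-fit site
    for j in (i - 1, i, (i + 1) % N):            # site and ring neighbours
        fitness[j] = random.random()             # (i-1 wraps via negative index)

print("min fitness:", min(fitness))
print("fraction above 0.5:", sum(f > 0.5 for f in fitness) / N)
```

A multiobjective generalization would replace the scalar fitness with a vector and the `min` selection with a dominance criterion, which is the kind of extension the abstract describes.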
Sidelobe Control in Collaborative Beamforming via Node Selection
Collaborative beamforming (CB) is a power efficient method for data
communications in wireless sensor networks (WSNs) which aims at increasing the
transmission range in the network by radiating the power from a cluster of
sensor nodes in the directions of the intended base station(s) or access
point(s) (BSs/APs). The CB average beampattern exhibits deterministic
behavior and can be used for characterizing/controlling the transmission in the
intended direction(s), since the mainlobe of the CB beampattern is independent
of the particular random node locations. However, the CB for a cluster formed
by a limited number of collaborative nodes results in a sample beampattern with
sidelobes that severely depend on the particular node locations. High level
sidelobes can cause unacceptable interference when they occur at directions of
unintended BSs/APs. Therefore, sidelobe control in CB has a potential to
increase the network capacity and wireless channel availability by decreasing
the interference. Traditional sidelobe control techniques are proposed for
centralized antenna arrays and, therefore, are not suitable for WSNs. In this
paper, we show that distributed, scalable, and low-complexity sidelobe control
techniques suitable for CB in WSNs can be developed based on a node selection
technique which makes use of the randomness of the node locations. A node
selection algorithm with low-rate feedback is developed to search over
different node combinations. The performance of the proposed algorithm is
analyzed in terms of the average number of trials required to select the
collaborative nodes and the resulting interference. Our simulation results
confirm the theoretical analysis and show that the interference is
significantly reduced when node selection is used with CB.
Comment: 30 pages, 10 figures, submitted to the IEEE Trans. Signal Processing
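The node selection idea can be illustrated with a generic sketch: draw random subsets of candidate nodes and keep the one whose beampattern is lowest toward an unintended direction. This is a simplified version of the concept, not the paper's low-rate feedback algorithm, and all geometry and parameter values are invented for illustration.

```python
import cmath, math, random

# Illustrative node-selection sketch for sidelobe control: from M candidate
# nodes at random positions (in wavelengths), repeatedly draw K-node subsets
# and keep the one with the lowest array-factor level toward an unintended
# direction. Phases are aligned to the target direction theta0 = 0, so the
# mainlobe level stays at its maximum K for every subset.

random.seed(1)
M, K, trials = 30, 10, 200
R = 2.0  # cluster radius in wavelengths, illustrative
nodes = [(random.uniform(-R, R), random.uniform(-R, R)) for _ in range(M)]

def pattern(subset, theta):
    # Far-field array factor magnitude toward azimuth theta, with each
    # node's phase pre-compensated for the target direction theta0 = 0.
    k = 2 * math.pi  # wavenumber for unit wavelength
    total = 0
    for (x, y) in subset:
        phase = k * (x * (math.cos(theta) - 1) + y * math.sin(theta))
        total += cmath.exp(1j * phase)
    return abs(total)

theta_unintended = math.radians(40)
best, best_level = None, float("inf")
for _ in range(trials):
    subset = random.sample(nodes, K)
    level = pattern(subset, theta_unintended)
    if level < best_level:
        best, best_level = subset, level

print(f"mainlobe level: {pattern(best, 0.0):.2f} (max {K})")
print(f"level toward 40 deg: {best_level:.3f}")
```

Because every subset is phase-aligned to the target, selection trades nothing at the mainlobe; it only exploits the randomness of node positions to find a sample beampattern with a deep null toward the unintended receiver, which is the effect the abstract describes.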