Protection of stainless-steels against corrosion in sulphidizing environments by Ce oxide coatings: X-ray absorption and thermogravimetric studies
In this paper a study is reported concerning ceramic coatings containing cerium oxide, prepared by the sol-gel method and used to protect Incoloy 800H against sulphidation. When the coating is sintered in air at 850 °C, good protection is obtained. An X-ray absorption spectroscopic study of the coatings showed that the best protective coating contains all cerium as Ce(IV) after pretreatment; after sulphidizing, the cerium was reduced to Ce(III). Possible mechanisms to explain the protective properties are discussed.
Determination of the relative concentrations of rare earth ions by x-ray absorption spectroscopy: Application to terbium mixed oxides
A method, based on X-ray absorption spectroscopy (XAS) in the range 0.8–1.5 keV, to determine the relative amounts of rare earth ions in different valencies is explained and tested for the case of terbium mixed oxides. The results are in agreement with those obtained by existing analytical methods. The XAS method is advantageous in that it can be applied where other, conventional methods break down.
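A common way to recover such relative concentrations is a linear-combination fit of reference spectra for each valence state. The sketch below illustrates the idea with synthetic Gaussian "spectra" standing in for real Tb(III)/Tb(IV) references; all numbers are invented for illustration and this is not the paper's actual procedure:

```python
import numpy as np

# Synthetic reference "spectra" for two valence states on a common energy
# grid (Gaussians standing in for real Tb(III)/Tb(IV) references; all
# numbers here are invented for illustration).
energy = np.linspace(0.8, 1.5, 200)                  # keV
ref_tb3 = np.exp(-0.5 * ((energy - 1.24) / 0.02) ** 2)
ref_tb4 = np.exp(-0.5 * ((energy - 1.27) / 0.02) ** 2)

# A "measured" mixed-oxide spectrum: 60% Tb(III), 40% Tb(IV), plus noise.
rng = np.random.default_rng(0)
measured = 0.6 * ref_tb3 + 0.4 * ref_tb4 + rng.normal(0.0, 0.01, energy.size)

# Least-squares fit of the two references to the measurement; the
# normalized coefficients estimate the relative concentrations.
A = np.column_stack([ref_tb3, ref_tb4])
coeffs, *_ = np.linalg.lstsq(A, measured, rcond=None)
fractions = coeffs / coeffs.sum()
print(f"Tb(III): {fractions[0]:.2f}  Tb(IV): {fractions[1]:.2f}")
```

Real XAS analyses must also handle background subtraction and normalization, which this sketch omits.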
Modeling Natural Hazards Engineering Data to Cyberinfrastructure
DesignSafe-CI is an end-to-end data lifecycle management, analysis, and publication cloud platform for natural hazards engineering. To facilitate ongoing data curation and sharing in a cloud environment that is intuitive to the end users, developers and curators teamed with experts in the different hazards to design data models and vocabularies that map their research workflows and domain terminology. The six experimental data models emphasize provenance through relationships between research processes, data, and their documentation, and highlight commonalities between experiment types. They mediate between the user interface and the repository layers of the cyberinfrastructure to automate tasks such as organizing data and facilitating its description. Using data from triaxial experiments, we conducted a user evaluation of the geotechnical data model, both for its fitness to real data and for purposes of data understandability during reuse. The results of the evaluation guided testing and selection of the Fedora 4 repository backend to enhance data discovery and reuse.
National Science Foundation; Texas Advanced Computing Center (TACC)
Fitting in a complex chi^2 landscape using an optimized hypersurface sampling
Fitting a data set with a parametrized model can be seen geometrically as finding the global minimum of the chi^2 hypersurface, which depends on a set of parameters {P_i}. This is usually done using the Levenberg-Marquardt algorithm. The main drawback of this algorithm is that, despite its fast convergence, it can get stuck if the parameters are not initialized close to the final solution. We propose a modification of the Metropolis algorithm introducing a parameter step tuning that optimizes the sampling of parameter space. The ability of the parameter tuning algorithm, together with simulated annealing, to find the global chi^2 hypersurface minimum, jumping across chi^2({P_i}) barriers when necessary, is demonstrated with synthetic functions and with real data.
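A minimal sketch of the idea: a Metropolis chi^2 minimizer with simulated annealing and acceptance-rate-based step tuning. This is an illustrative simplification, not the published algorithm; the test function, schedule, and tuning constants are all assumptions:

```python
import numpy as np

def chi2(params, x, y):
    """Chi^2 of a Gaussian model against data (illustrative example)."""
    amp, mu, sigma = params
    model = amp * np.exp(-0.5 * ((x - mu) / sigma) ** 2)
    return np.sum((y - model) ** 2)

def anneal(x, y, p0, steps=20000, t0=10.0, target_accept=0.3, seed=1):
    """Metropolis minimization of chi^2 with simulated annealing and
    step-size tuning: steps are enlarged or shrunk to keep the acceptance
    rate near `target_accept` (a simplified sketch, not the published
    algorithm)."""
    rng = np.random.default_rng(seed)
    p = np.asarray(p0, dtype=float)
    step = np.abs(p) * 0.1 + 0.1               # initial per-parameter steps
    c = chi2(p, x, y)
    best_p, best_c = p.copy(), c
    accepted = 0
    for i in range(steps):
        temp = t0 * (1.0 - i / steps) + 1e-3   # linear cooling schedule
        trial = p + rng.normal(0.0, step)      # random jump in parameter space
        c_trial = chi2(trial, x, y)
        # Metropolis acceptance rule on chi^2 differences.
        if c_trial < c or rng.random() < np.exp((c - c_trial) / temp):
            p, c = trial, c_trial
            accepted += 1
            if c < best_c:
                best_p, best_c = p.copy(), c
        if (i + 1) % 200 == 0:                 # retune steps every 200 trials
            step *= 1.1 if accepted / 200 > target_accept else 0.9
            accepted = 0
    return best_p, best_c

# Synthetic data: a Gaussian peak plus noise, with a deliberately poor start.
x = np.linspace(-5.0, 5.0, 100)
y = 2.0 * np.exp(-0.5 * ((x - 1.0) / 0.8) ** 2)
y += np.random.default_rng(2).normal(0.0, 0.05, x.size)
best_p, best_c = anneal(x, y, p0=(1.0, -2.0, 2.0))
print(best_p, best_c)
```

The high-temperature phase lets the walker cross chi^2 barriers; the step tuning keeps the jumps matched to the local landscape as the temperature falls.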
Application of mathematical and machine learning techniques to analyse eye tracking data enabling better understanding of children's visual cognitive behaviours
In this research, we aimed to investigate the visual-cognitive behaviours of a sample of 106 children in Year 3 (8.8 ± 0.3 years) while completing a mathematics bar-graph task. Eye movements were recorded while children completed the task and the patterns of eye movements were explored using machine learning approaches. Two different machine-learning techniques (Bayesian and K-Means) were used to obtain separate model sequences or average scanpaths for those children who responded either correctly or incorrectly to the graph task. Application of these machine-learning approaches indicated distinct differences in the resulting scanpaths for children who completed the graph task correctly or incorrectly: children who responded correctly accessed information that was mostly categorised as critical, whereas children responding incorrectly did not. There was also evidence that the children who were correct accessed the graph information in a different, more logical order, compared to the children who were incorrect. The visual behaviours aligned with different aspects of graph comprehension, such as initial understanding and orienting to the graph, and later interpretation and use of relevant information on the graph. The findings are discussed in terms of the implications for early mathematics teaching and learning, particularly in the development of graph comprehension, as well as the application of machine learning techniques to investigations of other visual-cognitive behaviours. Peer reviewed
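As an illustration of the K-Means side of such an analysis, the sketch below clusters hypothetical per-child dwell-time proportions over areas of interest (AOIs); the features and numbers are invented for illustration and are not the study's data:

```python
import numpy as np

rng = np.random.default_rng(42)
# Fraction of dwell time on four areas of interest (AOIs) per child;
# the first group looks mostly at "critical" AOIs, the second does not.
# These numbers are invented for illustration.
group_a = rng.dirichlet([8, 4, 2, 1], size=50)
group_b = rng.dirichlet([1, 2, 4, 8], size=50)
X = np.vstack([group_a, group_b])

def kmeans(X, k=2, iters=50, seed=0):
    """Plain Lloyd's-algorithm K-Means (numpy only)."""
    rng = np.random.default_rng(seed)
    centers = X[rng.choice(len(X), size=k, replace=False)]
    for _ in range(iters):
        # Assign each point to its nearest center.
        labels = ((X[:, None, :] - centers) ** 2).sum(-1).argmin(1)
        # Move each center to the mean of its assigned points
        # (keep a center in place if it has no points).
        centers = np.array([X[labels == j].mean(0) if np.any(labels == j)
                            else centers[j] for j in range(k)])
    return labels, centers

labels, centers = kmeans(X)
```

With well-separated gaze profiles the two clusters recover the correct/incorrect grouping; in practice scanpath analyses also encode fixation order, which this dwell-time-only sketch ignores.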
Optimal Uncertainty Quantification
We propose a rigorous framework for Uncertainty Quantification (UQ) in which the UQ objectives and the assumptions/information set are brought to the forefront. This framework, which we call \emph{Optimal Uncertainty Quantification} (OUQ), is based on the observation that, given a set of assumptions and information about the problem, there exist optimal bounds on uncertainties: these are obtained as values of well-defined optimization problems corresponding to extremizing probabilities of failure, or of deviations, subject to the constraints imposed by the scenarios compatible with the assumptions and information. In particular, this framework does not implicitly impose inappropriate assumptions, nor does it repudiate relevant information. Although OUQ optimization problems are extremely large, we show that under general conditions they have finite-dimensional reductions. As an application, we develop \emph{Optimal Concentration Inequalities} (OCI) of Hoeffding and McDiarmid type. Surprisingly, these results show that uncertainties in input parameters, which propagate to output uncertainties in the classical sensitivity analysis paradigm, may fail to do so if the transfer functions (or probability distributions) are imperfectly known. We show how, for hierarchical structures, this phenomenon may lead to the non-propagation of uncertainties or information across scales. In addition, a general algorithmic framework is developed for OUQ and is tested on the Caltech surrogate model for hypervelocity impact and on the seismic safety assessment of truss structures, suggesting the feasibility of the framework for important complex systems. The introduction of this paper provides both an overview of the paper and a self-contained mini-tutorial about basic concepts and issues of UQ.
Comment: 90 pages. Accepted for publication in SIAM Review (Expository Research Papers). See SIAM Review for higher quality figures.
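For context, the McDiarmid-type inequalities mentioned above build on the classical McDiarmid bound, which controls deviation probabilities using only the componentwise oscillations (diameters) of the response function. A minimal sketch of that classical bound follows; the paper's OCI sharpen it using additional information, which this sketch does not implement:

```python
import math

def mcdiarmid_bound(diameters, t):
    """Classical McDiarmid inequality: if changing the i-th independent
    input alone changes f by at most diameters[i], then
    P[f(X) - E f(X) >= t] <= exp(-2 t^2 / sum_i diameters[i]^2)."""
    d2 = sum(d * d for d in diameters)
    return math.exp(-2.0 * t * t / d2)

# Three inputs with componentwise oscillations 1.0, 0.5, 0.5 and threshold 1.5:
# exponent = -2 * 1.5^2 / 1.5 = -3, so the bound is exp(-3), about 0.0498.
print(mcdiarmid_bound([1.0, 0.5, 0.5], t=1.5))
```

The bound needs only the diameters, not the full distribution of the inputs, which is precisely the kind of limited-information setting OUQ formalizes.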