The strength and deformation characteristics of a soft alluvial clay under full scale loading conditions
A novel route to Pt-Bi2O3 composite thin films and their application in photo-reduction of water
A novel homoleptic bismuth(III) β-diketonate (dibenzoylmethane, dbm) complex [Bi(dbm)3]2 has been used as a precursor to thin films of crystalline β-Bi2O3, and hexachloroplatinic acid (H2PtCl6·6H2O) has been demonstrated as a suitable precursor for deposition of platinum nanoparticles, both deposited via aerosol-assisted chemical vapour deposition (AACVD). Thin films of Pt–Bi2O3 were co-deposited from a mixture of [Bi(dbm)3]2 and H2PtCl6·6H2O; the introduction of Pt particles into β-Bi2O3 causes hydrogen to be evolved during photolysis of water over the composite material, a property not found for Pt particles or β-Bi2O3 alone.
Electronic band structure, Fermi surface, and elastic properties of new 4.2K superconductor SrPtAs from first-principles calculations
The hexagonal phase SrPtAs (s.g. P63/mmc; #194) with a honeycomb lattice structure was very recently reported as a new low-temperature (TC ~ 4.2 K) superconductor. Here, by means of first-principles calculations, the optimized structural parameters, electronic bands, Fermi surface, total and partial densities of states, inter-atomic bonding picture, independent elastic constants, and bulk and shear moduli for SrPtAs were obtained for the first time and analyzed in comparison with the related layered superconductor SrPt2As2.
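The abstract does not reproduce the computed elastic constants, but the step from the five independent hexagonal constants to polycrystalline bulk and shear moduli is standard. A minimal sketch using the Voigt averages for hexagonal symmetry (the numerical constants below are placeholders for illustration, not the SrPtAs values):

```python
def hexagonal_voigt_moduli(c11, c12, c13, c33, c44):
    """Voigt-average bulk and shear moduli from the five independent
    elastic constants (same units in, same units out) of a hexagonal crystal."""
    c66 = (c11 - c12) / 2.0  # fixed by hexagonal symmetry, not independent
    b_v = (2.0 * (c11 + c12) + 4.0 * c13 + c33) / 9.0
    m = c11 + c12 + 2.0 * c33 - 4.0 * c13
    g_v = (m + 12.0 * c44 + 12.0 * c66) / 30.0
    return b_v, g_v

# Placeholder constants in GPa, for illustration only.
b_v, g_v = hexagonal_voigt_moduli(c11=100.0, c12=50.0, c13=40.0, c33=120.0, c44=30.0)
```

The Voigt average is an upper bound; elasticity papers of this kind usually quote the Voigt-Reuss-Hill mean of the Voigt and Reuss estimates.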
An Efficient Algorithm for Optimizing Adaptive Quantum Metrology Processes
Quantum-enhanced metrology infers an unknown quantity with accuracy beyond the standard quantum limit (SQL). Feedback-based metrological techniques are promising for beating the SQL, but devising the feedback procedures is difficult and inefficient. Here we introduce an efficient self-learning swarm-intelligence algorithm for devising feedback-based quantum metrological procedures. Our algorithm can be trained with simulated or real-world trials and accommodates experimental imperfections, losses, and decoherence.
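The abstract does not spell out the swarm-intelligence method; a common choice in this setting is particle swarm optimization over the parameters of a feedback policy, with each evaluation being a (simulated) metrology trial. The following is a generic PSO sketch, not the authors' algorithm, with a toy quadratic standing in for the expensive trial-averaged cost:

```python
import numpy as np

def pso_minimize(loss, dim, n_particles=30, iters=200, seed=0):
    """Generic particle swarm optimization: each particle remembers its own
    best position, and the whole swarm shares a global best."""
    rng = np.random.default_rng(seed)
    pos = rng.uniform(-1.0, 1.0, (n_particles, dim))
    vel = np.zeros_like(pos)
    pbest = pos.copy()
    pbest_val = np.array([loss(p) for p in pos])
    g = pbest[pbest_val.argmin()].copy()
    for _ in range(iters):
        r1, r2 = rng.random((2, n_particles, dim))
        # Inertia plus attraction to personal and global bests.
        vel = 0.7 * vel + 1.5 * r1 * (pbest - pos) + 1.5 * r2 * (g - pos)
        pos = pos + vel
        vals = np.array([loss(p) for p in pos])
        improved = vals < pbest_val
        pbest[improved], pbest_val[improved] = pos[improved], vals[improved]
        g = pbest[pbest_val.argmin()].copy()
    return g, pbest_val.min()

# Toy stand-in for the cost of a 4-parameter feedback policy.
best, val = pso_minimize(lambda x: np.sum((x - 0.3) ** 2), dim=4)
```

In the metrology setting the loss would be an average over many simulated or experimental single-shot trials, which is why a derivative-free, trivially parallel optimizer is attractive.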
A Bayesian method for microseismic source inversion
Earthquake source inversion is highly dependent on location determination and velocity models. Uncertainties in both the model parameters and the observations need to be rigorously incorporated into an inversion approach. Here we present a probabilistic Bayesian method that allows formal inclusion of the uncertainties in the moment tensor inversion. This method allows the combination of different sets of far-field observations, such as P-wave and S-wave polarities and amplitude ratios, into one inversion. Additional observations can be included by deriving a suitable likelihood function from the uncertainties. This inversion produces samples from the source posterior probability distribution, including a best-fitting solution for the source mechanism and associated probability. The inversion can be constrained to the double-couple space or allowed to explore the gamut of moment tensor solutions, allowing volumetric and other non-double-couple components. The posterior probability of the double-couple and full moment tensor source models can be evaluated from the Bayesian evidence, using samples from the likelihood distributions for the two source models, producing an estimate of whether or not a source is double-couple. Such an approach is ideally suited to microseismic studies, where there are many sources of uncertainty and it is often difficult to produce reliable estimates of the source mechanism, although the same is true of many other settings. Using full-waveform synthetic seismograms, we also show the effects of noise, location, network distribution and velocity model uncertainty on the source probability density function. The noise has the largest effect on the results, especially as it can affect other parts of the event processing. This uncertainty can lead to erroneous non-double-couple source probability distributions, even when no other uncertainties exist.
Although including amplitude ratios can improve the constraint on the source probability distribution, these measurements are often systematically affected by noise, leading to deviations from their noise-free values and consequently degrading the source probability distribution, especially for the full moment tensor model. As an example of the application of this method, four events from the Krafla volcano in Iceland are inverted; these show clear differentiation between non-double-couple and double-couple sources, reflected in the posterior probability distributions for the source models.
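The double-couple versus full moment tensor comparison above rests on the Bayesian evidence, which penalizes the extra parameters of the full model (the Occam factor). A toy illustration of that mechanism, not the paper's inversion code: a "constrained" one-parameter model is compared against a "full" model with an extra volumetric-like term, with the evidence of each estimated by Monte Carlo averaging of the likelihood over prior samples. All model and parameter names here are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(1)

def likelihood(theta, data, sigma=0.1):
    """Gaussian likelihood for a toy forward model: theta[0] is the
    'mechanism' parameter, theta[1] a volumetric-like extra term."""
    pred = np.sin(theta[0]) + theta[1]
    return np.exp(-0.5 * np.sum((data - pred) ** 2) / sigma**2)

# Synthetic data from the constrained model (theta[1] = 0 is the truth).
data = np.sin(0.8) + rng.normal(0.0, 0.1, size=20)

def evidence(sample_prior, n=20000):
    """Monte Carlo evidence: the likelihood averaged over prior samples."""
    return np.mean([likelihood(sample_prior(), data) for _ in range(n)])

Z_dc = evidence(lambda: np.array([rng.uniform(0, np.pi), 0.0]))
Z_full = evidence(lambda: np.array([rng.uniform(0, np.pi),
                                    rng.uniform(-1.0, 1.0)]))
p_dc = Z_dc / (Z_dc + Z_full)  # posterior probability of the constrained model
```

Because both models fit these data equally well, the evidence ratio is driven by the prior volume wasted by the extra parameter, so the constrained model is favoured; this is the same logic the paper applies to decide whether a source is double-couple.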
Automatic Bayesian polarity determination
The polarity of the first motion of a seismic signal from an earthquake is an important constraint in earthquake source inversion. Microseismic events often have low signal-to-noise ratios, which may lead to difficulties in estimating the correct first-motion polarities of the arrivals. This paper describes a probabilistic approach to polarity picking that can be both automated and combined with manual picking. This approach includes a quantitative estimate of the uncertainty of the polarity, improving calculation of the polarity probability density function for source inversion, and it is sufficiently fast to be incorporated into an automatic processing workflow. When used in source inversion, the results are consistent with those from manual observations. In some cases, they produce a clearer constraint on the range of high-probability source mechanisms, and are better constrained than source mechanisms determined using a uniform probability of an incorrect polarity pick.
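One simple way to turn a noisy first-arrival amplitude into a polarity probability, in the spirit of the probabilistic picking described above (this is a sketch under a Gaussian-noise assumption, not the paper's exact formulation), is to ask how likely the underlying noise-free amplitude is to be positive:

```python
import math

def polarity_probability(amplitude, noise_std):
    """Probability that the true first motion is positive, assuming the
    measured amplitude equals the true amplitude plus zero-mean Gaussian
    noise of standard deviation noise_std."""
    if noise_std <= 0.0:
        return 1.0 if amplitude > 0.0 else 0.0
    # P(true amplitude > 0) = Phi(amplitude / noise_std)
    return 0.5 * (1.0 + math.erf(amplitude / (noise_std * math.sqrt(2.0))))

p_clear = polarity_probability(3.0, 1.0)     # pick at 3 sigma: near-certain
p_marginal = polarity_probability(0.5, 1.0)  # pick at 0.5 sigma: ambiguous
```

The marginal pick is retained with a soft probability rather than discarded or forced to ±1, which is what lets such picks still contribute usefully to the source inversion's likelihood.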
Breaking quantum linearity: constraints from human perception and cosmological implications
Resolving the tension between quantum superpositions and the uniqueness of the classical world is a major open problem. One possibility, which is extensively explored both theoretically and experimentally, is that quantum linearity breaks above a given scale. Theoretically, this possibility is predicted by collapse models. They provide quantitative information on where violations of the superposition principle become manifest. Here we show that the lower bound on the collapse parameter lambda, coming from the analysis of the human visual process, is ~ 7 +/- 2 orders of magnitude stronger than the original bound, in agreement with more recent analysis. This implies that the collapse becomes effective with systems containing ~ 10^4 - 10^5 nucleons, and thus falls within the range of testability with present-day technology. We also compare the spectrum of the collapsing field with those of known cosmological fields, showing that a typical cosmological random field can yield an efficient wave function collapse.
Serving GODAE Data and Products to the Ocean Community
The Global Ocean Data Assimilation Experiment (GODAE [http://www.godae.org]) has spanned a decade of rapid technological development. The ever-increasing volume and diversity of oceanographic data produced by in situ instruments, remote-sensing platforms, and computer simulations have driven the development of a number of innovative technologies that are essential for connecting scientists with the data that they need. This paper gives an overview of the technologies that have been developed and applied in the course of GODAE, which now provide users of oceanographic data with the capability to discover, evaluate, visualize, download, and analyze data from all over the world. The key to this capability is the ability to reduce the inherent complexity of oceanographic data by providing a consistent, harmonized view of the various data products. The challenges of data serving have been addressed over the last 10 years through the cooperative skills and energies of many individuals.
Systematic search for low-enthalpy sp3 carbon using evolutionary metadynamics
We present a systematic search for low-energy metastable superhard carbon allotropes using the recently developed evolutionary metadynamics technique. It is known that cold compression of graphite produces an allotrope at 15-20 GPa. Here we look for all low-enthalpy structures accessible from graphite. Starting from 2H- or 3R-graphite and applying a pressure of 20 GPa, a large variety of intermediate carbon allotropes were observed in the evolutionary metadynamics simulations. Our calculations not only found all the previously proposed candidates for 'superhard graphite', but also predicted two allotropes (X-carbon and Y-carbon) showing novel 5+7 and 4+8 topologies. These superhard carbon allotropes can be classified into five families based on 6 (diamond/lonsdaleite), 5+7 (M/W-carbon), 5+7 (X-carbon), 4+8 (bct C4), and 4+8 (Y-carbon) topologies. This study shows that evolutionary metadynamics is a powerful approach both for finding global minima and for systematically searching for low-energy metastable phases reachable from a given starting material.
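The core mechanism that lets metadynamics escape a starting structure such as graphite is a history-dependent bias: Gaussians are deposited where the system has already been, gradually filling the current enthalpy basin until the system spills into a neighbouring one. A minimal one-dimensional sketch of that mechanism (plain metadynamics on a toy double-well, not the evolutionary, cell-shape-based scheme used in the paper):

```python
import numpy as np

def metadynamics_1d(grad_U, x0=-1.0, w=0.1, sigma=0.2, steps=4000,
                    dt=0.01, deposit_every=50, kT=0.05, seed=0):
    """Overdamped Langevin dynamics plus a bias potential built from
    Gaussians of height w and width sigma centred on visited points."""
    rng = np.random.default_rng(seed)
    centres = []

    def grad_bias(x):
        if not centres:
            return 0.0
        c = np.asarray(centres)
        return np.sum(-w * (x - c) / sigma**2
                      * np.exp(-(x - c) ** 2 / (2.0 * sigma**2)))

    x, traj = x0, []
    for step in range(steps):
        force = -grad_U(x) - grad_bias(x)      # physical force + bias force
        x += dt * force + np.sqrt(2.0 * dt * kT) * rng.normal()
        if step % deposit_every == 0:
            centres.append(x)                  # fill the basin we are in
        traj.append(x)
    return np.array(traj)

# Double well U(x) = (x^2 - 1)^2, barrier height 1 at x = 0: the accumulated
# bias eventually pushes the walker from the minimum near x = -1 over the
# barrier into the second minimum near x = +1.
traj = metadynamics_1d(lambda x: 4.0 * x * (x * x - 1.0))
```

In the structure-prediction setting the one-dimensional coordinate is replaced by collective variables describing the simulation cell, and the escape events yield the intermediate allotropes catalogued above.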