1,682 research outputs found
Reliability-as-a-Service for bearing risk assessment investigated with advanced mathematical models
As a key factor in bearing service life, lubricant chemistry has a profound effect on bearing reliability. To increase the reliability of bearings, an Industrial Analytics solution is proposed for proactive condition monitoring, delivered via a Reliability-as-a-Service application. The performance predictions of bearings rely on customized algorithms focused on digitalizing lubricant chemistry; the principles behind these processes are outlined in this study. Subsequently, independent testing is performed to confirm the ability of the presented Industrial Analytics solution to make such predictions. By deciphering the chemical compounds of lubricants and the characteristics of the interface, the Industrial Analytics solution delivers a precise a priori assessment of bearing reliability that predicts the service life of the operation. Bearing tests have shown that the classification system of this Industrial Analytics solution is able to predict 12 out of 13 bearing failures (92%). The described approach provides a proactive bearing risk classification that allows the operator to take immediate action to reduce the failure potential while operation is still smooth, preventing potential damage from occurring. For this purpose, a mathematical model is introduced that derives a set of classification rules for oil lubricants, based on linear binary classifiers (support vector machines) applied to the chemical compound mixture data.
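The classification step described in the abstract (linear binary classifiers over lubricant mixture data) can be illustrated with a minimal, self-contained sketch. The feature names and numbers below are hypothetical, and a simple subgradient-descent trainer on the hinge loss stands in for whatever SVM solver the authors actually used:

```python
def train_linear_svm(X, y, lam=0.01, epochs=500, lr=0.1):
    """Train a linear SVM by subgradient descent on the hinge loss.
    X: list of feature vectors; y: labels in {-1, +1}."""
    d = len(X[0])
    w = [0.0] * d
    b = 0.0
    for _ in range(epochs):
        for xi, yi in zip(X, y):
            margin = yi * (sum(wj * xj for wj, xj in zip(w, xi)) + b)
            if margin < 1:  # point inside the margin: hinge-loss subgradient step
                w = [wj + lr * (yi * xj - lam * wj) for wj, xj in zip(w, xi)]
                b += lr * yi
            else:           # correctly classified with margin: only regularize
                w = [wj - lr * lam * wj for wj in w]
    return w, b

def predict(w, b, x):
    """Return +1 (elevated risk) or -1 (low risk) for a feature vector x."""
    return 1 if sum(wj * xj for wj, xj in zip(w, x)) + b >= 0 else -1

# Hypothetical lubricant features: [additive depletion (fraction), oxidation index]
X = [[0.05, 0.10], [0.08, 0.20], [0.10, 0.15],
     [0.60, 0.80], [0.70, 0.90], [0.80, 0.95]]
y = [-1, -1, -1, 1, 1, 1]  # -1 = low bearing risk, +1 = elevated bearing risk

w, b = train_linear_svm(X, y)
print(predict(w, b, [0.07, 0.12]), predict(w, b, [0.75, 0.85]))
```

A linear classifier keeps the resulting risk rules interpretable: the sign and size of each weight indicate how a chemical feature shifts the risk class.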
Boundary condition for Ginzburg-Landau theory of superconducting layers
Electrostatic charging changes the critical temperature of superconducting
thin layers. To understand the basic mechanism, it is possible to use the
Ginzburg-Landau theory with the boundary condition derived by de Gennes from
the BCS theory. Here we show that a similar boundary condition can be obtained
from the principle of minimum free energy. We compare the two boundary
conditions and use the Budd-Vannimenus theorem as a test of approximations. Comment: 6 pages, 4 figures
Assimilation of OMI NO<sub>2</sub> retrievals into the limited-area chemistry-transport model DEHM (V2009.0) with a 3-D OI algorithm
Data assimilation is the process of combining real-world observations with a modelled geophysical field. The increasing abundance of satellite retrievals of atmospheric trace gases makes chemical data assimilation an increasingly viable method for deriving more accurate analysed fields and initial conditions for air quality forecasts. We implemented a three-dimensional optimal interpolation (OI) scheme to assimilate retrievals of NO2 tropospheric columns from the Ozone Monitoring Instrument into the Danish Eulerian Hemispheric Model (DEHM, version V2009.0), a three-dimensional, regional-scale, offline chemistry-transport model. The background error covariance matrix, B, was estimated based on differences in the NO2 concentration field between paired simulations using different meteorological inputs. Background error correlations were modelled as non-separable, horizontally homogeneous and isotropic. Parameters were estimated for each month and for each hour to allow for seasonal and diurnal patterns in NO2 concentrations. Three experiments were run to compare the effects of observation thinning and the choice of observation errors. Model performance was assessed by comparing the analysed fields to an independent set of observations: ground-based measurements from European air-quality monitoring stations. The analysed NO2 and O3 concentrations were more accurate than those from a reference simulation without assimilation, with increased temporal correlation for both species. Thinning of satellite data and the use of constant observation errors yielded a better balance between the observed increments and the prescribed error covariances, with no appreciable degradation in the surface concentrations due to the observation thinning. Forecasts were also considered and these showed rather limited influence from the initial conditions once the effects of the diurnal cycle were accounted for.
The simple OI scheme was effective and computationally feasible in this context, where only a single species was assimilated, adjusting the three-dimensional field for this compound. Limitations of the assimilation scheme are discussed.
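The OI analysis step described above follows the standard update x_a = x_b + B H^T (H B H^T + R)^{-1} (y - H x_b). A minimal single-observation sketch (toy numbers, not the DEHM configuration; with one observation the matrix inverse reduces to a scalar division):

```python
def oi_analysis_single_obs(x_b, B, h, y, r):
    """One optimal-interpolation analysis step for a single observation.
    x_b: background state (list of n values)
    B:   background error covariance (n x n, list of lists)
    h:   observation operator row H (list of n values)
    y:   observed value; r: observation error variance."""
    n = len(x_b)
    hx = sum(h[j] * x_b[j] for j in range(n))                       # H x_b
    Bh = [sum(B[i][j] * h[j] for j in range(n)) for i in range(n)]  # B H^T
    hBh = sum(h[i] * Bh[i] for i in range(n))                       # H B H^T (scalar)
    gain = [Bh[i] / (hBh + r) for i in range(n)]                    # Kalman gain K
    innovation = y - hx                                             # observed increment
    return [x_b[i] + gain[i] * innovation for i in range(n)]

# Three grid cells; the "retrieval" is a column average of cells 0 and 1.
x_b = [10.0, 12.0, 8.0]
B = [[4.0, 2.0, 1.0],
     [2.0, 4.0, 2.0],
     [1.0, 2.0, 4.0]]
h = [0.5, 0.5, 0.0]
x_a = oi_analysis_single_obs(x_b, B, h, y=14.0, r=1.0)
print(x_a)  # analysed state pulled toward the observation
```

Note how cell 2, which is not observed at all (h[2] = 0), is still adjusted through its background error correlation with the observed cells; this is exactly how column retrievals can influence the full three-dimensional field.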
Symplectic structure of N=1 supergravity with anomalies and Chern-Simons terms
The general actions of matter-coupled N=1 supergravity have Peccei-Quinn
terms that may violate gauge and supersymmetry invariance. In addition, N=1
supergravity with vector multiplets may also contain generalized Chern-Simons
terms. These have often been neglected in the literature despite their
importance for gauge and supersymmetry invariance. We clarify the interplay of
Peccei-Quinn terms, generalized Chern-Simons terms and quantum anomalies in the
context of N=1 supergravity and exhibit conditions that have to be satisfied
for their mutual consistency. This extension of the previously known N=1
matter-coupled supergravity actions follows naturally from the embedding of the
gauge group into the group of symplectic duality transformations. Our results
regarding this extension provide the supersymmetric framework for studies of
string compactifications with axionic shift symmetries, generalized
Chern-Simons terms and quantum anomalies. Comment: 27 pages; v2: typos corrected; version to be published in Class. Quantum Grav.
Interaction between ionic lattices and superconducting condensates
The interaction of the ionic lattice with the superconducting condensate is
treated in terms of the electrostatic force in superconductors. It is shown
that this force is similar but not identical to the force suggested by the
volume difference of the normal and superconducting states. The BCS theory
shows larger deviations than the two-fluid model. Comment: 6 pages, no figures
Electric/magnetic duality for chiral gauge theories with anomaly cancellation
We show that 4D gauge theories with Green-Schwarz anomaly cancellation and
possible generalized Chern-Simons terms admit a formulation that is manifestly
covariant with respect to electric/magnetic duality transformations. This
generalizes previous work on the symplectically covariant formulation of
anomaly-free gauge theories as they typically occur in extended supergravity,
and now also includes general theories with (pseudo-)anomalous gauge
interactions as they may occur in global or local N=1 supersymmetry. This
generalization is achieved by relaxing the linear constraint on the embedding
tensor so as to allow for a symmetric 3-tensor related to electric and/or
magnetic quantum anomalies in these theories. Apart from electric and magnetic
gauge fields, the resulting Lagrangians also feature two-form fields and can
accommodate various unusual duality frames as they often appear, e.g., in
string compactifications with background fluxes. Comment: 37 pages; v2: typos corrected and 1 reference added
The Complexity of Computing Minimal Unidirectional Covering Sets
Given a binary dominance relation on a set of alternatives, a common thread
in the social sciences is to identify subsets of alternatives that satisfy
certain notions of stability. Examples can be found in areas as diverse as
voting theory, game theory, and argumentation theory. Brandt and Fischer [BF08]
proved that it is NP-hard to decide whether an alternative is contained in some
inclusion-minimal upward or downward covering set. For both problems, we raise
this lower bound to the Theta_{2}^{p} level of the polynomial hierarchy and
provide a Sigma_{2}^{p} upper bound. Relatedly, we show that a variety of other
natural problems regarding minimal or minimum-size covering sets are hard or
complete for either of NP, coNP, and Theta_{2}^{p}. An important consequence of
our results is that neither minimal upward nor minimal downward covering sets
(even when guaranteed to exist) can be computed in polynomial time unless P=NP.
This sharply contrasts with Brandt and Fischer's result that minimal
bidirectional covering sets (i.e., sets that are both minimal upward and
minimal downward covering sets) are polynomial-time computable. Comment: 27 pages, 7 figures
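The objects in this abstract can be made concrete with a brute-force sketch. It uses one common reading of the definitions: x upward covers y (within a set S) if x dominates y and every z in S that dominates x also dominates y, and S is an upward covering set if no alternative in S is covered within S while every outside alternative is covered once added. The hardness results above imply such exhaustive search is essentially unavoidable in general; this toy version is exponential and only for tiny instances:

```python
from itertools import combinations

def upward_covers(dom, x, y, S):
    """Within S: x upward covers y iff x dominates y and every z in S
    that dominates x also dominates y."""
    return (x, y) in dom and all((z, y) in dom for z in S if (z, x) in dom)

def is_covering_set(dom, A, S):
    """Internal stability: nothing in S is covered within S.
    External stability: every outside alternative is covered in S union {y}."""
    internal = all(not upward_covers(dom, x, y, S)
                   for x in S for y in S if x != y)
    external = all(any(upward_covers(dom, x, y, S | {y}) for x in S)
                   for y in A - S)
    return internal and external

def minimal_covering_sets(dom, A):
    """Brute force over all subsets; exponential, fine for tiny examples only."""
    sets = [frozenset(c) for k in range(1, len(A) + 1)
            for c in combinations(sorted(A), k)
            if is_covering_set(dom, A, frozenset(c))]
    return [S for S in sets if not any(T < S for T in sets)]

# Tiny example: a 3-cycle a -> b -> c -> a. Nothing covers anything,
# so the whole set is the unique minimal upward covering set.
A = {"a", "b", "c"}
dom = {("a", "b"), ("b", "c"), ("c", "a")}
print(minimal_covering_sets(dom, A))
```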
Target-Controlled Infusion of Cefepime in Critically Ill Patients: single-center experience
Attainment of appropriate pharmacokinetic-pharmacodynamic (PK-PD) targets for antimicrobial treatment is challenging in critically ill patients, particularly for cefepime, which exhibits a relatively narrow therapeutic-toxic window compared to other beta-lactam antibiotics. Target-controlled infusion (TCI) systems, which deliver drugs to achieve specific target drug concentrations, have successfully been implemented for improved dosing of sedatives and analgesics in anesthesia. We conducted a clinical trial in an intensive care unit (ICU) to investigate the performance of TCI for adequate target attainment of cefepime. Twenty-one patients treated with cefepime according to the standard of care were included. Cefepime was administered through continuous infusion using TCI for a median duration of 4.5 days. TCI was based on a previously developed population PK model incorporating the estimated creatinine clearance based on the Cockcroft-Gault formula as the input variable to calculate cefepime clearance. A cefepime blood concentration of 16 mg/liter was targeted. To evaluate the measured versus predicted plasma concentrations, blood samples were taken (median of 10 samples per patient), and total cefepime concentrations were measured using ultraperformance liquid chromatography-tandem mass spectrometry. The performance of the TCI system was evaluated using Varvel criteria. Half (50.3%) of the measured cefepime concentrations were within +/- 30% of the target value of 16 mg/liter. The wobble was 11.4%, the median performance error (MdPE) was 21.1%, the median absolute performance error (MdAPE) was 32.0%, and the divergence was -3.72% h^-1. Based on these results, we conclude that TCI is useful for dose optimization of cefepime in ICU patients
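The Varvel-style performance criteria quoted above can be computed from the paired measured and predicted concentrations. A sketch with hypothetical sample values (divergence is omitted, since it additionally requires the sampling timestamps to fit a slope of absolute performance error over time):

```python
from statistics import median

def varvel_criteria(measured, predicted):
    """Summary performance criteria for one patient.
    measured/predicted: paired concentration lists (same units)."""
    # Performance error (PE) in percent for each paired sample
    pe = [100.0 * (m - p) / p for m, p in zip(measured, predicted)]
    mdpe = median(pe)                            # MdPE: bias
    mdape = median(abs(e) for e in pe)           # MdAPE: inaccuracy
    wobble = median(abs(e - mdpe) for e in pe)   # wobble: intra-patient variability
    return pe, mdpe, mdape, wobble

# Hypothetical samples around a 16 mg/liter target concentration
measured = [18.0, 20.0, 14.0, 22.0, 17.0]
predicted = [16.0, 16.0, 16.0, 16.0, 16.0]
pe, mdpe, mdape, wobble = varvel_criteria(measured, predicted)
print(mdpe, mdape, wobble)
```

In the study, these statistics were computed per patient and then summarized across the cohort; a positive MdPE, as reported (21.1%), means measured concentrations ran above the model prediction on average.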
Surface deformation caused by the Abrikosov vortex lattice
In superconductors penetrated by Abrikosov vortices the magnetic pressure and
the inhomogeneous condensate density induce a deformation of the ionic lattice.
We calculate how this deformation corrugates the surface of a semi-infinite
sample. The effect of the surface dipole is included.
Simultaneous Measurements of X-Ray Luminosity and Kilohertz Quasi-Periodic Oscillations in Low-Mass X-Ray Binaries
We measure simultaneously the properties of the energy spectra and the
frequencies of the kilohertz quasi-periodic oscillations (QPOs) in fifteen low
mass X-ray binaries covering a wide range of X-ray luminosities. In each source
the QPO frequencies cover the same range of approximately 300 Hz to 1300 Hz,
though the sources differ by two orders of magnitude in their X-ray
luminosities (as measured from the unabsorbed 2-50 keV flux). So the X-ray
luminosity does not uniquely determine the QPO frequency. This is difficult to
understand since the evidence from individual sources indicates that the
frequency and luminosity are very well correlated at least over short
timescales. Perhaps beaming effects or bolometric corrections change the
observed luminosities, or perhaps part of the energy in mass accretion is used
to power outflows reducing the energy emitted in X-rays. It is also possible
that the parameters of a QPO model are tuned in such a way that the same range
of frequencies appears in all sources. Different modes of accretion may be
involved for example (disk and radial) or multiple parameters may conspire to
yield the same frequencies. Comment: 14 pages, 2 figures (1 in color), accepted by ApJ, see the 'QPO page': http://www.astro.uva.nl/~ecford/qpos.htm