EUCARS: A partial equilibrium model of EUropean CAR emissions (Version 3.0).
EUCARS has been designed to analyse the cost-effectiveness of various transport policy measures to reach air quality objectives. A general description and a thorough technical presentation of this partial equilibrium model of passenger transport are given. Some simulation results are then discussed to illustrate the model's simulation properties.
Keywords: eucars, transport, emissions, modelling
Elimination Of Catalytic Hydrogen Generation In Defense Waste Processing Facility Slurries
Based on lab-scale simulations of Defense Waste Processing Facility (DWPF) slurry chemistry, the addition of sodium nitrite and sodium hydroxide to waste slurries at concentrations sufficient to take the aqueous phase into the alkaline region (pH > 7) with approximately 500 mg nitrite ion/kg slurry (assuming <25 wt% total solids, or equivalently 2,000 mg nitrite/kg total solids) is sufficient to effectively deactivate the noble metal catalysts at temperatures between room temperature and boiling. This is a potential strategy for eliminating catalytic hydrogen generation from the list of concerns for sludge carried over into the DWPF Slurry Mix Evaporator Condensate Tank (SMECT) or Recycle Collection Tank (RCT). These conclusions are drawn in large part from the various phases of the DWPF catalytic hydrogen generation program conducted between 2005 and 2009. The findings could apply to various situations, including a solids carry-over from either the Sludge Receipt and Adjustment Tank (SRAT) or Slurry Mix Evaporator (SME) into the SMECT with subsequent transfer to the RCT, as well as a spill of formic acid into the sump system and transfer into an RCT that already contains sludge solids. There are other potential mitigating factors for the SMECT and RCT, since these vessels are typically operated at temperatures close to the minimum temperatures at which catalytic hydrogen generation has been observed in either the SRAT or SME (pure slurry case), and these vessels are also likely to be considerably more dilute in both noble metals and formate ion (the two essential components of catalytic hydrogen generation) than the two primary process vessels. Rhodium certainly, and ruthenium likely, are present as metal-ligand complexes that are favored under certain concentrations of the surrounding species.
Therefore, in the SMECT or RCT, where a small volume of SRAT or SME material would be significantly diluted, conditions would be less favourable for forming or sustaining the catalytic ligand species. Such conditions are likely to reduce the ability of the transferred mass to produce hydrogen at the same rate (per unit mass of SRAT or SME slurry) as in the SRAT or SME vessels.
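The slurry-basis/solids-basis equivalence quoted above (500 mg nitrite/kg slurry at 25 wt% total solids corresponding to 2,000 mg nitrite/kg total solids) is a simple unit conversion; a minimal sketch, with a function name of our own choosing, would be:

```python
def nitrite_per_kg_solids(mg_per_kg_slurry: float, solids_wt_frac: float) -> float:
    """Convert a slurry-basis concentration (mg nitrite / kg slurry) to a
    total-solids basis (mg nitrite / kg solids) at a given solids weight
    fraction. At lower solids loadings the solids-basis figure is higher."""
    return mg_per_kg_slurry / solids_wt_frac

# The equivalence quoted in the abstract, evaluated at exactly 25 wt% solids:
print(nitrite_per_kg_solids(500.0, 0.25))  # 2000.0
```

Since the abstract bounds the solids loading from above (<25 wt%), 2,000 mg/kg solids is the minimum solids-basis concentration implied by 500 mg/kg slurry.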
Ionisation profile monitor tests in the SPS
A beam profile monitor from DESY, based on the ionisation of the rest gas, was installed in the SPS in 1997. Horizontal beam profiles obtained from the extracted positive ions are presented. It is known that in this case the signal is broadened, which limits the monitor resolution. This broadening results from the transverse momentum that the ions gain in the space-charge field of the circulating beam. To improve the resolution for LHC applications, the monitor was modified during the 1998/99 winter stop: magnetic focusing was incorporated, the aim being to analyse the signal obtained by collecting the electrons, rather than the ions, of the ionised rest gas. The details of this new set-up and the expectations for the resolution limit are compared to the measurement results.
Quartz wires versus carbon fibres for improved beam handling capacity of the LEP wire scanners
After the first investigations performed in 1994, the study of thermal effects on carbon and quartz wires was pursued in 1995. Carbon wires of 8 µm were studied, and light emission resulting from the two heating mechanisms, electromagnetic fields and collision losses with the beam, was observed. Quartz wires of 10 and 30 µm were investigated, and light emission due to heating by collisions with the beam was observed; the heat pattern differs completely from that of carbon fibres. At 20 GeV, the quartz wires withstood circulating currents of at least 8 mA, the 1995 operational level in LEP. Quantitative evaluations and the influence of various dissipation processes are presented with the aim of evaluating a beam current limit.
From Classical to Quantum Mechanics: "How to translate physical ideas into mathematical language"
In this paper, we investigate the connection between Classical and Quantum Mechanics by dividing Quantum Theory into two parts: General Quantum Axiomatics (a system is described by a state in a Hilbert space, observables are self-adjoint operators, and so on) and Quantum Mechanics proper, which specifies the Hilbert space, the Heisenberg rule, the free Hamiltonian, and so forth. We show that General Quantum Axiomatics (up to a supplementary "axiom of classicity") can be used as a non-standard mathematical ground on which to formulate all the ideas and equations of ordinary Classical Statistical Mechanics. The question of a "true quantization" with "h" must therefore be seen as an independent problem, not directly related to the quantum formalism. Moreover, this non-standard formulation of Classical Mechanics exhibits a new kind of operation with no classical counterpart: this operation is related to the "quantization process", and we show why quantization physically depends on group theory (the Galileo group). This analytical procedure of quantization replaces the "correspondence principle" (or canonical quantization) and allows one to map Classical Mechanics into Quantum Mechanics, yielding all the operators of Quantum Mechanics and the Schrodinger equation. Moreover, spins for particles are naturally generated, including an approximation of their interaction with magnetic fields. We also find that this approach gives a natural semi-classical formalism: some exact quantum results are obtained using only classical-like formulae. This procedure thus has the nice property of illuminating, in a more comprehensible way, both the logical and analytical connections between the classical and quantum pictures.
Forecasting Cross-Sections of Frailty-Correlated Default
We propose a novel econometric model for estimating and forecasting cross-sections of time-varying conditional default probabilities. The model captures the systematic variation in corporate default counts across, e.g., rating and industry groups by using dynamic factors extracted from a large panel of selected macroeconomic and financial data, as well as common unobserved risk factors. All factors are statistically and economically significant and together capture a large part of the time variation in observed default rates. In this framework we improve the out-of-sample forecasting accuracy of conditional default probabilities by about 10-35% in terms of mean absolute error, particularly in years of default stress.
Consistency of the Shannon entropy in quantum experiments
The consistency of the Shannon entropy, when applied to outcomes of quantum experiments, is analysed. It is shown that the Shannon entropy is fully consistent and its properties are never violated in quantum settings, but attention must be paid to logical and experimental contexts. This last remark is shown to apply regardless of the quantum or classical nature of the experiments.
Discovering the benefits of being interrupted by colleagues at work
The negative effects of work interruptions are well documented: difficulty moving ahead with tasks, time pressure, stress, and lowered productivity. Managers often look for ways to eliminate, or at least minimise, such interruptions. But a new study by Harshad Puranik, Joel Koopman, and Heather C. Vough shows an upside to these workplace interruptions: increased feelings of belonging.
Cosmic rays studied with a hybrid high school detector array
The LORUN/NAHSA system is a pathfinder for hybrid cosmic ray research combined with education and outreach in the field of astro-particle physics. Particle detectors and radio antennae were mainly set up by students and placed on public buildings. After fully digital data acquisition, coincidence detections were selected. Three candidate events confirmed a working prototype, which can be replicated to extend particle detector arrays at high schools.
Published in Europhysics News (EPN), Vol. 38, No. 5 (Nigl, A., Timmermans, C., Schellart, P., Kuijpers, J., Falcke, H., Horneffer, A., de Vos, C. M., Koopman, Y., Pepping, H. J., Schoonderbeek, G.), accepted on 22/08/200
Computing Observation Weights for Signal Extraction and Filtering
We present algorithms for computing the weights implicitly assigned to observations when estimating unobserved components using a model in state space form. The algorithms are for both filtering and signal extraction. In linear time-invariant models such weights can sometimes be obtained analytically from the Wiener-Kolmogorov formulae. Our method is much more general, being applicable to any model with a linear state space form, including models with deterministic components and time-varying state matrices. It applies to multivariate models and it can be used when there are data irregularities, such as missing observations. The algorithms can be useful for a variety of purposes in econometrics and statistics: (i) the weights for signal extraction can be regarded as equivalent kernel functions and hence the weight pattern can be compared with the kernels typically used in nonparametric trend estimation; (ii) the weight algorithm for filtering implicitly computes the coefficients of the vector error-correction model (VECM) representation of any linear time series model; (iii) as a by-product the mean square errors associated with estimators may be obtained; (iv) the algorithm can be incorporated within a Markov chain Monte Carlo (MCMC) method, enabling computation of the weights assigned to observations when computing the posterior mean of unobserved components within a Bayesian treatment. A wide range of illustrations show how the algorithms may provide important insights in empirical analysis. The algorithms are provided and implemented for the software package SsfPack 2.3, which is a set of filtering, smoothing and simulation algorithms for models in state space form (see www.ssfpack.com). Some details of implementation and example programs are given in the appendix of the paper.
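Point (i) above, signal-extraction weights read as an equivalent kernel, can be illustrated with a minimal numpy sketch for the local level model (this is a brute-force illustration, not the paper's algorithm or SsfPack; function names and parameter values are our own). Because the Kalman smoother is linear in the observations when the initial state mean is zero, the weight each observation receives in the smoothed estimate can be read off by smoothing unit vectors:

```python
import numpy as np

def smooth_local_level(y, sigma_eps2=1.0, sigma_eta2=0.1, a1=0.0, p1=1e7):
    """Kalman filter + fixed-interval smoother for the local level model
    y_t = mu_t + eps_t, mu_{t+1} = mu_t + eta_t. Returns smoothed states.
    A large p1 approximates a diffuse initial condition."""
    n = len(y)
    a = np.zeros(n); p = np.zeros(n)    # one-step-ahead predictions
    af = np.zeros(n); pf = np.zeros(n)  # filtered estimates
    a[0], p[0] = a1, p1
    for t in range(n):
        f = p[t] + sigma_eps2           # prediction error variance
        k = p[t] / f                    # Kalman gain
        af[t] = a[t] + k * (y[t] - a[t])
        pf[t] = p[t] * (1.0 - k)
        if t + 1 < n:
            a[t + 1] = af[t]
            p[t + 1] = pf[t] + sigma_eta2
    s = np.zeros(n)                     # backward (RTS-style) smoothing pass
    s[-1] = af[-1]
    for t in range(n - 2, -1, -1):
        g = pf[t] / (pf[t] + sigma_eta2)
        s[t] = af[t] + g * (s[t + 1] - af[t])
    return s

def smoother_weights(n, t, **kw):
    """Weights w_j such that E[mu_t | y] = sum_j w_j * y_j, obtained by
    running the (linear) smoother on each unit observation vector."""
    return np.array([smooth_local_level(np.eye(n)[j], **kw)[t] for j in range(n)])
```

For an interior time point, `smoother_weights(15, 7)` returns a kernel that peaks at observation 7 and decays on both sides, and the weights sum to approximately one under the near-diffuse prior, which is exactly the equivalent-kernel reading of the abstract. The analytical algorithms of the paper avoid the n smoother passes used here.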