Discrete Symmetries of Off-Shell Electromagnetism
We discuss the discrete symmetries of the Stueckelberg-Schrodinger
relativistic quantum theory and its associated 5D local gauge theory, a
dynamical description of particle/antiparticle interactions, with monotonically
increasing Poincare-invariant parameter. In this framework, worldlines are
traced out through the parameterized evolution of spacetime events, advancing
or retreating with respect to the laboratory clock, with negative energy
trajectories appearing as antiparticles when the observer describes the
evolution using the laboratory clock. The associated gauge theory describes
local interactions between events (correlated by the invariant parameter)
mediated by five off-shell gauge fields. These gauge fields are shown to
transform tensorially under space and time reflections, unlike the
standard Maxwell fields, and the interacting quantum theory therefore remains
manifestly Lorentz covariant. Charge conjugation symmetry in the quantum theory
is achieved by simultaneous reflection of the sense of evolution and the fifth
scalar field. Applying this procedure to the classical gauge theory leads to a
purely classical manifestation of charge conjugation, placing the CPT
symmetries on the same footing in the classical and quantum domains. In the
resulting picture, interactions do not distinguish between particle and
antiparticle trajectories -- charge conjugation merely describes the
interpretation of observed negative energy trajectories according to the
laboratory clock.

Comment: 26 pages
Chance-Constrained Efficiency Analysis
Data envelopment analysis (DEA) is extended to the case of stochastic inputs and outputs through the use of chance-constrained programming. The chance-constrained envelope envelops a given set of observations "most of the time." We show that the chance-constrained enveloping process leads to the definition of a conventional (certainty-equivalent) efficiency ratio (a ratio between weighted outputs and weighted inputs). Furthermore, extending the concept of Pareto and Koopmans efficiency to the case of chance-constrained dominance (to be defined), we establish the identity of the following two chance-constrained efficiency concepts: (i) the chance-constrained DEA efficiency measure of a particular output-input point is unity, and all chance constraints are binding; (ii) the point is efficient in the sense of Pareto and Koopmans. Finally, we discuss the implications of our approach for econometric frontier analysis.
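As a minimal numeric illustration of the certainty-equivalent efficiency ratio described above (weighted outputs over weighted inputs), the sketch below uses hypothetical data and fixed weights; it is not the paper's chance-constrained programme, in which DEA would optimise the weights separately for each unit.

```python
# Minimal sketch of a DEA-style efficiency ratio: weighted outputs
# divided by weighted inputs for each decision-making unit (DMU).
# Data and weights are illustrative, not taken from the paper.

def efficiency_ratio(outputs, inputs, u, v):
    """Ratio of weighted outputs to weighted inputs."""
    weighted_out = sum(ui * yi for ui, yi in zip(u, outputs))
    weighted_in = sum(vj * xj for vj, xj in zip(v, inputs))
    return weighted_out / weighted_in

# Three hypothetical DMUs, each with 2 outputs and 2 inputs.
dmus = {
    "A": ([20.0, 10.0], [5.0, 8.0]),
    "B": ([15.0, 12.0], [6.0, 6.0]),
    "C": ([10.0,  5.0], [5.0, 9.0]),
}
u = [1.0, 1.0]   # output weights (fixed here; real DEA chooses them per DMU)
v = [1.0, 1.0]   # input weights

scores = {name: efficiency_ratio(y, x, u, v) for name, (y, x) in dmus.items()}
best = max(scores.values())
# Normalise so the best DMU scores 1, mimicking the DEA convention
# that an efficient unit attains an efficiency measure of unity.
normalised = {name: s / best for name, s in scores.items()}
print(normalised)
```

With these toy numbers, unit A attains the ratio bound and so scores unity; the other units score strictly below it.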
Non-Gaussianity detections in the Bianchi VIIh corrected WMAP 1-year data made with directional spherical wavelets
Many of the current anomalies reported in the Wilkinson Microwave Anisotropy
Probe (WMAP) 1-year data disappear after `correcting' for the best-fit embedded
Bianchi type VII_h component (Jaffe et al. 2005), albeit assuming no dark
energy component. We investigate the effect of this Bianchi correction on the
detections of non-Gaussianity in the WMAP data that we previously made using
directional spherical wavelets (McEwen et al. 2005a). As previously discovered
by Jaffe et al. (2005), the deviations from Gaussianity in the kurtosis of
spherical Mexican hat wavelet coefficients are eliminated once the data is
corrected for the Bianchi component. This is due to the reduction of the cold
spot at Galactic coordinates (l,b)=(209^\circ,-57^\circ), which Cruz et al.
(2005) claim to be the source of non-Gaussianity introduced in the kurtosis.
Our previous detections of non-Gaussianity observed in the skewness of
spherical wavelet coefficients are not reduced by the Bianchi correction.
Indeed, the most significant detection of non-Gaussianity made with the
spherical real Morlet wavelet at a significance level of 98.4% remains (using a
very conservative method to estimate the significance). We make our code to
simulate Bianchi-induced temperature fluctuations publicly available.

Comment: 11 pages, 8 figures; replaced to match version accepted by MNRAS
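The detections above rest on comparing a statistic of wavelet coefficients (skewness or kurtosis) against Gaussian simulations to assign a significance level. A minimal sketch of that idea, using synthetic heavy-tailed data and a toy Monte Carlo rather than the authors' spherical-wavelet pipeline (all names and sizes here are hypothetical):

```python
import random
import statistics

def excess_kurtosis(xs):
    """Sample excess kurtosis: E[(x-mu)^4] / sigma^4 - 3 (zero for a Gaussian)."""
    mu = statistics.fmean(xs)
    var = statistics.fmean((x - mu) ** 2 for x in xs)
    m4 = statistics.fmean((x - mu) ** 4 for x in xs)
    return m4 / var ** 2 - 3.0

def mc_significance(observed_stat, n_sims, n_sample, rng):
    """Fraction of Gaussian simulations whose |kurtosis| falls below the observed value."""
    count = 0
    for _ in range(n_sims):
        sim = [rng.gauss(0.0, 1.0) for _ in range(n_sample)]
        if abs(excess_kurtosis(sim)) < abs(observed_stat):
            count += 1
    return count / n_sims

rng = random.Random(0)
# Hypothetical "wavelet coefficients" drawn from a heavy-tailed (non-Gaussian) law.
coeffs = [rng.gauss(0.0, 1.0) ** 3 for _ in range(2000)]
k = excess_kurtosis(coeffs)
sig = mc_significance(k, n_sims=200, n_sample=2000, rng=rng)
print(f"excess kurtosis = {k:.2f}, significance ~ {sig:.1%}")
```

A significance near 100% means essentially no Gaussian realisation reproduces a statistic as extreme as the observed one, which is the same logic behind quoting a 98.4% level for the Morlet-wavelet detection.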
Trellis-Based Equalization for Sparse ISI Channels Revisited
Sparse intersymbol-interference (ISI) channels are encountered in a variety
of high-data-rate communication systems. Such channels have a large channel
memory length, but only a small number of significant channel coefficients. In
this paper, trellis-based equalization of sparse ISI channels is revisited. Due
to the large channel memory length, the complexity of maximum-likelihood
detection, e.g., by means of the Viterbi algorithm (VA), is normally
prohibitive. In the first part of the paper, a unified framework based on
factor graphs is presented for complexity reduction without loss of optimality.
In this new context, two known reduced-complexity algorithms for sparse ISI
channels are recapitulated: The multi-trellis VA (M-VA) and the
parallel-trellis VA (P-VA). It is shown that the M-VA, contrary to what has
been claimed, does not lead to a reduced computational complexity. The P-VA, on the other hand,
leads to a significant complexity reduction, but can only be applied for a
certain class of sparse channels. In the second part of the paper, a unified
approach is investigated to tackle general sparse channels: It is shown that
the use of a linear filter at the receiver renders the application of standard
reduced-state trellis-based equalizer algorithms feasible, without significant
loss of optimality. Numerical results verify the efficiency of the proposed
receiver structure.

Comment: To be presented at the 2005 IEEE Int. Symp. Inform. Theory (ISIT 2005), September 4-9, 2005, Adelaide, Australia
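The complexity argument above can be made concrete with a state count: a full Viterbi trellis needs |A|^L states for alphabet size |A| and channel memory L, regardless of how many of the L+1 taps are actually nonzero. A small numeric sketch with a hypothetical BPSK sparse channel (the tap values and lengths are illustrative only):

```python
# Illustrative state-count comparison for trellis-based equalization
# of a sparse ISI channel. Channel taps and sizes are hypothetical.

def num_trellis_states(alphabet_size, memory):
    """A full Viterbi trellis has |A|^memory states."""
    return alphabet_size ** memory

# Sparse channel: memory length 12 (13 taps) but only 3 nonzero taps.
taps = [1.0] + [0.0] * 7 + [0.5] + [0.0] * 3 + [0.25]
memory = len(taps) - 1
significant = sum(1 for h in taps if h != 0.0)

full_states = num_trellis_states(2, memory)             # BPSK, full trellis
compact_states = num_trellis_states(2, significant - 1) # if only significant taps set the memory

print(f"memory={memory}, significant taps={significant}")
print(f"full trellis: {full_states} states; significant-taps-only: {compact_states} states")
```

The gap (4096 states versus 4 in this toy case) is why reduced-complexity schemes such as the P-VA, or reduced-state equalization after a receiver prefilter, are attractive for sparse channels.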
Cyclical Tests of Selected Space Shuttle TPS Metallic Materials in a Plasma Arc Tunnel. Volume 2: Appendices - Data Tabulation
Calibration data are presented for heat flux and pressure profiles, model temperature histories, and model weight and thickness changes.
Cyclical tests of selected space shuttle TPS metallic materials in a plasma arc tunnel. Volume 1: Description of tests and program summary
Work concerned with the cyclical thermal evaluation of selected space shuttle thermal protection system (TPS) metallic materials in a hypervelocity oxidizing atmosphere that approximated an actual entry environment is presented. A total of 325 sample test hours was accumulated on 21 super-alloy metallic samples at temperatures from 1800 to 2200 F (1256 to 1478 K) without any failures. The 4 x 4 in. (10.2 x 10.2 cm) samples were fabricated from five nickel-base alloys and one cobalt-base alloy. Eighteen of the samples were cycled 100 times each and the other three samples 50 times each in a test stream emanating from an 8 in. (20.3 cm) diam exit, Mach 4.6, conical nozzle. The test cycle consisted of a 10 min heat pulse to a controlled temperature followed by a 10 min cooldown period. The TD-NiCrAl and TD-NiAlY materials showed the least change in weight, thickness, and physical appearance even though they were subjected to the highest-temperature environment.
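The quoted temperature range converts between unit systems via T_K = (T_F - 32) * 5/9 + 273.15; a quick check of the conversion (the abstract's Kelvin figures round to within about a degree of the exact values):

```python
def fahrenheit_to_kelvin(t_f):
    """Convert Fahrenheit to Kelvin: K = (F - 32) * 5/9 + 273.15."""
    return (t_f - 32.0) * 5.0 / 9.0 + 273.15

for t_f in (1800.0, 2200.0):
    print(f"{t_f:.0f} F = {fahrenheit_to_kelvin(t_f):.1f} K")
```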
A ‘conversation’ between Frank Land [FL] and Antony Bryant [AB] – Part 2
Part 1 of the ‘conversation’ offered important insights into a groundbreaking era for computer development – adding further detail to existing writings by Frank Land, the work of the LEO group in general, and extended accounts such as those by Ferry, Hally and Harding. This should have whetted the appetite of readers keen to know more, and may also prompt others to offer their own accounts. Part 2 moves on to Frank Land’s subsequent activities as one of the founding figures of the Information Systems (IS) Academy, and his ‘Emeritus’ phase.