
    Remarks on the Extended Characteristic Uncertainty Relations

    Three remarks concerning the form and the range of validity of the state-extended characteristic uncertainty relations (URs) are presented. A more general definition of the uncertainty matrix for pure and mixed states is suggested. Some new URs are provided. Comment: LaTeX, 4 pages, no figures
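
    For orientation, the textbook uncertainty matrix that the paper's extended definition generalizes, and the Robertson determinant relation it refines, can be sketched as follows (standard forms, not the paper's extension):

        \sigma_{jk} = \tfrac{1}{2}\,\mathrm{Tr}\,\rho\,(A_j A_k + A_k A_j) - \mathrm{Tr}(\rho A_j)\,\mathrm{Tr}(\rho A_k),
        \qquad
        C_{jk} = -\tfrac{i}{2}\,\mathrm{Tr}\,\rho\,[A_j, A_k],
        \qquad
        \det \sigma \;\ge\; \det C .

    The characteristic URs sharpen the determinant inequality into relations among the characteristic coefficients (invariants) of \sigma and C.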

    The end-to-end testbed of the Optical Metrology System on-board LISA Pathfinder

    LISA Pathfinder is a technology demonstration mission for the Laser Interferometer Space Antenna (LISA). The main experiment on-board LISA Pathfinder is the so-called LISA Technology Package (LTP), which aims to measure the differential acceleration between two free-falling test masses with an accuracy of 3x10^(-14) m s^(-2)/sqrt(Hz) between 1 mHz and 30 mHz. This measurement is performed interferometrically by the Optical Metrology System (OMS) on-board LISA Pathfinder. In this paper we present the development of an experimental end-to-end testbed of the entire OMS. It includes the interferometer and its sub-units, the interferometer back-end (a phasemeter), and the processing of the phasemeter output data. Furthermore, three-axis piezo-actuated mirrors are used instead of the free-falling test masses for the characterisation of the dynamic behaviour of the system and of some parts of the Drag-free and Attitude Control System (DFACS), which controls the test masses and the satellite. The end-to-end testbed includes all parts of the LTP that can reasonably be tested on earth without free-falling test masses. At present it consists mainly of breadboard components, some of which have already been replaced by Engineering Models of the LTP experiment. In the next steps, further Engineering Models and Flight Models will be inserted into this testbed and tested against well-characterised breadboard components. The presented testbed is an important reference for the unit tests and can also be used for validation of the on-board experiment during the mission.
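
    As a rough illustration of the final processing stage, checking a phasemeter-derived acceleration time series against the 3x10^(-14) m s^(-2)/sqrt(Hz) requirement, here is a minimal Python sketch; the sampling rate, variable names, and placeholder data are assumptions, not mission values:

        import numpy as np
        from scipy.signal import welch

        # Hypothetical phasemeter output: differential test-mass acceleration
        # in m/s^2, sampled at fs Hz. Placeholder data; replace with real series.
        fs = 10.0
        acc = np.random.randn(200_000) * 1e-13

        # Welch PSD with long segments so the mHz band is resolved.
        f, psd = welch(acc, fs=fs, nperseg=2**16)
        asd = np.sqrt(psd)  # amplitude spectral density, m s^-2 / sqrt(Hz)

        # LTP requirement: 3e-14 m s^-2/sqrt(Hz) between 1 mHz and 30 mHz.
        band = (f >= 1e-3) & (f <= 30e-3)
        print("max ASD in band: %.2e" % asd[band].max())
        print("meets requirement:", np.all(asd[band] <= 3e-14))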

    Proper Size of the Visible Universe in FRW Metrics with Constant Spacetime Curvature

    In this paper, we continue to examine the fundamental basis for the Friedmann-Robertson-Walker (FRW) metric and its application to cosmology, specifically addressing the question: what is the proper size of the visible universe? There are several ways of answering this question, though often with an incomplete understanding of how far light has actually traveled in reaching us today from the most remote sources. The difficulty usually arises from an inconsistent use of the coordinates, or an over-interpretation of the physical meaning of quantities such as the so-called proper distance R(t)=a(t)r, written in terms of the (unchanging) co-moving radius r and the universal expansion factor a(t). In this paper, we use the five non-trivial FRW metrics with constant spacetime curvature (i.e., the static FRW metrics, excluding Minkowski) to prove that in static FRW spacetimes in which the expansion began from an initial singularity, the visible universe today has a proper size equal to R_h(t_0/2), i.e., the gravitational horizon at half its current age. The exceptions are de Sitter and Lanczos, whose contents had pre-existing positions away from the origin. In so doing, we confirm earlier results showing the same phenomenon in a broad range of cosmologies, including LCDM, based on the numerical integration of null geodesic equations through an FRW metric. Comment: Accepted for publication in Classical and Quantum Gravity
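
    The quantities at issue can be stated compactly. A photon's proper distance from the observer obeys the standard radial null-geodesic equation in FRW coordinates, with the gravitational horizon set by the Hubble rate (notation taken from the abstract):

        R(t) = a(t)\,r, \qquad R_h(t) \equiv \frac{c}{H(t)},
        \qquad
        \dot{R}_\gamma = H R_\gamma - c = c\left(\frac{R_\gamma}{R_h} - 1\right).

    An arriving photon's proper distance is therefore maximal exactly where its world line crosses R_h; the paper's result is that, for the constant-curvature metrics whose expansion began at an initial singularity, this maximum equals R_h(t_0/2).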

    Renormalized Effective QCD Hamiltonian: Gluonic Sector

    Extending previous QCD Hamiltonian studies, we present a new renormalization procedure which generates an effective Hamiltonian for the gluon sector. The formulation is in the Coulomb gauge, where the QCD Hamiltonian is renormalizable and the Gribov problem can be resolved. We utilize elements of the Glazek and Wilson regularization method but now introduce a continuous cut-off procedure which eliminates non-local counterterms. The effective Hamiltonian is then derived to second order in the strong coupling constant. The resulting renormalized Hamiltonian provides a realistic starting point for approximate many-body calculations of hadronic properties for systems with explicit gluon degrees of freedom. Comment: 25 pages, no figures, revtex
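
    The abstract does not spell out the scheme, but second-order effective Hamiltonians of this similarity-renormalization type generically take the following matrix-element form (a sketch only; the paper's continuous cutoff functions and counterterms are suppressed here):

        \langle i | H_{\rm eff} | j \rangle = \langle i | H_0 + g H_1 | j \rangle
        + \frac{g^2}{2} \sum_k \langle i | H_1 | k \rangle \langle k | H_1 | j \rangle
        \left( \frac{1}{E_i - E_k} + \frac{1}{E_j - E_k} \right) + O(g^3),

    with the sum running over intermediate states removed by the cutoff, and the counterterms fixed by demanding that physical results be independent of the regulator.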

    Spherically Symmetric Solutions in Møller's Tetrad Theory of Gravitation

    The general solution of Møller's field equations in the case of spherical symmetry is derived. The previously obtained solutions are verified as special cases of the general solution. Comment: LaTeX2e with AMS-LaTeX 1.2, 8 pages
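
    In tetrad theories the metric is built from the tetrad field, and spherical symmetry is typically imposed through an ansatz of the following shape (a minimal illustration; the general solution of the paper allows more freedom, since tetrads related by local Lorentz rotations yield the same metric):

        g_{\mu\nu} = \eta_{ab}\, h^a{}_\mu\, h^b{}_\nu,
        \qquad
        h^a{}_\mu = \mathrm{diag}\big(A(r),\, B(r),\, r,\, r\sin\theta\big)
        \;\Longrightarrow\;
        ds^2 = -A^2\, dt^2 + B^2\, dr^2 + r^2\, d\Omega^2 .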

    Inequalities for quantum skew information

    We study quantum information inequalities and show that the basic inequality between the quantum variance and the metric adjusted skew information generates all the multi-operator matrix inequalities or Robertson-type determinant inequalities studied by a number of authors. We introduce an order relation on the set of functions representing quantum Fisher information that turns the set into a lattice with an involution. This order structure generates new inequalities for the metric adjusted skew informations. In particular, the Wigner-Yanase skew information is the maximal skew information with respect to this order structure in the set of Wigner-Yanase-Dyson skew informations. Key words and phrases: quantum covariance, metric adjusted skew information, Robertson-type uncertainty principle, operator monotone function, Wigner-Yanase-Dyson skew information.
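
    For reference, the standard definitions of the quantities involved, the variance, the Wigner-Yanase skew information, and its Dyson generalization, together with the basic inequality the paper builds on:

        \mathrm{Var}_\rho(A) = \mathrm{Tr}\,\rho A^2 - (\mathrm{Tr}\,\rho A)^2,
        \qquad
        I_\rho(A) = -\tfrac{1}{2}\,\mathrm{Tr}\,[\rho^{1/2}, A]^2,

        I_\rho^{p}(A) = -\tfrac{1}{2}\,\mathrm{Tr}\big([\rho^{p}, A]\,[\rho^{1-p}, A]\big), \quad 0 < p < 1,
        \qquad
        0 \le I_\rho^{p}(A) \le \mathrm{Var}_\rho(A).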

    Removing non-stationary, non-harmonic external interference from gravitational wave interferometer data

    We describe a procedure to identify and remove a class of non-stationary and non-harmonic interference lines from gravitational wave interferometer data. These lines appear to be associated with the external electricity main supply, but their amplitudes are non-stationary and they do not appear at harmonics of the fundamental supply frequency. We find an empirical model able to represent coherently all the non-harmonic lines we have found in the power spectrum, in terms of an assumed reference signal of the primary supply input signal. If this signal is not available, it can be reconstructed from the same data by making use of the coherent line removal algorithm that we have described elsewhere. All these lines are broadened by frequency changes of the supply signal, and they corrupt significant frequency ranges of the power spectrum. The physical process that generates this interference is so far unknown, but it is highly non-linear and non-stationary. Using our model, we cancel the interference in the time domain by an adaptive procedure that should work regardless of the source of the primary interference. We have applied the method to laser interferometer data from the Glasgow prototype detector, where all the features we describe in this paper were observed. The algorithm has been tuned in such a way that the entire series of wide lines corresponding to the electrical interference is removed, leaving the spectrum clean enough to detect signals previously masked by them. Single-line signals buried in the interference can be recovered with at least 75% of their original signal amplitude. Comment: 14 pages, 5 figures, Revtex, psfig
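
    The paper's empirical model is specific to the supply-line interference, but the time-domain adaptive cancellation step it feeds can be illustrated with a generic LMS canceller. This is a sketch under the assumption that a reference signal is available; the function and parameter names are ours, not the paper's:

        import numpy as np

        def lms_cancel(data, reference, n_taps=64, mu=1e-3):
            """Subtract an adaptively estimated leakage of 'reference' from
            'data' (float arrays of equal length). A generic LMS canceller,
            standing in for the paper's empirical interference model."""
            w = np.zeros(n_taps)
            out = np.copy(data)
            for n in range(n_taps, len(data)):
                x = reference[n - n_taps:n][::-1]  # most recent samples first
                y = w @ x                          # current interference estimate
                e = data[n] - y                    # cleaned sample (error signal)
                w += 2 * mu * e * x                # LMS weight update
                out[n] = e
            return out

    The step size mu trades convergence speed against misadjustment and must be kept small relative to the inverse power of the reference for the update to remain stable.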

    Probabilistic models of information retrieval based on measuring the divergence from randomness

    We introduce a framework for deriving probabilistic models of Information Retrieval. The models are nonparametric models of IR obtained in the language model approach. We derive term-weighting models by measuring the divergence of the actual term distribution from that obtained under a random process. Among the random processes we study the binomial distribution and Bose-Einstein statistics. We define two types of term frequency normalization for tuning term weights in the document-query matching process. The first normalization assumes that documents have the same length and measures the information gain of the observed term once it has been accepted as a good descriptor of the observed document. The second normalization is related to the document length and to other statistics. These two normalization methods are applied to the basic models in succession to obtain weighting formulae. Results show that our framework produces different nonparametric models forming baseline alternatives to the standard tf-idf model.
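
    As a concrete instance of how a randomness model and the two normalizations compose, here is a sketch of one such weight: a geometric Bose-Einstein randomness model with a Laplace first normalization and a logarithmic length normalization. The function and parameter names, and the constant c, are our choices for illustration:

        import math

        def dfr_weight(tf, doc_len, avg_doc_len, term_coll_freq, n_docs, c=1.0):
            """Divergence-from-randomness term weight: geometric Bose-Einstein
            randomness model, Laplace after-effect, length normalization."""
            # Second normalization: rescale tf to a standard document length.
            tfn = tf * math.log2(1.0 + c * avg_doc_len / doc_len)
            # Randomness model: -log2 of the probability of tfn occurrences
            # by chance, with lam the mean term frequency in the collection.
            lam = term_coll_freq / n_docs
            inf1 = (-math.log2(1.0 / (1.0 + lam))
                    - tfn * math.log2(lam / (1.0 + lam)))
            # First normalization (Laplace): gain from one more occurrence.
            inf2 = 1.0 / (tfn + 1.0)
            return inf1 * inf2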

    Energy-Momentum Complex in Møller's Tetrad Theory of Gravitation

    Møller's Tetrad Theory of Gravitation is examined with regard to the energy-momentum complex. The energy-momentum complex, as well as the superpotential associated with Møller's theory, is derived. Møller's field equations are solved in the case of spherical symmetry. Two different solutions, giving rise to the same metric, are obtained. The energy associated with one solution is found to be twice the energy associated with the other. Some suggestions for resolving this inconsistency are discussed at the end of the paper. Comment: LaTeX2e with AMS-LaTeX 1.2, 13 pages
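
    Complexes of this kind share a common structure: the complex is the divergence of a superpotential antisymmetric in its last two indices, which makes local conservation and the surface-integral form of the energy automatic (generic form; the paper derives the specific superpotential of Møller's tetrad theory):

        \Theta_\mu{}^{\nu} = \partial_\lambda\, U_\mu{}^{\nu\lambda},
        \qquad
        U_\mu{}^{\nu\lambda} = -\,U_\mu{}^{\lambda\nu}
        \;\Rightarrow\; \partial_\nu\, \Theta_\mu{}^{\nu} = 0,
        \qquad
        P_\mu = \int \Theta_\mu{}^{0}\, d^3x = \oint_S U_\mu{}^{0k}\, dS_k .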

    Transport theory yields renormalization group equations

    We show that dissipative transport and renormalization can be described in a single theoretical framework. The appropriate mathematical tool is the Nakajima-Zwanzig projection technique. We illustrate our result in the case of interacting quantum gases, where we use the Nakajima-Zwanzig approach to investigate the renormalization group flow of the effective two-body interaction. Comment: 11 pages, REVTeX, twocolumn, no figures; revised version with additional examples, to appear in Phys. Rev.
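
    The projection technique referred to splits the density operator into relevant and irrelevant parts with a projector P (and Q = 1 - P), yielding the exact Nakajima-Zwanzig equation for the relevant part (standard form, assuming Q\rho(0) = 0):

        \frac{d}{dt}\, P\rho(t) = P\mathcal{L}\, P\rho(t)
        + \int_0^t ds\; P\mathcal{L}\, e^{(t-s)\, Q\mathcal{L}}\, Q\mathcal{L}\, P\rho(s),
        \qquad
        \mathcal{L}\,\cdot = -\,i\,[H, \,\cdot\,],

    and it is from the memory kernel of this equation that the flow of effective couplings, such as the two-body interaction studied in the paper, can be extracted.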