5,042 research outputs found

    Gravitational Repulsion within a Black-Hole using the Stueckelberg Quantum Formalism

    Full text link
    We study an application of Stueckelberg's relativistic quantum theory in the framework of general relativity, examining the form of the wave equation of a massive body in the presence of a Schwarzschild gravitational field. We also treat the mathematical behavior of the wavefunction around and beyond the horizon (r=2M). Classically, within the horizon, the time component of the metric becomes spacelike and the distance from the origin singularity becomes timelike, suggesting an inevitable propagation of all matter within the horizon toward a total collapse at r=0. The quantum description of the wave function, however, provides a different understanding of the behavior of matter within the horizon. We find that a test particle can almost never be found at the origin and is more likely to be found at the horizon. Matter outside the horizon has a very small wavelength, and therefore interference effects arise only on a very small, atomic scale. Within the horizon, however, matter becomes totally "tachyonic" and is potentially "spread" over all space. Small location uncertainties on the atomic scale become large around the horizon, and different mass components of the wave function can therefore interfere on a stellar scale. This interference phenomenon, in which the probability of finding matter decreases with distance from the horizon, appears as an effective gravitational repulsion. Comment: 20 pages, 6 figures
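
    For orientation, the classical picture referred to above follows from the standard Schwarzschild line element (a textbook formula in units G = c = 1, quoted here for context rather than taken from the paper):

        ds^2 = -\left(1 - \frac{2M}{r}\right) dt^2 + \left(1 - \frac{2M}{r}\right)^{-1} dr^2 + r^2\left(d\theta^2 + \sin^2\theta\, d\phi^2\right)

    For r < 2M the factor (1 - 2M/r) changes sign, so the t-direction becomes spacelike and the r-direction timelike; this is the classical "inevitable collapse" behavior that the Stueckelberg wave-function treatment reinterprets.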

    Improvements in estimating proportions of objects from multispectral data

    Get PDF
    Methods for estimating the proportions of objects and materials imaged within the instantaneous field of view of a multispectral sensor were developed further. Improvements in the basic proportion estimation algorithm were devised, as well as improved alien-object detection procedures. A simplified signature set analysis scheme was also introduced for determining whether the signature set geometry is adequate for satisfactory proportion estimation. Averaging procedures used in conjunction with the mixtures algorithm were examined theoretically and applied to artificially generated multispectral data. A computationally simpler estimator was considered and found unsatisfactory. Experiments conducted to find a suitable procedure for setting the alien-object threshold yielded few definitive results. Mixtures procedures were used on a limited amount of ERTS data to estimate wheat proportions in selected areas; the results were unsatisfactory, partly because of the ill-conditioned nature of the pure signature set.
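
    For context, a generic linear mixture model of the kind underlying such proportion estimation can be written as follows (illustrative notation, not taken from the report):

        r = \sum_{i=1}^{k} p_i\, s_i + e, \qquad \sum_{i=1}^{k} p_i = 1, \quad p_i \ge 0

    Here r is the observed multispectral vector for a resolution element, the s_i are the pure signature spectra, the p_i are the proportions to be estimated (for example by constrained least squares), and e is noise. When the signatures are nearly linearly dependent the estimation problem is ill-conditioned, which is consistent with the unsatisfactory wheat-proportion results noted above.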

    Changes in zebrafish (Danio rerio) lens crystallin content during development.

    Get PDF
    Purpose: The roles that crystallin proteins play during lens development are not well understood. Similarities in the adult crystallin composition of mammalian and zebrafish lenses have made the latter a valuable model for examining lens function. In this study, we describe the changing zebrafish lens proteome during development to identify ontogenetic shifts in crystallin expression that may provide insights into age-specific functions. Methods: Two-dimensional gel electrophoresis and size exclusion chromatography were used to characterize the lens crystallin content of 4.5-day to 27-month-old zebrafish. Protein spots were identified with mass spectrometry and comparisons with previously published proteomic maps, and quantified with densitometry. Constituents of size exclusion chromatography elution peaks were identified with sodium dodecyl sulfate-polyacrylamide gel electrophoresis. Results: Zebrafish lens crystallins were expressed in three ontogenetic patterns, with some crystallins produced at relatively constant levels throughout development, others expressed primarily before 10 weeks of age (βB1-, βA1-, and γN2-crystallins), and a third group primarily after 10 weeks (α-, βB3-, and γS-crystallins). Alpha-crystallins comprised less than 1% of total lens protein in 4.5-day lenses and increased to less than 7% in adult lenses. The developmental period between 6 weeks and 4 months contained the most dramatic shifts in lens crystallin expression. Conclusions: These data provide the first two-dimensional gel electrophoresis maps of the developing zebrafish lens, with quantification of changing crystallin abundance and visualization of post-translational modification. The results suggest that some crystallins may play stage-specific roles during lens development. The low levels of zebrafish lens α-crystallin relative to mammals may be due to the high concentrations of γ-crystallins in this aquatic lens. Similarities with mammalian crystallin expression continue to support the use of the zebrafish as a model for lens crystallin function.

    Hypercomplex quantum mechanics

    Full text link
    The fundamental axioms of the quantum theory do not explicitly identify the algebraic structure of the linear space for which orthogonal subspaces correspond to the propositions (equivalence classes of physical questions). The projective geometry of the weakly modular orthocomplemented lattice of propositions may be imbedded in a complex Hilbert space; this is the structure which has traditionally been used. This paper reviews some work which has been devoted to generalizing the target space of this imbedding to Hilbert modules of a more general type. In particular, a detailed discussion is given of the simplest generalization of the complex Hilbert space, that of the quaternion Hilbert module. Comment: Plain TeX, 11 pages
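
    As background to the quaternion Hilbert module mentioned above, recall the defining relations of the quaternion units (standard algebra, not a result of the review):

        i^2 = j^2 = k^2 = ijk = -1, \qquad ij = -ji = k

    Because such "scalars" q = q_0 + q_1 i + q_2 j + q_3 k do not commute, scalar multiplication must be applied consistently from one side, and the natural generalization of the complex Hilbert space is a module over the quaternion algebra rather than a vector space over a commutative field.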

    Generalized Boltzmann Equation in a Manifestly Covariant Relativistic Statistical Mechanics

    Get PDF
    We consider the relativistic statistical mechanics of an ensemble of N events with motion in space-time parametrized by an invariant "historical time" τ. We generalize to the relativistic case the approach of Yang and Yao, based on the Wigner distribution functions and the Bogoliubov hypotheses, for finding the approximate dynamical equation for the kinetic state of any nonequilibrium system, and obtain a manifestly covariant Boltzmann-type equation which is a relativistic generalization of the Boltzmann-Uehling-Uhlenbeck (BUU) equation for indistinguishable particles. This equation is then used to prove the H-theorem for evolution in τ. In the equilibrium limit, the covariant forms of the standard statistical mechanical distributions are obtained. We introduce two-body interactions by means of the direct action potential V(q), where q is an invariant distance in Minkowski space-time. The two-body correlations are taken to have support in a relative O(2,1)-invariant subregion of the full spacelike region. The expressions for the energy density and pressure are obtained and shown to have the same forms (in terms of an invariant distance parameter) as those of the nonrelativistic theory and to provide the correct nonrelativistic limit.
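
    For reference, the nonrelativistic Boltzmann-Uehling-Uhlenbeck equation being generalized has the schematic form below (standard notation; the paper's manifestly covariant, τ-parametrized version is not reproduced here):

        \frac{\partial f_1}{\partial t} + \frac{\mathbf{p}_1}{m}\cdot\nabla_{\mathbf{x}} f_1 - \nabla_{\mathbf{x}} U \cdot \nabla_{\mathbf{p}_1} f_1 = \int d\Gamma\, W(12 \to 34)\left[ f_3 f_4 (1 \pm f_1)(1 \pm f_2) - f_1 f_2 (1 \pm f_3)(1 \pm f_4) \right]

    Here W is the two-body transition rate, dΓ the measure over the remaining momenta, U a mean field, and the (1 ± f) factors implement quantum statistics for indistinguishable particles (Pauli blocking for fermions, enhancement for bosons).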

    Chemical Measurement and Fluctuation Scaling

    Get PDF
    Main abstract: Fluctuation scaling reports on all processes producing a data set. Some fluctuation scaling relationships, such as the Horwitz curve, follow exponential dispersion models, which have useful properties. The mean-variance method applied to Poisson-distributed data is a special case of these properties that allows the gain of a system to be measured. Here, a general method is described for investigating gain (G), dispersion (β), and process (α) in any system whose fluctuation scaling follows a simple exponential dispersion model, a segmented exponential dispersion model, or complex scaling following such a model locally. When gain and dispersion cannot be obtained directly, relative parameters, G_R and β_R, may be used. The method was demonstrated on data sets conforming to simple, segmented, and complex scaling. These included mass, fluorescence intensity, and absorbance measurements, and specifications for classes of calibration weights. Changes in gain, dispersion, and process were observed in the scaling of these data sets in response to instrument parameters, photon fluxes, mathematical processing, and calibration weight class. The process parameter, which limits the type of statistical process that can be invoked to explain a data set, typically exhibited values between 0 and 4. With two exceptions, calibration weight class definitions affected only β. Adjusting the photomultiplier voltage while measuring fluorescence intensity changed all three parameters (0 < α < 0.8; 0 < β_R < 3; 0 < G_R < 4.1). The method provides a framework for calibrating and interpreting uncertainty in chemical measurement, allowing robust comparison of specific instruments, conditions, and methods.

    Supporting information abstract: On first inspection, fluctuation scaling data may appear to approximate a straight line when log transformed. The data presented in figure 5 of the main text give a reasonable approximation to a straight line, and for many purposes this would be sufficient. The purpose of the study of fluorescence intensity was to determine whether adjusting the voltage of a photomultiplier tube while measuring a fluorescent sample changes the process (α), the dispersion (β), and/or the gain (G). In this regard, the linear model established that the PMT setting affects more than the gain. However, a detailed analysis beginning with testing for model mis-specification provides additional information. Specifically, Poisson behavior is seen only over a limited wavelength range in the 600 V and 700 V data sets.
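
    In generic notation (not necessarily the paper's exact symbols), the fluctuation scaling relationships discussed above relate the variance of replicate measurements to their mean through a power law:

        \sigma^2 = \beta\,\mu^{\alpha}

    The Poisson special case mentioned above corresponds to a detector that scales counts n by a gain G: the signal y = G n has mean μ = G⟨n⟩ and variance σ² = G²⟨n⟩ = G μ, i.e. α = 1 and β = G, which is the sense in which the mean-variance method measures gain.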

    Intertwined strands for ecology in planetary health

    Get PDF
    Ecology is both blessed and burdened by romanticism, with a legacy that is multi-edged for health. The prefix ‘eco-’ can carry cultural and political (subversive) baggage associated with motivating environmental activism. Ecology is also practiced as a technical ‘science’, with quantitative and deterministic leanings and a biophysical emphasis. A challenge for planetary health is to avoid lapsing into, or rejecting, either position. A related opportunity is to adopt ecological thought that offers a rich entrance to understanding living systems: a relationality of connectedness, interdependence, and reciprocity for understanding health in a complex and uncertain world. Planetary health offers a global-scale framing; we regard its potential as equivalent to the degree to which it can embrace ecological thought at its core and develop its own political narrative.

    Galilean limit of equilibrium relativistic mass distribution for indistinguishable events

    Full text link
    The relativistic distribution for indistinguishable events is considered in the mass-shell limit m^2 ≅ M^2, where M is a given intrinsic property of the events. The characteristic thermodynamic quantities are calculated and subjected to the zero-mass and high-temperature limits. The results are shown to be in agreement with the corresponding expressions of an on-mass-shell relativistic kinetic theory. The Galilean limit c → ∞, which coincides in form with the low-temperature limit, is considered. The theory is shown to pass over to a nonrelativistic statistical mechanics of indistinguishable particles. Comment: Report TAUP-2136-9
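
    As a standard illustration of the limit in question (not the paper's derivation), the on-shell energy of a particle of mass m expands for c → ∞ as

        E = \sqrt{p^2 c^2 + m^2 c^4} = mc^2 + \frac{p^2}{2m} + O(c^{-2})

    Once the constant rest-energy term is absorbed into the normalization (or chemical potential), the equilibrium distribution reduces to the familiar nonrelativistic form, which is the sense in which the theory passes over to a nonrelativistic statistical mechanics of indistinguishable particles.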

    Precise Null Pointer Analysis Through Global Value Numbering

    Full text link
    Precise analysis of pointer information plays an important role in many static analysis techniques and tools today. The precision, however, must be balanced against the scalability of the analysis. This paper focuses on improving the precision of standard context- and flow-insensitive alias analysis algorithms at a low scalability cost. In particular, we present a semantics-preserving program transformation that drastically improves the precision of existing analyses when deciding whether a pointer can alias NULL. Our program transformation is based on Global Value Numbering, a scheme inspired by the compiler optimizations literature. It allows even a flow-insensitive analysis to make use of branch conditions, such as a check of whether a pointer is NULL, and thereby gain precision. We perform experiments on real-world code to measure the overhead of performing the transformation and the improvement in the precision of the analysis. We show that the precision improves from 86.56% to 98.05%, while the overhead is insignificant. Comment: 17 pages, 1 section in Appendix
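
    To make the idea concrete, the sketch below (illustrative C, not the paper's tool output; the fresh variable name is hypothetical) shows the kind of rewrite involved: uses of a pointer guarded by a NULL check are routed through a fresh name whose only definition is the checked value, so that even an analysis that ignores control flow can associate the non-NULL fact with that name.

        #include <stddef.h>
        #include <stdio.h>

        /* Before the transformation: a flow-insensitive analysis merges every use
         * of p, so it cannot exploit the guarding NULL check and conservatively
         * reports the dereference as a possible NULL dereference. */
        static void print_first_before(const char *p) {
            if (p != NULL) {
                printf("%c\n", *p);
            }
        }

        /* After an illustrative GVN-style rewrite: the guarded dereference goes
         * through a fresh name (hypothetical, introduced only for this sketch)
         * defined solely under the check, so the non-NULL fact can be attached
         * to that name even by a flow-insensitive analysis. */
        static void print_first_after(const char *p) {
            if (p != NULL) {
                const char *p_checked = p;  /* single definition, dominated by the NULL check */
                printf("%c\n", *p_checked);
            }
        }

        int main(void) {
            print_first_before("a");
            print_first_after("b");
            return 0;
        }

    Because the rewrite only introduces copies, it preserves semantics while giving the value-numbering scheme distinct names to which branch facts can be attached.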