Resource Letter: Gravitational Lensing
This Resource Letter provides a guide to a selection of the literature on
gravitational lensing and its applications. Journal articles, books, popular
articles, and websites are cited for the following topics: foundations of
gravitational lensing, foundations of cosmology, history of gravitational
lensing, strong lensing, weak lensing, and microlensing.
Comment: Resource Letter, 2012, in press
(http://ajp.dickinson.edu/Readers/resLetters.html); 21 pages, no figures;
diigo version available at
http://groups.diigo.com/group/gravitational-lensin
Bounding inconsistency using a novel threshold metric for dead reckoning update packet generation
Human-to-human interaction across distributed applications requires that sufficient consistency be maintained among participants in the face of network characteristics such as latency and limited bandwidth. The level of inconsistency arising from the network is proportional to the network delay, and thus a function of bandwidth consumption. Distributed simulation has often used a bandwidth reduction technique known as dead reckoning, which combines approximation and estimation in the communication of entity movement to reduce network traffic and thus improve consistency. However, unless carefully tuned to application and network characteristics, such an approach can introduce more inconsistency than it avoids. The key tuning metric is the distance threshold. This paper questions the suitability of the standard distance threshold as a metric for use in the dead reckoning scheme. Using a model relating entity path curvature and inconsistency, a major performance-related limitation of the distance threshold technique is highlighted. We then propose an alternative time-space threshold criterion. The time-space threshold is demonstrated, through simulation, to perform better for low-curvature movement. However, it too has a limitation. Based on this, we further propose a novel hybrid scheme. Through simulation and live trials, this scheme is shown to perform well across a range of curvature values, and places bounds on both the spatial and absolute inconsistency arising from dead reckoning.
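The thresholding logic described in this abstract can be sketched in a few lines. The following is a minimal illustration, not the authors' exact formulation: the class name, parameter names, and the particular choice of integrating spatial error over time as the time-space measure are assumptions for the sake of the sketch. An update packet is generated when either the instantaneous spatial error exceeds a distance limit (the standard scheme) or the error integrated over time exceeds a time-space limit (bounding the absolute inconsistency that a pure distance threshold cannot).

```python
import math


class HybridThreshold:
    """Hybrid dead-reckoning update trigger (illustrative sketch).

    Generates an update when either:
      (a) the spatial error between the entity's true position and the
          receiver's dead-reckoned (extrapolated) position exceeds
          `space_limit`, or
      (b) the spatial error integrated over time (a time-space measure,
          e.g. metre-seconds) exceeds `timespace_limit`.
    """

    def __init__(self, space_limit: float, timespace_limit: float):
        self.space_limit = space_limit
        self.timespace_limit = timespace_limit
        self.accumulated = 0.0  # integral of error over time since last update

    def update(self, true_pos, predicted_pos, dt: float) -> bool:
        """Return True if an update packet should be sent this tick."""
        err = math.dist(true_pos, predicted_pos)
        self.accumulated += err * dt  # rectangle-rule integration of error
        if err > self.space_limit or self.accumulated > self.timespace_limit:
            self.accumulated = 0.0  # receiver state resyncs after an update
            return True
        return False
```

With this sketch, a large instantaneous deviation (high-curvature movement) triggers the spatial test immediately, while a small but persistent deviation (low-curvature drift, invisible to a distance-only threshold) eventually trips the accumulated time-space test.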
Exploring the use of local consistency measures as thresholds for dead reckoning update packet generation
Human-to-human interaction across distributed applications requires that sufficient consistency be maintained among participants in the face of network characteristics such as latency and limited bandwidth. Techniques and approaches for reducing bandwidth usage can minimize network delays by reducing network traffic and therefore better exploiting available bandwidth. However, these approaches induce inconsistencies within the level of human perception. Dead reckoning is a well-known technique for reducing the number of update packets transmitted between participating nodes. It employs a distance threshold for deciding when to generate update packets. This paper questions the use of such a distance threshold in the context of absolute consistency and highlights a major drawback of the technique. An alternative threshold criterion based on time and distance is examined and compared to the distance-only threshold. A drawback with this proposed technique is also identified, and a hybrid threshold criterion is then proposed. However, the trade-off between spatial and temporal inconsistency remains.
Scaling Symmetries of Scatterers of Classical Zero-Point Radiation
Classical radiation equilibrium (the blackbody problem) is investigated by
the use of an analogy. Scaling symmetries are noted for systems of classical
charged particles moving in circular orbits in central potentials V(r)=-k/r^n
when the particles are held in uniform circular motion against radiative
collapse by a circularly polarized incident plane wave. Only in the case of a
Coulomb potential n=1 with fixed charge e is there a unique scale-invariant
spectrum of radiation versus frequency (analogous to zero-point radiation)
obtained from the stable scattering arrangement. These results suggest that
non-electromagnetic potentials are not appropriate for discussions of classical
radiation equilibrium.
Comment: 13 pages
Luminous Satellites II: Spatial Distribution, Luminosity Function and Cosmic Evolution
We infer the normalization and the radial and angular distributions of the
number density of satellites of massive galaxies between redshifts 0.1 and
0.8 as a function
of host stellar mass, redshift, morphology and satellite luminosity. Exploiting
the depth and resolution of the COSMOS HST images, we detect satellites up to
eight magnitudes fainter than the host galaxies and as close as 0.3 (1.4)
arcseconds (kpc). Describing the number density profile of satellite galaxies
as a projected power law P(R) \propto R^{\gamma_p}, we find
\gamma_p = -1.1 \pm 0.3. We find no dependence of \gamma_p on host stellar mass,
redshift, morphology or satellite luminosity. Satellites of early-type hosts
have angular distributions that are more flattened than the host light profile
and are aligned with its major axis. No significant average alignment is
detected for satellites of late-type hosts. The number of satellites within a
fixed magnitude contrast from a host galaxy is dependent on its stellar mass,
with more massive galaxies hosting significantly more satellites. Furthermore,
high-mass late-type hosts have significantly fewer satellites than early-type
galaxies of the same stellar mass, likely a result of environmental
differences. No significant evolution in the number of satellites per host is
detected. The cumulative luminosity function of satellites is qualitatively in
good agreement with that predicted using subhalo abundance matching techniques.
However, there are significant residual discrepancies in the absolute
normalization, suggesting that properties other than the host galaxy luminosity
or stellar mass determine the number of satellites.
Comment: 23 pages, 12 figures, accepted for publication in the Astrophysical
Journal
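The projected power law reported in this abstract implies a simple closed form for the cumulative radial distribution of satellites. The sketch below is illustrative only: the truncation radii `R_min` and `R_max` are hypothetical placeholders, not values from the paper. Integrating a surface density P(R) ∝ R^γ over annuli, N(<R) ∝ ∫ R^γ · 2πR dR ∝ R^(γ+2) for γ ≠ -2, so with γ ≈ -1.1 the enclosed count grows as R^0.9.

```python
def satellite_fraction_within(R: float, gamma: float = -1.1,
                              R_min: float = 1.4, R_max: float = 100.0) -> float:
    """Fraction of satellites projected within radius R (same units as
    R_min/R_max, e.g. kpc), assuming a projected power-law surface
    density P(R) proportional to R**gamma, truncated to [R_min, R_max].

    Integrating P(R) * 2*pi*R dR gives N(<R) proportional to
    R**(gamma + 2) for gamma != -2.  R_min and R_max here are
    illustrative assumptions, not values from the paper.
    """
    a = gamma + 2.0  # exponent of the cumulative profile (0.9 for gamma = -1.1)
    return (R**a - R_min**a) / (R_max**a - R_min**a)
```

Because the cumulative exponent 0.9 is close to 1, the enclosed satellite count rises nearly linearly with projected radius, i.e. the profile is only slightly more centrally concentrated than a uniform R^-1 surface density.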
One-man electrochemical air revitalization system evaluation
A program to evaluate the performance of a one-man-capacity, self-contained electrochemical air revitalization system was successfully completed. The technology readiness of this concept was demonstrated by characterizing the performance of this one-man system over wide ranges of cabin atmospheric conditions. The electrochemical air revitalization system consists of a water vapor electrolysis module to generate oxygen from water vapor in the cabin air, and an electrochemical depolarized carbon dioxide concentrator module to remove carbon dioxide from the cabin air. A control/monitor instrumentation package, which uses power generated by the electrochemical depolarized concentrator module to partially offset the water vapor electrolysis module power requirements, and various structural fluid routing components are also part of the system. The system was designed to meet the one-man metabolic oxygen generation and carbon dioxide removal requirements, thereby controlling cabin partial pressure of oxygen at 22 kN/sq m and cabin partial pressure of carbon dioxide at 400 N/sq m over a wide range of cabin air relative humidity conditions.
A Census of X-ray gas in NGC 1068: Results from 450ks of Chandra HETG Observation
We present models for the X-ray spectrum of the Seyfert 2 galaxy NGC 1068.
These are fitted to data obtained using the High Energy Transmission Grating
(HETG) on the Chandra X-ray observatory. The data show line and radiative
recombination continuum (RRC) emission from a broad range of ions and elements.
The models explore the importance of excitation processes for these lines
including photoionization followed by recombination, radiative excitation by
absorption of continuum radiation and inner shell fluorescence. The models show
that the relative importance of these processes depends on the conditions in
the emitting gas, and that no single emitting component can fit the entire
spectrum. In particular, the relative importance of radiative excitation and
photoionization/recombination differs according to the element and ion stage
emitting the line. This in turn implies a diversity of values for the
ionization parameter of the various components of gas responsible for the
emission, with log(xi) ranging from 1 to 3. Using this, we obtain an estimate
for
the total amount of gas responsible for the observed emission. The mass flux
through the region included in the HETG extraction region is approximately 0.3
Msun/yr assuming ordered flow at the speed characterizing the line widths. This
can be compared with what is known about this object from other techniques.
Comment: 39 pages, 12 figures, Ap. J. in press
Predicting pilot error on the flight deck: Validation of a new methodology and a multiple methods and analysts approach to enhancing error prediction sensitivity
The Human Error Template (HET) is a recently developed methodology for predicting design-induced pilot error. This article describes a validation study undertaken to compare the performance of HET against three contemporary Human Error Identification (HEI) approaches when used to predict pilot errors for an approach and landing task, and also to compare individual analyst error predictions to an approach for enhancing error prediction sensitivity: the multiple analysts and methods approach, whereby multiple analyst predictions using a range of HEI techniques are pooled. The findings indicate that, of the four methodologies used in isolation, analysts using the HET methodology offered the most accurate error predictions, and that the multiple analysts and methods approach was more successful overall in terms of error prediction sensitivity than the three other methods, but not the HET approach. The results suggest that when predicting design-induced error, it is appropriate to use domain-specific approaches, and also a toolkit of different HEI approaches and multiple analysts, in order to heighten error prediction sensitivity.
Radiative Properties of the Stueckelberg Mechanism
We examine the mechanism for generating a mass for a U(1) vector field
introduced by Stueckelberg. First, it is shown that renormalization of the
vector mass is identical to the renormalization of the vector field on account
of gauge invariance. We then consider how the vector mass affects the effective
potential in scalar quantum electrodynamics at one-loop order. The possibility
of extending this mechanism to couple, in a gauge invariant way, a charged
vector field to the photon is discussed.
Comment: 8 pages, new Introduction, added References
Performance, Politics and Media: How the 2010 British General Election leadership debates generated ‘talk’ amongst the electorate.
During the 2010 British General Election a major innovation was introduced, in part to improve engagement: a series of three live televised leadership debates took place in which the leader of each of the three main parties, Labour, Liberal Democrat and Conservative, answered questions posed by members of the public and subsequently debated issues pertinent to the questions. In this study we consider these potentially ground-breaking debates as the kind of event that was likely to generate discussion. We investigate various aspects of the 'talk' that emerged as a result of watching the debates. As an exploratory study concerned with situated accounts of the participants' experiences, we take an interpretive perspective. In this paper we outline the meta-narratives (of talk) associated with the viewing of the leadership debates that were identified, concluding our analysis by suggesting that putting a live debate on television and promoting and positioning it as a major innovation is likely to determine how the audience makes sense of it: as a media event.