Time delays for 11 gravitationally lensed quasars revisited
We test the robustness of published time delays for 11 lensed quasars by
using two techniques to measure time shifts in their light curves.
We use two fundamentally different techniques to determine time delays in
gravitationally lensed quasars: a method based on fitting a numerical model,
and one derived from the minimum-dispersion method introduced by Pelt and
collaborators. To analyse our sample in a homogeneous way and to avoid bias
from the choice of method, we apply both techniques to 11 lensed systems for
which delays have been published: JVAS B0218+357, SBS 0909+523, RX J0911+0551,
FBQS J0951+2635, HE 1104-1805, PG 1115+080, JVAS B1422+231, SBS 1520+530,
CLASS B1600+434, CLASS B1608+656, and HE 2149-2745.
Time delays for three double lenses, JVAS B0218+357, HE 1104-1805, and CLASS
B1600+434, and for the quadruply lensed quasar CLASS B1608+656, are confirmed
within the error bars. We correct the delay for SBS 1520+530. For PG 1115+080
and RX J0911+0551, we reveal a second solution in addition to the published
delay. The time delays in four systems, SBS 0909+523, FBQS J0951+2635, JVAS
B1422+231, and HE 2149-2745, prove to be less reliable than previously
claimed.
To derive an estimate of H_0 from time delays in gravitationally lensed
quasars, we need better light curves for most of these systems, so that the
delays can be measured with higher accuracy and robustness.
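The minimum-dispersion idea can be sketched simply: shift one light curve by a trial delay, merge the two curves in time, and sum the squared differences between time-adjacent flux values; the trial delay that minimizes this dispersion is the estimate. The following Python sketch is illustrative only (the function names are ours, and a real analysis would also fit a magnitude ratio between images, model microlensing trends, and weight point pairs by their time separation and photometric errors):

```python
import numpy as np

def dispersion(t_a, f_a, t_b, f_b, delay):
    """Pelt-style dispersion: shift curve B back by the trial delay,
    merge both curves in time, and sum squared differences between
    time-adjacent flux values. A smooth merged curve (low dispersion)
    means the trial delay aligns the two images."""
    t = np.concatenate([t_a, t_b - delay])
    f = np.concatenate([f_a, f_b])
    f = f[np.argsort(t)]
    return np.sum(np.diff(f) ** 2)

def estimate_delay(t_a, f_a, t_b, f_b, trial_delays):
    """Return the trial delay that minimizes the dispersion."""
    d = [dispersion(t_a, f_a, t_b, f_b, tau) for tau in trial_delays]
    return trial_delays[int(np.argmin(d))]
```

With noiseless, well-sampled curves this recovers the input delay to the resolution of the trial grid; the second, model-fitting technique mentioned above instead interpolates the curves with a numerical model before comparing them.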
The Galactic Faraday depth sky revisited
The Galactic Faraday depth sky is a tracer for both the Galactic magnetic
field and the thermal electron distribution. It has been previously
reconstructed from polarimetric measurements of extra-galactic point sources.
Here, we improve on these works by using an updated inference algorithm as well
as by taking into account the free-free emission measure map from the Planck
survey. In the future, the data situation will improve drastically with the
next generation Faraday rotation measurements from SKA and its pathfinders.
Anticipating this, the aim of this paper is to update the map reconstruction
method with the latest development in imaging based on information field
theory. We demonstrate the validity of the new algorithm by applying it to the
Oppermann et al. (2012) data compilation and compare our results to the
previous map. Despite using the same data set as before, a number of novel
findings are made: A non-parametric reconstruction of an overall amplitude
field resembles the free-free emission measure map of the Galaxy. Folding this
free-free map into the analysis allows for more detailed predictions. The joint
inference enables us to identify regions with deviations from the assumed
correlations between the free-free and Faraday data, thereby pointing us to
Galactic structures with distinguishably different physics. We find, for
example, evidence for an alignment of the magnetic field within the lines of
sight along both directions of the Orion arm.
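For context, the basic observable behind a Faraday depth map is the rotation of a source's polarization angle with wavelength squared, chi(lambda^2) = chi_0 + phi * lambda^2, where phi is the Faraday depth. A minimal least-squares sketch of recovering phi for a single source follows; it is purely illustrative (it ignores the n*pi ambiguity of measured polarization angles and has nothing to do with the information-field-theory algorithm the paper actually uses):

```python
import numpy as np

def fit_faraday_depth(lambda_sq, chi):
    """Least-squares fit of chi = chi0 + phi * lambda^2 for one source.
    lambda_sq: observing wavelengths squared (m^2); chi: polarization
    angles (rad). Ignores the n*pi angle ambiguity of real data."""
    A = np.vstack([np.ones_like(lambda_sq), lambda_sq]).T
    coef, *_ = np.linalg.lstsq(A, chi, rcond=None)
    chi0, phi = coef
    return chi0, phi
```

A full-sky reconstruction as in the paper must combine many such noisy, ambiguous per-source measurements into a spatially correlated field, which is where the Bayesian imaging machinery comes in.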
Weak and Semi-Strong Form Stock Return Predictability Revisited
This paper makes indirect inference about the time-variation in expected stock returns by comparing unconditional sample variances to estimates of expected conditional variances. The evidence reveals more predictability as more information is used and shows no sign that predictability has diminished in recent years. The semi-strong form evidence suggests that time-variation in expected returns remains economically important.
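The comparison rests on the law of total variance: Var(r) = E[Var(r | I)] + Var(E[r | I]), so the gap between the unconditional variance of returns and the average conditional variance measures the variance of expected returns, i.e. predictability. A toy simulation (not the paper's estimator) illustrating the identity:

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulate returns r_t = mu_t + eps_t with a slow-moving expected return mu_t.
n = 100_000
mu = 0.02 * np.sin(np.linspace(0.0, 40.0 * np.pi, n))  # time-varying E[r_t | I_t]
r = mu + rng.normal(0.0, 0.05, n)                       # eps_t ~ N(0, 0.05^2)

unconditional_var = r.var()
expected_conditional_var = 0.05 ** 2  # known by construction in this toy model

# Share of return variance attributable to time-varying expected returns:
# 1 - E[Var(r|I)]/Var(r) = Var(E[r|I])/Var(r) by the law of total variance.
predictable_share = 1.0 - expected_conditional_var / unconditional_var
```

In practice the conditional variance is unknown and must be estimated from conditioning information, which is why using richer (semi-strong form) information sets can reveal more predictability.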
Lower Bounds for Oblivious Near-Neighbor Search
We prove an $\Omega(d \cdot \log n / (\log \log n)^2)$ lower bound on the
dynamic cell-probe complexity of statistically oblivious
approximate-near-neighbor search (ANN) over the $d$-dimensional Hamming cube.
For the natural setting of $d = \Theta(\log n)$, our result implies an
$\tilde{\Omega}(\log^2 n)$ lower bound, which is a quadratic improvement over
the highest (non-oblivious) cell-probe lower bound for ANN. This is the first
super-logarithmic lower bound for ANN against general (non black-box) data
structures. We also show that any oblivious data structure for decomposable
search problems (like ANN) can be obliviously dynamized with $O(\log n)$
overhead in update and query time, strengthening a classic result of Bentley
and Saxe (Algorithmica, 1980).
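The Bentley-Saxe result referenced above turns any static structure for a decomposable search problem into a dynamic one: maintain static substructures of sizes 2^i, merge them on insertion like carries in a binary counter, and answer a query by combining the answers from all substructures. A minimal (non-oblivious) Python sketch, using sorted arrays as the static structure for one-dimensional nearest neighbor:

```python
import bisect

class BentleySaxe:
    """Bentley-Saxe logarithmic dynamization of a static sorted-array
    structure for a decomposable query (here: nearest neighbor on a line).
    Each level i holds either nothing or a sorted block of 2^i keys."""

    def __init__(self):
        self.levels = []  # levels[i] is None or a sorted list of length 2^i

    def insert(self, x):
        carry = [x]
        i = 0
        while True:
            if i == len(self.levels):
                self.levels.append(None)
            if self.levels[i] is None:
                self.levels[i] = carry  # carry settles at the first free level
                return
            # occupied level: merge and propagate, like a binary-counter carry
            carry = sorted(carry + self.levels[i])
            self.levels[i] = None
            i += 1

    def nearest(self, q):
        """Decomposability: the global nearest neighbor is the best of the
        per-level nearest neighbors."""
        best = None
        for level in self.levels:
            if not level:
                continue
            j = bisect.bisect_left(level, q)
            for k in (j - 1, j):  # closest candidates straddle position j
                if 0 <= k < len(level):
                    if best is None or abs(level[k] - q) < abs(best - q):
                        best = level[k]
        return best
```

Each key participates in O(log n) merges over its lifetime and each query touches O(log n) levels, which is the classic logarithmic overhead; the paper's contribution is showing this transformation can be made oblivious (memory-access patterns independent of the data) at the same cost.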