Post-mission Viking data analysis
Three Mars data analysis projects from the Viking Mars program were identified initially, and three more came into being as the work proceeded. Altogether, these six pertained to: (1) the vertical distribution of scattering particles in the Martian atmosphere at various locations in various seasons, (2) the physical parameters that define photometric properties of the Martian surface and atmosphere, (3) patterns of dust-cloud and global dust-storm development, (4) a direct comparison of near-simultaneous Viking and ground-based observations, (5) the annual formation and dissipation of polar frost caps, and (6) evidence concerning possible present-day volcanism or venting. A list of publications pertaining to each project is included.
A new source detection algorithm using FDR
The False Discovery Rate (FDR) method has recently been described by Miller
et al. (2001), along with several examples of astrophysical applications. FDR is
a new statistical procedure due to Benjamini and Hochberg (1995) for
controlling the fraction of false positives when performing multiple hypothesis
testing. The importance of this method to source detection algorithms is
immediately clear. To explore the possibilities offered we have developed a new
task for performing source detection in radio-telescope images, Sfind 2.0,
which implements FDR. We compare Sfind 2.0 with two other source detection and
measurement tasks, Imsad and SExtractor, and comment on several issues arising
from the nature of the correlation between nearby pixels and the necessary
assumption of the null hypothesis. We argue that implementing FDR as a
threshold-defining method in other existing source-detection tasks is
straightforward and worthwhile. We show that the constraint on
the fraction of false detections as specified by FDR holds true even for highly
correlated and realistic images. For the detection of true sources, which are
complex combinations of source-pixels, this constraint appears to be somewhat
less strict. It is still reliable enough, however, for a priori estimates of
the fraction of false source detections to be robust and realistic.
Comment: 17 pages, 7 figures, accepted for publication by A
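The underlying Benjamini-Hochberg step-up procedure is short enough to sketch directly. In the sketch below each pixel is assumed to carry a p-value computed under the null (background-only) hypothesis; the function name and interface are illustrative, not Sfind 2.0's actual API.

```python
import numpy as np

def fdr_threshold(p_values, alpha=0.05):
    """Benjamini-Hochberg step-up rule: find the largest sorted p-value
    p_(k) satisfying p_(k) <= (k/m) * alpha; rejecting every hypothesis
    with p <= p_(k) controls the expected false-discovery rate at alpha."""
    p = np.sort(np.asarray(p_values, dtype=float))
    m = p.size
    passed = np.nonzero(p <= alpha * np.arange(1, m + 1) / m)[0]
    if passed.size == 0:
        return 0.0  # nothing is significant
    return float(p[passed[-1]])

cutoff = fdr_threshold([0.001, 0.002, 0.4, 0.9], alpha=0.1)
```

Pixels whose p-values fall at or below the returned cutoff are flagged as source pixels; by construction the expected fraction of false flags stays at or below alpha.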
Rydberg transition frequencies from the Local Density Approximation
A method is given that extracts accurate Rydberg excitations from LDA density
functional calculations, despite the short-ranged potential. For the case of He
and Ne, the asymptotic quantum defects predicted by LDA are in less than 5%
error, yielding transition frequency errors of less than 0.1 eV.
Comment: 4 pages, 6 figures, submitted to Phys. Rev. Let
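The quantum defect enters through the standard Rydberg series formula E_n = -Ry/(n - delta)^2, so a defect error propagates directly into transition-frequency errors (and is suppressed roughly as 1/n^3 at high n). A minimal sketch of the generic formula only, not of the LDA extraction procedure itself:

```python
RYDBERG_EV = 13.605693  # Rydberg energy in eV

def level_energy(n, defect):
    """Energy of series member n relative to the ionization threshold (eV):
    E_n = -Ry / (n - defect)**2."""
    return -RYDBERG_EV / (n - defect) ** 2

def transition_energy(n_lo, n_hi, defect_lo, defect_hi):
    """Transition energy between two members of the Rydberg series (eV)."""
    return level_energy(n_hi, defect_hi) - level_energy(n_lo, defect_lo)
```

With defect = 0 this reduces to the hydrogenic series; nonzero defects shift each level downward in n, which is how the asymptotic defect extracted from the LDA potential fixes the predicted transition frequencies.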
De-biased Populations of Kuiper Belt Objects from the Deep Ecliptic Survey
The Deep Ecliptic Survey (DES) discovered hundreds of Kuiper Belt objects
from 1998-2005. Follow-up observations yielded 304 objects with good dynamical
classifications (Classical, Scattered, Centaur, or 16 mean-motion resonances
with Neptune). The DES search fields are well documented, enabling us to
calculate the probability of detecting objects with particular orbital
parameters and absolute magnitudes at a randomized point in each orbit.
Grouping objects by dynamical class, we estimate the orbital
element distributions (a, e, i) for the largest three classes (Classical, 3:2,
and Scattered) using maximum likelihood. Using H-magnitude as a proxy for the
object size, we fit a power law to the number of objects for 8 classes with at
least 5 detected members (246 objects). The best Classical slope is
alpha=1.02+/-0.01 (observed from 5<=H<=7.2). Six dynamical classes (Scattered
plus 5 resonances) are consistent in slope with the Classicals, though the
absolute number of objects is scaled. The exception to the power-law relation
is the Centaurs (non-resonant objects with perihelia inside Neptune's orbit, and thus
detectable at smaller sizes), with alpha=0.42+/-0.02 (7.5<H<11). This is
consistent with a knee in the H-distribution around H=7.2 as reported elsewhere
(Bernstein et al. 2004, Fraser et al. 2014). Based on the Classical-derived
magnitude distribution, the total number of objects (H<=7) in each class is:
Classical (2100+/-300 objects), Scattered (2800+/-400), 3:2 (570+/-80), 2:1
(400+/-50), 5:2 (270+/-40), 7:4 (69+/-9), 5:3 (60+/-8). The independent
estimate for the number of Centaurs in the same H range is 13+/-5. If instead
all objects are divided by inclination into "Hot" and "Cold" populations,
following Fraser et al. (2014), we find that alphaHot=0.90+/-0.02, while
alphaCold=1.32+/-0.02, in good agreement with that work.
Comment: 26 pages, emulateapj, 6 figures, 5 tables, accepted by A
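The maximum-likelihood slope fit can be sketched for the simplest case: a single power law, n(H) proportional to 10^(alpha*H), observed completely over a range [H_min, H_max]. The real DES analysis additionally weights each object by its calculated detection probability, which this toy version omits; all names are illustrative.

```python
import numpy as np

def log_likelihood(alpha, H, H_min, H_max):
    """Log-likelihood of slope alpha for n(H) ~ 10**(alpha*H) on [H_min, H_max]."""
    ln10 = np.log(10.0)
    norm = (10.0 ** (alpha * H_max) - 10.0 ** (alpha * H_min)) / (alpha * ln10)
    return np.sum(alpha * ln10 * H - np.log(norm))

def fit_slope(H, H_min, H_max):
    """Grid-search maximum likelihood for the power-law slope."""
    grid = np.linspace(0.2, 2.0, 721)
    ll = [log_likelihood(a, H, H_min, H_max) for a in grid]
    return float(grid[int(np.argmax(ll))])

# Sanity check: recover a known slope from simulated magnitudes
# (inverse-CDF sampling of the same truncated power law).
rng = np.random.default_rng(1)
a_true, lo, hi = 1.02, 5.0, 7.2
u = rng.uniform(size=20_000)
H_sim = np.log10(u * (10.0 ** (a_true * hi) - 10.0 ** (a_true * lo))
                 + 10.0 ** (a_true * lo)) / a_true
alpha_hat = fit_slope(H_sim, lo, hi)
```

Because the likelihood conditions on the observed H range, no binning is needed, which is what makes the fit usable for classes with as few as 5 detected members.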
Managing Risk of Bidding in Display Advertising
In this paper, we deal with the uncertainty of bidding for display
advertising. Similar to financial market trading, real-time bidding (RTB)
based display advertising employs an auction mechanism to automate the
impression-level media buying, and running a campaign is no different from an
investment of acquiring new customers in return for obtaining additional
converted sales. Thus, how to optimally bid on an ad impression to drive the
profit and return-on-investment becomes essential. However, the large
randomness of user behavior and the cost uncertainty caused by auction
competition introduce significant risk into the campaign performance
estimation. In this paper, we explicitly model the uncertainty of user
click-through rate estimation and auction competition to capture the risk. We
borrow an idea from finance and derive the value at risk for each ad display
opportunity. Our formulation results in two risk-aware bidding strategies that
penalize risky ad impressions and focus more on the ones with higher expected
return and lower risk. The empirical study on real-world data demonstrates the
effectiveness of our proposed risk-aware bidding strategies: yielding profit
gains of 15.4% in offline experiments and up to 17.5% in an online A/B test on
a commercial RTB platform over widely applied bidding strategies.
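One concrete way to read "value at risk per impression" is as a lower quantile of the impression's value under the bidder's posterior uncertainty about the click-through rate. The sketch below assumes a Beta posterior on the CTR and bids the pessimistic 5th-percentile value rather than the expected value; the model and every parameter name are illustrative assumptions, not the paper's actual derivation.

```python
import numpy as np

def risk_aware_bid(clicks, impressions, value_per_click,
                   level=0.05, n_samples=100_000, seed=0):
    """Bid a VaR-style lower quantile of the impression value under a
    Beta(clicks + 1, non_clicks + 1) posterior on the CTR, so poorly
    measured (high-variance) CTR estimates get penalized bids."""
    rng = np.random.default_rng(seed)
    ctr = rng.beta(clicks + 1, impressions - clicks + 1, size=n_samples)
    return float(np.quantile(ctr * value_per_click, level))

# Same empirical CTR (1%) but far less data behind the estimate:
# the wider posterior yields a lower, more cautious bid.
uncertain = risk_aware_bid(1, 100, value_per_click=1.0)
confident = risk_aware_bid(100, 10_000, value_per_click=1.0)
```

This captures the qualitative behavior described in the abstract: impressions with higher expected return and lower risk receive bids closer to their expected value, while risky ones are discounted.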
How to project a bipartite network?
One-mode projection is extensively used to compress bipartite
networks. Since the projection is always less informative than the
bipartite representation, a proper weighting method is required to better
retain the original information. In this article, inspired by the network-based
resource-allocation dynamics, we propose a weighting method that can be
directly applied in extracting the hidden information of networks, with
remarkably better performance than the widely used global ranking method as
well as collaborative filtering. This work not only provides a credible
method in compressing bipartite networks, but also highlights a possible way
for the better solution of a long-standing challenge in modern information
science: How to do personal recommendation?
Comment: 7 pages, 4 figures
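The resource-allocation weighting (two equal-split spreading steps, X to Y and back to X) has a compact matrix form. A minimal sketch, assuming a binary biadjacency matrix with no isolated nodes; the function name is illustrative:

```python
import numpy as np

def resource_allocation_projection(A):
    """One-mode projection of a bipartite network onto its X side.
    A[i, l] = 1 if X-node i links Y-node l. W[i, j] is the fraction of
    j's initial resource that lands on i after two equal-split steps:
    W[i, j] = sum_l A[i, l] * A[j, l] / (k_y[l] * k_x[j])."""
    A = np.asarray(A, dtype=float)
    k_x = A.sum(axis=1)  # degrees of X nodes
    k_y = A.sum(axis=0)  # degrees of Y nodes
    return (A / k_y) @ (A / k_x[:, None]).T

A = np.array([[1, 1, 0],
              [0, 1, 1],
              [0, 0, 1]])
W = resource_allocation_projection(A)
# Each column of W sums to 1: the projection conserves the spread resource,
# and W is generally asymmetric, unlike a simple co-occurrence count.
```

The asymmetry of W is the point: the weight j assigns to i depends on j's own degree, which is what lets the projection double as a recommendation score.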
Regularization independent of the noise level: an analysis of quasi-optimality
The quasi-optimality criterion chooses the regularization parameter in
inverse problems without taking into account the noise level. This rule works
remarkably well in practice, although Bakushinskii has shown that there are
always counterexamples with very poor performance. We propose an average case
analysis of quasi-optimality for spectral cut-off estimators and we prove that
the quasi-optimality criterion determines estimators which are rate-optimal
{\em on average}. Its practical performance is illustrated with a calibration
problem from mathematical finance.
Comment: 18 pages, 3 figures
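For spectral filtering methods, the quasi-optimality rule scans a geometric grid alpha_k = alpha0 * q**k and picks the parameter minimizing the step-to-step change of the regularized solution; no noise level appears anywhere. The paper itself analyzes spectral cut-off estimators; the Tikhonov filter and all names below are an illustrative substitution:

```python
import numpy as np

def quasi_optimality(sigma, y, alpha0=1e-7, q=1.6, n_grid=36):
    """Pick Tikhonov's parameter on a geometric grid alpha_k = alpha0 * q**k
    by minimizing ||x_{alpha_{k+1}} - x_{alpha_k}||; the noise level is
    never used. sigma: singular values; y: data in the singular basis."""
    alphas = alpha0 * q ** np.arange(n_grid)
    xs = [sigma * y / (sigma ** 2 + a) for a in alphas]
    diffs = [np.linalg.norm(xs[k + 1] - xs[k]) for k in range(n_grid - 1)]
    k_star = int(np.argmin(diffs))
    return alphas[k_star], xs[k_star]

# Toy diagonal (SVD-basis) problem: decaying singular values, noisy data.
rng = np.random.default_rng(0)
sigma = 1.0 / np.arange(1, 51) ** 2
x_true = 1.0 / np.arange(1, 51)
y = sigma * x_true + 1e-4 * rng.standard_normal(50)
alpha_star, x_hat = quasi_optimality(sigma, y)
```

As Bakushinskii's counterexamples warn, the minimizer can sit at a grid endpoint on unfavorable instances, so in practice the grid range matters; the average-case result in the abstract is exactly about why the rule nevertheless works well typically.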