Stochastic Budget Optimization in Internet Advertising
Internet advertising is a sophisticated game in which the many advertisers
"play" to optimize their return on investment. There are many "targets" for the
advertisements, and each "target" has a collection of games with a potentially
different set of players involved. In this paper, we study the problem of how
advertisers allocate their budget across these "targets". In particular, we
focus on formulating their best response strategy as an optimization problem.
Advertisers have a set of keywords ("targets") and some stochastic information
about the future, namely a probability distribution over scenarios of cost vs
click combinations. This summarizes the potential states of the world assuming
that the strategies of other players are fixed. Then, the best response can be
abstracted as stochastic budget optimization problems to figure out how to
spread a given budget across these keywords to maximize the expected number of
clicks.
We present the first known non-trivial poly-logarithmic approximation for
these problems as well as the first known hardness results of getting better
than logarithmic approximation ratios in the various parameters involved. We
also identify several special cases of these problems of practical interest,
such as with fixed number of scenarios or with polynomial-sized parameters
related to cost, which are solvable either in polynomial time or with improved
approximation ratios. Stochastic budget optimization with scenarios has
sophisticated technical structure. Our approximation and hardness results come
from relating these problems to a special type of (0/1, bipartite) quadratic
programs inherent in them. Our research answers some open problems raised by
the authors in (Stochastic Models for Budget Optimization in Search-Based
Advertising, Algorithmica, 58 (4), 1022-1044, 2010).
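The underlying objective can be illustrated with a toy instance: a fixed budget is split across keywords, and each scenario specifies a cost-per-click and a click cap per keyword; the goal is to maximize the number of clicks expected over scenarios. All numbers below are invented, and brute-force enumeration stands in for the paper's approximation algorithms, which the abstract does not spell out.

```python
# Toy stochastic budget optimization: 2 keywords, 2 scenarios.
# Each scenario gives (probability, per-keyword (cost-per-click, click cap)).
scenarios = [
    (0.6, [(1.0, 5), (2.0, 10)]),
    (0.4, [(2.0, 8), (1.0, 4)]),
]
BUDGET = 10  # total budget, split in integer units for the sketch

def expected_clicks(alloc):
    """Expected clicks of a per-keyword budget allocation over scenarios."""
    total = 0.0
    for prob, kws in scenarios:
        clicks = sum(min(cap, b / cpc) for b, (cpc, cap) in zip(alloc, kws))
        total += prob * clicks
    return total

# Brute force over integer splits of the budget (fine at toy sizes; the
# paper's point is that the general problem is hard to approximate).
best = max(((b, BUDGET - b) for b in range(BUDGET + 1)), key=expected_clicks)
print(best, round(expected_clicks(best), 3))  # → (5, 5) 7.1
```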
Heliospheric Transport of Neutron-Decay Protons
We report on new simulations of the transport of energetic protons
originating from the decay of energetic neutrons produced in solar flares.
Because the neutrons are fast-moving but insensitive to the solar wind magnetic
field, the decay protons are produced over a wide region of space, and they
should be detectable by current instruments over a broad range of longitudes
for many hours after a sufficiently large gamma-ray flare. Spacecraft closer to
the Sun are expected to see orders-of-magnitude higher intensities than those
at the Earth-Sun distance. The current solar cycle should present an excellent
opportunity to observe neutron-decay protons with multiple spacecraft over
different heliographic longitudes and distances from the Sun. (To be published
in a special issue of Solar Physics.)
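The wide spatial spread of the decay protons follows from simple relativistic decay kinematics. The sketch below estimates, using assumed textbook values (neutron mean lifetime of roughly 880 s), what fraction of flare neutrons of a given energy survive to a given heliocentric distance; the remainder decay in flight and seed the proton population.

```python
import math

TAU_N = 880.0        # neutron mean lifetime at rest [s] (approximate)
M_N = 939.565        # neutron rest mass [MeV]
C = 2.998e8          # speed of light [m/s]
AU = 1.496e11        # astronomical unit [m]

def surviving_fraction(kinetic_mev, distance_m):
    """Fraction of neutrons of given kinetic energy surviving to a distance."""
    gamma = 1.0 + kinetic_mev / M_N           # Lorentz factor
    beta = math.sqrt(1.0 - 1.0 / gamma**2)    # v/c
    t_flight = distance_m / (beta * C)        # lab-frame flight time
    return math.exp(-t_flight / (gamma * TAU_N))  # time-dilated decay law

for d_au in (0.3, 1.0):
    f = surviving_fraction(50.0, d_au * AU)
    print(f"{d_au} AU: {f:.2f} of 50 MeV neutrons survive")
```

Most 50 MeV neutrons decay before reaching 1 AU, so the decay-proton source is distributed along the whole neutron trajectory, consistent with spacecraft closer to the Sun seeing much higher intensities.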
Supersymmetry Without Prejudice at the LHC
The discovery and exploration of Supersymmetry in a model-independent fashion
will be a daunting task due to the large number of soft-breaking parameters in
the MSSM. In this paper, we explore the capability of the ATLAS detector at the
LHC to find SUSY within the 19-dimensional
pMSSM subspace of the MSSM using their standard transverse missing energy and
long-lived particle searches that were essentially designed for mSUGRA. To this
end, we employ a large set of previously generated model points in the
19-dimensional parameter space that satisfy all of the existing experimental
and theoretical constraints. Employing ATLAS-generated SM backgrounds and
following their approach in each of the 11 missing-energy analyses as closely as
possible, we search all of these model points for a SUSY
signal. To test our analysis procedure, we first verify that we faithfully
reproduce the published ATLAS results for the signal distributions for their
benchmark mSUGRA model points. We then show that, requiring all sparticle
masses to lie below 1 (3) TeV, almost all (two-thirds) of the pMSSM model points
are discovered with significant excesses in at least one of these 11 analyses,
assuming a 50% systematic error on the SM background. If this systematic error
can be reduced to only 20%, then this parameter-space coverage increases further.
These results are indicative that the ATLAS SUSY search strategy is robust
under a broad class of Supersymmetric models. We then explore in detail the
properties of the kinematically accessible model points which remain
unobservable by these search analyses in order to ascertain problematic cases
which may arise in general SUSY searches.
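The role of the background systematic can be illustrated with one common significance convention, Z = S / sqrt(B + (f_sys * B)^2). This formula is an assumption for illustration, not necessarily the exact ATLAS prescription, but it shows why shrinking the systematic from 50% to 20% extends the reach.

```python
import math

def significance(s, b, f_sys):
    """Counting-experiment significance with a flat fractional
    systematic f_sys on the expected background b (one common convention)."""
    return s / math.sqrt(b + (f_sys * b) ** 2)

s, b = 100.0, 400.0  # toy signal and background counts (invented)
z50 = significance(s, b, 0.50)
z20 = significance(s, b, 0.20)
print(f"Z(50% syst) = {z50:.2f}, Z(20% syst) = {z20:.2f}")
```

At a large background, the systematic term dominates the denominator, so the same signal more than doubles in significance when the systematic drops from 50% to 20%.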
Hubble expansion and structure formation in the "running FLRW model" of the cosmic evolution
A new class of FLRW cosmological models with time-evolving fundamental
parameters should emerge naturally from a description of the expansion of the
universe based on the first principles of quantum field theory and string
theory. Within this general paradigm, one expects that both the gravitational
Newton's coupling, G, and the cosmological term, Lambda, should not be strictly
constant but appear rather as smooth functions of the Hubble rate. This
scenario ("running FLRW model") predicts, in a natural way, the existence of
dynamical dark energy without invoking the participation of extraneous scalar
fields. In this paper, we perform a detailed study of these models in the light
of the latest cosmological data, which serves to illustrate the
phenomenological viability of the new dark energy paradigm as a serious
alternative to the traditional scalar field approaches. By performing a joint
likelihood analysis of the recent SNIa data, the CMB shift parameter, and the
BAOs traced by the Sloan Digital Sky Survey, we put tight constraints on the
main cosmological parameters. Furthermore, we derive the theoretically
predicted dark-matter halo mass function and the corresponding redshift
distribution of cluster-size halos for the "running" models studied. Despite
the fact that these models closely reproduce the standard LCDM Hubble
expansion, the normalization of their perturbation power spectrum varies,
imposing, in many cases, a significantly different cluster-size halo redshift
distribution. This fact indicates that it should be relatively easy to
distinguish between the "running" models and the LCDM cosmology using realistic
future X-ray and Sunyaev-Zeldovich cluster surveys. (Published in JCAP 08
(2011) 007.)
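The joint likelihood analysis described above amounts to summing independent chi-square contributions from the probes and minimizing over the cosmological parameters. The sketch below uses invented one-parameter toy "probes" purely to illustrate the combination; it is not the paper's actual data or model.

```python
import numpy as np

def chi2(pred, obs, sigma):
    """Chi-square of model predictions against observations."""
    return np.sum(((pred - obs) / sigma) ** 2)

def chi2_total(p):
    """Joint chi-square: independent probes add. All numbers are toy
    stand-ins for the SNIa, CMB-shift, and BAO constraints."""
    sn  = chi2(p * np.array([1.0, 2.0]), np.array([0.3, 0.61]), 0.05)
    cmb = chi2(np.array([3.0 * p]),      np.array([0.90]),      0.02)
    bao = chi2(np.array([0.5 * p]),      np.array([0.16]),      0.03)
    return sn + cmb + bao

# Grid-minimize the joint chi-square over the single toy parameter.
grid = np.linspace(0.1, 0.5, 401)
best = grid[np.argmin([chi2_total(p) for p in grid])]
print(f"best-fit p = {best:.3f}")
```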
Large Scale Structure and Supersymmetric Inflation without Fine Tuning
We explore constraints on the spectral index of density fluctuations and
the neutrino energy density fraction, employing data from a variety of
large-scale observations, and report the best-fit values over a range of
Hubble constants. We present a new class of inflationary models based on
realistic supersymmetric grand unified theories which do not have the usual
`fine tuning' problems. The amplitude of primordial density fluctuations, in
particular, is found to be proportional to a power of the ratio of the GUT
scale to the Planck scale, which is reminiscent of cosmic strings! The
spectral index is in excellent agreement with the observations, provided the
dark matter is a mixture of `cold' and `hot' components.
Zebrafish models of collagen VI-related myopathies
Collagen VI is an integral part of the skeletal muscle extracellular matrix, providing mechanical stability and facilitating matrix-dependent cell signaling. Mutations in collagen VI result in either Ullrich congenital muscular dystrophy (UCMD) or Bethlem myopathy (BM), with UCMD being clinically more severe. Recent studies demonstrating increased apoptosis and abnormal mitochondrial function in Col6a1 knockout mice and in human myoblasts have provided the first mechanistic insights into the pathophysiology of these diseases. However, how loss of collagen VI causes mitochondrial dysfunction remains to be understood. Progress is hindered in part by the lack of an adequate animal model for UCMD, as knockout mice have a mild motor phenotype. To further the understanding of these disorders, we have generated zebrafish models of the collagen VI myopathies. Morpholinos targeting exon 9 of col6a1 produced a severe muscle disease reminiscent of UCMD, while those targeting exon 13 produced a milder phenotype similar to BM. UCMD-like zebrafish have increased cell death and abnormal mitochondria, which can be attenuated by treatment with the permeability transition pore modifier cyclosporin A (CsA). CsA improved the motor deficits in UCMD-like zebrafish, but failed to reverse the sarcolemmal membrane damage. In all, we have successfully generated the first vertebrate model matching the clinical severity of UCMD and demonstrated that CsA provides phenotypic improvement, thus corroborating data from knockout mice supporting the use of mitochondrial permeability transition pore modifiers as therapeutics in patients, and providing proof of principle for the utility of the zebrafish as a powerful preclinical model.
A Bayesian analysis of pentaquark signals from CLAS data
We examine the results of two measurements by the CLAS collaboration, one of
which claimed evidence for a pentaquark state, whilst the other found no
such evidence. The unique feature of these two experiments was that they were
performed with the same experimental setup. Using a Bayesian analysis we find
that the results of the two experiments are in fact compatible with each other,
but that the first measurement did not contain sufficient information to
determine unambiguously the existence of such a state. Further, we suggest a
means by which the existence of a new candidate particle can be tested in a
rigorous manner.
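The kind of Bayesian model comparison described above can be sketched for a single counting bin: compare a background-only hypothesis against background plus a signal whose strength is marginalized over a prior. All numbers below are invented, and the single-bin Poisson likelihood is a stand-in for the paper's full analysis.

```python
import math

def poisson(n, mu):
    """Poisson probability of observing n counts with mean mu."""
    return math.exp(-mu) * mu**n / math.factorial(n)

def evidence_h1(n, b, s_max=20.0, steps=2000):
    """Marginal likelihood under H1: Poisson(n | b + s) averaged over a
    flat prior on the signal strength s in [0, s_max] (midpoint rule)."""
    ds = s_max / steps
    return sum(poisson(n, b + (i + 0.5) * ds) for i in range(steps)) * ds / s_max

n_obs, b = 35, 25.0  # toy observed counts and expected background
bf = evidence_h1(n_obs, b) / poisson(n_obs, b)
print(f"Bayes factor (H1/H0) = {bf:.2f}")
```

A modest excess yields only a mild Bayes factor here, which mirrors the paper's point: apparent excesses need not contain enough information to establish a new particle.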
A stochastic evolutionary model generating a mixture of exponential distributions
Recent interest in human dynamics has stimulated the investigation of the stochastic processes that explain human behaviour in various contexts, such as mobile phone networks and social media.
In this paper, we extend the stochastic urn-based model proposed in [FENN15] so
that it can generate mixture models, in particular, a mixture of exponential
distributions.
The model is designed to capture the dynamics of survival analysis, traditionally employed in clinical trials, reliability analysis in engineering, and more recently in the analysis of large data sets recording human dynamics. The mixture modelling approach, which is relatively simple and well understood, is very effective in capturing heterogeneity in data.
We provide empirical evidence for the validity of the model, using a data set of popular search engine queries collected over a period of 114 months. We show that the survival function of these queries is closely matched by the exponential mixture solution for our model.
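A mixture of exponentials can also be fitted directly by expectation-maximization; the sketch below recovers the parameters of a two-component exponential mixture from synthetic lifetimes. This is a generic illustration of the mixture form, not the urn-based generative model itself.

```python
import math
import random

# Synthetic lifetimes from a known two-component exponential mixture:
# 70% with rate 2.0, 30% with rate 0.2.
random.seed(0)
data = [random.expovariate(2.0) if random.random() < 0.7
        else random.expovariate(0.2) for _ in range(5000)]

w, l1, l2 = 0.5, 1.0, 0.1  # initial guesses for weight and rates
for _ in range(200):       # EM iterations
    # E-step: responsibility of component 1 for each observation.
    r = [w * l1 * math.exp(-l1 * t) /
         (w * l1 * math.exp(-l1 * t) + (1 - w) * l2 * math.exp(-l2 * t))
         for t in data]
    # M-step: re-estimate weight and rates from the responsibilities.
    n1 = sum(r)
    w = n1 / len(data)
    l1 = n1 / sum(ri * t for ri, t in zip(r, data))
    l2 = (len(data) - n1) / sum((1 - ri) * t for ri, t in zip(r, data))

print(f"w={w:.2f}  lambda1={l1:.2f}  lambda2={l2:.2f}")
```

The fitted parameters define the survival function S(t) = w*exp(-l1*t) + (1-w)*exp(-l2*t), the same functional form matched against the query data above.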
An Integrated TCGA Pan-Cancer Clinical Data Resource to Drive High-Quality Survival Outcome Analytics
For a decade, The Cancer Genome Atlas (TCGA) program collected clinicopathologic annotation data along with multi-platform molecular profiles of more than 11,000 human tumors across 33 different cancer types. TCGA clinical data contain key features representing the democratized nature of the data collection process. To ensure proper use of this large clinical dataset associated with genomic features, we developed a standardized dataset named the TCGA Pan-Cancer Clinical Data Resource (TCGA-CDR), which includes four major clinical outcome endpoints. In addition to detailing major challenges and statistical limitations encountered during the effort of integrating the acquired clinical data, we present a summary that includes endpoint usage recommendations for each cancer type. These TCGA-CDR findings appear to be consistent with cancer genomics studies independent of the TCGA effort and provide opportunities for investigating cancer biology using clinical correlates at an unprecedented scale. Analysis of clinicopathologic annotations for over 11,000 cancer patients in the TCGA program leads to the generation of the TCGA Clinical Data Resource, which provides recommendations of clinical outcome endpoint usage for 33 cancer types.
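Survival-endpoint analyses of the kind the TCGA-CDR is meant to support typically start from a Kaplan-Meier estimate. A minimal sketch on invented toy data follows (it assumes distinct event times, i.e. no ties, for simplicity):

```python
def kaplan_meier(times, events):
    """Kaplan-Meier survival curve.
    times: follow-up times; events: 1 = event observed, 0 = censored.
    Returns (time, survival probability) pairs at event times."""
    order = sorted(range(len(times)), key=lambda i: times[i])
    at_risk, surv, curve = len(times), 1.0, []
    for i in order:
        if events[i]:                 # observed event: step the curve down
            surv *= (at_risk - 1) / at_risk
            curve.append((times[i], surv))
        at_risk -= 1                  # event or censoring leaves the risk set
    return curve

# Toy cohort: 5 patients, 2 censored (invented numbers).
toy = kaplan_meier([2, 3, 4, 5, 8], [1, 0, 1, 1, 0])
print(toy)
```

Censored patients contribute to the risk set up to their censoring time without stepping the curve down, which is exactly why carefully curated endpoints (whether an event was truly observed, and when) matter for this kind of analysis.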