
    Stochastic Budget Optimization in Internet Advertising

    Internet advertising is a sophisticated game in which the many advertisers "play" to optimize their return on investment. There are many "targets" for the advertisements, and each "target" has a collection of games with a potentially different set of players involved. In this paper, we study the problem of how advertisers allocate their budget across these "targets". In particular, we focus on formulating their best response strategy as an optimization problem. Advertisers have a set of keywords ("targets") and some stochastic information about the future, namely a probability distribution over scenarios of cost vs click combinations. This summarizes the potential states of the world, assuming that the strategies of the other players are fixed. The best response can then be abstracted as a stochastic budget optimization problem: how to spread a given budget across these keywords to maximize the expected number of clicks. We present the first known non-trivial poly-logarithmic approximation for these problems, as well as the first known hardness results showing that approximation ratios better than logarithmic in the various parameters involved cannot be obtained. We also identify several special cases of these problems of practical interest, such as a fixed number of scenarios or polynomially bounded cost parameters, which are solvable either in polynomial time or with improved approximation ratios. Stochastic budget optimization with scenarios has a sophisticated technical structure. Our approximation and hardness results come from relating these problems to a special type of (0/1, bipartite) quadratic program inherent in them. Our research answers some open problems raised by the authors of (Stochastic Models for Budget Optimization in Search-Based Advertising, Algorithmica, 58 (4), 1022-1044, 2010). Comment: FINAL version
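
    To make the objective concrete, here is a minimal Python sketch of the underlying allocation problem: spend a budget across keywords so as to maximize expected clicks over cost-vs-click scenarios. All numbers are invented, and the greedy unit-spending heuristic shown is only an illustration of the objective, not the paper's poly-logarithmic approximation algorithm.

```python
# Toy stochastic budget allocation: each scenario is (probability,
# cost-per-click list, click-cap list) over two keywords. All values
# are made-up placeholders, not data from the paper.
scenarios = [
    (0.5, [0.5, 2.0], [30, 100]),
    (0.3, [1.0, 1.0], [50, 40]),
    (0.2, [4.0, 0.8], [80, 20]),
]

def expected_clicks(alloc):
    """E[clicks] when alloc[i] dollars are spent on keyword i."""
    return sum(p * sum(min(b / c, u) for b, c, u in zip(alloc, cpc, cap))
               for p, cpc, cap in scenarios)

def greedy_allocate(budget, n_kw, step=1.0):
    """Spend the budget one `step` at a time on the keyword with the
    largest marginal gain; near-optimal here because the objective is
    separable and concave in each keyword's budget."""
    alloc = [0.0] * n_kw
    for _ in range(int(budget / step)):
        base = expected_clicks(alloc)
        gains = []
        for i in range(n_kw):
            alloc[i] += step
            gains.append(expected_clicks(alloc) - base)
            alloc[i] -= step
        alloc[gains.index(max(gains))] += step
    return alloc

alloc = greedy_allocate(20.0, n_kw=2)
print(alloc, expected_clicks(alloc))
```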

    Heliospheric Transport of Neutron-Decay Protons

    We report on new simulations of the transport of energetic protons originating from the decay of energetic neutrons produced in solar flares. Because the neutrons are fast-moving but insensitive to the solar wind magnetic field, the decay protons are produced over a wide region of space, and they should be detectable by current instruments over a broad range of longitudes for many hours after a sufficiently large gamma-ray flare. Spacecraft closer to the Sun are expected to see orders-of-magnitude higher intensities than those at the Earth-Sun distance. The current solar cycle should present an excellent opportunity to observe neutron-decay protons with multiple spacecraft at different heliographic longitudes and distances from the Sun. Comment: 12 pages, 4 figures, to be published in a special issue of Solar Physics
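
    The wide production region follows from the neutron decay length $L = \gamma\beta c\tau_n$. Below is a back-of-envelope Python sketch using standard kinematics and a rest-frame mean lifetime of about 880 s; the sample energies are assumed for illustration, and the paper's transport simulation is far more detailed.

```python
# Fraction of flare neutrons surviving to heliocentric distance r,
# from the relativistic decay length L = gamma * beta * c * tau_n.
import math

M_N = 939.565      # neutron rest energy, MeV
TAU_N = 880.0      # neutron mean lifetime at rest, s
C = 2.998e8        # speed of light, m/s
AU = 1.496e11      # astronomical unit, m

def survival_fraction(kinetic_mev, r_au):
    gamma = 1.0 + kinetic_mev / M_N
    beta = math.sqrt(1.0 - 1.0 / gamma**2)
    decay_length = gamma * beta * C * TAU_N   # metres
    return math.exp(-r_au * AU / decay_length)

# Most moderate-energy neutrons decay well inside 1 AU, seeding
# decay protons throughout the inner heliosphere.
for t in (10.0, 100.0, 1000.0):
    print(f"{t:6.0f} MeV neutron: {survival_fraction(t, 1.0):.3f} survive to 1 AU")
```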

    Supersymmetry Without Prejudice at the LHC

    The discovery and exploration of Supersymmetry in a model-independent fashion will be a daunting task due to the large number of soft-breaking parameters in the MSSM. In this paper, we explore the capability of the ATLAS detector at the LHC ($\sqrt{s}=14$ TeV, 1 fb$^{-1}$) to find SUSY within the 19-dimensional pMSSM subspace of the MSSM, using their standard missing transverse energy and long-lived particle searches that were essentially designed for mSUGRA. To this end, we employ a set of $\sim 71$k previously generated model points in the 19-dimensional parameter space that satisfy all of the existing experimental and theoretical constraints. Employing ATLAS-generated SM backgrounds and following their approach in each of 11 missing energy analyses as closely as possible, we explore all of these 71k model points for a possible SUSY signal. To test our analysis procedure, we first verify that we faithfully reproduce the published ATLAS results for the signal distributions for their benchmark mSUGRA model points. We then show that, requiring all sparticle masses to lie below 1 (3) TeV, almost all (two-thirds) of the pMSSM model points are discovered with a significance $S>5$ in at least one of these 11 analyses, assuming a 50% systematic error on the SM background. If this systematic error can be reduced to only 20%, then this parameter space coverage is increased. These results indicate that the ATLAS SUSY search strategy is robust across a broad class of Supersymmetric models. We then explore in detail the properties of the kinematically accessible model points which remain unobservable by these search analyses, in order to ascertain problematic cases which may arise in general SUSY searches. Comment: 69 pages, 40 figures, Discussion added
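
    A hedged sketch of the discovery criterion described above: a model point counts as "discovered" if $S>5$ in at least one analysis. The definition $S = s/\sqrt{b + (f_{sys}\,b)^2}$ is a common convention for folding a fractional background systematic into a counting experiment; whether it matches the paper's exact definition is an assumption, and the signal/background counts below are invented.

```python
# Significance with a fractional background systematic f_sys,
# evaluated across several (signal, background) counting analyses.
import math

def significance(s, b, f_sys):
    return s / math.sqrt(b + (f_sys * b) ** 2)

analyses = [(40.0, 25.0), (12.0, 4.0), (60.0, 90.0)]  # invented (s, b) pairs

for f_sys in (0.50, 0.20):
    best = max(significance(s, b, f_sys) for s, b in analyses)
    discovered = best > 5.0
    print(f"sys={f_sys:.0%}: best S = {best:.2f}, discovered = {discovered}")
```

    Note how shrinking the systematic from 50% to 20% lifts borderline analyses above the $S>5$ threshold, which is the mechanism behind the increased parameter-space coverage quoted in the abstract.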

    Hubble expansion and structure formation in the "running FLRW model" of the cosmic evolution

    A new class of FLRW cosmological models with time-evolving fundamental parameters should emerge naturally from a description of the expansion of the universe based on the first principles of quantum field theory and string theory. Within this general paradigm, one expects that both Newton's gravitational coupling, G, and the cosmological term, Lambda, should not be strictly constant but should appear rather as smooth functions of the Hubble rate. This scenario ("running FLRW model") predicts, in a natural way, the existence of dynamical dark energy without invoking the participation of extraneous scalar fields. In this paper, we perform a detailed study of these models in the light of the latest cosmological data, which serves to illustrate the phenomenological viability of the new dark energy paradigm as a serious alternative to the traditional scalar field approaches. By performing a joint likelihood analysis of the recent SNIa data, the CMB shift parameter, and the BAOs traced by the Sloan Digital Sky Survey, we put tight constraints on the main cosmological parameters. Furthermore, we derive the theoretically predicted dark-matter halo mass function and the corresponding redshift distribution of cluster-size halos for the "running" models studied. Despite the fact that these models closely reproduce the standard LCDM Hubble expansion, their normalization of the power spectrum of perturbations varies, implying, in many cases, a significantly different redshift distribution of cluster-size halos. This fact indicates that it should be relatively easy to distinguish between the "running" models and the LCDM cosmology using realistic future X-ray and Sunyaev-Zeldovich cluster surveys. Comment: Version published in JCAP 08 (2011) 007: 1+41 pages, 6 Figures, 1 Table. Typos corrected. Extended discussion on the computation of the linearly extrapolated density threshold above which structures collapse in time-varying vacuum models. One appendix, a few references and one figure added
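
    Schematically, the joint likelihood analysis amounts to minimizing a total chi-square that sums independent SNIa, CMB-shift, and BAO terms over the model parameters. The Python sketch below shows only that structure; the data vectors, errors, and one-parameter "model" are placeholders, not the paper's pipeline.

```python
# Joint chi-square over several independent datasets, minimized over
# the model parameters. Everything numerical here is a toy stand-in.
import numpy as np
from scipy.optimize import minimize

def chi2(data, model, sigma):
    return np.sum(((data - model) / sigma) ** 2)

def total_chi2(params, datasets):
    # datasets: list of (observed, predict_fn, errors), e.g. SNIa, CMB, BAO
    return sum(chi2(obs, predict(params), err) for obs, predict, err in datasets)

rng = np.random.default_rng(0)
truth = 0.7  # invented "true" parameter value
datasets = [
    (truth + 0.02 * rng.standard_normal(5), lambda p: np.full(5, p[0]), np.full(5, 0.02)),
    (truth + 0.01 * rng.standard_normal(1), lambda p: np.full(1, p[0]), np.full(1, 0.01)),
    (truth + 0.03 * rng.standard_normal(3), lambda p: np.full(3, p[0]), np.full(3, 0.03)),
]
fit = minimize(total_chi2, x0=[0.5], args=(datasets,))
print("best-fit parameter:", fit.x[0])
```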

    Large Scale Structure and Supersymmetric Inflation without Fine Tuning

    We explore constraints on the spectral index $n$ of density fluctuations and the neutrino energy density fraction $\Omega_{HDM}$, employing data from a variety of large scale observations. The best fits occur for $n \approx 1$ and $\Omega_{HDM} \approx 0.15 - 0.30$, over a range of Hubble constants $40-60$ km s$^{-1}$ Mpc$^{-1}$. We present a new class of inflationary models based on realistic supersymmetric grand unified theories which do not have the usual `fine tuning' problems. The amplitude of primordial density fluctuations, in particular, is found to be proportional to $(M_X/M_P)^2$, where $M_X$ ($M_P$) denotes the GUT (Planck) scale, which is reminiscent of cosmic strings! The spectral index $n = 0.98$ is in excellent agreement with the observations, provided the dark matter is a mixture of `cold' and `hot' components. Comment: LaTeX, 14 pp. + 1 postscript figure appended
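
    An order-of-magnitude reading of the quoted scaling (illustrative numbers, not taken from the paper): with a GUT scale of roughly $2\times 10^{16}$ GeV and a Planck scale of $1.2\times 10^{19}$ GeV,

```latex
\[
  \frac{\delta\rho}{\rho} \;\sim\; \left(\frac{M_X}{M_P}\right)^{2}
  \;\approx\; \left(\frac{2\times 10^{16}\ \mathrm{GeV}}
                         {1.2\times 10^{19}\ \mathrm{GeV}}\right)^{2}
  \;\approx\; 3\times 10^{-6},
\]
```

    which sits within model-dependent numerical factors of the $\sim 10^{-5}$ fluctuation amplitude inferred from large scale observations.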

    Zebrafish models of collagen VI-related myopathies

    Collagen VI is an integral part of the skeletal muscle extracellular matrix, providing mechanical stability and facilitating matrix-dependent cell signaling. Mutations in collagen VI result in either Ullrich congenital muscular dystrophy (UCMD) or Bethlem myopathy (BM), with UCMD being clinically more severe. Recent studies demonstrating increased apoptosis and abnormal mitochondrial function in Col6a1 knockout mice and in human myoblasts have provided the first mechanistic insights into the pathophysiology of these diseases. However, how loss of collagen VI causes mitochondrial dysfunction remains to be understood. Progress is hindered in part by the lack of an adequate animal model for UCMD, as knockout mice have a mild motor phenotype. To further the understanding of these disorders, we have generated zebrafish models of the collagen VI myopathies. Morpholinos targeting exon 9 of col6a1 produced a severe muscle disease reminiscent of UCMD, while those targeting exon 13 produced a milder phenotype similar to BM. UCMD-like zebrafish have increased cell death and abnormal mitochondria, which can be attenuated by treatment with the mitochondrial permeability transition pore modifier cyclosporin A (CsA). CsA improved the motor deficits in UCMD-like zebrafish, but failed to reverse the sarcolemmal membrane damage. In all, we have successfully generated the first vertebrate model matching the clinical severity of UCMD and demonstrated that CsA provides phenotypic improvement, corroborating data from knockout mice that support the use of mitochondrial permeability transition pore modifiers as therapeutics in patients, and providing proof of principle for the utility of the zebrafish as a powerful preclinical model.

    A Bayesian analysis of pentaquark signals from CLAS data

    We examine the results of two measurements by the CLAS collaboration, one of which claimed evidence for a $\Theta^{+}$ pentaquark, whilst the other found no such evidence. The unique feature of these two experiments was that they were performed with the same experimental setup. Using a Bayesian analysis we find that the results of the two experiments are in fact compatible with each other, but that the first measurement did not contain sufficient information to determine unambiguously the existence of a $\Theta^{+}$. Further, we suggest a means by which the existence of a new candidate particle can be tested in a rigorous manner. Comment: 5 pages, 3 figures
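
    As a minimal illustration of this kind of Bayesian model comparison (not the CLAS analysis itself), the sketch below computes a Bayes factor between "background only" and "background plus a Gaussian peak" for binned counts, marginalizing the signal yield over a uniform prior on a grid; the counts, peak position, and bin range are all invented.

```python
# Bayes factor for a peak hypothesis over binned Poisson counts.
import numpy as np
from scipy.stats import poisson

x = np.linspace(1.4, 1.7, 30)                   # mass bins (GeV), placeholder range
bkg = np.full_like(x, 20.0)                     # flat expected background
peak = np.exp(-0.5 * ((x - 1.54) / 0.01) ** 2)  # unit-amplitude peak shape
counts = np.random.default_rng(1).poisson(bkg)  # fake data: background only

def marginal_likelihood(signal_grid):
    """Average the Poisson likelihood over a uniform prior on the yield."""
    likes = [poisson.pmf(counts, bkg + a * peak).prod() for a in signal_grid]
    return np.mean(likes)

L0 = poisson.pmf(counts, bkg).prod()            # background-only likelihood
L1 = marginal_likelihood(np.linspace(0, 30, 61))
print("Bayes factor (signal / background-only):", L1 / L0)
```

    Because the marginal likelihood averages over all allowed yields, a broad signal prior is automatically penalized when the data carry little information, which is the sense in which the first measurement could be "compatible but not decisive".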

    A stochastic evolutionary model generating a mixture of exponential distributions

    Recent interest in human dynamics has stimulated the investigation of the stochastic processes that explain human behaviour in various contexts, such as mobile phone networks and social media. In this paper, we extend the stochastic urn-based model proposed in [FENN15] so that it can generate mixture models, in particular a mixture of exponential distributions. The model is designed to capture the dynamics of survival analysis, traditionally employed in clinical trials and reliability analysis in engineering, and more recently in the analysis of large data sets recording human dynamics. The mixture modelling approach, which is relatively simple and well understood, is very effective in capturing heterogeneity in data. We provide empirical evidence for the validity of the model, using a data set of popular search engine queries collected over a period of 114 months. We show that the survival function of these queries is closely matched by the exponential mixture solution of our model.
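
    The mixture-of-exponentials survival function named in the abstract, $S(t) = \sum_i w_i e^{-\lambda_i t}$, is easy to sketch and fit. The example below fits a two-component mixture to synthetic monthly survival data with scipy; the weights and rates are invented stand-ins for the query data set, with only the 114-month span borrowed from the abstract.

```python
# Fit a two-component exponential mixture survival function to
# synthetic data (the actual query data set is not reproduced here).
import numpy as np
from scipy.optimize import curve_fit

def surv_mix2(t, w, lam1, lam2):
    """S(t) = w * exp(-lam1 t) + (1 - w) * exp(-lam2 t)."""
    return w * np.exp(-lam1 * t) + (1.0 - w) * np.exp(-lam2 * t)

t = np.arange(0, 114)                 # months, matching the 114-month span
true = surv_mix2(t, 0.6, 0.30, 0.02)  # invented "ground truth"
obs = np.clip(true + 0.01 * np.random.default_rng(2).standard_normal(t.size), 0, 1)

(w, lam1, lam2), _ = curve_fit(surv_mix2, t, obs, p0=[0.5, 0.1, 0.01],
                               bounds=([0, 0, 0], [1, 5, 5]))
print(f"w={w:.2f}, lambda1={lam1:.3f}, lambda2={lam2:.4f}")
```

    The two rates capture heterogeneity directly: a fast-decaying component for queries that fade quickly and a slow one for persistent queries.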

    An Integrated TCGA Pan-Cancer Clinical Data Resource to Drive High-Quality Survival Outcome Analytics

    For a decade, The Cancer Genome Atlas (TCGA) program collected clinicopathologic annotation data along with multi-platform molecular profiles of more than 11,000 human tumors across 33 different cancer types. TCGA clinical data contain key features representing the democratized nature of the data collection process. To ensure proper use of this large clinical dataset associated with genomic features, we developed a standardized dataset named the TCGA Pan-Cancer Clinical Data Resource (TCGA-CDR), which includes four major clinical outcome endpoints. In addition to detailing major challenges and statistical limitations encountered during the effort of integrating the acquired clinical data, we present a summary that includes endpoint usage recommendations for each cancer type. These TCGA-CDR findings appear to be consistent with cancer genomics studies independent of the TCGA effort and provide opportunities for investigating cancer biology using clinical correlates at an unprecedented scale. Analysis of clinicopathologic annotations for over 11,000 cancer patients in the TCGA program led to the generation of the TCGA Clinical Data Resource, which provides recommendations for clinical outcome endpoint usage across 33 cancer types.
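
    A hedged sketch of consuming the TCGA-CDR endpoints for a Kaplan-Meier analysis with the lifelines library. The file name is hypothetical, and the column names ("type", "OS", "OS.time") are assumptions about how the resource table is laid out rather than a documented schema.

```python
# Kaplan-Meier overall-survival curve for one cancer type from a local
# copy of the TCGA-CDR table (file path and column names assumed).
import pandas as pd
from lifelines import KaplanMeierFitter

cdr = pd.read_excel("TCGA-CDR.xlsx")  # hypothetical local copy of the resource
luad = cdr[cdr["type"] == "LUAD"].dropna(subset=["OS", "OS.time"])

kmf = KaplanMeierFitter()
kmf.fit(durations=luad["OS.time"], event_observed=luad["OS"], label="LUAD OS")
print(kmf.median_survival_time_)
```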