Statistical Arbitrage Mining for Display Advertising
We study and formulate arbitrage in display advertising. Real-Time Bidding
(RTB) mimics stock spot exchanges and utilises computers to algorithmically buy
display ads per impression via a real-time auction. Despite the new automation,
the ad markets are still informationally inefficient due to the heavily
fragmented marketplaces. Two display impressions with similar or identical
effectiveness (e.g., measured by conversion or click-through rates for a
targeted audience) may sell for quite different prices at different market
segments or pricing schemes. In this paper, we propose a novel data mining
paradigm called Statistical Arbitrage Mining (SAM) focusing on mining and
exploiting price discrepancies between two pricing schemes. In essence, our
SAMer is a meta-bidder that hedges advertisers' risk between CPA (cost per
action)-based campaigns and CPM (cost per mille impressions)-based ad
inventories; it statistically assesses the potential profit and cost for an
incoming CPM bid request against a portfolio of CPA campaigns based on the
estimated conversion rate, bid landscape and other statistics learned from
historical data. In SAM, (i) functional optimisation is utilised to seek the
optimal bid function that maximises the expected arbitrage net profit, and (ii) a
portfolio-based risk management solution is leveraged to reallocate bid volume
and budget across the set of campaigns to make a risk and return trade-off. We
propose to jointly optimise both components in an EM fashion with high
efficiency to help the meta-bidder successfully catch the transient statistical
arbitrage opportunities in RTB. Both the offline experiments on a real-world
large-scale dataset and online A/B tests on a commercial platform demonstrate
the effectiveness of our proposed solution in exploiting arbitrage in various
model settings and market environments.Comment: In the proceedings of the 21st ACM SIGKDD international conference on
Knowledge discovery and data mining (KDD 2015
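The core bid decision described in the abstract can be sketched numerically: for an incoming CPM bid request, weigh the expected CPA payout (conversion rate times payout) against the expected cost of winning, using a bid-landscape model for the win probability. The following is a toy illustration only; the win-rate shape, the first-price cost assumption, and all numbers are hypothetical, not the paper's actual model.

```python
# Illustrative sketch of the statistical-arbitrage bid decision.
# All function shapes and numbers are made up for illustration.

def win_rate(bid, landscape_scale=2.0):
    """Toy bid landscape: probability of winning a CPM auction at `bid`."""
    return bid / (bid + landscape_scale)

def expected_net_profit(bid, conv_rate, cpa_payout):
    """E[profit] = P(win) * (conv_rate * payout - cost paid).

    Assumes, for simplicity, a cost equal to the bid; the paper's
    formulation works with the estimated market-price distribution.
    """
    return win_rate(bid) * (conv_rate * cpa_payout - bid)

def best_bid(conv_rate, cpa_payout, grid=None):
    """Grid-search the bid that maximises expected arbitrage net profit."""
    grid = grid or [b / 100.0 for b in range(1, 501)]
    return max(grid, key=lambda b: expected_net_profit(b, conv_rate, cpa_payout))

# Hypothetical campaign: 0.2% conversion rate, $2000 paid per action.
bid = best_bid(conv_rate=0.002, cpa_payout=2000.0)
```

In this toy setting the expected CPA value per impression is 0.002 × 2000 = $4, and the profit-maximising bid sits well below that value because bidding higher raises both the win rate and the cost paid.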
Shallow Flaws Under Biaxial Loading Conditions, Part II: Application of a Weibull Stress Analysis of the Cruciform Bend Specimen Using a Hydrostatic Stress Criterion
Cruciform beam fracture mechanics specimens have been developed in the Heavy Section Steel Technology (HSST) Program at Oak Ridge National Laboratory (ORNL) to introduce a prototypic, far-field, out-of-plane biaxial bending stress component in the test section that approximates the nonlinear biaxial stresses resulting from pressurized-thermal-shock or pressure-temperature loading of a nuclear reactor pressure vessel (RPV). Matrices of cruciform beam tests were developed to investigate and quantify the effects of temperature, biaxial loading, and specimen size on fracture initiation toughness of two-dimensional (constant-depth), shallow, surface flaws. Tests were conducted under biaxial load ratios ranging from uniaxial to equibiaxial. These tests demonstrated that biaxial loading can have a pronounced effect on shallow-flaw fracture toughness in the lower transition temperature region for RPV materials. Two- and three-parameter Weibull models have been calibrated using a new scheme (developed at the University of Illinois) that maps toughness data from test specimens with distinctly different levels of crack-tip constraint to a small-scale yielding (SSY) Weibull stress space. These models, using the new hydrostatic stress criterion in place of the more commonly used maximum principal stress in the kernel of the Weibull stress (σ_W) integral definition, have been shown to correlate the experimentally observed biaxial effect in cruciform specimens, thereby providing a scaling mechanism between uniaxial and biaxial loading states.
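The Weibull stress construction referred to above can be sketched as follows: the hydrostatic stress (mean of the three principal stresses) replaces the maximum principal stress in the kernel of the integral over the fracture process zone, and the result feeds a three-parameter Weibull failure probability. This is a minimal numerical sketch assuming a discretised (finite-element-style) integration; the element data, exponent m, and scale parameter are hypothetical, not values from the report.

```python
import math

# Minimal sketch of a Weibull stress calculation with a hydrostatic
# stress kernel. All inputs below are hypothetical.

def hydrostatic_stress(principal_stresses):
    """Mean of the three principal stresses at an integration point."""
    s1, s2, s3 = principal_stresses
    return (s1 + s2 + s3) / 3.0

def weibull_stress(elements, m, v0=1.0):
    """sigma_w = [ (1/V0) * sum_i sigma_h_i**m * dV_i ] ** (1/m)

    `elements` is a list of (principal_stresses, volume) pairs taken
    over the fracture process zone.
    """
    integral = sum(hydrostatic_stress(ps) ** m * dv for ps, dv in elements)
    return (integral / v0) ** (1.0 / m)

def failure_probability(sigma_w, m, sigma_u, sigma_min=0.0):
    """Three-parameter Weibull cumulative failure probability."""
    if sigma_w <= sigma_min:
        return 0.0
    return 1.0 - math.exp(-(((sigma_w - sigma_min) / sigma_u) ** m))
```

Swapping `hydrostatic_stress` for a maximum-principal-stress kernel in `weibull_stress` is the one-line change that distinguishes the two criteria compared in the text.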
A Model of Vertical Oligopolistic Competition
This paper develops a model of successive oligopolies with endogenous market entry, allowing for varying degrees of product differentiation and entry costs in both markets. Our analysis shows that the downstream conditions dominate the overall profitability of the two-tier structure, while the upstream conditions mainly affect the distribution of profits. We compare the welfare effects of upstream versus downstream deregulation policies and show that the impact of deregulation may be overvalued when feedback effects from the other market are ignored. Furthermore, we analyze how different forms of vertical restraints influence the endogenous market structure and show when they are welfare-enhancing.
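The two-tier structure described above can be illustrated with a stylized successive-Cournot toy: an upstream monopolist sells an input at wholesale price w to n symmetric downstream Cournot firms facing linear demand p = a - Q. This deliberately drops the paper's product differentiation and entry costs; it is an illustrative simplification, not the paper's model.

```python
# Stylized successive-Cournot sketch (hypothetical simplification of
# the two-tier structure): linear demand p = a - Q, n downstream firms.

def downstream_equilibrium(a, w, n):
    """Downstream Cournot equilibrium given wholesale price w."""
    q = (a - w) / (n + 1)            # per-firm quantity
    price = a - n * q                # market price
    profit_each = (price - w) * q    # per-firm downstream profit
    return q, price, profit_each

def upstream_optimum(a, c, n):
    """Upstream monopolist's profit-maximising wholesale price.

    Maximising (w - c) * n * (a - w) / (n + 1) gives w* = (a + c) / 2.
    """
    w = (a + c) / 2.0
    quantity = n * (a - w) / (n + 1)
    upstream_profit = (w - c) * quantity
    return w, upstream_profit
```

Even in this stripped-down version, varying n shows the flavour of the paper's result: downstream competition moves total output and industry profitability, while the wholesale price governs how profit is split between the tiers.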
Fracture assessment of HSST Plate 14 shallow-flaw cruciform bend specimens tested under biaxial loading conditions
A technology to determine shallow-flaw fracture toughness of reactor pressure vessel (RPV) steels is being developed for application to the safety assessment of RPVs containing postulated shallow surface flaws. Matrices of cruciform beam tests were developed to investigate and quantify the effects of temperature, biaxial loading, and specimen size on fracture initiation toughness of two-dimensional (constant-depth), shallow, surface flaws. The cruciform beam specimens were developed at Oak Ridge National Laboratory (ORNL) to introduce a far-field, out-of-plane biaxial stress component in the test section that approximates the nonlinear stresses resulting from pressurized-thermal-shock or pressure-temperature loading of an RPV. Tests were conducted under biaxial load ratios ranging from uniaxial to equibiaxial. These tests demonstrated that biaxial loading can have a pronounced effect on shallow-flaw fracture toughness in the lower transition temperature region for an RPV material. The cruciform fracture toughness data were used to evaluate fracture methodologies for predicting the observed effects of biaxial loading on shallow-flaw fracture toughness. Initial emphasis was placed on assessment of stress-based methodologies, namely, the J-Q formulation, the Dodds-Anderson toughness scaling model, and the Weibull approach. Applications of these methodologies based on the hydrostatic stress fracture criterion indicated an effect of loading biaxiality on fracture toughness; the conventional maximum principal stress criterion indicated no effect. A three-parameter Weibull model based on the hydrostatic stress criterion is shown to correlate the experimentally observed biaxial effect on cleavage fracture toughness by providing a scaling mechanism between uniaxial and biaxial loading states.
Biaxial loading effects on fracture toughness of reactor pressure vessel steel
The preliminary phases of a program to develop and evaluate fracture methodologies for assessing crack-tip constraint effects on fracture toughness of reactor pressure vessel (RPV) steels have been completed by the Heavy-Section Steel Technology (HSST) Program. Objectives were to investigate the effect of biaxial loading on fracture toughness, quantify this effect through existing stress-based, dual-parameter, fracture-toughness correlations, or propose and verify alternate correlations. A cruciform beam specimen with a 2-D, shallow, through-thickness flaw and a special loading fixture was designed and fabricated. Tests were performed using biaxial loading ratios of 0:1 (uniaxial), 0.6:1, and 1:1 (equibiaxial). Critical fracture-toughness values were calculated for each test. Biaxial loading of 0.6:1 resulted in a reduction in the lower-bound fracture toughness of approximately 12% as compared to that from the uniaxial tests. The biaxial loading of 1:1 yielded two subsets of toughness values; one agreed well with the uniaxial data, while one was reduced by approximately 43% when compared to the uniaxial data. Results were evaluated using J-Q theory and the Dodds-Anderson (D-A) micromechanical scaling model. The D-A model predicted no biaxial effect, while the J-Q method gave inconclusive results. When applied to the 1:1 biaxial data, these constraint methodologies failed to predict the observed reduction in fracture toughness obtained in one experiment. A strain-based constraint methodology that considers the relationship between applied biaxial load, the plastic zone width in the crack plane, and fracture toughness was formulated and applied successfully to the data. Evaluation of this dual-parameter strain-based model led to the conclusion that it has the capability of representing fracture behavior of RPV steels in the transition region, including the effects of out-of-plane loading on fracture toughness. This report is designated as HSST Report No. 150.
Targeting quiescent leukemic stem cells using second generation autophagy inhibitors
In chronic myeloid leukemia (CML), tyrosine kinase inhibitor (TKI) treatment induces autophagy that promotes survival and TKI-resistance in leukemic stem cells (LSCs). In clinical studies, hydroxychloroquine (HCQ), the only clinically approved autophagy inhibitor, does not consistently inhibit autophagy in cancer patients, so more potent autophagy inhibitors are needed. We generated a murine model of CML in which autophagic flux can be measured in bone marrow-located LSCs. In parallel, we used cell division tracing, phenotyping of primary CML cells, and a robust xenotransplantation model of human CML to investigate the effect of Lys05, a highly potent lysosomotropic agent, and PIK-III, a selective inhibitor of VPS34, on the survival and function of LSCs. We demonstrate that long-term haematopoietic stem cells (LT-HSCs: Lin−Sca-1+c-kit+CD48−CD150+) isolated from leukemic mice have higher basal autophagy levels compared with non-leukemic LT-HSCs and more mature leukemic cells. Additionally, we show that, while HCQ is ineffective, Lys05-mediated autophagy inhibition reduces LSC quiescence and drives myeloid cell expansion. Furthermore, Lys05 and PIK-III reduced the number of primary CML LSCs and targeted xenografted LSCs when used in combination with TKI treatment, providing a strong rationale for clinical use of second-generation autophagy inhibitors as a novel treatment for CML patients with LSC persistence.
Does 'bigger' mean 'better'? Pitfalls and shortcuts associated with big data for social research
'Big data is here to stay.' This key statement has a double value: it is an assumption as well as the reason why a theoretical reflection is needed. Furthermore, big data is gaining visibility and success even in the social sciences, overcoming the division between the humanities and computer sciences. This contribution first outlines some considerations on the presence and likely persistence of big data as a socio-technical assemblage, and then develops the intriguing opportunities for social research arising from this interaction between practices and technological development. However, despite a promissory rhetoric, fostered by several scholars since the birth of big data as a labelled concept, some risks are just around the corner. The claims for the methodological power of ever-bigger datasets, and for increasing speed in analysis and data collection, are creating a real hype in social research. Particular attention is needed to avoid some pitfalls. These risks are analysed with respect to the validity of research results obtained through big data. After this pars destruens, the contribution concludes with a pars construens: taking the previous critiques into account, a mixed-methods research design approach is described as a general proposal, with the objective of stimulating a debate on the integration of big data into complex research projects.
License prices for financially constrained firms
It is often alleged that high auction prices inhibit service deployment. We investigate this claim under the extreme case of financially constrained bidders. If demand is just slightly elastic, auctions maximize consumer surplus if consumer surplus is a convex function of quantity (a common assumption), or if consumer surplus is concave and the proportion of expenditure spent on deployment is greater than one over the elasticity of demand. The latter condition appears to hold for most of the large telecom auctions in the US and Europe. Thus, even if high auction prices inhibit service deployment, auctions appear to be optimal from the consumers' point of view.
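The sufficient condition stated in the abstract is simple enough to check mechanically: either consumer surplus is convex in quantity, or it is concave and the share of the winner's expenditure devoted to deployment exceeds one over the demand elasticity. The sketch below encodes that disjunction; the example numbers are hypothetical, not data from the paper.

```python
# Sketch of the paper's sufficient condition for auctions to maximise
# consumer surplus. Example values are hypothetical.

def auctions_favour_consumers(cs_convex, deployment_share=None, elasticity=None):
    """True if either sufficient condition from the abstract holds."""
    if cs_convex:
        return True
    # Concave consumer surplus: compare the deployment expenditure
    # share against 1 / (elasticity of demand).
    return deployment_share > 1.0 / elasticity

# Example: concave CS, 40% of expenditure on deployment, elasticity 3.
ok = auctions_favour_consumers(False, deployment_share=0.4, elasticity=3.0)
```

With elasticity 3, the threshold share is 1/3, so a 40% deployment share satisfies the condition while a 20% share would not.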