72 research outputs found

    Deal or No Deal? Consumer Expectations and Competition in Daily Deals

    Get PDF
    Daily deals have emerged as an integral part of the marketing mix for retail merchants and have enjoyed wide acceptance by consumers. However, there is considerable ambiguity about the effects of deals on brand evaluation and the resulting electronic word-of-mouth (eWOM). In this paper, we propose that the effects of deals on eWOM are contingent on merchant heterogeneity and on whether consumers perceive merchants' marketing efforts as desperate. We empirically model the effects of daily deals on eWOM for restaurants in Washington, DC over 13 months. Results show that price segment, age, and competitive deal intensity strongly moderate the effect of deals on the resulting eWOM. We also show that deals have significant spillover effects on neighboring merchants who do not offer deals. We confirm these effects using three controlled lab experiments, where similar results are obtained without the possibility of deal redemption.
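The moderation effect this abstract describes can be illustrated with an ordinary-least-squares regression containing an interaction term. Everything below is synthetic and purely illustrative; the variable names and effect sizes are invented and are not the paper's actual specification:

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic merchant data: deal indicator, price segment (0 = budget,
# 1 = upscale), and an eWOM outcome (e.g., average review rating).
n = 500
deal = rng.integers(0, 2, n)
upscale = rng.integers(0, 2, n)
# Assumed data-generating process: deals hurt eWOM for budget merchants,
# but less so for upscale ones (the positive interaction term).
ewom = 3.5 - 0.4 * deal + 0.2 * upscale + 0.3 * deal * upscale \
       + rng.normal(0, 0.3, n)

# Design matrix with an interaction column capturing the moderation.
X = np.column_stack([np.ones(n), deal, upscale, deal * upscale])
beta, *_ = np.linalg.lstsq(X, ewom, rcond=None)
print(dict(zip(["intercept", "deal", "upscale", "deal_x_upscale"],
               beta.round(2))))
```

A negative `deal` coefficient together with a positive `deal_x_upscale` coefficient is the signature of the kind of moderation by price segment the paper reports.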

    Market Risk Estimation for (T+) Transactions

    Get PDF
    Market risk analysis and estimation are presented for T+ transactions as they are conducted on the Moscow Exchange. The need arises from the launch of a new REPO product with a Central Counterparty (CCP). Here the repurchase agreement goes through the National Clearing Center (NCC), which is both a bank and a clearing organization within the Moscow Exchange group. The NCC acts as an intermediary (the so-called "Central Counterparty") between trading participants. REPOs with a CCP create claims and commitments of the counterparties toward the CCP, which takes on the risk of non-performance by a defaulting party to the deal. The launch of REPO with a CCP prepared the technological platform for implementing T+2 trading at the Moscow Exchange, making it possible to enter security purchase/sale contracts with partial collateral. All these transactions (REPO with CCP, T+) made it necessary to determine security market risks. The paper aims to present VaR-like risk estimates. The methods used come from computational finance. An original time-series rate-of-return indicator is proposed and applied to build optimal portfolios under the Markowitz approach and to forecast their VaRs (losses), given real "big" share-price data and various horizons. Forecasting extreme portfolio returns and losses is our goal. To this end, forecasts are computed for three horizons (2, 5 and 10 days) and three significance levels. R-, Excel- and Bloomberg-based software tools were developed as needed. The full set of proposed computing steps, together with the accompanying tables and charts, may be considered candidates for inclusion in future market-risk standards. The results allow capital-market participants to select common stocks appropriate to the required risk level.
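A minimal sketch of the kind of pipeline the abstract describes, under simplifying assumptions: plain daily returns in place of the paper's custom rate-of-return indicator, a global minimum-variance Markowitz portfolio, historical VaR, and the square-root-of-time rule for the 2-, 5- and 10-day horizons (the paper's forecasting procedure is more elaborate). All data are synthetic:

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic daily returns for 4 stocks (the paper uses real Moscow
# Exchange data; the means and variances here are illustrative).
T, k = 1000, 4
returns = rng.multivariate_normal(
    mean=[0.0003, 0.0002, 0.0004, 0.0001],
    cov=np.diag([0.0001, 0.0002, 0.00015, 0.00025]),
    size=T,
)

# Global minimum-variance Markowitz weights: w ∝ Σ⁻¹·1, normalized.
cov = np.cov(returns, rowvar=False)
w = np.linalg.solve(cov, np.ones(k))
w /= w.sum()

# Historical 1-day portfolio VaR at three significance levels, scaled
# to the 2-, 5- and 10-day horizons with the square-root-of-time rule.
port = returns @ w
for alpha in (0.01, 0.05, 0.10):
    var_1d = -np.quantile(port, alpha)
    for h in (2, 5, 10):
        print(f"alpha={alpha:.2f} h={h:2d}d VaR={var_1d * np.sqrt(h):.4f}")
```

The nested loop mirrors the paper's grid of three horizons by three significance levels; a real implementation would replace the scaling rule with horizon-specific forecasts.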

    The track finding algorithm of the Belle II vertex detectors

    Get PDF
    The Belle II experiment is a high-energy multi-purpose particle detector operated at the asymmetric e+e− collider SuperKEKB in Tsukuba, Japan. In this work we describe the algorithm performing the pattern recognition for the inner tracking detector, which consists of two layers of pixel detectors and four layers of double-sided silicon strip detectors arranged around the interaction region. The track finding algorithm will be used both during on-line track reconstruction in the High Level Trigger and during the off-line full reconstruction. It must provide good efficiency down to momenta as low as 50 MeV/c, where material effects are sizeable even in a detector as thin as the VXD. In addition, it has to cope with the high occupancy of the Belle II detectors due to background. The underlying concept of the track finding algorithm, as well as details of the implementation, are outlined. The algorithm is shown to run with good performance on simulated Y(4S) → BB events, with a track reconstruction efficiency above 90% over a wide range of momenta.

    2-point statistics covariance with fewer mocks

    Full text link
    We present an approach for accurate estimation of the covariance of 2-point correlation functions that requires fewer mocks than the standard mock-based covariance. This can be achieved by dividing a set of mocks into jackknife regions and fitting the correction term first introduced in Mohammad & Percival (2022), such that the mean of the jackknife covariances corresponds to the one from the mocks. This extends the model beyond the shot-noise-limited regime, allowing it to be used for denser samples of galaxies. We test the performance of our fitted jackknife approach, both in terms of accuracy and precision, using lognormal mocks with varying densities and approximate EZmocks mimicking the DESI LRG and ELG samples in the redshift range z = [0.8, 1.2]. We find that the Mohammad-Percival correction produces a bias in the 2-point correlation function covariance matrix that grows with number density, and that our fitted jackknife approach does not. We also study the effect of the covariance on the uncertainty of cosmological parameters by performing a full-shape analysis. We find that our fitted jackknife approach based on 25 mocks is able to recover cosmological parameters as unbiased and as precise as the ones obtained from a covariance matrix based on 1000 or 1500 mocks, while the Mohammad-Percival correction produces uncertainties that are twice as large. The number of mocks required to obtain an accurate estimation of the covariance of 2-point correlation functions is therefore reduced by a factor of 40-60.
    Comment: 13 pages, 14 figures, submitted to MNRAS
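The delete-one jackknife resampling that underlies both the standard and the fitted jackknife covariance can be sketched as follows. This is the basic estimator only, not the Mohammad-Percival correction or its fitted extension, and the data are synthetic stand-ins for correlation-function measurements built from pair counts:

```python
import numpy as np

rng = np.random.default_rng(2)

# Stand-in for a measured 2-point correlation function: n_jk jackknife
# regions, each contributing a vector of length n_bin (separation bins).
n_jk, n_bin = 50, 10
regions = rng.normal(1.0, 0.2, size=(n_jk, n_bin))

# Delete-one jackknife: the i-th resample averages all regions but i.
total = regions.sum(axis=0)
resamples = (total[None, :] - regions) / (n_jk - 1)
mean = resamples.mean(axis=0)

# The jackknife covariance carries an (n_jk - 1) inflation factor
# relative to the plain sample covariance of the resamples.
diff = resamples - mean
cov_jk = (n_jk - 1) / n_jk * (diff.T @ diff)

print(cov_jk.shape)  # (10, 10)
```

The fitted approach of the paper then rescales this estimate so that its mean over mocks matches the mock-based covariance, which is where the reduction in the required number of mocks comes from.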

    Effects of word-of-mouth versus traditional marketing: findings from an internet social networking site

    Get PDF
    The authors study the effect of word-of-mouth (WOM) marketing on member growth at an Internet social networking site and compare it with traditional marketing vehicles. Because social network sites record the electronic invitations from existing members, outbound WOM can be precisely tracked. Along with traditional marketing, WOM can then be linked to the number of new members subsequently joining the site (sign-ups). Because of the endogeneity among WOM, new sign-ups, and traditional marketing activity, the authors employ a vector autoregression (VAR) modeling approach. Estimates from the VAR model show that WOM referrals have substantially longer carryover effects than traditional marketing actions and produce substantially higher response elasticities. Based on revenue from advertising impressions served to a new member, the monetary value of a WOM referral can be calculated; this yields an upper-bound estimate for the financial incentives the firm might offer to stimulate WOM.
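A VAR can be estimated equation-by-equation with ordinary least squares. The sketch below simulates a toy three-variable VAR(1) system (WOM referrals, traditional marketing, sign-ups) and traces the carryover of a WOM shock via impulse responses; the dynamics matrix and all magnitudes are invented for illustration and are not the paper's estimates:

```python
import numpy as np

rng = np.random.default_rng(3)

# Assumed dynamics: sign-ups (row 2) respond more strongly to lagged
# WOM (column 0) than to lagged traditional marketing (column 1).
A = np.array([[0.5, 0.0, 0.1],
              [0.0, 0.3, 0.0],
              [0.4, 0.2, 0.3]])
T = 400
y = np.zeros((T, 3))
for t in range(1, T):
    y[t] = A @ y[t - 1] + rng.normal(0, 0.1, 3)

# VAR(1) by equation-wise OLS: regress y_t on y_{t-1}.
X, Y = y[:-1], y[1:]
A_hat = np.linalg.lstsq(X, Y, rcond=None)[0].T

# Impulse response of sign-ups (index 2) to a unit WOM shock (index 0):
# powers of A_hat trace the carryover over future periods.
shock = np.array([1.0, 0.0, 0.0])
irf = [np.linalg.matrix_power(A_hat, h) @ shock for h in range(6)]
print([round(r[2], 3) for r in irf])
```

The slowly decaying impulse-response sequence is the "carryover" the abstract refers to; summing it over horizons gives a long-run multiplier that, combined with advertising revenue per member, would yield a referral value.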

    Deal or No Deal? The Effect of Online Deals on Consumer Quality Perceptions and Competition (Extended Abstract)

    No full text
    We study the effect of online deals, such as Groupon, on consumer quality expectations as expressed in online reviews. Through both empirical models using yelp.com's reviews and lab experiments, we find that the effect of online deals on online reviews is strongly moderated by merchant characteristics and competition.

    Marketing For Internet Social Networks

    No full text
    The idea of online communities is as old as... **Please do not cite or quote without permission of the authors**

    Estimating aggregate consumer preferences from online product reviews

    No full text
    Decker R, Trusov M. Estimating aggregate consumer preferences from online product reviews. International Journal of Research in Marketing. 2010;27(4):293-307.
    Today, consumer reviews are available on the Internet for a large number of product categories. The pros and cons expressed in this way uncover individually perceived strengths and weaknesses of the respective products, whereas the usually assigned product ratings represent their overall valuation. The key question at this point is how to turn the available plenitude of individual consumer opinions into aggregate consumer preferences, which can be used, for example, in product development or improvement processes. To solve this problem, an econometric framework is presented that can be applied to this type of data after preparing it with natural language processing techniques. The suggested methodology enables the estimation of parameters that allow inferences on the relative effect of product attributes and brand names on the overall evaluation of the products. Specifically, we discuss options for taking opinion heterogeneity into account. Both the practicability and the benefits of the suggested approach are demonstrated using product review data from the mobile phone market. This paper demonstrates that the review-based results compare very favorably with consumer preferences obtained through conjoint analysis techniques. (C) 2010 Elsevier B.V. All rights reserved.
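The core estimation step, turning attribute mentions in reviews into aggregate preference weights, can be caricatured as a regression of overall ratings on binary attribute indicators. The paper's econometric framework, with its NLP preprocessing and heterogeneity modeling, is considerably richer; the attribute names and effect sizes below are invented:

```python
import numpy as np

rng = np.random.default_rng(4)

# Toy review data: per review, binary indicators of whether an attribute
# (battery, camera, display) was praised, plus a 1-5 star overall rating.
n = 300
X_attr = rng.integers(0, 2, size=(n, 3)).astype(float)
true_weights = np.array([0.8, 0.5, 0.3])  # assumed, for illustration
stars = np.clip(2.5 + X_attr @ true_weights + rng.normal(0, 0.4, n), 1, 5)

# OLS yields aggregate part-worth-like weights per attribute, analogous
# to the relative attribute effects the framework estimates.
X = np.column_stack([np.ones(n), X_attr])
beta = np.linalg.lstsq(X, stars, rcond=None)[0]
for name, b in zip(["baseline", "battery", "camera", "display"], beta):
    print(f"{name:8s} {b:+.2f}")
```

Ranking attributes by their estimated weights is the review-based counterpart of the part-worth ordering one would get from conjoint analysis.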