
    Measuring and Optimizing Cultural Markets

    Social influence has been shown to create significant unpredictability in cultural markets, providing one potential explanation of why experts routinely fail at predicting the commercial success of cultural products. To counteract the difficulty of making accurate predictions, "measure and react" strategies have been advocated, but finding a concrete strategy that scales to very large markets has so far remained elusive. Here we propose a "measure and optimize" strategy based on an optimization policy that uses product quality, appeal, and social influence to maximize expected profits in the market at each decision point. Our computational experiments show that our policy leverages social influence to produce significant performance benefits for the market, while our theoretical analysis proves that our policy outperforms in expectation any policy that does not display social information. Our results contrast with earlier work, which focused on the unpredictability and inequalities created by social influence. We show for the first time that dynamically showing consumers positive social information under our policy increases the expected performance of the seller in cultural markets. We also show that, in reasonable settings, our policy does not introduce significant unpredictability and identifies "blockbusters". Overall, these results shed new light on the nature of social influence and how it can be leveraged for the benefit of the market.

    The benefits of social influence in optimized cultural markets

    Social influence has been shown to create significant unpredictability in cultural markets, providing one potential explanation of why experts routinely fail at predicting the commercial success of cultural products. As a result, social influence is often presented in a negative light. Here, we show the benefits of social influence for cultural markets. We present a policy that uses product quality, appeal, position bias and social influence to maximize expected profits in the market. Our computational experiments show that our profit-maximizing policy leverages social influence to produce significant performance benefits for the market, while our theoretical analysis proves that our policy outperforms in expectation any policy that does not display social signals. Our results contrast with earlier work, which focused on the unpredictability and inequalities created by social influence. We show for the first time that, under our policy, dynamically showing consumers positive social signals increases the expected profit of the seller in cultural markets. We also show that, in reasonable settings, our profit-maximizing policy does not introduce significant unpredictability and identifies "blockbusters". Overall, these results shed new light on the nature of social influence and how it can be leveraged for the benefit of the market.
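    To make the idea of such a display policy concrete, the sketch below shows one way a "measure and optimize" step could look in a trial-offer market. It is a minimal illustration, not the authors' algorithm: the multiplicative choice model and the parameter names (appeal, quality, social signal, position visibility) are assumptions, and under that model assigning the highest-scoring products to the most visible positions maximizes the expected number of purchases by the rearrangement inequality.

```python
# A minimal sketch (not the authors' exact algorithm) of one "measure and
# optimize" decision in a trial-offer market.  Hypothetical parameters:
# position p has visibility v[p]; product i has appeal a[i], quality q[i]
# (probability of purchase after a trial) and social signal s[i] (past
# purchases shown to consumers).  If the trial probability of product i in
# position p is proportional to v[p] * a[i] * (1 + s[i]), expected purchases
# are maximised by matching high scores to high-visibility positions.

def optimize_ranking(appeal, quality, social, visibility):
    """Return a list `ranking` where ranking[p] is the product shown in position p."""
    n = len(appeal)
    scores = [appeal[i] * quality[i] * (1.0 + social[i]) for i in range(n)]
    products_by_score = sorted(range(n), key=lambda i: scores[i], reverse=True)
    positions_by_visibility = sorted(range(n), key=lambda p: visibility[p], reverse=True)
    ranking = [None] * n
    for prod, pos in zip(products_by_score, positions_by_visibility):
        ranking[pos] = prod
    return ranking

# Example: 4 products, visibility decaying down the list.
ranking = optimize_ranking(
    appeal=[0.9, 0.5, 0.7, 0.3],
    quality=[0.2, 0.8, 0.5, 0.9],
    social=[10, 3, 0, 1],
    visibility=[1.0, 0.6, 0.3, 0.1],
)
print(ranking)
```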

    Predictability of extreme events in social media

    It is part of our daily social-media experience that seemingly ordinary items (videos, news, publications, etc.) unexpectedly gain an enormous amount of attention. Here we investigate how unexpected these events are. We propose a method that, given some information on the items, quantifies the predictability of events, i.e., the potential of identifying the most successful items in advance, defined as the upper bound on the quality of any prediction based on the same information. Applying this method to different data, ranging from views of YouTube videos to posts in Usenet discussion groups, we invariably find that predictability increases for the most extreme events. This indicates that, despite the inherently stochastic collective dynamics of users, efficient prediction is possible for the most extreme events.
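    As a rough illustration of what "predictability of the most extreme events" can mean operationally, the sketch below measures how well an early signal recovers the true top fraction of items by final success. It is a simplified stand-in, not the paper's estimator: the synthetic data, the choice of early activity as the available information, and the precision-at-k measure are all assumptions made for the example.

```python
# A rough sketch of quantifying "predictability": given early information on
# each item, how well does ranking on it recover the true top-eps fraction
# (the extreme events)?  Simplified stand-in for the paper's upper-bound
# estimator; data and measure are assumptions for the example.

import numpy as np

def predictability(early_signal, final_success, eps):
    """Fraction of the true top-eps items recovered by ranking on the early signal."""
    n = len(final_success)
    k = max(1, int(eps * n))
    top_true = set(np.argsort(final_success)[-k:])
    top_pred = set(np.argsort(early_signal)[-k:])
    return len(top_true & top_pred) / k

# Synthetic example: final success is early attention times multiplicative noise.
rng = np.random.default_rng(0)
early = rng.lognormal(size=10_000)
final = early * rng.lognormal(sigma=0.5, size=10_000)
for eps in (0.10, 0.01):
    print(f"eps={eps:.2f}  predictability={predictability(early, final, eps):.2f}")
# Whether this number rises as eps shrinks depends on the data; that
# empirical question is exactly what the paper investigates.
```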

    Path dependence, its critics and the quest for ‘historical economics’

    The concept of path dependence refers to a property of contingent, non-reversible dynamical processes, including a wide array of biological and social processes that can properly be described as 'evolutionary.' To dispel existing confusions in the literature, and to clarify the meaning and significance of path dependence for economists, the paper formulates definitions that relate the phenomenon to the property of non-ergodicity in stochastic processes; it examines the nature of the relationship between path dependence and 'market failure,' and discusses the meaning of 'lock-in.' Unlike tests for the presence of non-ergodicity, assessments of the economic significance of path dependence are shown to involve difficult issues of counterfactual specification, and the welfare evaluation of alternative dynamic paths rather than terminal states. The policy implications of the existence of path dependence are shown to be more subtle and, as a rule, quite different from those presumed by critics of the concept. A concluding section applies the notion of 'lock-in' reflexively to the evolution of economic analysis, suggesting that resistance to historical economics is a manifestation of 'sunk cost hysteresis' in the sphere of human cognitive development.
    Keywords: path dependence, non-ergodicity, irreversibility, lock-in, counterfactual analysis
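    As a toy illustration of the contingent, non-ergodic dynamics the paper formalises, the classic Polya-urn scheme is a convenient sketch (it is not drawn from the paper itself): early chance adoptions raise the probability of later adoptions, so the long-run outcome depends on the particular history rather than converging to a single predetermined value.

```python
# The generalised Polya urn as a standard toy model of path dependence and
# lock-in (illustrative only; not taken from the paper): each adoption of
# technology A or B makes the same choice more likely later, so early chance
# events lock in very different long-run market shares.

import random

def polya_urn(steps=10_000, seed=None):
    random.seed(seed)
    a, b = 1, 1                      # one initial adopter of each technology
    for _ in range(steps):
        if random.random() < a / (a + b):
            a += 1                   # adopting A raises A's future adoption probability
        else:
            b += 1
    return a / (a + b)               # long-run market share of technology A

# Each history converges, but to a different, history-dependent limit.
print([round(polya_urn(seed=s), 3) for s in range(5)])
```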

    Can the Capability Approach be Evaluated within the Frame of Mainstream Economics? A Methodological Analysis

    The aim of this article is to examine the capability approach of Amartya Sen and mainstream economic theory in terms of their epistemological, methodological and philosophical/cultural aspects. The reason for undertaking this analysis is the belief that Sen's capability approach, contrary to some economists' claims, is uncongenial to mainstream economic views on epistemology and methodology (though not on ontology). However, while some social scientists regard Sen as, on the whole, a mainstream economist, his own approach strongly criticizes both the theory and practice of mainstream economics.
    Keywords: Amartya Sen, mainstream economics, methodological individualism

    Modeling consumer behaviour in the presence of network effects

    Consumer choice models are a key component of fields such as Revenue Management and Transport Logistics, where the demands for certain products or services are assumed to follow a particular form, and sellers or market-makers use that information to adjust their strategies accordingly, choosing for example which products to display (assortment problem) or their prices (pricing problem). In the last couple of decades, online markets have gained considerable relevance, providing a setting where consumers can easily compare different products before deciding to buy them. More information is now available, and purchasing decisions depend not only on the quality, prices and availability of the products, but also on what previous consumers think about them (a phenomenon commonly known as Network Effects). Hence, in order to create a suitable model for this kind of market, it is relevant to understand how collective decisions affect the market's evolution.

    In this thesis we consider a particular subset of those online markets, cultural markets, where the products are, for example, songs, video games or ebooks. This kind of market has the special feature that its products have unlimited supply (since they are just digital copies), and we exploit this in our models to justify assumptions about the asymptotic behaviour of the market. We study some variations of the traditional Multinomial Logit (MNL) model characterising the behaviour of consumers, where purchasing decisions are affected by the quality and prices (initially fixed) of the available products, as well as by their visibilities in the market interface and the consumption patterns of previous users. We focus particularly on the parameters associated with the network effects: depending on their strength, the model can explain herd behaviours, where one alternative overpowers the rest, as well as more evenly distributed settings, where all the alternatives receive enough attention, giving a notion of fairness since higher-quality products get a larger market share. Finally, using the model where market shares are distributed according to the quality of the products, we study pricing strategies where sellers can either collaborate or compete, and we analyse the effect of both types of strategies on the choice model.
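    To make the kind of model described above concrete, here is a minimal sketch of an MNL-style choice rule augmented with network effects. The functional form and parameter names (quality q, price p, visibility v, purchase counts d, network-strength alpha) are illustrative assumptions, not the thesis's exact specification; the sketch only shows how a stronger network effect can tip the market from quality-proportional shares toward herding on a single product.

```python
# A minimal sketch (hypothetical parameterisation, not the thesis model) of an
# MNL-style choice rule with network effects: the attractiveness of product i
# combines its intrinsic quality q[i], price p[i], interface visibility v[i]
# and the number of previous purchases d[i]; alpha controls the strength of
# the network effect (alpha = 0 recovers a plain MNL).

import math
import random

def choice_probabilities(q, p, v, d, alpha=1.0, price_sens=1.0):
    weights = [v[i] * math.exp(q[i] - price_sens * p[i]) * (1 + d[i]) ** alpha
               for i in range(len(q))]
    total = sum(weights)
    return [w / total for w in weights]

def simulate(q, p, v, consumers=5_000, alpha=1.0):
    d = [0] * len(q)                       # cumulative purchases (the social signal)
    for _ in range(consumers):
        probs = choice_probabilities(q, p, v, d, alpha)
        i = random.choices(range(len(q)), weights=probs)[0]
        d[i] += 1
    return [x / consumers for x in d]      # final market shares

q = [1.0, 0.8, 0.6]; p = [1.0, 1.0, 1.0]; v = [1.0, 1.0, 1.0]
print("weak network effects  :", simulate(q, p, v, alpha=0.2))
print("strong network effects:", simulate(q, p, v, alpha=1.5))
```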

    The Market Process and the Emergence of the Firm: Some Indications of Entrepreneurship Under Genuine Uncertainty

    This paper examines the nature of genuine uncertainty and rule-following behaviour and suggests some implications for the theory of the firm. The firm is seen here as emerging as a means to manage some of the experienced uncertainty. The nature of the firm is perceived as an evolving institution creating predictability both inside the firm and in the market. But because of the spontaneous nature of the life-world, social processes remain open-ended. This subjectivist perspective cannot assign any particular premeditated purpose to the spontaneous order which emerges through the market process. The process is not kaleidic, but neither is it considered to be moving toward increasing efficiency. Rules and institutions provide predictability to the extent that novelties can be introduced into the process. Discoveries do not, however, only introduce new outcomes into the market process; they also change the rules of the game.
    Keywords: uncertainty, institutions, transaction costs

    Assortment and Pricing Optimisation under non-conventional customer choice models

    Nowadays, extensive research is being done in the area of revenue management, with applications across industries. At the centre of this area lies the assortment problem, which amounts to finding a subset of products to offer in order to maximise revenue, provided that customers follow a certain model of choice. Most studied models satisfy the following property: whenever the offered set is enlarged, the probability of selecting a specific product decreases. This property is called regularity in the literature. However, customer behaviour often shows violations of this condition, such as the decoy effect, where adding extra options sometimes has a positive effect for some products, whose probabilities of being selected increase relative to other products (e.g., including a medium-sized popcorn slightly cheaper than the large one, with the purpose of making the latter more attractive by comparison). We study two models of customer choice in which regularity violations can be accommodated (hence the non-conventionality), and show that the assortment optimisation problem can still be solved in polynomial time.

    First we analyse the Sequential Multinomial Logit (SML). Under the SML model, products are partitioned into two levels, to capture differences in attractiveness, brand awareness and/or visibility of the products in the market. When a consumer is presented with an assortment of products, she first considers products in the first level and, if none of them is purchased, products in the second level are considered. This model is a special case of the Perception-Adjusted Luce Model (PALM) recently proposed by Echenique et al. (2018). It can explain many behavioural phenomena, such as the attraction, compromise and similarity effects and choice overload, which cannot be explained by the Multinomial Logit (MNL) model or any discrete choice model based on random utility. We show that the concept of revenue-ordered assortments, which contain an optimal assortment under the MNL model, can be generalized to the SML model. More precisely, we show that all optimal assortments under the SML are revenue-ordered by level, a natural generalization of revenue-ordered assortments that contains at most a quadratic number of assortments. As a corollary, assortment optimisation under the SML is polynomial-time solvable.

    Secondly, we study the Two-Stage Luce model (2SLM), a discrete choice model introduced by Echenique and Saito (2018) that generalizes the standard multinomial logit model (MNL). The 2SLM satisfies neither the Independence of Irrelevant Alternatives (IIA) property nor regularity; to model customer behaviour, each product has an intrinsic utility, and the model uses a dominance relation between products. Given a proposed assortment S, consumers first discard all dominated products in S before using an MNL model on the remaining products. As a result, the model can capture behaviour that cannot be replicated by any discrete choice model based on random utilities. We show that the assortment problem under the 2SLM is polynomially solvable. Moreover, we prove that the capacitated assortment optimisation problem is NP-hard and present polynomial-time algorithms for the cases where (1) the dominance relation is attractiveness-correlated and (2) its transitive reduction is a forest. The proofs exploit a strong connection between assortments under the 2SLM and independent sets in comparability graphs.

    The third and final contribution is an in-depth study of the pricing problem under the 2SLM. We first note that changes in prices should be reflected in the dominance relation if the differences between the resulting attractiveness values are large enough. This is formalised by solving the joint assortment and pricing problem under the Threshold Luce model, where one product dominates another if the ratio between their attractiveness values is greater than a fixed threshold. In this setting, we show that this problem can be solved in polynomial time.
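    To illustrate the two-stage structure discussed above, here is a minimal sketch of choice probabilities under a Threshold-Luce-style rule: a product is discarded when some other offered product is more attractive by more than a fixed threshold ratio, and the surviving products are chosen with standard Luce (MNL-type) probabilities. The parameter names and the absence of a no-purchase option are simplifications for the example, not the thesis's exact formulation.

```python
# A minimal sketch of a Threshold-Luce-style choice rule (a special case of
# the Two-Stage Luce model described above): product i dominates product j
# when w[i] / w[j] exceeds a fixed threshold; consumers discard dominated
# products from the offered assortment and split demand among the survivors
# in proportion to their attractiveness.  Names are illustrative assumptions.

def threshold_luce_probs(assortment, w, threshold):
    """assortment: list of product indices; w: attractiveness of each product."""
    survivors = [i for i in assortment
                 if not any(w[j] / w[i] > threshold for j in assortment if j != i)]
    total = sum(w[i] for i in survivors)
    return {i: (w[i] / total if i in survivors else 0.0) for i in assortment}

w = {0: 10.0, 1: 4.0, 2: 1.0}
print(threshold_luce_probs([0, 1, 2], w, threshold=3.0))
# Product 2 is dominated by product 0 (ratio 10 > 3), so it gets probability 0;
# products 0 and 1 split the demand in proportion to their attractiveness.
```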