
    Designing and Deploying Online Field Experiments

    Online experiments are widely used to compare specific design alternatives, but they can also be used to produce generalizable knowledge and inform strategic decision making. Doing so often requires sophisticated experimental designs, iterative refinement, and careful logging and analysis. Few tools exist that support these needs. We thus introduce a language for online field experiments called PlanOut. PlanOut separates experimental design from application code, allowing the experimenter to concisely describe experimental designs, whether common "A/B tests" and factorial designs or more complex designs involving conditional logic or multiple experimental units. These latter designs are often useful for understanding causal mechanisms involved in user behaviors. We demonstrate how experiments from the literature can be implemented in PlanOut, and describe two large field experiments conducted on Facebook with PlanOut. For common scenarios in which experiments are run iteratively and in parallel, we introduce a namespaced management system that encourages sound experimental practice.
    Comment: Proceedings of the 23rd International Conference on World Wide Web, 283-29
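    To make the separation of experimental design from application code concrete, here is a minimal sketch in the style of the open-source Python implementation of PlanOut. The `SimpleExperiment` class, the `assign` method, and the `UniformChoice`/`BernoulliTrial` operators follow the public reference implementation; the specific experiment, parameter names, and choices below are illustrative assumptions, not the experiments described in the paper.

```python
# Illustrative sketch of a PlanOut-style experiment (assumed parameter names).
# Requires the open-source `planout` package: pip install planout
from planout.experiment import SimpleExperiment
from planout.ops.random import UniformChoice, BernoulliTrial


class BannerExperiment(SimpleExperiment):
    """A hypothetical 2x2 factorial design: button color x social cue."""

    def assign(self, params, userid):
        # Each parameter is randomized deterministically, keyed on the unit
        # (here, userid), so assignments are stable across requests.
        params.button_color = UniformChoice(
            choices=['#3b5998', '#44bec7'], unit=userid)
        params.show_social_cue = BernoulliTrial(p=0.5, unit=userid)


# Application code only reads parameters; the design lives in assign().
exp = BannerExperiment(userid=42)
color = exp.get('button_color')
show_cue = exp.get('show_social_cue')
```

    Because the design is declared separately from the application code that reads the parameters, the same experiment definition can be iterated on, logged, and analyzed without touching the serving path, which is the separation the abstract emphasizes.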

    The Determinants of Success in the New Financial Services Environment: Now That Firms Can Do Everything, What Should They Do and Why Should Regulators Care?

    The United States government enacted the Banking Act of 1933, commonly known as the Glass-Steagall Act, at least partially in an effort to calm fears stemming from bank failures during the Great Depression. While there has been a recent debate concerning the historic realism of characterizing the banking industry structure as the cause of the financial crisis (Benston, 1990), the perception of bank activities in the financial market as risky (Puri, 1994), and the motivation of the legislators (Benston, 1996), the historical outcome of this legislation is clear. Glass-Steagall placed a heavy regulatory burden on commercial banks by limiting their product array, the prices they could charge, and the types of firms with which they could affiliate. In short, it restricted the activities in which banks could participate.
    During the ensuing sixty-five years, this landmark piece of regulation has slowly become both outdated and untenable. Technological innovation, regulatory circumvention, and new delivery mechanisms have all conspired to make the restrictions of the Act increasingly irrelevant. The first force of change, technology, permitted firms to create and recreate products and services in different ways than had been envisioned decades ago. The most obvious example is the transformation of the local mortgage loan market into the global securities giant of today. However, one could equally cite the explosive growth of both derivatives and trading activity as areas where technology has transformed the very core of financial services (Allen and Santomero, 1997).
    Because of regulation, however, individual financial firms were still limited in the scope of the activities that were permissible. Commercial banks could not offer the full range of security investment services; investment firms could not offer demand deposits; and insurance firms were limited in offering services beyond their own "appropriate" products as well. Many firms responded by circumventing regulation, either explicitly or implicitly (Kane, 1999; Kaufman, 1996). Some more aggressive members of the fraternity simply acted in a manner not allowed by regulation in hopes of either an innovative interpretation of the law, e.g., NOW accounts or money funds, or formal regulatory relief, e.g., Citigroup. The results were, almost always, regulatory accommodation or capitulation. These decisions, at times, made economic sense, e.g., the decisions on private placement activity or advisory services, but at other times they stretched the credibility of the rules, if not the English language, e.g., non-bank banks, the facilitation of commercial paper placement, and mutual fund distribution. Yet, through this mechanism of regulatory evolution the industry progressed. Banks were granted greater latitude in product mix, as well as permitted to form holding companies that expanded their operations further. At the same time, competition increased as the rules permitted new entrants who flourished in focused areas, e.g., GE Capital. Today, a myriad of financial services firms, operating under different regulatory charters, are competing in the broad financial marketplace.
    The final force of change is the continual evolution of the delivery channels through which financial services are offered. This has occurred in many ways and in several stages. First, the use of postal services substituted for physical market presence; this was followed by increased use of telephones for both customer service and outbound marketing; and now, personal computers and the web have altered the very balance of the financial industry. Throughout this period the application of technology has disrupted the industry's delivery paradigms and the traditional channels of service distribution. The combined use of new technology, conduits of distribution, and financial innovation has broadened the product offerings of all firms beyond their historic core business. Nonetheless, by law, financial service firms of specific types continued to be expressly limited in their activities.
    Finally, the Financial Modernization Act of 1999 (FMA), introduced on January 6, 1999 in the House of Representatives as H.R.10, has become law under the name the Gramm-Leach-Bliley Act. The bill's stated purpose was "[t]o enhance competition in the financial services industry by providing a prudential framework for the affiliation of banks, securities firms, and other financial service providers, and for other purposes." The potential ramifications of the FMA have been, and surely will be, continuously analyzed as the details of the enabling regulation emerge and the industry responds to its new perspective on firm structure and allowable activity (ABA, 1999; Stein and Perrino, 2000). Yet, the proponents of the FMA have already heralded its passage and argued that the legislation will result in more competitive, stable, and efficient financial firms, and a better overall capital market (Greenspan, 1997). Detractors, and there have been some, claim the new law will result in unfair business practices and less stable capital markets (Berger and Udell, 1996).
    In this contribution to the debate we attempt to consolidate many of the arguments for and against the financial conglomeration that will inevitably follow the passage of the new law. We offer our view of the effects of this new competitive landscape on affected financial firms, as well as on the behavior of the capital market itself. Our focus is on the impact of the changing nature of both the market infrastructure and the regulatory regime on the behavior and likely span of activity conducted by large financial firms. In the words of our title: now that firms can do everything, what should they do, and why should regulators care?


    Seeding with Costly Network Information

    We study the task of selecting $k$ nodes in a social network of size $n$, to seed a diffusion with maximum expected spread size, under the independent cascade model with cascade probability $p$. Most of the previous work on this problem (known as influence maximization) focuses on efficient algorithms to approximate the optimal seed set with provable guarantees, given the knowledge of the entire network. However, in practice, obtaining full knowledge of the network is very costly. To address this gap, we first study the achievable guarantees using $o(n)$ influence samples. We provide an approximation algorithm with a tight $(1-1/e)\mathrm{OPT}-\epsilon n$ guarantee, using $O_{\epsilon}(k^2\log n)$ influence samples, and show that this dependence on $k$ is asymptotically optimal. We then propose a probing algorithm that queries $O_{\epsilon}(p n^2\log^4 n + \sqrt{kp}\, n^{1.5}\log^{5.5} n + k n\log^{3.5} n)$ edges from the graph and uses them to find a seed set with the same almost tight approximation guarantee. We also provide a matching (up to logarithmic factors) lower bound on the required number of edges. To address the dependence of our probing algorithm on the independent cascade probability $p$, we show that it is impossible to maintain the same approximation guarantees by controlling the discrepancy between the probing and seeding cascade probabilities. Instead, we propose to down-sample the probed edges to match the seeding cascade probability, provided that it does not exceed that of probing. Finally, we test our algorithms on real-world data to quantify the trade-off between the cost of obtaining more refined network information and the benefit of the added information for guiding improved seeding strategies.
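    The setting can be illustrated with a short, self-contained sketch. This is not the paper's algorithm: it is a generic Monte Carlo treatment of the independent cascade model with classic greedy seed selection, plus the thinning step suggested by the abstract, which keeps each probed edge with probability p_seed / p_probe so the surviving edges behave like samples at the seeding cascade probability. All function names, parameters, and the toy graph are assumptions for illustration.

```python
# Generic illustration (not the paper's algorithm): independent cascade
# simulation, greedy seeding by Monte Carlo spread estimates, and
# down-sampling probed edges from probability p_probe to p_seed.
import random
from collections import defaultdict


def independent_cascade(adj, seeds, p, rng):
    """Simulate one cascade; each edge fires independently with probability p."""
    active, frontier = set(seeds), list(seeds)
    while frontier:
        nxt = []
        for u in frontier:
            for v in adj.get(u, ()):
                if v not in active and rng.random() < p:
                    active.add(v)
                    nxt.append(v)
        frontier = nxt
    return len(active)


def greedy_seeds(adj, k, p, rng, n_sims=200):
    """Greedily add the node with the largest estimated spread when joined to the seed set."""
    seeds = []
    for _ in range(k):
        best, best_est = None, -1.0
        for v in adj:
            if v in seeds:
                continue
            est = sum(independent_cascade(adj, seeds + [v], p, rng)
                      for _ in range(n_sims)) / n_sims
            if est > best_est:
                best, best_est = v, est
        seeds.append(best)
    return seeds


def downsample_edges(edges, p_probe, p_seed, rng):
    """Keep each probed edge w.p. p_seed / p_probe (requires p_seed <= p_probe)."""
    assert p_seed <= p_probe
    keep = p_seed / p_probe
    return [e for e in edges if rng.random() < keep]


if __name__ == "__main__":
    rng = random.Random(0)
    # Toy undirected graph as an adjacency list (assumed for illustration).
    edges = [(0, 1), (1, 2), (2, 3), (3, 4), (0, 4), (1, 3), (4, 5)]
    adj = defaultdict(list)
    for u, v in edges:
        adj[u].append(v)
        adj[v].append(u)
    print(greedy_seeds(adj, k=2, p=0.3, rng=rng))
    print(downsample_edges(edges, p_probe=0.5, p_seed=0.3, rng=rng))
```

    The greedy loop above assumes the whole graph is known; the abstract's contribution is obtaining comparable approximation guarantees while observing only a sublinear number of influence samples or a bounded number of probed edges, with the down-sampling step reconciling the probing and seeding cascade probabilities.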