40 research outputs found

    Optimal Tax Mix with Income Tax Non-compliance

    Abstract: Although developing countries face high levels of income inequality, they rely more heavily on consumption taxes, which tend to be linear and are less effective for redistribution than a non-linear income tax. One explanation for this pattern is that consumption taxes are generally more enforceable in these economies. This paper studies the optimal combination of a linear consumption tax and a non-linear income tax for redistributive purposes. In our model, households may not comply with the income tax code, reporting income levels that differ from their true income; the consumption tax, however, is fully enforceable. We derive a formula for the optimal income tax schedule as a function of the consumption tax rate, the recoverable elasticities, and the moments of the taxable income distribution. Our equation differs from those of …
    JEL: D31, D63, D82, H21, H23, H26, I30, O2
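
    The optimal schedule formula itself is truncated above. For orientation only, here is a minimal LaTeX sketch of the textbook Diamond–Saez condition adjusted for a fully enforceable linear consumption tax at rate t; this is a standard benchmark, not the paper's own derivation, and every symbol below is an assumption:

```latex
% Benchmark sketch (not the paper's formula): with a linear consumption
% tax t, the budget constraint (1+t)c = z - T(z) makes the effective
% wedge on earned income
\[
  \tau(z) \;=\; \frac{T'(z) + t}{1 + t},
\]
% and the Diamond--Saez condition for the optimal schedule reads
\[
  \frac{\tau(z)}{1-\tau(z)}
  \;=\; \frac{1}{e(z)} \cdot \frac{1 - F(z)}{z\, f(z)} \cdot \bigl(1 - \bar g(z)\bigr),
\]
% where T'(z) is the marginal income tax rate at income z, e(z) the
% elasticity of taxable income, F and f the distribution and density of
% taxable income, and \bar g(z) the mean social marginal welfare weight
% above z.
```

    Under income tax non-compliance, the paper's formula would additionally depend on the margins of misreporting, which this benchmark omits.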

    Commercialization of the Internet: The Interaction of Public Policy and Private Actions

    Abstract: Why did commercialization of the Internet go so well? This paper examines events in the Internet access market as a window on this broad question. The study emphasizes four themes. First, commercializing Internet access did not give rise to many of the anticipated technical and operational challenges. Entrepreneurs quickly learned that the Internet access business was commercially feasible. Second, Internet access was malleable as a technology and as an economic unit. Third, privatization fostered attempts to adapt the technology to new uses, new locations, new market settings, new applications, and new combinations with other lines of business. These went beyond what anyone would have forecast by examining the uses for the technology prior to 1992. Fourth, and not trivially, the NSF was lucky in one specific sense: the Internet access industry commercialized at a propitious moment, at the same time as the growth of an enormous new technological opportunity, the World Wide Web. As it turned out, the Web thrived under market-oriented, decentralized, and independent decision making. The paper draws lessons for policies governing the commercialization of other government-managed technologies and for the Internet access market moving forward.

1 Motivation

The "commercialization of the Internet" is shorthand for three nearly simultaneous events: the removal of restrictions by the National Science Foundation (NSF) over use of the Internet for commercial purposes, the browser wars initiated by the founding of Netscape, and the rapid entry of tens of thousands of firms into commercial ventures using technologies which employ the suite of TCP/IP standards. These events culminated years of work at NSF to transfer the Internet into commercial hands from its exclusive use for research activity in government-funded laboratories and universities.

Sufficient time has passed to begin to evaluate how the market performed after commercialization. Such an evaluation is worth doing. Actual events have surpassed the forecasts of the most optimistic managers at NSF. Was this due to mere good fortune, or to something systematic whose lessons illuminate the market today? Other government-managed technologies usually face vexing technical and commercial challenges that prevent the technology from diffusing quickly, if at all. Can we draw lessons from this episode for the commercialization of other government-managed technologies?

In that spirit, this paper examines the Internet access market and one set of actors, Internet Service Providers (ISPs). ISPs provide Internet access for most of the households and business users in the country (NTIA, 1999), usually for a fee or, more recently, in exchange for advertising. Depending on the user's facilities, whether a business or a personal residence, access can involve dial-up to a local or 1-800 number at different speeds, or direct access to the user's server employing one of several high-speed access technologies. The largest ISP in the United States today is America Online, to which approximately half the households in the US subscribe. There are also many national ISPs with recognizable names, such as AT&T Worldnet, MCI WorldCom/UUNet, Mindspring/Earthlink, and PSINet, as well as thousands of smaller regional ISPs.

The Internet access market is a good case to examine. Facilities for similar activity existed prior to commercialization, but there was reason to expect a problematic migration into commercial use. This activity appeared to possess idiosyncratic technical features and uneconomic operational procedures which made it unsuitable in other settings. The Internet's exclusive use by academics and researchers fostered cautious predictions that unanticipated problems would abound and commercial demand might not materialize.

In sharp contrast to cautious expectations, however, the ISP market displayed three extraordinary features. For one, this market grew rapidly, attracting thousands of entrants and many users, quickly achieving mass-market status. Second, firms offering this service became nearly geographically pervasive, a diffusion pattern rarely found in new infrastructure markets. And third, firms did not settle on a standard menu of services to offer, indicative of new commercial opportunities and also of a lack of consensus about the optimal business model for this opportunity. Aside from defying expectations, all three traits (rapid growth, geographic pervasiveness, and the absence of settlement) do not inherently go together in most markets. The presence of restructuring should have interfered with rapid growth and geographic expansion. So explaining this market experience is also interesting in its own right.

What happened to make commercialization go so well? This paper's examination reveals four themes.

First, commercialization did not give rise to many of the anticipated technical and operational challenges. Entrepreneurs quickly learned that the Internet access business was commercially feasible. This happened for a variety of economic reasons. ISPs began offering commercial service after making only incremental changes to familiar operating procedures borrowed from the academic setting. It was technically easy to collect revenue at what used to be the gateway functions of academic modem pools. Moreover, the academic model of Internet access migrated into commercial operation without any additional new equipment suppliers.

Second, Internet access was malleable as a technology and as an economic unit. This is because the foundation for Internet inter-connectivity, TCP/IP, is not a single invention, diffusing across time and space without changing form. Instead, it is embedded in equipment which uses a suite of communication technologies, protocols, and standards for networking between computers. This technology obtains economic value in combination with complementary invention, investment, and equipment. While commercialization did give rise to restructuring of Internet access to suit commercial users, the restructuring did not stand in the way of diffusion, nor interfere with the initial growth of demand.

Third, privatizing Internet access fostered customizing Internet access technology to a wide variety of locations, circumstances, and users. As it turned out, the predominant business model was feasible at small scale and, thus, at low levels of demand. This meant that the technology was commercially viable at low densities of population, whether or not it was part of a national branded service or a local, geographically focused one.

Fourth, and not trivially, the NSF was lucky in a particular sense of the word. It enabled the commercialization of the Internet access industry at a propitious moment, at the same time as the growth of an enormous new technological opportunity, the World Wide Web. This invention motivated further experimentation to take advantage of the new opportunity, which, as it turned out, thrived under market-oriented and decentralized decision making.

The paper first develops these themes. Then it describes recent experience. It ends by discussing how these themes continue to resonate today.

Challenges during technology transfer: an overview

Conventional approaches to technological development led most observers in 1992 to be cautious about the commercialization of the Internet. To understand how this prediction went awry, it is important to understand its foundations.

Technical challenges arise when a design tailored to government requirements must be reworked for commercial users. For example, military users frequently require electronic components to meet specifications that suit the component to battle conditions. Extensive technical progress is needed to tailor a product design to meet these requirements. Yet, and this is difficult to anticipate prior to commercialization, an additional amount of invention is often needed to bring such a product design and its manufacturing to a price point with features that meet more cost-conscious or less technically stringent commercial requirements.

Commercial challenges arise when commercial markets require substantial adaptation of operational and business processes in order to put technologies into use. In other words, government users or users in a research environment often tolerate operational processes that do not translate profitably to commercial environments. After a technology transfers out of government sponsorship, it may not be clear how to balance costs and revenues for technologies that developed in settings where substantial subsidies underwrote losses and research goals justified expenditures. Hence, many government-managed technologies require considerable experimentation with business models before they begin to grow, if they grow at all. For example, the supersonic transport actually met its engineering targets, but still failed to satisfy basic operational economics in most settings. Being technically sleek was insufficient to attract enough interest to generate revenue to cover operating costs on any but a small set of routes. No amount of operational innovation or marketing could overcome these commercial problems.

New technologies are also vulnerable to structural challenges that impede pathways to commercialization. Commercial and structural challenges are not necessarily distinct, though the latter are typically more complex. Structural challenges are those which require change to the bundle of services offered, change to the boundary of the firms offering or using the new technology, or dramatic change to the operational structure of the service organization. These challenges arise because technologies developed under government auspices may presume implementation at a particular scale or with a set of technical standards, but require a different set of organizational arrangements to support commercial applications. For example, while many organizations provided the technical advances necessary for scientific computing in academic settings during the 1950s, very few of these same firms migrated into supporting large customer bases among business users. As it turned out, the required changes were too dramatic for many companies to make. The structures of the support and sales organizations were very different, and so too were the product designs. Of course, the few who successfully made the transition to commercial users, such as IBM, did quite well, but doing so required overcoming considerable obstacles.

In summary, conventional analysis forecast that migrating Internet access into commercial use would engender technical, commercial, and structural challenges. Why did the migration proceed so differently than expected?

The absence of challenge in the Internet access industry

An ISP is a commercial firm that provides Internet access, maintains it for a fee, and develops related applications as users require. While sometimes this is all they do, with business users they often do much more. Sometimes ISPs do simple things such as filtering. Sometimes the work involves managing and designing email accounts, databases, and web pages. Some ISPs label this activity consulting and charge for it separately; others do not consider it distinct from the normal operation of Internet access services.

On the surface the record of achievement for ISPs is quite remarkable. Most recent surveys show that no more than 10 percent of US households get their Internet access from university-sponsored Internet access providers, the predominant providers of such access prior to commercialization. Today almost all users go to a commercial provider. By the end of the century the ISP market had attained a remarkable structure. One firm, America Online, provided access to close to half the households in the US market, while several score of other ISPs provided access to millions of households and businesses on a nationwide basis. Thousands of ISPs also provided access for limited geographic areas, such as one city or region. Such small ISPs accounted for roughly a quarter of household use and another fraction of business use.

Technical challenges did not get in the way

The Internet access market did suffer from some technical challenges, but not enough to prevent rapid diffusion. Commercialization induced considerable technical innovation in complementary inventive activities. Much of this innovative activity became associated with developing new applications for existing users and new users. It is often forgotten that when electronic commerce first developed based on TCP/IP standards, the technology was relatively mature in some applications, such as e-mail and file transfers, which were the most popular applications (these programs continue to be the most popular today; NTIA, 1999). To be sure, TCP/IP-based programs were weak in other areas, such as commercial database and software applications for business use, but those uses did not necessarily have to come immediately. The invention of the World Wide Web in the early 1990s further stretched the possibilities for potential applications and highlighted these weaknesses.

More important for the initial diffusion, little technical invention was required for commercial vendors to put this technology into initial mainstream use. Academic modem pools and computing centers tended to use technologies similar to their civilian counterparts, such as bulletin board operators, while buying most equipment from commercial suppliers. Moving this activity into the mainstream commercial sector did not necessitate building a whole new Internet equipment industry; it was already there, supplying goods and services to the universities and to home PC users. Similarly, much of the software continued to be useful, i.e., Unix systems, the gate-keeping software, and the basic communication protocols. Indeed, every version of Unix software had been TCP/IP compatible for many years due to Department of Defense requirements. A simple commercial operation only needed to add a billing component to the gate-keeping software to turn an academic modem pool into a rudimentary commercial operation. Technical information about these operations was easy to obtain if one had sufficient technical background; a BA in basic electrical engineering or computer science was far more than adequate.

    Evaluating operational AVHRR sea surface temperature data at the coastline using surfers

    Sea surface temperature (SST) is an essential climate variable that can be measured routinely from Earth Observation (EO) with high temporal and spatial coverage. To evaluate its suitability for an application, it is critical to know the accuracy and precision (performance) of the EO SST data. This requires comparisons with co-located and concomitant in situ data. Owing to a relatively large network of in situ platforms, there is a good understanding of the performance of EO SST data in the open ocean. However, at the coastline this performance is not well known, impeded by a lack of in situ data. Here, we used in situ SST measurements collected by a group of surfers over a three-year period in the coastal waters of the UK and Ireland to improve our understanding of the performance of EO SST data at the coastline. At two beaches near the city of Plymouth, UK, the in situ SST measurements collected by the surfers were compared with in situ SST collected from two autonomous buoys located ∼7 km and ∼33 km from the coastline, and showed good agreement, with discrepancies consistent with the spatial separation of the sites. The in situ SST measurements collected by the surfers around the coastline, and those collected offshore by the two autonomous buoys, were used to evaluate the performance of operational Advanced Very High Resolution Radiometer (AVHRR) EO SST data. Results indicate: (i) a significant reduction in the performance of AVHRR at retrieving SST at the coastline, with root mean square errors in the range of 1.0 to 2.0 °C depending on the temporal difference between match-ups, significantly higher than those at the two offshore stations (0.4 to 0.6 °C); (ii) a systematic negative bias in the AVHRR retrievals of approximately 1 °C at the coastline, not observed at the two offshore stations; and (iii) an increase in the root mean square error at the coastline when the temporal difference between match-ups exceeded three hours. Harnessing new solutions to improve in situ sampling coverage at the coastline, such as tagging surfers with sensors, can improve our understanding of the performance of EO SST data in coastal regions, helping inform users interested in EO SST products for coastal applications. Yet, validating EO SST products using in situ SST data at the coastline is challenged by difficulties reconciling the two measurements, which are provided at different spatial scales in a dynamic and complex environment.
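
    The match-up statistics described above (satellite-minus-in situ bias and RMSE, stratified by the time difference between measurements) are straightforward to compute. A minimal Python sketch, assuming hypothetical arrays of co-located satellite and in situ temperatures with timestamps; the function name, inputs, and thresholds are illustrative, not the authors' code:

```python
import numpy as np

def matchup_stats(sst_sat, sst_insitu, t_sat, t_insitu, max_dt_hours=3.0):
    """Bias and RMSE of satellite-minus-in situ SST for match-ups whose
    absolute time difference is within max_dt_hours."""
    sst_sat = np.asarray(sst_sat, dtype=float)
    sst_insitu = np.asarray(sst_insitu, dtype=float)
    # Time difference in hours between each satellite retrieval and its
    # co-located in situ measurement (timestamps given in hours).
    dt = np.abs(np.asarray(t_sat, dtype=float) - np.asarray(t_insitu, dtype=float))
    keep = dt <= max_dt_hours
    diff = sst_sat[keep] - sst_insitu[keep]   # satellite minus in situ, degC
    bias = diff.mean()                        # systematic offset
    rmse = np.sqrt((diff ** 2).mean())        # total error
    return bias, rmse, int(keep.sum())

# Illustrative use with made-up numbers (degC and hours):
bias, rmse, n = matchup_stats(
    sst_sat=[12.1, 11.4, 13.0, 12.6],
    sst_insitu=[13.0, 12.5, 13.4, 13.8],
    t_sat=[0.0, 5.0, 10.0, 26.0],
    t_insitu=[1.0, 4.5, 12.5, 25.0],
)
print(f"n={n}, bias={bias:.2f} degC, rmse={rmse:.2f} degC")
```

    Varying max_dt_hours (e.g., below versus above three hours) reproduces the kind of time-difference sensitivity reported in point (iii).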

    Incidence of Schizophrenia and Other Psychoses in England, 1950–2009: A Systematic Review and Meta-Analyses

    Background: We conducted a systematic review of incidence rates in England over a sixty-year period to determine the extent to which rates varied along accepted (age, sex) and less-accepted epidemiological gradients (ethnicity, migration, place of birth and upbringing, and time). Objectives: To determine variation in the incidence of several psychotic disorders along these gradients. Data Sources: Published and grey literature searches (MEDLINE, PsycINFO, EMBASE, CINAHL, ASSIA, HMIC), and identification of unpublished data through bibliographic searches and author communication. Study Eligibility Criteria: Published 1950–2009; conducted wholly or partially in England; original data on incidence of non-organic adult-onset psychosis or on one or more factor(s) pertaining to incidence. Participants: People, 16–64 years, with first-onset psychosis, including non-affective psychoses, schizophrenia, bipolar disorder, psychotic depression and substance-induced psychosis. Study Appraisal and Synthesis Methods: Title, abstract and full-text review by two independent raters to identify suitable citations. Data were extracted to a standardized extraction form. Descriptive appraisals of variation in rates, including tables and forest plots, and, where suitable, random-effects meta-analyses and meta-regressions to test specific hypotheses; rate heterogeneity was assessed by the I² statistic. Results: 83 citations met the inclusion criteria. Pooled incidence of all psychoses (N = 9) was 31.7 per 100,000 person-years (95% CI: 24.6–40.9); 23.2 (95% CI: 18.3–29.5) for non-affective psychoses (N = 8); 15.2 (95% CI: 11.9–19.5) for schizophrenia (N = 15); and 12.4 (95% CI: 9.0–17.1) for affective psychoses (N = 7). This masked rate heterogeneity (I²: 0.54–0.97), possibly explained by socio-environmental factors; our review confirmed (via meta-regression) the typical age-sex interaction in psychosis risk, including a secondary peak of onset in women after 45 years. Rates of most disorders were elevated in several ethnic minority groups compared with the white (British) population. For example, for schizophrenia: black Caribbean (pooled RR: 5.6; 95% CI: 3.4–9.2; N = 5), black African (pooled RR: 4.7; 95% CI: 3.3–6.8; N = 5) and South Asian groups in England (pooled RR: 2.4; 95% CI: 1.3–4.5; N = 3). We found no evidence to support an overall change in the incidence of psychotic disorder over time, though diagnostic shifts (away from schizophrenia) were reported. Limitations: Incidence studies were predominantly cross-sectional, limiting causal inference. Heterogeneity, while evidencing important variation, suggests that pooled estimates require interpretation alongside our descriptive systematic results. Conclusions and Implications of Key Findings: Incidence of psychotic disorders varied markedly by age, sex, place and migration status/ethnicity. Stable incidence over time, together with a robust socio-environmental epidemiology, provides a platform for developing prediction models for health service planning.
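
    The pooling machinery named above (random-effects meta-analysis with I² heterogeneity) is standard. A minimal sketch of DerSimonian-Laird random-effects pooling on log incidence rates, assuming hypothetical study-level rates and standard errors; the values are illustrative, not the review's data:

```python
import numpy as np

def dersimonian_laird(y, se):
    """Random-effects pooling of effect estimates y (e.g., log incidence
    rates) with standard errors se; returns the pooled estimate, its SE,
    the between-study variance tau^2, and the I^2 statistic."""
    y, se = np.asarray(y, dtype=float), np.asarray(se, dtype=float)
    w = 1.0 / se**2                           # fixed-effect weights
    y_fe = (w * y).sum() / w.sum()            # fixed-effect pooled mean
    q = (w * (y - y_fe) ** 2).sum()           # Cochran's Q
    k = len(y)
    c = w.sum() - (w**2).sum() / w.sum()
    tau2 = max(0.0, (q - (k - 1)) / c)        # between-study variance
    w_re = 1.0 / (se**2 + tau2)               # random-effects weights
    pooled = (w_re * y).sum() / w_re.sum()
    se_pooled = np.sqrt(1.0 / w_re.sum())
    i2 = max(0.0, (q - (k - 1)) / q) if q > 0 else 0.0
    return pooled, se_pooled, tau2, i2

# Illustrative: pool hypothetical log incidence rates per 100,000
# person-years from three studies.
log_rates = np.log([28.0, 35.0, 31.0])
ses = [0.10, 0.12, 0.08]
pooled, se_p, tau2, i2 = dersimonian_laird(log_rates, ses)
print(f"pooled = {np.exp(pooled):.1f} per 100,000 "
      f"(95% CI {np.exp(pooled - 1.96*se_p):.1f}-{np.exp(pooled + 1.96*se_p):.1f}), "
      f"I^2 = {100*i2:.0f}%")
```

    Working on the log scale keeps the pooled rate positive and makes the confidence interval multiplicative, which is the usual convention for incidence-rate meta-analysis.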

    Consumer Learning about Established Firms: Evidence from Automobile Insurance

    Abstract: Most research on experience goods embodies the notion that, while direct product experience is required to learn about new goods, information is more complete for established products. This view is supported, at least in part, by three premises: that learning from direct product experience occurs rapidly; that a consumer's preference for a given firm increases with information (so that firms have strong incentives to disseminate information); and that consumer purchase choices react strongly to that information. However, officials in many industries question these views, arguing that limited consumer information affects demand even for well-established products, that learning from direct experience can be quite slow, that consumers are often initially optimistic and then disappointed by their experiences, and that by the time consumers learn they may be too "locked in" to react. Unfortunately, the empirical measurement required to settle these issues is nearly impossible in the standard, non-durable product markets generally studied: if consumers learn each time they purchase a product, it is quite difficult to separate learning from other sources of state dependence in demand. Markets for continuously provided services, such as credit cards, telephony, or insurance, are potentially much better venues for such measurement, because consumers learn about service quality at distinct interactions with firms. Unfortunately, the occurrence of these interactions tends to be either endogenous or unobservable. This paper overcomes these problems by considering automobile insurance, where consumers learn about service quality each time they have a claim, and the occurrence of claims is completely distinct from a consumer's satisfaction with her firm and fully observable from company records. Using a panel of 18,595 consumers from one well-established auto insurance company, the paper estimates a structural model of consumers' departure decisions with an embedded Bayesian learning model. Among the key findings are: patterns of consumer departures by age and claims experience strongly suggest the importance of consumer learning at a longstanding firm; consumers enter the firm optimistic about its quality and are generally disappointed by their experiences; and the impact of learning is greatly mitigated by the slow arrival of claims and the accrual of consumer lock-in over tenure with one firm.
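
    The embedded learning model is not spelled out in the abstract. As a minimal sketch of the kind of Bayesian updating it describes, assuming a normal prior over firm quality updated by noisy signals observed at claims; all names, values, and the departure rule are hypothetical, not the paper's specification:

```python
import numpy as np

def update_belief(prior_mean, prior_var, signals, signal_var):
    """Normal-normal Bayesian updating: a consumer's belief about a
    firm's service quality after observing noisy signals at claims."""
    mean, var = prior_mean, prior_var
    for s in signals:
        # Posterior precision is the sum of prior and signal precisions;
        # the posterior mean is the precision-weighted average.
        post_var = 1.0 / (1.0 / var + 1.0 / signal_var)
        mean = post_var * (mean / var + s / signal_var)
        var = post_var
    return mean, var

# Illustrative: a consumer enters optimistic (prior mean quality 0.8)
# and experiences two disappointing claims (signals 0.4 and 0.5).
mean, var = update_belief(prior_mean=0.8, prior_var=0.04,
                          signals=[0.4, 0.5], signal_var=0.09)
# A simple departure rule: leave if expected quality falls below a
# threshold, net of a switching cost that grows with tenure (lock-in).
threshold, lock_in = 0.6, 0.05
print(f"posterior mean quality = {mean:.2f}")
print("departs" if mean < threshold - lock_in else "stays")
```

    Two of the abstract's findings fall out of this structure: beliefs move only when claims arrive, so learning is slow when claims are rare, and a tenure-dependent switching cost mutes the response even once beliefs have fallen.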