    The Question of Spectrum: Technology, Management, and Regime Change

    There is general agreement that the traditional command-and-control regulation of radio spectrum by the FCC (and NTIA) has failed. There is no general agreement on which regime should succeed it. Property rights advocates take Ronald Coase's advice that spectrum licenses should be sold off and traded in secondary markets, like any other assets. Commons advocates argue that new technologies cannot be accommodated by a licensing regime (either traditional or property rights) and that a commons regime leads to the most efficient means to deliver useful spectrum to the American public. This article reviews the scholarly history of this controversy, outlines the evolution of FCC thinking, and parses the question of property rights vs. commons into four distinct parts: new technology, spectrum uses, spectrum management, and the overarching legal regime. Advocates on both sides find much to agree about on the first three factors; the disagreement is focused on the choice of overarching regime to most efficiently and effectively make spectrum and its applications available to the American public. There are two feasible regime choices: a property rights regime and a mixed licensed/commons regime subject to regulation. The regime choice depends upon four factors: dispute resolution, transactions costs, tragedies of the commons and anticommons, and flexibility in the face of changing technologies and demands. Each regime is described and analyzed against these four factors. With regard to pure transactions costs, commons may hold an advantage but it appears quite small. For all other factors, the property rights regime holds very substantial advantages relative to the mixed regime. I conclude that the choice comes down to markets vs. regulation as the mechanism for allocating resources.

    A National Broadband Plan for Our Future: A Customer-Centric Framework

    Congress has recently charged the Federal Communications Commission to establish a National Broadband Plan. This paper argues that a customer-centric plan, which puts the customer in control of decision-making, will yield the best broadband result for the U.S. The Federal government must establish a market infrastructure that encourages competition, requires transparency of both network providers and application providers, and includes vigorous antitrust enforcement. Competition from wireless broadband is present now and will become far more prevalent shortly, on the basis of current and announced investment plans. Regulators must also make available far more licensed spectrum to ensure this competition is realized. Calls for regulation in the form of mandated unbundling and more unlicensed spectrum are regulatory cul-de-sacs with proven track records of failure. Calls for regulatory control of network provider practices (other than transparency), such as network neutrality, are misguided. Such decisions are best left to customers, who can very well decide for themselves which broadband provider offers the terms that best suit them.

    Mobile Telephony: Economic and Social Impact

    The ubiquitous cell phone is often portrayed as the scourge of civilized society: rude callers on streets, in malls and offices, disturbing those around them with loud talking, school kids constantly texting in class, drivers whose attention has wandered during a cell phone conversation causing accidents, “crackberry” addicts who check their e-mail during real-world conversations; the list goes on. Is this an invention whose result has been to make us all worse off, like Internet spam and phishing attacks? In this paper, I informally survey the rise and impact of cellular technology, both in the US and the world. I find that the reach and the speed of its worldwide diffusion have exceeded even those of the Internet, and certainly far exceeded those of the personal computer. Mobile’s economic and social impact has been unprecedented, especially in the developing world where it has been a boon to economic development. While many in the US focus on expanding the diffusion of the PC both domestically and worldwide, as well as expanding the availability of broadband connectivity, I argue that while PC-broadband architecture will continue to be important, the terminal device of choice for most people on this planet will be the mobile, accessing information services over a wireless connection. Mobile telephony is, I believe, the highest impact communications technology of the last 50 years, rivaled only by the Internet.

    Solving the Interoperability Problem: Are We On the Same Channel? An Essay on the Problems and Prospects for Public Safety Radio

    Symposium: The Crisis in Public Safety Communications. Held at the Mercatus Center at George Mason University, December 8, 2006. A number of disasters over the last two decades have demonstrated the dire consequences that occur when first responders are unable to communicate due to the lack of interoperability of their communications equipment. Each such disaster is followed by a strong reaction from the Federal government, promising immediate action, often with plans to deploy the latest technology. In fact, nothing has ever actually happened at the Federal level to solve first responders' interoperability problem. As I show using a case study from Delaware, states have stepped into the breach and provided fully interoperable systems using technology that is twenty years old. While the Federal government has made much political noise about the problem and its role in fixing it, it is the states that are quietly getting the job done.

    Will Access Regulation Work?

    The Enduring Lessons of the Breakup of AT&T: A Twenty-Five Year Retrospective. Conference held at the University of Pennsylvania Law School on April 18-19, 2008. The FCC is transitioning from a rate regulation regime to an access regime. A rate regulation regime gives all customers full access to network facilities (common carrier) at regulated rates, generally rate-base, rate-of-return regulation. An access regime is one in which all competitors are given full access to incumbents' networks, with little or no retail rate regulation, thereby allowing competition (over incumbents' networks) to discipline the market. Is this a good idea? Is it likely to work? What is the evidence for this?

    Public Policy for a Networked Nation


    File Sharing, Copyright, and the Optimal Production of Music

    Much economic, political, judicial, and legal attention has been showered on the significant changes currently taking place within the music production and distribution business, forced by the use of the Internet for both file sharing (of unauthorized copyrighted material) and more recent online (legal) music distribution. The strong demand for music, coupled with the low cost of distributing illegal copies via peer-to-peer (P2P) systems, is unraveling the business model by which music has traditionally been created, developed, and distributed. Application of traditional copyright law has been ineffective in stopping the loss of business in the traditional channels. Producers have implemented forms of Digital Rights Management (DRM) in an attempt to protect their property via technologically self-enforcing contracts. Past DRM efforts have alienated customers, resulted in defective products, and, in some cases, been laughably easy to defeat by hackers. Producers assert that if the problem isn't solved, music production will be sharply curtailed. The cost of free music via P2P is less music produced and fewer choices, an outcome that all seem to agree is bad. In this Article, I attempt to answer the question whether a reduction in music choice is, in fact, bad. I model the music industry as a Hotelling-Salop differentiated products market and, using results from Bhaskar and To, I show that significant overproduction of music may occur. The worst hypothesized loss from file sharing tends to reduce this overproduction, but does not eliminate it. Applying effective DRM simply returns the market to overproduction. Taking account of potential externalities (using rough preliminary estimates) of creative material suggests that overproduction of music is still the most likely outcome. Further empirical research is needed, but, on the basis of this model, the most likely outcome is that the displacement of CD sales by P2P file sharing actually increases welfare by constraining the overproduction of music that results from its unique market structure. The very tentative policy conclusion is that legitimizing file sharing under the doctrine of fair use is likely to be welfare enhancing.
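    The Salop "circular city" logic behind the overproduction result can be sketched numerically. In the textbook symmetric model, free entry drives the number of firms to n = sqrt(t/f), while minimizing total fixed-plus-mismatch costs calls for only n = (1/2)sqrt(t/f), so free entry yields twice the welfare-optimal variety. The sketch below uses those standard closed forms with invented parameter values; it illustrates the mechanism and is not a reproduction of the article's calibration.

```python
import math

def salop_entry(t, f, c=0.0):
    """Symmetric Salop circular-city model with free entry.

    Consumers are uniform on a circle of unit circumference.
    t: per-unit "transport" (taste-mismatch) cost
    f: fixed cost of offering one product (entering)
    c: marginal cost
    Returns (free-entry firm count, welfare-optimal firm count, price).
    """
    n_free = math.sqrt(t / f)        # zero-profit condition: t/n^2 = f
    n_opt = 0.5 * math.sqrt(t / f)   # minimizes total cost n*f + t/(4n)
    price = c + t / n_free           # equilibrium markup over marginal cost
    return n_free, n_opt, price

# Illustrative numbers only: t = 1, f = 0.01, c = 0.
n_free, n_opt, price = salop_entry(t=1.0, f=0.01)
print(n_free, n_opt, price)  # 10 firms enter; 5 would be optimal; price 0.1
```

    With these numbers the market supports ten products where five would minimize total costs, the "excess entry" that, on the abstract's argument, file sharing partially offsets.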

    Spectrum Management: Property Rights, Markets, and The Commons

    Gerald Faulhaber and David Farber consider alternatives to the current licensing regime for spectrum, which appears to lead to substantial inefficiencies in spectrum allocation. Specifically, they examine two property rights regimes and a commons regime. They note that economists have favored a market-based regime while engineers have favored a commons-based regime to promote new technologies. Mr. Faulhaber and Mr. Farber show that there is a property rights market-based regime that unleashes the power of the market and the power of the new technologies to efficiently allocate spectrum, and that is likely to meet our needs for the near-term future. This regime resolves the presumed dichotomy between the market-based and the commons-based views, so that both objectives can be realized. The authors also outline a transition process for achieving the desired regime outcome that is a "win-win" for all stakeholders, and that could be politically feasible. The change to a property rights regime is likely to lower the cost of spectrum substantially, in many cases to zero. Mr. Faulhaber and Mr. Farber assert that a commons model and a market model can co-exist, at least until spectrum becomes truly scarce.

    Banking Markets: Productivity, Risk, and Customer Satisfaction

    This paper describes a structural model which incorporates bank decisions on productivity, risk-taking, and customer satisfaction into an equilibrium model of banking markets for 219 large U.S. banks between 1984 and 1992. The cost of bank risk is assessed using banks' stock market betas as a broad measure of total risk. In particular, the effect of size on risk-taking is analyzed, as well as the decomposition of risk on a product basis. The Capital Asset Pricing Model is used to measure bank risk, thereby capturing all risk in a measure based on market behavior. Productivity losses due to mismatches between demand and capacity are modeled and estimated. In addition, the effect of customer satisfaction, or quality, on bank profitability is measured. Structural model estimation is used to expand the range of questions that can be empirically addressed. It is also used to explain and directly estimate inefficiencies heretofore captured only by inference in fixed-effects models. In this structural model, banks face long- and short-run cost function choices, they optimally choose capacity levels and risk levels with limited information, and they interact in markets, leading to an asymmetric Cournot-Nash equilibrium. Each bank's profit function is derived from the market equilibrium conditions, and its parameters are directly estimated. The empirical analysis incorporates six bank products and develops a unified approach to thinking about what a bank's products are. The ability of banks to satisfy their customers is examined for the first time in the literature, and it is shown to be a source of profitability. A new customer satisfaction data set is introduced to study this phenomenon. All of the relevant factors are integrated into a single model that estimates all parameters simultaneously. The model has five distinct parts: operating activities, risk, demand, customer satisfaction or quality, and competitive interactions.
The author concludes that banks differ widely in their ability to manage risk. Larger banks take on relatively more risk, with risk cost accounting for 38 percent of bank earnings on average. There also are substantial inefficiencies due to demand/capacity mismatches. On average, banks are over-optimistic by 10 percent in the demand for which they plan, and the cost to them is approximately 2.2 percent of total costs. This is substantially more than can be justified by "optimal overshooting" in the face of planning uncertainty and amounts to over 25 percent of average bank margin. Greater customer satisfaction correlates with greater profitability, principally due to greater demand. The effect of quality on cost and price is minimal. Bank-specific fixed effects are relatively insignificant. The very significant bank-specific effects that previous research discovered appear to have been largely captured and directly estimated in the structural model. The research confirms the results of previous research that there are no significant long-run economies of scale or scope.
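    The market-based risk measure described above rests on the CAPM relation, in which a stock's beta is the slope of its returns on the market's: beta = cov(r_i, r_m) / var(r_m). A minimal sketch of that calculation follows; the return series are invented for illustration, not drawn from the paper's data.

```python
def capm_beta(stock_returns, market_returns):
    """Estimate CAPM beta as cov(stock, market) / var(market)."""
    n = len(stock_returns)
    mean_s = sum(stock_returns) / n
    mean_m = sum(market_returns) / n
    cov = sum((s - mean_s) * (m - mean_m)
              for s, m in zip(stock_returns, market_returns)) / n
    var_m = sum((m - mean_m) ** 2 for m in market_returns) / n
    return cov / var_m

# Invented monthly return series for illustration only.
bank_r = [0.020, -0.010, 0.030, 0.000, 0.015]
market_r = [0.010, -0.005, 0.020, 0.001, 0.012]
print(capm_beta(bank_r, market_r))  # beta > 1: riskier than the market
```

    A beta above one marks the bank as riskier than the market portfolio, which is the sense in which the paper prices risk from market behavior rather than from accounting ratios.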