
    How Do Limitations in Spectrum Fungibility Impact Spectrum Trading?

    Secondary markets for spectrum trading have been considered an important solution for generating spectrum opportunities in an environment where scarcity is the rule. Nonetheless, an important factor when envisioning a successful spectrum trading environment is how comparable an available frequency is to the frequency a spectrum user prefers. With this aim, we consider the fungibility scores previously determined in [1] in order to explore further parameters that can influence this quantification of the level of fungibility. Further, we merge these fungibility calculations with an existing spectrum trading model, SPECTRAD [2], seeking to determine the actual impact of the limitations of spectrum fungibility on market viability.
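    As an illustration only, the sketch below shows how a pairwise fungibility score could gate whether a buyer accepts an offered frequency block in a toy trading loop. The scoring function, threshold, and frequencies are assumptions made for this sketch; they are not the fungibility formulation of [1] or the SPECTRAD model of [2].

```python
# Hypothetical sketch: a fungibility score gating trades in a toy spectrum market.
# The scoring function and threshold are illustrative assumptions, not the
# formulation of [1] or the SPECTRAD model of [2].

def fungibility_score(offered_mhz: float, preferred_mhz: float) -> float:
    """Toy score in [0, 1]: closer centre frequencies are assumed more substitutable."""
    return max(0.0, 1.0 - abs(offered_mhz - preferred_mhz) / preferred_mhz)

def accepted_trades(offers, buyers, threshold=0.8):
    """Match each buyer to the first offer whose score clears the threshold."""
    matches = []
    remaining = list(offers)
    for preferred in buyers:
        for offered in remaining:
            if fungibility_score(offered, preferred) >= threshold:
                matches.append((preferred, offered))
                remaining.remove(offered)
                break
    return matches

# Example: lowering the threshold (treating spectrum as more fungible)
# increases the number of completed trades.
offers = [710.0, 1850.0, 2600.0]
buyers = [700.0, 1900.0, 2300.0]
print(accepted_trades(offers, buyers, threshold=0.95))  # two matches
print(accepted_trades(offers, buyers, threshold=0.80))  # three matches
```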

    Assessment of the Reliability of Reserves Estimates of Public Companies in the U.S. and Canada.

    Estimation of reserves is a process used to quantify the volumes of hydrocarbon fluids that can be recovered economically from a reservoir, field, area or region, from a given date forward. A considerable level of uncertainty is involved throughout the reserves-estimation process. Unfortunately, individuals are poor at assessing uncertainty, with a common tendency for overconfidence (underestimation of uncertainty) and optimism. There are a few studies that address the reliability of reserves estimates, but none of them quantify the reliability of these estimates. This research aims to assess quantitatively the reliability of reserves estimates of public companies filing in the U.S. and Canada. To do this I measured biases in reported reserves estimates for 34 companies filing in Canada and 32 companies filing in the U.S. over the period 2007 to 2017. Canadian companies explicitly report technical revisions of proved (1P) and proved-plus-probable (2P) reserves. U.S. companies do not report “technical revisions,” but instead report “revisions of previous estimates” and revisions due to price changes of proved (1P) reserves separately. I calculated Revisions Other Than Price (ROTP) by subtraction for U.S. companies and assumed the difference was equivalent to “technical revisions.” Based on probabilistic reserves definitions, it is reasonable to assume that proved reserves estimates are expected to have positive technical revisions 90% of the time, while proved-plus-probable reserves estimates are expected to have positive revisions 50% of the time. The reliability of proved and proved-plus-probable reserves estimates was assessed using calibration plots, in which the frequency of positive technical revisions is plotted against the estimate probability. Calibration plots can be used to measure confidence bias, ranging from underconfidence to complete overconfidence, and directional bias, ranging from complete pessimism to complete optimism. “Technical revisions” reported by 34 Canadian companies for the 11-year period were positive an average of 72% of the time for 1P reserves and 54% of the time for 2P reserves, whereas the expected values were 90% and 50%, respectively. Thus, on average over this period, filers in Canada overestimated 1P reserves and underestimated 2P reserves. Considering the entire reserves distributions, bias measurements indicate that filers in Canada were moderately overconfident and slightly pessimistic. Revisions Other Than Price (ROTP) calculated for 32 U.S. companies for the 11-year period were positive an average of only 51% of the time for 1P reserves, compared with an expected 90%. Thus, on average over this period, filers in the U.S. significantly overestimated 1P reserves. Considering the entire reserves distributions, bias measurements indicate that filers in the U.S. showed between moderate and complete overconfidence, with a directional bias between neutral and complete optimism. The biases in reserves estimates filed in both Canada and the U.S. suggest that adjustments in reserves estimation procedures are warranted. Three groups of professionals can benefit from this study: (1) estimators, who can use the methodology to track their technical revisions over time, calibrate them, and use this information to adjust future estimation procedures; (2) investors, who can analyze reported reserves estimates to compare volumes fairly; and (3) regulators, who can ensure that filers are complying with appropriate criteria for 1P and 2P reserves.
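    A minimal sketch of the two calculations described above, using made-up numbers: ROTP obtained by subtracting price-driven revisions from total reported revisions (the U.S.-filer case), and the observed frequency of positive revisions compared against the calibration target (90% for 1P, 50% for 2P). The data series and helper names are illustrative assumptions, not the study's dataset or code.

```python
# Illustrative sketch (made-up numbers): ROTP by subtraction for U.S.-style filings,
# and the observed frequency of positive revisions vs. the calibration target.

def rotp(total_revisions, price_revisions):
    """Revisions Other Than Price = reported revisions minus price-driven revisions."""
    return [t - p for t, p in zip(total_revisions, price_revisions)]

def positive_frequency(revisions):
    """Share of annual revisions that are positive (the y-axis of a calibration plot)."""
    return sum(1 for r in revisions if r > 0) / len(revisions)

# Hypothetical 1P revision series for one filer over several years (arbitrary units).
total_revisions = [12.0, -3.0, 5.0, -8.0, 2.0, 1.0]
price_revisions = [10.0, -6.0, 1.0, -2.0, 3.0, -1.0]

rotp_series = rotp(total_revisions, price_revisions)
observed = positive_frequency(rotp_series)
expected_1p = 0.90  # proved reserves should see positive technical revisions ~90% of the time
print(f"ROTP: {rotp_series}")
print(f"Observed positive-revision frequency: {observed:.0%} (expected {expected_1p:.0%})")
```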

    Wireless Network Virtualization as an Enabler for Spectrum Sharing

    Spectrum Sharing and Wireless Network Virtualization have been explored as methods to achieve spectrum efficiency, increase network capacity and, overall, address the existing spectrum scarcity problems. This work aims at exploring the link between these two topics, specifically by positioning virtualization as a technology that can render spectrum sharing schemes feasible. No complete analysis can be made without taking into account three important axes: technology, policy and economics. In this light, in order to explore how virtualization enables spectrum sharing, flexibility is studied as a common attribute, due to the characteristics it presents regarding the three preceding axes. By determining how spectrum sharing, wireless virtualization and flexibility tie together, ground can be laid toward exploring further opportunities that would enhance spectrum usage, making it possible for this resource to meet today’s ever-increasing demand.

    Radiocarbon reservoir ages and hardwater effect for the northeastern coastal waters of Argentina.

    Accelerator mass spectrometry (AMS) radiocarbon dates were obtained for 18 mollusk shells collected alive along the Buenos Aires province coast, Argentina, over the period AD 1914–1935. Reservoir ages were estimated for all samples on the basis of the tree-ring calibration curve for the Southern Hemisphere (SHCal04, McCormac et al. 2004) and the marine ΔR values calculated as the difference between the conventional 14C age and the age deduced from the marine, mixed-layer model calculation (Marine04, Hughen et al. 2004). For most coastal locations, a large ΔR scatter was observed, ranging from 191 to 2482 yr, which is explained by the input of varying amounts of dissolved carbonate by rivers and groundwater (“hardwater effect”) and indicates a serious limitation for shell-based 14C chronologies. Within the interior of Bahía Blanca estuary, ΔR values ranged from –40 to 50 ± 46 yr as a consequence of the local geological particularities of the environment. This suggests that, with some restrictions, the marine calibration curve with standard parameters (ΔR = 0) could be used at this location.
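    A minimal sketch of the ΔR calculation described above: the offset is the difference between a live-collected shell's conventional 14C age and the marine model (Marine04) age for its known collection year. The model lookup and the numbers below are placeholders for illustration, not values from the study or the actual Marine04 curve.

```python
# Illustrative sketch: reservoir-effect offset ΔR as the difference between a
# sample's conventional 14C age and the marine model age for its collection year.
# The lookup table is a placeholder, not the Marine04 curve itself.

MARINE04_MODEL_AGE = {  # hypothetical (collection year AD -> modelled marine 14C age, yr BP)
    1920: 560,
    1930: 555,
}

def delta_r(conventional_age_bp: int, collection_year: int) -> int:
    """ΔR = conventional 14C age of the live-collected shell minus the marine model age."""
    return conventional_age_bp - MARINE04_MODEL_AGE[collection_year]

# A shell collected alive in AD 1920 with a conventional age of 880 yr BP
# would give ΔR = 880 - 560 = 320 yr for that locality.
print(delta_r(880, 1920))
```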

    Practical games and exercises for the production management and logistics courses in Production Engineering.

    The undergraduate thesis “Juegos y ejercicios para la administración de operaciones” (Games and exercises for operations management), presented in 2004 by the students Tomas Arango and José Ignacio Garcés, compiled a number of game-based activities that facilitate the learning of different topics covered in the Production Engineering program. The new learning methodology requires students to have adequate tools to carry out activities or laboratories on their own with minimal guidance from instructors; for this reason, the existing games must be reviewed and improved. In addition, new games need to be created to complement learning and cover the most relevant topics of each course. Games are experiences that students live through, allowing them to evaluate different scenarios and situations and actually experience the consequences, so that they acquire sufficient judgment for later decision making and internalize concepts (Paul Gee, 2008).

    How can Polycentric Governance work?

    Spectrum policy in the US (and throughout most of the world) consists generally of a set of nationally determined policies that apply uniformly to all localities. However, it is also true that there is considerable variation in the features (e.g., traffic demand or population density), requirements and constraints of spectrum use on a local basis. Global spectrum policies designed to resolve a situation in New York City could well be overly restrictive for communities in rural areas (such as central Wyoming). At the same time, it is necessary to ensure that the more permissive policies of central Wyoming would not create problems for NYC (by ensuring, for example, that relocated radios adapt to local policies). Notions of polycentric governance articulated by the late E. Ostrom [16] argue that greater good can be achieved by allowing for local autonomy in resource allocation. Shared access to spectrum is generally mediated through one of several technologies. As shown in [21], approaches mediated by geolocation databases are the most cost effective with today's technology. In the database-oriented Spectrum Access System, or SAS, proposed by the FCC, users are granted (renewable) usage rights based on their location for a limited period of time. Because this system grants usage rights on a case-by-case basis, it may also allow for greater local autonomy while still maintaining global coordination. For example, it would be technically feasible for the database to include parameters such as transmit power, protocol, and bandwidth. Thus, such databases may provide the platform by which polycentric governance might come to spectrum management. In this paper, we explore, through some case examples, what polycentric governance of spectrum might look like and how this could be implemented in a database-driven spectrum management system. In many ways this paper is a complement to [20], which evaluated emerging SAS architectures using Ostrom's socioeconomic theory. This paper explores how a SAS-based system could be constructed that is consistent with Ostrom's polycentric governance ideas. Our approach is to address spectrum management as an emergent phenomenon rather than a top-down system. This paper will describe the key details of this system and present some initial modeling results in comparison with the traditional global model of spectrum regulation. It will also discuss some of the concerns associated with this approach.
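    As a purely hypothetical sketch of the idea that a geolocation database could attach locally determined parameters to each time-limited, renewable grant, the code below checks a grant request against a per-region policy. The region names, limits, and field names are assumptions for illustration, not the FCC's SAS specification or the paper's model.

```python
# Hypothetical sketch: a geolocation database granting time-limited, renewable usage
# rights while enforcing locally determined policies (a polycentric-governance flavour).
# Region names, limits, and fields are illustrative assumptions, not the FCC SAS rules.

from dataclasses import dataclass

@dataclass
class LocalPolicy:
    max_tx_power_dbm: float
    max_bandwidth_mhz: float
    grant_duration_days: int

# Different localities can set different limits (e.g., dense urban vs. rural).
POLICIES = {
    "nyc":     LocalPolicy(max_tx_power_dbm=23.0, max_bandwidth_mhz=10.0, grant_duration_days=7),
    "wyoming": LocalPolicy(max_tx_power_dbm=36.0, max_bandwidth_mhz=40.0, grant_duration_days=30),
}

def request_grant(region: str, tx_power_dbm: float, bandwidth_mhz: float):
    """Return a grant record if the request fits the local policy, else None."""
    policy = POLICIES[region]
    if tx_power_dbm <= policy.max_tx_power_dbm and bandwidth_mhz <= policy.max_bandwidth_mhz:
        return {"region": region, "tx_power_dbm": tx_power_dbm,
                "bandwidth_mhz": bandwidth_mhz, "expires_in_days": policy.grant_duration_days}
    return None

# The same request can be denied in a congested locality but granted in a rural one.
print(request_grant("nyc", 30.0, 20.0))      # None: exceeds the local limits
print(request_grant("wyoming", 30.0, 20.0))  # granted under the more permissive policy
```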

    How can polycentric governance of spectrum work?

    Spectrum policy in the US (and throughout most of the world) consists generally of a set of nationally determined policies that apply uniformly to all localities. However, it is also true that there is considerable variation in the features (e.g., traffic demand or population density), requirements and constraints of spectrum use on a local basis. Global spectrum policies designed to resolve a situation in New York City could well be overly restrictive for communities in central Wyoming. At the same time, it is necessary to ensure that the more permissive policies of central Wyoming would not create problems for NYC (by ensuring, for example, that relocated radios adapt to local policies). Notions of polycentric governance articulated by the late E. Ostrom [17] argue that greater good can be achieved by allowing for local autonomy in resource allocation. Shared access to spectrum is generally mediated through one of several technologies. As Weiss, Altamimi and Liu [22] show, approaches mediated by geolocation databases are the most cost effective with today’s technology. In the database-oriented Spectrum Access System, or SAS, proposed by the FCC, users are granted (renewable) usage rights based on their location for a limited period of time. Because this system grants usage rights on a case-by-case basis, it may also allow for greater local autonomy while still maintaining global coordination. For example, it would be technically feasible for the database to include parameters such as transmit power, protocol, and bandwidth. Thus, such databases may provide the platform by which polycentric governance might come to spectrum management. In this paper, we explore, through some case examples, what polycentric governance of spectrum might look like and how this could be implemented in a database-driven spectrum management system. The approach proposed in this paper treats spectrum management as an emergent phenomenon rather than a top-down system. This paper will describe the key details of this system and present some initial modelling results in comparison with the traditional global model of spectrum regulation. It will also address some of the concerns associated with this approach.

    Secondary spectrum markets: from "naked" spectrum to virtualized commodities

    The creation of secondary spectrum markets emerged as a means to enable flexible spectrum-use mechanisms and abandon a rigid spectrum allocation and assignment approach, which has resulted in severe inefficiencies in the use of this resource. At the core of the deployment of spectrum markets lie the definition of electromagnetic spectrum as a tradable commodity, the reallocation of spectrum rights, the creation of incentives for resource owners to lease or transfer their spectrum holdings and the appropriate regulatory framework to support and enforce market transactions. It follows that the viability of spectrum markets depends on technical, economic and regulatory frameworks to render this approach a meaningful alternative for spectrum allocation and assignment. In this research work, we explore the conditions associated with spectrum market viability. For this purpose, we use agent-based modeling to study markets under different commodity definitions as well as network configurations. These configurations are gathered in three research stages, which start with the analysis of markets as stand-alone institutions where electromagnetic frequencies, without any associated infrastructure (i.e., “naked” spectrum), are traded. This allows us to explore the degree to which the limitations in spectrum fungibility impact the trading process and outcome. In the second stage, we focus on refining the tradable commodity in a way that circumvents the physical limitations of spectrum. To this end, we rely on technologies such as LTE-Advanced and virtualization in order to define a fungible, virtualized spectrum commodity and explore the benefits that this provides for market deployment. The final stage aims at extending the range of applicability of virtualized commodities and providing opportunities that could address current spectrum service and connectivity requirements. Hence, we explore markets as part of more complex network arrangements, where we rely on middleman theory, matching markets and simple auctions in order to enable resource trading. This requires the analysis of multiple factors that impact market design, from the definition of tradable commodities to the characterization of the role and objectives of market participants. These factors stem from relevant technical, economic and regulatory frameworks, which we explore to determine whether our spectrum markets proposal can be considered a viable and applicable solution.

    Trading Wireless Capacity Through Spectrum Virtualization Using LTE-A

    Markets for spectrum were first proposed by Ronald Coase as a way to efficiently allocate this resource. It took another forty years for primary markets to be developed (in the form of spectrum auctions) as the mechanism for assigning spectrum licenses to users. It is no secret that secondary markets would be necessary to fully realize the benefits of economic allocation of spectrum. But this is easier said than done: spectrum is a complex, multi-dimensional product with relatively few buyers and sellers (at least for commercial mobile services), so liquid secondary markets have not emerged, even though spectrum trading through brokers is commonplace.

    In this paper, we find that liquidity for spectrum markets can be improved over “naked” spectrum markets when a standardized commodity that uses the principles of spectrum virtualization can be traded. We utilize the Physical Resource Blocks (PRBs) of LTE-Advanced as the traded commodity and modify the SPECTRAD model developed in [5] accordingly. Though much remains to be done, we find that this is a promising approach to finally realizing liquid secondary markets in radio spectrum.
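    A toy sketch of the idea of trading a standardized unit, LTE-Advanced Physical Resource Blocks, rather than raw frequency ranges. The PRB width (180 kHz) is standard LTE numerology, but the guard-band-free PRB count and the simple order-matching loop below are illustrative assumptions, not the SPECTRAD model of [5].

```python
# Toy sketch: trading standardized LTE-A Physical Resource Blocks (PRBs) rather than
# "naked" frequency ranges. One PRB spans 180 kHz in LTE numerology; the
# order-matching loop is an illustrative assumption, not the SPECTRAD model of [5].

PRB_BANDWIDTH_KHZ = 180  # one LTE Physical Resource Block spans 180 kHz in frequency

def prbs_available(channel_bandwidth_khz: float) -> int:
    """Whole PRBs a seller could carve out of a channel (guard bands ignored here)."""
    return int(channel_bandwidth_khz // PRB_BANDWIDTH_KHZ)

def match_orders(asks, bids):
    """Match cheapest asks to highest bids; identical PRB units make offers interchangeable."""
    asks = sorted(asks)               # list of (price per PRB, quantity)
    bids = sorted(bids, reverse=True)
    trades = []
    for bid_price, bid_qty in bids:
        for i, (ask_price, ask_qty) in enumerate(asks):
            if ask_price <= bid_price and ask_qty > 0:
                qty = min(bid_qty, ask_qty)
                trades.append((qty, ask_price))
                asks[i] = (ask_price, ask_qty - qty)
                bid_qty -= qty
                if bid_qty == 0:
                    break
    return trades

# Ignoring guard bands, a 10 MHz channel yields 55 whole PRBs; because PRBs are
# interchangeable, any seller's units can satisfy any buyer's demand.
print(prbs_available(10_000))
print(match_orders(asks=[(1.0, 30), (1.5, 40)], bids=[(2.0, 50), (0.8, 20)]))
```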

    The lion’s share. An experimental analysis of polygamy in Northern Nigeria

    We use simple public goods games to investigate spousal behavior in Kano, northern Nigeria, one of the modern heartlands of polygyny. Most partners keep back at least half of their endowment from the common pool, but we find no evidence that polygynous households are less efficient than their monogamous counterparts. When men control the allocation, equal treatment of wives is common, but senior wives often receive more from their husbands, regardless of their contribution. However, the clearest result is that when men control the allocation, polygynous husbands receive a higher payoff than both their wives and their monogamous counterparts.