Spectrum availability assessment tool for TV white space
The growth of wireless communication relies on the availability of radio frequency spectrum for new services. More efficient spectrum allocation is required to serve the increasing data demand per user. The major regulatory bodies are formulating new spectrum management techniques to counter the growing spectrum scarcity. Exclusive use of spectrum has proved inefficient in many spectrum occupancy measurement campaigns. As a result, spectrum sharing methods are being considered.
TV broadcasting does not use its allocated frequencies in some geographic areas, creating coverage holes known as TV white spaces (TVWS). Both industry and regulators are investigating TVWS as a potential source of spectrum for emerging wireless services. In the US, the FCC has already released the requirements for opportunistic access to the TV white spaces. In a similar fashion, the ECC, the pan-European regulator, is finalizing work on the technical and operational requirements for the possible use of cognitive radio in this spectrum.
In this thesis work, an integrated web-based spectrum availability assessment tool is developed for Finland. The tool is a front-end visualization of a time-intensive computational process that answers a key technical question related to TVWS: what secondary data rate can be supported in the available white space spectrum? The assessment involves estimating the available TVWS and its capacity for cellular-type secondary systems. The relative effects of the secondary system parameters on the TV system are compared using appropriate signal-to-interference-plus-noise ratio (SINR) plots. The tool uses dynamic web technologies for a seamless and user-friendly visualization of the assessment.
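The kind of capacity estimate such a tool computes can be sketched with the Shannon bound: the secondary data rate over the free channels is at most B·log2(1 + SINR). A minimal illustration, with entirely hypothetical numbers (three free 8-MHz DVB-T channels and an assumed 15 dB SINR, not figures from the thesis):

```python
import math

def shannon_capacity_mbps(bandwidth_hz: float, sinr_db: float) -> float:
    """Shannon capacity C = B * log2(1 + SINR), returned in Mbit/s."""
    sinr_linear = 10 ** (sinr_db / 10)
    return bandwidth_hz * math.log2(1 + sinr_linear) / 1e6

# Hypothetical example: 3 free 8-MHz DVB-T channels, assumed 15 dB SINR
# at the secondary receiver (illustrative values, not from the thesis).
free_channels = 3
channel_bw_hz = 8e6          # European DVB-T channel raster
sinr_db = 15.0
rate = free_channels * shannon_capacity_mbps(channel_bw_hz, sinr_db)
print(f"Upper-bound secondary data rate: {rate:.1f} Mbit/s")
```

A real assessment would replace the fixed SINR with per-location values derived from TV transmitter data and the secondary system's interference model.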
The Wireless Craze, The Unlimited Bandwidth Myth, The Spectrum Auction Faux Pas, and the Punchline to Ronald Coase's 'Big Joke': An Essay on Airwave Allocation Policy
In 1959 the Federal Communications Commission invited economist Ronald Coase to testify about his proposal for market allocation of radio spectrum rights. The FCC's first question: 'Is this all a big joke?' Today, however, leading policy makers, including the current FCC Chair, decry the 'spectrum drought' produced by administrative allocation and call for the creation of private bandwidth markets. This essay examines the marketplace trends driving regulators' change of humor, and considers the path of spectrum policy liberalization in light of emerging technologies, theories of unlimited bandwidth, reforms such as FCC license auctions, and recent progress in deregulating wireless markets in the U.S. and around the globe.
Federal Register
Daily publication of the U.S. Office of the Federal Register contains rules and regulations, proposed legislation and rule changes, and other notices, including "Presidential proclamations and Executive Orders, Federal agency documents having general applicability and legal effect, documents required to be published by act of Congress, and other Federal agency documents of public interest" (p. ii). The Table of Contents starts on page iii.
A comparative investigation on performance and the preferred methodology for spectrum management: geo-location spectrum database or spectrum sensing
A Research Report submitted to the Faculty of Engineering and the Built Environment, University of the Witwatersrand, in partial fulfilment of the requirements for the degree of Master of Science in Engineering. Johannesburg, 2015. Due to the enormous demand for multimedia services, which relies hugely on the availability of spectrum, service providers and technologists are devising methods able to fully satisfy these growing demands. The availability of spectrum to meet these demands has been a lingering issue for the past several years. Many would tag it as spectrum scarcity, but the main problem is not how scarce the spectrum is but how efficiently it is allocated and used. Once such inefficiency is tackled effectively, we are a step closer to meeting the enormous demand for uninterrupted services. To do so, techniques and methodologies are being developed to aid the efficient management of spectrum.
In this research project, two methodologies were considered, along with their efficiency in spectrum management. The Geo-location Spectrum Database (GLSD), the most widely adopted technique, and cognitive radio spectrum sensing are the techniques currently in place. The TV white spaces (TVWS) were explored using both techniques, and comparisons based on performance, implementation, practicality, cost, and flexibility were used as evaluation parameters in arriving at a conclusion.
After assessing both methodologies, conclusions were drawn on the preferred methodology and how its use would efficiently solve the issues encountered in spectrum management.
Cooperative Cognitive Wireless Networks over TV White and Grey Spaces
Wireless networks have rapidly become a fundamental pillar of everyday activities. Whether at work or elsewhere, people often benefit from always-on connections. This trend is likely to continue, and current technologies struggle to cope with the increase in traffic demand. To this end, cognitive wireless networks have been studied. These networks aim at better utilization of the spectrum by understanding the environment in which they operate and adapting accordingly. In particular, national regulators have recently opened consultations on the opportunistic use of the TV bands, which became partially free due to the digital TV switch-over. In this work, we focus on the indoor use of TVWS. Interesting use cases such as smart metering and Wi-Fi-like connectivity arise, and are studied and compared against state-of-the-art technology. New measurements for TVWS networks are presented and evaluated, and fundamental characteristics of the signal are derived. Then, building on that, a new model of spectrum sharing, which also takes into account the height above the terrain, is presented and evaluated in a real scenario. The principal limits and performance of TVWS-operated networks are studied for two main use cases, namely machine-to-machine communication and wireless sensor networks, particularly in the smart grid scenario.
The outcome is that TVWS are certainly worth studying and deploying, in particular when used as additional offload capacity for other wireless technologies. Relying on TVWS as the only wireless technology on a device is harder to envision: the uncertainty in channel availability is the major drawback of opportunistic networks, since, depending on the primary network's channel allocation, there may be no channels available for communication. TVWS can be effectively exploited as an offloading solution, and most of the contributions presented in this work proceed in this direction.
An Assessment of Path Loss Tools and Practical Testing of Television White Space Frequencies for Rural Broadband Deployments
Broadband Internet has grown to become a major part of our daily routines. As this growth continues, those without direct access will not be afforded the same opportunities that come with it. The need for ubiquitous broadband Internet coverage is clear. Rural environments risk falling behind, as their low population densities make wired broadband solutions cost-prohibitive. Wireless technologies are often the only option for many of these areas; WiFi, cellular, and WiMAX networks are currently used around the world, but with the opening of the unused broadcast television frequencies, deemed TV White Space (TVWS), a new option is hitting the market. This new technology needs to be assessed before it can be seen as a viable solution.
The contribution of this work is two-fold. First, findings from a real, ongoing trial of commercially available TVWS radios in the area surrounding the University of New Hampshire campus are presented. The trial shows that although the radios can provide Internet access at distances of at least 12.5 km, certain terrain and foliage characteristics of the path can create coverage holes in that region. The second contribution explores the use of empirical path loss models to predict the path loss and compares the predictions to actual path loss measurements from the TVWS network setup. The Stanford University Interim (SUI) model and a modified version of the Okumura-Hata model provide the lowest root mean squared error (RMSE) for the setup. Additionally, the deterministic Longley-Rice model was explored with the Radio Mobile prediction software. It was determined that without extensively tuning the foliage component of the algorithm, the model could produce significant prediction errors, resulting in a trade-off between low-cost, un-tuned predictions and prediction accuracy.
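The model-versus-measurement comparison described above can be sketched with the standard Okumura-Hata median path loss formula and an RMSE figure of merit. The antenna heights, frequency, and measurement points below are illustrative assumptions, not the UNH trial data:

```python
import math

def hata_path_loss_db(f_mhz: float, d_km: float,
                      h_base_m: float = 30.0, h_mobile_m: float = 1.5) -> float:
    """Median path loss (dB), Okumura-Hata model, small/medium city variant.
    Nominally valid for 150-1500 MHz, so it covers the UHF TV band."""
    a_hm = ((1.1 * math.log10(f_mhz) - 0.7) * h_mobile_m
            - (1.56 * math.log10(f_mhz) - 0.8))   # mobile antenna correction
    return (69.55 + 26.16 * math.log10(f_mhz) - 13.82 * math.log10(h_base_m)
            - a_hm + (44.9 - 6.55 * math.log10(h_base_m)) * math.log10(d_km))

def rmse(predicted, measured):
    """Root mean squared error between model predictions and measurements."""
    return math.sqrt(sum((p - m) ** 2 for p, m in zip(predicted, measured))
                     / len(measured))

# Hypothetical measurement points (distance in km, measured loss in dB).
points = [(1.0, 118.0), (3.0, 135.0), (7.0, 148.0), (12.5, 158.0)]
pred = [hata_path_loss_db(600.0, d) for d, _ in points]   # assumed 600 MHz carrier
print("RMSE vs. measurements: %.1f dB" % rmse(pred, [m for _, m in points]))
```

The thesis's "modified" Okumura-Hata and SUI models would replace `hata_path_loss_db` here; the RMSE machinery stays the same.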
Adaptive Coded Modulation Classification and Spectrum Sensing for Cognitive Radio Systems. Adaptive Coded Modulation Techniques for Cognitive Radio Using Kalman Filter and Interacting Multiple Model Methods
The current and future trends of modern wireless communication systems place heavy demands on fast data transmission in order to satisfy end users' requirements anytime, anywhere. Such demands are obvious in recent applications such as smartphones, long term evolution (LTE), fourth and fifth generation (4G & 5G) networks, and worldwide interoperability for microwave access (WiMAX) platforms, where robust coding and modulation are essential, especially for streaming online video, social media, and gaming. This has resulted in extreme strain on the frequency spectrum, a scarce natural resource, due to stagnation in current spectrum management policies. Since its advent in the late 1990s, cognitive radio (CR) has been conceived as an enabling technology aiming at the efficient utilisation of the frequency spectrum, which can lead to dynamic spectrum access (DSA) management. This is mainly attributed to its internal capabilities, inherited from the concept of software defined radio (SDR), to sniff its surroundings, learn, and adapt its operational parameters accordingly. CR systems (CRs) commonly comprise one or all of the following core engines that characterise their architectures: adaptive coded modulation (ACM), automatic modulation classification (AMC), and spectrum sensing (SS).
Motivated by the above challenges, this programme of research is primarily aimed at the design and development of new paradigms to help improve the adaptability of CRs and thereby achieve the desired signal processing tasks at the physical layer of the above core engines. Approximate modelling of Rayleigh and finite state Markov channels (FSMC) with a new concept borrowed from econometric studies has been undertaken. Insightful channel estimation using a Kalman filter (KF) augmented with an interacting multiple model (IMM) has then been examined for the purpose of robust adaptability, applied for the first time in wireless communication systems. This new IMM-KF combination has been employed in the feedback channel between wireless transmitter and receiver to adjust the transmitted power, using a water-filling (WF) technique, as well as the constellation pattern and rate in the ACM algorithm. The AMC has also benefited from this IMM-KF integration, boosting performance against conventional parametric estimation methods such as the maximum likelihood estimate (MLE) for channel interrogation, with the estimated parameters of both inserted into the ML classification algorithm. Expectation-maximisation (EM) has been applied to estimate unknown transmitted modulation sequences and channel parameters in tandem. Finally, the non-parametric multitaper method (MTM) has been thoroughly examined for spectrum estimation (SE) and SS, relying on the Neyman-Pearson (NP) detection principle for the hypothesis test, to allow licensed primary users (PUs) to coexist with opportunistic unlicensed secondary users (SUs) in the same frequency bands of interest without harmful effects. The performance of the newly suggested paradigms has been simulated and assessed under various transmission settings, revealing substantial improvements.
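The Neyman-Pearson sensing step mentioned above fixes a false-alarm probability and derives the detection threshold from it. A minimal energy-detector sketch under a Gaussian approximation (the thesis uses the multitaper method; the sample count, noise power, and the crude DC-offset "PU signal" below are all illustrative assumptions):

```python
import math
import random
from statistics import NormalDist

def np_threshold(noise_var: float, n_samples: int, p_fa: float) -> float:
    """Neyman-Pearson threshold for an energy detector.
    Under H0 (noise only) the statistic T = (1/N) * sum(x^2) is approximately
    Gaussian for large N: mean sigma^2, variance 2*sigma^4/N."""
    q_inv = NormalDist().inv_cdf(1.0 - p_fa)   # inverse Q-function
    return noise_var * (1.0 + q_inv * math.sqrt(2.0 / n_samples))

def energy_statistic(samples) -> float:
    return sum(x * x for x in samples) / len(samples)

# Hypothetical sensing round: N = 4000 samples, unit noise power, Pfa = 1%.
random.seed(1)
n, sigma2 = 4000, 1.0
thr = np_threshold(sigma2, n, p_fa=0.01)

noise_only = [random.gauss(0.0, math.sqrt(sigma2)) for _ in range(n)]
pu_present = [x + 0.5 for x in noise_only]   # toy PU signal: DC offset, ~ -6 dB SNR
print("decide PU present (H0 data):", energy_statistic(noise_only) > thr)
print("decide PU present (H1 data):", energy_statistic(pu_present) > thr)
```

Lowering `p_fa` raises the threshold, trading missed detections of the PU for fewer spurious channel evacuations by the SU.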
An analysis of regulatory frameworks for wireless communications, societal concerns and risk: the case of radio frequency (RF) allocation and licensing.
This thesis analyses how and why culture and geography influence the allocation and licensing of the radio frequency (RF) spectrum in different nations. Based on a broad study of 235 countries, an inter-disciplinary approach is used to explore regulatory frameworks and attitudes toward risk. In addition, detailed case studies of the UK, France, the US and Ecuador provide deeper insights into the main contrasting regulatory styles.
Three alternative sociological theories are used to analyse and explain the results of both the in-depth and broad-brush studies. The Cultural Theory of Mary Douglas and co-workers is first used to categorise countries in terms of perceptual filters. The empirical findings indicate that some countries are apparently exceptional in their behaviour. The theory of Bounded Rationality is used to investigate and explain these apparent irrationalities. Finally, Rational Field Theory shows how beliefs and values guide administrations in their RF regulation.
A number of key factors are found to dominate, and patterns emerge. The European RF harmonisation is unique. Following European unification, wireless regulation is divided into two major camps (the EU and the US), which differ in their risk concerns, approach to top-down mandated standards, allocation of RF spectrum to licence-exempt bands, and type approval process. The adoption of cellular and TV standards around the world reflects geopolitical and colonial influence. The language of a country is a significant indicator of its analogue TV standard. Interestingly, the longitude of a country to a fair extent defines RF allocation: Africa and West Asia follow Europe, whereas the Americas approximate the US. RF regulation and risk tolerability differ between tropical and non-tropical climates. The collectivised/centralised versus the individualised/market-based rationalities result in different regulatory frameworks and contrasting societal and risk concerns. The success of the top-down European GSM and the bottom-up Wi-Fi standards reveals how the central-planning and market-based approaches have thrived. Attitudes to RF human hazards and spurious emission levels reveal that the US, Canada and Japan are more tolerant of these risks than Europe. Australia, Canada, New Zealand, the UK and the USA encourage technological innovation.
A practical benefit of this study is that it will give regulators more freedom to choose a rational RF licensing protocol, through a better understanding of the possibly self-imposed cultural and geographical boundaries currently shaping allocation. Academically, there is utility in undertaking a cultural and geographic analysis of a topic that is mostly the domain of engineering, economic and legal analysts.