
    International Accounting Rate Reform in Telecommunications

    Twenty European countries came together in 1865 to form the predecessor of the International Telecommunication Union and to agree upon a methodology for distributing the revenues from the international telegraph service. The current accounting rate system is a modified version of the methodology developed then for the international telegraph. This international settlement regime based on accounting rates has long been under attack by economists, policy-makers in developed countries and international trade organisations. The ITU, the OECD, the FCC and other regulatory bodies are pursuing various initiatives to reform or replace the existing accounting rate system, aimed at reducing the pricing distortions embedded in it. In the wake of the WTO agreement, a system of traffic compensation that is not ‘cost oriented’ is not only unsustainable, it also violates the regulatory principles set out in the WTO reference paper. The FCC has been at the forefront of the move to decrease accounting rates. In August 1997, the FCC adopted “benchmark” accounting rates for different groups of countries, which it considered more closely related to the actual costs of providing international service between those countries and the US. The benchmark rates range from $0.15 to $0.23 per minute and are far below those currently in practice, which, particularly for most developing countries, are sometimes in excess of $1.00 per minute. If implemented, these rates would significantly reduce the international calling revenues of these countries. While the FCC obviously has no direct regulatory jurisdiction outside the US, it has threatened to deny access to the US market to PTOs from other countries that do not reduce their accounting rates to the benchmark levels. While the future of the existing accounting rate system is being debated in regulatory circles, an increasing proportion of international traffic is bypassing this traditional system of compensation. Facilitated by the global trend towards the liberalisation of telecommunications markets, new technological means of bypassing the accounting rate system are also developing rapidly.
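    As a rough illustration of the revenue effect described above (the traffic volume below is hypothetical, and the per-minute rates are assumed, for simplicity, to apply directly to terminated minutes), a short sketch in Python:

        # Hypothetical back-of-the-envelope calculation: how far inbound settlement
        # revenue would fall if the per-minute rate charged to US carriers dropped
        # from above $1.00 to the FCC benchmark range of $0.15-$0.23.

        inbound_minutes = 100_000_000          # assumed annual minutes terminated

        def revenue(rate_per_minute: float) -> float:
            return rate_per_minute * inbound_minutes

        current        = revenue(1.00)         # lower bound of "in excess of $1.00"
        benchmark_low  = revenue(0.15)
        benchmark_high = revenue(0.23)

        print(f"at $1.00/min : ${current:,.0f}")
        print(f"at benchmark : ${benchmark_low:,.0f} - ${benchmark_high:,.0f}")
        print(f"reduction    : at least {1 - benchmark_high / current:.0%}")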

    Benchmarking of project planning and success in selected industries

    Purpose - To identify the industry in which projects are best planned and executed and to use it as a benchmark for improving project planning in other industries. Design/methodology/approach - Based on data collected from 280 project managers, project success and quality of project planning were evaluated and analyzed for four industries - construction and engineering, software and communications, services, and production and maintenance. Findings - Quality of project planning was found to be the highest in construction and engineering organizations and the lowest in manufacturing organizations. This is the result of a few factors, among them the intensive organizational support offered to project managers working in construction and engineering organizations. The other three industries limit their support mostly to tactical aspects, such as the purchase of project management software. The high quality of project planning in the construction and engineering organizations enabled them to complete projects with roughly half the cost and schedule overruns of organizations in the other industries. Finally, results from the industries in Israel and Japan are compared and analyzed. Research limitations/implications - Findings are limited to the four industries included in the study. Practical implications - If organizations outside the construction industry wish to improve the probability of success in project planning and execution, they should follow methodologies commonly used in the construction industry. Originality/value - This paper introduces a field study exploring project management practices in four industries and identifies the one which may be used as a benchmark for the others. It also identifies specific strengths and weaknesses in project management within the explored industries.

    In-field entanglement distribution over a 96 km-long submarine optical fibre

    Techniques for the distribution of quantum-secured cryptographic keys have reached a level of maturity allowing them to be implemented in all kinds of environments, away from any form of laboratory infrastructure. Here, we detail the distribution of entanglement between Malta and Sicily over a 96 km-long submarine telecommunications optical fibre cable. We used this standard telecommunications fibre as a quantum channel to distribute polarisation-entangled photons and were able to observe around 257 photon pairs per second, with a polarisation visibility above 90%. Our experiment demonstrates the feasibility of using deployed submarine telecommunications optical fibres as long-distance quantum channels for polarisation-entangled photons. This opens up a plethora of possibilities for future experiments and technological applications using existing infrastructure. Comment: 6 pages, 4 figures.
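    For context on the figure of merit quoted above, polarisation visibility is commonly computed from the maximum and minimum coincidence rates observed as one analyser is rotated. The sketch below uses invented counts, not data from this experiment, purely to show the definition:

        # Illustrative only: the coincidence rates are invented, not taken from the
        # Malta-Sicily link. Standard definition of polarisation visibility:
        #   V = (C_max - C_min) / (C_max + C_min)

        def visibility(c_max: float, c_min: float) -> float:
            return (c_max - c_min) / (c_max + c_min)

        c_max, c_min = 250.0, 10.0             # hypothetical coincidences per second

        v = visibility(c_max, c_min)
        print(f"visibility = {v:.3f}")         # 0.923
        print(f"above 90%?   {v > 0.90}")      # True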

    Issues surrounding cyber-safety for Indigenous Australians

    This inquiry examined issues surrounding cyber-safety for Indigenous Australians, particularly young people in remote and rural communities. Introduction to the inquiry: On 20 March 2013 the Committee adopted an inquiry into the issues surrounding cyber-safety for Indigenous Australians. This inquiry followed the Committee’s previous inquiries into Cyber-Safety and the Young and Cybersafety for Senior Australians. Following completion of those inquiries, the Committee believed that issues surrounding cyber-safety for Indigenous Australians warranted further, more in-depth investigation. Therefore, under paragraph (1)(b) of its Resolution of Appointment, the Committee adopted the inquiry which is the subject of this report. As a Select Committee, under paragraph (17) of the Resolution of Appointment, the Committee must present its final report to Parliament no later than 27 June 2013. The terms of reference, which can be found at the start of this report, are far-reaching and could not be addressed in any depth in the available timeframe. The Committee therefore resolved to use the available time to investigate, to the extent possible, what particular issues Indigenous people might be facing with cyber-safety. This brief report discusses those issues and finds that a longer, more in-depth investigation of the topic by a Committee in the 44th Parliament would be appropriate.

    An M-QAM Signal Modulation Recognition Algorithm in AWGN Channel

    Computing distinctive features from the input data before classification is part of the complexity of Automatic Modulation Classification (AMC) methods, which treat modulation classification as a pattern recognition problem. Although algorithms that focus on Multi-Level Quadrature Amplitude Modulation (M-QAM) under different channel scenarios are well documented, a search of the literature indicates that few studies have addressed the classification of high-order M-QAM modulation schemes such as 128-QAM, 256-QAM, 512-QAM and 1024-QAM. This work investigates the capability of natural logarithmic properties and the possibility of extracting Higher-Order Cumulant (HOC) features from the raw received data. The HOC features were extracted under an Additive White Gaussian Noise (AWGN) channel, and four effective parameters were defined to distinguish the modulation types in the set 4-QAM to 1024-QAM. This approach makes the recognizer more intelligent and improves the classification success rate. Simulation results, obtained under statistical models of noisy channels, show that the algorithm recognizes M-QAM signals; most results are promising and show that the logarithmic classifier works well over both AWGN and different fading channels, and that it achieves a reliable recognition rate even at low signal-to-noise ratios (below zero dB). It can therefore be considered an integrated Automatic Modulation Classification (AMC) system for identifying high-order M-QAM signals that applies a unique logarithmic classifier, offering greater versatility and superior performance compared with previous work on automatic modulation identification systems. Comment: 18 pages.
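    As background on the kind of features the abstract refers to, the sketch below estimates two fourth-order cumulants from raw complex baseband samples and takes the natural log of their magnitudes. The moment/cumulant relations are the standard ones for complex signals; the test signal, normalisation and feature names are illustrative assumptions, not the paper's exact classifier:

        # Minimal sketch (not the paper's exact algorithm): estimate the fourth-order
        # cumulants C40 and C42 from received samples and use the natural log of
        # their magnitudes as candidate features for M-QAM recognition.
        import numpy as np

        def moment(x: np.ndarray, p: int, q: int) -> complex:
            """Sample mixed moment M_pq = E[x^(p-q) * conj(x)^q]."""
            return np.mean(x ** (p - q) * np.conj(x) ** q)

        def cumulant_features(x: np.ndarray) -> dict:
            x = x / np.sqrt(np.mean(np.abs(x) ** 2))        # normalise to unit power
            m20, m21 = moment(x, 2, 0), moment(x, 2, 1)
            m40, m42 = moment(x, 4, 0), moment(x, 4, 2)
            c40 = m40 - 3 * m20 ** 2                        # C40 = M40 - 3*M20^2
            c42 = m42 - np.abs(m20) ** 2 - 2 * m21 ** 2     # C42 = M42 - |M20|^2 - 2*M21^2
            return {"ln|C40|": np.log(np.abs(c40)), "ln|C42|": np.log(np.abs(c42))}

        # Hypothetical test: 16-QAM symbols in AWGN at 10 dB SNR
        rng = np.random.default_rng(0)
        levels = np.array([-3, -1, 1, 3])
        symbols = rng.choice(levels, 10_000) + 1j * rng.choice(levels, 10_000)
        noise = (rng.standard_normal(10_000) + 1j * rng.standard_normal(10_000)) / np.sqrt(2)
        received = symbols / np.sqrt(np.mean(np.abs(symbols) ** 2)) + noise * 10 ** (-10 / 20)
        print(cumulant_features(received))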

    Baltimore

    A list of various transdeletions using the word Baltimore

    Technological Change, Financial Innovation, and Financial Regulation: The Challenges for Public Policy

    The two technologies that form the heart of the financial services industry—data processing and telecommunications—have experienced rapid improvement and innovation in the United States in the past few decades. In the heavily regulated financial services industry, technological innovation and improvement may pose significant problems and challenges, both for the industry itself and for government regulators and public policy makers. In this paper, the author provides an overview of the interactions between financial innovation and regulation. The author first makes a distinction among types of financial services firms that is essential to an understanding of financial services regulation. Institutions such as banks and insurance companies that hold financial assets and issue liabilities are known as financial intermediaries. A company that extends trade credit to its customers acts as a lender and is therefore a financial intermediary. The second category of financial services firms comprises firms like stockbrokers and investment bankers who facilitate financial transactions between primary issuers of financial liabilities and the investors who purchase these instruments. These firms are known as financial facilitators. Although there are firms that act both as intermediaries and facilitators, the distinction is an important one in understanding the interaction between technological innovation and financial regulation in the U.S. The author next turns to an analysis of the four major underlying causes of the recent technological changes in financial services. First, data processing and telecommunications have become both more powerful and less expensive, allowing improved data collection, better risk assessment and wider geographical reach for products. Second, less restrictive and protectionist laws and regulations have paved the way for greater competition and allowed outside innovators to enter the financial services market. Third, the shift from a relatively stable to a risky economy beginning in the 1970s created a demand for futures and options that would protect investors from risk. Finally, as a reaction to a strict regulatory environment, financial institutions developed innovative ways to circumvent cumbersome regulations; one of these developments, for example, was the money market mutual fund. Recent easing of restrictions has also encouraged financial innovation. The author then turns to a detailed discussion of financial regulation, explaining the distinctions among the three major categories of: 1) economic regulation; 2) health-safety-environment regulation; and 3) information regulation. He then covers the specifics of regulations affecting: 1) banking; 2) securities and related instruments; 3) insurance; 4) pension funds; 5) mortgage conduits; and 6) finance companies and leasing companies. He then reaches the following conclusions based on his evaluation of the environment within which financial regulation operates: 1) the widespread nature of financial regulation is not accidental; 2) of the three categories enumerated above, information regulation extends most widely across the financial sector; 3) safety regulation applies most directly and strongly to those financial intermediaries who have the most widespread liabilities; and 4) economic regulation applies most extensively to banks and other depositories. He next explores the interaction between innovation and regulation and concludes that regulation has both negative and positive effects on innovation, a determination that depends largely on the critic's perspective on the regulations. The main effects of innovation on regulation now and in the future should involve the following issues: 1) more federal centralization of regulation, and less state regulation; 2) more international markets for financial products; 3) greater efficiency of financial markets due to increased competition; 4) development of regulations for new financial instruments; 5) differential regulatory treatment of risky financial instruments; 6) stored value cards, smart cards and other electronics-based innovations; 7) new privacy policies resulting from increased gathering of personal information from electronics-based instruments; 8) increased flows of funds through EFT systems; and 9) new interactions between computer software and hardware as well as with outside institutions as financial services transactions depend more on electronics-based instruments. The author concludes that a major task of public policy must be to ensure that financial regulation does not stifle innovation while it responds appropriately to the challenges posed. This paper was presented at the Financial Institutions Center's conference on Performance of Financial Institutions, May 8-10, 1997.
