114 research outputs found

    Quality of Service (QoS) in Enterprise WAN Networks

    Enterprise IT organizations began facing problems managing applications over slow Wide Area Network (WAN) circuits several years ago, with the appearance of file-sharing applications running over their WANs and the Internet between headquarters and branch offices. The pain has continued to grow with the introduction of mission-critical transactional data, along with voice and video traffic, over these networks. For this reason, vendors and service providers have developed and implemented several techniques over the years to help enterprise organizations cope with these problems. This field project report provides an overview of Quality of Service (QoS) and its implementation in enterprise networks for engineering managers. The author discusses best practices for developing designs for architectures already in place, as well as the business and technical challenges that engineering managers in enterprise organizations face when deploying QoS. While the project focuses on QoS implementation on the WAN, similar concepts can be applied throughout the network. The author first addresses bandwidth utilization: how it has increased in recent years, what types of applications are driving it, and the difficulties enterprise organizations have in managing it. Next, the various industry-standard solutions available to tackle this issue, and the advantages of deploying them, are discussed. Recommendations on using project planning guidelines to carry out this critical project within the company, and strategic steps toward that goal, follow. Finally, a cost analysis shows that simply increasing WAN bandwidth adds cost for enterprise organizations in most cases, whereas proper use of QoS can help an organization reduce cost by utilizing existing circuits while delivering the same level of service. The author recommends the use of QoS to manage network traffic, but does not rule out increasing bandwidth in the enterprise network, or a combination of both.
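The industry-standard techniques the report surveys center on classifying traffic and scheduling it by priority. A minimal sketch of that idea, assuming a common DSCP-to-class mapping and a strict-priority discipline (the mapping and class names are general conventions, not taken from the report):

```python
import heapq
from itertools import count

# Illustrative DSCP-to-priority mapping; lower number = higher priority.
# These code points are common conventions, not prescribed by the report.
DSCP_PRIORITY = {
    46: 0,  # EF   - voice, highest priority
    34: 1,  # AF41 - interactive video
    26: 2,  # AF31 - transactional data
    0:  3,  # BE   - best effort / bulk file sharing
}

class PriorityScheduler:
    """Strict-priority scheduler: the lowest priority number dequeues
    first; a monotonic counter breaks ties so each class stays FIFO."""

    def __init__(self):
        self._heap = []
        self._seq = count()

    def enqueue(self, packet, dscp):
        prio = DSCP_PRIORITY.get(dscp, 3)  # unknown markings -> best effort
        heapq.heappush(self._heap, (prio, next(self._seq), packet))

    def dequeue(self):
        if not self._heap:
            return None
        return heapq.heappop(self._heap)[2]

sched = PriorityScheduler()
sched.enqueue("bulk-transfer", dscp=0)
sched.enqueue("voip-frame", dscp=46)
sched.enqueue("sql-query", dscp=26)
print(sched.dequeue())  # voip-frame leaves first despite arriving later
```

In real deployments this discipline is combined with per-class policing and bandwidth guarantees so the high-priority queues cannot starve best-effort traffic.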

    Campus Communications Systems: Converging Technologies

    This book is a rewrite of Campus Telecommunications Systems: Managing Change, a book written by ACUTA in 1995. In the past decade, our industry has experienced a thousand-fold increase in data rates as we migrated from 10 megabit links (10 million bits per second) to 10 gigabit links (10 billion bits per second); we have seen the National Telecommunications Policy completely revamped; we have seen voice, data, and video combined onto one network; and we have seen many of our service providers merge into larger corporations able to offer more diverse services. When this book was last written, ACUTA meant telecommunications, convergence was a mathematical term, triple play was a baseball term, and terms such as iPod, DoS, and QoS did not exist. This book is designed as a communications primer for new entrants into the field of communications in higher education and for veteran communications professionals who want additional information in areas outside their own expertise. Reference books and textbooks are available on every topic discussed in this book if a more in-depth explanation is desired. Individual chapters were authored by communications professionals from various member campuses, allowing the authors to share their years of experience (more years than many of us would care to admit) with the community at large.
    Foreword: Walt Magnussen, Ph.D.
    Preface: Ron Kovac, Ph.D.
    1 The Technology Landscape: Historical Overview. Walt Magnussen, Ph.D.
    2 Emerging Trends and Technologies. Joanne Kossuth
    3 Network Security. Beth Chancellor
    4 Security and Disaster Planning and Management. Marjorie Windelberg, Ph.D.
    5 Student Services in a University Setting. Walt Magnussen, Ph.D.
    6 Administrative Services. David E. O'Neill
    7 The Business Side of Information Technology. George Denbow
    8 The Role of Consultants. David C. Metz
    Glossary: Michelle Narcavag

    Innovation-based competitive differentiation amongst South African fibre to the home (FTTH) operators

    A research report submitted to the Wits Business School at the University of the Witwatersrand, in partial fulfilment of the requirements for the degree of Master of Management in the field of Innovation Studies.
    Fibre-to-the-home (FTTH) as an access technology is viewed as vital for economic growth and competitiveness. The deployment of high-speed networks and advanced access technologies is essential for the continued development of broadband in South Africa. FTTH has recently been deployed and adopted by different network providers in South Africa, one of the developing countries attracting infrastructure investment, where industry players are deploying fibre at a high rate and acquiring smaller players. FTTH operators and service providers face competitive pressures in a market where competitiveness relies on the capacity to continually develop and implement unique innovative initiatives that create advantage over competitors. Competitive advantage is needed to achieve an organisation's objectives, and these innovative initiatives are required because intense competition within the market drives down prices and margins. The purpose of this study is to explore innovation-based competitive differentiation from a South African FTTH provider perspective. The study examines potential innovation differentiation amongst South African FTTH providers and how it can be applied to obtain competitive advantage, by looking at the factors that influence competitive advantage and the barriers to innovation within the market. The study provides knowledge and insights for understanding innovation for FTTH providers in a developing country such as South Africa.
In this study, data from different network providers and service providers was collected using a qualitative research method, based on in-depth interviews with subject matter experts and managers from FTTH operators and service providers. Propositions were suggested to validate the proposed framework. The sample was drawn from active FTTH network infrastructure operators and service providers based in Gauteng. The findings indicate that the fibre-to-the-home market is a highly competitive environment, with network providers and service providers offering various products and services to meet customer needs. However, innovation-based competitive differentiation is challenging because competition is currently based primarily on the price of the broadband product. As a result, prices have been plummeting, leading some service providers to operate and sell their FTTH products at zero or negative margins. KEYWORDS: Fibre-to-the-Home, FTTH, Innovation, Differentiation, Competition, Network Operators, Competitive Advantage.

    Cross-layer optimisation of quality of experience for video traffic

    Real-time video is currently the dominant network traffic and is set to increase in volume for the foreseeable future. Because this traffic is bursty, providing perceptually good video quality is a challenging task. Bursty traffic refers to inconsistency in the video traffic level: it is sometimes high and at other times low. Many video traffic measurement algorithms have been proposed for measurement-based admission control, yet there is no entirely satisfactory admission algorithm for variable-rate flows. Furthermore, video frames sent without reacting to network congestion are subject to loss and delay, which cause quality degradation. The trade-off between perceived Quality of Experience (QoE) and the number of sessions can be optimised by exploiting the bursty nature of video traffic. This study introduces a cross-layer QoE-aware optimisation architecture for video traffic; QoE is a measure of the user's perception of the quality of a network service. The architecture addresses the problem of QoE degradation in a bottleneck network. It proposes that video sources at the application layer adapt to the network environment by dynamically controlling their transmitted bit rate, while the edge of the network protects the quality of active video sessions by controlling the acceptance of new sessions through QoE-aware admission control. In particular, it seeks the most efficient way of accepting new video sessions, adapting sending rates to free up resources for more sessions whilst maintaining the QoE of current sessions. As a pathway to this objective, the performance of video flows that react to network load by adapting their sending rate was investigated. Although dynamic rate adaptation enhances video quality, accepting more sessions than a link can accommodate degrades QoE.
The video's instantaneous aggregate rate was compared to the average aggregate rate, a rate calculated over a measurement time window. There was no substantial difference between the two rates except for a small number of video flows, a long measurement window, or fast-moving content (such as sport), in which cases the average is smaller than the instantaneous rate; these scenarios do not always represent reality. This finding was the main motivation for proposing a novel, QoE-aware video traffic measurement algorithm. The algorithm finds the upper limit by which the total video rate can exceed a specific link capacity without degrading the QoE of ongoing video sessions. When implemented in QoE-aware admission control, the algorithm maintained QoE for a higher number of video sessions than calculated-rate-based admission controls such as the Internet Engineering Task Force (IETF) standard Pre-Congestion Notification (PCN)-based admission control. Subjective tests were conducted in which human subjects rated the quality of videos delivered with the proposed measurement algorithm. Mechanisms proposed for optimising the QoE of video traffic are surveyed in detail in this dissertation, and the challenges of achieving this objective are discussed. Finally, the current rate-adaptation capability of video applications was combined with the proposed QoE-aware admission control in a QoE-aware cross-layer architecture. The performance of the proposed architecture was evaluated against an architecture in which video applications perform rate adaptation without being managed by the admission control component. The results showed that the architecture improves the mean Mean Opinion Score (MOS) and the number of successfully decoded video sessions without compromising delay.
The algorithms proposed in this study were implemented and evaluated using Network Simulator version 2 (NS-2), MATLAB, EvalVid and EvalVid-RA, tools selected for their use in similar studies and their availability at the university. Data obtained from the simulations was analysed with analysis of variance (ANOVA), and Cumulative Distribution Functions (CDFs) were calculated for the performance metrics. The proposed architecture will contribute to preparation for the massive growth of video traffic, and the mathematical models of the proposed algorithms contribute to the research community.
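The measurement-based admission idea described above can be illustrated with a toy controller. The sliding window, the declared peak rate of the arriving flow, and the over-subscription factor (standing in for the QoE-derived upper limit the thesis computes) are all assumptions for illustration, not the thesis's actual algorithm:

```python
from collections import deque

class MeasurementBasedAdmission:
    """Toy measurement-based admission control: admit a new video session
    only if the average aggregate rate over a sliding window, plus the
    new flow's declared peak rate, stays below an over-subscription
    threshold. The oversub factor models the headroom that bursty,
    rate-adaptive video tolerates without QoE loss (assumed value)."""

    def __init__(self, capacity_bps, window=10, oversub=1.2):
        self.capacity = capacity_bps
        self.samples = deque(maxlen=window)  # recent aggregate-rate samples
        self.oversub = oversub               # QoE-tolerant headroom factor

    def record(self, aggregate_rate_bps):
        """Feed one periodic measurement of the link's aggregate rate."""
        self.samples.append(aggregate_rate_bps)

    def admit(self, new_peak_bps):
        """Decide whether a new session with this peak rate may join."""
        if not self.samples:
            return new_peak_bps <= self.capacity  # no history yet
        avg = sum(self.samples) / len(self.samples)
        return avg + new_peak_bps <= self.capacity * self.oversub

ac = MeasurementBasedAdmission(capacity_bps=10_000_000)
for r in (6_000_000, 7_000_000, 6_500_000):
    ac.record(r)
print(ac.admit(2_000_000))   # True: fits under the headroom threshold
print(ac.admit(7_000_000))   # False: would push the link past its limit
```

Averaging over a window rather than reacting to the instantaneous rate is what lets bursty flows statistically share the link, which is exactly the trade-off the abstract describes.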

    The InfoSec Handbook

    Computer science

    Quality-driven resource utilization methods for video streaming in wireless communication networks

    This research focuses on optimising resource utilisation in wireless mobile networks while considering the user-experienced quality of video streaming services. The study specifically considers the new generation of mobile communication networks, i.e. 4G LTE, as the main research context. The background study provides an overview of the main properties of the technologies investigated, including video streaming protocols and networks, video service quality assessment methods, the infrastructure and related functionalities of LTE, and resource allocation algorithms in mobile communication systems. A mathematical model based on an objective, no-reference quality assessment metric for video streaming, namely Pause Intensity, is developed in this work to evaluate the continuity of streaming services. The analytical model is verified by extensive simulation and subjective testing on the joint impairment effects of pause duration and pause frequency. Various types of video content and different levels of impairment were used in the validation tests. It is shown that Pause Intensity correlates closely with subjective quality measured as the Mean Opinion Score, and that this correlation is content independent. Based on the Pause Intensity metric, an optimised resource allocation approach is proposed for given user requirements, communication system specifications and network performance. This approach addresses both system efficiency and fairness when establishing resource allocation algorithms, together with the correlation between the required and allocated data rates per user. Pause Intensity plays a key role here, representing the required level of Quality of Experience (QoE) to ensure the best balance between system efficiency and fairness.
The 3GPP Long Term Evolution (LTE) system is used as the main application environment in which the proposed research framework is examined, and the results are compared with existing scheduling methods on achievable fairness, efficiency and correlation. Adaptive video streaming technologies are also investigated and combined with our initiatives on determining the distribution of QoE performance across the network. The resulting scheduling process is controlled by prioritising users according to the perceived quality of the services they receive, while a trade-off between fairness and efficiency is maintained through online adjustment of the scheduler's parameters. Furthermore, Pause Intensity is applied as a regulator to realise the rate-adaptation function during the end user's playback of an adaptive streaming service. The adaptive rates under various channel conditions, and the shape of the QoE distribution amongst users for different scheduling policies, are demonstrated in the context of LTE. Finally, work on interworking between the mobile communication system at the macro-cell level and different deployments of WiFi technologies throughout the macro-cell is presented. A QoE-driven approach is proposed to analyse the offloading of user data (e.g. video traffic) while the new rate distribution algorithm reshapes the network capacity across the macro-cell. The derived scheduling policy is used to regulate the performance of resource allocation across the fairness-efficiency spectrum. The associated offloading mechanism can control the number of users within the coverage areas of the macro-cell base station and each of the WiFi access points involved. The performance of non-seamless, user-controlled mobile traffic offloading (through mobile WiFi devices) is evaluated and compared with that of standard operator-controlled WiFi hotspots.
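The Pause Intensity metric combines the two impairments named above: pause duration and pause frequency. As a rough illustration only, one can take the product of the stalled-time ratio and the pause rate; the exact formulation in the thesis may differ, so treat this as an assumed simplification:

```python
def pause_intensity(pauses, playback_duration):
    """Toy Pause Intensity estimate for one streaming session.

    pauses: list of pause (stall) durations in seconds observed
        during playback.
    playback_duration: total session length in seconds, pauses
        included.

    Combines the stalled-time ratio with the pause rate as a simple
    product, so both longer and more frequent pauses raise the value
    (worse continuity). This formulation is an illustrative
    assumption, not the thesis's exact model.
    """
    if playback_duration <= 0:
        raise ValueError("playback_duration must be positive")
    stall_ratio = sum(pauses) / playback_duration  # fraction of time stalled
    pause_rate = len(pauses) / playback_duration   # pauses per second
    return stall_ratio * pause_rate

smooth = pause_intensity([], 60.0)             # uninterrupted playback -> 0.0
choppy = pause_intensity([2.0, 1.5, 2.5], 60.0)
print(smooth, choppy)  # more and longer pauses give a larger (worse) value
```

A scheduler can then rank users by such a continuity score and give radio resources first to sessions whose value is about to cross a quality threshold, which is the role the abstract assigns to Pause Intensity.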

    The Future of Cybercrime: AI and Emerging Technologies Are Creating a Cybercrime Tsunami

    This paper reviews the impact of AI and emerging technologies on the future of cybercrime and the strategies necessary to combat it effectively. Society faces a pressing challenge as cybercrime proliferates through AI and emerging technologies, while law enforcement and regulators struggle to keep up. Our primary challenge is raising awareness, as cybercrime operates within a distinct criminal ecosystem. We explore the hijacking of emerging technologies by criminals (CrimeTech) and their use in illicit activities, along with the tools and processes (InfoSec) that protect against future cybercrime. We also explore the role of AI and emerging technologies (DeepTech) in supporting law enforcement, regulation, and legal services (LawTech).

    The Evolving Cyber Threat Landscape during the Coronavirus Crisis

    In light of the societal changes wrought by the coronavirus pandemic, this report examines the impact the crisis has had on the general cybersecurity threat landscape during the first two quarters of 2020. The report highlights that the pandemic has generated a set of remarkable psycho-societal, technical, and logistical-economic circumstances upon which malicious actors have capitalized. The analysis of COVID-19-related cyber threats conducted for this report shows an overarching degree of continuity in the types of attacks, threat actors, and the volume of attacks. Adversarial behavior has, however, changed and evolved in scale, sophistication, targets, and motivation. Based on these findings, the report develops a range of recommendations for addressing key cybersecurity implications of the pandemic.