106 research outputs found

    Rethinking Routing and Peering in the era of Vertical Integration of Network Functions

    Content providers typically control digital content consumption services and capture most of the revenue by offering an all-you-can-eat model funded through subscriptions or hyper-targeted advertisements. A recent trend is to revamp the existing Internet architecture and design through vertical integration, in which a content provider and an access ISP operate as a single "sugarcane" entity. As this vertical integration trend emerges in the ISP market, it is questionable whether the existing routing architecture will suffice in terms of sustainable economics, peering, and scalability. Current routing will likely need careful modifications and smart innovations to ensure effective and reliable end-to-end packet delivery. This involves developing new features for handling traffic with reduced latency, tackling routing scalability issues more securely, and offering new services at lower cost. Because the prices of DRAM and TCAM in legacy routers are not decreasing at the desired pace, cloud computing can manage the growing computation and memory complexity of routing functions in a centralized manner at optimized expense. Focusing on the attributes of existing routing cost models and exploring a hybrid approach to SDN, we compare recent trends in cloud pricing (for both storage and service) to evaluate whether integrating cloud services with legacy routing would be economically beneficial. In terms of peering, using the US as a case study, we show the overlaps between access ISPs and content providers to explore the viability of peering between the emerging content-dominated sugarcane ISPs and the health of Internet economics. To this end, we introduce meta-peering, a term that encompasses automation efforts related to peering, from identifying a list of ISPs likely to peer, to injecting control-plane rules, to continuously monitoring and notifying of any violation. Meta-peering is one of the many outgrowths of vertical integration and could be offered to ISPs as a standalone service.
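    To make the cloud-versus-legacy cost question concrete, a minimal break-even sketch along the lines the abstract describes might look as follows. All function names, prices, and routing-table parameters here are placeholder assumptions, not figures from the paper; real estimates would use current DRAM/TCAM and cloud-provider price lists.

        def monthly_cloud_cost(rib_entries, bytes_per_entry, storage_price_gb_month,
                               lookups_per_month, price_per_million_lookups):
            """Rough monthly cost of keeping routing state and lookups in a cloud service."""
            storage_gb = rib_entries * bytes_per_entry / 1e9
            return (storage_gb * storage_price_gb_month
                    + lookups_per_month / 1e6 * price_per_million_lookups)

        def amortised_tcam_cost(rib_entries, bytes_per_entry, tcam_price_per_mb, lifetime_months):
            """Rough monthly cost of holding the same state in router TCAM, amortised over its lifetime."""
            tcam_mb = rib_entries * bytes_per_entry / 1e6
            return tcam_mb * tcam_price_per_mb / lifetime_months

        # Placeholder parameters only (assumed, not measured).
        cloud = monthly_cloud_cost(950_000, 64, storage_price_gb_month=0.02,
                                   lookups_per_month=5e9, price_per_million_lookups=0.4)
        legacy = amortised_tcam_cost(950_000, 64, tcam_price_per_mb=300.0, lifetime_months=60)
        print(f"cloud ~ {cloud:.2f} USD/month, legacy TCAM ~ {legacy:.2f} USD/month")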

    Preventing State-Led Cyberattacks Using the Bright Internet and Internet Peace Principles

    The Internet has engendered serious cybersecurity problems due to its anonymity, transnationality, and technical shortcomings. This paper addresses state-led cyberattacks (SLCAs) as a particular source of threats. Recently, the concept of the Bright Internet was proposed as a means of shifting the cybersecurity paradigm from self-defensive protection to the preventive identification of malevolent origins through the adoption of five cohesive principles. To design a preventive solution against SLCAs, we distinguish the nature of SLCAs from that of private-led cyberattacks (PLCAs). We then analyze what can and cannot be prevented according to the principles of the Bright Internet. For this research, we collected seven typical SLCA cases and selected three illustrative PLCA cases, characterized along eleven factors. Our analysis demonstrates that the Bright Internet principles alone are insufficient for preventing cyberterror threats from noncompliant countries. Thus, we propose a complementary measure, referred to here as the Internet Peace Principles, which hold that the Internet should be used only for peaceful purposes in accordance with international laws and norms. We derive these principles using an approach that combines the extension of physical-world conventions to cyberspace, the expansion of international cybersecurity conventions to global member countries, and analogous international norms. Based on this framework, we adopt the Charter of the United Nations, the Responsibility of States for Internationally Wrongful Acts, the recommendations of the United Nations Group of Governmental Experts, the Tallinn Manual, the Treaty on the Non-Proliferation of Nuclear Weapons, and other instruments as reference norms from which we derive the consistent international order embodied by the Internet Peace Principles.

    Mapping peering interconnections to a facility

    Annotating Internet interconnections with robust physical coordinates at the level of a building facilitates network management including interdomain troubleshooting, but also has practical value for helping to locate points of attacks, congestion, or instability on the Internet. But, like most other aspects of Internet interconnection, its geophysical locus is generally not public; the facility used for a given link must be inferred to construct a macroscopic map of peering. We develop a methodology, called constrained facility search, to infer the physical interconnection facility where an interconnection occurs among all possible candidates. We rely on publicly available data about the presence of networks at different facilities, and execute traceroute measurements from more than 8,500 available measurement servers scattered around the world to identify the technical approach used to establish an interconnection. A key insight of our method is that inference of the technical approach for an interconnection sufficiently constrains the number of candidate facilities such that it is often possible to identify the specific facility where a given interconnection occurs. Validation via private communication with operators confirms the accuracy of our method, which outperforms heuristics based on naming schemes and IP geolocation. Our study also reveals the multiple roles that routers play at interconnection facilities; in many cases the same router implements both private interconnections and public peerings, in some cases via multiple Internet exchange points. Our study also sheds light on peering engineering strategies used by different types of networks around the globe
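    A rough sketch of the constrained-facility-search intuition: intersect the published facility lists of the two networks involved in an interconnection and, for public peering, further constrain by the facilities where the IXP is present. The data, ASNs, and facility names below are hypothetical, and the paper's full method additionally uses targeted traceroute measurements and iterative refinement.

        # Hypothetical colocation data, in the spirit of PeeringDB-style facility records.
        FACILITIES_BY_AS = {
            64500: {"Facility-A", "Facility-B", "Facility-C"},
            64501: {"Facility-B", "Facility-C"},
        }
        FACILITIES_BY_IXP = {
            "IXP-X": {"Facility-B"},
        }

        def candidate_facilities(asn_a, asn_b, ixp=None):
            """Constrain the facilities where an interconnection between two ASes can exist."""
            candidates = FACILITIES_BY_AS[asn_a] & FACILITIES_BY_AS[asn_b]
            if ixp is not None:
                # A public peering must sit where the IXP's switching fabric is present.
                candidates &= FACILITIES_BY_IXP[ixp]
            return candidates

        print(candidate_facilities(64500, 64501))               # private interconnection candidates
        print(candidate_facilities(64500, 64501, ixp="IXP-X"))  # public peering candidates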

    D2WFP: a novel protocol for forensically identifying, extracting, and analysing deep and dark web browsing activities

    The use of the unindexed web, commonly known as the deep web and dark web, to commit or facilitate criminal activity has drastically increased over the past decade. The dark web is a dangerous place where all kinds of criminal activities take place. Despite advances in web forensic techniques, tools, and methodologies, few studies have formally tackled deep and dark web forensics and the technical differences in terms of investigative techniques and artefact identification and extraction. This study proposes a novel and comprehensive protocol to guide and assist digital forensic professionals in investigating crimes committed on or via the deep and dark web. The protocol, named D2WFP, establishes a new sequential approach for performing investigative activities by observing the order of volatility and implementing a systemic approach covering all browsing-related hives and artefacts, which ultimately improves accuracy and effectiveness. Rigorous quantitative and qualitative research was conducted by assessing D2WFP in different scenarios, following a scientifically sound and comprehensive process; the results show a clear increase in the number of artefacts recovered when adopting D2WFP, outperforming current industry and open-source browsing forensic tools. The second contribution of this work is the robust formulation of artefact correlation and cross-validation within D2WFP, which enables digital forensic professionals to better document and structure their analysis of host-based deep and dark web browsing artefacts.
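    The order-of-volatility sequencing that D2WFP builds on can be pictured as a prioritised collection plan, as in the sketch below. The artefact sources and volatility ranks are illustrative assumptions rather than the protocol's actual steps.

        from dataclasses import dataclass

        @dataclass
        class ArtefactSource:
            name: str
            volatility: int  # lower value = more volatile = acquire earlier

        # Illustrative sources only; D2WFP defines its own ordered hives and artefacts.
        SOURCES = [
            ArtefactSource("registry hives and on-disk browsing artefacts", 3),
            ArtefactSource("RAM image (running dark web browser processes)", 1),
            ArtefactSource("pagefile/swap and hibernation files", 2),
            ArtefactSource("archived logs and backups", 4),
        ]

        def collection_plan(sources):
            """Order sources by volatility so the most volatile are acquired first."""
            return sorted(sources, key=lambda s: s.volatility)

        for step, src in enumerate(collection_plan(SOURCES), start=1):
            print(f"step {step}: acquire {src.name}")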

    Development of a system compliant with the Application-Layer Traffic Optimization Protocol

    Integrated Master's dissertation in Informatics Engineering. With the ever-increasing Internet usage that is following the start of the new decade, optimizing this world-scale network of computers becomes a big priority in a technological sphere whose number of users keeps rising, as do the Quality of Service (QoS) demands of applications in domains such as media streaming or virtual reality. In the face of rising traffic and stricter application demands, a better understanding of how Internet Service Providers (ISPs) should manage their assets is needed. An important concern is how applications utilize the underlying network infrastructure over which they reside. Most of these applications act with little regard for ISP preferences, as exemplified by their lack of care in achieving traffic locality during their operation, which would be a preferable feature for network administrators and could also improve application performance. However, even a best-effort attempt by applications to cooperate will hardly succeed if ISP policies aren't clearly communicated to them. Therefore, a system to bridge layer interests has much potential in helping achieve a mutually beneficial scenario. The main focus of this thesis is the Application-Layer Traffic Optimization (ALTO) working group, which was formed by the Internet Engineering Task Force (IETF) to explore standardizations for network information retrieval. This group specified a request-response protocol in which authoritative entities provide resources containing network status information and administrative preferences. This infrastructural insight is shared with the intent of enabling a cooperative environment between the network overlay and underlay during application operations, to obtain better infrastructural resourcefulness and the consequent minimization of the associated operational costs. This work gives an overview of the historical network tussle between applications and service providers, presents the ALTO working group's project as a solution, implements an extended system built upon its ideas, and finally verifies the developed system's efficiency, in a simulation, when compared to classical alternatives.
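    RFC 7285 defines ALTO as an HTTP request-response protocol in which a server exposes resources such as a network map (grouping endpoints into PIDs) and a cost map (provider-defined costs between PIDs). The minimal client sketch below follows that shape; the server URL, resource path, and PID names are hypothetical, since real deployments advertise their resources through an information resource directory.

        import requests  # third-party HTTP client, assumed available

        ALTO_SERVER = "https://alto.example.net"  # hypothetical ALTO server

        def fetch_cost_map():
            """Retrieve a routing-cost cost map; the resource path is deployment-specific."""
            resp = requests.get(
                f"{ALTO_SERVER}/costmap/routingcost",
                headers={"Accept": "application/alto-costmap+json"},
            )
            resp.raise_for_status()
            return resp.json()["cost-map"]

        def cheapest_peer(cost_map, source_pid, candidate_pids):
            """Pick the candidate PID with the lowest provider-defined routing cost."""
            costs = cost_map[source_pid]
            return min(candidate_pids, key=lambda pid: costs[pid])

        # Example use, assuming the hypothetical server above is reachable:
        # cost_map = fetch_cost_map()
        # print(cheapest_peer(cost_map, "PID-home-users", ["PID-cache-1", "PID-cache-2"]))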

    Security, Privacy and Economics of Online Advertising

    Online advertising is at the core of today’s Web: it is the main business model, generating large annual revenues, expressed in tens of billions of dollars, that sponsor most of the online content and services. Online advertising consists of delivering marketing messages, embedded into Web content, to a targeted audience. In this model, entities attract Web traffic by offering content and services for free and charge advertisers for including advertisements in this traffic (i.e., advertisers pay for users’ attention and interests). Online advertising is a very successful form of advertising as it allows advertisements (ads) to be targeted to individual users’ interests, especially when ads are served on users’ mobile devices, where they can be targeted to users’ locations and the corresponding context. However, online advertising also introduces a number of problems. Given the high ad revenue at stake, fraudsters have economic incentives to exploit the ad system and generate profit from it. Unfortunately, to achieve this goal, they often compromise users’ online security (e.g., via malware, phishing, etc.). For the purpose of maximizing revenue by matching ads to users’ interests, a number of techniques are deployed to track and profile users’ digital footprints, i.e., their behavior in the digital world. These techniques introduce new threats to users’ privacy. Consequently, some users adopt ad-avoidance tools that prevent the download of advertisements and partially thwart user profiling. Such user behavior, as well as exploits of ad systems, has economic implications, as it undermines the online advertising business model. Meddling with advertising revenue disrupts the current economic model of the Web, the consequences of which are unclear. Given that today’s Web model relies on online advertising revenue in order for users to access and consume content and services for “free”, coupled with the fact that there are many threats that could jeopardize this model, in this thesis we address the security, privacy and economic issues stemming from this fundamental element of the Web. In the first part of the thesis, we investigate the vulnerabilities of online advertising systems. We identify how an adversary can exploit the ad system to generate profit for itself, notably by performing in-flight modification of ad traffic. We provide a proof-of-concept implementation of the identified threat on Wi-Fi routers. We propose a collaborative approach for securing online advertising and Web browsing against such threats. By investigating how certificate-based authentication is deployed in practice, we assess the potential of relying on certificate-based authentication as a building block of a solution to protect ad revenue. We propose a multidisciplinary approach for improving the current state of certificate-based authentication on the Web. In the second part of the thesis, we study the economics of ad-system exploits and certain potential countermeasures. We evaluate the potential of different solutions aimed at protecting ad revenue being implemented by the stakeholders (e.g., Internet Service Providers or ad networks) and the conditions under which this is likely to happen. We also study the economic ramifications of ad-avoidance technologies on the monetization of online content. We use game theory to model the strategic behavior of the involved entities and their interactions.
In the third part of the thesis, we focus on the privacy implications of online advertising. We identify a novel threat to users’ location privacy that enables service providers to geolocate users with high accuracy, which is needed to serve location-targeted ads for local businesses. We draw attention to the large scale of the threat and its potential impact on users’ location privacy.
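    One way to picture the kind of game-theoretic modelling mentioned for the second part of the thesis is a toy 2x2 game between a publisher (aggressive or light ad load) and a user (install an ad blocker or not). The strategies and payoff numbers below are invented for illustration and are not taken from the thesis.

        # Payoffs: (publisher_utility, user_utility) per strategy pair. Invented numbers.
        PAYOFFS = {
            ("aggressive_ads", "no_blocker"): (5, 1),
            ("aggressive_ads", "ad_blocker"): (0, 3),
            ("light_ads", "no_blocker"): (3, 4),
            ("light_ads", "ad_blocker"): (1, 3),
        }
        PUBLISHER = ["aggressive_ads", "light_ads"]
        USER = ["no_blocker", "ad_blocker"]

        def pure_nash_equilibria():
            """Strategy pairs where neither player gains by deviating unilaterally."""
            eqs = []
            for p in PUBLISHER:
                for u in USER:
                    pub_ok = all(PAYOFFS[(p, u)][0] >= PAYOFFS[(q, u)][0] for q in PUBLISHER)
                    usr_ok = all(PAYOFFS[(p, u)][1] >= PAYOFFS[(p, v)][1] for v in USER)
                    if pub_ok and usr_ok:
                        eqs.append((p, u))
            return eqs

        eqs = pure_nash_equilibria()
        print(eqs or "no pure-strategy equilibrium; this toy game only has a mixed one")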

    Satellite Networks: Architectures, Applications, and Technologies

    Because global satellite networks are moving to the forefront of enhancing national and global information infrastructures, owing to communication satellites' unique networking characteristics, a workshop was organized to assess the progress made to date and to chart the future. The workshop provided a forum to assess the current state of the art, identify key issues, and highlight emerging trends in next-generation architectures, data protocol development, communication interoperability, and applications. Presentations covering overviews, the state of the art in research, development, deployment, and applications, and future trends in satellite networks are assembled here.

    An Energy-Efficient Multi-Cloud Service Broker for Green Cloud Computing Environment

    The heavy demand on cloud computing resources has led to substantial growth in the energy consumed both by the data transferred between cloud computing parties (i.e., providers, datacentres, users, and services) and by datacentre services, due to the increasing loads placed on them. On the one hand, routing and transferring large amounts of data to a datacentre located far from the user's geographical location consumes more energy than simply processing and storing the same data in the cloud datacentre. On the other hand, when a cloud user submits a job (in the form of a set of functional and non-functional requirements) to a cloud service provider (i.e., a datacentre) via a cloud services broker, the broker becomes responsible for finding the service that best fits the user request, based mainly on the user's requirements and Quality of Service (QoS) attributes (e.g., response time, latency). Hence, it becomes highly necessary to locate the lowest-energy route between the user and the designated datacentre, and the minimum possible number of the most energy-efficient services that satisfy the user request. In fact, finding the most energy-efficient route to the datacentre and the most energy-efficient service(s) for the user are the biggest challenges in a multi-cloud broker environment. This thesis presents and evaluates a novel multi-cloud broker solution that contains three innovative models and their associated algorithms. The first is aimed at finding the most energy-efficient route, among multiple possible routes, between the user and the cloud datacentre. The second model finds and provides the lowest possible number of the most energy-efficient services, based on a bin-packing approach, in order to minimise data exchange. The third model creates an energy-aware composition plan by integrating the most energy-efficient services in order to fulfil the user requirements. The results demonstrate the favourable performance of these models in selecting the most energy-efficient route and reaching the least possible number of services for an optimal, energy-efficient composition.
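    The second model's idea of covering a request with as few energy-efficient services as possible can be illustrated with a simple greedy selection sketch, used here as a stand-in for the thesis's bin-packing formulation. The candidate services, capacities, and energy scores are hypothetical.

        from dataclasses import dataclass

        @dataclass
        class Service:
            name: str
            capacity: float         # units of the request this service can handle
            energy_per_unit: float  # relative energy cost, lower is better

        # Hypothetical candidate services across the multi-cloud environment.
        CANDIDATES = [
            Service("svc-A", capacity=40, energy_per_unit=0.8),
            Service("svc-B", capacity=25, energy_per_unit=0.5),
            Service("svc-C", capacity=60, energy_per_unit=1.2),
            Service("svc-D", capacity=30, energy_per_unit=0.6),
        ]

        def select_services(demand, candidates):
            """Greedily cover the demand with few, energy-efficient services."""
            ranked = sorted(candidates, key=lambda s: (s.energy_per_unit, -s.capacity))
            chosen, remaining = [], demand
            for svc in ranked:
                if remaining <= 0:
                    break
                chosen.append(svc)
                remaining -= svc.capacity
            if remaining > 0:
                raise ValueError("demand cannot be covered by the available services")
            return chosen

        print([s.name for s in select_services(demand=70, candidates=CANDIDATES)])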

    Nurturing a Digital Learning Environment for Adults 55+

    Being digitally competent means having competences in all areas of DigComp: information and data literacy, communication and collaboration, digital content creation, safety, and problem-solving. More than other demographic categories, adults 55+ show a wide range of levels of digitalization. Depending on their level of competences, individuals may join self-administered online courses to improve their skills, or they may need guidance from adult educators. Taking this situation into consideration, and aiming to address adult learners regardless of their initial skill levels, the proposed educational programme is carefully designed for both self-administered and educator-led training. It comprises five innovative courses that can be taught separately or integrated into a larger programme delivered by adult education organizations. These courses are the result of the ERASMUS+ project “Digital Facilitator for Adults 55+”. Chapter 1 introduces the methodology for designing attractive and engaging educational materials for improving adults’ digital skills. The methodology clarifies the inputs, the development process, and the expected results. An ample explanation of the five phases of the 5E instructional strategy is presented to help adult educators build a sequence of coherent and engaging learning stages. With this approach, learners are supported to think, work, gather ideas, identify their own skill levels and needs, analyse their progress, and communicate with others under the guidance of educators. Following up on the proposed methodology, in Chapter 2 researchers from Formative Footprint (Spain), TEAM4Excellence (Romania), Voluntariat Pentru Viata (Romania) and Saricam Halk Egitimi Merkezi (Turkey) developed five course modules in line with DigComp, the Digital Competence Framework for Citizens. These modules address the competence areas of information and data literacy, communication and collaboration, digital content creation, safety, and problem-solving. Each course module comprises digital textbooks, videos, interactive activities, and means for evaluation developed using the 5E instructional model. Understanding that accessibility is one of the main components of lifelong learning education, Chapter 3 of the manual provides an overview of the integration of the educational materials, tools, instruments, video tutorials, and the DIFA55+ web app into the digital educational ecosystem. Finally, the authors formulate recommendations for usability and transferability that go beyond individuals, ensuring that the educational materials are user-friendly and effective while making it easier to apply successful pedagogical approaches in other complementary educational contexts or projects. Grant Agreement 2021-1-RO01-KA220-ADU-000035297, Digital Facilitator for Adults 55+.

    Unauthorized Access

    Going beyond current books on privacy and security, this book proposes specific solutions to public policy issues pertaining to online privacy and security. Requiring no technical or legal expertise, it provides a practical framework to address ethical and legal issues. The authors explore the well-established connection between social norms, privacy, security, and technological structure. They also discuss how rapid technological developments have created novel situations that lack relevant norms and present ways to develop these norms for protecting informational privacy and ensuring sufficient information security