
    Resource-Aware Multimedia Content Delivery: A Gambling Approach

    In this paper, we propose a resource-aware solution for achieving reliable and scalable stream diffusion in a probabilistic model, i.e., one where communication links and processes are subject to message losses and crashes, respectively. Our solution is resource-aware in the sense that it limits both the memory consumption of each process, by strictly scoping the knowledge each process has about the system, and the bandwidth available to each process, by assigning it a fixed quota of messages. We describe our approach as gambling in the sense that it accepts occasionally giving up on a few processes in the hope of better serving all processes most of the time. That is, our solution deliberately takes the risk of not reaching some processes in some executions in order to reach every process in most executions. The underlying stream diffusion algorithm is based on a tree-construction technique that dynamically distributes the load of forwarding stream packets among processes according to their respective available bandwidths. Simulations show that this approach pays off when compared to traditional gossiping under identical bandwidth constraints.
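    The quota-driven tree construction the abstract describes can be illustrated with a minimal sketch. Everything here is an assumption about the paper's approach, not its actual algorithm: `build_tree`, the greedy breadth-first assignment, and the quota map are all illustrative. The key idea shown is that a process's fan-out in the dissemination tree is capped by its message quota, and that processes the quota budget cannot cover are deliberately left unreached.

    ```python
    def build_tree(root, peers, quotas):
        """Greedy dissemination-tree sketch (hypothetical, not the paper's
        algorithm): each process forwards to at most quotas[p] children,
        so fan-out follows the per-process bandwidth quota."""
        tree = {p: [] for p in [root] + peers}
        frontier = [root]
        remaining = list(peers)
        while remaining and frontier:
            parent = frontier.pop(0)
            k = quotas[parent]                       # bandwidth quota = max children
            children, remaining = remaining[:k], remaining[k:]
            tree[parent] = children
            frontier.extend(children)
        # `remaining` holds the processes the quota budget failed to reach:
        # the "gamble" is that this set is empty in most executions.
        return tree, remaining
    ```

    For example, with quotas {"r": 2, "a": 1, "b": 0, "c": 0}, the root forwards to two peers and one of those forwards to the third, reaching everyone; shrink any quota and some peer is sacrificed.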

    Characterizing a Meta-CDN

    CDNs have reshaped the Internet architecture at large. They operate (globally) distributed networks of servers to reduce latencies, to increase content availability, and to handle large traffic bursts. Traditionally, content providers were mostly limited to a single CDN operator. In recent years, however, more and more content providers employ multiple CDNs to serve the same content and provide the same services. Thus, switching between CDNs, which can be beneficial to reduce costs, to select CDNs by optimal performance in different geographic regions, or to overcome CDN-specific outages, becomes an important task. Services that tackle this task have emerged, known as CDN brokers, Multi-CDN selectors, or Meta-CDNs. Despite their existence, little is known about Meta-CDN operation in the wild. In this paper, we thus shed light on this topic by dissecting a major Meta-CDN. Our analysis provides insights into its infrastructure, its operation in practice, and its usage by Internet sites. We leverage PlanetLab and RIPE Atlas as distributed infrastructures to study how a Meta-CDN impacts web latency.
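    One common way to observe a Meta-CDN from the outside is to resolve a broker-fronted hostname from many vantage points and map the returned CNAME chain to a CDN operator by its domain suffix. The sketch below is an assumption about this style of measurement, not the paper's method, and the suffix-to-operator map is illustrative only:

    ```python
    # Hypothetical suffix map: which CDN operator a CNAME target belongs to.
    # Real studies maintain much larger, curated mappings.
    CDN_SUFFIXES = {
        "fastly.net": "Fastly",
        "akamaiedge.net": "Akamai",
        "cloudfront.net": "Amazon CloudFront",
        "edgecastcdn.net": "Edgecast",
    }

    def classify_cname_chain(chain):
        """Return the first CDN whose suffix matches a name in the CNAME
        chain observed when resolving a broker-fronted hostname."""
        for name in chain:
            stripped = name.rstrip(".")
            for suffix, cdn in CDN_SUFFIXES.items():
                if stripped.endswith(suffix):
                    return cdn
        return "unknown"
    ```

    Resolving the same hostname from different regions or at different times and classifying each chain this way reveals when the broker switches the selected CDN.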

    Internet censorship in the European Union

    This is a thesis on Internet censorship in the European Union (EU), specifically regarding the technical implementation of blocking methodologies and filtering infrastructure in various EU countries. The analysis examines the use of this infrastructure for information controls and the blocking of access to websites and other network services available on the Internet. The thesis follows a three-part structure. Firstly, it examines cases of Internet censorship in various EU countries, specifically Greece, Cyprus, and Spain. Subsequently, it presents a new testing methodology for detecting censorship using applications available in mobile app stores. Additionally, it analyzes all 27 EU countries using historical network measurements collected by Open Observatory of Network Interference (OONI) volunteers from around the world, publicly available blocklists used by EU member states, and reports issued by network regulators in each country.
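    Measurements of the kind OONI collects are typically interpreted by comparing a response observed inside a network against an uncensored control. The heuristic below is a minimal sketch of that comparison; the field names and thresholds are illustrative assumptions, not OONI's actual schema or logic:

    ```python
    def flag_blocking(control, measured):
        """Hypothetical blocking heuristic: compare a measurement against an
        uncensored control fetch. Field names are illustrative, not OONI's."""
        # DNS tampering: resolver returned addresses disjoint from the control's.
        if measured["dns_answers"] and not set(measured["dns_answers"]) & set(control["dns_answers"]):
            return "dns_mismatch"
        # HTTP interference: e.g. a 403/451 blockpage instead of a 200.
        if measured["status"] != control["status"]:
            return "http_diff"
        # Blockpages tend to be far smaller than the real page body.
        ratio = len(measured["body"]) / max(len(control["body"]), 1)
        if ratio < 0.3:
            return "body_truncated"
        return "ok"
    ```

    Aggregating such flags across many volunteers, targets, and years is what lets a study like this one characterize blocking per country rather than per single vantage point.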

    A comparison of statistical machine learning methods in heartbeat detection and classification

    In health care, patients with heart problems require quick responsiveness in a clinical setting or in the operating theatre. Towards that end, automated classification of heartbeats is vital, as some heartbeat irregularities are time-consuming to detect. Therefore, analysis of electrocardiogram (ECG) signals is an active area of research. The methods proposed in the literature depend on the structure of a heartbeat cycle. In this paper, we use interval- and amplitude-based features together with a few samples from the ECG signal as a feature vector. We studied a variety of classification algorithms, focusing especially on a type of arrhythmia known as ventricular ectopic beats (VEB). We compare the performance of the classifiers against algorithms proposed in the literature and make recommendations regarding features, sampling rate, and choice of the classifier to apply in a real-time clinical setting. The extensive study is based on the MIT-BIH arrhythmia database. Our main contributions are the evaluation of existing classifiers over a range of sampling rates, the recommendation of a detection methodology to employ in a practical setting, and an extension of the notion of a mixture of experts to a larger class of algorithms.
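    The feature vector the abstract describes, intervals plus amplitude plus a few raw signal samples, can be sketched as follows. This is a minimal illustration under assumed conventions (annotated R-peak positions, fixed window sizes, eight morphology samples), not the paper's actual feature extraction:

    ```python
    import numpy as np

    def beat_features(signal, r_peaks, i, n_samples=8):
        """Hypothetical interval + amplitude + raw-sample feature vector for
        the i-th beat. Window widths and sample count are illustrative;
        boundary beats (first/last) are not handled in this sketch."""
        pre_rr = r_peaks[i] - r_peaks[i - 1]           # interval to previous beat
        post_rr = r_peaks[i + 1] - r_peaks[i]          # interval to next beat
        r = r_peaks[i]
        window = signal[r - 50 : r + 50]               # samples around the R-peak
        amplitude = window.max() - window.min()        # local peak-to-peak amplitude
        idx = np.linspace(r - 40, r + 40, n_samples).astype(int)
        samples = signal[idx]                          # a few raw morphology samples
        return np.concatenate(([pre_rr, post_rr, amplitude], samples))
    ```

    A feature vector of this shape is what the compared classifiers would consume, and the raw-sample component is what makes the choice of sampling rate, one of the abstract's evaluation axes, matter.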