64 research outputs found

    Towards Understanding First-Party Cookie Tracking in the Field

    Get PDF
    Third-party tracking is a common and broadly used technique on the Web. Different defense mechanisms have emerged to counter these practices (e.g., browser vendors banning all third-party cookies). However, these countermeasures only target third-party trackers and ignore the first party, because the narrative is that such monitoring is mostly used to improve the utilized service (e.g., analytical services). In this paper, we present a large-scale measurement study that analyzes tracking performed by the first party but utilized by a third party to circumvent standard tracking prevention techniques. We visit the top 15,000 websites to analyze first-party cookies used to track users and a technique called “DNS CNAME cloaking”, which can be used by a third party to place first-party cookies. Using this data, we show that 76% of sites effectively utilize such tracking techniques. In a long-running analysis, we show that the usage of such cookies increased by more than 50% over 2021.
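
    The “DNS CNAME cloaking” technique mentioned above works by pointing a first-party subdomain, via a DNS CNAME record, at a host operated by a third-party tracker. As a minimal illustration (not the paper's measurement pipeline), the following Python sketch resolves a hostname's CNAME chain and checks the targets against a small, purely hypothetical list of tracker domains; it assumes the third-party dnspython package is installed.

# Minimal sketch: flag potential CNAME cloaking by checking whether a
# first-party-looking subdomain is a CNAME alias of a known tracker domain.
# Requires dnspython (pip install dnspython).
import dns.resolver

# Hypothetical, abbreviated tracker list for illustration only.
KNOWN_TRACKER_DOMAINS = {"tracker-one.example", "tracker-two.example"}

def cname_targets(hostname: str) -> list[str]:
    """Return the CNAME targets for a hostname (empty list if none)."""
    try:
        answer = dns.resolver.resolve(hostname, "CNAME")
    except (dns.resolver.NoAnswer, dns.resolver.NXDOMAIN, dns.resolver.NoNameservers):
        return []
    return [rdata.target.to_text().rstrip(".") for rdata in answer]

def looks_like_cname_cloaking(hostname: str) -> bool:
    """True if the subdomain aliases into a domain on the tracker list."""
    return any(
        target == domain or target.endswith("." + domain)
        for target in cname_targets(hostname)
        for domain in KNOWN_TRACKER_DOMAINS
    )

if __name__ == "__main__":
    # A site might serve analytics from a first-party-looking subdomain such as this.
    print(looks_like_cname_cloaking("metrics.example.com"))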

    Neue Ansätze zur Echtzeitsteuerung städtischer Lichtsignalanlagen

    Get PDF
    Adaptive Traffic Control Systems (ATCS) control a set of traffic signals at connected intersections in a network. They continuously adapt the signalization in real time to the current traffic demand. In this thesis, a new ATCS prototype has been developed and evaluated. A comprehensive overview of the state of the art of traffic signal control is given, followed by an overview of the conceptual design of the ATCS prototype. Every quarter of an hour, the signal timings of all signalized intersections are optimized on a central computer and sent to the local controllers, where they are executed. The first task is to estimate the traffic demand of the next optimization interval: based on detector counts of the last four time intervals, a forecasting module estimates the detector counts of the next interval. These counts are used as constraints for the estimation of origin-destination flows and of traffic volumes on different routes and on all links of the network. The next module makes use of classic formulas for the calculation of fixed-time signal plans in order to adjust a network-wide common cycle length and individual phase durations. The subsequent model-based offset optimization aims at establishing a good coordination of adjacent intersections; a macroscopic traffic flow model is used to evaluate the effects of different offset combinations. Several optimization algorithms have been implemented, two of which are based on Genetic Algorithms; a third, deterministic algorithm has been developed as well. At the beginning of each time interval, the new signal timings have to be implemented at each intersection. Based on the state of the art and on a simulation study, a smooth transition technique has been identified and implemented. Finally, the ATCS prototype has been evaluated by means of a comprehensive microsimulation study: it shows some potential to improve travel times compared to an optimized fixed-time signal control, and the degree of this improvement depends on the network.
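
    The “classic formulas” for fixed-time signal plans are not named in the abstract; the standard textbook example is Webster's method. Under that assumption, the following Python sketch (an illustration, not the thesis implementation) computes Webster's optimal cycle length and distributes the effective green time in proportion to the critical flow ratios of the phases.

# Illustrative sketch of a classic fixed-time calculation (Webster's method).
def webster_cycle_length(critical_flow_ratios: list[float], lost_time_s: float) -> float:
    """Webster's optimal cycle length C0 = (1.5 * L + 5) / (1 - Y), where L is the
    total lost time per cycle in seconds and Y the sum of the critical flow ratios
    (flow / saturation flow) over all phases."""
    y_total = sum(critical_flow_ratios)
    if y_total >= 1.0:
        raise ValueError("Intersection is oversaturated (Y >= 1); no finite cycle length.")
    return (1.5 * lost_time_s + 5.0) / (1.0 - y_total)

def green_splits(critical_flow_ratios: list[float], cycle_s: float, lost_time_s: float) -> list[float]:
    """Distribute the effective green time in proportion to the critical flow ratios."""
    y_total = sum(critical_flow_ratios)
    effective_green = cycle_s - lost_time_s
    return [effective_green * y / y_total for y in critical_flow_ratios]

# Example: two phases with critical flow ratios 0.35 and 0.25 and 10 s total lost time.
c0 = webster_cycle_length([0.35, 0.25], lost_time_s=10.0)
print(round(c0, 1), [round(g, 1) for g in green_splits([0.35, 0.25], c0, 10.0)])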

    Our (in)Secure Web: Understanding Update Behavior of Websites and Its Impact on Security

    Get PDF
    Software updates play an essential role in keeping IT environments secure. If service providers delay or do not install updates, this can cause unwanted security implications for their environments. This paper conducts a large-scale measurement study of the update behavior of websites and their utilized software stacks. Across 18 months, we analyze over 5.6M websites and 246 distinct client- and server-side software distributions. We find that almost all analyzed sites use outdated software. To understand the possible security implications of outdated software, we analyze the potential vulnerabilities that affect the utilized software. We show that software components are getting older and more vulnerable because they are not updated, and we find that 95% of the analyzed websites use at least one product for which a vulnerability existed.
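
    The vulnerability analysis described above essentially requires matching each detected software version against versions known to be affected. The Python sketch below illustrates such a check with purely hypothetical product names and fixed-in versions; it relies on the third-party packaging library for version comparison and is not the paper's actual methodology.

# Minimal sketch: flag a detected software version as potentially vulnerable if it
# predates the release that fixes a known vulnerability.
# Requires the packaging library (pip install packaging).
from packaging.version import Version

# Hypothetical data for illustration only: product -> first version with the fix.
FIXED_IN = {
    "examplecms": Version("5.8.2"),
    "examplejs": Version("3.5.1"),
}

def is_potentially_vulnerable(product: str, detected_version: str) -> bool:
    """True if the detected version is older than the first fixed version."""
    fixed = FIXED_IN.get(product)
    if fixed is None:
        return False  # no known vulnerability on record for this product
    return Version(detected_version) < fixed

# A site reporting examplecms 5.7.0 would be flagged as outdated and potentially vulnerable.
print(is_potentially_vulnerable("examplecms", "5.7.0"))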

    A critical evaluation of decadal solar cycle imprints in the MiKlip historical ensemble simulations

    Get PDF
    Studies concerning solar–terrestrial connections over the last decades claim to have found evidence that the quasi-decadal solar cycle can influence the dynamics of the middle atmosphere in the Northern Hemisphere (NH) during the winter season. It has been argued that feedbacks between the intensity of the UV part of the solar spectrum and low-latitude stratospheric ozone may produce anomalies in meridional temperature gradients which have the potential to alter the zonal-mean flow at middle to high latitudes. Interactions between the zonal wind and planetary waves can propagate the anomalies produced in the middle atmosphere down to the troposphere. More recently, it has been proposed that top-down-initiated decadal solar signals might modulate surface climate and synchronize the North Atlantic Oscillation, and a realistic representation of the solar cycle in climate models was suggested to significantly enhance decadal prediction skill. These conclusions have been debated controversially ever since, owing to the lack of realistic decadal prediction model setups and of more extensive analyses. In this paper we aim for an objective and improved evaluation of possible solar imprints from the middle atmosphere down to the surface. To this end, we analyze model output from historical ensemble simulations conducted with the state-of-the-art Max Planck Institute for Meteorology Earth System Model in its high-resolution configuration (MPI-ESM-HR). The target of these simulations was to isolate the most crucial model physics, to foster basic research on decadal climate prediction, and to develop an operational ensemble decadal prediction system within the “Mittelfristige Klimaprognose” (MiKlip) framework. Based on correlations and multiple linear regression analysis, we show that the MPI-ESM-HR simulates a realistic, statistically significant and robust shortwave heating rate and temperature response at the tropical stratopause, in good agreement with existing studies. However, the dynamical response to this initial radiative signal in the NH during the boreal winter season is weak. We find a slight strengthening of the polar vortex in midwinter during solar maximum conditions in the ensemble mean, which is consistent with the so-called “top-down” mechanism. The individual ensemble members, however, show a large spread in the dynamical response, with opposite signs in response to the solar cycle, which might be a result of the large overall internal variability compensating for rather small solar imprints. We also analyze possible surface responses to the 11-year solar cycle and review the proposed synchronization between the solar forcing and the North Atlantic Oscillation. We find that the simulated westerly wind anomalies in the lower troposphere, as well as the anomalies in mean sea level pressure, are most likely independent of the timing of the solar signal in the middle atmosphere and the alleged top-down influences. The pattern rather reflects decadal internal variability in the troposphere, sporadically mimicking positive and negative phases of the Arctic and North Atlantic Oscillations throughout the year; this variability is then assigned to the solar predictor time series without any plausible physical connection or sound solar contribution. Finally, by applying lead–lag correlations, we find that the proposed synchronization between the solar cycle and the decadal component of the North Atlantic Oscillation might rather be a statistical artifact, affected, for example, by internal decadal variability in the ocean, than a plausible physical connection between UV solar forcing and quasi-decadal variations in the troposphere.
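
    As a minimal illustration of the lead–lag correlation analysis mentioned above (not the study's method or data), the following Python sketch correlates two annual time series over a range of lags; the solar-cycle and NAO-like series are synthetic placeholders.

# Illustrative sketch: Pearson correlation between two series with one shifted by
# -max_lag..+max_lag years. A positive lag means that x leads y.
import numpy as np

def lead_lag_correlation(x: np.ndarray, y: np.ndarray, max_lag: int) -> dict[int, float]:
    result = {}
    for lag in range(-max_lag, max_lag + 1):
        if lag >= 0:
            xs, ys = x[: len(x) - lag], y[lag:]
        else:
            xs, ys = x[-lag:], y[: len(y) + lag]
        result[lag] = float(np.corrcoef(xs, ys)[0, 1])
    return result

# Synthetic placeholders: an idealized 11-year cycle and a noisy series lagging it by 2 years.
rng = np.random.default_rng(0)
years = np.arange(1850, 2006)
solar = np.sin(2 * np.pi * years / 11)
nao_like = np.roll(solar, 2) + 0.8 * rng.standard_normal(len(years))
lags = lead_lag_correlation(solar, nao_like, max_lag=5)
print(max(lags, key=lags.get), round(max(lags.values()), 2))  # expect the peak near lag 2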

    Das neue Etatmodell der UB Kassel: Nutzungsbasiertes Portfoliomanagement für E-Journals und Datenbanken

    Get PDF
    In 2019, the university library of Kassel (UB Kassel) introduced a new budget allocation model in order to make efficient use of stagnating funds and to cope with rising costs. Budget allocation for monographs remains subject-specific, depending on current numbers of professors and students as well as on average book prices. E-journals and databases, however, are now paid from a central, subject-independent budget. Through usage-based portfolio management, only those e-resources with a favourable ratio of usage to annual cost are kept in the library's portfolio. Expensive and poorly used products are cancelled in order to compensate for price increases and to enable new subscriptions to journals and databases based on the needs of the faculties. The portfolio management relies on SemperTool's Electronic Resource Management System RMS, in use since 2014, into which all available usage statistics of licensed e-resources are regularly loaded, either automatically via SUSHI or by uploading COUNTER reports. In some cases, the latter must be created manually from proprietary formats provided by the publishers; only a few publishers do not provide any statistics at all, and in these cases the library falls back on access counts from the local HAN server. RMS provides a cost-per-use report covering all products, which can be exported and further processed with spreadsheet software such as Excel; a detailed evaluation is carried out at the beginning of each year. The maximum accepted cost per use is set annually, based on the requirements to keep spending at a reasonable level and to accommodate new subscription requests from the faculties, and all products which exceed the current threshold are cancelled.
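
    The portfolio management described above rests on a straightforward cost-per-use calculation. The Python sketch below (with hypothetical products, prices and threshold, not UB Kassel's actual figures or RMS output) shows how products exceeding a maximum accepted cost per use could be flagged for cancellation.

# Illustrative sketch: compute cost per use from annual cost and COUNTER-style usage
# counts and flag products that exceed a cancellation threshold.
from dataclasses import dataclass

@dataclass
class Product:
    name: str
    annual_cost_eur: float
    annual_uses: int

    @property
    def cost_per_use(self) -> float:
        # Unused products get an infinite cost per use to avoid division by zero.
        return self.annual_cost_eur / self.annual_uses if self.annual_uses else float("inf")

def cancellation_candidates(products: list[Product], max_cost_per_use_eur: float) -> list[Product]:
    """Return all products whose cost per use exceeds the accepted maximum."""
    return [p for p in products if p.cost_per_use > max_cost_per_use_eur]

# Hypothetical portfolio and threshold for illustration only.
portfolio = [
    Product("Journal A", 4200.0, 1500),   # 2.80 EUR per use
    Product("Database B", 9800.0, 350),   # 28.00 EUR per use
    Product("Journal C", 1200.0, 0),      # never used
]
for p in cancellation_candidates(portfolio, max_cost_per_use_eur=10.0):
    print(f"Cancellation candidate: {p.name} ({p.cost_per_use:.2f} EUR per use)")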

    Spin lifetimes exceeding 12 nanoseconds in graphene non-local spin valve devices

    Full text link
    We show spin lifetimes of 12.6 ns and spin diffusion lengths as long as 30.5 µm in single-layer graphene non-local spin transport devices at room temperature. This is accomplished by the fabrication of Co/MgO electrodes on a Si/SiO2 substrate and the subsequent dry transfer of a graphene-hBN stack on top of this electrode structure, where a large hBN flake is needed in order to diminish the ingress of solvents along the hBN-to-substrate interface. Interestingly, long spin lifetimes are observed despite the fact that both conductive scanning force microscopy and contact resistance measurements reveal the existence of conducting pinholes throughout the MgO spin injection/detection barriers. The observed enhancement of the spin lifetime in single-layer graphene by a factor of 6 compared to previous devices exceeds current models of contact-induced spin relaxation, which paves the way towards probing intrinsic spin properties of graphene. Comment: 8 pages, 5 figures

    Quantitative comparison of the magnetic proximity effect in Pt detected by XRMR and XMCD

    Full text link
    X-ray resonant magnetic reflectivity (XRMR) allows for the simultaneous measurement of structural, optical and magnetooptic properties and depth profiles of a variety of thin film samples. However, a same-beamtime, same-sample systematic quantitative comparison of the magnetic properties observed with XRMR and x-ray magnetic circular dichroism (XMCD) is still pending. Here, the XRMR results (Pt L3 absorption edge) for the magnetic proximity effect in Pt deposited on the two different ferromagnetic materials Fe and Co33Fe67 are compared with quantitatively analyzed XMCD results. The obtained results show very good quantitative agreement between the absorption-based (XMCD) and reflectivity-based (XRMR) techniques, taking into account an ab initio calculated magnetooptic conversion factor for the XRMR analysis. Thus, it is shown that XRMR provides quantitatively reliable spin depth profiles, which are important for spintronic and spin caloritronic transport phenomena at this type of magnetic interface. Comment: This article may be downloaded for personal use only. Any other use requires prior permission of the author and AIP Publishing. This article appeared in Appl. Phys. Lett. 118, 012407 (2021) and may be found at https://aip.scitation.org/doi/abs/10.1063/5.003258

    Reproducibility and Replicability of Web Measurement Studies

    Get PDF
    Web measurement studies can shed light on not yet fully understood phenomena and are thus essential for analyzing how the modern Web works. This often requires building new crawling setups and adjusting existing ones, which has led to a wide variety of analysis tools for different (but related) aspects. If these efforts are not sufficiently documented, the reproducibility and replicability of the measurements may suffer; both properties are crucial to sustainable research. In this paper, we survey 117 recent research papers to derive best practices for Web-based measurement studies and specify criteria that need to be met in practice. When applying these criteria to the surveyed papers, we find that the experimental setup and other aspects essential to reproducing and replicating results are often missing. We underline the criticality of this finding by performing a large-scale Web measurement study on 4.5 million pages with 24 different measurement setups to demonstrate the influence of the individual criteria. Our experiments show that slight differences in the experimental setup directly affect the overall results and must be documented accurately and carefully.

    World Congress Integrative Medicine & Health 2017: Part one

    Get PDF
    • …