21 research outputs found

    How to deploy security mechanisms online (consistently)

    Get PDF
    To mitigate a myriad of Web attacks, modern browsers support client-side security policies shipped through HTTP response headers. To enforce these policies, the operator sets response headers that the server then communicates to the client. We have shown that one of these mechanisms, the Content Security Policy (CSP), requires massive engineering effort to be deployed in a way that is not trivially bypassable. Thus, many policies deployed on Web sites are misconfigured. Because CSP can also defend against framing-based attacks, its functionality overlaps with that of the X-Frame-Options header. We have shown that this overlap leads to inconsistent behavior in browsers, but also to inconsistent deployment on real-world Web applications. Overloaded defense mechanisms are not the only ones prone to security inconsistencies: we found that, due to the structure of the Web itself, misconfigured origin servers or geolocation-dependent CDN caches can also cause unwanted security inconsistencies. Given the high number of CSP misconfigurations, we also took a closer look at the deployment process of the mechanism. By conducting a semi-structured interview, including a coding task, we were able to shed light on the motivations, strategies, and roadblocks of CSP deployment. However, due to the wide usage of CSP, drastic changes are generally considered impractical. Therefore, we also evaluated whether one of the newest Web security features, Trusted Types, can be improved.
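As a minimal illustration of how such header-based policies are shipped (the policy values here are examples, not ones from the study), a server can emit both a CSP and the overlapping X-Frame-Options defense, kept mutually consistent:

```python
def security_headers():
    """Build the response headers that ship client-side security policies.

    Example values only: a real policy must be tailored to the site's
    resources, which is exactly the engineering effort the study describes.
    """
    csp = "; ".join([
        "default-src 'self'",      # only same-origin resources by default
        "script-src 'self'",       # no inline or third-party scripts
        "frame-ancestors 'none'",  # CSP's framing control (overlaps X-Frame-Options)
    ])
    return {
        "Content-Security-Policy": csp,
        # Legacy framing defense; kept consistent with frame-ancestors above
        # to avoid the browser-dependent behavior the study observed.
        "X-Frame-Options": "DENY",
    }

headers = security_headers()
print(headers["X-Frame-Options"])  # DENY
```

The point of setting both headers to equivalent values is that older browsers honor only X-Frame-Options, while CSP-aware browsers prefer frame-ancestors; divergent values are one source of the inconsistencies discussed above.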

    Cyber Law and Espionage Law as Communicating Vessels

    Get PDF
    Professor Lubin's contribution is Cyber Law and Espionage Law as Communicating Vessels, pp. 203-225. Existing legal literature would have us assume that espionage operations and “below-the-threshold” cyber operations are doctrinally distinct. Whereas one is subject to the scant, amorphous, and under-developed legal framework of espionage law, the other is subject to an emerging, ever-evolving body of legal rules, known cumulatively as cyber law. This dichotomy, however, is erroneous and misleading. In practice, espionage and cyber law function as communicating vessels, and so are better conceived as two elements of a complex system, Information Warfare (IW). This paper therefore first draws attention to the similarities between the practices – the fact that the actors, technologies, and targets are interchangeable, as are the knee-jerk legal reactions of the international community. In light of the convergence between peacetime Low-Intensity Cyber Operations (LICOs) and peacetime Espionage Operations (EOs) the two should be subjected to a single regulatory framework, one which recognizes the role intelligence plays in our public world order and which adopts a contextual and consequential method of inquiry. The paper proceeds in the following order: Part 2 provides a descriptive account of the unique symbiotic relationship between espionage and cyber law, and further explains the reasons for this dynamic. Part 3 places the discussion surrounding this relationship within the broader discourse on IW, making the claim that the convergence between EOs and LICOs, as described in Part 2, could further be explained by an even larger convergence across all the various elements of the informational environment. Parts 2 and 3 then serve as the backdrop for Part 4, which details the attempt of the drafters of the Tallinn Manual 2.0 to compartmentalize espionage law and cyber law, and the deficits of their approach. 
The paper concludes by proposing an alternative holistic understanding of espionage law, grounded in general principles of law, which is more practically transferable to the cyber realm.

    Protecting Systems From Exploits Using Language-Theoretic Security

    Get PDF
    Any computer program processing input from the user or network must validate the input. Input-handling vulnerabilities occur in programs when the software component responsible for filtering malicious input---the parser---does not perform validation adequately. Consequently, parsers are among the most targeted components, since they defend the rest of the program from malicious input. This thesis adopts the Language-Theoretic Security (LangSec) principle to understand what tools and research are needed to prevent exploits that target parsers. LangSec proposes specifying the syntactic structure of the input format as a formal grammar. We then build a recognizer for this formal grammar to validate any input before the rest of the program acts on it. To ensure that these recognizers faithfully represent the data format, programmers often rely on parser generator or parser combinator tools to build the parsers. This thesis propels several sub-fields in LangSec by proposing new techniques to find bugs in implementations, novel categorizations of vulnerabilities, and new parsing algorithms and tools to handle practical data formats. To this end, this thesis comprises five parts that tackle various tenets of LangSec. First, I categorize various input-handling vulnerabilities and exploits using two frameworks. The first is the mismorphisms framework, which helps us reason about the root causes leading to various vulnerabilities. The second is a categorization framework built from various LangSec anti-patterns, such as parser differentials and insufficient input validation. Finally, I built a catalog of more than 30 popular vulnerabilities to demonstrate both categorization frameworks. Second, I built parsers for various Internet of Things and power grid network protocols and the iccMAX file format using parser combinator libraries. 
The parsers I built for power grid protocols were deployed and tested on power grid substation networks as an intrusion detection tool. The parser I built for the iccMAX file format led to several corrections and modifications to the iccMAX specifications and reference implementations. Third, I present SPARTA, a novel tool I built that generates Rust code to type check Portable Document Format (PDF) files. The type checker I helped build strictly enforces the constraints in the PDF specification to find deviations. Our checker has contributed to at least four significant clarifications and corrections to the PDF 2.0 specification and various open-source PDF tools. In addition to our checker, we also built a practical tool, PDFFixer, to dynamically patch type errors in PDF files. Fourth, I present ParseSmith, a tool to build verified parsers for real-world data formats. Most parsing tools available for data formats are insufficient to handle practical formats or have not been verified for their correctness. I built a verified parsing tool in Dafny that builds on ideas from attribute grammars, data-dependent grammars, and parsing expression grammars to tackle various constructs commonly seen in network formats. I prove that our parsers run in linear time and always terminate for well-formed grammars. Finally, I provide the earliest systematic comparison of various data description languages (DDLs) and their parser generation tools. DDLs are used to describe and parse commonly used data formats, such as image formats. I then conducted an expert elicitation qualitative study to derive various metrics for comparing the DDLs. I also systematically compare these DDLs based on sample data descriptions available with the DDLs, checking for correctness and resilience.
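The LangSec recipe above — specify the input grammar, then fully recognize the input before the program acts on it — can be sketched with a toy parser-combinator recognizer. The digits-only grammar here is purely illustrative and is not one of the thesis's parsers:

```python
# Minimal parser-combinator sketch: a combinator is a function
# text -> (value, rest-of-text), or None on failure.
def char(c):
    def p(s):
        return (c, s[1:]) if s.startswith(c) else None
    return p

def many1(p):
    """Match p one or more times."""
    def q(s):
        r = p(s)
        if r is None:
            return None
        out = []
        while r is not None:
            v, s = r
            out.append(v)
            r = p(s)
        return (out, s)
    return q

def choice(*ps):
    """Try each alternative in order; return the first success."""
    def q(s):
        for p in ps:
            r = p(s)
            if r is not None:
                return r
        return None
    return q

digit = choice(*[char(d) for d in "0123456789"])
number = many1(digit)  # grammar: number ::= digit+

def recognize(s):
    """LangSec-style recognition: accept only if the WHOLE input
    matches the grammar, before anything else processes it."""
    r = number(s)
    return r is not None and r[1] == ""

print(recognize("2024"))  # True
print(recognize("20a4"))  # False
```

Note the full-consumption check (`r[1] == ""`): accepting a valid prefix while silently ignoring a trailing malicious payload is exactly the kind of insufficient input validation the anti-pattern catalog describes.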

    JPEG: the quadruple object

    Get PDF
    The thesis, together with its practice-research works, presents an object-oriented perspective on the JPEG standard. Using the object-oriented philosophy of Graham Harman as a theoretical and also practical starting point, the thesis looks to provide an account of the JPEG digital object and its enfolding within the governmental scopic regime. The thesis looks to move beyond accounts of digital objects and protocols within software studies that position the object in terms of issues of relationality, processuality and potentiality. From an object-oriented point of view, the digital object must be seen as exceeding its relations, as actual, present and holding nothing in reserve. The thesis presents an account of JPEG starting from that position as well as an object-oriented account of JPEG’s position within the distributed, governmental scopic regime via an analysis of Facebook’s Timeline, tagging and Haystack systems. As part of a practice-research project, the author looked to use that perspective within photographic and broader imaging practices as a spur to new work and also as a “laboratory” to explore Harman’s framework. The thesis presents the findings of those “experiments” in the form of a report alongside practice-research eBooks. These works were not designed to be illustrations of the theory, nor works to be “analysed”. Rather, following the lead of Ian Bogost and Mark Amerika, they were designed to be “philosophical works” in the sense of works that “did” philosophy.

    Fast Internet-Wide Scanning: A New Security Perspective

    Full text link
    Techniques like passive observation and random sampling let researchers understand many aspects of the Internet's day-to-day operation, yet these methodologies often focus on popular services or a small demographic of users, rather than providing a comprehensive view of the devices and services that constitute the Internet. As the diversity of devices and the role they play in critical infrastructure increase, so does the importance of understanding the dynamics of, and securing, these hosts. This dissertation shows how fast Internet-wide scanning provides a near-global perspective of edge hosts that enables researchers to uncover security weaknesses that only emerge at scale. First, I show that it is possible to efficiently scan the IPv4 address space. ZMap, a network scanner specifically architected for large-scale research studies, can survey the entire IPv4 address space from a single machine in under an hour, at 97% of the theoretical maximum speed of gigabit Ethernet and with an estimated 98% coverage of publicly available hosts. Building on ZMap, I introduce Censys, a public service that maintains up-to-date and legacy snapshots of the hosts and services running across the public IPv4 address space. Censys enables researchers to efficiently ask a range of security questions. Next, I present four case studies that highlight how Internet-wide scanning can identify new classes of weaknesses that only emerge at scale, uncover unexpected attacks, shed light on previously opaque distributed systems on the Internet, and understand the impact of consequential vulnerabilities. Finally, I explore how increased contention over IPv4 addresses introduces new challenges for performing large-scale empirical studies. I conclude with suggested directions that the research community needs to consider to retain the degree of visibility that Internet-wide scanning currently provides.
    PhD, Computer Science & Engineering, University of Michigan, Horace H. Rackham School of Graduate Studies.
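Scanning the whole address space from one machine requires visiting addresses in a randomized order without keeping per-host state. One standard trick for this is to iterate a multiplicative group modulo a prime; the sketch below demonstrates it on a toy group, and the IPv4-scale prime mentioned in the comment is an illustrative analogue, not a claim about ZMap's exact implementation:

```python
def permuted_order(prime, generator, start=1):
    """Walk the multiplicative group mod `prime`: repeated multiplication
    by `generator` visits every non-zero residue exactly once when
    `generator` is a primitive root, so a scanner can cover the space in
    shuffled order while storing only the current element."""
    x = start
    while True:
        yield x
        x = (x * generator) % prime
        if x == start:  # cycled back: the whole group has been visited
            return

# Toy demo: 3 is a primitive root mod 7, so residues 1..6 each appear once.
# An IPv4-scale scanner would use a prime just above 2**32 (e.g. 2**32 + 15)
# and simply skip the few residues that fall outside the address range.
order = list(permuted_order(7, 3))
print(order)  # [1, 3, 2, 6, 4, 5]
```

Because the next address is computed from the current one, the scanner's state is a single integer, which is what makes single-machine, full-space surveys practical.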

    The Origins of the United Nations Relief and Rehabilitation Administration, 1939-1943

    Get PDF
    This dissertation analyzes the bureaucratic origins and diplomatic processes that led to the creation of the United Nations Relief and Rehabilitation Administration (UNRRA), established in November 1943 to aid destitute populations and battle-scarred countries after the Second World War. Based on archival work in Canada, Europe, and the United States, the author argues that UNRRA was not only a test case for the United Nations organization set up after the war; it also served as a model for the whole system of postwar global governance. Yet this agency was not what it seemed. While Franklin Roosevelt claimed the UN signified the emergence of a new world civilization, his Administration planned to use UNRRA to construct and manage a global order in America's image. UNRRA would provide the U.S. government an instrument with which to advance its ideological agenda and achieve its geo-strategic aims. The UN, in effect, was imagined and conceived in Washington as a tool of informal empire. American officials had little desire to surrender U.S. resources or freedom of action to any international authority. They therefore devised a scheme that, while giving the impression of wide participation, would enable the U.S. to dominate the organization and act unilaterally if necessary. However, wartime exigencies, criticism from countries all over the world, and the presence of Soviet power forced American diplomats to compromise when negotiating the UNRRA agreement. The resulting concessions limited Washington's strategic options vis-à-vis the Soviet Union and various regions of the world, particularly Eastern Europe. This fact certainly pleased Moscow, but a series of subsequent revisions to the agreement hardly appeased the other concerned countries. Yet they accepted it: these countries needed and feared the United States. As a result, UNRRA came into being in late 1943, but the process that made it possible had damaged Washington's clout. 
This research challenges accepted views of the United Nations and America's place in the world. It revises our understanding of Franklin Roosevelt's grand strategy, the Cold War's origins, and the international system in existence today. It also unearths the roots of post-Cold War anti-Americanism.

    Bowdoin Orient v.118, no.1-27 (1988-1989)

    Get PDF