15 research outputs found

    Evasive Internet: Reducing Internet Vulnerability through Transient Destination

    In the current Internet architecture, traffic is commonly routed to its destination using DNS names mapped to IP addresses, yet there are no inherent means for receivers to attribute traffic to its senders or to authorize senders. These deficiencies leave the Internet and its connected hosts vulnerable to a wide range of attacks, including denial-of-service and misrepresentation (spoofing, phishing, etc.), which continue to cause material damage. This work proposes a mechanism to combat these vulnerabilities by introducing attribution and authorization into the network: a transient addressing scheme establishes attribution through DNS, establishes authorization at the host, and enforces both authorization and attribution in the network. I developed and characterized a system for effecting in-network enforcement at the router, and I demonstrate that enforcement is possible on current commodity hardware at sustained throughput rates well above common Internet connection rates. The current Internet architecture allows hosts to send arbitrary IP packets across a network, which may not carry valid source address information; IP spoofing and denial-of-service attacks are ubiquitous, and filtering techniques are not sufficient to counter them. The proposed design calls for in-network authentication of addresses and attribution of the traffic hosts generate; in this architecture, a destination can be reached only through a valid capability. The aim of this dissertation is to implement the Evasive Internet Protocol for end hosts and to measure its preliminary performance against current Internet protocols.
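    The capability check the abstract describes can be illustrated with a minimal sketch. This is not the dissertation's actual protocol; the token format (an HMAC over a hypothetical source, destination, and expiry tuple) and all function names are illustrative assumptions, standing in for whatever wire format the Evasive Internet Protocol actually uses.

    ```python
    import hashlib
    import hmac
    import time

    def issue_capability(secret, src, dst, expiry):
        """Hypothetical capability: a MAC binding (src, dst, expiry),
        issued out of band to an authorized sender."""
        msg = f"{src}|{dst}|{expiry}".encode()
        return hmac.new(secret, msg, hashlib.sha256).digest()

    def enforce(secret, src, dst, expiry, tag, now=None):
        """In-network check at the router: forward only packets that
        carry a valid, unexpired capability; drop everything else.
        This is what gives the attribution/authorization property."""
        now = time.time() if now is None else now
        if now > expiry:
            return False  # capability has expired
        expected = issue_capability(secret, src, dst, expiry)
        return hmac.compare_digest(expected, tag)
    ```

    Because the check is a single MAC verification per packet, it is plausible on commodity router hardware at line rate, which is the kind of enforcement cost the dissertation measures.
    
    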

    Check before storing: what is the performance price of content integrity verification in LRU caching?

    In some network and application scenarios, it is useful to cache content in network nodes on the fly, at line rate. The resilience of in-network caches can be improved by guaranteeing that all content stored therein is valid. Digital signatures could indeed be used to verify content integrity and provenance. However, their operation may be much slower than the line rate, thus limiting caching of cryptographically verified objects to a small subset of the forwarded ones. How does this affect caching performance? To answer this question, we devise a simple analytical approach that permits us to assess the performance of an LRU caching strategy storing a randomly sampled subset of requests. A key feature of our model is its ability to handle traffic beyond the traditional Independent Reference Model, thus permitting us to understand how performance varies under different temporal-locality conditions. Results, also verified on real-world traces, show that content integrity verification does not necessarily bring about a performance penalty; rather, in some specific (but practical) conditions, performance may even improve.
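    The caching policy under study, an LRU cache that stores only a randomly sampled subset of requests, is easy to simulate. The sketch below is a minimal model, not the paper's analysis: the admission probability `admit_prob` stands in for the fraction of objects that can be cryptographically verified at line rate, and the coin flip stands in for the verification step itself.

    ```python
    import random
    from collections import OrderedDict

    class SampledLRUCache:
        """LRU cache that admits a missed object only with probability
        admit_prob, modeling a cache that stores only the (randomly
        sampled) subset of objects it can verify at line rate."""

        def __init__(self, capacity, admit_prob):
            self.capacity = capacity
            self.admit_prob = admit_prob
            self.store = OrderedDict()   # keys in LRU-to-MRU order
            self.hits = 0
            self.requests = 0

        def request(self, key):
            """Process one request; return True on a cache hit."""
            self.requests += 1
            if key in self.store:
                self.store.move_to_end(key)  # refresh recency
                self.hits += 1
                return True
            # Miss: cache only if the object passes sampled verification.
            if random.random() < self.admit_prob:
                self.store[key] = True
                if len(self.store) > self.capacity:
                    self.store.popitem(last=False)  # evict the LRU item
            return False

        def hit_ratio(self):
            return self.hits / self.requests if self.requests else 0.0
    ```

    Sampled admission acts like a probabilistic admission filter: unpopular one-hit objects are less likely to enter the cache and displace popular ones, which is intuitively why verification can even improve the hit ratio under some traffic conditions.
    
    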

    Efficient Software Implementations of Modular Exponentiation

    RSA computations have a significant effect on the workloads of SSL/TLS servers, and therefore their software implementations on general-purpose processors are an important target for optimization. We concentrate here on 512-bit modular exponentiation, used for 1024-bit RSA. We propose optimizations in two directions. At the primitives' level, we study and improve the performance of an "Almost" Montgomery Multiplication. At the exponentiation level, we propose a method to reduce the cost of protecting the w-ary exponentiation algorithm against cache/timing side-channel attacks. Together, these lead to an efficient software implementation of 512-bit modular exponentiation, which outperforms the currently fastest publicly available alternative. When measured on the latest x86-64 architecture, the 2nd Generation Intel® Core™ processor, our implementation is 43% faster than that of the current version of OpenSSL (1.0.0d).
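    For readers unfamiliar with the primitive being optimized, here is a textbook Montgomery multiplication and a square-and-multiply exponentiation built on it. This is a plain illustrative sketch of the standard algorithm, not the paper's "Almost" Montgomery Multiplication and not a constant-time (side-channel-safe) implementation.

    ```python
    def montgomery_setup(n, r_bits):
        """Precompute n' = -n^{-1} mod R for odd modulus n, R = 2^r_bits."""
        R = 1 << r_bits
        n_prime = pow(-n, -1, R)  # modular inverse (Python 3.8+)
        return R, n_prime

    def mont_mul(a, b, n, r_bits, n_prime):
        """Montgomery multiplication: returns a*b*R^{-1} mod n
        using only a shift instead of a division by n."""
        R = 1 << r_bits
        t = a * b
        m = (t * n_prime) % R        # chosen so t + m*n ≡ 0 (mod R)
        u = (t + m * n) >> r_bits    # exact division by R
        return u - n if u >= n else u

    def mont_exp(base, exp, n, r_bits=512):
        """Left-to-right square-and-multiply in the Montgomery domain."""
        R, n_prime = montgomery_setup(n, r_bits)
        base_m = (base * R) % n      # map base into Montgomery form
        acc = R % n                  # Montgomery form of 1
        for bit in bin(exp)[2:]:
            acc = mont_mul(acc, acc, n, r_bits, n_prime)
            if bit == '1':
                acc = mont_mul(acc, base_m, n, r_bits, n_prime)
        return mont_mul(acc, 1, n, r_bits, n_prime)  # map back out
    ```

    The point of the Montgomery form is that each multiplication's reduction step costs one multiply-and-shift rather than a trial division by n; the paper's contribution is squeezing further cycles out of exactly this inner loop for 512-bit operands.
    
    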

    Parallelizing message schedules to accelerate the computations of hash functions

    This paper describes an algorithm for accelerating the computations of Davies-Meyer based hash functions. It is based on parallelizing the computation of several message schedules for several message blocks of a given message. This parallelization, together with the proper use of vector processor instructions (SIMD), improves the overall algorithm's performance. Using this method, we obtain a new software implementation of SHA-256 that performs at 12.11 Cycles/Byte on the 2nd and 10.84 Cycles/Byte on the 3rd Generation Intel® Core™ processors. We also show how to extend the method to the soon-to-come AVX2 architecture, which has wider registers. Since processors with AVX2 will be available only in 2013, exact performance reporting is not yet possible. Instead, we show that our resulting SHA-256 and SHA-512 implementations have a reduced number of instructions. Based on our findings, we make some observations on the SHA3 competition. We argue that if the prospective SHA3 standard is expected to be competitive against the performance of SHA-256 or SHA-512 on high-end platforms, then its performance should be well below 10 Cycles/Byte on current, and certainly on near-future, processors. Not all the SHA3 finalists have this performance. Furthermore, even the fastest finalists will probably offer only a small performance advantage over the current SHA-256 and SHA-512 implementations.
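    The SHA-256 message schedule is the part being parallelized: each block expands its 16 input words into 64 schedule words, and the expansions of different blocks are fully independent. The scalar sketch below shows the standard expansion recurrence and mimics the paper's idea by processing several blocks in lockstep (the inner loop over lanes is what one SIMD instruction would do at once); it is an illustration of the technique, not the paper's vectorized implementation.

    ```python
    MASK = 0xFFFFFFFF  # 32-bit words

    def rotr(x, n):
        return ((x >> n) | (x << (32 - n))) & MASK

    def sigma0(x):  # SHA-256 small sigma_0
        return rotr(x, 7) ^ rotr(x, 18) ^ (x >> 3)

    def sigma1(x):  # SHA-256 small sigma_1
        return rotr(x, 17) ^ rotr(x, 19) ^ (x >> 10)

    def expand_schedules(blocks):
        """Expand the 64-word message schedule for several 16-word
        blocks in lockstep: iteration t computes word W[t] for every
        block before moving on, mirroring k SIMD lanes."""
        W = [list(b) for b in blocks]      # one schedule per block
        for t in range(16, 64):
            for w in W:                    # one SIMD op across lanes
                w.append((sigma1(w[t - 2]) + w[t - 7]
                          + sigma0(w[t - 15]) + w[t - 16]) & MASK)
        return W
    ```

    Because the recurrence for block i never reads another block's words, k schedules fit naturally into k SIMD lanes; the compression function itself remains serial per block, which is why the speedup comes specifically from the schedule computation.
    
    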

    Security in migratory interactive web applications


    Exploring how males who encounter phenomena they identify as 'Conversion Disorder'/'Functional Neurological Disorder' experience agency in their lives

    This research investigates the way that males who identify with the diagnostic label 'conversion disorder/functional neurological disorder (CD/FND)' experience agency in their lives. The historical developments, controversies and complexities around 'CD/FND' form the backdrop of this exploration into the lived experience of agency. A sample of eight participants was recruited via social networking sites and charities, and the data were collected through Skype-based interviews and analysed using the qualitative Interpretative Phenomenological Analysis (IPA) approach. The analysis yielded the following five main themes: 'paradox of control', 'living within a dualistic framework', 'disconnection from self and others', 'engaged in a battle or fight' and 'meaning and reality as dependent on other people'. These master themes and their related subordinate themes are presented in light of existing research. The findings highlight the difficulty experienced by participants who identify with a diagnostic label that is at odds with a medicalised approach to understanding and treating illness. The limitations of this study and potential avenues for future research are also discussed.

    20th SC@RUG 2023 proceedings 2022-2023
