56 research outputs found

    Reinterpreting the transport protocol stack to embrace ossification

    Ubiquitous deployment of middleboxes has resulted in ossification of the transport layer, with TCP and UDP becoming part of the narrow waist of the Internet. This is a necessary stage in the evolution of the network, caused by its progression from research, to production, to increasingly critical infrastructure. New transport layer protocols will be needed in future, but since we are working with essential infrastructure, we cannot expect to have scope to make wholesale rapid changes. Future development must be done using the existing protocols as substrates, always maintaining on-the-wire compatibility. To advance, we must embrace the ossification of the network, and learn to reinterpret and extend the existing protocols.

    Consolidating Streams to Improve DASH Cache Utilisation

    Existing HTTP caches interact poorly with multiple Dynamic Adaptive Streaming over HTTP (DASH) streams of the same content: time and quality differences prevent a complete representation from being cached, reducing hit-ratios. We propose to consolidate near-simultaneous streams based on time or quality, where the improved cache performance makes this worthwhile. We estimate that there is a sufficient number of near-simultaneous streams for our proposed techniques to improve cache hit-ratios.
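
    The consolidation idea can be illustrated with a small, hypothetical sketch (not the paper's implementation): a cache keyed by content and segment index that serves an already-cached representation within one quality level of the one requested, turning near-misses from near-simultaneous streams into hits. The key layout and quality tolerance below are assumptions.

        # Hypothetical sketch of quality-based consolidation at a DASH cache.
        # The segment keys and one-level quality tolerance are illustrative assumptions.
        from collections import OrderedDict

        class ConsolidatingCache:
            def __init__(self, max_segments=1000, quality_tolerance=1):
                self.store = OrderedDict()      # (content, segment) -> {quality: bytes}
                self.max_segments = max_segments
                self.quality_tolerance = quality_tolerance

            def get(self, content, segment, quality):
                """Serve the requested quality, or a cached one within tolerance."""
                variants = self.store.get((content, segment), {})
                if quality in variants:
                    return variants[quality]                  # exact hit
                for cached_q in sorted(variants, key=lambda q: abs(q - quality)):
                    if abs(cached_q - quality) <= self.quality_tolerance:
                        return variants[cached_q]             # consolidated hit
                return None                                   # miss: fetch upstream

            def put(self, content, segment, quality, data):
                self.store.setdefault((content, segment), {})[quality] = data
                while len(self.store) > self.max_segments:
                    self.store.popitem(last=False)            # evict oldest segment

    Serving a nearby representation keeps near-simultaneous streams on a single cached copy, which is the kind of hit-ratio improvement the abstract estimates.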

    Implementing Real-Time Transport Services over an Ossified Network

    Real-time applications require a set of transport services not currently provided by widely-deployed transport protocols. Ossification prevents the deployment of novel protocols, restricting solutions to protocols using either TCP or UDP as a substrate. We describe the transport services required by real-time applications. We show that, in the short-term (i.e., while UDP is blocked at current levels), TCP offers a feasible substrate for providing these services. Over the longer term, protocols using UDP may reduce the number of networks blocking UDP, enabling a shift towards its use as a demultiplexing layer for novel transport protocols.
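
    The short-term strategy described here, using UDP where it works and falling back to TCP as a substrate where it is blocked, might be sketched as follows. The echo-style probe, timeout, and single shared port are assumptions for illustration, not part of the paper.

        # Illustrative sketch (not the paper's code): probe UDP and fall back to a
        # TCP substrate when UDP appears blocked on the current network path.
        import socket

        def open_transport(host, port, probe=b"probe", timeout=0.5):
            """Return ("udp", sock) if a UDP probe is answered, else ("tcp", sock)."""
            udp = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
            udp.settimeout(timeout)
            try:
                udp.sendto(probe, (host, port))
                udp.recvfrom(2048)            # any reply means the UDP path works
                return "udp", udp
            except (socket.timeout, OSError):
                udp.close()                   # UDP blocked or unanswered: use TCP substrate
                tcp = socket.create_connection((host, port), timeout=timeout)
                tcp.setsockopt(socket.IPPROTO_TCP, socket.TCP_NODELAY, 1)  # avoid Nagle delay
                return "tcp", tcp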

    Deployable transport services for low-latency multimedia applications

    Low-latency multimedia applications generate a growing and significant majority of all Internet traffic. These applications are characterised by tight bounds on end-to-end latency that typically range from tens to a few hundred milliseconds. Operating within these bounds is challenging, with the best-effort delivery service of the Internet giving rise to unreliable delivery with unpredictable latency. The way in which the upper layers of the protocol stack manage this unreliability and unpredictability can greatly impact the quality-of-experience that applications can provide. In this thesis, I focus on the services and abstractions that the transport layer provides to applications. The delivery model provided by the transport layer can have a significant impact on the quality-of-experience that can be provided by the application. Reliability and order, for example, introduce delay while packet loss is detected and the lost data retransmitted. This enforces a particular trade-off between latency, loss, and application quality-of-experience, with reliability taking priority. This trade-off is not suitable for low-latency multimedia applications, which prefer predictable and bounded latency to strict reliability and order. No widely-deployed transport protocol provides a delivery model that fully supports low-latency applications: UDP provides no reliability guarantees, while TCP enforces reliability. Implementing a protocol that does support these applications is difficult: ossification restricts protocols to appearing as UDP or TCP on-the-wire. To meet both challenges -- of better supporting low-latency multimedia applications, and of deploying a new protocol within an ossified transport layer -- I propose TCP Hollywood, a protocol that maintains wire compatibility with TCP, while exposing the trade-off between reliability and delay such that applications can improve their quality-of-experience. I show that TCP Hollywood is deployable on the public Internet, and that it achieves its goal of improving support for low-latency multimedia applications. I conclude by evaluating the API changes that are required to support TCP Hollywood, distilling the protocol into the set of transport services that it provides.
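
    A simplified, back-of-the-envelope version of the trade-off described above: a lost packet is only worth retransmitting if the retransmitted copy can still arrive before its playout deadline. The formula below is an illustrative approximation, not the analytical model developed in the thesis.

        # Rough feasibility check: can a retransmission beat the playout deadline?
        # The timing model is an assumption for illustration only.
        def retransmission_feasible(one_way_delay_ms, rto_ms, playout_deadline_ms):
            """The first copy takes one one-way delay; a loss adds roughly one
            retransmission timeout plus another one-way delay before the
            retransmitted copy arrives."""
            arrival_after_loss = one_way_delay_ms + rto_ms + one_way_delay_ms
            return arrival_after_loss <= playout_deadline_ms

        # e.g. 40 ms one-way delay, 200 ms RTO, 150 ms interactive deadline:
        print(retransmission_feasible(40, 200, 150))   # False: the retransmission is too late
        print(retransmission_feasible(40, 200, 400))   # True: a relaxed deadline can absorb it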

    TCP Goes to Hollywood

    Real-time multimedia applications use either TCP or UDP at the transport layer, yet neither of these protocols offers all of the features required. Deploying a new protocol that does offer these features is made difficult by ossification: firewalls and other middleboxes in the network expect TCP or UDP, and block other types of traffic. We present TCP Hollywood, a protocol that is wire-compatible with TCP, while offering an unordered, partially reliable message-oriented transport service that is well suited to multimedia applications. Analytical results show that TCP Hollywood extends the feasibility of using TCP for real-time multimedia applications, by reducing latency and increasing utility. Preliminary evaluations also show that TCP Hollywood is deployable on the public Internet, with safe failure modes. Measurements across all major UK fixed-line and cellular networks validate the possibility of deployment.
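
    The message-oriented, partially reliable service can be approximated in user space with a sketch like the one below, which frames messages over a TCP bytestream and skips any whose lifetime expires before they are written. The header layout, lifetime parameter, and example address are assumptions; the real protocol modifies the TCP sender itself, which a sketch over an unmodified socket cannot reproduce.

        # Application-level sketch of a message-oriented sender over plain TCP,
        # dropping messages whose deadlines pass before they reach the socket.
        import socket, struct, time

        HEADER = struct.Struct("!IQ")    # message length, sender sequence number

        class MessageSender:
            def __init__(self, sock):
                self.sock = sock
                self.seq = 0
                self.queue = []          # list of (deadline_unix_seconds, payload bytes)

            def send(self, payload, lifetime_s):
                self.queue.append((time.time() + lifetime_s, payload))

            def flush(self):
                """Write queued messages, skipping any that have already expired."""
                now = time.time()
                for deadline, payload in self.queue:
                    if deadline < now:
                        continue                      # expired: sending it adds no value
                    self.seq += 1
                    self.sock.sendall(HEADER.pack(len(payload), self.seq) + payload)
                self.queue.clear()

        # Hypothetical usage: sender = MessageSender(socket.create_connection(("example.net", 5000)))

    A matching receiver would parse the length-prefixed headers to deliver whole messages out of the bytestream as they arrive, rather than waiting for in-order delivery.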

    Temporal Network Analysis of Email Communication Patterns in a Long Standing Hierarchy

    An important concept in organisational behaviour is how hierarchy affects the voice of individuals, whereby members of a given organisation exhibit differing power relations based on their hierarchical position. Although there have been prior studies of the relationship between hierarchy and voice, they tend to rely on qualitative, small-scale methods and do not account for structural aspects of the organisation. This paper develops large-scale computational techniques utilising temporal network analysis to measure the effect that organisational hierarchy has on communication patterns within an organisation, focusing on the structure of pairwise interactions between individuals. We focus on one organisation as a case study: the Internet Engineering Task Force (IETF), a major technical standards development organisation for the Internet. A particularly useful feature of the IETF is its transparent hierarchy, where participants take on explicit roles (e.g. Area Directors, Working Group Chairs). Its processes are also open, so we have visibility into the communication of people at different hierarchy levels over a long time period. We utilise a temporal network dataset of 989,911 email interactions among 23,741 participants to study how hierarchy impacts communication patterns. We show that the middle levels of the IETF are growing in terms of their dominance in communications. Higher levels consistently experience a higher proportion of incoming communication than lower levels, and also initiate more communications. We find that communication tends to flow "up" the hierarchy more than "down". Finally, we find that communication with higher levels is associated with future communication more than communication with lower levels, which we interpret as "facilitation". We conclude by discussing the implications this has for patterns within the wider IETF and for other organisations.
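
    One of the measurements described here, the balance of communication flowing "up" versus "down" the hierarchy, can be illustrated with a small sketch. The tuple layout and numeric hierarchy levels are assumptions for the example, not the paper's data model.

        # Illustrative computation (not the paper's pipeline) of how much email
        # flows "up" versus "down" a hierarchy. Higher numbers mean more senior
        # roles; the field layout is an assumption.
        from collections import Counter

        def flow_directions(interactions):
            """interactions: iterable of (sender_level, receiver_level, timestamp)."""
            counts = Counter()
            for sender_level, receiver_level, _ts in interactions:
                if receiver_level > sender_level:
                    counts["up"] += 1
                elif receiver_level < sender_level:
                    counts["down"] += 1
                else:
                    counts["lateral"] += 1
            total = sum(counts.values()) or 1
            return {direction: n / total for direction, n in counts.items()}

        sample = [(1, 3, 0), (3, 1, 1), (2, 2, 2), (1, 2, 3)]
        print(flow_directions(sample))   # {'up': 0.5, 'down': 0.25, 'lateral': 0.25}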

    Power and vulnerability: managing sensitive language in organizational communication

    Organizational responsibilities can give people power but also expose them to scrutiny. This tension leads to divergent predictions about the use of potentially sensitive language: power might license it, while exposure might inhibit it. Analysis of people's language use in a large corpus of organizational emails using standardized Linguistic Inquiry and Word Count (LIWC) measures shows a systematic difference in the use of words with potentially sensitive (ethnic, religious, or political) connotations. People in positions of relative power are ~3 times less likely to use sensitive words than people more junior to them. The tendency to avoid potentially sensitive language appears to be independent of whether other people are using sensitive language in the same email exchanges, and also independent of whether these words are used in a sensitive context. These results challenge a stereotype about language use and the exercise of power. They suggest that, in at least some circumstances, the exposure and accountability associated with organizational responsibilities are a more significant influence on how people communicate than social power.
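
    The reported difference amounts to a rate comparison between groups, which might be sketched as below. The placeholder word set stands in for the standardized LIWC categories, and the split into "junior" and "senior" authors is an assumption for illustration, not the study's coding.

        # Hedged sketch of a sensitive-word rate comparison between two groups.
        SENSITIVE = {"ethnicity", "religion", "political"}    # placeholder category list

        def sensitive_rate(emails):
            """emails: list of token lists; returns sensitive words per 1,000 words."""
            words = sum(len(tokens) for tokens in emails)
            hits = sum(1 for tokens in emails for t in tokens if t.lower() in SENSITIVE)
            return 1000.0 * hits / words if words else 0.0

        def rate_ratio(junior_emails, senior_emails):
            """How many times more often juniors use sensitive words than seniors."""
            senior = sensitive_rate(senior_emails)
            return sensitive_rate(junior_emails) / senior if senior else float("inf")

        # A ratio near 3 on this scale would correspond to the ~3x difference reported above.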

    Traditional vs non-traditional assessment activities as learning indicators of student learning: teachers' perceptions

    In online settings, some teachers express reservations about relying only on traditional assessments (e.g., tests, assignments, or exams) as trustworthy instruments to evaluate students' understanding of the content accurately. A previous qualitative study revealed that the richness of online environments has allowed teachers to use traditional assessments (anything contributing to the final grade) and non-traditional assessment-based activities (not factored into the final grade but useful in gauging student knowledge) to assess their students' learning status. This study aims to compare the perceived accuracy of both types of assessment activities as indicators of student learning. A total of 124 participants engaged in online teaching completed a self-report instrument. The results revealed a significant difference in teachers' perceptions of the accuracy of traditional assessment activities (M = 3.16, SD = .442) compared to non-traditional assessment activities (M = 3.05, SD = .521), t(122) = -2.64, p = .009, with a small effect size (eta = .02). No significant gender differences were observed in perceptions of the accuracy of either type of assessment activity. The most commonly employed traditional assessment activities were “final exams” (85.5%) and “individual assignments” (83.9%). In comparison, the most common non-traditional assessment methods used to evaluate students' knowledge were “questions on previously taught content” (79.8%) and “asking students questions about current content during the lecture” (79%). A one-way analysis of variance revealed no significant differences in perceptions of the accuracy of traditional and non-traditional assessment activities among teachers with varying years of experience (up to 10 years, 11–15 years, and 16+ years). The findings suggest that certain non-traditional assessment activities can be as accurate indicators of student learning as traditional assessment activities. Moreover, non-assessment-related activities are perceived to be effective learning indicators. This study has implications for academic institutions and educators interested in supplementing traditional approaches to assessing student learning with non-traditional methods.
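
    The reported comparison is a paired t-test with a small effect size; a minimal sketch of that style of analysis appears below, with simulated ratings standing in for the study's data and assuming the abstract's "eta" value refers to an eta-squared effect size.

        # Minimal sketch of a paired t-test plus eta-squared, with made-up ratings.
        import numpy as np
        from scipy import stats

        def compare_ratings(traditional, non_traditional):
            res = stats.ttest_rel(non_traditional, traditional)       # paired comparison
            df = len(traditional) - 1
            eta_squared = res.statistic**2 / (res.statistic**2 + df)  # effect size from t
            return res.statistic, res.pvalue, eta_squared

        rng = np.random.default_rng(0)
        trad = rng.normal(3.16, 0.44, 124)       # means and SDs taken from the abstract
        non_trad = rng.normal(3.05, 0.52, 124)   # the simulated ratings themselves are invented
        print(compare_ratings(trad, non_trad))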