
    Phrase based browsing for simulation traces of network protocols

    Most discrete event simulation frameworks are able to output simulation runs as a trace. The Network Simulator 2 (NS2) is a prominent example that does so to decouple the generation of dynamic behavior from its evaluation. If a modeler is interested in specific details and confronted with lengthy traces from simulation runs, support is needed to identify relevant pieces of information. In this paper, we present a new phrase-based browser that has its roots in information retrieval, language acquisition, and text compression, refined to work with trace data derived from simulation models. The browser is a new navigation feature of Traviando, a trace visualizer and analyzer for simulation traces. The browsing technique allows a modeler to investigate particular patterns seen in a trace that may be of interest due to their frequent or rare occurrence. We demonstrate how this approach applies to traces generated with NS2.
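    The core idea of phrase-based browsing, ranking recurring event sequences by frequency, can be illustrated with a small sketch. This is not Traviando's actual algorithm; the n-gram counting approach, the `phrase_frequencies` helper, and the toy trace of protocol events are all illustrative assumptions.

```python
from collections import Counter

def phrase_frequencies(events, n):
    """Count frequencies of length-n event phrases (n-grams) in a trace.

    Frequent phrases suggest regular protocol behavior; rare ones may
    flag anomalies worth navigating to in a trace browser.
    """
    return Counter(tuple(events[i:i + n]) for i in range(len(events) - n + 1))

# Hypothetical event trace from a simulation run
trace = ["send", "ack", "send", "ack", "send", "timeout", "send", "ack"]
freq = phrase_frequencies(trace, 2)
common = freq.most_common(1)[0]  # (('send', 'ack'), 3)
```

    A real implementation would work over structured trace records rather than bare strings, but the frequency ranking is the same.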

    Traffic measurement and analysis

    Measurement and analysis of real traffic is important for gaining knowledge about the characteristics of the traffic. Without measurement, it is impossible to build realistic traffic models. Only recently was data traffic found to have self-similar properties. In this thesis work, traffic captured on the network at SICS and on the Supernet is shown to have this fractal-like behaviour. The traffic is also examined with respect to which protocols and packet sizes are present and in what proportions. In the SICS trace most packets are small, TCP is shown to be the predominant transport protocol, and NNTP the most common application. In contrast, large UDP packets sent between ports outside the well-known range dominate the Supernet traffic. Finally, characteristics of the client side of the WWW traffic are examined more closely. In order to extract useful information from the packet trace, web browsers' use of TCP and HTTP is investigated, including new features in HTTP/1.1 such as persistent connections and pipelining. Empirical probability distributions are derived describing session lengths, time between user clicks, and the amount of data transferred due to a single user click. These probability distributions make up a simple model of WWW sessions.
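    A standard way to test a trace for the self-similarity mentioned above is the aggregated-variance method: for a self-similar process, the variance of the m-aggregated series scales as m^(2H-2), so the slope of log-variance versus log-m yields the Hurst exponent H. The sketch below is a generic textbook estimator under that assumption, not the thesis's own analysis code; the function name and block sizes are illustrative.

```python
import math
import random

def hurst_aggregated_variance(series, block_sizes):
    """Estimate the Hurst exponent H via the aggregated-variance method.

    For a self-similar process, Var(X^(m)) ~ m^(2H - 2), so a least-squares
    fit of log Var(X^(m)) against log m gives H = 1 + slope / 2.
    """
    points = []
    for m in block_sizes:
        k = len(series) // m
        blocks = [sum(series[i * m:(i + 1) * m]) / m for i in range(k)]
        mean = sum(blocks) / k
        var = sum((b - mean) ** 2 for b in blocks) / k
        points.append((math.log(m), math.log(var)))
    # ordinary least-squares slope over the (log m, log var) points
    n = len(points)
    sx = sum(x for x, _ in points)
    sy = sum(y for _, y in points)
    sxx = sum(x * x for x, _ in points)
    sxy = sum(x * y for x, y in points)
    slope = (n * sxy - sx * sy) / (n * sxx - sx * sx)
    return 1 + slope / 2

# Independent samples are not self-similar, so H should come out near 0.5;
# self-similar packet-count series typically yield noticeably higher H.
random.seed(0)
iid = [random.random() for _ in range(8192)]
h = hurst_aggregated_variance(iid, [1, 2, 4, 8, 16, 32, 64])
```

    Applied to per-interval packet or byte counts from a capture, an estimate well above 0.5 is the fractal-like signature the thesis reports.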

    Network communication privacy: traffic masking against traffic analysis

    An increasing number of recent experimental works have demonstrated that supposedly secure channels in the Internet are prone to privacy breaches in many respects, due to traffic features leaking information on user activity and traffic content. For example, traffic flow classification at the application level, web page identification, and language/phrase detection in VoIP communications have all been successfully demonstrated against encrypted channels. In this thesis I aim at understanding if and how complex it is to obfuscate the information leaked by traffic features, namely packet lengths, directions, and times. I define a security model that points out what the ideal target of masking is, and then define optimized and practically implementable masking algorithms, yielding a trade-off between privacy and the overhead/complexity of the masking algorithm. Numerical results are based on measured Internet traffic traces. Major findings are that: i) optimized full masking achieves similar overhead values with padding only and when fragmentation is allowed; ii) if practical realizability is accounted for, optimized statistical masking algorithms attain only moderately better overhead than simple fixed-pattern masking algorithms, while still leaking correlation information that can be exploited by the adversary.
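    The privacy/overhead trade-off behind fixed-pattern masking can be made concrete with a minimal sketch: pad each packet length up to a fixed bucket boundary so the adversary sees only coarse length classes, then measure the byte overhead. The bucket size, packet lengths, and helper names here are illustrative assumptions, not the thesis's optimized algorithms.

```python
def mask_lengths(lengths, bucket):
    """Fixed-pattern masking sketch: pad every packet length up to the
    next multiple of `bucket` bytes, hiding fine-grained length info."""
    return [((l + bucket - 1) // bucket) * bucket for l in lengths]

def padding_overhead(lengths, masked):
    """Fraction of extra bytes introduced by the masking."""
    return (sum(masked) - sum(lengths)) / sum(lengths)

pkts = [40, 120, 576, 1400, 52]
masked = mask_lengths(pkts, 512)   # [512, 512, 1024, 1536, 512]
oh = padding_overhead(pkts, masked)
```

    Larger buckets leak less length information but inflate the overhead, which is exactly the trade-off the optimized algorithms in the thesis navigate.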

    From Social Data Mining to Forecasting Socio-Economic Crisis

    Socio-economic data mining has a great potential in terms of gaining a better understanding of problems that our economy and society are facing, such as financial instability, shortages of resources, or conflicts. Without large-scale data mining, progress in these areas seems hard or impossible. Therefore, a suitable, distributed data mining infrastructure and research centers should be built in Europe. It also appears appropriate to build a network of Crisis Observatories. They can be imagined as laboratories devoted to the gathering and processing of enormous volumes of data on both natural systems such as the Earth and its ecosystem, as well as on human techno-socio-economic systems, so as to gain early warnings of impending events. Reality mining provides the chance to adapt more quickly and more accurately to changing situations. Further opportunities arise from individually customized services, which however should be provided in a privacy-respecting way. This requires the development of novel ICT (such as a self-organizing Web), but most likely new legal regulations and suitable institutions as well. As long as such regulations are lacking on a world-wide scale, it is in the public interest that scientists explore what can be done with the huge data available. Big data do have the potential to change or even threaten democratic societies. The same applies to sudden and large-scale failures of ICT systems. Therefore, dealing with data must be done with a large degree of responsibility and care. Self-interests of individuals, companies or institutions have limits where the public interest is affected, and public interest is not a sufficient justification to violate human rights of individuals. Privacy is a high good, as confidentiality is, and damaging it would have serious side effects for society.
    Comment: 65 pages, 1 figure, Visioneer White Paper, see http://www.visioneer.ethz.c

    Intuitive interaction: Steps towards an integral understanding of the user experience in interaction design

    A critical review of traditional practices and methodologies demonstrates an underplaying of, firstly, the role of emotions and, secondly, aspects of exploration in interaction behaviour, in favour of a goal-oriented focus in the user experience (UX). Consequently, the UX is treated as a commodity that can be designed, measured, and predicted. An integral understanding of the UX attempts to overcome the rationalistic and instrumental mindset of traditional Human-Computer Interaction (HCI) on several levels. Firstly, the thesis seeks to complement a functional view of interaction with a qualitative one that considers the complexity of emotions. Emotions are at the heart of engagement and connect action irreversibly to the moment it occurs; they are intertwined with cognition and decision making. Furthermore, they introduce the vague and ambiguous aspects of experience and open it up to the potentiality of creation. Secondly, the thesis examines the relationship between purposive and non-purposive user behaviour such as exploration, play and discovery. The integral position proposed here stresses the procedurally relational nature and complexity of interaction experience. This requires revisiting and augmenting key themes of HCI practice such as interactivity and intuitive design. Intuition is investigated as an early and unconscious form of learning, and unstructured browsing and random interaction mechanisms are discussed as forms of implicit learning. Interactivity here is the space for users' actions, contributions and creativity, not only in the design process but also during interaction as co-authors of their experiences. Finally, I envisage integral forms of usability methods that embrace the vague and the ambiguous, in order to enrich HCI's vocabulary and design potential. Key readings that inform this position cut across contemporary philosophy, media and interaction studies, and professional HCI literature.
    On a practical level, a series of experimental interaction designs for web browsing aim to augment the user's experience and create space for the user's intuition.

    Systems for Challenged Network Environments.

    Developing regions face significant challenges in network access, making even simple network tasks unpleasant and rich media prohibitively difficult to access. Even as cellular network coverage is approaching a near-universal reach, good network connectivity remains scarce and expensive in many emerging markets. The underlying theme in this dissertation is designing network systems that better accommodate users in emerging markets. To do so, this dissertation begins with a nuanced analysis of content access behavior for web users in developing regions. This analysis finds the personalization of content access---and the fragmentation that results from it---to be significant factors in undermining many existing web acceleration mechanisms. The dissertation explores content access behavior from logs collected at shared internet access sites, as well as user activity information obtained from a commercial social networking service with over a hundred million members worldwide. Based on these observations, the dissertation then discusses two systems designed for improving end-user experience in accessing and using content in constrained networks. First, it deals with the challenge of distributing private content in these networks. By leveraging the wide availability of cellular telephones, the dissertation describes a system for personal content distribution based on user access behavior. The system enables users to request future data accesses, and it schedules content transfers according to current and expected capacity. Second, the dissertation looks at routing bulk data in challenged networks, and describes an experimentation platform for building systems for challenged networks. This platform enables researchers to quickly prototype systems for challenged networks, and iteratively evaluate these systems using mobility and network emulation. The dissertation describes a few data routing systems that were built atop this experimentation platform. 
    Finally, the dissertation discusses the marketplace and service discovery considerations that are important in making these systems viable for developing-region use. In particular, it presents an extensible, auction-based market platform that relies on widely available communication tools for conveniently discovering and trading digital services and goods in developing regions. Collectively, this dissertation brings together several projects that aim to understand and improve end-user experience in challenged networks endemic to developing regions.
    Ph.D., Computer Science & Engineering, University of Michigan, Horace H. Rackham School of Graduate Studies
    http://deepblue.lib.umich.edu/bitstream/2027.42/91401/1/azarias_1.pd
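    The idea of scheduling content transfers "according to current and expected capacity" can be sketched as a simple greedy planner: given pre-requested items with deadlines and a forecast of per-slot capacity, serve the earliest deadlines first. This is a generic illustration under stated assumptions, not the dissertation's actual system; the request tuples, capacity forecast, and function name are all hypothetical.

```python
def schedule_transfers(requests, capacity):
    """Greedy sketch: place pre-requested items (size_bytes, deadline_slot)
    into time slots with forecast capacity, earliest deadline first.

    Returns a list of ((size, deadline), slot) pairs; slot is None when the
    item cannot be fit before its deadline.
    """
    remaining = list(capacity)  # bytes still available in each future slot
    plan = []
    for size, deadline in sorted(requests, key=lambda r: r[1]):
        slot = next((s for s in range(min(deadline + 1, len(remaining)))
                     if remaining[s] >= size), None)
        if slot is not None:
            remaining[slot] -= size
        plan.append(((size, deadline), slot))
    return plan

# Hypothetical queue: three items, capacity forecast for three slots
plan = schedule_transfers([(100, 0), (300, 2), (200, 1)], [250, 250, 400])
```

    A production scheduler would also reorder within deadlines and revise the plan as measured capacity diverges from the forecast, but the deadline-driven placement is the core of demand-shifted content delivery.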

    SMARTPHONE-BASED DECENTRALIZED PUBLIC-TRANSPORT APPLICATIONS

    Ph.D., Doctor of Philosophy

    Variable Format: Media Poetics and the Little Database

    This dissertation explores the situation of twentieth-century art and literature becoming digital. Focusing on relatively small online collections, I argue for materially invested readings of works of print, sound, and cinema from within a new media context. With bibliographic attention to the avant-garde legacy of media specificity and the little magazine, I argue that the “films,” “readings,” “magazines,” and “books” indexed on a series of influential websites are marked by meaningful transformations that continue to shape the present through a dramatic reconfiguration of the past. I maintain that the significance of an online version of a work is not only transformed in each instance of use, but that these versions fundamentally change our understanding of each historical work in turn. Here, I offer the analogical coding of these platforms as “little databases” after the little magazines that served as the vehicle of modernism and the historical avant-garde. Like the study of the full run of a magazine, these databases require a bridge between close and distant reading. Rather than contradicting each other, as is often argued, in this instance a combined macro- and microscopic mode of analysis yields valuable information not readily available by either method in isolation. In both directions, the social networks and technical protocols of database culture inscribe the limits of potential readings. Bridging the material orientation of bibliographic study with the format theory of recent media scholarship, this work constructs a media poetics for reading analog works situated within the windows, consoles, and networks of the twenty-first century.

    From social data mining to forecasting socio-economic crises

    Abstract: The purpose of this White Paper of the EU Support Action "Visioneer" (see www.visioneer.ethz.ch) is to address the following goals: 1. Develop strategies to quickly increase the objective knowledge about social and economic systems. 2. Describe requirements for efficient large-scale scientific data mining of anonymized social and economic data. 3. Formulate strategies for collecting stylized facts extracted from large data sets. 4. Sketch ways to successfully build up centers for computational social science. 5. Propose plans to create centers for risk analysis and crisis forecasting. 6. Elaborate ethical standards regarding the storage, processing, evaluation, and publication of social and economic data.