
    Supporting Cyber-Physical Systems with Wireless Sensor Networks: An Outlook of Software and Services

    Sensing, communication, computation and control technologies are the essential building blocks of a cyber-physical system (CPS). Wireless sensor networks (WSNs) are a way to support CPS as they provide fine-grained spatio-temporal sensing, communication and computation at low cost and power. In this article, we explore the fundamental concepts guiding the design and implementation of WSNs. We report the latest developments in WSN software and services for meeting existing requirements and newer demands, particularly in the areas of operating systems, simulators and emulators, programming abstractions, virtualization, IP-based communication and security, time and location, and network monitoring and management. We also reflect on the ongoing efforts in providing dependable assurances for WSN-driven CPS. Finally, we illustrate their applicability with a case study on smart buildings.

    Measurement with Persons: A European Network

    The European ‘Measuring the Impossible’ Network MINET promotes new research activities in measurement dependent on human perception and/or interpretation. This includes the perceived attributes of products and services, such as quality or desirability, and societal parameters such as security and well-being. Work has aimed at consensus about four ‘generic’ metrological issues: (1) Measurement Concepts & Terminology; (2) Measurement Techniques; (3) Measurement Uncertainty; and (4) Decision-making & Impact Assessment, and how these can be applied specifically to the ‘Measurement of Persons’ in terms of ‘Man as a Measurement Instrument’ and ‘Measuring Man.’ Some of the main achievements of MINET include a research repository with glossary; training course; book; series of workshops; think tanks and study visits, which have brought together a unique constellation of researchers from physics, metrology, physiology, psychophysics, psychology and sociology. Metrology (quality-assured measurement) in this area is relatively underdeveloped, despite great potential for innovation, and extends beyond traditional physiological metrology in that it also deals with measurement with all human senses as well as mental and behavioral processes. This is particularly relevant in applications where humans are an important component of critical systems, where for instance health and safety are at stake.

    Wireless body sensor networks for health-monitoring applications

    This is an author-created, un-copyedited version of an article accepted for publication in Physiological Measurement. The publisher is not responsible for any errors or omissions in this version of the manuscript or any version derived from it. The Version of Record is available online at http://dx.doi.org/10.1088/0967-3334/29/11/R01

    Evidences Behind Skype Outage

    Skype is one of the most successful VoIP applications in the current Internet landscape. One of its most peculiar characteristics is that it relies on a P2P infrastructure for the exchange of signaling information amongst active peers. During August 2007, an unexpected outage hit the Skype overlay, leading to a service blackout that lasted for more than two days: this paper aims at shedding light on this event. Leveraging an accurate Skype classification engine, we carry out an experimental study of Skype signaling during the outage. In particular, we focus on the signaling traffic before, during and after the outage, in an attempt to quantify interesting properties of the event. While it is very difficult to gather clear insights concerning the root causes of the breakdown itself, the collected measurements nevertheless allow us to quantify several interesting aspects of the outage: for instance, measurements show that the outage caused, on average, a 3-fold increase in signaling traffic and a 10-fold increase in the number of contacted peers, topping 11 million connections for the most active node in our network, which immediately gives a sense of the extent of the phenomenon.
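    A minimal sketch of how such a before/during comparison might be computed, assuming signaling flows are available as (peer_id, epoch_timestamp, bytes) tuples and that the outage window bounds are known; the names, data format, and bounds below are illustrative assumptions, not the authors' measurement pipeline.

    ```python
    # Assumed epoch bounds for the August 2007 outage window (illustrative).
    OUTAGE_START, OUTAGE_END = 1187222400, 1187395200

    def summarize(flows):
        """Total signaling bytes and distinct contacted peers, inside vs. outside the window."""
        stats = {"outside": [0, set()], "during": [0, set()]}
        for peer, ts, nbytes in flows:
            period = "during" if OUTAGE_START <= ts < OUTAGE_END else "outside"
            stats[period][0] += nbytes
            stats[period][1].add(peer)
        return {p: (total, len(peers)) for p, (total, peers) in stats.items()}

    def outage_ratios(flows):
        """Ratios analogous to the reported 3x traffic and 10x contacted-peer increases."""
        s = summarize(flows)
        traffic_ratio = s["during"][0] / max(s["outside"][0], 1)
        peer_ratio = s["during"][1] / max(s["outside"][1], 1)
        return traffic_ratio, peer_ratio
    ```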

    An Empirical Study of the I2P Anonymity Network and its Censorship Resistance

    Tor and I2P are well-known anonymity networks used by many individuals to protect their online privacy and anonymity. Tor's centralized directory services facilitate the understanding of the Tor network, as well as the measurement and visualization of its structure through the Tor Metrics project. In contrast, I2P does not rely on centralized directory servers, and thus obtaining a complete view of the network is challenging. In this work, we conduct an empirical study of the I2P network, in which we measure properties including population, churn rate, router type, and the geographic distribution of I2P peers. We find that there are currently around 32K active I2P peers in the network on a daily basis. Of these peers, 14K are located behind NAT or firewalls. Using the collected network data, we examine the blocking resistance of I2P against a censor that wants to prevent access to I2P using address-based blocking techniques. Despite the decentralized characteristics of I2P, we discover that a censor can block more than 95% of peer IP addresses known by a stable I2P client by operating only 10 routers in the network. This amounts to severe network impairment: a blocking rate of more than 70% is enough to cause significant latency in web browsing activities, while blocking more than 90% of peer IP addresses can make the network unusable. Finally, we discuss the security consequences of the network being blocked, and directions for potential approaches to make I2P more resistant to blocking. (14 pages; to appear in the 2018 Internet Measurement Conference, IMC'18.)
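    A hedged sketch of the coverage estimate described above: the censor unions the peer IPs observed by the routers it operates and checks what fraction of a stable client's known peers would be caught by address-based blocking. The data structures and function names are illustrative, not taken from the study.

    ```python
    from typing import Iterable, Set

    def censor_blocklist(observed_by_router: Iterable[Set[str]]) -> Set[str]:
        """Union of peer IP addresses harvested by each censor-operated router."""
        blocklist: Set[str] = set()
        for peers in observed_by_router:
            blocklist |= peers
        return blocklist

    def blocking_rate(client_peers: Set[str], blocklist: Set[str]) -> float:
        """Fraction of the client's known peer IPs that address-based blocking would cover."""
        if not client_peers:
            return 0.0
        return len(client_peers & blocklist) / len(client_peers)
    ```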

    Autonomic Parameter Tuning of Anomaly-Based IDSs: an SSH Case Study

    Anomaly-based intrusion detection systems classify network traffic instances by comparing them with a model of normal network behavior. To be effective, such systems are expected to precisely detect intrusions (high true positive rate) while limiting the number of false alarms (low false positive rate). However, there exists a natural trade-off between detecting all anomalies (at the expense of raising alarms too often) and missing anomalies (but not issuing any false alarms). The parameters of a detection system play a central role in this trade-off, since they determine how responsive the system is to an intrusion attempt. Despite the importance of properly tuning the system parameters, the literature has put little emphasis on the topic, and the task of adjusting such parameters is usually left to the expertise of the system manager or expert IT personnel. In this paper, we present an autonomic approach for tuning the parameters of anomaly-based intrusion detection systems in the case of SSH traffic. We propose a procedure that aims to automatically tune the system parameters and, by doing so, optimize the system performance. We validate our approach by testing it on a flow-based probabilistic detection system for the detection of SSH attacks.
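    A simplified illustration of the trade-off being tuned: sweep a single detection threshold over labeled validation flows and keep the value with the best balance of true-positive and false-positive rates. The paper's procedure is autonomic and operates on the system's own parameters; this grid search only mirrors the underlying idea, and all names here are assumptions.

    ```python
    from typing import List, Tuple

    def rates(scored: List[Tuple[float, bool]], threshold: float) -> Tuple[float, float]:
        """Return (TPR, FPR) for detector-scored flows; the bool marks true attacks."""
        tp = sum(1 for s, attack in scored if attack and s >= threshold)
        fp = sum(1 for s, attack in scored if not attack and s >= threshold)
        positives = sum(1 for _, attack in scored if attack) or 1
        negatives = sum(1 for _, attack in scored if not attack) or 1
        return tp / positives, fp / negatives

    def tune_threshold(scored: List[Tuple[float, bool]], candidates: List[float]) -> float:
        """Pick the threshold maximizing TPR minus FPR (a Youden-style criterion)."""
        return max(candidates, key=lambda t: rates(scored, t)[0] - rates(scored, t)[1])
    ```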

    Temporal and Spatial Classification of Active IPv6 Addresses

    There is a striking volume of World-Wide Web activity on IPv6 today. In early 2015, one large Content Distribution Network handles 50 billion IPv6 requests per day from hundreds of millions of IPv6 client addresses; billions of unique client addresses are observed per month. Address counts, however, obscure the number of hosts with IPv6 connectivity to the global Internet. There are numerous address assignment and subnetting options in use; privacy addresses and dynamic subnet pools significantly inflate the number of active IPv6 addresses. As the IPv6 address space is vast, it is infeasible to comprehensively probe every possible unicast IPv6 address. Thus, to survey the characteristics of IPv6 addressing, we perform a year-long passive measurement study, analyzing the IPv6 addresses gleaned from activity logs for all clients accessing a global CDN. The goal of our work is to develop flexible classification and measurement methods for IPv6, motivated by the fact that its addresses are not merely more numerous; they are different in kind. We introduce the notion of classifying addresses and prefixes in two ways: (1) temporally, according to their instances of activity, to discern which addresses can be considered stable; (2) spatially, according to the density or sparsity of the aggregates in which active addresses reside. We present measurement and classification results, numerically and visually, that provide details on IPv6 address use and structure in global operation across the past year; establish the efficacy of our classification methods; and demonstrate that such classification can clarify dimensions of the Internet that otherwise appear quite blurred by current IPv6 addressing practices.
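    An illustrative sketch of the two classifications described above. Temporally, an address could be treated as "stable" if it is seen active on at least some minimum number of distinct days; spatially, active addresses could be counted per prefix to separate dense from sparse aggregates. The thresholds, prefix length, and names are assumptions for illustration, not the paper's definitions.

    ```python
    import ipaddress
    from collections import defaultdict
    from typing import Dict, Set

    def stable_addresses(activity: Dict[str, Set[str]], min_days: int = 7) -> Set[str]:
        """activity maps an IPv6 address to the set of days on which it was seen active."""
        return {addr for addr, days in activity.items() if len(days) >= min_days}

    def prefix_counts(addresses: Set[str], prefix_len: int = 64) -> Dict[str, int]:
        """Count active addresses per prefix; sparse aggregates have few, dense ones many."""
        counts: Dict[str, int] = defaultdict(int)
        for addr in addresses:
            net = ipaddress.ip_network(f"{addr}/{prefix_len}", strict=False)
            counts[str(net)] += 1
        return dict(counts)
    ```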