
    Chocolatine: Outage Detection for Internet Background Radiation

    The Internet is a complex ecosystem composed of thousands of Autonomous Systems (ASes) operated by independent organizations, each AS having a very limited view outside its own network. These complexities and limitations prevent network operators from precisely pinpointing the causes of service degradation or disruption when the problem lies outside their own network. In this paper, we present Chocolatine, a simple and efficient solution to detect remote connectivity loss using Internet Background Radiation (IBR). IBR is unidirectional, unsolicited Internet traffic, easily observed by monitoring unused address space. IBR features two remarkable properties: it originates worldwide, across diverse ASes, and it is incessant. We show that the number of IP addresses observed from an AS or a geographical area follows a periodic pattern. Using Seasonal ARIMA to statistically model IBR data, we then predict the number of IPs for the next time window; significant deviations from these predictions indicate an outage. We evaluated Chocolatine against a set of documented outages, using data from the UCSD Network Telescope operated by CAIDA. Our experiments show that the proposed methodology achieves a good trade-off between true-positive rate (90%) and false-positive rate (2%) and largely outperforms CAIDA's own IBR-based detection method. Furthermore, comparing against other methods, i.e., BGP monitoring and active probing, we observe that Chocolatine shares a large common set of outages with them, in addition to detecting many outages that would otherwise go unnoticed. Comment: TMA 201
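As a toy illustration of the detection idea above, the sketch below forecasts the number of unique IBR source IPs per time window from its periodic pattern and flags windows that fall far below the prediction. For simplicity it uses a seasonal-naive forecast with an empirical error band in place of the paper's Seasonal ARIMA model; all counts, the daily period, and the 4-sigma threshold are invented for illustration.

```python
import math
import statistics

PERIOD = 24  # assume hourly windows with a daily cycle

# Two weeks of synthetic training data: unique IBR source IPs per window,
# a daily sinusoidal pattern plus small deterministic "noise".
train = [1000 + 200 * math.sin(2 * math.pi * (t % PERIOD) / PERIOD)
         + ((t * 37) % 11 - 5)
         for t in range(PERIOD * 14)]

# Seasonal-naive model: predict each hour-of-day as the mean of the same
# hour on previous days; the error band comes from the per-hour spread.
by_hour = {h: [v for t, v in enumerate(train) if t % PERIOD == h]
           for h in range(PERIOD)}
forecast = {h: statistics.mean(vs) for h, vs in by_hour.items()}
band = {h: 4 * statistics.stdev(vs) for h, vs in by_hour.items()}

# "Observed" next day, with a connectivity loss injected at hours 10-13.
observed = [1000 + 200 * math.sin(2 * math.pi * h / PERIOD)
            for h in range(PERIOD)]
for h in range(10, 14):
    observed[h] = 100  # outage: the IP count collapses

outages = [h for h in range(PERIOD) if observed[h] < forecast[h] - band[h]]
print(outages)  # -> [10, 11, 12, 13]
```

A full SARIMA fit would replace the per-hour mean with a proper seasonal autoregressive forecast, but the anomaly test (observation below the lower prediction bound) is the same.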

    GNSS Spoofing Detection via Opportunistic IRIDIUM Signals

    In this paper, we study the privately-owned IRIDIUM satellite constellation to provide a location service that is independent of GNSS. In particular, we apply our findings to propose a new GNSS spoofing detection solution, exploiting the unencrypted IRIDIUM Ring Alert (IRA) messages that are broadcast by IRIDIUM satellites. We first reverse-engineer many parameters of the IRIDIUM satellite constellation, such as the satellites' speed, packet inter-arrival times, maximum satellite coverage, satellite pass duration, and the satellite beam constellation, to name a few. We then adopt the aforementioned statistics to create a detailed model of the satellite network. Subsequently, we propose a solution to detect unintended deviations of a target user from their path due to GNSS spoofing attacks. We show that our solution can be used efficiently and effectively to verify the position estimated from the standard GNSS satellite constellation, and we provide constraints and parameters to fit several application scenarios. All the results reported in this paper, while showing the quality and viability of our proposal, are supported by real data. In particular, we have collected and analyzed hundreds of thousands of IRA messages, thanks to a measurement campaign lasting several days. All the collected data (1000+ hours) have been made available to the research community. Our solution is particularly suitable for unattended scenarios such as deserts, rural areas, or open seas, where standard spoofing detection techniques resorting to crowd-sourcing cannot be used due to deployment limitations. Moreover, contrary to competing solutions, our approach does not resort to physical-layer information, dedicated hardware, or multiple receiving stations, exploiting only a single receiving antenna and publicly-available IRIDIUM transmissions. Finally, novel research directions are also highlighted. Comment: Accepted for the 13th Conference on Security and Privacy in Wireless and Mobile Networks (WISEC), 202
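The verification principle above can be caricatured in a few lines: a received IRA message reveals which satellite beam currently covers the receiver, so a GNSS fix far outside that beam's footprint is suspect. The beam centre, footprint radius, and positions below are illustrative assumptions, not values measured in the paper.

```python
import math

EARTH_RADIUS_KM = 6371.0

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance between two (lat, lon) points in degrees."""
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dlat, dlon = p2 - p1, math.radians(lon2 - lon1)
    a = (math.sin(dlat / 2) ** 2
         + math.cos(p1) * math.cos(p2) * math.sin(dlon / 2) ** 2)
    return 2 * EARTH_RADIUS_KM * math.asin(math.sqrt(a))

def gnss_position_plausible(gnss_fix, beam_centre, footprint_km=400.0):
    """Accept the GNSS fix only if it lies inside the observed beam footprint."""
    return haversine_km(*gnss_fix, *beam_centre) <= footprint_km

beam = (48.85, 2.35)  # hypothetical beam centre decoded from an IRA message
print(gnss_position_plausible((48.90, 2.30), beam))   # nearby fix: plausible
print(gnss_position_plausible((41.90, 12.50), beam))  # ~1100 km away: spoofed
```

A real detector would aggregate many IRA observations over a pass, using the reverse-engineered pass duration and beam geometry rather than a single fixed radius.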

    Self-Calibration Methods for Uncontrolled Environments in Sensor Networks: A Reference Survey

    Growing progress in sensor technology has constantly expanded the number and range of low-cost, small, and portable sensors on the market, increasing the number and type of physical phenomena that can be measured with wirelessly connected sensors. Large-scale deployments of wireless sensor networks (WSNs) involving hundreds or thousands of devices and limited budgets often constrain the choice of sensing hardware, which generally has reduced accuracy, precision, and reliability. Therefore, it is challenging to achieve good data quality and maintain error-free measurements during the whole system lifetime. Self-calibration or recalibration in ad hoc sensor networks to preserve data quality is essential, yet challenging, for several reasons, such as the existence of random noise and the absence of suitable general models. Calibration performed in the field, without accurate and controlled instrumentation, is said to be in an uncontrolled environment. This paper surveys current and fundamental self-calibration approaches and models for wireless sensor networks in uncontrolled environments.
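As a minimal sketch of one family of approaches covered by the survey, the snippet below fits a gain/offset correction for a drifting low-cost sensor against co-located reference readings (a network consensus could play the same role as the reference); the readings are synthetic.

```python
def fit_gain_offset(raw, reference):
    """Least-squares fit of reference ~ gain * raw + offset."""
    n = len(raw)
    mx = sum(raw) / n
    my = sum(reference) / n
    sxx = sum((x - mx) ** 2 for x in raw)
    sxy = sum((x - mx) * (y - my) for x, y in zip(raw, reference))
    gain = sxy / sxx
    return gain, my - gain * mx

# Simulated drifting sensor: the true value v is reported as 1.2 * v + 3.
truth = [10.0, 12.0, 15.0, 20.0, 25.0]
raw = [1.2 * v + 3.0 for v in truth]

gain, offset = fit_gain_offset(raw, truth)
calibrated = [gain * x + offset for x in raw]
print(round(gain, 3), round(offset, 3))  # recovers ~0.833 and ~-2.5
```

Real deployments complicate this picture with measurement noise, missing data, and drift that varies over time, which is why periodic recalibration and more general models are needed.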

    The future of Earth observation in hydrology

    In just the past 5 years, the field of Earth observation has progressed beyond the offerings of conventional space-agency-based platforms to include a plethora of sensing opportunities afforded by CubeSats, unmanned aerial vehicles (UAVs), and smartphone technologies that are being embraced by both for-profit companies and individual researchers. Over the previous decades, space agency efforts have brought forth well-known and immensely useful satellites such as the Landsat series and the Gravity Recovery and Climate Experiment (GRACE) system, with costs typically of the order of 1 billion dollars per satellite and with concept-to-launch timelines of the order of 2 decades (for new missions). More recently, the proliferation of smartphones has helped to miniaturize sensors and energy requirements, facilitating advances in the use of CubeSats that can be launched by the dozens, while providing ultra-high (3-5 m) resolution sensing of the Earth on a daily basis. Start-up companies that did not exist a decade ago now operate more satellites in orbit than any space agency, and at costs that are a mere fraction of traditional satellite missions. With these advances come new space-borne measurements, such as real-time high-definition video for tracking air pollution, storm-cell development, flood propagation, precipitation monitoring, or even for constructing digital surfaces using structure-from-motion techniques. Closer to the surface, measurements from small unmanned drones and tethered balloons have mapped snow depths and floods, and estimated evaporation at sub-metre resolutions, pushing back on spatio-temporal constraints and delivering new process insights.
At ground level, precipitation has been measured using signal attenuation between antennae mounted on cell phone towers, while the proliferation of mobile devices has enabled citizen scientists to catalogue photos of environmental conditions, estimate daily average temperatures from battery state, and sense other hydrologically important variables such as channel depths using commercially available wireless devices. Global internet access is being pursued via high-altitude balloons, solar planes, and hundreds of planned satellite launches, providing a means to exploit the "internet of things" as an entirely new measurement domain. Such global access will enable real-time collection of data from billions of smartphones or from remote research platforms. This future will produce petabytes of data that can only be accessed via cloud storage and will require new analytical approaches to interpret. The extent to which today's hydrologic models can usefully ingest such massive data volumes is unclear. Nor is it clear whether this deluge of data will be usefully exploited, either because the measurements are superfluous, inconsistent, or not accurate enough, or simply because we lack the capacity to process and analyse them. What is apparent is that the tools and techniques afforded by this array of novel and game-changing sensing platforms present our community with a unique opportunity to develop new insights that advance fundamental aspects of the hydrological sciences. To accomplish this will require more than just an application of the technology: in some cases, it will demand a radical rethink of how we utilize and exploit these new observing systems.

    From Sensor to Observation Web with Environmental Enablers in the Future Internet

    This paper outlines the grand challenges in global sustainability research and the objectives of the FP7 Future Internet PPP program within the Digital Agenda for Europe. Large user communities are generating significant amounts of valuable environmental observations at local and regional scales using the devices and services of the Future Internet. These communities’ environmental observations represent a wealth of information which is currently hardly used, or used only in isolation, and is therefore in need of integration with other information sources. Indeed, this very integration will lead to a paradigm shift from a mere Sensor Web to an Observation Web with semantically enriched content emanating from sensors, environmental simulations, and citizens. The paper also describes the research challenges to realize the Observation Web and the associated environmental enablers for the Future Internet. Such an environmental enabler could, for instance, be an electronic sensing device, a web-service application, or even a social networking group affording or facilitating the capability of Future Internet applications to consume, produce, and use environmental observations in cross-domain applications. The term "envirofied" Future Internet is coined to describe this overall target, which forms a cornerstone of work in the Environmental Usage Area within the Future Internet PPP program. Relevant trends described in the paper are the usage of ubiquitous sensors (anywhere), the provision and generation of information by citizens, and the convergence of real and virtual realities to convey understanding of environmental observations. The paper addresses the technical challenges in the Environmental Usage Area and the need for designing a multi-style, service-oriented architecture. Key topics are the mapping of requirements to capabilities, and providing scalability and robustness while implementing context-aware information retrieval.
Another essential research topic is handling data fusion and model-based computation, and the related propagation of information uncertainty. Approaches to security, standardization, and harmonization, all essential for sustainable solutions, are summarized from the perspective of the Environmental Usage Area. The paper concludes with an overview of emerging, high-impact applications in the environmental areas concerning land ecosystems (biodiversity), air quality (atmospheric conditions), and water ecosystems (marine asset management).

    Are Darknets All The Same? On Darknet Visibility for Security Monitoring

    Darknets are sets of IP addresses that are advertised but do not host any client or server. By passively recording incoming packets, they assist network monitoring activities. Since the packets they receive are unsolicited by definition, darknets help to spot misconfigurations as well as important security events, such as the appearance and spread of botnets, DDoS attacks using spoofed IP addresses, etc. A number of organizations worldwide deploy darknets, ranging from a few dozen IP addresses to large /8 networks. Here we investigate how similar the visibility of different darknets is. Relying on traffic from three darknets deployed on different continents, we evaluate their exposure in terms of observed events given their allocated IP addresses. The latter is particularly relevant considering the shortage of IPv4 addresses on the Internet. Our results suggest that some well-known facts about darknet visibility are invariant across deployments, such as the most commonly contacted ports. However, size and location matter: we find significant differences in the traffic observed by darknets deployed in different IP ranges, as well as according to the size of the IP range allocated for monitoring.
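One concrete way to quantify the "same facts, different traffic" finding above is to compare per-port packet histograms across darknets. The sketch below does this with cosine similarity over invented counts; it is an illustration, not the paper's methodology.

```python
import math
from collections import Counter

def port_similarity(a: Counter, b: Counter) -> float:
    """Cosine similarity between two port-count histograms."""
    ports = set(a) | set(b)
    dot = sum(a[p] * b[p] for p in ports)
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb)

# Hypothetical per-port packet counts from two darknet deployments.
darknet_eu = Counter({23: 9000, 445: 4000, 22: 2500, 3389: 800})
darknet_us = Counter({23: 7000, 445: 5000, 22: 1800, 5060: 1200})

print(darknet_eu.most_common(2))  # Telnet (23) and SMB (445) dominate both
print(round(port_similarity(darknet_eu, darknet_us), 3))
```

A high similarity on the head of the port distribution, with divergence in the tail, would match the paper's observation that top ports are invariant while the bulk of the traffic differs by IP range and size.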

    Sensing the Noise: Uncovering Communities in Darknet Traffic

    Darknets are ranges of IP addresses advertised without answering any traffic. Darknets help to uncover interesting network events, such as misconfigurations and network scans. Interpreting darknet traffic helps against cyber-attacks – e.g., malware often reaches darknets when scanning the Internet for vulnerable devices. The traffic reaching darknets is however voluminous and noisy, which calls for efficient ways to represent the data and highlight possibly important events. This paper evaluates a methodology to summarize packets reaching darknets. We represent the darknet activity as a graph, which captures the remote hosts contacting the darknet nodes and ports, as well as the frequency at which each port is reached. From these representations, we apply community detection algorithms in search of patterns that could represent coordinated activity. By highlighting such activities we are able, for example, to group together IP addresses that predominantly engage in contacting specific targets, or, vice versa, to identify targets that are frequently contacted together to exploit the vulnerabilities of a given service. From the community detection results, the network analyst can recognize, for example, that a group of hosts has been infected by a botnet and is currently scanning the network in search of vulnerable services (e.g., SSH and Telnet, among the most commonly targeted). Such information is impossible to obtain when analyzing the behavior of single sources, or packets one by one. All in all, our work is a first step towards a comprehensive aggregation methodology to automate the analysis of darknet traffic, a fundamental aspect for the recognition of coordinated and anomalous events.
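The pipeline sketched above can be illustrated on a toy trace: build a bipartite graph of source IPs and contacted darknet ports, then run a community detection pass. The snippet uses a simple deterministic label-propagation scheme standing in for the algorithms evaluated in the paper, over invented packets.

```python
from collections import Counter, defaultdict

packets = [  # (source IP, destination port) pairs seen at the darknet
    ("10.0.0.1", 23), ("10.0.0.1", 2323),
    ("10.0.0.2", 23), ("10.0.0.2", 2323),
    ("10.0.0.3", 23), ("10.0.0.3", 2323),  # a Telnet/Mirai-like scanner group
    ("10.0.0.8", 22), ("10.0.0.9", 22),    # a separate SSH-scanning group
]

# Bipartite adjacency: sources on one side, ("port", n) nodes on the other.
adj = defaultdict(set)
for src, port in packets:
    adj[src].add(("port", port))
    adj[("port", port)].add(src)

# Deterministic label propagation: each node repeatedly adopts the most
# common label among its neighbours (ties broken by the smallest label).
labels = {node: i for i, node in enumerate(sorted(adj, key=str))}
for _ in range(10):
    changed = False
    for node in sorted(adj, key=str):
        counts = Counter(labels[n] for n in adj[node])
        best = min(l for l in counts if counts[l] == max(counts.values()))
        if labels[node] != best:
            labels[node], changed = best, True
    if not changed:
        break

communities = defaultdict(set)
for node, label in labels.items():
    communities[label].add(node)
print([sorted(map(str, c)) for c in communities.values()])
```

On this trace the propagation separates the Telnet scanners (and ports 23/2323) from the SSH scanners (and port 22), the kind of coordinated-activity grouping that is invisible when packets are inspected one by one.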

    Summary Report: Topical Group on Application and Industry, Community Engagement Frontier, Snowmass 2021

    The HEP community leads and operates cutting-edge experiments for the DOE Office of Science with challenging sensing, data processing, and computing requirements that far surpass typical industrial applications. To make the necessary progress in the energy, material, and fundamental sciences, the development of novel technologies is often required to enable these advanced detector and accelerator programs. Our capabilities include efficient co-design, a prerequisite for deploying advanced techniques in a scientific setting where development spans from rapid prototyping to robust and reliable production scale. This applies across the design spectrum, from low-level fabrication techniques to high-level software development, and underpins the need for a holistic approach to innovation that accelerates the cycle of technology development and deployment. The challenges set by the next generation of experiments require a collaborative approach between academia, industry, and national labs; no single stakeholder alone will be able to deliver the technologies required for the success of the scientific goals. Tools and techniques developed for High Energy Physics (HEP) research can accelerate scientific discovery more broadly across the DOE Office of Science and other federal initiatives, and also benefit industry applications.

    Experimental Evaluation of On-Board Contact-Graph Routing Solutions for Future Nano-Satellite Constellations

    Hardware processing performance and storage capability for nanosatellites have increased notably in recent years. Unfortunately, this progress is not observed at the same pace in transmission data rates, mostly limited by the power available on small, constrained platforms. Thus, space-to-ground data transfer becomes the operational bottleneck of most modern space applications. As channel rates approach the Shannon limit, alternative solutions to manage data transmission are in the spotlight. Among these, networked nanosatellite constellations can cooperatively offload data to neighboring nodes via frequent inter-satellite link (ISL) opportunities in order to increase the overall delivered volume and reduce the end-to-end data delivery delay. Nevertheless, the computation of efficient multi-hop routes needs to consider not only the present satellite and ground segments as nodes, but also a non-trivial time-dynamic evolution of the system dictated by orbital dynamics. Moreover, the process should properly model and rely on the considerable amount of information about node configuration and network status obtained from recent telemetry. Also, in most practical cases, the forwarding decision shall happen in orbit, where satellites can timely react to local or in-transit traffic demands. In this context, it is appealing to investigate the applicability of adequate routing algorithms running on state-of-the-art nanosatellite on-board computers. In this work, we present the first implementation of the Contact Graph Routing (CGR) algorithm, developed by the Jet Propulsion Laboratory (JPL, NASA), for a nanosatellite on-board computer. We describe CGR, including the Dijkstra adaptation operating at its core as well as protocol aspects described in the CCSDS Schedule-Aware Bundle Routing (SABR) recommended standard. Building on JPL's Interplanetary Overlay Network (ION) software stack as a strong baseline, we develop the first CGR implementation for nanosatellites. We make our code available to the public and adapt it to the GomSpace toolchain in order to compile it for the NanoMind A712C on-board flight computer, based on a 32-bit ARM7 RISC CPU. Next, we evaluate its performance in terms of CPU execution time (tick counts) and memory resources for increasingly complex satellite networks. The obtained metrics serve as compelling evidence of the polynomial scalability of the approach, matching the predicted theoretical behavior. Furthermore, we determine that the evaluated hardware and implementation can cope with satellite networks of more than 120 nodes and 1,200 contact opportunities.
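The Dijkstra adaptation at the heart of CGR can be sketched as an earliest-arrival-time search over a contact plan. The toy below ignores the residual volume, hop limits, and route confidence that real CGR/SABR tracks; the contacts and the one-way light/processing delay (owlt) are invented.

```python
import heapq

# (from, to, start, end, owlt) — a contact is a scheduled transmission window.
contacts = [
    ("sat1", "sat2", 0, 50, 1),
    ("sat2", "ground", 60, 100, 1),
    ("sat1", "sat3", 20, 40, 1),
    ("sat3", "ground", 30, 45, 1),
]

def earliest_arrival(contacts, source, dest, t0=0):
    """Dijkstra on the contact graph, minimising arrival time at dest."""
    best = {source: t0}
    queue = [(t0, source)]
    while queue:
        t, node = heapq.heappop(queue)
        if node == dest:
            return t
        if t > best.get(node, float("inf")):
            continue  # stale queue entry
        for frm, to, start, end, owlt in contacts:
            if frm != node or t > end:
                continue  # wrong node, or the window closes before we arrive
            arrival = max(t, start) + owlt  # wait for the window, then send
            if arrival < best.get(to, float("inf")):
                best[to] = arrival
                heapq.heappush(queue, (arrival, to))
    return None  # destination unreachable under this contact plan

print(earliest_arrival(contacts, "sat1", "ground"))  # via sat3, arriving at t=31
```

The search prefers the later-starting sat1-sat3-ground route over the earlier sat1-sat2 hop because waiting for the sat2-ground window would delay delivery, which is exactly the time-dynamic behavior that distinguishes contact-graph routing from routing on a static topology.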
