563 research outputs found
WIP: Analysis of feasible topologies for backhaul mesh networks
Mesh backhauls are receiving attention for 5G networks, but not only there. A backhaul mesh is attractive because its multiple potential paths grant redundancy and robustness. The real topology and its properties, however, are heavily influenced by the characteristics of the place where the network is deployed, a fact rarely taken into account in the scientific literature, mainly due to the lack of detailed topographic data. This WIP analyzes the impact of true topography on small backhaul meshes in nine different locations in Italy. Initial results stress how true data influence the results and can help design better networks and better services.
Infective flooding in low-duty-cycle networks, properties and bounds
Flooding information is an important function in many networking applications. In some networks, such as wireless sensor networks or some ad-hoc networks, it is so essential that it dominates the performance of the entire system. Exploiting recent results on the distributed computation of the eigenvector centrality of nodes in the network graph, together with classical dynamic diffusion models on graphs, this paper derives a novel theoretical framework for efficient resource allocation to flood information in mesh networks with low duty cycling, without the need to build a distribution tree or any other distribution overlay. Furthermore, the method requires only local computations based on each node's neighborhood. The model provides lower and upper stochastic bounds, holding with high probability, on the flooding delay averaged over all possible sources. We show that the lower bound is very close to the theoretical optimum. A simulation-based implementation allows the study of specific topologies and graph models as well as scheduling heuristics and packet losses. Simulation experiments show that simple protocols based on our resource allocation strategy can easily achieve results very close to the theoretical minimum obtained by building optimized overlays on the network.
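As a rough illustration of the kind of local computation the abstract refers to, the sketch below estimates eigenvector centrality by synchronous power iteration, where each node only needs the current values of its direct neighbours. The graph, iteration count, and normalization step are illustrative assumptions, not the paper's exact scheme.

```python
# Hypothetical sketch (not the paper's exact scheme): eigenvector
# centrality via synchronous power iteration, each node reading only
# its direct neighbours' current values.
import math

def eigenvector_centrality(adj, iterations=100):
    """adj: dict mapping each node to the list of its neighbours."""
    x = {v: 1.0 for v in adj}
    for _ in range(iterations):
        # Purely local step: every node sums its neighbours' values.
        x = {v: sum(x[u] for u in adj[v]) for v in adj}
        norm = math.sqrt(sum(val * val for val in x.values()))
        x = {v: val / norm for v, val in x.items()}
    return x

# Toy mesh: node 'b' is the hub, so it should receive the highest score.
mesh = {'a': ['b'], 'b': ['a', 'c', 'd'], 'c': ['b', 'd'], 'd': ['b', 'c']}
scores = eigenvector_centrality(mesh)
```

In a real low-duty-cycle deployment the iteration would run asynchronously over radio exchanges, but the per-node update is the same neighbourhood sum.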
Keep it fresh: Reducing the age of information in V2X networks
The freshness of information is of the utmost importance in many contexts, including V2X networks and applications. One measure of freshness is the Age of Information (AoI), a notion recently introduced and explored by several authors, often with specific reference to vehicular networks. With this work, we explore the possibility of reducing the AoI of multi-hop information flooding in V2X networks by exploiting the properties of the Eigenvector Centrality (EvC) of nodes in the topology, together with the fact that each node can compute it using only local information and very simple computations, so that each node can autonomously adapt its own networking parameters to redistribute information more efficiently. Starting from theoretical bounds and results, we explore how they hold in urban-constrained topologies and compare the AoI achieved exploiting EvC with the AoI achievable without this optimization of the nodes' behavior. Simulation results show a meaningful improvement without using additional resources and without the need for any global coordination.
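For readers unfamiliar with the metric, the sketch below computes the time-average AoI of a status-update flow from generation and reception timestamps; the numbers are invented and the computation is the standard sawtooth average, not anything specific to this paper.

```python
# Hypothetical sketch (illustrative, not the paper's model): the
# time-average Age of Information of a status-update flow. At time t the
# instantaneous age is t minus the generation time of the freshest update
# received so far; averaging this sawtooth over time gives the mean AoI.
def average_aoi(updates, horizon):
    """updates: (generation_time, reception_time) pairs, sorted by reception."""
    age_area = 0.0
    last_gen, last_rx = updates[0]
    for gen, rx in updates[1:]:
        # Between receptions the age grows linearly: integrate the trapezoid.
        a0 = last_rx - last_gen          # age just after the previous update
        a1 = rx - last_gen               # age just before this update arrives
        age_area += (a0 + a1) / 2 * (rx - last_rx)
        last_gen, last_rx = gen, rx
    # Tail segment from the last reception up to the observation horizon.
    a0 = last_rx - last_gen
    a1 = horizon - last_gen
    age_area += (a0 + a1) / 2 * (horizon - last_rx)
    return age_area / (horizon - updates[0][1])

# Updates generated at t=0 and t=2, received at t=1 and t=3, observed to t=4.
mean_age = average_aoi([(0.0, 1.0), (2.0, 3.0)], horizon=4.0)
```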
Poster: TrueNets, a Topology Generator for Realistic Network Analysis
The availability of realistic topology generators is a key component in the study of network performance. This work describes a new approach to the realistic generation of topologies, named TrueNets, that uses open data provided by public administrations and crowd-sensing efforts for populated areas: maps, and the altitude of land and buildings. TrueNets estimates link performance with classical propagation models and produces annotated topologies of networks that can actually exist in the selected areas, thus providing not only an abstract tool for performance evaluation, but also a design tool for planning. We use TrueNets to model distributed mesh networks and show that the generated topologies differ substantially from those produced by state-of-the-art synthetic generators.
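A generator that uses land and building altitudes must, at its core, decide whether two antennas can see each other. The sketch below shows that geometric test over a sampled height profile; the profile values and antenna heights are invented, and real tools would also account for Fresnel-zone clearance and Earth curvature.

```python
# Hypothetical sketch of the core geometric test such a generator needs:
# line-of-sight between two antennas over a sampled terrain+building
# height profile (values in metres, invented for illustration).
def line_of_sight(heights, h_tx, h_rx):
    """heights: elevations sampled along the link path.
    h_tx, h_rx: antenna heights above the first and last sample."""
    n = len(heights) - 1
    z0 = heights[0] + h_tx
    z1 = heights[-1] + h_rx
    for k in range(1, n):
        # Elevation of the straight ray at sample k.
        ray = z0 + (z1 - z0) * k / n
        if heights[k] > ray:       # an obstacle pierces the ray
            return False
    return True

blocked = [10, 12, 25, 11, 10]    # a 25 m building in the middle
clear   = [10, 12, 13, 11, 10]
```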
S.T.R.E.S.S. : Stress Testing and Reverse Engineering for System Security
In modern wireless networks, the functions included in layer II have to deal with complex problems, such as security and access control, that were previously delegated to upper layers. This growing complexity led some vendors to implement layer II primitives directly in software; e.g., IEEE 802.11i has been largely distributed as a software patch to be used with legacy 802.11b/g hardware. In any extremely complex software the likelihood of implementation errors rises, and it is well known that software bugs can lead to system instability and possibly to security vulnerabilities. Software bugs are the most common cause of successful attacks against any kind of network and represent a real plague for system administrators. Stress testing is a widely used methodology to find and eliminate software bugs. In this paper we present a platform to perform stress tests of generic network protocol implementations, especially optimized for layer II stress tests, which present specific problems. With our approach, a generic network protocol described in the ABNF language can be tested by transmitting arbitrary frame sequences and interpreting the responses to verify consistency with the communication standard in use. Our platform can interact dynamically with the tested machine (an access point, a router, etc.) to verify its robustness and its compliance with the standard. Experiments confirmed the validity of our approach both as a stress-test technique for systems under development and as a reverse-engineering technique for interaction with closed-source systems.
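The grammar-driven idea can be sketched as follows: expand an ABNF-like grammar into frame payloads, so a tester can emit both valid and mutated sequences and observe the device's responses. The rule names and grammar below are invented for illustration and are far simpler than real ABNF (RFC 5234).

```python
# Hypothetical sketch: random expansion of a tiny ABNF-like grammar into
# frame token sequences. Rules and tokens are invented for illustration.
import random

GRAMMAR = {
    "frame":   [["header", "payload"]],
    "header":  [["0xAA"], ["0xAB"]],
    "payload": [["data"], ["data", "payload"]],   # recursive: 1..n data bytes
    "data":    [["0x00"], ["0xFF"]],
}

def expand(symbol, rng, depth=0, max_depth=6):
    """Randomly expand a grammar symbol into a list of terminal tokens."""
    if symbol not in GRAMMAR:
        return [symbol]                  # terminal token
    choices = GRAMMAR[symbol]
    if depth >= max_depth:
        choices = [choices[0]]           # cut recursion: take the shortest rule
    rule = rng.choice(choices)
    out = []
    for sym in rule:
        out.extend(expand(sym, rng, depth + 1, max_depth))
    return out

rng = random.Random(7)
frame = expand("frame", rng)
```

A fuzzer built this way can also deliberately mutate the generated tokens to probe how an implementation handles frames that violate the grammar.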
Improving P2P streaming in Wireless Community Networks
Wireless Community Networks (WCNs) are bottom-up broadband networks that give people ownership of their online communication means. Too often, however, services tailored to their characteristics are missing, with the consequence that they perform worse than they could. We present here an adaptation of an Open Source P2P live streaming platform that works efficiently, and with good application-level quality, over WCNs. WCN links are normally symmetric (unlike standard ADSL access), and a WCN topology is local and normally flat (contrary to the global Internet), so the P2P overlay used for video distribution can be adapted to the underlying network characteristics. We exploit this observation to derive overlay building strategies that make use of cross-layer information to reduce the impact of the P2P streaming on the WCN while maintaining good application performance. We experiment with a real application on real WCN nodes, both in the Community-Lab provided by the CONFINE EU Project and within an emulation framework based on Mininet, where we can build larger topologies and interact more efficiently with the mesh underlay, which is unfortunately not accessible in Community-Lab. The results show that, with the proposed overlay building strategies, P2P streaming applications can reduce the load on the WCN to about one half, while also equalizing the load on links. At the same time, the delivery rate and delay of video chunks are practically unaffected.
Exact Distributed Load Centrality Computation: Algorithms, Convergence, and Applications to Distance Vector Routing
Many optimization techniques for networking protocols take advantage of topological information to improve performance. Often, the topological information at the core of these techniques is a centrality metric such as the Betweenness Centrality (BC) index. BC is, in fact, a centrality metric with many well-known successful applications documented in the literature, from resource allocation to routing. To compute BC, however, each node must run a centralized algorithm and needs global topological knowledge; such requirements limit the feasibility of optimization procedures based on BC. To overcome restrictions of this kind, we present a novel distributed algorithm that requires only local information to compute a similar alternative metric, called Load Centrality (LC). We present the new algorithm together with a proof of its convergence and an analysis of its time complexity. The proposed algorithm is general enough to be integrated with any distance-vector (DV) routing protocol. In support of this claim, we provide an implementation on top of Babel, a real-world DV protocol. We use this implementation in an emulation framework to show how LC can be exploited to reduce Babel's convergence time upon node failure without increasing control overhead. As a key step towards the adoption of centrality-based optimization for routing, we study how the algorithm can be incrementally introduced into a network running a DV routing protocol. We show that even when only a small fraction of nodes participate in the protocol, the algorithm accurately ranks nodes according to their centrality.
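To make the metric concrete, the sketch below computes load centrality centrally on a toy graph, as a reference for what a distributed estimator should converge to: every node sends one unit of traffic to every other node along shortest paths, splitting equally at ties, and a node's load is the traffic it forwards. This is a textbook-style reference computation, not the paper's distributed algorithm.

```python
# Hypothetical sketch: centralized load-centrality reference computation
# on a toy graph (the paper's contribution is a *distributed* estimator).
from collections import deque

def load_centrality(adj):
    load = {v: 0.0 for v in adj}
    for src in adj:
        # BFS from src: hop distances and shortest-path predecessors.
        dist = {src: 0}
        preds = {v: [] for v in adj}
        order = []
        q = deque([src])
        while q:
            u = q.popleft()
            order.append(u)
            for w in adj[u]:
                if w not in dist:
                    dist[w] = dist[u] + 1
                    q.append(w)
                if dist[w] == dist[u] + 1:
                    preds[w].append(u)
        # Push one unit destined to each node back toward src,
        # splitting equally among shortest-path predecessors.
        traffic = {v: 0.0 for v in adj}
        for v in reversed(order):
            if v == src:
                continue
            traffic[v] += 1.0
            share = traffic[v] / len(preds[v])
            for p in preds[v]:
                traffic[p] += share
                if p != src:
                    load[p] += share      # p forwards this traffic
    return load

# On a path a-b-c, node b forwards the two units exchanged between a and c.
path = {'a': ['b'], 'b': ['a', 'c'], 'c': ['b']}
lc = load_centrality(path)
```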
Validation of a bioanalytical method for the determination of synthetic and natural cannabinoids (new psychoactive substances) in oral fluid samples by means of HPLC-MS/MS
New psychoactive substances (NPS) are an important focus nowadays: they are continually produced with minimal structural modifications in order to circumvent the law and increase the difficulty of identifying them. Moreover, since there is a large number of different compounds, it is arduous to develop analytical screening and/or confirmation methods that allow their identification and quantification. The aim of this work is to develop and validate a bioanalytical method for detecting new synthetic drugs in biological samples, specifically oral fluid, using high-performance liquid chromatography coupled with tandem mass spectrometry (HPLC-MS/MS) with minimal sample pretreatment. Oral fluid samples were simply centrifuged and denatured with different rapid procedures before injection into the LC-MS/MS system. Calibration curves covered a linear concentration range from the LOQ to 100 ng/mL. Validation parameters such as linearity, precision, accuracy, selectivity, matrix effect and thermal stability were evaluated and showed satisfactory results, in accordance with US Food & Drug Administration guidelines. The inter-day analytical bias and imprecision at two levels of quality control (QC) were within ±15% for most compounds. This method was able to identify and quantify the 10 validated NPS in this biological sample, even in the presence of matrix effects.
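One of the validation computations mentioned above, linearity with back-calculated bias against the ±15% acceptance criterion, can be sketched as follows. All concentrations and detector responses below are invented for illustration.

```python
# Hypothetical sketch: least-squares calibration curve (response vs.
# concentration) and back-calculated bias of a QC sample against the
# +/-15% acceptance criterion. All numbers are invented.
def linear_fit(x, y):
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    slope = (sum((a - mx) * (b - my) for a, b in zip(x, y))
             / sum((a - mx) ** 2 for a in x))
    return slope, my - slope * mx

conc = [1, 5, 10, 25, 50, 100]                 # calibrators, ng/mL
area = [0.11, 0.52, 1.05, 2.49, 5.10, 9.90]    # measured response ratios
slope, intercept = linear_fit(conc, area)

def bias_percent(measured_area, nominal_conc):
    """Back-calculate a concentration and return its relative bias in %."""
    back_calc = (measured_area - intercept) / slope
    return 100 * (back_calc - nominal_conc) / nominal_conc

qc_bias = bias_percent(2.40, 25)    # QC sample nominally at 25 ng/mL
```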
Interplay of spin waves and vortices in the two-dimensional XY model at small vortex-core energy
The Berezinskii-Kosterlitz-Thouless (BKT) mechanism describes universal vortex unbinding in many two-dimensional systems, including the paradigmatic XY model. However, most of these systems present a complex interplay between excitations at different length scales that complicates theoretical calculations of nonuniversal thermodynamic quantities. These difficulties may be overcome by suitably modifying the initial conditions of the BKT flow equations to account for noncritical fluctuations at small length scales. In this work, we perform a systematic study of the validity and limits of this two-step approach by constructing optimized initial conditions for the BKT flow. We find that the two-step approach can accurately reproduce the results of Monte Carlo simulations of the traditional XY model. To systematically study the interplay between vortices and spin-wave excitations, we introduce a modified XY model with increased vortex fugacity. We present large-scale Monte Carlo simulations of the spin stiffness and vortex density for this modified XY model and show that even at large vortex fugacity, vortex unbinding is accurately described by the nonperturbative functional renormalization group.
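For readers unfamiliar with the simulation side, the sketch below shows a minimal Metropolis Monte Carlo update for the standard 2D XY model with Hamiltonian H = -J Σ_<ij> cos(θ_i - θ_j) on a small periodic lattice. Lattice size, temperature, and sweep count are illustrative; the paper's large-scale simulations of the modified model (with tunable vortex fugacity) are far beyond this toy.

```python
# Hypothetical sketch: single-spin Metropolis updates for the 2D XY model
# on an L x L periodic lattice. Parameters are illustrative only.
import math, random

def energy(theta, L, J=1.0):
    """Total energy, counting each nearest-neighbour bond once."""
    e = 0.0
    for i in range(L):
        for j in range(L):
            e -= J * (math.cos(theta[i][j] - theta[(i + 1) % L][j]) +
                      math.cos(theta[i][j] - theta[i][(j + 1) % L]))
    return e

def metropolis_sweep(theta, L, T, rng, J=1.0):
    for _ in range(L * L):
        i, j = rng.randrange(L), rng.randrange(L)
        old = theta[i][j]
        new = old + rng.uniform(-1.0, 1.0)
        d_e = 0.0
        for ni, nj in ((i+1) % L, j), ((i-1) % L, j), (i, (j+1) % L), (i, (j-1) % L):
            d_e += -J * (math.cos(new - theta[ni][nj]) -
                         math.cos(old - theta[ni][nj]))
        # Metropolis acceptance rule.
        if d_e <= 0 or rng.random() < math.exp(-d_e / T):
            theta[i][j] = new

rng = random.Random(0)
L = 8
theta = [[rng.uniform(0, 2 * math.pi) for _ in range(L)] for _ in range(L)]
e0 = energy(theta, L)
for _ in range(200):
    metropolis_sweep(theta, L, T=0.3, rng=rng)   # low T: spins align
e1 = energy(theta, L)
```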
Purification of propolis from polycyclic aromatic hydrocarbons and preservation of active polyphenol component
Organic pollutants have become an increasing concern due to their mutagenic, carcinogenic and teratogenic potential and their high bioaccumulation. The adverse effects on health and the environment caused by specific organic pollutants such as polycyclic aromatic hydrocarbons (PAHs) are considered critical problems. The European Food Safety Authority (EFSA) has defined 16 priority PAHs that are both genotoxic and carcinogenic, and has identified eight (PAH8) or four (PAH4) priority PAHs as good indicators of the toxicity and occurrence of PAHs in food. Several techniques (photocatalytic degradation, combined photo-Fenton and ultrasound, advanced oxidation, aerobic degradation, filtration, ozonation, coagulation, flocculation, distillation, extraction, precipitation, adsorption, etc.) have been developed for PAH removal.
Food supplements containing propolis were also found to show relatively high PAH levels. As a consequence, a main goal is to adopt purification procedures that remove PAHs from propolis and preserve its polyphenol components before its use in finished products. Here we report an extractive procedure (M.E.D., Multi Dynamic Extraction) able to remove a large PAH content from propolis by using a balanced mixture of organic and aqueous solvents. The obtained propolis extracts are still rich in polyphenols and glycosylated derivatives, showing PAH8 and specific benzo[a]pyrene contents below the limits recommended by EFSA.
- …