179,187 research outputs found

    Near Optimal Channel Assignment for Interference Mitigation in Wireless Mesh Networks

    Get PDF
    In multi-radio multi-channel (MRMC) WMNs, interference alleviation is effected through several network design techniques, e.g., channel assignment (CA), link scheduling, and routing, with intelligent CA schemes being the most effective tool for interference mitigation. CA in WMNs is an NP-hard problem, which makes optimality a desired yet elusive goal in real-time deployments characterized by fast transmission and switching times and minimal end-to-end latency. The trade-off between optimal performance and minimal response times is often achieved through CA schemes that employ heuristics to propose efficient solutions. WMN configuration and physical layout are also crucial factors in deciding network performance, and numerous research works have demonstrated that rectangular/square grid WMNs outperform random or unplanned WMN deployments in terms of network capacity, latency, and resilience. In this work, we propose a smart heuristic approach to devise a near-optimal CA algorithm for grid WMNs (NOCAG). We demonstrate the efficacy of NOCAG by evaluating its performance against the minimal-interference CA generated through a rudimentary brute-force technique (BFCA) for the same WMN configuration. We assess its ability to mitigate interference both theoretically (through interference estimation metrics) and experimentally (by running rigorous simulations in NS-3). We demonstrate that the performance of NOCAG is almost as good as that of BFCA, at a minimal computational overhead of O(n) compared to the exponential complexity of BFCA.
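    For context only, the following minimal Python sketch contrasts an exhaustive (brute-force) channel assignment with a simple greedy heuristic on a toy grid topology. It is not the NOCAG algorithm described above; the grid size, channel set, and conflict-counting interference metric are illustrative assumptions.

    # Hypothetical sketch: greedy channel assignment versus exhaustive search on
    # a tiny grid WMN. This is NOT the NOCAG heuristic from the paper; it only
    # illustrates the heuristic-vs-brute-force trade-off the abstract describes.
    from itertools import product

    def grid_links(n):
        """Links of an n x n grid topology, nodes indexed (row, col)."""
        links = []
        for r in range(n):
            for c in range(n):
                if c + 1 < n:
                    links.append(((r, c), (r, c + 1)))
                if r + 1 < n:
                    links.append(((r, c), (r + 1, c)))
        return links

    def interference(assignment, links):
        """Count pairs of links that share an endpoint and a channel."""
        conflicts = 0
        for i in range(len(links)):
            for j in range(i + 1, len(links)):
                if set(links[i]) & set(links[j]) and assignment[i] == assignment[j]:
                    conflicts += 1
        return conflicts

    def brute_force_ca(links, channels):
        """Exponential search over all channel assignments (BFCA-style baseline)."""
        best = None
        for cand in product(channels, repeat=len(links)):
            cost = interference(cand, links)
            if best is None or cost < best[1]:
                best = (cand, cost)
        return best

    def greedy_ca(links, channels):
        """Single pass: give each link the channel with the fewest local conflicts."""
        assignment = []
        for i, link in enumerate(links):
            def local_cost(ch):
                return sum(1 for j in range(i)
                           if set(link) & set(links[j]) and assignment[j] == ch)
            assignment.append(min(channels, key=local_cost))
        return assignment, interference(assignment, links)

    if __name__ == "__main__":
        links = grid_links(2)          # tiny 2 x 2 grid keeps brute force feasible
        channels = [1, 2, 3]
        print("brute force:", brute_force_ca(links, channels)[1], "conflicts")
        print("greedy     :", greedy_ca(links, channels)[1], "conflicts")

    On this toy grid the greedy pass touches each link once (linear in the number of links), while the exhaustive search inspects |channels|^|links| assignments, mirroring the O(n)-versus-exponential contrast drawn in the abstract.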

    MScMS-II: an innovative IR-based indoor coordinate measuring system for large-scale metrology applications

    No full text
    Owing to the current strong interest in large-scale metrology applications across many fields of manufacturing industry, technologies and techniques for dimensional measurement have recently shown substantial improvement. Ease of use, logistic and economic issues, as well as metrological performance, are assuming an increasingly important role among system requirements. This paper describes the architecture and the working principles of a novel infrared (IR) optical-based system, designed to perform low-cost and easy indoor coordinate measurements of large-size objects. The system consists of a distributed network-based layout, whose modularity allows it to fit differently sized and shaped working volumes by adequately increasing the number of sensing units. Unlike existing spatially distributed metrological instruments, the remote sensor devices are intended to provide embedded data elaboration capabilities, in order to share the overall computational load. The overall system functionalities, including distributed layout configuration, network self-calibration, 3D point localization, and measurement data elaboration, are discussed. A preliminary metrological characterization of system performance, based on experimental testing, is also presented.
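    As a hedged illustration of the 3D point localization step mentioned above, the sketch below solves a generic textbook linear least-squares multilateration from range measurements to known sensor positions. It is not necessarily the localization method used by MScMS-II; the sensor coordinates and ranges are invented values.

    # Hedged illustration of 3D point localization by linear least-squares
    # multilateration. Generic textbook formulation, not necessarily the
    # MScMS-II algorithm; sensor positions and ranges are made up.
    import numpy as np

    def localize(sensors, ranges):
        """Estimate a 3D point from distances to known sensor positions.

        Linearizes ||p - s_i||^2 = r_i^2 against the first sensor and solves
        the resulting overdetermined system in the least-squares sense.
        """
        s0, r0 = sensors[0], ranges[0]
        A = 2.0 * (sensors[1:] - s0)
        b = (r0**2 - ranges[1:]**2
             + np.sum(sensors[1:]**2, axis=1) - np.sum(s0**2))
        p, *_ = np.linalg.lstsq(A, b, rcond=None)
        return p

    if __name__ == "__main__":
        sensors = np.array([[0.0, 0.0, 3.0], [5.0, 0.0, 2.5],
                            [0.0, 5.0, 2.0], [5.0, 5.0, 3.5]])
        true_point = np.array([2.0, 1.5, 0.5])
        ranges = np.linalg.norm(sensors - true_point, axis=1)
        print(localize(sensors, ranges))   # ~ [2.0, 1.5, 0.5]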

    Computing Vertex Centrality Measures in Massive Real Networks with a Neural Learning Model

    Full text link
    Vertex centrality measures are a multi-purpose analysis tool, commonly used in many application environments to retrieve information and unveil knowledge from graph and network structural properties. However, the algorithms for such metrics are expensive in terms of computational resources when run in real-time applications or on massive real-world networks. Thus, approximation techniques have been developed and used to compute the measures in such scenarios. In this paper, we demonstrate and analyze the use of neural network learning algorithms to tackle this task and compare their performance, in terms of solution quality and computation time, with other techniques from the literature. Our work offers several contributions. We highlight both the pros and cons of approximating centralities through neural learning. By empirical means and statistics, we then show that the regression model generated with a feedforward neural network trained by the Levenberg-Marquardt algorithm is not only the best option considering computational resources, but also achieves the best solution quality for relevant applications and large-scale networks. Keywords: Vertex Centrality Measures, Neural Networks, Complex Network Models, Machine Learning, Regression Model. Comment: 8 pages, 5 tables, 2 figures, version accepted at IJCNN 2018. arXiv admin note: text overlap with arXiv:1810.1176
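    A minimal sketch of the general idea, not the authors' pipeline: train a small feedforward regressor on cheap local node features to approximate an expensive centrality. The feature set, random-graph model, and scikit-learn's LBFGS solver (rather than the Levenberg-Marquardt training used in the paper) are illustrative assumptions.

    # Hedged sketch: approximating betweenness centrality with a feedforward
    # regressor over cheap per-node features. Features, graph model, and solver
    # are illustrative assumptions, not the authors' setup.
    import networkx as nx
    import numpy as np
    from sklearn.neural_network import MLPRegressor

    def features(G):
        """Cheap per-node features: degree, clustering, average neighbor degree."""
        deg = dict(G.degree())
        clu = nx.clustering(G)
        anb = nx.average_neighbor_degree(G)
        return np.array([[deg[v], clu[v], anb[v]] for v in G.nodes()])

    # Train on one random graph, test on another from the same family.
    G_train = nx.erdos_renyi_graph(300, 0.05, seed=1)
    G_test = nx.erdos_renyi_graph(300, 0.05, seed=2)

    X_train = features(G_train)
    y_train = np.array(list(nx.betweenness_centrality(G_train).values()))

    model = MLPRegressor(hidden_layer_sizes=(16,), solver="lbfgs",
                         max_iter=2000, random_state=0)
    model.fit(X_train, y_train)

    y_true = np.array(list(nx.betweenness_centrality(G_test).values()))
    y_pred = model.predict(features(G_test))
    print("correlation with exact betweenness:", np.corrcoef(y_true, y_pred)[0, 1])

    The pay-off claimed by approaches of this kind is that, once trained, prediction costs only a forward pass per node, whereas exact betweenness requires all-pairs shortest-path computations on every new graph.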

    One-step Estimation of Networked Population Size: Respondent-Driven Capture-Recapture with Anonymity

    Get PDF
    Population size estimates for hidden and hard-to-reach populations are particularly important when members are known to suffer from disproportionate health issues or to pose health risks to the larger ambient population in which they are embedded. Efforts to derive size estimates are often frustrated by a range of factors that preclude conventional survey strategies, including social stigma associated with group membership or members' involvement in illegal activities. This paper extends prior research on the problem of network population size estimation, building on established survey/sampling methodologies commonly used with hard-to-reach groups. Three novel one-step, network-based population size estimators are presented, to be used in the context of uniform random sampling, respondent-driven sampling, and when networks exhibit significant clustering effects. Provably sufficient conditions for the consistency of these estimators (in large configuration networks) are given. Simulation experiments across a wide range of synthetic network topologies validate the performance of the estimators, which are also seen to perform well on a real-world location-based social networking data set with significant clustering. Finally, the proposed schemes are extended to allow them to be used in settings where participant anonymity is required. Systematic experiments show favorable trade-offs between anonymity guarantees and estimator performance. Taken together, we demonstrate that reasonable population estimates can be derived from anonymous respondent-driven samples of 250-750 individuals, within ambient populations of 5,000-40,000. The method thus represents a novel and cost-effective means for health planners and agencies concerned with health and disease surveillance to estimate the size of hidden populations. Limitations and future work are discussed in the concluding section.
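    For background only, the classical two-sample capture-recapture idea that this line of work generalizes can be sketched as below. This is the standard Lincoln-Petersen/Chapman estimator, not the paper's one-step network-based estimators, and the numbers are invented.

    # Background illustration: classical capture-recapture, N_hat ~ n1 * n2 / m,
    # here with the Chapman bias correction. The paper's one-step, network-based
    # estimators are more sophisticated; the figures below are invented.
    def chapman_estimate(n1, n2, m):
        """Chapman-corrected capture-recapture estimate of population size.

        n1: size of the first sample (all marked)
        n2: size of the second sample
        m : number of marked individuals recaptured in the second sample
        """
        return (n1 + 1) * (n2 + 1) / (m + 1) - 1

    # Example: 400 marked, 500 resampled, 25 recaptures -> roughly 7,700.
    print(round(chapman_estimate(400, 500, 25)))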